1. 43
  2. 7
    1. 2

      Indeed!

    2. 2

      While in general it’s good to set expectations up front and explicitly (I’ve written a CONTRIBUTING.md for a crate that says “Please don’t contribute foo”), I think this is not a case that should need to be said. Obviously, it’s not practical to get people who reject this rather obvious community norm to explicitly opt out in the READMEs they write, but it still seems like sending the wrong message overall to suggest that intent to use the safe/unsafe split the way the language is designed is opt-in.

      1. 3

        I think that’s still a discussion to be had. While I am a huge fan of soundness and strive for it in my own work, I’m not sure that rejecting people with more relaxed feelings is good for the ecosystem. In my work with low-level UI and graphics, I see unsoundness all the time. An overly strict norm might turn people off from getting involved at all.

        For example, let’s say I write some rendering code that exposes Vulkan interfaces for reading and writing surfaces. Because of the design of Vulkan, these should naturally be unsafe. Publishing interfaces marked with unsafe is quite sound, and it’s very possible to use this stuff correctly, but I fear that the “Rust community” response might be “can’t use this.”
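
        A minimal sketch of what such an entry point might look like (hypothetical names, not any real Vulkan binding; `data` stands in for mapped memory so the example is self-contained). The point is that `unsafe fn` puts the Vulkan-style contract in the signature rather than hiding it:

        ```rust
        /// Hypothetical stand-in for a mapped rendering surface.
        pub struct Surface {
            pub data: Vec<u8>, // models the mapped memory region
        }

        impl Surface {
            pub fn new(size: usize) -> Self {
                Surface { data: vec![0; size] }
            }

            /// # Safety
            /// In a real binding, the caller must uphold the API’s external
            /// synchronization rules (no concurrent GPU access, memory
            /// actually mapped). Marking the entry point `unsafe` states
            /// that contract in the signature instead of hiding it.
            pub unsafe fn write_raw(&mut self, offset: usize, bytes: &[u8]) {
                // Modeled here as a slice copy; a real binding would write
                // through a raw pointer into the mapped region.
                self.data[offset..offset + bytes.len()].copy_from_slice(bytes);
            }
        }

        fn main() {
            let mut s = Surface::new(4);
            // The caller, not the library, asserts the contract holds.
            unsafe { s.write_raw(0, &[1, 2, 3, 4]) };
            println!("{:?}", s.data); // prints [1, 2, 3, 4]
        }
        ```

        A caller who cannot uphold the contract simply cannot call this from safe code, which is exactly the signal the safe/unsafe split is designed to send.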

        1. 2

          I’d rather aim for advocacy that libraries that have unsafe entry points are an acceptable part of the ecosystem when providing safe abstractions is impractical (I trust you on the nature of Vulkan) than to dilute the meaning of the safe/unsafe split by signaling that the default is that safe can be unsound and such unsoundness might not be taken as a bug that requires remedying.

          That is, I don’t want to dilute the meaning of safe in order to cater to superficial rejection of unsafe.

          Partly my view of this is informed by having written a crate, encoding_rs, that uses unsafe for performance internally when the problem domain obviously could be addressed (as in computations yielding the correct results, albeit more slowly) with safe-only code. (Furthermore, even the fast code could become safe-only if the standard library evolved more features.) Yet I haven’t seen the community reaction that stereotypes predict. That is, I haven’t seen folks reject encoding_rs on the grounds of its use of unsafe.
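
          As a rough sketch of that pattern (hypothetical code, not encoding_rs itself): a safe public function whose hot path uses unsafe internally, while a safe byte-at-a-time path handles the tail and defines the correct answer.

          ```rust
          /// Safe public API: checks whether `bytes` is pure ASCII, reading
          /// 8 bytes at a time on the hot path.
          pub fn is_ascii_fast(bytes: &[u8]) -> bool {
              let mut chunks = bytes.chunks_exact(8);
              for chunk in chunks.by_ref() {
                  // SAFETY: `chunk` is exactly 8 bytes, so reading one u64
                  // from its pointer stays in bounds; `read_unaligned`
                  // handles any alignment.
                  let word = unsafe { (chunk.as_ptr() as *const u64).read_unaligned() };
                  // Any set high bit means a non-ASCII byte.
                  if word & 0x8080_8080_8080_8080 != 0 {
                      return false;
                  }
              }
              // Safe, slow path for the remainder: same answer, byte by byte.
              chunks.remainder().iter().all(|&b| b < 0x80)
          }

          fn main() {
              assert!(is_ascii_fast(b"hello, world"));
              assert!(!is_ascii_fast("héllo".as_bytes()));
          }
          ```

          The unsafe is an internal optimization detail; callers see only a safe function, and any wrong answer would simply be a bug to fix, not a weakening of the safety contract.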

          That’s not at all the same as unsafe in the outward API, but it suggests that the Rust community isn’t as rejective of unsafe as it might appear.

          1. 3

            Good points, and thanks for the data point that you haven’t gotten much pushback from the use of unsafe in encoding_rs.

            1. 1

              Exposing a standardised and unsafe API like Vulkan (call them Vulkan-like APIs for the sake of brevity) directly seems like a really good example of where libraries with unsafe entry points should be an acceptable part of the ecosystem.

              The only other option, as I see it, would be that Vulkan is simply not exposed at all. Anyone who wants to use it wraps it directly without exposing it in their library interface, or uses an existing higher-level and (not obviously un)safe wrapper. But do we even know whether there’s a reasonable way to wrap Vulkan in its full generality in a safe way? It seems intuitively obvious to me that there are applications of Vulkan - say, a rendering engine - that should admit a safe interface, but Vulkan in its full generality at maximum performance? That’s questionable.

              Yet, I haven’t seen the community reaction that stereotypes predict. That is, I haven’t seen folks reject encoding_rs on the grounds of use of unsafe.

              I would hope not, that sounds to me (and I’m not a Rust programmer, although I have written some Rust code) like one of the primary points of unsafe.

        2. 2

          Why is the response to the Rust community slut-shaming a developer into abandoning their project a dress code, instead of making it clear that en-masse harassment is not OK?

          1. 3

            Because it’s better to address the root cause? I’d prefer the Rust community remains positive and constructive. The childhood lesson can be left for the developers who misbehaved.

            1. 3

              This has already happened enough in other threads; see, for example, https://words.steveklabnik.com/a-sad-day-for-rust. And it is very reasonable to decide how we can handle safety expectations like these, which fuel the mob.

            2. 1

              Great write-up! I love your overall idea of maintainers simply telling us what their goal is, and folks respecting it. I also like your position that calling unsafe stuff is just going to be unsafe, so the aim should be to make its use safer. That seems realistic.

              “This is most likely to yield positive results for lower level data structures; it should finally be possible to implement a doubly-linked list efficiently and without anxiety.”

              I still say re-code the unsafe construct in C, throw all of C’s analyzers and testers at it, and port the final form back to unsafe Rust. That should do until a similar level of tooling comes online for Rust, which will never happen anyway given C’s inertia and how much has been built for working with it. Even with Rust-based tools, the fact that different tools find different bugs means that analyzing a C version of unsafe Rust code with C tooling will still find more bugs on average. So that approach should cover y’all on high-performance data structures, if you know both Rust and C.

              Tangent, since you mentioned them: I read the Reddit thread with comments by Shnatzel on the smoke test. Y’all might find it funny that they also feed Open Clip Art to random things to see what happens. One component blew up on a picture of a volcanic eruption, haha.

              1. 9

                Hmm, I’m not sure. Obviously tooling is more mature in C space, but just porting things back and forth has its own risks. Shared mutable state is fine in C (it’s the norm), but defined as UB in Rust. Conversely, signed integer overflow is UB in C but safe in Rust.
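
                A small illustration of the second asymmetry (assuming nothing beyond the standard library): in C, `INT_MAX + 1` on a signed int is undefined behavior the optimizer may exploit; in Rust the same overflow is defined (a panic in debug builds), and explicit wrapping and checked forms always exist.

                ```rust
                fn main() {
                    let x: i32 = i32::MAX;
                    // Defined result: two’s-complement wrap to i32::MIN.
                    assert_eq!(x.wrapping_add(1), i32::MIN);
                    // Defined result: overflow reported as None, not UB.
                    assert_eq!(x.checked_add(1), None);
                    println!("ok"); // prints "ok"
                }
                ```

                A naive port of overflow-dependent Rust code to C (or back) would silently change which of these operations is meaningful, which is exactly the back-and-forth risk being described.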

                Also, Miri is the tool of choice for running safety tests in Rust, and I’m not sure anything quite like it exists for C, though the LLVM sanitizer suite comes close.

                1. 1

                  When I conceived this, I originally wanted it to be an automated process where reported defects went through a sanity check to make sure they even made sense across the two languages. For instance, a C tool analyzing Java code and reporting use-after-frees wouldn’t be so helpful given the GC.

                  I think a checklist could still knock out a lot of that problem. For each type of issue, it shows you the considerations. You can also use command line flags in a lot of the C tools to turn off certain kinds of analysis.

                2. 7

                  Yes, C has more mature tools, but those tools work on a harder-to-analyze language.

                  C static analyzers find a wide variety of general problems in large codebases, but Rust’s use of unsafe is usually confined to very small but very tricky problems. The C analyzers I’ve seen point out potential null dereferences and function-local code paths that leak memory, but they are not nearly smart enough to say that a custom lock makes a wrong assumption about the memory model.

                  There’s also a big problem of equivalence between C and Rust semantics at the very edge cases these tools would be for. The code isn’t just for the hardware you compile it for; it’s symbolically executed by the optimizer as if it were running on the abstract machine from the C spec.

                3. 1

                  … ideal of perfect soundness: that code using a sound library, no matter how devious, is unable to trigger undefined behavior

                  Generally, that’s not what soundness means. In type theory, soundness basically means that the runtime type of a value is the same as its compile-time type: https://www.scala-lang.org/blog/2016/02/03/essence-of-scala.html

                  Undefined behaviour is a different property than soundness.

                  1. 4

                    I’m using it in a different sense than type theory, though I think there is a connection; basically, I’m following David Tolnay’s usage. Perhaps it would be better to clarify this.

                    1. 4

                      It’s exactly the same meaning. Rust’s type system is sound in the absence of undefined behavior in unsafe code. Thus, if your codebase is free of undefined behavior, we can rely on the compile-time types of things to reason about their runtime behavior; otherwise we cannot.
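
                      A hypothetical illustration of that equivalence, using only the standard library: `NonZeroU32`’s invariant is established at the boundary, so safe code may rely on the type alone.

                      ```rust
                      use std::num::NonZeroU32;

                      // No zero check needed: the compile-time type rules
                      // out division by zero at runtime.
                      fn divide(x: u32, d: NonZeroU32) -> u32 {
                          x / d.get()
                      }

                      fn main() {
                          // The safe constructor enforces the invariant.
                          assert_eq!(NonZeroU32::new(0), None);
                          let d = NonZeroU32::new(4).unwrap();
                          assert_eq!(divide(100, d), 25);
                          // By contrast, `unsafe { NonZeroU32::new_unchecked(0) }`
                          // would be undefined behavior, and the reasoning above
                          // would be void: the sense in which type soundness and
                          // absence of UB coincide.
                      }
                      ```

                      Once any unsafe code exhibits UB, conclusions drawn from types like this one no longer hold anywhere in the program.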

                      1. 2

                        Hmm, but wasn’t there recently demonstrated unsoundness without the use of unsafe? https://internals.rust-lang.org/t/unsoundness-in-pin/11311

                        1. 5

                          That is a bug in unsafe code in the standard library, not anything to do with the type system. Pin is a library abstraction, so neither the language rules nor the compiler holds any special knowledge of its correctness.