1. 23
    1. 6

      This seems bad? C should copy D’s scope, not Go’s defer.

      1. 1

        Why? Because you like it better?

        edit: I happen to like Go’s defer, but I have no technical arguments to why it would be or would not be superior to D’s scope feature, so I’d be interested in any such arguments.

        1. 4

          Go’s defer is function-scoped. You can’t use it for block-scoped cleanup.

          1. 1

            True. However, the proposal here does not rule out block-scoped defers (see 5.1), but leaves it implementation defined whether to allow it.

            1. 4
              void f() {
                  for ... {
                      r = get_resource();
                      defer cleanup(r);
                  }
              }

              This code would compile and run successfully with or without block-scoped defers, but it’s good and safe code in one case, and an enormous bug in the other. Is it a good idea to leave that (quite important!) semantic distinction up to the implementation’s discretion?

              edit: This isn’t aimed at you, birkelund, but rather a more general observation.
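              For what it’s worth, the per-iteration behavior that a block-scoped defer would give can already be emulated in GCC/Clang with the non-standard cleanup attribute. A minimal sketch (the names release and f are made up here; release stands in for the proposal’s cleanup(r)):

              ```c
              #include <assert.h>

              static int releases = 0;

              /* stand-in for cleanup(r); the cleanup attribute calls this
                 with a pointer to the variable going out of scope */
              static void release(int *r) { (void)r; releases++; }

              static void f(void) {
                  for (int i = 0; i < 3; i++) {
                      /* block-scoped: release() runs at the end of EVERY
                         iteration, not once at function exit (GNU extension,
                         not part of the proposal) */
                      __attribute__((cleanup(release))) int r = i;
                      (void)r;
                  }
              }

              int main(void) {
                  f();
                  assert(releases == 3);  /* one release per iteration */
                  return 0;
              }
              ```

              A function-scoped defer, Go-style, would instead run all three cleanups only when f() returns, which is exactly the semantic gap being discussed.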

              1. 1

                My reading of the spec (see 3.1.4) is that the code would either compile and do the right thing (i.e. cleanup on each loop iteration), or not compile (with an error along the lines of “defer only allowed in top-level function bodies”).

                1. 1

                  Oh! Interesting, thanks for the clarification.

        2. 3

          If I’m reading this proposal correctly, there’s no way to do defer without dynamically allocating memory.

          Zig’s defer is better - no dynamic allocations necessary.

        3. 2

          Honestly - and I say this as a heavy Go user - Go’s defer mechanism for ‘catching panics’ sucks. D’s scope mechanism provides a much cleaner way to specify in which situation (success/failure) the deferred function should run, as well as providing more granular scopes as someone else pointed out.

          With all that said, I think I like Python’s context managers the best, as they provide simple rules to define the scope and allow hiding of state management inside of a library.

      2. 1

        On top of that, I never understood why defer needed a function – just give it an expression.

        defer close(fd);
        defer free(p);
    2. 4

      This would be really cool to see, but… C? Lambdas? Did I miss something?

      1. 3

        This paper builds on the assumption that at least simple lambdas are integrated into C23.

        1. 2

          ah. I guess I did miss something :)

    3. 2

      With Clang in Objective-C mode you can write a macro to do this using the cleanup attribute and a block.

      1. 2

        You don’t need Objective-C mode, -fblocks is enough. Using blocks in this way is fine because the lifetime is coupled to the stack frame and it’s easy to reason about capture.

        Lambdas are difficult in C in the general case. They’re easy in a language with GC because capturing the block implicitly extends the lifetime of the captures. In Objective-C, reference counting normally works well here (though you can create cycles if you store a block in an object that is captured). It would have been a mess in C++, except that lambdas were added at the same time as C++11 introduced unique_ptr and shared_ptr.
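        If blocks aren’t available, the cleanup attribute alone already covers the simple “free on scope exit” case with no captures at all. A sketch, assuming GCC or Clang (the names free_and_flag and demo are invented for illustration):

        ```c
        #include <assert.h>
        #include <stdlib.h>

        static int cleaned_up = 0;

        /* the cleanup attribute passes a pointer to the annotated variable */
        static void free_and_flag(void **p) {
            free(*p);
            cleaned_up = 1;
        }

        static void demo(void) {
            /* buf is freed automatically when demo() returns,
               on every exit path (GNU extension, not standard C) */
            __attribute__((cleanup(free_and_flag))) void *buf = malloc(16);
            (void)buf;
        }

        int main(void) {
            demo();
            assert(cleaned_up == 1);
            return 0;
        }
        ```

        The blocks version (-fblocks) buys you arbitrary deferred expressions instead of a fixed cleanup function, at the cost of depending on the blocks runtime.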

    4. 1

      I’m a bit unclear on the “functions or compound blocks” scope question. The proposal says it’s a constraint violation to “ask for” a block-scoped deferral if the implementation doesn’t support them. But it doesn’t define what it means to ask for such a deferral.

      Contrived example:

      double foo(int x) {
          double y[] = { 1.0, 2.0, 3.0 };
          double *z = y;
          if (x) {
              z = calloc(3, sizeof(double));
              defer [&z]{ free(z); };
              puts("Using allocated array");
          }
          return z[0] + z[1] + z[2];
      }

      Is this asking for a block-scoped deferral or a function-scoped one?

      I haven’t been following the latest C developments for a while so hopefully this is just a matter of me being unaware of a well-defined idiom.