1. 14
  1.  

  2. 0

    This sure seems weird. wtf is going on with python lately?

    First Twisted-style async… which was like… ok, I’m not a fan but fine.
    Then the walrus operator, and now string-eval’ing subinterpreters, which… don’t really seem to do much other than break lots of modules? o.O

    1. 2

      A couple of recent features that have made their way into “Python” are features that already existed, in the sense that lower-level C code using the CPython APIs could already do these things, and are just bubbling those up to the level where actual Python code can use them too.

      For example, positional-only arguments were a feature the C API already supported and that some of Python’s C-implemented built-ins already used; now they’re supported in pure-Python argument signatures as well. Subinterpreters are another thing that already existed in the C API, and now are on track to also be exposed for creation and manipulation at the level of pure Python code.
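
      In pure Python that spelling looks like this (a toy example: everything before the “/” is positional-only, the same rule that built-ins like len() already enforced):

      def clamp(value, /, lo=0.0, hi=1.0):
          # "value" is positional-only, just like in the C-implemented built-ins
          return max(lo, min(hi, value))

      clamp(2.5, hi=2.0)    # fine
      # clamp(value=2.5)    # TypeError: "value" is positional-only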

      1. 1

        I still can’t quite wrap my head around the async stuff. Whenever I have to use it I end up just tossing awaits around whenever the interpreter complains… This is probably more my own failing than the implementation’s, but it’s a bit of a curveball compared to how “normal” Python was written.

        I don’t mind the walrus operator, to be honest. I know it only saves a line, but it implicitly tells the reader that the variable is only relevant in the following code block. If it were defined outside the block, you’d have to make sure it isn’t used anywhere else before you could modify things.
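
        A tiny made-up example of what I mean:

        import re

        line = "ERROR: disk full"
        if match := re.match(r"(\w+):", line):
            # "match" is bound right where it's used, so the reader knows it only matters in this block
            print(match.group(1))   # ERROR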

        The subinterpreters though… I don’t know. Since it’s eval’ing strings, I just see it as a massive pain in the ass: you have to pass the code in as text, so strings inside that code will be annoying to deal with. On top of that you might get all the security issues of eval as well, and all this without actually getting anything done faster, since the GIL is still there… It honestly just seems like a shittier version of threads until they deal with the GIL.
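
        For what it’s worth, the low-level interface that already ships (the private, provisional _xxsubinterpreters module in CPython 3.8+, so the names may well change) shows the pass-your-code-as-a-string problem pretty clearly:

        import _xxsubinterpreters as interpreters

        interp = interpreters.create()
        # the code has to be shipped as a string, which is exactly the awkward part
        interpreters.run_string(interp, "print('hello from a subinterpreter')")
        interpreters.destroy(interp)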

        1. 1

          For a more straightforward approach, see PyPy’s features borrowed from Stackless Python. Both Stackless Python and Go were inspired by Limbo’s approach to concurrency.

          1. 1

            async/await makes sense to me, but that’s because I tried writing a generator-based coroutine library using yield from and send() (actually I didn’t have yield from, being on Py 2 at the time, and had to inject every layer with a complex macro that would call generator.send to propagate exceptions/results up the stack). If you try doing that, await becomes “suspend this coroutine until the thing I’m awaiting is done” while handling all the exceptions, etc. that come from an inversion of control flow.
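
            A stripped-down sketch of that style (a toy driver, not a real event loop):

            def read_value():
                # pretend this suspends until the driver sends a result back in
                value = yield "need-value"
                return value

            def handler():
                # "yield from" plays the role await plays today: suspend handler()
                # until read_value() finishes, propagating results and exceptions
                value = yield from read_value()
                return value * 2

            def drive(coro):
                # a toy trampoline: answer each suspension point with a result
                try:
                    request = coro.send(None)
                    while True:
                        if request == "need-value":
                            request = coro.send(21)
                except StopIteration as done:
                    return done.value

            print(drive(handler()))   # 42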

            As to subinterpreters, I think the real pain point is that there’s no easy way to:

            • create or memmove a complex Python object in/to/from a shared-memory mapping
            • attach/detach from the GC

            If that existed, it would be much easier to do multiprocess concurrency send/recv semantics.

            I would absolutely love something like:

            Sender process (assume both processes have the exact same definitions of the user classes A and B):

            fh = mmap.mmap("/dev/shm/myshared")
            b = A("p")  # currently in the private process heap

            with gc.foreign_heap(fh) as roots:
                a = A("A", B(1), {"1": 2})
                a.field = 6
                roots.move(b)
                gc_refs: Tuple[int] = roots.detach(a, b)
            # a is now deleted or functionally invalid/cleared
            pipe.send(pickle.dumps(gc_refs))
            

            Receiver:

            fh = mmap.mmap("/dev/shm/myshared")
            refs = pickle.loads(pipe.recv())
            with gc.foreign_heap(fh) as roots:
                a, b = roots.attach(refs)
                assert a.field == 6
                assert b.foo == "p"
            

            At least something like that would let me build nice, guarded Channel send/receive semantics without the immense overhead of pickle. (I used pickle in this example for brevity; in reality it would be simple enough to just send the gc root integers over the pipe by struct-packing them, or by getting lazy and cffi-casting to char and back again.)
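
            For the non-pickle path, something like this is really all it would take (assuming gc_refs is the tuple of ints from the sketch above):

            import struct

            def pack_refs(gc_refs):
                # length-prefixed array of unsigned 64-bit "root" integers
                return struct.pack(f"<I{len(gc_refs)}Q", len(gc_refs), *gc_refs)

            def unpack_refs(payload):
                (count,) = struct.unpack_from("<I", payload)
                return struct.unpack_from(f"<{count}Q", payload, 4)

            # e.g. pipe.send_bytes(pack_refs(gc_refs)) on one side
            # and unpack_refs(pipe.recv_bytes()) on the other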

            1. 1

              If you try doing that, await becomes “suspend this coroutine until the thing I’m awaiting is done” while handling all the exceptions, etc. that come from an inversion of control flow.

              Oh absolutely. I understand the concept, but it seems nearly random when it makes a library incompatible, or when you have to use await or wrap something around a function. I’m guessing that’s because I haven’t really spent enough time understanding what’s going on in the background and what limitations arise from it. It just feels a bit weird when you’re not familiar with it.
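
              As far as I can tell, the rule is that await only works on awaitables (coroutines, Tasks, Futures), and a plain blocking call just blocks the loop, which is part of why sync libraries feel “incompatible”:

              import asyncio

              async def fetch():
                  await asyncio.sleep(0.1)   # awaitable, so await works here
                  return 42

              async def main():
                  value = await fetch()      # forget the await and you just get a coroutine object
                  # time.sleep(0.1)          # a blocking call is legal here but stalls the whole event loop
                  return value

              print(asyncio.run(main()))     # 42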