1. 31

  2. 7

    I have to say, pipenv was a little disappointing. I have the feeling that it was too often recommended as the solution for everything, before it had a chance to stabilise. I’ve switched back to using pip-tools.

    1. 8

      You might give Poetry a shot. I realized how frustrating Python’s tooling was after a recent year-long stint at a Rails shop, was equally disenchanted with Pipenv, then found Poetry, and have been very happy since.

      1. 1

        Thanks, yeah I’m trying out poetry now and am happy so far. It doesn’t seem to support C extensions, so I can’t move all my projects over. But baby steps, I guess.

      2. 4

        As another formerly disappointed pipenv user, poetry is great!

      3. 6

        I don’t think the description for pyenv is quite correct. While you can use pyenv plugins to manage virtualenvs, its main feature is managing Python versions. Having any version or implementation (PyPy, Jython, etc.) available, and being able to set any of those as the global Python, is the main goal of pyenv; dealing with virtualenvs is just a nice extra, really.

        Other than that, nice write up =)

        1. 3

          FWIW I use pyenv for three things:

          1. Creating and managing virtualenvs tied to specific Python versions (so that when I’m developing something I know I’m on exactly the same Python version the production environment will use),
          2. Providing the appropriate Python interpreter versions for tox to use in creating and managing its virtualenvs, and
          3. Setting my global default python to be whatever the latest is (currently 3.7).

          1. 2

            Yeah, maybe it can be clarified further. I always use environments with the same Python version but different dependency versions, so the way I think about them may not be representative of the common use cases.

            1. 2

              Just curious why you would use pyenv for that instead of, say, virtualenv?

              1. 2

                Personally I don’t. I think that for some people it saves some clicks and mental overhead, and I can see why, but I don’t use it myself because my workflow involves a lot of switching between virtualenvs (rebuilding the containers appropriately would take too much time), so I prefer control over automation.

            2. 1

              Anyway, in a comment on Reddit someone explained my mistake, and yes, I wrote something that was plain wrong. It should be correct now. What I described is apparently a plugin called pyenv-virtualenv.

            3. 3

              Virtualenv is nice if you develop a server, but what would you use for some small command-line tools?

              For example, I have this 200-line script which parses some CI artefacts and generates CSV output. This is fine with just the Python standard lib. Now I would like to generate an Excel sheet, but that requires an external package. The easiest way would be copy & paste? Better options?
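
              (For what it’s worth, the Excel part itself would only be a few lines with an external package like openpyxl; a minimal sketch, where rows stands in for whatever the script parses:)

              ```python
              # Sketch of the Excel step, assuming openpyxl as the external package.
              from openpyxl import Workbook

              def write_xlsx(rows, path):
                  wb = Workbook()
                  ws = wb.active
                  for row in rows:  # rows: an iterable of tuples/lists
                      ws.append(list(row))
                  wb.save(path)

              write_xlsx([("job", "status"), ("build", "ok")], "report.xlsx")
              ```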

              1. 1

                Depends how much upfront work the users are willing to do. If you can convince/guide them to install pipsi, then you’re good: just package your utilities as Python packages and have them install those with pipsi. I think you can even send them the package as a zip, so you don’t even need PyPI.

                If that’s too much (and sometimes it is), then you’re just screwed. Either take the time to use something like PyInstaller for Windows and snap/flatpak/deb/rpm for Linux, or give up.

                1. 3

                  pipsi is basically unmaintained. I would suggest giving pipx a try instead.

                2. 1

                  This is the use case for packaging: you have some code, it has at least one dependency that isn’t part of Python itself, and you want to distribute it for use on other computers (possibly, though not necessarily, by other people).

                  The answer there is to build a package, and then install that package on each computer that needs to run the code. The standard approach is to use setuptools: write a setup.py file which specifies what to package and what the dependencies are, use it to generate the package (preferably a wheel), then use pip to install that package on other machines. You do not need to upload the package to a public index; you can distribute the package any way you like, and people who have a copy of it can pip install from it.

                  There are alternative packaging tools out there, and fans of particular ones will recommend them, but this is the standard workflow for Python.
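
                  (A minimal sketch of that setup.py; the project name, module name, and dependency are hypothetical placeholders:)

                  ```python
                  # setup.py -- minimal sketch; all names here are placeholders
                  from setuptools import setup

                  setup(
                      name="ci-report",                # hypothetical package name
                      version="0.1.0",
                      py_modules=["ci_report"],        # the single-module script
                      install_requires=["openpyxl"],   # the external dependency
                      entry_points={
                          "console_scripts": ["ci-report=ci_report:main"],
                      },
                  )
                  ```

                  Running python setup.py bdist_wheel then produces the wheel, and pointing pip install at that wheel file installs it on another machine.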

                3. 3

                  [pytest]

                  For: testing your code

                  When: always, you lazy ass

                  Hey there! There are plenty of times you might not want to use pytest. Maybe you’re trying to generate tests from module metadata. Maybe you need to run a test against multiple implementations. In my personal experience unittest is better than pytest for doing stuff way, way, way off the beaten path.
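
                  (A sketch of the kind of thing I mean, generating unittest cases from metadata at import time; the IMPLEMENTATIONS dict is a stand-in for real module metadata:)

                  ```python
                  # Sketch: generating unittest cases from metadata at import time.
                  import unittest

                  IMPLEMENTATIONS = {  # stand-in for real module metadata
                      "sorted_builtin": sorted,
                      "sorted_copy": lambda xs: list(sorted(xs)),
                  }

                  class GeneratedTests(unittest.TestCase):
                      pass

                  def _make_test(impl):
                      # factory avoids the late-binding pitfall in the loop below
                      def test(self):
                          self.assertEqual(impl([3, 1, 2]), [1, 2, 3])
                      return test

                  for name, impl in IMPLEMENTATIONS.items():
                      setattr(GeneratedTests, "test_" + name, _make_test(impl))

                  if __name__ == "__main__":
                      unittest.main()
                  ```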

                  1. 1

                    Adding to that: unittest runs much faster than pytest. I find it really important to have a fast-running suite of tests.

                    1. 1

                      I literally wrote a class called ModuleMetadata to generate tests this last week, and I had a lot of difficulty with pytest. It’s working now so I don’t want to touch it, but do you have any resources or tips for generating tests with unittest?

                      1. 2

                        This might be what you need. https://pypi.org/project/pytest-datafiles/

                        I saw a guy from Mozilla give a great presentation on how to use this tool.
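
                        (From what I remember of its docs, usage is a marker plus a datafiles fixture; the file path here is made up:)

                        ```python
                        # Sketch of pytest-datafiles usage; the file path is a placeholder.
                        import os
                        import pytest

                        @pytest.mark.datafiles("tests/data/sample.csv")
                        def test_parse(datafiles):
                            # datafiles is a temp directory holding copies of the listed files
                            assert os.listdir(str(datafiles)) == ["sample.csv"]
                        ```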

                      2. 1

                        Well, the “when” is about when you need to use a test suite at all. The alternatives are included in that “when”.

                      3. 2

                        Great list, thanks! I’m going to share this around at my company.