Any time Python packaging discussions come up, I dream of an official lockfile. Being forced to use a third-party package manager or to create your own lockfiles with pip-tools is a nightmare.
I wonder if these third-party tools are using packaging.version.Version?
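For what it’s worth, packaging.version.Version gives you PEP 440 ordering (dev < pre-release < final < post-release), so at minimum you’d hope they all share that behavior:

    from packaging.version import Version

    # PEP 440 ordering, not string comparison:
    assert Version("2.0.dev1") < Version("2.0rc1") < Version("2.0") < Version("2.0.post1")
    assert Version("1.10") > Version("1.9")  # numeric segments compare numerically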
> create your own lockfiles from pip-tools is a nightmare
Curious what your issues have been? I’ve used pip-tools quite a bit and it’s almost always “just worked” for me. The only times it’s really had issues were when I was intentionally doing “weird” things (e.g., forcing installation of a set of package versions that said they weren’t compatible, because the upstream was broken and I knew they actually were).
It mostly comes up when building CI scripts where I can’t just stuff everything into a container image. We have either Windows or macOS hosts for development, but our CI is obviously Linux. So how do I build these platform-specific lockfiles? And how do I specify them in my pyproject.toml dynamically so the correct lockfile gets picked for the platform? I don’t know! So I end up using the whole *.in workflow to lock my deps with platform prefixes, and I end up having to run two install commands: one for the deps and one for the package itself.
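Concretely it ends up being something like this (the file names are just how I happen to prefix them):

    # requirements.in holds the top-level deps; pip-compile resolves against
    # the interpreter/platform it runs on, so each lockfile must be compiled on its OS:
    pip-compile requirements.in -o linux-requirements.txt   # run on the Linux CI host
    pip-compile requirements.in -o macos-requirements.txt   # run on a macOS dev box

    # then every install is two commands: deps first, then the package itself
    pip install -r linux-requirements.txt
    pip install --no-deps .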
But the people on my team aren’t developers, so all of this ends up confusing them about why we’re doing it in the first place. The reason? I’ve seen a lot of broken, old code that I can no longer reproduce because they didn’t even bother with a requirements.txt in the past.
That’s fair, though not an issue I personally deal with much. But I don’t see how that’s relevant to pip-tools vs. some standard format, right? The fundamental problem is that your deps will resolve differently depending on the platform… and that’s a much bigger can of worms to address than just standardizing the lockfile.
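(And it’s baked into the metadata itself via PEP 508 environment markers, so the same top-level requirements legitimately pin differently per OS. A couple of common real-world examples:)

    # environment markers: each line only applies on the named platform
    colorama ; sys_platform == "win32"
    uvloop ; sys_platform != "win32"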
This is a bit of a sidestep, but one thing that works very well for “not so great” developers is VS Code’s dev container setup. You do a bunch of configuration up front, and when someone opens the project in VS Code, it builds a container and runs the “right commands” to get everything going (including things like running the LSP inside the container, while the GUI stays native Windows/macOS). There’s a rough config sketch below.
Cross-compilation stuff in Python is a… I don’t know if “mess” is the right word, but it’s a thing, and the VS Code setup really smoothed everything out for me. It got to the point where people just needed to install VS Code, Git, and Docker, check out the project, and then everyone was running on Linux.
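A minimal .devcontainer/devcontainer.json goes a long way; roughly like this (the image, commands, and file names here are placeholders, not a prescription):

    // .devcontainer/devcontainer.json
    {
      // any Linux image with the right Python works; this is a stock devcontainer image
      "image": "mcr.microsoft.com/devcontainers/python:3.11",
      // runs after the container is created: deps first, then the package
      "postCreateCommand": "pip install -r linux-requirements.txt && pip install --no-deps -e .",
      "customizations": {
        "vscode": {
          "extensions": ["ms-python.python"]
        }
      }
    }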
I, fortunately and unfortunately, work for a very large institution. We have zero virtualization available on our Windows hosts anymore, and we’re kind of being pushed to other OSes if we need to do dev work. But my coworkers, being non-devs, probably won’t move unless they’re absolutely forced to. I really like the dev container setup in VS Code; I just don’t have the power to push people into using it, despite being the “tech lead” in title 🥲
Oh, so WSL is totally unavailable to you? That sounds rough.
It sure is.