Overall, there was agreement that the original motivations for a large, “batteries-included” standard library no longer held up to scrutiny. “In the good old days,” Ned Deily reminisced, “We said ‘batteries-included’ because we didn’t have a good story for third-party installation.” But in 2023, installing third-party packages from PyPI is much easier.
I’m pretty unconvinced about this for script cases. For development projects setting up a virtualenv or whatever is fine, but for scripts, requirements installation being basically “oh yeah just set it up in the global environment (also be prepared for users to force the install and then break other stuff on their system)” still kinda sucks.
If we are moving towards more things requiring files being installed, I would very much like an out-of-the-box solution for “this script uses pathlib and dataclasses, so please run it in an environment where this is available”. A solution that doesn’t involve polluting the current working directory (though that might just be some dot file). If a solution to this exists, I’m much happier about moving forward with a “slimmer” standard library.
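(For what it’s worth, this is more or less the problem that inline script metadata – eventually standardized as PEP 723 – aims at: a runner such as pipx or uv reads a TOML comment block at the top of the script and runs it in a matching environment. A minimal sketch, assuming such a runner is available; requests is just a stand-in dependency:)

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests",
# ]
# ///
import requests

# The runner resolves the block above into a cached, throwaway environment,
# so nothing is installed globally or into the current working directory.
print(requests.get("https://example.org").status_code)
```

Run it with, e.g., pipx run script.py or uv run script.py.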
Really I’m a fan of the huge standard library because there are a bunch of serious things in it that work well (argparse is great, it’s great that it’s in the standard library, and it’s not clear to me that it would be there in a new version of the standard library). But if the contributors are all for a slimmer one, it’s hard for me to really say it’s not a good idea.
I don’t think Python would be nearly as successful without the current iteration of its stdlib. Installing packages, as you allude to, is still atrocious despite their optimism that it’s so easy now.
Hey if it were up to me we would keep growing the stdlib. I do think there’s value in tracking deps internally tho
Do me a favor. On a device of your choice that has a Python interpreter available, open a terminal and run:
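(The command block that followed didn’t survive; judging from the replies – line three activates an environment via a bin/ script, and one machine hit an ensurepip error – it was presumably the standard venv-plus-pip workflow, something like the sketch below. The directory and package names are placeholders.)

```sh
python -m venv scratch            # 1: create an isolated environment
cd scratch                        # 2: enter it
. bin/activate                    # 3: on Windows: Scripts\activate.bat or .ps1
python -m pip install requests    # 4: install a package into it
```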
And tell me what happens. If you’re on Windows, run py instead of python, and on line three change to running the appropriate bin/ script for your Windows terminal type (.bat for cmd.exe or .ps1 for PowerShell).
This is parody, right? Because you’re proving his point with flying colors.
What’s parody about it?
Many language package managers, even ones that get widespread praise, have some sort of “start new project” command, and require you to be in the project directory to work with dependencies. Literally the only thing that’s extra about the Python approach is “activating” the isolated environment, and the differences in command names, which are not part of Python packaging (every example of doing command-line stuff in Python, not just packaging, has to account for the fact that on Windows the executable is named py).
The way I see it, the average Python user isn’t necessarily a programmer. Maybe they code on the side, writing small scripts, experimenting with what Python can do. They just run python myscript.py and it works everywhere. The details you gave are just one complication out of the many, many problems that package managers introduce, and would affect usability for those users (e.g. requiring a pyproject.toml or equivalent, version collisions between packages, only working when there’s network access, etc.).
Btw I’m not against getting rid of some of the less used packages, but personally I’d prefer to grow the standard library. Python is already a huge download; what’s a few more scripts, if it helps usability?
P.S. You don’t find it odd that if I switch between Windows and Linux, I have to remember whether it’s bin or Scripts, whether it’s .bat or .ps1, etc.? And also, I use Windows and I always use python, never py. So many pointless details…
Again: I don’t see how you can argue for Python being uniquely bad among language package managers when it seems your position is that language package managers are all inherently bad and wrong due to not meeting the needs of non-“programmers”.
I never said Python is uniquely bad. The opposite: right now it is uniquely good, and I think it shouldn’t try to be more like Node.
Someone said:
I gave a sample of starting a project and installing a package into it. You replied:
But now you are saying Python is “uniquely good” on this?
I do not understand.
Yes, package management is generally bad in most languages. It is a little bit worse in Python unfortunately, but not that much.
But because Python comes with a large and high-quality std lib, you don’t have to deal with package management as much, and sometimes users can avoid it for a long time. Which is why it is uniquely good.
On one machine I get:
On another I get a long error message starting with:
On yet another it worked, but only if I changed python -> python3 in your commands.
Using Python 2, three+ years after it EOL’d and 13 years after Python 3 came out, is its own pile of problems that I can’t help you with.
Python 3 is installed on all of the above machines. I’m not sure what you expected to get out of your experiment if you’re so dismissive of someone trying to help.
Ubuntu? I tried the commands too and the default python is still apparently Python 2.
For the last one, yep – the package python-is-python3 symlinks /usr/bin/python to python3. I have no idea what the deal is with the ensurepip error. It seems Debian-like systems need some additional package installed.
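(The missing piece here is almost certainly python3-venv, which Debian and Ubuntu split out of the default Python install; a sketch assuming an apt-based machine:)

```sh
# Without python3-venv, `python3 -m venv` fails with an ensurepip error:
sudo apt install python3-venv
python3 -m venv .venv   # now succeeds
```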
FYI I don’t think pathlib, dataclasses or argparse are under threat. At least they are not in the list in PEP 594 – Removing dead batteries from the standard library.
I’ve never even heard of most of those, so if that’s what they want to trim out, I’m all for it.
You’re missing out! pathlib is much easier to work with than os for file management, and dataclasses is great for defining typed, well-rounded classes. Most of my classes nowadays are dataclasses.
I guess you are the exception. How do you work with paths, classes, or command-line interfaces instead? (I know that there are alternatives, but these are a great start.)
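(A minimal sketch of the two features being praised above, stdlib only; the settings path is made up:)

```python
from dataclasses import dataclass
from pathlib import Path

# pathlib: paths are objects, so joining, testing, and reading are methods
# instead of separate os.path.join()/os.path.exists()/open() calls.
settings = Path.home() / ".config" / "myapp" / "settings.toml"
if settings.exists():
    print(settings.read_text())

# dataclasses: declare typed fields and get __init__, __repr__, and __eq__
# generated for free.
@dataclass
class Point:
    x: float
    y: float

print(Point(1.0, 2.0))  # Point(x=1.0, y=2.0)
```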
They’re referring to the things in the linked deprecation list, not the list from the comment.
What did get deprecated that is about paths and classes? The few CLI modules that were being deprecated (before the decision was reverted) have modern alternative modules in the stdlib.
I was referring to the list in the PEP, but for the sake of argument, os.path, class, and sys.argv all work perfectly fine.
Pathlib is so much better than os.path, and sys.argv doesn’t handle flags, subcommands, or default values.
(I know you’re just saying it for the sake of argument, but I’m still here to shill the value of pathlib.)
With pattern matching, you barely need argparse anymore! /s (just pass in all the args in exactly the right order every time)
Sorry, misunderstood you there :)
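(A minimal sketch of that contrast: raw sys.argv is just a list of strings, while argparse declares flags and defaults and generates --help. The greet example is made up:)

```python
import argparse
import sys

# With sys.argv you get a flat list of strings and parse everything yourself:
# python greet.py World  ->  sys.argv == ["greet.py", "World"]

# With argparse, flags and defaults are declared (subcommands work the same
# way via parser.add_subparsers()):
parser = argparse.ArgumentParser(description="Greet someone.")
parser.add_argument("name")
parser.add_argument("--greeting", default="Hello")
parser.add_argument("--shout", action="store_true")
args = parser.parse_args()

message = f"{args.greeting}, {args.name}!"
print(message.upper() if args.shout else message)
```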
In principle I agree, but this wouldn’t actually help me much. I’ve done a lot of work in environments where an ambient Python interpreter (it might be 10 years old) is table stakes, but installing anything third-party has such a high bar as to be effectively impossible. And I don’t even really disagree with policies like this: trusting Python is much easier than trusting everyone with a PyPI account.
So in these cases, we’d all write Python. People who would have run a hundred miles to use Perl instead wrote Python. Having more stuff in the standard library just made it objectively better for us almost all of the time. Making it easier to install untrusted code wouldn’t have helped.
The key value of the Python standard library is that it guarantees that a piece of code will continue to be maintained in a responsible way.
If a new Python release breaks a module in the standard library, that module will be fixed to continue working with Python.
The same thing comes up in Django world a lot, when comparing Django (“batteries included”) to other ecosystems like Flask where you end up pulling together your own choice of ORM and template language.
Being able to pick “the best tool for the job” feels intuitively better, but it comes at the cost that any one of those dependencies might not be maintained in the future.
I’ve had a similar argument with people in the FreeBSD world. You are conflating two things!
1. Is maintained by a specific team and has some long-term stability guarantees.
2. Is shipped in a single package.
Just because these have historically been conflated doesn’t mean that they are intrinsically linked. It’s easy to move things to optional packages and still maintain them in the same way, with the same stability guarantees.
Agreed, this has played out in Django and Python world quite a bit already.
What I care about is that code is in a position where it can be maintained by a long-term organization that can provide ongoing maintenance guarantees, for example:
In the Flask world the Pallets Project achieves similar goals: https://palletsprojects.com/
Cannon brought up the question of whether a module like pathlib belonged. “It’s just sugar,” he remarked – i.e., hardly a “core utility” or a protocol that allowed people to write better code.
This is worrisome. As I see it, writing clearer and more beautiful code is writing better code, but it appears the Python core members don’t share that view.
Sometimes I bring up the question of whether a core member like Cannon belongs.