Quite a bit has changed in the last three and a half years when it comes to Python packaging. What specifically did you want to highlight here, and why?
Anyway, some responses:
> there’s one thing that irritates me every single time I revisit the page and that is the misleading recommendation of their own tool pipenv.... PyPA recommends pipenv as the standard tool for dependency management... PyPA still advertises pipenv all over the place and only mentions poetry a couple of times, although poetry seems to be the more mature product.
This looks very different now, but I don't think the recommendation was "misleading". The archived page didn't say the tool was "the standard", just a recommendation. Meanwhile, Poetry lagged behind for years in implementing standards. Pipenv was also focused only on dependency management (which basically == creating venvs + invoking Pip + updating lockfiles), whereas Poetry is designed as a complete workflow tool (== dependency management + build frontend + build backend + package uploader + project metadata management + a bunch of integrations + probably stuff I forgot).
> You would expect exactly one distribution for Python packages, but here in Python land, we have several ones.
Conda came into existence because the scientific Python community needed to solve packaging problems right now (around 2012 - see https://jakevdp.github.io/blog/2016/08/25/conda-myths-and-mi... ) that everyone else was not prepared to deal with. From everything I've heard, it's much less useful for Python now, but you can't make it stop existing. Similarly, Linux distros are going to have their own packaging for everything no matter what you do, and Windows users aren't going to benefit from that packaging. So no, I would not expect any such thing.
> In my opinion, there should be compiled wheels for all packages available that need it, built and provided by PyPI.
This was and is completely and utterly infeasible. PyPI already relies on Fastly's generosity in serving over a petabyte of data per day. The resources for a build farm just aren't there (and if they offered one, suddenly everyone would want to use it). Especially since building these projects requires running arbitrary code (to orchestrate arbitrary build processes that need to invoke who-knows-what compiler for who-knows-what language). You'd have to spin up a VM/Docker container/etc. for each build - and then you'd find that you don't have a way to communicate the build-time requirements, and even if you did you'd have to rely on what PyPI could provide.
Of course, you could do it for pure Python projects, but these are (and were) trivial to build locally anyway.
> Then you have manylinux, an excellent idea to create some common ground for a portable Linux build distribution. However, sometimes when a new version of manylinux is released some upstream maintainers immediately start supporting only that version, breaking a lot of systems.
I'm not convinced that, by mid-2021, there had been enough versions of the manylinux wheel standard to make this claim confidently.
> For the longest time, the setup.py and requirements.txt were (and, spoiler alert: still is) the backbone of your packaging efforts. In setup.py you define the meta data of your package, including its dependencies.... you’ll still have to provide that file with an empty setup() in order to allow for editable pip installs. In my opinion, that makes this feature useless
This seems like an overreaction. Pulling metadata out of setup.py solves an important practical problem: you don't need to run setup.py to get the metadata, which means you don't need to have your build environment set up (which might come into play if you want setup.py to `import` something third-party), which means you don't need to already have the metadata (the `setup_requires` keyword argument to `setup()`).
It was really not necessary to have a setup.py in 2021, certainly not for pure Python projects. Other build backends existed (besides Poetry's, too). But you could go without `setup.py` even with Setuptools as your backend since April of 2019, if you didn't need editable installs. (Pip started supporting PEP 660 editable wheels in October 2021, but I don't know what build backends would have supported them - Setuptools didn't until August 2022.) And the thing is, you don't need editable wheels. They're a hack, and a complex 99% solution for something where the 98% solution is literally creating a one-line `.pth` file in the appropriate `site-packages` folder.
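To make the `.pth` trick concrete, here's a minimal sketch. The paths are hypothetical, and a temp directory stands in for the real `site-packages` so the example is self-contained; in practice you'd write the file into the interpreter's actual `site-packages`:

```python
import pathlib
import site
import sys
import tempfile

# A temp dir stands in for site-packages so this sketch is self-contained;
# in practice you'd target the interpreter's real site-packages directory.
fake_site = pathlib.Path(tempfile.mkdtemp())
project_src = pathlib.Path(tempfile.mkdtemp())  # stands in for your project's src/ dir

# The one-line .pth file: each plain line naming an existing directory is
# appended to sys.path when the site dir is processed at interpreter startup.
(fake_site / "myproject.pth").write_text(str(project_src) + "\n")

site.addsitedir(str(fake_site))  # what the startup machinery does for site-packages
print(str(project_src) in sys.path)
```

That's the entire mechanism: edits to files under `project_src` are picked up immediately, no reinstall needed.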
> Pipfile and Pipfile.lock are supposed to replace requirements.txt some day.
I don't know that PyPA was still making this claim in 2021, but if they were it was naive. They gave up on maintaining the Pipfile spec in early 2022 (https://github.com/pypa/pipfile) and most of the ideas were subsumed by pyproject.toml anyway. Lock files turned out to be a much harder problem, currently being worked on (https://peps.python.org/pep-0751/).
> pip and setuptools support pyproject.toml to some extent, but not to a point where it completely replaces setup.py yet.
pyproject.toml isn't meant as a replacement for setup.py and can never really be one. The point is that setup.py can now be focused on orchestrating the build, rather than metadata.
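For illustration, this is roughly what that split looks like with Setuptools as the backend (the names are placeholders, and the `[project]` table requires Setuptools 61+); `setup.py`, if you keep one at all, is left to orchestrate the build:

```toml
# Static metadata lives here; setup.py (if present) only orchestrates the
# build, e.g. compiling C extensions. Project name/deps are hypothetical.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "example"
version = "0.1.0"
dependencies = ["requests"]
```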
> Ironically, Python settled for the TOML file format here, although there is currently no support for reading TOML files in Python’s standard library.
It was still the right choice IMO. Tools like Pip just vendored third-party support and one of those packages made it into the standard library in late 2022 (Python 3.11).
> Regarding the tooling, pip and twine are sufficient and do their job just fine.
Pip was sufficient, but I wouldn't call it "just fine" even today. No real problems with Twine, though; you just have to be a bit careful with your PyPI API token. Poetry adds its own `poetry publish` command, which seems entirely redundant.