dannyz 3 days ago

While `uv` works amazingly well, I think a lot of people don't realize that installing packages through conda (or rather the conda-forge ecosystem) has technical advantages compared to wheels/PyPI.

When you install the numpy wheel through `uv` you are likely installing a pre-compiled binary that bundles openblas inside of it. When you install numpy through conda-forge, it dynamically links against a dummy blas package that can be substituted for mkl, openblas, accelerate, whatever you prefer on your system. It's a much better solution to be able to rely on a separate package rather than having to bundle every dependency.

Then let's say you install scipy. Scipy also has to bundle openblas in its wheel, and now you have two copies of openblas sitting around. They don't conflict, but this quickly becomes an odd thing to have to do.
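
(For the curious, swapping the BLAS backend in a conda-forge environment is just re-pinning the metapackage. A rough sketch from memory -- check the conda-forge knowledge base for the exact syntax:)

    # everything that links libblas in this env now resolves to MKL instead of openblas
    conda install "libblas=*=*mkl"

    # or switch back
    conda install "libblas=*=*openblas"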

  • KolenCh 2 days ago

    This is why I personally prefer pixi. It is uv-like but resolves using conda channels like conda does, and, similar to conda, it also supports PyPI packages (via uv).

    With a background in scientific computing, where many of the dependencies I manage are compiled, conda packages give me much more control.

    P.S. I’d like to point out to others the difference between package indexes and package managers. PyPI is an index (that hosts packages in a predefined format) while pip, poetry, and uv are package managers that resolve and build your environments using the index.

    Similarly, but a bit more confusingly, conda can be understood as the index, hosted by Anaconda but also hostable elsewhere, with different “channels” (kind of like a GitHub organization), of which conda-forge is a popular community-built one. Conda is also the reference implementation of a package manager that uses anaconda channels to resolve. Mamba is an independent, performant, drop-in replacement for conda. And pixi is a different package manager with a different interface, by the authors of mamba.

    Even more confusingly, there are distributions. A distribution comes with a set of predefined packages together with the package manager, so that you can just start running things immediately (sort of like a TeX Live distribution in relation to the package manager tlmgr). There is the Anaconda distribution (if you installed Anaconda instead of conda, that’s what you got), but also Intel’s Distribution for Python, Miniforge, Mambaforge, etc.

  • fire_lake 2 days ago

    Is there a reason this behaviour couldn’t be implemented in uv?

    Is this beyond what the pyproject.toml spec supports?

    • aragilar 17 hours ago

      Fundamentally, conda is like a Linux distro (or Homebrew): it is a cross-language package manager designed to work with a coherent set of packages (either via the anaconda channel or conda-forge). uv is currently a different installer for PyPI, which means it inherits all of PyPI's positives and negatives. One of the negatives is that the packages are not coherent, so everything needs to be vendored in such a way as to not interfere with other packages. Unless Astral wants to pay packagers to create a parallel ecosystem, uv cannot do this.

    • setopt 2 days ago

      My guess is that the difference is more that PyPI intends to be a Python package repository, and thus I don’t think you can just upload say a binary copy of MKL without accompanying Python code. It’s originally a source-based repository with binary wheels being an afterthought. (I still remember the pre-wheel nightmare `pip install numpy` used to give, when it required compiling the C/C++/Fortran pieces which often failed and was often hard to debug…)

      But Anaconda and conda-forge are general package repositories: they are not Python-specific but are happy to be used for R, Julia, C/C++/Fortran binaries, etc. It’s primarily a binary-based repository. For example, you can `conda install python` but you can’t `pip install python`.

      I don’t know if there is any technical barrier or just a philosophical barrier. Clearly, Pip handles binary blobs inside of Python packages fine, so I would guess the latter but am happy to be corrected :).

japanuspus 3 days ago

The problem conda solved that nothing had solved before was installing binary dependencies on MS Windows.

Before conda, getting a usable scipy install up and running on MS Windows was a harrowing experience. And having two independent installations was basically impossible. The real hard work that went into conda was reverse engineering all the nooks and crannies of the DLL loading heuristics, to allow it to ensure that you loaded what you intended.

If you are working on macOS and deploying to some *nix in the cloud, you are unlikely to find any value in this. But in ten years as lead on a large tool that was deployed to personal (Windows) laptops in a corporate environment, I did not find anything that beat conda.

  • dist-epoch 2 days ago

    This was mostly because most scientific packages didn't provide Windows binary builds for many years.

    Today you can just "pip install scipy" on Windows and it will just work.

    • wruza 2 days ago

      Oh right, recently I started learning classic ML and “just” tried to install tensorflow, which, itself or through one of its dependencies, stopped providing Windows binaries after some version x.y.z, so my Python had to be downgraded to 3.a, and then other dependencies stopped installing. Eventually I managed to find a working version intersection for everything, together with some shady repo, but it felt like one more requirement and I’d have been overconstrained.

      • disgruntledphd2 2 days ago

        Tensorflow is the worst. Basically every time my python env was borked (with multiple incompatible versions of Numpy) it was down to tensorflow.

        • coredog64 a day ago

          You said it. I was working with an official Google library that used TF and it didn’t work at all with 3.12. I spent a day building the wheels for 3.12 only to find there was a bug with dataclasses. :|

          I can’t recall the library, but there was another major project that just deprecated TF because it was the cause of so many build problems.

  • hooloovoo_zoo 2 days ago

    It also made a bunch of neural net libraries much easier to install before containers became popular.

alsodumb 3 days ago

As someone with admittedly no formal CS education, I've been using conda for all of my grad school and never managed to break it.

I create a virtual environment for every project. I install almost all packages with pip, except for any binaries or CUDA related things from conda. I always exported the conda yaml file and managed to reproduce the code/environment including the Python version. I've seen a lot of posts over time praising poetry and other tools and complaining about conda but I could never relate to any of them.
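
(Concretely, the export/recreate cycle is just the standard conda commands, something like:)

    conda env export > environment.yml    # snapshot the env, including pip-installed packages
    conda env create -f environment.yml   # recreate it on another machine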

Am I doing something wrong? Or something right?

  • duped 3 days ago

    My experience with conda is that it's fine if you're the original author of whatever you're using it for and never share it with anyone else. But as a professional I usually have to pull in someone else's work and make it function on a completely different machine/environment. I've only had negative experiences with conda for that reason. IME the hard job of package management is not getting software to work in one location, but allowing that software to be moved somewhere else and used in the same way. Poetry solves that problem, conda doesn't.

    Poetry isn't perfect, but it's working in an imperfect universe and at least gets the basics (lockfiles) correct to where packages can be semi-reproducible.

    There's another rant to be had at the very existence of venvs as part of the solution, but that's neither poetry's nor anaconda's fault.

    • LarsDu88 3 days ago

      Poetry is pretty slow. I think `uv` will ultimately displace it on that basis alone.

      • AnthonBerg 2 days ago

        For what it’s worth – A small technical fact:

        It is entirely possible to use poetry to determine the precise set of packages to install and write a requirements.txt, and then shotgun install those packages in parallel. I used a stupidly simple fish shell for loop that ran every requirements line as a pip install with an “&” to background the job and a “wait” after the loop. (iirc) Could use xargs or parallel too.

        This is possible at least. Maybe it breaks in some circumstances but I haven’t hit it.
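
        Roughly this, in plain bash terms (a sketch; I actually used fish, and `poetry export` may need the export plugin depending on your poetry version):

            # resolve with poetry, then shotgun-install the pinned set with pip
            poetry export -f requirements.txt --without-hashes -o reqs.txt
            while read -r req; do
                [ -n "$req" ] && pip install "$req" &   # one backgrounded install per pinned line
            done < reqs.txt
            wait   # let all the parallel installs finish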

        • macinjosh 2 days ago

          That poor package server getting 39 simultaneous pulls at the same time from one user.

          • AnthonBerg 2 days ago

            This is indeed something to consider!

            Not as an excuse for bad behavior but rather to consider infrastructure and expectations:

            The packages might be cached locally.

            There might be many servers – a CDN and/or mirrors.

            Each server might have connection limits.

            (The machine downloading the packages miiiiiight be able to serve as a mirror for others.)

            If these are true, then it’s altruistically self-interested for everyone that the downloader gets all the packages as quickly as possible to be able to get stuff done.

            I don’t know if they are true. I’d hope that local caching, CDNs and mirrors as well as reasonable connection limits were a self-evident and obviously minimal requirement for package distribution in something as arguably nation-sized as Python.

            And… just… everywhere, really.

      • emptiestplace 3 days ago

        I actually can't believe how fast `uv` is.

        • jkrubin 3 days ago

          Ditto. It’s wild.

          • delduca 2 days ago

            Poetry is a pain. uv is much better IME/IMO.

      • robertlagrant 2 days ago

        How is uv so much faster? My understanding is Poetry is slow sometimes because PyPI doesn't have all the metadata required to solve things, so it needs to download packages and then figure it out.

        • sswatson 3 hours ago

          If I recall correctly, uv is doing some ninja stuff like guessing the part of the relevant file that is likely to contain the metadata it needs and then doing a range request to avoid downloading the whole file.
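
          Conceptually something like this (illustrative -- $WHEEL_URL stands in for a real wheel URL, and the client then parses the zip central directory in that tail to locate the METADATA file):

              # fetch only the last 64 KiB of the wheel instead of the whole thing
              curl -s -r -65536 -o wheel-tail.bin "$WHEEL_URL"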

      • lrog 3 days ago

        Can you recommend any good article / comparison of uv vs poetry vs conda?

        We've used different combinations of pipx+lockfiles or poetry, which has been so far OK'ish. But recently discovered uv and are wondering about existing experience so far across the industry.

        • NeutralForest 2 days ago

          From my experience, uv is way better and it's also PEP-compliant in terms of pyproject.toml. Which means in case uv isn't a big player in the future, migrating away isn't too difficult.

          At the same time, poetry still uses a custom format and is pretty slow.

    • alkh 3 days ago

      +1. On top of that, even with the new resolver it still takes ages to resolve a dependency for me, so sometimes I end up just using pip directly. Not sure if I am doing something wrong (maybe you have to manually tweak something in the configs?) but it's pretty common for me to experience this.

      • aja12 3 days ago

        Like sibling comments, after using poetry for years (and pipx for tools), I tried uv a few months ago

        I was so amazed by the speed, I moved all my projects to uv and have not looked back.

        uv replaces all of pip, pipx and poetry for me. It does not do more than these tools, but it does it right and fast.

        If you're at liberty to try uv, you should try it someday, you might like it. (nothing wrong with staying with poetry or pyenv though, they get the job done)

      • snicker7 3 days ago

        I believe the problem is the lack of proper dependency indexing at PyPI. The SAT solvers used by poetry or pdm or uv often have to download multiple versions of the same dependencies to find a solution.

    • throwawaymaths 3 days ago

      imagine being a beginner to programming and being told "use venvs"

      or worse, imagine being a longtime user of shells but not python and then being presented a venv as a solution to the problem that for some reason python doesn't stash deps in a subdirectory of your project

      • Tainnor 3 days ago

        You don't need to stash deps in a subdirectory; IMHO that's a node.js design flaw that leads to tons of duplication. I don't think there's any other package manager for a popular language that works like this by default (Bundler does let you vendor dependencies, which can be useful for deployment, but you still only ever get one version of any dependency, unlike node).

        You just need to have some sort of wrapper/program that knows how to figure out which dependencies to use for a project. With bundler, you just wrap everything in "bundle exec" (or use binstubs).

        • marcosdumay 2 days ago

          What was unique to node.js was the decision to not only store the dependencies in a sub-folder, but also to apply that rule, recursively, for every one of the projects you add as a dependency.

          There are many dependency managers that use a project-local flat storage, and a global storage was really frowned upon until immutable versions and reliable identifiers became popular some 10 years ago.

      • lmz 3 days ago

        Wasn't node the only programming language that used a subdirectory for deps by default?

        Ruby and Perl certainly didn't have it - although Ruby did subsequently add Bundler to gems and gems supported multiversioning.

        • chuckadams 2 days ago

          It’s fairly common for Perl apps to use Carton (more or less a Perl clone of Bundler) to install vendored dependencies.

          • lmz 2 days ago

            Oh that's nice. When I last looked (quite a long time ago), local::lib seemed to be the recommended way, and that seemed a bit more fiddly than python's virtualenv.

            • chuckadams 2 days ago

              Carton uses local::lib under the covers. I found local::lib far less fiddly than virtualenv myself, but it just doesn't try to do as much as virtualenv. These days I do PHP for a living, and for all the awfulness in php, they did nail it with composer.

        • throwawaymaths 3 days ago

          Rust, julia, elixir

          • shwouchk 3 days ago

            julia just stores the analogue of a requirements.txt (Project.toml) and the lock file (Manifest.toml). And it has its own package issues, including packages regularly breaking for every minor release (although I enjoy the language and will keep using it).

          • lelandbatey 3 days ago

            All those came after Python/C/C++ etc., which were all from the wild-west of the "what is package management?" dark ages. The designers of those languages almost certainly asked themselves "how can we do package management better than existing technology like pip?"

          • duped 2 days ago

            Rust doesn't store dependencies under your project dir, but it does build them under your target.

      • duped 3 days ago

        I have imagined this, because I've worked on products where our first time user had never used a CLI tool or REPL before. It's a nightmare. That said, it's no less a nightmare than every other CLI tool, because even our most basic conventions are tribal knowledge that are not taught outside of our communities and it's always an uphill battle teaching ones that may be unfamiliar to someone from a different tribe.

        • MobiusHorizons 3 days ago

          It is true that every field (honestly every corner of most fields) has certain specific knowledge that is both incredibly necessary to get anything done, and completely arbitrary. These are usually historical reactions to problems solved in a very particular way usually without a lot of thought, simply because it was an expedient option at the time.

          I feel like venv is one such solution. A workaround that doesn’t solve the problem at its root, so much as make the symptoms manageable. But there is (at least for me) a big difference between things like that and the cool ideas that underlie shell tooling like Unix pipes. Things like jq or fzf are awesome examples of new tooling that fit beautifully in the existing paradigm but make it more powerful and useful.

      • wakawaka28 3 days ago

        Beginners in Python typically don't need venvs. They can just install a few libraries (or no libraries even) to get started. If you truly need venvs then you're either past the initial learning phase or you're learning how to run Python apps instead of learning Python itself.

        For some libraries, it is not acceptable to stash the dependencies for every single toy app you use. I don't know how much space TensorFlow or PyQt use but I'm guessing most people don't want to install those in many venvs.

        • michaelmrose 2 days ago

          Intelligent systems simply cache and re-use versions and do stash deps for every toy project without consuming space.

          Also, installing everything with pip is a great way to enjoy unexplainable breakage when A doesn't work with v1 and B doesn't work with v2.

          It also leads to breaking Linux systems where a large part of the system is Python code, especially where the user upgrades the system Python for no reason.

          • wakawaka28 2 days ago

            If you install a package in a fresh environment then it does actually get installed. It can be inherited from the global environment but I don't think disparate venvs that separately install a package actually share the package files. If they did, then a command executed in one tree could destroy the files in another tree. I have not done an investigation to look into this today but I think I'm right about this.

            • michaelmrose 2 days ago

              In better-designed systems than Python they do. To share them with Python you need a filesystem with dedup, e.g. Btrfs or ZFS.

              • wakawaka28 a day ago

                Python's venv design is not obviously unintelligent. It must work on all sorts of filesystems, which limits how many copies can be stored and how they can be associated. More advanced filesystems can support saving space explicitly for software that exploits them, and implicitly for everyone, but there is a cost to everything.

        • throwawaymaths 3 days ago

          i remember reading somewhere (on twitter iirc) an amateur sex survey statistician who decided she needed to use python to analyze her dataset, being guided toward setting up venvs pretty early on by her programmer friends and getting extremely frustrated.

          • jazzyjackson 3 days ago

            Was it aella? I don't know of any other sex survey statisticians so I'm assuming you mean aella. She has a pretty funny thread here but no mention of venvs: (non-musk-link https://xcancel.com/Aella_Girl/status/1522633160483385345)

              Every google for help I do is useless. Each page is full of terms I don't understand at *all*. They're like "Oh solving that error is simple, just take the library and shove it into the jenga package loader so you can execute the lab function with a pasta variation".
            
            She probably would have been better off being pointed towards jupyter, but that's neither here nor there

            • wakawaka28 2 days ago

              Good grief there seems to be no getting away from that woman. One of my ex girlfriends was fascinated by her but to me she is quite boring. If she wasn't fairly attractive, nobody would care about her banal ramblings.

  • theamk 3 days ago

    You are doing something right, author does some pretty unusual things:

    - Setup custom kernels in Jupyter Notebook

    - Hardlink the environments, then install same packages via pip in one and conda in others

    - install conda inside conda (!!!) and enter nested environment

    - Use tox within conda

    I believe as long as you treat the environments as "cattle" (if it goes bad, remove it and re-create it from the yaml file), you should not have any problems. That's clearly not the case for the post's author though.

    • fluorinerocket 3 days ago

      Yep, nuke the bad env and start over. Conda is great; the only problems are when a package is not available on conda-forge or you have to compile and install with setup.py. But then you can blow the env away and start over.

  • bean-weevil 3 days ago

    As someone with a formal computer science education, half of my friends who work in other sciences have asked me to help them fix their broken conda environments.

  • rcxdude 3 days ago

    This is exactly the kind of thing that causes Python package nightmares. Pip is barely aware of packages it's installed itself, let alone packages from other package managers and especially other package repositories. Mixing conda and pip is 100% doing it wrong (not that there's an easy way to do it right, but stick to one or the other; I would generally recommend just using pip, since the reasons for conda's existence are mostly irrelevant now).

    • skeledrew 3 days ago

      I still run into cases where a pip install that fails due to some compile issue works fine via conda. It's still very relevant. It's pip that should be switched out for something like poetry.

      • rcxdude 3 days ago

        poetry vs pip does very little for compilation-related install failures. Most likely the difference is whether you are getting a binary package or not, and conda's repository may have a binary package that pypi does not (but also vice-versa: nowadays pypi has decent binary packages, previously conda gained a lot of popularity because it had them while pypi generally did not, especially on windows). But the main badness comes from mixing them in the same environment (or mixing pypi packages with linux distribution packages, i.e. pip installing outside of a virtual environment).

        (I do agree pip is still pretty lackluster, but the proposed replacements don't really get to the heart of the problem and seem to lack staying power. I'm in 'wait and see' mode on most of them)

        • skeledrew a day ago

          Oh, I meant that poetry could be a general replacement for pip (actually it uses it in its backend) because it does a great job managing dependencies and projects in general.

    • whywhywhywhy 2 days ago

      It works about as well as is possible with Python: conda to manage the environments and Python versions, and pip to install the packages.

  • maurosilber 3 days ago

    I had the same experience. But you should try pixi, which is to conda what uv is to pip.

    • akdor1154 3 days ago

      Isn't uv to conda what uv is to pip?

      • LarsDu88 3 days ago

        `uv` is not a drop-in replacement for `conda` in the sense that `conda` also handles non-Python dependencies, has its own distinct API server for packages, and has its own packaging yaml standard.

        `pixi` basically covers `conda` while using the same solver as `uv` and is written in Rust like `uv`.

        Now is it a good idea to have Python's package management tool handle non-Python packages? I think that's debatable. I personally am in favor of a world where `uv` is simply the final Python package management solution.

        Wrote an article on it here: https://dublog.net/blog/so-many-python-package-managers/

        • traversaro 2 days ago

          I am not sure pixi uses the same solver as uv, at least in general. pixi uses resolvo (https://github.com/mamba-org/resolvo) for conda packages, while it uses uv (which in turn uses pubgrub, https://github.com/pubgrub-rs/pubgrub) for pip packages.

        • immmmmm 2 days ago

          I have been using pixi for half a year and it has been fantastic.

          It’s fast, takes yml files as an input (which is super convenient) and super intuitive

          Quite surprised it isn’t more popular

        • Balinares 3 days ago

          Bookmarking. Thanks for sharing the link, looks like a great overview of that particular tragic landscape. :)

          Also crossing fingers that uv ends up being the last one standing when the comprehensive amounts of dust here settle. But until then, I'll look into pixi, on the off chance it minimizes some of my workplace sorrows.

  • fluorinerocket 3 days ago

    Same, but I try to use conda to install everything first, and only use pip as a last resort. If pip only installs the package and no dependencies, it's fine.

  • jszymborski 3 days ago

    God forbid you should require conda-forge and more than three packages lest the dependency resolver take longer than the heat death of the planet to complete.

    • fransje26 3 days ago

      Install mamba first?

      • jszymborski 3 days ago

        Mamba is indeed a lot better. I personally just don't bother with conda and stick to pip + venv.

  • throwawaymaths 3 days ago

    i think you got lucky and fell into best practices on your first go

    > except for any binaries or CUDA related things from conda

    doing the default thing with cuda related python packages used to often result in "fuck it, reinstall linux". admittedly, i dont know how it is now. i have one machine that runs python with a gpu and it runs only one python program.

    • disgruntledphd2 2 days ago

      > doing the default thing with cuda related python packages used to often result in "fuck it, reinstall linux"

      From about 2014-17 you are correct, but it appears (on ubuntu at least), that it mostly works now. Maybe I've just gotten better at dealing with the pain though...

  • thangngoc89 3 days ago

    1. You need to run the export manually, while the other tools you mentioned create it (the lock file) automatically.

    2. They distinguish between direct dependencies (packages you added yourself) and indirect dependencies (the packages those packages depend on).

These335 3 days ago

Can somebody please eli5 why it is so unanimously accepted that Python's package management is terrible? For personal projects venv + requirements.txt has never caused problems for me. For work projects we use poetry because of an assumption that we would need something better but I remain unconvinced (nothing was causing a problem for that decision to be made).

  • tasuki 3 days ago

    In your requirements.txt, do you pin the concrete versions or leave some leeway?

    If you aren't precise, you're gonna get different versions of your dependencies on different machines. Oops.

    Pinning concrete versions is of course better, but then there isn't a clear and easy way to upgrade all dependencies and check whether ci still passes.

    • marcosdumay 2 days ago

      You should use freeze files. Whatever language you are using, you should specify your dependencies in the loosest way possible, and use freeze files to pin them down.

      The only difference from one language to another is that some make this mandatory, while in others it's only something that you should really do and there isn't any other real option you should consider.
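
      In Python terms, one way to do that (pip-tools is one common choice for the freeze step; plain `pip freeze` works too):

          # requirements.in holds the loose, human-edited spec, e.g. "requests>=2.28"
          pip install pip-tools
          pip-compile requirements.in -o requirements.txt   # resolve and pin everything
          pip-sync requirements.txt                         # install exactly the pinned set
          # to upgrade: loosen/edit requirements.in, re-run pip-compile, check CI still passes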

  • marcosdumay 2 days ago

    > For personal projects venv + requirements.txt has never caused problems for me.

    That means you don't use Windows.

    What is great. Keep not using it. But most people will have a different experience.

    • kristianp 2 days ago

      Wait, are most of the people having problems in this thread using windows? It's been mentioned a couple of times, but not by most.

      • disgruntledphd2 2 days ago

        There's a bunch of people who tend to have problems with Python dep management:

        - Windows users

        - DS/compiled libs users (mostly Fortran/Cuda/C++)

        - Anyone with dependencies on native/non python libraries.

        Conda definitely helps with 2 and 3 above, and uv is at least a nice, fast API over pip (which is better since it started doing dependency checking and binary wheels).

        More generally, lots of the issues come from the nature of python as a glue language over compiled libraries, which is a relatively harder problem in general.

    • Kwpolska 2 days ago

      There are no Windows-specific issues in venv + pip. Windows can be more painful if you need to compile C extensions, but you usually don't, since most commonly used packages have had binary wheels for Windows on PyPI for many years.

  • Kwpolska 2 days ago

    For using packages, venv + requirements.txt works, but is a bit clunky and confusing. Virtual environments are very easy to break by moving them or by updating your OS (and getting a new Python with it). Poetry is one alternative, but there are far too many options and choices to make. For building packages, there are similarly many competing options with different qualities and issues.

    • rbanffy 19 hours ago

      > Virtual environments are very easy to break by moving them

      Virtual environments are cattle. Don’t treat them as pets. Just use pip freeze and recreate them elsewhere.
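
      i.e., something like:

          # snapshot the env you care about
          .venv/bin/pip freeze > requirements.txt

          # env broke, moved, or the OS Python changed? throw it away and rebuild
          rm -rf .venv
          python -m venv .venv
          .venv/bin/pip install -r requirements.txt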

  • shoo 2 days ago

    i think there might be merit to gdiamos's point that python is a popular language with a large number of users, and this might mean that python package management isn't unusually bad, but more users implies more complaints.

    i think there was a significant step change improvement in python packaging around 2012, when the wheel format was introduced, which standardised distributing prebuilt platform-specific binary packages. for packages with gnarly native library dependencies / build toolchains (e.g. typical C/fortran numeric or scientific library wrapped in a layer of python bindings), once someone sets up a build server to bake wheels for target platforms, it becomes very easy to pip install them without dragging in that project's native build-from-source toolchain.

    venv + pip (+ perhaps maintaining a stack of pre-built wheels for your target platform, for a commercial project where you want to be able to reproduce builds) gets most of the job done, and those ingredients have been in place for over 10 years.

    around the time wheel was introduced, i was working at a company that shipped desktop software to windows machines, we used python for some of the application components. between venv + pip + wheels, it was OK.

    where there were rough edges were things like: we have a dep on a python wrapper library pywhatever, which requires a native library libwhatever.dll built from the c++ whatever project to be installed -- but libwhatever.dll has nothing to do with python. maybe its maintainers kindly provide an msi installer, so when you install it onto a machine it goes into the windows system folder, and venv isn't able to manage it & offer isolation if you need to install multiple versions for different projects / product lines, as venv only manages python packages, not arbitrary library dependencies from other ecosystems

    but it's a bit much to blame python for such difficulties: if you have a python library that has a native dependency on something that isn't a python package, you need to do something else to manage that dep. that's life. if you're trying to do it on windows, which doesn't have an OS-level package manager.. well, that's life.

  • mardifoufs 3 days ago

    Try building a package and you will get hundreds of little paper cuts. Need a different index for some packages? It will work with `pip install --extra-index-url` on the CLI, but pip will not let you add an index in a requirements.txt for... security reasons. That means good luck trying to "enforce" the CUDA version of pytorch without using third-party tooling. So you either hard-code a direct link (which works, but breaks platform portability), or give up trying to make your project installable with `pip install` or `python -m build`. Remember, pytorch basically has no CUDA builds anymore in its pypi index and no way to get CUDA torch from there (but I think this might have changed recently?)

    Oh, and if some package you are using has a bug or something that requires you to vendor it in your repo, well then good luck because again, PEP 508 does not support installing another package from a relative link. You either need to put all the code inside the same package, vendored dependency included, and do some weird stuff to make sure that the module you wanted to vendor is used first, or... you just have to use the broken package, again for some sort of security reasons apparently.

    Again, all of that might even work when using pip from the cli, but good luck trying to make a requirements.txt or define dependencies in a standard way that is even slightly outside of a certain workflow.

    • vel0city 2 days ago

      You can include command line parameters like --index-url, --extra-index-url, and --find-links (and whatnot) in a requirements file.
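
      e.g. a requirements.txt along these lines (the CUDA index URL and version here are illustrative; check PyTorch's install page for the right ones):

          --extra-index-url https://download.pytorch.org/whl/cu121
          torch==2.3.1+cu121
          numpy>=1.26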

      • mardifoufs a day ago

        And have them build with `pip install .` or `python -m build`? With the default setuptools config (or even any tweaks to it)? It works until you actually try to package the app; that's where the edge cases start piling up, a lot of them due to very weird decisions made on a whim in some random discourse thread.

        Adding index URLs is explicitly not supported in the requirements.txt in setuptools or the default python build tool.

  • mrweasel 2 days ago

    I can sort of see the argument, if you really really need to lock down your dependencies to very specific versions, which I don't recommend you do.

    For development I use venv and pip, sometimes pyenv if I need a specific Python version. For production, I install Python packages with apt. The operating system can deal with upgrading minor library versions.

    I really hate most other package managers; they are all too confusing and too hard to use. You need to remember to pull in library updates, rebuild and release. Poetry sucks too, it's way too complicated to use.

    The technical arguments against Python package managers are completely valid, but when people bring up Maven, NPM or even Go as role models I check out. The ergonomics of those tools are worse than venv and pip. I also think that's why we put up with pip and venv: they are so much easier to use than the alternatives (maybe excluding uv). If a project uses Poetry, I just know that I'm going to be spending half a day upgrading dependencies, because someone locked them down a year ago and there are now 15 security holes that need to be plugged.

    No, what Python needs is to pull in requests and a web framework into the standard library and then we can start build 50% of our projects without any dependencies at all. They could pull in Django, it only has two or three dependencies anyway.

  • invaliduser 3 days ago

    venv + requirements.txt has worked for every single python project I made for the last 2 years (I'm new to python). Only issue I had was when using a newish python version and not having a specific library released yet for this new version, but downgrading python solved this.

    Being new to the ecosystem I have no clue why people would use Conda and why it matters. I tried it, but was left bewildered, not understanding the benefits.

    • dagw 3 days ago

      > I have no clue why people would use Conda

      The big thing to realise is that when Conda first was released it was the only packaging solution that truly treated Windows as a first class citizen and for a long time was really the only way to easily install python packages on Windows. This got it a huge following in the scientific community where many people don't have a solid programming/computer background and generally still ran Windows on their desktops.

      Conda also not only manages your python interpreter and python libraries, it manages your entire dependency chain down to the C level in a cross platform way. If a python library is a wrapper around a C library then pip generally won't also install the C library, Conda (often) will. If you have two different projects that need two different versions of GDAL or one needs OpenBLAS and one that needs MKL, or two different versions of CUDA then Conda (attempts to) solve that in a way that transparently works on Windows, Linux and MacOS. Using venv + requirements.txt you're out of luck and will have to fall back on doing everything in its own docker container.

      Conda lets you mix private and public repos as well as mirroring public packages on-prem in a transparent way, much smoother than pip, and has tools for things like audit logging, fine-grained access control, package signing and centralised controls and policy management.

      Conda also has support for managing multi-language projects. Does your python project need nodejs installed to build the front-end? Conda can also manage your nodejs install. Using R for some statistical analysis in some part of your data pipeline? Conda will manage your R install. Using a Java library for something? Conda will make sure everybody has the right version of Java installed.
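
      e.g. one environment with all of that pinned together (package names as on conda-forge; versions illustrative):

          conda create -n myproj -c conda-forge python=3.11 nodejs=20 r-base=4.3 openjdk=17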

      Also, it at least used to be common for people writing numeric and scientific libraries to release Conda packages first and then only eventually publish on PyPi once the library was 'done' (which could very well be never). So if you wanted the latest cutting edge packages in many fields you needed Conda.

      Now there are obviously a huge class of projects where none of these features are needed and mean nothing. If you don't need Conda, then Conda is no longer the best answer. But there are still a lot of niche things Conda still does better than any other tool.

      • disgruntledphd2 2 days ago

        > it manages your entire dependency chain down to the C level in a cross platform way.

        I love conda, but this isn't true. You need to opt-in to a bunch of optional compiler flags to get a portable yml file, and then it can often fail on different OS's/versions anyway.

        I haven't done too much of this since 2021 (gave up and used containers instead) but it was a nightmare getting windows/mac builds to work correctly with conda back then.

        • dagw 19 hours ago

          > it was a nightmare getting windows/mac builds to work correctly

          I think both statements can be true. Yes getting cross platform windows/Mac/Linux builds to work using Conda could definitely be a nightmare as you say. At the same time it was still easier with Conda than any other tool I've tried.

    • drwu 3 days ago

      I am using Conda to build binary modules for different Python versions.

      As a user of the modules, venv is sufficient.

  • gdiamos 3 days ago

    It's popular enough that it causes pain for a lot of people.

    Coming from C++, IMO, it is vastly better.

    • dccsillag 3 days ago

      Well, yes, but that's an extremely low bar!!

whywhywhywhy 2 days ago

Really the issue is python itself: it shouldn't be treating its installs and packages as something that's linked and intertwined with the base operating system.

People like to complain about node packages, but I've never seen people have the trouble with them that they have with python.

  • theamk 2 days ago

    What do you do though if you want to import code written in C++? Especially complex, dependency-heavy like CUDA/ML stuff?

    You can just give up and say that "The proper way to do this is to use the Nvidia CUDA toolkit to write your cuda app in C++ and then invoke it as a separate process from node" [0]. That apparently works for node, but Python wants much more.

    If you actually want to use high-performance native code from your slow interpreted language, then no solution is going to be very good; that's because the problem is inherently hard.

    You can rely on host OS as much as possible - if OS is known, provide binaries; if it's unknown, provide source code and hope user has C/C++/Rust/Fortran compilers to build it. That's what uv, pip, etc.. do.

    You can create your own parallel OS, bringing your own copy of every math library, as well as CUDA, even if there are perfectly good versions installed on the system - that's what conda/miniconda does.

    You can implement as much as possible in your own language, so there is much less need to use "high-performance native language" - that's what Rust and Go do. Sadly, that's not an option for Python.

    [0] https://stackoverflow.com/questions/20875456/how-can-i-use-c...

  • rbanffy 19 hours ago

    In Python you need to deliberately mess with the system Python by running your package installer under sudo. That’s not something you do accidentally.

    When dealing with the system Python you should always use the system package manager. That extends to Macs for both brew and MacPorts.

  • macinjosh 2 days ago

    This is spot on. Running some python projects on nixos is a nightmare because of this model. Especially if it’s ML related.

mdaniel 2 days ago

I hope you read it while it was available, because the domain has expired

    Domain Name: pyherald.com
    Registry Domain ID: 2663190918_DOMAIN_COM-VRSN
    Registrar WHOIS Server: whois.namesilo.com
    Registrar URL: https://www.namesilo.com/
    Updated Date: 2024-12-21T07:00:00Z
    Creation Date: 2021-12-21T07:00:00Z
    Registrar Registration Expiration Date: 2024-12-21T07:00:00Z
https://web.archive.org/web/20241220211119/https://pyherald.... is the most recent snapshot.

BiteCode_dev 3 days ago

Conda used to be a life saver years and years ago, when compiled extensions were hard to install because you had to compile them yourself.

Nowadays, thanks to wheels being numerous and robust, the appeal of anaconda is disappearing for most users except for some exotic mixes.

conda itself now causes more trouble than it solves as it's slow, and lives in its own incompatible world.

But anaconda solves a different problem now that nobody else solves, and that's managing Python for big corporations. This is worth a lot of money to big organizations that need to control package origins, permissions, updates, and so on, at scale.

So it thrives there.

BrenBarn 3 days ago

Nothing in the "article" seems to support the title. A lot of it is just about Python packaging in general, or about problems when mixing conda- and pip-installed packages.

In my experience conda is enormously superior to the standard Python packaging tools.

  • Balinares 3 days ago

    If we're doing anecdotal evidence, then mine is that conda is by far the worst of the main Python packaging solutions in use. The absurd slowness and incompatibility with the entire rest of the Python world are only the visible tip of that iceberg. To the best of my ability to tell, conda largely exists to make up for endemic deficiencies in Windows software distribution toolchains (not Python specific) and sadly it's not even good at that either.

    Mind you, glad it works for you. Warms my grey heart to know there's some balance in this universe. :)

    • maxnoe 3 days ago

      The "absurd slowness" has been gone for more than a year now, since it switched to using the libmamba solver.

      • Balinares 19 hours ago

        We must be living in slightly different universes. (I'm in the Berenstain one. Things are not amazing here TBH.) I'll grant you that libmamba is faster, but we're still talking "tragicomic, leaning tragic" which is not in fact a qualitative improvement because we're still in the territory of switching to a different task while resolution, eventually, occurs.

        Another poster mentioned pixi elsewhere in the thread. I'll need to look into that.

zefrieira 3 days ago

I think Pixi mostly solves the main issues of conda by forcing users to have project-specific environments. It also solves environments incredibly fast, so it’s really quick to create new projects/environments. https://pixi.sh/

SamPatt 3 days ago

Conda is the only package manager I've used on Ubuntu that intermittently and inexplicably gets stuck when installing or uninstalling. It will sometimes resolve itself if left alone for hours, but often won't.

I avoid it as much as possible.

  • StableAlkyne 3 days ago

    It's because of the SAT solver for dependencies. Unlike Pip, it keeps track of every package you installed and goes out of its way to avoid installing incompatible packages.

    Why go through all this trouble? Because originally it was meant to be a basic "scientific Python" distribution, and needed to be strict around what's installed for reproducibility reasons.

    It's IMO overkill for most users, and I suspect most scientific users don't care either - most of the time I see grads and researchers just say "fuck it" and use Pip whenever Conda refuses to get done in a timely fashion.

    And the ones who do care about reproducibility are using R anyway, since there's a perception those libraries are "more correct" (read: more faithful to the original publication) than Pythonland. And TBH I can't blame them when the poster child of it is Sklearn's RandomForestRegressor not even being correctly named - it's bagged trees under the default settings, and you don't get any indication of this unless you look at that specific kwarg in the docs.

    Personally, I use Conda not for reproducibility, but so all of my projects have independent environments without having to mess with containers

    • _Wintermute 2 days ago

      > And the ones who do care about reproducibility are using R anyway

      I worked in a pharma company with lots of R code and this comment is bringing up some PTSD. One time we spent weeks trying to recreate an "environment" to reproduce a set of results. Try installing a specific version of a package, and all the dependencies it pulls in are the latest version, whether or not they are compatible. Nobody actually records the package versions they used.

      The R community are only now realising that reproducible environments are a good thing, and not everybody simply wants the latest version of a package. Packrat was a disaster, renv is slightly better.

    • Balinares 3 days ago

      > Personally, I use Conda not for reproducibility, but so all of my projects have independent environments without having to mess with containers

      A perfectly reasonable goal, yup! Thankfully not one that, in fact, requires conda. Automated per-project environments are increasingly the default way of doing things in Python, thank goodness. It's been a long time coming.

  • KolenCh 2 days ago

    The situation has since changed as the solver is rewritten. (Upstreamed from the work done in mamba.) I encourage you to try again.

    • retrochameleon a day ago

      Does that mean I should consider using conda again over mamba? (It's nice to not be a black sheep, but conda's performance was embarrassingly abysmal.)

      • KolenCh a day ago

        As far as the solver is concerned, there should be no difference, as it has been upstreamed. But I personally can’t see a reason to go back, as mamba is supposed to be a drop-in replacement for conda. I default to using mamba and switch to conda only when necessary. There are some cases mamba can’t handle correctly, such as when you want to roll back to an earlier revision: https://github.com/mamba-org/mamba/issues/803

  • curiousgal 3 days ago

    Try it again. It now uses libmamba for solving dependencies

pmarreck 3 days ago

I feel like a major selling point of Nix is "solving the Python dependency-hell problem" (as well as that of pretty much every other stack)

I've seen so many issues with different Python venvs from different Python project directories stepping on each others' dependencies somehow (probably because there are some global ones), so the fact that I can now just stick a basic, barely-modified-per-project Python flake.nix file in each one and always be guaranteed to have the entirety of the same dependencies available when I run it 6 months later is a win.

  • drawnwren 3 days ago

    Do you have a publicly available copy of your flake?

    • jyap 3 days ago

      I started using Devenv which utilizes Nix. You might want to check that out.

      https://devenv.sh/

j0057 3 days ago

This seems to be an aggregation of some posts on python-list. Basically, extra-random opinions.

I'll offer mine: I won't say that Python packaging is generally excellent, but it's gotten much better over the years. The pyproject.toml is a godsend, the venv module is built into Python, and pip will by default no longer install packages outside of a venv. Dependency groups are being added, meaning that requirements.txt files can also be specified in pyproject.toml. Documentation is pretty good, especially if you avoid blog posts from 5+ years ago.
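
For reference, a minimal sketch of what that looks like (dependency groups as per PEP 735; tool support is still rolling out):

    [project]
    name = "myapp"
    version = "0.1.0"
    requires-python = ">=3.9"
    dependencies = ["requests>=2.31"]

    [dependency-groups]
    dev = ["pytest>=8", "ruff"]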

  • jszymborski 3 days ago

    pip + venv or just using Poetry usually is 100% headache-free for me. Conda, however, is usually a great way to ensure I have an awful time.

ur-whale 2 days ago

I tried Conda a number of times over the years, and regretted it every time.

These days, when I absolutely have to use it because some obscure piece of software can't run unless Conda, I install it in a VM so that:

    - I protect my working system from the damage of installing Conda on it

    - I can throw the whole garbage fire away without long term brain damage to my system once I'm done

jkrubin 3 days ago

All thoughts and opinions about conda aside, it’s the only sane way (on several platforms) to install gdalbins + gdal-python-bindings.

I don’t mind conda. It has a lot of caveats and weird quirks

  • cozzyd 2 days ago

    dnf install python3-gdal?

sznio 2 days ago

The domain got parked.

teekert 2 days ago

People here focus on Python, but to me, a bioinformatician, conda is much more, it provides 99.99% of the tools I need. Like bwa, samtools, rsem, salmon, fastqc, R. And many, many obscure tools.

  • mbreese 2 days ago

    I wish you luck with tracking down versions of software used when you're writing papers... especially if you're using multiple conda environments. This is pretty much the example used in the article -- version mismatches.

    But, I think this illustrates the problem very well.

    Conda isn't just used for Python. It's used for general tools and libraries that Python scripts depend on. They could be C/C++ that needs to be compiled. It could be a Cython library. It could be...

    When you're trying to be a package manager that operates on top of the operating system's package manager, you're always going to have issues. And that is why Conda is such a mess: it's trying to do too much. Installation issues are one of the reasons why I stopped writing so many projects in Python. For now, I'm only doing smaller scripts in Python. Anything larger than a module gets written in something else.

    People here have mentioned Rust as an example of a language with a solid dependency toolchain. I've used more Go, which similarly has had dependency management tooling from the beginning. By and large, these languages aren't trying to bring in C libraries that need to be compiled and linked into Python-accessible code (it's probably possible, but not the main use-case).

    For Python code though, when I do need to import a package, I always start with a fresh venv virtual environment, install whatever libraries are needed in that venv, and then always run the python from that absolute path (ex: `venv/bin/python3 script.py`). This has solved 99% of my dependency issues. If you can separate yourself from the system python as much as possible, you're 90% of the way there.

    Side rant: this is why I think there is a problem with Python to begin with -- *nix OSes all include a system-level Python install. Dependencies only become a problem when you're installing libraries in a global path. If you can have separate dependency trees for individual projects, you're largely safe. It's not very storage efficient, but that's a different issue.

    • ebolyen 2 days ago

      > I wish you luck with tracking down versions of software used when you're writing papers... especially if you're using multiple conda environments.

      How would you do this otherwise? I find `conda list` to be terribly helpful.

      As a tool developer for bioinformaticians, I can't imagine trying to work with OS package managers, so that would leave vendoring multiple languages and libraries in a home-grown scheme slightly worse and more brittle than conda.

      I also don't think it's realistic to imagine that any single language (and thus language-specific build tools or package manager) is sufficient, since we're still using Fortran deep in the guts of many higher-level libraries (recent tensor stuff is disrupting this a bit, but it's not like OpenBLAS isn't still there as a default backend).

      • mbreese 2 days ago

        > home-grown scheme slightly worse and more brittle than conda

        I think you might be surprised as to how long this has been going on (or maybe you already know...). When I started with HPC and bioinformatics, Modules were already well established as a mechanism for keeping track of versioning and multiple libraries and tools. And this was over 20 years ago.

        The trick to all of this is to be meticulous in how data and programs are organized. If you're organized, then all of the tracking and trails are easy. It's just soooo easy to be disorganized. This is especially true with non-devs who are trying to use a Conda installed tool. You certainly can be organized and use Conda, but more often than not, for me, tools published with Conda have been a $WORKSFORME situation. If it works, great. If it doesn't... well, good luck trying to figure out what went wrong.

        I generally try to keep my dependency trees light and if I need to install a tool, I'll manually install the version I need. If I need multiple versions, modules are still a thing. I generally am hesitant to trust most academic code and pipelines, so blindly installing with Conda is usually my last resort.

        I'm far more comfortable with Docker-ized pipelines though. At least then you know when the dev says $WORKSFORME, it will also $WORKFORYOU.

  • GuestFAUniverse 2 days ago

    And then somebody tries to install mamba via conda and that house of cards reveals itself.

    • teekert 2 days ago

      I install Snakemake via miniforge which uses mambaforge to make its own envs. Biology is messy ;)

oivey 3 days ago

Besides the horrendous formatting, some of the claims in this article seem incorrect or irrelevant. Like, is this even possible?

> A single Anaconda distribution may have multiple NumPy versions installed at the same time, although only one will be available to the Python process (note that this means that sub-processes created in this Python process won’t necessarily have the same version of NumPy!).

I’m pretty sure it's not, but maybe there is some insane way to cause subprocesses to do this. Besides that, under the author's definition, different Python virtualenvs also install multiple copies of libraries in the same way conda does.

The comments about Jupyter also seem very confused. It’s hard to make heads or tails of exactly what the author is saying. There might be some misunderstandings of how Jupyter kernels select environments.

> Final warning: no matter how ridiculous this is: the current directory in Python is added to the module lookup path, and it precedes every other lookup location. If, accidentally, you placed a numpy.py in the current directory of your Python process – that is going to be the numpy module you import.

This has nothing to do with conda.

shadowgovt 3 days ago

I think Python had a pretty good idea in standardizing a packaging protocol and then allowing competing implementations, but I would have preferred a single "blessed" solution. More than one package management option in an ecosystem always adds some kind of "can't get there from here" friction and an additional maintenance burden on package maintainers.

poetry has been working well enough for me as of late, but it'd be nice if I didn't have to pick.

icameron 3 days ago

Conda: a package manager disaster that started requiring a paid license for companies with over 200 employees. It worked 5 years ago; we can no longer legally use it.

  • rbanffy 19 hours ago

    It kind of limits its own blast radius that way.

smcleod 3 days ago

I honestly have no idea why anyone still uses Conda, it's a right pain in the ass. Python package management in general is a nightmare, but whenever I run up a project that uses Conda I immediately disregard it and use uv / pyenv.

benreesman 2 days ago

I strongly suspect that there is about to be a spike in Python packaging discussion over and above the high ambient baseline.

uv is here to kick ass and chew bubblegum. And it’s all out of gum.

kristianp 2 days ago

Are most people having problems with python packages using Windows? It's been mentioned a couple of times in this thread, but not that often.

prpl 2 days ago

conda was built for scientific Python, but had to solve packaging for everything below Python to make that work. There was no generic binary distribution solution for that layer across multiple architectures and operating systems.

viraptor 3 days ago

> The traditional setup.py install command may install multiple versions of the same package into the same directory

Wait, what? In what situation would that ever happen? Especially given that the directories for packages are not versioned, setuptools should never install two different versions in any way.

The_Colonel 3 days ago

It's rare to see something as systematically broken as Python package/dependencies ecosystem.

What I don't understand - what makes this so difficult to solve in Python? It seems that many other platforms solved this a long time ago - maven 2.0 was released almost 20 years ago. While it wasn't / isn't perfect by any means, its fundamentals were already decent back then.

One thing which I think messed this up from the beginning was applying the Unix philosophy with several/many individual tools as opposed to one cohesive system - requirements.txt, setuptools, pip, pipx, pipenv, venv... were always woefully inadequate, but produced a myriad of possible combinations to support. It seems like simplicity was the main motivation for such a design, but these certainly seem like examples of being too simplistic for the job.

I recently tried to run a Python app (after having a couple of years break from Python) which used conda and I got lost there quickly. Project README described using conda, mamba, anaconda, conda-forge, mini-forge, mini-conda ... In the end, nothing I tried worked.

  • perrygeo 3 days ago

    > what makes this so difficult to solve in Python?

    Python creates the perfect storm for package management hell:

    - Most of the valuable libraries are natively compiled (so you get all the fun of distributing binaries for every platform without any of the traditional benefits of native compilation)

    - The dynamic nature makes it challenging to understand the non-local impacts of changes without a full integration test suite (library developers break each other all the time without realizing it, semantic versioning is a farce)

    - Too many fractured packaging solutions, not a single one well designed. And they all conflict.

    - A bifurcated culture of interactive use vs production code - while they both ostensibly use the same language, they have wildly different sub-cultures and best practices.

    - Churn: a culture that largely disavows strong backwards compatibility guarantees, in favor of the "move fast and break things" approach. (Consequence: you have to move fast too just to keep up with all the breakage)

    - A culture that values ease of use above simplicity of implementation. Python developers would rather save 1 line of code in the moment, even if it pushes the complexity off to another part of the system. The quite obvious consequence is an ever-growing backlog of complexity.

    Some of the issues are technical. But I'd argue that the final bullet is why all of the above problems are getting worse, not better.

    • braza 2 days ago

      > Too many fractured packaging solutions, not a single one well designed. And they all conflict.

      100% this.

      For the last 4 years, one of the most frustrating parts of SWE that I've had to deal with on a daily basis has been packaging data science & machine learning applications and APIs in Python.

      Maybe this is a very mid solution, but one thing that worked for me was dockerized local environments with all dependencies pinned via Poetry [1]. The initial setup is not easy, but with a Makefile on top it's something I can explain and run through with a DS in only 4 hours, and it saves tons of hours of debugging and dependency conflicts.

      > Python developers would rather save 1 line of code in the moment, even if it pushes the complexity off to another part of the system.

      It sounds odd, but in several projects that I worked on, folks brought in the entire Scikit-Learn dependency just for the train_test_split function [2], because the team thought that would be simpler and easier than writing a function that splits the dataset themselves (a sketch of what I mean is below).

      [1] - https://github.com/orgs/python-poetry/discussions/1879 [2] - https://scikit-learn.org/1.5/modules/generated/sklearn.model...
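
      (A minimal, dependency-free sketch of the kind of helper I mean - the names and numbers are made up, standard library only:)

          import random

          def train_test_split(rows, test_size=0.2, seed=42):
              # shuffle a copy deterministically, then cut at the requested ratio
              rows = list(rows)
              random.Random(seed).shuffle(rows)
              cut = int(len(rows) * (1 - test_size))
              return rows[:cut], rows[cut:]

          train, test = train_test_split(range(100))
          print(len(train), len(test))  # 80 20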

      • biztos 2 days ago

        I'm trying to do the same but with uv instead of poetry. So far so good, and it helps that for me delivering as a docker container is a requirement, but I have no idea what's going to happen if I need to run "real" ML stuff. (Just doing a lot of plotting so far.)

    • aaroninsf 2 days ago

      I agree with all of these and it makes me wonder as I do from time to time,

      has anyone managed to make a viable P#, a clean break which retains most of what most people love about the language and environment; and cheerfully asserts new and immutable change in things like <the technical parts of the above>.

      When I have looked into this it seems people can't help but improve one-more-thing or one-other-thing and end up just enjoying vaguely-pythonic language design.

      • graemep 2 days ago

        IronPython? The problem with that is compatibility with, and easy access to, existing libraries which is the main reason to use Python in the first place.

        I also think some of the criticisms in the GP comment are not accurate. Most of the valuable libraries are natively compiled? Some important ones are, but not all.

        I think a lot of the problem is that Python's usage has changed. It's great for a wide range of uses (scripting, web apps and other server stuff, even GUIs), but it's really not a great match for scientific computing and the like, yet it has become widely used there because it is easy to learn (and has lots of libraries for that now!).

  • TheAceOfHearts 3 days ago

    The problem is that Python refuses to take responsibility for the whole ecosystem. One of the biggest success stories in programming language development has been Rust's realization that all of it matters: language, version management, package management, and build tools. To have a truly outstanding experience you need to take responsibility for the whole ecosystem. Python and many other older languages just focus on one part of the ecosystem, while letting others take care of different parts.

    If Python leadership had true visionaries they would sit down, analyze every publicly available Python project and build a single set of tools that could gradually and seamlessly replace the existing clusterfuck.

    Python developers will pretend the language is all about simplicity and then hand you over to the most deranged ecosystem imaginable. It sure is easy to pretend that you have a really simple ecosystem when you cover your eyes and focus on a small segment of the overall experience.

    • mook 3 days ago

      You can kind of see this in golang. Originally it came with stuff to download dependencies, but it had major issues with more complex projects, and some community-made tools became popular instead. But that meant multiple tools were used in different places and it was kind of a mess. Later on a new system (go modules) was built into the default toolchain, and even though it has problems it's good enough that it's now surprising for somebody to use non-default tools.

    • skeledrew 3 days ago

      Who will pay for all this responsibility?

      • TheAceOfHearts 3 days ago

        I don't know, but are we going to pretend that it would be particularly difficult to get funding for drastically simplifying and improving the tooling for one of the world's most popular programming languages?

        I'm not sure how Rust is doing it, but the problem is hardly insurmountable.

        • skeledrew a day ago

          The PSF does have massive financial challenges. I don't know how Rust does it either, but I think there's far less general overhead due to its specificity. Python has a far broader reach, with a lot of diverse use cases to cater to.

      • awestroke 3 days ago

        Yeah, who will pay for this drastic reduction in wasted time?

  • hobofan 3 days ago

    > What I don't understand - what makes this so difficult to solve in Python?

    I think there are many answers to this, and there are many factors contributing to it, but if I had to pick one: the setup.py file. It needs to be executed to determine the dependencies of a project. Since it's a script, any maintainer of any package you are using can do arbitrarily complex/dumb stuff in it, e.g. conditionally adding dependencies based on host-system-specific environment markers, or introducing assumptions about the environment it is being installed into. That makes trying to achieve all the things you'd want from a modern package manager so much harder (see the sketch at the end of this comment).

    This also means that the problem isn't just concentrated in 1-2 central package management projects, but scattered throughout the ecosystem (and some of the worst offenders are some of Python's most popular sub-ecosystems).

    There is some light with the introduction of pyproject.toml, and now uv as a tool taking advantage of it.
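
    (To make that concrete, here's the sketch mentioned above - a made-up setup.py showing the kind of dynamic dependency logic that can only be discovered by executing the file. pyproject.toml, by contrast, lets you declare dependencies statically so resolvers don't have to run any code.)

        # setup.py (illustrative only, not taken from any real package)
        import sys
        from setuptools import setup

        install_requires = ["requests"]
        if sys.platform == "win32":             # host-specific branching
            install_requires.append("pywin32")
        if sys.version_info < (3, 8):           # interpreter-specific branching
            install_requires.append("importlib-metadata")

        setup(
            name="example-pkg",
            version="0.1",
            # a resolver can't know this list without running the script
            install_requires=install_requires,
        )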

    • fire_lake 3 days ago

      > The setup.py file. It needs to be executed to determine the dependencies of a project.

      Yes, this should never have been allowed. It solved a problem in the short term but in the long term has caused no end of pain.

    • Kwpolska 2 days ago

      setup.py allowed arbitrary things, but at least it always went through setuptools (or closely related predecessors, such as distribute or distutils). There is now pyproject.toml, but at the same time, there are tons of build backends that can do different things. And one of the most popular modern packaging tools, poetry, uses a non-standard section for the package data.

  • dxuh 3 days ago

    I think at least part of it is that there are so many solutions for Python packaging, which are often intermixed or only half-supported by developers. It's a tough ask to provide dedicated support for pip, conda, poetry and whatever else is out there, plus a couple of different ways to create virtual environments. Of course if you do everything right, you set it up once (if even that) and it just keeps working forever, but it is never like that. Someone will use a tool you haven't, it will not work correctly, they will find a workaround, and the mess starts.

    Also, I think the fact that Python packages are sometimes distributed as shared libraries is a problem. When I think about conan or vcpkg (package managers for C and C++), they usually suck because some dependencies are available on some platforms and not on others, or even in one version on one platform and in another version on another, and you get messes all around if you need to support multiple platforms.

    I think generally binary package managers are almost always bad* and source based package managers almost always work well (I think those are essentially easy mode).

    *: unless they maintain a source package of their own that they actually support and have a fixed set of well-supported platforms (like system package managers on most Linux distros do).

    • theamk 3 days ago

      The problem is that a lot of Python "source" is actually C/C++ code, so simply having a "source-based package manager for Python" is very annoying, as you'd have to manage your C/C++ sources with some other mechanism.

      This is exactly the reason I've moved from pip to conda for some projects: "pip" was acting as a source-based package manager, and thus asking for C tools, libraries and dev headers to be installed - but not providing them, as they were non-Python and thus declared out of scope. Especially on older Linux distributions, getting dependencies right can be quite a task.
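
      (A concrete example of the difference, using lxml as a stand-in for any extension-heavy package - the apt package names are the usual Debian/Ubuntu ones:)

          $ sudo apt-get install build-essential python3-dev libxml2-dev libxslt1-dev
          $ pip install lxml                    # source build needs those headers when no compatible wheel exists

          $ conda install -c conda-forge lxml   # prebuilt; libxml2/libxslt come along as conda packages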

      • vel0city 2 days ago

        This used to be a big headache for me, especially having developers on Windows but deployment targets in Linux, but a lot of the libraries I commonly use these days are either pure python or ship wheels for the platforms I use.

        Were your issues recent or from several years ago?

        • theamk 2 days ago

          The issues were recent (as of few months ago), but the OS's were pretty old - Ubuntu 20.04 and even 18.04. Those are still officially supported with Ubuntu Pro (free for individuals), but have ancient libraries and Python versions.

  • cdavid 3 days ago

    A lot of path dependency, but essentially

      1. A good python solution needs to support native extensions. Few other languages solve this well, especially across unix + windows.
      2. Python itself does not have a package manager included.
    
    I am not sure solving 2 alone is enough, because it will be hard to fix 1 then. And ofc 2 would need to have a solution for older python versions.

    My guess is that we're stuck in a local maximum for a while, with uv looking like a decent contender.

    • 9dev 3 days ago

      PHP and composer do. You can specify native extensions in the composer.json file, along with an optional version requirement, and install them using composer just fine. Dependencies can in turn depend on specific extensions, or just recommend them without mandating an installation. This works across UNIX and Windows, as far as I’m aware.

      • dagw 3 days ago

        > PHP and composer do.

        Is that a new feature? Pretty sure it didn't a few years ago. If the thing I need needed the libfoo C library then I first had to install libfoo on my computer using apt/brew/etc. If a new version of the PHP extension comes out that uses libfoo 2.0, then it was up to me to update libfoo first. There was no way for composer to install and manage libfoo.

      • theamk 3 days ago

        does not seem so... Something as simple as "yaml" already requires reaching for apt-get: http://bd808.com/pecl-file_formats-yaml/

        > Php-yaml can be installed using PHP's PECL package manager. This extension requires the LibYAML C library version 0.1.0 or higher to be installed.

            $ sudo apt-get install libyaml-dev
        
        This is basically how "pip" works, and while it's fine for basic stuff, it gets pretty bad if you want to install a fancy numerical or cryptography package on an LTS linux system that's at the end of its support period.

        I am guessing that PHP might simply have less need for native packages, being more web-oriented.

  • pmarreck 3 days ago

    Nix solves it for me. Takes a bit more effort upfront, but the payoff is "Python dependency determinism," which is pretty much unachievable in any other way, so...
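
    (For anyone curious, a throwaway environment is a one-liner - the package set here is just an example:)

        $ nix-shell -p 'python3.withPackages (ps: with ps; [ numpy scipy ])'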

    • reactordev 3 days ago

      The answer is not Yet Another Tool In The Chain. Python community itself needs to address this. Because if they don’t then you’ll have requirements.txt, setuptools, pyproject, pip, pipx, pipenv, pyenv, venv, nix.

      • The_Colonel 3 days ago

        Agreed. Often there's quite a tight coupling between the core platform devs and package management - node.js has npm, Rust has cargo, Go has one as well, and for the most part it seems to have worked out fine for them. Java and .NET (and I think PHP) are different in the sense that the package management systems have no relation to the platform developers, but industry standards (maven, gradle, NuGet, Composer) still appeared and are widely accepted.

        But with Python it seems completely fractured - everyone tries to solve it their own way, with nothing becoming a truly widely used solution. More involvement from the Python project could make a difference. From my perspective, this mess is currently Python's biggest problem and should be prioritized accordingly.

        • neonsunset 3 days ago

          FWIW NuGet is to .NET what Cargo and crates are to Rust, rather than what Maven and Gradle are to Java. The package manager is just a part of the SDK.

          Even the CLI workflow is identical: dotnet add package / cargo add (.NET had it earlier too, it's nice that Cargo now also has it).

          • ElectricalUnion 3 days ago

            Wait, newer versions of the JDK now bundle maven and gradle? Then what does everyone use mvnw/gradlew for?

            • neonsunset 3 days ago

              This was referring to package manager being just a part of .NET's SDK. Gradle and Maven continue to ship separately.

          • The_Colonel 3 days ago

            Right, I forgot NuGet got adopted by Microsoft. But it started and gained prominence independently.

      • 331c8c71 3 days ago

        Nix is cross-language though. So it will be useful even if the Python mess is cleaned up a bit.

      • eviks 3 days ago

        Well, there is no way to address it then; no magic will eliminate everything from the list.

        So another tool isn't meaningfully different (and it can be the answer): if "the community" migrates to the new tool, it wouldn't matter that there are a dozen other unused tools.

        Same thing if "the community" fixes an existing tool and migrates to it: other unused tools will still exist

      • pxc 2 days ago

        Nix isn't 'yet another tool in the chain'; Nix demands to run the whole show, and in the Nix world native dependencies in all programming languages are first-class citizens that the ecosystem is already committed to handling.

        > Python community itself needs to address this.

        The Python community can't address it, really, because that would make the Python community responsible for a general-purpose package management system not at all limited to Python, but including packages written in C, C++, and Rust to start, and also Fortran, maybe Haskell and Go, too.

        The only role the Python community can realistically play in such a solution is making Python packages well-behaved (i.e., no more arbitrary code at build time or install time) and standardizing a source format rich with metadata about all dependencies (including non-Python dependencies). There seems to be some interest in this in the Python community, but not much.

        The truth, perhaps bitter, is that for languages whose most important packages all have dependencies foreign to the ecosystem, the only sane package management strategy is slotting yourself into polyglot software distributions like Nix, Guix, Spack, Conda, Pkgsrc, MacPorts, MSYS2, your favorite Linux distro, whatever. Python doesn't need a grand, unifying Python package manager so much as a limited, unified source package format.

      • otabdeveloper4 3 days ago

        Nix isn't another tool, it's a tool that subsumes all other tools.

      • turboponyy 3 days ago

        The thing is, Nix is not Yet Another Tool, it is the tool.

        • sethops1 2 days ago

          And so was Docker before Nix

          • pxc 2 days ago

            Docker is kinda the opposite of Nix in this respect— Docker is fundamentally parasitic on other tools for dependency management, and Nix handles dependencies itself.

            That parasitism is also Docker's strength: bring along whatever knowledge you have of your favorite language ecosystem's toolchain; it'll not only apply but it'll likely be largely sufficient.

            Build systems like Buck and Bazel are more like Nix in this respect: they take over the responsibilities of some tools in your language's toolchain (usually high-level build tools, sometimes also dependency managers) so they can impose a certain discipline and yield certain benefits (crucially, fine-grained incremental compilation).

            Anyway, Docker doesn't fetch or resolve the dependencies of Python packages. It leaves that to other tools (Nix, apt-get, whatever) and just does you the favor of freezing the result as a binary artifact. Immensely useful, but solves a different problem than the main one here, even if it eases some of the same burdens.

  • ngrilly 2 days ago

    Agreed. But the problem is now fully solved by https://docs.astral.sh/uv/.

    • biztos 2 days ago

      I'm enjoying uv but I wouldn't say the problem is "fully" solved -- for starters it's not uncommon to do `uv add foo` and then 5K lines of gobbledygook later it says "missing foo-esoterica.dll" and I have to go back to the multiplatform drawing board.

      • ngrilly 2 days ago

        Could it be a problem with a specific Python package being installed rather than uv itself?

    • jokethrowaway 2 days ago

      easy if you start from scratch, hard if you want to get existing projects working

      also it doesn't always work; I got stuck with some dependencies. When it works, it's amazing.

  • coliveira 2 days ago

    It is not a new discovery that Python is terrible for packaging and distribution. Unfortunately, very little has been done about this. The fact that Python is used in particular environments controlled by the developers, mainly machine learning, makes this even more difficult to fix.

    • irskep 2 days ago

      It's not really true to say "very little has been done." Thousands of person-hours have been invested into this problem! But the results have been mixed.

      At least uv is nice! https://docs.astral.sh/uv/

      • Kwpolska 2 days ago

        Time was spent, but on what? Creating 15+ different, competing tools? That won’t improve things. Blessing one tool and adopting something equivalent to node_modules could, but the core team is not interested in improving things this way.

  • cogman10 2 days ago

    > what makes this so difficult to solve in Python?

    I think the answer is the same thing that makes it difficult to make a good package manager for C++.

    When a language doesn't start with decent package management, it becomes really hard to retrofit a good one later in the lifespan of that language. Everyone can see "this sucks" but there's simply no good route to change the status quo.

    I think Java is the one language I've seen that has successfully done the switch.

    • Kwpolska 2 days ago

      Java, C#, JavaScript (node) all disagree. If the Python core team wanted good packaging, they could have done it ages ago. Sure, a good solution might not be applicable for past Python versions, but they aren’t doing anything to make it any better.

  • zmakrr 2 days ago

    PyPI was always broken due to weird ideas for problems that were long solved in other languages or distributions. They had/have the backing of fastly.net, which created an arrogant and incompetent environment where people listened to no one.

    Conda suffers from the virtual environment syndrome. Virtual environments are always imperfect and confusing. System libraries sometimes leak through. The "scientific" Python stack has horrible mixtures of C/C++/Cython etc., all poorly written and difficult to build.

    Projects deteriorated in their ability to build from source due to the availability of binary wheels and the explosion of build systems. In 2010 there was a good chance that building a C project worked. Now you fight with meson versions, meson-python, cython versions, libc versions and so forth.

    There is no longer any culture of correctness and code cleanliness in the Python ecosystem. A lot of good developers have left. Some current developers work for the companies who sell solutions for the chaos in the ecosystem.

    • auxym 2 days ago

      > The "scientific" Python stack has horrible mixtures of C/C++/Cython

      Don't forget a whole lot of FORTRAN :)

  • woodruffw 3 days ago

    Python packaging’s complexities are difficult to attribute to any single cause. But path dependency, extremely broad adoption, and social conventions within the Python community (which has historically preferred standards over picking single tools) are all contributing factors.

    Most of these aspects have significantly improved over the last decade, at least for the standard packaging ecosystem. I don’t know about Conda, which has always been its own separate thing.

  • BiteCode_dev 3 days ago

    Python packaging is broken mostly because bootstrapping is broken, and it cascades to packaging but people don't know the bootstrapping is responsible and blame packaging.

    Not saying packaging doesn't have faults, but on its own, on a good Python setup, it's actually better than average. But few people have a good setup. In fact most people don't know what a good setup looks like.

    And here is why bootstrapping is broken: https://www.bitecode.dev/p/why-is-the-python-installation-pr...

    • thangngoc89 3 days ago

      uv solves this issue nicely. uv manages Python versions and, being a single binary, installing it just involves downloading a file and adding it to PATH.
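
      (For reference, the whole bootstrap looks roughly like this - the version number is just an example:)

          $ curl -LsSf https://astral.sh/uv/install.sh | sh   # grab the static binary
          $ uv python install 3.12                            # fetch a managed CPython
          $ uv venv --python 3.12                             # create a venv using it
          $ uv pip install numpy                              # install into that venv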

      • BiteCode_dev 3 days ago

        Yes, that's one of the most important successes of the tool. Being written in Rust, it is completely independent from the Python setup, and therefore it doesn't care if you botched it. And with the indygreg standalone builds, it can even avoid the pyenv pitfall of compiling Python on your machine on Linux.

    • skeledrew 3 days ago

      My single setup routine has served me well for years, with little to no change: pipx as the tools manager, miniconda for env bootstrap and management, poetry (installed with pipx) for project management (works great with conda envs) and autoenv to ensure the correct env is always active for any project I'm currently in. The only issue I may potentially have is if I install anything apart from Python via conda, as that won't be reflected in the pyproject file.

  • f1shy 3 days ago

    >> One thing which I think messed this up from the beginning was applying the Unix philosophy with several/many individual tools as opposed to one cohesive system

    Well, Unix IS the cohesive system..

  • jokethrowaway 2 days ago

    my approach is to ignore all the *conda stuff and:

        yay -S python-virtualenv   # I'm on arch, do not confuse with 12 similarly named alternatives
        pyenv virtualenv 3.10 random-python-crap
        pyenv local 3.10.6/envs/random-python-crap
        pip install -r requirements.txt

    and it works (sometimes deps are in some other places, or you have to pass -c constraints.txt or there is no file and you need to create it in various ways)

    At least by not using local .env directories, I always know where to find them.

    I install a lot of AI projects so I have around 1TB just for the same python dependencies installed over and over again.

    Sometimes I can get away with using the same venv for two different projects, but 8 times out of 10 deps get broken.

  • liveoneggs 2 days ago

    python developers run pyenv inside of docker containers.. they just have no clue what good dependency management could even possibly look like

  • fire_lake 3 days ago

    I don’t think the Python community has a culture of thinking about software engineering in a principled and systematic way like you would see in places like Haskell, Rust or Clojure communities.

    Python's strength (and weakness) is an emphasis on quick scripts, data science and statistics.

    There’s simply not the right people with the right mindset.

    • skeledrew 3 days ago

      Not "right" or "wrong" mindset. Just different.

      • HideousKojima 2 days ago

        No, it's wrong because of the mess it makes, which makes even the things that that crowd of people wants to focus on, like quick scripts or data science, harder.

      • fire_lake 21 hours ago

        If your objective is to reliably build software then it’s objectively worse.

        • skeledrew 12 hours ago

          Many, probably the majority, just want to build something quickly and be done, or get to the next iteration. It's a huge reason why Python is widely adopted in classrooms and for ML/AI. It's objectively better than other languages that force extra overhead on users by default.

          • fire_lake 12 hours ago

            I would argue that it should be possible to make something rigorous and easy to use here. The Python model is pure incidental complexity.

            • skeledrew 10 hours ago

              That argument doesn't hold up in this context so far. Otherwise a decent number of the languages most preferred by experienced software engineers would be used more widely by people outside that set. And Python would then either be very different, or have far less mind share.

              Also keep in mind that a) Python has been around longer than every other "popular" language, and so b) it has a lot of baggage that it has to maintain in order to avoid another 2to3 fiasco.

orf 6 days ago

Impossible to read on mobile, not least because of the lack of any sensible word breaking.

  • anakaine 3 days ago

    Seconded. How about we don't write a blog trashing an implementation of something when our own design is missing some very basic accessibility and ux features.

    Though I agree with the premise, Conda is an absolute pest when you start customising an environment with a number of packages. Dependency resolution hell.

  • foepys 3 days ago

    How does one even do that? Browsers try quite hard to not break words and make text readable by default.

    • pphysch 3 days ago

      The author put "word-break:break-all" on a parent element of the text content, which itself is a <ul> containing <p>, probably for "layout" purposes. Methinks some CSS education is desperately needed.

    • deathanatos 2 days ago

      The blog author explicitly requested it, with `word-break: break-all`.

      Now why you would do that … IDK.

      Weirdly, it's the second post on HN this quarter to do it, from a completely different site. Makes me wonder if there's some viral piece of CSS advice out there …? (And nobody looks at their site…?) Bad LLM output?

  • kelnos 2 days ago

    I see this on Firefox on desktop too. I usually try not to criticize the presentation of things posted here, but this is completely unreadable. I've tried several times to get through the first couple paragraphs, but it's just not worth the extra mental effort.

  • bangaladore 3 days ago

    Even on Desktop.

    When 30% of your page is junk, it makes you wonder about the other 70%...

  • Finnucane 2 days ago

    I gave up after a few lines. Why would you do this?