I’m setting up a new laptop and considering which of the (many) environment managers to use this time around. My standard has been miniconda, since a big plus for me is the ability to set and download a specific Python version for each project, all in one tool. I also quite like having global access to different environments (i.e. environments aren’t tied to specific projects). I typically have a standard GenDataSci environment always available for initially testing things out, and then if I know I’ll be continuing with something as its own project I’ll make a standalone environment for it.
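For context, the conda side of that workflow is roughly the following (the Python versions, package list, and second environment name are just examples):

    # general-purpose environment with its own Python install
    conda create -n GenDataSci python=3.11 numpy pandas matplotlib jupyter
    conda activate GenDataSci

    # later, a dedicated environment for a specific project (name made up)
    conda create -n some-project python=3.10
    conda activate some-project
    pip install -r requirements.txt   # whatever that project needs; file is illustrative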
But I’ve also used poetry for tighter control and reproducibility when I’m actually packaging something to publish on PyPI. Hatch looks interesting as well, but I can’t tell whether it can install a separate Python version for each environment.
What workflows and managers are people using now?
It depends on what I’m developing: if I’m using Python to prototype something for another language, Jupyter with the base Anaconda install (no virtual environments) tends to be my go-to, so I have everything I need easily available and easy to debug. When I’m working on a package or script that will have to be used by others, though, I use vanilla virtual environments, so I can check that everything works correctly in a clean environment.
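The virtual environment part is nothing fancy, something like this (the editable install assumes the project has a pyproject.toml or setup.py):

    # per-project environment using whatever Python is on PATH
    python -m venv .venv
    source .venv/bin/activate   # on Windows: .venv\Scripts\activate

    # install the package being worked on, plus its declared dependencies
    pip install -e .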
I’ve never looked into poetry, though, what are its upsides?
For one thing, it creates a lock file, which is super useful for packaging: rather than just listing open-ended version requirements, it records exactly which versions were resolved and pins to those. I think it also has a pretty strict dependency resolver, which, again, is nice for publishing a package even if it’s a bit frustrating during development. It also makes publishing to PyPI very easy, with built-in commands rather than needing a separate tool like flit.
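To give a feel for the workflow (the package and dependency names here are made up), the day-to-day commands look something like:

    # scaffold a project with a pyproject.toml
    poetry new mypackage
    cd mypackage

    # add a dependency; poetry resolves it and writes the pinned result to poetry.lock
    poetry add requests

    # recreate the exact locked environment on another machine
    poetry install

    # build and publish to PyPI from the same tool
    poetry build
    poetry publish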