I’ve been trying Workstation recently. Python dependency issues drove me to Silverblue for the last two years, but a new machine with Nvidia got me to try Workstation again. I just had a mystery problem with Python after booting today, and that got me looking into Anaconda. I didn’t know it was used under the hood like this. I’m not sure how I feel about this level of Python integration. I would feel a lot more comfortable with a less accessible precompiled binary, but I know I’m likely making naïve assumptions in saying this. Any thoughts or insights?

  • @j4k3 (OP)
    1 year ago

    https://stackoverflow.com/questions/33683530/is-anaconda-for-fedora-different-from-anaconda-for-python

    I should have searched for this first, I guess. That is reassuring; I was mostly uncomfortable with the idea of the two being the same.

    Still, Anaconda from RH claims the software is mostly written in Python, and that still makes me uneasy. I’ve always thought of C as being very near to the hardware, like assembly, and of an interpreted language as prioritizing portability, flexibility, and access. I find it far harder to hack around with a binary written in C than with all of the Python stuff I’ve encountered. Maybe I’m just mistaken in my understanding of how this code is running.

    I look at C programs as tied to the hardware they are compiled to run on, with a kind of permanence. I look at Python as a language designed to constantly deprecate itself and its code base. It feels like an extension of proprietary hardware’s planned obsolescence and manipulation. I don’t consider programs written in Python to have permanence or long-term value, because their toolchains become nearly impossible to track down from scratch.

    • @[email protected]
      link
      fedilink
      English
      21 year ago

      You might be even more concerned to find that your Fedora package manager, DNF, is also written in Python: https://github.com/rpm-software-management/dnf

      The fact of the matter is that Python gets used all the time for system-level things, and frequently you just don’t know it because there is no “.py” extension.
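
      If you’re curious, you can check for yourself. Here’s a rough sketch (assuming a typical Linux layout with executables in /usr/bin) that lists programs whose shebang line points at a Python interpreter:

      ```python
      import pathlib

      # Scan /usr/bin for scripts whose shebang invokes Python, i.e.
      # Python programs hiding without a ".py" extension.
      python_tools = []
      for exe in pathlib.Path("/usr/bin").iterdir():
          try:
              first_line = exe.open("rb").readline()
          except OSError:
              continue  # unreadable entry; skip it
          if first_line.startswith(b"#!") and b"python" in first_line:
              python_tools.append(exe.name)

      print(sorted(python_tools))  # on Fedora this typically includes dnf itself
      ```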

      I’m not sure I understand your concerns about Python…

      1. Performance is worse than C, yes. But writing performance-sensitive code in pure Python is quite silly; the common approach is to put the hot path in a C library and call it from Python, getting the best of both worlds. DNF does this with libdnf (see the sketch after this list).
      2. “It feels like an extension of proprietary hardware’s planned obsolescence and manipulation.” This is very confusing to me. There has been one historic version change (2→3) that broke compatibility in a major way, and that transition had a literal decade of help, resources, and parallel development. The source code for every Python interpreter version is freely available to build and tweak if you’re unhappy with a particular one. Most Python scripts are written and used for ages without any changes.
      3. “I don’t consider programs written in Python to have permanence or long-term value, because their toolchains become nearly impossible to track down from scratch.” Again, what? As I said, every Python version is available to download, build, install, and tweak. It’s pretty much impossible for Python code to ever become unusable.
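
      To make point 1 concrete, here’s a minimal sketch of that pattern using ctypes from the standard library. It calls cos() from the system C math library rather than anything DNF-specific, and it assumes a Unix-like platform where find_library can locate libm:

      ```python
      import ctypes
      import ctypes.util

      # Load the system C math library; the compiled C code does the work.
      libm = ctypes.CDLL(ctypes.util.find_library("m"))

      # Declare the C signature: double cos(double)
      libm.cos.argtypes = [ctypes.c_double]
      libm.cos.restype = ctypes.c_double

      print(libm.cos(0.0))  # 1.0, computed in C, orchestrated from Python
      ```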

      Anyway, people like the Fedora folks working on Anaconda choose a language that makes sense for their purpose, and Python absolutely makes sense here compared to C. It allows for fast development and flexibility, and there’s not much in an installer that needs high performance.

      That’s not to say C isn’t a very important language too. But it’s important to use the best tool for the job.

      • @j4k3 (OP)
        1 year ago

        I mean, I’m playing with offline AI right now, and reproducibility sucks. Most of it is Python based. I think I need to switch to Nix for this situation where I need to compile most tools from source. While Python itself may be easily available in previous versions, sorting out the required dependencies is beyond what most users are capable of reproducing. I get the impression C is more centralized, with less potential to cause problems. At least when it comes to hobbyist embedded projects, the stuff centered around C has staying power long term, and the toolchains are easy to reproduce. Most Python stuff is more trouble to reproduce than it is worth once it is a few years old.

        IMO it feels something like the Android SDK, where obsolescence is part of the design. I shouldn’t need to track down all the previous package versions just to reproduce software; it should simply work by default when all of the dependencies are named. There should be something like a central library of deprecated packages that guarantees reproducibility. Even if these packages are available somewhere in the aether, piecing together relevant documentation for them in historical context is nearly impossible. This has been my experience as an intermediate user and hobbyist.
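
        The closest I’ve found to “naming all of the dependencies” is freezing a lockfile myself. A rough sketch, assuming pip is available in the running interpreter (the requirements.lock filename is just my own convention):

        ```python
        import subprocess
        import sys

        # Record the interpreter version plus every installed package,
        # pinned to exact versions, so the environment can be rebuilt.
        with open("requirements.lock", "w") as f:
            f.write(f"# python {sys.version.split()[0]}\n")
            subprocess.run([sys.executable, "-m", "pip", "freeze"],
                           stdout=f, check=True)

        # To rebuild later, on a matching interpreter version:
        #   python -m venv .venv && .venv/bin/pip install -r requirements.lock
        ```

        But that only helps if every pinned version is still downloadable years later, which is exactly the part I don’t trust.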

        • @[email protected]
          link
          fedilink
          English
          21 year ago

          I see what you mean. The Python ML ecosystem is… not far off from what you describe.

          But please consider Python as a language outside the pytorch/numpy/whatever-else ecosystem. The vast majority of Python doesn’t need you to set up a conda environment with a bunch of ML dependencies. It’s just some code and a couple of libraries in a virtualenv. And for system stuff, there’s almost never any dependency except the standard library.
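
          As a sketch of that last case (the script and its JSON output are made up for illustration), this is the kind of small system-style tool I mean, pure standard library with nothing to pin and no environment to manage:

          ```python
          import json
          import pathlib
          import sys

          # Report total file size under a directory as JSON, using only
          # the standard library: no virtualenv, no lockfile.
          target = pathlib.Path(sys.argv[1] if len(sys.argv) > 1 else ".")
          total = sum(p.stat().st_size for p in target.rglob("*") if p.is_file())
          print(json.dumps({"path": str(target), "bytes": total}))
          ```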