So, I have a Python script that I’d like to run from time to time from the CLI (on Linux), and it lives inside a venv. What’s the recommended/intended way to do this?
Write a wrapper shell script that activates the virtual environment, runs the Python script and deactivates the venv again, then put that wrapper in a $PATH-accessible directory? This seems a bit convoluted, but I can’t think of a better way.
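Concretely, I’m picturing something like this (the script name and venv path are just placeholders):

#!/usr/bin/env bash
# hypothetical wrapper saved somewhere on $PATH, e.g. ~/.local/bin/myscript
set -euo pipefail
source "$HOME/venvs/myproject/bin/activate"          # activate the venv
python "$HOME/projects/myproject/myscript.py" "$@"   # run the script, passing any arguments through
deactivate                                           # leave the venv again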

      • @[email protected]
        26 months ago

        I use my own Zsh project (zpy) to manage venvs, stored like ~/.local/share/venvs/HASH-OF-PROJECT-PATH/venv, so I use zpy’s vpy function to launch a script with its associated Python executable ad hoc, or add a full-path shebang to the script with zpy’s vpyshebang function.

        vpy and vpyshebang in the docs
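        For illustration, a full-path shebang of the kind vpyshebang adds would look roughly like this (the hash and home directory here are made up):

        #!/home/me/.local/share/venvs/0f3a9c1b2d4e/venv/bin/python
        # running ./myscript.py now uses that venv’s interpreter and packages,
        # with no activation step needed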

        If anyone else is a Zsh fan and has any questions, I’m more than happy to answer or demo.

        • Faulkmore
          16 months ago

          @Andy The convention is to place the venv in a .venv/ subfolder. Follow the convention!
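          For example, with nothing but the standard library:

          python -m venv .venv          # create the venv inside the project folder
          . .venv/bin/activate          # activate it (bash/zsh)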

          This is shell-agnostic.

          Learn pyenv and minimize shell scripts (any shell should live only within a Makefile).

          Shell scripts within Python packages are deprecated.

          • @[email protected]
            26 months ago

            The convention

            That’s one convention. I don’t like it; I prefer to keep my venvs elsewhere. One reason is that it makes it simpler to maintain multiple venvs for a single project, using a different Python version for each, if I ever want to. It shouldn’t matter to anyone else, as it’s my environment, not some aspect of the shared repo. If I ever needed it there for some reason, I could always ln -s $VIRTUAL_ENV .venv.
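            For example (the paths and versions here are made up):

            # two interpreter-specific venvs for the same project, kept outside the repo
            python3.11 -m venv ~/.local/share/venvs/myproj-311
            python3.12 -m venv ~/.local/share/venvs/myproj-312
            # and if some tool insists on a ./.venv, point it at whichever one is active:
            ln -s "$VIRTUAL_ENV" .venv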

            Learn pyenv

            I have used pyenv. It’s fine. These days I use mise instead, which I prefer. But neither of them dictates how I create and store venvs.

            Shell scripts within Python packages are deprecated

            I don’t understand how what you’re referring to relates to my comment.

            • @[email protected]
              16 months ago

              Multiple venvs for different Python versions sounds exactly like what tox does.
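              For example, assuming a tox.ini whose envlist declares py311, py312 and py313:

              tox              # build and test one venv per interpreter in envlist
              tox -e py313     # or just the one you care about right now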

              Then set up a GitHub Action that does nightly builds, which will catch issues caused by changes that were only tested against one Python version or on one platform.

              py313 is a good version to test against cuz many modules were removed or deprecated and APIs changed.

              Good luck. Hope some of my advice is helpful.

              • @[email protected]
                26 months ago

                Thanks, yes, I use nox and GitHub Actions for automated environments and testing in my own projects, and tox instead of nox when it’s someone else’s project. But for ad hoc, local and interactive multiple environments, I don’t.

                • @[email protected]
                  12 months ago

                  Are you using GitHub Actions locally? I feel silly making GH actions and workflows when only GitHub runs them.

                  • @[email protected]
                    12 months ago

                    No, I don’t use GHA locally, but the actions are defined to run the same things that I run locally (e.g. invoke nox). I try to keep the GHA-exclusive boilerplate to a minimum. Steps can look like:

                    - name: fetch code
                      uses: actions/checkout@v4
                    
                    - uses: actions/setup-python@v5
                      with:
                        allow-prereleases: true
                        python-version: |
                          3.13
                          3.12
                          3.11
                          3.10
                          3.9
                          3.8
                          3.7
                    
                    - run: pipx install nox
                    
                    - name: run ward tests in nox environment
                      run: nox -s test test_without_toml combine_coverage --force-color
                      env:
                        PYTHONIOENCODING: utf-8
                    
                    - name: upload coverage data
                      uses: codecov/codecov-action@v4
                      with:
                        files: ./coverage.json
                        token: ${{ secrets.CODECOV_TOKEN }}
                    

                    Sometimes if I want a higher level interface to tasks that run nox or other things locally, I use taskipy to define them in my pyproject.toml, like:

                    [tool.taskipy.tasks]
                    fmt = "nox -s fmt"
                    lock = "nox -s lock"
                    test = "nox -s test test_without_toml typecheck -p 3.12"
                    docs = "nox -s render_readme render_api_docs"
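
                    and then, assuming taskipy’s task runner is installed in the project venv, these can be run like:

                    task test    # runs: nox -s test test_without_toml typecheck -p 3.12
                    task docs    # runs: nox -s render_readme render_api_docs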
                    
    • @[email protected]
      26 months ago

      This. I’ve experimented with pex and one or two other kinds of executable Python wrappers before, and they suck. Just do as lakeeffect says.

    • whoareu
      16 months ago

      I think the path to the venv should be absolute, right?

      • @lakeeffect
        16 months ago

        Yeah, for the most part, but it really depends on what you’re trying to do specifically.

      • @[email protected]
        16 months ago

        Just activate the venv and then put it out of your mind. You can activate it with either a relative or an absolute path; it doesn’t matter which.
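        e.g. both of these end up in the same place (paths are just examples):

        . .venv/bin/activate                           # relative to the current directory
        . /home/me/projects/app/.venv/bin/activate     # absolute path, same effect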