Hi! I've been working with poetry recently and am happy that I can publish packages to PyPI or my own hosted package repository with it. I would like to share my current setup with Travis CI in the hope that it helps others. I would also like to learn from those who have more experience with poetry: tips on how I might improve this setup, how it works on other continuous integration platforms, or more in-depth CI/CD requirements.
My requirements for Travis CI are fairly common: I use poetry to install my project dependencies and dev dependencies.

_.travis.yml_
language: python
python:
- 3.6
env:
global:
- secure: "<encrypted MYPYPI_USER=username>"
- secure: "<encrypted MYPYPI_PASS=p@ssword>"
before_install:
- pip install poetry
install:
- poetry install
script:
- poetry run flake8 my_package test
- poetry run coverage run --source=my_package -m unittest discover -b
before_deploy:
- poetry config repositories.mypypi http://mypypi.example.com/simple
- poetry config http-basic.mypypi $MYPYPI_USER $MYPYPI_PASS
- poetry build -f sdist
deploy:
provider: script
script: poetry publish -r mypypi
skip_cleanup: true
on:
tags: true
I have in the past used the built-in Travis PyPI deployment, but it requires a setup.py (which I don't have anymore!). So instead I'm running poetry publish as a script deployment when I tag my repo.
So when master is at a spot where I want to deploy a new version of the package, I do something like:
poetry version minor
git commit -am 'bumped the version'
git tag <version>
# SIDE NOTE: it would be nice to be able to do `git tag $(poetry version --show)`
# or possibly have the bump command output the NEW_VERSION=poetry version minor --output
git push --tags
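A sketch of that side note: the version can be read straight out of pyproject.toml with standard tools, so the tag can be derived without any new Poetry feature (newer Poetry releases also added a short flag, `poetry version -s`, that prints only the number). The helper name here is made up:

```shell
# Hypothetical helper: extract the version string from pyproject.toml.
version_from_pyproject() {
  sed -n 's/^version *= *"\(.*\)"/\1/p' "$1" | head -n 1
}
```

It could then be used as, e.g., `git tag "$(version_from_pyproject pyproject.toml)"`.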
In order to configure poetry with the credentials to push to our repository I have set $MYPYPI_USER
and $MYPYPI_PASS
encrypted environment variables in travis.
That's what I have. Cheers!
If someone is checking the lock file in, it'd also be nice to include best practices for updating it. Ideally we'd get pyup support for it.
Your .travis.yml
inspired my own:
language: python
python: "3.6"
dist: xenial
stages:
- lint
- test
- name: deploy
if: tag IS present
cache:
pip: true
directories:
- "$HOME/.cache/pre-commit"
jobs:
include:
- stage: lint
install:
- pip install pre-commit
- pre-commit install-hooks
script:
- pre-commit run --all-files
- stage: test
install:
- pip install --upgrade pip
- pip install poetry
- poetry install -v
script:
- pytest --cov
- stage: deploy
script:
- echo Deploying to PyPI...
before_deploy:
# User and password environment variables are set as hidden variables through
# the web interface in the project settings.
- pip install --upgrade pip
- pip install poetry
- poetry config http-basic.pypi $PYPI_USER $PYPI_PASS
- poetry build
deploy:
provider: script
script: poetry publish
skip_cleanup: true
on:
all_branches: true # Travis recognizes tag names as "branches"
condition: $TRAVIS_BUILD_STAGE_NAME = Deploy
repo: daltonmaag/statmake
tags: true
Notes:

- The lint stage runs before the test and deploy stages.
- The deploy: section at the end can test whether it's being run from the deploy stage (see the condition). Reading the output of the stages makes me think it would otherwise deploy after every stage, but I'm not sure. That would be a gross misfeature.

On GitLab, setting virtualenvs.in-project to true and then caching the .venv directory worked for me. The relevant sections of the .gitlab-ci.yml file are:
...
cache:
paths:
- .venv
key: "${CI_COMMIT_REF_SLUG}"
...
script:
- poetry config settings.virtualenvs.in-project true
- poetry install
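Assembled into one complete job, that approach might look like the following minimal sketch (job name, image, and test command are illustrative, not from the comment above):

```yaml
test:
  image: python:3.8
  cache:
    key: "${CI_COMMIT_REF_SLUG}"
    paths:
      - .venv
  before_script:
    - pip install poetry
    # older Poetry versions used the settings. prefix for config keys
    - poetry config settings.virtualenvs.in-project true
  script:
    - poetry install
    - poetry run pytest
```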
I came up with a slightly different .travis.yml based on your suggestions:
language: python
dist: xenial
python:
- "3.7"
stages:
- Quality
- Publish
before_install:
- pip install poetry
install:
- poetry install
jobs:
include:
- stage: Quality
name: lint
script: make lint
- name: type
script: make type
- name: tests
script: make tests
- stage: Publish
script: skip
before_deploy:
- poetry config http-basic.pypi $PYPI_USERNAME $PYPI_PASSWORD
- poetry build -f sdist
deploy:
provider: script
script: poetry publish
skip_cleanup: true
on:
tags: true
if: tag IS present
after_success:
poetry run coveralls
Is it recommended to install poetry through pip? From the README:

> Be aware, however, that it will also install poetry's dependencies which might cause conflicts.

In addition, #166 may be related.
@FranklinYu isn't that true of all dev deps, say pytest? To test a package I need pytest, so I typically install all dev deps with poetry install, which may create conflicts or, more often for me, hide a missing dep. I had the same problem before using poetry, I just didn't think about it.
@piccolbo Bad things happen if you install poetry via pip but have it configured not to create a virtual environment. (As far as I can tell this isn't relevant to the configs discussed above though.)
It might be possible to speed up your builds if you don't have poetry create a virtualenvironment, and rely on the installer script to ensure the vendorized dependencies are present. (This probably depends on your CI system to some degree though.)
@piccolbo I'm lost at what you mean. If you add pytest through Poetry then it shouldn't conflict with other dependencies. This is the whole point of Poetry, isn't it?
@FranklinYu I consider it a conflict when even Poetry cannot satisfy all dep constraints. Maybe I am not using the term in the most appropriate way. More generally, I was referring to the fact that the dev environment is more complicated than the regular one, but it's the only one in which we can test. So one may be unable to test the environment a user would get in, say, an empty env. Let's say mypackage requires pandas >=3.14.15 but installing pytest requires pandas >=4.0.0 (I am making this up). Can't test with pandas <4 anymore. Maybe I am worrying for no reason, since one can't test all dep combinations anyway.
> Let's say mypackage requires pandas >=3.14.15 but installing pytest requires pandas >=4.0.0 (I am making this up).

In this case Poetry will try to come up with a solution (like finding another pytest version). When this is impossible, it raises an error before trying to install anything. This happens during version resolution, not during installation. This is the point of using Poetry instead of requirements.txt.
@FranklinYu @dmontagu thanks for your answers but they address different questions. Unless I find a better way to explain my question, I guess it's better to leave it at that.
Can I continue the sharing?
Here is my typical .gitlab-ci.yml
:
# Global --------------------------
variables:
PIP_CACHE_DIR: "${CI_PROJECT_DIR}/.cache/pip"
cache:
key: "${CI_JOB_NAME}"
paths:
- .cache/pip
- .venv
stages:
- quality
- tests
# Jobs templates ------------------
.install-deps-template: &install-deps
before_script:
- pip install poetry
- poetry --version
- poetry config settings.virtualenvs.in-project true
- poetry install -vv
.quality-template: &quality
<<: *install-deps
image: python:3.6
stage: quality
.test-template: &test
<<: *install-deps
stage: tests
coverage: '/TOTAL.*\s(\d+\.\d+\%)/'
script: make test
artifacts:
paths:
- tests/logs
when: always
expire_in: 1 week
# Quality jobs ----------------------
check-bandit:
<<: *quality
script: make check-bandit
check-black:
<<: *quality
script: make check-black
check-flake8:
<<: *quality
script: make check-flake8
check-isort:
<<: *quality
script: make check-isort
check-safety:
<<: *quality
script: make check-safety
# Tests jobs ------------------------
python3.6:
<<: *test
image: python:3.6
python3.7:
<<: *test
image: python:3.7
python3.8:
<<: *test
image: python:3.8
Adding to the conversation here, I took some of the recommendations from this thread as well as others, mixed in some of my own past patterns, and came up with the following .travis.yml, found here:
language: python
python:
- "3.5"
- "3.6"
- "3.7"
- "3.8"
- "pypy3"
install: pip install poetry tox-travis codecov
script: tox
after_success: codecov
stages:
- test
- lint
- name: deploy
if: tag IS present
jobs:
fast_finish: true
include:
- stage: test
python: 3.7
env: TOXENV=docs
- stage: lint
python: 3.7
env: TOXENV=black
- python: 3.7
env: TOXENV=mypy
- python: 3.7
env: TOXENV=pylint
- python: 3.7
env: TOXENV=vulture
- stage: deploy
python: 3.7
install: true
script: true
after_success: true
before_deploy:
- pip install --upgrade pip
- pip install poetry
- poetry config pypi-token.pypi $PYPI_PASSWORD
deploy:
- provider: script
script: poetry publish
on:
branch: master
tags: true
- provider: script
script: ./scripts/build_and_publish_docker.sh
on:
branch: master
tags: true
This has allowed me to continue using tox
for development, with the following:
[testenv]
whitelist_externals =
poetry
setenv =
PYTHONDONTWRITEBYTECODE=1
PYTHONHASHSEED=0
PYTHONWARNINGS=ignore
commands =
poetry install --no-root -v
poetry run pytest {posargs}
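One hedged caveat to go with this tox setup: when tox is expected to build the package itself from pyproject.toml (rather than only running poetry install as above), tox needs isolated builds enabled so the PEP 517 backend is used. This line is a common companion setting, not something taken from the config above:

```ini
[tox]
isolated_build = true
```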
Related: https://github.com/python-poetry/poetry/issues/2102
Installing Poetry into the same virtual environment as the package being built creates issues.
Here is my .travis.yml
based on those above:
dist: xenial
language: python
python:
- 3.8
- 3.6
- 3.7
stages:
- test
- name: deploy
if: tag IS present
# Tests
before_install:
- pip install poetry
install:
- export PYTHONPATH=$PYTHONPATH:$(pwd)/turbulette
- poetry install --no-root -v
services:
- postgresql
env:
- DB_DRIVER=postgresql DB_HOST=localhost DB_PORT=5432 DB_USER=postgres DB_PASSWORD=""
PYTEST_TURBULETTE_SETTINGS=tests.settings
before_script:
- psql -c 'create database test;' -U postgres
script: pytest
jobs:
include:
- stage: deploy
script: skip
before_deploy:
- pip install --upgrade pip
- pip install poetry
- poetry config http-basic.pypi $PYPI_USERNAME $PYPI_PASSWORD
deploy:
provider: script
script: poetry publish --build
on:
tags: true
repo: gazorby/turbulette
after_success:
- bash <(curl -s https://codecov.io/bash)
And here is my CI config with a GitHub workflow (migrated from my GitLab CI config, previously commented here):
name: ci
on:
push:
branches:
- master
pull_request:
branches:
- master
defaults:
run:
shell: bash
jobs:
quality:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Python 3.6
uses: actions/setup-python@v1
with:
python-version: 3.6
- name: Set up the cache
uses: actions/cache@v1
with:
path: .venv
key: cache-python-packages
- name: Set up the project
run: |
pip install poetry safety
poetry config virtualenvs.in-project true
make setup
- name: Check if the documentation builds correctly
run: make check-docs
- name: Check the code quality
run: make check-code-quality
- name: Check if the code is correctly typed
run: make check-types
- name: Check for vulnerabilities in dependencies
run: make check-dependencies
tests:
strategy:
max-parallel: 6
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
python-version: [3.6,3.7,3.8]
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- name: Set up the cache
uses: actions/cache@v1
env:
cache-name: cache-python-packages
with:
path: .venv
key: ${{ matrix.os }}-${{ matrix.python-version }}-${{ env.cache-name }}
restore-keys: |
${{ matrix.os }}-${{ matrix.python-version }}-
${{ matrix.os }}-
- name: Set up the project
run: |
pip install poetry
poetry config virtualenvs.in-project true
make setup
- name: Run the test suite
run: make test
The make targets almost all run tools with poetry run ....
When I run "poetry config" via CI/CD I invariably get keyring lock errors.
@earonesty In case you're using Tox, this suggestion solved a keyring error I had been getting with poetry config
in CI/CD: https://github.com/jaraco/keyring/issues/283#issuecomment-469712817
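For reference, that workaround boils down to pointing keyring at its null backend so no system keyring is touched; PYTHON_KEYRING_BACKEND is keyring's documented override variable (GitLab-style syntax shown purely as an illustration, adapt to your CI):

```yaml
variables:
  PYTHON_KEYRING_BACKEND: keyring.backends.null.Keyring
```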
Use actions/cache with a variation on their pip
cache example to cache Poetry dependencies for faster installation.
- name: Set up Poetry cache for Python dependencies
uses: actions/cache@v2
if: startsWith(runner.os, 'Linux')
with:
path: ~/.cache/pypoetry
key: ${{ runner.os }}-poetry-${{ hashFiles('**/poetry.lock') }}
restore-keys: ${{ runner.os }}-poetry-
Installing Poetry via pip
can lead to dependency conflicts, so the custom installer is recommended. The command listed in the docs exits in GitHub Actions with 127
(not on $PATH
).
There are some additional modifications needed for GitHub Actions:

- Add -y to avoid prompts.
- Add Poetry to $GITHUB_PATH (note that the ::set-env syntax has been deprecated).
- Move poetry install to a separate step to ensure Poetry is on $GITHUB_PATH.

- name: Install Poetry
run: |
curl -fsS -o get-poetry.py https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py
python get-poetry.py -y
echo "$HOME/.poetry/bin" >> $GITHUB_PATH
- name: Install dependencies
run: poetry install --no-interaction
Poetry can be configured with the poetry config command or with environment variables. Environment variables are a more dependable way to configure Poetry in CI.
env:
POETRY_VIRTUALENVS_CREATE: false
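In general, any poetry config key maps to an environment variable the same way: prefix with POETRY_, uppercase, and replace dots and dashes with underscores. For example (the second variable is an illustration of the mapping, not part of the workflow above):

```yaml
env:
  POETRY_VIRTUALENVS_CREATE: "false"
  # equivalent to: poetry config virtualenvs.in-project true
  POETRY_VIRTUALENVS_IN_PROJECT: "true"
```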
- Reference the PyPI token with ${{ secrets.PYPI_TOKEN }} (the secret name is PYPI_TOKEN in this example, and the username for PyPI tokens is __token__).
- Use poetry publish --build to build and publish in one step.

- name: Build Python package and publish to PyPI
if: startsWith(github.ref, 'refs/tags/')
run: poetry publish --build -u __token__ -p ${{ secrets.PYPI_TOKEN }}
That's why they call it Poetry. Beautiful.
Expand this details element for an example workflow from br3ndonland/inboard that uses these tips.
name: builds
on:
push:
branches: [develop, master]
tags:
- "[0-9v]+.[0-9]+.[0-9a-z]+"
workflow_dispatch:
jobs:
python:
runs-on: ubuntu-latest
env:
POETRY_VIRTUALENVS_CREATE: false
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Set up Poetry cache for Python dependencies
uses: actions/cache@v2
if: startsWith(runner.os, 'Linux')
with:
path: ~/.cache/pypoetry
key: ${{ runner.os }}-poetry-${{ hashFiles('**/poetry.lock') }}
restore-keys: ${{ runner.os }}-poetry-
- name: Set up pre-commit cache
uses: actions/cache@v2
if: startsWith(runner.os, 'Linux')
with:
path: ~/.cache/pre-commit
key: ${{ runner.os }}-pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
restore-keys: ${{ runner.os }}-pre-commit-
- name: Install Poetry
run: |
curl -fsS -o get-poetry.py https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py
python get-poetry.py -y
echo "$HOME/.poetry/bin" >> $GITHUB_PATH
- name: Install dependencies
run: poetry install --no-interaction -E fastapi
- name: Run pre-commit hooks
run: pre-commit run --all-files
- name: Run unit tests
run: pytest
- name: Build Python package and publish to PyPI
if: startsWith(github.ref, 'refs/tags/')
run: poetry publish --build -u __token__ -p ${{ secrets.PYPI_TOKEN }}
docker:
runs-on: ubuntu-latest
needs: [python]
steps:
- uses: actions/checkout@v2
- name: Log in to Docker registry
run: docker login ghcr.io -u ${{ github.actor }} -p ${{ secrets.PAT_GHCR }}
- name: Build Docker images
run: |
docker build . --rm --target base -t ghcr.io/br3ndonland/inboard:base --cache-from python:3.8
docker build . --rm --target starlette -t ghcr.io/br3ndonland/inboard:starlette
docker build . --rm --target fastapi -t ghcr.io/br3ndonland/inboard:fastapi
- name: Push Docker images to registry
run: |
docker push ghcr.io/br3ndonland/inboard:base
docker push ghcr.io/br3ndonland/inboard:starlette
docker push ghcr.io/br3ndonland/inboard:fastapi
- name: Add Git tag to Docker images
if: startsWith(github.ref, 'refs/tags/')
run: |
GIT_TAG=$(echo ${{ github.ref }} | cut -d / -f 3)
docker tag ghcr.io/br3ndonland/inboard:base ghcr.io/br3ndonland/inboard:base-"$GIT_TAG"
docker tag ghcr.io/br3ndonland/inboard:starlette ghcr.io/br3ndonland/inboard:starlette-"$GIT_TAG"
docker tag ghcr.io/br3ndonland/inboard:fastapi ghcr.io/br3ndonland/inboard:fastapi-"$GIT_TAG"
docker push ghcr.io/br3ndonland/inboard:base-"$GIT_TAG"
docker push ghcr.io/br3ndonland/inboard:starlette-"$GIT_TAG"
docker push ghcr.io/br3ndonland/inboard:fastapi-"$GIT_TAG"
- name: Tag and push latest image
run: |
docker tag ghcr.io/br3ndonland/inboard:fastapi ghcr.io/br3ndonland/inboard:latest
docker push ghcr.io/br3ndonland/inboard:latest
Dependabot now offers automated version updates, with (preliminary) support for Poetry :tada:. If you have access to the Dependabot beta, set up _.github/dependabot.yml_ as described in the docs:
version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
Dependabot will now send you PRs when dependency updates are available. Although package-ecosystem
must be set to pip
, it will pick up the _pyproject.toml_ and _poetry.lock_. Check the status of the repo at _Insights -> Dependency graph -> Dependabot_.
> Can I continue the sharing? Here is my typical .gitlab-ci.yml:
This is great. @pawamoy, do you have any pointers on what PIP_CACHE_DIR does exactly? It doesn't seem widely used. I am trying to get poetry to install into the local directory when doing pip install poetry. For this, the --target option works. It allows us to install the entire thing, modules and binaries included, into a subdirectory of the working directory. This allows for caching. However, it also means we have to set PATH as well as PYTHONPATH to find that new custom location. That is not the most sexy approach. PIP_CACHE_DIR seems to alleviate this, but it does not actually change the install path. So as far as I can see, your pip install poetry is not cached and runs each time; can you confirm this?
Thanks for your inspirational file!
I have split this into two steps for now:
variables:
PIP_DOWNLOAD_DIR: ".pip"
before_script:
# Allow caching by only downloading first:
- pip download --dest=${PIP_DOWNLOAD_DIR} poetry # STEP 1
- pip install --find-links=${PIP_DOWNLOAD_DIR} poetry # STEP 2
# Make available for caching by installing to current directory:
- poetry config virtualenvs.in-project true
- poetry install -vv
pip download will just download the dependencies, allowing them to be cached. A later pip install --find-links pointing at that same directory installs them into the system (container)-wide Python. This lets us ignore PATH and PYTHONPATH entirely; it will Just Work™. The downside is that the install process itself is not cached and has to rerun each time.
@pawamoy Thanks a lot for sharing; I discovered from your example that you could have templates in .gitlab-ci.yml. On the other hand, is there a reason to run the quality-check jobs in parallel instead of sequentially? Since they have no side effects, I feel like it's a waste of resources (CI/CD pipeline minutes).
Thanks again for sharing your example.
@cglacet You mean the other way around, parallel vs. sequential? Towards this, you might like the needs keyword. It allows you to build a DAG of job dependencies, speeding stuff up.
The templates used there are YAML anchors, so not GitLab-specific. The extends keyword lets you do the same thing (essentially have job templates) and is GitLab-specific. I find it more readable, but don't know if the features differ. For simple stuff, they do the same.
Lastly, if your config is Makefile-based and calls make <target> in each job's script, where job names correspond to target names, those concepts can be combined:
.make:
# Note we cannot have `default: script:`, so this approach works better.
script: make ${CI_JOB_NAME}
Then, later, for example:
preflight:
extends: .make
stage: .pre
(The .pre stage is always available, as is .post, as well as build, test, and deploy, if no other stages are defined.) The above will call make preflight, with the term preflight occurring only once in the config. Pretty DRY! I suppose it gets more cumbersome to cross-check against existing make targets, though.
@alexpovel about PIP_CACHE_DIR
, I'm not sure anymore :sweat_smile: I guess I just picked it from another example somewhere :slightly_smiling_face: Thanks for sharing your solution, this is interesting!
@cglacet I think what you mean is: why don't I run all the checks in a single job, to avoid installing deps 5 times instead of just once? Well indeed, it's a waste of resources. I think I just liked seeing more green checkmarks in GitLab CI. But if wasting time and resources by installing too many deps is an issue for you (well, it should be for everyone), you can always reduce the installed dependencies to a minimum by making use of extras! For example, a flake8 extra with flake8 and all its plugins, then a tests extra with only the dependencies required for running the tests, etc. You'd then adapt the "install deps" step to make use of these extras. This way you can have your parallel jobs without any waste :slightly_smiling_face: And it's faster than sequential jobs :wink:
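A sketch of what that extras setup could look like (package and extra names are illustrative; before Poetry grew dependency groups, dev tools had to be declared as optional main dependencies to be usable as extras):

```toml
[tool.poetry.dependencies]
python = "^3.6"
flake8 = { version = "^3.8", optional = true }
pytest = { version = "^6.0", optional = true }

[tool.poetry.extras]
lint = ["flake8"]
tests = ["pytest"]
```

The lint job would then run poetry install --no-root -E lint, and the test jobs poetry install -E tests.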
@alexpovel I didn't know all these GitLab CI configuration features, very nice! The snippet I shared was written when GitLab didn't have the extends
ability yet :slightly_smiling_face: The trick with make and the CI job name is really nice, thanks for sharing!
@pawamoy

> you can always reduce the installed dependencies to a minimum by making use of extras ... This way you can have your parallel jobs without any waste
I feel it's still a waste because you have to start 5 Docker instances instead of one. But I agree that having parallel jobs isn't only a cost; reducing user build time is clearly a plus if you build often and don't want to wait. I guess the choice depends on the use case. From my personal perspective, I would love to remain under 400 minutes per month, so even if it's only saving 20%, I would gladly sacrifice a 3x speedup on user build time.
Now I just need to wait and see when I change my mind about this. Future me will maybe think I was ignorant.
For those who wonder how to publish to GitLab from your CI, I struggled to find the info, so here is how I did it (I'm not sure it's the right way, but it's working decently):
publish-package:
stage: publish
image: python:3.7
before_script:
- pip install poetry
- poetry --version
- poetry config repositories.gitlab https://gitlab.com/api/v4/projects/${CI_PROJECT_ID}/packages/pypi
- poetry config http-basic.gitlab gitlab-ci-token ${CI_JOB_TOKEN}
script:
- make publish
only:
- tags
In my Makefile:
build:
@poetry build
publish: build
@poetry publish -r gitlab
A link that might help building better CI on gitlab: Predefined environment variables reference.
@alexpovel No, I meant it this way. When building in parallel you save "real time" because you wait less for the whole build to finish. On the other hand, you consume more computation time (because you start several containers and install the dependencies many times).