Poetry: Export requirements.txt

Created on 11 May 2018 · 12 comments · Source: python-poetry/poetry

Hi @sdispater

I want to use poetry for managing my dependencies for AWS Lambda functions with serverless.
For this, it would be awesome to have export to requirements.txt to use it in serverless plugin.

WDYT about this?

All 12 comments

I also need something like this to deploy on Heroku.

I added this to my tooling as a workaround:

# Makefile

requirements.txt: pyproject.lock
    poetry run pip freeze > $@

It doesn't work for me:

$ poetry run pip freeze    

[ImportError]     
No module named utils.appdirs  

run <args> (<args>)...

@Tonkonozhenko Did $ poetry lock and $ poetry install run successfully?

If the sole purpose is deploying to Heroku, exporting Pipfile and Pipfile.lock would also work.

I am also using poetry together with tools for Lambda functions (chalice in this case) that require a requirements file. Native support in either poetry or chalice would be nice, but I currently work around this by generating the requirements file in a specific Make target:

# See https://github.com/sdispater/poetry/issues/100
define get_requirements
import tomlkit
with open('pyproject.lock') as t:
    lock = tomlkit.parse(t.read())
    for p in lock['package']:
        if p['category'] != 'dev':
            print(f"{p['name']}=={p['version']}")
endef
export get_requirements

requirements.txt: pyproject.lock
        python -c "$$get_requirements" > $@

While not extensively tested, this works by explicitly pinning all transitive requirements in the requirements file.

@cauebs That would work, but as per the project readme it's probably better to go with exporting requirements.txt rather than the proprietary Pipfile format.
Having this feature together with the already existing ability to specify the location of a virtualenv (e.g. /venv, #213) would mean pretty much full backward compatibility with any cloud platform or tooling (e.g. PyCharm, etc.).

It would be ideal if the requirements.txt could be automatically kept in sync with the pyproject.toml file as that would make things less likely to be messed up.

I'm deploying with Docker and having a requirements.txt is much simpler for this. Currently I'm using pipenv to create this requirements.txt, but I definitely prefer to work with poetry.

I have prototyped a function to export a requirements.txt from poetry with markers (platform and python_version) and package sources (git, etc.). It seems to work well, but I'm sure I don't cover all the cases (`in` for platform, etc.). Also, I didn't add a CLI command for the moment, but this command could perhaps be named freeze.

A possibility to keep it in sync, as suggested by @yunti, is also a good idea, but this needs more thinking (optional or not? sync when running which commands?). I'm open to comments and can do a pull request if needed. But if @sdispater or someone else wants to incorporate and adapt my code, I'll be happy!

Here's my code: https://gist.github.com/etienned/686d6743d66415eb39a715bb369fe2f6
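For illustration only (this is not the gist's actual API; the function name and parameters here are hypothetical), a requirements.txt line carrying PEP 508 environment markers and a git source could be rendered roughly like this:

```python
def format_requirement(name, version=None, markers=None, git_url=None, rev=None):
    """Hypothetical sketch: render one requirements.txt line.

    Exact-pins plain packages, pins git sources to a revision, and appends
    PEP 508 environment markers after a semicolon.
    """
    if git_url:
        # VCS requirement: pin to a revision when one is known
        line = f"git+{git_url}" + (f"@{rev}" if rev else "") + f"#egg={name}"
    else:
        line = f"{name}=={version}"
    if markers:
        line += "; " + " and ".join(markers)
    return line
```

For example, a conditional dependency would come out as `pathlib2==2.3.2; python_version < "3.4"`, which pip evaluates at install time.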

For information, this is added here: #675.

@etienned thanks for the notification.
@sdispater thanks for the implementation.

FYI, export currently appends a hash to editable source repos, on which pip install fails;
see https://github.com/pypa/pip/issues/4995
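Concretely, pip's hash-checking mode refuses to mix hashes with editable/VCS requirements, so an exported file shaped like the following (hypothetical package names; hash elided) cannot be installed:

```text
# illustrative requirements.txt excerpt — hash-checking mode rejects the editable line
-e git+https://github.com/example/mylib.git@abc123#egg=mylib
requests==2.18.4 --hash=sha256:<...>
```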

Is this going to be backported to the Python 2 version of poetry?

