⚡️🐍📦 Serverless plugin to bundle Python packages


serverless/serverless-python-requirements


A Serverless Framework plugin to automatically bundle dependencies from `requirements.txt` and make them available in your `PYTHONPATH`.


Originally developed by Capital One, now maintained by Serverless, Inc.

Capital One considers itself the bank a technology company would build. It's delivering best-in-class innovation so that its millions of customers can manage their finances with ease. Capital One is all-in on the cloud and is a leader in the adoption of open source, RESTful APIs, microservices and containers. We build our own products and release them with a speed and agility that allows us to get new customer experiences to market quickly. Our engineers use artificial intelligence and machine learning to transform real-time data, software and algorithms into the future of finance, reimagined.


Install

```bash
sls plugin install -n serverless-python-requirements
```

This will automatically add the plugin to your project's `package.json` and the plugins section of its `serverless.yml`. That's all that's needed for basic use! The plugin will now bundle your Python dependencies specified in your `requirements.txt` or `Pipfile` when you run `sls deploy`.
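After the install command runs, your `serverless.yml` should contain something like the following (the service name and runtime here are hypothetical placeholders):

```yaml
service: my-service # hypothetical service name

provider:
  name: aws
  runtime: python3.9 # any supported Python runtime

plugins:
  - serverless-python-requirements
```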

For a more in-depth introduction on how to use this plugin, check out this post on the Serverless Blog.

If you're on a Mac, check out these notes about using Python installed by brew.

Cross compiling

Compiling non-pure-Python modules or fetching their manylinux wheels is supported on non-Linux OSs via the use of Docker and official AWS build images. To enable Docker usage, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
```

In addition to booleans, the `dockerizePip` option supports the special value `'non-linux'`, which makes it dockerize only on non-Linux environments.

To use your own Docker container instead of the default, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerImage: <image name>:tag
```

This must be the full image name and tag to use, including the runtime specific tag if applicable.

Alternatively, you can define your Docker image in your own Dockerfile and add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerFile: ./path/to/Dockerfile
```

`dockerFile` is the path to the Dockerfile, which must be in the current folder (or a subfolder). Please note that `dockerImage` and `dockerFile` are mutually exclusive.

To install requirements from private git repositories, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerSsh: true
```

The `dockerSsh` option will mount your `$HOME/.ssh/id_rsa` and `$HOME/.ssh/known_hosts` as a volume in the Docker container.

In case you want to use a different key, you can specify the absolute path to it through the `dockerPrivateKey` option:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerSsh: true
    dockerPrivateKey: /home/.ssh/id_ed25519
```

If your SSH key is password protected, you can use `ssh-agent`, because `$SSH_AUTH_SOCK` is also mounted and the env var is set. It is important that the host of your private repositories has already been added to your `$HOME/.ssh/known_hosts` file, as the install process will otherwise fail due to a host authenticity failure.

You can also pass environment variables to Docker by specifying them in the `dockerEnv` option:

```yaml
custom:
  pythonRequirements:
    dockerEnv:
      - https_proxy
```

🏁 See the Windows `dockerizePip` notes below.

✨🍰✨ Pipenv support

Requires `pipenv` in version `2022-04-08` or higher.

If you include a `Pipfile` and have `pipenv` installed, this will use `pipenv` to generate requirements instead of a `requirements.txt`. It is fully compatible with all options such as `zip` and `dockerizePip`. If you don't want this plugin to generate it for you, set the following option:
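A minimal `Pipfile` that the plugin would pick up might look like this (the package and Python version below are illustrative, not required by the plugin):

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[requires]
python_version = "3.9"
```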

```yaml
custom:
  pythonRequirements:
    usePipenv: false
```

✨📝✨ Poetry support

If you include a `pyproject.toml` and have `poetry` installed instead of a `requirements.txt`, this will use `poetry export --without-hashes -f requirements.txt -o requirements.txt --with-credentials` to generate it. It is fully compatible with all options such as `zip` and `dockerizePip`. If you don't want this plugin to generate it for you, set the following option:

```yaml
custom:
  pythonRequirements:
    usePoetry: false
```

Be aware that if no `poetry.lock` file is present, a new one will be generated on the fly. To help keep builds predictable, you can set the `requirePoetryLockFile` flag to true to throw an error when `poetry.lock` is missing.

```yaml
custom:
  pythonRequirements:
    requirePoetryLockFile: true
```

If your Poetry configuration includes custom dependency groups, they will not be installed automatically. To include them in the deployment package, use the `poetryWithGroups`, `poetryWithoutGroups` and `poetryOnlyGroups` options, which wrap `poetry export`'s `--with`, `--without` and `--only` parameters.

```yaml
custom:
  pythonRequirements:
    poetryWithGroups:
      - internal_dependencies
      - lambda_dependencies
```

Poetry with git dependencies

Poetry by default generates the exported `requirements.txt` file with `-e`, and that breaks pip with the `-t` parameter (used to install all requirements in a specific folder). In order to fix that, we remove all `-e` flags from the generated file, but for that to work you need to add the git dependencies in a specific way.

Instead of:

```toml
[tool.poetry.dependencies]
bottle = {git = "git@github.com/bottlepy/bottle.git", tag = "0.12.16"}
```

Use:

```toml
[tool.poetry.dependencies]
bottle = {git = "https://git@github.com/bottlepy/bottle.git", tag = "0.12.16"}
```

Or, if you have an SSH key configured:

```toml
[tool.poetry.dependencies]
bottle = {git = "ssh://git@github.com/bottlepy/bottle.git", tag = "0.12.16"}
```

Dealing with Lambda's size limitations

To help deal with potentially large dependencies (for example: numpy, scipy and scikit-learn) there is support for compressing the libraries. This does require a minor change to your code to decompress them. To enable this, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    zip: true
```

and add this to your handler module before any code that imports your deps:

```python
try:
    import unzip_requirements
except ImportError:
    pass
```
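Conceptually, the generated `unzip_requirements` module extracts the bundled `requirements.zip` on first import and prepends the extraction directory to `sys.path`. The sketch below illustrates that mechanism only; it is not the plugin's actual implementation, and the target path is a hypothetical choice:

```python
import os
import shutil
import sys
import zipfile


def unzip_requirements(zip_path, target="/tmp/sls-py-req"):
    """Extract bundled dependencies once and prepend them to sys.path."""
    if not os.path.exists(target):
        tmp = target + "_tmp"
        shutil.rmtree(tmp, ignore_errors=True)
        with zipfile.ZipFile(zip_path) as zf:
            zf.extractall(tmp)
        # Rename last, so a half-finished extract is never picked up
        # by a concurrently warming Lambda container.
        os.rename(tmp, target)
    sys.path.insert(0, target)
```

Extracting to `/tmp` matters on Lambda because it is the only writable path, and doing it at import time keeps the cost to the first (cold) invocation.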

Slim Package

Works on non-'win32' environments: Docker and WSL are included. To remove the tests, metadata and caches from the installed packages, enable the `slim` option. This will strip the `.so` files and remove `__pycache__` and `dist-info` directories, as well as `.pyc` and `.pyo` files.

```yaml
custom:
  pythonRequirements:
    slim: true
```

Custom Removal Patterns

To specify additional directories to remove from the installed packages, define a list of patterns in the serverless config using the `slimPatterns` option and glob syntax. These patterns will be added to the default ones (`**/*.py[c|o]`, `**/__pycache__*`, `**/*.dist-info*`). Note, the glob syntax matches against whole paths, so to match a file in any directory, start your pattern with `**/`.

```yaml
custom:
  pythonRequirements:
    slim: true
    slimPatterns:
      - '**/*.egg-info*'
```

To overwrite the default patterns, set the option `slimPatternsAppendDefaults` to `false` (`true` by default).

```yaml
custom:
  pythonRequirements:
    slim: true
    slimPatternsAppendDefaults: false
    slimPatterns:
      - '**/*.egg-info*'
```

This will remove all folders within the installed requirements that match the names in `slimPatterns`.

Option not to strip binaries

In some cases, stripping binaries leads to problems like "ELF load command address/offset not properly aligned", even when done in the Docker environment. You can still slim down the package while skipping the `*.so` stripping with:

```yaml
custom:
  pythonRequirements:
    slim: true
    strip: false
```

Lambda Layer

Another method for dealing with large dependencies is to put them into a Lambda Layer. Simply add the `layer` option to the configuration.

```yaml
custom:
  pythonRequirements:
    layer: true
```

The requirements will be zipped up and a layer will be created automatically. Now just add the reference to the functions that will use the layer.

```yaml
functions:
  hello:
    handler: handler.hello
    layers:
      - Ref: PythonRequirementsLambdaLayer
```

If the layer requires additional or custom configuration, add it onto the `layer` option.

```yaml
custom:
  pythonRequirements:
    layer:
      name: ${self:provider.stage}-layerName
      description: Python requirements lambda layer
      compatibleRuntimes:
        - python3.7
      licenseInfo: GPLv3
      allowedAccounts:
        - '*'
```

Omitting Packages

You can omit a package from deployment with the `noDeploy` option. Note that dependencies of omitted packages must explicitly be omitted too.

This example omits pytest:

```yaml
custom:
  pythonRequirements:
    noDeploy:
      - pytest
```
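Since transitive dependencies are not resolved for you, fully omitting pytest means also listing its own dependencies. The exact set varies by pytest version, so the extra names below are illustrative:

```yaml
custom:
  pythonRequirements:
    noDeploy:
      - pytest
      # pytest's own dependencies (exact names vary by pytest version)
      - pluggy
      - iniconfig
      - packaging
```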

Extra Config Options

Caching

You can enable two kinds of caching with this plugin, both of which are currently ENABLED by default. First, a download cache that caches the downloads pip needs to compile the packages. Second, what we call "static caching", which caches the output of pip after compiling everything for your requirements file. Since `requirements.txt` files rarely change, you will often see large speed improvements when enabling the static cache feature. These caches will be shared between all your projects if no custom `cacheLocation` is specified (see below).

Please note: this has replaced the previously recommended usage of `--cache-dir` in `pipCmdExtraArgs`.

```yaml
custom:
  pythonRequirements:
    useDownloadCache: true
    useStaticCache: true
```

Other caching options

There are two additional options related to caching. You can specify where on your system this plugin caches with the `cacheLocation` option. By default, it determines the location automatically from your username and your OS, via the `appdirectory` module. Additionally, you can specify the maximum number of static caches to store with `staticCacheMaxVersions`, as a simple attempt to limit disk space usage for caching. This is DISABLED (set to 0) by default. Example:

```yaml
custom:
  pythonRequirements:
    useStaticCache: true
    useDownloadCache: true
    cacheLocation: '/home/user/.my_cache_goes_here'
    staticCacheMaxVersions: 10
```

Extra pip arguments

You can specify extra arguments supported by pip to be passed to pip like this:

```yaml
custom:
  pythonRequirements:
    pipCmdExtraArgs:
      - --compile
```

Extra Docker arguments

You can specify extra arguments to be passed to `docker build` during the build step, and `docker run` during the dockerized pip install step:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerBuildCmdExtraArgs: ['--build-arg', 'MY_GREAT_ARG=123']
    dockerRunCmdExtraArgs: ['-v', '${env:PWD}:/my-app']
```

Customize requirements file name

Some pip workflows involve using requirements files not named `requirements.txt`. To support these, this plugin has the following option:

```yaml
custom:
  pythonRequirements:
    fileName: requirements-prod.txt
```
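Since `fileName` is resolved like any other `serverless.yml` value, you can combine it with Framework variables, e.g. to pick a requirements file per stage. This is a sketch that assumes files such as `requirements-dev.txt` and `requirements-prod.txt` exist in your project:

```yaml
custom:
  pythonRequirements:
    fileName: requirements-${self:provider.stage}.txt
```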

Per-function requirements

Note: this feature does not work with Pipenv/Poetry; it requires `requirements.txt` files for your Python modules.

If you have different Python functions with different sets of requirements, you can avoid including all the unnecessary dependencies of your functions by using the following structure:

```
├── serverless.yml
├── function1
│   ├── requirements.txt
│   └── index.py
└── function2
    ├── requirements.txt
    └── index.py
```

With the content of your `serverless.yml` containing:

```yaml
package:
  individually: true

functions:
  func1:
    handler: index.handler
    module: function1
  func2:
    handler: index.handler
    module: function2
```

The result is 2 zip archives, with only the requirements for function1 in the first one, and onlythe requirements for function2 in the second one.

Quick notes on the config file:

  • The `module` field must be used to tell the plugin where to find the `requirements.txt` file for each function.
  • The `handler` field must not be prefixed by the folder name (already known through `module`), as the root of the zip artifact is already the path to your function.
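For this layout, each function's `index.py` only needs a top-level handler. A minimal hypothetical example for `function1` (the response body is illustrative):

```python
# function1/index.py -- minimal handler sketch (hypothetical)
import json


def handler(event, context):
    # Anything listed in function1/requirements.txt is importable here;
    # function2's requirements are not bundled into this artifact.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "hello from function1"}),
    }
```

Note the handler reference in `serverless.yml` is `index.handler`, not `function1/index.handler`, because `module` already supplies the folder.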

Customize Python executable

Sometimes your Python executable isn't available on your `$PATH` as `python2.7` or `python3.6` (for example, on Windows or when using pyenv). To support this, this plugin has the following option:

```yaml
custom:
  pythonRequirements:
    pythonBin: /opt/python3.6/bin/python
```

Vendor library directory

For certain libraries, default packaging produces too large an installation, even when zipping. In those cases it may be necessary to tailor-make a version of the module. In that case you can store them in a directory and use the `vendor` option, and the plugin will copy them along with all the other dependencies to install:

```yaml
custom:
  pythonRequirements:
    vendor: ./vendored-libraries

functions:
  hello:
    handler: hello.handler
    vendor: ./hello-vendor # The option is also available at the function level
```

Manual invocation

The `.requirements` and `requirements.zip` (if using zip support) files are left behind to speed things up on subsequent deploys. To clean them up, run:

```bash
sls requirements clean
```

You can also create them (and `unzip_requirements` if using zip support) manually with:

```bash
sls requirements install
```

The pip download/static cache is outside the serverless folder, and should be cleaned manually when, for example, changing Python versions:

```bash
sls requirements cleanCache
```

Invalidate requirements caches on package

If you are using your own Python library, you have to clean up `.requirements` on any update. You can use the following option to clean up `.requirements` every time you package:

```yaml
custom:
  pythonRequirements:
    invalidateCaches: true
```

🍎🍺🐍 Mac Brew installed Python notes

Brew wilfully breaks the `--target` option with no apparent intention to fix it, which causes issues since this plugin uses that option. There are a few easy workarounds:

  • Use the `pythonBin` option to point at a Python executable not installed by brew.

OR

  • Create a virtualenv and activate it while using serverless.

OR

  • Use Docker by enabling the `dockerizePip` option.

Also, brew seems to cause issues with pipenv, so make sure you install pipenv using pip.

🏁 WindowsdockerizePip notes

For usage of `dockerizePip` on Windows, do Step 1 only if running serverless on Windows, or do both Steps 1 & 2 if running serverless inside WSL.

  1. Enable shared volumes in the Windows Docker Taskbar settings.
  2. Install the Docker client on Windows Subsystem for Linux (Ubuntu).

Native Code Dependencies During Build

Some Python packages require extra OS dependencies to build successfully. To deal with this, replace the default image with a `Dockerfile` like:

```dockerfile
FROM public.ecr.aws/sam/build-python3.9

# Install your dependencies
RUN yum -y install mysql-devel
```

Then update your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerFile: Dockerfile
```

Native Code Dependencies During Runtime

Some Python packages require extra OS libraries (`*.so` files) at runtime. You need to manually include these files in the root directory of your Serverless package. The simplest way to do this is to use the `dockerExtraFiles` option.

For instance, the `mysqlclient` package requires `libmysqlclient.so.1020`. If you use the Dockerfile from the previous section, add an item to the `dockerExtraFiles` option in your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerExtraFiles:
      - /usr/lib64/mysql57/libmysqlclient.so.1020
```

Then verify the library gets included in your package:

```bash
sls package
zipinfo .serverless/xxx.zip
```

If you can't see the library, you might need to adjust your package include/exclude configuration in `serverless.yml`.

Optimising packaging time

If you wish to exclude most of the files in your project, and only include the source files of your lambdas and their dependencies, you might use an approach like this:

```yaml
package:
  individually: false
  include:
    - './src/lambda_one/**'
    - './src/lambda_two/**'
  exclude:
    - '**'
```

This will be very slow. Serverless adds a default `"**"` include. If you are using the `cacheLocation` parameter with this plugin, this will result in all of the cached files' names being loaded and then subsequently discarded because of the exclude pattern. To avoid this happening, you can add a negated include pattern, as is observed in serverless/serverless#5825.

Use this approach instead:

```yaml
package:
  individually: false
  include:
    - '!./**'
    - './src/lambda_one/**'
    - './src/lambda_two/**'
  exclude:
    - '**'
```

Custom Provider Support

Scaleway

This plugin is compatible with the Scaleway Serverless Framework Plugin to package dependencies for Python functions deployed on Scaleway. To use it, add the following to your `serverless.yml`:

```yaml
provider:
  name: scaleway
  runtime: python311

plugins:
  - serverless-python-requirements
  - serverless-scaleway-functions
```

To handle native dependencies, it's recommended to use the Docker builder with the image provided by Scaleway:

```yaml
custom:
  pythonRequirements:
    # Can use any Python version supported by Scaleway
    dockerImage: rg.fr-par.scw.cloud/scwfunctionsruntimes-public/python-dep:3.11
```

Contributors

