While working on the web-minify project, I had to figure out how to deploy a Python package to PyPI from GitLab. Here are my findings; hopefully they are useful for someone else!
Key information
- The GitLab Docker executor runs each job in a separate container.
- No data is shared between jobs by default, so you have to use build artifacts to share files between jobs.
- The build job can thus prepare the package data in the dist/ folder, ready for deployment, and then mark the dist/ folder as a build artifact.
- The deploy job will then have the dist/ folder available (jobs in subsequent stages have access to the artifacts defined by earlier jobs).
- The deploy job can then invoke twine to upload the package to PyPI.
- Twine takes a username and a password from a configuration file, or from the TWINE_USERNAME and TWINE_PASSWORD environment variables.
- To avoid storing the password in the source code, a TWINE_PASSWORD environment variable is set in the GitLab CI/CD variable settings of the project.
- PyPI supports uploading packages using API tokens instead of a username/password. In this mode the username is set to "__token__" and the password is the long token that PyPI gives you when you create it, rather than your actual personal password. This allows some granularity of permissions: you can create a token that can only access one project instead of all of a user's projects. That token is what gets stored in the GitLab variable TWINE_PASSWORD. A minimal pipeline sketch illustrating all of this follows this list.
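
To make this concrete, here is a minimal .gitlab-ci.yml sketch of the build/deploy split described above. The image, stage names, build commands, and the tag-only publishing rule are assumptions for illustration; the actual web-minify pipeline drives these steps through its Makefile.

```yaml
stages:
  - build
  - deploy

build:
  stage: build
  image: python:3.11           # assumed image; use whatever your project targets
  script:
    - pip install build
    - python -m build          # writes the sdist and wheel into dist/
  artifacts:
    paths:
      - dist/                  # mark dist/ as an artifact so later jobs can use it

deploy:
  stage: deploy
  image: python:3.11
  variables:
    TWINE_USERNAME: __token__  # PyPI API-token mode
  rules:
    - if: $CI_COMMIT_TAG       # publishing only tagged commits is an assumption
  script:
    - pip install twine
    # TWINE_PASSWORD is read from the masked CI/CD variable set in the
    # GitLab project settings; it never appears in the repository.
    - twine upload dist/*
```

Because dist/ is declared as an artifact in the build job, GitLab transfers it into the deploy job's fresh container automatically; nothing needs to be rebuilt there.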
You can look at the .gitlab-ci.yml and Makefile of the web-minify project to see a full example of how this is done.
Good luck!