Features#
PyScaffold comes with many elaborate features and sensible configuration defaults to make the most common tasks in developing, maintaining and distributing your own Python package as easy as possible.
Configuration, Packaging & Distribution#
All configuration can be done in setup.cfg, like changing the description, URL, classifiers, installation requirements and so on, as defined by setuptools. That means in most cases it is not necessary to tamper with setup.py.
The syntax of setup.cfg is pretty much self-explanatory and well commented; check out this example or setuptools’ documentation.
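For orientation, a minimal setup.cfg might look like the fragment below (the name, description and URL are purely illustrative placeholders, not values generated by PyScaffold):

```ini
[metadata]
name = my_package
description = A short summary of what the package does
url = https://github.com/my_user/my_project
classifiers =
    Development Status :: 4 - Beta
    Programming Language :: Python
```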
If you use tox, PyScaffold will already configure everything out of the box [1] so you can easily build your distribution, in a PEP 517/PEP 518 compliant way, by just running:
tox -e build
Alternatively, if you are not a huge fan of isolated builds, or prefer running the commands yourself, you can execute python -m build --no-isolation.
Uploading to PyPI#
Of course uploading your package to the official Python package index PyPI for distribution also works out of the box. Just create a distribution as mentioned above and use tox to publish with:
tox -e publish
This will first upload your package to TestPyPI, so you can be a good citizen of the Python world, check/test that everything is fine, and then, when you are absolutely sure the moment has come for your package to shine, you can go ahead and run tox -e publish -- --repository pypi [2]. Just remember that for this to work, you first have to register a PyPI account (and also a TestPyPI one).
Under the hood, tox uses twine for uploads to PyPI (as configured by PyScaffold in the tox.ini file), so if you prefer running things yourself, you can also do:
pip install twine
twine upload --repository testpypi dist/*
Please note that PyPI does not allow uploading local versions, e.g. 0.0.dev5+gc5da6ad, for practical reasons. Thus, you have to create a Git tag before uploading a version of your distribution. Read more about it in the versioning section below.
Namespace Packages#
If you want to work with namespace packages, you will be glad to hear that
PyScaffold supports the PEP 420 specification for implicit namespaces,
which is very useful to distribute a larger package as a collection of smaller ones.
putup can automatically set up everything you need with the --namespace option. For example, use:
putup my_project --package my_package --namespace com.my_domain
to define my_package inside the namespace com.my_domain, Java-style.
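As a rough sketch (the exact scaffold contents vary between PyScaffold versions), the command above yields a source tree where the namespace folders contain no __init__.py, as PEP 420 prescribes:

```
my_project/
└── src/
    └── com/                  # namespace level, no __init__.py
        └── my_domain/        # namespace level, no __init__.py
            └── my_package/
                └── __init__.py
```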
Note
Prior to PyScaffold 4.0, namespaces were generated explicitly with pkg_resources, instead of PEP 420. Moreover, if you are developing “subpackages” for already existing namespaces, please check which convention the namespaces are currently following. Different styles of namespace packages might be incompatible. If you don’t want to update existing namespace packages to PEP 420, you will probably need to manually copy the __init__.py file for the umbrella namespace folder from an existing project. Additionally, have a look at our FAQ about how to disable implicit namespaces.
Package and Files Data#
Additional data, e.g. images and text files, that must reside within your package, e.g. under my_project/src/my_package, and are tracked by Git will automatically be included if include_package_data = True is set in setup.cfg. In case data files are not packaged, use git ls-files to check whether they are really tracked by Git. It is not necessary to have a MANIFEST.in file for this to work; just make sure that all files are added to your repository.
To read this data in your code, use:
from pkgutil import get_data
data = get_data('my_package', 'path/to/my/data.txt')
Starting from Python 3.7, an even better approach is using importlib.resources:
from importlib.resources import read_text, read_binary
data = read_text('my_package.sub_package', 'data.txt')
Note that we need a proper package structure in this case, i.e. directories need to contain __init__.py and be named as a valid Python package (following the same rules as variable names). We only specify the file data.txt; no path is allowed. The library importlib_resources provides a backport of this feature.
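On Python 3.9 and later, the files() API is the recommended entry point into importlib.resources. The snippet below is a self-contained sketch: it fabricates a throwaway package on disk (demo_pkg is a made-up name used only so the example can run on its own; in a real project the package would already be installed):

```python
import pathlib
import sys
import tempfile
from importlib.resources import files  # Python 3.9+

# Fabricate a tiny package on disk so the example is self-contained
tmp = pathlib.Path(tempfile.mkdtemp())
pkg = tmp / "demo_pkg"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "data.txt").write_text("hello from a package file")
sys.path.insert(0, str(tmp))

# files() returns a Traversable that behaves much like a pathlib.Path
data = files("demo_pkg").joinpath("data.txt").read_text()
print(data)  # hello from a package file
```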
Please keep in mind that the include_package_data option in setup.cfg is only guaranteed to be read when creating a wheel distribution. Other distribution methods might behave unexpectedly (e.g. always including data files even when include_package_data = False). Therefore, the best option if you want to have data files in your repository but not as part of the pip-installable package is to add them somewhere outside the src directory (e.g. a files directory in the root of the project, or inside tests if you use them for checks). Additionally, you can exclude them explicitly via the [options.packages.find] exclude option in setup.cfg.
More information about data file support is available on the setuptools website.
Tip
Using package files to store runtime configuration or mutable data is not considered good practice. Package files should be read-only. If you need configuration files, or files that should be written at runtime, please consider doing so inside standard locations in the user’s home folder (platformdirs is a good library for that). If needed you can even create them at the first usage from a read-only template, which in turn can be a package file.
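The pattern from the tip above can be sketched with the standard library alone. In real code you would obtain the directory from platformdirs.user_config_dir instead of a temporary folder, and the template would typically be a read-only package file; all names below are made up for illustration:

```python
import tempfile
from pathlib import Path

# Read-only default settings; in practice this could be shipped as a
# package data file and read via importlib.resources.
TEMPLATE = "verbose = false\n"

def ensure_config(config_dir: Path) -> Path:
    """Create the user's config file from the template on first use."""
    cfg = config_dir / "settings.cfg"
    if not cfg.exists():
        config_dir.mkdir(parents=True, exist_ok=True)
        cfg.write_text(TEMPLATE)  # seed from the read-only default
    return cfg

# Demo with a throwaway directory standing in for the user's config dir
cfg = ensure_config(Path(tempfile.mkdtemp()) / "my_package")
print(cfg.read_text())  # verbose = false
```

Subsequent calls find the existing file and leave it untouched, so user edits survive.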
Versioning and Git Integration#
Your project is already an initialised Git repository and setuptools uses the information of tags to infer the version of your project with the help of setuptools_scm. To use this feature you need to tag with the format MAJOR.MINOR[.PATCH], e.g. 0.0.1 or 0.1.
You can run python -m setuptools_scm to retrieve the current PEP 440-compliant version [4]. This version will be used when building a package and is also accessible through my_project.__version__. If you want to upload to PyPI you have to tag the current commit before uploading, since PyPI does not allow local versions, e.g. 0.0.dev5+gc5da6ad, for practical reasons.
Please check our docs for the best practices and common errors with version numbers.
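The tagging flow can be sketched as follows. The snippet creates a throwaway repository only so that the commands can run on their own; in your project you would simply tag the release commit and let setuptools_scm pick the version up:

```shell
# Self-contained demo: a throwaway repository with one tagged commit
set -e
demo=$(mktemp -d) && cd "$demo"
git init -q
git config user.email dev@example.com
git config user.name dev
git commit -q --allow-empty -m "initial commit"
git tag -a 0.1 -m "Release 0.1"   # tag format: MAJOR.MINOR[.PATCH]
git describe --tags               # prints: 0.1
```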
Pre-commit Hooks#
Unleash the power of Git by using its pre-commit hooks.
This feature is available through the --pre-commit flag.
After your project’s scaffold was generated, make sure pre-commit is installed, e.g. pip install pre-commit, then just run pre-commit install.
It goes without saying that a default .gitignore file is also provided that is well adjusted for Python projects and the most common tools.
Sphinx Documentation#
PyScaffold will prepare a docs directory with all you need to start writing your documentation. Start editing the file docs/index.rst to extend the documentation and note that even Numpy and Google style docstrings are activated by default.
If you have tox in your system, simply run tox -e docs or tox -e doctests to compile the docs or run the doctests.
Alternatively, if you have make and Sphinx installed on your computer, build the documentation with make -C docs html and run doctests with make -C docs doctest. Just make sure Sphinx 1.3 or above is installed.
The documentation also works with Read the Docs. Please check the RTD guides to learn how to import your documents into the website.
Note
In order to generate the docs locally, you will need to install any dependency used to build your doc files (and probably all your project dependencies) in the same Python environment where Sphinx is installed (either the global Python installation or a conda/virtualenv/venv environment). For example, if you want to use the Read the Docs classic theme, the sphinx_rtd_theme package should be installed.
If you are using tox -e docs, tox will take care of generating a virtual environment and installing all these dependencies automatically. You will only need to list your doc dependencies (like sphinx_rtd_theme) under the deps property of the [testenv:{docs,doctests}] section in the tox.ini file.
You can also use the docs/requirements.txt file to store them. This file can be used by both Read the Docs and tox when generating the docs.
Dependency Management in a Breeze#
PyScaffold out of the box allows developers to express abstract dependencies and take advantage of pip to manage installation. It can also be used together with a virtual environment (also called virtual env) to avoid dependency hell during both development and production stages.
If you like the traditional style of dependency management using a virtual env co-located with your package, PyScaffold can help to reduce the boilerplate. With the --venv option, a virtualenv will be bootstrapped and waiting to be activated. And if you are the kind of person that always installs the same packages when creating a virtual env, PyScaffold’s --venv-install PACKAGE option will be the right one for you. You can even integrate pip-tools in this workflow by putting -e file:. in your requirements.in.
Alternatively, PyPA’s Pipenv can be integrated into any PyScaffold-generated project by following standard setuptools conventions. Keeping abstract requirements in setup.cfg and running pipenv install -e . is basically all you have to do.
You can check the details on how all of that works in Dependency Management.
Warning
Experimental Feature - Pipenv and pip-tools support is experimental and might change in the future.
Automation, Tests & Coverage#
PyScaffold relies on pytest to run all automated tests defined in the subfolder tests. Some sane default flags for pytest are already defined in the [tool:pytest] section of setup.cfg. The pytest plugin pytest-cov is used to automatically generate a coverage report. It is also possible to provide additional parameters and flags on the command line, e.g. type:
pytest -h
to show the help of pytest (requires pytest to be installed in your system or virtual environment).
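For reference, the relevant fragment of setup.cfg looks roughly like this; the exact flags vary between PyScaffold versions, and my_package stands in for your package name:

```ini
[tool:pytest]
addopts =
    --cov my_package --cov-report term-missing
    --verbose
testpaths = tests
```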
JUnit and Coverage HTML/XML#
For usage with continuous integration software, JUnit and Coverage XML output can be activated in setup.cfg. Use the flag --cirrus to generate templates of the Cirrus CI configuration file .cirrus.yml, which even features the coverage and stats system Coveralls.
Alternatively, you can also generate configuration files for GitLab CI or GitHub Actions by running putup with the --gitlab or --github-actions flags.
Managing test environments and tasks with tox#
Projects generated with PyScaffold are configured by default to use tox to run some common tasks. Tox is a virtual environment management and test tool that allows you to define and run custom tasks that call executables from Python packages.
If you simply install tox and run from the root folder of your project:
tox
tox will download the dependencies you have specified, build the package, install it in a virtual environment and run the tests using pytest, so you are sure everything is properly tested. You can rely on the tox documentation for detailed configuration options (which include the possibility of running the tests for different versions of Python).
You are not limited to running your tests, with tox you can define all sorts of automation tasks. Here are a few examples for you:
tox -e build # will bundle your package and create a distribution inside the `dist` folder
tox -e publish # will upload your distribution to a package index server
tox -e docs # will build your docs
but you can go ahead and check the tox examples, or this tox tutorial from Sean Hammond, for more ideas, e.g. running static code analyzers (pyflakes and pep8) with flake8. Run tox -av to list all the available tasks.
Management of Requirements & Licenses#
Installation requirements of your project can be defined inside setup.cfg, e.g. install_requires = numpy; scipy. To avoid package dependency problems it is common not to pin installation requirements to any specific version, although minimum versions, e.g. sphinx>=1.3, and/or maximum versions, e.g. pandas<0.12, are frequently used in accordance with semantic versioning.
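In setup.cfg, requirements are usually listed one per line under the [options] section; the packages and version bounds below are purely illustrative:

```ini
[options]
install_requires =
    numpy>=1.13
    scipy
    pandas>=0.25,<2.0
```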
For test/dev purposes, you can additionally create a requirements.txt pinning packages to specific versions, e.g. numpy==1.13.1. This helps to ensure reproducibility, but be sure to read our Dependency Management Guide to understand the role of a requirements.txt file for library and application projects (pip-compile from pip-tools can help you to manage that file). Packages defined in requirements.txt can be easily installed with:
pip install -r requirements.txt
The most popular open source licenses can be easily added to your project with the help of the --license flag. You only need to specify the license identifier according to the SPDX index so PyScaffold can generate the appropriate LICENSE.txt and configure your package. For example:
putup --license MPL-2.0 my_project
will create the my_project package under the Mozilla Public License 2.0. The available licenses can be listed with putup --help, and you can find more information about each license in the SPDX index and choosealicense.com.
Extensions#
PyScaffold offers several extensions:

- If you want a project setup for a Data Science task, just use --dsproject after having installed pyscaffoldext-dsproject.
- Have a README.md based on Markdown instead of README.rst by using --markdown after having installed pyscaffoldext-markdown.
- Create a Django project with the flag --django, which is equivalent to django-admin startproject my_project enhanced by PyScaffold’s features (requires pyscaffoldext-django).
- … and many more, like --gitlab to create the necessary files for GitLab CI, --github-actions to configure GitHub Actions, --travis for Travis CI (see pyscaffoldext-travis), or --cookiecutter for Cookiecutter integration (see pyscaffoldext-cookiecutter).
Find more extensions within the PyScaffold organisation and consider contributing your own; it is very easy!
You can quickly generate a template for your extension with the --custom-extension option after having installed pyscaffoldext-custom-extension. Have a look at our guide on writing extensions to get started.
All extensions can easily be installed with pip install pyscaffoldext-NAME.
Easy Updating#
Keep your project’s scaffold up-to-date by applying putup --update my_project when a new version of PyScaffold is released. An update will only overwrite files that are not often altered by users, like setup.py. To update all files, use --update --force.
An existing project that was not set up with PyScaffold can be converted with putup --force existing_project. The force option is completely safe to use, since the git repository of the existing project is not touched!
Please check out the Updating from Previous Versions docs for more information on how to migrate from old versions and configuration options in setup.cfg.
Adding features#
With the help of an experimental updating functionality it is also possible to add additional features to your existing project scaffold. If a scaffold lacking .cirrus.yml was created with putup my_project, it can later be added by issuing putup my_project --update --cirrus. For this to work, PyScaffold stores all options that were initially used to put up the scaffold under the [pyscaffold] section in setup.cfg. Be aware that right now PyScaffold provides no way to remove a feature which was once added.
PyScaffold Configuration#
After having used PyScaffold for some time, you will probably notice yourself repeating the same options for almost every new project. Don’t worry: PyScaffold now allows you to set default flags using the experimental default.cfg file [3]. Check out our Configuration section to get started.