# modern-python-template: How to setup an open source, github hosted, python package

2025-10: The meta on how to setup a python project has changed again, and this example is now out of date.
Just use uv.


![Build Status](https://github.com/gecrooks/modern-python-template/workflows/Build/badge.svg)

[Source](https://github.com/gecrooks/modern-python-template)

## History

### v3 2024-02-28
* Move all project configuration into pyproject.toml (and remove the legacy setup.cfg and setup.py files)
* Replace isort, black, and flake8 with ruff


## Quickstart

This is a [cookiecutter](https://github.com/cookiecutter/cookiecutter) python template for a minimal python package.
Install and run cookiecutter, answer the configuration questions, and you should be good to go.

    pip install -U cookiecutter
    cookiecutter https://github.com/gecrooks/modern-python-template.git

To complete github setup, create a new empty repo on github with the same name, add it as origin to your project, and push to github.

    cd example_python_project
    git remote add origin https://github.com/somebody/example_python_project.git
    git push -u origin master
    git push origin v0.0.0

On github, you'll want to complete the About section (project description, website, and topics), add your PyPi user name and password as Secrets (if you're planning to upload to PyPi), and protect the [master branch](https://amachreeowanate.medium.com/how-to-protect-the-master-branch-on-github-ab85e9b6b03).

## About: On the creation and crafting of a python project

This is a discussion of the steps needed to setup an open source, github hosted, python package ready for further development.
The minimal project we're building is located in the [example_python_project](example_python_project) subdirectory. The rest of the files in the repo are for a [cookiecutter](https://github.com/cookiecutter/cookiecutter) template that creates the example python project.

## Naming

The first decision to make is the name of the project.
And for python packages the most important criterion is that the name isn't already taken on [pypi](https://pypi.org/), the repository from which we install python packages with `pip`. So we should do a quick Internet search: this name is available on pypi, there are no other repos of that name on github, and a google search doesn't pull up anything relevant. So we're good to go.

Note that github repos and pypi packages are generally named using dashes (`-`), but that the corresponding python modules are named with underscores (`_`). (The reason for this dichotomy appears to be that underscores don't work well in URLs, while dashes are frowned upon in filenames.)

## License

The next decision is which of the plethora of [Open Source](https://opensource.org/licenses) licenses to use. We'll use the [Apache License](https://opensource.org/licenses/Apache-2.0), a perfectly reasonable, and increasingly popular, choice.


## Create repo

Next we need to initialize a git repo. It's easiest to create the repo on github and clone it to our local machine (this way we don't have to mess around setting the origin and suchlike). Github will helpfully add a `README.md`, the license, and a python `.gitignore` for us. On Github, add a description, website url (typically pointing at readthedocs), project tags, and review the rest of github's settings.


Note that MacOS likes to scatter `.DS_Store` files around (they store the Finder icon display options). We don't want to accidentally add these to our repo. But this is a machine/developer issue, not a project issue. So if you're on a mac you should configure git to ignore `.DS_Store` globally.

```
    # specify a global exclusion list
    git config --global core.excludesfile ~/.gitignore
    # add .DS_Store to that list
    echo .DS_Store >> ~/.gitignore
```

## Clone repo

On our local machine the first thing we do is create a new conda environment. (You have conda installed, right?)
This way if we balls up the installation of some dependency (which happens distressingly often) we can nuke the environment and start again.
```
    $ conda create --name GPT
    $ source activate GPT
    (GPT) $ python --version
    Python 3.11.0
```

Now we clone the repo locally.

```
    (GPT) $ git clone https://github.com/gecrooks/modern-python-template.git
    Cloning into 'modern-python-template'...
    remote: Enumerating objects: 4, done.
    remote: Counting objects: 100% (4/4), done.
    remote: Compressing objects: 100% (3/3), done.
    remote: Total 4 (delta 0), reused 0 (delta 0), pack-reused 0
    Unpacking objects: 100% (4/4), done.
    (GPT) $ cd modern-python-template
```

Let's tag this initial commit for posterity's sake (and so I can [link](https://github.com/gecrooks/modern-python-template/releases/tag/v0.0.0) to the code at this point).
```
  (GPT) $ git tag v0.0.0
  (GPT) $ git push origin v0.0.0
```
For reasons that are unclear to me a regular `git push` doesn't push tags. We have to push the tags explicitly by name. Note we need to specify a full MAJOR.MINOR.PATCH version number, and not just e.g. '0.1', for technical reasons that have to do with how we're going to manage package versions.


## Branch
It's always best to craft code in a branch, and then merge that code into the master branch.
```
$ git branch gec001-init
$ git checkout gec001-init
Switched to branch 'gec001-init'
```
I tend to name branches with my initials (so I know it's my branch on multi-developer projects), a serial number (so I can keep track of the chronological order of branches), and a keyword (if I know ahead of time what the branch is for).


## Packaging

Let's complete the minimum viable python project. We need the actual python module, signaled by a (currently) blank `__init__.py` file.
```
    (GPT) $ mkdir example_python_project
    (GPT) $ touch example_python_project/__init__.py
```

Python standards for packaging and distribution seem to be in flux (again...). So, following what I think the current standard is, we need 3 files: `setup.py`, `pyproject.toml`, and `setup.cfg`.

The modern `setup.py` is just a husk:

```
#!/usr/bin/env python

import setuptools

if __name__ == "__main__":
    setuptools.setup(use_scm_version=True)
```
Our only addition is `use_scm_version=True`, which activates versioning with git tags. More on that anon. Don't forget to set executable permissions on the setup.py script.
```
 $ chmod a+x setup.py
```
The [pyproject.toml](https://snarky.ca/what-the-heck-is-pyproject-toml/) file (written in [toml](https://github.com/toml-lang/toml) format) is a recent addition to the canon. It specifies the tools used to build the project.
```
# pyproject.toml
[build-system]
requires = ["setuptools>=42", "wheel", "setuptools_scm[toml]>=3.4"]
build-backend = "setuptools.build_meta"

[tool.setuptools_scm]
```
Again, the parts with `setuptools_scm` are additions.


All of the rest of the metadata goes in `setup.cfg` (in INI format).
```
# Setup Configuration File
# setup.cfg is the configuration file for setuptools. It tells setuptools about your package
# (such as the name and version) as well as which code files to include. Eventually much of
# this configuration may be able to move to pyproject.toml.
#
# https://packaging.python.org/tutorials/packaging-projects/
# https://docs.python.org/3/distutils/configfile.html
# [INI](https://docs.python.org/3/install/index.html#inst-config-syntax) file format.
#
# Project cut from gecrooks_python_template cookiecutter template
# https://github.com/gecrooks/modern-python-template


[metadata]
# https://setuptools.readthedocs.io/en/latest/userguide/declarative_config.html
# SPDX license short-form identifier, https://spdx.org/licenses/
# https://pypi.org/classifiers/
# setuptools v53.1.0+ expects lower cased keys, e.g. "Name" must be "name".

name = {{cookiecutter.module_name}}
summary = {{cookiecutter.short_description}}
long_description = file:README.md
long_description_content_type = text/markdown
keywords = python
url = https://github.com/{{cookiecutter.github_username}}/{{cookiecutter.module_name}}/
author = {{cookiecutter.author_name}}
author_email = {{cookiecutter.author_email}}
license = {{cookiecutter.license}}
license_file = LICENSE
classifiers =
    Development Status :: 4 - Beta
    Intended Audience :: Developers
    Intended Audience :: Science/Research
    Programming Language :: Python
    Natural Language :: English
    Operating System :: OS Independent
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11
    Topic :: Scientific/Engineering
    Topic :: Software Development
    Topic :: Software Development :: Libraries
    Topic :: Software Development :: Libraries :: Python Modules
    Typing :: Typed


[options]
zip_safe = True
python_requires = >= 3.9
packages = find:

install_requires =
    numpy

setup_requires =
    setuptools_scm


[options.extras_require]
dev =
    numpy >= 1.20               # v1.20 introduces typechecking for numpy
    setuptools_scm
    pytest >= 4.6
    pytest-cov
    flake8
    mypy
    black
    isort
    sphinx
```
Confusingly there are two different standards for metadata. At present the metadata lives in `setup.cfg` and should follow the setuptools [specification](https://setuptools.readthedocs.io/en/latest/userguide/declarative_config.html). But the intention seems to be that in the long run the metadata moves to `pyproject.toml` and follows a different [specification](https://packaging.python.org/specifications/core-metadata/).


It's good practice to support at least two consecutive versions of python. Starting with 3.9, python is moving to an annual [release schedule](https://www.python.org/dev/peps/pep-0602/). The initial 3.x.0 release will be in early October, the first bug patch 3.x.1 in early December, the second in February, and so on. Since it takes many important packages some time to upgrade (e.g. numpy and tensorflow are often bottlenecks), one should probably plan to upgrade python support around the beginning of each year. Upgrading involves changing the python version numbers in the workflow tests and `setup.cfg`, and then cleaning up any `__future__` or conditional imports, or other hacks added to maintain compatibility with older python releases. If you protected the master branch on github, and added required status checks, you'll need to update those too. Supporting older python versions is often a good idea, if you don't need the newest whizz-bang python features.


We can now install our package (as editable, `-e`, so that the code in our repo is live).
```
   $ pip install -e .[dev]
```
The optional `[dev]` will install all of the extra packages we need for test and development, listed under `[options.extras_require]` above.



## Versioning
Our project needs a version number (e.g. '3.1.4'). We'll try and follow the [semantic versioning](https://semver.org/) conventions.
But as long as the major version number is '0' we're allowed to break things.

There should be a [single source of truth](https://packaging.python.org/guides/single-sourcing-package-version/) for this number. My favored approach is to use git tags as the source of truth (Option 7 in the above linked list). We're going to tag releases anyway, so if we also hard coded the version number into the python code we'd violate the single source of truth principle. We use the [setuptools_scm](https://github.com/pypa/setuptools_scm) package to automatically construct a version number from the latest git tag during installation.

The convention is that the version number of a python package should be available as `packagename.__version__`. So we add the following code to `example_python_project/config.py` to extract the version number from the package metadata.
```

__all__ = ["__version__", "importlib_metadata", "about"]


# Backwards compatibility imports
try:
    # python >= 3.9
    from importlib import metadata as importlib_metadata  # type: ignore
except ImportError:  # pragma: no cover
    import importlib_metadata  # type: ignore  # noqa: F401


try:
    __version__ = importlib_metadata.version(__package__)  # type: ignore
except Exception:  # pragma: no cover
    # package is not installed
    __version__ = "0.0.0"

```
and then in `example_python_project/__init__.py`, we import this version number.
```
from .config import __version__ as __version__                      # noqa: F401
```
We put the code to extract the version number in `config.py` and not `__init__.py`, because we don't want to pollute our top level package namespace.

The various pragmas in the code above ("pragma: no cover" and "type: ignore") are there because the conditional imports confuse both our type checker and code coverage tools.



## about

One of my tricks is to add a function to print the versions of the core upstream dependencies.
This can be extremely helpful when debugging configuration or system dependent bugs, particularly when running continuous integration tests.

```
# Configuration (> python -m example_python_project.about)
platform                 macOS-10.16-x86_64-i386-64bit
example_python_project   0.0.0
python                   3.10.3
numpy                    1.20.1
setuptools_scm           5.0.2
pytest                   6.2.2
pytest-cov               2.11.1
flake8                   6.0.0
mypy                     0.812
black                    20.8b1
isort                    5.7.0
sphinx                   3.5.1
pre-commit               2.20.0
```
The `about()` function to print this information is placed in `config.py`. The file `about.py` contains the standard python command line interface (CLI),
```
if __name__ == '__main__':
    import example_python_project
    example_python_project.about()
```
It's important that `about.py` isn't imported by any other code in the package, else we'll get multiple import warnings when we try to run the CLI.

If you don't want the `about` functionality, remove the file `about.py` and the `about()` function in `config.py`, delete the relevant tests in `config_test.py`, and edit the Makefile.

## Unit tests

Way back when I worked as a commercial programmer, the two most important things that I learned were source control and unit tests. Both were largely unknown in the academic world at the time.

(I was once talking to a chap who was developing a new experimental platform. The plan was to build several dozen of these gadgets, and sell them to other research groups so they didn't have to build their own. A couple of grad students wandered in. They were working with one of the prototypes, and they'd found some minor bug. Oh yes, says the chap, who goes over to his computer, pulls up the relevant file, edits the code, and gives the students a new version of that file. He didn't run any tests, because there were no tests.
And there was no source control, so there was no record of the change he'd just made. That was it. The horror.)

Currently, the two main options for python unit tests appear to be `unittest` from the standard library and `pytest`. To me `unittest` feels very javonic. There's a lot of boilerplate code, and I believe it's a direct descendant of an early java unit testing framework. Pytest, on the other hand, feels pythonic. In the basic case all we have to do is write functions (whose names are prefixed with 'test_'), within which we test code with `assert`s. Easy.

There are two common ways to organize tests. Either we place tests in a separate directory, or they live in the main package along with the rest of the code. In the past I've used the former approach. It keeps the tests organized and separate from the production code. But I'm going to try the second approach for this project. The advantage is that the unit tests for a piece of code live right next to the code being tested.

Let's test that we can access the version number. (No piece of code is so trivial that it shouldn't have a unit test.) In `example_python_project/config_test.py` we add

```
import example_python_project

def test_version():
    assert example_python_project.__version__
```
and run our test.
(The 'python -m' prefix isn't strictly necessary, but it helps ensure that pytest is running under the correct copy of python.)
```
(GPT) $ python -m pytest
====================================== test session starts ======================================
platform darwin -- Python 3.8.3, pytest-5.4.3, py-1.8.2, pluggy-0.13.1
rootdir: /Users/work/Work/Projects/example_python_project
collected 1 item

example_python_project/config_test.py .                                                   [100%]

======================================= 1 passed in 0.02s =======================================
```

Note that in the main code we'll access the package with relative imports, e.g.
```
from . import __version__
```
But in the test code we use absolute imports.
```
from example_python_project import __version__
```
In tests we want to access our code in the same way we would access it from the outside as an end user.


## Test coverage

At a bare minimum the unit tests should run (almost) every line of code. If a line of code never runs, then how do you know it works at all? (High code coverage does not mean you have a [good test suite](https://preslav.me/2020/12/03/the-myth-of-code-coverage/). But a good set of unit tests will have high code coverage.)

So we want to monitor the test coverage. The [pytest-cov](https://pypi.org/project/pytest-cov/) plugin to pytest will do this for us.
Configuration is placed in the setup.cfg file. (Config can also be placed in a separate `.coveragerc`, but I think it's better to avoid a proliferation of configuration files.)
```
# pytest configuration
[tool:pytest]
testpaths =
    example_python_project


# Configuration for test coverage
#
# https://coverage.readthedocs.io/en/latest/config.html
#
# python -m pytest --cov

[coverage:paths]
source =
    example_python_project

[coverage:run]
omit =
    *_test.py

[coverage:report]
# Use ``# pragma: no cover`` to exclude specific lines
exclude_lines =
    pragma: no cover
    except ImportError
    assert False
    raise NotImplementedError()
    pass
```

We have to explicitly omit the unit tests from coverage since we have placed the test files in the same directories as the code to test.

The [pragma](https://en.wikipedia.org/wiki/Directive_(programming)) `pragma: no cover` is used to mark untestable lines. This often happens with conditional imports used for backwards compatibility between python versions. The other excluded lines are common patterns of code that don't need test coverage.


## Linting

We need to lint our code before pushing any commits. I like [flake8](https://flake8.pycqa.org/en/latest/). It's faster than pylint, and has (I think) better error messages. I will hereby declare:

    The depth of the indentation shall be 4 spaces.
    And 4 spaces shall be the depth of the indentation.
    Two spaces thou shall not use.
    And tabs are right out.

Four spaces is standard. [Tabs are evil](https://www.emacswiki.org/emacs/TabsAreEvil). I've worked on a project with 2-space indents, and I see the appeal, but I found it really weird.

Most of flake8's defaults are perfectly reasonable and in line with [PEP8](https://www.python.org/dev/peps/pep-0008/) guidance. But even [Linus](https://lkml.org/lkml/2020/5/29/1038) agrees that the old standard of 80 columns of text is too restrictive.
(Allegedly, 2-space indents were [Google's](https://www.youtube.com/watch?v=wf-BqAjZb8M&feature=youtu.be&t=260) solution to the problem that 80 character lines are too short. Just make the indents smaller!) Raymond Hettinger suggests 90ish (without a hard cutoff), and [black](https://black.readthedocs.io/en/stable/the_black_code_style.html) uses 88. So let's try 88.


The configuration also lives in `setup.cfg`.
```
# flake8 linter configuration
[flake8]
max-line-length = 88
ignore = E203, W503
```
We need to override the linter on occasion. We add pragmas such as `# noqa: F401` to assert that no, really, in this case we do know what we're doing.


Two other python code format tools to consider using are [isort](https://pypi.org/project/isort/) and [black, The uncompromising code formatter](https://black.readthedocs.io/en/stable/). Isort sorts your import statements into a canonical order. And black is the Model-T Ford of code formatting -- any format you want, so long as it's black. I could quibble with some of black's style choices, but in the end it's just easier to blacken your code and accept black's decisions, and thereby gain a consistent coding style across developers.

The command `make delint` will run `isort` and `black` on your code, with the right magic incantations so that they are compatible (`isort --profile black`, which appears to be equivalent to `isort -m 3 --tc --line-length 88`; we set this configuration project-wide in `setup.cfg`).


## Copyright
It's common practice to add a copyright and license notice to the top of every source file -- something like this:
```

# Copyright 2019-, Gavin E. Crooks and contributors
#
# This source code is licensed under the Apache License, Version 2.0
# found in the LICENSE file in the root directory of this source tree.

```

I tend to forget to add these lines.
So let's add a unit test `example_python_project/config_test.py::test_copyright` to make sure we don't.
```
import glob


def test_copyright():
    """Check that source code files contain a copyright line"""
    exclude = set(['example_python_project/version.py'])
    for fname in glob.glob('example_python_project/**/*.py', recursive=True):
        if fname in exclude:
            continue
        print("Checking " + fname + " for copyright header")

        with open(fname) as f:
            for line in f.readlines():
                if not line.strip():
                    continue
                assert line.startswith('# Copyright')
                break
```


## API Documentation
[Sphinx](https://www.sphinx-doc.org/en/master/usage/quickstart.html) is the standard tool used to generate API documentation from the python source. Use the handy quick start tool.
```
$ mkdir docsrc
$ cd docsrc
$ sphinx-quickstart
```
The defaults are reasonable. Enter the project name and author when prompted.

Edit conf.py, and add the following collection of extensions.
```
extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.napoleon',
]
```
[Autodoc](https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html) automatically extracts documentation from docstrings, and [napoleon](https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html) enables [Google style](http://google.github.io/styleguide/pyguide.html) python docstrings.

We also add a newline at the end of `conf.py`, since the lack of a blank line at the end upsets our linter.

Go ahead and give it a whirl. This won't do anything interesting yet, but it's a start.
```
$ make html
```

One problem is that sphinx creates three (initially) empty directories, `_build`, `_static`, and `_templates`. But we can't add empty directories to git, since git only tracks files.
The workaround is to add an empty `.gitignore` file to each of the `_static` and `_templates` directories. (An alternative convention is to add a `.gitkeep` file.) If we never want the files in these directories to be under source control, we can add a `*` to the `.gitignore` file. Sphinx will create the `_build` directory when it's needed.

```
$ touch _templates/.gitignore _build/.gitignore _static/.gitignore
$ git add -f _templates/.gitignore _build/.gitignore _static/.gitignore
$ git add Makefile *.*
$ cd ..
```


Note that we have placed the sphinx documentation tools in `docsrc` rather than the more traditional `docs`. This is to keep the `docs` directory available for serving documentation via github-pages. (We also have to update the root `.gitignore` file.)


## Makefile
I like to add a Makefile with targets for all of the common development tools I need to run. This is partially for convenience, and partially as documentation, i.e. here are all the commands you need to run to test, lint, typecheck, and build the code (and so on). I use a [clever hack](https://marmelab.com/blog/2016/02/29/auto-documented-makefile.html) so that the makefile self-documents.

```
(GPT) $ make
about        Report versions of dependent packages
status       git status --short --branch
init         Install package ready for development
all          Run all tests
test         Run unittests
coverage     Report test coverage
lint         Lint check python source
delint       Run isort and black to delint project
typecheck    Static typechecking
docs         Build documentation
docs-open    Build documentation and open in web browser
docs-clean   Clean documentation build
docs-github-pages Install html in docs directory ready for github pages
pragmas      Report all pragmas in code
build        Setuptools build
requirements Make requirements.txt
```

The pragmas target searches the code and lists all of the pragmas that occur.
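For instance, one pragma per tool, in miniature (a contrived sketch; the names are illustrative):

```python
import os as _os  # noqa: F401  -- flake8: this apparently unused import is deliberate


def todo() -> None:
    raise NotImplementedError()  # pragma: no cover -- coverage: don't count this line


answer: int = "42"  # type: ignore[assignment] -- mypy: yes, we really mean this
```
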
Common uses of [pragmas](https://en.wikipedia.org/wiki/Directive_(programming)) are to override the linter, tester, or typechecker.


## Readthedocs
We'll host our API documentation on [Read the Docs](https://readthedocs.org). We'll need a basic configuration file, `.readthedocs.yml`.
```
version: 2
formats: []
sphinx:
  configuration: docsrc/conf.py
python:
  version: 3.9
```
I've already got a readthedocs account, so setting up a new project takes but a few minutes.


## README.md

We add some basic information and installation instructions to `README.md`. Github displays this file on your project home page (but under the file list, so if you have a lot of files at the top level of your project, people might not notice your README).

A handy trick is to add Build Status and Documentation Status badges for Github actions tests and readthedocs. These will proudly declare that your tests are passing (hopefully). (See the top of this file.)


## Continuous Integration

Another brilliant advance in software engineering practice is continuous integration (CI). The basic idea is that all code gets thoroughly tested before it's added to the master branch.

Github now makes this very easy to set up with Github actions. They even provide basic templates.
This testing workflow lives in `.github/workflows/python-build.yml`, and is a modification of Github's `python-package.yml` workflow.
```
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Python package

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
  schedule:
    - cron: "0 13 * * *"  # Every day at 1pm UTC (6am PST)

jobs:
  build:

    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.9', '3.10', '3.11']

    steps:
    - uses: actions/checkout@v2
    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v2
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        python -m pip install flake8 pytest
        if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
        python -m pip install -e .[dev]  # install package + test dependencies
    - name: About
      run: |
        python -m $(python -Wi setup.py --name).about
    - name: Lint with flake8
      run: |
        flake8 .
    - name: Test with pytest
      run: |
        python -m pytest --cov-fail-under 100
    - name: Typecheck with mypy
      run: |
        mypy
    - name: Build documentation with sphinx
      run: |
        sphinx-build -M html docsrc docsrc/_build

```
Note that these tests are picky. Not only must the unit tests pass, but test coverage must be 100%, the code must be delinted, blackened, isorted, and properly typed, and the docs have to build without error.

It's a good idea to set a cron job to run the test suite against the main branch on a regular basis (the `schedule` block above).
This will alert you to problems caused by your dependencies updating. (For instance, one of my other projects just broke, apparently because flake8 updated its rules.)

Let's add, commit, and push our changes.
```
$ git status
On branch gec001-init
Changes to be committed:
  (use "git reset HEAD <file>..." to unstage)

    new file:   .readthedocs.yml
    new file:   .github/workflows/python-build.yml
    new file:   Makefile
    modified:   README.md
    new file:   docs/Makefile
    new file:   docs/_build/.gitignore
    new file:   docs/_static/.gitignore
    new file:   docs/_templates/.gitignore
    new file:   docs/conf.py
    new file:   docs/index.rst
    new file:   pyproject.toml
    new file:   example_python_project/__init__.py
    new file:   example_python_project/about.py
    new file:   example_python_project/config.py
    new file:   example_python_project/config_test.py
    new file:   setup.cfg
    new file:   setup.py

$ git commit -m "Minimum viable package"
...
$ git push --set-upstream origin gec001-init
...
```
If all goes well Github will see our push, and build and test the code in the branch. The tests probably won't all pass on the first try. It's easy to forget something (which is why we have automatic tests). So tweak the code, and push more commits until the tests pass.

## Git pre-commit

Another handy trick is to add a [pre-commit](https://ljvmiranda921.github.io/notebook/2018/06/21/precommits-using-black-and-flake8/) hook to git, so that some tests are run before code can be committed.
A basic example hook that runs black before each commit is located in `.pre-commit-config.yaml`. The make target `init` (`make init`)
will install the pre-commit hook.

## Editorconfig

[EditorConfig](https://editorconfig.org/) is a handy way of specifying code formatting conventions, such as indent levels and line endings.
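As a sketch (the exact settings are a matter of taste), a minimal `.editorconfig` for a Python project might look like:

```
# top-most EditorConfig file
root = true

[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true

[*.py]
indent_style = space
indent_size = 4
```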
The `.editorconfig` file lives in the root of the repository, and is understood by many popular IDEs and text editors.


## PyPI

We should now be ready to do a test submission to PyPI, the Python Package Index.
Follow the directions laid out in the [python packaging](https://packaging.python.org/tutorials/packaging-projects/) documentation.

```
$ pip install -q build twine
...
$ git tag v0.1.0rc1
$ python -m build
...
```
We tag our release candidate so that we get a clean version number (PyPI will object to the development version numbers setuptools_scm generates if the tag or git repo isn't up to date).

First we push to PyPI's test repository, TestPyPI.
```
(GTP) $ python -m twine upload --repository testpypi dist/*
```
You'll need to create a TestPyPI account if you don't already have one.

Let's make sure it worked by installing from TestPyPI into a fresh conda environment.
```
(GTP) $ conda deactivate
$ conda create --name tmp
$ conda activate tmp
(tmp) $ pip install --index-url https://test.pypi.org/simple/ --no-deps modern-python-template
(tmp) $ python -m example_python_project.about
(tmp) $ conda activate GTP
```


## Merge and Tag

Over on Github we create a pull request, wait for the Github action checks to give us the green light once all the tests have passed, and then squash and merge.

The full developer sequence goes something like this:

1.) Sync the master branch.
```
$ git checkout master
$ git pull origin master
```
(If we're working on somebody else's project, this step is a little more complicated. We fork the project on Github, clone our fork to the local machine, and then set git's 'upstream' to be the original repo. We then sync our local master branch with the upstream master branch:
```
$ git checkout master
$ git fetch upstream
$ git merge upstream/master
```
This should go smoothly as long as you never commit directly to your local master branch.)


2.) Create a working branch.
```
$ git branch BRANCH
$ git checkout BRANCH
```

3.) Do a bunch of development on the branch, committing incremental changes as we go along.

4.) Sync the master branch with Github (since other development may be ongoing), i.e. repeat step 1.

5.) Rebase our branch onto master.
```
$ git checkout BRANCH
$ git rebase master
```
If there are conflicts, resolve them, and then go back to step 4.

6.) Sync our branch to Github.

```
$ git push
```

7.) Over on Github, create a pull request to merge into the master branch.

8.) Wait for the integration tests to pass. If they don't, fix them, and then go back to step 4.

9.) Squash and merge into the master branch on Github. Squashing merges all of our commits on the branch into a single commit to merge into the master branch. We generally don't want to pollute the master repo history with lots of micro commits. (On multi-developer projects, code should be reviewed: somebody other than the branch author approves the changes before the final merge into master.)

10.) Go to step 1. Back on our local machine, we resync master, create a new branch, and continue developing.


## Tag and release

Assuming everything went well, you can now upload a release to PyPI proper. We can add a [github workflow](.github/workflows/python-publish.yml) to automatically upload new releases tagged on Github. The only additional configuration is to add `PYPI_USERNAME` and `PYPI_PASSWORD` to Github as secrets (under your repo settings). (PyPI now requires API tokens for uploads: set the username to `__token__` and use the token itself as the password.)

## Extras: requirements.txt

The `setup.cfg` file specifies the minimum versions of dependencies.
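For illustration (the package names here are hypothetical, not this project's actual dependencies), the relevant `setup.cfg` section might look like:

```
[options]
install_requires =
    numpy >= 1.20
    requests >= 2.25
```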
But for testing and deployment it can be useful to pin exact versions.

    > pip freeze > requirements.txt

And to install these exact versions:

    > pip install -r requirements.txt

If a `requirements.txt` file exists then those versions are installed by the Github workflows and the `make init` command.


## Extras: MANIFEST.in

You don't need a [`MANIFEST.in` file](https://www.remarkablyrestrained.com/python-setuptools-manifest-in/).

Historically, this file was used to specify which additional files (typically data files) should be included in a packaged distribution.
But `setuptools_scm` takes care of that for us (in most cases), by default including all files under source control.


## Cookiecutter

Having sorted out our basic module configuration and layout, the next trick is to turn the package into a
[cookiecutter](https://cookiecutter.readthedocs.io/) project template. That way we can create a new project in
just a few moments.

    pip install -U cookiecutter
    cookiecutter https://github.com/gecrooks/modern-python-template.git

Answer the questions, create a new empty repo on Github with the same name, push, and you should be good to go.

    cd example_python_project
    git remote add origin https://github.com/somebody/example_python_project.git
    git push -u origin master


The basic idea is to replace customizable text with template strings, e.g. `{{cookiecutter.author_email}}`.
Defaults for these templates are stored in `cookiecutter.json`. In particular, example_python_project is moved to a directory called
`{{cookiecutter.module_name}}`, and the module code is moved to
`{{cookiecutter.module_name}}/{{cookiecutter.module_name}}`.
I'm more or less following [cookiecutter-pypackage](https://github.com/audreyfeldroy/cookiecutter-pypackage).

One tricky bit is that some of the Github configuration files already contain similar template strings. So we have to
wrap those strings in special raw tags.

    {% raw %} some stuff with {templates} {% endraw %}

I also added some pre- and post-templating hooks (in the `hooks` subdirectory). These initialize and tag a git repo in the created module, and pip install the package.


## Conclusion

By my count our minimal project has 13 configuration files (in Python, TOML, YAML, INI, gitignore, Makefile, and plain text formats), 2 documentation files, one file of unit tests, and 3 files of code (containing 31 lines of code).

We're now ready to create a new git branch and start coding in earnest.


## Further reading

* [Boring Python: dependency management, by James Bennett](https://www.b-list.org/weblog/2022/may/13/boring-python-dependencies/)
* [Boring Python: code quality, by James Bennett](https://www.b-list.org/weblog/2022/dec/19/boring-python-code-quality/)