Python Lint And Format#
Azure SDK Python Guidelines#
https://azure.github.io/azure-sdk/python_implementation.html
Lint#
- Update 2023-05-21: Replaced flake8, pylint, black, and isort with ruff. When replacing pylint, a type check with mypy should be added.
- Update 2023-11-07: Bandit can be replaced by ruff too, thanks to its support of the flake8-bandit rules.
Note
The only thing ruff can't do at the moment is type checking.
ruff#
Fast, just blazing fast.
# lint
ruff check
# format
ruff format
# show ignored ruff alerts
ruff check --ignore-noqa --exit-zero
Try also uv pip install
For pip install users, try uv pip install, from the same author as ruff.
A highly opinionated and subjective take, but we finally have a real competitor to (or even a replacement for) pip install. If you have a private PyPI index, all you need to set up is export UV_DEFAULT_INDEX=$PIP_INDEX_URL; otherwise, just add uv before the pip install command.
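A quick sketch (UV_DEFAULT_INDEX is only needed with a private index; the requirements file name is just an example):

```bash
pip install uv

# private PyPI index: let uv reuse the existing pip index URL
export UV_DEFAULT_INDEX=$PIP_INDEX_URL

# then simply prefix the usual command with "uv"
uv pip install -r requirements.txt
```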
pylint#
Warning
Replaced by ruff.
As pylint has too many options, it's recommended to use the pylint config file:
# file ~/.pylintrc, can be generated by pylint --generate-rcfile
[MASTER]
[MESSAGES CONTROL]
disable=
C0116, # Missing function or method docstring (missing-function-docstring)
W1203, # Use lazy % formatting in logging functions (logging-fstring-interpolation)
[FORMAT]
max-line-length = 88
[MISCELLANEOUS]
# List of note tags to take in consideration, separated by a comma.
notes=FIXME
[VARIABLES]
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=
spark
But we can also ignore some warnings directly on the pylint command line.
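A minimal sketch, reusing the two checks disabled in the config above (the package name is hypothetical):

```bash
# same effect as the disable= entry in ~/.pylintrc, but one-off on the command line
pylint --disable=C0116,W1203 my_package
```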
To show all the inline ignored pylint alerts: pylint --enable=suppressed-message
Ignore Unused Argument given a Function Name Expression#
Use a dummy variable name (typically a leading underscore) to silence the pylint unused-argument warning.
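A minimal sketch, assuming pylint's default ignored-argument-names pattern, which ignores argument names starting with an underscore (the function names are hypothetical):

```python
# pylint reports W0613 (unused-argument) for "context" here:
def handler(event, context):
    return event["value"]


# a leading underscore marks the parameter as a dummy, so the warning disappears:
def handler_quiet(event, _context):
    return event["value"]
```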
flake8#
Warning
Replaced by ruff.
# ignore W503 because of the black format. BTW, flake8 also has W504, which is contrary to W503.
# ignore E501 (line too long) because we already have the same check on the pylint side.
flake8 . \
--exclude=venv \
--extend-ignore=E203,E501,W503 \
--max-complexity=7 \
--show-source \
--statistics \
--count \
--jobs=auto
flake8 [a_file_path]
To show all the inline ignored flake8 alerts: flake8 --disable-noqa || true
There's a very nice flake8 plugin called flake8-cognitive-complexity, which checks cognitive complexity in addition to the cyclomatic complexity provided by flake8 out of the box. We don't need to add any extra parameter to use cognitive complexity in flake8; it's set to --max-cognitive-complexity=7 by default once the plugin is installed. By the way, Sonar sets its cognitive complexity threshold to 15 by default.
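A minimal usage sketch; once the plugin is installed there is nothing else to configure:

```bash
pip install flake8 flake8-cognitive-complexity

# the plugin registers itself; --max-cognitive-complexity defaults to 7
flake8 .

# or raise the threshold to align with Sonar's default of 15
flake8 . --max-cognitive-complexity=15
```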
To fix the imported but not used error in an __init__.py file, use the __all__ attribute (the most elegant way) or --per-file-ignores.
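A minimal sketch of the __all__ approach (package and module names are hypothetical):

```python
# mypackage/__init__.py
# re-export the public API: listing the names in __all__ tells the linter that
# these imports are intentional, so F401 (imported but unused) goes away
from mypackage.client import Client
from mypackage.errors import ClientError

__all__ = ["Client", "ClientError"]
```

The other option is flake8's per-file-ignores, e.g. per-file-ignores = mypackage/__init__.py:F401 in the flake8 config.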
bandit#
Warning
Replaced by ruff.
The bandit config file format is not well documented; I spent a lot of time testing the config.
$ cat .bandit
# https://github.com/PyCQA/bandit/issues/400
exclude_dirs:
- "./venv/*"
# https://github.com/PyCQA/bandit/pull/633
assert_used:
skips:
- "*/*_test.py"
- "*/test_*.py"
ossaudit#
ossaudit uses Sonatype OSS Index to audit Python packages for known vulnerabilities.
It can check installed packages and/or packages specified in dependency files. The following formats are supported with dparse:
- PIP requirement files
- Pipfile
- Pipfile.lock
- tox.ini
- conda.yml
# check installed packages and packages listed in two requirements files
$ ossaudit --installed --file requirements.txt --file requirements-dev.txt
Found 0 vulnerabilities in 214 packages
GitHub already provides vulnerable dependency alerts free of charge.
pyright#
Faster than mypy.
pyproject.toml:
```toml
[tool.pyright]
reportUnnecessaryTypeIgnoreComment = true
include = []
exclude = []
```
Running pyright locally, in pre-commit, in GitHub Actions, or in other CI solutions is covered in the official doc: https://github.com/microsoft/pyright/blob/main/docs/ci-integration.md (a pre-commit example is also shown later in this post).
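A minimal local-run sketch, assuming the pyright pip package (a thin wrapper that downloads the Node-based binary):

```bash
pip install pyright

# picks up the [tool.pyright] section from pyproject.toml in the current directory
pyright

# machine-readable output, handy in CI
pyright --outputjson > pyright-report.json
```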
mypy#
For projects using SQLAlchemy, we often install the sqlalchemy-stubs plugin, as SQLAlchemy uses some dynamic classes. The same goes for django-stubs, pandas-stubs, types-setuptools, types-requests, etc.
[mypy]
ignore_missing_imports = True # We recommend using this approach only as a last resort: it's equivalent to adding a # type: ignore to all unresolved imports in your codebase.
plugins = sqlmypy # sqlalchemy-stubs
exclude = (?x)(
^venv
| ^build
)
Running mypy:
mypy .
mypy . --exclude [a regular expression that matches file paths]
mypy . --exclude '^venv/' # exclude the venv folder at the project root
Warning
When using mypy, it is better to run it against all files in the project, not just a subset of them.
ignore lint error in one line#
Warning
All of the linters above can be replaced by ruff; only type checkers like mypy and pyright are not covered by ruff.
| linter | ignore in one line (prefix the comment with 2 spaces) |
|---|---|
| ruff | `# noqa: {errorIdentifier}` |
| pylint | `# pylint: disable={errorIdentifier}` |
| flake8 | `# noqa: {errorIdentifier}` |
| bandit | `# nosec` |
| pyright | `# pyright: ignore [reportOptionalMemberAccess, reportGeneralTypeIssues]` |
| mypy | `# type: ignore` |
| multiple linters | `# type: ignore  # noqa: {errorIdentifier}  # pylint: disable={errorIdentifier}` |
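For example, the multiple linters row looks like this on a real line (the imported module name is hypothetical):

```python
import some_untyped_lib  # type: ignore  # noqa: F401  # pylint: disable=unused-import
```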
To ignore Pylint within a code block:
# https://stackoverflow.com/a/48836605/5095636
import sys
sys.path.append("xx/xx")
# pylint: disable=wrong-import-position
from x import (  # noqa: E402
    a,
    b,
)
from y import c  # noqa: E402
# pylint: enable=wrong-import-position
Format#
isort#
Warning
Replaced by ruff.
isort . --profile=black --virtual-env=venv --recursive --check-only
isort . --profile=black --virtual-env=venv --recursive
isort [a_file_path]
Warning
Be very careful with isort: it is not always harmless, especially for code that dynamically imports modules inside a function instead of at the top of the file, a pattern often used to avoid circular imports. Always run the tests after running isort.
black#
Warning
Replaced by ruff.
Using black with other tools: https://black.readthedocs.io/en/stable/guides/using_black_with_other_tools.html
Coverage#
For open source projects, Codecov and SonarQube are both free, and both provide GitHub Actions. SonarQube is also a powerful static analysis tool.
SonarQube#
- CI-based config: https://docs.sonarsource.com/sonarcloud/advanced-setup/ci-based-analysis/sonarscanner-cli/
- Automatic analysis (triggered on each push): https://docs.sonarsource.com/sonarcloud/advanced-setup/automatic-analysis
VSCode#
Just my 2 cents: try the Error Lens extension in VSCode; it shows all warnings/errors live while coding, which is really cool.
And don't forget to install the official SonarLint extension for extra linting, even though it eats a lot of memory with its Java processes. Hereunder a VSCode settings.json snippet:
"python.formatting.provider": "none",
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter",
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
// "source.organizeImports": true
},
},
"python.linting.banditEnabled": true,
"python.linting.banditArgs": [
"-r",
"-c",
"~/pyproject.toml"
],
"python.linting.ignorePatterns": [
".vscode/*.py",
"**/site-packages/**/*.py",
"venv/"
],
"python.linting.mypyEnabled": true,
"python.linting.mypyArgs": [
"--follow-imports=silent",
"--ignore-missing-imports",
"--show-column-numbers",
"--no-pretty",
"--warn-return-any",
"--warn-unused-configs",
"--show-error-codes"
],
"sonarlint.connectedMode.connections.sonarqube": [
{
"serverUrl": "https://sonar.xxx",
"connectionId": "sonar.xxx"
}
],
"[json]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
// "editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[jsonc]": {
"editor.defaultFormatter": "vscode.json-language-features"
},
pyproject.toml#
pyproject.toml is the new standard in Python, introduced by PEP 518 (2016) for build system requirements, PEP 621 (2020) for project metadata, and PEP 660 (2021) for wheel-based editable installs.
It's fun to know why the Python authors chose this name, and very interesting to understand their point of view on the different file formats :smile:.
All the major tools (setuptools, pip-tools, poetry) support this new standard, and the repo awesome-pyproject maintains a list of Python tools compatible with pyproject.toml.
Warning
We cannot officially declare flake8 config in pyproject.toml.
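If flake8 is still in use, its config has to live in an INI-style file instead; a minimal sketch reusing the flags from the flake8 command shown earlier:

```ini
# .flake8 (or a [flake8] section in setup.cfg / tox.ini)
[flake8]
exclude = venv
extend-ignore = E203,E501,W503
max-complexity = 7
```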
Hereunder an example of the pyproject.toml file in my fastapi-demo repo.
[tool.ruff]
fix = true
show-fixes = true
lint.select = ["ALL"]
lint.ignore = [
# https://beta.ruff.rs/docs/rules/
"D", # pydocstyle
"E501", # line too long, handled by black
"B008", # do not perform function calls in argument defaults
"ANN", # flake8-annotations
"PTH123", # pathlib-open - this would force pathlib usage anytime open or with open was used.
"FA102", # Missing `from __future__ import annotations`, but uses PEP 604 union.
"ERA001", # Found commented-out code
"TD", # flake8-todo
"FIX002", # Line contains TODO, consider resolving the issue
"COM812", # missing-trailing-comma
"ISC001", # single-line-implicit-string-concatenation
]
[tool.ruff.lint.per-file-ignores]
"tests/**/*.py" = [
# at least these three should be fine in tests:
"S101", # asserts allowed in tests...
# "ARG", # Unused function args -> fixtures nevertheless are functionally relevant...
# "FBT", # Don't care about booleans as positional arguments in tests, e.g. via @pytest.mark.parametrize()
]
"tools/**/*.py" = ["ALL"]
"migrations/**/*.py" = ["ALL"]
"_local_test/**/*.py" = ["ALL"]
[tool.ruff.lint.isort]
combine-as-imports = true
force-wrap-aliases = true
[tool.ruff.lint.pylint]
max-args = 8
[tool.ruff.lint.pep8-naming]
classmethod-decorators = ["pydantic.validator"]
[tool.pyright]
reportUnnecessaryTypeIgnoreComment = true
exclude = ["_local_test", "tools", "migrations", ".venv"]
# include = ["app"]
venvPath = "."
venv = ".venv"
[tool.pytest.ini_options]
# testpaths = ["tests/unit"] # no unit test yet
testpaths = ["tests/integration"]
# https://pytest-asyncio.readthedocs.io/en/latest/concepts.html#auto-mode
asyncio_mode = "auto"
addopts = """
-v -s
--junitxml=junit.xml
--cov app
--cov-report=html
--cov-report=xml
--cov-report=term-missing:skip-covered
--cov-fail-under=70
"""
# env is enabled by pytest-env
env = ["TESTING=yes"]
[tool.mypy]
plugins = ["pydantic.mypy"]
exclude = ["^.venv/", "^build/", "^_local_test/"]
follow_imports = "silent"
warn_redundant_casts = true
warn_unused_ignores = true
disallow_any_generics = true
check_untyped_defs = true
no_implicit_reexport = true
# for strict mypy: (this is the tricky one :-))
disallow_untyped_defs = false
[tool.pydantic-mypy]
# https://docs.pydantic.dev/2.6/integrations/mypy/
init_forbid_extra = true
init_typed = true
warn_required_dynamic_aliases = true
[project]
name = "fastapi-demo"
dynamic = ["version", "dependencies", "optional-dependencies"]
authors = [{ name = "Xiang ZHU", email = "xiang.zhu@outlook.com" }]
description = "fastapi-demo"
readme = "README.md"
requires-python = ">=3.11,<3.13"
classifiers = [
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3.11",
"Topic :: Software Development :: Libraries :: Python Modules",
]
[project.urls]
repository = "https://github.com/copdips/fastapi-demo"
documentation = "https://github.com/copdips/fastapi-demo"
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
[tool.setuptools.dynamic]
version = { file = ["VERSION"] }
[tool.setuptools.dynamic.dependencies]
file = ["requirements/base.txt"]
[tool.setuptools.dynamic.optional-dependencies]
dev = { file = ["requirements/dev.txt"] }
docs = { file = ["requirements/docs.txt"] }
[tool.coverage.run]
relative_files = true
...
[tool.mypy]
incremental = true
ignore_missing_imports = true
warn_return_any = true
warn_unused_configs = true
# disallow_untyped_defs = true
exclude = [
"^.venv/",
"^build/",
"^_local_test/",
]
...
Git pre-commit#
"Git hook scripts are useful for identifying simple issues before submission to code review. We run our hooks on every commit to automatically point out issues in code such as missing semicolons, trailing whitespace, and debug statements. By pointing these issues out before code review, this allows a code reviewer to focus on the architecture of a change while not wasting time with trivial style nitpicks."
pip install pre-commit
pre-commit install
## install the script along with the hook environments in one command
## https://pre-commit.com/index.html#pre-commit-install-hooks
pre-commit install --install-hooks
## Auto-update pre-commit config to the latest repos' versions.
pre-commit autoupdate
## Clean out cached pre-commit files.
pre-commit clean
## Clean unused cached repos.
pre-commit gc
## Run single check
pre-commit run black
## continuous integration
## https://pre-commit.com/index.html#usage-in-continuous-integration
pre-commit run --all-files
## check only files which have changed
pre-commit run --from-ref origin/HEAD --to-ref HEAD
## Azure pipeline example with cache:
## https://pre-commit.com/index.html#azure-pipelines-example
## automatically enabling pre-commit on repositories
## https://pre-commit.com/index.html#automatically-enabling-pre-commit-on-repositories
git config --global init.templateDir ~/.git-template
pre-commit init-templatedir ~/.git-template
Online examples#
pylint github pre-commit-config.yaml
Create a file named .pre-commit-config.yaml at the root of your project#
Note
Although each linter has its own config to exclude files from checking, pre-commit also has an exclude key (a single regex) to prevent files from being sent to the linter at all.
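A minimal sketch (the excluded paths are only an illustration):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0
    hooks:
      - id: trailing-whitespace
        # pre-commit's own exclude is a single Python regex applied to the file
        # paths before they are passed to the hook
        exclude: ^(migrations/|_local_test/)
```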
Note
language: system means using the executables from the same environment as the current Python interpreter.
Warning
When using mypy in pre-commit, it is better to run pre-commit run --all-files; mypy doesn't work well with only the diff files sent by pre-commit run --from-ref origin/${pullrequest_target_branch_name} --to-ref HEAD.
Hereunder an example of the .pre-commit-config.yaml file in my fastapi-demo repo.
## Installation:
# pip install pre-commit
# pre-commit install
# pre-commit autoupdate
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: check-json
exclude: devcontainer.json
- id: check-yaml
- id: check-toml
- id: end-of-file-fixer
- id: trailing-whitespace
- id: debug-statements
- id: requirements-txt-fixer
- id: detect-private-key
- id: mixed-line-ending
args: ["--fix=lf"]
- id: check-added-large-files
# - id: no-commit-to-branch
- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.5.5
hooks:
- id: forbid-crlf
- id: remove-crlf
- id: forbid-tabs
- id: remove-tabs
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v4.0.0-alpha.8
hooks:
- id: prettier
exclude: ".md"
- repo: https://github.com/pre-commit/pygrep-hooks
rev: v1.10.0
hooks:
- id: python-check-blanket-type-ignore
- id: python-check-mock-methods
- id: python-no-log-warn
- id: python-use-type-annotations
- repo: https://github.com/RobertCraigie/pyright-python
rev: v1.1.388
hooks:
- id: pyright
- repo: local
hooks:
- id: ruff
name: ruff
entry: ruff check --force-exclude
language: system
types: [python, pyi]
args: []
require_serial: true
- id: ruff-format
name: ruff-format
entry: ruff format --force-exclude
language: system
types: [python, pyi]
args: []
require_serial: true
- id: pytest
name: pytest
# types: [python]
entry: pytest
language: system
pass_filenames: false
always_run: true
And the mypy hook, shown as a separate local-hook snippet:
repos:
- repo: local
hooks:
- id: mypy
name: mypy
language: system
entry: mypy
types: [python]
args:
# - --strict
- --show-error-codes
Install the git hook scripts#
$ pre-commit install
pre-commit installed at .git/hooks/pre-commit
$ pre-commit install --hook-type post-merge
pre-commit installed at .git/hooks/post-merge
$ pre-commit install --hook-type pre-merge-commit
pre-commit installed at .git/hooks/pre-merge-commit
Note
You could also run pre-commit install --hook-type pre-push to register pre-push hooks.
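A sketch of a hook restricted to push time, reusing the local pytest hook from the example above (requires a recent pre-commit version for the pre-push stage name):

```yaml
- repo: local
  hooks:
    - id: pytest
      name: pytest
      entry: pytest
      language: system
      pass_filenames: false
      always_run: true
      stages: [pre-push] # only runs on git push, not on every commit
```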
Run against all the files#
"it's usually a good idea to run the hooks against all of the files when adding new hooks (usually pre-commit will only run on the changed files during git hooks)"
Run for changed files only in CI#
Please check also this official doc.
Warning
When using mypy, it is better to run it against all files in the project, not only the changed ones.
Git commit#
Each time we run git commit, the staged files are sent to pre-commit to be checked against the hooks defined in .pre-commit-config.yaml.
Temporarily disabling hooks#
The official doc gives an example of how to explicitly disable hooks by their ids: SKIP=flake8 git commit -m "foo". But if you want to completely disable all the hooks, an easy way (found here) is to use git commit --no-verify or its shortcut git commit -n. If you use pre-commit during push, you can disable it with git push --no-verify (note that for git push, -n is the shortcut of --dry-run, not of --no-verify).
Automatically enabling pre-commit on repositories#
https://pre-commit.com/#automatically-enabling-pre-commit-on-repositories