Switch to ruff for formatting #415

Draft · wants to merge 7 commits into base: main
8 changes: 8 additions & 0 deletions .codespellrc
@@ -0,0 +1,8 @@
[codespell]
skip = .git,*.pdf,*.svg,*.html,dataset_description.json,*.gii,*.json
# te - TE
# Weill - name
# reson - Reson. abbreviation in citation
# DNE: acronym for does not exist
# fo: acronym for file object
ignore-words-list = te,weill,reson,DNE,fo
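codespell can also read these same settings from `pyproject.toml` (the diff adds `tomli` to the test dependencies and uses `codespell[toml]` in tox, which enables TOML config support); an equivalent stanza, shown here only as a hypothetical alternative to `.codespellrc`:

```toml
[tool.codespell]
skip = '.git,*.pdf,*.svg,*.html,dataset_description.json,*.gii,*.json'
ignore-words-list = 'te,weill,reson,DNE,fo'
```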
52 changes: 32 additions & 20 deletions .github/workflows/lint.yml
@@ -1,26 +1,38 @@
name: Linters
name: Contribution checks

on:
- push
- pull_request
push:
branches:
- main
pull_request:
branches:
- main

defaults:
run:
shell: bash

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true

permissions:
contents: read

jobs:
lint:
style:
runs-on: ubuntu-latest
steps:
- name: Set up environment
uses: actions/checkout@v4
with: # no need for the history
fetch-depth: 1
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.9'
- name: Install dependencies
run: |
pip install flake8 flake8-absolute-import flake8-black flake8-docstrings \
flake8-isort flake8-pyproject flake8-unused-arguments \
flake8-use-fstring pep8-naming \
codespell tomli
- name: Run linters
run: python -m flake8 cubids
- uses: actions/checkout@v4
- run: pipx run ruff check .
- run: pipx run ruff format --diff .

codespell:
name: Check for spelling errors
runs-on: ubuntu-latest

steps:
- name: Checkout
uses: actions/checkout@v4
- name: Codespell
uses: codespell-project/actions-codespell@v2
6 changes: 3 additions & 3 deletions cubids/cubids.py
@@ -709,11 +709,11 @@ def copy_exemplars(self, exemplars_dir, exemplars_tsv, min_group_size):
----------
exemplars_dir : :obj:`str`
path to the directory that will contain one subject
from each Acqusition Group (*_AcqGrouping.tsv)
from each Acquisition Group (*_AcqGrouping.tsv)
example path: /Users/Covitz/tsvs/CCNP_Acq_Groups/
exemplars_tsv : :obj:`str`
path to the .tsv file that lists one subject
from each Acqusition Group (*_AcqGrouping.tsv
from each Acquisition Group (*_AcqGrouping.tsv
from the cubids-group output)
example path: /Users/Covitz/tsvs/CCNP_Acq_Grouping.tsv
min_group_size : :obj:`int`
@@ -923,7 +923,7 @@ def get_nifti_associations(self, nifti):
A list of paths to files associated with the given NIfTI file, excluding
the NIfTI file itself.
"""
# get all assocation files of a nifti image
# get all association files of a nifti image
no_ext_file = str(nifti).split("/")[-1].split(".")[0]
associations = []
for path in Path(self.path).rglob(f"sub-*/**/{no_ext_file}.*"):
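The comment fix in `get_nifti_associations` touches code that globs for every file sharing a NIfTI file's stem. A minimal standalone sketch of that pattern, using a hypothetical `get_associations` helper rather than the CuBIDS method itself:

```python
import tempfile
from pathlib import Path


def get_associations(bids_root, nifti_name):
    """Collect files sharing a NIfTI file's stem, excluding the NIfTI itself."""
    # Mirror the pattern in the diff: drop directories, then split on the
    # first "." so compound extensions like ".nii.gz" are removed too.
    no_ext = nifti_name.split("/")[-1].split(".")[0]
    return sorted(
        str(p)
        for p in Path(bids_root).rglob(f"sub-*/**/{no_ext}.*")
        if not p.name.endswith((".nii", ".nii.gz"))
    )


# Exercise the helper against a throwaway BIDS-like tree.
with tempfile.TemporaryDirectory() as root:
    func = Path(root) / "sub-01" / "func"
    func.mkdir(parents=True)
    for name in ("sub-01_task-rest_bold.nii.gz", "sub-01_task-rest_bold.json"):
        (func / name).touch()
    assocs = get_associations(root, "sub-01_task-rest_bold.nii.gz")
```

Splitting on the first `.` is what lets a single glob pick up the `.json` sidecar next to a `.nii.gz` image; the filter then drops the image itself, matching the docstring's "excluding the NIfTI file itself".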
2 changes: 1 addition & 1 deletion cubids/workflows.py
@@ -396,7 +396,7 @@ def copy_exemplars(
if use_datalad:
if not bod.is_datalad_clean():
raise Exception(
"Untracked changes. Need to save " + str(bids_dir) + " before coyping exemplars"
"Untracked changes. Need to save " + str(bids_dir) + " before copying exemplars"
)
bod.copy_exemplars(
str(exemplars_dir),
8 changes: 4 additions & 4 deletions docs/example.rst
@@ -185,8 +185,8 @@ BIDS validation
but can be slowed down by extremely large datasets.

.. warning::
For internetless use cases, please see dedicated section of the `Installation page
<https://cubids.readthedocs.io/en/latest/installation.html>`_ on how to download a local version
For internetless use cases, please see dedicated section of the `Installation page
<https://cubids.readthedocs.io/en/latest/installation.html>`_ on how to download a local version
of the validator.

After that, you will need to add ``--local-validator`` option to the command string above.
@@ -216,7 +216,7 @@ To do this, we run the ``cubids purge`` command.
``cubids purge`` requires as input a list of files to cleanly "purge" from the dataset.
You can create this file in any text editor,
as long as it is saved as plain text ``.txt``.
When specifying files in this text file,
When specifying files in this text file,
always use relative paths starting from your BIDS directory.
For this example, we created the following file:

@@ -321,7 +321,7 @@ based on acquisition fields such as dimension and voxel sizes, number of volumes

While ``v0_validation.tsv`` identified all the BIDS validation errors present in the dataset,
it did not identify any potential issues that might be present within the sidecars' metadata.
Below, we see insances of missing metadata fields in a handful of sidecars,
Below, we see instances of missing metadata fields in a handful of sidecars,
which may impact successful execution of BIDS Apps.

.. csv-table:: v0_summary.tsv
94 changes: 45 additions & 49 deletions pyproject.toml
@@ -56,19 +56,10 @@ doc = [
tests = [
"codespell",
"coverage",
"flake8",
"flake8-absolute-import",
"flake8-black",
"flake8-docstrings",
"flake8-isort",
"flake8-pyproject",
"flake8-unused-arguments",
"flake8-use-fstring",
"flake8-warnings",
"niworkflows",
"pep8-naming",
"pytest",
"pytest-cov",
"ruff ~= 0.9.2",
"tomli",
]
maint = [
@@ -122,50 +113,55 @@ version-file = "cubids/_version.py"
# Developer tool configurations
#

# Disable black
[tool.black]
exclude = ".*"

[tool.ruff]
line-length = 99
target-version = ['py38']
include = '\.pyi?$'
exclude = '''
(
/(
\.eggs # exclude a few common directories in the
| \.git # root of the project
| \.github
| \.hg
| \.pytest_cache
| _build
| build
| dist
)/
| versioneer.py
| cubids/_version.py
)
'''

[tool.isort]
profile = "black"
multi_line_output = 3
src_paths = ["isort", "test"]
known_local_folder = ["cubids"]

[tool.flake8]
max-line-length = "99"
doctests = "False"
exclude = [
"*build/",
"cubids/_version.py",
"cubids/_warnings.py",
"cubids/config.py",
"cubids/data/",
"cubids/tests/",

[tool.ruff.lint]
extend-select = [
"F",
"E",
"W",
"I",
"UP",
"YTT",
"S",
"BLE",
"B",
"A",
# "CPY",
"C4",
"DTZ",
"T10",
# "EM",
"EXE",
"FA",
"ISC",
"ICN",
"PT",
"Q",
]
ignore = ["D107", "E203", "E402", "E722", "W503", "N803", "N806", "N815"]
per-file-ignores = [
"**/__init__.py : F401",
"docs/conf.py : E265",
ignore = [
"S105",
"S311", # We are not using random for cryptographic purposes
"ISC001",
"S603",
]

[tool.ruff.lint.flake8-quotes]
inline-quotes = "single"

[tool.ruff.lint.extend-per-file-ignores]
"*/test_*.py" = ["S101"]
"docs/conf.py" = ["A001"]
"docs/sphinxext/github_link.py" = ["BLE001"]

[tool.ruff.format]
quote-style = "single"
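The `quote-style = "single"` setting tells `ruff format` to normalize string literals to single quotes. The rewrite is purely cosmetic: quote style never changes a string's runtime value, and ruff leaves double quotes in place where switching would force an escape. A quick illustration:

```python
# Only the source notation differs between these two literals; the runtime
# values are equal, which is why a formatter can rewrite quotes freely.
double = "cubids"
single = 'cubids'
assert double == single

# A single-quote-preferring formatter would leave this literal alone,
# since 'it\'s' needs an escape that "it's" avoids.
keeps_double = "it's"
```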

[tool.pytest.ini_options]
addopts = ["--doctest-modules"]
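The `--doctest-modules` flag makes pytest collect `>>>` examples from module and function docstrings and run them as tests alongside the regular suite. A minimal sketch of the kind of docstring it exercises (hypothetical helper, not a CuBIDS function):

```python
def drop_extensions(filename):
    """Strip every extension from a BIDS-style filename.

    With ``--doctest-modules``, pytest executes the example below and
    compares its printed output against the expected line.

    >>> drop_extensions("sub-01_task-rest_bold.nii.gz")
    'sub-01_task-rest_bold'
    """
    # Split on the first "." so compound extensions like .nii.gz go too.
    return filename.split(".", 1)[0]
```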

102 changes: 102 additions & 0 deletions tox.ini
@@ -0,0 +1,102 @@
[tox]
requires =
tox>=4
envlist =
py3{10,11,12}-latest
py310-min
py3{10,11,12}-pre
skip_missing_interpreters = true

# Configuration that allows us to split tests across GitHub runners effectively
[gh-actions]
python =
3.10: py310
3.11: py311
3.12: py312

[gh-actions:env]
DEPENDS =
min: min
latest: latest
pre: pre

[testenv]
description = Pytest with coverage
labels = test
pip_pre =
pre: true
pass_env =
# getpass.getuser() sources for Windows:
LOGNAME
USER
LNAME
USERNAME
# Pass user color preferences through
PY_COLORS
FORCE_COLOR
NO_COLOR
CLICOLOR
CLICOLOR_FORCE
PYTHON_GIL
extras = test
setenv =
pre: PIP_EXTRA_INDEX_URL=https://pypi.anaconda.org/scientific-python-nightly-wheels/simple

commands =
pytest --cov-report term-missing --durations=20 --durations-min=1.0 {posargs:-n auto}

[testenv:style]
description = Check our style guide
labels = check
deps =
ruff
skip_install = true
commands =
ruff check --diff
ruff format --diff

[testenv:style-fix]
description = Auto-apply style guide to the extent possible
labels = pre-release
deps =
ruff
skip_install = true
commands =
ruff check --fix
ruff format
ruff check --select ISC001

[testenv:spellcheck]
description = Check spelling
labels = check
deps =
codespell[toml]
skip_install = true
commands =
codespell . {posargs}

[testenv:build{,-strict}]
labels =
check
pre-release
deps =
build
twine
skip_install = true
set_env =
# Ignore specific known warnings:
# https://github.com/pypa/pip/issues/11684
# https://github.com/pypa/pip/issues/12243
strict: PYTHONWARNINGS=error,once:pkg_resources is deprecated as an API.:DeprecationWarning:pip._internal.metadata.importlib._envs,once:Unimplemented abstract methods {'locate_file'}:DeprecationWarning:pip._internal.metadata.importlib._dists
commands =
python -m build
python -m twine check dist/*

[testenv:publish]
depends = build
labels = release
deps =
twine
skip_install = true
commands =
python -m twine upload dist/*