# merge devel to master to release v0.2.22 #786

## Conversation
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.6.5 → v0.6.7](astral-sh/ruff-pre-commit@v0.6.5...v0.6.7) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
…g#737) <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Introduced a mandatory parameter for pseudopotential files and an optional parameter to specify the destination directory for symbolic links. - **Improvements** - Enhanced clarity and functionality in handling input parameters for pseudopotential and orbital files. - Streamlined output construction for `pp_file` and `numerical_orbital`. - **Bug Fixes** - Added tests to validate error handling for mismatches between provided pseudopotential files and numerical orbitals, ensuring appropriate exceptions are raised. - **Documentation** - Added comments for better understanding of code functionality. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: root <pxlxingliang> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Fix the RTD build error. Pinning mamba to `<2` is a workaround for jupyterlite/xeus#122. <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Chores** - Added a constraint specifying that the `python` dependency must be less than version 3.13. - Updated the Python version in the Read the Docs configuration to `mambaforge-23.11`. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
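The pins described above could be expressed in a conda environment file along these lines (the file name, environment name, and channel are assumptions for illustration; only the two version constraints come from the commit message):

```yaml
# Illustrative pins matching the RTD build fix: python capped below 3.13 and
# mamba capped below 2 as a workaround for jupyterlite/xeus#122.
name: docs
channels:
  - conda-forge
dependencies:
  - python <3.13
  - mamba <2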
Generated by the task: njzjz-bot/njzjz-bot#9. <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Chores** - Updated workflow configurations for benchmarking and testing to improve the setup process for the `uv` tool. - Replaced the previous installation method with a new action that enables caching and specifies cache dependencies. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Signed-off-by: Jinzhe Zeng <[email protected]> Co-authored-by: Jinzhe Zeng <[email protected]>
<!--pre-commit.ci start--> updates: - [github.com/pre-commit/pre-commit-hooks: v4.6.0 → v5.0.0](pre-commit/pre-commit-hooks@v4.6.0...v5.0.0) - [github.com/astral-sh/ruff-pre-commit: v0.6.7 → v0.6.9](astral-sh/ruff-pre-commit@v0.6.7...v0.6.9) - [github.com/asottile/blacken-docs: 1.18.0 → 1.19.0](adamchainz/blacken-docs@1.18.0...1.19.0) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> Co-authored-by: Jinzhe Zeng <[email protected]>
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.6.9 → v0.7.0](astral-sh/ruff-pre-commit@v0.6.9...v0.7.0) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
…s` in `pyproject.toml` (deepmodeling#743) `project.entry-points.console_scripts` is not allowed per [PEP 621](https://peps.python.org/pep-0621/). <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Updated the command entry point for the `dpdata` tool for improved accessibility. - **Chores** - Organized optional dependencies for better clarity and management. - Maintained existing configurations for tools and package data without functional changes. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
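For context, PEP 621 reserves the `console_scripts` entry-point group for the dedicated `[project.scripts]` table, so spelling it through `[project.entry-points]` is rejected by compliant build back-ends. A minimal sketch of the change (the module path on the right-hand side is illustrative, not necessarily dpdata's actual entry point):

```toml
# Rejected by PEP 621 validators:
# [project.entry-points.console_scripts]
# dpdata = "dpdata.cli:dpdata_cli"

# PEP 621-compliant equivalent:
[project.scripts]
dpdata = "dpdata.cli:dpdata_cli"
```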
Reference deepmodeling#728 to add the spin information for lammps. <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Introduced functions to extract and normalize spin data from LAMMPS dump files. - Added support for registering spin data types in the system. - Created new simulation input and output files for LAMMPS. - Added new compute and dump commands for calculating and outputting spin properties in LAMMPS. - **Tests** - Implemented unit tests for verifying the handling of spin configurations in LAMMPS files, including scenarios with zero spin values and incomplete data. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: root <pxlxingliang> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> Co-authored-by: Han Wang <[email protected]>
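The extraction step described above can be sketched as follows. This is a minimal illustration, not dpdata's actual parser; the spin column names (`sp`, `spx`, `spy`, `spz`, meaning a norm followed by a unit direction) are assumptions about the dump layout:

```python
# Hypothetical sketch: pull per-atom spin vectors out of a LAMMPS dump whose
# "ITEM: ATOMS" header carries spin columns (column names assumed here).
import numpy as np

DUMP = """ITEM: TIMESTEP
0
ITEM: NUMBER OF ATOMS
2
ITEM: BOX BOUNDS pp pp pp
0 10
0 10
0 10
ITEM: ATOMS id type x y z sp spx spy spz
1 1 0.0 0.0 0.0 1.2 0.0 0.0 1.0
2 1 1.0 1.0 1.0 0.0 0.0 0.0 0.0
"""

def parse_spins(dump_text):
    lines = dump_text.splitlines()
    i = lines.index(next(l for l in lines if l.startswith("ITEM: ATOMS")))
    header = lines[i].split()[2:]            # column names after "ITEM: ATOMS"
    cols = {name: j for j, name in enumerate(header)}
    rows = [l.split() for l in lines[i + 1:] if l.strip()]
    # spin vector = norm * unit direction; all-zero rows stay zero
    return np.array(
        [[float(r[cols["sp"]]) * float(r[cols[a]]) for a in ("spx", "spy", "spz")]
         for r in rows]
    )

print(parse_spins(DUMP))
```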
Compared to `sphinx-rtd-theme`, `sphinx-book-theme` is a clean and modern Sphinx theme. Generated by the task: njzjz-bot/njzjz-bot#11.
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.7.0 → v0.7.1](astral-sh/ruff-pre-commit@v0.7.0...v0.7.1) - [github.com/asottile/blacken-docs: 1.19.0 → 1.19.1](adamchainz/blacken-docs@1.19.0...1.19.1) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
…deling#744) <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit ## Release Notes - **New Features** - Enhanced atomic movement handling with the addition of a `move` parameter across multiple functions, enabling more detailed tracking of atom movements. - Introduced selective dynamics support in VASP POSCAR data handling, allowing for better control over atomic movement flags. - Added structured representation for cubic boron nitride crystal in new test files. - **Bug Fixes** - Updated atomic positions and magnetic properties in test structures to reflect accurate simulation data. - **Tests** - Added new test methods to validate the handling of movement flags and ensure the integrity of output structures across various scenarios. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
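The selective-dynamics output mentioned above can be illustrated with a short sketch. In the POSCAR format, a `Selective dynamics` line before the coordinate mode makes each position line carry three `T`/`F` flags saying whether that coordinate may move; the function name and `move` layout here are illustrative, not dpdata's API:

```python
# Sketch of emitting VASP "Selective dynamics" position lines from per-atom
# move flags (True = that coordinate is free to relax).
def poscar_positions(coords, move):
    lines = ["Selective dynamics", "Direct"]
    for xyz, flags in zip(coords, move):
        tf = " ".join("T" if f else "F" for f in flags)
        lines.append(" ".join(f"{c:.6f}" for c in xyz) + " " + tf)
    return "\n".join(lines)

print(poscar_positions([[0.0, 0.0, 0.0], [0.25, 0.25, 0.25]],
                       [[True, True, True], [False, False, True]]))
```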
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Chores** - Streamlined the optional dependencies section by removing extraneous comments in the configuration file. - The `test` dependency now clearly specifies `"numpy<2"` without additional notes. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.7.1 → v0.7.2](astral-sh/ruff-pre-commit@v0.7.1...v0.7.2) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
- support spin for deepmd format - modify name "mag_forces" to "force_mags" <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Introduced a new function to register spin data types, enhancing data handling capabilities. - Added unit tests for reading spin data from NPY and raw formats. - **Changes** - Updated dictionary keys from `"mag_forces"` to `"force_mags"` across multiple files for consistency. - Introduced new data entries and modified existing numerical values in various raw data files. - **Bug Fixes** - Adjusted test cases to reflect the updated naming convention for force magnitudes. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: root <pxlxingliang> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
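A round-trip of the renamed key might look like the following. The `set.000/force_mags.npy` path and the flattened `(nframes, natoms * 3)` layout mirror the deepmd/npy convention for other per-atom arrays, but are stated here as assumptions rather than confirmed details of the implementation:

```python
# Illustrative round-trip of per-frame spin "forces" stored under the renamed
# key force_mags in a deepmd/npy-style folder (layout assumed).
import numpy as np
import os
import tempfile

with tempfile.TemporaryDirectory() as root:
    set0 = os.path.join(root, "set.000")
    os.makedirs(set0)
    nframes, natoms = 2, 3
    data = np.zeros((nframes, natoms * 3))   # flattened (fx, fy, fz) per atom
    np.save(os.path.join(set0, "force_mags.npy"), data)

    loaded = np.load(os.path.join(set0, "force_mags.npy"))
    force_mags = loaded.reshape(nframes, natoms, 3)
    print(force_mags.shape)
```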
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Enhanced data retrieval to include magnetic moment information in atomic coordinates. - Introduced a new function for transforming magnetic moment data into Cartesian coordinates. - Added structured input format for iron simulations, detailing atomic and magnetic properties. - **Bug Fixes** - Improved handling of magnetic data registration in various format classes. - **Tests** - Added a new test method to validate the reading of spin data from the structured input file. - Updated atomic positions in tests to include magnetic moment information. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: root <pxlxingliang> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
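The Cartesian transformation mentioned above can be sketched in a few lines, assuming the magnetic moments are given in lattice (fractional) coordinates and the cell matrix stores lattice vectors as rows, so the row-vector product maps them to Cartesian:

```python
# Minimal sketch: map magnetic moments from lattice coordinates to Cartesian
# coordinates by multiplying with the cell matrix (rows = lattice vectors).
import numpy as np

cell = np.array([[2.0, 0.0, 0.0],
                 [0.0, 2.0, 0.0],
                 [0.0, 0.0, 4.0]])
mag_frac = np.array([[0.0, 0.0, 0.5],
                     [0.5, 0.5, 0.0]])
mag_cart = mag_frac @ cell   # row-vector convention
print(mag_cart)
```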
…modeling#752) In some cases, one may save only a STRU file, with no pseudopotential information. <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Bug Fixes** - Improved error handling and messaging for cases where the pseudo potential file is not provided. - Added checks to ensure parameter lengths match the total number of atoms, enhancing robustness. - **New Features** - Introduced a default value for the pseudo potential file parameter, allowing for more flexible function usage. - **Tests** - Added a new test to verify functionality when the pseudo potential file is not specified, ensuring reliable output with minimal parameters. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: root <pxlxingliang> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
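The validation described above can be sketched as a small checker. The function name and parameter names are illustrative, not dpdata's actual signature; the two checks mirror the summary: a `None` default for the pseudopotential files, and per-atom parameters matched against the total atom count:

```python
# Sketch of the described checks: pp_file defaults to None, and any per-atom
# parameter must match the total number of atoms.
def check_stru_args(atom_numbs, pp_file=None, mags=None):
    natoms = sum(atom_numbs)
    if pp_file is not None and len(pp_file) != len(atom_numbs):
        raise ValueError("one pseudopotential file per element is required")
    if mags is not None and len(mags) != natoms:
        raise ValueError(f"expected {natoms} magnetic moments, got {len(mags)}")
    return True

print(check_stru_args([2, 1], pp_file=None, mags=[0.0, 0.0, 1.0]))
```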
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.7.2 → v0.7.3](astral-sh/ruff-pre-commit@v0.7.2...v0.7.3) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.7.3 → v0.7.4](astral-sh/ruff-pre-commit@v0.7.3...v0.7.4) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4 to 5. Sourced from [codecov/codecov-action's releases](https://github.com/codecov/codecov-action/releases):

**v5.0.0**: `v5` of the Codecov GitHub Action will use the [Codecov Wrapper](https://github.com/codecov/wrapper) to encapsulate the [CLI](https://github.com/codecov/codecov-cli). This will help ensure that the Action gets updates more quickly.

**Migration guide**: the `v5` release also coincides with the opt-out feature for tokens for public repositories. In the `Global Upload Token` section of the settings page of an organization on codecov.io, you can allow Codecov to receive coverage reports from any source. This lets contributors or other members of a repository upload without needing access to the Codecov token. For more details, see [how to upload without a token](https://docs.codecov.com/docs/codecov-tokens#uploading-without-a-token).

> [!WARNING]
> The following arguments have been changed:
> - `file` (deprecated in favor of `files`)
> - `plugin` (deprecated in favor of `plugins`)

The following arguments have been added: `binary`, `gcov_args`, `gcov_executable`, `gcov_ignore`, `gcov_include`, `report_type`, `skip_validation`, `swift_project`. You can see their usage in the [`action.yml` file](https://github.com/codecov/codecov-action/blob/main/action.yml).

The full lists of dependency bumps, changelog entries, and commits are viewable in the [compare view](https://github.com/codecov/codecov-action/compare/v4...v5).

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=codecov/codecov-action&package-manager=github_actions&previous-version=4&new-version=5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Signed-off-by: dependabot[bot] <[email protected]> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
fix deepmodeling#757 <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Enhanced handling of magnetic moments and forces to ensure consistent three-dimensional vector representation. - Updated input parameters for calculation configuration to reflect changes in naming conventions and spin configurations. - **Tests** - Introduced a new test method to validate behavior with the updated input file, improving test coverage and ensuring accuracy in results. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: root <pxlxingliang> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
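The normalization to a consistent three-dimensional representation can be sketched as follows, assuming a collinear scalar moment `m` is promoted to the vector `(0, 0, m)` while existing 3-vectors pass through unchanged (the function name and the scalar-to-z convention are assumptions for illustration):

```python
# Sketch: normalize mixed scalar/vector magnetic moments to shape (natoms, 3).
import numpy as np

def to_3d(moments):
    out = []
    for m in moments:
        if np.isscalar(m):
            out.append([0.0, 0.0, float(m)])   # collinear scalar -> z axis
        else:
            out.append([float(x) for x in m])  # already a 3-vector
    return np.array(out)

print(to_3d([1.5, [0.0, 1.0, 0.0]]))
```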
…type "ndarray[Unknown, Unknown]" (deepmodeling#765)
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.8.2 → v0.8.4](astral-sh/ruff-pre-commit@v0.8.2...v0.8.4) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
fix issue deepmodeling#773 Co-authored-by: Han Wang <[email protected]>
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Documentation** - Updated package name from "DP-GEN" to "dpdata" in README - Revised Python version support description to match package name change <!-- end of auto-generated comment: release notes by coderabbit.ai --> Co-authored-by: Han Wang <[email protected]>
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.8.4 → v0.8.6](astral-sh/ruff-pre-commit@v0.8.4...v0.8.6) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> Co-authored-by: Han Wang <[email protected]>
…ng#777) <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Precision Improvements** - Updated numpy array data types from `float32` to `float64` for enhanced numerical precision - Refined energy and virial value representations in test data files using scientific notation - Removed redundant error handling code - **Test Data Updates** - Updated energy value formats in multiple test XYZ files for improved accuracy <!-- end of auto-generated comment: release notes by coderabbit.ai --> Co-authored-by: Han Wang <[email protected]>
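Why the dtype bump matters: `float32` keeps roughly 7 significant decimal digits, so energies written with more digits (as in the scientific-notation test data above) do not survive a `float32` round-trip. A tiny demonstration:

```python
# float32 vs float64 round-trip of a value with ~14 significant digits.
import numpy as np

e = -31.456789012345
assert float(np.float64(e)) == e   # float64 preserves the Python float exactly
assert float(np.float32(e)) != e   # float32 loses digits beyond ~7
print(float(np.float32(e)), float(np.float64(e)))
```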
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Improvements** - Enhanced flexibility in data handling by making forces data optional in the system configuration. - Added a method to check for the presence of forces data in the system. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> Co-authored-by: Han Wang <[email protected]>
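The presence check described above might look like the following sketch. The dict layout and function name are illustrative (mirroring the summary), not dpdata's actual `System` internals:

```python
# Sketch: forces are optional in the system data, and downstream code
# branches on an explicit presence check instead of assuming the key exists.
import numpy as np

def has_forces(data):
    return "forces" in data and data["forces"] is not None

labeled = {"coords": np.zeros((1, 2, 3)), "forces": np.zeros((1, 2, 3))}
unlabeled = {"coords": np.zeros((1, 2, 3))}
print(has_forces(labeled), has_forces(unlabeled))
```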
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.8.6 → v0.9.1](astral-sh/ruff-pre-commit@v0.8.6...v0.9.1) <!--pre-commit.ci end--> --------- Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Bug Fixes** - Enhanced error handling when processing forces in various data formats. - Added conditional checks to prevent potential runtime errors when force data is missing. - Improved robustness of data conversion methods across multiple plugins. - **Refactor** - Streamlined data handling for optional force and virial information. - Implemented safer data extraction methods in ASE, PWmat, and VASP plugins. - Corrected a typographical error in the documentation of the driver methods. <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit ## Release Notes - **Python Version Support** - Dropped support for Python 3.7 - Now requires Python 3.8 or higher - **Documentation** - Updated installation instructions to reflect new Python version requirements - **Code Maintenance** - Simplified type imports and version-specific code handling - Removed legacy version compatibility checks - **Build Configuration** - Updated GitHub Actions workflow to test only Python 3.8 and 3.12 <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: Han Wang <[email protected]> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Enhanced support for PyMatGen structure format conversion - Improved handling of periodic boundary conditions (PBC) - **Tests** - Added new test class for PyMatGen structure conversion - Expanded test data with additional element types (Fe, Li, O, P) - **Bug Fixes** - Refined atomic species list generation - Improved error handling for structure periodicity <!-- end of auto-generated comment: release notes by coderabbit.ai --> --------- Co-authored-by: Han Wang <[email protected]>
<!--pre-commit.ci start--> updates: - [github.com/astral-sh/ruff-pre-commit: v0.9.1 → v0.9.2](astral-sh/ruff-pre-commit@v0.9.1...v0.9.2) <!--pre-commit.ci end--> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
<!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **Documentation** - Updated Python version compatibility from 3.7 to 3.8 and above in README <!-- end of auto-generated comment: release notes by coderabbit.ai -->
CodSpeed Performance Report

Merging #786 will not alter performance. 🎉 Hooray!
📝 Walkthrough

This pull request introduces a comprehensive set of changes across the
Changes
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant WorkflowConfig
    participant UVSetup
    participant DependencyCache
    User->>WorkflowConfig: Trigger Workflow
    WorkflowConfig->>UVSetup: Use astral-sh/setup-uv@v5
    UVSetup->>DependencyCache: Check Cache
    alt Cache Hit
        DependencyCache-->>UVSetup: Restore Cached Dependencies
    else Cache Miss
        UVSetup->>DependencyCache: Install Dependencies
        UVSetup->>DependencyCache: Save Cache
    end
    UVSetup-->>WorkflowConfig: Environment Ready
```
Actionable comments posted: 8
🧹 Nitpick comments (20)
dpdata/lammps/dump.py (1)
321-331: Handle missing spin data without removing previous spin information.

Currently, when spin data is missing in any frame, the entire spin data is removed from the system. Consider handling missing spin data on a per-frame basis by setting spin values to `None` or an appropriate default for frames where spin data is missing, instead of discarding all spin data collected so far.

🧰 Tools
🪛 Ruff (0.8.2)
326-326: No explicit `stacklevel` keyword argument found (B028)
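The B028 findings that recur in this review flag `warnings.warn` calls without a `stacklevel` argument. A minimal standalone sketch of what the rule is nudging toward (function names here are illustrative, not dpdata's):

```python
import warnings

def load_frames(path):
    # Without stacklevel, the warning is attributed to this line inside the
    # library; with stacklevel=2 it is attributed to the caller's line,
    # which is usually what the user needs to see.
    warnings.warn(f"no spin data found in {path}", UserWarning, stacklevel=2)
    return []

def user_code():
    return load_frames("traj.dump")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    user_code()

print(len(caught))  # 1
```

The recorded warning's filename/lineno then point at `user_code`, not at the `warnings.warn` line itself.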
dpdata/abacus/scf.py (2)
280-280: Simplify multiple `isinstance` checks for `atommag`.

You can merge the `isinstance` checks into a single call for better readability. Apply this diff:

```diff
- if not (isinstance(atommag, list) or isinstance(atommag, float)):
+ if not isinstance(atommag, (list, float)):
```

🧰 Tools
🪛 Ruff (0.8.2)
280-280: Multiple `isinstance` calls for `atommag`, merge into a single call (SIM101)

295-298: Use a ternary operator for concise assignment of `mag_norm`.

Simplify the `if`-`else` block by using a ternary operator. Apply this diff:

```diff
- if isinstance(atommag, list):
-     mag_norm = np.linalg.norm(atommag)
- else:
-     mag_norm = atommag
+ mag_norm = np.linalg.norm(atommag) if isinstance(atommag, list) else atommag
```

🧰 Tools
🪛 Ruff (0.8.2)
295-298: Use ternary operator `mag_norm = np.linalg.norm(atommag) if isinstance(atommag, list) else atommag` instead of `if`-`else` block (SIM108)
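The two Ruff fixes above (SIM101 and SIM108) can be seen side by side in a standalone sketch; the `atommag` handling is paraphrased from the diffs, not copied from the actual dpdata source:

```python
import numpy as np

def mag_norm_of(atommag):
    # SIM101: one isinstance call with a tuple instead of an `or` chain.
    if not isinstance(atommag, (list, float)):
        raise RuntimeError(f"Invalid atommag: {atommag!r}")
    # SIM108: a ternary instead of a four-line if/else block.
    return np.linalg.norm(atommag) if isinstance(atommag, list) else atommag

print(mag_norm_of(3.0))         # 3.0
print(mag_norm_of([3.0, 4.0]))  # 5.0
```

Both rewrites are behavior-preserving; they only shorten the control flow.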
dpdata/pymatgen/structure.py (1)
14-19: Consider relaxing the PBC validation.

The current implementation raises an error for partial PBC conditions. Consider supporting partial PBC or providing a more informative error message explaining valid alternatives.

```diff
- raise ValueError(f"Partial pbc condition {structure.pbc} is not supported")
+ raise ValueError(
+     f"Partial periodic boundary conditions {structure.pbc} are not supported. "
+     "Please use either fully periodic (True, True, True) or "
+     "non-periodic (False, False, False) conditions."
+ )
```

tests/test_deepmd_spin.py (1)
18-22: Consider adding more comprehensive spin data validation.

While the current checks verify the presence and shape of spin data, consider adding assertions for:
- Value ranges for spins and force magnitudes
- Consistency between spins and force magnitudes
- Edge cases (zero spins, maximum values)
tests/test_pymatgen_structure.py (1)
31-42: Consider parameterizing precision values.

The precision values (`places`, `e_places`, `f_places`, `v_places`) are repeated across test classes. Consider moving these to a common base class or configuration.

```python
class PrecisionMixin:
    places = 6
    e_places = 6
    f_places = 6
    v_places = 6
```

Also applies to: 45-57
tests/test_vasp_poscar_dump.py (2)
37-42: Test setup looks good but could be more robust.

The test correctly sets up the system and performs the conversion. However, consider adding assertions to verify the system state before conversion:

```diff
  def test_dump_move_flags(self):
      tmp_system = dpdata.System()
      tmp_system.from_vasp_poscar(os.path.join("poscars", "POSCAR.oh.c"))
+     # Verify initial state
+     expected_move = np.array([[[True, True, False], [False, False, False]]])
+     self.assertTrue(np.array_equal(tmp_system["move"], expected_move))
      tmp_system.to_vasp_poscar("tmp.POSCAR")
      self.system = dpdata.System()
      self.system.from_vasp_poscar("tmp.POSCAR")
```

43-50: Consider using a file context manager for safer file handling.

The file is properly opened but should use a context manager to ensure proper cleanup:

```diff
- with open("tmp.POSCAR") as f:
-     content = f.read()
+ try:
+     with open("tmp.POSCAR") as f:
+         content = f.read()
+ except IOError as e:
+     self.fail(f"Failed to read POSCAR file: {e}")
```

dpdata/plugins/lammps.py (1)
97-101: Consider adding error handling for spin registration.

While the implementation is correct, it might benefit from error handling around the spin registration:

```diff
  data = dpdata.lammps.dump.system_data(
      lines, type_map, unwrap=unwrap, input_file=input_file
  )
- register_spin(data)
+ try:
+     register_spin(data)
+ except Exception as e:
+     # Log warning but don't fail if spin registration fails
+     import warnings
+     warnings.warn(f"Failed to register spin data: {e}")
  return data
```

tests/test_vasp_poscar_to_system.py (1)
22-47: Comprehensive error testing for move flags.

The test class provides good coverage of error cases with clear error messages and proper cleanup. However, consider adding a positive test case to verify successful move flag handling:

```python
def test_move_flags_success(self):
    system = dpdata.System().from_vasp_poscar(
        os.path.join("poscars", "POSCAR.oh.c")
    )
    system.data["move"] = np.array([[[True, True, True], [False, False, False]]])
    system.to_vasp_poscar(self.tmp_file)
    # Verify the written file
    with open(self.tmp_file) as f:
        content = f.read()
    self.assertIn("T T T", content)
    self.assertIn("F F F", content)
```

dpdata/vasp/poscar.py (1)
115-128: Consider enhancing move flags validation.

While the validation checks for list type and length, it could be more robust by also validating that each flag is a boolean value:

```diff
- if not isinstance(move_flags, list) or len(move_flags) != 3:
+ if not isinstance(move_flags, list) or len(move_flags) != 3 or not all(isinstance(flag, bool) for flag in move_flags):
      raise RuntimeError(
-         f"Invalid move flags: {move_flags}, should be a list of 3 bools"
+         f"Invalid move flags: {move_flags}, should be a list of 3 boolean values"
      )
```

dpdata/unit.py (1)
6-14: Consider adding version control for physical constants.

The physical constants are hardcoded with values from scipy 2018. Consider:
- Adding a version identifier in the dictionary
- Documenting the source/reference for these values
- Adding validation to ensure consistency across updates
tests/test_lammps_spin.py (3)
40-50: Consider adding cleanup in the tearDown method.

The `tearDown` method is defined but currently has a commented-out cleanup. Consider either implementing the cleanup or removing the empty method:

```diff
- def tearDown(self):
-     pass  # if os.path.isfile(self.lmp_coord_name):os.remove(self.lmp_coord_name)
+ def tearDown(self):
+     if os.path.isfile(self.lmp_coord_name):
+         os.remove(self.lmp_coord_name)
```
51-60: Add assertions for spin data in the output file.

The test verifies file creation and coordinate reference but doesn't validate the spin data in the output file. Consider adding assertions to verify the spin data:

```diff
  def test_dump_input(self):
      self.tmp_system.to("lammps/lmp", self.lmp_coord_name)
      self.assertTrue(os.path.isfile(self.lmp_coord_name))
      with open(self.lmp_coord_name) as f:
          c = f.read()
+     # Verify spin data in output
+     spin_ref = """ 0.6000000000 0.8000000000 0.0000000000 5.0000000000"""
+     self.assertTrue(spin_ref in c)
```
85-106: Add a test for invalid spin data format.

The test class should include a test for handling invalid spin data formats. Consider adding a test method:

```python
def test_read_dump_invalid_spin(self):
    with self.assertRaises(ValueError):
        tmp_system = dpdata.System(
            "lammps/invalid_spin.dump",
            fmt="lammps/dump",
            type_map=["O", "H"],
            input_file="lammps/in.lmp",
        )
```

dpdata/lammps/lmp.py (2)
130-141: Add input validation for spin data.

The `get_spins` function should validate the spin data format more thoroughly. Consider adding more validation:

```diff
  def get_spins(lines):
      atom_lines = get_atoms(lines)
      if len(atom_lines[0].split()) < 8:
          return None
+     if not all(len(line.split()) >= 8 for line in atom_lines):
+         raise ValueError("Inconsistent spin data format across atoms")
      spins_ori = []
      spins_norm = []
      for ii in atom_lines:
          iis = ii.split()
+         if not all(isinstance(float(x), float) for x in iis[5:8]):
+             raise ValueError("Invalid spin values")
          spins_ori.append([float(jj) for jj in iis[5:8]])
          spins_norm.append([float(iis[-1])])
      return np.array(spins_ori) * np.array(spins_norm)
```

253-277: Consider extracting spin formatting logic.

The spin formatting logic in `from_system_data` is complex and could be extracted into a separate function. Consider refactoring:

```python
def _format_spin_data(spins, spin_norm, idx):
    if spin_norm[idx] != 0:
        return (
            spins[idx][0] / spin_norm[idx],
            spins[idx][1] / spin_norm[idx],
            spins[idx][2] / spin_norm[idx],
            spin_norm[idx],
        )
    return (
        spins[idx][0],
        spins[idx][1],
        spins[idx][2] + 1,
        spin_norm[idx],
    )
```

tests/test_abacus_stru_dump.py (1)
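The suggested `_format_spin_data` refactor above writes each spin as a unit direction plus a separate norm. That decomposition can be sketched standalone (names here are illustrative, not dpdata API):

```python
import numpy as np

def split_spin(spin):
    """Decompose a spin vector into (unit direction, norm).

    A zero spin has no well-defined direction, so fall back to a fixed
    axis, mirroring the special case in the suggested _format_spin_data.
    """
    norm = float(np.linalg.norm(spin))
    if norm == 0.0:
        return np.array([0.0, 0.0, 1.0]), 0.0
    return np.asarray(spin, dtype=float) / norm, norm

direction, norm = split_spin([0.6, 0.8, 0.0])
print(norm)  # 1.0
```

Dividing by the norm keeps the LAMMPS-style output (`sp x y z |sp|`) numerically consistent: the direction columns are always unit-length and the magnitude lives in the last column.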
32-34: Add assertions for file content.

The test only verifies file creation but not its content. Consider adding content validation:

```diff
  def test_dump_stru_without_pporb(self):
      self.system_ch4.to("stru", "STRU_tmp", mass=[12, 1])
      self.assertTrue(os.path.isfile("STRU_tmp"))
+     with open("STRU_tmp") as f:
+         content = f.read()
+     self.assertNotIn("NUMERICAL_ORBITAL", content)
+     self.assertNotIn("NUMERICAL_DESCRIPTOR", content)
```

dpdata/plugins/ase.py (1)
301-304: Simplify dictionary key checks and improve force handling.

- The dictionary key checks can be simplified.
- The force handling is now more robust with proper existence checks.

Apply this diff to simplify the key checks:

```diff
- if "forces" in tmp.keys() and "forces" in dict_frames.keys():
+ if "forces" in tmp and "forces" in dict_frames:
      dict_frames["forces"] = np.append(
          dict_frames["forces"], tmp["forces"][0]
      )
  ...
- if "forces" in dict_frames.keys():
+ if "forces" in dict_frames:
      dict_frames["forces"] = dict_frames["forces"].reshape(len(sub_traj), -1, 3)
```

Also applies to: 313-314

🧰 Tools
🪛 Ruff (0.8.2)
301-301: Use `key in dict` instead of `key in dict.keys()`; remove `.keys()` (SIM118)

301-301: Use `key in dict` instead of `key in dict.keys()`; remove `.keys()` (SIM118)
dpdata/system.py (1)
224-225: Use f-strings for string formatting.

The string formatting can be improved by using f-strings for better readability. Apply this diff:

```diff
- "Sum of atom_numbs (%d) is not equal to natoms (%d)."  # noqa: UP031
- % (sum(self.get_atom_numbs()), self.get_natoms())
+ f"Sum of atom_numbs ({sum(self.get_atom_numbs())}) is not equal to natoms ({self.get_natoms()})."
  ...
- ret += "\nFrame Numbers : %d" % self.get_nframes()  # noqa: UP031
- ret += "\nAtom Numbers : %d" % self.get_natoms()  # noqa: UP031
+ ret += f"\nFrame Numbers : {self.get_nframes()}"
+ ret += f"\nAtom Numbers : {self.get_natoms()}"
```

Also applies to: 279-280
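The `%`-to-f-string rewrite above is behavior-preserving for these integer fields; a quick standalone check (the values are hypothetical, not taken from a real system):

```python
natoms = 5
atom_numbs = [2, 2]

# Old style, flagged by Ruff's UP031 unless suppressed with noqa.
old = "Sum of atom_numbs (%d) is not equal to natoms (%d)." % (sum(atom_numbs), natoms)
# New style: the expression is inlined, so no separate argument tuple to keep in sync.
new = f"Sum of atom_numbs ({sum(atom_numbs)}) is not equal to natoms ({natoms})."

print(old == new)  # True
```

Besides readability, the f-string removes the `# noqa: UP031` suppressions, since the lint no longer applies.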
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
tests/abacus.spin/OUT.ABACUS-nspin2/running_scf.log is excluded by `!**/*.log`
📒 Files selected for processing (82)
.github/workflows/benchmark.yml (1 hunks)
.github/workflows/test.yml (2 hunks)
.pre-commit-config.yaml (3 hunks)
.readthedocs.yaml (1 hunks)
README.md (1 hunks)
docs/conf.py (3 hunks)
docs/installation.md (1 hunks)
docs/make_format.py (1 hunks)
docs/rtd_environment.yml (1 hunks)
dpdata/abacus/md.py (7 hunks)
dpdata/abacus/relax.py (4 hunks)
dpdata/abacus/scf.py (14 hunks)
dpdata/ase_calculator.py (1 hunks)
dpdata/data_type.py (1 hunks)
dpdata/deepmd/comp.py (2 hunks)
dpdata/deepmd/hdf5.py (2 hunks)
dpdata/deepmd/mixed.py (1 hunks)
dpdata/deepmd/raw.py (1 hunks)
dpdata/driver.py (1 hunks)
dpdata/gaussian/gjf.py (2 hunks)
dpdata/lammps/dump.py (3 hunks)
dpdata/lammps/lmp.py (5 hunks)
dpdata/plugins/abacus.py (5 hunks)
dpdata/plugins/amber.py (1 hunks)
dpdata/plugins/ase.py (3 hunks)
dpdata/plugins/deepmd.py (6 hunks)
dpdata/plugins/gaussian.py (1 hunks)
dpdata/plugins/lammps.py (2 hunks)
dpdata/plugins/n2p2.py (2 hunks)
dpdata/plugins/pwmat.py (1 hunks)
dpdata/plugins/pymatgen.py (1 hunks)
dpdata/plugins/vasp.py (6 hunks)
dpdata/pwmat/atomconfig.py (2 hunks)
dpdata/pymatgen/structure.py (1 hunks)
dpdata/qe/traj.py (2 hunks)
dpdata/rdkit/sanitize.py (1 hunks)
dpdata/stat.py (8 hunks)
dpdata/system.py (8 hunks)
dpdata/unit.py (1 hunks)
dpdata/utils.py (1 hunks)
dpdata/vasp/outcar.py (1 hunks)
dpdata/vasp/poscar.py (5 hunks)
dpdata/vasp/xml.py (3 hunks)
dpdata/xyz/quip_gap_xyz.py (2 hunks)
pyproject.toml (3 hunks)
tests/abacus.scf/STRU.ch4 (1 hunks)
tests/abacus.scf/stru_test (1 hunks)
tests/abacus.spin/INPUT.scf.nspin2 (2 hunks)
tests/abacus.spin/STRU.spin (1 hunks)
tests/comp_sys.py (2 hunks)
tests/lammps/in.lmp (1 hunks)
tests/lammps/spin.lmp (1 hunks)
tests/lammps/traj.dump (1 hunks)
tests/lammps/traj_partial_spin.dump (1 hunks)
tests/plugin/pyproject.toml (1 hunks)
tests/poscars/POSCAR.oh.err1 (1 hunks)
tests/poscars/POSCAR.oh.err2 (1 hunks)
tests/poscars/poscar_ref_oh.py (2 hunks)
tests/pwmat/config_ref_ch4.py (2 hunks)
tests/pwmat/config_ref_oh.py (2 hunks)
tests/pymatgen_data/deepmd/type.raw (1 hunks)
tests/pymatgen_data/deepmd/type_map.raw (1 hunks)
tests/test_abacus_pw_scf.py (1 hunks)
tests/test_abacus_spin.py (7 hunks)
tests/test_abacus_stru_dump.py (3 hunks)
tests/test_custom_data_type.py (1 hunks)
tests/test_deepmd_spin.py (1 hunks)
tests/test_from_pymatgen.py (0 hunks)
tests/test_lammps_spin.py (1 hunks)
tests/test_msd.py (1 hunks)
tests/test_pymatgen_structure.py (1 hunks)
tests/test_remove_pbc.py (1 hunks)
tests/test_system_apply_pbc.py (1 hunks)
tests/test_vasp_poscar_dump.py (1 hunks)
tests/test_vasp_poscar_to_system.py (1 hunks)
tests/tmp.deepmd.spin/Fe16-npy/type.raw (1 hunks)
tests/tmp.deepmd.spin/Fe16-npy/type_map.raw (1 hunks)
tests/tmp.deepmd.spin/Fe16-raw/box.raw (1 hunks)
tests/tmp.deepmd.spin/Fe16-raw/coord.raw (1 hunks)
tests/tmp.deepmd.spin/Fe16-raw/energy.raw (1 hunks)
tests/tmp.deepmd.spin/Fe16-raw/force.raw (1 hunks)
tests/tmp.deepmd.spin/Fe16-raw/force_mag.raw (1 hunks)
⛔ Files not processed due to max files limit (7)
- tests/tmp.deepmd.spin/Fe16-raw/spin.raw
- tests/tmp.deepmd.spin/Fe16-raw/type.raw
- tests/tmp.deepmd.spin/Fe16-raw/type_map.raw
- tests/tmp.deepmd.spin/Fe16-raw/virial.raw
- tests/xyz/xyz_unittest.field.xyz
- tests/xyz/xyz_unittest.sort.xyz
- tests/xyz/xyz_unittest.xyz
💤 Files with no reviewable changes (1)
- tests/test_from_pymatgen.py
✅ Files skipped from review due to trivial changes (33)
- tests/tmp.deepmd.spin/Fe16-npy/type_map.raw
- tests/tmp.deepmd.spin/Fe16-npy/type.raw
- tests/tmp.deepmd.spin/Fe16-raw/box.raw
- .readthedocs.yaml
- tests/tmp.deepmd.spin/Fe16-raw/force_mag.raw
- tests/poscars/poscar_ref_oh.py
- tests/test_system_apply_pbc.py
- tests/pymatgen_data/deepmd/type.raw
- tests/pymatgen_data/deepmd/type_map.raw
- tests/pwmat/config_ref_oh.py
- tests/tmp.deepmd.spin/Fe16-raw/force.raw
- dpdata/deepmd/comp.py
- tests/tmp.deepmd.spin/Fe16-raw/energy.raw
- docs/installation.md
- tests/test_custom_data_type.py
- dpdata/plugins/amber.py
- tests/test_remove_pbc.py
- dpdata/deepmd/raw.py
- dpdata/plugins/gaussian.py
- dpdata/vasp/xml.py
- dpdata/rdkit/sanitize.py
- dpdata/data_type.py
- dpdata/gaussian/gjf.py
- dpdata/deepmd/hdf5.py
- dpdata/vasp/outcar.py
- tests/test_msd.py
- tests/tmp.deepmd.spin/Fe16-raw/coord.raw
- tests/poscars/POSCAR.oh.err1
- .pre-commit-config.yaml
- dpdata/qe/traj.py
- dpdata/plugins/n2p2.py
- tests/lammps/traj.dump
- tests/poscars/POSCAR.oh.err2
🧰 Additional context used
🪛 actionlint (1.7.4)
.github/workflows/test.yml
29-29: shellcheck reported issue in this script: SC2102:info:1:26: Ranges can only match single chars (mentioned due to duplicates)
(shellcheck)
🪛 Ruff (0.8.2)

dpdata/abacus/relax.py
79-79: Do not `assert False` (`python -O` removes these calls), raise `AssertionError()`. Replace `assert False` (B011)

dpdata/lammps/dump.py
218-218: No explicit `stacklevel` keyword argument found (B028)
278-278: No explicit `stacklevel` keyword argument found (B028)
326-326: No explicit `stacklevel` keyword argument found (B028)

dpdata/vasp/poscar.py
69-70: Use a single `if` statement instead of nested `if` statements; combine `if` statements using `and` (SIM102)

dpdata/plugins/ase.py
301-301: Use `key in dict` instead of `key in dict.keys()`; remove `.keys()` (SIM118)
301-301: Use `key in dict` instead of `key in dict.keys()`; remove `.keys()` (SIM118)
313-313: Use `key in dict` instead of `key in dict.keys()`; remove `.keys()` (SIM118)

dpdata/stat.py
119-119: Use of `functools.lru_cache` or `functools.cache` on methods can lead to memory leaks (B019)
127-127: Use of `functools.lru_cache` or `functools.cache` on methods can lead to memory leaks (B019)
156-156: Use of `functools.lru_cache` or `functools.cache` on methods can lead to memory leaks (B019)
169-169: Use of `functools.lru_cache` or `functools.cache` on methods can lead to memory leaks (B019)

dpdata/abacus/scf.py
280-280: Multiple `isinstance` calls for `atommag`, merge into a single call (SIM101)
295-298: Use ternary operator `mag_norm = np.linalg.norm(atommag) if isinstance(atommag, list) else atommag` instead of `if`-`else` block (SIM108)
773-773: No explicit `stacklevel` keyword argument found (B028)
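The B019 hits in `dpdata/stat.py` warn that caching a bound method keeps `self` alive: the `lru_cache` on the function object holds a strong reference to every instance it was called on. A small sketch of the mechanism (a toy class, not the dpdata code):

```python
import functools
import gc
import weakref

class Stats:
    @functools.lru_cache  # B019: the cache holds a strong ref to `self`
    def expensive(self):
        return 42

obj = Stats()
obj.expensive()       # populates the cache, keyed on (obj,)
ref = weakref.ref(obj)
del obj
gc.collect()
print(ref() is None)  # False: the lru_cache entry still pins the instance
```

Common remedies are `functools.cached_property`, caching in an instance attribute, or making the cached function a module-level helper that takes hashable arguments instead of `self`.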
🔇 Additional comments (74)
dpdata/lammps/dump.py (2)
200-235: Function `get_spin_keys` implementation looks good.

The function correctly extracts spin keys from the input file with appropriate error handling when the input file is missing or does not contain the expected keys.

🧰 Tools
🪛 Ruff (0.8.2)
218-218: No explicit `stacklevel` keyword argument found (B028)

238-280: Function `get_spin` handles spin data extraction appropriately.

The function processes spin information from the dump file, including error handling for potential issues during data extraction.

🧰 Tools
🪛 Ruff (0.8.2)
278-278: No explicit `stacklevel` keyword argument found (B028)
dpdata/abacus/scf.py (2)
261-305: Function `get_atom_mag_cartesian` is implemented correctly.

The function accurately converts magnetic moments from spherical to Cartesian coordinates, handling various input types and edge cases appropriately.

🧰 Tools
🪛 Ruff (0.8.2)
280-280: Multiple `isinstance` calls for `atommag`, merge into a single call (SIM101)

295-298: Use ternary operator `mag_norm = np.linalg.norm(atommag) if isinstance(atommag, list) else atommag` instead of `if`-`else` block (SIM108)
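The review above notes that `get_atom_mag_cartesian` converts magnetic moments from spherical to Cartesian coordinates. The standard conversion it alludes to can be sketched as follows; this is the textbook formula matching the `mag <norm> angle1 <a1> angle2 <a2>` lines in the STRU test files, assumed here rather than copied from dpdata's exact code:

```python
import math

def sph_to_cart(norm, angle1_deg=0.0, angle2_deg=0.0):
    """Magnetic moment from norm plus polar/azimuthal angles in degrees.

    angle1 is measured from the z axis, angle2 in the x-y plane from x
    (assumed convention, inferred from the STRU examples in this PR).
    """
    t = math.radians(angle1_deg)
    p = math.radians(angle2_deg)
    return (
        norm * math.sin(t) * math.cos(p),
        norm * math.sin(t) * math.sin(p),
        norm * math.cos(t),
    )

print(sph_to_cart(3))          # (0.0, 0.0, 3.0): no angles -> along z
print(sph_to_cart(3, 90, 90))  # ~(0, 3, 0): tipped into the x-y plane, along y
```

This matches the test expectations quoted elsewhere in the review, where `mag 3` becomes a vector of magnitude 3 along z and the angle variants rotate it.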
570-571: Verify that all references to `mag_forces` have been updated to `force_mags`.

Since `mag_forces` has been renamed to `force_mags`, ensure that all occurrences in the codebase are updated to prevent potential bugs. Run the following script to verify:
tests/lammps/in.lmp (1)
1-2: LGTM!

The added `compute` and `dump` commands correctly configure the calculation and output of spin properties in the LAMMPS simulation.

tests/abacus.spin/INPUT.scf.nspin2 (1)
2-2
: Verify consistency of nspin=2 configuration across test files

The changes to the suffix (ABACUS → ABACUS-nspin2) and nspin (4 → 2) parameters appear to be part of a broader update to spin configurations. Please ensure these changes are consistent with related test files and expected test outcomes.
Also applies to: 17-17
✅ Verification successful
Verified: nspin=2 configuration is correctly isolated for specific test case
The nspin=2 configuration in INPUT.scf.nspin2 is intentionally different as it's specifically used by the test_scf_nspin2 test case, while other test files maintain nspin=4 for their respective scenarios. The changes are consistent with the test structure.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for other nspin configurations in test files rg -l 'nspin.*[24]' tests/Length of output: 240
Script:
#!/bin/bash # Check nspin values in identified files echo "=== INPUT files ===" rg -A 1 'nspin' tests/abacus.spin/INPUT.* echo -e "\n=== Test file ===" rg 'nspin.*[24]' tests/test_abacus_spin.pyLength of output: 853
tests/lammps/spin.lmp (1)
1-12
: Well-structured LAMMPS test file with spin configurations

The new test file correctly defines:
- System size (2 atoms, 2 types)
- Box dimensions with tilt factors
- Atomic positions with spin parameters
tests/abacus.spin/STRU.spin (1)
20-23: Verify consistency of magnetic property formats

The magnetic properties are specified in different formats across atoms:
- Atom 1: `mag 0 0 2`
- Atom 2: `mag 3`
- Atom 3: `mag 3 angle1 90`
- Atom 4: `mag 3 4 0 angle1 90 angle2 90`
Please verify if these varying formats are intentional and documented in the test suite.
✅ Verification successful
Magnetic property formats are consistent with test suite
The varying magnetic property formats are intentional and well-documented across the test suite. The formats are consistently used in test files (test_abacus_spin.py), reference files (stru.ref), and have dedicated parsing support (test_abacus_stru_dump.py).
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for similar magnetic property formats in test files rg -A 1 'mag.*[0-9].*angle' tests/Length of output: 1951
tests/abacus.scf/STRU.ch4 (1)
21-21: Verify magnetic parameter consistency with CH4 symmetry

The magnetic parameters added to atomic positions:
- Carbon: `1 1 1`
- Hydrogens: `1 0 1`, `1 0 1`, `0 1 1`

These parameters should reflect the tetrahedral symmetry of CH4. Please verify if the varying magnetic parameters for hydrogen atoms are consistent with the expected molecular symmetry.

Also applies to: 26-28
tests/abacus.scf/stru_test (2)
25-25: LGTM: Carbon atom magnetic properties added correctly.

The addition of magnetic properties (`mag 0.000000000000 0.000000000000 0.000000000000`) to the carbon atom position is consistent with the format and maintains the original occupancy values.

29-32: Verify the occupancy pattern changes for hydrogen atoms.

The occupancy values for hydrogen atoms have been modified with a specific pattern:
- H1: `0 0 0` (fully unoccupied)
- H2 & H3: `1 0 1` (partially occupied)
- H4: `0 1 1` (partially occupied)

Please confirm if this occupancy pattern is intentional and physically meaningful for your system.
dpdata/pymatgen/structure.py (2)
27-28
: LGTM: New fields added safely.

The addition of `"orig"` and `"nopbc"` fields to `info_dict` maintains backward compatibility while extending functionality.
8-8
: Verify backward compatibility of symbol extraction change.

The change from `site.species_string` to `ii.specie.symbol`
might affect systems using older versions of pymatgen. Please ensure this change is compatible across all supported pymatgen versions.✅ Verification successful
Symbol extraction change is safe to proceed
The codebase shows no usage of the old
species_string
pattern, and pymatgen dependencies are properly handled through a plugin architecture with appropriate error handling. The change tospecie.symbol
is consistent with the codebase's current implementation.🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Check pymatgen version requirements in setup.py or requirements.txt fd -t f "setup.py|requirements.*" -x rg -l "pymatgen"Length of output: 53
Script:
#!/bin/bash # Check setup.py content directly cat setup.py # Find pymatgen import patterns and both symbol extraction methods rg "from pymatgen" -A 2 rg "import pymatgen" -A 2 rg "species_string|specie\.symbol" -A 2Length of output: 2282
tests/test_deepmd_spin.py (1)
24-34
: LGTM: File format tests are well-structured.The tests thoroughly verify both NPY and raw format handling, including proper file creation and cleanup.
Also applies to: 36-44
tests/test_pymatgen_structure.py (1)
9-14
: LGTM: Robust module availability checking.The pymatgen import check is well-implemented with proper exception handling and test skipping.
dpdata/plugins/pwmat.py (1)
34-40
: Good improvement in force handling logic.The changes improve error handling by only including forces when they are available. This prevents potential issues with None values.
dpdata/plugins/lammps.py (2)
17-27
: Well-structured spin data registration.The implementation correctly registers spin data type with appropriate shape and attributes.
65-95
: Comprehensive docstring with well-documented parameters.The docstring follows good practices with clear parameter descriptions and return type information.
tests/test_vasp_poscar_to_system.py (1)
17-19
: Good test for move flags validation.The test correctly verifies the expected move flags using numpy array comparison.
dpdata/plugins/pymatgen.py (3)
33-33
: LGTM! Clean import statement.

The import statement correctly includes both `Structure` and `Lattice` from `pymatgen.core`.
37-38
: LGTM! Improved species list construction and PBC handling.The changes improve the code by:
- Using a more efficient list comprehension for species construction
- Adding explicit PBC handling with a clear default behavior
41-41
: LGTM! Proper usage of Lattice with PBC.The code correctly uses the
Lattice
class to construct the cell with proper PBC handling.dpdata/plugins/abacus.py (4)
23-25
: LGTM! Improved data handling.The code correctly registers magnetic data after getting frame data from STRU.
60-62
: LGTM! Proper data type registration for magnetic data.The code correctly registers magnetic data types for both System and LabeledSystem.
Also applies to: 70-71
74-85
: LGTM! Well-structured move data registration.The new
register_move_data
function follows the same pattern asregister_mag_data
, maintaining consistency in the codebase.
94-94
: LGTM! Consistent usage of register_move_data.The
register_move_data
function is consistently called across all relevant format classes.Also applies to: 106-106, 118-118
tests/lammps/traj_partial_spin.dump (1)
1-52
: Test data looks good.This is a well-structured test fixture for LAMMPS trajectory data with and without spin information.
dpdata/stat.py (2)
12-12
: LGTM! Improved type annotations.The return type annotations are now more precise using
np.floating[Any]
instead ofnp.float64
.Also applies to: 28-28
58-63
: LGTM! Improved error messages.The assertion error messages now clearly indicate the expected type of the system arguments.
dpdata/vasp/poscar.py (5)
7-14
: LGTM! Well-implemented flag mapping function.The
move_flag_mapper
function correctly handles the conversion of VASP move flags ("T"/"F") to boolean values with proper error handling.
38-44
: LGTM! Robust selective dynamics handling.The implementation correctly processes move flags when selective dynamics is enabled, with proper error handling for invalid input formats.
55-58
: LGTM! Proper reshaping of move flags.The move flags are correctly reshaped to match the system structure (frames, atoms, dimensions).
64-74
: LGTM! Clean selective dynamics detection.The implementation properly handles the "Selective dynamics" header in POSCAR files with case-insensitive matching.
🧰 Tools
🪛 Ruff (0.8.2)
69-70: Use a single
if
statement instead of nestedif
statementsCombine
if
statements usingand
(SIM102)
99-101
: LGTM! Proper handling of selective dynamics header.The implementation correctly adds the "Selective Dynamics" header when move flags are present.
dpdata/plugins/vasp.py (2)
18-28
: LGTM! Well-structured move data registration.The
register_move_data
function properly defines and registers the move data type with appropriate dimensions and attributes.
107-108
: LGTM! Improved force data handling.The implementation now properly checks for None before assigning force data.
dpdata/utils.py (1)
6-6
: LGTM! Simplified typing imports.Direct import of Literal aligns with the project's Python version requirements.
dpdata/deepmd/mixed.py (1)
18-20
: LGTM! Improved assertion formatting.The assertion message is now more concise while maintaining clarity.
dpdata/unit.py (1)
21-29
: LGTM! Clear and well-documented constant definitions.The constant definitions are clear, with good use of the new
scipy_constant_value
function. The RYDBERG calculation correctly combines the required constants.docs/conf.py (2)
27-27
: LGTM! Appropriate linter directive added.The noqa directive for UP031 is correctly applied to the copyright line.
47-47
: Verify the sphinx_book_theme compatibility.The theme change from sphinx_rtd_theme to sphinx_book_theme needs verification:
- Ensure all existing documentation renders correctly
- Check mobile responsiveness
- Verify search functionality
Also applies to: 90-90
tests/test_abacus_pw_scf.py (1)
161-161
: LGTM! Improved array emptiness check.The change from
assertFalse(system_ch4.data["forces"])
toassertFalse(system_ch4.data["forces"].size)
is more explicit and reliable for checking array emptiness.dpdata/driver.py (1)
169-170
: LGTM! Improved error handling for optional data.The addition of key existence checks prevents KeyError exceptions when handling optional force and virial data.
tests/comp_sys.py (1)
121-123
: LGTM! Improved assertion message formatting.The changes enhance readability by consistently wrapping assertion messages in parentheses.
Also applies to: 127-129, 200-202, 209-211
dpdata/xyz/quip_gap_xyz.py (1)
164-164
: LGTM! Enhanced numerical precision.The change from float32 to float64 provides better numerical precision for scientific computations, which is crucial for accurate molecular dynamics simulations.
Also applies to: 178-181
dpdata/abacus/relax.py (1)
186-186
: LGTM! Improved data structure handling.The changes include:
- Added support for move parameter in get_coords
- Renamed mag_forces to force_mags for better clarity
- Added move data to the output dictionary when available
Also applies to: 220-222
tests/test_abacus_spin.py (3)
19-20
: LGTM! Improved test cleanup.Added proper cleanup of the INPUT file in tearDown, preventing test interference.
55-85
: LGTM! Comprehensive test coverage for nspin2.The new test_scf_nspin2 method provides coverage for:
- Energy validation
- Spin vector validation
- Force magnitudes validation
- Data persistence verification
194-208: LGTM! Added test coverage for structure file spin reading. The new `test_read_stru_spin` method validates:
- Presence of spins in loaded data
- Correct parsing of various spin vector formats
- Accurate conversion of angles to vector components
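The angle-to-vector conversion being validated follows standard spherical coordinates. A hedged sketch (the function name and the degree-based polar/azimuthal convention are assumptions for illustration, not dpdata's exact implementation):

```python
import math

def spin_from_angles(magnitude: float, angle1_deg: float, angle2_deg: float):
    """Convert (|m|, polar angle, azimuthal angle) in degrees to a
    Cartesian spin vector (sx, sy, sz)."""
    theta = math.radians(angle1_deg)  # polar angle measured from +z
    phi = math.radians(angle2_deg)    # azimuthal angle in the x-y plane
    return (
        magnitude * math.sin(theta) * math.cos(phi),
        magnitude * math.sin(theta) * math.sin(phi),
        magnitude * math.cos(theta),
    )
```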
tests/test_lammps_spin.py (1)
132-144: LGTM! Good error handling test. The test properly verifies warning generation for incomplete spin data.
dpdata/abacus/md.py (3)
59-61: LGTM! Improved assertion message readability. The assertion message formatting with parentheses enhances readability.
223-225: LGTM! Consistent variable naming. The renaming from `mag_forces` to `force_mags` and the addition of `move` data are consistent with the codebase conventions.
170-172: Verify the impact of adding the `move` parameter. The addition of the `move` parameter to the unpacking might affect other parts of the codebase.

✅ Verification successful: the addition of the `move` parameter is safe and consistent with existing usage. The codebase has two separate `get_coords` implementations:
- QE implementation returns 4 values (atom_names, natoms, types, coords)
- ABACUS implementation returns 6 values (including move and magmom)
All ABACUS-related files already handle these 6 return values consistently, and the QE implementation remains separate and unaffected.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other calls to get_coords to ensure they handle the new parameter
ast-grep --pattern 'get_coords($$$)'

Length of output: 931
Script:
#!/bin/bash
# Find implementations of get_coords function
ast-grep --pattern 'def get_coords($$$)'
# Also search for the function name in imports to verify if these are different implementations
rg "import.*get_coords|from.*get_coords"

Length of output: 85
Script:
#!/bin/bash
# Look at the content of files where we know get_coords is used
cat dpdata/qe/scf.py | grep -A 20 "def get_coords"
cat dpdata/abacus/scf.py | grep -A 20 "def get_coords"
# Try a broader pattern match for function definition
ast-grep --pattern 'get_coords'
# Search for files containing get_coords to ensure we don't miss any implementations
rg -l "get_coords"

Length of output: 2962
tests/test_abacus_stru_dump.py (2)
58-99: LGTM! Comprehensive error handling tests. The test cases thoroughly verify different parameter mismatch scenarios with appropriate error messages.
153-194: LGTM! Good test coverage for move data. The test properly verifies the conversion and formatting of move data from VASP format.
docs/make_format.py (1)
7-7: Verify Python version requirements. The change from `typing_extensions` to `typing` for the `Literal` import indicates a minimum Python version requirement of 3.8. Please ensure this aligns with the project's compatibility requirements.

✅ Verification successful: the change from `typing_extensions` to `typing` is compatible with project requirements. The project explicitly requires Python >=3.8 and has proper conditional dependencies set up for backward compatibility.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check Python version requirements in project configuration
fd -t f "pyproject.toml|setup.py" -x cat {} | rg -A 5 "python.*version|requires.*python"

Length of output: 656
dpdata/plugins/ase.py (1)
178-180: Improve robustness of force handling. The conditional check for the "forces" key before adding it to results improves error handling and prevents potential `KeyError` exceptions.
dpdata/plugins/deepmd.py (1)
21-41: LGTM! Well-structured data type registration. The implementation correctly:
- Registers both "spins" and "force_mags" data types
- Defines appropriate shapes using `Axis` constants
- Sets them as optional with `required=False`
- Maps to DeePMD-kit names
dpdata/system.py (3)
1271-1272: LGTM! Clean implementation of the `has_forces` method. The method provides a clean way to check for the existence of forces data.
1047-1047: Improve type safety with assertion. The assertion prevents potential issues with boolean type confusion.
1314-1315: Improve robustness of force correction. The force correction now properly checks for the existence of forces in both systems before applying the correction.
tests/plugin/pyproject.toml (1)
14-14: LGTM! Python version requirement is consistent. The update to require Python >=3.8 aligns with the project's overall Python version requirements.
.github/workflows/benchmark.yml (1)
16-21: LGTM! Workflow improvements look solid. Good improvements to the benchmark workflow:
- Using the official `astral-sh/setup-uv@v5` action is more reliable
- Added caching will improve workflow performance
.github/workflows/test.yml (4)
9-9: LGTM! Fixed Ubuntu version provides better stability. Using `ubuntu-22.04` instead of `ubuntu-latest` ensures a consistent CI environment.
12-12: LGTM! Python version matrix aligns with requirements. Removing Python 3.7 from the test matrix is consistent with the project's move to Python >=3.8.
21-27: LGTM! UV setup matches benchmark workflow. Good improvements to the test workflow:
- Using the official `astral-sh/setup-uv@v5` action
- Added caching with a Python version-specific suffix
33-33: LGTM! Codecov action update. Updating to codecov-action@v5 follows the good practice of keeping actions up-to-date.
tests/pwmat/config_ref_ch4.py (1)
38-38: LGTM! Linting suppressions are appropriate. The added `# noqa: UP031` comments appropriately suppress string formatting warnings without affecting test functionality.

Also applies to: 60-60
README.md (1)
11-11: LGTM! Documentation update for Python version requirement. The README correctly reflects the updated minimum Python version requirement of 3.8, which aligns with the changes in `pyproject.toml`.

dpdata/ase_calculator.py (1)
65-66: LGTM! Improved error handling for forces data. The added conditional check prevents potential `KeyError` exceptions when forces data is not available, making the code more robust.
dpdata/pwmat/atomconfig.py (1)
57-57: LGTM! Documented linting suppressions. The added `noqa` comments properly document the intentional use of %-formatting for precise numerical output, suppressing UP031 warnings.
Also applies to: 86-86
pyproject.toml (3)
30-30: LGTM! Updated Python version requirement. The Python version requirement update to >=3.8 aligns with the changes in the README and reflects the project's forward movement.
39-39: LGTM! Updated entry points configuration. The change from `entry-points.console_scripts` to `scripts` follows modern setuptools conventions.
54-54: LGTM! Documentation dependencies update. Updates to sphinx-book-theme and deepmodeling-sphinx>=0.3.0 improve the documentation tooling.
Also applies to: 57-57
warnings.warn(
    f"Warning: spin info is not found in frame {ii}, remove spin info."
)
🛠️ Refactor suggestion
Include the `stacklevel` parameter in `warnings.warn` to improve warning traceability. Adding `stacklevel=2` ensures that warnings point to the relevant line in the user's code.
Apply this diff to fix the issue:
- warnings.warn(
- f"Warning: spin info is not found in frame {ii}, remove spin info."
- )
+ warnings.warn(
+ f"Warning: spin info is not found in frame {ii}, remove spin info.",
+ stacklevel=2
+ )
🧰 Tools
🪛 Ruff (0.8.2)
326-326: No explicit `stacklevel` keyword argument found (B028)
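The effect of `stacklevel` can be checked in isolation: with the default `stacklevel=1` the warning is attributed to the line inside the helper, while `stacklevel=2` attributes it to the caller's line (toy functions for illustration, not dpdata's code):

```python
import warnings

def helper_default():
    warnings.warn("attributed here, inside the helper")

def helper_stacklevel2():
    warnings.warn("attributed to the caller", stacklevel=2)

def emit_and_get_lineno(fn):
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        fn()  # with stacklevel=2, the warning points at this call site
        return caught[0].lineno

lineno_default = emit_and_get_lineno(helper_default)
lineno_caller = emit_and_get_lineno(helper_stacklevel2)
```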
warnings.warn(f"Error processing spin data: {str(e)}")
return None
🛠️ Refactor suggestion
Include the `stacklevel` parameter in the `warnings.warn` call for correct warning location. Adding `stacklevel=2` to the `warnings.warn()` call helps users trace the origin of the warning more accurately.
Apply this diff to fix the issue:
- warnings.warn(f"Error processing spin data: {str(e)}")
+ warnings.warn(f"Error processing spin data: {str(e)}", stacklevel=2)
🧰 Tools
🪛 Ruff (0.8.2)
278-278: No explicit `stacklevel` keyword argument found (B028)
warnings.warn(f"Input file {inputfile} not found.")
return None
🛠️ Refactor suggestion
Add the `stacklevel` parameter to `warnings.warn` to provide the correct warning location. When issuing warnings with `warnings.warn()`, it is recommended to include the `stacklevel` parameter so that the warning message points to the location in the user's code where the issue occurred. Without it, the warning may point inside the library code instead of the user's code.
Apply this diff to fix the issue:
- warnings.warn(f"Input file {inputfile} not found.")
+ warnings.warn(f"Input file {inputfile} not found.", stacklevel=2)
🧰 Tools
🪛 Ruff (0.8.2)
218-218: No explicit `stacklevel` keyword argument found (B028)
warnings.warn(
    "pp_file is not provided, will use empty string for pseudo potential file."
)
🛠️ Refactor suggestion
Include the `stacklevel` parameter in `warnings.warn` for correct warning location. Adding `stacklevel=2` helps users trace the origin of the warning more accurately.
Apply this diff:
- warnings.warn(
- "pp_file is not provided, will use empty string for pseudo potential file."
- )
+ warnings.warn(
+ "pp_file is not provided, will use empty string for pseudo potential file.",
+ stacklevel=2
+ )
🧰 Tools
🪛 Ruff (0.8.2)
773-773: No explicit `stacklevel` keyword argument found (B028)
@@ -115,15 +116,15 @@ class Errors(ErrorsBase):
     SYSTEM_TYPE = LabeledSystem

     @property
-    @lru_cache()
+    @lru_cache
Warning: Potential memory leaks with @lru_cache on methods.
Using `@lru_cache` on methods can lead to memory leaks because the cache holds strong references to the instances. Consider one of these alternatives:
- Move the cached computation to a static method
- Use `@property` without caching
- Implement custom caching with weak references
Example fix using static method:
- @property
- @lru_cache
- def e_errors(self) -> np.ndarray:
+ @staticmethod
+ @lru_cache
+ def _compute_e_errors(system_1: LabeledSystem, system_2: LabeledSystem) -> np.ndarray:
"""Energy errors."""
- assert isinstance(self.system_1, self.SYSTEM_TYPE)
- assert isinstance(self.system_2, self.SYSTEM_TYPE)
- return self.system_1["energies"] - self.system_2["energies"]
+ return system_1["energies"] - system_2["energies"]
+
+ @property
+ def e_errors(self) -> np.ndarray:
+ """Energy errors."""
+ return self._compute_e_errors(self.system_1, self.system_2)
Also applies to: 127-127, 156-156, 169-169
🧰 Tools
🪛 Ruff (0.8.2)
119-119: Use of `functools.lru_cache` or `functools.cache` on methods can lead to memory leaks (B019)
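Besides the alternatives listed above, `functools.cached_property` (Python ≥3.8) sidesteps the leak entirely: the result is stored in the instance's own `__dict__`, so the cache is released together with the instance (a toy class for illustration, not the actual `Errors` implementation):

```python
from functools import cached_property

class Errors:
    def __init__(self, energies_1, energies_2):
        self.energies_1 = energies_1
        self.energies_2 = energies_2
        self.calls = 0  # counts how often the computation actually runs

    @cached_property
    def e_errors(self):
        """Computed once per instance; cached in the instance __dict__."""
        self.calls += 1
        return [a - b for a, b in zip(self.energies_1, self.energies_2)]

err = Errors([1.0, 2.0], [0.5, 1.0])
first = err.e_errors
second = err.e_errors  # served from the instance cache, no recomputation
```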
def scipy_constant_value(key: str) -> float:
    return physical_constants[key]
🛠️ Refactor suggestion
Add error handling for missing constants.
The function should handle KeyError gracefully when an invalid key is provided.
def scipy_constant_value(key: str) -> float:
- return physical_constants[key]
+ try:
+ return physical_constants[key]
+ except KeyError as e:
+ raise ValueError(f"Unknown physical constant: {key}") from e
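One more detail worth noting: `scipy.constants.physical_constants` maps each key to a `(value, unit, uncertainty)` tuple, so returning a bare `float` also requires unpacking the tuple. A self-contained sketch using a stand-in dict (so it runs without scipy; the Rydberg entry mirrors the CODATA value):

```python
# Stand-in for scipy.constants.physical_constants:
# each entry is a (value, unit, uncertainty) tuple.
physical_constants = {
    "Rydberg constant times hc in eV": (13.605693122994, "eV", 2.6e-11),
}

def scipy_constant_value(key: str) -> float:
    try:
        value, _unit, _uncertainty = physical_constants[key]
    except KeyError as e:
        raise ValueError(f"Unknown physical constant: {key}") from e
    return value
```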
assert False, "Unrecongnized coordinate type, %s, line:%d" % (  # noqa: UP031
    loglines[i].split()[0],
    i,
)
Replace `assert False` with `raise AssertionError`. The use of `assert False` is discouraged because assertions can be disabled with Python's -O flag; replace it with an explicit exception raise.
- assert False, "Unrecongnized coordinate type, %s, line:%d" % ( # noqa: UP031
- loglines[i].split()[0],
- i,
- )
+ raise AssertionError(
+ "Unrecongnized coordinate type, %s, line:%d"
+ % (loglines[i].split()[0], i)
+ )
🧰 Tools
🪛 Ruff (0.8.2)
79-79: Do not `assert False` (`python -O` removes these calls); raise `AssertionError()` instead of `assert False` (B011)
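The suggested pattern in runnable form — an explicit `raise` executes even under `python -O`, which strips `assert` statements (a hypothetical `parse_coord_type` helper, not the pwmat parser itself):

```python
KNOWN_COORD_TYPES = {"Direct", "Cartesian"}

def parse_coord_type(token: str, lineno: int) -> str:
    if token in KNOWN_COORD_TYPES:
        return token
    # An explicit raise survives `python -O`; a bare `assert False` would
    # be stripped and the bad input would pass through silently.
    raise AssertionError(
        "Unrecognized coordinate type, %s, line:%d" % (token, lineno)
    )
```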
- python <3.13
- mamba <2
💡 Codebase verification
Consider updating the mamba version constraint. The current constraint `mamba <2` is too restrictive, as Mamba 2.x is now stable and widely available. Consider using `mamba <3` or `mamba >=1.5.0,<3` to ensure access to the latest stable features and fixes.
🔗 Analysis chain
LGTM! Version constraints look appropriate. The version constraints are well-chosen:
- Python <3.13 prevents future compatibility issues
- mamba <2 prevents potential breaking changes in the documentation build

Run this script to verify mamba version compatibility:
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check mamba version compatibility
# Verify if mamba <2 is available in conda-forge
curl -s https://api.anaconda.org/package/conda-forge/mamba | jq -r '.versions[]' | sort -V
Length of output: 1102
Summary by CodeRabbit

I'll craft concise release notes focusing on the key user-facing changes across the dpdata library:

Release Notes
- Python Version Support
- New Features
- Documentation: switched the theme from `sphinx_rtd_theme` to `sphinx-book-theme`
- Bug Fixes
- Performance