Merge pull request #426 from AllenInstitute/dev
try update to python 3.11
rcpeene authored Nov 26, 2024
2 parents 4349c2e + 24d8328 commit ad0cd95
Showing 10 changed files with 1,139 additions and 185 deletions.
37 changes: 22 additions & 15 deletions .github/workflows/build.yml
@@ -4,14 +4,16 @@ on:
push:
branches: [ "main" ]
pull_request:
branches: [ "main" ]
branches: [ "main" , "dev"]

workflow_dispatch:

jobs:
build:
runs-on:
group: LargerInstance
container:
image: rcpeene/openscope_databook:latest

env:
DANDI_API_KEY: ${{ secrets.DANDI_API_KEY }}
@@ -21,24 +23,27 @@ jobs:
- uses: actions/checkout@v3
with:
fetch-depth: 0
ref: main
ref: ${{ github.ref }}

# - name: Set up Python
# uses: actions/setup-python@v4
# with:
# python-version: "3.11"

- name: Set up Python
uses: actions/setup-python@v4
# with:
# python-version: "3.9"
# - name: Upgrading pip
# run: pip install --upgrade pip

- name: Upgrading pip
run: pip install --upgrade pip
# - name: Install deps
# run: pip install cython numpy

- name: Install deps
run: pip install cython numpy
- name: pip freeze
run: pip freeze

- name: Installing package
run: pip install -e .
# - name: Installing packages again (this prevents a weird error)
# run: pip install -r requirements.txt

- name: Installing packages again (this prevents a weird error)
run: pip install -r requirements.txt
# - name: Installing package
# run: pip install -e .

- name: Installing build dependencies
run: |
@@ -82,7 +87,9 @@ jobs:
rm ./docs/embargoed/*.nwb
- name: Printing log
run: git status
run: |
git config --global --add safe.directory /__w/openscope_databook/openscope_databook
git status
- name: Printing shortlog
run: git log | git shortlog -sn
18 changes: 10 additions & 8 deletions .github/workflows/test.yml
@@ -11,6 +11,8 @@ jobs:
test:
runs-on:
group: LargerInstance
container:
image: rcpeene/openscope_databook:latest

env:
DANDI_API_KEY: ${{ secrets.DANDI_API_KEY }}
@@ -19,20 +21,20 @@
steps:
- uses: actions/checkout@v3

- name: Upgrading pip
run: pip install --upgrade pip
# - name: Upgrading pip
# run: pip install --upgrade pip

- name: print environment
run: pip freeze

- name: Install cython
run: pip install cython numpy
# - name: Install cython
# run: pip install cython numpy

- name: Installing package
run: pip install -e .
# - name: Installing package
# run: pip install -e .

- name: Installing requirements
run: pip install -r ./requirements.txt
# - name: Installing requirements
# run: pip install -r ./requirements.txt

- name: Installing build dependencies
run: |
22 changes: 22 additions & 0 deletions Dockerfile
@@ -0,0 +1,22 @@
FROM ubuntu:22.04
# base requirements
RUN apt-get update
RUN apt-get install -y coreutils
RUN apt-get install -y libgl1-mesa-glx
RUN apt-get install -y libglib2.0-0
RUN apt-get install -y python3 python3-pip
RUN apt-get install -y git

RUN git config --global --add safe.directory /__w/openscope_databook/openscope_databook

# copy databook setup files
COPY requirements.txt ./openscope_databook/requirements.txt
COPY setup.py ./openscope_databook/setup.py
COPY README.md ./openscope_databook/README.md
COPY LICENSE.txt ./openscope_databook/LICENSE.txt
COPY databook_utils ./openscope_databook/databook_utils

# for reasons I don't understand, these must be installed before the rest of the requirements
RUN pip install numpy cython
# set up databook dependencies
RUN pip install -e ./openscope_databook[dev]
2 changes: 1 addition & 1 deletion docs/contribution.md
@@ -8,7 +8,7 @@ The Databook can be forked via the GitHub Web UI from the Databook's [GitHub rep
## Initialize Locally
A local repo can be made by pressing the `code` button on the front page of the forked repo, and copying the HTTPS url. Then locally, run the command `git clone <copied_url_here>`. For more information on cloning GitHub repos, check out GitHub's [Cloning a Repository](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository) Page.

Then the environment must be set up. You may set up a conda environment if you don't want to interfere with your local environment. After installing conda, this can be done with the commands `conda create --name databook python=3.9` followed by `activate databook` (Windows) or `source activate databook` (Mac/Linux). Within or without the conda environment, the dependencies for the databook can be installed by navigating to the openscope_databook directory and running `pip install -e . --user`.
Then the environment must be set up. You may set up a conda environment if you don't want to interfere with your local environment. After installing conda, this can be done with the commands `conda create --name databook python=3.11` followed by `activate databook` (Windows) or `source activate databook` (Mac/Linux). Within or without the conda environment, the dependencies for the databook can be installed by navigating to the openscope_databook directory and running `pip install -e . --user`.

Finally, notebooks can be run with Jupyter notebook by running `jupyter notebook ./docs`
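For reference, the full setup sequence described above can be condensed into the following shell sketch (a minimal example, assuming the repository was cloned into the default `openscope_databook` directory and that conda is used):

```bash
# create and activate an isolated environment with Python 3.11
conda create --name databook python=3.11
activate databook            # Windows; use `source activate databook` on Mac/Linux

# install the databook and its dependencies in editable mode
cd openscope_databook
pip install -e . --user

# run the notebooks with Jupyter
jupyter notebook ./docs
```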

89 changes: 35 additions & 54 deletions docs/embargoed/cell_matching.ipynb
@@ -50,12 +50,10 @@
"import json\n",
"import os\n",
"\n",
"import matplotlib as mpl\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"\n",
"from PIL import Image\n",
"from time import sleep"
"from PIL import Image"
]
},
{
@@ -93,6 +91,13 @@
"id": "77d78e7d",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"A newer version (0.63.1) of dandi/dandi-cli is available. You are using 0.61.2\n"
]
},
{
"name": "stdout",
"output_type": "stream",
@@ -255,66 +260,42 @@
"name": "stderr",
"output_type": "stream",
"text": [
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:NwayMatching:NWAY_COMMIT_SHA None\n",
"INFO:NwayMatching:Nway matching version 0.6.0\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"WARNING:root:many=True not supported from argparse\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:PairwiseMatching:Matching 1193675753 to 1194754135\n",
"INFO:PairwiseMatching:Matching 1193675753 to 1194754135: best registration was ['Crop', 'CLAHE', 'PhaseCorrelate']\n",
"multiprocessing.pool.RemoteTraceback: \n",
"\"\"\"\n",
"Traceback (most recent call last):\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 125, in worker\n",
" result = (True, func(*args, **kwds))\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 48, in mapstar\n",
" return list(map(*args))\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 121, in pair_match_job\n",
" pair_match.run()\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\pairwise_matching.py\", line 495, in run\n",
" segmask_moving_3d_registered = transform_mask(\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\pairwise_matching.py\", line 384, in transform_mask\n",
" dtype=np.int)\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\numpy\\__init__.py\", line 338, in __getattr__\n",
" raise AttributeError(__former_attrs__[attr])\n",
"AttributeError: module 'numpy' has no attribute 'int'.\n",
"`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.\n",
"The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:\n",
" https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n",
"\"\"\"\n",
"\n",
"The above exception was the direct cause of the following exception:\n",
"\n",
"Traceback (most recent call last):\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\runpy.py\", line 196, in _run_module_as_main\n",
" return _run_code(code, main_globals, None,\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\runpy.py\", line 86, in _run_code\n",
" exec(code, run_globals)\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 502, in <module>\n",
" nmod.run()\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 462, in run\n",
" self.pair_matches = pool.map(pair_match_job, pair_arg_list)\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 367, in map\n",
" return self._map_async(func, iterable, mapstar, chunksize).get()\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 774, in get\n",
" raise self._value\n",
"AttributeError: module 'numpy' has no attribute 'int'.\n",
"`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.\n",
"The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:\n",
" https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n"
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\utils.py:48: FutureWarning: In a future version of pandas all arguments of DataFrame.sort_index will be keyword-only.\n",
" df = df.sort_index(0)\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\utils.py:49: FutureWarning: In a future version of pandas all arguments of DataFrame.sort_index will be keyword-only.\n",
" df = df.sort_index(1)\n",
"INFO:NwayMatching:registration success(1) or failure (0):\n",
" 0 1\n",
"0 1 1\n",
"1 1 1\n",
"id map{\n",
" \"0\": 1193675753,\n",
" \"1\": 1194754135\n",
"}\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\nway_matching.py:208: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.\n",
" matching_frame = matching_frame.append(pairframe)\n",
"INFO:NwayMatching:Nway matching is done!\n",
"INFO:NwayMatching:Creating match summary plots\n",
"WARNING:root:setting Dict fields not supported from argparse\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\argschema\\utils.py:346: FutureWarning: '--nway_output.nway_matches' is using old-style command-line syntax with each element as a separate argument. This will not be supported in argschema after 2.0. See http://argschema.readthedocs.io/en/master/user/intro.html#command-line-specification for details.\n",
" warnings.warn(warn_msg, FutureWarning)\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:NwayMatching:wrote matching_output\\nway_match_fraction_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote matching_output\\nway_warp_overlay_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote matching_output\\nway_warp_summary_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote ./output.json\n"
]
}
],
"source": [
"!python -m nway.nway_matching --input_json input.json --output_json \"./output.json\" --output_dir matching_output"
"!python3 -m nway.nway_matching --input_json input.json --output_json \"./output.json\" --output_dir matching_output"
]
},
{
@@ -385,7 +366,7 @@
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x1c3b53e35b0>"
"<matplotlib.image.AxesImage at 0x21dff47bfa0>"
]
},
"execution_count": 13,
@@ -421,7 +402,7 @@
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x1c3b7dbdf00>"
"<matplotlib.image.AxesImage at 0x21dff4fe680>"
]
},
"execution_count": 14,