PYTHON-4687: Make lint fixes provided by pre-commit #41

Merged · 3 commits · Oct 3, 2024
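The fixes in this PR come from the same hooks the CI lint job runs (see the `.github/workflows/lint.yml` change below). As a rough local reproduction, assuming the repository's `.pre-commit-config.yaml` defines these hooks at the manual stage, something like the following sketch should yield the same edits:

```bash
# Install pre-commit, then run the manual-stage hooks across all files,
# mirroring the commands in .github/workflows/lint.yml
python -m pip install -U pip pre-commit
pre-commit run --hook-stage=manual --all-files
```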
4 changes: 2 additions & 2 deletions .evergreen/config.yml
@@ -187,7 +187,7 @@ buildvariants:
- rhel87-small
tasks:
- name: test-llama-index

- name: test-docarray-rhel
display_name: DocArray RHEL
expansions:
@@ -210,4 +210,4 @@
run_on:
- rhel87-small
tasks:
- name: test-autogen
- name: test-autogen
2 changes: 0 additions & 2 deletions .evergreen/provision-atlas.sh
@@ -7,9 +7,7 @@ PYTHON_BINARY=$(find_python3)
# Should be called from src
EVERGREEN_PATH=$(pwd)/.evergreen
TARGET_DIR=$(pwd)/$DIR
PING_ATLAS=$EVERGREEN_PATH/ping_atlas.py
SCAFFOLD_SCRIPT=$EVERGREEN_PATH/scaffold_atlas.py
DEPLOYMENT_NAME=$DIR

set -ex
mkdir atlas
2 changes: 1 addition & 1 deletion .evergreen/scaffold_atlas.py
@@ -36,7 +36,7 @@ def upload_data(db: Database, filename: Path) -> None:
loaded_collection = json.load(f)

logger.info(
"Loading %s to Atlas database %s in colleciton %s",
"Loading %s to Atlas database %s in collection %s",
filename.name,
db.name,
collection_name,
6 changes: 3 additions & 3 deletions .evergreen/utils.sh
@@ -80,7 +80,7 @@ setup_local_atlas() {
*)
echo "Unrecognized state $STATE"
sleep 1
esac
esac
done

echo "container did not get healthy within 120 seconds, quitting"
@@ -92,14 +92,14 @@ setup_local_atlas() {
wait "$CONTAINER_ID"
EXPOSED_PORT=$(podman inspect --format='{{ (index (index .NetworkSettings.Ports "27017/tcp") 0).HostPort }}' "$CONTAINER_ID")
export CONN_STRING="mongodb://127.0.0.1:$EXPOSED_PORT/?directConnection=true"
# shellcheck disable=SC2154
echo "CONN_STRING=mongodb://127.0.0.1:$EXPOSED_PORT/?directConnection=true" > $workdir/src/.evergreen/.local_atlas_uri
}

fetch_local_atlas_uri() {
# shellcheck disable=SC2154
. $workdir/src/.evergreen/.local_atlas_uri

export CONN_STRING=$CONN_STRING
echo "$CONN_STRING"
}


2 changes: 1 addition & 1 deletion .github/workflows/lint.yml
@@ -26,4 +26,4 @@ jobs:
python -m pip install -U pip pre-commit
- name: Run linters
run: |
pre-commit run --hook-stage=manual --all-files
pre-commit run --hook-stage=manual --all-files
1 change: 0 additions & 1 deletion .gitignore
@@ -49,4 +49,3 @@ xunit-results/

# Miscellaneous
.DS_Store

2 changes: 1 addition & 1 deletion README.md
@@ -30,7 +30,7 @@ Within each subdirectory you should expect to have:
The general layout of this repo looks like this:

```bash
├── LICENSE # License Agreeement
├── LICENSE # License Agreement
├── README.md # This Document
├── langchain-python # Folder scoped for one Integration
│ └── run.sh # Script that executes test
4 changes: 3 additions & 1 deletion autogen/run.sh
@@ -6,18 +6,20 @@

set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh
PYTHON_BINARY=$(find_python3)
$PYTHON_BINARY -c "import sys; print(f'Python version found: {sys.version_info}')"

# Create and activate an isolated python venv environment
$PYTHON_BINARY -m venv venv
source venv/bin/activate
. venv/bin/activate
# Install autogen with extras
$PYTHON_BINARY -m pip install .[test,"retrievechat-mongodb"]


# Run tests. Sensitive variables in Evergreen come from Evergreen project: ai-ml-pipeline-testing/
# shellcheck disable=SC2154
MONGODB_URI=$autogen_mongodb_uri \
MONGODB_DATABASE="autogen_test_db" \
$PYTHON_BINARY -m pytest -v test/agentchat/contrib/vectordb/test_mongodb.py
4 changes: 3 additions & 1 deletion chatgpt-retrieval-plugin/run.sh
@@ -4,6 +4,7 @@

set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh

PYTHON_BINARY=$(find_python3)
@@ -14,13 +15,14 @@ $PYTHON_BINARY -m pip install -U pip poetry
# Create a package specific poetry environment
$PYTHON_BINARY -m poetry env use $PYTHON_BINARY
# Activate the poetry env, which itself does not include poetry
source $($PYTHON_BINARY -m poetry env info --path)/bin/activate
. "$($PYTHON_BINARY -m poetry env info --path)/bin/activate"
# Recreate the poetry lock file
$PYTHON_BINARY -m poetry lock --no-update
# Install from pyproject.toml into package specific environment
$PYTHON_BINARY -m poetry install --with dev

# Run tests. Sensitive variables in Evergreen come from Evergeen project: ai-ml-pipeline-testing/
# shellcheck disable=SC2154
OPENAI_API_KEY=$openai_api_key \
DATASTORE="mongodb-atlas" \
BEARER_TOKEN="staylowandkeepmoving" \
6 changes: 4 additions & 2 deletions docarray/run.sh
@@ -5,13 +5,14 @@

set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh
PYTHON_BINARY=$(find_python3)
$PYTHON_BINARY -c "import sys; print(f'Python version found: {sys.version_info}')"

# Create and activate an isolated python venv environment
$PYTHON_BINARY -m venv venv
source venv/bin/activate
. venv/bin/activate
# Install Poetry
pip install -U pip poetry
# Recreate the poetry lock file
@@ -21,6 +22,7 @@ poetry install --with dev --extras mongo


# Run tests. Sensitive variables in Evergreen come from Evergeen project: ai-ml-pipeline-testing/
# shellcheck disable=SC2154
MONGODB_URI=$docarray_mongodb_uri \
MONGODB_DATABASE="docarray_test_db" \
pytest -v tests/index/mongo_atlas
pytest -v tests/index/mongo_atlas
7 changes: 6 additions & 1 deletion langchain-python/run.sh
@@ -3,10 +3,12 @@
# WORKING_DIR = src/langchain-python/langchain
set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh

PYTHON_BINARY=$(find_python3)

# shellcheck disable=SC2164
cd libs/partners/mongodb

$PYTHON_BINARY -m venv venv_pipeline
@@ -18,7 +20,10 @@ poetry lock --no-update

poetry install --with test --with test_integration

export MONGODB_ATLAS_URI=$(fetch_local_atlas_uri)
MONGODB_ATLAS_URI=$(fetch_local_atlas_uri)

export MONGODB_ATLAS_URI
# shellcheck disable=SC2154
export OPENAI_API_KEY=$openai_api_key

make test
7 changes: 5 additions & 2 deletions llama-index-python-kvstore/run.sh
@@ -2,21 +2,23 @@

set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh

CONN_STRING=$(fetch_local_atlas_uri)
PYTHON_BINARY=$(find_python3)
$PYTHON_BINARY -c "import sys; print(f'Python version found: {sys.version_info}')"

# cd to the MongoDB integration. It has its own project
# shellcheck disable=SC2164
cd llama-index-integrations/storage/kvstore/llama-index-storage-kvstore-mongodb

# Install Poetry into base python
$PYTHON_BINARY -m pip install -U pip poetry
# Create a package specific poetry environment
$PYTHON_BINARY -m poetry env use $PYTHON_BINARY
# Activate the poetry env, which itself does not include poetry
source $($PYTHON_BINARY -m poetry env info --path)/bin/activate
. "$($PYTHON_BINARY -m poetry env info --path)/bin/activate"
# PYTHON-4522: Will fix requirement in llama-index repo
$PYTHON_BINARY -m poetry add motor
# Recreate the poetry lock file
@@ -25,6 +27,7 @@ $PYTHON_BINARY -m poetry lock --no-update
$PYTHON_BINARY -m poetry install --with dev

# Run tests. Sensitive variables in Evergreen come from Evergreen project: ai-ml-pipeline-testing/
# shellcheck disable=SC2154
OPENAI_API_KEY=$openai_api_key \
MONGODB_URI=$CONN_STRING \
MONGODB_DATABASE="llama_index_test_db" \
5 changes: 4 additions & 1 deletion llama-index-python-vectorstore/run.sh
@@ -2,26 +2,29 @@

set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh

PYTHON_BINARY=$(find_python3)
$PYTHON_BINARY -c "import sys; print(f'Python version found: {sys.version_info}')"

# cd to the MongoDB integration. It has its own project
# shellcheck disable=SC2164
cd llama-index-integrations/vector_stores/llama-index-vector-stores-mongodb

# Install Poetry into base python
$PYTHON_BINARY -m pip install -U pip poetry
# Create a package specific poetry environment
$PYTHON_BINARY -m poetry env use $PYTHON_BINARY
# Activate the poetry env, which itself does not include poetry
source $($PYTHON_BINARY -m poetry env info --path)/bin/activate
. "$($PYTHON_BINARY -m poetry env info --path)/bin/activate"
# Recreate the poetry lock file
$PYTHON_BINARY -m poetry lock --no-update
# Install from pyproject.toml into package specific environment
$PYTHON_BINARY -m poetry install --with dev

# Run tests. Sensitive variables in Evergreen come from Evergreen project: ai-ml-pipeline-testing/
# shellcheck disable=SC2154
MONGODB_URI=$(fetch_local_atlas_uri) \
OPENAI_API_KEY=$openai_api_key \
MONGODB_DATABASE="llama_index_test_db" \
3 changes: 2 additions & 1 deletion semantic-kernel-csharp/run.sh
@@ -2,6 +2,7 @@

set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh
# WORKING_DIR = src/semantic-kernel-csharp/semantic-kernel

@@ -20,4 +21,4 @@ sed -i -e 's/"MongoDB Atlas cluster is required"/null/g' dotnet/src/IntegrationT
# Run tests
echo "Running MongoDBMemoryStoreTests"
MongoDB__ConnectionString=$(fetch_local_atlas_uri) \
$DOTNET_SDK_PATH/dotnet test dotnet/src/IntegrationTests/IntegrationTests.csproj --filter SemanticKernel.IntegrationTests.Connectors.MongoDB.MongoDBMemoryStoreTests
$DOTNET_SDK_PATH/dotnet test dotnet/src/IntegrationTests/IntegrationTests.csproj --filter SemanticKernel.IntegrationTests.Connectors.MongoDB.MongoDBMemoryStoreTests
4 changes: 4 additions & 0 deletions semantic-kernel-python/run.sh
@@ -2,12 +2,14 @@

set -x

# shellcheck disable=SC2154
. $workdir/src/.evergreen/utils.sh

CONN_STRING=$(fetch_local_atlas_uri)
PYTHON_BINARY=$(find_python3)

# WORKING_DIR = src/semantic-kernel-python/semantic-kernel
# shellcheck disable=SC2164
cd python

# Temporary solution until https://github.com/microsoft/semantic-kernel/issues/9067 resolves
@@ -19,6 +21,7 @@ make install-python
make install-sk
make install-pre-commit

# shellcheck disable=SC2154
OPENAI_API_KEY=$openai_api_key \
OPENAI_ORG_ID="" \
AZURE_OPENAI_DEPLOYMENT_NAME="" \
Expand All @@ -28,6 +31,7 @@ OPENAI_API_KEY=$openai_api_key \
Python_Integration_Tests=1 \
uv run pytest tests/integration/connectors/memory/test_mongodb_atlas.py -k test_collection_knn

# shellcheck disable=SC2154
OPENAI_API_KEY=$openai_api_key \
OPENAI_ORG_ID="" \
AZURE_OPENAI_DEPLOYMENT_NAME="" \