Revert "Dra-update-63.1" #37

Merged
1 commit merged on Jun 18, 2024
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 0.63.1
current_version = 0.59.1
commit = False
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(\-[a-z]+)?
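For reference, the parse pattern in the hunk above is a plain regular expression with named groups. A minimal Python sketch (illustrative only, not part of the diff) showing how that pattern splits a version string:

```python
import re

# Same pattern as the .bumpversion.cfg `parse` setting shown above:
# named groups for major/minor/patch plus an optional lowercase suffix.
VERSION_PATTERN = re.compile(
    r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(\-[a-z]+)?"
)

match = VERSION_PATTERN.fullmatch("0.59.1")
if match:
    print(match.group("major"), match.group("minor"), match.group("patch"))
    # -> 0 59 1
```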
2 changes: 1 addition & 1 deletion .devcontainer/python-connectors-generic/devcontainer.json
@@ -2,7 +2,7 @@
{
"name": "Python Development DevContainer (Generic)",

"image": "mcr.microsoft.com/devcontainers/python:1-3.10",
"image": "mcr.microsoft.com/devcontainers/python:0-3.10",
"features": {
"ghcr.io/devcontainers/features/docker-in-docker": {},
"ghcr.io/devcontainers/features/python:1": {
1 change: 0 additions & 1 deletion .github/CODEOWNERS
@@ -4,7 +4,6 @@
/airbyte-integrations/connectors/destination-milvus @airbytehq/ai-language-models
/airbyte-integrations/connectors/destination-qdrant @airbytehq/ai-language-models
/airbyte-integrations/connectors/destination-chroma @airbytehq/ai-language-models
/airbyte-integrations/connectors/destination-snowflake-cortex @airbytehq/ai-language-models
/airbyte-cdk/python/airbyte_cdk/destinations/vector_db_based @airbytehq/ai-language-models

# CI/CD
15 changes: 13 additions & 2 deletions .github/workflows/airbyte-ci-tests.yml
@@ -35,7 +35,6 @@ jobs:
- airbyte-ci/connectors/pipelines/**
- airbyte-ci/connectors/base_images/**
- airbyte-ci/connectors/common_utils/**
- airbyte-ci/connectors/connectors_insights/**
- airbyte-ci/connectors/connector_ops/**
- airbyte-ci/connectors/connectors_qa/**
- airbyte-ci/connectors/ci_credentials/**
@@ -47,13 +46,25 @@
run-tests:
needs: changes
# We only run the Internal Poetry packages CI job if there are changes to the packages on a non-forked PR
if: needs.changes.outputs.internal_poetry_packages == 'true' && github.event.pull_request.head.repo.fork != true
if: needs.changes.outputs.internal_poetry_packages == 'true'
name: Internal Poetry packages CI
runs-on: tooling-test-large
permissions:
pull-requests: read
statuses: write
steps:
# The run-tests job will be triggered if a fork made changes to the internal poetry packages.
# We don't want forks to make changes to the internal poetry packages.
# So we fail the job if the PR is from a fork, it will make the required CI check fail.
- name: Check if PR is from a fork
id: check-if-pr-is-from-fork
if: github.event_name == 'pull_request'
shell: bash
run: |
if [ "${{ github.event.pull_request.head.repo.fork }}" == "true" ]; then
echo "PR is from a fork. Exiting workflow..."
exit 78
fi
- name: Checkout Airbyte
uses: actions/checkout@v4
with:
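The "Check if PR is from a fork" step added above gates the job with a shell check on github.event.pull_request.head.repo.fork. As a rough illustration of the same decision in Python (a hypothetical helper, assuming it runs inside a GitHub Actions job where GITHUB_EVENT_PATH points at the webhook payload):

```python
import json
import os
import sys


def fail_if_fork() -> None:
    """Exit non-zero when the pull request comes from a fork.

    Mirrors the bash step in airbyte-ci-tests.yml above; the payload layout
    (pull_request.head.repo.fork) is the standard GitHub webhook schema.
    """
    event_path = os.environ.get("GITHUB_EVENT_PATH")
    if not event_path:
        return  # Not running inside GitHub Actions; nothing to check.
    with open(event_path) as f:
        event = json.load(f)
    if event.get("pull_request", {}).get("head", {}).get("repo", {}).get("fork"):
        print("PR is from a fork. Exiting workflow...")
        # The actual step uses `exit 78`; any non-zero exit code fails the job.
        sys.exit(1)


if __name__ == "__main__":
    fail_if_fork()
```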
32 changes: 0 additions & 32 deletions .github/workflows/auto_merge.yml

This file was deleted.

55 changes: 0 additions & 55 deletions .github/workflows/community_ci.yml
@@ -193,58 +193,3 @@ jobs:
name: pipeline-reports
path: /home/runner/work/airbyte/airbyte/airbyte-ci/connectors/pipelines/pipeline_reports/airbyte-ci/connectors/test/pull_request/**/output.html
retention-days: 7

internal_poetry_packages_ci:
name: Internal Poetry packages CI
if: github.event.pull_request.head.repo.fork == true
# Deployment of jobs on the community-ci environment requires manual approval
# This is something we set up in the GitHub environment settings:
# https://github.com/airbytehq/airbyte/settings/environments/2091483613/edit
# This is a safety measure to make sure the code running on our infrastructure has been reviewed before running on it
needs: fail_on_protected_path_changes
environment: community-ci
runs-on: community-tooling-test-small
timeout-minutes: 180 # 3 hours
permissions:
statuses: write
env:
MAIN_BRANCH_NAME: "master"

steps:
# This checkouts a fork which can contain untrusted code
# It's deemed safe as the community-ci environment requires manual reviewer approval to run
- name: Checkout fork
uses: actions/checkout@v4
with:
repository: ${{ github.event.pull_request.head.repo.full_name }}
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 1

# This will sync the .github folder of the main repo with the fork
# This allows us to use up to date actions and CI logic from the main repo
- name: Pull .github folder from main repository
id: pull_github_folder
run: |
git remote add main https://github.com/airbytehq/airbyte.git
git fetch main ${MAIN_BRANCH_NAME}
git checkout main/${MAIN_BRANCH_NAME} -- .github
git checkout main/${MAIN_BRANCH_NAME} -- airbyte-ci

- name: Run poe tasks for modified internal packages [PULL REQUEST]
# This path refers to the fork .github folder.
# We make sure its content is in sync with the main repo .github folder by pulling it in the previous step
id: run-airbyte-ci-test-pr
uses: ./.github/actions/run-airbyte-ci
with:
context: "pull_request"
dagger_cloud_token: ${{ secrets.DAGGER_CLOUD_TOKEN_2 }}
docker_hub_password: ${{ secrets.DOCKER_HUB_PASSWORD }}
docker_hub_username: ${{ secrets.DOCKER_HUB_USERNAME }}
gcp_gsm_credentials: ${{ secrets.GCP_GSM_CREDENTIALS }}
gcs_credentials: ${{ secrets.METADATA_SERVICE_PROD_GCS_CREDENTIALS }}
git_repo_url: ${{ github.event.pull_request.head.repo.clone_url }}
git_branch: ${{ github.head_ref }}
git_revision: ${{ github.event.pull_request.head.sha }}
github_token: ${{ github.token }}
subcommand: "test --modified"
is_fork: "true"
49 changes: 0 additions & 49 deletions .github/workflows/connectors_insights.yml

This file was deleted.

1 change: 0 additions & 1 deletion .github/workflows/connectors_version_increment_check.yml
@@ -22,7 +22,6 @@ jobs:
connectors_ci:
name: Connectors Version Increment Check
runs-on: connector-test-large
if: github.event.pull_request.head.repo.fork != true
timeout-minutes: 10
steps:
- name: Checkout Airbyte
3 changes: 3 additions & 0 deletions .github/workflows/format-fix-command.yml
@@ -72,6 +72,9 @@ jobs:
continue-on-error: true
with:
context: "manual"
dagger_cloud_token: ${{ secrets.DAGGER_CLOUD_TOKEN_2 }}
docker_hub_password: ${{ secrets.DOCKER_HUB_PASSWORD }}
docker_hub_username: ${{ secrets.DOCKER_HUB_USERNAME }}
gcs_credentials: ${{ secrets.METADATA_SERVICE_PROD_GCS_CREDENTIALS }}
sentry_dsn: ${{ secrets.SENTRY_AIRBYTE_CI_DSN }}
github_token: ${{ secrets.GH_PAT_MAINTENANCE_OCTAVIA }}
25 changes: 2 additions & 23 deletions .github/workflows/metadata_service_deploy_orchestrator_dagger.yml
@@ -2,10 +2,6 @@ name: Connector Ops CI - Metadata Service Deploy Orchestrator

on:
workflow_dispatch:
inputs:
deployment_target:
description: "The deployment target for the metadata orchestrator (prod or dev)"
default: "dev"
push:
branches:
- master
@@ -18,9 +14,8 @@ jobs:
steps:
- name: Checkout Airbyte
uses: actions/checkout@v2
- name: Deploy the metadata orchestrator [On merge to master]
id: metadata-orchestrator-deploy-orchestrator-pipeline-prod
if: github.event_name == 'push'
- name: Deploy the metadata orchestrator
id: metadata-orchestrator-deploy-orchestrator-pipeline
uses: ./.github/actions/run-airbyte-ci
with:
subcommand: "metadata deploy orchestrator"
@@ -32,19 +27,3 @@ jobs:
gcp_gsm_credentials: ${{ secrets.GCP_GSM_CREDENTIALS }}
env:
DAGSTER_CLOUD_METADATA_API_TOKEN: ${{ secrets.DAGSTER_CLOUD_METADATA_API_TOKEN }}
DAGSTER_CLOUD_DEPLOYMENT: "prod"
- name: Deploy the metadata orchestrator [On workflow]
id: metadata-orchestrator-deploy-orchestrator-pipeline-branch
if: github.event_name == 'workflow_dispatch'
uses: ./.github/actions/run-airbyte-ci
with:
subcommand: "metadata deploy orchestrator"
context: "manual"
dagger_cloud_token: ${{ secrets.DAGGER_CLOUD_TOKEN_2 }}
github_token: ${{ secrets.GITHUB_TOKEN }}
docker_hub_username: ${{ secrets.DOCKER_HUB_USERNAME }}
docker_hub_password: ${{ secrets.DOCKER_HUB_PASSWORD }}
gcp_gsm_credentials: ${{ secrets.GCP_GSM_CREDENTIALS }}
env:
DAGSTER_CLOUD_METADATA_API_TOKEN: ${{ secrets.DAGSTER_CLOUD_METADATA_API_TOKEN }}
DAGSTER_CLOUD_DEPLOYMENT: ${{ inputs.deployment_target }}
8 changes: 3 additions & 5 deletions .github/workflows/publish-cdk-command-manually.yml
@@ -224,9 +224,7 @@ jobs:
uses: ./.github/actions/run-airbyte-ci
with:
context: "master" # TODO: figure out why changing this yells with `The ci_gcs_credentials was not set on this PipelineContext.`
# Disable the dagger_cloud_token to disable remote cache access.
# See https://github.com/airbytehq/airbyte-internal-issues/issues/6439#issuecomment-2109503985 for context
#dagger_cloud_token: ${{ secrets.DAGGER_CLOUD_TOKEN_2 }}
dagger_cloud_token: ${{ secrets.DAGGER_CLOUD_TOKEN_2 }}
docker_hub_password: ${{ secrets.DOCKER_HUB_PASSWORD }}
docker_hub_username: ${{ secrets.DOCKER_HUB_USERNAME }}
gcp_gsm_credentials: ${{ secrets.GCP_GSM_CREDENTIALS }}
@@ -313,8 +311,8 @@ jobs:
uses: peter-evans/create-pull-request@v6
with:
token: ${{ secrets.GH_PAT_MAINTENANCE_OCTAVIA }}
commit-message: "chore: update CDK version following release"
title: "chore: update CDK version following release"
commit-message: Updating CDK version following release
title: Updating CDK version following release
body: This is an automatically generated PR triggered by a CDK release
branch: automatic-cdk-release
base: master
33 changes: 4 additions & 29 deletions .github/workflows/regression_tests.yml
@@ -15,20 +15,14 @@ on:
workflow_dispatch:
inputs:
connector_name:
description: Connector name (e.g. source-faker)
description: "Connector name (e.g. source-faker)"
required: true
connection_id:
description: ID of the connection to test; use "auto" to let the connection retriever choose a connection
description: "ID of the connection to test"
required: true
pr_url:
description: URL of the PR containing the code change
description: "URL of the PR containing the code change"
required: true
streams:
description: Streams to include in regression tests
use_local_cdk:
description: Use the local CDK when building the target connector
default: "false"
type: boolean

jobs:
regression_tests:
@@ -67,25 +61,6 @@ jobs:
id: fetch_last_commit_id_wd
run: echo "commit_id=$(git rev-parse origin/${{ steps.extract_branch.outputs.branch }})" >> $GITHUB_OUTPUT

- name: Setup Stream Parameters
if: github.event_name == 'workflow_dispatch'
run: |
if [ -z "${{ github.event.inputs.streams }}" ]; then
echo "STREAM_PARAMS=" >> $GITHUB_ENV
else
STREAMS=$(echo "${{ github.event.inputs.streams }}" | sed 's/,/ --connector_regression_tests.selected-streams=/g')
echo "STREAM_PARAMS=--connector_regression_tests.selected-streams=$STREAMS" >> $GITHUB_ENV
fi

- name: Setup Local CDK Flag
if: github.event_name == 'workflow_dispatch'
run: |
if ${{ github.event.inputs.use_local_cdk }}; then
echo "USE_LOCAL_CDK_FLAG=--use-local-cdk" >> $GITHUB_ENV
else
echo "USE_LOCAL_CDK_FLAG=" >> $GITHUB_ENV
fi

- name: Run Regression Tests [WORKFLOW DISPATCH]
if: github.event_name == 'workflow_dispatch' # TODO: consider using the matrix strategy (https://docs.github.com/en/actions/using-jobs/using-a-matrix-for-your-jobs). See https://github.com/airbytehq/airbyte/pull/37659#discussion_r1583380234 for details.
uses: ./.github/actions/run-airbyte-ci
@@ -102,4 +77,4 @@ jobs:
github_token: ${{ secrets.GH_PAT_MAINTENANCE_OSS }}
s3_build_cache_access_key_id: ${{ secrets.SELF_RUNNER_AWS_ACCESS_KEY_ID }}
s3_build_cache_secret_key: ${{ secrets.SELF_RUNNER_AWS_SECRET_ACCESS_KEY }}
subcommand: connectors ${{ env.USE_LOCAL_CDK_FLAG }} --name ${{ github.event.inputs.connector_name }} test --only-step connector_regression_tests --connector_regression_tests.connection-id=${{ github.event.inputs.connection_id }} --connector_regression_tests.pr-url=${{ github.event.inputs.pr_url }} ${{ env.STREAM_PARAMS }}
subcommand: "connectors --name ${{ github.event.inputs.connector_name }} test --only-step connector_regression_tests --connector_regression_tests.connection-id=${{ github.event.inputs.connection_id }} --connector_regression_tests.pr-url=${{ github.event.inputs.pr_url }}"
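The removed "Setup Stream Parameters" step above turns a comma-separated streams input into repeated --connector_regression_tests.selected-streams= flags via sed. A Python sketch of the equivalent transformation (a hypothetical helper, shown only to clarify the reverted shell logic):

```python
def build_stream_params(streams: str) -> str:
    """Replicate the reverted sed logic: one flag per comma-separated stream."""
    if not streams:
        return ""
    flags = [
        f"--connector_regression_tests.selected-streams={s}"
        for s in streams.split(",")
    ]
    return " ".join(flags)


print(build_stream_params("users,purchases"))
# --connector_regression_tests.selected-streams=users --connector_regression_tests.selected-streams=purchases
```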
1 change: 1 addition & 0 deletions README.md
@@ -39,6 +39,7 @@ _Screenshot taken from [Airbyte Cloud](https://cloud.airbyte.com/signup)_.
- Create connectors in minutes with our [no-code Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview) or [low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview).
- Explore popular use cases in our [tutorials](https://airbyte.com/tutorials).
- Orchestrate Airbyte syncs with [Airflow](https://docs.airbyte.com/operator-guides/using-the-airflow-airbyte-operator), [Prefect](https://docs.airbyte.com/operator-guides/using-prefect-task), [Dagster](https://docs.airbyte.com/operator-guides/using-dagster-integration), [Kestra](https://docs.airbyte.com/operator-guides/using-kestra-plugin) or the [Airbyte API](https://reference.airbyte.com/reference/start).
- Easily transform loaded data with [SQL](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-sql) or [dbt](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt).

Try it out yourself with our [demo app](https://demo.airbyte.io/), visit our [full documentation](https://docs.airbyte.com/) and learn more about [recent announcements](https://airbyte.com/blog-categories/company-updates). See our [registry](https://connectors.airbyte.com/files/generated_reports/connector_registry_report.html) for a full list of connectors already available in Airbyte or Airbyte Cloud.
