Feature description
User:
"I've been at a stand still actually. It looks to me like the constraints from MWAA are going to prohibit me from using Snowflake as a destination. Writing to S3 is a viable replacement- but still having trouble here as well. The most recent version of airflow available on MWAA is 2.7.2.
At the moment, airflow is showing some import conflicts. It can't see the path to locate PipelineTasksGroup in my dag, nor can it find DltResource in init.py.
There is also a version conflict with s3fs which relies on aiobotocore. The constraint on aiobotocore here is 2.6.0 which is not"
Are you a dlt user?
I'd consider using dlt, but it's lacking a feature I need.
Use case
Run Airflow on managed AWS (MWAA) with dlt. Library conflicts are in the way; it seems writing to S3 could be easier.
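For context, here is a minimal sketch of the kind of DAG the reporter seems to be describing, assuming dlt's Airflow helper; the pipeline, dataset, and resource names below are hypothetical:

```python
# Sketch only: assumes dlt's Airflow helper; names are hypothetical.
from datetime import datetime

import dlt
from airflow.decorators import dag
from dlt.helpers.airflow_helper import PipelineTasksGroup


@dlt.resource(name="items")
def items():
    # Stand-in data. The DltResource type mentioned above is what
    # @dlt.resource produces; its import path has moved between dlt versions.
    yield [{"id": 1}, {"id": 2}]


@dag(schedule_interval=None, start_date=datetime(2024, 1, 1), catchup=False)
def load_items():
    # Wraps the dlt pipeline run in an Airflow task group.
    tasks = PipelineTasksGroup("my_pipeline", use_data_folder=False, wipe_local_data=True)
    pipeline = dlt.pipeline(
        pipeline_name="my_pipeline",
        dataset_name="my_dataset",
        destination="filesystem",  # S3 via dlt's filesystem destination
    )
    tasks.add_run(pipeline, items(), decompose="none", trigger_rule="all_done")


load_items()
```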
I might be able to help here. I have been using dlt with MWAA successfully. I'm not writing to Snowflake, but I have run into problems with the constraints multiple times.
My workaround was to add --constraint "/dev/null" to the uploaded requirements.txt file, which overrides the default constraints imposed by Airflow. It seems hacky, but AWS actually encourages this if you don't like the imposed constraints.
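Concretely, the top of such a requirements.txt would look roughly like this (the package list is illustrative, not the reporter's actual pins):

```text
# Null out MWAA's default Airflow constraints file.
--constraint "/dev/null"

dlt
s3fs
```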
We had a similar issue on v2.2.2 (--constraint was not needed on that version, though) and decided to install the additional packages directly in a virtual environment using the PythonVirtualenvOperator. Note that virtualenv itself first needs to be part of your requirements.txt; a sketch of the pattern follows.
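A minimal sketch of that pattern, assuming Airflow's PythonVirtualenvOperator; pipeline and dataset names are hypothetical, and the dlt destination configuration is omitted:

```python
# Sketch only: the dlt pipeline runs inside a throwaway virtualenv, so its
# dependencies never have to satisfy MWAA's constraints.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonVirtualenvOperator


def run_dlt_pipeline():
    # Imports must live inside the callable: it executes in the virtualenv,
    # not in the scheduler's environment.
    import dlt

    @dlt.resource(name="items")
    def items():
        yield [{"id": 1}, {"id": 2}]  # stand-in data

    pipeline = dlt.pipeline(
        pipeline_name="my_pipeline",   # hypothetical
        dataset_name="my_dataset",     # hypothetical
        destination="filesystem",      # S3 via dlt's filesystem destination
    )
    print(pipeline.run(items()))


with DAG(dag_id="dlt_in_virtualenv", start_date=datetime(2024, 1, 1),
         schedule_interval=None, catchup=False):
    PythonVirtualenvOperator(
        task_id="run_dlt",
        python_callable=run_dlt_pipeline,
        requirements=["dlt", "s3fs"],  # installed fresh in the venv
        system_site_packages=False,
    )
```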
You can ask the user for more information here: https://dlthub-community.slack.com/archives/C04DQA7JJN6/p1707496343595039