Support loading manifests from Snowflake stage #109
What a cool idea, @kokorin! This should be a natural fit as a
Sure, I will try to create an MR this week or over the weekend.
@nicholasyager I've been thinking about how to acquire a Snowflake connection. Theoretically it should be possible to use dbt's intrinsic functions for that, but I'm not sure the connection is initialized at plugin load time. And if the connection is not initialized, the plugin could disrupt the Python module loading sequence if it tries to create one. Any advice on that?
This is a fantastic question, @kokorin! I suspect that a connection likely will not have been made to the warehouse by the adapter yet, since this all fires during compilation. The other unknown to me is whether the underlying adapter code is using a driver that will actually store a file from stage. Worst case scenario, it should be possible to create a secondary connection within the plugin code that uses profiles.yml to generate a connection to Snowflake. This would be a bit of a kludge, but would be in line with other existing client implementations.
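A minimal sketch of that fallback, assuming the standard dbt-snowflake profile fields (`account`, `user`, `password`, and so on); the helper name `profile_to_connect_kwargs` is hypothetical, not part of dbt-loom:

```python
# Sketch: build snowflake.connector.connect(...) kwargs from a parsed
# profiles.yml target dict. Key names follow the standard dbt-snowflake
# profile fields; key-pair or SSO auth would need extra handling.

def profile_to_connect_kwargs(target: dict) -> dict:
    """Map a dbt-snowflake profile target to snowflake-connector kwargs."""
    mapping = {
        "account": "account",
        "user": "user",
        "password": "password",
        "role": "role",
        "database": "database",
        "warehouse": "warehouse",
        "schema": "schema",
    }
    # Copy only recognized keys; drop dbt-only keys such as "type".
    return {dst: target[src] for src, dst in mapping.items() if src in target}

# Illustrative target dict, as it would come out of profiles.yml.
target = {
    "type": "snowflake",
    "account": "xy12345.eu-west-1",
    "user": "loom_reader",
    "password": "***",
    "database": "ANALYTICS",
    "warehouse": "WH_XS",
    "schema": "PUBLIC",
}
kwargs = profile_to_connect_kwargs(target)
# snowflake.connector.connect(**kwargs) would then open the secondary connection.
print(sorted(kwargs))
```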
It's possible to initialize a DBT adapter during plugin loading time:

```python
# Import paths assume recent dbt-core / dbt-snowflake versions; they may
# differ between releases.
from dbt.flags import get_flags
from dbt.config.runtime import load_profile
from dbt.mp_context import get_mp_context
from dbt.adapters.snowflake import SnowflakeAdapter

flags = get_flags()
profile = load_profile(
    project_root=flags.PROJECT_DIR,
    cli_vars=flags.VARS,
    profile_name_override=flags.PROFILE,
    target_override=flags.TARGET,
)
creds = profile.credentials.to_dict(omit_none=True)

adapter = SnowflakeAdapter(profile, get_mp_context())
print(f">>>>>>>>> ADAPTER: {adapter}")

with adapter.connection_named("dbt-loom"):
    response, table = adapter.connections.execute("select 1")
    print(f">>>>> RESPONSE {response}")
    print(f">>>>> TABLE {table}")
```

Prints:
Will continue the investigation.
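Once a connection is available, downloading a staged manifest is a plain Snowflake `GET` statement. A sketch, where `build_get_sql` is a hypothetical helper and the database/schema/stage names are illustrative:

```python
def build_get_sql(db: str, schema: str, stage: str, path: str, local_dir: str) -> str:
    """Compose a Snowflake GET statement that downloads a staged file locally."""
    return f"GET @{db}.{schema}.{stage}/{path} file://{local_dir}/"

sql = build_get_sql("ANALYTICS", "DBT", "ARTIFACTS", "manifest.json", "/tmp/dbt-loom")
print(sql)

# Executed through an adapter connection like the one opened above, e.g.:
# with adapter.connection_named("dbt-loom"):
#     adapter.connections.execute(sql)
```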
Is your feature request related to a problem? Please describe.
We upload DBT artifacts (manifest, docs) to Snowflake stage.
It's possible to use SnowCLI (or a custom solution) to download artifacts, but it would require data analysts to periodically download a fresh version of the artifacts after each DBT run in the main project.
Describe the solution you'd like
It should be easy to add another manifest source called `snowflake` or `snowflake_stage`. The path can be set like `<DB>.<Schema>.@<Stage>/path/in/stage`.

Describe alternatives you've considered
SnowCLI or custom python script to download manifest using CLI.
Additional context
N/A
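If this followed the shape of dbt-loom's existing manifest sources, the configuration might look like the sketch below; the `snowflake_stage` type and its `stage_path` field are part of the proposal, not an existing feature:

```yaml
# dbt_loom.config.yml -- hypothetical sketch, not yet supported
manifests:
  - name: upstream_project
    type: snowflake_stage
    config:
      stage_path: <DB>.<Schema>.@<Stage>/path/in/stage
```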