Background
Checking what happened to the data during a pipeline run should be easier, both in Python code and via the CLI.
Tasks
- document how to get the state and trace via the CLI and in code (see the sketch after this list)
- document `delete_completed_jobs` in `load`; it is `False` by default. Document what is left behind (config sketch below)
- change it to delete only completed jobs, as was the case before
- add a method to the pipeline to fully remove a completed package
- load storage should return the status of a package with the schema name, jobs, table names, and statuses. Failed packages should contain error names, and links to the job files should also be present. The info should be JSON-loadable; see dlt user log / callback log #73 (shape sketched below)
- the load module should persist the effective schema updates: a list of tables and columns that were added or updated, including flags for variant columns. See Schema change log access #134 (shape sketched below)
- document `PipelineStepFailed` and `__context__` exception chaining; document terminal vs. non-terminal exceptions when loading. `dlt` does not retry the `run` or any other step (example below)
- add a config parameter to fail loading on a terminal exception in a load job (sketch below)
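As a starting point for the state and trace docs, a minimal sketch, assuming the `dlt pipeline` CLI command and the `state` and `last_trace` pipeline attributes; `chess` is a placeholder pipeline name:

```sh
# inspect a pipeline from the command line
dlt pipeline chess info
dlt pipeline chess trace
```

The same information from code:

```python
import dlt

# attach to an existing pipeline by name; "chess" is a placeholder
pipeline = dlt.pipeline(pipeline_name="chess")
print(pipeline.state)       # pipeline state as a dict
print(pipeline.last_trace)  # trace of the most recent run, if any
```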
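For the `delete_completed_jobs` docs, a hedged `config.toml` sketch, assuming the option lives in the `load` section:

```toml
[load]
# false (the default) keeps completed job files in the load package after
# loading; true removes them, leaving failed jobs and package metadata behind
delete_completed_jobs = false
```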
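For the package status task, one possible JSON-loadable shape; every field name and value here is an assumption for illustration, not an actual format:

```json
{
  "load_id": "1666875234.8901",
  "schema_name": "chess",
  "state": "loaded",
  "jobs": [
    {
      "table_name": "players_games",
      "state": "completed_jobs",
      "file_path": "normalized/1666875234.8901/completed_jobs/players_games.0.jsonl",
      "error": null
    }
  ]
}
```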
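For the persisted schema updates, a possible shape; the `__v_` suffix follows dlt's variant column naming, but the structure below is only an illustration:

```json
{
  "players_games": {
    "columns": {
      "end_time": {"data_type": "timestamp", "variant": false},
      "end_time__v_text": {"data_type": "text", "variant": true}
    }
  }
}
```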
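For the exception docs, a minimal example of catching `PipelineStepFailed` and walking the `__context__` chain; the import path and the `step` attribute are assumed from the current dlt codebase:

```python
import dlt
from dlt.pipeline.exceptions import PipelineStepFailed

pipeline = dlt.pipeline(pipeline_name="chess", destination="duckdb")
try:
    pipeline.run([{"id": 1}], table_name="players")
except PipelineStepFailed as ex:
    print(ex.step)  # which step failed: extract, normalize or load
    # the original exception is chained; walk __context__ to the root cause
    cause = ex.__context__
    while cause is not None:
        print(type(cause).__name__, cause)
        cause = cause.__context__
```

Since `dlt` does not retry the `run` or any other step, it is up to the caller to decide whether to re-run after inspecting the root cause.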
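For the new config parameter, a sketch of what it could look like; `raise_on_failed_jobs` is a placeholder name for the proposed flag, not an existing option:

```toml
[load]
# placeholder name: abort and raise on the first terminal exception in a
# load job instead of marking the job as failed and continuing
raise_on_failed_jobs = true
```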