

Merge hotfixes in master to develop #169

Merged: 9 commits, Jan 8, 2025
src/Dockerfile (2 changes: 1 addition & 1 deletion)

```diff
@@ -15,7 +15,7 @@ RUN pip install . -t python
 # to change the hash of the file and get TF to realize it needs to be
 # redeployed. Ticket for a better solution:
 # https://gfw.atlassian.net/browse/GTC-1250
-# change 14
+# change 18
 
 RUN yum install -y zip geos-devel
```
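The comment bump above works because the deploy tooling tracks the artifact by content hash: editing even a throwaway comment changes the digest, which makes Terraform treat the file as modified and redeploy it. A minimal sketch of the idea (illustrative only, not this repo's code):

```python
import hashlib


def content_hash(data: bytes) -> str:
    # Any byte-level edit (even bumping a "# change N" comment) changes
    # the digest, which is what forces the redeploy.
    return hashlib.sha256(data).hexdigest()
```

For example, `content_hash(b"# change 14")` and `content_hash(b"# change 18")` produce different digests even though the change is semantically meaningless.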
src/datapump/jobs/geotrellis.py (2 changes: 1 addition & 1 deletion)

```diff
@@ -689,7 +689,7 @@ def _calculate_worker_count(self, limiting_src) -> int:
         # if using a wildcard for a folder, just use hardcoded value
         if "*" in limiting_src:
             if GLOBALS.env == "production":
-                if self.table.analysis == Analysis.tcl:
+                if self.table.analysis == Analysis.tcl or self.table.analysis == Analysis.viirs:
                     return 200
                 else:
                     return 100
```
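The changed branch above routes VIIRS wildcard jobs to the larger 200-worker cluster in production, alongside TCL. A standalone, hypothetical sketch of that selection logic (the enum values other than `tcl`/`viirs` and the non-wildcard fallback of 50 are assumptions, since they are not shown in the diff):

```python
from enum import Enum


class Analysis(str, Enum):
    tcl = "tcl"
    viirs = "viirs"
    glad = "glad"


def worker_count(limiting_src: str, env: str, analysis: Analysis) -> int:
    # Wildcard folder sources get a hardcoded worker count; TCL and
    # (after this PR) VIIRS analyses get the larger production cluster.
    if "*" in limiting_src:
        if env == "production":
            if analysis in (Analysis.tcl, Analysis.viirs):
                return 200
            return 100
    return 50  # placeholder for the paths not shown in the diff
```

Writing the condition as a membership test (`analysis in (...)`) is equivalent to the chained `or` in the diff and keeps the branch easy to extend if more analyses need the larger cluster later.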
src/datapump/jobs/jobs.py (2 changes: 1 addition & 1 deletion)

```diff
@@ -23,7 +23,7 @@ class Job(StrictBaseModel, ABC):
     step: str = JobStep.starting
     status: JobStatus = JobStatus.starting
     start_time: Optional[str] = None
-    timeout_sec: int = 14400
+    timeout_sec: int = 57600  # 16hr timeout
     retries: int = 0
     errors: List[str] = list()
```
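The new default corresponds to 57600 s = 16 h, up from 14400 s = 4 h, and individual jobs can still override it per instance. A minimal dataclass sketch of that pattern (the real model is a pydantic `StrictBaseModel`, so this is an analogy, not the repo's code):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Job:
    # Default raised from 14400 s (4 h) to 57600 s (16 h) in this PR;
    # callers can still pass a per-job value to override it.
    timeout_sec: int = 57600
    retries: int = 0
    errors: List[str] = field(default_factory=list)
```

For example, `Job()` picks up the 16-hour default, while `Job(timeout_sec=3 * 3600)` pins a single job to 3 hours, which is exactly what the sync.py change below does.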
src/datapump/sync/sync.py (4 changes: 3 additions & 1 deletion)

```diff
@@ -788,7 +788,9 @@ def build_jobs (self, config: DatapumpConfig) -> List[Job]:
                 band_count=1,
                 compute_stats=False,
                 union_bands=True,
-                unify_projection=True
+                unify_projection=True,
+                # Sometimes this job runs over 2 hours, so increase timeout to 3 hours.
+                timeout_sec=3 * 3600
             ),
             content_date_range=ContentDateRange(
                 start_date="2020-12-31", end_date=str(date.today())
```