Bug fixes (#142)
* Big query URL change and config update changes

* Added bigframes as BE dependency

* Addressed last code review comment

* Code review changes

* code cleanup

* import changes

* Revert "import changes"

This reverts commit 24ff2e7.

* import authAPI changes

* Scheduler - internal fixes

1. Logo in dark mode color change
2. Loader added for cluster and serverless dropdown

* version update to 0.1.74

* Client bug tracker fixes - ID 208, 209

1. Tooltip for cards changed
2. Local kernel - Create scheduler loader fix

* Notebook Templates fix

* code cleanup

* Bug tracker fix - ID 211

Launcher title and logo changes

* Client bug tracker fixes - 206, 208, 209, 211

* version updated to 0.1.75

* big query entries API

* DPMS UI changes for new view

* schema list changes

* constant changes

* table info API and unit test cases

* Big query data explorer UI changes:

1. Datasets, tables and column API integration
2. Tree view changes

* database details modification

* Execution history calendar color code changes

* preview API changes and unit test case

* Preview Data initial changes

* css color fix

* Big query preview data UI changes

* API calls moved to service file

* Plugin minor UI changes

* Handled loading issue - using oldNotebookValue

* Dataset icon added for big query

* table info API, Project API, and unit tests

* Big query Table Details UI changes

* Table Info date format changes

* Big Query Dataset Details UI changes

* Column type css changes and Details page fixes

* Basic hover title added for table description

* Dataset Info page fixes

* Preview data limit set to 20 changes

* passing runtime version to serverless dag

* Empty Dataset handled in tree view

* condition changes

* Big Query project hierarchy and cleanup

* DPMS old code retained changes

* Dataset API pageToken and CSS tree changes

* changing dataplex API to bigquery APIs

* Big Query region added and stored in local storage

* FE changes for APIs from Dataplex to bigQuery

* ID-213. Removed cluster option for BigQuery

* Big query enhancement changes

* Big Query schema added fields and tree condition changes

* BQ region grouping added and value stored in settings

* BQ dataset filter by BQ region changes

* settings changes

* BQ region onChange handled for empty value

* version updated to 0.1.76

* Handling table count logic change

* adding page token

* Table preview API page token added

* Tree view based on total table counts

* preview check

* Tree view loader fix

* Preview API error handling

* code cleanup

* BQ table, dataset info page loader and preview empty handled

* BE code review changes and formatting

* Internal review comments changes FE

* Review comments - bq region dropdown created

* license header year change for new files created

* variable name change to "dataset"

* table style handled using css selector

* useEffect review comments fixed

* useEffect review comments fixes

* useMemo removed for schema tables

* preview column type added

* minor fix for Loader

* Serverless notebook logo based on language

* Bug tracker - 222 dag run always for selected date

* Type added

* Missing setIsLoading added

* Table and dataset code BigQuery refactoring

* Rename wrapper file

* Bq preview index file changes

* Wrapper files name change

* Tree structure refactor logic changes

* Plugin toast handled based on BQ feature enable

* big query and dpms refactor

* ID:221 - Toast message timer fix

* Multiple projects handling in dataset tree view

1. BQ project list api added.
2. Project added to all BQ API calls based on FE selection.
3. Handled tree structure FE for new view.

* Empty schema info page handled

* search functionality code commented out

* Height css changes

* schema empty error message added

* schema page minor UI fixes scroll

* Network configuration Error messages style change

* No cluster available handled for create pages

* search code enabled

* ipynb file removed

* Revert "search code enabled"

This reverts commit 799f604.

* Revert "ipynb file removed"

This reverts commit 0844b26.

* Delete ipynb file

* History server cluster no data message added

Toast removed for empty cluster

* Big Query search changes FE and BE

* code cleanup changes

* commented code removed

* Fetching all the records in the backend

* Incorporated review comments

* code formatting

* removed utilities

* By default display BigQuery tree in panel

* BQ search FE search removed and handled in BE

* big query tree loader icon size reduced

* No dataset - Toast removed

* search loader size reduced

* Handled dataset empty condition

* handled text overflow for big query tree data

* ID:231, 233 bugs fixed

* css fixes

* Dataset explorer reload after project/region change

* code cleanup

* name change

* license year change

* toast errors changes

* active toast fix

* bug tracker fixes - 235, 238, 241

Initial fixes for above bugs

* console log fix

* Bug tracker ID 237 fix

* code cleanup

* code cleanup

* Bug tracker ID - 243 fix

* Preview scroll fix

* minor css fix

* Preview page pagination server side changes

* license year change

* Preview data object type handled

* Big Query dataset explorer new panel created

* Loader changes and pagination changes

* class name changes - BigQueryWidget

* Dataset explorer refresh icon added

* Refresh Icon moved to top level

* icon changes

* handling loader

* removed commented

* configure gateway name change

* get cached credentials added in utils

* Credentials cache added

* token expiry fetch changes

* Tooltip added for custom created panels

* Linear progress bar added for calendar load

* UI fixes and Loader added for tree all levels

* remove cache

* code formatting

* loader padding changes

* prettier fix

* Tooltip added for tree

* bug fix

1. Calendar progress bar, div fix
2. BQ, DPMS sequence fix
3. Left panel last letters fix

* code cleanup

* execution history height changes

* Backend file, folder name changes

* controller name fix

* Big Query code separation - FE

* Dpms service file separation changes

* Height resize handled on window height change

* version updated to 0.1.77

* ID 249- BQ preview pagination fixes

* ID 341324879 - Buganizer P4 fix

* ID 341318983 - P3 BQ region settings fixes

* Bug Tracker ID 251 and Buganizer ID 341323620 fix

* toast handling in dag list

---------

Co-authored-by: Jeyaprakash-NK <[email protected]>
Co-authored-by: aditee-accenture <[email protected]>
Co-authored-by: harsha-accenture <[email protected]>
Co-authored-by: saranyaloganathan23 <[email protected]>
6 people authored May 20, 2024
1 parent 3c2d9a6 commit 956dcd9
Showing 78 changed files with 1,489 additions and 767 deletions.
File renamed without changes.
45 changes: 45 additions & 0 deletions dataproc_jupyter_plugin/commons/gcloudOperations.py
@@ -0,0 +1,45 @@
# Copyright 2024 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


from google.cloud.jupyter_config.config import (
    gcp_credentials,
    gcp_project,
    gcp_project_number,
    gcp_region,
)


class GetCachedCredentials:
    @staticmethod
    def get_cached_credentials(log):
        """Return project, region, and token details from the cached gcloud config."""
        credentials = {
            "project_id": "",
            "project_number": 0,
            "region_id": "",
            "access_token": "",
            "config_error": 0,
            "login_error": 0,
        }

        try:
            credentials["project_id"] = gcp_project()
            credentials["region_id"] = gcp_region()
            credentials["config_error"] = 0
            credentials["access_token"] = gcp_credentials()
            credentials["project_number"] = gcp_project_number()
            return credentials
        except Exception:
            # Log the failure and flag a configuration error for the frontend.
            log.exception("Error fetching credentials from gcloud")
            credentials["config_error"] = 1
            return credentials
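For reference, a minimal usage sketch from a Jupyter server handler — the ExampleController name is hypothetical, but the pattern mirrors the controller call sites changed throughout this commit:

import json

import tornado.web
from jupyter_server.base.handlers import APIHandler

from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials


class ExampleController(APIHandler):
    @tornado.web.authenticated
    def get(self):
        # Resolve project, region, and access token from the gcloud cache.
        credentials = GetCachedCredentials.get_cached_credentials(self.log)
        if credentials["config_error"] == 1:
            # Report the configuration problem instead of raising.
            self.finish(json.dumps({"error": "gcloud config error"}))
            return
        self.finish(json.dumps({"project_id": credentials["project_id"]}))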
@@ -14,9 +14,9 @@


import json

from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.bigqueryService import (
BigQueryDatasetInfoService,
BigQueryDatasetListService,
@@ -26,8 +26,9 @@
BigQueryTableInfoService,
BigQueryTableListService,
)
from dataproc_jupyter_plugin.utils.constants import bq_public_dataset_project_id
from dataproc_jupyter_plugin.commons.constants import bq_public_dataset_project_id
from google.cloud.jupyter_config import gcp_project
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials


class BigqueryDatasetController(APIHandler):
@@ -37,7 +38,7 @@ def get(self):
page_token = self.get_argument("pageToken")
project_id = self.get_argument("project_id")
bigquery_dataset = BigQueryDatasetListService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
dataset_list = bigquery_dataset.list_datasets(
credentials, page_token, project_id, self.log
)
@@ -55,7 +56,7 @@ def get(self):
dataset_id = self.get_argument("dataset_id")
project_id = self.get_argument("project_id")
bigquery_dataset = BigQueryTableListService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
table_list = bigquery_dataset.list_table(
credentials, dataset_id, page_token, project_id, self.log
)
@@ -72,7 +73,7 @@ def get(self):
dataset_id = self.get_argument("dataset_id")
project_id = self.get_argument("project_id")
bq_dataset = BigQueryDatasetInfoService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
dataset_info = bq_dataset.list_dataset_info(
credentials, dataset_id, project_id, self.log
)
@@ -90,7 +91,7 @@ def get(self):
table_id = self.get_argument("table_id")
project_id = self.get_argument("project_id")
bq_table = BigQueryTableInfoService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
table_info = bq_table.list_table_info(
credentials, dataset_id, table_id, project_id, self.log
)
@@ -106,12 +107,19 @@ def get(self):
try:
dataset_id = self.get_argument("dataset_id")
table_id = self.get_argument("table_id")
page_token = self.get_argument("pageToken")
max_results = self.get_argument("max_results")
start_index = self.get_argument("start_index")
project_id = self.get_argument("project_id")
bq_preview = BigQueryPreviewService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
preview_data = bq_preview.bigquery_preview_data(
credentials, dataset_id, table_id, page_token, project_id, self.log
credentials,
dataset_id,
table_id,
max_results,
start_index,
project_id,
self.log,
)
self.finish(json.dumps(preview_data))
except Exception as e:
@@ -139,7 +147,7 @@ def get(self):
system = self.get_argument("system")
projects = [gcp_project(), bq_public_dataset_project_id]
bq_search = BigQuerySearchService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
search_data = bq_search.bigquery_search(
credentials, search_string, type, system, projects, self.log
)
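The preview controller above now forwards max_results and start_index instead of a page token, moving pagination to the server side. A sketch of what BigQueryPreviewService.bigquery_preview_data could look like against BigQuery's tabledata.list REST method, assuming a requests-based call — the service implementation itself is not part of this excerpt:

import requests

BIGQUERY_API = "https://bigquery.googleapis.com/bigquery/v2"


def bigquery_preview_data(
    credentials, dataset_id, table_id, max_results, start_index, project_id, log
):
    # tabledata.list supports offset-based paging via maxResults and startIndex.
    url = (
        f"{BIGQUERY_API}/projects/{project_id}/datasets/{dataset_id}"
        f"/tables/{table_id}/data"
    )
    try:
        response = requests.get(
            url,
            params={"maxResults": max_results, "startIndex": start_index},
            headers={"Authorization": f"Bearer {credentials['access_token']}"},
        )
        response.raise_for_status()
        # The payload includes totalRows plus the requested window of rows.
        return response.json()
    except Exception:
        log.exception("Error fetching preview data")
        return {"error": "Failed to fetch preview data"}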
@@ -14,9 +14,9 @@


import json
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.clusterListService import ClusterListService


@@ -27,7 +27,7 @@ def get(self):
page_token = self.get_argument("pageToken")
page_size = self.get_argument("pageSize")
cluster = ClusterListService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
cluster_list = cluster.list_clusters(
credentials, page_size, page_token, self.log
)
@@ -13,9 +13,9 @@
# limitations under the License.

import json
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.composerService import ComposerService
from requests import HTTPError

@@ -26,7 +26,7 @@ def get(self):
"""Returns names of available composer environments"""
try:
environments_manager = ComposerService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
environments = environments_manager.list_environments(credentials, self.log)

except Exception as e:
@@ -15,15 +15,15 @@

import json
import subprocess
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.dagListService import (
DagListService,
DagDeleteService,
DagUpdateService,
)
from dataproc_jupyter_plugin.utils.constants import TAGS
from dataproc_jupyter_plugin.commons.constants import TAGS


class DagListController(APIHandler):
@@ -32,7 +32,7 @@ def get(self):
try:
dag = DagListService()
composer_name = self.get_argument("composer")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
dag_list = dag.list_jobs(credentials, composer_name, TAGS, self.log)
self.finish(json.dumps(dag_list))
except Exception as e:
@@ -70,7 +70,7 @@ def get(self):
composer = self.get_argument("composer")
dag_id = self.get_argument("dag_id")
from_page = self.get_argument("from_page", default=None)
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
delete_response = dag.delete_job(
credentials, composer, dag_id, from_page, self.log
)
@@ -92,7 +92,7 @@ def get(self):
composer = self.get_argument("composer")
dag_id = self.get_argument("dag_id")
status = self.get_argument("status")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
update_response = dag.update_job(
credentials, composer, dag_id, status, self.log
)
@@ -14,9 +14,9 @@


import json
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.dagRunService import (
DagRunListService,
DagRunTaskListService,
@@ -34,7 +34,7 @@ def get(self):
start_date = self.get_argument("start_date")
offset = self.get_argument("offset")
end_date = self.get_argument("end_date")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
dag_run_list = dag_run.list_dag_runs(
credentials,
composer_name,
@@ -58,7 +58,7 @@ def get(self):
composer_name = self.get_argument("composer")
dag_id = self.get_argument("dag_id")
dag_run_id = self.get_argument("dag_run_id")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
dag_run_list = dag_run.list_dag_run_task(
credentials, composer_name, dag_id, dag_run_id, self.log
)
@@ -78,7 +78,7 @@ def get(self):
dag_run_id = self.get_argument("dag_run_id")
task_id = self.get_argument("task_id")
task_try_number = self.get_argument("task_try_number")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
dag_run_list = dag_run.list_dag_run_task_logs(
credentials,
composer_name,
@@ -13,11 +13,11 @@
# limitations under the License.

import json
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.editDagService import DagEditService
from dataproc_jupyter_plugin.utils.constants import TAGS
from dataproc_jupyter_plugin.commons.constants import TAGS


class EditDagController(APIHandler):
Expand All @@ -27,7 +27,7 @@ def get(self):
dag = DagEditService()
bucket_name = self.get_argument("bucket_name")
dag_id = self.get_argument("dag_id")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
dag_details = dag.edit_jobs(dag_id, bucket_name, credentials, self.log)
self.finish(json.dumps(dag_details))
except Exception as e:
@@ -13,9 +13,9 @@
# limitations under the License.


from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.executorService import ExecutorService


@@ -25,7 +25,7 @@ def post(self):
try:
input_data = self.get_json_body()
execute = ExecutorService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
execute.execute(credentials, input_data, self.log)
except Exception as e:
self.log.exception(f"Error creating dag schedule: {str(e)}")
@@ -14,9 +14,9 @@


import json
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.importErrorService import ImportErrorService


@@ -26,7 +26,7 @@ def get(self):
try:
import_errors = ImportErrorService()
composer_name = self.get_argument("composer")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
import_errors_list = import_errors.list_import_errors(
credentials, composer_name, self.log
)
@@ -14,9 +14,9 @@


import json
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.runtimeListService import RuntimeListService


@@ -27,7 +27,7 @@ def get(self):
page_token = self.get_argument("pageToken")
page_size = self.get_argument("pageSize")
runtime = RuntimeListService()
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
runtime_list = runtime.list_runtime(
credentials, page_size, page_token, self.log
)
@@ -14,9 +14,9 @@


import json
from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials
from jupyter_server.base.handlers import APIHandler
import tornado
from dataproc_jupyter_plugin import handlers
from dataproc_jupyter_plugin.services.triggerDagService import TriggerDagService


@@ -27,7 +27,7 @@ def get(self):
trigger_dag = TriggerDagService()
dag_id = self.get_argument("dag_id")
composer = self.get_argument("composer")
credentials = handlers.get_cached_credentials(self.log)
credentials = GetCachedCredentials.get_cached_credentials(self.log)
trigger = trigger_dag.dag_trigger(credentials, dag_id, composer, self.log)
self.finish(json.dumps(trigger))
except Exception as e:
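Finally, a hedged pytest sketch for the new credentials helper. The test name and monkeypatched target are assumptions; the commit's actual unit tests are not shown in this excerpt. It forces the first gcloud lookup to fail and checks that the helper degrades gracefully:

import logging

from dataproc_jupyter_plugin.commons.gcloudOperations import GetCachedCredentials


def test_get_cached_credentials_sets_config_error(monkeypatch):
    # Make the first gcloud lookup raise so the except branch runs.
    def raise_error():
        raise RuntimeError("no gcloud config")

    monkeypatch.setattr(
        "dataproc_jupyter_plugin.commons.gcloudOperations.gcp_project",
        raise_error,
    )
    credentials = GetCachedCredentials.get_cached_credentials(
        logging.getLogger(__name__)
    )
    assert credentials["config_error"] == 1
    assert credentials["access_token"] == ""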