- Tested versions: V2
- Environment: Google Cloud Platform (GCP)
- Supported inputs: Pub/Sub (pull)
- Supported Guardium versions:
- Guardium Data Protection: 11.4 and above
This is a Logstash filter plug-in for the universal connector that is featured in IBM Security Guardium. It parses GCP (Google Cloud Platform) event logs into a Guardium record instance (a standard structure made up of several parts). The information is then sent over to Guardium. Guardium records include the accessor (the person who tried to access the data), the session, data, and exceptions. If there are no errors, the data contains details about the query "construct". The construct details the main action (verb) and the collections (objects) involved. As of now, the BigQuery plug-in only supports Guardium Data Protection.
This plug-in has a runtime dependency on the Logstash Google Pub/Sub input plug-in (version ~> 1.2.1, that is, at least 1.2.1 but below 1.3).
This version is compliant with Guardium Data Protection v11.4 and above. Please refer to the input plug-in repository for more information.
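When the connector runs on a standalone Logstash instance rather than on the Guardium appliance (where the offline package provides this dependency), the input plug-in can be installed with the standard Logstash plug-in manager. A minimal sketch, assuming a default Logstash installation directory:

```
# Install the Google Pub/Sub input plug-in on a standalone Logstash;
# the Guardium appliance bundles ship this dependency already.
bin/logstash-plugin install --version 1.2.1 logstash-input-google_pubsub
```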
- Prerequisites
- BigQuery is automatically enabled in new projects. To activate BigQuery in an existing project, enable the BigQuery API. Please refer to Enable the BigQuery API for more information.
- Create a BigQuery dataset that stores the data. Please refer to Create a Dataset for more information.
- Create a table:
  - Expand the View actions option and click Open.
  - In the details panel, click (+) Create table.
  - On the Create table page, enter a table name, for example, “user”. You can add more fields to the table by clicking + Add Field.
  - Click Create Table. Please refer to Create Table for more information.
- Query table data. Please refer to Query Table Data for more information. (A gcloud/bq sketch of these prerequisite steps follows this list.)
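If you prefer the command line, the prerequisite steps above can also be performed with the gcloud and bq CLIs. This is a minimal sketch; the project ID, dataset, table name, and schema are placeholders:

```
# Enable the BigQuery API in an existing project (placeholder project ID)
gcloud services enable bigquery.googleapis.com --project=my-project-id

# Create a dataset that stores the data
bq mk --dataset my-project-id:my_dataset

# Create a table named "user" with an example two-column schema
bq mk --table my-project-id:my_dataset.user name:STRING,age:INTEGER

# Query the table data
bq query --use_legacy_sql=false 'SELECT name FROM my_dataset.user LIMIT 10'
```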
Users can view and download the generated logs. The following Identity and Access Management roles are required to view and download logs:
- To view logs: roles/logging.viewer (Logs Viewer) and roles/logging.privateLogViewer (Private Logs Viewer)
- To download logs: roles/logging.admin (Logging Admin) and roles/logging.viewAccessor (Logs View Accessor)
- Go to the Pub/Sub topics page in the Cloud Console.
- Click Create a topic.
- In the Topic ID field, provide a unique topic name, for example, MyTopic.
- Click Create Topic.
- Display the menu for the topic created in the previous step and click New subscription.
- Type a name for the subscription, such as MySub.
- Leave the delivery type as Pull.
- Click Create. (Equivalent gcloud commands are sketched after this list.)
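The same topic and pull subscription can be created from the command line; the names below match the examples above:

```
# Create the topic and a pull subscription (MyTopic/MySub are examples)
gcloud pubsub topics create MyTopic
gcloud pubsub subscriptions create MySub --topic=MyTopic
```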
- In the Cloud Console, go to the Logging > Log Router page.
- Click Create sink.
- In the Sink details panel, enter the following details:
  - Sink name: Provide an identifier for the sink. Note that after you create the sink, you cannot rename it. However, you can delete a sink and create a new one.
  - Sink description (optional): Describe the purpose or use case for the sink.
- In the Sink destination panel, select Cloud Pub/Sub topic as the sink service, and select the topic created in the previous steps.
- Choose logs to include in the sink in the Build inclusion filter panel.
- You can filter the logs by log name, resource, and severity. (An equivalent gcloud sink-creation sketch follows.)
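As an alternative to the Cloud Console, a sink routing to the Pub/Sub topic can be created with gcloud. The sink name is a placeholder, and the filter shown is abbreviated; use the full inclusion filter given later in this README:

```
# Create a log sink that routes matching entries to the Pub/Sub topic
gcloud logging sinks create guardium-bigquery-sink \
  pubsub.googleapis.com/projects/my-project-id/topics/MyTopic \
  --log-filter='resource.type="bigquery_project"'
```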
Multi-region

- In the case of multiple regions, apply the same set of configurations for each region. Based on the region, different configuration files are used for the input plug-in.
To set permissions for the log sink to route to its destination, do the following:
- Obtain the sink's writer identity—an email address—from the new sink.
- Go to the Log Router page, and select menu > View sink details.
- The writer identity appears in the Sink details panel.
- If you have owner access to the destination:
- Add the sink's writer identity to the topic:
  - Navigate to the topic created in the earlier steps.
  - Click the SHOW INFO panel.
  - Click ADD PRINCIPAL.
  - Paste the writer identity into the New principals field.
  - Give it the Pub/Sub Publisher role and the Pub/Sub Subscriber role.
- Add the sink's writer identity to the subscription:
  - Navigate to the subscription.
  - Click the SHOW INFO panel.
  - Click ADD PRINCIPAL.
  - Paste the writer identity into the New principals field.
  - Give it the Pub/Sub Subscriber role.
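The same grants can be scripted with gcloud. This sketch assumes the sink and Pub/Sub names used earlier; note that the writer identity returned by gcloud already carries the serviceAccount: prefix:

```
# Read the sink's writer identity (an email address with a serviceAccount: prefix)
WRITER_IDENTITY=$(gcloud logging sinks describe guardium-bigquery-sink \
  --format='value(writerIdentity)')

# Grant the Publisher and Subscriber roles on the topic
gcloud pubsub topics add-iam-policy-binding MyTopic \
  --member="$WRITER_IDENTITY" --role='roles/pubsub.publisher'
gcloud pubsub topics add-iam-policy-binding MyTopic \
  --member="$WRITER_IDENTITY" --role='roles/pubsub.subscriber'

# Grant the Subscriber role on the subscription
gcloud pubsub subscriptions add-iam-policy-binding MySub \
  --member="$WRITER_IDENTITY" --role='roles/pubsub.subscriber'
```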
- Go to the Service accounts section of the IAM & Admin console.
- Select the project and click Create Service Account.
- Enter a service account name, such as Bigquery-pubsub.
- Click Create.
- The Owner role is required for the service account. Select the Owner role from the drop-down menu.
- Click Continue. You do not need to grant users access to this service account.
- Click Create Key. The key is used by the Logstash input plug-in configuration file.
- Select JSON and click Create.
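The service account and its JSON key can also be created from the command line. A minimal sketch with a placeholder project ID (service account IDs must be lowercase, so the example display name Bigquery-pubsub becomes bigquery-pubsub):

```
# Create the service account
gcloud iam service-accounts create bigquery-pubsub

# Grant it the Owner role required by this procedure
gcloud projects add-iam-policy-binding my-project-id \
  --member='serviceAccount:bigquery-pubsub@my-project-id.iam.gserviceaccount.com' \
  --role='roles/owner'

# Create the JSON key consumed by the Logstash input plug-in
gcloud iam service-accounts keys create key.json \
  --iam-account=bigquery-pubsub@my-project-id.iam.gserviceaccount.com
```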
Edit the Sink via Logs Router > Sink Inclusion Filter:
The purpose of this inclusion filter is to exclude unnecessary logs and include only the required logs: entries with BigQuery resource types and a metadata reason of DELETE, TABLE_INSERT_REQUEST, TABLE_DELETE_REQUEST, or CREATE, as well as job entries with a metadata jobStatus of DONE.
(resource.type=("bigquery_project") AND protoPayload.authenticationInfo.principalEmail:* AND
(protoPayload.metadata.jobChange.job.jobStatus.jobState = DONE AND -protoPayload.metadata.jobChange.job.jobConfig.queryConfig.statementType = "SCRIPT"))
OR
(protoPayload.metadata.datasetDeletion.reason = "DELETE") OR (protoPayload.metadata.tableCreation.reason = "TABLE_INSERT_REQUEST") OR (protoPayload.metadata.tableDeletion.reason = "TABLE_DELETE_REQUEST") OR (protoPayload.metadata.datasetCreation.reason = "CREATE")
The inclusion filter mentioned above can also be used to view the audit logs in the GCP Logs Explorer. The plug-in supports the following log types:
- BigQueryAudit - ACTIVITY, DATA_ACCESS logs
- BigQuery Log - EMERGENCY, ALERT, CRITICAL, ERROR, WARNING, NOTICE, DEBUG, DEFAULT
- If no information regarding certain fields is available in the logs, those fields are not mapped.
- The exception object is prepared based on the severity of the logs.
- The data model size is limited to 10 GB per table. If you have a 100 GB reservation per project per location, BigQuery BI Engine limits the reservation per table to 10 GB. The rest of the available reservation is used for other tables in the project.
- BigQuery cannot read the data in parallel if you use gzip compression. Loading compressed JSON data into BigQuery is slower than loading uncompressed data.
- You cannot include both compressed and uncompressed files in the same load job.
- JSON data must be newline delimited. Each JSON object must be on a separate line in the file.
- The maximum size for a gzip file is 4 GB.
- Log messages have a size limit of 100K bytes.
- The audit/data access log does not contain a server IP. The server IP defaults to 0.0.0.0.
- The following important fields cannot be mapped, as there is no information regarding these fields in the logs:
- Source program
- OS User
- Client HostName
- The serverHostName pattern for BigQuery GCP is project-id_bigquery.googleapis.com.
- When you try to create or delete a dataset or table using the BigQuery UI options, fields like FULL SQL and the Objects and Verbs columns appear blank, because these actions do not receive any query from the GCP logs. You can ignore these actions by updating the inclusion filter: "(resource.type=("bigquery_project") AND protoPayload.authenticationInfo.principalEmail:* AND (protoPayload.metadata.jobChange.job.jobStatus.jobState = DONE AND -protoPayload.metadata.jobChange.job.jobConfig.queryConfig.statementType = "SCRIPT"))"
- The parser does not support queries in which a keyword is used as a table name or column name, or queries with nested parameters inside functions.
- The BigQuery audit log does not include failed-login logs, so these do not appear in the Guardium LOGIN_FAILED report.
The Guardium universal connector is the Guardium entry point for native audit/data_access logs. The Guardium universal connector identifies and parses the received events, and converts them to a standard Guardium format. The output of the Guardium universal connector is forwarded to the Guardium sniffer on the collector for policy and auditing enforcement. Configure Guardium to read the native audit/data_access logs by customizing the BigQuery template.
- Configure the policies you require. See policies for more information.
- You must have permission for the S-Tap Management role. The admin user includes this role by default.
- The BigQuery-Guardium Logstash filter plug-in is automatically available with Guardium Data Protection version 12.x, version 11.4 with appliance bundle 11.0p490 or later, and version 11.5 with appliance bundle 11.0p540 or later.
Note: For Guardium Data Protection version 11.4 with an appliance bundle earlier than 11.0p490, or version 11.5 with an appliance bundle earlier than 11.0p540, download the Logstash_Offline_package_7.x plug-in. (Do not unzip the offline package file at any point during the procedure.)
- On the collector, go to Setup > Tools and Views > Configure Universal Connector.
- Enable the universal connector if it is disabled.
- Click Upload File and select the relevant plug-in based on the Guardium version. After it is uploaded, click OK.
  - For Guardium 11.x, download the Logstash_Offline_package_7.x.
  - For Guardium 12.x, download the Logstash_Offline_package_8.x.
- Click Upload File and select the key.json file. After it is uploaded, click OK.
- Click the Plus sign to open the Connector Configuration dialog box.
- Type a name in the Connector name field.
- Update the input section to add the details from the pubsub_big_query.conf file's input part, omitting the keyword "input{" at the beginning and its corresponding "}" at the end.
- Update the filter section to add the details from the pubsub_big_query.conf file's filter part, omitting the keyword "filter{" at the beginning and its corresponding "}" at the end. (A skeleton of the file's overall shape is sketched at the end of this section.)
- The 'type' fields should match in the input and filter configuration sections. This field should be unique for every individual connector added.
- Click Save. Guardium validates the new connector and displays it in the Configure Universal Connector page.
- After the offline plug-in is installed and the configuration is uploaded and saved in the Guardium machine, restart the universal connector using the Disable/Enable button.
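For orientation, the sketch below shows the general shape of a pubsub_big_query.conf file. The input options come from the Logstash Google Pub/Sub input plug-in; the project, topic, subscription, and key path values are placeholders, and the filter body is deliberately left as a comment because its actual contents must be copied from the pubsub_big_query.conf shipped with this plug-in. Remember that in the Guardium UI you paste only the contents inside input{...} and filter{...}, not the wrappers themselves:

```
input {
  google_pubsub {
    project_id => "my-project-id"            # placeholder GCP project
    topic => "MyTopic"                       # topic created earlier
    subscription => "MySub"                  # pull subscription created earlier
    json_key_file => "/path/to/key.json"     # the uploaded service account key
    type => "bigquery"                       # must match the filter's type check
  }
}

filter {
  if [type] == "bigquery" {
    # Copy the filter part from the pubsub_big_query.conf shipped with
    # this plug-in; the filter name and its options are defined there.
  }
}
```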