├── images
│   ├── bin.png
│   ├── bin_icon.png
│   └── pipeline_dag.jpg
├── AUTHORS
├── .gitignore
├── src
│   ├── test
│   │   ├── resources
│   │   │   ├── datacatalog-objects
│   │   │   │   ├── TableB_pii_tag.json
│   │   │   │   ├── TableA_pii_tag.json
│   │   │   │   ├── TableA_sample_template_tag.json
│   │   │   │   ├── TableB_entry.json
│   │   │   │   └── TableA_entry.json
│   │   │   ├── single_tag_no_column.json
│   │   │   ├── single_tag_with_column.json
│   │   │   ├── single_tag_enum_column.json
│   │   │   ├── tag_list.json
│   │   │   ├── delete_tag_operation_record.json
│   │   │   ├── delete_tag_request_1.json
│   │   │   ├── create_tag_request.json
│   │   │   ├── update_catalog_request.json
│   │   │   ├── update_tag_finish_request.json
│   │   │   ├── catalog_test_permissions_audit_log.json
│   │   │   ├── create_tag_operation_record.json
│   │   │   ├── tablerow_camelcase_create_record.json
│   │   │   └── tablerow_snakecase_create_record.json
│   │   └── java
│   │       └── com
│   │           └── google
│   │               └── cloud
│   │                   └── solutions
│   │                       └── catalogtagrecording
│   │                           ├── testing
│   │                           │   ├── TestResourceLoader.java
│   │                           │   ├── fakes
│   │                           │   │   ├── datacatalog
│   │                           │   │   │   ├── FakeDataCatalogListTagsApiResponse.java
│   │                           │   │   │   ├── FakeDataCatalogLookupEntryResponse.java
│   │                           │   │   │   ├── FakeListTagsRequestListTagsResponseTagPagedListDescriptor.java
│   │                           │   │   │   ├── FakeDataCatalogPagesListTagsResponse.java
│   │                           │   │   │   ├── FakeNoOpApiTracer.java
│   │                           │   │   │   └── FakeDataCatalogStub.java
│   │                           │   │   ├── FakeApiFutureBase.java
│   │                           │   │   └── FakeApiCallContext.java
│   │                           │   └── PCollectionSatisfies.java
│   │                           ├── ProtoToTableRowMapperTest.java
│   │                           ├── TagRecordingPipelineOptionsTest.java
│   │                           ├── JsonMessageParserTest.java
│   │                           ├── CatalogTagConverterTest.java
│   │                           └── EntityTagOperationRecordExtractorTest.java
│   └── main
│       ├── java
│       │   └── com
│       │       └── google
│       │           └── cloud
│       │               └── solutions
│       │                   └── catalogtagrecording
│       │                       ├── PipelineLauncher.java
│       │                       ├── TagRecordingPipelineOptions.java
│       │                       ├── TagUtility.java
│       │                       ├── ProtoToTableRowMapper.java
│       │                       ├── CatalogTagRecordingPipeline.java
│       │                       ├── DataCatalogService.java
│       │                       ├── JsonMessageParser.java
│       │                       ├── CatalogTagConverter.java
│       │                       ├── ProtoJsonConverter.java
│       │                       └── EntityTagOperationRecordExtractor.java
│       └── proto
│           └── tag_recording_messages.proto
├── CONTRIBUTING.md
├── launch_pipeline.sh
├── env.sh
├── tag_history_collector.yaml
├── code-of-conduct.md
├── camelEntityTagOperationRecords.schema
├── snakeEntityTagOperationRecords.schema
├── LICENSE
├── pom.xml
└── README.md

/images/bin.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GoogleCloudPlatform/datacatalog-tag-history/HEAD/images/bin.png

/images/bin_icon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GoogleCloudPlatform/datacatalog-tag-history/HEAD/images/bin_icon.png

/images/pipeline_dag.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GoogleCloudPlatform/datacatalog-tag-history/HEAD/images/pipeline_dag.jpg

/AUTHORS:
--------------------------------------------------------------------------------
 1 | # This is the list of The Data Catalog Tag History's significant contributors.
 2 | #
 3 | # This does not necessarily list everyone who has contributed code,
 4 | # especially since many employees of one corporation may be contributing.
 5 | # To see the full list of contributors, see the revision history in
 6 | # source control.
7 | Google LLC 8 | Anant Damle 9 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # User-specific stuff 2 | .idea/ 3 | 4 | # File-based project format 5 | *.iws 6 | *.iml 7 | 8 | # generated code 9 | out/ 10 | target/ 11 | .idea_modules/ 12 | 13 | # JIRA plugin 14 | atlassian-ide-plugin.xml 15 | 16 | # Crashlytics plugin (for Android Studio and IntelliJ) 17 | com_crashlytics_export_strings.xml 18 | crashlytics.properties 19 | crashlytics-build.properties 20 | fabric.properties 21 | -------------------------------------------------------------------------------- /src/test/resources/datacatalog-objects/TableB_pii_tag.json: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId/tags/TableBIdTag2Id", 4 | "template": "projects/my-project-id/locations/us-central1/tagTemplates/pii_tag", 5 | "fields": { 6 | "type": { 7 | "displayName": "PII Type", 8 | "stringValue": "TELEPHONE_ID" 9 | } 10 | }, 11 | "templateDisplayName": "PII", 12 | "column": "phone_number" 13 | } 14 | ] -------------------------------------------------------------------------------- /src/test/resources/datacatalog-objects/TableA_pii_tag.json: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/TableAIdTag2Id", 4 | "template": "projects/my-project-id/locations/us-central1/tagTemplates/pii_tag", 5 | "fields": { 6 | "type": { 7 | "displayName": "PII Type", 8 | "stringValue": "USER_ID" 9 | } 10 | }, 11 | "templateDisplayName": "PII" 12 | } 13 | ] -------------------------------------------------------------------------------- 
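The two PII tag fixtures above follow Data Catalog's JSON representation of a `Tag`: a `name` under the entry, a `template` reference, a map of typed `fields`, and an optional `column` for column-level tags. As a quick illustration (not code from this repo), the shape can be sanity-checked with Python's standard-library `json` module; the inline document below mirrors `TableB_pii_tag.json`:

```python
import json

# Mirrors src/test/resources/datacatalog-objects/TableB_pii_tag.json
FIXTURE = """
[
  {
    "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId/tags/TableBIdTag2Id",
    "template": "projects/my-project-id/locations/us-central1/tagTemplates/pii_tag",
    "fields": {
      "type": {"displayName": "PII Type", "stringValue": "TELEPHONE_ID"}
    },
    "templateDisplayName": "PII",
    "column": "phone_number"
  }
]
"""

tags = json.loads(FIXTURE)
for tag in tags:
    # Every tag names its template and carries a map of typed field values.
    assert tag["template"].rsplit("/", 1)[-1] == "pii_tag"
    assert tag["fields"]["type"]["stringValue"] == "TELEPHONE_ID"
    # "column" is optional: present only for column-level tags
    # (compare TableA_pii_tag.json above, which omits it).
    print(tag.get("column"))  # → phone_number
```

Note that `TableA_pii_tag.json` has no `column` key, so `tag.get("column")` would return `None` there; the fixtures deliberately cover both entry-level and column-level tags.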
/src/test/resources/single_tag_no_column.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 3 | "template": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 4 | "fields": { 5 | "primary_source": { 6 | "displayName": "Primary Source", 7 | "boolValue": true 8 | }, 9 | "business_unit": { 10 | "displayName": "Business Unit", 11 | "stringValue": "Professional Services", 12 | "order": 1 13 | }, 14 | "updated_timestamp": { 15 | "displayName": "Updated Timestamp", 16 | "timestampValue": "2020-09-09T15:25:00Z", 17 | "order": 2 18 | } 19 | }, 20 | "templateDisplayName": "Testing Sample Catalog Tag" 21 | } -------------------------------------------------------------------------------- /src/test/resources/single_tag_with_column.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 3 | "template": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 4 | "fields": { 5 | "primary_source": { 6 | "displayName": "Primary Source", 7 | "boolValue": true 8 | }, 9 | "business_unit": { 10 | "displayName": "Business Unit", 11 | "stringValue": "Professional Services", 12 | "order": 1 13 | }, 14 | "version_number": { 15 | "displayName": "Tag Version", 16 | "doubleValue": 1.2, 17 | "order": 2 18 | } 19 | }, 20 | "column": "someColumn", 21 | "templateDisplayName": "Testing Sample Catalog Tag" 22 | } -------------------------------------------------------------------------------- /src/test/resources/single_tag_enum_column.json: 
-------------------------------------------------------------------------------- 1 | { 2 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 3 | "template": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 4 | "fields": { 5 | "primary_source": { 6 | "displayName": "Primary Source", 7 | "boolValue": true 8 | }, 9 | "business_unit": { 10 | "displayName": "Business Unit", 11 | "stringValue": "Professional Services", 12 | "order": 1 13 | }, 14 | "data_category": { 15 | "displayName": "Data Category", 16 | "enumValue": { 17 | "displayName": "PII" 18 | }, 19 | "order": 2 20 | } 21 | }, 22 | "column": "someColumn", 23 | "templateDisplayName": "Testing Sample Catalog Tag" 24 | } -------------------------------------------------------------------------------- /src/test/resources/tag_list.json: -------------------------------------------------------------------------------- 1 | { 2 | "tags": [ 3 | { 4 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 5 | "template": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 6 | "fields": { 7 | "primary_source": { 8 | "displayName": "Primary Source", 9 | "boolValue": true 10 | }, 11 | "business_unit": { 12 | "displayName": "Business Unit", 13 | "stringValue": "Professional Services", 14 | "order": 1 15 | }, 16 | "updated_timestamp": { 17 | "displayName": "Updated Timestamp", 18 | "timestampValue": "2020-09-09T15:25:00Z", 19 | "order": 2 20 | } 21 | }, 22 | "templateDisplayName": "Testing Sample Catalog Tag" 23 | } 24 | ] 25 | } 26 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # 
How to Contribute 2 | 3 | We'd love to accept your patches and contributions to this project. There are 4 | just a few small guidelines you need to follow. 5 | 6 | ## Contributor License Agreement 7 | 8 | Contributions to this project must be accompanied by a Contributor License 9 | Agreement. You (or your employer) retain the copyright to your contribution; 10 | this simply gives us permission to use and redistribute your contributions as 11 | part of the project. Head over to <https://cla.developers.google.com/> to see 12 | your current agreements on file or to sign a new one. 13 | 14 | You generally only need to submit a CLA once, so if you've already submitted one 15 | (even if it was for a different project), you probably don't need to do it 16 | again. 17 | 18 | ## Code reviews 19 | 20 | All submissions, including submissions by project members, require review. We 21 | use GitHub pull requests for this purpose. Consult 22 | [GitHub Help](https://help.github.com/articles/about-pull-requests/) for more 23 | information on using pull requests. 24 | 25 | ## Community Guidelines 26 | 27 | This project follows 28 | [Google's Open Source Community Guidelines](https://opensource.google.com/conduct/).
29 | -------------------------------------------------------------------------------- /src/test/resources/datacatalog-objects/TableA_sample_template_tag.json: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 4 | "template": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 5 | "fields": { 6 | "primary_source": { 7 | "displayName": "Primary Source", 8 | "boolValue": true 9 | }, 10 | "business_unit": { 11 | "displayName": "Business Unit", 12 | "stringValue": "Professional Services", 13 | "order": 1 14 | }, 15 | "updated_timestamp": { 16 | "displayName": "Updated Timestamp", 17 | "timestampValue": "2020-09-09T15:25:00Z", 18 | "order": 2 19 | }, 20 | "data_category": { 21 | "displayName": "Data Category", 22 | "enumValue": { 23 | "displayName": "PII" 24 | }, 25 | "order": 3 26 | }, 27 | "version_number": { 28 | "displayName": "Tag Version", 29 | "doubleValue": 1.2, 30 | "order": 4 31 | } 32 | }, 33 | "column": "columnName", 34 | "templateDisplayName": "Testing Sample Catalog Tag" 35 | } 36 | ] -------------------------------------------------------------------------------- /src/test/resources/delete_tag_operation_record.json: -------------------------------------------------------------------------------- 1 | { 2 | "reconcileTime": "2020-09-11T10:52:01.789Z", 3 | "entity": { 4 | "entityId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId", 5 | "linkedResource": "//bigquery.googleapis.com/projects/myproject1/datasets/mydataset1/tables/TableB" 6 | }, 7 | "auditInformation": { 8 | "insertId": "1g6b184ckhz", 9 | "jobTime": "2020-09-09T15:52:57.614245506Z", 10 | "actuator": "runner@organization.com", 11 | "operation": { 12 | "type": "google.cloud.datacatalog.v1.DataCatalog.DeleteTag", 13 
| "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId/tags/CUzSbDlKy_z_bF" 14 | } 15 | }, 16 | "tags": [ 17 | { 18 | "tagId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId/tags/TableBIdTag2Id", 19 | "templateId": "projects/my-project-id/locations/us-central1/tagTemplates/pii_tag", 20 | "templateName": "PII", 21 | "column": "phone_number", 22 | "fields": [ 23 | { 24 | "fieldId": "type", 25 | "fieldName": "PII Type", 26 | "kind": "STRING", 27 | "stringValue": "TELEPHONE_ID" 28 | } 29 | ] 30 | } 31 | ] 32 | } -------------------------------------------------------------------------------- /src/test/resources/datacatalog-objects/TableB_entry.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId", 3 | "type": "TABLE", 4 | "schema": { 5 | "columns": [ 6 | { 7 | "type": "TIMESTAMP", 8 | "mode": "NULLABLE", 9 | "column": "transaction_time" 10 | }, 11 | { 12 | "type": "STRING", 13 | "mode": "NULLABLE", 14 | "column": "phone_number" 15 | }, 16 | { 17 | "type": "DOUBLE", 18 | "mode": "NULLABLE", 19 | "column": "lat" 20 | }, 21 | { 22 | "type": "DOUBLE", 23 | "mode": "NULLABLE", 24 | "column": "lon" 25 | }, 26 | { 27 | "type": "INT64", 28 | "mode": "NULLABLE", 29 | "column": "id" 30 | }, 31 | { 32 | "type": "INT64", 33 | "mode": "NULLABLE", 34 | "column": "bucket" 35 | } 36 | ] 37 | }, 38 | "sourceSystemTimestamps": { 39 | "createTime": "2019-11-05T06:10:22.137Z", 40 | "updateTime": "2019-11-05T06:10:22.137Z", 41 | "expireTime": "1970-01-01T00:00:00Z" 42 | }, 43 | "linkedResource": "//bigquery.googleapis.com/projects/myproject1/datasets/mydataset1/tables/TableB", 44 | "bigqueryTableSpec": { 45 | "tableSourceType": "BIGQUERY_TABLE" 46 | }, 47 | "integratedSystem": "BIGQUERY" 48 | } 49 | -------------------------------------------------------------------------------- 
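The pair of fixtures above show the relationship the pipeline records: `TableB_entry.json` supplies the entry's column schema, while `delete_tag_operation_record.json` captures the audit event together with the tags still attached to that entry. As a hypothetical consistency check (not a function from this repo), one can verify that every column-level tag in an operation record references a column that actually exists in the entry schema; the inline data below is trimmed from the two fixtures:

```python
# Trimmed copies of the fixtures above: the schema from TableB_entry.json
# and the surviving tag from delete_tag_operation_record.json.
ENTRY_SCHEMA = {
    "columns": [
        {"type": "TIMESTAMP", "mode": "NULLABLE", "column": "transaction_time"},
        {"type": "STRING", "mode": "NULLABLE", "column": "phone_number"},
    ]
}
RECORDED_TAGS = [
    {"tagId": "projects/my-project-id/locations/us/entryGroups/@bigquery"
              "/entries/tableBentryId/tags/TableBIdTag2Id",
     "column": "phone_number"},
]

known_columns = {c["column"] for c in ENTRY_SCHEMA["columns"]}

# Hypothetical check: a column-level tag should point at a real schema column.
dangling = [t for t in RECORDED_TAGS if t.get("column") not in known_columns]
assert not dangling, f"tags reference unknown columns: {dangling}"
```

Here the recorded tag's `column` of `phone_number` matches the second schema column, so the check passes; an entry-level tag (no `column` key) would need to be filtered out before such a check in real use.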
/launch_pipeline.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | # 3 | # Copyright 2020 The Data Catalog Tag History Authors. 4 | # 5 | # Licensed under the Apache License, Version 2.0 (the "License"); 6 | # you may not use this file except in compliance with the License. 7 | # You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | mvn clean generate-sources compile package exec:java \ 19 | -Dexec.mainClass=com.google.cloud.solutions.catalogtagrecording.PipelineLauncher \ 20 | -Dexec.cleanupDaemonThreads=false \ 21 | -Dmaven.test.skip=true \ 22 | -Dexec.args=" \ 23 | --streaming=true \ 24 | --project=${PROJECT_ID} \ 25 | --serviceAccount=${TAG_HISTORY_SERVICE_ACCOUNT_EMAIL} \ 26 | --runner=DataflowRunner \ 27 | --gcpTempLocation=gs://${TEMP_GCS_BUCKET}/temp/ \ 28 | --stagingLocation=gs://${TEMP_GCS_BUCKET}/staging/ \ 29 | --workerMachineType=n1-standard-1 \ 30 | --region=${REGION_ID} \ 31 | --tagsBigqueryTable=${PROJECT_ID}:${DATASET_ID}.${TABLE_ID} \ 32 | --catalogAuditLogsSubscription=projects/${PROJECT_ID}/subscriptions/${LOGS_SUBSCRIPTION_ID}" 33 | -------------------------------------------------------------------------------- /src/test/resources/datacatalog-objects/TableA_entry.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 3 | "type": "TABLE", 4 | "schema": { 5 | "columns": [ 6 | { 7 | 
"type": "TIMESTAMP", 8 | "mode": "NULLABLE", 9 | "column": "transaction_time" 10 | }, 11 | { 12 | "type": "INT64", 13 | "mode": "NULLABLE", 14 | "column": "keyid" 15 | }, 16 | { 17 | "type": "DOUBLE", 18 | "mode": "NULLABLE", 19 | "column": "lat" 20 | }, 21 | { 22 | "type": "DOUBLE", 23 | "mode": "NULLABLE", 24 | "column": "lon" 25 | }, 26 | { 27 | "type": "INT64", 28 | "mode": "NULLABLE", 29 | "column": "id" 30 | }, 31 | { 32 | "type": "INT64", 33 | "mode": "NULLABLE", 34 | "column": "bucket" 35 | } 36 | ] 37 | }, 38 | "sourceSystemTimestamps": { 39 | "createTime": "2019-11-05T06:10:22.137Z", 40 | "updateTime": "2019-11-05T06:10:22.137Z", 41 | "expireTime": "1970-01-01T00:00:00Z" 42 | }, 43 | "linkedResource": "//bigquery.googleapis.com/projects/myproject1/datasets/mydataset1/tables/TableA", 44 | "bigqueryTableSpec": { 45 | "tableSourceType": "BIGQUERY_TABLE" 46 | }, 47 | "integratedSystem": "BIGQUERY" 48 | } 49 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/TestResourceLoader.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing; 18 | 19 | import java.io.IOException; 20 | import java.nio.charset.StandardCharsets; 21 | import java.nio.file.Files; 22 | import java.nio.file.Paths; 23 | 24 | public final class TestResourceLoader { 25 | 26 | private static final String TEST_RESOURCE_FOLDER = "test"; 27 | 28 | public static String load(String resourceFileName) { 29 | try { 30 | byte[] bytes = 31 | Files.readAllBytes(Paths.get("src", TEST_RESOURCE_FOLDER, "resources", resourceFileName)); 32 | return new String(bytes, StandardCharsets.UTF_8); 33 | } catch (IOException ioException) { 34 | return ""; 35 | } 36 | } 37 | 38 | private TestResourceLoader() { 39 | } 40 | } 41 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/PipelineLauncher.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import org.apache.beam.sdk.Pipeline; 20 | import org.apache.beam.sdk.options.PipelineOptionsFactory; 21 | 22 | /** 23 | * Entry point to parse the CLI arguments as Pipeline options and launch the pipeline. 
24 | */ 25 | public final class PipelineLauncher { 26 | 27 | public static void main(String[] args) { 28 | PipelineOptionsFactory.register(TagRecordingPipelineOptions.class); 29 | 30 | TagRecordingPipelineOptions options = 31 | PipelineOptionsFactory.fromArgs(args).as(TagRecordingPipelineOptions.class); 32 | 33 | CatalogTagRecordingPipeline.builder() 34 | .options(options) 35 | .pipeline(Pipeline.create(options)) 36 | .build() 37 | .run(); 38 | } 39 | } 40 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/datacatalog/FakeDataCatalogListTagsApiResponse.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes.datacatalog; 18 | 19 | import com.google.cloud.datacatalog.v1beta1.ListTagsResponse; 20 | import com.google.cloud.datacatalog.v1beta1.Tag; 21 | import com.google.cloud.solutions.catalogtagrecording.testing.fakes.FakeApiFutureBase; 22 | import com.google.common.collect.ImmutableCollection; 23 | 24 | /** 25 | * Fake implementation of List Tags Response 26 | */ 27 | public class FakeDataCatalogListTagsApiResponse extends FakeApiFutureBase<ListTagsResponse> { 28 | 29 | private final ImmutableCollection<Tag> tags; 30 | 31 | public FakeDataCatalogListTagsApiResponse(ImmutableCollection<Tag> tags) { 32 | this.tags = tags; 33 | } 34 | 35 | @Override 36 | public ListTagsResponse get() { 37 | return ListTagsResponse.newBuilder().addAllTags(tags).build(); 38 | } 39 | } 40 | -------------------------------------------------------------------------------- /src/main/proto/tag_recording_messages.proto: -------------------------------------------------------------------------------- 1 | syntax = "proto3"; 2 | 3 | package com.google.cloud.solutions.catalogtagrecording; 4 | 5 | import "google/protobuf/timestamp.proto"; 6 | 7 | message EntityTagOperationRecord { 8 | .google.protobuf.Timestamp reconcile_time = 1; 9 | 10 | CatalogEntry entity = 5; 11 | 12 | AuditInformation audit_information = 10; 13 | 14 | repeated CatalogTag tags = 15; 15 | } 16 | 17 | message CatalogEntry { 18 | string entity_id = 1; 19 | 20 | string linked_resource = 2; 21 | 22 | string sql_resource = 3; 23 | } 24 | 25 | message AuditInformation { 26 | string insert_id = 1; 27 | 28 | .google.protobuf.Timestamp job_time = 5; 29 | 30 | string actuator = 10; 31 | 32 | OperationInformation operation = 15; 33 | } 34 | 35 | message OperationInformation { 36 | string type = 1; 37 | string resource = 5; 38 | } 39 | 40 | message CatalogTag { 41 | string tag_id = 1; 42 | 43 | string template_id = 2; 44 | 45 | string template_name = 3; 46 | 47 | string column = 4; 48
| 49 | repeated CatalogTagField fields = 10; 50 | } 51 | 52 | message CatalogTagField { 53 | string field_id = 1; 54 | 55 | string field_name = 2; 56 | 57 | CatalogTagFieldKinds kind = 3; 58 | 59 | oneof value { 60 | bool bool_value = 11; 61 | double double_value = 12; 62 | string string_value = 13; 63 | .google.protobuf.Timestamp timestamp_value = 14; 64 | EnumValue enum_value = 15; 65 | } 66 | 67 | message EnumValue { 68 | string display_name = 1; 69 | } 70 | 71 | enum CatalogTagFieldKinds { 72 | UNKNOWN_TAG_FIELD_KIND = 0; 73 | BOOL = 1; 74 | DOUBLE = 2; 75 | STRING = 3; 76 | TIMESTAMP = 4; 77 | ENUM = 5; 78 | } 79 | } 80 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/FakeApiFutureBase.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes; 18 | 19 | import com.google.api.core.ApiFuture; 20 | import java.util.concurrent.ExecutionException; 21 | import java.util.concurrent.Executor; 22 | import java.util.concurrent.TimeUnit; 23 | import javax.annotation.Nonnull; 24 | 25 | public abstract class FakeApiFutureBase<V> implements ApiFuture<V> { 26 | 27 | @Override 28 | public final void addListener(Runnable runnable, Executor executor) { 29 | executor.execute(runnable); 30 | } 31 | 32 | @Override 33 | public final boolean cancel(boolean b) { 34 | return false; 35 | } 36 | 37 | @Override 38 | public final boolean isCancelled() { 39 | return false; 40 | } 41 | 42 | @Override 43 | public final boolean isDone() { 44 | return true; 45 | } 46 | 47 | @Override 48 | public final V get(long l, @Nonnull TimeUnit timeUnit) 49 | throws InterruptedException, ExecutionException { 50 | return get(); 51 | } 52 | } 53 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/TagRecordingPipelineOptions.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License.
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import org.apache.beam.sdk.extensions.gcp.options.GcpOptions; 20 | import org.apache.beam.sdk.options.Default; 21 | import org.apache.beam.sdk.options.Description; 22 | import org.apache.beam.sdk.options.Validation.Required; 23 | 24 | public interface TagRecordingPipelineOptions extends GcpOptions { 25 | 26 | @Description("The Pub/Sub subscription Id to use for receiving Data Catalog Audit logs") 27 | @Required 28 | String getCatalogAuditLogsSubscription(); 29 | 30 | void setCatalogAuditLogsSubscription(String pubsubSubscription); 31 | 32 | @Description("The BigQuery table Id in \"<project>:<dataset>.<table>\" format") 33 | @Required 34 | String getTagsBigqueryTable(); 35 | 36 | void setTagsBigqueryTable(String tableName); 37 | 38 | @Description( 39 | "The column naming convention to use: set true for snake_case and false for camelCase") 40 | @Default.Boolean(false) 41 | boolean isSnakeCaseColumnNames(); 42 | 43 | void setSnakeCaseColumnNames(boolean snakeCaseColumnNames); 44 | } 45 | -------------------------------------------------------------------------------- /env.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | # 3 | # Copyright 2020 The Data Catalog Tag History Authors. 4 | # 5 | # Licensed under the Apache License, Version 2.0 (the "License"); 6 | # you may not use this file except in compliance with the License. 7 | # You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License.
16 | # 17 | 18 | # Setup the GCP project to use for this tutorial 19 | export PROJECT_ID="google.com:anantd" 20 | 21 | # The BigQuery region to use for Tags table 22 | export BIGQUERY_REGION="us" 23 | 24 | # The name of the BigQuery Dataset to create the Tag records table 25 | export DATASET_ID="catalog_dumper" 26 | 27 | # The name of the BigQuery table for Tag records 28 | export TABLE_ID="CamelCaseEntryLogRecord" 29 | #"EntityTagOperationRecords" 30 | 31 | # The Compute region to use for running Dataflow jobs and create a temporary storage bucket 32 | export REGION_ID="us-central1" 33 | 34 | # define the bucket id 35 | export TEMP_GCS_BUCKET="temp-tags-dumper" 36 | 37 | # define the name of the Pub/Sub log sink in Cloud Logging 38 | export LOGS_SINK_NAME="datacatalog-audit-pubsub" 39 | 40 | #define Pub/Sub topic for receiving AuditLog events 41 | export LOGS_SINK_TOPIC_ID="catalog-audit-log-sink" 42 | 43 | # define the subscription id 44 | export LOGS_SUBSCRIPTION_ID="catalog-tags-dumper" 45 | 46 | # name of the service account to use (not the email address) 47 | export TAG_HISTORY_SERVICE_ACCOUNT="tag-history-collector" 48 | export TAG_HISTORY_SERVICE_ACCOUNT_EMAIL="${TAG_HISTORY_SERVICE_ACCOUNT}@$(echo $PROJECT_ID | awk -F':' '{print $2"."$1}' | sed 's/^\.//').iam.gserviceaccount.com" 49 | -------------------------------------------------------------------------------- /src/test/resources/delete_tag_request_1.json: -------------------------------------------------------------------------------- 1 | { 2 | "protoPayload": { 3 | "@type": "type.googleapis.com/google.cloud.audit.AuditLog", 4 | "status": {}, 5 | "authenticationInfo": { 6 | "principalEmail": "runner@organization.com" 7 | }, 8 | "requestMetadata": { 9 | "callerIp": "1.2.3.4", 10 | "callerSuppliedUserAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36,gzip(gfe),gzip(gfe)", 11 | "requestAttributes": { 12 | "time": 
"2020-09-09T15:52:57.622061696Z", 13 | "auth": {} 14 | }, 15 | "destinationAttributes": {} 16 | }, 17 | "serviceName": "datacatalog.googleapis.com", 18 | "methodName": "google.cloud.datacatalog.v1.DataCatalog.DeleteTag", 19 | "authorizationInfo": [ 20 | { 21 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId/tags/CUzSbDlKy_z_bF", 22 | "permission": "datacatalog.tagTemplates.use", 23 | "granted": true, 24 | "resourceAttributes": {} 25 | } 26 | ], 27 | "resourceName": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/tableBentryId/tags/CUzSbDlKy_z_bF", 28 | "request": { 29 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.DeleteTagRequest" 30 | }, 31 | "response": { 32 | "@type": "type.googleapis.com/google.protobuf.Empty" 33 | } 34 | }, 35 | "insertId": "1g6b184ckhz", 36 | "resource": { 37 | "type": "audited_resource", 38 | "labels": { 39 | "service": "datacatalog.googleapis.com", 40 | "project_id": "my-project-id", 41 | "method": "google.cloud.datacatalog.v1.DataCatalog.DeleteTag" 42 | } 43 | }, 44 | "timestamp": "2020-09-09T15:52:57.614245506Z", 45 | "severity": "NOTICE", 46 | "logName": "projects/my-project-id/logs/cloudaudit.googleapis.com%2Factivity", 47 | "receiveTimestamp": "2020-09-09T15:52:58.236399899Z" 48 | } -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/PCollectionSatisfies.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 
6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing; 18 | 19 | import static com.google.common.truth.Truth.assertThat; 20 | 21 | import com.google.common.collect.ImmutableSet; 22 | import java.util.Arrays; 23 | import org.apache.beam.sdk.transforms.SerializableFunction; 24 | 25 | public class PCollectionSatisfies { 26 | 27 | @SafeVarargs 28 | public static <T> SerializableFunction<Iterable<T>, Void> expectedSet( 29 | T expected1, T... otherExpectedValues) { 30 | return expectedSet( 31 | ImmutableSet.<T>builder() 32 | .add(expected1) 33 | .addAll(Arrays.asList(otherExpectedValues)) 34 | .build()); 35 | } 36 | 37 | public static <T> SerializableFunction<Iterable<T>, Void> expectedSet(ImmutableSet<T> expected) { 38 | 39 | return (SerializableFunction<Iterable<T>, Void>) 40 | input -> { 41 | assertThat(input).containsExactlyElementsIn(expected); 42 | return null; 43 | }; 44 | } 45 | 46 | public static <T> SerializableFunction<T, Void> singleton(T expected) { 47 | return (SerializableFunction<T, Void>) 48 | input -> { 49 | assertThat(input).isEqualTo(expected); 50 | return null; 51 | }; 52 | } 53 | } 54 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/datacatalog/FakeDataCatalogLookupEntryResponse.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors.
3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes.datacatalog; 18 | 19 | import static org.apache.commons.lang3.StringUtils.isBlank; 20 | 21 | import com.google.cloud.datacatalog.v1beta1.Entry; 22 | import com.google.cloud.datacatalog.v1beta1.LookupEntryRequest; 23 | import com.google.cloud.solutions.catalogtagrecording.testing.fakes.FakeApiFutureBase; 24 | import java.io.IOException; 25 | import java.util.concurrent.ExecutionException; 26 | 27 | public class FakeDataCatalogLookupEntryResponse extends FakeApiFutureBase<Entry> { 28 | 29 | private final Entry entry; 30 | private final LookupEntryRequest request; 31 | 32 | public FakeDataCatalogLookupEntryResponse(Entry entry, LookupEntryRequest request) { 33 | this.entry = entry; 34 | this.request = request; 35 | } 36 | 37 | @Override 38 | public Entry get() throws ExecutionException { 39 | 40 | if (isBlank(request.getLinkedResource())) { 41 | throw new UnsupportedOperationException("Sql Resource based search is not supported in fake"); 42 | } 43 | 44 | if (entry == null) { 45 | String msg = String.format("Entry Not Found: (%s)", request.getLinkedResource()); 46 | throw new ExecutionException(msg, new IOException(msg)); 47 | } 48 | 49 | return entry; 50 | } 51 | } 52 | -------------------------------------------------------------------------------- /src/test/resources/create_tag_request.json:
-------------------------------------------------------------------------------- 1 | { 2 | "protoPayload": { 3 | "@type": "type.googleapis.com/google.cloud.audit.AuditLog", 4 | "status": {}, 5 | "authenticationInfo": { 6 | "principalEmail": "runner@organization.com" 7 | }, 8 | "requestMetadata": { 9 | "callerIp": "1.2.3.4", 10 | "callerSuppliedUserAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36,gzip(gfe),gzip(gfe)", 11 | "requestAttributes": { 12 | "time": "2020-09-09T15:25:57.094999275Z", 13 | "auth": {} 14 | }, 15 | "destinationAttributes": {} 16 | }, 17 | "serviceName": "datacatalog.googleapis.com", 18 | "methodName": "google.cloud.datacatalog.v1.DataCatalog.CreateTag", 19 | "authorizationInfo": [ 20 | { 21 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 22 | "permission": "datacatalog.tagTemplates.use", 23 | "granted": true, 24 | "resourceAttributes": {} 25 | } 26 | ], 27 | "resourceName": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 28 | "request": { 29 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.CreateTagRequest" 30 | }, 31 | "response": { 32 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.Tag" 33 | } 34 | }, 35 | "insertId": "sb27clc2el", 36 | "resource": { 37 | "type": "audited_resource", 38 | "labels": { 39 | "method": "google.cloud.datacatalog.v1.DataCatalog.CreateTag", 40 | "project_id": "my-project-id", 41 | "service": "datacatalog.googleapis.com" 42 | } 43 | }, 44 | "timestamp": "2020-09-09T15:25:57.088307939Z", 45 | "severity": "NOTICE", 46 | "logName": "projects/my-project-id/logs/cloudaudit.googleapis.com%2Factivity", 47 | "receiveTimestamp": "2020-09-09T15:25:58.986691326Z" 48 | } 
-------------------------------------------------------------------------------- /tag_history_collector.yaml: -------------------------------------------------------------------------------- 1 | # Copyright 2020 The Data Catalog Tag History Authors. 2 | # 3 | # Licensed under the Apache License, Version 2.0 (the "License"); 4 | # you may not use this file except in compliance with the License. 5 | # You may obtain a copy of the License at 6 | # 7 | # http://www.apache.org/licenses/LICENSE-2.0 8 | # 9 | # Unless required by applicable law or agreed to in writing, software 10 | # distributed under the License is distributed on an "AS IS" BASIS, 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | # See the License for the specific language governing permissions and 13 | # limitations under the License. 14 | 15 | title: "Catalog Tag history collector" 16 | description: "Allows the service account to read log entries from Pub/Sub and Data Catalog tags, and to write Tag records to a BigQuery table."
17 | stage: "BETA" 18 | includedPermissions: 19 | # Generic 20 | - resourcemanager.projects.get 21 | # Data Catalog permissions 22 | - datacatalog.tagTemplates.get 23 | - datacatalog.tagTemplates.getTag 24 | - datacatalog.entries.get 25 | - datacatalog.entries.list 26 | - datacatalog.entryGroups.get 27 | - datacatalog.entryGroups.list 28 | - datacatalog.taxonomies.get 29 | - datacatalog.taxonomies.list 30 | # BigQuery permissions 31 | - bigquery.datasets.get 32 | - bigquery.datasets.getIamPolicy 33 | - bigquery.models.getData 34 | - bigquery.models.getMetadata 35 | - bigquery.models.list 36 | - bigquery.tables.create 37 | - bigquery.tables.delete 38 | - bigquery.tables.export 39 | - bigquery.tables.get 40 | - bigquery.tables.getData 41 | - bigquery.tables.getIamPolicy 42 | - bigquery.tables.list 43 | - bigquery.tables.update 44 | - bigquery.tables.updateData 45 | # PubSub permissions 46 | - pubsub.topics.get 47 | - pubsub.snapshots.seek 48 | - pubsub.subscriptions.consume 49 | - pubsub.topics.attachSubscription 50 | -------------------------------------------------------------------------------- /src/test/resources/update_catalog_request.json: -------------------------------------------------------------------------------- 1 | { 2 | "protoPayload": { 3 | "@type": "type.googleapis.com/google.cloud.audit.AuditLog", 4 | "status": {}, 5 | "authenticationInfo": { 6 | "principalEmail": "runner@organization.com" 7 | }, 8 | "requestMetadata": { 9 | "callerIp": "1.2.3.4", 10 | "callerSuppliedUserAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36,gzip(gfe),gzip(gfe)", 11 | "requestAttributes": { 12 | "time": "2020-09-09T15:46:06.811116360Z", 13 | "auth": {} 14 | }, 15 | "destinationAttributes": {} 16 | }, 17 | "serviceName": "datacatalog.googleapis.com", 18 | "methodName":
"google.cloud.datacatalog.v1.DataCatalog.UpdateTag", 19 | "authorizationInfo": [ 20 | { 21 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 22 | "permission": "datacatalog.tagTemplates.use", 23 | "granted": true, 24 | "resourceAttributes": {} 25 | } 26 | ], 27 | "resourceName": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 28 | "request": { 29 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.UpdateTagRequest" 30 | }, 31 | "response": { 32 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.Tag" 33 | } 34 | }, 35 | "insertId": "1e833o3ckn9", 36 | "resource": { 37 | "type": "audited_resource", 38 | "labels": { 39 | "project_id": "my-project-id", 40 | "service": "datacatalog.googleapis.com", 41 | "method": "google.cloud.datacatalog.v1.DataCatalog.UpdateTag" 42 | } 43 | }, 44 | "timestamp": "2020-09-09T15:46:06.796559969Z", 45 | "severity": "NOTICE", 46 | "logName": "projects/my-project-id/logs/cloudaudit.googleapis.com%2Factivity", 47 | "receiveTimestamp": "2020-09-09T15:46:08.417640591Z" 48 | } -------------------------------------------------------------------------------- /src/test/resources/update_tag_finish_request.json: -------------------------------------------------------------------------------- 1 | { 2 | "protoPayload": { 3 | "@type": "type.googleapis.com/google.cloud.audit.AuditLog", 4 | "status": {}, 5 | "authenticationInfo": { 6 | "principalEmail": "runner@organization.com" 7 | }, 8 | "requestMetadata": { 9 | "callerIp": "1.2.3.4", 10 | "callerSuppliedUserAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36,gzip(gfe),gzip(gfe)", 11 | "requestAttributes": { 12 | "time": 
"2020-09-09T15:46:06.818985792Z", 13 | "auth": {} 14 | }, 15 | "destinationAttributes": {} 16 | }, 17 | "serviceName": "datacatalog.googleapis.com", 18 | "methodName": "google.cloud.datacatalog.v1.DataCatalog.UpdateTag", 19 | "authorizationInfo": [ 20 | { 21 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 22 | "permission": "datacatalog.tagTemplates.use", 23 | "granted": true, 24 | "resourceAttributes": {} 25 | } 26 | ], 27 | "resourceName": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 28 | "request": { 29 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.UpdateTagRequest" 30 | }, 31 | "response": { 32 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.Tag" 33 | } 34 | }, 35 | "insertId": "1e833o3ckna", 36 | "resource": { 37 | "type": "audited_resource", 38 | "labels": { 39 | "method": "google.cloud.datacatalog.v1.DataCatalog.UpdateTag", 40 | "project_id": "my-project-id", 41 | "service": "datacatalog.googleapis.com" 42 | } 43 | }, 44 | "timestamp": "2020-09-09T15:46:06.818494983Z", 45 | "severity": "NOTICE", 46 | "logName": "projects/my-project-id/logs/cloudaudit.googleapis.com%2Factivity", 47 | "receiveTimestamp": "2020-09-09T15:46:08.417640591Z" 48 | } -------------------------------------------------------------------------------- /src/test/resources/catalog_test_permissions_audit_log.json: -------------------------------------------------------------------------------- 1 | { 2 | "protoPayload": { 3 | "@type": "type.googleapis.com/google.cloud.audit.AuditLog", 4 | "status": {}, 5 | "authenticationInfo": { 6 | "principalEmail": "runner@organization.com" 7 | }, 8 | "requestMetadata": { 9 | "callerIp": "1.2.3.4", 10 | "callerSuppliedUserAgent": "Mozilla/5.0 (Windows 
NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36,gzip(gfe),gzip(gfe)", 11 | "requestAttributes": { 12 | "time": "2020-09-09T07:57:49.973487721Z", 13 | "auth": {} 14 | }, 15 | "destinationAttributes": {} 16 | }, 17 | "serviceName": "datacatalog.googleapis.com", 18 | "methodName": "google.cloud.datacatalog.v1.DataCatalog.TestUpdateTagPermission", 19 | "authorizationInfo": [ 20 | { 21 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 22 | "permission": "datacatalog.tagTemplates.use", 23 | "granted": true, 24 | "resourceAttributes": {} 25 | } 26 | ], 27 | "resourceName": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 28 | "request": { 29 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.TestUpdateTagPermissionRequest" 30 | }, 31 | "response": { 32 | "@type": "type.googleapis.com/google.cloud.datacatalog.v1.TestUpdateTagPermissionResponse" 33 | } 34 | }, 35 | "insertId": "pj9wuhciwq", 36 | "resource": { 37 | "type": "audited_resource", 38 | "labels": { 39 | "project_id": "my-project-id", 40 | "service": "datacatalog.googleapis.com", 41 | "method": "google.cloud.datacatalog.v1.DataCatalog.TestUpdateTagPermission" 42 | } 43 | }, 44 | "timestamp": "2020-09-09T07:57:49.966001498Z", 45 | "severity": "NOTICE", 46 | "logName": "projects/my-project-id/logs/cloudaudit.googleapis.com%2Factivity", 47 | "receiveTimestamp": "2020-09-09T07:57:51.303584689Z" 48 | } -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/datacatalog/FakeListTagsRequestListTagsResponseTagPagedListDescriptor.java: -------------------------------------------------------------------------------- 1 | /* 2 | * 
Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes.datacatalog; 18 | 19 | import com.google.api.gax.rpc.PagedListDescriptor; 20 | import com.google.cloud.datacatalog.v1beta1.ListTagsRequest; 21 | import com.google.cloud.datacatalog.v1beta1.ListTagsResponse; 22 | import com.google.cloud.datacatalog.v1beta1.Tag; 23 | 24 | class FakeListTagsRequestListTagsResponseTagPagedListDescriptor 25 | implements PagedListDescriptor<ListTagsRequest, ListTagsResponse, Tag> { 26 | 27 | @Override 28 | public String emptyToken() { 29 | return ListTagsRequest.getDefaultInstance().getPageToken(); 30 | } 31 | 32 | @Override 33 | public ListTagsRequest injectToken(ListTagsRequest listTagsRequest, String s) { 34 | return listTagsRequest.toBuilder().setPageToken(s).build(); 35 | } 36 | 37 | @Override 38 | public ListTagsRequest injectPageSize(ListTagsRequest listTagsRequest, int i) { 39 | return listTagsRequest.toBuilder().setPageSize(i).build(); 40 | } 41 | 42 | @Override 43 | public Integer extractPageSize(ListTagsRequest listTagsRequest) { 44 | return listTagsRequest.getPageSize(); 45 | } 46 | 47 | @Override 48 | public String extractNextToken(ListTagsResponse listTagsResponse) { 49 | return listTagsResponse.getNextPageToken(); 50 | } 51 | 52 | @Override 53 | public Iterable<Tag> extractResources(ListTagsResponse listTagsResponse) { 54 | return
listTagsResponse.getTagsList(); 55 | } 56 | } 57 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/ProtoToTableRowMapperTest.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.cloud.solutions.catalogtagrecording.ProtoJsonConverter.asJsonString; 20 | import static com.google.cloud.solutions.catalogtagrecording.ProtoJsonConverter.parseJson; 21 | import static com.google.cloud.solutions.catalogtagrecording.testing.TestResourceLoader.load; 22 | 23 | import com.google.api.services.bigquery.model.TableRow; 24 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.EntityTagOperationRecord; 25 | import org.junit.Test; 26 | import org.junit.runner.RunWith; 27 | import org.junit.runners.JUnit4; 28 | import org.skyscreamer.jsonassert.JSONAssert; 29 | 30 | @RunWith(JUnit4.class) 31 | public final class ProtoToTableRowMapperTest { 32 | 33 | @Test 34 | public void apply_snakeCase_valid() { 35 | EntityTagOperationRecord record = 36 | parseJson(load("create_tag_operation_record.json"), EntityTagOperationRecord.class); 37 | 38 | TableRow row = 
ProtoToTableRowMapper.withSnakeCase().apply(record); 39 | 40 | JSONAssert.assertEquals( 41 | load("tablerow_snakecase_create_record.json"), asJsonString(row), /*strict=*/ false); 42 | } 43 | 44 | @Test 45 | public void apply_camelCase_valid() { 46 | EntityTagOperationRecord record = 47 | parseJson(load("create_tag_operation_record.json"), EntityTagOperationRecord.class); 48 | 49 | TableRow row = ProtoToTableRowMapper.withCamelCase().apply(record); 50 | 51 | JSONAssert.assertEquals( 52 | load("tablerow_camelcase_create_record.json"), asJsonString(row), /*strict=*/ false); 53 | } 54 | } 55 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/TagUtility.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import com.google.cloud.datacatalog.v1beta1.Tag; 20 | import java.util.regex.Matcher; 21 | import java.util.regex.Pattern; 22 | 23 | /** 24 | * Utility methods to extract parts of Data Catalog Tag. 
25 | */ 26 | public final class TagUtility { 27 | 28 | public static final Pattern ENTITY_TAG_MATCHER = 29 | Pattern.compile( 30 | "^(?<entryId>projects/[^/]+/locations/[^/]+/entryGroups/[^/]+/entries/[^/]+)"); 31 | 32 | /** 33 | * Uses {@link Tag#getName()} field to extract parent. 34 | * 35 | * @param tag the DataCatalog Tag applied to an Entity. 36 | * @return the parent id. 37 | */ 38 | public static String extractParent(Tag tag) { 39 | return extractEntityId(tag.getName()); 40 | } 41 | 42 | /** 43 | * Returns the DataCatalog entryId by matching the regex 44 | *
{@code ^(?<entryId>projects/[^/]+/locations/[^/]+/entryGroups/[^/]+/entries/[^/]+)}
45 | */ 46 | public static String extractEntityId(String resource) { 47 | Matcher matcher = ENTITY_TAG_MATCHER.matcher(resource); 48 | if (!matcher.find()) { 49 | throw new InvalidCatalogResource(resource); 50 | } 51 | 52 | return matcher.group("entryId"); 53 | } 54 | 55 | /** 56 | * Custom Exception class to signal invalid entryId. 57 | */ 58 | public static class InvalidCatalogResource extends RuntimeException { 59 | 60 | public InvalidCatalogResource(String resource) { 61 | super( 62 | String.format( 63 | "Resource format is incorrect: %s%nNeed like:%s", 64 | resource, ENTITY_TAG_MATCHER.pattern())); 65 | } 66 | } 67 | 68 | private TagUtility() { 69 | } 70 | } 71 | -------------------------------------------------------------------------------- /src/test/resources/create_tag_operation_record.json: -------------------------------------------------------------------------------- 1 | { 2 | "reconcileTime": "2020-09-11T10:52:01.789Z", 3 | "entity": { 4 | "entityId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 5 | "linkedResource": "//bigquery.googleapis.com/projects/myproject1/datasets/mydataset1/tables/TableA" 6 | }, 7 | "auditInformation": { 8 | "insertId": "sb27clc2el", 9 | "jobTime": "2020-09-09T15:25:57.088307939Z", 10 | "actuator": "runner@organization.com", 11 | "operation": { 12 | "type": "google.cloud.datacatalog.v1.DataCatalog.CreateTag", 13 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk" 14 | } 15 | }, 16 | "tags": [ 17 | { 18 | "tagId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 19 | "templateId": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 20 | 
"templateName": "Testing Sample Catalog Tag", 21 | "column": "columnName", 22 | "fields": [ 23 | { 24 | "fieldId": "primary_source", 25 | "fieldName": "Primary Source", 26 | "kind": "BOOL", 27 | "boolValue": true 28 | }, 29 | { 30 | "fieldId": "business_unit", 31 | "fieldName": "Business Unit", 32 | "kind": "STRING", 33 | "stringValue": "Professional Services" 34 | }, 35 | { 36 | "fieldId": "updated_timestamp", 37 | "fieldName": "Updated Timestamp", 38 | "kind": "TIMESTAMP", 39 | "timestampValue": "2020-09-09T15:25:00Z" 40 | }, 41 | { 42 | "fieldId": "data_category", 43 | "fieldName": "Data Category", 44 | "kind": "ENUM", 45 | "enumValue": { 46 | "displayName": "PII" 47 | } 48 | }, 49 | { 50 | "fieldId": "version_number", 51 | "fieldName": "Tag Version", 52 | "kind": "DOUBLE", 53 | "doubleValue": 1.2 54 | } 55 | ] 56 | }, 57 | { 58 | "tagId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/TableAIdTag2Id", 59 | "templateId": "projects/my-project-id/locations/us-central1/tagTemplates/pii_tag", 60 | "templateName": "PII", 61 | "fields": [ 62 | { 63 | "fieldId": "type", 64 | "fieldName": "PII Type", 65 | "kind": "STRING", 66 | "stringValue": "USER_ID" 67 | } 68 | ] 69 | } 70 | ] 71 | } -------------------------------------------------------------------------------- /src/test/resources/tablerow_camelcase_create_record.json: -------------------------------------------------------------------------------- 1 | { 2 | "reconcileTime": "2020-09-11T10:52:01.789Z", 3 | "entity": { 4 | "entityId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 5 | "linkedResource": "//bigquery.googleapis.com/projects/myproject1/datasets/mydataset1/tables/TableA" 6 | }, 7 | "auditInformation": { 8 | "insertId": "sb27clc2el", 9 | "jobTime": 
"2020-09-09T15:25:57.088307939Z", 10 | "actuator": "runner@organization.com", 11 | "operation": { 12 | "type": "google.cloud.datacatalog.v1.DataCatalog.CreateTag", 13 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk" 14 | } 15 | }, 16 | "tags": [ 17 | { 18 | "tagId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 19 | "templateId": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 20 | "templateName": "Testing Sample Catalog Tag", 21 | "column": "columnName", 22 | "fields": [ 23 | { 24 | "fieldId": "primary_source", 25 | "fieldName": "Primary Source", 26 | "kind": "BOOL", 27 | "boolValue": true 28 | }, 29 | { 30 | "fieldId": "business_unit", 31 | "fieldName": "Business Unit", 32 | "kind": "STRING", 33 | "stringValue": "Professional Services" 34 | }, 35 | { 36 | "fieldId": "updated_timestamp", 37 | "fieldName": "Updated Timestamp", 38 | "kind": "TIMESTAMP", 39 | "timestampValue": "2020-09-09T15:25:00Z" 40 | }, 41 | { 42 | "fieldId": "data_category", 43 | "fieldName": "Data Category", 44 | "kind": "ENUM", 45 | "enumValue": { 46 | "displayName": "PII" 47 | } 48 | }, 49 | { 50 | "fieldId": "version_number", 51 | "fieldName": "Tag Version", 52 | "kind": "DOUBLE", 53 | "doubleValue": 1.2 54 | } 55 | ] 56 | }, 57 | { 58 | "tagId": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/TableAIdTag2Id", 59 | "templateId": "projects/my-project-id/locations/us-central1/tagTemplates/pii_tag", 60 | "templateName": "PII", 61 | "fields": [ 62 | { 63 | "fieldId": "type", 64 | "fieldName": "PII Type", 65 | "kind": "STRING", 66 | "stringValue": "USER_ID" 67 | } 68 | ] 69 | } 70 | ] 71 | } 72 | 
-------------------------------------------------------------------------------- /src/test/resources/tablerow_snakecase_create_record.json: -------------------------------------------------------------------------------- 1 | { 2 | "reconcile_time": "2020-09-11T10:52:01.789Z", 3 | "entity": { 4 | "entity_id": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk", 5 | "linked_resource": "//bigquery.googleapis.com/projects/myproject1/datasets/mydataset1/tables/TableA" 6 | }, 7 | "audit_information": { 8 | "insert_id": "sb27clc2el", 9 | "job_time": "2020-09-09T15:25:57.088307939Z", 10 | "actuator": "runner@organization.com", 11 | "operation": { 12 | "type": "google.cloud.datacatalog.v1.DataCatalog.CreateTag", 13 | "resource": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk" 14 | } 15 | }, 16 | "tags": [ 17 | { 18 | "tag_id": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF", 19 | "template_id": "projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag", 20 | "template_name": "Testing Sample Catalog Tag", 21 | "column": "columnName", 22 | "fields": [ 23 | { 24 | "field_id": "primary_source", 25 | "field_name": "Primary Source", 26 | "kind": "BOOL", 27 | "bool_value": true 28 | }, 29 | { 30 | "field_id": "business_unit", 31 | "field_name": "Business Unit", 32 | "kind": "STRING", 33 | "string_value": "Professional Services" 34 | }, 35 | { 36 | "field_id": "updated_timestamp", 37 | "field_name": "Updated Timestamp", 38 | "kind": "TIMESTAMP", 39 | "timestamp_value": "2020-09-09T15:25:00Z" 40 | }, 41 | { 42 | "field_id": "data_category", 43 | "field_name": "Data Category", 44 | "kind": "ENUM", 45 | "enum_value": { 46 | 
"display_name": "PII" 47 | } 48 | }, 49 | { 50 | "field_id": "version_number", 51 | "field_name": "Tag Version", 52 | "kind": "DOUBLE", 53 | "double_value": 1.2 54 | } 55 | ] 56 | }, 57 | { 58 | "tag_id": "projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/TableAIdTag2Id", 59 | "template_id": "projects/my-project-id/locations/us-central1/tagTemplates/pii_tag", 60 | "template_name": "PII", 61 | "fields": [ 62 | { 63 | "field_id": "type", 64 | "field_name": "PII Type", 65 | "kind": "STRING", 66 | "string_value": "USER_ID" 67 | } 68 | ] 69 | } 70 | ] 71 | } 72 | -------------------------------------------------------------------------------- /code-of-conduct.md: -------------------------------------------------------------------------------- 1 | # Google Open Source Community Guidelines 2 | 3 | At Google, we recognize and celebrate the creativity and collaboration of open 4 | source contributors and the diversity of skills, experiences, cultures, and 5 | opinions they bring to the projects and communities they participate in. 6 | 7 | Every one of Google's open source projects and communities is an inclusive 8 | environment, based on treating all individuals respectfully, regardless of 9 | gender identity and expression, sexual orientation, disabilities, 10 | neurodiversity, physical appearance, body size, ethnicity, nationality, race, 11 | age, religion, or similar personal characteristic. 12 | 13 | We value diverse opinions, but we value respectful behavior more. 14 | 15 | Respectful behavior includes: 16 | 17 | * Being considerate, kind, constructive, and helpful. 18 | * Not engaging in demeaning, discriminatory, harassing, hateful, sexualized, or 19 | physically threatening behavior, speech, and imagery. 20 | * Not engaging in unwanted physical contact.
21 | 22 | Some Google open source projects [may adopt][] an explicit project code of 23 | conduct, which may have additional detailed expectations for participants. Most 24 | of those projects will use our [modified Contributor Covenant][]. 25 | 26 | [may adopt]: https://opensource.google/docs/releasing/preparing/#conduct 27 | [modified Contributor Covenant]: https://opensource.google/docs/releasing/template/CODE_OF_CONDUCT/ 28 | 29 | ## Resolve peacefully 30 | 31 | We do not believe that all conflict is necessarily bad; healthy debate and 32 | disagreement often yields positive results. However, it is never okay to be 33 | disrespectful. 34 | 35 | If you see someone behaving disrespectfully, you are encouraged to address the 36 | behavior directly with those involved. Many issues can be resolved quickly and 37 | easily, and this gives people more control over the outcome of their dispute. 38 | If you are unable to resolve the matter for any reason, or if the behavior is 39 | threatening or harassing, report it. We are dedicated to providing an 40 | environment where participants feel welcome and safe. 41 | 42 | ## Reporting problems 43 | 44 | Some Google open source projects may adopt a project-specific code of conduct. 45 | In those cases, a Google employee will be identified as the Project Steward, 46 | who will receive and handle reports of code of conduct violations. In the event 47 | that a project hasn’t identified a Project Steward, you can report problems by 48 | emailing opensource@google.com. 49 | 50 | We will investigate every complaint, but you may not receive a direct response. 51 | We will use our discretion in determining when and how to follow up on reported 52 | incidents, which may range from not taking action to permanent expulsion from 53 | the project and project-sponsored spaces. We will notify the accused of the 54 | report and provide them an opportunity to discuss it before any action is 55 | taken. 
The identity of the reporter will be omitted from the details of the 56 | report supplied to the accused. In potentially harmful situations, such as 57 | ongoing harassment or threats to anyone's safety, we may take action without 58 | notice. 59 | 60 | *This document was adapted from the [IndieWeb Code of Conduct][] and can also 61 | be found at .* 62 | 63 | [IndieWeb Code of Conduct]: https://indieweb.org/code-of-conduct 64 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/ProtoToTableRowMapper.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import com.fasterxml.jackson.databind.ObjectMapper; 20 | import com.fasterxml.jackson.databind.ObjectReader; 21 | import com.google.api.services.bigquery.model.TableRow; 22 | import com.google.common.flogger.GoogleLogger; 23 | import com.google.protobuf.Message; 24 | import com.google.protobuf.util.JsonFormat; 25 | import java.io.IOException; 26 | import org.apache.beam.sdk.transforms.SerializableFunction; 27 | 28 | /** 29 | * Adaptor to convert a Proto message into a BigQuery TableRow by parsing the proto as a JSON 30 | * string. 
It uses the {@code snakeCase} option to define the field naming convention to use for 31 | * converting a Proto into JSON. 32 | */ 33 | public final class ProtoToTableRowMapper<T extends Message> 34 | implements SerializableFunction<T, TableRow> { 35 | 36 | private static final GoogleLogger logger = GoogleLogger.forEnclosingClass(); 37 | 38 | private final boolean snakeCase; 39 | private final ObjectReader tableRowJsonReader; 40 | 41 | /** 42 | * Constructs an Adaptor instance with a given field naming convention. 43 | * 44 | * @param snakeCase set {@code true} to output {@code snake_case} formatted Column names in 45 | * TableRow, if set as {@code false} outputs {@code lowerCamelCase} formatted 46 | * column names. 47 | */ 48 | private ProtoToTableRowMapper(boolean snakeCase) { 49 | this.snakeCase = snakeCase; 50 | this.tableRowJsonReader = new ObjectMapper().readerFor(TableRow.class); 51 | } 52 | 53 | /** 54 | * Returns a snake_case format based instance. 55 | */ 56 | public static <T extends Message> ProtoToTableRowMapper<T> withSnakeCase() { 57 | return new ProtoToTableRowMapper<>(true); 58 | } 59 | 60 | /** 61 | * Returns a lowerCamelCase format based instance. 62 | */ 63 | public static <T extends Message> ProtoToTableRowMapper<T> withCamelCase() { 64 | return new ProtoToTableRowMapper<>(false); 65 | } 66 | 67 | private JsonFormat.Printer protoJsonPrinter() { 68 | JsonFormat.Printer jsonPrinter = JsonFormat.printer().omittingInsignificantWhitespace(); 69 | return (snakeCase) ?
jsonPrinter.preservingProtoFieldNames() : jsonPrinter; 70 | } 71 | 72 | @Override 73 | public TableRow apply(T record) { 74 | try { 75 | return tableRowJsonReader.readValue(protoJsonPrinter().print(record)); 76 | } catch (IOException ioException) { 77 | logger.atSevere().withCause(ioException).log( 78 | "Unable to convert record to TableRow:%n%s", record); 79 | } 80 | 81 | return new TableRow(); 82 | } 83 | } 84 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/datacatalog/FakeDataCatalogPagesListTagsResponse.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License.
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes.datacatalog; 18 | 19 | import com.google.api.core.ApiFuture; 20 | import com.google.api.gax.rpc.ApiCallContext; 21 | import com.google.api.gax.rpc.PageContext; 22 | import com.google.api.gax.rpc.UnaryCallable; 23 | import com.google.cloud.datacatalog.v1beta1.DataCatalogClient.ListTagsPagedResponse; 24 | import com.google.cloud.datacatalog.v1beta1.ListTagsRequest; 25 | import com.google.cloud.datacatalog.v1beta1.ListTagsResponse; 26 | import com.google.cloud.datacatalog.v1beta1.Tag; 27 | import com.google.cloud.solutions.catalogtagrecording.testing.fakes.FakeApiCallContext; 28 | import com.google.cloud.solutions.catalogtagrecording.testing.fakes.FakeApiFutureBase; 29 | import com.google.common.collect.ImmutableCollection; 30 | import com.google.common.collect.ImmutableMap; 31 | import java.util.concurrent.ExecutionException; 32 | 33 | public class FakeDataCatalogPagesListTagsResponse extends FakeApiFutureBase<ListTagsPagedResponse> { 34 | 35 | private final ListTagsRequest request; 36 | private final ImmutableCollection<Tag> tags; 37 | private final ApiCallContext callContext; 38 | 39 | public FakeDataCatalogPagesListTagsResponse( 40 | ListTagsRequest request, ApiCallContext callContext, ImmutableCollection<Tag> tags) { 41 | this.request = request; 42 | this.tags = tags; 43 | this.callContext = buildCallContext(callContext); 44 | } 45 | 46 | private static ApiCallContext buildCallContext(ApiCallContext context) { 47 | 48 | if (context == null) { 49 | return FakeApiCallContext.builder() 50 | .setTracer(new FakeNoOpApiTracer()) 51 | .setExtraHeaders(ImmutableMap.of()) 52 | .build(); 53 | } 54 | 55 | return context; 56 | } 57 | 58 | @Override 59 | public ListTagsPagedResponse get() throws ExecutionException, InterruptedException { 60 | return ListTagsPagedResponse.createAsync( 61 | buildPageContext(), new FakeDataCatalogListTagsApiResponse(tags)) 62 | .get(); 63 | } 64 | 65 | private PageContext<ListTagsRequest, ListTagsResponse, Tag> buildPageContext() {
66 | return PageContext.create( 67 | new UnaryCallable<ListTagsRequest, ListTagsResponse>() { 68 | @Override 69 | public ApiFuture<ListTagsResponse> futureCall( 70 | ListTagsRequest listTagsRequest, ApiCallContext apiCallContext) { 71 | return new FakeDataCatalogListTagsApiResponse(tags); 72 | } 73 | }, 74 | new FakeListTagsRequestListTagsResponseTagPagedListDescriptor(), 75 | request, 76 | callContext); 77 | } 78 | } 79 | -------------------------------------------------------------------------------- /camelEntityTagOperationRecords.schema: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "name": "reconcileTime", 4 | "type": "TIMESTAMP", 5 | "mode": "NULLABLE", 6 | "description": "Time in UTC this log was processed" 7 | }, 8 | { 9 | "name": "entity", 10 | "type": "RECORD", 11 | "mode": "NULLABLE", 12 | "fields": [ 13 | { 14 | "name": "entityId", 15 | "type": "STRING", 16 | "mode": "NULLABLE", 17 | "description": "The Data-Catalog's unique Id for the data entity. " 18 | }, 19 | { 20 | "name": "linkedResource", 21 | "type": "STRING", 22 | "mode": "NULLABLE" 23 | }, 24 | { 25 | "name": "sqlResource", 26 | "type": "STRING", 27 | "mode": "NULLABLE" 28 | } 29 | ] 30 | }, 31 | { 32 | "name": "auditInformation", 33 | "type": "RECORD", 34 | "mode": "NULLABLE", 35 | "fields": [ 36 | { 37 | "name": "insertId", 38 | "type": "STRING", 39 | "mode": "NULLABLE", 40 | "description": "Associated AuditLog entry's insertId to uniquely identify the operation" 41 | }, 42 | { 43 | "name": "jobTime", 44 | "type": "TIMESTAMP", 45 | "mode": "NULLABLE", 46 | "description": "Time in UTC the catalog operation completed." 47 | }, 48 | { 49 | "name": "actuator", 50 | "type": "STRING", 51 | "mode": "NULLABLE", 52 | "description": "Email address of the authorized user."
53 | }, 54 | { 55 | "name": "operation", 56 | "type": "RECORD", 57 | "mode": "NULLABLE", 58 | "fields": [ 59 | { 60 | "name": "type", 61 | "type": "STRING", 62 | "mode": "NULLABLE", 63 | "description": "Data Catalog operation type" 64 | }, 65 | { 66 | "name": "resource", 67 | "type": "STRING", 68 | "mode": "NULLABLE", 69 | "description": "Data Catalog operation's target resource." 70 | } 71 | ] 72 | } 73 | ] 74 | }, 75 | { 76 | "name": "tags", 77 | "type": "RECORD", 78 | "mode": "REPEATED", 79 | "fields": [ 80 | { 81 | "name": "tagId", 82 | "type": "STRING", 83 | "mode": "NULLABLE" 84 | }, 85 | { 86 | "name": "templateId", 87 | "type": "STRING", 88 | "mode": "NULLABLE", 89 | "description": "Data Catalog Tag Template Id" 90 | }, 91 | { 92 | "name": "templateName", 93 | "type": "STRING", 94 | "mode": "NULLABLE" 95 | }, 96 | { 97 | "name": "column", 98 | "type": "STRING", 99 | "mode": "NULLABLE" 100 | }, 101 | { 102 | "name": "fields", 103 | "type": "RECORD", 104 | "mode": "REPEATED", 105 | "fields": [ 106 | { 107 | "name": "fieldId", 108 | "type": "STRING", 109 | "mode": "NULLABLE" 110 | }, 111 | { 112 | "name": "fieldName", 113 | "type": "STRING", 114 | "mode": "NULLABLE" 115 | }, 116 | { 117 | "name": "kind", 118 | "type": "STRING", 119 | "mode": "NULLABLE" 120 | }, 121 | { 122 | "name": "boolValue", 123 | "type": "BOOLEAN", 124 | "mode": "NULLABLE" 125 | }, 126 | { 127 | "name": "doubleValue", 128 | "type": "FLOAT", 129 | "mode": "NULLABLE" 130 | }, 131 | { 132 | "name": "stringValue", 133 | "type": "STRING", 134 | "mode": "NULLABLE" 135 | }, 136 | { 137 | "name": "timestampValue", 138 | "type": "TIMESTAMP", 139 | "mode": "NULLABLE" 140 | }, 141 | { 142 | "name": "enumValue", 143 | "type": "RECORD", 144 | "mode": "NULLABLE", 145 | "fields": [ 146 | { 147 | "name": "displayName", 148 | "type": "STRING", 149 | "mode": "NULLABLE" 150 | } 151 | ] 152 | } 153 | ] 154 | } 155 | ] 156 | } 157 | ] 
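The Dataflow pipeline writes to this table with `CreateDisposition.CREATE_NEVER`, so the destination table must already exist, day-partitioned on the reconcile-time column, before the job starts. A minimal provisioning sketch using the `bq` CLI — the project, dataset, and table names are placeholders, not names from this repository:

```shell
# Create the day-partitioned destination table from the camelCase schema file.
# my-project, tag_history, and EntityTagOperationRecords are placeholder names.
bq mk --table \
  --time_partitioning_type=DAY \
  --time_partitioning_field=reconcileTime \
  my-project:tag_history.EntityTagOperationRecords \
  camelEntityTagOperationRecords.schema
```

If the pipeline is configured for snake_case column names, use the `snakeEntityTagOperationRecords.schema` file below and partition on `reconcile_time` instead.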
-------------------------------------------------------------------------------- /snakeEntityTagOperationRecords.schema: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "name": "reconcile_time", 4 | "type": "TIMESTAMP", 5 | "mode": "NULLABLE", 6 | "description": "Time in UTC this log was processed" 7 | }, 8 | { 9 | "name": "entity", 10 | "type": "RECORD", 11 | "mode": "NULLABLE", 12 | "fields": [ 13 | { 14 | "name": "entity_id", 15 | "type": "STRING", 16 | "mode": "NULLABLE", 17 | "description": "The Data-Catalog's unique Id for the data entity. " 18 | }, 19 | { 20 | "name": "linked_resource", 21 | "type": "STRING", 22 | "mode": "NULLABLE" 23 | }, 24 | { 25 | "name": "sql_resource", 26 | "type": "STRING", 27 | "mode": "NULLABLE" 28 | } 29 | ] 30 | }, 31 | { 32 | "name": "audit_information", 33 | "type": "RECORD", 34 | "mode": "NULLABLE", 35 | "fields": [ 36 | { 37 | "name": "insert_id", 38 | "type": "STRING", 39 | "mode": "NULLABLE", 40 | "description": "Associated AuditLog entry's insertId to uniquely identify the operation" 41 | }, 42 | { 43 | "name": "job_time", 44 | "type": "TIMESTAMP", 45 | "mode": "NULLABLE", 46 | "description": "Time in UTC the catalog operation completed." 47 | }, 48 | { 49 | "name": "actuator", 50 | "type": "STRING", 51 | "mode": "NULLABLE", 52 | "description": "Email address of the authorized user." 53 | }, 54 | { 55 | "name": "operation", 56 | "type": "RECORD", 57 | "mode": "NULLABLE", 58 | "fields": [ 59 | { 60 | "name": "type", 61 | "type": "STRING", 62 | "mode": "NULLABLE", 63 | "description": "Data Catalog operation type" 64 | }, 65 | { 66 | "name": "resource", 67 | "type": "STRING", 68 | "mode": "NULLABLE", 69 | "description": "Data Catalog operation's target resource." 
70 | } 71 | ] 72 | } 73 | ] 74 | }, 75 | { 76 | "name": "tags", 77 | "type": "RECORD", 78 | "mode": "REPEATED", 79 | "fields": [ 80 | { 81 | "name": "tag_id", 82 | "type": "STRING", 83 | "mode": "NULLABLE" 84 | }, 85 | { 86 | "name": "template_id", 87 | "type": "STRING", 88 | "mode": "NULLABLE", 89 | "description": "Data Catalog Tag Template Id" 90 | }, 91 | { 92 | "name": "template_name", 93 | "type": "STRING", 94 | "mode": "NULLABLE" 95 | }, 96 | { 97 | "name": "column", 98 | "type": "STRING", 99 | "mode": "NULLABLE" 100 | }, 101 | { 102 | "name": "fields", 103 | "type": "RECORD", 104 | "mode": "REPEATED", 105 | "fields": [ 106 | { 107 | "name": "field_id", 108 | "type": "STRING", 109 | "mode": "NULLABLE" 110 | }, 111 | { 112 | "name": "field_name", 113 | "type": "STRING", 114 | "mode": "NULLABLE" 115 | }, 116 | { 117 | "name": "kind", 118 | "type": "STRING", 119 | "mode": "NULLABLE" 120 | }, 121 | { 122 | "name": "bool_value", 123 | "type": "BOOLEAN", 124 | "mode": "NULLABLE" 125 | }, 126 | { 127 | "name": "double_value", 128 | "type": "FLOAT", 129 | "mode": "NULLABLE" 130 | }, 131 | { 132 | "name": "string_value", 133 | "type": "STRING", 134 | "mode": "NULLABLE" 135 | }, 136 | { 137 | "name": "timestamp_value", 138 | "type": "TIMESTAMP", 139 | "mode": "NULLABLE" 140 | }, 141 | { 142 | "name": "enum_value", 143 | "type": "RECORD", 144 | "mode": "NULLABLE", 145 | "fields": [ 146 | { 147 | "name": "display_name", 148 | "type": "STRING", 149 | "mode": "NULLABLE" 150 | } 151 | ] 152 | } 153 | ] 154 | } 155 | ] 156 | } 157 | ] -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/datacatalog/FakeNoOpApiTracer.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 
3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes.datacatalog; 18 | 19 | import com.google.api.gax.tracing.ApiTracer; 20 | import org.threeten.bp.Duration; 21 | 22 | public class FakeNoOpApiTracer implements ApiTracer { 23 | 24 | @Override 25 | public Scope inScope() { 26 | return null; 27 | } 28 | 29 | @Override 30 | public void operationSucceeded() { 31 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 32 | // operation. 33 | } 34 | 35 | @Override 36 | public void operationCancelled() { 37 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 38 | // operation. 39 | } 40 | 41 | @Override 42 | public void operationFailed(Throwable throwable) { 43 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 44 | // operation. 45 | } 46 | 47 | @Override 48 | public void connectionSelected(String s) { 49 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 50 | // operation. 51 | } 52 | 53 | @Override 54 | public void attemptStarted(int i) { 55 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 56 | // operation. 57 | } 58 | 59 | @Override 60 | public void attemptSucceeded() { 61 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 62 | // operation. 
63 | } 64 | 65 | @Override 66 | public void attemptCancelled() { 67 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 68 | // operation. 69 | } 70 | 71 | @Override 72 | public void attemptFailed(Throwable throwable, Duration duration) { 73 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 74 | // operation. 75 | } 76 | 77 | @Override 78 | public void attemptFailedRetriesExhausted(Throwable throwable) { 79 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 80 | // operation. 81 | } 82 | 83 | @Override 84 | public void attemptPermanentFailure(Throwable throwable) { 85 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 86 | // operation. 87 | } 88 | 89 | @Override 90 | public void lroStartFailed(Throwable throwable) { 91 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 92 | // operation. 93 | } 94 | 95 | @Override 96 | public void lroStartSucceeded() { 97 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 98 | // operation. 99 | } 100 | 101 | @Override 102 | public void responseReceived() { 103 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 104 | // operation. 105 | } 106 | 107 | @Override 108 | public void requestSent() { 109 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 110 | // operation. 111 | } 112 | 113 | @Override 114 | public void batchRequestSent(long l, long l1) { 115 | // Do nothing because this is a Fake and doesn't implement an actual gRPC 116 | // operation. 117 | } 118 | } 119 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/FakeApiCallContext.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 
3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes; 18 | 19 | import com.google.api.gax.rpc.ApiCallContext; 20 | import com.google.api.gax.rpc.TransportChannel; 21 | import com.google.api.gax.tracing.ApiTracer; 22 | import com.google.auth.Credentials; 23 | import com.google.auto.value.AutoValue; 24 | import java.util.List; 25 | import java.util.Map; 26 | import javax.annotation.Nonnull; 27 | import javax.annotation.Nullable; 28 | import org.threeten.bp.Duration; 29 | 30 | @AutoValue 31 | public abstract class FakeApiCallContext implements ApiCallContext { 32 | 33 | @Nullable 34 | public abstract Credentials getCredentials(); 35 | 36 | @Override 37 | public ApiCallContext withCredentials(Credentials credentials) { 38 | return this.toBuilder().setCredentials(credentials).build(); 39 | } 40 | 41 | @Nullable 42 | public abstract TransportChannel getTransportChannel(); 43 | 44 | @Override 45 | public ApiCallContext withTransportChannel(TransportChannel transportChannel) { 46 | return this.toBuilder().setTransportChannel(transportChannel).build(); 47 | } 48 | 49 | @Override 50 | public ApiCallContext withTimeout(@Nullable Duration duration) { 51 | return this.toBuilder().setTimeout(duration).build(); 52 | } 53 | 54 | @Override 55 | public ApiCallContext withStreamWaitTimeout(@Nullable Duration duration) { 56 | return 
this.toBuilder().setStreamWaitTimeout(duration).build(); 57 | } 58 | 59 | @Override 60 | public ApiCallContext withStreamIdleTimeout(@Nullable Duration duration) { 61 | return this.toBuilder().setStreamIdleTimeout(duration).build(); 62 | } 63 | 64 | @Override 65 | public ApiCallContext withTracer(@Nonnull ApiTracer apiTracer) { 66 | return this.toBuilder().setTracer(apiTracer).build(); 67 | } 68 | 69 | @Override 70 | public ApiCallContext nullToSelf(ApiCallContext apiCallContext) { 71 | return this; 72 | } 73 | 74 | @Override 75 | public ApiCallContext merge(ApiCallContext apiCallContext) { 76 | return this; 77 | } 78 | 79 | @Override 80 | public ApiCallContext withExtraHeaders(Map<String, List<String>> map) { 81 | return this.toBuilder().setExtraHeaders(map).build(); 82 | } 83 | 84 | public static Builder builder() { 85 | return new AutoValue_FakeApiCallContext.Builder(); 86 | } 87 | 88 | public abstract Builder toBuilder(); 89 | 90 | @AutoValue.Builder 91 | public abstract static class Builder { 92 | 93 | public abstract Builder setTransportChannel(@Nullable TransportChannel transportChannel); 94 | 95 | public abstract Builder setCredentials(@Nullable Credentials newCredentials); 96 | 97 | public abstract Builder setTracer(ApiTracer newTracer); 98 | 99 | public abstract Builder setTimeout(@Nullable Duration newTimeout); 100 | 101 | public abstract Builder setStreamWaitTimeout(@Nullable Duration newStreamWaitTimeout); 102 | 103 | public abstract Builder setStreamIdleTimeout(@Nullable Duration newStreamIdleTimeout); 104 | 105 | public abstract Builder setExtraHeaders(Map<String, List<String>> newExtraHeaders); 106 | 107 | public abstract FakeApiCallContext build(); 108 | } 109 | } 110 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/CatalogTagRecordingPipeline.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors.
3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import com.google.api.services.bigquery.model.TableRow; 20 | import com.google.api.services.bigquery.model.TimePartitioning; 21 | import com.google.auto.value.AutoValue; 22 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.EntityTagOperationRecord; 23 | import com.google.protobuf.Descriptors; 24 | import org.apache.beam.sdk.Pipeline; 25 | import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO; 26 | import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition; 27 | import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition; 28 | import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO; 29 | import org.apache.beam.sdk.transforms.SerializableFunction; 30 | 31 | /** 32 | * Defines a Dataflow pipeline DAG that reads Data Catalog AuditLogs from PubSub, extracts the 33 | * Tags currently attached to the Entry, and writes them into a BigQuery Table.
34 | */ 35 | @AutoValue 36 | public abstract class CatalogTagRecordingPipeline { 37 | 38 | private static final Descriptors.FieldDescriptor RECONCILE_TIME_FIELD = 39 | EntityTagOperationRecord.getDescriptor().findFieldByName("reconcile_time"); 40 | 41 | abstract TagRecordingPipelineOptions options(); 42 | 43 | abstract Pipeline pipeline(); 44 | 45 | public final void run() { 46 | setupPipeline(); 47 | pipeline().run(); 48 | } 49 | 50 | private void setupPipeline() { 51 | pipeline() 52 | .apply( 53 | "ReadAuditLogs", 54 | PubsubIO.readStrings().fromSubscription(options().getCatalogAuditLogsSubscription())) 55 | .apply("ParseLogReadCatalogTags", EntityTagOperationRecordExtractor.defaultExtractor()) 56 | .apply( 57 | "WriteToBigQuery", 58 | BigQueryIO.<EntityTagOperationRecord>write() 59 | .to(options().getTagsBigqueryTable()) 60 | .withTimePartitioning(dayPartitionOnReconcileTime()) 61 | .withFormatFunction(recordMapper()) 62 | .withCreateDisposition(CreateDisposition.CREATE_NEVER) 63 | .withWriteDisposition(WriteDisposition.WRITE_APPEND) 64 | .optimizedWrites()); 65 | } 66 | 67 | /** 68 | * Returns a mapping function to convert the proto message into a TableRow structure for writing 69 | * in BigQuery. 70 | */ 71 | private SerializableFunction<EntityTagOperationRecord, TableRow> recordMapper() { 72 | return options().isSnakeCaseColumnNames() 73 | ? ProtoToTableRowMapper.withSnakeCase() 74 | : ProtoToTableRowMapper.withCamelCase(); 75 | } 76 | 77 | /** 78 | * Returns a Date-partitioning configuration for the BigQuery output table. 79 | */ 80 | private TimePartitioning dayPartitionOnReconcileTime() { 81 | return new TimePartitioning() 82 | .setType("DAY") 83 | .setField( 84 | options().isSnakeCaseColumnNames() 85 | ?
RECONCILE_TIME_FIELD.getName() 86 | : RECONCILE_TIME_FIELD.getJsonName()); 87 | } 88 | 89 | public static Builder builder() { 90 | return new AutoValue_CatalogTagRecordingPipeline.Builder(); 91 | } 92 | 93 | @AutoValue.Builder 94 | public abstract static class Builder { 95 | 96 | public abstract Builder options(TagRecordingPipelineOptions options); 97 | 98 | public abstract Builder pipeline(Pipeline pipeline); 99 | 100 | public abstract CatalogTagRecordingPipeline build(); 101 | } 102 | } 103 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/DataCatalogService.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.common.base.Preconditions.checkArgument; 20 | import static com.google.common.base.Preconditions.checkNotNull; 21 | import static org.apache.commons.lang3.StringUtils.isNotBlank; 22 | 23 | import com.google.cloud.datacatalog.v1beta1.DataCatalogClient; 24 | import com.google.cloud.datacatalog.v1beta1.Entry; 25 | import com.google.cloud.datacatalog.v1beta1.stub.DataCatalogStub; 26 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogEntry; 27 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogTag; 28 | import com.google.common.collect.ImmutableSet; 29 | import java.io.IOException; 30 | import java.util.Optional; 31 | import java.util.stream.StreamSupport; 32 | 33 | /** 34 | * DataCatalog API wrapper to look up an Entry using its Id and other attributes. 35 | */ 36 | public class DataCatalogService implements AutoCloseable { 37 | private final DataCatalogClient dataCatalogClient; 38 | 39 | private DataCatalogService(DataCatalogClient dataCatalogClient) { 40 | this.dataCatalogClient = checkNotNull(dataCatalogClient); 41 | } 42 | 43 | /** Convenience Factory for building an instance of the Data Catalog service using the provided Client. */ 44 | public static DataCatalogService using(DataCatalogClient catalogClient) { 45 | return new DataCatalogService(catalogClient); 46 | } 47 | 48 | /** 49 | * Convenience Factory for building an instance of the Data Catalog service using the provided Stub. 50 | * Creates using a standard client if stub is null. 51 | */ 52 | public static DataCatalogService usingStub(DataCatalogStub catalogStub) throws IOException { 53 | return using( 54 | (catalogStub == null) ? DataCatalogClient.create() : DataCatalogClient.create(catalogStub)); 55 | } 56 | 57 | /** 58 | * Returns Entry with LinkedResource field populated from DataCatalog.
59 | */ 60 | public CatalogEntry enrichCatalogEntry(CatalogEntry entity) { 61 | 62 | return lookupEntry(entity) 63 | .map( 64 | entry -> 65 | CatalogEntry.newBuilder() 66 | .setEntityId(entry.getName()) 67 | .setLinkedResource(entry.getLinkedResource()) 68 | .build()) 69 | .orElse(entity); 70 | } 71 | 72 | /** Returns an entry object by looking up the EntryId through the DataCatalog API. */ 73 | public Optional<Entry> lookupEntry(CatalogEntry entity) { 74 | checkNotNull(entity, "Entity can't be null"); 75 | checkArgument( 76 | isNotBlank(entity.getEntityId()), "Entity Id should be a valid DataCatalog entityId"); 77 | 78 | return Optional.ofNullable(dataCatalogClient.getEntry(entity.getEntityId())); 79 | } 80 | 81 | /** Returns a BigQuery compatible representation of Tags attached to an entry. */ 82 | public ImmutableSet<CatalogTag> lookUpAllTags(String entryId) { 83 | ImmutableSet.Builder<CatalogTag> entryTagsBuilder = ImmutableSet.builder(); 84 | 85 | dataCatalogClient 86 | .listTags(entryId) 87 | .iteratePages() 88 | .forEach( 89 | listTagsPage -> 90 | StreamSupport.stream(listTagsPage.getValues().spliterator(), /*parallel=*/ false) 91 | .map(CatalogTagConverter::toCatalogTag) 92 | .forEach(entryTagsBuilder::add)); 93 | 94 | return entryTagsBuilder.build(); 95 | } 96 | 97 | @Override 98 | public void close() { 99 | dataCatalogClient.close(); 100 | } 101 | } 102 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/JsonMessageParser.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License.
6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.common.base.MoreObjects.firstNonNull; 20 | import static com.google.common.base.Preconditions.checkArgument; 21 | import static org.apache.commons.lang3.StringUtils.isNotBlank; 22 | 23 | import com.google.auto.value.AutoValue; 24 | import com.google.common.flogger.GoogleLogger; 25 | import com.jayway.jsonpath.DocumentContext; 26 | import com.jayway.jsonpath.JsonPath; 27 | import com.jayway.jsonpath.PathNotFoundException; 28 | import java.util.concurrent.TimeUnit; 29 | 30 | /** 31 | * A Parser for JSON using {@link JsonPath} library. 32 | * 33 | *

<p>Provides services to read attributes of a structured Message. 34 | */ 35 | @AutoValue 36 | public abstract class JsonMessageParser { 37 | 38 | private static final GoogleLogger logger = GoogleLogger.forEnclosingClass(); 39 | 40 | abstract DocumentContext getParsedMessage(); 41 | 42 | public abstract String getRootPath(); 43 | 44 | /** 45 | * Returns the JSON message used by this parser instance. 46 | */ 47 | public String getJson() { 48 | return getParsedMessage().jsonString(); 49 | } 50 | 51 | /** 52 | * Returns a JSON parser for a subnode of the present Object. 53 | */ 54 | public JsonMessageParser forSubNode(String subNodeKey) { 55 | return JsonMessageParser.builder() 56 | .setParsedMessage(getParsedMessage()) 57 | .setRootPath(buildPath(subNodeKey)) 58 | .build(); 59 | } 60 | 61 | public static JsonMessageParser of(String messageJson) { 62 | return builder().setMessageJson(messageJson).build(); 63 | } 64 | 65 | /** 66 | * Returns a POJO read at the provided path or null if not found. 67 | */ 68 | public <T> T read(String jsonPath) { 69 | try { 70 | return getParsedMessage().read(buildPath(jsonPath)); 71 | } catch (PathNotFoundException | NullPointerException exception) { 72 | logger.atInfo().withCause(exception).atMostEvery(1, TimeUnit.MINUTES).log( 73 | "error reading [%s]", jsonPath); 74 | return null; 75 | } 76 | } 77 | 78 | /** Returns a POJO read at the provided path or the provided default value if target is empty. */ 79 | public <T> T readOrDefault(String jsonPath, T defaultValue) { 80 | return firstNonNull(read(jsonPath), defaultValue); 81 | } 82 | 83 | private String buildPath(String path) { 84 | checkArgument(path.startsWith("$.")); 85 | return getRootPath() + "."
+ path.replaceFirst("^\\$\\.", ""); 86 | } 87 | 88 | public static Builder builder() { 89 | return new AutoValue_JsonMessageParser.Builder().setRootPath("$"); 90 | } 91 | 92 | @AutoValue.Builder 93 | public abstract static class Builder { 94 | 95 | public Builder setMessageJson(String messageJson) { 96 | checkArgument( 97 | isNotBlank(messageJson), "JSON can't be null or empty (was: \"%s\")", messageJson); 98 | return setParsedMessage(JsonPath.parse(messageJson)); 99 | } 100 | 101 | public abstract Builder setParsedMessage(DocumentContext newParsedMessage); 102 | 103 | public abstract Builder setRootPath(String newRootPath); 104 | 105 | abstract JsonMessageParser autoBuild(); 106 | 107 | public JsonMessageParser build() { 108 | JsonMessageParser parser = autoBuild(); 109 | checkArgument( 110 | isNotBlank(parser.getRootPath()), 111 | "RootPath can't be null or empty (was: \"%s\") default root is $", 112 | parser.getRootPath()); 113 | 114 | return parser; 115 | } 116 | } 117 | } 118 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/TagRecordingPipelineOptionsTest.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.common.truth.Truth.assertThat; 20 | 21 | import com.google.common.base.Splitter; 22 | import org.apache.beam.sdk.options.PipelineOptionsFactory; 23 | import org.junit.Assert; 24 | import org.junit.Test; 25 | import org.junit.runner.RunWith; 26 | import org.junit.runners.JUnit4; 27 | 28 | @RunWith(JUnit4.class) 29 | public final class TagRecordingPipelineOptionsTest { 30 | 31 | @Test 32 | public void getCatalogAuditLogsSubscription_flagAbsent_throwsException() { 33 | IllegalArgumentException iaex = 34 | Assert.assertThrows( 35 | IllegalArgumentException.class, 36 | () -> optionsFromArgString("--tagsBigqueryTable=project-id:dataset.table")); 37 | 38 | assertThat(iaex).hasMessageThat().contains("--catalogAuditLogsSubscription"); 39 | } 40 | 41 | @Test 42 | public void getCatalogAuditLogsSubscription_valid() { 43 | assertThat( 44 | optionsFromArgString( 45 | "--catalogAuditLogsSubscription=projects/my-subscription " 46 | + "--tagsBigqueryTable=project-id:dataset.table") 47 | .getCatalogAuditLogsSubscription()) 48 | .isEqualTo("projects/my-subscription"); 49 | } 50 | 51 | @Test 52 | public void getTagsBigqueryTable_flagAbsent_throwsException() { 53 | IllegalArgumentException iaex = 54 | Assert.assertThrows( 55 | IllegalArgumentException.class, 56 | () -> optionsFromArgString("--catalogAuditLogsSubscription=projects/my-subscription")); 57 | 58 | assertThat(iaex).hasMessageThat().contains("--tagsBigqueryTable"); 59 | } 60 | 61 | @Test 62 | public void getTagsBigqueryTable_valid() { 63 | assertThat( 64 | optionsFromArgString( 65 | "--catalogAuditLogsSubscription=projects/my-subscription " 66 | + "--tagsBigqueryTable=project-id:dataset.table") 67 | .getTagsBigqueryTable()) 68 | .isEqualTo("project-id:dataset.table"); 69 | } 70 | 71 | private static TagRecordingPipelineOptions optionsFromArgString(String argsString) { 72 | return PipelineOptionsFactory.fromArgs( 73 | 
Splitter.on(' ').trimResults().splitToList(argsString).toArray(new String[0])) 74 | .withValidation() 75 | .as(TagRecordingPipelineOptions.class); 76 | } 77 | 78 | @Test 79 | public void isSnakeCase_flagAbsent_false() { 80 | assertThat( 81 | optionsFromArgString( 82 | "--catalogAuditLogsSubscription=projects/my-subscription " 83 | + "--tagsBigqueryTable=project-id:dataset.table") 84 | .isSnakeCaseColumnNames()) 85 | .isFalse(); 86 | } 87 | 88 | @Test 89 | public void isSnakeCase_flagFalse_false() { 90 | assertThat( 91 | optionsFromArgString( 92 | "--catalogAuditLogsSubscription=projects/my-subscription " 93 | + "--tagsBigqueryTable=project-id:dataset.table " 94 | + "--snakeCaseColumnNames=false") 95 | .isSnakeCaseColumnNames()) 96 | .isFalse(); 97 | } 98 | 99 | @Test 100 | public void isSnakeCase_flagPresent_true() { 101 | assertThat( 102 | optionsFromArgString( 103 | "--catalogAuditLogsSubscription=projects/my-subscription " 104 | + "--tagsBigqueryTable=project-id:dataset.table " 105 | + "--snakeCaseColumnNames") 106 | .isSnakeCaseColumnNames()) 107 | .isTrue(); 108 | } 109 | 110 | @Test 111 | public void isSnakeCase_flagTrue_true() { 112 | assertThat( 113 | optionsFromArgString( 114 | "--catalogAuditLogsSubscription=projects/my-subscription " 115 | + "--tagsBigqueryTable=project-id:dataset.table " 116 | + "--snakeCaseColumnNames=true") 117 | .isSnakeCaseColumnNames()) 118 | .isTrue(); 119 | } 120 | } 121 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/CatalogTagConverter.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 
6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.common.collect.ImmutableSet.toImmutableSet; 20 | 21 | import com.google.cloud.datacatalog.v1beta1.Tag; 22 | import com.google.cloud.datacatalog.v1beta1.TagField; 23 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogTag; 24 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogTagField; 25 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogTagField.CatalogTagFieldKinds; 26 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogTagField.EnumValue; 27 | import com.google.common.collect.ImmutableMap; 28 | import java.util.Map; 29 | import java.util.Optional; 30 | 31 | /** 32 | * Adapter to convert a Data Catalog {@link Tag} into the {@link CatalogTag} message defined in 33 | * {@code proto/tag_recording_messages.proto}. The transformation is required to store a 34 | * dynamic key-value pair as a repeated record in a BigQuery table. 35 | */ 36 | public final class CatalogTagConverter { 37 | 38 | /** 39 | * Converts the key-value pairs of the Tag fields into a repeated record for storing in BigQuery. 40 | * 41 | * @param tag the Data Catalog representation of an Entry's Tag. 42 | * @return a BigQuery representation of the same Tag as a flattened repeated record. 
43 | */ 44 | public static CatalogTag toCatalogTag(Tag tag) { 45 | CatalogTag.Builder builder = 46 | CatalogTag.newBuilder() 47 | .setTagId(tag.getName()) 48 | .setTemplateId(tag.getTemplate()) 49 | .setTemplateName(tag.getTemplateDisplayName()) 50 | .addAllFields( 51 | Optional.ofNullable(tag.getFieldsMap()) 52 | .orElseGet(ImmutableMap::of) 53 | .entrySet() 54 | .stream() 55 | .map(CatalogTagFieldConverter::convert) 56 | .collect(toImmutableSet())); 57 | 58 | if (tag.getColumn() != null) { 59 | builder.setColumn(tag.getColumn()); 60 | } 61 | 62 | return builder.build(); 63 | } 64 | 65 | /** 66 | * Helper class to transform Data-type specific TagFields. 67 | */ 68 | private static class CatalogTagFieldConverter { 69 | 70 | static CatalogTagField convert(Map.Entry<String, TagField> entry) { 71 | return convert(entry.getKey(), entry.getValue()); 72 | } 73 | 74 | static CatalogTagField convert(String fieldId, TagField tagField) { 75 | CatalogTagField.Builder builder = 76 | CatalogTagField.newBuilder().setFieldId(fieldId).setFieldName(tagField.getDisplayName()); 77 | 78 | switch (tagField.getKindCase()) { 79 | case BOOL_VALUE: 80 | builder.setBoolValue(tagField.getBoolValue()); 81 | builder.setKind(CatalogTagFieldKinds.BOOL); 82 | break; 83 | 84 | case DOUBLE_VALUE: 85 | builder.setDoubleValue(tagField.getDoubleValue()); 86 | builder.setKind(CatalogTagFieldKinds.DOUBLE); 87 | break; 88 | 89 | case STRING_VALUE: 90 | builder.setStringValue(tagField.getStringValue()); 91 | builder.setKind(CatalogTagFieldKinds.STRING); 92 | break; 93 | 94 | case TIMESTAMP_VALUE: 95 | builder.setTimestampValue(tagField.getTimestampValue()); 96 | builder.setKind(CatalogTagFieldKinds.TIMESTAMP); 97 | break; 98 | 99 | case ENUM_VALUE: 100 | builder.setEnumValue( 101 | EnumValue.newBuilder() 102 | .setDisplayName(tagField.getEnumValue().getDisplayName()) 103 | .build()); 104 | builder.setKind(CatalogTagFieldKinds.ENUM); 105 | break; 106 | 107 | case KIND_NOT_SET: 108 | throw new IllegalStateException("Unknown 
KIND of tagField"); 109 | } 110 | 111 | return builder.build(); 112 | } 113 | 114 | private CatalogTagFieldConverter() {} 115 | } 116 | 117 | private CatalogTagConverter() {} 118 | } 119 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/JsonMessageParserTest.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.common.truth.Truth.assertThat; 20 | import static org.junit.Assert.assertThrows; 21 | 22 | import org.junit.Test; 23 | import org.skyscreamer.jsonassert.JSONAssert; 24 | 25 | public class JsonMessageParserTest { 26 | 27 | private static final String TEST_JSON = 28 | "{\n" 29 | + " \"code\": 403,\n" 30 | + " \"errors\": [\n" 31 | + " {\n" 32 | + " \"domain\": \"global\",\n" 33 | + " \"message\": \"Access Denied\",\n" 34 | + " \"reason\": \"accessDenied\"\n" 35 | + " }\n" 36 | + " ],\n" 37 | + " \"status\": \"PERMISSION_DENIED\",\n" 38 | + " \"keyWithNestedValues\": {\n" 39 | + " \"nestedKey1\": \"nestedValue1\",\n" 40 | + " \"secondLevelNestedKey\": {\n" 41 | + " \"keyInSecond\": \"ValueInSecond\"\n" 42 | + " }\n" 43 | + " }\n" 44 | + "}"; 45 | 46 | @Test 47 | public void getJson_completeJson() { 48 | JSONAssert.assertEquals( 49 | TEST_JSON, JsonMessageParser.of(TEST_JSON).getJson(), /*strict=*/ false); 50 | } 51 | 52 | @Test 53 | public void getRootPath_defaultRoot_equalsDollar() { 54 | assertThat(JsonMessageParser.of(TEST_JSON).getRootPath()).isEqualTo("$"); 55 | } 56 | 57 | @Test 58 | public void getRootPath_nestedRoot_equalsCompleteRoot() { 59 | assertThat( 60 | JsonMessageParser.of(TEST_JSON) 61 | .forSubNode("$.keyWithNestedValues") 62 | .forSubNode("$.secondLevelNestedKey") 63 | .getRootPath()) 64 | .isEqualTo("$.keyWithNestedValues.secondLevelNestedKey"); 65 | } 66 | 67 | @Test 68 | public void read_emptyRootSingleStringValue_correct() { 69 | assertThat(JsonMessageParser.of(TEST_JSON).<String>read("$.status")) 70 | .isEqualTo("PERMISSION_DENIED"); 71 | } 72 | 73 | @Test 74 | public void read_emptyRootSingleIntegerValue_correct() { 75 | assertThat(JsonMessageParser.of(TEST_JSON).<Integer>read("$.code")).isEqualTo(403); 76 | } 77 | 78 | @Test 79 | public void read_subNodeRootSingleValue_correct() { 80 | assertThat( 81 | JsonMessageParser.of(TEST_JSON) 82 | 
.forSubNode("$.keyWithNestedValues") 83 | .<String>read("$.nestedKey1")) 84 | .isEqualTo("nestedValue1"); 85 | } 86 | 87 | @Test 88 | public void read_subNodeSecondLevelNestingRootSingleValue_correct() { 89 | assertThat( 90 | JsonMessageParser.of(TEST_JSON) 91 | .forSubNode("$.keyWithNestedValues") 92 | .forSubNode("$.secondLevelNestedKey") 93 | .<String>read("$.keyInSecond")) 94 | .isEqualTo("ValueInSecond"); 95 | } 96 | 97 | @Test 98 | public void read_invalidKey_null() { 99 | assertThat(JsonMessageParser.of(TEST_JSON).<String>read("$.missingKey")).isNull(); 100 | } 101 | 102 | @Test 103 | public void readOrDefault_keyPresent_actualValue() { 104 | assertThat(JsonMessageParser.of(TEST_JSON).readOrDefault("$.status", "DEFAULT")) 105 | .isEqualTo("PERMISSION_DENIED"); 106 | } 107 | 108 | @Test 109 | public void readOrDefault_keyMissing_defaultValue() { 110 | assertThat(JsonMessageParser.of(TEST_JSON).readOrDefault("$.status2", "DEFAULT")) 111 | .isEqualTo("DEFAULT"); 112 | } 113 | 114 | @Test 115 | public void readOrDefault_nullKey_defaultValue() { 116 | assertThat(JsonMessageParser.of(TEST_JSON).readOrDefault(null, "DEFAULT")).isEqualTo("DEFAULT"); 117 | } 118 | 119 | @Test 120 | public void setMessageJson_null_throwsIllegalArgumentException() { 121 | assertThrows(IllegalArgumentException.class, () -> JsonMessageParser.of(null)); 122 | } 123 | 124 | @Test 125 | public void setMessageJson_empty_throwsIllegalArgumentException() { 126 | assertThrows(IllegalArgumentException.class, () -> JsonMessageParser.of("")); 127 | } 128 | 129 | @Test 130 | public void setMessageJson_blank_throwsIllegalArgumentException() { 131 | assertThrows(IllegalArgumentException.class, () -> JsonMessageParser.of(" ")); 132 | } 133 | } 134 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/ProtoJsonConverter.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data 
Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.common.collect.ImmutableList.toImmutableList; 20 | 21 | import com.fasterxml.jackson.core.JsonProcessingException; 22 | import com.fasterxml.jackson.databind.ObjectMapper; 23 | import com.google.common.collect.ImmutableList; 24 | import com.google.common.flogger.GoogleLogger; 25 | import com.google.protobuf.InvalidProtocolBufferException; 26 | import com.google.protobuf.Message; 27 | import com.google.protobuf.MessageOrBuilder; 28 | import com.google.protobuf.util.JsonFormat; 29 | import com.jayway.jsonpath.JsonPath; 30 | import java.io.IOException; 31 | import java.util.Collection; 32 | import java.util.LinkedHashMap; 33 | import java.util.List; 34 | import java.util.Map; 35 | import java.util.Objects; 36 | import java.util.concurrent.TimeUnit; 37 | import java.util.stream.Collectors; 38 | import javax.annotation.Nullable; 39 | 40 | /** 41 | * Utility methods to convert Java beans and Protobuf messages to and from JSON. 42 | */ 43 | public final class ProtoJsonConverter { 44 | 45 | private static final GoogleLogger logger = GoogleLogger.forEnclosingClass(); 46 | 47 | /** 48 | * Returns the Object serialized as JSON using Jackson ObjectWriter or Protobuf JsonFormat if the 49 | * object is a Message. 
50 | * 51 | * @param object the object to get JSON representation of. 52 | * @return JSON representation of the object or an empty string in case of error. 53 | */ 54 | public static <T> String asJsonString(T object) { 55 | try { 56 | 57 | if (object == null) { 58 | return ""; 59 | } 60 | 61 | if (object instanceof MessageOrBuilder) { 62 | return convertProtobufMessage((MessageOrBuilder) object); 63 | } 64 | 65 | if (object instanceof Collection) { 66 | return "[" 67 | + ((Collection<?>) object) 68 | .stream().map(ProtoJsonConverter::asJsonString).collect(Collectors.joining(",")) 69 | + "]"; 70 | } 71 | 72 | if (object instanceof Map) { 73 | return mapAsJsonString((Map<?, ?>) object); 74 | } 75 | 76 | return convertJavaBeans(object); 77 | 78 | } catch (IOException exp) { 79 | logger.atSevere().withCause(exp).log("Error in converting to Json"); 80 | } 81 | 82 | return ""; 83 | } 84 | 85 | /** Returns a JSON string of the given Map. */ 86 | private static String mapAsJsonString(Map<?, ?> map) { 87 | if (map == null || map.isEmpty()) { 88 | return "{}"; 89 | } 90 | 91 | return "{" 92 | + map.entrySet().stream() 93 | .map( 94 | entry -> 95 | String.format("\"%s\": %s", entry.getKey(), asJsonString(entry.getValue()))) 96 | .collect(Collectors.joining(",")) 97 | + "}"; 98 | } 99 | 100 | /** Returns a JSON String representation using Jackson ObjectWriter. */ 101 | private static <T> String convertJavaBeans(T object) throws JsonProcessingException { 102 | return new ObjectMapper().writer().writeValueAsString(object); 103 | } 104 | 105 | /** Returns a JSON String representation using Protobuf JsonFormat. 
*/ 106 | public static String convertProtobufMessage(MessageOrBuilder message) 107 | throws InvalidProtocolBufferException { 108 | return JsonFormat.printer().print(message); 109 | } 110 | 111 | @SuppressWarnings("unchecked") 112 | @Nullable 113 | public static <T extends Message> T parseJson(String json, Class<T> protoClass) { 114 | try { 115 | Message.Builder builder = (Message.Builder) protoClass.getMethod("newBuilder").invoke(null); 116 | 117 | JsonFormat.parser().merge(json, builder); 118 | return (T) builder.build(); 119 | } catch (Exception protoException) { 120 | logger.atSevere().withCause(protoException).atMostEvery(1, TimeUnit.MINUTES).log( 121 | "error converting json:\n%s", json); 122 | return null; 123 | } 124 | } 125 | 126 | public static <T extends Message> ImmutableList<T> parseAsList( 127 | Collection<String> jsons, Class<T> protoClass) { 128 | return jsons.stream() 129 | .map(json -> ProtoJsonConverter.parseJson(json, protoClass)) 130 | .filter(Objects::nonNull) 131 | .collect(toImmutableList()); 132 | } 133 | 134 | public static <T extends Message> ImmutableList<T> parseAsList( 135 | String jsonArray, Class<T> protoClass) { 136 | return parseAsList( 137 | JsonPath.parse(jsonArray).<List<Map<String, Object>>>read("$").stream() 138 | .map(ProtoJsonConverter::asJsonString) 139 | .collect(toImmutableList()), 140 | protoClass); 141 | } 142 | 143 | private ProtoJsonConverter() {} 144 | } 145 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/CatalogTagConverterTest.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 
6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.cloud.solutions.catalogtagrecording.CatalogTagConverter.toCatalogTag; 20 | import static com.google.cloud.solutions.catalogtagrecording.ProtoJsonConverter.parseJson; 21 | import static com.google.cloud.solutions.catalogtagrecording.testing.TestResourceLoader.load; 22 | import static com.google.common.truth.Truth.assertThat; 23 | 24 | import com.google.cloud.datacatalog.v1beta1.Tag; 25 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogTag; 26 | import org.junit.Test; 27 | import org.junit.runner.RunWith; 28 | import org.junit.runners.JUnit4; 29 | 30 | @SuppressWarnings("ConstantConditions") // Suppress Null checks for Proto-parser output. 
31 | @RunWith(JUnit4.class) 32 | public final class CatalogTagConverterTest { 33 | 34 | @Test 35 | public void toCatalogTag_tagWithBooleanStringTimestampNoColumn_valid() { 36 | Tag tag = parseJson(load("single_tag_no_column.json"), Tag.class); 37 | assertThat(toCatalogTag(tag)) 38 | .isEqualTo( 39 | parseJson( 40 | "{\n" 41 | + " \"tagId\": \"projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF\",\n" 42 | + " \"templateId\": \"projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag\",\n" 43 | + " \"templateName\": \"Testing Sample Catalog Tag\",\n" 44 | + " \"fields\": [{\n" 45 | + " \"fieldId\": \"primary_source\",\n" 46 | + " \"fieldName\": \"Primary Source\",\n" 47 | + " \"kind\": \"BOOL\",\n" 48 | + " \"boolValue\": true\n" 49 | + " }, {\n" 50 | + " \"fieldId\": \"business_unit\",\n" 51 | + " \"fieldName\": \"Business Unit\",\n" 52 | + " \"kind\": \"STRING\",\n" 53 | + " \"stringValue\": \"Professional Services\"\n" 54 | + " }, {\n" 55 | + " \"fieldId\": \"updated_timestamp\",\n" 56 | + " \"fieldName\": \"Updated Timestamp\",\n" 57 | + " \"kind\": \"TIMESTAMP\",\n" 58 | + " \"timestampValue\": \"2020-09-09T15:25:00Z\"\n" 59 | + " }]\n" 60 | + "}", 61 | CatalogTag.class)); 62 | } 63 | 64 | @Test 65 | public void toCatalogTag_tagWithBooleanStringDoubleColumn_validColumn() { 66 | Tag tag = parseJson(load("single_tag_with_column.json"), Tag.class); 67 | 68 | assertThat(toCatalogTag(tag)) 69 | .isEqualTo( 70 | parseJson( 71 | "{\n" 72 | + " \"tagId\": \"projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF\",\n" 73 | + " \"templateId\": \"projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag\",\n" 74 | + " \"templateName\": \"Testing Sample Catalog Tag\",\n" 75 | + " \"column\": 
\"someColumn\",\n" 76 | + " \"fields\": [{\n" 77 | + " \"fieldId\": \"primary_source\",\n" 78 | + " \"fieldName\": \"Primary Source\",\n" 79 | + " \"kind\": \"BOOL\",\n" 80 | + " \"boolValue\": true\n" 81 | + " }, {\n" 82 | + " \"fieldId\": \"business_unit\",\n" 83 | + " \"fieldName\": \"Business Unit\",\n" 84 | + " \"kind\": \"STRING\",\n" 85 | + " \"stringValue\": \"Professional Services\"\n" 86 | + " }, {\n" 87 | + " \"fieldId\": \"version_number\",\n" 88 | + " \"fieldName\": \"Tag Version\",\n" 89 | + " \"kind\": \"DOUBLE\",\n" 90 | + " \"doubleValue\": 1.2\n" 91 | + " }]\n" 92 | + "}", 93 | CatalogTag.class)); 94 | } 95 | 96 | @Test 97 | public void toCatalogTag_tagWithEnumColumn_valid() { 98 | Tag tag = parseJson(load("single_tag_enum_column.json"), Tag.class); 99 | 100 | assertThat(toCatalogTag(tag)) 101 | .isEqualTo( 102 | parseJson( 103 | "{\n" 104 | + " \"tagId\": \"projects/my-project-id/locations/us/entryGroups/@bigquery/entries/cHJvamVjdHMvZ29vZ2xlLmNvbTphbmFudGQvZGF0YXNldHMvTXlEYXRhU2V0L3RhYmxlcy9Nb2NrUGlpUHJvY2Vzc2Vk/tags/CUzSbDlKy_z_bF\",\n" 105 | + " \"templateId\": \"projects/my-project-id/locations/us/tagTemplates/sample_catalog_tag\",\n" 106 | + " \"templateName\": \"Testing Sample Catalog Tag\",\n" 107 | + " \"column\": \"someColumn\",\n" 108 | + " \"fields\": [{\n" 109 | + " \"fieldId\": \"primary_source\",\n" 110 | + " \"fieldName\": \"Primary Source\",\n" 111 | + " \"kind\": \"BOOL\",\n" 112 | + " \"boolValue\": true\n" 113 | + " }, {\n" 114 | + " \"fieldId\": \"business_unit\",\n" 115 | + " \"fieldName\": \"Business Unit\",\n" 116 | + " \"kind\": \"STRING\",\n" 117 | + " \"stringValue\": \"Professional Services\"\n" 118 | + " }, {\n" 119 | + " \"fieldId\": \"data_category\",\n" 120 | + " \"fieldName\": \"Data Category\",\n" 121 | + " \"kind\": \"ENUM\",\n" 122 | + " \"enumValue\": {\n" 123 | + " \"displayName\": \"PII\"\n" 124 | + " }\n" 125 | + " }]\n" 126 | + "}", 127 | CatalogTag.class)); 128 | } 129 | } 130 | 
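As an aside for readers of `JsonMessageParser` above: its root-path composition (`buildPath` plus nested `forSubNode` calls, exercised by `getRootPath_nestedRoot_equalsCompleteRoot`) can be sketched in isolation. The `PathComposer` class below is a hypothetical, stdlib-only illustration and is not part of this repository.

```java
/**
 * Minimal sketch of the path composition performed by JsonMessageParser#buildPath:
 * a sub-path must start with "$.", and is appended to the current root path after
 * stripping that prefix. PathComposer is hypothetical, not a class in this repo.
 */
public class PathComposer {

  /** Joins rootPath and path, mirroring buildPath's checkArgument + replaceFirst. */
  public static String buildPath(String rootPath, String path) {
    if (!path.startsWith("$.")) {
      throw new IllegalArgumentException("path must start with \"$.\" (was: " + path + ")");
    }
    return rootPath + "." + path.replaceFirst("^\\$\\.", "");
  }

  public static void main(String[] args) {
    // Nesting two sub-nodes, as in getRootPath_nestedRoot_equalsCompleteRoot.
    String first = buildPath("$", "$.keyWithNestedValues");
    String second = buildPath(first, "$.secondLevelNestedKey");
    System.out.println(second); // $.keyWithNestedValues.secondLevelNestedKey
  }
}
```

Each `forSubNode` call in the real class applies this composition once, which is why two nested calls yield the fully qualified JsonPath root.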
-------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/testing/fakes/datacatalog/FakeDataCatalogStub.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording.testing.fakes.datacatalog; 18 | 19 | import static com.google.cloud.solutions.catalogtagrecording.ProtoJsonConverter.parseAsList; 20 | import static com.google.cloud.solutions.catalogtagrecording.ProtoJsonConverter.parseJson; 21 | import static com.google.common.base.Preconditions.checkArgument; 22 | import static com.google.common.base.Preconditions.checkState; 23 | import static com.google.common.collect.ImmutableList.toImmutableList; 24 | import static com.google.common.collect.ImmutableListMultimap.toImmutableListMultimap; 25 | import static com.google.common.collect.ImmutableMap.toImmutableMap; 26 | import static java.util.function.Function.identity; 27 | 28 | import com.google.api.core.ApiFuture; 29 | import com.google.api.gax.rpc.ApiCallContext; 30 | import com.google.api.gax.rpc.UnaryCallable; 31 | import com.google.cloud.datacatalog.v1beta1.DataCatalogClient.ListTagsPagedResponse; 32 | import com.google.cloud.datacatalog.v1beta1.Entry; 33 | import 
com.google.cloud.datacatalog.v1beta1.GetEntryRequest; 34 | import com.google.cloud.datacatalog.v1beta1.ListTagsRequest; 35 | import com.google.cloud.datacatalog.v1beta1.LookupEntryRequest; 36 | import com.google.cloud.datacatalog.v1beta1.Tag; 37 | import com.google.cloud.datacatalog.v1beta1.stub.DataCatalogStub; 38 | import com.google.cloud.solutions.catalogtagrecording.TagUtility; 39 | import com.google.cloud.solutions.catalogtagrecording.testing.TestResourceLoader; 40 | import com.google.cloud.solutions.catalogtagrecording.testing.fakes.FakeApiFutureBase; 41 | import com.google.common.collect.ImmutableCollection; 42 | import com.google.common.collect.ImmutableList; 43 | import com.google.common.collect.ImmutableListMultimap; 44 | import com.google.common.collect.ImmutableMap; 45 | import com.google.common.collect.ImmutableMultimap; 46 | import com.google.common.hash.Hashing; 47 | import java.io.Serializable; 48 | import java.util.Comparator; 49 | import java.util.List; 50 | import java.util.concurrent.TimeUnit; 51 | 52 | public final class FakeDataCatalogStub extends DataCatalogStub implements Serializable { 53 | 54 | private boolean shutdown; 55 | private boolean terminated; 56 | 57 | private final ImmutableMap<String, Entry> predefinedEntriesById; 58 | private final ImmutableMap<String, Entry> predefinedEntries; 59 | private final ImmutableMultimap<String, Tag> predefinedTags; 60 | 61 | public FakeDataCatalogStub( 62 | ImmutableCollection<Entry> predefinedEntries, ImmutableMultimap<String, Tag> predefinedTags) { 63 | this.predefinedEntriesById = 64 | predefinedEntries.stream().collect(toImmutableMap(Entry::getName, identity())); 65 | 66 | this.predefinedEntries = 67 | predefinedEntries.stream().collect(toImmutableMap(Entry::getLinkedResource, identity())); 68 | 69 | this.predefinedTags = predefinedTags; 70 | } 71 | 72 | public static FakeDataCatalogStub buildWithTestData( 73 | List<String> entryResourcesNames, List<String> tagsResourceNames) { 74 | 75 | ImmutableList<Entry> entries = 76 | entryResourcesNames.stream() 77 | 
.map(TestResourceLoader::load) 78 | .map(json -> parseJson(json, Entry.class)) 79 | .collect(toImmutableList()); 80 | 81 | ImmutableListMultimap<String, Tag> entityTags = 82 | tagsResourceNames.stream() 83 | .map(TestResourceLoader::load) 84 | .map(tagsJson -> parseAsList(tagsJson, Tag.class)) 85 | .flatMap(List::stream) 86 | .collect(toImmutableListMultimap(TagUtility::extractParent, identity())); 87 | 88 | return new FakeDataCatalogStub(entries, entityTags); 89 | } 90 | 91 | @Override 92 | public UnaryCallable<GetEntryRequest, Entry> getEntryCallable() { 93 | checkState(!shutdown, "Stub shutdown"); 94 | 95 | return new UnaryCallable<GetEntryRequest, Entry>() { 96 | @Override 97 | public ApiFuture<Entry> futureCall( 98 | GetEntryRequest getEntryRequest, ApiCallContext apiCallContext) { 99 | checkArgument( 100 | predefinedEntriesById.containsKey(getEntryRequest.getName()), "entryId not found."); 101 | 102 | return new FakeApiFutureBase<Entry>() { 103 | @Override 104 | public Entry get() { 105 | return predefinedEntriesById.get(getEntryRequest.getName()); 106 | } 107 | }; 108 | } 109 | }; 110 | } 111 | 112 | @Override 113 | public UnaryCallable<LookupEntryRequest, Entry> lookupEntryCallable() { 114 | checkState(!shutdown, "Stub shutdown"); 115 | 116 | return new UnaryCallable<LookupEntryRequest, Entry>() { 117 | 118 | @Override 119 | public ApiFuture<Entry> futureCall( 120 | LookupEntryRequest lookupEntryRequest, ApiCallContext apiCallContext) { 121 | return new FakeDataCatalogLookupEntryResponse( 122 | predefinedEntries.get(lookupEntryRequest.getLinkedResource()), lookupEntryRequest); 123 | } 124 | }; 125 | } 126 | 127 | @Override 128 | public UnaryCallable<ListTagsRequest, ListTagsPagedResponse> listTagsPagedCallable() { 129 | checkState(!shutdown, "Stub shutdown"); 130 | 131 | return new UnaryCallable<ListTagsRequest, ListTagsPagedResponse>() { 132 | @Override 133 | public ApiFuture<ListTagsPagedResponse> futureCall( 134 | ListTagsRequest listTagsRequest, ApiCallContext apiCallContext) { 135 | return new FakeDataCatalogPagesListTagsResponse( 136 | listTagsRequest, apiCallContext, entryTagsInOrder(listTagsRequest.getParent())); 137 | } 138 | }; 139 | } 140 | 141 | 
@SuppressWarnings("UnstableApiUsage") // Ensures deterministic Tag ordering for UnitTests 142 | private ImmutableList<Tag> entryTagsInOrder(String entryId) { 143 | return predefinedTags.get(entryId).stream() 144 | .sorted( 145 | Comparator.comparingInt(tag -> Hashing.sha256().hashBytes(tag.toByteArray()).asInt())) 146 | .collect(toImmutableList()); 147 | } 148 | 149 | @Override 150 | public void close() { 151 | // Do nothing because this is a Fake and doesn't implement an actual gRPC operation. 152 | } 153 | 154 | @Override 155 | public void shutdown() { 156 | shutdown = true; 157 | } 158 | 159 | @Override 160 | public boolean isShutdown() { 161 | return shutdown; 162 | } 163 | 164 | @Override 165 | public boolean isTerminated() { 166 | return terminated; 167 | } 168 | 169 | @Override 170 | public void shutdownNow() { 171 | shutdown(); 172 | } 173 | 174 | @Override 175 | public boolean awaitTermination(long l, TimeUnit timeUnit) { 176 | terminated = true; 177 | return true; 178 | } 179 | } 180 | -------------------------------------------------------------------------------- /src/main/java/com/google/cloud/solutions/catalogtagrecording/EntityTagOperationRecordExtractor.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.cloud.solutions.catalogtagrecording.TagUtility.extractEntityId; 20 | 21 | import com.google.auto.value.AutoValue; 22 | import com.google.cloud.datacatalog.v1beta1.stub.DataCatalogStub; 23 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.AuditInformation; 24 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.CatalogEntry; 25 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.EntityTagOperationRecord; 26 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.OperationInformation; 27 | import com.google.common.base.Splitter; 28 | import com.google.common.collect.ImmutableSet; 29 | import com.google.common.collect.Iterables; 30 | import com.google.common.flogger.GoogleLogger; 31 | import com.google.protobuf.util.Timestamps; 32 | import java.io.IOException; 33 | import java.text.ParseException; 34 | import java.time.Clock; 35 | import java.time.Instant; 36 | import javax.annotation.Nullable; 37 | import org.apache.beam.sdk.transforms.Filter; 38 | import org.apache.beam.sdk.transforms.MapElements; 39 | import org.apache.beam.sdk.transforms.PTransform; 40 | import org.apache.beam.sdk.transforms.SerializableFunction; 41 | import org.apache.beam.sdk.transforms.SimpleFunction; 42 | import org.apache.beam.sdk.values.PCollection; 43 | 44 | /** 45 | * Custom Transform to encapsulate parsing LogEntry and enriching the information with Tags data 46 | * from DataCatalog. 47 | */ 48 | @AutoValue 49 | public abstract class EntityTagOperationRecordExtractor 50 | extends PTransform<PCollection<String>, PCollection<EntityTagOperationRecord>> { 51 | 52 | protected static final GoogleLogger logger = GoogleLogger.forEnclosingClass(); 53 | 54 | /** 55 | * Set of events which will be processed by the enricher. 
56 | */ 57 | private static final ImmutableSet<String> USABLE_OPERATION_METHODS = 58 | ImmutableSet.of("CreateTag", "UpdateTag", "DeleteTag"); 59 | 60 | /** 61 | * Configurable Clock instance for Unit Testing. 62 | */ 63 | public abstract Clock clock(); 64 | 65 | /** 66 | * Configurable DataCatalogStub for Unit Testing. 67 | */ 68 | @Nullable 69 | public abstract DataCatalogStub catalogStub(); 70 | 71 | @Override 72 | public PCollection<EntityTagOperationRecord> expand(PCollection<String> input) { 73 | return input 74 | .apply("ExtractRecord", MapElements.via(new LogEntryToRecordMapper())) 75 | .apply("RetrieveCatalogTags", MapElements.via(new CatalogTagEnricher())) 76 | .apply("RemoveEmptyEnriched", Filter.by(new NonDefaultRecord())); 77 | } 78 | 79 | /** Filter to remove default record proto messages from upstream exceptions. */ 80 | private static class NonDefaultRecord 81 | implements SerializableFunction<EntityTagOperationRecord, Boolean> { 82 | 83 | @Override 84 | public Boolean apply(EntityTagOperationRecord record) { 85 | return !record.equals(EntityTagOperationRecord.getDefaultInstance()); 86 | } 87 | } 88 | 89 | /** Enriches the operation record with the Entry and Tags information retrieved from Data Catalog. 
*/ 90 | private class CatalogTagEnricher 91 | extends SimpleFunction<EntityTagOperationRecord, EntityTagOperationRecord> { 92 | 93 | @Override 94 | public EntityTagOperationRecord apply(EntityTagOperationRecord record) { 95 | try (DataCatalogService service = DataCatalogService.usingStub(catalogStub())) { 96 | 97 | if (record.equals(EntityTagOperationRecord.getDefaultInstance())) { 98 | return record; 99 | } 100 | 101 | CatalogEntry enrichedEntry = service.enrichCatalogEntry(record.getEntity()); 102 | 103 | return record.toBuilder() 104 | .setEntity(enrichedEntry) 105 | .clearTags() 106 | .addAllTags(service.lookUpAllTags(enrichedEntry.getEntityId())) 107 | .build(); 108 | } catch (IOException ioException) { 109 | logger.atSevere().withCause(ioException).log("Error enriching record:%n%s", record); 110 | } 111 | 112 | return EntityTagOperationRecord.getDefaultInstance(); 113 | } 114 | } 115 | 116 | /** Parses a LogEntry JSON string into a Proto message with all available fields initialized. */ 117 | private class LogEntryToRecordMapper extends SimpleFunction<String, EntityTagOperationRecord> { 118 | 119 | @SuppressWarnings("ConstantConditions") // Suppress Null checks for Proto-parser output. 
120 | @Override 121 | public EntityTagOperationRecord apply(String logEntryString) { 122 | try { 123 | 124 | JsonMessageParser logEntryParser = JsonMessageParser.of(logEntryString); 125 | JsonMessageParser auditLogParser = logEntryParser.forSubNode("$.protoPayload"); 126 | 127 | String catalogOperationMethod = auditLogParser.read("$.methodName"); 128 | String methodFragment = Iterables.getLast(Splitter.on(".").split(catalogOperationMethod)); 129 | 130 | if (!USABLE_OPERATION_METHODS.contains(methodFragment)) { 131 | return EntityTagOperationRecord.getDefaultInstance(); 132 | } 133 | 134 | String resourceName = auditLogParser.read("$.resourceName"); 135 | 136 | return EntityTagOperationRecord.newBuilder() 137 | .setReconcileTime(Timestamps.fromMillis(Instant.now(clock()).toEpochMilli())) 138 | .setEntity(CatalogEntry.newBuilder().setEntityId(extractEntityId(resourceName))) 139 | .setAuditInformation( 140 | AuditInformation.newBuilder() 141 | .setInsertId(logEntryParser.read("$.insertId")) 142 | .setActuator(auditLogParser.read("$.authenticationInfo.principalEmail")) 143 | .setJobTime(Timestamps.parse(logEntryParser.read("$.timestamp"))) 144 | .setOperation( 145 | OperationInformation.newBuilder() 146 | .setResource(resourceName) 147 | .setType(catalogOperationMethod))) 148 | .build(); 149 | 150 | } catch (ParseException e) { 151 | logger.atSevere().withCause(e).log( 152 | "Error extracting Record from LogEntry%n%s", logEntryString); 153 | } 154 | 155 | return EntityTagOperationRecord.getDefaultInstance(); 156 | } 157 | } 158 | 159 | public static EntityTagOperationRecordExtractor defaultExtractor() { 160 | return builder().build(); 161 | } 162 | 163 | public static Builder builder() { 164 | return new AutoValue_EntityTagOperationRecordExtractor.Builder().clock(Clock.systemUTC()); 165 | } 166 | 167 | @AutoValue.Builder 168 | public abstract static class Builder { 169 | 170 | public abstract Builder clock(Clock clock); 171 | 172 | public abstract Builder 
catalogStub(@Nullable DataCatalogStub catalogStub); 173 | 174 | public abstract EntityTagOperationRecordExtractor build(); 175 | } 176 | } 177 | -------------------------------------------------------------------------------- /src/test/java/com/google/cloud/solutions/catalogtagrecording/EntityTagOperationRecordExtractorTest.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The Data Catalog Tag History Authors. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | 17 | package com.google.cloud.solutions.catalogtagrecording; 18 | 19 | import static com.google.cloud.solutions.catalogtagrecording.ProtoJsonConverter.parseJson; 20 | import static com.google.cloud.solutions.catalogtagrecording.testing.TestResourceLoader.load; 21 | 22 | import com.google.cloud.datacatalog.v1beta1.stub.DataCatalogStub; 23 | import com.google.cloud.solutions.catalogtagrecording.TagRecordingMessages.EntityTagOperationRecord; 24 | import com.google.cloud.solutions.catalogtagrecording.testing.PCollectionSatisfies; 25 | import com.google.cloud.solutions.catalogtagrecording.testing.fakes.datacatalog.FakeDataCatalogStub; 26 | import com.google.common.collect.ImmutableList; 27 | import java.time.Clock; 28 | import java.time.ZoneOffset; 29 | import java.time.ZonedDateTime; 30 | import org.apache.beam.sdk.testing.PAssert; 31 | import org.apache.beam.sdk.testing.TestPipeline; 32 | import org.apache.beam.sdk.transforms.Create; 33 | import org.apache.beam.sdk.values.PCollection; 34 | import org.junit.Rule; 35 | import org.junit.Test; 36 | import org.junit.runner.RunWith; 37 | import org.junit.runners.JUnit4; 38 | 39 | @RunWith(JUnit4.class) 40 | public final class EntityTagOperationRecordExtractorTest { 41 | 42 | @Rule 43 | public final transient TestPipeline p = TestPipeline.create(); 44 | 45 | private final Clock fixedClock = 46 | Clock.fixed(ZonedDateTime.parse("2020-09-11T10:52:01.789Z").toInstant(), ZoneOffset.UTC); 47 | 48 | @Test 49 | public void entityTagOperationRecord_serializable() { 50 | PCollection<EntityTagOperationRecord> records = 51 | p.apply( 52 | Create.of( 53 | parseJson( 54 | load("create_tag_operation_record.json"), EntityTagOperationRecord.class))); 55 | 56 | PAssert.thatSingleton(records) 57 | .isEqualTo( 58 | parseJson(load("create_tag_operation_record.json"), EntityTagOperationRecord.class)); 59 | 60 | p.run(); 61 | } 62 | 63 | @Test 64 | public void expand_createMessage_valid() { 65 | DataCatalogStub fakeCatalogStub = 66 | 
FakeDataCatalogStub.buildWithTestData( 67 | ImmutableList.of("datacatalog-objects/TableA_entry.json"), 68 | ImmutableList.of( 69 | "datacatalog-objects/TableA_sample_template_tag.json", 70 | "datacatalog-objects/TableA_pii_tag.json")); 71 | 72 | PCollection<EntityTagOperationRecord> records = 73 | p.apply(Create.of(load("create_tag_request.json"))) 74 | .apply( 75 | EntityTagOperationRecordExtractor.builder() 76 | .clock(fixedClock) 77 | .catalogStub(fakeCatalogStub) 78 | .build()); 79 | 80 | PAssert.thatSingleton(records) 81 | .satisfies( 82 | PCollectionSatisfies.singleton( 83 | parseJson( 84 | load("create_tag_operation_record.json"), EntityTagOperationRecord.class))); 85 | 86 | p.run(); 87 | } 88 | 89 | @Test 90 | public void expand_deleteMessage_valid() { 91 | DataCatalogStub fakeCatalogStub = 92 | FakeDataCatalogStub.buildWithTestData( 93 | ImmutableList.of("datacatalog-objects/TableB_entry.json"), 94 | ImmutableList.of("datacatalog-objects/TableB_pii_tag.json")); 95 | 96 | PCollection<EntityTagOperationRecord> records = 97 | p.apply(Create.of(load("delete_tag_request_1.json"))) 98 | .apply( 99 | EntityTagOperationRecordExtractor.builder() 100 | .clock(fixedClock) 101 | .catalogStub(fakeCatalogStub) 102 | .build()); 103 | 104 | PAssert.thatSingleton(records) 105 | .satisfies( 106 | PCollectionSatisfies.singleton( 107 | parseJson( 108 | load("delete_tag_operation_record.json"), EntityTagOperationRecord.class))); 109 | 110 | p.run(); 111 | } 112 | 113 | @Test 114 | public void expand_multipleEvents_validSequence() { 115 | DataCatalogStub fakeCatalogStub = 116 | FakeDataCatalogStub.buildWithTestData( 117 | ImmutableList.of( 118 | "datacatalog-objects/TableA_entry.json", "datacatalog-objects/TableB_entry.json"), 119 | ImmutableList.of( 120 | "datacatalog-objects/TableA_sample_template_tag.json", 121 | "datacatalog-objects/TableA_pii_tag.json", 122 | "datacatalog-objects/TableB_pii_tag.json")); 123 | 124 | PCollection<EntityTagOperationRecord> records = 125 | p.apply(Create.of(load("create_tag_request.json"), 
load("delete_tag_request_1.json"))) 126 | .apply( 127 | EntityTagOperationRecordExtractor.builder() 128 | .clock(fixedClock) 129 | .catalogStub(fakeCatalogStub) 130 | .build()); 131 | 132 | PAssert.that(records) 133 | .satisfies( 134 | PCollectionSatisfies.expectedSet( 135 | parseJson(load("create_tag_operation_record.json"), EntityTagOperationRecord.class), 136 | parseJson( 137 | load("delete_tag_operation_record.json"), EntityTagOperationRecord.class))); 138 | 139 | p.run(); 140 | } 141 | 142 | @Test 143 | public void expand_threeEventsWithOneInvalidEvent_invalidFilteredOut() { 144 | DataCatalogStub fakeCatalogStub = 145 | FakeDataCatalogStub.buildWithTestData( 146 | ImmutableList.of( 147 | "datacatalog-objects/TableA_entry.json", "datacatalog-objects/TableB_entry.json"), 148 | ImmutableList.of( 149 | "datacatalog-objects/TableA_sample_template_tag.json", 150 | "datacatalog-objects/TableA_pii_tag.json", 151 | "datacatalog-objects/TableB_pii_tag.json")); 152 | 153 | PCollection<EntityTagOperationRecord> records = 154 | p.apply( 155 | Create.of( 156 | load("create_tag_request.json"), 157 | load("catalog_test_permissions_audit_log.json"), 158 | load("delete_tag_request_1.json"))) 159 | .apply( 160 | EntityTagOperationRecordExtractor.builder() 161 | .clock(fixedClock) 162 | .catalogStub(fakeCatalogStub) 163 | .build()); 164 | 165 | PAssert.that(records) 166 | .satisfies( 167 | PCollectionSatisfies.expectedSet( 168 | parseJson(load("create_tag_operation_record.json"), EntityTagOperationRecord.class), 169 | parseJson( 170 | load("delete_tag_operation_record.json"), EntityTagOperationRecord.class))); 171 | 172 | p.run(); 173 | } 174 | 175 | @Test 176 | public void expand_invalidEvent_empty() { 177 | DataCatalogStub fakeCatalogStub = 178 | FakeDataCatalogStub.buildWithTestData( 179 | ImmutableList.of( 180 | "datacatalog-objects/TableA_entry.json", "datacatalog-objects/TableB_entry.json"), 181 | ImmutableList.of( 182 | "datacatalog-objects/TableA_sample_template_tag.json", 183 | "datacatalog-objects/TableA_pii_tag.json", 184 | "datacatalog-objects/TableB_pii_tag.json")); 185 | 186 | PCollection<EntityTagOperationRecord> records = 187 | p.apply(Create.of(load("catalog_test_permissions_audit_log.json"))) 188 | .apply( 189 | EntityTagOperationRecordExtractor.builder() 190 | .clock(fixedClock) 191 | .catalogStub(fakeCatalogStub) 192 | .build()); 193 | 194 | PAssert.that(records).empty(); 195 | 196 | p.run(); 197 | } 198 | } 199 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | 2 | Apache License 3 | Version 2.0, January 2004 4 | http://www.apache.org/licenses/ 5 | 6 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 7 | 8 | 1. Definitions. 9 | 10 | "License" shall mean the terms and conditions for use, reproduction, 11 | and distribution as defined by Sections 1 through 9 of this document. 12 | 13 | "Licensor" shall mean the copyright owner or entity authorized by 14 | the copyright owner that is granting the License. 15 | 16 | "Legal Entity" shall mean the union of the acting entity and all 17 | other entities that control, are controlled by, or are under common 18 | control with that entity. For the purposes of this definition, 19 | "control" means (i) the power, direct or indirect, to cause the 20 | direction or management of such entity, whether by contract or 21 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 22 | outstanding shares, or (iii) beneficial ownership of such entity. 23 | 24 | "You" (or "Your") shall mean an individual or Legal Entity 25 | exercising permissions granted by this License. 26 | 27 | "Source" form shall mean the preferred form for making modifications, 28 | including but not limited to software source code, documentation 29 | source, and configuration files. 
30 | 31 | "Object" form shall mean any form resulting from mechanical 32 | transformation or translation of a Source form, including but 33 | not limited to compiled object code, generated documentation, 34 | and conversions to other media types. 35 | 36 | "Work" shall mean the work of authorship, whether in Source or 37 | Object form, made available under the License, as indicated by a 38 | copyright notice that is included in or attached to the work 39 | (an example is provided in the Appendix below). 40 | 41 | "Derivative Works" shall mean any work, whether in Source or Object 42 | form, that is based on (or derived from) the Work and for which the 43 | editorial revisions, annotations, elaborations, or other modifications 44 | represent, as a whole, an original work of authorship. For the purposes 45 | of this License, Derivative Works shall not include works that remain 46 | separable from, or merely link (or bind by name) to the interfaces of, 47 | the Work and Derivative Works thereof. 48 | 49 | "Contribution" shall mean any work of authorship, including 50 | the original version of the Work and any modifications or additions 51 | to that Work or Derivative Works thereof, that is intentionally 52 | submitted to Licensor for inclusion in the Work by the copyright owner 53 | or by an individual or Legal Entity authorized to submit on behalf of 54 | the copyright owner. For the purposes of this definition, "submitted" 55 | means any form of electronic, verbal, or written communication sent 56 | to the Licensor or its representatives, including but not limited to 57 | communication on electronic mailing lists, source code control systems, 58 | and issue tracking systems that are managed by, or on behalf of, the 59 | Licensor for the purpose of discussing and improving the Work, but 60 | excluding communication that is conspicuously marked or otherwise 61 | designated in writing by the copyright owner as "Not a Contribution." 
62 | 63 | "Contributor" shall mean Licensor and any individual or Legal Entity 64 | on behalf of whom a Contribution has been received by Licensor and 65 | subsequently incorporated within the Work. 66 | 67 | 2. Grant of Copyright License. Subject to the terms and conditions of 68 | this License, each Contributor hereby grants to You a perpetual, 69 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 70 | copyright license to reproduce, prepare Derivative Works of, 71 | publicly display, publicly perform, sublicense, and distribute the 72 | Work and such Derivative Works in Source or Object form. 73 | 74 | 3. Grant of Patent License. Subject to the terms and conditions of 75 | this License, each Contributor hereby grants to You a perpetual, 76 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 77 | (except as stated in this section) patent license to make, have made, 78 | use, offer to sell, sell, import, and otherwise transfer the Work, 79 | where such license applies only to those patent claims licensable 80 | by such Contributor that are necessarily infringed by their 81 | Contribution(s) alone or by combination of their Contribution(s) 82 | with the Work to which such Contribution(s) was submitted. If You 83 | institute patent litigation against any entity (including a 84 | cross-claim or counterclaim in a lawsuit) alleging that the Work 85 | or a Contribution incorporated within the Work constitutes direct 86 | or contributory patent infringement, then any patent licenses 87 | granted to You under this License for that Work shall terminate 88 | as of the date such litigation is filed. 89 | 90 | 4. Redistribution. 
You may reproduce and distribute copies of the 91 | Work or Derivative Works thereof in any medium, with or without 92 | modifications, and in Source or Object form, provided that You 93 | meet the following conditions: 94 | 95 | (a) You must give any other recipients of the Work or 96 | Derivative Works a copy of this License; and 97 | 98 | (b) You must cause any modified files to carry prominent notices 99 | stating that You changed the files; and 100 | 101 | (c) You must retain, in the Source form of any Derivative Works 102 | that You distribute, all copyright, patent, trademark, and 103 | attribution notices from the Source form of the Work, 104 | excluding those notices that do not pertain to any part of 105 | the Derivative Works; and 106 | 107 | (d) If the Work includes a "NOTICE" text file as part of its 108 | distribution, then any Derivative Works that You distribute must 109 | include a readable copy of the attribution notices contained 110 | within such NOTICE file, excluding those notices that do not 111 | pertain to any part of the Derivative Works, in at least one 112 | of the following places: within a NOTICE text file distributed 113 | as part of the Derivative Works; within the Source form or 114 | documentation, if provided along with the Derivative Works; or, 115 | within a display generated by the Derivative Works, if and 116 | wherever such third-party notices normally appear. The contents 117 | of the NOTICE file are for informational purposes only and 118 | do not modify the License. You may add Your own attribution 119 | notices within Derivative Works that You distribute, alongside 120 | or as an addendum to the NOTICE text from the Work, provided 121 | that such additional attribution notices cannot be construed 122 | as modifying the License. 
123 | 124 | You may add Your own copyright statement to Your modifications and 125 | may provide additional or different license terms and conditions 126 | for use, reproduction, or distribution of Your modifications, or 127 | for any such Derivative Works as a whole, provided Your use, 128 | reproduction, and distribution of the Work otherwise complies with 129 | the conditions stated in this License. 130 | 131 | 5. Submission of Contributions. Unless You explicitly state otherwise, 132 | any Contribution intentionally submitted for inclusion in the Work 133 | by You to the Licensor shall be under the terms and conditions of 134 | this License, without any additional terms or conditions. 135 | Notwithstanding the above, nothing herein shall supersede or modify 136 | the terms of any separate license agreement you may have executed 137 | with Licensor regarding such Contributions. 138 | 139 | 6. Trademarks. This License does not grant permission to use the trade 140 | names, trademarks, service marks, or product names of the Licensor, 141 | except as required for reasonable and customary use in describing the 142 | origin of the Work and reproducing the content of the NOTICE file. 143 | 144 | 7. Disclaimer of Warranty. Unless required by applicable law or 145 | agreed to in writing, Licensor provides the Work (and each 146 | Contributor provides its Contributions) on an "AS IS" BASIS, 147 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 148 | implied, including, without limitation, any warranties or conditions 149 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 150 | PARTICULAR PURPOSE. You are solely responsible for determining the 151 | appropriateness of using or redistributing the Work and assume any 152 | risks associated with Your exercise of permissions under this License. 153 | 154 | 8. Limitation of Liability. 
In no event and under no legal theory, 155 | whether in tort (including negligence), contract, or otherwise, 156 | unless required by applicable law (such as deliberate and grossly 157 | negligent acts) or agreed to in writing, shall any Contributor be 158 | liable to You for damages, including any direct, indirect, special, 159 | incidental, or consequential damages of any character arising as a 160 | result of this License or out of the use or inability to use the 161 | Work (including but not limited to damages for loss of goodwill, 162 | work stoppage, computer failure or malfunction, or any and all 163 | other commercial damages or losses), even if such Contributor 164 | has been advised of the possibility of such damages. 165 | 166 | 9. Accepting Warranty or Additional Liability. While redistributing 167 | the Work or Derivative Works thereof, You may choose to offer, 168 | and charge a fee for, acceptance of support, warranty, indemnity, 169 | or other liability obligations and/or rights consistent with this 170 | License. However, in accepting such obligations, You may act only 171 | on Your own behalf and on Your sole responsibility, not on behalf 172 | of any other Contributor, and only if You agree to indemnify, 173 | defend, and hold each Contributor harmless for any liability 174 | incurred by, or claims asserted against, such Contributor by reason 175 | of your accepting any such warranty or additional liability. 176 | 177 | END OF TERMS AND CONDITIONS 178 | 179 | APPENDIX: How to apply the Apache License to your work. 180 | 181 | To apply the Apache License to your work, attach the following 182 | boilerplate notice, with the fields enclosed by brackets "[]" 183 | replaced with your own identifying information. (Don't include 184 | the brackets!) The text should be enclosed in the appropriate 185 | comment syntax for the file format. 
We also recommend that a 186 | file or class name and description of purpose be included on the 187 | same "printed page" as the copyright notice for easier 188 | identification within third-party archives. 189 | 190 | Copyright 2020 Google LLC 191 | 192 | Licensed under the Apache License, Version 2.0 (the "License"); 193 | you may not use this file except in compliance with the License. 194 | You may obtain a copy of the License at 195 | 196 | http://www.apache.org/licenses/LICENSE-2.0 197 | 198 | Unless required by applicable law or agreed to in writing, software 199 | distributed under the License is distributed on an "AS IS" BASIS, 200 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 201 | See the License for the specific language governing permissions and 202 | limitations under the License. 203 | -------------------------------------------------------------------------------- /pom.xml: -------------------------------------------------------------------------------- 1 | 2 | 17 | 18 | 21 | 4.0.0 22 | 23 | com.google.cloud.solutions 24 | datacatalog-tag-history 25 | 0.1-SNAPSHOT 26 | 27 | 28 | 1.7.4 29 | 2.23.0 30 | 10.0.0 31 | 0.5.1 32 | 30.1.1-jre 33 | 1.5.0 34 | 4.13.1 35 | 8 36 | 1.8 37 | 1.8 38 | 3.7.0 39 | 1.6.0 40 | 3.0.2 41 | 3.1.0 42 | 2.22.2 43 | UTF-8 44 | 3.11.4 45 | 3.11.4 46 | 1.7.25 47 | GoogleCloudPlatform_bigquery-data-lineage 48 | googlecloudplatform 49 | https://sonarcloud.io 50 | 3.7.0.1746 51 | 1.0.1 52 | 53 | 54 | 55 | 56 | 57 | 58 | org.apache.beam 59 | beam-sdks-java-core 60 | ${beam.version} 61 | 62 | 63 | 64 | org.apache.beam 65 | beam-sdks-java-extensions-protobuf 66 | ${beam.version} 67 | 68 | 69 | 70 | 71 | 72 | org.apache.beam 73 | beam-runners-google-cloud-dataflow-java 74 | ${beam.version} 75 | runtime 76 | 77 | 78 | 79 | 80 | org.apache.beam 81 | beam-sdks-java-io-google-cloud-platform 82 | ${beam.version} 83 | 84 | 85 | 86 | 87 | 88 | 89 | com.google.auto.value 90 | auto-value-annotations 91 | 
${auto-value.version} 92 | 93 | 94 | 95 | com.google.auto.value 96 | auto-value 97 | ${auto-value.version} 98 | 99 | 100 | 101 | 102 | com.google.protobuf 103 | protobuf-java 104 | 105 | 106 | 107 | com.google.protobuf 108 | protobuf-java-util 109 | 110 | 111 | 112 | 113 | com.jayway.jsonpath 114 | json-path 115 | 2.4.0 116 | 117 | 118 | 119 | 120 | 121 | com.google.cloud 122 | google-cloud-datacatalog 123 | 124 | 125 | 126 | 127 | 128 | com.google.flogger 129 | flogger 130 | ${flogger.version} 131 | 132 | 133 | 134 | com.google.flogger 135 | google-extensions 136 | ${flogger.version} 137 | 138 | 139 | 140 | com.google.flogger 141 | flogger-system-backend 142 | ${flogger.version} 143 | runtime 144 | 145 | 146 | 147 | 148 | org.slf4j 149 | slf4j-api 150 | ${slf4j.version} 151 | 152 | 153 | 154 | org.slf4j 155 | slf4j-jdk14 156 | ${slf4j.version} 157 | 158 | runtime 159 | 160 | 161 | 162 | 163 | junit 164 | junit 165 | ${junit.version} 166 | test 167 | 168 | 169 | 170 | 171 | org.hamcrest 172 | hamcrest-all 173 | 1.3 174 | test 175 | 176 | 177 | 178 | 179 | org.mockito 180 | mockito-all 181 | 1.10.19 182 | test 183 | 184 | 185 | 186 | 187 | com.google.truth 188 | truth 189 | ${truth.version} 190 | test 191 | 192 | 193 | 194 | com.google.truth.extensions 195 | truth-java8-extension 196 | ${truth.version} 197 | test 198 | 199 | 200 | 201 | 202 | org.skyscreamer 203 | jsonassert 204 | ${jsonassert.version} 205 | test 206 | 207 | 208 | 209 | 210 | org.apache.beam 211 | beam-runners-direct-java 212 | ${beam.version} 213 | test 214 | 215 | 216 | 217 | 218 | 219 | 220 | 221 | com.google.guava 222 | guava 223 | ${guava.version} 224 | 225 | 226 | 227 | 228 | com.google.protobuf 229 | protobuf-bom 230 | ${protobuf.version} 231 | pom 232 | import 233 | 234 | 235 | 236 | 237 | com.google.cloud 238 | libraries-bom 239 | ${cloud-libraries-bom.version} 240 | pom 241 | import 242 | 243 | 244 | 245 | 246 | 247 | 248 | 249 | com.github.os72 250 | protoc-jar-maven-plugin 251 | 
${protobuf.version} 252 | 253 | 254 | generate-sources 255 | 256 | run 257 | 258 | 259 | ${protobuf.version} 260 | direct 261 | 262 | src/main/proto 263 | 264 | 265 | 266 | 267 | 268 | 269 | 270 | org.apache.maven.plugins 271 | maven-compiler-plugin 272 | ${maven-compiler-plugin.version} 273 | 274 | true 275 | -parameters 276 | ${maven.compiler.release} 277 | ${maven.compiler.source} 278 | ${maven.compiler.target} 279 | 280 | 281 | 282 | 283 | org.apache.maven.plugins 284 | maven-surefire-plugin 285 | ${maven-surefire-plugin.version} 286 | 287 | classes 288 | 4 289 | true 290 | 291 | 292 | 293 | org.apache.maven.surefire 294 | surefire-junit47 295 | ${maven-surefire-plugin.version} 296 | 297 | 298 | junit 299 | junit 300 | ${junit.version} 301 | 302 | 303 | 304 | 305 | 306 | 307 | org.jacoco 308 | jacoco-maven-plugin 309 | 0.8.5 310 | 311 | 312 | **/AutoValue* 313 | com/google/cloud/solutions/catalogtagrecording/TagRecordingMessages* 314 | com/google/cloud/solutions/catalogtagrecording/*Pipeline* 315 | 316 | 317 | 318 | 319 | 320 | prepare-agent 321 | 322 | 323 | 324 | report 325 | test 326 | 327 | report 328 | 329 | 330 | 331 | 332 | 333 | 335 | 336 | org.apache.maven.plugins 337 | maven-jar-plugin 338 | ${maven-jar-plugin.version} 339 | 340 | 341 | 345 | 346 | org.apache.maven.plugins 347 | maven-shade-plugin 348 | ${maven-shade-plugin.version} 349 | 350 | 351 | package 352 | 353 | shade 354 | 355 | 356 | ${project.artifactId}-bundled-${project.version} 357 | 358 | 359 | *:* 360 | 361 | META-INF/LICENSE 362 | META-INF/*.SF 363 | META-INF/*.DSA 364 | META-INF/*.RSA 365 | 366 | 367 | 368 | 369 | 371 | 372 | 373 | 374 | 375 | 376 | 377 | 378 | 379 | 380 | 381 | org.codehaus.mojo 382 | exec-maven-plugin 383 | ${maven-exec-plugin.version} 384 | 385 | false 386 | 387 | 388 | 389 | 390 | org.sonarsource.scanner.maven 391 | sonar-maven-plugin 392 | ${sonar.version} 393 | 394 | 395 | 396 | 397 | 398 | 
-------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ## Record Data Catalog Tags changes to BigQuery in real-time 2 | 3 | - [Concepts](#concepts) 4 | - [Prerequisites](#prerequisites) 5 | - [Objectives](#objectives) 6 | - [Costs](#costs) 7 | - [Before you begin](#before-you-begin) 8 | - [Setting up your environment](#setup-environment) 9 | - [Creating resources](#creating-resources) 10 | * [Create BigQuery table](#setup-bigquery) 11 | * [Configure Pub/Sub topic and subscription](#configure-pubsub) 12 | * [Configure Log sink](#configure-log-sink) 13 | * [Create Service Accounts](#create-service-account) 14 | - [Deploy the Entry Tags recording pipeline](#launch-pipeline) 15 | - [Manual Test](#manual-test) 16 | - [Limitations](#limitations) 17 | - [Cleaning up](#cleanup) 18 | * [Delete the project](#delete-project) 19 | - [License](#license) 20 | 21 | The historical metadata of your data warehouse is a treasure trove for discovering not just insights about changing data patterns, but also data quality and user behavior. The challenge is that Data Catalog keeps only a single, current version of metadata to enable fast searching. 22 | 23 | This solution is intended for technical people with responsibility for metadata management, data governance, and related analytics. 24 | 25 | This tutorial presents a solution that creates a historical record of Data Catalog metadata tags in real time: it captures and parses the [Audit Logs](https://cloud.google.com/data-catalog/docs/how-to/audit-logging) from [Cloud Logging](https://cloud.google.com/logging), processes them with [Pub/Sub](https://cloud.google.com/pubsub) and [Dataflow](https://cloud.google.com/dataflow), and appends the resulting change records to a [BigQuery](https://cloud.google.com/bigquery) table for historical analysis. 
26 | 27 | ![Data Catalog Tag history recording solution architecture](images/data_catalog_tag_recorder_arch.svg) 28 | 29 | > **Warning** 30 | > 31 | > Apply restrictive access controls to the Tag history BigQuery table. 32 | > Data Catalog results are ACLed based on requestors' permissions. 33 |
41 | 42 | ### Concepts 43 | 44 | * [Cloud Data Catalog](https://cloud.google.com/data-catalog) provides a single pane for searching and managing both [Technical and Business metadata](https://cloud.google.com/data-catalog/docs/concepts/overview#glossary) of your data warehouse or data lake, in Google Cloud and beyond. Data Catalog uses [Tags](https://cloud.google.com/data-catalog/docs/concepts/overview#tags) to organize metadata and make it discoverable. 45 | * [BigQuery](https://cloud.google.com/bigquery) is Google Cloud's serverless, highly scalable, and cost-effective multi-cloud data warehouse designed for business agility; you can use it to run petabyte-scale queries. BigQuery also provides APIs for reading table schemas. 46 | * [Dataflow](https://cloud.google.com/dataflow) is Google Cloud's serverless data processing service for both stream and batch workloads. 47 | * [Pub/Sub](https://cloud.google.com/pubsub) is Google Cloud's flexible, reliable, real-time messaging service that lets independent applications publish and subscribe to asynchronous events. 48 | 49 | 50 | ### Prerequisites 51 | 52 | This tutorial assumes some familiarity with shell scripts and basic knowledge of Google Cloud Platform. 53 | 54 | ### Objectives 55 | 56 | 1. Set up a Log sink to export Data Catalog audit logs to Pub/Sub 57 | 2. Deploy a streaming Dataflow pipeline to parse the logs 58 | 3. Enrich the logs with Entry Tag information from Data Catalog 59 | 4.
Store the metadata tags attached to the modified Entry in a BigQuery table for historical reference 60 | 61 | 62 | ### Costs 63 | 64 | This tutorial uses billable components of Google Cloud, including: 65 | 66 | * [Data Catalog](https://cloud.google.com/data-catalog/pricing) 67 | * [Cloud Dataflow](https://cloud.google.com/dataflow/pricing) 68 | * [Cloud Pub/Sub](https://cloud.google.com/pubsub/pricing) 69 | * [Cloud Logging](https://cloud.google.com/logging/pricing) 70 | * [Cloud Storage](https://cloud.google.com/storage/pricing) 71 | * [BigQuery](https://cloud.google.com/bigquery/pricing) 72 |   * Streaming API 73 |   * Storage 74 | 75 | Use the [Pricing Calculator](https://cloud.google.com/products/calculator) to generate a cost estimate based on your projected usage. 76 | 77 | ### Before you begin 78 | 79 | For this tutorial, you need a Google Cloud [project](https://cloud.google.com/resource-manager/docs/cloud-platform-resource-hierarchy#projects). You can create a new one, or select a project you already created. 80 | 81 | 1. Select or create a Google Cloud project from the [project selector page](https://console.cloud.google.com/projectselector2/home/dashboard) 82 | 2. Make sure that billing is enabled for your Google Cloud project 83 | * [Enable billing](https://support.google.com/cloud/answer/6293499#enable-billing) for your project 84 | 3. Enable the APIs for the Data Catalog, BigQuery, Pub/Sub, Dataflow, and Cloud Storage services 85 | 86 | [ENABLE THE APIS](https://console.cloud.google.com/flows/enableapi?apiid=datacatalog,bigquery,pubsub,storage_component,dataflow) 87 | 88 | Alternatively, you can enable the required Google Cloud services from the shell: 89 | 90 | ```shell script 91 | gcloud services enable \ 92 | bigquery.googleapis.com \ 93 | storage-component.googleapis.com \ 94 | datacatalog.googleapis.com \ 95 | dataflow.googleapis.com \ 96 | pubsub.googleapis.com 97 | ``` 98 | 99 | 4. In the Google Cloud Platform Console, go to Cloud Shell.
100 | 101 | [GO TO CLOUD SHELL](https://console.cloud.google.com/?cloudshell=true) 102 | 103 | At the bottom of the GCP Console, a [Cloud Shell](https://cloud.google.com/shell/docs/features) session opens and displays a command-line prompt. Cloud Shell is a shell environment with the Cloud SDK already installed, including the [gcloud](https://cloud.google.com/sdk/gcloud/) command-line tool, and with values already set for your current project. It can take a few seconds for the session to initialize. 104 | 105 | ### Setting up your environment 106 | 107 | 1. In Cloud Shell, clone the source repository: 108 | 109 | ```shell script 110 | git clone https://github.com/GoogleCloudPlatform/datacatalog-tag-history.git 111 | cd datacatalog-tag-history/ 112 | ``` 113 | 114 | Use your favorite editor to modify the env.sh file and set the following variables. 115 | 116 | ```shell script 117 | # The GCP project to use for this tutorial 118 | export PROJECT_ID="your-project-id" 119 | 120 | # The BigQuery region to use for the Tags table 121 | export BIGQUERY_REGION="" 122 | 123 | # The name of the BigQuery dataset in which to create the Tag records table 124 | export DATASET_ID="" 125 | 126 | # The name of the BigQuery table for Tag records 127 | export TABLE_ID="EntityTagOperationRecords" 128 | 129 | # The Compute region to use for running Dataflow jobs and for creating a temporary storage bucket 130 | export REGION_ID="" 131 | 132 | # The id of the temporary storage bucket 133 | export TEMP_GCS_BUCKET="" 134 | 135 | # The name of the Pub/Sub log sink in Cloud Logging 136 | export LOGS_SINK_NAME="datacatalog-audit-pubsub" 137 | 138 | # The Pub/Sub topic for receiving AuditLog events 139 | export LOGS_SINK_TOPIC_ID="catalog-audit-log-sink" 140 | 141 | # The id of the Pub/Sub subscription 142 | export LOGS_SUBSCRIPTION_ID="catalog-tags-dumper" 143 | 144 | # The name of the service account to use (not the email address) 145 | export TAG_HISTORY_SERVICE_ACCOUNT="tag-history-collector" 146 | export
TAG_HISTORY_SERVICE_ACCOUNT_EMAIL="${TAG_HISTORY_SERVICE_ACCOUNT}@$(echo $PROJECT_ID | awk -F':' '{print $2"."$1}' | sed 's/^\.//').iam.gserviceaccount.com" 147 | ``` 148 | 149 | 2. Set the variables in your environment: 150 | 151 | ```shell script 152 | source env.sh 153 | ``` 154 | 155 | ### Creating resources 156 | 157 | #### Create BigQuery table 158 | 159 | 1. Set up a BigQuery dataset to store an Entry's tags when a change event occurs. 160 | 161 | Create a new BigQuery dataset in the [region](https://cloud.google.com/bigquery/docs/locations) of your choice: 162 | 163 | ```shell script 164 | bq --location ${BIGQUERY_REGION} \ 165 | --project_id=${PROJECT_ID} \ 166 | mk --dataset ${DATASET_ID} 167 | ``` 168 | 169 | 2. Create a BigQuery table for storing Tags by using the provided schema; this creates a BigQuery table with `lowerCamelCase` column names. 170 | 171 | ```shell script 172 | bq mk --table \ 173 | --project_id=${PROJECT_ID} \ 174 | --description "Catalog Tag snapshots" \ 175 | --time_partitioning_field "reconcileTime" \ 176 | "${DATASET_ID}.${TABLE_ID}" camelEntityTagOperationRecords.schema 177 | ``` 178 | 179 | 180 | > Use the following command if you want `snake_case` column names 181 | > ```shell script 182 | > bq mk --table \ 183 | > --project_id=${PROJECT_ID} \ 184 | > --description "Catalog Tag snapshots" \ 185 | > --time_partitioning_field "reconcile_time" \ 186 | > "${DATASET_ID}.${TABLE_ID}" snakeEntityTagOperationRecords.schema 187 | > ``` 188 | 189 | #### Configure Pub/Sub topic and subscription 190 | 191 | Pub/Sub is Google Cloud's global messaging bus for decoupling processing modules. 192 | 193 | 1. Create a Pub/Sub topic to receive audit log events: 194 | 195 | ```shell script 196 | gcloud pubsub topics create ${LOGS_SINK_TOPIC_ID} \ 197 | --project ${PROJECT_ID} 198 | ``` 199 | 200 |
Create a new Pub/Sub subscription. 202 | Using a subscription (instead of reading directly from the topic) with a Dataflow pipeline ensures that all messages are processed even when the pipeline is temporarily down for updates or maintenance. 203 | 204 | ```shell script 205 | gcloud pubsub subscriptions create ${LOGS_SUBSCRIPTION_ID} \ 206 | --topic=${LOGS_SINK_TOPIC_ID} \ 207 | --topic-project=${PROJECT_ID} 208 | ``` 209 | 210 | #### Configure Log sink 211 | 212 | [Cloud Logging](https://cloud.google.com/logging) is GCP's powerful log management control plane. 213 | 214 | Create a Log sink to send Data Catalog audit events to the Pub/Sub topic. Cloud Logging will then push new Data Catalog AuditLog entries to the topic for processing in real time. 215 | 216 | ```shell script 217 | gcloud logging sinks create ${LOGS_SINK_NAME} \ 218 | pubsub.googleapis.com/projects/${PROJECT_ID}/topics/${LOGS_SINK_TOPIC_ID} \ 219 | --log-filter="protoPayload.serviceName=\"datacatalog.googleapis.com\" \ 220 | AND protoPayload.\"@type\"=\"type.googleapis.com/google.cloud.audit.AuditLog\"" 221 | ``` 222 | 223 | Grant the Pub/Sub "Publisher" role to the Cloud Logging writer service account so that it can push log entries into the configured Pub/Sub topic. 224 | 225 | ```shell script 226 | # Identify the Logs writer service account 227 | export LOGGING_WRITER_IDENTITY="$(gcloud logging sinks describe ${LOGS_SINK_NAME} --format="get(writerIdentity)" --project ${PROJECT_ID})" 228 | 229 | # Grant Publish permission to the Logging writer 230 | gcloud pubsub topics add-iam-policy-binding ${LOGS_SINK_TOPIC_ID} \ 231 | --member=${LOGGING_WRITER_IDENTITY} \ 232 | --role='roles/pubsub.publisher' \ 233 | --project ${PROJECT_ID} 234 | ``` 235 | 236 | #### Create Service Accounts 237 | 238 | Running pipelines with fine-grained access control is recommended to improve access partitioning. 239 | If your project does not have a user-created service account, create one using the following instructions.
240 | 241 | > You can also create a service account in your browser by navigating to **IAM & Admin > [Service accounts](https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts?supportedpurview=project)** in the Google Cloud Console. 242 | 243 | 1. Create a service account to use as the user-managed controller service account for Dataflow: 244 | 245 | ```shell script 246 | gcloud iam service-accounts create ${TAG_HISTORY_SERVICE_ACCOUNT} \ 247 | --description="Service Account to run the DataCatalog tag history collection and recording pipeline." \ 248 | --display-name="Data Catalog History collection account" 249 | ``` 250 | 2. Create a custom role with the required permissions for accessing BigQuery, Pub/Sub, Dataflow, and Data Catalog: 251 | ```shell script 252 | export TAG_HISTORY_COLLECTOR_ROLE="tag_history_collector" 253 | 254 | gcloud iam roles create ${TAG_HISTORY_COLLECTOR_ROLE} --project=${PROJECT_ID} --file=tag_history_collector.yaml 255 | ``` 256 | 257 | 3. Apply the custom role to the service account: 258 | ```shell script 259 | gcloud projects add-iam-policy-binding ${PROJECT_ID} \ 260 | --member="serviceAccount:${TAG_HISTORY_SERVICE_ACCOUNT_EMAIL}" \ 261 | --role=projects/${PROJECT_ID}/roles/${TAG_HISTORY_COLLECTOR_ROLE} 262 | ``` 263 | 264 | 4. Assign the `dataflow.worker` role to allow the service account to run as a Dataflow worker: 265 | ```shell script 266 | gcloud projects add-iam-policy-binding ${PROJECT_ID} \ 267 | --member="serviceAccount:${TAG_HISTORY_SERVICE_ACCOUNT_EMAIL}" \ 268 | --role=roles/dataflow.worker 269 | ``` 270 | 271 | ### Deploy the Tag history recording pipeline 272 | 273 | 1. Create a Cloud Storage bucket for Dataflow to use as a temporary and staging location: 274 | 275 | ```shell script 276 | gsutil mb -l ${REGION_ID} \ 277 | -p ${PROJECT_ID} \ 278 | gs://${TEMP_GCS_BUCKET} 279 | ``` 280 | 281 | 2.
Launch the Dataflow pipeline using the following Maven command: 282 | 283 | ```shell script 284 | mvn clean generate-sources compile package exec:java \ 285 | -Dexec.mainClass=com.google.cloud.solutions.catalogtagrecording.PipelineLauncher \ 286 | -Dexec.cleanupDaemonThreads=false \ 287 | -Dmaven.test.skip=true \ 288 | -Dexec.args=" \ 289 | --streaming=true \ 290 | --project=${PROJECT_ID} \ 291 | --serviceAccount=${TAG_HISTORY_SERVICE_ACCOUNT_EMAIL} \ 292 | --runner=DataflowRunner \ 293 | --gcpTempLocation=gs://${TEMP_GCS_BUCKET}/temp/ \ 294 | --stagingLocation=gs://${TEMP_GCS_BUCKET}/staging/ \ 295 | --workerMachineType=n1-standard-1 \ 296 | --region=${REGION_ID} \ 297 | --tagsBigqueryTable=${PROJECT_ID}:${DATASET_ID}.${TABLE_ID} \ 298 | --catalogAuditLogsSubscription=projects/${PROJECT_ID}/subscriptions/${LOGS_SUBSCRIPTION_ID}" 299 | ``` 300 | 301 | > Add the `--snakeCaseColumnNames` flag if you used the `snake_case` column name schema in the [Create BigQuery table](#setup-bigquery) step. 302 | 303 | #### Pipeline DAG 304 | 305 | ![Recording Pipeline DAG](images/pipeline_dag.jpg) 306 | 307 | ### Manual Test 308 | 309 | Follow the guide to [Attach a Tag](https://cloud.google.com/data-catalog/docs/quickstart-tagging#data-catalog-quickstart-console) to a Data Catalog Entry to verify that the tool captures all the Tags attached to the modified Entry. 310 | 311 | 312 | ### Limitations 313 | 314 | * This implementation handles only the following operations: 315 |   * `CreateTag` 316 |   * `UpdateTag` 317 |   * `DeleteTag` 318 | * A single Data Catalog operation can create multiple tag record entries, because the operation emits multiple AuditLog events. 319 | * The tool polls the Data Catalog service for Entry/Tag information, because the audit logs don't contain the change information. This can result in some changes to an Entry or its Tags being missed.
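After attaching a tag in the manual test, you can check that change records are landing in the table. The sketch below only builds and prints a verification query; the sole column name it assumes is `reconcileTime` (the partition column created in the BigQuery setup step), everything else is `SELECT *` because the full column list depends on the schema file you chose. Run the printed query with the `bq` tool.

```shell script
# Build a verification query for the tag history table.
# Replace the placeholder values below, or `source env.sh` first so the
# real PROJECT_ID / DATASET_ID / TABLE_ID values are picked up.
PROJECT_ID="${PROJECT_ID:-your-project-id}"
DATASET_ID="${DATASET_ID:-your_dataset}"
TABLE_ID="${TABLE_ID:-EntityTagOperationRecords}"

QUERY="SELECT * FROM \`${PROJECT_ID}.${DATASET_ID}.${TABLE_ID}\` ORDER BY reconcileTime DESC LIMIT 10"
echo "${QUERY}"

# Run it with:
#   bq query --use_legacy_sql=false --project_id="${PROJECT_ID}" "${QUERY}"
```

If no rows appear a few minutes after attaching a tag, check the Dataflow job logs and confirm that the log sink and subscription were created in the same project as the pipeline.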
320 | 321 | ### Cleaning up 322 | 323 | To avoid incurring charges to your Google Cloud account for the resources used in this tutorial: 324 | 325 | #### Delete the project 326 | 327 | The easiest way to eliminate billing is to delete the project you created for the tutorial. 328 | 329 | > **Caution**: Deleting a project has the following effects: 330 | > 331 | > * Everything in the project is deleted. If you used an existing project for this tutorial, deleting it also deletes any other work you've done in the project. 332 | > * Custom project IDs are lost. When you created this project, you might have created a custom project ID that you want to use in the future. To preserve the URLs that use the project ID, such as an appspot.com URL, delete selected resources inside the project instead of deleting the whole project. 333 | > 334 | > If you plan to explore multiple tutorials and quickstarts, reusing projects can help you avoid exceeding project quota limits. 335 |
344 | 345 | 1. In the Cloud Console, go to the **Manage resources** page. 346 | 347 | [Go to the Manage resources page](https://console.cloud.google.com/iam-admin/projects) 348 | 349 | 2. In the project list, select the project that you want to delete and then click **Delete** ![delete](images/bin_icon.png). 350 | 3. In the dialog, type the project ID and then click **Shut down** to delete the project. 351 | 352 | ### License 353 | 354 | Apache 2.0 355 | 356 | This is not an official Google product. 357 | --------------------------------------------------------------------------------