├── CODE_OF_CONDUCT.md ├── CONTRIBUTING.md ├── LICENSE ├── README.md ├── cftemplates └── GPSTEC315_lab_setup.yaml └── lab-guides ├── cleanup.md ├── dms-cdc.md ├── dms-fl.md ├── dms-inst.md ├── images ├── 1.png ├── 2.png ├── 3.png ├── Picture1.png ├── Picture2.png ├── action-items.png ├── apply_db.png ├── assessment.png ├── cfn-output.png ├── cfn-verify-output.png ├── cfn-verify.png ├── convert_schema.png ├── create_conn.png ├── create_conn_aurora.png ├── create_ep.png ├── create_rep_inst.png ├── create_rep_inst_detail.png ├── create_sep.png ├── create_task.png ├── create_task_mappings.png ├── create_task_mappings_cdc.png ├── create_tep.png ├── dashboard-open.png ├── dms.png ├── drop_FK.png ├── drop_trigger.png ├── event-dashboard.png ├── final_verification.png ├── hash-code.png ├── initial_verification.png ├── instance-connect.png ├── migration_complete.png ├── migration_complete_cdc.png ├── migration_progress.png ├── new_project.png ├── new_project1.png ├── num.png ├── reference-architecture.png ├── schema_objects.png ├── sct.png ├── sct_aurora.png ├── sct_oracle.png ├── stack-progress.png ├── table_size.png ├── table_statistics.png ├── table_statistics_cdc.png └── verify_target_db.png ├── lab-setup-verification.md ├── lab-setup.md ├── num-dt.md ├── optional-resolv.md ├── optional-validation.md └── sct.md /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | ## Code of Conduct 2 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 3 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 4 | opensource-codeofconduct@amazon.com with any additional questions or comments. 
5 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing Guidelines 2 | 3 | Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional 4 | documentation, we greatly value feedback and contributions from our community. 5 | 6 | Please read through this document before submitting any issues or pull requests to ensure we have all the necessary 7 | information to effectively respond to your bug report or contribution. 8 | 9 | 10 | ## Reporting Bugs/Feature Requests 11 | 12 | We welcome you to use the GitHub issue tracker to report bugs or suggest features. 13 | 14 | When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already 15 | reported the issue. Please try to include as much information as you can. Details like these are incredibly useful: 16 | 17 | * A reproducible test case or series of steps 18 | * The version of our code being used 19 | * Any modifications you've made relevant to the bug 20 | * Anything unusual about your environment or deployment 21 | 22 | 23 | ## Contributing via Pull Requests 24 | Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that: 25 | 26 | 1. You are working against the latest source on the *master* branch. 27 | 2. You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already. 28 | 3. You open an issue to discuss any significant work - we would hate for your time to be wasted. 29 | 30 | To send us a pull request, please: 31 | 32 | 1. Fork the repository. 33 | 2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change. 34 | 3. Ensure local tests pass. 35 | 4. 
Commit to your fork using clear commit messages. 36 | 5. Send us a pull request, answering any default questions in the pull request interface. 37 | 6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation. 38 | 39 | GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and 40 | [creating a pull request](https://help.github.com/articles/creating-a-pull-request/). 41 | 42 | 43 | ## Finding contributions to work on 44 | Looking at the existing issues is a great way to find something to contribute to. Since our projects use the default GitHub issue labels (enhancement/bug/duplicate/help wanted/invalid/question/wontfix), looking at any 'help wanted' issues is a great place to start. 45 | 46 | 47 | ## Code of Conduct 48 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 49 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 50 | opensource-codeofconduct@amazon.com with any additional questions or comments. 51 | 52 | 53 | ## Security issue notifications 54 | If you discover a potential security issue in this project, we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public GitHub issue. 55 | 56 | 57 | ## Licensing 58 | 59 | See the [LICENSE](LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution. 60 | 61 | We may ask you to sign a [Contributor License Agreement (CLA)](http://en.wikipedia.org/wiki/Contributor_License_Agreement) for larger changes. 62 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright 2019 Amazon.com, Inc. or its affiliates. 
All Rights Reserved. 2 | 3 | Permission is hereby granted, free of charge, to any person obtaining a copy of 4 | this software and associated documentation files (the "Software"), to deal in 5 | the Software without restriction, including without limitation the rights to 6 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of 7 | the Software, and to permit persons to whom the Software is furnished to do so. 8 | 9 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 10 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS 11 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR 12 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER 13 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN 14 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 15 | 16 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # AWS Database Migration Workshop 2 | 3 | Amazon Aurora is a MySQL and PostgreSQL compatible relational database built for the cloud that combines the performance and availability of high-end commercial databases with the simplicity and cost-effectiveness of open-source databases. 4 | 5 | In this hands-on workshop, you learn to migrate Oracle databases to Aurora PostgreSQL with minimum downtime using the AWS Schema Conversion Tool (SCT) and AWS Database Migration Service (DMS). 
6 | 7 | This lab demonstrates how to use: 8 | 9 | - AWS Schema Conversion Tool (SCT) for database schema conversion 10 | - AWS Database Migration Service (DMS) for data migration 11 | 12 | ___ 13 | # Reference Architecture 14 | 15 | ![Reference Architecture](lab-guides/images/reference-architecture.png "Reference Architecture") 16 | ___ 17 | 18 | ## Prerequisites 19 | 20 | The following prerequisites must be met before you begin the workshop: 21 | 22 | - AWS Console Access 23 | - [RDP client](https://docs.aws.amazon.com/AWSEC2/latest/WindowsGuide/connecting_to_windows_instance.html?icmpid=docs_ec2_console#rdp-prereqs) 24 | ___ 25 | 26 | ## Workshop Activities 27 | 28 | 29 | 1. [Lab Setup](lab-guides/lab-setup.md) 30 | 2. [Schema Conversion Tool (SCT)](lab-guides/sct.md) 31 | 3. [DMS - Replication Instance & Endpoints](lab-guides/dms-inst.md) 32 | 4. [DMS - Replication Task for full-load](lab-guides/dms-fl.md) 33 | 5. [DMS - Replication Task for Change Data Capture (CDC)](lab-guides/dms-cdc.md) 34 | 6. [Validation - Using a web application](lab-guides/optional-validation.md) 35 | 7. [Fixing issues with numeric data types](lab-guides/num-dt.md) 36 | 8. [Optional: Resolving SCT action items](lab-guides/optional-resolv.md) 37 | 9. [Cleanup](lab-guides/cleanup.md) 38 | 39 | ## License 40 | 41 | This library is licensed under the MIT-0 License. See the LICENSE file. 
42 | 43 | ___ -------------------------------------------------------------------------------- /cftemplates/GPSTEC315_lab_setup.yaml: -------------------------------------------------------------------------------- 1 | AWSTemplateFormatVersion: 2010-09-09 2 | Description: Creates resources for the Oracle to Amazon Aurora migration workshop 3 | Mappings: 4 | RegionMap: 5 | us-east-1: 6 | "HVM64": "ami-001997e49afbbeb82" 7 | us-west-2: 8 | "HVM64": "ami-081ad13c8a0be0e0b" 9 | Resources: 10 | VPC: 11 | Type: 'AWS::EC2::VPC' 12 | Properties: 13 | EnableDnsSupport: 'true' 14 | EnableDnsHostnames: 'true' 15 | CidrBlock: 10.0.0.0/16 16 | Tags: 17 | - Key: Application 18 | Value: !Ref 'AWS::StackName' 19 | Subnet1: 20 | Type: 'AWS::EC2::Subnet' 21 | Properties: 22 | VpcId: !Ref VPC 23 | CidrBlock: 10.0.0.0/24 24 | AvailabilityZone: !Select [0, !GetAZs ''] 25 | Tags: 26 | - Key: Application 27 | Value: !Ref 'AWS::StackName' 28 | Subnet2: 29 | Type: 'AWS::EC2::Subnet' 30 | Properties: 31 | VpcId: !Ref VPC 32 | CidrBlock: 10.0.1.0/24 33 | AvailabilityZone: !Select [1, !GetAZs ''] 34 | Tags: 35 | - Key: Application 36 | Value: !Ref 'AWS::StackName' 37 | Subnet3: 38 | Type: 'AWS::EC2::Subnet' 39 | Properties: 40 | VpcId: !Ref VPC 41 | CidrBlock: 10.0.2.0/24 42 | AvailabilityZone: !Select [2, !GetAZs ''] 43 | Tags: 44 | - Key: Application 45 | Value: !Ref 'AWS::StackName' 46 | TargetDBSubnetGroup: 47 | Type: 'AWS::RDS::DBSubnetGroup' 48 | Properties: 49 | DBSubnetGroupDescription: Subnets available for the RDS DB Instance 50 | SubnetIds: 51 | - !Ref Subnet1 52 | - !Ref Subnet2 53 | - !Ref Subnet3 54 | InternetGateway: 55 | Type: 'AWS::EC2::InternetGateway' 56 | Properties: 57 | Tags: 58 | - Key: Application 59 | Value: !Ref 'AWS::StackName' 60 | AttachGateway: 61 | Type: 'AWS::EC2::VPCGatewayAttachment' 62 | Properties: 63 | VpcId: !Ref VPC 64 | InternetGatewayId: !Ref InternetGateway 65 | RouteTable: 66 | Type: 'AWS::EC2::RouteTable' 67 | Properties: 68 | VpcId: !Ref VPC 69 | 
Tags: 70 | - Key: Application 71 | Value: !Ref 'AWS::StackName' 72 | Route: 73 | Type: 'AWS::EC2::Route' 74 | DependsOn: AttachGateway 75 | Properties: 76 | RouteTableId: !Ref RouteTable 77 | DestinationCidrBlock: 0.0.0.0/0 78 | GatewayId: !Ref InternetGateway 79 | Subnet1RouteTableAssociation: 80 | Type: 'AWS::EC2::SubnetRouteTableAssociation' 81 | Properties: 82 | SubnetId: !Ref Subnet1 83 | RouteTableId: !Ref RouteTable 84 | Subnet2RouteTableAssociation: 85 | Type: 'AWS::EC2::SubnetRouteTableAssociation' 86 | Properties: 87 | SubnetId: !Ref Subnet2 88 | RouteTableId: !Ref RouteTable 89 | Subnet3RouteTableAssociation: 90 | Type: 'AWS::EC2::SubnetRouteTableAssociation' 91 | Properties: 92 | SubnetId: !Ref Subnet3 93 | RouteTableId: !Ref RouteTable 94 | EC2Instance: 95 | Type: 'AWS::EC2::Instance' 96 | Properties: 97 | InstanceType: m5.xlarge 98 | ImageId: !FindInMap [RegionMap, !Ref "AWS::Region", HVM64] 99 | Tags: 100 | - Key: Name 101 | Value: OracleXE-SCT 102 | NetworkInterfaces: 103 | - AssociatePublicIpAddress: 'true' 104 | DeleteOnTermination: 'true' 105 | DeviceIndex: 0 106 | SubnetId: !Ref Subnet1 107 | GroupSet: 108 | - !Ref InstanceSecurityGroup 109 | InstanceSecurityGroup: 110 | Type: 'AWS::EC2::SecurityGroup' 111 | Properties: 112 | GroupDescription: Enable RDP access via port 3389 113 | VpcId: !Ref VPC 114 | SecurityGroupIngress: 115 | - IpProtocol: tcp 116 | FromPort: '3389' 117 | ToPort: '3389' 118 | CidrIp: 0.0.0.0/0 119 | - IpProtocol: tcp 120 | FromPort: '1521' 121 | ToPort: '1521' 122 | CidrIp: 10.0.0.0/16 123 | AuroraPostgreSQLSecurityGroup: 124 | Type: 'AWS::EC2::SecurityGroup' 125 | Properties: 126 | VpcId: !Ref VPC 127 | GroupDescription: Aurora PostgreSQL Security Group 128 | SecurityGroupIngress: 129 | - IpProtocol: tcp 130 | FromPort: '5432' 131 | ToPort: '5432' 132 | CidrIp: 10.0.0.0/16 133 | AuroraPostgreSQLDBCluster: 134 | Type: 'AWS::RDS::DBCluster' 135 | Properties: 136 | DBSubnetGroupName: !Ref TargetDBSubnetGroup 137 | 
DBClusterIdentifier: !Ref 'AWS::StackName' 138 | VpcSecurityGroupIds: 139 | - !GetAtt 140 | - AuroraPostgreSQLSecurityGroup 141 | - GroupId 142 | Engine: aurora-postgresql 143 | EngineVersion: '10.7' 144 | DatabaseName: AuroraPostgreSQLDB 145 | DBClusterParameterGroupName: default.aurora-postgresql10 146 | MasterUsername: postgres 147 | MasterUserPassword: Aurora321 148 | Port: '5432' 149 | BackupRetentionPeriod: '7' 150 | DependsOn: AuroraPostgreSQLSecurityGroup 151 | AuroraPostgreSQLDBInstance: 152 | Type: 'AWS::RDS::DBInstance' 153 | Properties: 154 | Engine: aurora-postgresql 155 | EngineVersion: '10.7' 156 | DBClusterIdentifier: !Ref AuroraPostgreSQLDBCluster 157 | DBSubnetGroupName: !Ref TargetDBSubnetGroup 158 | AutoMinorVersionUpgrade: 'false' 159 | CopyTagsToSnapshot: 'true' 160 | DBInstanceClass: db.r5.xlarge 161 | PubliclyAccessible: 'true' 162 | Outputs: 163 | VpcId: 164 | Description: 'VPC Id' 165 | Value: !Ref VPC 166 | OracleSCTInstancePrivateIP: 167 | Description: Private IP address of the newly created Oracle DB and SCT EC2 instance 168 | Value: !GetAtt 169 | - EC2Instance 170 | - PrivateIp 171 | AuroraPostgreSQLEndpoint: 172 | Description: Connection Endpoint for the newly created AuroraPostgreSQL RDS instance 173 | Value: !GetAtt 174 | - AuroraPostgreSQLDBCluster 175 | - Endpoint.Address -------------------------------------------------------------------------------- /lab-guides/cleanup.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md) 2 | 3 | ___ 4 | 5 | # 7. Cleanup 6 | 7 | Congratulations on successfully completing the workshop. Now, it is time to clean up your environment. 8 | 9 | ___ 10 | 11 | ## Cleanup steps 12 | 13 | 1. Delete the DMS task. 14 | 2. Delete the DMS endpoints. 15 | 3. Delete the DMS replication instance. 16 | 4. Delete the CloudFormation stack. (Delete the DMS resources first, so that the stack's VPC can be deleted cleanly.) 
17 | 18 | ___ 19 | 20 | [Back to main guide](../README.md) 21 | -------------------------------------------------------------------------------- /lab-guides/dms-cdc.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](optional-validation.md) 2 | 3 | ___ 4 | 5 | # Run a DMS Replication Task for Change Data Capture (CDC) 6 | 7 | You can use AWS Database Migration Service to perform continuous data replication. This helps you migrate your databases to AWS with virtually no downtime. All data changes to the source database that occur during the migration are continuously replicated to the target, allowing the source database to be fully operational during the migration process. After the database migration is complete, the target database will remain synchronized with the source for as long as you choose, allowing you to switch over the database at a convenient time. 8 | 9 | To capture change data, the source database must be in ARCHIVELOG mode and supplemental logging must be enabled. 10 | 11 | Refer to [Using Oracle LogMiner or Oracle Binary Reader for Change Data Capture (CDC)](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Source.Oracle.html#CHAP_Source.Oracle.CDC) for more details on configuring the source database for CDC. 
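For reference, the ARCHIVELOG and supplemental logging prerequisites described above can be sketched as the following SQL*Plus session, run as SYSDBA on a self-managed Oracle source database. This is a sketch only — the exact set of statements your environment requires is an assumption here, so defer to the DMS documentation linked above; in this lab, these changes are already in place.

````
-- Sketch: run as SYSDBA on a self-managed Oracle source (already done for this lab)
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE ARCHIVELOG;       -- enable ARCHIVELOG mode
ALTER DATABASE OPEN;
-- Database-level supplemental logging, required by DMS
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
-- Primary-key supplemental logging, so DMS can identify changed rows
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
````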
12 | 13 | _**Note: For this workshop, we have already made the configuration changes to the source Oracle database to support CDC.**_ 14 | 15 | In this activity, you perform the following tasks: 16 | 17 | - Run a DMS Replication Task for Change Data Capture (CDC) 18 | - Enable CDC on the source database 19 | - Set up and run a Replication Task 20 | - Introduce changes at the source database 21 | - Validate the CDC result at the target database 22 | 23 | ___ 24 | 25 | ## Task 1 - Configure and run a CDC Replication Task 26 | In this part of the lab, you are going to create another database migration task that captures data changes from the source Oracle database and replicates them to the target Aurora PostgreSQL database. 27 | 28 | 1. Click on **Database migration tasks** on the navigation menu, then click on the **Create task** button. 29 | 30 | ![Create replication task](images/create_task.png) 31 | 32 | 33 | 2. Create a data migration task with the following values for migrating the `HR` database. 34 | 35 | Parameter | Value 36 | --- | --- 37 | Task identifier | oracle-migration-task-cdc 38 | Replication instance | your replication instance 39 | Source database endpoint | oracle-source 40 | Target database endpoint | aurora-postgresql-target 41 | Migration type | Replicate data changes only 42 | Start task on create | Checked 43 | CDC start mode | Don’t use custom CDC start mode 44 | CDC stop mode | Don’t use custom CDC stop mode 45 | Create recovery table on target DB | Unchecked 46 | Target table preparation mode | Do nothing 47 | Include LOB columns in replication | Limited LOB mode 48 | Max LOB size (KB) | 32 49 | Enable validation | Unchecked 50 | Enable CloudWatch logs | Checked 51 | 52 | *Enabling logging helps you debug issues that DMS encounters during data migration.* 53 | 54 | 3. Expand the **Table mappings** section, and select **Guided UI** for the editing mode. 55 | 4. 
The table mappings below are the same as those used for the previous full-load replication task. Click on the **Add new selection rule** button and enter the following values. 56 | 57 | Parameter | Value 58 | ----- | ----- 59 | Schema name | HR 60 | Table name | % 61 | Action | Include 62 | 63 | _Hint: click “Enter a schema” from the drop-down to enter the schema name._ 64 | 65 | 5. Next, expand the Transformation rules section, and click on Add new transformation rule. Then, create the following rules: 66 | 67 | Parameter | Value 68 | -------- | -------- 69 | Target | Schema 70 | Schema name | HR 71 | Action | Make lowercase 72 | 73 | Parameter | Value 74 | -------- | -------- 75 | Target | Table 76 | Schema Name | HR 77 | Table Name | % 78 | Action | Make lowercase 79 | 80 | Parameter | Value 81 | -------- | -------- 82 | Target | Column 83 | Schema Name | HR 84 | Table Name | % 85 | Column Name | % 86 | Action | Make lowercase 87 | 88 | Verify that your DMS task configuration is the same as in the following screenshot. 89 | 90 | ![Create task mappings](images/create_task_mappings_cdc.png) 91 | 92 | 6. After entering the values, click on **Create task**. 93 | 94 | 7. Wait until the task status changes to **Replication ongoing**; this may take a couple of minutes. 95 | 96 | ![Migration Task Progress](images/migration_complete_cdc.png) 97 | 98 | ___ 99 | 100 | ## Task 2 - Validate the ongoing data replication (CDC) 101 | 102 | 1. In SQL Developer, connect to the source Oracle database. 103 | 2. Verify the existing records in the `REGIONS` table in the `HR` schema. 104 | 105 | ```` 106 | SELECT * FROM HR.REGIONS; 107 | ```` 108 | 109 | ![Initial verification](images/initial_verification.png) 110 | 111 | 3. Add two new rows to the `REGIONS` table. 112 | 113 | ```` 114 | INSERT INTO HR.REGIONS VALUES (5,'APAC'); 115 | 116 | INSERT INTO HR.REGIONS VALUES (6,'LATIN AMERICA'); 117 | 118 | COMMIT WORK; 119 | 120 | ```` 121 | 122 | 4. 
In SQL Developer, connect to the target Aurora PostgreSQL database. 123 | 124 | 5. Verify that the changes have been replicated to the `regions` table in the target Aurora PostgreSQL database. 125 | 126 | ```` 127 | SELECT * FROM hr.regions; 128 | ```` 129 | 130 | ![Final verification](images/final_verification.png) 131 | 132 | 6. You can verify the number of inserts, deletes, updates, and DDLs by checking the **Table statistics** of the CDC task (**oracle-migration-task-cdc**). 133 | 134 | ![Table statistics](images/table_statistics_cdc.png) 135 | 136 | ___ 137 | 138 | ## Conclusion 139 | This part of the workshop demonstrated continuous database replication using Change Data Capture (CDC) in real time. 140 | ___ 141 | 142 | [Back to main guide](../README.md)|[Next](optional-validation.md) -------------------------------------------------------------------------------- /lab-guides/dms-fl.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](dms-cdc.md) 2 | 3 | ___ 4 | 5 | # Run the DMS Replication Task for full load (replicating the initial data) 6 | 7 | In this activity, you perform the following tasks: 8 | 9 | - Run a DMS Replication Task for full load (migration of initial data) 10 | - Check the source database content for post-replication validation 11 | - Drop foreign keys on the target database 12 | - Set up and run a full-load Replication Task 13 | - Validate the data replication result 14 | - Restore the foreign keys 15 | 16 | ![DMS](images/dms.png) 17 | 18 | ___ 19 | 20 | ## Task 1 - Check the source database content for post-replication validation 21 | 22 | 1. Connect to the **OracleXE-SCT** EC2 instance using the following password: 23 | **Windows password**: GPSreInvent@321 24 | 2. Launch **SQL Developer** from the shortcut on the desktop. 25 | 3. Right-click on `XE` under Connections and select Properties to verify the following parameters. 
26 | 27 | Parameter | Value 28 | --- | --- 29 | Connection Name | XE 30 | Username | hr 31 | Password | hr123 32 | Save Password | checked 33 | Hostname | localhost 34 | Port | 1521 35 | SID | XE 36 | 37 | ![SQLTargetDB creation](images/create_conn.png) 38 | 39 | 4. After you have connected to the Oracle database, right-click on the **XE** connection and click Open Worksheet. 40 | 5. Run the following query in the SQL window, by clicking the green play button, to get a count of the rows in the tables. 41 | 42 | ```` 43 | SELECT 'regions' TABLE_NAME, COUNT(*) FROM HR.REGIONS UNION 44 | SELECT 'locations' TABLE_NAME, COUNT(*) FROM HR.LOCATIONS UNION 45 | SELECT 'departments' TABLE_NAME, COUNT(*) FROM HR.DEPARTMENTS UNION 46 | SELECT 'jobs' TABLE_NAME, COUNT(*) FROM HR.JOBS UNION 47 | SELECT 'employees' TABLE_NAME, COUNT(*) FROM HR.EMPLOYEES UNION 48 | SELECT 'job_history' TABLE_NAME, COUNT(*) FROM HR.JOB_HISTORY UNION 49 | SELECT 'countries' TABLE_NAME, COUNT(*) FROM HR.COUNTRIES; 50 | ```` 51 | ![Read table size](images/table_size.png) 52 | 53 | ___ 54 | 55 | ## Task 2 - Configure the target database schema for full load replication 56 | Before running the DMS replication task, you need to drop the foreign keys on the target database. This prevents the migration task from failing due to the referential integrity constraints enforced by the foreign keys. 57 | 58 | 1. Right-click on `AuroraPostgreSQL` under Connections and select Properties to **modify** the following parameters. After modifying the connection according to the following table, click ‘Test’ to test the connection, then click ‘Save’, and finally click ‘Connect’. 
59 | 60 | Parameter | Value 61 | --- | --- 62 | Connection Name | AuroraPostgreSQL 63 | Username | postgres 64 | Password | Aurora321 65 | Save Password | checked 66 | Hostname | Get `AuroraPostgreSQLEndpoint` from [CloudFormation stack output](./lab-setup-verification.md#cloudformation-stack-outputs) 67 | Port | 5432 68 | Database name | AuroraPostgreSQLDB 69 | 70 | ![Aurora Connection](images/create_conn_aurora.png) 71 | 72 | 2. After you have connected to the Aurora database, right-click on the **AuroraPostgreSQL** connection and click Open Worksheet. 73 | 3. Run the following query in the SQL window to get a count of the rows in the tables by clicking the green play button. This query returns a row count of zero for each table, since the data has not yet been migrated to the target database. 74 | 75 | ```` 76 | SELECT 'regions' TABLE_NAME, COUNT(*) FROM HR.REGIONS UNION 77 | SELECT 'locations' TABLE_NAME, COUNT(*) FROM HR.LOCATIONS UNION 78 | SELECT 'departments' TABLE_NAME, COUNT(*) FROM HR.DEPARTMENTS UNION 79 | SELECT 'jobs' TABLE_NAME, COUNT(*) FROM HR.JOBS UNION 80 | SELECT 'employees' TABLE_NAME, COUNT(*) FROM HR.EMPLOYEES UNION 81 | SELECT 'job_history' TABLE_NAME, COUNT(*) FROM HR.JOB_HISTORY UNION 82 | SELECT 'countries' TABLE_NAME, COUNT(*) FROM HR.COUNTRIES; 83 | ```` 84 | 85 | 4. Drop the foreign keys on the target Aurora database by running the following statements in the SQL window. 
86 | _Hint: **Select all the statements** before you run them by clicking the green play button._ 87 | 88 | ``` 89 | ALTER TABLE hr.countries DROP CONSTRAINT country_reg_fk; 90 | ALTER TABLE hr.departments DROP CONSTRAINT dept_loc_fk; 91 | ALTER TABLE hr.departments DROP CONSTRAINT dept_mgr_fk; 92 | ALTER TABLE hr.employees DROP CONSTRAINT emp_dept_fk; 93 | ALTER TABLE hr.employees DROP CONSTRAINT emp_job_fk; 94 | ALTER TABLE hr.employees DROP CONSTRAINT emp_manager_fk; 95 | ALTER TABLE hr.job_history DROP CONSTRAINT jhist_dept_fk; 96 | ALTER TABLE hr.job_history DROP CONSTRAINT jhist_emp_fk; 97 | ALTER TABLE hr.job_history DROP CONSTRAINT jhist_job_fk; 98 | ALTER TABLE hr.locations DROP CONSTRAINT loc_c_id_fk; 99 | ``` 100 | 101 | ![Drop foreign keys](images/drop_FK.png) 102 | 103 | ___ 104 | 105 | ## Task 3 - Configure and run Replication Task 106 | AWS DMS uses a Replication Task to migrate the data from the source to the target database. In this part of the lab, you are going to create a Replication Task for migrating the existing data. 107 | 108 | 1. Click on **Database migration tasks** on the navigation menu, then click on the **Create task** button. 109 | 110 | ![Create replication task](images/create_task.png) 111 | 112 | 2. Create a data migration task with the following values for migrating the `HR` database. 
113 | 114 | Parameter | Value 115 | --- | --- 116 | Task identifier | oracle-migration-task 117 | Replication instance | your replication instance 118 | Source database endpoint | oracle-source 119 | Target database endpoint | aurora-postgresql-target 120 | Migration type | Migrate existing data 121 | Start task on create | Checked 122 | Target table preparation mode | Truncate 123 | Include LOB columns in replication | Limited LOB mode 124 | Max LOB size (KB) | 32 125 | Enable validation | Unchecked 126 | Enable CloudWatch logs | Checked 127 | 128 | *Enabling logging helps you debug issues that DMS encounters during data migration.* 129 | 130 | 3. Expand the **Table mappings** section, and select **Guided UI** for the editing mode. Table mappings use several types of rules to specify the data source, source schema, data, and any transformations that should occur during the migration. 131 | 4. Click on the **Add new selection rule** button and enter the following values: 132 | You can use selection rules to choose the schemas and/or tables you want to include in, or exclude from, the migration. For this workshop, you include all the tables under the `HR` schema. 133 | 134 | Parameter | Value 135 | ----- | ----- 136 | Schema name | HR 137 | Table name | % 138 | Action | Include 139 | 140 | _Hint: click “Enter a schema” from the drop-down to enter the schema name._ 141 | 142 | 5. Next, expand the Transformation rules section, and click on Add new transformation rule. Then, create the following rules: 143 | You can use transformation rules to modify the data written to the target database. For this workshop, you transform the source schema, table, and column names to lowercase to match the target schema. 
144 | 145 | Parameter | Value 146 | -------- | -------- 147 | Target | Schema 148 | Schema name | HR 149 | Action | Make lowercase 150 | 151 | Parameter | Value 152 | -------- | -------- 153 | Target | Table 154 | Schema Name | HR 155 | Table Name | % 156 | Action | Make lowercase 157 | 158 | Parameter | Value 159 | -------- | -------- 160 | Target | Column 161 | Schema Name | HR 162 | Table Name | % 163 | Column Name | % 164 | Action | Make lowercase 165 | 166 | Verify that your DMS task configuration is the same as in the following screenshot. 167 | 168 | ![Create task mappings](images/create_task_mappings.png) 169 | 170 | 6. After entering the values, click on **Create task**. 171 | 7. At this point, the task should start migrating data from the source Oracle database to the Amazon Aurora RDS instance. 172 | ![Migration Task Progress](images/migration_progress.png) 173 | 8. Go to **Database migration tasks** to monitor the task progress. This may take a couple of minutes; wait until the task status becomes **Load complete**, at which point your data has been migrated to the target database. 174 | 175 | ![Migration Task Progress](images/migration_complete.png) 176 | 177 | ___ 178 | 179 | ## Task 4 - Validate the migration result 180 | 181 | 1. Click on your task **oracle-migration-task** and scroll to the **Table statistics** section to view how many rows have been moved. 182 | ![Table statistics](images/table_statistics.png) 183 | 2. If there is any error, the status color changes from green to red. Click on the **View logs** link for the logs. 184 | 3. Right-click on the **AuroraPostgreSQL** connection and click Open Worksheet. 185 | 4. Run the following query in the SQL window to get a count of the rows in the tables by clicking the green play button. 
186 | 187 | ```` 188 | SELECT 'regions' TABLE_NAME, COUNT(*) FROM HR.REGIONS UNION 189 | SELECT 'locations' TABLE_NAME, COUNT(*) FROM HR.LOCATIONS UNION 190 | SELECT 'departments' TABLE_NAME, COUNT(*) FROM HR.DEPARTMENTS UNION 191 | SELECT 'jobs' TABLE_NAME, COUNT(*) FROM HR.JOBS UNION 192 | SELECT 'employees' TABLE_NAME, COUNT(*) FROM HR.EMPLOYEES UNION 193 | SELECT 'job_history' TABLE_NAME, COUNT(*) FROM HR.JOB_HISTORY UNION 194 | SELECT 'countries' TABLE_NAME, COUNT(*) FROM HR.COUNTRIES; 195 | ```` 196 | 197 | 5. Now you should be able to see that the data has been migrated, and the row counts on the Oracle source and the Aurora target match. 198 | ![Verify Target database](images/verify_target_db.png) 199 | 200 | ___ 201 | 202 | ## Task 5 - Restore the foreign keys 203 | 1. After the full load is complete, re-create the foreign key constraints on the target database. Run the following statements in the SQL window to restore the foreign keys. 204 | _Hint: **Select all the statements** before you run them by clicking the green play button._ 205 | 206 | ``` 207 | ALTER TABLE hr.locations 208 | ADD CONSTRAINT loc_c_id_fk FOREIGN KEY (country_id) 209 | REFERENCES hr.countries (country_id) 210 | ON DELETE NO ACTION; 211 | 212 | ALTER TABLE hr.countries 213 | ADD CONSTRAINT country_reg_fk FOREIGN KEY (region_id) 214 | REFERENCES hr.regions (region_id) 215 | ON DELETE NO ACTION; 216 | 217 | ALTER TABLE hr.employees 218 | ADD CONSTRAINT emp_dept_fk FOREIGN KEY (department_id) 219 | REFERENCES hr.departments (department_id) 220 | ON DELETE NO ACTION; 221 | 222 | ALTER TABLE hr.job_history 223 | ADD CONSTRAINT jhist_dept_fk FOREIGN KEY (department_id) 224 | REFERENCES hr.departments (department_id) 225 | ON DELETE NO ACTION; 226 | 227 | ALTER TABLE hr.departments 228 | ADD CONSTRAINT dept_loc_fk FOREIGN KEY (location_id) 229 | REFERENCES hr.locations (location_id) 230 | ON DELETE NO ACTION; 231 | 232 | ALTER TABLE hr.departments 233 | ADD CONSTRAINT dept_mgr_fk FOREIGN KEY 
(manager_id) 234 | REFERENCES hr.employees (employee_id) 235 | ON DELETE NO ACTION; 236 | 237 | ALTER TABLE hr.employees 238 | ADD CONSTRAINT emp_manager_fk FOREIGN KEY (manager_id) 239 | REFERENCES hr.employees (employee_id) 240 | ON DELETE NO ACTION; 241 | 242 | ALTER TABLE hr.job_history 243 | ADD CONSTRAINT jhist_emp_fk FOREIGN KEY (employee_id) 244 | REFERENCES hr.employees (employee_id) 245 | ON DELETE NO ACTION; 246 | 247 | ALTER TABLE hr.employees 248 | ADD CONSTRAINT emp_job_fk FOREIGN KEY (job_id) 249 | REFERENCES hr.jobs (job_id) 250 | ON DELETE NO ACTION; 251 | 252 | ALTER TABLE hr.job_history 253 | ADD CONSTRAINT jhist_job_fk FOREIGN KEY (job_id) 254 | REFERENCES hr.jobs (job_id) 255 | ON DELETE NO ACTION; 256 | 257 | ``` 258 | ___ 259 | 260 | ## Conclusion 261 | This part of the workshop demonstrated a heterogeneous database migration from Oracle to Aurora PostgreSQL using AWS Database Migration Service (DMS). 262 | ___ 263 | 264 | [Back to main guide](../README.md)|[Next](dms-cdc.md) -------------------------------------------------------------------------------- /lab-guides/dms-inst.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](dms-fl.md) 2 | 3 | ___ 4 | 5 | # Database Migration Service (DMS) 6 | 7 | The AWS Database Migration Service helps you migrate databases to AWS easily and securely. The source database remains fully operational during the migration, minimizing downtime to applications that rely on the database. The AWS Database Migration Service can migrate your data to and from most widely used commercial and open-source databases. AWS Database Migration Service can also be used for continuous data replication with high availability. 8 | 9 | This lab will walk you through the steps to create a DMS replication instance and endpoints. 
10 | 11 | In this activity, you perform the following tasks: 12 | 13 | - Create a DMS Replication Instance 14 | - Create DMS source and target endpoints 15 | 16 | ![DMS](images/dms.png) 17 | ___ 18 | 19 | ## Task 1 - Create a DMS Replication Instance 20 | 21 | 1. Go to the [AWS DMS console](https://console.aws.amazon.com/dms/v2/) and click on **Replication instances** in the navigation menu. This will launch the **Replication instance** screen in the Database Migration Service. 22 | 2. Click on the **Create replication instance** button on the top right side. 23 | 24 | ![Create replication instance](images/create_rep_inst.png) 25 | 26 | 3. Configure the replication instance with the following parameter values. Then, click on the **Create** button. 27 | 28 | **_Note: Make sure you select 2.4.5 as the replication engine version._** 29 | 30 | Parameter | Value 31 | --- | --- 32 | Name | replication-instance 33 | Description | Oracle to Aurora DMS replication instance 34 | Instance Class | dms.c4.xlarge 35 | Replication engine version | **2.4.5** 36 | VPC | vpc-xxxxxxxxx (VpcId from [CloudFormation stack output](./lab-setup-verification.md#cloudformation-stack-outputs)) 37 | Allocated storage (GB) | Leave default 38 | Multi-AZ | Unchecked 39 | Publicly accessible | Unchecked 40 | 41 | ![Create replication instance](images/create_rep_inst_detail.png) 42 | 43 | _Note: Creation of the replication instance takes a few minutes. While waiting for it to be created, you can proceed with creating the source and target database endpoints in the next step. **However, you can test the endpoint connectivity only after the replication instance has been created**._ 44 | 45 | ___ 46 | 47 | ## Task 2 - Create DMS source and target endpoints 48 | 49 | 1. Click on the **Endpoints** link on the left menu, and then click on **Create endpoint** on the top right corner. 50 | 51 | ![Create Endpoints](images/create_ep.png) 52 | 53 | 2. 
Enter the connection details for the source endpoint as shown in the following table. 54 | 55 | Parameter | Value 56 | --- | --- 57 | Endpoint type | Source endpoint 58 | Endpoint identifier | oracle-source 59 | Source engine | oracle 60 | Server name | Get `OracleSCTInstancePrivateIP` from [CloudFormation stack output](./lab-setup-verification.md#cloudformation-stack-outputs) 61 | Port | 1521 62 | SSL mode | none 63 | User name | hr 64 | Password | hr123 65 | SID | XE 66 | 67 | ![Create Source Endpoints](images/create_sep.png) 68 | 69 | 3. Once the information has been entered, click **Run Test** under **Test endpoint connection (optional)**. When the status turns to successful, click **Create endpoint**. 70 | 71 | _Note: You can test the endpoint connectivity only after the replication instance has been created; wait for the replication instance creation to complete._ 72 | 73 | 4. Repeat the previous steps to create the target endpoint for the Aurora PostgreSQL database with the following values. 74 | 75 | Parameter | Value 76 | --- | --- 77 | Endpoint type | Target endpoint 78 | Endpoint identifier | aurora-postgresql-target 79 | Target engine | aurora-postgresql 80 | Server name | [Get `AuroraPostgreSQLEndpoint` from CloudFormation stack output](./lab-setup-verification.md#cloudformation-stack-outputs) 81 | Port | 5432 82 | SSL mode | none 83 | User name | postgres 84 | Password | Aurora321 85 | Database name | AuroraPostgreSQLDB 86 | 87 | ![Create Target Endpoints](images/create_tep.png) 88 | 89 | 5. Once the information has been entered, click **Run Test** under **Test endpoint connection (optional)**. When the status turns to successful, click **Create endpoint**.
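If an endpoint test fails, it helps to first confirm that the databases themselves accept the credentials above. As an illustrative sanity check (using the same users, passwords, and database names listed in the tables), you can run a trivial query against each database from a SQL client:

```
-- On the Oracle source (connect as hr/hr123 to SID XE):
SELECT 1 FROM DUAL;

-- On the Aurora PostgreSQL target (connect as postgres to AuroraPostgreSQLDB):
SELECT version();
```

If either query fails, fix the database connectivity or credentials before re-running the DMS endpoint test.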
90 | 91 | ___ 92 | 93 | [Back to main guide](../README.md)|[Next](dms-fl.md) -------------------------------------------------------------------------------- /lab-guides/images/1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/1.png -------------------------------------------------------------------------------- /lab-guides/images/2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/2.png -------------------------------------------------------------------------------- /lab-guides/images/3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/3.png -------------------------------------------------------------------------------- /lab-guides/images/Picture1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/Picture1.png -------------------------------------------------------------------------------- /lab-guides/images/Picture2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/Picture2.png -------------------------------------------------------------------------------- /lab-guides/images/action-items.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/action-items.png -------------------------------------------------------------------------------- /lab-guides/images/apply_db.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/apply_db.png -------------------------------------------------------------------------------- /lab-guides/images/assessment.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/assessment.png -------------------------------------------------------------------------------- /lab-guides/images/cfn-output.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/cfn-output.png -------------------------------------------------------------------------------- /lab-guides/images/cfn-verify-output.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/cfn-verify-output.png -------------------------------------------------------------------------------- /lab-guides/images/cfn-verify.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/cfn-verify.png -------------------------------------------------------------------------------- /lab-guides/images/convert_schema.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/convert_schema.png -------------------------------------------------------------------------------- /lab-guides/images/create_conn.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_conn.png -------------------------------------------------------------------------------- /lab-guides/images/create_conn_aurora.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_conn_aurora.png -------------------------------------------------------------------------------- /lab-guides/images/create_ep.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_ep.png -------------------------------------------------------------------------------- /lab-guides/images/create_rep_inst.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_rep_inst.png -------------------------------------------------------------------------------- /lab-guides/images/create_rep_inst_detail.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_rep_inst_detail.png -------------------------------------------------------------------------------- /lab-guides/images/create_sep.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_sep.png -------------------------------------------------------------------------------- /lab-guides/images/create_task.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_task.png -------------------------------------------------------------------------------- /lab-guides/images/create_task_mappings.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_task_mappings.png -------------------------------------------------------------------------------- /lab-guides/images/create_task_mappings_cdc.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_task_mappings_cdc.png -------------------------------------------------------------------------------- /lab-guides/images/create_tep.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/create_tep.png -------------------------------------------------------------------------------- /lab-guides/images/dashboard-open.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/dashboard-open.png -------------------------------------------------------------------------------- /lab-guides/images/dms.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/dms.png -------------------------------------------------------------------------------- /lab-guides/images/drop_FK.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/drop_FK.png -------------------------------------------------------------------------------- /lab-guides/images/drop_trigger.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/drop_trigger.png -------------------------------------------------------------------------------- /lab-guides/images/event-dashboard.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/event-dashboard.png -------------------------------------------------------------------------------- /lab-guides/images/final_verification.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/final_verification.png -------------------------------------------------------------------------------- /lab-guides/images/hash-code.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/hash-code.png -------------------------------------------------------------------------------- /lab-guides/images/initial_verification.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/initial_verification.png -------------------------------------------------------------------------------- /lab-guides/images/instance-connect.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/instance-connect.png -------------------------------------------------------------------------------- /lab-guides/images/migration_complete.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/migration_complete.png -------------------------------------------------------------------------------- /lab-guides/images/migration_complete_cdc.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/migration_complete_cdc.png -------------------------------------------------------------------------------- /lab-guides/images/migration_progress.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/migration_progress.png -------------------------------------------------------------------------------- /lab-guides/images/new_project.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/new_project.png -------------------------------------------------------------------------------- /lab-guides/images/new_project1.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/new_project1.png -------------------------------------------------------------------------------- /lab-guides/images/num.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/num.png -------------------------------------------------------------------------------- /lab-guides/images/reference-architecture.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/reference-architecture.png -------------------------------------------------------------------------------- /lab-guides/images/schema_objects.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/schema_objects.png -------------------------------------------------------------------------------- /lab-guides/images/sct.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/sct.png -------------------------------------------------------------------------------- /lab-guides/images/sct_aurora.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/sct_aurora.png -------------------------------------------------------------------------------- /lab-guides/images/sct_oracle.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/sct_oracle.png -------------------------------------------------------------------------------- /lab-guides/images/stack-progress.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/stack-progress.png -------------------------------------------------------------------------------- /lab-guides/images/table_size.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/table_size.png -------------------------------------------------------------------------------- /lab-guides/images/table_statistics.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/table_statistics.png -------------------------------------------------------------------------------- /lab-guides/images/table_statistics_cdc.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/table_statistics_cdc.png -------------------------------------------------------------------------------- /lab-guides/images/verify_target_db.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/4ab9147b661e6a9384d06fc1c5bafdae1c8b5a61/lab-guides/images/verify_target_db.png -------------------------------------------------------------------------------- /lab-guides/lab-setup-verification.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](sct.md) 2 | 3 | ___ 4 | 5 | # Login to Event Engine 6 | Event Engine provides you with an AWS environment for running this workshop. 7 | 8 | 1. You will be given a hash code to log in to Event Engine. Refer to the following image for a sample hash code. 9 | 10 | ![Hash Code](images/hash-code.png) 11 | 12 | 2. Navigate to [dashboard.eventengine.run](https://dashboard.eventengine.run/dashboard) and enter your hash code. 13 | 14 | ![Dash Board](images/event-dashboard.png) 15 | 16 | 3. Click on the **AWS console** button to log in to the AWS environment. 17 | 18 | ![Dash Board](images/dashboard-open.png) 19 | 20 | # CloudFormation Stack Outputs 21 | 22 | The Event Engine runs the CloudFormation stack to set up the following components in AWS: 23 | - An EC2 instance with the following components 24 | - AWS Schema Conversion Tool (SCT) 25 | - Source Oracle database 26 | - Oracle SQL Developer 27 | - A sample web application 28 | - An Amazon Aurora PostgreSQL instance used as the target database 29 | 30 | 1. Go to the [AWS CloudFormation console](https://console.aws.amazon.com/cloudformation/home?region=us-east-1) and click on Stacks in the navigation panel to list the CloudFormation stacks.
You should see a CloudFormation stack with a name like `mod-xxxxxxxxxxx`. 31 | 32 | ![Stack Progress](images/cfn-verify.png) 33 | 34 | 2. Click on the **Resources** tab. You will see the various AWS resources created. 35 | 36 | 3. Make a note of the stack output parameters. You can find the stack output parameters in the **Outputs** tab. **You need these parameters later in the workshop.** 37 | 38 | ![Stack Output](images/cfn-verify-output.png) 39 | ___ 40 | 41 | [Back to main guide](../README.md)|[Next](sct.md) -------------------------------------------------------------------------------- /lab-guides/lab-setup.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](sct.md) 2 | 3 | ___ 4 | 5 | # Lab Setup 6 | 7 | In this activity, we will deploy the CloudFormation template to create the lab environment. 8 | 9 | The environment for this lab consists of: 10 | - An EC2 instance with the following components 11 | - AWS Schema Conversion Tool (SCT) 12 | - Source Oracle database 13 | - Oracle SQL Developer 14 | - A sample web application 15 | - An Amazon Aurora PostgreSQL instance used as the target database 16 | 17 | ___ 18 | 19 | ## Deploy the CloudFormation Template 20 | 21 | 1. Click on one of the buttons below to launch the CloudFormation stack in one of the AWS regions. 22 | 23 | Region | Launch 24 | -------|----- 25 | US East (N.
Virginia) | [![Launch Solution in us-east-1](http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/images/cloudformation-launch-stack-button.png)](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=DMSWorkshop&templateURL=https://reinvent-2019-oracle-aurora.s3.amazonaws.com/GPSTEC315_lab_setup.yaml) 26 | US West (Oregon) | [![Launch Solution in us-west-2](http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/images/cloudformation-launch-stack-button.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=DMSWorkshop&templateURL=https://reinvent-2019-oracle-aurora.s3.amazonaws.com/GPSTEC315_lab_setup.yaml) 27 | 28 | 2. Click **Next** on the Select Template page. 29 | 30 | 3. Enter a **Stack Name** or accept the default, and click **Next**. 31 | 32 | 4. On the Options page, accept all the defaults and click **Next**. 33 | 34 | 5. On the Review page, click **Create**. 35 | 36 | 6. Click on Stacks in the navigation panel to list the CloudFormation stacks. You should see the `DMSWorkshop` stack creation in progress. 37 | 38 | ![Stack Progress](images/stack-progress.png) 39 | 40 | 7. After the stack creation is complete, select the stack name `DMSWorkshop` and click on the **Resources** tab. You will see the various AWS resources created. 41 | 42 | 8. Make a note of the stack output parameters. You can find the stack output parameters in the **Outputs** tab.
43 | 44 | ![Stack Output](images/cfn-output.png) 45 | ___ 46 | 47 | [Back to main guide](../README.md)|[Next](sct.md) -------------------------------------------------------------------------------- /lab-guides/num-dt.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](optional-resolv.md) 2 | ___ 3 | 4 | # Fixing issues with numeric data types 5 | 6 | In this section, we will fix possible issues while migrating numeric data types from Oracle to PostgreSQL. 7 | 8 | ## Numeric Data Types 9 | 10 | The NUMBER data type in Oracle is used to store numeric values. The NUMBER data type is defined as NUMBER (**p**, s). **p** is the precision (the total number of digits in the number) and s is the scale (the number of digits after the decimal point). For example, the number 123,456.789 has a precision of 9 and a scale of 3. You would define this data type as NUMBER (9, 3). 11 | 12 | ![Numeric](images/num.png) 13 | 14 | The equivalent datatype in PostgreSQL is **NUMERIC**. For example, you can define a datatype as **NUMERIC (9, 3)**. If a column is defined as **NUMBER (p, s)** in Oracle, i.e. with precision and scale specified, SCT will automatically map it to the equivalent **NUMERIC (p, s)** data type in PostgreSQL. 15 | 16 | ## Numeric Data Types without precision and scale. 17 | 18 | It is recommended to specify both precision and scale while defining numeric columns. However, it is not mandatory to specify the precision and scale for numeric data types; for example, you can define a column in Oracle as just **NUMBER**, without any precision and scale. If the precision and scale are not specified, the number is stored in the database as entered. 19 | 20 | If a column is defined as **NUMBER** in Oracle, **without** any precision and scale, SCT will map it to **DOUBLE PRECISION** in PostgreSQL. 
This is because SCT cannot identify the number of digits after the decimal point (the scale) if it is not specified in the column data type definition. The following table shows the data type mapping SCT uses in various scenarios. 21 | 22 | Oracle Data Type | PostgreSQL Data Type 23 | --- | --- 24 | NUMBER(9,3) | NUMERIC (9,3) 25 | NUMBER(9,0) | NUMERIC (9,0) 26 | NUMBER | DOUBLE PRECISION 27 | 28 | _Please refer to page 89 of the [Oracle to Aurora PostgreSQL migration playbook](https://d1.awsstatic.com/whitepapers/Migration/oracle-database-amazon-aurora-postgresql-migration-playbook.pdf) for a detailed discussion on Data Type mappings between Oracle and PostgreSQL._ 29 | 30 | ## Issue with the data in the REGIONS table 31 | 32 | In the REGIONS table in the Oracle source, the `region_id` column is defined as **NUMBER** without any precision and scale. SCT will map this to a **DOUBLE PRECISION** data type in PostgreSQL. 33 | 34 | After migrating the data using DMS, the values in the `region_id` column (in PostgreSQL) will appear as 1.0, 2.0 etc. You can verify this behavior by running `SELECT * FROM HR.REGIONS;` in both Oracle and Aurora PostgreSQL and comparing the output as shown in the following screenshots. 35 | 36 | _The data from the REGIONS table in Oracle is shown in the following screenshot._ 37 | 38 | ![Numeric](images/1.png) 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | _The data from the REGIONS table in PostgreSQL is shown in the following screenshot. Note the ‘.0’ appended to the values in the ‘region_id’ column._ 51 | 52 | ![Numeric](images/2.png) 53 | 54 | 55 | To fix this issue, we can modify the data type in the target Aurora PostgreSQL database from DOUBLE PRECISION to NUMERIC with a precision and scale. In this activity, we will modify the data type in the target Aurora PostgreSQL database to NUMERIC (6, 0). 
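You can confirm the current mapping on the target at any time by querying the PostgreSQL information schema. The following is an illustrative catalog query (run it while connected to the Aurora PostgreSQL database):

```
SELECT table_name, column_name, data_type, numeric_precision, numeric_scale
FROM information_schema.columns
WHERE table_schema = 'hr'
  AND column_name = 'region_id';
```

Before the fix, `data_type` reports `double precision` for the `region_id` columns in both `hr.regions` and `hr.countries`; after the columns are altered, it should report `numeric` with precision 6 and scale 0.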
56 | 57 | _Note: The `emp_details_view` view depends on the `region_id` column; you need to drop this view before making the data type change and re-create it afterward._ 58 | 59 | 1. Drop the `emp_details_view` view by running the following query while connected to the target **Aurora PostgreSQL** database. 60 | ``` 61 | DROP VIEW hr.emp_details_view; 62 | COMMIT WORK; 63 | ``` 64 | 65 | 2. Modify the data type of the regions table (and the dependent COUNTRIES table) to NUMERIC (6, 0) by running the following query. 66 | 67 | ``` 68 | ALTER TABLE hr.countries 69 | ALTER COLUMN region_id TYPE numeric(6,0); 70 | ALTER TABLE hr.regions 71 | ALTER COLUMN region_id TYPE numeric(6,0); 72 | COMMIT WORK; 73 | ``` 74 | 75 | 3. Re-create the `emp_details_view` view by running the following query. 76 | 77 | ``` 78 | 79 | CREATE OR REPLACE VIEW hr.emp_details_view (employee_id, job_id, manager_id, department_id, location_id, country_id, first_name, last_name, salary, commission_pct, department_name, job_title, city, state_province, country_name, region_name) AS 80 | SELECT 81 | e.employee_id, e.job_id, e.manager_id, e.department_id, d.location_id, l.country_id, e.first_name, e.last_name, e.salary, e.commission_pct, d.department_name, j.job_title, l.city, l.state_province, c.country_name, r.region_name 82 | FROM hr.employees AS e, hr.departments AS d, hr.jobs AS j, hr.locations AS l, hr.countries AS c, hr.regions AS r 83 | WHERE e.department_id = d.department_id AND d.location_id = l.location_id AND l.country_id = c.country_id AND c.region_id = r.region_id AND j.job_id = e.job_id; 84 | 85 | ``` 86 | 87 | 4. You can verify that the issue is fixed by inserting two new rows into the REGIONS table in the source Oracle database by running the following query. 88 | 89 | ``` 90 | 91 | INSERT INTO HR.REGIONS VALUES (7,'China'); 92 | INSERT INTO HR.REGIONS VALUES (8,'ANZ'); 93 | COMMIT WORK; 94 | ``` 95 | 96 | 97 | 5. 
Now run ‘SELECT * from hr.regions;’ after connecting to the **Aurora PostgreSQL database** and verify that the data appears in the correct format as shown in the following screenshot. 98 | 99 | ![Numeric](images/3.png) 100 | 101 | ___ 102 | 103 | [Back to main guide](../README.md)|[Next](optional-resolv.md) 104 | -------------------------------------------------------------------------------- /lab-guides/optional-resolv.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](cleanup.md) 2 | 3 | ___ 4 | 5 | # Resolving SCT action items 6 | 7 | In this section, we will look at the SCT assessment report, examine some of the warnings from SCT, and manually convert the schema objects that SCT was unable to convert. 8 | 9 | _Note: If you have closed the SCT, please reopen the project in SCT (File-> Open Project from C:\Lab\SCT Projects\AWS Schema Conversion Tool Oracle to Aurora PostgreSQL)._ 10 | 11 | Right-click on the HR schema from the Oracle source, select **Create Report**, then click on **Action items**. You will see three items listed as shown in the following screenshot. 12 | 13 | ![DMS](images/Picture1.png) 14 | 15 | - EMP_DETAILS_VIEW view. 16 | - Issue 5075: PostgreSQL doesn't support VIEW with the READ ONLY option. 17 | - Schemas.HR.Views.EMP_DETAILS_VIEW: 555:568 18 | 19 | PostgreSQL does not support read only views. SCT recommends that we use the view without the read only option and it has made the change automatically. No further action is needed from us. 20 | 21 | - SECURE_DML procedure. 22 | - Issue 5584: Converted functions depends on the time zone settings 23 | - Schemas.HR.Procedures.SECURE_DML: 51:57 24 | - Schemas.HR.Procedures.SECURE_DML: 123:129 25 | 26 | SCT recommends that we review the transformed code, and set time zone manually if necessary. No further action is needed from us. 27 | 28 | - ADD_JOB_HISTORY procedure. 29 | - Issue 5340: Unable to convert functions. 
30 | - Schemas.HR.Procedures.ADD_JOB_HISTORY: 441:448 31 | - Schemas.HR.Procedures.ADD_JOB_HISTORY: 451:462 32 | - Schemas.HR.Procedures.ADD_JOB_HISTORY: 465:474 33 | - Schemas.HR.Procedures.ADD_JOB_HISTORY: 477:484 34 | - Schemas.HR.Procedures.ADD_JOB_HISTORY: 487:501 35 | 36 | 37 | We will have to convert this procedure manually. Please follow these steps to do so. 38 | 39 | 1. Click on the HR schema from the Oracle source, expand the procedures node, and select the **ADD_JOB_HISTORY** procedure. SCT will show you a side-by-side view of the original procedure in Oracle and the converted procedure in Aurora PostgreSQL, as shown in the following screenshot. 40 | 41 | ![DMS](images/Picture2.png) 42 | 43 | 2. Examine the converted procedure in PostgreSQL on the right side. You will notice that SCT has provided details (as a comment) on the errors it encountered while attempting to convert the procedure. 44 | 45 | ``` 46 | CREATE OR REPLACE FUNCTION hr.add_job_history( 47 | IN p_emp_id DOUBLE PRECISION, 48 | IN p_start_date TIMESTAMP WITHOUT TIME ZONE, 49 | IN p_end_date TIMESTAMP WITHOUT TIME ZONE, 50 | IN p_job_id TEXT, 51 | IN p_department_id DOUBLE PRECISION) 52 | RETURNS void 53 | AS 54 | $BODY$ 55 | BEGIN 56 | /* 57 | [5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_EMP_ID function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_START_DATE function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_END_DATE function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_JOB_ID function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_DEPARTMENT_ID function.
Use suitable function or create user defined function.] 58 | INSERT INTO HR.job_history (employee_id, start_date, end_date, 59 | job_id, department_id) 60 | VALUES(p_emp_id, p_start_date, p_end_date, p_job_id, p_department_id) 61 | */ 62 | BEGIN 63 | END; 64 | END; 65 | $BODY$ 66 | LANGUAGE plpgsql; 67 | 68 | 69 | ``` 70 | 71 | 72 | 3. In this case, the fix is simple. Copy the INSERT statement out of the comment, place it inside the inner BEGIN ... END block, and add a semicolon (;) to the end of the statement, as shown below. 73 | ``` 74 | CREATE OR REPLACE FUNCTION hr.add_job_history( 75 | IN p_emp_id DOUBLE PRECISION, 76 | IN p_start_date TIMESTAMP WITHOUT TIME ZONE, 77 | IN p_end_date TIMESTAMP WITHOUT TIME ZONE, 78 | IN p_job_id TEXT, 79 | IN p_department_id DOUBLE PRECISION) 80 | RETURNS void 81 | AS 82 | $BODY$ 83 | BEGIN 84 | /* 85 | [5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_EMP_ID function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_START_DATE function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_END_DATE function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_JOB_ID function. Use suitable function or create user defined function., 5340 - Severity CRITICAL - PostgreSQL doesn't support the ADD_JOB_HISTORY.P_DEPARTMENT_ID function. Use suitable function or create user defined function.] 86 | */ 87 | BEGIN 88 | INSERT INTO HR.job_history (employee_id, start_date, end_date, 89 | job_id, department_id) 90 | VALUES(p_emp_id, p_start_date, p_end_date, p_job_id, p_department_id); 91 | 92 | END; 93 | END; 94 | $BODY$ 95 | LANGUAGE plpgsql; 96 | 97 | ``` 98 | 99 | 4. Click on the HR schema from the Aurora PostgreSQL target, expand the functions node, and select `add_job_history`.
Right-click, select **Apply to database**, and click **Yes** when prompted. If the **Apply to database** menu item is disabled, click **Connect to Amazon Aurora(PostgreSQL compatible)** on the top menu bar and try again. The manually converted procedure has now been applied to the target database. 100 | 101 | 5. To view the modified procedure code in SQL Developer, right-click on the Aurora PostgreSQL connection, open a new SQL Worksheet, and run the following script (click the Run Script button or press F5). 102 | ``` 103 | SELECT 104 | pg_get_functiondef(( 105 | SELECT 106 | oid FROM pg_proc 107 | WHERE 108 | proname = 'add_job_history')); 109 | ``` 110 | 111 | 112 | [Back to main guide](../README.md)|[Next](cleanup.md) -------------------------------------------------------------------------------- /lab-guides/optional-validation.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](num-dt.md) 2 | 3 | ___ 4 | 5 | # Validation: Web application 6 | 7 | We have created a Node.js application that connects to the source Oracle **HR** schema and lists all the rows from the **employees** table. This sample web application allows you to add new employees and update existing employee details. We have also cloned and modified the sample web application to support the target PostgreSQL database. Both applications are already installed on the **OracleXE-SCT** EC2 instance. 8 | 9 | In this activity, you will verify the continuous data replication between the Oracle source and the Aurora target using these two applications. 10 | 11 | ## Task 1 - Connect and start the web application connected to the Oracle source 12 | 1. Connect to the **OracleXE-SCT** EC2 instance using the following credentials, if not already connected. 13 | **User Name**: administrator 14 | **Windows password**: GPSreInvent@321 15 | 2. Click the **Start** button, right-click on **Windows PowerShell**, and click **Open new window**.
16 | 3. Start the Oracle web application by executing the following command in **PowerShell**. 17 | ``` 18 | C:\Lab\oracle-app\start-app.ps1 19 | ``` 20 | 4. Once the application has started successfully (this could take a minute), open the web application by visiting the following URL: http://localhost:4200/ 21 | 5. Verify that the Oracle web application lists all the rows from the **employees** table. 22 | 23 | ## Task 2 - Configure and start the web application connected to the Aurora PostgreSQL target database 24 | 1. Update the database config file to point to the target Aurora PostgreSQL endpoint. Navigate to `C:\Lab\pgs-app\hr_app\config\`, open the **database.js** config file in TextPad (right-click), and update the **host** parameter with the `AuroraPostgreSQLEndpoint` value from the [CloudFormation stack output](./lab-setup-verification.md#cloudformation-stack-outputs). 25 | 2. Open another PowerShell window for running the PostgreSQL web application. Click the **Start** button, right-click on **Windows PowerShell**, and click **Open new window**. 26 | 3. Start the PostgreSQL web application by executing the following script in **PowerShell**. 27 | ``` 28 | C:\Lab\pgs-app\start-app.ps1 29 | ``` 30 | 4. Once the application has started successfully (this could take a minute), open the web application by visiting the following URL: http://localhost:4400/ 31 | 5. Verify that the PostgreSQL web application lists all the rows from the **employees** table. 32 | 33 | ## Task 3 - Validate the ongoing data replication (CDC) 34 | You are now running two applications: one connected to the source Oracle database and another connected to the target PostgreSQL database, with a DMS migration task configured to replicate data changes from the source to the target. 35 | 36 | 1. Add a new employee from the Oracle web application by clicking the **Add Employee** button. 37 | 2.
Verify that the newly added employee's details appear in the PostgreSQL web application (refresh the page). 38 | 3. Update an employee in the Oracle web application and verify that the change is replicated to the target PostgreSQL web application (refresh the page). 39 | 40 | ### Conclusion 41 | This part of the workshop demonstrated real-time database replication with change data capture (CDC). 42 | 43 | ___ 44 | 45 | [Back to main guide](../README.md)|[Next](num-dt.md) -------------------------------------------------------------------------------- /lab-guides/sct.md: -------------------------------------------------------------------------------- 1 | [Back to main guide](../README.md)|[Next](dms-inst.md) 2 | 3 | ___ 4 | 5 | # AWS Schema Conversion Tool (SCT) 6 | 7 | The following steps provide instructions for converting an Oracle database to an Amazon Aurora PostgreSQL database. In this activity, you perform the following tasks: 8 | - Log in to the Windows EC2 instance (this has the AWS SCT installed) 9 | - Use SCT to create a database migration project 10 | - Use SCT to convert the Oracle schema to a PostgreSQL schema and analyze schema conversion issues 11 | - Apply the converted schema to the Aurora PostgreSQL database 12 | 13 | ![Connect to EC2 Instance](images/sct.png) 14 | ___ 15 | 16 | ## Task 1 - Log in to Windows EC2 instance 17 | 18 | 1. Go to the [Amazon EC2 console](https://console.aws.amazon.com/ec2/v2/home) and click on Instances in the left column. 19 | 2. Select the instance with the name **OracleXE-SCT** and then click the **Connect** button. 20 | 21 | _Note: If you need instructions on connecting to a Windows instance, please see our [documentation](https://docs.aws.amazon.com/AWSEC2/latest/WindowsGuide/connecting_to_windows_instance.html?icmpid=docs_ec2_console)._ 22 | 23 | ![Connect to EC2 Instance](images/instance-connect.png) 24 | 25 | 3.
Click on **Download Remote Desktop File** to download the RDP file to connect to the EC2 instance **OracleXE-SCT**. 26 | 4. Connect to the EC2 instance using the following credentials. 27 | 28 | **User Name**: administrator 29 | **Password**: GPSreInvent@321 30 | 31 | ___ 32 | 33 | ## Task 2 - Launch the Schema Conversion Tool 34 | Now that you are connected to the EC2 instance **OracleXE-SCT**, launch the Schema Conversion Tool from the shortcut on the desktop. 35 | 36 | 1. Launch SCT from the shortcut on the desktop and create a new Database Migration Project using the tool. 37 | ![Create New Project](images/new_project.png) 38 | 2. Enter the following values and click **OK**. 39 | 40 | Parameter | Value 41 | ----------|------ 42 | Project Name | AWS Schema Conversion Tool Oracle to Aurora PostgreSQL 43 | Location | Leave default 44 | Database type | Transactional Database (OLTP) 45 | Source database engine | Oracle 46 | Target database engine | Amazon Aurora(PostgreSQL compatible) 47 | 48 | ![Create New Project](images/new_project1.png) 49 | 50 | 3. Click **Connect to Oracle** on the top menu bar. Enter the source database details from the table below, and click **Test Connection**. Once the connection is successfully tested, click **OK**. 51 | 52 | Parameter | Value 53 | ----------|------ 54 | Type | SID 55 | Server Name | localhost 56 | Server Port | 1521 57 | Oracle SID | XE 58 | User Name | hr 59 | Password | hr123 60 | Use SSL | Unchecked 61 | Store Password | Checked 62 | 63 | ![Connect to Oracle](images/sct_oracle.png) 64 | 65 | 4. Click **Connect to Amazon Aurora(PostgreSQL compatible)** on the top menu bar. Enter the target database details from the table below, and click **Test Connection**. Once the connection is successfully tested, click **OK**.
66 | 67 | Parameter | Value 68 | ----------|------ 69 | Server Name | Get `AuroraPostgreSQLEndpoint` from [CloudFormation stack output](./lab-setup-verification.md#cloudformation-stack-outputs) 70 | Server Port | 5432 71 | Database | AuroraPostgreSQLDB 72 | User Name | postgres 73 | Password | Aurora321 74 | Use SSL | Unchecked 75 | Store Password | Checked 76 | 77 | ![Connect to Amazon Aurora](images/sct_aurora.png) 78 | 79 | 5. Select and expand the `HR` schema in the left-hand panel to inspect the schema objects. 80 | 81 | ![Schema Objects](images/schema_objects.png) 82 | ___ 83 | 84 | ## Task 3 - Convert the Schema Using the Schema Conversion Tool 85 | After creating a Database Migration Project, the next step is to convert the source Oracle schema to the target PostgreSQL schema. 86 | 87 | 1. Right-click on the `HR` schema from the Oracle source and select **Convert Schema** to generate the data definition language (DDL) statements for the target database. 88 | 89 | _Note: You may be prompted with a dialog box “These objects might already exist in the target database, Replace?” Select **Yes** and the conversion will start._ 90 | 91 | ![Convert Schema](images/convert_schema.png) 92 | 93 | AWS SCT analyzes the schema and creates a database migration assessment report for the conversion to PostgreSQL. Items with a red exclamation mark next to them cannot be directly translated from the source to the target. In this case, these are the SECURE_DML and ADD_JOB_HISTORY procedures and the EMP_DETAILS_VIEW view. 94 | 95 | 2. Click the **View** button and choose **Assessment Report view** to see the detailed assessment report. 96 | 97 | ![Assessment Report view](images/assessment.png) 98 | 99 | 3. Review the assessment report and check whether there are any suggested actions to address schema conversion issues. 100 | 101 | 4. Next, navigate to the **Action Items** tab in the report to see the items that had issues during conversion to the new PostgreSQL schema.
102 | 103 | ![Action Items](images/action-items.png) 104 | 105 | 5. Check each of the issues listed and compare the contents of the source Oracle panel and the target Aurora PostgreSQL panel. SCT has proposed resolutions by generating equivalent PostgreSQL DDL to convert the objects. Additionally, SCT highlights each conversion issue that it cannot resolve automatically and provides hints on how you can successfully convert the database object. 106 | 107 | Notice the issues highlighted in the `ADD_JOB_HISTORY` procedure. You will see that SCT is unable to convert this procedure automatically. Ignore these issues for now; we will fix them towards the end of the workshop in the activity [Resolving SCT action items](https://github.com/aws-samples/amazon-aurora-database-migration-workshop-reinvent2019/blob/master/lab-guides/optional-resolv.md). 108 | 109 | 6. To migrate the converted schema to the target database, right-click on the `HR` schema in the right-hand panel and select **Apply to database**. 110 | 111 | ![Apply to database](images/apply_db.png) 112 | 113 | 7. When prompted whether you want to apply the schema to the database, click **Yes**. 114 | 8. At this point, the schema has been applied to the target database. Expand the `HR` schema to see the tables. 115 | 116 | _Note: If you don’t see the tables, right-click on the `HR` schema and select **Refresh from Database**._ 117 | 118 | 9. Save the SCT project by clicking File -> Save Project. 119 | 120 | ### Conclusion 121 | 122 | This part of the workshop demonstrated how to convert and migrate the schema from an Oracle database to an Amazon Aurora PostgreSQL database. The AWS Schema Conversion Tool (SCT) automates the schema conversion to a large extent. The same steps can be followed to migrate SQL Server and Oracle workloads to other Amazon RDS engines, including Aurora MySQL and MySQL.
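As an optional spot check after applying the schema, you can confirm from a SQL Worksheet connected to the target database that the converted objects exist. This is a sketch that assumes only the lab's `hr` schema and the standard PostgreSQL `information_schema` catalog:

```
-- List the tables SCT created in the hr schema on the Aurora PostgreSQL target
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'hr'
ORDER BY table_name;
```

You should see the HR tables (for example, `countries`, `departments`, `employees`, `jobs`, `locations`, and `regions`); if the list is empty, re-run **Apply to database**.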
123 | ___ 124 | 125 | [Back to main guide](../README.md)|[Next](dms-inst.md) --------------------------------------------------------------------------------