├── sqlquery.sql
├── LICENSE
└── README.md

/sqlquery.sql:
--------------------------------------------------------------------------------
CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
    datetime VARCHAR(30),
    status INTEGER,
    statusCount INTEGER);
CREATE OR REPLACE PUMP "STREAM_PUMP" AS
    INSERT INTO "DESTINATION_SQL_STREAM"
        SELECT
            STREAM TIMESTAMP_TO_CHAR('yyyy-MM-dd''T''HH:mm:ss.SSS', LOCALTIMESTAMP) AS datetime,
            "response" AS status,
            COUNT(*) AS statusCount
        FROM "SOURCE_SQL_STREAM_001"
        GROUP BY
            "response",
            FLOOR(("SOURCE_SQL_STREAM_001".ROWTIME - TIMESTAMP '1970-01-01 00:00:00') minute / 1 TO MINUTE);
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2020 Adit Modi

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Log-Analytics-Solution

Collect, process, and analyze log data using Amazon Kinesis and Amazon Elasticsearch Service.

![image](https://user-images.githubusercontent.com/48589838/77059467-fae1d280-69fc-11ea-97aa-fd2ebe57f712.png)


## Step 1: Set Up Prerequisites

- Create an AWS account
- Start an EC2 instance
- Prepare your log files

To create a continuous stream of log file data on your EC2 instance, download,
install, and run the Fake Apache Log Generator from GitHub.
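For example, on an Amazon Linux instance the generator can be installed and left running roughly as follows. This is only a sketch: the repository (`kiritbasu/Fake-Apache-Log-Generator`), its flags, and the `/tmp/logs` output directory are assumptions you should verify and adjust for your own setup.

```bash
# Sketch only: the repository, its dependencies, and the script flags come from the
# kiritbasu/Fake-Apache-Log-Generator project and may change; check its README first.
sudo yum install -y git python-pip            # Amazon Linux; adapt for other distros
git clone https://github.com/kiritbasu/Fake-Apache-Log-Generator.git
cd Fake-Apache-Log-Generator
sudo pip install -r requirements.txt

# -n 0 keeps generating entries indefinitely; -o LOG writes them to dated access_log files.
# /tmp/logs is an arbitrary example directory that the Kinesis Agent will tail in Step 3.
mkdir -p /tmp/logs && cd /tmp/logs
nohup python ~/Fake-Apache-Log-Generator/apache-fake-log-gen.py -n 0 -o LOG &
```

Leave the generator running so new log lines keep arriving while you work through the remaining steps.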
## Step 2: Create an Amazon Kinesis Firehose Delivery Stream


## Step 3: Install and Configure the Amazon Kinesis Agent

To install the agent, see [Download and Install the Agent](http://docs.aws.amazon.com/firehose/latest/dev/writing-with-agents.html#download-install). A minimal example `agent.json` is sketched after Step 8 below.


## Step 4: Create an Amazon Elasticsearch Service Domain


## Step 5: Create a Second Amazon Kinesis Firehose Delivery Stream


## Step 6: Create an Amazon Kinesis Analytics Application

The application uses the query in `sqlquery.sql` to count HTTP response codes over a one-minute tumbling window.


## Step 7: View the Aggregated Streaming Data

![image](https://user-images.githubusercontent.com/48589838/77059633-3d0b1400-69fd-11ea-8532-14218a48a96a.png)


## Step 8: Clean Up


Link to the full tutorial: [Build a Log Analytics Solution on AWS](https://d1.awsstatic.com/Projects/P4113850/aws-projects_build-log-analytics-solution-on-aws.pdf)
--------------------------------------------------------------------------------
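For Step 3, once the agent is installed its configuration lives in `/etc/aws-kinesis/agent.json`. The sketch below tails the fake Apache logs from Step 1, converts each line to JSON, and sends it to the first Firehose delivery stream; the stream name (`web-log-ingestion-stream`), the region endpoint, and the `/tmp/logs` path are assumptions you must replace with your own values.

```bash
# Sketch of a minimal Kinesis Agent configuration; stream name, region, and log path
# are placeholders chosen for this example.
sudo tee /etc/aws-kinesis/agent.json > /dev/null <<'EOF'
{
  "cloudwatch.emitMetrics": true,
  "firehose.endpoint": "firehose.us-east-1.amazonaws.com",
  "flows": [
    {
      "filePattern": "/tmp/logs/access_log*",
      "deliveryStream": "web-log-ingestion-stream",
      "dataProcessingOptions": [
        { "optionName": "LOGTOJSON", "logFormat": "COMBINEDAPACHELOG" }
      ]
    }
  ]
}
EOF

# Start the agent and verify records are flowing in /var/log/aws-kinesis-agent/.
sudo service aws-kinesis-agent start
```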