├── web-monitor.ini ├── LICENSE ├── README.md └── web.monitor.py /web-monitor.ini: -------------------------------------------------------------------------------- 1 | [Binary paths] 2 | 3 | notify = /usr/local/bin/notify 4 | httpx = /usr/local/sbin/httpx 5 | 6 | [Apis] 7 | 8 | notify_api = /root/.config/notify/provider-config.yaml 9 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 Eric Labrador Sainz 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 |

2 | web.Monitor 3 |
4 |

5 | 6 |
  7 | 
  8 |    Fast & user-friendly web change tracking tool.
  9 | 
 10 | 
11 | 12 | ![image](https://github.com/e1abrador/web.Monitor/assets/74373745/dd13f17f-3364-4d25-92fa-6f1924b0acdf) 13 | 14 | ## Why? 15 | 16 | **Continuous Monitoring:** The script can automatically check web pages at regular intervals to detect any changes. 17 | 18 | **Flexible Setup:** You can add individual URLs or load a list of URLs from a file. 19 | 20 | **Persistent Storage:** It uses the sqlite3 module to store the web data in a database file, allowing persistence across runs. 21 | 22 | **Detailed Logging:** It doesn't just check whether the page is up; it also logs the HTTP status code, content length, and page title. 23 | 24 | **Notifications:** Upon detecting a change, the script not only prints it on the console but can also send out a notification. 25 | 26 | **Change Visualization:** Provides functionality to view all recorded changes for a specific domain or URL, aiding historical tracking. 27 | 28 | **Domain Filtering:** If you're only interested in checking URLs of a specific domain, the script allows you to filter and check only those domains. 29 | 30 | **Automation:** With the hour-based repetition option, the script can operate in a loop, checking web pages every set number of hours. 31 | 32 | **Customization and Extension:** Being an open-source script, you can tailor it to your needs or add more functionality. 33 | 34 | ## Features 35 | - **Fast** 36 | - **Easy to use** 37 | - **Easy to install** 38 | - **Continuously saves new changes in the local database (with the possibility of dumping them all)** 39 | - **Telegram/Slack/Discord notifications** 40 | 41 | ## Help Menu 42 | **web.Monitor** flags: 43 | 44 | ````console 45 | options: 46 | -h, --help show this help message and exit 47 | --add ADD Add a URL to monitor. 48 | --add-urls ADD_URLS Add URLs from a file to monitor. 49 | --check Check all the websites for changes. 50 | -D DOMAIN, --domain DOMAIN 51 | Check websites for a specific root domain. 
52 | -df DOMAIN_FILE, --domain-file DOMAIN_FILE 53 | Check websites for root domains specified in a file. 54 | --show-changes Show changes for the specified domain or URL. 55 | -H HOURS, --hours HOURS 56 | Repeat the website check every X hours. 57 | -url URL, --url URL Show changes for a specific URL. 58 | ```` 59 | 60 | ## Required configuration 61 | 62 | You need to write the binary paths and the API configuration path into the **web-monitor.ini** file: 63 | 64 | - [Httpx](https://github.com/projectdiscovery/httpx) binary. 65 | - [Notify](https://github.com/projectdiscovery/notify) binary. 66 | - Notify API configuration file. 67 | 68 | ## Work plan 69 | 70 | ``IMPORTANT``: I had to switch from the shelve library to sqlite3 because of problems on my VPS: shelve was generating different database files (I don't really know why; it worked correctly on my local machine). 71 | 72 | First of all, you need to add a URL (or URLs) to the database: 73 | 74 | ````console 75 | python3 web.monitor.py --add-urls urls.txt 76 | python3 web.monitor.py --add http://example.com:81 77 | ```` 78 | 79 | Now you can start scanning. It's important to note that the first result for each URL is saved to the database without triggering a notification; the script starts sending notifications from the second scan onward. 80 | 81 | Once all the URLs are in the database, you can start scanning with the following commands: 82 | 83 | ````console 84 | python3 web.monitor.py -df roots.txt --check -H 1 85 | python3 web.monitor.py -D example.com --check -H 1 86 | ```` 87 | 88 | The ``-df`` flag scans all URLs belonging to the root domains listed in a file: for example, if ``admin.example.com`` and ``admin.example2.com`` are in the database and ``roots.txt`` contains only ``example.com``, only ``admin.example.com`` will be scanned. The ``-D`` flag does the same for a single root domain given on the command line, scanning only ``*.example.com`` URLs. 
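This root filtering is a plain substring match against each stored URL, mirroring `any(root in url for root in roots)` in the script's `check_websites` function. A minimal sketch of the selection logic (the `select_urls` helper and sample URLs are illustrative, not part of the script):

```python
# Illustrative sketch of how -D/-df select URLs to check: a URL is kept
# when any root from the list is a substring of it, matching the
# `any(root in url for root in roots)` test in check_websites().

def select_urls(urls, roots):
    """Return URLs matching at least one root; all of them if roots is empty."""
    if not roots:
        return list(urls)
    return [u for u in urls if any(root in u for root in roots)]

urls = ['http://admin.example.com', 'http://admin.example2.com']
print(select_urls(urls, ['example.com']))  # → ['http://admin.example.com']
```

Note that substring matching can over-match: a root of ``example.com`` would also select a stored URL like ``http://notexample.com``, so prefer unambiguous root strings.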
The ``-H`` flag sets the interval, in hours, at which the check repeats; you can customize it, but I recommend scanning every 12 or 24 hours. 89 | 90 | If you want to dump all recorded changes for a given domain, you can use: 91 | 92 | ````console 93 | ➜ python3 web.monitor.py -D example.com --show-changes 94 | 95 | _ __ __ _ _ 96 | | | | \/ | (_) | 97 | __ _____| |__ | \ / | ___ _ __ _| |_ ___ _ __ 98 | \ \ /\ / / _ \ '_ \| |\/| |/ _ \| '_ \| | __/ _ \| '__| 99 | \ V V / __/ |_) | | | | (_) | | | | | || (_) | | 100 | \_/\_/ \___|_.__/|_| |_|\___/|_| |_|_|\__\___/|_| 101 | 102 | github.com/e1abrador/web.Monitor 103 | 104 | example.com:81/ 105 | [2023-09-10 00:58:16.694612] http://example.com:81/ [200] [3463] [Test 1] 106 | [2023-09-10 00:58:25.382700] http://example.com:81/ [200] [39386] [Test 1] 107 | 108 | example.com:82/ 109 | [2023-09-10 00:56:42.354195] http://example.com:82/ [200] [3463] [Test 2] 110 | [2023-09-10 00:57:27.545999] http://example.com:82/ [200] [39386] [Test 2] 111 | [2023-09-10 00:57:38.478968] http://example.com:82/ [200] [2666] [Test 2] 112 | ```` 113 | 114 | You can also check a single URL with the ``--url`` flag: 115 | 116 | ````console 117 | python3 web.monitor.py --url http://example.com:81 --show-changes 118 | 119 | _ __ __ _ _ 120 | | | | \/ | (_) | 121 | __ _____| |__ | \ / | ___ _ __ _| |_ ___ _ __ 122 | \ \ /\ / / _ \ '_ \| |\/| |/ _ \| '_ \| | __/ _ \| '__| 123 | \ V V / __/ |_) | | | | (_) | | | | | || (_) | | 124 | \_/\_/ \___|_.__/|_| |_|\___/|_| |_|_|\__\___/|_| 125 | 126 | github.com/e1abrador/web.Monitor 127 | 128 | [2023-09-10 21:51:46.626917] http://example.com:81 [200] [21535] [Test 1] 129 | [2023-09-10 21:51:53.748105] http://example.com:81 [200] [2666] [Test 2] 130 | [2023-09-10 21:58:25.827493] http://example.com:81 [200] [35127] [Test 3] 131 | ```` 132 | 133 | 134 | 135 | Note that when using the ``-D`` command above, every URL that contains the given domain will be included; in this example the 
script will show *.example.com. 136 | 137 | ## Thanks 138 | 139 | Thanks to: 140 | 141 | - Projectdiscovery for creating [httpx](https://github.com/projectdiscovery/httpx) and [notify](https://github.com/projectdiscovery/notify)! 142 | 143 | If you have an idea for new functionality, open a PR at https://github.com/e1abrador/web.Monitor/pulls. 144 | 145 | Good luck and good hunting! 146 | If you really love the tool (or any others), or they helped you find an awesome bounty, consider [BUYING ME A COFFEE!](https://www.buymeacoffee.com/e1abrador) ☕ (I could use the caffeine!) 147 | 148 | ⚪ e1abrador 149 | 150 | Twitter: https://twitter.com/e1abrador 151 | 152 | Buy Me a Coffee at ko-fi.com 153 | -------------------------------------------------------------------------------- /web.monitor.py: -------------------------------------------------------------------------------- 1 | import sqlite3 2 | import subprocess 3 | import argparse 4 | import time 5 | from datetime import datetime 6 | import configparser 7 | import json 8 | 9 | DB_PATH = 'website_monitor.db' 10 | 11 | config = configparser.ConfigParser() 12 | config.read('web-monitor.ini') 13 | 14 | def create_database(): 15 | with sqlite3.connect(DB_PATH) as conn: 16 | cursor = conn.cursor() 17 | cursor.execute(''' 18 | CREATE TABLE IF NOT EXISTS websites ( 19 | url TEXT PRIMARY KEY, 20 | data TEXT 21 | ) 22 | ''') 23 | conn.commit() 24 | 25 | def add_url(url): 26 | with sqlite3.connect(DB_PATH) as conn: 27 | cursor = conn.cursor() 28 | cursor.execute('INSERT OR IGNORE INTO websites (url, data) VALUES (?, ?)', (url, '[]')) 29 | conn.commit() 30 | 31 | def add_urls_from_file(filename): 32 | with open(filename, 'r') as f: 33 | for url in f: 34 | add_url(url.strip()) 35 | 36 | def get_website_data(url): 37 | with sqlite3.connect(DB_PATH) as conn: 38 | cursor = conn.cursor() 39 | cursor.execute('SELECT data FROM websites WHERE url = ?', (url,)) 40 | data = cursor.fetchone() 41 | return json.loads(data[0]) if data 
else None 42 | 43 | def update_website_data(url, data): 44 | with sqlite3.connect(DB_PATH) as conn: 45 | cursor = conn.cursor() 46 | cursor.execute('UPDATE websites SET data = ? WHERE url = ?', (json.dumps(data), url)) 47 | conn.commit() 48 | 49 | def check_websites(roots=[]): 50 | current_time = datetime.now().strftime('%Y-%m-%d %H:%M:%S.%f') 51 | 52 | with sqlite3.connect(DB_PATH) as conn: 53 | cursor = conn.cursor() 54 | cursor.execute('SELECT url FROM websites') 55 | urls = [row[0] for row in cursor] 56 | 57 | for url in urls: 58 | if roots and not any(root in url for root in roots): 59 | continue 60 | 61 | httpx_binary = config.get('Binary paths', 'httpx') 62 | cmd = f'echo {url} | {httpx_binary} -silent -sc -title -cl -nc' 63 | result = subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout.strip() 64 | 65 | if not result: 66 | continue 67 | 68 | parts = result.split() 69 | http_code = int(parts[1][1:-1]) 70 | content_length = int(parts[2][1:-1]) 71 | title = " ".join(parts[3:])[1:-1] 72 | 73 | stored_values = get_website_data(url) 74 | 75 | if not stored_values: 76 | stored_values.append({ 77 | 'http_code': http_code, 78 | 'content_length': content_length, 79 | 'title': title, 80 | 'timestamp': current_time 81 | }) 82 | else: 83 | last_values = stored_values[-1] 84 | if (last_values['http_code'] != http_code or 85 | last_values['content_length'] != content_length or 86 | last_values['title'] != title): 87 | 88 | notify_binary = config.get('Binary paths', 'notify') 89 | notify_api = config.get('Apis', 'notify_api') 90 | 91 | cmd = f'echo "Change detected for {url}. 
New values: {url} [{http_code}] [{content_length}] [{title}]" | {notify_binary} -silent -pc {notify_api}' 92 | subprocess.run(cmd, shell=True) 93 | 94 | stored_values.append({ 95 | 'http_code': http_code, 96 | 'content_length': content_length, 97 | 'title': title, 98 | 'timestamp': current_time 99 | }) 100 | 101 | update_website_data(url, stored_values) 102 | 103 | def show_changes_for_domain(domain): 104 | with sqlite3.connect(DB_PATH) as conn: 105 | cursor = conn.cursor() 106 | cursor.execute('SELECT url, data FROM websites WHERE url LIKE ?', ('%' + domain + '%',)) 107 | changes = {row[0]: json.loads(row[1]) for row in cursor} 108 | 109 | if not changes: 110 | print(f"No changes recorded for {domain}.") 111 | return 112 | 113 | for url, data_list in changes.items(): 114 | print(url) 115 | for data in data_list: 116 | print(f"[{data['timestamp']}] {url} [{data['http_code']}] [{data['content_length']}] [{data['title']}]") 117 | print() 118 | 119 | def show_changes_for_url(url): 120 | data = get_website_data(url) 121 | if not data: 122 | print(f"No changes recorded for {url}.") 123 | return 124 | 125 | for entry in data: 126 | print(f"[{entry['timestamp']}] {url} [{entry['http_code']}] [{entry['content_length']}] [{entry['title']}]") 127 | print() 128 | 129 | def main(): 130 | 131 | create_database() 132 | 133 | print(""" 134 | _ __ __ _ _ 135 | | | | \\/ | (_) | 136 | __ _____| |__ | \\ / | ___ _ __ _| |_ ___ _ __ 137 | \\ \\ /\\ / / _ \\ '_ \\| |\\/| |/ _ \\| '_ \\| | __/ _ \\| '__| 138 | \\ V V / __/ |_) | | | | (_) | | | | | || (_) | | 139 | \\_/\\_/ \\___|_.__/|_| |_|\\___/|_| |_|_|\\__\\___/|_| 140 | 141 | github.com/e1abrador/web.Monitor 142 | """) 143 | 144 | 145 | 146 | parser = argparse.ArgumentParser(description="Monitor websites for changes.") 147 | parser.add_argument('--add', help='Add a URL to monitor.') 148 | parser.add_argument('--add-urls', help='Add URLs from a file to monitor.') 149 | parser.add_argument('--check', action='store_true', 
help='Check all the websites for changes.') 150 | parser.add_argument('-D', '--domain', help='Check websites for a specific root domain.') 151 | parser.add_argument('-df', '--domain-file', help='Check websites for root domains specified in a file.') 152 | parser.add_argument('--show-changes', action='store_true', help='Show changes for the specified domain or URL.') 153 | parser.add_argument('-H', '--hours', type=float, help='Repeat the website check every X hours.') 154 | parser.add_argument('-url', '--url', help='Show changes for a specific URL.') 155 | 156 | args = parser.parse_args() 157 | 158 | if args.add: 159 | add_url(args.add) 160 | 161 | if args.add_urls: 162 | add_urls_from_file(args.add_urls) 163 | 164 | if args.check: 165 | if args.hours: 166 | while True: 167 | if args.domain: 168 | check_websites(roots=[args.domain]) 169 | elif args.domain_file: 170 | with open(args.domain_file, 'r') as f: 171 | roots = [line.strip() for line in f] 172 | check_websites(roots=roots) 173 | else: 174 | check_websites() 175 | 176 | print(f"Waiting for {args.hours} hour(s) before the next check...") 177 | time.sleep(args.hours * 3600) 178 | else: 179 | if args.domain: 180 | check_websites(roots=[args.domain]) 181 | elif args.domain_file: 182 | with open(args.domain_file, 'r') as f: 183 | roots = [line.strip() for line in f] 184 | check_websites(roots=roots) 185 | else: 186 | check_websites() 187 | 188 | if args.domain and args.show_changes: 189 | show_changes_for_domain(args.domain) 190 | 191 | if args.url and args.show_changes: 192 | show_changes_for_url(args.url) 193 | 194 | if __name__ == "__main__": 195 | main() 196 | --------------------------------------------------------------------------------
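The change-detection core of web.monitor.py can be exercised in isolation: parse httpx's one-line summary (`url [status] [length] [title]`, as produced by `-silent -sc -title -cl -nc`) and compare it against the last stored snapshot. A minimal self-contained sketch of that logic; the helper names and the sample output line are illustrative, not part of the script:

```python
# Standalone sketch of web.monitor.py's two core steps:
# 1) parse httpx's "-silent -sc -title -cl -nc" one-line output, and
# 2) decide whether the newest snapshot differs from the last stored one.
# parse_httpx_line / has_changed are illustrative names, not from the script.

def parse_httpx_line(line):
    """Split 'url [code] [length] [title words]' into typed fields."""
    parts = line.split()
    return {
        'url': parts[0],
        'http_code': int(parts[1][1:-1]),       # strip surrounding [ ]
        'content_length': int(parts[2][1:-1]),
        'title': " ".join(parts[3:])[1:-1],     # title may contain spaces
    }

def has_changed(stored, snapshot):
    """True when the snapshot differs from the most recent stored entry.

    An empty history means this is the first scan: the script stores the
    result silently and only notifies from the second scan onward.
    """
    if not stored:
        return False
    last = stored[-1]
    return any(last[k] != snapshot[k]
               for k in ('http_code', 'content_length', 'title'))

snap = parse_httpx_line('http://example.com:81 [200] [3463] [Test 1]')
print(has_changed([], snap))   # first scan: stored silently, no alert
print(has_changed([{'http_code': 200, 'content_length': 3463,
                    'title': 'Test 1'}], snap))  # unchanged: no alert
```

Note how fragile the split-based parse is: it assumes httpx emits exactly those four bracketed fields in that order, which is why the script skips a URL entirely when httpx produces no output.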