├── README.md
└── usvisa
    ├── README.md
    ├── app.py
    └── requirements.txt
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Playwright Python Scripts

This repository contains simple Python scripts that use Playwright for web automation tasks.

## Scripts

1. **US Visa Automation**: [Find my repo](https://github.com/tolgakurtuluss/playwright-scripts/tree/main/usvisa)
2. **Flight Price Scraper Automation (revised)**: [Redirect to @AviDataGeek's repo](https://github.com/AviDataGeek/ITAmatrix_price_scraper).
3. **Flight Price Scraper Automation (original)**: [Redirect to @frkncbngl's repo](https://github.com/frkncbngl/flight-price-web-scraper).
4. **Script 3**: maybe soon, maybe tomorrow.

Feel free to explore the scripts and use them for your web automation needs. USE THEM AT YOUR OWN RISK.
--------------------------------------------------------------------------------
/usvisa/README.md:
--------------------------------------------------------------------------------
# US Visa Appointment Automation with Playwright and BeautifulSoup

This Python script automates checking appointment dates on the US visa appointment website and notifies the user if an earlier appointment date is available.

***USE IT AT YOUR OWN RISK***

# Installation

## 1. Enter your credentials:

Replace the placeholder credentials defined near the top of `app.py` (the `USEREMAIL` and `PASSWORD` constants) with your own:

```python
USEREMAIL = "username@gmail.com"
PASSWORD = "passw0rd"
```

## 2. Install the required Python packages by running:

```bash
pip install -r requirements.txt
```

## 3. Install the Playwright browser binaries by running:

```bash
playwright install
```

* This step is required because the script launches the Chromium browser bundled with Playwright.

## 4. Run the script by executing the following command:

```bash
python app.py
```

## Requirements

- Playwright
- BeautifulSoup (beautifulsoup4)
- Pandas
- Requests

## Disclaimer

Please note that web scraping and automation may raise legal and ethical concerns. It is important to ensure that your activities comply with the terms of service of the website you are scraping or automating.

Additionally, when automating websites related to US visa information, it is crucial to be aware of and comply with US visa regulations and policies. Use this script responsibly and in accordance with all applicable laws and regulations.

The creators of this script are not responsible for any misuse or legal issues that may arise from its use. Use it at your own discretion.

--------------------------------------------------------------------------------
/usvisa/app.py:
--------------------------------------------------------------------------------
from bs4 import BeautifulSoup
from playwright.sync_api import sync_playwright, expect
import pandas as pd
import time
from datetime import datetime

USEREMAIL = "username@gmail.com"
PASSWORD = "passw0rd"

def main():
    with sync_playwright() as playwright:
        browser = playwright.chromium.launch(headless=False)
        expect.set_options(timeout=30_000)
        context = browser.new_context()
        page = context.new_page()
        page.goto("https://ais.usvisa-info.com/tr-tr/niv/users/sign_in")
        time.sleep(3)

        # Sign in (the form labels are Turkish: "E-posta" = "E-mail",
        # "Parola" = "Password", "Oturum Aç" = "Sign In").
        page.get_by_label("E-posta *").click()
        page.get_by_label("E-posta *").fill(USEREMAIL)
        page.get_by_label("E-posta *").press("Tab")
        page.get_by_label("Parola").fill(PASSWORD)
        time.sleep(1)
        # Tick the privacy-policy checkbox ("Gizlilik Politikasını" = "Privacy Policy").
        page.locator("label").filter(has_text="Gizlilik Politikasını ve").locator("div").click()
        page.get_by_role("button", name="Oturum Aç").click()
        time.sleep(5)
        page.wait_for_selector("img")

        # mevcutrandevutarihi = "current appointment date"
        mevcutrandevutarihi = get_current_appointment_date(page)

        # Navigate to the reschedule page ("Devam Et" = "Continue",
        # "Randevuyu Yeniden Zamanla" = "Reschedule Appointment").
        page.get_by_role("link", name="Devam Et").click()
        time.sleep(2)
        page.get_by_role("tab", name=" Randevuyu Yeniden Zamanla").click()
        time.sleep(3)
        page.get_by_role("link", name="Randevuyu Yeniden Zamanla").click()
        time.sleep(3)
        # Open the date picker ("Randevu Tarihi" = "Appointment Date").
        page.get_by_label("Randevu Tarihi *").click()
        time.sleep(3)

        # buldugumtarih = "the date found"
        buldugumtarih = get_earliest_available_date(page)

        compare_dates_and_notify(mevcutrandevutarihi, buldugumtarih)

        # Sign out ("Eylemler" = "Actions", "Oturumu Kapat" = "Sign Out").
        page.get_by_role("link", name="Eylemler").click()
        page.get_by_role("link", name="Oturumu Kapat").click()

        browser.close()

def get_current_appointment_date(page):
    # Parse the rendered page and extract the currently booked appointment
    # date from the <p class="consular-appt"> element.
    new_html = page.content()
    soup = BeautifulSoup(new_html, 'html.parser')

    given_str = soup.find('p', {'class': 'consular-appt'}).get_text().strip()

    # The second line holds the date, e.g. "20 Mayıs, 2024".
    date_str = given_str.split('\n')[1].strip().split(',')[0]
    year = given_str.split('\n')[1].strip().split(',')[1].strip()

    # Turkish month names mapped to month numbers.
    months = {
        'Ocak': 1, 'Şubat': 2, 'Mart': 3, 'Nisan': 4, 'Mayıs': 5, 'Haziran': 6,
        'Temmuz': 7, 'Ağustos': 8, 'Eylül': 9, 'Ekim': 10, 'Kasım': 11, 'Aralık': 12
    }

    date_parts = [part.strip() for part in date_str.split()]
    day = date_parts[0]
    month = date_parts[1]

    return datetime(int(year), months[month], int(day))

def get_earliest_available_date(page):
    new_html = page.content()
    soup = BeautifulSoup(new_html, 'html.parser')

    # Find the
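
# NOTE: compare_dates_and_notify() is called in main() above, but its
# definition falls outside this excerpt. The sketch below is a hypothetical
# stand-in, NOT the original implementation (the original may notify through
# e-mail or another channel); it only shows the comparison the README
# describes: report when an earlier appointment date is available.
# (datetime is already imported at the top of app.py.)
def compare_dates_and_notify(current_date, found_date):
    # Both arguments are datetime objects produced by the helpers above.
    if found_date < current_date:
        print(f"Earlier appointment available: {found_date:%d.%m.%Y} "
              f"(currently booked: {current_date:%d.%m.%Y})")
        return True
    print("No earlier appointment found.")
    return False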