├── README.md
├── Shopify Bot
│   ├── README.md
│   └── sharanga shopify bot.py
├── Sneaker-Notify-master
│   ├── README.md
│   └── main
│       ├── crawler.sql
│       ├── crawler_db.php
│       ├── items.py
│       ├── main.py
│       ├── middlewares.py
│       ├── mysql_pipeline.py
│       ├── proxies.txt
│       ├── random_useragent.py
│       ├── requirements.txt
│       ├── settings.py
│       └── useragents.txt
└── sneakermonitor
    ├── bots.txt
    ├── keywords.txt
    ├── nonshopify.txt
    ├── proxies.txt
    ├── scrapy.cfg
    ├── setup.txt
    ├── shopify sites.txt
    ├── sneakermonitor
    │   ├── __init__.pyc
    │   ├── items.py
    │   ├── items.pyc
    │   ├── middlewares.py
    │   ├── pipelines.py
    │   ├── pipelines.pyc
    │   ├── settings.py
    │   ├── settings.pyc
    │   ├── slack json.txt
    │   ├── slack.py
    │   ├── slack.pyc
    │   ├── spiders
    │   │   ├── __init__.py
    │   │   ├── __init__.pyc
    │   │   ├── monitor.py
    │   │   └── monitor.pyc
    │   ├── test.py
    │   └── useragents.txt
    ├── urls.txt
    └── useragents.txt
/README.md: --------------------------------------------------------------------------------
1 | # Restock-Monitor
2 |
3 | I have created a script for a restock monitor. Right now it is having some issues with proxies. I am trying to improve it and add more sites like Adidas, Supreme, Nike, etc., but I can't do much anymore because of school. If you can make improvements and fix problems, please do so. I know there is another GitHub project out there that covers many non-Shopify sites, and it would be cool if I could incorporate that.
4 |
5 | Run: Go to sneakermonitor>sneakermonitor>spiders and find the monitor Python file. That's the file you will be running.
6 | To run the file, type "python monitor.py" and press Enter.
7 |
8 | Hey! I have written a script for a restock monitor. I have finished it, but the proxies are not working for me; the proxy format is IP:Port:User:Password. It would be great if someone could help me. I am looking to add more features to the script.
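For reference, one way to turn a proxy line in that IP:Port:User:Password format into something `requests` can use looks like this (a rough sketch; the function name and example values are made up and not part of the repo):

```python
# Sketch: turn one line of proxies.txt (IP:Port:User:Password) into the
# proxy mapping the `requests` library accepts. The function name and the
# example values are invented for illustration.
def parse_proxy(line):
    ip, port, user, password = line.strip().split(":")
    url = "http://{}:{}@{}:{}".format(user, password, ip, port)
    return {"http": url, "https": url}

proxies = parse_proxy("192.0.2.10:8080:alice:s3cret")
# A session could then use it, e.g.:
# requests.get(url, proxies=proxies)
print(proxies["http"])  # -> http://alice:s3cret@192.0.2.10:8080
```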
The monitor should run 24/7 and catch restocks as soon as stock is loaded onto a website or a product page goes live, then automatically, with as little delay as possible, send that information (the item loaded and the corresponding link) to my Discord or Slack channel. I would also love for it to be configurable so that I can add additional sites (100+ Shopify sites, Adidas, Supreme, and many more). I would also love to be able to filter what gets shown on the Discord channel (keywords, possibly), as I don't want to get spammed with an obnoxious amount of information from stock loading on the websites I track. I also want to add proxies so I won't get banned from the sites. This information is very important to users who resell sneakers: they need the most up-to-date information about a site loading stock on the backend before it is available on the front end, or just an early link to purchase from before everyone else.
9 |
10 | I also have a script for a Shopify bot that I would like to include. It is labeled "Shopify Bot".
11 |
12 | The "Sneaker-Notify-master" folder is the GitHub project I mentioned at the beginning.
13 |
14 | ANYONE THAT IS WILLING TO HELP WILL BE PAID!!!!!!
15 |
-------------------------------------------------------------------------------- /Shopify Bot/README.md: --------------------------------------------------------------------------------
1 | # sharanga shopify bot v1.1
2 |
3 | ## About
4 | This project is a work in progress. Its main goal is to automate the checkout process on any Shopify-powered site given a few keywords.
5 |
6 | ### Usage
7 | Edit the fields under "USER SETTINGS" in the .py file. Keywords should be defined as a list, with each word separated by a comma and placed within quotation marks. Some formatting is required for certain fields, as stated within the .py file.
8 |
9 | ### What can this project do?
10 | * Find all products on a given Shopify site 11 | * Search through products based on keywords 12 | * Find sizes of products 13 | * Add products based on sizes to cart 14 | * Checkout products with given customer info 15 | * Monitor sites until a product is found in real-time 16 | 17 | ### What's left to do? 18 | * Proxy support 19 | * Checkout process 20 | * Log all events for debugging purposes 21 | * Profiles for user info (billing and shipping) 22 | * Manual captcha completion and 2Captcha support (if captchas are required) 23 | * Get authentication token (doesn't appear to be required and is thus being left out for now) 24 | * Add support for sites like KITH and DSMNY which use a different link for 25 | products (i.e https://kith.com/collections/all/products.atom) 26 | * PEP8 compliant 27 | 28 | ### What other features are planned for a final version? 29 | * Rewrite in C# with a GUI (in progress) 30 | * Support for queue-based releases 31 | * Multithreading to support multiple tasks at once 32 | * Preload payment token with a button click by the user (i.e user can click a button to fetch payment token prior to the drop so resources aren't wasted during tasks trying to get the token, ultimately saving time) or get payment token asynchronously 33 | * Preload gateways by specifying a sitelist to save resources during runtime , saving time during the checkout process, while also leaving the option for the user to input a custom Shopify-enabled site to ensure as many sites are supported as possible 34 | 35 | ## Disclaimer 36 | Not affiliated with Shopify or any sites in any way. This is a personal project. Please do NOT sell this script. Share it with others and help build upon it to make it more efficient. 37 | It may not be stable. Please let me know of any problems you come across. 38 | 39 | If you have any suggestions, or want to help make this better, let me know! 
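The keyword search step described above can be sketched in a few lines (the sample product list and helper name below are invented for illustration; the real bot downloads the list from the site's /products.json):

```python
# Minimal sketch of keyword matching over a Shopify /products.json payload.
# The sample product list is invented for illustration; the real bot fetches
# it from base_url + "/products.json".
def matches_all_keywords(title, keywords):
    """True if every keyword appears in the product title, case-insensitively."""
    return all(keyword.upper() in title.upper() for keyword in keywords)

products = [
    {"title": "Runner Low White"},
    {"title": "Runner High AIO Black"},
]

hits = [p for p in products if matches_all_keywords(p["title"], ["runner", "aio"])]
print(hits[0]["title"])  # -> Runner High AIO Black
```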
40 |
41 | ## License
42 | This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
43 |
-------------------------------------------------------------------------------- /Shopify Bot/sharanga shopify bot.py: --------------------------------------------------------------------------------
1 | '''
2 | sharanga shopify bot v1.1
3 | a shopify bot (WIP)
4 | developed by @snivynGOD
5 |
6 | TO-DO
7 | - check if payment was successful or failed
8 | - MAYBE get auth token? doesn't appear to be needed though
9 | - MAYBE get captcha? not sure if needed (create a captcha-solving module if so)
10 | - add a module for sites like KITH and DSMNY which use a different link for
11 | products (https://kith.com/collections/all/products.atom)
12 | - rewrite in C# with a GUI and multi-threading to support multiple tasks at once (coming soon)
13 | - make the code PEP8 compliant
14 |
15 | EFFICIENCY TO-DO
16 | not insanely effective but it'll help
17 |
18 | - preload payment tokens with a button click by the user (i.e. the user can click a button
19 | to fetch the payment token prior to the drop so resources aren't wasted during tasks
20 | trying to get the token) or get the payment token asynchronously
21 | - preload gateways by specifying a sitelist, but also leave the option for
22 | a user-inputted Shopify site to ensure as many sites are supported as possible
23 | '''
24 |
25 | from bs4 import BeautifulSoup as soup
26 | import requests
27 | import time
28 | import json
29 | import urllib3
30 | import codecs
31 | import random
32 |
33 |
34 | ''' ------------------------------ SETTINGS ------------------------------ '''
35 | # Global settings
36 | base_url = "https://www.cybersole.io"  # Don't add a / at the end
37 |
38 | # Search settings
39 | keywords = ["cybersole", "AIO"]  # Separate keywords with a comma
40 | size = "11"
41 |
42 | # If a size is sold out, a random size will be chosen instead, as a backup plan
43 | random_size = False
44 |
45 | # To avoid a Shopify soft-ban, a
delay of 7.5 seconds is recommended if
46 | # starting a task much earlier than release time (minutes before release)
47 | # Otherwise, a delay of 1 second or less is ideal
48 | search_delay = 7.5
49 |
50 | # Checkout settings
51 | email = "email"
52 | fname = "name"
53 | lname = "name"
54 | addy1 = "address"
55 | addy2 = ""  # Can be left blank
56 | city = "city"
57 | province = "state"
58 | country = "country"
59 | postal_code = "48306"
60 | phone = "phone"
61 | card_number = "card"  # No spaces
62 | cardholder = "name"
63 | exp_m = "month"  # 2 digits
64 | exp_y = "year"  # 4 digits
65 | cvv = "number"  # 3 digits
66 |
67 | ''' ------------------------------- MODULES ------------------------------- '''
68 |
69 |
70 | def get_products(session):
71 |     '''
72 |     Gets all the products from a Shopify site.
73 |     '''
74 |     # Download the products
75 |     link = base_url + "/products.json"
76 |     r = session.get(link, verify=False)
77 |
78 |     # Load the product data
79 |     products_json = json.loads(r.text)
80 |     products = products_json["products"]
81 |
82 |     # Return the products
83 |     return products
84 |
85 |
86 | def keyword_search(session, products, keywords):
87 |     '''
88 |     Searches through the given products from a Shopify site to find a product
89 |     containing all the defined keywords.
90 |     '''
91 |     # Go through each product
92 |     for product in products:
93 |         # Set a counter to check if all the keywords are found
94 |         keys = 0
95 |         # Go through each keyword
96 |         for keyword in keywords:
97 |             # If the keyword exists in the title
98 |             if(keyword.upper() in product["title"].upper()):
99 |                 # Increment the counter
100 |                 keys += 1
101 |         # If all the keywords were found
102 |         if(keys == len(keywords)):
103 |             # Return the product
104 |             return product
105 |
106 |
107 | def find_size(session, product, size):
108 |     '''
109 |     Find the specified size of a product from a Shopify site.
110 | ''' 111 | # Go through each variant for the product 112 | for variant in product["variants"]: 113 | # Check if the size is found 114 | # Use 'in' instead of '==' in case the site lists sizes as 11 US 115 | if(size in variant["title"]): 116 | variant = str(variant["id"]) 117 | 118 | # Return the variant for the size 119 | return variant 120 | 121 | # If the size isn't found but random size is enabled 122 | if(random_size): 123 | # Initialize a list of variants 124 | variants = [] 125 | 126 | # Add all the variants to the list 127 | for variant in product["variants"]: 128 | variants.append(variant["id"]) 129 | 130 | # Randomly select a variant 131 | variant = str(random.choice(variants)) 132 | 133 | # Return the result 134 | return variant 135 | 136 | 137 | def generate_cart_link(session, variant): 138 | ''' 139 | Generate the add to cart link for a Shopify site given a variant ID. 140 | ''' 141 | # Create the link to add the product to cart 142 | link = base_url + "/cart/" + variant + ":1" 143 | 144 | # Return the link 145 | return link 146 | 147 | 148 | def get_payment_token(card_number, cardholder, expiry_month, expiry_year, cvv): 149 | ''' 150 | Given credit card details, the payment token for a Shopify checkout is 151 | returned. 
152 | ''' 153 | # POST information to get the payment token 154 | link = "https://elb.deposit.shopifycs.com/sessions" 155 | 156 | payload = { 157 | "credit_card": { 158 | "number": card_number, 159 | "name": cardholder, 160 | "month": expiry_month, 161 | "year": expiry_year, 162 | "verification_value": cvv 163 | } 164 | } 165 | 166 | r = requests.post(link, json=payload, verify=False) 167 | 168 | # Extract the payment token 169 | payment_token = json.loads(r.text)["id"] 170 | 171 | # Return the payment token 172 | return payment_token 173 | 174 | 175 | def get_shipping(postal_code, country, province, cookie_jar): 176 | ''' 177 | Given address details and the cookies of a Shopify checkout session, a shipping option is returned 178 | ''' 179 | # Get the shipping rate info from the Shopify site 180 | link = base_url + "//cart/shipping_rates.json?shipping_address[zip]=" + postal_code + "&shipping_address[country]=" + country + "&shipping_address[province]=" + province 181 | r = session.get(link, cookies=cookie_jar, verify=False) 182 | 183 | # Load the shipping options 184 | shipping_options = json.loads(r.text) 185 | 186 | # Select the first shipping option 187 | ship_opt = shipping_options["shipping_rates"][0]["name"].replace(' ', "%20") 188 | ship_prc = shipping_options["shipping_rates"][0]["price"] 189 | 190 | # Generate the shipping token to submit with checkout 191 | shipping_option = "shopify-" + ship_opt + "-" + ship_prc 192 | 193 | # Return the shipping option 194 | return shipping_option 195 | 196 | 197 | def add_to_cart(session, variant): 198 | ''' 199 | Given a session and variant ID, the product is added to cart and the 200 | response is returned. 
201 | ''' 202 | # Add the product to cart 203 | link = base_url + "/cart/add.js?quantity=1&id=" + variant 204 | response = session.get(link, verify=False) 205 | 206 | # Return the response 207 | return response 208 | 209 | 210 | def submit_customer_info(session, cookie_jar): 211 | ''' 212 | Given a session and cookies for a Shopify checkout, the customer's info 213 | is submitted. 214 | ''' 215 | # Submit the customer info 216 | payload = { 217 | "utf8": u"\u2713", 218 | "_method": "patch", 219 | "authenticity_token": "", 220 | "previous_step": "contact_information", 221 | "step": "shipping_method", 222 | "checkout[email]": email, 223 | "checkout[buyer_accepts_marketing]": "0", 224 | "checkout[shipping_address][first_name]": fname, 225 | "checkout[shipping_address][last_name]": lname, 226 | "checkout[shipping_address][company]": "", 227 | "checkout[shipping_address][address1]": addy1, 228 | "checkout[shipping_address][address2]": addy2, 229 | "checkout[shipping_address][city]": city, 230 | "checkout[shipping_address][country]": country, 231 | "checkout[shipping_address][province]": province, 232 | "checkout[shipping_address][zip]": postal_code, 233 | "checkout[shipping_address][phone]": phone, 234 | "checkout[remember_me]": "0", 235 | "checkout[client_details][browser_width]": "1710", 236 | "checkout[client_details][browser_height]": "1289", 237 | "checkout[client_details][javascript_enabled]": "1", 238 | "button": "" 239 | } 240 | 241 | link = base_url + "//checkout.json" 242 | response = session.get(link, cookies=cookie_jar, verify=False) 243 | 244 | # Get the checkout URL 245 | link = response.url 246 | checkout_link = link 247 | 248 | # POST the data to the checkout URL 249 | response = session.post(link, cookies=cookie_jar, data=payload, verify=False) 250 | 251 | # Return the response and the checkout link 252 | return (response, checkout_link) 253 | 254 | ''' ------------------------------- CODE ------------------------------- ''' 255 | 256 | # Initialize 257 
| session = requests.session() 258 | urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning) 259 | product = None 260 | 261 | # Loop until a product containing all the keywords is found 262 | while(product == None): 263 | # Grab all the products on the site 264 | products = get_products(session) 265 | # Grab the product defined by keywords 266 | product = keyword_search(session, products, keywords) 267 | if(product == None): 268 | time.sleep(search_delay) 269 | 270 | # Get the variant ID for the size 271 | variant = find_size(session, product, size) 272 | 273 | # Get the cart link 274 | cart_link = generate_cart_link(session, variant) 275 | 276 | # Add the product to cart 277 | r = add_to_cart(session, variant) 278 | 279 | # Store the cookies 280 | cj = r.cookies 281 | 282 | # Get the payment token 283 | p = get_payment_token(card_number, cardholder, exp_m, exp_y, cvv) 284 | 285 | # Submit customer info and get the checkout url 286 | (r, checkout_link) = submit_customer_info(session, cj) 287 | 288 | # Get the shipping info 289 | ship = get_shipping(postal_code, country, province, cj) 290 | 291 | # Get the payment gateway ID 292 | link = checkout_link + "?step=payment_method" 293 | r = session.get(link, cookies=cj, verify=False) 294 | 295 | bs = soup(r.text, "html.parser") 296 | div = bs.find("div", {"class": "radio__input"}) 297 | print(div) 298 | 299 | gateway = "" 300 | values = str(div.input).split('"') 301 | for value in values: 302 | if value.isnumeric(): 303 | gateway = value 304 | break 305 | 306 | # Submit the payment 307 | link = checkout_link 308 | payload = { 309 | "utf8": u"\u2713", 310 | "_method": "patch", 311 | "authenticity_token": "", 312 | "previous_step": "payment_method", 313 | "step": "", 314 | "s": p, 315 | "checkout[payment_gateway]": gateway, 316 | "checkout[credit_card][vault]": "false", 317 | "checkout[different_billing_address]": "true", 318 | "checkout[billing_address][first_name]": fname, 319 | 
"checkout[billing_address][last_name]": lname, 320 | "checkout[billing_address][address1]": addy1, 321 | "checkout[billing_address][address2]": addy2, 322 | "checkout[billing_address][city]": city, 323 | "checkout[billing_address][country]": country, 324 | "checkout[billing_address][province]": province, 325 | "checkout[billing_address][zip]": postal_code, 326 | "checkout[billing_address][phone]": phone, 327 | "checkout[shipping_rate][id]": ship, 328 | "complete": "1", 329 | "checkout[client_details][browser_width]": str(random.randint(1000, 2000)), 330 | "checkout[client_details][browser_height]": str(random.randint(1000, 2000)), 331 | "checkout[client_details][javascript_enabled]": "1", 332 | "g-recaptcha-repsonse": "", 333 | "button": "" 334 | } 335 | 336 | r = session.post(link, cookies=cj, data=payload, verify=False) 337 | -------------------------------------------------------------------------------- /Sneaker-Notify-master/README.md: -------------------------------------------------------------------------------- 1 | **Donation:** 2 | Feel free to buy me a cup of coffee, so I can stay motivated and keep updating this project. 3 | 4 | [](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=3WA5WTGP9HPYG) 5 | 6 | # 7 | Sneaker Restock/Monitor Notify via Twitter/Discord/Slack coded in Python using Scrapy. 8 | 9 |  10 | # 11 | Status: **Under Development. If Interested feel free to [](https://twitter.com/w_notify) on Twitter. To view the database online check [here](https://shoesaddictor.com/Crawler_DB.php).** 12 | 13 | Description: Crawl a list of sneaker websites. Once the new product is found or is restocked. It will check the item's name for certain keywords. If found, it will alert the user via Twitter using tweets or Discord/Slack using WebHook with date, time, item name, and link. 
14 | # 15 | **Supported Sites List:** 16 | - 43einhalb 17 | - 5 Pointz 18 | - AFewStore 19 | - ALLike 20 | - Addict 21 | - AdidasUS 22 | - AdidasEU 23 | - Aphrodite 24 | - AsphaltGold 25 | - BSTN 26 | - Back Door 27 | - Bait 28 | - Barneys - Ban if crawl too much. 29 | - Basket Store 30 | - Blends 31 | - Bodega 32 | - Caliroots 33 | - Capsule 34 | - ChampsSports 35 | - City Gear 36 | - Concepts 37 | - Consortium 38 | - DeadStock 39 | - DefShop 40 | - Dope Factory 41 | - Drome 42 | - DSMNY 43 | - EastBay 44 | - End - Captcha if crawl too much. 45 | - Extra Butter NY 46 | - Feature 47 | - FinishLine - Banned on Vultr. 48 | - FootAction 49 | - FootAsylum 50 | - FootDistrict 51 | - FootLocker 52 | - FootPatrol 53 | - FootShop 54 | - Hanon 55 | - Haven 56 | - Hubbastille 57 | - HypeBeast 58 | - HypeDC 59 | - Inflammable 60 | - JDSports 61 | - JimmyJazz - ASN blocked on Vultr via CloudFlare. 62 | - Kith 63 | - Kong 64 | - Lapstone and Hammer 65 | - Loaded 66 | - Luisa Via Roma 67 | - MrPorter 68 | - NeedSupply 69 | - Next Door 70 | - NiceKicks 71 | - Nike 72 | - Nordstrom 73 | - Notre 74 | - Nrml 75 | - Office 76 | - Offspring 77 | - Oneness 78 | - OverKill 79 | - Packer 80 | - Proper 81 | - Puff Reds 82 | - Renarts 83 | - Rise45 84 | - Ruvilla 85 | - SSense 86 | - SVD 87 | - SaintAlfred 88 | - SaveOurSole 89 | - Shelf Life 90 | - ShoesPalace - Need to disobey robots.txt, if you want to crawl. 91 | - Size 92 | - Slam Jam Socialism 93 | - SneakerBaas 94 | - SneakerNStuff - ASN blocked on Vultr via CloudFlare. 
95 | - SneakerPolitics
96 | - SocialStatus
97 | - SoleBox
98 | - SoleKitchen
99 | - SoleStop
100 | - Solefly
101 | - StickABush
102 | - StormFashion
103 | - Summer
104 | - Tint Footware
105 | - Titolo
106 | - Tres Bien
107 | - TrophyRoom
108 | - Undefeated
109 | - Uptown
110 | - Urban Industry
111 | - Urban Jungle
112 | - Urban Outfitters
113 | - WellGosh
114 | - West NYC
115 | - XileClothing
116 | - YCMC
117 | - YME Universe
118 | - Zappos
119 | #
120 | **Setup:**
121 | 1. Make sure you have Python installed (works on Python 2.7; it does not work on Python 3). To install Python, go to https://www.python.org/
122 |
123 | 2. Install the pip requirements, which can be found in requirements.txt.
124 |
125 | - Some Windows users will also need to install:
126 |
127 | ```
128 | pip install pypiwin32
129 | ```
130 |
131 | - For Mac, to install MySQL-python, open a terminal and type:
132 | ```
133 | /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
134 | brew install mysql-connector-c
135 | pip install MySQL-python
136 | ```
137 |
138 | - If you have problems installing MySQLdb or get "no module named mysqldb" on Windows, go to: http://www.lfd.uci.edu/~gohlke/pythonlibs/#mysql-python
139 |
140 | ```
141 | Download:
142 | MySQL_python-1.2.5-cp27-none-win32.whl for 32-bit Python
143 | MySQL_python-1.2.5-cp27-none-win_amd64.whl for 64-bit Python
144 |
145 | pip install wheel
146 | pip install MySQL_python-1.2.5-cp27-none-win32.whl   # 32-bit Python
147 | pip install MySQL_python-1.2.5-cp27-none-win_amd64.whl   # 64-bit Python
148 | ```
149 |
150 | 3. Install the MySQL database -> the .sql file is provided in the folder.
151 |
152 | 4. Go into mysql_pipeline.py, edit the MySQL connection info, and set Twitter's CONSUMER_KEY, CONSUMER_SECRET, ACCESS_TOKEN_KEY, and ACCESS_TOKEN_SECRET to your Twitter account's info.
153 | To use Discord or Slack, paste the webhook URL into requests.post(' DISCORD or SLACK WEBHOOK URL ', data={'content': TEXT_TO_TWEET})
154 |
155 | 5. To run:
156 | - For Windows: double-click main.py.
157 | - For Mac: open a terminal in the folder and type python main.py.
158 |
159 | Optional: a crawler_db.php is included to see the data online.
160 | - To install, put crawler_db.php on your web server (with the crawler's database running) and edit the connection info.
-------------------------------------------------------------------------------- /Sneaker-Notify-master/main/crawler_db.php: --------------------------------------------------------------------------------
<?php
// Connect to the crawler database (edit the connection info).
$conn = new mysqli($servername, $username, $password, $dbname);
if($conn->connect_error){
    die("Connection failed " . $conn->connect_error);
}

// Process query.
$sql = "SELECT * FROM " . $_GET["site"] . " ORDER BY " . $_GET["sort"] . " " . $_GET["type"];
$result = $conn->query($sql);

// Get query result. Output columns: NAME | LINK | DATE.
if($result->num_rows){
    while($row = $result->fetch_assoc()){
        echo "
" . utf8_encode($row["name"]) . " | " . utf8_encode($row["link"]) . " | " . utf8_encode($row["date"]) . " | "; 368 | } 369 | } 370 | else{ 371 | echo "0 results found."; 372 | } 373 | 374 | // Close connection. 375 | $conn->close(); 376 | ?> 377 | 378 | 379 | -------------------------------------------------------------------------------- /Sneaker-Notify-master/main/items.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # Sneaker Notify 5 | # author - Yu Lin 6 | # https://github.com/yulin12345 7 | # admin@yulin12345.site 8 | 9 | from scrapy.item import Item, Field 10 | 11 | 12 | class CrawlerItem(Item): 13 | date = Field() 14 | price = Field() 15 | image = Field() 16 | link = Field() 17 | name = Field() 18 | size = Field() 19 | 20 | class KithItem(Item): 21 | date = Field() 22 | price = Field() 23 | image = Field() 24 | link = Field() 25 | name = Field() 26 | size = Field() 27 | size = Field() 28 | 29 | class RuvillaItem(Item): 30 | date = Field() 31 | price = Field() 32 | image = Field() 33 | link = Field() 34 | name = Field() 35 | size = Field() 36 | 37 | class FootLockerItem(Item): 38 | date = Field() 39 | price = Field() 40 | image = Field() 41 | link = Field() 42 | name = Field() 43 | size = Field() 44 | 45 | class FootActionItem(Item): 46 | date = Field() 47 | price = Field() 48 | image = Field() 49 | link = Field() 50 | name = Field() 51 | size = Field() 52 | 53 | class ChampsItem(Item): 54 | date = Field() 55 | price = Field() 56 | image = Field() 57 | link = Field() 58 | name = Field() 59 | size = Field() 60 | 61 | class EastBayItem(Item): 62 | date = Field() 63 | price = Field() 64 | image = Field() 65 | link = Field() 66 | name = Field() 67 | size = Field() 68 | 69 | class FinishLineItem(Item): 70 | date = Field() 71 | price = Field() 72 | image = Field() 73 | link = Field() 74 | name = Field() 75 | size = Field() 76 | 77 | class AdidasItem(Item): 78 | date 
= Field() 79 | price = Field() 80 | image = Field() 81 | link = Field() 82 | name = Field() 83 | size = Field() 84 | 85 | class NikeItem(Item): 86 | date = Field() 87 | price = Field() 88 | image = Field() 89 | link = Field() 90 | name = Field() 91 | size = Field() 92 | 93 | class FootShopItem(Item): 94 | date = Field() 95 | price = Field() 96 | image = Field() 97 | link = Field() 98 | name = Field() 99 | size = Field() 100 | 101 | class CalirootsItem(Item): 102 | date = Field() 103 | price = Field() 104 | image = Field() 105 | link = Field() 106 | name = Field() 107 | size = Field() 108 | 109 | class AfewItem(Item): 110 | date = Field() 111 | price = Field() 112 | image = Field() 113 | link = Field() 114 | name = Field() 115 | size = Field() 116 | 117 | class EinhalbItem(Item): 118 | date = Field() 119 | price = Field() 120 | image = Field() 121 | link = Field() 122 | name = Field() 123 | size = Field() 124 | 125 | class EndItem(Item): 126 | date = Field() 127 | price = Field() 128 | image = Field() 129 | link = Field() 130 | name = Field() 131 | size = Field() 132 | 133 | class SNSItem(Item): 134 | date = Field() 135 | price = Field() 136 | image = Field() 137 | link = Field() 138 | name = Field() 139 | size = Field() 140 | 141 | class GoodWillOutItem(Item): 142 | date = Field() 143 | price = Field() 144 | image = Field() 145 | link = Field() 146 | name = Field() 147 | size = Field() 148 | 149 | class TintItem(Item): 150 | date = Field() 151 | price = Field() 152 | image = Field() 153 | link = Field() 154 | name = Field() 155 | size = Field() 156 | 157 | class OverkillItem(Item): 158 | date = Field() 159 | price = Field() 160 | image = Field() 161 | link = Field() 162 | name = Field() 163 | size = Field() 164 | 165 | class FootDistrictItem(Item): 166 | date = Field() 167 | price = Field() 168 | image = Field() 169 | link = Field() 170 | name = Field() 171 | size = Field() 172 | 173 | class SizeItem(Item): 174 | date = Field() 175 | price = Field() 176 | image = 
Field() 177 | link = Field() 178 | name = Field() 179 | size = Field() 180 | 181 | class YCMCItem(Item): 182 | date = Field() 183 | price = Field() 184 | image = Field() 185 | link = Field() 186 | name = Field() 187 | size = Field() 188 | 189 | class CityItem(Item): 190 | date = Field() 191 | price = Field() 192 | image = Field() 193 | link = Field() 194 | name = Field() 195 | size = Field() 196 | 197 | class NordstromItem(Item): 198 | date = Field() 199 | price = Field() 200 | image = Field() 201 | link = Field() 202 | name = Field() 203 | size = Field() 204 | 205 | class BarneysItem(Item): 206 | date = Field() 207 | price = Field() 208 | image = Field() 209 | link = Field() 210 | name = Field() 211 | size = Field() 212 | 213 | class JimmyJazzItem(Item): 214 | date = Field() 215 | price = Field() 216 | image = Field() 217 | link = Field() 218 | name = Field() 219 | size = Field() 220 | 221 | class JDSportsItem(Item): 222 | date = Field() 223 | price = Field() 224 | image = Field() 225 | link = Field() 226 | name = Field() 227 | size = Field() 228 | 229 | class FootPatrolItem(Item): 230 | date = Field() 231 | price = Field() 232 | image = Field() 233 | link = Field() 234 | name = Field() 235 | size = Field() 236 | 237 | class SneakerBaasItem(Item): 238 | date = Field() 239 | price = Field() 240 | image = Field() 241 | link = Field() 242 | name = Field() 243 | size = Field() 244 | 245 | class SneakerPoliticsItem(Item): 246 | date = Field() 247 | price = Field() 248 | image = Field() 249 | link = Field() 250 | name = Field() 251 | size = Field() 252 | 253 | class UrbanIndustryItem(Item): 254 | date = Field() 255 | price = Field() 256 | image = Field() 257 | link = Field() 258 | name = Field() 259 | size = Field() 260 | 261 | class UrbanOutfittersItem(Item): 262 | date = Field() 263 | price = Field() 264 | image = Field() 265 | link = Field() 266 | name = Field() 267 | size = Field() 268 | 269 | class LuisaItem(Item): 270 | date = Field() 271 | price = Field() 272 | 
    image = Field()
    link = Field()
    name = Field()
    size = Field()


# Every remaining item carries the same six fields, so they are declared once
# on a shared base class. Scrapy items inherit declared fields, and the
# isinstance() checks in the pipelines keep working unchanged.
class ProductItem(Item):
    date = Field()
    price = Field()
    image = Field()
    link = Field()
    name = Field()
    size = Field()


class SlamJamItem(ProductItem): pass
class Rise45Item(ProductItem): pass
class UndefeatedItem(ProductItem): pass
class ZapposItem(ProductItem): pass
class UbiqItem(ProductItem): pass
class PointzItem(ProductItem): pass
class KicksItem(ProductItem): pass
class ShoesPalaceItem(ProductItem): pass
class StickABushItem(ProductItem): pass
class KongItem(ProductItem): pass
class SaveOurSoleItem(ProductItem): pass
class InflammableItem(ProductItem): pass
class DefShopItem(ProductItem): pass
class OffSpringItem(ProductItem): pass
class SoleKitchenItem(ProductItem): pass
class DromeItem(ProductItem): pass
class FootAsylumItem(ProductItem): pass
class HHVItem(ProductItem): pass
class ConceptsItem(ProductItem): pass
class SocialStatusItem(ProductItem): pass
class ExtraButterItem(ProductItem): pass
class BodegaItem(ProductItem): pass
class SaintAlfredItem(ProductItem): pass
class LapstoneNHammerItem(ProductItem): pass
class ShelfLifeItem(ProductItem): pass
class AsphaltGoldItem(ProductItem): pass
class HanonItem(ProductItem): pass
class SoleBoxItem(ProductItem): pass
class ConsortiumItem(ProductItem): pass
class HavenItem(ProductItem): pass
class NeedSupplyItem(ProductItem): pass
class LoadedItem(ProductItem): pass
class WellGoshItem(ProductItem): pass
class CapsuleItem(ProductItem): pass
class YMEItem(ProductItem): pass
class HypeDCItem(ProductItem): pass
class HolyPopItem(ProductItem): pass
class BSTNItem(ProductItem): pass
class TrophyRoomItem(ProductItem): pass
class SideStepItem(ProductItem): pass
class ShiekhItem(ProductItem): pass
class RezetItem(ProductItem): pass
class FootLockerEUItem(ProductItem): pass
class OfficeItem(ProductItem): pass
class ALLikeItem(ProductItem): pass
class SportsShoesItem(ProductItem): pass
class RunnersPointItem(ProductItem): pass
class GraffitiItem(ProductItem): pass
class UrbanJungleItem(ProductItem): pass
class SSenseItem(ProductItem): pass
class BackDoorItem(ProductItem): pass
class BasketItem(ProductItem): pass
class OneBlockDownItem(ProductItem): pass
class DopeFactoryItem(ProductItem): pass
class NextDoorItem(ProductItem): pass
class SummerItem(ProductItem): pass
class MrPorterItem(ProductItem): pass
class StormFashionItem(ProductItem): pass
class TresBienItem(ProductItem): pass
class PackerItem(ProductItem): pass
class AddictItem(ProductItem): pass
class AphroditeItem(ProductItem): pass
class BaitItem(ProductItem): pass
class BlendsItem(ProductItem): pass
class NiceKicksItem(ProductItem): pass
class ClicksItem(ProductItem): pass
class FeatureItem(ProductItem): pass
class HypeBeastItem(ProductItem): pass
class DeadStockItem(ProductItem): pass
class NotreItem(ProductItem): pass
class NrmlItem(ProductItem): pass
class OnenessItem(ProductItem): pass
class PufferRedsItem(ProductItem): pass
class RenartsItem(ProductItem): pass
class ShoesGalleryItem(ProductItem): pass
class ProperItem(ProductItem): pass
class SoleStopItem(ProductItem): pass
class TitoloItem(ProductItem): pass
class UptownItem(ProductItem): pass
class WestNYCItem(ProductItem): pass
class WishATLItem(ProductItem): pass
class XileClothingItem(ProductItem): pass
class SoleflyItem(ProductItem): pass
class PattaItem(ProductItem): pass
class SVDItem(ProductItem): pass
class DSMNYItem(ProductItem): pass
class HubbastilleItem(ProductItem): pass
class ShoesAddictorItem(ProductItem): pass
--------------------------------------------------------------------------------
/Sneaker-Notify-master/main/middlewares.py:
--------------------------------------------------------------------------------
#!/usr/bin/env python
# -*- coding: utf-8 -*-

# Sneaker Notify
# author - Yu Lin
# https://github.com/yulin12345
# admin@yulin12345.site

# Define here the models for your spider middleware
#
# See documentation in:
# http://doc.scrapy.org/en/latest/topics/spider-middleware.html

from colorama import Fore, Style
from scrapy import signals


class CrawlerSpiderMiddleware(object):
    # Not all methods need to be defined. If a method is not defined,
    # scrapy acts as if the spider middleware does not modify the
    # passed objects.

    @classmethod
    def from_crawler(cls, crawler):
        # This method is used by Scrapy to create your spiders.
        s = cls()
        crawler.signals.connect(s.spider_opened, signal=signals.spider_opened)
        return s

    def process_spider_input(self, response, spider):
        # Called for each response that goes through the spider
        # middleware and into the spider.
        # Should return None or raise an exception.
        return None

    def process_spider_output(self, response, result, spider):
        # Called with the results returned from the Spider, after
        # it has processed the response.
        # Must return an iterable of Request, dict or Item objects.
        for i in result:
            yield i

    def process_spider_exception(self, response, exception, spider):
        # Called when a spider or process_spider_input() method
        # (from other spider middleware) raises an exception.
        # Should return either None or an iterable of Response, dict
        # or Item objects.
        pass

    def process_start_requests(self, start_requests, spider):
        # Called with the start requests of the spider, and works
        # similarly to the process_spider_output() method, except
        # that it doesn't have a response associated.

        # Must return only requests (not items).
        for r in start_requests:
            yield r

    def spider_opened(self, spider):
        spider.logger.info(Fore.RED + 'Spider opened: %s' % spider.name + Style.RESET_ALL)
--------------------------------------------------------------------------------
/Sneaker-Notify-master/main/mysql_pipeline.py:
--------------------------------------------------------------------------------
#!/usr/bin/env python
# -*- coding: utf-8 -*-

# Sneaker Notify
# author - Yu Lin
# https://github.com/yulin12345
# admin@yulin12345.site

from datetime import datetime
import hashlib
import sys

import MySQLdb
from TwitterAPI import TwitterAPI
from colorama import Fore, Style
from pytz import timezone
import requests
from scrapy.exceptions import DropItem
from scrapy.http import Request

from items import *


# Convert time to EST.
DATE_CONVERT = datetime.now(timezone('US/Eastern'))
DATE = DATE_CONVERT.strftime("%m-%d-%Y|%H:%M:%S")

# MySQL pipeline for storing scraped items.
class MYSQL_Pipeline(object):

    def __init__(self):
        # Database connection info. (host, user, password, database)
        self.conn = MySQLdb.connect(host='HOST NAME HERE', user='USER NAME HERE', passwd='PASSWORD HERE', db='DATABASE NAME HERE', charset="utf8", use_unicode=True)
        self.conn.ping(True)
        self.cursor = self.conn.cursor()

    # Every per-site table below shares the same (name, link, date) columns,
    # so one helper runs the parameterised INSERT. The table name comes only
    # from the fixed literals in process_item(), never from scraped data, and
    # is backtick-quoted so reserved words such as `end` work as table names.
    def _insert(self, table, item):
        self.cursor.execute("INSERT INTO `" + table + "` (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE))

    # Process the item and insert it into the table for its source site.
    def process_item(self, item, spider):
        try:
            # The kith table additionally stores the size column.
            if isinstance(item, KithItem):
                self.cursor.execute("INSERT INTO kith (name, link, size, date) VALUES (%s, %s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), item['size'].encode('utf-8'), DATE))
            elif isinstance(item, RuvillaItem): self._insert('ruvilla', item)
            elif isinstance(item, FootLockerItem): self._insert('footlocker', item)
            elif isinstance(item, FootActionItem): self._insert('footaction', item)
            elif isinstance(item, ChampsItem): self._insert('champs', item)
            elif isinstance(item, EastBayItem): self._insert('eastbay', item)
            elif isinstance(item, FinishLineItem): self._insert('finishline', item)
            elif isinstance(item, AdidasItem): self._insert('adidas', item)
            elif isinstance(item, NikeItem): self._insert('nike', item)
            elif isinstance(item, FootShopItem): self._insert('footshop', item)
            elif isinstance(item, CalirootsItem): self._insert('caliroots', item)
            elif isinstance(item, AfewItem): self._insert('afew', item)
            elif isinstance(item, EinhalbItem): self._insert('einhalb', item)
            elif isinstance(item, EndItem): self._insert('end', item)
            elif isinstance(item, SNSItem): self._insert('sns', item)
            elif isinstance(item, GoodWillOutItem): self._insert('goodwillout', item)
            elif isinstance(item, TintItem): self._insert('tint', item)
            elif isinstance(item, OverkillItem): self._insert('overkill', item)
            elif isinstance(item, FootDistrictItem): self._insert('footdistrict', item)
            elif isinstance(item, SizeItem): self._insert('size', item)
            elif isinstance(item, YCMCItem): self._insert('ycmc', item)
            elif isinstance(item, CityItem): self._insert('city', item)
            elif isinstance(item, NordstromItem): self._insert('nordstrom', item)
            elif isinstance(item, BarneysItem): self._insert('barneys', item)
            elif isinstance(item, JimmyJazzItem): self._insert('jimmyjazz', item)
            elif isinstance(item, JDSportsItem): self._insert('jdsports', item)
            elif isinstance(item, FootPatrolItem): self._insert('footpatrol', item)
            elif isinstance(item, SneakerBaasItem): self._insert('sneakerbaas', item)
            elif isinstance(item, SneakerPoliticsItem): self._insert('sneakerpolitics', item)
            elif isinstance(item, UrbanIndustryItem): self._insert('urbanindustry', item)
            elif isinstance(item, UrbanOutfittersItem): self._insert('urbanoutfitters', item)
            elif isinstance(item, LuisaItem): self._insert('luisa', item)
            elif isinstance(item, SlamJamItem): self._insert('slamjam', item)
            elif isinstance(item, Rise45Item): self._insert('rise45', item)
            elif isinstance(item, UndefeatedItem): self._insert('undefeated', item)
            elif isinstance(item, ZapposItem): self._insert('zappos', item)
            elif isinstance(item, UbiqItem): self._insert('ubiq', item)
            elif isinstance(item, PointzItem): self._insert('pointz', item)
            elif isinstance(item, KicksItem): self._insert('kicks', item)
            elif isinstance(item, ShoesPalaceItem): self._insert('shoespalace', item)
            elif isinstance(item, StickABushItem): self._insert('stickabush', item)
            elif isinstance(item, KongItem): self._insert('kong', item)
            elif isinstance(item, SaveOurSoleItem): self._insert('saveoursole', item)
            elif isinstance(item, InflammableItem): self._insert('inflammable', item)
            elif isinstance(item, DefShopItem): self._insert('defshop', item)
            elif isinstance(item, OffSpringItem): self._insert('offspring', item)
            elif isinstance(item, SoleKitchenItem): self._insert('solekitchen', item)
            elif isinstance(item, DromeItem): self._insert('drome', item)
            elif isinstance(item, FootAsylumItem): self._insert('footasylum', item)
            elif isinstance(item, HHVItem): self._insert('hhv', item)
            elif isinstance(item, ConceptsItem): self._insert('concepts', item)
            elif isinstance(item, SocialStatusItem): self._insert('socialstatus', item)
            elif isinstance(item, ExtraButterItem): self._insert('extrabutter', item)
            elif isinstance(item, BodegaItem): self._insert('bodega', item)
            elif isinstance(item, SaintAlfredItem): self._insert('saintalfred', item)
            elif isinstance(item, LapstoneNHammerItem): self._insert('lapstonenhammer', item)
            elif isinstance(item, ShelfLifeItem): self._insert('shelflife', item)
            elif isinstance(item, AsphaltGoldItem): self._insert('asphaltgold', item)
            elif isinstance(item, HanonItem): self._insert('hanon', item)
            elif isinstance(item, SoleBoxItem): self._insert('solebox', item)
            elif isinstance(item, ConsortiumItem): self._insert('consortium', item)
            elif isinstance(item, HavenItem): self._insert('haven', item)
            elif isinstance(item, NeedSupplyItem): self._insert('needsupply', item)
            elif isinstance(item, LoadedItem): self._insert('loaded', item)
            elif isinstance(item, WellGoshItem): self._insert('wellgosh', item)
            elif isinstance(item, CapsuleItem): self._insert('capsule', item)
            elif isinstance(item, YMEItem): self._insert('yme', item)
            elif isinstance(item, HypeDCItem): self._insert('hypedc', item)
            elif isinstance(item, HolyPopItem): self._insert('holypop', item)
            elif isinstance(item, BSTNItem): self._insert('bstn', item)
            elif isinstance(item, TrophyRoomItem): self._insert('trophyroom', item)
            elif isinstance(item, SideStepItem): self._insert('sidestep', item)
            elif isinstance(item, ShiekhItem): self._insert('shiekh', item)
            elif isinstance(item, RezetItem): self._insert('rezet', item)
            elif isinstance(item, FootLockerEUItem): self._insert('footlockereu', item)
            elif isinstance(item, OfficeItem): self._insert('office', item)
            elif isinstance(item, ALLikeItem): self._insert('allike', item)
            elif isinstance(item, SportsShoesItem): self._insert('sportshoes', item)
            elif isinstance(item, RunnersPointItem): self._insert('runnerspoint', item)
            elif isinstance(item, GraffitiItem): self._insert('graffiti', item)
            elif isinstance(item, UrbanJungleItem): self._insert('urbanjungle', item)
            elif isinstance(item, SSenseItem): self._insert('ssense', item)
            elif isinstance(item, BackDoorItem): self._insert('backdoor', item)
            elif isinstance(item, BasketItem): self._insert('basket', item)
            elif isinstance(item, OneBlockDownItem): self._insert('oneblockdown', item)
            elif isinstance(item, DopeFactoryItem): self._insert('dopefactory', item)
            elif isinstance(item, NextDoorItem): self._insert('nextdoor', item)
            elif isinstance(item, SummerItem): self._insert('summer', item)
            elif isinstance(item, MrPorterItem): self._insert('mrporter', item)
            elif isinstance(item, StormFashionItem): self._insert('stormfashion', item)
            elif isinstance(item, TresBienItem): self._insert('tresbien', item)
            elif isinstance(item, PackerItem): self._insert('packer', item)

            # Insert item into addict table.
409 | elif isinstance(item, AddictItem): 410 | self.cursor.execute("INSERT INTO addict (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 411 | 412 | # Insert item into aphrodite table. 413 | elif isinstance(item, AphroditeItem): 414 | self.cursor.execute("INSERT INTO aphrodite (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 415 | 416 | # Insert item into bait table. 417 | elif isinstance(item, BaitItem): 418 | self.cursor.execute("INSERT INTO bait (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 419 | 420 | # Insert item into blends table. 421 | elif isinstance(item, BlendsItem): 422 | self.cursor.execute("INSERT INTO blends (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 423 | 424 | # Insert item into nicekicks table. 425 | elif isinstance(item, NiceKicksItem): 426 | self.cursor.execute("INSERT INTO nicekicks (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 427 | 428 | # Insert item into clicks table. 429 | elif isinstance(item, ClicksItem): 430 | self.cursor.execute("INSERT INTO clicks (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 431 | 432 | # Insert item into feature table. 433 | elif isinstance(item, FeatureItem): 434 | self.cursor.execute("INSERT INTO feature (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 435 | 436 | # Insert item into hypebeast table. 437 | elif isinstance(item, HypeBeastItem): 438 | self.cursor.execute("INSERT INTO hypebeast (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 439 | 440 | # Insert item into deadstock table. 
441 | elif isinstance(item, DeadStockItem): 442 | self.cursor.execute("INSERT INTO deadstock (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 443 | 444 | # Insert item into notre table. 445 | elif isinstance(item, NotreItem): 446 | self.cursor.execute("INSERT INTO notre (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 447 | 448 | # Insert item into nrml table. 449 | elif isinstance(item, NrmlItem): 450 | self.cursor.execute("INSERT INTO nrml (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 451 | 452 | # Insert item into oneness table. 453 | elif isinstance(item, OnenessItem): 454 | self.cursor.execute("INSERT INTO oneness (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 455 | 456 | # Insert item into pufferreds table. 457 | elif isinstance(item, PufferRedsItem): 458 | self.cursor.execute("INSERT INTO pufferreds (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 459 | 460 | # Insert item into renarts table. 461 | elif isinstance(item, RenartsItem): 462 | self.cursor.execute("INSERT INTO renarts (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 463 | 464 | # Insert item into shoesgallery table. 465 | elif isinstance(item, ShoesGalleryItem): 466 | self.cursor.execute("INSERT INTO shoesgallery (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 467 | 468 | # Insert item into proper table. 469 | elif isinstance(item, ProperItem): 470 | self.cursor.execute("INSERT INTO proper (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 471 | 472 | # Insert item into solestop table. 
473 | elif isinstance(item, SoleStopItem): 474 | self.cursor.execute("INSERT INTO solestop (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 475 | 476 | # Insert item into titolo table. 477 | elif isinstance(item, TitoloItem): 478 | self.cursor.execute("INSERT INTO titolo (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 479 | 480 | # Insert item into uptown table. 481 | elif isinstance(item, UptownItem): 482 | self.cursor.execute("INSERT INTO uptown (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 483 | 484 | # Insert item into westnyc table. 485 | elif isinstance(item, WestNYCItem): 486 | self.cursor.execute("INSERT INTO westnyc (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 487 | 488 | # Insert item into wishatl table. 489 | elif isinstance(item, WishATLItem): 490 | self.cursor.execute("INSERT INTO wishatl (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 491 | 492 | # Insert item into xileclothing table. 493 | elif isinstance(item, XileClothingItem): 494 | self.cursor.execute("INSERT INTO xileclothing (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 495 | 496 | # Insert item into solefly table. 497 | elif isinstance(item, SoleflyItem): 498 | self.cursor.execute("INSERT INTO solefly (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 499 | 500 | # Insert item into patta table. 501 | elif isinstance(item, PattaItem): 502 | self.cursor.execute("INSERT INTO patta (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 503 | 504 | # Insert item into svd table. 
505 | elif isinstance(item, SVDItem): 506 | self.cursor.execute("INSERT INTO svd (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 507 | 508 | # Insert item into dsmny table. 509 | elif isinstance(item, DSMNYItem): 510 | self.cursor.execute("INSERT INTO dsmny (link, date) VALUES (%s, %s)", (item['link'].encode('utf-8'), DATE)) 511 | 512 | # Insert item into hubbastille table. 513 | elif isinstance(item, HubbastilleItem): 514 | self.cursor.execute("INSERT INTO hubbastille (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 515 | 516 | # Insert item into shoesaddictor table. 517 | elif isinstance(item, ShoesAddictorItem): 518 | self.cursor.execute("INSERT INTO shoesaddictor (name, link, date) VALUES (%s, %s, %s)", (item['name'].encode('utf-8'), item['link'].encode('utf-8'), DATE)) 519 | 520 | self.conn.commit() 521 | 522 | # If item name contain below words. Tweet it. 523 | keywords = ['ultra boost', 'air jordan', 'jordan retro', 'nmd', 'boost', 'retro', 'flyknit', 'yeezy', 'ronnie', 'fieg', 'pharrel', 'atmos', 'clots', 'mars', 'yard'] 524 | 525 | if any(keyword in item['name'].encode('utf-8').lower() for keyword in keywords): 526 | # Twitter Auth - Tweet the item with date, time, item name, and link. 
527 | # To obtain Twitter CONSUMER and ACCESS keys go to https://apps.twitter.com/ 528 | CONSUMER_KEY = 'PASTE CONSUMER_KEY HERE' 529 | CONSUMER_SECRET = 'PASTE CONSUMER_SECRET HERE' 530 | ACCESS_TOKEN_KEY = 'PASTE ACCESS_TOKEN_KEY HERE' 531 | ACCESS_TOKEN_SECRET = 'PASTE ACCESS_TOKEN_SECRET HERE' 532 | API = TwitterAPI(CONSUMER_KEY, CONSUMER_SECRET, ACCESS_TOKEN_KEY, ACCESS_TOKEN_SECRET) 533 | TEXT_TO_SEND = DATE + " EST " + item['name'] + " " + item['link'] 534 | TWEET = API.request('statuses/update', {'status': TEXT_TO_SEND}) 535 | print(Fore.RED + 'TWEET LOG SUCCESS: ' + DATE + ' EST ' + item['name'] + ' ' + item['link'] + Style.RESET_ALL if TWEET.status_code == 200 else Fore.RED + 'TWEET LOG FAILURE: FAILED TO TWEET' + Style.RESET_ALL) 536 | 537 | # WebHook for Discord. Comment/Uncomment the line below to enable/disable. 538 | # requests.post('DISCORD WEBHOOK URL', data={'content': "**" + item['name'] + "**" + "\n" + item['link'] + "\n" + "\n" + "[ATC]: " + item['size'] + "\n" + "------------" + "\n"}) 539 | 540 | # WebHook for Slack. Comment/Uncomment the line below to enable/disable. 
541 | # requests.post('SLACK WEBHOOK URL', json={'text': "*" + item['name'] + "*" + "\n" + item['link'] + "\n" + "\n" + "[ATC]: " + item['size'] + "\n" + "------------" + "\n"}, headers={'Content-Type': 'application/json'})
542 |
543 | except MySQLdb.Error as e:
544 | print(Fore.RED + "MYSQL ERROR %d: %s" % (e.args[0], e.args[1]) + Style.RESET_ALL)
545 |
546 | return item
547 |
-------------------------------------------------------------------------------- /Sneaker-Notify-master/main/proxies.txt: --------------------------------------------------------------------------------
1 | http://ip:port
2 | or
3 | http://username:password@ip:port
-------------------------------------------------------------------------------- /Sneaker-Notify-master/main/random_useragent.py: --------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | # Sneaker Notify
5 | # author - Yu Lin
6 | # https://github.com/yulin12345
7 | # admin@yulin12345.site
8 |
9 | import random
10 |
11 | from scrapy import signals
12 | from scrapy.downloadermiddlewares.useragent import UserAgentMiddleware
13 |
14 |
15 | class RandomUserAgentMiddleware(UserAgentMiddleware):
16 |
17 | def __init__(self, settings, user_agent='Scrapy'):
18 | super(RandomUserAgentMiddleware, self).__init__()
19 | self.user_agent = user_agent
20 | user_agent_list_file = settings.get('USER_AGENT_LIST')
21 | if not user_agent_list_file:
22 | ua = settings.get('USER_AGENT', user_agent)
23 | self.user_agent_list = [ua]
24 | else:
25 | with open(user_agent_list_file, 'r') as f:
26 | self.user_agent_list = [line.strip() for line in f.readlines()]
27 |
28 | @classmethod
29 | def from_crawler(cls, crawler):
30 | obj = cls(crawler.settings)
31 | crawler.signals.connect(obj.spider_opened,
32 | signal=signals.spider_opened)
33 | return obj
34 |
35 | def process_request(self, request, spider):
36 | user_agent = random.choice(self.user_agent_list)
37 | if
user_agent: 38 | request.headers.setdefault('User-Agent', user_agent) 39 | -------------------------------------------------------------------------------- /Sneaker-Notify-master/main/requirements.txt: -------------------------------------------------------------------------------- 1 | requests 2 | crayons 3 | datetime 4 | beautifulsoup4 5 | scrapy 6 | scrapy-random-useragent 7 | scrapy_proxies 8 | TwitterAPI 9 | MySQL-python 10 | mysql-connector==2.1.4 -------------------------------------------------------------------------------- /Sneaker-Notify-master/main/settings.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # Sneaker Notify 5 | # author - Yu Lin 6 | # https://github.com/yulin12345 7 | # admin@yulin12345.site 8 | 9 | # Scrapy settings for crawler project 10 | # 11 | # For simplicity, this file contains only settings considered important or 12 | # commonly used. You can find more settings consulting the documentation: 13 | # 14 | # http://doc.scrapy.org/en/latest/topics/settings.html 15 | # http://scrapy.readthedocs.org/en/latest/topics/downloader-middleware.html 16 | # http://scrapy.readthedocs.org/en/latest/topics/spider-middleware.html 17 | 18 | BOT_NAME = '' 19 | 20 | SPIDER_MODULES = ['main'] 21 | NEWSPIDER_MODULE = 'main' 22 | 23 | # Crawl responsibly by identifying yourself (and your website) on the user-agent 24 | # USER_AGENT = 'crawler (+http://www.yourdomain.com)' 25 | USER_AGENT_LIST = "useragents.txt" 26 | 27 | # Obey robots.txt rules 28 | ROBOTSTXT_OBEY = True 29 | 30 | # Configure maximum concurrent requests performed by Scrapy (default: 16) 31 | CONCURRENT_REQUESTS = 32 32 | 33 | # Configure a delay for requests for the same website (default: 0) 34 | # See http://scrapy.readthedocs.org/en/latest/topics/settings.html#download-delay 35 | # See also autothrottle settings and docs 36 | # DOWNLOAD_DELAY = 3 37 | # The download delay setting will 
honor only one of: 38 | # CONCURRENT_REQUESTS_PER_DOMAIN = 16 39 | # CONCURRENT_REQUESTS_PER_IP = 16 40 | 41 | # Disable cookies (enabled by default) 42 | COOKIES_ENABLED = False 43 | 44 | # Disable Telnet Console (enabled by default) 45 | # TELNETCONSOLE_ENABLED = False 46 | TELNETCONSOLE_PORT = None 47 | 48 | # Override the default request headers: 49 | # DEFAULT_REQUEST_HEADERS = { 50 | # 'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8', 51 | # 'Accept-Language': 'en', 52 | # } 53 | 54 | # Enable or disable spider middlewares 55 | # See http://scrapy.readthedocs.org/en/latest/topics/spider-middleware.html 56 | # SPIDER_MIDDLEWARES = { 57 | # 'crawler.middlewares.CrawlerSpiderMiddleware': 543, 58 | # } 59 | 60 | # Enable or disable downloader middlewares 61 | # See http://scrapy.readthedocs.org/en/latest/topics/downloader-middleware.html 62 | DOWNLOADER_MIDDLEWARES = { 63 | # 'crawler.middlewares.MyCustomDownloaderMiddleware': 543, 64 | # 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware': 700, 65 | 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None, 66 | 'random_useragent.RandomUserAgentMiddleware': 120, 67 | 68 | # Proxies Pool. Comment/Uncomment the line below to enable/disable. 
69 | # 'scrapy.downloadermiddlewares.retry.RetryMiddleware': 90, 70 | # 'scrapy_proxies.RandomProxy': 100, 71 | # 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110 72 | } 73 | 74 | # Enable or disable extensions 75 | # See http://scrapy.readthedocs.org/en/latest/topics/extensions.html 76 | # EXTENSIONS = { 77 | # 'scrapy.extensions.telnet.TelnetConsole': None, 78 | # } 79 | 80 | # Configure item pipelines 81 | # See http://scrapy.readthedocs.org/en/latest/topics/item-pipeline.html 82 | ITEM_PIPELINES = { 83 | 'mysql_pipeline.MYSQL_Pipeline': 100 84 | } 85 | 86 | # Enable and configure the AutoThrottle extension (disabled by default) 87 | # See http://doc.scrapy.org/en/latest/topics/autothrottle.html 88 | AUTOTHROTTLE_ENABLED = True 89 | # The initial download delay 90 | AUTOTHROTTLE_START_DELAY = 1 91 | # The maximum download delay to be set in case of high latencies 92 | AUTOTHROTTLE_MAX_DELAY = 2 93 | # The average number of requests Scrapy should be sending in parallel to 94 | # each remote server 95 | # AUTOTHROTTLE_TARGET_CONCURRENCY = 1.0 96 | # Enable showing throttling stats for every response received: 97 | # AUTOTHROTTLE_DEBUG = False 98 | 99 | # Enable and configure HTTP caching (disabled by default) 100 | # See http://scrapy.readthedocs.org/en/latest/topics/downloader-middleware.html#httpcache-middleware-settings 101 | # HTTPCACHE_ENABLED = True 102 | # HTTPCACHE_EXPIRATION_SECS = 0 103 | # HTTPCACHE_DIR = 'httpcache' 104 | # HTTPCACHE_IGNORE_HTTP_CODES = [] 105 | # HTTPCACHE_STORAGE = 'scrapy.extensions.httpcache.FilesystemCacheStorage' 106 | 107 | # How many times you want to retry on fail 108 | RETRY_TIMES = 10 109 | # Retry error codes 110 | RETRY_HTTP_CODES = [400, 403, 404, 408, 500, 503, 504] 111 | 112 | # Proxy list format 113 | # http://ip:port 114 | # http://username:password@ip:port 115 | PROXY_LIST = 'proxies.txt' 116 | 117 | # Proxy mode 118 | # 0 = Every requests have different proxy 119 | # 1 = Take only one proxy from the 
list and assign it to every requests 120 | # 2 = Put a custom proxy to use in the settings 121 | PROXY_MODE = 0 122 | 123 | # If proxy mode is 2 uncomment below: 124 | # CUSTOM_PROXY = "http://ip:port" 125 | -------------------------------------------------------------------------------- /Sneaker-Notify-master/main/useragents.txt: -------------------------------------------------------------------------------- 1 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 2 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 3 | Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 4 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 5 | Mozilla/5.0 (Windows NT 10.0; WOW64; rv:51.0) Gecko/20100101 Firefox/51.0 6 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_3) AppleWebKit/602.4.8 (KHTML, like Gecko) Version/10.0.3 Safari/602.4.8 7 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:51.0) Gecko/20100101 Firefox/51.0 8 | Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 9 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 10 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 11 | Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:51.0) Gecko/20100101 Firefox/51.0 12 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36 13 | Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 14 | Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko 15 | Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 
Safari/537.36 16 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10.12; rv:51.0) Gecko/20100101 Firefox/51.0 17 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.79 Safari/537.36 Edge/14.14393 18 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36 19 | Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36 20 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 21 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 22 | Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 23 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:51.0) Gecko/20100101 Firefox/51.0 24 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36 25 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.95 Safari/537.36 26 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/602.4.8 (KHTML, like Gecko) Version/10.0.3 Safari/602.4.8 27 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/602.3.12 (KHTML, like Gecko) Version/10.0.2 Safari/602.3.12 28 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.95 Safari/537.36 29 | Mozilla/5.0 (Windows NT 6.3; WOW64; rv:51.0) Gecko/20100101 Firefox/51.0 30 | Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36 31 | Mozilla/5.0 (X11; Linux x86_64; rv:51.0) Gecko/20100101 Firefox/51.0 32 | Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko 33 | Mozilla/5.0 (X11; Linux x86_64; rv:45.0) Gecko/20100101 Firefox/45.0 34 | Mozilla/5.0 (Windows NT 10.0; Win64; 
x64; rv:51.0) Gecko/20100101 Firefox/51.0 35 | Mozilla/5.0 (Windows NT 6.1; rv:51.0) Gecko/20100101 Firefox/51.0 36 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/55.0.2883.87 Chrome/55.0.2883.87 Safari/537.36 37 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 38 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.95 Safari/537.36 39 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:45.0) Gecko/20100101 Firefox/45.0 40 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_5) AppleWebKit/602.4.8 (KHTML, like Gecko) Version/10.0.3 Safari/602.4.8 41 | Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 42 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 43 | Mozilla/5.0 (iPad; CPU OS 10_2_1 like Mac OS X) AppleWebKit/602.4.6 (KHTML, like Gecko) Version/10.0 Mobile/14D27 Safari/602.1 44 | Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:51.0) Gecko/20100101 Firefox/51.0 45 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.76 Safari/537.36 46 | Mozilla/5.0 (Windows NT 5.1; rv:51.0) Gecko/20100101 Firefox/51.0 47 | Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:50.0) Gecko/20100101 Firefox/50.0 48 | Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36 49 | Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko 50 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:51.0) Gecko/20100101 Firefox/51.0 51 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.95 Safari/537.36 52 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2486.0 Safari/537.36 Edge/13.10586 53 | Mozilla/5.0 (Windows NT 10.0; WOW64; rv:50.0) 
Gecko/20100101 Firefox/50.0 54 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_2_1 like Mac OS X) AppleWebKit/602.4.6 (KHTML, like Gecko) Version/10.0 Mobile/14D27 Safari/602.1 55 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/601.7.7 (KHTML, like Gecko) Version/9.1.2 Safari/601.7.7 56 | Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36 57 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_2_1 like Mac OS X) AppleWebKit/602.1.50 (KHTML, like Gecko) CriOS/56.0.2924.79 Mobile/14D27 Safari/602.1 58 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12) AppleWebKit/602.1.50 (KHTML, like Gecko) Version/10.0 Safari/602.1.50 59 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.73 Safari/537.36 OPR/34.0.2036.25 60 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:50.0) Gecko/20100101 Firefox/50.0 61 | Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.76 Safari/537.36 OPR/43.0.2442.806 62 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.100 Safari/537.36 63 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_1) AppleWebKit/602.2.14 (KHTML, like Gecko) Version/10.0.1 Safari/602.2.14 64 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.75 Safari/537.36 65 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36 66 | Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko 67 | Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:51.0) Gecko/20100101 Firefox/51.0 68 | Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0; Trident/5.0) 69 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/600.5.17 (KHTML, like Gecko) Version/8.0.5 Safari/600.5.17 70 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/602.3.12 (KHTML, like Gecko) Version/10.0.2 Safari/602.3.12 71 | Mozilla/5.0 (Windows NT 
6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36
-------------------------------------------------------------------------------- /sneakermonitor/bots.txt: --------------------------------------------------------------------------------
1 | splashforce.io
2 | caspercop.io
3 | solesorcerer.com
4 | jamzigod.com
5 | tripbot.io
6 |
-------------------------------------------------------------------------------- /sneakermonitor/keywords.txt: --------------------------------------------------------------------------------
1 | adidas
2 |
3 | nike
4 | airforce
5 | airmax
6 | vapormax
7 | nmd
8 | human race
9 | hu
10 | yeezy
11 | 350
12 | lebron
13 | paul george
14 | hyperadapt
15 | vans
16 | bbc
17 | billionaire boys club
18 | bape
19 | supreme
20 | ultra boost
21 | foamposite
22 | jordan
23 | air jordan
24 | air
25 | ultraboost
26 | adidas originals
27 | air max
28 | air mag
29 | laceless
30 | NMD
31 | XR1
32 | R2
33 | R1
34 | pharrell williams
35 | hu trail
36 | v2
37 | boost 350
38 | yeezy boost
39 | 4.0
40 | 3.0
41 | 2.0
42 | 1.0
43 | unc
44 | low
45 | mid
46 | retro
47 | primeknit
48 | CS2
49 | ultra boost ltd
50 | LTD
51 | off white
52 | og
53 | balenciaga
54 | reigning champ
55 | kith
56 | anti social social club
57 | splashforce
58 | mesh
59 | dashe
60 | trip
61 | jamzi god
62 | aio mac bot
63 | ice bot
64 | vlone
65 | octobers very own
66 | sole sorcerer
67 | casper cop
68 | gen 5
69 |
70 |
-------------------------------------------------------------------------------- /sneakermonitor/nonshopify.txt: --------------------------------------------------------------------------------
1 | footshop.com
2 | caliroots.com
3 | size.co.uk
4 | jdsports.co.uk
5 | 5pointz.co.uk
6 | footasylum.com
7 | asphaltgold.de
8 | wellgosh.com
9 | hypedc.com
10 | bstnstore.com
11 | allikestore.com
12 | back-door.it
13 | mrporter.com
14 | titolo.ch
15 | xileclothing.com
16 |
--------------------------------------------------------------------------------
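The keywords.txt above feeds the keyword filter the README asks for, so the Discord/Slack channel is not spammed with every restock. A minimal sketch of that case-insensitive substring matching (the helper names here are illustrative, not part of this repo; the same `any(keyword in name ...)` pattern appears in the Sneaker-Notify pipeline):

```python
def load_keywords(path="keywords.txt"):
    """Read keywords, one per line, skipping blank lines like those in keywords.txt."""
    with open(path) as f:
        return [line.strip().lower() for line in f if line.strip()]

def matches(product_name, keywords):
    """True if any keyword appears as a substring of the lowercased product name."""
    name = product_name.lower()
    return any(keyword in name for keyword in keywords)

# Example with an inline list instead of the file:
print(matches("Adidas Yeezy Boost 350 V2", ["yeezy", "nmd", "air jordan"]))  # True
```

Because the match is by substring, a short entry like `air` or `og` catches many product names; keep keywords as specific as the lists above when the channel gets noisy.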
/sneakermonitor/proxies.txt: --------------------------------------------------------------------------------
1 | Proxy Format: IP:Port:User:Password
-------------------------------------------------------------------------------- /sneakermonitor/scrapy.cfg: --------------------------------------------------------------------------------
1 | # Automatically created by: scrapy startproject
2 | #
3 | # For more information about the [deploy] section see:
4 | # https://scrapyd.readthedocs.org/en/latest/deploy.html
5 |
6 | [settings]
7 | default = sneakermonitor.settings
8 |
9 | [deploy]
10 | #url = http://localhost:6800/
11 | project = sneakermonitor
12 |
-------------------------------------------------------------------------------- /sneakermonitor/setup.txt: --------------------------------------------------------------------------------
1 | https://www.12amrun.com/collections/footwear/Mens
2 | https://18montrose.com/collections/sneakers-mens
3 | https://www.a-ma-maniere.com/collections/sneakers
4 | https://www.addictmiami.com/collections/men
5 | https://www.bbcicecream.com/collections/footwear
6 | https://www.blendsus.com/collections/mens-footwear
7 | https://shop.bdgastore.com/collections/footwear
8 | https://www.bowsandarrowsberkeley.com/collections/all-footwear
9 | https://burnrubbersneakers.com/collections/shoes
10 | https://cncpts.com/collections/footwear
11 | https://shop.exclucitylife.com/search?q=menfootwear
12 | https://shop.extrabutterny.com/collections/footwear/Mens
13 | https://shop.havenshop.ca/collections/footwear
14 | https://kith.com/collections/footwear/sneaker
15 | https://kith.com/collections/kith/mens
16 | https://www.deadstock.ca/collections/footwear
17 | https://www.minishopmadrid.com/collections/sneakers
18 | https://nrml.ca/collections/nrml-footwear
19 | https://noirfonce.eu/collections/men
20 | https://offthehook.ca/collections/footwear
21 | https://www.oneness287.com/collections/mens-shoes-1
22 |
https://packershoes.com/collections/footwear
23 | https://properlbc.com/collections/footwear/Men+Footwear
24 | https://rsvpgallery.com/collections/men-shoes
25 | https://us.reigningchamp.com/collections/mens-footwear
26 | https://us.reigningchamp.com/collections/mens-footwear
27 | https://rise45.com/collections/mens-footwear
28 | https://www.saintalfred.com/collections/footwear
29 | https://shopnicekicks.com/collections/men
30 | https://sneakerpolitics.com/collections/sneakers
31 | https://www.socialstatuspgh.com/collections/sneakers
32 | https://www.solefly.com/collections/mens-1
33 | https://www.soleheaven.com/collections/all
34 | https://suede-store.com/collections/footwear-m
35 | https://www.trophyroomstore.com/collections/all/footwear
36 | https://shop.undefeated.com/collections/footwear
37 | https://www.unknwn.com/collections/1-footwear-mens-sneakers
38 | https://wishatl.com/collections/footwear
39 | https://www.xhibition.co/collections/sneakers
40 | https://yeezysupply.com/collections/footwear
41 | https://fearofgod.com/collections/footwear/fifth-collection
42 | https://fearofgod.com/collections/footwear/fourth-collection
43 | https://cop2flip.com/collections/frontpage
44 | https://www.oipolloi.com/collections/trainers
45 | https://www.featuresneakerboutique.com/collections/sneakers
46 | https://www.freshragsfl.com/collections/footwear
47 | https://www.westnyc.com/collections/footwear
48 | https://www.manorphx.com/collections/footwear
-------------------------------------------------------------------------------- /sneakermonitor/shopify sites.txt: --------------------------------------------------------------------------------
1 | https://www.12amrun.com/collections/footwear/Mens
2 | https://18montrose.com/collections/sneakers-mens
3 | https://www.a-ma-maniere.com/collections/sneakers
4 | https://www.addictmiami.com/collections/men
5 | https://www.bbcicecream.com/collections/footwear
6 | https://www.blendsus.com/collections/mens-footwear
7 |
https://shop.bdgastore.com/collections/footwear 8 | https://www.bowsandarrowsberkeley.com/collections/all-footwear 9 | https://burnrubbersneakers.com/collections/shoes 10 | https://cncpts.com/collections/footwear 11 | https://shop.exclucitylife.com/search?q=menfootwear 12 | https://shop.extrabutterny.com/collections/footwear/Mens 13 | https://shop.havenshop.ca/collections/footwear 14 | https://kith.com/collections/footwear/sneaker 15 | https://kith.com/collections/kith/mens 16 | https://www.deadstock.ca/collections/footwear 17 | https://www.minishopmadrid.com/collections/sneakers 18 | https://nrml.ca/collections/nrml-footwear 19 | https://noirfonce.eu/collections/men 20 | https://offthehook.ca/collections/footwear 21 | https://www.oneness287.com/collections/mens-shoes-1 22 | https://packershoes.com/collections/footwear 23 | https://properlbc.com/collections/footwear/Men+Footwear 24 | https://rsvpgallery.com/collections/men-shoes 25 | https://us.reigningchamp.com/collections/mens-footwear 26 | https://us.reigningchamp.com/collections/mens-footwear 27 | https://rise45.com/collections/mens-footwear 28 | https://www.saintalfred.com/collections/footwear 29 | https://shopnicekicks.com/collections/men 30 | https://sneakerpolitics.com/collections/sneakers 31 | https://www.socialstatuspgh.com/collections/sneakers 32 | https://www.solefly.com/collections/mens-1 33 | https://www.soleheaven.com/collections/all 34 | https://suede-store.com/collections/footwear-m 35 | https://www.trophyroomstore.com/collections/all/footwear 36 | https://shop.undefeated.com/collections/footwear 37 | https://www.unknwn.com/collections/1-footwear-mens-sneakers 38 | https://wishatl.com/collections/footwear 39 | https://www.xhibition.co/collections/sneakers 40 | https://yeezysupply.com/collections/footwear 41 | https://fearofgod.com/collections/footwear/fifth-collection 42 | https://fearofgod.com/collections/footwear/fourth-collection 43 | https://cop2flip.com/collections/frontpage 44 | 
https://www.oipolloi.com/collections/trainers 45 | https://www.featuresneakerboutique.com/collections/sneakers 46 | https://www.freshragsfl.com/collections/footwear 47 | https://www.westnyc.com/collections/footwear 48 | https://www.manorphx.com/collections/footwear 49 | https://www.notre-shop.com/collections/sneakers 50 | https://shopnicekicks.com/collections/men 51 | https://bapeonline.com/collections/men 52 | https://www.highsandlows.net.au/collections/footwear 53 | https://www.courtsidesneakers.com/collections/sneakers 54 | https://atmosny.com/collections/mens-footwear-1 55 | https://nomadshop.net/collections/footwear 56 | https://thepremierstore.com/collections/footwear 57 | https://renarts.com/collections/mens-7/footwear 58 | https://commonwealth-ftgg.com/collections/types?q=Sneakers 59 | https://www.rimenyc.com/collections/footwear/Mens 60 | https://www.blkmkt.us/collections/footwear 61 | https://shoegallerymiami.com/collections/all 62 | https://www.capsuletoronto.com/collections/footwear 63 | https://rockcitykicks.com/collections/sn 64 | https://concrete.nl/collections/footwear 65 | https://samtabak.com/collections/new-arrivals 66 | https://www.solestop.com/collections/men-footwear 67 | https://shop.antisocialsocialclub.com 68 | https://www.apbstore.com/collections/shoes 69 | https://thesportsedit.com/collections/adidas-ultra-boost-trainers 70 | https://www.sneakerworldshop.com/collections/men 71 | https://www.cityblueshop.com/collections/men 72 | https://centre214.com/collections/mens-footwear 73 | https://www.lapstoneandhammer.com/collections/foortwear 74 | https://clot.com/collections/shop/Sneakers 75 | https://www.hlorenzo.com/collections/mens-new-arrivals/footwear 76 | https://www.alumniofny.com/collections/shop 77 | https://www.thedarksideinitiative.com/collections/footwear 78 | https://www.ssense.com/en-us/men/sneakers 79 | https://www.nojokicks.com/collections/all 80 | https://www.solestop.com/collections/men-footwear 81 | 
https://attic2zoo.com/collections/mens-footwear 82 | https://beatniconline.com/collections/mens 83 | https://shophny.com/collections/mens 84 | https://us.octobersveryown.com/collections/shop-all 85 | https://www.rooneyshop.com/collections/footwear/man 86 | https://vlone.co/collections/vlone 87 | https://solesorcerer.com/cart/7923163037739:1 88 | https://purchase.splashforce.io/cart/4973347602471:1 89 | https://ice-bot.myshopify.com/cart/2563445489698:1 90 | https://aiomacbot.myshopify.com/cart/38188288335:1 91 | https://dashe.io/pp-buynow 92 | https://www.tripbot.io/cart/10466487009322:1 93 | https://solesorcerer.com/cart/7923163037739:1 94 | https://jamzigod.com/cart/7228152283166:1 95 | https://gen5.io/ 96 | https://dashe.io/queue 97 | https://caspercop.io/cart/10256528769067:1 98 | https://purchase.splashforce.io/cart/6488844140583:1 -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/__init__.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/copshoes/Restock-Monitor/3478f13f80df8d74e07c4b8df43caec86af29bc2/sneakermonitor/sneakermonitor/__init__.pyc -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/items.py: -------------------------------------------------------------------------------- 1 | import scrapy 2 | 3 | 4 | class Sneaker(scrapy.Item): 5 | name = scrapy.Field() 6 | description = scrapy.Field() 7 | image = scrapy.Field() 8 | price = scrapy.Field() 9 | currency = scrapy.Field() 10 | url = scrapy.Field() 11 | available = scrapy.Field() 12 | stock = scrapy.Field() 13 | sizes = scrapy.Field() 14 | tag = scrapy.Field() 15 | -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/items.pyc: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/copshoes/Restock-Monitor/3478f13f80df8d74e07c4b8df43caec86af29bc2/sneakermonitor/sneakermonitor/items.pyc -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/middlewares.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | # Define here the models for your spider middleware 4 | # 5 | # See documentation in: 6 | # http://doc.scrapy.org/en/latest/topics/spider-middleware.html 7 | 8 | from scrapy import signals 9 | 10 | 11 | class SneakermonitorSpiderMiddleware(object): 12 | # Not all methods need to be defined. If a method is not defined, 13 | # scrapy acts as if the spider middleware does not modify the 14 | # passed objects. 15 | 16 | @classmethod 17 | def from_crawler(cls, crawler): 18 | # This method is used by Scrapy to create your spiders. 19 | s = cls() 20 | crawler.signals.connect(s.spider_opened, signal=signals.spider_opened) 21 | return s 22 | 23 | def process_spider_input(self, response, spider): 24 | # Called for each response that goes through the spider 25 | # middleware and into the spider. 26 | 27 | # Should return None or raise an exception. 28 | return None 29 | 30 | def process_spider_output(self, response, result, spider): 31 | # Called with the results returned from the Spider, after 32 | # it has processed the response. 33 | 34 | # Must return an iterable of Request, dict or Item objects. 35 | for i in result: 36 | yield i 37 | 38 | def process_spider_exception(self, response, exception, spider): 39 | # Called when a spider or process_spider_input() method 40 | # (from other spider middleware) raises an exception. 41 | 42 | # Should return either None or an iterable of Response, dict 43 | # or Item objects. 
44 | pass 45 | 46 | def process_start_requests(self, start_requests, spider): 47 | # Called with the start requests of the spider, and works 48 | # similarly to the process_spider_output() method, except 49 | # that it doesn’t have a response associated. 50 | 51 | # Must return only requests (not items). 52 | for r in start_requests: 53 | yield r 54 | 55 | def spider_opened(self, spider): 56 | spider.logger.info('Spider opened: %s' % spider.name) 57 | -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/pipelines.py: -------------------------------------------------------------------------------- 1 | import pymongo 2 | from scrapy.conf import settings 3 | from scrapy.exceptions import DropItem 4 | from slack import Slack 5 | import os.path 6 | 7 | 8 | class SneakermonitorPipeline(object): 9 | def process_item(self, item, spider): 10 | return item 11 | 12 | # Filters old or unchanged products and notify for new or restocks 13 | class FilterPipeline(object): 14 | keywords_file = "/../keywords.txt" 15 | keywords = list() 16 | slack = Slack() 17 | 18 | with open(os.path.dirname(__file__) + keywords_file) as file: 19 | for keyword in file: 20 | keywords.append(keyword.strip().lower()) 21 | 22 | mongo_collection = "products" 23 | 24 | # Initialize connection settings 25 | def __init__(self): 26 | self.server = settings['MONGO_SERVER'] 27 | self.port = settings['MONGO_PORT'] 28 | 29 | # Open connection to database 30 | def open_spider(self, spider): 31 | self.client = pymongo.MongoClient(self.server, self.port) 32 | self.db = self.client[settings['MONGO_DB']] 33 | self.collection = self.db[self.mongo_collection] 34 | 35 | # Close connection to database 36 | def close_spider(self, spider): 37 | self.client.close() 38 | 39 | def process_item(self, item, spider): 40 | self.url = item['url'] 41 | self.product = self.collection.find_one({"url": self.url}) 42 | 43 | tag = "" 44 | try: 45 | tag = item['tag'] 46 | except: 
47 | pass 48 | description = "" 49 | try: 50 | description = item['description'] 51 | except KeyError: 52 | pass 53 | 54 | # if product already in database 55 | if self.product is not None: 56 | availability = None 57 | try: 58 | availability = item['available'] 59 | except KeyError: 60 | pass 61 | 62 | # if it was unavailable before but available now 63 | if availability is True and self.product['available'] is False: 64 | # Item is sneaker 65 | if 'bot' not in tag: 66 | text = item['name'] + " - " + description 67 | for keyword in self.keywords: 68 | if keyword in text.lower(): 69 | self.slack.post(item, keyword) 70 | return item 71 | 72 | # Item is a bot product 73 | else: 74 | # Availability already checked on parent if clause, so just notify 75 | self.slack.post(item, "") 76 | return item 77 | return item 78 | 79 | # or was available before but not now, return item to update 80 | elif availability is False and self.product['available'] is True: 81 | return item 82 | elif availability is None: 83 | return item 84 | else: 85 | raise DropItem("No changes in %s" % item['url']) 86 | 87 | # if a new product found 88 | else: 89 | # If item is sneaker 90 | if 'bot' not in tag: 91 | # if keyword matches, show notification 92 | text = item['name'] + " - " + description 93 | for keyword in self.keywords: 94 | if keyword in text.lower(): 95 | self.slack.post(item, keyword) 96 | return item 97 | 98 | # If item is a bot product 99 | else: 100 | if item['available']: 101 | self.slack.post(item, "") 102 | return item 103 | return item 104 | 105 | 106 | # Save or update product in mongo database 107 | class MongoSavePipeline(object): 108 | 109 | mongo_collection = "products" 110 | 111 | def __init__(self): 112 | self.server = settings['MONGO_SERVER'] 113 | self.port = settings['MONGO_PORT'] 114 | 115 | def open_spider(self, spider): 116 | self.client = pymongo.MongoClient(self.server, self.port) 117 | self.db = self.client[settings['MONGO_DB']] 118 | self.collection = self.db[self.mongo_collection]
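A side note on the save logic: `MongoSavePipeline.process_item` (further down) does a `find_one` lookup and then branches into either `replace_one` or `insert_one`. pymongo's `replace_one(..., upsert=True)` covers both cases in one call and avoids the race between the lookup and the write. A minimal sketch of the idea; `FakeCollection` and `save_item` are illustrative stand-ins (no live MongoDB assumed), not part of this repo:

```python
# Sketch: collapsing MongoSavePipeline's find-then-write into one upsert.
# FakeCollection is a tiny in-memory stand-in for pymongo's Collection so
# this runs without a MongoDB server; with real pymongo the save_item body
# is unchanged.

class FakeCollection(object):
    def __init__(self):
        self.docs = {}  # documents keyed by product url

    def replace_one(self, query, doc, upsert=False):
        # Mimics pymongo: replace a matching doc, or insert when upsert=True
        url = query["url"]
        if url in self.docs or upsert:
            self.docs[url] = doc


def save_item(collection, item):
    # One call handles both "new product" and "update existing product"
    collection.replace_one({"url": item["url"]}, dict(item), upsert=True)
    return item


col = FakeCollection()
save_item(col, {"url": "https://kith.com/p/x", "available": False})  # insert
save_item(col, {"url": "https://kith.com/p/x", "available": True})   # update
print(len(col.docs), col.docs["https://kith.com/p/x"]["available"])  # 1 True
```

With a real `pymongo.collection.Collection`, the same `replace_one(..., upsert=True)` call would replace the `find_one` / `insert_one` branching in `process_item`.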
119 | 120 | def close_spider(self, spider): 121 | self.client.close() 122 | 123 | def process_item(self, item, spider): 124 | self.url = item['url'] 125 | self.product = self.collection.find_one({"url": self.url}) 126 | 127 | # if product already in database, update 128 | if self.product is not None: 129 | try: 130 | self.collection.replace_one({"url": self.url}, dict(item)) 131 | except: 132 | pass 133 | 134 | # product not in database, add 135 | else: 136 | self.collection.insert_one(dict(item)) 137 | return item 138 | -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/pipelines.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/copshoes/Restock-Monitor/3478f13f80df8d74e07c4b8df43caec86af29bc2/sneakermonitor/sneakermonitor/pipelines.pyc -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/settings.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | # Scrapy settings for sneakermonitor project 4 | # 5 | # For simplicity, this file contains only settings considered important or 6 | # commonly used. 
You can find more settings consulting the documentation: 7 | # 8 | # http://doc.scrapy.org/en/latest/topics/settings.html 9 | # http://scrapy.readthedocs.org/en/latest/topics/downloader-middleware.html 10 | # http://scrapy.readthedocs.org/en/latest/topics/spider-middleware.html 11 | 12 | BOT_NAME = '' 13 | 14 | SPIDER_MODULES = ['sneakermonitor.spiders'] 15 | NEWSPIDER_MODULE = 'sneakermonitor.spiders' 16 | 17 | 18 | # Crawl responsibly by identifying yourself (and your website) on the user-agent 19 | #USER_AGENT = 'sneakermonitor (+http://www.yourdomain.com)' 20 | 21 | # Obey robots.txt rules 22 | ROBOTSTXT_OBEY = False 23 | 24 | # Configure maximum concurrent requests performed by Scrapy (default: 16) 25 | #CONCURRENT_REQUESTS = 32 26 | 27 | # Configure a delay for requests for the same website (default: 0) 28 | # See http://scrapy.readthedocs.org/en/latest/topics/settings.html#download-delay 29 | # See also autothrottle settings and docs 30 | #DOWNLOAD_DELAY = 3 31 | # The download delay setting will honor only one of: 32 | CONCURRENT_REQUESTS_PER_DOMAIN = 1 33 | #CONCURRENT_REQUESTS_PER_IP = 16 34 | 35 | # Disable cookies (enabled by default) 36 | COOKIES_ENABLED = False 37 | 38 | # Disable Telnet Console (enabled by default) 39 | #TELNETCONSOLE_ENABLED = False 40 | 41 | # Override the default request headers: 42 | # DEFAULT_REQUEST_HEADERS = { 43 | # 'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8', 44 | # 'Accept-Language': 'en', 45 | #} 46 | 47 | # Enable or disable spider middlewares 48 | # See http://scrapy.readthedocs.org/en/latest/topics/spider-middleware.html 49 | SPIDER_MIDDLEWARES = { 50 | 'scrapy_splash.SplashDeduplicateArgsMiddleware': 100, 51 | } 52 | 53 | # Enable or disable downloader middlewares 54 | # See http://scrapy.readthedocs.org/en/latest/topics/downloader-middleware.html 55 | DOWNLOADER_MIDDLEWARES = { 56 | # scrapy_proxies start 57 | # 'scrapy.downloadermiddlewares.retry.RetryMiddleware': 90, 58 | # 
'scrapy_proxies.RandomProxy': 100, 59 | # 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110, 60 | # scrapy_proxies end 61 | # scrapy-random-useragent start 62 | 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None, 63 | 'random_useragent.RandomUserAgentMiddleware': 400, 64 | # scrapy-random-useragent end 65 | 'scrapy_splash.SplashCookiesMiddleware': 723, 66 | 'scrapy_splash.SplashMiddleware': 725, 67 | 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 810, 68 | } 69 | 70 | # Enable or disable extensions 71 | # See http://scrapy.readthedocs.org/en/latest/topics/extensions.html 72 | # EXTENSIONS = { 73 | # 'scrapy.extensions.telnet.TelnetConsole': None, 74 | #} 75 | 76 | # Configure item pipelines 77 | # See http://scrapy.readthedocs.org/en/latest/topics/item-pipeline.html 78 | ITEM_PIPELINES = { 79 | # 'sneakermonitor.pipelines.sneakermonitorPipeline': 300, 80 | 'sneakermonitor.pipelines.FilterPipeline': 400, 81 | 'sneakermonitor.pipelines.MongoSavePipeline': 500 82 | } 83 | 84 | # Enable and configure the AutoThrottle extension (disabled by default) 85 | # See http://doc.scrapy.org/en/latest/topics/autothrottle.html 86 | #AUTOTHROTTLE_ENABLED = True 87 | # The initial download delay 88 | #AUTOTHROTTLE_START_DELAY = 5 89 | # The maximum download delay to be set in case of high latencies 90 | #AUTOTHROTTLE_MAX_DELAY = 60 91 | # The average number of requests Scrapy should be sending in parallel to 92 | # each remote server 93 | #AUTOTHROTTLE_TARGET_CONCURRENCY = 1.0 94 | # Enable showing throttling stats for every response received: 95 | #AUTOTHROTTLE_DEBUG = False 96 | 97 | # Enable and configure HTTP caching (disabled by default) 98 | # See http://scrapy.readthedocs.org/en/latest/topics/downloader-middleware.html#httpcache-middleware-settings 99 | #HTTPCACHE_ENABLED = True 100 | #HTTPCACHE_EXPIRATION_SECS = 0 101 | #HTTPCACHE_DIR = 'httpcache' 102 | #HTTPCACHE_IGNORE_HTTP_CODES = [] 103 | #HTTPCACHE_STORAGE = 
'scrapy.extensions.httpcache.FilesystemCacheStorage' 104 | 105 | MONGO_SERVER = "localhost" 106 | MONGO_PORT = 27017 107 | MONGO_DB = "sneakermonitor" 108 | 109 | # Retry many times since proxies often fail 110 | RETRY_TIMES = 5 111 | # Retry on most error codes since proxies fail for different reasons 112 | RETRY_HTTP_CODES = [500, 503, 504, 400, 403, 404, 408] 113 | 114 | # Proxy list containing entries like 115 | # http://username:password@proxyurl:proxyport 116 | # http://username:password@host2:port 117 | # http://host3:port 118 | # ... 119 | 120 | PROXY_LIST = "/home/alex/sneakermonitor/proxies.txt" 121 | 122 | # Proxy mode 123 | # 0 = Every requests have different proxy 124 | # 1 = Take only one proxy from the list and assign it to every requests 125 | # 2 = Put a custom proxy to use in the settings 126 | PROXY_MODE = 1 127 | 128 | # If proxy mode is 2 uncomment this sentence : 129 | # CUSTOM_PROXY = "http://host1:port" 130 | 131 | USER_AGENT_LIST = "/home/alex/sneakermonitor/useragents.txt" 132 | 133 | # Splash settings 134 | SPLASH_URL = 'http://192.168.99.100:8050' 135 | DUPEFILTER_CLASS = 'scrapy_splash.SplashAwareDupeFilter' 136 | # DUPEFILTER_CLASS = False 137 | HTTPCACHE_STORAGE = 'scrapy_splash.SplashAwareFSCacheStorage' 138 | 139 | PROXY_ENABLED = True 140 | ADIDAS_PROXY_ENABLED = True 141 | -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/settings.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/copshoes/Restock-Monitor/3478f13f80df8d74e07c4b8df43caec86af29bc2/sneakermonitor/sneakermonitor/settings.pyc -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/slack json.txt: -------------------------------------------------------------------------------- 1 | { 2 | "attachments": [ 3 | { 4 | "title": "item['name']", 5 | "title_link": "https://api.slack.com/", 6 | 
"color": "#3AA3E3", 7 | "fields": [ 8 | { 9 | "title": "Stock Count", 10 | "value": "stock", 11 | "short": "true" 12 | }, 13 | { 14 | "title": "Store", 15 | "value": "store", 16 | "short": "true" 17 | }, 18 | { 19 | "title": "Notification type", 20 | "value": "match_type", 21 | "short": "true" 22 | }, 23 | { 24 | "title": "Price", 25 | "value": "price", 26 | "short": "true" 27 | }, 28 | { 29 | "title": "Sizes", 30 | "value": "sizes", 31 | "short": "true" 32 | } 33 | ], 34 | "actions": [ 35 | { 36 | "type": "button", 37 | "text": "Purchase", 38 | "url": "item['url']", 39 | "style": "primary" 40 | } 41 | ], 42 | "thumb_url": "item['image']" 43 | } 44 | ] 45 | } -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/slack.py: -------------------------------------------------------------------------------- 1 | import json 2 | import requests 3 | from urlparse import urlparse 4 | import os.path 5 | 6 | 7 | class Slack(): 8 | 9 | # Add different slack channel's web hook url here: 10 | channels = { 11 | "supreme": "https://hooks.slack.com/services/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", 12 | "adidas": "https://hooks.slack.com/services/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", 13 | "bots": "https://hooks.slack.com/services/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", 14 | "shopify": "https://hooks.slack.com/services/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", 15 | "others": "https://hooks.slack.com/services/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" 16 | } 17 | 18 | non_shopify_file = r"/../nonshopify.txt" 19 | bots_file = r"/../bots.txt" 20 | non_shopify_list = list() 21 | bots_list = list() 22 | 23 | # Supported bots sites list 24 | with open(os.path.dirname(__file__) + bots_file, "rt") as f: 25 | bots_list = [url.strip() for url in f.readlines()] 26 | 27 | # Supported non shopify sites list 28 | with open(os.path.dirname(__file__) + non_shopify_file, "rt") as f: 29 | non_shopify_list = 
[url.strip() for url in f.readlines()] 30 | 31 | def post(self, item, keyword): 32 | currency = "" 33 | try: 34 | currency = item['currency'] 35 | except KeyError: 36 | pass 37 | 38 | stock = "n/a" 39 | try: 40 | stock = item['stock'] 41 | except KeyError: 42 | pass 43 | 44 | variations = list() 45 | try: 46 | variations = item['sizes'] 47 | except KeyError: 48 | pass 49 | 50 | image = "https://upload.wikimedia.org/wikipedia/commons/c/ca/1x1.png" 51 | try: 52 | image = item['image'] 53 | except KeyError: 54 | pass 55 | 56 | price = "n/a" 57 | if currency.lower() == "usd": 58 | price = '$' + str(item['price']) 59 | else: 60 | try: 61 | price = item['price'] 62 | except KeyError: 63 | pass 64 | 65 | store = urlparse(item['url']).netloc 66 | 67 | sizes = "" 68 | if stock > 0: 69 | for size in variations: 70 | quantity = size.split(' / Stock: ')[1] 71 | if int(quantity) > 0: 72 | sizes += size + '\n' 73 | sizes = sizes or "n/a" 74 | tag = "" 75 | try: 76 | tag = item['tag'] 77 | except KeyError: 78 | pass 79 | 80 | # Item is a sneaker 81 | if 'bot' not in tag: 82 | 83 | # Set correct slack channel according to item's tag 84 | if 'shopify' in tag: 85 | webhook_url = self.channels['shopify'] 86 | elif 'adidas' in tag: 87 | webhook_url = self.channels['adidas'] 88 | elif 'supreme' in tag: 89 | webhook_url = self.channels['supreme'] 90 | else: 91 | webhook_url = self.channels['others'] 92 | 93 | slack_data = { 94 | "attachments": [ 95 | { 96 | "title": item['name'], 97 | "title_link":item['url'], 98 | "color":"#3AA3E3", 99 | "fields":[ 100 | { 101 | "title": "Stock Count", 102 | "value": stock, 103 | "short": "true"}, 104 | { 105 | "title": "Store", 106 | "value": store, 107 | "short": "true"}, 108 | { 109 | "title": "Keyword matched", 110 | "value": keyword, 111 | "short": "true"}, 112 | { 113 | "title": "Price", 114 | "value": price, 115 | "short": "true"}, 116 | { 117 | "title": "Sizes", 118 | "value": sizes, 119 | "short": "true"} 120 | ], 121 | "actions": [ 122 | { 123 | "type": "button", 124 | "text": "Purchase", 125 | "url": item['url'],
126 | "style":"primary"} 127 | ], 128 | "thumb_url": image 129 | } 130 | ] 131 | } 132 | 133 | response = requests.post( 134 | webhook_url, data=json.dumps(slack_data), 135 | headers={'Content-Type': 'application/json'} 136 | ) 137 | 138 | if response.status_code != 200: 139 | raise ValueError( 140 | 'Request to slack returned an error %s, the response is:\n%s' 141 | % (response.status_code, response.text) 142 | ) 143 | 144 | # Item is a bot product 145 | else: 146 | slack_data = { 147 | "attachments": [ 148 | { 149 | "title": item['name'], 150 | "title_link":item['url'], 151 | "color":"#3AA3E3", 152 | "fields":[ 153 | { 154 | "title": "Status", 155 | "value": "In stock", 156 | "short": "true"}, 157 | { 158 | "title": "Price", 159 | "value": price, 160 | "short": "true"} 161 | ], 162 | "actions": [ 163 | { 164 | "type": "button", 165 | "text": "Purchase", 166 | "url": item['url'], 167 | "style":"primary"}], 168 | "thumb_url": image 169 | } 170 | ] 171 | } 172 | 173 | response = requests.post( 174 | self.channels['bots'], data=json.dumps(slack_data), 175 | headers={'Content-Type': 'application/json'} 176 | ) 177 | 178 | if response.status_code != 200: 179 | raise ValueError( 180 | 'Request to slack returned an error %s, the response is:\n%s' 181 | % (response.status_code, response.text) 182 | ) 183 | -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/slack.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/copshoes/Restock-Monitor/3478f13f80df8d74e07c4b8df43caec86af29bc2/sneakermonitor/sneakermonitor/slack.pyc -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/spiders/__init__.py: -------------------------------------------------------------------------------- 1 | # This package will contain the spiders of your Scrapy project 2 | # 3 | # Please refer to the documentation for 
information on how to create and manage 4 | # your spiders. 5 | -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/spiders/__init__.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/copshoes/Restock-Monitor/3478f13f80df8d74e07c4b8df43caec86af29bc2/sneakermonitor/sneakermonitor/spiders/__init__.pyc -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/spiders/monitor.py: -------------------------------------------------------------------------------- 1 | import scrapy 2 | from twisted.internet import reactor 3 | from scrapy.crawler import CrawlerProcess 4 | from scrapy.utils.project import get_project_settings 5 | from scrapy_splash import SplashRequest 6 | from scrapy.http import Request 7 | from scrapy.selector import Selector 8 | from urlparse import urlparse 9 | import jsonfinder 10 | import re 11 | import os.path 12 | import tldextract 13 | import random 14 | from sneakermonitor.items import Sneaker 15 | 16 | class Monitor(scrapy.Spider): 17 | name = "monitor" 18 | settings = get_project_settings() 19 | url_file = "/../../urls.txt" 20 | non_shopify_file = r"/../../nonshopify.txt" 21 | bots_file = r"/../../bots.txt" 22 | page = 1 23 | 24 | # Load url list and start scraping 25 | def start_requests(self): 26 | urls = list() 27 | non_shopify_list = list() 28 | bots_list = list() 29 | 30 | # Get all urls to scrape 31 | with open(os.path.dirname(__file__) + self.url_file, "rt") as f: 32 | urls = [url.strip() for url in f.readlines()] 33 | 34 | # Supported non shopify sites list 35 | with open(os.path.dirname(__file__) + self.non_shopify_file, "rt") as f: 36 | non_shopify_list = [url.strip() for url in f.readlines()] 37 | 38 | # Supported bots sites list 39 | with open(os.path.dirname(__file__) + self.bots_file, "rt") as f: 40 | bots_list = [url.strip() for url in f.readlines()] 41 | 42 | for url in urls: 43 | t = 
tldextract.extract(url) 44 | root = t.domain + '.' + t.suffix 45 | proxy_enabled = self.settings.get('PROXY_ENABLED') 46 | adidas_proxy_enabled = self.settings.get('ADIDAS_PROXY_ENABLED') 47 | 48 | # Adidas site (uses scrapy-splash) 49 | if "adidas.com" in url: 50 | # With proxy 51 | if adidas_proxy_enabled: 52 | yield SplashRequest(url, self.adidas_parse, headers=self.adidas_headers(), 53 | args={'images_enabled': 'false', 'proxy': self.random_proxy()}) 54 | 55 | # Without proxy 56 | else: 57 | yield SplashRequest(url, self.adidas_parse, headers=self.adidas_headers(), 58 | args={'images_enabled': 'false'}) 59 | 60 | # Non shopify site 61 | elif any(root in s for s in non_shopify_list): 62 | # With proxy 63 | if proxy_enabled: 64 | yield scrapy.Request(url, self.non_shoify, meta={'proxy': self.random_proxy()}) 65 | 66 | # Without proxy 67 | else: 68 | yield scrapy.Request(url, self.non_shoify) 69 | 70 | # Bots 71 | elif any(root in s for s in bots_list): 72 | # With proxy 73 | if proxy_enabled: 74 | yield scrapy.Request(url, self.bots_parse, meta={'proxy': self.random_proxy()}) 75 | 76 | # Without proxy 77 | else: 78 | yield scrapy.Request(url, self.bots_parse) 79 | 80 | # Shopify sites 81 | else: 82 | # With proxy 83 | if proxy_enabled: 84 | yield scrapy.Request(url, self.shopify_parse, meta={'proxy': self.random_proxy()}) 85 | 86 | # Without proxy 87 | else: 88 | yield scrapy.Request(url, self.shopify_parse) 89 | 90 | 91 | # Adidas headers with random useragents 92 | def adidas_headers(self): 93 | 94 | # Get useragent list 95 | with open(self.settings.get('USER_AGENT_LIST'), "r") as f: 96 | useragents = [url.strip() for url in f.readlines()] 97 | 98 | adidas_headers = { 99 | "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8", 100 | "Accept-Language": "en-US,en;q=0.9", 101 | "Host": "www.adidas.com", 102 | "Connection": "keep-alive", 103 | "Upgrade-Insecure-Requests": "1", 104 | "User-Agent": random.choice(useragents) 
105 | } 106 | 107 | return adidas_headers 108 | 109 | def random_proxy(self): 110 | # Read the proxy list once and pick a random entry (lines without the IP:Port:User:Password shape, such as the format header, are skipped) 111 | with open(self.settings.get('PROXY_LIST'), "r") as f: 112 | proxies = [line.strip() for line in f if line.count(':') == 3] 113 | rand_proxy = random.choice(proxies) 114 | ip, port, user, pw = rand_proxy.split(':') 115 | proxy = 'http://' + user + ':' + pw + '@' + ip + ':' + port 116 | return proxy 117 | 118 | # Adidas 119 | def adidas_parse(self, response): 120 | products = response.xpath('//*[@id="hc-container"]/div') 121 | 122 | for product in products: 123 | # If the product doesn't carry a "coming soon" tag (or has no tag at all), scrape it 124 | tag = product.xpath( 125 | "./div[2]/div[3]/div[2]/span/text()").extract_first() 126 | if not tag or "coming soon" not in tag.lower().strip(): 127 | sneaker = Sneaker() 128 | root_url = "https://www.adidas.com" 129 | 130 | data = product.xpath("./div/@data-context").extract_first() 131 | 132 | # Name 133 | m = re.search('name:(.*);', data) 134 | sneaker["name"] = m.group(1) 135 | 136 | # Model 137 | m = re.search('model:(.*)', data) 138 | description = 'Model: ' + m.group(1) 139 | 140 | # Id 141 | m = re.search('id:(.*);name', data) 142 | description += ' ID: ' + m.group(1) 143 | 144 | sneaker["description"] = description 145 | 146 | sneaker["image"] = product.xpath( 147 | "./div[2]/div[3]/div[3]/a/img[1]/@data-original").extract_first() 148 | 149 | sneaker["currency"] = product.xpath( 150 | "./div[2]/div[3]/div[4]/div[4]/div/span[1]/text()").extract_first().strip() 151 | 152 | sneaker["price"] = product.xpath( 153 | "./div[2]/div[3]/div[4]/div[4]/div/span[2]/text()").extract_first().strip() 154 | 155 | url = product.xpath( 156 | "./div[2]/div[3]/div[3]/a/@href").extract_first() 157 | sneaker["url"] = root_url + url 158 | 159 | sneaker["tag"] = 'adidas' 160 | 161 | yield sneaker 162 | 163 | self.page += 120 164 | if products: 165 | next_page = "http://www.adidas.com/us/men-shoes?sz=120&start=" + str(self.page) 166 | 167 | # With proxy 168 | if 
self.settings.get('ADIDAS_PROXY_ENABLED'): 169 | yield SplashRequest(next_page, self.adidas_parse, headers=self.adidas_headers(), 170 | args={'images_enabled': 'false', 'proxy': self.random_proxy()}) 171 | 172 | # Without proxy 173 | else: 174 | yield SplashRequest(next_page, self.adidas_parse, headers=self.adidas_headers(), 175 | args={'images_enabled': 'false'}) 176 | 177 | 178 | # Adidas product (accepts selector instead of response) 179 | def adidas_parse_product(self, product): 180 | sneaker = Sneaker() 181 | root_url = "https://www.adidas.com" 182 | 183 | data = product.xpath("./div/@data-context").extract_first() 184 | 185 | # Name 186 | m = re.search('name:(.*);', data) 187 | sneaker["name"] = m.group(1) 188 | 189 | # Model 190 | m = re.search('model:(.*)', data) 191 | description = 'Model: ' + m.group(1) 192 | 193 | # Id 194 | m = re.search('id:(.*);name', data) 195 | description += ' ID: ' + m.group(1) 196 | 197 | sneaker["description"] = description 198 | 199 | sneaker["image"] = product.xpath( 200 | "./div[2]/div[3]/div[3]/a/img[1]/@data-original").extract_first() 201 | 202 | sneaker["currency"] = product.xpath( 203 | "./div[2]/div[3]/div[4]/div[4]/div/span[1]/text()").extract_first().strip() 204 | 205 | sneaker["price"] = product.xpath( 206 | "./div[2]/div[3]/div[4]/div[4]/div/span[2]/text()").extract_first().strip() 207 | 208 | url = product.xpath( 209 | "./div[2]/div[3]/div[3]/a/@href").extract_first() 210 | sneaker["url"] = root_url + url 211 | 212 | sneaker["tag"] = 'adidas' 213 | 214 | yield sneaker 215 | 216 | # Shopify 217 | def shopify_parse(self, response): 218 | 219 | url = response.url 220 | if url.endswith('/'): 221 | url = url.rstrip('/') 222 | 223 | o = urlparse(url) 224 | 225 | products = response.xpath( 226 | "//a[contains(@href, '/products/')]/@href").extract() 227 | 228 | # remove image urls 229 | for product in products[:]: 230 | if "cdn.shopify" in product: 231 | products.remove(product) 232 | 233 | for product in products: 234 | 
yield scrapy.Request(response.urljoin(product), self.shopify_parse_product) 235 | 236 | self.page += 1  # NOTE: self.page is also advanced by adidas_parse; per-site counters would be safer 237 | if products: 238 | next_page = o.path + "?page=" + str(self.page) 239 | yield scrapy.Request(response.urljoin(next_page), self.shopify_parse) 240 | 241 | # Shopify product 242 | def shopify_parse_product(self, response): 243 | sneaker = Sneaker() 244 | 245 | # name 246 | sneaker["name"] = response.xpath( 247 | "//meta[@property='og:title']/@content").extract_first() 248 | 249 | # description 250 | sneaker["description"] = response.xpath( 251 | "//meta[@name='description']/@content").extract_first() 252 | 253 | # image 254 | sneaker["image"] = response.xpath( 255 | "//meta[@property='og:image']/@content").extract_first() 256 | 257 | # price 258 | sneaker["price"] = response.xpath( 259 | "//meta[@property='og:price:amount']/@content").extract_first() 260 | 261 | # currency 262 | sneaker["currency"] = response.xpath( 263 | "//meta[@property='og:price:currency']/@content").extract_first() 264 | 265 | # URL 266 | sneaker['url'] = response.url 267 | sneaker['tag'] = "shopify" 268 | 269 | sizes = list() 270 | stock = 0 271 | 272 | # find the script block that carries stock details 273 | script = response.xpath( 274 | "//script[contains(text(), 'inventory_quantity')]").extract_first() 275 | 276 | # if stock details were found, extract them 277 | if script: 278 | script = re.sub(' +', ' ', script) 279 | raw_json = None 280 | try: 281 | raw_json = re.search("({\"id\".*})", script).group(0) 282 | except AttributeError:  # re.search found no match 283 | pass 284 | 285 | if raw_json: 286 | json = jsonfinder.only_json( 287 | re.search("({.*})", raw_json).group(0))[2] 288 | 289 | sneaker['available'] = json['available'] 290 | 291 | for size in json['variants']: 292 | size_str = str(size['option1']) + ' / Stock: ' + str( 293 | size['inventory_quantity']) 294 | 295 | stock += int(size['inventory_quantity']) 296 | sizes.append(size_str) 297 | 298 | # else just check whether an add-to-cart button is present 299 | else: 300 | add_to_cart = None 301 |
try: 302 | add_to_cart = response.xpath( 303 | "//*[@name='add' and @type='submit']").extract_first() 304 | except: 305 | pass 306 | 307 | if add_to_cart is None: 308 | sneaker['available'] = False 309 | else: 310 | sneaker['available'] = True 311 | 312 | sneaker['sizes'] = sizes 313 | sneaker['stock'] = stock 314 | 315 | return sneaker 316 | 317 | # Bots availability checker 318 | def bots_parse(self, response): 319 | t = tldextract.extract(response.url) 320 | bot = Sneaker() 321 | 322 | bot["name"] = response.xpath( 323 | "//meta[@property='og:title']/@content").extract_first() 324 | bot["description"] = response.xpath( 325 | "//meta[@name='description']/@content").extract_first() 326 | bot["image"] = response.xpath( 327 | "//meta[@property='og:image']/@content").extract_first() 328 | bot["price"] = response.xpath( 329 | "//meta[@property='og:price:amount']/@content").extract_first() 330 | bot["currency"] = response.xpath( 331 | "//meta[@property='og:price:currency']/@content").extract_first() 332 | bot['url'] = response.url 333 | bot['tag'] = 'bot' 334 | 335 | availability = response.xpath( 336 | '//span[contains(@id, "AddToCartText")]/text()').extract_first() 337 | 338 | # If the bot is available 339 | if availability is not None and "sold out" not in availability.lower().strip(): 340 | bot['available'] = True 341 | else: 342 | bot['available'] = False 343 | 344 | return bot 345 | 346 | # Non-Shopify sites 347 | def non_shoify(self, response): 348 | t = tldextract.extract(response.url) 349 | root = t.domain + '.'
+ t.suffix 350 | 351 | if "footshop.com" in root: 352 | products = Selector(response).xpath( 353 | '//div[@class="col-xs-6 col-md-4 col-lg-3"]') 354 | 355 | for product in products: 356 | item = Sneaker() 357 | item['name'] = product.xpath('a/@title').extract()[0] 358 | item['url'] = product.xpath('a/@href').extract()[0] 359 | # item['image'] = product.xpath('a/div/img/@data-src').extract()[0] 360 | # item['size'] = '**NOT SUPPORTED YET**' 361 | yield item 362 | 363 | elif "caliroots.com" in root: 364 | products = Selector(response).xpath( 365 | '//ul[@class="product-list row"]//li[contains(@class,"product")]') 366 | 367 | for product in products: 368 | item = Sneaker() 369 | item['name'] = product.xpath('.//a/p[2]/text()').extract()[0] 370 | item['url'] = "https://caliroots.com" + \ 371 | product.xpath('.//a/@href').extract()[0] 372 | # item['image'] = product.xpath('.//a/div/img/@src').extract()[0] 373 | # item['size'] = '**NOT SUPPORTED YET**' 374 | yield item 375 | 376 | elif "size.co.uk" in root: 377 | products = Selector(response).xpath( 378 | '//ul[@class="listProducts productList"]//li[contains(@class,"productListItem")]') 379 | 380 | for product in products: 381 | item = Sneaker() 382 | item['name'] = product.xpath( 383 | './/span/span/span/a/text()').extract()[0] 384 | item['url'] = "https://www.size.co.uk" + \ 385 | product.xpath('.//span/span/span/a/@href').extract()[0] 386 | # item['image'] = product.xpath('.//span/a/img/@src').extract()[0] 387 | # item['size'] = '**NOT SUPPORTED YET**' 388 | yield item 389 | 390 | elif "jdsports.co.uk" in root: 391 | products = Selector(response).xpath( 392 | '//ul[@class="listProducts productList"]//li[contains(@class,"productListItem")]') 393 | 394 | for product in products: 395 | item = Sneaker() 396 | item['name'] = product.xpath( 397 | './/span/a/img/@title').extract()[0] 398 | item['url'] = "https://www.jdsports.co.uk" + \ 399 | product.xpath('.//span/a/@href').extract()[0] 400 | # item['image'] = 
product.xpath('.//span/a/img/@src').extract()[0] 401 | # item['size'] = '**NOT SUPPORTED YET**' 402 | yield item 403 | 404 | elif "5pointz.co.uk" in root: 405 | products = Selector(response).xpath( 406 | '//ol[@class="listing listing--grid"]//li[contains(@class,"listing-item")]//article//figure') 407 | 408 | for product in products: 409 | item = Sneaker() 410 | item['name'] = product.xpath('a/@title').extract()[0] 411 | item['url'] = product.xpath('a/@href').extract()[0] 412 | # item['image'] = product.xpath('a/img/@src').extract()[0] 413 | # item['size'] = '**NOT SUPPORTED YET**' 414 | yield item 415 | 416 | elif "footasylum.com" in root: 417 | products = Selector(response).xpath( 418 | '//div[@class="productDataOnPage_inner"]//ul[@class="main-list row"]//li[contains(@class,"left")]') 419 | 420 | for product in products: 421 | item = Sneaker() 422 | item['name'] = product.xpath( 423 | 'div/span[2]/img/@alt').extract()[0] 424 | item['url'] = product.xpath('div/span[1]/text()').extract()[0] 425 | # item['image'] = "https://www.footasylum.com" + product.xpath('div/span[2]/img/@data-original').extract()[0] 426 | # item['size'] = '**NOT SUPPORTED YET**' 427 | yield item 428 | 429 | elif "asphaltgold.de" in root: 430 | products = Selector(response).xpath( 431 | '//div[@class="product-grid"]//section[contains(@class,"item")]') 432 | 433 | for product in products: 434 | item = Sneaker() 435 | item['name'] = product.xpath('a/@title').extract()[0] 436 | item['url'] = product.xpath('a/@href').extract()[0] 437 | # item['image'] = product.xpath('a/img//@src').extract()[0] 438 | # item['size'] = '**NOT SUPPORTED YET**' 439 | yield item 440 | 441 | elif "wellgosh.com" in root: 442 | products = Selector(response).xpath( 443 | '//div[@class="category-products row grid-mode"]//article[contains(@class,"small-6")]') 444 | 445 | for product in products: 446 | item = Sneaker() 447 | item['name'] = product.xpath('.//figure/a/@title').extract()[0] 448 | item['url'] = 
product.xpath('.//figure/a/@href').extract()[0] 449 | # item['image'] = product.xpath('.//figure/a/img/@src').extract()[0] 450 | # item['size'] = '**NOT SUPPORTED YET**' 451 | yield item 452 | 453 | elif "hypedc.com" in root: 454 | products = Selector(response).xpath( 455 | '//div[@class="category-products row"]//div[contains(@class,"item")]') 456 | 457 | for product in products: 458 | item = Sneaker() 459 | item['name'] = product.xpath('.//a/@title').extract()[0] 460 | item['url'] = product.xpath('.//a/@href').extract()[0] 461 | # item['image'] = product.xpath('.//a/div/img/@data-src').extract()[0] 462 | # item['size'] = '**NOT SUPPORTED YET**' 463 | yield item 464 | 465 | elif "bstnstore.com" in root: 466 | products = Selector(response).xpath( 467 | '//ul[@class="block-grid four-up mobile-two-up productlist"]//li[contains(@class,"item")]//div[@class="itemWrapper pOverlay"]//div[@class="pImageContainer"]//a[@class="plink image"]') 468 | 469 | for product in products: 470 | item = Sneaker() 471 | item['name'] = product.xpath('div/@data-alt').extract()[0] 472 | item['url'] = "https://www.bstnstore.com" + \ 473 | product.xpath('@href').extract()[0] 474 | # item['image'] = "https://www.bstnstore.com" + product.xpath('div/div[2]/@data-src').extract()[0] 475 | # item['size'] = '**NOT SUPPORTED YET**' 476 | yield item 477 | 478 | elif "allikestore.com" in root: 479 | products = Selector(response).xpath( 480 | '//ul[@class="products-grid"]//li[contains(@class,"item")]//div[@class="item-wrap"]') 481 | 482 | for product in products: 483 | item = Sneaker() 484 | item['name'] = product.xpath('a/@title').extract()[0] 485 | item['url'] = product.xpath('a/@href').extract()[0] 486 | # item['image'] = product.xpath('a/img/@src').extract()[0] 487 | # item['size'] = '**NOT SUPPORTED YET**' 488 | yield item 489 | 490 | elif "back-door.it" in root: 491 | products = Selector(response).xpath( 492 | '//ul[@class="products clearfix"]//li') 493 | 494 | for product in products: 495 | item = 
Sneaker() 496 | item['name'] = product.xpath('a[1]/h6/text()').extract()[0] 497 | item['url'] = product.xpath('a[1]/@href').extract()[0] 498 | # item['image'] = product.xpath('div/a[2]/span/img/@src').extract()[0] 499 | # item['size'] = '**NOT SUPPORTED YET**' 500 | yield item 501 | 502 | elif "mrporter.com" in root: 503 | products = Selector(response).xpath( 504 | '//div[@class="pl-grid__column pl-grid__column--main"]//ul[@class="pl-products"]//li[contains(@class,"pl-products-item")]') 505 | 506 | for product in products: 507 | item = Sneaker() 508 | item['name'] = product.xpath( 509 | 'a/div[2]/div/span[2]/text()').extract()[0].replace(" Sneakers", "") 510 | item['url'] = "https://www.mrporter.com" + \ 511 | product.xpath('a/@href').extract()[0] 512 | # item['image'] = product.xpath('a/div[1]/img/@src').extract()[0] 513 | # item['size'] = '**NOT SUPPORTED YET**' 514 | yield item 515 | 516 | elif "titolo.ch" in root: 517 | products = Selector(response).xpath( 518 | '//ul[@class="small-block-grid-2 medium-block-grid-3 large-block-grid-4 no-bullet"]//li[contains(@class,"item")]//div[@class="list-inner-wrapper"]') 519 | 520 | for product in products: 521 | item = Sneaker() 522 | item['name'] = product.xpath('a/@title').extract()[0] 523 | item['url'] = product.xpath('a/@href').extract()[0] 524 | # item['image'] = product.xpath('div[1]/a/img/@src').extract()[0] 525 | # item['size'] = '**NOT SUPPORTED YET**' 526 | yield item 527 | 528 | elif "xileclothing.com" in root: 529 | products = Selector(response).xpath( 530 | '//ul[@class="itemsList"]/li/div[1]') 531 | 532 | for product in products: 533 | item = Sneaker() 534 | item['name'] = product.xpath('a/img/@alt').extract()[0] 535 | item['url'] = product.xpath('a/@href').extract()[0] 536 | # item['image'] = "https://www.xileclothing.com" + product.xpath('a/img/@src').extract()[0] 537 | # item['size'] = '**NOT SUPPORTED YET**' 538 | yield item 539 | 540 | class Sneaker(scrapy.Item): 541 | name = scrapy.Field() 542 | 
description = scrapy.Field() 543 | image = scrapy.Field() 544 | price = scrapy.Field() 545 | currency = scrapy.Field() 546 | url = scrapy.Field() 547 | available = scrapy.Field() 548 | stock = scrapy.Field() 549 | sizes = scrapy.Field() 550 | tag = scrapy.Field() 551 | 552 | 553 | process = CrawlerProcess(settings=get_project_settings()) 554 | 555 | def crawl(): 556 | d = process.crawl(Monitor) 557 | d.addBoth(crawl_done) 558 | 559 | def crawl_done(error): 560 | crawl() 561 | 562 | crawl() 563 | reactor.run() 564 | -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/spiders/monitor.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/copshoes/Restock-Monitor/3478f13f80df8d74e07c4b8df43caec86af29bc2/sneakermonitor/sneakermonitor/spiders/monitor.pyc -------------------------------------------------------------------------------- /sneakermonitor/sneakermonitor/test.py: -------------------------------------------------------------------------------- 1 | from slack import Slack 2 | from items import Sneaker 3 | 4 | sneaker = Sneaker() 5 | 6 | sizes = list() 7 | sizes.append("5 / Stock: 10") 8 | sizes.append("10 / Stock: 0") 9 | sizes.append("20 / Stock: 30") 10 | sizes.append("30 / Stock: 0") 11 | sizes.append("40 / Stock: 50") 12 | 13 | sneaker['name'] = "Trip.io beta V2" 14 | sneaker['price'] = 200 15 | sneaker['currency'] = "USD" 16 | sneaker['url'] = "https://shop.bdgastore.com/products/w-nike-air-max-plus-lux" 17 | sneaker['image'] = "http://cdn.shopify.com/s/files/1/0049/9112/products/BTKA_15144940918813283_f79c0aa0e2825fd0d41629395bbb49_grande.jpg?v=1514494399" 18 | sneaker['stock'] = 150 19 | sneaker['sizes'] = sizes 20 | # sneaker['tag'] = "supreme" 21 | 22 | match_type = "New product" 23 | 24 | slack = Slack() 25 | slack.post(sneaker, match_type) 26 | -------------------------------------------------------------------------------- 
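The proxies.txt format described in the README (`IP:Port:User:Password`) has to be rewritten into the `http://user:password@ip:port` URL form that Splash's `proxy` argument expects, which is what `random_proxy` in the spider does. A minimal standalone sketch of that conversion, useful for checking proxy lines outside the spider — the helper names `format_proxy` and `pick_proxy` are illustrative, not part of the project:

```python
import random

def format_proxy(line):
    """Convert an 'IP:Port:User:Password' proxy line into a proxy URL."""
    ip, port, user, pw = line.strip().split(':')
    return 'http://' + user + ':' + pw + '@' + ip + ':' + port

def pick_proxy(path):
    """Pick a random non-blank line from a proxies.txt-style file."""
    with open(path) as f:
        proxies = [line.rstrip('\n') for line in f if line.strip()]
    return format_proxy(random.choice(proxies))
```

A line that does not split into exactly four fields raises `ValueError`, which is a quick way to spot malformed entries in proxies.txt.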
/sneakermonitor/sneakermonitor/useragents.txt: -------------------------------------------------------------------------------- 1 | Mozilla/5.0 (Amiga; U; AmigaOS 1.3; en; rv:1.8.1.19) Gecko/20081204 SeaMonkey/1.1.14 2 | Mozilla/5.0 (AmigaOS; U; AmigaOS 1.3; en-US; rv:1.8.1.21) Gecko/20090303 SeaMonkey/1.1.15 3 | Mozilla/5.0 (AmigaOS; U; AmigaOS 1.3; en; rv:1.8.1.19) Gecko/20081204 SeaMonkey/1.1.14 4 | Mozilla/5.0 (Android 2.2; Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.19.4 (KHTML, like Gecko) Version/5.0.3 Safari/533.19.4 5 | Mozilla/5.0 (BeOS; U; BeOS BeBox; fr; rv:1.9) Gecko/2008052906 BonEcho/2.0 6 | Mozilla/5.0 (BeOS; U; BeOS BePC; en-US; rv:1.8.1.1) Gecko/20061220 BonEcho/2.0.0.1 7 | 8 | -------------------------------------------------------------------------------- /sneakermonitor/urls.txt: -------------------------------------------------------------------------------- 1 | https://www.footshop.com/en/5-mens-shoes/brand-adidas_originals-adidas_running-jordan-nike/page-1c4 -------------------------------------------------------------------------------- /sneakermonitor/useragents.txt: -------------------------------------------------------------------------------- 1 | Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36 2 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36 3 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.0 Safari/537.36 4 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.0 Safari/537.36 5 | Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2226.0 Safari/537.36 6 | Mozilla/5.0 (Windows NT 6.4; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2225.0 Safari/537.36 7 | Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2225.0 Safari/537.36 8 | Mozilla/5.0 
(Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2224.3 Safari/537.36 9 | Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.93 Safari/537.36 10 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.124 Safari/537.36 11 | Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36 12 | Mozilla/5.0 (Windows NT 4.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36 13 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.67 Safari/537.36 14 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.67 Safari/537.36 15 | Mozilla/5.0 (X11; OpenBSD i386) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.125 Safari/537.36 16 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1944.0 Safari/537.36 17 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.3319.102 Safari/537.36 18 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.2309.372 Safari/537.36 19 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.2117.157 Safari/537.36 20 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.47 Safari/537.36 21 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1866.237 Safari/537.36 22 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.137 Safari/4E423F 23 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.116 Safari/537.36 Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B334b Safari/531.21.10 24 | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like 
Gecko) Chrome/33.0.1750.517 Safari/537.36 25 | Mozilla/5.0 (Windows NT 6.2; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1667.0 Safari/537.36 26 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1664.3 Safari/537.36 27 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1664.3 Safari/537.36 28 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.16 Safari/537.36 29 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1623.0 Safari/537.36 30 | Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.17 Safari/537.36 31 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1 32 | Mozilla/5.0 (Windows NT 6.3; rv:36.0) Gecko/20100101 Firefox/36.0 33 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10; rv:33.0) Gecko/20100101 Firefox/33.0 34 | Mozilla/5.0 (X11; Linux i586; rv:31.0) Gecko/20100101 Firefox/31.0 35 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:31.0) Gecko/20130401 Firefox/31.0 36 | Mozilla/5.0 (Windows NT 5.1; rv:31.0) Gecko/20100101 Firefox/31.0 37 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20120101 Firefox/29.0 38 | Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:25.0) Gecko/20100101 Firefox/29.0 39 | Mozilla/5.0 (X11; OpenBSD amd64; rv:28.0) Gecko/20100101 Firefox/28.0 40 | Mozilla/5.0 (X11; Linux x86_64; rv:28.0) Gecko/20100101 Firefox/28.0 41 | Mozilla/5.0 (Windows NT 6.1; rv:27.3) Gecko/20130101 Firefox/27.3 42 | Mozilla/5.0 (Windows NT 6.2; Win64; x64; rv:27.0) Gecko/20121011 Firefox/27.0 43 | Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:25.0) Gecko/20100101 Firefox/25.0 44 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:25.0) Gecko/20100101 Firefox/25.0 45 | Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:24.0) Gecko/20100101 Firefox/24.0 46 | Mozilla/5.0 (Windows NT 6.0; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0 47 | 
Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:24.0) Gecko/20100101 Firefox/24.0 48 | Mozilla/5.0 (Windows NT 6.2; rv:22.0) Gecko/20130405 Firefox/23.0 49 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:23.0) Gecko/20130406 Firefox/23.0 50 | Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:23.0) Gecko/20131011 Firefox/23.0 51 | Mozilla/5.0 (Windows NT 6.2; rv:22.0) Gecko/20130405 Firefox/22.0 52 | Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:22.0) Gecko/20130328 Firefox/22.0 53 | Mozilla/5.0 (Windows NT 6.1; rv:22.0) Gecko/20130405 Firefox/22.0 54 | Mozilla/5.0 (Microsoft Windows NT 6.2.9200.0); rv:22.0) Gecko/20130405 Firefox/22.0 55 | Mozilla/5.0 (Windows NT 6.2; Win64; x64; rv:16.0.1) Gecko/20121011 Firefox/21.0.1 56 | Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:16.0.1) Gecko/20121011 Firefox/21.0.1 57 | Mozilla/5.0 (Windows NT 6.2; Win64; x64; rv:21.0.0) Gecko/20121011 Firefox/21.0.0 58 | Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:21.0) Gecko/20130331 Firefox/21.0 59 | Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:21.0) Gecko/20100101 Firefox/21.0 60 | Mozilla/5.0 (X11; Linux i686; rv:21.0) Gecko/20100101 Firefox/21.0 61 | Opera/9.80 (X11; Linux i686; Ubuntu/14.10) Presto/2.12.388 Version/12.16 62 | Opera/9.80 (Windows NT 6.0) Presto/2.12.388 Version/12.14 63 | Mozilla/5.0 (Windows NT 6.0; rv:2.0) Gecko/20100101 Firefox/4.0 Opera 12.14 64 | Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.0) Opera 12.14 65 | Opera/12.80 (Windows NT 5.1; U; en) Presto/2.10.289 Version/12.02 66 | Opera/9.80 (Windows NT 6.1; U; es-ES) Presto/2.9.181 Version/12.00 67 | Opera/9.80 (Windows NT 5.1; U; zh-sg) Presto/2.9.181 Version/12.00 68 | Opera/12.0(Windows NT 5.2;U;en)Presto/22.9.168 Version/12.00 69 | Opera/12.0(Windows NT 5.1;U;en)Presto/22.9.168 Version/12.00 70 | Mozilla/5.0 (Windows NT 5.1) Gecko/20100101 Firefox/14.0 Opera/12.0 71 | Opera/9.80 (Windows NT 6.1; WOW64; U; pt) Presto/2.10.229 Version/11.62 72 | Opera/9.80 (Windows NT 6.0; U; pl) Presto/2.10.229 Version/11.62 73 | Opera/9.80 
(Macintosh; Intel Mac OS X 10.6.8; U; fr) Presto/2.9.168 Version/11.52 74 | Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; de) Presto/2.9.168 Version/11.52 75 | Opera/9.80 (Windows NT 5.1; U; en) Presto/2.9.168 Version/11.51 76 | Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; de) Opera 11.51 77 | Opera/9.80 (X11; Linux x86_64; U; fr) Presto/2.9.168 Version/11.50 78 | Opera/9.80 (X11; Linux i686; U; hu) Presto/2.9.168 Version/11.50 79 | Opera/9.80 (X11; Linux i686; U; ru) Presto/2.8.131 Version/11.11 80 | Opera/9.80 (X11; Linux i686; U; es-ES) Presto/2.8.131 Version/11.11 81 | Mozilla/5.0 (Windows NT 5.1; U; en; rv:1.8.1) Gecko/20061208 Firefox/5.0 Opera 11.11 82 | Opera/9.80 (X11; Linux x86_64; U; bg) Presto/2.8.131 Version/11.10 83 | Opera/9.80 (Windows NT 6.0; U; en) Presto/2.8.99 Version/11.10 84 | Opera/9.80 (Windows NT 5.1; U; zh-tw) Presto/2.8.131 Version/11.10 85 | Opera/9.80 (Windows NT 6.1; Opera Tablet/15165; U; en) Presto/2.8.149 Version/11.1 86 | Opera/9.80 (X11; Linux x86_64; U; Ubuntu/10.10 (maverick); pl) Presto/2.7.62 Version/11.01 87 | Opera/9.80 (X11; Linux i686; U; ja) Presto/2.7.62 Version/11.01 88 | Opera/9.80 (X11; Linux i686; U; fr) Presto/2.7.62 Version/11.01 89 | Opera/9.80 (Windows NT 6.1; U; zh-tw) Presto/2.7.62 Version/11.01 90 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.75.14 (KHTML, like Gecko) Version/7.0.3 Safari/7046A194A 91 | Mozilla/5.0 (iPad; CPU OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5355d Safari/8536.25 92 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/537.13+ (KHTML, like Gecko) Version/5.1.7 Safari/534.57.2 93 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/534.55.3 (KHTML, like Gecko) Version/5.1.3 Safari/534.53.10 94 | Mozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko ) Version/5.1 Mobile/9B176 Safari/7534.48.3 95 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; de-at) AppleWebKit/533.21.1 
(KHTML, like Gecko) Version/5.0.5 Safari/533.21.1 96 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_7; da-dk) AppleWebKit/533.21.1 (KHTML, like Gecko) Version/5.0.5 Safari/533.21.1 97 | Mozilla/5.0 (Windows; U; Windows NT 6.1; tr-TR) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 98 | Mozilla/5.0 (Windows; U; Windows NT 6.1; ko-KR) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 99 | Mozilla/5.0 (Windows; U; Windows NT 6.1; fr-FR) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 100 | Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 101 | Mozilla/5.0 (Windows; U; Windows NT 6.1; cs-CZ) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 102 | Mozilla/5.0 (Windows; U; Windows NT 6.0; ja-JP) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 103 | Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 104 | Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_5_8; zh-cn) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 105 | Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_5_8; ja-jp) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 106 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_7; ja-jp) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 107 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; zh-cn) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 108 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; sv-se) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 109 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; ko-kr) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 110 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; ja-jp) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 
Safari/533.20.27 111 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; it-it) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 112 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; fr-fr) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 113 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; es-es) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 114 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-us) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 115 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-gb) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 116 | Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; de-de) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27 117 | Mozilla/5.0 (Windows; U; Windows NT 6.1; sv-SE) AppleWebKit/533.19.4 (KHTML, like Gecko) Version/5.0.3 Safari/533.19.4 118 | Mozilla/5.0 (Windows; U; Windows NT 6.1; ja-JP) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.3 Safari/533.19.4 119 | Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.3 Safari/533.19.4 120 | Mozilla/5.0 (Windows; U; Windows NT 6.0; hu-HU) AppleWebKit/533.19.4 (KHTML, like Gecko) Version/5.0.3 Safari/533.19.4 --------------------------------------------------------------------------------
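The user-agent lists above are meant to be rotated on every request so that repeated polling is harder to fingerprint. A minimal sketch of a Scrapy downloader middleware that does this — the class name `RandomUserAgentMiddleware` and the `USER_AGENT_LIST` setting are illustrative assumptions; the repo's own `random_useragent.py` may differ:

```python
import random

class RandomUserAgentMiddleware(object):
    """Downloader middleware that sets a random User-Agent on each request."""

    def __init__(self, agents):
        self.agents = agents

    @classmethod
    def from_crawler(cls, crawler):
        # Load one user-agent string per non-blank line of the list file.
        path = crawler.settings.get('USER_AGENT_LIST', 'useragents.txt')
        with open(path) as f:
            agents = [line.strip() for line in f if line.strip()]
        return cls(agents)

    def process_request(self, request, spider):
        # Only set the header if the request hasn't chosen one already.
        request.headers.setdefault('User-Agent', random.choice(self.agents))
```

To enable it, the class would be registered in `DOWNLOADER_MIDDLEWARES` in `settings.py`, typically while disabling Scrapy's built-in `UserAgentMiddleware` so the two don't conflict.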