├── .dockerignore
├── requirements.txt
├── deployment
│   ├── supervisord.conf
│   ├── scraper.conf
│   ├── deploy.sh
│   └── setup_scraper.sh
├── main_loop.py
├── LICENSE
├── Dockerfile
├── .gitignore
├── util.py
├── scraper.py
├── settings.py
└── README.md
/.dockerignore:
--------------------------------------------------------------------------------
1 | listings.db
2 | private.py
3 | .idea
4 | .git
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | -e git+https://github.com/juliomalegria/python-craigslist.git#egg=python-craigslist
2 | sqlalchemy
3 | python-dateutil
4 | ipython
5 | slackclient
6 |
--------------------------------------------------------------------------------
/deployment/supervisord.conf:
--------------------------------------------------------------------------------
1 | [supervisord]
2 | nodaemon=true
3 |
4 | [program:afinder]
5 | redirect_stderr=true
6 | stdout_logfile=/opt/wwc/logs/afinder.log
7 | directory=/opt/wwc/apartment-finder
8 | command=python3 -u main_loop.py
9 |
--------------------------------------------------------------------------------
/deployment/scraper.conf:
--------------------------------------------------------------------------------
1 | start on runlevel [2345]
2 | stop on runlevel [016]
3 |
4 | respawn
5 | setuid ubuntu
6 | setgid ubuntu
7 | chdir /home/ubuntu/apartment-finder
8 | env WORKDIR=/home/ubuntu/apartment-finder
9 | exec python3 main_loop.py
--------------------------------------------------------------------------------
/deployment/deploy.sh:
--------------------------------------------------------------------------------
1 | #!
/bin/bash
2 |
3 | HOST=$1
4 | DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
5 | rsync -avz --exclude listings.db --exclude __pycache__ "$DIR/../" "$HOST:~/apartment-finder"
6 | ssh "$HOST" '/bin/bash -s' < "$DIR/setup_scraper.sh"
--------------------------------------------------------------------------------
/deployment/setup_scraper.sh:
--------------------------------------------------------------------------------
1 | #! /bin/bash
2 |
3 | cd ~/apartment-finder
4 |
5 | sudo apt-get update
6 | sudo apt-get install -y python3-pip
7 | sudo apt-get install -y make build-essential libssl-dev zlib1g-dev libbz2-dev \
8 |     libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev zip git-core sqlite3
9 | sudo pip3 install -r requirements.txt
10 |
11 | sudo service scraper stop || true
12 | sudo cp deployment/scraper.conf /etc/init/scraper.conf
13 | sudo service scraper start
--------------------------------------------------------------------------------
/main_loop.py:
--------------------------------------------------------------------------------
1 | from scraper import do_scrape
2 | import settings
3 | import time
4 | import sys
5 | import traceback
6 |
7 | if __name__ == "__main__":
8 |     while True:
9 |         print("{}: Starting scrape cycle".format(time.ctime()))
10 |         try:
11 |             do_scrape()
12 |         except KeyboardInterrupt:
13 |             print("Exiting....")
14 |             sys.exit(1)
15 |         except Exception as exc:
16 |             print("Error with the scraping:", exc)
17 |             traceback.print_exc()
18 |         else:
19 |             print("{}: Successfully finished scraping".format(time.ctime()))
20 |         time.sleep(settings.SLEEP_INTERVAL)
21 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | The MIT License (MIT)
2 | Copyright (c) 2016 Vik Paruchuri
3 |
4 | Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: 5 | 6 | The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 7 | 8 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 9 | -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | FROM ubuntu:trusty 2 | 3 | RUN locale-gen en_US.UTF-8 4 | ENV LANG en_US.UTF-8 5 | ENV LANGUAGE en_US:en 6 | ENV LC_ALL en_US.UTF-8 7 | 8 | RUN apt-get update && \ 9 | apt-get -y install \ 10 | python3 \ 11 | python3-pip \ 12 | make \ 13 | build-essential \ 14 | libssl-dev \ 15 | zlib1g-dev \ 16 | libbz2-dev \ 17 | libreadline-dev \ 18 | libsqlite3-dev \ 19 | wget \ 20 | curl \ 21 | llvm \ 22 | libncurses5-dev \ 23 | zip \ 24 | git-core \ 25 | supervisor \ 26 | sqlite 27 | 28 | RUN mkdir -p /tmp 29 | COPY requirements.txt /tmp/requirements.txt 30 | RUN pip3 install -r /tmp/requirements.txt 31 | 32 | COPY deployment/supervisord.conf /etc/supervisor/conf.d/supervisord.conf 33 | RUN mkdir -p /opt/wwc 34 | ADD . 
/opt/wwc/apartment-finder 35 | 36 | RUN mkdir -p /opt/wwc/logs 37 | WORKDIR /opt/wwc/apartment-finder 38 | 39 | CMD ["/usr/bin/supervisord"] -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | env/ 12 | build/ 13 | develop-eggs/ 14 | dist/ 15 | downloads/ 16 | eggs/ 17 | .eggs/ 18 | lib/ 19 | lib64/ 20 | parts/ 21 | sdist/ 22 | var/ 23 | *.egg-info/ 24 | .installed.cfg 25 | *.egg 26 | 27 | # PyInstaller 28 | # Usually these files are written by a python script from a template 29 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 30 | *.manifest 31 | *.spec 32 | 33 | # Installer logs 34 | pip-log.txt 35 | pip-delete-this-directory.txt 36 | 37 | # Unit test / coverage reports 38 | htmlcov/ 39 | .tox/ 40 | .coverage 41 | .coverage.* 42 | .cache 43 | nosetests.xml 44 | coverage.xml 45 | *,cover 46 | .hypothesis/ 47 | 48 | # Translations 49 | *.mo 50 | *.pot 51 | 52 | # Django stuff: 53 | *.log 54 | local_settings.py 55 | 56 | # Flask stuff: 57 | instance/ 58 | .webassets-cache 59 | 60 | # Scrapy stuff: 61 | .scrapy 62 | 63 | # Sphinx documentation 64 | docs/_build/ 65 | 66 | # PyBuilder 67 | target/ 68 | 69 | # IPython Notebook 70 | .ipynb_checkpoints 71 | 72 | # pyenv 73 | .python-version 74 | 75 | # celery beat schedule file 76 | celerybeat-schedule 77 | 78 | # dotenv 79 | .env 80 | 81 | # virtualenv 82 | venv/ 83 | ENV/ 84 | 85 | # Spyder project settings 86 | .spyderproject 87 | 88 | # Rope project settings 89 | .ropeproject 90 | 91 | # Project specific 92 | .idea/ 93 | private.py 94 | listings.db 95 | -------------------------------------------------------------------------------- /util.py: 
--------------------------------------------------------------------------------
1 | import settings
2 | import math
3 |
4 | def coord_distance(lat1, lon1, lat2, lon2):
5 |     """
6 |     Finds the great-circle distance between two latitude/longitude points, using the haversine formula.
7 |     :param lat1: Point 1 latitude.
8 |     :param lon1: Point 1 longitude.
9 |     :param lat2: Point 2 latitude.
10 |     :param lon2: Point 2 longitude.
11 |     :return: Distance in kilometers.
12 |     """
13 |     lon1, lat1, lon2, lat2 = map(math.radians, [lon1, lat1, lon2, lat2])
14 |     dlon = lon2 - lon1
15 |     dlat = lat2 - lat1
16 |     a = math.sin(dlat/2)**2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon/2)**2
17 |     c = 2 * math.asin(math.sqrt(a))
18 |     km = 6367 * c  # 6367 km is approximately the Earth's radius.
19 |     return km
20 |
21 | def in_box(coords, box):
22 |     """
23 |     Find if a coordinate tuple is inside a bounding box.
24 |     :param coords: Tuple containing latitude and longitude.
25 |     :param box: Two (latitude, longitude) pairs: the first with the lower latitude and higher longitude,
26 |     the second with the higher latitude and lower longitude (the ordering used in settings.BOXES).
27 |     :return: Boolean indicating if the coordinates are in the box.
28 |     """
29 |     if box[0][0] < coords[0] < box[1][0] and box[1][1] < coords[1] < box[0][1]:
30 |         return True
31 |     return False
32 |
33 | def post_listing_to_slack(sc, listing):
34 |     """
35 |     Posts the listing to Slack.
36 |     :param sc: A Slack client.
37 |     :param listing: A record of the listing.
38 |     """
39 |     desc = "{0} | {1} | {2} | {3} | <{4}>".format(listing["area"], listing["price"], listing["bart_dist"], listing["name"], listing["url"])
40 |     sc.api_call(
41 |         "chat.postMessage", channel=settings.SLACK_CHANNEL, text=desc,
42 |         username='pybot', icon_emoji=':robot_face:'
43 |     )
44 |
45 | def find_points_of_interest(geotag, location):
46 |     """
47 |     Find points of interest, like transit, near a result.
48 |     :param geotag: The geotag field of a Craigslist result.
49 |     :param location: The where field of a Craigslist result. Is a string containing a description of where
50 |     the listing was posted.
51 |     :return: A dictionary containing annotations.
52 |     """
53 |     area_found = False
54 |     area = ""
55 |     min_dist = None
56 |     near_bart = False
57 |     bart_dist = "N/A"
58 |     bart = ""
59 |     # Look to see if the listing is in any of the neighborhood boxes we defined.
60 |     for a, coords in settings.BOXES.items():
61 |         if in_box(geotag, coords):
62 |             area = a
63 |             area_found = True
64 |
65 |     # Check to see if the listing is near any transit stations.
66 |     for station, coords in settings.TRANSIT_STATIONS.items():
67 |         dist = coord_distance(coords[0], coords[1], geotag[0], geotag[1])
68 |         if (min_dist is None or dist < min_dist) and dist < settings.MAX_TRANSIT_DIST:
69 |             bart = station
70 |             near_bart = True
71 |
72 |         if min_dist is None or dist < min_dist:
73 |             # Track the closest station seen so far, so bart_dist ends up as the minimum distance.
74 |             min_dist = dist
75 |             bart_dist = dist
76 |
77 |     # If the listing isn't in any of the boxes we defined, check to see if the string description of the neighborhood
78 |     # matches anything in our list of neighborhoods.
79 |     if len(area) == 0:
80 |         for hood in settings.NEIGHBORHOODS:
81 |             if hood in location.lower():
82 |                 area = hood
83 |
84 |     return {
85 |         "area_found": area_found,
86 |         "area": area,
87 |         "near_bart": near_bart,
88 |         "bart_dist": bart_dist,
89 |         "bart": bart
90 |     }
91 |
--------------------------------------------------------------------------------
/scraper.py:
--------------------------------------------------------------------------------
1 | from craigslist import CraigslistHousing
2 | from sqlalchemy import create_engine
3 | from sqlalchemy.ext.declarative import declarative_base
4 | from sqlalchemy import Column, Integer, String, DateTime, Float, Boolean
5 | from sqlalchemy.orm import sessionmaker
6 | from dateutil.parser import parse
7 | from util import post_listing_to_slack, find_points_of_interest
8 | from slackclient import SlackClient
9 | import time
10 | import settings
11 |
12 | engine = create_engine('sqlite:///listings.db', echo=False)
13 |
14 | Base = declarative_base()
15 |
16 | class Listing(Base):
17 |
""" 18 | A table to store data on craigslist listings. 19 | """ 20 | 21 | __tablename__ = 'listings' 22 | 23 | id = Column(Integer, primary_key=True) 24 | link = Column(String, unique=True) 25 | created = Column(DateTime) 26 | geotag = Column(String) 27 | lat = Column(Float) 28 | lon = Column(Float) 29 | name = Column(String) 30 | price = Column(Float) 31 | location = Column(String) 32 | cl_id = Column(Integer, unique=True) 33 | area = Column(String) 34 | bart_stop = Column(String) 35 | 36 | Base.metadata.create_all(engine) 37 | 38 | Session = sessionmaker(bind=engine) 39 | session = Session() 40 | 41 | def scrape_area(area): 42 | """ 43 | Scrapes craigslist for a certain geographic area, and finds the latest listings. 44 | :param area: 45 | :return: A list of results. 46 | """ 47 | cl_h = CraigslistHousing(site=settings.CRAIGSLIST_SITE, area=area, category=settings.CRAIGSLIST_HOUSING_SECTION, 48 | filters={'max_price': settings.MAX_PRICE, "min_price": settings.MIN_PRICE}) 49 | 50 | results = [] 51 | gen = cl_h.get_results(sort_by='newest', geotagged=True, limit=20) 52 | while True: 53 | try: 54 | result = next(gen) 55 | except StopIteration: 56 | break 57 | except Exception: 58 | continue 59 | listing = session.query(Listing).filter_by(cl_id=result["id"]).first() 60 | 61 | # Don't store the listing if it already exists. 62 | if listing is None: 63 | if result["where"] is None: 64 | # If there is no string identifying which neighborhood the result is from, skip it. 65 | continue 66 | 67 | lat = 0 68 | lon = 0 69 | if result["geotag"] is not None: 70 | # Assign the coordinates. 71 | lat = result["geotag"][0] 72 | lon = result["geotag"][1] 73 | 74 | # Annotate the result with information about the area it's in and points of interest near it. 75 | geo_data = find_points_of_interest(result["geotag"], result["where"]) 76 | result.update(geo_data) 77 | else: 78 | result["area"] = "" 79 | result["bart"] = "" 80 | 81 | # Try parsing the price. 
82 |             price = 0
83 |             try:
84 |                 # Strip the dollar sign and any thousands separators before converting.
85 |                 price = float(result["price"].replace("$", "").replace(",", ""))
86 |             except Exception:
87 |                 pass
88 |
89 |             # Create the listing object.
90 |             listing = Listing(
91 |                 link=result["url"],
92 |                 created=parse(result["datetime"]),
93 |                 lat=lat,
94 |                 lon=lon,
95 |                 name=result["name"],
96 |                 price=price,
97 |                 location=result["where"],
98 |                 cl_id=result["id"],
99 |                 area=result["area"],
100 |                 bart_stop=result["bart"]
101 |             )
102 |
103 |             # Save the listing so we don't grab it again.
104 |             session.add(listing)
105 |             session.commit()
106 |
107 |             # Return the result if it's near a bart station, or if it is in an area we defined.
108 |             if len(result["bart"]) > 0 or len(result["area"]) > 0:
109 |                 results.append(result)
110 |
111 |     return results
112 |
113 | def do_scrape():
114 |     """
115 |     Runs the craigslist scraper, and posts data to slack.
116 |     """
117 |
118 |     # Create a slack client.
119 |     sc = SlackClient(settings.SLACK_TOKEN)
120 |
121 |     # Get all the results from craigslist.
122 |     all_results = []
123 |     for area in settings.AREAS:
124 |         all_results += scrape_area(area)
125 |
126 |     print("{}: Got {} results".format(time.ctime(), len(all_results)))
127 |
128 |     # Post each result to slack.
129 |     for result in all_results:
130 |         post_listing_to_slack(sc, result)
131 |
--------------------------------------------------------------------------------
/settings.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | ## Price
4 |
5 | # The minimum rent you want to pay per month.
6 | MIN_PRICE = 1500
7 |
8 | # The maximum rent you want to pay per month.
9 | MAX_PRICE = 2000
10 |
11 | ## Location preferences
12 |
13 | # The Craigslist site you want to search on.
14 | # For instance, https://sfbay.craigslist.org is SF and the Bay Area.
15 | # You only need the subdomain (the part before .craigslist.org).
16 | CRAIGSLIST_SITE = 'sfbay'
17 |
18 | # What Craigslist subdirectories to search on.
19 | # For instance, https://sfbay.craigslist.org/eby/ is the East Bay, and https://sfbay.craigslist.org/sfc/ is San Francisco. 20 | # You only need the last three letters of the URLs. 21 | AREAS = ["eby", "sfc", "sby", "nby"] 22 | 23 | # A list of neighborhoods and coordinates that you want to look for apartments in. Any listing that has coordinates 24 | # attached will be checked to see which area it is in. If there's a match, it will be annotated with the area 25 | # name. If no match, the neighborhood field, which is a string, will be checked to see if it matches 26 | # anything in NEIGHBORHOODS. 27 | BOXES = { 28 | "adams_point": [ 29 | [37.80789, -122.25000], 30 | [37.81589, -122.26081], 31 | ], 32 | "piedmont": [ 33 | [37.82240, -122.24768], 34 | [37.83237, -122.25386], 35 | ], 36 | "rockridge": [ 37 | [37.83826, -122.24073], 38 | [37.84680, -122.25944], 39 | ], 40 | "berkeley": [ 41 | [37.86226, -122.25043], 42 | [37.86781, -122.26502], 43 | ], 44 | "north_berkeley": [ 45 | [37.86425, -122.26330], 46 | [37.87655, -122.28974], 47 | ], 48 | "pac_heights": [ 49 | [37.79124, -122.42381], 50 | [37.79850, -122.44784], 51 | ], 52 | "lower_pac_heights": [ 53 | [37.78554, -122.42878], 54 | [37.78873, -122.44544], 55 | ], 56 | "haight": [ 57 | [37.77059, -122.42688], 58 | [37.77086, -122.45401], 59 | ], 60 | "sunset": [ 61 | [37.75451, -122.46422], 62 | [37.76258, -122.50825], 63 | ], 64 | "richmond": [ 65 | [37.77188, -122.47263], 66 | [37.78029, -122.51005], 67 | ], 68 | "presidio": [ 69 | [37.77805, -122.43959], 70 | [37.78829, -122.47151], 71 | ] 72 | } 73 | 74 | # A list of neighborhood names to look for in the Craigslist neighborhood name field. If a listing doesn't fall into 75 | # one of the boxes you defined, it will be checked to see if the neighborhood name it was listed under matches one 76 | # of these. 
# of these. This is less accurate than the boxes, because it relies on the owner to set the right neighborhood,
77 | # but it also catches listings that don't have coordinates (many listings are missing this info).
78 | NEIGHBORHOODS = ["berkeley north", "berkeley", "rockridge", "adams point", "oakland lake merritt", "cow hollow", "piedmont", "pac hts", "pacific heights", "lower haight", "inner sunset", "outer sunset", "presidio", "palo alto", "richmond / seacliff", "haight ashbury", "alameda", "twin peaks", "noe valley", "bernal heights", "glen park", "sunset", "mission district", "potrero hill", "dogpatch"]
79 |
80 | ## Transit preferences
81 |
82 | # The farthest you want to live from a transit stop.
83 | MAX_TRANSIT_DIST = 2  # kilometers
84 |
85 | # Transit stations you want to check against. Every coordinate here will be checked against each listing,
86 | # and the closest station name will be added to the result and posted into Slack.
87 | TRANSIT_STATIONS = {
88 |     "oakland_19th_bart": [37.8118051, -122.2720873],
89 |     "macarthur_bart": [37.8265657, -122.2686705],
90 |     "rockridge_bart": [37.841286, -122.2566329],
91 |     "downtown_berkeley_bart": [37.8629541, -122.276594],
92 |     "north_berkeley_bart": [37.8713411, -122.2849758]
93 | }
94 |
95 | ## Search type preferences
96 |
97 | # The Craigslist section underneath housing that you want to search in.
98 | # For instance, https://sfbay.craigslist.org/search/apa finds apartments for rent.
99 | # https://sfbay.craigslist.org/search/sub finds sublets.
100 | # You only need the last 3 letters of the URLs.
101 | CRAIGSLIST_HOUSING_SECTION = 'apa'
102 |
103 | ## System settings
104 |
105 | # How long we should sleep between scrapes of Craigslist.
106 | # Too fast may get rate limited.
107 | # Too slow may miss listings.
108 | SLEEP_INTERVAL = 20 * 60  # 20 minutes
109 |
110 | # Which slack channel to post the listings into.
111 | SLACK_CHANNEL = "#housing"
112 |
113 | # The token that allows us to connect to slack.
114 | # Should be put in private.py, or set as an environment variable.
115 | SLACK_TOKEN = os.getenv('SLACK_TOKEN', "")
116 |
117 | # Any private settings are imported here.
118 | try:
119 |     from private import *
120 | except ImportError:
121 |     pass
122 |
123 | # Any external private settings are imported from here.
124 | try:
125 |     from config.private import *
126 | except ImportError:
127 |     pass
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | Apartment finder
2 | -------------------
3 |
4 | This repo contains the code for a bot that scrapes Craigslist for real-time listings matching specific criteria, then alerts you in Slack. This lets you quickly see the best new listings and contact the owners. You can adjust the settings to change your price range, which neighborhoods you want to look in, and which transit stations and other points of interest you'd like to be close to.
5 |
6 | I successfully used this tool to find an apartment when I moved from Boston to SF. It saved a good amount of time and money. Read more about it [here](https://www.dataquest.io/blog/apartment-finding-slackbot/).
7 |
8 | It's recommended to follow the Docker installation and usage instructions.
9 |
10 | Settings
11 | --------------------
12 |
13 | Look in `settings.py` for a full list of all the configuration options. Here's a high-level overview:
14 |
15 | * `MIN_PRICE` -- the minimum listing price you want to search for.
16 | * `MAX_PRICE` -- the maximum listing price you want to search for.
17 | * `CRAIGSLIST_SITE` -- the regional Craigslist site you want to search in.
18 | * `AREAS` -- a list of areas of the regional Craigslist site that you want to search in.
19 | * `BOXES` -- coordinate boxes of the neighborhoods you want to look in.
20 | * `NEIGHBORHOODS` -- if the listing doesn't have coordinates, a list of neighborhoods to match on.
21 | * `MAX_TRANSIT_DIST` -- the farthest you want to be from a transit station, in kilometers.
22 | * `TRANSIT_STATIONS` -- the coordinates of transit stations.
23 | * `CRAIGSLIST_HOUSING_SECTION` -- the subsection of Craigslist housing that you want to look in.
24 | * `SLACK_CHANNEL` -- the Slack channel you want the bot to post in.
25 |
26 | External Setup
27 | --------------------
28 |
29 | Before using this bot, you'll need a Slack team, a channel for the bot to post into, and a Slack API token:
30 |
31 | * Create a Slack team, which you can do [here](https://slack.com/create#email).
32 | * Create a channel for the listings to be posted into. [Here's](https://get.slack.help/hc/en-us/articles/201402297-Creating-a-channel) help on this. It's suggested to use `#housing` as the name of the channel.
33 | * Get a Slack API token, which you can do [here](https://api.slack.com/docs/oauth-test-tokens). [Here's](https://get.slack.help/hc/en-us/articles/215770388-Creating-and-regenerating-API-tokens) more information on the process.
34 |
35 | Configuration
36 | --------------------
37 |
38 | ## Docker
39 |
40 | * Create a folder called `config`, then put a file called `private.py` inside.
41 | * Specify new values for any of the settings above in `private.py`.
42 |     * For example, you could put `AREAS = ['sfc']` in `private.py` to only look in San Francisco.
43 |     * If you want to post into a Slack channel not called `#housing`, add an entry for `SLACK_CHANNEL`.
44 | * If you don't want to look in the Bay Area, you'll need to update the following settings at the minimum:
45 |     * `CRAIGSLIST_SITE`
46 |     * `AREAS`
47 |     * `BOXES`
48 |     * `NEIGHBORHOODS`
49 |     * `TRANSIT_STATIONS`
50 |     * `CRAIGSLIST_HOUSING_SECTION`
51 |     * `MIN_PRICE`
52 |     * `MAX_PRICE`
53 |
54 | ## Manual
55 |
56 | * Create a file called `private.py` in this folder.
57 | * Add a value called `SLACK_TOKEN` that contains your Slack API token.
58 | * Add any other values you want to `private.py`.
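As a sketch, a minimal `private.py` might look like the following (whether it lives in this folder or in `config/` depends on which setup you follow above). Every value here is illustrative, not a recommendation; any name defined in this file shadows the default of the same name in `settings.py`, because `settings.py` ends with `from private import *`:

```python
# private.py -- example overrides (illustrative values only).
# Anything defined here replaces the default of the same name in settings.py.

MIN_PRICE = 1000
MAX_PRICE = 1800
AREAS = ['sfc']                       # only search San Francisco proper
SLACK_CHANNEL = '#housing'
SLACK_TOKEN = 'xoxp-your-token-here'  # placeholder -- use your real token
```

Only include the settings you actually want to change; anything you leave out keeps its default.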
59 |
60 | Installation + Usage
61 | --------------------
62 |
63 | ## Docker
64 |
65 | * Make sure to do the steps in the configuration section above first.
66 | * Install Docker by following [these instructions](https://docs.docker.com/engine/installation/).
67 | * To run the program with the default configuration:
68 |     * `docker run -d -e SLACK_TOKEN={YOUR_SLACK_TOKEN} dataquestio/apartment-finder`
69 | * To run the program with your own configuration:
70 |     * `docker run -d -e SLACK_TOKEN={YOUR_SLACK_TOKEN} -v {ABSOLUTE_PATH_TO_YOUR_CONFIG_FOLDER}:/opt/wwc/apartment-finder/config dataquestio/apartment-finder`
71 |
72 | ## Manual
73 |
74 | * Look in the `Dockerfile`, and make sure you install any of the apt packages listed there.
75 | * Install Python 3 using Anaconda or another method.
76 | * Install the Python requirements with `pip install -r requirements.txt`.
77 | * Run the program with `python3 main_loop.py`. Results will be posted to your `#housing` channel if successful.
78 |
79 | Troubleshooting
80 | ---------------------
81 |
82 | ## Docker
83 |
84 | * Use `docker ps` to get the id of the container running the bot.
85 | * Run `docker exec -it {YOUR_CONTAINER_ID} /bin/bash` to get a command shell inside the container.
86 | * Run `sqlite3 listings.db` to open the sqlite command-line tool and inspect the database state (the only table is also called `listings`).
87 |     * `select * from listings;` will get all of the stored listings.
88 | * If nothing is in the database, you may need to wait for a bit, or verify that your settings aren't so restrictive that no listings match.
89 |     * You can see how many listings are being found by looking at the logs.
90 | * Inspect the logs using `tail -f -n 1000 /opt/wwc/logs/afinder.log`.
91 |
92 | ## Manual
93 |
94 | * Look at the stdout of the main program.
95 | * Inspect `listings.db` to ensure listings are being added.
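For the manual setup, the database can also be inspected from Python's standard library instead of the sqlite CLI. A small sketch (it assumes the default `listings.db` path in the working directory and the `listings` table that `scraper.py` creates):

```python
import sqlite3

def count_listings(db_path="listings.db"):
    """Return the number of stored listings, or None if the scraper
    has not created the listings table yet."""
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute("SELECT COUNT(*) FROM listings").fetchone()
        return row[0]
    except sqlite3.OperationalError:
        # Raised as "no such table: listings" before the first scrape cycle.
        return None
    finally:
        conn.close()

if __name__ == "__main__":
    n = count_listings()
    if n is None:
        print("No listings table yet -- has the scraper completed a cycle?")
    else:
        print("{} listings stored".format(n))
```

If the count stays at zero across several scrape cycles, loosen your price range or area settings and check the program's stdout for errors.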
96 | 97 | Deploying 98 | --------------------- 99 | 100 | * Create a server that has Docker installed. It's suggested to use Digital Ocean. 101 | * Follow the configuration + installation instructions for Docker above. 102 | --------------------------------------------------------------------------------
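One possible shape for the deploy steps above, sketched as shell commands to run on the Docker host (the local `apartment-finder` image tag and container name are assumptions for illustration, not project conventions; `SLACK_TOKEN` must already be exported in your shell):

```shell
# From a checkout of this repo on the server: build a local image
# and run the bot detached.
docker build -t apartment-finder .
docker run -d --name apartment-finder \
    -e SLACK_TOKEN="$SLACK_TOKEN" \
    apartment-finder

# Follow the scraper log to confirm scrape cycles are running.
docker logs -f apartment-finder
```

Alternatively, skip the build and pull the published `dataquestio/apartment-finder` image as shown in the Installation section.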