├── requirements.txt
├── .htaccess
├── README.md
├── saved.py
├── webpage.py
└── template.html

--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
praw==4.3.0
docopt>=0.6.2

--------------------------------------------------------------------------------
/.htaccess:
--------------------------------------------------------------------------------
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^([^\.]+)$ $1.html [NC,L]

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
This project was born out of the frustration of not being able to manage my saved links on reddit easily. I wanted to build a list of all the videos I had enjoyed, but that was not possible because reddit offers no way to search saved posts by domain.

The saved.py script lets you search for a saved post, or posts, by any combination of three criteria: title, domain, and subreddit. The webpage.py script adds all of your saved links to an HTML template and then updates the file every 10 minutes to include any new additions; a bonus benefit is that it can store more than the 1000 saved links that reddit allows you to keep.

Usage for saved.py:

    Usage:
        savedLinks [options]

    Options:
        -t, --title TITLE      Search for links based on link title
        -d, --domain DOMAIN    Search for links from a certain domain
        -r, --reddit REDDIT    Search for links based on subreddit

There is also a very hacky website/watching script in this repository, but I'm no longer using or maintaining it.
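The multi-criteria search described above counts how many criteria were supplied and then requires a post to match all of them. That matching logic can be sketched as a standalone function; the dict-based `post` shape here is purely illustrative, not praw's real submission object:

```python
def matches(post, title=None, domain=None, subreddit=None):
    """Return True when the post satisfies every criterion that was supplied."""
    criteria = sum(1 for v in (title, domain, subreddit) if v is not None)
    matched = 0
    if domain is not None and domain.lower() == post["domain"]:
        matched += 1
    if subreddit is not None and subreddit.lower() == post["subreddit"].lower():
        matched += 1
    if title is not None and title.lower() in post["title"].lower():
        matched += 1
    # No criteria at all means no match (saved.py exits in that case).
    return criteria > 0 and matched == criteria

post = {"domain": "youtube.com", "subreddit": "videos", "title": "A great video"}
print(matches(post, domain="YouTube.com", title="great"))      # True: both criteria hit
print(matches(post, domain="youtube.com", subreddit="funny"))  # False: subreddit misses
```

Counting matches and comparing against the number of supplied criteria is what makes the filters combine as an AND: a post that satisfies only two of three given criteria is rejected.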
--------------------------------------------------------------------------------
/saved.py:
--------------------------------------------------------------------------------
"""
Usage:
    savedLinks [options]

Options:
    -t, --title TITLE      Search for links based on link title
    -d, --domain DOMAIN    Search for links from a certain domain
    -r, --reddit REDDIT    Search for links based on subreddit
"""

from docopt import docopt
import praw
import sys

if __name__ == "__main__":
    args = docopt(__doc__)

    # Number of criteria the user supplied; a post must match all of them.
    criteria = sum(1 for v in args.values() if v is not None)

    if criteria == 0:
        sys.exit(__doc__)

    r = praw.Reddit(user_agent='savedSearch',
                    client_id='OkDyg4-hOs-TbQ',
                    client_secret='******************',
                    username='Midasx',
                    password='**********')

    for post in r.redditor('Midasx').saved(limit=None):
        count = 0
        if not hasattr(post, 'domain'):
            continue  # Filter out saved comments

        if args['--domain']:
            if args['--domain'].lower() == post.domain:
                count += 1

        if args['--reddit']:
            if args['--reddit'].lower() == post.subreddit.display_name.lower():
                count += 1

        if args['--title']:
            if args['--title'].lower() in post.title.lower():
                count += 1

        if count == criteria:
            print(post.shortlink, " ", post.title)

--------------------------------------------------------------------------------
/webpage.py:
--------------------------------------------------------------------------------
#!/bin/python2.7
# Legacy Python 2 script; no longer maintained (see README).
import praw
import sys
import time
import fileinput

r = praw.Reddit(user_agent='saved_links')
r.login("Midasx", "**********")

added = []
first = True  # Set to false for first run

while True:
    for line in fileinput.input('../www/saved.html', inplace=1):
        print line,
        if line.startswith(' '):  # Insertion-point marker; the original tag text was lost in this dump
            count = None if first else 5  # Comment out if for first run
            for submission in r.user.get_saved(limit=count, time='all'):
                if first:
                    added.append(submission)
                if submission not in added:
                    if not hasattr(submission, 'domain'):
                        continue  # Filter out saved comments
                    message = "" if submission.thumbnail != 'nsfw' else "NSFW"  # Markup around the tag was lost in this dump
                    # The rest of the loop body is truncated in the source dump; it
                    # wrote the table row for this submission via sys.stdout.write.
    time.sleep(600)  # Refresh every 10 minutes, as described in the README

--------------------------------------------------------------------------------
/template.html:
--------------------------------------------------------------------------------
(Truncated in this dump; only the table header row survives: Filter, Subreddit, Title.)
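webpage.py leans on `fileinput`'s in-place mode: while iterating, stdout is redirected into a replacement copy of the file, so copying every line through with `print` and emitting extra lines right after a marker splices new rows into the page. A minimal Python 3 sketch of that pattern (the file name, marker, and row text are made up for illustration):

```python
import fileinput

def insert_after_marker(path, marker, new_lines):
    # In-place mode redirects stdout into a replacement copy of the file,
    # so every print() below becomes a line of the rewritten file.
    for line in fileinput.input(path, inplace=True):
        print(line, end="")          # copy the existing line through unchanged
        if line.startswith(marker):  # splice the new rows in after the marker
            for extra in new_lines:
                print(extra)
```

For example, given a `saved.html` containing a `<!-- rows -->` marker line, `insert_after_marker("saved.html", "<!-- rows -->", ["<li>first</li>"])` rewrites the file with the new list item directly beneath the marker while leaving everything else intact; running it again with fresh rows keeps appending at the same spot.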