├── .gitattributes
├── .gitignore
├── .travis.yml
├── Changelog.md
├── LICENSE
├── ReadMe.md
├── Supported_Sites.md
├── anime_dl
│   ├── Anime_dl.py
│   ├── __init__.py
│   ├── __main__.py
│   ├── common
│   │   ├── __init__.py
│   │   ├── browser_instance.py
│   │   ├── downloader.py
│   │   └── misc.py
│   ├── exeMaker.bat
│   ├── external
│   │   ├── __init__.py
│   │   ├── aes.py
│   │   ├── compat.py
│   │   ├── socks.py
│   │   └── utils.py
│   ├── sites
│   │   ├── __init__.py
│   │   ├── crunchyroll.py
│   │   └── supporters
│   │       ├── __init__.py
│   │       ├── anime_name.py
│   │       ├── path_works.py
│   │       └── sub_fetcher.py
│   ├── subtitles
│   │   └── __init__.py
│   └── version.py
├── auto.sh
├── docs
│   ├── Changelog.md
│   ├── Supported_Sites.md
│   └── index.md
└── requirements.txt
/.gitattributes:
--------------------------------------------------------------------------------
1 | # Auto detect text files and perform LF normalization
2 | * text=auto
3 |
4 | # Custom for Visual Studio
5 | *.cs diff=csharp
6 |
7 | # Standard to msysgit
8 | *.doc diff=astextplain
9 | *.DOC diff=astextplain
10 | *.docx diff=astextplain
11 | *.DOCX diff=astextplain
12 | *.dot diff=astextplain
13 | *.DOT diff=astextplain
14 | *.pdf diff=astextplain
15 | *.PDF diff=astextplain
16 | *.rtf diff=astextplain
17 | *.RTF diff=astextplain
18 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | # Windows image file caches
2 | Thumbs.db
3 | ehthumbs.db
4 |
5 | # Folder config file
6 | Desktop.ini
7 |
8 | # Recycle Bin used on file shares
9 | $RECYCLE.BIN/
10 |
11 | # Windows Installer files
12 | *.cab
13 | *.msi
14 | *.msm
15 | *.msp
16 | *.exe
17 | *.pyc
18 | *.pyz
19 | *.pypirc
20 | *.toc
21 | *.manifest
22 | *.pkg
23 | *.zip
24 | *.xml
25 | *.jpeg
26 | *.jpg
27 |
28 | # Windows shortcuts
29 | *.lnk
30 |
31 | # =========================
32 | # Operating System Files
33 | # =========================
34 |
35 | # OSX
36 | # =========================
37 |
38 | .DS_Store
39 | .AppleDouble
40 | .LSOverride
41 |
42 | # Thumbnails
43 | ._*
44 |
45 | # Files that might appear in the root of a volume
46 | .DocumentRevisions-V100
47 | .fseventsd
48 | .Spotlight-V100
49 | .TemporaryItems
50 | .Trashes
51 | .VolumeIcon.icns
52 |
53 | # Directories potentially created on remote AFP share
54 | .AppleDB
55 | .AppleDesktop
56 | Network Trash Folder
57 | Temporary Items
58 | .apdisk
59 | *.spec
60 | *.iml
61 | anime_dl/build/__main__/warn__main__.txt
62 | .idea
63 | *.ass
64 | *.mp4
65 | *.log
66 | *.mkv
67 | anime_dl/sites/funimation - Copy.py
68 | *.old
69 | *.ttf
70 | *.html
71 | anime_dl/build/__main__/warn-__main__.txt
72 |
--------------------------------------------------------------------------------
/.travis.yml:
--------------------------------------------------------------------------------
1 | language: python
2 | python:
3 | - "3.2"
4 | - "3.3"
5 | - "3.4"
6 | - "3.5"
7 | - "3.5-dev" # 3.5 development branch
8 | - "3.6-dev" # 3.6 development branch
9 | - "nightly" # currently points to 3.7-dev
10 | # command to install dependencies
11 | install: "pip install -r requirements.txt"
12 | # command to run tests
13 | #script: nosetests
14 | #script: cd comic_dl
15 | script:
16 | - cd anime_dl
17 | notifications:
18 | email:
19 | - xonshiz@psychoticelites.com
--------------------------------------------------------------------------------
/Changelog.md:
--------------------------------------------------------------------------------
1 | # Changelog
2 |
3 | - Site support for Crunchyroll.com [2017.03.05]
4 | - Fix for #1 [2017.03.06]
5 | - Fix for #2 [2017.03.06]
6 | - ReadMe updated for Python Script execution [2017.03.06]
7 | - Support for Whole Show Downloading for Crunchyroll [2017.03.06]
8 | - Selection of language for the Crunchyroll Show [2017.03.06]
9 | - Downloading only subtitles (skip video downloads) [2017.04.13]
10 | - Fix for [6](https://github.com/Xonshiz/anime-dl/issues/6) and Fix for [3](https://github.com/Xonshiz/anime-dl/issues/3) [2017.04.13]
11 | - Fix for #9 [2017.04.13]
12 | - Added `Verbose Logging` [2017.04.13]
13 | - Fix for #11 [2017.04.21]
14 | - Re-wrote code to remove unnecessary parts [2017.05.30]
15 | - Fix for #12 [2017.05.30]
16 | - Site support for Funimation.com [2017.05.30]
17 | - Fix for #8 [2017.05.30]
18 | - Muxing All The Subtitles [2017.07.03]
19 | - Fix for special characters and #15 [2017.07.05]
20 | - Episode Download Range support Added [2017.07.07]
21 | - Added support to include fonts in the muxed videos [2017.07.09]
22 | - Changed mkvmerge.exe to mkvmerge to support Non-Windows Operating Systems [2017.07.24]
23 | - Fix for #21 and #22 [2017.10.04]
24 | - Fix for #17 and #22 [2017.12.27]
25 | - Fix for #17 and #26 [2017.12.27]
26 | - PEP8 Cleaning [2018.01.02]
27 | - Fix for #18 [2018.01.02]
28 | - Fix for #31 [2018.01.21]
29 | - Fix for #39 [2018.01.27]
30 | - Fix for #46 [2018.01.29]
31 | - Fix for #45 [2018.01.29]
32 | - Temp fix for login #65, #66 [2018.10.11]
33 | - Login Issue Fixed [2019.05.16]
34 | - Re-structured the code for better maintenance and re-usability. [2019.05.16]
35 | - Fixed #100 [2019.05.26]
36 | - Fixed #99 [2019.05.26]
37 | - Fixed cookie issue that prevented downloading of HD and FHD streams [2019.05.27]
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 |
2 | The MIT License (MIT)
3 |
4 | Copyright (c) 2013-2016 Blackrock Digital LLC.
5 |
6 | Permission is hereby granted, free of charge, to any person obtaining a copy
7 | of this software and associated documentation files (the "Software"), to deal
8 | in the Software without restriction, including without limitation the rights
9 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
10 | copies of the Software, and to permit persons to whom the Software is
11 | furnished to do so, subject to the following conditions:
12 |
13 | The above copyright notice and this permission notice shall be included in
14 | all copies or substantial portions of the Software.
15 |
16 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
17 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
18 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
19 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
20 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
21 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
22 | THE SOFTWARE.
--------------------------------------------------------------------------------
/ReadMe.md:
--------------------------------------------------------------------------------
1 | # Anime-DL | [](https://travis-ci.org/Xonshiz/anime-dl) [](http://anime-dl.readthedocs.io/en/latest/?badge=latest) | [](https://www.paypal.me/xonshiz) | [](https://github.com/xonshiz/anime-dl/releases/latest) | [](https://github.com/xonshiz/anime-dl/releases)
2 |
3 | Anime-dl is a command-line program to download anime from CrunchyRoll and Funimation. This script needs you to have a premium subscription for the listed services. If you don't have a subscription, this script is pretty much useless for you.
4 |
5 | > Downloading and distributing this content may be illegal. This script was written purely for educational purposes, and you are responsible for its use.
6 |
7 | > Support these anime streaming websites by buying a premium account.
8 |
9 | > Some libs are taken directly from youtube-dl to decrypt CrunchyRoll's subtitles.
10 |
11 | ## Table of Contents
12 |
13 | * [Supported Sites](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md)
14 | * [Dependencies Installation](#dependencies-installation)
15 | * [Installation](#installation)
16 | * [Python Support](#python-support)
17 | * [Windows Binary](#windows-binary)
18 | * [List of Arguments](#list-of-arguments)
19 | * [Usage](#usage)
20 | * [Windows](#windows)
21 | * [Linux/Debian](#linuxdebian)
22 | * [Example URLs](#example-urls)
23 | * [Features](#features)
24 | * [Changelog](https://github.com/Xonshiz/anime-dl/blob/master/Changelog.md)
25 | * [Opening An Issue/Requesting A Site](#opening-an-issuerequesting-a-site)
26 | * [Reporting Issues](#reporting-issues)
27 | * [Suggesting A Feature](#suggesting-a-feature)
28 | * [Donations](#donations)
29 |
30 | ## Supported Websites
31 | You can check the list of supported websites [**`HERE`**](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md).
32 |
33 | ## Dependencies Installation
34 | This script can run on multiple operating systems, but it depends on some external binaries or libs. We need `FFmpeg`, `mkvmerge` and `Node.js` in our `PATH`. Some old streams on Crunchyroll only support `rtmpe` streams, as noted in Issue #9. For these, you also need `rtmpdump`.
35 |
36 | You also need [mkvmerge.exe](https://mkvtoolnix.download/downloads.html) in your `PATH` or `Working Directory`.
37 |
38 |
39 | **`These dependencies are required on ALL operating systems, ALL!`**
40 |
41 | 1.) Make sure you have Python installed and that it is present in your system's PATH.
42 |
43 | 2.) Grab [FFmpeg from this link](https://ffmpeg.org/download.html), [Node.js from this link](https://nodejs.org/en/download/) and [RTMPDump](https://www.videohelp.com/software/RTMPDump).
44 |
45 | 3.) Install FFmpeg and Node.js and place them in the directory of this script, or put them in your system's PATH.
46 |
47 | 4.) Browse to the directory of this script, open a command prompt/shell in that directory, and run this command :
48 |
49 | ```
50 | pip install -r requirements.txt
51 | ```
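
To quickly confirm that these binaries are actually reachable from your `PATH`, you can ask each one for its version (a minimal sanity check; the exact output will differ per install, and `rtmpdump` is only needed for the old `rtmpe` streams):

```
ffmpeg -version
mkvmerge --version
node --version
```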
52 |
53 | ## Installation
54 | After installing and setting up all the dependencies on your operating system, you're good to go and can use this script.
55 | The instructions remain the same for all operating systems. Download [`THIS REPOSITORY`](https://github.com/Xonshiz/anime-dl/archive/master.zip) and put it somewhere in your system. Move over to the `anime_dl` folder.
56 |
57 | **Windows users**, it's better not to place it in locations that require administrator privileges. A good example of a place to avoid is `C:\Windows`. This goes for both the Python script and the Windows binary file (.exe).
58 |
59 | **Debian 7/8/9 users**, run this command to automatically install anime-dl into your CWD:
60 |
61 | `curl https://raw.githubusercontent.com/Xonshiz/anime-dl/master/auto.sh | sudo bash`
62 |
63 | **Linux/Debian** users, make sure that this script is executable. If you run into problem(s), just run these commands :
64 |
65 | `chmod +x anime-dl.py`
66 |
67 | `chmod +x __main__.py`
68 |
69 | and then execute it with this :
70 |
71 | `./__main__.py`
72 |
73 | ## Python Support
74 | This script currently supports only Python 3.
75 |
76 | ## Windows Binary
77 | It is recommended that Windows users use this binary to save themselves the time and headache of installing all the dependencies.
78 |
79 | You need to download [FFmpeg](https://ffmpeg.org/download.html) and [Node.js](https://nodejs.org/en/download/) and keep them in the same directory as that of this windows binary file.
80 |
81 | If you already have these dependencies, then you can download this binary and start using the script right off the bat :
82 | * `Binary (x86)` : [Click Here](https://github.com/Xonshiz/anime-dl/releases/latest)
83 |
84 |
85 | ## List of Arguments
86 | Currently, the script supports these arguments :
87 | ```
88 | -h, --help Prints the basic help menu of the script and exits.
89 | -i,--input Defines the input link to the anime.
90 | -V,--version Prints the VERSION and exits.
91 | -u,--username Indicates username for a website. [REQUIRED]
92 | -p,--password Indicates password for a website. [REQUIRED]
93 | -r,--resolution Indicates the desired resolution. (default = 720p)
94 | --skip Skip video downloads (Will only download subtitles)
95 | -v,--verbose Starts Verbose Logging for detailed information.
96 | -l,--language Selects the language for the show. (default = Japanese) [Langs = english, dub, sub, Japanese, eng]
97 | -rn,--range Selects the range of episodes to download (Default = All) [ Ex : --range 1-10 (This will download first 10 episodes of a series)]
98 | ```
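
For example, a single invocation that combines several of these arguments might look like this (the URL is one of the Crunchyroll examples listed below; the username and password are placeholders):

```
__main__.py -i "http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying" -u "YourUsername" -p "Password" -r "720p" -l "Japanese" --range 1-10
```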
99 |
100 | ## Usage
101 | With this script, you have to pass arguments in order to be able to download anything. Passing arguments in a script is pretty easy. Since the script is pretty basic, it doesn't have too many arguments. Go check the [`ARGUMENTS SECTION`](https://github.com/Xonshiz/anime-dl#list-of-arguments) to know more about which arguments the script offers.
102 |
103 | Follow the instructions according to your OS :
104 |
105 | ### Windows
106 | After you've saved this script in a directory/folder, you need to open `command prompt`, browse to that directory, and then execute the script. Let's do it step by step :
107 | * Open the folder where you've downloaded the files of this repository.
108 | * Hold down the **`SHIFT`** key, **`RIGHT CLICK`** inside the folder, and select `Open Command Prompt Here` from the options that show up. *PowerShell users see below*
109 | * Now, in the command prompt, type this :
110 |
111 | *If you're using the windows binary :*
112 |
113 | `anime-dl.exe -i "" -u "YourUsername" -p "Password" -r "Resolution"`
114 |
115 | *If you're using the Python Script :*
116 |
117 | `__main__.py -i "" -u "YourUsername" -p "Password" -r "Resolution"`
118 |
119 | URL can be any URL of the [supported websites](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md).
120 |
121 | ### PowerShell Note :
122 | * With Windows 10, the default option when shift-right clicking is to open PowerShell instead of a command window. When using PowerShell instead of the command prompt, you must make sure that the dependencies directory is in the path, or it will not be found (see the example after this list).
123 | * [It is possible to enable `Open Command Window Here` in Windows 10](https://superuser.com/questions/1201988/how-do-i-change-open-with-powershell-to-open-with-command-prompt-when-shift).
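
A quick, session-only way to do that in PowerShell is to append the folder holding the dependencies to `$env:Path` before running the binary (the folder name below is just a placeholder):

```
$env:Path += ";C:\anime-dl-deps"
.\anime-dl.exe -i "" -u "YourUsername" -p "Password" -r "Resolution"
```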
124 |
125 | ### Linux/Debian
126 | After you've saved this script in a directory/folder, you need to open a terminal, browse to that directory, and then execute the script. Let's do it step by step :
127 | * Open a terminal, `Ctrl + Alt + T` is the shortcut to do so (if you didn't know).
128 | * Now, change the current working directory of the terminal to the one where you've downloaded this repository.
129 | * Now, in the Terminal, type this :
130 |
131 | `__main__.py -i "" -u "YourUsername" -p "Password" -r "Resolution"`
132 |
133 | URL can be any URL of the [supported websites](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md).
134 |
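If the script isn't marked as executable or the shebang isn't picked up, you can also run it through the interpreter explicitly (assuming `python3` is on your PATH):

`python3 __main__.py -i "" -u "YourUsername" -p "Password" -r "Resolution"`
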
135 | ### Example URLs
136 | * Crunchyroll :
137 | * Single Episode : [http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying/episode-13-happy-days-678059](http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying/episode-13-happy-days-678059)
138 | * Whole Show : [http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying](http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying)
139 |
140 | ### Note :
141 |
142 | * If you want to include some fonts in the muxed video, make a folder named "Fonts" in the same directory as this script and put all the fonts inside it. The script should pick them all up (see the sketch below).
143 |
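A rough sketch of the expected layout, assuming you run the script from the repository's `anime_dl` folder (the font file names are just placeholders):

```
anime_dl/
├── __main__.py
├── Fonts/
│   ├── SomeFont.ttf
│   └── AnotherFont.ttf
└── ...
```
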
144 | ## Features
145 | This is a very basic and small script, so at the moment it only has a few features.
146 | * Downloads a Single episode along with all the available subtitles for that episode.
147 | * Downloads and puts them all in a directory named "Output".
148 | * Skips files that have already been downloaded.
149 | * Downloads all the episodes for a show available on Crunchyroll.
150 | * Gives choice for downloading subs or dubs of a series available on Crunchyroll.
151 | * Choice to download only the subs and skip the videos.
152 |
153 | ## Changelog
154 | You can check the changelog [**`HERE`**](https://github.com/Xonshiz/anime-dl/blob/master/Changelog.md).
155 |
156 | ## Opening An Issue/Requesting A Site
157 | If you're planning to open an issue for the script, ask for a new feature, or anything else that requires opening an issue, then please keep these things in mind.
158 |
159 | ### Reporting Issues
160 | PLEASE RUN THIS SCRIPT IN A COMMAND LINE (as mentioned in the Usage section) AND DON'T SAY THAT `THE SCRIPT CLOSED TOO QUICK, I COULDN'T SEE`. If something doesn't work like it's supposed to, run the command with the `--verbose` argument. It'll create an `Error Log.txt` file in the same directory. Upload the contents of that file to GitHub Gists/Pastebin etc. and share that link.
161 |
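For example, re-running the failing command with verbose logging turned on could look like this (the URL and credentials are placeholders):

`__main__.py -i "" -u "YourUsername" -p "Password" -r "Resolution" --verbose`
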
162 | **Please make sure that you remove your login credentials from the Error Log.txt file before you post its contents anywhere.**
163 |
164 | If you're here to report an issue, please follow the basic syntax to post a request :
165 |
166 | **Subject** : Error That You Get.
167 |
168 | **Command Line Arguments You Gave** : The whole command that you gave to execute/run this script.
169 |
170 | **Verbose Log Link** : Link to the Gist/Pastebin that holds the content of Error Log.txt.
171 |
172 | **Long Explanation** : Describe in detail what you saw, what should've happened, and what actually happened.
173 |
174 | This should be enough, but it'll be great if you can add more ;)
175 |
176 | ### Suggesting A Feature
177 | If you're here to make suggestions, please follow the basic syntax to post a request :
178 |
179 | **Subject** : Something that briefly tells us about the feature.
180 |
181 | **Long Explanation** : Describe in detail what you want and how you want it.
182 |
183 | This should be enough, but it'll be great if you can add more ;)
184 |
185 | # Donations
186 | You can always send some money over via one of these :
187 |
188 | Paypal : [](https://www.paypal.me/xonshiz)
189 |
190 | Patreon Link : https://www.patreon.com/xonshiz
191 |
192 | Any amount is appreciated :)
193 |
--------------------------------------------------------------------------------
/Supported_Sites.md:
--------------------------------------------------------------------------------
1 | # List of Supported Websites
2 |
3 | * [CrunchyRoll](http://crunchyroll.com)
4 | * [Funimation](http://funimation.com) [Testing]
--------------------------------------------------------------------------------
/anime_dl/Anime_dl.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | try:
5 | from urllib.parse import urlparse
6 | except ImportError:
7 | from urlparse import urlparse
8 | import sites
9 | from sys import exit
10 |
11 |
12 | '''First, the honcho returns the website name and after that, the corresponding methods are called for a particular
13 | website. I don't remember why I added an extra step, I really don't. Oh well, it's working, so let it work.'''
14 |
15 |
16 | class AnimeDL(object):
17 |
18 | def __init__(self, url, username, password, resolution, language, skipper, logger, episode_range, output):
19 |
20 | website = str(self.honcho(url=url[0]))
21 |
22 | if website == "Crunchyroll":
23 | if not url[0] or not username[0] or not password[0]:
24 | print("Please enter the required arguments. Run __main__.py --help")
25 | exit()
26 | else:
27 |
28 | sites.crunchyroll.Crunchyroll(
29 | url=url[0], password=password, username=username, resolution=resolution, language=language,
30 | skipper=skipper, logger=logger, episode_range=episode_range, output=output)
31 |
32 | elif website == "VRV":
33 | print("Not Implemented")
34 | exit(1)
35 | # if not url[0] or not username[0] or not password[0]:
36 | # print("Please enter the required arguments. Run __main__.py --help")
37 | # exit()
38 | # else:
39 | #
40 | # sites.vrv.Vrv(url=url, password=password, username=username, resolution=resolution)
41 |
42 | elif website == "Funimation":
43 | print("Not Implemented")
44 | exit(1)
45 | # if not url[0] or not username[0] or not password[0]:
46 | # print("Please enter the required arguments. Run __main__.py --help")
47 | # exit()
48 | # else:
49 | # sites.funimation.Funimation(url[0], username, password, resolution, language)
50 |
51 | def honcho(self, url):
52 | # Verify that we have a sane url and return which website it belongs
53 | # to.
54 |
55 | # Fix for script not responding when www.crunchyroll.com/... type links are given.
56 | if "https://" in url:
57 | url = str(url)
58 | elif "http://" not in url:
59 | url = "http://" + str(url)
60 |
61 | # if there's no http://, then netloc is empty.
62 | # Gotta add the "if crunchyroll in url..."
63 | # print(url)
64 | domain = urlparse(url).netloc
65 |
66 | if domain in ["www.funimation.com", "funimation.com"]:
67 | return "Funimation"
68 |
69 | elif domain in ["www.crunchyroll.com", "crunchyroll.com"]:
70 | return "Crunchyroll"
71 |
72 | elif domain in ["www.vrv.co", "vrv.co"]:
73 | return "VRV"
--------------------------------------------------------------------------------
/anime_dl/__init__.py:
--------------------------------------------------------------------------------
1 | import common
2 | import external
3 | import sites
4 |
5 |
--------------------------------------------------------------------------------
/anime_dl/__main__.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | """
5 | __author__ = "Xonshiz"
6 | __email__ = "xonshiz@gmail.com"
7 | """
8 | import os,sys,inspect
9 | currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
10 | parentdir = os.path.dirname(currentdir)
11 | sys.path.insert(0,parentdir)
12 | import anime_dl
13 |
14 | from Anime_dl import *
15 | from sys import exit
16 | from version import __version__
17 | import argparse
18 | import logging
19 | import platform
20 |
21 |
22 | class Main():
23 | if __name__ == '__main__':
24 | parser = argparse.ArgumentParser(description='anime_dl downloads anime from CrunchyRoll and Funimation.')
25 |
26 | parser.add_argument('--version', action='store_true', help='Shows version and exits.')
27 |
28 | required_args = parser.add_argument_group('Required Arguments :')
29 | required_args.add_argument('-p', '--password', nargs=1, help='Indicates password for a website.')
30 | required_args.add_argument('-u', '--username', nargs=1, help='Indicates username for a website.')
31 | required_args.add_argument('-i', '--input', nargs=1, help='Inputs the URL to anime.')
32 | parser.add_argument('-r', '--resolution', nargs=1, help='Specifies the desired resolution. (Default: 720p)', default='720p')
33 | parser.add_argument('-l', '--language', nargs=1, help='Selects the language for the show.', default='Japanese')
34 | parser.add_argument('-rn', '--range', nargs=1, help='Specifies the range of episodes to download.',
35 | default='All')
36 | parser.add_argument('-o', '--output', nargs=1, help='Specifies the directory of which to save the files.')
37 | parser.add_argument('--skip', action='store_true', help='skips the video download and downloads only subs.')
38 | parser.add_argument("-v", "--verbose", help="Prints important debugging messages on screen.",
39 | action="store_true")
40 | logger = "False"
41 | args = parser.parse_args()
42 | skipper = "no"
43 |
44 | if args.verbose:
45 | logging.basicConfig(format='%(levelname)s: %(message)s', filename="Error Log.log", level=logging.DEBUG)
46 | logging.debug('You have successfully set the Debugging On.')
47 | logging.debug("Arguments Provided : {0}".format(args))
48 | logging.debug(
49 | "Operating System : {0} - {1} - {2}".format(platform.system(), platform.release(), platform.version()))
50 | logging.debug("Python Version : {0} ({1})".format(platform.python_version(), platform.architecture()[0]))
51 | logger = "True"
52 |
53 | if args.version:
54 | print("Current Version : {0}".format(__version__))
55 | exit()
56 |
57 | if args.skip:
58 | print("Will be skipping video downloads")
59 | skipper = "yes"
60 |
61 | if args.username is None or args.password is None or args.input is None:
62 | print("Please enter the required arguments. Run __main__.py --help")
63 | exit()
64 | else:
65 | # If the argument has been provided for resolution and language,
66 | # they're going to be lists, otherwise, they're
67 | # going to be simple value == 720p.
68 | # So, if return type comes out to be list, send the first element, otherwise, send 720p as it is.
69 | # Same approach for the audio as well.
70 |
71 | if type(args.resolution) == list:
72 | args.resolution = args.resolution[0]
73 | if type(args.language) == list:
74 | args.language = args.language[0]
75 | if type(args.range) == list:
76 | args.range = args.range[0]
77 | if type(args.output) == list:
78 | args.output = args.output[0]
79 |
80 | AnimeDL(url=args.input, username=args.username, password=args.password,
81 | resolution=args.resolution, language=args.language, skipper=skipper,
82 | logger=logger, episode_range=args.range, output=args.output)
--------------------------------------------------------------------------------
/anime_dl/common/__init__.py:
--------------------------------------------------------------------------------
1 | from . import browser_instance
2 | from . import downloader
3 | from . import misc
--------------------------------------------------------------------------------
/anime_dl/common/browser_instance.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | from cfscrape import create_scraper
5 | from requests import session
6 | from bs4 import BeautifulSoup
7 | import re
8 |
9 |
10 | def page_downloader(url, scrapper_delay=5, **kwargs):
11 | headers = kwargs.get("headers")
12 | received_cookies = kwargs.get("cookies")
13 | if not headers:
14 | headers = {
15 | 'User-Agent':
16 | 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36',
17 | 'Accept-Encoding': 'gzip, deflate'
18 | }
19 |
20 | sess = session()
21 | sess = create_scraper(sess, delay=scrapper_delay)
22 |
23 | connection = sess.get(url, headers=headers, cookies=received_cookies)
24 |
25 | if connection.status_code != 200:
26 | print("Whoops! Seems like I can't connect to website.")
27 | print("It's showing : %s" % connection)
28 | print("Run this script with the --verbose argument and report the issue along with log file on Github.")
29 | # raise Warning("can't connect to website %s" % manga_url)
30 | return False, None, None
31 | else:
32 | page_source = BeautifulSoup(connection.text.encode("utf-8"), "html.parser")
33 | connection_cookies = sess.cookies
34 |
35 | return True, page_source, connection_cookies
36 |
37 |
38 | def login_crunchyroll(url, username, password, country):
39 | headers = {
40 | 'user-agent':
41 | 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36',
42 | 'referer': 'https://www.crunchyroll.com/login',
43 | 'origin': 'https://www.crunchyroll.com',
44 | 'upgrade-insecure-requests': '1'
45 | }
46 |
47 | sess = session()
48 | sess = create_scraper(sess)
49 | print("Trying to login...")
50 |
51 | initial_page_fetch = sess.get(url='https://www.crunchyroll.com/login', headers=headers)
52 |
53 | if initial_page_fetch.status_code == 200:
54 | initial_page_source = initial_page_fetch.text.encode("utf-8")
55 | initial_cookies = sess.cookies
56 | csrf_token = ""
57 | try:
58 | csrf_token = re.search(r'login_form\[_token\]" value="(.*?)"', str(initial_page_source)).group(1)
59 | except Exception:
60 | csrf_token = re.search(r'login_form\[_token\]" type="hidden" value="(.*?)"',
61 | str(initial_page_source)).group(1)
62 |
63 | payload = {
64 | 'login_form[name]': '%s' % username,
65 | 'login_form[password]': '%s' % password,
66 | 'login_form[redirect_url]': '/',
67 | 'login_form[_token]': '%s' % csrf_token
68 | }
69 |
70 | login_post = sess.post(
71 | url='https://www.crunchyroll.com/login',
72 | data=payload,
73 | headers=headers,
74 | cookies=initial_cookies)
75 |
76 | login_check_response, login_cookies = login_check(html_source=login_post.text.encode('utf-8'), cookies=login_post.cookies)
77 | if login_check_response:
78 | print("Logged in successfully...")
79 | return True, initial_cookies, csrf_token
80 | else:
81 | print("Unable to Log you in. Check credentials again.")
82 | return False, None, None
83 | else:
84 | # print("Couldn't connect to the login page...")
85 | # print("Website returned : %s" % str(initial_page_fetch.status_code))
86 | return False, None, None
87 |
88 |
89 | def login_check(html_source, cookies=None):
90 | # Open the page and check the title. CrunchyRoll redirects the user and the title has the text "Redirecting...".
91 | # If this is not found, you're probably not logged in and you'll just get 360p or 480p.
92 | if b'href="/logout"' in html_source:
93 | return True, cookies
94 | else:
95 | print("Let me check again...")
96 | second_try_response, html_source, cookies = page_downloader(url="https://www.crunchyroll.com/", cookies=cookies)
97 | if second_try_response:
98 | if b'href=\"/logout\"' in html_source:
99 | return True, cookies
100 | else:
101 | return False, cookies
102 |
--------------------------------------------------------------------------------
/anime_dl/common/downloader.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | from cfscrape import create_scraper
5 | from requests import session
6 | from tqdm import tqdm
7 |
8 |
9 | class Downloader(object):
10 |
11 | def file_downloader(self, ddl, file_name, referer, cookies):
12 | headers = {
13 | 'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36',
14 | 'Territory': 'US',
15 | 'Referer': referer
16 | }
17 |
18 | sess = session()
19 | sess = create_scraper(sess)
20 |
21 | dlr = sess.get(ddl, stream=True, cookies=cookies, headers=headers)  # Downloading the content using python.
22 | with open(file_name, "wb") as handle:
23 | for data in tqdm(dlr.iter_content(chunk_size=1024)): # Added chunk size to speed up the downloads
24 | handle.write(data)
25 | print("Download has been completed.") # Viola
--------------------------------------------------------------------------------
/anime_dl/common/misc.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 |
5 | def duplicate_remover(seq):
6 | # https://stackoverflow.com/a/480227
7 | seen = set()
8 | seen_add = seen.add
9 | return [x for x in seq if not (x in seen or seen_add(x))]
10 |
--------------------------------------------------------------------------------
/anime_dl/exeMaker.bat:
--------------------------------------------------------------------------------
1 | pyinstaller --hidden-import=queue --onefile "__main__.spec"
--------------------------------------------------------------------------------
/anime_dl/external/__init__.py:
--------------------------------------------------------------------------------
1 | from . import aes
2 | from . import compat
3 | from . import socks
4 | from . import utils
5 |
--------------------------------------------------------------------------------
/anime_dl/external/aes.py:
--------------------------------------------------------------------------------
1 | from __future__ import unicode_literals
2 |
3 | import base64
4 | from math import ceil
5 | from external.utils import bytes_to_intlist, intlist_to_bytes
6 | # from utils import bytes_to_intlist, intlist_to_bytes
7 |
8 | BLOCK_SIZE_BYTES = 16
9 |
10 |
11 | def aes_ctr_decrypt(data, key, counter):
12 | """
13 | Decrypt with aes in counter mode
14 |
15 | @param {int[]} data cipher
16 | @param {int[]} key 16/24/32-Byte cipher key
17 | @param {instance} counter Instance whose next_value function (@returns {int[]} 16-Byte block)
18 | returns the next counter block
19 | @returns {int[]} decrypted data
20 | """
21 | expanded_key = key_expansion(key)
22 | block_count = int(ceil(float(len(data)) / BLOCK_SIZE_BYTES))
23 |
24 | decrypted_data = []
25 | for i in range(block_count):
26 | counter_block = counter.next_value()
27 | block = data[i * BLOCK_SIZE_BYTES: (i + 1) * BLOCK_SIZE_BYTES]
28 | block += [0] * (BLOCK_SIZE_BYTES - len(block))
29 |
30 | cipher_counter_block = aes_encrypt(counter_block, expanded_key)
31 | decrypted_data += xor(block, cipher_counter_block)
32 | decrypted_data = decrypted_data[:len(data)]
33 |
34 | return decrypted_data
35 |
36 |
37 | def aes_cbc_decrypt(data, key, iv):
38 | """
39 | Decrypt with aes in CBC mode
40 |
41 | @param {int[]} data cipher
42 | @param {int[]} key 16/24/32-Byte cipher key
43 | @param {int[]} iv 16-Byte IV
44 | @returns {int[]} decrypted data
45 | """
46 | expanded_key = key_expansion(key)
47 | block_count = int(ceil(float(len(data)) / BLOCK_SIZE_BYTES))
48 |
49 | decrypted_data = []
50 | previous_cipher_block = iv
51 | for i in range(block_count):
52 | block = data[i * BLOCK_SIZE_BYTES: (i + 1) * BLOCK_SIZE_BYTES]
53 | block += [0] * (BLOCK_SIZE_BYTES - len(block))
54 |
55 | decrypted_block = aes_decrypt(block, expanded_key)
56 | decrypted_data += xor(decrypted_block, previous_cipher_block)
57 | previous_cipher_block = block
58 | decrypted_data = decrypted_data[:len(data)]
59 |
60 | return decrypted_data
61 |
62 |
63 | def key_expansion(data):
64 | """
65 | Generate key schedule
66 |
67 | @param {int[]} data 16/24/32-Byte cipher key
68 | @returns {int[]} 176/208/240-Byte expanded key
69 | """
70 | data = data[:] # copy
71 | rcon_iteration = 1
72 | key_size_bytes = len(data)
73 | expanded_key_size_bytes = (key_size_bytes // 4 + 7) * BLOCK_SIZE_BYTES
74 |
75 | while len(data) < expanded_key_size_bytes:
76 | temp = data[-4:]
77 | temp = key_schedule_core(temp, rcon_iteration)
78 | rcon_iteration += 1
79 | data += xor(temp, data[-key_size_bytes: 4 - key_size_bytes])
80 |
81 | for _ in range(3):
82 | temp = data[-4:]
83 | data += xor(temp, data[-key_size_bytes: 4 - key_size_bytes])
84 |
85 | if key_size_bytes == 32:
86 | temp = data[-4:]
87 | temp = sub_bytes(temp)
88 | data += xor(temp, data[-key_size_bytes: 4 - key_size_bytes])
89 |
90 | for _ in range(3 if key_size_bytes == 32 else 2 if key_size_bytes == 24 else 0):
91 | temp = data[-4:]
92 | data += xor(temp, data[-key_size_bytes: 4 - key_size_bytes])
93 | data = data[:expanded_key_size_bytes]
94 |
95 | return data
96 |
97 |
98 | def aes_encrypt(data, expanded_key):
99 | """
100 | Encrypt one block with aes
101 |
102 | @param {int[]} data 16-Byte state
103 | @param {int[]} expanded_key 176/208/240-Byte expanded key
104 | @returns {int[]} 16-Byte cipher
105 | """
106 | rounds = len(expanded_key) // BLOCK_SIZE_BYTES - 1
107 |
108 | data = xor(data, expanded_key[:BLOCK_SIZE_BYTES])
109 | for i in range(1, rounds + 1):
110 | data = sub_bytes(data)
111 | data = shift_rows(data)
112 | if i != rounds:
113 | data = mix_columns(data)
114 | data = xor(data, expanded_key[i * BLOCK_SIZE_BYTES: (i + 1) * BLOCK_SIZE_BYTES])
115 |
116 | return data
117 |
118 |
119 | def aes_decrypt(data, expanded_key):
120 | """
121 | Decrypt one block with aes
122 |
123 | @param {int[]} data 16-Byte cipher
124 | @param {int[]} expanded_key 176/208/240-Byte expanded key
125 | @returns {int[]} 16-Byte state
126 | """
127 | rounds = len(expanded_key) // BLOCK_SIZE_BYTES - 1
128 |
129 | for i in range(rounds, 0, -1):
130 | data = xor(data, expanded_key[i * BLOCK_SIZE_BYTES: (i + 1) * BLOCK_SIZE_BYTES])
131 | if i != rounds:
132 | data = mix_columns_inv(data)
133 | data = shift_rows_inv(data)
134 | data = sub_bytes_inv(data)
135 | data = xor(data, expanded_key[:BLOCK_SIZE_BYTES])
136 |
137 | return data
138 |
139 |
140 | def aes_decrypt_text(data, password, key_size_bytes):
141 | """
142 | Decrypt text
143 | - The first 8 Bytes of decoded 'data' are the 8 high Bytes of the counter
144 | - The cipher key is retrieved by encrypting the first 16 Byte of 'password'
145 | with the first 'key_size_bytes' Bytes from 'password' (if necessary filled with 0's)
146 | - Mode of operation is 'counter'
147 |
148 | @param {str} data Base64 encoded string
149 | @param {str,unicode} password Password (will be encoded with utf-8)
150 | @param {int} key_size_bytes Possible values: 16 for 128-Bit, 24 for 192-Bit or 32 for 256-Bit
151 | @returns {str} Decrypted data
152 | """
153 | NONCE_LENGTH_BYTES = 8
154 |
155 | data = bytes_to_intlist(base64.b64decode(data.encode('utf-8')))
156 | password = bytes_to_intlist(password.encode('utf-8'))
157 |
158 | key = password[:key_size_bytes] + [0] * (key_size_bytes - len(password))
159 | key = aes_encrypt(key[:BLOCK_SIZE_BYTES], key_expansion(key)) * (key_size_bytes // BLOCK_SIZE_BYTES)
160 |
161 | nonce = data[:NONCE_LENGTH_BYTES]
162 | cipher = data[NONCE_LENGTH_BYTES:]
163 |
164 | class Counter(object):
165 | __value = nonce + [0] * (BLOCK_SIZE_BYTES - NONCE_LENGTH_BYTES)
166 |
167 | def next_value(self):
168 | temp = self.__value
169 | self.__value = inc(self.__value)
170 | return temp
171 |
172 | decrypted_data = aes_ctr_decrypt(cipher, key, Counter())
173 | plaintext = intlist_to_bytes(decrypted_data)
174 |
175 | return plaintext
176 |
177 |
178 | RCON = (0x8d, 0x01, 0x02, 0x04, 0x08, 0x10, 0x20, 0x40, 0x80, 0x1b, 0x36)
179 | SBOX = (0x63, 0x7C, 0x77, 0x7B, 0xF2, 0x6B, 0x6F, 0xC5, 0x30, 0x01, 0x67, 0x2B, 0xFE, 0xD7, 0xAB, 0x76,
180 | 0xCA, 0x82, 0xC9, 0x7D, 0xFA, 0x59, 0x47, 0xF0, 0xAD, 0xD4, 0xA2, 0xAF, 0x9C, 0xA4, 0x72, 0xC0,
181 | 0xB7, 0xFD, 0x93, 0x26, 0x36, 0x3F, 0xF7, 0xCC, 0x34, 0xA5, 0xE5, 0xF1, 0x71, 0xD8, 0x31, 0x15,
182 | 0x04, 0xC7, 0x23, 0xC3, 0x18, 0x96, 0x05, 0x9A, 0x07, 0x12, 0x80, 0xE2, 0xEB, 0x27, 0xB2, 0x75,
183 | 0x09, 0x83, 0x2C, 0x1A, 0x1B, 0x6E, 0x5A, 0xA0, 0x52, 0x3B, 0xD6, 0xB3, 0x29, 0xE3, 0x2F, 0x84,
184 | 0x53, 0xD1, 0x00, 0xED, 0x20, 0xFC, 0xB1, 0x5B, 0x6A, 0xCB, 0xBE, 0x39, 0x4A, 0x4C, 0x58, 0xCF,
185 | 0xD0, 0xEF, 0xAA, 0xFB, 0x43, 0x4D, 0x33, 0x85, 0x45, 0xF9, 0x02, 0x7F, 0x50, 0x3C, 0x9F, 0xA8,
186 | 0x51, 0xA3, 0x40, 0x8F, 0x92, 0x9D, 0x38, 0xF5, 0xBC, 0xB6, 0xDA, 0x21, 0x10, 0xFF, 0xF3, 0xD2,
187 | 0xCD, 0x0C, 0x13, 0xEC, 0x5F, 0x97, 0x44, 0x17, 0xC4, 0xA7, 0x7E, 0x3D, 0x64, 0x5D, 0x19, 0x73,
188 | 0x60, 0x81, 0x4F, 0xDC, 0x22, 0x2A, 0x90, 0x88, 0x46, 0xEE, 0xB8, 0x14, 0xDE, 0x5E, 0x0B, 0xDB,
189 | 0xE0, 0x32, 0x3A, 0x0A, 0x49, 0x06, 0x24, 0x5C, 0xC2, 0xD3, 0xAC, 0x62, 0x91, 0x95, 0xE4, 0x79,
190 | 0xE7, 0xC8, 0x37, 0x6D, 0x8D, 0xD5, 0x4E, 0xA9, 0x6C, 0x56, 0xF4, 0xEA, 0x65, 0x7A, 0xAE, 0x08,
191 | 0xBA, 0x78, 0x25, 0x2E, 0x1C, 0xA6, 0xB4, 0xC6, 0xE8, 0xDD, 0x74, 0x1F, 0x4B, 0xBD, 0x8B, 0x8A,
192 | 0x70, 0x3E, 0xB5, 0x66, 0x48, 0x03, 0xF6, 0x0E, 0x61, 0x35, 0x57, 0xB9, 0x86, 0xC1, 0x1D, 0x9E,
193 | 0xE1, 0xF8, 0x98, 0x11, 0x69, 0xD9, 0x8E, 0x94, 0x9B, 0x1E, 0x87, 0xE9, 0xCE, 0x55, 0x28, 0xDF,
194 | 0x8C, 0xA1, 0x89, 0x0D, 0xBF, 0xE6, 0x42, 0x68, 0x41, 0x99, 0x2D, 0x0F, 0xB0, 0x54, 0xBB, 0x16)
195 | SBOX_INV = (0x52, 0x09, 0x6a, 0xd5, 0x30, 0x36, 0xa5, 0x38, 0xbf, 0x40, 0xa3, 0x9e, 0x81, 0xf3, 0xd7, 0xfb,
196 | 0x7c, 0xe3, 0x39, 0x82, 0x9b, 0x2f, 0xff, 0x87, 0x34, 0x8e, 0x43, 0x44, 0xc4, 0xde, 0xe9, 0xcb,
197 | 0x54, 0x7b, 0x94, 0x32, 0xa6, 0xc2, 0x23, 0x3d, 0xee, 0x4c, 0x95, 0x0b, 0x42, 0xfa, 0xc3, 0x4e,
198 | 0x08, 0x2e, 0xa1, 0x66, 0x28, 0xd9, 0x24, 0xb2, 0x76, 0x5b, 0xa2, 0x49, 0x6d, 0x8b, 0xd1, 0x25,
199 | 0x72, 0xf8, 0xf6, 0x64, 0x86, 0x68, 0x98, 0x16, 0xd4, 0xa4, 0x5c, 0xcc, 0x5d, 0x65, 0xb6, 0x92,
200 | 0x6c, 0x70, 0x48, 0x50, 0xfd, 0xed, 0xb9, 0xda, 0x5e, 0x15, 0x46, 0x57, 0xa7, 0x8d, 0x9d, 0x84,
201 | 0x90, 0xd8, 0xab, 0x00, 0x8c, 0xbc, 0xd3, 0x0a, 0xf7, 0xe4, 0x58, 0x05, 0xb8, 0xb3, 0x45, 0x06,
202 | 0xd0, 0x2c, 0x1e, 0x8f, 0xca, 0x3f, 0x0f, 0x02, 0xc1, 0xaf, 0xbd, 0x03, 0x01, 0x13, 0x8a, 0x6b,
203 | 0x3a, 0x91, 0x11, 0x41, 0x4f, 0x67, 0xdc, 0xea, 0x97, 0xf2, 0xcf, 0xce, 0xf0, 0xb4, 0xe6, 0x73,
204 | 0x96, 0xac, 0x74, 0x22, 0xe7, 0xad, 0x35, 0x85, 0xe2, 0xf9, 0x37, 0xe8, 0x1c, 0x75, 0xdf, 0x6e,
205 | 0x47, 0xf1, 0x1a, 0x71, 0x1d, 0x29, 0xc5, 0x89, 0x6f, 0xb7, 0x62, 0x0e, 0xaa, 0x18, 0xbe, 0x1b,
206 | 0xfc, 0x56, 0x3e, 0x4b, 0xc6, 0xd2, 0x79, 0x20, 0x9a, 0xdb, 0xc0, 0xfe, 0x78, 0xcd, 0x5a, 0xf4,
207 | 0x1f, 0xdd, 0xa8, 0x33, 0x88, 0x07, 0xc7, 0x31, 0xb1, 0x12, 0x10, 0x59, 0x27, 0x80, 0xec, 0x5f,
208 | 0x60, 0x51, 0x7f, 0xa9, 0x19, 0xb5, 0x4a, 0x0d, 0x2d, 0xe5, 0x7a, 0x9f, 0x93, 0xc9, 0x9c, 0xef,
209 | 0xa0, 0xe0, 0x3b, 0x4d, 0xae, 0x2a, 0xf5, 0xb0, 0xc8, 0xeb, 0xbb, 0x3c, 0x83, 0x53, 0x99, 0x61,
210 | 0x17, 0x2b, 0x04, 0x7e, 0xba, 0x77, 0xd6, 0x26, 0xe1, 0x69, 0x14, 0x63, 0x55, 0x21, 0x0c, 0x7d)
211 | MIX_COLUMN_MATRIX = ((0x2, 0x3, 0x1, 0x1),
212 | (0x1, 0x2, 0x3, 0x1),
213 | (0x1, 0x1, 0x2, 0x3),
214 | (0x3, 0x1, 0x1, 0x2))
215 | MIX_COLUMN_MATRIX_INV = ((0xE, 0xB, 0xD, 0x9),
216 | (0x9, 0xE, 0xB, 0xD),
217 | (0xD, 0x9, 0xE, 0xB),
218 | (0xB, 0xD, 0x9, 0xE))
219 | RIJNDAEL_EXP_TABLE = (0x01, 0x03, 0x05, 0x0F, 0x11, 0x33, 0x55, 0xFF, 0x1A, 0x2E, 0x72, 0x96, 0xA1, 0xF8, 0x13, 0x35,
220 | 0x5F, 0xE1, 0x38, 0x48, 0xD8, 0x73, 0x95, 0xA4, 0xF7, 0x02, 0x06, 0x0A, 0x1E, 0x22, 0x66, 0xAA,
221 | 0xE5, 0x34, 0x5C, 0xE4, 0x37, 0x59, 0xEB, 0x26, 0x6A, 0xBE, 0xD9, 0x70, 0x90, 0xAB, 0xE6, 0x31,
222 | 0x53, 0xF5, 0x04, 0x0C, 0x14, 0x3C, 0x44, 0xCC, 0x4F, 0xD1, 0x68, 0xB8, 0xD3, 0x6E, 0xB2, 0xCD,
223 | 0x4C, 0xD4, 0x67, 0xA9, 0xE0, 0x3B, 0x4D, 0xD7, 0x62, 0xA6, 0xF1, 0x08, 0x18, 0x28, 0x78, 0x88,
224 | 0x83, 0x9E, 0xB9, 0xD0, 0x6B, 0xBD, 0xDC, 0x7F, 0x81, 0x98, 0xB3, 0xCE, 0x49, 0xDB, 0x76, 0x9A,
225 | 0xB5, 0xC4, 0x57, 0xF9, 0x10, 0x30, 0x50, 0xF0, 0x0B, 0x1D, 0x27, 0x69, 0xBB, 0xD6, 0x61, 0xA3,
226 | 0xFE, 0x19, 0x2B, 0x7D, 0x87, 0x92, 0xAD, 0xEC, 0x2F, 0x71, 0x93, 0xAE, 0xE9, 0x20, 0x60, 0xA0,
227 | 0xFB, 0x16, 0x3A, 0x4E, 0xD2, 0x6D, 0xB7, 0xC2, 0x5D, 0xE7, 0x32, 0x56, 0xFA, 0x15, 0x3F, 0x41,
228 | 0xC3, 0x5E, 0xE2, 0x3D, 0x47, 0xC9, 0x40, 0xC0, 0x5B, 0xED, 0x2C, 0x74, 0x9C, 0xBF, 0xDA, 0x75,
229 | 0x9F, 0xBA, 0xD5, 0x64, 0xAC, 0xEF, 0x2A, 0x7E, 0x82, 0x9D, 0xBC, 0xDF, 0x7A, 0x8E, 0x89, 0x80,
230 | 0x9B, 0xB6, 0xC1, 0x58, 0xE8, 0x23, 0x65, 0xAF, 0xEA, 0x25, 0x6F, 0xB1, 0xC8, 0x43, 0xC5, 0x54,
231 | 0xFC, 0x1F, 0x21, 0x63, 0xA5, 0xF4, 0x07, 0x09, 0x1B, 0x2D, 0x77, 0x99, 0xB0, 0xCB, 0x46, 0xCA,
232 | 0x45, 0xCF, 0x4A, 0xDE, 0x79, 0x8B, 0x86, 0x91, 0xA8, 0xE3, 0x3E, 0x42, 0xC6, 0x51, 0xF3, 0x0E,
233 | 0x12, 0x36, 0x5A, 0xEE, 0x29, 0x7B, 0x8D, 0x8C, 0x8F, 0x8A, 0x85, 0x94, 0xA7, 0xF2, 0x0D, 0x17,
234 | 0x39, 0x4B, 0xDD, 0x7C, 0x84, 0x97, 0xA2, 0xFD, 0x1C, 0x24, 0x6C, 0xB4, 0xC7, 0x52, 0xF6, 0x01)
235 | RIJNDAEL_LOG_TABLE = (0x00, 0x00, 0x19, 0x01, 0x32, 0x02, 0x1a, 0xc6, 0x4b, 0xc7, 0x1b, 0x68, 0x33, 0xee, 0xdf, 0x03,
236 | 0x64, 0x04, 0xe0, 0x0e, 0x34, 0x8d, 0x81, 0xef, 0x4c, 0x71, 0x08, 0xc8, 0xf8, 0x69, 0x1c, 0xc1,
237 | 0x7d, 0xc2, 0x1d, 0xb5, 0xf9, 0xb9, 0x27, 0x6a, 0x4d, 0xe4, 0xa6, 0x72, 0x9a, 0xc9, 0x09, 0x78,
238 | 0x65, 0x2f, 0x8a, 0x05, 0x21, 0x0f, 0xe1, 0x24, 0x12, 0xf0, 0x82, 0x45, 0x35, 0x93, 0xda, 0x8e,
239 | 0x96, 0x8f, 0xdb, 0xbd, 0x36, 0xd0, 0xce, 0x94, 0x13, 0x5c, 0xd2, 0xf1, 0x40, 0x46, 0x83, 0x38,
240 | 0x66, 0xdd, 0xfd, 0x30, 0xbf, 0x06, 0x8b, 0x62, 0xb3, 0x25, 0xe2, 0x98, 0x22, 0x88, 0x91, 0x10,
241 | 0x7e, 0x6e, 0x48, 0xc3, 0xa3, 0xb6, 0x1e, 0x42, 0x3a, 0x6b, 0x28, 0x54, 0xfa, 0x85, 0x3d, 0xba,
242 | 0x2b, 0x79, 0x0a, 0x15, 0x9b, 0x9f, 0x5e, 0xca, 0x4e, 0xd4, 0xac, 0xe5, 0xf3, 0x73, 0xa7, 0x57,
243 | 0xaf, 0x58, 0xa8, 0x50, 0xf4, 0xea, 0xd6, 0x74, 0x4f, 0xae, 0xe9, 0xd5, 0xe7, 0xe6, 0xad, 0xe8,
244 | 0x2c, 0xd7, 0x75, 0x7a, 0xeb, 0x16, 0x0b, 0xf5, 0x59, 0xcb, 0x5f, 0xb0, 0x9c, 0xa9, 0x51, 0xa0,
245 | 0x7f, 0x0c, 0xf6, 0x6f, 0x17, 0xc4, 0x49, 0xec, 0xd8, 0x43, 0x1f, 0x2d, 0xa4, 0x76, 0x7b, 0xb7,
246 | 0xcc, 0xbb, 0x3e, 0x5a, 0xfb, 0x60, 0xb1, 0x86, 0x3b, 0x52, 0xa1, 0x6c, 0xaa, 0x55, 0x29, 0x9d,
247 | 0x97, 0xb2, 0x87, 0x90, 0x61, 0xbe, 0xdc, 0xfc, 0xbc, 0x95, 0xcf, 0xcd, 0x37, 0x3f, 0x5b, 0xd1,
248 | 0x53, 0x39, 0x84, 0x3c, 0x41, 0xa2, 0x6d, 0x47, 0x14, 0x2a, 0x9e, 0x5d, 0x56, 0xf2, 0xd3, 0xab,
249 | 0x44, 0x11, 0x92, 0xd9, 0x23, 0x20, 0x2e, 0x89, 0xb4, 0x7c, 0xb8, 0x26, 0x77, 0x99, 0xe3, 0xa5,
250 | 0x67, 0x4a, 0xed, 0xde, 0xc5, 0x31, 0xfe, 0x18, 0x0d, 0x63, 0x8c, 0x80, 0xc0, 0xf7, 0x70, 0x07)
251 |
252 |
253 | def sub_bytes(data):
254 | return [SBOX[x] for x in data]
255 |
256 |
257 | def sub_bytes_inv(data):
258 | return [SBOX_INV[x] for x in data]
259 |
260 |
261 | def rotate(data):
262 | return data[1:] + [data[0]]
263 |
264 |
265 | def key_schedule_core(data, rcon_iteration):
266 | data = rotate(data)
267 | data = sub_bytes(data)
268 | data[0] = data[0] ^ RCON[rcon_iteration]
269 |
270 | return data
271 |
272 |
273 | def xor(data1, data2):
274 | return [x ^ y for x, y in zip(data1, data2)]
275 |
276 |
277 | def rijndael_mul(a, b):
278 | if(a == 0 or b == 0):
279 | return 0
280 | return RIJNDAEL_EXP_TABLE[(RIJNDAEL_LOG_TABLE[a] + RIJNDAEL_LOG_TABLE[b]) % 0xFF]
281 |
282 |
283 | def mix_column(data, matrix):
284 | data_mixed = []
285 | for row in range(4):
286 | mixed = 0
287 | for column in range(4):
288 | # xor is (+) and (-)
289 | mixed ^= rijndael_mul(data[column], matrix[row][column])
290 | data_mixed.append(mixed)
291 | return data_mixed
292 |
293 |
294 | def mix_columns(data, matrix=MIX_COLUMN_MATRIX):
295 | data_mixed = []
296 | for i in range(4):
297 | column = data[i * 4: (i + 1) * 4]
298 | data_mixed += mix_column(column, matrix)
299 | return data_mixed
300 |
301 |
302 | def mix_columns_inv(data):
303 | return mix_columns(data, MIX_COLUMN_MATRIX_INV)
304 |
305 |
306 | def shift_rows(data):
307 | data_shifted = []
308 | for column in range(4):
309 | for row in range(4):
310 | data_shifted.append(data[((column + row) & 0b11) * 4 + row])
311 | return data_shifted
312 |
313 |
314 | def shift_rows_inv(data):
315 | data_shifted = []
316 | for column in range(4):
317 | for row in range(4):
318 | data_shifted.append(data[((column - row) & 0b11) * 4 + row])
319 | return data_shifted
320 |
321 |
322 | def inc(data):
323 | data = data[:] # copy
324 | for i in range(len(data) - 1, -1, -1):
325 | if data[i] == 255:
326 | data[i] = 0
327 | else:
328 | data[i] = data[i] + 1
329 | break
330 | return data
331 |
332 |
333 | __all__ = ['aes_encrypt', 'key_expansion', 'aes_ctr_decrypt', 'aes_cbc_decrypt', 'aes_decrypt_text']
--------------------------------------------------------------------------------
/anime_dl/external/compat.py:
--------------------------------------------------------------------------------
1 | # coding: utf-8
2 | from __future__ import unicode_literals
3 |
4 | import binascii
5 | import collections
6 | import email
7 | import getpass
8 | import io
9 | import optparse
10 | import os
11 | import re
12 | import shlex
13 | import shutil
14 | import socket
15 | import struct
16 | import subprocess
17 | import sys
18 | import itertools
19 | import xml.etree.ElementTree
20 |
21 |
22 | try:
23 | import urllib.request as compat_urllib_request
24 | except ImportError: # Python 2
25 | import urllib2 as compat_urllib_request
26 |
27 | try:
28 | import urllib.error as compat_urllib_error
29 | except ImportError: # Python 2
30 | import urllib2 as compat_urllib_error
31 |
32 | try:
33 | import urllib.parse as compat_urllib_parse
34 | except ImportError: # Python 2
35 | import urllib as compat_urllib_parse
36 |
37 | try:
38 | from urllib.parse import urlparse as compat_urllib_parse_urlparse
39 | except ImportError: # Python 2
40 | from urlparse import urlparse as compat_urllib_parse_urlparse
41 |
42 | try:
43 | import urllib.parse as compat_urlparse
44 | except ImportError: # Python 2
45 | import urlparse as compat_urlparse
46 |
47 | try:
48 | import urllib.response as compat_urllib_response
49 | except ImportError: # Python 2
50 | import urllib as compat_urllib_response
51 |
52 | try:
53 | import http.cookiejar as compat_cookiejar
54 | except ImportError: # Python 2
55 | import cookielib as compat_cookiejar
56 |
57 | try:
58 | import http.cookies as compat_cookies
59 | except ImportError: # Python 2
60 | import Cookie as compat_cookies
61 |
62 | try:
63 | import html.entities as compat_html_entities
64 | except ImportError: # Python 2
65 | import htmlentitydefs as compat_html_entities
66 |
67 | try: # Python >= 3.3
68 | compat_html_entities_html5 = compat_html_entities.html5
69 | except AttributeError:
70 | # Copied from CPython 3.5.1 html/entities.py
71 | compat_html_entities_html5 = {
72 | 'Aacute': '\xc1',
73 | 'aacute': '\xe1',
74 | 'Aacute;': '\xc1',
75 | 'aacute;': '\xe1',
76 | 'Abreve;': '\u0102',
77 | 'abreve;': '\u0103',
78 | 'ac;': '\u223e',
79 | 'acd;': '\u223f',
80 | 'acE;': '\u223e\u0333',
81 | 'Acirc': '\xc2',
82 | 'acirc': '\xe2',
83 | 'Acirc;': '\xc2',
84 | 'acirc;': '\xe2',
85 | 'acute': '\xb4',
86 | 'acute;': '\xb4',
87 | 'Acy;': '\u0410',
88 | 'acy;': '\u0430',
89 | 'AElig': '\xc6',
90 | 'aelig': '\xe6',
91 | 'AElig;': '\xc6',
92 | 'aelig;': '\xe6',
93 | 'af;': '\u2061',
94 | 'Afr;': '\U0001d504',
95 | 'afr;': '\U0001d51e',
96 | 'Agrave': '\xc0',
97 | 'agrave': '\xe0',
98 | 'Agrave;': '\xc0',
99 | 'agrave;': '\xe0',
100 | 'alefsym;': '\u2135',
101 | 'aleph;': '\u2135',
102 | 'Alpha;': '\u0391',
103 | 'alpha;': '\u03b1',
104 | 'Amacr;': '\u0100',
105 | 'amacr;': '\u0101',
106 | 'amalg;': '\u2a3f',
107 | 'AMP': '&',
108 | 'amp': '&',
109 | 'AMP;': '&',
110 | 'amp;': '&',
111 | 'And;': '\u2a53',
112 | 'and;': '\u2227',
113 | 'andand;': '\u2a55',
114 | 'andd;': '\u2a5c',
115 | 'andslope;': '\u2a58',
116 | 'andv;': '\u2a5a',
117 | 'ang;': '\u2220',
118 | 'ange;': '\u29a4',
119 | 'angle;': '\u2220',
120 | 'angmsd;': '\u2221',
121 | 'angmsdaa;': '\u29a8',
122 | 'angmsdab;': '\u29a9',
123 | 'angmsdac;': '\u29aa',
124 | 'angmsdad;': '\u29ab',
125 | 'angmsdae;': '\u29ac',
126 | 'angmsdaf;': '\u29ad',
127 | 'angmsdag;': '\u29ae',
128 | 'angmsdah;': '\u29af',
129 | 'angrt;': '\u221f',
130 | 'angrtvb;': '\u22be',
131 | 'angrtvbd;': '\u299d',
132 | 'angsph;': '\u2222',
133 | 'angst;': '\xc5',
134 | 'angzarr;': '\u237c',
135 | 'Aogon;': '\u0104',
136 | 'aogon;': '\u0105',
137 | 'Aopf;': '\U0001d538',
138 | 'aopf;': '\U0001d552',
139 | 'ap;': '\u2248',
140 | 'apacir;': '\u2a6f',
141 | 'apE;': '\u2a70',
142 | 'ape;': '\u224a',
143 | 'apid;': '\u224b',
144 | 'apos;': "'",
145 | 'ApplyFunction;': '\u2061',
146 | 'approx;': '\u2248',
147 | 'approxeq;': '\u224a',
148 | 'Aring': '\xc5',
149 | 'aring': '\xe5',
150 | 'Aring;': '\xc5',
151 | 'aring;': '\xe5',
152 | 'Ascr;': '\U0001d49c',
153 | 'ascr;': '\U0001d4b6',
154 | 'Assign;': '\u2254',
155 | 'ast;': '*',
156 | 'asymp;': '\u2248',
157 | 'asympeq;': '\u224d',
158 | 'Atilde': '\xc3',
159 | 'atilde': '\xe3',
160 | 'Atilde;': '\xc3',
161 | 'atilde;': '\xe3',
162 | 'Auml': '\xc4',
163 | 'auml': '\xe4',
164 | 'Auml;': '\xc4',
165 | 'auml;': '\xe4',
166 | 'awconint;': '\u2233',
167 | 'awint;': '\u2a11',
168 | 'backcong;': '\u224c',
169 | 'backepsilon;': '\u03f6',
170 | 'backprime;': '\u2035',
171 | 'backsim;': '\u223d',
172 | 'backsimeq;': '\u22cd',
173 | 'Backslash;': '\u2216',
174 | 'Barv;': '\u2ae7',
175 | 'barvee;': '\u22bd',
176 | 'Barwed;': '\u2306',
177 | 'barwed;': '\u2305',
178 | 'barwedge;': '\u2305',
179 | 'bbrk;': '\u23b5',
180 | 'bbrktbrk;': '\u23b6',
181 | 'bcong;': '\u224c',
182 | 'Bcy;': '\u0411',
183 | 'bcy;': '\u0431',
184 | 'bdquo;': '\u201e',
185 | 'becaus;': '\u2235',
186 | 'Because;': '\u2235',
187 | 'because;': '\u2235',
188 | 'bemptyv;': '\u29b0',
189 | 'bepsi;': '\u03f6',
190 | 'bernou;': '\u212c',
191 | 'Bernoullis;': '\u212c',
192 | 'Beta;': '\u0392',
193 | 'beta;': '\u03b2',
194 | 'beth;': '\u2136',
195 | 'between;': '\u226c',
196 | 'Bfr;': '\U0001d505',
197 | 'bfr;': '\U0001d51f',
198 | 'bigcap;': '\u22c2',
199 | 'bigcirc;': '\u25ef',
200 | 'bigcup;': '\u22c3',
201 | 'bigodot;': '\u2a00',
202 | 'bigoplus;': '\u2a01',
203 | 'bigotimes;': '\u2a02',
204 | 'bigsqcup;': '\u2a06',
205 | 'bigstar;': '\u2605',
206 | 'bigtriangledown;': '\u25bd',
207 | 'bigtriangleup;': '\u25b3',
208 | 'biguplus;': '\u2a04',
209 | 'bigvee;': '\u22c1',
210 | 'bigwedge;': '\u22c0',
211 | 'bkarow;': '\u290d',
212 | 'blacklozenge;': '\u29eb',
213 | 'blacksquare;': '\u25aa',
214 | 'blacktriangle;': '\u25b4',
215 | 'blacktriangledown;': '\u25be',
216 | 'blacktriangleleft;': '\u25c2',
217 | 'blacktriangleright;': '\u25b8',
218 | 'blank;': '\u2423',
219 | 'blk12;': '\u2592',
220 | 'blk14;': '\u2591',
221 | 'blk34;': '\u2593',
222 | 'block;': '\u2588',
223 | 'bne;': '=\u20e5',
224 | 'bnequiv;': '\u2261\u20e5',
225 | 'bNot;': '\u2aed',
226 | 'bnot;': '\u2310',
227 | 'Bopf;': '\U0001d539',
228 | 'bopf;': '\U0001d553',
229 | 'bot;': '\u22a5',
230 | 'bottom;': '\u22a5',
231 | 'bowtie;': '\u22c8',
232 | 'boxbox;': '\u29c9',
233 | 'boxDL;': '\u2557',
234 | 'boxDl;': '\u2556',
235 | 'boxdL;': '\u2555',
236 | 'boxdl;': '\u2510',
237 | 'boxDR;': '\u2554',
238 | 'boxDr;': '\u2553',
239 | 'boxdR;': '\u2552',
240 | 'boxdr;': '\u250c',
241 | 'boxH;': '\u2550',
242 | 'boxh;': '\u2500',
243 | 'boxHD;': '\u2566',
244 | 'boxHd;': '\u2564',
245 | 'boxhD;': '\u2565',
246 | 'boxhd;': '\u252c',
247 | 'boxHU;': '\u2569',
248 | 'boxHu;': '\u2567',
249 | 'boxhU;': '\u2568',
250 | 'boxhu;': '\u2534',
251 | 'boxminus;': '\u229f',
252 | 'boxplus;': '\u229e',
253 | 'boxtimes;': '\u22a0',
254 | 'boxUL;': '\u255d',
255 | 'boxUl;': '\u255c',
256 | 'boxuL;': '\u255b',
257 | 'boxul;': '\u2518',
258 | 'boxUR;': '\u255a',
259 | 'boxUr;': '\u2559',
260 | 'boxuR;': '\u2558',
261 | 'boxur;': '\u2514',
262 | 'boxV;': '\u2551',
263 | 'boxv;': '\u2502',
264 | 'boxVH;': '\u256c',
265 | 'boxVh;': '\u256b',
266 | 'boxvH;': '\u256a',
267 | 'boxvh;': '\u253c',
268 | 'boxVL;': '\u2563',
269 | 'boxVl;': '\u2562',
270 | 'boxvL;': '\u2561',
271 | 'boxvl;': '\u2524',
272 | 'boxVR;': '\u2560',
273 | 'boxVr;': '\u255f',
274 | 'boxvR;': '\u255e',
275 | 'boxvr;': '\u251c',
276 | 'bprime;': '\u2035',
277 | 'Breve;': '\u02d8',
278 | 'breve;': '\u02d8',
279 | 'brvbar': '\xa6',
280 | 'brvbar;': '\xa6',
281 | 'Bscr;': '\u212c',
282 | 'bscr;': '\U0001d4b7',
283 | 'bsemi;': '\u204f',
284 | 'bsim;': '\u223d',
285 | 'bsime;': '\u22cd',
286 | 'bsol;': '\\',
287 | 'bsolb;': '\u29c5',
288 | 'bsolhsub;': '\u27c8',
289 | 'bull;': '\u2022',
290 | 'bullet;': '\u2022',
291 | 'bump;': '\u224e',
292 | 'bumpE;': '\u2aae',
293 | 'bumpe;': '\u224f',
294 | 'Bumpeq;': '\u224e',
295 | 'bumpeq;': '\u224f',
296 | 'Cacute;': '\u0106',
297 | 'cacute;': '\u0107',
298 | 'Cap;': '\u22d2',
299 | 'cap;': '\u2229',
300 | 'capand;': '\u2a44',
301 | 'capbrcup;': '\u2a49',
302 | 'capcap;': '\u2a4b',
303 | 'capcup;': '\u2a47',
304 | 'capdot;': '\u2a40',
305 | 'CapitalDifferentialD;': '\u2145',
306 | 'caps;': '\u2229\ufe00',
307 | 'caret;': '\u2041',
308 | 'caron;': '\u02c7',
309 | 'Cayleys;': '\u212d',
310 | 'ccaps;': '\u2a4d',
311 | 'Ccaron;': '\u010c',
312 | 'ccaron;': '\u010d',
313 | 'Ccedil': '\xc7',
314 | 'ccedil': '\xe7',
315 | 'Ccedil;': '\xc7',
316 | 'ccedil;': '\xe7',
317 | 'Ccirc;': '\u0108',
318 | 'ccirc;': '\u0109',
319 | 'Cconint;': '\u2230',
320 | 'ccups;': '\u2a4c',
321 | 'ccupssm;': '\u2a50',
322 | 'Cdot;': '\u010a',
323 | 'cdot;': '\u010b',
324 | 'cedil': '\xb8',
325 | 'cedil;': '\xb8',
326 | 'Cedilla;': '\xb8',
327 | 'cemptyv;': '\u29b2',
328 | 'cent': '\xa2',
329 | 'cent;': '\xa2',
330 | 'CenterDot;': '\xb7',
331 | 'centerdot;': '\xb7',
332 | 'Cfr;': '\u212d',
333 | 'cfr;': '\U0001d520',
334 | 'CHcy;': '\u0427',
335 | 'chcy;': '\u0447',
336 | 'check;': '\u2713',
337 | 'checkmark;': '\u2713',
338 | 'Chi;': '\u03a7',
339 | 'chi;': '\u03c7',
340 | 'cir;': '\u25cb',
341 | 'circ;': '\u02c6',
342 | 'circeq;': '\u2257',
343 | 'circlearrowleft;': '\u21ba',
344 | 'circlearrowright;': '\u21bb',
345 | 'circledast;': '\u229b',
346 | 'circledcirc;': '\u229a',
347 | 'circleddash;': '\u229d',
348 | 'CircleDot;': '\u2299',
349 | 'circledR;': '\xae',
350 | 'circledS;': '\u24c8',
351 | 'CircleMinus;': '\u2296',
352 | 'CirclePlus;': '\u2295',
353 | 'CircleTimes;': '\u2297',
354 | 'cirE;': '\u29c3',
355 | 'cire;': '\u2257',
356 | 'cirfnint;': '\u2a10',
357 | 'cirmid;': '\u2aef',
358 | 'cirscir;': '\u29c2',
359 | 'ClockwiseContourIntegral;': '\u2232',
360 | 'CloseCurlyDoubleQuote;': '\u201d',
361 | 'CloseCurlyQuote;': '\u2019',
362 | 'clubs;': '\u2663',
363 | 'clubsuit;': '\u2663',
364 | 'Colon;': '\u2237',
365 | 'colon;': ':',
366 | 'Colone;': '\u2a74',
367 | 'colone;': '\u2254',
368 | 'coloneq;': '\u2254',
369 | 'comma;': ',',
370 | 'commat;': '@',
371 | 'comp;': '\u2201',
372 | 'compfn;': '\u2218',
373 | 'complement;': '\u2201',
374 | 'complexes;': '\u2102',
375 | 'cong;': '\u2245',
376 | 'congdot;': '\u2a6d',
377 | 'Congruent;': '\u2261',
378 | 'Conint;': '\u222f',
379 | 'conint;': '\u222e',
380 | 'ContourIntegral;': '\u222e',
381 | 'Copf;': '\u2102',
382 | 'copf;': '\U0001d554',
383 | 'coprod;': '\u2210',
384 | 'Coproduct;': '\u2210',
385 | 'COPY': '\xa9',
386 | 'copy': '\xa9',
387 | 'COPY;': '\xa9',
388 | 'copy;': '\xa9',
389 | 'copysr;': '\u2117',
390 | 'CounterClockwiseContourIntegral;': '\u2233',
391 | 'crarr;': '\u21b5',
392 | 'Cross;': '\u2a2f',
393 | 'cross;': '\u2717',
394 | 'Cscr;': '\U0001d49e',
395 | 'cscr;': '\U0001d4b8',
396 | 'csub;': '\u2acf',
397 | 'csube;': '\u2ad1',
398 | 'csup;': '\u2ad0',
399 | 'csupe;': '\u2ad2',
400 | 'ctdot;': '\u22ef',
401 | 'cudarrl;': '\u2938',
402 | 'cudarrr;': '\u2935',
403 | 'cuepr;': '\u22de',
404 | 'cuesc;': '\u22df',
405 | 'cularr;': '\u21b6',
406 | 'cularrp;': '\u293d',
407 | 'Cup;': '\u22d3',
408 | 'cup;': '\u222a',
409 | 'cupbrcap;': '\u2a48',
410 | 'CupCap;': '\u224d',
411 | 'cupcap;': '\u2a46',
412 | 'cupcup;': '\u2a4a',
413 | 'cupdot;': '\u228d',
414 | 'cupor;': '\u2a45',
415 | 'cups;': '\u222a\ufe00',
416 | 'curarr;': '\u21b7',
417 | 'curarrm;': '\u293c',
418 | 'curlyeqprec;': '\u22de',
419 | 'curlyeqsucc;': '\u22df',
420 | 'curlyvee;': '\u22ce',
421 | 'curlywedge;': '\u22cf',
422 | 'curren': '\xa4',
423 | 'curren;': '\xa4',
424 | 'curvearrowleft;': '\u21b6',
425 | 'curvearrowright;': '\u21b7',
426 | 'cuvee;': '\u22ce',
427 | 'cuwed;': '\u22cf',
428 | 'cwconint;': '\u2232',
429 | 'cwint;': '\u2231',
430 | 'cylcty;': '\u232d',
431 | 'Dagger;': '\u2021',
432 | 'dagger;': '\u2020',
433 | 'daleth;': '\u2138',
434 | 'Darr;': '\u21a1',
435 | 'dArr;': '\u21d3',
436 | 'darr;': '\u2193',
437 | 'dash;': '\u2010',
438 | 'Dashv;': '\u2ae4',
439 | 'dashv;': '\u22a3',
440 | 'dbkarow;': '\u290f',
441 | 'dblac;': '\u02dd',
442 | 'Dcaron;': '\u010e',
443 | 'dcaron;': '\u010f',
444 | 'Dcy;': '\u0414',
445 | 'dcy;': '\u0434',
446 | 'DD;': '\u2145',
447 | 'dd;': '\u2146',
448 | 'ddagger;': '\u2021',
449 | 'ddarr;': '\u21ca',
450 | 'DDotrahd;': '\u2911',
451 | 'ddotseq;': '\u2a77',
452 | 'deg': '\xb0',
453 | 'deg;': '\xb0',
454 | 'Del;': '\u2207',
455 | 'Delta;': '\u0394',
456 | 'delta;': '\u03b4',
457 | 'demptyv;': '\u29b1',
458 | 'dfisht;': '\u297f',
459 | 'Dfr;': '\U0001d507',
460 | 'dfr;': '\U0001d521',
461 | 'dHar;': '\u2965',
462 | 'dharl;': '\u21c3',
463 | 'dharr;': '\u21c2',
464 | 'DiacriticalAcute;': '\xb4',
465 | 'DiacriticalDot;': '\u02d9',
466 | 'DiacriticalDoubleAcute;': '\u02dd',
467 | 'DiacriticalGrave;': '`',
468 | 'DiacriticalTilde;': '\u02dc',
469 | 'diam;': '\u22c4',
470 | 'Diamond;': '\u22c4',
471 | 'diamond;': '\u22c4',
472 | 'diamondsuit;': '\u2666',
473 | 'diams;': '\u2666',
474 | 'die;': '\xa8',
475 | 'DifferentialD;': '\u2146',
476 | 'digamma;': '\u03dd',
477 | 'disin;': '\u22f2',
478 | 'div;': '\xf7',
479 | 'divide': '\xf7',
480 | 'divide;': '\xf7',
481 | 'divideontimes;': '\u22c7',
482 | 'divonx;': '\u22c7',
483 | 'DJcy;': '\u0402',
484 | 'djcy;': '\u0452',
485 | 'dlcorn;': '\u231e',
486 | 'dlcrop;': '\u230d',
487 | 'dollar;': '$',
488 | 'Dopf;': '\U0001d53b',
489 | 'dopf;': '\U0001d555',
490 | 'Dot;': '\xa8',
491 | 'dot;': '\u02d9',
492 | 'DotDot;': '\u20dc',
493 | 'doteq;': '\u2250',
494 | 'doteqdot;': '\u2251',
495 | 'DotEqual;': '\u2250',
496 | 'dotminus;': '\u2238',
497 | 'dotplus;': '\u2214',
498 | 'dotsquare;': '\u22a1',
499 | 'doublebarwedge;': '\u2306',
500 | 'DoubleContourIntegral;': '\u222f',
501 | 'DoubleDot;': '\xa8',
502 | 'DoubleDownArrow;': '\u21d3',
503 | 'DoubleLeftArrow;': '\u21d0',
504 | 'DoubleLeftRightArrow;': '\u21d4',
505 | 'DoubleLeftTee;': '\u2ae4',
506 | 'DoubleLongLeftArrow;': '\u27f8',
507 | 'DoubleLongLeftRightArrow;': '\u27fa',
508 | 'DoubleLongRightArrow;': '\u27f9',
509 | 'DoubleRightArrow;': '\u21d2',
510 | 'DoubleRightTee;': '\u22a8',
511 | 'DoubleUpArrow;': '\u21d1',
512 | 'DoubleUpDownArrow;': '\u21d5',
513 | 'DoubleVerticalBar;': '\u2225',
514 | 'DownArrow;': '\u2193',
515 | 'Downarrow;': '\u21d3',
516 | 'downarrow;': '\u2193',
517 | 'DownArrowBar;': '\u2913',
518 | 'DownArrowUpArrow;': '\u21f5',
519 | 'DownBreve;': '\u0311',
520 | 'downdownarrows;': '\u21ca',
521 | 'downharpoonleft;': '\u21c3',
522 | 'downharpoonright;': '\u21c2',
523 | 'DownLeftRightVector;': '\u2950',
524 | 'DownLeftTeeVector;': '\u295e',
525 | 'DownLeftVector;': '\u21bd',
526 | 'DownLeftVectorBar;': '\u2956',
527 | 'DownRightTeeVector;': '\u295f',
528 | 'DownRightVector;': '\u21c1',
529 | 'DownRightVectorBar;': '\u2957',
530 | 'DownTee;': '\u22a4',
531 | 'DownTeeArrow;': '\u21a7',
532 | 'drbkarow;': '\u2910',
533 | 'drcorn;': '\u231f',
534 | 'drcrop;': '\u230c',
535 | 'Dscr;': '\U0001d49f',
536 | 'dscr;': '\U0001d4b9',
537 | 'DScy;': '\u0405',
538 | 'dscy;': '\u0455',
539 | 'dsol;': '\u29f6',
540 | 'Dstrok;': '\u0110',
541 | 'dstrok;': '\u0111',
542 | 'dtdot;': '\u22f1',
543 | 'dtri;': '\u25bf',
544 | 'dtrif;': '\u25be',
545 | 'duarr;': '\u21f5',
546 | 'duhar;': '\u296f',
547 | 'dwangle;': '\u29a6',
548 | 'DZcy;': '\u040f',
549 | 'dzcy;': '\u045f',
550 | 'dzigrarr;': '\u27ff',
551 | 'Eacute': '\xc9',
552 | 'eacute': '\xe9',
553 | 'Eacute;': '\xc9',
554 | 'eacute;': '\xe9',
555 | 'easter;': '\u2a6e',
556 | 'Ecaron;': '\u011a',
557 | 'ecaron;': '\u011b',
558 | 'ecir;': '\u2256',
559 | 'Ecirc': '\xca',
560 | 'ecirc': '\xea',
561 | 'Ecirc;': '\xca',
562 | 'ecirc;': '\xea',
563 | 'ecolon;': '\u2255',
564 | 'Ecy;': '\u042d',
565 | 'ecy;': '\u044d',
566 | 'eDDot;': '\u2a77',
567 | 'Edot;': '\u0116',
568 | 'eDot;': '\u2251',
569 | 'edot;': '\u0117',
570 | 'ee;': '\u2147',
571 | 'efDot;': '\u2252',
572 | 'Efr;': '\U0001d508',
573 | 'efr;': '\U0001d522',
574 | 'eg;': '\u2a9a',
575 | 'Egrave': '\xc8',
576 | 'egrave': '\xe8',
577 | 'Egrave;': '\xc8',
578 | 'egrave;': '\xe8',
579 | 'egs;': '\u2a96',
580 | 'egsdot;': '\u2a98',
581 | 'el;': '\u2a99',
582 | 'Element;': '\u2208',
583 | 'elinters;': '\u23e7',
584 | 'ell;': '\u2113',
585 | 'els;': '\u2a95',
586 | 'elsdot;': '\u2a97',
587 | 'Emacr;': '\u0112',
588 | 'emacr;': '\u0113',
589 | 'empty;': '\u2205',
590 | 'emptyset;': '\u2205',
591 | 'EmptySmallSquare;': '\u25fb',
592 | 'emptyv;': '\u2205',
593 | 'EmptyVerySmallSquare;': '\u25ab',
594 | 'emsp13;': '\u2004',
595 | 'emsp14;': '\u2005',
596 | 'emsp;': '\u2003',
597 | 'ENG;': '\u014a',
598 | 'eng;': '\u014b',
599 | 'ensp;': '\u2002',
600 | 'Eogon;': '\u0118',
601 | 'eogon;': '\u0119',
602 | 'Eopf;': '\U0001d53c',
603 | 'eopf;': '\U0001d556',
604 | 'epar;': '\u22d5',
605 | 'eparsl;': '\u29e3',
606 | 'eplus;': '\u2a71',
607 | 'epsi;': '\u03b5',
608 | 'Epsilon;': '\u0395',
609 | 'epsilon;': '\u03b5',
610 | 'epsiv;': '\u03f5',
611 | 'eqcirc;': '\u2256',
612 | 'eqcolon;': '\u2255',
613 | 'eqsim;': '\u2242',
614 | 'eqslantgtr;': '\u2a96',
615 | 'eqslantless;': '\u2a95',
616 | 'Equal;': '\u2a75',
617 | 'equals;': '=',
618 | 'EqualTilde;': '\u2242',
619 | 'equest;': '\u225f',
620 | 'Equilibrium;': '\u21cc',
621 | 'equiv;': '\u2261',
622 | 'equivDD;': '\u2a78',
623 | 'eqvparsl;': '\u29e5',
624 | 'erarr;': '\u2971',
625 | 'erDot;': '\u2253',
626 | 'Escr;': '\u2130',
627 | 'escr;': '\u212f',
628 | 'esdot;': '\u2250',
629 | 'Esim;': '\u2a73',
630 | 'esim;': '\u2242',
631 | 'Eta;': '\u0397',
632 | 'eta;': '\u03b7',
633 | 'ETH': '\xd0',
634 | 'eth': '\xf0',
635 | 'ETH;': '\xd0',
636 | 'eth;': '\xf0',
637 | 'Euml': '\xcb',
638 | 'euml': '\xeb',
639 | 'Euml;': '\xcb',
640 | 'euml;': '\xeb',
641 | 'euro;': '\u20ac',
642 | 'excl;': '!',
643 | 'exist;': '\u2203',
644 | 'Exists;': '\u2203',
645 | 'expectation;': '\u2130',
646 | 'ExponentialE;': '\u2147',
647 | 'exponentiale;': '\u2147',
648 | 'fallingdotseq;': '\u2252',
649 | 'Fcy;': '\u0424',
650 | 'fcy;': '\u0444',
651 | 'female;': '\u2640',
652 | 'ffilig;': '\ufb03',
653 | 'fflig;': '\ufb00',
654 | 'ffllig;': '\ufb04',
655 | 'Ffr;': '\U0001d509',
656 | 'ffr;': '\U0001d523',
657 | 'filig;': '\ufb01',
658 | 'FilledSmallSquare;': '\u25fc',
659 | 'FilledVerySmallSquare;': '\u25aa',
660 | 'fjlig;': 'fj',
661 | 'flat;': '\u266d',
662 | 'fllig;': '\ufb02',
663 | 'fltns;': '\u25b1',
664 | 'fnof;': '\u0192',
665 | 'Fopf;': '\U0001d53d',
666 | 'fopf;': '\U0001d557',
667 | 'ForAll;': '\u2200',
668 | 'forall;': '\u2200',
669 | 'fork;': '\u22d4',
670 | 'forkv;': '\u2ad9',
671 | 'Fouriertrf;': '\u2131',
672 | 'fpartint;': '\u2a0d',
673 | 'frac12': '\xbd',
674 | 'frac12;': '\xbd',
675 | 'frac13;': '\u2153',
676 | 'frac14': '\xbc',
677 | 'frac14;': '\xbc',
678 | 'frac15;': '\u2155',
679 | 'frac16;': '\u2159',
680 | 'frac18;': '\u215b',
681 | 'frac23;': '\u2154',
682 | 'frac25;': '\u2156',
683 | 'frac34': '\xbe',
684 | 'frac34;': '\xbe',
685 | 'frac35;': '\u2157',
686 | 'frac38;': '\u215c',
687 | 'frac45;': '\u2158',
688 | 'frac56;': '\u215a',
689 | 'frac58;': '\u215d',
690 | 'frac78;': '\u215e',
691 | 'frasl;': '\u2044',
692 | 'frown;': '\u2322',
693 | 'Fscr;': '\u2131',
694 | 'fscr;': '\U0001d4bb',
695 | 'gacute;': '\u01f5',
696 | 'Gamma;': '\u0393',
697 | 'gamma;': '\u03b3',
698 | 'Gammad;': '\u03dc',
699 | 'gammad;': '\u03dd',
700 | 'gap;': '\u2a86',
701 | 'Gbreve;': '\u011e',
702 | 'gbreve;': '\u011f',
703 | 'Gcedil;': '\u0122',
704 | 'Gcirc;': '\u011c',
705 | 'gcirc;': '\u011d',
706 | 'Gcy;': '\u0413',
707 | 'gcy;': '\u0433',
708 | 'Gdot;': '\u0120',
709 | 'gdot;': '\u0121',
710 | 'gE;': '\u2267',
711 | 'ge;': '\u2265',
712 | 'gEl;': '\u2a8c',
713 | 'gel;': '\u22db',
714 | 'geq;': '\u2265',
715 | 'geqq;': '\u2267',
716 | 'geqslant;': '\u2a7e',
717 | 'ges;': '\u2a7e',
718 | 'gescc;': '\u2aa9',
719 | 'gesdot;': '\u2a80',
720 | 'gesdoto;': '\u2a82',
721 | 'gesdotol;': '\u2a84',
722 | 'gesl;': '\u22db\ufe00',
723 | 'gesles;': '\u2a94',
724 | 'Gfr;': '\U0001d50a',
725 | 'gfr;': '\U0001d524',
726 | 'Gg;': '\u22d9',
727 | 'gg;': '\u226b',
728 | 'ggg;': '\u22d9',
729 | 'gimel;': '\u2137',
730 | 'GJcy;': '\u0403',
731 | 'gjcy;': '\u0453',
732 | 'gl;': '\u2277',
733 | 'gla;': '\u2aa5',
734 | 'glE;': '\u2a92',
735 | 'glj;': '\u2aa4',
736 | 'gnap;': '\u2a8a',
737 | 'gnapprox;': '\u2a8a',
738 | 'gnE;': '\u2269',
739 | 'gne;': '\u2a88',
740 | 'gneq;': '\u2a88',
741 | 'gneqq;': '\u2269',
742 | 'gnsim;': '\u22e7',
743 | 'Gopf;': '\U0001d53e',
744 | 'gopf;': '\U0001d558',
745 | 'grave;': '`',
746 | 'GreaterEqual;': '\u2265',
747 | 'GreaterEqualLess;': '\u22db',
748 | 'GreaterFullEqual;': '\u2267',
749 | 'GreaterGreater;': '\u2aa2',
750 | 'GreaterLess;': '\u2277',
751 | 'GreaterSlantEqual;': '\u2a7e',
752 | 'GreaterTilde;': '\u2273',
753 | 'Gscr;': '\U0001d4a2',
754 | 'gscr;': '\u210a',
755 | 'gsim;': '\u2273',
756 | 'gsime;': '\u2a8e',
757 | 'gsiml;': '\u2a90',
758 | 'GT': '>',
759 | 'gt': '>',
760 | 'GT;': '>',
761 | 'Gt;': '\u226b',
762 | 'gt;': '>',
763 | 'gtcc;': '\u2aa7',
764 | 'gtcir;': '\u2a7a',
765 | 'gtdot;': '\u22d7',
766 | 'gtlPar;': '\u2995',
767 | 'gtquest;': '\u2a7c',
768 | 'gtrapprox;': '\u2a86',
769 | 'gtrarr;': '\u2978',
770 | 'gtrdot;': '\u22d7',
771 | 'gtreqless;': '\u22db',
772 | 'gtreqqless;': '\u2a8c',
773 | 'gtrless;': '\u2277',
774 | 'gtrsim;': '\u2273',
775 | 'gvertneqq;': '\u2269\ufe00',
776 | 'gvnE;': '\u2269\ufe00',
777 | 'Hacek;': '\u02c7',
778 | 'hairsp;': '\u200a',
779 | 'half;': '\xbd',
780 | 'hamilt;': '\u210b',
781 | 'HARDcy;': '\u042a',
782 | 'hardcy;': '\u044a',
783 | 'hArr;': '\u21d4',
784 | 'harr;': '\u2194',
785 | 'harrcir;': '\u2948',
786 | 'harrw;': '\u21ad',
787 | 'Hat;': '^',
788 | 'hbar;': '\u210f',
789 | 'Hcirc;': '\u0124',
790 | 'hcirc;': '\u0125',
791 | 'hearts;': '\u2665',
792 | 'heartsuit;': '\u2665',
793 | 'hellip;': '\u2026',
794 | 'hercon;': '\u22b9',
795 | 'Hfr;': '\u210c',
796 | 'hfr;': '\U0001d525',
797 | 'HilbertSpace;': '\u210b',
798 | 'hksearow;': '\u2925',
799 | 'hkswarow;': '\u2926',
800 | 'hoarr;': '\u21ff',
801 | 'homtht;': '\u223b',
802 | 'hookleftarrow;': '\u21a9',
803 | 'hookrightarrow;': '\u21aa',
804 | 'Hopf;': '\u210d',
805 | 'hopf;': '\U0001d559',
806 | 'horbar;': '\u2015',
807 | 'HorizontalLine;': '\u2500',
808 | 'Hscr;': '\u210b',
809 | 'hscr;': '\U0001d4bd',
810 | 'hslash;': '\u210f',
811 | 'Hstrok;': '\u0126',
812 | 'hstrok;': '\u0127',
813 | 'HumpDownHump;': '\u224e',
814 | 'HumpEqual;': '\u224f',
815 | 'hybull;': '\u2043',
816 | 'hyphen;': '\u2010',
817 | 'Iacute': '\xcd',
818 | 'iacute': '\xed',
819 | 'Iacute;': '\xcd',
820 | 'iacute;': '\xed',
821 | 'ic;': '\u2063',
822 | 'Icirc': '\xce',
823 | 'icirc': '\xee',
824 | 'Icirc;': '\xce',
825 | 'icirc;': '\xee',
826 | 'Icy;': '\u0418',
827 | 'icy;': '\u0438',
828 | 'Idot;': '\u0130',
829 | 'IEcy;': '\u0415',
830 | 'iecy;': '\u0435',
831 | 'iexcl': '\xa1',
832 | 'iexcl;': '\xa1',
833 | 'iff;': '\u21d4',
834 | 'Ifr;': '\u2111',
835 | 'ifr;': '\U0001d526',
836 | 'Igrave': '\xcc',
837 | 'igrave': '\xec',
838 | 'Igrave;': '\xcc',
839 | 'igrave;': '\xec',
840 | 'ii;': '\u2148',
841 | 'iiiint;': '\u2a0c',
842 | 'iiint;': '\u222d',
843 | 'iinfin;': '\u29dc',
844 | 'iiota;': '\u2129',
845 | 'IJlig;': '\u0132',
846 | 'ijlig;': '\u0133',
847 | 'Im;': '\u2111',
848 | 'Imacr;': '\u012a',
849 | 'imacr;': '\u012b',
850 | 'image;': '\u2111',
851 | 'ImaginaryI;': '\u2148',
852 | 'imagline;': '\u2110',
853 | 'imagpart;': '\u2111',
854 | 'imath;': '\u0131',
855 | 'imof;': '\u22b7',
856 | 'imped;': '\u01b5',
857 | 'Implies;': '\u21d2',
858 | 'in;': '\u2208',
859 | 'incare;': '\u2105',
860 | 'infin;': '\u221e',
861 | 'infintie;': '\u29dd',
862 | 'inodot;': '\u0131',
863 | 'Int;': '\u222c',
864 | 'int;': '\u222b',
865 | 'intcal;': '\u22ba',
866 | 'integers;': '\u2124',
867 | 'Integral;': '\u222b',
868 | 'intercal;': '\u22ba',
869 | 'Intersection;': '\u22c2',
870 | 'intlarhk;': '\u2a17',
871 | 'intprod;': '\u2a3c',
872 | 'InvisibleComma;': '\u2063',
873 | 'InvisibleTimes;': '\u2062',
874 | 'IOcy;': '\u0401',
875 | 'iocy;': '\u0451',
876 | 'Iogon;': '\u012e',
877 | 'iogon;': '\u012f',
878 | 'Iopf;': '\U0001d540',
879 | 'iopf;': '\U0001d55a',
880 | 'Iota;': '\u0399',
881 | 'iota;': '\u03b9',
882 | 'iprod;': '\u2a3c',
883 | 'iquest': '\xbf',
884 | 'iquest;': '\xbf',
885 | 'Iscr;': '\u2110',
886 | 'iscr;': '\U0001d4be',
887 | 'isin;': '\u2208',
888 | 'isindot;': '\u22f5',
889 | 'isinE;': '\u22f9',
890 | 'isins;': '\u22f4',
891 | 'isinsv;': '\u22f3',
892 | 'isinv;': '\u2208',
893 | 'it;': '\u2062',
894 | 'Itilde;': '\u0128',
895 | 'itilde;': '\u0129',
896 | 'Iukcy;': '\u0406',
897 | 'iukcy;': '\u0456',
898 | 'Iuml': '\xcf',
899 | 'iuml': '\xef',
900 | 'Iuml;': '\xcf',
901 | 'iuml;': '\xef',
902 | 'Jcirc;': '\u0134',
903 | 'jcirc;': '\u0135',
904 | 'Jcy;': '\u0419',
905 | 'jcy;': '\u0439',
906 | 'Jfr;': '\U0001d50d',
907 | 'jfr;': '\U0001d527',
908 | 'jmath;': '\u0237',
909 | 'Jopf;': '\U0001d541',
910 | 'jopf;': '\U0001d55b',
911 | 'Jscr;': '\U0001d4a5',
912 | 'jscr;': '\U0001d4bf',
913 | 'Jsercy;': '\u0408',
914 | 'jsercy;': '\u0458',
915 | 'Jukcy;': '\u0404',
916 | 'jukcy;': '\u0454',
917 | 'Kappa;': '\u039a',
918 | 'kappa;': '\u03ba',
919 | 'kappav;': '\u03f0',
920 | 'Kcedil;': '\u0136',
921 | 'kcedil;': '\u0137',
922 | 'Kcy;': '\u041a',
923 | 'kcy;': '\u043a',
924 | 'Kfr;': '\U0001d50e',
925 | 'kfr;': '\U0001d528',
926 | 'kgreen;': '\u0138',
927 | 'KHcy;': '\u0425',
928 | 'khcy;': '\u0445',
929 | 'KJcy;': '\u040c',
930 | 'kjcy;': '\u045c',
931 | 'Kopf;': '\U0001d542',
932 | 'kopf;': '\U0001d55c',
933 | 'Kscr;': '\U0001d4a6',
934 | 'kscr;': '\U0001d4c0',
935 | 'lAarr;': '\u21da',
936 | 'Lacute;': '\u0139',
937 | 'lacute;': '\u013a',
938 | 'laemptyv;': '\u29b4',
939 | 'lagran;': '\u2112',
940 | 'Lambda;': '\u039b',
941 | 'lambda;': '\u03bb',
942 | 'Lang;': '\u27ea',
943 | 'lang;': '\u27e8',
944 | 'langd;': '\u2991',
945 | 'langle;': '\u27e8',
946 | 'lap;': '\u2a85',
947 | 'Laplacetrf;': '\u2112',
948 | 'laquo': '\xab',
949 | 'laquo;': '\xab',
950 | 'Larr;': '\u219e',
951 | 'lArr;': '\u21d0',
952 | 'larr;': '\u2190',
953 | 'larrb;': '\u21e4',
954 | 'larrbfs;': '\u291f',
955 | 'larrfs;': '\u291d',
956 | 'larrhk;': '\u21a9',
957 | 'larrlp;': '\u21ab',
958 | 'larrpl;': '\u2939',
959 | 'larrsim;': '\u2973',
960 | 'larrtl;': '\u21a2',
961 | 'lat;': '\u2aab',
962 | 'lAtail;': '\u291b',
963 | 'latail;': '\u2919',
964 | 'late;': '\u2aad',
965 | 'lates;': '\u2aad\ufe00',
966 | 'lBarr;': '\u290e',
967 | 'lbarr;': '\u290c',
968 | 'lbbrk;': '\u2772',
969 | 'lbrace;': '{',
970 | 'lbrack;': '[',
971 | 'lbrke;': '\u298b',
972 | 'lbrksld;': '\u298f',
973 | 'lbrkslu;': '\u298d',
974 | 'Lcaron;': '\u013d',
975 | 'lcaron;': '\u013e',
976 | 'Lcedil;': '\u013b',
977 | 'lcedil;': '\u013c',
978 | 'lceil;': '\u2308',
979 | 'lcub;': '{',
980 | 'Lcy;': '\u041b',
981 | 'lcy;': '\u043b',
982 | 'ldca;': '\u2936',
983 | 'ldquo;': '\u201c',
984 | 'ldquor;': '\u201e',
985 | 'ldrdhar;': '\u2967',
986 | 'ldrushar;': '\u294b',
987 | 'ldsh;': '\u21b2',
988 | 'lE;': '\u2266',
989 | 'le;': '\u2264',
990 | 'LeftAngleBracket;': '\u27e8',
991 | 'LeftArrow;': '\u2190',
992 | 'Leftarrow;': '\u21d0',
993 | 'leftarrow;': '\u2190',
994 | 'LeftArrowBar;': '\u21e4',
995 | 'LeftArrowRightArrow;': '\u21c6',
996 | 'leftarrowtail;': '\u21a2',
997 | 'LeftCeiling;': '\u2308',
998 | 'LeftDoubleBracket;': '\u27e6',
999 | 'LeftDownTeeVector;': '\u2961',
1000 | 'LeftDownVector;': '\u21c3',
1001 | 'LeftDownVectorBar;': '\u2959',
1002 | 'LeftFloor;': '\u230a',
1003 | 'leftharpoondown;': '\u21bd',
1004 | 'leftharpoonup;': '\u21bc',
1005 | 'leftleftarrows;': '\u21c7',
1006 | 'LeftRightArrow;': '\u2194',
1007 | 'Leftrightarrow;': '\u21d4',
1008 | 'leftrightarrow;': '\u2194',
1009 | 'leftrightarrows;': '\u21c6',
1010 | 'leftrightharpoons;': '\u21cb',
1011 | 'leftrightsquigarrow;': '\u21ad',
1012 | 'LeftRightVector;': '\u294e',
1013 | 'LeftTee;': '\u22a3',
1014 | 'LeftTeeArrow;': '\u21a4',
1015 | 'LeftTeeVector;': '\u295a',
1016 | 'leftthreetimes;': '\u22cb',
1017 | 'LeftTriangle;': '\u22b2',
1018 | 'LeftTriangleBar;': '\u29cf',
1019 | 'LeftTriangleEqual;': '\u22b4',
1020 | 'LeftUpDownVector;': '\u2951',
1021 | 'LeftUpTeeVector;': '\u2960',
1022 | 'LeftUpVector;': '\u21bf',
1023 | 'LeftUpVectorBar;': '\u2958',
1024 | 'LeftVector;': '\u21bc',
1025 | 'LeftVectorBar;': '\u2952',
1026 | 'lEg;': '\u2a8b',
1027 | 'leg;': '\u22da',
1028 | 'leq;': '\u2264',
1029 | 'leqq;': '\u2266',
1030 | 'leqslant;': '\u2a7d',
1031 | 'les;': '\u2a7d',
1032 | 'lescc;': '\u2aa8',
1033 | 'lesdot;': '\u2a7f',
1034 | 'lesdoto;': '\u2a81',
1035 | 'lesdotor;': '\u2a83',
1036 | 'lesg;': '\u22da\ufe00',
1037 | 'lesges;': '\u2a93',
1038 | 'lessapprox;': '\u2a85',
1039 | 'lessdot;': '\u22d6',
1040 | 'lesseqgtr;': '\u22da',
1041 | 'lesseqqgtr;': '\u2a8b',
1042 | 'LessEqualGreater;': '\u22da',
1043 | 'LessFullEqual;': '\u2266',
1044 | 'LessGreater;': '\u2276',
1045 | 'lessgtr;': '\u2276',
1046 | 'LessLess;': '\u2aa1',
1047 | 'lesssim;': '\u2272',
1048 | 'LessSlantEqual;': '\u2a7d',
1049 | 'LessTilde;': '\u2272',
1050 | 'lfisht;': '\u297c',
1051 | 'lfloor;': '\u230a',
1052 | 'Lfr;': '\U0001d50f',
1053 | 'lfr;': '\U0001d529',
1054 | 'lg;': '\u2276',
1055 | 'lgE;': '\u2a91',
1056 | 'lHar;': '\u2962',
1057 | 'lhard;': '\u21bd',
1058 | 'lharu;': '\u21bc',
1059 | 'lharul;': '\u296a',
1060 | 'lhblk;': '\u2584',
1061 | 'LJcy;': '\u0409',
1062 | 'ljcy;': '\u0459',
1063 | 'Ll;': '\u22d8',
1064 | 'll;': '\u226a',
1065 | 'llarr;': '\u21c7',
1066 | 'llcorner;': '\u231e',
1067 | 'Lleftarrow;': '\u21da',
1068 | 'llhard;': '\u296b',
1069 | 'lltri;': '\u25fa',
1070 | 'Lmidot;': '\u013f',
1071 | 'lmidot;': '\u0140',
1072 | 'lmoust;': '\u23b0',
1073 | 'lmoustache;': '\u23b0',
1074 | 'lnap;': '\u2a89',
1075 | 'lnapprox;': '\u2a89',
1076 | 'lnE;': '\u2268',
1077 | 'lne;': '\u2a87',
1078 | 'lneq;': '\u2a87',
1079 | 'lneqq;': '\u2268',
1080 | 'lnsim;': '\u22e6',
1081 | 'loang;': '\u27ec',
1082 | 'loarr;': '\u21fd',
1083 | 'lobrk;': '\u27e6',
1084 | 'LongLeftArrow;': '\u27f5',
1085 | 'Longleftarrow;': '\u27f8',
1086 | 'longleftarrow;': '\u27f5',
1087 | 'LongLeftRightArrow;': '\u27f7',
1088 | 'Longleftrightarrow;': '\u27fa',
1089 | 'longleftrightarrow;': '\u27f7',
1090 | 'longmapsto;': '\u27fc',
1091 | 'LongRightArrow;': '\u27f6',
1092 | 'Longrightarrow;': '\u27f9',
1093 | 'longrightarrow;': '\u27f6',
1094 | 'looparrowleft;': '\u21ab',
1095 | 'looparrowright;': '\u21ac',
1096 | 'lopar;': '\u2985',
1097 | 'Lopf;': '\U0001d543',
1098 | 'lopf;': '\U0001d55d',
1099 | 'loplus;': '\u2a2d',
1100 | 'lotimes;': '\u2a34',
1101 | 'lowast;': '\u2217',
1102 | 'lowbar;': '_',
1103 | 'LowerLeftArrow;': '\u2199',
1104 | 'LowerRightArrow;': '\u2198',
1105 | 'loz;': '\u25ca',
1106 | 'lozenge;': '\u25ca',
1107 | 'lozf;': '\u29eb',
1108 | 'lpar;': '(',
1109 | 'lparlt;': '\u2993',
1110 | 'lrarr;': '\u21c6',
1111 | 'lrcorner;': '\u231f',
1112 | 'lrhar;': '\u21cb',
1113 | 'lrhard;': '\u296d',
1114 | 'lrm;': '\u200e',
1115 | 'lrtri;': '\u22bf',
1116 | 'lsaquo;': '\u2039',
1117 | 'Lscr;': '\u2112',
1118 | 'lscr;': '\U0001d4c1',
1119 | 'Lsh;': '\u21b0',
1120 | 'lsh;': '\u21b0',
1121 | 'lsim;': '\u2272',
1122 | 'lsime;': '\u2a8d',
1123 | 'lsimg;': '\u2a8f',
1124 | 'lsqb;': '[',
1125 | 'lsquo;': '\u2018',
1126 | 'lsquor;': '\u201a',
1127 | 'Lstrok;': '\u0141',
1128 | 'lstrok;': '\u0142',
1129 | 'LT': '<',
1130 | 'lt': '<',
1131 | 'LT;': '<',
1132 | 'Lt;': '\u226a',
1133 | 'lt;': '<',
1134 | 'ltcc;': '\u2aa6',
1135 | 'ltcir;': '\u2a79',
1136 | 'ltdot;': '\u22d6',
1137 | 'lthree;': '\u22cb',
1138 | 'ltimes;': '\u22c9',
1139 | 'ltlarr;': '\u2976',
1140 | 'ltquest;': '\u2a7b',
1141 | 'ltri;': '\u25c3',
1142 | 'ltrie;': '\u22b4',
1143 | 'ltrif;': '\u25c2',
1144 | 'ltrPar;': '\u2996',
1145 | 'lurdshar;': '\u294a',
1146 | 'luruhar;': '\u2966',
1147 | 'lvertneqq;': '\u2268\ufe00',
1148 | 'lvnE;': '\u2268\ufe00',
1149 | 'macr': '\xaf',
1150 | 'macr;': '\xaf',
1151 | 'male;': '\u2642',
1152 | 'malt;': '\u2720',
1153 | 'maltese;': '\u2720',
1154 | 'Map;': '\u2905',
1155 | 'map;': '\u21a6',
1156 | 'mapsto;': '\u21a6',
1157 | 'mapstodown;': '\u21a7',
1158 | 'mapstoleft;': '\u21a4',
1159 | 'mapstoup;': '\u21a5',
1160 | 'marker;': '\u25ae',
1161 | 'mcomma;': '\u2a29',
1162 | 'Mcy;': '\u041c',
1163 | 'mcy;': '\u043c',
1164 | 'mdash;': '\u2014',
1165 | 'mDDot;': '\u223a',
1166 | 'measuredangle;': '\u2221',
1167 | 'MediumSpace;': '\u205f',
1168 | 'Mellintrf;': '\u2133',
1169 | 'Mfr;': '\U0001d510',
1170 | 'mfr;': '\U0001d52a',
1171 | 'mho;': '\u2127',
1172 | 'micro': '\xb5',
1173 | 'micro;': '\xb5',
1174 | 'mid;': '\u2223',
1175 | 'midast;': '*',
1176 | 'midcir;': '\u2af0',
1177 | 'middot': '\xb7',
1178 | 'middot;': '\xb7',
1179 | 'minus;': '\u2212',
1180 | 'minusb;': '\u229f',
1181 | 'minusd;': '\u2238',
1182 | 'minusdu;': '\u2a2a',
1183 | 'MinusPlus;': '\u2213',
1184 | 'mlcp;': '\u2adb',
1185 | 'mldr;': '\u2026',
1186 | 'mnplus;': '\u2213',
1187 | 'models;': '\u22a7',
1188 | 'Mopf;': '\U0001d544',
1189 | 'mopf;': '\U0001d55e',
1190 | 'mp;': '\u2213',
1191 | 'Mscr;': '\u2133',
1192 | 'mscr;': '\U0001d4c2',
1193 | 'mstpos;': '\u223e',
1194 | 'Mu;': '\u039c',
1195 | 'mu;': '\u03bc',
1196 | 'multimap;': '\u22b8',
1197 | 'mumap;': '\u22b8',
1198 | 'nabla;': '\u2207',
1199 | 'Nacute;': '\u0143',
1200 | 'nacute;': '\u0144',
1201 | 'nang;': '\u2220\u20d2',
1202 | 'nap;': '\u2249',
1203 | 'napE;': '\u2a70\u0338',
1204 | 'napid;': '\u224b\u0338',
1205 | 'napos;': '\u0149',
1206 | 'napprox;': '\u2249',
1207 | 'natur;': '\u266e',
1208 | 'natural;': '\u266e',
1209 | 'naturals;': '\u2115',
1210 | 'nbsp': '\xa0',
1211 | 'nbsp;': '\xa0',
1212 | 'nbump;': '\u224e\u0338',
1213 | 'nbumpe;': '\u224f\u0338',
1214 | 'ncap;': '\u2a43',
1215 | 'Ncaron;': '\u0147',
1216 | 'ncaron;': '\u0148',
1217 | 'Ncedil;': '\u0145',
1218 | 'ncedil;': '\u0146',
1219 | 'ncong;': '\u2247',
1220 | 'ncongdot;': '\u2a6d\u0338',
1221 | 'ncup;': '\u2a42',
1222 | 'Ncy;': '\u041d',
1223 | 'ncy;': '\u043d',
1224 | 'ndash;': '\u2013',
1225 | 'ne;': '\u2260',
1226 | 'nearhk;': '\u2924',
1227 | 'neArr;': '\u21d7',
1228 | 'nearr;': '\u2197',
1229 | 'nearrow;': '\u2197',
1230 | 'nedot;': '\u2250\u0338',
1231 | 'NegativeMediumSpace;': '\u200b',
1232 | 'NegativeThickSpace;': '\u200b',
1233 | 'NegativeThinSpace;': '\u200b',
1234 | 'NegativeVeryThinSpace;': '\u200b',
1235 | 'nequiv;': '\u2262',
1236 | 'nesear;': '\u2928',
1237 | 'nesim;': '\u2242\u0338',
1238 | 'NestedGreaterGreater;': '\u226b',
1239 | 'NestedLessLess;': '\u226a',
1240 | 'NewLine;': '\n',
1241 | 'nexist;': '\u2204',
1242 | 'nexists;': '\u2204',
1243 | 'Nfr;': '\U0001d511',
1244 | 'nfr;': '\U0001d52b',
1245 | 'ngE;': '\u2267\u0338',
1246 | 'nge;': '\u2271',
1247 | 'ngeq;': '\u2271',
1248 | 'ngeqq;': '\u2267\u0338',
1249 | 'ngeqslant;': '\u2a7e\u0338',
1250 | 'nges;': '\u2a7e\u0338',
1251 | 'nGg;': '\u22d9\u0338',
1252 | 'ngsim;': '\u2275',
1253 | 'nGt;': '\u226b\u20d2',
1254 | 'ngt;': '\u226f',
1255 | 'ngtr;': '\u226f',
1256 | 'nGtv;': '\u226b\u0338',
1257 | 'nhArr;': '\u21ce',
1258 | 'nharr;': '\u21ae',
1259 | 'nhpar;': '\u2af2',
1260 | 'ni;': '\u220b',
1261 | 'nis;': '\u22fc',
1262 | 'nisd;': '\u22fa',
1263 | 'niv;': '\u220b',
1264 | 'NJcy;': '\u040a',
1265 | 'njcy;': '\u045a',
1266 | 'nlArr;': '\u21cd',
1267 | 'nlarr;': '\u219a',
1268 | 'nldr;': '\u2025',
1269 | 'nlE;': '\u2266\u0338',
1270 | 'nle;': '\u2270',
1271 | 'nLeftarrow;': '\u21cd',
1272 | 'nleftarrow;': '\u219a',
1273 | 'nLeftrightarrow;': '\u21ce',
1274 | 'nleftrightarrow;': '\u21ae',
1275 | 'nleq;': '\u2270',
1276 | 'nleqq;': '\u2266\u0338',
1277 | 'nleqslant;': '\u2a7d\u0338',
1278 | 'nles;': '\u2a7d\u0338',
1279 | 'nless;': '\u226e',
1280 | 'nLl;': '\u22d8\u0338',
1281 | 'nlsim;': '\u2274',
1282 | 'nLt;': '\u226a\u20d2',
1283 | 'nlt;': '\u226e',
1284 | 'nltri;': '\u22ea',
1285 | 'nltrie;': '\u22ec',
1286 | 'nLtv;': '\u226a\u0338',
1287 | 'nmid;': '\u2224',
1288 | 'NoBreak;': '\u2060',
1289 | 'NonBreakingSpace;': '\xa0',
1290 | 'Nopf;': '\u2115',
1291 | 'nopf;': '\U0001d55f',
1292 | 'not': '\xac',
1293 | 'Not;': '\u2aec',
1294 | 'not;': '\xac',
1295 | 'NotCongruent;': '\u2262',
1296 | 'NotCupCap;': '\u226d',
1297 | 'NotDoubleVerticalBar;': '\u2226',
1298 | 'NotElement;': '\u2209',
1299 | 'NotEqual;': '\u2260',
1300 | 'NotEqualTilde;': '\u2242\u0338',
1301 | 'NotExists;': '\u2204',
1302 | 'NotGreater;': '\u226f',
1303 | 'NotGreaterEqual;': '\u2271',
1304 | 'NotGreaterFullEqual;': '\u2267\u0338',
1305 | 'NotGreaterGreater;': '\u226b\u0338',
1306 | 'NotGreaterLess;': '\u2279',
1307 | 'NotGreaterSlantEqual;': '\u2a7e\u0338',
1308 | 'NotGreaterTilde;': '\u2275',
1309 | 'NotHumpDownHump;': '\u224e\u0338',
1310 | 'NotHumpEqual;': '\u224f\u0338',
1311 | 'notin;': '\u2209',
1312 | 'notindot;': '\u22f5\u0338',
1313 | 'notinE;': '\u22f9\u0338',
1314 | 'notinva;': '\u2209',
1315 | 'notinvb;': '\u22f7',
1316 | 'notinvc;': '\u22f6',
1317 | 'NotLeftTriangle;': '\u22ea',
1318 | 'NotLeftTriangleBar;': '\u29cf\u0338',
1319 | 'NotLeftTriangleEqual;': '\u22ec',
1320 | 'NotLess;': '\u226e',
1321 | 'NotLessEqual;': '\u2270',
1322 | 'NotLessGreater;': '\u2278',
1323 | 'NotLessLess;': '\u226a\u0338',
1324 | 'NotLessSlantEqual;': '\u2a7d\u0338',
1325 | 'NotLessTilde;': '\u2274',
1326 | 'NotNestedGreaterGreater;': '\u2aa2\u0338',
1327 | 'NotNestedLessLess;': '\u2aa1\u0338',
1328 | 'notni;': '\u220c',
1329 | 'notniva;': '\u220c',
1330 | 'notnivb;': '\u22fe',
1331 | 'notnivc;': '\u22fd',
1332 | 'NotPrecedes;': '\u2280',
1333 | 'NotPrecedesEqual;': '\u2aaf\u0338',
1334 | 'NotPrecedesSlantEqual;': '\u22e0',
1335 | 'NotReverseElement;': '\u220c',
1336 | 'NotRightTriangle;': '\u22eb',
1337 | 'NotRightTriangleBar;': '\u29d0\u0338',
1338 | 'NotRightTriangleEqual;': '\u22ed',
1339 | 'NotSquareSubset;': '\u228f\u0338',
1340 | 'NotSquareSubsetEqual;': '\u22e2',
1341 | 'NotSquareSuperset;': '\u2290\u0338',
1342 | 'NotSquareSupersetEqual;': '\u22e3',
1343 | 'NotSubset;': '\u2282\u20d2',
1344 | 'NotSubsetEqual;': '\u2288',
1345 | 'NotSucceeds;': '\u2281',
1346 | 'NotSucceedsEqual;': '\u2ab0\u0338',
1347 | 'NotSucceedsSlantEqual;': '\u22e1',
1348 | 'NotSucceedsTilde;': '\u227f\u0338',
1349 | 'NotSuperset;': '\u2283\u20d2',
1350 | 'NotSupersetEqual;': '\u2289',
1351 | 'NotTilde;': '\u2241',
1352 | 'NotTildeEqual;': '\u2244',
1353 | 'NotTildeFullEqual;': '\u2247',
1354 | 'NotTildeTilde;': '\u2249',
1355 | 'NotVerticalBar;': '\u2224',
1356 | 'npar;': '\u2226',
1357 | 'nparallel;': '\u2226',
1358 | 'nparsl;': '\u2afd\u20e5',
1359 | 'npart;': '\u2202\u0338',
1360 | 'npolint;': '\u2a14',
1361 | 'npr;': '\u2280',
1362 | 'nprcue;': '\u22e0',
1363 | 'npre;': '\u2aaf\u0338',
1364 | 'nprec;': '\u2280',
1365 | 'npreceq;': '\u2aaf\u0338',
1366 | 'nrArr;': '\u21cf',
1367 | 'nrarr;': '\u219b',
1368 | 'nrarrc;': '\u2933\u0338',
1369 | 'nrarrw;': '\u219d\u0338',
1370 | 'nRightarrow;': '\u21cf',
1371 | 'nrightarrow;': '\u219b',
1372 | 'nrtri;': '\u22eb',
1373 | 'nrtrie;': '\u22ed',
1374 | 'nsc;': '\u2281',
1375 | 'nsccue;': '\u22e1',
1376 | 'nsce;': '\u2ab0\u0338',
1377 | 'Nscr;': '\U0001d4a9',
1378 | 'nscr;': '\U0001d4c3',
1379 | 'nshortmid;': '\u2224',
1380 | 'nshortparallel;': '\u2226',
1381 | 'nsim;': '\u2241',
1382 | 'nsime;': '\u2244',
1383 | 'nsimeq;': '\u2244',
1384 | 'nsmid;': '\u2224',
1385 | 'nspar;': '\u2226',
1386 | 'nsqsube;': '\u22e2',
1387 | 'nsqsupe;': '\u22e3',
1388 | 'nsub;': '\u2284',
1389 | 'nsubE;': '\u2ac5\u0338',
1390 | 'nsube;': '\u2288',
1391 | 'nsubset;': '\u2282\u20d2',
1392 | 'nsubseteq;': '\u2288',
1393 | 'nsubseteqq;': '\u2ac5\u0338',
1394 | 'nsucc;': '\u2281',
1395 | 'nsucceq;': '\u2ab0\u0338',
1396 | 'nsup;': '\u2285',
1397 | 'nsupE;': '\u2ac6\u0338',
1398 | 'nsupe;': '\u2289',
1399 | 'nsupset;': '\u2283\u20d2',
1400 | 'nsupseteq;': '\u2289',
1401 | 'nsupseteqq;': '\u2ac6\u0338',
1402 | 'ntgl;': '\u2279',
1403 | 'Ntilde': '\xd1',
1404 | 'ntilde': '\xf1',
1405 | 'Ntilde;': '\xd1',
1406 | 'ntilde;': '\xf1',
1407 | 'ntlg;': '\u2278',
1408 | 'ntriangleleft;': '\u22ea',
1409 | 'ntrianglelefteq;': '\u22ec',
1410 | 'ntriangleright;': '\u22eb',
1411 | 'ntrianglerighteq;': '\u22ed',
1412 | 'Nu;': '\u039d',
1413 | 'nu;': '\u03bd',
1414 | 'num;': '#',
1415 | 'numero;': '\u2116',
1416 | 'numsp;': '\u2007',
1417 | 'nvap;': '\u224d\u20d2',
1418 | 'nVDash;': '\u22af',
1419 | 'nVdash;': '\u22ae',
1420 | 'nvDash;': '\u22ad',
1421 | 'nvdash;': '\u22ac',
1422 | 'nvge;': '\u2265\u20d2',
1423 | 'nvgt;': '>\u20d2',
1424 | 'nvHarr;': '\u2904',
1425 | 'nvinfin;': '\u29de',
1426 | 'nvlArr;': '\u2902',
1427 | 'nvle;': '\u2264\u20d2',
1428 | 'nvlt;': '<\u20d2',
1429 | 'nvltrie;': '\u22b4\u20d2',
1430 | 'nvrArr;': '\u2903',
1431 | 'nvrtrie;': '\u22b5\u20d2',
1432 | 'nvsim;': '\u223c\u20d2',
1433 | 'nwarhk;': '\u2923',
1434 | 'nwArr;': '\u21d6',
1435 | 'nwarr;': '\u2196',
1436 | 'nwarrow;': '\u2196',
1437 | 'nwnear;': '\u2927',
1438 | 'Oacute': '\xd3',
1439 | 'oacute': '\xf3',
1440 | 'Oacute;': '\xd3',
1441 | 'oacute;': '\xf3',
1442 | 'oast;': '\u229b',
1443 | 'ocir;': '\u229a',
1444 | 'Ocirc': '\xd4',
1445 | 'ocirc': '\xf4',
1446 | 'Ocirc;': '\xd4',
1447 | 'ocirc;': '\xf4',
1448 | 'Ocy;': '\u041e',
1449 | 'ocy;': '\u043e',
1450 | 'odash;': '\u229d',
1451 | 'Odblac;': '\u0150',
1452 | 'odblac;': '\u0151',
1453 | 'odiv;': '\u2a38',
1454 | 'odot;': '\u2299',
1455 | 'odsold;': '\u29bc',
1456 | 'OElig;': '\u0152',
1457 | 'oelig;': '\u0153',
1458 | 'ofcir;': '\u29bf',
1459 | 'Ofr;': '\U0001d512',
1460 | 'ofr;': '\U0001d52c',
1461 | 'ogon;': '\u02db',
1462 | 'Ograve': '\xd2',
1463 | 'ograve': '\xf2',
1464 | 'Ograve;': '\xd2',
1465 | 'ograve;': '\xf2',
1466 | 'ogt;': '\u29c1',
1467 | 'ohbar;': '\u29b5',
1468 | 'ohm;': '\u03a9',
1469 | 'oint;': '\u222e',
1470 | 'olarr;': '\u21ba',
1471 | 'olcir;': '\u29be',
1472 | 'olcross;': '\u29bb',
1473 | 'oline;': '\u203e',
1474 | 'olt;': '\u29c0',
1475 | 'Omacr;': '\u014c',
1476 | 'omacr;': '\u014d',
1477 | 'Omega;': '\u03a9',
1478 | 'omega;': '\u03c9',
1479 | 'Omicron;': '\u039f',
1480 | 'omicron;': '\u03bf',
1481 | 'omid;': '\u29b6',
1482 | 'ominus;': '\u2296',
1483 | 'Oopf;': '\U0001d546',
1484 | 'oopf;': '\U0001d560',
1485 | 'opar;': '\u29b7',
1486 | 'OpenCurlyDoubleQuote;': '\u201c',
1487 | 'OpenCurlyQuote;': '\u2018',
1488 | 'operp;': '\u29b9',
1489 | 'oplus;': '\u2295',
1490 | 'Or;': '\u2a54',
1491 | 'or;': '\u2228',
1492 | 'orarr;': '\u21bb',
1493 | 'ord;': '\u2a5d',
1494 | 'order;': '\u2134',
1495 | 'orderof;': '\u2134',
1496 | 'ordf': '\xaa',
1497 | 'ordf;': '\xaa',
1498 | 'ordm': '\xba',
1499 | 'ordm;': '\xba',
1500 | 'origof;': '\u22b6',
1501 | 'oror;': '\u2a56',
1502 | 'orslope;': '\u2a57',
1503 | 'orv;': '\u2a5b',
1504 | 'oS;': '\u24c8',
1505 | 'Oscr;': '\U0001d4aa',
1506 | 'oscr;': '\u2134',
1507 | 'Oslash': '\xd8',
1508 | 'oslash': '\xf8',
1509 | 'Oslash;': '\xd8',
1510 | 'oslash;': '\xf8',
1511 | 'osol;': '\u2298',
1512 | 'Otilde': '\xd5',
1513 | 'otilde': '\xf5',
1514 | 'Otilde;': '\xd5',
1515 | 'otilde;': '\xf5',
1516 | 'Otimes;': '\u2a37',
1517 | 'otimes;': '\u2297',
1518 | 'otimesas;': '\u2a36',
1519 | 'Ouml': '\xd6',
1520 | 'ouml': '\xf6',
1521 | 'Ouml;': '\xd6',
1522 | 'ouml;': '\xf6',
1523 | 'ovbar;': '\u233d',
1524 | 'OverBar;': '\u203e',
1525 | 'OverBrace;': '\u23de',
1526 | 'OverBracket;': '\u23b4',
1527 | 'OverParenthesis;': '\u23dc',
1528 | 'par;': '\u2225',
1529 | 'para': '\xb6',
1530 | 'para;': '\xb6',
1531 | 'parallel;': '\u2225',
1532 | 'parsim;': '\u2af3',
1533 | 'parsl;': '\u2afd',
1534 | 'part;': '\u2202',
1535 | 'PartialD;': '\u2202',
1536 | 'Pcy;': '\u041f',
1537 | 'pcy;': '\u043f',
1538 | 'percnt;': '%',
1539 | 'period;': '.',
1540 | 'permil;': '\u2030',
1541 | 'perp;': '\u22a5',
1542 | 'pertenk;': '\u2031',
1543 | 'Pfr;': '\U0001d513',
1544 | 'pfr;': '\U0001d52d',
1545 | 'Phi;': '\u03a6',
1546 | 'phi;': '\u03c6',
1547 | 'phiv;': '\u03d5',
1548 | 'phmmat;': '\u2133',
1549 | 'phone;': '\u260e',
1550 | 'Pi;': '\u03a0',
1551 | 'pi;': '\u03c0',
1552 | 'pitchfork;': '\u22d4',
1553 | 'piv;': '\u03d6',
1554 | 'planck;': '\u210f',
1555 | 'planckh;': '\u210e',
1556 | 'plankv;': '\u210f',
1557 | 'plus;': '+',
1558 | 'plusacir;': '\u2a23',
1559 | 'plusb;': '\u229e',
1560 | 'pluscir;': '\u2a22',
1561 | 'plusdo;': '\u2214',
1562 | 'plusdu;': '\u2a25',
1563 | 'pluse;': '\u2a72',
1564 | 'PlusMinus;': '\xb1',
1565 | 'plusmn': '\xb1',
1566 | 'plusmn;': '\xb1',
1567 | 'plussim;': '\u2a26',
1568 | 'plustwo;': '\u2a27',
1569 | 'pm;': '\xb1',
1570 | 'Poincareplane;': '\u210c',
1571 | 'pointint;': '\u2a15',
1572 | 'Popf;': '\u2119',
1573 | 'popf;': '\U0001d561',
1574 | 'pound': '\xa3',
1575 | 'pound;': '\xa3',
1576 | 'Pr;': '\u2abb',
1577 | 'pr;': '\u227a',
1578 | 'prap;': '\u2ab7',
1579 | 'prcue;': '\u227c',
1580 | 'prE;': '\u2ab3',
1581 | 'pre;': '\u2aaf',
1582 | 'prec;': '\u227a',
1583 | 'precapprox;': '\u2ab7',
1584 | 'preccurlyeq;': '\u227c',
1585 | 'Precedes;': '\u227a',
1586 | 'PrecedesEqual;': '\u2aaf',
1587 | 'PrecedesSlantEqual;': '\u227c',
1588 | 'PrecedesTilde;': '\u227e',
1589 | 'preceq;': '\u2aaf',
1590 | 'precnapprox;': '\u2ab9',
1591 | 'precneqq;': '\u2ab5',
1592 | 'precnsim;': '\u22e8',
1593 | 'precsim;': '\u227e',
1594 | 'Prime;': '\u2033',
1595 | 'prime;': '\u2032',
1596 | 'primes;': '\u2119',
1597 | 'prnap;': '\u2ab9',
1598 | 'prnE;': '\u2ab5',
1599 | 'prnsim;': '\u22e8',
1600 | 'prod;': '\u220f',
1601 | 'Product;': '\u220f',
1602 | 'profalar;': '\u232e',
1603 | 'profline;': '\u2312',
1604 | 'profsurf;': '\u2313',
1605 | 'prop;': '\u221d',
1606 | 'Proportion;': '\u2237',
1607 | 'Proportional;': '\u221d',
1608 | 'propto;': '\u221d',
1609 | 'prsim;': '\u227e',
1610 | 'prurel;': '\u22b0',
1611 | 'Pscr;': '\U0001d4ab',
1612 | 'pscr;': '\U0001d4c5',
1613 | 'Psi;': '\u03a8',
1614 | 'psi;': '\u03c8',
1615 | 'puncsp;': '\u2008',
1616 | 'Qfr;': '\U0001d514',
1617 | 'qfr;': '\U0001d52e',
1618 | 'qint;': '\u2a0c',
1619 | 'Qopf;': '\u211a',
1620 | 'qopf;': '\U0001d562',
1621 | 'qprime;': '\u2057',
1622 | 'Qscr;': '\U0001d4ac',
1623 | 'qscr;': '\U0001d4c6',
1624 | 'quaternions;': '\u210d',
1625 | 'quatint;': '\u2a16',
1626 | 'quest;': '?',
1627 | 'questeq;': '\u225f',
1628 | 'QUOT': '"',
1629 | 'quot': '"',
1630 | 'QUOT;': '"',
1631 | 'quot;': '"',
1632 | 'rAarr;': '\u21db',
1633 | 'race;': '\u223d\u0331',
1634 | 'Racute;': '\u0154',
1635 | 'racute;': '\u0155',
1636 | 'radic;': '\u221a',
1637 | 'raemptyv;': '\u29b3',
1638 | 'Rang;': '\u27eb',
1639 | 'rang;': '\u27e9',
1640 | 'rangd;': '\u2992',
1641 | 'range;': '\u29a5',
1642 | 'rangle;': '\u27e9',
1643 | 'raquo': '\xbb',
1644 | 'raquo;': '\xbb',
1645 | 'Rarr;': '\u21a0',
1646 | 'rArr;': '\u21d2',
1647 | 'rarr;': '\u2192',
1648 | 'rarrap;': '\u2975',
1649 | 'rarrb;': '\u21e5',
1650 | 'rarrbfs;': '\u2920',
1651 | 'rarrc;': '\u2933',
1652 | 'rarrfs;': '\u291e',
1653 | 'rarrhk;': '\u21aa',
1654 | 'rarrlp;': '\u21ac',
1655 | 'rarrpl;': '\u2945',
1656 | 'rarrsim;': '\u2974',
1657 | 'Rarrtl;': '\u2916',
1658 | 'rarrtl;': '\u21a3',
1659 | 'rarrw;': '\u219d',
1660 | 'rAtail;': '\u291c',
1661 | 'ratail;': '\u291a',
1662 | 'ratio;': '\u2236',
1663 | 'rationals;': '\u211a',
1664 | 'RBarr;': '\u2910',
1665 | 'rBarr;': '\u290f',
1666 | 'rbarr;': '\u290d',
1667 | 'rbbrk;': '\u2773',
1668 | 'rbrace;': '}',
1669 | 'rbrack;': ']',
1670 | 'rbrke;': '\u298c',
1671 | 'rbrksld;': '\u298e',
1672 | 'rbrkslu;': '\u2990',
1673 | 'Rcaron;': '\u0158',
1674 | 'rcaron;': '\u0159',
1675 | 'Rcedil;': '\u0156',
1676 | 'rcedil;': '\u0157',
1677 | 'rceil;': '\u2309',
1678 | 'rcub;': '}',
1679 | 'Rcy;': '\u0420',
1680 | 'rcy;': '\u0440',
1681 | 'rdca;': '\u2937',
1682 | 'rdldhar;': '\u2969',
1683 | 'rdquo;': '\u201d',
1684 | 'rdquor;': '\u201d',
1685 | 'rdsh;': '\u21b3',
1686 | 'Re;': '\u211c',
1687 | 'real;': '\u211c',
1688 | 'realine;': '\u211b',
1689 | 'realpart;': '\u211c',
1690 | 'reals;': '\u211d',
1691 | 'rect;': '\u25ad',
1692 | 'REG': '\xae',
1693 | 'reg': '\xae',
1694 | 'REG;': '\xae',
1695 | 'reg;': '\xae',
1696 | 'ReverseElement;': '\u220b',
1697 | 'ReverseEquilibrium;': '\u21cb',
1698 | 'ReverseUpEquilibrium;': '\u296f',
1699 | 'rfisht;': '\u297d',
1700 | 'rfloor;': '\u230b',
1701 | 'Rfr;': '\u211c',
1702 | 'rfr;': '\U0001d52f',
1703 | 'rHar;': '\u2964',
1704 | 'rhard;': '\u21c1',
1705 | 'rharu;': '\u21c0',
1706 | 'rharul;': '\u296c',
1707 | 'Rho;': '\u03a1',
1708 | 'rho;': '\u03c1',
1709 | 'rhov;': '\u03f1',
1710 | 'RightAngleBracket;': '\u27e9',
1711 | 'RightArrow;': '\u2192',
1712 | 'Rightarrow;': '\u21d2',
1713 | 'rightarrow;': '\u2192',
1714 | 'RightArrowBar;': '\u21e5',
1715 | 'RightArrowLeftArrow;': '\u21c4',
1716 | 'rightarrowtail;': '\u21a3',
1717 | 'RightCeiling;': '\u2309',
1718 | 'RightDoubleBracket;': '\u27e7',
1719 | 'RightDownTeeVector;': '\u295d',
1720 | 'RightDownVector;': '\u21c2',
1721 | 'RightDownVectorBar;': '\u2955',
1722 | 'RightFloor;': '\u230b',
1723 | 'rightharpoondown;': '\u21c1',
1724 | 'rightharpoonup;': '\u21c0',
1725 | 'rightleftarrows;': '\u21c4',
1726 | 'rightleftharpoons;': '\u21cc',
1727 | 'rightrightarrows;': '\u21c9',
1728 | 'rightsquigarrow;': '\u219d',
1729 | 'RightTee;': '\u22a2',
1730 | 'RightTeeArrow;': '\u21a6',
1731 | 'RightTeeVector;': '\u295b',
1732 | 'rightthreetimes;': '\u22cc',
1733 | 'RightTriangle;': '\u22b3',
1734 | 'RightTriangleBar;': '\u29d0',
1735 | 'RightTriangleEqual;': '\u22b5',
1736 | 'RightUpDownVector;': '\u294f',
1737 | 'RightUpTeeVector;': '\u295c',
1738 | 'RightUpVector;': '\u21be',
1739 | 'RightUpVectorBar;': '\u2954',
1740 | 'RightVector;': '\u21c0',
1741 | 'RightVectorBar;': '\u2953',
1742 | 'ring;': '\u02da',
1743 | 'risingdotseq;': '\u2253',
1744 | 'rlarr;': '\u21c4',
1745 | 'rlhar;': '\u21cc',
1746 | 'rlm;': '\u200f',
1747 | 'rmoust;': '\u23b1',
1748 | 'rmoustache;': '\u23b1',
1749 | 'rnmid;': '\u2aee',
1750 | 'roang;': '\u27ed',
1751 | 'roarr;': '\u21fe',
1752 | 'robrk;': '\u27e7',
1753 | 'ropar;': '\u2986',
1754 | 'Ropf;': '\u211d',
1755 | 'ropf;': '\U0001d563',
1756 | 'roplus;': '\u2a2e',
1757 | 'rotimes;': '\u2a35',
1758 | 'RoundImplies;': '\u2970',
1759 | 'rpar;': ')',
1760 | 'rpargt;': '\u2994',
1761 | 'rppolint;': '\u2a12',
1762 | 'rrarr;': '\u21c9',
1763 | 'Rrightarrow;': '\u21db',
1764 | 'rsaquo;': '\u203a',
1765 | 'Rscr;': '\u211b',
1766 | 'rscr;': '\U0001d4c7',
1767 | 'Rsh;': '\u21b1',
1768 | 'rsh;': '\u21b1',
1769 | 'rsqb;': ']',
1770 | 'rsquo;': '\u2019',
1771 | 'rsquor;': '\u2019',
1772 | 'rthree;': '\u22cc',
1773 | 'rtimes;': '\u22ca',
1774 | 'rtri;': '\u25b9',
1775 | 'rtrie;': '\u22b5',
1776 | 'rtrif;': '\u25b8',
1777 | 'rtriltri;': '\u29ce',
1778 | 'RuleDelayed;': '\u29f4',
1779 | 'ruluhar;': '\u2968',
1780 | 'rx;': '\u211e',
1781 | 'Sacute;': '\u015a',
1782 | 'sacute;': '\u015b',
1783 | 'sbquo;': '\u201a',
1784 | 'Sc;': '\u2abc',
1785 | 'sc;': '\u227b',
1786 | 'scap;': '\u2ab8',
1787 | 'Scaron;': '\u0160',
1788 | 'scaron;': '\u0161',
1789 | 'sccue;': '\u227d',
1790 | 'scE;': '\u2ab4',
1791 | 'sce;': '\u2ab0',
1792 | 'Scedil;': '\u015e',
1793 | 'scedil;': '\u015f',
1794 | 'Scirc;': '\u015c',
1795 | 'scirc;': '\u015d',
1796 | 'scnap;': '\u2aba',
1797 | 'scnE;': '\u2ab6',
1798 | 'scnsim;': '\u22e9',
1799 | 'scpolint;': '\u2a13',
1800 | 'scsim;': '\u227f',
1801 | 'Scy;': '\u0421',
1802 | 'scy;': '\u0441',
1803 | 'sdot;': '\u22c5',
1804 | 'sdotb;': '\u22a1',
1805 | 'sdote;': '\u2a66',
1806 | 'searhk;': '\u2925',
1807 | 'seArr;': '\u21d8',
1808 | 'searr;': '\u2198',
1809 | 'searrow;': '\u2198',
1810 | 'sect': '\xa7',
1811 | 'sect;': '\xa7',
1812 | 'semi;': ';',
1813 | 'seswar;': '\u2929',
1814 | 'setminus;': '\u2216',
1815 | 'setmn;': '\u2216',
1816 | 'sext;': '\u2736',
1817 | 'Sfr;': '\U0001d516',
1818 | 'sfr;': '\U0001d530',
1819 | 'sfrown;': '\u2322',
1820 | 'sharp;': '\u266f',
1821 | 'SHCHcy;': '\u0429',
1822 | 'shchcy;': '\u0449',
1823 | 'SHcy;': '\u0428',
1824 | 'shcy;': '\u0448',
1825 | 'ShortDownArrow;': '\u2193',
1826 | 'ShortLeftArrow;': '\u2190',
1827 | 'shortmid;': '\u2223',
1828 | 'shortparallel;': '\u2225',
1829 | 'ShortRightArrow;': '\u2192',
1830 | 'ShortUpArrow;': '\u2191',
1831 | 'shy': '\xad',
1832 | 'shy;': '\xad',
1833 | 'Sigma;': '\u03a3',
1834 | 'sigma;': '\u03c3',
1835 | 'sigmaf;': '\u03c2',
1836 | 'sigmav;': '\u03c2',
1837 | 'sim;': '\u223c',
1838 | 'simdot;': '\u2a6a',
1839 | 'sime;': '\u2243',
1840 | 'simeq;': '\u2243',
1841 | 'simg;': '\u2a9e',
1842 | 'simgE;': '\u2aa0',
1843 | 'siml;': '\u2a9d',
1844 | 'simlE;': '\u2a9f',
1845 | 'simne;': '\u2246',
1846 | 'simplus;': '\u2a24',
1847 | 'simrarr;': '\u2972',
1848 | 'slarr;': '\u2190',
1849 | 'SmallCircle;': '\u2218',
1850 | 'smallsetminus;': '\u2216',
1851 | 'smashp;': '\u2a33',
1852 | 'smeparsl;': '\u29e4',
1853 | 'smid;': '\u2223',
1854 | 'smile;': '\u2323',
1855 | 'smt;': '\u2aaa',
1856 | 'smte;': '\u2aac',
1857 | 'smtes;': '\u2aac\ufe00',
1858 | 'SOFTcy;': '\u042c',
1859 | 'softcy;': '\u044c',
1860 | 'sol;': '/',
1861 | 'solb;': '\u29c4',
1862 | 'solbar;': '\u233f',
1863 | 'Sopf;': '\U0001d54a',
1864 | 'sopf;': '\U0001d564',
1865 | 'spades;': '\u2660',
1866 | 'spadesuit;': '\u2660',
1867 | 'spar;': '\u2225',
1868 | 'sqcap;': '\u2293',
1869 | 'sqcaps;': '\u2293\ufe00',
1870 | 'sqcup;': '\u2294',
1871 | 'sqcups;': '\u2294\ufe00',
1872 | 'Sqrt;': '\u221a',
1873 | 'sqsub;': '\u228f',
1874 | 'sqsube;': '\u2291',
1875 | 'sqsubset;': '\u228f',
1876 | 'sqsubseteq;': '\u2291',
1877 | 'sqsup;': '\u2290',
1878 | 'sqsupe;': '\u2292',
1879 | 'sqsupset;': '\u2290',
1880 | 'sqsupseteq;': '\u2292',
1881 | 'squ;': '\u25a1',
1882 | 'Square;': '\u25a1',
1883 | 'square;': '\u25a1',
1884 | 'SquareIntersection;': '\u2293',
1885 | 'SquareSubset;': '\u228f',
1886 | 'SquareSubsetEqual;': '\u2291',
1887 | 'SquareSuperset;': '\u2290',
1888 | 'SquareSupersetEqual;': '\u2292',
1889 | 'SquareUnion;': '\u2294',
1890 | 'squarf;': '\u25aa',
1891 | 'squf;': '\u25aa',
1892 | 'srarr;': '\u2192',
1893 | 'Sscr;': '\U0001d4ae',
1894 | 'sscr;': '\U0001d4c8',
1895 | 'ssetmn;': '\u2216',
1896 | 'ssmile;': '\u2323',
1897 | 'sstarf;': '\u22c6',
1898 | 'Star;': '\u22c6',
1899 | 'star;': '\u2606',
1900 | 'starf;': '\u2605',
1901 | 'straightepsilon;': '\u03f5',
1902 | 'straightphi;': '\u03d5',
1903 | 'strns;': '\xaf',
1904 | 'Sub;': '\u22d0',
1905 | 'sub;': '\u2282',
1906 | 'subdot;': '\u2abd',
1907 | 'subE;': '\u2ac5',
1908 | 'sube;': '\u2286',
1909 | 'subedot;': '\u2ac3',
1910 | 'submult;': '\u2ac1',
1911 | 'subnE;': '\u2acb',
1912 | 'subne;': '\u228a',
1913 | 'subplus;': '\u2abf',
1914 | 'subrarr;': '\u2979',
1915 | 'Subset;': '\u22d0',
1916 | 'subset;': '\u2282',
1917 | 'subseteq;': '\u2286',
1918 | 'subseteqq;': '\u2ac5',
1919 | 'SubsetEqual;': '\u2286',
1920 | 'subsetneq;': '\u228a',
1921 | 'subsetneqq;': '\u2acb',
1922 | 'subsim;': '\u2ac7',
1923 | 'subsub;': '\u2ad5',
1924 | 'subsup;': '\u2ad3',
1925 | 'succ;': '\u227b',
1926 | 'succapprox;': '\u2ab8',
1927 | 'succcurlyeq;': '\u227d',
1928 | 'Succeeds;': '\u227b',
1929 | 'SucceedsEqual;': '\u2ab0',
1930 | 'SucceedsSlantEqual;': '\u227d',
1931 | 'SucceedsTilde;': '\u227f',
1932 | 'succeq;': '\u2ab0',
1933 | 'succnapprox;': '\u2aba',
1934 | 'succneqq;': '\u2ab6',
1935 | 'succnsim;': '\u22e9',
1936 | 'succsim;': '\u227f',
1937 | 'SuchThat;': '\u220b',
1938 | 'Sum;': '\u2211',
1939 | 'sum;': '\u2211',
1940 | 'sung;': '\u266a',
1941 | 'sup1': '\xb9',
1942 | 'sup1;': '\xb9',
1943 | 'sup2': '\xb2',
1944 | 'sup2;': '\xb2',
1945 | 'sup3': '\xb3',
1946 | 'sup3;': '\xb3',
1947 | 'Sup;': '\u22d1',
1948 | 'sup;': '\u2283',
1949 | 'supdot;': '\u2abe',
1950 | 'supdsub;': '\u2ad8',
1951 | 'supE;': '\u2ac6',
1952 | 'supe;': '\u2287',
1953 | 'supedot;': '\u2ac4',
1954 | 'Superset;': '\u2283',
1955 | 'SupersetEqual;': '\u2287',
1956 | 'suphsol;': '\u27c9',
1957 | 'suphsub;': '\u2ad7',
1958 | 'suplarr;': '\u297b',
1959 | 'supmult;': '\u2ac2',
1960 | 'supnE;': '\u2acc',
1961 | 'supne;': '\u228b',
1962 | 'supplus;': '\u2ac0',
1963 | 'Supset;': '\u22d1',
1964 | 'supset;': '\u2283',
1965 | 'supseteq;': '\u2287',
1966 | 'supseteqq;': '\u2ac6',
1967 | 'supsetneq;': '\u228b',
1968 | 'supsetneqq;': '\u2acc',
1969 | 'supsim;': '\u2ac8',
1970 | 'supsub;': '\u2ad4',
1971 | 'supsup;': '\u2ad6',
1972 | 'swarhk;': '\u2926',
1973 | 'swArr;': '\u21d9',
1974 | 'swarr;': '\u2199',
1975 | 'swarrow;': '\u2199',
1976 | 'swnwar;': '\u292a',
1977 | 'szlig': '\xdf',
1978 | 'szlig;': '\xdf',
1979 | 'Tab;': '\t',
1980 | 'target;': '\u2316',
1981 | 'Tau;': '\u03a4',
1982 | 'tau;': '\u03c4',
1983 | 'tbrk;': '\u23b4',
1984 | 'Tcaron;': '\u0164',
1985 | 'tcaron;': '\u0165',
1986 | 'Tcedil;': '\u0162',
1987 | 'tcedil;': '\u0163',
1988 | 'Tcy;': '\u0422',
1989 | 'tcy;': '\u0442',
1990 | 'tdot;': '\u20db',
1991 | 'telrec;': '\u2315',
1992 | 'Tfr;': '\U0001d517',
1993 | 'tfr;': '\U0001d531',
1994 | 'there4;': '\u2234',
1995 | 'Therefore;': '\u2234',
1996 | 'therefore;': '\u2234',
1997 | 'Theta;': '\u0398',
1998 | 'theta;': '\u03b8',
1999 | 'thetasym;': '\u03d1',
2000 | 'thetav;': '\u03d1',
2001 | 'thickapprox;': '\u2248',
2002 | 'thicksim;': '\u223c',
2003 | 'ThickSpace;': '\u205f\u200a',
2004 | 'thinsp;': '\u2009',
2005 | 'ThinSpace;': '\u2009',
2006 | 'thkap;': '\u2248',
2007 | 'thksim;': '\u223c',
2008 | 'THORN': '\xde',
2009 | 'thorn': '\xfe',
2010 | 'THORN;': '\xde',
2011 | 'thorn;': '\xfe',
2012 | 'Tilde;': '\u223c',
2013 | 'tilde;': '\u02dc',
2014 | 'TildeEqual;': '\u2243',
2015 | 'TildeFullEqual;': '\u2245',
2016 | 'TildeTilde;': '\u2248',
2017 | 'times': '\xd7',
2018 | 'times;': '\xd7',
2019 | 'timesb;': '\u22a0',
2020 | 'timesbar;': '\u2a31',
2021 | 'timesd;': '\u2a30',
2022 | 'tint;': '\u222d',
2023 | 'toea;': '\u2928',
2024 | 'top;': '\u22a4',
2025 | 'topbot;': '\u2336',
2026 | 'topcir;': '\u2af1',
2027 | 'Topf;': '\U0001d54b',
2028 | 'topf;': '\U0001d565',
2029 | 'topfork;': '\u2ada',
2030 | 'tosa;': '\u2929',
2031 | 'tprime;': '\u2034',
2032 | 'TRADE;': '\u2122',
2033 | 'trade;': '\u2122',
2034 | 'triangle;': '\u25b5',
2035 | 'triangledown;': '\u25bf',
2036 | 'triangleleft;': '\u25c3',
2037 | 'trianglelefteq;': '\u22b4',
2038 | 'triangleq;': '\u225c',
2039 | 'triangleright;': '\u25b9',
2040 | 'trianglerighteq;': '\u22b5',
2041 | 'tridot;': '\u25ec',
2042 | 'trie;': '\u225c',
2043 | 'triminus;': '\u2a3a',
2044 | 'TripleDot;': '\u20db',
2045 | 'triplus;': '\u2a39',
2046 | 'trisb;': '\u29cd',
2047 | 'tritime;': '\u2a3b',
2048 | 'trpezium;': '\u23e2',
2049 | 'Tscr;': '\U0001d4af',
2050 | 'tscr;': '\U0001d4c9',
2051 | 'TScy;': '\u0426',
2052 | 'tscy;': '\u0446',
2053 | 'TSHcy;': '\u040b',
2054 | 'tshcy;': '\u045b',
2055 | 'Tstrok;': '\u0166',
2056 | 'tstrok;': '\u0167',
2057 | 'twixt;': '\u226c',
2058 | 'twoheadleftarrow;': '\u219e',
2059 | 'twoheadrightarrow;': '\u21a0',
2060 | 'Uacute': '\xda',
2061 | 'uacute': '\xfa',
2062 | 'Uacute;': '\xda',
2063 | 'uacute;': '\xfa',
2064 | 'Uarr;': '\u219f',
2065 | 'uArr;': '\u21d1',
2066 | 'uarr;': '\u2191',
2067 | 'Uarrocir;': '\u2949',
2068 | 'Ubrcy;': '\u040e',
2069 | 'ubrcy;': '\u045e',
2070 | 'Ubreve;': '\u016c',
2071 | 'ubreve;': '\u016d',
2072 | 'Ucirc': '\xdb',
2073 | 'ucirc': '\xfb',
2074 | 'Ucirc;': '\xdb',
2075 | 'ucirc;': '\xfb',
2076 | 'Ucy;': '\u0423',
2077 | 'ucy;': '\u0443',
2078 | 'udarr;': '\u21c5',
2079 | 'Udblac;': '\u0170',
2080 | 'udblac;': '\u0171',
2081 | 'udhar;': '\u296e',
2082 | 'ufisht;': '\u297e',
2083 | 'Ufr;': '\U0001d518',
2084 | 'ufr;': '\U0001d532',
2085 | 'Ugrave': '\xd9',
2086 | 'ugrave': '\xf9',
2087 | 'Ugrave;': '\xd9',
2088 | 'ugrave;': '\xf9',
2089 | 'uHar;': '\u2963',
2090 | 'uharl;': '\u21bf',
2091 | 'uharr;': '\u21be',
2092 | 'uhblk;': '\u2580',
2093 | 'ulcorn;': '\u231c',
2094 | 'ulcorner;': '\u231c',
2095 | 'ulcrop;': '\u230f',
2096 | 'ultri;': '\u25f8',
2097 | 'Umacr;': '\u016a',
2098 | 'umacr;': '\u016b',
2099 | 'uml': '\xa8',
2100 | 'uml;': '\xa8',
2101 | 'UnderBar;': '_',
2102 | 'UnderBrace;': '\u23df',
2103 | 'UnderBracket;': '\u23b5',
2104 | 'UnderParenthesis;': '\u23dd',
2105 | 'Union;': '\u22c3',
2106 | 'UnionPlus;': '\u228e',
2107 | 'Uogon;': '\u0172',
2108 | 'uogon;': '\u0173',
2109 | 'Uopf;': '\U0001d54c',
2110 | 'uopf;': '\U0001d566',
2111 | 'UpArrow;': '\u2191',
2112 | 'Uparrow;': '\u21d1',
2113 | 'uparrow;': '\u2191',
2114 | 'UpArrowBar;': '\u2912',
2115 | 'UpArrowDownArrow;': '\u21c5',
2116 | 'UpDownArrow;': '\u2195',
2117 | 'Updownarrow;': '\u21d5',
2118 | 'updownarrow;': '\u2195',
2119 | 'UpEquilibrium;': '\u296e',
2120 | 'upharpoonleft;': '\u21bf',
2121 | 'upharpoonright;': '\u21be',
2122 | 'uplus;': '\u228e',
2123 | 'UpperLeftArrow;': '\u2196',
2124 | 'UpperRightArrow;': '\u2197',
2125 | 'Upsi;': '\u03d2',
2126 | 'upsi;': '\u03c5',
2127 | 'upsih;': '\u03d2',
2128 | 'Upsilon;': '\u03a5',
2129 | 'upsilon;': '\u03c5',
2130 | 'UpTee;': '\u22a5',
2131 | 'UpTeeArrow;': '\u21a5',
2132 | 'upuparrows;': '\u21c8',
2133 | 'urcorn;': '\u231d',
2134 | 'urcorner;': '\u231d',
2135 | 'urcrop;': '\u230e',
2136 | 'Uring;': '\u016e',
2137 | 'uring;': '\u016f',
2138 | 'urtri;': '\u25f9',
2139 | 'Uscr;': '\U0001d4b0',
2140 | 'uscr;': '\U0001d4ca',
2141 | 'utdot;': '\u22f0',
2142 | 'Utilde;': '\u0168',
2143 | 'utilde;': '\u0169',
2144 | 'utri;': '\u25b5',
2145 | 'utrif;': '\u25b4',
2146 | 'uuarr;': '\u21c8',
2147 | 'Uuml': '\xdc',
2148 | 'uuml': '\xfc',
2149 | 'Uuml;': '\xdc',
2150 | 'uuml;': '\xfc',
2151 | 'uwangle;': '\u29a7',
2152 | 'vangrt;': '\u299c',
2153 | 'varepsilon;': '\u03f5',
2154 | 'varkappa;': '\u03f0',
2155 | 'varnothing;': '\u2205',
2156 | 'varphi;': '\u03d5',
2157 | 'varpi;': '\u03d6',
2158 | 'varpropto;': '\u221d',
2159 | 'vArr;': '\u21d5',
2160 | 'varr;': '\u2195',
2161 | 'varrho;': '\u03f1',
2162 | 'varsigma;': '\u03c2',
2163 | 'varsubsetneq;': '\u228a\ufe00',
2164 | 'varsubsetneqq;': '\u2acb\ufe00',
2165 | 'varsupsetneq;': '\u228b\ufe00',
2166 | 'varsupsetneqq;': '\u2acc\ufe00',
2167 | 'vartheta;': '\u03d1',
2168 | 'vartriangleleft;': '\u22b2',
2169 | 'vartriangleright;': '\u22b3',
2170 | 'Vbar;': '\u2aeb',
2171 | 'vBar;': '\u2ae8',
2172 | 'vBarv;': '\u2ae9',
2173 | 'Vcy;': '\u0412',
2174 | 'vcy;': '\u0432',
2175 | 'VDash;': '\u22ab',
2176 | 'Vdash;': '\u22a9',
2177 | 'vDash;': '\u22a8',
2178 | 'vdash;': '\u22a2',
2179 | 'Vdashl;': '\u2ae6',
2180 | 'Vee;': '\u22c1',
2181 | 'vee;': '\u2228',
2182 | 'veebar;': '\u22bb',
2183 | 'veeeq;': '\u225a',
2184 | 'vellip;': '\u22ee',
2185 | 'Verbar;': '\u2016',
2186 | 'verbar;': '|',
2187 | 'Vert;': '\u2016',
2188 | 'vert;': '|',
2189 | 'VerticalBar;': '\u2223',
2190 | 'VerticalLine;': '|',
2191 | 'VerticalSeparator;': '\u2758',
2192 | 'VerticalTilde;': '\u2240',
2193 | 'VeryThinSpace;': '\u200a',
2194 | 'Vfr;': '\U0001d519',
2195 | 'vfr;': '\U0001d533',
2196 | 'vltri;': '\u22b2',
2197 | 'vnsub;': '\u2282\u20d2',
2198 | 'vnsup;': '\u2283\u20d2',
2199 | 'Vopf;': '\U0001d54d',
2200 | 'vopf;': '\U0001d567',
2201 | 'vprop;': '\u221d',
2202 | 'vrtri;': '\u22b3',
2203 | 'Vscr;': '\U0001d4b1',
2204 | 'vscr;': '\U0001d4cb',
2205 | 'vsubnE;': '\u2acb\ufe00',
2206 | 'vsubne;': '\u228a\ufe00',
2207 | 'vsupnE;': '\u2acc\ufe00',
2208 | 'vsupne;': '\u228b\ufe00',
2209 | 'Vvdash;': '\u22aa',
2210 | 'vzigzag;': '\u299a',
2211 | 'Wcirc;': '\u0174',
2212 | 'wcirc;': '\u0175',
2213 | 'wedbar;': '\u2a5f',
2214 | 'Wedge;': '\u22c0',
2215 | 'wedge;': '\u2227',
2216 | 'wedgeq;': '\u2259',
2217 | 'weierp;': '\u2118',
2218 | 'Wfr;': '\U0001d51a',
2219 | 'wfr;': '\U0001d534',
2220 | 'Wopf;': '\U0001d54e',
2221 | 'wopf;': '\U0001d568',
2222 | 'wp;': '\u2118',
2223 | 'wr;': '\u2240',
2224 | 'wreath;': '\u2240',
2225 | 'Wscr;': '\U0001d4b2',
2226 | 'wscr;': '\U0001d4cc',
2227 | 'xcap;': '\u22c2',
2228 | 'xcirc;': '\u25ef',
2229 | 'xcup;': '\u22c3',
2230 | 'xdtri;': '\u25bd',
2231 | 'Xfr;': '\U0001d51b',
2232 | 'xfr;': '\U0001d535',
2233 | 'xhArr;': '\u27fa',
2234 | 'xharr;': '\u27f7',
2235 | 'Xi;': '\u039e',
2236 | 'xi;': '\u03be',
2237 | 'xlArr;': '\u27f8',
2238 | 'xlarr;': '\u27f5',
2239 | 'xmap;': '\u27fc',
2240 | 'xnis;': '\u22fb',
2241 | 'xodot;': '\u2a00',
2242 | 'Xopf;': '\U0001d54f',
2243 | 'xopf;': '\U0001d569',
2244 | 'xoplus;': '\u2a01',
2245 | 'xotime;': '\u2a02',
2246 | 'xrArr;': '\u27f9',
2247 | 'xrarr;': '\u27f6',
2248 | 'Xscr;': '\U0001d4b3',
2249 | 'xscr;': '\U0001d4cd',
2250 | 'xsqcup;': '\u2a06',
2251 | 'xuplus;': '\u2a04',
2252 | 'xutri;': '\u25b3',
2253 | 'xvee;': '\u22c1',
2254 | 'xwedge;': '\u22c0',
2255 | 'Yacute': '\xdd',
2256 | 'yacute': '\xfd',
2257 | 'Yacute;': '\xdd',
2258 | 'yacute;': '\xfd',
2259 | 'YAcy;': '\u042f',
2260 | 'yacy;': '\u044f',
2261 | 'Ycirc;': '\u0176',
2262 | 'ycirc;': '\u0177',
2263 | 'Ycy;': '\u042b',
2264 | 'ycy;': '\u044b',
2265 | 'yen': '\xa5',
2266 | 'yen;': '\xa5',
2267 | 'Yfr;': '\U0001d51c',
2268 | 'yfr;': '\U0001d536',
2269 | 'YIcy;': '\u0407',
2270 | 'yicy;': '\u0457',
2271 | 'Yopf;': '\U0001d550',
2272 | 'yopf;': '\U0001d56a',
2273 | 'Yscr;': '\U0001d4b4',
2274 | 'yscr;': '\U0001d4ce',
2275 | 'YUcy;': '\u042e',
2276 | 'yucy;': '\u044e',
2277 | 'yuml': '\xff',
2278 | 'Yuml;': '\u0178',
2279 | 'yuml;': '\xff',
2280 | 'Zacute;': '\u0179',
2281 | 'zacute;': '\u017a',
2282 | 'Zcaron;': '\u017d',
2283 | 'zcaron;': '\u017e',
2284 | 'Zcy;': '\u0417',
2285 | 'zcy;': '\u0437',
2286 | 'Zdot;': '\u017b',
2287 | 'zdot;': '\u017c',
2288 | 'zeetrf;': '\u2128',
2289 | 'ZeroWidthSpace;': '\u200b',
2290 | 'Zeta;': '\u0396',
2291 | 'zeta;': '\u03b6',
2292 | 'Zfr;': '\u2128',
2293 | 'zfr;': '\U0001d537',
2294 | 'ZHcy;': '\u0416',
2295 | 'zhcy;': '\u0436',
2296 | 'zigrarr;': '\u21dd',
2297 | 'Zopf;': '\u2124',
2298 | 'zopf;': '\U0001d56b',
2299 | 'Zscr;': '\U0001d4b5',
2300 | 'zscr;': '\U0001d4cf',
2301 | 'zwj;': '\u200d',
2302 | 'zwnj;': '\u200c',
2303 | }
2304 |
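# Illustrative note (an editorial sketch, not part of the vendored code): the
# table above maps HTML entity names to their Unicode expansions. Names ending
# in ';' are the full HTML5 forms (e.g. 'copy;' -> '\xa9', 'gt;' -> '>'), while
# the handful of entries without the trailing ';' ('COPY', 'gt', 'nbsp', ...)
# are the legacy entities that parsers also accept unterminated, so a resolver
# can look a reference up with or without its closing semicolon.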
2305 | try:
2306 | import http.client as compat_http_client
2307 | except ImportError: # Python 2
2308 | import httplib as compat_http_client
2309 |
2310 | try:
2311 | from urllib.error import HTTPError as compat_HTTPError
2312 | except ImportError: # Python 2
2313 | from urllib2 import HTTPError as compat_HTTPError
2314 |
2315 | try:
2316 | from urllib.request import urlretrieve as compat_urlretrieve
2317 | except ImportError: # Python 2
2318 | from urllib import urlretrieve as compat_urlretrieve
2319 |
2320 | try:
2321 | from html.parser import HTMLParser as compat_HTMLParser
2322 | except ImportError: # Python 2
2323 | from HTMLParser import HTMLParser as compat_HTMLParser
2324 |
2325 | try:
2326 | from subprocess import DEVNULL
2327 | compat_subprocess_get_DEVNULL = lambda: DEVNULL
2328 | except ImportError:
2329 | compat_subprocess_get_DEVNULL = lambda: open(os.path.devnull, 'w')
2330 |
2331 | try:
2332 | import http.server as compat_http_server
2333 | except ImportError:
2334 | import BaseHTTPServer as compat_http_server
2335 |
2336 | try:
2337 | compat_str = unicode # Python 2
2338 | except NameError:
2339 | compat_str = str
2340 |
2341 | try:
2342 | from urllib.parse import unquote_to_bytes as compat_urllib_parse_unquote_to_bytes
2343 | from urllib.parse import unquote as compat_urllib_parse_unquote
2344 | from urllib.parse import unquote_plus as compat_urllib_parse_unquote_plus
2345 | except ImportError: # Python 2
2346 | _asciire = (compat_urllib_parse._asciire if hasattr(compat_urllib_parse, '_asciire')
2347 | else re.compile(r'([\x00-\x7f]+)'))
2348 |
2349 | # HACK: The following are the correct unquote_to_bytes, unquote and unquote_plus
2350 | # implementations from cpython 3.4.3's stdlib. Python 2's version
2351 | # is apparently broken (see https://github.com/rg3/youtube-dl/pull/6244)
2352 |
2353 | def compat_urllib_parse_unquote_to_bytes(string):
2354 | """unquote_to_bytes('abc%20def') -> b'abc def'."""
2355 |         # Note: strings are encoded as UTF-8. This is only an issue if the string
2356 |         # contains unescaped non-ASCII characters, which URIs should not.
2357 | if not string:
2358 | # Is it a string-like object?
2359 | string.split
2360 | return b''
2361 | if isinstance(string, compat_str):
2362 | string = string.encode('utf-8')
2363 | bits = string.split(b'%')
2364 | if len(bits) == 1:
2365 | return string
2366 | res = [bits[0]]
2367 | append = res.append
2368 | for item in bits[1:]:
2369 | try:
2370 | append(compat_urllib_parse._hextochr[item[:2]])
2371 | append(item[2:])
2372 | except KeyError:
2373 | append(b'%')
2374 | append(item)
2375 | return b''.join(res)
2376 |
2377 | def compat_urllib_parse_unquote(string, encoding='utf-8', errors='replace'):
2378 | """Replace %xx escapes by their single-character equivalent. The optional
2379 | encoding and errors parameters specify how to decode percent-encoded
2380 | sequences into Unicode characters, as accepted by the bytes.decode()
2381 | method.
2382 | By default, percent-encoded sequences are decoded with UTF-8, and invalid
2383 | sequences are replaced by a placeholder character.
2384 |
2385 | unquote('abc%20def') -> 'abc def'.
2386 | """
2387 | if '%' not in string:
2388 | string.split
2389 | return string
2390 | if encoding is None:
2391 | encoding = 'utf-8'
2392 | if errors is None:
2393 | errors = 'replace'
2394 | bits = _asciire.split(string)
2395 | res = [bits[0]]
2396 | append = res.append
2397 | for i in range(1, len(bits), 2):
2398 | append(compat_urllib_parse_unquote_to_bytes(bits[i]).decode(encoding, errors))
2399 | append(bits[i + 1])
2400 | return ''.join(res)
2401 |
2402 | def compat_urllib_parse_unquote_plus(string, encoding='utf-8', errors='replace'):
2403 | """Like unquote(), but also replace plus signs by spaces, as required for
2404 | unquoting HTML form values.
2405 |
2406 | unquote_plus('%7e/abc+def') -> '~/abc def'
2407 | """
2408 | string = string.replace('+', ' ')
2409 | return compat_urllib_parse_unquote(string, encoding, errors)
2410 |
2411 | try:
2412 | from urllib.parse import urlencode as compat_urllib_parse_urlencode
2413 | except ImportError: # Python 2
2414 | # Python 2 will choke in urlencode on mixture of byte and unicode strings.
2415 | # Possible solutions are to either port it from python 3 with all
2416 | # the friends or manually ensure input query contains only byte strings.
2417 | # We will stick with latter thus recursively encoding the whole query.
2418 | def compat_urllib_parse_urlencode(query, doseq=0, encoding='utf-8'):
2419 | def encode_elem(e):
2420 | if isinstance(e, dict):
2421 | e = encode_dict(e)
2422 | elif isinstance(e, (list, tuple,)):
2423 | list_e = encode_list(e)
2424 | e = tuple(list_e) if isinstance(e, tuple) else list_e
2425 | elif isinstance(e, compat_str):
2426 | e = e.encode(encoding)
2427 | return e
2428 |
2429 | def encode_dict(d):
2430 | return dict((encode_elem(k), encode_elem(v)) for k, v in d.items())
2431 |
2432 | def encode_list(l):
2433 | return [encode_elem(e) for e in l]
2434 |
2435 | return compat_urllib_parse.urlencode(encode_elem(query), doseq=doseq)
2436 |
2437 | try:
2438 | from urllib.request import DataHandler as compat_urllib_request_DataHandler
2439 | except ImportError: # Python < 3.4
2440 | # Ported from CPython 98774:1733b3bd46db, Lib/urllib/request.py
2441 | class compat_urllib_request_DataHandler(compat_urllib_request.BaseHandler):
2442 | def data_open(self, req):
2443 | # data URLs as specified in RFC 2397.
2444 | #
2445 | # ignores POSTed data
2446 | #
2447 | # syntax:
2448 | # dataurl := "data:" [ mediatype ] [ ";base64" ] "," data
2449 | # mediatype := [ type "/" subtype ] *( ";" parameter )
2450 | # data := *urlchar
2451 | # parameter := attribute "=" value
2452 | url = req.get_full_url()
2453 |
2454 | scheme, data = url.split(':', 1)
2455 | mediatype, data = data.split(',', 1)
2456 |
2457 | # even base64 encoded data URLs might be quoted so unquote in any case:
2458 | data = compat_urllib_parse_unquote_to_bytes(data)
2459 | if mediatype.endswith(';base64'):
2460 | data = binascii.a2b_base64(data)
2461 | mediatype = mediatype[:-7]
2462 |
2463 | if not mediatype:
2464 | mediatype = 'text/plain;charset=US-ASCII'
2465 |
2466 | headers = email.message_from_string(
2467 | 'Content-type: %s\nContent-length: %d\n' % (mediatype, len(data)))
2468 |
2469 | return compat_urllib_response.addinfourl(io.BytesIO(data), headers, url)
2470 |
2471 | try:
2472 | compat_basestring = basestring # Python 2
2473 | except NameError:
2474 | compat_basestring = str
2475 |
2476 | try:
2477 | compat_chr = unichr # Python 2
2478 | except NameError:
2479 | compat_chr = chr
2480 |
2481 | try:
2482 | from xml.etree.ElementTree import ParseError as compat_xml_parse_error
2483 | except ImportError: # Python 2.6
2484 | from xml.parsers.expat import ExpatError as compat_xml_parse_error
2485 |
2486 |
2487 | etree = xml.etree.ElementTree
2488 |
2489 |
2490 | class _TreeBuilder(etree.TreeBuilder):
2491 | def doctype(self, name, pubid, system):
2492 | pass
2493 |
2494 |
2495 | if sys.version_info[0] >= 3:
2496 | def compat_etree_fromstring(text):
2497 | return etree.XML(text, parser=etree.XMLParser(target=_TreeBuilder()))
2498 | else:
2499 | # python 2.x tries to encode unicode strings with ascii (see the
2500 | # XMLParser._fixtext method)
2501 | try:
2502 | _etree_iter = etree.Element.iter
2503 | except AttributeError: # Python <=2.6
2504 | def _etree_iter(root):
2505 | for el in root.findall('*'):
2506 | yield el
2507 | for sub in _etree_iter(el):
2508 | yield sub
2509 |
2510 | # on 2.6 XML doesn't have a parser argument, function copied from CPython
2511 | # 2.7 source
2512 | def _XML(text, parser=None):
2513 | if not parser:
2514 | parser = etree.XMLParser(target=_TreeBuilder())
2515 | parser.feed(text)
2516 | return parser.close()
2517 |
2518 | def _element_factory(*args, **kwargs):
2519 | el = etree.Element(*args, **kwargs)
2520 | for k, v in el.items():
2521 | if isinstance(v, bytes):
2522 | el.set(k, v.decode('utf-8'))
2523 | return el
2524 |
2525 | def compat_etree_fromstring(text):
2526 | doc = _XML(text, parser=etree.XMLParser(target=_TreeBuilder(element_factory=_element_factory)))
2527 | for el in _etree_iter(doc):
2528 | if el.text is not None and isinstance(el.text, bytes):
2529 | el.text = el.text.decode('utf-8')
2530 | return doc
2531 |
2532 | if hasattr(etree, 'register_namespace'):
2533 | compat_etree_register_namespace = etree.register_namespace
2534 | else:
2535 | def compat_etree_register_namespace(prefix, uri):
2536 | """Register a namespace prefix.
2537 | The registry is global, and any existing mapping for either the
2538 | given prefix or the namespace URI will be removed.
2539 | *prefix* is the namespace prefix, *uri* is a namespace uri. Tags and
2540 | attributes in this namespace will be serialized with prefix if possible.
2541 | ValueError is raised if prefix is reserved or is invalid.
2542 | """
2543 | if re.match(r"ns\d+$", prefix):
2544 | raise ValueError("Prefix format reserved for internal use")
2545 | for k, v in list(etree._namespace_map.items()):
2546 | if k == uri or v == prefix:
2547 | del etree._namespace_map[k]
2548 | etree._namespace_map[uri] = prefix
2549 |
2550 | if sys.version_info < (2, 7):
2551 | # Here comes the crazy part: In 2.6, if the xpath is a unicode,
2552 | # .//node does not match if a node is a direct child of . !
2553 | def compat_xpath(xpath):
2554 | if isinstance(xpath, compat_str):
2555 | xpath = xpath.encode('ascii')
2556 | return xpath
2557 | else:
2558 | compat_xpath = lambda xpath: xpath
2559 |
2560 | try:
2561 | from urllib.parse import parse_qs as compat_parse_qs
2562 | except ImportError: # Python 2
2563 | # HACK: The following is the correct parse_qs implementation from cpython 3's stdlib.
2564 | # Python 2's version is apparently totally broken
2565 |
2566 | def _parse_qsl(qs, keep_blank_values=False, strict_parsing=False,
2567 | encoding='utf-8', errors='replace'):
2568 | qs, _coerce_result = qs, compat_str
2569 | pairs = [s2 for s1 in qs.split('&') for s2 in s1.split(';')]
2570 | r = []
2571 | for name_value in pairs:
2572 | if not name_value and not strict_parsing:
2573 | continue
2574 | nv = name_value.split('=', 1)
2575 | if len(nv) != 2:
2576 | if strict_parsing:
2577 | raise ValueError('bad query field: %r' % (name_value,))
2578 | # Handle case of a control-name with no equal sign
2579 | if keep_blank_values:
2580 | nv.append('')
2581 | else:
2582 | continue
2583 | if len(nv[1]) or keep_blank_values:
2584 | name = nv[0].replace('+', ' ')
2585 | name = compat_urllib_parse_unquote(
2586 | name, encoding=encoding, errors=errors)
2587 | name = _coerce_result(name)
2588 | value = nv[1].replace('+', ' ')
2589 | value = compat_urllib_parse_unquote(
2590 | value, encoding=encoding, errors=errors)
2591 | value = _coerce_result(value)
2592 | r.append((name, value))
2593 | return r
2594 |
2595 | def compat_parse_qs(qs, keep_blank_values=False, strict_parsing=False,
2596 | encoding='utf-8', errors='replace'):
2597 | parsed_result = {}
2598 | pairs = _parse_qsl(qs, keep_blank_values, strict_parsing,
2599 | encoding=encoding, errors=errors)
2600 | for name, value in pairs:
2601 | if name in parsed_result:
2602 | parsed_result[name].append(value)
2603 | else:
2604 | parsed_result[name] = [value]
2605 | return parsed_result
2606 |
2607 | try:
2608 | from shlex import quote as compat_shlex_quote
2609 | except ImportError: # Python < 3.3
2610 | def compat_shlex_quote(s):
2611 | if re.match(r'^[-_\w./]+$', s):
2612 | return s
2613 | else:
2614 | return "'" + s.replace("'", "'\"'\"'") + "'"
2615 |
2616 |
2617 | try:
2618 | args = shlex.split('中文')
2619 | assert (isinstance(args, list) and
2620 | isinstance(args[0], compat_str) and
2621 | args[0] == '中文')
2622 | compat_shlex_split = shlex.split
2623 | except (AssertionError, UnicodeEncodeError):
2624 | # Working around shlex issue with unicode strings on some python 2
2625 | # versions (see http://bugs.python.org/issue1548891)
2626 | def compat_shlex_split(s, comments=False, posix=True):
2627 | if isinstance(s, compat_str):
2628 | s = s.encode('utf-8')
2629 | return list(map(lambda s: s.decode('utf-8'), shlex.split(s, comments, posix)))
2630 |
2631 |
2632 | def compat_ord(c):
2633 | if type(c) is int:
2634 | return c
2635 | else:
2636 | return ord(c)
2637 |
2638 |
2639 | compat_os_name = os._name if os.name == 'java' else os.name
2640 |
2641 |
2642 | if sys.version_info >= (3, 0):
2643 | compat_getenv = os.getenv
2644 | compat_expanduser = os.path.expanduser
2645 |
2646 | def compat_setenv(key, value, env=os.environ):
2647 | env[key] = value
2648 | else:
2649 | # Environment variables should be decoded with filesystem encoding.
2650 | # Otherwise it will fail if any non-ASCII characters present (see #3854 #3217 #2918)
2651 |
2652 | def compat_getenv(key, default=None):
2653 | from .utils import get_filesystem_encoding
2654 | env = os.getenv(key, default)
2655 | if env:
2656 | env = env.decode(get_filesystem_encoding())
2657 | return env
2658 |
2659 | def compat_setenv(key, value, env=os.environ):
2660 | def encode(v):
2661 | from .utils import get_filesystem_encoding
2662 | return v.encode(get_filesystem_encoding()) if isinstance(v, compat_str) else v
2663 | env[encode(key)] = encode(value)
2664 |
2665 | # HACK: The default implementations of os.path.expanduser from cpython do not decode
2666 | # environment variables with filesystem encoding. We will work around this by
2667 | # providing adjusted implementations.
2668 | # The following are os.path.expanduser implementations from cpython 2.7.8 stdlib
2669 | # for different platforms with correct environment variables decoding.
2670 |
2671 | if compat_os_name == 'posix':
2672 | def compat_expanduser(path):
2673 | """Expand ~ and ~user constructions. If user or $HOME is unknown,
2674 | do nothing."""
2675 | if not path.startswith('~'):
2676 | return path
2677 | i = path.find('/', 1)
2678 | if i < 0:
2679 | i = len(path)
2680 | if i == 1:
2681 | if 'HOME' not in os.environ:
2682 | import pwd
2683 | userhome = pwd.getpwuid(os.getuid()).pw_dir
2684 | else:
2685 | userhome = compat_getenv('HOME')
2686 | else:
2687 | import pwd
2688 | try:
2689 | pwent = pwd.getpwnam(path[1:i])
2690 | except KeyError:
2691 | return path
2692 | userhome = pwent.pw_dir
2693 | userhome = userhome.rstrip('/')
2694 | return (userhome + path[i:]) or '/'
2695 | elif compat_os_name == 'nt' or compat_os_name == 'ce':
2696 | def compat_expanduser(path):
2697 | """Expand ~ and ~user constructs.
2698 |
2699 | If user or $HOME is unknown, do nothing."""
2700 | if path[:1] != '~':
2701 | return path
2702 | i, n = 1, len(path)
2703 | while i < n and path[i] not in '/\\':
2704 | i = i + 1
2705 |
2706 | if 'HOME' in os.environ:
2707 | userhome = compat_getenv('HOME')
2708 | elif 'USERPROFILE' in os.environ:
2709 | userhome = compat_getenv('USERPROFILE')
2710 | elif 'HOMEPATH' not in os.environ:
2711 | return path
2712 | else:
2713 | try:
2714 | drive = compat_getenv('HOMEDRIVE')
2715 | except KeyError:
2716 | drive = ''
2717 | userhome = os.path.join(drive, compat_getenv('HOMEPATH'))
2718 |
2719 | if i != 1: # ~user
2720 | userhome = os.path.join(os.path.dirname(userhome), path[1:i])
2721 |
2722 | return userhome + path[i:]
2723 | else:
2724 | compat_expanduser = os.path.expanduser
2725 |
2726 |
2727 | if sys.version_info < (3, 0):
2728 | def compat_print(s):
2729 | from .utils import preferredencoding
2730 | print(s.encode(preferredencoding(), 'xmlcharrefreplace'))
2731 | else:
2732 | def compat_print(s):
2733 | assert isinstance(s, compat_str)
2734 | print(s)
2735 |
2736 |
2737 | if sys.version_info < (3, 0) and sys.platform == 'win32':
2738 | def compat_getpass(prompt, *args, **kwargs):
2739 | if isinstance(prompt, compat_str):
2740 | from .utils import preferredencoding
2741 | prompt = prompt.encode(preferredencoding())
2742 | return getpass.getpass(prompt, *args, **kwargs)
2743 | else:
2744 | compat_getpass = getpass.getpass
2745 |
2746 | try:
2747 | compat_input = raw_input
2748 | except NameError: # Python 3
2749 | compat_input = input
2750 |
2751 | # Python < 2.6.5 require kwargs to be bytes
2752 | try:
2753 | def _testfunc(x):
2754 | pass
2755 | _testfunc(**{'x': 0})
2756 | except TypeError:
2757 | def compat_kwargs(kwargs):
2758 | return dict((bytes(k), v) for k, v in kwargs.items())
2759 | else:
2760 | compat_kwargs = lambda kwargs: kwargs
2761 |
2762 |
2763 | compat_numeric_types = ((int, float, long, complex) if sys.version_info[0] < 3
2764 | else (int, float, complex))
2765 |
2766 |
2767 | if sys.version_info < (2, 7):
2768 | def compat_socket_create_connection(address, timeout, source_address=None):
2769 | host, port = address
2770 | err = None
2771 | for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
2772 | af, socktype, proto, canonname, sa = res
2773 | sock = None
2774 | try:
2775 | sock = socket.socket(af, socktype, proto)
2776 | sock.settimeout(timeout)
2777 | if source_address:
2778 | sock.bind(source_address)
2779 | sock.connect(sa)
2780 | return sock
2781 | except socket.error as _:
2782 | err = _
2783 | if sock is not None:
2784 | sock.close()
2785 | if err is not None:
2786 | raise err
2787 | else:
2788 | raise socket.error('getaddrinfo returns an empty list')
2789 | else:
2790 | compat_socket_create_connection = socket.create_connection
2791 |
2792 |
2793 | # Fix https://github.com/rg3/youtube-dl/issues/4223
2794 | # See http://bugs.python.org/issue9161 for what is broken
2795 | def workaround_optparse_bug9161():
2796 | op = optparse.OptionParser()
2797 | og = optparse.OptionGroup(op, 'foo')
2798 | try:
2799 | og.add_option('-t')
2800 | except TypeError:
2801 | real_add_option = optparse.OptionGroup.add_option
2802 |
2803 | def _compat_add_option(self, *args, **kwargs):
2804 | enc = lambda v: (
2805 | v.encode('ascii', 'replace') if isinstance(v, compat_str)
2806 | else v)
2807 | bargs = [enc(a) for a in args]
2808 | bkwargs = dict(
2809 | (k, enc(v)) for k, v in kwargs.items())
2810 | return real_add_option(self, *bargs, **bkwargs)
2811 | optparse.OptionGroup.add_option = _compat_add_option
2812 |
2813 |
2814 | if hasattr(shutil, 'get_terminal_size'): # Python >= 3.3
2815 | compat_get_terminal_size = shutil.get_terminal_size
2816 | else:
2817 | _terminal_size = collections.namedtuple('terminal_size', ['columns', 'lines'])
2818 |
2819 | def compat_get_terminal_size(fallback=(80, 24)):
2820 | columns = compat_getenv('COLUMNS')
2821 | if columns:
2822 | columns = int(columns)
2823 | else:
2824 | columns = None
2825 | lines = compat_getenv('LINES')
2826 | if lines:
2827 | lines = int(lines)
2828 | else:
2829 | lines = None
2830 |
2831 | if columns is None or lines is None or columns <= 0 or lines <= 0:
2832 | try:
2833 | sp = subprocess.Popen(
2834 | ['stty', 'size'],
2835 | stdout=subprocess.PIPE, stderr=subprocess.PIPE)
2836 | out, err = sp.communicate()
2837 | _lines, _columns = map(int, out.split())
2838 | except Exception:
2839 | _columns, _lines = _terminal_size(*fallback)
2840 |
2841 | if columns is None or columns <= 0:
2842 | columns = _columns
2843 | if lines is None or lines <= 0:
2844 | lines = _lines
2845 | return _terminal_size(columns, lines)
2846 |
2847 | try:
2848 | itertools.count(start=0, step=1)
2849 | compat_itertools_count = itertools.count
2850 | except TypeError: # Python 2.6
2851 | def compat_itertools_count(start=0, step=1):
2852 | n = start
2853 | while True:
2854 | yield n
2855 | n += step
2856 |
2857 | if sys.version_info >= (3, 0):
2858 | from tokenize import tokenize as compat_tokenize_tokenize
2859 | else:
2860 | from tokenize import generate_tokens as compat_tokenize_tokenize
2861 |
2862 |
2863 | try:
2864 | struct.pack('!I', 0)
2865 | except TypeError:
2866 | # In Python 2.6 and 2.7.x < 2.7.7, struct requires a bytes argument
2867 | # See https://bugs.python.org/issue19099
2868 | def compat_struct_pack(spec, *args):
2869 | if isinstance(spec, compat_str):
2870 | spec = spec.encode('ascii')
2871 | return struct.pack(spec, *args)
2872 |
2873 | def compat_struct_unpack(spec, *args):
2874 | if isinstance(spec, compat_str):
2875 | spec = spec.encode('ascii')
2876 | return struct.unpack(spec, *args)
2877 | else:
2878 | compat_struct_pack = struct.pack
2879 | compat_struct_unpack = struct.unpack
2880 |
2881 |
2882 | __all__ = [
2883 | 'compat_HTMLParser',
2884 | 'compat_HTTPError',
2885 | 'compat_basestring',
2886 | 'compat_chr',
2887 | 'compat_cookiejar',
2888 | 'compat_cookies',
2889 | 'compat_etree_fromstring',
2890 | 'compat_etree_register_namespace',
2891 | 'compat_expanduser',
2892 | 'compat_get_terminal_size',
2893 | 'compat_getenv',
2894 | 'compat_getpass',
2895 | 'compat_html_entities',
2896 | 'compat_html_entities_html5',
2897 | 'compat_http_client',
2898 | 'compat_http_server',
2899 | 'compat_input',
2900 | 'compat_itertools_count',
2901 | 'compat_kwargs',
2902 | 'compat_numeric_types',
2903 | 'compat_ord',
2904 | 'compat_os_name',
2905 | 'compat_parse_qs',
2906 | 'compat_print',
2907 | 'compat_setenv',
2908 | 'compat_shlex_quote',
2909 | 'compat_shlex_split',
2910 | 'compat_socket_create_connection',
2911 | 'compat_str',
2912 | 'compat_struct_pack',
2913 | 'compat_struct_unpack',
2914 | 'compat_subprocess_get_DEVNULL',
2915 | 'compat_tokenize_tokenize',
2916 | 'compat_urllib_error',
2917 | 'compat_urllib_parse',
2918 | 'compat_urllib_parse_unquote',
2919 | 'compat_urllib_parse_unquote_plus',
2920 | 'compat_urllib_parse_unquote_to_bytes',
2921 | 'compat_urllib_parse_urlencode',
2922 | 'compat_urllib_parse_urlparse',
2923 | 'compat_urllib_request',
2924 | 'compat_urllib_request_DataHandler',
2925 | 'compat_urllib_response',
2926 | 'compat_urlparse',
2927 | 'compat_urlretrieve',
2928 | 'compat_xml_parse_error',
2929 | 'compat_xpath',
2930 | 'workaround_optparse_bug9161',
2931 | ]
--------------------------------------------------------------------------------
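A minimal usage sketch for the compat shims above, assuming the absolute import path used elsewhere in the repository (anime_dl.external.compat); the query string is purely illustrative:

    from anime_dl.external.compat import (
        compat_parse_qs,
        compat_str,
        compat_urllib_parse_unquote,
    )

    # The same calls behave the same way on Python 2 and Python 3 and always
    # hand back text (compat_str) rather than bytes.
    params = compat_parse_qs('series=Cowboy%20Bebop&ep=5&ep=6')
    assert params['series'] == ['Cowboy Bebop']
    assert params['ep'] == ['5', '6']
    assert isinstance(compat_urllib_parse_unquote('abc%20def'), compat_str)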
/anime_dl/external/socks.py:
--------------------------------------------------------------------------------
1 | # Public Domain SOCKS proxy protocol implementation
2 | # Adapted from https://gist.github.com/bluec0re/cafd3764412967417fd3
3 |
4 | from __future__ import unicode_literals
5 |
6 | # References:
7 | # SOCKS4 protocol http://www.openssh.com/txt/socks4.protocol
8 | # SOCKS4A protocol http://www.openssh.com/txt/socks4a.protocol
9 | # SOCKS5 protocol https://tools.ietf.org/html/rfc1928
10 | # SOCKS5 username/password authentication https://tools.ietf.org/html/rfc1929
11 |
12 | import collections
13 | import socket
14 |
15 | from external.compat import (
16 | compat_ord,
17 | compat_struct_pack,
18 | compat_struct_unpack,
19 | )
20 |
21 | __author__ = 'Timo Schmid '
22 |
23 | SOCKS4_VERSION = 4
24 | SOCKS4_REPLY_VERSION = 0x00
25 | # Excerpt from SOCKS4A protocol:
26 | # if the client cannot resolve the destination host's domain name to find its
27 | # IP address, it should set the first three bytes of DSTIP to NULL and the last
28 | # byte to a non-zero value.
29 | SOCKS4_DEFAULT_DSTIP = compat_struct_pack('!BBBB', 0, 0, 0, 0xFF)
30 |
31 | SOCKS5_VERSION = 5
32 | SOCKS5_USER_AUTH_VERSION = 0x01
33 | SOCKS5_USER_AUTH_SUCCESS = 0x00
34 |
35 |
36 | class Socks4Command(object):
37 | CMD_CONNECT = 0x01
38 | CMD_BIND = 0x02
39 |
40 |
41 | class Socks5Command(Socks4Command):
42 | CMD_UDP_ASSOCIATE = 0x03
43 |
44 |
45 | class Socks5Auth(object):
46 | AUTH_NONE = 0x00
47 | AUTH_GSSAPI = 0x01
48 | AUTH_USER_PASS = 0x02
49 | AUTH_NO_ACCEPTABLE = 0xFF # For server response
50 |
51 |
52 | class Socks5AddressType(object):
53 | ATYP_IPV4 = 0x01
54 | ATYP_DOMAINNAME = 0x03
55 | ATYP_IPV6 = 0x04
56 |
57 |
58 | class ProxyError(socket.error):
59 | ERR_SUCCESS = 0x00
60 |
61 | def __init__(self, code=None, msg=None):
62 | if code is not None and msg is None:
63 | msg = self.CODES.get(code) or 'unknown error'
64 | super(ProxyError, self).__init__(code, msg)
65 |
66 |
67 | class InvalidVersionError(ProxyError):
68 | def __init__(self, expected_version, got_version):
69 | msg = ('Invalid response version from server. Expected {0:02x} got '
70 | '{1:02x}'.format(expected_version, got_version))
71 | super(InvalidVersionError, self).__init__(0, msg)
72 |
73 |
74 | class Socks4Error(ProxyError):
75 | ERR_SUCCESS = 90
76 |
77 | CODES = {
78 | 91: 'request rejected or failed',
79 | 92: 'request rejected because SOCKS server cannot connect to identd on the client',
80 | 93: 'request rejected because the client program and identd report different user-ids'
81 | }
82 |
83 |
84 | class Socks5Error(ProxyError):
85 | ERR_GENERAL_FAILURE = 0x01
86 |
87 | CODES = {
88 | 0x01: 'general SOCKS server failure',
89 | 0x02: 'connection not allowed by ruleset',
90 | 0x03: 'Network unreachable',
91 | 0x04: 'Host unreachable',
92 | 0x05: 'Connection refused',
93 | 0x06: 'TTL expired',
94 | 0x07: 'Command not supported',
95 | 0x08: 'Address type not supported',
96 | 0xFE: 'unknown username or invalid password',
97 | 0xFF: 'all offered authentication methods were rejected'
98 | }
99 |
100 |
101 | class ProxyType(object):
102 | SOCKS4 = 0
103 | SOCKS4A = 1
104 | SOCKS5 = 2
105 |
106 |
107 | Proxy = collections.namedtuple('Proxy', (
108 | 'type', 'host', 'port', 'username', 'password', 'remote_dns'))
109 |
110 |
111 | class sockssocket(socket.socket):
112 | def __init__(self, *args, **kwargs):
113 | self._proxy = None
114 | super(sockssocket, self).__init__(*args, **kwargs)
115 |
116 | def setproxy(self, proxytype, addr, port, rdns=True, username=None, password=None):
117 | assert proxytype in (ProxyType.SOCKS4, ProxyType.SOCKS4A, ProxyType.SOCKS5)
118 |
119 | self._proxy = Proxy(proxytype, addr, port, username, password, rdns)
120 |
121 | def recvall(self, cnt):
122 | data = b''
123 | while len(data) < cnt:
124 | cur = self.recv(cnt - len(data))
125 | if not cur:
126 | raise EOFError('{0} bytes missing'.format(cnt - len(data)))
127 | data += cur
128 | return data
129 |
130 | def _recv_bytes(self, cnt):
131 | data = self.recvall(cnt)
132 | return compat_struct_unpack('!{0}B'.format(cnt), data)
133 |
134 | @staticmethod
135 | def _len_and_data(data):
136 | return compat_struct_pack('!B', len(data)) + data
137 |
138 | def _check_response_version(self, expected_version, got_version):
139 | if got_version != expected_version:
140 | self.close()
141 | raise InvalidVersionError(expected_version, got_version)
142 |
143 | def _resolve_address(self, destaddr, default, use_remote_dns):
144 | try:
145 | return socket.inet_aton(destaddr)
146 | except socket.error:
147 | if use_remote_dns and self._proxy.remote_dns:
148 | return default
149 | else:
150 | return socket.inet_aton(socket.gethostbyname(destaddr))
151 |
152 | def _setup_socks4(self, address, is_4a=False):
153 | destaddr, port = address
154 |
155 | ipaddr = self._resolve_address(destaddr, SOCKS4_DEFAULT_DSTIP, use_remote_dns=is_4a)
156 |
157 | packet = compat_struct_pack('!BBH', SOCKS4_VERSION, Socks4Command.CMD_CONNECT, port) + ipaddr
158 |
159 | username = (self._proxy.username or '').encode('utf-8')
160 | packet += username + b'\x00'
161 |
162 | if is_4a and self._proxy.remote_dns:
163 | packet += destaddr.encode('utf-8') + b'\x00'
164 |
165 | self.sendall(packet)
166 |
167 | version, resp_code, dstport, dsthost = compat_struct_unpack('!BBHI', self.recvall(8))
168 |
169 | self._check_response_version(SOCKS4_REPLY_VERSION, version)
170 |
171 | if resp_code != Socks4Error.ERR_SUCCESS:
172 | self.close()
173 | raise Socks4Error(resp_code)
174 |
175 | return (dsthost, dstport)
176 |
177 | def _setup_socks4a(self, address):
178 | self._setup_socks4(address, is_4a=True)
179 |
180 | def _socks5_auth(self):
181 | packet = compat_struct_pack('!B', SOCKS5_VERSION)
182 |
183 | auth_methods = [Socks5Auth.AUTH_NONE]
184 | if self._proxy.username and self._proxy.password:
185 | auth_methods.append(Socks5Auth.AUTH_USER_PASS)
186 |
187 | packet += compat_struct_pack('!B', len(auth_methods))
188 | packet += compat_struct_pack('!{0}B'.format(len(auth_methods)), *auth_methods)
189 |
190 | self.sendall(packet)
191 |
192 | version, method = self._recv_bytes(2)
193 |
194 | self._check_response_version(SOCKS5_VERSION, version)
195 |
196 | if method == Socks5Auth.AUTH_NO_ACCEPTABLE:
197 | self.close()
198 | raise Socks5Error(method)
199 |
200 | if method == Socks5Auth.AUTH_USER_PASS:
201 | username = self._proxy.username.encode('utf-8')
202 | password = self._proxy.password.encode('utf-8')
203 | packet = compat_struct_pack('!B', SOCKS5_USER_AUTH_VERSION)
204 | packet += self._len_and_data(username) + self._len_and_data(password)
205 | self.sendall(packet)
206 |
207 | version, status = self._recv_bytes(2)
208 |
209 | self._check_response_version(SOCKS5_USER_AUTH_VERSION, version)
210 |
211 | if status != SOCKS5_USER_AUTH_SUCCESS:
212 | self.close()
213 | raise Socks5Error(Socks5Error.ERR_GENERAL_FAILURE)
214 |
215 | def _setup_socks5(self, address):
216 | destaddr, port = address
217 |
218 | ipaddr = self._resolve_address(destaddr, None, use_remote_dns=True)
219 |
220 | self._socks5_auth()
221 |
222 | reserved = 0
223 | packet = compat_struct_pack('!BBB', SOCKS5_VERSION, Socks5Command.CMD_CONNECT, reserved)
224 | if ipaddr is None:
225 | destaddr = destaddr.encode('utf-8')
226 | packet += compat_struct_pack('!B', Socks5AddressType.ATYP_DOMAINNAME)
227 | packet += self._len_and_data(destaddr)
228 | else:
229 | packet += compat_struct_pack('!B', Socks5AddressType.ATYP_IPV4) + ipaddr
230 | packet += compat_struct_pack('!H', port)
231 |
232 | self.sendall(packet)
233 |
234 | version, status, reserved, atype = self._recv_bytes(4)
235 |
236 | self._check_response_version(SOCKS5_VERSION, version)
237 |
238 | if status != Socks5Error.ERR_SUCCESS:
239 | self.close()
240 | raise Socks5Error(status)
241 |
242 | if atype == Socks5AddressType.ATYP_IPV4:
243 | destaddr = self.recvall(4)
244 | elif atype == Socks5AddressType.ATYP_DOMAINNAME:
245 | alen = compat_ord(self.recv(1))
246 | destaddr = self.recvall(alen)
247 | elif atype == Socks5AddressType.ATYP_IPV6:
248 | destaddr = self.recvall(16)
249 | destport = compat_struct_unpack('!H', self.recvall(2))[0]
250 |
251 | return (destaddr, destport)
252 |
253 | def _make_proxy(self, connect_func, address):
254 | if not self._proxy:
255 | return connect_func(self, address)
256 |
257 | result = connect_func(self, (self._proxy.host, self._proxy.port))
258 | if result != 0 and result is not None:
259 | return result
260 | setup_funcs = {
261 | ProxyType.SOCKS4: self._setup_socks4,
262 | ProxyType.SOCKS4A: self._setup_socks4a,
263 | ProxyType.SOCKS5: self._setup_socks5,
264 | }
265 | setup_funcs[self._proxy.type](address)
266 | return result
267 |
268 | def connect(self, address):
269 | self._make_proxy(socket.socket.connect, address)
270 |
271 | def connect_ex(self, address):
272 | return self._make_proxy(socket.socket.connect_ex, address)
--------------------------------------------------------------------------------
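A minimal sketch of driving the sockssocket class above, assuming the anime_dl directory is on sys.path (matching socks.py's own "from external.compat import ...") and a SOCKS5 proxy listening locally on port 1080; proxy address and destination are placeholders:

    import socket
    from external.socks import ProxyType, sockssocket

    s = sockssocket(socket.AF_INET, socket.SOCK_STREAM)
    # rdns=True lets the proxy resolve the destination hostname remotely.
    s.setproxy(ProxyType.SOCKS5, '127.0.0.1', 1080, rdns=True)
    # connect() first reaches the proxy, then _make_proxy() runs the SOCKS5
    # handshake (greeting, optional auth, CONNECT) against the destination.
    s.connect(('www.crunchyroll.com', 443))
    s.close()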
/anime_dl/sites/__init__.py:
--------------------------------------------------------------------------------
1 | from . import crunchyroll
--------------------------------------------------------------------------------
/anime_dl/sites/crunchyroll.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | import logging
5 | import re
6 | import anime_dl
7 | from . import supporters
8 | import os
9 | import subprocess
10 | from glob import glob
11 | from shutil import move
12 | from sys import exit
13 |
14 | class Crunchyroll(object):
15 | def __init__(self, url, password, username, resolution, language, skipper, logger, episode_range, output):
16 | if logger == "True":
17 | logging.basicConfig(format='%(levelname)s: %(message)s', filename="Error Log.log", level=logging.DEBUG,
18 | encoding="utf-8")
19 |
20 | # Extract the language from the input URL
21 | crunchy_language = re.search(r'.+\/([a-z]{2}|[a-z]{2}-[a-z]{2})\/.+', url)
22 | if not crunchy_language:
23 | crunchy_language = "/"
24 | else:
25 | crunchy_language = crunchy_language.group(1) + "/"
26 |
27 | crunchy_show_regex = r'https?:\/\/(?:(?P<prefix>www|m)\.)?(?P<url>crunchyroll\.com(\/[a-z]{2}|\/[a-z]{2}-[a-z]{2})?\/(?!(?:news|anime-news|library|forum|launchcalendar|lineup|store|comics|freetrial|login))(?P<id>[\w\-]+))\/?(?:\?|$)'
28 | crunchy_video_regex = r'https?:\/\/(?:(?P<prefix>www|m)\.)?(?P<url>crunchyroll\.(?:com|fr)(\/[a-z]{2}|\/[a-z]{2}-[a-z]{2})?\/(?:media(?:-|\/\?id=)|[^\/]*\/[^\/?&]*?)(?P<video_id>[0-9]+))(?:[\/?&]|$)'
29 |
30 | crunchy_show = re.match(crunchy_show_regex, url)
31 | crunchy_video = re.match(crunchy_video_regex, url)
32 |
33 | login_response, cookies, token = anime_dl.common.browser_instance.login_crunchyroll(url=url,
34 | username=username[0],
35 | password=password[0],
36 | country=crunchy_language)
37 | if login_response:
38 | if crunchy_video:
39 | if skipper == "yes":
40 | self.only_subs(url=url, cookies=cookies, resolution=resolution)
41 | else:
42 | self.single_episode(url=url, cookies=cookies, token=token, resolution=resolution, output=output)
43 | elif crunchy_show:
44 | self.whole_show(url=url, cookie=cookies, token=token, language=language, resolution=resolution, skipper=skipper, episode_range=episode_range, output=output)
45 | else:
46 | print("URL does not look like a show or a video, stopping.")
47 | else:
48 | print("Failed Login!!!")
49 | exit(1)
50 |
51 | def single_episode(self, url, cookies, token, resolution, output):
52 | video_id = str(url.split('-')[-1]).replace("/", "")
53 | logging.debug("video_id : {0}".format(video_id))
54 |
55 | response, resolution_to_find, info_url = self.resolution_finder(resolution=resolution, video_id=video_id, url=url)
56 |
57 | if not response:
58 | print("No Resolution Found")
59 | exit(1)
60 |
61 | response_value, xml_page_connect, xml_cookies = anime_dl.common.browser_instance.page_downloader(url=info_url, cookies=cookies)
62 |
63 | if xml_page_connect:
64 | xml_page_connect = str(xml_page_connect)
65 | stream_exists, m3u8_file_link = self.m3u8_finder(xml_page_source=xml_page_connect)
66 |
67 | if stream_exists:
68 | anime_name, episode_number, video_resolution = self.episode_information_extractor(page_source=xml_page_connect, resolution_to_find=resolution_to_find)
69 | file_name = supporters.anime_name.crunchyroll_name(anime_name=anime_name, episode_number=episode_number, resolution=video_resolution)
70 | output_directory = supporters.path_works.path_creator(anime_name=anime_name)
71 | file_location = str(output_directory) + os.sep + str(file_name).replace(".mp4", ".mkv")
72 |
73 | if output is None or not os.path.exists(output):
74 | output = output_directory
75 |
76 | if os.path.isfile(file_location):
77 | print('[anime-dl] File Exists! Skipping {0}\n'.format(file_name))
78 | pass
79 | else:
80 | subs_downloaded = supporters.sub_fetcher.crunchyroll_subs(xml=str(xml_page_connect), episode_number=episode_number, file_name=file_name)
81 | if not subs_downloaded:
82 | pass
83 | m3u8_downloaded = self.m3u8_downloader(url=m3u8_file_link, cookies=cookies, resolution_to_find=resolution_to_find, file_name=file_name)
84 | if m3u8_downloaded:
85 | sub_files = self.sub_prepare()
86 | font_files = [os.path.realpath(font_file) for font_file in
87 | glob(str(os.getcwd()) + "/Fonts/*.*")]
88 |
89 | fonts = '--attachment-mime-type application/x-truetype-font --attach-file "' + str(
90 | '" --attachment-mime-type application/x-truetype-font --attach-file "'.join(
91 | font_files)) + '"'
92 |
93 | if len(font_files) == 0:
94 | fonts = ''
95 |
96 | is_stream_muxed = self.stream_muxing(file_name=file_name, subs_files=sub_files, fonts=fonts, output_directory=output_directory)
97 | if is_stream_muxed:
98 | is_file_moved = self.move_video_file(output_directory = output)
99 | print("Moved file to", output)
100 | if is_file_moved:
101 | is_cleaned = self.material_cleaner()
102 | if is_cleaned:
103 | print("{0} - {1} successfully downloaded.\n".format(anime_name, episode_number))
104 | else:
105 | print("Couldn't remove the leftover files.")
106 | pass
107 | else:
108 | print("Couldn't move the file.")
109 | pass
110 | else:
111 | print("Stream couldn't be muxed. Make sure MKVMERGE is in the path.")
112 | pass
113 | else:
114 | print("Couldn't download the m3u8 file.")
115 | pass
116 | else:
117 | print("Couldn't find the stream.")
118 | pass
119 | else:
120 | print("Couldn't Connect To XML Page.")
121 | pass
122 |
123 | def whole_show(self, url, cookie, token, language, resolution, skipper, episode_range, output):
124 | response, page_source, episode_list_cookies = anime_dl.common.browser_instance.page_downloader(url=url, cookies=cookie)
125 |
126 | if response:
127 | dub_list, ep_sub_list = self.episode_list_extractor(page_source=page_source, url=url)
128 | ep_sub_list = self.sub_list_editor(episode_range=episode_range, ep_sub_list=ep_sub_list)
129 |
130 | if skipper == "yes":
131 | # print("DLing everything")
132 | print("Total Subs to download : %s" % len(ep_sub_list))
133 | for episode_url in ep_sub_list[::-1]:
134 | # cookies, Token = self.webpagedownloader(url=url)
135 | # print("Sub list : %s" % sub_list)
136 | self.only_subs(url=episode_url, cookies=cookie, resolution=resolution)
137 |
138 | print("-----------------------------------------------------------")
139 | print("\n")
140 | else:
141 | if str(language).lower() in ["english", "eng", "dub"]:
142 | # If the "dub_list" is empty, that means there are no English Dubs for the show, or CR changed something.
143 | if len(dub_list) == 0:
144 | print("No English Dub Available For This Series.")
145 | print(
146 | "If you can see the Dubs, please open an Issue on https://github.com/Xonshiz/anime-dl/issues/new")
147 | exit(1)
148 | else:
149 | print("Total Episodes to download : %s" % len(dub_list))
150 | for episode_url in dub_list[::-1]:
151 | # cookies, Token = self.webpagedownloader(url=url)
152 | # print("Dub list : %s" % dub_list)
153 | try:
154 | self.single_episode(url=episode_url, cookies=cookie, token=token, resolution=resolution, output=output)
155 | except Exception as SomeError:
156 | print("Error Downloading : {0}".format(SomeError))
157 | pass
158 | print("-----------------------------------------------------------")
159 | print("\n")
160 | else:
161 | print("Total Episodes to download : %s" % len(ep_sub_list))
162 |
163 | for episode_url in ep_sub_list[::-1]:
164 | # cookies, Token = self.webpagedownloader(url=url)
165 | # print("Sub list : %s" % sub_list)
166 | try:
167 | self.single_episode(url=episode_url, cookies=cookie, token=token, resolution=resolution, output=output)
168 | except Exception as SomeError:
169 | print("Error Downloading : {0}".format(SomeError))
170 | pass
171 | print("-----------------------------------------------------------")
172 | print("\n")
173 | else:
174 | print("Couldn't connect to Crunchyroll. Failed.")
175 | exit(1)
176 |
177 | def episode_list_extractor(self, page_source, url):
178 | dub_list = []
179 | ep_sub_list = []
180 | chap_holder_div = page_source.find_all('a', {'class': 'portrait-element block-link titlefix episode'})
181 |
182 | for single_node in chap_holder_div:
183 | href_value = single_node["href"]
184 | title_value = single_node["title"]
185 | if "(Dub)" in u' '.join(title_value).encode('utf-8').strip():
186 | dub_list.append(str(url) + "/" + str(str(href_value).split("/")[-1]))
187 | else:
188 | ep_sub_list.append(str(url) + "/" + str(str(href_value).split("/")[-1]))
189 |
190 | if len(dub_list) == 0 and len(ep_sub_list) == 0:
191 | print("Could not find the show links. Report on https://github.com/Xonshiz/anime-dl/issues/new")
192 | exit(0)
193 | else:
194 | return dub_list, ep_sub_list
195 |
196 | def sub_list_editor(self, episode_range, ep_sub_list):
197 | if episode_range != "All":
198 | # -1 to shift the episode number to its index in the list. List starts from 0 xD!
199 | starting = int(str(episode_range).split("-")[0]) - 1
200 | ending = int(str(episode_range).split("-")[1])
201 | indexes = [x for x in range(starting, ending)]
202 | # [::-1] at the start reverses the list so it begins from the 1st episode; the final [::-1] reverses it back again at the end.
203 | return [ep_sub_list[::-1][x] for x in indexes][::-1]
204 | else:
205 | return ep_sub_list
206 |
207 | def episode_information_extractor(self, page_source, resolution_to_find):
208 | anime_name = re.sub(r'[^A-Za-z0-9\ \-\' \\]+', '', str(re.search(r'<series_title>(.*?)</series_title>', page_source).group(1))).title().strip()
209 | episode_number = re.search(r'<episode_number>(.*?)</episode_number>', page_source).group(1)
210 | video_width, video_height = resolution_to_find.split("x")
211 | video_resolution = str(video_width) + "x" + str(video_height)
212 |
213 | return anime_name, episode_number, video_resolution
214 |
215 | def stream_muxing(self, file_name, subs_files, fonts, output_directory):
216 | mkv_merge_command = 'mkvmerge --output "%s" ' % str(file_name).replace(".mp4", ".mkv") + '"' + str(file_name) + '" ' + ' '.join(subs_files) + ' ' + str(fonts)
217 |
218 | logging.debug("mkv_merge_command : %s", mkv_merge_command)
219 |
220 | try:
221 | subprocess.check_call(mkv_merge_command, shell=True)
222 | return True
223 | # if call:
224 | # return True
225 | # else:
226 | # return False
227 | except Exception as FileMuxingException:
228 | print("Seems like I couldn't mux the files.\n")
229 | print("Check whether the MKVMERGE.exe is in PATH or not.\n")
230 | print(str(FileMuxingException) + "\n")
231 | fallback = self.stream_not_muxed_fallback(output_directory=output_directory)
232 | return False
233 |
234 | def stream_not_muxed_fallback(self, output_directory):
235 | try:
236 | for video_file in glob("*.mp4"):
237 | try:
238 | move(video_file, output_directory)
239 | except Exception as e:
240 | print(str(e))
241 | pass
242 | for sub_files in glob("*.ass"):
243 | try:
244 | move(sub_files, output_directory)
245 | except Exception as e:
246 | print(str(e))
247 | pass
248 | return True
249 | except Exception:
250 | return False
251 |
252 | def move_video_file(self, output_directory):
253 | try:
254 | for video_file in glob("*.mkv"):
255 | try:
256 | move(video_file, output_directory)
257 | except Exception as e:
258 | # print(str(e))
259 | pass
260 | return True
261 | except Exception:
262 | exit(1)
263 |
264 | def move_subtitle_file(self, output_directory):
265 | try:
266 | for video_file in glob("*.ass"):
267 | try:
268 | move(video_file, output_directory)
269 | except Exception as e:
270 | # print(str(e))
271 | pass
272 | return True
273 | except Exception:
274 | exit(1)
275 |
276 | def material_cleaner(self):
277 | try:
278 | for video in glob("*.mp4"):
279 | os.remove(os.path.realpath(video))
280 |
281 | for sub_file_delete in glob("*.ass"):
282 | os.remove(os.path.realpath(sub_file_delete))
283 | return True
284 | except Exception:
285 | exit(1)
286 |
287 | def m3u8_downloader(self, url, cookies, resolution_to_find, file_name):
288 | response_value, m3u8_file_connect, updated_cookies = anime_dl.common.browser_instance.page_downloader(url=url, cookies=cookies)
289 | try:
290 | m3u8_file_text = None
291 |
292 | next_line_is_good = False
293 | for i, currentLine in enumerate(m3u8_file_connect.text.splitlines()):
294 | if next_line_is_good:
295 | m3u8_file_text = currentLine
296 | logging.debug("file to download : {0}".format(m3u8_file_text))
297 | break
298 | elif currentLine.startswith("#EXT-X") and resolution_to_find in currentLine:
299 | next_line_is_good = True
300 |
301 | if m3u8_file_text is None:
302 | print('Could not find the requested resolution in the m3u8 file for {0}\n'.format(file_name))
303 | exit(1)
304 |
305 | self.ffmpeg_call(m3u8_file_text, file_name)
306 | return True
307 |
308 | except Exception:
309 | print("Exception Occurred In m3u8 File.")
310 | exit(1)
311 |
312 | def sub_prepare(self):
313 | subtitles_files = []
314 | for sub_file in glob("*.ass"):
315 | if sub_file.endswith(".enUS.ass"):
316 | subtitles_files.insert(0,
317 | "--track-name 0:English_US --language 0:eng --default-track 0:yes --sub-charset 0:utf-8 " + '"' + str(
318 | os.path.realpath(sub_file)) + '" ')
319 |
320 | elif sub_file.endswith(".enGB.ass"):
321 | subtitles_files.append(
322 | "--track-name 0:English_UK --language 0:eng --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
323 | os.path.realpath(sub_file)) + '" ')
324 |
325 | elif sub_file.endswith(".esLA.ass"):
326 | subtitles_files.append(
327 | "--track-name 0:Espanol --language 0:spa --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
328 | os.path.realpath(sub_file)) + '" ')
329 | elif sub_file.endswith(".esES.ass"):
330 | subtitles_files.append(
331 | "--track-name 0:Espanol_Espana --language 0:spa --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
332 | os.path.realpath(sub_file)) + '" ')
333 | elif sub_file.endswith(".ptBR.ass"):
334 | subtitles_files.append(
335 | "--track-name 0:Portugues_Brasil --language 0:por --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
336 | os.path.realpath(sub_file)) + '" ')
337 | elif sub_file.endswith(".ptPT.ass"):
338 | subtitles_files.append(
339 | "--track-name 0:Portugues_Portugal --language 0:por --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
340 | os.path.realpath(sub_file)) + '" ')
341 | elif sub_file.endswith(".frFR.ass"):
342 | subtitles_files.append(
343 | "--track-name 0:Francais_France --language 0:fre --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
344 | os.path.realpath(sub_file)) + '" ')
345 | elif sub_file.endswith(".deDE.ass"):
346 | subtitles_files.append(
347 | "--track-name 0:Deutsch --language 0:ger --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
348 | os.path.realpath(sub_file)) + '" ')
349 | elif sub_file.endswith(".arME.ass"):
350 | subtitles_files.append(
351 | "--track-name 0:Arabic --language 0:ara --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
352 | os.path.realpath(sub_file)) + '" ')
353 | elif sub_file.endswith(".itIT.ass"):
354 | subtitles_files.append(
355 | "--track-name 0:Italiano --language 0:ita --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
356 | os.path.realpath(sub_file)) + '" ')
357 | elif sub_file.endswith(".trTR.ass"):
358 | subtitles_files.append(
359 | "--track-name 0:Turkce --language 0:tur --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
360 | os.path.realpath(sub_file)) + '" ')
361 | else:
362 | subtitles_files.append(
363 | "--track-name 0:und --default-track 0:no --sub-charset 0:utf-8 " + '"' + str(
364 | os.path.realpath(sub_file)) + '" ')
365 |
366 | subs_files = anime_dl.common.misc.duplicate_remover(subtitles_files)
367 | logging.debug("subs_files : {0}".format(subs_files))
368 | return subs_files
369 |
370 | def only_subs(self, url, cookies, resolution):
371 | video_id = str(url.split('-')[-1]).replace("/", "")
372 | logging.debug("video_id : {0}".format(video_id))
373 |
374 | response, resolution_to_find, info_url = self.resolution_finder(resolution=resolution, video_id=video_id, url=url)
375 |
376 | if not response:
377 | print("No Resolution Found")
378 | exit(1)
379 |
380 | response_value, xml_page_connect, xml_cookies = anime_dl.common.browser_instance.page_downloader(url=info_url,
381 | cookies=cookies)
382 |
383 | if xml_page_connect:
384 | xml_page_connect = str(xml_page_connect)
385 | stream_exists, m3u8_file_link = self.m3u8_finder(xml_page_source=xml_page_connect)
386 |
387 | if stream_exists:
388 | anime_name, episode_number, video_resolution = self.episode_information_extractor(
389 | page_source=xml_page_connect, resolution_to_find=resolution_to_find)
390 | file_name = supporters.anime_name.crunchyroll_name(anime_name=anime_name, episode_number=episode_number,
391 | resolution=video_resolution)
392 | output_directory = supporters.path_works.path_creator(anime_name=anime_name)
393 | file_location = str(output_directory) + os.sep + str(file_name).replace(".mp4", ".ass")
394 |
395 | if os.path.isfile(file_location):
396 | print('[anime-dl] File Exists! Skipping {0}\n'.format(file_name))
397 | pass
398 | else:
399 | subs_downloaded = supporters.sub_fetcher.crunchyroll_subs(xml=str(xml_page_connect),
400 | episode_number=episode_number,
401 | file_name=file_name)
402 | if not subs_downloaded:
403 | pass
404 | else:
405 | subtitles_moved = self.move_subtitle_file(output_directory)
406 | if subtitles_moved:
407 | return True
408 | else:
409 | return False
410 | else:
411 | print("Stream Not Found. Subtitle Downloading Failed.")
412 | return False
413 |
414 |
415 | def ffmpeg_call(self, m3u8_text, file_name):
416 | try:
417 | ffmpeg_command = 'ffmpeg -i "{0}" -c copy -bsf:a aac_adtstoasc "{1}/{2}"'.format(m3u8_text, os.getcwd(),
418 | file_name)
419 | logging.debug("ffmpeg_command : {0}\n".format(ffmpeg_command))
420 | # check_call returns 0 on success and raises CalledProcessError on
421 | # failure, so if the call below returns at all, ffmpeg succeeded and
422 | # we can simply report success.
423 | subprocess.check_call(ffmpeg_command, shell=True)
424 | return True
425 | except Exception:
426 | return False
427 |
428 | def m3u8_finder(self, xml_page_source):
429 | m3u8_file_link = str(re.search(r'<file>(.*?)</file>', xml_page_source).group(1)).replace("&amp;", "&")
430 | logging.debug("m3u8_file_link : %s", m3u8_file_link)
431 |
432 | if not m3u8_file_link:
433 | # If no m3u8 found, try the rtmpdump...
434 | try:
435 | host_link = re.search(r'<host>(.*?)</host>', xml_page_source).group(1)
436 | logging.debug("Found RTMP DUMP!")
437 | print("RTMP streams not supported currently...")
438 | return False, None
439 | except Exception as NoRtmpDump:
440 | print("No RTMP Streams Found...")
441 | print(NoRtmpDump)
442 | else:
443 | return True, m3u8_file_link
444 |
445 | def resolution_finder(self, resolution, video_id, url):
446 | resolution_to_find = None
447 | info_url = ""
448 |
449 | if str(resolution).lower() in ['1080p', '1080', 'fhd', 'best']:
450 | info_url = "http://www.crunchyroll.com/xml/?req=RpcApiVideoPlayer_GetStandardConfig&media_id=%s&video_format=108&video_quality=80&current_page=%s" % (
451 | video_id, url)
452 | resolution_to_find = "1920x1080"
453 |
454 | elif str(resolution).lower() in ['720p', '720', 'hd']:
455 | info_url = "http://www.crunchyroll.com/xml/?req=RpcApiVideoPlayer_GetStandardConfig&media_id=%s&video_format=106&video_quality=62&current_page=%s" % (
456 | video_id, url)
457 | resolution_to_find = "1280x720"
458 | elif str(resolution).lower() in ['640', '640x480']:
459 | info_url = "http://www.crunchyroll.com/xml/?req=RpcApiVideoPlayer_GetStandardConfig&media_id=%s&video_format=103&video_quality=61&current_page=%s" % (
460 | video_id, url)
461 | resolution_to_find = "640x480"
462 | elif str(resolution).lower() in ['480p', '480', 'sd']:
463 | info_url = "http://www.crunchyroll.com/xml/?req=RpcApiVideoPlayer_GetStandardConfig&media_id=%s&video_format=106&video_quality=61&current_page=%s" % (
464 | video_id, url)
465 | resolution_to_find = "848x480"
466 | elif str(resolution).lower() in ['480x360']:
467 | info_url = "http://www.crunchyroll.com/xml/?req=RpcApiVideoPlayer_GetStandardConfig&media_id=%s&video_format=106&video_quality=61&current_page=%s" % (
468 | video_id, url)
469 | resolution_to_find = "480x360"
470 | elif str(resolution).lower() in ['360p', '360', 'cancer']:
471 | info_url = "http://www.crunchyroll.com/xml/?req=RpcApiVideoPlayer_GetStandardConfig&media_id=%s&video_format=106&video_quality=60&current_page=%s" % (
472 | video_id, url)
473 | resolution_to_find = "640x360"
474 | elif str(resolution).lower() in ['240p', '240', 'supracancer']:
475 | info_url = "http://www.crunchyroll.com/xml/?req=RpcApiVideoPlayer_GetStandardConfig&media_id=%s&video_format=106&video_quality=60&current_page=%s" % (
476 | video_id, url)
477 | resolution_to_find = "428x240"
478 |
479 | logging.debug("info_url : {0}".format(info_url))
480 |
481 | if resolution_to_find is None:
482 | print('Unknown requested resolution %s' % str(resolution).lower())
483 | return False, None, None
484 |
485 | else:
486 | return True, resolution_to_find, info_url
487 |
--------------------------------------------------------------------------------
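A minimal invocation sketch for the Crunchyroll class above, with placeholder credentials and URL. The constructor does all the work (login, then whole_show() for a show URL or single_episode() for a media URL); note that it indexes username[0] and password[0], so both are passed as lists, and that logger and skipper are compared as strings:

    from anime_dl.sites.crunchyroll import Crunchyroll

    Crunchyroll(
        url='https://www.crunchyroll.com/some-show-slug',  # placeholder show URL
        username=['my_username'], password=['my_password'],
        resolution='720p', language='jpn', skipper='no',
        logger='False', episode_range='All', output=None,
    )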
/anime_dl/sites/supporters/__init__.py:
--------------------------------------------------------------------------------
1 | from . import anime_name
2 | from . import path_works
3 | from . import sub_fetcher
4 |
--------------------------------------------------------------------------------
/anime_dl/sites/supporters/anime_name.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | import re
5 | import subprocess
6 |
7 |
8 | def crunchyroll_name(anime_name, episode_number, resolution):
9 | anime_name = str(anime_name).replace("039T", "'")
10 | # rawName = str(animeName).title().strip().replace("Season ", "S") + " - " + \
11 | # str(episode_number).strip() + " [" + str(resolution) + "]"
12 |
13 | file_name = str(re.sub(r'[^A-Za-z0-9\ \-\' \\]+', '', str(anime_name))).title().strip().replace("Season ", "S") \
14 | + " - " + str(episode_number.zfill(2)).strip() + " [" + str(resolution) + "].mp4"
15 |
16 | try:
17 | max_path = int(subprocess.check_output(['getconf', 'PATH_MAX', '/']))
18 | except Exception:
19 | max_path = 4096
20 |
21 | if len(file_name) > max_path:
22 | file_name = file_name[:max_path]
23 |
24 | return file_name
25 |
--------------------------------------------------------------------------------
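A worked example of crunchyroll_name above (values are illustrative): the regex drops punctuation, "Season " collapses to "S", and the episode number string is zero-padded to two digits:

    from anime_dl.sites.supporters.anime_name import crunchyroll_name

    # episode_number must be a string, since zfill() is called on it.
    name = crunchyroll_name('Cowboy Bebop Season 1', '5', '1280x720')
    assert name == 'Cowboy Bebop S1 - 05 [1280x720].mp4'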
/anime_dl/sites/supporters/path_works.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | import os
5 |
6 |
7 | def path_creator(anime_name):
8 | output_directory = os.path.abspath("Output" + os.sep + str(anime_name) + "/")
9 | if not os.path.exists("Output"):
10 | os.makedirs("Output")
11 | if not os.path.exists(output_directory):
12 | os.makedirs(output_directory)
13 | return output_directory
14 |
--------------------------------------------------------------------------------
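A short sketch of path_creator above: it builds Output/<anime name>/ under the current working directory, creating both levels if they do not exist, and returns the absolute path (the name is illustrative):

    import os
    from anime_dl.sites.supporters.path_works import path_creator

    target = path_creator('Cowboy Bebop')
    assert target == os.path.abspath(os.path.join('Output', 'Cowboy Bebop'))
    assert os.path.isdir(target)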
/anime_dl/sites/supporters/sub_fetcher.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 |
4 | import anime_dl.common
5 | from anime_dl.external.aes import aes_cbc_decrypt
6 | from anime_dl.external.compat import compat_etree_fromstring
7 | from anime_dl.external.utils import bytes_to_intlist, intlist_to_bytes
8 | import re
9 | import logging
10 | import os
11 | import base64
12 | import zlib
13 | from hashlib import sha1
14 | from math import pow, sqrt, floor
15 |
16 |
17 | def crunchyroll_subs(xml, episode_number, file_name):
18 | headers = {
19 | 'User-Agent':
20 | 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_1) AppleWebKit/601.2.7 (KHTML, like Gecko) Version/9.0.1 Safari/601.2.7',
21 | 'Referer':
22 | 'https://www.crunchyroll.com'
23 | }
24 | for sub_id, sub_lang, sub_lang2 in re.findall(
25 | r'subtitle_script_id\=(.*?)\"\ title\=\"\[(.*?)\]\ (.*?)\"',
26 | str(xml)):
27 | xml_return = anime_dl.common.browser_instance.page_downloader(url="http://www.crunchyroll.com/xml/?req=RpcApiSubtitle_GetXml&subtitle_script_id={0}".format(sub_id), headers=headers)
28 |
29 | iv = str(re.search(r'\<iv\>(.*?)\<\/iv\>', str(xml_return)).group(1)).strip()
30 | data = str(re.search(r'\<data\>(.*?)\<\/data\>', str(xml_return)).group(1)).strip()
31 | subtitle = _decrypt_subtitles(data, iv, sub_id).decode('utf-8')
32 | sub_root = compat_etree_fromstring(subtitle)
33 | sub_data = _convert_subtitles_to_ass(sub_root)
34 | lang_code = str(
35 | re.search(r'lang_code\=\"(.*?)\"', str(subtitle)).group(
36 | 1)).strip()
37 | sub_file_name = str(file_name).replace(".mp4", ".") + str(lang_code) + ".ass"
38 |
39 | print("Downloading {0} ...".format(sub_file_name))
40 |
41 | try:
42 | with open(str(os.getcwd()) + "/" + str(sub_file_name), "wb") as sub_file:
43 | sub_file.write(sub_data.encode("utf-8"))
44 | except Exception as EncodingException:
45 | print("Couldn't write the subtitle file...skipping.")
46 | pass
47 | logging.debug("\n----- Subs Downloaded -----\n")
48 | return True
49 |
50 |
51 | def _decrypt_subtitles(data, iv, id):
52 | data = bytes_to_intlist(base64.b64decode(data.encode('utf-8')))
53 | iv = bytes_to_intlist(base64.b64decode(iv.encode('utf-8')))
54 | id = int(id)
55 |
56 | def obfuscate_key_aux(count, modulo, start):
57 | output = list(start)
58 | for _ in range(count):
59 | output.append(output[-1] + output[-2])
60 | # cut off start values
61 | output = output[2:]
62 | output = list(map(lambda x: x % modulo + 33, output))
63 | return output
64 |
65 | def obfuscate_key(key):
66 | num1 = int(floor(pow(2, 25) * sqrt(6.9)))
67 | num2 = (num1 ^ key) << 5
68 | num3 = key ^ num1
69 | num4 = num3 ^ (num3 >> 3) ^ num2
70 | prefix = intlist_to_bytes(obfuscate_key_aux(20, 97, (1, 2)))
71 | shaHash = bytes_to_intlist(
72 | sha1(prefix + str(num4).encode('ascii')).digest())
73 | # Extend 160 Bit hash to 256 Bit
74 | return shaHash + [0] * 12
75 |
76 | key = obfuscate_key(id)
77 |
78 | decrypted_data = intlist_to_bytes(aes_cbc_decrypt(data, key, iv))
79 | return zlib.decompress(decrypted_data)
80 |
81 |
82 | def _convert_subtitles_to_ass(sub_root):
83 | output = ''
84 |
85 | def ass_bool(strvalue):
86 | assvalue = '0'
87 | if strvalue == '1':
88 | assvalue = '-1'
89 | return assvalue
90 |
91 | output = '[Script Info]\n'
92 | output += 'Title: %s\n' % sub_root.attrib['title']
93 | output += 'ScriptType: v4.00+\n'
94 | output += 'WrapStyle: %s\n' % sub_root.attrib['wrap_style']
95 | output += 'PlayResX: %s\n' % sub_root.attrib['play_res_x']
96 | output += 'PlayResY: %s\n' % sub_root.attrib['play_res_y']
97 | output += """
98 | [V4+ Styles]
99 | Format: Name, Fontname, Fontsize, PrimaryColour, SecondaryColour, OutlineColour, BackColour, Bold, Italic, Underline, StrikeOut, ScaleX, ScaleY, Spacing, Angle, BorderStyle, Outline, Shadow, Alignment, MarginL, MarginR, MarginV, Encoding
100 | """
101 | for style in sub_root.findall('./styles/style'):
102 | output += 'Style: ' + style.attrib['name']
103 | output += ',' + style.attrib['font_name']
104 | output += ',' + style.attrib['font_size']
105 | output += ',' + style.attrib['primary_colour']
106 | output += ',' + style.attrib['secondary_colour']
107 | output += ',' + style.attrib['outline_colour']
108 | output += ',' + style.attrib['back_colour']
109 | output += ',' + ass_bool(style.attrib['bold'])
110 | output += ',' + ass_bool(style.attrib['italic'])
111 | output += ',' + ass_bool(style.attrib['underline'])
112 | output += ',' + ass_bool(style.attrib['strikeout'])
113 | output += ',' + style.attrib['scale_x']
114 | output += ',' + style.attrib['scale_y']
115 | output += ',' + style.attrib['spacing']
116 | output += ',' + style.attrib['angle']
117 | output += ',' + style.attrib['border_style']
118 | output += ',' + style.attrib['outline']
119 | output += ',' + style.attrib['shadow']
120 | output += ',' + style.attrib['alignment']
121 | output += ',' + style.attrib['margin_l']
122 | output += ',' + style.attrib['margin_r']
123 | output += ',' + style.attrib['margin_v']
124 | output += ',' + style.attrib['encoding']
125 | output += '\n'
126 |
127 | output += """
128 | [Events]
129 | Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
130 | """
131 | for event in sub_root.findall('./events/event'):
132 | output += 'Dialogue: 0'
133 | output += ',' + event.attrib['start']
134 | output += ',' + event.attrib['end']
135 | output += ',' + event.attrib['style']
136 | output += ',' + event.attrib['name']
137 | output += ',' + event.attrib['margin_l']
138 | output += ',' + event.attrib['margin_r']
139 | output += ',' + event.attrib['margin_v']
140 | output += ',' + event.attrib['effect']
141 | output += ',' + event.attrib['text']
142 | output += '\n'
143 |
144 | return output
145 |
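For reference, a minimal sketch of how the two helpers above fit together (illustrative only: `sub_id`, `iv_b64` and `data_b64` stand in for the id/iv/data fields pulled from Crunchyroll's subtitle XML, and `fetch_and_convert` is not a function defined in this file):

```
# Hypothetical glue code, assuming the base64 fields have already been fetched.
import xml.etree.ElementTree as ET

def fetch_and_convert(sub_id, iv_b64, data_b64):
    decrypted_xml = _decrypt_subtitles(data_b64, iv_b64, sub_id)  # bytes of plain subtitle XML
    sub_root = ET.fromstring(decrypted_xml)                       # parse the decompressed script
    return _convert_subtitles_to_ass(sub_root)                    # ASS text, ready to be written to disk
```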
--------------------------------------------------------------------------------
/anime_dl/subtitles/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Xonshiz/anime-dl/d40f0ca11b894dbbf17edbb9d0f069e741be902f/anime_dl/subtitles/__init__.py
--------------------------------------------------------------------------------
/anime_dl/version.py:
--------------------------------------------------------------------------------
1 | __version__ = "2019.05.16"
2 |
--------------------------------------------------------------------------------
/auto.sh:
--------------------------------------------------------------------------------
1 | #Creator of script: SirJosh3917
2 | #Creator of anime-dl: Xonshiz
3 | #May (5) 5th, 2018 - Updated June (6) 10th, 2018
4 |
5 | #check if root
6 | #https://askubuntu.com/questions/15853/how-can-a-script-check-if-its-being-run-as-root
7 | is_root() {
8 | if ! [ $(id -u) = 0 ]; then
9 | echo "Please run this script as root."
10 | echo "Running 'sudo' on this script..."
11 |
12 | SCRIPT_NAME=$(basename "$0")
13 |
14 | CMD="sudo ./${SCRIPT_NAME}"
15 | eval ${CMD}
16 | exit $?
17 | fi
18 | }
19 |
20 | #shamelessly stolen/modified from https://install.pi-hole.net
21 | distro_check() {
22 | DISTRO_OS=""
23 | DISTRO_DEBIAN="debian"
24 |
25 | PKG_MANAGER=""
26 | PKG_INSTALL=""
27 |
28 | if command -v apt-get &> /dev/null; then
29 | DISTRO_OS="${DISTRO_DEBIAN}"
30 | PKG_MANAGER="apt-get"
31 | PKG_INSTALL="${PKG_MANAGER} --yes --no-install-recommends install"
32 | # elif command -v rpm &> /dev/null; then
33 | # if command -v dnf &> /dev/null; then
34 | # PKG_MANAGER="dnf"
35 | # elif
36 | # PKG_MANAGER="yum"
37 | # fi
38 | else
39 | echo "Your linux distro is not supported."
40 | exit
41 | fi
42 |
43 | if [ "${DISTRO_OS}" == "${DISTRO_DEBIAN}" ]; then
44 | return 0;
45 | else
46 | echo "Your linux distro either isn't supported, or somebody didn't finish coding the distro_check..."
47 | exit
48 | fi
49 |
50 | return 0;
51 | }
52 |
53 | get_distro() {
54 | DISTRO_DEBIAN_7=0
55 | DISTRO_DEBIAN_8=1
56 | DISTRO_DEBIAN_9=2
57 | RETURN=-1
58 |
59 | if [ "${DISTRO_OS}" == "${DISTRO_DEBIAN}" ]; then
60 | lsb_release -a > tmp_distro
61 | if grep "stretch" tmp_distro; then
62 | RETURN=${DISTRO_DEBIAN_9}
63 | elif grep "jessie" tmp_distro; then
64 | RETURN=${DISTRO_DEBIAN_8}
65 | elif grep "wheezy" tmp_distro; then
66 | RETURN=${DISTRO_DEBIAN_7}
67 | else
68 | echo "Version of debian not supported."
69 | RETURN=-1;
70 | fi
71 | fi
72 |
73 | rm tmp_distro
74 | return ${RETURN}
75 | }
76 |
77 | install_ffmpeg() {
78 | ${PKG_INSTALL} ffmpeg
79 | }
80 |
81 | install_mkvmerge() {
82 | wget -q -O - https://mkvtoolnix.download/gpg-pub-moritzbunkus.txt | sudo apt-key add -
83 |
84 | get_distro
85 | DISTRO=$?
86 |
87 | if [ ${DISTRO} == ${DISTRO_DEBIAN_9} ]; then
88 | echo "deb https://mkvtoolnix.download/debian/ stretch main" >> /etc/apt/sources.list.d/mkvtoolnix.list  # register the MKVToolNix apt repo (file name is arbitrary)
89 | echo "deb-src https://mkvtoolnix.download/debian/ stretch main" >> /etc/apt/sources.list.d/mkvtoolnix.list
90 | elif [ ${DISTRO} == ${DISTRO_DEBIAN_8} ]; then
91 | echo "deb https://mkvtoolnix.download/debian/ jessie main" >> /etc/apt/sources.list.d/mkvtoolnix.list
92 | echo "deb-src https://mkvtoolnix.download/debian/ jessie main" >> /etc/apt/sources.list.d/mkvtoolnix.list
93 | elif [ ${DISTRO} == ${DISTRO_DEBIAN_7} ]; then
94 | echo "deb https://mkvtoolnix.download/debian/ wheezy main" >> /etc/apt/sources.list.d/mkvtoolnix.list
95 | echo "deb-src https://mkvtoolnix.download/debian/ wheezy main" >> /etc/apt/sources.list.d/mkvtoolnix.list
96 | else echo "Distro unsupported."; return 1; fi
97 |
98 | apt update
99 | ${PKG_INSTALL} mkvtoolnix
100 | }
101 |
102 | install_nodejs() {
103 | curl -sL https://deb.nodesource.com/setup_8.x | sudo -E bash -
104 | ${PKG_INSTALL} nodejs
105 | }
106 |
107 | install_animedl() {
108 | wget https://github.com/Xonshiz/anime-dl/archive/master.tar.gz
109 | tar -xzf master.tar.gz
110 | mv anime-dl-master anime-dl #rename to anime-dl
111 | rm master.tar.gz
112 |
113 | SCRIPTS_DIR="anime-dl/anime_dl/"
114 |
115 | #make it runnable
116 | chmod -R +x ${SCRIPTS_DIR}
117 | chmod -R 755 ${SCRIPTS_DIR}
118 | }
119 |
120 | install_curl() {
121 | ${PKG_INSTALL} curl
122 | }
123 |
124 | install_pip() {
125 | curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
126 | python get-pip.py
127 | rm get-pip.py
128 | }
129 |
130 | install_dependencies() {
131 | pip install cfscrape
132 | pip install tqdm
133 | pip install bs4
134 | }
135 |
136 | ensure_animedl_installed() {
137 | #find "Anime_DL downloads anime from" in the --help
138 |
139 | FIND="Anime_DL downloads anime from"
140 | SCRIPTS_DIR="anime-dl/anime_dl/"
141 | RETURN=1
142 |
143 | cd ${SCRIPTS_DIR}
144 | ./__main__.py --help > ../../tmp_help
145 | cd ..; cd ..
146 |
147 | if grep "${FIND}" tmp_help; then
148 | echo "anime-dl installed!";
149 | RETURN=0
150 | else
151 | echo "anime-dl not installed...";
152 | RETURN=1
153 | fi
154 |
155 | rm tmp_help
156 | return ${RETURN}
157 | }
158 |
159 | #make sure we're root so we can install packages
160 |
161 | is_root
162 |
163 | distro_check
164 |
165 | if [ $? == 0 ]; then
166 | install_ffmpeg
167 | install_mkvmerge
168 | install_nodejs
169 | install_curl
170 | install_pip
171 | install_dependencies
172 | install_animedl
173 | ensure_animedl_installed
174 | exit $?
175 | else
176 | echo "Distro check failed; nothing was installed."
177 | exit
178 | fi
179 |
--------------------------------------------------------------------------------
/docs/Changelog.md:
--------------------------------------------------------------------------------
1 | # Changelog
2 |
3 | - Site support for Crunchyroll.com [2017.03.05]
4 | - Fix for #1 [2017.03.06]
5 | - Fix for #2 [2017.03.06]
6 | - ReadMe updated for Python Script execution [2017.03.06]
7 | - Support for Whole Show Downloading for Crunchyroll [2017.03.06]
8 | - Selection of language for the Crunchyroll Show [2017.03.06]
9 | - Downloading only subtitles (skip video downloads) [2017.04.13]
10 | - Fix for [6](https://github.com/Xonshiz/anime-dl/issues/6) and Fix for [3](https://github.com/Xonshiz/anime-dl/issues/3) [2017.04.13]
11 | - Fix for #9 [2017.04.13]
12 | - Added `Verbose Logging` [2017.04.13]
13 | - Fix for #11 [2017.04.21]
14 | - Re-wrote code to remove unnecessary parts [2017.05.30]
15 | - Fix for #12 [2017.05.30]
16 | - Site support for Funimation.com [2017.05.30]
17 | - Fix for #8 [2017.05.30]
18 | - Muxing All The Subtitles [2017.07.03]
19 | - Fix for special characters and #15 [2017.07.05]
20 | - Episode Download Range support Added [2017.07.07]
21 | - Added support to include fonts in the muxed videos [2017.07.09]
22 | - Changed mkvmerge.exe to mkvmerge to support Non-Windows Operating Systems [2017.07.24]
23 | - Fix for #21 and #22 [2017.10.04]
24 | - Fix for #17 and #22 [2017.12.27]
25 | - Fix for #17 and #26 [2017.12.27]
26 | - PEP8 Cleaning [2018.01.02]
27 | - Fix for #18 [2018.01.02]
28 | - Fix for #31 [2018.01.21]
29 | - Fix for #39 [2018.01.27]
30 | - Fix for #46 [2018.01.29]
31 | - Fix for #45 [2018.01.29]
32 | - Temp fix for login #65, #66 [2018.10.11]
33 | - Login Issue Fixed [2019.05.16]
34 | - Re-structured the code for better maintenance and re-usability. [2019.05.16]
35 | - Fixed #100 [2019.05.26]
36 | - Fixed #99 [2019.05.26]
37 | - Fixed cookie issue that prevented downloading of HD and FHD streams [2019.05.27]
--------------------------------------------------------------------------------
/docs/Supported_Sites.md:
--------------------------------------------------------------------------------
1 | # List of Supported Websites
2 |
3 | * [CrunchyRoll](http://crunchyroll.com)
4 | * [Funimation](http://funimation.com) [Testing]
--------------------------------------------------------------------------------
/docs/index.md:
--------------------------------------------------------------------------------
1 | # Anime-DL | [](https://travis-ci.org/Xonshiz/anime-dl) [](http://anime-dl.readthedocs.io/en/latest/?badge=latest) | [](https://www.paypal.me/xonshiz)
2 |
3 | Anime-dl is a command-line program to download anime from CrunchyRoll and Funimation. This script requires a premium subscription for the listed services; if you don't have a subscription, this script is pretty much useless for you.
4 |
5 | > Downloading and distributing this content may be illegal. This script was written purely for educational purposes, and you are responsible for its use.
6 |
7 | > Support these anime streaming websites by buying a premium account.
8 |
9 | > Some libraries are taken directly from youtube-dl to decrypt CrunchyRoll's subtitles.
10 |
11 | ## Table of Contents
12 |
13 | * [Supported Sites](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md)
14 | * [Dependencies Installation](#dependencies-installation)
15 | * [Installation](#installation)
16 | * [Python Support](#python-support)
17 | * [Windows Binary](#windows-binary)
18 | * [List of Arguments](#list-of-arguments)
19 | * [Usage](#usage)
20 | * [Windows](#windows)
21 | * [Linux/Debian](#linuxdebian)
22 | * [Example URLs](#example-urls)
23 | * [Features](#features)
24 | * [Changelog](https://github.com/Xonshiz/anime-dl/blob/master/Changelog.md)
25 | * [Opening An Issue/Requesting A Site](#opening-an-issuerequesting-a-site)
26 | * [Reporting Issues](#reporting-issues)
27 | * [Suggesting A Feature](#suggesting-a-feature)
28 | * [Donations](#donations)
29 |
30 | ## Supported Websites
31 | You can check the list of supported websites [**`HERE`**](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md).
32 |
33 | ## Dependencies Installation
34 | This script can run on multiple Operating Systems, but it depends on some external binaries and libraries: we need `FFmpeg`, `mkvmerge` and `Node.js` in our paths. Some old streams on Crunchyroll are only available as `rtmpe` streams, as noted in Issue #9; for those, you also need `rtmpdump`.
35 |
36 | You also need [mkvmerge.exe](https://mkvtoolnix.download/downloads.html) in your `PATH` or `Working Directory`.
37 |
38 |
39 | **`These dependencies are required on ALL the operating systems, ALL of them!`**
40 |
41 | 1.) Make sure you have Python installed and present in your system's path.
42 |
43 | 2.) Grab [FFmpeg from this link](https://ffmpeg.org/download.html), [Node.js from this link](https://nodejs.org/en/download/) and [RTMPDump](https://www.videohelp.com/software/RTMPDump).
44 |
45 | 3.) Install FFmpeg and Node.js and place them in the directory of this script, or put them in your system's path.
46 |
47 | 4.) Browse to the directory of this script, open a command prompt/shell in that directory and run this command :
48 |
49 | ```
50 | python -m pip install -r requirements.txt
51 | ```
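
If you want to confirm that the external tools are actually reachable before running the script, a quick check along these lines works on any OS (a minimal sketch only, not part of anime-dl):

```
# Check that the external binaries anime-dl shells out to are on PATH.
import shutil

for tool in ("ffmpeg", "mkvmerge", "node", "rtmpdump"):
    print(tool, "->", shutil.which(tool) or "NOT FOUND")
```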
52 |
53 | ## Installation
54 | After installing and setting up all the dependencies on your Operating System, you're good to go and can use this script.
55 | The instructions are the same for every OS. Download [`THIS REPOSITORY`](https://github.com/Xonshiz/anime-dl/archive/master.zip), put it somewhere on your system and browse to the `anime_dl` folder.
56 |
57 | **Windows users**, it's better not to place it anywhere that requires administrator privileges; a good example of a place to avoid is `C:\Windows`. This goes for both the Python script and the Windows binary file (.exe).
58 |
59 | **Linux/Debian** users, make sure that this script is executable. Just run these commands if you run into problem(s) :
60 |
61 | `chmod +x anime-dl.py`
62 |
63 | `chmod +x __main__.py`
64 |
65 | and then, execute with this :
66 |
67 | `./__main__.py`
68 |
69 | ## Python Support
70 | This script currently supports only Python 3.
71 |
72 | ## Windows Binary
73 | It is recommended that Windows users use this binary to save both your head and your time from installing all the dependencies.
74 |
75 | You need to download [FFmpeg](https://ffmpeg.org/download.html) and [Node.js](https://nodejs.org/en/download/) and keep them in the same directory as that of this windows binary file.
76 |
77 | If you already have these dependencies, then you can download this binary and start using the script right off the bat :
78 | * `Binary (x86)` : [Click Here](https://github.com/Xonshiz/anime-dl/releases/latest)
79 |
80 |
81 | ## List of Arguments
82 | Currently, the script supports these arguments :
83 | ```
84 | -h, --help Prints the basic help menu of the script and exits.
85 | -i,--input Defines the input link to the anime.
86 | -V,--version Prints the VERSION and exits.
87 | -u,--username Indicates username for a website. [REQUIRED]
88 | -p,--password Indicates password for a website. [REQUIRED]
89 | -r,--resolution Indicates the desired resolution. (default = 720p)
90 | --skip Skip video downloads (Will only download subtitles)
91 | -v,--verbose Starts Verbose Logging for detailed information.
92 | -l,--language Selects the language for the show. (default = Japanese) [Langs = english, dub, sub, Japanese, eng]
93 | -rn,--range Selects the range of episodes to download (Default = All) [ Ex : --range 1-10 (This will download first 10 episodes of a series)]
94 | ```
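
For reference, flags like these are usually declared with `argparse` (which `requirements.txt` pulls in). The snippet below is only a sketch of that pattern, not the project's actual parser:

```
import argparse

parser = argparse.ArgumentParser(description="anime-dl style argument parser (illustrative sketch).")
parser.add_argument("-i", "--input", help="Input link to the anime.")
parser.add_argument("-u", "--username", required=True, help="Username for the website.")
parser.add_argument("-p", "--password", required=True, help="Password for the website.")
parser.add_argument("-r", "--resolution", default="720p", help="Desired resolution.")
parser.add_argument("--skip", action="store_true", help="Skip video downloads and keep only subtitles.")
parser.add_argument("-v", "--verbose", action="store_true", help="Enable verbose logging.")
parser.add_argument("-l", "--language", default="Japanese", help="Language for the show.")
parser.add_argument("-rn", "--range", default="All", help="Episode range, e.g. 1-10.")
args = parser.parse_args()
```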
95 |
96 | ## Usage
97 | With this script, you have to pass arguments in order to be able to download anything. Passing arguments in a script is pretty easy. Since the script is pretty basic, it doesn't have too many arguments. Go check the [`ARGUMENTS SECTION`](https://github.com/Xonshiz/anime-dl#list-of-arguments) to know more about which arguments the script offers.
98 |
99 | Follow the instructions according to your OS :
100 |
101 | ### Windows
102 | After you've saved this script in a directory/folder, you need to open `command prompt` and browse to that directory and then execute the script. Let's do it step by step :
103 | * Open the folder where you've downloaded the files of this repository.
104 | * Hold down the **`SHIFT`** key, **`RIGHT CLICK`** inside that folder and select `Open Command Prompt Here` from the options that show up.
105 | * Now, in the command prompt, type this :
106 |
107 | *If you're using the windows binary :*
108 |
109 | `anime-dl.exe -i "" -u "YourUsername" -p "Password" -r "Resolution"`
110 |
111 | *If you're using the Python Script :*
112 |
113 | `__main__.py -i "" -u "YourUsername" -p "Password" -r "Resolution"`
114 |
115 | URL can be any URL of the [supported websites](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md).
116 |
117 | ### Linux/Debian
118 | After you've saved this script in a directory/folder, you need to open a terminal, browse to that directory and then execute the script. Let's do it step by step :
119 | * Open a terminal, `Ctrl + Alt + T` is the shortcut to do so (if you didn't know).
120 | * Now, change the current working directory of the terminal to the one where you've downloaded this repository.
121 | * Now, in the Terminal, type this :
122 |
123 | `__main__.py -i "" -u "YourUsername" -p "Password" -r "Resolution"`
124 |
125 | URL can be any URL of the [supported websites](https://github.com/Xonshiz/anime-dl/blob/master/Supported_Sites.md).
126 |
127 | ### Example URLs
128 | * Crunchyroll :
129 | * Single Episode : [http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying/episode-13-happy-days-678059](http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying/episode-13-happy-days-678059)
130 | * Whole Show : [http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying](http://www.crunchyroll.com/i-cant-understand-what-my-husband-is-saying)
131 |
132 | ### Note :
133 |
134 | * If you want to include some fonts in the muxed video, you need to make a folder named "Fonts" in the same directory as this script and put all the fonts inside it. The script should pick them all up (roughly as sketched below).
135 |
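A rough idea of how fonts from that folder could end up attached to the muxed file (illustrative only: `episode.mp4`, `subtitle.ass` and the output name are placeholders, and this is not necessarily the exact command the script builds, though `--attach-file` and `--attachment-mime-type` are standard mkvmerge options):

```
# Sketch: attach every .ttf from ./Fonts while muxing video and subtitles with mkvmerge.
import glob
import subprocess

cmd = ["mkvmerge", "-o", "episode_muxed.mkv", "episode.mp4", "subtitle.ass"]
for font in glob.glob("Fonts/*.ttf"):
    cmd += ["--attachment-mime-type", "application/x-truetype-font", "--attach-file", font]
subprocess.call(cmd)
```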
136 | ## Features
137 | This is a very basic and small script, so at the moment it only has a few features.
138 | * Downloads a Single episode along with all the available subtitles for that episode.
139 | * Downloads and puts them all in a directory named "Output".
140 | * Skip if the file has already been downloaded.
141 | * Downloads all the episodes for a show available on Crunchyroll.
142 | * Gives choice for downloading subs or dubs of a series available on Crunchyroll.
143 | * Choice to download only the subs and skip the videos.
144 |
145 | ## Changelog
146 | You can check the changelog [**`HERE`**](https://github.com/Xonshiz/anime-dl/blob/master/Changelog.md).
147 |
148 | ## Opening An Issue/Requesting A Site
149 | If you're planning to open an issue for the script, ask for a new feature, or anything else that requires opening an Issue, then please do keep these things in mind.
150 |
151 | ### Reporting Issues
152 | PLEASE RUN THIS SCRIPT IN A COMMAND LINE (as mentioned in the Usage section) AND DON'T SAY THAT `THE SCRIPT CLOSED TOO QUICK, I COULDN'T SEE`. If something doesn't work like it's supposed to, run the command with the `--verbose` argument. It'll create an `Error Log.txt` file in the same directory. Upload the contents of that file to GitHub Gists/Pastebin etc. and share that link.
153 |
154 | **Please make sure that you remove your login credentials from the Error Log.txt file before you post its contents anywhere.**
155 |
156 | If you're here to report an issue, please follow the basic syntax to post a request :
157 |
158 | **Subject** : Error That You Get.
159 |
160 | **Command Line Arguments You Gave** : The whole command that you gave to execute/run this script.
161 |
162 | **Verbose Log Link** : Link to the Gist/Pastebin that holds the content of Error Log.txt.
163 |
164 | **Long Explanation** : Describe in detail what you saw, what should've happened and what actually happened.
165 |
166 | This should be enough, but it'll be great if you can add more ;)
167 |
168 | ### Suggesting A Feature
169 | If you're here to make suggestions, please follow the basic syntax to post a request :
170 |
171 | **Subject** : Something that briefly tells us about the feature.
172 |
173 | **Long Explanation** : Describe in detail what you want and how you want it.
174 |
175 | This should be enough, but it'll be great if you can add more ;)
176 |
177 | ## Donations
178 | You can always send some money over from this :
179 |
180 | Paypal : [](https://www.paypal.me/xonshiz)
181 |
182 | Patreon Link : https://www.patreon.com/xonshiz
183 |
184 | Any amount is appreciated :)
185 |
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | argparse
2 | cfscrape
3 | bs4
4 | tqdm
5 |
--------------------------------------------------------------------------------