├── .github
└── FUNDING.yml
├── .gitignore
├── README.md
├── LICENSE
└── scrapper-notebook.ipynb
/.github/FUNDING.yml:
--------------------------------------------------------------------------------
1 | github: bhattbhavesh91
2 | custom: ['https://www.buymeacoffee.com/bhattbhavesh91']
3 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | MANIFEST
2 | build
3 | dist
4 | _build
5 | docs/man/*.gz
6 | docs/source/api/generated
7 | docs/source/config.rst
8 | docs/gh-pages
9 | notebook/i18n/*/LC_MESSAGES/*.mo
10 | notebook/i18n/*/LC_MESSAGES/nbjs.json
11 | notebook/static/components
12 | notebook/static/style/*.min.css*
13 | notebook/static/*/js/built/
14 | notebook/static/*/built/
15 | notebook/static/built/
16 | notebook/static/*/js/main.min.js*
17 | notebook/static/lab/*bundle.js
18 | node_modules
19 | *.py[co]
20 | __pycache__
21 | *.egg-info
22 | *~
23 | *.bak
24 | .ipynb_checkpoints
25 | .tox
26 | .DS_Store
27 | \#*#
28 | .#*
29 | .coverage
30 | .pytest_cache
31 | src
32 |
33 | *.swp
34 | *.map
35 | .idea/
36 | # Read the Docs
37 | config.rst
38 |
39 | /.project
40 | /.pydevproject
41 |
42 | package-lock.json
43 | geckodriver.log
44 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Scrape Twitter Data or Tweets in Python using the snscrape module
2 |
3 | **If you like my work, you can support me by buying me a coffee via the link below**
4 |
5 |
6 |
7 | ## Click to open the Notebook directly in Google Colab
8 | [Open in Google Colab](https://colab.research.google.com/github/bhattbhavesh91/twitter-scrapper-snscrape/blob/main/scrapper-notebook.ipynb)
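The linked notebook drives snscrape from Python by shelling out to its CLI rather than using a Python API. A minimal sketch of how that command string is assembled (assuming the snscrape 0.3.4 CLI syntax pinned in the notebook; `build_search_command` is an illustrative helper, not part of the repository):

```python
from datetime import date

def build_search_command(search_term, from_date, end_date,
                         outfile="result-tweets.txt"):
    # Mirrors the notebook's invocation: search for tweets matching
    # `search_term` between from_date and end_date, redirecting the
    # output (one tweet URL per line) to a text file.
    return (
        f"snscrape --since {from_date} "
        f"twitter-search '{search_term} until:{end_date}' > {outfile}"
    )

cmd = build_search_command("Bhavesh Bhatt Data Scientist",
                           "2019-01-01", date.today())
print(cmd)
```

The notebook then passes the resulting string to `os.system`, so the date filtering is done entirely by snscrape's `--since` flag and the `until:` search operator.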
9 |
10 | ## To view the video
11 |
12 |
13 |
14 |
15 |
16 |
17 |
18 | or click the link below
19 |
20 |
21 | [Watch on YouTube](http://www.youtube.com/watch?v=jOqLX_az_Tg)
23 |
24 | ## Follow Me
25 |
26 |
27 |
28 |
29 |
30 |
31 | Show your support by starring the repository 🙂
32 |
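The notebook counts matched tweets by redirecting snscrape's output to a text file and inspecting it. A hedged sketch of that counting step (the filename `result-tweets.txt` follows the notebook; using the frame's `size` as the row count assumes the single-column DataFrame the notebook builds):

```python
import os
import pandas as pd

def count_tweets(path="result-tweets.txt"):
    # snscrape writes one tweet URL per line; an empty or missing
    # file means the search matched nothing.
    if not os.path.exists(path) or os.stat(path).st_size == 0:
        return 0
    df = pd.read_csv(path, names=["link"])
    return int(df.size)
```

Because tweet URLs contain no commas, reading the redirected output with `pd.read_csv` yields one row per tweet, so `df.size` equals the number of scraped tweets.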
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Apache License
2 | Version 2.0, January 2004
3 | http://www.apache.org/licenses/
4 |
5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6 |
7 | 1. Definitions.
8 |
9 | "License" shall mean the terms and conditions for use, reproduction,
10 | and distribution as defined by Sections 1 through 9 of this document.
11 |
12 | "Licensor" shall mean the copyright owner or entity authorized by
13 | the copyright owner that is granting the License.
14 |
15 | "Legal Entity" shall mean the union of the acting entity and all
16 | other entities that control, are controlled by, or are under common
17 | control with that entity. For the purposes of this definition,
18 | "control" means (i) the power, direct or indirect, to cause the
19 | direction or management of such entity, whether by contract or
20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the
21 | outstanding shares, or (iii) beneficial ownership of such entity.
22 |
23 | "You" (or "Your") shall mean an individual or Legal Entity
24 | exercising permissions granted by this License.
25 |
26 | "Source" form shall mean the preferred form for making modifications,
27 | including but not limited to software source code, documentation
28 | source, and configuration files.
29 |
30 | "Object" form shall mean any form resulting from mechanical
31 | transformation or translation of a Source form, including but
32 | not limited to compiled object code, generated documentation,
33 | and conversions to other media types.
34 |
35 | "Work" shall mean the work of authorship, whether in Source or
36 | Object form, made available under the License, as indicated by a
37 | copyright notice that is included in or attached to the work
38 | (an example is provided in the Appendix below).
39 |
40 | "Derivative Works" shall mean any work, whether in Source or Object
41 | form, that is based on (or derived from) the Work and for which the
42 | editorial revisions, annotations, elaborations, or other modifications
43 | represent, as a whole, an original work of authorship. For the purposes
44 | of this License, Derivative Works shall not include works that remain
45 | separable from, or merely link (or bind by name) to the interfaces of,
46 | the Work and Derivative Works thereof.
47 |
48 | "Contribution" shall mean any work of authorship, including
49 | the original version of the Work and any modifications or additions
50 | to that Work or Derivative Works thereof, that is intentionally
51 | submitted to Licensor for inclusion in the Work by the copyright owner
52 | or by an individual or Legal Entity authorized to submit on behalf of
53 | the copyright owner. For the purposes of this definition, "submitted"
54 | means any form of electronic, verbal, or written communication sent
55 | to the Licensor or its representatives, including but not limited to
56 | communication on electronic mailing lists, source code control systems,
57 | and issue tracking systems that are managed by, or on behalf of, the
58 | Licensor for the purpose of discussing and improving the Work, but
59 | excluding communication that is conspicuously marked or otherwise
60 | designated in writing by the copyright owner as "Not a Contribution."
61 |
62 | "Contributor" shall mean Licensor and any individual or Legal Entity
63 | on behalf of whom a Contribution has been received by Licensor and
64 | subsequently incorporated within the Work.
65 |
66 | 2. Grant of Copyright License. Subject to the terms and conditions of
67 | this License, each Contributor hereby grants to You a perpetual,
68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69 | copyright license to reproduce, prepare Derivative Works of,
70 | publicly display, publicly perform, sublicense, and distribute the
71 | Work and such Derivative Works in Source or Object form.
72 |
73 | 3. Grant of Patent License. Subject to the terms and conditions of
74 | this License, each Contributor hereby grants to You a perpetual,
75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76 | (except as stated in this section) patent license to make, have made,
77 | use, offer to sell, sell, import, and otherwise transfer the Work,
78 | where such license applies only to those patent claims licensable
79 | by such Contributor that are necessarily infringed by their
80 | Contribution(s) alone or by combination of their Contribution(s)
81 | with the Work to which such Contribution(s) was submitted. If You
82 | institute patent litigation against any entity (including a
83 | cross-claim or counterclaim in a lawsuit) alleging that the Work
84 | or a Contribution incorporated within the Work constitutes direct
85 | or contributory patent infringement, then any patent licenses
86 | granted to You under this License for that Work shall terminate
87 | as of the date such litigation is filed.
88 |
89 | 4. Redistribution. You may reproduce and distribute copies of the
90 | Work or Derivative Works thereof in any medium, with or without
91 | modifications, and in Source or Object form, provided that You
92 | meet the following conditions:
93 |
94 | (a) You must give any other recipients of the Work or
95 | Derivative Works a copy of this License; and
96 |
97 | (b) You must cause any modified files to carry prominent notices
98 | stating that You changed the files; and
99 |
100 | (c) You must retain, in the Source form of any Derivative Works
101 | that You distribute, all copyright, patent, trademark, and
102 | attribution notices from the Source form of the Work,
103 | excluding those notices that do not pertain to any part of
104 | the Derivative Works; and
105 |
106 | (d) If the Work includes a "NOTICE" text file as part of its
107 | distribution, then any Derivative Works that You distribute must
108 | include a readable copy of the attribution notices contained
109 | within such NOTICE file, excluding those notices that do not
110 | pertain to any part of the Derivative Works, in at least one
111 | of the following places: within a NOTICE text file distributed
112 | as part of the Derivative Works; within the Source form or
113 | documentation, if provided along with the Derivative Works; or,
114 | within a display generated by the Derivative Works, if and
115 | wherever such third-party notices normally appear. The contents
116 | of the NOTICE file are for informational purposes only and
117 | do not modify the License. You may add Your own attribution
118 | notices within Derivative Works that You distribute, alongside
119 | or as an addendum to the NOTICE text from the Work, provided
120 | that such additional attribution notices cannot be construed
121 | as modifying the License.
122 |
123 | You may add Your own copyright statement to Your modifications and
124 | may provide additional or different license terms and conditions
125 | for use, reproduction, or distribution of Your modifications, or
126 | for any such Derivative Works as a whole, provided Your use,
127 | reproduction, and distribution of the Work otherwise complies with
128 | the conditions stated in this License.
129 |
130 | 5. Submission of Contributions. Unless You explicitly state otherwise,
131 | any Contribution intentionally submitted for inclusion in the Work
132 | by You to the Licensor shall be under the terms and conditions of
133 | this License, without any additional terms or conditions.
134 | Notwithstanding the above, nothing herein shall supersede or modify
135 | the terms of any separate license agreement you may have executed
136 | with Licensor regarding such Contributions.
137 |
138 | 6. Trademarks. This License does not grant permission to use the trade
139 | names, trademarks, service marks, or product names of the Licensor,
140 | except as required for reasonable and customary use in describing the
141 | origin of the Work and reproducing the content of the NOTICE file.
142 |
143 | 7. Disclaimer of Warranty. Unless required by applicable law or
144 | agreed to in writing, Licensor provides the Work (and each
145 | Contributor provides its Contributions) on an "AS IS" BASIS,
146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147 | implied, including, without limitation, any warranties or conditions
148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149 | PARTICULAR PURPOSE. You are solely responsible for determining the
150 | appropriateness of using or redistributing the Work and assume any
151 | risks associated with Your exercise of permissions under this License.
152 |
153 | 8. Limitation of Liability. In no event and under no legal theory,
154 | whether in tort (including negligence), contract, or otherwise,
155 | unless required by applicable law (such as deliberate and grossly
156 | negligent acts) or agreed to in writing, shall any Contributor be
157 | liable to You for damages, including any direct, indirect, special,
158 | incidental, or consequential damages of any character arising as a
159 | result of this License or out of the use or inability to use the
160 | Work (including but not limited to damages for loss of goodwill,
161 | work stoppage, computer failure or malfunction, or any and all
162 | other commercial damages or losses), even if such Contributor
163 | has been advised of the possibility of such damages.
164 |
165 | 9. Accepting Warranty or Additional Liability. While redistributing
166 | the Work or Derivative Works thereof, You may choose to offer,
167 | and charge a fee for, acceptance of support, warranty, indemnity,
168 | or other liability obligations and/or rights consistent with this
169 | License. However, in accepting such obligations, You may act only
170 | on Your own behalf and on Your sole responsibility, not on behalf
171 | of any other Contributor, and only if You agree to indemnify,
172 | defend, and hold each Contributor harmless for any liability
173 | incurred by, or claims asserted against, such Contributor by reason
174 | of your accepting any such warranty or additional liability.
175 |
176 | END OF TERMS AND CONDITIONS
177 |
178 | APPENDIX: How to apply the Apache License to your work.
179 |
180 | To apply the Apache License to your work, attach the following
181 | boilerplate notice, with the fields enclosed by brackets "[]"
182 | replaced with your own identifying information. (Don't include
183 | the brackets!) The text should be enclosed in the appropriate
184 | comment syntax for the file format. We also recommend that a
185 | file or class name and description of purpose be included on the
186 | same "printed page" as the copyright notice for easier
187 | identification within third-party archives.
188 |
189 | Copyright [yyyy] [name of copyright owner]
190 |
191 | Licensed under the Apache License, Version 2.0 (the "License");
192 | you may not use this file except in compliance with the License.
193 | You may obtain a copy of the License at
194 |
195 | http://www.apache.org/licenses/LICENSE-2.0
196 |
197 | Unless required by applicable law or agreed to in writing, software
198 | distributed under the License is distributed on an "AS IS" BASIS,
199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200 | See the License for the specific language governing permissions and
201 | limitations under the License.
202 |
--------------------------------------------------------------------------------
/scrapper-notebook.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "Copy of twitter-scrapper.ipynb",
7 | "provenance": [],
8 | "collapsed_sections": []
9 | },
10 | "kernelspec": {
11 | "name": "python3",
12 | "display_name": "Python 3"
13 | },
14 | "language_info": {
15 | "name": "python"
16 | }
17 | },
18 | "cells": [
19 | {
20 | "cell_type": "code",
21 | "metadata": {
22 | "id": "xwHaRJpruXsz"
23 | },
24 | "source": [
25 | "!pip install -q snscrape==0.3.4"
26 | ],
27 | "execution_count": null,
28 | "outputs": []
29 | },
30 | {
31 | "cell_type": "code",
32 | "metadata": {
33 | "id": "nD7B3adiubGG"
34 | },
35 | "source": [
36 | "import os\n",
37 | "import pandas as pd\n",
38 | "from datetime import date"
39 | ],
40 | "execution_count": null,
41 | "outputs": []
42 | },
43 | {
44 | "cell_type": "code",
45 | "metadata": {
46 | "id": "uNhZNSNavwGQ"
47 | },
48 | "source": [
49 | "today = date.today()\n",
50 | "end_date = today"
51 | ],
52 | "execution_count": null,
53 | "outputs": []
54 | },
55 | {
56 | "cell_type": "code",
57 | "metadata": {
58 | "id": "lUFINzS2yNem"
59 | },
60 | "source": [
61 | "search_term = 'Bhavesh Bhatt Data Scientist'\n",
62 | "from_date = '2019-01-01'"
63 | ],
64 | "execution_count": null,
65 | "outputs": []
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {
70 | "id": "dx9KjHTi4Fxj"
71 | },
72 | "source": [
73 | "# Total Number of Tweets for Search Terms"
74 | ]
75 | },
76 | {
77 | "cell_type": "code",
78 | "metadata": {
79 | "colab": {
80 | "base_uri": "https://localhost:8080/"
81 | },
82 | "id": "I0fFOKpLwBGF",
83 | "outputId": "b2a02763-e7be-4f3a-945a-316e9337109d"
84 | },
85 | "source": [
86 | "os.system(f\"snscrape --since {from_date} twitter-search '{search_term} until:{end_date}' > result-tweets.txt\")\n",
87 | "if os.stat(\"result-tweets.txt\").st_size == 0:\n",
88 | " counter = 0\n",
89 | "else:\n",
90 | " df = pd.read_csv('result-tweets.txt', names=['link'])\n",
91 | " counter = df.size\n",
92 | "\n",
93 | "print('Number Of Tweets : '+ str(counter))"
94 | ],
95 | "execution_count": null,
96 | "outputs": [
97 | {
98 | "output_type": "stream",
99 | "name": "stdout",
100 | "text": [
101 | "Number Of Tweets : 3\n"
102 | ]
103 | }
104 | ]
105 | },
106 | {
107 | "cell_type": "markdown",
108 | "metadata": {
109 | "id": "iFN4VxqK4KDU"
110 | },
111 | "source": [
112 |         "# Extracting Exact Tweets"
113 | ]
114 | },
115 | {
116 | "cell_type": "code",
117 | "metadata": {
118 | "id": "GGUawFA0xTG3"
119 | },
120 | "source": [
121 | "max_results = 100"
122 | ],
123 | "execution_count": null,
124 | "outputs": []
125 | },
126 | {
127 | "cell_type": "code",
128 | "metadata": {
129 | "colab": {
130 | "base_uri": "https://localhost:8080/"
131 | },
132 | "id": "AWseHfDmwYCw",
133 | "outputId": "6fd2cab6-d879-4517-c6e0-53a185fe8f0f"
134 | },
135 | "source": [
136 | "extracted_tweets = \"snscrape --format '{content!r}'\"+ f\" --max-results {max_results} --since {from_date} twitter-search '{search_term} until:{end_date}' > extracted-tweets.txt\"\n",
137 | "os.system(extracted_tweets)\n",
138 | "if os.stat(\"extracted-tweets.txt\").st_size == 0:\n",
139 | " print('No Tweets found')\n",
140 | "else:\n",
141 | " df = pd.read_csv('extracted-tweets.txt', names=['content'])\n",
142 |         "    for row in df['content'].items():\n",
143 | " print(row)"
144 | ],
145 | "execution_count": null,
146 | "outputs": [
147 | {
148 | "output_type": "stream",
149 | "name": "stdout",
150 | "text": [
151 | "((\"'🚨Tech Talk of the Week alert!\\\\n\\\\nLearn about TensorFlow Hub by joining the session hosted by Machine Learning GDE\", ' and Data Scientist'), \" Bhavesh Bhatt.\\\\n\\\\nThis event will start at 8 PM Oman time\\\\n\\\\nJoin by clicking the link below!\\\\nhttps://t.co/CO6S9i6NfL\\\\n\\\\n#CourageToCreate #IWD #WTM https://t.co/4zKKFtfuhC https://t.co/qs8N3SwYW3'\")\n",
152 | "((\"'This Sunday\", ' April 18\\\\n\\\\n💾Tech Talk Time!\\\\n\\\\nFrom 8 to 9PM GST'), ' learn about TFHub\\\\n\\\\nTensorFlow Hub is an open repository & library for reusable machine learning. Come join Machine Learning GDE & Data Scientist')\n",
153 | "((\"'Data Science Career | How to Transition to Data Science with Data Scientist Bhavesh Bhatt |GreyAtom https://t.co/ocZCP6LCeM'\", nan), nan)\n"
154 | ]
155 | }
156 | ]
157 | },
158 | {
159 | "cell_type": "markdown",
160 | "metadata": {
161 | "id": "MWWTBnwS4Ofr"
162 | },
163 | "source": [
164 | "# Extracting Tweets from Users"
165 | ]
166 | },
167 | {
168 | "cell_type": "code",
169 | "metadata": {
170 | "colab": {
171 | "base_uri": "https://localhost:8080/"
172 | },
173 | "id": "uFBwPec2zSr6",
174 | "outputId": "106991a9-2325-48c7-f650-dc948c510b8e"
175 | },
176 | "source": [
177 | "user_name = \"_bhaveshbhatt\"\n",
178 |         "user_tweets = \"snscrape --format '{content!r}'\"+ f\" --max-results {max_results} --since {from_date} twitter-user '{user_name}' > user-tweets.txt\"\n",
179 | "os.system(user_tweets)\n",
180 | "if os.stat(\"user-tweets.txt\").st_size == 0:\n",
181 | " print('No Tweets found')\n",
182 | "else:\n",
183 | " df = pd.read_csv('user-tweets.txt', names=['content'])\n",
184 |         "    for row in df['content'].items():\n",
185 | " print(row)"
186 | ],
187 | "execution_count": null,
188 | "outputs": [
189 | {
190 | "output_type": "stream",
191 | "text": [
192 | "(0, \"@iaveshh @huggingface I'm still exploring the M1 so don't want to create a video for the sake of creating it. I'll create a video once I have enough data points.\")\n",
193 | "(1, \"@iaveshh Hey Avesh, I'm still exploring the device so won't be able to comment right away but I'll definitely make a video on this soon :)\")\n",
194 | "(2, \"'55% battery consumed with ~8 hours of screen on time with close to 4-5 hours of 4k video consumption.\\\\n\\\\nIs this a dream or are we living in the future? \\\\n\\\\n#mindblown #m1 #mac #M1Mac #MacBookAir #macbook https://t.co/uKnYUTdqPI'\")\n",
195 | "(3, \"'The M1 MacBook Air is a Beast. I just rendered a 2 minute video in less than 35 seconds.\\\\n\\\\n#mindblown #m1 #mac #M1Mac #videoediting #MacBookAir #macbook https://t.co/5tFBdI1B9G'\")\n",
196 | "(4, \"'@RsreeTech @PrabhhavSharma Thanks for sharing this :)'\")\n",
197 | "(5, \"'Just received some super amazing package from #Google for the talk that I gave during #GoogleIO this year🙂\\\\n\\\\nThank you @GoogleDevsIN & @GoogleDevExpert 🙂\\\\n\\\\n#GDE https://t.co/WsI2ChK56B'\")\n",
198 | "(6, \"@PrabhhavSharma I'm without a laptop currently 😛 As soon as I buy a laptop, I'll create a video on it soon!\")\n",
199 | "(7, \"'Lenovo Thinkpad Yoga 14 was a friend that helped me for 5.5 long years. This very laptop helped me create 300+ videos & propelled my career growth. You will definitely be missed. Thank you #Lenovo for creating such a solid product. https://t.co/6lQ993hHnv'\")\n",
200 | "(8, '\\'Chandler Bing\\\\\\'s job was \"Data Reconfiguration & Statistical Factoring\" so officially he became a Data Scientist long back😂\\\\n\\\\n#AI #ArtificialIntelligence #ML #MachineLearning #DeepLearning #DataScience #FriendsReunion\\'')\n",
201 | "(9, \"It's getting to that stage that by the time you learn a technique in ML & then a year later it's obsolete. LSTM is now made obsolete all thanks to Transformers. Its tough to keep up with the advances.\\\\n\\\\n#AI #ArtificialIntelligence #ML #MachineLearning #DeepLearning #DataScience\")\n",
202 | "(10, \"It’s finally happening. I'm thrilled to be featured at #GoogleIO this year in a lighting talk on Landmark Detection using TensorFlow Hub.\\\\n\\\\nIt's free & virtual. Click here to register: \\\\nhttps://t.co/AJlXOijvow\\\\n\\\\nSee you there :)\\\\n\\\\n@GoogleDevExpert @GDGIndia @googledevs @GoogleDevsIN\")\n",
203 | "(11, \"@ankur_rangi The API calls are being blocked! I'm trying to figure out a way!\")\n",
204 | "(12, \"'@philipvollet Thank you 🙏'\")\n",
205 | "(13, \"'30K Subscribers! Thank You🙏\\\\n\\\\n#YouTube #youtubechannel #DataScience #MachineLearning #AI #ArtificialIntelligence #DeepLearning https://t.co/YR0o0ss0lr'\")\n",
206 | "(14, \"'@bengaluruu @rnaik21 Try now!'\")\n",
207 | "(15, \"'@malind Sure! I will add that feature'\")\n",
208 | "(16, \"@sss26888 I haven't tried it! I will try to add this feature soon.\")\n",
209 | "(17, \"'@RameshYesh Look at the recent commits :)'\")\n",
210 | "(18, \"'New Video : Derivative of the Tanh Activation function in Deep Learning\\\\n\\\\nVideo Link : https://t.co/DyV6f2cG9z\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence'\")\n",
211 | "(19, \"'NVIDIA GTC 21 is live now\\\\n\\\\n#GTC21 #GTCwithMe #Nvidia https://t.co/R78aLU9vuK'\")\n",
212 | "(20, \"New Video : @streamlit application summarize text on blogs, websites using @huggingface's summarization pipeline.\\\\n\\\\nVideo Link : https://t.co/qbTmdABC0z\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/gPafBU5Iln\")\n",
213 | "(21, \"'haha 😂 https://t.co/aFDKhFC41I'\")\n",
214 | "(22, \"New Video : @streamlit application to answer questions regarding your text using @huggingface's q&a pipeline.\\\\n\\\\nVideo Link : https://t.co/AB9couvIDs\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/kWPrfqpAmY\")\n",
215 | "(23, \"'New Video : @streamlit application to detect tables in document images such as invoices\")\n",
216 | "(24, \"'Palindrome Subscribers :)\\\\n\\\\n#YouTube #youtubechannel #DataScience #MachineLearning #AI #ArtificialIntelligence #DeepLearning https://t.co/Qqfs9W8dSm'\")\n",
217 | "(25, \"'NEW Video - Automatic Regression using TuringBot software.\\\\n\\\\nVideo Link : https://t.co/IVnWnGHJxw\\\\n\\\\n#DataScience #MachineLearning #ArtificialIntelligence #AI #YouTube #DeepLearning https://t.co/UgkY2ZBXlc'\")\n",
218 | "(26, \"'New Video : I created a #Bitcoin tracking application with 20 lines of Python code using @streamlit \\\\n\\\\nVideo Link : https://t.co/IwZkdFmgFW\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/oslKpc9FRq'\")\n",
219 | "(27, \"'NEW Video - Answer Questions related to a table using @huggingface Transformers Pipeline.\\\\n\\\\nVideo Link : https://t.co/5rynXzNruu\\\\n\\\\n#DataScience #MachineLearning #ArtificialIntelligence #AI #YouTube #DeepLearning #NLP https://t.co/XWEvwbf1J1'\")\n",
220 | "(28, \"'NEW Video - Sentiment Analysis using @huggingface Transformers Pipeline\\\\n\\\\nVideo Link : https://t.co/A8epjff4VA\\\\n\\\\n#DataScience #MachineLearning #ArtificialIntelligence #AI #YouTube #DeepLearning #NLP https://t.co/3sVMlWnEPA'\")\n",
221 | "(29, \"'NEW Video - NVIDIA Deep Learning Institute (DLI) Course Giveaway Results.\\\\n\\\\nVideo Link : https://t.co/mtI5D4w5i0 \\\\n\\\\n#DataScience #MachineLearning #ArtificialIntelligence #AI #YouTube #DeepLearning https://t.co/Hko3Zi5krh'\")\n",
222 | "(30, \"'NEW Video - Free @nvidia GTC21 Conference + #nvidia Deep Learning Institute Course Giveaway.\\\\n\\\\nVideo Link : https://t.co/XtrJLp2505\\\\n\\\\n@NVIDIAAI @NVIDIAGTC #nvidia #GTC21 #AI #GTCwithme #Conference #DataScience https://t.co/BGFRrmSOUm'\")\n",
223 | "(31, \"I am planning to do a small giveaway as I'm about to reach 29000 subscribers.\\\\n\\\\nThe giveaway will help you in your career going forward. \\\\n\\\\nStay tuned for my next video :)\\\\n\\\\n#DataScience #MachineLearning #ArtificialIntelligence #AI #YouTube #DeepLearning https://t.co/Hm8mwg5fiT\")\n",
224 | "(32, \"'NEW Video - Optical Character Recognition (OCR) in Python using keras-ocr\\\\n\\\\nVideo Link : https://t.co/ZlH5gkQvEL\\\\n\\\\n#MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/VRmVrVVpWZ'\")\n",
225 | "(33, \"'NEW Video - Remove Background Noise using #NVIDIA RTX Voice.\\\\n\\\\nVideo Link : https://t.co/vHVKWFUY2Y\\\\n\\\\n#MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/JOjIMEUeHY'\")\n",
226 | "(34, \"'When you get @TensorFlow swags 😎\\\\n\\\\nThank you @GoogleDevsIN & @GoogleDevExpert 😀 \\\\n\\\\n#GDE https://t.co/2iT0IJ9wEE'\")\n",
227 | "(35, \"'Apply Quickly :) #AcceleratedWithGoogle\\\\n\\\\n@GoogleDevsIN @GoogleDevExpert https://t.co/o8l1Ale2DH'\")\n",
228 | "(36, \"'NEW Video - TensorGram : Telegram bot to receive Deep Learning model training updates on your mobile.\\\\n\\\\nVideo Link : https://t.co/TwI835oZMw\\\\n\\\\n#MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/42Q49j5pTf'\")\n",
229 | "(37, \"'https://t.co/fO6aehIwnZ'\")\n",
230 | "(38, \"New Video : Accelerate NumPy operations using @TensorFlow's tf.experimental.numpy module & @nvidia's GPUs\\\\n\\\\nVideo Link : https://t.co/Snhi5iw8i1 \\\\n\\\\n@GoogleDevExpert @GoogleDevExpert \\\\n\\\\n#ArtificialIntelligence #MachineLearning #Python #AI https://t.co/iQ6onNGzhH\")\n",
231 | "(39, \"New Video : Is this the BEST BOOK on Google's BERT?\\\\n\\\\nVideo Link : https://t.co/5GhN3XEEe7\\\\n\\\\n#MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/Q9tyz99lJ7\")\n",
232 | "(40, \"'NVIDIA is definitely increasing AI adoption. NVIDIA just announced : NVIDIA AI Enterprise\")\n",
233 | "(41, \"'Video Tutorial : A simple spelling & grammar checker web application using Python \\\\n\\\\nVideo Link : https://t.co/gU7AP6x3iu\\\\n\\\\n#ArtificialIntelligence #MachineLearning #AI #DataScience #NLP https://t.co/TW5lSYyy0g'\")\n",
234 | "(42, \"NEW Video - Easy to use extractive text summarization using bert-extractive-summarizer which uses @huggingface's neuralcoref library.\\\\n\\\\nVideo Link : https://t.co/ocysr4zgFh\\\\n\\\\n#MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/G6RS4tdloU\")\n",
235 | "(43, \"'I created a mini version of Grammarly using #Python & @streamlit.\\\\n\\\\nVideo Tutorial coming soon :)\\\\n\\\\n#ArtificialIntelligence #MachineLearning #AI #DataScience #NLP https://t.co/HVCqTgyF0S'\")\n",
236 | "(44, \"'NEW Video - NVIDIA Jarvis\")\n",
237 | "(45, '\\'Charles Cooley : \"I am not what I think I am')\n",
238 | "(46, \"'NEW Video - Notify : Jupyter Extension For Browser Notifications of Cell Completion in Jupyter Notebook\\\\n\\\\nVideo Link : https://t.co/7pJKQbUIJs\\\\n\\\\n#MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/pa1q4He5Mc'\")\n",
239 | "(47, \"NEW Video : @facebookai's mBART-50 using @huggingface's transformer for Multilingual Language Translation.\\\\n\\\\nVideo Link : https://t.co/wADHZYLnqR\\\\n\\\\n#NLP #MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/ok6I7aiGK2\")\n",
240 | "(48, \"'I created a @streamlit application that translates from Hindi to English. A Big Shoutout to @huggingface & @facebookai for open sourcing mBART-50 which helps in Translating text to\")\n",
241 | "(49, \"'Launch VSCode (codeserver) on Google Colab using ColabCode!\\\\n\\\\nThank you @abhi1thakur for creating this :)\\\\n\\\\nVideo Link : https://t.co/xTQFeQBdNS\\\\n\\\\n#deployment #google #vscode #machinelearning #DataScience https://t.co/WUF9cmRXmy'\")\n",
242 | "(50, \"'12-02-2021 - Palindrome Day!'\")\n",
243 | "(51, \"@facebookai's Wav2Vec 2.0 using @huggingface's transformer 4.3.0 for Automatic Speech Recognition.\\\\n\\\\nNow, you can transcribe your audio files directly on the hub using Wav2Vec2.\\\\n\\\\nVideo Link : https://t.co/E0omr18jIx\\\\n\\\\n#NLP #MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/mXww5PQNHs\")\n",
244 | "(52, \"'@MdArif34 Thank you so much!'\")\n",
245 | "(53, \"'1000 Followers on GitHub 🙏🙏\\\\n\\\\n#github #MachineLearning #AI #YouTube https://t.co/puNy3UuSLk'\")\n",
246 | "(54, \"FREE Data Science Online Course for Absolute Beginner's\\\\n\\\\nVideo Link : https://t.co/FyWRS4iANO\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/yeOeBxsaZ6\")\n",
247 | "(55, \"'A Big Big Thank You to @GoogleDevsIN & @GoogleDevExpert for helping me have a higher quality online presence by providing me..🙂\\\\n- Blue Yeti USB Mic + Pop Filter\\\\n- Lenovo HD Web Camera\\\\n- Ring Light with Stand \\\\n\\\\n#GDE #DataScience #MachineLearning #AI #YouTube #youtubechannel https://t.co/azhYLRI2Hh'\")\n",
248 | "(56, \"'From 36 All out to winning the test series. \\\\nHistory has been made ✌️✌️✌️✌️✌️\\\\n#AUSvIND #GabbaTest #Gabba #AUSvINDtest'\")\n",
249 | "(57, \"'History will definitely be made today 🇮🇳🤞\\\\n#AUSvIND #GabbaTest #Gabba #AUSvINDtest'\")\n",
250 | "(58, \"'Best Data Science Certifications from Google to consider in 2021.\\\\n\\\\nVideo Link : https://t.co/hCiKd8K2Qb\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning https://t.co/EvGwLkL0ys'\")\n",
251 | "(59, \"'Will AutoML replace a Data Scientist?\\\\n\\\\nVideo Link : https://t.co/G3EBbGU6Jn\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning https://t.co/jwjItpuj7P'\")\n",
252 | "(60, \"'Faster Pandas Operation using PyPolars\\\\n\\\\nVideo Link : https://t.co/T2ohYevMIC\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/ie4rTD1uB5'\")\n",
253 | "(61, \"'Language Identification using Google Compact Language Detector v3 (CLD3)\")\n",
254 | "(62, \"'No Facebook\")\n",
255 | "(63, \"'TextHero : Simplest way to Clean & Analyze Text Data in Pandas\\\\n\\\\nVideo Link : https://t.co/757ENLb0Sr\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/MVDUXY8Bon'\")\n",
256 | "(64, \"Cleaning Text Data using Python's Clean-Text Library\\\\n\\\\nVideo Link : https://t.co/FBnoWvWLVP\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #artificialitellegence https://t.co/0ea3t6OqA9\")\n",
257 | "(65, \"'Simplest Example to Gamma & C Hyper parameters of SVM.\\\\n\\\\nVideo Link : https://t.co/QwAx7ATiZp\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/6cZwaBaQsS'\")\n",
258 | "(66, \"'Auto_TS : Automatically build multiple Time Series models using a Single Line of Code.\\\\n\\\\nVideo Link : https://t.co/k6DxUGy7ks\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/37ZCHhncqo'\")\n",
259 | "(67, \"'The insights that a single alluvial diagram can give is amazing 😀 https://t.co/FsNFhj0mv9'\")\n",
260 | "(68, \"'Topic Modeling with BERT using Top2Vec.\\\\n\\\\nVideo Link : https://t.co/U8Rrj9wpEq\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence #NLP https://t.co/YUSxDAH95p'\")\n",
261 | "(69, \"'I had an amazing experience being interviewed by @tejakkuntla on the Exploiting Podcast. \\\\n\\\\nFind the episode at: https://t.co/CYMcXBpjBp\\\\n\\\\n#MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/a3a87woKyq'\")\n",
262 | "(70, \"Today's draw feels like a win! 🙏 #INDvAUS\")\n",
263 | "(71, \"'Lazy Predict for ML Models (AutoML)\\\\n\\\\nVideo Link : https://t.co/6vAqzhVkgZ\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/A6GLAHvcz6'\")\n",
264 | "(72, \"'Regression using Multivariate Adaptive Regression Splines (MARS)\\\\n\\\\nVideo Link : https://t.co/Mm0UJk0fQW\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/8SaJk5b95s'\")\n",
265 | "(73, \"'Topic Modeling with BERT using BERTopic.\\\\n\\\\nVideo Link : https://t.co/0lk1zSiqBF\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence #NLP https://t.co/R5tRcbVz5W'\")\n",
266 | "(74, \"'OpenAI has created the DALL-E model\")\n",
267 | "(75, \"'Low Light Image Enhancement using Python & Deep Learning\\\\n\\\\nVideo Link : https://t.co/AImtQsbWGe\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/dUhtN47A3H'\")\n",
268 | "(76, \"'Visualize Python Code Execution.\\\\n\\\\nVideo Link : https://t.co/h7ORg619eZ\\\\n\\\\n#programmer #Python #Python3 #pythonlearning #pythoncode #SoftwareEngineer https://t.co/Kgk3pDIs75'\")\n",
269 | "(77, \"Happy New Year everyone 🥳 I hope 2021 turns out to be the best year of your life & your family too. 2020 was a hard year because of COVID-19. Let's hope 2021 brings an end to this COVID-19 menace. Please stay safe & look after one another. Bhavesh ✨✨\")\n",
270 | "(78, \"In 2021, I want to get my hands on the @YouTube's Silver Play Button ✨✨🤞🤞\\\\n\\\\n#YouTube #youtubechannel #DataScience #MachineLearning #AI #ArtificialIntelligence #DeepLearning\")\n",
271 | "(79, \"'@Arth_Soni242001 This blog should help - https://t.co/GFgbKLD70y'\")\n",
272 | "(80, \"'Wishing all of you & your families a Merry Christmas & Happy Holidays 🎅 🎄\\\\n\\\\nA lot happened in 2020\\\\n- I created 115 videos & gained close to 19k subscribers 🙏\\\\n- I was awarded the 40 Under 40 Data Scientist award.\\\\n\\\\nThank you all for the support in 2020 & looking forward to 2021😊'\")\n",
273 | "(81, \"@Arth_Soni242001 @GoogleDevsIN @GoogleDevExpert @sidagarwal04 I'm a GDE in Machine Learning Arth so we get rewarded for the community work that we do by Google :)\")\n",
274 | "(82, \"'Received some more awesome year end gifts from Google :) Thank you @GoogleDevsIN \")\n",
275 | "(83, \"'I have recently developed a fondness towards Palindrome Numbers\")\n",
276 | "(84, \"'My small tutorial on BERT based MuRIL (Multilingual Representations for Indian Languages) by @GoogleIndia\\\\n\\\\nVideo Link : https://t.co/neEL9AT8ob\\\\n\\\\n@GoogleDevsIN @GoogleDevExpert\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence #NLP https://t.co/NpoXNFpjiy'\")\n",
277 | "(85, \"'#Bitcoin smashed through $20\")\n",
278 | "(86, \"'Simplest Example to explain the advantages of BERT over Word2Vec models.\\\\n\\\\nVideo Link : https://t.co/OvG1ckHZKa\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/xC8tqr5vwJ'\")\n",
279 | "(87, \"'Received some amazing year end gifts :) Thank you @GoogleDevsIN \")\n",
280 | "(88, \"'Scrape HTML tables easily with a Button Click using Python!\\\\n\\\\nVideo Link : https://t.co/1ttzsSw7ua\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/k6GOTri66c'\")\n",
281 | "(89, \"I sometimes can't imagine my life without Google Colab! 🙂 https://t.co/z2AnzwjW2G\")\n",
282 | "(90, \"'Turn Your Photos into Pencil Drawing/Sketch Easily using Python.\\\\n\\\\nVideo Link : https://t.co/svkDKfxB4n\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning #ArtificialIntelligence https://t.co/uEWHMtKs0x'\")\n",
283 | "(91, \"'Fuzzy String Matching with BERT\")\n",
284 | "(92, \"'For all of you who wanted to support me directly for my YouTube videos.\\\\n\\\\nI have activated the YouTube membership option. Do check out this video for more information 🙂\\\\n\\\\nLink : https://t.co/7PDERftqlK\\\\n\\\\n#DataScience #MachineLeraning #YouTube #ArtificialIntelligence #teaching https://t.co/wiWteSMIDW'\")\n",
285 | "(93, \"'Learn how to create Ridgeline plots which shows the distribution of a numeric value for several groups. \\\\n\\\\nVideo Link : https://t.co/H9vssCWslK\\\\n\\\\n#DeepLearning #DataScience #Python #AI #MachineLearning https://t.co/UaRWEWVdK8'\")\n",
286 | "(94, \"'Elegant Neural Network User Interface to build drag-and-drop neural networks\")\n",
287 | "(95, \"'@AiCodist Docly is surely gonna go places :) Thanks for creating it!'\")\n",
288 | "(96, \"Auto-Generate Python Comments/Documentation using with @AiCodist's Docly\\\\n\\\\nVideo Link : https://t.co/FO8E8jQ80U\\\\n\\\\n#Docly #NLP #MachineLearning #DeepLearning #DataScience #Python #AI https://t.co/2Fnj2xmcgG\")\n",
289 | "(97, \"'The small details of his videos 🙏 https://t.co/OTav4A2Wny'\")\n",
290 | "(98, \"I started a channel with the aim of teaching Python in 2020 & I just realized that it has crossed 2000 subscriber mark 🙂\\\\n\\\\nDo check the channel if you haven't already : https://t.co/ZeTDCXoMHI\\\\n\\\\n#Python #DataScience #MachineLearning #AI #DeepLearning #youtubechannel https://t.co/XwQwy2FOlP\")\n",
291 | "(99, \"November, 2017 was when I uploaded my first video on YouTube. Very little did I know that in exactly 3 years, I'll gain 25K Subscribers! A Big Thank You to all of you :)\\\\n\\\\n#DataScience #MachineLearning #AI #Python #youtubechannel #YouTube #YouTuber #YouTubers https://t.co/kIIUHArXkJ\")\n"
292 | ],
293 | "name": "stdout"
294 | }
295 | ]
296 | },
297 | {
298 | "cell_type": "code",
299 | "metadata": {
300 | "id": "d8uHfSmwzhDG"
301 | },
302 | "source": [
303 | ""
304 | ],
305 | "execution_count": null,
306 | "outputs": []
307 | }
308 | ]
309 | }
--------------------------------------------------------------------------------