├── CreativeCommonsLicense.txt
├── README.md
├── SEANCE User Manual (revised 6-18-2018).docx
├── SEANCE_1_2_0.py
├── components
│   ├── C1.txt
│   ├── C10.txt
│   ├── C11.txt
│   ├── C12.txt
│   ├── C13.txt
│   ├── C14.txt
│   ├── C15.txt
│   ├── C16.txt
│   ├── C17.txt
│   ├── C18.txt
│   ├── C19.txt
│   ├── C2.txt
│   ├── C20.txt
│   ├── C3.txt
│   ├── C4.txt
│   ├── C5.txt
│   ├── C6.txt
│   ├── C7.txt
│   ├── C8.txt
│   └── C9.txt
├── data_files
│   ├── SEANCE_Icon.icns
│   ├── __pycache__
│   │   └── vaderSentiment.cpython-36.pyc
│   ├── affective_list.txt
│   ├── affective_norms.txt
│   ├── e_lemma_py_format_lower.txt
│   ├── emoji_utf8_lexicon.txt
│   ├── inquirerbasic.txt
│   ├── negative_words.txt
│   ├── positive_words.txt
│   ├── senticnet_data.txt
│   ├── vaderSentiment.py
│   └── vader_lexicon.txt
└── en_core_web_sm
    ├── accuracy.json
    ├── meta.json
    ├── ner
    │   ├── cfg
    │   ├── lower_model
    │   ├── moves
    │   ├── tok2vec_model
    │   └── upper_model
    ├── parser
    │   ├── cfg
    │   ├── lower_model
    │   ├── moves
    │   ├── tok2vec_model
    │   └── upper_model
    ├── tagger
    │   ├── cfg
    │   ├── model
    │   └── tag_map
    ├── tokenizer
    └── vocab
        ├── key2row
        ├── lexemes.bin
        ├── strings.json
        └── vectors

/CreativeCommonsLicense.txt:
--------------------------------------------------------------------------------
1 | Use of SEANCE is governed by the following license:
2 | 
3 | Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Public License
4 | Please see https://creativecommons.org/licenses/by-nc-sa/4.0/ for more information (or keep reading)
5 | 
6 | This is a human-readable summary of (and not a substitute for) the license.
7 | You are free to:
8 | Share — copy and redistribute the material in any medium or format
9 | Adapt — remix, transform, and build upon the material
10 | The licensor cannot revoke these freedoms as long as you follow the license terms.
11 | Under the following terms:
12 | Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
13 | NonCommercial — You may not use the material for commercial purposes.
14 | ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
15 | No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
16 | 
17 | 
18 | By exercising the Licensed Rights (defined below), You accept and agree to be bound by the terms and conditions of this Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Public License ("Public License"). To the extent this Public License may be interpreted as a contract, You are granted the Licensed Rights in consideration of Your acceptance of these terms and conditions, and the Licensor grants You such rights in consideration of benefits the Licensor receives from making the Licensed Material available under these terms and conditions.
19 | 
20 | Section 1 – Definitions.
21 | 
22 | Adapted Material means material subject to Copyright and Similar Rights that is derived from or based upon the Licensed Material and in which the Licensed Material is translated, altered, arranged, transformed, or otherwise modified in a manner requiring permission under the Copyright and Similar Rights held by the Licensor. For purposes of this Public License, where the Licensed Material is a musical work, performance, or sound recording, Adapted Material is always produced where the Licensed Material is synched in timed relation with a moving image.
23 | Adapter's License means the license You apply to Your Copyright and Similar Rights in Your contributions to Adapted Material in accordance with the terms and conditions of this Public License. 24 | BY-NC-SA Compatible License means a license listed at creativecommons.org/compatiblelicenses, approved by Creative Commons as essentially the equivalent of this Public License. 25 | Copyright and Similar Rights means copyright and/or similar rights closely related to copyright including, without limitation, performance, broadcast, sound recording, and Sui Generis Database Rights, without regard to how the rights are labeled or categorized. For purposes of this Public License, the rights specified in Section 2(b)(1)-(2) are not Copyright and Similar Rights. 26 | Effective Technological Measures means those measures that, in the absence of proper authority, may not be circumvented under laws fulfilling obligations under Article 11 of the WIPO Copyright Treaty adopted on December 20, 1996, and/or similar international agreements. 27 | Exceptions and Limitations means fair use, fair dealing, and/or any other exception or limitation to Copyright and Similar Rights that applies to Your use of the Licensed Material. 28 | License Elements means the license attributes listed in the name of a Creative Commons Public License. The License Elements of this Public License are Attribution, NonCommercial, and ShareAlike. 29 | Licensed Material means the artistic or literary work, database, or other material to which the Licensor applied this Public License. 30 | Licensed Rights means the rights granted to You subject to the terms and conditions of this Public License, which are limited to all Copyright and Similar Rights that apply to Your use of the Licensed Material and that the Licensor has authority to license. 31 | Licensor means the individual(s) or entity(ies) granting rights under this Public License. 32 | NonCommercial means not primarily intended for or directed towards commercial advantage or monetary compensation. For purposes of this Public License, the exchange of the Licensed Material for other material subject to Copyright and Similar Rights by digital file-sharing or similar means is NonCommercial provided there is no payment of monetary compensation in connection with the exchange. 33 | Share means to provide material to the public by any means or process that requires permission under the Licensed Rights, such as reproduction, public display, public performance, distribution, dissemination, communication, or importation, and to make material available to the public including in ways that members of the public may access the material from a place and at a time individually chosen by them. 34 | Sui Generis Database Rights means rights other than copyright resulting from Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases, as amended and/or succeeded, as well as other essentially equivalent rights anywhere in the world. 35 | You means the individual or entity exercising the Licensed Rights under this Public License. Your has a corresponding meaning. 36 | Section 2 – Scope. 37 | 38 | License grant. 
39 | Subject to the terms and conditions of this Public License, the Licensor hereby grants You a worldwide, royalty-free, non-sublicensable, non-exclusive, irrevocable license to exercise the Licensed Rights in the Licensed Material to: 40 | reproduce and Share the Licensed Material, in whole or in part, for NonCommercial purposes only; and 41 | produce, reproduce, and Share Adapted Material for NonCommercial purposes only. 42 | Exceptions and Limitations. For the avoidance of doubt, where Exceptions and Limitations apply to Your use, this Public License does not apply, and You do not need to comply with its terms and conditions. 43 | Term. The term of this Public License is specified in Section 6(a). 44 | Media and formats; technical modifications allowed. The Licensor authorizes You to exercise the Licensed Rights in all media and formats whether now known or hereafter created, and to make technical modifications necessary to do so. The Licensor waives and/or agrees not to assert any right or authority to forbid You from making technical modifications necessary to exercise the Licensed Rights, including technical modifications necessary to circumvent Effective Technological Measures. For purposes of this Public License, simply making modifications authorized by this Section 2(a)(4) never produces Adapted Material. 45 | Downstream recipients. 46 | Offer from the Licensor – Licensed Material. Every recipient of the Licensed Material automatically receives an offer from the Licensor to exercise the Licensed Rights under the terms and conditions of this Public License. 47 | Additional offer from the Licensor – Adapted Material. Every recipient of Adapted Material from You automatically receives an offer from the Licensor to exercise the Licensed Rights in the Adapted Material under the conditions of the Adapter’s License You apply. 48 | No downstream restrictions. You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, the Licensed Material if doing so restricts exercise of the Licensed Rights by any recipient of the Licensed Material. 49 | No endorsement. Nothing in this Public License constitutes or may be construed as permission to assert or imply that You are, or that Your use of the Licensed Material is, connected with, or sponsored, endorsed, or granted official status by, the Licensor or others designated to receive attribution as provided in Section 3(a)(1)(A)(i). 50 | Other rights. 51 | 52 | Moral rights, such as the right of integrity, are not licensed under this Public License, nor are publicity, privacy, and/or other similar personality rights; however, to the extent possible, the Licensor waives and/or agrees not to assert any such rights held by the Licensor to the limited extent necessary to allow You to exercise the Licensed Rights, but not otherwise. 53 | Patent and trademark rights are not licensed under this Public License. 54 | To the extent possible, the Licensor waives any right to collect royalties from You for the exercise of the Licensed Rights, whether directly or through a collecting society under any voluntary or waivable statutory or compulsory licensing scheme. In all other cases the Licensor expressly reserves any right to collect such royalties, including when the Licensed Material is used other than for NonCommercial purposes. 55 | Section 3 – License Conditions. 56 | 57 | Your exercise of the Licensed Rights is expressly made subject to the following conditions. 58 | 59 | Attribution. 
60 | 61 | If You Share the Licensed Material (including in modified form), You must: 62 | 63 | retain the following if it is supplied by the Licensor with the Licensed Material: 64 | identification of the creator(s) of the Licensed Material and any others designated to receive attribution, in any reasonable manner requested by the Licensor (including by pseudonym if designated); 65 | a copyright notice; 66 | a notice that refers to this Public License; 67 | a notice that refers to the disclaimer of warranties; 68 | a URI or hyperlink to the Licensed Material to the extent reasonably practicable; 69 | indicate if You modified the Licensed Material and retain an indication of any previous modifications; and 70 | indicate the Licensed Material is licensed under this Public License, and include the text of, or the URI or hyperlink to, this Public License. 71 | You may satisfy the conditions in Section 3(a)(1) in any reasonable manner based on the medium, means, and context in which You Share the Licensed Material. For example, it may be reasonable to satisfy the conditions by providing a URI or hyperlink to a resource that includes the required information. 72 | If requested by the Licensor, You must remove any of the information required by Section 3(a)(1)(A) to the extent reasonably practicable. 73 | ShareAlike. 74 | In addition to the conditions in Section 3(a), if You Share Adapted Material You produce, the following conditions also apply. 75 | 76 | The Adapter’s License You apply must be a Creative Commons license with the same License Elements, this version or later, or a BY-NC-SA Compatible License. 77 | You must include the text of, or the URI or hyperlink to, the Adapter's License You apply. You may satisfy this condition in any reasonable manner based on the medium, means, and context in which You Share Adapted Material. 78 | You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, Adapted Material that restrict exercise of the rights granted under the Adapter's License You apply. 79 | Section 4 – Sui Generis Database Rights. 80 | 81 | Where the Licensed Rights include Sui Generis Database Rights that apply to Your use of the Licensed Material: 82 | 83 | for the avoidance of doubt, Section 2(a)(1) grants You the right to extract, reuse, reproduce, and Share all or a substantial portion of the contents of the database for NonCommercial purposes only; 84 | if You include all or a substantial portion of the database contents in a database in which You have Sui Generis Database Rights, then the database in which You have Sui Generis Database Rights (but not its individual contents) is Adapted Material, including for purposes of Section 3(b); and 85 | You must comply with the conditions in Section 3(a) if You Share all or a substantial portion of the contents of the database. 86 | For the avoidance of doubt, this Section 4 supplements and does not replace Your obligations under this Public License where the Licensed Rights include other Copyright and Similar Rights. 87 | Section 5 – Disclaimer of Warranties and Limitation of Liability. 88 | 89 | Unless otherwise separately undertaken by the Licensor, to the extent possible, the Licensor offers the Licensed Material as-is and as-available, and makes no representations or warranties of any kind concerning the Licensed Material, whether express, implied, statutory, or other. 
This includes, without limitation, warranties of title, merchantability, fitness for a particular purpose, non-infringement, absence of latent or other defects, accuracy, or the presence or absence of errors, whether or not known or discoverable. Where disclaimers of warranties are not allowed in full or in part, this disclaimer may not apply to You. 90 | To the extent possible, in no event will the Licensor be liable to You on any legal theory (including, without limitation, negligence) or otherwise for any direct, special, indirect, incidental, consequential, punitive, exemplary, or other losses, costs, expenses, or damages arising out of this Public License or use of the Licensed Material, even if the Licensor has been advised of the possibility of such losses, costs, expenses, or damages. Where a limitation of liability is not allowed in full or in part, this limitation may not apply to You. 91 | The disclaimer of warranties and limitation of liability provided above shall be interpreted in a manner that, to the extent possible, most closely approximates an absolute disclaimer and waiver of all liability. 92 | Section 6 – Term and Termination. 93 | 94 | This Public License applies for the term of the Copyright and Similar Rights licensed here. However, if You fail to comply with this Public License, then Your rights under this Public License terminate automatically. 95 | Where Your right to use the Licensed Material has terminated under Section 6(a), it reinstates: 96 | 97 | automatically as of the date the violation is cured, provided it is cured within 30 days of Your discovery of the violation; or 98 | upon express reinstatement by the Licensor. 99 | For the avoidance of doubt, this Section 6(b) does not affect any right the Licensor may have to seek remedies for Your violations of this Public License. 100 | For the avoidance of doubt, the Licensor may also offer the Licensed Material under separate terms or conditions or stop distributing the Licensed Material at any time; however, doing so will not terminate this Public License. 101 | Sections 1, 5, 6, 7, and 8 survive termination of this Public License. 102 | Section 7 – Other Terms and Conditions. 103 | 104 | The Licensor shall not be bound by any additional or different terms or conditions communicated by You unless expressly agreed. 105 | Any arrangements, understandings, or agreements regarding the Licensed Material not stated herein are separate from and independent of the terms and conditions of this Public License. 106 | Section 8 – Interpretation. 107 | 108 | For the avoidance of doubt, this Public License does not, and shall not be interpreted to, reduce, limit, restrict, or impose conditions on any use of the Licensed Material that could lawfully be made without permission under this Public License. 109 | To the extent possible, if any provision of this Public License is deemed unenforceable, it shall be automatically reformed to the minimum extent necessary to make it enforceable. If the provision cannot be reformed, it shall be severed from this Public License without affecting the enforceability of the remaining terms and conditions. 110 | No term or condition of this Public License will be waived and no failure to comply consented to unless expressly agreed to by the Licensor. 111 | Nothing in this Public License constitutes or may be interpreted as a limitation upon, or waiver of, any privileges and immunities that apply to the Licensor or You, including from the legal processes of any jurisdiction or authority. 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # The Sentiment Analysis and Cognition Engine (SEANCE) 1.2.0
2 | For more information about SEANCE, see [https://www.linguisticanalysistools.org/seance.html](https://www.linguisticanalysistools.org/seance.html)
3 | 
4 | This tool was conceptualized by Scott Crossley and Kristopher Kyle and implemented in Python by Kristopher Kyle.
5 | 
6 | # Quick Start Guide
7 | For Mac OS X, download the SEANCE_1_2_0_Mac.zip file (found under the "releases" tab). After extracting the SEANCE program from the .zip file, double-click the icon to open the program. Note that you may have to change your security preferences to open SEANCE.
8 | For Windows 7, 8, and 10 (64-bit versions only), download the SEANCE_1_2_0_win.zip file (found under the "releases" tab). After extracting the SEANCE program from the .zip file, double-click the icon to open the program.
9 | Note that after clicking on the SEANCE icon, a terminal (Mac) or command prompt (Windows) window will open. SEANCE will take 10-20 seconds to initialize, depending on the speed of your computer.
10 | 
11 | The source code for SEANCE 1.2.0 is also available. For SEANCE to work properly, the user must have spaCy installed (see https://spacy.io/usage/ for instructions).
12 | 
13 | After downloading and extracting the SEANCE_1_2_0_Py3.zip file, the user can start the program using the following command:
14 | $ python SEANCE_1_2_0.py
15 | 
16 | # Attribution
17 | If you use SEANCE in your work, please use the following citation:
18 | 
19 | Crossley, S. A., Kyle, K., & McNamara, D. S. (2017). Sentiment analysis and social cognition engine (SEANCE): An automatic tool for sentiment, social cognition, and social order analysis. Behavior Research Methods, 49(3), 803-821. doi:10.3758/s13428-016-0743-z.
20 | # Licensing
21 | SEANCE is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Public License.
22 | See https://creativecommons.org/licenses/by-nc-sa/4.0/ for more information
23 | 
--------------------------------------------------------------------------------
/SEANCE User Manual (revised 6-18-2018).docx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/SEANCE User Manual (revised 6-18-2018).docx
--------------------------------------------------------------------------------
/SEANCE_1_2_0.py:
--------------------------------------------------------------------------------
1 | #-*- coding: utf-8 -*-
2 | from __future__ import division
3 | 
4 | import sys
5 | 
6 | #import spacy #uncomment this if spaCy is used
7 | import tkinter as tk
8 | import tkinter.font
9 | import tkinter.filedialog
10 | import tkinter.constants
11 | import queue
12 | from tkinter import messagebox
13 | 
14 | import os
15 | import sys
16 | import re
17 | import platform
18 | import glob
19 | import math
20 | from collections import Counter
21 | 
22 | from threading import Thread
23 | 
24 | def resource_path(relative):
25 |     if hasattr(sys, "_MEIPASS"):
26 |         return os.path.join(sys._MEIPASS, relative)
27 |     return os.path.join(relative)
28 | 
29 | if platform.system() == "Darwin":
30 |     system = "M"
31 |     title_size = 16
32 |     font_size = 14
33 |     geom_size = "425x650"
34 |     color = "#ff771c"
35 | elif platform.system() == "Windows":
36 |     system = "W"
37 |     title_size = 14
38 |     font_size = 12
39 |     geom_size = "450x650"
40 |     color = "#ff771c"
41 | elif platform.system() == "Linux":
42 |     system = "L"
43 |     title_size = 14
44 |     font_size = 12
45 |     geom_size = "450x650"
46 |     color = "#ff771c"
47 | 
48 | #This creates a queue through which the core SEANCE program can communicate with the GUI
49 | dataQueue = queue.Queue()
50 | 
51 | #This creates the message for the progress box (and puts it in the dataQueue)
52 | progress = "...Waiting for Data to Process"
53 | dataQueue.put(progress)
54 | 
55 | #This creates a function for starting a new thread. The arguments are what SEANCE needs to run
56 | #def1 is the core SEANCE program
57 | def start_thread(def1, arg1, arg2, arg3):
58 |     t = Thread(target=def1, args=(arg1, arg2, arg3))
59 |     t.start()
60 | 
61 | 
62 | #This is how the correct path is given for program files.
63 | def resource_path(relative):
64 |     if hasattr(sys, "_MEIPASS"):
65 |         return os.path.join(sys._MEIPASS, relative)
66 |     return os.path.join(relative)
67 | 
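start_thread() and dataQueue together implement a standard producer/consumer handoff: the worker thread only writes progress strings to the queue, and the GUI polls the queue on a timer so tkinter widgets are never touched from the worker thread. A minimal stand-alone sketch of the same pattern (hypothetical names, not part of SEANCE) looks like this:

import queue
import tkinter as tk
from threading import Thread

q = queue.Queue()

def worker():
    #the long-running job only ever writes to the queue
    for i in range(5):
        q.put("processed file %d" % (i + 1))

def poll(label, root):
    #the GUI thread drains the queue every 10 ms, like MyApp.poll() below
    try:
        label.config(text=q.get(block=False))
    except queue.Empty:
        pass
    root.after(10, poll, label, root)

root = tk.Tk()
status = tk.Label(root, text="...waiting")
status.pack()
poll(status, root)
Thread(target=worker).start()
root.mainloop()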
68 | class MyApp:
69 |     def __init__(self, parent):
70 | 
71 |         #Creates font styles - Task: Make these pretty!
72 | 
73 | 
74 |         helv14= tkinter.font.Font(family= "Helvetica Neue", size=font_size)
75 |         times14= tkinter.font.Font(family= "Lucida Grande", size=font_size)
76 |         helv16= tkinter.font.Font(family= "Helvetica Neue", size = title_size, weight = "bold", slant = "italic")
77 | 
78 |         #This defines the GUI parent (ish)
79 |         self.myParent = parent
80 | 
81 |         #This creates the header text - Task: work with this to make more pretty!
82 |         self.spacer1= tk.Label(parent, text= "SEntiment ANalysis and Cognition Engine", font = helv16, background = color)
83 |         self.spacer1.pack()
84 | 
85 |         #This creates a frame for the meat of the GUI
86 |         self.thestuff= tk.Frame(parent, background =color)
87 |         self.thestuff.pack()
88 |         #Currently, the sizes aren't doing anything...
89 | 
90 |         #Within the 'thestuff' frame, this creates a frame on the left for the buttons/input
91 |         self.myContainer1 = tk.Frame(self.thestuff, background = color)
92 |         self.myContainer1.pack(side = tk.RIGHT, expand = tk.TRUE)
93 | 
94 | 
95 |         #Text to be displayed above the widgets AND for a line to be placed around the elements
96 |         self.labelframe2 = tk.LabelFrame(self.myContainer1, text= "Instructions", background = color)
97 |         self.labelframe2.pack(expand=tk.TRUE)
98 | 
99 |         #Checkbox
100 |         self.checkboxframe = tk.LabelFrame(self.myContainer1, text= "Indices", background = color, width = "45")
101 |         self.checkboxframe.pack(expand=tk.TRUE)
102 | 
103 |         self.checkboxframe2 = tk.LabelFrame(self.myContainer1, text= "Words to Analyze", background = color, width = "45")
104 |         self.checkboxframe2.pack(expand=tk.TRUE)
105 | 
106 |         self.myContainer2 = tk.Frame(self.myContainer1, background = color)
107 |         self.myContainer2.pack(expand = tk.TRUE)
108 | 
109 |         self.checkboxframe3 = tk.LabelFrame(self.myContainer2, text= "Negation Control", background = color, width = "45")
110 |         self.checkboxframe3.grid(row=1,column=2, sticky = "W")
111 | 
112 |         self.checkboxframe4 = tk.LabelFrame(self.myContainer2, text= "Components", background = color, width = "45")
113 |         self.checkboxframe4.grid(row=1,column=1, sticky = "W")
114 | 
115 |         self.cb2_var = tk.IntVar()
116 |         self.cb2 = ""
117 | 
118 |         self.cb3_var = tk.IntVar()
119 |         self.cb3 = tk.Checkbutton(self.checkboxframe, text="GALC", variable=self.cb3_var,background = color)
120 |         self.cb3.grid(row=1,column=1, sticky = "W")
121 |         self.cb3.bind("<Button-1>", self.clicker)
122 |         self.cb3.deselect()
123 | 
124 |         self.cb4_var = tk.IntVar()
125 | 
126 |         self.cb4 = tk.Checkbutton(self.checkboxframe, text="EmoLex", variable=self.cb4_var,background = color)
127 |         self.cb4.grid(row=1,column=2, sticky = "W")
128 |         self.cb4.deselect()
129 |         self.cb4.bind("<Button-1>", self.clicker)
130 | 
131 |         self.cb5_var = tk.IntVar()
132 | 
133 |         self.cb5 = tk.Checkbutton(self.checkboxframe, text="ANEW", variable=self.cb5_var,background = color)
134 |         self.cb5.grid(row=1,column=3, sticky = "W")
135 |         self.cb5.deselect()
136 |         self.cb5.bind("<Button-1>", self.clicker)
137 | 
138 |         self.cb6_var = tk.IntVar()
139 |         self.cb6 = tk.Checkbutton(self.checkboxframe, text="SENTIC", variable=self.cb6_var,background = color)
140 |         self.cb6.grid(row=1,column=4, sticky = "W")
141 |         self.cb6.deselect()
142 |         self.cb6.bind("<Button-1>", self.clicker)
143 | 
144 |         self.cb7_var = tk.IntVar()
145 |         self.cb7 = tk.Checkbutton(self.checkboxframe, text="VADER", variable=self.cb7_var,background = color)
146 |         self.cb7.grid(row=2,column=1, sticky = "W")
147 |         self.cb7.deselect()
148 |         self.cb7.bind("<Button-1>", self.clicker)
149 | 
150 |         self.cb8_var = tk.IntVar()
151 |         self.cb8 = tk.Checkbutton(self.checkboxframe, text="Hu-Liu", variable=self.cb8_var,background = color)
152 |         self.cb8.grid(row=2,column=2, sticky = "W")
153 |         self.cb8.deselect()
154 |         self.cb8.bind("<Button-1>", self.clicker)
155 | 
156 |         self.cb9_var = tk.IntVar()
157 |         self.cb9 = tk.Checkbutton(self.checkboxframe, text="GI", variable=self.cb9_var,background = color)
158 |         self.cb9.grid(row=2,column=3, sticky = "W")
159 |         self.cb9.deselect()
160 |         self.cb9.bind("<Button-1>", self.clicker)
161 | 
162 |         self.cb10_var = tk.IntVar()
163 |         self.cb10 = tk.Checkbutton(self.checkboxframe, text="Lasswell", variable=self.cb10_var,background = color)
164 |         self.cb10.grid(row=2,column=4, sticky = "W")
165 |         self.cb10.deselect()
166 |         self.cb10.bind("<Button-1>", self.clicker)
167 | 
168 | 
169 |         self.cb11_var = tk.IntVar()
170 |         self.cb11 = tk.Checkbutton(self.checkboxframe2, text="Nouns", variable=self.cb11_var,background = color)
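        #The index checkboxes above share one pattern: an IntVar holds the 0/1
        #state, the widget is placed with grid() and starts deselected, and a
        #"<Button-1>" binding routes clicks to clicker(), which switches on the
        #"All Words" box (box_list[12]) whenever an index checkbox is clicked,
        #so a word class is always selected for analysis.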
171 |         self.cb11.grid(row=1,column=2, sticky = "W")
172 |         self.cb11.deselect()
173 | 
174 |         self.cb12_var = tk.IntVar()
175 |         self.cb12 = tk.Checkbutton(self.checkboxframe2, text="Verbs", variable=self.cb12_var,background = color)
176 |         self.cb12.grid(row=1,column=3, sticky = "W")
177 |         self.cb12.deselect()
178 | 
179 |         self.cb13_var = tk.IntVar()
180 |         self.cb13 = tk.Checkbutton(self.checkboxframe2, text="Adjectives", variable=self.cb13_var,background = color)
181 |         self.cb13.grid(row=1,column=4, sticky = "W")
182 |         self.cb13.deselect()
183 | 
184 |         self.cb14_var = tk.IntVar()
185 |         self.cb14 = tk.Checkbutton(self.checkboxframe2, text="Adverbs", variable=self.cb14_var,background = color)
186 |         self.cb14.grid(row=1,column=5, sticky = "W")
187 |         self.cb14.deselect()
188 | 
189 |         self.cb15_var = tk.IntVar()
190 |         self.cb15 = tk.Checkbutton(self.checkboxframe2, text="All Words", variable=self.cb15_var,background = color)
191 |         self.cb15.grid(row=1,column=1, sticky = "W")
192 |         self.cb15.deselect()
193 | 
194 |         self.cb16_var = tk.IntVar()
195 |         self.cb16 = tk.Checkbutton(self.checkboxframe3, text="Three Left", variable=self.cb16_var,background = color)
196 |         self.cb16.grid(row=1,column=1, sticky = "W")
197 |         self.cb16.deselect()
198 | 
199 |         self.cb18_var = tk.IntVar()
200 |         self.cb18 = tk.Checkbutton(self.checkboxframe4, text="Components", variable=self.cb18_var,background = color)
201 |         self.cb18.grid(row=1,column=1, sticky = "W")
202 |         self.cb18.select()
203 | 
204 | 
205 |         self.var_list = [self.cb2_var,self.cb3_var,self.cb4_var,self.cb5_var,self.cb6_var,self.cb7_var,self.cb8_var,self.cb9_var,self.cb10_var,self.cb11_var,self.cb12_var,self.cb13_var,self.cb14_var,self.cb15_var,self.cb16_var,self.cb18_var]
206 | 
207 |         self.box_list = [self.cb3,self.cb4,self.cb5,self.cb6,self.cb7,self.cb8,self.cb9,self.cb10,self.cb11,self.cb12,self.cb13,self.cb14,self.cb15,self.cb16,self.cb18]
208 | 
209 |         self.cb_all = tk.Button(self.checkboxframe, text = " Select All ",justify = tk.LEFT)
210 |         self.cb_all.grid(row=1, column = 0)
211 |         self.cb_all.bind("<Button-1>", self.cb_all_Click)
212 | 
213 |         self.cb_none = tk.Button(self.checkboxframe, text = "Select None")
214 |         self.cb_none.grid(row=2, column = 0)
215 |         self.cb_none.bind("<Button-1>", self.cb_none_Click)
216 | 
217 | 
218 |         #This creates the list of instructions. There may be a better way to do this...
219 |         self.instruct = tk.Label(self.labelframe2, height = "9", width = "45", justify = tk.LEFT, padx = "4", pady= "6", anchor = tk.W, font = helv14, text ="1. Select desired indices, types of words to analyze,\n and whether negation control is desired.\n2. Choose the input folder (where your files are).\n3. Select the folder you want the output file to go in.\n4. Give a name to the output file.\n5. Press the 'Process Texts' button.\n6. Please reference the SEANCE Index Spreadsheet\n and the SEANCE help file (www.kristopherkyle.com)\n for further assistance in interpreting the output.")
220 |         self.instruct.pack()
221 | 
222 |         #Creates Label Frame for Data Input area
223 |         self.secondframe= tk.LabelFrame(self.myContainer1, text= "Data Input", background = color)
224 |         self.secondframe.pack(expand=tk.TRUE)
225 |         #This Places the first button under the instructions.
226 |         self.button1 = tk.Button(self.secondframe)
227 |         self.button1.configure(text= "Select Input Folder")
228 |         self.button1.pack()
229 | 
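        #The order of var_list above is significant because main() reads it by
        #index rather than by name: 0 = placeholder (cb2), 1 = GALC, 2 = EmoLex,
        #3 = ANEW, 4 = SENTIC, 5 = VADER, 6 = Hu-Liu, 7 = GI, 8 = Lasswell,
        #9 = Nouns, 10 = Verbs, 11 = Adjectives, 12 = Adverbs, 13 = All Words,
        #14 = Three Left (negation control), 15 = Components. For example,
        #var_list[5] and var_list[15] gate the VADER import at the top of main().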
230 |         #This tells the button what to do when clicked. Currently, only a left-click
231 |         #makes the button do anything (e.g. <Button-1>). The second argument is a "def"
232 |         #that is defined later in the program.
233 |         self.button1.bind("<Button-1>", self.button1Click)
234 | 
235 |         #Creates default dirname so if statement in Process Texts can check to see
236 |         #if a directory name has been chosen
237 |         self.dirname = ""
238 | 
239 |         #This creates a label for the first program input (Input Directory)
240 |         self.inputdirlabel =tk.LabelFrame(self.secondframe, height = "1", width= "45", padx = "4", text = "Your selected input folder:", background = color)
241 |         self.inputdirlabel.pack()
242 | 
243 |         #Creates label that informs user which directory has been chosen
244 |         directoryprompt = "(No Folder Chosen)"
245 |         self.inputdirchosen = tk.Label(self.inputdirlabel, height= "1", width= "44", justify=tk.LEFT, padx = "4", anchor = tk.W, font= helv14, text = directoryprompt)
246 |         self.inputdirchosen.pack()
247 | 
248 |         #This creates the Output Directory button.
249 |         self.button2 = tk.Button(self.secondframe)
250 |         self.button2["text"]= "Choose Output Filename"
251 |         #This tells the button what to do if clicked.
252 |         self.button2.bind("<Button-1>", self.button2Click)
253 |         self.button2.pack()
254 |         self.outdirname = ""
255 | 
256 |         #Creates a label for the second program input (Output Directory)
257 |         self.outputdirlabel = tk.LabelFrame(self.secondframe, height = "1", width= "45", padx = "4", text = "Your selected filename:", background = color)
258 |         self.outputdirlabel.pack()
259 | 
260 |         #Creates a label that informs the user which filename has been chosen
261 |         #outdirectoryprompt = "(No output Folder Chosen)"
262 |         outdirectoryprompt = "(No Output Filename Chosen)"
263 |         self.input2 = ""
264 |         self.outputdirchosen = tk.Label(self.outputdirlabel, height= "1", width= "44", justify=tk.LEFT, padx = "4", anchor = tk.W, font= helv14, text = outdirectoryprompt)
265 |         self.outputdirchosen.pack()
266 | 
267 |         self.BottomSpace= tk.LabelFrame(self.myContainer1, text = "Run Program", background = color)
268 |         self.BottomSpace.pack()
269 | 
270 |         self.button3= tk.Button(self.BottomSpace)
271 |         self.button3["text"] = "Process Texts"
272 |         self.button3.bind("<Button-1>", self.runprogram)
273 |         self.button3.pack()
274 | 
275 |         self.progresslabelframe = tk.LabelFrame(self.BottomSpace, text= "Program Status", background = color)
276 |         self.progresslabelframe.pack(expand= tk.TRUE)
277 | 
278 |         #progress = "...Waiting for Data to Process"
279 |         self.progress= tk.Label(self.progresslabelframe, height= "1", width= "45", justify=tk.LEFT, padx = "4", anchor = tk.W, font= helv14, text=progress)
280 |         self.progress.pack()
281 | 
282 |         self.poll(self.progress)
283 | 
284 |     def cb_all_Click(self, event):
285 |         for items in self.box_list[:8]:
286 |             items.select()
287 |         self.cb15.select()
288 | 
289 |     def cb_none_Click(self, event):
290 |         for items in self.box_list[:8]:
291 |             items.deselect()
292 |         self.box_list[12].deselect()
293 | 
294 |     def clicker(self, event):
295 |         item = self.box_list[12]
296 |         item.select()
297 | 
298 |     def entry1Return(self,event):
299 |         input= self.entry1.get()
300 |         self.input2 = input + ".csv"
301 |         self.filechosenchosen.config(text = self.input2)
302 |         self.filechosenchosen.update_idletasks()
303 | 
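button1Click and button2Click below wrap two stock tkinter dialogs; both return an empty string when the user cancels, which is why the handlers and runprogram() test for "". A stand-alone sketch of the two calls (hypothetical variable names) is:

import tkinter as tk
import tkinter.filedialog

root = tk.Tk()
root.withdraw()  #no main window is needed to try the dialogs

#returns the chosen directory, or "" on cancel
indir = tkinter.filedialog.askdirectory(parent=root, title='Please select a directory')

#defaultextension adds .csv to the chosen name, as SEANCE's handler expects
outname = tkinter.filedialog.asksaveasfilename(parent=root, defaultextension=".csv",
                                               initialfile="results", title='Choose Output Filename')
print(indir, outname)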
304 |     #Following is an example of how we can update the information from users...
305 |     def button1Click(self, event):
306 |         self.dirname = tkinter.filedialog.askdirectory(parent=root,title='Please select a directory')
307 |         print(self.dirname)
308 |         if self.dirname == "":
309 |             self.displayinputtext = "(No Folder Chosen)"
310 |         else: self.displayinputtext = '.../'+self.dirname.split('/')[-1]
311 |         self.inputdirchosen.config(text = self.displayinputtext)
312 | 
313 | 
314 |     def button2Click(self, event):
315 |         self.outdirname = tkinter.filedialog.asksaveasfilename(parent=root, defaultextension = ".csv", initialfile = "results",title='Choose Output Filename')
316 |         print(self.outdirname)
317 |         if self.outdirname == "":
318 |             self.displayoutputtext = "(No Output Filename Chosen)"
319 |         else: self.displayoutputtext = '.../' + self.outdirname.split('/')[-1]
320 |         self.outputdirchosen.config(text = self.displayoutputtext)
321 | 
322 |     def runprogram(self, event):
323 |         self.poll(self.progress)
324 |         import tkinter.messagebox
325 |         if self.dirname == "":
326 |             tkinter.messagebox.showinfo("Supply Information", "Choose Input Directory")
327 |         if self.outdirname == "":
328 |             tkinter.messagebox.showinfo("Choose Output Filename", "Choose Output Filename")
329 |         if self.dirname != "" and self.outdirname != "":
330 | 
331 |             dataQueue.put("Starting Sentiment Tool...")
332 |             start_thread(main, self.dirname, self.outdirname, self.var_list)
333 | 
334 |     def poll(self, function):
335 | 
336 |         self.myParent.after(10, self.poll, function)
337 |         try:
338 |             function.config(text = dataQueue.get(block=False))
339 | 
340 |             #root.update_idletasks()
341 |         except queue.Empty:
342 |             pass
343 | 
344 | def main(indir, outfile,var_list):
345 | 
346 |     if var_list[5].get() == 1 or var_list[15].get() == 1: #if vader box or component scores checked:
347 |         from data_files.vaderSentiment import SentimentIntensityAnalyzer
348 | 
349 |     def dict_builder(database_file, number): #builds dictionaries from database files
350 |         dict ={}
351 | 
352 |         for entries in database_file:
353 |             if entries[0] == '#': #ignores first line which contains category information
354 |                 continue
355 | 
356 |             entries = entries.split("\t")
357 |             dict[entries[0]]=entries[number]
358 | 
359 |         return dict
360 | 
361 |     #This function deals with denominator issues that can kill the program:
362 |     def safe_divide(numerator, denominator):
363 |         if denominator == 0:
364 |             index = 0
365 |         else: index = numerator/denominator
366 |         return index
367 | 
368 |     #this is almost the same as safe_divide, but creates proportions
369 |     def proportion(numerator,denominator):
370 |         if denominator == 0 and numerator == 0:
371 |             index = 0
372 |         elif denominator == 0:
373 |             index = 1
374 |         else: index = numerator/denominator
375 |         return index
376 | 
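safe_divide and proportion differ only when the denominator is zero: safe_divide reports 0, while proportion reports 1 if anything was counted in the numerator. A quick check of the intended behavior, using the definitions above:

assert safe_divide(3, 0) == 0     #empty denominator yields 0 rather than an error
assert safe_divide(3, 4) == 0.75
assert proportion(0, 0) == 0      #nothing counted on either side
assert proportion(3, 0) == 1      #numerator hits with an empty denominator
assert proportion(3, 4) == 0.75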
377 |     def component_dicter(folder): #takes a folder of component files, turns it into dict
378 |         dict = {}
379 |         file_list = glob.glob(folder)
380 |         for files in file_list:
381 |             list = []
382 |             comp = open(files,"r").read().split("\n")
383 |             for line in comp:
384 |                 list.append(line.split("\t"))
385 |             if system !="W":
386 |                 key = files.split("/")[-1]
387 |             else:
388 |                 key = files.split("\\")[-1]
389 |             dict[key]=list
390 | 
391 |         return dict
392 | 
393 | 
394 |     #function for dictionary
395 |     def regex_count(dict, key, text, list, pos=None):
396 |         counter = 0
397 |         nwords = len(text.split())#text length
398 |         ##print pos, nwords
399 |         for item in dict[key]:
400 |             if item == "":
401 |                 continue #if it is empty, ignore it. Lots of empty keys
402 |             #For wildcards and case
403 |             if item[-1] == '*':#if item in list ends in wildcard
404 |                 my_regex = r'\b' + re.escape(item[0:-1]) + r'.*?\b' #escape any regex metacharacters in the entry
405 |                 counter+= len(re.findall(my_regex, text, re.IGNORECASE))
406 |             else:
407 |                 my_regex = r'\b' + re.escape(item) + r'\b'
408 |                 counter+= len(re.findall(my_regex, text, re.IGNORECASE))
409 |         if list == "yes":
410 |             variable_list.append(safe_divide(counter,nwords))
411 |             if pos == None:
412 |                 header_list.append(key)
413 |             else: header_list.append(key+pos)
414 | 
415 |         else: return safe_divide(counter,nwords)
416 | 
417 |     def DataDict_counter(in_text, data_dict,number,header,list):
418 |         counter = 0
419 |         sum_counter = 0
420 |         nwords = len(in_text)
421 | 
422 |         for word in in_text:
423 |             if word in data_dict:
424 |                 counter+=1
425 |                 sum_counter+=float(data_dict[word])
426 |             else:
427 |                 if word in lemma_dict and lemma_dict[word] in data_dict:
428 |                     counter+=1
429 |                     sum_counter+=float(data_dict[lemma_dict[word]])
430 |         if list == "yes":
431 |             if number == 0: variable_list.append(safe_divide(sum_counter,counter))
432 |             if number == 1: variable_list.append(safe_divide(sum_counter,nwords))
433 |             header_list.append(header)
434 |         else:
435 |             if number == 0: variable=safe_divide(sum_counter,counter)
436 |             if number == 1: variable=safe_divide(sum_counter,nwords)
437 | 
438 |             return variable
439 | 
440 |     def ListDict_counter(list_dict,key,in_text,list,pos=None):
441 |         counter = 0
442 |         nwords = len(in_text)
443 | 
444 |         for word in in_text:
445 |             if word in list_dict[key]:
446 |                 counter+=1
447 |             else:
448 |                 if word in lemma_dict and lemma_dict[word] in list_dict[key]:
449 |                     counter+=1
450 |         if list == "yes":
451 |             variable_list.append(safe_divide(counter,nwords))
452 |             if pos == None:
453 |                 if "NRC" in key:
454 |                     key = key.replace("NRC","EmoLex")
455 |                 header_list.append(key)
456 |             else:
457 |                 if "NRC" in key:
458 |                     key = key.replace("NRC","EmoLex")
459 |                 header_list.append(key+pos)
460 |         else:
461 |             return safe_divide(counter,nwords)
462 | 
463 |     def Ngram_DataDict_counter(in_text, data_dict,header,list): #replaces Sentic Count
464 |         counter = 0
465 |         nwords = len(in_text)
466 |         position = 0
467 |         denominator = 0
468 | 
469 |         def single_count(word):
470 |             key = 0
471 |             if word in data_dict:
472 |                 key=1
473 |             elif word in lemma_dict and lemma_dict[word] in data_dict:
474 |                 key =2
475 |             return key
476 | 
477 |         def ngram_count(gram,n):
478 |             yes = 0
479 |             if gram in data_dict:
480 |                 yes +=n
481 |             return yes
482 | 
483 |         for item in in_text:
484 |             #This section ensures the text (or remaining portion of the text) is long enough to look for five-grams, etc.
485 |             if position > len(in_text)-1: continue
486 |             word = in_text[position]
487 |             if len(in_text[position:]) < 2:
488 |                 if single_count(word) == 1:
489 |                     counter+=float(data_dict[word])
490 |                     denominator+=1
491 |                     position+=1
492 |                 elif single_count(word) == 2:
493 |                     counter+=float(data_dict[lemma_dict[word]])
494 |                     denominator+=1
495 |                     position+=1
496 |                 else: position+=1
497 | 
498 |             bigram = " ".join(in_text[position:position +2])
499 |             if len(in_text[position:]) < 3:
500 |                 yes = ngram_count(bigram,2)
501 |                 if yes > 0:
502 |                     counter+=float(data_dict[bigram])
503 |                     denominator+=1
504 |                     position+=yes
505 |                     continue
506 |                 if single_count(word) == 1:
507 |                     counter+=float(data_dict[word])
508 |                     denominator+=1
509 |                     position+=1
510 |                 elif single_count(word) == 2:
511 |                     counter+=float(data_dict[lemma_dict[word]])
512 |                     denominator+=1
513 |                     position+=1
514 |                 else: position+=1
515 | 
516 |             trigram = " ".join(in_text[position:position +3])
517 |             if len(in_text[position:]) < 4:
518 |                 yes = ngram_count(trigram,3)
519 |                 if yes > 0:
520 |                     counter+=float(data_dict[trigram])
521 |                     denominator+=1
522 |                     position+=yes
523 |                     continue
524 |                 yes = ngram_count(bigram,2)
525 |                 if yes > 0:
526 |                     counter+=float(data_dict[bigram])
527 |                     denominator+=1
528 |                     position+=yes
529 |                     continue
530 |                 if single_count(word) == 1:
531 |                     counter+=float(data_dict[word])
532 |                     denominator+=1
533 |                     position+=1
534 |                 elif single_count(word) == 2:
535 |                     counter+=float(data_dict[lemma_dict[word]])
536 |                     denominator+=1
537 |                     position+=1
538 |                 else: position+=1
539 | 
540 |             quadgram = " ".join(in_text[position:position +4])
541 |             if len(in_text[position:]) < 5:
542 |                 yes = ngram_count(quadgram,4)
543 |                 if yes > 0:
544 |                     counter+=float(data_dict[quadgram])
545 |                     denominator+=1
546 |                     position+=yes
547 |                     continue
548 |                 yes = ngram_count(trigram,3)
549 |                 if yes > 0:
550 |                     counter+=float(data_dict[trigram])
551 |                     denominator+=1
552 |                     position+=yes
553 |                     continue
554 |                 yes = ngram_count(bigram,2)
555 |                 if yes > 0:
556 |                     counter+=float(data_dict[bigram])
557 |                     denominator+=1
558 |                     position+=yes
559 |                     continue
560 |                 if single_count(word) == 1:
561 |                     counter+=float(data_dict[word])
562 |                     denominator+=1
563 |                     position+=1
564 |                 elif single_count(word) == 2:
565 |                     counter+=float(data_dict[lemma_dict[word]])
566 |                     denominator+=1
567 |                     position+=1
568 |                 else: position+=1
569 | 
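            #Illustrative walk-through (longest match first): with
            #in_text = ["a", "happy", "new", "year"] and only "happy new year"
            #in data_dict, position 0 falls into the quad-gram branch above,
            #misses on "a happy new year", "a happy new", "a happy", and "a",
            #and advances one word; at position 1 the tri-gram "happy new year"
            #hits, its score is added once, and position jumps three words so
            #none of those words can be counted again.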
570 |             #This functions on most of the text. It checks for five-grams. If none, quad-grams, etc.
571 |             fivegram = " ".join(in_text[position:position +5])
572 |             if len(in_text[position:]) > 4:
573 |                 yes = ngram_count(fivegram,5)
574 |                 if yes > 0:
575 |                     counter+=float(data_dict[fivegram])
576 |                     denominator+=1
577 |                     position+=yes
578 |                     continue
579 |                 yes = ngram_count(quadgram,4)
580 |                 if yes > 0:
581 |                     counter+=float(data_dict[quadgram])
582 |                     denominator+=1
583 |                     position+=yes
584 |                     continue
585 |                 yes = ngram_count(trigram,3)
586 |                 if yes > 0:
587 |                     counter+=float(data_dict[trigram])
588 |                     denominator+=1
589 |                     position+=yes
590 |                     continue
591 |                 yes = ngram_count(bigram,2)
592 |                 if yes > 0:
593 |                     counter+=float(data_dict[bigram])
594 |                     denominator+=1
595 |                     position+=yes
596 |                     continue
597 |                 if single_count(word) == 1:
598 |                     counter+=float(data_dict[word])
599 |                     denominator+=1
600 |                     position+=1
601 |                 elif single_count(word) == 2:
602 |                     counter+=float(data_dict[lemma_dict[word]])
603 |                     denominator+=1
604 |                     position+=1
605 |                 else: position+=1
606 | 
607 |         if list == "yes":
608 |             variable_list.append(safe_divide(counter,denominator))
609 |             header_list.append(header)
610 |         else:
611 |             return safe_divide(counter,denominator)
612 | 
613 |     inputfile = indir + "/*.txt"
614 |     filenames = glob.glob(inputfile)
615 | 
616 |     outf=open(outfile, "w")
617 | 
618 |     print("Importing databases...\n")
619 | 
620 |     affect_list = open(resource_path('data_files/affective_list.txt'), 'r').read()
621 | 
622 |     #this cleans up the affect list
623 |     affect_list = re.sub('\t\t', '', affect_list)#removes all double tabs
624 |     affect_list = re.sub('\t\n', '\n', affect_list)#similar all the way down
625 |     affect_list = re.sub(' \t', '\t', affect_list)
626 |     affect_list = re.sub(' \n', '\n', affect_list)
627 | 
628 |     affect_list = affect_list.split('\n')
629 |     affect_dict = {}
630 | 
631 |     punctuation = (".","!","?",",", ":",";", "'",'"')
632 | 
633 |     for line in affect_list:
634 |         entries = line.split("\t") #splits entries in .csv file at tab
635 |         affect_dict[entries[0]] = entries[1:]
636 | 
637 |     GI_list = open(resource_path('data_files/inquirerbasic.txt'), 'r').readlines()
638 |     GI_dict={}
639 | 
640 |     for line in GI_list:
641 |         entries = line.split("\t") #splits entries in .csv file at tab
642 | 
643 |         GI_dict[entries[0]] = set(entries[1:])
644 |         ##print entries[0]
645 | 
646 |     #defines program files
647 |     lemma_list = open(resource_path('data_files/e_lemma_py_format_lower.txt'), 'r').readlines()
648 |     anew_list = open(resource_path('data_files/affective_norms.txt'), 'r').readlines()
649 |     sentic_list = open(resource_path('data_files/senticnet_data.txt'), 'r').readlines()
650 |     lu_hui_positive = open(resource_path('data_files/positive_words.txt'), 'r').read().split("\n")
651 |     lu_hui_negative = open(resource_path('data_files/negative_words.txt'), 'r').read().split("\n")
652 | 
653 |     components_2 = component_dicter(resource_path('components/C*.txt'))
654 | 
655 | 
656 |     #creates dictionary for lemma list
657 |     lemma_dict={}
658 | 
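Each line of the e_lemma file lists a lemma followed by its inflected forms, and the loop below maps every form back to the first word on its line. A toy version (hypothetical two-line list) shows the resulting shape:

toy_lemma_list = ["be am are is was were", "run runs running ran"]
toy_lemma_dict = {}
for line in toy_lemma_list:
    entries = line.split()
    for word in entries:
        toy_lemma_dict[word] = entries[0]  #every form points at its lemma

assert toy_lemma_dict["were"] == "be"
assert toy_lemma_dict["ran"] == "run"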
659 |     #Creates lemma dictionary, which may or may not be necessary with this database!
660 | 
661 |     #print "Compiling Lemma Dict...\n"
662 |     for line in lemma_list:
663 |         #ignores first lines
664 |         if line[0] == '#':
665 |             continue
666 |         #allows use of each line:
667 |         entries=line.split()
668 |         #creates dictionary entry for each word in line:
669 |         for word in entries:
670 |             lemma_dict[word] = entries[0]
671 |     ##print lemma_dict
672 | 
673 |     #Creates dictionaries with word:score pairings:
674 | 
675 |     valence = dict_builder(anew_list, 1)
676 |     arousal = dict_builder(anew_list, 2)
677 |     dominance = dict_builder(anew_list, 3)
678 | 
679 |     pleasantness = dict_builder(sentic_list, 1)
680 |     attention = dict_builder(sentic_list, 2)
681 |     sensitivity = dict_builder(sentic_list, 3)
682 |     aptitude = dict_builder(sentic_list, 4)
683 |     polarity = dict_builder(sentic_list, 5)
684 | 
685 |     def run_galc(text,var=None):
686 |         if var == None:
687 |             pos = ""
688 |         else: pos = var
689 |         if var != None:
690 |             text = " ".join(text)
691 | 
692 | 
693 |         regex_count(affect_dict,'Admiration/Awe_GALC',text,"yes",pos)
694 |         regex_count(affect_dict,'Amusement_GALC',text,"yes",pos)
695 |         regex_count(affect_dict,'Anger_GALC',text,"yes",pos)
696 |         regex_count(affect_dict,'Anxiety_GALC',text,"yes",pos)
697 |         regex_count(affect_dict,'Beingtouched_GALC',text,"yes",pos)
698 |         regex_count(affect_dict,'Boredom_GALC',text,"yes",pos)
699 |         regex_count(affect_dict,'Compassion_GALC',text,"yes",pos)
700 |         regex_count(affect_dict,'Contempt_GALC',text,"yes",pos)
701 |         regex_count(affect_dict,'Contentment_GALC',text,"yes",pos)
702 |         regex_count(affect_dict,'Desperation_GALC',text,"yes",pos)
703 |         regex_count(affect_dict,'Disappointment_GALC',text,"yes",pos)
704 |         regex_count(affect_dict,'Disgust_GALC',text,"yes",pos)
705 |         regex_count(affect_dict,'Dissatisfaction_GALC',text,"yes",pos)
706 |         regex_count(affect_dict,'Envy_GALC',text,"yes",pos)
707 |         regex_count(affect_dict,'Fear_GALC',text,"yes",pos)
708 |         regex_count(affect_dict,'Feelinglove_GALC',text,"yes",pos)
709 |         regex_count(affect_dict,'Gratitude_GALC',text,"yes",pos)
710 |         regex_count(affect_dict,'Guilt_GALC',text,"yes",pos)
711 |         regex_count(affect_dict,'Happiness_GALC',text,"yes",pos)
712 |         regex_count(affect_dict,'Hatred_GALC',text,"yes",pos)
713 |         regex_count(affect_dict,'Hope_GALC',text,"yes",pos)
714 |         regex_count(affect_dict,'Humility_GALC',text,"yes",pos)
715 |         regex_count(affect_dict,'Interest/Enthusiasm_GALC',text,"yes",pos)
716 |         regex_count(affect_dict,'Irritation_GALC',text,"yes",pos)
717 |         regex_count(affect_dict,'Jealousy_GALC',text,"yes",pos)
718 |         regex_count(affect_dict,'Joy_GALC',text,"yes",pos)
719 |         regex_count(affect_dict,'Longing_GALC',text,"yes",pos)
720 |         regex_count(affect_dict,'Lust_GALC',text,"yes",pos)
721 |         regex_count(affect_dict,'Pleasure/Enjoyment_GALC',text,"yes",pos)
722 |         regex_count(affect_dict,'Pride_GALC',text,"yes",pos)
723 |         regex_count(affect_dict,'Relaxation/Serenity_GALC',text,"yes",pos)
724 |         regex_count(affect_dict,'Relief_GALC',text,"yes",pos)
725 |         regex_count(affect_dict,'Sadness_GALC',text,"yes",pos)
726 |         regex_count(affect_dict,'Shame_GALC',text,"yes",pos)
727 |         regex_count(affect_dict,'Surprise_GALC',text,"yes",pos)
728 |         regex_count(affect_dict,'Tension/Stress_GALC',text,"yes",pos)
729 |         regex_count(affect_dict,'Positive_GALC',text,"yes",pos)
730 |         regex_count(affect_dict,'Negative_GALC',text,"yes",pos)
731 | 
732 |     def run_nrc(text,var=None):
733 |         if var == None:
734 |             pos = ""
735 |         else: pos = var
736 | 
737 |         ListDict_counter(GI_dict,'Anger_NRC',text,"yes",pos)
738 | 
ListDict_counter(GI_dict,'Anticipation_NRC',text,"yes",pos) 739 | ListDict_counter(GI_dict,'Disgust_NRC',text,"yes",pos) 740 | ListDict_counter(GI_dict,'Fear_NRC',text,"yes",pos) 741 | ListDict_counter(GI_dict,'Joy_NRC',text,"yes",pos) 742 | ListDict_counter(GI_dict,'Negative_NRC',text,"yes",pos) 743 | ListDict_counter(GI_dict,'Positive_NRC',text,"yes",pos) 744 | ListDict_counter(GI_dict,'Sadness_NRC',text,"yes",pos) 745 | ListDict_counter(GI_dict,'Surprise_NRC',text,"yes",pos) 746 | ListDict_counter(GI_dict,'Trust_NRC',text,"yes",pos) 747 | 748 | def run_anew(text,var=None): 749 | if var == None: 750 | pos = "" 751 | else: pos = var 752 | 753 | DataDict_counter(text, valence,0,"Valence"+pos,"yes")#valence_average = 754 | DataDict_counter(text, valence,1,"Valence_nwords"+pos,"yes")#valence_average_nwords = 755 | DataDict_counter(text, arousal,0,"Arousal"+pos,"yes")#arousal_average = 756 | DataDict_counter(text, arousal,1,"Arousal_nwords"+pos,"yes")#arousal_average_nwords = 757 | DataDict_counter(text, dominance,0,"Dominance"+pos,"yes")#dominance_average = 758 | DataDict_counter(text, dominance,1,"Dominance_nwords"+pos,"yes")#dominance_average_nwords = 759 | 760 | def run_sentic(text,var=None): 761 | if var == None: 762 | pos = "" 763 | else: pos = var 764 | 765 | Ngram_DataDict_counter(text, pleasantness,"pleasantness"+pos,"yes")#norm_pleasant = 766 | Ngram_DataDict_counter(text, attention,"attention"+pos,"yes")#norm_attention = 767 | Ngram_DataDict_counter(text, sensitivity,"sensitivity"+pos,"yes")#norm_sensitivity = 768 | Ngram_DataDict_counter(text, aptitude,"aptitude"+pos,"yes")#norm_aptitude = 769 | Ngram_DataDict_counter(text, polarity,"polarity"+pos,"yes")#norm_polarity = 770 | 771 | def run_vader(text,var=None): 772 | if var == None: 773 | pos = "" 774 | else: pos = var 775 | 776 | if var != None: 777 | text = " ".join(text) 778 | try: 779 | analyzer = SentimentIntensityAnalyzer() 780 | vs = analyzer.polarity_scores(text) 781 | variable_list.append(vs['neg']) #vader_negative #relies on vader lexicon 782 | variable_list.append(vs['neu']) #vader_neutral #relies on vader lexicon 783 | variable_list.append(vs['pos'])#vader_positive #relies on vader lexicon 784 | variable_list.append(vs['compound'])#vader_compound = #vader lexicon and rules. 
most accurate
    except UnicodeEncodeError:
        variable_list.append("Error")
        variable_list.append("Error")
        variable_list.append("Error")
        variable_list.append("Error")

    vader_header = ["vader_negative"+pos,"vader_neutral"+pos,"vader_positive"+pos,"vader_compound"+pos]
    for items in vader_header: header_list.append(items)

def run_lu_hui(text,var=None):
    if var is None:
        pos = ""
    else: pos = var

    lu_hui_positive_count = 0
    lu_hui_negative_count = 0
    lu_hui_pos_neg_count = 0

    for word in text:
        if word in lu_hui_positive:
            lu_hui_positive_count += 1
            lu_hui_pos_neg_count += 1
        if word in lu_hui_negative:
            lu_hui_negative_count += 1
            lu_hui_pos_neg_count += 1

    variable_list.append(safe_divide(lu_hui_positive_count,lu_hui_pos_neg_count)) #lu_hui_pos_perc
    variable_list.append(safe_divide(lu_hui_negative_count,lu_hui_pos_neg_count)) #lu_hui_neg_perc
    variable_list.append(safe_divide(lu_hui_positive_count,nwords)) #lu_hui_pos_nwords
    variable_list.append(safe_divide(lu_hui_negative_count,nwords)) #lu_hui_neg_nwords
    variable_list.append(proportion(lu_hui_positive_count,lu_hui_negative_count)) #lu_hui_prop
    #NOTE: the output headers use the "hu_liu" spelling (for the Hu & Liu lexicon),
    #while the component files refer to the same indices as "lu_hui"
    lu_hui_header = ["hu_liu_pos_perc"+pos,"hu_liu_neg_perc"+pos,"hu_liu_pos_nwords"+pos,"hu_liu_neg_nwords"+pos,"hu_liu_prop"+pos]
    for items in lu_hui_header: header_list.append(items)

def run_gi(text,var=None):
    if var is None:
        pos = ""
    else: pos = var

    #General Inquirer categories, in the original counting order:
    gi_categories = ["Positiv_GI","Negativ_GI","Pstv_GI","Affil_GI","Ngtv_GI","Hostile_GI",
        "Strong_GI","Power_GI","Weak_GI","Submit_GI","Active_GI","Passive_GI","Pleasur_GI",
        "Pain_GI","Feel_GI","Arousal_GI","Emot_GI","Virtue_GI","Vice_GI","Ovrst_GI",
        "Undrst_GI","Academ_GI","Doctrin_GI","Econ_2_GI","Exch_GI","Econ_GI","Exprsv_GI",
        "Legal_GI","Milit_GI","Polit_2_GI","Polit_GI","Relig_GI","Role_GI","Coll_GI",
        "Work_GI","Ritual_GI","Socrel_GI","Race_GI","Kin_2_GI","Male_GI","Female_GI",
        "Nonadlt_GI","Hu_GI","Ani_GI","Place_GI","Social_GI","Region_GI","Route_GI",
        "Aquatic_GI","Land_GI","Sky_GI","Object_GI","Tool_GI","Food_GI","Vehicle_GI",
        "Bldgpt_GI","Comnobj_GI","Natobj_GI","Bodypt_GI","Comform_GI","Com_GI","Say_GI",
        "Need_GI","Goal_GI","Try_GI","Means_GI","Persist_GI","Complet_GI","Fail_GI",
        "Natrpro_GI","Begin_GI","Vary_GI","Increas_GI","Decreas_GI","Finish_GI","Stay_GI",
        "Rise_GI","Exert_GI","Fetch_GI","Travel_GI","Fall_GI","Think_GI","Know_GI",
        "Causal_GI","Ought_GI","Perceiv_GI","Compare_GI","Eval_2_GI","Eval_GI","Solve_GI",
        "Abs_2_GI","Abs_GI","Quality_GI","Quan_GI","Numb_GI","Ord_GI","Card_GI","Freq_GI",
        "Dist_GI","Time_2_GI","Time_GI","Space_GI","Pos_GI","Dim_GI","Rel_GI","Color_GI",
        "Self_GI","Our_GI","You_GI","Name_GI","Yes_GI","No_GI","Negate_GI","Intrj_GI",
        "Iav_GI","Dav_GI","Sv_GI","Ipadj_GI","Indadj_GI"]
    for category in gi_categories:
        ListDict_counter(GI_dict,category,text,"yes",pos)

def run_lasswell(text,var=None):
    if var is None:
        pos = ""
    else: pos = var

    #Lasswell dictionary categories, in the original counting order:
    lasswell_categories = ["Powgain_Lasswell","Powloss_Lasswell","Powends_Lasswell",
        "Powaren_Lasswell","Powcon_Lasswell","Powcoop_Lasswell","Powaupt_Lasswell",
        "Powpt_Lasswell","Powdoct_Lasswell","Powauth_Lasswell","Powoth_Lasswell",
        "Powtot_Lasswell","Rcethic_Lasswell","Rcrelig_Lasswell","Rcgain_Lasswell",
        "Rcloss_Lasswell","Rcends_Lasswell","Rctot_Lasswell","Rspgain_Lasswell",
        "Rsploss_Lasswell","Rspoth_Lasswell","Rsptot_Lasswell","Affgain_Lasswell",
        "Affloss_Lasswell","Affpt_Lasswell","Affoth_Lasswell","Afftot_Lasswell",
        "Wltpt_Lasswell","Wlttran_Lasswell","Wltoth_Lasswell","Wlttot_Lasswell",
        "Wlbgain_Lasswell","Wlbloss_Lasswell","Wlbphys_Lasswell","Wlbpsyc_Lasswell",
        "Wlbpt_Lasswell","Wlbtot_Lasswell","Enlgain_Lasswell","Enlloss_Lasswell",
        "Enlends_Lasswell","Enlpt_Lasswell","Enloth_Lasswell","Enltot_Lasswell",
        "Sklasth_Lasswell","Sklpt_Lasswell","Skloth_Lasswell","Skltot_Lasswell",
        "Trngain_Lasswell","Trnloss_Lasswell","Tranlw_Lasswell","Meanslw_Lasswell",
        "Endslw_Lasswell","Arenalw_Lasswell","Ptlw_Lasswell","Nation_Lasswell",
        "Anomie_Lasswell","Negaff_Lasswell","Posaff_Lasswell","Surelw_Lasswell",
        "If_Lasswell","Notlw_Lasswell","Timespc_Lasswell","formlw_Lasswell"]
    for category in lasswell_categories:
        ListDict_counter(GI_dict,category,text,"yes",pos)

def component_count(component,header):
    output = 0
    item_number = 0

    for item in component:
        item_number += 1
        number = "blank"
        variable = item[0]

        cut_var = variable.split("_")
        eigen = float(item[1])

        if len(cut_var) > 1:
            if "2" in variable:
                target_name = "_".join(cut_var[0:3])
            else: target_name = "_".join(cut_var[0:2]) #first part of name to identify list
        else: target_name = cut_var[0]

        if "neg_3" in variable: #check for negation
            if "_nouns" in variable:
                input = noun_text_neg_3
            elif "_verbs" in variable:
                input = verb_text_neg_3
            elif "_adjectives" in variable:
                input = adjective_text_neg_3
            elif "_adverbs" in variable:
                input = adverb_text_neg_3
            else:
                input = neg_3_text
        else:
            if "_nouns" in variable:
                input = noun_text
            elif "_verbs" in variable:
                input = verb_text
            elif "_adjectives" in variable:
                input = adjective_text
            elif "_adverbs" in variable:
                input = adverb_text
            else:
                input = text_2

        if "_GI" in variable or "_Lasswell" in variable or "_NRC" in variable:
            number = ListDict_counter(GI_dict,target_name,input,"no")

        if "_GALC" in variable:
            input = " ".join(input)
            number = regex_count(affect_dict,target_name,input,"no")

        #ANEW
        if "Valence" in variable or "Arousal" in variable or "Dominance" in variable:
            if len(cut_var) > 1 and cut_var[1] == "nwords":
                if cut_var[0] == "Valence":
                    number = DataDict_counter(input, valence,1,"","")
                if cut_var[0] == "Arousal":
                    number = DataDict_counter(input, arousal,1,"","")
                if cut_var[0] == "Dominance":
                    number = DataDict_counter(input, dominance,1,"","")
            else:
                if cut_var[0] == "Valence":
                    number = DataDict_counter(input, valence,0,"","")
                if cut_var[0] == "Arousal":
                    number = DataDict_counter(input, arousal,0,"","")
                if cut_var[0] == "Dominance":
                    number = DataDict_counter(input, dominance,0,"","")

        #SENTIC
        if "pleasantness" in variable:
            #Ngram_DataDict_counter(in_text, data_dict,header,list)
            number = Ngram_DataDict_counter(input, pleasantness,"pleasantness","no")
        if "attention" in variable:
            number = Ngram_DataDict_counter(input, attention,"attention","no")
        if "sensitivity" in variable:
            number = Ngram_DataDict_counter(input, sensitivity,"sensitivity","no")
        if "aptitude" in variable:
            number = Ngram_DataDict_counter(input, aptitude,"aptitude","no")
        if "polarity" in variable:
            number = Ngram_DataDict_counter(input, polarity,"polarity","no")

        #Vader
        if "vader" in variable:
            input = " ".join(input) #NOTE: this joined string is never used below;
            #VADER scores the raw text (as in run_vader), so its punctuation and
            #capitalization heuristics still apply
            try:
                analyzer = SentimentIntensityAnalyzer()
                vs = analyzer.polarity_scores(text)

                if "vader_negative" in variable:
                    number = vs['neg']
                if "vader_neutral" in variable:
                    number = vs['neu']
                if "vader_positive" in variable:
                    number = vs['pos']
                if "vader_compound" in variable:
                    number = vs['compound']

            except UnicodeEncodeError:
                number = 0 #fallback for unencodable text (mirrors the "Error" handling in run_vader)
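        #------------------------------------------------------------------
        #Illustration (not part of the computation): a component score is the
        #loading-weighted sum of its constituent indices, accumulated by the
        #"output += (number*eigen)" line below. The same arithmetic with toy
        #values:
        #  component = [("Negativ_GI", 0.813), ("vader_negative", 0.535)]
        #  index_values = {"Negativ_GI": 0.04, "vader_negative": 0.12}
        #  score = sum(index_values[name] * loading
        #              for name, loading in component)  # -> 0.09672
        #------------------------------------------------------------------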
        #LuHui
        if "lu_hui" in variable:
            lu_hui_positive_count = 0
            lu_hui_negative_count = 0
            lu_hui_pos_neg_count = 0

            for word in input:
                if word in lu_hui_positive:
                    lu_hui_positive_count += 1
                    lu_hui_pos_neg_count += 1
                if word in lu_hui_negative:
                    lu_hui_negative_count += 1
                    lu_hui_pos_neg_count += 1

            if "lu_hui_pos_perc" in variable:
                number = safe_divide(lu_hui_positive_count,lu_hui_pos_neg_count)
            if "lu_hui_neg_perc" in variable:
                number = safe_divide(lu_hui_negative_count,lu_hui_pos_neg_count)
            if "lu_hui_pos_nwords" in variable:
                number = safe_divide(lu_hui_positive_count,nwords)
            if "lu_hui_neg_nwords" in variable:
                number = safe_divide(lu_hui_negative_count,nwords)
            if "lu_hui_prop" in variable:
                number = proportion(lu_hui_positive_count,lu_hui_negative_count)

        #every variable name in a component file is expected to match one of
        #the branches above; otherwise number is still the "blank" placeholder
        output += (number*eigen)

    variable_list.append(output)
    header_list.append(header)

if var_list[9].get() == 1 or var_list[10].get() == 1 or var_list[11].get() == 1 or var_list[12].get() == 1 or var_list[15].get() == 1:
    dataQueue.put("Loading spaCy")
    root.update_idletasks()
    import spacy
    from spacy.util import set_data_path
    set_data_path(resource_path('en_core_web_sm'))
    nlp = spacy.load(resource_path('en_core_web_sm'))

####Beginning of file iteration#######
file_counter = 1
nfiles = len(filenames)
for filename in filenames:
    variable_list = []
    header_list = ["filename","nwords"]

    print(filename)
    filename1 = ("Processing: " + str(file_counter) + " of " + str(nfiles) + " files")

    if system != "W":
        filename_2 = filename.split("/")[-1]
    else:
        filename_2 = filename.split("\\")[-1]

    dataQueue.put(filename1)
    root.update_idletasks()

    Text = open(filename, 'r',errors = 'ignore').read()
    #the text has educated quotes and we want straight quotes.
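    #e.g. re.sub("’", "'", "it’s") returns "it's"; only single curly quotes
    #are normalized here, so double curly quotes pass through unchanged (they
    #are not in the ASCII punctuation set used for stripping below)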
    text = re.sub("‘", "'", Text) #bottom heavy educated quote replaced by straight quotes
    text = re.sub("’", "'", text) #top heavy educated quote replaced by straight quotes

    nwords = len(text.split())

    #NOTE: this loop has no effect: it iterates over the characters of the
    #string, and reassigning the loop variable changes nothing; the real
    #punctuation stripping happens in the pre_text loop below
    for word in text:
        if word[-1] in punctuation:
            word = word[:-1]

    pre_text = open(filename, 'r',errors = 'ignore').read().lower().split()
    text_2 = []
    #Holder structure for lemmatized text:
    lemma_text = []
    neg_3_text = []
    neg_list = ["aint", "arent", "cannot", "cant", "couldnt", "darent", "didnt", "doesnt",
                "ain't", "aren't", "can't", "couldn't", "daren't", "didn't", "doesn't",
                "dont", "hadnt", "hasnt", "havent", "isnt", "mightnt", "mustnt", "neither",
                "don't", "hadn't", "hasn't", "haven't", "isn't", "mightn't", "mustn't",
                "neednt", "needn't", "never", "none", "nope", "nor", "not", "nothing", "nowhere",
                "oughtnt", "shant", "shouldnt", "uhuh", "wasnt", "werent",
                "oughtn't", "shan't", "shouldn't", "uh-uh", "wasn't", "weren't",
                "without", "wont", "wouldnt", "won't", "wouldn't", "rarely", "seldom", "despite"]
    #identical to neg_list except for the leading "n't", the clitic token that
    #spaCy's tokenizer produces (used for the tagged-text negation check below)
    neg_list_2 = ["n't"] + neg_list

    for word in pre_text: #gets rid of punctuation attached to words
        if len(word) < 1: #new in 1.4
            continue
        if len(word) == 1 and word in punctuation:
            continue
        if word[-1] in punctuation:
            word = word[:-1]
        if word[0] in punctuation:
            word = word[1:]
        #Creates lemmatized text:
        if word in lemma_dict:
            lemma_text.append(lemma_dict[word])
        else:
            lemma_text.append(word)
        text_2.append(word)

    #####negation text maker####

    neg_check = []
    for items in text_2:
        neg_check.append(items)
        if len(neg_check) > 4:
            neg_check = neg_check[1:]
        checker = 0
        ##print neg_check
        #the inner loop variable is renamed so it no longer shadows "items"
        for window_word in neg_check:
            if window_word in neg_list:
                checker += 1
        if checker > 0:
            continue
        else: neg_3_text.append(items)

    ###end negation text maker###
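    #Example of the 3-word negation window above: in "i am not happy today",
    #"not" matches neg_list, so "not", "happy", and "today" each fall inside
    #a window containing a negator and are excluded from neg_3_text,
    #leaving ["i", "am"]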
    ### Begin Creation of POS Lists: ###
    if var_list[9].get() == 1 or var_list[10].get() == 1 or var_list[11].get() == 1 or var_list[12].get() == 1 or var_list[15].get() == 1:

        noun_tags = "NN NNS NNP NNPS".split(" ") #consider whether to identify gerunds
        adjectives = "JJ JJR JJS".split(" ")
        verbs = "VB VBZ VBP VBD VBN VBG".split(" ")
        adverbs = ["RB"]
        pronouns = ["PRP", "PRP$"]
        pronoun_dict = {"me":"i","him":"he","her":"she"}

        pos_word_list = []
        pos_lemma_list = []

        noun_text = []
        noun_lemma_list = []
        adjective_text = []
        adjective_lemma_list = []
        verb_text = []
        verb_lemma_list = []
        adverb_text = []
        adverb_lemma_list = []

        noun_text_neg_3 = []
        adjective_text_neg_3 = []
        verb_text_neg_3 = []
        adverb_text_neg_3 = []

        tagged_text = nlp(text)

        for token in tagged_text:
            if token.tag_ in punctuation:
                continue

            if token.tag_ in pronouns:
                if token.text.lower() in pronoun_dict:
                    lemma_form = pronoun_dict[token.text.lower()]
                else:
                    lemma_form = token.text.lower()
            else:
                lemma_form = token.lemma_.lower()

            pos_word_list.append(token.text.lower())
            pos_lemma_list.append(lemma_form)

            if token.tag_ in noun_tags:
                noun_text.append(token.text.lower())
                noun_lemma_list.append(lemma_form)
            if token.tag_ in adjectives:
                adjective_text.append(token.text.lower())
                adjective_lemma_list.append(lemma_form)
            if token.tag_ in verbs:
                verb_text.append(token.text.lower())
                verb_lemma_list.append(lemma_form)
            if token.tag_ in adverbs:
                adverb_text.append(token.text.lower())
                adverb_lemma_list.append(lemma_form)
        ##print pos_word_list

        ### for negation check counts ###
        neg_check = []
        for token in tagged_text:
            if token.tag_ in punctuation:
                continue
            ### the following checks for negation in the span of 3 words to the left ###
            neg_check.append(token.text.lower())
            ##print neg_check
            if len(neg_check) > 4:
                neg_check = neg_check[1:]
            neg_checker = 0
            for window_word in neg_check:
                if window_word in neg_list_2:
                    neg_checker += 1
            if neg_checker > 0:
                continue
            ### if the negation check is clear, we go on to add the words ###
            if token.tag_ in noun_tags:
                noun_text_neg_3.append(token.text.lower())
            if token.tag_ in adjectives:
                adjective_text_neg_3.append(token.text.lower())
            if token.tag_ in verbs:
                verb_text_neg_3.append(token.text.lower())
            if token.tag_ in adverbs:
                adverb_text_neg_3.append(token.text.lower())

    ### end creation of POS lists ###

    variable_list = []
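    #The var_list checkboxes select what runs below: 1-8 pick the lexicons
    #(GALC, NRC, ANEW, SenticNet, VADER, Hu-Liu, General Inquirer, Lasswell),
    #9-12 restrict counting to nouns/verbs/adjectives/adverbs, 13 enables the
    #all-words indices, 14 adds the "_neg_3" negation-controlled variants, and
    #15 enables the component scores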
    #### Normal Program ####
    if var_list[1].get() == 1 and var_list[13].get() == 1: run_galc(text)
    if var_list[2].get() == 1 and var_list[13].get() == 1: run_nrc(text_2)
    if var_list[3].get() == 1 and var_list[13].get() == 1: run_anew(text_2)
    if var_list[4].get() == 1 and var_list[13].get() == 1: run_sentic(text_2)
    if var_list[5].get() == 1 and var_list[13].get() == 1: run_vader(Text)
    if var_list[6].get() == 1 and var_list[13].get() == 1: run_lu_hui(text_2)
    if var_list[7].get() == 1 and var_list[13].get() == 1: run_gi(text_2)
    if var_list[8].get() == 1 and var_list[13].get() == 1: run_lasswell(text_2)

    #### Normal Program Negation Controlled ####
    if var_list[1].get() == 1 and var_list[13].get() == 1 and var_list[14].get() == 1: run_galc(neg_3_text,"_neg_3")
    if var_list[2].get() == 1 and var_list[13].get() == 1 and var_list[14].get() == 1: run_nrc(neg_3_text,"_neg_3")
    if var_list[3].get() == 1 and var_list[13].get() == 1 and var_list[14].get() == 1: run_anew(neg_3_text,"_neg_3")
    if var_list[4].get() == 1 and var_list[13].get() == 1 and var_list[14].get() == 1: run_sentic(neg_3_text,"_neg_3")
    #if var_list[5].get() == 1 and var_list[13].get() == 1: run_vader(Text) #Vader already has negation control
    if var_list[6].get() == 1 and var_list[13].get() == 1 and var_list[14].get() == 1: run_lu_hui(neg_3_text,"_neg_3")
    if var_list[7].get() == 1 and var_list[13].get() == 1 and var_list[14].get() == 1: run_gi(neg_3_text,"_neg_3")
    if var_list[8].get() == 1 and var_list[13].get() == 1 and var_list[14].get() == 1: run_lasswell(neg_3_text,"_neg_3")

    #### Nouns Only ####
    if var_list[1].get() == 1 and var_list[9].get() == 1: run_galc(noun_text,"_nouns")
    if var_list[2].get() == 1 and var_list[9].get() == 1: run_nrc(noun_text,"_nouns")
    if var_list[3].get() == 1 and var_list[9].get() == 1: run_anew(noun_text,"_nouns")
    if var_list[4].get() == 1 and var_list[9].get() == 1: run_sentic(noun_text,"_nouns")
    if var_list[5].get() == 1 and var_list[9].get() == 1: run_vader(noun_text,"_nouns")
    if var_list[6].get() == 1 and var_list[9].get() == 1: run_lu_hui(noun_text,"_nouns")
    if var_list[7].get() == 1 and var_list[9].get() == 1: run_gi(noun_text,"_nouns")
    if var_list[8].get() == 1 and var_list[9].get() == 1: run_lasswell(noun_text,"_nouns")

    #### Verbs Only ####
    if var_list[1].get() == 1 and var_list[10].get() == 1: run_galc(verb_text,"_verbs")
    if var_list[2].get() == 1 and var_list[10].get() == 1: run_nrc(verb_text,"_verbs")
    if var_list[3].get() == 1 and var_list[10].get() == 1: run_anew(verb_text,"_verbs")
    if var_list[4].get() == 1 and var_list[10].get() == 1: run_sentic(verb_text,"_verbs")
    if var_list[5].get() == 1 and var_list[10].get() == 1: run_vader(verb_text,"_verbs")
    if var_list[6].get() == 1 and var_list[10].get() == 1: run_lu_hui(verb_text,"_verbs")
    if var_list[7].get() == 1 and var_list[10].get() == 1: run_gi(verb_text,"_verbs")
    if var_list[8].get() == 1 and var_list[10].get() == 1: run_lasswell(verb_text,"_verbs")

    #### Adjectives Only ####
    if var_list[1].get() == 1 and var_list[11].get() == 1: run_galc(adjective_text,"_adjectives")
    if var_list[2].get() == 1 and var_list[11].get() == 1: run_nrc(adjective_text,"_adjectives")
    if var_list[3].get() == 1 and var_list[11].get() == 1: run_anew(adjective_text,"_adjectives")
    if var_list[4].get() == 1 and var_list[11].get() == 1: run_sentic(adjective_text,"_adjectives")
    if var_list[5].get() == 1 and var_list[11].get() == 1: run_vader(adjective_text,"_adjectives")
    if var_list[6].get() == 1 and var_list[11].get() == 1: run_lu_hui(adjective_text,"_adjectives")
    if var_list[7].get() == 1 and var_list[11].get() == 1: run_gi(adjective_text,"_adjectives")
    if var_list[8].get() == 1 and var_list[11].get() == 1: run_lasswell(adjective_text,"_adjectives")

    #### Adverbs Only ####
    if var_list[1].get() == 1 and var_list[12].get() == 1: run_galc(adverb_text,"_adverbs")
    if var_list[2].get() == 1 and var_list[12].get() == 1: run_nrc(adverb_text,"_adverbs")
    if var_list[3].get() == 1 and var_list[12].get() == 1: run_anew(adverb_text,"_adverbs")
    if var_list[4].get() == 1 and var_list[12].get() == 1: run_sentic(adverb_text,"_adverbs")
    if var_list[5].get() == 1 and var_list[12].get() == 1: run_vader(adverb_text,"_adverbs")
    if var_list[6].get() == 1 and var_list[12].get() == 1: run_lu_hui(adverb_text,"_adverbs")
    if var_list[7].get() == 1 and var_list[12].get() == 1: run_gi(adverb_text,"_adverbs")
    if var_list[8].get() == 1 and var_list[12].get() == 1: run_lasswell(adverb_text,"_adverbs")
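    #Each negation-checked run below re-counts the same lexicon on the
    #negation-filtered POS lists, and the suffix is appended to every
    #variable name, so one index can appear several times in the header
    #(e.g. "Positiv_GI", "Positiv_GI_nouns", "Positiv_GI_nouns_neg_3")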
    #### Nouns Only Negation Checked ####
    if var_list[1].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_galc(noun_text_neg_3,"_nouns_neg_3")
    if var_list[2].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_nrc(noun_text_neg_3,"_nouns_neg_3")
    if var_list[3].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_anew(noun_text_neg_3,"_nouns_neg_3")
    if var_list[4].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_sentic(noun_text_neg_3,"_nouns_neg_3")
    #if var_list[5].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_vader(noun_text_neg_3,"_nouns_neg_3")
    if var_list[6].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_lu_hui(noun_text_neg_3,"_nouns_neg_3")
    if var_list[7].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_gi(noun_text_neg_3,"_nouns_neg_3")
    if var_list[8].get() == 1 and var_list[9].get() == 1 and var_list[14].get() == 1: run_lasswell(noun_text_neg_3,"_nouns_neg_3")

    #### Verbs Only Negation Checked ####
    if var_list[1].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_galc(verb_text_neg_3,"_verbs_neg_3")
    if var_list[2].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_nrc(verb_text_neg_3,"_verbs_neg_3")
    if var_list[3].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_anew(verb_text_neg_3,"_verbs_neg_3")
    if var_list[4].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_sentic(verb_text_neg_3,"_verbs_neg_3")
    #if var_list[5].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_vader(verb_text_neg_3,"_verbs_neg_3")
    if var_list[6].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_lu_hui(verb_text_neg_3,"_verbs_neg_3")
    if var_list[7].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_gi(verb_text_neg_3,"_verbs_neg_3")
    if var_list[8].get() == 1 and var_list[10].get() == 1 and var_list[14].get() == 1: run_lasswell(verb_text_neg_3,"_verbs_neg_3")

    #### Adjectives Only Negation Checked ####
    if var_list[1].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_galc(adjective_text_neg_3,"_adjectives_neg_3")
    if var_list[2].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_nrc(adjective_text_neg_3,"_adjectives_neg_3")
    if var_list[3].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_anew(adjective_text_neg_3,"_adjectives_neg_3")
    if var_list[4].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_sentic(adjective_text_neg_3,"_adjectives_neg_3")
    #if var_list[5].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_vader(adjective_text_neg_3,"_adjectives_neg_3")
    if var_list[6].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_lu_hui(adjective_text_neg_3,"_adjectives_neg_3")
    if var_list[7].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_gi(adjective_text_neg_3,"_adjectives_neg_3")
    if var_list[8].get() == 1 and var_list[11].get() == 1 and var_list[14].get() == 1: run_lasswell(adjective_text_neg_3,"_adjectives_neg_3")

    #### Adverbs Only Negation Checked ####
    if var_list[1].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_galc(adverb_text_neg_3,"_adverbs_neg_3")
    if var_list[2].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_nrc(adverb_text_neg_3,"_adverbs_neg_3")
    if var_list[3].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_anew(adverb_text_neg_3,"_adverbs_neg_3")
    if var_list[4].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_sentic(adverb_text_neg_3,"_adverbs_neg_3")
    #if var_list[5].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_vader(adverb_text_neg_3,"_adverbs_neg_3")
    if var_list[6].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_lu_hui(adverb_text_neg_3,"_adverbs_neg_3")
    if var_list[7].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_gi(adverb_text_neg_3,"_adverbs_neg_3")
    if var_list[8].get() == 1 and var_list[12].get() == 1 and var_list[14].get() == 1: run_lasswell(adverb_text_neg_3,"_adverbs_neg_3")
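    #Each components/C*.txt file pairs index names with loadings (e.g.
    #"Negative_NRC_adjectives 0.834" in C1.txt); component_count recomputes
    #each listed index for this file and sums the loading-weighted values
    #into one component score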
    #Second Iteration of Components
    if var_list[15].get() == 1:
        component_count(components_2["C1.txt"],"negative_adjectives_component")
        component_count(components_2["C2.txt"],"social_order_component")
        component_count(components_2["C3.txt"],"action_component")
        component_count(components_2["C4.txt"],"positive_adjectives_component")
        component_count(components_2["C5.txt"],"joy_component")
        component_count(components_2["C6.txt"],"affect_friends_and_family_component")
        component_count(components_2["C7.txt"],"fear_and_digust_component")
        component_count(components_2["C8.txt"],"politeness_component")
        component_count(components_2["C9.txt"],"polarity_nouns_component")
        component_count(components_2["C10.txt"],"polarity_verbs_component")
        component_count(components_2["C11.txt"],"virtue_adverbs_component")
        component_count(components_2["C12.txt"],"positive_nouns_component")
        component_count(components_2["C13.txt"],"respect_component")
        component_count(components_2["C14.txt"],"trust_verbs_component")
        component_count(components_2["C15.txt"],"failure_component")
        component_count(components_2["C16.txt"],"well_being_component")
        component_count(components_2["C17.txt"],"economy_component")
        component_count(components_2["C18.txt"],"certainty_component")
        component_count(components_2["C19.txt"],"positive_verbs_component")
        component_count(components_2["C20.txt"],"objects_component")

    if file_counter == 1:
        header = ",".join(header_list) + "\n"
        outf.write(header)

    variable_string_list = []
    for items in variable_list:
        variable_string_list.append(str(items))
    string = ",".join(variable_string_list)

    file_counter += 1
    #print(file_counter)
    #no space after the commas, so the data rows line up with the
    #comma-delimited header written above
    outf.write('{0},{1},{2}\n'.format(filename_2,nwords,string))

outf.flush() #flushes out buffer to clean output file
outf.close() #close output file

#Closing Message to let user know that the program did something
1479 | finishmessage = ("Processed " + str(nfiles) + " Files") 1480 | dataQueue.put(finishmessage) 1481 | root.update_idletasks() 1482 | if system == "M": 1483 | #self.progress.config(text =finishmessage) 1484 | tkinter.messagebox.showinfo("Finished!", "Your files have been processed by Sentiment Tool!") 1485 | 1486 | 1487 | if __name__ == '__main__': 1488 | root = tk.Tk() 1489 | root.wm_title("SEANCE 1.2.0") 1490 | root.configure(background = color) 1491 | root.geometry(geom_size) 1492 | myapp = MyApp(root) 1493 | root.mainloop() 1494 | -------------------------------------------------------------------------------- /components/C1.txt: -------------------------------------------------------------------------------- 1 | Negative_NRC_adjectives 0.834 2 | Disgust_NRC_adjectives_neg_3 0.824 3 | Anger_NRC_adjectives_neg_3 0.821 4 | Negativ_GI_adjectives 0.813 5 | Vice_GI_adjectives_neg_3 0.81 6 | Negaff_Lasswell_adjectives_neg_3 0.739 7 | Sadness_NRC_adjectives_neg_3 0.737 8 | Fear_NRC_adjectives_neg_3 0.723 9 | lu_hui_neg_nwords_adjectives_neg_3 0.709 10 | Vice_GI_neg_3 0.628 11 | lu_hui_pos_perc_adjectives_neg_3 -0.628 12 | lu_hui_neg_perc_adjectives 0.623 13 | lu_hui_neg_nwords 0.594 14 | Negaff_Lasswell 0.557 15 | Negativ_GI 0.546 16 | vader_negative 0.535 17 | lu_hui_pos_perc_neg_3 -0.519 18 | vader_compound -0.412 -------------------------------------------------------------------------------- /components/C10.txt: -------------------------------------------------------------------------------- 1 | polarity_verbs 0.933 2 | aptitude_verbs 0.856 3 | pleasantness_verbs 0.845 4 | attention_verbs 0.733 -------------------------------------------------------------------------------- /components/C11.txt: -------------------------------------------------------------------------------- 1 | Rcgain_Lasswell_adverbs_neg_3 0.908 2 | Hostile_GI_adverbs_neg_3 0.905 3 | Negativ_GI_adverbs_neg_3 0.756 4 | Rcgain_Lasswell 0.636 5 | Surelw_Lasswell_adverbs 0.577 -------------------------------------------------------------------------------- /components/C12.txt: -------------------------------------------------------------------------------- 1 | lu_hui_neg_perc_nouns -0.863 2 | lu_hui_pos_perc_nouns 0.857 3 | lu_hui_pos_nwords_nouns 0.707 4 | lu_hui_neg_nwords_nouns_neg_3 -0.458 -------------------------------------------------------------------------------- /components/C13.txt: -------------------------------------------------------------------------------- 1 | Rspoth_Lasswell_nouns 0.902 2 | Rsptot_Lasswell_nouns 0.895 3 | Rspoth_Lasswell 0.884 4 | Rsptot_Lasswell_neg_3 0.865 -------------------------------------------------------------------------------- /components/C14.txt: -------------------------------------------------------------------------------- 1 | Trust_NRC_verbs 0.805 2 | Joy_NRC_verbs 0.788 3 | Positive_NRC_verbs 0.765 4 | Trust_NRC 0.504 5 | Positive_NRC_neg_3 0.489 -------------------------------------------------------------------------------- /components/C15.txt: -------------------------------------------------------------------------------- 1 | Powloss_Lasswell_verbs 0.813 2 | Powloss_Lasswell_neg_3 0.793 3 | Fail_GI_verbs 0.77 4 | Fail_GI_neg_3 0.739 5 | Rsptot_Lasswell_verbs_neg_3 0.324 -------------------------------------------------------------------------------- /components/C16.txt: -------------------------------------------------------------------------------- 1 | Wlbphys_Lasswell_nouns_neg_3 0.891 2 | Wlbtot_Lasswell_nouns 0.87 3 | Wlbphys_Lasswell 0.806 4 | Wlbtot_Lasswell 
0.748 -------------------------------------------------------------------------------- /components/C17.txt: -------------------------------------------------------------------------------- 1 | Name_GI_adjectives_neg_3 0.864 2 | Polit_GI_adjectives_neg_3 0.86 3 | Econ_GI_adjectives 0.833 4 | Econ_GI_neg_3 0.572 -------------------------------------------------------------------------------- /components/C18.txt: -------------------------------------------------------------------------------- 1 | Quan_GI 0.758 2 | Undrst_GI 0.652 3 | Quan_GI_nouns 0.646 4 | Ovrst_GI 0.565 5 | If_Lasswell 0.526 6 | Surelw_Lasswell_nouns_neg_3 0.417 -------------------------------------------------------------------------------- /components/C19.txt: -------------------------------------------------------------------------------- 1 | lu_hui_pos_perc_verbs 0.886 2 | lu_hui_neg_perc_verbs -0.872 3 | lu_hui_pos_nwords_verbs 0.745 -------------------------------------------------------------------------------- /components/C2.txt: -------------------------------------------------------------------------------- 1 | Rcethic_Lasswell_verbs_neg_3 0.88 2 | Need_GI_verbs_neg_3 0.868 3 | Need_GI 0.83 4 | Rcethic_Lasswell 0.737 5 | Weak_GI_verbs 0.722 6 | Rctot_Lasswell 0.653 7 | Rel_GI_neg_3 0.63 8 | Weak_GI 0.572 9 | Solve_GI 0.531 10 | Passive_GI_neg_3 0.511 11 | Sv_GI 0.444 -------------------------------------------------------------------------------- /components/C20.txt: -------------------------------------------------------------------------------- 1 | Object_GI_neg_3 0.846 2 | Object_GI_nouns 0.815 3 | Beingtouched_GALC_neg_3 0.631 4 | Com_GI 0.602 -------------------------------------------------------------------------------- /components/C3.txt: -------------------------------------------------------------------------------- 1 | Ought_GI_verbs_neg_3 0.818 2 | Try_GI_verbs_neg_3 0.81 3 | Ought_GI 0.759 4 | Try_GI 0.759 5 | Travel_GI_verbs_neg_3 0.745 6 | Travel_GI 0.649 7 | Dav_GI_verbs_neg_3 0.576 8 | Dav_GI_neg_3 0.509 9 | Active_GI_verbs_neg_3 0.445 -------------------------------------------------------------------------------- /components/C4.txt: -------------------------------------------------------------------------------- 1 | lu_hui_pos_nwords_adjectives 0.685 2 | vader_positive 0.68 3 | Virtue_GI_neg_3 0.674 4 | Positiv_GI_adjectives 0.673 5 | lu_hui_pos_nwords_neg_3 0.652 6 | Positiv_GI_neg_3 0.645 7 | Virtue_GI_adjectives 0.636 8 | vader_neutral -0.545 9 | Posaff_Lasswell_adjectives 0.431 -------------------------------------------------------------------------------- /components/C5.txt: -------------------------------------------------------------------------------- 1 | Joy_NRC_adjectives 0.801 2 | Anticipation_NRC_adjectives 0.797 3 | Surprise_NRC_adjectives 0.76 4 | Trust_NRC_adjectives 0.698 5 | Positive_NRC_adjectives 0.691 6 | Valence_nwords_adjectives_neg_3 0.568 7 | Joy_NRC_neg_3 0.471 8 | Anticipation_NRC_neg_3 0.466 -------------------------------------------------------------------------------- /components/C6.txt: -------------------------------------------------------------------------------- 1 | Afftot_Lasswell_nouns_neg_3 0.905 2 | Afftot_Lasswell_neg_3 0.871 3 | Affpt_Lasswell_neg_3 0.816 4 | Kin_2_GI_nouns_neg_3 0.795 5 | Affil_GI_nouns 0.619 6 | Affoth_Lasswell_neg_3 0.599 7 | Affil_GI_neg_3 0.487 8 | Male_GI 0.365 9 | Dominance_nwords 0.358 -------------------------------------------------------------------------------- /components/C7.txt: 
-------------------------------------------------------------------------------- 1 | Disgust_NRC_nouns 0.769 2 | Negative_NRC_nouns 0.732 3 | Disgust_NRC_neg_3 0.668 4 | Negative_NRC_neg_3 0.646 5 | Fear_NRC_neg_3 0.629 6 | Anger_NRC_neg_3 0.596 7 | Sadness_NRC_neg_3 0.495 8 | Negative_NRC_verbs 0.34 -------------------------------------------------------------------------------- /components/C8.txt: -------------------------------------------------------------------------------- 1 | Polit_2_GI 0.851 2 | Polit_2_GI_nouns_neg_3 0.829 3 | Polit_GI_nouns_neg_3 0.799 4 | Polit_GI 0.75 5 | Powtot_Lasswell 0.575 6 | Powaupt_Lasswell 0.534 7 | Doctrin_GI 0.477 -------------------------------------------------------------------------------- /components/C9.txt: -------------------------------------------------------------------------------- 1 | polarity_nouns 0.858 2 | pleasantness_nouns_neg_3 0.788 3 | aptitude_nouns 0.784 4 | polarity_neg_3 0.572 5 | attention_neg_3 0.542 6 | Joy_NRC_nouns 0.463 7 | Positiv_GI_nouns 0.343 -------------------------------------------------------------------------------- /data_files/SEANCE_Icon.icns: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/data_files/SEANCE_Icon.icns -------------------------------------------------------------------------------- /data_files/__pycache__/vaderSentiment.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/data_files/__pycache__/vaderSentiment.cpython-36.pyc -------------------------------------------------------------------------------- /data_files/affective_list.txt: -------------------------------------------------------------------------------- 1 | Admiration/Awe_GALC admir* ador* awe* dazed dazzl* enrapt* enthrall* fascina* marveli* rapt* reveren* spellbound wonder* worship* 2 | Amusement_GALC amus* fun* humor* laugh* play* rollick* smil* 3 | Anger_GALC anger angr* cross* enrag* furious fury incens* infuriat* irate ire* mad* rag* resent* temper wrath* wrought* 4 | Anxiety_GALC anguish* anxi* apprehens* diffiden* jitter* nervous* trepida* wari* wary worried* worry* 5 | Beingtouched_GALC affect* mov* touch* 6 | Boredom_GALC bor* ennui indifferen* languor* tedi* wear* 7 | Compassion_GALC commiser* compass* empath* pit* 8 | Contempt_GALC contempt* denigr* deprec* deris* despi* disdain* scorn* 9 | Contentment_GALC comfortabl* content* satisf* 10 | Desperation_GALC deject* desolat* despair* desperat* despond* disconsolat* hopeless* inconsol* 11 | Disappointment_GALC comedown disappoint* discontent* disenchant* disgruntl* disillusion* frustrat* jilt* letdown resign* sour* thwart* 12 | Disgust_GALC abhor* avers* detest* disgust* dislik* disrelish distast* loath* nause* queas* repugn* repuls* revolt* sicken* 13 | Dissatisfaction_GALC dissatisf* unhapp* 14 | Envy_GALC envious* envy* 15 | Fear_GALC afraid* aghast* alarm* dread* fear* fright* horr* panic* scare* terror* 16 | Feelinglove_GALC affection* fond* love* friend* tender* 17 | Gratitude_GALC grat* thank* 18 | Guilt_GALC blame* contriti* guilt* remorse* repent* 19 | Happiness_GALC cheer* bliss* delect* delight* enchant* enjoy* felicit* happ* merr* 20 | Hatred_GALC acrimon* hat* rancor* 21 | Hope_GALC buoyan* confident* faith* hop* optim* 22 | Humility_GALC devout* humility 23 | Interest/Enthusiasm_GALC absor* 
alert animat* ardor* attenti* curi* eager* enrapt* engross* enthusias* ferv* interes* zeal* 24 | Irritation_GALC annoy* exasperat* grump* indign* irrita* sullen* vex* 25 | Jealousy_GALC covetous* jealous* 26 | Joy_GALC ecstat* elat* euphor* exalt* exhilar* exult* flush* glee* joy* jubil* overjoyed ravish* rejoic* 27 | Longing_GALC crav* daydream* desir* fanta* hanker* hark* homesick* long* nostalg* pin* regret* wish* wistf* yearn* 28 | Lust_GALC carnal lust* climax ecsta* orgas* sensu* sexual* 29 | Pleasure/Enjoyment_GALC enjoy* delight* glow* pleas* thrill* zest* 30 | Pride_GALC pride* proud* 31 | Relaxation/Serenity_GALC ease* calm* carefree casual detach* dispassion* equanim* eventemper* laid-back peace* placid* poise* relax* seren* tranquil* unruffl* 32 | Relief_GALC relie* 33 | Sadness_GALC chagrin* deject* dole* gloom* glum* grie* hopeles* melancho* mourn* sad* sorrow* tear* weep* 34 | Shame_GALC abash* asham* crush* disgrace* embarras* humili* shame* 35 | Surprise_GALC amaze* astonish* dumbfound* startl* stunn* surpris* aback thunderstruck wonder* 36 | Tension/Stress_GALC activ* agit* discomfort* distress* strain* stress* tense* 37 | Positive_GALC agree* excellent fair fine good nice positiv* 38 | Negative_GALC bad disagree* lousy negativ* unpleas* -------------------------------------------------------------------------------- /data_files/affective_norms.txt: -------------------------------------------------------------------------------- 1 | #word valence arousal dominance w 2 | abduction 2.76 5.53 3.49 0 3 | abortion 3.5 5.39 4.59 0 4 | absurd 4.26 4.36 4.73 0 5 | abundance 6.59 5.51 5.8 0 6 | abuse 1.8 6.83 3.69 0 7 | acceptance 7.98 5.4 6.64 0 8 | accident 2.05 6.26 3.76 0 9 | ace 6.88 5.5 6.39 0 10 | ache 2.46 5 3.54 0 11 | achievement 7.89 5.53 6.56 0 12 | activate 5.46 4.86 5.43 0 13 | addict 2.48 5.66 3.72 0 14 | addicted 2.51 4.81 3.46 0 15 | admired 7.74 6.11 7.53 0 16 | adorable 7.81 5.12 5.74 0 17 | adult 6.49 4.76 5.75 0 18 | advantage 6.95 4.76 6.36 0 19 | adventure 7.6 6.98 6.46 0 20 | affection 8.39 6.21 6.08 0 21 | afraid 2 6.67 3.98 0 22 | aggressive 5.1 5.83 5.59 0 23 | agility 6.46 4.85 5.87 0 24 | agony 2.43 6.06 4.02 0 25 | agreement 7.08 5.02 6.22 0 26 | air 6.34 4.12 5.1 0 27 | alcoholic 2.84 5.69 4.45 0 28 | alert 6.2 6.85 5.96 0 29 | alien 5.6 5.45 4.64 0 30 | alimony 3.95 4.3 4.63 0 31 | alive 7.25 5.5 6.39 0 32 | allergy 3.07 4.64 3.21 0 33 | alley 4.48 4.91 4 0 34 | alone 2.41 4.83 3.7 0 35 | aloof 4.9 4.28 4.69 0 36 | ambition 7.04 5.61 6.93 0 37 | ambulance 2.47 7.33 3.22 0 38 | angel 7.53 4.83 4.97 0 39 | anger 2.34 7.63 5.5 0 40 | angry 2.85 7.17 5.55 0 41 | ankle 5.27 4.16 4.77 0 42 | annoy 2.74 6.49 5.09 0 43 | answer 6.63 5.41 5.85 0 44 | anxious 4.81 6.92 5.33 0 45 | applause 7.5 5.8 6.48 0 46 | appliance 5.1 4.05 5.05 0 47 | arm 5.34 3.59 5.07 0 48 | army 4.72 5.03 5.03 0 49 | aroused 7.97 6.63 6.14 0 50 | arrogant 3.69 5.65 5.14 0 51 | art 6.68 4.86 5.3 0 52 | assassin 3.09 6.28 4.33 0 53 | assault 2.03 7.51 3.94 0 54 | astonished 6.56 6.58 5.16 0 55 | astronaut 6.66 5.28 5.2 0 56 | athletics 6.61 6.1 6.12 0 57 | autumn 6.3 4.51 5.15 0 58 | avalanche 3.29 5.54 3.61 0 59 | avenue 5.5 4.12 5.4 0 60 | awed 6.7 5.74 5.3 0 61 | baby 8.22 5.53 5 0 62 | bake 6.17 5.1 5.49 0 63 | bandage 4.54 3.9 4.52 0 64 | bankrupt 2 6.21 3.27 0 65 | banner 5.4 3.83 4.8 0 66 | bar 6.42 5 5.47 0 67 | barrel 5.05 3.36 4.89 0 68 | basket 5.45 3.63 5.76 0 69 | bastard 3.36 6.07 4.17 0 70 | bath 7.33 4.16 6.41 0 71 | bathroom 5.55 3.88 5.65 0 72 | bathtub 6.69 4.36 5.76 
0 73 | beach 8.03 5.53 5.44 0 74 | beast 4.23 5.57 4.89 0 75 | beautiful 7.6 6.17 6.29 0 76 | beauty 7.82 4.95 5.53 0 77 | bed 7.51 3.61 6.88 0 78 | bees 3.2 6.51 4.16 0 79 | beggar 3.22 4.91 4.09 0 80 | bench 4.61 3.59 4.68 0 81 | bereavement 4.57 4.2 4.33 0 82 | betray 1.68 7.24 4.92 0 83 | beverage 6.83 5.21 5.63 0 84 | bird 7.27 3.17 4.42 0 85 | birthday 7.84 6.68 5.89 0 86 | black 5.39 4.61 5.14 0 87 | blackmail 2.95 6.03 3.54 0 88 | bland 4.1 3.29 4.88 0 89 | blase 4.89 3.94 4.57 0 90 | blasphemy 3.75 4.93 4.75 0 91 | bless 7.19 4.05 5.52 0 92 | blind 3.05 4.39 3.28 0 93 | bliss 6.95 4.41 6.12 0 94 | blister 2.88 4.1 3.98 0 95 | blond 6.43 5.07 5.74 0 96 | bloody 2.9 6.41 3.96 0 97 | blossom 7.26 5.03 5.53 0 98 | blubber 3.52 4.57 3.86 0 99 | blue 6.76 4.31 5.63 0 100 | board 4.82 3.36 4.98 0 101 | body 5.55 5.52 5.34 0 102 | bold 6.8 5.6 6.67 0 103 | bomb 2.1 7.15 4.54 0 104 | book 5.72 4.17 5.3 0 105 | bored 2.95 2.83 4.11 0 106 | bottle 6.15 4.79 4.78 0 107 | bouquet 7.02 5.46 6.15 0 108 | bowl 5.33 3.47 4.69 0 109 | boxer 5.51 5.12 5.1 0 110 | boy 6.32 4.58 5.34 0 111 | brave 7.15 6.15 7.22 0 112 | breast 6.5 5.37 5.39 0 113 | breeze 6.85 4.37 5.54 0 114 | bride 7.34 5.55 5.74 0 115 | bright 7.5 5.4 6.34 0 116 | broken 3.05 5.43 4.14 0 117 | brother 7.11 4.71 5.12 0 118 | brutal 2.8 6.6 4.59 0 119 | building 5.29 3.92 5.25 0 120 | bullet 3.29 5.33 3.9 0 121 | bunny 7.24 4.06 4.97 0 122 | burdened 2.5 5.63 5.03 0 123 | burial 2.05 5.08 3.55 0 124 | burn 2.73 6.22 4.22 0 125 | bus 4.51 3.55 4.84 0 126 | busybody 5.17 4.84 5.45 0 127 | butter 5.33 3.17 4.67 0 128 | butterfly 7.17 3.47 4.65 0 129 | cabinet 5.05 3.43 4.73 0 130 | cake 7.26 5 5.16 0 131 | cancer 1.5 6.42 3.42 0 132 | candy 6.54 4.58 5.33 0 133 | cane 4 4.2 4.27 0 134 | cannon 4.9 4.71 5.17 0 135 | capable 7.16 5.08 6.47 0 136 | car 7.73 6.24 6.98 0 137 | carcass 3.34 4.83 4.9 0 138 | carefree 7.54 4.17 5.78 0 139 | caress 7.84 5.14 5.83 0 140 | cash 8.37 7.37 6.96 0 141 | casino 6.81 6.51 5.12 0 142 | cat 5.72 4.38 6.16 0 143 | cell 3.82 4.08 4.12 0 144 | cellar 4.32 4.39 4.66 0 145 | cemetery 2.63 4.82 4.27 0 146 | chair 5.08 3.15 4.56 0 147 | champ 7.18 6 6.77 0 148 | champion 8.44 5.85 6.5 0 149 | chance 6.02 5.38 4.64 0 150 | chaos 4.17 6.67 3.86 0 151 | charm 6.77 5.16 5.57 0 152 | cheer 8.1 6.12 6 0 153 | child 7.08 5.55 5.1 0 154 | chin 5.29 3.31 5.26 0 155 | chocolate 6.88 5.29 5.18 0 156 | christmas 7.8 6.27 5.37 0 157 | church 6.28 4.34 5 0 158 | circle 5.67 3.86 5.03 0 159 | circus 7.3 5.97 5.39 0 160 | city 6.03 5.24 5.74 0 161 | cliff 4.67 6.25 4.35 0 162 | clock 5.14 4.02 4.67 0 163 | clothing 6.54 4.78 5.33 0 164 | clouds 6.18 3.3 5.22 0 165 | clumsy 4 5.18 3.86 0 166 | coarse 4.55 4.21 5 0 167 | coast 5.98 4.59 5.67 0 168 | cockroach 2.81 6.11 4.74 0 169 | coffin 2.56 5.03 4.08 0 170 | coin 6.02 4.29 5.66 0 171 | cold 4.02 5.19 4.69 0 172 | color 7.02 4.73 6.17 0 173 | column 5.17 3.62 4.81 0 174 | comedy 8.37 5.85 5.44 0 175 | comfort 7.07 3.93 5.7 0 176 | computer 6.24 4.75 5.29 0 177 | concentrate 5.2 4.65 4.97 0 178 | confident 7.98 6.22 7.68 0 179 | confused 3.21 6.03 4.24 0 180 | consoled 5.78 4.53 4.44 0 181 | contempt 3.85 5.28 5.13 0 182 | contents 4.89 4.32 4.85 0 183 | context 5.2 4.22 5.17 0 184 | controlling 3.8 6.1 5.17 0 185 | cook 6.16 4.44 5.14 0 186 | cord 5.1 3.54 5 0 187 | cork 5.22 3.8 4.98 0 188 | corner 4.36 3.91 4.12 0 189 | corpse 2.18 4.74 3.59 0 190 | corridor 4.88 3.63 5 0 191 | corrupt 3.32 4.67 4.64 0 192 | cottage 6.45 3.39 5.39 0 193 | couple 7.41 6.39 6.02 0 194 | cow 
5.57 3.49 5.32 0 195 | coward 2.74 4.07 2.83 0 196 | cozy 7.39 3.32 4.89 0 197 | crash 2.31 6.95 3.44 0 198 | crime 2.89 5.41 4.12 0 199 | criminal 2.93 4.79 3.34 0 200 | crisis 2.74 5.44 3.6 0 201 | crown 6.58 4.28 6.06 0 202 | crucify 2.23 6.47 3.74 0 203 | crude 3.12 5.07 4.27 0 204 | cruel 1.97 5.68 4.24 0 205 | crushed 2.21 5.52 3.36 0 206 | crutch 3.43 4.14 3.91 0 207 | cuddle 7.72 4.4 5.85 0 208 | cuisine 6.64 4.39 5.41 0 209 | curious 6.08 5.82 5.42 0 210 | curtains 4.83 3.67 5.05 0 211 | custom 5.85 4.66 5 0 212 | cut 3.64 5 4.7 0 213 | cute 7.62 5.53 4.86 0 214 | cyclone 3.6 6.36 4.89 0 215 | dagger 3.38 6.14 4.52 0 216 | damage 3.05 5.57 3.88 0 217 | dancer 7.14 6 6.02 0 218 | danger 2.95 7.32 3.59 0 219 | dark 4.71 4.28 4.84 0 220 | dawn 6.16 4.39 5.16 0 221 | daylight 6.8 4.77 5.48 0 222 | dazzle 7.29 6.33 5.62 0 223 | dead 1.94 5.73 2.84 0 224 | death 1.61 4.59 3.47 0 225 | debt 2.22 5.68 3.02 0 226 | deceit 2.9 5.68 3.95 0 227 | decompose 3.2 4.65 4.02 0 228 | decorate 6.93 5.14 6.05 0 229 | defeated 2.34 5.09 3.11 0 230 | defiant 4.26 6.1 5.77 0 231 | deformed 2.41 4.07 3.95 0 232 | delayed 3.07 5.62 3.64 0 233 | delight 8.26 5.44 5.79 0 234 | demon 2.11 6.76 4.89 0 235 | dentist 4.02 5.73 3.8 0 236 | depressed 1.83 4.72 2.74 0 237 | depression 1.85 4.54 2.91 0 238 | derelict 4.28 4.1 4.78 0 239 | deserter 2.45 5.5 3.77 0 240 | desire 7.69 7.35 6.49 0 241 | despairing 2.43 5.68 3.43 0 242 | despise 2.03 6.28 4.72 0 243 | destroy 2.64 6.83 4.94 0 244 | destruction 3.16 5.82 3.93 0 245 | detached 3.86 4.26 3.63 0 246 | detail 5.55 4.1 5.21 0 247 | detest 2.17 6.06 5.83 0 248 | devil 2.21 6.07 5.35 0 249 | devoted 7.41 5.23 6.18 0 250 | diamond 7.92 5.53 5.54 0 251 | dignified 7.1 4.12 6.12 0 252 | dinner 7.16 5.43 6.1 0 253 | diploma 8 5.67 6.76 0 254 | dirt 4.17 3.76 4.83 0 255 | dirty 3.08 4.88 4.7 0 256 | disappoint 2.39 4.92 3.29 0 257 | disaster 1.73 6.33 3.52 0 258 | discomfort 2.19 4.17 3.86 0 259 | discouraged 3 4.53 3.61 0 260 | disdainful 3.68 5.04 4.55 0 261 | disgusted 2.45 5.42 4.34 0 262 | disloyal 1.93 6.56 3.79 0 263 | displeased 2.79 5.64 4.19 0 264 | distressed 1.94 6.4 3.76 0 265 | disturb 3.66 5.8 4.55 0 266 | diver 6.45 5.04 5.04 0 267 | divorce 2.22 6.33 3.26 0 268 | doctor 5.2 5.86 4.89 0 269 | dog 7.57 5.76 6.25 0 270 | doll 6.09 4.24 4.61 0 271 | dollar 7.47 6.07 6.33 0 272 | door 5.13 3.8 4.69 0 273 | dove 6.9 3.79 5.48 0 274 | dentist 4.02 5.73 3.8 0 275 | depressed 1.83 4.72 2.74 0 276 | depression 1.85 4.54 2.91 0 277 | derelict 4.28 4.1 4.78 0 278 | deserter 2.45 5.5 3.77 0 279 | desire 7.69 7.35 6.49 0 280 | despairing 2.43 5.68 3.43 0 281 | despise 2.03 6.28 4.72 0 282 | destroy 2.64 6.83 4.94 0 283 | destruction 3.16 5.82 3.93 0 284 | detached 3.86 4.26 3.63 0 285 | detail 5.55 4.1 5.21 0 286 | detest 2.17 6.06 5.83 0 287 | devil 2.21 6.07 5.35 0 288 | devoted 7.41 5.23 6.18 0 289 | diamond 7.92 5.53 5.54 0 290 | dignified 7.1 4.12 6.12 0 291 | dinner 7.16 5.43 6.1 0 292 | diploma 8 5.67 6.76 0 293 | dirt 4.17 3.76 4.83 0 294 | dirty 3.08 4.88 4.7 0 295 | disappoint 2.39 4.92 3.29 0 296 | disaster 1.73 6.33 3.52 0 297 | discomfort 2.19 4.17 3.86 0 298 | discouraged 3 4.53 3.61 0 299 | disdainful 3.68 5.04 4.55 0 300 | disgusted 2.45 5.42 4.34 0 301 | disloyal 1.93 6.56 3.79 0 302 | displeased 2.79 5.64 4.19 0 303 | distressed 1.94 6.4 3.76 0 304 | disturb 3.66 5.8 4.55 0 305 | diver 6.45 5.04 5.04 0 306 | divorce 2.22 6.33 3.26 0 307 | doctor 5.2 5.86 4.89 0 308 | dog 7.57 5.76 6.25 0 309 | doll 6.09 4.24 4.61 0 310 | dollar 7.47 6.07 6.33 0 
311 | door 5.13 3.8 4.69 0 312 | dove 6.9 3.79 5.48 0 313 | fall 4.09 4.7 4 0 314 | FALSE 3.27 3.43 4.1 0 315 | fame 7.93 6.55 6.85 0 316 | family 7.65 4.8 6 0 317 | famous 6.98 5.73 6.32 0 318 | fantasy 7.41 5.14 6.43 0 319 | farm 5.53 3.9 5.59 0 320 | fascinate 7.34 5.83 6.15 0 321 | fat 2.28 4.81 4.47 0 322 | father 7.08 5.92 5.63 0 323 | fatigued 3.28 2.64 3.78 0 324 | fault 3.43 4.07 4.02 0 325 | favor 6.46 4.54 5.67 0 326 | fear 2.76 6.96 3.22 0 327 | fearful 2.25 6.33 3.64 0 328 | feeble 3.26 4.1 2.71 0 329 | festive 7.3 6.58 5.77 0 330 | fever 2.76 4.29 3.52 0 331 | field 6.2 4.08 5.84 0 332 | fight 3.76 7.15 5.27 0 333 | filth 2.47 5.12 3.81 0 334 | finger 5.29 3.78 5.05 0 335 | fire 3.22 7.17 4.49 0 336 | fireworks 7.55 6.67 5.51 0 337 | fish 6.04 4 6.02 0 338 | flabby 2.66 4.82 3.31 0 339 | flag 6.02 4.6 5.5 0 340 | flirt 7.52 6.91 6.24 0 341 | flood 3.19 6 3.24 0 342 | flower 6.64 4 4.98 0 343 | foam 6.07 5.26 5.24 0 344 | food 7.65 5.92 6.18 0 345 | foot 5.02 3.27 4.98 0 346 | fork 5.29 3.96 5.74 0 347 | foul 2.81 4.93 4.51 0 348 | fragrance 6.07 4.79 5.14 0 349 | fraud 2.67 5.75 3.58 0 350 | free 8.26 5.15 6.35 0 351 | freedom 7.58 5.52 6.76 0 352 | friend 7.74 5.74 6.74 0 353 | friendly 8.43 5.11 5.92 0 354 | frigid 3.5 4.75 4.27 0 355 | frog 5.71 4.54 5.34 0 356 | frustrated 2.48 5.61 3.5 0 357 | fun 8.37 7.22 6.8 0 358 | funeral 1.39 4.94 2.97 0 359 | fungus 3.06 4.68 4.06 0 360 | fur 4.51 4.18 4.32 0 361 | game 6.98 5.89 5.7 0 362 | gangrene 2.28 5.7 3.36 0 363 | garbage 2.98 5.04 4.24 0 364 | garden 6.71 4.39 6.02 0 365 | garment 6.07 4.49 5.3 0 366 | garter 6.22 5.47 5.82 0 367 | gender 5.73 4.38 5.6 0 368 | gentle 7.31 3.21 5.1 0 369 | germs 2.86 4.49 3.79 0 370 | gift 7.77 6.14 5.52 0 371 | girl 6.87 4.29 5.8 0 372 | glacier 5.5 4.24 4.92 0 373 | glamour 6.76 4.68 5.76 0 374 | glass 4.75 4.27 5 0 375 | gloom 1.88 3.83 3.55 0 376 | glory 7.55 6.02 6.85 0 377 | god 8.15 5.95 5.88 0 378 | gold 7.54 5.76 5.85 0 379 | golfer 5.61 3.73 5.55 0 380 | good 7.47 5.43 6.41 0 381 | gossip 3.48 5.74 3.57 0 382 | graduate 8.19 7.25 6.94 0 383 | grass 6.12 4.14 5.44 0 384 | grateful 7.37 4.58 6.18 0 385 | greed 3.51 4.71 4.88 0 386 | green 6.18 4.28 4.82 0 387 | greet 7 5.27 5.95 0 388 | grenade 3.6 5.7 4.29 0 389 | grief 1.69 4.78 3.5 0 390 | grime 3.37 3.98 4.47 0 391 | grin 7.4 5.27 6 0 392 | gripe 3.14 5 4.67 0 393 | guillotine 2.48 6.56 4.64 0 394 | guilty 2.63 6.04 3.09 0 395 | gun 3.47 7.02 3.53 0 396 | gymnast 6.35 5.02 5.31 0 397 | habit 4.11 3.95 4.3 0 398 | hairdryer 4.84 3.71 5.57 0 399 | hairpin 5.26 3.27 5.05 0 400 | hamburger 6.27 4.55 5.32 0 401 | hammer 4.88 4.58 4.75 0 402 | hand 5.95 4.4 5.35 0 403 | handicap 3.29 3.81 4 0 404 | handsome 7.93 5.95 5.19 0 405 | haphazard 4.02 4.07 4.29 0 406 | happy 8.21 6.49 6.63 0 407 | hard 5.22 5.12 5.59 0 408 | hardship 2.45 4.76 4.22 0 409 | hat 5.46 4.1 5.39 0 410 | hate 2.12 6.95 5.05 0 411 | hatred 1.98 6.66 4.3 0 412 | hawk 5.88 4.39 5.5 0 413 | hay 5.24 3.95 5.37 0 414 | headache 2.02 5.07 3.6 0 415 | headlight 5.24 3.81 4.88 0 416 | heal 7.09 4.77 5.79 0 417 | health 6.81 5.13 5.83 0 418 | heart 7.39 6.34 5.49 0 419 | heaven 7.3 5.61 6.15 0 420 | hell 2.24 5.38 3.24 0 421 | helpless 2.2 5.34 2.27 0 422 | heroin 4.36 5.11 4.8 0 423 | hide 4.32 5.28 3.4 0 424 | highway 5.92 5.16 5.66 0 425 | hinder 3.81 4.12 4.21 0 426 | history 5.24 3.93 4.83 0 427 | hit 4.33 5.73 4.88 0 428 | holiday 7.55 6.59 6.3 0 429 | home 7.91 4.21 5.9 0 430 | honest 7.7 5.32 6.24 0 431 | honey 6.73 4.51 5.44 0 432 | honor 7.66 5.9 6.7 0 433 | 
hooker 3.34 4.93 4.73 0 434 | hope 7.05 5.44 5.52 0 435 | hopeful 7.1 5.78 5.41 0 436 | horror 2.76 7.21 4.63 0 437 | horse 5.89 3.89 4.67 0 438 | hospital 5.04 5.98 4.69 0 439 | hostage 2.2 6.76 2.83 0 440 | hostile 2.73 6.44 4.85 0 441 | hotel 6 4.8 5.12 0 442 | house 7.26 4.56 6.08 0 443 | hug 8 5.35 5.79 0 444 | humane 6.89 4.5 5.7 0 445 | humble 5.86 3.74 4.76 0 446 | humiliate 2.24 6.14 2.6 0 447 | humor 8.56 5.5 6.08 0 448 | hungry 3.58 5.13 4.68 0 449 | hurricane 3.34 6.83 3.07 0 450 | hurt 1.9 5.85 3.33 0 451 | hydrant 5.02 3.71 5.53 0 452 | icebox 4.95 4.17 5.05 0 453 | idea 7 5.86 6.26 0 454 | identity 6.57 4.95 6.4 0 455 | idiot 3.16 4.21 3.18 0 456 | idol 6.12 4.95 5.37 0 457 | ignorance 3.07 4.39 4.41 0 458 | illness 2.48 4.71 3.21 0 459 | imagine 7.32 5.98 7.07 0 460 | immature 3.39 4.15 4.85 0 461 | immoral 3.5 4.98 4.66 0 462 | impair 3.18 4.04 4.09 0 463 | impotent 2.81 4.57 3.43 0 464 | impressed 7.33 5.42 5.51 0 465 | improve 7.65 5.69 6.08 0 466 | incentive 7 5.69 5.93 0 467 | indifferent 4.61 3.18 4.84 0 468 | industry 5.3 4.47 4.91 0 469 | infant 6.95 5.05 5.67 0 470 | infatuation 6.73 7.02 4.9 0 471 | infection 1.66 5.03 3.61 0 472 | inferior 3.07 3.83 2.78 0 473 | inhabitant 5.05 3.95 5.37 0 474 | injury 2.49 5.69 3.57 0 475 | ink 5.05 3.84 4.61 0 476 | innocent 6.51 4.21 5.28 0 477 | insane 2.85 5.83 4.12 0 478 | insect 4.07 4.07 4.56 0 479 | insecure 2.36 5.56 2.33 0 480 | insolent 4.35 5.38 4.5 0 481 | inspire 6.97 5 6.34 0 482 | inspired 7.15 6.02 6.67 0 483 | insult 2.29 6 3.62 0 484 | intellect 6.82 4.75 6.3 0 485 | intercourse 7.36 7 6.4 0 486 | interest 6.97 5.66 5.88 0 487 | intimate 7.61 6.98 5.86 0 488 | intruder 2.77 6.86 4 0 489 | invader 3.05 5.5 4 0 490 | invest 5.93 5.12 5.88 0 491 | iron 4.9 3.76 5.1 0 492 | irritate 3.11 5.76 5.03 0 493 | item 5.26 3.24 5.26 0 494 | jail 1.95 5.49 3.81 0 495 | jealousy 2.51 6.36 3.8 0 496 | jelly 5.66 3.7 4.53 0 497 | jewel 7 5.38 5.59 0 498 | joke 8.1 6.74 6.15 0 499 | jolly 7.41 5.57 6.39 0 500 | journal 5.14 4.05 5.26 0 501 | joy 8.6 7.22 6.28 0 502 | joyful 8.22 5.98 6.6 0 503 | jug 5.24 3.88 5.05 0 504 | justice 7.78 5.47 6.47 0 505 | kerchief 5.11 3.43 5.25 0 506 | kerosene 4.8 4.34 4.63 0 507 | ketchup 5.6 4.09 5.29 0 508 | kettle 5.22 3.22 5 0 509 | key 5.68 3.7 4.98 0 510 | kick 4.31 4.9 5.5 0 511 | kids 6.91 5.27 5.07 0 512 | killer 1.89 7.86 4.54 0 513 | kind 7.59 4.46 5.95 0 514 | kindness 7.82 4.3 5.67 0 515 | king 7.26 5.51 7.38 0 516 | kiss 8.26 7.32 6.93 0 517 | kitten 6.86 5.08 6.86 0 518 | knife 3.62 5.8 4.12 0 519 | knot 4.64 4.07 4.67 0 520 | knowledge 7.58 5.92 6.78 0 521 | lake 6.82 3.95 4.9 0 522 | lamb 5.89 3.36 4.91 0 523 | lamp 5.41 3.8 5.27 0 524 | lantern 5.57 4.05 5.07 0 525 | laughter 8.45 6.75 6.45 0 526 | lavish 6.21 4.93 5.64 0 527 | lawn 5.24 4 5.37 0 528 | lawsuit 3.37 4.93 3.92 0 529 | lazy 4.38 2.65 4.07 0 530 | leader 7.63 6.27 7.88 0 531 | learn 7.15 5.39 6.34 0 532 | legend 6.39 4.88 5.54 0 533 | leisurely 6.88 3.8 5.15 0 534 | leprosy 2.09 6.29 4 0 535 | lesbian 4.67 5.12 5.35 0 536 | letter 6.61 4.9 5.73 0 537 | liberty 7.98 5.6 6.29 0 538 | lice 2.31 5 3.95 0 539 | lie 2.79 5.96 3.3 0 540 | life 7.27 6.02 5.72 0 541 | lightbulb 5.61 4.1 5.82 0 542 | lighthouse 5.89 4.41 5.25 0 543 | lightning 4.57 6.61 3.67 0 544 | limber 5.68 4.57 5.34 0 545 | lion 5.57 6.2 4.12 0 546 | listless 4.12 4.1 4.14 0 547 | lively 7.2 5.53 6.09 0 548 | locker 5.19 3.38 5.36 0 549 | loneliness 1.61 4.56 2.51 0 550 | lonely 2.17 4.51 2.95 0 551 | loser 2.25 4.95 3.02 0 552 | lost 2.82 5.82 2.86 0 
553 | lottery 6.57 5.36 4.81 0 554 | louse 2.81 4.98 3.57 0 555 | love 8.72 6.44 7.11 0 556 | loved 8.64 6.38 6.62 0 557 | loyal 7.55 5.16 6.91 0 558 | lucky 8.17 6.53 6.05 0 559 | lump 4.16 4.8 4.32 0 560 | luscious 7.5 5.34 5.68 0 561 | lust 7.12 6.88 5.49 0 562 | luxury 7.88 4.75 6.4 0 563 | machine 5.09 3.82 5.23 0 564 | mad 2.44 6.76 5.86 0 565 | madman 3.91 5.56 4.79 0 566 | maggot 2.06 5.28 4.03 0 567 | magical 7.46 5.95 5.73 0 568 | mail 6.88 5.63 5.67 0 569 | malaria 2.4 4.4 3.22 0 570 | malice 2.69 5.86 4.74 0 571 | man 6.73 5.24 5.53 0 572 | mangle 3.9 5.44 4.61 0 573 | maniac 3.76 5.39 4.22 0 574 | manner 5.64 4.56 5.05 0 575 | mantel 4.93 3.27 4.95 0 576 | manure 3.1 4.17 4.67 0 577 | market 5.66 4.12 5.27 0 578 | massacre 2.28 5.33 3.5 0 579 | masterful 7.09 5.2 7.18 0 580 | masturbate 5.45 5.67 5.63 0 581 | material 5.26 4.05 5.12 0 582 | measles 2.74 5.06 4.13 0 583 | medicine 5.67 4.4 4.7 0 584 | meek 3.87 3.8 3.67 0 585 | melody 7.07 4.98 5.46 0 586 | memories 7.48 6.1 5.88 0 587 | memory 6.62 5.42 5.11 0 588 | menace 2.88 5.52 4.98 0 589 | merry 7.9 5.9 6.64 0 590 | messy 3.15 3.34 4.75 0 591 | metal 4.95 3.79 5.38 0 592 | method 5.56 3.85 5.67 0 593 | mighty 6.54 5.61 7.23 0 594 | mildew 3.17 4.08 4.4 0 595 | milk 5.95 3.68 5.83 0 596 | millionaire 8.03 6.14 6.97 0 597 | mind 6.68 5 6.37 0 598 | miracle 8.6 7.65 5.35 0 599 | mischief 5.57 5.76 5.56 0 600 | misery 1.93 5.17 2.55 0 601 | mistake 2.86 5.18 3.86 0 602 | mobility 6.83 5 6.43 0 603 | modest 5.76 3.98 4.96 0 604 | mold 3.55 4.07 4.33 0 605 | moment 5.76 3.83 4.81 0 606 | money 7.59 5.7 6.25 0 607 | month 5.15 4.03 4.85 0 608 | moody 3.2 4.18 4.39 0 609 | moral 6.2 4.49 5.9 0 610 | morbid 2.87 5.06 4.34 0 611 | morgue 1.92 4.84 3.61 0 612 | mosquito 2.8 4.78 4.51 0 613 | mother 8.39 6.13 5.74 0 614 | mountain 6.59 5.49 5.46 0 615 | movie 6.86 4.93 5 0 616 | mucus 3.34 3.41 4.8 0 617 | muddy 4.44 4.13 4.73 0 618 | muffin 6.57 4.76 5.51 0 619 | murderer 1.53 7.47 3.77 0 620 | muscular 6.82 5.47 6.58 0 621 | museum 5.54 3.6 5.32 0 622 | mushroom 5.78 4.72 5.52 0 623 | music 8.13 5.32 6.39 0 624 | mutation 3.91 4.84 4.07 0 625 | mutilate 1.82 6.41 3.41 0 626 | mystic 6 4.84 5.52 0 627 | naked 6.34 5.8 6 0 628 | name 5.55 4.25 5.16 0 629 | narcotic 4.29 4.93 4.44 0 630 | nasty 3.58 4.89 5 0 631 | natural 6.59 4.09 5.57 0 632 | nature 7.65 4.37 4.95 0 633 | nectar 6.9 3.89 4.54 0 634 | needle 3.82 5.36 3.95 0 635 | neglect 2.63 4.83 3.85 0 636 | nervous 3.29 6.59 3.56 0 637 | neurotic 4.45 5.13 4.41 0 638 | news 5.3 5.17 4.6 0 639 | nice 6.55 4.38 5.58 0 640 | nightmare 1.91 7.59 3.68 0 641 | nipple 6.27 5.56 5.57 0 642 | noisy 5.02 6.38 4.93 0 643 | nonchalant 4.74 3.12 4.31 0 644 | nonsense 4.61 4.17 4.9 0 645 | noose 3.76 4.39 4.17 0 646 | nourish 6.46 4.29 5.8 0 647 | nude 6.82 6.41 5.96 0 648 | nuisance 3.27 4.49 4.36 0 649 | nun 4.93 2.93 4.93 0 650 | nurse 6.08 4.84 4.84 0 651 | nursery 5.73 4.04 5.18 0 652 | obesity 2.73 3.87 3.74 0 653 | obey 4.52 4.23 4.26 0 654 | obnoxious 3.5 4.74 5.39 0 655 | obscene 4.23 5.04 4.48 0 656 | obsession 4.52 6.41 4.77 0 657 | ocean 7.12 4.95 5.53 0 658 | odd 4.82 4.27 4.77 0 659 | offend 2.76 5.56 3.73 0 660 | office 5.24 4.08 5.59 0 661 | opinion 6.28 4.89 5.53 0 662 | optimism 6.95 5.34 6.61 0 663 | option 6.49 4.74 6.34 0 664 | orchestra 6.02 3.52 5.17 0 665 | orgasm 8.32 8.1 6.83 0 666 | outdoors 7.47 5.92 6.27 0 667 | outrage 3.52 6.83 5.26 0 668 | outstanding 7.75 6.24 6.4 0 669 | overcast 3.65 3.46 4.2 0 670 | overwhelmed 4.19 7 3.89 0 671 | owl 5.8 3.98 5.82 0 672 | 
pain 2.13 6.5 3.71 0 673 | paint 5.62 4.1 5.75 0 674 | palace 7.19 5.1 5.69 0 675 | pamphlet 4.79 3.62 4.63 0 676 | pancakes 6.08 4.06 5.76 0 677 | panic 3.12 7.02 3.2 0 678 | paper 5.2 2.5 4.47 0 679 | paradise 8.72 5.12 6.03 0 680 | paralysis 1.98 4.73 2.56 0 681 | part 5.11 3.82 4.75 0 682 | party 7.86 6.69 5.83 0 683 | passage 5.28 4.36 5.02 0 684 | passion 8.03 7.26 6.13 0 685 | pasta 6.69 4.94 5.8 0 686 | patent 5.29 3.5 4.9 0 687 | patient 5.29 4.21 4.9 0 688 | patriot 6.71 5.17 5.9 0 689 | peace 7.72 2.95 5.45 0 690 | penalty 2.83 5.1 3.95 0 691 | pencil 5.22 3.14 4.78 0 692 | penis 5.9 5.54 5.92 0 693 | penthouse 6.81 5.52 6.52 0 694 | people 7.33 5.94 6.14 0 695 | perfection 7.25 5.95 6.71 0 696 | perfume 6.76 5.05 5.93 0 697 | person 6.32 4.19 5.35 0 698 | pervert 2.79 6.26 4.72 0 699 | pest 3.13 5.62 5.29 0 700 | pet 6.79 5.1 5.85 0 701 | phase 5.17 3.98 4.65 0 702 | pie 6.41 4.2 5.35 0 703 | pig 5.07 4.2 5.34 0 704 | pillow 7.92 2.97 4.56 0 705 | pinch 3.83 4.59 4.76 0 706 | pistol 4.2 6.15 5.05 0 707 | pity 3.37 3.72 4.12 0 708 | pizza 6.65 5.24 5.69 0 709 | plain 4.39 3.52 4.71 0 710 | plane 6.43 6.14 4.78 0 711 | plant 5.98 3.62 4.71 0 712 | pleasure 8.28 5.74 6.15 0 713 | poetry 5.86 4 5.31 0 714 | poison 1.98 6.05 3.1 0 715 | politeness 7.18 3.74 5.74 0 716 | pollute 1.85 6.08 4.92 0 717 | poster 5.34 3.93 4.91 0 718 | poverty 1.67 4.87 3.21 0 719 | power 6.54 6.67 7.28 0 720 | powerful 6.84 5.83 7.19 0 721 | prairie 5.75 3.41 4.62 0 722 | present 6.95 5.12 5.83 0 723 | pressure 3.38 6.07 3.45 0 724 | prestige 7.26 5.86 6.9 0 725 | pretty 7.75 6.03 5.5 0 726 | prick 3.98 4.7 4.47 0 727 | pride 7 5.83 7.06 0 728 | priest 6.42 4.41 4.88 0 729 | prison 2.05 5.7 4.2 0 730 | privacy 5.88 4.12 5.66 0 731 | profit 7.63 6.68 5.85 0 732 | progress 7.73 6.02 6.76 0 733 | promotion 8.2 6.44 6.79 0 734 | protected 7.29 4.09 5.8 0 735 | proud 8.03 5.56 6.74 0 736 | pungent 3.95 4.24 4.78 0 737 | punishment 2.22 5.93 3.5 0 738 | puppy 7.56 5.85 5.51 0 739 | pus 2.86 4.82 4.35 0 740 | putrid 2.38 5.74 4.89 0 741 | python 4.05 6.18 4.52 0 742 | quality 6.25 4.48 5.64 0 743 | quarrel 2.93 6.29 4.02 0 744 | quart 5.39 3.59 5.2 0 745 | queen 6.44 4.76 5.49 0 746 | quick 6.64 6.57 6.57 0 747 | quiet 5.58 2.82 4.42 0 748 | rabbit 6.57 4.02 6.08 0 749 | rabies 1.77 6.1 3.85 0 750 | radiant 6.73 5.39 5.61 0 751 | radiator 4.67 4.02 4.81 0 752 | radio 6.73 4.78 5.28 0 753 | rage 2.41 8.17 5.68 0 754 | rain 5.08 3.65 4.78 0 755 | rainbow 8.14 4.64 4.72 0 756 | rancid 4.34 5.04 4.59 0 757 | rape 1.25 6.81 2.97 0 758 | rat 3.02 4.95 4.55 0 759 | rattle 5.03 4.36 4.17 0 760 | razor 4.81 5.36 4.91 0 761 | red 6.41 5.29 5.78 0 762 | refreshment 7.44 4.45 5 0 763 | regretful 2.28 5.74 3.43 0 764 | rejected 1.5 6.37 2.72 0 765 | relaxed 7 2.39 5.55 0 766 | repentant 5.53 4.69 5.42 0 767 | reptile 4.77 5.18 4.77 0 768 | rescue 7.7 6.53 6.45 0 769 | resent 3.76 4.47 4.46 0 770 | reserved 4.88 3.27 4.3 0 771 | respect 7.64 5.19 6.89 0 772 | respectful 7.22 4.6 5.67 0 773 | restaurant 6.76 5.41 5.73 0 774 | reunion 6.48 6.34 5.64 0 775 | reverent 5.35 4 4.67 0 776 | revolt 4.13 6.56 6.18 0 777 | revolver 4.02 5.55 4.39 0 778 | reward 7.53 4.95 6 0 779 | riches 7.7 6.17 6.74 0 780 | ridicule 3.13 5.83 3.87 0 781 | rifle 4.02 6.35 4.16 0 782 | rigid 3.66 4.66 4.61 0 783 | riot 2.96 6.39 4.18 0 784 | river 6.85 4.51 5.1 0 785 | roach 2.35 6.64 4.82 0 786 | robber 2.61 5.62 3.62 0 787 | rock 5.56 4.52 5.15 0 788 | rollercoaster 8.02 8.06 5.1 0 789 | romantic 8.32 7.59 6.08 0 790 | rotten 2.26 4.53 4.32 0 791 | 
rough 4.74 5.33 4.81 0 792 | rude 2.5 6.31 4.91 0 793 | runner 5.67 4.76 5.47 0 794 | rusty 3.86 3.77 4.53 0 795 | sad 1.61 4.13 3.45 0 796 | safe 7.07 3.86 5.81 0 797 | sailboat 7.25 4.88 5.86 0 798 | saint 6.49 4.49 5.37 0 799 | salad 5.74 3.81 5.47 0 800 | salute 5.92 5.31 5.46 0 801 | sapphire 7 5 5.55 0 802 | satisfied 7.94 4.94 6.14 0 803 | save 6.45 4.95 6 0 804 | savior 7.73 5.8 6.64 0 805 | scalding 2.82 5.95 3.82 0 806 | scandal 3.32 5.12 4.34 0 807 | scapegoat 3.67 4.53 3.52 0 808 | scar 3.38 4.79 3.88 0 809 | scared 2.78 6.82 2.94 0 810 | scholar 7.26 5.12 6.59 0 811 | scissors 5.05 4.47 5.16 0 812 | scorching 3.76 5 4.1 0 813 | scorn 2.84 5.48 3.93 0 814 | scornful 3.02 5.04 4.59 0 815 | scorpion 3.69 5.38 3.98 0 816 | scream 3.88 7.04 4.75 0 817 | scum 2.43 4.88 4.26 0 818 | scurvy 3.19 4.71 4.48 0 819 | seasick 2.05 5.8 3.41 0 820 | seat 4.95 2.95 4.84 0 821 | secure 7.57 3.14 5.93 0 822 | selfish 2.42 5.5 4.64 0 823 | sentiment 5.98 4.41 5.09 0 824 | serious 5.08 4 5.12 0 825 | severe 3.2 5.26 3.83 0 826 | sex 8.05 7.36 5.75 0 827 | sexy 8.02 7.36 6.82 0 828 | shadow 4.35 4.3 4.19 0 829 | shamed 2.5 4.88 2.98 0 830 | shark 3.65 7.16 2.63 0 831 | sheltered 5.75 4.28 3.76 0 832 | ship 5.55 4.38 5.12 0 833 | shotgun 4.37 6.27 5.29 0 834 | shriek 3.93 5.36 4.3 0 835 | shy 4.64 3.77 3.44 0 836 | sick 1.9 4.29 3.04 0 837 | sickness 2.25 5.61 3.84 0 838 | silk 6.9 3.71 4.81 0 839 | silly 7.41 5.88 6 0 840 | sin 2.8 5.78 3.62 0 841 | sinful 2.93 6.29 4.24 0 842 | sissy 3.14 5.17 3.58 0 843 | skeptical 4.52 4.91 4.5 0 844 | skijump 7.06 7.06 4.9 0 845 | skull 4.27 4.75 4.86 0 846 | sky 7.37 4.27 5.16 0 847 | skyscraper 5.88 5.71 4.33 0 848 | slap 2.95 6.46 4.21 0 849 | slaughter 1.64 6.77 3.82 0 850 | slave 1.84 6.21 3.29 0 851 | sleep 7.2 2.8 5.41 0 852 | slime 2.68 5.36 4.17 0 853 | slow 3.93 3.39 4.35 0 854 | slum 2.39 4.78 3.83 0 855 | slush 4.66 3.73 4.91 0 856 | smallpox 2.52 5.58 4.29 0 857 | smooth 6.58 4.91 5.09 0 858 | snake 3.31 6.82 3.78 0 859 | snob 3.36 5.65 5.11 0 860 | snow 7.08 5.75 5.8 0 861 | snuggle 7.92 4.16 5.66 0 862 | social 6.88 4.98 5.91 0 863 | soft 7.12 4.63 6 0 864 | solemn 4.32 3.56 4.61 0 865 | song 7.1 6.07 5.85 0 866 | soothe 7.3 4.4 5.36 0 867 | sour 3.93 5.1 4.64 0 868 | space 6.78 5.14 5.2 0 869 | spanking 3.55 5.41 3.91 0 870 | sphere 5.33 3.88 5 0 871 | spider 3.33 5.71 4.75 0 872 | spirit 7 5.56 5.82 0 873 | spouse 7.58 5.21 5.53 0 874 | spray 5.45 4.14 5.12 0 875 | spring 7.76 5.67 6.26 0 876 | square 4.74 3.18 4.51 0 877 | stagnant 4.15 3.93 4.71 0 878 | star 7.27 5.83 4.68 0 879 | startled 4.5 6.93 4.48 0 880 | starving 2.39 5.61 3.63 0 881 | statue 5.17 3.46 4.95 0 882 | stench 2.19 4.36 4.29 0 883 | stiff 4.68 4.02 4.93 0 884 | stink 3 4.26 4.16 0 885 | stomach 4.82 3.93 4.68 0 886 | stool 4.56 4 4.98 0 887 | storm 4.95 5.71 4.54 0 888 | stove 4.98 4.51 5.36 0 889 | street 5.22 3.39 4.81 0 890 | stress 2.09 7.45 3.93 0 891 | strong 7.11 5.92 6.92 0 892 | stupid 2.31 4.72 2.98 0 893 | subdued 4.67 2.9 4.08 0 894 | success 8.29 6.11 6.89 0 895 | suffocate 1.56 6.03 3.44 0 896 | sugar 6.74 5.64 5.5 0 897 | suicide 1.25 5.73 3.58 0 898 | sun 7.55 5.04 6.16 0 899 | sunlight 7.76 6.1 5.63 0 900 | sunrise 7.86 5.06 5.29 0 901 | sunset 7.68 4.2 5.66 0 902 | surgery 2.86 6.35 2.75 0 903 | surprised 7.47 7.47 6.11 0 904 | suspicious 3.76 6.25 4.47 0 905 | swamp 5.14 4.86 5.29 0 906 | sweetheart 8.42 5.5 6.03 0 907 | swift 6.46 5.39 6.29 0 908 | swimmer 6.54 4.82 5.96 0 909 | syphilis 1.68 5.69 3.33 0 910 | table 5.22 2.92 4.47 0 911 | talent 7.56 
6.27 6.49 0 912 | tamper 4.1 4.95 4.58 0 913 | tank 5.16 4.88 4.78 0 914 | taste 6.66 5.22 5.5 0 915 | taxi 5 3.41 4.64 0 916 | teacher 5.68 4.05 5.11 0 917 | tease 4.84 5.87 4.67 0 918 | tender 6.93 4.88 5.33 0 919 | tennis 6.02 4.61 5.61 0 920 | tense 3.56 6.53 5.22 0 921 | termite 3.58 5.39 3.87 0 922 | terrible 1.93 6.27 3.58 0 923 | terrific 8.16 6.23 6.6 0 924 | terrified 1.72 7.86 3.08 0 925 | terrorist 1.69 7.27 2.65 0 926 | thankful 6.89 4.34 5.32 0 927 | theory 5.3 4.62 4.88 0 928 | thermometer 4.73 3.79 4.39 0 929 | thief 2.13 6.89 3.79 0 930 | thorn 3.64 5.14 4.45 0 931 | thought 6.39 4.83 6.02 0 932 | thoughtful 7.65 5.72 5.61 0 933 | thrill 8.05 8.02 6.54 0 934 | tidy 6.3 3.98 5.49 0 935 | time 5.31 4.64 4.63 0 936 | timid 3.86 4.11 3.09 0 937 | tobacco 3.28 4.83 4.08 0 938 | tomb 2.94 4.73 3.72 0 939 | tool 5.19 4.33 5.67 0 940 | toothache 1.98 5.55 3.9 0 941 | tornado 2.55 6.83 4.3 0 942 | torture 1.56 6.1 3.33 0 943 | tower 5.46 3.95 5.78 0 944 | toxic 2.1 6.4 4.42 0 945 | toy 7 5.11 6.09 0 946 | tragedy 1.78 6.24 3.5 0 947 | traitor 2.22 5.78 4.61 0 948 | trash 2.67 4.16 5.24 0 949 | trauma 2.1 6.33 2.84 0 950 | travel 7.1 6.21 6.31 0 951 | treasure 8.27 6.75 6.36 0 952 | treat 7.36 5.62 5.78 0 953 | tree 6.32 3.42 5.08 0 954 | triumph 7.8 5.78 6.98 0 955 | triumphant 8.82 6.78 6.95 0 956 | trophy 7.78 5.39 6.44 0 957 | trouble 3.03 6.85 4.85 0 958 | troubled 2.17 5.94 3.91 0 959 | truck 5.47 4.84 5.33 0 960 | trumpet 5.75 4.97 4.57 0 961 | trunk 5.09 4.18 5.14 0 962 | trust 6.68 5.3 6.61 0 963 | truth 7.8 5 6.47 0 964 | tumor 2.36 6.51 3.58 0 965 | tune 6.93 4.71 5.74 0 966 | twilight 7.23 4.7 5.59 0 967 | ugly 2.43 5.38 4.26 0 968 | ulcer 1.78 6.12 4.17 0 969 | umbrella 5.16 3.68 5.42 0 970 | unfaithful 2.05 6.2 3.02 0 971 | unhappy 1.57 4.18 3.34 0 972 | unit 5.59 3.75 5.11 0 973 | untroubled 7.62 3.89 5.53 0 974 | upset 2 5.86 4.08 0 975 | urine 3.25 4.2 5.24 0 976 | useful 7.14 4.26 5.93 0 977 | useless 2.13 4.87 3.92 0 978 | utensil 5.14 3.57 5.4 0 979 | vacation 8.16 5.64 6.8 0 980 | vagina 6.14 5.55 5.88 0 981 | valentine 8.11 6.06 5.81 0 982 | vampire 4.26 6.37 5.05 0 983 | vandal 2.71 6.4 3.91 0 984 | vanity 4.3 4.98 4.8 0 985 | vehicle 6.27 4.63 5.77 0 986 | venom 2.68 6.08 3.94 0 987 | vest 5.25 3.95 5.09 0 988 | victim 2.18 6.06 2.69 0 989 | victory 8.32 6.63 7.26 0 990 | vigorous 6.79 5.9 5.41 0 991 | village 5.92 4.08 4.94 0 992 | violent 2.29 6.89 5.16 0 993 | violin 5.43 3.49 5.18 0 994 | virgin 6.45 5.51 6.24 0 995 | virtue 6.22 4.52 6.13 0 996 | vision 6.62 4.66 6.02 0 997 | volcano 4.84 6.33 3.25 0 998 | vomit 2.06 5.75 3.58 0 999 | voyage 6.25 5.55 5.18 0 1000 | wagon 5.37 3.98 5.05 0 1001 | war 2.08 7.49 4.5 0 1002 | warmth 7.41 3.73 5.61 0 1003 | wasp 3.37 5.5 3.76 0 1004 | waste 2.93 4.14 4.72 0 1005 | watch 5.78 4.1 5.37 0 1006 | water 6.61 4.97 5.08 0 1007 | waterfall 7.88 5.37 5.2 0 1008 | wealthy 7.7 5.8 6.77 0 1009 | weapon 3.97 6.03 5.19 0 1010 | weary 3.79 3.81 4 0 1011 | wedding 7.82 5.97 6.68 0 1012 | whistle 5.81 4.69 5.27 0 1013 | white 6.47 4.37 5.98 0 1014 | whore 2.3 5.85 4.61 0 1015 | wicked 2.96 6.09 4.36 0 1016 | wife 6.33 4.93 5.57 0 1017 | win 8.38 7.72 7.39 0 1018 | windmill 5.6 3.74 5.24 0 1019 | window 5.91 3.97 4.91 0 1020 | wine 5.95 4.78 5.31 0 1021 | wink 6.93 5.44 5.7 0 1022 | wise 7.52 3.91 6.7 0 1023 | wish 7.09 5.16 5.28 0 1024 | wit 7.32 5.42 6.38 0 1025 | woman 6.64 5.32 6.33 0 1026 | wonder 6.03 5 5.32 0 1027 | world 6.5 5.32 5.26 0 1028 | wounds 2.51 5.82 3.92 0 1029 | writer 5.52 4.33 4.73 0 1030 | yacht 6.95 5.61 
6.1 0 1031 | yellow 5.61 4.43 5.47 0 1032 | young 6.89 5.64 5.3 0 1033 | youth 6.75 5.67 5.11 0 1034 | zest 6.79 5.59 6 0 -------------------------------------------------------------------------------- /data_files/positive_words.txt: -------------------------------------------------------------------------------- 1 | a+ 2 | abound 3 | abounds 4 | abundance 5 | abundant 6 | accessable 7 | accessible 8 | acclaim 9 | acclaimed 10 | acclamation 11 | accolade 12 | accolades 13 | accommodative 14 | accomodative 15 | accomplish 16 | accomplished 17 | accomplishment 18 | accomplishments 19 | accurate 20 | accurately 21 | achievable 22 | achievement 23 | achievements 24 | achievible 25 | acumen 26 | adaptable 27 | adaptive 28 | adequate 29 | adjustable 30 | admirable 31 | admirably 32 | admiration 33 | admire 34 | admirer 35 | admiring 36 | admiringly 37 | adorable 38 | adore 39 | adored 40 | adorer 41 | adoring 42 | adoringly 43 | adroit 44 | adroitly 45 | adulate 46 | adulation 47 | adulatory 48 | advanced 49 | advantage 50 | advantageous 51 | advantageously 52 | advantages 53 | adventuresome 54 | adventurous 55 | advocate 56 | advocated 57 | advocates 58 | affability 59 | affable 60 | affably 61 | affectation 62 | affection 63 | affectionate 64 | affinity 65 | affirm 66 | affirmation 67 | affirmative 68 | affluence 69 | affluent 70 | afford 71 | affordable 72 | affordably 73 | afordable 74 | agile 75 | agilely 76 | agility 77 | agreeable 78 | agreeableness 79 | agreeably 80 | all-around 81 | alluring 82 | alluringly 83 | altruistic 84 | altruistically 85 | amaze 86 | amazed 87 | amazement 88 | amazes 89 | amazing 90 | amazingly 91 | ambitious 92 | ambitiously 93 | ameliorate 94 | amenable 95 | amenity 96 | amiability 97 | amiabily 98 | amiable 99 | amicability 100 | amicable 101 | amicably 102 | amity 103 | ample 104 | amply 105 | amuse 106 | amusing 107 | amusingly 108 | angel 109 | angelic 110 | apotheosis 111 | appeal 112 | appealing 113 | applaud 114 | appreciable 115 | appreciate 116 | appreciated 117 | appreciates 118 | appreciative 119 | appreciatively 120 | appropriate 121 | approval 122 | approve 123 | ardent 124 | ardently 125 | ardor 126 | articulate 127 | aspiration 128 | aspirations 129 | aspire 130 | assurance 131 | assurances 132 | assure 133 | assuredly 134 | assuring 135 | astonish 136 | astonished 137 | astonishing 138 | astonishingly 139 | astonishment 140 | astound 141 | astounded 142 | astounding 143 | astoundingly 144 | astutely 145 | attentive 146 | attraction 147 | attractive 148 | attractively 149 | attune 150 | audible 151 | audibly 152 | auspicious 153 | authentic 154 | authoritative 155 | autonomous 156 | available 157 | aver 158 | avid 159 | avidly 160 | award 161 | awarded 162 | awards 163 | awe 164 | awed 165 | awesome 166 | awesomely 167 | awesomeness 168 | awestruck 169 | awsome 170 | backbone 171 | balanced 172 | bargain 173 | beauteous 174 | beautiful 175 | beautifullly 176 | beautifully 177 | beautify 178 | beauty 179 | beckon 180 | beckoned 181 | beckoning 182 | beckons 183 | believable 184 | believeable 185 | beloved 186 | benefactor 187 | beneficent 188 | beneficial 189 | beneficially 190 | beneficiary 191 | benefit 192 | benefits 193 | benevolence 194 | benevolent 195 | benifits 196 | best 197 | best-known 198 | best-performing 199 | best-selling 200 | better 201 | better-known 202 | better-than-expected 203 | beutifully 204 | blameless 205 | bless 206 | blessing 207 | bliss 208 | blissful 209 | blissfully 210 | blithe 211 | blockbuster 212 | bloom 213 | 
blossom 214 | bolster 215 | bonny 216 | bonus 217 | bonuses 218 | boom 219 | booming 220 | boost 221 | boundless 222 | bountiful 223 | brainiest 224 | brainy 225 | brand-new 226 | brave 227 | bravery 228 | bravo 229 | breakthrough 230 | breakthroughs 231 | breathlessness 232 | breathtaking 233 | breathtakingly 234 | breeze 235 | bright 236 | brighten 237 | brighter 238 | brightest 239 | brilliance 240 | brilliances 241 | brilliant 242 | brilliantly 243 | brisk 244 | brotherly 245 | bullish 246 | buoyant 247 | cajole 248 | calm 249 | calming 250 | calmness 251 | capability 252 | capable 253 | capably 254 | captivate 255 | captivating 256 | carefree 257 | cashback 258 | cashbacks 259 | catchy 260 | celebrate 261 | celebrated 262 | celebration 263 | celebratory 264 | champ 265 | champion 266 | charisma 267 | charismatic 268 | charitable 269 | charm 270 | charming 271 | charmingly 272 | chaste 273 | cheaper 274 | cheapest 275 | cheer 276 | cheerful 277 | cheery 278 | cherish 279 | cherished 280 | cherub 281 | chic 282 | chivalrous 283 | chivalry 284 | civility 285 | civilize 286 | clarity 287 | classic 288 | classy 289 | clean 290 | cleaner 291 | cleanest 292 | cleanliness 293 | cleanly 294 | clear 295 | clear-cut 296 | cleared 297 | clearer 298 | clearly 299 | clears 300 | clever 301 | cleverly 302 | cohere 303 | coherence 304 | coherent 305 | cohesive 306 | colorful 307 | comely 308 | comfort 309 | comfortable 310 | comfortably 311 | comforting 312 | comfy 313 | commend 314 | commendable 315 | commendably 316 | commitment 317 | commodious 318 | compact 319 | compactly 320 | compassion 321 | compassionate 322 | compatible 323 | competitive 324 | complement 325 | complementary 326 | complemented 327 | complements 328 | compliant 329 | compliment 330 | complimentary 331 | comprehensive 332 | conciliate 333 | conciliatory 334 | concise 335 | confidence 336 | confident 337 | congenial 338 | congratulate 339 | congratulation 340 | congratulations 341 | congratulatory 342 | conscientious 343 | considerate 344 | consistent 345 | consistently 346 | constructive 347 | consummate 348 | contentment 349 | continuity 350 | contrasty 351 | contribution 352 | convenience 353 | convenient 354 | conveniently 355 | convience 356 | convienient 357 | convient 358 | convincing 359 | convincingly 360 | cool 361 | coolest 362 | cooperative 363 | cooperatively 364 | cornerstone 365 | correct 366 | correctly 367 | cost-effective 368 | cost-saving 369 | counter-attack 370 | counter-attacks 371 | courage 372 | courageous 373 | courageously 374 | courageousness 375 | courteous 376 | courtly 377 | covenant 378 | cozy 379 | creative 380 | credence 381 | credible 382 | crisp 383 | crisper 384 | cure 385 | cure-all 386 | cushy 387 | cute 388 | cuteness 389 | danke 390 | danken 391 | daring 392 | daringly 393 | darling 394 | dashing 395 | dauntless 396 | dawn 397 | dazzle 398 | dazzled 399 | dazzling 400 | dead-cheap 401 | dead-on 402 | decency 403 | decent 404 | decisive 405 | decisiveness 406 | dedicated 407 | defeat 408 | defeated 409 | defeating 410 | defeats 411 | defender 412 | deference 413 | deft 414 | deginified 415 | delectable 416 | delicacy 417 | delicate 418 | delicious 419 | delight 420 | delighted 421 | delightful 422 | delightfully 423 | delightfulness 424 | dependable 425 | dependably 426 | deservedly 427 | deserving 428 | desirable 429 | desiring 430 | desirous 431 | destiny 432 | detachable 433 | devout 434 | dexterous 435 | dexterously 436 | dextrous 437 | dignified 438 | dignify 439 | dignity 440 | 
diligence 441 | diligent 442 | diligently 443 | diplomatic 444 | dirt-cheap 445 | distinction 446 | distinctive 447 | distinguished 448 | diversified 449 | divine 450 | divinely 451 | dominate 452 | dominated 453 | dominates 454 | dote 455 | dotingly 456 | doubtless 457 | dreamland 458 | dumbfounded 459 | dumbfounding 460 | dummy-proof 461 | durable 462 | dynamic 463 | eager 464 | eagerly 465 | eagerness 466 | earnest 467 | earnestly 468 | earnestness 469 | ease 470 | eased 471 | eases 472 | easier 473 | easiest 474 | easiness 475 | easing 476 | easy 477 | easy-to-use 478 | easygoing 479 | ebullience 480 | ebullient 481 | ebulliently 482 | ecenomical 483 | economical 484 | ecstasies 485 | ecstasy 486 | ecstatic 487 | ecstatically 488 | edify 489 | educated 490 | effective 491 | effectively 492 | effectiveness 493 | effectual 494 | efficacious 495 | efficient 496 | efficiently 497 | effortless 498 | effortlessly 499 | effusion 500 | effusive 501 | effusively 502 | effusiveness 503 | elan 504 | elate 505 | elated 506 | elatedly 507 | elation 508 | electrify 509 | elegance 510 | elegant 511 | elegantly 512 | elevate 513 | elite 514 | eloquence 515 | eloquent 516 | eloquently 517 | embolden 518 | eminence 519 | eminent 520 | empathize 521 | empathy 522 | empower 523 | empowerment 524 | enchant 525 | enchanted 526 | enchanting 527 | enchantingly 528 | encourage 529 | encouragement 530 | encouraging 531 | encouragingly 532 | endear 533 | endearing 534 | endorse 535 | endorsed 536 | endorsement 537 | endorses 538 | endorsing 539 | energetic 540 | energize 541 | energy-efficient 542 | energy-saving 543 | engaging 544 | engrossing 545 | enhance 546 | enhanced 547 | enhancement 548 | enhances 549 | enjoy 550 | enjoyable 551 | enjoyably 552 | enjoyed 553 | enjoying 554 | enjoyment 555 | enjoys 556 | enlighten 557 | enlightenment 558 | enliven 559 | ennoble 560 | enough 561 | enrapt 562 | enrapture 563 | enraptured 564 | enrich 565 | enrichment 566 | enterprising 567 | entertain 568 | entertaining 569 | entertains 570 | enthral 571 | enthrall 572 | enthralled 573 | enthuse 574 | enthusiasm 575 | enthusiast 576 | enthusiastic 577 | enthusiastically 578 | entice 579 | enticed 580 | enticing 581 | enticingly 582 | entranced 583 | entrancing 584 | entrust 585 | enviable 586 | enviably 587 | envious 588 | enviously 589 | enviousness 590 | envy 591 | equitable 592 | ergonomical 593 | err-free 594 | erudite 595 | ethical 596 | eulogize 597 | euphoria 598 | euphoric 599 | euphorically 600 | evaluative 601 | evenly 602 | eventful 603 | everlasting 604 | evocative 605 | exalt 606 | exaltation 607 | exalted 608 | exaltedly 609 | exalting 610 | exaltingly 611 | examplar 612 | examplary 613 | excallent 614 | exceed 615 | exceeded 616 | exceeding 617 | exceedingly 618 | exceeds 619 | excel 620 | exceled 621 | excelent 622 | excellant 623 | excelled 624 | excellence 625 | excellency 626 | excellent 627 | excellently 628 | excels 629 | exceptional 630 | exceptionally 631 | excite 632 | excited 633 | excitedly 634 | excitedness 635 | excitement 636 | excites 637 | exciting 638 | excitingly 639 | exellent 640 | exemplar 641 | exemplary 642 | exhilarate 643 | exhilarating 644 | exhilaratingly 645 | exhilaration 646 | exonerate 647 | expansive 648 | expeditiously 649 | expertly 650 | exquisite 651 | exquisitely 652 | extol 653 | extoll 654 | extraordinarily 655 | extraordinary 656 | exuberance 657 | exuberant 658 | exuberantly 659 | exult 660 | exultant 661 | exultation 662 | exultingly 663 | eye-catch 664 | eye-catching 
665 | eyecatch 666 | eyecatching 667 | fabulous 668 | fabulously 669 | facilitate 670 | fair 671 | fairly 672 | fairness 673 | faith 674 | faithful 675 | faithfully 676 | faithfulness 677 | fame 678 | famed 679 | famous 680 | famously 681 | fancier 682 | fancinating 683 | fancy 684 | fanfare 685 | fans 686 | fantastic 687 | fantastically 688 | fascinate 689 | fascinating 690 | fascinatingly 691 | fascination 692 | fashionable 693 | fashionably 694 | fast 695 | fast-growing 696 | fast-paced 697 | faster 698 | fastest 699 | fastest-growing 700 | faultless 701 | fav 702 | fave 703 | favor 704 | favorable 705 | favored 706 | favorite 707 | favorited 708 | favour 709 | fearless 710 | fearlessly 711 | feasible 712 | feasibly 713 | feat 714 | feature-rich 715 | fecilitous 716 | feisty 717 | felicitate 718 | felicitous 719 | felicity 720 | fertile 721 | fervent 722 | fervently 723 | fervid 724 | fervidly 725 | fervor 726 | festive 727 | fidelity 728 | fiery 729 | fine 730 | fine-looking 731 | finely 732 | finer 733 | finest 734 | firmer 735 | first-class 736 | first-in-class 737 | first-rate 738 | flashy 739 | flatter 740 | flattering 741 | flatteringly 742 | flawless 743 | flawlessly 744 | flexibility 745 | flexible 746 | flourish 747 | flourishing 748 | fluent 749 | flutter 750 | fond 751 | fondly 752 | fondness 753 | foolproof 754 | foremost 755 | foresight 756 | formidable 757 | fortitude 758 | fortuitous 759 | fortuitously 760 | fortunate 761 | fortunately 762 | fortune 763 | fragrant 764 | free 765 | freed 766 | freedom 767 | freedoms 768 | fresh 769 | fresher 770 | freshest 771 | friendliness 772 | friendly 773 | frolic 774 | frugal 775 | fruitful 776 | ftw 777 | fulfillment 778 | fun 779 | futurestic 780 | futuristic 781 | gaiety 782 | gaily 783 | gain 784 | gained 785 | gainful 786 | gainfully 787 | gaining 788 | gains 789 | gallant 790 | gallantly 791 | galore 792 | geekier 793 | geeky 794 | gem 795 | gems 796 | generosity 797 | generous 798 | generously 799 | genial 800 | genius 801 | gentle 802 | gentlest 803 | genuine 804 | gifted 805 | glad 806 | gladden 807 | gladly 808 | gladness 809 | glamorous 810 | glee 811 | gleeful 812 | gleefully 813 | glimmer 814 | glimmering 815 | glisten 816 | glistening 817 | glitter 818 | glitz 819 | glorify 820 | glorious 821 | gloriously 822 | glory 823 | glow 824 | glowing 825 | glowingly 826 | god-given 827 | god-send 828 | godlike 829 | godsend 830 | gold 831 | golden 832 | good 833 | goodly 834 | goodness 835 | goodwill 836 | goood 837 | gooood 838 | gorgeous 839 | gorgeously 840 | grace 841 | graceful 842 | gracefully 843 | gracious 844 | graciously 845 | graciousness 846 | grand 847 | grandeur 848 | grateful 849 | gratefully 850 | gratification 851 | gratified 852 | gratifies 853 | gratify 854 | gratifying 855 | gratifyingly 856 | gratitude 857 | great 858 | greatest 859 | greatness 860 | grin 861 | groundbreaking 862 | guarantee 863 | guidance 864 | guiltless 865 | gumption 866 | gush 867 | gusto 868 | gutsy 869 | hail 870 | halcyon 871 | hale 872 | hallmark 873 | hallmarks 874 | hallowed 875 | handier 876 | handily 877 | hands-down 878 | handsome 879 | handsomely 880 | handy 881 | happier 882 | happily 883 | happiness 884 | happy 885 | hard-working 886 | hardier 887 | hardy 888 | harmless 889 | harmonious 890 | harmoniously 891 | harmonize 892 | harmony 893 | headway 894 | heal 895 | healthful 896 | healthy 897 | hearten 898 | heartening 899 | heartfelt 900 | heartily 901 | heartwarming 902 | heaven 903 | heavenly 904 | helped 905 | helpful 906 
| helping 907 | hero 908 | heroic 909 | heroically 910 | heroine 911 | heroize 912 | heros 913 | high-quality 914 | high-spirited 915 | hilarious 916 | holy 917 | homage 918 | honest 919 | honesty 920 | honor 921 | honorable 922 | honored 923 | honoring 924 | hooray 925 | hopeful 926 | hospitable 927 | hot 928 | hotcake 929 | hotcakes 930 | hottest 931 | hug 932 | humane 933 | humble 934 | humility 935 | humor 936 | humorous 937 | humorously 938 | humour 939 | humourous 940 | ideal 941 | idealize 942 | ideally 943 | idol 944 | idolize 945 | idolized 946 | idyllic 947 | illuminate 948 | illuminati 949 | illuminating 950 | illumine 951 | illustrious 952 | ilu 953 | imaculate 954 | imaginative 955 | immaculate 956 | immaculately 957 | immense 958 | impartial 959 | impartiality 960 | impartially 961 | impassioned 962 | impeccable 963 | impeccably 964 | important 965 | impress 966 | impressed 967 | impresses 968 | impressive 969 | impressively 970 | impressiveness 971 | improve 972 | improved 973 | improvement 974 | improvements 975 | improves 976 | improving 977 | incredible 978 | incredibly 979 | indebted 980 | individualized 981 | indulgence 982 | indulgent 983 | industrious 984 | inestimable 985 | inestimably 986 | inexpensive 987 | infallibility 988 | infallible 989 | infallibly 990 | influential 991 | ingenious 992 | ingeniously 993 | ingenuity 994 | ingenuous 995 | ingenuously 996 | innocuous 997 | innovation 998 | innovative 999 | inpressed 1000 | insightful 1001 | insightfully 1002 | inspiration 1003 | inspirational 1004 | inspire 1005 | inspiring 1006 | instantly 1007 | instructive 1008 | instrumental 1009 | integral 1010 | integrated 1011 | intelligence 1012 | intelligent 1013 | intelligible 1014 | interesting 1015 | interests 1016 | intimacy 1017 | intimate 1018 | intricate 1019 | intrigue 1020 | intriguing 1021 | intriguingly 1022 | intuitive 1023 | invaluable 1024 | invaluablely 1025 | inventive 1026 | invigorate 1027 | invigorating 1028 | invincibility 1029 | invincible 1030 | inviolable 1031 | inviolate 1032 | invulnerable 1033 | irreplaceable 1034 | irreproachable 1035 | irresistible 1036 | irresistibly 1037 | issue-free 1038 | jaw-droping 1039 | jaw-dropping 1040 | jollify 1041 | jolly 1042 | jovial 1043 | joy 1044 | joyful 1045 | joyfully 1046 | joyous 1047 | joyously 1048 | jubilant 1049 | jubilantly 1050 | jubilate 1051 | jubilation 1052 | jubiliant 1053 | judicious 1054 | justly 1055 | keen 1056 | keenly 1057 | keenness 1058 | kid-friendly 1059 | kindliness 1060 | kindly 1061 | kindness 1062 | knowledgeable 1063 | kudos 1064 | large-capacity 1065 | laud 1066 | laudable 1067 | laudably 1068 | lavish 1069 | lavishly 1070 | law-abiding 1071 | lawful 1072 | lawfully 1073 | lead 1074 | leading 1075 | leads 1076 | lean 1077 | led 1078 | legendary 1079 | leverage 1080 | levity 1081 | liberate 1082 | liberation 1083 | liberty 1084 | lifesaver 1085 | light-hearted 1086 | lighter 1087 | likable 1088 | like 1089 | liked 1090 | likes 1091 | liking 1092 | lionhearted 1093 | lively 1094 | logical 1095 | long-lasting 1096 | lovable 1097 | lovably 1098 | love 1099 | loved 1100 | loveliness 1101 | lovely 1102 | lover 1103 | loves 1104 | loving 1105 | low-cost 1106 | low-price 1107 | low-priced 1108 | low-risk 1109 | lower-priced 1110 | loyal 1111 | loyalty 1112 | lucid 1113 | lucidly 1114 | luck 1115 | luckier 1116 | luckiest 1117 | luckiness 1118 | lucky 1119 | lucrative 1120 | luminous 1121 | lush 1122 | luster 1123 | lustrous 1124 | luxuriant 1125 | luxuriate 1126 | luxurious 1127 | 
luxuriously 1128 | luxury 1129 | lyrical 1130 | magic 1131 | magical 1132 | magnanimous 1133 | magnanimously 1134 | magnificence 1135 | magnificent 1136 | magnificently 1137 | majestic 1138 | majesty 1139 | manageable 1140 | maneuverable 1141 | marvel 1142 | marveled 1143 | marvelled 1144 | marvellous 1145 | marvelous 1146 | marvelously 1147 | marvelousness 1148 | marvels 1149 | master 1150 | masterful 1151 | masterfully 1152 | masterpiece 1153 | masterpieces 1154 | masters 1155 | mastery 1156 | matchless 1157 | mature 1158 | maturely 1159 | maturity 1160 | meaningful 1161 | memorable 1162 | merciful 1163 | mercifully 1164 | mercy 1165 | merit 1166 | meritorious 1167 | merrily 1168 | merriment 1169 | merriness 1170 | merry 1171 | mesmerize 1172 | mesmerized 1173 | mesmerizes 1174 | mesmerizing 1175 | mesmerizingly 1176 | meticulous 1177 | meticulously 1178 | mightily 1179 | mighty 1180 | mind-blowing 1181 | miracle 1182 | miracles 1183 | miraculous 1184 | miraculously 1185 | miraculousness 1186 | modern 1187 | modest 1188 | modesty 1189 | momentous 1190 | monumental 1191 | monumentally 1192 | morality 1193 | motivated 1194 | multi-purpose 1195 | navigable 1196 | neat 1197 | neatest 1198 | neatly 1199 | nice 1200 | nicely 1201 | nicer 1202 | nicest 1203 | nifty 1204 | nimble 1205 | noble 1206 | nobly 1207 | noiseless 1208 | non-violence 1209 | non-violent 1210 | notably 1211 | noteworthy 1212 | nourish 1213 | nourishing 1214 | nourishment 1215 | novelty 1216 | nurturing 1217 | oasis 1218 | obsession 1219 | obsessions 1220 | obtainable 1221 | openly 1222 | openness 1223 | optimal 1224 | optimism 1225 | optimistic 1226 | opulent 1227 | orderly 1228 | originality 1229 | outdo 1230 | outdone 1231 | outperform 1232 | outperformed 1233 | outperforming 1234 | outperforms 1235 | outshine 1236 | outshone 1237 | outsmart 1238 | outstanding 1239 | outstandingly 1240 | outstrip 1241 | outwit 1242 | ovation 1243 | overjoyed 1244 | overtake 1245 | overtaken 1246 | overtakes 1247 | overtaking 1248 | overtook 1249 | overture 1250 | pain-free 1251 | painless 1252 | painlessly 1253 | palatial 1254 | pamper 1255 | pampered 1256 | pamperedly 1257 | pamperedness 1258 | pampers 1259 | panoramic 1260 | paradise 1261 | paramount 1262 | pardon 1263 | passion 1264 | passionate 1265 | passionately 1266 | patience 1267 | patient 1268 | patiently 1269 | patriot 1270 | patriotic 1271 | peace 1272 | peaceable 1273 | peaceful 1274 | peacefully 1275 | peacekeepers 1276 | peach 1277 | peerless 1278 | pep 1279 | pepped 1280 | pepping 1281 | peppy 1282 | peps 1283 | perfect 1284 | perfection 1285 | perfectly 1286 | permissible 1287 | perseverance 1288 | persevere 1289 | personages 1290 | personalized 1291 | phenomenal 1292 | phenomenally 1293 | picturesque 1294 | piety 1295 | pinnacle 1296 | playful 1297 | playfully 1298 | pleasant 1299 | pleasantly 1300 | pleased 1301 | pleases 1302 | pleasing 1303 | pleasingly 1304 | pleasurable 1305 | pleasurably 1306 | pleasure 1307 | plentiful 1308 | pluses 1309 | plush 1310 | plusses 1311 | poetic 1312 | poeticize 1313 | poignant 1314 | poise 1315 | poised 1316 | polished 1317 | polite 1318 | politeness 1319 | popular 1320 | portable 1321 | posh 1322 | positive 1323 | positively 1324 | positives 1325 | powerful 1326 | powerfully 1327 | praise 1328 | praiseworthy 1329 | praising 1330 | pre-eminent 1331 | precious 1332 | precise 1333 | precisely 1334 | preeminent 1335 | prefer 1336 | preferable 1337 | preferably 1338 | prefered 1339 | preferes 1340 | preferring 1341 | prefers 1342 | 
premier 1343 | prestige 1344 | prestigious 1345 | prettily 1346 | pretty 1347 | priceless 1348 | pride 1349 | principled 1350 | privilege 1351 | privileged 1352 | prize 1353 | proactive 1354 | problem-free 1355 | problem-solver 1356 | prodigious 1357 | prodigiously 1358 | prodigy 1359 | productive 1360 | productively 1361 | proficient 1362 | proficiently 1363 | profound 1364 | profoundly 1365 | profuse 1366 | profusion 1367 | progress 1368 | progressive 1369 | prolific 1370 | prominence 1371 | prominent 1372 | promise 1373 | promised 1374 | promises 1375 | promising 1376 | promoter 1377 | prompt 1378 | promptly 1379 | proper 1380 | properly 1381 | propitious 1382 | propitiously 1383 | pros 1384 | prosper 1385 | prosperity 1386 | prosperous 1387 | prospros 1388 | protect 1389 | protection 1390 | protective 1391 | proud 1392 | proven 1393 | proves 1394 | providence 1395 | proving 1396 | prowess 1397 | prudence 1398 | prudent 1399 | prudently 1400 | punctual 1401 | pure 1402 | purify 1403 | purposeful 1404 | quaint 1405 | qualified 1406 | qualify 1407 | quicker 1408 | quiet 1409 | quieter 1410 | radiance 1411 | radiant 1412 | rapid 1413 | rapport 1414 | rapt 1415 | rapture 1416 | raptureous 1417 | raptureously 1418 | rapturous 1419 | rapturously 1420 | rational 1421 | razor-sharp 1422 | reachable 1423 | readable 1424 | readily 1425 | ready 1426 | reaffirm 1427 | reaffirmation 1428 | realistic 1429 | realizable 1430 | reasonable 1431 | reasonably 1432 | reasoned 1433 | reassurance 1434 | reassure 1435 | receptive 1436 | reclaim 1437 | recomend 1438 | recommend 1439 | recommendation 1440 | recommendations 1441 | recommended 1442 | reconcile 1443 | reconciliation 1444 | record-setting 1445 | recover 1446 | recovery 1447 | rectification 1448 | rectify 1449 | rectifying 1450 | redeem 1451 | redeeming 1452 | redemption 1453 | refine 1454 | refined 1455 | refinement 1456 | reform 1457 | reformed 1458 | reforming 1459 | reforms 1460 | refresh 1461 | refreshed 1462 | refreshing 1463 | refund 1464 | refunded 1465 | regal 1466 | regally 1467 | regard 1468 | rejoice 1469 | rejoicing 1470 | rejoicingly 1471 | rejuvenate 1472 | rejuvenated 1473 | rejuvenating 1474 | relaxed 1475 | relent 1476 | reliable 1477 | reliably 1478 | relief 1479 | relish 1480 | remarkable 1481 | remarkably 1482 | remedy 1483 | remission 1484 | remunerate 1485 | renaissance 1486 | renewed 1487 | renown 1488 | renowned 1489 | replaceable 1490 | reputable 1491 | reputation 1492 | resilient 1493 | resolute 1494 | resound 1495 | resounding 1496 | resourceful 1497 | resourcefulness 1498 | respect 1499 | respectable 1500 | respectful 1501 | respectfully 1502 | respite 1503 | resplendent 1504 | responsibly 1505 | responsive 1506 | restful 1507 | restored 1508 | restructure 1509 | restructured 1510 | restructuring 1511 | retractable 1512 | revel 1513 | revelation 1514 | revere 1515 | reverence 1516 | reverent 1517 | reverently 1518 | revitalize 1519 | revival 1520 | revive 1521 | revives 1522 | revolutionary 1523 | revolutionize 1524 | revolutionized 1525 | revolutionizes 1526 | reward 1527 | rewarding 1528 | rewardingly 1529 | rich 1530 | richer 1531 | richly 1532 | richness 1533 | right 1534 | righten 1535 | righteous 1536 | righteously 1537 | righteousness 1538 | rightful 1539 | rightfully 1540 | rightly 1541 | rightness 1542 | risk-free 1543 | robust 1544 | rock-star 1545 | rock-stars 1546 | rockstar 1547 | rockstars 1548 | romantic 1549 | romantically 1550 | romanticize 1551 | roomier 1552 | roomy 1553 | rosy 1554 | safe 1555 | safely 
1556 | sagacity 1557 | sagely 1558 | saint 1559 | saintliness 1560 | saintly 1561 | salutary 1562 | salute 1563 | sane 1564 | satisfactorily 1565 | satisfactory 1566 | satisfied 1567 | satisfies 1568 | satisfy 1569 | satisfying 1570 | satisified 1571 | saver 1572 | savings 1573 | savior 1574 | savvy 1575 | scenic 1576 | seamless 1577 | seasoned 1578 | secure 1579 | securely 1580 | selective 1581 | self-determination 1582 | self-respect 1583 | self-satisfaction 1584 | self-sufficiency 1585 | self-sufficient 1586 | sensation 1587 | sensational 1588 | sensationally 1589 | sensations 1590 | sensible 1591 | sensibly 1592 | sensitive 1593 | serene 1594 | serenity 1595 | sexy 1596 | sharp 1597 | sharper 1598 | sharpest 1599 | shimmering 1600 | shimmeringly 1601 | shine 1602 | shiny 1603 | significant 1604 | silent 1605 | simpler 1606 | simplest 1607 | simplified 1608 | simplifies 1609 | simplify 1610 | simplifying 1611 | sincere 1612 | sincerely 1613 | sincerity 1614 | skill 1615 | skilled 1616 | skillful 1617 | skillfully 1618 | slammin 1619 | sleek 1620 | slick 1621 | smart 1622 | smarter 1623 | smartest 1624 | smartly 1625 | smile 1626 | smiles 1627 | smiling 1628 | smilingly 1629 | smitten 1630 | smooth 1631 | smoother 1632 | smoothes 1633 | smoothest 1634 | smoothly 1635 | snappy 1636 | snazzy 1637 | sociable 1638 | soft 1639 | softer 1640 | solace 1641 | solicitous 1642 | solicitously 1643 | solid 1644 | solidarity 1645 | soothe 1646 | soothingly 1647 | sophisticated 1648 | soulful 1649 | soundly 1650 | soundness 1651 | spacious 1652 | sparkle 1653 | sparkling 1654 | spectacular 1655 | spectacularly 1656 | speedily 1657 | speedy 1658 | spellbind 1659 | spellbinding 1660 | spellbindingly 1661 | spellbound 1662 | spirited 1663 | spiritual 1664 | splendid 1665 | splendidly 1666 | splendor 1667 | spontaneous 1668 | sporty 1669 | spotless 1670 | sprightly 1671 | stability 1672 | stabilize 1673 | stable 1674 | stainless 1675 | standout 1676 | state-of-the-art 1677 | stately 1678 | statuesque 1679 | staunch 1680 | staunchly 1681 | staunchness 1682 | steadfast 1683 | steadfastly 1684 | steadfastness 1685 | steadiest 1686 | steadiness 1687 | steady 1688 | stellar 1689 | stellarly 1690 | stimulate 1691 | stimulates 1692 | stimulating 1693 | stimulative 1694 | stirringly 1695 | straighten 1696 | straightforward 1697 | streamlined 1698 | striking 1699 | strikingly 1700 | striving 1701 | strong 1702 | stronger 1703 | strongest 1704 | stunned 1705 | stunning 1706 | stunningly 1707 | stupendous 1708 | stupendously 1709 | sturdier 1710 | sturdy 1711 | stylish 1712 | stylishly 1713 | stylized 1714 | suave 1715 | suavely 1716 | sublime 1717 | subsidize 1718 | subsidized 1719 | subsidizes 1720 | subsidizing 1721 | substantive 1722 | succeed 1723 | succeeded 1724 | succeeding 1725 | succeeds 1726 | succes 1727 | success 1728 | successes 1729 | successful 1730 | successfully 1731 | suffice 1732 | sufficed 1733 | suffices 1734 | sufficient 1735 | sufficiently 1736 | suitable 1737 | sumptuous 1738 | sumptuously 1739 | sumptuousness 1740 | super 1741 | superb 1742 | superbly 1743 | superior 1744 | superiority 1745 | supple 1746 | support 1747 | supported 1748 | supporter 1749 | supporting 1750 | supportive 1751 | supports 1752 | supremacy 1753 | supreme 1754 | supremely 1755 | supurb 1756 | supurbly 1757 | surmount 1758 | surpass 1759 | surreal 1760 | survival 1761 | survivor 1762 | sustainability 1763 | sustainable 1764 | swank 1765 | swankier 1766 | swankiest 1767 | swanky 1768 | sweeping 1769 | sweet 1770 | 
sweeten 1771 | sweetheart 1772 | sweetly 1773 | sweetness 1774 | swift 1775 | swiftness 1776 | talent 1777 | talented 1778 | talents 1779 | tantalize 1780 | tantalizing 1781 | tantalizingly 1782 | tempt 1783 | tempting 1784 | temptingly 1785 | tenacious 1786 | tenaciously 1787 | tenacity 1788 | tender 1789 | tenderly 1790 | terrific 1791 | terrifically 1792 | thank 1793 | thankful 1794 | thinner 1795 | thoughtful 1796 | thoughtfully 1797 | thoughtfulness 1798 | thrift 1799 | thrifty 1800 | thrill 1801 | thrilled 1802 | thrilling 1803 | thrillingly 1804 | thrills 1805 | thrive 1806 | thriving 1807 | thumb-up 1808 | thumbs-up 1809 | tickle 1810 | tidy 1811 | time-honored 1812 | timely 1813 | tingle 1814 | titillate 1815 | titillating 1816 | titillatingly 1817 | togetherness 1818 | tolerable 1819 | toll-free 1820 | top 1821 | top-notch 1822 | top-quality 1823 | topnotch 1824 | tops 1825 | tough 1826 | tougher 1827 | toughest 1828 | traction 1829 | tranquil 1830 | tranquility 1831 | transparent 1832 | treasure 1833 | tremendously 1834 | trendy 1835 | triumph 1836 | triumphal 1837 | triumphant 1838 | triumphantly 1839 | trivially 1840 | trophy 1841 | trouble-free 1842 | trump 1843 | trumpet 1844 | trust 1845 | trusted 1846 | trusting 1847 | trustingly 1848 | trustworthiness 1849 | trustworthy 1850 | trusty 1851 | truthful 1852 | truthfully 1853 | truthfulness 1854 | twinkly 1855 | ultra-crisp 1856 | unabashed 1857 | unabashedly 1858 | unaffected 1859 | unassailable 1860 | unbeatable 1861 | unbiased 1862 | unbound 1863 | uncomplicated 1864 | unconditional 1865 | undamaged 1866 | undaunted 1867 | understandable 1868 | undisputable 1869 | undisputably 1870 | undisputed 1871 | unencumbered 1872 | unequivocal 1873 | unequivocally 1874 | unfazed 1875 | unfettered 1876 | unforgettable 1877 | unity 1878 | unlimited 1879 | unmatched 1880 | unparalleled 1881 | unquestionable 1882 | unquestionably 1883 | unreal 1884 | unrestricted 1885 | unrivaled 1886 | unselfish 1887 | unwavering 1888 | upbeat 1889 | upgradable 1890 | upgradeable 1891 | upgraded 1892 | upheld 1893 | uphold 1894 | uplift 1895 | uplifting 1896 | upliftingly 1897 | upliftment 1898 | upscale 1899 | usable 1900 | useable 1901 | useful 1902 | user-friendly 1903 | user-replaceable 1904 | valiant 1905 | valiantly 1906 | valor 1907 | valuable 1908 | variety 1909 | venerate 1910 | verifiable 1911 | veritable 1912 | versatile 1913 | versatility 1914 | vibrant 1915 | vibrantly 1916 | victorious 1917 | victory 1918 | viewable 1919 | vigilance 1920 | vigilant 1921 | virtue 1922 | virtuous 1923 | virtuously 1924 | visionary 1925 | vivacious 1926 | vivid 1927 | vouch 1928 | vouchsafe 1929 | warm 1930 | warmer 1931 | warmhearted 1932 | warmly 1933 | warmth 1934 | wealthy 1935 | welcome 1936 | well 1937 | well-backlit 1938 | well-balanced 1939 | well-behaved 1940 | well-being 1941 | well-bred 1942 | well-connected 1943 | well-educated 1944 | well-established 1945 | well-informed 1946 | well-intentioned 1947 | well-known 1948 | well-made 1949 | well-managed 1950 | well-mannered 1951 | well-positioned 1952 | well-received 1953 | well-regarded 1954 | well-rounded 1955 | well-run 1956 | well-wishers 1957 | wellbeing 1958 | whoa 1959 | wholeheartedly 1960 | wholesome 1961 | whooa 1962 | whoooa 1963 | wieldy 1964 | willing 1965 | willingly 1966 | willingness 1967 | win 1968 | windfall 1969 | winnable 1970 | winner 1971 | winners 1972 | winning 1973 | wins 1974 | wisdom 1975 | wise 1976 | wisely 1977 | witty 1978 | won 1979 | wonder 1980 | wonderful 1981 | 
wonderfully 1982 | wonderous 1983 | wonderously 1984 | wonders 1985 | wondrous 1986 | woo 1987 | work 1988 | workable 1989 | worked 1990 | works 1991 | world-famous 1992 | worth 1993 | worth-while 1994 | worthiness 1995 | worthwhile 1996 | worthy 1997 | wow 1998 | wowed 1999 | wowing 2000 | wows 2001 | yay 2002 | youthful 2003 | zeal 2004 | zenith 2005 | zest 2006 | zippy 2007 | -------------------------------------------------------------------------------- /data_files/vaderSentiment.py: -------------------------------------------------------------------------------- 1 | # coding: utf-8 2 | # Author: C.J. Hutto 3 | # Thanks to George Berry for reducing the time complexity from something like O(N^4) to O(N). 4 | # Thanks to Ewan Klein and Pierpaolo Pantone for bringing VADER into NLTK. Those modifications were awesome. 5 | # For license information, see LICENSE.TXT 6 | 7 | """ 8 | If you use the VADER sentiment analysis tools, please cite: 9 | Hutto, C.J. & Gilbert, E.E. (2014). VADER: A Parsimonious Rule-based Model for 10 | Sentiment Analysis of Social Media Text. Eighth International Conference on 11 | Weblogs and Social Media (ICWSM-14). Ann Arbor, MI, June 2014. 12 | """ 13 | import os 14 | import sys 15 | import re 16 | import math 17 | import string 18 | import requests 19 | import json 20 | from itertools import product 21 | from inspect import getsourcefile 22 | 23 | def resource_path(relative): 24 | if hasattr(sys, "_MEIPASS"): 25 | return os.path.join(sys._MEIPASS, relative) 26 | return os.path.join(relative) 27 | 28 | # ##Constants## 29 | 30 | # (empirically derived mean sentiment intensity rating increase for booster words) 31 | B_INCR = 0.293 32 | B_DECR = -0.293 33 | 34 | # (empirically derived mean sentiment intensity rating increase for using ALLCAPs to emphasize a word) 35 | C_INCR = 0.733 36 | N_SCALAR = -0.74 37 | 38 | # for removing punctuation 39 | REGEX_REMOVE_PUNCTUATION = re.compile('[%s]' % re.escape(string.punctuation)) 40 | 41 | PUNC_LIST = [".", "!", "?", ",", ";", ":", "-", "'", "\"", 42 | "!!", "!!!", "??", "???", "?!?", "!?!", "?!?!", "!?!?"] 43 | NEGATE = \ 44 | ["aint", "arent", "cannot", "cant", "couldnt", "darent", "didnt", "doesnt", 45 | "ain't", "aren't", "can't", "couldn't", "daren't", "didn't", "doesn't", 46 | "dont", "hadnt", "hasnt", "havent", "isnt", "mightnt", "mustnt", "neither", 47 | "don't", "hadn't", "hasn't", "haven't", "isn't", "mightn't", "mustn't", 48 | "neednt", "needn't", "never", "none", "nope", "nor", "not", "nothing", "nowhere", 49 | "oughtnt", "shant", "shouldnt", "uhuh", "wasnt", "werent", 50 | "oughtn't", "shan't", "shouldn't", "uh-uh", "wasn't", "weren't", 51 | "without", "wont", "wouldnt", "won't", "wouldn't", "rarely", "seldom", "despite"] 52 | 53 | # booster/dampener 'intensifiers' or 'degree adverbs' 54 | # http://en.wiktionary.org/wiki/Category:English_degree_adverbs 55 | 56 | BOOSTER_DICT = \ 57 | {"absolutely": B_INCR, "amazingly": B_INCR, "awfully": B_INCR, "completely": B_INCR, "considerably": B_INCR, 58 | "decidedly": B_INCR, "deeply": B_INCR, "effing": B_INCR, "enormously": B_INCR, 59 | "entirely": B_INCR, "especially": B_INCR, "exceptionally": B_INCR, "extremely": B_INCR, 60 | "fabulously": B_INCR, "flipping": B_INCR, "flippin": B_INCR, 61 | "fricking": B_INCR, "frickin": B_INCR, "frigging": B_INCR, "friggin": B_INCR, "fully": B_INCR, "fucking": B_INCR, 62 | "greatly": B_INCR, "hella": B_INCR, "highly": B_INCR, "hugely": B_INCR, "incredibly": B_INCR, 63 | "intensely": B_INCR, "majorly": B_INCR, "more": B_INCR, "most": 
B_INCR, "particularly": B_INCR, 64 | "purely": B_INCR, "quite": B_INCR, "really": B_INCR, "remarkably": B_INCR, 65 | "so": B_INCR, "substantially": B_INCR, 66 | "thoroughly": B_INCR, "totally": B_INCR, "tremendously": B_INCR, 67 | "uber": B_INCR, "unbelievably": B_INCR, "unusually": B_INCR, "utterly": B_INCR, 68 | "very": B_INCR, 69 | "almost": B_DECR, "barely": B_DECR, "hardly": B_DECR, "just enough": B_DECR, 70 | "kind of": B_DECR, "kinda": B_DECR, "kindof": B_DECR, "kind-of": B_DECR, 71 | "less": B_DECR, "little": B_DECR, "marginally": B_DECR, "occasionally": B_DECR, "partly": B_DECR, 72 | "scarcely": B_DECR, "slightly": B_DECR, "somewhat": B_DECR, 73 | "sort of": B_DECR, "sorta": B_DECR, "sortof": B_DECR, "sort-of": B_DECR} 74 | 75 | # check for sentiment laden idioms that do not contain lexicon words (future work, not yet implemented) 76 | SENTIMENT_LADEN_IDIOMS = {"cut the mustard": 2, "hand to mouth": -2, 77 | "back handed": -2, "blow smoke": -2, "blowing smoke": -2, 78 | "upper hand": 1, "break a leg": 2, 79 | "cooking with gas": 2, "in the black": 2, "in the red": -2, 80 | "on the ball": 2, "under the weather": -2} 81 | 82 | # check for special case idioms containing lexicon words 83 | SPECIAL_CASE_IDIOMS = {"the shit": 3, "the bomb": 3, "bad ass": 1.5, "yeah right": -2, 84 | "kiss of death": -1.5} 85 | 86 | 87 | # #Static methods# # 88 | 89 | def negated(input_words, include_nt=True): 90 | """ 91 | Determine if input contains negation words 92 | """ 93 | input_words = [str(w).lower() for w in input_words] 94 | neg_words = [] 95 | neg_words.extend(NEGATE) 96 | for word in neg_words: 97 | if word in input_words: 98 | return True 99 | if include_nt: 100 | for word in input_words: 101 | if "n't" in word: 102 | return True 103 | if "least" in input_words: 104 | i = input_words.index("least") 105 | if i > 0 and input_words[i - 1] != "at": 106 | return True 107 | return False 108 | 109 | 110 | def normalize(score, alpha=15): 111 | """ 112 | Normalize the score to be between -1 and 1 using an alpha that 113 | approximates the max expected value 114 | """ 115 | norm_score = score / math.sqrt((score * score) + alpha) 116 | if norm_score < -1.0: 117 | return -1.0 118 | elif norm_score > 1.0: 119 | return 1.0 120 | else: 121 | return norm_score 122 | 123 | 124 | def allcap_differential(words): 125 | """ 126 | Check whether just some words in the input are ALL CAPS 127 | :param list words: The words to inspect 128 | :returns: `True` if some but not all items in `words` are ALL CAPS 129 | """ 130 | is_different = False 131 | allcap_words = 0 132 | for word in words: 133 | if word.isupper(): 134 | allcap_words += 1 135 | cap_differential = len(words) - allcap_words 136 | if 0 < cap_differential < len(words): 137 | is_different = True 138 | return is_different 139 | 140 | 141 | def scalar_inc_dec(word, valence, is_cap_diff): 142 | """ 143 | Check if the preceding words increase, decrease, or negate/nullify the 144 | valence 145 | """ 146 | scalar = 0.0 147 | word_lower = word.lower() 148 | if word_lower in BOOSTER_DICT: 149 | scalar = BOOSTER_DICT[word_lower] 150 | if valence < 0: 151 | scalar *= -1 152 | # check if booster/dampener word is in ALLCAPS (while others aren't) 153 | if word.isupper() and is_cap_diff: 154 | if valence > 0: 155 | scalar += C_INCR 156 | else: 157 | scalar -= C_INCR 158 | return scalar 159 | 160 | 161 | class SentiText(object): 162 | """ 163 | Identify sentiment-relevant string-level properties of input text. 
164 | """ 165 | 166 | def __init__(self, text): 167 | if not isinstance(text, str): 168 | text = str(text).encode('utf-8') 169 | self.text = text 170 | self.words_and_emoticons = self._words_and_emoticons() 171 | # doesn't separate words from\ 172 | # adjacent punctuation (keeps emoticons & contractions) 173 | self.is_cap_diff = allcap_differential(self.words_and_emoticons) 174 | 175 | def _words_plus_punc(self): 176 | """ 177 | Returns mapping of form: 178 | { 179 | 'cat,': 'cat', 180 | ',cat': 'cat', 181 | } 182 | """ 183 | no_punc_text = REGEX_REMOVE_PUNCTUATION.sub('', self.text) 184 | # removes punctuation (but loses emoticons & contractions) 185 | words_only = no_punc_text.split() 186 | # remove singletons 187 | words_only = set(w for w in words_only if len(w) > 1) 188 | # the product gives ('cat', ',') and (',', 'cat') 189 | punc_before = {''.join(p): p[1] for p in product(PUNC_LIST, words_only)} 190 | punc_after = {''.join(p): p[0] for p in product(words_only, PUNC_LIST)} 191 | words_punc_dict = punc_before 192 | words_punc_dict.update(punc_after) 193 | return words_punc_dict 194 | 195 | def _words_and_emoticons(self): 196 | """ 197 | Removes leading and trailing puncutation 198 | Leaves contractions and most emoticons 199 | Does not preserve punc-plus-letter emoticons (e.g. :D) 200 | """ 201 | wes = self.text.split() 202 | words_punc_dict = self._words_plus_punc() 203 | wes = [we for we in wes if len(we) > 1] 204 | for i, we in enumerate(wes): 205 | if we in words_punc_dict: 206 | wes[i] = words_punc_dict[we] 207 | return wes 208 | 209 | 210 | class SentimentIntensityAnalyzer(object): 211 | """ 212 | Give a sentiment intensity score to sentences. 213 | """ 214 | 215 | def __init__(self, lexicon_file="data_files/vader_lexicon.txt", emoji_lexicon="data_files/emoji_utf8_lexicon.txt"): 216 | with open(resource_path(lexicon_file), encoding='utf-8') as f: 217 | self.lexicon_full_filepath = f.read() 218 | self.lexicon = self.make_lex_dict() 219 | 220 | with open(resource_path(emoji_lexicon), encoding='utf-8') as f: 221 | self.emoji_full_filepath = f.read() 222 | self.emojis = self.make_emoji_dict() 223 | 224 | def make_lex_dict(self): 225 | """ 226 | Convert lexicon file to a dictionary 227 | """ 228 | lex_dict = {} 229 | for line in self.lexicon_full_filepath.split('\n'): 230 | (word, measure) = line.strip().split('\t')[0:2] 231 | lex_dict[word] = float(measure) 232 | return lex_dict 233 | 234 | def make_emoji_dict(self): 235 | """ 236 | Convert emoji lexicon file to a dictionary 237 | """ 238 | emoji_dict = {} 239 | for line in self.emoji_full_filepath.split('\n'): 240 | (emoji, description) = line.strip().split('\t')[0:2] 241 | emoji_dict[emoji] = description 242 | return emoji_dict 243 | 244 | def polarity_scores(self, text): 245 | """ 246 | Return a float for sentiment strength based on the input text. 247 | Positive values are positive valence, negative value are negative 248 | valence. 
249 | """ 250 | # convert emojis to their textual descriptions 251 | text_token_list = text.split() 252 | text_no_emoji_lst = [] 253 | for token in text_token_list: 254 | if token in self.emojis: 255 | # get the textual description 256 | description = self.emojis[token] 257 | text_no_emoji_lst.append(description) 258 | else: 259 | text_no_emoji_lst.append(token) 260 | text = " ".join(x for x in text_no_emoji_lst) 261 | 262 | sentitext = SentiText(text) 263 | 264 | sentiments = [] 265 | words_and_emoticons = sentitext.words_and_emoticons 266 | for item in words_and_emoticons: 267 | valence = 0 268 | i = words_and_emoticons.index(item) 269 | # check for vader_lexicon words that may be used as modifiers or negations 270 | if item.lower() in BOOSTER_DICT: 271 | sentiments.append(valence) 272 | continue 273 | if (i < len(words_and_emoticons) - 1 and item.lower() == "kind" and 274 | words_and_emoticons[i + 1].lower() == "of"): 275 | sentiments.append(valence) 276 | continue 277 | 278 | sentiments = self.sentiment_valence(valence, sentitext, item, i, sentiments) 279 | 280 | sentiments = self._but_check(words_and_emoticons, sentiments) 281 | 282 | valence_dict = self.score_valence(sentiments, text) 283 | 284 | return valence_dict 285 | 286 | def sentiment_valence(self, valence, sentitext, item, i, sentiments): 287 | is_cap_diff = sentitext.is_cap_diff 288 | words_and_emoticons = sentitext.words_and_emoticons 289 | item_lowercase = item.lower() 290 | if item_lowercase in self.lexicon: 291 | # get the sentiment valence 292 | valence = self.lexicon[item_lowercase] 293 | # check if sentiment laden word is in ALL CAPS (while others aren't) 294 | if item.isupper() and is_cap_diff: 295 | if valence > 0: 296 | valence += C_INCR 297 | else: 298 | valence -= C_INCR 299 | 300 | for start_i in range(0, 3): 301 | # dampen the scalar modifier of preceding words and emoticons 302 | # (excluding the ones that immediately preceed the item) based 303 | # on their distance from the current item. 
304 |                 if i > start_i and words_and_emoticons[i - (start_i + 1)].lower() not in self.lexicon:
305 |                     s = scalar_inc_dec(words_and_emoticons[i - (start_i + 1)], valence, is_cap_diff)
306 |                     if start_i == 1 and s != 0:
307 |                         s = s * 0.95
308 |                     if start_i == 2 and s != 0:
309 |                         s = s * 0.9
310 |                     valence = valence + s
311 |                     valence = self._negation_check(valence, words_and_emoticons, start_i, i)
312 |                     if start_i == 2:
313 |                         valence = self._special_idioms_check(valence, words_and_emoticons, i)
314 |
315 |             valence = self._least_check(valence, words_and_emoticons, i)
316 |         sentiments.append(valence)
317 |         return sentiments
318 |
319 |     def _least_check(self, valence, words_and_emoticons, i):
320 |         # check for negation case using "least"
321 |         if i > 1 and words_and_emoticons[i - 1].lower() not in self.lexicon \
322 |                 and words_and_emoticons[i - 1].lower() == "least":
323 |             if words_and_emoticons[i - 2].lower() != "at" and words_and_emoticons[i - 2].lower() != "very":
324 |                 valence = valence * N_SCALAR
325 |         elif i > 0 and words_and_emoticons[i - 1].lower() not in self.lexicon \
326 |                 and words_and_emoticons[i - 1].lower() == "least":
327 |             valence = valence * N_SCALAR
328 |         return valence
329 |
330 |     @staticmethod
331 |     def _but_check(words_and_emoticons, sentiments):
332 |         # check for modification in sentiment due to contrastive conjunction 'but'
333 |         words_and_emoticons_lower = [str(w).lower() for w in words_and_emoticons]
334 |         if 'but' in words_and_emoticons_lower:
335 |             bi = words_and_emoticons_lower.index('but')
336 |             for si, sentiment in enumerate(sentiments):
337 |                 # (enumerate keeps the true position; the old sentiments.index(sentiment) mis-indexed duplicate scores)
338 |                 if si < bi:
339 |                     sentiments.pop(si)
340 |                     sentiments.insert(si, sentiment * 0.5)
341 |                 elif si > bi:
342 |                     sentiments.pop(si)
343 |                     sentiments.insert(si, sentiment * 1.5)
344 |         return sentiments
345 |
346 |     @staticmethod
347 |     def _special_idioms_check(valence, words_and_emoticons, i):
348 |         words_and_emoticons_lower = [str(w).lower() for w in words_and_emoticons]
349 |         onezero = "{0} {1}".format(words_and_emoticons_lower[i - 1], words_and_emoticons_lower[i])
350 |
351 |         twoonezero = "{0} {1} {2}".format(words_and_emoticons_lower[i - 2],
352 |                                           words_and_emoticons_lower[i - 1], words_and_emoticons_lower[i])
353 |
354 |         twoone = "{0} {1}".format(words_and_emoticons_lower[i - 2], words_and_emoticons_lower[i - 1])
355 |
356 |         threetwoone = "{0} {1} {2}".format(words_and_emoticons_lower[i - 3],
357 |                                            words_and_emoticons_lower[i - 2], words_and_emoticons_lower[i - 1])
358 |
359 |         threetwo = "{0} {1}".format(words_and_emoticons_lower[i - 3], words_and_emoticons_lower[i - 2])
360 |
361 |         sequences = [onezero, twoonezero, twoone, threetwoone, threetwo]
362 |
363 |         for seq in sequences:
364 |             if seq in SPECIAL_CASE_IDIOMS:
365 |                 valence = SPECIAL_CASE_IDIOMS[seq]
366 |                 break
367 |
368 |         if len(words_and_emoticons_lower) - 1 > i:
369 |             zeroone = "{0} {1}".format(words_and_emoticons_lower[i], words_and_emoticons_lower[i + 1])
370 |             if zeroone in SPECIAL_CASE_IDIOMS:
371 |                 valence = SPECIAL_CASE_IDIOMS[zeroone]
372 |         if len(words_and_emoticons_lower) - 1 > i + 1:
373 |             zeroonetwo = "{0} {1} {2}".format(words_and_emoticons_lower[i], words_and_emoticons_lower[i + 1],
374 |                                               words_and_emoticons_lower[i + 2])
375 |             if zeroonetwo in SPECIAL_CASE_IDIOMS:
376 |                 valence = SPECIAL_CASE_IDIOMS[zeroonetwo]
377 |
378 |         # check for booster/dampener bi-grams such as 'sort of' or 'kind of'
379 |         n_grams = [threetwoone, threetwo, twoone]
380 |         for n_gram in n_grams:
381 |             if n_gram in BOOSTER_DICT:
382 |                 valence = valence + BOOSTER_DICT[n_gram]
383 |         return valence
384 |
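    # Editor's note (illustrative): for "is the shit" with i at "shit", the
    # windows built above include twoone = "is the" and onezero = "the shit";
    # a match in SPECIAL_CASE_IDIOMS replaces the word's lexicon valence with
    # the idiom's fixed value, and booster bi-grams such as "sort of" can then
    # still nudge that value via BOOSTER_DICT.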
385 |     @staticmethod
386 |     def _sentiment_laden_idioms_check(valence, senti_text_lower):
387 |         # Future Work
388 |         # check for sentiment laden idioms that don't contain a lexicon word
389 |         idioms_valences = []
390 |         for idiom in SENTIMENT_LADEN_IDIOMS:
391 |             if idiom in senti_text_lower:
392 |                 # (a leftover debug print of each matched idiom was removed here)
393 |                 valence = SENTIMENT_LADEN_IDIOMS[idiom]
394 |                 idioms_valences.append(valence)
395 |         if len(idioms_valences) > 0:
396 |             valence = sum(idioms_valences) / float(len(idioms_valences))
397 |         return valence
398 |
399 |     @staticmethod
400 |     def _negation_check(valence, words_and_emoticons, start_i, i):
401 |         words_and_emoticons_lower = [str(w).lower() for w in words_and_emoticons]
402 |         if start_i == 0:
403 |             if negated([words_and_emoticons_lower[i - (start_i + 1)]]):  # 1 word preceding lexicon word (w/o stopwords)
404 |                 valence = valence * N_SCALAR
405 |         if start_i == 1:
406 |             if words_and_emoticons_lower[i - 2] == "never" and \
407 |                     (words_and_emoticons_lower[i - 1] == "so" or
408 |                      words_and_emoticons_lower[i - 1] == "this"):
409 |                 valence = valence * 1.25
410 |             elif words_and_emoticons_lower[i - 2] == "without" and \
411 |                     words_and_emoticons_lower[i - 1] == "doubt":
412 |                 valence = valence  # "without doubt" is not a negation; leave valence unchanged
413 |             elif negated([words_and_emoticons_lower[i - (start_i + 1)]]):  # 2 words preceding the lexicon word position
414 |                 valence = valence * N_SCALAR
415 |         if start_i == 2:
416 |             if words_and_emoticons_lower[i - 3] == "never" and \
417 |                     (words_and_emoticons_lower[i - 2] == "so" or words_and_emoticons_lower[i - 2] == "this") or \
418 |                     (words_and_emoticons_lower[i - 1] == "so" or words_and_emoticons_lower[i - 1] == "this"):
419 |                 valence = valence * 1.25
420 |             elif words_and_emoticons_lower[i - 3] == "without" and \
421 |                     (words_and_emoticons_lower[i - 2] == "doubt" or words_and_emoticons_lower[i - 1] == "doubt"):
422 |                 valence = valence  # as above, "without ... doubt" cancels the would-be negation
423 |             elif negated([words_and_emoticons_lower[i - (start_i + 1)]]):  # 3 words preceding the lexicon word position
424 |                 valence = valence * N_SCALAR
425 |         return valence
426 |
427 |     def _punctuation_emphasis(self, text):
428 |         # add emphasis from exclamation points and question marks
429 |         ep_amplifier = self._amplify_ep(text)
430 |         qm_amplifier = self._amplify_qm(text)
431 |         punct_emph_amplifier = ep_amplifier + qm_amplifier
432 |         return punct_emph_amplifier
433 |
434 |     @staticmethod
435 |     def _amplify_ep(text):
436 |         # check for added emphasis resulting from exclamation points (up to 4 of them)
437 |         ep_count = text.count("!")
438 |         if ep_count > 4:
439 |             ep_count = 4
440 |         # (empirically derived mean sentiment intensity rating increase for
441 |         # exclamation points)
442 |         ep_amplifier = ep_count * 0.292
443 |         return ep_amplifier
444 |
445 |     @staticmethod
446 |     def _amplify_qm(text):
447 |         # check for added emphasis resulting from question marks (2 or 3+)
448 |         qm_count = text.count("?")
449 |         qm_amplifier = 0
450 |         if qm_count > 1:
451 |             if qm_count <= 3:
452 |                 # (empirically derived mean sentiment intensity rating increase for
453 |                 # question marks)
454 |                 qm_amplifier = qm_count * 0.18
455 |             else:
456 |                 qm_amplifier = 0.96
457 |         return qm_amplifier
458 |
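    # Worked example of the punctuation emphasis above (editor's sketch):
    #   "Great!!!" -> ep_count = 3 -> ep_amplifier = 3 * 0.292 = 0.876
    #   "Right??"  -> qm_count = 2 -> qm_amplifier = 2 * 0.18  = 0.36
    # _punctuation_emphasis() returns their sum, which score_valence() adds to
    # a positive sum_s or subtracts from a negative one before normalizing.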
459 |     @staticmethod
460 |     def _sift_sentiment_scores(sentiments):
461 |         # want separate positive versus negative sentiment scores
462 |         pos_sum = 0.0
463 |         neg_sum = 0.0
464 |         neu_count = 0
465 |         for sentiment_score in sentiments:
466 |             if sentiment_score > 0:
467 |                 pos_sum += (float(sentiment_score) + 1)  # compensates for neutral words that are counted as 1
468 |             if sentiment_score < 0:
469 |                 neg_sum += (float(sentiment_score) - 1)  # when used with math.fabs(), compensates for neutrals
470 |             if sentiment_score == 0:
471 |                 neu_count += 1
472 |         return pos_sum, neg_sum, neu_count
473 |
474 |     def score_valence(self, sentiments, text):
475 |         if sentiments:
476 |             sum_s = float(sum(sentiments))
477 |             # compute and add emphasis from punctuation in text
478 |             punct_emph_amplifier = self._punctuation_emphasis(text)
479 |             if sum_s > 0:
480 |                 sum_s += punct_emph_amplifier
481 |             elif sum_s < 0:
482 |                 sum_s -= punct_emph_amplifier
483 |
484 |             compound = normalize(sum_s)  # normalize() (defined earlier in this file) squashes sum_s into (-1, 1)
485 |             # discriminate between positive, negative and neutral sentiment scores
486 |             pos_sum, neg_sum, neu_count = self._sift_sentiment_scores(sentiments)
487 |
488 |             if pos_sum > math.fabs(neg_sum):
489 |                 pos_sum += punct_emph_amplifier
490 |             elif pos_sum < math.fabs(neg_sum):
491 |                 neg_sum -= punct_emph_amplifier
492 |
493 |             total = pos_sum + math.fabs(neg_sum) + neu_count
494 |             pos = math.fabs(pos_sum / total)
495 |             neg = math.fabs(neg_sum / total)
496 |             neu = math.fabs(neu_count / total)
497 |
498 |         else:
499 |             compound = 0.0
500 |             pos = 0.0
501 |             neg = 0.0
502 |             neu = 0.0
503 |
504 |         sentiment_dict = \
505 |             {"neg": round(neg, 3),
506 |              "neu": round(neu, 3),
507 |              "pos": round(pos, 3),
508 |              "compound": round(compound, 4)}
509 |
510 |         return sentiment_dict
511 |
512 |
513 | if __name__ == '__main__':
514 |     # --- examples -------
515 |     sentences = ["VADER is smart, handsome, and funny.",  # positive sentence example
516 |                  "VADER is smart, handsome, and funny!",
517 |                  # punctuation emphasis handled correctly (sentiment intensity adjusted)
518 |                  "VADER is very smart, handsome, and funny.",
519 |                  # booster words handled correctly (sentiment intensity adjusted)
520 |                  "VADER is VERY SMART, handsome, and FUNNY.",  # emphasis for ALLCAPS handled
521 |                  "VADER is VERY SMART, handsome, and FUNNY!!!",
522 |                  # combination of signals - VADER appropriately adjusts intensity
523 |                  "VADER is VERY SMART, uber handsome, and FRIGGIN FUNNY!!!",
524 |                  # booster words & punctuation make this close to ceiling for score
525 |                  "VADER is not smart, handsome, nor funny.",  # negation sentence example
526 |                  "The book was good.",  # positive sentence
527 |                  "At least it isn't a horrible book.",  # negated negative sentence with contraction
528 |                  "The book was only kind of good.",
529 |                  # qualified positive sentence is handled correctly (intensity adjusted)
530 |                  "The plot was good, but the characters are uncompelling and the dialog is not great.",
531 |                  # mixed negation sentence
532 |                  "Today SUX!",  # negative slang with capitalization emphasis
533 |                  "Today only kinda sux! But I'll get by, lol",
534 |                  # mixed sentiment example with slang and contrastive conjunction "but"
535 |                  "Make sure you :) or :D today!",  # emoticons handled
536 |                  "Catch utf-8 emoji such as 💘 and 💋 and 😁",  # emojis handled
537 |                  "Not bad at all"  # Capitalized negation
538 |                  ]
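    # Editor's note (illustrative): each sentence below is scored independently;
    # e.g., the ALL-CAPS and "!!!" variants of the first sentence should yield a
    # higher compound score than the plain period-terminated baseline.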
But I'll get by, lol", 534 | # mixed sentiment example with slang and constrastive conjunction "but" 535 | "Make sure you :) or :D today!", # emoticons handled 536 | "Catch utf-8 emoji such as 💘 and 💋 and 😁", # emojis handled 537 | "Not bad at all" # Capitalized negation 538 | ] 539 | 540 | analyzer = SentimentIntensityAnalyzer() 541 | 542 | print("----------------------------------------------------") 543 | print(" - Analyze typical example cases, including handling of:") 544 | print(" -- negations") 545 | print(" -- punctuation emphasis & punctuation flooding") 546 | print(" -- word-shape as emphasis (capitalization difference)") 547 | print(" -- degree modifiers (intensifiers such as 'very' and dampeners such as 'kind of')") 548 | print(" -- slang words as modifiers such as 'uber' or 'friggin' or 'kinda'") 549 | print(" -- contrastive conjunction 'but' indicating a shift in sentiment; sentiment of later text is dominant") 550 | print(" -- use of contractions as negations") 551 | print(" -- sentiment laden emoticons such as :) and :D") 552 | print(" -- utf-8 encoded emojis such as 💘 and 💋 and 😁") 553 | print(" -- sentiment laden slang words (e.g., 'sux')") 554 | print(" -- sentiment laden initialisms and acronyms (for example: 'lol') \n") 555 | for sentence in sentences: 556 | vs = analyzer.polarity_scores(sentence) 557 | print("{:-<65} {}".format(sentence, str(vs))) 558 | print("----------------------------------------------------") 559 | print(" - About the scoring: ") 560 | print(""" -- The 'compound' score is computed by summing the valence scores of each word in the lexicon, adjusted 561 | according to the rules, and then normalized to be between -1 (most extreme negative) and +1 (most extreme positive). 562 | This is the most useful metric if you want a single unidimensional measure of sentiment for a given sentence. 563 | Calling it a 'normalized, weighted composite score' is accurate.""") 564 | print(""" -- The 'pos', 'neu', and 'neg' scores are ratios for proportions of text that fall in each category (so these 565 | should all add up to be 1... or close to it with float operation). These are the most useful metrics if 566 | you want multidimensional measures of sentiment for a given sentence.""") 567 | print("----------------------------------------------------") 568 | 569 | # input("\nPress Enter to continue the demo...\n") # for DEMO purposes... 570 | 571 | tricky_sentences = ["Sentiment analysis has never been good.", 572 | "Sentiment analysis has never been this good!", 573 | "Most automated sentiment analysis tools are shit.", 574 | "With VADER, sentiment analysis is the shit!", 575 | "Other sentiment analysis tools can be quite bad.", 576 | "On the other hand, VADER is quite bad ass", 577 | "VADER is such a badass!", # slang with punctuation emphasis 578 | "Without a doubt, excellent idea.", 579 | "Roger Dodger is one of the most compelling variations on this theme.", 580 | "Roger Dodger is at least compelling as a variation on the theme.", 581 | "Roger Dodger is one of the least compelling variations on this theme.", 582 | "Not such a badass after all.", # Capitalized negation with slang 583 | "Without a doubt, an excellent idea." 
# "without {any} doubt" as negation 584 | ] 585 | print("----------------------------------------------------") 586 | print(" - Analyze examples of tricky sentences that cause trouble to other sentiment analysis tools.") 587 | print(" -- special case idioms - e.g., 'never good' vs 'never this good', or 'bad' vs 'bad ass'.") 588 | print(" -- special uses of 'least' as negation versus comparison \n") 589 | for sentence in tricky_sentences: 590 | vs = analyzer.polarity_scores(sentence) 591 | print("{:-<69} {}".format(sentence, str(vs))) 592 | print("----------------------------------------------------") 593 | 594 | # input("\nPress Enter to continue the demo...\n") # for DEMO purposes... 595 | 596 | print("----------------------------------------------------") 597 | print( 598 | " - VADER works best when analysis is done at the sentence level (but it can work on single words or entire novels).") 599 | paragraph = "It was one of the worst movies I've seen, despite good reviews. Unbelievably bad acting!! Poor direction. VERY poor production. The movie was bad. Very bad movie. VERY BAD movie!" 600 | print(" -- For example, given the following paragraph text from a hypothetical movie review:\n\t'{}'".format( 601 | paragraph)) 602 | print( 603 | " -- You could use NLTK to break the paragraph into sentence tokens for VADER, then average the results for the paragraph like this: \n") 604 | # simple example to tokenize paragraph into sentences for VADER 605 | from nltk import tokenize 606 | 607 | sentence_list = tokenize.sent_tokenize(paragraph) 608 | paragraphSentiments = 0.0 609 | for sentence in sentence_list: 610 | vs = analyzer.polarity_scores(sentence) 611 | print("{:-<69} {}".format(sentence, str(vs["compound"]))) 612 | paragraphSentiments += vs["compound"] 613 | print("AVERAGE SENTIMENT FOR PARAGRAPH: \t" + str(round(paragraphSentiments / len(sentence_list), 4))) 614 | print("----------------------------------------------------") 615 | 616 | # input("\nPress Enter to continue the demo...\n") # for DEMO purposes... 617 | 618 | print("----------------------------------------------------") 619 | print(" - Analyze sentiment of IMAGES/VIDEO data based on annotation 'tags' or image labels. \n") 620 | conceptList = ["balloons", "cake", "candles", "happy birthday", "friends", "laughing", "smiling", "party"] 621 | conceptSentiments = 0.0 622 | for concept in conceptList: 623 | vs = analyzer.polarity_scores(concept) 624 | print("{:-<15} {}".format(concept, str(vs['compound']))) 625 | conceptSentiments += vs["compound"] 626 | print("AVERAGE SENTIMENT OF TAGS/LABELS: \t" + str(round(conceptSentiments / len(conceptList), 4))) 627 | print("\t") 628 | conceptList = ["riot", "fire", "fight", "blood", "mob", "war", "police", "tear gas"] 629 | conceptSentiments = 0.0 630 | for concept in conceptList: 631 | vs = analyzer.polarity_scores(concept) 632 | print("{:-<15} {}".format(concept, str(vs['compound']))) 633 | conceptSentiments += vs["compound"] 634 | print("AVERAGE SENTIMENT OF TAGS/LABELS: \t" + str(round(conceptSentiments / len(conceptList), 4))) 635 | print("----------------------------------------------------") 636 | 637 | # input("\nPress Enter to continue the demo...") # for DEMO purposes... 638 | 639 | do_translate = input( 640 | "\nWould you like to run VADER demo examples with NON-ENGLISH text? 
639 |     do_translate = input(
640 |         "\nWould you like to run VADER demo examples with NON-ENGLISH text? (Note: requires Internet access) \n Type 'y' or 'n', then press Enter: ")
641 |     if "y" in do_translate.lower().lstrip():  # idiomatic membership test (was .__contains__("y"))
642 |         print("\n----------------------------------------------------")
643 |         print(" - Analyze sentiment of NON ENGLISH text...for example:")
644 |         print(" -- French, German, Spanish, Italian, Russian, Japanese, Arabic, Chinese")
645 |         print(" -- many other languages supported. \n")
646 |         languages = ["English", "French", "German", "Spanish", "Italian", "Russian", "Japanese", "Arabic", "Chinese"]
647 |         language_codes = ["en", "fr", "de", "es", "it", "ru", "ja", "ar", "zh"]
648 |         nonEnglish_sentences = ["I'm surprised to see just how amazingly helpful VADER is!",
649 |                                 "Je suis surpris de voir juste comment incroyablement utile VADER est!",
650 |                                 "Ich bin überrascht zu sehen, nur wie erstaunlich nützlich VADER!",
651 |                                 "Me sorprende ver sólo cómo increíblemente útil VADER!",
652 |                                 "Sono sorpreso di vedere solo come incredibilmente utile VADER è!",
653 |                                 "Я удивлен увидеть, как раз как удивительно полезно ВЕЙДЕРА!",
654 |                                 "私はちょうどどのように驚くほど役に立つベイダーを見て驚いています!",
655 |                                 "أنا مندهش لرؤية فقط كيف مثير للدهشة فيدر فائدة!",
656 |                                 "惊讶地看到有用维德是的只是如何令人惊讶了 !"
657 |                                 ]
658 |         for sentence in nonEnglish_sentences:
659 |             to_lang = "en"
660 |             from_lang = language_codes[nonEnglish_sentences.index(sentence)]
661 |             if (from_lang == "en") or (from_lang == "en-US"):
662 |                 translation = sentence
663 |                 translator_name = "No translation needed"
664 |             else:  # please note usage limits for My Memory Translation Service: http://mymemory.translated.net/doc/usagelimits.php
665 |                 # using MY MEMORY NET http://mymemory.translated.net
666 |                 api_url = "http://mymemory.translated.net/api/get?q={}&langpair={}|{}".format(sentence, from_lang,
667 |                                                                                               to_lang)
668 |                 hdrs = {
669 |                     'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11',
670 |                     'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
671 |                     'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
672 |                     'Accept-Encoding': 'none',
673 |                     'Accept-Language': 'en-US,en;q=0.8',
674 |                     'Connection': 'keep-alive'}
675 |                 response = requests.get(api_url, headers=hdrs)
676 |                 response_json = json.loads(response.text)
677 |                 translation = response_json["responseData"]["translatedText"]
678 |                 translator_name = "MemoryNet Translation Service"
679 |             vs = analyzer.polarity_scores(translation)
680 |             print("- {: <8}: {: <69}\t {} ({})".format(languages[nonEnglish_sentences.index(sentence)], sentence,
681 |                                                        str(vs['compound']), translator_name))
682 |         print("----------------------------------------------------")
683 |
684 |     print("\n\n Demo Done!")
685 |
--------------------------------------------------------------------------------
/en_core_web_sm/accuracy.json:
--------------------------------------------------------------------------------
1 | {
2 | "uas":91.7237657538,
3 | "las":89.800872413,
4 | "ents_p":84.9664503965,
5 | "ents_r":85.6312524451,
6 | "ents_f":85.2975560875,
7 | "tags_acc":97.0403350292,
8 | "token_acc":99.8698372794
9 | }
--------------------------------------------------------------------------------
/en_core_web_sm/meta.json:
--------------------------------------------------------------------------------
1 | {
2 | "lang":"en",
3 | "pipeline":[
4 | "tagger",
5 | "parser",
6 | "ner"
7 | ],
8 | "accuracy":{
9 | "token_acc":99.8698372794,
10 | "ents_p":84.9664503965,
11 | "ents_r":85.6312524451,
12 | "uas":91.7237657538,
13 | "tags_acc":97.0403350292,
14 |
"ents_f":85.2975560875, 15 | "las":89.800872413 16 | }, 17 | "name":"core_web_sm", 18 | "license":"CC BY-SA 3.0", 19 | "author":"Explosion AI", 20 | "url":"https://explosion.ai", 21 | "vectors":{ 22 | "keys":0, 23 | "width":0, 24 | "vectors":0 25 | }, 26 | "sources":[ 27 | "OntoNotes 5", 28 | "Common Crawl" 29 | ], 30 | "version":"2.0.0", 31 | "spacy_version":">=2.0.0a18", 32 | "parent_package":"spacy", 33 | "speed":{ 34 | "gpu":null, 35 | "nwords":291344, 36 | "cpu":5122.3040471407 37 | }, 38 | "email":"contact@explosion.ai", 39 | "description":"English multi-task CNN trained on OntoNotes, with GloVe vectors trained on Common Crawl. Assigns word vectors, context-specific token vectors, POS tags, dependency parse and named entities." 40 | } -------------------------------------------------------------------------------- /en_core_web_sm/ner/cfg: -------------------------------------------------------------------------------- 1 | { 2 | "beam_width":1, 3 | "beam_density":0.0, 4 | "pretrained_dims":0, 5 | "cnn_maxout_pieces":3, 6 | "nr_class":73, 7 | "hidden_depth":1, 8 | "token_vector_width":128, 9 | "hidden_width":200, 10 | "maxout_pieces":2, 11 | "hist_size":0, 12 | "hist_width":0 13 | } -------------------------------------------------------------------------------- /en_core_web_sm/ner/lower_model: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/ner/lower_model -------------------------------------------------------------------------------- /en_core_web_sm/ner/moves: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/ner/moves -------------------------------------------------------------------------------- /en_core_web_sm/ner/tok2vec_model: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/ner/tok2vec_model -------------------------------------------------------------------------------- /en_core_web_sm/ner/upper_model: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/ner/upper_model -------------------------------------------------------------------------------- /en_core_web_sm/parser/cfg: -------------------------------------------------------------------------------- 1 | { 2 | "beam_width":1, 3 | "beam_density":0.0, 4 | "pretrained_dims":0, 5 | "cnn_maxout_pieces":3, 6 | "nr_class":111, 7 | "hidden_depth":1, 8 | "token_vector_width":128, 9 | "hidden_width":200, 10 | "maxout_pieces":2, 11 | "hist_size":0, 12 | "hist_width":0 13 | } -------------------------------------------------------------------------------- /en_core_web_sm/parser/lower_model: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/parser/lower_model -------------------------------------------------------------------------------- /en_core_web_sm/parser/moves: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/parser/moves -------------------------------------------------------------------------------- /en_core_web_sm/parser/tok2vec_model: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/parser/tok2vec_model -------------------------------------------------------------------------------- /en_core_web_sm/parser/upper_model: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/parser/upper_model -------------------------------------------------------------------------------- /en_core_web_sm/tagger/cfg: -------------------------------------------------------------------------------- 1 | { 2 | "cnn_maxout_pieces":2, 3 | "pretrained_dims":0 4 | } -------------------------------------------------------------------------------- /en_core_web_sm/tagger/model: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/tagger/model -------------------------------------------------------------------------------- /en_core_web_sm/tagger/tag_map: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/tagger/tag_map -------------------------------------------------------------------------------- /en_core_web_sm/tokenizer: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/tokenizer -------------------------------------------------------------------------------- /en_core_web_sm/vocab/key2row: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/vocab/key2row -------------------------------------------------------------------------------- /en_core_web_sm/vocab/lexemes.bin: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/vocab/lexemes.bin -------------------------------------------------------------------------------- /en_core_web_sm/vocab/vectors: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kristopherkyle/SEANCE/89213d9ab7e397a64db1fde91ef7f44494a19e35/en_core_web_sm/vocab/vectors --------------------------------------------------------------------------------
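Editor's addendum (not part of the repository): a minimal sketch of loading the bundled spaCy model listed above. It assumes spaCy 2.x is installed and that the en_core_web_sm directory sits next to the script; per meta.json, the pipeline supplies the tagger, parser, and ner components.

import spacy

# Load by package name, or point spacy.load() at the bundled directory path.
nlp = spacy.load("en_core_web_sm")
doc = nlp("SEANCE relies on this model for POS tags, parses, and entities.")
print([(token.text, token.pos_, token.dep_) for token in doc])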