├── LICENSE
├── Media
│   └── app.gif
├── README.md
├── app_v2.py
├── f_dl_secrets.py
├── f_utils.py
├── requirements.txt
├── test_samples
│   ├── out.txt
│   └── out.wav
└── test_summary.ipynb

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2022 Juan Camilo López Montes
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/Media/app.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/juan-csv/GPT3-text-summarization/7fa3ff1e4128977d094efa739e0a755902ec03fb/Media/app.gif
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # GPT3-text-summarization
2 | Given a fragment of a conversation, this repository structures it into a title, a summary, tags, and a sentiment.
3 |
4 | # How to install:
5 |
pip install -r requirements.txt 
6 |
7 | # How to run:
8 | The code was tested with Python 3.7.8 on macOS Catalina.
9 |
10 |
streamlit run app_v2.py
11 | 12 | Or You can run online [here](https://share.streamlit.io/juan-csv/gpt3-text-summarization/app_v2.py) 13 | 14 | 15 | ![alt text](https://github.com/juan-csv/gpt3-text-summarization/blob/master/Media/app.gif) 16 | 17 | 18 | 19 | -------------------------------------------------------------------------------- /app_v2.py: -------------------------------------------------------------------------------- 1 | import streamlit as st 2 | import f_dl_secrets as f_dl 3 | import speech_recognition as sr 4 | from io import StringIO 5 | 6 | r = sr.Recognizer() 7 | 8 | # define parameters 9 | LIST_MODE = ["TITLE" ,"SUMMARY", "TAGS", "SENTIMENT", "REASON_FOR_CUSTOMER_CALL"] 10 | # Select Engine ---> Ada, Babbage, Curie, Davinci 11 | ENGINE = "Davinci" 12 | 13 | Name_app = "Understand conversation" 14 | REPO = "https://github.com/juan-csv/Understand-conversation-AI" 15 | 16 | description_app = f"[{Name_app}]({REPO}) "\ 17 | f"structures data in title, summary, tags, sentiment "\ 18 | f"given a fragment of a conversation using Deep Learning. " 19 | 20 | EXAMPLE = "Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. Client: I dont know the Cardinals. Agent: Thank you verify your first and last name please. Client: Patricia Covington. Agent: How you spell your last name? Client: C O V I N G T O and. Agent: And you said your first name. Client: Letricia. Agent: L a T. Client: R I C I. Agent: Z. Agent: It's not pulling up anything C O N C I N G T O. Client: Know C O V as in Victor, Agent: S when? Client: I N G T O N E. Agent: I put the extra letter and I was wondering what I was doing wrong key verify your data birth for the reserve anson. Client: Uh huh made since 1995. Agent: Thing with this card last owner damage. Client: It was last. Agent: Thinking to verify your address we like a new cards remote out to. 
Client: 1918 Arlington avenue saint, Louis Missouri 63112 apartment a. Agent: You. Okay. Thank you Mrs. Could send him before? I cant see your car. I need to inform you that this call will be personally cancel excuse me. It will take three to five business days for your new card to arrive in the mail would you like him for counselors car now. Client: Yes maam. Agent: Thank you your card is now been council your my name is Alison team will be transferred to your new card you have 121 instead of benefits available and a dollar and 0.38 and cash benefits. Client: Okay. Thank you. I have you. Agent: Or anything else? I can assist you with today. Client: Know you have a good day. Agent: I was coming soon. Thank you for calling customer service and have a nice day. Client: Thank you, bye, bye. Thank you bye." 21 | # Add \n in any "." 22 | EXAMPLE = EXAMPLE.replace("Agent","\nAgent") 23 | EXAMPLE = EXAMPLE.replace("Client","\nClient")[1:] 24 | 25 | EXAMPLE_ES = """Agente: Gracias por llamar al servicio de atención al cliente. Mi nombre es Vanessa, en que te puedo ayudar. Cliente: Estaba llamando para ordenar una tarjeta nueva. Agente: Nos complace enviarle una tarjeta de reemplazo. Agente: cuales son los números de su tarjeta de 16 dígitos. Cliente: No conozco los numeros. Agente: Gracias, verifique su nombre y apellido, por favor. Cliente: Patricia Covington. Agente: ¿Cómo se escribe su apellido? Cliente: C O V I N G T O N. Agente: Y tu primer nombre. Cliente: Letricia. Agente: L a T. Cliente: R I C I. Agente: Z. Agente: No está sacando nada CONCLUYENDO. Cliente: es C O V, Cliente: I N G T O N E. Agente: Ahora funciona, verifica tus datos de nacimiento. Cliente: Uh huh hecho desde 1995. Agente: La cosa con esta tarjeta daña al último propietario. Cliente: Fue el último. Agente: Pensando en verificar su dirección, nos gustaría enviar una nueva tarjeta remota. Cliente: 1918 Arlington Avenue Saint, Louis Missouri 63112 apartamento a. 
Agente: Gracias Sra.Necesito informarle que esta llamada será cancelada personalmente disculpe. Su nueva tarjeta tardará de tres a cinco días hábiles en llegar por correo. Cliente: Si señora. Agente: Gracias su tarjeta ahora ha sido programada, mi nombre es Alison, su dinero será transferido a su nueva tarjeta tiene 121 en lugar de beneficios disponibles y un dólar y 0.38 y beneficios en efectivo. Cliente: Está bien. Gracias. Agente: ¿Hay algo más? En que pueda ayudarte el día de hoy. Cliente: Sepa que tenga un buen día. Agente: Gracias por llamar al servicio de atención al cliente y que tenga un buen día. Cliente: Gracias, adiós, adiós. Gracias adios.""" 26 | EXAMPLE_ES = EXAMPLE_ES.replace("Agente","\nAgente") 27 | EXAMPLE_ES = EXAMPLE_ES.replace("Cliente","\nCliente")[1:] 28 | 29 | 30 | st.set_page_config( 31 | page_title=Name_app, 32 | layout="wide", 33 | initial_sidebar_state="expanded" 34 | ) 35 | 36 | #---------------------------------------------------------------------------------------------------------------------- 37 | #---------------------------------------------------------------------------------------------------------------------- 38 | # sidebar information 39 | st.sidebar.markdown("

🤖

", unsafe_allow_html=True)
40 | st.sidebar.markdown("-----------------------------------")
41 | st.sidebar.markdown(description_app)
42 | st.sidebar.markdown("Made with 💙 by [juan-csv](https://github.com/juan-csv)")
43 |
44 | #CONTACT
45 | ########
46 | st.sidebar.markdown("-----------------------------------")
47 | expander = st.sidebar.expander('Contact', expanded=True)
48 | expander.write("I'd love your feedback :smiley: Want to collaborate? Develop a project? Find me on [LinkedIn](https://www.linkedin.com/in/juan-camilo-lopez-montes-125875105/)")
49 | #----------------------------------------------------------------------------------------------------------------------
50 | #----------------------------------------------------------------------------------------------------------------------
51 |
52 | # add title
53 | st.title(Name_app)
54 |
55 | # select language
56 | select_box_language = st.radio(label="Select language:", options=["English", "Spanish"], index=0)
57 |
58 | ########################################################################################################################
59 | # Upload file
60 | ########################################################################################################################
61 | uploaded_file = st.file_uploader("Choose a file (.wav or .txt)")
62 |
63 | if uploaded_file is not None:
64 |     print(f"uploaded_file: {uploaded_file.name}")
65 |     if uploaded_file.name.endswith(".txt"):
66 |         EXAMPLE = StringIO(uploaded_file.getvalue().decode("utf-8")).read()
67 |
68 |     elif uploaded_file.name.endswith(".wav"):
69 |         digital_audio = sr.AudioFile(uploaded_file)  # load the audio file
70 |         with digital_audio as source:
71 |             #audio1 = r.record(source,duration=8)  # take only the first 8 seconds
72 |             audio = r.record(source)  # read the whole audio file
73 |         with st.spinner(text="Creating transcript..."):
74 |             # select the language for the transcription
75 |             if select_box_language == "English":
76 |                 language = 
"en-US"
77 |             else:
78 |                 language = "es-CO"
79 |             EXAMPLE = r.recognize_google(audio, language=language)
80 |             #st.success("Done!!!")
81 |
82 | ########################################################################################################################
83 | ########################################################################################################################
84 |
85 |
86 | # divide interface in two columns
87 | col1, col2 = st.columns([6,4])
88 | if select_box_language == "English":
89 |     with col1:
90 |         # create input text
91 |         input_text = st.text_area("Input text:", value=EXAMPLE, height=500)
92 |
93 |     with col2:
94 |         with st.spinner(text="Creating summary..."):
95 |
96 |             # collect the results in a dict
97 |             res = dict()
98 |             res["ENGINE"] = ENGINE
99 |             total_price = 0
100 |             Res = ""
101 |             for MODE in LIST_MODE:
102 |                 response, price = f_dl.get_ingerence_GPT3(input_text, MODE, ENGINE, select_box_language)
103 |                 Res += f"{MODE} : {response} \n\n"
104 |                 res[MODE] = response
105 |                 total_price += price
106 |             res["PRICE"] = total_price
107 |             st.success("Done!!!")
108 |
109 |         st.markdown(f"**TITLE** : {res['TITLE']}", unsafe_allow_html=True)
110 |         st.markdown(f"**REASON FOR CUSTOMER CALL** : {res['REASON_FOR_CUSTOMER_CALL']}", unsafe_allow_html=True)
111 |         st.markdown(f"**SUMMARY** : {res['SUMMARY']}", unsafe_allow_html=True)
112 |         st.markdown(f"**TAGS** : {res['TAGS']}", unsafe_allow_html=True)
113 |         st.markdown(f"**SENTIMENT** : {res['SENTIMENT']}", unsafe_allow_html=True)
114 |         # show result
115 |         st.success(Res)
116 |
117 | elif select_box_language == "Spanish":
118 |     # change label select_box_language to Spanish
119 |     with col1:
120 |         # create input text
121 |         input_text = st.text_area("Ingresar texto:", value=EXAMPLE, height=500)
122 |
123 |     with col2:
124 |         with st.spinner(text="Creando resumen..."):
125 |
126 |             # collect the results in a dict
127 |             res = dict()
128 |             res["ENGINE"] = ENGINE
129 |             total_price = 0
130 |             Res = ""
131 |             for MODE 
in LIST_MODE:
132 |                 response, price = f_dl.get_ingerence_GPT3(input_text, MODE, ENGINE, select_box_language)
133 |                 Res += f"{MODE} : {response} \n\n"
134 |                 res[MODE] = response
135 |                 total_price += price
136 |             res["PRICE"] = total_price
137 |             st.success('Done!')
138 |
139 |
140 |         st.markdown(f"**TÍTULO** : {res['TITLE']}", unsafe_allow_html=True)
141 |         st.markdown(f"**MOTIVO DE LA LLAMADA DEL CLIENTE** : {res['REASON_FOR_CUSTOMER_CALL']}", unsafe_allow_html=True)
142 |         st.markdown(f"**RESUMEN** : {res['SUMMARY']}", unsafe_allow_html=True)
143 |         st.markdown(f"**CATEGORÍA** : {res['TAGS']}", unsafe_allow_html=True)
144 |         st.markdown(f"**SENTIMIENTO** : {res['SENTIMENT']}", unsafe_allow_html=True)
145 |         # show result
146 |         st.success(Res)
147 |
--------------------------------------------------------------------------------
/f_dl_secrets.py:
--------------------------------------------------------------------------------
1 | import re
2 | import openai
3 | # get configuration
4 | # used to get the number of tokens
5 | from transformers import GPT2Tokenizer
6 | # essentials
7 | import numpy as np
8 | import math
9 | # get secrets
10 | import streamlit as st
11 |
12 |
13 | # instantiate tokenizer & api
14 | openai.api_key = st.secrets["GPT3"]["OPENAI_API_KEY"]  #os.getenv("OPENAI_API_KEY")
15 | tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
16 |
17 | #--------------------------------------------------------------------------------
18 | #--------------------------------------------------------------------------------
19 | ## Reducing text
20 | def reduce_text_to_nearest_period(text, MAX_LEN):
21 |     """Truncate the text at the nearest period so that the result is always shorter than MAX_LEN.
22 |     """
23 |     dots_idx = np.array( find_words_endswith_dot(text) )
24 |     # select the dot positions smaller than MAX_LEN
25 |     best_dot_idx = dots_idx[ (dots_idx - MAX_LEN)<0 ]
26 |     if len(best_dot_idx) == 0:
27 |         reduce_text = text[:MAX_LEN]
28 |     else:
29 |         reduce_text = text[:best_dot_idx[-1]+1]
30 |     return reduce_text
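As a side note, the period-based truncation above can be sketched in isolation. The snippet below is a simplified, pure-Python sketch (without numpy) of what `find_words_endswith_dot` and `reduce_text_to_nearest_period` do together; the names `find_dot_positions` and `cut_at_last_period` are illustrative stand-ins, not part of this repository:

```python
def find_dot_positions(text):
    """Return the index of every '.' character in text."""
    return [i for i, ch in enumerate(text) if ch == "."]

def cut_at_last_period(text, max_len):
    """Truncate text at the last period before max_len characters,
    falling back to a hard cut when no period fits the budget."""
    dots = [i for i in find_dot_positions(text) if i < max_len]
    if not dots:
        return text[:max_len]
    return text[: dots[-1] + 1]

print(cut_at_last_period("One. Two. Three sentences here.", 12))  # -> "One. Two."
```

Cutting at a sentence boundary instead of a hard character cut keeps the fragment sent to the model grammatically complete.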
31 |
32 |
33 | # search the positions of the words that end with "."
34 | def find_words_endswith_dot(list_words):
35 |     pos = []
36 |     for i in range(len(list_words)):
37 |         if list_words[i].endswith("."):
38 |             pos.append(i)
39 |     return pos
40 | #--------------------------------------------------------------------------------
41 | #--------------------------------------------------------------------------------
42 | ## Reduce tokens before calling the gpt3 api
43 | def reduce_tokens_for_gpt3(input_text_gpt3):
44 |     """Reduce the token count so the text fits the gpt3 api limit.
45 |     """
46 |     # if the text has periods ---> reduce using periods
47 |     best_id_dot = 0
48 |     dots_idx = find_words_endswith_dot(input_text_gpt3)
49 |     for id in dots_idx:
50 |         number_tokens_text = get_number_of_tokens(input_text_gpt3[:id+1])
51 |         if number_tokens_text < st.secrets["GPT3"]["MAX_TOKENS"]:
52 |             best_id_dot = id
53 |
54 |     if best_id_dot != 0:
55 |         reduce_text = input_text_gpt3[:best_id_dot+1]
56 |         return reduce_text
57 |
58 |     else:  # if the text doesn't have periods ---> force the reduction word by word
59 |         reduce_text = ""
60 |         for word in input_text_gpt3.split(" "):
61 |             if get_number_of_tokens(reduce_text) < st.secrets["GPT3"]["MAX_TOKENS"] - 1:
62 |                 reduce_text = reduce_text + " " + word
63 |             else:
64 |                 break
65 |         return reduce_text
66 |
67 | def get_number_of_tokens(text):
68 |     return len(tokenizer(text)['input_ids'])
69 | #--------------------------------------------------------------------------------
70 | #--------------------------------------------------------------------------------
71 | # Main function
72 | def get_ingerence_GPT3(input_text, MODE, ENGINE, LANGUAGE="English"):
73 |     # the number of tokens must be less than st.secrets["GPT3"]["MAX_TOKENS"]
74 |     reduce_text = reduce_tokens_for_gpt3(input_text)
75 |
76 |     # define prompt
77 |
78 |     if LANGUAGE == "English":
79 |         command = st.secrets["FUNCTIONS"][MODE]["PROMPT"]
80 |     elif LANGUAGE == "Spanish":
81 |         command = st.secrets["FUNCTIONS"][MODE]["PROMPT_ES"]
82 | 
prompt = command[0] + reduce_text + command[1] 83 | 84 | response_raw = openai.Completion.create( 85 | engine= st.secrets["GPT3"]["ENGINE_MODEL"][ENGINE]["MODEL"], 86 | prompt=prompt, 87 | temperature=0, 88 | max_tokens=st.secrets["FUNCTIONS"][MODE]["MAX_TOKENS"], 89 | top_p=1, 90 | frequency_penalty=0, 91 | presence_penalty=0, 92 | #stop=["\n"], 93 | #n=3, 94 | #best_of=1, 95 | ) 96 | # get raw text 97 | response = response_raw["choices"][0]["text"] 98 | # clean blank space and brea lines 99 | response = response.replace("\n","").strip() 100 | # clean tags 101 | if MODE == "TAGS": 102 | response = response[:-1] if response.endswith(",") else response 103 | 104 | # get price for prediction 105 | tokens_1000 = math.ceil( get_number_of_tokens( reduce_text + " " + response_raw["choices"][0]["text"] )/1000 ) 106 | price = tokens_1000 * st.secrets["GPT3"]["ENGINE_MODEL"][ENGINE]["PRICE"] # USD 107 | 108 | return response, price 109 | 110 | if __name__ == "__main__": 111 | # inputs 112 | input_text = "Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. Client: I dont know the Cardinals. Agent: Thank you verify your first and last name please. Client: Patricia Covington. Agent: How you spell your last name? Client: C O V I N G T O and. Agent: And you said your first name. Client: Letricia. Agent: L a T. Client: R I C I. Agent: Z. Agent: It's not pulling up anything C O N C I N G T O. Client: Know C O V as in Victor, Agent: S when? Client: I N G T O N E. Agent: I put the extra letter and I was wondering what I was doing wrong key verify your data birth for the reserve anson. Client: Uh huh made since 1995. Agent: Thing with this card last owner damage. Client: It was last. Agent: Thinking to verify your address we like a new cards remote out to. 
Client: 1918 Arlington avenue saint, Louis Missouri 63112 apartment a. Agent: You. Okay. Thank you Mrs. Could send him before? I cant see your car. I need to inform you that this call will be personally cancel excuse me. It will take three to five business days for your new card to arrive in the mail would you like him for counselors car now. Client: Yes maam. Agent: Thank you your card is now been council your my name is Alison team will be transferred to your new card you have 121 instead of benefits available and a dollar and 0.38 and cash benefits. Client: Okay. Thank you. I have you. Agent: Or anything else? I can assist you with today. Client: Know you have a good day. Agent: I was coming soon. Thank you for calling customer service and have a nice day. Client: Thank you, bye, bye. Thank you bye." 113 | LIST_MODE = ["TITLE" ,"SUMMARY", "TAGS", "SENTIMENT"] 114 | # Select Engine ---> Ada, Babbage, Curie, Davinci 115 | ENGINE = "Davinci" 116 | 117 | 118 | # get result in json format 119 | res = dict() 120 | res["ENGINE"] = ENGINE 121 | total_price = 0 122 | for MODE in LIST_MODE: 123 | response, price = get_ingerence_GPT3(input_text, MODE, ENGINE) 124 | res[MODE] = response 125 | total_price += price 126 | res["PRICE"] = total_price 127 | 128 | 129 | # show result 130 | from pprint import pprint 131 | print(f"Input_text:\n{input_text}") 132 | print("--------------------------------------\n") 133 | pprint(res) -------------------------------------------------------------------------------- /f_utils.py: -------------------------------------------------------------------------------- 1 | from pydub import AudioSegment 2 | 3 | def mp3_to_wav(audio_file_path): 4 | sound = AudioSegment.from_mp3(audio_file_path) 5 | audio_file_path = audio_file_path.split('.')[0] + '.wav' 6 | sound.export(audio_file_path, format="wav") 7 | return audio_file_path 8 | 9 | #audio_file_path = mp3_to_wav("audio_data_donald_trump_fake_0ghm5Cqpfwk.mp3") 10 | #print(audio_file_path) 11 | 
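The per-request cost computed at the end of `get_ingerence_GPT3` in f_dl_secrets.py boils down to: round the total token count (prompt plus completion) up to whole 1000-token blocks, then multiply by the engine's per-1000-token rate. A minimal, self-contained sketch of that arithmetic; `estimate_price` and `price_per_1000` are hypothetical names, and the real per-engine rate is read from `st.secrets["GPT3"]["ENGINE_MODEL"][ENGINE]["PRICE"]`:

```python
import math

def estimate_price(n_prompt_tokens, n_completion_tokens, price_per_1000):
    """Round the total token count up to whole 1000-token blocks and price them."""
    blocks = math.ceil((n_prompt_tokens + n_completion_tokens) / 1000)
    return blocks * price_per_1000

# 850 prompt tokens + 120 completion tokens = 970 tokens -> billed as one block
print(estimate_price(850, 120, 0.06))  # -> 0.06
```

Because of the `ceil`, a 1001-token request costs the same as a 2000-token one under this scheme, so the estimate is an upper bound rather than an exact bill.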
-------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | openai==0.11.0 2 | #sentence-transformers==0.4.1.2 3 | transformers==4.10.0 4 | SpeechRecognition==3.8.1 -------------------------------------------------------------------------------- /test_samples/out.txt: -------------------------------------------------------------------------------- 1 | Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. Client: I dont know the Cardinals. Agent: Thank you verify your first and last name please. Client: Patricia Covington. Agent: How you spell your last name? Client: C O V I N G T O and. Agent: And you said your first name. Client: Letricia. Agent: L a T. Client: R I C I. Agent: Z. Agent: It's not pulling up anything C O N C I N G T O. Client: Know C O V as in Victor, Agent: S when? Client: I N G T O N E. Agent: I put the extra letter and I was wondering what I was doing wrong key verify your data birth for the reserve anson. Client: Uh huh made since 1995. Agent: Thing with this card last owner damage. Client: It was last. Agent: Thinking to verify your address we like a new cards remote out to. Client: 1918 Arlington avenue saint, Louis Missouri 63112 apartment a. Agent: You. Okay. Thank you Mrs. Could send him before? I cant see your car. I need to inform you that this call will be personally cancel excuse me. It will take three to five business days for your new card to arrive in the mail would you like him for counselors car now. Client: Yes maam. Agent: Thank you your card is now been council your my name is Alison team will be transferred to your new card you have 121 instead of benefits available and a dollar and 0.38 and cash benefits. Client: Okay. Thank you. I have you. 
Agent: Or anything else? I can assist you with today. Client: Know you have a good day. Agent: I was coming soon. Thank you for calling customer service and have a nice day. Client: Thank you, bye, bye. Thank you bye.
--------------------------------------------------------------------------------
/test_samples/out.wav:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/juan-csv/GPT3-text-summarization/7fa3ff1e4128977d094efa739e0a755902ec03fb/test_samples/out.wav
--------------------------------------------------------------------------------
/test_summary.ipynb:
--------------------------------------------------------------------------------
1 | {
2 |  "cells": [
3 |   {
4 |    "cell_type": "markdown",
5 |    "metadata": {},
6 |    "source": [
7 |     "## [Run async gpt3](https://github.com/OthersideAI/chronology)\n",
8 |     "## [examples summarization](https://beta.openai.com/docs/examples/summarization)\n",
9 |     "## [tokenizer](https://beta.openai.com/tokenizer)"
10 |    ]
11 |   },
12 |   {
13 |    "cell_type": "code",
14 |    "execution_count": 1,
15 |    "metadata": {},
16 |    "outputs": [],
17 |    "source": [
18 |     "#!pip install openai\n",
19 |     "import os\n",
20 |     "import openai"
21 |    ]
22 |   },
23 |   {
24 |    "cell_type": "code",
25 |    "execution_count": 2,
26 |    "metadata": {},
27 |    "outputs": [],
28 |    "source": [
29 |     "OPENAI_API_KEY = os.getenv(\"OPENAI_API_KEY\")  # hard-coded key removed; read it from the environment\n",
30 |     "openai.api_key = OPENAI_API_KEY"
31 |    ]
32 |   },
33 |   {
34 |    "cell_type": "code",
35 |    "execution_count": 3,
36 |    "metadata": {},
37 |    "outputs": [],
38 |    "source": [
39 |     "# summarization\n",
40 |     "def summary_gpt3(input_text):\n",
41 |     "    response = openai.Completion.create(\n",
42 |     "        engine=\"davinci\",\n",
43 |     "        #prompt= input_text + \"\\n\\ntl;dr:\",\n",
44 |     "        prompt= input_text + \"\\n\\nOne-sentence summary:\",\n",
45 |     "        temperature=0,\n",
46 |     "        max_tokens=128,\n",
47 |     "        top_p=1,\n",
48 |     "
frequency_penalty=0,\n",
49 |     "        presence_penalty=0,\n",
50 |     "        stop=[\"\\n\"],\n",
51 |     "        #n=3,\n",
52 |     "        #best_of=1,\n",
53 |     "    )\n",
54 |     "    return response[\"choices\"][0][\"text\"]\n",
55 |     "\n",
56 |     "# Title\n",
57 |     "def title_gpt3(input_text):\n",
58 |     "    response = openai.Completion.create(\n",
59 |     "        engine=\"davinci\",\n",
60 |     "        prompt= input_text + \"\\n\\nheadline:\",\n",
61 |     "        temperature=0,\n",
62 |     "        max_tokens=64,\n",
63 |     "        top_p=1,\n",
64 |     "        frequency_penalty=0,\n",
65 |     "        presence_penalty=0,\n",
66 |     "        stop=[\"\\n\"],\n",
67 |     "        #n=3,\n",
68 |     "        #best_of=1,\n",
69 |     "    )\n",
70 |     "    return response[\"choices\"][0][\"text\"]\n",
71 |     "# topics:\n",
72 |     "def topics_gpt3(input_text):\n",
73 |     "    response = openai.Completion.create(\n",
74 |     "        engine=\"davinci\",\n",
75 |     "        prompt= input_text + \"\\n\\nKeywords & topics:\",\n",
76 |     "        temperature=0, \n",
77 |     "        max_tokens=16,\n",
78 |     "        top_p=1,\n",
79 |     "        frequency_penalty=0,\n",
80 |     "        presence_penalty=0,\n",
81 |     "        stop=[\"\\n\"],\n",
82 |     "        #n=3,\n",
83 |     "        #best_of=1,\n",
84 |     "    )\n",
85 |     "    return response[\"choices\"][0][\"text\"]"
86 |    ]
87 |   },
88 |   {
89 |    "cell_type": "markdown",
90 |    "metadata": {},
91 |    "source": [
92 |     "# Try with instruction GPT3"
93 |    ]
94 |   },
95 |   {
96 |    "cell_type": "code",
97 |    "execution_count": 4,
98 |    "metadata": {},
99 |    "outputs": [],
100 |    "source": [
101 |     "# summarization\n",
102 |     "def summary_gpt3(input_text):\n",
103 |     "    response = openai.Completion.create(\n",
104 |     "        engine=\"davinci-instruct-beta-v3\",\n",
105 |     "        prompt=f\"Given a transcript, get summary:\\n\\nTranscript: {input_text}\\n\\nSummary:\",\n",
106 |     "        temperature=0,\n",
107 |     "        max_tokens=128,\n",
108 |     "        top_p=1,\n",
109 |     "        frequency_penalty=0,\n",
110 |     "        presence_penalty=0,\n",
111 |     "        #stop=[\"\\n\"],\n",
112 |     "        #n=3,\n",
113 |     "        #best_of=1,\n",
114 |     "    )\n",
115 |     "    summary = response[\"choices\"][0][\"text\"]\n",
116 |     "    return summary.replace(\"\\n\",\"\").strip()\n",
117 |     "\n",
118 |     "# Title\n",
119 |     "def title_gpt3(input_text):\n",
120 |     "    response = openai.Completion.create(\n",
121 |     "        engine=\"davinci-instruct-beta-v3\",\n",
122 |     "        prompt=f\"Given a transcript, get headline:\\n\\nTranscript: {input_text}\\n\\nHeadline:\",\n",
123 |     "        temperature=0,\n",
124 |     "        max_tokens=32,\n",
125 |     "        top_p=1,\n",
126 |     "        frequency_penalty=0,\n",
127 |     "        presence_penalty=0,\n",
128 |     "        #stop=[\"\\n\"],\n",
129 |     "        #n=3,\n",
130 |     "        #best_of=1,\n",
131 |     "    )\n",
132 |     "    title = response[\"choices\"][0][\"text\"]\n",
133 |     "    return title.replace(\"\\n\",\"\").strip()\n",
134 |     "\n",
135 |     "# topics:\n",
136 |     "def topics_gpt3(input_text):\n",
137 |     "    response = openai.Completion.create(\n",
138 |     "        engine=\"davinci-instruct-beta-v3\",\n",
139 |     "        prompt=f\"Given a transcript, get tags:\\n\\nTranscript: {input_text}\\n\\nTags:\",\n",
140 |     "        temperature=0, \n",
141 |     "        max_tokens=20,\n",
142 |     "        top_p=1,\n",
143 |     "        frequency_penalty=0,\n",
144 |     "        presence_penalty=0,\n",
145 |     "        #stop=[\"\\n\"],\n",
146 |     "        #n=3,\n",
147 |     "        #best_of=1,\n",
148 |     "    )\n",
149 |     "    topics = response[\"choices\"][0][\"text\"]\n",
150 |     "    topics = topics[:-1] if topics.endswith(\",\") else topics\n",
151 |     "    return topics.replace(\"\\n\",\"\").strip()\n",
152 |     "\n",
153 |     "def sentiment_gpt3(input_text):\n",
154 |     "    response = openai.Completion.create(\n",
155 |     "        engine=\"davinci-instruct-beta-v3\",\n",
156 |     "        prompt=f\"Given a transcript, get sentiment:\\n\\nTranscript: {input_text}\\n\\nSentiment:\",\n",
157 |     "        temperature=0, \n",
158 |     "        max_tokens=20,\n",
159 |     "        top_p=1,\n",
160 |     "        frequency_penalty=0,\n",
161 |     "        presence_penalty=0,\n",
162 |     "        #stop=[\"\\n\"],\n",
163 |     "        #n=3,\n",
164 |     "        #best_of=1,\n",
165 |     "    )\n",
166 |     "    sentiment = response[\"choices\"][0][\"text\"]\n",
167 |     "    sentiment = sentiment[:-1] if sentiment.endswith(\",\") else sentiment\n",
168 |     "    return sentiment.replace(\"\\n\",\"\").strip()"
169 |    ]
170 |   },
171 |   {
172 |    "cell_type": "code",
173 | 
"execution_count": 5, 174 | "metadata": {}, 175 | "outputs": [ 176 | { 177 | "ename": "NameError", 178 | "evalue": "name 'input_text' is not defined", 179 | "output_type": "error", 180 | "traceback": [ 181 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 182 | "\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)", 183 | "\u001b[0;32m/var/folders/t4/c2kdpvnj6r793w3k056dkc640000gp/T/ipykernel_846/1150659228.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 54\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 55\u001b[0m \u001b[0mcommand\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m\"Given a transcript, get summary:\\n\\nTranscript: \"\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m\"\\n\\nSummary:\"\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 56\u001b[0;31m \u001b[0minference\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput_text\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcommand\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmode\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m\"summary\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", 184 | "\u001b[0;31mNameError\u001b[0m: name 'input_text' is not defined" 185 | ] 186 | } 187 | ], 188 | "source": [ 189 | "# summarization\n", 190 | "def inference(input_text, command, mode=\"summary\"):\n", 191 | "\n", 192 | " prompt = command[0] + input_text + command[1] \n", 193 | " try:\n", 194 | " response = openai.Completion.create(\n", 195 | " engine= \"davinci-instruct-beta-v3\", #self.ENGINE@\" #\"davinci-instruct-beta-v3\", #\"ada-instruct-beta\",\n", 196 | " prompt=prompt,\n", 197 | " temperature=0,\n", 198 | " max_tokens=130, #self.MAX_LEN_DESCRIPTION+80,\n", 199 | " top_p=1,\n", 200 | " frequency_penalty=0,\n", 201 | " presence_penalty=0,\n", 202 | " #stop=[\"\\n\"],\n", 203 | " #n=3,\n", 204 | " #best_of=1,\n", 205 | " )\n", 206 | " 
except openai.error.InvalidRequestError:\n",
207 |     "        # the prompt was too long: cut input_text at its middle period and retry\n",
208 |     "        periods = [i for i, x in enumerate(input_text) if x == \".\"]\n",
209 |     "        period = periods[len(periods)//2]\n",
210 |     "        input_text = input_text[:period+1]\n",
211 |     "        \n",
212 |     "        prompt = command[0] + input_text + command[1]\n",
213 |     "\n",
214 |     "        response = openai.Completion.create(\n",
215 |     "            engine= \"davinci-instruct-beta-v3\",  # alternative: \"ada-instruct-beta\"\n",
216 |     "            prompt=prompt,\n",
217 |     "            temperature=0,\n",
218 |     "            max_tokens=130, #self.MAX_LEN_DESCRIPTION+80,\n",
219 |     "            top_p=1,\n",
220 |     "            frequency_penalty=0,\n",
221 |     "            presence_penalty=0,\n",
222 |     "            #stop=[\"\\n\"],\n",
223 |     "            #n=3,\n",
224 |     "            #best_of=1,\n",
225 |     "        )\n",
226 |     "    \n",
227 |     "    if mode == \"summary\":\n",
228 |     "        summary = response[\"choices\"][0][\"text\"]\n",
229 |     "        return summary.replace(\"\\n\",\"\").strip()\n",
230 |     "    elif mode == \"title\":\n",
231 |     "        title = response[\"choices\"][0][\"text\"]\n",
232 |     "        return title.replace(\"\\n\",\"\").strip()\n",
233 |     "    elif mode == \"topics\":\n",
234 |     "        topics = response[\"choices\"][0][\"text\"]\n",
235 |     "        topics = topics[:-1] if topics.endswith(\",\") else topics\n",
236 |     "        return topics.replace(\"\\n\",\"\").strip()\n",
237 |     "    elif mode == \"sentiment\":\n",
238 |     "        sentiment = response[\"choices\"][0][\"text\"]\n",
239 |     "        return sentiment.replace(\"\\n\",\"\").strip()\n",
240 |     "\n",
241 |     "\n",
242 |     "\n",
243 |     "command = [\"Given a transcript, get summary:\\n\\nTranscript: \", \"\\n\\nSummary:\"]\n",
244 |     "inference(input_text, command, mode=\"summary\")"
245 |    ]
246 |   },
247 |   {
248 |    "cell_type": "code",
249 |    "execution_count": 7,
250 |    "metadata": {},
251 |    "outputs": [],
252 |    "source": [
253 |     "segments = [\"\\\"Can you taste words?\\\" It was a question\\nthat caught me by surprise. 
This summer, I was giving a talk\\nat a literary festival, and afterwards, as I was signing books, a teenage girl came with her friend, and this is what she asked me. I told her that some people\\nexperience an overlap in their senses so that they could hear colors or see sounds, and many writers were fascinated\\nby this subject, myself included. But she cut me off, a bit impatiently,\\nand said, \\\"Yeah, I know all of that. It's called synesthesia. We learned it at school. But my mom is reading your book, and she says there's lots\\nof food and ingredients and a long dinner scene in it. She gets hungry at every page. So I was thinking, how come you don't\\nget hungry when you write? And I thought maybe,\\nmaybe you could taste words. Does it make sense?\\\" And, actually, it did make sense, because ever since my childhood, each letter in the alphabet\\nhas a different color, and colors bring me flavors. So for instance, the color purple\\nis quite pungent, almost perfumed, and any words that I associate with purple taste the same way, such as \\\"sunset\\\" -- a very spicy word. But I was worried that if I tell\\nall of this to the teenager, it might sound either too abstract or perhaps too weird, and there wasn't enough time anyhow, because people were waiting in the queue, so it suddenly felt like\\nwhat I was trying to convey was more complicated and detailed than what the circumstances\\nallowed me to say. And I did what I usually do\\nin similar situations: I stammered, I shut down,\\nand I stopped talking. I stopped talking because\\nthe truth was complicated, even though I knew, deep within, that one should never, ever\\nremain silent for fear of complexity. So I want to start my talk today with the answer that I was not able\\nto give on that day. Yes, I can taste words -- sometimes, that is, not always, and happy words have\\na different flavor than sad words. 
I like to explore: What does\\nthe word \\\"creativity\\\" taste like, or \\\"equality,\\\" \\\"love,\\\" \\\"revolution?\\\" And what about \\\"motherland?\\\" These days, it's particularly\\nthis last word that troubles me. It leaves a sweet taste on my tongue, like cinnamon, a bit of rose water and golden apples. But underneath, there's a sharp tang, like nettles and dandelion. The taste of my motherland, Turkey, is a mixture of sweet and bitter. And the reason why I'm telling you this is because I think\\nthere's more and more people all around the world today who have similarly mixed emotions about the lands they come from. We love our native countries, yeah? How can we not? We feel attached to the people,\\nthe culture, the land, the food. And yet at the same time, we feel increasingly frustrated\\nby its politics and politicians, sometimes to the point\\nof despair or hurt or anger. I want to talk about emotions and the need to boost\\nour emotional intelligence. I think it's a pity that mainstream political theory\\npays very little attention to emotions. Oftentimes, analysts and experts\\nare so busy with data and metrics that they seem to forget\\nthose things in life that are difficult to measure and perhaps impossible to cluster\\nunder statistical models. But I think this is a mistake,\\nfor two main reasons. Firstly, because we are emotional beings. As human beings,\\nI think we all are like that. But secondly, and this is new, we have entered\\na new stage in world history in which collective sentiments\\nguide and misguide politics more than ever before.\", \"Ours is the age of anxiety, anger, distrust, resentment and, I think, lots of fear. But here's the thing: even though there's plenty of research\\nabout economic factors, there's relatively few studies\\nabout emotional factors. Why is it that we underestimate\\nfeelings and perceptions? 
I think it's going to be one\\nof our biggest intellectual challenges, because our political systems\\nare replete with emotions. In country after country, we have seen illiberal politicians\\nexploiting these emotions. And yet within the academia\\nand among the intelligentsia, we are yet to take emotions seriously. I think we should. And just like we should focus\\non economic inequality worldwide, we need to pay more attention\\nto emotional and cognitive gaps worldwide and how to bridge these gaps, because they do matter.\", \"And at some point\\nin our exchange, she said, \\\"I understand why you're a feminist, because, you know, you live in Turkey.\\\" And I said to her, \\\"I don't understand\\nwhy you're not a feminist, because, you know, you live in America.\\\" (Laughter) (Applause) And she laughed. She took it as a joke, and the moment passed. (Laughter) But the way she had divided the world\\ninto two imaginary camps, into two opposite camps -- it bothered me and it stayed with me.\", \"They were like choppy waters\\nnot yet settled. Some other parts of the world,\\nnamely the West, were solid, safe and stable. So it was the liquid lands\\nthat needed feminism and activism and human rights, and those of us who were\\nunfortunate enough to come from such places had to keep struggling\\nfor these most essential values. But there was hope. Since history moved forward, even the most unsteady lands\\nwould someday catch up. And meanwhile, the citizens of solid lands could take comfort\\nin the progress of history and in the triumph of the liberal order. They could support the struggles\\nof other people elsewhere, but they themselves\\ndid not have to struggle for the basics of democracy anymore, because they were beyond that stage. I think in the year 2016, this hierarchical geography\\nwas shattered to pieces. Our world no longer follows\\nthis dualistic pattern in the scholar's mind, if it ever did. 
Now we know that history\\ndoes not necessarily move forward. Sometimes it draws circles, even slides backwards, and that generations\\ncan make the same mistakes that their great-grandfathers had made. And now we know that there's no such thing as solid countries\\nversus liquid countries. In fact, we are all living\\nin liquid times, just like the late Zygmunt Bauman told us. And Bauman had another\\ndefinition for our age. He used to say we are all going\\nto be walking on moving sands. And if that's the case, I think, it should concern us women more than men, because when societies\\nslide backwards into authoritarianism, nationalism or religious fanaticism, women have much more to lose. That is why this needs\\nto be a vital moment, not only for global activism, but in my opinion,\\nfor global sisterhood as well. (Applause) But I want to make a little confession\\nbefore I go any further. Until recently, whenever I took part in\\nan international conference or festival, I would be usually one\\nof the more depressed speakers. (Laughter) Having seen how our dreams of democracy\\nand how our dreams of coexistence were crushed in Turkey, both gradually but also\\nwith a bewildering speed, over the years I've felt\\nquite demoralized. And at these festivals there would be\\nsome other gloomy writers, and they would come from places\\nsuch as Egypt, Nigeria, Pakistan, Bangladesh, Philippines,\\nChina, Venezuela, Russia. And we would smile\\nat each other in sympathy, this camaraderie of the doomed. (Laughter) And you could call us WADWIC: Worried and Depressed\\nWriters International Club.\", \"I remember -- (Laughter) I remember Greek writers and poets\\njoined first, came on board. And then writers from Hungary and Poland, and then, interestingly, writers\\nfrom Austria, the Netherlands, France, and then writers from the UK,\\nwhere I live and where I call my home, and then writers from the USA. 
Suddenly, there were more of us feeling worried about\\nthe fate of our nations and the future of the world. And maybe there were more of us now feeling like strangers\\nin our own motherlands. And then this bizarre thing happened. Those of us who used to be\\nvery depressed for a long time, we started to feel less depressed, whereas the newcomers,\\nthey were so not used to feeling this way that they were now even more depressed. (Laughter) So you could see writers\\nfrom Bangladesh or Turkey or Egypt trying to console their colleagues from Brexit Britain\\nor from post-election USA. (Laughter) But joking aside, I think our world is full\\nof unprecedented challenges, and this comes with an emotional backlash, because in the face of high-speed change, many people wish to slow down, and when there's too much unfamiliarity, people long for the familiar.\", \"This is a very dangerous crossroads, because it's exactly where the demagogue\\nenters into the picture. The demagogue understands\\nhow collective sentiments work and how he -- it's usually a he --\\ncan benefit from them. He tells us that we all\\nbelong in our tribes, and he tells us that we will be safer\\nif we are surrounded by sameness.\", \"This could be the eccentric leader\\nof a marginal political party somewhere in Europe, or an Islamist extremist imam\\npreaching dogma and hatred, or it could be a white supremacist\\nNazi-admiring orator somewhere else. All these figures, at first glance --\\nthey seem disconnected. But I think they feed each other, and they need each other. And all around the world, when we look at how demagogues talk\\nand how they inspire movements, I think they have one\\nunmistakable quality in common: they strongly, strongly dislike plurality. They cannot deal with multiplicity. 
Adorno used to say, \\\"Intolerance of ambiguity is the sign\\nof an authoritarian personality.\\\" But I ask myself: What if that same sign, that same intolerance of ambiguity -- what if it's the mark of our times,\\nof the age we're living in? Because wherever I look,\\nI see nuances withering away.\", \"Yeah? It's good ratings. It's even better\\nif they shout at each other. Even in academia, where our intellect\\nis supposed to be nourished, you see one atheist scholar\\ncompeting with a firmly theist scholar, but it's not a real intellectual exchange, because it's a clash\\nbetween two certainties. I think binary oppositions are everywhere. So slowly and systematically, we are being denied the right\\nto be complex.\", \"And when you express your sorrow,\\nand when you react against the cruelty, you get all kinds of reactions, messages on social media. But one of them is quite disturbing, only because it's so widespread. They say, \\\"Why do you feel sorry for them? Why do you feel sorry for them? Why don't you feel sorry\\nfor civilians in Yemen or civilians in Syria?\\\" And I think the people\\nwho write such messages do not understand that we can feel sorry for\\nand stand in solidarity with victims of terrorism and violence\\nin the Middle East, in Europe, in Asia, in America, wherever, everywhere, equally and simultaneously. They don't seem to understand\\nthat we don't have to pick one pain and one place over all others. But I think this is what\\ntribalism does to us. It shrinks our minds, for sure, but it also shrinks our hearts, to such an extent that we become numb\\nto the suffering of other people. And the sad truth is,\\nwe weren't always like this.\", \"I went to many primary schools, which gave me a chance to observe\\nyounger kids in Turkey. And it was always amazing to see\\nhow much empathy, imagination and chutzpah they have. These children are much more inclined\\nto become global citizens than nationalists at that age. 
And it's wonderful to see,\\nwhen you ask them, so many of them want\\nto be poets and writers, and girls are just as confident as boys, if not even more. But then I would go to high schools, and everything has changed. Now nobody wants to be a writer anymore, now nobody wants to be a novelist anymore, and girls have become timid, they are cautious, guarded, reluctant to speak up in the public space, because we have taught them -- the family, the school, the society -- we have taught them\\nto erase their individuality. I think East and West, we are losing multiplicity, both within our societies\\nand within ourselves. And coming from Turkey,\\nI do know that the loss of diversity is a major, major loss.\", \"And I also believe that what happened\\nover there in Turkey can happen anywhere. It can even happen here. So just like solid countries\\nwas an illusion, singular identities is also an illusion, because we all have\\na multiplicity of voices inside. The Iranian, the Persian poet, Hafiz, used to say, \\\"You carry in your soul\\nevery ingredient necessary to turn your existence into joy. All you have to do\\nis to mix those ingredients.\\\" And I think mix we can. I am an Istanbulite, but I'm also attached to the Balkans, the Aegean, the Mediterranean, the Middle East, the Levant. I am a European by birth, by choice, the values that I uphold. I have become a Londoner over the years. I would like to think of myself\\nas a global soul, as a world citizen, a nomad and an itinerant storyteller. I have multiple attachments,\\njust like all of us do.\", \"As writers, we always\\nchase stories, of course, but I think we are also\\ninterested in silences, the things we cannot talk about, political taboos, cultural taboos. We're also interested in our own silences. I have always been very vocal\\nabout and written extensively about minority rights, women's rights, LGBT rights. 
But as I was thinking about this TED Talk, I realized one thing: I have never had the courage\\nto say in a public space that I was bisexual myself, because I so feared the slander and the stigma and the ridicule and the hatred that was sure to follow. But of course, one should never,\\never, remain silent for fear of complexity. (Applause) And although I am\\nno stranger to anxieties, and although I am talking here\\nabout the power of emotions -- I do know the power of emotions -- I have discovered over time that emotions are not limitless. You know? They have a limit. There comes a moment -- it's like a tipping point\\nor a threshold -- when you get tired of feeling afraid, when you get tired of feeling anxious. And I think not only individuals, but perhaps nations, too,\\nhave their own tipping points. So even stronger than my emotions is my awareness that not only gender, not only identity, but life itself is fluid. They want to divide us into tribes, but we are connected across borders. They preach certainty, but we know that life has plenty of magic and plenty of ambiguity. And they like to incite dualities, but we are far more nuanced than that. So what can we do? I think we need to go back to the basics, back to the colors of the alphabet. The Lebanese poet\\nKhalil Gibran used to say, \\\"I learned silence from the talkative and tolerance from the intolerant and kindness from the unkind.\\\" I think it's a great motto for our times. So from populist demagogues, we will learn the indispensability of democracy. And from isolationists, we will learn\\nthe need for global solidarity. And from tribalists, we will learn\\nthe beauty of cosmopolitanism and the beauty of diversity. As I finish, I want to leave you\\nwith one word, or one taste. 
The word \\\"yurt\\\" in Turkish\\nmeans \\\"motherland.\\\" It means \\\"homeland.\\\" But interestingly, the word also means \\\"a tent used by nomadic tribes.\\\" And I like that combination,\\nbecause it makes me think homelands do not need\\nto be rooted in one place. They can be portable. We can take them with us everywhere. And I think for writers, for storytellers, at the end of the day, there is one main homeland, and it's called \\\"Storyland.\\\" And the taste of that word is the taste of freedom. Thank you. (Applause)\"]" 254 | ] 255 | }, 256 | { 257 | "cell_type": "code", 258 | "execution_count": 8, 259 | "metadata": {}, 260 | "outputs": [], 261 | "source": [ 262 | "#input_text = \"\"\"Jupiter is the fifth planet from the Sun and the largest in the Solar System. It is a gas giant with a mass one-thousandth that of the Sun, but two-and-a-half times that of all the other planets in the Solar System combined. Jupiter is one of the brightest objects visible to the naked eye in the night sky, and has been known to ancient civilizations since before recorded history. It is named after the Roman god Jupiter.[19] When viewed from Earth, Jupiter can be bright enough for its reflected light to cast visible shadows,[20] and is on average the third-brightest natural object in the night sky after the Moon and Venus.\\n\\nJupiter is primarily composed of hydrogen with a quarter of its mass being helium, though helium comprises only about a tenth of the number of molecules. It may also have a rocky core of heavier elements,[21] but like the other giant planets, Jupiter lacks a well-defined solid surface. Because of its rapid rotation, the planet's shape is that of an oblate spheroid (it has a slight but noticeable bulge around the equator).\"\"\"\n", 263 | "input_text = segments[0]\n", 264 | "input_text =\"Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. 
Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. Client: I dont know the Cardinals. Agent: Thank you verify your first and last name please. Client: Patricia Covington. Agent: How you spell your last name? Client: C O V I N G T O and. Agent: And you said your first name. Client: Letricia. Agent: L a T. Client: R I C I. Agent: Z. Agent: It's not pulling up anything C O N C I N G T O. Client: Know C O V as in Victor, Agent: S when? Client: I N G T O N E. Agent: I put the extra letter and I was wondering what I was doing wrong key verify your data birth for the reserve anson. Client: Uh huh made since 1995. Agent: Thing with this card last owner damage. Client: It was last. Agent: Thinking to verify your address we like a new cards remote out to. Client: 1918 Arlington avenue saint, Louis Missouri 63112 apartment a. Agent: You. Okay. Thank you Mrs. Could send him before? I cant see your car. I need to inform you that this call will be personally cancel excuse me. It will take three to five business days for your new card to arrive in the mail would you like him for counselors car now. Client: Yes maam. Agent: Thank you your card is now been council your my name is Alison team will be transferred to your new card you have 121 instead of benefits available and a dollar and 0.38 and cash benefits. Client: Okay. Thank you. I have you. Agent: Or anything else? I can assist you with today. Client: Know you have a good day. Agent: I was coming soon. Thank you for calling customer service and have a nice day. Client: Thank you, bye, bye. Thank you bye.\"\n", 265 | "#input_text = \"\"\"Computing VRA are and AI workloads but having another viable GPU out there. To do GPU things in a world that doesn't have enough. Gpus does seem like a good GPU thing. And we do have a vague idea of the potential performance and I mean very vague but it is something thanks to German website Golem dot. D e, who uncovered that Feng Hua number ones. 
Chip based design is built on underlying technology from powervr. And powervr is owned by British company. Imagination Technologies who was acquired by a Chinese investor group called Canyon Bridge, Capital Partners in 2017, Canyon Bridge puts imagination in touch within 0, silicon and there. Go, the best-known powervr, GPU configuration right now would push about six teraflops FP 32, though, in terms of raw performance, which would very theoretically put it in the same league as an Nvidia, r-tx, 2060 or a Radeon RX 580. Speaking of the r-tx 2060. It really does sound like, Nvidia will be bringing it out of retirement. For one more round with a doubled up memory allotment increasing from six to twelve gigabytes of gddr5 6. The r-tx 2060 originally launched in January, 2019, that magical time when GPU Jews were plentiful and there were at least slightly fewer justifications for your chronic existential. Dread the 12 gig 2060 seems all but assured now though as the blabbermouths at the Eurasian economic commission, have once again, broadcasted early info for gigabyte model numbers for cards that they recently registered indicating that the 12 NM gpus are in production. Once again, it's still unknown which die variant. It would ship with as there are several that would fit including the TU 10 6-4. 10 from the 20. 60 super that has more Cuda, cores. Hope is that these might not just sell out at MSRP like every other GPU but r-tx 2060. Mining profitability is still decent and used six gig cards or selling for five to six hundred dollars. So don't get your hopes up still, any available GPU in the budget range would be better than nothing or better than gt7. Tens and 7 30s, which are worse than nothing because of the terrible curse, that they carry the unconcerned announcement date for the 12. Gig 2060 is December 7th for now, but that's not all Nvidia has another bone to throw. Hungry, PC Gamers and it's called Nvidia image scaling which isn't DLS, s. 
But kind of does a similar Thing by making frames rendered at lower resolutions. Look better, allowing for better frame rates. It is actually just an upgrade to Nvidia has existing image scalar. So it's more like, amd's Fidelity, FX super resolution and it's going to be open source, like am DFS are too. Hey, what a coincidence that means in video image. Scaling could also work on AMD and Intel gpus and sure maybe even the fenghuang number one to eventually. So even if this Just Nvidia saying hey we can do FS are too, it's another tool Gamers can use to eke a few more frames out of their gpus which will be, especially welcome to those who are still doing their best to get by with aging graphics cards. Those of you who are struggling through these dark times with nothing but a Radeon HD 6850 or a GTX 970 3.5 gig to support your gaming needs, can at least take solace in the fact that GPU manufacturers themselves are making huge steaming, but loads of cash mountains of money, a veritable, golden shower of actual gold bars case in point on Wednesday and video, CEO. And man, who I really hope signs the adoption papers. I sent. Tim Jensen Wang revealed their revenue for the last quarter and all-time record, 7.1 billion dollars up 50% year-over-year and also probably too much money. I'm expecting the realize that soon and announced partial refunds for anyone who has bought an overpriced GPU in the past year. Last line was a joke just to be clear. Please don't contact me because Nvidia won't give you money. If you're wondering where in video will go from here, though. Don't worry. 
The next GPU challenge, has already arrived in the form of these.\"\"\"" 266 | ] 267 | }, 268 | { 269 | "cell_type": "code", 270 | "execution_count": 9, 271 | "metadata": {}, 272 | "outputs": [ 273 | { 274 | "data": { 275 | "text/plain": [ 276 | "' Positive'" 277 | ] 278 | }, 279 | "execution_count": 9, 280 | "metadata": {}, 281 | "output_type": "execute_result" 282 | } 283 | ], 284 | "source": [ 285 | "response = openai.Completion.create(\n", 286 | " engine=\"davinci-instruct-beta-v3\",\n", 287 | " prompt=f\"Given a transcript, classify it into one of 3 categories: Positive, Negative, Neutral:\\n\\nTranscript: {input_text}\\n\\nSentiment:\",\n", 288 | " temperature=0, \n", 289 | " max_tokens=20,\n", 290 | " top_p=1,\n", 291 | " frequency_penalty=0,\n", 292 | " presence_penalty=0,\n", 293 | " #stop=[\"\\n\"],\n", 294 | " #n=3,\n", 295 | " #best_of=1,\n", 296 | ")\n", 297 | "response[\"choices\"][0][\"text\"]\n" 298 | ] 299 | }, 300 | { 301 | "cell_type": "code", 302 | "execution_count": 7, 303 | "metadata": {}, 304 | "outputs": [ 305 | { 306 | "name": "stdout", 307 | "output_type": "stream", 308 | "text": [ 309 | "Input text: \n", 310 | "Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. Client: I dont know the Cardinals. Agent: Thank you verify your first and last name please. Client: Patricia Covington. Agent: How you spell your last name? Client: C O V I N G T O and. Agent: And you said your first name. Client: Letricia. Agent: L a T. Client: R I C I. Agent: Z. Agent: It's not pulling up anything C O N C I N G T O. Client: Know C O V as in Victor, Agent: S when? Client: I N G T O N E. Agent: I put the extra letter and I was wondering what I was doing wrong key verify your data birth for the reserve anson. Client: Uh huh made since 1995. Agent: Thing with this card last owner damage. 
Client: It was last. Agent: Thinking to verify your address we like a new cards remote out to. Client: 1918 Arlington avenue saint, Louis Missouri 63112 apartment a. Agent: You. Okay. Thank you Mrs. Could send him before? I cant see your car. I need to inform you that this call will be personally cancel excuse me. It will take three to five business days for your new card to arrive in the mail would you like him for counselors car now. Client: Yes maam. Agent: Thank you your card is now been council your my name is Alison team will be transferred to your new card you have 121 instead of benefits available and a dollar and 0.38 and cash benefits. Client: Okay. Thank you. I have you. Agent: Or anything else? I can assist you with today. Client: Know you have a good day. Agent: I was coming soon. Thank you for calling customer service and have a nice day. Client: Thank you, bye, bye. Thank you bye.\n", 311 | "--------------------------------------------------------------------------------\n", 312 | "--------------------------------------------------------------------------------\n", 313 | "\n", 314 | "Summary: \n", 315 | "The customer called to order a new card, but was having trouble because they didn't know their card number. 
The agent helped them to verify their information and then sent them a new card.\n", 316 | "--------------------------------------------------------------------------------\n", 317 | "Title: \n", 318 | "Vanessa Helps Patricia Replace Damaged Card\n", 319 | "--------------------------------------------------------------------------------\n", 320 | "Tags: \n", 321 | "transcription, customer service, order, replacement card, 16 digit card number, Patricia C\n", 322 | "--------------------------------------------------------------------------------\n", 323 | "--------------------------------------------------------------------------------\n", 324 | "Sentiment: \n", 325 | "positive, happy, helpful\n", 326 | "--------------------------------------------------------------------------------\n" 327 | ] 328 | } 329 | ], 330 | "source": [ 331 | "print(f\"Input text: \\n{input_text}\")\n", 332 | "\n", 333 | "print(\"--------\"*10)\n", 334 | "print(\"--------\"*10+\"\\n\")\n", 335 | "\n", 336 | "summary = summary_gpt3(input_text)\n", 337 | "print(f\"Summary: \\n{summary}\")\n", 338 | "\n", 339 | "print(\"--------\"*10)\n", 340 | "\n", 341 | "title = title_gpt3(input_text)\n", 342 | "print(f\"Title: \\n{title}\")\n", 343 | "\n", 344 | "print(\"--------\"*10)\n", 345 | "\n", 346 | "topics = topics_gpt3(input_text)\n", 347 | "print(f\"Tags: \\n{topics}\")\n", 348 | "print(\"--------\"*10)\n", 349 | "\n", 350 | "print(\"--------\"*10)\n", 351 | "\n", 352 | "topics = sentiment_gpt3(input_text)\n", 353 | "print(f\"Sentiment: \\n{topics}\")\n", 354 | "print(\"--------\"*10)" 355 | ] 356 | }, 357 | { 358 | "cell_type": "code", 359 | "execution_count": 13, 360 | "metadata": {}, 361 | "outputs": [ 362 | { 363 | "name": "stdout", 364 | "output_type": "stream", 365 | "text": [ 366 | "Input text: \n", 367 | "\"Can you taste words?\" It was a question\n", 368 | "that caught me by surprise. 
This summer, I was giving a talk\n", 369 | "at a literary festival, and afterwards, as I was signing books, a teenage girl came with her friend, and this is what she asked me. I told her that some people\n", 370 | "experience an overlap in their senses so that they could hear colors or see sounds, and many writers were fascinated\n", 371 | "by this subject, myself included. But she cut me off, a bit impatiently,\n", 372 | "and said, \"Yeah, I know all of that. It's called synesthesia. We learned it at school. But my mom is reading your book, and she says there's lots\n", 373 | "of food and ingredients and a long dinner scene in it. She gets hungry at every page. So I was thinking, how come you don't\n", 374 | "get hungry when you write? And I thought maybe,\n", 375 | "maybe you could taste words. Does it make sense?\" And, actually, it did make sense, because ever since my childhood, each letter in the alphabet\n", 376 | "has a different color, and colors bring me flavors. So for instance, the color purple\n", 377 | "is quite pungent, almost perfumed, and any words that I associate with purple taste the same way, such as \"sunset\" -- a very spicy word. But I was worried that if I tell\n", 378 | "all of this to the teenager, it might sound either too abstract or perhaps too weird, and there wasn't enough time anyhow, because people were waiting in the queue, so it suddenly felt like\n", 379 | "what I was trying to convey was more complicated and detailed than what the circumstances\n", 380 | "allowed me to say. And I did what I usually do\n", 381 | "in similar situations: I stammered, I shut down,\n", 382 | "and I stopped talking. I stopped talking because\n", 383 | "the truth was complicated, even though I knew, deep within, that one should never, ever\n", 384 | "remain silent for fear of complexity. So I want to start my talk today with the answer that I was not able\n", 385 | "to give on that day. 
Yes, I can taste words -- sometimes, that is, not always, and happy words have\n", 386 | "a different flavor than sad words. I like to explore: What does\n", 387 | "the word \"creativity\" taste like, or \"equality,\" \"love,\" \"revolution?\" And what about \"motherland?\" These days, it's particularly\n", 388 | "this last word that troubles me. It leaves a sweet taste on my tongue, like cinnamon, a bit of rose water and golden apples. But underneath, there's a sharp tang, like nettles and dandelion. The taste of my motherland, Turkey, is a mixture of sweet and bitter. And the reason why I'm telling you this is because I think\n", 389 | "there's more and more people all around the world today who have similarly mixed emotions about the lands they come from. We love our native countries, yeah? How can we not? We feel attached to the people,\n", 390 | "the culture, the land, the food. And yet at the same time, we feel increasingly frustrated\n", 391 | "by its politics and politicians, sometimes to the point\n", 392 | "of despair or hurt or anger. I want to talk about emotions and the need to boost\n", 393 | "our emotional intelligence. I think it's a pity that mainstream political theory\n", 394 | "pays very little attention to emotions. Oftentimes, analysts and experts\n", 395 | "are so busy with data and metrics that they seem to forget\n", 396 | "those things in life that are difficult to measure and perhaps impossible to cluster\n", 397 | "under statistical models. But I think this is a mistake,\n", 398 | "for two main reasons. Firstly, because we are emotional beings. As human beings,\n", 399 | "I think we all are like that. 
But secondly, and this is new, we have entered\n", 400 | "a new stage in world history in which collective sentiments\n", 401 | "guide and misguide politics more than ever before.\n", 402 | "--------------------------------------------------------------------------------\n", 403 | "--------------------------------------------------------------------------------\n", 404 | "\n", 405 | "Summary: \n", 406 | "The speaker discusses how emotions can be a powerful force in politics. She argues that mainstream political theory often ignores emotions, which she believes is a mistake. She provides two reasons why emotions should be taken into account: first, because humans are emotional beings, and second, because collective sentiments are increasingly influential in politics.\n", 407 | "--------------------------------------------------------------------------------\n", 408 | "Title: \n", 409 | "Can you taste words? Writer explores how flavor affects emotions\n", 410 | "--------------------------------------------------------------------------------\n", 411 | "Tags: \n", 412 | "emotions, political theory, collective sentiment, world history\n", 413 | "--------------------------------------------------------------------------------\n", 414 | "--------------------------------------------------------------------------------\n", 415 | "--------------------------------------------------------------------------------\n", 416 | "\n", 417 | "\n", 418 | "\n", 419 | "\n", 420 | "Input text: \n", 421 | "Ours is the age of anxiety, anger, distrust, resentment and, I think, lots of fear. But here's the thing: even though there's plenty of research\n", 422 | "about economic factors, there's relatively few studies\n", 423 | "about emotional factors. Why is it that we underestimate\n", 424 | "feelings and perceptions? I think it's going to be one\n", 425 | "of our biggest intellectual challenges, because our political systems\n", 426 | "are replete with emotions. 
In country after country, we have seen illiberal politicians\n", 427 | "exploiting these emotions. And yet within the academia\n", 428 | "and among the intelligentsia, we are yet to take emotions seriously. I think we should. And just like we should focus\n", 429 | "on economic inequality worldwide, we need to pay more attention\n", 430 | "to emotional and cognitive gaps worldwide and how to bridge these gaps, because they do matter.\n", 431 | "--------------------------------------------------------------------------------\n", 432 | "--------------------------------------------------------------------------------\n", 433 | "\n", 434 | "Summary: \n", 435 | "The speaker discusses the role of emotions in politics, and argues that we need to take emotions more seriously. He cites research showing that emotional factors play a significant role in political outcomes, and argues that we need to do more to understand and address these emotions.\n", 436 | "--------------------------------------------------------------------------------\n", 437 | "Title: \n", 438 | "Study Shows that Emotions Play a Huge Role in Politics\n", 439 | "--------------------------------------------------------------------------------\n", 440 | "Tags: \n", 441 | "anxiety, anger, distrust, resentment, fear, economics, academia, intelligentsia\n", 442 | "--------------------------------------------------------------------------------\n", 443 | "--------------------------------------------------------------------------------\n", 444 | "--------------------------------------------------------------------------------\n", 445 | "\n", 446 | "\n", 447 | "\n", 448 | "\n", 449 | "Input text: \n", 450 | "And at some point\n", 451 | "in our exchange, she said, \"I understand why you're a feminist, because, you know, you live in Turkey.\" And I said to her, \"I don't understand\n", 452 | "why you're not a feminist, because, you know, you live in America.\" (Laughter) (Applause) And she laughed. 
She took it as a joke, and the moment passed. (Laughter) But the way she had divided the world\n", 453 | "into two imaginary camps, into two opposite camps -- it bothered me and it stayed with me.\n", 454 | "--------------------------------------------------------------------------------\n", 455 | "--------------------------------------------------------------------------------\n", 456 | "\n", 457 | "Summary: \n", 458 | "The speaker discusses a conversation they had with a woman who divided the world into two imaginary camps. The woman said she understood why the speaker was a feminist, because they lived in Turkey. The speaker said they didn't understand why the woman wasn't a feminist, because they lived in America. The woman laughed and took it as a joke. The speaker found the conversation bothersome and it stayed with them.\n", 459 | "--------------------------------------------------------------------------------\n", 460 | "Title: \n", 461 | "Feminist comedian debates woman who doesn't see why she needs feminism\n", 462 | "--------------------------------------------------------------------------------\n", 463 | "Tags: \n", 464 | "feminism, Turkey, America\n", 465 | "--------------------------------------------------------------------------------\n", 466 | "--------------------------------------------------------------------------------\n", 467 | "--------------------------------------------------------------------------------\n", 468 | "\n", 469 | "\n", 470 | "\n", 471 | "\n", 472 | "Input text: \n", 473 | "They were like choppy waters\n", 474 | "not yet settled. Some other parts of the world,\n", 475 | "namely the West, were solid, safe and stable. So it was the liquid lands\n", 476 | "that needed feminism and activism and human rights, and those of us who were\n", 477 | "unfortunate enough to come from such places had to keep struggling\n", 478 | "for these most essential values. But there was hope. 
Since history moved forward, even the most unsteady lands\n", 479 | "would someday catch up. And meanwhile, the citizens of solid lands could take comfort\n", 480 | "in the progress of history and in the triumph of the liberal order. They could support the struggles\n", 481 | "of other people elsewhere, but they themselves\n", 482 | "did not have to struggle for the basics of democracy anymore, because they were beyond that stage. I think in the year 2016, this hierarchical geography\n", 483 | "was shattered to pieces. Our world no longer follows\n", 484 | "this dualistic pattern in the scholar's mind, if it ever did. Now we know that history\n", 485 | "does not necessarily move forward. Sometimes it draws circles, even slides backwards, and that generations\n", 486 | "can make the same mistakes that their great-grandfathers had made. And now we know that there's no such thing as solid countries\n", 487 | "versus liquid countries. In fact, we are all living\n", 488 | "in liquid times, just like the late Zygmunt Bauman told us. And Bauman had another\n", 489 | "definition for our age. He used to say we are all going\n", 490 | "to be walking on moving sands. And if that's the case, I think, it should concern us women more than men, because when societies\n", 491 | "slide backwards into authoritarianism, nationalism or religious fanaticism, women have much more to lose. That is why this needs\n", 492 | "to be a vital moment, not only for global activism, but in my opinion,\n", 493 | "for global sisterhood as well. (Applause) But I want to make a little confession\n", 494 | "before I go any further. Until recently, whenever I took part in\n", 495 | "an international conference or festival, I would be usually one\n", 496 | "of the more depressed speakers. 
(Laughter) Having seen how our dreams of democracy\n", 497 | "and how our dreams of coexistence were crushed in Turkey, both gradually but also\n", 498 | "with a bewildering speed, over the years I've felt\n", 499 | "quite demoralized. And at these festivals there would be\n", 500 | "some other gloomy writers, and they would come from places\n", 501 | "such as Egypt, Nigeria, Pakistan, Bangladesh, Philippines,\n", 502 | "China, Venezuela, Russia. And we would smile\n", 503 | "at each other in sympathy, this camaraderie of the doomed. (Laughter) And you could call us WADWIC: Worried and Depressed\n", 504 | "Writers International Club.\n", 505 | "--------------------------------------------------------------------------------\n", 506 | "--------------------------------------------------------------------------------\n", 507 | "\n", 508 | "Summary: \n", 509 | "The speaker discusses how, in the year 2016, the dualistic pattern in the scholar's mind of solid countries versus liquid countries was shattered. They explain that now we know that history does not necessarily move forward, and that sometimes it draws circles or slides backwards. Additionally, they say that we are all living in liquid times. The speaker goes on to say that, because societies can slide backwards into authoritarianism, nationalism, or religious fanaticism, women have much more to lose than men. 
They end their talk by confessing that, until recently, they were usually one of the more depressed speakers at international conferences and festivals.\n", 510 | "--------------------------------------------------------------------------------\n", 511 | "Title: \n", 512 | "Worried and Depressed Writers International Club\n", 513 | "--------------------------------------------------------------------------------\n", 514 | "Tags: \n", 515 | "feminism, human rights, activism, democracy, coexistence, authoritarianism, nationalism, religious fanaticism\n", 516 | "--------------------------------------------------------------------------------\n", 517 | "--------------------------------------------------------------------------------\n", 518 | "--------------------------------------------------------------------------------\n", 519 | "\n", 520 | "\n", 521 | "\n", 522 | "\n", 523 | "Input text: \n", 524 | "I remember -- (Laughter) I remember Greek writers and poets\n", 525 | "joined first, came on board. And then writers from Hungary and Poland, and then, interestingly, writers\n", 526 | "from Austria, the Netherlands, France, and then writers from the UK,\n", 527 | "where I live and where I call my home, and then writers from the USA. Suddenly, there were more of us feeling worried about\n", 528 | "the fate of our nations and the future of the world. And maybe there were more of us now feeling like strangers\n", 529 | "in our own motherlands. And then this bizarre thing happened. Those of us who used to be\n", 530 | "very depressed for a long time, we started to feel less depressed, whereas the newcomers,\n", 531 | "they were so not used to feeling this way that they were now even more depressed. (Laughter) So you could see writers\n", 532 | "from Bangladesh or Turkey or Egypt trying to console their colleagues from Brexit Britain\n", 533 | "or from post-election USA. 
(Laughter) But joking aside, I think our world is full\n", 534 | "of unprecedented challenges, and this comes with an emotional backlash, because in the face of high-speed change, many people wish to slow down, and when there's too much unfamiliarity, people long for the familiar.\n", 535 | "--------------------------------------------------------------------------------\n", 536 | "--------------------------------------------------------------------------------\n", 537 | "\n", 538 | "Summary: \n", 539 | "The speaker remembers a time when many writers from different countries came together to discuss the fate of their nations and the future of the world. They noticed that the writers who were used to feeling depressed started to feel less depressed, while the newcomers became even more depressed. They joke about this, but the speaker believes that our world is full of unprecedented challenges that come with an emotional backlash.\n", 540 | "--------------------------------------------------------------------------------\n", 541 | "Title: \n", 542 | "Writers from around the world come together to discuss the future of their nations\n", 543 | "--------------------------------------------------------------------------------\n", 544 | "Tags: \n", 545 | "globalisation, migration, displacement, culture, identity\n", 546 | "--------------------------------------------------------------------------------\n", 547 | "--------------------------------------------------------------------------------\n", 548 | "--------------------------------------------------------------------------------\n", 549 | "\n", 550 | "\n", 551 | "\n", 552 | "\n", 553 | "Input text: \n", 554 | "This is a very dangerous crossroads, because it's exactly where the demagogue\n", 555 | "enters into the picture. The demagogue understands\n", 556 | "how collective sentiments work and how he -- it's usually a he --\n", 557 | "can benefit from them. 
He tells us that we all\n", 558 | "belong in our tribes, and he tells us that we will be safer\n", 559 | "if we are surrounded by sameness.\n", 560 | "--------------------------------------------------------------------------------\n", 561 | "--------------------------------------------------------------------------------\n", 562 | "\n", 563 | "Summary: \n", 564 | "The speaker is discussing the dangers of a demagogue, who uses collective sentiments to manipulate people. The demagogue tells people that they will be safer if they are surrounded by those who are like them, when in reality this is not the case.\n", 565 | "--------------------------------------------------------------------------------\n", 566 | "Title: \n", 567 | "Trump exploiting collective sentiments for personal gain, warns expert\n", 568 | "--------------------------------------------------------------------------------\n", 569 | "Tags: \n", 570 | "politics, society, danger, democracy, populism, rhetoric\n", 571 | "--------------------------------------------------------------------------------\n", 572 | "--------------------------------------------------------------------------------\n", 573 | "--------------------------------------------------------------------------------\n", 574 | "\n", 575 | "\n", 576 | "\n", 577 | "\n", 578 | "Input text: \n", 579 | "This could be the eccentric leader\n", 580 | "of a marginal political party somewhere in Europe, or an Islamist extremist imam\n", 581 | "preaching dogma and hatred, or it could be a white supremacist\n", 582 | "Nazi-admiring orator somewhere else. All these figures, at first glance --\n", 583 | "they seem disconnected. But I think they feed each other, and they need each other. And all around the world, when we look at how demagogues talk\n", 584 | "and how they inspire movements, I think they have one\n", 585 | "unmistakable quality in common: they strongly, strongly dislike plurality. They cannot deal with multiplicity. 
Adorno used to say, \"Intolerance of ambiguity is the sign\n", 586 | "of an authoritarian personality.\" But I ask myself: What if that same sign, that same intolerance of ambiguity -- what if it's the mark of our times,\n", 587 | "of the age we're living in? Because wherever I look,\n", 588 | "I see nuances withering away.\n", 589 | "--------------------------------------------------------------------------------\n", 590 | "--------------------------------------------------------------------------------\n", 591 | "\n", 592 | "Summary: \n", 593 | "The speaker is discussing how various demagogues around the world share an intolerance of ambiguity, which they see as a sign of authoritarianism. They argue that this intolerance is contributing to the withering of nuance in our world.\n", 594 | "--------------------------------------------------------------------------------\n", 595 | "Title: \n", 596 | "\"Eccentric Leader of Marginal Political Party Preaches Dogma and Hatred\"\n", 597 | "--------------------------------------------------------------------------------\n", 598 | "Tags: \n", 599 | "authoritarian, extremism, intolerance, ambiguity\n", 600 | "--------------------------------------------------------------------------------\n", 601 | "--------------------------------------------------------------------------------\n", 602 | "--------------------------------------------------------------------------------\n", 603 | "\n", 604 | "\n", 605 | "\n", 606 | "\n", 607 | "Input text: \n", 608 | "Yeah? It's good ratings. It's even better\n", 609 | "if they shout at each other. Even in academia, where our intellect\n", 610 | "is supposed to be nourished, you see one atheist scholar\n", 611 | "competing with a firmly theist scholar, but it's not a real intellectual exchange, because it's a clash\n", 612 | "between two certainties. I think binary oppositions are everywhere. 
So slowly and systematically, we are being denied the right\n", 613 | "to be complex.\n", 614 | "--------------------------------------------------------------------------------\n", 615 | "--------------------------------------------------------------------------------\n", 616 | "\n", 617 | "Summary: \n", 618 | "The speaker discusses how binary oppositions are everywhere, and how we are being denied the right to be complex. They argue that this is particularly apparent in academia, where intellectual exchanges are often reduced to clashes between two certainties.\n", 619 | "--------------------------------------------------------------------------------\n", 620 | "Title: \n", 621 | "\"Atheist Scholar Clashes with Theist Scholar in Academia\"\n", 622 | "--------------------------------------------------------------------------------\n", 623 | "Tags: \n", 624 | "binary oppositions, atheism, theism, academia, intellect, certainty\n", 625 | "--------------------------------------------------------------------------------\n", 626 | "--------------------------------------------------------------------------------\n", 627 | "--------------------------------------------------------------------------------\n", 628 | "\n", 629 | "\n", 630 | "\n", 631 | "\n", 632 | "Input text: \n", 633 | "And when you express your sorrow,\n", 634 | "and when you react against the cruelty, you get all kinds of reactions, messages on social media. But one of them is quite disturbing, only because it's so widespread. They say, \"Why do you feel sorry for them? Why do you feel sorry for them? Why don't you feel sorry\n", 635 | "for civilians in Yemen or civilians in Syria?\" And I think the people\n", 636 | "who write such messages do not understand that we can feel sorry for\n", 637 | "and stand in solidarity with victims of terrorism and violence\n", 638 | "in the Middle East, in Europe, in Asia, in America, wherever, everywhere, equally and simultaneously. 
They don't seem to understand\n", 639 | "that we don't have to pick one pain and one place over all others. But I think this is what\n", 640 | "tribalism does to us. It shrinks our minds, for sure, but it also shrinks our hearts, to such an extent that we become numb\n", 641 | "to the suffering of other people. And the sad truth is,\n", 642 | "we weren't always like this.\n", 643 | "--------------------------------------------------------------------------------\n", 644 | "--------------------------------------------------------------------------------\n", 645 | "\n", 646 | "Summary: \n", 647 | "The speaker discusses how people react to acts of terrorism and violence. They say that some people do not understand that the speaker can feel sorry for victims of terrorism in different parts of the world. The speaker argues that this is due to tribalism, which shrinks our minds and hearts.\n", 648 | "--------------------------------------------------------------------------------\n", 649 | "Title: \n", 650 | "\"Saying We Shouldn't Feel Sorry for Paris Victims is Disrespectful and Narrow-Minded\"\n", 651 | "--------------------------------------------------------------------------------\n", 652 | "Tags: \n", 653 | "#solidarity #empathy #tribalism\n", 654 | "--------------------------------------------------------------------------------\n", 655 | "--------------------------------------------------------------------------------\n", 656 | "--------------------------------------------------------------------------------\n", 657 | "\n", 658 | "\n", 659 | "\n", 660 | "\n", 661 | "Input text: \n", 662 | "I went to many primary schools, which gave me a chance to observe\n", 663 | "younger kids in Turkey. And it was always amazing to see\n", 664 | "how much empathy, imagination and chutzpah they have. These children are much more inclined\n", 665 | "to become global citizens than nationalists at that age. 
And it's wonderful to see,\n", 666 | "when you ask them, so many of them want\n", 667 | "to be poets and writers, and girls are just as confident as boys, if not even more. But then I would go to high schools, and everything has changed. Now nobody wants to be a writer anymore, now nobody wants to be a novelist anymore, and girls have become timid, they are cautious, guarded, reluctant to speak up in the public space, because we have taught them -- the family, the school, the society -- we have taught them\n", 668 | "to erase their individuality. I think East and West, we are losing multiplicity, both within our societies\n", 669 | "and within ourselves. And coming from Turkey,\n", 670 | "I do know that the loss of diversity is a major, major loss.\n", 671 | "--------------------------------------------------------------------------------\n", 672 | "--------------------------------------------------------------------------------\n", 673 | "\n", 674 | "Summary: \n", 675 | "The speaker reflects on their experiences of observing young children in Turkey, noting that they are much more inclined to become global citizens than nationalists. They then discuss their experiences of observing high school students in Turkey, noting that the girls have become timid and guarded. 
The speaker argues that this is a result of the society teaching them to erase their individuality.\n", 676 | "--------------------------------------------------------------------------------\n", 677 | "Title: \n", 678 | "Turkish Author Laments Loss of Diversity in East and West\n", 679 | "--------------------------------------------------------------------------------\n", 680 | "Tags: \n", 681 | "empathy, imagination, chutzpah, global citizens, nationalism, poets, writers, girls\n", 682 | "--------------------------------------------------------------------------------\n", 683 | "--------------------------------------------------------------------------------\n", 684 | "--------------------------------------------------------------------------------\n", 685 | "\n", 686 | "\n", 687 | "\n", 688 | "\n", 689 | "Input text: \n", 690 | "And I also believe that what happened\n", 691 | "over there in Turkey can happen anywhere. It can even happen here. So just like solid countries\n", 692 | "was an illusion, singular identities is also an illusion, because we all have\n", 693 | "a multiplicity of voices inside. The Iranian, the Persian poet, Hafiz, used to say, \"You carry in your soul\n", 694 | "every ingredient necessary to turn your existence into joy. All you have to do\n", 695 | "is to mix those ingredients.\" And I think mix we can. I am an Istanbulite, but I'm also attached to the Balkans, the Aegean, the Mediterranean, the Middle East, the Levant. I am a European by birth, by choice, the values that I uphold. I have become a Londoner over the years. I would like to think of myself\n", 696 | "as a global soul, as a world citizen, a nomad and an itinerant storyteller. 
I have multiple attachments,\n", 697 | "just like all of us do.\n", 698 | "--------------------------------------------------------------------------------\n", 699 | "--------------------------------------------------------------------------------\n", 700 | "\n", 701 | "Summary: \n", 702 | "The speaker believes that identities are illusions, because we all have multiple voices inside of us. They say that just like solid countries were an illusion, singular identities are also an illusion. They add that they identify as an Istanbulite, but they are also attached to other places, such as the Balkans, the Aegean, the Mediterranean, and the Middle East. They say that they are a European by birth and choice, and that they uphold European values. Finally, they say that they consider themselves a global soul and a world citizen.\n", 703 | "--------------------------------------------------------------------------------\n", 704 | "Title: \n", 705 | "Istanbulite Poet: Solid Countries an Illusion, Singular Identities an Illusion\n", 706 | "--------------------------------------------------------------------------------\n", 707 | "Tags: \n", 708 | "Istanbul, Balkans, Aegean, Mediterranean, Middle East, Levant, European, Londoner\n", 709 | "--------------------------------------------------------------------------------\n", 710 | "--------------------------------------------------------------------------------\n", 711 | "--------------------------------------------------------------------------------\n", 712 | "\n", 713 | "\n", 714 | "\n", 715 | "\n", 716 | "Input text: \n", 717 | "As writers, we always\n", 718 | "chase stories, of course, but I think we are also\n", 719 | "interested in silences, the things we cannot talk about, political taboos, cultural taboos. We're also interested in our own silences. I have always been very vocal\n", 720 | "about and written extensively about minority rights, women's rights, LGBT rights. 
But as I was thinking about this TED Talk, I realized one thing: I have never had the courage\n", 721 | "to say in a public space that I was bisexual myself, because I so feared the slander and the stigma and the ridicule and the hatred that was sure to follow. But of course, one should never,\n", 722 | "ever, remain silent for fear of complexity. (Applause) And although I am\n", 723 | "no stranger to anxieties, and although I am talking here\n", 724 | "about the power of emotions -- I do know the power of emotions -- I have discovered over time that emotions are not limitless. You know? They have a limit. There comes a moment -- it's like a tipping point\n", 725 | "or a threshold -- when you get tired of feeling afraid, when you get tired of feeling anxious. And I think not only individuals, but perhaps nations, too,\n", 726 | "have their own tipping points. So even stronger than my emotions is my awareness that not only gender, not only identity, but life itself is fluid. They want to divide us into tribes, but we are connected across borders. They preach certainty, but we know that life has plenty of magic and plenty of ambiguity. And they like to incite dualities, but we are far more nuanced than that. So what can we do? I think we need to go back to the basics, back to the colors of the alphabet. The Lebanese poet\n", 727 | "Khalil Gibran used to say, \"I learned silence from the talkative and tolerance from the intolerant and kindness from the unkind.\" I think it's a great motto for our times. So from populist demagogues, we will learn the indispensability of democracy. And from isolationists, we will learn\n", 728 | "the need for global solidarity. And from tribalists, we will learn\n", 729 | "the beauty of cosmopolitanism and the beauty of diversity. As I finish, I want to leave you\n", 730 | "with one word, or one taste. 
The word \"yurt\" in Turkish\n", 731 | "means \"motherland.\" It means \"homeland.\" But interestingly, the word also means \"a tent used by nomadic tribes.\" And I like that combination,\n", 732 | "because it makes me think homelands do not need\n", 733 | "to be rooted in one place. They can be portable. We can take them with us everywhere. And I think for writers, for storytellers, at the end of the day, there is one main homeland, and it's called \"Storyland.\" And the taste of that word is the taste of freedom. Thank you. (Applause)\n", 734 | "--------------------------------------------------------------------------------\n", 735 | "--------------------------------------------------------------------------------\n", 736 | "\n", 737 | "Summary: \n", 738 | "The speaker discusses the power of emotions and how they can be used to achieve goals. She also talks about the importance of going back to the basics, and how different groups of people can learn from each other.\n", 739 | "--------------------------------------------------------------------------------\n", 740 | "Title: \n", 741 | "Writer Explores the Power of Silence and Emotions\n", 742 | "--------------------------------------------------------------------------------\n", 743 | "Tags: \n", 744 | "storytelling, identity, politics, culture, fear, anxiety, bisexuality, Khalil Gibran\n", 745 | "--------------------------------------------------------------------------------\n", 746 | "--------------------------------------------------------------------------------\n", 747 | "--------------------------------------------------------------------------------\n", 748 | "\n", 749 | "\n", 750 | "\n", 751 | "\n" 752 | ] 753 | } 754 | ], 755 | "source": [ 756 | "for input_text in segments:\n", 757 | " print(f\"Input text: \\n{input_text}\")\n", 758 | " print(\"--------\"*10)\n", 759 | " print(\"--------\"*10+\"\\n\")\n", 760 | "\n", 761 | " summary = summary_gpt3(input_text)\n", 762 | " print(f\"Summary: 
\\n{summary}\")\n", 763 | "\n", 764 | " print(\"--------\"*10)\n", 765 | "\n", 766 | " title = title_gpt3(input_text)\n", 767 | " print(f\"Title: \\n{title}\")\n", 768 | "\n", 769 | " print(\"--------\"*10)\n", 770 | "\n", 771 | " topics = topics_gpt3(input_text)\n", 772 | " print(f\"Tags: \\n{topics}\")\n", 773 | " print(\"--------\"*10)\n", 774 | " print(\"--------\"*10)\n", 775 | " print(\"--------\"*10+\"\\n\\n\\n\\n\")" 776 | ] 777 | }, 778 | { 779 | "cell_type": "code", 780 | "execution_count": 20, 781 | "metadata": {}, 782 | "outputs": [], 783 | "source": [ 784 | "response = openai.Completion.create(\n", 785 | " engine=\"davinci-instruct-beta-v3\",\n", 786 | " prompt=f\"Given a transcript of a call center conversation, create a list of chapters:\\n\\nTranscript: {input_text}\\n\\nChapters:\",\n", 787 | " temperature=0, \n", 788 | " max_tokens=20,\n", 789 | " top_p=1,\n", 790 | " frequency_penalty=0,\n", 791 | " presence_penalty=0,\n", 792 | " #stop=[\"\\n\"],\n", 793 | " #n=3,\n", 794 | " #best_of=1,\n", 795 | ")\n", 796 | "topics = response[\"choices\"][0][\"text\"]\n", 797 | "topics = topics[:-1] if topics.endswith(\",\") else topics\n", 798 | "\n", 799 | "\n" 800 | ] 801 | }, 802 | { 803 | "cell_type": "code", 804 | "execution_count": 21, 805 | "metadata": {}, 806 | "outputs": [ 807 | { 808 | "name": "stdout", 809 | "output_type": "stream", 810 | "text": [ 811 | "Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. Client: I dont know the Cardinals. Agent: Thank you verify your first and last name please. Client: Patricia Covington. Agent: How you spell your last name? Client: C O V I N G T O and. Agent: And you said your first name. Client: Letricia. Agent: L a T. Client: R I C I. Agent: Z. Agent: It's not pulling up anything C O N C I N G T O. Client: Know C O V as in Victor, Agent: S when? 
Client: I N G T O N E. Agent: I put the extra letter and I was wondering what I was doing wrong key verify your data birth for the reserve anson. Client: Uh huh made since 1995. Agent: Thing with this card last owner damage. Client: It was last. Agent: Thinking to verify your address we like a new cards remote out to. Client: 1918 Arlington avenue saint, Louis Missouri 63112 apartment a. Agent: You. Okay. Thank you Mrs. Could send him before? I cant see your car. I need to inform you that this call will be personally cancel excuse me. It will take three to five business days for your new card to arrive in the mail would you like him for counselors car now. Client: Yes maam. Agent: Thank you your card is now been council your my name is Alison team will be transferred to your new card you have 121 instead of benefits available and a dollar and 0.38 and cash benefits. Client: Okay. Thank you. I have you. Agent: Or anything else? I can assist you with today. Client: Know you have a good day. Agent: I was coming soon. Thank you for calling customer service and have a nice day. Client: Thank you, bye, bye. 
Thank you bye.\n", 812 | "--------------------------------------------------------------------------------\n", 813 | "\n", 814 | "\n", 815 | "-Ordering a new card\n", 816 | "-Card number\n", 817 | "-Verifying personal information\n", 818 | "-\n" 819 | ] 820 | } 821 | ], 822 | "source": [ 823 | "print(input_text)\n", 824 | "print(\"--------\"*10)\n", 825 | "print(topics)" 826 | ] 827 | }, 828 | { 829 | "cell_type": "code", 830 | "execution_count": 39, 831 | "metadata": {}, 832 | "outputs": [ 833 | { 834 | "data": { 835 | "text/plain": [ 836 | "'\\n# check if list_words is greater than MAX_LEN_TITLES\\nif len_txt > MAX_LEN_TITLES:\\n list_words_with_dot = find_words_endswith_dot(list_words)\\n # search the worrd with dot most near to MAX_LEN_TITLES\\n for i in range(len(list_words_with_dot)):\\n'" 837 | ] 838 | }, 839 | "execution_count": 39, 840 | "metadata": {}, 841 | "output_type": "execute_result" 842 | } 843 | ], 844 | "source": [ 845 | "MAX_LEN_TITLES = 45\n", 846 | "len_txt = len(input_text.split(\" \"))\n", 847 | "\n", 848 | "list_words = input_text.split(\" \")\n" 849 | ] 850 | }, 851 | { 852 | "cell_type": "code", 853 | "execution_count": 40, 854 | "metadata": {}, 855 | "outputs": [], 856 | "source": [ 857 | "list_words_with_dot = find_words_endswith_dot(list_words)" 858 | ] 859 | }, 860 | { 861 | "cell_type": "code", 862 | "execution_count": 53, 863 | "metadata": {}, 864 | "outputs": [ 865 | { 866 | "data": { 867 | "text/plain": [ 868 | "'Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. 
Client: I dont know the Cardinals.'" 869 | ] 870 | }, 871 | "execution_count": 53, 872 | "metadata": {}, 873 | "output_type": "execute_result" 874 | } 875 | ], 876 | "source": [] 877 | }, 878 | { 879 | "cell_type": "code", 880 | "execution_count": 46, 881 | "metadata": {}, 882 | "outputs": [ 883 | { 884 | "data": { 885 | "text/plain": [ 886 | "0" 887 | ] 888 | }, 889 | "execution_count": 46, 890 | "metadata": {}, 891 | "output_type": "execute_result" 892 | } 893 | ], 894 | "source": [ 895 | "res[5]" 896 | ] 897 | }, 898 | { 899 | "cell_type": "code", 900 | "execution_count": 52, 901 | "metadata": {}, 902 | "outputs": [ 903 | { 904 | "data": { 905 | "text/plain": [ 906 | "'Agent: For calling customer service. My name is Vanessa, how may help you. Client: I was calling to order place and white. Agent: We happy to send out a replacement card out for you. Agent: Your 16, digit card number. Client: I dont know the Cardinals.'" 907 | ] 908 | }, 909 | "execution_count": 52, 910 | "metadata": {}, 911 | "output_type": "execute_result" 912 | } 913 | ], 914 | "source": [] 915 | }, 916 | { 917 | "cell_type": "code", 918 | "execution_count": null, 919 | "metadata": {}, 920 | "outputs": [], 921 | "source": [] 922 | } 923 | ], 924 | "metadata": { 925 | "interpreter": { 926 | "hash": "88de8a03d390c9821a705f8812eeaeda47efe29e530260f109fd5a47346b85c0" 927 | }, 928 | "kernelspec": { 929 | "display_name": "Python 3.8.5 64-bit ('base': conda)", 930 | "language": "python", 931 | "name": "python3" 932 | }, 933 | "language_info": { 934 | "codemirror_mode": { 935 | "name": "ipython", 936 | "version": 3 937 | }, 938 | "file_extension": ".py", 939 | "mimetype": "text/x-python", 940 | "name": "python", 941 | "nbconvert_exporter": "python", 942 | "pygments_lexer": "ipython3", 943 | "version": "3.8.5" 944 | }, 945 | "orig_nbformat": 4 946 | }, 947 | "nbformat": 4, 948 | "nbformat_minor": 2 949 | } 950 | --------------------------------------------------------------------------------