├── README.md
└── app.py
/README.md:
--------------------------------------------------------------------------------
# Flask-Deep-Learning-NLP-API

#### This is built for Windows 10

Flask API to productize a document classification model. The classification model was built using Keras with a TensorFlow backend.
Make sure to install all dependencies, which are listed below:
1) `Python 3`
2) `flask`, `numpy`, `keras`, `tensorflow` (`pickle` and `logging` ship with the Python standard library)

You will need the checkpointed model files exported with `.json` and `.h5` extensions. You will also need the original tokenizer used during the model training process.

To initialize the API, edit the `app.py` file and provide the path to the folder where the model and tokenizer are saved. At a Windows command prompt, navigate to the folder where `app.py` is saved locally and type `python app.py` to launch the API.

To request a prediction, the URL is `http://localhost:5000/predict?text=`, with the text to classify appended after `text=`. Below is an example of how to do this using the `requests` library in Python with the `post` method. The output will be in the form of JSON.

```python
import requests

example_text = """A complete Bust: This game requires quicktime 5.0 to work...if you have a better version of quicktime (I have 7.5), it will ask you to install the quicktime available on the CD...if you click no, it will not let you play. So, I begrudgingly clicked yes on the third try, and it installed quicktime 5, THEN it tells me to please install the quicktime available on the disc. It KEPT telling me that, even after I uninstalled my version of quicktime 7.5, and reinstalled Barbie Rapunzel and quicktime 5. Very frustrating, and the game absolutely will not work for me. It keeps telling me over and over, to install quicktime 5, tho I've been through the installation process repeatedly. It is NOT my "operating system limitations".
This is a brand new computer...merely weeks old with all the state of the art contraptions."""

requests.post('http://localhost:5000/predict?text=' + example_text)
```

The corresponding model files are in my other repo: https://github.com/StatguyUser/Amazon-Review-Classification_DeepLearning

Feel free to reach out!!
--------------------------------------------------------------------------------
/app.py:
--------------------------------------------------------------------------------
#### Dependencies
# 1) Python 3
# 2) flask, numpy, keras, tensorflow (pickle and logging are part of the Python standard library)
# Third-party libraries can be installed one after another by typing the library name after 'pip install',
# for example: pip install flask

#### Save the model files and tokenizer pickle file in a single folder:
# model.json
# model.h5
# tokenizer.pickle

# Navigate to the folder that contains app.py, then type the command below and press Enter to initialize the API:
# python app.py

# Call the API by passing text after 'text=' in the URL below to get the predicted label:
# http://localhost:5000/predict?text=

import os
import pickle
import logging

from flask import Flask, jsonify, request
import numpy as np

from keras.models import model_from_json
from keras.preprocessing.sequence import pad_sequences

import tensorflow as tf

# Silence werkzeug's per-request logging; only errors will be shown
log = logging.getLogger('werkzeug')
log.setLevel(logging.ERROR)

np.random.seed(1337)

# TensorFlow 1.x: capture the default graph so the model can be used from request threads
graph = tf.get_default_graph()


# Start Flask application
app = Flask(__name__)

# Load model (edit this path to point to the folder holding the model and tokenizer files)
path = 'C:/Users/user/Downloads/Model'
json_file = open(os.path.join(path, 'model.json'), 'r')
loaded_model_json = json_file.read()
json_file.close()
keras_model_loaded = model_from_json(loaded_model_json)
keras_model_loaded.load_weights(os.path.join(path, 'model.h5'))
print('Model loaded...')


# Load the tokenizer pickle file (the tokenizer fitted during training)
with open(os.path.join(path, 'tokenizer.pickle'), 'rb') as handle:
    tok = pickle.load(handle)

def preprocess_text(texts, max_review_length=100):
    # Convert raw texts to padded integer sequences using the training-time tokenizer
    cnn_texts_seq = tok.texts_to_sequences(texts)
    cnn_texts_mat = pad_sequences(cnn_texts_seq, maxlen=max_review_length)
    return cnn_texts_mat

# URL that we'll use to make predictions via POST
@app.route('/predict', methods=['POST'])
def predict():
    text = request.args.get('text')
    if not text:
        return jsonify({'error': 'No text provided'}), 400
    x = preprocess_text([text])
    with graph.as_default():
        y = int(np.round(keras_model_loaded.predict(x)))
    if y == 1:
        return jsonify({'prediction': "Positive"})
    else:
        return jsonify({'prediction': "Negative"})


if __name__ == "__main__":
    # Run locally
    app.run(debug=False)

--------------------------------------------------------------------------------
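Once the server is running, the endpoint can be called from any HTTP client. One caveat with the README's example: it concatenates raw review text directly into the URL, so text containing spaces, quotes, or other special characters should be percent-encoded first. A minimal client sketch using the standard library's `urllib.parse.quote` (the review text here is made up for illustration):

```python
from urllib.parse import quote

review = "Arrived broken and the seller never replied. Would not buy again."

# Percent-encode the review so spaces/punctuation survive in the query string
url = "http://localhost:5000/predict?text=" + quote(review)
print(url)

# With the Flask app running, a POST to this URL returns the prediction JSON,
# e.g. requests.post(url).json() -> a dict like {"prediction": "Negative"}
```

Passing the text via `requests.post(url, params={"text": review})` achieves the same encoding automatically.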