├── Book-Recommender-System
│   ├── Book-Recommender.ipynb
│   ├── README.md
│   ├── Result-1.PNG
│   ├── Result-2.PNG
│   └── books.csv
├── Campus-Recruitment-Analysis
│   ├── CampusPlacementAnalysis.ipynb
│   ├── Placement_Data_Full_Class.csv
│   ├── README.md
│   ├── result1.PNG
│   └── result2.PNG
├── Celestial-Bodies-Classification
│   ├── DataSheet.csv
│   ├── Plots
│   │   ├── galaxy.csv
│   │   ├── plot.py
│   │   ├── plot1.png
│   │   ├── plot2.png
│   │   ├── readme.jpg
│   │   └── star.csv
│   ├── kernel_svm.py
│   ├── knn.py
│   └── svm.py
├── Cervical-Cancer-Risk
│   ├── cervical-cancer-awareness.png
│   ├── cervicalCancer-ML.ipynb
│   └── risk_factors_cervical_cancer.csv
├── Drowsiness-Detection-web-app
│   ├── Drowsiness Detection model.py
│   ├── Eye_patch_extractor_&_GUI.py
│   ├── ISHN0619_C3_pic.jpg
│   ├── Pics-for-Readme
│   │   ├── 2021-04-10.png
│   │   ├── 2021-05-03 (1).png
│   │   ├── 2021-05-03 (2).png
│   │   ├── 2021-05-03 (3).png
│   │   └── 2021-05-03 (4).png
│   ├── README.md
│   ├── my_model (1).h5
│   ├── requirements.txt
│   └── sleep.jfif
├── Kaggle 2020 Survey Analysis
│   ├── Kaggle-Survey-2020-Submission.ipynb
│   ├── README.md
│   └── images.png
├── Noise-Removal-and-OCR-Using-CNNs-Autoencoders
│   ├── OCR_DeNoiser.ipynb
│   └── images
│       └── result.jpg
├── README.md
├── Titanic-Survival-Dataset-Analysis
│   ├── .ipynb_checkpoints
│   │   └── titanic_survival-checkpoint.ipynb
│   ├── README.md
│   ├── ground_truth.csv
│   ├── titanic_survival.ipynb
│   ├── titanic_test.csv
│   └── titanic_train.csv
├── TwitchTopStreamers-Analysis
│   ├── README.md
│   ├── Twitch-Result.PNG
│   ├── TwitchTopStreamersAnalysis.ipynb
│   └── twitchdata-update.csv
├── UsedCarPricePredictor
│   ├── CarPricePrediction.ipynb
│   ├── Procfile
│   ├── README.md
│   ├── app.py
│   ├── car_data.csv
│   ├── main.py
│   ├── random_forest_regression_model.pkl
│   ├── requirements.txt
│   ├── static
│   │   └── stylesheets
│   │       └── styles.css
│   └── templates
│       └── index.html
└── contribution.md
/Book-Recommender-System/README.md:
--------------------------------------------------------------------------------
1 | # Book-Recommender
2 | A machine learning project that uses KNN to recommend books to users based on the books' average ratings, their total rating counts, and the language they are written in.
3 |
4 | # Files In the Folder
5 | 1) Book-Recommender.ipynb - Contains the actual recommender system.
6 | 2) books.csv - The data that was used to build the model.
7 |
8 | # Tools used
9 |
10 |
11 | # Objective of the Project
12 | The objective of this recommender system is to recommend books to the user based on the average ratings the books have received, their total rating counts, and the languages they are written in.
13 |
14 | # Steps involved in making the Recommender System
15 | 1) We start by importing and taking a look at our data. There are 10 columns in total, including the authors, book titles, publishers, and other details relevant to the books. More information on these columns is provided in the main notebook.
16 | 2) We then check some basic information, such as whether any null values are present, the minimum and maximum values of each column, and the data types of the columns. This step tells us a little more about our data and makes the later steps easier.
17 | 3) Next, we visualize the columns in our data. We find the top 10 authors, some highly rated authors, and the top 10 publishers, and we look for relationships between the books' average ratings and the number of ratings they have received. This gives us a better understanding of the data and helps us select the right features for our model.
18 | 4) We choose average ratings, rating counts, and languages as the features to feed to our model. A category column would have let us suggest books by genre, but our dataset doesn't have these values; they could be obtained externally through scraping.
19 | 5) We feed these features to our KNN-based model. When a user inputs the name of a book, that book becomes a data point, and the model looks for other data points in its vicinity to make the recommendations.
20 | 6) We finally define a method to make our predictions, which concludes the process of building this recommender system.
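The steps above can be sketched roughly as follows. This is a minimal sketch, not the notebook's exact code: the toy rows and the column names `average_rating`, `ratings_count`, and `language_code` are assumptions standing in for books.csv.

```python
import pandas as pd
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

# Toy stand-in for books.csv; column names are assumptions.
books = pd.DataFrame({
    "title": ["Book A", "Book B", "Book C", "Book D"],
    "average_rating": [4.5, 4.4, 3.2, 4.7],
    "ratings_count": [1200, 1100, 90, 1500],
    "language_code": ["eng", "eng", "spa", "eng"],
})

# One-hot encode the language and scale all features to comparable ranges.
features = pd.get_dummies(
    books[["average_rating", "ratings_count", "language_code"]],
    columns=["language_code"])
X = StandardScaler().fit_transform(features.astype(float))

# Fit KNN on the feature matrix.
knn = NearestNeighbors(n_neighbors=3).fit(X)

def recommend(title, n=2):
    """Return the n books whose feature points lie nearest to `title`."""
    idx = books.index[books["title"] == title][0]
    _, neighbours = knn.kneighbors(X[idx:idx + 1], n_neighbors=n + 1)
    return books["title"].iloc[neighbours[0][1:]].tolist()  # skip the query book
```

The input title becomes the query point, and the nearest feature points (excluding the book itself) become the recommendations.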
21 |
22 | # Results
23 | The system makes some decent recommendations based on the inputs, and these could be made even better with a category column. We have a column called ISBN-13 that we could use to scrape the categories of each book in our data, but even with the current features, the system performs well.
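One possible way to obtain categories from the ISBN-13 column is the public Google Books API; the helper below is a hypothetical sketch of that idea, not something this project implements.

```python
import json
import urllib.request

# Google Books volumes endpoint, queried by ISBN (hypothetical usage here).
GOOGLE_BOOKS_URL = "https://www.googleapis.com/books/v1/volumes?q=isbn:{isbn}"

def parse_categories(volumes_response):
    """Extract the 'categories' list from a Google Books volumes response."""
    items = volumes_response.get("items", [])
    if not items:
        return []
    return items[0].get("volumeInfo", {}).get("categories", [])

def fetch_categories(isbn13):
    """Fetch the categories for a single ISBN-13 (makes a network call)."""
    with urllib.request.urlopen(GOOGLE_BOOKS_URL.format(isbn=isbn13)) as resp:
        return parse_categories(json.load(resp))
```

The scraped categories could then be added as an extra feature column before fitting the KNN model.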
24 |
25 | Results when we input one of the Harry Potter books -
26 | 
27 |
28 | Results when we input one of the Lord of the Rings books -
29 | 
30 |
--------------------------------------------------------------------------------
/Book-Recommender-System/Result-1.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Book-Recommender-System/Result-1.PNG
--------------------------------------------------------------------------------
/Book-Recommender-System/Result-2.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Book-Recommender-System/Result-2.PNG
--------------------------------------------------------------------------------
/Campus-Recruitment-Analysis/Placement_Data_Full_Class.csv:
--------------------------------------------------------------------------------
1 | sl_no,gender,ssc_p,ssc_b,hsc_p,hsc_b,hsc_s,degree_p,degree_t,workex,etest_p,specialisation,mba_p,status,salary
2 | 1,M,67.00,Others,91.00,Others,Commerce,58.00,Sci&Tech,No,55,Mkt&HR,58.8,Placed,270000
3 | 2,M,79.33,Central,78.33,Others,Science,77.48,Sci&Tech,Yes,86.5,Mkt&Fin,66.28,Placed,200000
4 | 3,M,65.00,Central,68.00,Central,Arts,64.00,Comm&Mgmt,No,75,Mkt&Fin,57.8,Placed,250000
5 | 4,M,56.00,Central,52.00,Central,Science,52.00,Sci&Tech,No,66,Mkt&HR,59.43,Not Placed,
6 | 5,M,85.80,Central,73.60,Central,Commerce,73.30,Comm&Mgmt,No,96.8,Mkt&Fin,55.5,Placed,425000
7 | 6,M,55.00,Others,49.80,Others,Science,67.25,Sci&Tech,Yes,55,Mkt&Fin,51.58,Not Placed,
8 | 7,F,46.00,Others,49.20,Others,Commerce,79.00,Comm&Mgmt,No,74.28,Mkt&Fin,53.29,Not Placed,
9 | 8,M,82.00,Central,64.00,Central,Science,66.00,Sci&Tech,Yes,67,Mkt&Fin,62.14,Placed,252000
10 | 9,M,73.00,Central,79.00,Central,Commerce,72.00,Comm&Mgmt,No,91.34,Mkt&Fin,61.29,Placed,231000
11 | 10,M,58.00,Central,70.00,Central,Commerce,61.00,Comm&Mgmt,No,54,Mkt&Fin,52.21,Not Placed,
12 | 11,M,58.00,Central,61.00,Central,Commerce,60.00,Comm&Mgmt,Yes,62,Mkt&HR,60.85,Placed,260000
13 | 12,M,69.60,Central,68.40,Central,Commerce,78.30,Comm&Mgmt,Yes,60,Mkt&Fin,63.7,Placed,250000
14 | 13,F,47.00,Central,55.00,Others,Science,65.00,Comm&Mgmt,No,62,Mkt&HR,65.04,Not Placed,
15 | 14,F,77.00,Central,87.00,Central,Commerce,59.00,Comm&Mgmt,No,68,Mkt&Fin,68.63,Placed,218000
16 | 15,M,62.00,Central,47.00,Central,Commerce,50.00,Comm&Mgmt,No,76,Mkt&HR,54.96,Not Placed,
17 | 16,F,65.00,Central,75.00,Central,Commerce,69.00,Comm&Mgmt,Yes,72,Mkt&Fin,64.66,Placed,200000
18 | 17,M,63.00,Central,66.20,Central,Commerce,65.60,Comm&Mgmt,Yes,60,Mkt&Fin,62.54,Placed,300000
19 | 18,F,55.00,Central,67.00,Central,Commerce,64.00,Comm&Mgmt,No,60,Mkt&Fin,67.28,Not Placed,
20 | 19,F,63.00,Central,66.00,Central,Commerce,64.00,Comm&Mgmt,No,68,Mkt&HR,64.08,Not Placed,
21 | 20,M,60.00,Others,67.00,Others,Arts,70.00,Comm&Mgmt,Yes,50.48,Mkt&Fin,77.89,Placed,236000
22 | 21,M,62.00,Others,65.00,Others,Commerce,66.00,Comm&Mgmt,No,50,Mkt&HR,56.7,Placed,265000
23 | 22,F,79.00,Others,76.00,Others,Commerce,85.00,Comm&Mgmt,No,95,Mkt&Fin,69.06,Placed,393000
24 | 23,F,69.80,Others,60.80,Others,Science,72.23,Sci&Tech,No,55.53,Mkt&HR,68.81,Placed,360000
25 | 24,F,77.40,Others,60.00,Others,Science,64.74,Sci&Tech,Yes,92,Mkt&Fin,63.62,Placed,300000
26 | 25,M,76.50,Others,97.70,Others,Science,78.86,Sci&Tech,No,97.4,Mkt&Fin,74.01,Placed,360000
27 | 26,F,52.58,Others,54.60,Central,Commerce,50.20,Comm&Mgmt,Yes,76,Mkt&Fin,65.33,Not Placed,
28 | 27,M,71.00,Others,79.00,Others,Commerce,66.00,Comm&Mgmt,Yes,94,Mkt&Fin,57.55,Placed,240000
29 | 28,M,63.00,Others,67.00,Others,Commerce,66.00,Comm&Mgmt,No,68,Mkt&HR,57.69,Placed,265000
30 | 29,M,76.76,Others,76.50,Others,Commerce,67.50,Comm&Mgmt,Yes,73.35,Mkt&Fin,64.15,Placed,350000
31 | 30,M,62.00,Central,67.00,Central,Commerce,58.00,Comm&Mgmt,No,77,Mkt&Fin,51.29,Not Placed,
32 | 31,F,64.00,Central,73.50,Central,Commerce,73.00,Comm&Mgmt,No,52,Mkt&HR,56.7,Placed,250000
33 | 32,F,67.00,Central,53.00,Central,Science,65.00,Sci&Tech,No,64,Mkt&HR,58.32,Not Placed,
34 | 33,F,61.00,Central,81.00,Central,Commerce,66.40,Comm&Mgmt,No,50.89,Mkt&HR,62.21,Placed,278000
35 | 34,F,87.00,Others,65.00,Others,Science,81.00,Comm&Mgmt,Yes,88,Mkt&Fin,72.78,Placed,260000
36 | 35,M,62.00,Others,51.00,Others,Science,52.00,Others,No,68.44,Mkt&HR,62.77,Not Placed,
37 | 36,F,69.00,Central,78.00,Central,Commerce,72.00,Comm&Mgmt,No,71,Mkt&HR,62.74,Placed,300000
38 | 37,M,51.00,Central,44.00,Central,Commerce,57.00,Comm&Mgmt,No,64,Mkt&Fin,51.45,Not Placed,
39 | 38,F,79.00,Central,76.00,Central,Science,65.60,Sci&Tech,No,58,Mkt&HR,55.47,Placed,320000
40 | 39,F,73.00,Others,58.00,Others,Science,66.00,Comm&Mgmt,No,53.7,Mkt&HR,56.86,Placed,240000
41 | 40,M,81.00,Others,68.00,Others,Science,64.00,Sci&Tech,No,93,Mkt&Fin,62.56,Placed,411000
42 | 41,F,78.00,Central,77.00,Others,Commerce,80.00,Comm&Mgmt,No,60,Mkt&Fin,66.72,Placed,287000
43 | 42,F,74.00,Others,63.16,Others,Commerce,65.00,Comm&Mgmt,Yes,65,Mkt&HR,69.76,Not Placed,
44 | 43,M,49.00,Others,39.00,Central,Science,65.00,Others,No,63,Mkt&Fin,51.21,Not Placed,
45 | 44,M,87.00,Others,87.00,Others,Commerce,68.00,Comm&Mgmt,No,95,Mkt&HR,62.9,Placed,300000
46 | 45,F,77.00,Others,73.00,Others,Commerce,81.00,Comm&Mgmt,Yes,89,Mkt&Fin,69.7,Placed,200000
47 | 46,F,76.00,Central,64.00,Central,Science,72.00,Sci&Tech,No,58,Mkt&HR,66.53,Not Placed,
48 | 47,F,70.89,Others,71.98,Others,Science,65.60,Comm&Mgmt,No,68,Mkt&HR,71.63,Not Placed,
49 | 48,M,63.00,Central,60.00,Central,Commerce,57.00,Comm&Mgmt,Yes,78,Mkt&Fin,54.55,Placed,204000
50 | 49,M,63.00,Others,62.00,Others,Commerce,68.00,Comm&Mgmt,No,64,Mkt&Fin,62.46,Placed,250000
51 | 50,F,50.00,Others,37.00,Others,Arts,52.00,Others,No,65,Mkt&HR,56.11,Not Placed,
52 | 51,F,75.20,Central,73.20,Central,Science,68.40,Comm&Mgmt,No,65,Mkt&HR,62.98,Placed,200000
53 | 52,M,54.40,Central,61.12,Central,Commerce,56.20,Comm&Mgmt,No,67,Mkt&HR,62.65,Not Placed,
54 | 53,F,40.89,Others,45.83,Others,Commerce,53.00,Comm&Mgmt,No,71.2,Mkt&HR,65.49,Not Placed,
55 | 54,M,80.00,Others,70.00,Others,Science,72.00,Sci&Tech,No,87,Mkt&HR,71.04,Placed,450000
56 | 55,F,74.00,Central,60.00,Others,Science,69.00,Comm&Mgmt,No,78,Mkt&HR,65.56,Placed,216000
57 | 56,M,60.40,Central,66.60,Others,Science,65.00,Comm&Mgmt,No,71,Mkt&HR,52.71,Placed,220000
58 | 57,M,63.00,Others,71.40,Others,Commerce,61.40,Comm&Mgmt,No,68,Mkt&Fin,66.88,Placed,240000
59 | 58,M,68.00,Central,76.00,Central,Commerce,74.00,Comm&Mgmt,No,80,Mkt&Fin,63.59,Placed,360000
60 | 59,M,74.00,Central,62.00,Others,Science,68.00,Comm&Mgmt,No,74,Mkt&Fin,57.99,Placed,268000
61 | 60,M,52.60,Central,65.58,Others,Science,72.11,Sci&Tech,No,57.6,Mkt&Fin,56.66,Placed,265000
62 | 61,M,74.00,Central,70.00,Central,Science,72.00,Comm&Mgmt,Yes,60,Mkt&Fin,57.24,Placed,260000
63 | 62,M,84.20,Central,73.40,Central,Commerce,66.89,Comm&Mgmt,No,61.6,Mkt&Fin,62.48,Placed,300000
64 | 63,F,86.50,Others,64.20,Others,Science,67.40,Sci&Tech,No,59,Mkt&Fin,59.69,Placed,240000
65 | 64,M,61.00,Others,70.00,Others,Commerce,64.00,Comm&Mgmt,No,68.5,Mkt&HR,59.5,Not Placed,
66 | 65,M,80.00,Others,73.00,Others,Commerce,75.00,Comm&Mgmt,No,61,Mkt&Fin,58.78,Placed,240000
67 | 66,M,54.00,Others,47.00,Others,Science,57.00,Comm&Mgmt,No,89.69,Mkt&HR,57.1,Not Placed,
68 | 67,M,83.00,Others,74.00,Others,Science,66.00,Comm&Mgmt,No,68.92,Mkt&HR,58.46,Placed,275000
69 | 68,M,80.92,Others,78.50,Others,Commerce,67.00,Comm&Mgmt,No,68.71,Mkt&Fin,60.99,Placed,275000
70 | 69,F,69.70,Central,47.00,Central,Commerce,72.70,Sci&Tech,No,79,Mkt&HR,59.24,Not Placed,
71 | 70,M,73.00,Central,73.00,Central,Science,66.00,Sci&Tech,Yes,70,Mkt&Fin,68.07,Placed,275000
72 | 71,M,82.00,Others,61.00,Others,Science,62.00,Sci&Tech,No,89,Mkt&Fin,65.45,Placed,360000
73 | 72,M,75.00,Others,70.29,Others,Commerce,71.00,Comm&Mgmt,No,95,Mkt&Fin,66.94,Placed,240000
74 | 73,M,84.86,Others,67.00,Others,Science,78.00,Comm&Mgmt,No,95.5,Mkt&Fin,68.53,Placed,240000
75 | 74,M,64.60,Central,83.83,Others,Commerce,71.72,Comm&Mgmt,No,86,Mkt&Fin,59.75,Placed,218000
76 | 75,M,56.60,Central,64.80,Central,Commerce,70.20,Comm&Mgmt,No,84.27,Mkt&Fin,67.2,Placed,336000
77 | 76,F,59.00,Central,62.00,Others,Commerce,77.50,Comm&Mgmt,No,74,Mkt&HR,67,Not Placed,
78 | 77,F,66.50,Others,70.40,Central,Arts,71.93,Comm&Mgmt,No,61,Mkt&Fin,64.27,Placed,230000
79 | 78,M,64.00,Others,80.00,Others,Science,65.00,Sci&Tech,Yes,69,Mkt&Fin,57.65,Placed,500000
80 | 79,M,84.00,Others,90.90,Others,Science,64.50,Sci&Tech,No,86.04,Mkt&Fin,59.42,Placed,270000
81 | 80,F,69.00,Central,62.00,Central,Science,66.00,Sci&Tech,No,75,Mkt&HR,67.99,Not Placed,
82 | 81,F,69.00,Others,62.00,Others,Commerce,69.00,Comm&Mgmt,Yes,67,Mkt&HR,62.35,Placed,240000
83 | 82,M,81.70,Others,63.00,Others,Science,67.00,Comm&Mgmt,Yes,86,Mkt&Fin,70.2,Placed,300000
84 | 83,M,63.00,Central,67.00,Central,Commerce,74.00,Comm&Mgmt,No,82,Mkt&Fin,60.44,Not Placed,
85 | 84,M,84.00,Others,79.00,Others,Science,68.00,Sci&Tech,Yes,84,Mkt&Fin,66.69,Placed,300000
86 | 85,M,70.00,Central,63.00,Others,Science,70.00,Sci&Tech,Yes,55,Mkt&Fin,62,Placed,300000
87 | 86,F,83.84,Others,89.83,Others,Commerce,77.20,Comm&Mgmt,Yes,78.74,Mkt&Fin,76.18,Placed,400000
88 | 87,M,62.00,Others,63.00,Others,Commerce,64.00,Comm&Mgmt,No,67,Mkt&Fin,57.03,Placed,220000
89 | 88,M,59.60,Central,51.00,Central,Science,60.00,Others,No,75,Mkt&HR,59.08,Not Placed,
90 | 89,F,66.00,Central,62.00,Central,Commerce,73.00,Comm&Mgmt,No,58,Mkt&HR,64.36,Placed,210000
91 | 90,F,84.00,Others,75.00,Others,Science,69.00,Sci&Tech,Yes,62,Mkt&HR,62.36,Placed,210000
92 | 91,F,85.00,Others,90.00,Others,Commerce,82.00,Comm&Mgmt,No,92,Mkt&Fin,68.03,Placed,300000
93 | 92,M,52.00,Central,57.00,Central,Commerce,50.80,Comm&Mgmt,No,67,Mkt&HR,62.79,Not Placed,
94 | 93,F,60.23,Central,69.00,Central,Science,66.00,Comm&Mgmt,No,72,Mkt&Fin,59.47,Placed,230000
95 | 94,M,52.00,Central,62.00,Central,Commerce,54.00,Comm&Mgmt,No,72,Mkt&HR,55.41,Not Placed,
96 | 95,M,58.00,Central,62.00,Central,Commerce,64.00,Comm&Mgmt,No,53.88,Mkt&Fin,54.97,Placed,260000
97 | 96,M,73.00,Central,78.00,Others,Commerce,65.00,Comm&Mgmt,Yes,95.46,Mkt&Fin,62.16,Placed,420000
98 | 97,F,76.00,Central,70.00,Central,Science,76.00,Comm&Mgmt,Yes,66,Mkt&Fin,64.44,Placed,300000
99 | 98,F,70.50,Central,62.50,Others,Commerce,61.00,Comm&Mgmt,No,93.91,Mkt&Fin,69.03,Not Placed,
100 | 99,F,69.00,Central,73.00,Central,Commerce,65.00,Comm&Mgmt,No,70,Mkt&Fin,57.31,Placed,220000
101 | 100,M,54.00,Central,82.00,Others,Commerce,63.00,Sci&Tech,No,50,Mkt&Fin,59.47,Not Placed,
102 | 101,F,45.00,Others,57.00,Others,Commerce,58.00,Comm&Mgmt,Yes,56.39,Mkt&HR,64.95,Not Placed,
103 | 102,M,63.00,Central,72.00,Central,Commerce,68.00,Comm&Mgmt,No,78,Mkt&HR,60.44,Placed,380000
104 | 103,F,77.00,Others,61.00,Others,Commerce,68.00,Comm&Mgmt,Yes,57.5,Mkt&Fin,61.31,Placed,300000
105 | 104,M,73.00,Central,78.00,Central,Science,73.00,Sci&Tech,Yes,85,Mkt&HR,65.83,Placed,240000
106 | 105,M,69.00,Central,63.00,Others,Science,65.00,Comm&Mgmt,Yes,55,Mkt&HR,58.23,Placed,360000
107 | 106,M,59.00,Central,64.00,Others,Science,58.00,Sci&Tech,No,85,Mkt&HR,55.3,Not Placed,
108 | 107,M,61.08,Others,50.00,Others,Science,54.00,Sci&Tech,No,71,Mkt&Fin,65.69,Not Placed,
109 | 108,M,82.00,Others,90.00,Others,Commerce,83.00,Comm&Mgmt,No,80,Mkt&HR,73.52,Placed,200000
110 | 109,M,61.00,Central,82.00,Central,Commerce,69.00,Comm&Mgmt,No,84,Mkt&Fin,58.31,Placed,300000
111 | 110,M,52.00,Central,63.00,Others,Science,65.00,Sci&Tech,Yes,86,Mkt&HR,56.09,Not Placed,
112 | 111,F,69.50,Central,70.00,Central,Science,72.00,Sci&Tech,No,57.2,Mkt&HR,54.8,Placed,250000
113 | 112,M,51.00,Others,54.00,Others,Science,61.00,Sci&Tech,No,60,Mkt&HR,60.64,Not Placed,
114 | 113,M,58.00,Others,61.00,Others,Commerce,61.00,Comm&Mgmt,No,58,Mkt&HR,53.94,Placed,250000
115 | 114,F,73.96,Others,79.00,Others,Commerce,67.00,Comm&Mgmt,No,72.15,Mkt&Fin,63.08,Placed,280000
116 | 115,M,65.00,Central,68.00,Others,Science,69.00,Comm&Mgmt,No,53.7,Mkt&HR,55.01,Placed,250000
117 | 116,F,73.00,Others,63.00,Others,Science,66.00,Comm&Mgmt,No,89,Mkt&Fin,60.5,Placed,216000
118 | 117,M,68.20,Central,72.80,Central,Commerce,66.60,Comm&Mgmt,Yes,96,Mkt&Fin,70.85,Placed,300000
119 | 118,M,77.00,Others,75.00,Others,Science,73.00,Sci&Tech,No,80,Mkt&Fin,67.05,Placed,240000
120 | 119,M,76.00,Central,80.00,Central,Science,78.00,Sci&Tech,Yes,97,Mkt&HR,70.48,Placed,276000
121 | 120,M,60.80,Central,68.40,Central,Commerce,64.60,Comm&Mgmt,Yes,82.66,Mkt&Fin,64.34,Placed,940000
122 | 121,M,58.00,Others,40.00,Others,Science,59.00,Comm&Mgmt,No,73,Mkt&HR,58.81,Not Placed,
123 | 122,F,64.00,Central,67.00,Others,Science,69.60,Sci&Tech,Yes,55.67,Mkt&HR,71.49,Placed,250000
124 | 123,F,66.50,Central,66.80,Central,Arts,69.30,Comm&Mgmt,Yes,80.4,Mkt&Fin,71,Placed,236000
125 | 124,M,74.00,Others,59.00,Others,Commerce,73.00,Comm&Mgmt,Yes,60,Mkt&HR,56.7,Placed,240000
126 | 125,M,67.00,Central,71.00,Central,Science,64.33,Others,Yes,64,Mkt&HR,61.26,Placed,250000
127 | 126,F,84.00,Central,73.00,Central,Commerce,73.00,Comm&Mgmt,No,75,Mkt&Fin,73.33,Placed,350000
128 | 127,F,79.00,Others,61.00,Others,Science,75.50,Sci&Tech,Yes,70,Mkt&Fin,68.2,Placed,210000
129 | 128,F,72.00,Others,60.00,Others,Science,69.00,Comm&Mgmt,No,55.5,Mkt&HR,58.4,Placed,250000
130 | 129,M,80.40,Central,73.40,Central,Science,77.72,Sci&Tech,Yes,81.2,Mkt&HR,76.26,Placed,400000
131 | 130,M,76.70,Central,89.70,Others,Commerce,66.00,Comm&Mgmt,Yes,90,Mkt&Fin,68.55,Placed,250000
132 | 131,M,62.00,Central,65.00,Others,Commerce,60.00,Comm&Mgmt,No,84,Mkt&Fin,64.15,Not Placed,
133 | 132,F,74.90,Others,57.00,Others,Science,62.00,Others,Yes,80,Mkt&Fin,60.78,Placed,360000
134 | 133,M,67.00,Others,68.00,Others,Commerce,64.00,Comm&Mgmt,Yes,74.4,Mkt&HR,53.49,Placed,300000
135 | 134,M,73.00,Central,64.00,Others,Commerce,77.00,Comm&Mgmt,Yes,65,Mkt&HR,60.98,Placed,250000
136 | 135,F,77.44,Central,92.00,Others,Commerce,72.00,Comm&Mgmt,Yes,94,Mkt&Fin,67.13,Placed,250000
137 | 136,F,72.00,Central,56.00,Others,Science,69.00,Comm&Mgmt,No,55.6,Mkt&HR,65.63,Placed,200000
138 | 137,F,47.00,Central,59.00,Central,Arts,64.00,Comm&Mgmt,No,78,Mkt&Fin,61.58,Not Placed,
139 | 138,M,67.00,Others,63.00,Central,Commerce,72.00,Comm&Mgmt,No,56,Mkt&HR,60.41,Placed,225000
140 | 139,F,82.00,Others,64.00,Others,Science,73.00,Sci&Tech,Yes,96,Mkt&Fin,71.77,Placed,250000
141 | 140,M,77.00,Central,70.00,Central,Commerce,59.00,Comm&Mgmt,Yes,58,Mkt&Fin,54.43,Placed,220000
142 | 141,M,65.00,Central,64.80,Others,Commerce,69.50,Comm&Mgmt,Yes,56,Mkt&Fin,56.94,Placed,265000
143 | 142,M,66.00,Central,64.00,Central,Science,60.00,Comm&Mgmt,No,60,Mkt&HR,61.9,Not Placed,
144 | 143,M,85.00,Central,60.00,Others,Science,73.43,Sci&Tech,Yes,60,Mkt&Fin,61.29,Placed,260000
145 | 144,M,77.67,Others,64.89,Others,Commerce,70.67,Comm&Mgmt,No,89,Mkt&Fin,60.39,Placed,300000
146 | 145,M,52.00,Others,50.00,Others,Arts,61.00,Comm&Mgmt,No,60,Mkt&Fin,58.52,Not Placed,
147 | 146,M,89.40,Others,65.66,Others,Science,71.25,Sci&Tech,No,72,Mkt&HR,63.23,Placed,400000
148 | 147,M,62.00,Central,63.00,Others,Science,66.00,Comm&Mgmt,No,85,Mkt&HR,55.14,Placed,233000
149 | 148,M,70.00,Central,74.00,Central,Commerce,65.00,Comm&Mgmt,No,83,Mkt&Fin,62.28,Placed,300000
150 | 149,F,77.00,Central,86.00,Central,Arts,56.00,Others,No,57,Mkt&Fin,64.08,Placed,240000
151 | 150,M,44.00,Central,58.00,Central,Arts,55.00,Comm&Mgmt,Yes,64.25,Mkt&HR,58.54,Not Placed,
152 | 151,M,71.00,Central,58.66,Central,Science,58.00,Sci&Tech,Yes,56,Mkt&Fin,61.3,Placed,690000
153 | 152,M,65.00,Central,65.00,Central,Commerce,75.00,Comm&Mgmt,No,83,Mkt&Fin,58.87,Placed,270000
154 | 153,F,75.40,Others,60.50,Central,Science,84.00,Sci&Tech,No,98,Mkt&Fin,65.25,Placed,240000
155 | 154,M,49.00,Others,59.00,Others,Science,65.00,Sci&Tech,Yes,86,Mkt&Fin,62.48,Placed,340000
156 | 155,M,53.00,Central,63.00,Others,Science,60.00,Comm&Mgmt,Yes,70,Mkt&Fin,53.2,Placed,250000
157 | 156,M,51.57,Others,74.66,Others,Commerce,59.90,Comm&Mgmt,Yes,56.15,Mkt&HR,65.99,Not Placed,
158 | 157,M,84.20,Central,69.40,Central,Science,65.00,Sci&Tech,Yes,80,Mkt&HR,52.72,Placed,255000
159 | 158,M,66.50,Central,62.50,Central,Commerce,60.90,Comm&Mgmt,No,93.4,Mkt&Fin,55.03,Placed,300000
160 | 159,M,67.00,Others,63.00,Others,Science,64.00,Sci&Tech,No,60,Mkt&Fin,61.87,Not Placed,
161 | 160,M,52.00,Central,49.00,Others,Commerce,58.00,Comm&Mgmt,No,62,Mkt&HR,60.59,Not Placed,
162 | 161,M,87.00,Central,74.00,Central,Science,65.00,Sci&Tech,Yes,75,Mkt&HR,72.29,Placed,300000
163 | 162,M,55.60,Others,51.00,Others,Commerce,57.50,Comm&Mgmt,No,57.63,Mkt&HR,62.72,Not Placed,
164 | 163,M,74.20,Central,87.60,Others,Commerce,77.25,Comm&Mgmt,Yes,75.2,Mkt&Fin,66.06,Placed,285000
165 | 164,M,63.00,Others,67.00,Others,Science,64.00,Sci&Tech,No,75,Mkt&Fin,66.46,Placed,500000
166 | 165,F,67.16,Central,72.50,Central,Commerce,63.35,Comm&Mgmt,No,53.04,Mkt&Fin,65.52,Placed,250000
167 | 166,F,63.30,Central,78.33,Others,Commerce,74.00,Comm&Mgmt,No,80,Mkt&Fin,74.56,Not Placed,
168 | 167,M,62.00,Others,62.00,Others,Commerce,60.00,Comm&Mgmt,Yes,63,Mkt&HR,52.38,Placed,240000
169 | 168,M,67.90,Others,62.00,Others,Science,67.00,Sci&Tech,Yes,58.1,Mkt&Fin,75.71,Not Placed,
170 | 169,F,48.00,Central,51.00,Central,Commerce,58.00,Comm&Mgmt,Yes,60,Mkt&HR,58.79,Not Placed,
171 | 170,M,59.96,Others,42.16,Others,Science,61.26,Sci&Tech,No,54.48,Mkt&HR,65.48,Not Placed,
172 | 171,F,63.40,Others,67.20,Others,Commerce,60.00,Comm&Mgmt,No,58.06,Mkt&HR,69.28,Not Placed,
173 | 172,M,80.00,Others,80.00,Others,Commerce,72.00,Comm&Mgmt,Yes,63.79,Mkt&Fin,66.04,Placed,290000
174 | 173,M,73.00,Others,58.00,Others,Commerce,56.00,Comm&Mgmt,No,84,Mkt&HR,52.64,Placed,300000
175 | 174,F,52.00,Others,52.00,Others,Science,55.00,Sci&Tech,No,67,Mkt&HR,59.32,Not Placed,
176 | 175,M,73.24,Others,50.83,Others,Science,64.27,Sci&Tech,Yes,64,Mkt&Fin,66.23,Placed,500000
177 | 176,M,63.00,Others,62.00,Others,Science,65.00,Sci&Tech,No,87.5,Mkt&HR,60.69,Not Placed,
178 | 177,F,59.00,Central,60.00,Others,Commerce,56.00,Comm&Mgmt,No,55,Mkt&HR,57.9,Placed,220000
179 | 178,F,73.00,Central,97.00,Others,Commerce,79.00,Comm&Mgmt,Yes,89,Mkt&Fin,70.81,Placed,650000
180 | 179,M,68.00,Others,56.00,Others,Science,68.00,Sci&Tech,No,73,Mkt&HR,68.07,Placed,350000
181 | 180,F,77.80,Central,64.00,Central,Science,64.20,Sci&Tech,No,75.5,Mkt&HR,72.14,Not Placed,
182 | 181,M,65.00,Central,71.50,Others,Commerce,62.80,Comm&Mgmt,Yes,57,Mkt&Fin,56.6,Placed,265000
183 | 182,M,62.00,Central,60.33,Others,Science,64.21,Sci&Tech,No,63,Mkt&HR,60.02,Not Placed,
184 | 183,M,52.00,Others,65.00,Others,Arts,57.00,Others,Yes,75,Mkt&Fin,59.81,Not Placed,
185 | 184,M,65.00,Central,77.00,Central,Commerce,69.00,Comm&Mgmt,No,60,Mkt&HR,61.82,Placed,276000
186 | 185,F,56.28,Others,62.83,Others,Commerce,59.79,Comm&Mgmt,No,60,Mkt&HR,57.29,Not Placed,
187 | 186,F,88.00,Central,72.00,Central,Science,78.00,Others,No,82,Mkt&HR,71.43,Placed,252000
188 | 187,F,52.00,Central,64.00,Central,Commerce,61.00,Comm&Mgmt,No,55,Mkt&Fin,62.93,Not Placed,
189 | 188,M,78.50,Central,65.50,Central,Science,67.00,Sci&Tech,Yes,95,Mkt&Fin,64.86,Placed,280000
190 | 189,M,61.80,Others,47.00,Others,Commerce,54.38,Comm&Mgmt,No,57,Mkt&Fin,56.13,Not Placed,
191 | 190,F,54.00,Central,77.60,Others,Commerce,69.20,Comm&Mgmt,No,95.65,Mkt&Fin,66.94,Not Placed,
192 | 191,F,64.00,Others,70.20,Central,Commerce,61.00,Comm&Mgmt,No,50,Mkt&Fin,62.5,Not Placed,
193 | 192,M,67.00,Others,61.00,Central,Science,72.00,Comm&Mgmt,No,72,Mkt&Fin,61.01,Placed,264000
194 | 193,M,65.20,Central,61.40,Central,Commerce,64.80,Comm&Mgmt,Yes,93.4,Mkt&Fin,57.34,Placed,270000
195 | 194,F,60.00,Central,63.00,Central,Arts,56.00,Others,Yes,80,Mkt&HR,56.63,Placed,300000
196 | 195,M,52.00,Others,55.00,Others,Commerce,56.30,Comm&Mgmt,No,59,Mkt&Fin,64.74,Not Placed,
197 | 196,M,66.00,Central,76.00,Central,Commerce,72.00,Comm&Mgmt,Yes,84,Mkt&HR,58.95,Placed,275000
198 | 197,M,72.00,Others,63.00,Others,Science,77.50,Sci&Tech,Yes,78,Mkt&Fin,54.48,Placed,250000
199 | 198,F,83.96,Others,53.00,Others,Science,91.00,Sci&Tech,No,59.32,Mkt&HR,69.71,Placed,260000
200 | 199,F,67.00,Central,70.00,Central,Commerce,65.00,Others,No,88,Mkt&HR,71.96,Not Placed,
201 | 200,M,69.00,Others,65.00,Others,Commerce,57.00,Comm&Mgmt,No,73,Mkt&HR,55.8,Placed,265000
202 | 201,M,69.00,Others,60.00,Others,Commerce,65.00,Comm&Mgmt,No,87.55,Mkt&Fin,52.81,Placed,300000
203 | 202,M,54.20,Central,63.00,Others,Science,58.00,Comm&Mgmt,No,79,Mkt&HR,58.44,Not Placed,
204 | 203,M,70.00,Central,63.00,Central,Science,66.00,Sci&Tech,No,61.28,Mkt&HR,60.11,Placed,240000
205 | 204,M,55.68,Others,61.33,Others,Commerce,56.87,Comm&Mgmt,No,66,Mkt&HR,58.3,Placed,260000
206 | 205,F,74.00,Others,73.00,Others,Commerce,73.00,Comm&Mgmt,Yes,80,Mkt&Fin,67.69,Placed,210000
207 | 206,M,61.00,Others,62.00,Others,Commerce,65.00,Comm&Mgmt,No,62,Mkt&Fin,56.81,Placed,250000
208 | 207,M,41.00,Central,42.00,Central,Science,60.00,Comm&Mgmt,No,97,Mkt&Fin,53.39,Not Placed,
209 | 208,M,83.33,Central,78.00,Others,Commerce,61.00,Comm&Mgmt,Yes,88.56,Mkt&Fin,71.55,Placed,300000
210 | 209,F,43.00,Central,60.00,Others,Science,65.00,Comm&Mgmt,No,92.66,Mkt&HR,62.92,Not Placed,
211 | 210,M,62.00,Central,72.00,Central,Commerce,65.00,Comm&Mgmt,No,67,Mkt&Fin,56.49,Placed,216000
212 | 211,M,80.60,Others,82.00,Others,Commerce,77.60,Comm&Mgmt,No,91,Mkt&Fin,74.49,Placed,400000
213 | 212,M,58.00,Others,60.00,Others,Science,72.00,Sci&Tech,No,74,Mkt&Fin,53.62,Placed,275000
214 | 213,M,67.00,Others,67.00,Others,Commerce,73.00,Comm&Mgmt,Yes,59,Mkt&Fin,69.72,Placed,295000
215 | 214,F,74.00,Others,66.00,Others,Commerce,58.00,Comm&Mgmt,No,70,Mkt&HR,60.23,Placed,204000
216 | 215,M,62.00,Central,58.00,Others,Science,53.00,Comm&Mgmt,No,89,Mkt&HR,60.22,Not Placed,
217 |
--------------------------------------------------------------------------------
/Campus-Recruitment-Analysis/README.md:
--------------------------------------------------------------------------------
1 | # Campus Recruitment Analysis
2 | ---
3 |
4 | ## Objectives:
5 | The objective is to predict whether or not a student will be placed during campus recruitment.
6 |
7 | ## Tools Used:
8 | ---
9 |
10 |
11 |
12 | ## Steps Involved in Making the Model:
13 | ---
14 | 1) Importing our data.
15 | Checking for null values; finding the mean, minimum, and maximum of each column; and getting general information about the different columns of our data.
16 | 3) Visualizing our data in order to find out the best columns to use as features for our model.
17 | Making a copy of the original data to apply the changes to, and separating out the columns that we will use as our features.
18 | Feeding these features to three different models (Logistic Regression, KNN, and SVM) to get the best results.
19 | Concluding our analysis by testing the model on some random user input.
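The modelling steps above could be sketched like this. It is a minimal sketch under assumptions: only the numeric score columns are used as features, and a handful of rows from Placement_Data_Full_Class.csv are inlined so the snippet is self-contained (in practice, `pd.read_csv` the full file).

```python
import io
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# A few rows of Placement_Data_Full_Class.csv, inlined for illustration.
data = io.StringIO("""ssc_p,hsc_p,degree_p,etest_p,mba_p,status
67.00,91.00,58.00,55,58.8,Placed
79.33,78.33,77.48,86.5,66.28,Placed
65.00,68.00,64.00,75,57.8,Placed
56.00,52.00,52.00,66,59.43,Not Placed
85.80,73.60,73.30,96.8,55.5,Placed
55.00,49.80,67.25,55,51.58,Not Placed
46.00,49.20,79.00,74.28,53.29,Not Placed
82.00,64.00,66.00,67,62.14,Placed
73.00,79.00,72.00,91.34,61.29,Placed
58.00,70.00,61.00,54,52.21,Not Placed
58.00,61.00,60.00,62,60.85,Placed
69.60,68.40,78.30,60,63.7,Placed
""")
df = pd.read_csv(data)
X = df.drop(columns="status")
y = (df["status"] == "Placed").astype(int)  # 1 = Placed, 0 = Not Placed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Fit the three models mentioned above and compare held-out accuracy.
scores = {}
for name, model in [("Logistic Regression", LogisticRegression()),
                    ("KNN", KNeighborsClassifier(n_neighbors=3)),
                    ("SVM", SVC())]:
    model.fit(X_train, y_train)
    scores[name] = model.score(X_test, y_test)
```

With the full dataset, the same comparison is what lets us pick the best of the three models.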
20 |
21 | ## Results:
22 | ---
23 | The models were able to make decent predictions about whether a person will be placed or not. Of the three, Logistic Regression gave us the best results. These models can still be improved to make more accurate predictions.
24 |
--------------------------------------------------------------------------------
/Campus-Recruitment-Analysis/result1.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Campus-Recruitment-Analysis/result1.PNG
--------------------------------------------------------------------------------
/Campus-Recruitment-Analysis/result2.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Campus-Recruitment-Analysis/result2.PNG
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/Plots/galaxy.csv:
--------------------------------------------------------------------------------
1 | stdDevYAbsGrad,MeanGradX,MeanGradY
2 | 24.01071959,5.7696,6.6864
3 | 26.83103907,28.6683,22.3935
4 | 24.63554285,16.5179,15.4029
5 | 30.11643318,18.5229,21.1515
6 | 17.67997298,10.43,9.7756
7 | 33.08064606,23.9773,27.3875
8 | 33.08030906,26.3301,33.7485
9 | 27.51116183,29.1382,28.2884
10 | 25.12812368,16.7071,21.1005
11 | 23.57849188,29.007,17.7008
12 | 16.34040606,22.2401,12.1563
13 | 34.87572098,31.4384,25.5262
14 | 28.44356823,18.6446,21.9432
15 | 60.93591768,24.8843,29.0551
16 | 24.44979684,10.8278,19.2384
17 | 27.21984165,10.7248,23.0492
18 | 15.24252555,17.4073,13.2309
19 | 15.8170941,14.4787,17.0153
20 | 18.4465268,21.3496,17.993
21 | 17.21673676,8.8969,15.1711
22 | 28.94314744,23.184,34.504
23 | 23.49997148,24.9877,24.4155
24 | 16.63806272,13.159,12.4378
25 | 21.83990085,16.3143,18.8509
26 | 13.5126007,16.2157,15.8329
27 | 14.09052667,11.5425,11.4303
28 | 26.7173567,20.8032,21.9348
29 | 29.94636314,18.9118,20.3926
30 | 38.96799882,28.1021,30.4113
31 | 22.49793465,18.1799,15.3371
32 | 27.34042704,15.0385,16.3411
33 | 17.58976202,22.0593,16.4835
34 | 24.73050965,19.8257,19.5943
35 | 25.56309241,14.0095,14.7625
36 | 22.9559225,14.5494,17.3332
37 | 28.71490204,31.4242,23.349
38 | 20.92517718,29.0421,15.7253
39 | 15.92144531,17.7914,12.8382
40 | 31.80956545,24.8091,22.9081
41 | 11.47942717,5.5902,5.2572
42 | 29.69527297,22.4098,22.8452
43 | 17.0230138,12.7381,12.1261
44 | 17.94903964,15.8698,14.0598
45 | 28.9318257,19.9163,18.0237
46 | 27.41691864,7.5787,11.8165
47 | 25.95019312,18.6935,17.7061
48 | 18.63907819,16.9464,11.6696
49 | 28.82881683,19.9514,19.4004
50 | 32.42486984,21.2386,20.7098
51 | 21.48612049,12.7193,12.8353
52 | 31.99136255,17.4679,19.2079
53 | 27.43670781,21.7748,22.9706
54 | 21.2577209,12.8369,13.6735
55 | 18.58291222,16.194,16.5312
56 | 25.23307156,10.6303,15.7313
57 | 20.75020089,11.4437,11.2889
58 | 24.17229714,26.1628,22.5652
59 | 20.55607091,14.8054,19.7378
60 | 17.64148366,12.4092,14.4038
61 | 12.74528728,10.9249,10.8485
62 | 14.32767526,14.7908,14.6724
63 | 19.93473632,17.9222,17.5298
64 | 77.14467141,16.7624,27.8106
65 | 46.37142926,51.3646,53.9692
66 | 17.7524905,18.014,18.3064
67 | 32.29880561,24.3493,27.7675
68 | 27.1389998,32.7133,27.2753
69 | 15.55618115,16.6385,18.4849
70 | 21.84647429,22.9468,22.481
71 | 41.15880132,19.0906,21.5086
72 | 26.36876325,14.0428,16.1978
73 | 11.40650025,23.4701,11.2601
74 | 29.88692747,19.665,20.7262
75 | 22.34126163,16.0809,16.2667
76 | 13.84671893,13.395,17.6574
77 | 16.57115615,38.7889,13.9463
78 | 12.62582686,20.93,13.736
79 | 44.47049972,17.8011,17.7183
80 | 22.31216182,18.2836,20.0252
81 | 9.555024328,18.6683,10.9753
82 | 14.9110649,35.9191,16.6713
83 | 12.25192343,18.4421,13.8085
84 | 9.726028064,18.3557,10.7553
85 | 13.05569782,15.4253,15.3729
86 | 19.20726881,11.4862,22.365
87 | 12.84850773,12.31,17.507
88 | 12.91082637,19.6456,11.8016
89 | 86.28630797,34.77,39.3694
90 | 49.04475934,32.4351,27.5241
91 | 30.24210163,32.4651,32.8333
92 | 13.23324057,14.8272,11.3898
93 | 40.50106376,25.5989,23.1701
94 | 11.74315624,27.4946,12.7988
95 | 15.75172197,15.4619,15.1567
96 | 12.52030175,12.0556,10.662
97 | 86.39917469,19.917,32.5722
98 | 13.58226576,31.7904,14.2736
99 | 25.6942633,14.4672,17.6812
100 | 21.35789945,13.963,16.2682
101 | 32.52983851,16.5129,24.0975
102 | 16.70600491,7.7464,7.26
103 | 29.74379904,19.142,21.3816
104 | 18.47055988,23.784,20.7458
105 | 25.78645659,16.5074,18.8392
106 | 20.81710005,17.1838,18.9388
107 | 16.42106686,16.3082,17.8518
108 | 12.65908009,14.2385,13.0489
109 | 10.39332358,14.0172,12.695
110 | 37.43637274,9.3981,24.2621
111 | 43.62117843,12.856,27.9106
112 | 34.70325201,16.0119,25.1249
113 | 28.39191637,22.8888,20.9908
114 | 12.34867879,21.165,12.2904
115 | 23.81519167,16.6725,17.8163
116 | 29.48899401,13.2915,22.5885
117 | 48.63303833,54.2332,27.6048
118 | 42.44745505,15.5741,26.7063
119 | 24.40992335,17.5418,18.4486
120 | 31.72965003,23.9111,23.4359
121 | 39.15864005,19.5577,26.7423
122 | 28.73516633,14.6742,29.5402
123 | 34.06227082,27.4543,28.8775
124 | 30.56089002,29.6323,21.4989
125 | 30.98180267,6.7566,13.1952
126 | 70.18806697,29.7853,34.3337
127 | 23.22546292,17.2807,21.7351
128 | 25.2240411,19.8529,23.7493
129 | 20.67605935,11.6878,17.1886
130 | 29.48726114,18.5921,21.1695
131 | 23.98974294,20.3184,19.7442
132 | 18.7154325,18.2023,18.0093
133 | 37.81779351,22.9822,22.5624
134 | 34.53802422,17.9854,18.4482
135 | 27.00148479,20.953,17.9382
136 | 19.59378033,15.7069,17.8165
137 | 18.6094167,21.8199,13.5749
138 | 20.17383123,16.1089,15.3519
139 | 34.32752114,20.6937,21.6315
140 | 25.91794584,18.5906,16.2156
141 | 23.68248814,17.3408,18.2244
142 | 47.134172,50.6487,35.6945
143 | 24.19306857,34.1691,24.1927
144 | 26.63902865,19.5989,19.6943
145 | 21.84148903,31.1085,20.9209
146 | 27.67152204,20.3203,18.8401
147 | 22.19581767,24.1799,25.8697
148 | 30.05279154,13.9985,18.8571
149 | 23.62685337,14.4074,17.56
150 | 58.01806261,33.1142,34.7818
151 | 16.0098686,17.5716,17.2644
152 | 23.21087525,42.5273,16.4545
153 | 18.42994405,16.6823,22.0215
154 | 20.00453683,30.3393,19.7625
155 | 21.79446205,14.754,14.068
156 | 14.63741357,12.701,9.3872
157 | 23.53002805,16.5722,15.8728
158 | 15.08296363,12.1568,13.2298
159 | 14.43451553,13.9494,10.0988
160 | 26.04160107,12.7615,16.0769
161 | 23.12193259,17.0864,19.1812
162 | 36.66474844,37.2515,34.8881
163 | 32.65081335,31.6128,33.8644
164 | 18.44146596,10.3799,19.3073
165 | 15.09764044,9.6353,9.4927
166 | 19.46194173,10.2092,9.7196
167 | 38.36368893,34.2342,26.0444
168 | 21.25195872,16.1083,15.6243
169 | 20.25906406,22.4098,24.7982
170 | 13.2563869,14.7495,16.6225
171 | 24.27210728,15.9352,16.5202
172 | 21.9897567,7.1731,8.4995
173 | 20.08770636,18.4718,19.4646
174 | 28.95247072,15.1864,22.5352
175 | 18.45003406,14.2068,16.0452
176 | 18.16939292,9.9778,8.169
177 | 29.30354122,16.1548,15.9428
178 | 24.950463,10.9368,14.086
179 | 22.62870732,18.4302,20.0246
180 | 15.01313569,14.2776,11.5822
181 | 27.7704472,15.0733,20.6465
182 | 27.92481123,39.1174,25.4424
183 | 21.96851712,14.7421,17.9581
184 | 21.2544111,14.885,23.947
185 | 18.25817678,21.2707,20.7993
186 | 17.2490079,16.8806,16.8738
187 | 24.26162011,14.3126,15.0824
188 | 20.86630851,21.1192,21.163
189 | 24.19390914,13.4684,18.1008
190 | 34.24525075,7.5436,11.001
191 | 16.76930899,14.803,22.526
192 | 25.83957704,15.1427,21.7471
193 | 32.82469788,16.6482,22.803
194 | 16.67521495,10.7969,8.9425
195 | 26.63526504,20.7932,22.816
196 | 17.52575851,14.6564,13.9308
197 | 44.54048783,27.6919,31.7799
198 | 17.580537,11.7346,9.0626
199 | 37.15250064,24.6581,22.7343
200 | 31.38632372,20.3022,29.3018
201 | 28.39049524,17.2037,20.2555
202 | 28.71731083,22.318,20.5816
203 | 16.52363771,17.9448,15.9322
204 | 19.8080566,16.3154,11.4376
205 | 16.14362882,12.1686,9.2808
206 | 19.87431706,14.451,13.6988
207 | 19.58230482,8.3248,10.8786
208 | 15.3262084,14.2611,17.1801
209 | 20.06825373,14.4345,13.2685
210 | 29.26579635,17.2296,18.542
211 | 27.99105951,12.4234,17.6918
212 | 37.86933389,23.0095,22.1259
213 | 25.32947415,21.7546,19.5956
214 | 37.45188014,24.8388,22.9124
215 | 20.16648977,14.6063,14.7071
216 | 37.53623321,5.9563,14.7379
217 | 16.99160469,16.7733,15.9503
218 | 30.67542828,30.556,27.11
219 | 15.16281392,18.3744,10.9914
220 | 17.03297169,12.2408,14.2666
221 | 26.4452249,26.5987,24.9555
222 | 18.66784656,19.9503,19.2191
223 | 38.65055783,5.8084,23.7634
224 | 28.24874714,23.1912,23.0478
225 | 22.12971569,19.6493,17.9323
226 | 20.90601928,20.5304,15.2424
227 | 30.5440893,14.3152,20.797
228 | 23.97686885,7.6394,22.3498
229 | 25.38317432,23.2968,24.1512
230 | 16.23722636,14.6471,13.9805
231 | 15.46942474,10.3668,9.5768
232 | 21.16019081,15.7144,16.9478
233 | 19.69639471,22.3842,20.3166
234 | 18.6086281,13.9175,17.8355
235 | 24.37127185,27.2354,21.1308
236 | 31.7107768,10.0446,14.5252
237 | 22.26633423,7.3772,17.9002
238 | 16.969285,11.1146,14.5238
239 | 14.80249687,13.9109,12.2121
240 | 24.2686615,9.7,13.1878
241 | 21.6821143,6.9474,21.2456
242 | 22.18378867,10.8889,11.6747
243 | 18.82641102,11.4124,13.9396
244 | 30.1932025,9.8545,24.8233
245 | 26.79449914,8.9906,14.7902
246 | 23.42590026,15.4008,18.0346
247 | 22.21310845,22.5983,14.3483
248 | 26.5048696,17.9791,17.6931
249 | 25.56347851,24.5664,25.7716
250 | 35.91643343,18.021,19.1114
251 | 22.9732455,15.1554,17.1944
252 | 24.1602119,14.8218,16.0936
253 | 23.66997889,23.4808,23.1634
254 | 17.57966855,13.7339,10.8769
255 | 24.15028066,14.4213,15.2201
256 | 37.87384783,23.9416,23.6416
257 | 38.00021514,15.752,20.257
258 | 25.11891701,22.0438,14.6594
259 | 26.15605475,24.3176,23.02
260 | 14.04667642,12.939,11.2858
261 | 19.84967908,11.5442,10.5494
262 | 26.22761883,18.9845,17.4321
263 | 47.40425613,31.9248,30.1578
264 | 21.65200333,22.734,13.2428
265 | 26.19676933,28.6005,19.9657
266 | 20.49701988,9.5121,11.6199
267 | 18.55856471,9.7341,9.8801
268 | 23.35777469,10.2883,11.8987
269 | 41.27308832,7.7436,25.7922
270 | 92.28540963,9.6924,28.513
271 | 20.17312074,20.1473,20.0781
272 | 20.37725896,20.6054,18.6154
273 | 40.88839727,20.1964,20.987
274 | 13.0341333,6.0384,6.713
275 | 23.57816276,10.575,11.379
276 | 36.62798979,21.007,30.442
277 | 33.94473675,14.1086,15.4526
278 | 30.87126524,12.1482,24.8826
279 | 19.5488236,19.6472,21.5098
280 | 26.63790067,28.0155,30.5863
281 | 24.6227152,14.1521,15.7147
282 | 32.45727886,17.0598,22.257
283 | 26.17319945,14.6934,20.7084
284 | 27.16550367,12.5161,13.6105
285 | 20.69011433,21.7831,18.0083
286 | 17.40917836,15.351,15.8164
287 | 19.59039226,13.5424,9.7318
288 | 25.80259772,11.473,13.7774
289 | 26.26237879,12.4253,16.3355
290 | 22.02306262,10.5024,7.8208
291 | 15.02408703,16.7052,16.303
292 | 29.25244434,18.129,23.13
293 | 24.65145321,20.0038,14.6116
294 | 16.74414505,24.1762,14.6738
295 | 24.62351215,14.1348,15.5212
296 | 28.58066563,35.4446,22.4404
297 | 29.31075715,26.1156,20.0334
298 | 32.87335594,37.523,28.1946
299 | 35.55655367,15.9794,20.0556
300 | 37.7444463,24.3033,22.0963
301 | 27.69566038,29.7974,28.086
302 | 21.21559665,20.2519,22.3417
303 | 30.76723881,26.0402,23.9796
304 | 36.84975769,9.7408,21.9268
305 | 22.51716798,17.1354,15.2568
306 | 33.18331314,14.3057,15.7411
307 | 16.01403499,11.9683,11.0677
308 | 14.22470723,15.1016,10.8804
309 | 25.09513359,14.5229,14.7295
310 | 21.81993946,15.8556,16.8014
311 | 27.20422105,27.3506,24.0846
312 | 24.69681903,26.9457,20.8969
313 | 18.37557831,10.6582,11.4358
314 | 24.69265368,13.5668,18.3462
315 | 14.35059578,13.9608,13.0008
316 | 22.01708036,23.4973,17.9835
317 | 17.56885491,17.6563,16.3811
318 | 39.8715948,30.91,37.0594
319 | 16.54429109,17.5982,13.3606
320 | 28.64295463,35.3745,20.2499
321 | 19.87970188,16.3648,14.5354
322 | 19.24357548,12.6771,13.1091
323 | 29.37854264,32.8641,30.4943
324 | 18.8795955,7.885,8.4124
325 | 23.29748283,7.5082,12.0624
326 | 21.08785214,13.3349,16.1435
327 | 25.06710603,14.6762,16.9718
328 | 23.08835225,11.3502,15.2362
329 | 18.68288551,16.4872,21.7054
330 | 22.36153266,19.7615,17.3939
331 | 27.86496145,19.8714,18.4518
332 | 38.23072476,30.1381,41.2049
333 | 23.09008086,28.7903,19.7353
334 | 21.13755137,28.1065,15.8665
335 | 29.81008017,25.5684,22.1998
336 | 18.05439443,11.7147,13.2763
337 | 18.72655339,17.5852,14.6732
338 | 26.41252084,20.7572,17.4326
339 | 18.77525463,22.6828,13.3688
340 | 19.91002057,16.2216,15.059
341 | 30.62542114,19.9445,19.9445
342 | 37.3201212,11.4541,18.1177
343 | 22.74641884,10.46,13.3824
344 | 21.86116654,16.0564,15.2912
345 | 20.54324617,12.8562,10.7178
346 | 41.71218658,12.0868,20.8416
347 | 32.65081335,31.6128,33.8644
348 | 27.59521935,20.9448,20.9446
349 | 20.38874861,15.5285,17.2695
350 | 22.81022975,13.2085,16.7417
351 | 29.87690526,15.0125,14.2635
352 | 32.94962477,23.4943,26.9887
353 | 31.40122955,16.406,20.6326
354 | 27.89616271,30.4303,22.9225
355 | 19.0549355,17.4026,17.1464
356 | 26.55136203,18.2413,17.9643
357 | 16.96990958,18.532,13.987
358 | 26.19710818,17.0791,14.3341
359 | 12.74989449,15.0516,9.9084
360 | 13.68596474,4.4264,4.137
361 | 35.60906781,19.5499,22.6187
362 | 34.1669112,17.1731,24.3077
363 | 32.52213227,28.0687,26.1517
364 | 22.92597411,17.2509,16.8877
365 | 26.98733437,26.2584,19.5344
366 | 25.49151561,24.9838,16.4198
367 | 24.47294555,16.1102,26.0902
368 | 31.39191043,8.4072,11.4228
369 | 26.16176864,19.4712,19.1512
370 | 33.01791439,10.9172,15.7212
371 | 19.99794705,4.6708,10.5558
372 | 32.18806623,25.5755,29.1797
373 | 32.61018565,11.057,24.7404
374 | 25.07401922,19.9034,17.7996
375 | 19.87112455,13.8645,10.7661
376 | 7.830247193,5.2347,5.2667
377 | 37.30674179,21.1038,28.0736
378 | 14.43451553,13.9494,10.0988
379 | 25.06152522,11.4761,10.3769
380 | 17.74055774,20.6568,14.4682
381 | 20.34510101,15.5312,15.4136
382 | 27.79943892,14.8161,14.5199
383 | 18.34832107,11.2356,8.2124
384 | 35.87092924,5.2784,14.8166
385 | 31.79226556,24.0331,22.0745
386 | 14.99392928,10.2847,6.7559
387 | 20.33421154,20.8857,15.6739
388 | 18.43796799,16.4785,12.9621
389 | 27.28739167,15.406,17.584
390 | 15.92887252,16.6807,13.3205
391 | 40.50161669,16.2516,18.6976
392 | 32.32353145,28.5398,29.8384
393 | 18.34480717,16.3284,22.0886
394 | 27.14291792,8.0628,22.7526
395 | 20.22840777,12.6715,12.8161
396 | 12.54756486,11.3522,12.196
397 | 20.68726443,18.8718,11.6584
398 | 27.27327987,24.1115,31.0933
399 | 34.21185347,25.6602,25.9832
400 | 18.22226118,14.8796,8.7912
401 | 28.0106085,18.4892,18.0034
402 | 32.69199232,23.3775,28.2697
403 | 21.85180999,17.5124,18.26
404 | 28.46644076,23.3117,21.7745
405 | 20.70705295,16.2687,18.8349
406 | 26.93551927,15.2723,12.3369
407 | 33.53389493,29.8065,28.0243
408 | 20.71046837,28.3713,15.3747
409 | 21.7876874,14.6277,10.3569
410 | 18.86800749,8.4591,16.3611
411 | 23.42241375,14.6588,18.4132
412 | 25.43429074,21.1182,22.3462
413 | 24.59038666,9.1992,18.478
414 | 25.56057858,15.4717,15.7909
415 | 23.90316216,14.1414,17.5874
416 | 24.14122425,14.7624,16.1034
417 | 17.09659121,15.1878,12.713
418 | 18.85988802,17.4308,18.368
419 | 20.84949074,11.8477,13.0603
420 | 32.23183104,33.7127,24.5887
421 | 19.80008353,19.7927,18.7685
422 | 27.04589338,21.9573,19.2489
423 | 35.65799345,20.4963,22.3909
424 | 22.10808111,11.2749,13.1437
425 | 26.80027407,18.0329,21.0031
426 | 29.20381622,13.3233,13.9651
427 | 32.86449816,21.1504,24.631
428 | 18.97736335,15.9742,15.9006
429 | 16.28446848,12.5121,14.2121
430 | 36.57361473,9.0454,20.1758
431 | 16.16518604,8.0606,10.1994
432 | 23.51603317,24.4768,24.4804
433 | 25.11169047,17.9284,17.2358
434 | 32.58784701,24.3572,27.3766
435 | 19.10108349,15.9262,15.1342
436 |
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/Plots/plot.py:
--------------------------------------------------------------------------------
1 | import pandas as pd
2 | import matplotlib.pyplot as plt
3 | from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the '3d' projection)
4 |
5 | # Load the star and galaxy feature tables (no date columns, so no parse_dates needed)
6 | df = pd.read_csv('star.csv')
7 | df1 = pd.read_csv('galaxy.csv')
8 |
9 | # figure().gca(projection='3d') was removed in matplotlib 3.6; use add_subplot instead
10 | plot = plt.figure().add_subplot(projection='3d')
11 | plot.scatter(df['stdDevYAbsGrad'], df['MeanGradX'], df['MeanGradY'], color='#0000FF', label='star')
12 | plot.scatter(df1['stdDevYAbsGrad'], df1['MeanGradX'], df1['MeanGradY'], color='#FF0000', label='galaxy')
13 |
14 | # Axis labels match the CSV column names
15 | plot.set_xlabel('stdDevYAbsGrad')
16 | plot.set_ylabel('MeanGradX')
17 | plot.set_zlabel('MeanGradY')
18 | plot.legend()
19 |
20 | plt.show()
21 |
--------------------------------------------------------------------------------
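The scatter plot above separates the two classes visually; the classifiers in this project folder (`knn.py`, `svm.py`, `kernel_svm.py`) learn that boundary from the same three gradient features. A minimal sketch of that idea, using a handful of feature rows copied from the `galaxy.csv` and `star.csv` data in this folder (the choice of k-NN with `n_neighbors=3` and the 0/1 label encoding are illustrative assumptions, not the project's exact setup):

```python
import pandas as pd
from sklearn.neighbors import KNeighborsClassifier

# A few (stdDevYAbsGrad, MeanGradX, MeanGradY) rows copied from the CSVs above
star_rows = [
    (25.16030897, 10.4356, 11.4708),
    (23.40660037, 7.3516, 7.5382),
    (14.42029475, 10.3362, 14.1044),
    (14.74115247, 5.128, 4.832),
]
galaxy_rows = [
    (29.69527297, 22.4098, 22.8452),
    (17.0230138, 12.7381, 12.1261),
    (28.9318257, 19.9163, 18.0237),
    (27.41691864, 7.5787, 11.8165),
]

cols = ['stdDevYAbsGrad', 'MeanGradX', 'MeanGradY']
star = pd.DataFrame(star_rows, columns=cols).assign(label=0)      # 0 = star
galaxy = pd.DataFrame(galaxy_rows, columns=cols).assign(label=1)  # 1 = galaxy

# Stack both classes into one labeled dataset and fit a k-NN classifier
data = pd.concat([star, galaxy], ignore_index=True)
clf = KNeighborsClassifier(n_neighbors=3).fit(data[cols], data['label'])

# Classify a new point near the star cluster
query = pd.DataFrame([[25.0, 10.0, 11.0]], columns=cols)
print(clf.predict(query))
```

In the real project the full CSVs would be loaded with `pd.read_csv`, split into train/test sets, and evaluated, rather than trained on eight hand-copied rows.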
/Celestial-Bodies-Classification/Plots/plot1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Celestial-Bodies-Classification/Plots/plot1.png
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/Plots/plot2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Celestial-Bodies-Classification/Plots/plot2.png
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/Plots/readme.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Celestial-Bodies-Classification/Plots/readme.jpg
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/Plots/star.csv:
--------------------------------------------------------------------------------
1 | stdDevYAbsGrad,MeanGradX,MeanGradY
2 | 25.16030897,10.4356,11.4708
3 | 23.40660037,7.3516,7.5382
4 | 14.42029475,10.3362,14.1044
5 | 14.74115247,5.128,4.832
6 | 12.1592233,4.3016,4.3192
7 | 12.76711237,3.4718,3.6358
8 | 15.77328767,5.9696,5.914
9 | 12.90540844,3.7556,4.3536
10 | 18.04670162,6.4956,7.1008
11 | 17.64263631,4.9266,6.372
12 | 12.69578347,5.2114,5.3514
13 | 9.859728552,4.6249,3.2483
14 | 29.45423517,10.7261,12.5493
15 | 9.061646365,4.255,5.1754
16 | 11.83724899,2.3126,3.2194
17 | 34.20613535,6.3156,10.4408
18 | 26.50295219,4.7618,6.5112
19 | 30.96707655,5.2758,7.6886
20 | 6.029646146,1.1478,1.2144
21 | 25.91042437,10.9604,9.8554
22 | 17.59885117,8.4308,8.8912
23 | 24.47547296,9.4196,10.1656
24 | 15.13469718,3.603,4.3546
25 | 24.42635001,8.4496,8.525
26 | 11.38413699,3.8686,2.905
27 | 15.60749942,5.0224,5.7986
28 | 27.55672018,19.5332,23.6964
29 | 19.95130249,4.5782,4.477
30 | 12.91112376,3.6596,3.3018
31 | 20.92243683,4.7456,5.9236
32 | 18.30804348,4.7338,4.4372
33 | 13.94497928,4.4348,4.4536
34 | 11.54624178,3.151,3.1578
35 | 19.82596782,5.785,6.1
36 | 14.42983592,12.3268,9.3244
37 | 8.778333974,2.2754,2.5292
38 | 9.44799021,2.1224,2.691
39 | 5.249107318,2.2128,1.6894
40 | 24.97357857,10.3224,12.0646
41 | 22.66202001,12.015,11.057
42 | 26.34729372,6.6718,8.1188
43 | 13.11881348,5.6778,4.6036
44 | 23.13365151,14.7168,24.1702
45 | 20.92634878,10.191,10.1474
46 | 27.57122548,11.6906,12.5112
47 | 38.29520491,22.5834,24.841
48 | 17.21661628,15.5066,18.818
49 | 14.2786659,12.6418,13.05
50 | 28.40736285,7.6714,9.1172
51 | 27.70823204,12.2904,11.9912
52 | 15.62831026,6.124,5.2062
53 | 21.42536497,8.0624,9.144
54 | 13.3295925,3.03,3.0128
55 | 3.610141133,1.554,1.309
56 | 30.65225528,18.702,21.1762
57 | 8.171249853,2.956,2.9672
58 | 29.17741089,15.3648,16.9376
59 | 21.92123015,12.8704,11.2308
60 | 10.61001545,3.3442,4.0904
61 | 19.79199172,11.3643,9.1287
62 | 11.97154422,8.8142,7.0336
63 | 18.56025279,11.2402,8.3194
64 | 17.3293126,6.0402,5.6978
65 | 19.59350651,4.474,4.9826
66 | 9.784877538,3.9728,3.0466
67 | 27.01964457,9.5232,10.0144
68 | 26.08866276,13.1778,12.2334
69 | 23.31453989,12.0939,11.6563
70 | 17.83685236,7.972,7.5714
71 | 13.17136524,15.7386,14.4458
72 | 5.833819318,3.084,3.0596
73 | 9.235521477,3.4778,3.2952
74 | 40.25000518,26.184,25.5518
75 | 32.64080596,18.1884,20.1216
76 | 16.58730041,5.2538,4.4136
77 | 12.76006202,4.6099,5.7223
78 | 10.22276499,3.6986,4.126
79 | 15.4866166,10.3771,11.3225
80 | 27.69979217,22.876,23.9442
81 | 6.675627503,3.216,3.2912
82 | 6.630149196,5.0862,3.6858
83 | 7.135398492,2.8576,2.5094
84 | 9.978144316,3.8872,4.606
85 | 6.86833597,2.6936,3.819
86 | 4.106129545,3.4234,2.7504
87 | 21.34278445,6.0226,6.7572
88 | 24.65686405,5.9594,6.0752
89 | 18.54047895,11.8985,12.5429
90 | 4.942578016,3.4234,3.8984
91 | 23.37975131,10.9638,10.0378
92 | 12.56231404,6.7184,6.3686
93 | 19.7871788,10.151,11.1748
94 | 18.80556794,15.5172,15.0884
95 | 19.55125661,19.1736,21.4022
96 | 17.59084826,7.7978,7.8542
97 | 15.69178294,13.5844,10.8104
98 | 20.40311017,13.7599,15.7711
99 | 12.08112902,4.296,4.6858
100 | 13.08760482,7.754,6.6
101 | 21.11238268,6.6608,8.7924
102 | 18.666476,10.1948,9.0942
103 | 18.13826526,9.4674,8.0688
104 | 18.44001887,12.3846,12.048
105 | 21.27820195,13.1664,16.1514
106 | 18.89058076,12.3729,10.4043
107 | 22.6708304,13.3222,15.907
108 | 25.38860022,20.7978,27.1148
109 | 17.97586074,19.5898,20.8726
110 | 17.53098814,21.1029,19.7183
111 | 18.46927544,25.8881,20.9809
112 | 16.17618847,16.3134,14.6238
113 | 45.68619607,25.8038,31.6808
114 | 22.73654558,15.2585,14.7711
115 | 17.49298835,18.5172,19.4438
116 | 12.68859166,25.8178,13.5724
117 | 25.24924604,14.7414,16.8962
118 | 18.33691225,11.8908,9.693
119 | 18.94805134,13.0493,9.9505
120 | 14.64233093,18.6999,16.5441
121 | 24.45416479,16.3806,18.5206
122 | 10.53295611,9.8186,12.6834
123 | 9.951566984,11.8462,11.8384
124 | 25.48177822,11.5552,12.1816
125 | 18.1852376,7.5562,7.5688
126 | 35.38658927,10.2836,13.47
127 | 31.89669952,19.0748,24.4002
128 | 25.86143135,19.737,21.3906
129 | 23.78106784,25.7303,25.0807
130 | 16.03121666,5.9289,5.8565
131 | 9.175721179,3.821,4.8078
132 | 29.29417202,13.2272,13.8976
133 | 19.87643296,16.9286,16.6778
134 | 22.54258013,10.924,9.659
135 | 21.3037164,18.9228,18.9234
136 | 16.49677132,5.9934,7.244
137 | 21.21702665,15.3977,19.8003
138 | 38.63091595,18.6362,24.8964
139 | 17.20493531,7.7032,13.2064
140 | 18.17713643,7.3808,6.7152
141 | 21.41557813,14.3319,25.8927
142 | 24.19944066,10.5384,13.9428
143 | 15.06883511,4.6861,3.8779
144 | 17.01082452,3.3872,3.407
145 | 22.85852541,8.1666,9.696
146 | 10.17326134,7.4272,6.9876
147 | 22.11366676,12.6654,12.3668
148 | 20.47127058,7.0404,9.209
149 | 11.73901775,5.3495,6.3215
150 | 27.69084277,16.9731,16.2633
151 | 31.00572677,17.2507,19.2311
152 | 14.23220956,6.5812,5.0348
153 | 13.2669771,3.9702,3.9374
154 | 25.08418433,6.186,7.1694
155 | 37.22753311,17.3416,20.6232
156 | 29.67663733,11.7538,10.2764
157 | 40.11047558,10.8232,13.9692
158 | 20.35775245,6.8856,8.0756
159 | 13.53346169,4.2544,5.7074
160 | 24.70425694,11.9836,13.233
161 | 5.486450925,2.9599,3.2525
162 | 17.36084763,17.6926,15.6788
163 | 23.62277872,10.4628,12.5976
164 | 7.783633678,4.9938,4.2974
165 | 19.56661826,10.4558,9.4114
166 | 10.98881874,11.244,8.7826
167 | 10.45777168,6.576,7.7466
168 | 20.71182116,15.3026,15.2794
169 | 7.506875345,4.1894,3.8516
170 | 24.89415626,4.2344,4.928
171 | 18.69463224,2.0328,2.0112
172 | 11.94182816,0.9888,0.9496
173 | 20.24643741,9.9923,9.5915
174 | 25.34373715,6.2878,7.0948
175 | 11.92924977,1.6426,1.76
176 | 19.97256971,2.2276,2.3044
177 | 25.0736874,3.7674,4
178 | 21.23165659,3.6547,3.2471
179 | 15.0363468,2.7596,3.1252
180 | 12.85531572,3.3306,3.5076
181 | 7.9776437,1.313,1.599
182 | 16.79650461,1.2514,1.3536
183 | 13.81841933,1.0063,1.6723
184 | 20.24920859,4.6346,4.1056
185 | 31.02000799,10.329,13.752
186 | 9.44010376,16.651,10.171
187 | 18.19727826,9.8954,10.0898
188 | 12.92434103,2.5068,2.103
189 | 31.88464858,17.5453,12.2211
190 | 20.49268812,2.6398,3.8624
191 | 16.11621019,9.2004,8.837
192 | 21.5082272,2.2558,2.6984
193 | 18.90336742,9.302,10.15
194 | 36.25613791,7.0684,9.108
195 | 24.08095011,4.4112,3.8014
196 | 23.64594058,3.6636,3.1868
197 | 19.27097187,2.2048,1.9844
198 | 20.20634551,1.773,1.951
199 | 27.40191486,4.0418,3.5668
200 | 18.59396434,4.2869,5.1749
201 | 22.24107784,7.9236,9.1306
202 | 15.82847987,7.1054,8.385
203 | 21.58994247,5.4476,5.972
204 | 28.60951667,6.5438,6.266
205 | 22.33131004,3.178,3.7698
206 | 19.97994517,4.0415,4.3217
207 | 9.642158928,2.4555,2.4761
208 | 19.71110831,4.8948,4.253
209 | 5.409178777,1.3954,1.5136
210 | 30.34623823,12.4628,11.455
211 | 18.50575058,3.2562,3.3666
212 | 18.84382382,2.8464,2.848
213 | 18.20516015,7.1358,8.012
214 | 19.47916756,3.2326,2.9944
215 | 14.49196152,6.1708,5.6818
216 | 6.525386119,0.78,0.956
217 | 24.07933458,3.4138,3.9932
218 | 26.862239,24.9501,17.0147
219 | 16.62406313,16.2066,18.1254
220 | 26.48398812,13.0257,13.2927
221 | 27.12691259,24.3024,30.9854
222 | 15.27799836,5.7733,6.3897
223 | 16.38168903,8.8707,5.8229
224 | 28.49053941,12.1098,13.458
225 | 17.63147539,11.5552,11.0166
226 | 28.25776381,6.0038,7.7408
227 | 31.01665926,15.0994,20.7808
228 | 25.81323864,3.8152,3.7848
229 | 20.17074944,9.9373,8.9689
230 | 21.79640752,7.8128,8.4382
231 | 29.67992291,15.0242,14.976
232 | 21.34445749,9.8994,8.0116
233 | 9.220872061,5.1052,6.2062
234 | 19.14800752,6.6949,7.8053
235 | 18.40544558,2.4912,3.5646
236 | 16.69440544,9.088,8.0052
237 | 12.8674847,2.7054,2.6088
238 | 12.70978387,16.8386,14.1414
239 | 11.15786438,1.3494,1.5516
240 | 24.2024943,5.9501,7.0563
241 | 37.67740537,28.7451,37.4771
242 | 26.99063746,3.8706,4.0842
243 | 9.880801231,17.705,9.1448
244 | 14.30220492,3.0126,2.9538
245 | 15.11706502,17.7203,21.4023
246 | 7.78042816,1.6008,6.3042
247 | 18.03649384,11.2816,10.2114
248 | 5.806393714,0.8942,1.7698
249 | 3.181912658,0.5498,1.0428
250 | 5.509754106,8.9269,8.0437
251 | 25.41101805,4.8118,6.5276
252 | 19.13961057,2.5756,4.2052
253 | 28.13760602,20.4781,10.4561
254 | 35.08300249,18.7364,25.9694
255 | 17.55838775,9.9994,8.5634
256 | 30.93220144,13.0768,12.2086
257 | 17.36224958,4.2742,5.6658
258 | 11.9888047,7.9084,5.0276
259 | 10.02198805,12.2716,12.1666
260 | 22.19945495,5.1084,6.6
261 | 18.70594781,9.3858,21.9892
262 | 10.18197972,5.83,4.567
263 | 14.69434426,12.579,13.5474
264 | 16.96629213,6.3828,7.7556
265 | 15.967017,7.3928,6.9604
266 | 5.267268043,1.3862,1.8856
267 | 15.64128503,15.5212,15.5016
268 | 19.6457788,6.4046,7.1256
269 | 17.92134856,7.3746,6.9742
270 | 18.96586475,3.2268,3.1462
271 | 25.09324513,5.526,6.207
272 | 25.11597695,5.6138,5.0224
273 | 15.86049522,9.8504,6.5556
274 | 22.31604425,5.4864,5.487
275 | 16.87540397,16.0474,16.4422
276 | 26.30974125,26.0067,20.5227
277 | 13.8111956,8.004,7.074
278 | 16.37382147,1.8092,1.7416
279 | 18.08631171,2.0088,2.0808
280 | 5.371163881,10.2444,5.5316
281 | 21.73561363,4.3498,3.61
282 | 29.85310256,10.6537,13.4069
283 | 19.40504661,23.43,10.7932
284 | 26.16256484,3.7554,3.851
285 | 28.60031576,17.8615,19.1471
286 | 24.20462944,5.4418,6.6942
287 | 14.783916,3.7984,3.6234
288 | 14.6317533,11.0016,10.0244
289 | 20.09638848,2.1011,2.5705
290 | 20.74758368,2.2494,2.6318
291 | 13.43781584,8.7012,6.4612
292 | 22.86625794,3.3854,3.6202
293 | 23.07834774,6.6246,6.2758
294 | 6.782568056,0.986,1.2584
295 | 28.79020469,3.4614,4.4586
296 | 25.0079956,8.6522,14.834
297 | 20.77766086,4.4964,4.597
298 | 16.70761401,2.8704,3.1632
299 | 13.01944023,2.2558,2.424
300 | 6.069365581,2.4196,2.4988
301 | 9.575659307,1.8716,1.7378
302 | 25.13015887,5.5784,10.0756
303 | 16.20178027,12.966,7.946
304 | 18.54047525,2.8222,3.1542
305 | 18.95271842,3.1096,3.4706
306 | 21.58432996,8.1086,11.7496
307 | 13.34017237,12.5001,14.8739
308 | 12.21125453,4.376,5.5326
309 | 20.24361952,15.4912,16.6878
310 | 18.94597246,2.6286,2.9144
311 | 26.1651845,6.4264,6.9004
312 | 25.05596264,3.6598,3.644
313 | 17.7801422,21.3124,19.7656
314 | 24.54569288,4.0932,4.969
315 | 17.78376986,2.348,2.5658
316 | 7.53622579,1.1186,1.0922
317 | 27.38896584,8.3215,11.8679
318 | 23.7978108,6.4058,7.551
319 | 24.08294115,5.5392,6.5524
320 | 25.764262,4.7636,7.1844
321 | 34.6605671,14.0665,17.4543
322 | 23.27119746,10.3416,9.063
323 | 31.72916136,7.1142,11.7456
324 | 18.34978177,7.1762,7.1622
325 | 23.00288347,4.4968,4.6198
326 | 34.5916619,8.9028,11.8526
327 | 27.05839493,8.3359,7.1731
328 | 29.16829748,5.2365,5.3665
329 | 23.77823697,4.8072,6.5738
330 | 11.11805467,2.2807,2.3605
331 | 7.360543686,1.1632,1.8108
332 | 11.20728473,25.1008,11.737
333 | 23.83555391,10.8567,9.1499
334 | 8.959903258,1.9737,2.5231
335 | 10.88842963,4.7823,5.4005
336 | 13.14354814,5.6751,6.4815
337 | 22.98132257,8.83,8.0722
338 | 31.08550404,8.0134,7.3374
339 | 18.63999334,2.134,2.7906
340 | 34.68421658,3.5206,6.4004
341 | 16.69529263,17.7142,20.6304
342 | 14.26428024,8.4846,20.6446
343 | 15.51415393,9.4055,21.9887
344 | 17.19897293,11.9897,21.3945
345 | 23.00481086,4.3108,10.6846
346 | 19.99185286,9.352,13.0148
347 | 13.66569618,9.664,8.8472
348 | 21.44555238,10.384,15.4482
349 | 35.82541915,5.43,19.0338
350 | 27.96981698,4.7162,11.8714
351 | 35.52522191,3.7356,15.6594
352 | 19.85018085,6.8096,8.6006
353 | 22.05882439,16.499,13.5216
354 | 21.91613358,7.6238,8.283
355 | 19.6934983,24.8326,17.4478
356 | 18.61869804,23.6702,18.9482
357 | 19.03880183,5.4426,4.805
358 | 4.260168818,3.13,3.1642
359 | 12.44776523,14.6748,8.8078
360 | 16.61615705,10.3108,9.9478
361 | 22.62342851,2.804,3.5984
362 | 10.02591996,1.1656,1.573
363 | 17.67796777,3.0379,3.0419
364 | 8.068048339,6.5444,4.086
365 | 19.37889965,5.7078,9.2094
366 | 11.41543215,10.0218,9.9122
367 | 15.51098273,14.4264,10.2926
368 | 25.52583733,15.031,11.0378
369 | 16.49419288,16.5735,15.1261
370 | 17.18466856,3.5363,7.6121
371 | 19.28910498,10.0184,10.2836
372 | 20.03788979,8.5142,9.7222
373 | 19.41757162,10.775,10.4894
374 | 23.54432702,5.6206,6.9136
375 | 17.01290874,10.3914,8.2806
376 | 15.01868776,11.2502,11.0286
377 | 18.83763997,10.2625,18.5179
378 | 31.72837637,16.702,23.3722
379 | 23.82312794,13.5298,16.3744
380 | 11.31197815,9.4118,12.1468
381 | 24.5223472,14.8612,12.6972
382 | 17.35781654,17.2108,14.2478
383 | 13.59649772,13.7078,9.7342
384 | 15.97538271,8.8502,11.5418
385 | 18.99187102,9.8694,16.3718
386 | 21.62685797,8.6818,15.8868
387 | 14.18600471,11.4564,15.0638
388 | 15.92711066,12.2842,13.5186
389 | 13.68846138,10.096,6.895
390 | 15.86676096,15.0488,12.5608
391 | 12.70786565,10.2376,8.3916
392 | 13.14660716,10.0194,8.9004
393 | 25.87166849,4.5095,5.4281
394 | 11.19389619,2.1833,2.6053
395 | 14.18244936,1.1913,1.3945
396 | 20.04421645,2.5465,2.2617
397 | 12.2061079,1.2546,1.4886
398 | 21.04477942,3.7394,3.3044
399 | 13.25068451,1.7142,1.8498
400 | 21.19315689,7.614,8.4078
401 | 27.50494917,11.6357,7.7877
402 | 13.52225652,15.428,14.3184
403 | 16.54813053,2.1748,2.424
404 | 14.01141529,2.4436,1.8988
405 | 24.78530595,6.6972,5.153
406 | 17.28548452,3.7812,2.525
407 | 18.01068529,2.3936,1.6744
408 | 23.07225197,4.053,3.7946
409 | 22.08613764,4.389,4.582
410 | 23.01009063,5.81,5.823
411 | 22.40682221,4.0038,4.1316
412 | 21.51082695,2.7214,3.482
413 | 11.38522371,5.343,7.091
414 | 16.18676103,2.3859,2.4689
415 | 17.72382581,6.3358,5.4684
416 | 17.45940846,3.3973,3.4525
417 | 21.35318479,3.2646,2.8618
418 | 21.91612394,5.008,4.4056
419 | 18.99901208,2.176,1.9496
420 | 26.45104392,2.3478,2.6244
421 | 20.41753578,3.634,3.9792
422 | 6.876144196,12.7632,13.429
423 | 16.58597695,12.8944,9.9808
424 | 17.63954092,6.841,6.486
425 | 17.65478078,12.1434,14.5334
426 | 19.28135433,11.7193,11.5773
427 | 22.82687228,5.1976,10.0224
428 | 22.93689322,7.9788,11.7788
429 | 17.61244311,8.4268,13.4028
430 | 30.37390829,4.6878,13.2256
431 | 17.86192983,9.7202,10.7174
432 | 19.12985727,8.9108,8.031
433 | 31.17881614,9.7632,12.7196
434 | 32.29187557,10.7643,16.3943
435 | 9.282191498,5.7604,4.861
436 | 25.61047533,13.1564,15.0188
437 | 15.52941953,15.7742,8.823
438 | 19.32172819,11.7192,11.0634
439 | 12.20697869,8.0294,6.7056
440 | 33.49675848,10.6234,12.5466
441 | 14.26128719,7.9727,7.3181
442 | 19.86852554,14.5194,21.7278
443 | 25.44581691,26.1732,27.0142
444 | 19.0245801,9.465,6.8698
445 | 26.6906302,17.0056,18.3634
446 | 28.38741922,9.3674,11.0532
447 | 17.70089557,8.2444,8.164
448 | 25.80529803,11.4587,13.4213
449 | 25.52740105,12.5994,12.6334
450 | 25.81479902,9.2047,9.9619
451 | 11.01306786,1.5384,1.9694
452 | 14.53406862,2.5601,2.7673
453 | 29.50408012,19.6552,18.3204
454 | 18.2475506,11.7119,7.1811
455 | 12.79453166,11.1521,9.2531
456 | 55.0194412,24.7238,25.9158
457 | 50.89216159,28.4802,25.533
458 | 16.26111999,5.6036,7.6392
459 | 20.56965676,29.0038,23.6436
460 | 36.23870631,9.9956,18.6522
461 | 21.32301846,17.4264,15.5434
462 | 27.18866613,16.481,14.5586
463 | 19.48287102,3.9568,5.3264
464 | 21.23702249,12.9438,11.2334
465 | 34.01435003,21.1468,17.4802
466 | 23.29458444,5.9862,8.644
467 | 17.90641918,6.5736,5.4404
468 | 16.92143351,1.68,2.2202
469 | 10.85389128,6.4746,5.162
470 | 20.59350941,7.6389,7.1705
471 | 13.81604835,4.2464,3.9798
472 | 16.2285965,4.7703,4.6663
473 | 8.814959167,17.9013,13.2433
474 | 20.33283298,9.2735,10.1233
475 | 17.61219357,8.4397,7.2637
476 | 17.94249724,10.4104,8.8464
477 | 23.80177615,7.427,8.8096
478 | 11.52169606,4.7664,3.5228
479 | 19.15164369,23.9976,13.688
480 | 12.13579079,6.1143,5.2851
481 | 19.13273506,12.8438,9.957
482 | 53.82466442,22.8226,21.49
483 | 18.70909763,11.901,16.5476
484 | 14.30762628,19.8289,23.5695
485 | 13.02782972,6.3722,4.8278
486 | 18.42994072,11.0395,9.7211
487 | 39.54512239,6.2918,8.8248
488 | 34.84840422,7.5568,7.4066
489 | 26.34730022,5.651,8.1182
490 | 19.49389822,8.3588,6.2904
491 | 10.65065369,5.2568,5.276
492 | 17.20692872,7.4926,6.398
493 | 25.84596252,8.5976,8.4488
494 | 30.06431005,5.1144,4.9546
495 | 25.72263663,6.8254,4.2092
496 | 37.39374695,5.2498,6.883
497 | 49.53813138,12.7212,19.2382
498 | 19.46544436,5.8164,6.326
499 | 29.22537256,18.3792,18.5316
500 | 13.9444571,6.408,4.2296
501 | 24.5469504,11.678,12.0814
502 | 16.35988521,7.4644,7.334
503 | 12.54503535,3.8675,3.9803
504 | 34.58083907,16.759,18.387
505 | 23.10612785,14.5318,8.284
506 | 22.42952803,4.5372,3.8106
507 | 26.29778539,4.2916,4.7128
508 | 34.57568012,19.8694,15.8196
509 | 15.40432817,6.1304,6.4058
510 | 9.798390723,2.1032,1.9882
511 | 20.21899239,5.4504,4.9216
512 | 11.04484462,1.6064,1.5144
513 | 15.13719979,17.5296,19.3484
514 | 18.81318686,1.4488,1.38
515 | 23.03120053,8.376,9.2514
516 | 20.40367503,7.324,6.6888
517 | 21.87752549,5.6214,6.9316
518 | 17.86376369,11.4243,11.5367
519 | 28.02085279,9.1698,7.003
520 | 26.58835752,8.8832,9.0696
521 | 34.34344826,16.1608,14.6642
522 | 25.47697221,4.0875,5.2561
523 | 20.275964,2.5282,3.2128
524 | 14.85985712,14.036,13.5738
525 | 25.96996143,8.9274,9.9344
526 | 22.59751804,7.6718,7.6816
527 | 21.52694104,8.384,7.4212
528 | 14.46565848,4.2388,2.9478
529 | 29.17784008,4.2246,5.9396
530 | 31.78900687,23.9137,22.5815
531 | 25.89457432,5.982,6.1078
532 | 17.98872978,5.113,4.149
533 | 22.74031771,6.288,5.6084
534 | 25.38730342,12.3316,8.345
535 | 15.01828765,12.396,9.506
536 | 19.01122613,2.3846,1.841
537 | 10.99161724,13.626,6.3638
538 | 15.55070182,6.3061,6.5409
539 | 14.42263763,7.6595,6.8463
540 | 30.06825459,2.6702,3.4524
541 | 22.27615205,9.638,12.7658
542 | 26.97278839,13.8058,16.5262
543 | 24.1090712,10.4729,10.1949
544 | 21.57218933,5.4906,5.6856
545 | 20.17378594,11.0926,8.969
546 | 14.24035584,15.4208,12.1388
547 | 30.12635584,11.1548,19.122
548 | 17.978042,2.593,2.5024
549 | 9.00997409,4.9085,11.1633
550 | 12.08858132,1.7442,2.2358
551 | 6.888406257,1.1944,1.2882
552 | 21.10598393,30.6753,14.4185
553 | 29.08486243,23.7222,18.0958
554 | 17.73205377,19.571,17.9446
555 | 15.88514241,16.8373,14.3757
556 | 24.41421328,16.8831,15.1187
557 | 32.14425848,15.9861,16.8459
558 | 12.77425131,5.769,5.4344
559 | 25.40324938,13.482,11.461
560 | 37.36020089,37.384,24.5824
561 | 17.53312957,2.2392,1.7144
562 | 20.01217793,18.7352,20.4884
563 | 16.2058261,10.5119,15.3743
564 | 15.49695474,9.3858,6.1876
565 | 15.0129669,8.6922,11.515
566 | 23.82262982,6.3182,6.6308
567 | 26.09380982,21.7134,18.483
568 | 51.59468674,20.6146,24.29
569 | 9.855546578,3.498,2.7358
570 | 24.01259371,17.0362,24.2344
571 | 14.68533748,2.8286,4.4048
572 | 21.18800545,13.7448,21.745
573 | 22.26032819,6.1235,12.7083
574 | 7.855794574,4.3499,13.2619
575 | 25.45495132,5.9472,6.7146
576 | 25.70535693,3.8846,4.875
577 | 28.51441017,3.4028,5.0708
578 | 12.86253722,14.7334,20.0796
579 | 11.59340059,1.4318,1.9674
580 | 29.58040296,22.0554,23.4064
581 | 17.72400807,3.3618,5.1286
582 | 12.3921847,1.7524,3.0732
583 | 14.43451134,14.0604,21.0168
584 | 24.00009315,3.3642,4.273
585 | 19.73377605,20.2835,23.4909
586 | 19.44959824,19.0514,25.0594
587 | 11.91172516,9.7197,12.3537
588 | 32.18481483,4.6354,4.8116
589 | 18.67653226,3.9756,3.3174
590 | 25.37290573,24.7718,18.7426
591 | 9.847885538,7.5238,8.1362
592 | 21.2931273,3.9722,2.7614
593 | 34.63695678,22.245,19.965
594 | 23.36760263,3.574,3.4766
595 | 24.7593122,16.7876,14.4618
596 | 13.39996781,12.5712,6.5326
597 | 16.17662808,14.84,14.6628
598 | 15.19143568,10.6111,8.4047
599 | 16.91026702,3.6598,4.0712
600 | 23.80185349,2.2164,2.4032
601 | 22.82991677,9.7967,9.0005
602 | 31.25131005,11.7445,15.6001
603 | 26.83739337,6.4934,7.1346
604 | 24.14132293,4.5186,4.7448
605 | 21.92227455,2.8334,2.8938
606 | 9.778513998,0.7836,1.808
607 | 17.5094446,2.0192,2.7176
608 | 18.89228425,1.726,1.886
609 | 26.71323741,6.5184,6.7082
610 | 17.95213975,8.015,5.8732
611 | 19.73859614,6.1834,5.8338
612 | 19.28905435,11.3141,9.5135
613 | 8.516197987,2.8618,2.8072
614 | 30.78039192,30.4444,22.4464
615 | 13.64341183,1.2462,1.3058
616 | 19.20619825,5.342,5.3836
617 | 13.84960556,14.836,15.1686
618 | 9.37507213,5.2194,9.3516
619 | 35.62588082,9.1009,9.5147
620 | 25.45542655,3.155,2.8044
621 | 14.19579574,9.4134,5.0344
622 | 18.60920168,5.7898,6.6778
623 | 28.92070196,11.4093,9.5721
624 | 16.05319208,6.1222,4.732
625 | 23.98369688,11.136,11.2304
626 | 19.73871302,2.7096,2.8406
627 | 32.56384547,8.2323,15.6207
628 | 9.227184888,1.1884,1.4148
629 | 24.70475501,13.0522,12.3772
630 | 21.02709745,1.8184,1.6708
631 | 16.40142077,3.3948,3.1892
632 | 11.74780074,2.067,1.8924
633 | 26.92977715,6.8842,8.3838
634 | 17.44172165,2.1928,1.9476
635 | 15.56476248,11.8288,11.913
636 | 8.515800571,1.1552,1.3492
637 | 21.16986915,4.946,4.6002
638 | 20.42988626,4.8798,5.2266
639 | 27.60531737,2.9532,4.0354
640 | 9.045899624,1.4662,1.07
641 | 28.51636962,18.6934,11.7372
642 | 17.59157938,4.802,4.1074
643 | 20.66692634,7.7656,13.3334
644 | 16.05910419,10.3612,10.8292
645 | 12.36979538,6.8235,6.6215
646 | 19.53910551,5.9636,6.334
647 | 17.71844594,2.8658,3.9058
648 | 34.55125231,5.4546,6.342
649 | 28.22704951,10.1692,7.974
650 | 5.721015994,3.5726,3.276
651 | 22.50474332,8.474,8.1798
652 | 23.25279932,8.8666,7.682
653 | 30.39303301,16.217,20.8408
654 | 20.93985696,8.0984,7.7362
655 | 11.62205082,13.5821,11.9941
656 | 20.54449814,9.968,8.6296
657 | 25.11255002,13.4444,13.4944
658 | 19.90522476,5.2948,6.5918
659 | 25.036974,15.5314,13.6036
660 | 13.84814546,5.1088,4.9918
661 | 18.21993038,9.1517,10.1133
662 | 15.90918057,9.6114,10.4688
663 | 28.81062116,11.6953,14.6221
664 | 25.08305592,8.0762,8.6758
665 | 22.40163859,6.535,6.4308
666 | 22.21648448,5.5518,6.6588
667 | 25.01463793,9.9358,10.867
668 | 18.3449168,5.8929,6.1137
669 | 17.40499689,9.1751,9.0677
670 | 24.1562298,7.6248,9.5486
671 | 9.168421803,10.0798,7.3858
672 | 2.52856114,1.3892,1.3184
673 | 17.36298079,8.6846,6.5714
674 | 15.27763785,6.4472,7.6224
675 | 7.105701215,3.4444,2.4968
676 | 20.63251546,8.4628,9.5742
677 | 16.22828897,5.4266,5.7654
678 | 18.63744644,9.3445,9.2247
679 | 16.24273831,22.2947,18.4547
680 | 18.30948374,17.0164,16.5246
681 | 14.52392039,6.064,5.4608
682 | 13.65227448,6.1738,5.9988
683 | 16.93586344,10.122,8.8158
684 | 13.90864606,14.833,15.8478
685 | 21.97268249,7.2826,9.732
686 | 16.67838493,8.182,8.974
687 | 18.5507044,8.2357,9.2647
688 | 18.23941622,9.6833,9.0199
689 | 19.86262101,17.1754,21.5262
690 | 18.17675258,10.2458,12.5476
691 | 23.48646871,14.6538,15.1356
692 | 12.51772254,5.9856,6.4162
693 | 14.22624314,14.1409,10.5025
694 | 14.03683009,18.4728,17.001
695 | 17.55638662,11.5342,11.183
696 | 15.35903307,18.0294,17.8048
697 | 20.09284042,12.1028,16.642
698 | 15.9562599,17.5402,15.3114
699 | 14.32295617,5.7486,5.5716
700 | 6.949161658,10.8935,8.6515
701 | 18.19013796,8.528,7.941
702 | 18.19013796,,
703 | 26.06680604,,
704 |
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/kernel_svm.py:
--------------------------------------------------------------------------------
1 | # Kernel SVM
2 |
3 | # Importing the libraries
4 | import numpy as np
5 | import matplotlib.pyplot as plt
6 | import pandas as pd
7 | from sklearn import metrics
8 | import pickle
9 | from sklearn.model_selection import train_test_split
10 | from sklearn.preprocessing import StandardScaler
11 | from sklearn.svm import SVC
12 | from sklearn.metrics import confusion_matrix
13 |
14 | import time
15 | start_time = time.time()
16 |
17 | # Importing the dataset
18 | dataset = pd.read_csv('DataSheet.csv')
19 | X = dataset.iloc[:, [1,6,7]].values
20 | y = dataset.iloc[:, 10].values
21 |
22 | # Splitting the dataset into the Training set and Test set
23 | X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.20, random_state = 0)
24 |
25 | # Feature Scaling
26 | sc = StandardScaler()
27 | X_train = sc.fit_transform(X_train)
28 | X_test = sc.transform(X_test)
29 |
30 | # Fitting Kernel SVM to the Training set
31 | clf = SVC(kernel = 'rbf', C=2)
32 | clf.fit(X_train, y_train)
33 |
34 |
35 | # Accuracy
36 | acc = clf.score(X_test, y_test)
37 | print("\nAccuracy = ", acc*100,"%")
38 |
39 | # Saving the model to disk
40 | svmPickle = open('k_svmpickle_file', 'wb')
41 | pickle.dump(clf, svmPickle)
42 | svmPickle.close()  # flush so the file can be reloaded below
43 |
44 | # Loading the model
45 | loaded_model = pickle.load(open('k_svmpickle_file', 'rb'))
46 | result = loaded_model.predict(X_test)
47 |
48 | # Displaying the predicted and actual values
49 | print("\n0 = star , 1 = galaxy")
50 | for x in range(len(result)):
51 | print("Predicted: ", result[x], " Data: ", X_test[x], " Actual: ", y_test[x])
52 |
53 | # Making the Confusion Matrix
54 | cm = confusion_matrix(y_test, result)
55 | print("\n The Confusion Matrix:")
56 | print(cm)
57 |
58 | sensitivity1 = cm[0,0]/(cm[0,0]+cm[0,1])
59 | print('Sensitivity : ', sensitivity1 )
60 |
61 | specificity1 = cm[1,1]/(cm[1,0]+cm[1,1])
62 | print('Specificity : ', specificity1)
63 |
64 |
65 | # Visualising the Training set results (first two features; the third is held at its mean)
66 | from matplotlib.colors import ListedColormap
67 | X_set, y_set = X_train, y_train
68 | X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
69 |                      np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
70 | X3 = np.full(X1.ravel().shape, X_set[:, 2].mean())
71 | plt.contourf(X1, X2, clf.predict(np.array([X1.ravel(), X2.ravel(), X3]).T).reshape(X1.shape),
72 |              alpha = 0.75, cmap = ListedColormap(('red', 'green')))
73 | plt.xlim(X1.min(), X1.max())
74 | plt.ylim(X2.min(), X2.max())
75 | for i, j in enumerate(np.unique(y_set)):
76 |     plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
77 |                 c = ListedColormap(('red', 'green'))(i), label = j)
78 | plt.title('Kernel SVM (Training set)')
79 | plt.xlabel('Feature 1 (scaled)')
80 | plt.ylabel('Feature 2 (scaled)')
81 | plt.legend()
82 | plt.show()
83 |
84 | # Visualising the Test set results (first two features; the third is held at its mean)
85 | from matplotlib.colors import ListedColormap
86 | X_set, y_set = X_test, y_test
87 | X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
88 |                      np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
89 | X3 = np.full(X1.ravel().shape, X_set[:, 2].mean())
90 | plt.contourf(X1, X2, clf.predict(np.array([X1.ravel(), X2.ravel(), X3]).T).reshape(X1.shape),
91 |              alpha = 0.75, cmap = ListedColormap(('red', 'green')))
92 | plt.xlim(X1.min(), X1.max())
93 | plt.ylim(X2.min(), X2.max())
94 | for i, j in enumerate(np.unique(y_set)):
95 |     plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
96 |                 c = ListedColormap(('red', 'green'))(i), label = j)
97 | plt.title('Kernel SVM (Test set)')
98 | plt.xlabel('Feature 1 (scaled)')
99 | plt.ylabel('Feature 2 (scaled)')
100 | plt.legend()
101 | plt.show()
102 |
103 | print("--- %s seconds ---" % (time.time() - start_time))
104 |
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/knn.py:
--------------------------------------------------------------------------------
1 | # K-Nearest Neighbors (K-NN)
2 |
3 | # Importing the libraries
4 | import numpy as np
5 | import matplotlib.pyplot as plt
6 | import pandas as pd
7 | import pickle
8 | from sklearn.model_selection import train_test_split
9 | from sklearn.preprocessing import StandardScaler
10 | from sklearn.neighbors import KNeighborsClassifier
11 | from sklearn.metrics import confusion_matrix
12 | import seaborn as sns
13 | import time
14 | start_time = time.time()
15 |
16 | # Importing the dataset
17 | dataset = pd.read_csv('DataSheet.csv')
18 | X = dataset.iloc[:, [1,6,7]].values
19 | y = dataset.iloc[:, 10].values
20 |
21 | # Splitting the dataset into the Training set and Test set
22 | X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.20, random_state = 0)
23 |
24 | # Feature Scaling
25 | sc = StandardScaler()
26 | X_train = sc.fit_transform(X_train)
27 | X_test = sc.transform(X_test)
28 |
29 | sns.countplot(x= 'name', data= dataset)
30 |
31 | # Fitting K-NN to the Training set
32 | classifier = KNeighborsClassifier(n_neighbors = 7, metric = 'minkowski', p = 2)
33 | classifier.fit(X_train, y_train)
34 |
35 | # Accuracy
36 | acc = classifier.score(X_test, y_test)
37 | print("\nAccuracy = ", acc*100,"%")
38 |
39 | # Saving the model to disk
40 | knnPickle = open('knnpickle_file', 'wb')
41 | pickle.dump(classifier, knnPickle)
42 | knnPickle.close()  # flush so the file can be reloaded below
43 |
44 | # Loading the model
45 | loaded_model = pickle.load(open('knnpickle_file', 'rb'))
46 | result = loaded_model.predict(X_test)
47 |
48 | # Displaying the predicted and actual values
49 | print("\n0 = star , 1 = galaxy")
50 | for x in range(len(result)):
51 | print("Predicted: ", result[x], " Data: ", X_test[x], " Actual: ", y_test[x])
52 |
53 | # Making the Confusion Matrix
54 | cm = confusion_matrix(y_test, result)
55 | print("\n The Confusion Matrix:")
56 | print(cm)
57 |
58 | sensitivity1 = cm[0,0]/(cm[0,0]+cm[0,1])
59 | print('Sensitivity : ', sensitivity1 )
60 |
61 | specificity1 = cm[1,1]/(cm[1,0]+cm[1,1])
62 | print('Specificity : ', specificity1)
63 |
64 | # Visualising the Training set results (first two features; the third is held at its mean)
65 | from matplotlib.colors import ListedColormap
66 | X_set, y_set = X_train, y_train
67 | X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
68 |                      np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
69 | X3 = np.full(X1.ravel().shape, X_set[:, 2].mean())
70 | plt.contourf(X1, X2, classifier.predict(np.array([X1.ravel(), X2.ravel(), X3]).T).reshape(X1.shape),
71 |              alpha = 0.75, cmap = ListedColormap(('red', 'green')))
72 | plt.xlim(X1.min(), X1.max())
73 | plt.ylim(X2.min(), X2.max())
74 | for i, j in enumerate(np.unique(y_set)):
75 |     plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1], c = ListedColormap(('red', 'green'))(i), label = j)
76 | plt.title('K-NN (Training set)')
77 | plt.xlabel('Feature 1 (scaled)')
78 | plt.ylabel('Feature 2 (scaled)')
79 | plt.legend()
80 | plt.show()
81 |
82 | # Visualising the Test set results (first two features; the third is held at its mean)
83 | from matplotlib.colors import ListedColormap
84 | X_set, y_set = X_test, y_test
85 | X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
86 |                      np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
87 | X3 = np.full(X1.ravel().shape, X_set[:, 2].mean())
88 | plt.contourf(X1, X2, classifier.predict(np.array([X1.ravel(), X2.ravel(), X3]).T).reshape(X1.shape),
89 |              alpha = 0.75, cmap = ListedColormap(('red', 'green')))
90 | plt.xlim(X1.min(), X1.max())
91 | plt.ylim(X2.min(), X2.max())
92 | for i, j in enumerate(np.unique(y_set)):
93 |     plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1], c = ListedColormap(('red', 'green'))(i), label = j)
94 | plt.title('K-NN (Test set)')
95 | plt.xlabel('Feature 1 (scaled)')
96 | plt.ylabel('Feature 2 (scaled)')
97 | plt.legend()
98 | plt.show()
99 |
100 | print("--- %s seconds ---" % (time.time() - start_time))
101 |
--------------------------------------------------------------------------------
/Celestial-Bodies-Classification/svm.py:
--------------------------------------------------------------------------------
1 | # Support Vector Machine (SVM)
2 |
3 | # Importing the libraries
4 | import numpy as np
5 | import matplotlib.pyplot as plt
6 | import pandas as pd
7 | import sklearn
8 | from sklearn import datasets
9 | from sklearn import svm
10 | from sklearn import metrics
11 | import pickle
12 | import time
13 | start_time = time.time()
14 |
15 | # Importing the dataset
16 | dataset = pd.read_csv('DataSheet.csv')
17 | X = dataset.iloc[:, [1,6,7]].values
18 | y = dataset.iloc[:, 10].values
19 |
20 | # Splitting the dataset into the Training set and Test set
21 | from sklearn.model_selection import train_test_split
22 | X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.2, random_state = 0)
23 |
24 | # Feature Scaling
25 | from sklearn.preprocessing import StandardScaler
26 | sc = StandardScaler()
27 | X_train = sc.fit_transform(X_train)
28 | X_test = sc.transform(X_test)
29 |
30 | # Fitting SVM to the Training set
31 | clf = svm.SVC(kernel = 'linear', C=2)
32 | clf.fit(X_train, y_train)
33 |
34 | # Accuracy
35 | acc = clf.score(X_test, y_test)
36 | print("\nAccuracy = ", acc*100,"%")
37 |
38 |
39 | # Saving the model to disk
40 | svmPickle = open('svmpickle_file', 'wb')
41 | pickle.dump(clf, svmPickle)
42 | svmPickle.close()  # flush so the file can be reloaded below
42 |
43 | # Loading the model
44 | loaded_model = pickle.load(open('svmpickle_file', 'rb'))
45 |
46 | # Predicting the Test set results
47 | result = loaded_model.predict(X_test)
48 |
49 | # Displaying the predicted and actual values
50 | print("\n0 = star , 1 = galaxy")
51 | for x in range(len(result)):
52 | print("Predicted: ", result[x], " Data: ", X_test[x], " Actual: ", y_test[x])
53 |
54 | # Making the Confusion Matrix
55 | from sklearn.metrics import confusion_matrix
56 | cm = confusion_matrix(y_test, result)
57 | print("\n The Confusion Matrix:")
58 | print(cm)
59 |
60 | sensitivity1 = cm[0,0]/(cm[0,0]+cm[0,1])
61 | print('Sensitivity : ', sensitivity1 )
62 |
63 | specificity1 = cm[1,1]/(cm[1,0]+cm[1,1])
64 | print('Specificity : ', specificity1)
65 |
66 |
67 | # Visualising the Training set results (first two features; the third is held at its mean)
68 | from matplotlib.colors import ListedColormap
69 | X_set, y_set = X_train, y_train
70 | X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
71 |                      np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
72 | X3 = np.full(X1.ravel().shape, X_set[:, 2].mean())
73 | plt.contourf(X1, X2, clf.predict(np.array([X1.ravel(), X2.ravel(), X3]).T).reshape(X1.shape),
74 |              alpha = 0.75, cmap = ListedColormap(('red', 'green')))
75 | plt.xlim(X1.min(), X1.max())
76 | plt.ylim(X2.min(), X2.max())
77 | for i, j in enumerate(np.unique(y_set)):
78 |     plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1], c = ListedColormap(('red', 'green'))(i), label = j)
79 | plt.title('SVM (Training set)')
80 | plt.xlabel('Feature 1 (scaled)')
81 | plt.ylabel('Feature 2 (scaled)')
82 | plt.legend()
83 | plt.show()
84 |
85 | # Visualising the Test set results (first two features; the third is held at its mean)
86 | from matplotlib.colors import ListedColormap
87 | X_set, y_set = X_test, y_test
88 | X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
89 |                      np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
90 | X3 = np.full(X1.ravel().shape, X_set[:, 2].mean())
91 | plt.contourf(X1, X2, clf.predict(np.array([X1.ravel(), X2.ravel(), X3]).T).reshape(X1.shape),
92 |              alpha = 0.75, cmap = ListedColormap(('red', 'green')))
93 | plt.xlim(X1.min(), X1.max())
94 | plt.ylim(X2.min(), X2.max())
95 | for i, j in enumerate(np.unique(y_set)):
96 |     plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1], c = ListedColormap(('red', 'green'))(i), label = j)
97 | plt.title('SVM (Test set)')
98 | plt.xlabel('Feature 1 (scaled)')
99 | plt.ylabel('Feature 2 (scaled)')
100 | plt.legend()
101 | plt.show()
102 |
103 | print("--- %s seconds ---" % (time.time() - start_time))
104 |
--------------------------------------------------------------------------------
/Cervical-Cancer-Risk/cervical-cancer-awareness.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Cervical-Cancer-Risk/cervical-cancer-awareness.png
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/Drowsiness Detection model.py:
--------------------------------------------------------------------------------
1 | # Importing Project Dependencies
2 | import numpy as np
3 | import os
4 | import cv2
5 | import tensorflow as tf
6 | from tensorflow import keras
7 | from tensorflow.keras import layers
8 |
9 | # Setting up config for GPU training (tf.test.is_gpu_available without parentheses is always truthy,
10 | # so guard on the actual device list instead)
11 | physical_devices = tf.config.list_physical_devices("GPU")
12 | if physical_devices:
13 |     tf.config.experimental.set_memory_growth(physical_devices[0], True)
13 |
14 | # Loading in all the images and assigning target classes
15 | def load_images(folder):
16 |     imgs, targets = [], []
17 |     for foldername in os.listdir(folder):
18 |         loc = folder + "/" + foldername
19 |         count = 0
20 |         for filename in os.listdir(loc):
21 |             img = cv2.imread(os.path.join(loc, filename))
22 |             if img is not None:  # skip unreadable files before resizing
23 |                 imgs.append(cv2.resize(img, (86, 86)))
24 |                 count += 1
25 |         targets.append(count)  # count only the images actually kept
26 |         print(foldername)
27 |     imgs = np.array(imgs)
28 |     y = np.zeros(imgs.shape[0]).astype(int)
29 |     j, n = 0, 0
30 |     for i in targets:
31 |         y[j:i + j] = n
32 |         n += 1
33 |         j = i + j
34 |     return imgs, y
34 |
35 |
36 | folder = "../input/drowsiness-detection"
37 | X, y = load_images(folder)
38 |
39 | # Splitting the data into 2 separate training and testing sets
40 | def train_test_split(X, y, testing_size=0.2):
41 |     no_of_rows = X.shape[0]
42 |     no_of_test_rows = int(no_of_rows * testing_size)
43 |     # Sample without replacement so no test row is duplicated or leaks into the training set
44 |     rand_row_num = np.random.choice(no_of_rows, no_of_test_rows, replace=False)
45 |
46 |     X_test = X[rand_row_num]
47 |     X_train = np.delete(X, rand_row_num, axis=0)
48 |
49 |     y_test = y[rand_row_num]
50 |     y_train = np.delete(y, rand_row_num, axis=0)
51 |
52 |     return X_train, y_train, X_test, y_test
52 |
53 |
54 | X_train, y_train, X_test, y_test = train_test_split(X, y, testing_size=0.2)
55 | print(X_train[0].shape)
56 |
57 | # Model building using sequential API
58 | model = keras.Sequential(
59 | [
60 | keras.Input(shape=(86, 86, 3)),
61 | layers.Conv2D(75, 3, padding='valid', activation='relu'),
62 | layers.MaxPooling2D(pool_size=(5, 5)),
63 | layers.Conv2D(64, 2, padding='valid', activation='relu'),
64 | layers.MaxPooling2D(pool_size=(2, 2)),
65 | layers.Conv2D(128, 3, padding='valid', activation='relu'),
66 | layers.MaxPooling2D(pool_size=(2, 2)),
67 | layers.Flatten(),
68 | layers.Dense(64, activation='relu'),
69 | layers.Dense(2, activation='softmax'),
70 | ]
71 | )
72 | model.summary()  # summary() prints the table itself; wrapping it in print() also prints 'None'
73 |
74 | # Model compilation with keeping track of accuracy while training & evaluation process
75 | model.compile(
76 | loss=keras.losses.SparseCategoricalCrossentropy(from_logits=False),
77 | optimizer=keras.optimizers.Adam(),
78 | metrics=['accuracy']
79 | )
80 |
81 | model.fit(X_train, y_train, batch_size=32, epochs=10)
82 |
83 | model.evaluate(X_test, y_test, batch_size=32)
84 |
85 | # Saving the model
86 | model.save('my_model (1).h5')
87 |
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/Eye_patch_extractor_&_GUI.py:
--------------------------------------------------------------------------------
1 | # Importing Project Dependencies
2 | import numpy as np
3 | import cv2
4 | import pandas as pd
5 | import tensorflow as tf
6 | from tensorflow import keras
7 | import time
8 | import winsound
9 | import streamlit as st
10 |
11 | # Setting up config for GPU usage (guard against machines with no GPU)
12 | physical_devices = tf.config.list_physical_devices("GPU")
13 | if physical_devices:
14 |     tf.config.experimental.set_memory_growth(physical_devices[0], True)
14 |
15 | # Using the Haar cascade face classifier from OpenCV
16 | face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
17 |
18 | # Loading the trained model for prediction purpose
19 | model = keras.models.load_model('my_model (1).h5')
20 |
21 | # Title for GUI
22 | st.title('Drowsiness Detection')
23 | img = []
24 |
25 | # Navigation Bar
26 | nav_choice = st.sidebar.radio('Navigation', ('Home', 'Sleep Detection', 'Help Us Improve'), index=0)
27 | # Home page
28 | if nav_choice == 'Home':
29 |     st.header('Prevents sleep-deprivation road accidents by alerting drowsy drivers.')
30 | st.image('ISHN0619_C3_pic.jpg')
31 |     st.markdown('According to a survey reported by The Times of India, about 40% of road '
32 |                 'accidents are caused '
33 |                 'by sleep-deprived and fatigued drivers. To address this issue, this app '
34 |                 'alerts such drivers with the help of deep learning models and computer vision. '
35 |                 '', unsafe_allow_html=True)
36 | st.image('sleep.jfif', width=300)
37 |     st.markdown('<h3>How to use?</h3>'
38 |                 '<br>1. Go to the Sleep Detection page from the Navigation Side-Bar.'
39 |                 '<br>2. Make sure that you have a sufficient amount of light in your room.'
40 |                 '<br>3. Align yourself so that you are clearly visible in the web-cam and '
41 |                 'stay closer to the web-cam.'
42 |                 '<br>4. The web-cam will take 3 pictures of you, so keep your eyes in the same state'
43 |                 ' (open or closed) for about 5 seconds.'
44 |                 '<br>5. If your eyes are closed, the model will make a beep sound to alert you.'
45 |                 '<br>6. Otherwise, the model will continue taking your pictures at regular intervals of time.'
46 |                 '<br>For the purpose of the training process of the model, '
47 |                 'the dataset used is available '
48 |                 '<a href="https://www.kaggle.com/kutaykutlu/drowsiness-detection">here</a>'
49 |                 , unsafe_allow_html=True)
50 |
51 | # Sleep Detection page
52 | elif nav_choice == 'Sleep Detection':
53 | st.header('Image Prediction')
54 | cap = 0
55 | st.success('Please look at your web-cam, while following all the instructions given on the Home page.')
56 | st.warning(
57 | 'Keeping the eyes in the same state is important but you can obviously blink your eyes, if they are open!!!')
58 | b = st.progress(0)
59 | for i in range(100):
60 | time.sleep(0.0001)
61 | b.progress(i + 1)
62 |
63 | start = st.radio('Options', ('Start', 'Stop'), key='Start_pred', index=1)
64 |
65 | if start == 'Start':
66 | decision = 0
67 |         st.markdown('Detected Facial Region of Interest (ROI) and Extracted'
68 |                     ' Eye Features from the ROI', unsafe_allow_html=True)
69 |
70 | # Best of 3 mechanism for drowsiness detection
71 | for _ in range(3):
72 | cap = cv2.VideoCapture(0)
73 | ret, frame = cap.read()
74 | gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
75 | faces = face_cascade.detectMultiScale(gray, 1.3, 5)
76 | # Proposal of face region by the har cascade classifier
77 | for (x, y, w, h) in faces:
78 | cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 5)
79 |                 roi_gray = gray[y:y + h, x:x + w]  # rows span the height h, not the width w
80 | roi_color = frame[y:y + h, x:x + w]
81 | frame1 = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
82 |
83 | try:
84 |                     # Centroid method for extraction of eye-patch
85 | centx, centy = roi_color.shape[:2]
86 | centx //= 2
87 | centy //= 2
88 | eye_1 = roi_color[centy - 40: centy, centx - 70: centx]
89 | eye_1 = cv2.resize(eye_1, (86, 86))
90 | eye_2 = roi_color[centy - 40: centy, centx: centx + 70]
91 | eye_2 = cv2.resize(eye_2, (86, 86))
92 | cv2.rectangle(frame1, (x + centx - 60, y + centy - 40), (x + centx - 10, y + centy), (0, 255, 0), 5)
93 | cv2.rectangle(frame1, (x + centx + 10, y + centy - 40), (x + centx + 60, y + centy), (0, 255, 0), 5)
94 | preds_eye1 = model.predict(np.expand_dims(eye_1, axis=0))
95 | preds_eye2 = model.predict(np.expand_dims(eye_2, axis=0))
96 | e1, e2 = np.argmax(preds_eye1), np.argmax(preds_eye2)
97 |
98 | # Display of face image and extracted eye-patch
99 | img_container = st.beta_columns(4)
100 | img_container[0].image(frame1, width=250)
101 | img_container[2].image(cv2.cvtColor(eye_1, cv2.COLOR_BGR2RGB), width=150)
102 | img_container[3].image(cv2.cvtColor(eye_2, cv2.COLOR_BGR2RGB), width=150)
103 | print(e1, e2)
104 |
105 | # Decision variable for prediction
106 | if e1 == 1 or e2 == 1:
107 | pass
108 | else:
109 | decision += 1
110 |
111 | except NameError:
112 | st.warning('Hold your camera closer!!!\nTrying again in 2s')
113 | cap.release()
114 | time.sleep(1)
115 | continue
116 |
117 | except:
118 | cap.release()
119 | continue
120 |
121 | finally:
122 | cap.release()
123 |
124 | # If found drowsy, then make a beep sound to alert the driver
125 | if decision == 0:
126 | st.error('Eye(s) are closed')
127 | winsound.Beep(2500, 2000)
128 |
129 | else:
130 | st.success('Eyes are Opened')
131 | st.warning('Please select "Stop" and then "Start" to try again')
132 |
133 | # Help Us Improve page
134 | else:
135 | st.header('Help Us Improve')
136 | st.success('We would appreciate your Help!!!')
137 |     st.markdown(
138 |         'To make this app better, we would appreciate a small amount of your time. '
139 |         'Let us take you through some basic statistical analysis of this '
140 |         'model.<br>Accuracy with naked eyes = 99.5%<br>Accuracy with spectacles = 96.8%<br>'
141 |         'As we can see, accuracy with spectacles is not at all spectacular; hence, to make this app '
142 |         'better and usable in real-time situations, we require as much data as we can gather. '
143 |         , unsafe_allow_html=True)
144 | st.warning('NOTE: Your identity will be kept anonymous, and only your eye-patch will be extracted!!!')
145 | # Image upload
146 | img_upload = st.file_uploader('Upload Image Here', ['png', 'jpg', 'jpeg'])
147 | if img_upload is not None:
148 | prog = st.progress(0)
149 |         file_bytes = np.frombuffer(img_upload.read(), np.uint8)  # decode the uploaded bytes, not a file path
150 |         to_add = pd.DataFrame(cv2.imdecode(file_bytes, cv2.IMREAD_GRAYSCALE))
151 |
152 | # Save it in the database
153 | to_add.to_csv('Data_from_users.csv', mode='a', header=False, index=False, sep=';')
154 | for i in range(100):
155 | time.sleep(0.001)
156 | prog.progress(i + 1)
157 | st.success('Uploaded Successfully!!! Thank you for contributing.')
158 |
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/ISHN0619_C3_pic.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/ISHN0619_C3_pic.jpg
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/Pics-for-Readme/2021-04-10.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/Pics-for-Readme/2021-04-10.png
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (1).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (1).png
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (2).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (2).png
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (3).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (3).png
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (4).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/Pics-for-Readme/2021-05-03 (4).png
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/README.md:
--------------------------------------------------------------------------------
1 | # Drowsiness Detection Web app
2 | Prevents sleep-deprivation road accidents by alerting drowsy drivers.
3 | In this project, we have trained a convolutional neural network, to determine whether the eyes are closed or not, further, eye-patches are extracted from the face image to make all predictions. The dataset used for the training process can be accessed from the link given below:
4 | https://www.kaggle.com/kutaykutlu/drowsiness-detection
5 |
6 | ## Live Testing The App
7 | ```sh
8 | $ pip install -r requirements.txt
9 | ```
10 | Then download `Eye_patch_extractor_&_GUI.py`, `ISHN0619_C3_pic.jpg`, `my_model (1).h5` & `sleep.jfif` files.
11 | ```sh
12 | $ streamlit run Eye_patch_extractor_&_GUI.py
13 | ```
14 |
15 | ## Understanding The Problem Statement
16 | According to a survey by The Times of India, nearly 40% of road accidents are caused by sleep deprivation; fatigued drivers and long hours of duty driving are the major causes. To solve this issue, this app predicts whether or not the driver is sleeping and, if so, alerts the driver with a high-frequency sound. This project aims to prevent such sleep-deprivation accidents!
17 |
18 | ## Implementation
19 | 1. A Deep Learning Model will be trained to detect whether the driver's eyelids are open or not. This will be achieved by training a Convolutional Neural Network using Tensorflow.
20 | 2. A web-cam will take a picture of the driver's face at regular intervals of time and the patch of the driver's eye will be extracted from that picture. This task will be achieved by using OpenCV.
21 | 3. This patch will be further used for the prediction purpose with the model trained in step 1.
22 | 4. Using this prediction, if the driver's eyes are closed a beep sound will be played, to alert the driver.
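
The alert decision in step 4 can be sketched as a simple vote over a few consecutive web-cam captures (the web app uses a best-of-3 mechanism). This is only an illustrative sketch: the function name `is_drowsy` and the 1 = closed / 0 = open label convention are assumptions made for the example, not the app's exact API:

```python
def is_drowsy(captures, threshold=3):
    """captures: list of (left_eye, right_eye) class predictions, one pair per
    web-cam snapshot, assuming 1 = eye closed and 0 = eye open.
    Alert only when both eyes look closed in at least `threshold` captures."""
    closed_count = sum(1 for e1, e2 in captures if e1 == 1 and e2 == 1)
    return closed_count >= threshold

# Three snapshots with both eyes closed every time -> alert (beep)
print(is_drowsy([(1, 1), (1, 1), (1, 1)]))  # True
# One snapshot shows an open eye -> no alert
print(is_drowsy([(1, 1), (0, 1), (1, 1)]))  # False
```

Requiring agreement across several captures makes the alarm robust to a single mis-predicted frame or a normal blink.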
23 |
24 | ## Drowsiness Detection Model Insights
25 | This model is trained with the help of TensorFlow and is based upon convolutional neural networks. It takes RGB images of dimensions 86 × 86 × 3.
26 | ### Model Architecture
27 |
28 | | Layer Number | Layer Type | Output Shape | Trainable Parameters | Activation Function |
29 | | --- | --- | --- | --- | --- |
30 | | 1 | Conv2D | (None, 84, 84, 75) | 2,100 | ReLU |
31 | | 2 | MaxPooling2D | (None, 16, 16, 75) | 0 | None |
32 | | 3 | Conv2D | (None, 15, 15, 64) | 19,264 | ReLU |
33 | | 4 | MaxPooling2D | (None, 7, 7, 64) | 0 | None |
34 | | 5 | Conv2D | (None, 5, 5, 128) | 73,856 | ReLU |
35 | | 6 | MaxPooling2D | (None, 2, 2, 128) | 0 | None |
36 | | 7 | Flatten | (None, 512) | 0 | None |
37 | | 8 | Dense | (None, 64) | 32,832 | ReLU |
38 | | 9 | Dense | (None, 2) | 130 | Softmax |
38 |
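The trainable-parameter counts in the table above follow directly from the layer shapes. As a quick sanity check, they can be reproduced with plain arithmetic (no TensorFlow needed):

```python
def conv2d_params(kernel, in_channels, filters):
    # Each filter has kernel*kernel*in_channels weights plus one bias
    return (kernel * kernel * in_channels + 1) * filters

def dense_params(inputs, units):
    # One weight per input per unit, plus one bias per unit
    return (inputs + 1) * units

print(conv2d_params(3, 3, 75))    # layer 1: 2100
print(conv2d_params(2, 75, 64))   # layer 3: 19264
print(conv2d_params(3, 64, 128))  # layer 5: 73856
print(dense_params(512, 64))      # layer 8: 32832
print(dense_params(64, 2))        # layer 9: 130
```

Pooling and flatten layers contribute no trainable parameters, which matches the zeros in the table.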
39 |
40 | ## Eye Patch Extractor & Predictor Insights
41 | This model uses OpenCV's "Haar Cascade Classifier" for face detection; after the region of interest is proposed, it extracts the eye patch using the "Centroid Method" developed by us. These extracted features are then passed to the trained model for drowsiness detection.
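
A minimal NumPy sketch of this centroid method, mirroring the slicing used in `Eye_patch_extractor_&_GUI.py` (the pixel offsets come from that file; the dummy face array and function name here are only for illustration):

```python
import numpy as np

def extract_eye_patches(roi_color):
    """Cut two fixed-offset eye-patch candidates around the centre of the
    detected face ROI; the app then resizes each patch to 86x86 for the CNN."""
    centx, centy = roi_color.shape[0] // 2, roi_color.shape[1] // 2
    eye_1 = roi_color[centy - 40: centy, centx - 70: centx]  # left of centre
    eye_2 = roi_color[centy - 40: centy, centx: centx + 70]  # right of centre
    return eye_1, eye_2

face_roi = np.zeros((200, 200, 3), dtype=np.uint8)  # stand-in for a Haar-cascade face crop
eye_1, eye_2 = extract_eye_patches(face_roi)
print(eye_1.shape, eye_2.shape)  # (40, 70, 3) (40, 70, 3)
```

Because the eye band sits just above the geometric centre of a frontal face crop, fixed offsets work reasonably well without a dedicated eye detector.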
42 |
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/my_model (1).h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/my_model (1).h5
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/requirements.txt:
--------------------------------------------------------------------------------
1 | opencv_python==4.5.1.48
2 | tensorflow_gpu==2.3.1
3 | numpy==1.18.5
4 | pandas==1.1.4
5 | streamlit==0.73.1
6 | tensorflow==2.4.1
7 |
--------------------------------------------------------------------------------
/Drowsiness-Detection-web-app/sleep.jfif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Drowsiness-Detection-web-app/sleep.jfif
--------------------------------------------------------------------------------
/Kaggle 2020 Survey Analysis/README.md:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 | # Kaggle-Survey-2020-Competition
6 |
7 | The code in this repo is a submission for the annual Kaggle ML & DS Survey competition.
8 |
9 | # Our Goal
10 | In this competition, we were given the data from the annual survey that Kaggle conducts and were asked to explore it and come up with conclusions that might be unique and would not be visible from just glancing through the data.
11 |
12 | I performed EDA on features such as a respondent's age, country of residence, programming languages, preferred IDEs, job title, and much more. After this exploration, I was able to come up with several conclusions, which are mentioned at the end of the notebook.
13 |
14 | # Few Results and Observations
15 | 1. Most Kaggle users are quite young, aged between 22 and 29.
16 | 2. The number of men using Kaggle is huge compared to the number of women, but there has been significant recent growth in the number of female Kagglers.
17 | 3. Most Kagglers are from India, followed by the USA and other countries.
18 | 4. Most Kagglers have a Master's degree.
19 | 5. The majority of Kagglers are students, followed by data scientists and machine learning engineers.
20 | 6. Most Kagglers have 3-5 years of programming experience, followed by those with 1-2 years.
21 | 7. Most Kagglers use Python, followed by SQL and R.
22 | 8. The most preferred IDEs are Jupyter, VS Code, and PyCharm.
23 | 9. The most recommended language for data science beginners is Python, followed by R.
24 | 10. The most used data visualization libraries are Matplotlib and Seaborn.
25 | 11. The most used frameworks for machine learning and deep learning are scikit-learn, followed by TensorFlow along with Keras.
26 | 12. The most commonly used algorithms are regression-based, followed by decision trees, random forests, and so on.
27 | 13. Most users share their work on GitHub, followed by Kaggle and Colab. Many others prefer not to share their work.
28 | 14. Most users preferred Coursera for learning data science and machine learning, followed by Kaggle Courses and Udemy.
29 | 15. Most users rely on Kaggle notebooks and forums to stay updated on the latest data science and ML topics, followed by YouTube and blogs on various websites.
30 |
--------------------------------------------------------------------------------
/Kaggle 2020 Survey Analysis/images.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Kaggle 2020 Survey Analysis/images.png
--------------------------------------------------------------------------------
/Noise-Removal-and-OCR-Using-CNNs-Autoencoders/images/result.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/Noise-Removal-and-OCR-Using-CNNs-Autoencoders/images/result.jpg
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Machine Learning @ DSC VIT (2020-2021)
2 | ---
3 | An open-source repository that was active during the 2020-2021 term at DSC VIT, Bhopal. This repository consists of multiple ML and DL projects for people to learn from, and students could contribute to it during the 2020-2021 season of DSC.
4 | **Currently, the repository is not maintained. All contributors are requested to contribute to the new repositories for the 2021-2022 term.**
5 |
6 | ### PROJECTS ON SHOWCASE:
7 | ---
8 | Here's a list of the showcased projects in this repo:
9 | ### 🔴Noise Removal and OCR Using CNNs and Autoencoders
10 |
11 |
12 |
13 |
14 |
15 |
16 | > This project deals with an age-old problem in optical character recognition: noise in image data that can lead to misrepresentation and inaccuracies during inference. Here, we implement an autoencoder model using TensorFlow and Keras that removes noise and distortions from the image data for better OCR.
17 |
18 | Link to the project.
19 |
20 | ---
21 | ### 🔴Book Recommender System
22 |
23 |
24 |
25 |
26 |
27 |
28 | > A Machine Learning project that uses the KNN algorithm to recommend books to users based on the average rating of each book in the data, the language it is written in, and its rating count.
29 |
30 | Link to the project.
31 |
32 | ---
33 | ### 🔴Used Car Price Predictor (Web App)
34 |
35 |
36 |
37 |
38 |
39 |
40 | > A Flask-based web application that uses Machine Learning to predict the selling price of a car.
41 |
42 | Link to the project.
43 |
44 | ---
45 | ### 🔴Drowsiness Detection Web App
46 |
47 |
48 |
49 |
50 |
51 |
52 | > A model that aims to prevent road accidents caused by sleep deprivation by alerting drowsy drivers. In this project, a convolutional neural network has been trained to determine whether the driver's eyes are closed. Eye patches are extracted from the face image to make the predictions.
53 |
54 | Link to the project.
55 |
56 | ---
57 | ### 🔴Celestial Bodies' Classification
58 |
59 |
60 |
61 |
62 |
63 |
64 |
65 | > Deep space comprises innumerable celestial bodies: planets, stars, galaxies, asteroids, etc. As a result, it is not possible to label each of these celestial bodies manually. This is where machine learning shines, allowing scientists to label a celestial body based on a variety of features, such as its gradient and standard deviation in two-dimensional space. In this project, several models based on these principles have been implemented to classify celestial bodies by their features.
66 |
67 | Link to the project.
68 |
69 | ---
70 | ### 🔴Cervical Cancer Risk Prediction
71 |
72 |
73 |
74 |
75 |
76 |
77 | > Each year, hundreds of thousands of women lose their lives to cervical cancer, especially in developing countries where people often neglect or can't afford regular checkups and pap tests. This beginner-level project aims at early detection of cervical cancer risk using different machine learning algorithms.
78 |
79 | Link to the project.
80 |
81 | ---
82 | ### 🔴Twitch Top Streamers' Analysis
83 |
84 |
85 |
86 |
87 |
88 |
89 |
90 | > The objective of this project was to predict the number of followers gained by a streamer on Twitch based on their streaming data. Different visualization and data analysis techniques were used to understand the data and derive various insights from it.
91 |
92 | Link to the project.
93 |
94 | ---
95 | ### 🔴Kaggle 2020 Survey Analysis
96 |
97 |
98 |
99 |
100 |
101 |
102 | > A detailed data analysis for the Kaggle ML & DS survey.
103 |
104 | Link to the project.
105 |
106 | ---
107 | ### 🔴Campus Recruitment Analysis
108 |
109 |
110 |
111 |
112 |
113 |
114 | > A beginner-level project that uses different machine learning algorithms to predict whether a student will get placed in a job via campus recruitment.
115 |
116 | Link to the project.
117 |
118 | ---
119 | ### 🔴Titanic Disaster Prediction
120 |
121 |
122 |
123 |
124 |
125 | > The "Titanic - Machine Learning from Disaster" competition is an introductory Kaggle competition for getting started with machine learning.
126 | This Machine Learning/Data Analysis project uses a relatively small dataset that exemplifies many of the practical problems one deals with in machine learning projects. A great tutorial for beginners in ML.
127 |
128 | Link to the project.
129 |
130 |
131 | ### 👤PROJECT ADMIN(2020-2021)
132 |
133 | | |
134 | | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: |
135 | | **[Aman Sharma](https://www.linkedin.com/in/amansharma2910/)** |
136 | | |
137 | | *"Let The Dataset change your Mindset"* |
138 |
139 | > **_Need help?_**
140 | > **_Feel free to contact me @ [amansharma2910@gmail.com](mailto:amansharma2910@gmail.com?Subject=ML@DSC-VIT)_**
141 |
142 | ## Like This? Star ⭐ this Repo.
143 |
144 | > Aman Sharma, ML Lead @ DSC VIT © 2020
145 |
146 | [](https://github.com/amansharma2910)
147 | [](https://github.com/amansharma2910)
148 |
149 | ***
150 |
151 | ### CONTRIBUTORS ✨
152 | > 1. [amansharma2910](https://github.com/amansharma2910)
153 | >> __Contributions by amansharma2910:__
154 | >> [Noise Removal and OCR Using CNNs and Autoencoders](https://github.com/DSC-VIT-BHOPAL/Machine-Learning-Projects/blob/main/Noise-Removal-and-OCR-Using-CNNs-Autoencoders) ||
155 | >> [Cervical Cancer Risk Prediction](https://github.com/DSC-VIT-BHOPAL/Machine-Learning-Projects/tree/main/Cervical-Cancer-Risk)
156 |
157 | > 2. [AM1CODES](https://github.com/AM1CODES)
158 | >> __Contributions by AM1CODES:__
159 | >> [Book Recommender System](https://github.com/DSC-VIT-BHOPAL/Machine-Learning-Projects/tree/main/Book-Recommender-System) ||
160 | >> [Campus Recruitment Analysis](https://github.com/DSC-VIT-BHOPAL/Machine-Learning-Projects/tree/main/Campus-Recruitment-Analysis) ||
161 | >> [Kaggle 2020 Survey Analysis](https://github.com/DSCVITBHOPAL/Machine-Learning-Projects/tree/main/Kaggle%202020%20Survey%20Analysis)
162 |
163 | > 3. [kritikashah20](https://github.com/kritikashah20)
164 | >> __Contributions by kritikashah20:__
165 | >> [Celestial Bodies' Classification](https://github.com/DSCVITBHOPAL/Machine-Learning-Projects/blob/main/Celestial-Bodies-Classification)
166 |
167 | > 4. [Ani0202](https://github.com/Ani0202)
168 | >> __Contributions by Ani0202:__
169 | >> Added KNN, Logistic Regression and SVM Classifiers; improved Decision Tree Classifier in [Cervical Cancer Detection project](https://github.com/DSC-VIT-BHOPAL/Machine-Learning-Projects/blob/main/Cervical-Cancer-Risk)
170 |
171 | > 5. [AndroAvi](https://github.com/AndroAvi)
172 | >> __Contributions by AndroAvi:__
173 | >> [Titanic Survival Dataset Analysis](https://github.com/DSCVITBHOPAL/Machine-Learning-Projects/tree/main/Titanic-Survival-Dataset-Analysis)
174 |
175 |
176 | > 6. [Jackson-hub](https://github.com/Jackson-hub)
177 | >> __Contributions by Jackson-hub:__
178 | >> [Used Car Price Predictor](https://github.com/DSCVITBHOPAL/Machine-Learning-Projects/tree/main/UsedCarPricePredictor)
179 |
180 | > 7. [mayureshagashe2105](https://github.com/mayureshagashe2105)
181 | >> __Contributions by mayureshagashe2105:__
182 | >> [Drowsiness Detection Web App](https://github.com/DSCVITBHOPAL/Machine-Learning-Projects/tree/main/Drowsiness-Detection-web-app)
183 |
--------------------------------------------------------------------------------
/Titanic-Survival-Dataset-Analysis/README.md:
--------------------------------------------------------------------------------
1 | # Titanic-Survival-Dataset-Analysis
2 |
3 | A project depicting the analysis of the Titanic Dataset available at :
4 | https://www.kaggle.com/c/titanic
5 |
6 | The "Titanic-Machine Learning from Disaster" competition is an introductory kaggle competition for getting started with machine learning.
7 |
8 | This relatively small dataset exemplifies many of the practical problems that one deals with while doing machine learning projects, namely:
9 | 1. Correlated Data
10 | 2. Missing Values
11 | 3. Different kinds of features: categorical, ordinal, numeric, alphanumeric, and textual.
12 | 4. Outliers
13 |
14 | The goal here is to present and implement various methods of understanding the information contained inside the dataset that is explained with abstract information in several books and courses.
15 |
16 |
17 | The project contains the following files:
18 | 1. titanic_train.csv: The training data, which I've analyzed and trained the classifier on.
19 | 2. titanic_test.csv: The test data used to evaluate the classifier model.
20 | 3. ground_truth.csv: The actual class labels for titanic_test.csv, used for confidence and accuracy score calculation.
21 | 4. titanic_survival.ipynb: The Jupyter notebook, which contains the explanation and Python code.
22 |
23 |
24 | A basic outline of the workflow:
25 | 1. Clean the dataset by removing useless features and filling in missing values.
26 | 2. Visualize individual features' correlation with the label.
27 | 3. Plot feature grids to observe biases with survival, correlation within features and wrangle accordingly.
28 | 4. Categorize the feature types.
29 | 5. Convert non-numerical categorical features into numerical ones.
30 | 6. Convert quantitative features into ordinal features based on their correlation with survival.
31 | 7. Using the scaler and model object(s) of one's choice, pipeline the training process and print the average accuracy.
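Steps 5 and 6 of the outline above can be sketched in plain Python. The column names, mappings, and bin edges below are illustrative assumptions, not taken from the notebook.

```python
# A minimal sketch of steps 5 and 6: mapping a non-numerical categorical
# feature to numbers, and binning a quantitative feature into ordinal
# bands. The mappings and bin edges here are illustrative assumptions.

def encode_sex(value):
    # Step 5: non-numerical categorical -> numerical.
    return {"male": 0, "female": 1}[value]

def age_band(age):
    # Step 6: quantitative -> ordinal, using example bin edges.
    if age <= 16:
        return 0
    if age <= 32:
        return 1
    if age <= 48:
        return 2
    if age <= 64:
        return 3
    return 4

rows = [{"Sex": "male", "Age": 34.5}, {"Sex": "female", "Age": 47.0}]
encoded = [{"Sex": encode_sex(r["Sex"]), "Age": age_band(r["Age"])} for r in rows]
print(encoded)  # [{'Sex': 0, 'Age': 2}, {'Sex': 1, 'Age': 2}]
```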
--------------------------------------------------------------------------------
/Titanic-Survival-Dataset-Analysis/ground_truth.csv:
--------------------------------------------------------------------------------
1 | PassengerId,Survived
2 | 892,0
3 | 893,1
4 | 894,0
5 | 895,0
6 | 896,1
7 | 897,1
8 | 898,0
9 | 899,1
10 | 900,1
11 | 901,0
12 | 902,0
13 | 903,0
14 | 904,1
15 | 905,0
16 | 906,1
17 | 907,1
18 | 908,0
19 | 909,0
20 | 910,0
21 | 911,1
22 | 912,0
23 | 913,1
24 | 914,1
25 | 915,1
26 | 916,1
27 | 917,0
28 | 918,1
29 | 919,0
30 | 920,0
31 | 921,0
32 | 922,0
33 | 923,0
34 | 924,1
35 | 925,0
36 | 926,1
37 | 927,0
38 | 928,1
39 | 929,0
40 | 930,1
41 | 931,1
42 | 932,1
43 | 933,0
44 | 934,0
45 | 935,0
46 | 936,1
47 | 937,0
48 | 938,1
49 | 939,0
50 | 940,1
51 | 941,1
52 | 942,0
53 | 943,0
54 | 944,1
55 | 945,1
56 | 946,0
57 | 947,0
58 | 948,0
59 | 949,1
60 | 950,0
61 | 951,1
62 | 952,0
63 | 953,0
64 | 954,0
65 | 955,1
66 | 956,1
67 | 957,0
68 | 958,0
69 | 959,0
70 | 960,1
71 | 961,1
72 | 962,1
73 | 963,0
74 | 964,0
75 | 965,0
76 | 966,1
77 | 967,0
78 | 968,0
79 | 969,1
80 | 970,0
81 | 971,0
82 | 972,0
83 | 973,0
84 | 974,0
85 | 975,0
86 | 976,0
87 | 977,0
88 | 978,0
89 | 979,1
90 | 980,0
91 | 981,1
92 | 982,1
93 | 983,0
94 | 984,1
95 | 985,0
96 | 986,0
97 | 987,1
98 | 988,1
99 | 989,0
100 | 990,0
101 | 991,0
102 | 992,1
103 | 993,0
104 | 994,0
105 | 995,1
106 | 996,1
107 | 997,0
108 | 998,1
109 | 999,1
110 | 1000,0
111 | 1001,0
112 | 1002,0
113 | 1003,1
114 | 1004,0
115 | 1005,0
116 | 1006,0
117 | 1007,0
118 | 1008,0
119 | 1009,1
120 | 1010,0
121 | 1011,0
122 | 1012,1
123 | 1013,0
124 | 1014,1
125 | 1015,0
126 | 1016,1
127 | 1017,1
128 | 1018,0
129 | 1019,1
130 | 1020,0
131 | 1021,0
132 | 1022,0
133 | 1023,1
134 | 1024,0
135 | 1025,0
136 | 1026,0
137 | 1027,0
138 | 1028,0
139 | 1029,0
140 | 1030,1
141 | 1031,0
142 | 1032,0
143 | 1033,1
144 | 1034,0
145 | 1035,0
146 | 1036,0
147 | 1037,0
148 | 1038,0
149 | 1039,0
150 | 1040,0
151 | 1041,0
152 | 1042,1
153 | 1043,0
154 | 1044,0
155 | 1045,0
156 | 1046,0
157 | 1047,1
158 | 1048,1
159 | 1049,1
160 | 1050,0
161 | 1051,0
162 | 1052,1
163 | 1053,1
164 | 1054,1
165 | 1055,0
166 | 1056,0
167 | 1057,1
168 | 1058,0
169 | 1059,0
170 | 1060,1
171 | 1061,1
172 | 1062,0
173 | 1063,0
174 | 1064,0
175 | 1065,0
176 | 1066,0
177 | 1067,1
178 | 1068,1
179 | 1069,1
180 | 1070,1
181 | 1071,1
182 | 1072,0
183 | 1073,0
184 | 1074,1
185 | 1075,0
186 | 1076,1
187 | 1077,0
188 | 1078,1
189 | 1079,0
190 | 1080,0
191 | 1081,0
192 | 1082,0
193 | 1083,1
194 | 1084,0
195 | 1085,0
196 | 1086,1
197 | 1087,0
198 | 1088,1
199 | 1089,1
200 | 1090,0
201 | 1091,0
202 | 1092,1
203 | 1093,0
204 | 1094,0
205 | 1095,1
206 | 1096,0
207 | 1097,1
208 | 1098,0
209 | 1099,1
210 | 1100,1
211 | 1101,0
212 | 1102,0
213 | 1103,1
214 | 1104,0
215 | 1105,0
216 | 1106,0
217 | 1107,0
218 | 1108,0
219 | 1109,0
220 | 1110,1
221 | 1111,0
222 | 1112,1
223 | 1113,0
224 | 1114,1
225 | 1115,1
226 | 1116,1
227 | 1117,1
228 | 1118,1
229 | 1119,0
230 | 1120,0
231 | 1121,0
232 | 1122,0
233 | 1123,1
234 | 1124,0
235 | 1125,0
236 | 1126,0
237 | 1127,0
238 | 1128,0
239 | 1129,0
240 | 1130,0
241 | 1131,1
242 | 1132,1
243 | 1133,1
244 | 1134,1
245 | 1135,1
246 | 1136,0
247 | 1137,0
248 | 1138,0
249 | 1139,0
250 | 1140,1
251 | 1141,0
252 | 1142,1
253 | 1143,1
254 | 1144,0
255 | 1145,0
256 | 1146,0
257 | 1147,0
258 | 1148,0
259 | 1149,0
260 | 1150,1
261 | 1151,1
262 | 1152,1
263 | 1153,0
264 | 1154,1
265 | 1155,0
266 | 1156,1
267 | 1157,0
268 | 1158,0
269 | 1159,0
270 | 1160,1
271 | 1161,0
272 | 1162,0
273 | 1163,0
274 | 1164,1
275 | 1165,0
276 | 1166,0
277 | 1167,1
278 | 1168,0
279 | 1169,0
280 | 1170,0
281 | 1171,1
282 | 1172,0
283 | 1173,0
284 | 1174,0
285 | 1175,1
286 | 1176,0
287 | 1177,0
288 | 1178,0
289 | 1179,1
290 | 1180,0
291 | 1181,0
292 | 1182,1
293 | 1183,1
294 | 1184,0
295 | 1185,1
296 | 1186,0
297 | 1187,0
298 | 1188,1
299 | 1189,0
300 | 1190,0
301 | 1191,0
302 | 1192,1
303 | 1193,0
304 | 1194,0
305 | 1195,0
306 | 1196,1
307 | 1197,1
308 | 1198,0
309 | 1199,1
310 | 1200,0
311 | 1201,1
312 | 1202,0
313 | 1203,1
314 | 1204,0
315 | 1205,0
316 | 1206,1
317 | 1207,0
318 | 1208,0
319 | 1209,0
320 | 1210,0
321 | 1211,0
322 | 1212,0
323 | 1213,1
324 | 1214,0
325 | 1215,0
326 | 1216,1
327 | 1217,0
328 | 1218,1
329 | 1219,0
330 | 1220,0
331 | 1221,0
332 | 1222,1
333 | 1223,0
334 | 1224,0
335 | 1225,1
336 | 1226,0
337 | 1227,0
338 | 1228,0
339 | 1229,0
340 | 1230,0
341 | 1231,0
342 | 1232,0
343 | 1233,1
344 | 1234,0
345 | 1235,1
346 | 1236,0
347 | 1237,1
348 | 1238,0
349 | 1239,1
350 | 1240,0
351 | 1241,1
352 | 1242,1
353 | 1243,0
354 | 1244,0
355 | 1245,0
356 | 1246,1
357 | 1247,0
358 | 1248,1
359 | 1249,0
360 | 1250,1
361 | 1251,0
362 | 1252,0
363 | 1253,1
364 | 1254,1
365 | 1255,0
366 | 1256,1
367 | 1257,0
368 | 1258,0
369 | 1259,0
370 | 1260,1
371 | 1261,1
372 | 1262,0
373 | 1263,1
374 | 1264,1
375 | 1265,0
376 | 1266,1
377 | 1267,1
378 | 1268,0
379 | 1269,0
380 | 1270,0
381 | 1271,0
382 | 1272,0
383 | 1273,0
384 | 1274,0
385 | 1275,0
386 | 1276,0
387 | 1277,1
388 | 1278,0
389 | 1279,0
390 | 1280,0
391 | 1281,0
392 | 1282,0
393 | 1283,1
394 | 1284,0
395 | 1285,0
396 | 1286,1
397 | 1287,1
398 | 1288,0
399 | 1289,1
400 | 1290,0
401 | 1291,0
402 | 1292,1
403 | 1293,0
404 | 1294,1
405 | 1295,0
406 | 1296,1
407 | 1297,1
408 | 1298,0
409 | 1299,0
410 | 1300,1
411 | 1301,0
412 | 1302,0
413 | 1303,1
414 | 1304,0
415 | 1305,0
416 | 1306,1
417 | 1307,0
418 | 1308,0
419 | 1309,1
420 |
--------------------------------------------------------------------------------
/Titanic-Survival-Dataset-Analysis/titanic_test.csv:
--------------------------------------------------------------------------------
1 | PassengerId,Pclass,Name,Sex,Age,SibSp,Parch,Ticket,Fare,Cabin,Embarked
2 | 892,3,"Kelly, Mr. James",male,34.5,0,0,330911,7.8292,,Q
3 | 893,3,"Wilkes, Mrs. James (Ellen Needs)",female,47,1,0,363272,7,,S
4 | 894,2,"Myles, Mr. Thomas Francis",male,62,0,0,240276,9.6875,,Q
5 | 895,3,"Wirz, Mr. Albert",male,27,0,0,315154,8.6625,,S
6 | 896,3,"Hirvonen, Mrs. Alexander (Helga E Lindqvist)",female,22,1,1,3101298,12.2875,,S
7 | 897,3,"Svensson, Mr. Johan Cervin",male,14,0,0,7538,9.225,,S
8 | 898,3,"Connolly, Miss. Kate",female,30,0,0,330972,7.6292,,Q
9 | 899,2,"Caldwell, Mr. Albert Francis",male,26,1,1,248738,29,,S
10 | 900,3,"Abrahim, Mrs. Joseph (Sophie Halaut Easu)",female,18,0,0,2657,7.2292,,C
11 | 901,3,"Davies, Mr. John Samuel",male,21,2,0,A/4 48871,24.15,,S
12 | 902,3,"Ilieff, Mr. Ylio",male,,0,0,349220,7.8958,,S
13 | 903,1,"Jones, Mr. Charles Cresson",male,46,0,0,694,26,,S
14 | 904,1,"Snyder, Mrs. John Pillsbury (Nelle Stevenson)",female,23,1,0,21228,82.2667,B45,S
15 | 905,2,"Howard, Mr. Benjamin",male,63,1,0,24065,26,,S
16 | 906,1,"Chaffee, Mrs. Herbert Fuller (Carrie Constance Toogood)",female,47,1,0,W.E.P. 5734,61.175,E31,S
17 | 907,2,"del Carlo, Mrs. Sebastiano (Argenia Genovesi)",female,24,1,0,SC/PARIS 2167,27.7208,,C
18 | 908,2,"Keane, Mr. Daniel",male,35,0,0,233734,12.35,,Q
19 | 909,3,"Assaf, Mr. Gerios",male,21,0,0,2692,7.225,,C
20 | 910,3,"Ilmakangas, Miss. Ida Livija",female,27,1,0,STON/O2. 3101270,7.925,,S
21 | 911,3,"Assaf Khalil, Mrs. Mariana (Miriam"")""",female,45,0,0,2696,7.225,,C
22 | 912,1,"Rothschild, Mr. Martin",male,55,1,0,PC 17603,59.4,,C
23 | 913,3,"Olsen, Master. Artur Karl",male,9,0,1,C 17368,3.1708,,S
24 | 914,1,"Flegenheim, Mrs. Alfred (Antoinette)",female,,0,0,PC 17598,31.6833,,S
25 | 915,1,"Williams, Mr. Richard Norris II",male,21,0,1,PC 17597,61.3792,,C
26 | 916,1,"Ryerson, Mrs. Arthur Larned (Emily Maria Borie)",female,48,1,3,PC 17608,262.375,B57 B59 B63 B66,C
27 | 917,3,"Robins, Mr. Alexander A",male,50,1,0,A/5. 3337,14.5,,S
28 | 918,1,"Ostby, Miss. Helene Ragnhild",female,22,0,1,113509,61.9792,B36,C
29 | 919,3,"Daher, Mr. Shedid",male,22.5,0,0,2698,7.225,,C
30 | 920,1,"Brady, Mr. John Bertram",male,41,0,0,113054,30.5,A21,S
31 | 921,3,"Samaan, Mr. Elias",male,,2,0,2662,21.6792,,C
32 | 922,2,"Louch, Mr. Charles Alexander",male,50,1,0,SC/AH 3085,26,,S
33 | 923,2,"Jefferys, Mr. Clifford Thomas",male,24,2,0,C.A. 31029,31.5,,S
34 | 924,3,"Dean, Mrs. Bertram (Eva Georgetta Light)",female,33,1,2,C.A. 2315,20.575,,S
35 | 925,3,"Johnston, Mrs. Andrew G (Elizabeth Lily"" Watson)""",female,,1,2,W./C. 6607,23.45,,S
36 | 926,1,"Mock, Mr. Philipp Edmund",male,30,1,0,13236,57.75,C78,C
37 | 927,3,"Katavelas, Mr. Vassilios (Catavelas Vassilios"")""",male,18.5,0,0,2682,7.2292,,C
38 | 928,3,"Roth, Miss. Sarah A",female,,0,0,342712,8.05,,S
39 | 929,3,"Cacic, Miss. Manda",female,21,0,0,315087,8.6625,,S
40 | 930,3,"Sap, Mr. Julius",male,25,0,0,345768,9.5,,S
41 | 931,3,"Hee, Mr. Ling",male,,0,0,1601,56.4958,,S
42 | 932,3,"Karun, Mr. Franz",male,39,0,1,349256,13.4167,,C
43 | 933,1,"Franklin, Mr. Thomas Parham",male,,0,0,113778,26.55,D34,S
44 | 934,3,"Goldsmith, Mr. Nathan",male,41,0,0,SOTON/O.Q. 3101263,7.85,,S
45 | 935,2,"Corbett, Mrs. Walter H (Irene Colvin)",female,30,0,0,237249,13,,S
46 | 936,1,"Kimball, Mrs. Edwin Nelson Jr (Gertrude Parsons)",female,45,1,0,11753,52.5542,D19,S
47 | 937,3,"Peltomaki, Mr. Nikolai Johannes",male,25,0,0,STON/O 2. 3101291,7.925,,S
48 | 938,1,"Chevre, Mr. Paul Romaine",male,45,0,0,PC 17594,29.7,A9,C
49 | 939,3,"Shaughnessy, Mr. Patrick",male,,0,0,370374,7.75,,Q
50 | 940,1,"Bucknell, Mrs. William Robert (Emma Eliza Ward)",female,60,0,0,11813,76.2917,D15,C
51 | 941,3,"Coutts, Mrs. William (Winnie Minnie"" Treanor)""",female,36,0,2,C.A. 37671,15.9,,S
52 | 942,1,"Smith, Mr. Lucien Philip",male,24,1,0,13695,60,C31,S
53 | 943,2,"Pulbaum, Mr. Franz",male,27,0,0,SC/PARIS 2168,15.0333,,C
54 | 944,2,"Hocking, Miss. Ellen Nellie""""",female,20,2,1,29105,23,,S
55 | 945,1,"Fortune, Miss. Ethel Flora",female,28,3,2,19950,263,C23 C25 C27,S
56 | 946,2,"Mangiavacchi, Mr. Serafino Emilio",male,,0,0,SC/A.3 2861,15.5792,,C
57 | 947,3,"Rice, Master. Albert",male,10,4,1,382652,29.125,,Q
58 | 948,3,"Cor, Mr. Bartol",male,35,0,0,349230,7.8958,,S
59 | 949,3,"Abelseth, Mr. Olaus Jorgensen",male,25,0,0,348122,7.65,F G63,S
60 | 950,3,"Davison, Mr. Thomas Henry",male,,1,0,386525,16.1,,S
61 | 951,1,"Chaudanson, Miss. Victorine",female,36,0,0,PC 17608,262.375,B61,C
62 | 952,3,"Dika, Mr. Mirko",male,17,0,0,349232,7.8958,,S
63 | 953,2,"McCrae, Mr. Arthur Gordon",male,32,0,0,237216,13.5,,S
64 | 954,3,"Bjorklund, Mr. Ernst Herbert",male,18,0,0,347090,7.75,,S
65 | 955,3,"Bradley, Miss. Bridget Delia",female,22,0,0,334914,7.725,,Q
66 | 956,1,"Ryerson, Master. John Borie",male,13,2,2,PC 17608,262.375,B57 B59 B63 B66,C
67 | 957,2,"Corey, Mrs. Percy C (Mary Phyllis Elizabeth Miller)",female,,0,0,F.C.C. 13534,21,,S
68 | 958,3,"Burns, Miss. Mary Delia",female,18,0,0,330963,7.8792,,Q
69 | 959,1,"Moore, Mr. Clarence Bloomfield",male,47,0,0,113796,42.4,,S
70 | 960,1,"Tucker, Mr. Gilbert Milligan Jr",male,31,0,0,2543,28.5375,C53,C
71 | 961,1,"Fortune, Mrs. Mark (Mary McDougald)",female,60,1,4,19950,263,C23 C25 C27,S
72 | 962,3,"Mulvihill, Miss. Bertha E",female,24,0,0,382653,7.75,,Q
73 | 963,3,"Minkoff, Mr. Lazar",male,21,0,0,349211,7.8958,,S
74 | 964,3,"Nieminen, Miss. Manta Josefina",female,29,0,0,3101297,7.925,,S
75 | 965,1,"Ovies y Rodriguez, Mr. Servando",male,28.5,0,0,PC 17562,27.7208,D43,C
76 | 966,1,"Geiger, Miss. Amalie",female,35,0,0,113503,211.5,C130,C
77 | 967,1,"Keeping, Mr. Edwin",male,32.5,0,0,113503,211.5,C132,C
78 | 968,3,"Miles, Mr. Frank",male,,0,0,359306,8.05,,S
79 | 969,1,"Cornell, Mrs. Robert Clifford (Malvina Helen Lamson)",female,55,2,0,11770,25.7,C101,S
80 | 970,2,"Aldworth, Mr. Charles Augustus",male,30,0,0,248744,13,,S
81 | 971,3,"Doyle, Miss. Elizabeth",female,24,0,0,368702,7.75,,Q
82 | 972,3,"Boulos, Master. Akar",male,6,1,1,2678,15.2458,,C
83 | 973,1,"Straus, Mr. Isidor",male,67,1,0,PC 17483,221.7792,C55 C57,S
84 | 974,1,"Case, Mr. Howard Brown",male,49,0,0,19924,26,,S
85 | 975,3,"Demetri, Mr. Marinko",male,,0,0,349238,7.8958,,S
86 | 976,2,"Lamb, Mr. John Joseph",male,,0,0,240261,10.7083,,Q
87 | 977,3,"Khalil, Mr. Betros",male,,1,0,2660,14.4542,,C
88 | 978,3,"Barry, Miss. Julia",female,27,0,0,330844,7.8792,,Q
89 | 979,3,"Badman, Miss. Emily Louisa",female,18,0,0,A/4 31416,8.05,,S
90 | 980,3,"O'Donoghue, Ms. Bridget",female,,0,0,364856,7.75,,Q
91 | 981,2,"Wells, Master. Ralph Lester",male,2,1,1,29103,23,,S
92 | 982,3,"Dyker, Mrs. Adolf Fredrik (Anna Elisabeth Judith Andersson)",female,22,1,0,347072,13.9,,S
93 | 983,3,"Pedersen, Mr. Olaf",male,,0,0,345498,7.775,,S
94 | 984,1,"Davidson, Mrs. Thornton (Orian Hays)",female,27,1,2,F.C. 12750,52,B71,S
95 | 985,3,"Guest, Mr. Robert",male,,0,0,376563,8.05,,S
96 | 986,1,"Birnbaum, Mr. Jakob",male,25,0,0,13905,26,,C
97 | 987,3,"Tenglin, Mr. Gunnar Isidor",male,25,0,0,350033,7.7958,,S
98 | 988,1,"Cavendish, Mrs. Tyrell William (Julia Florence Siegel)",female,76,1,0,19877,78.85,C46,S
99 | 989,3,"Makinen, Mr. Kalle Edvard",male,29,0,0,STON/O 2. 3101268,7.925,,S
100 | 990,3,"Braf, Miss. Elin Ester Maria",female,20,0,0,347471,7.8542,,S
101 | 991,3,"Nancarrow, Mr. William Henry",male,33,0,0,A./5. 3338,8.05,,S
102 | 992,1,"Stengel, Mrs. Charles Emil Henry (Annie May Morris)",female,43,1,0,11778,55.4417,C116,C
103 | 993,2,"Weisz, Mr. Leopold",male,27,1,0,228414,26,,S
104 | 994,3,"Foley, Mr. William",male,,0,0,365235,7.75,,Q
105 | 995,3,"Johansson Palmquist, Mr. Oskar Leander",male,26,0,0,347070,7.775,,S
106 | 996,3,"Thomas, Mrs. Alexander (Thamine Thelma"")""",female,16,1,1,2625,8.5167,,C
107 | 997,3,"Holthen, Mr. Johan Martin",male,28,0,0,C 4001,22.525,,S
108 | 998,3,"Buckley, Mr. Daniel",male,21,0,0,330920,7.8208,,Q
109 | 999,3,"Ryan, Mr. Edward",male,,0,0,383162,7.75,,Q
110 | 1000,3,"Willer, Mr. Aaron (Abi Weller"")""",male,,0,0,3410,8.7125,,S
111 | 1001,2,"Swane, Mr. George",male,18.5,0,0,248734,13,F,S
112 | 1002,2,"Stanton, Mr. Samuel Ward",male,41,0,0,237734,15.0458,,C
113 | 1003,3,"Shine, Miss. Ellen Natalia",female,,0,0,330968,7.7792,,Q
114 | 1004,1,"Evans, Miss. Edith Corse",female,36,0,0,PC 17531,31.6792,A29,C
115 | 1005,3,"Buckley, Miss. Katherine",female,18.5,0,0,329944,7.2833,,Q
116 | 1006,1,"Straus, Mrs. Isidor (Rosalie Ida Blun)",female,63,1,0,PC 17483,221.7792,C55 C57,S
117 | 1007,3,"Chronopoulos, Mr. Demetrios",male,18,1,0,2680,14.4542,,C
118 | 1008,3,"Thomas, Mr. John",male,,0,0,2681,6.4375,,C
119 | 1009,3,"Sandstrom, Miss. Beatrice Irene",female,1,1,1,PP 9549,16.7,G6,S
120 | 1010,1,"Beattie, Mr. Thomson",male,36,0,0,13050,75.2417,C6,C
121 | 1011,2,"Chapman, Mrs. John Henry (Sara Elizabeth Lawry)",female,29,1,0,SC/AH 29037,26,,S
122 | 1012,2,"Watt, Miss. Bertha J",female,12,0,0,C.A. 33595,15.75,,S
123 | 1013,3,"Kiernan, Mr. John",male,,1,0,367227,7.75,,Q
124 | 1014,1,"Schabert, Mrs. Paul (Emma Mock)",female,35,1,0,13236,57.75,C28,C
125 | 1015,3,"Carver, Mr. Alfred John",male,28,0,0,392095,7.25,,S
126 | 1016,3,"Kennedy, Mr. John",male,,0,0,368783,7.75,,Q
127 | 1017,3,"Cribb, Miss. Laura Alice",female,17,0,1,371362,16.1,,S
128 | 1018,3,"Brobeck, Mr. Karl Rudolf",male,22,0,0,350045,7.7958,,S
129 | 1019,3,"McCoy, Miss. Alicia",female,,2,0,367226,23.25,,Q
130 | 1020,2,"Bowenur, Mr. Solomon",male,42,0,0,211535,13,,S
131 | 1021,3,"Petersen, Mr. Marius",male,24,0,0,342441,8.05,,S
132 | 1022,3,"Spinner, Mr. Henry John",male,32,0,0,STON/OQ. 369943,8.05,,S
133 | 1023,1,"Gracie, Col. Archibald IV",male,53,0,0,113780,28.5,C51,C
134 | 1024,3,"Lefebre, Mrs. Frank (Frances)",female,,0,4,4133,25.4667,,S
135 | 1025,3,"Thomas, Mr. Charles P",male,,1,0,2621,6.4375,,C
136 | 1026,3,"Dintcheff, Mr. Valtcho",male,43,0,0,349226,7.8958,,S
137 | 1027,3,"Carlsson, Mr. Carl Robert",male,24,0,0,350409,7.8542,,S
138 | 1028,3,"Zakarian, Mr. Mapriededer",male,26.5,0,0,2656,7.225,,C
139 | 1029,2,"Schmidt, Mr. August",male,26,0,0,248659,13,,S
140 | 1030,3,"Drapkin, Miss. Jennie",female,23,0,0,SOTON/OQ 392083,8.05,,S
141 | 1031,3,"Goodwin, Mr. Charles Frederick",male,40,1,6,CA 2144,46.9,,S
142 | 1032,3,"Goodwin, Miss. Jessie Allis",female,10,5,2,CA 2144,46.9,,S
143 | 1033,1,"Daniels, Miss. Sarah",female,33,0,0,113781,151.55,,S
144 | 1034,1,"Ryerson, Mr. Arthur Larned",male,61,1,3,PC 17608,262.375,B57 B59 B63 B66,C
145 | 1035,2,"Beauchamp, Mr. Henry James",male,28,0,0,244358,26,,S
146 | 1036,1,"Lindeberg-Lind, Mr. Erik Gustaf (Mr Edward Lingrey"")""",male,42,0,0,17475,26.55,,S
147 | 1037,3,"Vander Planke, Mr. Julius",male,31,3,0,345763,18,,S
148 | 1038,1,"Hilliard, Mr. Herbert Henry",male,,0,0,17463,51.8625,E46,S
149 | 1039,3,"Davies, Mr. Evan",male,22,0,0,SC/A4 23568,8.05,,S
150 | 1040,1,"Crafton, Mr. John Bertram",male,,0,0,113791,26.55,,S
151 | 1041,2,"Lahtinen, Rev. William",male,30,1,1,250651,26,,S
152 | 1042,1,"Earnshaw, Mrs. Boulton (Olive Potter)",female,23,0,1,11767,83.1583,C54,C
153 | 1043,3,"Matinoff, Mr. Nicola",male,,0,0,349255,7.8958,,C
154 | 1044,3,"Storey, Mr. Thomas",male,60.5,0,0,3701,,,S
155 | 1045,3,"Klasen, Mrs. (Hulda Kristina Eugenia Lofqvist)",female,36,0,2,350405,12.1833,,S
156 | 1046,3,"Asplund, Master. Filip Oscar",male,13,4,2,347077,31.3875,,S
157 | 1047,3,"Duquemin, Mr. Joseph",male,24,0,0,S.O./P.P. 752,7.55,,S
158 | 1048,1,"Bird, Miss. Ellen",female,29,0,0,PC 17483,221.7792,C97,S
159 | 1049,3,"Lundin, Miss. Olga Elida",female,23,0,0,347469,7.8542,,S
160 | 1050,1,"Borebank, Mr. John James",male,42,0,0,110489,26.55,D22,S
161 | 1051,3,"Peacock, Mrs. Benjamin (Edith Nile)",female,26,0,2,SOTON/O.Q. 3101315,13.775,,S
162 | 1052,3,"Smyth, Miss. Julia",female,,0,0,335432,7.7333,,Q
163 | 1053,3,"Touma, Master. Georges Youssef",male,7,1,1,2650,15.2458,,C
164 | 1054,2,"Wright, Miss. Marion",female,26,0,0,220844,13.5,,S
165 | 1055,3,"Pearce, Mr. Ernest",male,,0,0,343271,7,,S
166 | 1056,2,"Peruschitz, Rev. Joseph Maria",male,41,0,0,237393,13,,S
167 | 1057,3,"Kink-Heilmann, Mrs. Anton (Luise Heilmann)",female,26,1,1,315153,22.025,,S
168 | 1058,1,"Brandeis, Mr. Emil",male,48,0,0,PC 17591,50.4958,B10,C
169 | 1059,3,"Ford, Mr. Edward Watson",male,18,2,2,W./C. 6608,34.375,,S
170 | 1060,1,"Cassebeer, Mrs. Henry Arthur Jr (Eleanor Genevieve Fosdick)",female,,0,0,17770,27.7208,,C
171 | 1061,3,"Hellstrom, Miss. Hilda Maria",female,22,0,0,7548,8.9625,,S
172 | 1062,3,"Lithman, Mr. Simon",male,,0,0,S.O./P.P. 251,7.55,,S
173 | 1063,3,"Zakarian, Mr. Ortin",male,27,0,0,2670,7.225,,C
174 | 1064,3,"Dyker, Mr. Adolf Fredrik",male,23,1,0,347072,13.9,,S
175 | 1065,3,"Torfa, Mr. Assad",male,,0,0,2673,7.2292,,C
176 | 1066,3,"Asplund, Mr. Carl Oscar Vilhelm Gustafsson",male,40,1,5,347077,31.3875,,S
177 | 1067,2,"Brown, Miss. Edith Eileen",female,15,0,2,29750,39,,S
178 | 1068,2,"Sincock, Miss. Maude",female,20,0,0,C.A. 33112,36.75,,S
179 | 1069,1,"Stengel, Mr. Charles Emil Henry",male,54,1,0,11778,55.4417,C116,C
180 | 1070,2,"Becker, Mrs. Allen Oliver (Nellie E Baumgardner)",female,36,0,3,230136,39,F4,S
181 | 1071,1,"Compton, Mrs. Alexander Taylor (Mary Eliza Ingersoll)",female,64,0,2,PC 17756,83.1583,E45,C
182 | 1072,2,"McCrie, Mr. James Matthew",male,30,0,0,233478,13,,S
183 | 1073,1,"Compton, Mr. Alexander Taylor Jr",male,37,1,1,PC 17756,83.1583,E52,C
184 | 1074,1,"Marvin, Mrs. Daniel Warner (Mary Graham Carmichael Farquarson)",female,18,1,0,113773,53.1,D30,S
185 | 1075,3,"Lane, Mr. Patrick",male,,0,0,7935,7.75,,Q
186 | 1076,1,"Douglas, Mrs. Frederick Charles (Mary Helene Baxter)",female,27,1,1,PC 17558,247.5208,B58 B60,C
187 | 1077,2,"Maybery, Mr. Frank Hubert",male,40,0,0,239059,16,,S
188 | 1078,2,"Phillips, Miss. Alice Frances Louisa",female,21,0,1,S.O./P.P. 2,21,,S
189 | 1079,3,"Davies, Mr. Joseph",male,17,2,0,A/4 48873,8.05,,S
190 | 1080,3,"Sage, Miss. Ada",female,,8,2,CA. 2343,69.55,,S
191 | 1081,2,"Veal, Mr. James",male,40,0,0,28221,13,,S
192 | 1082,2,"Angle, Mr. William A",male,34,1,0,226875,26,,S
193 | 1083,1,"Salomon, Mr. Abraham L",male,,0,0,111163,26,,S
194 | 1084,3,"van Billiard, Master. Walter John",male,11.5,1,1,A/5. 851,14.5,,S
195 | 1085,2,"Lingane, Mr. John",male,61,0,0,235509,12.35,,Q
196 | 1086,2,"Drew, Master. Marshall Brines",male,8,0,2,28220,32.5,,S
197 | 1087,3,"Karlsson, Mr. Julius Konrad Eugen",male,33,0,0,347465,7.8542,,S
198 | 1088,1,"Spedden, Master. Robert Douglas",male,6,0,2,16966,134.5,E34,C
199 | 1089,3,"Nilsson, Miss. Berta Olivia",female,18,0,0,347066,7.775,,S
200 | 1090,2,"Baimbrigge, Mr. Charles Robert",male,23,0,0,C.A. 31030,10.5,,S
201 | 1091,3,"Rasmussen, Mrs. (Lena Jacobsen Solvang)",female,,0,0,65305,8.1125,,S
202 | 1092,3,"Murphy, Miss. Nora",female,,0,0,36568,15.5,,Q
203 | 1093,3,"Danbom, Master. Gilbert Sigvard Emanuel",male,0.33,0,2,347080,14.4,,S
204 | 1094,1,"Astor, Col. John Jacob",male,47,1,0,PC 17757,227.525,C62 C64,C
205 | 1095,2,"Quick, Miss. Winifred Vera",female,8,1,1,26360,26,,S
206 | 1096,2,"Andrew, Mr. Frank Thomas",male,25,0,0,C.A. 34050,10.5,,S
207 | 1097,1,"Omont, Mr. Alfred Fernand",male,,0,0,F.C. 12998,25.7417,,C
208 | 1098,3,"McGowan, Miss. Katherine",female,35,0,0,9232,7.75,,Q
209 | 1099,2,"Collett, Mr. Sidney C Stuart",male,24,0,0,28034,10.5,,S
210 | 1100,1,"Rosenbaum, Miss. Edith Louise",female,33,0,0,PC 17613,27.7208,A11,C
211 | 1101,3,"Delalic, Mr. Redjo",male,25,0,0,349250,7.8958,,S
212 | 1102,3,"Andersen, Mr. Albert Karvin",male,32,0,0,C 4001,22.525,,S
213 | 1103,3,"Finoli, Mr. Luigi",male,,0,0,SOTON/O.Q. 3101308,7.05,,S
214 | 1104,2,"Deacon, Mr. Percy William",male,17,0,0,S.O.C. 14879,73.5,,S
215 | 1105,2,"Howard, Mrs. Benjamin (Ellen Truelove Arman)",female,60,1,0,24065,26,,S
216 | 1106,3,"Andersson, Miss. Ida Augusta Margareta",female,38,4,2,347091,7.775,,S
217 | 1107,1,"Head, Mr. Christopher",male,42,0,0,113038,42.5,B11,S
218 | 1108,3,"Mahon, Miss. Bridget Delia",female,,0,0,330924,7.8792,,Q
219 | 1109,1,"Wick, Mr. George Dennick",male,57,1,1,36928,164.8667,,S
220 | 1110,1,"Widener, Mrs. George Dunton (Eleanor Elkins)",female,50,1,1,113503,211.5,C80,C
221 | 1111,3,"Thomson, Mr. Alexander Morrison",male,,0,0,32302,8.05,,S
222 | 1112,2,"Duran y More, Miss. Florentina",female,30,1,0,SC/PARIS 2148,13.8583,,C
223 | 1113,3,"Reynolds, Mr. Harold J",male,21,0,0,342684,8.05,,S
224 | 1114,2,"Cook, Mrs. (Selena Rogers)",female,22,0,0,W./C. 14266,10.5,F33,S
225 | 1115,3,"Karlsson, Mr. Einar Gervasius",male,21,0,0,350053,7.7958,,S
226 | 1116,1,"Candee, Mrs. Edward (Helen Churchill Hungerford)",female,53,0,0,PC 17606,27.4458,,C
227 | 1117,3,"Moubarek, Mrs. George (Omine Amenia"" Alexander)""",female,,0,2,2661,15.2458,,C
228 | 1118,3,"Asplund, Mr. Johan Charles",male,23,0,0,350054,7.7958,,S
229 | 1119,3,"McNeill, Miss. Bridget",female,,0,0,370368,7.75,,Q
230 | 1120,3,"Everett, Mr. Thomas James",male,40.5,0,0,C.A. 6212,15.1,,S
231 | 1121,2,"Hocking, Mr. Samuel James Metcalfe",male,36,0,0,242963,13,,S
232 | 1122,2,"Sweet, Mr. George Frederick",male,14,0,0,220845,65,,S
233 | 1123,1,"Willard, Miss. Constance",female,21,0,0,113795,26.55,,S
234 | 1124,3,"Wiklund, Mr. Karl Johan",male,21,1,0,3101266,6.4958,,S
235 | 1125,3,"Linehan, Mr. Michael",male,,0,0,330971,7.8792,,Q
236 | 1126,1,"Cumings, Mr. John Bradley",male,39,1,0,PC 17599,71.2833,C85,C
237 | 1127,3,"Vendel, Mr. Olof Edvin",male,20,0,0,350416,7.8542,,S
238 | 1128,1,"Warren, Mr. Frank Manley",male,64,1,0,110813,75.25,D37,C
239 | 1129,3,"Baccos, Mr. Raffull",male,20,0,0,2679,7.225,,C
240 | 1130,2,"Hiltunen, Miss. Marta",female,18,1,1,250650,13,,S
241 | 1131,1,"Douglas, Mrs. Walter Donald (Mahala Dutton)",female,48,1,0,PC 17761,106.425,C86,C
242 | 1132,1,"Lindstrom, Mrs. Carl Johan (Sigrid Posse)",female,55,0,0,112377,27.7208,,C
243 | 1133,2,"Christy, Mrs. (Alice Frances)",female,45,0,2,237789,30,,S
244 | 1134,1,"Spedden, Mr. Frederic Oakley",male,45,1,1,16966,134.5,E34,C
245 | 1135,3,"Hyman, Mr. Abraham",male,,0,0,3470,7.8875,,S
246 | 1136,3,"Johnston, Master. William Arthur Willie""""",male,,1,2,W./C. 6607,23.45,,S
247 | 1137,1,"Kenyon, Mr. Frederick R",male,41,1,0,17464,51.8625,D21,S
248 | 1138,2,"Karnes, Mrs. J Frank (Claire Bennett)",female,22,0,0,F.C.C. 13534,21,,S
249 | 1139,2,"Drew, Mr. James Vivian",male,42,1,1,28220,32.5,,S
250 | 1140,2,"Hold, Mrs. Stephen (Annie Margaret Hill)",female,29,1,0,26707,26,,S
251 | 1141,3,"Khalil, Mrs. Betros (Zahie Maria"" Elias)""",female,,1,0,2660,14.4542,,C
252 | 1142,2,"West, Miss. Barbara J",female,0.92,1,2,C.A. 34651,27.75,,S
253 | 1143,3,"Abrahamsson, Mr. Abraham August Johannes",male,20,0,0,SOTON/O2 3101284,7.925,,S
254 | 1144,1,"Clark, Mr. Walter Miller",male,27,1,0,13508,136.7792,C89,C
255 | 1145,3,"Salander, Mr. Karl Johan",male,24,0,0,7266,9.325,,S
256 | 1146,3,"Wenzel, Mr. Linhart",male,32.5,0,0,345775,9.5,,S
257 | 1147,3,"MacKay, Mr. George William",male,,0,0,C.A. 42795,7.55,,S
258 | 1148,3,"Mahon, Mr. John",male,,0,0,AQ/4 3130,7.75,,Q
259 | 1149,3,"Niklasson, Mr. Samuel",male,28,0,0,363611,8.05,,S
260 | 1150,2,"Bentham, Miss. Lilian W",female,19,0,0,28404,13,,S
261 | 1151,3,"Midtsjo, Mr. Karl Albert",male,21,0,0,345501,7.775,,S
262 | 1152,3,"de Messemaeker, Mr. Guillaume Joseph",male,36.5,1,0,345572,17.4,,S
263 | 1153,3,"Nilsson, Mr. August Ferdinand",male,21,0,0,350410,7.8542,,S
264 | 1154,2,"Wells, Mrs. Arthur Henry (Addie"" Dart Trevaskis)""",female,29,0,2,29103,23,,S
265 | 1155,3,"Klasen, Miss. Gertrud Emilia",female,1,1,1,350405,12.1833,,S
266 | 1156,2,"Portaluppi, Mr. Emilio Ilario Giuseppe",male,30,0,0,C.A. 34644,12.7375,,C
267 | 1157,3,"Lyntakoff, Mr. Stanko",male,,0,0,349235,7.8958,,S
268 | 1158,1,"Chisholm, Mr. Roderick Robert Crispin",male,,0,0,112051,0,,S
269 | 1159,3,"Warren, Mr. Charles William",male,,0,0,C.A. 49867,7.55,,S
270 | 1160,3,"Howard, Miss. May Elizabeth",female,,0,0,A. 2. 39186,8.05,,S
271 | 1161,3,"Pokrnic, Mr. Mate",male,17,0,0,315095,8.6625,,S
272 | 1162,1,"McCaffry, Mr. Thomas Francis",male,46,0,0,13050,75.2417,C6,C
273 | 1163,3,"Fox, Mr. Patrick",male,,0,0,368573,7.75,,Q
274 | 1164,1,"Clark, Mrs. Walter Miller (Virginia McDowell)",female,26,1,0,13508,136.7792,C89,C
275 | 1165,3,"Lennon, Miss. Mary",female,,1,0,370371,15.5,,Q
276 | 1166,3,"Saade, Mr. Jean Nassr",male,,0,0,2676,7.225,,C
277 | 1167,2,"Bryhl, Miss. Dagmar Jenny Ingeborg ",female,20,1,0,236853,26,,S
278 | 1168,2,"Parker, Mr. Clifford Richard",male,28,0,0,SC 14888,10.5,,S
279 | 1169,2,"Faunthorpe, Mr. Harry",male,40,1,0,2926,26,,S
280 | 1170,2,"Ware, Mr. John James",male,30,1,0,CA 31352,21,,S
281 | 1171,2,"Oxenham, Mr. Percy Thomas",male,22,0,0,W./C. 14260,10.5,,S
282 | 1172,3,"Oreskovic, Miss. Jelka",female,23,0,0,315085,8.6625,,S
283 | 1173,3,"Peacock, Master. Alfred Edward",male,0.75,1,1,SOTON/O.Q. 3101315,13.775,,S
284 | 1174,3,"Fleming, Miss. Honora",female,,0,0,364859,7.75,,Q
285 | 1175,3,"Touma, Miss. Maria Youssef",female,9,1,1,2650,15.2458,,C
286 | 1176,3,"Rosblom, Miss. Salli Helena",female,2,1,1,370129,20.2125,,S
287 | 1177,3,"Dennis, Mr. William",male,36,0,0,A/5 21175,7.25,,S
288 | 1178,3,"Franklin, Mr. Charles (Charles Fardon)",male,,0,0,SOTON/O.Q. 3101314,7.25,,S
289 | 1179,1,"Snyder, Mr. John Pillsbury",male,24,1,0,21228,82.2667,B45,S
290 | 1180,3,"Mardirosian, Mr. Sarkis",male,,0,0,2655,7.2292,F E46,C
291 | 1181,3,"Ford, Mr. Arthur",male,,0,0,A/5 1478,8.05,,S
292 | 1182,1,"Rheims, Mr. George Alexander Lucien",male,,0,0,PC 17607,39.6,,S
293 | 1183,3,"Daly, Miss. Margaret Marcella Maggie""""",female,30,0,0,382650,6.95,,Q
294 | 1184,3,"Nasr, Mr. Mustafa",male,,0,0,2652,7.2292,,C
295 | 1185,1,"Dodge, Dr. Washington",male,53,1,1,33638,81.8583,A34,S
296 | 1186,3,"Wittevrongel, Mr. Camille",male,36,0,0,345771,9.5,,S
297 | 1187,3,"Angheloff, Mr. Minko",male,26,0,0,349202,7.8958,,S
298 | 1188,2,"Laroche, Miss. Louise",female,1,1,2,SC/Paris 2123,41.5792,,C
299 | 1189,3,"Samaan, Mr. Hanna",male,,2,0,2662,21.6792,,C
300 | 1190,1,"Loring, Mr. Joseph Holland",male,30,0,0,113801,45.5,,S
301 | 1191,3,"Johansson, Mr. Nils",male,29,0,0,347467,7.8542,,S
302 | 1192,3,"Olsson, Mr. Oscar Wilhelm",male,32,0,0,347079,7.775,,S
303 | 1193,2,"Malachard, Mr. Noel",male,,0,0,237735,15.0458,D,C
304 | 1194,2,"Phillips, Mr. Escott Robert",male,43,0,1,S.O./P.P. 2,21,,S
305 | 1195,3,"Pokrnic, Mr. Tome",male,24,0,0,315092,8.6625,,S
306 | 1196,3,"McCarthy, Miss. Catherine Katie""""",female,,0,0,383123,7.75,,Q
307 | 1197,1,"Crosby, Mrs. Edward Gifford (Catherine Elizabeth Halstead)",female,64,1,1,112901,26.55,B26,S
308 | 1198,1,"Allison, Mr. Hudson Joshua Creighton",male,30,1,2,113781,151.55,C22 C26,S
309 | 1199,3,"Aks, Master. Philip Frank",male,0.83,0,1,392091,9.35,,S
310 | 1200,1,"Hays, Mr. Charles Melville",male,55,1,1,12749,93.5,B69,S
311 | 1201,3,"Hansen, Mrs. Claus Peter (Jennie L Howard)",female,45,1,0,350026,14.1083,,S
312 | 1202,3,"Cacic, Mr. Jego Grga",male,18,0,0,315091,8.6625,,S
313 | 1203,3,"Vartanian, Mr. David",male,22,0,0,2658,7.225,,C
314 | 1204,3,"Sadowitz, Mr. Harry",male,,0,0,LP 1588,7.575,,S
315 | 1205,3,"Carr, Miss. Jeannie",female,37,0,0,368364,7.75,,Q
316 | 1206,1,"White, Mrs. John Stuart (Ella Holmes)",female,55,0,0,PC 17760,135.6333,C32,C
317 | 1207,3,"Hagardon, Miss. Kate",female,17,0,0,AQ/3. 30631,7.7333,,Q
318 | 1208,1,"Spencer, Mr. William Augustus",male,57,1,0,PC 17569,146.5208,B78,C
319 | 1209,2,"Rogers, Mr. Reginald Harry",male,19,0,0,28004,10.5,,S
320 | 1210,3,"Jonsson, Mr. Nils Hilding",male,27,0,0,350408,7.8542,,S
321 | 1211,2,"Jefferys, Mr. Ernest Wilfred",male,22,2,0,C.A. 31029,31.5,,S
322 | 1212,3,"Andersson, Mr. Johan Samuel",male,26,0,0,347075,7.775,,S
323 | 1213,3,"Krekorian, Mr. Neshan",male,25,0,0,2654,7.2292,F E57,C
324 | 1214,2,"Nesson, Mr. Israel",male,26,0,0,244368,13,F2,S
325 | 1215,1,"Rowe, Mr. Alfred G",male,33,0,0,113790,26.55,,S
326 | 1216,1,"Kreuchen, Miss. Emilie",female,39,0,0,24160,211.3375,,S
327 | 1217,3,"Assam, Mr. Ali",male,23,0,0,SOTON/O.Q. 3101309,7.05,,S
328 | 1218,2,"Becker, Miss. Ruth Elizabeth",female,12,2,1,230136,39,F4,S
329 | 1219,1,"Rosenshine, Mr. George (Mr George Thorne"")""",male,46,0,0,PC 17585,79.2,,C
330 | 1220,2,"Clarke, Mr. Charles Valentine",male,29,1,0,2003,26,,S
331 | 1221,2,"Enander, Mr. Ingvar",male,21,0,0,236854,13,,S
332 | 1222,2,"Davies, Mrs. John Morgan (Elizabeth Agnes Mary White) ",female,48,0,2,C.A. 33112,36.75,,S
333 | 1223,1,"Dulles, Mr. William Crothers",male,39,0,0,PC 17580,29.7,A18,C
334 | 1224,3,"Thomas, Mr. Tannous",male,,0,0,2684,7.225,,C
335 | 1225,3,"Nakid, Mrs. Said (Waika Mary"" Mowad)""",female,19,1,1,2653,15.7417,,C
336 | 1226,3,"Cor, Mr. Ivan",male,27,0,0,349229,7.8958,,S
337 | 1227,1,"Maguire, Mr. John Edward",male,30,0,0,110469,26,C106,S
338 | 1228,2,"de Brito, Mr. Jose Joaquim",male,32,0,0,244360,13,,S
339 | 1229,3,"Elias, Mr. Joseph",male,39,0,2,2675,7.2292,,C
340 | 1230,2,"Denbury, Mr. Herbert",male,25,0,0,C.A. 31029,31.5,,S
341 | 1231,3,"Betros, Master. Seman",male,,0,0,2622,7.2292,,C
342 | 1232,2,"Fillbrook, Mr. Joseph Charles",male,18,0,0,C.A. 15185,10.5,,S
343 | 1233,3,"Lundstrom, Mr. Thure Edvin",male,32,0,0,350403,7.5792,,S
344 | 1234,3,"Sage, Mr. John George",male,,1,9,CA. 2343,69.55,,S
345 | 1235,1,"Cardeza, Mrs. James Warburton Martinez (Charlotte Wardle Drake)",female,58,0,1,PC 17755,512.3292,B51 B53 B55,C
346 | 1236,3,"van Billiard, Master. James William",male,,1,1,A/5. 851,14.5,,S
347 | 1237,3,"Abelseth, Miss. Karen Marie",female,16,0,0,348125,7.65,,S
348 | 1238,2,"Botsford, Mr. William Hull",male,26,0,0,237670,13,,S
349 | 1239,3,"Whabee, Mrs. George Joseph (Shawneene Abi-Saab)",female,38,0,0,2688,7.2292,,C
350 | 1240,2,"Giles, Mr. Ralph",male,24,0,0,248726,13.5,,S
351 | 1241,2,"Walcroft, Miss. Nellie",female,31,0,0,F.C.C. 13528,21,,S
352 | 1242,1,"Greenfield, Mrs. Leo David (Blanche Strouse)",female,45,0,1,PC 17759,63.3583,D10 D12,C
353 | 1243,2,"Stokes, Mr. Philip Joseph",male,25,0,0,F.C.C. 13540,10.5,,S
354 | 1244,2,"Dibden, Mr. William",male,18,0,0,S.O.C. 14879,73.5,,S
355 | 1245,2,"Herman, Mr. Samuel",male,49,1,2,220845,65,,S
356 | 1246,3,"Dean, Miss. Elizabeth Gladys Millvina""""",female,0.17,1,2,C.A. 2315,20.575,,S
357 | 1247,1,"Julian, Mr. Henry Forbes",male,50,0,0,113044,26,E60,S
358 | 1248,1,"Brown, Mrs. John Murray (Caroline Lane Lamson)",female,59,2,0,11769,51.4792,C101,S
359 | 1249,3,"Lockyer, Mr. Edward",male,,0,0,1222,7.8792,,S
360 | 1250,3,"O'Keefe, Mr. Patrick",male,,0,0,368402,7.75,,Q
361 | 1251,3,"Lindell, Mrs. Edvard Bengtsson (Elin Gerda Persson)",female,30,1,0,349910,15.55,,S
362 | 1252,3,"Sage, Master. William Henry",male,14.5,8,2,CA. 2343,69.55,,S
363 | 1253,2,"Mallet, Mrs. Albert (Antoinette Magnin)",female,24,1,1,S.C./PARIS 2079,37.0042,,C
364 | 1254,2,"Ware, Mrs. John James (Florence Louise Long)",female,31,0,0,CA 31352,21,,S
365 | 1255,3,"Strilic, Mr. Ivan",male,27,0,0,315083,8.6625,,S
366 | 1256,1,"Harder, Mrs. George Achilles (Dorothy Annan)",female,25,1,0,11765,55.4417,E50,C
367 | 1257,3,"Sage, Mrs. John (Annie Bullen)",female,,1,9,CA. 2343,69.55,,S
368 | 1258,3,"Caram, Mr. Joseph",male,,1,0,2689,14.4583,,C
369 | 1259,3,"Riihivouri, Miss. Susanna Juhantytar Sanni""""",female,22,0,0,3101295,39.6875,,S
370 | 1260,1,"Gibson, Mrs. Leonard (Pauline C Boeson)",female,45,0,1,112378,59.4,,C
371 | 1261,2,"Pallas y Castello, Mr. Emilio",male,29,0,0,SC/PARIS 2147,13.8583,,C
372 | 1262,2,"Giles, Mr. Edgar",male,21,1,0,28133,11.5,,S
373 | 1263,1,"Wilson, Miss. Helen Alice",female,31,0,0,16966,134.5,E39 E41,C
374 | 1264,1,"Ismay, Mr. Joseph Bruce",male,49,0,0,112058,0,B52 B54 B56,S
375 | 1265,2,"Harbeck, Mr. William H",male,44,0,0,248746,13,,S
376 | 1266,1,"Dodge, Mrs. Washington (Ruth Vidaver)",female,54,1,1,33638,81.8583,A34,S
377 | 1267,1,"Bowen, Miss. Grace Scott",female,45,0,0,PC 17608,262.375,,C
378 | 1268,3,"Kink, Miss. Maria",female,22,2,0,315152,8.6625,,S
379 | 1269,2,"Cotterill, Mr. Henry Harry""""",male,21,0,0,29107,11.5,,S
380 | 1270,1,"Hipkins, Mr. William Edward",male,55,0,0,680,50,C39,S
381 | 1271,3,"Asplund, Master. Carl Edgar",male,5,4,2,347077,31.3875,,S
382 | 1272,3,"O'Connor, Mr. Patrick",male,,0,0,366713,7.75,,Q
383 | 1273,3,"Foley, Mr. Joseph",male,26,0,0,330910,7.8792,,Q
384 | 1274,3,"Risien, Mrs. Samuel (Emma)",female,,0,0,364498,14.5,,S
385 | 1275,3,"McNamee, Mrs. Neal (Eileen O'Leary)",female,19,1,0,376566,16.1,,S
386 | 1276,2,"Wheeler, Mr. Edwin Frederick""""",male,,0,0,SC/PARIS 2159,12.875,,S
387 | 1277,2,"Herman, Miss. Kate",female,24,1,2,220845,65,,S
388 | 1278,3,"Aronsson, Mr. Ernst Axel Algot",male,24,0,0,349911,7.775,,S
389 | 1279,2,"Ashby, Mr. John",male,57,0,0,244346,13,,S
390 | 1280,3,"Canavan, Mr. Patrick",male,21,0,0,364858,7.75,,Q
391 | 1281,3,"Palsson, Master. Paul Folke",male,6,3,1,349909,21.075,,S
392 | 1282,1,"Payne, Mr. Vivian Ponsonby",male,23,0,0,12749,93.5,B24,S
393 | 1283,1,"Lines, Mrs. Ernest H (Elizabeth Lindsey James)",female,51,0,1,PC 17592,39.4,D28,S
394 | 1284,3,"Abbott, Master. Eugene Joseph",male,13,0,2,C.A. 2673,20.25,,S
395 | 1285,2,"Gilbert, Mr. William",male,47,0,0,C.A. 30769,10.5,,S
396 | 1286,3,"Kink-Heilmann, Mr. Anton",male,29,3,1,315153,22.025,,S
397 | 1287,1,"Smith, Mrs. Lucien Philip (Mary Eloise Hughes)",female,18,1,0,13695,60,C31,S
398 | 1288,3,"Colbert, Mr. Patrick",male,24,0,0,371109,7.25,,Q
399 | 1289,1,"Frolicher-Stehli, Mrs. Maxmillian (Margaretha Emerentia Stehli)",female,48,1,1,13567,79.2,B41,C
400 | 1290,3,"Larsson-Rondberg, Mr. Edvard A",male,22,0,0,347065,7.775,,S
401 | 1291,3,"Conlon, Mr. Thomas Henry",male,31,0,0,21332,7.7333,,Q
402 | 1292,1,"Bonnell, Miss. Caroline",female,30,0,0,36928,164.8667,C7,S
403 | 1293,2,"Gale, Mr. Harry",male,38,1,0,28664,21,,S
404 | 1294,1,"Gibson, Miss. Dorothy Winifred",female,22,0,1,112378,59.4,,C
405 | 1295,1,"Carrau, Mr. Jose Pedro",male,17,0,0,113059,47.1,,S
406 | 1296,1,"Frauenthal, Mr. Isaac Gerald",male,43,1,0,17765,27.7208,D40,C
407 | 1297,2,"Nourney, Mr. Alfred (Baron von Drachstedt"")""",male,20,0,0,SC/PARIS 2166,13.8625,D38,C
408 | 1298,2,"Ware, Mr. William Jeffery",male,23,1,0,28666,10.5,,S
409 | 1299,1,"Widener, Mr. George Dunton",male,50,1,1,113503,211.5,C80,C
410 | 1300,3,"Riordan, Miss. Johanna Hannah""""",female,,0,0,334915,7.7208,,Q
411 | 1301,3,"Peacock, Miss. Treasteall",female,3,1,1,SOTON/O.Q. 3101315,13.775,,S
412 | 1302,3,"Naughton, Miss. Hannah",female,,0,0,365237,7.75,,Q
413 | 1303,1,"Minahan, Mrs. William Edward (Lillian E Thorpe)",female,37,1,0,19928,90,C78,Q
414 | 1304,3,"Henriksson, Miss. Jenny Lovisa",female,28,0,0,347086,7.775,,S
415 | 1305,3,"Spector, Mr. Woolf",male,,0,0,A.5. 3236,8.05,,S
416 | 1306,1,"Oliva y Ocana, Dona. Fermina",female,39,0,0,PC 17758,108.9,C105,C
417 | 1307,3,"Saether, Mr. Simon Sivertsen",male,38.5,0,0,SOTON/O.Q. 3101262,7.25,,S
418 | 1308,3,"Ware, Mr. Frederick",male,,0,0,359309,8.05,,S
419 | 1309,3,"Peter, Master. Michael J",male,,1,1,2668,22.3583,,C
420 |
--------------------------------------------------------------------------------
/TwitchTopStreamers-Analysis/README.md:
--------------------------------------------------------------------------------
1 | # Twitch Top Streamers Analysis
2 |
3 | # Objectives
4 | The objective is to predict the number of followers a streamer gained on Twitch over the past year of streaming.
5 |
6 | # Tools Used
7 |
8 |
9 | # Data Set used
10 | The data set used is a custom-made data set, compiled from stats gathered across different websites.
11 |
12 | # Contents of our Data set
13 | 1) Channel - The name of the Streamer's channel
14 | 2) Watch time(minutes) - The total time, in minutes, viewers have spent watching a particular streamer.
15 | 3) Stream time(minutes) - The total amount of time the streamer streamed in the past year.
16 | 4) Peak viewers - The maximum number of concurrent viewers the streamer had during a stream in the past year.
17 | 5) Average viewers - The average number of viewers the streamer had while streaming.
18 | 6) Followers - The number of followers the streamer has on Twitch.
19 | 7) Followers gained - The number of followers a particular streamer gained in a year on Twitch.
20 | 8) Views gained - The number of views a streamer gained in a year.
21 | 9) Partnered - Whether or not the streamer is Twitch Partnered.
22 | 10) Mature - Whether or not the streams are 18+.
23 | 11) Language - The language the streamer uses during streams.
24 |
25 |
26 | # Steps involved in making the model
27 | 1) Importing our data.
28 | 2) Checking for null values and summarizing each column (mean, min, max) to get an overview of the data.
29 | 3) Visualizing the data to find the columns that work best as features for our model.
30 | 4) Making a copy of the original data and separating out the columns we will use as features.
31 | 5) Training our Linear Regression model on these features.
32 | 6) Concluding the analysis by testing the model on some random user input.
33 |
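The steps above can be sketched end to end. This is a minimal illustration on synthetic stand-in data; the notebook trains on the real columns from twitchdata-update.csv instead, so the feature names and values here are placeholders:

```python
# Minimal sketch of steps 4-6: pick features, train Linear Regression, evaluate.
# Synthetic stand-in data; the notebook uses twitchdata-update.csv.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.random((500, 3))  # stand-ins for e.g. watch time, peak viewers, views gained
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.05, 500)  # "followers gained"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
reg = LinearRegression().fit(X_train, y_train)

# R^2 on held-out data; "random user input" testing is a single reg.predict call.
print(round(reg.score(X_test, y_test), 3))
print(reg.predict([[0.5, 0.2, 0.9]]))
```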
34 | # Result
35 | Our model was able to predict the number of followers gained, but the error can be decreased further with more optimization. We could also try additional features, such as whether the streamer is Twitch Partnered.
36 | Our Result:
37 |
38 | 
39 |
--------------------------------------------------------------------------------
/TwitchTopStreamers-Analysis/Twitch-Result.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/TwitchTopStreamers-Analysis/Twitch-Result.PNG
--------------------------------------------------------------------------------
/UsedCarPricePredictor/Procfile:
--------------------------------------------------------------------------------
1 | web: gunicorn app:app
2 |
--------------------------------------------------------------------------------
/UsedCarPricePredictor/README.md:
--------------------------------------------------------------------------------
1 | ## CarPricePredictor
2 | A used-car price predictor created using Machine-Learning and Web Development.
3 |
4 | #### Problem Statement:
5 | The auto industry is changing rapidly, and new cars are getting costlier each year, making them a very high-value purchase for the common man. Quite ironically, the average lifespan of a car is going down despite the steady rise in prices, which is good news for potential used-car buyers: manufacturers launch newer versions of their models sooner than they did a few years ago, so more and more modern cars are entering the used-car market, making it easier to get a good buy without busting your wallet.
6 | And as the used-car market grows, the number of sellers grows with it. Keeping this in mind, I have created a web application that lets sellers find out what their car will fetch in the market.
7 |
8 | #### Dataset used:
9 | For solving this problem, I have used the CarDekho dataset, which is available on Kaggle. This is the link: https://www.kaggle.com/shindenikhil/car-dekho-data
10 |
11 | #### References Used:
12 | I have used this video as a reference. A quick shoutout to Krish Naik, whom I have been following since May. He uploads very informative and concise videos on Machine Learning and many other topics in the field of AI.
13 |
14 | #### Algorithms used:
15 | After experimenting with a lot of regression-based algorithms, Random Forest worked out best for me in terms of accuracy. If you want an in-depth explanation of Random Forests and how they work, feel free to check out this article.
16 |
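As a rough illustration of the Random Forest regression setup: the data and hyperparameters below are synthetic stand-ins, not the tuned values from CarPricePrediction.ipynb, which trains on car_data.csv:

```python
# Illustrative Random Forest regression sketch on synthetic data;
# the real notebook fits this on the car_data.csv features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 8))  # 8 features, matching the count the final model uses
y = 3 * X[:, 0] + np.log1p(X[:, 1]) + rng.normal(0, 0.1, 200)  # stand-in "price"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Held-out R^2 score; the pickled model in this repo is produced the same way
# (via pickle.dump) after hyperparameter tuning.
print(round(model.score(X_test, y_test), 2))
```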
17 | #### Features used:
18 | Since this is a very basic used-car price predictor, I have only used a handful of features. Some of the features are:
19 | 1. Year: The age of the car (currentYear - yearOfPurchasing).
20 | 2. Showroom Price: The price of the car when the seller first bought it.
21 | 3. Kilometers Driven: The total number of kilometers the car has been driven.
22 | 4. Previous Owners: The number of owners the car has had.
23 | 5. Fuel Type: Specifies if the car runs on Petrol, Diesel or CNG.
24 | 6. Dealer/Individual: Specifies if the seller is a dealer or an individual.
25 | 7. Transmission Type: Specifies whether the car is a manual one or automatic.
26 |
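For illustration, here is roughly how these form inputs map onto the numeric feature vector the model consumes, mirroring the encoding in app.py (the helper name `encode` is mine, not part of the app):

```python
# Hypothetical helper mirroring app.py's feature encoding.
import numpy as np

def encode(year, present_price, kms_driven, owner, fuel_type, seller_type, transmission):
    return [
        present_price,
        np.log(kms_driven),                        # kms driven is log-transformed
        owner,
        2020 - year,                               # year of purchase -> car age
        1 if fuel_type == 'Diesel' else 0,         # one-hot fuel type
        1 if fuel_type == 'Petrol' else 0,
        1 if seller_type == 'Individual' else 0,   # dealer vs. individual
        1 if transmission == 'Manual' else 0,      # the web form spells this 'Mannual'
    ]

# First row of car_data.csv: a 2014 petrol ritz sold by a dealer.
print(encode(2014, 5.59, 27000, 0, 'Petrol', 'Dealer', 'Manual'))
```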
27 | #### NOTE:
28 | "Success is the project that's always under construction": keeping this quote in mind, I'd like to add many more features to this project in the future. So stay updated!
29 |
30 | #### To run this on your machine:
31 | 1. Make sure you have Python 3 and Flask installed.
32 | 2. In your terminal, cd to the folder you got this project in, and run `python app.py`.
33 | 3. And voila!
34 |
35 | #### P.S: I will be fully deploying this project in the upcoming days, so that you guys don't have to download the project folder every time.
36 |
37 | #### Here are some screenshots of the application:
38 | 
39 |
40 | 
41 |
42 |
--------------------------------------------------------------------------------
/UsedCarPricePredictor/app.py:
--------------------------------------------------------------------------------
1 | from flask import Flask, render_template, request
2 | import pickle
3 | import numpy as np
4 | import sklearn  # scikit-learn must be installed to unpickle the model
5 |
6 | # Initializing Flask framework
7 | app = Flask(__name__)
8 | model = pickle.load(open('random_forest_regression_model.pkl', 'rb'))
9 |
10 | @app.route('/')
11 | def home():
12 |     return render_template("index.html")
13 |
14 |
15 | @app.route("/predict", methods=['GET', 'POST'])
16 | def predict():
17 |     Fuel_Type_Diesel = 0
18 |     if request.method == 'POST':
19 |
20 |         # Number of years of the car
21 |         Year = int(request.form['Year'])
22 |
23 |         # Present price of the car
24 |         Present_Price = float(request.form['Present_Price'])
25 |
26 |         # Total kms driven; log-transformed, as during training
27 |         Kms_Driven = int(request.form['Kms_Driven'])
28 |         Kms_Driven2 = np.log(Kms_Driven)
29 |
30 |         # Number of previous owners of the car
31 |         Owner = int(request.form['Owner'])
32 |
33 |         # Fuel type (one-hot encoded: Petrol / Diesel)
34 |         Fuel_Type_Petrol = request.form['Fuel_Type_Petrol']
35 |         if Fuel_Type_Petrol == 'Petrol':
36 |             Fuel_Type_Petrol = 1
37 |             Fuel_Type_Diesel = 0
38 |         else:
39 |             Fuel_Type_Petrol = 0
40 |             Fuel_Type_Diesel = 1
41 |
42 |         # Converting year of purchase to car age
43 |         Year = 2020 - Year
44 |
45 |         # Whether the seller is an individual or a dealer
46 |         Seller_Type_Individual = request.form['Seller_Type_Individual']
47 |         if Seller_Type_Individual == 'Individual':
48 |             Seller_Type_Individual = 1
49 |         else:
50 |             Seller_Type_Individual = 0
51 |
52 |         # Transmission type (the form field uses the "Mannual" spelling)
53 |         Transmission_Mannual = request.form['Transmission_Mannual']
54 |         if Transmission_Mannual == 'Mannual':
55 |             Transmission_Mannual = 1
56 |         else:
57 |             Transmission_Mannual = 0
58 |
59 |         # Predicting the price
60 |         prediction = model.predict([[Present_Price, Kms_Driven2, Owner, Year,
61 |                                      Fuel_Type_Diesel, Fuel_Type_Petrol,
62 |                                      Seller_Type_Individual, Transmission_Mannual]])
63 |         output = round(prediction[0], 2)
64 |         if output < 0:
65 |             return render_template('index.html', prediction_text="Sorry, you cannot sell this car")
66 |         else:
67 |             return render_template('index.html', prediction_text="You can sell the car at {}".format(output))
68 |
69 |     return render_template('index.html')
70 |
71 | if __name__ == "__main__":
72 |     app.run(debug=True)
73 |
--------------------------------------------------------------------------------
/UsedCarPricePredictor/car_data.csv:
--------------------------------------------------------------------------------
1 | Car_Name,company,Year,Selling_Price,Present_Price,Kms_Driven,Fuel_Type,Seller_Type,Transmission,Owner
2 | ritz,maruti suzuki,2014,3.35,5.59,27000,Petrol,Dealer,Manual,0
3 | sx4,maruti suzuki,2013,4.75,9.54,43000,Diesel,Dealer,Manual,0
4 | ciaz,maruti suzuki,2017,7.25,9.85,6900,Petrol,Dealer,Manual,0
5 | wagon r,maruti suzuki,2011,2.85,4.15,5200,Petrol,Dealer,Manual,0
6 | swift,maruti suzuki,2014,4.6,6.87,42450,Diesel,Dealer,Manual,0
7 | vitara brezza,maruti suzuki,2018,9.25,9.83,2071,Diesel,Dealer,Manual,0
8 | ciaz,maruti suzuki,2015,6.75,8.12,18796,Petrol,Dealer,Manual,0
9 | s cross,maruti suzuki,2015,6.5,8.61,33429,Diesel,Dealer,Manual,0
10 | ciaz,maruti suzuki,2016,8.75,8.89,20273,Diesel,Dealer,Manual,0
11 | ciaz,maruti suzuki,2015,7.45,8.92,42367,Diesel,Dealer,Manual,0
12 | alto 800,maruti suzuki,2017,2.85,3.6,2135,Petrol,Dealer,Manual,0
13 | ciaz,maruti suzuki,2015,6.85,10.38,51000,Diesel,Dealer,Manual,0
14 | ciaz,maruti suzuki,2015,7.5,9.94,15000,Petrol,Dealer,Automatic,0
15 | ertiga,maruti suzuki,2015,6.1,7.71,26000,Petrol,Dealer,Manual,0
16 | dzire,maruti suzuki,2009,2.25,7.21,77427,Petrol,Dealer,Manual,0
17 | ertiga,maruti suzuki,2016,7.75,10.79,43000,Diesel,Dealer,Manual,0
18 | ertiga,maruti suzuki,2015,7.25,10.79,41678,Diesel,Dealer,Manual,0
19 | ertiga,maruti suzuki,2016,7.75,10.79,43000,Diesel,Dealer,Manual,0
20 | wagon r,maruti suzuki,2015,3.25,5.09,35500,CNG,Dealer,Manual,0
21 | sx4,maruti suzuki,2010,2.65,7.98,41442,Petrol,Dealer,Manual,0
22 | alto k10,maruti suzuki,2016,2.85,3.95,25000,Petrol,Dealer,Manual,0
23 | ignis,maruti suzuki,2017,4.9,5.71,2400,Petrol,Dealer,Manual,0
24 | sx4,maruti suzuki,2011,4.4,8.01,50000,Petrol,Dealer,Automatic,0
25 | alto k10,maruti suzuki,2014,2.5,3.46,45280,Petrol,Dealer,Manual,0
26 | wagon r,maruti suzuki,2013,2.9,4.41,56879,Petrol,Dealer,Manual,0
27 | swift,maruti suzuki,2011,3,4.99,20000,Petrol,Dealer,Manual,0
28 | swift,maruti suzuki,2013,4.15,5.87,55138,Petrol,Dealer,Manual,0
29 | swift,maruti suzuki,2017,6,6.49,16200,Petrol,Individual,Manual,0
30 | alto k10,maruti suzuki,2010,1.95,3.95,44542,Petrol,Dealer,Manual,0
31 | ciaz,maruti suzuki,2015,7.45,10.38,45000,Diesel,Dealer,Manual,0
32 | ritz,maruti suzuki,2012,3.1,5.98,51439,Diesel,Dealer,Manual,0
33 | ritz,maruti suzuki,2011,2.35,4.89,54200,Petrol,Dealer,Manual,0
34 | swift,maruti suzuki,2014,4.95,7.49,39000,Diesel,Dealer,Manual,0
35 | ertiga,maruti suzuki,2014,6,9.95,45000,Diesel,Dealer,Manual,0
36 | dzire,maruti suzuki,2014,5.5,8.06,45000,Diesel,Dealer,Manual,0
37 | sx4,maruti suzuki,2011,2.95,7.74,49998,CNG,Dealer,Manual,0
38 | dzire,maruti suzuki,2015,4.65,7.2,48767,Petrol,Dealer,Manual,0
39 | 800,maruti suzuki,2003,0.35,2.28,127000,Petrol,Individual,Manual,0
40 | alto k10,maruti suzuki,2016,3,3.76,10079,Petrol,Dealer,Manual,0
41 | sx4,maruti suzuki,2003,2.25,7.98,62000,Petrol,Dealer,Manual,0
42 | baleno,maruti suzuki,2016,5.85,7.87,24524,Petrol,Dealer,Automatic,0
43 | alto k10,maruti suzuki,2014,2.55,3.98,46706,Petrol,Dealer,Manual,0
44 | sx4,maruti suzuki,2008,1.95,7.15,58000,Petrol,Dealer,Manual,0
45 | dzire,maruti suzuki,2014,5.5,8.06,45780,Diesel,Dealer,Manual,0
46 | omni,maruti suzuki,2012,1.25,2.69,50000,Petrol,Dealer,Manual,0
47 | ciaz,maruti suzuki,2014,7.5,12.04,15000,Petrol,Dealer,Automatic,0
48 | ritz,maruti suzuki,2013,2.65,4.89,64532,Petrol,Dealer,Manual,0
49 | wagon r,maruti suzuki,2006,1.05,4.15,65000,Petrol,Dealer,Manual,0
50 | ertiga,maruti suzuki,2015,5.8,7.71,25870,Petrol,Dealer,Manual,0
51 | ciaz,maruti suzuki,2017,7.75,9.29,37000,Petrol,Dealer,Automatic,0
52 | fortuner,toyota,2012,14.9,30.61,104707,Diesel,Dealer,Automatic,0
53 | fortuner,toyota,2015,23,30.61,40000,Diesel,Dealer,Automatic,0
54 | innova,other,2017,18,19.77,15000,Diesel,Dealer,Automatic,0
55 | fortuner,toyota,2013,16,30.61,135000,Diesel,Individual,Automatic,0
56 | innova,other,2005,2.75,10.21,90000,Petrol,Individual,Manual,0
57 | corolla altis,toyota,2009,3.6,15.04,70000,Petrol,Dealer,Automatic,0
58 | etios cross,other,2015,4.5,7.27,40534,Petrol,Dealer,Manual,0
59 | corolla altis,toyota,2010,4.75,18.54,50000,Petrol,Dealer,Manual,0
60 | etios g,toyota,2014,4.1,6.8,39485,Petrol,Dealer,Manual,1
61 | fortuner,toyota,2014,19.99,35.96,41000,Diesel,Dealer,Automatic,0
62 | corolla altis,toyota,2013,6.95,18.61,40001,Petrol,Dealer,Manual,0
63 | etios cross,other,2015,4.5,7.7,40588,Petrol,Dealer,Manual,0
64 | fortuner,toyota,2014,18.75,35.96,78000,Diesel,Dealer,Automatic,0
65 | fortuner,toyota,2015,23.5,35.96,47000,Diesel,Dealer,Automatic,0
66 | fortuner,toyota,2017,33,36.23,6000,Diesel,Dealer,Automatic,0
67 | etios liva,toyota,2014,4.75,6.95,45000,Diesel,Dealer,Manual,0
68 | innova,other,2017,19.75,23.15,11000,Petrol,Dealer,Automatic,0
69 | fortuner,toyota,2010,9.25,20.45,59000,Diesel,Dealer,Manual,0
70 | corolla altis,toyota,2011,4.35,13.74,88000,Petrol,Dealer,Manual,0
71 | corolla altis,toyota,2016,14.25,20.91,12000,Petrol,Dealer,Manual,0
72 | etios liva,toyota,2014,3.95,6.76,71000,Diesel,Dealer,Manual,0
73 | corolla altis,toyota,2011,4.5,12.48,45000,Diesel,Dealer,Manual,0
74 | corolla altis,toyota,2013,7.45,18.61,56001,Petrol,Dealer,Manual,0
75 | etios liva,toyota,2011,2.65,5.71,43000,Petrol,Dealer,Manual,0
76 | etios cross,other,2014,4.9,8.93,83000,Diesel,Dealer,Manual,0
77 | etios g,toyota,2015,3.95,6.8,36000,Petrol,Dealer,Manual,0
78 | corolla altis,toyota,2013,5.5,14.68,72000,Petrol,Dealer,Manual,0
79 | corolla,toyota,2004,1.5,12.35,135154,Petrol,Dealer,Automatic,0
80 | corolla altis,toyota,2010,5.25,22.83,80000,Petrol,Dealer,Automatic,0
81 | fortuner,toyota,2012,14.5,30.61,89000,Diesel,Dealer,Automatic,0
82 | corolla altis,toyota,2016,14.73,14.89,23000,Diesel,Dealer,Manual,0
83 | etios gd,toyota,2015,4.75,7.85,40000,Diesel,Dealer,Manual,0
84 | innova,other,2017,23,25.39,15000,Diesel,Dealer,Automatic,0
85 | innova,other,2015,12.5,13.46,38000,Diesel,Dealer,Manual,0
86 | innova,other,2005,3.49,13.46,197176,Diesel,Dealer,Manual,0
87 | camry,other,2006,2.5,23.73,142000,Petrol,Individual,Automatic,3
88 | land cruiser,other,2010,35,92.6,78000,Diesel,Dealer,Manual,0
89 | corolla altis,toyota,2012,5.9,13.74,56000,Petrol,Dealer,Manual,0
90 | etios liva,toyota,2013,3.45,6.05,47000,Petrol,Dealer,Manual,0
91 | etios g,toyota,2014,4.75,6.76,40000,Petrol,Dealer,Manual,0
92 | corolla altis,toyota,2009,3.8,18.61,62000,Petrol,Dealer,Manual,0
93 | innova,other,2014,11.25,16.09,58242,Diesel,Dealer,Manual,0
94 | innova,other,2005,3.51,13.7,75000,Petrol,Dealer,Manual,0
95 | fortuner,toyota,2015,23,30.61,40000,Diesel,Dealer,Automatic,0
96 | corolla altis,toyota,2008,4,22.78,89000,Petrol,Dealer,Automatic,0
97 | corolla altis,toyota,2012,5.85,18.61,72000,Petrol,Dealer,Manual,0
98 | innova,other,2016,20.75,25.39,29000,Diesel,Dealer,Automatic,0
99 | corolla altis,toyota,2017,17,18.64,8700,Petrol,Dealer,Manual,0
100 | corolla altis,toyota,2013,7.05,18.61,45000,Petrol,Dealer,Manual,0
101 | fortuner,toyota,2010,9.65,20.45,50024,Diesel,Dealer,Manual,0
102 | Royal Enfield Thunder 500,other,2016,1.75,1.9,3000,Petrol,Individual,Manual,0
103 | UM Renegade Mojave,other,2017,1.7,1.82,1400,Petrol,Individual,Manual,0
104 | KTM RC200,other,2017,1.65,1.78,4000,Petrol,Individual,Manual,0
105 | Bajaj Dominar 400,bajaj,2017,1.45,1.6,1200,Petrol,Individual,Manual,0
106 | Royal Enfield Classic 350,other,2017,1.35,1.47,4100,Petrol,Individual,Manual,0
107 | KTM RC390,other,2015,1.35,2.37,21700,Petrol,Individual,Manual,0
108 | Hyosung GT250R,other,2014,1.35,3.45,16500,Petrol,Individual,Manual,1
109 | Royal Enfield Thunder 350,other,2013,1.25,1.5,15000,Petrol,Individual,Manual,0
110 | Royal Enfield Thunder 350,other,2016,1.2,1.5,18000,Petrol,Individual,Manual,0
111 | Royal Enfield Classic 350,other,2017,1.2,1.47,11000,Petrol,Individual,Manual,0
112 | KTM RC200,other,2016,1.2,1.78,6000,Petrol,Individual,Manual,0
113 | Royal Enfield Thunder 350,other,2016,1.15,1.5,8700,Petrol,Individual,Manual,0
114 | KTM 390 Duke ,other,2014,1.15,2.4,7000,Petrol,Individual,Manual,0
115 | Mahindra Mojo XT300,other,2016,1.15,1.4,35000,Petrol,Individual,Manual,0
116 | Royal Enfield Classic 350,other,2015,1.15,1.47,17000,Petrol,Individual,Manual,0
117 | Royal Enfield Classic 350,other,2015,1.11,1.47,17500,Petrol,Individual,Manual,0
118 | Royal Enfield Classic 350,other,2013,1.1,1.47,33000,Petrol,Individual,Manual,0
119 | Royal Enfield Thunder 500,other,2015,1.1,1.9,14000,Petrol,Individual,Manual,0
120 | Royal Enfield Classic 350,other,2015,1.1,1.47,26000,Petrol,Individual,Manual,0
121 | Royal Enfield Thunder 500,other,2013,1.05,1.9,5400,Petrol,Individual,Manual,0
122 | Bajaj Pulsar RS200,bajaj,2016,1.05,1.26,5700,Petrol,Individual,Manual,0
123 | Royal Enfield Thunder 350,other,2011,1.05,1.5,6900,Petrol,Individual,Manual,0
124 | Royal Enfield Bullet 350,other,2016,1.05,1.17,6000,Petrol,Individual,Manual,0
125 | Royal Enfield Classic 350,other,2013,1,1.47,46500,Petrol,Individual,Manual,0
126 | Royal Enfield Classic 500,other,2012,0.95,1.75,11500,Petrol,Individual,Manual,0
127 | Royal Enfield Classic 500,other,2009,0.9,1.75,40000,Petrol,Individual,Manual,0
128 | Bajaj Avenger 220,bajaj,2017,0.9,0.95,1300,Petrol,Individual,Manual,0
129 | Bajaj Avenger 150,bajaj,2016,0.75,0.8,7000,Petrol,Individual,Manual,0
130 | Honda CB Hornet 160R,honda,2017,0.8,0.87,3000,Petrol,Individual,Manual,0
131 | Yamaha FZ S V 2.0,yamaha,2017,0.78,0.84,5000,Petrol,Individual,Manual,0
132 | Honda CB Hornet 160R,honda,2017,0.75,0.87,11000,Petrol,Individual,Manual,0
133 | Yamaha FZ 16,yamaha,2015,0.75,0.82,18000,Petrol,Individual,Manual,0
134 | Bajaj Avenger 220,bajaj,2017,0.75,0.95,3500,Petrol,Individual,Manual,0
135 | Bajaj Avenger 220,bajaj,2016,0.72,0.95,500,Petrol,Individual,Manual,0
136 | TVS Apache RTR 160,tvs,2017,0.65,0.81,11800,Petrol,Individual,Manual,0
137 | Bajaj Pulsar 150,bajaj,2015,0.65,0.74,5000,Petrol,Individual,Manual,0
138 | Honda CBR 150,honda,2014,0.65,1.2,23500,Petrol,Individual,Manual,0
139 | Hero Extreme,hero,2013,0.65,0.787,16000,Petrol,Individual,Manual,0
140 | Honda CB Hornet 160R,honda,2016,0.6,0.87,15000,Petrol,Individual,Manual,0
141 | Bajaj Avenger 220 dtsi,bajaj,2015,0.6,0.95,16600,Petrol,Individual,Manual,0
142 | Honda CBR 150,honda,2013,0.6,1.2,32000,Petrol,Individual,Manual,0
143 | Bajaj Avenger 150 street,bajaj,2016,0.6,0.8,20000,Petrol,Individual,Manual,0
144 | Yamaha FZ v 2.0,yamaha,2015,0.6,0.84,29000,Petrol,Individual,Manual,0
145 | Yamaha FZ v 2.0,yamaha,2016,0.6,0.84,25000,Petrol,Individual,Manual,0
146 | Bajaj Pulsar NS 200,bajaj,2014,0.6,0.99,25000,Petrol,Individual,Manual,0
147 | TVS Apache RTR 160,tvs,2012,0.6,0.81,19000,Petrol,Individual,Manual,0
148 | Hero Extreme,hero,2014,0.55,0.787,15000,Petrol,Individual,Manual,0
149 | Yamaha FZ S V 2.0,yamaha,2015,0.55,0.84,58000,Petrol,Individual,Manual,0
150 | Bajaj Pulsar 220 F,bajaj,2010,0.52,0.94,45000,Petrol,Individual,Manual,0
151 | Bajaj Pulsar 220 F,bajaj,2016,0.51,0.94,24000,Petrol,Individual,Manual,0
152 | TVS Apache RTR 180,tvs,2011,0.5,0.826,6000,Petrol,Individual,Manual,0
153 | Hero Passion X pro,hero,2016,0.5,0.55,31000,Petrol,Individual,Manual,0
154 | Bajaj Pulsar NS 200,bajaj,2012,0.5,0.99,13000,Petrol,Individual,Manual,0
155 | Bajaj Pulsar NS 200,bajaj,2013,0.5,0.99,45000,Petrol,Individual,Manual,0
156 | Yamaha Fazer ,yamaha,2014,0.5,0.88,8000,Petrol,Individual,Manual,0
157 | Honda Activa 4G,honda,2017,0.48,0.51,4300,Petrol,Individual,Automatic,0
158 | TVS Sport ,tvs,2017,0.48,0.52,15000,Petrol,Individual,Manual,0
159 | Yamaha FZ S V 2.0,yamaha,2015,0.48,0.84,23000,Petrol,Individual,Manual,0
160 | Honda Dream Yuga ,honda,2017,0.48,0.54,8600,Petrol,Individual,Manual,0
161 | Honda Activa 4G,honda,2017,0.45,0.51,4000,Petrol,Individual,Automatic,0
162 | Bajaj Avenger Street 220,bajaj,2011,0.45,0.95,24000,Petrol,Individual,Manual,0
163 | TVS Apache RTR 180,tvs,2014,0.45,0.826,23000,Petrol,Individual,Manual,0
164 | Bajaj Pulsar NS 200,bajaj,2012,0.45,0.99,14500,Petrol,Individual,Manual,0
165 | Bajaj Avenger 220 dtsi,bajaj,2010,0.45,0.95,27000,Petrol,Individual,Manual,0
166 | Hero Splender iSmart,hero,2016,0.45,0.54,14000,Petrol,Individual,Manual,0
167 | Activa 3g,honda,2016,0.45,0.54,500,Petrol,Individual,Automatic,0
168 | Hero Passion Pro,hero,2016,0.45,0.55,1000,Petrol,Individual,Manual,0
169 | TVS Apache RTR 160,tvs,2014,0.42,0.81,42000,Petrol,Individual,Manual,0
170 | Honda CB Trigger,honda,2013,0.42,0.73,12000,Petrol,Individual,Manual,0
171 | Hero Splender iSmart,hero,2015,0.4,0.54,14000,Petrol,Individual,Manual,0
172 | Yamaha FZ S ,yamaha,2012,0.4,0.83,5500,Petrol,Individual,Manual,0
173 | Hero Passion Pro,hero,2015,0.4,0.55,6700,Petrol,Individual,Manual,0
174 | Bajaj Pulsar 135 LS,bajaj,2014,0.4,0.64,13700,Petrol,Individual,Manual,0
175 | Activa 4g,honda,2017,0.4,0.51,1300,Petrol,Individual,Automatic,0
176 | Honda CB Unicorn,honda,2015,0.38,0.72,38600,Petrol,Individual,Manual,0
177 | Hero Honda CBZ extreme,hero,2011,0.38,0.787,75000,Petrol,Individual,Manual,0
178 | Honda Karizma,honda,2011,0.35,1.05,30000,Petrol,Individual,Manual,0
179 | Honda Activa 125,honda,2016,0.35,0.57,24000,Petrol,Individual,Automatic,0
180 | TVS Jupyter,tvs,2014,0.35,0.52,19000,Petrol,Individual,Automatic,0
181 | Honda Karizma,honda,2010,0.31,1.05,213000,Petrol,Individual,Manual,0
182 | Hero Honda Passion Pro,hero,2012,0.3,0.51,60000,Petrol,Individual,Manual,0
183 | Hero Splender Plus,hero,2016,0.3,0.48,50000,Petrol,Individual,Manual,0
184 | Honda CB Shine,honda,2013,0.3,0.58,30000,Petrol,Individual,Manual,0
185 | Bajaj Discover 100,bajaj,2013,0.27,0.47,21000,Petrol,Individual,Manual,0
186 | Bajaj Pulsar 150,bajaj,2008,0.25,0.75,26000,Petrol,Individual,Manual,1
187 | Suzuki Access 125,other,2008,0.25,0.58,1900,Petrol,Individual,Automatic,0
188 | TVS Wego,tvs,2010,0.25,0.52,22000,Petrol,Individual,Automatic,0
189 | Honda CB twister,honda,2013,0.25,0.51,32000,Petrol,Individual,Manual,0
190 | Hero Glamour,hero,2013,0.25,0.57,18000,Petrol,Individual,Manual,0
191 | Hero Super Splendor,hero,2005,0.2,0.57,55000,Petrol,Individual,Manual,0
192 | Bajaj Pulsar 150,bajaj,2008,0.2,0.75,60000,Petrol,Individual,Manual,0
193 | Bajaj Discover 125,bajaj,2012,0.2,0.57,25000,Petrol,Individual,Manual,1
194 | Hero Hunk,hero,2007,0.2,0.75,49000,Petrol,Individual,Manual,1
195 | Hero Ignitor Disc,hero,2013,0.2,0.65,24000,Petrol,Individual,Manual,1
196 | Hero CBZ Xtreme,hero,2008,0.2,0.787,50000,Petrol,Individual,Manual,0
197 | Bajaj ct 100,bajaj,2015,0.18,0.32,35000,Petrol,Individual,Manual,0
198 | Activa 3g,honda,2008,0.17,0.52,500000,Petrol,Individual,Automatic,0
199 | Honda CB twister,honda,2010,0.16,0.51,33000,Petrol,Individual,Manual,0
200 | Bajaj Discover 125,bajaj,2011,0.15,0.57,35000,Petrol,Individual,Manual,1
201 | Honda CB Shine,honda,2007,0.12,0.58,53000,Petrol,Individual,Manual,0
202 | Bajaj Pulsar 150,bajaj,2006,0.1,0.75,92233,Petrol,Individual,Manual,0
203 | i20,hyndai,2010,3.25,6.79,58000,Diesel,Dealer,Manual,1
204 | grand i10,hyndai,2015,4.4,5.7,28200,Petrol,Dealer,Manual,0
205 | i10,hyndai,2011,2.95,4.6,53460,Petrol,Dealer,Manual,0
206 | eon,hyndai,2015,2.75,4.43,28282,Petrol,Dealer,Manual,0
207 | grand i10,hyndai,2016,5.25,5.7,3493,Petrol,Dealer,Manual,1
208 | xcent,hyndai,2017,5.75,7.13,12479,Petrol,Dealer,Manual,0
209 | grand i10,hyndai,2015,5.15,5.7,34797,Petrol,Dealer,Automatic,0
210 | i20,hyndai,2017,7.9,8.1,3435,Petrol,Dealer,Manual,0
211 | grand i10,hyndai,2015,4.85,5.7,21125,Diesel,Dealer,Manual,0
212 | i10,hyndai,2012,3.1,4.6,35775,Petrol,Dealer,Manual,0
213 | elantra,hyndai,2015,11.75,14.79,43535,Diesel,Dealer,Manual,0
214 | creta,hyndai,2016,11.25,13.6,22671,Petrol,Dealer,Manual,0
215 | i20,hyndai,2011,2.9,6.79,31604,Petrol,Dealer,Manual,0
216 | grand i10,hyndai,2017,5.25,5.7,20114,Petrol,Dealer,Manual,0
217 | verna,hyndai,2012,4.5,9.4,36100,Petrol,Dealer,Manual,0
218 | eon,hyndai,2016,2.9,4.43,12500,Petrol,Dealer,Manual,0
219 | eon,hyndai,2016,3.15,4.43,15000,Petrol,Dealer,Manual,0
220 | verna,hyndai,2014,6.45,9.4,45078,Petrol,Dealer,Manual,0
221 | verna,hyndai,2012,4.5,9.4,36000,Petrol,Dealer,Manual,0
222 | eon,hyndai,2017,3.5,4.43,38488,Petrol,Dealer,Manual,0
223 | i20,hyndai,2013,4.5,6.79,32000,Petrol,Dealer,Automatic,0
224 | i20,hyndai,2014,6,7.6,77632,Diesel,Dealer,Manual,0
225 | verna,hyndai,2015,8.25,9.4,61381,Diesel,Dealer,Manual,0
226 | verna,hyndai,2013,5.11,9.4,36198,Petrol,Dealer,Automatic,0
227 | i10,hyndai,2011,2.7,4.6,22517,Petrol,Dealer,Manual,0
228 | grand i10,hyndai,2015,5.25,5.7,24678,Petrol,Dealer,Manual,0
229 | i10,hyndai,2011,2.55,4.43,57000,Petrol,Dealer,Manual,0
230 | verna,hyndai,2012,4.95,9.4,60000,Diesel,Dealer,Manual,0
231 | i20,hyndai,2012,3.1,6.79,52132,Diesel,Dealer,Manual,0
232 | verna,hyndai,2013,6.15,9.4,45000,Diesel,Dealer,Manual,0
233 | verna,hyndai,2017,9.25,9.4,15001,Petrol,Dealer,Manual,0
234 | elantra,hyndai,2015,11.45,14.79,12900,Petrol,Dealer,Automatic,0
235 | grand i10,hyndai,2013,3.9,5.7,53000,Diesel,Dealer,Manual,0
236 | grand i10,hyndai,2015,5.5,5.7,4492,Petrol,Dealer,Manual,0
237 | verna,hyndai,2017,9.1,9.4,15141,Petrol,Dealer,Manual,0
238 | eon,hyndai,2016,3.1,4.43,11849,Petrol,Dealer,Manual,0
239 | creta,hyndai,2015,11.25,13.6,68000,Diesel,Dealer,Manual,0
240 | verna,hyndai,2013,4.8,9.4,60241,Petrol,Dealer,Manual,0
241 | eon,hyndai,2012,2,4.43,23709,Petrol,Dealer,Manual,0
242 | verna,hyndai,2012,5.35,9.4,32322,Diesel,Dealer,Manual,0
243 | xcent,hyndai,2015,4.75,7.13,35866,Petrol,Dealer,Manual,1
244 | xcent,hyndai,2014,4.4,7.13,34000,Petrol,Dealer,Manual,0
245 | i20,hyndai,2016,6.25,7.6,7000,Petrol,Dealer,Manual,0
246 | verna,hyndai,2013,5.95,9.4,49000,Diesel,Dealer,Manual,0
247 | verna,hyndai,2012,5.2,9.4,71000,Diesel,Dealer,Manual,0
248 | i20,hyndai,2012,3.75,6.79,35000,Petrol,Dealer,Manual,0
249 | verna,hyndai,2015,5.95,9.4,36000,Petrol,Dealer,Manual,0
250 | i10,hyndai,2013,4,4.6,30000,Petrol,Dealer,Manual,0
251 | i20,hyndai,2016,5.25,7.6,17000,Petrol,Dealer,Manual,0
252 | creta,hyndai,2016,12.9,13.6,35934,Diesel,Dealer,Manual,0
253 | city,honda,2013,5,9.9,56701,Petrol,Dealer,Manual,0
254 | brio,honda,2015,5.4,6.82,31427,Petrol,Dealer,Automatic,0
255 | city,honda,2014,7.2,9.9,48000,Diesel,Dealer,Manual,0
256 | city,honda,2013,5.25,9.9,54242,Petrol,Dealer,Manual,0
257 | brio,honda,2012,3,5.35,53675,Petrol,Dealer,Manual,0
258 | city,honda,2016,10.25,13.6,49562,Petrol,Dealer,Manual,0
259 | city,honda,2015,8.5,13.6,40324,Petrol,Dealer,Manual,0
260 | city,honda,2015,8.4,13.6,25000,Petrol,Dealer,Manual,0
261 | amaze,honda,2014,3.9,7,36054,Petrol,Dealer,Manual,0
262 | city,honda,2016,9.15,13.6,29223,Petrol,Dealer,Manual,0
263 | brio,honda,2016,5.5,5.97,5600,Petrol,Dealer,Manual,0
264 | amaze,honda,2015,4,5.8,40023,Petrol,Dealer,Manual,0
265 | jazz,honda,2016,6.6,7.7,16002,Petrol,Dealer,Manual,0
266 | amaze,honda,2015,4,7,40026,Petrol,Dealer,Manual,0
267 | jazz,honda,2017,6.5,8.7,21200,Petrol,Dealer,Manual,0
268 | amaze,honda,2014,3.65,7,35000,Petrol,Dealer,Manual,0
269 | city,honda,2016,8.35,9.4,19434,Diesel,Dealer,Manual,0
270 | brio,honda,2017,4.8,5.8,19000,Petrol,Dealer,Manual,0
271 | city,honda,2015,6.7,10,18828,Petrol,Dealer,Manual,0
272 | city,honda,2011,4.1,10,69341,Petrol,Dealer,Manual,0
273 | city,honda,2009,3,10,69562,Petrol,Dealer,Manual,0
274 | city,honda,2015,7.5,10,27600,Petrol,Dealer,Manual,0
275 | jazz,honda,2010,2.25,7.5,61203,Petrol,Dealer,Manual,0
276 | brio,honda,2014,5.3,6.8,16500,Petrol,Dealer,Manual,0
277 | city,honda,2016,10.9,13.6,30753,Petrol,Dealer,Automatic,0
278 | city,honda,2015,8.65,13.6,24800,Petrol,Dealer,Manual,0
279 | city,honda,2015,9.7,13.6,21780,Petrol,Dealer,Manual,0
280 | jazz,honda,2016,6,8.4,4000,Petrol,Dealer,Manual,0
281 | city,honda,2014,6.25,13.6,40126,Petrol,Dealer,Manual,0
282 | brio,honda,2015,5.25,5.9,14465,Petrol,Dealer,Manual,0
283 | city,honda,2006,2.1,7.6,50456,Petrol,Dealer,Manual,0
284 | city,honda,2014,8.25,14,63000,Diesel,Dealer,Manual,0
285 | city,honda,2016,8.99,11.8,9010,Petrol,Dealer,Manual,0
286 | brio,honda,2013,3.5,5.9,9800,Petrol,Dealer,Manual,0
287 | jazz,honda,2016,7.4,8.5,15059,Petrol,Dealer,Automatic,0
288 | jazz,honda,2016,5.65,7.9,28569,Petrol,Dealer,Manual,0
289 | amaze,honda,2015,5.75,7.5,44000,Petrol,Dealer,Automatic,0
290 | city,honda,2015,8.4,13.6,34000,Petrol,Dealer,Manual,0
291 | city,honda,2016,10.11,13.6,10980,Petrol,Dealer,Manual,0
292 | amaze,honda,2014,4.5,6.4,19000,Petrol,Dealer,Manual,0
293 | brio,honda,2015,5.4,6.1,31427,Petrol,Dealer,Manual,0
294 | jazz,honda,2016,6.4,8.4,12000,Petrol,Dealer,Manual,0
295 | city,honda,2010,3.25,9.9,38000,Petrol,Dealer,Manual,0
296 | amaze,honda,2014,3.75,6.8,33019,Petrol,Dealer,Manual,0
297 | city,honda,2015,8.55,13.09,60076,Diesel,Dealer,Manual,0
298 | city,honda,2016,9.5,11.6,33988,Diesel,Dealer,Manual,0
299 | brio,honda,2015,4,5.9,60000,Petrol,Dealer,Manual,0
300 | city,honda,2009,3.35,11,87934,Petrol,Dealer,Manual,0
301 | city,honda,2017,11.5,12.5,9000,Diesel,Dealer,Manual,0
302 | brio,honda,2016,5.3,5.9,5464,Petrol,Dealer,Manual,0
303 |
--------------------------------------------------------------------------------
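The rows above suggest ten comma-separated fields per record. As a minimal sketch of how this data is typically loaded and transformed for the predictor, assuming column names inferred from the rows and from the feature order used in main.py (the real header row of car_data.csv is outside this excerpt):

```python
import io
import numpy as np
import pandas as pd

# Assumed column names; the actual header of car_data.csv is not shown here.
cols = ["Car_Name", "Brand", "Year", "Selling_Price", "Present_Price",
        "Kms_Driven", "Fuel_Type", "Seller_Type", "Transmission", "Owner"]

# Two rows copied verbatim from the data above, used as an inline sample.
sample = io.StringIO(
    "ciaz,maruti suzuki,2014,7.5,12.04,15000,Petrol,Dealer,Automatic,0\n"
    "fortuner,toyota,2012,14.9,30.61,104707,Diesel,Dealer,Automatic,0\n"
)
df = pd.read_csv(sample, names=cols)

# Derived features matching main.py: car age relative to 2020, log of kms driven.
df["Age"] = 2020 - df["Year"]
df["Log_Kms"] = np.log(df["Kms_Driven"])
```

For the full dataset you would call `pd.read_csv("car_data.csv")` and let pandas pick up the header row directly.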
/UsedCarPricePredictor/main.py:
--------------------------------------------------------------------------------
1 | from flask import Flask, render_template, request
2 | import pickle
3 | import numpy as np
4 | 
5 | app = Flask(__name__)
6 | 
7 | # Trained regressor pickled by CarPricePrediction.ipynb.
8 | model = pickle.load(open('random_forest_regression_model.pkl', 'rb'))
9 | 
10 | 
11 | @app.route('/', methods=['GET'])
12 | def home():
13 |     return render_template('index.html')
14 | 
15 | 
16 | @app.route('/predict', methods=['POST'])
17 | def predict():
18 |     if request.method == 'POST':
19 |         # The model was trained on car age, not the raw year.
20 |         year = 2020 - int(request.form['Year'])
21 |         present_price = float(request.form['Present_Price'])
22 |         # Kms driven is log-transformed, matching the training pipeline.
23 |         kms_driven = np.log(int(request.form['Kms_Driven']))
24 |         owner = int(request.form['Owner'])
25 | 
26 |         fuel = request.form['Fuel_Type_Petrol']
27 |         fuel_type_petrol = 1 if fuel == 'Petrol' else 0
28 |         fuel_type_diesel = 0 if fuel == 'Petrol' else 1  # any non-Petrol value is treated as Diesel
29 | 
30 |         seller_type_individual = 1 if request.form['Seller_Type_Individual'] == 'Individual' else 0
31 |         # 'Mannual' (sic) matches the field name and option value used by the form.
32 |         transmission_manual = 1 if request.form['Transmission_Mannual'] == 'Mannual' else 0
33 | 
34 |         prediction = model.predict([[present_price, kms_driven, owner, year,
35 |                                      fuel_type_diesel, fuel_type_petrol,
36 |                                      seller_type_individual, transmission_manual]])
37 |         output = round(prediction[0], 2)
38 |         if output < 0:
39 |             return render_template('index.html', prediction_text="Sorry, you cannot sell this car")
40 |         return render_template('index.html', prediction_text="You can sell the car at {}".format(output))
41 |     return render_template('index.html')
42 | 
43 | 
44 | if __name__ == "__main__":
45 |     app.run(debug=True)
--------------------------------------------------------------------------------
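The feature encoding inside `predict()` can be isolated into a pure function, which makes it unit-testable without Flask or the pickled model. A minimal sketch (`encode_request` is a hypothetical helper, not part of main.py; it assumes the same form field names the app reads):

```python
import math

def encode_request(form):
    """Build the feature vector main.py's predict() passes to the model,
    in the order the pickled regressor expects."""
    year = 2020 - int(form["Year"])                  # car age, not raw year
    present_price = float(form["Present_Price"])
    log_kms = math.log(int(form["Kms_Driven"]))      # log-transformed mileage
    owner = int(form["Owner"])
    fuel = form["Fuel_Type_Petrol"]
    fuel_petrol = 1 if fuel == "Petrol" else 0
    fuel_diesel = 0 if fuel == "Petrol" else 1       # main.py treats any non-Petrol value as Diesel
    seller_individual = 1 if form["Seller_Type_Individual"] == "Individual" else 0
    transmission_manual = 1 if form["Transmission_Mannual"] == "Mannual" else 0  # spelling matches the form
    return [present_price, log_kms, owner, year,
            fuel_diesel, fuel_petrol, seller_individual, transmission_manual]
```

With such a helper, the route would reduce to `model.predict([encode_request(request.form)])`, and the encoding rules could be covered by plain unit tests.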
/UsedCarPricePredictor/random_forest_regression_model.pkl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GDGCVITBHOPAL/Machine-Learning-Projects/caf9ea86bf495228a41e77e818359ad5a7258f1e/UsedCarPricePredictor/random_forest_regression_model.pkl
--------------------------------------------------------------------------------
/UsedCarPricePredictor/requirements.txt:
--------------------------------------------------------------------------------
1 | certifi==2020.12.5
2 | pandas==1.1.2
3 | python-dateutil==2.8.1
4 | pytz==2020.1
5 | six==1.15.0
6 | # Imported by main.py but previously missing here (versions left unpinned):
7 | Flask
8 | numpy
9 | scikit-learn
10 | 
--------------------------------------------------------------------------------
/UsedCarPricePredictor/static/stylesheets/styles.css:
--------------------------------------------------------------------------------
1 | body {
2 | background-color: #e1f4f3;
3 | text-align: center;
4 | /*padding: 0px;*/
5 | font-family: 'Mukta', sans-serif;
6 | }
7 |
8 | .heading-text {
9 | margin: 25px;
10 | }
11 |
12 | #research {
13 | top: 23px;
14 | /* width, height and font-size for #research are set by the later rule below */
15 | }
18 |
19 | #box {
20 | border-radius: 60px;
21 | border-color: 45px;
22 | border-style: solid;
23 | font-family: cursive;
24 | text-align: center;
25 | background-color: blue;
26 | font-size: medium;
27 | position: absolute;
28 | width: 700px;
29 | bottom: 9%;
30 | height: 850px;
31 | right: 30%;
32 | padding: 0px;
33 | margin: 0px;
34 | font-size: 14px;
35 | }
36 |
37 | .card{
38 | margin-bottom: 10px;
39 | background-color: #e1f4f3;
40 | border: 1px;
41 | }
42 |
43 | #fuel {
44 | width: 83px;
45 | height: 43px;
46 | text-align: center;
47 | border-radius: 14px;
48 | font-size: 20px;
49 | }
50 |
51 | #fuel:hover {
52 | background-color: whitesmoke;
53 | }
54 |
55 |
56 |
57 | #research {
58 | width: 110px;
59 | height: 43px;
60 | text-align: center;
61 | border-radius: 14px;
62 | font-size: 18px;
63 | }
64 |
65 | #research:hover {
66 | background-color: whitesmoke;
67 | }
68 |
69 | #resea {
70 | width: 99px;
71 | height: 43px;
72 | text-align: center;
73 | border-radius: 14px;
74 | font-size: 18px;
75 | }
76 |
77 | #resea:hover {
78 | background-color: coral;
79 | }
80 |
81 | #sub {
82 | width: 120px;
83 | height: 43px;
84 | text-align: center;
85 | border-radius: 14px;
86 | font-size: 18px;
87 | }
88 |
89 | #sub:hover {
90 | background-color: darkcyan;
91 | }
92 |
93 | #first {
94 | border-radius: 14px;
95 | height: 25px;
96 | font-size: 20px;
97 | text-align: center;
98 | }
99 |
100 | #second {
101 | border-radius: 14px;
102 | height: 25px;
103 | font-size: 20px;
104 | text-align: center;
105 | }
106 |
107 | #third {
108 | border-radius: 14px;
109 | height: 25px;
110 | font-size: 20px;
111 | text-align: center;
112 | }
113 |
114 | #fourth {
115 | border-radius: 14px;
116 | height: 25px;
117 | font-size: 20px;
118 | text-align: center;
119 | }
120 |
121 | .transmission-card {
122 | position: relative;
123 | left: 330px;
124 | }
125 |
126 |
127 |
128 | .card-km, .card-dealer {
129 | padding-bottom: 25px;
130 | }
131 |
132 |
133 | .submit-button {
134 | margin-top: 25px;
135 | }
--------------------------------------------------------------------------------
/UsedCarPricePredictor/templates/index.html:
--------------------------------------------------------------------------------
(The markup of this template was lost during extraction; only fragments survive. Recoverable details: the page title is "CarPricePredictor", and the body contains the Jinja placeholder {{ prediction_text }}, which main.py fills with the predicted price.)
--------------------------------------------------------------------------------
/contribution.md:
--------------------------------------------------------------------------------
1 | # Contribution Guidelines for the Project
2 | ---
3 |
4 | TODO: Content to be added.
--------------------------------------------------------------------------------