├── Belarus Car Price Prediction
│   ├── Belarus Car Price Prediction.ipynb
│   ├── Belarus Car Price Prediction.pdf
│   ├── cars.csv
│   └── description.md
├── Breast Cancer Prediction
│   ├── Breast Cancer Prediction.ipynb
│   ├── Breast Cancer Prediction.pdf
│   ├── Description.md
│   └── data.csv
├── CONTRIBUTING.md
├── Cardiovascular Disease Prediction
│   ├── Cadivascular Disease Prediction.ipynb
│   ├── Cadivascular Disease Prediction.pdf
│   └── description.md
├── Customer Churn Prediction
│   ├── Customer Churn Prediction.ipynb
│   ├── Customer Churn Prediction.pdf
│   ├── churn.csv
│   └── description.md
├── DataDistributions
│   ├── DistributionViz.ipynb
│   ├── README.md
│   └── weatherData.csv
├── Delhi House Price Prediction
│   ├── Delhi House Price Prediction.ipynb
│   ├── Delhi House Price Prediction.pdf
│   ├── MagicBricks.csv
│   └── description.md
├── Diamond Price Prediction
│   ├── Diamond Price Prediction.html
│   ├── Diamond Price Prediction.ipynb
│   ├── Diamond Price Prediction.pdf
│   ├── description.md
│   └── diamonds.csv
├── E-Commerce Product Delivery Prediction
│   ├── E-Commerce Product Delivery Prediction.ipynb
│   ├── E-Commerce Product Delivery Prediction.pdf
│   ├── E_Commerce.csv
│   └── description.md
├── Heart Stroke Prediction
│   ├── Stroke Prediction.pdf
│   ├── Stroke detection.ipynb
│   ├── description.md
│   └── healthcare-dataset-stroke-data.csv
├── Hotel Reservations Cancellation Prediction
│   ├── Hotel Reservations Cancelation Prediction.ipynb
│   ├── Hotel Reservations Cancelation Prediction.pdf
│   ├── Hotel Reservations.csv
│   └── description.md
├── House Price Prediction
│   ├── House_Price_Prediction.pdf
│   ├── description.md
│   ├── home_data.csv
│   └── house price.ipynb
├── Indian Used Car Price Prediction
│   ├── Indian Used Car Price Prediction.ipynb
│   ├── Indian Used Car Price Prediction.pdf
│   ├── description.md
│   └── usedCars.csv
├── LICENSE
├── Loan Approval Prediction
│   ├── Loan Approval Prediction.ipynb
│   ├── Loan Approval Prediction.pdf
│   ├── description.md
│   └── loan_approval_dataset.csv
├── Medical Cost Prediction
│   ├── Medical Cost Prediction.ipynb
│   ├── Medical Cost Prediction.pdf
│   ├── description.md
│   └── insurance.csv
├── Pima Indians Diabetes Prediction
│   ├── Diabetes Prediction.ipynb
│   ├── Diabetes Prediction.pdf
│   ├── description.md
│   └── diabetes.csv
├── README.md
├── Red Wine Quality
│   ├── Description.md
│   ├── Wine-Quality.ipynb
│   ├── Wine-Quality.pdf
│   └── winequality-red.csv
├── Room Occupancy Detection
│   ├── Room Occupancy Detection.ipynb
│   ├── Room Occupancy Detection.pdf
│   ├── datatest.csv
│   ├── datatest2.csv
│   ├── datatraining.csv
│   └── description.md
├── SFR Analysis
│   ├── Launch SFR.csv
│   ├── SFR Analysis.ipynb
│   ├── SFR Analysis.pdf
│   └── description.md
├── Salary Prediction
│   ├── Salary Prediction.ipynb
│   ├── Salary Prediction.pdf
│   ├── Salary_Data_Based_country_and_race.csv
│   └── description.md
├── Sleep Disorder Prediction
│   ├── Sleep Disorder Prediction.ipynb
│   ├── Sleep Disorder Prediction.pdf
│   ├── Sleep_health_and_lifestyle_dataset.csv
│   └── description.md
├── Telecom Customer Churn Prediction
│   ├── Telecom Customer Churn Prediction.ipynb
│   ├── Telecom Customer Churn Prediction.pdf
│   ├── WA_Fn-UseC_-Telco-Customer-Churn.csv
│   └── description.md
├── Titanic Survival Prediction
│   ├── Titantic Prediction.ipynb
│   ├── Titantic Prediction.pdf
│   ├── description.md
│   ├── titanic_test.csv
│   └── titanic_train.csv
├── Warranty Claims Fraud Prediction
│   ├── Warranty Claims Fraud Prediction.ipynb
│   ├── Warranty Claims Fraud Prediction.pdf
│   ├── description.md
│   └── df_Clean.csv
├── nunique.py
└── zzzzz---git.cmd
/Belarus Car Price Prediction/Belarus Car Price Prediction.pdf: --------------------------------------------------------------------------------
https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Belarus Car Price Prediction/Belarus Car Price Prediction.pdf -------------------------------------------------------------------------------- /Belarus Car Price Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Belarus Car Price Prediction 2 | ![](https://i0.wp.com/bestsellingcarsblog.com/wp-content/uploads/2020/01/Geely-Atlas-Belarus-2019.jpg?fit=600%2C400&ssl=1) 3 | ## Project Overview: 4 | The Belarus Car Price Prediction project aims to forecast the prices of cars in Belarus by leveraging a dataset containing essential car features. These features include the make, model, year of production, condition, mileage, fuel type, engine volume, color, transmission type, drive unit, and segment. With a total of 56,244 rows and 12 columns, this project seeks to identify the key variables that have the most significant impact on car prices within the Belarusian market. 5 | 6 | ## Data Dictionary: 7 | 8 | | Variable | Description | 9 | |-----------------|------------------------------------------------------------| 10 | | make | Machine firm or car manufacturer | 11 | | model | Machine model | 12 | | price USD | Price in USD (target variable) | 13 | | year | Year of production | 14 | | condition | Represents the condition at the sale moment | 15 | | mileage | Mileage in kilometers | 16 | | fuel type | Type of fuel (electro, petrol, diesel) | 17 | | volume(cm3) | Volume of the engine in cubic centimeters | 18 | | color | Color of the car | 19 | | transmission | Type of transmission | 20 | | drive unit | Drive unit | 21 | | segment | Segment of the car | 22 | 23 | ## Impact: 24 | Through exploratory data analysis, several key insights were discovered. Notably, there was a significant increase in car prices in Belarus after the year 2000. Cars running on petrol with automatic transmission tend to have higher prices compared to diesel cars with manual transmission. Electric cars stand out as notably more expensive than other car types. Furthermore, cars with all-wheel drive exhibit the highest prices among all drive units, and speciality segment cars command the highest prices among all segments, followed by luxury European, American, and Asian car segments. 25 | 26 | For the predictive modeling, a decision tree regressor was employed to forecast car prices. This model achieved an impressive accuracy rate of 85.29%. The most influential features in predicting car prices were identified as the year of production and the engine's volume in cubic centimeters. 27 | 28 | This project offers valuable insights for both car buyers and sellers in Belarus, helping them make informed decisions in a dynamic automotive market. 
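The notebook itself is not reproduced here, but a minimal sketch of the modelling step described above (a decision tree regressor trained on the listed features) could look like the following. The column name `priceUSD` is an assumption about how `cars.csv` stores the target; the project's actual preprocessing, split, and scoring may differ.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score

# Load the dataset shipped with the project and drop rows with missing values for simplicity.
df = pd.read_csv("cars.csv").dropna()

# Separate the target from the features ("priceUSD" is an assumed column name for the price-in-USD target).
y = df["priceUSD"]
X = df.drop(columns=["priceUSD"])

# Encode categorical columns (make, model, fuel type, etc.) as integers for the tree model.
cat_cols = X.select_dtypes(include="object").columns
X[cat_cols] = OrdinalEncoder().fit_transform(X[cat_cols])

# Hold out a test set and fit the decision tree regressor.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = DecisionTreeRegressor(random_state=42)
model.fit(X_train, y_train)

# The reported 85.29% "accuracy" is plausibly an R^2-style score; shown here only as an example metric.
print("Test R^2:", r2_score(y_test, model.predict(X_test)))

# Inspect which features the tree relied on most (the project reports year and engine volume on top).
print(pd.Series(model.feature_importances_, index=X.columns).sort_values(ascending=False).head())
```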
29 | 30 | -------------------------------------------------------------------------------- /Breast Cancer Prediction/Breast Cancer Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Breast Cancer Prediction/Breast Cancer Prediction.pdf -------------------------------------------------------------------------------- /Breast Cancer Prediction/Description.md: -------------------------------------------------------------------------------- 1 | # Breast Cancer Prediction 2 | **Breast Cancer Prediction** is a classification task aimed at predicting the diagnosis of a breast mass as either malignant or benign. The dataset used for this prediction consists of features computed from a digitized image of a fine needle aspirate (FNA) of the breast mass. These features describe various characteristics of the cell nuclei present in the image. 3 | 4 | The dataset contains the following information for each instance: 5 | 6 | 1. ID number: A unique identifier for each sample. 7 | 2. Diagnosis: The target variable indicating the diagnosis, where 'M' represents malignant and 'B' represents benign. 8 | 9 | For each cell nucleus, ten real-valued features are computed, which are: 10 | 11 | 1. Radius: The mean distance from the center to points on the perimeter of the nucleus. 12 | 2. Texture: The standard deviation of gray-scale values in the nucleus. 13 | 3. Perimeter: The perimeter of the nucleus. 14 | 4. Area: The area of the nucleus. 15 | 5. Smoothness: A measure of local variation in radius lengths. 16 | 6. Compactness: Computed as the square of the perimeter divided by the area minus 1.0. 17 | 7. Concavity: Describes the severity of concave portions of the nucleus contour. 18 | 8. Concave points: Represents the number of concave portions of the nucleus contour. 19 | 9. Symmetry: Measures the symmetry of the nucleus. 20 | 10. Fractal dimension: This feature approximates the "coastline" of the nucleus, using the concept of fractal geometry. 21 | 22 | These features provide quantitative measurements that can be used to assess the characteristics of cell nuclei and aid in distinguishing between malignant and benign breast masses. By training a machine learning model on this dataset, it is possible to develop a predictive model that can assist in the early detection and diagnosis of breast cancer. 23 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing Guidelines 2 | Thank you for considering contributing to the Data Science Projects repository! We appreciate your interest in improving and enhancing the content of this repository. To ensure a smooth collaboration, please review and follow the guidelines below when contributing. 3 | 4 | ## How to Contribute 5 | 1. Fork the repository to your GitHub account. 6 | 2. Create a new branch for your contributions. 7 | 3. Make your changes, enhancements, or fixes in your branch. 8 | 4. Test your changes to ensure they do not introduce any issues. 9 | 5. Commit your changes with a clear and descriptive commit message. 10 | 6. Push your changes to your forked repository. 11 | 7. Submit a pull request to the main repository. 
12 | ## Guidelines for Contributions 13 | - Before starting any work, it is recommended to check the existing issues and pull requests to see if the changes you intend to make are already being addressed. 14 | - If you plan to work on an existing issue, please comment on the issue to let others know that you are working on it to avoid duplication of efforts. 15 | - When creating a pull request, provide a clear and concise description of the changes you have made and the problem they solve. 16 | - Make sure your code follows the existing coding style and conventions used in the repository. 17 | - Include any necessary documentation or instructions along with your contributions. 18 | - Test your changes thoroughly to ensure they do not introduce any new bugs or issues. 19 | - Be open to feedback and constructive criticism. Reviewers may suggest changes or improvements to your contributions, and collaboration is encouraged to reach the best possible outcome. 20 | 21 | ## Getting Help 22 | If you have any questions or need assistance regarding the contribution process or any specific issue, feel free to reach out by creating a new issue in the repository. We will do our best to provide guidance and support. 23 | 24 | Thank you for your contributions and for helping to improve the Data Science Projects repository! 25 | -------------------------------------------------------------------------------- /Cardiovascular Disease Prediction/Cadivascular Disease Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Cardiovascular Disease Prediction/Cadivascular Disease Prediction.pdf -------------------------------------------------------------------------------- /Cardiovascular Disease Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Cardiovascular Disease Prediction 2 | 3 | ![](https://d112y698adiu2z.cloudfront.net/photos/production/software_photos/002/508/249/datas/original.png) 4 | 5 | ## Project Overview 6 | The **Cardiovascular Disease Prediction** project aims to predict the occurrence of cardiovascular disease in patients based on their medical records and history. By analyzing various factors, the project calculates the probability of cardiovascular disease developing in a patient. This predictive model is a valuable tool in identifying individuals who are at higher risk and enabling early intervention. 7 | 8 | ## Data Dictionary 9 | The project utilizes a comprehensive dataset with the following features: 10 | 11 | | Feature | Description | 12 | |---------------------------|---------------------------------------------------| 13 | | General Health | General health condition of the patient. | 14 | | Checkup | Date of the patient's last medical checkup. | 15 | | Exercise | Whether the patient engages in regular exercise. | 16 | | Heart Disease | Presence of existing heart disease in the patient.| 17 | | Skin Cancer | Presence of skin cancer in the patient. | 18 | | Other Cancer | Presence of other types of cancer in the patient. | 19 | | Depression | Presence of depression in the patient. | 20 | | Diabetes | Presence of diabetes in the patient. | 21 | | Arthritis | Presence of arthritis in the patient. | 22 | | Sex | Gender of the patient. | 23 | | Age-Category | Age category of the patient. | 24 | | BMI | Body Mass Index of the patient. 
| 25 | | Smoking History | Patient's history of smoking. | 26 | | Alcohol Consumption | Patient's alcohol consumption habits. | 27 | | Fruit Consumption | Patient's consumption of fruits. | 28 | | Green Vegetable Consumption | Patient's consumption of green vegetables. | 29 | | Fried Potato Consumption | Patient's consumption of fried potatoes. | 30 | 31 | ## Impact 32 | Through thorough exploratory data analysis, significant insights were revealed about the factors contributing to cardiovascular disease: 33 | 34 | - Age plays a crucial role, with individuals above the age of 55 being more prone to cardiovascular disease. The risk escalates with age, reaching a peak in patients aged 80 and above. 35 | - Higher BMI is correlated with an increased likelihood of cardiovascular disease. 36 | - Surprisingly, exercise can have varying effects based on age. While exercise generally reduces risk, older patients who exercise extensively might face elevated risks due to increased heart strain. 37 | - Dietary habits significantly influence the disease. Patients consuming more fruits and green vegetables exhibit lower susceptibility to cardiovascular disease, while those consuming fried potatoes are more prone. 38 | - Smoking history is a notable risk factor. 39 | - Intriguingly, medical history concerning cancer, arthritis, diabetes, and depression showed no significant impact on cardiovascular disease risk. 40 | 41 | This project's predictions and insights empower healthcare professionals to identify high-risk patients and implement preventive measures, ultimately reducing the prevalence of cardiovascular disease. 42 | 43 | In conclusion, the Cardiovascular Disease Prediction project underscores the importance of data-driven approaches in healthcare, aiding in the proactive management of patients' cardiovascular health. 44 | -------------------------------------------------------------------------------- /Customer Churn Prediction/Customer Churn Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Customer Churn Prediction/Customer Churn Prediction.pdf -------------------------------------------------------------------------------- /Customer Churn Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Customer Churn Prediction 2 | ![](https://miro.medium.com/v2/resize:fit:1400/1*47xx1oXuebvYwZeB0OutuA.png) 3 | ## Project Overview: 4 | The main objective of the Bank Customer Churn Prediction project is to analyze the demographics and financial information of bank customers, including factors like age, gender, credit score, country, balance, and more, in order to predict whether a customer will leave the bank or not. Customer churn, the decision of customers to leave a bank, can significantly impact the bank's business and profitability. By accurately predicting customer churn, the bank can take proactive measures to retain valuable customers and enhance customer satisfaction. 5 | ## About the Dataset: 6 | The dataset used in this project is sourced from Kaggle and comprises 10,000 rows and 14 columns. The dataset's primary objective is to predict whether a customer will churn (leave the bank) based on their demographics and financial information. 7 | 8 | The dataset contains several independent variables, which are potential factors that may influence a customer's decision to leave the bank. 
These variables include customer-specific information like credit score, country (geography), age, tenure (number of years with the bank), bank balance, the number of bank products utilized (NumOfProducts), whether the customer holds a credit card (HasCrCard), and whether the customer is an active member with the bank (IsActiveMember). The target variable, also known as the dependent variable, is labeled "Exited" and is represented by a binary flag: 1 if the customer closed their account with the bank and 0 if the customer is retained. 9 | ## Data Dictionary 10 | | Column Name | Description | 11 | | --- | --- | 12 | | RowNumber | Row number | 13 | | CustomerId | Unique identification key for different customers | 14 | | Surname | Customer's last name | 15 | | CreditScore | Credit score of the customer | 16 | |Geography | Country of the customer | 17 | |Age | Age of the customer | 18 | |Tenure | Number of years for which the customer has been with the bank | 19 | |Balance | Bank balance of the customer | 20 | |NumOfProducts | Number of bank products the customer is utilising | 21 | |HasCrCard | Binary flag for whether the customer holds a credit card with the bank or not | 22 | |IsActiveMember | Binary flag for whether the customer is an active member with the bank or not | 23 | |EstimatedSalary | Estimated salary of the customer in Dollars | 24 | |Exited | Binary flag 1 if the customer closed account with bank and 0 if the customer is retained | 25 | 26 | By leveraging this dataset and employing appropriate data science techniques, the project aims to build a predictive model that can accurately identify customers likely to churn, enabling the bank to implement targeted retention strategies and ultimately improve customer loyalty and business performance. 27 | -------------------------------------------------------------------------------- /DataDistributions/README.md: -------------------------------------------------------------------------------- 1 | # DataDistributions 2 | 3 | A jupyter notebook detailing how to visualize and compare distributions within a dataset. 
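The DistributionViz notebook is not shown in this dump, but a small sketch of one way to visualize and compare the distributions in `weatherData.csv` (columns `Location`, `Season`, `Entry`, `Temp`, listed below) is given here; the notebook may use different plot types or styling.

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Load the weather data included in this folder (Location, Season, Entry, Temp).
weather = pd.read_csv("weatherData.csv")

# Overlaid kernel density estimates: one temperature distribution per city.
sns.kdeplot(data=weather, x="Temp", hue="Location", fill=True, common_norm=False)
plt.title("Temperature distribution by location")
plt.show()

# Side-by-side box plots: compare how each city's temperatures shift across seasons.
sns.boxplot(data=weather, x="Season", y="Temp", hue="Location")
plt.title("Temperature by season and location")
plt.show()
```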
4 | -------------------------------------------------------------------------------- /DataDistributions/weatherData.csv: -------------------------------------------------------------------------------- 1 | Location,Season,Entry,Temp 2 | City A,Spring,1,35.0 3 | City B,Spring,1,35.0 4 | City C,Spring,1,69.0 5 | City A,Summer,1,78.0 6 | City B,Summer,1,74.0 7 | City C,Summer,1,72.0 8 | City A,Fall,1,46.0 9 | City B,Fall,1,66.0 10 | City C,Fall,1,63.0 11 | City A,Winter,1,16.0 12 | City B,Winter,1,45.0 13 | City C,Winter,1,54.0 14 | City A,Spring,2,25.0 15 | City B,Spring,2,48.0 16 | City C,Spring,2,68.0 17 | City A,Summer,2,75.0 18 | City B,Summer,2,75.0 19 | City C,Summer,2,93.0 20 | City A,Fall,2,50.0 21 | City B,Fall,2,59.0 22 | City C,Fall,2,67.0 23 | City A,Winter,2,29.0 24 | City B,Winter,2,36.0 25 | City C,Winter,2,46.0 26 | City A,Spring,3,63.0 27 | City B,Spring,3,39.0 28 | City C,Spring,3,73.0 29 | City A,Summer,3,76.0 30 | City B,Summer,3,89.0 31 | City C,Summer,3,94.0 32 | City A,Fall,3,46.0 33 | City B,Fall,3,49.0 34 | City C,Fall,3,61.0 35 | City A,Winter,3,13.0 36 | City B,Winter,3,30.0 37 | City C,Winter,3,54.0 38 | City A,Spring,4,42.0 39 | City B,Spring,4,54.0 40 | City C,Spring,4,88.0 41 | City A,Summer,4,74.0 42 | City B,Summer,4,71.0 43 | City C,Summer,4,91.0 44 | City A,Fall,4,53.0 45 | City B,Fall,4,54.0 46 | City C,Fall,4,72.0 47 | City A,Winter,4,-3.0 48 | City B,Winter,4,24.0 49 | City C,Winter,4,48.0 50 | City A,Spring,5,59.0 51 | City B,Spring,5,66.0 52 | City C,Spring,5,68.0 53 | City A,Summer,5,81.0 54 | City B,Summer,5,61.0 55 | City C,Summer,5,102.0 56 | City A,Fall,5,0.0 57 | City B,Fall,5,67.0 58 | City C,Fall,5,79.0 59 | City A,Winter,5,1.0 60 | City B,Winter,5,38.0 61 | City C,Winter,5,54.0 62 | City A,Spring,6,44.0 63 | City B,Spring,6,47.0 64 | City C,Spring,6,74.0 65 | City A,Summer,6,87.0 66 | City B,Summer,6,77.0 67 | City C,Summer,6,94.0 68 | City A,Fall,6,0.0 69 | City B,Fall,6,54.0 70 | City C,Fall,6,60.0 71 | City A,Winter,6,24.0 72 | City B,Winter,6,31.0 73 | City C,Winter,6,65.0 74 | City A,Spring,7,38.0 75 | City B,Spring,7,56.0 76 | City C,Spring,7,61.0 77 | City A,Summer,7,72.0 78 | City B,Summer,7,68.0 79 | City C,Summer,7,96.0 80 | City A,Fall,7,47.0 81 | City B,Fall,7,59.0 82 | City C,Fall,7,71.0 83 | City A,Winter,7,22.0 84 | City B,Winter,7,45.0 85 | City C,Winter,7,62.0 86 | City A,Spring,8,21.0 87 | City B,Spring,8,56.0 88 | City C,Spring,8,78.0 89 | City A,Summer,8,81.0 90 | City B,Summer,8,72.0 91 | City C,Summer,8,88.0 92 | City A,Fall,8,44.0 93 | City B,Fall,8,48.0 94 | City C,Fall,8,65.0 95 | City A,Winter,8,9.0 96 | City B,Winter,8,33.0 97 | City C,Winter,8,50.0 98 | City A,Spring,9,43.0 99 | City B,Spring,9,46.0 100 | City C,Spring,9,87.0 101 | City A,Summer,9,80.0 102 | City B,Summer,9,68.0 103 | City C,Summer,9,90.0 104 | City A,Fall,9,62.0 105 | City B,Fall,9,50.0 106 | City C,Fall,9,50.0 107 | City A,Winter,9,26.0 108 | City B,Winter,9,26.0 109 | City C,Winter,9,57.0 110 | City A,Spring,10,65.0 111 | City B,Spring,10,65.0 112 | City C,Spring,10,74.0 113 | City A,Summer,10,77.0 114 | City B,Summer,10,78.0 115 | City C,Summer,10,94.0 116 | City A,Fall,10,46.0 117 | City B,Fall,10,55.0 118 | City C,Fall,10,70.0 119 | City A,Winter,10,23.0 120 | City B,Winter,10,33.0 121 | City C,Winter,10,49.0 122 | City A,Spring,11,48.0 123 | City B,Spring,11,27.0 124 | City C,Spring,11,71.0 125 | City A,Summer,11,85.0 126 | City B,Summer,11,67.0 127 | City C,Summer,11,82.0 128 | City A,Fall,11,44.0 129 | City B,Fall,11,57.0 130 | City 
C,Fall,11,64.0 131 | City A,Winter,11,-1.0 132 | City B,Winter,11,27.0 133 | City C,Winter,11,63.0 134 | City A,Spring,12,27.0 135 | City B,Spring,12,47.0 136 | City C,Spring,12,74.0 137 | City A,Summer,12,87.0 138 | City B,Summer,12,73.0 139 | City C,Summer,12,98.0 140 | City A,Fall,12,36.0 141 | City B,Fall,12,45.0 142 | City C,Fall,12,56.0 143 | City A,Winter,12,7.0 144 | City B,Winter,12,18.0 145 | City C,Winter,12,54.0 146 | City A,Spring,13,55.0 147 | City B,Spring,13,27.0 148 | City C,Spring,13,84.0 149 | City A,Summer,13,80.0 150 | City B,Summer,13,81.0 151 | City C,Summer,13,84.0 152 | City A,Fall,13,42.0 153 | City B,Fall,13,70.0 154 | City C,Fall,13,71.0 155 | City A,Winter,13,30.0 156 | City B,Winter,13,34.0 157 | City C,Winter,13,55.0 158 | City A,Spring,14,46.0 159 | City B,Spring,14,69.0 160 | City C,Spring,14,62.0 161 | City A,Summer,14,78.0 162 | City B,Summer,14,65.0 163 | City C,Summer,14,83.0 164 | City A,Fall,14,56.0 165 | City B,Fall,14,54.0 166 | City C,Fall,14,58.0 167 | City A,Winter,14,16.0 168 | City B,Winter,14,39.0 169 | City C,Winter,14,55.0 170 | City A,Spring,15,29.0 171 | City B,Spring,15,37.0 172 | City C,Spring,15,77.0 173 | City A,Summer,15,77.0 174 | City B,Summer,15,81.0 175 | City C,Summer,15,100.0 176 | City A,Fall,15,23.0 177 | City B,Fall,15,51.0 178 | City C,Fall,15,62.0 179 | City A,Winter,15,28.0 180 | City B,Winter,15,45.0 181 | City C,Winter,15,56.0 182 | City A,Spring,16,50.0 183 | City B,Spring,16,52.0 184 | City C,Spring,16,75.0 185 | City A,Summer,16,82.0 186 | City B,Summer,16,62.0 187 | City C,Summer,16,79.0 188 | City A,Fall,16,55.0 189 | City B,Fall,16,63.0 190 | City C,Fall,16,64.0 191 | City A,Winter,16,12.0 192 | City B,Winter,16,34.0 193 | City C,Winter,16,59.0 194 | City A,Spring,17,44.0 195 | City B,Spring,17,47.0 196 | City C,Spring,17,69.0 197 | City A,Summer,17,89.0 198 | City B,Summer,17,65.0 199 | City C,Summer,17,82.0 200 | City A,Fall,17,48.0 201 | City B,Fall,17,67.0 202 | City C,Fall,17,52.0 203 | City A,Winter,17,0.0 204 | City B,Winter,17,29.0 205 | City C,Winter,17,57.0 206 | City A,Spring,18,53.0 207 | City B,Spring,18,57.0 208 | City C,Spring,18,63.0 209 | City A,Summer,18,84.0 210 | City B,Summer,18,57.0 211 | City C,Summer,18,85.0 212 | City A,Fall,18,48.0 213 | City B,Fall,18,60.0 214 | City C,Fall,18,65.0 215 | City A,Winter,18,10.0 216 | City B,Winter,18,30.0 217 | City C,Winter,18,60.0 218 | City A,Spring,19,29.0 219 | City B,Spring,19,50.0 220 | City C,Spring,19,53.0 221 | City A,Summer,19,63.0 222 | City B,Summer,19,71.0 223 | City C,Summer,19,88.0 224 | City A,Fall,19,20.0 225 | City B,Fall,19,47.0 226 | City C,Fall,19,71.0 227 | City A,Winter,19,20.0 228 | City B,Winter,19,34.0 229 | City C,Winter,19,58.0 230 | City A,Spring,20,53.0 231 | City B,Spring,20,39.0 232 | City C,Spring,20,62.0 233 | City A,Summer,20,95.0 234 | City B,Summer,20,69.0 235 | City C,Summer,20,91.0 236 | City A,Fall,20,13.0 237 | City B,Fall,20,62.0 238 | City C,Fall,20,57.0 239 | City A,Winter,20,30.0 240 | City B,Winter,20,34.0 241 | City C,Winter,20,48.0 242 | City A,Spring,21,34.0 243 | City B,Spring,21,50.0 244 | City C,Spring,21,79.0 245 | City A,Summer,21,81.0 246 | City B,Summer,21,62.0 247 | City C,Summer,21,99.0 248 | City A,Fall,21,23.0 249 | City B,Fall,21,45.0 250 | City C,Fall,21,62.0 251 | City A,Winter,21,16.0 252 | City B,Winter,21,42.0 253 | City C,Winter,21,65.0 254 | City A,Spring,22,65.0 255 | City B,Spring,22,35.0 256 | City C,Spring,22,67.0 257 | City A,Summer,22,82.0 258 | City B,Summer,22,80.0 259 | City 
C,Summer,22,89.0 260 | City A,Fall,22,74.0 261 | City B,Fall,22,53.0 262 | City C,Fall,22,68.0 263 | City A,Winter,22,24.0 264 | City B,Winter,22,23.0 265 | City C,Winter,22,50.0 266 | City A,Spring,23,19.0 267 | City B,Spring,23,50.0 268 | City C,Spring,23,88.0 269 | City A,Summer,23,89.0 270 | City B,Summer,23,53.0 271 | City C,Summer,23,97.0 272 | City A,Fall,23,36.0 273 | City B,Fall,23,63.0 274 | City C,Fall,23,59.0 275 | City A,Winter,23,33.0 276 | City B,Winter,23,25.0 277 | City C,Winter,23,53.0 278 | City A,Spring,24,45.0 279 | City B,Spring,24,52.0 280 | City C,Spring,24,73.0 281 | City A,Summer,24,95.0 282 | City B,Summer,24,64.0 283 | City C,Summer,24,100.0 284 | City A,Fall,24,60.0 285 | City B,Fall,24,52.0 286 | City C,Fall,24,65.0 287 | City A,Winter,24,21.0 288 | City B,Winter,24,26.0 289 | City C,Winter,24,49.0 290 | City A,Spring,25,29.0 291 | City B,Spring,25,44.0 292 | City C,Spring,25,80.0 293 | City A,Summer,25,96.0 294 | City B,Summer,25,59.0 295 | City C,Summer,25,93.0 296 | City A,Fall,25,30.0 297 | City B,Fall,25,55.0 298 | City C,Fall,25,76.0 299 | City A,Winter,25,16.0 300 | City B,Winter,25,29.0 301 | City C,Winter,25,53.0 302 | City A,Spring,26,47.0 303 | City B,Spring,26,21.0 304 | City C,Spring,26,74.0 305 | City A,Summer,26,90.0 306 | City B,Summer,26,79.0 307 | City C,Summer,26,82.0 308 | City A,Fall,26,61.0 309 | City B,Fall,26,50.0 310 | City C,Fall,26,69.0 311 | City A,Winter,26,28.0 312 | City B,Winter,26,30.0 313 | City C,Winter,26,62.0 314 | City A,Spring,27,38.0 315 | City B,Spring,27,51.0 316 | City C,Spring,27,75.0 317 | City A,Summer,27,75.0 318 | City B,Summer,27,73.0 319 | City C,Summer,27,96.0 320 | City A,Fall,27,30.0 321 | City B,Fall,27,58.0 322 | City C,Fall,27,53.0 323 | City A,Winter,27,28.0 324 | City B,Winter,27,36.0 325 | City C,Winter,27,61.0 326 | City A,Spring,28,35.0 327 | City B,Spring,28,61.0 328 | City C,Spring,28,68.0 329 | City A,Summer,28,83.0 330 | City B,Summer,28,75.0 331 | City C,Summer,28,84.0 332 | City A,Fall,28,29.0 333 | City B,Fall,28,58.0 334 | City C,Fall,28,66.0 335 | City A,Winter,28,7.0 336 | City B,Winter,28,31.0 337 | City C,Winter,28,58.0 338 | City A,Spring,29,30.0 339 | City B,Spring,29,44.0 340 | City C,Spring,29,60.0 341 | City A,Summer,29,89.0 342 | City B,Summer,29,72.0 343 | City C,Summer,29,87.0 344 | City A,Fall,29,21.0 345 | City B,Fall,29,51.0 346 | City C,Fall,29,63.0 347 | City A,Winter,29,16.0 348 | City B,Winter,29,24.0 349 | City C,Winter,29,51.0 350 | City A,Spring,30,46.0 351 | City B,Spring,30,33.0 352 | City C,Spring,30,85.0 353 | City A,Summer,30,82.0 354 | City B,Summer,30,80.0 355 | City C,Summer,30,93.0 356 | City A,Fall,30,67.0 357 | City B,Fall,30,52.0 358 | City C,Fall,30,58.0 359 | City A,Winter,30,7.0 360 | City B,Winter,30,16.0 361 | City C,Winter,30,62.0 362 | City A,Spring,31,51.0 363 | City B,Spring,31,39.0 364 | City C,Spring,31,73.0 365 | City A,Summer,31,91.0 366 | City B,Summer,31,66.0 367 | City C,Summer,31,88.0 368 | City A,Fall,31,51.0 369 | City B,Fall,31,42.0 370 | City C,Fall,31,69.0 371 | City A,Winter,31,5.0 372 | City B,Winter,31,26.0 373 | City C,Winter,31,47.0 374 | City A,Spring,32,38.0 375 | City B,Spring,32,69.0 376 | City C,Spring,32,58.0 377 | City A,Summer,32,84.0 378 | City B,Summer,32,58.0 379 | City C,Summer,32,100.0 380 | City A,Fall,32,35.0 381 | City B,Fall,32,52.0 382 | City C,Fall,32,81.0 383 | City A,Winter,32,10.0 384 | City B,Winter,32,24.0 385 | City C,Winter,32,52.0 386 | City A,Spring,33,29.0 387 | City B,Spring,33,48.0 388 | City 
C,Spring,33,63.0 389 | City A,Summer,33,82.0 390 | City B,Summer,33,90.0 391 | City C,Summer,33,82.0 392 | City A,Fall,33,27.0 393 | City B,Fall,33,51.0 394 | City C,Fall,33,60.0 395 | City A,Winter,33,20.0 396 | City B,Winter,33,31.0 397 | City C,Winter,33,58.0 398 | City A,Spring,34,52.0 399 | City B,Spring,34,57.0 400 | City C,Spring,34,74.0 401 | City A,Summer,34,89.0 402 | City B,Summer,34,69.0 403 | City C,Summer,34,96.0 404 | City A,Fall,34,29.0 405 | City B,Fall,34,52.0 406 | City C,Fall,34,62.0 407 | City A,Winter,34,0.0 408 | City B,Winter,34,37.0 409 | City C,Winter,34,56.0 410 | City A,Spring,35,35.0 411 | City B,Spring,35,56.0 412 | City C,Spring,35,71.0 413 | City A,Summer,35,88.0 414 | City B,Summer,35,71.0 415 | City C,Summer,35,82.0 416 | City A,Fall,35,31.0 417 | City B,Fall,35,57.0 418 | City C,Fall,35,69.0 419 | City A,Winter,35,34.0 420 | City B,Winter,35,15.0 421 | City C,Winter,35,57.0 422 | City A,Spring,36,45.0 423 | City B,Spring,36,55.0 424 | City C,Spring,36,75.0 425 | City A,Summer,36,80.0 426 | City B,Summer,36,73.0 427 | City C,Summer,36,89.0 428 | City A,Fall,36,41.0 429 | City B,Fall,36,56.0 430 | City C,Fall,36,50.0 431 | City A,Winter,36,-15.0 432 | City B,Winter,36,31.0 433 | City C,Winter,36,50.0 434 | City A,Spring,37,32.0 435 | City B,Spring,37,43.0 436 | City C,Spring,37,72.0 437 | City A,Summer,37,71.0 438 | City B,Summer,37,68.0 439 | City C,Summer,37,82.0 440 | City A,Fall,37,25.0 441 | City B,Fall,37,57.0 442 | City C,Fall,37,63.0 443 | City A,Winter,37,7.0 444 | City B,Winter,37,26.0 445 | City C,Winter,37,59.0 446 | City A,Spring,38,51.0 447 | City B,Spring,38,29.0 448 | City C,Spring,38,78.0 449 | City A,Summer,38,88.0 450 | City B,Summer,38,70.0 451 | City C,Summer,38,86.0 452 | City A,Fall,38,35.0 453 | City B,Fall,38,49.0 454 | City C,Fall,38,50.0 455 | City A,Winter,38,9.0 456 | City B,Winter,38,17.0 457 | City C,Winter,38,61.0 458 | City A,Spring,39,34.0 459 | City B,Spring,39,46.0 460 | City C,Spring,39,86.0 461 | City A,Summer,39,82.0 462 | City B,Summer,39,63.0 463 | City C,Summer,39,100.0 464 | City A,Fall,39,16.0 465 | City B,Fall,39,61.0 466 | City C,Fall,39,81.0 467 | City A,Winter,39,14.0 468 | City B,Winter,39,24.0 469 | City C,Winter,39,51.0 470 | City A,Spring,40,64.0 471 | City B,Spring,40,53.0 472 | City C,Spring,40,66.0 473 | City A,Summer,40,67.0 474 | City B,Summer,40,70.0 475 | City C,Summer,40,86.0 476 | City A,Fall,40,44.0 477 | City B,Fall,40,58.0 478 | City C,Fall,40,66.0 479 | City A,Winter,40,5.0 480 | City B,Winter,40,27.0 481 | City C,Winter,40,57.0 482 | City A,Spring,41,56.0 483 | City B,Spring,41,46.0 484 | City C,Spring,41,70.0 485 | City A,Summer,41,82.0 486 | City B,Summer,41,72.0 487 | City C,Summer,41,90.0 488 | City A,Fall,41,44.0 489 | City B,Fall,41,43.0 490 | City C,Fall,41,74.0 491 | City A,Winter,41,36.0 492 | City B,Winter,41,35.0 493 | City C,Winter,41,41.0 494 | City A,Spring,42,53.0 495 | City B,Spring,42,55.0 496 | City C,Spring,42,66.0 497 | City A,Summer,42,71.0 498 | City B,Summer,42,67.0 499 | City C,Summer,42,93.0 500 | City A,Fall,42,58.0 501 | City B,Fall,42,50.0 502 | City C,Fall,42,81.0 503 | City A,Winter,42,15.0 504 | City B,Winter,42,25.0 505 | City C,Winter,42,54.0 506 | City A,Spring,43,18.0 507 | City B,Spring,43,34.0 508 | City C,Spring,43,88.0 509 | City A,Summer,43,99.0 510 | City B,Summer,43,62.0 511 | City C,Summer,43,100.0 512 | City A,Fall,43,52.0 513 | City B,Fall,43,57.0 514 | City C,Fall,43,68.0 515 | City A,Winter,43,6.0 516 | City B,Winter,43,14.0 517 | City 
C,Winter,43,57.0 518 | City A,Spring,44,50.0 519 | City B,Spring,44,42.0 520 | City C,Spring,44,67.0 521 | City A,Summer,44,78.0 522 | City B,Summer,44,74.0 523 | City C,Summer,44,91.0 524 | City A,Fall,44,45.0 525 | City B,Fall,44,52.0 526 | City C,Fall,44,63.0 527 | City A,Winter,44,18.0 528 | City B,Winter,44,20.0 529 | City C,Winter,44,58.0 530 | City A,Spring,45,38.0 531 | City B,Spring,45,42.0 532 | City C,Spring,45,80.0 533 | City A,Summer,45,91.0 534 | City B,Summer,45,57.0 535 | City C,Summer,45,97.0 536 | City A,Fall,45,25.0 537 | City B,Fall,45,62.0 538 | City C,Fall,45,60.0 539 | City A,Winter,45,11.0 540 | City B,Winter,45,25.0 541 | City C,Winter,45,47.0 542 | City A,Spring,46,38.0 543 | City B,Spring,46,57.0 544 | City C,Spring,46,67.0 545 | City A,Summer,46,80.0 546 | City B,Summer,46,68.0 547 | City C,Summer,46,91.0 548 | City A,Fall,46,64.0 549 | City B,Fall,46,51.0 550 | City C,Fall,46,53.0 551 | City A,Winter,46,3.0 552 | City B,Winter,46,42.0 553 | City C,Winter,46,53.0 554 | City A,Spring,47,48.0 555 | City B,Spring,47,28.0 556 | City C,Spring,47,65.0 557 | City A,Summer,47,87.0 558 | City B,Summer,47,67.0 559 | City C,Summer,47,91.0 560 | City A,Fall,47,40.0 561 | City B,Fall,47,57.0 562 | City C,Fall,47,71.0 563 | City A,Winter,47,40.0 564 | City B,Winter,47,36.0 565 | City C,Winter,47,47.0 566 | City A,Spring,48,48.0 567 | City B,Spring,48,63.0 568 | City C,Spring,48,81.0 569 | City A,Summer,48,77.0 570 | City B,Summer,48,71.0 571 | City C,Summer,48,95.0 572 | City A,Fall,48,52.0 573 | City B,Fall,48,51.0 574 | City C,Fall,48,71.0 575 | City A,Winter,48,6.0 576 | City B,Winter,48,29.0 577 | City C,Winter,48,53.0 578 | City A,Spring,49,39.0 579 | City B,Spring,49,69.0 580 | City C,Spring,49,81.0 581 | City A,Summer,49,80.0 582 | City B,Summer,49,55.0 583 | City C,Summer,49,90.0 584 | City A,Fall,49,43.0 585 | City B,Fall,49,41.0 586 | City C,Fall,49,60.0 587 | City A,Winter,49,-6.0 588 | City B,Winter,49,36.0 589 | City C,Winter,49,54.0 590 | City A,Spring,50,44.0 591 | City B,Spring,50,52.0 592 | City C,Spring,50,66.0 593 | City A,Summer,50,78.0 594 | City B,Summer,50,67.0 595 | City C,Summer,50,86.0 596 | City A,Fall,50,55.0 597 | City B,Fall,50,65.0 598 | City C,Fall,50,61.0 599 | City A,Winter,50,14.0 600 | City B,Winter,50,24.0 601 | City C,Winter,50,40.0 602 | City A,Spring,51,29.0 603 | City B,Spring,51,56.0 604 | City C,Spring,51,77.0 605 | City A,Summer,51,77.0 606 | City B,Summer,51,71.0 607 | City C,Summer,51,97.0 608 | City A,Fall,51,36.0 609 | City B,Fall,51,61.0 610 | City C,Fall,51,50.0 611 | City A,Winter,51,6.0 612 | City B,Winter,51,28.0 613 | City C,Winter,51,55.0 614 | City A,Spring,52,46.0 615 | City B,Spring,52,54.0 616 | City C,Spring,52,70.0 617 | City A,Summer,52,88.0 618 | City B,Summer,52,53.0 619 | City C,Summer,52,95.0 620 | City A,Fall,52,38.0 621 | City B,Fall,52,50.0 622 | City C,Fall,52,71.0 623 | City A,Winter,52,21.0 624 | City B,Winter,52,38.0 625 | City C,Winter,52,53.0 626 | City A,Spring,53,12.0 627 | City B,Spring,53,51.0 628 | City C,Spring,53,60.0 629 | City A,Summer,53,83.0 630 | City B,Summer,53,67.0 631 | City C,Summer,53,84.0 632 | City A,Fall,53,25.0 633 | City B,Fall,53,43.0 634 | City C,Fall,53,57.0 635 | City A,Winter,53,8.0 636 | City B,Winter,53,45.0 637 | City C,Winter,53,49.0 638 | City A,Spring,54,50.0 639 | City B,Spring,54,59.0 640 | City C,Spring,54,69.0 641 | City A,Summer,54,84.0 642 | City B,Summer,54,83.0 643 | City C,Summer,54,96.0 644 | City A,Fall,54,23.0 645 | City B,Fall,54,41.0 646 | City 
C,Fall,54,59.0 647 | City A,Winter,54,18.0 648 | City B,Winter,54,44.0 649 | City C,Winter,54,44.0 650 | City A,Spring,55,24.0 651 | City B,Spring,55,42.0 652 | City C,Spring,55,84.0 653 | City A,Summer,55,99.0 654 | City B,Summer,55,67.0 655 | City C,Summer,55,76.0 656 | City A,Fall,55,42.0 657 | City B,Fall,55,58.0 658 | City C,Fall,55,56.0 659 | City A,Winter,55,19.0 660 | City B,Winter,55,32.0 661 | City C,Winter,55,49.0 662 | City A,Spring,56,31.0 663 | City B,Spring,56,66.0 664 | City C,Spring,56,81.0 665 | City A,Summer,56,80.0 666 | City B,Summer,56,59.0 667 | City C,Summer,56,93.0 668 | City A,Fall,56,66.0 669 | City B,Fall,56,53.0 670 | City C,Fall,56,57.0 671 | City A,Winter,56,1.0 672 | City B,Winter,56,45.0 673 | City C,Winter,56,57.0 674 | City A,Spring,57,11.0 675 | City B,Spring,57,53.0 676 | City C,Spring,57,80.0 677 | City A,Summer,57,73.0 678 | City B,Summer,57,66.0 679 | City C,Summer,57,97.0 680 | City A,Fall,57,61.0 681 | City B,Fall,57,46.0 682 | City C,Fall,57,57.0 683 | City A,Winter,57,0.0 684 | City B,Winter,57,32.0 685 | City C,Winter,57,47.0 686 | City A,Spring,58,41.0 687 | City B,Spring,58,47.0 688 | City C,Spring,58,71.0 689 | City A,Summer,58,87.0 690 | City B,Summer,58,71.0 691 | City C,Summer,58,86.0 692 | City A,Fall,58,54.0 693 | City B,Fall,58,61.0 694 | City C,Fall,58,69.0 695 | City A,Winter,58,-2.0 696 | City B,Winter,58,30.0 697 | City C,Winter,58,56.0 698 | City A,Spring,59,61.0 699 | City B,Spring,59,54.0 700 | City C,Spring,59,63.0 701 | City A,Summer,59,75.0 702 | City B,Summer,59,69.0 703 | City C,Summer,59,91.0 704 | City A,Fall,59,31.0 705 | City B,Fall,59,42.0 706 | City C,Fall,59,66.0 707 | City A,Winter,59,27.0 708 | City B,Winter,59,37.0 709 | City C,Winter,59,55.0 710 | City A,Spring,60,46.0 711 | City B,Spring,60,38.0 712 | City C,Spring,60,66.0 713 | City A,Summer,60,85.0 714 | City B,Summer,60,71.0 715 | City C,Summer,60,88.0 716 | City A,Fall,60,46.0 717 | City B,Fall,60,40.0 718 | City C,Fall,60,70.0 719 | City A,Winter,60,5.0 720 | City B,Winter,60,25.0 721 | City C,Winter,60,59.0 722 | City A,Spring,61,46.0 723 | City B,Spring,61,40.0 724 | City C,Spring,61,76.0 725 | City A,Summer,61,91.0 726 | City B,Summer,61,61.0 727 | City C,Summer,61,88.0 728 | City A,Fall,61,41.0 729 | City B,Fall,61,46.0 730 | City C,Fall,61,69.0 731 | City A,Winter,61,20.0 732 | City B,Winter,61,39.0 733 | City C,Winter,61,46.0 734 | City A,Spring,62,31.0 735 | City B,Spring,62,47.0 736 | City C,Spring,62,65.0 737 | City A,Summer,62,66.0 738 | City B,Summer,62,77.0 739 | City C,Summer,62,83.0 740 | City A,Fall,62,34.0 741 | City B,Fall,62,48.0 742 | City C,Fall,62,51.0 743 | City A,Winter,62,23.0 744 | City B,Winter,62,25.0 745 | City C,Winter,62,54.0 746 | City A,Spring,63,26.0 747 | City B,Spring,63,42.0 748 | City C,Spring,63,77.0 749 | City A,Summer,63,89.0 750 | City B,Summer,63,79.0 751 | City C,Summer,63,92.0 752 | City A,Fall,63,65.0 753 | City B,Fall,63,53.0 754 | City C,Fall,63,76.0 755 | City A,Winter,63,0.0 756 | City B,Winter,63,38.0 757 | City C,Winter,63,46.0 758 | City A,Spring,64,26.0 759 | City B,Spring,64,43.0 760 | City C,Spring,64,53.0 761 | City A,Summer,64,76.0 762 | City B,Summer,64,59.0 763 | City C,Summer,64,93.0 764 | City A,Fall,64,56.0 765 | City B,Fall,64,51.0 766 | City C,Fall,64,66.0 767 | City A,Winter,64,18.0 768 | City B,Winter,64,37.0 769 | City C,Winter,64,51.0 770 | City A,Spring,65,25.0 771 | City B,Spring,65,28.0 772 | City C,Spring,65,86.0 773 | City A,Summer,65,83.0 774 | City B,Summer,65,60.0 775 | City 
C,Summer,65,94.0 776 | City A,Fall,65,48.0 777 | City B,Fall,65,56.0 778 | City C,Fall,65,73.0 779 | City A,Winter,65,13.0 780 | City B,Winter,65,27.0 781 | City C,Winter,65,43.0 782 | City A,Spring,66,29.0 783 | City B,Spring,66,45.0 784 | City C,Spring,66,61.0 785 | City A,Summer,66,69.0 786 | City B,Summer,66,71.0 787 | City C,Summer,66,76.0 788 | City A,Fall,66,53.0 789 | City B,Fall,66,45.0 790 | City C,Fall,66,78.0 791 | City A,Winter,66,15.0 792 | City B,Winter,66,30.0 793 | City C,Winter,66,56.0 794 | City A,Spring,67,39.0 795 | City B,Spring,67,60.0 796 | City C,Spring,67,67.0 797 | City A,Summer,67,78.0 798 | City B,Summer,67,63.0 799 | City C,Summer,67,95.0 800 | City A,Fall,67,35.0 801 | City B,Fall,67,64.0 802 | City C,Fall,67,67.0 803 | City A,Winter,67,39.0 804 | City B,Winter,67,24.0 805 | City C,Winter,67,54.0 806 | City A,Spring,68,22.0 807 | City B,Spring,68,40.0 808 | City C,Spring,68,71.0 809 | City A,Summer,68,89.0 810 | City B,Summer,68,68.0 811 | City C,Summer,68,98.0 812 | City A,Fall,68,52.0 813 | City B,Fall,68,49.0 814 | City C,Fall,68,50.0 815 | City A,Winter,68,7.0 816 | City B,Winter,68,21.0 817 | City C,Winter,68,48.0 818 | City A,Spring,69,53.0 819 | City B,Spring,69,41.0 820 | City C,Spring,69,65.0 821 | City A,Summer,69,90.0 822 | City B,Summer,69,85.0 823 | City C,Summer,69,92.0 824 | City A,Fall,69,28.0 825 | City B,Fall,69,55.0 826 | City C,Fall,69,68.0 827 | City A,Winter,69,20.0 828 | City B,Winter,69,42.0 829 | City C,Winter,69,45.0 830 | City A,Spring,70,45.0 831 | City B,Spring,70,43.0 832 | City C,Spring,70,60.0 833 | City A,Summer,70,99.0 834 | City B,Summer,70,78.0 835 | City C,Summer,70,88.0 836 | City A,Fall,70,23.0 837 | City B,Fall,70,60.0 838 | City C,Fall,70,64.0 839 | City A,Winter,70,21.0 840 | City B,Winter,70,34.0 841 | City C,Winter,70,63.0 842 | City A,Spring,71,34.0 843 | City B,Spring,71,43.0 844 | City C,Spring,71,70.0 845 | City A,Summer,71,73.0 846 | City B,Summer,71,58.0 847 | City C,Summer,71,85.0 848 | City A,Fall,71,47.0 849 | City B,Fall,71,54.0 850 | City C,Fall,71,50.0 851 | City A,Winter,71,31.0 852 | City B,Winter,71,23.0 853 | City C,Winter,71,53.0 854 | City A,Spring,72,39.0 855 | City B,Spring,72,46.0 856 | City C,Spring,72,80.0 857 | City A,Summer,72,65.0 858 | City B,Summer,72,70.0 859 | City C,Summer,72,93.0 860 | City A,Fall,72,25.0 861 | City B,Fall,72,45.0 862 | City C,Fall,72,75.0 863 | City A,Winter,72,5.0 864 | City B,Winter,72,45.0 865 | City C,Winter,72,58.0 866 | City A,Spring,73,26.0 867 | City B,Spring,73,30.0 868 | City C,Spring,73,71.0 869 | City A,Summer,73,76.0 870 | City B,Summer,73,67.0 871 | City C,Summer,73,86.0 872 | City A,Fall,73,35.0 873 | City B,Fall,73,46.0 874 | City C,Fall,73,74.0 875 | City A,Winter,73,34.0 876 | City B,Winter,73,28.0 877 | City C,Winter,73,40.0 878 | City A,Spring,74,28.0 879 | City B,Spring,74,64.0 880 | City C,Spring,74,83.0 881 | City A,Summer,74,79.0 882 | City B,Summer,74,63.0 883 | City C,Summer,74,94.0 884 | City A,Fall,74,36.0 885 | City B,Fall,74,52.0 886 | City C,Fall,74,66.0 887 | City A,Winter,74,24.0 888 | City B,Winter,74,44.0 889 | City C,Winter,74,58.0 890 | City A,Spring,75,65.0 891 | City B,Spring,75,45.0 892 | City C,Spring,75,64.0 893 | City A,Summer,75,82.0 894 | City B,Summer,75,74.0 895 | City C,Summer,75,98.0 896 | City A,Fall,75,32.0 897 | City B,Fall,75,55.0 898 | City C,Fall,75,61.0 899 | City A,Winter,75,21.0 900 | City B,Winter,75,30.0 901 | City C,Winter,75,56.0 902 | City A,Spring,76,40.0 903 | City B,Spring,76,55.0 904 | City 
C,Spring,76,70.0 905 | City A,Summer,76,83.0 906 | City B,Summer,76,64.0 907 | City C,Summer,76,103.0 908 | City A,Fall,76,27.0 909 | City B,Fall,76,59.0 910 | City C,Fall,76,63.0 911 | City A,Winter,76,5.0 912 | City B,Winter,76,45.0 913 | City C,Winter,76,56.0 914 | City A,Spring,77,47.0 915 | City B,Spring,77,50.0 916 | City C,Spring,77,81.0 917 | City A,Summer,77,88.0 918 | City B,Summer,77,64.0 919 | City C,Summer,77,89.0 920 | City A,Fall,77,37.0 921 | City B,Fall,77,34.0 922 | City C,Fall,77,67.0 923 | City A,Winter,77,19.0 924 | City B,Winter,77,23.0 925 | City C,Winter,77,57.0 926 | City A,Spring,78,40.0 927 | City B,Spring,78,54.0 928 | City C,Spring,78,83.0 929 | City A,Summer,78,90.0 930 | City B,Summer,78,68.0 931 | City C,Summer,78,88.0 932 | City A,Fall,78,48.0 933 | City B,Fall,78,48.0 934 | City C,Fall,78,78.0 935 | City A,Winter,78,29.0 936 | City B,Winter,78,36.0 937 | City C,Winter,78,59.0 938 | City A,Spring,79,23.0 939 | City B,Spring,79,46.0 940 | City C,Spring,79,59.0 941 | City A,Summer,79,87.0 942 | City B,Summer,79,71.0 943 | City C,Summer,79,84.0 944 | City A,Fall,79,61.0 945 | City B,Fall,79,50.0 946 | City C,Fall,79,65.0 947 | City A,Winter,79,15.0 948 | City B,Winter,79,16.0 949 | City C,Winter,79,43.0 950 | City A,Spring,80,37.0 951 | City B,Spring,80,43.0 952 | City C,Spring,80,65.0 953 | City A,Summer,80,86.0 954 | City B,Summer,80,71.0 955 | City C,Summer,80,99.0 956 | City A,Fall,80,25.0 957 | City B,Fall,80,65.0 958 | City C,Fall,80,77.0 959 | City A,Winter,80,25.0 960 | City B,Winter,80,28.0 961 | City C,Winter,80,54.0 962 | City A,Spring,81,53.0 963 | City B,Spring,81,38.0 964 | City C,Spring,81,70.0 965 | City A,Summer,81,78.0 966 | City B,Summer,81,73.0 967 | City C,Summer,81,99.0 968 | City A,Fall,81,46.0 969 | City B,Fall,81,54.0 970 | City C,Fall,81,65.0 971 | City A,Winter,81,7.0 972 | City B,Winter,81,22.0 973 | City C,Winter,81,59.0 974 | City A,Spring,82,40.0 975 | City B,Spring,82,43.0 976 | City C,Spring,82,61.0 977 | City A,Summer,82,90.0 978 | City B,Summer,82,64.0 979 | City C,Summer,82,92.0 980 | City A,Fall,82,55.0 981 | City B,Fall,82,51.0 982 | City C,Fall,82,62.0 983 | City A,Winter,82,9.0 984 | City B,Winter,82,40.0 985 | City C,Winter,82,62.0 986 | City A,Spring,83,22.0 987 | City B,Spring,83,46.0 988 | City C,Spring,83,76.0 989 | City A,Summer,83,86.0 990 | City B,Summer,83,68.0 991 | City C,Summer,83,97.0 992 | City A,Fall,83,44.0 993 | City B,Fall,83,34.0 994 | City C,Fall,83,65.0 995 | City A,Winter,83,-3.0 996 | City B,Winter,83,34.0 997 | City C,Winter,83,53.0 998 | City A,Spring,84,30.0 999 | City B,Spring,84,56.0 1000 | City C,Spring,84,70.0 1001 | City A,Summer,84,68.0 1002 | City B,Summer,84,76.0 1003 | City C,Summer,84,93.0 1004 | City A,Fall,84,58.0 1005 | City B,Fall,84,50.0 1006 | City C,Fall,84,59.0 1007 | City A,Winter,84,14.0 1008 | City B,Winter,84,33.0 1009 | City C,Winter,84,52.0 1010 | City A,Spring,85,35.0 1011 | City B,Spring,85,51.0 1012 | City C,Spring,85,85.0 1013 | City A,Summer,85,77.0 1014 | City B,Summer,85,70.0 1015 | City C,Summer,85,85.0 1016 | City A,Fall,85,27.0 1017 | City B,Fall,85,45.0 1018 | City C,Fall,85,66.0 1019 | City A,Winter,85,-15.0 1020 | City B,Winter,85,45.0 1021 | City C,Winter,85,57.0 1022 | City A,Spring,86,33.0 1023 | City B,Spring,86,57.0 1024 | City C,Spring,86,80.0 1025 | City A,Summer,86,86.0 1026 | City B,Summer,86,58.0 1027 | City C,Summer,86,94.0 1028 | City A,Fall,86,42.0 1029 | City B,Fall,86,47.0 1030 | City C,Fall,86,68.0 1031 | City A,Winter,86,7.0 1032 | City 
B,Winter,86,24.0 1033 | City C,Winter,86,53.0 1034 | City A,Spring,87,35.0 1035 | City B,Spring,87,59.0 1036 | City C,Spring,87,77.0 1037 | City A,Summer,87,96.0 1038 | City B,Summer,87,70.0 1039 | City C,Summer,87,100.0 1040 | City A,Fall,87,46.0 1041 | City B,Fall,87,68.0 1042 | City C,Fall,87,70.0 1043 | City A,Winter,87,2.0 1044 | City B,Winter,87,26.0 1045 | City C,Winter,87,57.0 1046 | City A,Spring,88,41.0 1047 | City B,Spring,88,62.0 1048 | City C,Spring,88,74.0 1049 | City A,Summer,88,74.0 1050 | City B,Summer,88,79.0 1051 | City C,Summer,88,88.0 1052 | City A,Fall,88,28.0 1053 | City B,Fall,88,39.0 1054 | City C,Fall,88,57.0 1055 | City A,Winter,88,11.0 1056 | City B,Winter,88,45.0 1057 | City C,Winter,88,56.0 1058 | City A,Spring,89,38.0 1059 | City B,Spring,89,46.0 1060 | City C,Spring,89,80.0 1061 | City A,Summer,89,86.0 1062 | City B,Summer,89,68.0 1063 | City C,Summer,89,96.0 1064 | City A,Fall,89,25.0 1065 | City B,Fall,89,49.0 1066 | City C,Fall,89,69.0 1067 | City A,Winter,89,27.0 1068 | City B,Winter,89,30.0 1069 | City C,Winter,89,49.0 1070 | City A,Spring,90,40.0 1071 | City B,Spring,90,59.0 1072 | City C,Spring,90,66.0 1073 | City A,Summer,90,85.0 1074 | City B,Summer,90,69.0 1075 | City C,Summer,90,98.0 1076 | City A,Fall,90,30.0 1077 | City B,Fall,90,57.0 1078 | City C,Fall,90,68.0 1079 | City A,Winter,90,9.0 1080 | City B,Winter,90,45.0 1081 | City C,Winter,90,65.0 1082 | City A,Spring,91,15.0 1083 | City B,Spring,91,48.0 1084 | City C,Spring,91,64.0 1085 | City A,Summer,91,71.0 1086 | City B,Summer,91,71.0 1087 | City C,Summer,91,88.0 1088 | City A,Fall,91,47.0 1089 | City B,Fall,91,55.0 1090 | City C,Fall,91,70.0 1091 | City A,Winter,91,14.0 1092 | City B,Winter,91,28.0 1093 | City C,Winter,91,55.0 1094 | City A,Spring,92,59.0 1095 | City B,Spring,92,57.0 1096 | City C,Spring,92,76.0 1097 | City A,Summer,92,86.0 1098 | City B,Summer,92,64.0 1099 | City C,Summer,92,99.0 1100 | City A,Fall,92,26.0 1101 | City B,Fall,92,46.0 1102 | City C,Fall,92,59.0 1103 | City A,Winter,92,7.0 1104 | City B,Winter,92,17.0 1105 | City C,Winter,92,61.0 1106 | City A,Spring,93,22.0 1107 | City B,Spring,93,43.0 1108 | City C,Spring,93,68.0 1109 | City A,Summer,93,84.0 1110 | City B,Summer,93,79.0 1111 | City C,Summer,93,102.0 1112 | City A,Fall,93,19.0 1113 | City B,Fall,93,58.0 1114 | City C,Fall,93,57.0 1115 | City A,Winter,93,15.0 1116 | City B,Winter,93,29.0 1117 | City C,Winter,93,59.0 1118 | City A,Spring,94,49.0 1119 | City B,Spring,94,48.0 1120 | City C,Spring,94,55.0 1121 | City A,Summer,94,76.0 1122 | City B,Summer,94,75.0 1123 | City C,Summer,94,89.0 1124 | City A,Fall,94,30.0 1125 | City B,Fall,94,45.0 1126 | City C,Fall,94,57.0 1127 | City A,Winter,94,23.0 1128 | City B,Winter,94,25.0 1129 | City C,Winter,94,61.0 1130 | City A,Spring,95,34.0 1131 | City B,Spring,95,34.0 1132 | City C,Spring,95,83.0 1133 | City A,Summer,95,91.0 1134 | City B,Summer,95,57.0 1135 | City C,Summer,95,103.0 1136 | City A,Fall,95,23.0 1137 | City B,Fall,95,52.0 1138 | City C,Fall,95,57.0 1139 | City A,Winter,95,30.0 1140 | City B,Winter,95,21.0 1141 | City C,Winter,95,58.0 1142 | City A,Spring,96,30.0 1143 | City B,Spring,96,55.0 1144 | City C,Spring,96,69.0 1145 | City A,Summer,96,80.0 1146 | City B,Summer,96,65.0 1147 | City C,Summer,96,94.0 1148 | City A,Fall,96,36.0 1149 | City B,Fall,96,62.0 1150 | City C,Fall,96,59.0 1151 | City A,Winter,96,27.0 1152 | City B,Winter,96,33.0 1153 | City C,Winter,96,58.0 1154 | City A,Spring,97,62.0 1155 | City B,Spring,97,47.0 1156 | City C,Spring,97,66.0 
1157 | City A,Summer,97,85.0 1158 | City B,Summer,97,71.0 1159 | City C,Summer,97,95.0 1160 | City A,Fall,97,35.0 1161 | City B,Fall,97,54.0 1162 | City C,Fall,97,59.0 1163 | City A,Winter,97,11.0 1164 | City B,Winter,97,23.0 1165 | City C,Winter,97,64.0 1166 | City A,Spring,98,34.0 1167 | City B,Spring,98,37.0 1168 | City C,Spring,98,73.0 1169 | City A,Summer,98,83.0 1170 | City B,Summer,98,57.0 1171 | City C,Summer,98,89.0 1172 | City A,Fall,98,32.0 1173 | City B,Fall,98,43.0 1174 | City C,Fall,98,69.0 1175 | City A,Winter,98,7.0 1176 | City B,Winter,98,39.0 1177 | City C,Winter,98,49.0 1178 | City A,Spring,99,34.0 1179 | City B,Spring,99,60.0 1180 | City C,Spring,99,68.0 1181 | City A,Summer,99,85.0 1182 | City B,Summer,99,76.0 1183 | City C,Summer,99,100.0 1184 | City A,Fall,99,49.0 1185 | City B,Fall,99,51.0 1186 | City C,Fall,99,81.0 1187 | City A,Winter,99,0.0 1188 | City B,Winter,99,45.0 1189 | City C,Winter,99,50.0 1190 | City A,Spring,100,26.0 1191 | City B,Spring,100,42.0 1192 | City C,Spring,100,74.0 1193 | City A,Summer,100,89.0 1194 | City B,Summer,100,60.0 1195 | City C,Summer,100,87.0 1196 | City A,Fall,100,42.0 1197 | City B,Fall,100,55.0 1198 | City C,Fall,100,68.0 1199 | City A,Winter,100,23.0 1200 | City B,Winter,100,45.0 1201 | City C,Winter,100,52.0 1202 | -------------------------------------------------------------------------------- /Delhi House Price Prediction/Delhi House Price Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Delhi House Price Prediction/Delhi House Price Prediction.pdf -------------------------------------------------------------------------------- /Delhi House Price Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Delhi House Price Prediction 2 | ![](https://m.economictimes.com/thumb/height-450,width-600,imgsize-201544,msid-80555468/1.jpg) 3 | 4 | ## Project Overview 5 | The "Delhi House Price Prediction" project focuses on predicting the prices of houses in various localities of Delhi. The primary objective is to develop a predictive model that can accurately estimate the prices of houses based on several key features present in the dataset. The dataset, obtained from Kaggle, contains information on factors such as house area, number of bedrooms, locality, and more. By analyzing these features, the project aims to provide valuable insights for potential buyers and sellers in the real estate market. 6 | 7 | ## Data Dictionary 8 | The project utilizes a dataset with 1259 rows and 11 columns, each representing different attributes related to houses in Delhi. Here is a brief overview of the columns: 9 | 10 | | Column Name | Description | 11 | | --- | --- | 12 | | Area | Area of the house in square feet | 13 | | BHK | Number of bedrooms | 14 | | Bathroom | Number of bathrooms | 15 | | Furnishing | Furnishing status | 16 | | Locality | Locality of the house | 17 | | Parking | Number of parking spaces available | 18 | | Price | Price of the house in INR | 19 | | Status | Property's status (ready to move or under construction) | 20 | | Transaction | Whether it's a new property or a re-sale | 21 | | Type | Type of the property | 22 | | Per_Sqft | Price per square foot | 23 | 24 | ## Impact 25 | The project's impact is twofold. 
Firstly, it addresses the need for accurate house price prediction in the dynamic real estate market of Delhi. Potential buyers can utilize the model's predictions to make informed decisions when purchasing a house. Sellers, on the other hand, can gain insights into fair pricing strategies for their properties. 26 | 27 | Secondly, through exploratory data analysis (EDA), significant insights have been uncovered. The analysis revealed that house prices are influenced by factors such as the area, number of bedrooms, and locality. Localities like Punjabi Bagh, Lajpat Nagar, and Vasant Kunj stand out as high-end areas with elevated property prices. The preference for new builder floor properties indicates a demand for customization and independence among buyers. 28 | 29 | In terms of machine learning, the project employed regression models such as Decision Tree Regressor and Random Forest Regressor. The Random Forest Regressor outperformed the Decision Tree Regressor, achieving an impressive accuracy of 84.98%. 30 | 31 | In conclusion, the "Delhi House Price Prediction" project provides valuable insights into the dynamic real estate market of Delhi, offering both buyers and sellers a reliable tool for estimating house prices. The project's utilization of regression models underscores its commitment to accuracy and effectiveness in predicting house prices, contributing to informed decision-making in the real estate domain. 32 | -------------------------------------------------------------------------------- /Diamond Price Prediction/Diamond Price Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Diamond Price Prediction/Diamond Price Prediction.pdf -------------------------------------------------------------------------------- /Diamond Price Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Diamond Price Prediction 2 | ![](https://dataanalyticsedge.com/wp-content/uploads/2019/11/feature_en_diamond_certification-777x383.jpg) 3 | ## Objective 4 | The main objective of this project is to develop a predictive model that can accurately estimate the prices of diamonds based on their characteristics. By analyzing the dataset and identifying patterns and relationships, the model will be able to predict the prices of unseen diamonds as well. 5 | 6 | |Column Name|Description| 7 | |-----------|-----------| 8 | |carat|Weight of the diamond| 9 | |cut|Quality of the cut (Fair, Good, Very Good, Premium, Ideal)| 10 | |color|Diamond colour, from J (worst) to D (best)| 11 | |clarity|How clear the diamond is (I1 (worst), SI2, SI1, VS2, VS1, VVS2, VVS1, IF (best))| 12 | |x|Length in mm| 13 | |y|Width in mm| 14 | |z|Depth in mm| 15 | |depth|Total depth percentage = z / mean(x, y) = 2 * z / (x + y) (43--79)| 16 | |table|Width of top of diamond relative to widest point (43--95)| 17 | |price|Price in US dollars (326--18,823)| 18 | 19 | ## Impact 20 | Accurate price prediction for diamonds has significant implications for various stakeholders, including buyers, sellers, and jewelers. A reliable predictive model can assist buyers in making informed decisions when purchasing diamonds and help sellers set competitive prices. Additionally, jewelers can benefit from price estimation to assess the value of their inventory and determine appropriate pricing strategies. 
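As one illustration of the kind of model this objective calls for, the sketch below encodes the ordinal quality columns (`cut`, `color`, `clarity`) using the orderings from the data dictionary above and fits a regressor on `diamonds.csv`. The choice of a random forest here is only an example; the project's notebook may use a different model and preprocessing.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

df = pd.read_csv("diamonds.csv")

# Map the ordinal categories to integers using the orderings from the data dictionary above.
cut_order = ["Fair", "Good", "Very Good", "Premium", "Ideal"]
color_order = ["J", "I", "H", "G", "F", "E", "D"]  # J (worst) to D (best)
clarity_order = ["I1", "SI2", "SI1", "VS2", "VS1", "VVS2", "VVS1", "IF"]

df["cut"] = df["cut"].map({v: i for i, v in enumerate(cut_order)})
df["color"] = df["color"].map({v: i for i, v in enumerate(color_order)})
df["clarity"] = df["clarity"].map({v: i for i, v in enumerate(clarity_order)})

# Price in US dollars is the target; everything else is used as a feature.
X = df.drop(columns=["price"])
y = df["price"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("Test R^2:", r2_score(y_test, model.predict(X_test)))
```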
21 | 22 | This project aims to leverage machine learning techniques to extract valuable insights from the dataset and build a robust model for diamond price prediction. By accurately estimating diamond prices, it contributes to the efficiency and transparency of the diamond market and enhances decision-making in the jewelry industry. 23 | -------------------------------------------------------------------------------- /E-Commerce Product Delivery Prediction/E-Commerce Product Delivery Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/E-Commerce Product Delivery Prediction/E-Commerce Product Delivery Prediction.pdf -------------------------------------------------------------------------------- /E-Commerce Product Delivery Prediction/description.md: -------------------------------------------------------------------------------- 1 | # E-Commerce Product Delivery Prediction 2 | ![](https://globalskylogistics.com/wp-content/uploads/2018/04/THE-CHANGING-NATURE-OF-E-COMMERCE-DELIVERY.jpg) 3 | ## Project Overview: 4 | The aim of this project is to predict whether products from an international e-commerce company will reach customers on time or not. Additionally, the project analyzes various factors influencing product delivery and studies customer behavior. The company primarily sells electronic products. 5 | 6 | ## Data Dictionary: 7 | The dataset used for model building contains 10,999 observations of 12 variables, including: 8 | 9 | | Variable | Description | 10 | |----------------------|---------------------------------------------------------| 11 | | ID | ID Number of Customers | 12 | | Warehouse_block | The Company's Warehouse block (A, B, C, D, E) | 13 | | Mode_of_Shipment | The mode of shipment (Ship, Flight, Road) | 14 | | Customer_care_calls | Number of calls made for shipment inquiries | 15 | | Customer_rating | Customer rating (1 - Lowest, 5 - Highest) | 16 | | Cost_of_the_Product | Cost of the product in US Dollars | 17 | | Prior_purchases | Number of prior purchases | 18 | | Product_importance | Product importance categorization (low, medium, high) | 19 | | Gender | Gender of customers (Male, Female) | 20 | | Discount_offered | Discount offered on specific products | 21 | | Weight_in_gms | Weight of the product in grams | 22 | | Reached.on.Time_Y.N | Target variable (1 - Product did not reach on time, 0 - Product reached on time) | 23 | 24 | ## Conclusion: 25 | The project aimed to predict product delivery timeliness while examining factors affecting delivery and customer behavior. Exploratory data analysis revealed that product weight and cost significantly impact delivery. Products weighing between 2500 - 3500 grams and costing less than $250 had a higher likelihood of timely delivery. Most products were shipped from Warehouse F via ship, suggesting its proximity to a seaport. 26 | 27 | Customer behavior also plays a crucial role in predicting timely delivery. Higher customer call volumes correlated with delayed delivery. Interestingly, customers with more prior purchases had a higher rate of on-time deliveries, indicating their loyalty to the company. Products with a discount of 0-10% were more likely to be delivered late, whereas those with discounts exceeding 10% had a higher on-time delivery rate. 
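The discount observation above can be reproduced in a few lines of pandas using the column names from the data dictionary. This is an illustrative sketch rather than the project notebook; the CSV path and the exact binning are assumptions.

```python
import pandas as pd

# Illustrative check of the discount vs. on-time-delivery observation
# (assumed CSV path; column names follow the data dictionary above).
df = pd.read_csv("E_Commerce.csv")

# Reached.on.Time_Y.N is 1 when the product did NOT reach on time,
# so the on-time rate is 1 minus its mean within each discount band.
bands = pd.cut(df["Discount_offered"],
               bins=[0, 10, df["Discount_offered"].max()],
               labels=["0-10%", ">10%"], include_lowest=True)
on_time_rate = 1 - df.groupby(bands)["Reached.on.Time_Y.N"].mean()
print(on_time_rate)
```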
28 | 29 | In terms of machine learning models, the decision tree classifier achieved the highest accuracy at 69%, outperforming other models. The random forest classifier and logistic regression achieved accuracies of 68% and 67%, respectively, while the K Nearest Neighbors model had the lowest accuracy at 65%. 30 | 31 | -------------------------------------------------------------------------------- /Heart Stroke Prediction/Stroke Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Heart Stroke Prediction/Stroke Prediction.pdf -------------------------------------------------------------------------------- /Heart Stroke Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Stroke Prediction 2 | ![](https://dezyre.gumlet.io/images/blog/heart-disease-prediction-using-machine-learning-project/Heart_Disease_Prediction_using_Machine_Learning.png?w=330&dpr=2.6) 3 | 4 | This data science project aims to predict the likelihood of a patient experiencing a stroke based on various input parameters such as gender, age, presence of diseases, and smoking status. The dataset provides relevant information about each patient, enabling the development of a predictive model. 5 | ## Dataset Information 6 | The dataset used in this project contains information necessary to predict the occurrence of a stroke. Each row in the dataset represents a patient, and the dataset includes the following attributes: 7 | 1. id: Unique identifier 8 | 2. gender: "Male", "Female", or "Other" 9 | 3. age: Age of the patient 10 | 4. hypertension: 0 if the patient doesn't have hypertension, 1 if the patient has hypertension 11 | 5. heart_disease: 0 if the patient doesn't have any heart diseases, 1 if the patient has a heart disease 12 | 6. ever_married: "No" or "Yes" 13 | 7. work_type: "Children", "Govt_job", "Never_worked", "Private", or "Self-employed" 14 | 8. Residence_type: "Rural" or "Urban" 15 | 9. avg_glucose_level: Average glucose level in the blood 16 | 10. bmi: Body mass index 17 | 11. smoking_status: "Formerly smoked", "Never smoked", "Smokes", or "Unknown" 18 | 12. stroke: 1 if the patient had a stroke, 0 if not 19 | ## Context 20 | According to the World Health Organization (WHO), stroke is the second leading cause of death worldwide, responsible for approximately 11% of total deaths. This project aims to leverage machine learning techniques to build a predictive model that can identify individuals at risk of stroke based on their demographic and health-related features. By detecting high-risk individuals early, appropriate preventive measures can be taken to reduce the incidence and impact of stroke. 21 | 22 | To enhance the accuracy of the stroke prediction model, the dataset will be analyzed and processed using various data science methodologies and algorithms. 
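As an illustration of what such processing might look like, the sketch below one-hot encodes the categorical attributes listed above, imputes missing BMI values, and fits a logistic-regression baseline. The file path, preprocessing choices, and model choice are assumptions for illustration, not the project's actual notebook code.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Assumed local path to the stroke CSV; attribute names follow the dataset description above.
df = pd.read_csv("healthcare-dataset-stroke-data.csv")
X, y = df.drop(columns=["id", "stroke"]), df["stroke"]

categorical = ["gender", "ever_married", "work_type", "Residence_type", "smoking_status"]
numeric = ["age", "hypertension", "heart_disease", "avg_glucose_level", "bmi"]

preprocess = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
    ("num", make_pipeline(SimpleImputer(strategy="median"), StandardScaler()), numeric),
])
model = make_pipeline(preprocess, LogisticRegression(max_iter=1000))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
model.fit(X_train, y_train)
print(f"Hold-out accuracy: {model.score(X_test, y_test):.3f}")
```

Because strokes are rare in the data, plain accuracy is an optimistic yardstick; recall or ROC-AUC on the positive class usually gives a more honest picture.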
23 | -------------------------------------------------------------------------------- /Hotel Reservations Cancellation Prediction/Hotel Reservations Cancelation Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Hotel Reservations Cancellation Prediction/Hotel Reservations Cancelation Prediction.pdf -------------------------------------------------------------------------------- /Hotel Reservations Cancellation Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Hotel Reservations Cancellation Prediction 2 | ![](https://static.vecteezy.com/system/resources/previews/000/474/062/original/hotel-reservation-conceptual-illustration-design-vector.jpg) 3 | ## Project Overview: 4 | 5 | The goal of the Hotel Reservations Cancellation Prediction project is to anticipate potential reservation cancellations by analyzing various features and variables associated with hotel bookings. In the context of the project, online hotel reservation channels have revolutionized booking methods and customer behavior. However, a significant number of reservations are canceled, leading to revenue loss for hotels. Cancellations occur for reasons such as changes in plans or scheduling conflicts, often facilitated by the option of free or low-cost cancellations. This project seeks to predict such cancellations based on available data. 6 | 7 | ## Data Dictionary: 8 | 9 | Here is a data dictionary summarizing the key columns in the dataset: 10 | 11 | | Column Name | Description | 12 | |---------------------------------|-------------------------------------------------------| 13 | | Booking_ID | Unique identifier of each booking | 14 | | no_of_adults | Number of adults | 15 | | no_of_children | Number of children | 16 | | no_of_weekend_nights | Number of weekend nights (Saturday or Sunday) | 17 | | no_of_week_nights | Number of weeknights (Monday to Friday) | 18 | | meal_type | Meal type booked by the customer | 19 | | required_car_parking_spaces | Does the customer require a car parking space? (0 - No, 1 - Yes) | 20 | | lead_time | Number of days between the booking date and arrival date | 21 | | arrival_year | Year of arrival | 22 | | arrival_month | Month of arrival | 23 | | arrival_date | Date of arrival | 24 | | market_segment | Market segment designation | 25 | | repeated_guest | Is the customer a repeated guest? (0 - No, 1 - Yes) | 26 | | no_previous_cancellations | Number of previous bookings canceled by the customer prior to the current booking | 27 | | previous_bookings_not_canceled | Number of previous bookings not canceled by the customer prior to the current booking | 28 | | avg_price_per_room | Average price per day of the reservation (in euros) | 29 | | no_of_special_requests | Total number of special requests made by the customer (e.g., high floor, view from the room, etc) | 30 | | booking_status | Flag indicating if the booking was canceled or not | 31 | 32 | ## Conclusion: 33 | 34 | The exploratory data analysis of the Hotel Reservations Cancellation Prediction project revealed several key insights: 35 | 36 | 1. **Guest Composition**: Reservations made for 2 adults with no children, possibly representing couples, had the highest cancellation count. Reservations with children involved had lower cancellation rates. 37 | 38 | 2. 
**Booking Timing**: Most reservations were made for weeknights, and these had significantly higher cancellation counts than reservations made for weekend nights. 39 | 40 | 3. **Year and Month**: The year 2018 had a higher cancellation rate compared to 2017, with the highest cancellations occurring in July and October. 41 | 42 | 4. **Services Impact**: Services opted for during reservation did not significantly impact reservation cancellations. 43 | 44 | 5. **Lead Time**: Lead time had a significant impact on reservation cancellations. Guests with shorter lead times were less likely to cancel, while longer lead times increased the likelihood of cancellations. This suggests that the hotel should consider accepting reservations with shorter lead times. 45 | 46 | 6. **Market Segment**: Reservations made through online platforms had the highest number of cancellations, emphasizing the importance of the hotel's online reputation. 47 | 48 | In terms of predictive modeling, the Decision Tree Classifier demonstrated the highest accuracy (85%) among the models employed, making it a promising choice for predicting reservation cancellations. 49 | 50 | This project provides valuable insights for hotel management to optimize their reservation policies and reduce cancellations, ultimately improving revenue and customer satisfaction. 51 | -------------------------------------------------------------------------------- /House Price Prediction/House_Price_Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/House Price Prediction/House_Price_Prediction.pdf -------------------------------------------------------------------------------- /House Price Prediction/description.md: -------------------------------------------------------------------------------- 1 | # House Price Prediction 2 | ![](https://nycdsa-blog-files.s3.us-east-2.amazonaws.com/2021/03/chaitali-majumder/house-price-497112-KhCJQICS.jpg) 3 | This data science project focuses on predicting house prices using a dataset containing various features and attributes related to residential properties. By analyzing and modeling the data, the project aims to develop a predictive model that can estimate the sale prices of houses accurately. 4 | ## Dataset Information 5 | The dataset used in this project consists of information about different residential properties. It includes a wide range of features that can potentially influence the price of a house, such as the number of bedrooms, bathrooms, square footage, location, neighborhood characteristics, and other relevant factors. 6 | ## Objective 7 | The main objective of this project is to leverage machine learning techniques to build a robust predictive model for house price estimation. By training the model on historical data and leveraging its learned patterns and relationships, it will be able to predict the prices of new or unseen houses accurately. 8 | ## Approach 9 | The project will involve several steps, including data preprocessing, exploratory data analysis, feature engineering, model selection, and evaluation. Techniques such as data cleaning, handling missing values, feature scaling, and encoding categorical variables will be employed to prepare the dataset for model training. Various regression algorithms, such as linear regression and random forests, will be explored and evaluated to determine the most suitable model for accurate price prediction.
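To make the model-selection step concrete, a comparison of the two model families mentioned above could look like the sketch below. The CSV path, the target column name, and the numeric-only feature selection are assumptions for illustration, not the project's actual code.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Assumed path and target column; numeric features only, to keep the sketch short.
df = pd.read_csv("home_data.csv").dropna()
X = df.select_dtypes("number").drop(columns=["price"])
y = df["price"]

for model in (LinearRegression(), RandomForestRegressor(n_estimators=200, random_state=0)):
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{type(model).__name__}: mean CV R^2 = {r2:.3f}")
```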
10 | ## Impact 11 | Accurate house price prediction can have significant implications for various stakeholders, including homebuyers, sellers, real estate agents, and investors. With an effective predictive model, prospective buyers can make informed decisions about property investments, sellers can set competitive prices, and agents can provide better guidance to their clients. Additionally, investors can use the predicted prices to identify profitable opportunities in the real estate market. 12 | 13 | Through this project, insights and patterns in the housing market can be uncovered, allowing for a better understanding of the factors influencing house prices and facilitating more informed decision-making in the real estate industry. 14 | -------------------------------------------------------------------------------- /Indian Used Car Price Prediction/Indian Used Car Price Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Indian Used Car Price Prediction/Indian Used Car Price Prediction.pdf -------------------------------------------------------------------------------- /Indian Used Car Price Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Indian Used Car Price Prediction 2 | ![](http://cdn.dribbble.com/users/1239720/screenshots/3506944/car_mg.gif) 3 | ## Project Overview 4 | The aim of this data science project is to predict the price of used cars in major Indian metro cities. The prediction is based on various car features, including the car's manufacturer, model, variant, fuel type, color, kilometers driven, body style, transmission type, manufacture date, model year, CNG kit availability, ownership history, dealer details, and a quality score. The project utilizes data analysis and machine learning techniques to provide insights and price predictions, ultimately helping both buyers and sellers in the used car market. 5 | 6 | ## Data Dictionary 7 | Here is a detailed data dictionary for the columns in the "Indian IT Cities Used Car Dataset 2023": 8 | 9 | | Column Name | Description | 10 | |-------------------|------------------------------------------------| 11 | | ID | Unique ID for each listing | 12 | | Company | Name of the car manufacturer | 13 | | Model | Name of the car model | 14 | | Variant | Name of the car variant | 15 | | Fuel Type | Fuel type of the car | 16 | | Color | Color of the car | 17 | | Kilometer | Number of kilometers driven by the car | 18 | | Body Style | Body style of the car | 19 | | Transmission Type | Transmission type of the car | 20 | | Manufacture Date | Manufacture date of the car | 21 | | Model Year | Model year of the car | 22 | | CngKit | Whether the car has a CNG kit or not | 23 | | Price | Price of the car | 24 | | Owner Type | Number of previous owners of the car | 25 | | Dealer State | State in which the car is being sold | 26 | | Dealer Name | Name of the dealer selling the car | 27 | | City | City in which the car is being sold | 28 | | Warranty | Warranty offered by the dealer | 29 | | Quality Score | Quality score of the car | 30 | 31 | ## Conclusion 32 | In the course of this data science project aimed at predicting used car prices in major Indian metro cities, several significant insights have emerged through exploratory data analysis and machine learning techniques. 
These insights shed light on both the demand and pricing dynamics within the Indian used car market, offering valuable information to both prospective buyers and sellers. 33 | 34 | **Demand and Price Trends:** 35 | One of the most notable observations is the strong correlation between demand and price within the used car market. Lower-priced used cars enjoy a significantly higher demand, indicating a preference among consumers for budget-friendly options. In contrast, luxury car manufacturers such as MG, Mercedes Benz, BMW, Volvo, and KIA command the highest prices. This highlights a trend where consumers are more inclined to opt for new luxury cars instead of pre-owned ones. 36 | 37 | **Fuel Type Influence:** 38 | Fuel type plays a pivotal role in pricing, with the market predominantly comprising petrol and diesel cars. Diesel cars, while not overwhelmingly prevalent, do exhibit slightly higher pricing compared to their petrol counterparts. 39 | 40 | **Color Preferences:** 41 | Car color also contributes to the pricing dynamics, as common colors like white, grey, silver, and black are in high demand, while more exotic colors such as burgundy, riviera red, dark blue, and black magic tend to fetch higher prices in the used car market. 42 | 43 | **Kilometer Reading Impact:** 44 | An analysis of the odometer readings reveals that most cars listed for sale have covered fewer than 10,000 kilometers. Unsurprisingly, cars with lower mileage tend to command higher prices, underlining the importance of mileage in pricing determination. 45 | 46 | **Body Style Preferences:** 47 | Body style preferences vary among consumers, with hatchbacks, SUVs, and sedans being the top choices. Conversely, MPVs and luxury SUVs are typically more expensive. 48 | 49 | **Age and Resale Value:** 50 | The age of a car is a pivotal factor influencing its resale value. Newer cars, typically those under 5 years old, tend to have higher prices, reflecting a demand for relatively younger used vehicles. 51 | 52 | **Geographic Price Variation:** 53 | Significant variation in car prices across different regions has been observed. Delhi, Maharashtra, and Rajasthan stand out as the top states with the highest car prices. Additionally, specific dealers like Car Estate, Star Auto India, and Car Choice consistently list cars at higher prices. 54 | 55 | **Owner Type and Warranty Impact:** 56 | Buyers often prefer cars with a 1st owner type, leading to increased demand and higher prices for such vehicles. Additionally, cars offering warranties tend to command slightly higher prices, as warranties provide assurance to potential buyers. 57 | 58 | **Quality Score and Pricing:** 59 | The quality score of a car has a direct bearing on its price. Cars with higher quality scores generally command higher prices in the market. 60 | 61 | In terms of machine learning models, decision tree regressor and random forest regressor models were employed to predict car prices. Among these, the random forest regressor model demonstrated superior performance. An examination of feature importance revealed that car age, body style, and car manufacturer are the key drivers affecting car prices. 62 | 63 | This data-driven project equips both buyers and sellers in the Indian used car market with valuable insights, empowering them to make informed decisions and facilitating more accurate price predictions. 
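The feature-importance reading mentioned above typically comes straight from the fitted random forest. The sketch below shows the pattern on a small subset of numeric columns from the data dictionary; the CSV path, the column subset, and the crude numeric coercion are assumptions made purely for illustration, and the project notebook works with a fuller, properly encoded feature set.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Assumed CSV path and a small numeric column subset from the data dictionary above.
df = pd.read_csv("usedCars.csv")
features = ["Kilometer", "Model Year", "Quality Score"]
X = df[features].apply(pd.to_numeric, errors="coerce").fillna(0)
y = pd.to_numeric(df["Price"], errors="coerce").fillna(0)

rf = RandomForestRegressor(n_estimators=300, random_state=42).fit(X, y)
importances = pd.Series(rf.feature_importances_, index=features).sort_values(ascending=False)
print(importances)  # higher values = stronger influence on the predicted price
```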
64 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 Sukhman Singh 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /Loan Approval Prediction/Loan Approval Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Loan Approval Prediction/Loan Approval Prediction.pdf -------------------------------------------------------------------------------- /Loan Approval Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Loan Approval Prediction Project 2 | 3 | ![](https://miro.medium.com/v2/resize:fit:640/1*UC0sy0bENl-DLPy3jmXNag.jpeg) 4 | 5 | ## Project Overview 6 | 7 | The **Loan Approval Prediction** project aims to predict whether a loan application will be approved by a bank. This prediction is made by analyzing various factors and information provided by the applicant. The project involves assessing variables such as loan amount, tenure, CIBIL score, education, assets, and other relevant features. The primary objective is to understand the factors that influence loan approval and develop a predictive model to determine the likelihood of loan approval for new applicants. Additionally, the project seeks to enhance customer service by prioritizing applicants who are more likely to have their loans approved. 
8 | 9 | #### Data Dictionary 10 | 11 | | Variable | Description | 12 | |--------------------------|-----------------------------------------------------| 13 | | loan_id | Unique identifier for each loan application | 14 | | no_of_dependents | Number of dependents of the applicant | 15 | | education | Education level of the applicant | 16 | | self_employed | Indicates whether the applicant is self-employed | 17 | | income_annum | Annual income of the applicant | 18 | | loan_amount | Loan amount requested by the applicant | 19 | | loan_tenure | Tenure of the loan requested by the applicant (in years) | 20 | | cibil_score | CIBIL score of the applicant | 21 | | residential_asset_value | Value of the applicant's residential asset | 22 | | commercial_asset_value | Value of the applicant's commercial asset | 23 | | luxury_asset_value | Value of the applicant's luxury asset | 24 | | bank_assets_value | Value of the applicant's bank asset | 25 | | loan_status | Status of the loan (Approved/Rejected) | 26 | 27 | ## Conclusion 28 | 29 | The **Loan Approval Prediction** project provides valuable insights into loan approval determinants and offers predictive models to aid banks in making informed lending decisions. The project's outcomes contribute to a more efficient and customer-focused loan approval process. The project's conclusion emphasizes the following key points: 30 | 31 | 1. **Efficient Loan Approval Process:** The project's predictive models enable banks to streamline the loan approval process by prioritizing applications with higher chances of approval. This reduces processing time and improves customer satisfaction. 32 | 33 | 2. **Informed Decision-Making:** By identifying significant factors, the project empowers decision-makers to make data-driven lending decisions, enhancing the accuracy and reliability of loan approvals. 34 | 35 | 3. **Customer-Centric Approach:** The ability to provide priority services to applicants with higher chances of loan approval strengthens the bank's customer-centric approach, leading to better customer experiences. 36 | 37 | 4. **Risk Mitigation:** Understanding the factors contributing to loan approval allows banks to assess and manage risk more effectively, resulting in a reduced risk of defaults and non-repayment. 38 | 39 | In conclusion, the **Loan Approval Prediction** project contributes to a more effective loan approval process, better risk management, and improved customer service within the banking sector. 40 | -------------------------------------------------------------------------------- /Medical Cost Prediction/Medical Cost Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Medical Cost Prediction/Medical Cost Prediction.pdf -------------------------------------------------------------------------------- /Medical Cost Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Medical Cost Prediction 2 | ![](https://miro.medium.com/v2/resize:fit:1400/0*ssbGU5VIxtVB6NrF) 3 | This data science project aims to predict individual medical costs using a dataset containing various attributes related to health insurance. The project focuses on analyzing features such as age, gender, BMI, number of children, smoking status, region, and predicting the corresponding medical costs. 
4 | ## Dataset Information 5 | The dataset used in this project provides information about health insurance beneficiaries and their medical costs. It includes the following columns: 6 | 7 | | Variable | Description | 8 | | --- | --- | 9 | | age | age of primary beneficiary | 10 | | bmi | body mass index | 11 | | children | number of children covered by health insurance | 12 | | smoker | whether the beneficiary smokes | 13 | | region | the beneficiary's residential area in the US | 14 | | charges | individual medical costs billed by health insurance | 15 | 16 | ## Objective 17 | The main objective of this project is to develop a predictive model that can accurately estimate the medical costs for individuals based on their attributes. By analyzing the dataset and identifying patterns and relationships, the model will provide insights into the factors influencing medical expenses. 18 | 19 | ## Approach 20 | The project will involve several steps, including data preprocessing, exploratory data analysis, feature engineering, model selection, and evaluation. The dataset will be prepared by handling missing values, encoding categorical variables, and scaling numerical features. Various regression algorithms, such as linear regression, decision trees, random forests, or gradient boosting, will be explored and evaluated to determine the most effective model for cost prediction. 21 | 22 | ## Impact 23 | Accurate medical cost prediction has significant implications for various stakeholders, including insurance companies, healthcare providers, and individuals. A reliable predictive model can assist insurance companies in assessing risks, determining appropriate premium rates, and managing resources efficiently. Healthcare providers can benefit from cost estimation to optimize resource allocation and budget planning. Additionally, individuals can gain insights into their potential medical expenses and make informed decisions regarding health insurance coverage. 24 | 25 | By leveraging machine learning techniques, this project aims to provide valuable insights into medical cost prediction and contribute to more accurate financial planning in the healthcare industry. 26 | -------------------------------------------------------------------------------- /Pima Indians Diabetes Prediction/Diabetes Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Pima Indians Diabetes Prediction/Diabetes Prediction.pdf -------------------------------------------------------------------------------- /Pima Indians Diabetes Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Pima Indian Diabetes Prediction 2 | ![](https://www.cdc.gov/diabetes/images/library/spotlights/diabetes-stats-report-724px.png?_=42420) 3 | ## Project Overview: 4 | The primary objective of the Pima Indian Diabetes Prediction project is to analyze various medical factors of female patients, particularly those of Pima Indian heritage and at least 21 years old, to predict whether they have diabetes or not. The dataset used in this project is originally from the National Institute of Diabetes and Digestive and Kidney Diseases, and it includes diagnostic measurements and medical predictor variables. 5 | ## About the Dataset: 6 | The dataset contains medical information from female patients, specifically of Pima Indian heritage, with a focus on diagnosing diabetes.
The data includes several medical predictor variables such as Glucose Level, Blood Pressure, Skin Thickness, Insulin Level, number of pregnancies, BMI, age, and more. The target variable, "Outcome," indicates whether the patient has diabetes (1) or does not have diabetes (0). The dataset was carefully selected from a larger database, adhering to certain constraints to maintain accuracy and relevance. 7 | ### Data Dictionary 8 | | Feature | Description | 9 | |---------|------------| 10 | | Pregnancies | Number of times pregnant | 11 | | Glucose | Plasma glucose concentration at 2 hours in an oral glucose tolerance test | 12 | | BloodPressure | Diastolic blood pressure (mm Hg) | 13 | | SkinThickness | Triceps skin fold thickness (mm) | 14 | | Insulin | 2-Hour serum insulin (mu U/ml) | 15 | | BMI | Body mass index (weight in kg/(height in m)^2) | 16 | | DiabetesPedigreeFunction | Diabetes pedigree function | 17 | | Age | Age (years) | 18 | | Outcome | Class variable (0 or 1) | 19 | ## Impact: 20 | The Pima Indian Diabetes Prediction project holds significant potential for impacting healthcare outcomes for Pima Indian females. Early detection and diagnosis of diabetes can play a crucial role in managing the condition effectively and preventing complications. By developing an accurate predictive model, healthcare providers can identify individuals at higher risk of diabetes and offer timely interventions and personalized treatment plans. This project's successful implementation may lead to improved health management strategies, reduced healthcare costs, and enhanced overall well-being for the Pima Indian female community. Additionally, the insights gained from this study may contribute to broader research on diabetes risk factors and aid in formulating targeted public health initiatives for diabetes prevention and awareness within the Pima Indian population. The ethical and responsible use of data in this project will be ensured to protect patient privacy and promote transparency in the application of predictive modeling in healthcare settings.
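One practical wrinkle in the raw data is that several physiological columns use 0 as a placeholder where a measurement is missing (a blood pressure or BMI of exactly zero is not plausible). A common preprocessing step, shown below as an illustrative sketch with an assumed file path rather than the project's notebook code, is to recode those zeros as missing values before modeling.

```python
import numpy as np
import pandas as pd

# Assumed local path to the Pima CSV; column names follow the data dictionary above.
df = pd.read_csv("diabetes.csv")

zero_as_missing = ["Glucose", "BloodPressure", "SkinThickness", "Insulin", "BMI"]
df[zero_as_missing] = df[zero_as_missing].replace(0, np.nan)
print(df[zero_as_missing].isna().sum())  # count of implausible zeros per column
```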
21 | -------------------------------------------------------------------------------- /Pima Indians Diabetes Prediction/diabetes.csv: -------------------------------------------------------------------------------- 1 | Pregnancies,Glucose,BloodPressure,SkinThickness,Insulin,BMI,DiabetesPedigreeFunction,Age,Outcome 2 | 6,148,72,35,0,33.6,0.627,50,1 3 | 1,85,66,29,0,26.6,0.351,31,0 4 | 8,183,64,0,0,23.3,0.672,32,1 5 | 1,89,66,23,94,28.1,0.167,21,0 6 | 0,137,40,35,168,43.1,2.288,33,1 7 | 5,116,74,0,0,25.6,0.201,30,0 8 | 3,78,50,32,88,31,0.248,26,1 9 | 10,115,0,0,0,35.3,0.134,29,0 10 | 2,197,70,45,543,30.5,0.158,53,1 11 | 8,125,96,0,0,0,0.232,54,1 12 | 4,110,92,0,0,37.6,0.191,30,0 13 | 10,168,74,0,0,38,0.537,34,1 14 | 10,139,80,0,0,27.1,1.441,57,0 15 | 1,189,60,23,846,30.1,0.398,59,1 16 | 5,166,72,19,175,25.8,0.587,51,1 17 | 7,100,0,0,0,30,0.484,32,1 18 | 0,118,84,47,230,45.8,0.551,31,1 19 | 7,107,74,0,0,29.6,0.254,31,1 20 | 1,103,30,38,83,43.3,0.183,33,0 21 | 1,115,70,30,96,34.6,0.529,32,1 22 | 3,126,88,41,235,39.3,0.704,27,0 23 | 8,99,84,0,0,35.4,0.388,50,0 24 | 7,196,90,0,0,39.8,0.451,41,1 25 | 9,119,80,35,0,29,0.263,29,1 26 | 11,143,94,33,146,36.6,0.254,51,1 27 | 10,125,70,26,115,31.1,0.205,41,1 28 | 7,147,76,0,0,39.4,0.257,43,1 29 | 1,97,66,15,140,23.2,0.487,22,0 30 | 13,145,82,19,110,22.2,0.245,57,0 31 | 5,117,92,0,0,34.1,0.337,38,0 32 | 5,109,75,26,0,36,0.546,60,0 33 | 3,158,76,36,245,31.6,0.851,28,1 34 | 3,88,58,11,54,24.8,0.267,22,0 35 | 6,92,92,0,0,19.9,0.188,28,0 36 | 10,122,78,31,0,27.6,0.512,45,0 37 | 4,103,60,33,192,24,0.966,33,0 38 | 11,138,76,0,0,33.2,0.42,35,0 39 | 9,102,76,37,0,32.9,0.665,46,1 40 | 2,90,68,42,0,38.2,0.503,27,1 41 | 4,111,72,47,207,37.1,1.39,56,1 42 | 3,180,64,25,70,34,0.271,26,0 43 | 7,133,84,0,0,40.2,0.696,37,0 44 | 7,106,92,18,0,22.7,0.235,48,0 45 | 9,171,110,24,240,45.4,0.721,54,1 46 | 7,159,64,0,0,27.4,0.294,40,0 47 | 0,180,66,39,0,42,1.893,25,1 48 | 1,146,56,0,0,29.7,0.564,29,0 49 | 2,71,70,27,0,28,0.586,22,0 50 | 7,103,66,32,0,39.1,0.344,31,1 51 | 7,105,0,0,0,0,0.305,24,0 52 | 1,103,80,11,82,19.4,0.491,22,0 53 | 1,101,50,15,36,24.2,0.526,26,0 54 | 5,88,66,21,23,24.4,0.342,30,0 55 | 8,176,90,34,300,33.7,0.467,58,1 56 | 7,150,66,42,342,34.7,0.718,42,0 57 | 1,73,50,10,0,23,0.248,21,0 58 | 7,187,68,39,304,37.7,0.254,41,1 59 | 0,100,88,60,110,46.8,0.962,31,0 60 | 0,146,82,0,0,40.5,1.781,44,0 61 | 0,105,64,41,142,41.5,0.173,22,0 62 | 2,84,0,0,0,0,0.304,21,0 63 | 8,133,72,0,0,32.9,0.27,39,1 64 | 5,44,62,0,0,25,0.587,36,0 65 | 2,141,58,34,128,25.4,0.699,24,0 66 | 7,114,66,0,0,32.8,0.258,42,1 67 | 5,99,74,27,0,29,0.203,32,0 68 | 0,109,88,30,0,32.5,0.855,38,1 69 | 2,109,92,0,0,42.7,0.845,54,0 70 | 1,95,66,13,38,19.6,0.334,25,0 71 | 4,146,85,27,100,28.9,0.189,27,0 72 | 2,100,66,20,90,32.9,0.867,28,1 73 | 5,139,64,35,140,28.6,0.411,26,0 74 | 13,126,90,0,0,43.4,0.583,42,1 75 | 4,129,86,20,270,35.1,0.231,23,0 76 | 1,79,75,30,0,32,0.396,22,0 77 | 1,0,48,20,0,24.7,0.14,22,0 78 | 7,62,78,0,0,32.6,0.391,41,0 79 | 5,95,72,33,0,37.7,0.37,27,0 80 | 0,131,0,0,0,43.2,0.27,26,1 81 | 2,112,66,22,0,25,0.307,24,0 82 | 3,113,44,13,0,22.4,0.14,22,0 83 | 2,74,0,0,0,0,0.102,22,0 84 | 7,83,78,26,71,29.3,0.767,36,0 85 | 0,101,65,28,0,24.6,0.237,22,0 86 | 5,137,108,0,0,48.8,0.227,37,1 87 | 2,110,74,29,125,32.4,0.698,27,0 88 | 13,106,72,54,0,36.6,0.178,45,0 89 | 2,100,68,25,71,38.5,0.324,26,0 90 | 15,136,70,32,110,37.1,0.153,43,1 91 | 1,107,68,19,0,26.5,0.165,24,0 92 | 1,80,55,0,0,19.1,0.258,21,0 93 | 4,123,80,15,176,32,0.443,34,0 94 | 7,81,78,40,48,46.7,0.261,42,0 95 | 
4,134,72,0,0,23.8,0.277,60,1 96 | 2,142,82,18,64,24.7,0.761,21,0 97 | 6,144,72,27,228,33.9,0.255,40,0 98 | 2,92,62,28,0,31.6,0.13,24,0 99 | 1,71,48,18,76,20.4,0.323,22,0 100 | 6,93,50,30,64,28.7,0.356,23,0 101 | 1,122,90,51,220,49.7,0.325,31,1 102 | 1,163,72,0,0,39,1.222,33,1 103 | 1,151,60,0,0,26.1,0.179,22,0 104 | 0,125,96,0,0,22.5,0.262,21,0 105 | 1,81,72,18,40,26.6,0.283,24,0 106 | 2,85,65,0,0,39.6,0.93,27,0 107 | 1,126,56,29,152,28.7,0.801,21,0 108 | 1,96,122,0,0,22.4,0.207,27,0 109 | 4,144,58,28,140,29.5,0.287,37,0 110 | 3,83,58,31,18,34.3,0.336,25,0 111 | 0,95,85,25,36,37.4,0.247,24,1 112 | 3,171,72,33,135,33.3,0.199,24,1 113 | 8,155,62,26,495,34,0.543,46,1 114 | 1,89,76,34,37,31.2,0.192,23,0 115 | 4,76,62,0,0,34,0.391,25,0 116 | 7,160,54,32,175,30.5,0.588,39,1 117 | 4,146,92,0,0,31.2,0.539,61,1 118 | 5,124,74,0,0,34,0.22,38,1 119 | 5,78,48,0,0,33.7,0.654,25,0 120 | 4,97,60,23,0,28.2,0.443,22,0 121 | 4,99,76,15,51,23.2,0.223,21,0 122 | 0,162,76,56,100,53.2,0.759,25,1 123 | 6,111,64,39,0,34.2,0.26,24,0 124 | 2,107,74,30,100,33.6,0.404,23,0 125 | 5,132,80,0,0,26.8,0.186,69,0 126 | 0,113,76,0,0,33.3,0.278,23,1 127 | 1,88,30,42,99,55,0.496,26,1 128 | 3,120,70,30,135,42.9,0.452,30,0 129 | 1,118,58,36,94,33.3,0.261,23,0 130 | 1,117,88,24,145,34.5,0.403,40,1 131 | 0,105,84,0,0,27.9,0.741,62,1 132 | 4,173,70,14,168,29.7,0.361,33,1 133 | 9,122,56,0,0,33.3,1.114,33,1 134 | 3,170,64,37,225,34.5,0.356,30,1 135 | 8,84,74,31,0,38.3,0.457,39,0 136 | 2,96,68,13,49,21.1,0.647,26,0 137 | 2,125,60,20,140,33.8,0.088,31,0 138 | 0,100,70,26,50,30.8,0.597,21,0 139 | 0,93,60,25,92,28.7,0.532,22,0 140 | 0,129,80,0,0,31.2,0.703,29,0 141 | 5,105,72,29,325,36.9,0.159,28,0 142 | 3,128,78,0,0,21.1,0.268,55,0 143 | 5,106,82,30,0,39.5,0.286,38,0 144 | 2,108,52,26,63,32.5,0.318,22,0 145 | 10,108,66,0,0,32.4,0.272,42,1 146 | 4,154,62,31,284,32.8,0.237,23,0 147 | 0,102,75,23,0,0,0.572,21,0 148 | 9,57,80,37,0,32.8,0.096,41,0 149 | 2,106,64,35,119,30.5,1.4,34,0 150 | 5,147,78,0,0,33.7,0.218,65,0 151 | 2,90,70,17,0,27.3,0.085,22,0 152 | 1,136,74,50,204,37.4,0.399,24,0 153 | 4,114,65,0,0,21.9,0.432,37,0 154 | 9,156,86,28,155,34.3,1.189,42,1 155 | 1,153,82,42,485,40.6,0.687,23,0 156 | 8,188,78,0,0,47.9,0.137,43,1 157 | 7,152,88,44,0,50,0.337,36,1 158 | 2,99,52,15,94,24.6,0.637,21,0 159 | 1,109,56,21,135,25.2,0.833,23,0 160 | 2,88,74,19,53,29,0.229,22,0 161 | 17,163,72,41,114,40.9,0.817,47,1 162 | 4,151,90,38,0,29.7,0.294,36,0 163 | 7,102,74,40,105,37.2,0.204,45,0 164 | 0,114,80,34,285,44.2,0.167,27,0 165 | 2,100,64,23,0,29.7,0.368,21,0 166 | 0,131,88,0,0,31.6,0.743,32,1 167 | 6,104,74,18,156,29.9,0.722,41,1 168 | 3,148,66,25,0,32.5,0.256,22,0 169 | 4,120,68,0,0,29.6,0.709,34,0 170 | 4,110,66,0,0,31.9,0.471,29,0 171 | 3,111,90,12,78,28.4,0.495,29,0 172 | 6,102,82,0,0,30.8,0.18,36,1 173 | 6,134,70,23,130,35.4,0.542,29,1 174 | 2,87,0,23,0,28.9,0.773,25,0 175 | 1,79,60,42,48,43.5,0.678,23,0 176 | 2,75,64,24,55,29.7,0.37,33,0 177 | 8,179,72,42,130,32.7,0.719,36,1 178 | 6,85,78,0,0,31.2,0.382,42,0 179 | 0,129,110,46,130,67.1,0.319,26,1 180 | 5,143,78,0,0,45,0.19,47,0 181 | 5,130,82,0,0,39.1,0.956,37,1 182 | 6,87,80,0,0,23.2,0.084,32,0 183 | 0,119,64,18,92,34.9,0.725,23,0 184 | 1,0,74,20,23,27.7,0.299,21,0 185 | 5,73,60,0,0,26.8,0.268,27,0 186 | 4,141,74,0,0,27.6,0.244,40,0 187 | 7,194,68,28,0,35.9,0.745,41,1 188 | 8,181,68,36,495,30.1,0.615,60,1 189 | 1,128,98,41,58,32,1.321,33,1 190 | 8,109,76,39,114,27.9,0.64,31,1 191 | 5,139,80,35,160,31.6,0.361,25,1 192 | 3,111,62,0,0,22.6,0.142,21,0 193 | 9,123,70,44,94,33.1,0.374,40,0 194 
| 7,159,66,0,0,30.4,0.383,36,1 195 | 11,135,0,0,0,52.3,0.578,40,1 196 | 8,85,55,20,0,24.4,0.136,42,0 197 | 5,158,84,41,210,39.4,0.395,29,1 198 | 1,105,58,0,0,24.3,0.187,21,0 199 | 3,107,62,13,48,22.9,0.678,23,1 200 | 4,109,64,44,99,34.8,0.905,26,1 201 | 4,148,60,27,318,30.9,0.15,29,1 202 | 0,113,80,16,0,31,0.874,21,0 203 | 1,138,82,0,0,40.1,0.236,28,0 204 | 0,108,68,20,0,27.3,0.787,32,0 205 | 2,99,70,16,44,20.4,0.235,27,0 206 | 6,103,72,32,190,37.7,0.324,55,0 207 | 5,111,72,28,0,23.9,0.407,27,0 208 | 8,196,76,29,280,37.5,0.605,57,1 209 | 5,162,104,0,0,37.7,0.151,52,1 210 | 1,96,64,27,87,33.2,0.289,21,0 211 | 7,184,84,33,0,35.5,0.355,41,1 212 | 2,81,60,22,0,27.7,0.29,25,0 213 | 0,147,85,54,0,42.8,0.375,24,0 214 | 7,179,95,31,0,34.2,0.164,60,0 215 | 0,140,65,26,130,42.6,0.431,24,1 216 | 9,112,82,32,175,34.2,0.26,36,1 217 | 12,151,70,40,271,41.8,0.742,38,1 218 | 5,109,62,41,129,35.8,0.514,25,1 219 | 6,125,68,30,120,30,0.464,32,0 220 | 5,85,74,22,0,29,1.224,32,1 221 | 5,112,66,0,0,37.8,0.261,41,1 222 | 0,177,60,29,478,34.6,1.072,21,1 223 | 2,158,90,0,0,31.6,0.805,66,1 224 | 7,119,0,0,0,25.2,0.209,37,0 225 | 7,142,60,33,190,28.8,0.687,61,0 226 | 1,100,66,15,56,23.6,0.666,26,0 227 | 1,87,78,27,32,34.6,0.101,22,0 228 | 0,101,76,0,0,35.7,0.198,26,0 229 | 3,162,52,38,0,37.2,0.652,24,1 230 | 4,197,70,39,744,36.7,2.329,31,0 231 | 0,117,80,31,53,45.2,0.089,24,0 232 | 4,142,86,0,0,44,0.645,22,1 233 | 6,134,80,37,370,46.2,0.238,46,1 234 | 1,79,80,25,37,25.4,0.583,22,0 235 | 4,122,68,0,0,35,0.394,29,0 236 | 3,74,68,28,45,29.7,0.293,23,0 237 | 4,171,72,0,0,43.6,0.479,26,1 238 | 7,181,84,21,192,35.9,0.586,51,1 239 | 0,179,90,27,0,44.1,0.686,23,1 240 | 9,164,84,21,0,30.8,0.831,32,1 241 | 0,104,76,0,0,18.4,0.582,27,0 242 | 1,91,64,24,0,29.2,0.192,21,0 243 | 4,91,70,32,88,33.1,0.446,22,0 244 | 3,139,54,0,0,25.6,0.402,22,1 245 | 6,119,50,22,176,27.1,1.318,33,1 246 | 2,146,76,35,194,38.2,0.329,29,0 247 | 9,184,85,15,0,30,1.213,49,1 248 | 10,122,68,0,0,31.2,0.258,41,0 249 | 0,165,90,33,680,52.3,0.427,23,0 250 | 9,124,70,33,402,35.4,0.282,34,0 251 | 1,111,86,19,0,30.1,0.143,23,0 252 | 9,106,52,0,0,31.2,0.38,42,0 253 | 2,129,84,0,0,28,0.284,27,0 254 | 2,90,80,14,55,24.4,0.249,24,0 255 | 0,86,68,32,0,35.8,0.238,25,0 256 | 12,92,62,7,258,27.6,0.926,44,1 257 | 1,113,64,35,0,33.6,0.543,21,1 258 | 3,111,56,39,0,30.1,0.557,30,0 259 | 2,114,68,22,0,28.7,0.092,25,0 260 | 1,193,50,16,375,25.9,0.655,24,0 261 | 11,155,76,28,150,33.3,1.353,51,1 262 | 3,191,68,15,130,30.9,0.299,34,0 263 | 3,141,0,0,0,30,0.761,27,1 264 | 4,95,70,32,0,32.1,0.612,24,0 265 | 3,142,80,15,0,32.4,0.2,63,0 266 | 4,123,62,0,0,32,0.226,35,1 267 | 5,96,74,18,67,33.6,0.997,43,0 268 | 0,138,0,0,0,36.3,0.933,25,1 269 | 2,128,64,42,0,40,1.101,24,0 270 | 0,102,52,0,0,25.1,0.078,21,0 271 | 2,146,0,0,0,27.5,0.24,28,1 272 | 10,101,86,37,0,45.6,1.136,38,1 273 | 2,108,62,32,56,25.2,0.128,21,0 274 | 3,122,78,0,0,23,0.254,40,0 275 | 1,71,78,50,45,33.2,0.422,21,0 276 | 13,106,70,0,0,34.2,0.251,52,0 277 | 2,100,70,52,57,40.5,0.677,25,0 278 | 7,106,60,24,0,26.5,0.296,29,1 279 | 0,104,64,23,116,27.8,0.454,23,0 280 | 5,114,74,0,0,24.9,0.744,57,0 281 | 2,108,62,10,278,25.3,0.881,22,0 282 | 0,146,70,0,0,37.9,0.334,28,1 283 | 10,129,76,28,122,35.9,0.28,39,0 284 | 7,133,88,15,155,32.4,0.262,37,0 285 | 7,161,86,0,0,30.4,0.165,47,1 286 | 2,108,80,0,0,27,0.259,52,1 287 | 7,136,74,26,135,26,0.647,51,0 288 | 5,155,84,44,545,38.7,0.619,34,0 289 | 1,119,86,39,220,45.6,0.808,29,1 290 | 4,96,56,17,49,20.8,0.34,26,0 291 | 5,108,72,43,75,36.1,0.263,33,0 292 | 
0,78,88,29,40,36.9,0.434,21,0 293 | 0,107,62,30,74,36.6,0.757,25,1 294 | 2,128,78,37,182,43.3,1.224,31,1 295 | 1,128,48,45,194,40.5,0.613,24,1 296 | 0,161,50,0,0,21.9,0.254,65,0 297 | 6,151,62,31,120,35.5,0.692,28,0 298 | 2,146,70,38,360,28,0.337,29,1 299 | 0,126,84,29,215,30.7,0.52,24,0 300 | 14,100,78,25,184,36.6,0.412,46,1 301 | 8,112,72,0,0,23.6,0.84,58,0 302 | 0,167,0,0,0,32.3,0.839,30,1 303 | 2,144,58,33,135,31.6,0.422,25,1 304 | 5,77,82,41,42,35.8,0.156,35,0 305 | 5,115,98,0,0,52.9,0.209,28,1 306 | 3,150,76,0,0,21,0.207,37,0 307 | 2,120,76,37,105,39.7,0.215,29,0 308 | 10,161,68,23,132,25.5,0.326,47,1 309 | 0,137,68,14,148,24.8,0.143,21,0 310 | 0,128,68,19,180,30.5,1.391,25,1 311 | 2,124,68,28,205,32.9,0.875,30,1 312 | 6,80,66,30,0,26.2,0.313,41,0 313 | 0,106,70,37,148,39.4,0.605,22,0 314 | 2,155,74,17,96,26.6,0.433,27,1 315 | 3,113,50,10,85,29.5,0.626,25,0 316 | 7,109,80,31,0,35.9,1.127,43,1 317 | 2,112,68,22,94,34.1,0.315,26,0 318 | 3,99,80,11,64,19.3,0.284,30,0 319 | 3,182,74,0,0,30.5,0.345,29,1 320 | 3,115,66,39,140,38.1,0.15,28,0 321 | 6,194,78,0,0,23.5,0.129,59,1 322 | 4,129,60,12,231,27.5,0.527,31,0 323 | 3,112,74,30,0,31.6,0.197,25,1 324 | 0,124,70,20,0,27.4,0.254,36,1 325 | 13,152,90,33,29,26.8,0.731,43,1 326 | 2,112,75,32,0,35.7,0.148,21,0 327 | 1,157,72,21,168,25.6,0.123,24,0 328 | 1,122,64,32,156,35.1,0.692,30,1 329 | 10,179,70,0,0,35.1,0.2,37,0 330 | 2,102,86,36,120,45.5,0.127,23,1 331 | 6,105,70,32,68,30.8,0.122,37,0 332 | 8,118,72,19,0,23.1,1.476,46,0 333 | 2,87,58,16,52,32.7,0.166,25,0 334 | 1,180,0,0,0,43.3,0.282,41,1 335 | 12,106,80,0,0,23.6,0.137,44,0 336 | 1,95,60,18,58,23.9,0.26,22,0 337 | 0,165,76,43,255,47.9,0.259,26,0 338 | 0,117,0,0,0,33.8,0.932,44,0 339 | 5,115,76,0,0,31.2,0.343,44,1 340 | 9,152,78,34,171,34.2,0.893,33,1 341 | 7,178,84,0,0,39.9,0.331,41,1 342 | 1,130,70,13,105,25.9,0.472,22,0 343 | 1,95,74,21,73,25.9,0.673,36,0 344 | 1,0,68,35,0,32,0.389,22,0 345 | 5,122,86,0,0,34.7,0.29,33,0 346 | 8,95,72,0,0,36.8,0.485,57,0 347 | 8,126,88,36,108,38.5,0.349,49,0 348 | 1,139,46,19,83,28.7,0.654,22,0 349 | 3,116,0,0,0,23.5,0.187,23,0 350 | 3,99,62,19,74,21.8,0.279,26,0 351 | 5,0,80,32,0,41,0.346,37,1 352 | 4,92,80,0,0,42.2,0.237,29,0 353 | 4,137,84,0,0,31.2,0.252,30,0 354 | 3,61,82,28,0,34.4,0.243,46,0 355 | 1,90,62,12,43,27.2,0.58,24,0 356 | 3,90,78,0,0,42.7,0.559,21,0 357 | 9,165,88,0,0,30.4,0.302,49,1 358 | 1,125,50,40,167,33.3,0.962,28,1 359 | 13,129,0,30,0,39.9,0.569,44,1 360 | 12,88,74,40,54,35.3,0.378,48,0 361 | 1,196,76,36,249,36.5,0.875,29,1 362 | 5,189,64,33,325,31.2,0.583,29,1 363 | 5,158,70,0,0,29.8,0.207,63,0 364 | 5,103,108,37,0,39.2,0.305,65,0 365 | 4,146,78,0,0,38.5,0.52,67,1 366 | 4,147,74,25,293,34.9,0.385,30,0 367 | 5,99,54,28,83,34,0.499,30,0 368 | 6,124,72,0,0,27.6,0.368,29,1 369 | 0,101,64,17,0,21,0.252,21,0 370 | 3,81,86,16,66,27.5,0.306,22,0 371 | 1,133,102,28,140,32.8,0.234,45,1 372 | 3,173,82,48,465,38.4,2.137,25,1 373 | 0,118,64,23,89,0,1.731,21,0 374 | 0,84,64,22,66,35.8,0.545,21,0 375 | 2,105,58,40,94,34.9,0.225,25,0 376 | 2,122,52,43,158,36.2,0.816,28,0 377 | 12,140,82,43,325,39.2,0.528,58,1 378 | 0,98,82,15,84,25.2,0.299,22,0 379 | 1,87,60,37,75,37.2,0.509,22,0 380 | 4,156,75,0,0,48.3,0.238,32,1 381 | 0,93,100,39,72,43.4,1.021,35,0 382 | 1,107,72,30,82,30.8,0.821,24,0 383 | 0,105,68,22,0,20,0.236,22,0 384 | 1,109,60,8,182,25.4,0.947,21,0 385 | 1,90,62,18,59,25.1,1.268,25,0 386 | 1,125,70,24,110,24.3,0.221,25,0 387 | 1,119,54,13,50,22.3,0.205,24,0 388 | 5,116,74,29,0,32.3,0.66,35,1 389 | 8,105,100,36,0,43.3,0.239,45,1 390 | 
5,144,82,26,285,32,0.452,58,1 391 | 3,100,68,23,81,31.6,0.949,28,0 392 | 1,100,66,29,196,32,0.444,42,0 393 | 5,166,76,0,0,45.7,0.34,27,1 394 | 1,131,64,14,415,23.7,0.389,21,0 395 | 4,116,72,12,87,22.1,0.463,37,0 396 | 4,158,78,0,0,32.9,0.803,31,1 397 | 2,127,58,24,275,27.7,1.6,25,0 398 | 3,96,56,34,115,24.7,0.944,39,0 399 | 0,131,66,40,0,34.3,0.196,22,1 400 | 3,82,70,0,0,21.1,0.389,25,0 401 | 3,193,70,31,0,34.9,0.241,25,1 402 | 4,95,64,0,0,32,0.161,31,1 403 | 6,137,61,0,0,24.2,0.151,55,0 404 | 5,136,84,41,88,35,0.286,35,1 405 | 9,72,78,25,0,31.6,0.28,38,0 406 | 5,168,64,0,0,32.9,0.135,41,1 407 | 2,123,48,32,165,42.1,0.52,26,0 408 | 4,115,72,0,0,28.9,0.376,46,1 409 | 0,101,62,0,0,21.9,0.336,25,0 410 | 8,197,74,0,0,25.9,1.191,39,1 411 | 1,172,68,49,579,42.4,0.702,28,1 412 | 6,102,90,39,0,35.7,0.674,28,0 413 | 1,112,72,30,176,34.4,0.528,25,0 414 | 1,143,84,23,310,42.4,1.076,22,0 415 | 1,143,74,22,61,26.2,0.256,21,0 416 | 0,138,60,35,167,34.6,0.534,21,1 417 | 3,173,84,33,474,35.7,0.258,22,1 418 | 1,97,68,21,0,27.2,1.095,22,0 419 | 4,144,82,32,0,38.5,0.554,37,1 420 | 1,83,68,0,0,18.2,0.624,27,0 421 | 3,129,64,29,115,26.4,0.219,28,1 422 | 1,119,88,41,170,45.3,0.507,26,0 423 | 2,94,68,18,76,26,0.561,21,0 424 | 0,102,64,46,78,40.6,0.496,21,0 425 | 2,115,64,22,0,30.8,0.421,21,0 426 | 8,151,78,32,210,42.9,0.516,36,1 427 | 4,184,78,39,277,37,0.264,31,1 428 | 0,94,0,0,0,0,0.256,25,0 429 | 1,181,64,30,180,34.1,0.328,38,1 430 | 0,135,94,46,145,40.6,0.284,26,0 431 | 1,95,82,25,180,35,0.233,43,1 432 | 2,99,0,0,0,22.2,0.108,23,0 433 | 3,89,74,16,85,30.4,0.551,38,0 434 | 1,80,74,11,60,30,0.527,22,0 435 | 2,139,75,0,0,25.6,0.167,29,0 436 | 1,90,68,8,0,24.5,1.138,36,0 437 | 0,141,0,0,0,42.4,0.205,29,1 438 | 12,140,85,33,0,37.4,0.244,41,0 439 | 5,147,75,0,0,29.9,0.434,28,0 440 | 1,97,70,15,0,18.2,0.147,21,0 441 | 6,107,88,0,0,36.8,0.727,31,0 442 | 0,189,104,25,0,34.3,0.435,41,1 443 | 2,83,66,23,50,32.2,0.497,22,0 444 | 4,117,64,27,120,33.2,0.23,24,0 445 | 8,108,70,0,0,30.5,0.955,33,1 446 | 4,117,62,12,0,29.7,0.38,30,1 447 | 0,180,78,63,14,59.4,2.42,25,1 448 | 1,100,72,12,70,25.3,0.658,28,0 449 | 0,95,80,45,92,36.5,0.33,26,0 450 | 0,104,64,37,64,33.6,0.51,22,1 451 | 0,120,74,18,63,30.5,0.285,26,0 452 | 1,82,64,13,95,21.2,0.415,23,0 453 | 2,134,70,0,0,28.9,0.542,23,1 454 | 0,91,68,32,210,39.9,0.381,25,0 455 | 2,119,0,0,0,19.6,0.832,72,0 456 | 2,100,54,28,105,37.8,0.498,24,0 457 | 14,175,62,30,0,33.6,0.212,38,1 458 | 1,135,54,0,0,26.7,0.687,62,0 459 | 5,86,68,28,71,30.2,0.364,24,0 460 | 10,148,84,48,237,37.6,1.001,51,1 461 | 9,134,74,33,60,25.9,0.46,81,0 462 | 9,120,72,22,56,20.8,0.733,48,0 463 | 1,71,62,0,0,21.8,0.416,26,0 464 | 8,74,70,40,49,35.3,0.705,39,0 465 | 5,88,78,30,0,27.6,0.258,37,0 466 | 10,115,98,0,0,24,1.022,34,0 467 | 0,124,56,13,105,21.8,0.452,21,0 468 | 0,74,52,10,36,27.8,0.269,22,0 469 | 0,97,64,36,100,36.8,0.6,25,0 470 | 8,120,0,0,0,30,0.183,38,1 471 | 6,154,78,41,140,46.1,0.571,27,0 472 | 1,144,82,40,0,41.3,0.607,28,0 473 | 0,137,70,38,0,33.2,0.17,22,0 474 | 0,119,66,27,0,38.8,0.259,22,0 475 | 7,136,90,0,0,29.9,0.21,50,0 476 | 4,114,64,0,0,28.9,0.126,24,0 477 | 0,137,84,27,0,27.3,0.231,59,0 478 | 2,105,80,45,191,33.7,0.711,29,1 479 | 7,114,76,17,110,23.8,0.466,31,0 480 | 8,126,74,38,75,25.9,0.162,39,0 481 | 4,132,86,31,0,28,0.419,63,0 482 | 3,158,70,30,328,35.5,0.344,35,1 483 | 0,123,88,37,0,35.2,0.197,29,0 484 | 4,85,58,22,49,27.8,0.306,28,0 485 | 0,84,82,31,125,38.2,0.233,23,0 486 | 0,145,0,0,0,44.2,0.63,31,1 487 | 0,135,68,42,250,42.3,0.365,24,1 488 | 1,139,62,41,480,40.7,0.536,21,0 489 | 
0,173,78,32,265,46.5,1.159,58,0 490 | 4,99,72,17,0,25.6,0.294,28,0 491 | 8,194,80,0,0,26.1,0.551,67,0 492 | 2,83,65,28,66,36.8,0.629,24,0 493 | 2,89,90,30,0,33.5,0.292,42,0 494 | 4,99,68,38,0,32.8,0.145,33,0 495 | 4,125,70,18,122,28.9,1.144,45,1 496 | 3,80,0,0,0,0,0.174,22,0 497 | 6,166,74,0,0,26.6,0.304,66,0 498 | 5,110,68,0,0,26,0.292,30,0 499 | 2,81,72,15,76,30.1,0.547,25,0 500 | 7,195,70,33,145,25.1,0.163,55,1 501 | 6,154,74,32,193,29.3,0.839,39,0 502 | 2,117,90,19,71,25.2,0.313,21,0 503 | 3,84,72,32,0,37.2,0.267,28,0 504 | 6,0,68,41,0,39,0.727,41,1 505 | 7,94,64,25,79,33.3,0.738,41,0 506 | 3,96,78,39,0,37.3,0.238,40,0 507 | 10,75,82,0,0,33.3,0.263,38,0 508 | 0,180,90,26,90,36.5,0.314,35,1 509 | 1,130,60,23,170,28.6,0.692,21,0 510 | 2,84,50,23,76,30.4,0.968,21,0 511 | 8,120,78,0,0,25,0.409,64,0 512 | 12,84,72,31,0,29.7,0.297,46,1 513 | 0,139,62,17,210,22.1,0.207,21,0 514 | 9,91,68,0,0,24.2,0.2,58,0 515 | 2,91,62,0,0,27.3,0.525,22,0 516 | 3,99,54,19,86,25.6,0.154,24,0 517 | 3,163,70,18,105,31.6,0.268,28,1 518 | 9,145,88,34,165,30.3,0.771,53,1 519 | 7,125,86,0,0,37.6,0.304,51,0 520 | 13,76,60,0,0,32.8,0.18,41,0 521 | 6,129,90,7,326,19.6,0.582,60,0 522 | 2,68,70,32,66,25,0.187,25,0 523 | 3,124,80,33,130,33.2,0.305,26,0 524 | 6,114,0,0,0,0,0.189,26,0 525 | 9,130,70,0,0,34.2,0.652,45,1 526 | 3,125,58,0,0,31.6,0.151,24,0 527 | 3,87,60,18,0,21.8,0.444,21,0 528 | 1,97,64,19,82,18.2,0.299,21,0 529 | 3,116,74,15,105,26.3,0.107,24,0 530 | 0,117,66,31,188,30.8,0.493,22,0 531 | 0,111,65,0,0,24.6,0.66,31,0 532 | 2,122,60,18,106,29.8,0.717,22,0 533 | 0,107,76,0,0,45.3,0.686,24,0 534 | 1,86,66,52,65,41.3,0.917,29,0 535 | 6,91,0,0,0,29.8,0.501,31,0 536 | 1,77,56,30,56,33.3,1.251,24,0 537 | 4,132,0,0,0,32.9,0.302,23,1 538 | 0,105,90,0,0,29.6,0.197,46,0 539 | 0,57,60,0,0,21.7,0.735,67,0 540 | 0,127,80,37,210,36.3,0.804,23,0 541 | 3,129,92,49,155,36.4,0.968,32,1 542 | 8,100,74,40,215,39.4,0.661,43,1 543 | 3,128,72,25,190,32.4,0.549,27,1 544 | 10,90,85,32,0,34.9,0.825,56,1 545 | 4,84,90,23,56,39.5,0.159,25,0 546 | 1,88,78,29,76,32,0.365,29,0 547 | 8,186,90,35,225,34.5,0.423,37,1 548 | 5,187,76,27,207,43.6,1.034,53,1 549 | 4,131,68,21,166,33.1,0.16,28,0 550 | 1,164,82,43,67,32.8,0.341,50,0 551 | 4,189,110,31,0,28.5,0.68,37,0 552 | 1,116,70,28,0,27.4,0.204,21,0 553 | 3,84,68,30,106,31.9,0.591,25,0 554 | 6,114,88,0,0,27.8,0.247,66,0 555 | 1,88,62,24,44,29.9,0.422,23,0 556 | 1,84,64,23,115,36.9,0.471,28,0 557 | 7,124,70,33,215,25.5,0.161,37,0 558 | 1,97,70,40,0,38.1,0.218,30,0 559 | 8,110,76,0,0,27.8,0.237,58,0 560 | 11,103,68,40,0,46.2,0.126,42,0 561 | 11,85,74,0,0,30.1,0.3,35,0 562 | 6,125,76,0,0,33.8,0.121,54,1 563 | 0,198,66,32,274,41.3,0.502,28,1 564 | 1,87,68,34,77,37.6,0.401,24,0 565 | 6,99,60,19,54,26.9,0.497,32,0 566 | 0,91,80,0,0,32.4,0.601,27,0 567 | 2,95,54,14,88,26.1,0.748,22,0 568 | 1,99,72,30,18,38.6,0.412,21,0 569 | 6,92,62,32,126,32,0.085,46,0 570 | 4,154,72,29,126,31.3,0.338,37,0 571 | 0,121,66,30,165,34.3,0.203,33,1 572 | 3,78,70,0,0,32.5,0.27,39,0 573 | 2,130,96,0,0,22.6,0.268,21,0 574 | 3,111,58,31,44,29.5,0.43,22,0 575 | 2,98,60,17,120,34.7,0.198,22,0 576 | 1,143,86,30,330,30.1,0.892,23,0 577 | 1,119,44,47,63,35.5,0.28,25,0 578 | 6,108,44,20,130,24,0.813,35,0 579 | 2,118,80,0,0,42.9,0.693,21,1 580 | 10,133,68,0,0,27,0.245,36,0 581 | 2,197,70,99,0,34.7,0.575,62,1 582 | 0,151,90,46,0,42.1,0.371,21,1 583 | 6,109,60,27,0,25,0.206,27,0 584 | 12,121,78,17,0,26.5,0.259,62,0 585 | 8,100,76,0,0,38.7,0.19,42,0 586 | 8,124,76,24,600,28.7,0.687,52,1 587 | 1,93,56,11,0,22.5,0.417,22,0 588 | 
8,143,66,0,0,34.9,0.129,41,1 589 | 6,103,66,0,0,24.3,0.249,29,0 590 | 3,176,86,27,156,33.3,1.154,52,1 591 | 0,73,0,0,0,21.1,0.342,25,0 592 | 11,111,84,40,0,46.8,0.925,45,1 593 | 2,112,78,50,140,39.4,0.175,24,0 594 | 3,132,80,0,0,34.4,0.402,44,1 595 | 2,82,52,22,115,28.5,1.699,25,0 596 | 6,123,72,45,230,33.6,0.733,34,0 597 | 0,188,82,14,185,32,0.682,22,1 598 | 0,67,76,0,0,45.3,0.194,46,0 599 | 1,89,24,19,25,27.8,0.559,21,0 600 | 1,173,74,0,0,36.8,0.088,38,1 601 | 1,109,38,18,120,23.1,0.407,26,0 602 | 1,108,88,19,0,27.1,0.4,24,0 603 | 6,96,0,0,0,23.7,0.19,28,0 604 | 1,124,74,36,0,27.8,0.1,30,0 605 | 7,150,78,29,126,35.2,0.692,54,1 606 | 4,183,0,0,0,28.4,0.212,36,1 607 | 1,124,60,32,0,35.8,0.514,21,0 608 | 1,181,78,42,293,40,1.258,22,1 609 | 1,92,62,25,41,19.5,0.482,25,0 610 | 0,152,82,39,272,41.5,0.27,27,0 611 | 1,111,62,13,182,24,0.138,23,0 612 | 3,106,54,21,158,30.9,0.292,24,0 613 | 3,174,58,22,194,32.9,0.593,36,1 614 | 7,168,88,42,321,38.2,0.787,40,1 615 | 6,105,80,28,0,32.5,0.878,26,0 616 | 11,138,74,26,144,36.1,0.557,50,1 617 | 3,106,72,0,0,25.8,0.207,27,0 618 | 6,117,96,0,0,28.7,0.157,30,0 619 | 2,68,62,13,15,20.1,0.257,23,0 620 | 9,112,82,24,0,28.2,1.282,50,1 621 | 0,119,0,0,0,32.4,0.141,24,1 622 | 2,112,86,42,160,38.4,0.246,28,0 623 | 2,92,76,20,0,24.2,1.698,28,0 624 | 6,183,94,0,0,40.8,1.461,45,0 625 | 0,94,70,27,115,43.5,0.347,21,0 626 | 2,108,64,0,0,30.8,0.158,21,0 627 | 4,90,88,47,54,37.7,0.362,29,0 628 | 0,125,68,0,0,24.7,0.206,21,0 629 | 0,132,78,0,0,32.4,0.393,21,0 630 | 5,128,80,0,0,34.6,0.144,45,0 631 | 4,94,65,22,0,24.7,0.148,21,0 632 | 7,114,64,0,0,27.4,0.732,34,1 633 | 0,102,78,40,90,34.5,0.238,24,0 634 | 2,111,60,0,0,26.2,0.343,23,0 635 | 1,128,82,17,183,27.5,0.115,22,0 636 | 10,92,62,0,0,25.9,0.167,31,0 637 | 13,104,72,0,0,31.2,0.465,38,1 638 | 5,104,74,0,0,28.8,0.153,48,0 639 | 2,94,76,18,66,31.6,0.649,23,0 640 | 7,97,76,32,91,40.9,0.871,32,1 641 | 1,100,74,12,46,19.5,0.149,28,0 642 | 0,102,86,17,105,29.3,0.695,27,0 643 | 4,128,70,0,0,34.3,0.303,24,0 644 | 6,147,80,0,0,29.5,0.178,50,1 645 | 4,90,0,0,0,28,0.61,31,0 646 | 3,103,72,30,152,27.6,0.73,27,0 647 | 2,157,74,35,440,39.4,0.134,30,0 648 | 1,167,74,17,144,23.4,0.447,33,1 649 | 0,179,50,36,159,37.8,0.455,22,1 650 | 11,136,84,35,130,28.3,0.26,42,1 651 | 0,107,60,25,0,26.4,0.133,23,0 652 | 1,91,54,25,100,25.2,0.234,23,0 653 | 1,117,60,23,106,33.8,0.466,27,0 654 | 5,123,74,40,77,34.1,0.269,28,0 655 | 2,120,54,0,0,26.8,0.455,27,0 656 | 1,106,70,28,135,34.2,0.142,22,0 657 | 2,155,52,27,540,38.7,0.24,25,1 658 | 2,101,58,35,90,21.8,0.155,22,0 659 | 1,120,80,48,200,38.9,1.162,41,0 660 | 11,127,106,0,0,39,0.19,51,0 661 | 3,80,82,31,70,34.2,1.292,27,1 662 | 10,162,84,0,0,27.7,0.182,54,0 663 | 1,199,76,43,0,42.9,1.394,22,1 664 | 8,167,106,46,231,37.6,0.165,43,1 665 | 9,145,80,46,130,37.9,0.637,40,1 666 | 6,115,60,39,0,33.7,0.245,40,1 667 | 1,112,80,45,132,34.8,0.217,24,0 668 | 4,145,82,18,0,32.5,0.235,70,1 669 | 10,111,70,27,0,27.5,0.141,40,1 670 | 6,98,58,33,190,34,0.43,43,0 671 | 9,154,78,30,100,30.9,0.164,45,0 672 | 6,165,68,26,168,33.6,0.631,49,0 673 | 1,99,58,10,0,25.4,0.551,21,0 674 | 10,68,106,23,49,35.5,0.285,47,0 675 | 3,123,100,35,240,57.3,0.88,22,0 676 | 8,91,82,0,0,35.6,0.587,68,0 677 | 6,195,70,0,0,30.9,0.328,31,1 678 | 9,156,86,0,0,24.8,0.23,53,1 679 | 0,93,60,0,0,35.3,0.263,25,0 680 | 3,121,52,0,0,36,0.127,25,1 681 | 2,101,58,17,265,24.2,0.614,23,0 682 | 2,56,56,28,45,24.2,0.332,22,0 683 | 0,162,76,36,0,49.6,0.364,26,1 684 | 0,95,64,39,105,44.6,0.366,22,0 685 | 4,125,80,0,0,32.3,0.536,27,1 686 | 
5,136,82,0,0,0,0.64,69,0 687 | 2,129,74,26,205,33.2,0.591,25,0 688 | 3,130,64,0,0,23.1,0.314,22,0 689 | 1,107,50,19,0,28.3,0.181,29,0 690 | 1,140,74,26,180,24.1,0.828,23,0 691 | 1,144,82,46,180,46.1,0.335,46,1 692 | 8,107,80,0,0,24.6,0.856,34,0 693 | 13,158,114,0,0,42.3,0.257,44,1 694 | 2,121,70,32,95,39.1,0.886,23,0 695 | 7,129,68,49,125,38.5,0.439,43,1 696 | 2,90,60,0,0,23.5,0.191,25,0 697 | 7,142,90,24,480,30.4,0.128,43,1 698 | 3,169,74,19,125,29.9,0.268,31,1 699 | 0,99,0,0,0,25,0.253,22,0 700 | 4,127,88,11,155,34.5,0.598,28,0 701 | 4,118,70,0,0,44.5,0.904,26,0 702 | 2,122,76,27,200,35.9,0.483,26,0 703 | 6,125,78,31,0,27.6,0.565,49,1 704 | 1,168,88,29,0,35,0.905,52,1 705 | 2,129,0,0,0,38.5,0.304,41,0 706 | 4,110,76,20,100,28.4,0.118,27,0 707 | 6,80,80,36,0,39.8,0.177,28,0 708 | 10,115,0,0,0,0,0.261,30,1 709 | 2,127,46,21,335,34.4,0.176,22,0 710 | 9,164,78,0,0,32.8,0.148,45,1 711 | 2,93,64,32,160,38,0.674,23,1 712 | 3,158,64,13,387,31.2,0.295,24,0 713 | 5,126,78,27,22,29.6,0.439,40,0 714 | 10,129,62,36,0,41.2,0.441,38,1 715 | 0,134,58,20,291,26.4,0.352,21,0 716 | 3,102,74,0,0,29.5,0.121,32,0 717 | 7,187,50,33,392,33.9,0.826,34,1 718 | 3,173,78,39,185,33.8,0.97,31,1 719 | 10,94,72,18,0,23.1,0.595,56,0 720 | 1,108,60,46,178,35.5,0.415,24,0 721 | 5,97,76,27,0,35.6,0.378,52,1 722 | 4,83,86,19,0,29.3,0.317,34,0 723 | 1,114,66,36,200,38.1,0.289,21,0 724 | 1,149,68,29,127,29.3,0.349,42,1 725 | 5,117,86,30,105,39.1,0.251,42,0 726 | 1,111,94,0,0,32.8,0.265,45,0 727 | 4,112,78,40,0,39.4,0.236,38,0 728 | 1,116,78,29,180,36.1,0.496,25,0 729 | 0,141,84,26,0,32.4,0.433,22,0 730 | 2,175,88,0,0,22.9,0.326,22,0 731 | 2,92,52,0,0,30.1,0.141,22,0 732 | 3,130,78,23,79,28.4,0.323,34,1 733 | 8,120,86,0,0,28.4,0.259,22,1 734 | 2,174,88,37,120,44.5,0.646,24,1 735 | 2,106,56,27,165,29,0.426,22,0 736 | 2,105,75,0,0,23.3,0.56,53,0 737 | 4,95,60,32,0,35.4,0.284,28,0 738 | 0,126,86,27,120,27.4,0.515,21,0 739 | 8,65,72,23,0,32,0.6,42,0 740 | 2,99,60,17,160,36.6,0.453,21,0 741 | 1,102,74,0,0,39.5,0.293,42,1 742 | 11,120,80,37,150,42.3,0.785,48,1 743 | 3,102,44,20,94,30.8,0.4,26,0 744 | 1,109,58,18,116,28.5,0.219,22,0 745 | 9,140,94,0,0,32.7,0.734,45,1 746 | 13,153,88,37,140,40.6,1.174,39,0 747 | 12,100,84,33,105,30,0.488,46,0 748 | 1,147,94,41,0,49.3,0.358,27,1 749 | 1,81,74,41,57,46.3,1.096,32,0 750 | 3,187,70,22,200,36.4,0.408,36,1 751 | 6,162,62,0,0,24.3,0.178,50,1 752 | 4,136,70,0,0,31.2,1.182,22,1 753 | 1,121,78,39,74,39,0.261,28,0 754 | 3,108,62,24,0,26,0.223,25,0 755 | 0,181,88,44,510,43.3,0.222,26,1 756 | 8,154,78,32,0,32.4,0.443,45,1 757 | 1,128,88,39,110,36.5,1.057,37,1 758 | 7,137,90,41,0,32,0.391,39,0 759 | 0,123,72,0,0,36.3,0.258,52,1 760 | 1,106,76,0,0,37.5,0.197,26,0 761 | 6,190,92,0,0,35.5,0.278,66,1 762 | 2,88,58,26,16,28.4,0.766,22,0 763 | 9,170,74,31,0,44,0.403,43,1 764 | 9,89,62,0,0,22.5,0.142,33,0 765 | 10,101,76,48,180,32.9,0.171,63,0 766 | 2,122,70,27,0,36.8,0.34,27,0 767 | 5,121,72,23,112,26.2,0.245,30,0 768 | 1,126,60,0,0,30.1,0.349,47,1 769 | 1,93,70,31,0,30.4,0.315,23,0 -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Data Science Projects 2 | ![](https://lh3.googleusercontent.com/yuUrDV2DAtBRvItHZ2FvXMkPbHR5NEt4kXbpp8dgK-r9jI9-irP19GJb2CvdBRYmy41KG4BxFu2Hod9GzdgGc46iYmm7As4bNNsc-JP7vYwY8d1BzHgZdvKR7H4xtLM20zR9gn0PJE-nQU0navp9Xh0pHc3Cp-CjYUENN7dWZ3NJiw8CiHFEJn7Mc0ul_A) 3 | Welcome to my Data Science Projects Repository! 
This repository contains a collection of my data science projects, showcasing my skills and expertise in the field. Each project demonstrates different aspects of data analysis, machine learning, and visualization. 4 | 5 | ![GitHub Repo stars](https://img.shields.io/github/stars/SUKHMAN-SINGH-1612/Data-Science-Projects?style=social) ![GitHub forks](https://img.shields.io/github/forks/SUKHMAN-SINGH-1612/Data-Science-Projects?style=social) 6 | 7 | 8 | ### GitHub Page 9 | [![Data-Science-Projects](https://img.shields.io/badge/Data_Science_Projects-GitHub_Page-%2300BFFF.svg)](https://sukhman-singh-1612.github.io/data_science/) 10 | 11 | ## Projects 12 | 1. [Breast Cancer Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Breast%20Cancer%20Prediction) 13 | - **Description:** The project predicts the diagnosis (M = malignant, B = benign) of breast cancer 14 | - **Technologies Used:** The notebook uses Decision Tree Classification and Logistic Regression 15 | - **Results:** The logistic regression model gave 97% accuracy and the decision tree gave 93.5% accuracy 16 | 2. [Red Wine Quality Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Red%20Wine%20Quality) 17 | - **Description:** The project predicts the quality of the wine as a binary value: 1 for good quality and 0 for bad quality 18 | - **Technologies Used:** The notebook uses logistic regression, support vector machine, decision tree and KNN 19 | - **Results:** The logistic regression model performs the best with an accuracy of 86.67% 20 | 3. [Heart Stroke Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Heart%20Stroke%20Prediction) 21 | - **Description:** The project predicts the risk of heart stroke by studying the person's demographics and medical information 22 | - **Technologies Used:** The notebook uses logistic regression, support vector machine, decision tree and KNN 23 | - **Results:** The logistic regression, SVM and KNN models perform the best with 93.8% accuracy 24 | 4. [House Price Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/House%20Price%20Prediction) 25 | - **Description:** The project predicts the house price after studying variables such as location, area, bedroom and bathroom count, and many more. 26 | - **Technologies Used:** The notebook uses Linear Regression, Ridge Regression and Random Forest Regressor 27 | - **Results:** The Random Forest Regressor performed best with an accuracy of 87.89% 28 | 5. [Titanic Survival Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Titanic%20Survival%20Prediction) 29 | - **Description:** The project predicts survival during the Titanic disaster based on socio-economic measures 30 | - **Technologies Used:** The notebook uses Decision Tree Classifier 31 | - **Results:** The Decision Tree Classifier performed well on the test dataset with an accuracy of 89.5% 32 | 6. [Diamond Price Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Diamond%20Price%20Prediction) 33 | - **Description:** The project predicts the price (in US dollars) of diamonds based on their features 34 | - **Technologies Used:** The notebook uses Decision Tree Regressor and Random Forest Regressor 35 | - **Results:** The Decision Tree Regressor performed well on the test dataset with an accuracy of 96% 36 | 7. 
[Medical Cost Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Medical%20Cost%20Prediction) 37 | - **Description:** The project predicts the medical treatment cost by analysing the patient's age, gender, BMI, smoking habits, etc. 38 | - **Technologies Used:** The notebook uses Linear and Polynomial Regression, Decision Tree and Random Forest Regressor 39 | - **Results:** The Decision Tree Regressor and Random Forest Regressor performed well 40 | 8. [Room Occupancy Detection](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Room%20Occupancy%20Detection) 41 | - **Description:** The project predicts room occupancy by analyzing sensor data such as temperature, light and CO2 level. 42 | - **Technologies Used:** The notebook uses Random Forest Classifier 43 | - **Results:** The Random Forest Classifier performed well with an accuracy of 98% 44 | 9. [Sleep Disorder Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Sleep%20Disorder%20Prediction) 45 | - **Description:** The project aims to predict sleep disorders and their types by analyzing lifestyle and medical variables, such as age, BMI, sleep duration, 46 | blood pressure, and more 47 | - **Technologies Used:** The notebook uses Random Forest Classifier and Decision Tree Classifier 48 | - **Results:** The Random Forest Classifier performed well with an accuracy of 89% 49 | 10. [Pima Indians Diabetes Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Pima%20Indians%20Diabetes%20Prediction) 50 | - **Description:** The primary objective of the Pima Indian Diabetes Prediction project is to analyze various medical factors of female patients to predict whether they have diabetes or not. 51 | - **Technologies Used:** The notebook uses Logistic Regression, Random Forest Classifier and Support Vector Machine 52 | - **Results:** The Logistic Regression model achieved an accuracy of 78%. 53 | 11. [Bank Customer Churn Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Customer%20Churn%20Prediction) 54 | - **Description:** The main objective of the Bank Customer Churn Prediction project is to analyze customer demographics in order to predict whether a customer will leave the bank or not. 55 | - **Technologies Used:** The notebook uses Random Forest Classifier and Decision Tree Classifier 56 | - **Results:** The Random Forest Classifier and Decision Tree Classifier performed equally well with an accuracy of 87% 57 | 12. [Salary Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Salary%20Prediction) 58 | - **Description:** The main objective of the Salary Prediction project is to analyze the employee's demographics, such as age, experience, job title, country and race, to predict the salary. 59 | - **Technologies Used:** The notebook uses Decision Tree Regressor and Random Forest Regressor 60 | - **Results:** The Random Forest Regressor performed best with 94.6% accuracy 61 | 13. [Delhi House Price Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Delhi%20House%20Price%20Prediction) 62 | - **Description:** The primary objective is to develop a predictive model that can accurately estimate the prices of houses based on several key features present in the dataset. 
63 | - **Technologies Used:** The notebook uses Decision Tree Regressor and Random Forest Regressor 64 | - **Results:** The Random Forest Regressor performed best with 84.98% accuracy 65 | 14. [Loan Approval Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Loan%20Approval%20Prediction) 66 | - **Description:** The Loan Approval Prediction project aims to predict whether a loan application will be approved by a bank. 67 | - **Technologies Used:** The notebook uses Random Forest Classifier and Decision Tree Classifier 68 | - **Results:** The Decision Tree Classifier performed well with an accuracy of 91.4% 69 | 15. [Cardiovascular Disease Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Cardiovascular%20Disease%20Prediction) 70 | - **Description:** The Cardiovascular Disease Prediction project aims to predict the occurrence of cardiovascular disease in patients based on their medical records and history. 71 | - **Technologies Used:** The notebook uses Random Forest Classifier, Decision Tree Classifier and Logistic Regression 72 | - **Results:** The Logistic Regression model performed well with an accuracy of 91.4% 73 | 16. [Belarus Car Price Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Belarus%20Car%20Price%20Prediction) 74 | - **Description:** The Belarus Car Price Prediction project aims to predict the price of cars in Belarus based on car features. 75 | - **Technologies Used:** The notebook uses Decision Tree Regressor 76 | - **Results:** The Decision Tree Regressor gave an accuracy of 86.29% 77 | 17. [Warranty Claims Fraud Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Warranty%20Claims%20Fraud%20Prediction) 78 | - **Description:** The aim of this data science project is to predict the authenticity of warranty claims by analyzing various factors such as region, product category, claim value, and more. 79 | - **Technologies Used:** The notebook uses Decision Tree Classifier, Random Forest Classifier and Logistic Regression 80 | - **Results:** All three models gave an accuracy of 91-92% 81 | 18. [E-Commerce Product Delivery Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/E-Commerce%20Product%20Delivery%20Prediction) 82 | - **Description:** The aim of this project is to predict whether products from an international e-commerce company will reach customers on time or not. 83 | - **Technologies Used:** The notebook uses Decision Tree Classifier, Random Forest Classifier, Logistic Regression and KNN Classifier 84 | - **Results:** The decision tree classifier model performed best with 69% accuracy 85 | 19. [Hotel Reservations Cancellation Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Hotel%20Reservations%20Cancellation%20Prediction) 86 | - **Description:** The aim of this project is to predict which reservations are likely to be cancelled by customers by analyzing various features and variables associated with the reservation. 87 | - **Technologies Used:** The notebook uses Decision Tree Classifier, Random Forest Classifier and Logistic Regression. 88 | - **Results:** The decision tree classifier model performed best with 85% accuracy 89 | 20. 
[Telecom Customer Churn Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Telecom%20Customer%20Churn%20Prediction) 90 | - **Description:** The aim of this project is to analyze customer demographics, services, tenure and other variables to predict whether a particular customer will churn or not. 91 | - **Technologies Used:** The notebook uses Decision Tree Classifier, Random Forest Classifier and K Nearest Neighbor Classifier. 92 | - **Results:** The random forest classifier model performed best with 82% accuracy 93 | 21. [SFR Analysis](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/SFR%20Analysis) 94 | - **Description:** The objective of this project is to analyze the SFR (SpaceFund Realty) ratings of aerospace companies and their missions in order to help investors make better decisions. 95 | - **Technologies Used:** The notebook uses Decision Tree Classifier and Random Forest Classifier. 96 | - **Results:** The random forest classifier and decision tree classifier gave 87% accuracy. 97 | 22. [Indian Used Car Price Prediction](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/tree/main/Indian%20Used%20Car%20Price%20Prediction) 98 | - **Description:** The aim of this data science project is to predict the price of used cars in major Indian metro cities. 99 | - **Technologies Used:** The notebook uses Decision Tree Regressor and Random Forest Regressor. 100 | - **Results:** The random forest regressor gave 87.8% accuracy 101 | ## License 102 | This project is licensed under the [MIT License](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/blob/main/LICENSE). You are free to use the code and resources for educational or personal purposes. 103 | ## Contributing 104 | Contributions are welcome! If you would like to contribute to this repository, please follow the guidelines outlined in [CONTRIBUTING.md](https://github.com/SUKHMAN-SINGH-1612/Data-Science-Projects/blob/main/CONTRIBUTING.md). Any improvements, bug fixes, or additional projects are greatly appreciated. 105 | ## Feedback and Contact 106 | I welcome any feedback, suggestions, or questions you may have about the projects or the repository. Feel free to reach out to me via email at sukhmansinghbhogal@gmail.com 107 | 108 | Enjoy exploring my data science projects! 109 | -------------------------------------------------------------------------------- /Red Wine Quality/Description.md: -------------------------------------------------------------------------------- 1 | # Red Wine Quality Prediction 2 | This data science project aims to predict the quality of red variants of the Portuguese "Vinho Verde" wine. The dataset consists of physicochemical (input) and sensory (output) variables, with no information about grape types, wine brand, or selling price due to privacy and logistic constraints. 3 | 4 | The dataset can be approached as either a classification or regression task. The quality scores are ordered and not evenly distributed, meaning there are significantly more normal wines than excellent or poor ones. 5 | ## Dataset Information 6 | The dataset used in this project is sourced from the UCI Machine Learning Repository and is also available for convenience on Kaggle. However, if there are any licensing concerns, the dataset will be promptly removed upon request. Additional details can be found in the reference [Cortez et al., 2009]. 7 | ## Input Variables (Physicochemical Tests) 8 | 1. Fixed acidity 9 | 2. Volatile acidity 10 | 3. Citric acid 11 | 4. 
Residual sugar 12 | 5. Chlorides 13 | 6. Free sulfur dioxide 14 | 7. Total sulfur dioxide 15 | 8. Density 16 | 9. pH 17 | 10. Sulphates 18 | 11. Alcohol 19 | ## Output Variable (Sensory Data) 20 | 1. Quality (score between 0 and 10) 21 | For further insights and analysis, refer to the research paper by Cortez et al. (2009). This project aims to leverage machine learning techniques to build a predictive model that can estimate the quality of red wines based on their physicochemical properties and sensory attributes. 22 | -------------------------------------------------------------------------------- /Red Wine Quality/Wine-Quality.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Red Wine Quality/Wine-Quality.pdf -------------------------------------------------------------------------------- /Room Occupancy Detection/Room Occupancy Detection.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Room Occupancy Detection/Room Occupancy Detection.pdf -------------------------------------------------------------------------------- /Room Occupancy Detection/description.md: -------------------------------------------------------------------------------- 1 | # Occupancy Prediction from Sensor Data 2 | ![](https://code.datasciencedojo.com/datasciencedojo/datasets/raw/master/Occupancy%20Detection/O6YGSH0.jpg) 3 | The Occupancy Prediction project aims to predict whether a room is occupied or not based on the data collected from various sensors. The dataset used in this project is obtained from the UCI Machine Learning Repository and consists of 7 attributes, namely date, temperature, humidity, light, CO2, humidity ratio, and occupancy. 4 | ## Dataset Description 5 | The dataset contains experimental data used for binary classification of room occupancy in an office room. The key features used for prediction are Temperature, Humidity, Light, and CO2. The ground-truth occupancy labels were obtained from time-stamped pictures taken every minute. 6 | ## Data Attributes 7 | - Date: The date and time of the data entry. 8 | - Temperature: The ambient temperature of the room. 9 | - Humidity: The relative humidity of the room. 10 | - Light: The light intensity in the room. 11 | - CO2: The carbon dioxide level in the room. 12 | - Humidity Ratio: The ratio of water vapor in the air to the amount of dry air. 13 | - Occupancy: The binary target variable indicating whether the room is occupied (1) or not (0). 
14 | 15 | ## Data Dictionary 16 | | Column Position | Attribute Name | Definition | Data Type | Example | % Null Ratios | 17 | |-------------------|----------------|---------------------------------------|--------------|------------------------------------------------|---------------| 18 | | 1 | Date | Date & time in year-month-day hour:minute:second format | Qualitative | 2/4/2015 17:57, 2/4/2015 17:55, 2/4/2015 18:06 | 0 | 19 | | 2 | Temperature | Temperature in degrees Celsius | Quantitative | 23.150, 23.075, 22.890 | 0 | 20 | | 3 | Humidity | Relative humidity in percentage | Quantitative | 27.272000, 27.200000, 27.390000 | 0 | 21 | | 4 | Light | Illuminance measurement in unit Lux | Quantitative | 426.0, 419.0, 0.0 | 0 | 22 | | 5 | CO2 | CO2 in parts per million (ppm) | Quantitative | 489.666667, 495.500000, 534.500000 | 0 | 23 | | 6 | HumidityRatio | Humidity ratio: Derived quantity from temperature and relative humidity, in kg water-vapor/kg air | Quantitative | 0.004986, 0.005088, 0.005203 | 0 | 24 | | 7 | Occupancy | Occupied or not: 1 for occupied and 0 for not occupied | Quantitative | 1, 0 | 0 | 25 | ## Goal 26 | The primary goal of this project is to build an efficient binary classification model that accurately predicts room occupancy based on sensor data. 27 | ## Model Selection 28 | To achieve this goal, a random forest classifier was used, which gave an accuracy of 98% (an illustrative modeling sketch is shown below). 29 | ## Impact 30 | The Occupancy Prediction project has the potential to revolutionize various domains, including energy efficiency, resource allocation, safety, and IoT applications. By accurately predicting room occupancy based on sensor data, it can enable smart building automation, optimize resource usage, enhance security measures, and promote sustainable practices. Additionally, it offers valuable insights into occupancy patterns, empowering researchers and policymakers to make informed decisions for a more efficient and comfortable living and working environment. 
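The following is a minimal illustrative sketch of such a model, not the project's exact notebook code. It assumes pandas and scikit-learn are available and that the training and test splits are stored as CSV files using the column names from the data dictionary above; the file paths are placeholders.

```python
# Illustrative sketch: random forest occupancy classifier.
# Assumes pandas/scikit-learn are installed; the CSV paths below are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

features = ["Temperature", "Humidity", "Light", "CO2", "HumidityRatio"]

# Load the training and test splits (columns as in the data dictionary above).
train = pd.read_csv("occupancy_train.csv")
test = pd.read_csv("occupancy_test.csv")

# Fit a random forest on the sensor readings.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(train[features], train["Occupancy"])

# Evaluate on the held-out test set.
predictions = model.predict(test[features])
print("Accuracy:", accuracy_score(test["Occupancy"], predictions))
```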
31 | -------------------------------------------------------------------------------- /SFR Analysis/Launch SFR.csv: -------------------------------------------------------------------------------- 1 | "Company","SFR","Payload (kg)","Launch Cost ($M)","Price ($/kg)","Funding ($M)","Launch Class","Orbit Altitude","Tech Type","Country","HQ Location","Description" 2 | "Arianespace/Avio","9","20,000","170.000","8,500","Public","Medium, Heavy","LEO","Rocket","Italy","Colleferro","Developing the Vega & Ariane launch vehicles" 3 | "Astra Space","9","300","3.950","13,167","Public","Small","LEO","Rocket","United States","Alameda, CA","Providing routine launch access to Earth orbit for entrepreneurs and enterprises" 4 | "Black Sky Aerospace","9","350","0.600","1,714","-","Small","Suborbital","Rocket","Australia","Browns Plains, Queensland","Offers a range of sounding rockets, capable of flights up to 300km in multistage configurations" 5 | "Blue Origin","9","0","0.000","0","-","Tourism, Small, Heavy","Suborbital","Rocket","United States","Kent, WA","Lowering the cost of access to space with reusable launch vehicles" 6 | "CNIM Air Space","9","2,700","0.000","0","Public","Small, Medium","Suborbital","Balloon","France","Ayguesvives","Balloons that lift anything from a few kilograms to several tons and are able to operate at an altitude as low as a few hundred metres or as high as 40 km" 7 | "Galactic Energy","9","5,000","0.000","14,400","243.00","Small, Medium","LEO","Rocket","China","-","Developing the Ceres-1 and Pallas-1 launch vehicles" 8 | "Northrop Grumman","9","8,000","80.000","10,000","Public","Small, Medium","LEO","Rocket","United States","Falls Church, VA","Designs, manufactures, and operates launch vehicles, propulsion systems, and satellites and related components" 9 | "Raptor Aerospace","9","16","0.000","0","-","Small","Suborbital","Rocket","United Kingdom","Badersfield, England","Developing suborbital rockets to proide access and research for traditional and ‘New-Space’ markets" 10 | "Rocket Lab","9","300","4.900","16,333","257.31","Small","LEO","Rocket","United States","Long Beach, CA","The first rocket company and launch site for cubesat payloads in New Zealand" 11 | "SpaceX*","9","63,800","90.000","1,411","8410","Heavy, Super Heavy","LEO","Rocket","United States","Hawthorne, CA","SpaceX designs, manufactures, and launches advanced rockets and spacecraft" 12 | "Swedish Space Corp","9","2,000","0.000","0","-","Small","Suborbital","Balloon, Rocket","Sweden","Solna","SSC designs sounding rocket vehicles, stratospheric balloon systems, and payloads" 13 | "United Launch Alliance","9","17,800","99.000","5,562","Joint Venture","Medium, Heavy","LEO","Rocket","United States","Centennial, CO","A joint venture between Lockheed Martin and Boeing, ULA produces the Vulcan, Atlas and Delta Launch Vehicle Families" 14 | "UP Aerospace","9","36","0.000","0","0.72","Small","Suborbital","Rocket","United States","Highlands Ranch, CO","UP Aerospace is a suborbital space launch and flight test service provider" 15 | "Virgin Galactic/The Spaceship Company","9","372","1.500","4,032","Public","Tourism","Suborbital","Spaceplane","United States","Mojave, CA","Building a fleet of WhiteKnightTwo carrier aircraft and SpaceShipTwo reusable spaceships" 16 | "Virgin Orbit/VOX Space","9","500","12.000","24,000","-","Small","LEO","Plane, Rocket","United States","Long Beach, CA","The small payload orbital launch sister company to Virgin Galactic" 17 | "World View","9","4,500","0.000","0","48.68","Small, 
Medium","Suborbital","Balloon","United States","Tucson, AZ","The Stratollite is a remotely operated, navigable vehicle that can remain aloft for days, weeks, and months on end" 18 | "Firefly Aerospace","8","1,000","15.000","15,000","98.16","Small","LEO","Rocket","United States","Cedar Park, TX","Developing the Firefly Alpha launch vehicle; highest payload performance with the lowest cost per kg to orbit in its vehicle class" 19 | "iSpace","8","300","5.000","16,667","275.00","Small","LEO","Rocket","China","Beijing","iSpace is a Chinese private space launch company developing the Hyperbola rocket family" 20 | "Mitsubishi Heavy Industries","8","19,000","112.500","5,921","Public","Medium","LEO","Rocket","Japan","Tokyo, Tokyo","Launch vehicle manufacturer and launch services provider" 21 | "Relativity Space","8","1,250","10.000","8,000","1334.54","Small, Heavy","LEO","Rocket","United States","Los Angeles, CA","Designing orbital-class launch vehicles that are autonomously constructed" 22 | "Sierra Space","8","5,000","0.000","0","1400.00","Medium","LEO","Spaceplane","United States","Broomfield, CO","Developing Dream Chaser, a multi-mission space utility vehicle designed to transport crew and cargo to low-Earth orbit" 23 | "ABL Space Systems","7","1,350","12.000","8,889","420.00","Small","LEO","Rocket","United States","El Segundo, CA","Building rockets to launch small satellites" 24 | "Landspace","7","4,000","0.000","0","361.30","Small, Medium","LEO","Rocket","China","Beijing","Commercial launch vehicle manufacturer and space launch provider in China" 25 | "Stoke Space Technologies","7","500","0.000","0","74.10","Small","LEO","Rocket","United States","Seattle, WA","Developinig the world's first 100% reusable rocket" 26 | "Strato","7","0","0.000","0","-","Small","Suborbital","Balloon","Ukraine","Kiev","Launching sratospheric balloons for research and promotional purposes" 27 | "T-Minus","7","45","0.000","0","-","Small","Suborbital","Rocket","Netherlands","Delft","Offering commercial suborbital platforms for scientific research, commercial applications, and defense" 28 | "Dawn Aerospace","6","100","0.000","0","4.54","Small","LEO","Spaceplane","Netherlands","Delft","Developing scalable and sustainable space transportation technology to become the backbone of the near space economy" 29 | "Deep Blue Aerospace","6","4,500","0.000","0","45.60","Small, Medium","LEO","Rocket","China","Beijing","Committed to reducing the cost of entering space via reusable launch products" 30 | "Exos Aerospace","6","100","0.000","0","9.09","Small","Suborbital","Rocket","United States","Caddo Mills, TX","Selling reusable launch vehicles for small, sub-orbital scientific/experimental payloads" 31 | "Gilmour Space","6","215","7.000","32,558","66.00","Small","LEO","Rocket","Australia","Pimpama, Queensland","Developing the ERIS launch vehicles to provide reliable and cost-effective access to space" 32 | "Honda","6","1,000","0.000","0","Public","Small","LEO","Rocket","Japan","Tokyo, Tokyo","Applying autonomous technology to develop a resuable small rocket" 33 | "Isar Aerospace Technologies","6","1,000","10.000","10,000","180.00","Small","LEO","Rocket","Germany","Ottobrun","Lowering the entry barriers to space and making space access affordable and sustainable" 34 | "Masten Space Systems","6","200","0.000","0","-","Small","Suborbital","Rocket","United States","Mojave, CA","Buildig the Xogdor rocket to test payloads at supersonic speeds and at the edge of space" 35 | "One 
Space","6","200","3.200","16,000","116","Small","LEO","Rocket","China","Beijing","Building a private 3-stage nanosatellite launch vehicle in China" 36 | "PLD Space","6","300","0.000","0","27.14","Small","LEO","Rocket","Spain","Elche","A Euro startup developing tech for suborbital & orbital launch services, dedicated to small payloads and nanosatellites" 37 | "Sea Launch/S7","6","6,160","0.000","0","-","Medium","GTO","Rocket","Russia","-","Integrated launch services for the Zenit Launch Vehicle via a mobile sea platform" 38 | "Skyrora","6","500","12.600","25,200","4.85","Small","LEO","Rocket","United Kingdom","Edinburgh, Scotland","Cost effective small satellite launch services from the United Kingdom" 39 | "Space Forest","6","50","0.000","4,500","-","Small","Suborbital","Rocket","Poland","Gdynia","Commercializing new suborbital rocket technologies" 40 | "Space Perspective*","6","0","0.000","0","65.00","Tourism","Suborbital","Balloon","United States","Kennedy Space Center, FL","The off-world travel company" 41 | "SpinLaunch","6","100","0.500","5,000","150.00","Small","LEO","Other","United States","Long Beach, CA","Small satellite space launch via a kinetic launch system" 42 | "Zero 2 Infinity","6","0","0.000","0","13.83","Tourism","Suborbital","Balloon","Spain","Barcelona","Developing a zero-emission space tourism platform" 43 | "Equatorial Space Systems","5","350","0.000","0","0.75","Small","LEO","Rocket","United States","Albuquerque, NM","Developing Volanis, a series of hybrid small-launch vehicles operated from an ocean platform" 44 | "Hanwha Aerospace","5","500","0.000","0","Public","Small","LEO","Rocket","South Korea","Seoul","Developing a small satellite launcher in cooperation with the Korea Aerospace Research Institute" 45 | "HyImpulse","5","500","7.780","15,560","9.70","Small","LEO","Rocket","Germany","Kochen","Developing an orbital small launcher, powered by a unique, green hybrid propulsion technology" 46 | "INNOSPACE","5","1,000","20.000","20,000","28.40","Small","LEO","Rocket","South Korea","Sejong City","Developing hybrid small satellite launch vehicles" 47 | "Interstellar Technologies","5","100","0.440","4,400","0.35","Small","Suborbital","Rocket","Japan","Taiki, Hokkaido","Privately developing rocket engines and suborbital launch vehicles in Japan" 48 | "Launcher","5","773","10.000","12,937","14.90","Small","LEO","Rocket","United States","Hawthorne, CA","A team on a 10-year journey to deliver small satellites to orbit." 
49 | "LinkSpace","5","200","4.500","22,500","-","Small","LEO","Rocket","China","Beijing","Developing reusable launch vehicle technology for orbital payloads in China" 50 | "Maritime Launch Services","5","5,000","45.000","9,000","-","Medium","LEO","Rocket","Canada","Halifax, Nova Scotia","Building up Canada's NewSpace industry as the first private launch provider in Canso, Nova Scotia" 51 | "Orbex","5","220","0.000","0","39.53","Small","LEO","Rocket","United Kingdom","London, England","Launch services for small, micro and nano satellites" 52 | "Orienspace","5","6,500","0.000","0","171.90","Medium","LEO","Rocket","China","-","Developing the sea-launched Gravity-1 vehicle" 53 | "PD Aerospace","5","6,000","0.000","0","6.75","Tourism, Medium","Suborbital","Spaceplane","Japan","Nagoya, Aichi","Developing a fully-reusable suborbital spacecraft" 54 | "Perigee Aerospace","5","65","2.000","30,769","13.86","Small","LEO","Rocket","South Korea","Daejeon","Developing Blue Whale 1, a micro launch vehicle" 55 | "Reaction Engines","5","17,000","0.000","860","95.37","Medium","LEO","Spaceplane","United Kingdom","Abingdon, England","Developing the re-usable Skylon spaceplane powered by SABRE propulsion" 56 | "Rocket Factory Augsburg (by OHB)","5","1,350","0.000","0","-","Small","LEO","Rocket","Germany","Augsburg","Offering hassle-free space transportation to dedicated orbits" 57 | "Skyroot Aerospace","5","720","0.000","0","68.00","Small","LEO","Rocket","India","Hyderabad, Telangana","Building technologies for responsive, reliable and economic access to space" 58 | "Starchaser","5","186","0.000","0","-","Small","Suborbital","Rocket","United Kingdom","Hyde, England","Building a reusable three-person rocket ship for space tourism" 59 | "Sugarhouse Aerospace","5","50","0.250","5,000","-","Small","Suborbital","Rocket","United States","Springville, UT","Sugarhouse Aerospace is built on a simple idea - space shouldn't be reserved for governments and billionaires" 60 | "Vaya Space","5","1,000","9.000","9,000","9.00","Small","LEO","Rocket","United States","Cocoa, FL","Offering flexible launch capability with a cycle of less than thirty days to build, integrate, and launch" 61 | "Zephalto","5","0","0.000","0","-","Tourism","Suborbital","Balloon","France","Le Pouget","Providing passengers with a trip into the stratosphere" 62 | "Aevum","4","100","0.000","0","-","Small","LEO","Plane, Rocket","United States","Huntsville, AL","Provides earth-to-space space delivery services for small payloads" 63 | "Agnikul Cosmos","4","100","1.200","12,000","14.58","Small","LEO","Rocket","India","Chennai, Tamil Nadu","Designing, manufacturing, and launching orbital-class launch vehicles" 64 | "Bellatrix Aerospace","4","150","2.000","13,333","11.00","Small","LEO","Rocket","India","Bangalore, Karnataka","Orbital launch vehicles (rockets) and electric propulsion systems" 65 | "bluShift Aerospace","4","30","0.000","0","-","Small","LEO","Rocket","United States","Brunswick, ME","Developing a unique line of rockets powered by bio-derived fuels to launch tiny satellites into space" 66 | "CosmoCourse","4","434","1.500","0","-","Tourism","Suborbital","Rocket","Russia","Moscow","Creating a reusable suborbital space complex for tourist flights into space" 67 | "Earth to Sky","4","600","4.500","7,500","5.25","Small","LEO","Rocket","United States","Huntsville, AL","Providing launch services to LEO at an affordable cost" 68 | "Green Launch","4","5","0.000","0","0.50","Small","LEO","Other","United States","Edmond, OK","Green Launch is a technology 
created to revolutionize and expand our access to space" 69 | "Hypersonix","4","150","0.000","0","-","Small","LEO","Rocket, Other","Australia","Sydney, New South Wales","Bringing increased efficiency and flexibility to space launch through the use of cutting edge aerodynamic technology" 70 | "Microcosm","4","160","4.200","26,250","-","Small","LEO","Rocket","United States","Torrance, CA","Developing the Demi-Sprite Launch System" 71 | "OrbitX","4","200","4.800","24,000","0.05","Small","LEO","Rocket","Philippines","Quezon City","Using clean tech to develop a sustainable and cheap rocket called Haribon SLS" 72 | "Pangea Aerospace","4","300","4.540","15,133","7.10","Small","LEO","Rocket","Spain","Barcelona","Enabling Low cost access to space with the Aerospike engined reusable Small satellite launch vehicle ""MESO""" 73 | "Pipeline2Space (by HyperSciences)","4","10","1.000","100,000","-","Small","LEO","Other","United States","Spokane, WA","Using RAM-accelerators to change the economics of space launch" 74 | "Pythom","4","100","1.200","12,000","0.75","Small","LEO","Rocket","United States","Bishop, CA","Pythom is creating a custom-built, two-person craft to explore Mars" 75 | "Radian Aerospace*","4","2,265","0.000","0","27.50","Medium","LEO","Spaceplane","United States","Renton, WA","Developing the Radian One spaceplane to carry people and cargo to and from orbit" 76 | "Space Darts","4","100","0.100","1,000","15.00","Small","LEO","Rocket","Russia","Moscow","Mass production of on-demand launchers for small sats" 77 | "Space Engine Systems","4","5,500","0.000","0","22.00","Medium","LEO","Spaceplane","Canada","Edmonton, Alberta","Designing a reusable, single-stage-to-orbit cruise vehicle" 78 | "Space Pioneer","4","0","0.000","0","100.00","Medium, Heavy","LEO","Rocket","China","-","Developing reusable, medium and heavy orbital launch vehicles" 79 | "Space Transportation (Lingkong Tianxing)","4","1,000","0.000","0","18.00","Small","LEO","Rocket, Spaceplane","China","Beijing","Launcher manufacturer aimed at developing reusable vehicles for small payloads" 80 | "Space Vector","4","0","0.000","0","-","Small","LEO","Rocket","United States","Chatsworth, CA","Space Vector has launched 37 vehicles to date with program turnaround times as short as 8 months" 81 | "Space Walker","4","100","0.000","0","4.98","Small","Suborbital","Spaceplane","Japan","Minato, Tokyo","Developing a sub-orbital spaceplane for scientific missions" 82 | "Stofiel Aerospace","4","250","5.000","20,000","-","Small","LEO","Balloon, Rocket","United States","St. 
Louis, MO","Balloon-based small satellite launcher" 83 | "Stratolaunch","4","0","0.000","0","-","-","LEO","Spaceplane","United States","Seattle, WA","Developing Black Ice, a fully reusable space plane initially optimized for cargo launch" 84 | "TLON Space","4","25","0.000","0","-","Small","LEO","Rocket","Argentina","Buenos Aires","Developng a revolutionarily light small launch vehicle" 85 | "VSAT Aerospace","4","100","3.000","30,000","-","Small","LEO","Rocket","Brazil","Santo André, São Paulo","Delivering payloads to low Earth orbit" 86 | "WAGNER Star Industries","4","1,500","3.000","2,000","-","Small","LEO","Spaceplane","United States","St Petersburg, FL","WAGNER is a simple, reliable, low-cost launch systems company" 87 | "X-Bow Launch Systems","4","0","0.000","0","-","Small","LEO","Rocket","United States","Huntsville, AL","Dedicated to providing affordable access to orbit for commercial and government payloads" 88 | "0-G Launch","3","0","0.000","25,000","-","Small","LEO","Plane, Rocket","United States","Washington, D.C.","Developing the Space Jet air-launch platform" 89 | "Acrux Aerospace Technologies","3","25","1.000","40,000","0.05","Small","LEO","Rocket","Brazil","Iguassu Falls, Paraná","Designing the Montenegro nanolauncher for launches at a very competitive price" 90 | "Advanced Rockets Corp","3","25,000","4.100","164","0.07","Heavy","LEO","Rocket","United States","Los Angeles, CA","The Dynamics Enhanced Launch Vehicle is designed for a life of over 400 reuses with very minimal maintenance and rapid turnarounds" 91 | "ARCA Space Corporation","3","100","1.000","10,000","-","Small","LEO","Rocket","United States","Las Cruces, NM","Single Stage to Orbit (SSTO) rocket leveraging aerospike engine technology launching to low earth orbit" 92 | "Bagaveev","3","10","0.250","25,000","0.96","Small","LEO","Rocket","United States","Half Moon Bay, CA","Dedicated nanosatellite launch provider" 93 | "Beyond Earth","3","30","0.900","30,000","-","Small","LEO","Rocket","United States","Menlo Park, CA","Providing rapid response small satellite launch vehicles for government and commercial customers" 94 | "C6 Launch Systems","3","30","0.000","0","-","Small","LEO","Rocket","Canada","Toronto, Ontario","Provding dedicated launch services for cube and nanosatellites" 95 | "Exodus Space Corp","3","1,200","0.000","0","0.30","Small","LEO","Spaceplane","United States","Denver, CO","Developing the Horizontal Takeoff Horizontal Landing (HTHL) AstroClipper for the future of commercial space transportation" 96 | "Gloyer-Taylor Labs","3","0","0.000","300","-","-","LEO","Rocket","United States","Tullahoma, TN","Developing the Advanced Cryognic Expendable (ACE) launch vehicle" 97 | "Hudson Space Systems","3","80","0.760","9,500","0.10","Small","Suborbital","Rocket","United States","Hoboken, NJ","Developing a next generation of reusable launch vehicles for microgravity research" 98 | "Interorbital Systems","3","40","0.500","12,500","-","Small","LEO","Rocket","United States","Mojave, CA","A rocket, satellite, and spacecraft manufacturing company." 99 | "iRocket","3","1,500","0.000","0","-","Small","LEO","Rocket","United States","New Hyde Park, NY","Building the first fully autonomous, and fully reusable small launch vehicle for affordable access to space" 100 | "JP Aerospace","3","0","0.000","0","-","-","Suborbital","Balloon","United States","Rancho Cordova, CA","JP Aerospace is a volunteer-based DIY Space Program. Home of PongSat and Airship to Orbit . 
We invented the better sandbag" 101 | "Laros","3","150","3.000","20,000","-","Small","LEO","Rocket","Russia","Moscow","Developing the LAROS-RC2 orbital carrier and accompanying mobile launch infrastructure" 102 | "LongShot*","3","0","0.000","0","-","Small","LEO","Other","United States","-","Developing a non-traditional hypervelocity launch system" 103 | "Orbit Boy","3","200","0.000","0","-","Small","LEO","Plane, Rocket","United Kingdom","London, England","On-demand air launch system utilizing standard transport aircrafts from any airport" 104 | "Phantom Space","3","450","0.000","0","5.00","Small","LEO","Rocket","United States","Tucson, AZ","Revolutionizing the way we transport satellites and space assets into space" 105 | "Phoenix Launch Systems","3","31","0.750","34,000","0.50","Small","LEO","Rocket","United States","Las Vegas, NV","Developing responsive launch services and products for the nanosatellite industry" 106 | "Promin Aerospace","3","0","0.200","0","0.50","Small","LEO","Rocket","Ukraine","Kiev","Developing a unique ultralight rocket platform" 107 | "Reaction Dynamics","3","150","0.000","30,000","1.15","Small","LEO","Rocket","Canada","St. Laurent, Quebec","Making it affordable for companies and universities to launch small satellites into orbit" 108 | "Ripple Aerospace","3","2,600","0.000","0","-","Medium","LEO","Rocket","Norway","Kristiansand","Bringing back the ""sea-launch"" concept with full-force!" 109 | "RocketStar","3","300","6.000","20,000","-","Small","LEO","Rocket","United States","New York, NY","We offer rideshare and dedicated launch opportunities at severe discounts to other launch providers, and can go from first conversation to launch in 18 months or less" 110 | "Sirius Space Services","3","800","0.000","0","-","Small","LEO","Rocket","France","Puteaux","Developing a range of sustainable, reusable launchers dedicated to the launch of small satellites" 111 | "Smallspark Space Systems","3","400","8.100","20,250","0.23","Small","LEO","Rocket","United Kingdom","Cardiff, Wales","Developing vertical launch systems to provide the ability for the United Kingdom to effectively, reliably and affordably deliver satellites into orbit from the planned major UK vertical launch spaceports" 112 | "Space One (by Canon)","3","240","0.000","0","Joint Venture","Small","LEO","Rocket","Japan","Tokyo, Tokyo","Joint venture between Canon Electronics, IHI Aerospace, Shimizu, and Development Bank of Japan" 113 | "SpaceRyde","3","0","0.250","0","0.15","Small","LEO","Balloon, Rocket","Canada","Toronto, Ontario","SpaceRyde offers affordable, on-schedule, dedicated launch for small sats" 114 | "Thor Launch Systems (by 8 Rivers)","3","0","0.000","0","-","-","LEO","Other","United States","Durham, NC","Enabling the mass commercialization of space by moving beyond chemical rockets" 115 | "TiSPACE","3","390","0.000","0","-","Small","LEO","Rocket","Taiwan","Chunan","Developing a cost-effective space launch system using cutting-edge, non-explosive hybrid rocket technologies" 116 | "Venture Orbital Systems","3","80","0.000","0","0.91","Small","LEO","Rocket","France","Paris","Developing a reactive, reliable and cost-efficient nano-launcher" 117 | "Astraius","2","800","0.000","0","-","Small","LEO","Plane, Rocket","United Kingdom","London, England","Developing the UK’s first commercially operated horizontal launch company for small to medium size satellites" 118 | "Astron Systems","2","360","1.500","4,167","-","Small","LEO","Rocket","United Kingdom","Thame, England","Bringing about the next generation of 
low-cost, reusable launch vehicles" 119 | "B2Space","2","150","0.000","0","-","Small","LEO","Balloon, Rocket","United Kingdom","Bristol, England","Developing a balloon-launched vehicle for smallsat payloads" 120 | "Black Arrow Space Technologies","2","500","10.000","20,000","0.10","Small","LEO","Rocket","United Kingdom","Swindon, England","Developing a unique seaborne launch system" 121 | "Celestia Aerospace","2","16","0.200","12,500","-","Small","LEO","Plane, Rocket","Spain","Barcelona","Personalized engineering support and dedicated airborne orbital launch platform" 122 | "CloudIX","2","22","0.000","0","-","Small","LEO","Balloon, Rocket","United States","Hayward, CA","Ultra low-cost rockets and satellite deployment" 123 | "CubeCab","2","5","0.250","50,000","-","Small","LEO","Rocket","United States","Mountain View, CA","LEO launch provider for 3U cubesats" 124 | "EOS-X","2","0","0.000","0","-","Tourism","Suborbital","Balloon","Spain","Madrid","Revolutionizing near space tourism and opening it to a greater audience" 125 | "ESC Aerospace","2","0","0.000","50","-","-","LEO","Other","United States","Orlando, FL","Enabling transportation to LEO" 126 | "Eutropia","2","25","0.125","5,000","-","Small","Suborbital","Rocket","Australia","Melbourne, Victoria","Reusable hybrid rocketry" 127 | "Fenix Space","2","0","0.000","0","-","-","LEO","Plane, Rocket","United States","San Bernardino, CA","Developing a reliable tow-glider launch system" 128 | "Fore Dynamics","2","200","0.000","0","-","Small","LEO","Rocket","United States","Placentia, CA","Affordable and reliable small satellite launch system for LEO, SSO, and GEO missions" 129 | "HyPrSpace","2","250","0.000","0","-","Small","LEO","Rocket","France","Le Haillan","Developing a fleet of hybrid rockets for micro launch" 130 | "Independence-X Aerospace","2","200","4.500","22,500","-","Small","LEO","Rocket","Malaysia","Kuala Lumpur","Aiming to innovate space exploration technologies to deliver efficient and cost effective solutions for humanity" 131 | "IO Aircraft","2","63,500","28.000","441","-","Medium, Heavy","LEO","Spaceplane","United States","Ankeny, IA","Developing single stage to orbit, fixed wing, hypersonic aircraft" 132 | "LIA Aerospace","2","150","0.000","25,000","-","Small","LEO","Rocket","Argentina","-","Creating the small-sat launch vehicle of the future" 133 | "New Rocket Technologies","2","500","9.000","18,000","-","Small","LEO","Rocket","Russia","Moscow","Building light, two-stage rockets for small satellite launch" 134 | "Odyne Space","2","150","0.000","0","-","Small","LEO","Rocket","United States","Portland, OR","Cheaper access to space for nanosatellite payloads" 135 | "OmSpace Rocket and Exploration","2","0","0.000","0","-","Small","LEO","Rocket","India","Bharuch, Gujarat","Building a reusable small launch vehicle" 136 | "POLARIS","2","1,150","0.000","0","-","Small","LEO","Spaceplane","Germany","-","Developing a novel, reusable hypersonic and space launch system" 137 | "R3 Aerospace","2","0","0.000","0","-","Small","LEO","Rocket","United States","Tucson, AZ","Developing exciting new space technologies" 138 | "S-Motor","2","160","0.000","0","-","Small","LEO","Rocket","China","Beijing","Developing a three-stage solid-propellant launch vehicle" 139 | "Space Ops","2","10","0.000","0","-","Small","LEO","Rocket","Australia","Redfern, New South Wales","Developing a reusable two-stage orbital-class launch vehicle" 140 | "SpaceHorizon","2","1,200","0.000","0","-","Small","LEO","Rocket","Canada","Ottawa, Ontario","Striving to design, build, 
and launch rockets, in and from Canada" 141 | "STAR Orbitals","2","150","0.000","0","-","Small","LEO","Rocket","India","Surat, Gujarat","Developing the Phoenix small launch vehicle" 142 | "Tachyon Aerospace","2","0","0.000","0","-","-","LEO","Other","United Kingdom","London, England","Developing the Trans-Atmospheric Flight Vehicle (TAV 1)" 143 | "VALT Enterprises","2","25","1.700","68,000","-","Small","LEO","Rocket","United States","Sanford, ME","Hypersonic Delivery Systems: Suborbital & Orbital Missions" 144 | "Vector","2","200","0.000","0","-","Small","LEO","Rocket","United States","Huntington Beach, CA","Developing suborbital and orbital launch vehicles" 145 | "Vellon Space","2","50","0.000","0","-","Small","LEO","Plane, Rocket","India","Madurai, Tamil Nadu","Developing a series of Launch Vehicles based on high-altitude air launch" 146 | "Velontra","2","0","0.000","0","-","-","LEO","Spaceplane","United States","Cincinnati, OH","Building a hypersonic space plane that can takeoff from anywhere in any weather" 147 | "Vogue Aerospace","2","500","2.000","4,000","0.10","Small","LEO","Rocket","United States","Naples, FL","Developing a unique launch vehicle and propulsion system" 148 | "Wave Motion Launch","2","45,360","0.000","0","-","Heavy","LEO","Other","United States","Seattle, WA","Building a space launch system for sending hardened satellites and bulk cargo into space" 149 | "Astromotive Space Technologies","1","0","0.000","0","-","Small","LEO","Other","United States","Salem, OR","Disrupting space launch with a new kinetic launch system" 150 | "AvalonSpace","1","0","0.000","0","-","-","-","Rocket","United Kingdom","Bristol, England","Vertically launching low-cost orbital payloads from the UK" 151 | "CubeSat Solutions","1","50","0.000","0","-","Small","Suborbital","Rocket","Thailand","Chiang Mai","Designer of low-weight, low-cost satellites and small launching vehicles" 152 | "Dark","1","200","5.000","25,000","-","Small","LEO","Plane, Rocket","France","-","Making access to space widely available, less constrained and more efficient" 153 | "Enter Space","1","3,000","0.000","1,000","-","Medium","LEO","Spaceplane","Australia","Sydney, New South Wales","Developing technology for cost effective, reliable, and frequent access to space" 154 | "Forever Space","1","0","0.050","0","-","Small","Lunar","Other","United States","San Francisco, CA","Developing the Otis electric vehicle for civil space transportation" 155 | "Hybrid Propulsion","1","250","4.000","16,000","-","Small","LEO","Rocket","France","Talence","Developing a solution for quick missions to LEO" 156 | "HyperMach Aerospace Holdings","1","45,360","0.000","0","-","Heavy","LEO","Spaceplane","United States","Los Angeles, CA","Designing a single stage to orbit hypersonic vehicle of revolutionary design and propulsion capability" 157 | "Jarmyn Enterprise Space","1","50","0.000","0","-","Small","LEO","Rocket","Australia","Mawson Lakes, South Australia","Building a single-stage to orbit launch system dedicated to small payloads" 158 | "Launch Tech Space","1","0","0.000","0","-","Small","LEO","Other","United States","Pasadena, CA","Developing an eco friendly, electromagnetic space launch system" 159 | "Marcom","1","1,000","8.000","8,000","-","Small","LEO","Rocket","South Africa","Johannesburg","Developing the Cheetah small launch vehicle" 160 | "New Ascent","1","0","0.000","0","0.11","-","-","Other","United States","Annapolis, MD","Redefining launch from the ground up" 161 | "Project 
S3","1","250","0.000","0","-","Small","-","Spaceplane","Croatia","Zagreb","Aiming to provide orbital launches of small satellites for the most affordable prices" 162 | "Rocketplane Global","1","3,000","0.000","0","-","Medium","LEO","Spaceplane","United States","Green Bay, WI","Spaceplane based satellite launch" 163 | "Rocketsan","1","100","0.000","0","-","Small","LEO","Rocket","Turkey","Ankara","Rocketsan is developing a micro-satellite launch system for the Turkish government" 164 | "Seres Space Exploration Technology","1","20,000","0.000","0","-","Heavy","LEO","Rocket","China","Beijing","Developing a reusable launch vehicle named ‘Tianmeng’" 165 | "ShipInSpace","1","20,000","0.000","2,000","-","Heavy","LEO","Rocket","United Kingdom","Witney, England","Working on a very low-cost Launcher to LEO" 166 | "Spaceborne","1","0","0.000","0","-","-","LEO","Rocket","Uzbekistan","Tashkent","Building low cost, single stage and highly reusable launch vehicles" 167 | "SpaceTrek","1","240","0.000","0","-","Small","LEO","Rocket","China","Beijing","Customized launch services for sub-orbital and orbital payloads" 168 | "Stratobooster","1","0","0.000","0","-","Small","LEO","Balloon, Rocket","United Kingdom","Middlesbrough, England","Launching small satellites into space using high altitude balloons" 169 | "Suborbitality","1","75","0.000","5,000","-","Small","Suborbital","Rocket","Czech Republic","Prague","Creating a dedicated, reusable, and low-cost suborbital rocket" 170 | "Tehiru","1","300","0.000","0","-","Small","LEO","Plane, Rocket","United States","New York, NY","Developing a hybrid, modular, affordable & air-launched rocket system that uses electric landing for reusability" 171 | "Trans Space Travels","1","0","0.000","0","-","-","Suborbital","Spaceplane","Germany","Bayreuth","Developing SOL ASPIRET, a suborbital spaceplane" 172 | "Deywoss One","0","200","0.000","0","-","Small","LEO","Spaceplane","United States","Merepoint, ME","Developing PROTEUS, an innovative hybrid and autonomous launcher for small sats" 173 | "Halo Aerospace","0","0","0.000","0","-","-","Suborbital","Balloon, Rocket","United States","West Palm Beach, FL","Providing launch services via HABs and sounding rockets" 174 | "LEO Launcher","0","680","0.000","0","-","Small","LEO","Plane, Rocket","United States","Houston, TX","Long range heavy lift aircraft horizontal launches" 175 | "Merida Aerospace","0","0","0.000","0","-","Small","LEO","Rocket","United States","Tampa, FL","Developing a small launcher" 176 | "Orbspace","0","0","0.000","0","-","Tourism","Suborbital","Rocket","Japan","Tsukuba, Ibaraki","Developing Infinity, a small reusable rocket" 177 | "Pyramid Exploration Technologies","0","0","0.000","0","-","Small","LEO","Rocket","South Africa","Polokwane","Developing high-performance, low carbon micro launch vehicles" 178 | "Space One","0","0","0.000","0","-","-","LEO","Other","Luxembourg","Luxembourg","Developing electromagnetic launch systems to change how we launch payloads into space" 179 | "Space Railway Corporation","0","0","0.000","0","-","-","LEO","Other","United States","Fort Worth, TX","Developing a proposed ""railway"" to space that is safer, cheaper, and greener than rockets" 180 | "SpaceLoon","0","0","0.000","0","-","Small","Suborbital","Balloon","United States","Miami, FL","Building a ballooning platform to offer novel access to the mesosphere" 181 | "SpaceTaxi","0","0","0.000","0","-","Tourism","Suborbital","Rocket","India","Delhi","Aiming to provide affordable space tourism and rocket launch services" 182 | "The 
Rocket & Satellite Company","0","0","0.000","0","-","-","LEO","Rocket","Pakistan","Karachi","Offering low-cost space launch systems" 183 | "Timewarp Space","0","150","0.000","0","-","Small","LEO","Rocket","India","Bangalore, Karnataka","Dedicated to providing frequent and reliable space access" 184 | "United Frontiers","0","500","0.000","2,000","-","Small","LEO","Rocket","United States","Port Canaveral, FL","SpaceBox is a suborbital launch and recovery platform designed to enable affordable access to space for educational, professional, consumer and hobbyist payloads" -------------------------------------------------------------------------------- /SFR Analysis/SFR Analysis.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/SFR Analysis/SFR Analysis.pdf -------------------------------------------------------------------------------- /SFR Analysis/description.md: -------------------------------------------------------------------------------- 1 | # Space Fund Realty (SFR) Analysis 2 | ![](https://c4.wallpaperflare.com/wallpaper/81/233/257/smoke-cape-canaveral-rocket-falcon-9-wallpaper-preview.jpg) 3 | ## Project Overview 4 | The Space Fund Realty (SFR) Analysis project aims to provide valuable insights into aerospace companies and their missions, ultimately assisting investors in making informed decisions. The SFR is a crucial rating system that evaluates companies based on their missions, payload, launch costs, and other factors, providing an indication of their development and stability. The SFR rating ranges from 1 to 9, with higher ratings signifying more developed companies. 5 | 6 | ## Data Dictionary 7 | Here is a description of the dataset used in this project: 8 | 9 | | Column Name | Description | 10 | |---------------------|-------------------------------------------------------| 11 | | Company | Name of the company | 12 | | SFR | SpaceFund Realty rating of the company | 13 | | Payload(kg) | Payload of the mission | 14 | | Launch Cost (million USD) | Launch cost of the mission | 15 | | Price per kg | Price per kg payload of the mission | 16 | | Launch Class | Launch class of the mission | 17 | | Orbit Altitude | Orbit altitude of the mission | 18 | | Tech Type | Technology type of the mission | 19 | | Country | Country of the company | 20 | | HQ Location | Headquarters location of the company | 21 | | Description | Description of the mission | 22 | 23 | ## Conclusion 24 | Based on the exploratory data analysis of the dataset, several key insights were uncovered: 25 | 26 | 1. **Geographical Distribution**: The majority of companies in the dataset were from the United States, resulting in a higher number of well-rated companies with an SFR greater than 6. However, China ranked second, surpassing the United Kingdom in the number of companies with a high SFR rating. 27 | 28 | 2. **Mission Characteristics**: Most of the missions in the dataset were rocket-type, small launch class, and focused on Low Earth Orbit missions. This trend is closely related to the distribution of SFR ratings, with a significant number of missions having SFR ratings between 2 and 3. 29 | 30 | 3. **Launch Cost and Payload Relationship**: A strong correlation was observed between launch cost and payload with SFR ratings. Missions with higher launch costs and payloads tended to have higher SFR ratings. 
This suggests that well-established companies are capable of carrying heavier payloads into space and investing more in their missions. 31 | 32 | 4. **Machine Learning Models**: Decision Tree and Random Forest Classifier models were employed in this project. Both models yielded similar results with an accuracy of 87%. However, it's important to note that due to the relatively small dataset, the models had lower recall scores when predicting SFR ratings greater than 6. Increasing the dataset size could potentially improve model performance in this regard. 33 | 34 | In conclusion, this analysis provides valuable insights for investors looking to make informed decisions about aerospace companies. Understanding the factors influencing SFR ratings can assist in assessing the development and stability of these companies in the dynamic space industry. Additionally, expanding the dataset could further enhance the accuracy and reliability of predictive models. 35 | -------------------------------------------------------------------------------- /Salary Prediction/Salary Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Salary Prediction/Salary Prediction.pdf -------------------------------------------------------------------------------- /Salary Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Salary Prediction based on Country and Race 2 | ![](https://raw.githubusercontent.com/Masterx-AI/Project_Employee_Salary_Prediction_/main/es.jpg) 3 | ## Project Overview: 4 | The focus of the Demographic-Based Salary Prediction project is to develop a predictive model that estimates the salaries of individuals from diverse countries and races based on their demographic attributes. These attributes encompass a range of variables, including occupation, age, gender, experience, and education. The dataset for this project, acquired from Kaggle, comprises 32,561 rows and 15 columns, with 8 independent variables and the target variable, "Salary." 5 | ## About the Dataset: 6 | The dataset provides an expansive compilation of salary and demographic information, augmented by details regarding years of professional experience. It serves as a valuable resource for investigating the intricate relationship between income and various socio-demographic factors. Demographic features such as age, gender, education level, country of origin, and race constitute the foundation for a comprehensive analysis. This dataset empowers researchers to uncover patterns and trends in income distribution across diverse demographic categories, shedding light on potential variations or inequalities in earning potential. Additionally, the dataset incorporates a crucial dimension - "Years of Experience" - offering a lens into the impact of accumulated professional tenure on salary levels. This dynamic facet enables in-depth exploration of how income is influenced by both demographic attributes and the evolution of professional expertise. Overall, the dataset presents an opportunity for conducting exhaustive studies on income diversity and gaining insights into the multifaceted determinants that shape earning prospects in today's workforce. 
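As a rough illustration of the modeling task outlined in the Project Overview, the sketch below encodes the demographic columns named in the data dictionary that follows and fits a regressor to predict Salary. It is a minimal sketch under stated assumptions, not the project's actual notebook: the CSV file path, the 80/20 split, and the RandomForestRegressor choice are placeholders for demonstration only.

```python
# Minimal illustrative sketch -- NOT the project's actual pipeline.
# Assumptions: the CSV path, the 80/20 split, and RandomForestRegressor
# are placeholders chosen only to demonstrate the task.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

df = pd.read_csv("Salary_Data_Based_country_and_race.csv")   # assumed path
df = df.drop(columns=["Unnamed: 0"]).dropna()                 # drop the index column and incomplete rows

categorical = ["Education Level", "Job Title", "Country", "Race"]
numeric = ["Age", "Years of Experience"]

X = df[categorical + numeric]
y = df["Salary"]

# One-hot encode the categorical demographics; pass the numeric columns through unchanged.
preprocess = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), categorical)],
    remainder="passthrough",
)
model = Pipeline([
    ("prep", preprocess),
    ("reg", RandomForestRegressor(n_estimators=200, random_state=42)),
])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model.fit(X_train, y_train)
print("Held-out R^2:", round(r2_score(y_test, model.predict(X_test)), 3))
```

Wrapping the encoder and the regressor in one pipeline keeps the same transformation applied at training and prediction time, which is the main point this sketch is meant to show.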
7 | ## Data Dictionary: 8 | The dataset includes the following columns: 9 | |Column|Description| 10 | |---|---| 11 | |Unnamed: 0|Index| 12 | |Age|Age of the employee| 13 | |Education Level|Education level of the employee| 14 | |Job Title|Job title of the employee| 15 | |Years of Experience|Years of experience of the employee| 16 | |Salary|Salary of the employee| 17 | |Country|Country of the employee| 18 | |Race|Race of the employee| 19 | 20 | By leveraging this dataset, the project seeks to build a robust predictive model that accurately estimates salaries based on demographic attributes, contributing valuable insights to the understanding of income disparities and the complex interplay of factors that influence earnings within diverse demographic contexts. 21 | -------------------------------------------------------------------------------- /Sleep Disorder Prediction/Sleep Disorder Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Sleep Disorder Prediction/Sleep Disorder Prediction.pdf -------------------------------------------------------------------------------- /Sleep Disorder Prediction/Sleep_health_and_lifestyle_dataset.csv: -------------------------------------------------------------------------------- 1 | Person ID,Gender,Age,Occupation,Sleep Duration,Quality of Sleep,Physical Activity Level,Stress Level,BMI Category,Blood Pressure,Heart Rate,Daily Steps,Sleep Disorder 2 | 1,Male,27,Software Engineer,6.1,6,42,6,Overweight,126/83,77,4200,None 3 | 2,Male,28,Doctor,6.2,6,60,8,Normal,125/80,75,10000,None 4 | 3,Male,28,Doctor,6.2,6,60,8,Normal,125/80,75,10000,None 5 | 4,Male,28,Sales Representative,5.9,4,30,8,Obese,140/90,85,3000,Sleep Apnea 6 | 5,Male,28,Sales Representative,5.9,4,30,8,Obese,140/90,85,3000,Sleep Apnea 7 | 6,Male,28,Software Engineer,5.9,4,30,8,Obese,140/90,85,3000,Insomnia 8 | 7,Male,29,Teacher,6.3,6,40,7,Obese,140/90,82,3500,Insomnia 9 | 8,Male,29,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 10 | 9,Male,29,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 11 | 10,Male,29,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 12 | 11,Male,29,Doctor,6.1,6,30,8,Normal,120/80,70,8000,None 13 | 12,Male,29,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 14 | 13,Male,29,Doctor,6.1,6,30,8,Normal,120/80,70,8000,None 15 | 14,Male,29,Doctor,6,6,30,8,Normal,120/80,70,8000,None 16 | 15,Male,29,Doctor,6,6,30,8,Normal,120/80,70,8000,None 17 | 16,Male,29,Doctor,6,6,30,8,Normal,120/80,70,8000,None 18 | 17,Female,29,Nurse,6.5,5,40,7,Normal Weight,132/87,80,4000,Sleep Apnea 19 | 18,Male,29,Doctor,6,6,30,8,Normal,120/80,70,8000,Sleep Apnea 20 | 19,Female,29,Nurse,6.5,5,40,7,Normal Weight,132/87,80,4000,Insomnia 21 | 20,Male,30,Doctor,7.6,7,75,6,Normal,120/80,70,8000,None 22 | 21,Male,30,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 23 | 22,Male,30,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 24 | 23,Male,30,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 25 | 24,Male,30,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 26 | 25,Male,30,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 27 | 26,Male,30,Doctor,7.9,7,75,6,Normal,120/80,70,8000,None 28 | 27,Male,30,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 29 | 28,Male,30,Doctor,7.9,7,75,6,Normal,120/80,70,8000,None 30 | 29,Male,30,Doctor,7.9,7,75,6,Normal,120/80,70,8000,None 31 | 30,Male,30,Doctor,7.9,7,75,6,Normal,120/80,70,8000,None 32 | 31,Female,30,Nurse,6.4,5,35,7,Normal Weight,130/86,78,4100,Sleep Apnea 33 | 
32,Female,30,Nurse,6.4,5,35,7,Normal Weight,130/86,78,4100,Insomnia 34 | 33,Female,31,Nurse,7.9,8,75,4,Normal Weight,117/76,69,6800,None 35 | 34,Male,31,Doctor,6.1,6,30,8,Normal,125/80,72,5000,None 36 | 35,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 37 | 36,Male,31,Doctor,6.1,6,30,8,Normal,125/80,72,5000,None 38 | 37,Male,31,Doctor,6.1,6,30,8,Normal,125/80,72,5000,None 39 | 38,Male,31,Doctor,7.6,7,75,6,Normal,120/80,70,8000,None 40 | 39,Male,31,Doctor,7.6,7,75,6,Normal,120/80,70,8000,None 41 | 40,Male,31,Doctor,7.6,7,75,6,Normal,120/80,70,8000,None 42 | 41,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 43 | 42,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 44 | 43,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 45 | 44,Male,31,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 46 | 45,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 47 | 46,Male,31,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 48 | 47,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 49 | 48,Male,31,Doctor,7.8,7,75,6,Normal,120/80,70,8000,None 50 | 49,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 51 | 50,Male,31,Doctor,7.7,7,75,6,Normal,120/80,70,8000,Sleep Apnea 52 | 51,Male,32,Engineer,7.5,8,45,3,Normal,120/80,70,8000,None 53 | 52,Male,32,Engineer,7.5,8,45,3,Normal,120/80,70,8000,None 54 | 53,Male,32,Doctor,6,6,30,8,Normal,125/80,72,5000,None 55 | 54,Male,32,Doctor,7.6,7,75,6,Normal,120/80,70,8000,None 56 | 55,Male,32,Doctor,6,6,30,8,Normal,125/80,72,5000,None 57 | 56,Male,32,Doctor,6,6,30,8,Normal,125/80,72,5000,None 58 | 57,Male,32,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 59 | 58,Male,32,Doctor,6,6,30,8,Normal,125/80,72,5000,None 60 | 59,Male,32,Doctor,6,6,30,8,Normal,125/80,72,5000,None 61 | 60,Male,32,Doctor,7.7,7,75,6,Normal,120/80,70,8000,None 62 | 61,Male,32,Doctor,6,6,30,8,Normal,125/80,72,5000,None 63 | 62,Male,32,Doctor,6,6,30,8,Normal,125/80,72,5000,None 64 | 63,Male,32,Doctor,6.2,6,30,8,Normal,125/80,72,5000,None 65 | 64,Male,32,Doctor,6.2,6,30,8,Normal,125/80,72,5000,None 66 | 65,Male,32,Doctor,6.2,6,30,8,Normal,125/80,72,5000,None 67 | 66,Male,32,Doctor,6.2,6,30,8,Normal,125/80,72,5000,None 68 | 67,Male,32,Accountant,7.2,8,50,6,Normal Weight,118/76,68,7000,None 69 | 68,Male,33,Doctor,6,6,30,8,Normal,125/80,72,5000,Insomnia 70 | 69,Female,33,Scientist,6.2,6,50,6,Overweight,128/85,76,5500,None 71 | 70,Female,33,Scientist,6.2,6,50,6,Overweight,128/85,76,5500,None 72 | 71,Male,33,Doctor,6.1,6,30,8,Normal,125/80,72,5000,None 73 | 72,Male,33,Doctor,6.1,6,30,8,Normal,125/80,72,5000,None 74 | 73,Male,33,Doctor,6.1,6,30,8,Normal,125/80,72,5000,None 75 | 74,Male,33,Doctor,6.1,6,30,8,Normal,125/80,72,5000,None 76 | 75,Male,33,Doctor,6,6,30,8,Normal,125/80,72,5000,None 77 | 76,Male,33,Doctor,6,6,30,8,Normal,125/80,72,5000,None 78 | 77,Male,33,Doctor,6,6,30,8,Normal,125/80,72,5000,None 79 | 78,Male,33,Doctor,6,6,30,8,Normal,125/80,72,5000,None 80 | 79,Male,33,Doctor,6,6,30,8,Normal,125/80,72,5000,None 81 | 80,Male,33,Doctor,6,6,30,8,Normal,125/80,72,5000,None 82 | 81,Female,34,Scientist,5.8,4,32,8,Overweight,131/86,81,5200,Sleep Apnea 83 | 82,Female,34,Scientist,5.8,4,32,8,Overweight,131/86,81,5200,Sleep Apnea 84 | 83,Male,35,Teacher,6.7,7,40,5,Overweight,128/84,70,5600,None 85 | 84,Male,35,Teacher,6.7,7,40,5,Overweight,128/84,70,5600,None 86 | 85,Male,35,Software Engineer,7.5,8,60,5,Normal Weight,120/80,70,8000,None 87 | 86,Female,35,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 88 | 87,Male,35,Engineer,7.2,8,60,4,Normal,125/80,65,5000,None 89 | 
88,Male,35,Engineer,7.2,8,60,4,Normal,125/80,65,5000,None 90 | 89,Male,35,Engineer,7.3,8,60,4,Normal,125/80,65,5000,None 91 | 90,Male,35,Engineer,7.3,8,60,4,Normal,125/80,65,5000,None 92 | 91,Male,35,Engineer,7.3,8,60,4,Normal,125/80,65,5000,None 93 | 92,Male,35,Engineer,7.3,8,60,4,Normal,125/80,65,5000,None 94 | 93,Male,35,Software Engineer,7.5,8,60,5,Normal Weight,120/80,70,8000,None 95 | 94,Male,35,Lawyer,7.4,7,60,5,Obese,135/88,84,3300,Sleep Apnea 96 | 95,Female,36,Accountant,7.2,8,60,4,Normal,115/75,68,7000,Insomnia 97 | 96,Female,36,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 98 | 97,Female,36,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 99 | 98,Female,36,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 100 | 99,Female,36,Teacher,7.1,8,60,4,Normal,115/75,68,7000,None 101 | 100,Female,36,Teacher,7.1,8,60,4,Normal,115/75,68,7000,None 102 | 101,Female,36,Teacher,7.2,8,60,4,Normal,115/75,68,7000,None 103 | 102,Female,36,Teacher,7.2,8,60,4,Normal,115/75,68,7000,None 104 | 103,Female,36,Teacher,7.2,8,60,4,Normal,115/75,68,7000,None 105 | 104,Male,36,Teacher,6.6,5,35,7,Overweight,129/84,74,4800,Sleep Apnea 106 | 105,Female,36,Teacher,7.2,8,60,4,Normal,115/75,68,7000,Sleep Apnea 107 | 106,Male,36,Teacher,6.6,5,35,7,Overweight,129/84,74,4800,Insomnia 108 | 107,Female,37,Nurse,6.1,6,42,6,Overweight,126/83,77,4200,None 109 | 108,Male,37,Engineer,7.8,8,70,4,Normal Weight,120/80,68,7000,None 110 | 109,Male,37,Engineer,7.8,8,70,4,Normal Weight,120/80,68,7000,None 111 | 110,Male,37,Lawyer,7.4,8,60,5,Normal,130/85,68,8000,None 112 | 111,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 113 | 112,Male,37,Lawyer,7.4,8,60,5,Normal,130/85,68,8000,None 114 | 113,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 115 | 114,Male,37,Lawyer,7.4,8,60,5,Normal,130/85,68,8000,None 116 | 115,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 117 | 116,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 118 | 117,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 119 | 118,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 120 | 119,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 121 | 120,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 122 | 121,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 123 | 122,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 124 | 123,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 125 | 124,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 126 | 125,Female,37,Accountant,7.2,8,60,4,Normal,115/75,68,7000,None 127 | 126,Female,37,Nurse,7.5,8,60,4,Normal Weight,120/80,70,8000,None 128 | 127,Male,38,Lawyer,7.3,8,60,5,Normal,130/85,68,8000,None 129 | 128,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 130 | 129,Male,38,Lawyer,7.3,8,60,5,Normal,130/85,68,8000,None 131 | 130,Male,38,Lawyer,7.3,8,60,5,Normal,130/85,68,8000,None 132 | 131,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 133 | 132,Male,38,Lawyer,7.3,8,60,5,Normal,130/85,68,8000,None 134 | 133,Male,38,Lawyer,7.3,8,60,5,Normal,130/85,68,8000,None 135 | 134,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 136 | 135,Male,38,Lawyer,7.3,8,60,5,Normal,130/85,68,8000,None 137 | 136,Male,38,Lawyer,7.3,8,60,5,Normal,130/85,68,8000,None 138 | 137,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 139 | 138,Male,38,Lawyer,7.1,8,60,5,Normal,130/85,68,8000,None 140 | 139,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 141 | 
140,Male,38,Lawyer,7.1,8,60,5,Normal,130/85,68,8000,None 142 | 141,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 143 | 142,Male,38,Lawyer,7.1,8,60,5,Normal,130/85,68,8000,None 144 | 143,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 145 | 144,Female,38,Accountant,7.1,8,60,4,Normal,115/75,68,7000,None 146 | 145,Male,38,Lawyer,7.1,8,60,5,Normal,130/85,68,8000,Sleep Apnea 147 | 146,Female,38,Lawyer,7.4,7,60,5,Obese,135/88,84,3300,Sleep Apnea 148 | 147,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,Insomnia 149 | 148,Male,39,Engineer,6.5,5,40,7,Overweight,132/87,80,4000,Insomnia 150 | 149,Female,39,Lawyer,6.9,7,50,6,Normal Weight,128/85,75,5500,None 151 | 150,Female,39,Accountant,8,9,80,3,Normal Weight,115/78,67,7500,None 152 | 151,Female,39,Accountant,8,9,80,3,Normal Weight,115/78,67,7500,None 153 | 152,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 154 | 153,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 155 | 154,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 156 | 155,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 157 | 156,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 158 | 157,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 159 | 158,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 160 | 159,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 161 | 160,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 162 | 161,Male,39,Lawyer,7.2,8,60,5,Normal,130/85,68,8000,None 163 | 162,Female,40,Accountant,7.2,8,55,6,Normal Weight,119/77,73,7300,None 164 | 163,Female,40,Accountant,7.2,8,55,6,Normal Weight,119/77,73,7300,None 165 | 164,Male,40,Lawyer,7.9,8,90,5,Normal,130/85,68,8000,None 166 | 165,Male,40,Lawyer,7.9,8,90,5,Normal,130/85,68,8000,None 167 | 166,Male,41,Lawyer,7.6,8,90,5,Normal,130/85,70,8000,Insomnia 168 | 167,Male,41,Engineer,7.3,8,70,6,Normal Weight,121/79,72,6200,None 169 | 168,Male,41,Lawyer,7.1,7,55,6,Overweight,125/82,72,6000,None 170 | 169,Male,41,Lawyer,7.1,7,55,6,Overweight,125/82,72,6000,None 171 | 170,Male,41,Lawyer,7.7,8,90,5,Normal,130/85,70,8000,None 172 | 171,Male,41,Lawyer,7.7,8,90,5,Normal,130/85,70,8000,None 173 | 172,Male,41,Lawyer,7.7,8,90,5,Normal,130/85,70,8000,None 174 | 173,Male,41,Lawyer,7.7,8,90,5,Normal,130/85,70,8000,None 175 | 174,Male,41,Lawyer,7.7,8,90,5,Normal,130/85,70,8000,None 176 | 175,Male,41,Lawyer,7.6,8,90,5,Normal,130/85,70,8000,None 177 | 176,Male,41,Lawyer,7.6,8,90,5,Normal,130/85,70,8000,None 178 | 177,Male,41,Lawyer,7.6,8,90,5,Normal,130/85,70,8000,None 179 | 178,Male,42,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 180 | 179,Male,42,Lawyer,7.8,8,90,5,Normal,130/85,70,8000,None 181 | 180,Male,42,Lawyer,7.8,8,90,5,Normal,130/85,70,8000,None 182 | 181,Male,42,Lawyer,7.8,8,90,5,Normal,130/85,70,8000,None 183 | 182,Male,42,Lawyer,7.8,8,90,5,Normal,130/85,70,8000,None 184 | 183,Male,42,Lawyer,7.8,8,90,5,Normal,130/85,70,8000,None 185 | 184,Male,42,Lawyer,7.8,8,90,5,Normal,130/85,70,8000,None 186 | 185,Female,42,Teacher,6.8,6,45,7,Overweight,130/85,78,5000,Sleep Apnea 187 | 186,Female,42,Teacher,6.8,6,45,7,Overweight,130/85,78,5000,Sleep Apnea 188 | 187,Female,43,Teacher,6.7,7,45,4,Overweight,135/90,65,6000,Insomnia 189 | 188,Male,43,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 190 | 189,Female,43,Teacher,6.7,7,45,4,Overweight,135/90,65,6000,Insomnia 191 | 190,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 192 | 191,Female,43,Teacher,6.7,7,45,4,Overweight,135/90,65,6000,Insomnia 193 | 
192,Male,43,Salesperson,6.4,6,45,7,Overweight,130/85,72,6000,Insomnia 194 | 193,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 195 | 194,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 196 | 195,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 197 | 196,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 198 | 197,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 199 | 198,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 200 | 199,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 201 | 200,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 202 | 201,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Insomnia 203 | 202,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,Insomnia 204 | 203,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,Insomnia 205 | 204,Male,43,Engineer,6.9,6,47,7,Normal Weight,117/76,69,6800,None 206 | 205,Male,43,Engineer,7.6,8,75,4,Overweight,122/80,68,6800,None 207 | 206,Male,43,Engineer,7.7,8,90,5,Normal,130/85,70,8000,None 208 | 207,Male,43,Engineer,7.7,8,90,5,Normal,130/85,70,8000,None 209 | 208,Male,43,Engineer,7.7,8,90,5,Normal,130/85,70,8000,None 210 | 209,Male,43,Engineer,7.7,8,90,5,Normal,130/85,70,8000,None 211 | 210,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 212 | 211,Male,43,Engineer,7.7,8,90,5,Normal,130/85,70,8000,None 213 | 212,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 214 | 213,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 215 | 214,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 216 | 215,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 217 | 216,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 218 | 217,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 219 | 218,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,None 220 | 219,Male,43,Engineer,7.8,8,90,5,Normal,130/85,70,8000,Sleep Apnea 221 | 220,Male,43,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,Sleep Apnea 222 | 221,Female,44,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 223 | 222,Male,44,Salesperson,6.4,6,45,7,Overweight,130/85,72,6000,Insomnia 224 | 223,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 225 | 224,Male,44,Salesperson,6.4,6,45,7,Overweight,130/85,72,6000,Insomnia 226 | 225,Female,44,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 227 | 226,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 228 | 227,Female,44,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 229 | 228,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 230 | 229,Female,44,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 231 | 230,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 232 | 231,Female,44,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 233 | 232,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 234 | 233,Female,44,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 235 | 234,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 236 | 235,Female,44,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 237 | 236,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 238 | 237,Male,44,Salesperson,6.4,6,45,7,Overweight,130/85,72,6000,Insomnia 239 | 238,Female,44,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 240 | 239,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 241 | 240,Male,44,Salesperson,6.4,6,45,7,Overweight,130/85,72,6000,Insomnia 242 | 
241,Female,44,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 243 | 242,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 244 | 243,Male,44,Salesperson,6.4,6,45,7,Overweight,130/85,72,6000,Insomnia 245 | 244,Female,44,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 246 | 245,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 247 | 246,Female,44,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 248 | 247,Male,44,Salesperson,6.3,6,45,7,Overweight,130/85,72,6000,Insomnia 249 | 248,Male,44,Engineer,6.8,7,45,7,Overweight,130/85,78,5000,Insomnia 250 | 249,Male,44,Salesperson,6.4,6,45,7,Overweight,130/85,72,6000,None 251 | 250,Male,44,Salesperson,6.5,6,45,7,Overweight,130/85,72,6000,None 252 | 251,Female,45,Teacher,6.8,7,30,6,Overweight,135/90,65,6000,Insomnia 253 | 252,Female,45,Teacher,6.8,7,30,6,Overweight,135/90,65,6000,Insomnia 254 | 253,Female,45,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 255 | 254,Female,45,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 256 | 255,Female,45,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 257 | 256,Female,45,Teacher,6.5,7,45,4,Overweight,135/90,65,6000,Insomnia 258 | 257,Female,45,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 259 | 258,Female,45,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 260 | 259,Female,45,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 261 | 260,Female,45,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 262 | 261,Female,45,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,Insomnia 263 | 262,Female,45,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,None 264 | 263,Female,45,Teacher,6.6,7,45,4,Overweight,135/90,65,6000,None 265 | 264,Female,45,Manager,6.9,7,55,5,Overweight,125/82,75,5500,None 266 | 265,Male,48,Doctor,7.3,7,65,5,Obese,142/92,83,3500,Insomnia 267 | 266,Female,48,Nurse,5.9,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 268 | 267,Male,48,Doctor,7.3,7,65,5,Obese,142/92,83,3500,Insomnia 269 | 268,Female,49,Nurse,6.2,6,90,8,Overweight,140/95,75,10000,None 270 | 269,Female,49,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 271 | 270,Female,49,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 272 | 271,Female,49,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 273 | 272,Female,49,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 274 | 273,Female,49,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 275 | 274,Female,49,Nurse,6.2,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 276 | 275,Female,49,Nurse,6.2,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 277 | 276,Female,49,Nurse,6.2,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 278 | 277,Male,49,Doctor,8.1,9,85,3,Obese,139/91,86,3700,Sleep Apnea 279 | 278,Male,49,Doctor,8.1,9,85,3,Obese,139/91,86,3700,Sleep Apnea 280 | 279,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Insomnia 281 | 280,Female,50,Engineer,8.3,9,30,3,Normal,125/80,65,5000,None 282 | 281,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,None 283 | 282,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 284 | 283,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 285 | 284,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 286 | 285,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 287 | 286,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 288 | 287,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 289 | 288,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 290 | 
289,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 291 | 290,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 292 | 291,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 293 | 292,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 294 | 293,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 295 | 294,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 296 | 295,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 297 | 296,Female,50,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 298 | 297,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 299 | 298,Female,50,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 300 | 299,Female,51,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 301 | 300,Female,51,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 302 | 301,Female,51,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 303 | 302,Female,51,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 304 | 303,Female,51,Nurse,7.1,7,55,6,Normal Weight,125/82,72,6000,None 305 | 304,Female,51,Nurse,6,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 306 | 305,Female,51,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 307 | 306,Female,51,Nurse,6.1,6,90,8,Overweight,140/95,75,10000,Sleep Apnea 308 | 307,Female,52,Accountant,6.5,7,45,7,Overweight,130/85,72,6000,Insomnia 309 | 308,Female,52,Accountant,6.5,7,45,7,Overweight,130/85,72,6000,Insomnia 310 | 309,Female,52,Accountant,6.6,7,45,7,Overweight,130/85,72,6000,Insomnia 311 | 310,Female,52,Accountant,6.6,7,45,7,Overweight,130/85,72,6000,Insomnia 312 | 311,Female,52,Accountant,6.6,7,45,7,Overweight,130/85,72,6000,Insomnia 313 | 312,Female,52,Accountant,6.6,7,45,7,Overweight,130/85,72,6000,Insomnia 314 | 313,Female,52,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 315 | 314,Female,52,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 316 | 315,Female,52,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 317 | 316,Female,53,Engineer,8.3,9,30,3,Normal,125/80,65,5000,Insomnia 318 | 317,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 319 | 318,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 320 | 319,Female,53,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 321 | 320,Female,53,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 322 | 321,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 323 | 322,Female,53,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 324 | 323,Female,53,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 325 | 324,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 326 | 325,Female,53,Engineer,8.3,9,30,3,Normal,125/80,65,5000,None 327 | 326,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 328 | 327,Female,53,Engineer,8.3,9,30,3,Normal,125/80,65,5000,None 329 | 328,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 330 | 329,Female,53,Engineer,8.3,9,30,3,Normal,125/80,65,5000,None 331 | 330,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 332 | 331,Female,53,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 333 | 332,Female,53,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 334 | 333,Female,54,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 335 | 334,Female,54,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 336 | 335,Female,54,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 337 | 336,Female,54,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 338 | 337,Female,54,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 339 | 338,Female,54,Engineer,8.4,9,30,3,Normal,125/80,65,5000,None 340 | 
339,Female,54,Engineer,8.5,9,30,3,Normal,125/80,65,5000,None 341 | 340,Female,55,Nurse,8.1,9,75,4,Overweight,140/95,72,5000,Sleep Apnea 342 | 341,Female,55,Nurse,8.1,9,75,4,Overweight,140/95,72,5000,Sleep Apnea 343 | 342,Female,56,Doctor,8.2,9,90,3,Normal Weight,118/75,65,10000,None 344 | 343,Female,56,Doctor,8.2,9,90,3,Normal Weight,118/75,65,10000,None 345 | 344,Female,57,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,None 346 | 345,Female,57,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 347 | 346,Female,57,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 348 | 347,Female,57,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 349 | 348,Female,57,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 350 | 349,Female,57,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 351 | 350,Female,57,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 352 | 351,Female,57,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 353 | 352,Female,57,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 354 | 353,Female,58,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 355 | 354,Female,58,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 356 | 355,Female,58,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 357 | 356,Female,58,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 358 | 357,Female,58,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 359 | 358,Female,58,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 360 | 359,Female,59,Nurse,8,9,75,3,Overweight,140/95,68,7000,None 361 | 360,Female,59,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,None 362 | 361,Female,59,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 363 | 362,Female,59,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 364 | 363,Female,59,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 365 | 364,Female,59,Nurse,8.2,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 366 | 365,Female,59,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 367 | 366,Female,59,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 368 | 367,Female,59,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 369 | 368,Female,59,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 370 | 369,Female,59,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 371 | 370,Female,59,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 372 | 371,Female,59,Nurse,8,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 373 | 372,Female,59,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 374 | 373,Female,59,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea 375 | 374,Female,59,Nurse,8.1,9,75,3,Overweight,140/95,68,7000,Sleep Apnea -------------------------------------------------------------------------------- /Sleep Disorder Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Sleep Disorder Prediction 2 | ![](https://images.onlymyhealth.com/imported/images/2021/December/20_Dec_2021/big_sleep.jpg) 3 | ## Project Overview: 4 | The main objective of this data science project is to analyze various lifestyle and medical variables of individuals, such as age, BMI, physical activity, sleep duration, blood pressure, etc., and use this information to predict the occurrence and type of sleep disorder they may experience. Sleep disorders, like Insomnia and Sleep Apnea, can have significant impacts on an individual's health and overall well-being. 
By identifying individuals at risk of sleep disorders, appropriate interventions and treatments can be provided to improve their sleep quality and overall health. 5 | ## Dataset Description: 6 | The dataset used for this project is called the "Sleep Health and Lifestyle Dataset." It consists of 374 rows (individuals) and 13 columns (variables) that cover a wide range of information related to sleep patterns and daily habits. The dataset includes the following key features: 7 | 8 | Comprehensive Sleep Metrics: This section allows exploring various sleep-related metrics such as sleep duration, quality of sleep, and factors influencing sleep patterns. 9 | 10 | Lifestyle Factors: The dataset provides insights into lifestyle factors such as physical activity levels, stress levels, and BMI categories, which may have an impact on an individual's sleep health. 11 | 12 | Cardiovascular Health: The dataset includes measurements of blood pressure and heart rate, which are crucial indicators of an individual's cardiovascular health and may have a correlation with sleep disorders. 13 | 14 | Sleep Disorder Analysis: The primary focus of this project is to identify the presence or absence of sleep disorders in individuals. The dataset labels individuals with three categories in the "Sleep Disorder" column: 15 | 16 | - None: Individuals who do not exhibit any specific sleep disorder. 17 | - Insomnia: Individuals who experience difficulty falling asleep or staying asleep, leading to inadequate or poor-quality sleep. 18 | - Sleep Apnea: Individuals who suffer from pauses in breathing during sleep, resulting in disrupted sleep patterns and potential health risks. 19 | ### Data Dictionary 20 | | Column Name | Description | 21 | | --- | --- | 22 | |Person_ID | Unique ID assigned to each person | 23 | |Gender|The gender of the person (Male/Female)| 24 | |Age | Age of the person in years | 25 | |Occupation | The occupation of the person | 26 | |Sleep_duration | The duration of sleep of the person in hours | 27 | |Quality_of_sleep | A subjective rating of the quality of sleep, ranging from 1 to 10| 28 | |Physical_activity | The level of physical activity of the person (minutes of physical activity per day) | 29 | |Stress Level| A subjective rating of the stress level, ranging from 1 to 10 | 30 | |BMI_category | The BMI category of the person (Normal/Normal Weight/Overweight/Obese) | 31 | |Blood_pressure | The blood pressure of the person in mmHg | 32 | |Heart_rate | The heart rate of the person in beats per minute | 33 | |Daily Steps | The number of steps taken by the person per day | 34 | |Sleep_disorder | The presence or absence of a sleep disorder in the person (None, Insomnia, Sleep Apnea) | 35 | 36 | ## Impact 37 | By undertaking this data science project, we aim to provide valuable insights into the factors influencing sleep disorders and develop a model that can help identify individuals at risk, thus promoting better sleep health and overall well-being.
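To make that prediction task concrete, here is a minimal, hedged sketch of how the Sleep Disorder label could be predicted from the columns in the data dictionary above. The RandomForestClassifier, the 80/20 stratified split, and splitting Blood Pressure into systolic/diastolic values are illustrative assumptions, not necessarily what the project notebook does.

```python
# Illustrative baseline only -- the model choice and feature handling below are
# assumptions for demonstration, not the project's actual notebook.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("Sleep_health_and_lifestyle_dataset.csv")

# pandas parses the literal string "None" as NaN by default, so restore it as a class label.
df["Sleep Disorder"] = df["Sleep Disorder"].fillna("None")

# Split "Blood Pressure" strings such as "126/83" into numeric systolic/diastolic columns.
df[["Systolic", "Diastolic"]] = df["Blood Pressure"].str.split("/", expand=True).astype(int)
df = df.drop(columns=["Person ID", "Blood Pressure"])

# One-hot encode the categorical columns; all remaining features are numeric.
X = pd.get_dummies(df.drop(columns=["Sleep Disorder"]),
                   columns=["Gender", "Occupation", "BMI Category"])
y = df["Sleep Disorder"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

Since "None" is the most common label in the data, the per-class precision and recall in the report are worth checking alongside overall accuracy.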
38 | -------------------------------------------------------------------------------- /Telecom Customer Churn Prediction/Telecom Customer Churn Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Telecom Customer Churn Prediction/Telecom Customer Churn Prediction.pdf -------------------------------------------------------------------------------- /Telecom Customer Churn Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Telecom Customer Churn Prediction 2 | ![](https://miro.medium.com/v2/resize:fit:795/0*8Iu_eymr6eR-YuQw) 3 | ## Project Overview: 4 | 5 | The aim of this data science project is to analyze customer demographics, services, tenure, and other variables to predict whether a particular customer will churn or not. Churn, in this context, refers to customers leaving the telecommunications company's services. By understanding the factors that contribute to churn, the company can take proactive measures to retain customers. 6 | 7 | ## Data Dictionary: 8 | 9 | Here is a data dictionary describing the variables used in this project: 10 | 11 | | Variable | Description | 12 | |------------------|------------------------------------------------------------| 13 | | CustomerID | Unique customer ID | 14 | | Gender | Customer's gender | 15 | | SeniorCitizen | Whether the customer is a senior citizen or not (1, 0) | 16 | | Partner | Whether the customer has a partner or not (Yes, No) | 17 | | Dependents | Whether the customer has dependents or not (Yes, No) | 18 | | Tenure | Number of months the customer has stayed with the company | 19 | | PhoneService | Whether the customer has a phone service or not (Yes, No) | 20 | | MultipleLines | Whether the customer has multiple lines or not (Yes, No, No phone service) | 21 | | InternetService | Customer’s internet service provider (DSL, Fiber optic, No) | 22 | | OnlineSecurity | Whether the customer has online security or not (Yes, No, No internet service) | 23 | | OnlineBackup | Whether the customer has online backup or not (Yes, No, No internet service) | 24 | | DeviceProtection | Whether the customer has device protection or not (Yes, No, No internet service) | 25 | | TechSupport | Whether the customer has tech support or not (Yes, No, No internet service) | 26 | | StreamingTV | Whether the customer has streaming TV or not (Yes, No, No internet service) | 27 | | StreamingMovies | Whether the customer has streaming movies or not (Yes, No, No internet service) | 28 | | Contract | The contract term of the customer (Month-to-month, One year, Two years) | 29 | | PaperlessBilling | Whether the customer has paperless billing or not (Yes, No) | 30 | | PaymentMethod | The customer’s payment method (Electronic check, Mailed check, Bank transfer (automatic), Credit card (automatic)) | 31 | | MonthlyCharges | The amount charged to the customer monthly | 32 | | TotalCharges | The total amount charged to the customer | 33 | | Churn | Whether the customer churned or not (Yes or No) | 34 | 35 | ## Conclusion: 36 | 37 | From the exploratory data analysis, several insights have been derived: 38 | 39 | - Senior citizens exhibit a lower churn rate compared to younger customers. 40 | - Customers who are single or do not have dependents tend to have a higher churn rate. 
41 | - Customers generally express higher satisfaction with streaming services compared to services like online backup and device protection, resulting in a lower churn rate for streaming services. 42 | - Tenure has an inverse relationship with churn rate, with customers having a tenure shorter than 5 months having a higher churn rate. 43 | - Customers with month-to-month contracts are more likely to churn compared to those with one or two-year contracts, indicating that longer contract durations are associated with lower churn. 44 | - Customers with higher monthly charges and lower total charges are more likely to churn, suggesting that the company should consider reducing monthly charges to mitigate churn. 45 | - The most important features for predicting customer churn, as determined by feature importance, include tenure, contract type, monthly charges, and total charges. 46 | 47 | Machine learning models, including Decision Tree Classifier, Random Forest Classifier, and K Nearest Neighbors Classifier, were employed in this project. The Random Forest Classifier demonstrated the highest accuracy of 82%, along with a high F1 Score and the lowest mean squared error and mean absolute error. Therefore, the Random Forest Classifier is recommended as a suitable model for predicting customer churn in this telecom company's dataset. 48 | -------------------------------------------------------------------------------- /Titanic Survival Prediction/Titantic Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Titanic Survival Prediction/Titantic Prediction.pdf -------------------------------------------------------------------------------- /Titanic Survival Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Titanic Survivor Prediction 2 | ![](https://storage.googleapis.com/kaggle-competitions/kaggle/3136/logos/header.png) 3 | This data science project aims to build a predictive model to answer the question: "What sorts of people were more likely to survive?" using passenger data from the Titanic disaster. The project will analyze various attributes of the passengers, such as their names, ages, genders, socio-economic classes, and more, to predict the likelihood of survival. 4 | ## Dataset Information 5 | The dataset used in this project contains information about the passengers who were aboard the Titanic during its ill-fated voyage. It includes a range of features, including the passengers' names, ages, genders, socio-economic classes, cabin information, ticket details, and survival status. 6 | #### Data Dictionary 7 | |Variable|Definition|Key| 8 | |--------|----------|---| 9 | |survival| Survival| 0 = No, 1 = Yes| 10 | |pclass| Ticket class| 1 = 1st, 2 = 2nd, 3 = 3rd| 11 | |sex| Sex | 12 | |Age| Age in years| 13 | |sibsp |# of siblings / spouses aboard the Titanic | 14 | |parch |# of parents / children aboard the Titanic | 15 | |ticket |Ticket number | 16 | |fare |Passenger fare | 17 | |cabin |Cabin number | 18 | |embarked |Port of Embarkation| C = Cherbourg, Q = Queenstown, S = Southampton| 19 | ## Objective 20 | The main objective of this project is to develop a predictive model that can accurately determine the likelihood of survival for passengers based on their individual characteristics. 
By analyzing the historical data and identifying patterns, the model will provide insights into the factors that influenced the chances of survival during the Titanic disaster. 21 | ## Approach 22 | The project will involve several steps, including data preprocessing, exploratory data analysis, feature engineering, model selection, and evaluation. The dataset will be carefully analyzed to handle missing values, perform feature engineering, and extract meaningful insights. Various machine learning algorithms, such as logistic regression, decision trees, random forests, or gradient boosting, will be explored and evaluated to determine the most effective model for predicting survival outcomes. 23 | -------------------------------------------------------------------------------- /Warranty Claims Fraud Prediction/Warranty Claims Fraud Prediction.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/Warranty Claims Fraud Prediction/Warranty Claims Fraud Prediction.pdf -------------------------------------------------------------------------------- /Warranty Claims Fraud Prediction/description.md: -------------------------------------------------------------------------------- 1 | # Warranty Claims Fraud Prediction 2 | ![](https://trendsettertelugu.com/wp-content/uploads/2022/12/File-a-warranty-claim.webp) 3 | ## Project Overview 4 | The aim of this data science project is to predict the authenticity of warranty claims by analyzing various factors such as region, product category, claim value, and more. The dataset used for this project was sourced from Kaggle and comprises 358 rows and 21 columns. 5 | 6 | ## Data Dictionary 7 | | Column Name | Description | 8 | |---------------------|-------------------------------------------------| 9 | | Unnamed: 0 | Index | 10 | | Region | Region of the claim | 11 | | State | State of the claim | 12 | | Area | Area of the claim | 13 | | City | City of the claim | 14 | | Consumer_profile | Consumer profile (Business/Personal) | 15 | | Product_category | Product category (Household/Entertainment) | 16 | | Product_type | Product type (AC/TV) | 17 | | AC_1001_Issue | Issue with AC component 1 (0 - No issue/No component, 1 - repair, 2 - replacement) | 18 | | AC_1002_Issue | Issue with AC component 2 (0 - No issue/No component, 1 - repair, 2 - replacement) | 19 | | AC_1003_Issue | Issue with AC component 3 (0 - No issue/No component, 1 - repair, 2 - replacement) | 20 | | TV_2001_Issue | Issue with TV component 1 (0 - No issue/No component, 1 - repair, 2 - replacement) | 21 | | TV_2002_Issue | Issue with TV component 2 (0 - No issue/No component, 1 - repair, 2 - replacement) | 22 | | TV_2003_Issue | Issue with TV component 3 (0 - No issue/No component, 1 - repair, 2 - replacement) | 23 | | Claim_Value | Claim value in INR | 24 | | Service_Center | Service center code | 25 | | Product_Age | Product age in days | 26 | | Purchased_from | Purchased from (Dealer, Manufacturer, Internet) | 27 | | Call_details | Call duration | 28 | | Purpose | Purpose of the call | 29 | | Fraud | Fraudulent (1) or Genuine (0) | 30 | 31 | ## Conclusion 32 | From the exploratory data analysis, it was found that: 33 | - Warranty claims are most frequent in the southern region of India, particularly in Andhra Pradesh and Tamil Nadu. 34 | - Fraudulent claims are more common in urban regions such as Hyderabad and Chennai.
35 | - The dataset includes claims for two products: TVs and ACs. TVs have higher warranty claims when purchased for personal use compared to ACs. 36 | - Fraudulent claims for ACs occur even when there are no issues with the AC parts. 37 | - Fraudulent claims for TVs can occur with or without issues in the TV parts. 38 | - Fraudulent claims are more frequent when purchases are made through the manufacturer. 39 | - Fraudulent claims tend to have higher claim values compared to genuine claims. 40 | - Service center 13 had the highest number of fraudulent claims despite having fewer total warranty claims. 41 | - Fraudulent claims are more frequent when the customer care call duration is less than 3-4 minutes. 42 | 43 | Machine learning models, including Decision Tree Classifier, Random Forest Classifier, and Logistic Regression, were employed for prediction. These models achieved excellent accuracy levels of 91-92%. However, due to the limited number of fraudulent claims and a small dataset size, the models exhibited lower recall scores for fraudulent claims. This issue can be mitigated by collecting more data. 44 | 45 | This project holds the potential to assist in identifying and preventing warranty claims fraud, thereby saving resources and enhancing the efficiency of warranty claim processing. 46 | -------------------------------------------------------------------------------- /nunique.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AI0228/Data-Science-Projects-main/5261fccb336f74d0ca909c541c5258486be498d7/nunique.py -------------------------------------------------------------------------------- /zzzzz---git.cmd: -------------------------------------------------------------------------------- 1 | git add . 2 | git commit -m "mmm" 3 | git push 4 | pause --------------------------------------------------------------------------------