- What Is Logistic Regression in Python
- Mathematics Involved in Logistic Regression
- Performance Measuring via Confusion Matrices
- Demonstration of Logistic Regression with Python Code

*Logistic Regression is one of the most popular machine learning algorithms, used to predict categorical outcomes.* A binary categorical target has only two outcomes, such as 0/1 or Yes/No.

**This article was published as a part of the Data Science Blogathon.**

- Overview
- What Is Logistic Regression?
- Why Apply Logistic Regression?
- Mathematics Involved in Logistic Regression
- Implementation Of Logistic Regression In Making Predictions
- Measuring Performance
- Key Features Of Logistic Regression
- Types of Logistic Regression
- Python Code Implementation
- Performing Exploratory data analysis:
- Using function to replace null entries
- Filling the missing Age data
- Drop null data
- Create dummy variables for Sex and Embarked columns
- Add dummy variables to the DataFrame and drop non-numeric data
- Print the finalized data set
- Split the data set into x and y data
- Split the data set into training data and test data
- Create the model
- Train the model and create predictions
- Calculate performance metrics
- Generate a confusion matrix

Logistic regression is a regression-type machine learning algorithm deployed to solve classification (categorical) problems.

Problems with binary outcomes, such as Yes/No, 0/1, or True/False, are called classification problems.

Linear regression does not give a good fit line for problems with only two output values (as shown in the figure). Being linear in nature, its line fails to cover such datasets, so its predictions are less accurate.

**For the best fit of categorical datasets, a curve is required. Logistic regression provides this curve because it uses the sigmoid function to make predictions.**

*The main reason the logistic regression curve bends is that it is calculated using the sigmoid function* (also known as the logistic function, since it is used in logistic regression), given by sigmoid(z) = 1 / (1 + e^(-z)).

This is the mathematical function that has the ‘S’-shaped curve. The value of the sigmoid function always lies between 0 and 1, which is why it is deployed to solve categorical problems with two possible values.
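For readers who want to see this concretely, here is a minimal sketch of the sigmoid in Python (using NumPy; this snippet is illustrative and not part of the article's original code):

```python
import numpy as np

def sigmoid(z):
    """Squash any real number into the open interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# The output always lies between 0 and 1
print(sigmoid(0))    # 0.5 (the midpoint of the S-curve)
print(sigmoid(10))   # very close to 1
print(sigmoid(-10))  # very close to 0
```

Large positive inputs are pushed toward 1, large negative inputs toward 0, which is exactly the "S" shape described above.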

*Logistic regression deploys the sigmoid function to make predictions for categorical values.*

It sets a cut-off point, usually 0.5. When the predicted output of the logistic curve exceeds this cut-off, the model outputs the corresponding category for that observation.

For example, in a diabetes prediction model, if the output exceeds the cut-off point, the prediction is Yes for diabetes; if the value is below the cut-off point, the prediction is No.
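The cut-off logic above can be sketched in a couple of lines (the probabilities here are made-up values, purely for illustration):

```python
# Hypothetical predicted probabilities from a logistic model
probabilities = [0.91, 0.32, 0.74, 0.08]
cutoff = 0.5

# Above the cut-off -> class 1 (Yes), otherwise class 0 (No)
labels = [1 if p > cutoff else 0 for p in probabilities]
print(labels)  # [1, 0, 1, 0]
```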

*To measure the performance of a model solving classification problems, the confusion matrix is used*; its structure is shown below.

**– TN stands for True Negatives** (the predicted value (negative) matches the actual value (negative))
**– FP stands for False Positives** (the actual value was negative, but the model predicted a positive value)
**– FN stands for False Negatives** (the actual value was positive, but the model predicted a negative value)
**– TP stands for True Positives** (the predicted value (positive) matches the actual value (positive))

**A good model should not have a high number of False Positives or False Negatives.**
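The four counts can be tallied directly from the definitions above; here is a small pure-Python sketch with toy labels (invented for illustration):

```python
# Toy actual vs. predicted labels (illustrative values only)
actual    = [0, 0, 1, 1, 1, 0, 1, 0]
predicted = [0, 1, 1, 0, 1, 0, 1, 0]

pairs = list(zip(actual, predicted))
tn = sum(1 for a, p in pairs if a == 0 and p == 0)  # negative predicted as negative
fp = sum(1 for a, p in pairs if a == 0 and p == 1)  # negative predicted as positive
fn = sum(1 for a, p in pairs if a == 1 and p == 0)  # positive predicted as negative
tp = sum(1 for a, p in pairs if a == 1 and p == 1)  # positive predicted as positive
print(tn, fp, fn, tp)  # 3 1 1 3
```

Note that scikit-learn's `confusion_matrix` (used later in this article) returns these same counts laid out as [[TN, FP], [FN, TP]].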

1. Logistic regression is one of the most popular machine learning algorithms. It falls under supervised machine learning and is used for predicting a categorical dependent variable from a given set of independent variables.

2. It predicts the output of a categorical variable, which is discrete in nature: Yes or No, 0 or 1, True or False, etc. Instead of giving the exact value as 0 or 1, it outputs a probability that lies between 0 and 1.

3. It is similar to linear regression. The only difference is that linear regression is used for solving regression problems, whereas logistic regression is used for solving classification (categorical) problems.

4. In logistic regression, the ‘S’-shaped logistic (sigmoid) function is used as the fitting curve, which gives outputs lying between 0 and 1.

Binomial logistic regression deals with problems whose target variable has only two possible values, 0 or 1, which can signify Yes/No, True/False, Dead/Alive, and other binary categories.

Multinomial logistic regression deals with problems whose target variable can take 3 or more values that are unordered in nature; the values have no quantitative significance.

For example: Type 1 house, Type 2 house, Type 3 house, etc.

Ordinal logistic regression, like multinomial logistic regression, deals with problems whose target variable takes 3 or more values. The main difference is that, unlike the multinomial case, those values are well ordered and hold quantitative significance.

For example, evaluation of skill as Low, Average, Expert.
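For multinomial logistic regression, the sigmoid generalises to the softmax function, which turns a vector of class scores into probabilities that sum to 1. A minimal sketch (NumPy; the scores are invented for illustration):

```python
import numpy as np

def softmax(z):
    """Generalisation of the sigmoid to 3+ classes: scores -> probabilities."""
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # hypothetical scores for three classes
probs = softmax(scores)
print(probs.sum())     # ~1.0
print(probs.argmax())  # 0 -- the class with the highest score wins
```

With two classes, softmax reduces to the sigmoid, which is why binomial logistic regression is the special case.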

**Python Code Implementation**

**[Note: The dataset used is the Titanic dataset.]**

**Python Code:**

```
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
import seaborn as sns
sns.set()

titanic_data = pd.read_csv('titanic_train.csv')
print(titanic_data.head())
```

**Performing Exploratory data analysis:**

```
sns.heatmap(titanic_data.isnull(), cbar=False)
sns.countplot(x='Survived', data=titanic_data)
sns.countplot(x='Survived', hue='Sex', data=titanic_data)
sns.countplot(x='Survived', hue='Pclass', data=titanic_data)
```

```
sns.boxplot(x='Pclass', y='Age', data=titanic_data)
```

**Using function to replace null entries**

```
def input_missing_age(columns):
    age = columns[0]
    passenger_class = columns[1]
    if pd.isnull(age):
        if passenger_class == 1:
            return titanic_data[titanic_data['Pclass'] == 1]['Age'].mean()
        elif passenger_class == 2:
            return titanic_data[titanic_data['Pclass'] == 2]['Age'].mean()
        elif passenger_class == 3:
            return titanic_data[titanic_data['Pclass'] == 3]['Age'].mean()
    else:
        return age
```

**Filling the missing Age data**

```
titanic_data['Age'] = titanic_data[['Age', 'Pclass']].apply(input_missing_age, axis=1)
```

**Drop null data**

```
titanic_data.drop('Cabin', axis=1, inplace=True)
titanic_data.dropna(inplace=True)
```

**Create dummy variables for Sex and Embarked columns**

```
sex_data = pd.get_dummies(titanic_data['Sex'], drop_first=True)
embarked_data = pd.get_dummies(titanic_data['Embarked'], drop_first=True)
```

**Add dummy variables to the DataFrame and drop non-numeric data**

```
titanic_data = pd.concat([titanic_data, sex_data, embarked_data], axis=1)
titanic_data.drop(['Name', 'PassengerId', 'Ticket', 'Sex', 'Embarked'], axis=1, inplace=True)
```

**Print the finalized data set**

```
titanic_data.head()
```

**Split the data set into x and y data**

```
y_data = titanic_data['Survived']
x_data = titanic_data.drop('Survived', axis=1)
```

**Split the data set into training data and test data**

```
from sklearn.model_selection import train_test_split

x_training_data, x_test_data, y_training_data, y_test_data = train_test_split(x_data, y_data, test_size=0.3)
```

**Create the model**

```
from sklearn.linear_model import LogisticRegression

model = LogisticRegression()
```

**Train the model and create predictions**

```
model.fit(x_training_data, y_training_data)
predictions = model.predict(x_test_data)
```

**Calculate performance metrics**

```
from sklearn.metrics import classification_report

print(classification_report(y_test_data, predictions))
```


```
              precision    recall  f1-score   support

           0       0.83      0.87      0.85       169
           1       0.75      0.68      0.72        98

    accuracy                           0.80       267
   macro avg       0.79      0.78      0.78       267
weighted avg       0.80      0.80      0.80       267
```

**Generate a confusion matrix**

```
from sklearn.metrics import confusion_matrix

print(confusion_matrix(y_test_data, predictions))
```


```
[[145 22]
[ 30 70]]
```
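As a sanity check, the headline metrics can be recomputed by hand from confusion-matrix counts. The sketch below uses four counts in sklearn's [[TN, FP], [FN, TP]] layout, similar to the matrix above (exact values vary run to run because of the random train/test split):

```python
# Confusion-matrix counts in sklearn's layout [[TN, FP], [FN, TP]]
tn, fp, fn, tp = 145, 22, 30, 70

precision = tp / (tp + fp)                   # of predicted positives, how many were right
recall    = tp / (tp + fn)                   # of actual positives, how many were found
accuracy  = (tn + tp) / (tn + fp + fn + tp)  # overall fraction correct

print(round(precision, 2), round(recall, 2), round(accuracy, 2))  # 0.76 0.7 0.81
```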

*The media shown in this article are not owned by Analytics Vidhya and are used at the Author's discretion.*

