LOGISTIC REGRESSION

Katabathina
Mar 23, 2024

Have you ever wondered if an email is spam, whether an online purchase is fraudulent, or if a picture contains a cat? These are all classification problems, where we want to predict one of two possible outcomes. Enter Logistic Regression, a powerful tool in our data science toolbox!

Imagine a Coin Toss… But More Complicated

Linear regression, a familiar concept, predicts continuous values by fitting a straight line. Logistic regression tackles a different beast: categorical outcomes. Think of flipping a coin: it can only land on heads or tails. Logistic regression works similarly, but instead of outputting values along a straight line, it passes a linear equation through a sigmoid function (an S-shaped curve) to predict the probability of one outcome (e.g., heads) occurring.

The Sigmoid Function: From Numbers to Probabilities

The sigmoid function takes any number as input and squishes it between 0 and 1. As the input increases, the output approaches 1 (highly likely). Conversely, as the input decreases, the output approaches 0 (very unlikely). This magic allows us to interpret the output as a probability!
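
In symbols, the sigmoid is sigma(z) = 1 / (1 + e^(-z)). Here is a minimal sketch in Python (assuming only NumPy) that makes the squashing behavior concrete:

import numpy as np

def sigmoid(z):
    # Map any real number into the open interval (0, 1)
    return 1 / (1 + np.exp(-z))

print(sigmoid(-5))  # ~0.0067 -> very unlikely
print(sigmoid(0))   # 0.5     -> even odds
print(sigmoid(5))   # ~0.9933 -> highly likely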

Building our Logistic Regression Model

Let’s say we want to predict if someone will click on an ad based on their age. Here’s a simplified breakdown:

  1. Independent Variable (X): Age (numerical)
  2. Dependent Variable (y): Clicked (Yes/No, categorical)
  3. Model: We use the sigmoid function to transform a linear equation involving age into a probability of clicking (between 0 and 1).
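
To make step 3 concrete, here is a hand-rolled sketch with made-up coefficients b0 and b1 (a real model learns these from data, as in the scikit-learn example further down):

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

b0, b1 = -3.0, 0.08   # illustrative intercept and age weight, not learned values
age = 45
probability = sigmoid(b0 + b1 * age)  # linear equation pushed through the sigmoid
print(probability)    # ~0.65: roughly a 65% predicted chance of clicking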

Coefficients: Unveiling the Why

The linear equation in our model has a weight (coefficient) associated with age. A positive coefficient means that as age increases, the predicted probability of clicking increases, while a negative coefficient indicates the opposite.
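
With scikit-learn, the learned weight and intercept are exposed as attributes of the fitted model. A small sketch, using the same tiny dataset as the full example below; note that exponentiating the coefficient gives an odds ratio, a common way to read it:

import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[25], [32], [41], [50], [68]])  # ages, as a 2D feature array
y = np.array([1, 0, 1, 0, 1])                 # clicked: 1 = yes, 0 = no

model = LogisticRegression().fit(X, y)
print("Age coefficient:", model.coef_[0][0])
print("Intercept:", model.intercept_[0])
# exp(coefficient) is the odds ratio: the factor by which the odds of
# clicking multiply for each additional year of age
print("Odds ratio per year of age:", np.exp(model.coef_[0][0]))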

Making Predictions with our Model

Once trained on data, we can plug a new age value into the model. The sigmoid function will output a probability between 0 and 1. We can then set a threshold (e.g., 0.7) to classify, as the sketch after this list shows:

  • Probability >= 0.7: Likely to click (predict “Yes”)
  • Probability < 0.7: Unlikely to click (predict “No”)
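
Note that scikit-learn's predict() uses a fixed 0.5 cutoff, so a custom threshold like 0.7 goes through predict_proba instead. A minimal sketch, refitting the same tiny dataset:

import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[25], [32], [41], [50], [68]])
y = np.array([1, 0, 1, 0, 1])
model = LogisticRegression().fit(X, y)

# predict_proba returns [P(No), P(Yes)] per row; keep the "Yes" column
probs = model.predict_proba(np.array([[30], [60]]))[:, 1]
threshold = 0.7  # raising this trades more missed clicks for fewer false alarms
labels = np.where(probs >= threshold, "Yes", "No")
print(list(zip(probs.round(2), labels)))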

Python Code in Action (using the scikit-learn library):

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
import pandas as pd
import matplotlib.pyplot as plt

# Sample data (Age & Clicked); double brackets below keep Age as a 2D
# DataFrame, the shape scikit-learn expects for features
data = pd.DataFrame({'Age': [25, 32, 41, 50, 68], 'Clicked': [1, 0, 1, 0, 1]})

# Split data into training and testing sets; stratify keeps both classes
# represented in each split of this tiny dataset
X_train, X_test, y_train, y_test = train_test_split(
    data[['Age']], data['Clicked'], stratify=data['Clicked'], random_state=42)

# Create and train the Logistic Regression model
model = LogisticRegression()
model.fit(X_train, y_train)

# Make predictions on the test set
predictions = model.predict(X_test)

# Print the predictions and actual values
print("Predicted Clicks:", predictions)
print("Actual Clicks:", y_test.tolist())

# Plot the fitted sigmoid curve (probability of clicking) over the raw data
plt.plot(data['Age'], model.predict_proba(data[['Age']])[:, 1])
plt.scatter(data['Age'], data['Clicked'])
plt.xlabel('Age')
plt.ylabel('Probability of Clicking')
plt.title('Logistic Regression Model')
plt.show()
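
A note on shapes: data[['Age']] (double brackets) returns a one-column DataFrame, which is the 2D layout scikit-learn expects for features, while data['Age'] returns a 1D Series. Passing the DataFrame through train_test_split keeps everything in the right shape without any manual reshaping.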

This code snippet demonstrates:

  • Importing necessary libraries
  • Creating sample data (Age & Clicked)
  • Splitting data into training and testing sets
  • Building and training the Logistic Regression model
  • Making predictions on the test set
  • Plotting the fitted sigmoid probability curve over the data points

Visualizing the Power of Logistic Regression

The graph shows the fitted S-shaped probability curve alongside the raw data points (plotted at 0 for "No" and 1 for "Yes"). Ages where the curve sits above the chosen threshold are classified as likely to click, while ages where it falls below are classified as unlikely.

Logistic Regression: A Stepping Stone

Logistic regression is a powerful tool for binary classification problems. It’s easy to understand, interpret, and implement, making it a great starting point for beginners in data science. As you progress, you can explore more complex algorithms like decision trees and neural networks for tackling a wider range of classification challenges.
