
Logistic Regression in Python - ML From Scratch 03

15 Sep 2019

In this Machine Learning from Scratch tutorial, we are going to implement the Logistic Regression algorithm using only built-in Python modules and NumPy. We will also learn about the concept and the math behind this popular ML algorithm.
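In brief, logistic regression models the probability of the positive class by passing a linear combination of the inputs through the sigmoid function, and it learns the weights by minimizing the cross-entropy cost with gradient descent. With $n$ training samples, labels $y_i \in \{0, 1\}$, and learning rate $\alpha$, the pieces fit together like this:

$$\hat{y} = \sigma(w \cdot x + b), \qquad \sigma(z) = \frac{1}{1 + e^{-z}}$$

$$J(w, b) = -\frac{1}{n} \sum_{i=1}^{n} \left[ y_i \log \hat{y}_i + (1 - y_i) \log(1 - \hat{y}_i) \right]$$

$$\frac{\partial J}{\partial w} = \frac{1}{n} X^\top (\hat{y} - y), \qquad \frac{\partial J}{\partial b} = \frac{1}{n} \sum_{i=1}^{n} (\hat{y}_i - y_i)$$

$$w \leftarrow w - \alpha \frac{\partial J}{\partial w}, \qquad b \leftarrow b - \alpha \frac{\partial J}{\partial b}$$

These gradients are exactly the dw and db computed in the fit method of the implementation below.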

All algorithms from this course can be found on GitHub together with example tests.

Implementation

```python
import numpy as np


class LogisticRegression:

    def __init__(self, learning_rate=0.001, n_iters=1000):
        self.lr = learning_rate
        self.n_iters = n_iters
        self.weights = None
        self.bias = None

    def fit(self, X, y):
        n_samples, n_features = X.shape

        # init parameters
        self.weights = np.zeros(n_features)
        self.bias = 0

        # gradient descent
        for _ in range(self.n_iters):
            # approximate y with linear combination of weights and x, plus bias
            linear_model = np.dot(X, self.weights) + self.bias
            # apply sigmoid function
            y_predicted = self._sigmoid(linear_model)

            # compute gradients
            dw = (1 / n_samples) * np.dot(X.T, (y_predicted - y))
            db = (1 / n_samples) * np.sum(y_predicted - y)

            # update parameters
            self.weights -= self.lr * dw
            self.bias -= self.lr * db

    def predict(self, X):
        linear_model = np.dot(X, self.weights) + self.bias
        y_predicted = self._sigmoid(linear_model)
        y_predicted_cls = [1 if i > 0.5 else 0 for i in y_predicted]
        return np.array(y_predicted_cls)

    def _sigmoid(self, x):
        return 1 / (1 + np.exp(-x))
```
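With the class defined above in scope, a minimal usage sketch looks like the following. It assumes scikit-learn is available, but only to load the breast cancer dataset and split it into train and test sets; the accuracy helper is defined inline. The small learning rate here is a hedge against the unscaled features of this dataset, not a tuned value.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# binary classification dataset with labels in {0, 1}
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=1234
)

def accuracy(y_true, y_pred):
    # fraction of correctly classified samples
    return np.sum(y_true == y_pred) / len(y_true)

regressor = LogisticRegression(learning_rate=0.0001, n_iters=1000)
regressor.fit(X_train, y_train)
predictions = regressor.predict(X_test)

print("LR classification accuracy:", accuracy(y_test, predictions))
```

Note that np.exp inside _sigmoid can emit overflow warnings for inputs with large magnitude; standardizing the features first (for example with sklearn.preprocessing.StandardScaler) avoids this and usually speeds up convergence.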