Regression with Linear Models

Regression is a type of supervised learning used to predict continuous numerical values.

Unlike classification, where we predict categories, regression estimates a numeric output based on input features.

The following are some example uses of regression:

  • Predicting house prices based on square footage
  • Estimating temperature from weather data
  • Forecasting sales from historical trends

Types of Linear Regression

  • Simple Linear Regression: Uses one feature to predict a target.
  • Multiple Linear Regression: Uses multiple features to predict a target.
  • Regularized Linear Regression: Adds a penalty to reduce overfitting (e.g., Ridge, Lasso); a short scikit-learn sketch follows this list.

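The regularized variants are available in scikit-learn as Ridge and Lasso. Below is a minimal sketch that reuses the toy square-footage/price data from the example in the next section; the alpha value is an arbitrary illustrative choice, and in practice it would be tuned (for example with cross-validation).

from sklearn.linear_model import Ridge, Lasso
import numpy as np

# Toy dataset (same square-footage/price data as the example below)
X = np.array([[1000], [1500], [2000], [2500], [3000]])
y = np.array([200000, 250000, 300000, 350000, 400000])

# Ridge adds an L2 penalty on the coefficients, Lasso an L1 penalty.
# alpha sets the penalty strength; alpha=1.0 here is only an illustrative value.
ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=1.0).fit(X, y)

print("Ridge coefficient:", ridge.coef_, "intercept:", ridge.intercept_)
print("Lasso coefficient:", lasso.coef_, "intercept:", lasso.intercept_)
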
Example: Predicting House Prices

The following example shows how to use linear regression to predict house prices.

Linear Regression Example
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score
import numpy as np

# Sample dataset
X = np.array([[1000], [1500], [2000], [2500], [3000]])  # square footage
y = np.array([200000, 250000, 300000, 350000, 400000])  # prices

# Train/test split (test_size=0.4 keeps at least two test samples,
# which r2_score needs to be well-defined)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.4, random_state=42
)

# Create and train model
model = LinearRegression()
model.fit(X_train, y_train)

# Predictions on the held-out test set
y_pred = model.predict(X_test)

# Evaluation
mse = mean_squared_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)
print("Mean Squared Error:", mse)
print("R² Score:", r2)
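
Once trained, the model can also price a house it has not seen. A brief illustration, assuming the model object fitted above and a hypothetical 1,800 square-foot house:

# Hypothetical new listing, not part of the training data
new_house = np.array([[1800]])
predicted_price = model.predict(new_house)
print("Predicted price:", predicted_price[0])
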
Quiz

Linear regression can predict categories like 'spam' or 'not spam'.

True
False
