
Linear Regression — Interview Questions

A curated bank of conceptual, mathematical, and case-study interview questions on Linear Regression.

How to use this page
Each question is tagged Easy / Medium / Hard. Try to answer each one yourself before revealing the solution. Questions are grouped by theme — Fundamentals, Regularization, Diagnostics, and Tricky cases.

1. Fundamentals & Math

2. Regularization (Ridge, Lasso, Elastic Net)

3. Assumptions & Diagnostics

4. Tricky / Case-Study Questions

Quick-fire one-liners

  • Why intercept? So the model isn't forced through the origin.
  • Bias-variance with λ? ↑λ → ↑bias, ↓variance.
  • Cook's distance? Influence of a single point on all fitted values.
  • Dummy variable trap? Drop one one-hot column to avoid perfect collinearity.
  • Robust loss? Huber — quadratic near 0, linear in the tails.
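The Huber one-liner above is easy to demonstrate: fit OLS and a Huber regressor on the same contaminated data and compare slopes. A minimal sketch with synthetic data (the true slope of 3 and the injected outliers are illustrative assumptions, not from the question bank):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X.ravel() + rng.normal(scale=0.5, size=100)  # true slope = 3
y[:5] += 30  # inject a few large outliers

ols = LinearRegression().fit(X, y)
huber = HuberRegressor().fit(X, y)  # quadratic loss near 0, linear in the tails
# Huber downweights the outliers, so its slope stays near the true 3
```

Because the Huber loss grows only linearly for large residuals, the five corrupted points get far less pull on the fit than under squared error.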
# Quick sklearn cheat-sheet for interviews
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, RidgeCV, Lasso, ElasticNetCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Toy data so the snippets run end-to-end
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)

# OLS
ols = LinearRegression().fit(X, y)

# Ridge with CV — always scale first!
ridge = make_pipeline(StandardScaler(),
                      RidgeCV(alphas=[0.1, 1.0, 10.0])).fit(X, y)

# Lasso for feature selection
lasso = make_pipeline(StandardScaler(), Lasso(alpha=0.1)).fit(X, y)

# Elastic Net — handles correlated features + sparsity
enet = make_pipeline(StandardScaler(),
                     ElasticNetCV(l1_ratio=[.1, .5, .9], cv=5)).fit(X, y)
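To back up the "Lasso for feature selection" line, you can count the coefficients the penalty zeroes out. A sketch on synthetic data (the 20-feature / 3-informative setup and alpha=1.0 are assumptions for illustration):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# toy data: only 3 of 20 features carry signal
X, y = make_regression(n_samples=200, n_features=20, n_informative=3,
                       noise=5.0, random_state=0)
lasso = make_pipeline(StandardScaler(), Lasso(alpha=1.0)).fit(X, y)
coef = lasso.named_steps["lasso"].coef_
print((np.abs(coef) > 1e-8).sum())  # typically only a handful of features survive
```

A ridge fit on the same data would leave all 20 coefficients nonzero — a handy contrast to cite when asked why L1 selects features and L2 does not.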