
Add Problem #40: Linear Regression #7

Merged

duoan merged 1 commit into duoan:master from ThierryHJ:feature/add-linear-regression on Mar 8, 2026

Conversation

@ThierryHJ (Contributor) commented Mar 7, 2026

Add Problem #40: Linear Regression

Summary

Adds a new Medium-difficulty problem where users implement linear regression using three approaches:

  1. Closed-form (Normal Equation): solve w = (X^T X)^{-1} X^T y via torch.linalg.lstsq
  2. Gradient Descent from scratch: manual gradient computation, no autograd
  3. PyTorch nn.Linear: standard training loop with loss.backward() + optimizer.step()

All three methods return (w, b) so that y_pred = X @ w + b.
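The three approaches can be sketched as follows. This is a minimal illustration, not the task's reference solution: the function names, and the lr/steps defaults, are assumptions.

```python
import torch

def fit_closed_form(X, y):
    # Normal equation via least squares: append a ones column so the
    # bias is solved jointly with the weights, [X | 1] @ [w; b] ~ y.
    Xa = torch.cat([X, torch.ones(X.shape[0], 1)], dim=1)
    sol = torch.linalg.lstsq(Xa, y.unsqueeze(1)).solution.squeeze(1)
    return sol[:-1], sol[-1]

def fit_gradient_descent(X, y, lr=0.1, steps=2000):
    # Manual MSE gradients, no autograd.
    n, d = X.shape
    w = torch.zeros(d)
    b = torch.zeros(())
    for _ in range(steps):
        err = X @ w + b - y
        w = w - lr * (2.0 / n) * (X.T @ err)
        b = b - lr * (2.0 / n) * err.sum()
    return w, b

def fit_nn_linear(X, y, lr=0.1, steps=2000):
    # Standard PyTorch training loop with autograd.
    model = torch.nn.Linear(X.shape[1], 1)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(X).squeeze(1), y)
        loss.backward()
        opt.step()
    return model.weight.detach().squeeze(0), model.bias.detach().squeeze()
```

Each function returns (w, b) such that y_pred = X @ w + b, matching the contract above.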

Motivation

  • Linear regression is the most common ML interview warm-up question, asked at Meta, Google, Amazon, and others
  • Implementing it three ways tests understanding of the math (normal equation), optimization fundamentals (manual GD), and PyTorch training conventions (nn.Linear + autograd)
  • The "three approaches" format is a popular interview structure that tests depth of understanding

Files Changed

| File | Status | Description |
|---|---|---|
| torch_judge/tasks/linear_regression.py | NEW | Task definition with 6 test cases |
| templates/40_linear_regression.ipynb | NEW | Template notebook (blank stub) |
| solutions/40_linear_regression_solution.ipynb | NEW | Reference solution |
| templates/00_welcome.ipynb | Modified | Added row #40 to Training & Optimization table, updated count 39 → 40 |
| README.md | Modified | Added row #40 to problem table, updated badge 39 → 40 |

Test Cases

| # | Test Name | What It Verifies |
|---|---|---|
| 1 | Closed-form returns correct shapes | (w, b) with w: (D,), b: () |
| 2 | Closed-form finds correct weights | Exact recovery on noiseless data (atol=1e-4) |
| 3 | Gradient descent converges | Near-correct weights after 2000 steps (atol=0.1) |
| 4 | nn.Linear approach works | Training loop produces correct weights (atol=0.1) |
| 5 | All three methods agree | Cross-method consistency on noisy data (atol=0.15) |
| 6 | Closed-form uses no autograd | Verifies requires_grad=False on the output |
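As a self-contained illustration in the spirit of tests 1, 2, and 6: a closed-form solve built from plain tensor ops yields outputs with the expected shapes, exact weights on noiseless data, and no autograd history. The data and the inline solve below are assumptions, not the task's actual test code.

```python
import torch

# Noiseless synthetic data with known weights and bias.
X = torch.randn(50, 2)
y = X @ torch.tensor([1.0, -2.0]) + 0.3
Xa = torch.cat([X, torch.ones(50, 1)], dim=1)    # augment for the bias
sol = torch.linalg.lstsq(Xa, y.unsqueeze(1)).solution.squeeze(1)
w, b = sol[:-1], sol[-1]

assert w.shape == (2,) and b.shape == ()                        # test 1
assert torch.allclose(w, torch.tensor([1.0, -2.0]), atol=1e-4)  # test 2
assert not w.requires_grad                                      # test 6
```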

Conventions Followed

  • Auto-discovery via _registry.py: no manual registration needed
  • Standard 5-cell notebook format (markdown → import → implementation → debug → submit)
  • Colab badge on template notebook
  • {fn} placeholder in all test code
  • torch.allclose() for numerical comparison with appropriate tolerances
  • Difficulty emoji consistent with existing Medium problems (🟡)
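A rough sketch of the {fn} placeholder idea, for readers unfamiliar with the convention. The template text and the render() helper here are hypothetical; the repo's actual template machinery may differ.

```python
# Hypothetical test template: "{fn}" stands in for the name of the
# submitted function, substituted before the test code is executed.
TEST_TEMPLATE = """
import torch
w, b = {fn}(torch.randn(10, 2), torch.randn(10))
assert w.shape == (2,) and b.shape == ()
"""

def render(template, fn_name):
    # str.format replaces {fn} with the submitted function's name.
    return template.format(fn=fn_name)

print(render(TEST_TEMPLATE, "fit_closed_form"))
```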

How to Test

```shell
# From repo root
python3 -c "
from torch_judge.tasks._registry import TASKS
print('linear_regression' in TASKS)  # True
"
```

Open solutions/40_linear_regression_solution.ipynb in JupyterLab and run all cells; the final check('linear_regression') should show 6/6 tests passed.

@duoan duoan merged commit 0531bb6 into duoan:master Mar 8, 2026
1 check passed
@ThierryHJ ThierryHJ deleted the feature/add-linear-regression branch March 15, 2026 00:02
