You will build every component of a logistic regression pipeline by hand — softmax, loss functions, gradient computation, and parameter updates — using NumPy only (no PyTorch).
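For reference, softmax is usually implemented in its numerically stable form, subtracting the per-row max before exponentiating. A minimal NumPy sketch, assuming logits arrive as an `(N, C)` array:

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax over an (N, C) array of logits.

    Subtracting the per-row max before exponentiating avoids
    overflow without changing the result.
    """
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)
```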
- `cw2_logistic_regression/losses.py` — implement softmax & 4 loss functions
- `cw2_logistic_regression/model.py` — implement forward & predict
- `cw2_logistic_regression/optimizer.py` — implement gradient descent
- `run.py`, `trainer.py`, `config.py` — provided, do not modify
- `common/` — shared data loading, metrics, visualization
- `tests/test_cw2.py` — numerical gradient checks
Submitting on Canvas: zip `cw2_logistic_regression/` (with `outputs/` included) together with your `report.pdf`.
| File | What to implement |
|---|---|
| `losses.py` | `softmax(logits)`, `cross_entropy_loss(logits, labels, num_classes)`, `hinge_loss(logits, labels, num_classes)`, `exponential_loss(logits, labels, num_classes)`, `squared_loss(logits, labels, num_classes)` |
| `model.py` | `LogisticRegression.forward(X)`, `LogisticRegression.predict(X)` |
| `optimizer.py` | `GradientDescent.__init__(...)`, `GradientDescent.step(model, grad_logits, X)` |
Do not modify `run.py`, `config.py`, or `trainer.py`.
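To make the expected shapes concrete, here is a hedged end-to-end sketch of the three files. The attribute names `W` and `b`, the default learning rate, and the convention that each loss returns `(loss, grad_logits)` are assumptions inferred from the `step(model, grad_logits, X)` signature, not the actual skeletons — check the provided code for the real interfaces.

```python
import numpy as np

def softmax(logits):
    # Numerically stable row-wise softmax (see sketch above).
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

def cross_entropy_loss(logits, labels, num_classes):
    """Mean cross-entropy over the batch.

    Assumed return convention, inferred from GradientDescent.step
    taking grad_logits: (scalar loss, grad_logits shaped like logits).
    """
    n = logits.shape[0]
    probs = softmax(logits)                              # (N, C)
    loss = -np.log(probs[np.arange(n), labels] + 1e-12).mean()
    one_hot = np.eye(num_classes)[labels]                # (N, C)
    grad_logits = (probs - one_hot) / n                  # d(mean CE)/d(logits)
    return loss, grad_logits

class LogisticRegression:
    """Linear model producing logits. W and b are assumed attribute
    names; the provided skeleton may differ."""
    def __init__(self, num_features, num_classes):
        self.W = np.zeros((num_features, num_classes))
        self.b = np.zeros(num_classes)

    def forward(self, X):
        return X @ self.W + self.b                       # (N, C) logits

    def predict(self, X):
        return self.forward(X).argmax(axis=1)            # (N,) class labels

class GradientDescent:
    def __init__(self, lr=0.1):                          # lr default is illustrative
        self.lr = lr

    def step(self, model, grad_logits, X):
        # Chain rule through logits = X @ W + b:
        #   dL/dW = X^T @ dL/dlogits,  dL/db = column sums of dL/dlogits
        model.W -= self.lr * (X.T @ grad_logits)
        model.b -= self.lr * grad_logits.sum(axis=0)
```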
```bash
cd code
pip install -r requirements.txt
python setup_data.py   # downloads FashionMNIST as .npy files
cd cw2_logistic_regression
```
Quick mode for debugging (10K samples, 10 epochs):
```bash
python run.py --quick
```
Train with a specific loss function:
```bash
python run.py --loss_type cross_entropy
python run.py --loss_type hinge
python run.py --loss_type exponential
python run.py --loss_type squared
```
Compare all four loss functions (generates the comparison plot for your report):
```bash
python run.py --compare_all
```
Verify your implementation with numerical gradient checks:
```bash
cd code
python -m tests.test_cw2
```
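The gradient checks compare your analytic gradients against finite differences. If you want to debug a single loss yourself, a minimal central-difference check looks roughly like this (the actual test file's interface and tolerances may differ):

```python
import numpy as np

def numerical_grad_check(loss_fn, logits, labels, num_classes, eps=1e-5):
    """Return the max absolute gap between the analytic grad_logits
    and a central finite-difference estimate of it."""
    _, grad = loss_fn(logits, labels, num_classes)
    num_grad = np.zeros_like(logits)
    for idx in np.ndindex(*logits.shape):
        bump = np.zeros_like(logits)
        bump[idx] = eps
        hi, _ = loss_fn(logits + bump, labels, num_classes)
        lo, _ = loss_fn(logits - bump, labels, num_classes)
        num_grad[idx] = (hi - lo) / (2 * eps)
    return np.max(np.abs(grad - num_grad))
```

A correct analytic gradient should agree with the numerical estimate to within a small tolerance (on the order of 1e-6 for well-scaled inputs).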
After running `--compare_all`, inspect `outputs/loss_comparison.png`; this is the comparison plot to include in your report.
Before submitting, check that:

- `losses.py` — all 5 functions implemented and gradient checks pass
- `model.py` — forward and predict implemented
- `optimizer.py` — gradient descent step implemented
- `outputs/` — contains plots from `python run.py --compare_all`
- `report.pdf` — uses the provided template

| Component | Points |
|---|---|
| Softmax + 4 loss functions (`losses.py`) — correctness verified by gradient check | 40 |
| Model forward & predict (`model.py`) | 20 |
| Gradient descent optimizer (`optimizer.py`) | 15 |
| Gradient checks pass (`python -m tests.test_cw2`) | 5 |
| Report — loss comparison analysis, convergence discussion, visualizations | 20 |
| Bonus: L2 regularization (capped at 100 total) | +10 |
| Total | 100 |
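If you attempt the bonus: L2 regularization adds a penalty of `(lambda/2) * ||W||^2` to the loss, which contributes an extra `lambda * W` term to the weight gradient. A sketch of how this could fold into the assumed optimizer step above (the `weight_decay` name and its default are illustrative, not part of the provided code):

```python
import numpy as np

class GradientDescentL2:
    """Gradient descent with an L2 penalty on the weights (bonus sketch)."""
    def __init__(self, lr=0.1, weight_decay=1e-4):
        self.lr = lr
        self.weight_decay = weight_decay      # the lambda in (lambda/2)*||W||^2

    def step(self, model, grad_logits, X):
        # d/dW of (lambda/2)*||W||^2 is lambda*W, added to the data gradient.
        grad_W = X.T @ grad_logits + self.weight_decay * model.W
        model.W -= self.lr * grad_W
        model.b -= self.lr * grad_logits.sum(axis=0)   # bias is typically not decayed
```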