A linear combination of these two functions defines the penalty in elastic net regularization. For 10 points each:
[10h] Name these two functions. Lasso and ridge regression modify least squares regression by incorporating one of these two functions, respectively, into the penalty term.
ANSWER: L1 norm AND L2 norm [accept distance in place of “norm”; accept rectilinear or Manhattan or taxicab in place of “L1”; accept Euclidean or quadratic or square in place of “L2”; accept answers in either order]
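The "linear combination" in the lead-in can be made concrete with a small NumPy sketch. The weighting below follows scikit-learn's `ElasticNet` convention (`alpha` for overall strength, `l1_ratio` for mixing, and a factor of 1/2 on the squared L2 term); the coefficient vector `beta` and the parameter values are arbitrary illustrative choices, not from the question.

```python
import numpy as np

# Hypothetical coefficient vector, chosen for illustration
beta = np.array([0.5, -1.2, 0.0, 3.0])

l1 = np.sum(np.abs(beta))   # L1 (Manhattan/taxicab) norm
l2 = np.sum(beta ** 2)      # squared L2 (Euclidean) norm

# Elastic net penalty: a linear combination of the two norms,
# using scikit-learn's parameterization (alpha, l1_ratio)
alpha, l1_ratio = 1.0, 0.5
penalty = alpha * (l1_ratio * l1 + (1 - l1_ratio) * 0.5 * l2)
print(penalty)  # → 5.0225
```

Setting `l1_ratio = 1` recovers the pure lasso penalty, and `l1_ratio = 0` the pure ridge penalty.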
[10e] In general, lasso regression lacks a closed-form solution since this operation cannot be applied to its penalty function at zero. According to the fundamental theorem of calculus, this operation is the inverse of integration.
ANSWER: differentiation [or derivative]
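By contrast, the ridge penalty is differentiable everywhere, so setting the gradient to zero yields the closed-form solution β = (XᵀX + λI)⁻¹Xᵀy. A minimal sketch on synthetic data (the design matrix, true coefficients, and λ below are all assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_beta = np.array([1.0, -2.0, 0.5])
y = X @ true_beta + 0.1 * rng.normal(size=50)

lam = 1.0  # ridge penalty strength, an assumed value
# Closed-form ridge estimate: solve (X^T X + lam * I) beta = X^T y
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
print(beta_ridge)  # shrunken estimates near [1.0, -2.0, 0.5]
```

No such formula exists for the lasso, whose |β| terms have no derivative at β = 0; it is typically solved iteratively (e.g., by coordinate descent).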
[10m] Regularization techniques like ridge regression help avoid this phenomenon observed in models with high variance. In this phenomenon, models perform well on training data but fail to generalize to unseen data.
ANSWER: overfitting
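The train-versus-test gap described above can be demonstrated with polynomial fits of increasing degree. This is an assumed illustrative setup (noisy sine data, degrees 3 and 12 chosen arbitrarily), not part of the question:

```python
import numpy as np

rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 15)
y_train = np.sin(2 * np.pi * x_train) + 0.3 * rng.normal(size=15)
x_test = np.linspace(0.02, 0.98, 50)
y_test = np.sin(2 * np.pi * x_test)

def train_test_mse(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    p = np.poly1d(np.polyfit(x_train, y_train, degree))
    return (np.mean((p(x_train) - y_train) ** 2),
            np.mean((p(x_test) - y_test) ** 2))

train_lo, test_lo = train_test_mse(3)    # low-variance model
train_hi, test_hi = train_test_mse(12)   # high-variance model
# The degree-12 fit always achieves lower training error (its hypothesis
# space contains the degree-3 fits), but typically generalizes worse.
```

Adding an L2 penalty to the high-degree fit (i.e., ridge regression) shrinks the oscillating coefficients and narrows this gap.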
<CT, Other Science - Computer Science>