Chapter 1 (Introduction): solutions to Exercises 1.1–1.10
- (*) www Consider the sum-of-squares error function given by
$$E(\mathbf{w}) = \frac{1}{2}\sum_{n=1}^{N}\{y(x_n,\mathbf{w}) - t_n\}^2,$$
in which the function $y(x,\mathbf{w})$ is given by the polynomial $y(x,\mathbf{w}) = \sum_{j=0}^{M} w_j x^j$. Show that the coefficients $\mathbf{w} = \{w_i\}$ that minimize this error function are given by the solution to the following set of linear equations
$$\sum_{j=0}^{M} A_{ij} w_j = T_i,$$
where
$$A_{ij} = \sum_{n=1}^{N} (x_n)^{i+j}, \qquad T_i = \sum_{n=1}^{N} (x_n)^{i}\, t_n.$$
Solution: Setting the derivative of $E(\mathbf{w})$ with respect to each $w_i$ to zero,
$$\frac{\partial E}{\partial w_i} = \sum_{n=1}^{N}\Big\{\sum_{j=0}^{M} w_j x_n^{j} - t_n\Big\}\, x_n^{i} = 0,$$
and rearranging gives $\sum_{j=0}^{M}\sum_{n=1}^{N} x_n^{i+j}\, w_j = \sum_{n=1}^{N} x_n^{i}\, t_n$, which is the stated system $\sum_j A_{ij} w_j = T_i$.
- (*) Write down the set of coupled linear equations, analogous to those of Exercise 1.1, satisfied by the coefficients $w_i$ which minimize the regularized sum-of-squares error function given by
$$\tilde{E}(\mathbf{w}) = \frac{1}{2}\sum_{n=1}^{N}\{y(x_n,\mathbf{w}) - t_n\}^2 + \frac{\lambda}{2}\lVert\mathbf{w}\rVert^2.$$
Solution: Setting $\partial\tilde{E}/\partial w_i = 0$ adds the term $\lambda w_i$ to each equation of Exercise 1.1, giving
$$\sum_{j=0}^{M}\{A_{ij} + \lambda I_{ij}\}\, w_j = T_i,$$
where $I_{ij} = 1$ if $i = j$ and $0$ otherwise, and $A_{ij}$, $T_i$ are defined as in Exercise 1.1.
- (**) Suppose that we have three coloured boxes $r$ (red), $b$ (blue), and $g$ (green). Box $r$ contains 3 apples, 4 oranges, and 3 limes, box $b$ contains 1 apple, 1 orange, and 0 limes, and box $g$ contains 3 apples, 3 oranges, and 4 limes. If a box is chosen at random with probabilities $p(r) = 0.2$, $p(b) = 0.2$, $p(g) = 0.6$, and a piece of fruit is removed from the box (with equal probability of selecting any of the items in the box), then what is the probability of selecting an apple? If we observe that the selected fruit is in fact an orange, what is the probability that it came from the green box?
Solution: By the sum rule,
$$p(\text{apple}) = \sum_{k \in \{r,b,g\}} p(\text{apple}\mid k)\,p(k) = \frac{3}{10}(0.2) + \frac{1}{2}(0.2) + \frac{3}{10}(0.6) = 0.34.$$
Similarly $p(\text{orange}) = \frac{4}{10}(0.2) + \frac{1}{2}(0.2) + \frac{3}{10}(0.6) = 0.36$, so by Bayes' theorem
$$p(g \mid \text{orange}) = \frac{p(\text{orange}\mid g)\,p(g)}{p(\text{orange})} = \frac{0.3 \times 0.6}{0.36} = 0.5.$$
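As a numerical cross-check (a sketch, not part of the original solution; the data layout and variable names are my own), the sum rule and Bayes' theorem can be evaluated exactly with rational arithmetic:

```python
from fractions import Fraction as F

p_box = {"r": F(2, 10), "b": F(2, 10), "g": F(6, 10)}  # box-selection probabilities
contents = {          # (apples, oranges, limes) in each box, from the exercise
    "r": (3, 4, 3),
    "b": (1, 1, 0),
    "g": (3, 3, 4),
}

def p_fruit(fruit_idx):
    """Marginal probability of drawing a given fruit (sum rule)."""
    return sum(p_box[b] * F(contents[b][fruit_idx], sum(contents[b]))
               for b in p_box)

p_apple = p_fruit(0)    # -> 17/50 = 0.34
p_orange = p_fruit(1)   # -> 9/25  = 0.36
# Bayes' theorem: p(g | orange) = p(orange | g) p(g) / p(orange)
p_g_given_orange = F(contents["g"][1], sum(contents["g"])) * p_box["g"] / p_orange
print(p_apple, p_orange, p_g_given_orange)
```

Using `Fraction` keeps every intermediate value exact, so the printed results match the hand calculation with no floating-point rounding.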
- (*) www Consider a probability density $p_x(x)$ defined over a continuous variable $x$, and suppose that we make a nonlinear change of variable using $x = g(y)$, so that the density transforms according to (1.27),
$$p_y(y) = p_x(g(y))\,\lvert g'(y)\rvert.$$
By differentiating (1.27), show that the location $\hat{y}$ of the maximum of the density in $y$ is not in general related to the location $\hat{x}$ of the maximum of the density over $x$ by the simple functional relation $\hat{x} = g(\hat{y})$, as a consequence of the Jacobian factor. This shows that the maximum of a probability density (in contrast to a simple function) is dependent on the choice of variable. Verify that, in the case of a linear transformation, the location of the maximum transforms in the same way as the variable itself.
Solution: TODO
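Although the analytic derivation is left as a TODO above, the effect of the Jacobian factor can be illustrated numerically. The sketch below assumes a concrete example of my own choosing, $x \sim \mathcal{N}(\mu, \sigma^2)$ with $x = g(y) = \ln y$ (so $p_y$ is the lognormal density); nothing about this choice comes from the exercise itself:

```python
import numpy as np

mu, sigma = 1.0, 0.5

def p_x(x):
    """Gaussian density N(x | mu, sigma^2)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

# Change of variable x = g(y) = ln(y):  p_y(y) = p_x(ln y) * |d(ln y)/dy| = p_x(ln y) / y
y = np.linspace(0.01, 10.0, 200_000)
p_y = p_x(np.log(y)) / y

mode_y = y[np.argmax(p_y)]         # numerical mode of the transformed density
naive = np.exp(mu)                 # what the simple relation x_hat = g(y_hat) would predict
actual = np.exp(mu - sigma ** 2)   # known closed-form mode of the lognormal density
print(mode_y, naive, actual)
```

The numerical mode lands at $e^{\mu-\sigma^2} \approx 2.12$, not at $e^{\mu} \approx 2.72$: the Jacobian factor $1/y$ shifts the maximum, exactly as the exercise asserts.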
- (*) Using the definition
$$\operatorname{var}[f] = \mathbb{E}\big[(f(x) - \mathbb{E}[f(x)])^2\big],$$
show that $\operatorname{var}[f(x)]$ satisfies
$$\operatorname{var}[f] = \mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2.$$
Solution: Expanding the square inside the expectation,
$$\operatorname{var}[f] = \mathbb{E}\big[f(x)^2 - 2 f(x)\,\mathbb{E}[f(x)] + \mathbb{E}[f(x)]^2\big] = \mathbb{E}[f(x)^2] - 2\,\mathbb{E}[f(x)]^2 + \mathbb{E}[f(x)]^2 = \mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2.$$
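The identity can also be checked exactly on a small discrete distribution (a toy example of my own, with $f(x) = x^2$ chosen arbitrarily):

```python
from fractions import Fraction as F

# A small discrete distribution p(x) and an arbitrary function f
p = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}
f = lambda x: x * x

E_f = sum(p[x] * f(x) for x in p)                        # E[f(x)] = 3/2
var_def = sum(p[x] * (f(x) - E_f) ** 2 for x in p)       # definition form
var_id = sum(p[x] * f(x) ** 2 for x in p) - E_f ** 2     # E[f^2] - E[f]^2
print(var_def, var_id)  # both 9/4
```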
- (*) Show that if two variables $x$ and $y$ are independent, then their covariance is zero.
Solution: In general, $\operatorname{cov}[x,y] = \mathbb{E}[xy] - \mathbb{E}[x]\,\mathbb{E}[y]$, and independence gives $p(x, y) = p(x)p(y)$.
If $p(x, y)$ is a discrete distribution, then
$$\mathbb{E}[xy] = \sum_x \sum_y p(x)p(y)\,xy = \Big(\sum_x p(x)\,x\Big)\Big(\sum_y p(y)\,y\Big) = \mathbb{E}[x]\,\mathbb{E}[y].$$
If $p(x, y)$ is a continuous distribution, then likewise
$$\mathbb{E}[xy] = \iint p(x)p(y)\,xy\,dx\,dy = \mathbb{E}[x]\,\mathbb{E}[y].$$
In either case $\operatorname{cov}[x,y] = \mathbb{E}[xy] - \mathbb{E}[x]\,\mathbb{E}[y] = 0$.
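A minimal exact check of the discrete case, assuming two arbitrary independent distributions of my own choosing:

```python
from fractions import Fraction as F

p_x = {0: F(1, 3), 1: F(2, 3)}
p_y = {-1: F(1, 2), 4: F(1, 2)}

# Independence: the joint factorizes, p(x, y) = p(x) p(y)
E_x = sum(px * x for x, px in p_x.items())
E_y = sum(py * y for y, py in p_y.items())
E_xy = sum(px * py * x * y for x, px in p_x.items() for y, py in p_y.items())

cov = E_xy - E_x * E_y
print(cov)  # 0
```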
- (**) www In this exercise, we prove the normalization condition (1.48) for the univariate Gaussian. To do this, consider the integral
$$I = \int_{-\infty}^{\infty} \exp\left(-\frac{x^2}{2\sigma^2}\right) dx,$$
which we can evaluate by first writing its square in the form
$$I^2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \exp\left(-\frac{x^2 + y^2}{2\sigma^2}\right) dx\,dy.$$
Now make the transformation from Cartesian coordinates $(x, y)$ to polar coordinates $(r, \theta)$, and then substitute $u = r^2$. Show that, by performing the integrals over $\theta$ and $u$, and then taking the square root of both sides, we obtain
$$I = (2\pi\sigma^2)^{1/2}.$$
Finally, use this result to show that the Gaussian distribution $\mathcal{N}(x\mid\mu,\sigma^2)$ is normalized.
Solution: With $x = r\cos\theta$ and $y = r\sin\theta$, the polar-coordinate transformation gives $dx\,dy = r\,dr\,d\theta$ (the extra factor $r$ is the Jacobian of the transformation), so
$$I^2 = \int_0^{2\pi}\int_0^{\infty} \exp\left(-\frac{r^2}{2\sigma^2}\right) r\,dr\,d\theta.$$
Then, using the substitution $u = r^2$, $du = 2r\,dr$,
$$I^2 = 2\pi \int_0^{\infty} \exp\left(-\frac{u}{2\sigma^2}\right)\frac{1}{2}\,du = 2\pi\left[-\sigma^2 \exp\left(-\frac{u}{2\sigma^2}\right)\right]_0^{\infty} = 2\pi\sigma^2,$$
hence $I = (2\pi\sigma^2)^{1/2}$. Finally, with the shift of variable $x \to x - \mu$,
$$\int_{-\infty}^{\infty} \mathcal{N}(x\mid\mu,\sigma^2)\,dx = \frac{1}{(2\pi\sigma^2)^{1/2}} \int_{-\infty}^{\infty} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx = \frac{I}{(2\pi\sigma^2)^{1/2}} = 1.$$
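The normalization can also be confirmed numerically; the sketch below uses a simple trapezoidal rule on a wide grid (the particular $\mu$, $\sigma$ are arbitrary choices of mine):

```python
import numpy as np

mu, sigma = 2.0, 1.5
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 100_001)
pdf = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

# Trapezoidal approximation of the integral of N(x | mu, sigma^2) over the grid
total = float(np.sum(0.5 * (pdf[1:] + pdf[:-1]) * np.diff(x)))
print(total)  # very close to 1.0
```

Truncating at $\pm 10\sigma$ discards a tail mass of order $10^{-23}$, so the result agrees with 1 to well within floating-point integration error.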
- (**) www By using a change of variables, verify that the univariate Gaussian distribution given by (1.46) satisfies
$$\mathbb{E}[x] = \mu.$$
Next, by differentiating both sides of the normalization condition
$$\int_{-\infty}^{\infty} \mathcal{N}(x\mid\mu,\sigma^2)\,dx = 1$$
with respect to $\sigma^2$, verify that the Gaussian satisfies
$$\mathbb{E}[x^2] = \mu^2 + \sigma^2.$$
Finally, show that (1.51) holds.
Solution: With the change of variable $y = x - \mu$,
$$\mathbb{E}[x] = \int_{-\infty}^{\infty} \mathcal{N}(x\mid\mu,\sigma^2)\,x\,dx = \int_{-\infty}^{\infty} \mathcal{N}(y\mid 0,\sigma^2)\,(y + \mu)\,dy = \mu,$$
since $\mathcal{N}(y\mid 0,\sigma^2)\,y$ is odd ($f$ is odd if $f(-y) = -f(y)$), the integral of an odd function over $(-\infty,\infty)$ is 0, and the remaining term integrates to $\mu$ by normalization.
By differentiating both sides of the normalization condition (1.127),
$$\int_{-\infty}^{\infty} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx = (2\pi\sigma^2)^{1/2},$$
with respect to $\sigma^2$, we obtain
$$\int_{-\infty}^{\infty} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) \frac{(x-\mu)^2}{2\sigma^4}\,dx = \frac{(2\pi)^{1/2}}{2\sigma},$$
which rearranges to $\mathbb{E}[(x-\mu)^2] = \sigma^2$, and hence $\mathbb{E}[x^2] = \mathbb{E}[(x-\mu)^2] + 2\mu\,\mathbb{E}[x] - \mu^2 = \mu^2 + \sigma^2$.
At last,
$$\operatorname{var}[x] = \mathbb{E}[x^2] - \mathbb{E}[x]^2 = \mu^2 + \sigma^2 - \mu^2 = \sigma^2,$$
which is (1.51).
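These moment identities can likewise be checked numerically (again an illustrative sketch, with arbitrary $\mu$, $\sigma$ of my own choosing):

```python
import numpy as np

mu, sigma = -1.0, 0.8
x = np.linspace(mu - 12 * sigma, mu + 12 * sigma, 200_001)
pdf = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

def integrate(f):
    """Trapezoidal-rule integral of the sampled integrand f over the grid x."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

E_x = integrate(pdf * x)          # should equal mu
E_x2 = integrate(pdf * x ** 2)    # should equal mu^2 + sigma^2
var = E_x2 - E_x ** 2             # (1.51): should equal sigma^2
print(E_x, E_x2, var)
```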
- (*) www Show that the mode (i.e. the maximum) of the Gaussian distribution (1.46) is given by $\mu$. Similarly, show that the mode of the multivariate Gaussian (1.52) is given by $\boldsymbol{\mu}$.
Solution: Differentiating (1.46) with respect to $x$, we get
$$\frac{d}{dx}\mathcal{N}(x\mid\mu,\sigma^2) = -\mathcal{N}(x\mid\mu,\sigma^2)\,\frac{x - \mu}{\sigma^2},$$
which is positive for $x < \mu$, negative for $x > \mu$, and vanishes only at $x = \mu$, so the mode is $\mu$.
For the multivariate case, differentiating (1.52) with respect to $\mathbf{x}$ gives
$$\nabla_{\mathbf{x}}\,\mathcal{N}(\mathbf{x}\mid\boldsymbol{\mu},\boldsymbol{\Sigma}) = -\mathcal{N}(\mathbf{x}\mid\boldsymbol{\mu},\boldsymbol{\Sigma})\,\boldsymbol{\Sigma}^{-1}(\mathbf{x} - \boldsymbol{\mu}),$$
which vanishes only at $\mathbf{x} = \boldsymbol{\mu}$ since $\boldsymbol{\Sigma}^{-1}$ is nonsingular, so the mode of the multivariate Gaussian is $\boldsymbol{\mu}$.
- (*) www Suppose that the two variables $x$ and $z$ are statistically independent. Show that the mean and variance of their sum satisfy
$$\mathbb{E}[x+z] = \mathbb{E}[x] + \mathbb{E}[z], \qquad \operatorname{var}[x+z] = \operatorname{var}[x] + \operatorname{var}[z].$$
Solution: If $x$ and $z$ are statistically independent, then $p(x, z) = p(x)p(z)$, so
$$\mathbb{E}[x+z] = \iint (x+z)\,p(x)p(z)\,dx\,dz = \mathbb{E}[x] + \mathbb{E}[z],$$
and
$$\operatorname{var}[x+z] = \mathbb{E}[(x+z)^2] - \mathbb{E}[x+z]^2 = \operatorname{var}[x] + \operatorname{var}[z] + 2\big(\mathbb{E}[xz] - \mathbb{E}[x]\,\mathbb{E}[z]\big) = \operatorname{var}[x] + \operatorname{var}[z],$$
since $\mathbb{E}[xz] = \mathbb{E}[x]\,\mathbb{E}[z]$ for independent variables (Exercise 1.6).
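As an exact finite check (a toy example of my own; any two independent discrete distributions would do), we can build the distribution of the sum from the factorized joint and compare moments:

```python
from fractions import Fraction as F

# Two independent discrete variables (values and probabilities chosen arbitrarily)
p_x = {0: F(1, 2), 3: F(1, 2)}
p_z = {1: F(1, 4), 2: F(3, 4)}

def mean(p):
    return sum(prob * v for v, prob in p.items())

def var(p):
    m = mean(p)
    return sum(prob * (v - m) ** 2 for v, prob in p.items())

# Distribution of the sum, using independence: p(x, z) = p(x) p(z)
p_sum = {}
for xv, px in p_x.items():
    for zv, pz in p_z.items():
        p_sum[xv + zv] = p_sum.get(xv + zv, F(0)) + px * pz

print(mean(p_sum) == mean(p_x) + mean(p_z))   # True
print(var(p_sum) == var(p_x) + var(p_z))      # True
```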