Chapter 1 Introduction, Exercise 1-10 Solutions

  1. (*) www Consider the sum-of-squares error function given by

    $$E(\mathbf{w}) = \frac{1}{2}\sum_{n=1}^{N}\{y(x_n, \mathbf{w}) - t_n\}^2$$

    in which the function $y(x, \mathbf{w})$ is given by the polynomial $y(x, \mathbf{w}) = \sum_{j=0}^{M} w_j x^j$. Show that the coefficients $\mathbf{w} = \{w_i\}$ that minimize this error function are given by the solution to the following set of linear equations

    $$\sum_{j=0}^{M} A_{ij} w_j = T_i$$

    where

    $$A_{ij} = \sum_{n=1}^{N} (x_n)^{i+j}, \qquad T_i = \sum_{n=1}^{N} (x_n)^{i} t_n$$

    Solution: Setting the derivative of $E(\mathbf{w})$ with respect to $w_i$ to zero gives

    $$\frac{\partial E}{\partial w_i} = \sum_{n=1}^{N}\Big\{\sum_{j=0}^{M} w_j (x_n)^{j} - t_n\Big\}(x_n)^{i} = 0$$

    and rearranging yields $\sum_{j=0}^{M} \sum_{n=1}^{N} (x_n)^{i+j} w_j = \sum_{n=1}^{N} (x_n)^{i} t_n$, which is exactly $\sum_{j} A_{ij} w_j = T_i$.
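As a quick numerical check of these normal equations, the sketch below (assuming NumPy is available; `fit_polynomial` is a hypothetical helper name) builds $A_{ij}$ and $T_i$ directly from the formulas above and solves the linear system.

```python
import numpy as np

def fit_polynomial(x, t, M):
    """Solve sum_j A_ij w_j = T_i, where A_ij = sum_n x_n^(i+j)
    and T_i = sum_n x_n^i t_n (Exercise 1)."""
    powers = np.arange(M + 1)
    A = np.array([[np.sum(x ** (i + j)) for j in powers] for i in powers])
    T = np.array([np.sum((x ** i) * t) for i in powers])
    return np.linalg.solve(A, T)

# Data generated exactly by a degree-2 polynomial is recovered exactly.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
t = 1.0 + 2.0 * x - 3.0 * x ** 2
w = fit_polynomial(x, t, M=2)
print(np.round(w, 6))  # recovers the true coefficients 1, 2, -3
```

Since the data lie exactly on a degree-2 polynomial and the system is well-conditioned for distinct inputs, the minimizer reproduces the generating coefficients.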

  2. (*) Write down the set of coupled linear equations, analogous to those of Exercise 1, satisfied by the coefficients $w_i$ which minimize the regularized sum-of-squares error function given by

    $$\widetilde{E}(\mathbf{w}) = \frac{1}{2}\sum_{n=1}^{N}\{y(x_n, \mathbf{w}) - t_n\}^2 + \frac{\lambda}{2}\|\mathbf{w}\|^2$$

    Solution: Following the same derivation as in Exercise 1, the regularization term contributes an extra $\lambda w_i$ to the derivative, giving

    $$\sum_{j=0}^{M}\{A_{ij} + \lambda I_{ij}\} w_j = T_i$$

    where $I_{ij} = 1$ if $i = j$ and $0$ otherwise, with $A_{ij}$ and $T_i$ defined as before.
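The only change from the unregularized case is adding $\lambda$ on the diagonal of $A$. A minimal sketch, assuming NumPy (`fit_polynomial_ridge` is a hypothetical helper name):

```python
import numpy as np

def fit_polynomial_ridge(x, t, M, lam):
    """Solve sum_j (A_ij + lam * I_ij) w_j = T_i (Exercise 2)."""
    powers = np.arange(M + 1)
    A = np.array([[np.sum(x ** (i + j)) for j in powers] for i in powers])
    T = np.array([np.sum((x ** i) * t) for i in powers])
    return np.linalg.solve(A + lam * np.eye(M + 1), T)

x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
t = 1.0 + 2.0 * x - 3.0 * x ** 2
w0 = fit_polynomial_ridge(x, t, M=2, lam=0.0)   # lam=0: exact interpolating fit
w1 = fit_polynomial_ridge(x, t, M=2, lam=10.0)  # lam>0: shrunk coefficients
print(np.round(w0, 6), np.linalg.norm(w1) < np.linalg.norm(w0))
```

With $\lambda = 0$ the solution of Exercise 1 is recovered; increasing $\lambda$ shrinks the coefficient norm, which is the point of the penalty term.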

  3. (**) Suppose that we have three coloured boxes $r$ (red), $b$ (blue), and $g$ (green). Box $r$ contains 3 apples, 4 oranges, and 3 limes, box $b$ contains 1 apple, 1 orange, and 0 limes, and box $g$ contains 3 apples, 3 oranges, and 4 limes. If a box is chosen at random with probabilities $p(r) = 0.2$, $p(b) = 0.2$, $p(g) = 0.6$, and a piece of fruit is removed from the box (with equal probability of selecting any of the items in the box), then what is the probability of selecting an apple? If we observe that the selected fruit is in fact an orange, what is the probability that it came from the green box?

    Solution: By the sum rule,

    $$p(\text{apple}) = \sum_{k \in \{r, b, g\}} p(\text{apple} \mid k)\, p(k) = 0.3 \times 0.2 + 0.5 \times 0.2 + 0.3 \times 0.6 = 0.34$$

    Similarly $p(\text{orange}) = 0.4 \times 0.2 + 0.5 \times 0.2 + 0.3 \times 0.6 = 0.36$, so by Bayes’ theorem

    $$p(g \mid \text{orange}) = \frac{p(\text{orange} \mid g)\, p(g)}{p(\text{orange})} = \frac{0.3 \times 0.6}{0.36} = 0.5$$
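The arithmetic can be verified by direct enumeration in plain Python; the box contents follow the exercise statement, and the priors $p(r)=0.2$, $p(b)=0.2$, $p(g)=0.6$ are the standard values for this exercise (PRML Exercise 1.3).

```python
# Direct enumeration of the fruit problem.
boxes = {
    "r": {"apple": 3, "orange": 4, "lime": 3},
    "b": {"apple": 1, "orange": 1, "lime": 0},
    "g": {"apple": 3, "orange": 3, "lime": 4},
}
prior = {"r": 0.2, "b": 0.2, "g": 0.6}

def p_fruit(fruit):
    # Sum rule: p(fruit) = sum_k p(fruit | box k) p(box k)
    return sum(prior[k] * boxes[k][fruit] / sum(boxes[k].values())
               for k in boxes)

p_apple = p_fruit("apple")
# Bayes' theorem: p(g | orange) = p(orange | g) p(g) / p(orange)
p_g_given_orange = (boxes["g"]["orange"] / sum(boxes["g"].values())
                    * prior["g"]) / p_fruit("orange")
print(round(p_apple, 2), round(p_g_given_orange, 2))  # -> 0.34 0.5
```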

  4. (*) www Consider a probability density $p_x(x)$ defined over a continuous variable $x$,
    and suppose that we make a nonlinear change of variable using $x = g(y)$, so that the
    density transforms according to

    $$p_y(y) = p_x(g(y))\, |g'(y)|$$

    By differentiating (1.27), show that the location $\widehat{y}$ of the maximum of the density in $y$ is not in general related to the location $\widehat{x}$ of the maximum of the density over $x$ by the simple functional relation $\widehat{x} = g(\widehat{y})$ as a consequence of the Jacobian factor. This shows that the maximum of a probability density (in contrast to a simple function) is dependent on the choice of variable. Verify that, in the case of a linear transformation, the location of the maximum transforms in the same way as the variable itself.

    Solution: Writing $p_y(y) = s\, p_x(g(y))\, g'(y)$ with $s \in \{-1, +1\}$ fixed on the region of interest, differentiation gives

    $$p_y'(y) = s\, p_x'(g(y))\, g'(y)^2 + s\, p_x(g(y))\, g''(y)$$

    At $y = g^{-1}(\widehat{x})$ the first term vanishes (since $p_x'(\widehat{x}) = 0$), but the second term is in general nonzero because of $g''(y)$, so $p_y'(g^{-1}(\widehat{x})) \neq 0$ and the maximum of $p_y$ does not sit at $g^{-1}(\widehat{x})$. For a linear transformation $g(y) = ay + b$ we have $g''(y) = 0$, so $p_y'(y) = 0$ exactly when $p_x'(g(y)) = 0$, i.e. $\widehat{x} = g(\widehat{y})$.
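A numerical illustration of the mode shift, assuming NumPy; the Gaussian $p_x$ (mean 6, unit variance) and the cubic transformation $g(y) = y^3$ are arbitrary choices for the demo.

```python
import numpy as np

def p_x(x):
    # Gaussian with mean 6, unit variance (normalization is irrelevant here)
    return np.exp(-0.5 * (x - 6.0) ** 2)

# Nonlinear case, g(y) = y^3: the Jacobian factor |3y^2| shifts the mode,
# so argmax p_y(y) differs from g^{-1}(6) = 6^(1/3).
y = np.linspace(1.5, 2.1, 600001)
p_y_nonlinear = p_x(y ** 3) * np.abs(3 * y ** 2)
mode_y = float(y[np.argmax(p_y_nonlinear)])
print(mode_y, 6 ** (1 / 3))  # the two values differ

# Linear case, g(y) = 2y + 1: the constant Jacobian cannot move the mode,
# so argmax p_y(y) is exactly g^{-1}(6) = 2.5.
y2 = np.linspace(0, 5, 500001)
p_y_linear = p_x(2 * y2 + 1) * 2.0
lin_mode = float(y2[np.argmax(p_y_linear)])
print(lin_mode)
```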

  5. (*) Using the definition

    $$\operatorname{var}[f] = \mathbb{E}\big[(f(x) - \mathbb{E}[f(x)])^2\big]$$

    show that $\operatorname{var}[f(x)]$ satisfies

    $$\operatorname{var}[f] = \mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2$$

    Solution: Expanding the square and using the linearity of expectation (noting that $\mathbb{E}[f(x)]$ is a constant),

    $$\operatorname{var}[f] = \mathbb{E}\big[f(x)^2 - 2 f(x)\,\mathbb{E}[f(x)] + \mathbb{E}[f(x)]^2\big] = \mathbb{E}[f(x)^2] - 2\,\mathbb{E}[f(x)]^2 + \mathbb{E}[f(x)]^2 = \mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2$$
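A quick check of the identity on a small discrete distribution, in plain Python; the distribution and the function $f$ below are arbitrary examples.

```python
# Check var[f] = E[f(x)^2] - E[f(x)]^2 on a discrete distribution.
xs = [0, 1, 2]
ps = [0.2, 0.5, 0.3]   # an arbitrary example distribution
f = lambda x: x * x    # an arbitrary function of x

Ef = sum(p * f(x) for x, p in zip(xs, ps))
Ef2 = sum(p * f(x) ** 2 for x, p in zip(xs, ps))
var_def = sum(p * (f(x) - Ef) ** 2 for x, p in zip(xs, ps))
print(var_def, Ef2 - Ef ** 2)  # the two expressions agree
```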

  6. (*) Show that if two variables $x$ and $y$ are independent, then their covariance is zero.

    Solution: The covariance is $\operatorname{cov}[x, y] = \mathbb{E}[xy] - \mathbb{E}[x]\mathbb{E}[y]$, so it suffices to show that independence, i.e. $p(x, y) = p(x)p(y)$, implies $\mathbb{E}[xy] = \mathbb{E}[x]\mathbb{E}[y]$.

    If $p(x, y)$ is a discrete distribution, then

    $$\mathbb{E}[xy] = \sum_x \sum_y x y\, p(x) p(y) = \Big(\sum_x x\, p(x)\Big)\Big(\sum_y y\, p(y)\Big) = \mathbb{E}[x]\,\mathbb{E}[y]$$

    If $p(x, y)$ is a continuous distribution, then

    $$\mathbb{E}[xy] = \iint x y\, p(x) p(y)\, \mathrm{d}x\, \mathrm{d}y = \Big(\int x\, p(x)\, \mathrm{d}x\Big)\Big(\int y\, p(y)\, \mathrm{d}y\Big) = \mathbb{E}[x]\,\mathbb{E}[y]$$

    In both cases $\operatorname{cov}[x, y] = \mathbb{E}[x]\mathbb{E}[y] - \mathbb{E}[x]\mathbb{E}[y] = 0$.
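The discrete case can be verified by direct summation in plain Python; the two marginal distributions below are arbitrary examples.

```python
import itertools

# For independent x and y, p(x, y) = p(x) p(y), so E[xy] - E[x]E[y] vanishes.
px = {0: 0.3, 1: 0.7}
py = {-1: 0.4, 2: 0.6}

Ex = sum(x * p for x, p in px.items())
Ey = sum(y * p for y, p in py.items())
Exy = sum(x * y * px[x] * py[y] for x, y in itertools.product(px, py))
cov = Exy - Ex * Ey
print(round(cov, 12))  # -> 0.0
```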

  7. (**) www In this exercise, we prove the normalization condition (1.48) for
    the univariate Gaussian. To do this, consider the integral

    $$I = \int_{-\infty}^{\infty} \exp\Big(-\frac{x^2}{2\sigma^2}\Big)\, \mathrm{d}x$$

    which we can evaluate by first writing its square in the form

    $$I^2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \exp\Big(-\frac{x^2}{2\sigma^2} - \frac{y^2}{2\sigma^2}\Big)\, \mathrm{d}x\, \mathrm{d}y$$

    Now make the transformation from Cartesian coordinates $(x, y)$ to the polar coordinates $(r, \theta)$ and then substitute $u = r^2$. Show that, by performing the integrals over $\theta$ and $u$, and then taking the square root of both sides, we obtain

    $$I = (2\pi\sigma^2)^{1/2}$$

    Finally, use this result to show that the Gaussian distribution $\mathcal{N}(x \mid \mu, \sigma^2)$ is normalized.

    Solution:

    With $x = r\cos\theta$ and $y = r\sin\theta$, the area element becomes $\mathrm{d}x\, \mathrm{d}y = r\, \mathrm{d}r\, \mathrm{d}\theta$ (the factor $r$ is the Jacobian of the polar-coordinate transformation), so

    $$I^2 = \int_0^{2\pi}\int_0^{\infty} \exp\Big(-\frac{r^2}{2\sigma^2}\Big)\, r\, \mathrm{d}r\, \mathrm{d}\theta = 2\pi \int_0^{\infty} \frac{1}{2}\exp\Big(-\frac{u}{2\sigma^2}\Big)\, \mathrm{d}u = \pi\Big[-2\sigma^2 \exp\Big(-\frac{u}{2\sigma^2}\Big)\Big]_0^{\infty} = 2\pi\sigma^2$$

    using the substitution $u = r^2$, $\mathrm{d}u = 2r\, \mathrm{d}r$. Taking the square root of both sides gives $I = (2\pi\sigma^2)^{1/2}$. Then, using the transformation $y = x - \mu$,

    $$\int_{-\infty}^{\infty} \mathcal{N}(x \mid \mu, \sigma^2)\, \mathrm{d}x = \frac{1}{(2\pi\sigma^2)^{1/2}} \int_{-\infty}^{\infty} \exp\Big(-\frac{y^2}{2\sigma^2}\Big)\, \mathrm{d}y = \frac{I}{(2\pi\sigma^2)^{1/2}} = 1$$
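The result $I = (2\pi\sigma^2)^{1/2}$ can be checked by simple quadrature, assuming NumPy; $\sigma = 1.3$ is an arbitrary test value, and the grid is wide enough that the tails are negligible.

```python
import numpy as np

# Quadrature check of I = integral exp(-x^2 / (2 sigma^2)) dx = sqrt(2 pi sigma^2).
sigma = 1.3
x = np.linspace(-12 * sigma, 12 * sigma, 200001)
dx = x[1] - x[0]
I = np.sum(np.exp(-x ** 2 / (2 * sigma ** 2))) * dx
print(I, np.sqrt(2 * np.pi * sigma ** 2))  # the two values agree closely
```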

  8. (**) www By using a change of variables, verify that the univariate Gaussian
    distribution given by

    $$\mathcal{N}(x \mid \mu, \sigma^2) = \frac{1}{(2\pi\sigma^2)^{1/2}} \exp\Big(-\frac{(x - \mu)^2}{2\sigma^2}\Big)$$

    satisfies

    $$\mathbb{E}[x] = \mu$$

    Next, by differentiating both sides of the normalization condition

    $$\int_{-\infty}^{\infty} \mathcal{N}(x \mid \mu, \sigma^2)\, \mathrm{d}x = 1$$

    with respect to $\sigma^2$, verify that the Gaussian satisfies

    $$\mathbb{E}[x^2] = \mu^2 + \sigma^2$$

    Finally, show that (1.51) holds.

    Solution:

    With the change of variable $y = x - \mu$,

    $$\mathbb{E}[x] = \frac{1}{(2\pi\sigma^2)^{1/2}} \int_{-\infty}^{\infty} (y + \mu) \exp\Big(-\frac{y^2}{2\sigma^2}\Big)\, \mathrm{d}y = \mu$$

    since the first term is the integral of an odd function ($f$ is odd if $f(-y) = -f(y)$) and therefore vanishes, while the second term is $\mu$ times the normalization integral.

    By differentiating both sides of the normalization condition (1.127) with respect to $\sigma^2$,

    $$\int_{-\infty}^{\infty} \mathcal{N}(x \mid \mu, \sigma^2)\Big\{\frac{(x - \mu)^2}{2\sigma^4} - \frac{1}{2\sigma^2}\Big\}\, \mathrm{d}x = 0$$

    which rearranges to $\mathbb{E}[(x - \mu)^2] = \sigma^2$; expanding the square and using $\mathbb{E}[x] = \mu$ gives $\mathbb{E}[x^2] = \mu^2 + \sigma^2$.

    At last,

    $$\operatorname{var}[x] = \mathbb{E}[x^2] - \mathbb{E}[x]^2 = \mu^2 + \sigma^2 - \mu^2 = \sigma^2$$

    which is (1.51).
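Both moment identities can be checked by quadrature, assuming NumPy; $\mu = 1.5$ and $\sigma = 0.7$ are arbitrary test values.

```python
import numpy as np

# Quadrature check of E[x] = mu and E[x^2] = mu^2 + sigma^2 for the Gaussian.
mu, sigma = 1.5, 0.7
x = np.linspace(mu - 12 * sigma, mu + 12 * sigma, 200001)
dx = x[1] - x[0]
pdf = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)
E1 = np.sum(x * pdf) * dx
E2 = np.sum(x ** 2 * pdf) * dx
print(round(E1, 6), round(E2, 6))  # -> 1.5 2.74, i.e. mu and mu^2 + sigma^2
```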

  9. (*) www Show that the mode (i.e. the maximum) of the Gaussian distribution (1.46) is
    given by $\mu$. Similarly, show that the mode of the multivariate Gaussian

    $$\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma}) = \frac{1}{(2\pi)^{D/2} |\boldsymbol{\Sigma}|^{1/2}} \exp\Big(-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^{\mathrm{T}} \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu})\Big)$$

    is given by $\boldsymbol{\mu}$.

    Solution:

    Differentiating the univariate density and setting the result to zero,

    $$\frac{\mathrm{d}}{\mathrm{d}x}\mathcal{N}(x \mid \mu, \sigma^2) = -\frac{x - \mu}{\sigma^2}\,\mathcal{N}(x \mid \mu, \sigma^2) = 0$$

    we get $x = \mu$, since $\mathcal{N}(x \mid \mu, \sigma^2) > 0$; the second derivative there is negative, so this is the maximum.

    For the multivariate case, taking the gradient with respect to $\mathbf{x}$ (with $\boldsymbol{\Sigma}$ symmetric),

    $$\nabla_{\mathbf{x}} \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma}) = -\boldsymbol{\Sigma}^{-1}(\mathbf{x} - \boldsymbol{\mu})\, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma}) = 0$$

    which, since $\boldsymbol{\Sigma}^{-1}$ is nonsingular and the density is positive, gives $\mathbf{x} = \boldsymbol{\mu}$.
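A simple grid check of the univariate mode, assuming NumPy; $\mu = 2.0$ and $\sigma = 0.5$ are arbitrary test values (the grid is chosen to contain $\mu$ exactly).

```python
import numpy as np

# Grid check that the Gaussian density peaks at x = mu.
mu, sigma = 2.0, 0.5
x = np.linspace(mu - 3, mu + 3, 60001)
pdf = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
mode = float(x[np.argmax(pdf)])
print(round(mode, 6))  # -> 2.0, the mean
```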

  10. (*) www Suppose that the two variables $x$ and $z$ are statistically independent.
    Show that the mean and variance of their sum satisfy

    $$\mathbb{E}[x + z] = \mathbb{E}[x] + \mathbb{E}[z], \qquad \operatorname{var}[x + z] = \operatorname{var}[x] + \operatorname{var}[z]$$

    Solution: If $x$ and $z$ are statistically independent, then $p(x, z) = p(x)p(z)$ and hence $\mathbb{E}[xz] = \mathbb{E}[x]\,\mathbb{E}[z]$ (Exercise 6). The mean result follows from the linearity of expectation (and holds even without independence). For the variance,

    $$\operatorname{var}[x + z] = \mathbb{E}[(x + z)^2] - \mathbb{E}[x + z]^2 = \mathbb{E}[x^2] + 2\,\mathbb{E}[xz] + \mathbb{E}[z^2] - \big(\mathbb{E}[x] + \mathbb{E}[z]\big)^2$$

    and substituting $\mathbb{E}[xz] = \mathbb{E}[x]\mathbb{E}[z]$ the cross terms cancel, leaving $\operatorname{var}[x] + \operatorname{var}[z]$.
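An exact check on two small discrete independent variables, in plain Python; the two distributions below are arbitrary examples with dyadic probabilities so the arithmetic is exact.

```python
import itertools

# Exact check that E[x+z] = E[x] + E[z] and var[x+z] = var[x] + var[z]
# for independent x and z.
px = {0: 0.25, 1: 0.75}
pz = {1: 0.5, 3: 0.5}

def mean(d): return sum(v * p for v, p in d.items())
def var(d):  return sum(p * (v - mean(d)) ** 2 for v, p in d.items())

# Distribution of the sum under independence: weight p(x) p(z) on x + z.
psum = {}
for (xv, pxv), (zv, pzv) in itertools.product(px.items(), pz.items()):
    psum[xv + zv] = psum.get(xv + zv, 0.0) + pxv * pzv

print(mean(psum), mean(px) + mean(pz))  # -> 2.75 2.75
print(var(psum), var(px) + var(pz))     # -> 1.1875 1.1875
```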
