
Gradient descent versus normal equation
    Gradient descent and the normal equation are both methods for finding the parameters that minimize a cost function. I gave some intuition about gradient descent in a previous article. Gradient descent is an iterative method: we start with an initial guess for the parameters (usually zeros, but not necessarily) and gradually adjust them until we reach the function that best fits the given data points. The normal equation, by contrast, is an analytical method that computes the optimal parameters directly in a single step, with no iteration. For linear regression the cost function is convex, so both methods arrive at the same global minimum. Here is the mathematical formulation of gradient descent in the case of linear regression, where alpha is the learning rate, m is the number of training examples, and h_theta is the hypothesis function: repeat until convergence, for each parameter j simultaneously,

    theta_j := theta_j - alpha * (1/m) * sum_{i=1}^{m} (h_theta(x^(i)) - y^(i)) * x_j^(i)
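The contrast between the two methods can be sketched in a few lines of NumPy. This is a minimal illustration on a hypothetical toy dataset (points lying on y = 2x + 1); the data, learning rate, and iteration count are my own choices, not from the article.

```python
import numpy as np

# Hypothetical toy data: points on the line y = 2x + 1.
# The first column of X is all ones, acting as the bias (intercept) term.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])
m = len(y)

# --- Gradient descent: iterative updates starting from zeros ---
theta = np.zeros(2)
alpha = 0.05          # learning rate, picked by hand for this toy data
for _ in range(5000):
    # Gradient of the mean-squared-error cost: (1/m) * X^T (X theta - y)
    gradient = X.T @ (X @ theta - y) / m
    theta -= alpha * gradient

# --- Normal equation: one-step closed-form solution ---
# Solves (X^T X) theta = X^T y directly, no learning rate, no iterations.
theta_closed = np.linalg.solve(X.T @ X, X.T @ y)

print(theta)         # gradient descent estimate, close to [1.0, 2.0]
print(theta_closed)  # exact closed-form answer, [1.0, 2.0]
```

Both runs recover an intercept of 1 and a slope of 2; gradient descent only gets there approximately, and how fast depends on the learning rate, while the normal equation is exact but requires solving a linear system, which becomes expensive when the number of features is large.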
    Posted by fardi zayden
    Writer and news editor at the دراسات تقنية (Technical Studies) website.