Machine Learning Fundamentals: Linear Regression
It all starts with "Y = (w * X) + b". But what is it, and what does it mean? Linear Regression is built on a simple equation that may look strange at first, yet most of us have seen it before; as soon as we remember where, it becomes quite easy to understand.

Back when we were studying, one of the most loved/hated topics in math was Algebra, specifically Equations, and there is one equation in particular that we used a lot: "The Line Equation". It defines the value on the Y-axis (the vertical axis) as a function of the value on the X-axis (the horizontal axis); in other words, it defines Y in terms of X, which is the same as saying that the value of Y depends on the value of X. We can find this equation in many forms; here is an image with the most common ways it is written.

As you can see, after a quick review, the Linear Regression equation looks very much like the "Slope-Intercept" form of the Line Equation, y = m*x + b. In fact, it's essentially the same equation: the slope m plays the role of the weight w, and the y-intercept b plays the role of the bias.
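To make that connection concrete, here is a minimal sketch in Python (not from the original article; the values of w and b are made up purely for illustration) that evaluates Y = (w * X) + b for a few inputs, exactly as the slope-intercept line equation would.

```python
# Minimal sketch: evaluating the linear regression equation Y = (w * X) + b.
# The weight w acts as the slope, and the bias b acts as the y-intercept.

def predict(x, w, b):
    """Return the model's prediction for input x, given weight w and bias b."""
    return w * x + b

# Hypothetical values, chosen only for illustration:
w = 2.0   # weight (slope of the line)
b = 1.0   # bias (y-intercept, where the line crosses the Y-axis)

for x in [0.0, 1.0, 2.0, 3.0]:
    print(f"x = {x:.1f} -> y = {predict(x, w, b):.1f}")
```

Running it prints points that lie on a straight line: every increase of 1 in x increases y by the weight w, and b shifts the whole line up or down.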