It all starts with "Y = (w * X) + b", but what is it, and what does it mean?
The foundations of Linear Regression rest on a simple equation. It may look strange at first, but most of us have seen it before, and as soon as we remember where, it becomes quite easy to understand.
Back in school, one of the most loved/hated topics in math was Algebra, specifically equations, and there is one in particular we used a lot: the Line Equation. It defines the value on the Y axis (the vertical one) as a function of the value on the X axis (the horizontal one). In other words, it defines Y in terms of X, which is to say, the value of Y depends on the value of X.
We may find this equation in many forms; here is an image with the most common ones.
As you can see after a quick review, the Linear Regression Equation looks almost identical to the "Slope-Intercept" form of the Line Equation. In fact, it is the same equation, just with different letters and names for each element of the line. Let's compare the letters side by side to see their relationship.
As you can see, the Y and X values have the same meaning (for now), but W and M have different names. In Linear Regression, W (the Weight) is the value that multiplies X, determining how steeply the line rises or falls on the Cartesian plane, that is, its slope. In the Line Equation, M plays exactly the same role but, as shown in the table, it is called the Slope instead.
On the other hand, B in the Linear Regression Equation is the Bias: it moves the line up or down depending on its sign, and it is called Bias because it shifts the line closer to or farther from any point on the Y axis of the Cartesian plane. It plays the same role in the Line Equation, where it is called the Y-Intercept, since it is the value where the line crosses the Y axis when X is zero (0).
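To make the roles of W and B concrete, here is a minimal sketch in plain Python (the function name `predict` is just illustrative, not part of any library):

```python
def predict(x, w, b):
    # Y = (w * X) + b -- the Linear Regression / Slope-Intercept equation
    return (w * x) + b

# The Weight (W) controls the slope: a larger W makes Y grow faster as X grows.
print(predict(2, w=3, b=0))    # 6   -> steep line through the origin
print(predict(2, w=0.5, b=0))  # 1.0 -> flatter line

# The Bias (B) shifts the whole line up or down; at X = 0, Y equals B
# (which is exactly why the Line Equation calls it the Y-Intercept).
print(predict(0, w=3, b=5))    # 5   -> the line crosses the Y axis at 5
print(predict(0, w=3, b=-5))   # -5  -> the same line, shifted down
```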
Getting Deeper into Linear Regression
So now that we have made the relationship between the Linear Regression Equation and the Line Equation clear, we can get into what exactly "Linear Regression" is, and what it is used for.
Linear Regression is an algorithm, or Linear Model, that tries to represent the relationship between the input elements of a dataset (X values) and the outputs (Y values) as linear. Once we have found the Weights and Biases, it is easy to compute the value of Y for any X, and as X changes, we also know how much Y will change.
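As a sketch of that idea, here is one common way to find the Weight and Bias from data, using NumPy's least-squares line fit (the data points below are made up for illustration, roughly following Y = 2X + 1 with some noise):

```python
import numpy as np

# Hypothetical data that roughly follows Y = 2X + 1, with a little noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# np.polyfit with degree 1 fits a straight line, returning [w, b].
w, b = np.polyfit(x, y, 1)
print(f"w (Weight/Slope) = {w:.2f}, b (Bias/Y-Intercept) = {b:.2f}")

# Once w and b are known, predicting Y for any new X is just the equation:
new_x = 5.0
print(f"predicted Y for X={new_x}: {(w * new_x) + b:.2f}")
```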
The Line Equation has 1 input (an X value) and 1 output (a Y value), but a Linear Regression Model may have 1 or more inputs (X values) and 1 or more outputs (Y values). In fact, the most common Linear Regression examples use multiple inputs and a single output (i.e., predicting the price of a house based on its area, number of rooms, location, etc.), as in the sketch below.
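For the multiple-input case, here is a sketch using scikit-learn's LinearRegression (a common library choice, not the only one; the house features and prices below are made-up numbers, purely for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row is a house: [area in m^2, number of rooms]. Values are hypothetical.
X = np.array([
    [50.0, 2],
    [80.0, 3],
    [120.0, 4],
    [200.0, 5],
])
# Made-up prices: a single output per house.
y = np.array([100_000, 150_000, 210_000, 330_000])

model = LinearRegression()
model.fit(X, y)

# One Weight per input feature, plus a single Bias.
print("Weights:", model.coef_)     # [w_area, w_rooms]
print("Bias:", model.intercept_)   # b

# Prediction is the same equation, extended: Y = (w1*X1) + (w2*X2) + b
print("Predicted price:", model.predict([[100.0, 3]]))
```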
When dealing with Linear Regression Models, if there is a single input, the fitted model is called a Regression Line, as it represents the relationship between Y and X as a line. But if the model has multiple inputs, the result is called a Hyperplane, as the fit is essentially a flat surface living in a higher-dimensional space (with two inputs, a plane in 3D). Here is a visual representation of both types of Linear Regression Models.
NOTE: Each point in the image is a data sample, i.e., a pair of input (X) and output (Y) values.
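If you want to reproduce a picture like the single-input case yourself, here is a small matplotlib sketch (the data is randomly generated around a made-up line, just to have something to fit):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical samples scattered around the line Y = 2X + 1.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 30)
y = 2 * x + 1 + rng.normal(0, 1.5, size=x.shape)

# Fit the Regression Line and draw it through the points.
w, b = np.polyfit(x, y, 1)
plt.scatter(x, y, label="data samples")
plt.plot(x, (w * x) + b, color="red", label=f"Y = {w:.2f}X + {b:.2f}")
plt.xlabel("X")
plt.ylabel("Y")
plt.legend()
plt.show()
```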
Regression Models are commonly used to predict continuous values (numbers that are not limited to a fixed set of options, with infinitely many possibilities within a range), e.g., the price of an item, the weight of an object, the area of a construction, etc.
NOTE: There are more Regression Models beyond Linear (e.g., Logistic, Ridge, Lasso, Polynomial, Bayesian, etc.), and not all of them return continuous values: some, like Logistic Regression, predict categorical values (a value from a fixed set of options).
In a future post, we're going to explore other Regression Models and how they work. Meanwhile, feel free to ask anything about Linear Regression, or any other topic related to Artificial Intelligence, Machine Learning and/or Data Science.