Locally Weighted Learning

Locally Weighted Learning is a Machine Learning algorithm that tries to approximate a function given some data. To understand the topic in detail, we first need to understand the problem statement, which in this case is Regression. So let’s talk about Regression first.

Let’s say that we are provided with a dataset that contains some values of an independent variable (in this case, x) and the corresponding values of a dependent variable f(x). The problem statement is to calculate an approximate value of f(x) at some given x. In Machine Learning, this problem statement is called Regression. When f(x) is linear in nature, it becomes Linear Regression.
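To make the distinction concrete, here is a minimal sketch of plain Linear Regression in Python (the dataset and the true line are made up purely for illustration); it fits a single straight line to all of the data at once:

```python
import numpy as np

# Hypothetical 1-D dataset: values of x and noisy observations of f(x)
# (the data here is made up purely for illustration)
x = np.linspace(2, 100, 50)
y = 3.0 * x + 7.0 + np.random.normal(scale=5.0, size=x.shape)

# Ordinary least-squares fit of a single line f(x) ~ a*x + b over ALL the data
A = np.vstack([x, np.ones_like(x)]).T
a, b = np.linalg.lstsq(A, y, rcond=None)[0]

print(f"fitted line: f(x) ~ {a:.2f} * x + {b:.2f}")
```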

An example illustrating Linear Regression

Now that we have understood the problem statement, it is time to understand our main algorithm.

In this algorithm, we look for a few data points from the given dataset that are in the proximity of the test point and average those points in some manner (simple or weighted) to predict the output. Below is an example demonstrating the basic idea of the algorithm.

Let’s say we want to calculate the value of f(x) at x = 42. We follow the two steps below to calculate this value:

Approximating value at given x with LWL
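A minimal sketch of these two steps in Python (the dataset, the choice of k = 4 neighbours, and the function name lwl_predict are assumptions made for this illustration):

```python
import numpy as np

def lwl_predict(x_query, x_data, y_data, k=4):
    """Approximate f(x_query): select the k nearest points, then average their y-values."""
    # Step 1: find the k data points closest to the query point
    distances = np.abs(x_data - x_query)
    nearest = np.argsort(distances)[:k]
    # Step 2: average the y-values of those points (a simple, unweighted average here)
    return np.mean(y_data[nearest])

# Hypothetical dataset: f(x) = x**2 sampled at x = 2..100 (made up for illustration)
x_data = np.arange(2, 101, dtype=float)
y_data = x_data ** 2

print(lwl_predict(42.0, x_data, y_data))  # prints a value close to f(42) = 1764
```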

Using the algorithm described above, we compute the y-values for x from 2 to 100, and it fits a line like this.

Gif: Regressing through the dataset with LWL
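To reproduce that sweep as code (reusing the hypothetical lwl_predict and dataset from the sketch above), one would simply call the predictor at every x in the range and plot the result:

```python
# Sweep the predictor over x = 2..100 to trace out the regression curve
xs = np.linspace(2, 100, 200)
ys = np.array([lwl_predict(x, x_data, y_data) for x in xs])
# Plotting ys against xs (e.g. with matplotlib) gives the fitted curve shown in the gif
```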

The same technique can also be used for classification problems. The example we saw here uses only one dimension; it can be extended to higher dimensions using the L2 or another norm.
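For instance, with multi-dimensional inputs the distance to the query point can be computed with the L2 norm (a small sketch with a made-up 2-D dataset):

```python
import numpy as np

# Hypothetical 2-D dataset and query point, just to show the distance computation
X_data = np.array([[1.0, 2.0], [3.0, 4.0], [40.0, 10.0]])
x_query = np.array([2.0, 3.0])

# L2 (Euclidean) distance from the query point to every data point;
# the nearest points are then selected exactly as in the 1-D case
distances = np.linalg.norm(X_data - x_query, axis=1)
print(distances)  # [ 1.414...,  1.414..., 38.64...]
```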

One more thing to notice: I pointed out that after selecting the points near the query point (the test point), we take an average of their y-values. Well, this average can also be weighted.

There are various functions that can be used to calculate weights for averaging. A few of these are plotted below:
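For reference, here is a sketch of a few typical choices (the exact functions plotted in the original figure may differ; these are common ones):

```python
import numpy as np

def inverse_distance(d, eps=1e-8):
    # 1/d weighting, guarded against division by zero when d == 0
    return 1.0 / (d + eps)

def gaussian(d, bandwidth=1.0):
    # Smoothly decaying weight; bandwidth controls how quickly influence drops off
    return np.exp(-(d ** 2) / (2 * bandwidth ** 2))

def tricube(d, bandwidth=1.0):
    # Compactly supported kernel: weight is exactly zero beyond the bandwidth
    u = np.clip(np.abs(d) / bandwidth, 0.0, 1.0)
    return (1 - u ** 3) ** 3
```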

For example, let’s take the query point x = 42 again and choose 1/d as the weight function. If the four closest points are x = [41, 42.5, 41.5, 40], then the corresponding Euclidean distances are d = [1, 0.5, 0.5, 2] and the corresponding weights are W = [1, 2, 2, 0.5]:

f(42) = (1*f(41)+2*f(42.5)+2*f(41.5)+0.5*f(40)) / (1+2+2+0.5)
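The same computation as a runnable check (f here is a made-up placeholder function, only so that the snippet executes; the text leaves f unspecified):

```python
import numpy as np

x_near = np.array([41.0, 42.5, 41.5, 40.0])   # the four closest points from the text
d = np.abs(x_near - 42.0)                     # Euclidean distances: [1, 0.5, 0.5, 2]
w = 1.0 / d                                   # 1/d weights: [1, 2, 2, 0.5]

f = lambda x: x ** 2                          # placeholder f(x), purely illustrative
estimate = np.sum(w * f(x_near)) / np.sum(w)  # weighted average, as in the formula above
print(estimate)
```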

Below are some warnings to keep in mind while using this method:

