machine learning, supervised and unsupervised

In machine learning by Henry Wolf

I lied. I said that I was going to use this blog to write about articles as I read them, but the fact of the matter is that I need a bit more of a grounding in machine learning before I jump into academic articles. The complexities of deep learning would not be of much benefit to me yet, as most of it would go over my head. It is for that reason that I will start by writing about Andrew Ng’s Machine Learning course on Coursera. The session ended before I enrolled, but the archives allow me to work through the course videos at a faster pace than I would have been able to otherwise.

The course begins with an introduction to the two main types of learning that are used in machine learning algorithms: supervised and unsupervised. Supervised learning is when the machine is given the “right answers”. One example used was that of housing prices. The machine is given a set of data (house sizes and their prices) and asked to do a task (predict the price of a house when given its size). In this example, the “right answer” is the price of the house. This is a regression problem, as the machine must predict a continuous value; the function it fits could be linear, exponential, or something else entirely.
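
To make the idea concrete, here is a minimal sketch in Python (the course itself uses Octave/Matlab) that fits a straight line to some made-up house sizes and prices and then predicts the price of an unseen house. The numbers are invented purely for illustration.

```python
import numpy as np

# Made-up training data: house sizes (square feet) and prices (thousands of dollars).
sizes = np.array([850, 1200, 1500, 1800, 2100, 2500], dtype=float)
prices = np.array([150, 210, 245, 300, 340, 410], dtype=float)

# Design matrix with a column of ones so the model learns an intercept as well as a slope.
X = np.column_stack([np.ones_like(sizes), sizes])

# Least-squares fit: the "supervised" part is that each size comes with its right answer (the price).
theta, *_ = np.linalg.lstsq(X, prices, rcond=None)
intercept, slope = theta

# Predict the price of an unseen 1,650 sq ft house.
new_size = 1650.0
predicted = intercept + slope * new_size
print(f"Predicted price for {new_size:.0f} sq ft: ${predicted:.0f}k")
```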

I believe that I have seen this type of system in use whilst searching for a car to buy online. CarGurus.com has a system that does almost exactly the above. Instead of the size of a house, it uses the mileage of a vehicle to predict the price. The machine is given information on the mileage and price of each car by make, model, and year. It can then predict the fair market value of a car given its mileage. It uses this information to make recommendations about which cars are the best deals, showing users, down to the dollar, an estimated instant market value for each car. As far as I know, this does not take the condition of the car into account, but it is a good example of a supervised learning system that utilises regression.

Another type of supervised learning involves classification, which requires the machine to predict a discrete value as its output. The example provided is whether a tumour is malignant or benign, based on its size and the age of the patient. If these two features are placed on the x and y axes of a graph, a line can be drawn that separates the tumours likely to be malignant from those likely to be benign. In this case, the boundary is linear, but it need not be.
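
As a rough sketch of what such a classifier might look like, the following Python snippet fits a logistic regression (one standard way of drawing that kind of boundary, not necessarily the one the lecture had in mind) to some invented tumour-size and age data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up training data: [tumour size (cm), patient age], with 1 = malignant, 0 = benign.
X = np.array([
    [1.0, 35], [1.5, 42], [2.0, 50], [2.5, 38],   # benign examples
    [3.5, 60], [4.0, 55], [4.5, 70], [5.0, 65],   # malignant examples
], dtype=float)
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Logistic regression learns a linear decision boundary in the size/age plane.
clf = LogisticRegression(max_iter=1000)
clf.fit(X, y)

# Predict the class and the probability of malignancy for a new, unseen case.
new_case = np.array([[3.0, 58]])
print("Predicted class:", clf.predict(new_case)[0])
print("P(malignant):", clf.predict_proba(new_case)[0, 1])
```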

This idea becomes much more powerful when one realises that the machine need not work in two-dimensional space; many more features can be used at once. I suspect that later on the course will cover using layers of hidden units to create more powerful algorithms.

Unsupervised learning is when only the data is given, without a “right answer”. This is primarily used for clustering algorithms. The two examples given were Google News and the cocktail party problem. Google News analyses the content of news articles and groups them using an unsupervised clustering algorithm. Articles that come out at the same time and share a significant number of unique words, such as proper names, are very likely to be related.
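
A toy version of this kind of grouping might look like the following Python sketch, which turns a few made-up headlines into weighted word-count vectors and clusters them with k-means. This is only a stand-in for whatever Google News actually does.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy "articles": two about a football match, two about a product launch.
articles = [
    "United beat City 2-1 after a late goal at Old Trafford",
    "Late goal gives United dramatic win over City",
    "Apple unveils new iPhone at its September event",
    "New iPhone announced by Apple during keynote event",
]

# Turn each article into a weighted word-count vector; shared unique words pull articles together.
vectors = TfidfVectorizer(stop_words="english").fit_transform(articles)

# K-means groups the vectors without ever being told which articles belong together.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for article, label in zip(articles, labels):
    print(label, article)
```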

The cocktail party problem is as important in cognitive science as it is in machine learning. The question is how one can understand the person talking to them when there is a significant amount of background noise. Why does it not all get jumbled together? In machine learning terms, the question becomes how to isolate one stream of audio from a recording. With two sources of audio, this can be done by recording from two microphones and separating out the two sources. The specifics of the example were unclear, but perhaps loudness and similarity of sound frequency are taken into account. Humans may be doing something similar. Perhaps this is why living creatures evolved two ears, rather than one (except of course for the praying mantis – which does not believe in evolution).

Perhaps the most interesting factoid was that this problem can be solved with a single line of code, at least in the Octave programming language. Unfortunately, it seems there are no well-developed OSX bundles of Octave, so I will be attempting to do the same in Matlab (or Python or R). Wish me luck.
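
In the meantime, here is a rough Python sketch of the same idea using independent component analysis (scikit-learn’s FastICA) to unmix two synthetic signals heard by two “microphones”. This is my own stand-in, not the course’s one-liner, and the signals are made up.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic "speakers": a sine wave and a square wave.
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)              # speaker 1
s2 = np.sign(np.sin(3 * np.pi * t))     # speaker 2
sources = np.column_stack([s1, s2])

# Each "microphone" hears a different mixture of the two speakers.
mixing = np.array([[1.0, 0.5],
                   [0.4, 1.0]])
recordings = sources @ mixing.T

# FastICA recovers statistically independent components from the mixed recordings.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(recordings)

# The recovered signals match the originals up to scaling and ordering.
print(recovered.shape)  # (2000, 2)
```

The recovered components come back in an arbitrary order and scale, which is a general property of this kind of blind source separation rather than a quirk of the library.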

Comments

  1. Good stuff. Looking forward to learning more about the different kinds of machine learning.
