Machine learning is one of the fastest-moving technologies in the world today. Researchers across the world are applying a wide range of algorithms to solve complex computational problems involving rigorous mathematics. The beauty of machine learning and its sub-domain, neural networks, is that they yield desirable results without being explicitly programmed. Practitioners don’t have to give a machine explicit instructions for everything when using machine learning development services. Neural networks can learn on their own; all one needs is a large dataset.
Ever since the major algorithms were introduced in the 1980s, machine learning has changed surprisingly little. Researchers have come up with mathematically advanced algorithms and learning procedures, but when it comes to data requirements, nothing has changed: machine learning algorithms still need a large amount of data to learn a process or task efficiently. Even though feeding ever-larger datasets to a model is how the field operates today, it isn’t, to be precise, a very smart approach.
A Learning Experiment
We are still far from human intelligence and the notion of common sense when it comes to machine learning. For example, consider how a human child learns. One fine Sunday, Susie’s dad wakes up early and takes her to the garden. Since it’s the only day he gets to spend with his daughter, he decides to do some fun learning activities with her. As Susie sets out to play, her dad gifts her a brand-new Lego set and starts teaching her how to build towers. With excitement and curiosity, Susie watches the way her father puts the little pieces together, fitting them into each other’s grooves and turning them into a tower. Minutes later, Susie starts doing the same. After about half an hour, her dad calls her over and asks her to start building sideways instead of up. What do you think Susie did? Was she able to transfer what she had learned and build the tower sideways? Or did she wait for her dad to give her a million examples before she learned the new process? Exactly!
This is how humans and machines differ. Children do not need millions of examples to learn and start replicating things. Instead, they use their innate ability to learn at a more general level, a form of learning often termed generalized learning. Cognitive science researchers have proposed that common sense is an innate human ability, something we are born with. Sadly, that’s not the case with machines, which is why even the simplest machine learning models require at least a thousand samples to learn a particular task.
Scientists in the past tried feeding knowledge directly to computers, just as one would teach a baby. But this approach failed miserably, which is why the idea of machine learning came up in the first place. Today’s machine learning models can be trained to learn almost anything, which is one of the main reasons they are being applied across nearly every industry we know. Be it healthcare, business, education or entertainment, machine learning is all around us whether we realize it or not.
One of the biggest problems with machine learning is that companies who successfully build models seldom care about the amount of data that goes into building them. They have data in abundance, especially as everything turns digital. Consider the social media titan Facebook as an example. More than a billion people have profiles on Facebook, where they post statuses, pictures and a great deal of other information. That gives Facebook the upper hand in harnessing all that data to train its algorithms. Josh Tenenbaum, a psychologist at the Massachusetts Institute of Technology (MIT) in Cambridge, points out that when companies already have an abundance of data in today’s digital world, they have little incentive to push technologies like artificial intelligence and machine learning toward learning the way humans do.
The same is true of most organizations using machine learning around the world: all they care about is short-term goals and quick results. Most machine learning models don’t even understand the mechanics of the task; they simply look for an underlying pattern in the dataset that is statistically relevant.
Inducing Human-Like Learning
However, there is ongoing research at the Massachusetts Institute of Technology (MIT) that aims to understand human intelligence and replicate it in machines. The initiative, called the MIT Intelligence Quest, addresses crucial nature-versus-nurture questions about learning.
One approach to making machines learn from small datasets is learning from trial and error, just the way babies do. Babies don’t look at millions of samples; they take a few instructions and keep doing, learning from their errors as they go. This would also shift the focus away from the supervised learning that machines mostly rely on today.
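Trial-and-error learning is the core idea behind reinforcement learning. As a rough illustration (a minimal sketch on a toy problem, not any particular lab’s method), tabular Q-learning lets an agent master a simple task purely from its own attempts and the rewards they earn, with no labeled dataset at all:

```python
import random

# Toy environment: a corridor of 5 cells. The agent starts at cell 0
# and earns a reward of 1 for reaching cell 4. Actions: 0 = left, 1 = right.
N_STATES, GOAL = 5, 4
ACTIONS = [0, 1]

def step(state, action):
    """Apply an action and return (next_state, reward, done)."""
    nxt = max(0, state - 1) if action == 0 else min(N_STATES - 1, state + 1)
    return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

def train(episodes=200, alpha=0.5, gamma=0.9, epsilon=0.2, seed=0):
    """Tabular Q-learning: improve action-value estimates from raw experience."""
    random.seed(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # Explore occasionally; otherwise act greedily on current estimates.
            if random.random() < epsilon:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[state][a])
            nxt, reward, done = step(state, action)
            # Update the estimate from this single trial (learning from the error).
            q[state][action] += alpha * (reward + gamma * max(q[nxt]) - q[state][action])
            state = nxt
    return q

q = train()
# Greedy policy after training: expect "move right" (1) in every non-goal cell.
policy = [max(ACTIONS, key=lambda a: q[s][a]) for s in range(N_STATES)]
print(policy)
```

The agent here sees no examples of correct behavior; it only discovers, by repetition and feedback, which actions bring it closer to the goal.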
In another move, Amazon has come up with Amazon Rekognition Custom Labels, which enables users to build their own specialized ML-based image analysis capabilities and detect objects unique to their business. The best part about this tool is that it can learn to detect, say, machine parts from a limited number of images. These labeled images go in as input, so users don’t have to build a model from scratch, yet they still get state-of-the-art performance for their unique image analysis needs.
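For context, a trained Custom Labels model is queried through the Rekognition `DetectCustomLabels` API. A minimal sketch with boto3 might look like the following; the project version ARN, bucket name, and image key are placeholders, and the `top_labels` helper is our own illustration, not part of the AWS SDK:

```python
def top_labels(response, min_confidence=80.0):
    """Filter a DetectCustomLabels response down to confident label names."""
    return [
        label["Name"]
        for label in response.get("CustomLabels", [])
        if label["Confidence"] >= min_confidence
    ]

def detect_parts(project_version_arn, bucket, image_key):
    """Run a trained Custom Labels model on an image stored in S3."""
    import boto3  # AWS SDK for Python; requires configured credentials

    client = boto3.client("rekognition")
    response = client.detect_custom_labels(
        ProjectVersionArn=project_version_arn,  # ARN of the trained model version
        Image={"S3Object": {"Bucket": bucket, "Name": image_key}},
        MinConfidence=50.0,
    )
    return top_labels(response)
```

A call such as `detect_parts("arn:aws:rekognition:...", "my-bucket", "parts/photo.jpg")` would then return the names of the custom objects the model spotted with high confidence.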
Even though Amazon’s Rekognition tool is aimed at tasks like identifying machine parts, the same underlying approach can be extended to other tasks. It is worth remembering that babies look at the world from all kinds of odd angles, without focusing on a particular task, and in spite of this they are able to recognize familiar objects and extend their learning in a generalized manner. Machine learning, therefore, must learn to replicate that innate common sense if it truly wants to be like humans.