Chris Bishop: Even the most sophisticated computers can't tell a dog from a cat

Tuesday 06 January 2009 01:00 GMT

The task of recognising structure or objects in data is called pattern recognition, and it's something computers find incredibly difficult. Let's say we want to distinguish between pictures of cats and pictures of dogs. The problem is that there is huge variation in the images: cats and dogs come in different sizes, colours and shapes, and the lighting and the background change from picture to picture. Even just working out which part of the image is the animal and which is the background is hard for a computer.

Back in the 1970s, scientists tried creating artificial intelligence using an idea called Expert Systems, which were based on handcrafted rules. The problem is that whenever we think of a rule, we can usually find an exception to it – a cat with long fur and a dog with short fur. So while rule-based systems have proved useful for some applications, we've pretty much given up using them for pattern recognition.
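
To make that brittleness concrete, here is a small sketch in Python of the kind of handcrafted rule an Expert System might rely on. The rule, the thresholds and the example animals are all invented for illustration; they are not taken from the lecture.

    # A handcrafted rule of the sort an Expert System might use.
    # The thresholds and example animals are invented for illustration.
    def classify_by_rule(weight_kg, fur_length_cm):
        """Guess 'cat' or 'dog' from two crude measurements."""
        if weight_kg > 10 or fur_length_cm < 2:
            return "dog"
        return "cat"

    print(classify_by_rule(25, 3))   # a Labrador    -> "dog" (correct)
    print(classify_by_rule(4, 6))    # a Persian cat -> "cat" (correct)
    print(classify_by_rule(3, 6))    # a Pomeranian  -> "cat" (wrong: a small, fluffy dog defeats the rule)

Patching the rule to catch the Pomeranian only creates a new exception somewhere else, which is exactly the trap described above.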

So what we need is a new approach. The idea is this – instead of programming the computer to solve the pattern recognition problem directly, we program the computer to learn from data, and then we train it to solve the problem, a bit like the way you and I learn from experience.
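
As a minimal sketch of what learning from data can look like, the toy Python below (again with invented measurements and labels, not anything from the lecture) stores labelled examples and gives a new animal the label of its nearest training example, so the decision comes from data rather than from rules written by a programmer.

    # A toy learned classifier: it memorises labelled training examples and
    # assigns a new case the label of the closest one. All numbers are invented.
    import math

    training_examples = [
        # (weight_kg, fur_length_cm), label
        ((25.0, 3.0), "dog"),
        ((30.0, 1.0), "dog"),
        ((3.0, 6.0), "dog"),   # the small, fluffy dog that broke the rule above
        ((4.0, 6.0), "cat"),
        ((5.0, 2.0), "cat"),
        ((3.5, 0.3), "cat"),   # a short-haired cat
    ]

    def classify_by_nearest_example(weight_kg, fur_length_cm):
        """Return the label of the training example nearest to the query."""
        def distance(features):
            return math.hypot(features[0] - weight_kg, features[1] - fur_length_cm)
        _, label = min(training_examples, key=lambda example: distance(example[0]))
        return label

    print(classify_by_nearest_example(3.2, 5.5))  # close to the fluffy-dog example -> "dog"

Handling an awkward new case now means adding another labelled example rather than rewriting the rules, which is the sense in which the computer is trained rather than programmed directly.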

An adult human is thought to be able to distinguish tens of thousands of categories of objects. Even a toddler is significantly better at recognising everyday objects than a supercomputer. And yet the progress that we've made so far has already led to some practical applications, from allowing robots in factories to see what they're assembling, to allowing tumours to be detected in medical images.

The challenge of digital intelligence is one of the most fascinating frontiers of computer science. It's more than half a century since the first digital computers were built, and yet we're still at the beginning of the digital revolution. Whatever advances the next 50 years bring, they will be at least as important and at least as exciting as those of the past 50, and it's the scientists of the next generation who will make it happen.

From one of the Royal Institution Christmas Lectures by Professor Chris Bishop, Chief Research Scientist at Microsoft Research, Cambridge
