senses

A new pair of eyes

Researchers are working very hard on the ability of computers to mimic the human senses—in their own way, to see, smell, touch, taste and hear. In this article we highlight two examples of algorithms that seem to be beating us at our own game.

Your eyes can deceive you. Sometimes, even for humans, it is hard to distinguish a muffin from a Chihuahua. Most of us can recognize an object after seeing it once or twice. But the algorithms that power computer vision and voice recognition need thousands of examples to become familiar with each new image or word.

Researchers at Google DeepMind now have a way around this. They made a few clever tweaks to a deep-learning algorithm that allow it to recognize objects in images and other things from a single example—something known as “one-shot learning.”
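The core idea behind one-shot learning can be illustrated with a minimal sketch: instead of training a classifier on thousands of labeled examples, a new input is compared against a tiny "support set" holding a single example per class, and the most similar one wins. The sketch below is a toy illustration, not DeepMind's actual method; it uses plain cosine similarity in place of a learned embedding network, and the feature vectors are hypothetical.

```python
import numpy as np

def embed(x):
    # Stand-in for a learned embedding network: here we just L2-normalize
    # the raw feature vector. In a real system this would be a trained
    # neural encoder mapping images (or audio) into a similarity space.
    v = np.asarray(x, dtype=float)
    return v / np.linalg.norm(v)

def one_shot_classify(query, support_set):
    """Label a query given a single example per class.

    support_set: list of (label, example) pairs, one example each.
    Returns the label whose lone example is most similar to the query.
    """
    q = embed(query)
    sims = {label: float(q @ embed(x)) for label, x in support_set}
    return max(sims, key=sims.get)

# Hypothetical feature vectors: exactly one example per class.
support = [("muffin", [1.0, 0.2, 0.1]), ("chihuahua", [0.1, 1.0, 0.9])]
print(one_shot_classify([0.9, 0.3, 0.2], support))  # prints "muffin"
```

The strength of the real approach lies in *learning* the embedding so that similarity in that space tracks class identity; with a good embedding, a single stored example per class is enough to classify new inputs.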



Augmented senses (four years later)

Exactly four years ago our favorite A.I. cheerleaders at IBM unveiled their list of innovations with the potential to change the way people work, live and interact over the next five years. And yes, these all have to do with cognitive computing.

This new generation of machines will learn, adapt, sense and begin to experience the world as it really is. This year’s predictions focus on one element of the new era, the ability of computers to mimic the human senses—in their own way, to see, smell, touch, taste and hear.

