From self-driving cars to smart surveillance cams, society is slowly learning to trust AI over human eyes. But although our new machine vision systems are tireless and ever-vigilant, they’re far from infallible. Just look at the toy turtle above. It looks like a turtle, right? Well, not to a neural network trained by Google to identify everyday objects. To Google’s AI it looks exactly like a rifle.
This 3D-printed turtle is an example of what’s known as an “adversarial image.” In the AI world, these are pictures engineered to trick machine vision software, incorporating special patterns that make AI systems flip out. Think of them as optical illusions for computers. You can make adversarial glasses that trick facial recognition systems...
from https://www.theverge.com/2017/11/2/16597276/google-ai-image-attacks-adversarial-turtle-rifle-3d-printed
from http://ifeeltechinc.blogspot.com/2017/11/googles-ai-thinks-this-turtle-looks.html
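For anyone curious how such "special patterns" are actually computed, below is a minimal sketch of the simplest attack of this kind, the fast gradient sign method (FGSM), written in PyTorch. The pretrained ResNet-50 model, the epsilon value, the random placeholder image, and the example class index are all illustrative assumptions, not details from the article; the 3D-printed turtle itself was built with a more elaborate, transformation-robust technique rather than plain FGSM.

```python
# Minimal FGSM sketch: nudge an image in the direction that increases the
# classifier's loss on its true label, so the prediction flips.
# Model choice, epsilon, and the example inputs are assumptions for illustration.
import torch
import torch.nn.functional as F
import torchvision.models as models

# A pretrained ImageNet classifier stands in for "a neural network trained
# to identify everyday objects" (requires torchvision >= 0.13 for the weights API).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()

def fgsm_perturb(image, true_label, epsilon=0.01):
    """Return an adversarially perturbed copy of `image` (a 1x3xHxW float tensor in [0, 1])."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), torch.tensor([true_label]))
    loss.backward()
    # Step *up* the loss gradient so confidence in the true label drops.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()

# Toy usage: a random "image" just to show the call. A real attack would use a
# correctly preprocessed photo and its ImageNet class index (35 is "mud turtle").
x = torch.rand(1, 3, 224, 224)
x_adv = fgsm_perturb(x, true_label=35)
print(model(x).argmax().item(), model(x_adv).argmax().item())
```

The key point the sketch illustrates is that the perturbation is not random noise: it is computed from the model's own gradients, which is why a change too small for a person to notice can still push the classifier toward a wildly different label.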