Researchers Create 'Psychedelic' Stickers That Confuse AI Image Recognition
Published on January 07, 2018 at 10:34PM
"Researchers at Google were able to create little stickers with 'psychedelic'-looking patterns on them that could trick computer AI image-classifying algorithms into mis-classifying images of objects that it would normally be able to recognize," writes amxcoder: The patterned stickers work by tricking the image recognition algorithm into focusing on, and studying, the little pattern on the small sticker -- and ignoring the rest of the image, including the actual object in the picture... The images on the stickers were created by the researchers using knowledge of features and shapes, patterns, and colors that the image recognition algorithms look for and focus on. These stickers were created so that the algorithm finds them 'more interesting' than the rest of the image and will focus most of it's attention on analyzing the pattern, while giving the rest of the image content a lower importance, thus ignoring it or confusing it. The technique "works in the real world, and can be disguised as an innocuous sticker," note the researchers -- describing them as "targeted adversarial image patches."
Read more of this story at Slashdot.