Each generation lays claim to a new type of technology: the Internet, mobile phones, game consoles, e-readers, robots. Children are growing up in a world where automated machines drive cars, patrol buildings, play sports, and have sex. So how do we teach kids to respect robots when their natural inclination is to touch, explore, and often mistreat them?

Researchers from Naver Labs, KAIST, and Seoul National University in South Korea developed Shelly, a turtle-like robot designed to teach wee ones not to abuse robots.

The tortoise-shaped toy is fun to play with, lighting up and dancing—until someone presses too hard or whacks it. Sensing danger, the bot’s head, arms, and legs retreat into its shell, hiding until it feels safe to come out again. The aim, according to IEEE Spectrum, is to teach children to restrain abusive behavior.

“Since the children perceived Shelly as [a] robot, we believe that they also learned that abusing other robots is wrong,” Naver Labs research intern Jason Choi told the magazine.

Large enough to occupy up to seven children under the age of 13, the droid uses LEDs to convey different emotional states—happy, sulky, angry, frightened. When scared, its lights turn off and its limbs pull back for 14 seconds. (When researchers shortened the hiding time to seven seconds, abuse actually increased; the brief concealment was seen as a reward for the best uppercut.)

Rather than testing how the hiding behavior affects children, the researchers focused on the idea that their toy is significantly less interesting when it’s not moving, glowing, and vibrating. Children therefore learn that to keep Shelly active, they must treat it with care.

The results are not surprising: Most toddlers and pre-teens were quick to understand the consequences of beating the robot—and often warned or scolded others who made that mistake.

“Previous research has found that robots that rely on verbal warnings or escaping from abusive situations are not effective in restraining abusive behaviors,” Choi said. “These kinds of reactions rather excite people’s curiosity and motivate them to abuse robots continuously.”

“In our research, we showed that stopping attractive interaction is a better solution than somehow reacting to the abusive behavior,” he continued. “We can use this result generally by adding this simple algorithm to other robots.”

Children aren’t the only ones who struggle to make sense of our future robot overlords. The NPR podcast Hidden Brain tackled the question of whether this new technology could fundamentally change how humans interact with each other in last year’s episode “Could You Kill a Robot?”
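Choi’s “simple algorithm” boils down to an interaction-suppression loop: sense a force spike, then withdraw and go dark for a fixed cooldown before resuming play. The Python sketch below is only an illustration of that idea, not Naver Labs’ actual code; the sensor and actuator methods (read_force, hide, emerge, and so on) are hypothetical placeholders, and only the 14-second hide duration comes from the article.

```python
import time

# Reported in the article: a 14-second hide curbed abuse, 7 seconds did not.
HIDE_SECONDS = 14.0
FORCE_THRESHOLD = 5.0  # illustrative threshold; not specified in the article


class ShellyLikeToy:
    """Hypothetical sketch of a hide-on-abuse loop for a Shelly-style robot."""

    def read_force(self) -> float:
        # Placeholder: return the largest force currently sensed on the shell.
        return 0.0

    def play(self) -> None:
        """Placeholder: light up, dance, vibrate -- the 'attractive interaction'."""

    def hide(self) -> None:
        """Placeholder: retract head and limbs, switch the LEDs off."""

    def emerge(self) -> None:
        """Placeholder: extend head and limbs, resume normal LED behavior."""

    def run(self) -> None:
        while True:
            if self.read_force() > FORCE_THRESHOLD:
                # Stop being interesting instead of scolding or fleeing.
                self.hide()
                time.sleep(HIDE_SECONDS)
                self.emerge()
            else:
                self.play()
            time.sleep(0.05)  # simple polling interval
```

The key design choice, per the researchers, is that the robot’s response to abuse is the absence of anything fun, rather than a reaction (a warning or an escape) that children might find entertaining in itself.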
