Expressive robotic lamp is a Pixar fantasy come to life
Despite the countless ways that robots can improve our everyday lives, many people remain skeptical about incorporating these mechanical creatures into their day-to-day activities, especially at home. It’s one thing to welcome a new kitchen appliance or upgraded electronic device into your house, but it’s quite another to embrace an autonomous, seemingly sentient robot. This problem is only exacerbated by engineers’ inability to agree on the best design to encourage interaction between robots and humans.
If you engineer a robot that mimics the size and stature of a human, then it will remind people of “The Terminator.” If you design a machine with big eyes and a digital smile, then it will be seen as little more than a child’s toy. If you create a robot that takes its form and function from a familiar animal, then people will be horrified when you attach a flamethrower to its back. So, what’s the right answer? How do we create a world where robots and humans engage effortlessly?
The answer to this conundrum might be found in a 1986 Pixar short film called "Luxo Jr." For those unfamiliar with the historic computer-animated film, it depicts two desk lamps as they interact with each other and the world around them. The short film has no dialogue, and the emotions and intentions of the lamps are conveyed through nonverbal behaviors. The posture, gestures, and gazes of the animated lamps not only convey story and motivations but also endear the lamps to every human who sees them. Apple seems to have taken direct inspiration from this classic movie with ELEGNT, a prototype smart lamp that interacts with its human users in expressive, lifelike ways.
It has been rumored for a while that Apple is considering entering the home robotics industry, and the release of a new paper and video on Apple’s Machine Learning Research site only seems to confirm this. The paper, titled “ELEGNT: Expressive and Functional Movement Design for Non-Anthropomorphic Robot,” was written by Yuhan Hu, Peide Huang, Mouli Sivapurapu, and Jian Zhang and was posted on arXiv. In an excerpt from the introduction, the team writes: “In this paper, we present the design and prototyping of a lamp-like robot that explores the interplay between functional and expressive objectives in movement design. Using a research-through-design methodology, we document the hardware design process, define expressive movement primitives, and outline a set of interaction scenario storyboards. We propose a framework that incorporates both functional and expressive utilities during movement generation, and implement the robot behavior sequences in different function- and social- oriented tasks.”
So, what makes the ELEGNT lamp so innovative? In a video released alongside the paper, the researchers compare expression-driven versus function-driven movements across six task scenarios. According to the researchers, “Our findings indicate that expression-driven movements significantly enhance user engagement and perceived robot qualities. This effect is especially pronounced in social-oriented tasks.”
The footage speaks for itself. In several split-screen tests, viewers can see how expressive movements give the robot an inquisitive, lifelike personality that wins over anyone watching.
In a recent TechCrunch article, author Brian Heater writes: “A split-screen video highlights the importance of expressive movements. Asked what the weather is like outside, one version simply states the answer. The other swivels its head to look out the window as if the view offers insight on which the robot can draw. It’s a simple example, but one that drives home how even small movements tap into our lizard brain’s pareidolia. The familiarity of expressive movements helps form a connection between human and object.”
Kyle Barr also reported on the ELEGNT lamp for Gizmodo. In his article, he writes: “The expressive robot is far more entertaining than one that merely does what you tell it to. In one highlight, the robot arm tried to extend to look at a note that its arm couldn’t reach, before shaking its head in dejection and apologizing with an AI-generated voice. In another part of the video, a user asked the lamp for the weather.”
There is even a clip of the expressive robot dancing along to ambient music while its human user goes about their everyday activities. It’s like a Pixar fantasy come to life.
No one knows whether this prototype will become a marketable device, what the price point would be, or how long it might take to reach your local Apple Store. It’s nearly impossible to say whether a device like this would improve my life in meaningful ways or just collect dust on one of my shelves, but in the immortal words of Philip J. Fry, “Shut up and take my money.”