Startup translates human emotion into quantifiable data for auto industry
“We don’t make a wearable,” said Ben Bland, CTO of Sensum. “We provide physiological and emotional data insights that can be captured with a wide array of sensor technology.”
What kind of data?
Ben said, “We can use data from heart rate monitors, technology that monitors palms, facial expression recognition, and voice analysis.”
“The skin on your hands has an evolutionary hangover. Our hands get clammy when we’re excited. We’re unaware of it, but sensors can pick up tiny changes through skin conductance,” Ben said.
This is the phenomenon – known as skin conductance, or galvanic skin response – whereby emotional arousal momentarily makes the skin a better conductor of electricity.
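As a rough illustration of the idea, here is a minimal sketch – not Sensum's actual engine, and with hypothetical sample values and thresholds – that flags moments of arousal when a skin-conductance reading jumps above its recent rolling baseline:

```python
# Minimal sketch: flag arousal events in a skin-conductance trace.
# Sample values (in microsiemens) and thresholds are hypothetical.

def arousal_events(samples, window=5, rise_threshold=0.3):
    """Return indices where conductance rises sharply above a rolling baseline."""
    events = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window  # mean of recent readings
        if samples[i] - baseline > rise_threshold:      # sudden rise = possible arousal
            events.append(i)
    return events

# A calm trace with a sudden spike starting at index 7 (made-up data).
trace = [2.0, 2.1, 2.0, 2.1, 2.0, 2.1, 2.0, 2.9, 2.8, 2.1]
print(arousal_events(trace))  # → [7, 8]
```

A production system would obviously do far more – filtering sensor noise, adapting the baseline per person, fusing the signal with heart rate and other channels – but the core idea is the same: the absolute value matters less than sudden change against the wearer's own baseline.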
Facial expression recognition works by mapping key points of a human face, with algorithms tracking changes across vast quantities of pixels and signalling when the person is smiling, laughing, and so on.
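To make the landmark idea concrete, here is a toy sketch – hypothetical coordinates and threshold, nothing like a real trained model – that classifies a "smile" from just four landmark points by comparing mouth width to face width:

```python
# Toy sketch: decide "smiling" from facial landmark points.
# Real systems fit dozens of learned landmarks per video frame;
# the coordinates and ratio threshold below are invented for illustration.

def is_smiling(landmarks, ratio_threshold=0.45):
    """A mouth that is wide relative to the face suggests a smile."""
    mouth_width = landmarks["mouth_right"][0] - landmarks["mouth_left"][0]
    face_width = landmarks["face_right"][0] - landmarks["face_left"][0]
    return mouth_width / face_width > ratio_threshold

neutral = {"mouth_left": (40, 70), "mouth_right": (60, 70),
           "face_left": (10, 50), "face_right": (90, 50)}
smiling = {"mouth_left": (30, 68), "mouth_right": (70, 68),
           "face_left": (10, 50), "face_right": (90, 50)}
print(is_smiling(neutral), is_smiling(smiling))  # → False True
```

In practice the landmarks themselves come from a trained detector running on each frame, and expression classification uses many such geometric and texture cues rather than a single ratio.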
Sensum’s market focus is now the automotive industry. “We were blown away by the sector’s response,” said Ben. “We’ve spoken to almost every top-tier auto manufacturer, and they get it.”
With autonomous cars, it’s an accepted principle that the technology needs to understand the humans using it – both the driver and the passengers.
“Audi, among others, has already built an empathic car,” says Ben, when I ask him how far along the automakers are. “They’re all thinking about this, but some are further along than others.”
It sounds like Sensum has a captive audience, and is poised to hit the ground running after building its technology for several years. Ben says the technology can be applied to cars in three ways: safety, comfort and entertainment.
“We’ve been perfecting our emotional processing engine since 2011,” says Ben. “We’re world leaders at measuring emotions in the wild – anywhere outside the lab, where people live and breathe.”
How large is Sensum now?
“We have fifteen full-time staff, currently,” said Ben.
Are you already at deal stage?
Ben said, “Yeah, we’ve been securing deals with car brands and tier one suppliers.”
Sensum helped build the Ford Buzz car, which flashes 200,000 external LED lights in sync with the driver’s mood. Ben explains, “It’s a cool way to show how a vehicle responds to the driver’s emotions in real time.”
Ford showed that driving a sports car ranks with life’s most exciting moments. I wouldn’t know, since I drive a Renault Megane.
Ben stresses, “The Ford Buzz car is just showing what’s possible.”
What other customers can you describe?
“We worked with Red Bull Media House on a project with extreme sports athletes, measuring their real-time emotional state to create visualisations in video and VR content.”
He said, “We’ve also worked with Unilever – providing a new research tool that responds to a consumer’s biometric state in the moment of product usage.”
Are there applications outside the vehicle? Will you ever sell a B-to-C product directly to consumers?
Ben said, “Not in the immediate future. We’re building a universal emotional processing engine. Thinking ahead, there are uses in the home, such as playlists that come on automatically based on your mood, or mood lighting – it’s about optimising services to someone’s current state. It could even detect a heart attack that’s about to happen, but that kind of medical use would be some way off for Sensum,” he warns.
Switching gears, Ben commented on the future of transport.
“The incentives to change are dramatic and obvious in the auto industry. Self-driving cars are vastly safer. If we let AI take over, we could get road deaths down to zero,” he said. “We take for granted how complex our cars are already, so we can expect to see what seems ‘futuristic’ come in very quickly.”
“Our customer base is very global, but I think Europe can be a hub for advanced mobility.”