Would you get nearly naked before your computer in search of a better-fitting dress or the perfect pair of jeans? A British company is hoping to convince millions of us to do just that — scan our bodies for commerce.
The next time you walk into a dressing room at a department store, there is a slim chance you will hear a British woman’s voice issuing from a gleaming white pod, telling you to be still: “Your scan is about to start.”
Tania Fauvel works for Bodymetrics, a company that scans people’s bodies in the name of fashion. “From their body scans we create a 3-D model, and from that we can actually try on clothing to see how it fits online,” she said.
Bodymetrics installs those gleaming white pods in dressing rooms. They’re equipped with lasers or special cameras. The pods create detailed digital 3-D models of your body. The company has plans to put a pod in the U.S. this month. Right now there’s one in Selfridges, a department store in London.
“The customers would come here. They would get undressed to their underwear,” explained Suran Goonatilake, CEO of Bodymetrics. “And it takes about five seconds and we get hundreds of measurements.”
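The article doesn’t say how a five-second scan becomes “hundreds of measurements,” but one common approach (a hypothetical sketch, not necessarily Bodymetrics’ actual method) is to slice the scanned 3-D point cloud at a given height and sum the perimeter of that slice. The function name and the synthetic cylinder below are inventions for illustration:

```python
import numpy as np

def circumference_at_height(points, height, band=0.01):
    """Estimate body circumference at a given height (meters) from a
    3-D point cloud: take a thin horizontal slice, order its points by
    angle around their centroid, and sum the resulting perimeter."""
    # Keep points within +/- band of the target height (z axis).
    slice_pts = points[np.abs(points[:, 2] - height) < band][:, :2]
    if len(slice_pts) < 3:
        return 0.0
    center = slice_pts.mean(axis=0)
    # Sort by angle so consecutive points trace the body outline.
    angles = np.arctan2(slice_pts[:, 1] - center[1],
                        slice_pts[:, 0] - center[0])
    ring = slice_pts[np.argsort(angles)]
    # Perimeter: distances between consecutive points, closing the loop.
    diffs = np.diff(np.vstack([ring, ring[:1]]), axis=0)
    return float(np.linalg.norm(diffs, axis=1).sum())

# Synthetic check: a cylinder of radius 0.15 m standing in for a torso.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
heights = np.linspace(0.0, 1.8, 19)          # a 1.8 m tall "body"
tt, hh = np.meshgrid(theta, heights)
cloud = np.column_stack([0.15 * np.cos(tt).ravel(),
                         0.15 * np.sin(tt).ravel(),
                         hh.ravel()])
waist = circumference_at_height(cloud, height=1.0, band=0.05)
# For this cylinder the estimate lands near 2 * pi * 0.15 ~ 0.94 m.
```

A real scanner would repeat something like this at the waist, hips, inseam and dozens of other landmarks to build its measurement set.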
Right now, the pod just delivers suggestions of jeans that are likely to fit you. Eventually, Bodymetrics wants to be able to display what those jeans would look like on you — digitally.
“This isn’t the first time that we’ve seen technology where the idea is to match clothes to some 3-D rendering of an image,” said Sucharita Mulpuru, a retail analyst at Forrester.
She says pods like this have been around for at least a decade, and often they are little more than a marketing gimmick. What makes this moment special is that this technology may be on the verge of leaping out of high-tech pods in department store dressing rooms into our living rooms.
The same cameras Bodymetrics uses in its high-tech scanning pod are also built into the Kinect — Microsoft’s hands-free video game controller. These sensors are already in more than 20 million homes worldwide.
Mulpuru says the real holy grail for companies like Bodymetrics is to let you scan your body at home. After all, in a store, you can actually try on the jeans and see if they fit, but when you are shopping online at home, that’s not possible.
“Return rates in online retail run between 20 and 30 percent,” Mulpuru said. “If they could cut that in half, that would be very lucrative and it would be less frustrating for the customer.”
But teaching computers to see and model the real world in three dimensions won’t just change fitting rooms or online retail. It could transform everything from surgery to architecture. It could create computers that watch us and model the world or even help us navigate it.
David Kim, a Microsoft researcher in Cambridge, U.K., wants to use the Microsoft Kinect to build computers that observe us — watch us, maybe without us even realizing. “The computer will just pick up my context — it will know what I am intending to do,” he said.
Right now, he’s using the Kinect to model humans and their environment.
A few months ago, Kim showed me a project Microsoft calls KinectFusion. He picked up a regular Kinect sensor. And as a colleague of his explained what he was doing, Kim walked around holding the Kinect.
On a television screen, I could see the device building a detailed 3-D model of the room — and everything in it, including me. It captured everything down to the shape of my ears and the wrinkles on my shirt.
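The article doesn’t explain how the device does this, but the first step of any Kinect-style reconstruction is well established: the sensor produces a depth image, and each pixel is back-projected into a 3-D point using the pinhole camera model. A minimal sketch (the intrinsics fx, fy, cx, cy here are made-up numbers, not the Kinect’s real calibration):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into camera-space 3-D points
    with the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]   # drop invalid (zero-depth) pixels

# Toy example: a flat wall 2 m away seen in a tiny 4x4 depth image.
depth = np.full((4, 4), 2.0)
cloud = depth_to_point_cloud(depth, fx=5.0, fy=5.0, cx=2.0, cy=2.0)
```

KinectFusion’s hard part is what comes next, fusing thousands of these per-frame clouds into one model as the sensor moves; this sketch covers only the per-pixel geometry.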
Then David pressed a button, and suddenly thousands of virtual balls shot at my 3-D image on the screen. The balls were not real — they were just pixels on the screen programmed to behave like balls. And they did. They bounced off my head, rolled off the surface of the table, collected in the bottom of a coffee mug.
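Those bouncing balls are standard game physics rather than anything exotic: each virtual ball falls under gravity, and on contact with the reconstructed surface its velocity is reflected and scaled down to lose energy. A toy version of that update loop, with a flat floor standing in for the scanned room (all constants are invented):

```python
def simulate_ball(y, vy, floor=0.0, g=-9.8, restitution=0.7,
                  dt=0.01, steps=200):
    """Integrate one ball's height under gravity, bouncing off a floor.
    Each bounce keeps only `restitution` of the ball's speed."""
    for _ in range(steps):
        vy += g * dt          # gravity accelerates the ball downward
        y += vy * dt          # move the ball
        if y < floor:         # hit the surface: reflect and lose energy
            y = floor
            vy = -vy * restitution
    return y

# Drop a ball from 1 m; after 2 simulated seconds it has lost most
# of its energy and sits somewhere between the floor and its start.
final = simulate_ball(y=1.0, vy=0.0)
```

In KinectFusion the “floor” is replaced by the live 3-D model of the room (and of me), which is why the balls could roll off a table and collect in a mug.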
Dozens of companies, including Bodymetrics, want to use the same technology to drape your body — not with balls in a video game, but with virtual clothes online.
Tania Fauvel of Bodymetrics is convinced this will soon transform how millions of us shop. But I can’t help wondering if people are really ready to stand in front of their computers or a Kinect naked.
Fauvel said it isn’t necessary to bare it all. “So long as you are wearing tight-fitting clothing, that’s fine,” she said.
But capturing embarrassingly accurate images of your body — as awkward as it may be — is just the first step in creating a digital dressing room that really works.
The second may be more complicated: creating digital models of clothes.
“It is very difficult to model cloth in three dimensions,” says Susan Ashdown, a professor at Cornell University who studies body mapping and the clothing industry. She says a silk blouse and cotton blouse with the exact same cut will behave differently when you try them on. A digital dressing room will have to account for that.
“When you take the whole range of human sizes, shapes and postures and the whole wide variety of types of cloth and how they interact on the body, it’s mind-boggling,” she says.
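One way researchers approximate this is the mass-spring model: cloth as a grid of point masses joined by springs, where fabrics differ mainly in stiffness and weight. The sketch below reduces that to a single hanging chain, a 1-D slice of such a grid; every parameter is an illustrative guess, not a measured fabric property:

```python
import numpy as np

def hang_chain(n=10, k=200.0, m=0.01, rest=0.05,
               steps=5000, dt=0.001, damp=0.5):
    """A vertical chain of point masses joined by springs, hanging from
    a pinned top node. y[i] is node i's distance below the pin (meters);
    larger k means a stiffer 'fabric' that stretches less."""
    y = rest * np.arange(n, dtype=float)   # start at rest length
    v = np.zeros(n)
    for _ in range(steps):
        e = np.diff(y) - rest              # extension of each spring
        f = np.full(n, 9.8 * m)            # gravity (downward positive)
        f[:-1] += k * e                    # stretched spring pulls upper node down
        f[1:] -= k * e                     # ...and the lower node up
        f -= damp * v                      # damping so the chain settles
        v += (f / m) * dt
        v[0] = 0.0                         # top node stays pinned
        y += v * dt
    return y

soft = hang_chain(k=50.0)     # a drapey, "silk-like" material
stiff = hang_chain(k=500.0)   # a firmer, "cotton-like" material
# The softer chain sags farther under its own weight.
```

Same “cut” (same rest lengths), different material constants, visibly different drape, which is exactly the distinction Ashdown describes. A real garment simulator does this over a full 3-D mesh with bending and shearing springs as well, which is where the computation gets expensive.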
But computer-aided design is beginning to get there.
Julia Shaw at OptiTex creates digital models of clothes for department stores like Target and Kohl’s. These computer designs let stores create or tweak new styles without actually stitching prototype garments.
OptiTex can model these digital garments on images of human bodies to illustrate what the clothes would look like.
“You can type in and customize the models to meet anybody’s personal body specifications,” Shaw said.
But doing this live online — with real people — would require a huge amount of computing power.
“It’s cutting edge,” Shaw said. “We are still on the cusp of having enough computing power to make this work online live.”
Shaw believes all the pieces are coming together, but making it really work would require server farms in the cloud powered by the same graphics chips that now run some of the world’s fastest supercomputers.
And that’s a pretty big investment for any clothing company to make just to help you find the perfect pair of jeans.