Snap wants machine learning experts to make more animated messages
Snap is calling on outside artificial intelligence experts to create silly effects for its popular Snapchat messaging service.
The company debuted a new service on Thursday that data scientists can use to import machine learning models into Snapchat to create more compelling Lenses, or special effects that tap augmented reality technology. That technology superimposes digital animations on videos, such as windshield wipers on an individual’s glasses, among other feats.
Eitan Pilipski, senior vice president of Snap’s camera platform, said the new service, Snap ML, is intended to build a bridge between machine learning engineers, a typically serious bunch, and Snap’s outside community of creative animators and content creators.
The idea is that A.I. engineers will train deep learning models to do things like recognize hand gestures, Pilipski explained. They can then import those hand-gesture models and create features using augmented reality.
During a videoconference call, for example, Pilipski used his fingers to make a peace sign, which caused him to digitally sprout long hair, glasses, and a tie-dye headband, as if he were in a Cheech & Chong movie.
But Pilipski, who previously spent 16 years as a technology executive at Qualcomm, also pitched Snap ML as more than fun. He described it as a tool for companies to advertise their services to Snapchat users.
Wannaby, a startup that makes augmented reality apps for fashion companies like Gucci, has used Snap ML to create a Lens that superimposes digital shoes on people’s feet when they point their smartphone’s camera at them. Wannaby’s Lens can seemingly change the materials or colors of the shoes, so shoppers can see how different options look.
For Wannaby CEO Sergey Arkhangelskiy, having access to Snapchat’s big user base is more important than Snap’s underlying technology. Snap said it had 229 million daily active users in the first quarter of 2020. Arkhangelskiy said his company’s own augmented reality app is still “better” than Snapchat’s because he has more control over the underlying technology, letting him create more powerful features.
Pilipski conceded that Snap has put limits on the size of Lenses outsiders can create, but said that’s because the technology must work for millions of users who have different kinds of smartphones.
Still, Arkhangelskiy said Snap’s new ML service and Lens tools let him make more compelling augmented reality features for his customers than Instagram, a Snap rival. Instagram’s technology is “mostly about faces,” said Arkhangelskiy, and the tools it offers are generally designed for creating filters on people’s visages, whereas Snap’s software can accommodate a variety of body parts.
Instagram, which is owned by Facebook, has also been courting businesses and has debuted features to help companies sell their products. Google and Apple have created similar technology that others can use to build augmented reality apps.
Original Post:
By Jonathan Vanian
https://fortune.com/2020/06/11/snap-machine-learning-lens-studio/