Snap’s annual Lens Fest typically previews upcoming features and tools. This year, AI-powered tools dominated the announcements.
At this year’s event, the company announced a beta program for Lens Studio 5.0, which includes new generative AI lens development tools and an API that allows creators to integrate ChatGPT, “unlocking new types of learning, conversational and creative experiences for Snapchatters.”
Lens Studio 5.0
Lens Studio is the toolkit that developers use to create lenses and filters – the augmented reality effects used by everyday Snapchatters. Lens Studio is also the backbone of Camera Kit, which Snap partners use to develop their own AR projects that work in other web or app-based experiences.
The upcoming fifth generation of the platform uses AI in two main ways: to help create content and to create new types of experiences.
Generative AI Lenses
Generative AI takes a prompt and returns a product. Through a partnership with OpenAI, the creator of DALL-E and ChatGPT, Lens Studio 5.0 pairs generative AI with Snap’s face mesh tools. The combined tool recognizes a face, maps it in three dimensions, and builds effects that react to a user’s face in real time.
Expressive lenses built around face meshes are some of the most popular on Snapchat. Making them faster and easier to build is a boon for developers, and it also points toward more exciting experiences for platform users.
Snap also partnered with 3D AI generator Meshy, allowing users to generate materials and textures with text prompts. The tool alleviates what Trevor Stephenson, Snap’s senior manager of software engineering, called “the most time-consuming and challenging aspects of creating high-fidelity AR experiences.”
Finally, Stephenson teased tools that would allow users to change camera features based on simple prompts, “like a director.” In an accompanying video, a woman searches for “moody vibes” and the app activates a fuzzy black-and-white lens.
The first lenses made with generative AI became available earlier this year, but the toolset used to make them had not previously been released to the general developer community.
Most readers have likely interacted with a large language model, an AI that responds to a prompt not with a finished product but with ideas of its own. Lens Studio 5.0 includes a “Remote API” that lets developers integrate ChatGPT into their lenses.
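Snap did not publish technical details of the Remote API at the event, but a ChatGPT-backed lens ultimately boils down to a chat-completion request. Below is a minimal sketch, assuming OpenAI’s public chat completions endpoint; the lens-side wiring and the prompt contents are illustrative, not Snap’s actual implementation.

```python
import json

# OpenAI's public chat completions endpoint (the lens's Remote API
# layer would be what actually performs the authenticated HTTP call).
OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(user_prompt: str) -> dict:
    """Assemble the JSON body for a single-turn ChatGPT call.

    The system prompt here is a hypothetical example of how a lens
    might frame the conversation for Snapchatters.
    """
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system",
             "content": "You are a playful assistant inside a Snapchat lens."},
            {"role": "user", "content": user_prompt},
        ],
    }

if __name__ == "__main__":
    body = build_chat_request("Suggest a caption for this selfie")
    print(json.dumps(body, indent=2))
```

In practice a developer would never see this plumbing directly; the point of the Remote API is to let a lens fire off a request like this and use the model’s reply inside the AR experience.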
While many readers are already familiar with this technology, social media apps like Snapchat are how many people first encounter emerging technologies such as artificial intelligence, making them a significant entry point for the less technologically inclined.
AI elsewhere in Snap
Snap isn’t just using AI to create content. Head of education, learning, and training Stacey Long Genovese also showcased an integration with Mendable, which Snap used to create “a generative AI-powered tool that ingests all of our learning content to create an intelligent browsing experience.”
Snap AI also suggests lenses based on a user’s environment or memories. At the Snap Partners Summit in the spring, the company announced My AI – a customizable AI companion. However, this feature is still only supported on iOS devices and has received few exciting updates since its release.
Snap has a history of using the seemingly old to embrace the new. The self-proclaimed “camera company” introduced countless people to the world of augmented reality.
Now, as users and developers alike are tempted to take the magic of AR for granted, Snap is slowly welcoming people into the future of artificial intelligence as both a production tool and a social companion. AR, after all, goes hand in hand with AI, given its reliance on advanced computer vision.