Snap launches generative AI lenses that create AR effects based on text


Snap’s annual Lens Fest typically previews upcoming features and tools. This year, many of the announcements center on AI-powered tools.

At this year’s event, the company announced a beta program for Lens Studio 5.0, which includes new generative AI lens development tools and an API that allows creators to integrate ChatGPT, “unlocking new types of learning, conversational and creative experiences for Snapchatters.”
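Snap hasn’t published details of the ChatGPT integration, but OpenAI’s public chat format suggests what a conversational lens request might look like. A minimal sketch, assuming OpenAI’s documented message structure; the function name and persona below are hypothetical and are not part of any Snap API:

```python
# Hypothetical sketch of assembling a ChatGPT-style request for a
# conversational lens. Only the messages/roles structure follows OpenAI's
# documented chat convention; everything else is invented for illustration.

def build_lens_chat_request(user_text: str, persona: str = "a friendly AR guide") -> dict:
    """Assemble an OpenAI-style chat payload for a conversational lens."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            # The system message frames the assistant's role inside the lens.
            {"role": "system", "content": f"You are {persona} inside a Snapchat lens."},
            # The user message carries what the Snapchatter typed or said.
            {"role": "user", "content": user_text},
        ],
    }
```

In a real lens, the payload would be sent to the model and the reply rendered inside the AR experience; how Snap brokers that call is not public.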

Lens Studio 5.0

Lens Studio is the toolkit that developers use to create lenses and filters – the augmented reality effects used by everyday Snapchatters. Lens Studio is also the backbone of Camera Kit, which Snap partners use to develop their own AR projects that work in other web or app-based experiences.

The upcoming fifth generation of the platform uses AI in two main ways: to help create content and to create new types of experiences.



Generative AI Lenses

Generative AI takes a prompt and returns a product. Through a partnership with OpenAI, the creator of DALL-E and ChatGPT, Lens Studio 5.0 pairs generative AI with Snap’s face mesh tools. The combined tool recognizes a face, maps it in three dimensions, and builds effects that react to a user’s face in real time.

The 2023 Snap Lens Fest previewed AI-assisted face mesh generation in Lens Studio 5.0 through an integration with OpenAI. | Image: Snap

Expressive lenses built around face meshes are among the most popular on Snapchat. Making them faster and easier to build is a boon for developers, and it also points toward more exciting experiences for platform users.
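The detect, map, and react flow described above can be sketched as a per-frame loop. Everything here is a hypothetical stand-in rather than Lens Studio’s actual scripting API, but it illustrates the structure: each camera frame is checked for a face, a 3D mesh is fitted, and the generated effect is driven by that mesh; no detected face means the frame passes through untouched.

```python
# Illustrative per-frame pipeline for a face-mesh-driven AR effect.
# All names are hypothetical -- Snap's real Lens Studio APIs differ.

from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class FaceMesh:
    vertices: List[Vec3]  # 3D landmark positions fitted to the face

def run_frame(
    frame,  # raw camera frame (opaque here)
    detect_face: Callable[[object], Optional[object]],   # face detector
    fit_mesh: Callable[[object], FaceMesh],              # 2D face -> 3D mesh
    apply_effect: Callable[[object, FaceMesh], object],  # generated AR effect
):
    """One iteration of the real-time loop: no face means no effect."""
    face = detect_face(frame)
    if face is None:
        return frame  # pass the frame through untouched
    mesh = fit_mesh(face)
    return apply_effect(frame, mesh)
```

Injecting the three stages as functions keeps the sketch agnostic about which detector, mesh fitter, or generative effect is plugged in.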

Snap also partnered with 3D AI generator Meshy, allowing users to generate materials and textures with text prompts. The tool alleviates what Trevor Stephenson, Snap’s senior manager of software engineering, called “the most time-consuming and challenging aspects of creating high-fidelity AR experiences.”

Snap demonstrated an AI-assisted texture generator in Lens Studio 5.0 that speeds up the lens production process. | Image: Snap

Finally, Stephenson teased tools that would allow users to change camera features based on simple prompts, “like a director.” In an accompanying video, a woman searches for “moody vibes” and the app activates a fuzzy black-and-white lens.
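A toy sketch of what prompt-driven camera control could reduce to: mapping a mood phrase to concrete lens parameters. The preset table and parameter names are invented for illustration (the real feature presumably interprets free-form text with a language model); the grayscale step uses the standard BT.601 luma weights a black-and-white lens might apply per pixel.

```python
# Hypothetical mapping from a mood prompt to lens settings, plus the
# per-pixel grayscale conversion a black-and-white lens would perform.
# Preset names and parameters are invented for illustration.

def settings_for_prompt(prompt: str) -> dict:
    """Return hypothetical lens parameters for a mood-style prompt."""
    presets = {
        "moody vibes": {"grayscale": True, "blur_radius": 2, "contrast": 1.2},
        "sunny day":   {"grayscale": False, "blur_radius": 0, "contrast": 1.0},
    }
    # Fall back to neutral settings for unknown prompts.
    neutral = {"grayscale": False, "blur_radius": 0, "contrast": 1.0}
    return presets.get(prompt.lower().strip(), neutral)

def to_grayscale(pixel):
    """Luma-weighted grayscale (ITU-R BT.601 coefficients)."""
    r, g, b = pixel
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return (y, y, y)
```

The lookup table stands in for the language-model step; the point is only that a vague phrase like “moody vibes” ultimately has to resolve to concrete, numeric camera parameters.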

The first lenses made with generative AI became available earlier this year, but the toolset used to make them had not previously been released to the general developer community.


Snap also partnered with Mendable to build “a generative AI-powered tool that ingests all of our learning content to create an intelligent browsing experience.”

Snap AI also suggests lenses based on a user’s environment or memories. At the Snap Partners Summit in the spring, the company announced My AI – a customizable AI companion. However, this feature is still only supported on iOS devices and has received few exciting updates since its release.

Snap has a history of using the seemingly old to embrace the new. The self-proclaimed “camera company” introduced countless people to the world of augmented reality.

Now, as many users and developers alike are tempted to take the magic of AR for granted, Snap is slowly welcoming people into the future of artificial intelligence as a production tool and social companion. AR and AI go hand in hand anyway, given AR’s heavy reliance on advanced computer vision.
