The future of brand content in the metaverse might involve generative AI

This avatar creator wants to make branded metaverse assets more accessible using AI.


Want to feel old? Listen up: Dall-E, the AI model that generates images from text descriptions, was released by OpenAI all the way back in…January 2021. (Who can remember the first Dall-E picture they saw? 🙋‍♀️)

In the wake of Bing’s chatbot Sydney attempting to break up NYT reporter Kevin Roose’s marriage, the hype around image-generating AI Dall-E may feel like a thing of the past.

But while Dall-E (which must be pronounced in the Wall-E voice for full effect) may have passed its spotlight off to Sydney, its application for retail brands remains intact. In the years since Dall-E’s release, we’ve seen it used by everyone from Heinz (ketchup in space, anyone?) to Nestlé (whose French brand La Laitière used Dall-E to turn an iconic Vermeer painting into a scene featuring the yogurt brand’s own milkmaid).

The AI-generated image opportunities for brands don’t stop with traditional marketing campaigns. Earlier this month, virtual avatar creator Ready Player Me (whose brand partners have included L’Oréal, Adidas, and Calvin Klein, and whose avatars can be used across multiple virtual worlds) made moves to bring generative AI to brands in the metaverse, and to change the face of digital fashion.

Ready Player Me’s latest feature (still in experimentation mode) uses Dall-E to customize and stylize clothing items for avatars used across virtual worlds.

  • The Dall-E integration is available to the public through Ready Player Me Labs, which serves as a testing ground for new features before they’re released to users.

So how does it work? In simple terms, Ready Player Me sends a user’s text prompt to Dall-E, receives the AI-generated image, and applies it to 3D wearables.
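The three-step flow above can be sketched in code. Everything here is a hypothetical illustration: the function names, the texture format, and the wearable IDs are assumptions for the sake of a runnable sketch, not Ready Player Me's actual API.

```python
# Hypothetical sketch of the prompt -> image -> wearable flow described above.
# These stand-in functions are assumptions, not Ready Player Me's real API.

def generate_image(prompt: str) -> bytes:
    """Stand-in for a call to an image-generation model (e.g., Dall-E).
    Returns placeholder bytes so the flow is runnable without a network call."""
    return f"<image for: {prompt}>".encode()

def apply_texture(wearable_id: str, texture: bytes) -> dict:
    """Stand-in for applying the generated image as a texture on a 3D wearable."""
    return {"wearable": wearable_id, "texture_bytes": len(texture), "status": "applied"}

def stylize_wearable(wearable_id: str, prompt: str) -> dict:
    # 1. Send the user's text prompt to the image model.
    texture = generate_image(prompt)
    # 2. Receive the generated image and apply it to the avatar's 3D wearable.
    return apply_texture(wearable_id, texture)

result = stylize_wearable("jacket-01", "dense hot dog pattern on a black background")
print(result["status"])  # the wearable now carries the AI-generated texture
```

In a real integration, `generate_image` would call the image-generation service and `apply_texture` would hand the result to the 3D asset pipeline; the point of the sketch is just how thin the glue between the two steps is.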

Stay up to date on the retail industry

All the news and insights retail pros need to know, all in one newsletter. Join over 180,000 retail professionals by subscribing today.

AI-generated assets are the future of virtual worlds, said Ready Player Me CEO and co-founder Timmu Toke, who demonstrated the platform’s latest capability by asking for a “dense hot dog pattern on a black background” for his avatar’s jacket.

That’s important, because the market for digital assets (for video games and other virtual worlds) is on the rise: According to McKinsey & Co, global spending on virtual goods reached $110 billion in 2021, and is expected to reach at least $135 billion by 2024. And around 30% of that is attributable to virtual fashion.

Digital co-creation: There has been a certain amount of controversy surrounding AI-generated art because the models are trained on images from around the internet (potentially including other artists’ work). But Toke says fashion brands and others who are leaning into the decentralized nature of the metaverse aren’t worried about the possibility of relinquishing sole control over digital designs.

In fact, Toke says the Dall-E integration plays into the trend of co-creation that many brands are moving into. There’s a future in which brands build models allowing users to create assets in that brand’s style, and in which designers create base assets for users to customize, Toke said.

What’s more, AI will make virtual asset generation faster and cheaper, he added.

In part because of that increased accessibility, Toke said it won’t be long before metaverse first movers (higher-budget luxury brands) are introducing AI-generated content as part of their virtual presence.

“In five years, I think it’ll be like…Will there be any non-AI generated assets?”—MA
