**Zero-shot Example-based Gesture Generation from Speech**
This proof-of-concept virtual assistant combines GPT-4, Unreal Engine’s MetaHuman, the Eleven Labs API, and Ubisoft’s ZEGGS model. GPT-4 generates the reply text along with emotion tags, which drive the assistant’s speech sentiment, gesture style, and facial expressions. Speech is synthesized via the Eleven Labs API, gestures are generated by the ZEGGS model (developed by Ubisoft La Forge), and the result is animated on a MetaHuman character in Unreal Engine. The system still has some glitches and considerable latency, but it offers a glimpse of the potential future capabilities of AI technology.
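The glue between these components can be sketched as follows. This is a minimal, hypothetical sketch: it assumes GPT-4 is prompted to prefix each reply with an emotion tag like `[Happy]`, and the tag names, the `EMOTION_TO_ZEGGS_STYLE` mapping, and the style-clip file names are all illustrative assumptions, not taken from the actual project.

```python
import re

# Assumed mapping from GPT-4 emotion tags to ZEGGS style-example clips;
# the tag set and clip paths are placeholders, not the project's real assets.
EMOTION_TO_ZEGGS_STYLE = {
    "happy": "styles/happy_example.bvh",
    "sad": "styles/sad_example.bvh",
    "angry": "styles/angry_example.bvh",
    "neutral": "styles/neutral_example.bvh",
}

def parse_tagged_reply(reply: str) -> tuple[str, str]:
    """Split a GPT-4 reply of the assumed form '[Emotion] text...' into
    (emotion, text); fall back to 'neutral' when no tag is present."""
    match = re.match(r"\s*\[(\w+)\]\s*(.*)", reply, re.DOTALL)
    if match:
        return match.group(1).lower(), match.group(2).strip()
    return "neutral", reply.strip()

def pick_gesture_style(emotion: str) -> str:
    """Choose the ZEGGS style example that conditions gesture generation."""
    return EMOTION_TO_ZEGGS_STYLE.get(emotion, EMOTION_TO_ZEGGS_STYLE["neutral"])

# The text would go to the Eleven Labs API for speech synthesis, and the
# style clip would condition ZEGGS gesture generation for the MetaHuman.
emotion, text = parse_tagged_reply("[Happy] Nice to meet you!")
style = pick_gesture_style(emotion)
```

In this sketch the emotion tag does double duty: it selects both a text-to-speech sentiment and a ZEGGS style example, which is how a single GPT-4 output can keep voice, gesture, and expression consistent.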