Meet MAVE: the virtual K-pop stars created with Unreal Engine and MetaHuman

Find out how Metaverse Entertainment used MetaHuman and Unreal Engine to create a natural, believable, and charming virtual K-pop band, and in the process produced IP content in various forms.

Metaverse Entertainment is a company that combines the technology of Netmarble F&C with the sensibility of Kakao Entertainment. Founded in 2021, it brings together talented people from various industries, including games, film, and entertainment. Through their experience and collaboration, the group MAVE: was formed.

MAVE: is a meta K-pop band. We aim to develop attractive characters with all-new appearances, to work across various forms of content and media, and to provide fans with interactive experiences.

Considering visual quality comparable to offline rendering and the efficiency of making various types of content from a single source, Unreal Engine was our best option. It allowed us to produce a high-quality music video in a short period of time, and it enabled us to create the large amount of content we needed for social media channels without compromising quality. This built a strong foundation for MAVE:'s transmedia storytelling through constant communication with fans.

A four-member K-pop band requires each member to have a unique personality and an attractive appearance. Beyond appearance, it is important that the members can express their characteristics through detailed facial expressions reacting to various situations. Even a single facial expression involves repetitive, time-consuming tasks such as modeling and rigging, so MetaHuman technology was the perfect choice.

MetaHuman is built on decades of accumulated digital human creation technology. Its facial rig lets us easily create detailed facial expressions for a given character and share animations across multiple characters. MetaHuman's high degree of compatibility with external tools also lets us exchange animation and mesh data with them, which greatly reduced production time. Rig Logic, the runtime system that evaluates MetaHuman facial rigs, is documented in a white paper that helped us develop the functions we required effectively.

The character production cycle consists of character planning, modeling, facial expression creation and rigging, hair production, body correction, and more. Character planning was conducted in close collaboration with experts from Kakao Entertainment, who have extensive experience planning successful K-pop bands. A K-pop band member is a combination of an attractive basic appearance and a matching style. In traditional K-pop bands, members are selected from an existing trainee pool and their final looks are finished with make-up and styling. Virtual bands differ fundamentally in that new members have to be created from scratch, including their appearance, expressions, movement, and tone.

We built a pipeline that uses a GAN to automatically generate candidate images and to modify or combine latent eigenvectors, providing an environment as close as possible to the traditional casting process. This let the entertainment experts select a character whose appearance ideally matched the formula of a successful K-pop band and then modify that character as intended, applying their knowledge to the fullest extent.
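To make the eigenvector step concrete, here is a minimal sketch of latent-vector mixing in a GAN pipeline. Everything in it is an assumption for illustration, not Metaverse Entertainment's actual tool: the generator call, the 512-dimensional latent space, and the attribute direction are all hypothetical stand-ins.

    # A minimal sketch of latent-vector mixing for candidate face generation.
    # Assumed (not from the source): a pretrained GAN exposed as
    # generator(latent) -> image, a 512-dim latent space, and precomputed
    # semantic direction vectors the planners can adjust.
    import numpy as np

    LATENT_DIM = 512
    rng = np.random.default_rng(seed=7)

    def sample_candidates(n):
        """Draw n random latent codes, one per candidate face."""
        return rng.standard_normal((n, LATENT_DIM))

    def combine(latents, weights):
        """Blend several candidate faces via a weighted average of latents."""
        w = np.asarray(weights, dtype=float)
        w /= w.sum()                      # normalize so the blend stays on-distribution
        return (w[:, None] * latents).sum(axis=0)

    def adjust(latent, direction, amount):
        """Nudge a face along a semantic direction (e.g. jaw width)."""
        return latent + amount * direction

    # Planners pick two promising candidates, blend them 70/30, then push the
    # result slightly along a hypothetical attribute axis.
    candidates = sample_candidates(8)
    picked = combine(candidates[[0, 3]], weights=[0.7, 0.3])
    jaw_direction = rng.standard_normal(LATENT_DIM)   # stand-in for a learned axis
    final_latent = adjust(picked, jaw_direction, amount=0.4)
    # image = generator(final_latent)    # hypothetical generator call

The appeal of this kind of setup is that the experts work with faces, not parameters: blending and nudging latents mimics comparing and restyling real trainees, which matches the "as close as possible to the traditional process" goal described above.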
In the facial expression creation and modification stage, we analyzed the model and developed our own tool that automatically creates about 800 facial expressions, using information about the location and size of each facial region and the flow of the muscles (a simplified sketch of this compositional idea follows below). We also added a function for customizing unique expressions that reflect a character's personality. Because rigging was needed for facial expressions that did not exist before, an Unreal Engine Control Rig was automatically created and configured to suit each character. Since the Mesh to MetaHuman plugin had not yet been released, we developed our own tools and features; that experience helped us greatly in modifying the algorithm as needed and building an automated pipeline.

We used Maya's XGen to create the characters' hair. Through Unreal Engine's groom hair rendering, we obtained hair quality comparable to that of offline renderers. We created a tool that turns groom-based hair into cards, to be used according to performance requirements, and optimized the workflow so that cards are generated automatically without binding assets. We also automated the body correction stage, modifying the shape according to the current pose using dozens of corrective shapes (see the second sketch below). …
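The expression-generation tool can be pictured as composing a library of expressions from per-region shape deltas. The sketch below is a hedged illustration under that assumption: the region names, random deltas, and vertex counts are all made up, and the real tool derives its inputs from facial-region analysis and muscle flow rather than random data.

    # A minimal sketch of batch facial-expression generation from per-region
    # blendshape deltas. Regions, deltas, and mesh sizes are hypothetical.
    import itertools
    import numpy as np

    N_VERTS = 5000                       # face-mesh vertex count (made up)
    neutral = np.zeros((N_VERTS, 3))     # neutral pose; real data comes from the model

    # One delta per facial region/action, e.g. brow raise, jaw open, smile.
    regions = {
        "brow_raise": np.random.randn(N_VERTS, 3) * 0.01,
        "jaw_open":   np.random.randn(N_VERTS, 3) * 0.01,
        "smile":      np.random.randn(N_VERTS, 3) * 0.01,
        "squint":     np.random.randn(N_VERTS, 3) * 0.01,
    }

    def make_expression(weights):
        """Compose an expression mesh: neutral + weighted sum of region deltas."""
        out = neutral.copy()
        for name, w in weights.items():
            out += w * regions[name]
        return out

    # Enumerate weight combinations per region to build an expression library.
    levels = [0.0, 0.5, 1.0]
    library = []
    for combo in itertools.product(levels, repeat=len(regions)):
        library.append(make_expression(dict(zip(regions, combo))))
    # 3 levels ** 4 regions = 81 expressions here; more regions and finer
    # levels scale the same scheme toward the ~800 expressions mentioned above.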
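The body-correction stage reads like classic pose-space deformation: corrective shapes sculpted at key poses and blended in as a function of the current pose. A minimal sketch under that assumption follows; the single elbow-angle driver, the linear falloff, and the two shapes are hypothetical simplifications of a setup the source says uses dozens of corrective shapes.

    # A minimal sketch of pose-driven corrective shapes (pose-space deformation).
    # The joint, angles, and deltas are hypothetical illustrations.
    import numpy as np

    N_VERTS = 8000
    base_mesh = np.zeros((N_VERTS, 3))          # skinned mesh before correction

    # Each corrective: a vertex delta plus the elbow angle it was sculpted at.
    correctives = [
        {"angle": 90.0,  "delta": np.random.randn(N_VERTS, 3) * 0.005},
        {"angle": 140.0, "delta": np.random.randn(N_VERTS, 3) * 0.005},
    ]

    def weight_for(angle, target, falloff=50.0):
        """Linear falloff: full weight at the sculpted angle, zero beyond falloff."""
        return max(0.0, 1.0 - abs(angle - target) / falloff)

    def apply_correctives(mesh, elbow_angle):
        """Blend in each corrective proportionally to how close the pose is."""
        out = mesh.copy()
        for c in correctives:
            out += weight_for(elbow_angle, c["angle"]) * c["delta"]
        return out

    corrected = apply_correctives(base_mesh, elbow_angle=110.0)
    # At 110 degrees both shapes contribute partially (weights 0.6 and 0.4),
    # smoothing the bend instead of snapping between sculpted poses.

Automating this per pose is what lets a small team maintain believable deformation across the large volume of animation a virtual band produces.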
