Tag: Unreal Engine

  • Epic Games Enhances Live Link Face for Real-Time iPhone Mocap


    Key Takeaways

    1. Epic Games released version 1.4.2 of the Live Link Face app, improving real-time facial capture for iPhone users by fixing a manual IP address entry issue.
    2. Live Link Face allows users to stream facial motion data directly from an iPhone’s TrueDepth camera into Unreal Engine, enhancing digital character animation.
    3. The app supports creators in various fields, such as indie game development and livestreaming, by making high-quality facial animation accessible with common devices.
    4. Epic Games has also explored biometric applications, proposing a facial age-estimation system to improve online safety for kids, though the FTC rejected the proposal.
    5. Live Link Face remains free on the App Store and is regularly updated; version 1.4.2 provides useful usability improvements for creators with unusual network setups.


    Epic Games has rolled out version 1.4.2 of its Live Link Face app, a small but meaningful update that improves real-time facial capture for iPhone users. The release fixes a region-specific issue that had prevented some users from entering IP addresses manually, a step that is essential for connecting the app to Unreal Engine on certain network setups.
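
    For anyone relying on the manual-entry workflow, the address typed into the app is the LAN IP of the machine running Unreal Engine. As a rough illustration (a generic networking sketch, not Epic tooling), a script like the following can print that address:

    ```python
    # Hypothetical helper: print the LAN IP of the workstation running Unreal
    # Engine, i.e. the address to type into Live Link Face's manual-entry field.
    # Generic networking sketch -- not part of Epic's tooling.
    import socket

    def local_ip() -> str:
        """Return the address of the interface used for outbound traffic."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            # Connecting a UDP socket sends no packets; it only selects the
            # outbound interface, whose address we then read back.
            s.connect(("8.8.8.8", 80))
            return s.getsockname()[0]

    if __name__ == "__main__":
        print(f"Enter this address in Live Link Face: {local_ip()}")
    ```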

    Enhancing Real-Time Facial Capture

    Originally launched in 2020, Live Link Face allows users to stream facial motion data straight from an iPhone’s TrueDepth camera into Unreal Engine. When paired with MetaHuman Animator, creators can apply detailed facial movements to digital characters instantly, eliminating the need for complex studio motion capture setups.
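
    As a quick sanity check that the phone’s stream is actually reaching the workstation, a generic UDP listener can be run on the receiving machine while Unreal Engine is closed. The port number below is an assumption based on the commonly cited Live Link default; check the app’s settings for the real value, and note that this sketch only counts incoming bytes rather than decoding the Live Link protocol:

    ```python
    # Minimal sketch: confirm that packets from the phone reach this machine.
    # LISTEN_PORT is an assumption -- use the port shown in the app's settings,
    # and run this only while Unreal Engine is not bound to the same port.
    import socket

    LISTEN_PORT = 11111  # assumed default; adjust to match the app

    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", LISTEN_PORT))
        print(f"Waiting for packets on UDP port {LISTEN_PORT}...")
        while True:
            data, addr = sock.recvfrom(65535)
            print(f"{len(data)} bytes from {addr[0]} -- the stream is arriving")
    ```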

    Epic Games remarked in their announcement, “Every subtle expression, look, and emotion is accurately captured and faithfully replicated on your digital human.”

    Valuable Tool for Creators

    This technology is becoming more important in many creative fields. Indie game developers, animation studios, VTubers, and livestreamers are all leveraging this tool to bring their digital characters to life with expressive, real-time performances. Aaron Sims, a well-known character designer from films like Men in Black and Gremlins 2, has commended this progress, saying, “We can take the realism all the way down to the pore. As someone who used to make puppets and prosthetics, now I can do anything I want.”

    Epic’s app also supports the company’s larger goal of making high-end animation workflows accessible to more people. By reducing the entry barriers, they allow smaller teams and individual creators to achieve professional-level facial animation using commonly available devices like the iPhone.

    Exploring New Applications

    Beyond performance capture, Epic has also explored biometric uses. In 2023, the company collaborated with identity verification firms Yoti and SuperAwesome to propose a facial age-estimation system to the US Federal Trade Commission, aiming to improve online safety for kids through privacy-focused age verification. Although the FTC ultimately rejected the proposal, PC Gamer reported that the concept may not be entirely off the table.

    As of April 2025, Live Link Face is still available for free on the App Store and continues to get regular updates. While version 1.4.2 is somewhat minor, it provides crucial usability enhancements for creators dealing with non-standard or limited network configurations.



  • Unity 6 Launches with Windows on Arm Support and Enhanced Graphics


    Unity has released Unity 6, the newest version of its game development engine, which lets developers build games efficiently for Windows, macOS, and Linux.

    Custom Engines vs. Unity

    Game studios such as Kojima Productions often use proprietary game engines tailored to their major titles. This approach lets them build distinctive features that off-the-shelf engines lack and fix bugs quickly. Among commercially available engines, Unreal Engine is widely regarded as the most robust and feature-rich; although it is a commercial product, it has powered blockbusters such as BioShock, Fortnite, and Black Myth: Wukong.

    Simplicity for Smaller Studios

    For smaller studios, however, the complexity of a custom engine or of Unreal Engine can slow development, particularly when targeting the wide variety of operating systems, CPUs, and GPUs in use. In those cases, a more straightforward commercial engine like Unity can significantly cut the time and resources needed to test and ship simpler games.

    New Features in Unity 6

    The latest version introduces several features aimed at helping developers produce visually appealing games more quickly. Graphics rendering has been improved, with Unity claiming up to 50% better performance than the previous version. Unity 6 also adds Spatial-Temporal Post-Processing (STP), which upscales low-resolution images in real time to improve game performance, and upgrades the rendering of environmental elements such as skies, water, foam, and vegetation for a more lifelike look.
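
    To get a feel for why real-time upscaling helps performance, the shaded-pixel savings from rendering at a reduced internal resolution scale roughly with the square of the render scale. The numbers below are hypothetical examples, not Unity’s published figures:

    ```python
    # Illustrative arithmetic only: how rendering at a reduced internal resolution
    # cuts the number of shaded pixels before an upscaler (such as STP)
    # reconstructs the full output image. The 0.67 scale is a hypothetical example.
    output_w, output_h = 3840, 2160   # target output resolution (4K)
    render_scale = 0.67               # hypothetical internal render scale

    internal_w = round(output_w * render_scale)
    internal_h = round(output_h * render_scale)
    saved = 1 - (internal_w * internal_h) / (output_w * output_h)

    print(f"Internal render target: {internal_w}x{internal_h}")
    print(f"Shaded pixels saved before upscaling: {saved:.0%}")
    ```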

    Unity 6 also ships an upgraded Sentis AI library, which lets games respond in real time to input from cameras, microphones, and other sensors; developers can pull AI models from well-known sources such as Hugging Face. The UI (User Interface) Toolkit and editing tools such as ProBuilder and Cinemachine have been refined to improve developer efficiency. This release additionally introduces Windows on Arm support for both the editor and compiled applications, making it compatible with Snapdragon X-based PCs. That said, game developers usually need high-performance machines with strong graphics cards, such as the ASUS ROG Strix Scar 16 (available on Amazon), to stay productive and to run widely used tools like Autodesk Maya and Adobe Creative Cloud, which are not compatible with Snapdragon X CPUs.
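
    Sentis itself is used from C# inside Unity, so the following is only a loose illustration of the workflow it enables (load a pretrained model, feed it live sensor data, read back predictions each frame), sketched here with ONNX Runtime in Python; the model file, input layout, and frame size are placeholders:

    ```python
    # Loose illustration of the load-model / feed-sensor-data / read-predictions
    # loop that Sentis provides inside Unity, shown with ONNX Runtime in Python.
    # "model.onnx", the input layout, and the 224x224 frame size are placeholders.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("model.onnx")   # e.g. a model exported from Hugging Face
    input_name = session.get_inputs()[0].name

    def infer(frame: np.ndarray) -> np.ndarray:
        """Run one inference step on a camera frame (HWC uint8 -> NCHW float32)."""
        tensor = frame.astype(np.float32).transpose(2, 0, 1)[None] / 255.0
        return session.run(None, {input_name: tensor})[0]

    if __name__ == "__main__":
        dummy_frame = np.zeros((224, 224, 3), dtype=np.uint8)  # stand-in for a live frame
        print(infer(dummy_frame).shape)
    ```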

    Sources: Unity 6, Unity on YouTube, Unity blog post.

  • Epic Games and Qualcomm Transform In-Car Experience with Unreal Engine


    Qualcomm Technologies and Epic Games are teaming up to revolutionize the driving experience by incorporating Unreal Engine into modern vehicles. Unreal Engine, a well-known tool for crafting stunning video game graphics, will be seamlessly integrated with Qualcomm’s Snapdragon Cockpit Platform.

    A Personal Assistant on Wheels

    Imagine stepping into a car where the dashboard acts more like a personal assistant than a set of dials and buttons. By pairing Qualcomm’s computing power with Unreal Engine’s realistic 3D visuals, automakers can build rich, immersive digital experiences into the vehicle: advanced navigation, entertainment options, or maintenance alerts that make time on the road easier, safer, and more connected.

    Rethinking Cockpit Design

    What makes this partnership exciting is the opportunity for car manufacturers to move beyond conventional cockpit designs. With Unreal Engine’s ability to produce strikingly realistic 2D and 3D imagery, automakers could tailor in-car displays to a driver’s preferences or surface nearby EV charging stations.

    A Joint Vision for the Future

    Both Qualcomm and Epic Games see the collaboration as a significant step toward the future of driving. Laxmi Rayapudi of Qualcomm said the alliance will “set new standards” in the automotive industry, while Bill Clifford of Epic Games highlighted the “unmatched possibilities” Unreal Engine offers. Together, they aim to build not just a “better car” but a smarter, more engaging in-vehicle experience. We’ll see how this unfolds.

