Live Link Face

For Unreal Engine

Developer: Unreal Engine

Region: China
App ID: 1495370836
Price: Free
In-App Purchases
Ratings: 0
Category: Graphics & Design (Free)
Downloads yesterday
Last updated: 2025-02-17
First released: 2020-07-07
Version statistics
  • Time since latest version release: 67 days 18 hours
  • Version updates in the past year: 3
  • Earliest worldwide release date: 2020-07-07

Version history
  • Version: 1.4.2

    Release date: 2025-02-17

    Release notes

    - Fixes a region-specific issue where it was not possible to manually enter the IP address of a Live Link target using the system keyboard

    Videos/Screenshots: 7 app screenshots

    App description

    Live Link Face for effortless facial animation in Unreal Engine — capture performances for MetaHuman Animator to achieve the highest-fidelity results, or stream facial animation in real time from your iPhone or iPad for live performances.

    Capture facial performances for MetaHuman Animator:

    - MetaHuman Animator uses Live Link Face to capture performances on iPhone, then applies its own processing to create high-fidelity facial animation for MetaHumans.
    - The Live Link Face iOS app captures raw video and depth data, which is ingested directly from your device into Unreal Engine for use with the MetaHuman plugin.
    - Facial animation created with MetaHuman Animator can be applied to any MetaHuman character in just a few clicks.
    - This workflow requires an iPhone (12 or above) and a desktop PC running Windows 10/11, as well as the MetaHuman Plugin for Unreal Engine.

    Real time animation for live performances:

    - Stream out ARKit animation data live to an Unreal Engine instance via Live Link over a network.
    - Visualize facial expressions in real time with live rendering in Unreal Engine.
    - Drive a 3D preview mesh, optionally overlaid over the video reference on the phone.
    - Record the raw ARKit animation data and front-facing video reference footage.
    - Tune the capture data to the individual performer and improve facial animation quality with rest pose calibration.
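
    The sketch below is a quick, hedged way to verify that the phone is reaching your machine over the network; it only binds a UDP socket and counts incoming packets. The port 11111 is an assumption for illustration (match whatever target port your Live Link source uses), and the payload is Epic's Live Link wire format, which this sketch does not attempt to parse.

      # Network sanity check only: confirm packets from the app arrive.
      # Port 11111 is illustrative; the Live Link payload is not decoded here.
      import socket

      PORT = 11111  # example; match your Live Link target settings
      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      sock.bind(("0.0.0.0", PORT))
      print(f"listening on UDP {PORT} ... point the app's Live Link target here")

      while True:
          data, addr = sock.recvfrom(4096)
          print(f"{len(data)} bytes from {addr[0]}")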

    Timecode support for multi-device synchronization:

    - Select from the iPhone system clock, an NTP server, or use a Tentacle Sync to connect with a master clock on stage.
    - Video reference is frame accurate with embedded timecode for editorial.
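
    As a rough illustration of what choosing a timecode source means, the sketch below derives an HH:MM:SS:FF timecode from either the system clock or an NTP server (via the third-party ntplib package). The 30 fps rate and the pool.ntp.org server are assumptions for the example, not settings read from the app.

      # Illustrative only: build an HH:MM:SS:FF timecode string from a clock source.
      import time
      import ntplib  # third-party: pip install ntplib

      def timecode(seconds_since_midnight: float, fps: int = 30) -> str:
          total_frames = int(seconds_since_midnight * fps)
          ff = total_frames % fps
          ss = (total_frames // fps) % 60
          mm = (total_frames // (fps * 60)) % 60
          hh = (total_frames // (fps * 3600)) % 24
          return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

      def since_midnight(epoch: float) -> float:
          t = time.localtime(epoch)
          return t.tm_hour * 3600 + t.tm_min * 60 + t.tm_sec + (epoch % 1)

      print("system:", timecode(since_midnight(time.time())))
      ntp_epoch = ntplib.NTPClient().request("pool.ntp.org", version=3).tx_time
      print("ntp   :", timecode(since_midnight(ntp_epoch)))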

    Control Live Link Face remotely with OSC or via the MetaHuman Plugin for Unreal Engine:

    - Trigger recording externally so actors can focus on their performances.
    - Capture slate names and take numbers consistently.
    - Extract data for processing and storage.
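
    A minimal sketch of external triggering over OSC, using the python-osc package, follows. The phone IP, port 8000, and the /RecordStart and /RecordStop addresses are assumptions for illustration; check Epic's Live Link Face OSC documentation and the app's OSC settings for the exact addresses, argument types, and port on your setup.

      # Hedged example: send OSC messages to start and stop a recording.
      from pythonosc.udp_client import SimpleUDPClient

      PHONE_IP = "192.168.1.50"  # example address of the device running the app
      OSC_PORT = 8000            # example port; set to match the app's OSC settings

      client = SimpleUDPClient(PHONE_IP, OSC_PORT)
      client.send_message("/RecordStart", ["sceneA_closeup", 3])  # slate name, take number (assumed signature)
      client.send_message("/RecordStop", [])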

    Browse and manage the captured library of takes:

    - Delete takes within Live Link Face, share via AirDrop.
    - Transfer directly over network when using MetaHuman Animator.
    - Play back the captured video on the phone.
  • Version: 1.4.1

    Release date: 2025-02-06

    Release notes

    - Fixes an issue on iPhone 16 devices where recorded takes could sometimes fail to be processed in Unreal Editor.

    Videos/Screenshots: 7 app screenshots

    App description: identical to the description under version 1.4.2 above.
  • Version: 1.4.0

    Release date: 2025-01-06

    Release notes

    - Adds the ability to toggle recording with the hardware volume buttons and Bluetooth camera shutter devices.
    - Adds an 'Always Send Face Pose' Live Link option to enable continuous streaming of face pose and timecode information even if no face is currently detected.
    - Improves Tentacle Sync device compatibility.
    - Fixes an issue where timecode values could become desynchronised with connected Tentacle Sync devices.

    Videos/Screenshots: 7 app screenshots

    App description: identical to the description under version 1.4.2 above.
  • Version: 1.3.2

    Release date: 2023-12-12

    Release notes

    - Adds official support for the iPhone 15 series of devices
    - Fixes a bug where the app would sometimes fail to request camera permissions
    - Fixes an issue where recorded MetaHuman Animator takes would sometimes fail to be imported into Unreal Engine due to an incorrect number of audio channels
    - Improves support for devices that only support 30 FPS ARKit capture

    Videos/Screenshots: 7 app screenshots

    App description: identical to the description under version 1.4.2 above.
  • Version: 1.3.1

    Release date: 2023-06-22

    Release notes

    - Fixes incorrect header column names in recorded CSV files when using the Live Link (ARKit) capture mode
    - Fixes incorrect OSC response when making a thermal state query
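
    Since this fix concerns the header row of the recorded CSV files, a short hedged sketch of inspecting such a take with pandas may help. The file name and the curve name checked (jawOpen) are assumptions for illustration; read your own file's header for the exact column names.

      # Illustrative only: inspect a recorded Live Link (ARKit) CSV take.
      import pandas as pd

      take = pd.read_csv("MySlate_3_iPhone.csv")  # hypothetical take file
      print(take.columns.tolist())                # the (now corrected) header columns

      if "jawOpen" in take.columns:               # ARKit blendshape curve, if present
          print("frames:", len(take))
          print("jawOpen range:", take["jawOpen"].min(), take["jawOpen"].max())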

    Videos/Screenshots: 6 app screenshots

    App description: identical to the description under version 1.4.2 above.
  • Version: 1.3.0

    Release date: 2023-06-14

    Release notes

    - Added support for MetaHuman Animator

    Videos/Screenshots: 6 app screenshots

    App description: identical to the description under version 1.4.2 above.
  • Version: 1.2.1

    Release date: 2022-06-02

    Release notes

    Bug fixes and performance improvements.

    Videos/Screenshots: 4 app screenshots

    App description

    Virtual production-ready facial animation in real time from your iPhone or iPad -- Live Link Face for Unreal Engine.

    Stream high-quality facial expressions to characters and visualize them with live rendering in Unreal Engine. Record facial tracking data that can be further fine-tuned in animation tools to achieve a final performance and assembled in Unreal Engine’s Sequencer. Shoot professional-grade performance capture with an integrated stage workflow.

    Facial animation via front-camera and ARKit:
    ● Stream out the data live to an Unreal Engine instance via Live Link over a network.
    ● Drive a 3D preview mesh, optionally overlaid over the video reference on the phone.
    ● Record the raw facial animation data and front-facing video reference footage.
    ● Tune the capture data to the individual performer and improve facial animation quality with rest pose calibration.

    Timecode support for multi-device synchronization:
    ● Select from the iPhone system clock, an NTP server, or use a Tentacle Sync to connect with a master clock on stage.
    ● Video reference is frame accurate with embedded timecode for editorial.

    Control Live Link Face remotely with OSC:
    ● Trigger recording externally so actors can focus on their performances.
    ● Capture slate names and take numbers consistently.
    ● Extract data automatically for archival.

    Browse and manage the captured library of takes within Live Link Face:
    ● Delete takes, share via AirDrop.
    ● Play back the reference video on the phone.
  • Version: 1.2.0

    Release date: 2022-05-27

    Release notes

    Added support for the new Tentacle Sync E mk2 timecode device.
    Timecode from the system timer now rolls over to 0 hours after hour 23 to avoid very high values for hours.
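
    The rollover described above amounts to taking the hours field modulo 24, so an uninterrupted system timer at hour 26 displays as hour 2. A small illustration (the function name is ours, not the app's):

      # Illustrative only: wrap the hours field of a timecode at 24.
      def wrap_hours(hours: int) -> int:
          return hours % 24

      assert wrap_hours(23) == 23
      assert wrap_hours(24) == 0
      assert wrap_hours(26) == 2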

    Videos/Screenshots: 4 app screenshots

    App description: identical to the description under version 1.2.1 above.
  • Version: 1.1.2

    Release date: 2021-11-08

    Release notes

    Bug fixes and performance improvements.

    Videos/Screenshots: 4 app screenshots

    App description: identical to the description under version 1.2.1 above.
  • Version: 1.1.1

    Release date: 2021-09-09

    Release notes

    Bug fixes and performance improvements.

    Videos/Screenshots: 4 app screenshots

    App description: identical to the description under version 1.2.1 above.