#chrisyoungcom
**Chris Young** has developed a modification to the Unreal Engine Live Link face-support plugin that allows ARKit sources to be recognized by packaged Android builds. This lets live face and body performances, captured with the MetaHuman Face app and Xsens motion data, be streamed to a Quest headset, offering a way to view and direct a remote performance in Extended Reality (XR). It also retains the high-quality MetaHuman pipeline for recording and post-animation work directly in Sequencer. Chris Young speculates that this innovation could shift how “appointment experiences” evolve in XR, likening its potential impact to how “Must See TV” once drove television ratings.
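The write-up does not include the patch itself, but the change described — getting an iOS-oriented face-support module to load in a packaged Android build — is the kind of thing typically done by editing the platform allow-list in the plugin descriptor. A hypothetical sketch of such a descriptor fragment (the module name and exact field are assumptions, not taken from the source; older engine versions use `WhitelistPlatforms` instead of `PlatformAllowList`):

```json
{
  "Modules": [
    {
      "Name": "AppleARKitFaceSupport",
      "Type": "Runtime",
      "LoadingPhase": "PostConfigInit",
      "PlatformAllowList": [ "IOS", "Android" ]
    }
  ]
}
```

Adding `"Android"` to the allow-list would let the module be compiled into and loaded by an Android package; the plugin would then need to be rebuilt from source and the project repackaged before Live Link could enumerate ARKit face sources on the device.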