VMagicMirror supports two main use cases: streaming and desktop mascot. This page covers the setup for streaming.

Choose screen capture or window capture
For popular streaming software like OBS, there are usually two ways to capture the character shown in VMagicMirror.
Screen capture uses the rendered image as-is. Its merit is that it is easy to understand what is happening, and shadows and semi-transparent interfaces (like the touch pad) are shown correctly. When you use window capture, please check the following points:
- Turn off shadow effects in the setting window.
- If the bloom effect seems to interfere with chroma key composition, turn it off in the setting window.
- If your avatar has green parts, you might need to change the chroma key color in the setting window.
- The keyboard and touch pad objects are semi-transparent and may look bad in the composited image. In this case, hide them or replace the device textures (see Change Device Textures).

NOTE: OBS has one more useful option, "Game Capture". Game capture supports capturing a transparent window as-is, and this feature works very well for VMagicMirror with a transparent background. If your PC has enough capacity to do so, please consider using game capture instead of normal window capture.

Check CPU usage
Streaming uses a lot of the PC's computational resources. Check the options in the setting window to reduce the CPU usage of VMagicMirror.
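The chroma key pitfalls above (green avatar parts disappearing, semi-transparent objects compositing badly) follow directly from how keying works: any pixel near the key color is replaced by the background. This is a minimal illustrative sketch, not VMagicMirror or OBS code; the tolerance value and function names are assumptions for illustration.

```python
def chroma_key(fg, bg, key=(0, 255, 0), tol=40):
    """Composite two equally sized lists of RGB tuples.

    A pixel of fg close to the key color is replaced by the matching bg pixel.
    This is why a green part of an avatar can vanish (change the key color in
    that case) and why semi-transparent objects, whose pixels are already
    blended with the green background, key out partially and look wrong.
    """
    out = []
    for f, b in zip(fg, bg):
        near_key = all(abs(c - k) <= tol for c, k in zip(f, key))
        out.append(b if near_key else f)
    return out
```

For example, an opaque red pixel survives keying, while both a pure-green background pixel and a greenish avatar pixel get replaced, which is exactly the failure mode described above.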
Consider how to place the character
When placing the character, consider NOT showing the avatar's hands or arms. Since a VTuber usually shows only the area above the shoulders, hand motion might look unnatural. Even in this case, shoulder and upper-arm motion is reflected in the stream, and this makes the character more lively. If the stream is mainly for talking, you can choose a layout in which the hands are out of view. If you are planning a gameplay stream, move the gamepad to a higher position, or change the camera layout to show your hands. If you want a bust-up layout and also want to show hand motion, move the keyboard and touch pad up.

Choose how to switch the face expressions
You can use several ways to switch the character's face expressions.
Please see the details in Expressions.

About
VSeeFace is a free, highly configurable face and hand tracking VRM and VSFAvatar avatar puppeteering program for virtual YouTubers, with a focus on robust tracking and high image quality. VSeeFace offers functionality similar to Luppet, 3tene, Wakaru and similar programs. VSeeFace runs on Windows 8 and above (64-bit only). Perfect sync is supported through iFacialMocap/FaceMotion3D/VTube Studio/MeowFace. VSeeFace can send, receive and combine tracking data using the VMC protocol, which also allows support for tracking through Virtual Motion Capture, Tracking World, Waidayo and more. Capturing with native transparency is supported through OBS's game capture, Spout2 and a virtual camera. Face tracking, including eye gaze, blink, eyebrow and mouth tracking, is done through a regular webcam. For the optional hand tracking, a Leap Motion device is required. You can see a comparison of the face tracking performance compared to other popular vtuber applications here. In this comparison, VSeeFace is still listed under its former name OpenSeeFaceDemo.
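The VMC protocol mentioned above is OSC messages sent over UDP; a bone pose travels as a `/VMC/Ext/Bone/Pos` message carrying the bone name plus position and rotation. As a sketch of what is on the wire, here is a minimal hand-rolled encoder; a real sender would normally use an OSC library, and the example address/port in the comment is only the commonly used VMC default, not something VSeeFace requires.

```python
import struct

def _osc_str(s: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    s += b"\x00"
    return s + b"\x00" * (-len(s) % 4)

def vmc_bone_message(bone: str, pos, rot) -> bytes:
    """Encode a VMC bone message: one string and seven floats
    (position x, y, z and rotation quaternion x, y, z, w)."""
    address = _osc_str(b"/VMC/Ext/Bone/Pos")
    typetags = _osc_str(b",s" + b"f" * 7)  # one string, seven floats
    args = _osc_str(bone.encode("utf-8"))
    args += b"".join(struct.pack(">f", v) for v in (*pos, *rot))
    return address + typetags + args

# Sending it to another VMC-capable app could look like this (port 39539 is
# the conventional VMC default):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(vmc_bone_message("Head", (0, 1.5, 0), (0, 0, 0, 1)),
#             ("127.0.0.1", 39539))
```

This is only meant to make the protocol concrete; applications like VSeeFace, Virtual Motion Capture and Waidayo handle these messages for you.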
If you have any questions or suggestions, please first check the FAQ. If that doesn't help, feel free to contact me, @Emiliana_vt! Please note that Live2D models are not supported. For those, please check out VTube Studio or PrprLive.

Download
To update VSeeFace, just delete the old folder or overwrite it when unpacking the new version. Download

If you use a Leap Motion, update your Leap Motion software to V5.2 or newer! Just make sure to uninstall any older versions of the Leap Motion software first. If necessary, V4 compatibility can be enabled from VSeeFace's advanced settings.

Old versions can be found in the release archive here. This website, the #vseeface-updates channel on Deat's discord and the release archive are the only official download locations for VSeeFace. I post news about new versions and the development process on Twitter. The latest release notes can be found here. Some tutorial videos can be found in this section. The reason it is currently only released in this way is to make sure that everybody who tries it out has an easy channel to give me feedback.

VSeeFace is face tracking software for VTubers. You can easily animate a VRM avatar with just a webcam. Hand and finger tracking is available via Leap Motion. Perfect sync via iFacialMocap/FaceMotion3D is supported, as is the VMC protocol (Waidayo, iFacialMocap2VMC). Download here. Release notes here. It is still in beta. Besides VRM, VSFAvatar models in Unity's AssetBundle format can also be used; the SDK is here. VSFAvatar models can use custom shaders, Dynamic Bones, constraints and more. If you join @Virtual_Deat's Discord server, click 👌 in the rules channel to agree to the rules and the other channels will become visible; there are #vseeface and Japanese-language channels. VSeeFace cannot record with a chroma key, but if you check "Allow transparency" in OBS's Game Capture and hide the UI with the ※ button in the lower right of VSeeFace, you get a clean transparent background. A Japanese UI translation and Japanese tutorial videos are available; you can select Japanese on the first screen. License: feel free to use it for both commercial and non-commercial purposes.

Terms of use
You can use VSeeFace to stream or do pretty much anything you like, including non-commercial and commercial uses.
Just don't modify it (other than the translation files). VSeeFace is beta software. There may be bugs and new versions may change things around. It is offered without any kind of warranty, so use it at your own risk. It should generally work fine, but it may be a good idea to keep the previous version around when updating.

Disclaimer
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Credits
VSeeFace is being created by @Emiliana_vt and @Virtual_Deat.

VSFAvatar
Starting with VSeeFace v1.13.36, a new Unity asset bundle and VRM based avatar format called VSFAvatar is supported by VSeeFace. This format allows various Unity functionality such as custom animations, shaders and various other components like dynamic bones, constraints and even window captures to be added to VRM models. This is done by re-importing the VRM into Unity and adding and changing various things. To learn more about it, you can watch this tutorial by @Virtual_Deat, who worked hard to bring this new feature about! A README file with various important information is included in the SDK, but you can also read it here. SDK download: v1.13.38 (release archive). Japanese information is available on @narou_riel's memo site. Make sure to set the Unity project to linear color space.
You can watch how the two included sample models were set up here.

Tutorials
There are a lot of tutorial videos out there. This section lists a few to help you get started, but it is by no means comprehensive. Make sure to look around!

Official tutorials
VSeeFace tutorials
VRM model tutorials
Japanese tutorial videos:
Manual
This section is still a work in progress. For help with common issues, please refer to the troubleshooting section. The most important information can be found by reading through the help screen as well as the usage notes inside the program.

FAQ
How can I move my character?
You can rotate, zoom and move the camera by holding the Alt key and using the different mouse buttons. The exact controls are given on the help screen. Once you've found a camera position you like and would like it to be the initial camera position, you can save it as the default camera setting.

How do I do chroma keying with a gray background?
VSeeFace does not support chroma keying. Instead, capture it in OBS using a game capture and enable the "Allow transparency" option.

What's the best way to set up a collab then?
You can set up the virtual camera function, load a background image and do a Discord (or similar) call using the virtual VSeeFace camera.

Can I get rid of the ※ button in the corner somehow? It shows on OBS.
You can hide and show the ※ button using the space key.

Sometimes blue bars appear at the edge of the screen, what's up with that and how do I get rid of them?
Those bars are there to let you know that you are close to the edge of your webcam's field of view and should stop moving that way, so you don't lose tracking due to being out of sight. If you have set the UI to be hidden using the ※ button in the lower right corner, blue bars will still appear, but they will be invisible in OBS as long as you are using a game capture with transparency enabled.

Does VSeeFace have gaze tracking?
Yes, unless it has been disabled in the settings.

Why can't VSeeFace show the whole body of my model?
It can, you just have to move the camera (see the camera controls above).

Why isn't my custom window resolution saved when exiting VSeeFace?
Resolutions smaller than the default resolution of 1280x720 are not saved, because it is possible to shrink the window in such a way that it would be hard to change it back. You might be able to manually enter such a resolution in the settings.ini file.

Can I change avatars/effect settings/props without having the UI show up in OBS with a hotkey?
You can completely avoid having the UI show up in OBS by using the Spout2 functionality. For more information, please refer to this. Effect settings can be controlled with components from the VSeeFace SDK, so if you are using a VSFAvatar model, you can create animations linked to hotkeyed blendshapes to animate and manipulate the effect settings. The local "L" hotkey will open a file dialog to directly open model files without going through the avatar picker UI, but loading the model can cause lag during the loading process.

Is Spout2 capture supported by StreamLabs?
StreamLabs does not support the Spout2 OBS plugin, so because of that and various other reasons, including lower system load, I recommend switching to OBS. OBS has a function to import already set up scenes from StreamLabs, so switching should be rather easy.

What are the requirements for a custom model to make use of the gaze tracking?
If humanoid eye bones are assigned in Unity, VSeeFace will directly use these for gaze tracking. The gaze strength setting in VSeeFace determines how far the eyes will move and can be subtle, so if you are trying to determine whether your eyes are set up correctly, try turning it up all the way.
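As a rough mental model of the gaze strength setting (and the gaze offset sliders discussed further below), the tracked gaze can be thought of as shifted by the offset and scaled by the strength before being applied to the eye bones. This is an assumption for illustration, not VSeeFace's actual code; the function name and clamp limit are made up.

```python
def applied_gaze(raw_deg: float, strength: float, offset_deg: float = 0.0,
                 limit_deg: float = 30.0) -> float:
    """Illustrative gaze model: offset, then scale, then clamp to an eye range."""
    angle = (raw_deg + offset_deg) * strength
    return max(-limit_deg, min(limit_deg, angle))  # keep within a plausible range
```

This is why turning the strength all the way up makes a broken eye setup obvious, and why a badly set offset slider makes the eyes look off to one side even when you look straight ahead.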
You can also use the Vita model to test this, which is known to have a working eye setup. Also, see here if it does not seem to work. To use the VRM blendshape presets for gaze tracking, make sure that no eye bones are assigned in Unity's humanoid rig configuration. Sometimes other bones (ears or hair) get assigned as eye bones by mistake, so that is something to look out for. The synthetic gaze, which moves the eyes either according to head movement or so that they look at the camera, works through the same eye bone or blendshape setup. With ARKit tracking, I recommend animating eye movements only through eye bones and using the look blendshapes only to adjust the face around the eyes. Otherwise both bone and blendshape movement may get applied.

What should I do if my model freezes or starts lagging when the VSeeFace window is in the background and a game is running?
In rare cases it can be a tracking issue. If your screen is your main light source and the game is rather dark, there might not be enough light for the camera and the face tracking might freeze. More often, the issue is caused by Windows allocating all of the GPU or CPU to the game, leaving nothing for VSeeFace. Here are some things you can try to improve the situation:
If that doesn’t help, you can try the following things:
It can also help to reduce the tracking and rendering quality settings a bit if it's just your PC in general struggling to keep up. For more information, please check the performance tuning section.

I'm looking straight ahead, but my eyes are looking all the way in some direction?
Make sure the gaze offset sliders are centered. They can be used to correct the gaze for avatars that don't have centered irises, but they can also make things look quite wrong when set up incorrectly.

My eyebrows barely move?
Make sure your eyebrow offset slider is centered. It can be used to shift the overall eyebrow position, but if moved all the way, it leaves little room for them to move.

How do I adjust the Leap Motion's position? My arms are stiff and stretched out?
First, hold the Alt key and right-click to zoom out until you can see the Leap Motion model in the scene. Then use the sliders to adjust the model's position to match its location relative to yourself in the real world. You can refer to this video to see how the sliders work.

What about privacy? Is any of my data or my face transmitted online? Can my face leak into the VSeeFace window?
I took a lot of care to minimize possible privacy issues. The face tracking is done in a separate process, so the camera image can never show up in the actual VSeeFace window, because it only receives the tracking points (you can see what those look like by clicking the button at the bottom of the General settings; they are very abstract). If you are extremely worried about having a webcam attached to the PC running VSeeFace, you can use the network tracking or phone tracking functionalities. No tracking or camera data is ever transmitted anywhere online and all tracking is performed on the PC running the face tracking process.
The onnxruntime library used in the face tracking process includes telemetry that is sent to Microsoft by default, but I have recompiled it to remove this telemetry functionality, so nothing should be sent out from it. Even if it were enabled, it wouldn't send any personal information, just generic usage data. When starting, VSeeFace downloads one file from the VSeeFace website to check if a new version has been released and displays an update notification message in the upper left corner. There are no automatic updates. It shouldn't establish any other online connections. Depending on certain settings, VSeeFace can receive tracking data from other applications, either locally or over the network, but this is not a privacy issue. If the VMC protocol sender is enabled, VSeeFace will send blendshape and bone animation data to the specified IP address and port. As for data stored on the local PC, there are a few log files to help with debugging, which are overwritten after restarting VSeeFace twice, and the configuration files. This data can be found as described here. Screenshots made with the built-in screenshot function are also stored locally. The VSeeFace website does use Google Analytics, because I'm kind of curious about who comes here to download VSeeFace, but the program itself doesn't include any analytics. You can also check out this article about how to keep your private information private as a streamer and VTuber. It's not complete, but it's a good introduction with the most important points.

I moved my Leap Motion from the desk to a neck holder, changed the position to chest and now my arms are in the sky?
Changing the position also changes the height of the Leap Motion in VSeeFace, so just pull the Leap Motion position's height slider way down. Zooming out may also help.

My Leap Motion complains that I need to update its software, but I'm already on the newest version of V2?
To fix this error, please install the V5.2 (Gemini) SDK. It says it's used for VR, but it is also used by desktop applications.

Capturing VSeeFace through Spout2 does not work even though it was enabled?
Your system might be missing the Microsoft Visual C++ 2010 Redistributable library. After installing it from here and rebooting, it should work.

Do hotkeys work even while VSeeFace is in the background?
All configurable hotkeys also work while it is in the background or minimized, so the expression hotkeys, the audio lipsync toggle hotkey and the configurable position reset hotkey all work from any other program as well. On some systems it might be necessary to run VSeeFace as admin for this to work properly.

My VSeeFace randomly disappears? / It can no longer find the facetracker.exe file? / Why did VSeeFace delete itself off my PC?
VSeeFace never deletes itself. This is usually caused by over-eager anti-virus programs. The face tracking is written in Python and for some reason anti-virus programs seem to dislike that and sometimes decide to delete VSeeFace or parts of it.
There should be a way to whitelist the folder to keep this from happening if you encounter this type of issue.

When exporting a VSFAvatar, this error appears?
Check the Unity "Console" tab. There are probably some errors marked with a red symbol. You might have to scroll a bit to find them. These are usually some kind of compiler errors caused by other assets, which prevent Unity from compiling the VSeeFace SDK scripts. One way of resolving this is to remove the offending assets from the project. Another way is to make a new Unity project with only UniVRM 0.89 and the VSeeFace SDK in it.

I'm using a custom shader in my VSFAvatar, but a transparent section turns opaque parts of my model translucent in OBS?
In cases where using a shader with transparency leads to objects becoming translucent in OBS in an incorrect manner, setting the alpha blending operation to "Max" often helps. For example, there is a setting for this in the "Rendering Options", "Blending" section of the Poiyomi shader. In the case of a custom shader, a corresponding blend operation setting may be available.

Can I switch avatars with a hotkey?
There is the "L" hotkey, which lets you directly load a model file, but in general loading models is too slow to be useful through hotkeys. If you want to switch outfits, I recommend adding them all to one model. With VRM, this can be done by making meshes transparent through a material blendshape that changes the alpha value of their materials. For VSFAvatar, the objects can be toggled directly using Unity animations.

Since VSeeFace has no greenscreen option, how can I use it with Shoost?
Enable Spout2 support in the settings and capture VSeeFace through Spout2.
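The "Max" alpha blend fix for translucent spots in OBS can be illustrated with per-pixel alpha math. With the common SrcAlpha/OneMinusSrcAlpha blending also applied to the alpha channel, drawing a semi-transparent layer over an opaque one can lower the framebuffer alpha, and a capture that honors that alpha (like OBS game capture with transparency) then shows the model translucent; a "Max" blend operation keeps the destination's full opacity instead. This is an illustrative model under those assumptions, not actual Unity shader code.

```python
def alpha_blend(src_a: float, dst_a: float) -> float:
    """SrcAlpha / OneMinusSrcAlpha applied to the alpha channel."""
    return src_a * src_a + dst_a * (1.0 - src_a)

def alpha_max(src_a: float, dst_a: float) -> float:
    """BlendOp Max on the alpha channel."""
    return max(src_a, dst_a)
```

Drawing a 50% transparent overlay (src_a = 0.5) over an opaque body (dst_a = 1.0), standard blending leaves the framebuffer alpha at 0.75, so the opaque body underneath appears translucent in the capture, while Max keeps it at 1.0.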