  1.

    I’d love to see a video of this in action!

    1.

      I can try to get one today.

    2.

      Have you looked at something like Facerig, or is this just seeing how far you can push the jank?

      Also, are attempts at becoming a VTuber on-topic for Lobsters?

      1.

        I’ve looked at Facerig before; this was mostly an attempt to see how far I could push the jank. Also, I found the avatar I was using and really liked it, but there wasn’t an obvious way to use it with Facerig directly.

        1.

          Makes sense.

          Also, if you haven’t tried full-body tracking in VRChat, it’s really fun. Nice to hang out in some of the dance spaces that way. ^_^

          1.

            I’ve tried it and we have it on our Vive setup. However, for meetings I only really need my hands and head.

      2.

        Congrats! As a regular VRChat user, I want to try this out myself.

        1.

          The only thing I dislike about it is that it relies on VRChat, whose terms of service you have to comply with, and whose SDK seems to be a bit restricted (you cannot use all the Unity features you want).

          I would rather try VSeeFace, although it relies on face tracking alone instead of the full-body tracking we can get with SteamVR.

          1.

            I’m pretty sure it would be doable to make something else on top of Unreal or Unity (maybe even Godot). This is something that I’ve been thinking about for a while, but I am not a game development kind of person. I’m just trying to get things to work at all.

            At the very least I see something like this as a tech preview for what you could do with more specialized open source software. The main problem I had is that I had already found an avatar I liked in VRChat, and it is difficult to export avatars from VRChat to other programs. I have gotten a copy of the Unity package and shader bundle for my avatar though, so we’ll see where it goes from there.

          2.

            > There is a lot of WiFi noise in my apartment or something and it was really interfering with ALVR’s stream encoding.

            I’m looking forward to a future post titled “Wireless VR from inside my Faraday cage” or something. ;)

            > I started out by taking a picture of my office from about the angle that my laptop sits at

            Could you get closer to perfect alignment by using your laptop camera to take the photo so it’d match, and then avoiding moving your laptop during the meeting? (And retake the photo if you move your laptop.)
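
            For example (just a sketch, assuming a Linux laptop that exposes its webcam as /dev/video0; the output filename is made up), ffmpeg can grab a single still from the laptop camera, so the background photo comes from the same lens and angle as the meeting video:

            ```
            # Capture exactly one frame from the built-in webcam as the background photo.
            ffmpeg -f v4l2 -i /dev/video0 -frames:v 1 office-background.png
            ```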

            1.

              That’s what I have now! I have a 45-second loop that OBS plays on repeat. It makes it look like the flag is blowing in the breeze of the fan.
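
              In case anyone wants to copy the setup: OBS’s Media Source will loop a file when its “Loop” option is checked, and something like this (filenames made up) cuts a 45-second clip out of a longer recording of the room to feed it:

              ```
              # Trim the first 45 seconds of the room recording without re-encoding.
              ffmpeg -i office-recording.mkv -t 45 -c copy office-loop.mkv
              ```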

              1.

                Ah right! I missed that you’d sorted out the alignment while switching to the animated video.