Threads for mtlynch

  1. 2

    I really enjoyed the article's insight into the debugging mechanisms available in Go, but, well, why not place the files in the regular filesystem as the final solution?

    1. 1

      Thanks for reading!

      why not place the files in the regular filesystem as final solution?

      I did consider it, because it was a bit of a mess to chase these issues down, but I value ease of replication too highly. By putting everything into SQLite, I can just run Litestream, and Litestream handles replicating and restoring all application state. If I used the regular filesystem, I'd have to roll my own solution for backup and restore.
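
      For anyone curious what that setup looks like: Litestream takes a small YAML config, roughly like this (the database path and bucket name here are placeholders, not my actual setup):

```yaml
# litestream.yml -- placeholder paths and bucket, just to show the shape
dbs:
  - path: /data/store.db          # the SQLite database the app writes to
    replicas:
      - url: s3://my-backup-bucket/store-db   # continuous replication target
```

      Restoring onto a fresh machine is then a single `litestream restore` before starting the app.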

      1. 2

        Yes, I understand Litestream might be a nice solution for syncing SQLite databases, but to my understanding, it's just simpler to sync filesystem contents, isn't it? Plus, for filesystems, keeping snapshots is hardly a new technology.

        1. 1

          I’ve tried using cloud-backed FUSE filesystems in the past (e.g., s3fuse, gcsfuse), and I’ve found them to be much harder to work with than Litestream. They seem like they’re just a regular filesystem, but I ran into lots of gotchas in places where they can’t fully behave like a local disk. And SQLite doesn’t support networked filesystems, so I’d still need a separate solution for replicating my database.

          I know there are tools like SyncThing that can sync a folder to cloud storage, but I think integrating SyncThing is more complex than Litestream.

          Are there tools you’ve found that would work well here?

    1. 2

      Is that actually a fix? If I get this right, you’re now crashing for people who try to reclaim disk space but don’t give it enough memory to perform VACUUM. Kinda ironic.

      1. 2

        Thanks for reading!

        If I get this right you’re now crashing for people that try to reclaim disk space but don’t give it enough memory to perform vacuum.

        Yeah, I wish I could prevent the crashes entirely, but I think it might be out of my hands. I don’t think I can prevent SQLite from bloating the write-ahead-log on VACUUM when Litestream is attached.

        That said, I suspect that in practice, this is unlikely to cause crashes. The users who want to do a periodic VACUUM tend to run on a full-blown server rather than a RAM-limited VM. If you have several GB of RAM, VACUUM should be fine.
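
        If it helps to see why VACUUM is worth running at all, here's a small standalone Python sketch (no Litestream involved; the file and table names are made up for illustration). It shows that deleting rows alone doesn't shrink the file, while VACUUM rewrites it compactly — and under WAL mode, that rewrite is exactly what can inflate the -wal file:

```python
import os
import sqlite3
import tempfile

# Throwaway database just for illustration.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE blobs (data BLOB)")
conn.executemany("INSERT INTO blobs VALUES (?)",
                 [(b"x" * 100_000,) for _ in range(100)])
conn.commit()

conn.execute("DELETE FROM blobs")  # marks pages free; file size is unchanged
conn.commit()
before = os.path.getsize(path)

# VACUUM rewrites the whole database into a compact file. Under WAL mode,
# that rewrite first lands in the -wal file, which is why the WAL can
# balloon to roughly the size of the database during a VACUUM.
conn.execute("VACUUM")
after = os.path.getsize(path)
print(f"before VACUUM: {before:,} bytes; after: {after:,} bytes")
```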

      1. 3

        Thanks for sharing this! I’ve been interested in learning Zig for a while, but I haven’t found an excuse to do it, since most of my work is web development.

        I found Andrew Kelley’s interview on the CoRecursive podcast interesting because it’s fun to hear about what motivates someone to take on a project as ambitious as replacing all of the world’s C software:

        https://corecursive.com/067-zig-with-andrew-kelley/

        1. 3

          In addition to being a KVM, I bet this gizmo could be easily adapted to stream presentations at conferences and meetups (including captured HDMI and live video of the presenter).

          1. 3

            Author here. Thanks for reading!

            I’m not sure that TinyPilot would be a great match for that scenario. If you need remote access to a machine that has a full OS and network connection, then TinyPilot will work, but it’s going to be slower and more limited than solutions like RDP or VNC.

            TinyPilot is more for scenarios where RDP or VNC don’t work. For example, if you need to access a machine before its OS boots, or the machine has no network connectivity, or you can’t install client software on the machine.

            1. 1

              Interesting idea, especially because the Raspberry Pi has the camera input available out of the box to capture the presenter. Combining this as a picture in picture stream (on the Pi), or even presenting them on the same webpage as different streams would make for a nice portable conference device.

              1. 1

                I’m investigating it for specifically this purpose.

              1. 1

                If I understand it correctly, this device could drive two machines at the same time? Would it be possible to extend it to more?

                1. 2

                  Multiple HDMI capture devices could run concurrently, as they're just standard USB capture devices, so that part should be fine. I think you'd run into problems with the keyboard component, as it requires the OTG/gadget USB port to present as a virtual keyboard device to the server, which is why the USB-C port is used and a splitter is required.

                  I have invited the original author to provide some feedback in this thread.

                  1. 1

                    Author here.

                    This can actually only control one machine at a time. The Pi only has one USB port capable of impersonating a keyboard, so it’s limited to just one.

                    One way you can extend it to multiple machines is by combining TinyPilot with a non-networked KVM. It should work as long as the KVM supports switching targets by hotkey. I know of at least one user currently doing that.

                    1. 3

                      Thanks for clarifying that. Couldn’t something like this https://www.testdevlab.com/blog/2016/08/how-to-create-programmatically-switchable-usb-hub/ be used, but with an RPi instead of an Arduino? That way, the RPi selects which port it talks to?

                      1. 2

                        I haven’t investigated it thoroughly, but my intuition is that it wouldn’t work, unfortunately.

                        From the perspective of the target computer, the Pi is a USB keyboard. So I think this would be equivalent to plugging a USB keyboard into a USB hub and then trying to get it to type into multiple different machines.

                        Hubs I’ve seen allow multiple devices to share one port, but they don’t support multiple machines sharing a single device.

                        1. 1

                          That’s a good point.

                          The link I shared is something that tries to use the hub, but allows for programmatically enabling/disabling ports. With that ability, if you disabled all but one port, it could work, no? If yes, would the same approach also be needed for the HDMI inputs?

                          1. 1

                            The same approach wouldn’t be required for the HDMI USB devices because they are just standard USB devices. The HDMI dongles present as video capture cards in the Raspberry Pi, they aren’t presenting as a device to the managed PC that would need to be ‘shared’ like the keyboard with the OTG port.

                            As long as you have USB ports spare on the Raspberry Pi (or even with a hub), and adequate power, I don’t see an issue scaling out with additional HDMI capture devices to capture the video input.

                            -edit-

                            I just tested this on a similar ARM device through a USB hub with 2 x HDMI capture devices attached, and they were detected fine, with the video capture devices presented as /dev/videoX.
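
                            For anyone wanting to script that check, here's a quick sketch (plain Python, no extra libraries; the output obviously depends on what's plugged in):

```python
import glob
import os

# List V4L2 device nodes and their human-readable names from sysfs.
# With two HDMI capture dongles attached you'd typically see /dev/video0
# and /dev/video1 (newer kernels also expose extra metadata nodes).
devices = sorted(glob.glob("/dev/video*"))
for dev in devices:
    name_file = f"/sys/class/video4linux/{os.path.basename(dev)}/name"
    try:
        with open(name_file) as f:
            name = f.read().strip()
    except OSError:
        name = "(name unavailable)"
    print(dev, "->", name)
```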