1. 23

  2. 3

    Have you considered how you’ll make this virtual world accessible to people with disabilities? I’m thinking in particular of blind people. The graphical virtual worlds that I’m aware of are completely inaccessible. The old MUDs, MOOs, etc. were accessible, because they were text-based.

    That said, this sounds like a worthwhile project. Good luck.

    1. 2

      I’m shooting for text-based worlds first; 2D and 3D worlds may be supported later.

    2. 3

      There’s the beginnings of a manifesto in the associated source repo too.

      1. 1

        Ah yes, heh… I need to finish writing it…

      2. 1

        I’m curious how something can be both anti-abuse and anti-censorship. Against whom doing the abusing/censoring?

        1. 7

          It’s a good question, and no system can be perfect. Here are the boundaries that I am interested in:

          • I don’t consider it censorship if someone filters your content out of their own view; that’s their own freedom to filter. Communities and individuals can, and should, decide when they would rather not see certain content.
          • However, if someone cannot find any venue at all for expressing their ideas, I do think that is censorship.
          • One assumption in contemporary social networks that I think is wrong is that everyone should be able to message you by default, or at least at no cost (or equal cost) to all parties (I go into this more in the post, and I shall go into it more later as well). Instead I think it makes sense to have multiple paths to one’s doorstep, which one may hand out judiciously. So that’s one filtering layer: the ability for people to reach you in the first place.
          • As said in the post, I think the assumption that moderation should happen at the instance level is causing a lot of the current problems on the fediverse… especially when we want as many people as possible to run instances. It’s not sustainable, and it leads to big fights. Instead I think mailing lists are a better model for moderation: you might join many different lists with different expectations of what is and isn’t acceptable behavior for the different facets of your life.
          1. 3

            I think censoring things by default, but indicating that something is censored (with an opt-in to view it), is a good balance. It could also work the other way around, but some censorship by default is generally expected for adoption.

            1. 4

              I think censoring things by default, but indicating that something is censored (with an opt-in to view it) is a good balance.

              We do visible, expandable censorship that collapses subthreads here on Lobsters. It’s a nice compromise. When I have little time, I can ignore that stuff and focus on high-priority content. Later, I might expand and skim it to see whether kicking it off the thread was something every group would be behind, or just one group dropping another. As long as there are no deleted comments, I can at least see what everyone said. There’s also sometimes good info in there that was simply unpopular with those viewing at the time.