1. 25

Do you remember Usenet back when it was useful and there was a genuine feeling of community? Do you remember how trolls were dealt with? Do you remember how, when the system for handling these individuals broke down and trolls got through anyway, it was still OK, because you could add someone to your killfile and never see another post or comment from them again?

I’d like to propose that this be added to Lobste.rs, for the sake of the community. Lobste.rs is big enough that chinks in the armour are beginning to show, and it would be awesome to just silently blackhole those who spread negativity.

Just as niceness begets niceness, being able to ignore those who are bent on lowering the tone contributes to a cleaner, kinder community.

“In the physical world we have to deal with such people”, “learn to be tolerant”, I hear you argue. But in the physical world we have rules and a societal structure with teeth. People are not welcome in certain environments when they exhibit behaviours deemed unpleasant, and it’s a lot harder to be a jerk when people call you on it to your face.

In the virtual world, it is possible to pretend these people don’t exist, or, more accurately, to make them not exist.

https://en.wikipedia.org/wiki/Kill_file

  1. 29

    Very interesting proposal. My personal feeling is that when toxic people are ignored but allowed to remain, they become “broken stairs” - dangerous for anyone who hasn’t been warned they’re there, an ongoing detriment to the community. Usenet was unmoderated, much like Twitter today, and users had no choice but to act defensively. Lobste.rs strives to do better, and actually get toxic people out of the community. It’s not fair to new people to have to find out about broken stairs the hard way.

    I realize that not all conflicts are resolved “the right way”; moderators aren’t perfect. And it’s not always reasonable to ask users to bring toxic behavior to moderator attention, for a variety of reasons. But I would ask it, of anyone who feels safe doing so, as a favor to present and future fellow community members. This post is entirely my personal opinion, not official, except for this: As a new moderator here, I am always willing to hear grievances in confidence.

    This is where the small scale of the site at least gives us a chance. @jcs has an excellent track record of acting on unrepentant toxicity, and personally I’ve felt it’s quite rare here compared to the rest of the internet.

    That said, it’s an excellent idea to have a discussion about it. With all that in mind, do you still feel you’d like a killfile or ignore feature? I know that you were on the receiving end of some of the last two weeks’ unpleasantness, and I’m hopeful you feel that was an isolated incident, but I’m prepared to hear otherwise.

    1. 14

      With all that in mind, do you still feel you’d like a killfile or ignore feature?

      I do. I believe there is a space between someone who should be drummed out of the community (abusers, jerks, spammers, trolls, and worse) and someone whom I simply don’t want to have to read (serial self-promoters, people who just hammer one point or topic regardless of whether it connects with the story, etc.). That said, maybe the right place for such a feature is client-side filtering… but it would be nice for the ignore list to be available anywhere I log into lobste.rs.

      PS: nice to see you with a Sysop tag.

      1. 3

        Another category of person, though hopefully fairly rare, is people who might be fine but just really rub me the wrong way. A killfile in that case is more about me than them: it helps keep me from being unnecessarily annoyed or drawn into unproductive arguments with people I really do not get along with.

        I don’t think there are currently any crustaceans in this category for me, but there are a few prolific HN commenters in it.

      2. 10

        I agree that a killfile is not a good overall solution for the health of the whole site, and that on its own it helps single users at the expense of the rest. However, it seems like “number of people who have added user X to a killfile” would be a very helpful piece of data which could help inform the decision to take more drastic action.

        If it’s not implemented as part of the site, people might just implement it as a client-side solution (Greasemonkey, etc.), and then you would lose the ability to have that data available for moderation decisions.

        1. 12

          A similar feature is easily derived from “how many people have flagged this post/comment as trolling?”. The data is already there.

          1. 1

            True; there doesn’t seem to be a downside to aggregating this from posts versus having a dedicated per-user score.

        2. 6

          As a new moderator here, I am always willing to hear grievances in confidence.

          Congrats on the moderatorship btw! I think they (the admins) chose well. :)

          1. 10

            Well, thank you. :) It’s a responsibility, etc. Hopefully I’ll continue to deserve that sentiment. :)

          2. 5

            I’ve definitely noticed the ratio of insight:snark in the comments change for the worse in the past month.

            1. 11

              The main offender I’ve seen, Yui, has been banned, so perhaps that will help? I haven’t been paying close enough attention to know if it’s a more general trend.

            2. 3

              But isn’t the karma already a good indication of “toxic” people? And by “toxic” I explicitly do not mean people with diverging opinions or mindsets, even on topics that most Westerners are very sensitive about (homosexuality, gender, …), which are still within the range of free expression.

              I think what could be helpful is to somehow present the karma a bit more prominently. The karma is a value assigned to individuals by the community within Lobsters, so when somebody with low karma posts something somewhere, there’s a pretty high chance that this person didn’t post something nice. However, it may “break up” the equality here a bit. It all comes at a price…

              All in all, most people downvote opinions they don’t agree with, but I don’t think that minority opinions should lead to bad karma, so maybe a “reputation” could be based mostly on how often the user has been flagged, plus maybe a pinch of the karma they have earned.
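
              For concreteness, here’s a back-of-the-envelope sketch of such a reputation score. It’s only an illustration: the field names and the 0.9/0.1 weighting are invented, not a proposal for the actual formula.

              ```typescript
              // Hypothetical "reputation": dominated by how often a user's
              // posts are flagged, with only a pinch of karma mixed in.
              interface UserStats {
                flagsReceived: number; // times this user's posts/comments were flagged
                postCount: number;     // stories + comments submitted
                averageKarma: number;  // total karma divided by postCount
              }

              function reputation(u: UserStats): number {
                const flagRate = u.postCount > 0 ? u.flagsReceived / u.postCount : 0;
                // Mostly flags, "a pinch" of karma; weights chosen arbitrarily.
                return 0.9 * (1 - Math.min(flagRate, 1)) + 0.1 * Math.tanh(u.averageKarma);
              }
              ```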

              1. 4

                But isn’t the karma already a good indication for “toxic” people?

                No, I don’t think it is. New users and users who don’t comment or submit stories very often will also have low karma. I’ve only got 6 karma, but it’s not because I’ve been down-voted a bunch; rather, I just don’t comment very often.

                1. 2

                  My score is 177, averaging 1.86 per story/comment. I was referring to the value 1.86 :)

                  New users start with 0.00.

                2. 4

                  You want to be very careful with how you balance karma prominence. Too prominent, and people start trying too hard to game it. (I’m not hugely fond of presenting the average, even though it’s easy to compute, because that incentivizes sticking to “safe” comments and active threads, regardless of how valuable what you have to say is. I find myself doing that, even though I’m aware of it and try not to.) Not prominent enough and nobody cares, though ranking based on it has an impact even if nobody can see the absolute numbers.

              2. 24

                I’ll take the bait: I think killfiles are misguided, I think they show an unwillingness to take personal responsibility for the evolution of the community, and I think that adding them (in any capacity) to lobsters would be a step backwards. I think a handful of important soul-searching questions are worth examining here.

                First, we all agree that there are absolutely unrepentant trolls out there. This is a known fact. So, first question: all of us have control over who we decide to invite and not invite…why are we inviting in trolls? Why are we inviting in complete randos?

                For example, I ended up bringing on a couple of folks (with some invites outstanding) after some minor background checks or whatever. I’ve had folks ask for invites and then be left out because I checked their username against, say, HN, and saw only adspam or what I consider garbage posts. If they just want to enjoy the lobsters content, they don’t need to register. So, maybe instead of throwing up our hands in the air and saying “well golly jeepers, how did the trolls get in?” we should consider doing a little due diligence.

                Second, I look at statements like

                it would be awesome to just silently blackhole those who spread negativity.

                and I am immediately somewhat concerned. Who will tell these folks they’ve been blackholed? Who decides what is and is not negativity? Hell, a lot of the reason we have a public moderation log is the arbitrary backroom moderation of places like HN. Are the negative people the ones who are loudest, the ones we disagree with, the ones who are trolls, or the ones who are broken records? If we use a killfile just to keep out badthink (and that is exactly what is being sketched out here), what exactly do we consider badthink?

                If we can’t at least come up with a solid, public, visible, well-defined list of badthink, then a killfile is just a way of screwing over people we don’t care for. If we decide to go down the route of censorship, let’s at least have the spine to be upfront about what we will and won’t censor.

                Third, I want to suggest that we really look at the existing toxicity on lobsters. It’s easy (and quite popular these days!) for folks (well-meaning or not) to claim a sort of vague general toxicity in a community, then to enact rules and regulations to prevent the aforementioned toxicity, and then the brain death and fear set in. This is especially the cycle if the community falls prey to believing in abstract “there was an incident” language. So, my third question is: what are the concrete incidents and perpetrators in the last $time_period that motivate discussion of a killfile?

                Let’s not settle for vague accusations, for catty “well, a user who won’t be named was mean or said something that I felt threatened by”, for nonspecific “the environment is getting toxic” rhetoric. Folks, we expect more from our compilers and politicians, so let’s please have the spine to at least call out the behavior we dislike.

                Fourth, let’s consider what recourse we actually have available to us. Remember, we have:

                • downvoting of bad posts
                • flagging of bad articles
                • complaining to mods
                • hiding of bad tags
                • suggestions for improving tags

                Those worked fine to remove the content of at least one jerk user recently, yui. So, the fourth question: do we really lack the tools we need to deal with the occasional bad-faith poster?

                ~

                I’d suggest a lot of care and debate on this topic, because it has huuuuge ramifications for the next couple of years of culture.

                Remember, we don’t have to do something just because somebody says “oh god something must be done”.

                1. 9

                  I’d like to be able to killfile someone not because they’re toxic but because they’re boring, in the same way that I can hide articles or tags without (I feel) making a moral judgement about them. There’s one particular user who… ok, I won’t be coy, it’s michaelochurch, no offense intended and I’m not trying to make this personal. I don’t think he’s a troll. I don’t think he’s a badthinker. I don’t think he’s acting in bad faith. I don’t really want him banned. It’s just that he’s posted broadly the same things many times over in many comments and I don’t care to read or engage with what he says any more.

                  1. 7

                    let’s please have the spine to at least call out the behavior we dislike

                    If we’re going to change anything about the way this site works based on unpleasant individuals, I agree that it should be done in a public and well-documented manner. Isn’t that what is supposed to set lobste.rs apart from other similar sites?

                    1. 4

                      It is. One of the dangers of a killfile is that it subverts the transparency of the public moderation log by instituting de facto private moderation.

                      Another thing to consider is that on the whole the Lobste.rs community has done a good job of limiting group think by restricting reporting to specific cases (none of which are “I don’t like this”). A killfile system would likely lack this sort of social agreement: that we disagree publicly, and only moderate bad-faith discussion.

                    2. 5

                      there is a difference (i’d go as far as to say a large difference) between “this person is behaving in ways that are toxic to the lobsters community” and “this person annoys me, bores me, or otherwise gets on my nerves”. moderation is for the former case; as irene says, these people are broken stairs and should not be entitled to use lobsters as a platform to indulge their negative behaviour. but no one is entitled to have me, personally, read what they have to say! i am not “screwing over” or “censoring” someone if i choose to filter them out from my personal feed; it’s the difference between having the host ban someone from a party and simply choosing to spend your party time being wherever they are not.

                      1. 2

                        You do raise a fair point–we do need to figure out if the killfiles are per-user or site-wide.

                        A low-hanging fruit would be to promote a user to the global killfile once enough people add them to their personal killfiles…but even then, that does start to look a lot like the censorship issues we mention above.

                        This is one of those cases where the technology is pretty easy, but the people are tricky.
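
                        To show how little code the mechanical part needs, here’s a sketch of that promotion rule. Everything in it is hypothetical: the names, the data structure, and especially the threshold.

                        ```typescript
                        // Promote a user to the global killfile once enough people
                        // have added them to their personal killfiles.
                        const GLOBAL_THRESHOLD = 50; // arbitrary cutoff

                        // personalKillfiles maps each user to the set of users they ignore.
                        function globallyKilled(personalKillfiles: Map<string, Set<string>>): Set<string> {
                          const counts = new Map<string, number>();
                          for (const ignored of personalKillfiles.values()) {
                            for (const user of ignored) {
                              counts.set(user, (counts.get(user) ?? 0) + 1);
                            }
                          }
                          // Anyone ignored by enough people becomes hidden for everyone.
                          return new Set(
                            [...counts].filter(([, n]) => n >= GLOBAL_THRESHOLD).map(([u]) => u),
                          );
                        }
                        ```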

                        1. 3

                          the idea of a site-wide killfile hadn’t even crossed my mind - i think of “killfile” as a strictly personal thing, as opposed to moderation, which is a site-wide thing. the line is blurred somewhat by people subscribing to shared killfiles (popular on twitter, due to the site’s refusal to do anything more than the most half-assed abuse prevention), but i’m pretty sure the lobsters implementation would just end up being people privately directing other people’s posts to /dev/null

                    3. 9

                      Usenet had killfiles because there was no way to downvote or report comments. I think the latter is a better solution; maybe not perfect, but definitely better.

                      Killfiles are a method of ignoring people, but as with setting up a filter for spam, you hide it from yourself and not from everyone else. The spam is still there. Have you looked at Usenet recently? It’s a spam minefield. I bet its users would love downvoting/reporting abilities instead.

                      1. 9

                        Many years ago I frequented a private forum that had some people I didn’t like and would constantly argue with, so I created a Greasemonkey script that would just filter out posts from those people. I announced the script on the forum and some people were happy to use it themselves while others were upset that such a thing existed and thought that I was being immature.

                        Since we have tag filters to weed out topics you don’t want to see, it seems natural to be able to filter out users you don’t want to see. As others have said, though, my concern is that certain people would just get filtered out by everyone, leaving them effectively shadowbanned and ignored.

                        I don’t really have a strong opinion one way or the other for this feature, but at least as far as being the person that would have to write all the code to support this server-side, I’ll say that I don’t want to implement it. Usenet kill files were client-side anyway, and nothing is stopping you from writing a Greasemonkey script (or whatever is cool these days) to filter out what your browser shows you.
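
                        For anyone who does want to go the Greasemonkey route, a minimal script might look something like this. It’s written as TypeScript (which compiles to the plain JavaScript Greasemonkey runs), and the CSS selectors are guesses about the site’s markup rather than the real thing:

                        ```typescript
                        // ==UserScript==
                        // @name  lobsters-killfile
                        // @match https://lobste.rs/*
                        // ==/UserScript==

                        // Personal killfile: usernames whose comments get hidden locally.
                        const KILLFILE = new Set(["example_user"]);

                        // Hide any comment whose author is in the killfile. The ".comment"
                        // and "a.u" selectors are assumptions and would need checking
                        // against the live HTML.
                        document.querySelectorAll<HTMLElement>(".comment").forEach((comment) => {
                          const author = comment.querySelector<HTMLAnchorElement>("a.u");
                          if (author && KILLFILE.has(author.textContent?.trim() ?? "")) {
                            comment.style.display = "none";
                          }
                        });
                        ```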

                        1. 1

                          Agreed. I posted a link recently to how to make bookmarklets so this sort of thing can be hacked together.

                        2. 6

                          But in the physical world we have rules and societal structure that has teeth.

                          I think (i) moderators and (ii) users’ abilities to downvote and report replicate this in the virtual world, and this seems much better than resorting to ghosting people out of your existence (but nobody else’s).

                          Inasmuch as we want to cultivate a better community, I think we should reject suggestions that curate a single user’s view of discussions in particular.

                          Maybe you like to hang out on a particular tag page; fair enough. But discussions are what make the community unique; they are the community inasmuch as they’re the only place where we actually interact with each other beyond a simple shared bookmarking site.

                          Fragmenting that aspect of it seems ill-advised, compared to accepting some small personal loss (seeing comments from Randy X. Hacker, whom you just can’t stand) in return for a large collective gain (seeing Randy X. WhyCan'tIViewThisOnTor’s comments, responding to them, and a community consensus emerging where it becomes obvious we gain nothing and lose some of our sanity by having them here).

                          1. 4

                            Just as niceness begets niceness, being able to ignore those who are bent on lowering the tone contributes to a cleaner, kinder community.

                            I don’t see this as a benefit to the community at all - it may benefit individuals, but at the expense of the community because tone-lowering individuals are tolerated. Real problem people in the community should be addressed (and I believe they are). If it’s below the level of being a problem, then I believe the voting system handles this pretty well already.

                            I think the functionality described would lead the community in a wrong direction where participants inadvertently (or advertently?) set themselves up in an echo chamber and lose the valuable discussion that comes from being exposed to conflicting perspectives.

                            1. 4

                              I have mixed feelings about this. On one hand, I’m not really against it on first principles. If you’ve “flipped the switch” on someone, there’s no value in reading that person’s comments, and then maybe you shouldn’t be presented with them at all. That said, I see it as a non-solution to a non-problem. I say “non-problem” because I haven’t encountered anyone here whose comments were so bad that I decided, “I never want to read what this person has to say”. It’s a non-solution because kill-files are not a good way to deal with the extremes of the bad-behavior spectrum: harassment, doxxing, spamming. We’ll still need to ban those people.

                              My worry is (a) about fragmenting the discussion, and (b) that kill-files focus on the wrong thing: who is saying something rather than what is said.

                              I’d much rather have a client-side “collapse” feature that allows one to hide a post (and its replies) from their main view, and perhaps even an “auto-collapse” feature that allows them to list users that they’ve decided they don’t want to bother with. (This also allows them to uncollapse posts, as opposed to a kill-file, under which they wouldn’t know that the post even existed.) I’m not against the personal auto-collapse existing, even though I hope that few people would use it.

                              1. 4

                                I’d much rather have a client-side “collapse” feature that allows one to hide a post (and its replies) from their main view

                                That feature already exists next to the username above each comment.

                                1. 1

                                  perhaps even an “auto-collapse” feature that allows them to list users that they’ve decided they don’t want to bother with. (This also allows them to uncollapse posts, as opposed to a kill-file, under which they wouldn’t know that the post even existed.)

                                  This seems like the best way to deal with this if we decide to implement killfiles at all.

                              2. [Comment removed by author]

                                1. 4

                                  Engaging in a public space necessarily involves interaction with other people. Lobsters has many mechanisms in place already to a) limit the influx of bad actors, b) manage and punish the actions of bad actors. These mechanisms are built with certain criteria in mind, including:

                                  • Transparency (public moderation log and user invite tree)
                                  • Focus on quality discussion from a variety of opinions and viewpoints (don’t downvote or flag for disagreement)

                                  I don’t know that it’s worthwhile to introduce a feature that subverts these major ideals of the site to allow people to simply hide (de-facto silently moderate) the words of people they find disagreeable. Flagging already exists. If the problem is response time, then more moderators may be warranted instead (and indeed, Irene has recently been added as one).

                                  1. [Comment removed by author]

                                    1. 4

                                      Hell, I’d volunteer too–that said, the people who want that sort of job are typically not at all the people who should be doing that job.

                                  2. 2

                                    I do somewhat have to disagree with you here.

                                    I’m sure that there are people around who are silenced, and who are hesitant to use the system in place to defend themselves and their interests. I’m not sure that those people are in this community, and the problem with what you’re sketching out is that–nearly by definition–we can’t account for them. And thus, you propose potentially drastic community changes with only the silent and uncountable potential victims as your justification. We shouldn’t let policy be held ransom by a population we can’t enumerate.

                                    Having dealt with online communities for quite some time, I would suggest that the people who are too timid to use the tools that they have and to stick up for themselves are simply not going to have a good time. If they want to be part of the conversation, then they need to deal with the people who are conversing. They are not owed special treatment, and it’s very patronizing of us to assume that they are.

                                    For our part, we need to be vigilant against trolls and friendly and sympathetic to people who do have complaints–but once we’ve made that effort, we need to be met halfway.

                                    One cannot defend, in any meaningful sense and with any long-term benefit, people who are willfully defenseless.

                                  3. 3

                                    …but the majority of posts get zero comments.

                                    1. 2

                                      Weren’t killfiles largely a client-side tool back in the day? My initial thought is: perhaps just write a Greasemonkey script to hide those folks you want hidden. No need to burden the server side with extra complexity.

                                      That initial thought aside, I have personally discovered features and tools on more modern forums that I like more than just blacklisting. In particular, I like: 1) systems that track the votes I have made to a given user, so I get an at-a-glance reminder next to their posts as a heuristic of my previous thoughts on them and 2) systems that let me assign a ‘foe’ tag to a user, so I am, again, visually reminded when seeing one of their posts or comments. The key to both of these is that they give me context rather than filtering. I might use that context to allocate less cognitive load to that user, but I am not hiding them from me. If they improve or have perpendicular behaviours, I still get to see that, and give a new chance to judge.
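
                                      In the same client-side spirit, a “foe” tag is barely more code than a killfile; the difference is that it annotates rather than hides. A rough sketch, with the same hypothetical selectors as the killfile script upthread:

                                      ```typescript
                                      // Mark foes visibly instead of hiding them, keeping context.
                                      const FOES = new Set(["example_user"]);

                                      document.querySelectorAll<HTMLAnchorElement>("a.u").forEach((author) => {
                                        if (FOES.has(author.textContent?.trim() ?? "")) {
                                          const badge = document.createElement("span");
                                          badge.textContent = " [foe]";
                                          badge.style.color = "#c00"; // assumed styling, purely cosmetic
                                          author.after(badge);
                                        }
                                      });
                                      ```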

                                      If a user is a true troll, they will be taken care of via a ban. If I simply disagree with them, hiding their activity only creates a self-imposed ignorance bubble that lessens my own context and learning.

                                      1. 1

                                        They were a client-side tool, but in the webapp context, the “client” is now mostly provided by the website’s interface. For example, comment threading and expansion/collapsing used to be a problem for the client, but now it’s part of the built-in lobste.rs interface. In practice I don’t find greasemonkey scripts to be a great solution for most things, because they’re tedious to write and bitrot as sites change, sort of the in-browser equivalent of trying to maintain a webscraper.