1. 9

  1. 9

    Almost all of these are awful for accessibility - positioning the field off the page with CSS, followed by display: none or visibility: hidden, are probably the most a11y-friendly, but they're also probably the most easily detectable by bots.
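    A rough sketch of the off-screen variant, assuming a trap field named "website" (the name, class, and endpoint here are just examples):

    ```html
    <style>
      /* Park the honeypot field far off-screen instead of display:none,
         which some bots check for. */
      .hp-field {
        position: absolute;
        left: -9999px;
        width: 1px;
        height: 1px;
        overflow: hidden;
      }
    </style>

    <form action="/contact" method="post">
      <!-- aria-hidden and tabindex="-1" keep the trap out of the way
           of screen readers and keyboard users. -->
      <div class="hp-field" aria-hidden="true">
        <label for="website">Leave this field empty</label>
        <input type="text" id="website" name="website" tabindex="-1" autocomplete="off">
      </div>
      <!-- real fields go here -->
      <button type="submit">Send</button>
    </form>
    ```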

    1. 4

      I love the idea of using the honeypot technique rather than a CAPTCHA, and I’ve used it in the past.

      My concern is that it may not be as effective today as it was when it was first introduced about 10 years ago. Bots may be smarter now, and it doesn’t take a whole lot to adapt to honeypots. If your site is targeted specifically, a honeypot won’t protect you, because the bot herder will notice the failures and fix the bot. It’s really only good for preventing untargeted things like blog spam, and I worry that it’s not as good for that as it used to be.

      1. 1

        Has anyone tried the honeypot technique? What are the pros and cons compared to CAPTCHA?

        1. 4

          You can run a headless browser to easily get around most of these, as they’ll render a page just like a legitimate user-agent would. Also, almost everything you try would be an accessibility issue. I’d say just use a captcha technique, and as always, don’t rely on the client for security.

          Edit: But a few of these would prevent the most basic spam attempts, which is probably a lot of them, so you might as well.

          1. 4

            It works very well for some of my forms. It has huge benefits over captcha:

            • it’s effortless and accessible for users (when done well: be careful about trap fields showing up in screen readers or for keyboard users).

            • it doesn’t discriminate against users who block trackers. Google’s new reCAPTCHA assumes that if Google can’t track you (you’re not logged in to Gmail), you must be evil and have to repent by filling out the captcha over and over again.

            The reason it works is that the majority of spambots by volume are the dumbest ones. A bot that only does a regex over plain HTML will be sending spam orders of magnitude faster than a bot that runs headless Chrome for every page, and therefore you’re orders of magnitude more likely to be spammed by a dumb bot.
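            The server-side half is just “reject if the trap field has a value”. A minimal sketch, assuming a Node/Express handler and a trap field named "website" (both hypothetical):

            ```js
            const express = require("express");
            const app = express();

            app.use(express.urlencoded({ extended: false }));

            app.post("/contact", (req, res) => {
              // A real user never sees the "website" field, so any value
              // in it means a bot filled out everything it found.
              if (req.body.website) {
                // Pretend it worked so the bot doesn't learn anything.
                return res.status(200).send("Thanks!");
              }
              // ... handle the legitimate submission here ...
              res.status(200).send("Thanks!");
            });

            app.listen(3000);
            ```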

          2. 1

            If requiring JavaScript is an option, a simple but effective method is to handle the form submit in JS. The vast majority of spambots seem to take a spray-and-pray approach where they get your HTML, look for any form, and try to send a POST to that form’s action URL. Leaving the action empty and storing the URL in something like data-action is enough to get rid of this level of spam. It’s easy enough to bind an event handler to the form’s submit event, and either use data-action to set the form’s action or just send the form’s contents in an XHR request.
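            A rough sketch of that approach (the form id, field names, and /contact endpoint are just examples):

            ```html
            <form id="contact-form" action="" data-action="/contact" method="post">
              <input type="text" name="name">
              <textarea name="message"></textarea>
              <button type="submit">Send</button>
            </form>

            <script>
              // Dumb bots POST to the (empty) action attribute; real browsers
              // run this handler, which reads the URL from data-action instead.
              document.getElementById("contact-form").addEventListener("submit", function (e) {
                e.preventDefault();
                fetch(this.dataset.action, {
                  method: "POST",
                  body: new FormData(this),
                });
              });
            </script>
            ```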

            In my experience this approach works well, at least on things like contact forms on smallish websites. There’s no fiddling with wording and trying to fool bots into filling a honeypot field. An obvious downside is the JS requirement.

            Executing JS is presumably too expensive for generic spam, though that equation will be different if the form gives access to something of greater value. Once a bit of human effort becomes worth it, a honeypot won’t be effective either (and even CAPTCHA-solving can just be farmed out to human solvers, see services like anti-captcha).