1.

    Reminds me of https://metacpan.org/pod/Acme::Bleach, which leads to this reference - https://www.templetons.com/tech/proletext.html - of interest given the recent kerfuffle about HTML email.

    1.
      my $tie = " \t" x 8;            # whitespace marker that tags bleached source
      sub whiten { local $_ = unpack "b*", pop; tr/01/ \t/; s/(.{9})/$1\n/g; $tie.$_ }
      sub brighten { local $_ = pop; s/^$tie|[^ \t]//g; tr/ \t/01/; pack "b*", $_ }
      sub dirty { $_[0] =~ /\S/ }     # any visible characters left?
      sub dress { $_[0] =~ /^$tie/ }  # starts with the marker?
      open 0 or print "Can't rebleach '$0'\n" and exit;
      (my $shirt = join "", <0>) =~ s/(.*)^\s*use\s+Acme::Bleach\s*;\n//sm;
      my $coat = $1;
      my $pressed = '#line ' . ("$coat\n" =~ tr/\n/\n/) . ' ' . (caller)[1] . "\n";
      local $SIG{__WARN__} = \&dirty;
      # Already bleached: decode the whitespace and run it.
      do {eval $coat . $pressed . brighten $shirt; print STDERR $@ if $@; exit}
          unless dirty $shirt && not dress $shirt;
      # First run: rewrite the script in place, turning everything
      # after the "use Acme::Bleach;" line into spaces and tabs.
      open 0, ">$0" or print "Cannot bleach '$0'\n" and exit;
      print {0} $coat . "use Acme::Bleach;\n" . whiten $shirt and exit;

      Perl was something else, huh.
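
      For anyone who hasn't seen it run, here's a minimal before/after sketch (assuming the module is installed from CPAN):

        # hello.pl -- as originally written
        use Acme::Bleach;
        print "Hello, world!\n";

      Run it once and the file is rewritten in place: everything after the use line becomes spaces and tabs, yet perl hello.pl still prints "Hello, world!" on every subsequent run.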

      1.

        Madness. I love it!

        1.

          Wouldn’t this quickly stop working because of URL length limits in browsers? There is only one character, so for each new URL you just add one more character.

          Edit: I looked at the source code; it actually uses 2 different characters, so the encoding is much more effective and can address around 2^2000 URLs, which is probably more than the underlying storage could hold (the storage needed is far beyond comprehension). I think it could even be done without a storage layer by encoding the URL itself in zero-width characters, as in the sketch below. That could reliably handle URLs up to about 250 characters (a 2000-character limit divided by 8 bits per byte), and maybe longer, depending on the browser.
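
          A minimal sketch of that storage-free idea (hypothetical; not the shortener's actual code), using U+200B and U+200C as the two binary digits:

            use strict;
            use warnings;

            # zero-width space = 0, zero-width non-joiner = 1
            my @zw  = ("\x{200B}", "\x{200C}");
            my %bit = ("\x{200B}" => 0, "\x{200C}" => 1);

            # URL bytes -> bit string -> invisible characters
            sub encode_zw { join '', map { $zw[$_] } split //, unpack 'B*', shift }

            # invisible characters -> bit string -> URL bytes
            sub decode_zw { pack 'B*', join '', map { $bit{$_} } split //, shift }

            my $hidden = encode_zw('https://example.com/some/long/path');
            print decode_zw($hidden), "\n";    # round-trips the original URL

          It's whiten/brighten from above all over again, just with invisible Unicode instead of spaces and tabs.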

          1.

            But seriously, don’t use URL shorteners.