1. The creator (@jil) live streams development on Stork — https://www.twitch.tv/jameslittle230

   1. I haven’t used this, but I have used lunr, which promises the same thing. It wasn’t a great experience, and the index quickly grew too big to be a good idea. To be honest, I can’t really see a use case for something like this. If your content doesn’t change very often, just use google/bing/whatever to search your site, if you can’t have any server-side moving parts.
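
      For example, a search box can simply redirect the browser to a site-scoped Google query. A minimal sketch, with example.com as a placeholder for your own domain:

      ```typescript
      // Client-side "search this site" with no server-side moving parts:
      // send the visitor to Google with the site: operator.
      // example.com is a placeholder, not a real endpoint.
      function searchSite(query: string): void {
        const q = encodeURIComponent(`site:example.com ${query}`);
        window.location.href = `https://www.google.com/search?q=${q}`;
      }
      ```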

      1. > If your content doesn’t change very often, just use google/bing/whatever to search your site

         This might not be the best approach for some use cases; for example, I wouldn’t want to wait for an external service to index my pages. I also don’t have any control over how things get indexed, categorized, etc. I’ve been using https://docsearch.algolia.com/ for my personal wiki, and it’s been a wonderful experience so far!

         1. A good use case is offline documentation.

            1. That’s a good idea.

            2. It would be interesting if you could shard the indexes so that you get good long-term caching. If your site adds content slowly, then you should be able to generate a (relatively large) index of all of it at one snapshot, and then a much smaller index of the newer content. Both can be marked with multi-year caching policies, but the URL for the second one will change every time you add content to the site. Once the second index is sufficiently large, you add a third, and so on, until the aggregate size of the newer indexes is large enough that it’s better to download a complete index again.
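
               A minimal sketch of that scheme, assuming a tiny manifest file that lists content-addressed shard URLs (the manifest format, paths, and function names here are invented for illustration; none of this is Stork’s actual API):

               ```typescript
               // The manifest is the only short-lived resource; it is revalidated on
               // every visit and grows by one entry whenever new content is published.
               interface ShardManifest {
                 // Shard URLs are content-addressed (e.g. contain a hash), so each
                 // shard is immutable and safe to serve with a multi-year cache policy.
                 shards: string[];
               }

               async function loadIndexShards(manifestUrl: string): Promise<ArrayBuffer[]> {
                 // Always revalidate the manifest, since its contents change over time.
                 const res = await fetch(manifestUrl, { cache: "no-cache" });
                 const manifest: ShardManifest = await res.json();

                 // Immutable shards can come straight from the HTTP cache on repeat
                 // visits; only the newest (small) shard is ever a fresh download.
                 return Promise.all(
                   manifest.shards.map(async (url) => {
                     const shard = await fetch(url, { cache: "force-cache" });
                     return shard.arrayBuffer();
                   })
                 );
               }

               // Usage: load every shard, then search each one and merge the results.
               loadIndexShards("/search/manifest.json").then((shards) => {
                 console.log(`loaded ${shards.length} index shard(s)`);
               });
               ```

               The manifest stays small and is the only URL whose contents change; every shard is immutable, which is what makes the multi-year caching policy safe.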