1.

This is a bit of a follow-up to my earlier post on string unescaping. I don’t think the approach as I’ve written it is very practical, but it’s an intriguing direction. I believe parsing JSON on GPU can be done, but it probably requires some very clever and tricky techniques to work well with the memory hierarchy available on GPU.

  2.

    I wonder what programming languages are nice for this scan/scatter style of programming. I’ve seen a lot of new GPU languages floating around, but I haven’t tried them.

    The algorithm seems pretty elaborate. CUDA is an obvious choice but I wonder how annoying it would be to get a full JSON parser working in it (as opposed to the Dyck language). I think you would need a lot more than 8 steps, e.g. to do the backslash escape decoding.

    Anyway it would be interesting to see a prototype of something runnable in a high-level language! APL? :)

    (Although I don’t think APL-like languages give you the ability to express numeric types of different widths, which you probably want for GPU programming.)
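    To make the scan/scatter style concrete: a minimal sketch (my own illustration, not code from the post) of the core primitive behind Dyck-language checking is a prefix sum over bracket deltas. Written here sequentially with `itertools.accumulate`; on a GPU each step would be a parallel scan kernel.

    ```python
    from itertools import accumulate

    def depths(s):
        # Map '{' -> +1, '}' -> -1, then prefix-sum to get the
        # nesting depth after each character.
        deltas = [1 if c == '{' else -1 for c in s]
        return list(accumulate(deltas))

    def is_dyck(s):
        # Valid iff the depth never dips below zero and ends at zero.
        d = depths(s)
        return all(x >= 0 for x in d) and (not d or d[-1] == 0)

    print(depths("{{}{}}"))   # [1, 2, 1, 2, 1, 0]
    print(is_dyck("{{}{}}"))  # True
    print(is_dyck("}{"))      # False
    ```

    A full JSON parser would need many more such passes (string/escape masking, tokenization, structural indexing), which is presumably where the extra steps beyond the 8 come in.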

    1.

      This is interesting, but if the goal is to get the best performance then JSON isn’t the best format in the first place.