Home from LambdaConf. The Haskell training, Type-Kwon-Do workshop, talk on Rank N Types, and OOP vs. FP panel all went well.
Very happy to be back in my house with my dogs. On to editing the Haskell book now, which for now means spending my evenings working through our Zendesk backlog. We’re on (checks…) ticket number 1,124 for the book. That doesn’t count the private review we do; this is just the public stuff.
I’ve been poking around asciidoctor, contemplating a potential alternative to doing the next two books entirely in LaTeX. I suspect Julie would be perfectly happy staying in LaTeX, and to be honest, so would I, but I think it would be a source of friction if we collaborated with other people. One issue I’m running into is that all the existing asciidoc -> PDF pipelines are terrible, so it’d be a split build anyway.
Another thing on my plate is finishing up the blog post on documentation for our micro-publisher thing so that we can publicize it more.
I’ve done the first three available courses in this and it’s been really good.
The first course is a bit like “Make Your Own Neural Network” (https://www.amazon.com/Make-Your-Own-Neural-Network-ebook/dp/B01EER4Z4G), where a neural network is coded from scratch. The second course goes through some optimisations to speed things up and improve accuracy, and introduces some basic TensorFlow. The third course is really short and focuses on structuring training/dev/test data sets.
The Python bits are done in Jupyter notebooks which Coursera host themselves, so you don’t need to set up anything locally.
The remaining two courses will be available in October and November.
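For anyone curious what “coded from scratch” looks like in the first course, here is a minimal illustrative sketch in that spirit (my own NumPy toy, not the course’s actual code): one hidden layer, sigmoid activations, and a hand-written backward pass trained by full-batch gradient descent on XOR.

```python
import numpy as np

# Minimal "from scratch" neural network, roughly the flavour of the first
# course (an illustrative sketch only, not code from the course itself):
# one hidden layer, sigmoid activations, full-batch gradient descent on XOR.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

# weights and biases: 2 inputs -> 4 hidden units -> 1 output
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

lr = 0.5
losses = []
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # backward pass: chain rule, using sigmoid'(z) = s * (1 - s)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # gradient descent update
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print("loss:", losses[0], "->", losses[-1])
```

The second course’s speed-ups (vectorisation, better optimisers) and the TensorFlow introduction are essentially about replacing this kind of hand-rolled backward pass.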
That’s very interesting to hear. I was thinking I’d have to do the next session, as I can’t start on anything till 1 Oct, but maybe I can catch up. How close are the estimated times to reality?
Accurate enough. You can start and switch sessions later if you need to pause and catch up.