Getting MathJax to Play Nicely with Markdown and Highlight.js

July 24, 2017

MathJax and markdown tend to fight with one another a bit. When I started blogging math notes on here, the combination of MathJax, Cryogen's markdown parser, and Highlight.js fought with one another a lot. So here's a quick tutorial on fixing it.

The assumption here is that you want to write in normal markdown, i.e., the kind of thing that you could convert to a PDF with pandoc, and you want to write LaTeX math. But you observe that doing so blows up when you convert to HTML and use MathJax.
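The rough failure mode: when two inline math spans each contain an underscore, an emphasis-happy markdown parser can pair the underscores as `<em>` tags before MathJax ever sees the page. An illustration (the exact mangling varies by parser):

```text
Source:   Let $x_i$ and $y_j$ be the features.
Rendered: Let $x<em>i$ and $y</em>j$ be the features.
```

MathJax then finds broken HTML inside the math delimiters and gives up.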

Continue reading →

Avoiding Inheritance Through Really Unidiomatic Python

July 17, 2017

So, confession: my brain works in a functional way, and really doesn't work in an object-oriented way. This can be a bit of an issue when you're using an object-oriented language and trying to avoid excessive code duplication.

So here's some really unnatural stuff I just did. The problem: I'm writing a Python library to wrap a bunch of legal and political APIs with a simpler interface. (Extreme work in progress.) I had a bunch of code that looked very similar. For example, this was what two of my session objects (interfaces to different APIs) looked like, in relevant part:
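The sessions' actual code is behind the fold, but the general trick of replacing a shared base class with a closure-producing function can be sketched like this (all names and URLs here are invented for illustration, not the library's real API):

```python
# Hypothetical sketch: instead of a Session base class that each API
# wrapper subclasses, a plain factory function closes over the per-API
# details. All names and URLs below are invented for illustration.
def make_session(base_url, key_param):
    def build_url(endpoint, **params):
        # The shared URL-building logic lives here once, not in a base class.
        params[key_param] = "DEMO_KEY"
        query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
        return f"{base_url}/{endpoint}?{query}"
    return build_url

# Each "session" is just a configured function; no inheritance involved.
congress = make_session("https://api.example.gov/v1", "api_key")
courts = make_session("https://courts.example.org/api", "key")
```

Whether this counts as idiomatic Python is exactly the question the post wrestles with: the usual object-oriented answer would be a base class, and closures are the functional-brain workaround.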

Continue reading →

Mathy Ng Lecture 5: generative learning algorithms, naive bayes

July 17, 2017

Ng lecture 5: generative learning algorithms

The idea of generative algorithms

Continue reading →

A Debugging Trek, and: (naive) Batch Gradient Descent in Haskell

July 17, 2017

So I implemented batch gradient descent in Haskell, to simultaneously solidify my understanding of the algorithm and work on learning Haskell.

It got a bit bumpy. I've preserved my real-time notes of the mess. But the short version is that after a certain number of iterations, which was a function of the learning rate, the model would just terminate with weights of Infinity for all features.

Continue reading →

Lecture 2 of Andrew Ng's mathier ML course

July 7, 2017

One of the things I'm doing at RC is working through the mathier version of Andrew Ng's famous machine learning course. Here are my notes from the first substantive lecture (lecture 2).

N.B. the math is all in code blocks because the markdown processor screws with underscores and carets otherwise, and MathJax can't handle that. This is making me insane, and I might actually write some kind of post-processor to jerk around the generated HTML files to fix this, but it'll have to do for now.
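One shape such a fixer could take, sketched here as a pre-processor on the markdown rather than a post-processor on the generated HTML (the regex and the escaping strategy are assumptions, not what I actually ended up doing): escape underscores and carets inside `$...$` spans so the markdown parser passes them through untouched for MathJax.

```python
import re

# Hypothetical sketch: escape _ and ^ inside inline $...$ math spans
# before the markdown parser runs, so it can't treat them as emphasis.
# Does not handle $$...$$ display blocks or escaped \$ characters.
def protect_math(text):
    def escape(match):
        body = match.group(1).replace("_", r"\_").replace("^", r"\^")
        return "$" + body + "$"
    return re.sub(r"\$([^$\n]+)\$", escape, text)
```

The markdown parser then emits the literal `_` and `^` characters, which is what MathJax needs to see.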

Continue reading →