There are always three things that trip me up in programming. The first is encoding. It's usually quickly fixed, so no biggie, but it always involves Windows (everyone else is on UTF-8 by now). Sometimes it's the console, sometimes git on the console, sometimes the web, sometimes HTML, often cp1252 or worse. In Python 3 on Windows it's normally fixed in seconds or minutes, but you're always going "this is not my problem, why the fuck am I still on this goddamn platform?!"
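A minimal Python 3 sketch of the usual failure mode (the sample string is mine, but the pattern is standard): decode with the encoding the bytes were actually written in, not the one you wish they were in.

```python
# Hypothetical bytes from a Windows tool that writes cp1252, not UTF-8.
data = "café".encode("cp1252")

# Decoding with the wrong codec blows up (or silently garbles the text)...
try:
    data.decode("utf-8")
except UnicodeDecodeError as exc:
    print(f"utf-8 failed: {exc}")

# ...while naming the source encoding explicitly round-trips cleanly.
text = data.decode("cp1252")
print(text)  # café
```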

The second thing is calendars. For starters there's UTC, daylight saving time, local time, translations, ISO 8601 and the epoch. And as if that weren't enough, there are always the server, the client, remote services, the database and the logs to keep consistent. The only way to get it right is to write unit tests. If you're like me, you write them *after* you've realized there's a problem.
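A sketch of the unit-test-sized checks I mean, using the standard library's `zoneinfo` (Python 3.9+); the Stockholm timestamp is just an illustrative example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+, needs the system tz database

def to_utc(local: datetime) -> datetime:
    """Normalize an aware datetime to UTC before storing or logging."""
    return local.astimezone(timezone.utc)

# A wall-clock time in Stockholm during daylight saving time (UTC+2)...
stockholm = datetime(2024, 7, 1, 14, 30, tzinfo=ZoneInfo("Europe/Stockholm"))
utc = to_utc(stockholm)

# ...is two hours earlier in UTC, and ISO 8601 keeps the offset explicit.
print(stockholm.isoformat())  # 2024-07-01T14:30:00+02:00
print(utc.isoformat())        # 2024-07-01T12:30:00+00:00

# The kind of assertions worth writing *before* the bug report arrives:
assert stockholm == utc                                     # same instant
assert utc.hour == 12                                       # DST offset applied
assert int(stockholm.timestamp()) == int(utc.timestamp())   # same epoch second
```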

The third thing is statistics and probability theory. Sure, it's maths rather than programming, but it pops up in every interesting problem. Today I used *outliers* and *confidence intervals*, but I'm not sure they were the right tools for my mathematical model. This problem is a harder nut to crack, because I need more knowledge, not more time. I studied it poorly at university, and I may have passed the course (I'm not sure), but I struggled with it then just as I struggle with it now. What does it *mean* to calculate the confidence interval for *a single draw*, or for *the mean of N draws*? Which distribution should you use when? Can the *z-value* be obtained by calculus rather than from tables? Must the *z-value* be used to get to the *p-value*? To me it's hardly ever obvious when a probability is *conditional* or when events are *independent*.
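To partly answer my own questions, here is a sketch using only the standard library's `NormalDist`. It assumes draws from a normal distribution with a *known* sigma, which is a simplification: with an estimated sigma and small N, Student's t would be the right distribution. The numbers are made up.

```python
import math
from statistics import NormalDist

mu, sigma, n = 100.0, 15.0, 25  # hypothetical population parameters
confidence = 0.95

# The z-value *can* come from calculus instead of tables: it is the
# inverse CDF (quantile function) of the standard normal distribution.
z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
print(round(z, 3))  # 1.96, the number the tables memorialize

# Interval for a *single draw*: where one new observation will land.
single = (mu - z * sigma, mu + z * sigma)

# Interval for the *mean of N draws*: the spread shrinks by sqrt(N).
mean_n = (mu - z * sigma / math.sqrt(n), mu + z * sigma / math.sqrt(n))

print(single)  # roughly (70.6, 129.4)
print(mean_n)  # roughly (94.1, 105.9)
```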

Encoding and calendars are fixable with time. Statistics I need to study, and frankly I should learn it well for my book too. Perhaps I'd be more motivated to go back to university now, since it's something I both need and want. Last time I was only there for the drinks; maybe another month would do me good. (If I eventually go back for yet another month of free Swedish education, I sure hope they've replaced the stuffy statistics professor. He was a bit like IBM's Deep Blue: very precise, but more like inanimate matter than a living creature.)