I've tried numpy, and sure, it's great for certain vector algebra and for time-to-market, but for getting high performance out of irregular traversal of millions of datapoints... not so much. I know plenty has been written in Python, Instagram's back end being the most notable, and I agree Python is an excellent scripting tool, but not when doing something real. (By real I don't mean delivering tons of crap content on social media; I mean search, map-reduce, crunch, calculate, compute.)
Go, however, has made an impressive journey to version 1.8. Performance is on par with Java (which has also made some small strides over the last decade), but without the baggage of a shitty past. It's still only about half the speed of C++ for optimized, parallel number crunching, but it's already where I had hoped D would be years ago. And the advantages of fast, simple, portable, cross-platform builds without any configuration whatsoever, a fairly lean syntax, a good standard library, and easy integration with C/C++ surely outweigh the cons. I think it has already killed whatever was left of Rust - the syntax and immaturity speak for themselves - when Rust is only 50% faster than Go 1.8.
So right now I'm banging my head, wishing I could write my one-liner loop in Go instead of Python:
while ti < l and perf[ti][0] < t: ti += 1
Yep...
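For comparison, this is roughly what that scan looks like in Go. A minimal sketch only: the [timestamp, value] layout of perf and the sample values are my own assumptions, read off the snippet above.

package main

import "fmt"

func main() {
	// Hypothetical stand-in for the perf array from the Python snippet:
	// each entry is a [timestamp, value] pair, sorted by timestamp.
	perf := [][2]float64{{0.5, 10}, {1.2, 11}, {2.7, 12}, {3.9, 13}}
	t := 2.0

	// Same linear scan: advance ti past every entry whose timestamp is below t.
	ti := 0
	for ti < len(perf) && perf[ti][0] < t {
		ti++
	}
	fmt.Println(ti) // prints 2
}

A few more lines and some package/func ceremony instead of a one-liner, but it compiles straight to a native loop.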