It also says I’ve done 27km in my Xero Genesis sandals, or as I have begun to call them, Xero Tolerance.
You make one mistake, and something will break. You do get to keep all the bloody pieces.
In any case, when I started on this barefoot-style / natural running adventure, I had subconsciously set myself the limit of 200km before evaluating the success of the experiment.
At 200km, the experiment was still unsuccessful (different parts of feet and ankles were taking turns complaining) so I moved the threshold to 300km, with the plan to move it to 400km if required.
I call this The Stubborn Scientific Method(tm): You keep running the experiment (harr harr) until it says what you want it to say.
To be fair, in this specific case an injury would have stopped (and still could stop) the experiment. Most fortunately, the muscles, bones and tendons in my feet, ankles and calves, although complaining quite audibly, have held up.
This past Sunday I did a long(ish) run where it felt for the first time like my feet and ankles had finally toughened up enough (and perhaps my form had also improved slightly) to just keep on propelling me forward quietly and efficiently.
Together with the brilliant sunny winter morning conditions, this conspired to reconfigure my face machine into a rather long-lasting grin.
I am cautiously optimistic that I might be able to make this specific adventure a more permanent one, and that makes me really happy.
The Emacs Section
NERD-ALERT. SKIP TO THE NEXT SECTION IF YOU ARE NOT INTO TEXT EDITORS!
A friend from work sent me a ZIP file with research data.
I was super surprised that I could easily decompress the ZIP file using Emacs Dired (Dired is of course the file-manager built into Emacs, doh), but that there was no easy way to mark and extract specific files from the archive.
The code I eventually found worked, but it didn’t default to the opposite Dired file-list pane as all commander-style tools should do, and by default it re-created relative paths, which is the opposite of the default in most two-pane commanders I know.
As is the wont of Emacs users, I reshaped the code ever so slightly to work like I thought it should.
Shaping Emacs Lisp code has a pleasant fluid feeling to it. Code is data, code is configuration, data flows through code.
I’m telling you this story because it was a nice little reminder of one of the reasons I like this software so much.
Differentiable Image Parameterizations is a beautiful machine learning article on Distill that surveys and showcases different techniques for generating gorgeous images with deep learning. These networks sort of learn to see in order to solve specific tasks, but you can tickle them in different ways to get them to show you the insides of their visual circuitry, and it’s quite beautiful.
The Prophylactic Extraction of Third Molars: A Public Health Hazard is an article which was published all the way back in 2007. It makes the claim that at least two thirds of wisdom tooth extractions are unnecessary. One could say that their only function is to… extract your money. BA DUM TSSSSS! To that I would like to add: WHY DENTISTRY WHY? HAVE YOU NOT HURT US ENOUGH?!
A colleague at work emailed this TechCrunch post about a 3D-printed neural network that diffracts light passing through it in order to do its trained inference on incoming images. Although it’s a retro-futuro-mind-bending idea to do it with a whole neural network, and it smacks of hell-yeah-this-is-what-scifi-promised-me-that-AI-would-look-like, I could not help but recall a certain Very Flat Cat telling us about this sort of passive light-based computation almost 20 years ago.
The Poetry Section
GOU#1 had to select an English poem to recite for class.
I had forgotten how much subtlety and recognisable human complexity this poem, Robert Frost’s The Road Not Taken (quoted below), packs into such a petite frame. If you have the time, read the analysis linked above after spending some time with the poem itself.
Two roads diverged in a yellow wood,
And sorry I could not travel both
And be one traveler, long I stood
And looked down one as far as I could
To where it bent in the undergrowth;

Then took the other, as just as fair,
And having perhaps the better claim,
Because it was grassy and wanted wear;
Though as for that the passing there
Had worn them really about the same,

And both that morning equally lay
In leaves no step had trodden black.
Oh, I kept the first for another day!
Yet knowing how way leads on to way,
I doubted if I should ever come back.

I shall be telling this with a sigh
Somewhere ages and ages hence:
Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.
Friends, no matter which paths you take this week, I hope that we may meet again.
This post covers the week from Monday July 9 to Sunday July 15.
The business part of my week was unfairly dominated by far too much after-work obsessing over programming languages, with which I seem to have an unhealthy (or perhaps not) fascination.
I will externalise some of these thoughts further down in this post.
I’m starting with a weekend / running update, which should be reasonably safe for non-nerds to read. However, after that, the nerd dial will go up to 11 with stuff about tools and programming languages right up to the end of the post.
I wanted to use the adjective “face-melting”, but I’m not sure if any intensity of nerdery could ever reach that level.
We can dream.
Weekend running update
Most fortunately the weekend had other plans and supplied us with at least 2.5 parties, the first of which even culminated in a ridiculously fun trail run in the mountains on the winter morning after.
The winter morning sun was just perfect, the company was great, and I had forgotten all forms of performance tracking devices at home.
Readers with bionic eyes might notice the Lunas on my feet.
I have now run just over 260km in them but, in a surprise twist for regular readers of this blog, my biological equipment has still not completely adjusted to the new style of locomotion.
Said equipment currently has to work extra hard to stabilise my feet while running, because, you know, no shoes.
Because doing this thing was not hard enough already, and because the Lunas are perhaps still a bit too cushiony, and because my friend the Very Flat Cat forgot that I’m very suggestible after 11:00 in the morning when my prefrontal cortex takes the rest of the day off, I am now also the very shy owner of a pair of Xero Genesis running sandals:
The soles are only 5mm thick, and quite hard, being rated for a few thousand miles and all. The upshot of this is that one’s feet have to work even harder than in the Lunas.
My first run in these was amazing: I could feel my feet reacting to every little pebble, and my running style having to adapt even more to the terrain.
However, there was a price to pay for all of that additional terrain feel (and the fact that I took a much longer maiden run than I should have): The next day, the tendons in my feet felt even more (ab)used than usual.
WITH GREAT POWER COMES GREAT RESPONSIBILITY, it seems.
Due to these shoes being so powerful, I have had to resign myself to introducing Xero running far more gradually than I had initially planned.
Vacation-based-thinking-driven tool sharpening aka The WHV 2018 Data Science Toolbox(tm).
During the previously blogged-about Mpumalanga vacation, the lack of alarms, devices, and other work accoutrements resulted in ample time for staring-into-space-grade thinking sessions.
During one of these thinking sessions, I realised that I had somehow neglected my data science toolbox for a while.
At some point a few years back, I was so into IPython notebooks (what has now become Jupyter) that I used them as my main work lab notes modality.
However, in the meantime I had fallen slightly out of love with the computational notebook style of data programming, because I had begun to develop doubts about their role in the analysis pipeline.
Interlude 1: Jupyter notebooks are nice for initial data exploration, and they’re especially useful for remote computation with embedded graphics. However, that initial momentum of discovery risks devolving into an unwieldy monolith of code snippets, data transformations and experiments. There’s a fine line to be walked between flexible experimentation on the one hand, and version-controlled, time-stamped, permutational and scientific rigour on the other.
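One discipline that helps with that fine line, sketched here with a hypothetical module and function name: anything that survives a few rounds of experimentation graduates out of the notebook into a small, version-controlled library, and the notebook keeps only the thin exploratory surface.

```python
# mylab/features.py -- a hypothetical module in a small,
# version-controlled library that the notebooks call into.
import pandas as pd


def add_session_duration(df: pd.DataFrame) -> pd.DataFrame:
    """Derive session duration in seconds from start/end timestamps."""
    out = df.copy()
    out["duration_s"] = (out["end"] - out["start"]).dt.total_seconds()
    return out

# In the notebook itself, only the exploratory surface remains:
#
#   import mylab.features
#   df = mylab.features.add_session_duration(raw_df)
```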
Interlude 2: I have to apologise for using the term “data science” in a non-comedic context. In spite of the inherent humour, it has turned into a usable blanket term for computational data understanding.
Due to my growing doubts in the Order of Jupyter, and due to being occupied with less traditionally data-sciencey work projects, I had unfortunately let my data science toolbox gather perhaps a bit too much dust.
Slightly more worrying than falling out of love with Jupyter Notebooks (I still like them, I’m just not that madly in love anymore) was the more specific issue that I’d let even the datavis parts get a bit dusty.
Although I should probably write a more complete post about this, here is the list of ingredients of the official 2018 WHV Data Science Toolbox(tm):
Programming language and library ecosystem: Python.
This language, in spite of its shortcomings, dominates the data science / machine learning world thanks to its STELLAR ecosystem.
numpy, pandas, scipy, scikit-*, tensorflow, pytorch, keras, cython… this snowball has turned into a pretty sizeable planet.
For this reason, it would be hard to justify any other choice for data science.
However, since I’ve been seeing more of Lisp and the rest of the ever-expanding programming language landscape, I can see (Python’s shortcomings as a programming language) clearly now.
In terms of interactive programming, Python beats the majority of practical programming languages, with Common Lisp being one notable exception.
However, it’s not functional enough, which engenders unnecessarily imperative, side-effecting code. More specifically, it’s not expression-oriented.
More about this slightly further down. Maybe.
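To make that gripe concrete, here is a minimal sketch: in Python, if/else is a statement that yields no value, so results have to escape via assignment; the conditional expression is the closest Python gets to expression-orientedness, and it does not scale.

```python
# Python's if/else is a statement: it does not evaluate to a value,
# so we are forced to smuggle the result out via assignment.
def classify_imperative(n):
    if n < 0:
        label = "negative"
    elif n == 0:
        label = "zero"
    else:
        label = "positive"
    return label


# The conditional *expression* is the closest Python gets, and it
# quickly becomes unreadable beyond the simplest cases.
def classify_expression(n):
    return "negative" if n < 0 else ("zero" if n == 0 else "positive")
```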
Datavis: Anything, as long as it’s Vega or Vega-Lite.
I spent a few years of my life wrangling d3.js, down to INNARD-LEVEL.
Mike Bostock’s idea of data-element-joins is genius, and internalising it was intellectually satisfying.
(If it’s any consolation, the new kid can be considered the grandchild of d3.js.)
Vega and Vega-Lite are so-called visualization grammars, or visualization DSLs (domain-specific languages).
The upshot is that one codes up a chart, or a whole set of linked charts and their interactive behaviour, using a language that was designed for this purpose.
This chart code can be easily shared, or converted into interactive visual representations that can be embedded in applications, online, or in print-quality documents.
With Altair, you can even send your pandas dataframes to vega and vega-lite charts all from the comfort of your slightly defective Python armchair.
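A minimal sketch of what that looks like in practice (the dataframe and its column names are invented for illustration):

```python
import altair as alt
import pandas as pd

# A hypothetical little dataset; any pandas dataframe will do.
df = pd.DataFrame({
    "distance_km": [5.2, 8.1, 12.4, 21.1],
    "avg_hr": [142, 151, 155, 160],
})

# Declaratively map dataframe columns to visual channels; Altair
# compiles this to a Vega-Lite specification behind the scenes.
chart = alt.Chart(df).mark_point().encode(
    x="distance_km",
    y="avg_hr",
)

chart  # renders inline in a notebook
```

The chart object is really just a Vega-Lite spec underneath, which is exactly what makes the sharing and embedding story work.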
Development Environment: PyCharm.
You knew it was not going to be Jupyter Notebooks, but you probably expected it to be Emacs.
Well it’s not. Surprise!
The remote interpreter support in PyCharm enables me to connect to a Python virtual environment anywhere on the planet, which I often do.
The JetBrains wizards have optimised the remote communication of code intelligence, so completion, documentation and general code understanding is almost indistinguishable from that on a completely local project.
Two notable drawbacks are visualization and long-running jobs.
For long-running jobs I tend to use Jupyter Notebooks or, when at all possible, mosh, which is amazing. However, because the primary modality is not the notebook, my code is versioned and organised into separate libraries which I can call into from a notebook or a mosh session.
For visualization, it’s some combination of connecting to the Altair chart server via an SSH pipe, dumping charts into the unison-synced project directory, and/or a Jupyter Notebook.
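The chart-dumping variant is the simplest of the three; a sketch, with made-up data and file names:

```python
import altair as alt
import pandas as pd

df = pd.DataFrame({"x": list(range(10)), "y": [i * i for i in range(10)]})
chart = alt.Chart(df).mark_line().encode(x="x", y="y")

# Write a self-contained HTML file into the unison-synced project
# directory; moments later it appears on the local machine for viewing.
chart.save("learning_curve.html")

# Alternatively, dump the bare Vega-Lite JSON spec for embedding elsewhere.
chart.save("learning_curve.json")
```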
Data storage: PostgreSQL.
Of course you use Postgres on an SSD for your data, and of course you know enough SQL to make short work of most of the heavy-weight transformations often required at the start of your data crunching pipeline.
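A sketch of what I mean, with a hypothetical database and table: push the grouping and aggregation into Postgres, so that pandas only ever sees the small, pre-digested result.

```python
import pandas as pd
import sqlalchemy

# Hypothetical connection string and schema, for illustration only.
engine = sqlalchemy.create_engine("postgresql:///experiments")

# Postgres does the heavy lifting; the dataframe that comes back
# is already aggregated down to one row per subject.
df = pd.read_sql(
    """
    SELECT subject_id,
           count(*)         AS n_trials,
           avg(reaction_ms) AS mean_reaction_ms
    FROM trials
    GROUP BY subject_id
    ORDER BY subject_id
    """,
    engine,
)
```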
Writing: Emacs Org mode.
For all of my lab notes, reports, books, papers and blog posts, I use Emacs Org mode.
LaTeX math with live preview, live code snippets, SVG graphics, BibTeX references, export to anything. This is one of the best ways to document your science.
Programming language addiction update.
I spend far too much time obsessing over programming languages, old and new.
For the past two weeks, I wasted even more precious time than usual reading up about programming languages.
Because I would really like to spend more of my time on other, perhaps more valuable activities, I’ve been trying to better define what it is I’m actually looking for.
Of course there is no single best programming language, but a whole set of good languages that map in intricate ways to different problem domains.
In spite of this, I have been pining for a language with, in order of importance:
A Functional Programming DNA, by which I mean a) expression-orientedness, b) a preference for pure functions, and, at a higher level, c) the modelling of reality as more or less explicit dataflows (see the sketch after this list).
Interactive programming, with Common Lisp being the textbook example of this.
Great tooling and IDEs, meaning first-class support by something from JetBrains, Microsoft or Emacs.
Great concurrency and parallelism stories.
A great library ecosystem.
Modest memory use.
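To pin down criterion 1, here is a tiny Python sketch (all function names invented): reality modelled as pure functions composed into one explicit dataflow, with side effects pushed to the very edge.

```python
# Three small pure functions: each output depends only on its input.
def clean(samples):
    return [max(0.0, s) for s in samples]


def rolling_mean(samples, window=3):
    return [sum(samples[max(0, i - window + 1):i + 1]) /
            len(samples[max(0, i - window + 1):i + 1])
            for i in range(len(samples))]


def total(samples):
    return sum(samples)


# The program itself is then an explicit dataflow; the only side
# effect (printing) lives at the very edge.
raw = [1.0, -2.0, 3.0, 4.0]
print(total(rolling_mean(clean(raw))))
```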
Having just explicitly written this down for the first time (!! – it was consuming so much glucose just being kept amorphously swirling around in my brain) I can now mentally map some of my most recent language dalliances to these points.
Go
This language is far too simple for my taste, but it is probably really great for teams.
I did recently take a more serious look when I set up a Telegram bot using tbot, and I was amazed at how simple it was to build web services like this using goroutines and channels.
Go satisfies points 3 to 6 from the list above. It makes sense, then, that I decided to file this experiment away under “check when you need to put a webservice together REALLY QUICKLY”.
Rust
When I read that Rust is, surprisingly, an expression-oriented language, I flew through the O’Reilly Programming Rust book I had bought previously as part of a bundle.
Evaluating Rust by the list above: it earns a fractional 1 for being expression-oriented; a 3 thanks to the JetBrains plugin, amongst others; a 4(ish), because memory safety is great but, compared to Clojure, the concurrency and parallelism stories still have much room to grow; a solid 5 thanks to cargo; and a very strong 6.
I filed this one away under “re-evaluate whenever you reach for your trusty C++”. (Also, actix-web looks amazing for super-high-performance microservices.)