Weekly Head Voices #158: Charlie and the Chocolate Factory.

(Note that there’s now a Telegram group that you can join to be kept up to date with these posts. I’m never going to make the A-List, but at least I haz the gimmicks!)

This edition of the weekly (haha) head voices attempts to reflect on the two weeks from Monday November 5 to Sunday November 18, 2018.

The following action scene happened exactly halfway through:

Pre-requisite running photo, this one taken in Paarl. It was already quite hot. Getting really hot really early in the morning is Paarl’s thing.

Running aka Irony update

Seeing that you’ve made me talk about running again, have a look at this photo of one Luna Mono 2.0 after about 700 km of (mostly road) running in about seven months, and one brand new Luna Mono 2.0:

At around the same time as the new shoes arrived, shortly after South African customs charged me a painful amount before letting the new babies through, both my ankles, from around the posterior tibial tendon area, let me know in no uncertain terms that they were now demanding a break.

After repeated explanations by my life partner (she counts being a rheumatologist amongst her many talents) and by a foot surgeon friend that my flat feet mean my posterior tibial tendons have to work even harder than they would have had I been, anatomically speaking, more normal, I had to start facing the music:

I was going to have to wear normal person running shoes again.

(If I have to be honest I would have to say that the music was in fact more about having to take a running break. I had sneakily been pushing up my weekly distance, trying to run through ankle discomfort, and this was probably the true core of the problem.

All of that being said, I am choosing to interpret matters a bit differently. Running breaks are really hard yo.)

I’ve now done two runs in my pre-Mono Kinvara 8s, and it does indeed feel (of course it does) like my ankles might slowly be recovering. I am hopeful that the trend continues, and that I can eventually rotate in my Lunas again.

Nerd toys update: RTX 2070 in da house.

After weeks of deliberating, I broke down and bought an NVIDIA RTX 2070 for deep learning.

This in turn led to a flurry of experimentation and, to be quite honest, a slight case of deep learning bingeing.

At least I have the following new blog posts to show for it:

(I know that some of these occurred outside of the two week timespan covered by this post.)

On the memory saving of mixed-precision training.

In my tests with ResNet50, a serious convolutional neural network for image classification, the exact same network with the exact same training settings required 14159 MiB in fp32 mode but only 7641 MiB in mixed precision mode.

This means that, at least in terms of the models it can fit into its 8 GB of memory, this new RTX 2070 can in some cases go toe-to-toe with far more expensive cards.

Furthermore, I informally measured a training speed boost of about 20% from mixed precision with the smaller ResNet34.

It’s no wonder that the RTX 2070 gets the Tim Dettmers stamp of approval for the most cost-effective training.
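If you’re curious what flipping the mixed-precision switch looks like in practice: at the time this was typically done with NVIDIA’s apex library, but purely as an illustration, here is a minimal sketch using the automatic mixed precision API (torch.cuda.amp) that later PyTorch releases (1.6+) ship with. The tiny fake data loader is only there to keep the snippet self-contained.

# Minimal sketch of mixed-precision training with PyTorch's built-in
# automatic mixed precision (torch.cuda.amp, PyTorch 1.6+). At the time of
# this post one would have used NVIDIA's apex library instead; the idea is the same.
import torch
import torchvision

model = torchvision.models.resnet50().cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
criterion = torch.nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler()  # scales the loss to avoid fp16 underflow

# stand-in for a real DataLoader: one batch of random images and labels
loader = [(torch.randn(16, 3, 224, 224), torch.randint(0, 1000, (16,)))]

for images, labels in loader:
    images, labels = images.cuda(), labels.cuda()
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():    # ops run in fp16 where safe, fp32 elsewhere
        outputs = model(images)
        loss = criterion(outputs, labels)
    scaler.scale(loss).backward()      # backward pass on the scaled loss
    scaler.step(optimizer)             # unscales gradients, then optimizer step
    scaler.update()

The memory numbers above came from exactly this kind of before/after comparison: same network, same settings, only the mixed-precision wrapping added.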

Your message, to take home.

I came across this backyard philosophy jewel on reddit the other day and loved it. It’s about the 1971 movie Willy Wonka & the Chocolate Factory, a stellar adaptation of Roald Dahl’s book Charlie and the Chocolate Factory.

… in test screenings, Willy Wonka had a scene with a hiker seeking a guru, asking him the meaning of life. The guru requests a Wonka Bar. Finding no golden ticket, he says, “Life is a disappointment.” The director loved it, but few laughed. A psychologist told him that the message was too real.

Just remember the Buddhist Twist, my friends:

… and finally passing through the gate of wishlessness (apranihita) – realizing that nirvana is the state of not even wishing for nirvana.

Weekly Head Voices #151: We are pleased to meet you.

The Weekly Head Voices number 151 are trying to tell you something about the week from Monday July 30 to Sunday August 5.

Prepare yourself for a slightly stranger than usual post. I have: two short programming ideas, a bad review of an outdoor security passive infrared sensor, using Jupyter Notebook for (GPU-accelerated) numerical computation when you only have a browser, computing device input latency, and an utterly unexpected bit of backyard philosophy from the gut.

Two random micro side-project ideas

I would like to start with two hobby / maker ideas that popped up in my head this week. There’s a high probability I will not get around to them, but perhaps they’ll help you spawn a new set of hopefully more worthwhile ideas.

Chrome or Firefox plugin to convert Spotify playlists to Apple Music using the new MusicKit JS API

I seem to see many more Spotify playlists shared than Apple Music playlists. For example, at this moment I’m listening to the official Lowlands 2018 playlist.

This is not ideal, as I am an Apple Music subscriber, but not a Spotify subscriber.

It turns out there are paid apps to convert Spotify playlists to Apple Music playlists.

However, it also turns out that Apple has a new thing (still in beta) called MusicKit JS.

I briefly dissected the Spotify Playlist website.

It would be straightforward for a Chrome or Firefox plugin (a WebExtension, so the same code for both; I’ve done this before) to go through this playlist, search for each track using the MusicKit JS API, and then recreate the playlist in the user’s Apple Music account.

This solution would be much cleaner and simpler than the current app-based ones.
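Just to make the core loop concrete (the real thing would run as a WebExtension using MusicKit JS in the browser), here is a rough Python sketch of the matching-and-recreating flow against the Apple Music REST API. The endpoint paths, headers and payload shape are my assumptions based on Apple’s Music API documentation, so treat this as a sketch rather than working code.

# Rough sketch of the playlist-conversion loop against the Apple Music REST API.
# Assumptions (from my reading of Apple's docs, not verified here): the
# /catalog/{storefront}/search and /me/library/playlists endpoints, the
# Music-User-Token header, and the payload shape for playlist creation.
import requests

API = "https://api.music.apple.com/v1"
DEV_TOKEN = "your-developer-jwt"      # Apple Music developer token
USER_TOKEN = "your-music-user-token"  # token for the target user's library
HEADERS = {"Authorization": f"Bearer {DEV_TOKEN}", "Music-User-Token": USER_TOKEN}


def find_song_id(artist, title, storefront="za"):
    """Search the catalog for a track; return its Apple Music id, or None."""
    r = requests.get(
        f"{API}/catalog/{storefront}/search",
        headers=HEADERS,
        params={"term": f"{artist} {title}", "types": "songs", "limit": 1},
    )
    r.raise_for_status()
    songs = r.json().get("results", {}).get("songs", {}).get("data", [])
    return songs[0]["id"] if songs else None


def recreate_playlist(name, tracks):
    """Create a library playlist from a list of (artist, title) pairs."""
    ids = [sid for artist, title in tracks
           if (sid := find_song_id(artist, title)) is not None]
    body = {
        "attributes": {"name": name},
        "relationships": {
            "tracks": {"data": [{"id": i, "type": "songs"} for i in ids]}
        },
    }
    requests.post(f"{API}/me/library/playlists",
                  headers=HEADERS, json=body).raise_for_status()

The plugin version would do the same thing, just with the playlist scraped from the Spotify page and the API calls made through MusicKit JS.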

An Emacs package for displaying your RescueTime productivity metric right on the mode line

I scanned the RescueTime API documentation.

I was just about to start working on it, when I came up with the bright idea to name the package ironic.el, and so I stopped.
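For what it’s worth, the data-fetching half of ironic.el would have been tiny. Here is a hedged Python sketch (the real package would of course be Emacs Lisp); the daily_summary_feed endpoint and the productivity_pulse field name are my recollection of the RescueTime API docs, so double-check them before relying on this.

# Hedged sketch: fetch the most recent RescueTime productivity pulse.
# The daily_summary_feed endpoint and the productivity_pulse field are my
# recollection of the RescueTime API docs; treat them as assumptions.
import requests

API_KEY = "your-rescuetime-api-key"


def productivity_pulse():
    r = requests.get("https://www.rescuetime.com/anapi/daily_summary_feed",
                     params={"key": API_KEY})
    r.raise_for_status()
    days = r.json()  # one summary dict per recent day
    return days[0]["productivity_pulse"] if days else None


print(productivity_pulse())  # the number ironic.el would have put on the mode line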

On that topic: The struggle for practically sustainable focus is real, and it never seems to stop.

The Head Voices REVIEW(tm) the Optex HX-80 outdoor passive infrared security detector: AVOID AT ALL COSTS

From the Optex HX-80 outdoor passive infrared security detector’s web-page we have the following:

The most important element in reliable outdoor detector is accuracy to distinguish a human from a small animal. … In addition, the HX-80N’s dual PIR’s and 20 detection zones utilize the ‘AND’ detection pattern technology … This technology helps to prevent false alarms caused by a pet or small animal.

Well, I had two of these installed by trained professionals.

(There are of course interesting discussions to be had about the necessity of devices such as the HX-80, or its mythical actually working counterpart, down here.)

I can confirm that they excel at one fairly specific function: Triggering the alarm, and thus automatically calling my security company, at the most ungodly hours of the night, whenever a certain small grey cat, looking exceptionally unlike a human, decides to take a stroll outside of our house.

Oh yes, the cat is not even ours, but belongs to our neighbour.

The installation and subsequent repeated fine-tuning of our Optex HX-80s have had only one result: I now have to punch in an additional key sequence every evening to bypass the two ‘AND’-detection-pattern-technology-equipped HX-80 devices.

You will understand that the only reasonable Head Voices REVIEW(tm) of the Optex HX-80 is:

  • 100% NON-FUNCTIONING THROUGH INFERIOR DESIGN.
  • AVOID AT ALL COSTS.
  • DON’T TRUST THE MARKETING.
  • THE TRUTH IS OUT THERE.
  • JUST DON’T.


Some more odd but perhaps useful bits

Google Colaboratory for Numerical Computation when all you have is a browser.

I’m late to the party (again), but Google Colab is really great if you need a Jupyter Notebook with some GPU power behind it.

It comes with TensorFlow pre-installed (being Google and all), but getting GPU-accelerated PyTorch 0.4.1 (the latest version of the most amazing deep learning tool, at the time of writing) going was a cinch.

To repeat this experiment, create a new notebook with File | New Python 3 Notebook, then change Edit | Notebook Settings | Hardware accelerator to GPU.

You can then install the correct version of PyTorch by executing

!pip install http://download.pytorch.org/whl/cu80/torch-0.4.1-cp36-cp36m-linux_x86_64.whl

in a notebook cell.
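To confirm that the GPU-enabled build actually took, a quick sanity check in the next cell does the trick:

# Quick sanity check that the GPU-enabled PyTorch build is active.
import torch

print(torch.__version__)          # should report 0.4.1
print(torch.cuda.is_available())  # True if the GPU hardware accelerator is on
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))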

What a time to be alive!

P.S. Remember, under normal (non-Colab) circumstances we keep our notebooks as empty as possible. Keep as much of your code as possible in Python modules; the notebooks are only there to act as glue, for visualization, and sometimes for long-running jobs.

Dan Luu’s computer and mobile device input latency research

This most amazing work was recently brought to my attention by WHV reader Matthew Brecher in the comments under my 2017 Android vs iPhone performance post.

In it, Dan Luu measured the input latency of various devices, using the 240 fps camera on his iPhone SE, or the 1000 fps Sony RX100 V camera if the device was too fast.

For the computers in his study, input latency was defined as the time between keypress and character appearing on the display. For the mobile devices, it was defined as the time between finger movement and display scrolling starting.

If you have any interest in this sort of technology, and also in in-depth technology journalism, the full article is definitely worth your time.

I wanted to mention two interesting points:

  1. The 1983 Apple 2e, with a CPU running at 1MHz, had significantly lower input latency (30ms between button press and character display) than any modern multi-GHz system. The comparison is of course not completely fair, but it’s still nice to see.
  2. Amongst the mobile devices, Apple dominates the fast / low latency end of the spectrum. Their devices, in terms of input lag, are ALL faster than all of the Android devices tested, including for example the 2017 Google Pixel 2XL.
    • Yes, this is me eating my hat, and some more of that yummy humble pie.
    • Android 9, code-name Pie, has just been (will soon be… err) released and has an amazing list of features. I still hope they also manage to catch up with regard to some of the basics, like input latency.

Yet another reason to eat more fibre

There are an estimated 100 trillion (10 to the power of 14; 100 with 12 zeroes) bacterial cells housed in each of our bodies.

Each adult human consists of, on average, only 37 trillion human cells, meaning there are almost 3 alien cells for every 1 of your own cells.

I find this a beautiful realisation: All aspects of our lives depend on this multitude of foreign visitors.

They help us digest our food, and, as it has been turning out more recently, they play a crucial role in our mood, our behaviour and our thinking.

We (or at least the clever people) now talk about the microbiome-gut-brain axis, further underlining the role that our bacterial visitors play in our lives.

Taking a few more steps back, and thinking about the relationship between the 37 trillion human cells and the 100 trillion visiting cells, I ask the question:

Who am I really? Who exactly is thinking this?

I, or perhaps rather “we”, find this truly fascinating.

What I was initially planning to mention before going off on this tangent, was a recent paper accepted for publication in the Journal of Physiology, with the title Short-Chain Fatty Acids: Microbial Metabolites That Alleviate Stress-induced Brain-Gut Axis Alterations (click for PDF fulltext).

The Physiological Society press release is more digestibly (I had to) titled “Eat high fibre foods to reduce effects of stress on gut and behaviour”.

In short, fibre stimulates gut bacteria to produce short-chain fatty acids (SCFA), which, besides being the main source of nutrition for cells in this region of the body, also decrease levels of stress and anxiety, at the very least in mice.

The end

Thank you for sticking around friends!

I hope that you found something of value, even if not directly from this post.

I’ll see you next time! Until then, remember to eat your vegetables.


Weekly Head Voices #144: Eternal learner.

Welcome back friends!

(Right after the nerd news, there’s running and backyard philosophy. You can start wherever you like.)

Nerd News

The Weekly Nerd News Network (WNNN) wanted to bring the following points to your attention:

  • Emacs 26.1, the first major release since September 2016, when 25.1 came out, happened on May 28. Although Emacs reached perfection (and sentience, some say) a few decades ago, this new version does include improvements such as native line numbering for the VIM refugees and buttery smooth scrolling on X11 (read the very entertaining story behind this).
  • PyTorch (my favourite deep learning tool by far) and Caffe2 are merging. This is amazing because, while PyTorch is some of the most dynamic and flexible deep learning software you can play with, Caffe2 runs on your telephone. You’ll be able to fine-tune your deep network in PyTorch, and then click a button (or type some obscure incantation, probably) to get that network in a highly efficient compiled form on any embedded device, or scaled up to run on your cloud; see the sketch right after this list. Although a free lunch is supposedly not possible, this really does feel like one!
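To make the “click a button” part slightly more concrete: the bridge between the two worlds is the ONNX interchange format, and exporting a trained PyTorch model already looks roughly like this minimal sketch (the Caffe2 / mobile import side is omitted):

# Minimal sketch: export a PyTorch model to ONNX, the interchange format the
# PyTorch/Caffe2 merger is built around. The resulting model.onnx can then be
# loaded by the Caffe2 / mobile runtime (that side is omitted here).
import torch
import torchvision

model = torchvision.models.resnet18()  # any trained model will do; ResNet18 as example
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # one example input fixes the graph's shape
torch.onnx.export(model, dummy_input, "model.onnx")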

Reunion

In Weekly Non Nerd News (WNNN), an old friend came to visit all the way from Omaruru, an occasion which served as the happy excuse for a mini-reunion at my place.

It’s strange to think that some of the university stories we recounted are now more than 20 years old.

In that time, humans go from birth to fully formed adult human beings with opinions, and relationships, and stories of their own.

Thank you Omaruru Friend for bringing us all back together again.

Running mouse

The flu and/or cold virus that managed to enter through the cracks left by my immune system being under pressure from above-mentioned celebrations caused a week-long period of man flu, a period that I was only able to conclude today with a lovely winter morning run.

As one does, I continued searching until I found evidence confirming my belief that running with some remaining flu symptoms would not be irresponsible.

What I found was even better than that!

A 2005 study titled Moderate exercise protects mice from death due to influenza virus, published in the journal Brain, Behavior, and Immunity, found that in mice that had just been infected with a real influenza (i.e. not man flu) virus, moderate exercise had an additional protective effect relative to no exercise or strenuous exercise. The PDF full-text can be found on the sci-hub website, or via their Telegram bot (the bot is really convenient; you can find and read fulltexts on your phone!).

Thanks to the internet, and lab mice, I had confidence that I was probably not going to die due to my run.

Confirmation bias aside, or not, based on more reading it looks like moderate exercise is not the worst thing you can do during or after a cold or flu. The secret is to keep it relaxed, and to keep a very close eye on your heart and your temperature.

Mastery

I finally finished reading the book Mastery by George Leonard, a recommendation by LS that I am grateful for.

It can get preachy at times, but the core message is really good, and especially timeous in this era of hyper distraction.

Below is Leonard’s message, sent at least once through the old washing machine that is my brain.

Learning is a lifelong process.

More specifically, the path to mastery of any worthwhile skill usually consists of short bursts of novelty exhilaration (you often start with one of these) followed by long and seemingly boring plateaux of never-ending practice with no kick.

No kick means that many learners decide to quit, and switch to something exciting, only to repeat their cycle of not-mastery there.

If you are able to make peace with the plateaux, and keep on trudging along, you are on the path to mastery.

In a decidedly Buddhist twist, being on the path to mastery means that you are in fact an eternal learner, and you will never become a master.

The author of the book is an Aikido sensei. I especially loved the story he told of the beginners and the senseis.

When beginners practise, they ask the sensei for a new move to practise every few minutes. They try to get through as many moves as possible during their 2-hour training session.

When senseis practise, they practise the same basic move over and over for many hours, losing themselves in the universe of that single, apparently straightforward form.

The Buddhist Twist

From the Wikipedia page on Buddhism:

The Four Truths express the basic orientation of Buddhism: we crave and cling to impermanent states and things, which is dukkha, “incapable of satisfying” and painful. This keeps us caught in saṃsāra, the endless cycle of repeated rebirth, dukkha and dying again. But there is a way to liberation from this endless cycle to the state of nirvana, namely following the Noble Eightfold Path.

… and then later:

… and finally passing through the gate of wishlessness (apranihita) – realizing that nirvana is the state of not even wishing for nirvana.

I can work with this.

Readers, I wish you wishlessness!