Teb’s Tidbits: Neural Networks, Moore’s Law, A Word of Caution… and Dinosaurs!

What I’m Thinking About Lately

Tyler Elliot Bettilyon
4 min read · Jun 1, 2018

This week an article from the MIT Technology Review caught my eye, about how human brains process information. The researchers suggest that human brains probably use discrete processing mechanisms rather than continuous ones. It's an interesting read; I'm not sure how on board I am with their methodology, but the assertion that brains store and process discrete information strikes me as both plausible and interesting.

That article also reminded me that neural networks (a family of machine learning models) are inspired by, and loosely modeled after, biological neurons. Neural networks were a big theme in my information diet this week.
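The analogy is looser than the name suggests, but it's easy to see in code: an artificial "neuron" just computes a weighted sum of its inputs and squashes the result with an activation function. A toy sketch in Python (the function and variable names here are mine, not from any of the projects mentioned):

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of the inputs,
    squashed by a sigmoid activation into the range (0, 1)."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

# Strong positive evidence pushes the output toward 1...
print(neuron([1.0, 1.0], [2.0, 2.0], -1.0))  # ~0.95
# ...and strong negative evidence pushes it toward 0.
print(neuron([1.0, 1.0], [-2.0, -2.0], 1.0))  # ~0.05
```

A real network is just many of these wired together in layers, with the weights learned from data rather than chosen by hand.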

John McDonald’s GDC talk, Using Deep Learning to Combat Cheating in CS:GO, found its way to me. In it he makes this joke about neural networks: “If you learned those 30 years ago… it turns out they finally work.”

And the secret to getting them to work?

“A million times as much data, and a billion times as much compute. And then they work great!”

When I first heard someone say neural nets were (at least) 30 years old, I was pretty surprised. Do you know when the neural network was invented?

… wait for it …

1943. By two scientists named Warren McCulloch and Walter Pitts. Mind. Blown.
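Their original artificial neuron was strikingly simple, and fittingly discrete: it outputs a binary 1 or 0 depending on whether enough inputs are active. A minimal sketch (this simplification ignores the inhibitory inputs in their original model):

```python
def mcculloch_pitts(inputs, threshold):
    """The original artificial neuron: fire (output 1) if the number
    of active binary inputs meets the threshold, else stay silent (0)."""
    return 1 if sum(inputs) >= threshold else 0

# With two inputs and a threshold of 2, the unit computes logical AND:
print(mcculloch_pitts([1, 1], 2))  # 1
print(mcculloch_pitts([1, 0], 2))  # 0
# Drop the threshold to 1 and the same unit computes logical OR.
print(mcculloch_pitts([1, 0], 1))  # 1
```

McCulloch and Pitts showed that networks of these units could compute any logical function, which is part of why the idea stuck around through the lean decades.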

It turns out that research into neural networks was all but abandoned during the “AI winter” of the 1970s and ’80s. The comparatively wimpy machines of that era couldn’t handle the massive amounts of data required to make neural networks actually work. Nowadays, supercomputer access just requires a credit card and an AWS account.

That reminded me of an essay by MIT professor Rodney Brooks, The End of Moore’s Law. It’s a fun and interesting look at the history of processing power, which is truly a story of exponential growth, and at what it means for “Moore’s Law” to come to an end. Lots of people (Dr. Brooks among them) are excited about what happens once the reliable shrinking of transistors runs into quantum tunneling limits.

Who knows where we’ll turn next to increase our computation speeds. A lot of people think customizable computation circuits, such as FPGAs, will be a big part of the future. The continued development of GPUs and the new world of TPUs are other areas to keep your eye on.

On a very related note, OpenAI wrote about AI compute trends on May 16th. They describe just how hungry for computational power the most advanced machine learning methods are. Moore’s Law described the exponential growth in transistor counts (and, with it, processing power); state-of-the-art AI systems continue to push those limits with every new advance. If speed increases stagnate after the “end of Moore’s Law,” the current excitement in the AI community could follow suit. I’m not personally betting on another “AI winter,” but I do think we’re already seeing a little less AI hype.
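To get a feel for how different the two trends are: Moore’s Law roughly doubled transistor counts every two years, while OpenAI’s post found that the compute used in the largest AI training runs has lately been doubling roughly every 3.5 months. A back-of-the-envelope comparison (the doubling times come from those sources; the code is just arithmetic):

```python
def growth_factor(years, doubling_time_years):
    """How many times a quantity multiplies over `years`,
    given its doubling time in years."""
    return 2 ** (years / doubling_time_years)

# Moore's Law pace: doubling every 2 years -> ~8x over 6 years.
moore = growth_factor(6, 2.0)
# AI training compute pace: doubling every 3.5 months -> over a
# million-fold increase across the same 6 years.
ai_compute = growth_factor(6, 3.5 / 12)

print(f"Moore's Law over 6 years: ~{moore:.0f}x")
print(f"AI training compute over 6 years: ~{ai_compute:,.0f}x")
```

That gap is exactly why the hardware question matters: the appetite of the biggest training runs is growing far faster than general-purpose chips are.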

I’ve also seen more authors using a societal lens to talk about AI, and software more generally.

The first pedestrian fatality at the hands of a self-driving car occurred earlier this year in Arizona. There was, obviously, the Cambridge Analytica scandal. And in a kind of response to these problems, the GDPR has gone into effect and privacy policy updates are flooding my inbox. That’s good news to me; I think it’s high time the software industry grappled with what The Atlantic once called The Coming Software Apocalypse, a piece that sheds light on just how desperately the software industry needs a cultural shift.

With our lives becoming ever more intertwined with computers, the technology industry needs to stop glorifying reckless behavior. Mark Zuckerberg’s famous motto, “move fast and break things,” has epitomized Silicon Valley’s attitude for the last several years. As scandals mount for the tech world, a timely (if decidedly less catchy) word of caution caught my eye on Medium: Move Slowly, and Don’t Break Things. It’s kind of unsettling that this is a novel idea to some people.

Finally, I promised you dinosaurs. Specifically Google’s 404 Dinosaur game.

I noticed that two people did the same cool thing on the Internet. I love it when different people attempt similar projects: I learn a lot from seeing multiple approaches, and it’s fun to compare and contrast them. So check out two ways to build a machine learning system that plays Google’s 404 dinosaur jumping game. Two solutions, two presentation formats, and both come with open source code; truly a gift from the Internet.

And in keeping with the theme, both involve machine learning and neural networks.

Enjoy, and never stop learning!

-Teb

-==-

For more like this, follow me and check out Teb’s Lab.

