If you take a look around right now, you’ll find yourself surrounded by some very complex objects. Smartphones, computers, printers, cars, televisions, toasters — the list goes on. But no matter how complex these devices are, you can use them to do what you need, even though you’d have a hard time building any one of them on your own.
This small miracle is thanks to a principle we call abstraction. Abstraction is the design philosophy that replaces a tangled mess of complex details with a tidy interface you can use to GSD (Get Stuff Done). Abstraction is also at work deep in the heart of every software program, straining to hold back the floodwaters of confusion. …
With very few exceptions, all the languages you use to write code are high-level languages. Computers can’t run their code directly. Instead, you need a translation step that converts your high-level code into lower-level machine code that can be funneled straight to the computer’s CPU. …
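One way to peek at this translation step is with Python's built-in `dis` module, which shows the lower-level bytecode that CPython compiles a function into. (Strictly speaking, this bytecode runs on Python's virtual machine rather than going straight to the CPU, but the idea is the same: your readable high-level code becomes a list of much simpler instructions.) The `add` function here is just an illustrative example:

```python
import dis

def add(a, b):
    return a + b

# Print the bytecode instructions CPython generated for add().
# You'll see simple steps like "load a", "load b", "add them",
# "return the result" -- the low-level version of one line of code.
dis.dis(add)
```

Run this yourself and you'll see that even a one-line function expands into several primitive operations, which is exactly the kind of detail a high-level language hides from you.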
Long ago, almost all programming languages made a crucial decision. They decided that memory management was too important to leave in the hands of programmers.
There are a few exceptions (C++ programmers, please stand up). But in most modern programming environments, you don’t need to think about grabbing a block of memory, allocating chunks of space, and cleaning up when you’re done. This deprives programmers of the joy of debugging memory leaks, which used to be one of our most important jobs. But whatever. We’ve learned to adapt.
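Here's a small sketch of what automatic memory management looks like in practice, using Python. The `Buffer` class is a made-up stand-in for the kind of memory block an old-school programmer would have allocated and freed by hand; `weakref.finalize` just lets us watch the cleanup happen:

```python
import weakref

class Buffer:
    """A stand-in for a chunk of memory we'd once have managed by hand."""
    def __init__(self, size):
        self.data = bytearray(size)

buf = Buffer(1024)

# Register a callback that fires when the object is reclaimed.
watcher = weakref.finalize(buf, print, "Buffer reclaimed -- no free() required")

del buf  # Drop the last reference; the runtime frees the memory for us.
```

No `malloc`, no `free`, and no memory leak if you forget a cleanup step. The runtime notices when the object is unreachable and reclaims it on your behalf.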
In the dark ages, when strange beasts roamed the land, computer programming was very different from what it is now. Serious programmers were expected to be hands-on and manage the way their programs used memory. How did they do it? And why is life so different today? In this article you’ll dig deeper into the way code uses memory.
Once, many years ago, when the internet was first created, web pages were like ordinary documents. You wrote one, you saved it, and then you copied it to a web server. And there it stayed, forever the same, unless some day you decided to make changes and upload a new version.
In August of 2015, my hands stopped working. I could still control them, but every movement accumulated more pain, so every motion came with a cost — getting dressed in the morning, sending a text, lifting a glass.
I was interning at Google that summer, about to begin a PhD in Scotland, but coding all day would have left me in agony. In relating this story, I often mention that for months before I learned to work without my hands, I had nothing to do but go to a bar and order a shot of vodka with a straw in it. …
Peter Farrell spent more than a decade teaching math and computer science. Somewhere along the way, he began using Python to create programming challenges to pair with his lessons. But what started as a way to reinforce math concepts gradually developed into something else — a gateway to a more practical approach to math education.
Peter saw how coding projects allowed students to shift from passively learning concepts to actively working, reasoning, and playing with them. In other words, code helped them to go from learning about math to actually doing math. As he says, “Why should the science, art, and home-ec students have all the fun? …
Young Coder is a publication about coding, science, and tech — usually with a twist. We love quirky science, creative coding, and all the places inspiration meets semicolons.
Kicks Condor is an elementary school computer teacher, a former computer expert (of the sort “discontinued in the 1990s”), and an IndieWeb blogger. When asked to run an after-school coding club, he realized there were two ways he could go.
The first possibility was to use the popular code-themed games and activities from code.org. The second option was to dive into something more creative and open-ended: in this case, the interactive story-building tool Twine. Although Twine doesn’t require code, it lets children use variables and conditional logic if they choose to do so — in other words, to enhance their stories and solve actual problems. …
Gerald Friedland is Principal Scientist at Lawrence Livermore National Lab and Adjunct Professor at the University of California, Berkeley. Like many, he taught himself to program as a child in the 1980s with the ancient — but refreshingly straightforward — Commodore 16.
Here’s where things take an interesting turn. When his daughter, Mona, turned 7, he chose to go back to the source. Instead of today’s code playgrounds and glitzy graphical environments, he decided to introduce her to BASIC and the 8-bit Commodore programming environment — with the help of a free emulator. …