
How to Learn Anything

Even though I almost failed my school exams at age 16, I turned my life around and went on to get a bachelor’s degree (BEng) in electronic engineering, a master of science (MS) in electrical engineering, and a master of arts (MA) in psychology. I also became a doctor of philosophy in clinical psychology, gaining a PhD after about ten years of study, research, and clinical practice.

I left my UK undergraduate university with a first-class honors degree, which indicates the highest level of academic achievement. During that time, I obtained a grade-point average (GPA) of 3.7 out of 4.0 and won three academic achievement awards. For my master of science degree, which was from Stanford University, I attained a GPA of 3.6 out of 4.3. That GPA is pretty good, but it could have easily been much higher, as I’ll explain later.

In spite of the fact that I have a ton of letters after my name—including a CEng, which conveys that I’m a UK Royal Chartered Engineer, gained through a rigorous peer-review process—I have come to understand that learning does not necessarily have to be associated with gaining official qualifications. As a lifelong learner, I now know the value of intentionally spending time every day learning something new, regardless of whether an academic organization will assess me and rubber-stamp my progress.

Recently, I was talking with a friend who was preparing for his doctoral qualifying examinations (“quals”) for a PhD in computer science at Brown University. I explained my learning system to him, which I describe at the end of this article, and he suggested that I write about it. This article covers not only that system, but also several other important things that I have discovered about learning.

Cultivate a Growth Mindset

Beliefs, which are always about the past, reduce our in-the-moment options related to how we can respond to reality, and therefore all beliefs are in fact limiting beliefs. We all have a set of beliefs that was installed early in our lives. These beliefs are held in place with strong emotions and are rarely questioned.

A key aspect of our belief system is in relationship to our own intelligences and capabilities. I use the plural form of intelligence because I’m referring not only to cognitive intelligence, but also to emotional, social, relational, spatial, practical, and a bunch of other intelligences. There are two main ways of perceiving our own abilities: (1) that they are pre-determined and immutable, or (2) that they are changeable and can be developed. The former attitude is called a fixed mindset, and the latter is called a growth mindset.

People who are lumbered with a fixed mindset tend to believe that their intelligences and capabilities cannot be improved, and that the main objective in life is to let other people know that they are both smart and capable. The focus with this mindset is on maintaining an external image of success and competence, which includes not taking risks or doing anything that might make them look “stupid.”

On the other hand, people who hold a growth mindset understand that their intelligences and capabilities are always growing, changing, and developing. These people know that they can learn and grow and become better at things. Furthermore, they know that this is accomplished by trying new things, taking risks, and focusing on growth and development.

Becoming aware of our beliefs about our intelligences and capabilities gives us a choice about how we see ourselves, which frees us to grow, change, and improve. Ironically, even our mindset can be changed from fixed to growth, assuming we already have a seed of growth mindset. We all have a seed of growth mindset somewhere inside.

Carol Dweck, the most well-known researcher on mindset, and the author of Mindset: The New Psychology of Success, explains it like this:

In a fixed mindset, students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or anyone can be Einstein, but they believe everyone can get smarter if they work at it.

I’ll add that believing that you cannot become as creative or skillful in physics as Einstein, or even more so, would be a limiting belief.

Teach What You Want to Learn

Many people think that they cannot teach something until they have mastered it, yet one of the most effective ways to learn something is to teach it. One of my friends, who is a licensed clinical psychologist (with a PhD), and who created a depth psychology program at a local graduate school, told me that he has learned more about psychology from teaching as a professor than he did as a student.

Between the ages of 16 and 18, I was enrolled in a vocational, general engineering course, intended to prepare me to become an engineering technician. During that time, I became passionate about math, especially calculus. A good friend of mine from my previous school was enrolled in a similar training program to become a technician on the London Underground. Math was also part of his training, but he was scoring D grades on his exams, and he was worried that he would be expelled from the training.

I spent an afternoon with him going through many example calculus problems. On the next exam that he took, he got an A grade. This gave me additional confidence, knowing that I understood the material well enough to help him to get that result. More importantly, however, in the process of teaching calculus to my friend, I had to figure out how to explain it, and to make those explanations as simple as possible. I had to understand it deeply, and to boil it down into its fundamental components.

Calculus is not just a set of procedures to remember by rote; it is a new (in the late 17th century) and powerful perspective on reality. This new perspective, once fully understood, can be re-applied throughout life, in many different domains. To teach a perspective, one must be able to inhabit it fully, yet also be able to step out of it and return to the position of the learner. By perceiving fundamental truths from multiple perspectives, we become true masters of them.

I am not standing here holding some nugget of shallow knowledge that I can explain to you, enabling you to then carry your own replica. My knowledge is embodied in a freedom-of-perspective, and an awareness of a different and fundamental aspect of reality. I can understand your perspective well enough to come join you in it, and then I can take you by the hand and bring you on a 360-degree fly-by of this new territory. I can teach you to fly around this new truth, to zoom into it, to taste it, and to know it deeply. This is true knowledge.

Create Something

I started programming when I was eight, using the BASIC programming language on an Acorn Electron, a computer that was only sold in the UK. I went on to program various computers, including an Atari ST in low-level assembly language when I was 15. In 1995, after I left The University of Reading (pronounced red-ing) in the UK, even though by then I knew Modula-2 and had a lot of experience with low-level programming in various forms of assembly language, I had not yet learned the C programming language.

At the time, C was the de facto systems programming language: high-level enough to create complex programs relatively easily, yet close enough to the machine to create very fast programs. Even now, 23 years later, C is the prototypical systems programming language. Systems programming is the process of creating the software that applications run on top of, such as the operating system of a computer. For example, C was used to create the Linux operating system. Most of the computers that run the applications that make up the World Wide Web (what we think of as The Internet) are running the Linux operating system. At its lowest level, at least from a software perspective, The Internet runs on C and Linux.

C is also an imperative language. A language is imperative when you use it to command the computer to do things (“if this is true, then do that”), and those commands usually tell the computer to change its own state. C is also a procedural language, which means that you use chunks of instructions (chunks of code) called procedures to build up a complex program. Procedures are named, re-usable sequences of commands. An example procedure, also called a function, might be named ReadFileAndStoreItInMemory. When you call this particular procedure, meaning that you tell the computer to execute it, you might also provide the name of the file to read from and the location in the computer’s memory at which to store the file’s contents.

At that time, I had just started working at a multinational semiconductor company called SGS-Thomson Microelectronics (now called ST). This was my first job after graduating. I was designing sub-systems in silicon chips that accompanied the 3D accelerator chips being developed by NVIDIA. ST both manufactured chips for NVIDIA—after helping to design them at a physical level—and was a major investor in NVIDIA.

As well as wanting to learn the C programming language, I wanted to learn about 3D graphics, so I decided to work on a personal side-project that involved both of these technologies. I bought two books: one about C and the other about 3D graphics algorithms, and I got started in creating a 3D graphics application.

My fiancée and I had chosen to live so close to the business park where I worked that it took me only five minutes to walk there. This saved me a lot of time that would otherwise have been spent on commuting, which enabled me to focus on my side-project for an hour or two per day. A side-effect of living so close to work was that we rented a flat (an apartment) in a very inexpensive housing estate, far from the trendy city center. Even though we saved money on rent, our friends who lived below us saw us as a curiosity because we owned books.

Within a couple of months, I was able to create a 3D modeling and rendering application that used my own proprietary 3D modeling language. When the application was running, I could create three dimensional points by entering their coordinates at a command prompt in the user interface of my program.

Points could be joined together to form lines, and lines could be combined into triangles. Triangles could be further combined to create the surfaces that make up objects, which could then be moved, resized, copied, or deleted. For example, I painstakingly created a 3D model of a coffee mug, using pencil and paper, and entered all the coordinates via the user interface that I had created. I could then fly the point-of-view around the mug and zoom closer and further away from it. I could also duplicate the mug with one command, and thereby have two of them.

I extended the application so that I could save the 3D models and composed scenes into files on disk. The data was saved in the proprietary 3D scripting language that I had created. This enabled me to quit the application, enhance it, and then re-start it and re-load the 3D worlds that I had begun to create.

Because of that project, I not only came to deeply understand the fundamental elements of 3D graphics algorithms, but I also became very skilled with C. I learned how to create and destroy very large and complex dynamically-allocated memory structures using variables that point to locations in memory, and other variables that point to those pointers. Using memory like this (indirectly accessing memory) is a very advanced topic in programming and is characteristic of advanced programming in C. By choosing to build a real system, I had to become extremely familiar and capable with these skills and concepts. Even though I rarely program using C today, I still feel very competent with C.

To anyone who wants to learn about anything, I recommend completing a little project in which you create something using the knowledge that you are gaining. This will both drive the acquisition of knowledge, and make the practical application of that knowledge so clear that you will never forget it.

Learn Gradually

My first few months at my undergraduate university were marked by a lot of drinking and partying, and I didn’t get much studying done. The first large set of exams came in the Spring of my first year, and I realized that I needed to study intensely to do well on them. The exams were scheduled just after the three-week Easter vacation. For those three weeks, I went back to stay at my mother’s house, and there I studied every day.

When I returned to university to take the examinations, I already felt absolutely ready. The night before the first exam, I contacted my friends and asked if they wanted to go to the local pub for a drink. To me, it felt right to simply relax and have fun the night before an exam. At that point, “the hay was already in the barn,” as one of my CrossFit coaches says. I was amazed to find that all of my friends were staying at home that night and cramming for the exams, and planning to stay up late and get little sleep. I ended up obtaining A grades across the board in those exams, and I realized that my strategy of starting early, and learning gradually, was very effective. For the rest of my bachelor’s degree, I studied much more consistently and persistently than I had in my first few months at university.

My master of science degree from Stanford took about seven years to attain (I was fully matriculated for less than four of those years), requiring persistence and a relatively small amount of consistent effort over a long period of time. A full-time student is expected to complete an MS at Stanford in two years, but many pack in enough units every quarter to complete the requirements in one year. They do this because full-time tuition is time-based: a fixed amount for each quarter that you’re enrolled. By completing the requirements for the degree in one year, they pay half the normal price.

However, because I did my master’s degree part-time, it was paid for on a per-unit basis. Even though this degree cost over $100K, I didn’t pay for any of it. I studied on weekends and evenings, while working full time, and my employer (NVIDIA) picked up the tab. For that, I am truly grateful.

I took a single three-unit or four-unit class in most quarters, and I sometimes skipped quarters when other aspects of my life became particularly demanding. I loved the process of taking classes while working because it delivered new and structured material for me to learn, and each class represented a well-defined chunk of accomplishment. Since that time, I have learned to create short-term and medium-term goals in my self-directed learning, goals that imbue my self-directed learning with similar regular experiences of completion.

Study What You Love

Before I started my undergraduate degree, I specifically chose electrical engineering (EE) and not computer science (CS). I clearly ended up performing very well on that program, with a 3.7/4.0 GPA (92.5%).

By the time I started on my master of science (MS) degree, I had been working for three years as a computer processor design engineer, developing extremely complex and high-performance digital circuits that are hidden away inside tiny silicon chips. Over that time, I gradually became more interested in software, the systems that run on the hardware.

For my MS, approximately 50% of the units that I chose were from the computer science department. I still acquired enough units from the electrical engineering department for my degree to be an MS in electrical engineering (MSEE). I did this because the MSEE degree had a reputation for being harder than the MSCS degree (but I’m not sure that’s actually true), and was therefore perceived as more prestigious. However, I took enough CS units to have officially received an MSCS instead. In fact, I even met the depth and breadth requirements to enter a computer science PhD program at Stanford. Actually, I may have needed two more letter-graded units, but I’m not sure.

I already knew that I enjoyed the CS classes a lot more than the EE classes. This became even clearer to me as I was reviewing the calculation of my GPA for this article. I completed only 13 units of EE classes with letter grades, but 21 units of CS classes with letter grades. This means that my 3.6/4.3 GPA (84%) is composed of a 3.2 GPA (74%) for the 13 EE units and a 3.9 GPA (90%) for the 21 CS units. If I had focused on what I enjoyed, rather than trying to do something that I thought would look more impressive on my resume, I would probably have finished with a much higher GPA, and also had more fun.

A few years ago I was talking with a friend who works in the mergers and acquisitions department at Yahoo. He told me that Yahoo was only acqui-hiring (acquiring as a way to hire) startups founded by engineers with Stanford MSCS degrees. They only wanted to pay the tens of millions, or hundreds of millions, of dollars to acquire one or two people if they had Stanford MSCS degrees. When I told my friend that I have a Stanford MSEE degree, but that it’s virtually indistinguishable from an MSCS degree, he told me that his manager would not be interested. Ironically, by choosing to have my degree labeled in an apparently more prestigious way, I had forfeited an easy avenue to at least ten million dollars of sign-on bonus.

Use a Learning System

At the end of my undergraduate degree program, there was a three-month vacation period between the end of classes and the final exams. The exams are called “finals” in the UK and they cover material from the final year, but can also cover material from any point in the entire degree program.

In terms of breadth and depth, preparing for those exams is similar to preparing for doctoral qualifying exams in the US. In fact, based on the results from those exams and my final year project, I was accepted into a PhD program in electronic engineering, specializing in digital signal processing (DSP). In the UK (and most of Europe), the work involved in getting a PhD consists of only research, unlike US PhD programs, in which the first two or three years look like a grade-based master’s degree. I didn’t take the offer to study for that PhD because I didn’t think I could afford it, and I believed that I was responsible for taking financial care of my fiancée at the time. I decided to start working instead.

I knew that I had three months to prepare for finals, so I set up a schedule and a plan. From 9 am to 5 pm every day, with a one-hour break for lunch, my fiancée and I would study. I created a large table, covering many pages in a loose-leaf binder, that I used to track my overall progress. Each row in the table represented a subject, a topic within a subject, or a topic within a topic. Horizontally, on each row, I added a series of boxes. I studied a topic for a fixed amount of time before recording at the end of its row an estimate, as a percentage, of how fully I understood the topic. I then took a short break and practiced juggling, before moving on to a different topic.

I used this large table to make sure that I visited all the topics and sub-topics spanning all of the subjects that I might be tested on. I used it to gauge what I needed to focus on to bring my overall comprehension up to a consistent level. Rather than focusing on the topics that were already the clearest to me, I was able to easily identify areas with low understanding estimates, and return to them more frequently. This led to my understanding estimates on those topics increasing rapidly.

Over those three months of consistent and well-directed study, I not only learned how to juggle, but also developed a broad and deep understanding of electronic engineering. I got a first-class honors degree, and my fiancée got an upper second-class honors degree (also called a “2:1”), the minimum requirement for entry into many UK postgraduate courses. I don’t think that she was using my table system, but she was studying consistently every day with me.

Conclusion

Learning is a lifelong process, and learning how to learn is a skill, or set of skills, that you can develop and get better at. By believing that you can learn anything and develop any skill, you will practice consistently and conscientiously to hone your understanding so that when the time comes for you to apply your knowledge, it will be deeply embedded in you, readily accessible and effectively utilizable.

Further Reading

Since you’ve read this far, you’re probably going to love reading, or listening to, my extremely popular article on How to Become World-Class at Anything.