Where Self-Taught Coders Fall Short and How to Avoid It

While the “learn to code” movement has largely faded from mainstream attention, it’s still alive and well, influencing budding programmers everywhere. Development jobs offer the sweet prospect of a profitable, creative career based purely on merit, one that can propel anyone from broke to upper middle class in a few years. The scariest part is that it’s mostly true, but you can’t just learn to write code; you need to learn to program.

The distinction between the two seems subtle. If you just learn a language, you have a tool; to succeed, you also need to learn what you can do with that tool and how to do it. Programming is ultimately a process of coaxing a machine into doing what we want, a compromise between lifeless silicon and the human imagination. It’s one thing to know how to talk; it’s another to know what to say.

Self-taught coders tend to get a lot right, but they also tend to have some of the worst holes in their knowledge. The issue isn’t teaching yourself; it’s the limitations you’re unaware of. After all, there’s a big difference between learning a language and learning to use it to program. Wisdom and knowledge together teach you the difference between “can” and “should.” How can you fill in the gaps?

Learning a Language

Programming is like learning the etiquette, the culture, and how to hold a conversation; writing code is just learning the language itself. Most people don’t need to learn how to speak to another person when they learn a new language; they just need a tool to convert their existing thoughts into a culturally acceptable equivalent. Humans are mostly the same, though: if you can communicate with one group, then with the language and a little understanding, you can probably communicate with any other.

Computers are fundamentally different. If you break a program down into pseudocode (or further), it will typically be understandable to a person, but not necessarily at a deep level. Almost anyone can read a book, but they can’t necessarily write one. People are great at seeing patterns, but recognizing a pattern through analysis is different from synthesizing a concrete example of it.

Many self-taught coders fall into the trap of equating learning a language with learning about programming. Computer science is completely different from just coding because it gets into the fundamentals: computers run on math, logic, and a way to turn both into something a machine can actually execute.

Learning to Program

Skipping math for programming is like trying to learn to speak without understanding what the words actually mean. A computer can spit back a sentence or even whole paragraphs from a grammatical algorithm, but the output doesn’t mean anything to the machine. AI might change that eventually, but for now it’s a computer simulating a human level of understanding without actually understanding.

You can’t just regurgitate the same basic blocks; you need to organize them and have a way to store data. You need data structures and algorithms. It’s not enough to memorize quicksort; you need to truly understand what the algorithm means and how and why it works.
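To make that concrete, here’s a minimal quicksort sketch in Python (deliberately simple and unoptimized). The code itself is the easy part; truly understanding it means being able to explain why the partition step guarantees a sorted result.

```python
# Quicksort works by partitioning: everything smaller than the pivot goes
# left, everything larger goes right, and the same argument applied
# recursively sorts each half. That invariant is *why* it works.
def quicksort(items):
    if len(items) <= 1:  # base case: zero or one element is already sorted
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([33, 4, 15, 8, 42, 16, 23]))  # [4, 8, 15, 16, 23, 33, 42]
```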

What is Big O notation, and what does it mean? Some of the more dedicated self-taught coders will learn what it is, but not how it’s calculated or measured. What time do your algorithms run in, and what’s the worst-case scenario? Why do these differences even matter?
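As a rough illustration (a sketch, not a rigorous benchmark), compare two ways of finding a value in a sorted list. Both are correct; the analysis is what tells you how differently they scale.

```python
# Linear search inspects elements one by one: O(n) in the worst case.
def linear_search(items, target):
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Binary search halves the remaining range each step: O(log n) worst case,
# but it only works because the input is sorted.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# On a million elements, the worst case is ~1,000,000 steps versus ~20.
data = list(range(1_000_000))
assert linear_search(data, 999_999) == binary_search(data, 999_999)
```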

The Difference Between “Can” and “Should”

Understanding the fundamentals of computer science will help you tell the difference between when you can do something and when you should do it. Quicksort is one of the best algorithms for general-purpose sorting, but there are many cases where it can be a terrible choice. There are even cases where a bubble sort wins.
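One illustrative case: nearly sorted input. An early-exit bubble sort finishes in a single O(n) pass over already-sorted data, while a naive quicksort that always pivots on the first element degrades to its O(n²) worst case on exactly that input. (This is a sketch of the idea, not a claim about tuned library sorts.)

```python
# Bubble sort with an early exit: if a full pass makes no swaps, the list
# is sorted and we stop. On sorted or nearly sorted data this is O(n).
def bubble_sort(items):
    items = list(items)  # work on a copy
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:  # no swaps: already sorted, exit early
            break
    return items

print(bubble_sort([1, 2, 3, 5, 4, 6]))  # nearly sorted: done in two passes
```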

What are the implications of static typing versus dynamic typing? Certain thought processes work better in certain languages. A language becomes a tool in your toolbox once you understand the principles of programming.
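A small Python sketch of that trade-off: with dynamic typing, a type mismatch only surfaces when the offending line actually runs, whereas a statically typed language rejects the equivalent mismatch at compile time. (The optional hints below, checkable with a tool like mypy, are one common middle ground, not the only approach.)

```python
# Dynamically typed: this accepts anything, and misuse fails only at runtime.
def double(value):
    return value * 2

print(double(21))      # 42      -- ints multiply
print(double("ab"))    # 'abab'  -- strings repeat: legal, perhaps unintended
# print(double(None))  # TypeError, but only when this line executes

# Optional type hints recover some static checking without changing runtime
# behavior; a checker such as mypy can flag bad calls before the program runs.
def double_checked(value: int) -> int:
    return value * 2
```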

Amateur coders tend to lose the forest for the trees when learning a language. Mastering a language’s idioms won’t necessarily help you learn to think like a computer. Learning additional languages can build a natural understanding of the basic principles, but it isn’t guaranteed. Someone who speaks five languages isn’t necessarily a linguist, and someone who can code in five languages isn’t necessarily a computer scientist. There is a conscious step required to reach that level of understanding.

Filling in the Gaps

You have to learn, understand, and internalize the math and material behind computer science to really learn to program. At a minimum, you need to know the basics of Big O notation and formal logic and understand basic algebra. Different sub-fields and higher levels will require more advanced, and different, types of math. You might be able to get away without much math for very simple programs and very basic coding, but it will bite you sooner rather than later. Stack Exchange and Dunning-Kruger can only get you so far.
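As one small example of where formal logic pays off directly in code, De Morgan’s laws let you rewrite a tangled condition into an equivalent, clearer one. (The `user.banned` attribute below is hypothetical, purely for illustration.)

```python
# De Morgan's laws:
#   not (A or B)  ==  (not A) and (not B)
#   not (A and B) ==  (not A) or  (not B)
# Checking every truth assignment confirms the equivalence:
for a in (True, False):
    for b in (True, False):
        assert (not (a or b)) == ((not a) and (not b))
        assert (not (a and b)) == ((not a) or (not b))

# Practical payoff: an awkward guard like
#     if not (user is None or user.banned): ...
# rewrites, by De Morgan, to the easier-to-read
#     if user is not None and not user.banned: ...
```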

Where do you stand on math and logic? Big O notation is just math applied to analyzing algorithms, so it can be learned alongside the “standard” math. Where do you stand on applying math and logic? Data structures and algorithms are the science where math and computers collide, so you need to understand the basis of the application in addition to the application itself.
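A quick sketch of that collision: the same membership question asked of a list and of a set. The math behind hashing is what turns an O(n) scan into an average O(1) lookup. (Exact timings will vary by machine; the growth rates won’t.)

```python
import timeit

# A list answers "is x in here?" by scanning every element: O(n).
# A set hashes the key straight to its bucket: O(1) on average.
n = 100_000
as_list = list(range(n))
as_set = set(as_list)

# Searching for a missing value forces the list's worst case: a full scan.
list_time = timeit.timeit(lambda: -1 in as_list, number=100)
set_time = timeit.timeit(lambda: -1 in as_set, number=100)
print(f"list: {list_time:.4f}s  set: {set_time:.6f}s")
```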

How do you describe and differentiate languages? Why is Rust so different from C#? Why is Perl different from Python? Even an expert may not know off the top of their head, but a quick dive into the basics will tell them enough. Each language has its own set of features and assumptions about the nature of its intended work; you need to know what these differences imply.

A language is a tool, and each tool has its own function. What are you missing, and how can you fill in that knowledge? You need to be able to determine accurately what you’re missing and where to go to get it. If nothing else, grab a book on data structures and algorithms and see what you can follow and what you can’t. There’s a wealth of knowledge required to make the jump from coder to programmer. Are you ready to try to make that jump?

Featured image by Thierry Milherou from Pixabay