Learn To Code – Languages

It’s 2014, almost 2015, and conventional wisdom about computers and programming has changed dramatically in the 30 years since I graduated from college. The share of people who use computers has grown from roughly 10% to 90% in that time. My Google Nexus phone has far more memory and compute power than the mainframe I learned programming on in college. The PC I bought in 1986 had a 20 megabyte hard drive; that would hold about 10 images shot on the camera embedded in my phone, or one shot in raw mode on my DSLR.

From the 1950s into the 1970s, computers were physically large, occupying entire rooms and requiring many attendants or operators to manage. In the 1970s, microchip (integrated circuit) technology allowed computers to be built that would fit on a desk. Now we all need a laptop, a tablet, and a phone, and maybe a watch or a pair of glasses, all of which are computers of some kind. We have computers in our cars and in our smart homes. All our video game consoles are just computers.

Conventional wisdom, which 30 years ago held that computer programming was a highly specialized skill, now says that everyone should learn how to code, even if they don’t do it very often. As computers become more ubiquitous, we need to understand them, the same way everyone should know basic auto maintenance like changing the oil or mounting the spare tire after a flat, and the same way we know how to unclog a toilet or a sink drain, or oil squeaky hinges in our home. Computers are so much a part of everyday life that we need to understand more about how they work.

So let’s just accept the conventional wisdom for a moment. What does learning to code mean? What is code, exactly, and how does one learn it?

What is code?

Code is a means of expressing instructions that you want the computer to “do” or execute. Like asking your butler to answer the door. When people talk about code today, they usually mean writing those instructions in a “programming language”. Writing a program is the equivalent of teaching your dog to roll over on command. Once she has learned the instruction and response, you say “roll over” and she obeys. So semantically, programming is more like training the computer to do a task. Except that once you have written the program you can give it to any “compatible” computer and that computer will be “trained” as well. In this analogy, though, we would also have to learn to speak “dog”. Computers can only be trained in a language that they speak.

What language do computers speak?

Natively, computers speak the instruction set they were designed to execute. Originally, each type of computer spoke a unique language aligned with the design of its processor. This was problematic for programmers, so programmers decided to create new languages and train computers to understand them.

How do you train a computer to understand a language? You give it a program (* Mind Blown *). A programming language needs a computer program that translates from that language into the native language of the computer you want to train. These programs come in three flavors: compilers, interpreters, and runtime intermediaries.

A compiler quite literally takes in a set of instructions in one language (source code) and outputs a set of instructions in executable machine language. An interpreter does this translation one instruction at a time, while your program runs. One advantage of an interpreter is that there is usually an interactive mode (often called a REPL, or read-eval-print loop) where a programmer can type individual instructions and see the result of each one immediately. The disadvantage is that the translation happens every time an instruction executes, rather than once before execution, so interpreted programs may have performance problems. A runtime intermediary is a large body of pre-translated, usually optimized instructions designed to simplify your compiler or interpreter: instead of outputting machine code, the compiler outputs intermediate code that the runtime intermediary has already mapped to machine language. Examples of this are the Java Virtual Machine and the Microsoft CLR (the .NET runtime). This approach can also solve the performance problems of interpreted languages.
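To make the interpreter idea concrete, here is a sketch of what an interactive Python session might look like (the “>>>” prompt is where you type; the exact banner and prompt vary by version). Each instruction is translated and executed the moment you press Enter:

    $ python
    >>> 2 + 3
    5
    >>> greeting = "Hello, world"
    >>> greeting.upper()
    'HELLO, WORLD'
    >>> exit()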

Programmers have created hundreds of languages in which to write code. Ever since the first language was invented, programmers using an existing language have grown frustrated and said, “This sucks, I can do better than that.” Virtually every language came from some programmer who uttered that phrase and was, of course, capable of creating a new language.

Today’s popular languages, including Java, PHP, Python, Ruby, and JavaScript, have been around for more than 10 years and have become widely used in new development. Older languages like C, BASIC, COBOL, Perl, Smalltalk, or Lisp are still widely used and not to be discounted. Newer languages like Scala, Clojure, Erlang, Haskell, Go, and Groovy have yet to prove that they will last. Just read a timeline of computer languages to see how many have come and stayed, or gone. Just like human languages, programming languages are derived from each other, they inherit words and concepts from each other, and they can be divided into families. And just like human languages, when the last native speaker dies, the language dies with them. When nobody is writing new programs in a language (just fixing the old ones), the language is essentially dead.

What does it mean to learn a programming language?

A computer program, essentially, is a set of steps that follow a sequence. Those steps can do one of four things (see the sketch after this list):

1) Interact with hardware (e.g. display something, write a file, get input from mouse, touchscreen or keyboard, etc.)
2) Manipulate data (e.g. calculate a value, manipulate some text, change an image, etc.)
3) Make choices that affect the result (e.g. if the mouse clicks on a button, or if the data contains the word “stop”, etc.)
4) Define an alternate sequence of steps (e.g. repeat the same steps over, follow a different set of steps, or skip one or more steps, etc.)
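Here is a minimal sketch in Python (the language I recommend below) that touches all four kinds of steps. The program itself is hypothetical, just an illustration, and assumes Python 3:

    # 2) Manipulate data: start a running total at zero.
    total = 0
    # 4) Define an alternate sequence: repeat these steps over and over.
    while True:
        # 1) Interact with hardware: read a line from the keyboard.
        text = input("Enter a number (or 'stop'): ")
        # 3) Make a choice that affects the result.
        if text == "stop":
            break  # 4) Skip the remaining steps and leave the loop.
        # 2) Manipulate data: convert the text to a number and add it.
        total = total + int(text)
    # 1) Interact with hardware: display the result on the screen.
    print("Total:", total)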

Languages provide words, nouns and verbs, which allow the programmer to create statements, expressions, or sentences that effectively become these steps. Languages provide constructs for organizing statements in different ways; these tend to be called “paradigms”, because they change the way the programmer thinks about instructing the computer. This is not that dissimilar from the way human languages affect the thinking of native speakers. Languages also provide ways of organizing programs as parts of other programs, which can be thought of as analogous to the notion of sentences, paragraphs, chapters, and books in human languages.
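As a small illustration of how a paradigm changes the shape of a program, here is the same computation written two ways in Python, first imperatively (step by step), then in a more functional, declarative style:

    numbers = [1, 2, 3, 4, 5]

    # Imperative style: spell out each step and build the result list by hand.
    squares = []
    for n in numbers:
        squares.append(n * n)

    # Functional/declarative style: describe the result you want.
    squares = [n * n for n in numbers]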

Like human languages, programming languages have conventions and representations for “context”, so you know how to interpret things like pronouns, and which noun or verb an adjective or adverb is modifying. Take the English sentence, “One boy pushed another boy so that he fell down.” It is unclear which boy fell, the pusher or the one being pushed. We normally assume things from the context, but semantically either interpretation could be correct. The compiler of a programming language should “catch” such ambiguity and flag an ambiguous instruction as an error, meaning that the computer cannot reliably or precisely interpret it. Likewise, compilers will flag syntactically incomplete instructions or logically inconsistent statements as errors. Humans, when encountering such sentences, make a judgment and naturally apply a correction to make the sentence precise, consistent, or complete. Compilers do no such thing; they act more like your high school English teacher, effectively making red marks on your program to indicate your errors. Programming language compilers are pedantic about small infractions in ways that can feel incredibly frustrating to human language speakers.
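For example, here is the Python interpreter playing English teacher over a single missing colon (the exact wording and layout of the message vary by Python version):

    >>> if total == 100
      File "<stdin>", line 1
        if total == 100
                      ^
    SyntaxError: invalid syntax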

Both human languages and programming languages are just shared abstractions by which we can communicate some underlying idea that we have. The same idea can be communicated in many languages, and would sound different, and look different on a page, but would mean the same thing.

Learning a programming language is similar to learning a human language in that it requires one to learn the vocabulary, syntax (sentence structure), and semantics of the language. But to really, deeply learn to write amazing and powerful programs, you have to understand how the language turns your statements into the native language of the computer. Not so you can translate into “machine code” yourself, per se, but so you understand the underlying machine architecture and how it processes instructions.
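Python offers a nice window into this. Its standard “dis” module shows the virtual machine instructions (bytecode, not raw machine code) that your statements become. A short sketch; the exact instructions printed vary by Python version:

    import dis

    def add(a, b):
        return a + b

    # Print the virtual machine instructions for the function above.
    dis.dis(add)
    # Typical output (abbreviated):
    #   LOAD_FAST    a
    #   LOAD_FAST    b
    #   BINARY_ADD
    #   RETURN_VALUE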

Learning to write code can be a fun exercise. It teaches problem-solving skills. It increases one’s ability to think in the abstract. But it can be frustrating for the beginner, because computers are literal, not contextual. They interpret only the instructions typed in, not the body language, facial expressions, or intentions of the programmer.

What language should a new programmer learn first?

A lot of articles have been posted on this, but I like Python for beginners. I like it because it runs on virtually any PC or Mac you might own. I like it because it has been around for a while, is pretty widely used, and is “general purpose”, meaning it can be made to do most anything you want it to do. And I like it because Python is interpreted, which sometimes makes it easier to get started.

Python’s syntax is simple and not cluttered with unnecessary punctuation. It is pretty easy to learn, and there are some great tutorials out there designed for the beginner. My current favorite is “Learn Python The Hard Way” by Zed A. Shaw. I won’t tell you why, just do that one first. It will take you farther, faster than any other tutorial out there, and much of what you learn will apply to other languages as well.
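To see what that uncluttered syntax looks like, here is a complete Python program: no braces, no semicolons, no type declarations; indentation alone marks the structure:

    # Greet each name in the list.
    for name in ["Ada", "Grace", "Alan"]:
        print("Hello,", name)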

I suppose one could make a case for learning any language first. If you have some project in mind, you should start by learning a language that will make your project easy. But if you are learning just for the “fun” of it, then take my advice and start with Python.

So go out and get started. Learn to write code. If it makes sense to you, go as far as you can.
