It’s said that learning a new language can change the way you think, by giving you a new perspective on the world. But how does that apply to your first programming language?
Whenever you get a few programmers together, a common topic of conversation is what programming languages you know and, in particular, which one you learned first and the different orders in which you learned the others. (As well as the usual size queen conversations about who knows the most or the most obscure languages.)
Chances are people who learned programming in school didn’t get to pick their first language—it was chosen for them. Sometimes it was based on industry; someone who took a course in business programming would learn a different language (COBOL, RPG) from someone learning scientific or engineering programming (FORTRAN, Pascal).
An even bigger influence is generational. Programming languages, particularly programming languages taught in school, tend to be taught in waves. People of a certain age will have first learned FORTRAN or COBOL, followed by people who first learned Pascal, followed by people who first learned a hobbyist computer’s language such as TRS-80 or Apple BASIC, whereas people these days might learn Python first.
Think about it: Is someone whose first programming language was APL going to think about programming ever after in the same way as someone whose first programming language was COBOL or assembly language? “I find that my early exposure to APL warped my brain by forcing me to ask ‘what’s this mean?’ and ‘is this a reusable operation?’ and ‘what’s a pithy summary for all this algorithmic fluff?’” notes one present-day Python programmer.
“The tools we use have a profound (and devious!) influence on our thinking habits, and, therefore, on our thinking abilities,” wrote Turing Award winner Edsger Dijkstra in 1975. “It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration. The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.”
Programming, after all, has been found to be so idiosyncratic and personal that the science of stylometry can help identify even an anonymous programmer based simply on the way in which they write their code. It stands to reason that an important part of this style is the language and the data structures you use—and those are often heavily dependent on which language you learned first. How many people do you know who got so hung up on Microsoft Excel after they learned it that they used it for everything, ranging from clip art to databases?
And depending on the language, you can get spoiled for anything else. One programmer, for example, learned Lisp early on (though to be fair, it was his third language, not his first), and became so enamored of it that he didn’t learn most of the other languages available at that time. In fact, he claims on an online forum discussion that his focus on Lisp destroyed his programming career. “I just wasn’t a very good programmer anymore,” he lamented. “Lisp’s power had made me complacent, and the world had passed me by. Looking back, I actually don’t think I was ever a very good programmer. I just happened to have the good fortune to recognize a good thing when I saw it, and used the resulting leverage to build a successful career. But I credit much of my success to the people who designed Common Lisp.”
He wasn’t the only one. “Many non-lisper friends have learned to do very difficult things while I personally remained ignorant of them because I never needed them,” noted one person discussing the piece. Another agreed. “Personally, I feel caged whenever I am not programming in Lisp or a language with similar metaprogramming capabilities.”
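For readers who haven’t met the term, “metaprogramming” here means code that manipulates or generates code. Lisp macros are the canonical example, but a rough (and much weaker) taste of the idea can be sketched in Python, the language mentioned earlier in this piece—here as a decorator that rewrites a function’s behavior at definition time. The `memoize` helper below is purely illustrative, not from the original discussion:

```python
# A small taste of metaprogramming, sketched in Python rather than Lisp:
# a decorator is a function that receives another function and returns a
# replacement for it -- code operating on code, which Lisp macros generalize.

def memoize(fn):
    """Wrap fn so repeated calls with the same argument reuse the result."""
    cache = {}
    def wrapper(arg):
        if arg not in cache:
            cache[arg] = fn(arg)
        return cache[arg]
    return wrapper

@memoize  # at definition time, fib is replaced by the wrapped version
def fib(n):
    # Naive recursion, made fast because the decorator caches every call.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(60))  # completes instantly instead of taking years
```

Lisp takes this much further: macros can restructure the program’s syntax itself before it runs, which is why the programmers quoted above feel “caged” without it.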
And the reverse is also true—some people may not know how to do something because the language they use doesn’t support it, writes Yan Cui in the programming blog theburningmonk.com. “What if our creative powers are limited by the ideas that can be expressed by the programming language we use?” he writes. “What if some solutions are out of our reach because they cannot be clearly expressed in those languages? How would we ever discover these unknown unknowns?”
Ultimately, any good computer language should change the way you think, according to Turing Award winner Alan Perlis, who helped design the ALGOL language and stated: “A language that doesn’t affect the way you think about programming is not worth knowing.”
Simplicity 2.0 is where we examine the intricate and transitory world of technology—through a Laserfiche lens. By keeping an eye on larger trends, we aim to make software that’s relevant to modern day workers, rather than build technology for technology’s sake.
Subscribe to Simplicity 2.0 and follow us on Twitter. If what we’re saying piques your interest, head over to Laserfiche.com where you’ll see how we apply the lessons learned on Simplicity 2.0 to our own processes, products and industry.