A Proliferation of Architects

One of the things that I have observed over the course of this century is the transition away from traditional “Data Processing” titles like programmer, programmer analyst, systems analyst, etc. The key evidence of this trend is the proliferation of self-aggrandizing titles involving the term architect. In 1985 when I graduated, I don’t remember ever hearing of a software architect or a network architect, or God forbid an enterprise architect or an application architect or a solutions architect or any such thing.

I suppose it started with the transition from mainframe-style computers toward what we now refer to as distributed systems. First everything was mainframes and dumb terminals, then it was PCs. Then it was networks of PCs and servers. For a while it was client/server, then multi-tier or n-tier, and finally the generalization of the term distributed systems. Different computers doing different parts of the work in different places, mostly by “collaborating” with each other.

The programmers who worked on distributed systems had much more diverse titles. They were software developers, application developers, database developers, user interface developers, and with the advent of the internet and the world wide web, web developers. Back in the day, there were programmers who worked on the internals of systems that no user would ever see. Those people were called systems programmers; now, in the age of distributed systems, their new self-aggrandized title was software engineer! Then, as these servers and n-tier platforms started to become more complex, every server product needed a dedicated administrator: the Unix operating system needed a Unix admin, the database needed a database admin, and pretty soon we had Java “containers” that needed their own admins. The networks that connected all the computers together needed administrators too, and pretty soon the whole thing got very complicated.

As these systems grew more and more complex, we started to realize that the genius-wizards who were good enough at their jobs to see the big picture and to help us straighten things out when they got wrapped around the axle deserved a special title: “Architect”. This brings me to the ultimate question for this blog post:

“What the hell does an architect have to do with information technology?”


Learn To Code – Now

I recently spent some time working my way through “Learn Python The Hard Way” by Zed A. Shaw. Zed is a programmer who has accomplished more than most in his short time on Earth. He is outspoken and often edgy, and has a reputation for being both brilliant and blunt. Zed is the creator of the Mongrel server engine that powers many Ruby on Rails sites.

Zed comes off as a Hard Ass, more than anything, and his proposed methodology for learning programming is hard as in hard-assed, not hard as in difficult. Learn Python The Hard Way is old school. Which is good, because I am old. It reminds me of learning Fortran in my freshman year of college in 1980. Hollerith cards. 029 keypunch machines. All batch processing. When you are dealing with “physical” cards, physical sorting of program steps, and waiting an hour to see whether your code compiled, let alone executed to completion or got a correct answer, you tend to do a lot more “desk checking” than we do today. That is what I like about LPTHW: it teaches some technique around old-school desk checking, like reading your code backwards to find errors, something that we often did on green bar paper at a table at Helmut’s Alpine Kitchen at two o’clock in the morning with a pot of coffee and an order of biscuits and gravy.
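
To make that kind of desk checking concrete, here is a minimal Python sketch of tracing a small function by hand before ever running it. This is my own example, not one of Zed’s exercises; the function name and the values are invented for illustration.

```python
# A tiny function worth desk checking before you run it.
# Goal: return the average of a list of numbers.
def average(numbers):
    total = 0
    for n in numbers:
        total = total + n
    return total / len(numbers)

# Desk check, by hand, for [2, 4, 6]:
#   total goes 0 -> 2 -> 6 -> 12, len is 3, so the result is 4.0 (correct).
# Now read it backwards: the last line divides by len(numbers),
# which raises the question of what happens when the list is empty --
# a ZeroDivisionError the desk check finds before the interpreter does.

if __name__ == "__main__":
    print(average([2, 4, 6]))  # prints 4.0
```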

Learn To Code – Languages

It’s 2014, almost 2015, and conventional wisdom about computers and programming has changed dramatically in the 30 years since I graduated college. The number of people who use computers has grown from 10% to 90% in that period of time. My Google Nexus phone has way more memory and compute power than the mainframe I learned programming on in college. The PC that I bought in 1986 had a 20 megabyte hard drive – that would hold about 10 images shot on the camera embedded in my phone, or one shot in raw mode on my DSLR.

From the 1950s into the 1970s, computers were physically large, occupying whole rooms and requiring many attendants or operators to manage. In the 1970s, microchip (integrated circuit) technology allowed computers to be built that would fit on a desk. Now we all need a laptop, a tablet, and a phone, and maybe a watch or a pair of glasses that are all computers of some kind. We have computers in our cars and in our smart homes. All our video gaming consoles are just computers.

Conventional wisdom, which 30 years ago held that computer programming was a highly specialized skill, now says that everyone should learn how to code, even if they don’t do it very often. This is because as computers become more ubiquitous, we need to understand them – the same way that everyone should know basic auto maintenance like changing the oil or mounting the spare tire when you get a flat, and the same way we know how to unclog a toilet or sink drain or oil squeaky hinges in our home. Computers are so much a part of everyday life that we need to understand more about how they work.

So let’s just accept the conventional wisdom for a moment. What does learning to code mean? What is code exactly, and how does one learn it?
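
As a small taste of an answer, here is what a few lines of code look like. This is my own minimal Python sketch, not an excerpt from any particular course, and the names in it are made up for illustration.

```python
# Code is just a precise set of instructions the computer follows literally.
# These lines greet each name in a list, in order, one per line.
names = ["Ada", "Grace", "Dennis"]
for name in names:
    print("Hello, " + name)
```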