The Beast

You adopt practices to “make the team go.” However, every practice you adopt has a cost. The time you spend “making the practice go” is, in effect, part of the cost of making the team go. I like to talk about the cost of your practices as “Feed the Beast!” But what you really want, to make your team go, is to “Ride the Beast!”, where the practices you have adopted start to carry the team faster than it could go without them.

A Proliferation of Architects

One of the things that I have observed over the course of this century is the transition away from traditional “Data Processing” titles like programmer, programmer analyst, systems analyst, etc. The key evidence of this trend is the proliferation of self-aggrandizing titles involving the term architect. In 1985, when I graduated, I don’t remember ever hearing of a software architect or a network architect, or, God forbid, an enterprise architect or an application architect or a solutions architect or any such thing.

I suppose it started with the transition from mainframe-style computers toward what we now refer to as distributed systems. First everything was mainframes and dumb terminals, then it was PCs. Then it was networks of PCs and servers. For a while it was client/server, then multi-tier or n-tier, and finally the generalization of the term distributed systems: different computers doing different parts of the work in different places, mostly by “collaborating” with each other.

The programmers who worked on distributed systems had much more diverse titles. They were software developers, application developers, database developers, user interface developers, and, with the advent of the internet and the world wide web, web developers. Back in the day, there were programmers who worked on the internals of systems that no user would ever see; those people were called systems programmers. Now, in the age of distributed systems, their new self-aggrandized title was software engineer! Then, as these servers and n-tier platforms became more complex, every server product needed a dedicated administrator: the Unix operating system needed a Unix admin, the database needed a database admin, and pretty soon we had Java “containers” that needed their own admins. The networks that connected all the computers together needed administrators too, and pretty soon the whole thing got very complicated.

As these systems grew more and more complex, we started to realize that the genius-wizards who were good enough at their jobs to see the big picture, and to help us straighten things out when they got wrapped around the axle, deserved a special title: Architect. This brings me to the ultimate question for this blog post:

“What the hell does an architect have to do with information technology?”

Information Driven Projects

When you look at software development or corporate change projects, you often see some creative fiction. There is fiction in the plans, fiction in the designs and fiction in the requirements. This fiction is created by the notion that “Before we can start, we have to know everything required to get to done.”

Intuitively, we all know that this is not really true. We all know that information will “emerge” from our activities that will change how we get to done. We learn from our mistakes. We try things that don’t produce as good a result as we want. We clarify our understanding of the problem as we demonstrate portions of the solution.

Learn To Code – Now

I recently spent some time working my way through “Learn Python The Hard Way” by Zed A. Shaw. Zed is a programmer who has accomplished more than most in his short time on Earth. He is outspoken and often edgy, and has a reputation for being both brilliant and blunt. Zed is the creator of the Mongrel server engine that powers many Ruby on Rails sites.

Zed comes off as a Hard Ass, more than anything, and his proposed methodology for learning programming is hard as in hard-assed, not hard as in difficult. Learn Python The Hard Way is old school. Which is good, because I am old. It reminds me of learning Fortran in my freshman year of college in 1980. Hollerith cards. 029 keypunch machines. All batch processing. When you are dealing with “physical” cards, physical sorting of program steps, and waiting an hour to see if your code even compiled, let alone executed to completion or got a correct answer, you tend to do a lot more “desk checking” than we do today. That is what I like about LPTHW: it teaches some technique around old-school desk checking, like reading your code backwards to find errors, something we often did on green bar paper at a table at Helmut’s Alpine Kitchen at two o’clock in the morning with a pot of coffee and an order of biscuits and gravy.
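
To make that concrete, here is a tiny Python script in the spirit of an LPTHW exercise (my own invention, not an excerpt from the book) with a deliberate bug that old-school desk checking will catch before you ever run it:

```python
# A small script in the spirit of an LPTHW exercise (my own example, not from the book).
# Desk check it before running: trace each variable by hand, line by line.

prices = [3.50, 2.25, 4.75]  # biscuits, coffee, gravy

total = 0
for i in range(1, len(prices)):   # bug: starts at 1, so prices[0] is never added
    total = total + prices[i]

print("order total:", total)      # prints 7.0, but the real total is 10.5
```

Trace total and i by hand, backwards if you like, and the off-by-one jumps out long before the interpreter gets a chance to tell you.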

Learn To Code – Languages

It’s 2014, almost 2015, and conventional wisdom about computers and programming has changed dramatically in the 30 years since I graduated college. The share of people who use computers has grown from 10% to 90% in that period of time. My Google Nexus phone has way more memory and compute power than the mainframe I learned programming on in college. The PC that I bought in 1986 had a 20 megabyte hard drive – that would hold about 10 images shot on the camera embedded in my phone, or one shot in raw mode on my DSLR.

From the 1950s into the 1970s, computers were physically large, occupying entire rooms and requiring many attendants or operators to manage. In the 1970s, microchip and integrated-circuit technology allowed computers to be built that would fit on a desk. Now we all need a laptop, a tablet, and a phone, and maybe a watch or a pair of glasses, all of which are computers of some kind. We have computers in our cars and in our smart homes. All our video gaming consoles are just computers.

Conventional wisdom, which 30 years ago held that computer programming was a highly specialized skill, now says that everyone should learn how to code, even if they don’t do it very often. This is because, as computers become more ubiquitous, we need to understand them – the same way everyone should know basic auto maintenance, like changing the oil or mounting the spare tire when you get a flat. The same way we know how to unclog a toilet or a sink drain or oil squeaky hinges in our homes. Computers are so much a part of everyday life that we need to understand more about how they work.

So let’s just accept the conventional wisdom for a moment. What does learning to code mean? What is code, exactly, and how does one learn it?
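
As a down payment on an answer, here is about the smallest honest piece of “code” I can offer: a few lines of Python, with numbers I made up purely for illustration. A non-programmer can read it, type it in, change the values, and already be learning to code:

```python
# Code is just written instructions a computer follows exactly, one line at a time.
# Change the numbers, run it again, and you have already started learning to code.

monthly_photos = 250          # photos taken on a phone each month (made-up figure)
megabytes_each = 2            # rough size of one photo (made-up figure)
yearly_storage = monthly_photos * 12 * megabytes_each

print("Photos per year:", monthly_photos * 12)
print("Storage needed (MB):", yearly_storage)
```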

A taxonomy of software types

Generally, software falls into three classes: Apps, Tools, and Infrastructure.

The Breakdown

Apps – or applications as they were formerly known – are software built to help a user do some valuable activity, like checking a bank balance or editing digital images. While the end user must learn how to use it, an app is useful without further development. Some apps are configurable, so they can work differently for different users or organizations, but they are still focused on solving problems or delivering value related to some specific functional domain.

Tools – are more general-purpose software with a specific flavor: they are designed so that their users can “fashion” applications for themselves or for other groups of end users. Tools express their own user experience, but are not always immediately valuable without some “fashioning”. Tools range from Microsoft Excel to SharePoint, to web content management systems like WordPress, to giant ERP systems like SAP or PeopleSoft. Many business intelligence products fall into this category.

Infrastructure – is software that has no end-user experience at all. It is designed entirely as a foundation for other software to be built upon. This includes any software whose primary interaction mode is through API (application programming interface) or CLI (command line interface) patterns. Products like databases, middleware, application servers, and application frameworks all fall into this category.

So why is it so confusing to people? Because technology has its own functional domains, and these classes are not mutually exclusive: for one user a piece of software is an application, and for another it is a tool. With add-ons, extensions, or plug-ins it becomes even more confusing, as these constructs blur the lines between tools and applications even further. A plug-in for a tool may itself be an application focused on one functional domain.
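
One way I like to make the taxonomy concrete is to sketch it as data. The snippet below is only my illustration of the classification above, using products mentioned in this post; the point is that a single product can legitimately carry more than one class:

```python
# A sketch of the app / tool / infrastructure taxonomy as data (illustrative only).
from enum import Enum, auto

class SoftwareClass(Enum):
    APP = auto()             # valuable to an end user without further development
    TOOL = auto()            # general purpose; users "fashion" applications with it
    INFRASTRUCTURE = auto()  # no end-user experience; consumed through APIs or CLIs

# The classes are not mutually exclusive: the same product can wear two hats.
catalog = {
    "mobile banking app":   {SoftwareClass.APP},
    "Microsoft Excel":      {SoftwareClass.APP, SoftwareClass.TOOL},
    "WordPress":            {SoftwareClass.TOOL},
    "SAP":                  {SoftwareClass.TOOL},
    "relational database":  {SoftwareClass.INFRASTRUCTURE},
    "application server":   {SoftwareClass.INFRASTRUCTURE},
}

for product, classes in catalog.items():
    labels = ", ".join(c.name.lower() for c in sorted(classes, key=lambda c: c.value))
    print(f"{product}: {labels}")
```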

System Replacement Assumptions

I am in the middle of my umpteenth system replacement project.  There are some assumptions that are endemic to the user community in every system replacement project.  They are born of hope and frustration.  They are almost universal.

1) The new system will do everything the old one does, only better.
2) The new system will support all of my existing processes and procedures without significant change.
3) The new system will be faster than the old system.
4) The new system will have better data quality than the old system.
5) The new system will address ALL of the shortcomings of the old system.

If you have ever done one of these projects, you know.  They are assumptions that you must actively work against.  They require a constant stream of communication to dispel.  I offer you my rationale for why they are never, ever true.

Product Centered Worldview

Agile thinkers focus on the product – on how we are intentionally adding value to a “software asset” and, potentially, how we manage our “portfolio” of software assets. Phase-Gated Delivery patterns (some people call it waterfall) allow us to “optimize” the work against the constraints, but they do not allow us to optimize the value delivery in time. In fact, they push ALL the value delivery to the very end of the cycle. In theory, because we can optimize the project by minimizing schedule and cost against a constant value output, the value can be had for less investment. Problems arise because the actual value is realized over time: the sooner the customer has access to a software capability, the sooner his actual costs go down or his profits go up.
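
A back-of-the-envelope example makes the point. Assume (numbers invented purely for illustration) a project that builds four capabilities, each worth $10,000 a month once the customer can use it, over a 12-month horizon:

```python
# Illustrative cost-of-delay arithmetic (all numbers are made up for this sketch).
# Four capabilities, each worth $10,000/month once delivered, 12-month horizon.

value_per_month = 10_000
horizon = 12  # months

# Phase-gated: everything ships at month 12, so nothing is realized inside the horizon.
phase_gated_value = 4 * value_per_month * (horizon - 12)

# Incremental: one capability ships every 3 months and earns value from then on.
incremental_value = sum(value_per_month * (horizon - ship_month)
                        for ship_month in (3, 6, 9, 12))

print("Phase-gated value realized by month 12:", phase_gated_value)   # 0
print("Incremental value realized by month 12:", incremental_value)   # 180,000
```

Same scope, same end date, but the incremental schedule has already returned $180,000 of value by the time the phase-gated project delivers its first dollar.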

Design Philosophy And Coding Style

Caution, Rant Alert!

In my career, I have been some kind of leader on about a dozen new application projects. It is interesting to me that the only time I have ever heard about “design philosophy” or “coding style” is when some new developer comes onto a project and gets his butt handed to him in a code review. Specifically, what I hear is a defensive posture that sounds like an excuse for code that is not a good fit for purpose or consistent with the surrounding application.

In a prior post, HelpingTheTeam, I talk about some new-developer problems. This is another, perhaps less common, issue.

Here is a list of bad coding practices that I have heard defended as “design philosophy” or “coding style”, and why they are neither philosophy nor style: