Time To Stop Obsessing About The Infrastructure?
By Alex Bunardzic under Theory on 12. March 2006

I’ve been actively developing software for the past 17 or 18 years. When I began, back in 1987, the recommended language of choice was C. The rationale at the time was that C closely and faithfully mimics the inner workings of the underlying low-level machinery.
Obsession with the infrastructure. Yes, computing infrastructure is fascinating. But what is a computer?
Mainframe is the Computer
If you had asked IBM 20 years ago what a computer is, they’d have told you that the mainframe is the computer. They were selling mainframes at the time, and to them every problem looked like a nail.
Network is the Computer
At around the same time, Sun Microsystems attempted to take over the world with their slogan “Network is the computer”. The exact opposite of mainframes. Great.
Database is the Computer
Oracle, another big-ass vendor, made its own attempt to take over the world with its vision of the database as the computer. Yes, databases are fun.
Desktop is the Computer
Finally, Microsoft took a swing at it by announcing that the desktop is king. Yes, the desktop is so cool, so sexy. And, as it became more and more affordable, it really started taking the world by storm.
Who cares?
Really, honestly, who cares? Why should we care? Why should we act as foot soldiers (unpaid foot soldiers, mind you) for those vendors? It’s like us paying them to bombard us with their advertising.
I must say that I’m at a loss as to why we are, 20 years after these ugly vendor wars started raging, still caught up in the debate. What’s in it for us? Yes, we all understand what’s in it for the vendors. But why should we care about helping them see that their definition, their slogan, wins?
It’s like getting caught up in trying to define what electricity is. The producers of electric power claim that electricity is the turbines that generate it. The producers of copper wires claim that electricity is the wires. The producers of light bulbs claim that electricity is the end point, the bulbs that deliver incandescent light to our homes.
But do we, as consumers, care whose vision wins? No. All we care for is that when we turn the switch on, the room gets filled with light.
Who is Serving Whom?
My understanding has always been that we invented computers to serve us. Now I see that all we’re doing is expending inordinate amounts of time serving them. I don’t see any justification for why this should be so, and I therefore proclaim here that it’s high time we cease and desist, and insist that all this computing infrastructure start serving our needs.
But in order for that to happen, we first must stop obsessing about the computing infrastructure. It’s about time we get over it.
Applications are Infrastructure
Just so that we’re clear on what encompasses the computing infrastructure, I’d like to point out that it’s not only the hardware, the middleware, the operating systems, the databases, and the frameworks. Application software also counts.
We all know how much we tend to obsess about the applications. Many of us find a number of software applications fascinating. Well, snap out of it! Stop serving your favorite applications, and start looking into how to make those applications serve you.
You should start looking at applications as being a kind of anomaly, a necessary evil produced out of dire necessity in the attempt to cover the gaping holes in the existing computing machinery. Applications are superfluous, and will eventually recede into the background. Today, they hold the center stage, and many people obsess about them. But that is bound to change.
Once we grow up and stop obsessing over the infrastructure, we’ll finally be ready to embrace the wonderful world of computing on our own terms. This is similar to how we had to grow up and embrace the wonderful world of printed word. It wasn’t easy, and it took some serious schooling, but eventually we managed to pull it off.
Yes, there are still pockets of illiteracy even among the most developed countries, but these are negligible compared to the overwhelming literacy that is now the cornerstone of our culture.
Aristotle Pagaltzis:
This is an extremely weak article. I could barely make myself read past the first few paragraphs, but I had to in order to respond.
You start with a series of conjectures and a far-fetched connection in your first two paragraphs and try to build an argument from it.
Why should anyone care about your “wonderful world of computing” vision any more than you yourself just said I should care about the big vendor visions?
The computer is a tool. It is nothing more and nothing less, never was, and never will be. The rationale that you should learn C has nothing to do with an obsession with infrastructure – that’s a baseless conjecture on your part. You cannot use a tool effectively if you do not understand it. It’s that simple.
It’s people who try to make more out of it – and that includes you – who spend all their time running after all the latest IT fads like headless chickens.
I have to say… this is the most laughable metaphor I’ve ever seen. But in staying with your metaphor,
You should also start seeing toasters and ovens as a kind of anomaly produced out of dire necessity in the attempt to cover the gaping holes in the existing electric machinery.
Err, what?
Please come back when you have a point.
comment at 12. March 2006
Alex Bunardzic:
Aristotle wrote:
I don’t understand my car. Yet, I have been using it effectively for the past 4 years.
comment at 12. March 2006
Aristotle Pagaltzis:
So what? I don’t know how the circuits in a CPU are laid out, or why, either, and you still spent at least a month or so getting a driver’s permit.
But I am a much better programmer for knowing how to write assembler code well.
comment at 12. March 2006
Alex Bunardzic:
Aristotle Pagaltzis wrote:
Oh, now I understand! Of course you’re right — how could I have been so blind not to see it right away? I clearly recall how I’ve spent almost a month acquiring a permit to operate my microwave oven.
Of course, more complex tools, such as cell phones, come with much more stringent requirements — at least a semester of study and then a gruelling exam.
comment at 12. March 2006
Aristotle Pagaltzis:
Okay, now you’ve made me feel sheepish for taking the bait and engaging in your foolish car analogy. Well done.
Will you be as successful at explaining to me what this wonderful world of computing of yours is supposed to mean?
I tried to track back to some more sensible point from which to argue some actual point, but I’m having a hard time finding anything coherent in your article to grip onto. It all sounds about as meaningful as the vendor slogans you are talking about. You don’t define what infrastructure is, why the things you count as infrastructure meet the criteria to be counted that way, why such infrastructure is not worthwhile to obsess over, or what the “wonderful” and the “computing” parts in “wonderful world of computing” each mean. If I saw this sort of vacuous blather anywhere but on lesscode.org, where it feels like a blemish on Ryan’s excellent work, I would just ignore it. That seemed to work a treat for the vendor slogans, too.
But go ahead, enlighten me.
comment at 12. March 2006
Alex Bunardzic:
Aristotle Pagaltzis wrote:
Sorry to have engaged you in this exercise in radical thinking. It is only natural to expect to precipitate wrath from technologists if you advocate taking technology out of the picture. (Imagine if I posted a manifesto on some accountants’ list insisting that we simplify the accounting system — the accountants and their guilds would most certainly tar me, ridicule and deride me, and then kick me out of town.)
Ryan has just informed me that my opinions on technology are no longer welcome on this list that is dedicated to discussing technology (albeit in diminishing terms). I am indeed an unorthodox thorn in the side, that much I admit, so I will now bow out of the picture. I don’t enjoy raining on anyone’s parade.
The wonderful world of computing I was referring to draws parallels with the wonderful world of the printed word. In the early days, people were dependent on a small group of elite experts who were in the privileged position of being able to read and write. But the situation was untenable, the bottlenecks and backlogs grew unmanageable, and eventually people were forced to learn how to express themselves directly in the new medium.
I’m expecting the same thing to happen to computing. We’re now in the infancy of computing, where huge numbers of people depend on a very small number of elite experts who claim to know how to translate human intentions into machine implementation.
I’m arguing that this situation is untenable (and not only because the above ‘claims’ are questionable at best). It’s simply too costly. We see different people scrambling to overcome this impasse, some attempting to offshore the elitist work to India etc., others clutching at straws with Model Driven Architecture and other pie-in-the-sky code-generating frameworks.
The situation will not get resolved unless we reach a level of expressiveness at which people can express their intentions directly, just as most people today can write a memo by themselves and express their position with relative ease (read: inexpensively) to their targeted audience.
Again, I realize now that it would be practically impossible for you to grasp this idea, simply by virtue of the fact that you’re trapped inside, enamored of the technology. Technologists today remind one of those scholars of earlier periods who kept insisting on speaking in Latin, and looked down on anyone who spoke plainly, directly, and to the point.
Eventually, the Latin speaking crowd died out. Yes, they are still kicking around in some academic circles, but their overall impact on everyday life is drastically diminished.
So, in an attempt to shed some light on your confusion, let me stretch things a bit and say that the infrastructure I was talking about is analogous to Latin. Yes, it is fascinating to study Latin, and one can indeed learn a lot by doing so, but it is absolutely not necessary in order to function in the everyday world. Some of the most expressive and engaging writers never studied Latin.
Still, Latin is, in many ways, the underlying infrastructure of many modern European languages (English included).
The same goes for computing — some of the most expressive and engaging ‘computer conversationalists’ of the future will be people who never knew how to program in assembler.
Now, I’m not saying that knowing how to program in assembler is bad. It’s cool. But it’s not mandatory. Insisting that everyone who is ever going to formulate and express their intentions to the computing machinery be a proficient assembler programmer is akin to saying that anyone who wants to express their thoughts in a memo, email, blog post, or press release must be a full-fledged Latin scholar.
Feel free to ask more questions. I really didn’t want to get deep into this whole debate, for the space/time constraint reasons, but would be happy to elaborate some more.
comment at 13. March 2006
Aristotle Pagaltzis:
Surprise, now that you’re trying to explain your position instead of hand-waving wildly, it actually makes sense, and, despite your talking down to my obviously inferior intellect that is unable to grasp such topics, I mostly agree. I’ve extolled this as a virtue of PHP, which makes PHP valuable despite it mostly being such a huge pile of crap.
Still, Latin is considered a transformative experience by those who learn it today. I don’t consider that a coincidence. And just because everyone knows how to read and write (a big overstatement, even in the industrialised world, but that’s another topic) does not mean that any of them would get a job in journalism or make it as a writer. Likewise, to dip a bit into your car analogy, you pay someone who is trained to understand cars to fix yours, because you don’t understand it. And you need experts who understand cars really well to design and build them. That using a microwave requires zero skills of any sort does not mean that building one does.
So no, I don’t consider my point invalidated. Assembler skill is not going to be obsolete any time soon – in fact, dare I predict, it never will be. It’s going to remain required for those who build the things that others can then use easily. I also dare predict that having “arcane” knowledge will always be more important in computers than it is in other disciplines, for the mundane reason that physical objects are constrained by the rules governing this universe (and our understanding of them!), so their construction can only get so complex. This is in stark contrast with computing, where defining the rules of the universe is what we do, and to arbitrary complexity.
In any case, computing does currently require everyone who wants to use the analogon of a microwave to first be able to build one, and this is definitely not the way things should be. I believe that Alan Kay’s ETech talk from a few days ago had a number of relevant points. Heck, he’s been working on exactly this issue for pretty much all of his life.
But I’m not at all surprised about any of this. Computing is young, very young, and we started with nothing but assembler and no idea of what works and what doesn’t. It takes a while to discover where and how to layer the next abstraction onto the current one, and progressively longer the higher you get. Computing is still bootstrapping itself. On top of that, the commercialisation that the discipline experienced around the ’80s has put the brakes on the process severely – nothing new has happened in 20 years.
So I guess in an extremely roundabout way, I can even agree about the relevance of the vendors to the matter in question.
comment at 13. March 2006
Alex Bunardzic:
I’m sorry, Aristotle, I never implied anything about your ‘inferior intellect’. All I said was that someone who has a vested interest in purveying technology in general cannot be expected to be sympathetic to efforts toward minimizing technology. It’s like certified accountants: they cannot be expected to be sympathetic toward any effort that would minimize their role in society.
But you’ve obviously proven me wrong, which is a testament to the sharpness of your intellect.
I agree with everything you wrote above. Everything is bang on. And it makes me very happy to see that you’re not into proliferating the technology at the expense of human suffering.
The only thing I’d like to add is that the idea, the ‘power to the people’ if you will, that regular guys should attain the ability to write memos to the machines, is what really matters. No one is expecting these memos to be brilliant or anything. It’s like in real life today — I write a short memo to my boss, to my secretary, to my child’s teacher. I’m not expecting it to be framed and put up on the wall as an exemplary specimen of vigorous writing style. It’s just my way of expressing my pressing needs at a given moment in time. I’m never expecting, or hoping, to be a real writer or a real journalist.
These memos are most of the time short-lived, ephemeral. In the future, I see us writing scripts to machines using some form of a very expressive language that is sensitive to the common sense context. Not building anything, not expecting to be professionals or experts in working with computers.
comment at 13. March 2006
Sean Smith:
Interesting discussion. You both seem to be coming at this discussion from opposite sides of the same canyon, but the answer is in the river down below.
I think the car analogy is a good one. Computers and cars are probably the most complex objects ordinary people use on a regular basis. The amount of engineering that goes into building both is far more complex than the average user realizes.
The problem I have is that you both are looking at computer use from a programming perspective and it is this view that causes your arguments to fail for me.
A car essentially has three inputs you need in order to use it: gas, brake, wheel. However, there is far more to driving a car than just pushing a pedal and turning the wheel. Depth perception, judgment (can I make the light before it turns red?), and awareness (making sure to check blind spots) are all necessary for the safe operation of a vehicle. None of those things requires any “programming” of the vehicle, however. Your average (read: most) car driver doesn’t care how or why the car goes forward when the accelerator is pressed, only that it does.
Likewise, your average computer user doesn’t care how or why their Word document opens, only that it does. We are never going to reach the point where the average computer user is building their own applications, just like we will never have average drivers building their own car engines or transmissions.
The most important qualification for programming languages is not whether they allow your average user to program, but how useful they are for programmers (automobile engineers) to produce the applications (cars) – on whatever platform that may be: web, desktop, database – that the average computer user (driver) will use on a daily basis.
Different languages, from assembly to Ruby and Lisp, will always have their own specialty areas. The real goal should be to develop tools that fit the situations computer users (programmers included) face day in and day out.
Just like vehicles (cars, trucks, semis, SUVs, etc.), platforms, frameworks, and languages will always have different uses and users.
Thanks for the interesting read guys.
comment at 14. March 2006
twifkak:
Said Alex:
And what do you do for a living, sell flowers?
I agree entirely with the idea that “normal folk” should be able to program, and I think you’ll find that many coders feel that way — just Google for “long tail.”
(Sean, if you need something concrete, just look at MS Excel. Plenty of people are writing little mathematical “apps” in Excel using the builtin formula language, obviating what would have been, in the past, a need to commission an application.)
But what does all this have to do with The Infrastructure?
comment at 14. March 2006
Alex Bunardzic:
Sean Smith wrote:
That is one way of looking at things. There are others. The original intention in defining programming languages was to give humans the means of expressing themselves to the machines. The split into ‘average computer users’ and ‘specialized computer programmers’ occurred out of necessity: the existing programming languages weren’t expressive enough. Meaning, some people started specializing and claiming that they could, after four years of intense study, translate the intentions of human users to the underlying computing machinery. These people became known as ‘programmers’, or sometimes ‘hackers’.
But you gotta ask yourself: what would happen if someone were to invent a programming language that is sufficiently expressive so that the machine will “do what I mean, not what I say”?
The main reason we need hackers/programmers today is that machines are notorious for doing exactly what we tell them to do, not what we actually mean. The machines are just too bloody literal-minded. Hackers specialize in breaking down our intentions and re-framing them in such a way that the machines can process the request unambiguously.
But if someone can build a hacker for us, then we can talk to that hacker and let it do the legwork for us. The language that would enable us to talk to the hacker would be more expressive than the language the hacker uses to talk to the machine.
This is similar to having a chauffeur. In the early days of the automobile industry, cars were more complicated to drive than they are today. Well-off people used to have chauffeurs. The user would talk to the chauffeur, expressing his or her intentions (“Jeeves, drive me to the Opera House, will you?”), and the chauffeur would then ‘talk’ to the underlying machinery (i.e. the car) and make it go to the desired destination.
To me, a language like Ruby is the precursor of the future ‘hacker’ — a device that allows me to talk to the underlying machinery through a ‘hacker’, or a ‘chauffeur’, and to express my intentions without worrying too much about whether those intentions will be interpreted too literally. Even today, Ruby can do that to a certain extent, because it is context-sensitive and has a certain level of common sense built into its fundamental bias.
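A minimal Ruby sketch of the kind of expressiveness being claimed here (the appointment data and every name in it are made up purely for illustration):

```ruby
# A hypothetical "memo to the machine": a list of appointments,
# and a request to be reminded of the evening ones.
appointments = [
  { what: "Opera",   at: "19:30" },
  { what: "Dentist", at: "09:00" },
]

# The line below states the intent ("pick the entries after 18:00")
# rather than spelling out loops, counters, and index bookkeeping.
# Zero-padded "HH:MM" strings happen to sort correctly as plain strings.
evening = appointments.select { |a| a[:at] > "18:00" }

evening.each { |a| puts "Remember: #{a[:what]} at #{a[:at]}" }
```

Whether that amounts to built-in ‘common sense’ is debatable, but the `select` line reads almost like the sentence one would use to describe it.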
Yes, there are general purpose butlers, and there are specialized butlers. A chauffeur is a specialized butler. Like, you wouldn’t expect him to iron your pants and shirts.
But the underlying constant to all these uses must be expressiveness. We must reach the level of expressiveness where our intentions get interpreted correctly.
comment at 14. March 2006
Alex Bunardzic:
twifkak wrote:
If I may use an analogy from the world of theater, I started my career building props. Then I moved on to building costumes, doing makeup for the actors, etc. I occasionally acted as a prompter as well.
But now I want to move to write and direct plays.
In technical terms, I used to build the infrastructure on which the programs run. I was also involved in building frameworks, etc. All these things allowed the scripts to run.
But now I’d like to focus on writing the scripts. I want to move more into producing the content, not the props.
That’s a good example. Although I wouldn’t call them ‘apps’. These are just scripts.
Almost all software development effort today goes toward building/maintaining the infrastructure. Very little goes toward delivering the content. I think it’s the wrong focus.
comment at 14. March 2006
Allegra » Blog Archive » Superstition Is Everywhere:
[…] In reply to a ferocious flame, Alex Bunardzic said I don’t understand my car. Yet, I have been using it effectively for the past 4 years. […]
pingback at 17. March 2006