lesscode.org


Archive for August, 2005

Giant solution in search of a problem

Cat.: Then they fight you..., Rails
15. August 2005

Twenty years ago I got involved in a course of AI (Artificial Intelligence) study. Soon afterwards I learned, much to my dismay, that the field had been somewhat maliciously dubbed a ‘giant solution in search of a problem.’ Much as I tried to convince myself (and others) that AI had some pragmatic potential, I eventually had to abandon my efforts and turn to truly pragmatic solutions (such as the RDBMS, for example).

History tends to repeat itself, and so, not surprisingly, I ran into the same scenario (i.e. got involved with more than one ‘giant solution in search of a problem’, or GSISOP) throughout my meandering career.

The first major bump in the road was working with CASE (Computer Aided Software Engineering) tools. Those beasts were all the rage in the early-to-mid ‘80s, but witness the speed with which CASE turned into a dirty word in the early ‘90s. That utopia had to be abandoned promptly.

The next bump came in the form of ‘screen scrapers’ which were proposed as a GSISOP to bridge the gap between the monolithic mainframe apps and the burgeoning two-tier client/server enablers.

One of the biggest GSISOP bumps I’ve ever experienced was Lotus Notes, later dubbed Domino (does anyone even remember that unruly beast?). That platform was so ridiculously clunky that even the biggest Domino zealots had an extremely hard time explaining what it was that the product actually did.

Fast forward to 2005. What is the GSISOP we as a development community are forced to deal with today? CASE is dead as a doornail; screen scraping and client/server are dead as well; so is Domino. Does that mean the GSISOP is dead too? I wouldn’t bet the farm on that.

Ten years ago I abandoned the world of two- and three-tier client/server technology for the new kid on the block. That kid was then called ‘nomadic code’. Remember the ‘network is the computer’ slogan? Or, for that matter, does anyone here remember the ‘PC killer’ – the NC (the network computer, the glorified dumb terminal that was all the rage back in 1996)?

I remember how in those heady days some of my more idealistic colleagues and I firmly believed that the applications of the future would consist of modules of highly mobile code buzzing around the net, arriving ‘just in time’ to assist with some hairy, highly critical, highly demanding and customized behavior. We furthermore believed that these modules would live in a highly competitive environment, vying for the end-user’s attention and competing to deliver the best, most robust service, thus building their own reputation. Talk about GSISOP!

Yes, we were the Java evangelists. For us, Java was the way of the future.

We had to endure a lot of flak for our beliefs. But we persisted, prevailed, and through Herculean efforts managed to bring Java into the corporate fold. Despite the false, doomsday claims that Java was ‘too slow’, that it ‘wouldn’t scale’, that it was ‘too complex’, today we see that Java is the bread and butter of contemporary corporate computing.

So why am I telling you all this boring stuff? The reason is that today we finally have a new ‘new kid on the block’ – Ruby on Rails. Similar to what Java was ten years ago, RoR today promises a completely new way not only of looking at things, but also of doing things.

And similar to the case with Java ten years ago, many people fail to see the true potential of RoR. Not surprisingly, because, like the Java of ten years ago, RoR is not a GSISOP.

Why am I convinced that RoR is not a GSISOP? First of all, it’s not ‘giant’ (just as Java was an agile, small-footprint solution ten years ago). Secondly, it’s not a ‘solution in search of a problem’, because it deals with a very specific problem at hand – how to implement the principle of Occam’s Razor. And it does so extremely successfully.

So, if Java is not a GSISOP, why do we need another non-GSISOP? Doesn’t the principle of Occam’s Razor (‘entities should not multiply beyond necessity’) go directly against having two similar solutions?

You see, the problem is that Java WAS not a GSISOP, but sadly grew into one over the years. Today, I claim, Java has slowly and perhaps imperceptibly morphed into a GSISOP. One need look no further than the Struts framework, for example, to understand what a GSISOP really is.

As we all know, the Java community is getting mighty agitated over the RoR promise. What’s particularly amusing to me personally is how the same people who had to endure unjustified attacks on Java some eight to ten years ago are now using the same unjustified arguments to attack RoR!

Of course, a number of Java supporters are now rushing out to perform what I like to call ‘putting lipstick on a corpse’, in order to prove that Java is indeed capable of doing anything that RoR can do. These are sadly misplaced efforts, which eerily remind me of the long-gone days of screen scraping. Just as the mainframe folks tried desperately to prove that mainframes could do client/server computing without leaving their comfy OS, Java advocates are now wasting their time trying to prove that ‘Java can be concise too’!

It resembles political PR in which a spokesperson for some dictatorial regime claims that their regime can be democratic too. Yes, there is no doubt that it can, but the real question is: why isn’t it? If Java can be concise too, then why isn’t it concise already? Why do we need Struts? If Java is by its nature concise, why did we end up with such monstrosities as Struts and Jetspeed and Expresso and Tapestry, all of which hearken back to the bad old days of mainframe programming?
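To make the ‘concise’ debate concrete, here is a small, hypothetical Ruby sketch (the data and names are mine, not from any framework): partitioning a set of scores into pass and fail groups takes a couple of expressions, where the Java of that era would have demanded explicit iterators, casts, and type declarations for the same job.

```ruby
# Partition students into pass/fail by score -- two expressions, no boilerplate.
scores = { "alice" => 92, "bob" => 58, "carol" => 75 }
passed, failed = scores.partition { |_, s| s >= 60 }.map { |g| g.map(&:first).sort }

puts "passed: #{passed.join(', ')}"   # alice, carol
puts "failed: #{failed.join(', ')}"   # bob
```

The point is not that this is impossible in Java, but that Ruby makes it the path of least resistance.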

In the future (if the future ever comes for me), I’m planning to discuss the shift in the philosophy of complex system development that RoR brings to the table. I’m planning to focus on the complex-vs-complicated dichotomy in particular.

More Developers, Less Code

Cat.: Then they fight you...
14. August 2005

Why is it that when any institution has a need for software, they immediately start looking for pre-built “solutions” instead of consulting a team of developers? Is it because management feels it will be less expensive to buy a “turnkey” software product than to pay developers to do the job right? Or is it that the higher-ups are looking for a stronger chain of responsibility if something goes wrong? I would assert that the real reason is somewhere in between the two.

Many of us have worked in houses whose primary business is something other than software. Our jobs have ranged from being a one-man (or one-woman) IT department to simply being the person who picks up the phone when a user calls the helpdesk. In any event, if you’ve worked in this environment, it quickly becomes clear that non-technical management seems to be its own worst enemy. I’m not saying that all management is bad, just that when it comes to making decisions about technology, a person’s self-importance can be their undoing. We software developers are guilty of this too. We put on a smile for the managers, when secretly we think to ourselves how much smarter we are. This attitude only pushes the divide further.

With such a sharp divide between IT and management, it’s no surprise that there is internal conflict when it comes to making IT decisions. I’m going to describe a scenario that took place at a place where I previously worked: an educational institution with a fairly reasonable IT budget. There are two people on staff in the IT department, both of whom are able programmers, with skills concentrated in dynamic web applications, namely PHP. We’ll call this place “the Institute”.

For years, student grading at the Institute was done with pencil and paper. At the end of every term, each teacher would tally up all his students’ test scores and issue a grade. Once all of a student’s grades were tallied, a small team in the registrar’s office would type up report cards on a typewriter, sign them, and mail them out to the parents. One copy of the report card was kept in a file cabinet, and the only method of access control was a key, of which there were five copies, distributed to the people who needed access to the information. Times were simple, and everything seemed to flow smoothly.

The Institute had been buying computers in small lots for quite some time. They were distributed sparsely - in the library, in the academic offices, and in the classrooms of teachers who managed to get themselves some kind of priority. Slowly but surely, an IT sprawl was occurring. The Institute saw fit to hire a small IT staff - two people, both of them young men who had a real passion for technology and who liked the idea of growing their own infrastructure from the beginning. Eventually, the Institute had computers in most classrooms, a library full of PCs, and even a computer lab that students could drift in and out of in their spare time. The IT staff had been setting up a Microsoft-centric shop, because that’s what the management wanted. The two of them didn’t complain too much about it, since the job was still fun and Microsoft clients and servers seemed to work pretty well together - despite the occasional hiccup. Nothing a reboot couldn’t fix! There was server space and e-mail for everyone. Again, times were good.

There was one problem that nagged the IT pair, though: grading was still being done with pencil and paper. The higher-ups had started to catch on, too - things could be made so much faster if this were all done by computer. By this time, every teacher had his or her own laptop, and some had even taken to doing their grades in Excel - which sped up but didn’t unify the process. With this problem at hand, the IT staff started drawing up plans for a grading system. It would be a simple web application where teachers could enter test scores incrementally throughout the semester, grades would be tallied, and report cards could be printed. It could simplify the whole report-card process to a single batch job. They estimated it would take six months to develop and test, with another couple of months for small modifications and training. They could write it in their spare time, considering most of their day consisted of reading Slashdot between support calls. The best part of the whole deal, though, was that it would all be free. The entire framework for the program would run on open source software, and since management already paid the IT staff a salary, there would be no overhead for development.
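The post doesn’t show the staff’s actual design (their skills were in PHP), but the core tallying step of such a system is small enough to sketch. Here is a hypothetical illustration in Ruby; the method name and the grade cutoffs are my assumptions, not the Institute’s actual rules:

```ruby
# Hypothetical grade-tallying step: average a student's test scores
# and map the average to a letter grade (cutoffs are assumed).
def letter_grade(test_scores)
  average = test_scores.sum.to_f / test_scores.size
  if    average >= 90 then "A"
  elsif average >= 80 then "B"
  elsif average >= 70 then "C"
  elsif average >= 60 then "D"
  else                     "F"
  end
end

puts letter_grade([88, 93, 97])  # A
```

A semester’s report cards would then be one batch job: run this over every student’s scores and feed the results to a print template.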

The management, however, was less than receptive to this idea. When it went down, who was responsible? Who could they call for support? If it took six months to develop, wouldn’t they be up and running faster if they just bought a “turnkey solution”? IT pleaded with the management to be more reasonable. If it went down, IT would support it - that, after all, was their job. A six-month development time was a fine wait if it meant getting a good piece of software. The managers, though, decided to press on. They settled on a piece of software that had first come as a free sample in the mail. It cost $100 per workstation. IT, suspicious of any software that arrives in the mail unsolicited, reluctantly installed it on all of the faculty laptops. At first the faculty was excited about this software, but then the questions started to roll in. Why can’t I import all my old grades from Excel? Why does it crash when I do X, Y, and Z? With the animosity growing between IT and management, the IT people were not as responsive as they should have been when it came to answering user support calls. They were upset that management had gone behind their backs, and wanted to exact some punishment. Management, on the other hand, was upset that their shiny new $5,000 investment wasn’t the silver bullet that ended all their problems.

After a few heated meetings between IT and management, it was finally agreed that the Institute would look for a new electronic grading system. The highest authority at the Institute, a well-educated fellow named John, had recently come back from a demo of a software product that would supposedly give the Institute electronic grading in one fell swoop. It was a custom-designed program that ran on Filemaker, the pseudo-database. John had loved it, and this was what the Institute was going to use. It was not open for debate. $25,000 later, the Institute had a Filemaker system. The sales force had convinced the Institute to buy not only the software but a brand new dual-processor server as well. IT gritted their teeth as the new system came online for the first time.

Of course, the Filemaker system wasn’t compatible with the previous program, so teachers once again had to re-enter their grades. Another problem that IT foresaw but was unable to do anything about was access control. In days past, there had been five copies of the key to the file cabinet where all the grades were kept. Management, used to this concept, was very receptive to the idea of a single password for the whole Filemaker database. The IT staff knew it was only a matter of time before this password got out to the students, and sure enough, they were right. It took a month for students to start changing their grades.

Amidst this crisis, IT pleaded with management again. John, however, wouldn’t budge. He had made the decision, and it was final: Filemaker was it, and it was IT’s job to fix all of the problems. The teachers had lost their trust in the software and started doing their grades in Excel again. The administrators, knowing that the students had gotten access to the database in the past, didn’t trust the software either. At the end of every semester, the teachers would e-mail their grades to the registrars, who would combine each student’s grades into a report card written in Word. They would print two copies of this report card: one to mail to the parents, and another to put in the five-key file cabinet. After this process was complete, they would enter the student’s grades into Filemaker, but only for “archival” purposes. And so it remains to this day at the Institute.

The moral of this story is twofold: IT and management need to cooperate on a new level, and for internal software, roll-your-own is almost always superior to shrinkwrapped. Internally developed software is better suited to the problem than anything you can buy, and it humanizes IT more than purchased software does. Knowing that the software was made for the company, by the company, gives users a much better attitude about using it - as long as it is done well. All it takes is for IT and management to trust each other; something that, I think, will take a long time to happen.

It’s a very powerful tool.

Cat.: Theory
09. August 2005

I’m a sucker for a gushy Tim Berners-Lee interview:

ML: And you’ve never had a sleepless night over that?

TBL: No I haven’t. I haven’t had a sleepless night over it because I suppose I’m so much more surrounded by the good things that people are doing with it. There are lots of positive stories of people doing great things, putting educational information out there for people in developing countries and things, for example. There’s a huge spirit of goodness. Most of the people I meet who are developing the web are focused on all those things.

Looking for Contributors

Cat.: Wanted
09. August 2005

As a follow up to my previous post on what I hope lesscode.org might become, I’d like to solicit the names of individuals you feel might be conned into posting here. Go ahead and drop names in the comments and I’ll heckle them. You can contact me by email as well.

Also, I’ve toyed with the idea of allowing open submissions but I don’t like the idea of playing editor. If anyone has ideas for taking open submissions and staying sane, please leave suggestions/experiences.

lesscode.org is…

Cat.: First they ignore you...
09. August 2005

You may have noticed posts by a few really really smart individuals here over the past weeks. I feel like I should explain what’s going on and attempt a basic description of what I hope lesscode.org might become. Keep in mind that this is all very much a pipe-dream at this point.

About a month ago I sent an invitation to a handful of people that I feel have an excellent grasp on the history and potential of the web and IT in general and invited them to post here (I don’t feel comfortable listing names because I don’t want to get everyone’s hopes up). The list was small but the group covered an insane amount of expertise in technologies like PHP, Python, Perl, Ruby, Rails, Apache, HTTP, Web Architecture, XML, AJAX, Microformats, Syndication formats, and much more.

There are a few common threads that I felt ran through the group:

  1. Each person has demonstrated a strong understanding of the web and the tools/principles that underlie it.

  2. Each had excellent writing skills and a passion for communicating.

  3. Each had indicated publicly, at one time or another, that they were mildly pissed off at some aspect of the direction being taken by business IT and/or mass media in their attempts to bring more of their operations to the web.

All in all though, there were more things the group did not have in common, such as background, specific technologies, F/OSS vs. Corporate, etc.

I received a truly inspiring amount of support and interest in response, but the situation is such that many of these individuals are extremely busy and are already maintaining multiple weblogs. There’s also the small issue of whether this makes any sense. Here we have a web pundit known to advocate the distributed and connection-forming aspects of the web asking others who feel the same way to discard all that and post to one place (why not just throw up a Planet?). It doesn’t make a whole lot of sense on the surface.

The premise is simple: for a certain large aspect of the IT community, we’re too spread out to be heard. Many of the people we want to reach aren’t using the tools and processes necessary (e.g. aggregators, del.icio.us, technorati, mailing lists, etc.) to track discussion and this makes it hard to see that we have a really large overall community collaborating across hundreds of smaller technologies and companies. There’s plenty of content at the lower levels but very little oriented toward the bigger picture. The result is a perception that there isn’t a bigger picture.

One way of overcoming this is to provide a place where individuals can come out of their silos and speak from the position of a larger overall organization, which is exactly what I hope to provide. So here it is: lesscode.org is a loose federation of concerned hackers for web preservation and advocacy. Or, at least, that’s what I hope it will someday become :)

One of the problems I think we’re having is that I gave no guidelines as to what topics should be covered. I did that on purpose because I don’t want to provide a certain type of content, I want to provide a certain type of stage. lesscode.org will be oriented toward whatever topics its posters deem important. This comment from Bill after a post made me realize just how poorly I’ve explained all this:

Ryan, You have no idea. I feel like I just burgled the place!

It’s your place, man! You can’t burgle it.