lesscode.org


Archive for July, 2005

The Dark Horse

Cat.: Languages, Then they laugh at you...
21. July 2005

Chris Wine left this as a comment and I thought it deserved a page of its own. — Ryan

In a recent meeting with a BEA developer, I was told how a small non-profit recently converted their website from ASP to PHP. “Why?” I immediately asked. Because the resources for the implementation were less expensive, and there were thousands of CMS, blog, and wiki applications to choose from, rather than just a few in the Microsoft environment. Application availability used to be the Achilles heel of platforms that competed with Microsoft. I remember UNIX vendors not having “enough” applications running on their OS, and paying big bucks to ISVs to port products to their platforms. Now these scripting languages with zero marketing budgets have turned the tables on Microsoft. I am impressed.

Scripting languages are winning developer mindshare, because recent grads know and like these languages. They are easy and inexpensive to deploy, and you get results fast. While some say they are not scalable, developments at Google and Yahoo say otherwise. For those that still want a more proven solution for large deployments, Java has its place in companies that have requirements for thousands of concurrent users. That said, there are many more companies that do not have such requirements. And as Joel Spolsky says, “It is easier to start small and cheap and scale up, than to start big and expensive and scale down.”

I ran some quick searches to determine what people in the technology universe are talking about. While my survey is by no means a scientific random sample, I still believe it has some merit. Also, I used less-than-ideal search parameters for the scripting languages, as some of them are common words (Python and Ruby). The merit is not in the absolute values, but in the ratios. I believe that the number of relevant posts on these scripting languages is 300-1000% higher than what I show below.

Google Suggest (as of July 22nd)

J2EE Results

  • JBOSS J2EE 518,000 results
  • BEA J2EE 999,000 results
  • Oracle J2EE 1,920,000 results
  • IBM J2EE 1,950,000 results
  • Sun J2EE 3,690,000 results

Scripting Results

  • PHP Software 30,000,000 results
  • Perl Software 14,100,000 results
  • Python Software 8,850,000 results
  • Ruby Software 4,780,000 results
  • Jython Software 0 results

Dot Net Results

  • .Net Software 32,300,000 results

Totals

Total .NET results: 32,300,000
Total J2EE results: 8,610,800
Total Scripting results: 23,458,000

Technorati

.Net Posts

  • .Net Software 55,046 posts

J2EE Posts

  • BEA J2EE 687 posts
  • BEA Portal 415 posts
  • JBOSS 991 posts
  • JBOSS Portal 215 posts
  • Oracle J2EE 1,534 posts
  • Oracle Portal 1,221 posts
  • IBM J2EE 1,456 posts
  • IBM Portal 1,936 posts
  • Sun J2EE 1,791 posts
  • Sun Portal 4,376 posts

Scripting Software Posts

  • PHP software 15,829 posts
  • Perl software 6,828 posts
  • Python software 4,704 posts
  • Ruby software 2,037 posts
  • Jython software 143 posts

Totals

Total .Net Posts - 55,046
Total J2EE Posts - 14,622
Total Scripting Posts - 29,398

Conclusions

The revolution is here. It may not be televised (yet!) but it is being blogged, and otherwise documented. BEA may or may not get bought. I don’t know, and now that I am wrapping up this post, I think my original question is uninteresting. The more interesting question is “Will the scripting languages assume the role of Linux in the mid-nineties?” Can scripting languages be a mindshare leader in 10 years, with growth outpacing all of the J2EE vendors and .NET combined? In 2015, will we talk primarily about .NET and some scripting language(s)?

Does anyone even doubt this hypothesis?

End of rant.

CW

Motherhood and Apple Pie

Cat.: Then they laugh at you...
21. July 2005

The internet is not an accident. The internet was not bound to happen. There was no guarantee that the internet would reach its current state as a side effect of emerging digital processing and communications capabilities. We did not recover complex alien technology.

The internet, that place where eventually all business will be transacted, all content and media will be distributed, all correspondence will be exchanged, all history will be recorded, and all pornography will be - is being - admired, has a design, and it’s meant for exactly these purposes.

Many of the principles that led to this design are still with us today, although I would challenge you to ascertain them by observing the mainstream technologies being peddled by leading vendors, publications, and analyst firms. Those firms rose to power in a much different environment, one where the short-term profits of disconnected, dead-end business software were deemed more important than laying the fertile ground where millions of new ideas (and hence new profits) could bloom.

But the dead end has long been reached and so these industry leaders have turned their attention to this new place, built on principles and values very different from their own, and have somehow reached the conclusion that this thriving ecosystem must be re-arranged such that they have somewhere to place their baggage. Instead of embracing the people, principles, and technologies that gave rise to this phenomenon they have chosen to subvert its history and to implant the ridiculous notion that it is “incapable of meeting the stringent demands of the business community.”

Not only have these business radicals claimed the internet as their own, but they have also somehow gained the confidence of all the world’s industry in their ability to deliver a new and sparkling internet, one no doubt capable of reproducing the complexities and flaws that plague existing media so as to make it feel more like home. They’ve brought their own principles and agendas, asserting them as obvious and correct while ignoring the wisdom we’ve gained and shared and gained and shared over years of collaborative practice and observation of working systems at this scale.

But business is something for which I’ve acquired admittedly little competence, so I would like to transition now and lay to rest this insulting notion that the tools and methods that dominate the modern web are of lesser quality and modernity than their big-vendor, enterprise-class, industry-accepted counterparts. The arrogant assumption that the people who built this place have been waiting idly by with suboptimal processes, tools, and protocols in the hope that one day the masters of proprietary business software would bless us with their advanced capabilities. The rhetoric that suggests that the tools used to provide the brunt of the value on the internet are somehow expired, inelegant, or lacking in technical merit.

In an attempt to bring some semblance of reality to the conversation, I would like to present to you, Tim Berners-Lee’s Axioms of Web Architecture, otherwise known as “Motherhood and Apple Pie”. First recorded in one place by Tim in 1998, these principles had been around and were well known for many years before. Some trace back to early computing and some predate computing and were taken from such practices as mathematics.

These are the principles of design that have brought us where we are today and you can observe them working as designed in protocols and formats such as HTTP, URIs, MIME, HTML, and even XML (sometimes), and architectures such as REST. And you can observe them working in systems that facilitate the internet - tools such as Apache httpd, PHP, Perl, Python, C, UNIX, and newcomers such as Linux, PostgreSQL, Ruby, etc.

I present these principles now as evidence that we are quite aware what it is we’re doing and that these tools and protocols are the way they are for a reason. We realize that they are in many ways quite different from their analogs in the old-world of narrowly distributed business software but we ask that you please consider their design in the context of the ecosystem where they flourish and ponder whether this might not be coincidence.

The principles of design that have shaped the web and the tools that underlie it (again, from Axioms of Web Architecture):

  1. Simplicity
  2. Modular Design
  3. Tolerance
  4. Decentralization
  5. Test of Independent Invention
  6. Principle of Least Power

Simplicity

We begin our look at the design principles of the web with the most important aspect of design in any system - simplicity. Note that the best way of mixing the basic aspects of design (simplicity, consistency, correctness, and completeness) is a topic that’s been debated for decades. However, all major styles of design agree that “simplicity” ranks first.

From Berners-Lee’s Axioms:

A language which uses fewer basic elements to achieve the same power is simpler.

Sometimes simplicity is confused with “easy to understand”. For example, a two-line solution which uses recursion is pretty simple, even though some people might find it easier to work through a 10-line solution which avoids recursion.
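
To make the recursion point concrete, here is a minimal sketch in Python (the function names and the list-summing task are my own illustration, not part of the axioms): a two-line recursive version next to a longer one that avoids recursion.

# Two lines, using recursion: fewer parts, arguably simpler.
def total(numbers):
    return 0 if not numbers else numbers[0] + total(numbers[1:])

# Roughly ten lines that avoid recursion: easier for some readers to trace,
# but with more moving parts (an index, a running sum, an explicit loop).
def total_iterative(numbers):
    result = 0
    index = 0
    while index < len(numbers):
        result = result + numbers[index]
        index = index + 1
    return result

print(total([1, 2, 3, 4]))            # prints 10
print(total_iterative([1, 2, 3, 4]))  # prints 10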

Simplicity is sometimes defined as a measure of the number of parts required to make a thing work while retaining clarity. If one design requires many parts while another requires few, the latter is said to have a greater level of simplicity.

As an example, which of the following equivalent operations is more desirable?

2 ** 5

or…

2 * 2 * 2 * 2 * 2

While the first solution requires a slightly higher level of understanding, it is indeed the simpler of the two because it captures intent more clearly using fewer parts.

Note that “simplicity” is not a synonym for “a hack” or “quick-and-dirty”. Neither is it equivalent to “dumbed-down”. Another example might illustrate these differences:

One way to print a simple message …

public class WeReallyLikeClasses
{
   public static void main(String[] args)
   {
      System.out.println("hello world");
   }
}

and another way…

print 'hello world'

The former provides very little advantage over the latter while requiring more elements. The second example is said to be the simpler and thus is highly desirable. This is an admittedly simple case, which is kind of the point, but this might be a better example.

So I would like to impress upon you that the languages that dominate the web are not the way they are because we lack the ability to build more complex, sophisticated, flashy tools and languages; they are that way because we assume extra complexity is unnecessary until proven required, and we see very little evidence to support the inclusion of the complexities that have been introduced into “enterprise class” software over the past decade.

Modular Design

The web’s core architecture and many of the tools that facilitate its operation are extremely modular in design. In fact, in my recent writing about the “LAMP stack” I’ve grown an aversion to the term, because what I really wish to convey is not simply the base Linux, Apache, MySQL, and PHP components but nearly 20 very specific and modular pieces that can be combined in various ways to craft a targeted overall solution to a given problem domain. What about the BSD, lighttpd, PostgreSQL, Ruby configurations? The term LAMP should not exclude them — you can mix any one of the LAMP components into this configuration due to the high level of modularity of each piece.
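
To ground that claim in something concrete, here is a hedged sketch (it assumes the MySQLdb and psycopg2 driver modules are installed, and the "users" table is hypothetical): because both drivers implement Python's common DB-API, swapping the M in LAMP for PostgreSQL touches only the connection line.

# A sketch of swapping the database component: both drivers follow the
# Python DB-API, so everything below the connection block is unchanged.
USE_POSTGRES = False

if USE_POSTGRES:
    import psycopg2
    conn = psycopg2.connect("dbname=site user=web")
else:
    import MySQLdb
    conn = MySQLdb.connect(db="site", user="web")

cursor = conn.cursor()
cursor.execute("SELECT name FROM users")   # hypothetical table, for illustration
for (name,) in cursor.fetchall():
    print(name)
conn.close()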

Again we quote TBL:

When you design a system, or a language, then if the features can be broken into relatively loosely bound groups of relatively closely bound features, then that division is a good thing to be made a part of the design. This is just good engineering. It means that when you want to change the system, you can with luck in the future change only one part, which will only require you to understand (and test) that part. This will allow other people to independently change other parts at the same time. This is just classic good software design and books have been written about it. The corollary, the TOII (Test Of Independent Invention), is less frequently met.

And so it is with great interest that those who understand the principle of modularity and the benefits it provides in web-like environments watch as industry leaders now make such senseless claims as these:

Having products that are engineered to work together–something open-source competitors cannot do–will ultimately make Microsoft products easier to run and more cost-effective over time, said Paul Flessner, senior vice president of server applications.

As you can infer from the quote, it is not the principle of modularity that drives these components into separate pieces but an inability of “open-source competitors” to make things “work together”.

This is especially disturbing when the company making the claim has demonstrably poor modularity that is widely considered to be one of the most significant deterrents to progress on their platform.

Tolerance

Tolerance is another well-understood and very obvious principle of the web and the tools that support it:

“Be liberal in what you require but conservative in what you do”

This is the expression of a principle which applies pretty well in life, (it is a typical UU tenet), and is commonly employed in design across the Internet.

Write HTML 4.0-strict. Accept HTML-4.0-Transitional (a superset of strict).

This principle can be contentious. When browsers are lax about what they expect, the system works better but it also encourages laxness on the part of web page writers. The principle of tolerance does not blunt the need for a perfectly clear protocol specification which draws a precise distinction between conformance and non-conformance. The principle of tolerance is no excuse for a product which contravenes a standard.
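
To illustrate the tolerance principle outside of HTML, here is a small sketch of my own (not from the axioms): accept dates written in several sloppy ways, but always emit one strict form.

import datetime

# Be liberal in what you accept: try several common date spellings...
ACCEPTED_FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%B %d, %Y"]

def parse_date(text):
    for fmt in ACCEPTED_FORMATS:
        try:
            return datetime.datetime.strptime(text.strip(), fmt)
        except ValueError:
            pass
    raise ValueError("unrecognized date: %r" % text)

# ...but be conservative in what you do: always emit one strict form (ISO 8601).
def format_date(d):
    return d.strftime("%Y-%m-%d")

print(format_date(parse_date(" 21/07/2005 ")))   # 2005-07-21
print(format_date(parse_date("July 21, 2005")))  # 2005-07-21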

Again, adoption of this principle by leading vendors in their attempts to bring the business community to the web is sorely lacking. The complexities and required aspects of vendor-driven specifications and tools guarantee that they will never be capable of doing for the business community what the web has done for the general public.

Decentralization

This is a principle of the design of distributed systems, including societies. It points out that any single common point which is involved in any operation tends to limit the way the system scales, and produce a single point of complete failure.

While AOL and Microsoft’s Passport are perfect examples of how ignoring the web’s basic principles can lead to disaster, I won’t go into it due to severe lack of challenge and therefore motivation.

However, I would like to note that this principle applies not only to technology but also to the structure of business, vendors, and communities. There are far too few vendors providing far too many services to the business community. Many vendors and analysts actually encourage companies to become completely dependent on a single vendor (as in, “we’re an IBM shop”). A better strategy for businesses is to diversify their technology providers between many smaller vendors that each provide tools and services adhering to the basic principles of the web and thus providing a base level of interoperability and freedom.

Those vendors don’t exist in great numbers at present, but they will.

Test of Independent Invention

This has strong ties to the principle of modularity and, again, is observable in most of the pieces we consider part of LAMP/friends.

If someone else had already invented your system, would theirs work with yours?

Does this system have to be the only one of its kind? This simple thought test is described in more detail in “Evolution” in these Design Issues. It is modularity inside-out: designing a system not to be modular in itself, but to be a part of an as-yet unspecified larger system. A critical property here is that the system tries to do one thing well, and leaves other things to other modules. It also has to avoid conceptual or other centralization, as no two modules can claim the need to be the unique center of a larger system.

Principle of Least Power

The Principle of Least Power is perhaps the most visible in the tools and technologies that have gained widespread acceptance on the web, and at the same time the least understood. Languages like HTML, PHP, and RSS are perfect examples of the strange adoption rates “least power” systems enjoy.

I will quote TBL at length because he explains the principle so well:

In choosing computer languages, there are classes of program which range from the plainly descriptive (such as Dublin Core metadata, or the content of most databases, or HTML) through logical languages of limited power (such as access control lists, or conneg content negotiation) which include limited propositional logic, through declarative languages which verge on the Turing Complete (PDF) through those which are in fact Turing Complete though one is led not to use them that way (XSLT, SQL) to those which are unashamedly procedural (Java, C).

The choice of language is a common design choice. The low power end of the scale is typically simpler to design, implement and use, but the high power end of the scale has all the attraction of being an open-ended hook into which anything can be placed: a door to uses bounded only by the imagination of the programmer.

Computer Science in the 1960s to 80s spent a lot of effort making languages which were as powerful as possible. Nowadays we have to appreciate the reasons for picking not the most powerful solution but the least powerful. The reason for this is that the less powerful the language, the more you can do with the data stored in that language. If you write it in a simple declarative form, anyone can write a program to analyze it in many ways. The Semantic Web is an attempt, largely, to map large quantities of existing data onto a common language so that the data can be analyzed in ways never dreamed of by its creators. If, for example, a web page with weather data has RDF describing that data, a user can retrieve it as a table, perhaps average it, plot it, deduce things from it in combination with other information. At the other end of the scale is the weather information portrayed by the cunning Java applet. While this might allow a very cool user interface, it cannot be analyzed at all. The search engine finding the page will have no idea of what the data is or what it is about. Thus the only way to find out what a Java applet means is to set it running in front of a person.

I hope that is a good enough explanation of this principle. There are millions of examples of the choice. I chose HTML not to be a programming language because I wanted different programs to do different things with it: present it differently, extract tables of contents, index it, and so on.
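
To put the weather example in code (again a sketch of mine, with made-up sample data, not something from the axioms): numbers published in a low-power, declarative form such as a small CSV table can be averaged, combined, or re-plotted by any program, while the same numbers locked inside an applet cannot.

import csv
from io import StringIO

# Weather data in a deliberately low-power, declarative form (hypothetical sample).
WEATHER_CSV = """date,high_c
2005-07-18,24
2005-07-19,27
2005-07-20,31
"""

# Any program can re-analyze it in ways the publisher never anticipated.
rows = list(csv.DictReader(StringIO(WEATHER_CSV)))
highs = [float(row["high_c"]) for row in rows]
print("average high: %.1f C" % (sum(highs) / len(highs)))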

I hope re-stating these principles at this point in the evolution of business on the web might help illuminate why leading vendors currently seem incapable of bringing the magic of the web to the business community - they don’t understand its basic principles. And instead of attempting to understand these basic principles so that they may apply them for the benefit of their investors, customers, and industry partners, they are attempting to discredit them.

This should also clear up any questions as to why IBM and other progressive companies are moving to adopt technologies such as Linux and PHP, why Google and Yahoo are so successful, and why many of us feel Python and Ruby have an edge on more traditional languages designed for the slowly deteriorating world of unconnected business software.

The good news in all this is that there is a resurgence of interest in the web’s basic principles that is somewhat oriented toward the business community. I believe this renewal of interest to be the result of increased communication through weblogs and other web-friendly collaboration tools, combined with massive adoption of free and open source development methods. The bad news is that a huge portion of the software industry isn’t involved and is in many ways blocking progress, using techniques that are hard to describe as anything other than dishonest.

(I reserve the right to make changes to this document after sleeping a bit. Please log suggestions and corrections using the comments section below.)

ActiveGrid First Impressions..

Cat.: LAMP
20. July 2005

I finally had a few minutes to look at the technical material available for ActiveGrid’s lineup. More on that later… Right now I’m just relieved to find, after wincing through the initial big boring corporate look of the site, that I think they get it :)

Update: I wrote Peter to register my approval and he had this to say:

Ha, you found our “alternative view”! Glad you appreciate it. The funny story behind that is that we didn’t tell the marketing guy we did it, and then had a pool as to how long it would take him to figure it out. The farthest out was our office manager at 45 days, and she won because he never figured it out until day 45 when we told him!!!

Scoble on Rails

Cat.: First they ignore you..
18. July 2005

David Heinemeier Hansson caught Scoble lusting after Rails and Django. I’ve been interested in understanding whether Rails is having the same impact on the .NET crowd as it is having on the Java crowd. If you’re coming from a .NET background, weigh in on Scoble’s comment thread.

The .NET environment seems more locked down than Java’s, which makes it harder for developers to wander outside of the vendor product line. Most people coming from Java have good experience with some form of Linux/Unix and associated tools, so the LAMP/friends environment and conventions feel a bit more natural. Perhaps Udell’s advocacy of WAMP can put a chink in the armor? I wonder how hard it would be to slap Ruby, Rails, and Ruby Gems or Django and Python into the WAMP click-and-install packaging. This would probably go a long way in getting people to take a look.

Anyway, this comes only a month or so after reports of a new Microsoft strategy to “extinguish LAMP”:

Having products that are engineered to work together–something open-source competitors cannot do–will ultimately make Microsoft products easier to run and more cost-effective over time, said Paul Flessner, senior vice president of server applications.

There you have it folks - we’re incapable of building “products” that work together. Being on Microsoft’s radar means that we’re going to have to be even more diligent in refuting baseless claims about our tools and processes. Luckily, they don’t seem to understand our core philosophy and, like most, see our approaches as quick-and-dirty rather than simple and elegant.

I have a feeling this is going to heat up over the next few months.

We already have a VM…

Cat.: Then they laugh at you...
17. July 2005

There is an emerging notion that there are really three cleanly separated platforms that will be contending for the enterprise over the next few years: Java, .NET, and LAMP/friends. Current mainstream thinking suggests that LAMP isn’t really a contender but that dynamic languages at least have a chance of riding in on top of the JVM or CLR.

But I’d like to consider some of Sam’s findings in working with Parrot in the context of dynamic languages on the JVM or CLR. Specifically, that using a common VM for multiple languages is hard; and keep in mind that Sam is hacking on a VM that is designed for dynamic languages, whereas the mainstream VMs are not.

As I was saying, initial thinking suggested that one of Jython, JRuby, IronPython, etc. would be the horse that dynamic languages would ride into the enterprise but I’ve grown away from this notion for a few reasons:

  • Rails is gaining acceptance with hardcore Java heads under the C Ruby implementation, not the Java Ruby implementation.

  • The LAMP stack is gaining acceptance at a rapid pace and in my experience, adding Java/.NET to a LAMP setup results in an impedance mismatch both in tool simplicity and quality of license / community.

  • The misconception that dynamic languages are useful only for “scripting” type tasks (e.g. gluing static language code together in controlled environments) is fading fast. It’s hard to find someone who will argue that Python or Ruby are incapable of being used as the primary language on large projects, whereas such people were everywhere as recently as a year ago.

  • Dynamic Language to Shitty Language bridge libraries give you the best of both worlds: a portable and efficient implementation of the language interpreter plus the mass of libraries that were developed under legacy runtimes (yes, I’m trolling now) like the JVM and CLR. A sketch of one such bridge follows this list.
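
As promised above, here is a minimal sketch of what such a bridge can look like. It assumes the JPype Python-to-JVM bridge is installed and can locate a JVM; the JDK class used here is only for illustration.

import jpype

# Start an embedded JVM so plain CPython code can call into existing Java libraries.
# (Error handling and classpath configuration are omitted in this sketch.)
jpype.startJVM(jpype.getDefaultJVMPath())

# Wrap a stock JDK class and use it as if it were a Python object.
ArrayList = jpype.JClass("java.util.ArrayList")
names = ArrayList()
names.add("perl")
names.add("python")
names.add("ruby")
print(names.size())  # prints 3

jpype.shutdownJVM()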

I’d much prefer to be working on the C implementations of these languages as opposed to the mainstream VM-based ones. The JVM/CLR-based implementations have problems: C extension libraries (are they even possible?), a lag in features from their mainline counterparts, less community involvement, and typically higher resource requirements than the C interpreters.

Further, Java and .NET have licensing issues that create obstacles to adoption. The inability of Linux/BSD distributions to include the vendor implementations is a problem that neither vendor seems to be willing to budge on. The Mono and GCJ/GNU Classpath projects are moving in the right direction, no doubt, but it bothers me that in the end our best case is a situation where we’re running great languages on top of VMs designed for poor languages and have no participation from their original creators. That doesn’t really give me the warm and fuzzy about the future of dynamic languages…

Note that Parrot is a very different situation from the mainstream VMs. It’s being designed for dynamic languages, is licensed for the community, and probably won’t have the same issues with wrapping C extension libraries.

Getting to the point, I’d like to challenge the assumption that running dynamic languages on mainstream VMs is optimal / desirable in the long term. The existing interpreters have thriving communities, years of proven use, and a slew of C/C++ extension wrappers to even more proven libraries. They are already cross-platform and there’s bridge code for accessing the vast set of functionality that has been implemented in Java or C#.

The real question for me becomes, when will LAMP/friends gain legitimacy as an “enterprise class” platform and how quickly will that force the industry to re-evaluate optimal deployment models?