
Why the 'Java Stigma' Part 3


(Continuing parts one and two of the 'java stigmata' series, as Chris so aptly defined it :-)).

Okay, going on with my rants on why Java is considered inferior, and why, in some cases, the way many applications are actually developed matches this misperception.

The topic of the day is User Interfaces, but more from a high-level, design perspective, rather than looking at them from the implementation point of view as the first entry did.

To begin with, I think there is a major lack of training in UIs on the part of software engineers, and it's not our fault: there is little in the "environment" to push us in that direction. UIs are messy and difficult to get right. You need design abilities and training, which don't necessarily come with coding skills. And in computer science/engineering, UIs have never been given their due.

All of this starts (as with the data structure migration problem) at the universities and continues later at the workplace. User interface design is widely considered to fall under other types of design, such as industrial design or graphic design. As such, there are few (if any) offerings in this area for computer scientists/engineers. In a CS/Eng degree you will spend hundreds of hours working on data structures, databases, and algorithms, while UIs will, with luck, be covered by a single course, a few dozen hours in total. The problem, of course, is that UIs are not deterministic: you can't pre-define how a good UI works or what it looks like. The only way to become a good UI designer is through training, trial and error, and the development of an "instinct" for what is good and what is not. It takes time. And more time. There are no debugging tools for UIs, which makes it incredibly hard to quantify whether something is "working" or not.

Typical design processes up to this point (excluding recent processes such as eXtreme Programming) have paid little attention to the UI side of any software development project. This extends the lack of care for UIs from university into the work environment.

The end result of this procrastination is that the UI of a program is usually treated as a function of the underlying data and algorithms, when it's actually the reverse. It's common to start designing a program by saying "I will use such and such a database or data format for this." This, IMO, is wrong. The application, if it's to be usable, should be designed from the top down, not the bottom up. After all, the UI is all the user will ever see. Who cares how the data is stored, or what sorting algorithm you are using, if the application looks like crap? (True, the underlying code should be optimized, etc., but my point is that code can always be adapted to different UIs, while the reverse is not true.) The only thing that matters is the user interface. It's what the user deals with every day, it's where performance problems are found, and it's where imagined "bottlenecks" are proven to be non-existent.

For example, in Java/Swing, tons of components are created on the fly as dialogs are opened. This doesn't hurt performance, though, because they are isolated operations; they would only matter if they were performed massively. If a menu takes 1/10 of a second to be created, no one will notice, since that time is negligible compared to the time it takes the user to trigger the operation in the first place. Looked at from a purely algorithmic point of view, that same number would seem completely outrageous (I mean, says the Java skeptic, 1/10 of a second to create a menu item! Are you insane?!); in reality it is a non-issue.
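To make that concrete, here's a quick, rough sketch (the class name, the item count, and whatever number it prints are mine, purely for illustration) that times building a Swing menu on the fly:

    import javax.swing.JMenu;
    import javax.swing.JMenuItem;
    import javax.swing.SwingUtilities;

    public class MenuTiming {
        public static void main(String[] args) {
            // Swing components should be created on the event dispatch thread.
            SwingUtilities.invokeLater(new Runnable() {
                public void run() {
                    long start = System.currentTimeMillis();
                    // Build a menu "on the fly", the way a dialog would.
                    JMenu menu = new JMenu("File");
                    for (int i = 0; i < 20; i++) {
                        menu.add(new JMenuItem("Item " + i));
                    }
                    long elapsed = System.currentTimeMillis() - start;
                    // A handful of milliseconds: outrageous as an algorithmic
                    // cost, invisible next to the user's own reaction time.
                    System.out.println("Menu built in " + elapsed + " ms");
                    System.exit(0);
                }
            });
        }
    }

The exact figure will vary by machine and VM; the point is the scale. Measured against a human actually opening the menu, on-the-fly creation is effectively free.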

This is why HTML can work at all: the time it takes to parse what is effectively a user interface (i.e., the webpage) is irrelevant compared to the average time it takes to obtain the page from the server. How the web evolved also reinforces my point. From the beginning, web page designers were actually graphic designers, which is why web pages have been "usable" pretty much from the start.

So, how does this affect Java?

In my opinion, Java is affected by this because it's the first widely deployed platform/environment (besides the Web) in two decades that is truly new (note that I say widely deployed; there have been many other cool environments in the meantime, but they never really caught on). This means that the user interface of most programs has to be created from scratch, while the Windows crowd happily puts together components that already exist. Besides, Java has to deal with multi-platform issues, a problem that simply doesn't exist on Windows. Since Java implies so much UI development from scratch, whatever inadequacies exist in the CS/Eng profession as a whole appear to be multiplied many times over, compounded by the fact that there are no really good UI design tools integrated into the major development environments (with JBuilder as the one probable exception).
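As a small illustration of what "multi-platform" means in practice, here's a minimal sketch (the class name is mine, for illustration only) of the extra step a cross-platform Swing app takes: asking for the native look and feel of whatever OS it happens to land on. A Windows-only app never has to think about this:

    import javax.swing.JButton;
    import javax.swing.JFrame;
    import javax.swing.UIManager;

    public class NativeLookAndFeel {
        public static void main(String[] args) {
            try {
                // Ask Swing to mimic the native widgets of the host platform.
                UIManager.setLookAndFeel(
                    UIManager.getSystemLookAndFeelClassName());
            } catch (Exception e) {
                // If this fails, Swing falls back to its own cross-platform
                // (Metal) look, so the app still runs; it just looks "Java".
            }
            JFrame frame = new JFrame("Hello");
            frame.getContentPane().add(new JButton("OK"));
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        }
    }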

However, I think there is light at the end of the tunnel. We are getting better at doing this stuff. We've had a steep learning curve since the first public alpha of Java (way back in 1995. Jeez. Almost eight years. Has it been that long?), but Java applications are now beginning to look as good as their native counterparts, in many cases improving on them. New tools are coming out, with better support, and, more importantly, new processes are being used that improve UIs because they involve the user in development from the start (as in eXtreme Programming).

As time passes, the perception that Java is not good for client apps will disappear, and the simplicity and inherent "cleanliness" of Java development will allow Java client apps to establish a foothold on the desktop, hopefully creating cool innovations along the way.

Categories: technology
Posted by diego on January 5 2003 at 3:55 PM
