Web-based applications are all the rage. That’s great news for application
developers, who are making a tidy living off of developing those fancy
browser-based applications, but where does that leave those developing
traditional client/server rich desktop applications? The situation for those
developers is quickly becoming "evolve or die." The shift to
Web-based applications is well under way, and it is unstoppable.
While traditional desktop applications have proven their worth in
stability, security and maintainability, that is not enough to stem the tide,
and businesses of all sizes are being forced to develop rich applications for
the Web.
For many, Web application development using AJAX,
.NET and other technologies (or
architectures) has meant maintaining two separate code bases, one for the Web
and one for the desktop. Maintaining both typically more than doubles overall
development costs and, in many cases, staffing levels. And that is only a small
part of the problem: keeping two code bases synchronized and development goals
on track usually requires another layer of management and tooling, while
sharply increasing the likelihood of bugs and broken applications.
Of course, the simple answer is to not have two code bases. With a single code
base, most of those problems can be eliminated and costs can be contained.
Regrettably, consolidating into a single code base has been all but
impossible, largely because the desktop and the Web each present developers
with unique challenges, each with its own specific requirements and rules for
building functional applications. That forces a difficult decision on many: should
development efforts be focused on the Web or the desktop?