Taking Client Server Computing to the Next Level
Industry Analysis
Client Server Computing: An Inexorable Trend
Global competition has forced companies to improve their responsiveness to customers or perish in the process. Traditional, hierarchical modes of management are increasingly inappropriate, as decisions must be made quickly and decisively by workers further down the chain of command, many of whom seek more contemporary tools to counter the competition. Information technology has played a key role in this shift. Within the corporate enterprise, centralized mainframe computers have been superseded by lower-cost personal computers, shifting power away from the control room and onto the desktop. Often referred to as client server computing, this paradigm has spawned a new generation of software applications that exploit low-cost, distributed hardware and newer, more visual and intuitive methods of computing. No longer the exclusive purview of the corporate MIS director, critical data regarding sales, finance, manufacturing planning and customer service can now be shared across an entire company, from the front line to top management.
The Forrester Group, an industry market research firm, claims that 60% of the Fortune 1000 have shifted toward client server development, up from 30% in 1990. Several key trends, in our judgment, should continue to accelerate the adoption of client server computing over the next several years:
Obstacles To Further Adoption
Despite its promise of greater productivity for the enterprise, first generation implementations of client server computing have often failed to meet expectations. According to an industry study by the Standish Group, only 16% of software development projects are completed on time and on budget, and only 42% of projects completed by companies with over $500 million in revenue have the desired features and functionality. About 30% of development projects are never completed. Moreover, while client server offers the user community the benefit of working independently of the IS organization, a general feeling persists that levels of security, reliability and performance commensurate with traditional mainframe computing do not yet exist.
Development cost overruns and results that fall short of expectations have become the dark side of First Generation Client Server. While tools to construct client server applications abound, several obstacles hinder the further adoption of client server into the mainstream of corporate computing. To move from the trial phase to general deployment, tools to test, simulate, track and fix software defects must be adopted, as part of a total quality solution, and applied to the mission of building and deploying applications.
The chief challenges involved in moving to the next generation include:
In summary, the complexity of the components involved in the design, build and test process, coupled with the communication obstacles presented by process/workflow and geographical dispersion, increases the probability both of lengthening the process and of introducing errors and defects.
The Case For Quality
The rising number of requests for new applications, in the face of substantial unfulfilled backlogs, has pressured IT departments and software developers to bring products to market at an increasingly rapid pace. This complexity and time-to-market pressure strain the software development process and greatly increase the probability of introducing software bugs, or defects, which can cause product delays, lost revenue and missed market opportunities.
Faulty, defective software can have serious consequences. A poorly implemented manufacturing planning system can cost market share through inefficient use of manufacturing capacity. The problems of the Denver airport's baggage handling system, and the famous "Year 2000" dilemma, in which faulty database design will cost the U.S. government alone an estimated $1 billion, are the best known to the public. Poor project implementation and software product delays, however, are common in client server implementations. Too often quality assurance has been a post-hoc exercise, where quality, to twist the Ford Motor Company slogan, has become Job 1.01.
Moving to the next generation of client server requires a series of software tools to design, build, test, and manage software development. We have defined a universe of second generation client server tools that together will improve the software development process. These tools include products to manage and track application development, detect and report software bugs, and simulate performance prior to product release.
Key objectives include improving software reliability, reducing development costs and shortening time to deployment by eliminating manual testing. Taken together, these products reduce the time to market for new applications and make the length of the software development cycle easier to predict and manage.
The Software Development Process
An overview of the client server application development cycle is useful in understanding the case for software quality. The cycle can be broken down into four discrete tasks: Design, Develop, Integrate and Test. While the tasks are discrete, the process is iterative, as the task of managing, identifying and fixing problems along the way occurs at each of the aforementioned steps.
We offer below an example of how a new sales tracking system might be developed by tracking the various stages of development:
Automated Software Quality: Defining the Category
Historically, software has been developed using a variety of languages and programming techniques. Code changes within development teams were tracked through ad-hoc procedures that failed to capture the intent or methods of the key programmers building or enhancing products. Various companies have developed home-grown systems, or process control tools, to track software bugs and testing needs, but these have proven inadequate.
Just as the software programming task is evolving from black art to profession, so is testing making the transition from manual to automated methods. The need for a new class of products to ensure the integrity of products under development has therefore become compelling. While the need for quality assurance has always been apparent in software development, increasing complexity has heightened awareness of the problems involved in developing and tracking client server applications. One could argue that quality assurance is confined to the testing phase; we would argue that it is required throughout the development process. Because quality is needed at every phase (design, build, deploy and manage), we believe that testing tools address a broader audience than application tools, and that these products will be used in the up-front development process rather than in isolated testing stages.
Another factor supporting our view that the ASQ market provides strong growth potential is the broad set of capabilities that are evolving into a group of discrete products. We list below the principal products comprising the category:
Configuration management/version control. Configuration management tools and version control programs enable organizations to track all aspects of the application development process, including what changed, how it changed, and who will be affected by the change. Configuration management tools address a broad spectrum of team-based developers and are increasingly important, given the large volume of code generated on various projects and the difficulty of keeping strong programmers in house amid rising demand for good software talent.
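The core mechanics, tracking successive revisions and reporting what changed between any two of them, can be illustrated with a minimal sketch. The `VersionStore` class and its method names here are hypothetical, invented for illustration; no vendor's actual product interface is implied.

```python
import difflib

# Minimal sketch of version tracking: store successive revisions of a
# source file and report what changed between any two of them.
# (Illustrative only; the class and method names are hypothetical.)
class VersionStore:
    def __init__(self):
        self.revisions = []  # list of (author, lines) pairs

    def check_in(self, author, text):
        self.revisions.append((author, text.splitlines()))
        return len(self.revisions) - 1  # revision number

    def diff(self, old, new):
        _, old_lines = self.revisions[old]
        _, new_lines = self.revisions[new]
        # Keep only the added/removed lines from a unified diff.
        return [d for d in difflib.unified_diff(old_lines, new_lines, lineterm="")
                if d.startswith(("+", "-"))
                and not d.startswith(("+++", "---"))]

store = VersionStore()
r0 = store.check_in("alice", "def total(xs):\n    return sum(xs)")
r1 = store.check_in("bob",   "def total(xs):\n    return sum(xs) / len(xs)")
print(store.diff(r0, r1))
```

A real configuration management system adds branching, merging and access control on top of this basic change-tracking idea.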
Error detection. These products detect and monitor applications for execution errors and memory leaks, critical to diagnosing applications that crash repeatedly.
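The kind of defect these products catch can be sketched in a few lines: run a routine repeatedly and flag it if heap usage keeps growing between iterations. The `leaky_handler` function below is a contrived example of a leak, not any real product's code; Python's standard `tracemalloc` module stands in for a commercial memory monitor.

```python
import tracemalloc

# Sketch of automated leak detection: run a routine repeatedly and flag it
# if traced heap usage keeps growing. The handler below is deliberately buggy.
_cache = []

def leaky_handler(request):
    _cache.append(bytearray(10_000))  # bug: per-request buffer is never freed
    return len(request)

tracemalloc.start()
baseline, _ = tracemalloc.get_traced_memory()
for _ in range(100):
    leaky_handler("GET /")
current, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth = current - baseline
print(f"heap growth after 100 calls: {growth} bytes")  # roughly 1 MB: likely leak
```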
GUI testing. Given the wholesale move from cryptic character-based applications to those based on the Microsoft Windows GUI, GUI testing is generally recognized as the largest category within ASQ. A number of firms, including Mercury, SQA and Segue, have developed proprietary testing scripts that automate the testing process by simulating whether a particular menu sequence acts as planned. These products dramatically reduce manual regression testing by replaying simulated end-user test cases that identify flaws or bottlenecks.
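The record-and-replay idea behind these scripts can be sketched as follows. The `App` class is a stand-in for a real GUI under test, and the script format is invented for illustration; actual products such as Mercury's use their own proprietary script languages.

```python
# Sketch of script-driven regression testing: a recorded menu sequence is
# replayed against the application and the resulting state is checked.
# The App class is a hypothetical stand-in for a real GUI under test.
class App:
    def __init__(self):
        self.documents, self.status = [], "ready"

    def menu(self, path):  # e.g. "File>New"
        if path == "File>New":
            self.documents.append({})
            self.status = "editing"
        elif path == "File>Save":
            self.status = "saved"
        else:
            raise ValueError(f"unknown menu item: {path}")

# A sequence "recorded" from a manual session, replayed on every build.
recorded_script = ["File>New", "File>Save"]

def replay(script):
    app = App()
    for step in script:
        app.menu(step)
    return app.status

assert replay(recorded_script) == "saved"
print("regression script passed")
```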
Performance testing. By deploying a prototype on a series of desktop computers, or a single server, this class of products simulates how a product will perform under various conditions of load and stress: whether the fifty-first user will fatally "hang" the server, or wreak havoc on the productivity of the first 50 users.
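The principle can be sketched by firing simulated concurrent "users" at a server routine and measuring how the worst response time degrades as the user count grows. The routine below is a contrived bottleneck (a serialized resource), not a model of any particular product.

```python
import threading
import time

# Sketch of load simulation: N concurrent "users" call a server routine
# and we record how long the slowest of them waits.
lock = threading.Lock()

def server_call():
    with lock:               # serialized resource: the classic bottleneck
        time.sleep(0.001)    # pretend each request takes ~1 ms of work

def run_load(n_users):
    results = [0.0] * n_users
    def user(i):
        start = time.perf_counter()
        server_call()
        results[i] = time.perf_counter() - start
    threads = [threading.Thread(target=user, args=(i,)) for i in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return max(results)      # worst-case response time

worst = {n: run_load(n) for n in (5, 50)}
for n, t in sorted(worst.items()):
    print(f"{n:3d} users: worst response {t * 1000:.1f} ms")
```

Because every request queues behind the lock, the fiftieth user waits roughly ten times longer than the fifth, which is exactly the degradation curve these tools are built to expose before deployment.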
Defect tracking. This class of products identifies software defects, or bugs, and assigns a priority and person to resolve the issue.
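A minimal sketch of the idea, a defect record carrying a priority and an owner, with the most urgent open defect served first, might look as follows. The `DefectTracker` class is hypothetical, invented for illustration.

```python
import heapq

# Sketch of defect tracking: each reported bug gets a priority and an
# assignee, and the tracker hands back the most urgent open defect first.
class DefectTracker:
    def __init__(self):
        self._queue, self._count = [], 0

    def report(self, summary, priority, assignee):
        # Lower number = more urgent; the counter breaks ties in arrival order.
        heapq.heappush(self._queue, (priority, self._count, summary, assignee))
        self._count += 1

    def next_defect(self):
        _, _, summary, assignee = heapq.heappop(self._queue)
        return summary, assignee

tracker = DefectTracker()
tracker.report("report window crashes on print", priority=2, assignee="pat")
tracker.report("server hangs at 51st user", priority=1, assignee="lee")
print(tracker.next_defect())  # the priority-1 defect comes back first
```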
Remote beta testing. This new class of products remotely monitors a trial group of users. The information collected on how users access the parts of an application, in terms of frequency of use, etc., can be reported back to development teams to incorporate into the next release.
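The aggregation step, turning raw per-user event logs into a frequency-of-use report for the development team, can be sketched simply. The event names below are invented for illustration.

```python
from collections import Counter

# Sketch of remote beta telemetry: each trial user's feature invocations
# are logged, then aggregated so the team sees what is actually used.
events = [
    ("user1", "print_report"), ("user1", "export_csv"),
    ("user2", "print_report"), ("user3", "print_report"),
]
usage = Counter(feature for _, feature in events)
for feature, count in usage.most_common():
    print(f"{feature}: used {count} times")
```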
Test management. Just as workflow products track business process flow within organizations, a new class of process management products tracks the development process from start to finish. We believe such products can help organizations identify productivity bottlenecks and assign responsibility for improving efficiency in the development process.
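The bottleneck-finding idea reduces to comparing how long each stage of the cycle actually took. The stage durations below are hypothetical figures, chosen only to illustrate the calculation.

```python
# Sketch of process tracking: record how many days each development stage
# took and flag the bottleneck. The figures are hypothetical.
stage_days = {"design": 10, "develop": 30, "integrate": 12, "test": 45}
bottleneck = max(stage_days, key=stage_days.get)
total = sum(stage_days.values())
print(f"bottleneck: {bottleneck} ({stage_days[bottleneck]} of {total} days)")
```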
Market Size
We segment the Automated Software Quality (ASQ) market into four categories: software configuration management (SCM), software testing, error detection and test management. We believe the ASQ market was a $380 million revenue opportunity in 1995 (license and services), with SCM the largest segment ($170 million) and testing the second largest ($105 million). Our estimates include tools for the workgroup and enterprise client server markets (UNIX, NT, Windows 95 and Windows 3.x-based tools), and do not include mainframe-based debugging, CASE and testing tools. We believe the ASQ market will grow at a rate exceeding 40% compounded annually over the next five years, and will become a $2.3 billion market by the year 2000. Within the ASQ market, we believe the two fastest growing segments are testing and SCM, with each segment slated to grow at a rate exceeding 50% per year.
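The growth figures above can be checked with a quick compound-growth calculation on the numbers given in the text ($380 million in 1995 to $2.3 billion in 2000):

```python
# Check of the compound growth arithmetic: what annual rate takes a
# $380 million market in 1995 to $2.3 billion in 2000 (five years)?
start, end, years = 380.0, 2300.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"implied growth rate: {cagr:.1%}")  # about 43% per year
```

The implied rate of roughly 43% per year is consistent with the "exceeding 40% compounded annually" claim.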
The quality space has been viewed primarily as discrete sub-segments, with leaders in each group. For example, Atria Software is the leader in high end configuration management software, through its ClearCase product offering. Mercury Interactive, the pioneer in software testing, is the leader in its category. In the emerging segment of error detection, Pure is the market leader. We believe that the market has passed from the introductory phase to the growth phase within the life-cycle of the industry.
At this stage of market development, the ASQ business remains highly fragmented without a single vendor holding more than 12% of the market.
As the market matures and the need for a total quality solution to software development becomes apparent, we believe a unified, agreed-upon definition of the category will emerge. The category will continue to evolve, and we would not be surprised to see new tools introduced that alter its definition.
Critical Issues
As the ASQ category evolves, we believe the following issues must be monitored and revisited continually in order to draw conclusions about the industry's growth.
©1996 Adams, Harkness & Hill, Inc.