Critical factors in technology choices

COST EFFECTIVENESS: Inside an IT major's facility in Kolkata.

IT has to deliver on the key abilities in the context of the need to cut costs

While Total Cost of Ownership (TCO) has become the "acid criterion" for most IT decision-makers, the reality is that some fundamentals of the concept are being ignored in practice.

TAKING INFORMED decisions on IT spend is vital to sustaining one's competitive edge in today's hypercompetitive business environment. The decisions themselves have become complex, given the pressure on IT departments to "do more with less." While providing the greatest possible cost effectiveness is a key goal, IT has to ensure that focus on cost effectiveness does not shortchange the essential "abilities" of Information Technology — reliability, manageability, security and interoperability.

Essential abilities

Historically, there has been a focus on IT's reliability, security and manageability, but these abilities were defined and measured in the days of largely monolithic IT architectures and in the context of the associated vendor solutions. It is, therefore, worthwhile to re-examine them against current-day IT realities. Further, given the multitude of platform choices available and vendors' growing focus on interoperability, heterogeneity is a reality even in fairly unsophisticated IT shops. In the light of this, any IT decision needs to factor in interoperability as a vendor provides it today, as well as the roadmap associated with it.

While Total Cost of Ownership (TCO) has become the "acid criterion" for most IT decision-makers, the reality is that some fundamentals of this concept are being ignored in practice. A large number of decision-makers, for instance, admit to taking decisions with acquisition cost as a proxy for TCO, without actually estimating the projected TCO in a methodical fashion. The negative impact of such a decision is felt both in the short term (for example, being stuck with a solution that does not offer extensibility) and in the long term (for example, implementing a solution without a clear roadmap of future directions). It is therefore important to re-examine the meaning and implications of these abilities in the current scenario, and to highlight some of the key questions that need to be addressed before making an IT decision.

Reliability v agility

Software reliability is the probability of failure-free software operation for a specified time in a specified environment. Driven by the desire to reduce the number of moving parts and so bring down the probability of failure, vendors usually define the environment tightly, thereby limiting the flexibility of the IT infrastructure. Given this, IT users need to ask a fundamental question: how much agility am I willing to trade off in order to achieve greater reliability? Specifically, is it possible to add more applications (independent software vendor or line-of-business) to the "stack", and how does that affect the reliability promised in the SLA (service level agreement)? Will the boundary conditions around the operating environment be violated by a minor upgrade of the product, if there is a desire to gain the benefits associated with it? What is the breadth of available ISV solutions? Are there certification programmes that give IT staff the skills they need to reduce the time-to-solution?

Specific to vendor choice, a critical question IT needs to ask of product vendors is their track record in improving reliability across versions, and their level of ongoing engineering investment to meet the ever more complex demands on the IT infrastructure. In a nutshell, reliability needs to be viewed across the lifecycle of the system, rather than as a static property at the time of purchase. During this time, IT should also be able to add or remove capabilities without breaking support contracts and other commitments.
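The textbook definition above can be made concrete with a simple calculation. The sketch below uses the common constant-failure-rate (exponential) model, in which reliability over a mission window follows from the mean time between failures (MTBF); the MTBF figures are purely illustrative, not drawn from any vendor's data.

```python
import math

def reliability(mtbf_hours: float, mission_hours: float) -> float:
    """Probability of failure-free operation for `mission_hours`,
    assuming a constant failure rate (exponential model):
    R(t) = exp(-t / MTBF)."""
    return math.exp(-mission_hours / mtbf_hours)

# Hypothetical comparison: a tightly specified environment (higher MTBF)
# vs. a more flexible, heterogeneous stack (lower MTBF), over 30 days.
print(round(reliability(50_000, 720), 4))  # tightly bounded environment
print(round(reliability(20_000, 720), 4))  # more flexible stack
```

The gap between the two figures is one way to quantify the reliability-versus-agility trade-off the section describes: loosening the environment lowers MTBF, and the model shows how much failure-free probability is given up in exchange.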


Manageability

Any management framework or vision encompasses a multitude of client form factors, such as desktops, laptops and mobile PCs, as well as complex server infrastructure including operating systems, databases, and mission-critical and off-the-shelf applications. This scope finds reflection in the portfolio of solutions offered by the vendor. IT has to make sure that the vendor has a clearly defined management vision and roadmap that aligns with the needs of the existing infrastructure and scales well into the future. It is equally important to check whether the vendor has a vibrant partner ecosystem, including ISVs and solution providers, that adds functionality to the management solution and enables its roll-out.

Security

With the myriad technologies and interface mechanisms growing more complex, security is a journey rather than a destination. IT therefore needs to demand data from vendors on their investment in security (across design, development, product offerings and support policies) and the associated trajectory of results achieved toward improving security. Equally important is the ability of vendors to provide a portfolio of products and solutions that meets diverse needs across clients and servers, as well as across a diverse set of potential threats.


Interoperability

Interoperability lies at the heart of a successful Information Systems environment. It is the crucial link that enables users within enterprises to access and use a wide range of products and services through a greater number of systems, devices and technologies, regardless of the platforms involved. Given this, while choosing a solution, IT needs to take into account the ability to connect easily to other platforms, applications and data, as well as the roadmap and industry initiatives around interoperability associated with any given vendor. When making vendor choices, the focus should be on whether the provider adheres to widely accepted industry standards and to open standards based on pervasively used technologies. It is also worth checking whether a vendor has gone beyond adopting standards and built semantic and process interoperability into the complex system.

TCO, the acid metric

As stated earlier, there is an apparent gulf between understanding the TCO model and the practice of estimating TCO. The key aspects to be borne in mind are:

- TCO needs to be estimated and measured over a period, say three to five years.
- It is important to understand the relative shares of the key buckets of cost: staffing, downtime, training, software, hardware and outsourcing.
- Related to the above, the interplay between these factors should be clearly understood and accounted for. Purely reducing the software acquisition cost, for instance, may result in increased downtime as well as associated staffing costs, due to poor quality or the lack of a cogent roadmap.
- Any decision carries risk; in estimating the TCO, all dimensions of risk associated with the vendor, such as its roadmap and partner ecosystem, need to be quantified, or at least considered as qualitative factors influencing the eventual decision.

Thus, IT has to deliver on the key abilities in the context of ongoing pressure to reduce costs. Before choosing a solution, it is imperative that the core IT abilities (reliability, manageability, security and interoperability) and the associated TCO are examined in detail and with due rigour, so that IT investments remain tuned to the business objectives of the enterprise. This process ensures that the IT infrastructure delivers greater reliability and better security at the lowest total cost of ownership. IT therefore has a key role in enabling people to build connections, improve operations, drive innovation, and develop the customer relationships the business thrives on.

RADHESH BALAKRISHNAN
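The points above can be sketched as a back-of-the-envelope calculation. All figures below are hypothetical, chosen only to show how a one-time acquisition cost combines with recurring cost buckets over a multi-year horizon, and how the relative share of each bucket becomes visible once TCO is estimated methodically.

```python
# Minimal TCO sketch over a five-year horizon (illustrative numbers only).
YEARS = 5

cost_buckets = {          # annual figures, hypothetical currency units
    "staffing":    150_000,
    "software":    100_000,  # licences and upgrades
    "hardware":     80_000,
    "outsourcing":  60_000,
    "downtime":     40_000,  # lost productivity from outages
    "training":     20_000,
}

acquisition_cost = 250_000   # one-time, year-zero spend

tco = acquisition_cost + YEARS * sum(cost_buckets.values())
print(f"Projected {YEARS}-year TCO: {tco}")  # 2,500,000 with these inputs

# Relative share of each bucket in the projected TCO: this is where a
# "cheap" acquisition that inflates downtime or staffing shows itself.
for bucket, annual in cost_buckets.items():
    print(f"{bucket:>12}: {YEARS * annual / tco:.1%}")
```

With these inputs, acquisition is only 10 per cent of the projected TCO, which is the article's point: optimising the purchase price alone leaves the dominant recurring buckets, such as staffing and downtime, unexamined.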

Director, Platform Strategy, Microsoft India


