Standardized IT systems can appear to make a lot of sense: standardization can be cost-efficient, aligned with industry practice, and conducive to reuse. However, the business advantages standardization yields can easily be replicated by competitors, who have access to the same standardized components. So what are the trade-offs, and when does it make sense to choose standardization?
Standardization is typically preferred when cost efficiency is the motivation and/or when interoperable IT components and process interfaces are required. The cost motivation is straightforward and particularly appealing to organizations that face cost pressure in their own markets. Selecting standardized technology can yield immediate savings: the combination of multiple suppliers and easy interchangeability creates a buyer's market, which can stimulate ongoing price reductions.

The drive to achieve cost efficiency through standardization can be seen at every stage of a technology deployment. In the design stage, selecting standardized technologies can increase interoperability and reduce the complexity of the system design. In the deployment stage, standardized products enable a more predictable operating environment that is typically less costly to manage. In the operational stage, the use of standardized technologies can widen access to skilled personnel.

Interoperability is a growing concern for many organizations, especially those operating in markets where ecosystems and partnerships are critical, or where merger and acquisition activity is common. Linking or bringing together multiple heterogeneous IT systems is greatly simplified when the IT environments involved are built from highly standardized components.