Modeling has been used in the development of large computer systems for many years, and in some environments it is a required part of the architecture design process. This month we will look at some of the processes behind, and reasons for, modeling and simulation, and provide a bit of history on modeling and simulating computer systems.
The commercial application of modeling and simulating computer systems has been around for over 30 years. It began on mainframes, when BGS (I believe an MIT spin-off) created a modeling package for IBM mainframes. In many ways, mainframes were, and still are, far simpler to model than current Unix systems, for several reasons:
- They have far more deterministic queues of work
- They provided (even back then) far more information on the queues of work in the operating system
- Customers were paying large amounts of money and needed to plan for the future; modeling was (and is) usually in IBM's best interest as well, since it helped them sell more hardware
- Service level agreements required accurate predictions of performance
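The deterministic queues of work mentioned above are what made analytic modeling tractable. As a minimal sketch of the kind of calculation such packages perform, here is the standard M/M/1 queueing model (exponential arrivals and service, one server); this is textbook queueing theory, not BGS's actual method, and the workload numbers are made up for illustration:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Classic M/M/1 queue: predict performance from two measured rates.

    arrival_rate  -- jobs arriving per second (lambda)
    service_rate  -- jobs the server can complete per second (mu)
    """
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: utilization >= 1")
    utilization = arrival_rate / service_rate
    # Mean response time rises sharply as utilization approaches 1,
    # which is why capacity planners watch this curve closely.
    response_time = 1.0 / (service_rate - arrival_rate)
    # Mean number of jobs in the system (Little's law: N = lambda * R).
    jobs_in_system = utilization / (1.0 - utilization)
    return utilization, response_time, jobs_in_system

# Hypothetical workload: 8 jobs/sec arriving, CPU completes 10 jobs/sec.
u, r, n = mm1_metrics(8.0, 10.0)
# utilization = 0.8, mean response time = 0.5 s, mean jobs in system = 4.0
```

A model like this lets a planner ask "what happens to response time if arrivals grow 20%?" before buying hardware, which is exactly the question the mainframe capacity-planning groups were paid to answer.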
Modeling, simulation, and capacity planning became a requirement for mainframe systems, and the process was often combined with system tuning. Smaller mainframe companies such as Univac developed their own in-house modeling, capacity planning, and tuning groups, since their market share did not justify independent companies developing products for them.