Deliver Us from Evals: Evaluating Storage Software
Few people enjoy the grueling evaluation period that typically accompanies the purchase of storage software and systems. By knowing the key points to consider and learning from the experiences of others, however, it may be possible to make the process less distasteful.
Here we take a look at two different approaches in the real world and the lessons learned:
When Sprint decided to upgrade its storage management capabilities, it wanted to avoid being inundated with dozens of products to evaluate.
"Deliver us from storage evals," said Lynn Neal, senior systems integrator at Sprint. "You have to keep the whole Request for Proposal (RFP) process simple of you end up having to evaluate too many products."
Lessons learned from Sprint's recent storage purchases include:
- RFPs must be neither too general nor too specific. Too specific, or too much of a shopping list, and costs can mount. Too general and you end up with software that doesn't fit your needs or that lacks business value.
- Identify pain points and tie them to your functional requirements. To be funded, software has to solve a real issue. Then communicate internally to determine which functions are given overall priority. By publishing the results, you can flush out any lingering disagreements. And don't forget to secure broad agreement from top management.
- Continually refer back to your RFP to avoid becoming sidetracked by vendor hype about the latest and greatest new features.
"Keep your original documentation and refer to it often," said Neal. "Otherwise you forget."
- Use your RFP to cull responses down to a maximum of five vendors. Don't pick any more than this or the selection process will become too complex. These are the products you are going to evaluate. And avoid beta versions unless you get an offer you can't refuse or have a special vendor relationship.
- Match your test lab to your operational environment. The only way to run a valid test for YOUR enterprise is to set up a test system that runs the same applications with the same workloads. And don't allow vendors to alter them or take an excessive amount of time to complete their work in your lab.
"Limit vendor tests to a specific time period — no more than one to two weeks," said Neal. "Some vendors will try to tie up your test lab for weeks and throw your schedule well behind."
- Score each product objectively against predefined features. Keep the criteria simple and score only those items. If you test correctly, the highest-scoring product will be the most suitable.
- Once testing is completed, bring cost into the equation. This helps you evaluate value. If the top-of-the-line system is twice the price of number two, maybe you can do without a specific feature and save a fortune.
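The last two steps — scoring against predefined criteria, then bringing in cost — can be sketched as a simple weighted scoring matrix. This is an illustrative example only; the vendor names, features, weights, scores, and costs are hypothetical and not from Sprint's evaluation.

```python
# Hypothetical sketch of a weighted evaluation scorecard.
# All names, weights, scores, and costs below are made-up illustrations.

def weighted_score(scores, weights):
    """Combine per-feature scores (0-10) using predefined weights that sum to 1."""
    return sum(scores[feature] * weight for feature, weight in weights.items())

# Criteria and relative importance, agreed on BEFORE testing begins
weights = {"backup_speed": 0.4, "reporting": 0.3, "ease_of_use": 0.3}

# Test-lab results, scored only on the predefined criteria
products = {
    "Vendor A": {"scores": {"backup_speed": 9, "reporting": 8, "ease_of_use": 7},
                 "cost": 200_000},
    "Vendor B": {"scores": {"backup_speed": 8, "reporting": 7, "ease_of_use": 8},
                 "cost": 100_000},
}

for name, product in products.items():
    score = weighted_score(product["scores"], weights)
    # Cost enters only after testing: score per $100K spent highlights value
    value = score / (product["cost"] / 100_000)
    print(f"{name}: score={score:.1f}, value per $100K={value:.2f}")
```

Run on these sample numbers, Vendor A scores highest (8.1 vs. 7.7), but at twice the price Vendor B delivers far more score per dollar — exactly the trade-off the cost step is meant to surface.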