Many storage architects start their workday sipping coffee, reading email, and checking on the status of various things in the environment, mostly passing time on the repetitive tasks of provisioning, monitoring, and maintaining the arrays, switches, servers, and other pieces that make up the infrastructure, and fighting any fires that may have flared up recently. The work seems to alternate between incredibly boring and repetitive and incredibly stressful at a moment's notice.
On rare occasions an email or phone call from management will contain the words that are dear to the heart of technologists everywhere: "We need some new stuff. Figure out what we need, and let's bring it in and test it out!" Getting to test-drive the shiny new, ultra-fast, mega-big, leading-edge tech is often the reason IT folks got into their jobs in the first place.
This time the product being evaluated is an All-Flash Array (AFA). The usual goal in assigning this type of evaluation to the storage team is to have the team verify that the solution or product meets all of the critical requirements, determine how well it meets other desired criteria, differentiate it from similar offerings from other vendors, and finally deliver a well-thought-out, thoroughly researched recommendation to management.
Excitement and enthusiasm aside, the only way to accomplish all of this in a timely manner is to take a structured approach. Below is how I suggest organizing the evaluation process.
Proposed All-Flash-Array Product Evaluation Method
- Identify problematic applications and look for storage-related bottlenecks.
- Examine the storage requirements for each application to determine which applications could benefit from AFA storage. This includes availability, reliability/recoverability, replication, change/provisioning frequency, performance, and deduplication.
- Benchmark storage performance overall and for each application.
- Create a prioritized criteria checklist to evaluate products. This includes critical/gating criteria based on requirements (objective) and additional criteria (subjective).
- Design tests to evaluate each product against each criterion. Run pass/fail tests for the critical/gating criteria first to eliminate failures, then performance tests with real application loads or simulated loads that match the benchmarked environment loads as closely as possible.
- Identify which vendors and which offerings from those vendors to evaluate. Identification should include current vendors already in the storage environment and new All-Flash vendors. Begin filling out the criteria checklist for each offering, immediately eliminating those that don't meet the minimum/gating requirements.
- Work with resellers or vendors to arrange evaluation units so the products can be run through the tests.
- Perform the tests.
- Write up the findings and work with management to present the recommendation to the decision makers.
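The checklist-and-elimination logic above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (all criterion names, weights, and vendor scores are invented, not from the article): offerings that fail any critical/gating criterion are eliminated first, and the survivors are then scored on the weighted subjective criteria.

```python
# Hypothetical sketch of a prioritized criteria checklist.
# All names, weights, and scores below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    gating: bool      # must-pass (objective) requirement vs. nice-to-have
    weight: int = 0   # only used for non-gating (subjective) criteria

def evaluate(offering_results, criteria):
    """Eliminate offerings that fail any gating criterion, then
    compute a weighted score for the survivors."""
    scores = {}
    for offering, results in offering_results.items():
        # Gating pass: any failed critical criterion eliminates the offering.
        if any(c.gating and not results.get(c.name) for c in criteria):
            continue
        # Weighted sum over the remaining (subjective) criteria, scored 0-10.
        scores[offering] = sum(
            c.weight * results.get(c.name, 0) for c in criteria if not c.gating
        )
    return scores

criteria = [
    Criterion("sync replication", gating=True),
    Criterion("inline dedupe", gating=True),
    Criterion("4k random IOPS", gating=False, weight=5),
    Criterion("ease of provisioning", gating=False, weight=2),
]

results = {
    "Vendor A": {"sync replication": True, "inline dedupe": True,
                 "4k random IOPS": 8, "ease of provisioning": 6},
    "Vendor B": {"sync replication": False, "inline dedupe": True,
                 "4k random IOPS": 10, "ease of provisioning": 9},
}

print(evaluate(results, criteria))  # Vendor B is eliminated on the gating check
```

Keeping the gating pass separate from the weighted scoring mirrors the process in the list: no amount of raw performance can rescue an offering that misses a hard requirement.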
In the second part of this series, we will delve into more detail regarding each of the steps in this process.