Following our fresh, out-of-box testing, we then fill the drive with multiple game installs to within 1GB of its maximum capacity. Then, using Iometer, we begin a full one-hour test of random 4KB writes at a queue depth of 32. This quickly fills up the rest of the drive's spare area and then forces it into a simulated steady state, where the controller must process the incoming writes while also managing the resulting NAND fragmentation and performing garbage collection to make free blocks available to write to, all without any idle time.
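The workload above is driven by Iometer, a GUI tool, so there is no script to show; as a rough sketch only, the same access pattern could be expressed as an fio job file like the one below. This is an assumption for illustration, not the configuration actually used in testing, and `/dev/sdX` is a placeholder device path.

```ini
; Hypothetical fio analogue of the Iometer workload described above:
; 4KB random writes at queue depth 32 for one hour, with no idle time.
[steady-state-write]
filename=/dev/sdX
rw=randwrite
bs=4k
iodepth=32
ioengine=libaio
direct=1
time_based=1
runtime=3600
numjobs=1
```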
This test puts the controller under far more stress than any client workload would, and is most relevant to those looking to use drives in intensive workstation or server environments, for example. In such situations, it's important for a drive to maintain high performance and stay responsive, so we report here the average IOPS and average response time from the last five minutes of the one-hour test.
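As a minimal sketch of the reporting step, assuming one IOPS and one response-time sample per second over the hour, the last-five-minute averages could be computed like this (the function name and sample layout are illustrative, not the reviewers' actual tooling):

```python
def last_window_averages(iops_samples, latency_samples, window=300):
    """Average IOPS and response time over the final `window` samples.

    With one sample per second, window=300 covers the last five minutes
    of a one-hour (3600-sample) run.
    """
    tail_iops = iops_samples[-window:]
    tail_latency = latency_samples[-window:]
    avg_iops = sum(tail_iops) / len(tail_iops)
    avg_latency = sum(tail_latency) / len(tail_latency)
    return avg_iops, avg_latency
```

Only the tail of the run is used because, by that point, the drive has been forced into steady state and the numbers reflect sustained rather than burst behaviour.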
Average performance is only one side of the coin, however. It's also important to consider consistency, i.e. how much performance varies under sustained load. A drive with more consistent performance has more robust algorithms for NAND defragmentation, page mapping, garbage collection, and so on. To measure consistency, we calculate the standard deviation of the IOPS readings, again from the last five minutes, where the drive is in its steadiest state. These figures are typically more relevant for enterprise users, but power users at home who do lots of multitasking, especially with write-intensive workloads, will benefit from having performance that is both fast and consistent – it reduces the prevalence of hiccups and stutters, especially as the drive becomes more full and worn over time.
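The consistency figure is just the standard deviation over the same five-minute tail. A minimal sketch, assuming per-second IOPS samples (the function name is hypothetical):

```python
import statistics


def iops_consistency(iops_samples, window=300):
    """Standard deviation of IOPS over the final `window` samples.

    Lower values mean steadier performance under sustained load;
    a perfectly flat trace would score 0.
    """
    tail = iops_samples[-window:]
    return statistics.stdev(tail)
```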
The last graph shows average IOPS divided by the standard deviation. This helps highlight the best drives, i.e. those that can maintain high performance in a consistent fashion.
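Combining the two metrics above, the final graph's score could be sketched as follows (again assuming per-second samples; a higher score means performance that is both fast and steady):

```python
import statistics


def consistency_score(iops_samples, window=300):
    """Average IOPS divided by the standard deviation of IOPS,
    both taken over the final `window` samples of the run."""
    tail = iops_samples[-window:]
    mean_iops = statistics.fmean(tail)
    stdev_iops = statistics.stdev(tail)
    return mean_iops / stdev_iops
```

Note that dividing by the standard deviation rewards drives whose throughput barely wavers, even over a drive with a slightly higher average but larger swings.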
October 14 2021 | 15:04