Ask your storage administrator what their storage utilisation rate is and they’ll probably quote the allocated figure, the capacity they have promised out. Sometimes that matches actual consumption (and if it does, investigate!), but often the amount of storage actually consumed is far lower. Most companies don’t manage their storage capacity effectively, nor do they have a storage capacity programme in place to monitor and control capacity as the volume of data to be stored grows.
When companies hit a capacity problem and the budget is spent on additional storage, the first thought is often to turn to the cloud. Now, whilst we like the cloud (indeed, Zone Se7en has just moved over to the cloud), it won’t solve a capacity problem until the cause is dealt with: the uncontrolled growth in storage demand.
The principal reason storage capacity is becoming such an issue is the exponential growth in ‘secondary’ data: copies of original data, duplicates, replicas, data warehouse extracts and so on. But don’t simply delete this secondary data; it was created for a reason, probably a good one, such as data protection, discovery, data sets or regulatory compliance. The aim is to optimise your storage capacity so that you can accommodate this additional data. Capacity management is linked to performance and data recovery; improve one and you need to improve the other two to ensure that your capacity can be optimised.
There are two principal routes to storage capacity optimisation and management: capacity-saving utilities and reporting tools.

An all-round utility is compression, usually applied at the volume or LUN level depending on your vendor’s implementation, which can deliver around 2:1 compression of primary data storage. Performance may be affected, but the impact should be minor in comparison to the cost advantages.

Data deduplication is also an option, although most vendors have yet to support it. It runs as a background process, so it doesn’t interfere with the operation of applications and has little impact on performance. One thing to note, though: deduplication doesn’t work well on media files, which are typically already compressed and share few common blocks.

Thin provisioning is widely available from vendors and is probably the most popular method. Administrators allocate storage logically, keeping physical allocation just above actual consumption, with storage drawn automatically from a central pool on demand. But administrators must monitor the pool to ensure there is enough capacity left to absorb data growth.
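To see why a thin pool needs watching, here is a minimal Python sketch (all volume names and figures below are hypothetical) of a thinly provisioned pool that promises more logical capacity than it physically holds:

```python
from dataclasses import dataclass, field


@dataclass
class ThinPool:
    """A thinly provisioned pool: logical allocation may exceed physical capacity."""
    physical_gb: float
    volumes: dict = field(default_factory=dict)  # name -> [logical_gb, consumed_gb]

    def provision(self, name, logical_gb):
        # Logical allocation costs nothing; physical blocks are consumed only on write.
        self.volumes[name] = [logical_gb, 0.0]

    def write(self, name, gb):
        logical, consumed = self.volumes[name]
        if consumed + gb > logical:
            raise ValueError(f"{name}: write exceeds the volume's logical size")
        if self.consumed_gb() + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: add physical capacity")
        self.volumes[name][1] += gb

    def consumed_gb(self):
        return sum(v[1] for v in self.volumes.values())

    def oversubscription(self):
        # Ratio of promised (logical) capacity to physical capacity.
        return sum(v[0] for v in self.volumes.values()) / self.physical_gb


pool = ThinPool(physical_gb=100)
pool.provision("db01", 80)
pool.provision("fileshare", 80)   # 160 GB promised against 100 GB physical
pool.write("db01", 30)
print(pool.oversubscription())    # 1.6: why the pool must be monitored
print(pool.consumed_gb())         # 30.0
```

The pool happily promises 160 GB against 100 GB of disk; writes only fail once actual consumption approaches the physical limit, which is exactly the moment capacity monitoring is meant to anticipate.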
When it comes to reporting tools for capacity management, also known as SRM (Storage Resource Management) solutions, both independent and array vendors offer options – EMC ControlCenter, NetApp OnCommand Insight, Veritas CommandCentral from Symantec, and HP Storage Essentials, to name but a few. The capabilities of these tools are wide and varied, allowing administrators to monitor and manage their storage capacity effectively, but they can be time-consuming, so it is wise to apply the 80/20 rule: use only the elements of your SRM solution that you truly need.
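As a rough illustration of the kind of figures an SRM tool reports (the array names and numbers below are invented), a few lines of Python can contrast the allocated percentage an administrator usually quotes with what is actually consumed:

```python
# Hypothetical per-array figures: (allocated_tb, consumed_tb, raw_tb)
arrays = {
    "array-01": (180, 95, 200),
    "array-02": (60, 55, 200),
}


def utilisation_report(arrays):
    """Per-array allocated vs consumed utilisation, as percentages of raw capacity."""
    rows = []
    for name, (alloc, used, raw) in sorted(arrays.items()):
        rows.append({
            "array": name,
            "allocated_pct": round(100 * alloc / raw, 1),  # what usually gets quoted
            "consumed_pct": round(100 * used / raw, 1),    # what is actually on disk
        })
    return rows


for row in utilisation_report(arrays):
    print(row)  # e.g. array-01 is 90.0% allocated but only 47.5% consumed
```

The gap between the two columns is the point made at the start of this article: an array can look nearly full on paper whilst half its disks sit empty.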
But before you dive in and purchase a new storage solution, know what to focus on when defining your storage capacity management plan:
• Monitor your thresholds to allow provisioning planning – you don’t want one array undersubscribed whilst another is oversubscribed.
• Monitor your utilisation including consumption of raw and secondary data, and allocation levels.
• Identify trends and adjust your storage capacity accordingly, i.e. know the rate at which your data is growing, and calculate your days of storage inventory (how long before free capacity runs out). The target is usually 90 to 180 days.
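The trend calculation in the last point can be sketched in a few lines of Python; the daily consumption samples and free-capacity figure below are hypothetical:

```python
def daily_growth(samples_tb):
    """Average daily growth from consecutive daily consumption samples (TB)."""
    deltas = [b - a for a, b in zip(samples_tb, samples_tb[1:])]
    return sum(deltas) / len(deltas)


def days_of_inventory(free_tb, daily_growth_tb):
    """Days until free capacity is exhausted at the current growth rate."""
    if daily_growth_tb <= 0:
        return float("inf")
    return free_tb / daily_growth_tb


consumption = [120.0, 120.4, 121.1, 121.5, 122.0]  # five daily readings, in TB
growth = daily_growth(consumption)                 # 0.5 TB per day
print(round(days_of_inventory(free_tb=60.0, daily_growth_tb=growth), 1))
# 120.0 days: inside the 90-to-180-day target
```

If the result falls below 90 days it is time to rebalance or buy; well above 180 days and capital is sitting idle on the floor.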
By managing your storage capacity effectively, you could save your company as much as 75% on its storage costs!