From the largest enterprises to the smallest startups, data management has become a crucial consideration for all types of businesses, but 2020 has seen the data storage industry deal with major transitions that will reshape the storage landscape of the future. Unsurprisingly, the total amount of data being created will continue to expand at a rapid pace. According to IDC, the global datasphere (the amount of data created and consumed in the world each year) will grow to 175 zettabytes by 2025, up from 59 zettabytes in 2020.

Projections indicate that there will likely be a gap between data capacity demand and the available supply. However, this will remain a manageable problem as long as expected advances in storage technologies occur during that timeframe. A lack of advances in a particular technology, such as magnetic disk, will necessitate greater use of other storage media, like flash and tape.

The Two-Tier Paradigm

Before we explore further, it's worth understanding how all this data ought to be distributed among a business's storage options. To that end, it's useful to employ a two-tiered model that focuses on how data is used rather than the technology employed to hold it. This is a more modern alternative to the traditional memory storage pyramid model, in which solid state is used for "hot" applications, hard drives for "warm" applications, and tape for "cold" applications, with the temperature scale indicating frequency of access. The two-tier model, by contrast, classifies data as project or perpetual and anticipates free movement between the two.

The project tier uses traditional file system architecture and is reserved for in-process data. The capabilities of the file system interface make it excellent for data that is being ingested, processed, or transformed. As a user creates content or modifies something they’re working on, the application may quickly hop around its data files and update them accordingly. It must do this with enough performance that the user’s creative process is not interrupted, and also with sufficient safety that the user’s data will remain intact in the event of malfunction.

The perpetual tier, on the other hand, is object based, which means data is bundled into packages that are marked with metadata and a unique tag. If you think of file-based architecture as a filing cabinet where users have to sort through the folders themselves to find the item they need, object-based architecture is like an infinite warehouse where everything is stored in a shared repository. To retrieve data, the system uses the metadata and identifying tag, creating a much more manageable system for storage at scale. Examples of data well suited to object storage include project assets that must be shared across a team so they can be the basis for future work, completed work that must be distributed, and finished computational results to be shared among researchers.
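The warehouse analogy can be made concrete with a short sketch. The snippet below is a toy illustration only (the `ObjectStore` class and its methods are invented for this article, not a real product API): every object lands in one flat repository, carries its own metadata, and is retrieved by a unique tag rather than a folder path.

```python
import uuid

class ObjectStore:
    """Toy model of object storage: a flat repository where each item
    is bundled with metadata and retrieved by a unique tag."""

    def __init__(self):
        self._repository = {}  # one shared "warehouse", no folder hierarchy

    def put(self, data: bytes, metadata: dict) -> str:
        # Each object receives a unique identifier; metadata travels with it.
        object_id = str(uuid.uuid4())
        self._repository[object_id] = {"data": data, "metadata": metadata}
        return object_id

    def get(self, object_id: str) -> bytes:
        # Retrieval uses the tag directly, not a directory path.
        return self._repository[object_id]["data"]

    def find(self, **criteria) -> list:
        # Look items up by metadata, e.g. find(project="launch-video").
        return [oid for oid, obj in self._repository.items()
                if all(obj["metadata"].get(k) == v for k, v in criteria.items())]

# Usage: store a finished asset, then locate it by metadata.
store = ObjectStore()
oid = store.put(b"render frame 001", {"project": "launch-video", "status": "final"})
assert store.get(oid) == b"render frame 001"
assert store.find(project="launch-video") == [oid]
```

Note that the caller never navigates folders; the metadata query does the sorting that a user would otherwise do by hand in a filing cabinet.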

Key to this new paradigm is the ability of data to move freely between the two tiers. Current applications aren’t able to access the two tiers natively, so data mover software must be used to transfer work from the project tier to the perpetual tier—to archive a finished project, for instance—and recall it back to the project tier if it needs to be accessed or updated.
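The archive-and-recall round trip can be sketched in a few lines. This is a simplified stand-in for real data mover software, under assumed conventions of our own (the `archive` and `recall` functions and a plain dictionary standing in for the perpetual tier are hypothetical): archiving copies a finished file out of the file system into the object repository and removes the original, and recall writes it back so it can be worked on again.

```python
import pathlib
import tempfile
import uuid

def archive(path: pathlib.Path, store: dict) -> str:
    """Move a finished file from the project tier (file system) into the
    perpetual tier (object repository), returning its unique tag."""
    object_id = str(uuid.uuid4())
    store[object_id] = {"data": path.read_bytes(),
                        "metadata": {"original_name": path.name}}
    path.unlink()  # remove from the project tier once safely archived
    return object_id

def recall(object_id: str, store: dict, dest_dir: pathlib.Path) -> pathlib.Path:
    """Recall an archived object back to the project tier for further work."""
    obj = store[object_id]
    dest = dest_dir / obj["metadata"]["original_name"]
    dest.write_bytes(obj["data"])
    return dest

# Usage: archive a completed report, then recall it for an update.
store = {}
workdir = pathlib.Path(tempfile.mkdtemp())
draft = workdir / "report.txt"
draft.write_text("final results")

tag = archive(draft, store)      # project tier -> perpetual tier
assert not draft.exists()        # gone from the project tier

recalled = recall(tag, store, workdir)  # perpetual tier -> project tier
assert recalled.read_text() == "final results"
```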

Data Storage Advances

Dealing with large amounts of data is no longer just the problem of giant conglomerates. As data generation accelerates, even small- and medium-sized businesses will have to develop storage strategies that incorporate both in-progress and perpetual data. This will mean employing multiple storage solutions, such as solid state for project data and magnetic tape for perpetual data. However, the ideal mix of technologies, and the ratios in which to use them, is not one-size-fits-all, and unanticipated events in the industry can change the calculus on the fly.

For instance, while solid-state disk pricing has been decreasing (2019 saw a dramatic drop of almost 50 percent), it appears that in 2020 this trend will reverse itself: a shortage of flash is expected to drive prices 10 to 40 percent higher than the prior year. Even after supply recovers, solid-state storage gains are expected to begin slowing as manufacturers approach the physical limitations of the technology. In these scenarios, price considerations could push savvy companies to shift more of their data to perpetual-tier technology like magnetic tape. By more aggressively pruning data from the project tier, a business could take greater advantage of the low cost of tape without having to reduce the size of its datasphere.

Hard disk drives (HDDs) face a problem similar to solid state's in that they are reaching the capacity ceiling of current technology. HDD companies are developing technologies such as Heat Assisted Magnetic Recording and Microwave Assisted Magnetic Recording to expand capacities, but another market trend could put a damper on research investments and delay these new methods. Demand for HDDs is actually decreasing as flash becomes the dominant storage medium for consumer products. Looking at disk drive shipments for the four quarters ending 3Q 2019, we see 328 million units shipped, compared to 392 million over the prior four quarters (about a 16 percent drop in volume). Again, the data that would have been stored on these devices will be shifted to other technologies.

While every storage category is exhibiting technology improvements, tape has enough technological headroom to achieve storage capacities of 100 TB or higher on a single cartridge in the next decade. The majority of the capacity increase will likely be gained by adding more tracks across the tape, though gains could also come from increasing the linear bit density of the tape or the speed at which the tape runs across the tape head. For the tape segment to see large growth, widespread realization and adoption of "genetic diversity" (multiple copies of digital content stored in diverse locations on different types of media) is required to protect customers' digital assets. More recently, due to ransomware and other forms of attack, tape is receiving more attention as a final line of defense. Tape also offers the lowest price among the substrates that could be considered for archive: tape comes in at less than $10 per TB, magnetic disk at less than $25 per TB, and optical technology at just below $50 per TB.
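To put those per-TB figures in perspective, a quick back-of-the-envelope calculation shows how the gap compounds at archive scale. The per-TB prices below are the rough ceilings cited above, and the 500 TB archive size is a purely hypothetical example.

```python
# Rough per-TB price ceilings from the figures cited above (USD/TB).
media_cost_per_tb = {"tape": 10, "magnetic disk": 25, "optical": 50}

archive_size_tb = 500  # hypothetical archive size for illustration

# Total media cost for the full archive on each substrate.
totals = {medium: cost * archive_size_tb
          for medium, cost in media_cost_per_tb.items()}

for medium, total in totals.items():
    print(f"{medium}: ${total:,}")
```

At this scale, archiving on tape rather than optical saves on the order of $20,000 in media cost alone, which is the arithmetic behind pruning the project tier aggressively.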

The Elephant in the Room

The coronavirus pandemic has resulted in a very high level of uncertainty in all markets, and the storage market is no exception. However, this unforeseen bane on society may prove to be a boon for the storage market. COVID-19 testing, research, and tracking all produce mountains of data that need to be stored in both the project and perpetual tiers. Even private companies may have to record data related to employee health monitoring, like body temperatures. Once vaccines have been rolled out, some of this data creation will slow, but regulations put in place in response to the pandemic may perpetuate some of these practices.

Let Us Help

Now that you have a better understanding of the contemporary data storage landscape, it's time to take action. Let us help you examine your needs and develop the best magnetic tape solution for you, whether your data primarily lives in the project tier or the perpetual tier. Get in touch with one of our specialists today to learn about the range of data storage products from Qualstar.