Data manager Komprise has its own take on how organizations should respond to substantial enterprise SSD price rises and lengthened delivery times: manage data better so as to use less SSD capacity. Well, it would say that, wouldn’t it?
Indeed it would, but that means you should listen to what it says critically, and then make up your own mind. Here’s a Q&A session with Darren Cunningham, Komprise’s VP for Marketing, so you can get a sense of its position.
Blocks & Files: What’s the problem?
Darren Cunningham: The year 2026 is shaping up to be a pivotal year for enterprise IT infrastructure budgets. While a year ago tariffs were the primary concern, now exploding demand for the high-bandwidth memory (HBM) and high-capacity flash storage (NAND/SSDs) needed for AI infrastructure is adding to equipment shortages and price pressures.
In particular, DRAM and SSD prices may rise by more than 50 percent in some segments, according to CNBC, as memory inventory has fallen sharply in the past year. Meanwhile, cloud storage prices are projected to rise in lockstep due to cloud providers’ reliance on these components and an expected surge in demand from customers needing a capacity backup strategy.
Blocks & Files: Why is this happening?
Darren Cunningham: Multiple factors are converging to create this infrastructure squeeze. AI acceleration demand is consuming memory production capacity. Manufacturers are diverting supply toward high-bandwidth memory for GPUs and other AI processors, limiting the availability of conventional DRAM and NAND used in enterprise servers and storage systems.
Even as fabrication capacity expands, the pace of growth can’t match global demand, leading to persistent shortages and long delivery lead times.
Infrastructure costs are rising broadly, according to analysis by IDC. Memory and flash price increases are driving up the total cost of servers, storage arrays, and cloud infrastructure, squeezing IT budgets and complicating long-term planning.
Blocks & Files: What initial response are you seeing?
Darren Cunningham: One strategy is to stockpile computers, data storage and memory from vendors now, before prices go up. Yet this is only viable if your budget can flex. Devising ways to be more efficient with infrastructure and data storage will be a critical tactic in 2026, not only to deal with the current supply chain problems but for long-term competitive advantage.
Blocks & Files: How should IT infrastructure purchasers respond?
Darren Cunningham: For data storage teams and IT infrastructure and operations leaders, this trend means that continually buying more capacity is no longer a viable strategy. Squeezing more value from existing resources, with a focus on data lifecycle management, can avoid getting caught in the middle of supply chain disruptions, which can happen at any time and for any reason.
There are a few critical strategies that come into play here:
- IT teams can optimize current infrastructure through better system tuning, right-sizing hardware, and using automation to monitor systems for performance and efficiency.
- Re-evaluate vendor relationships on pricing and support.
- Consider and implement stockpiling and/or using the cloud as a backfill cautiously, backed by analytics.
- Implement a storage-agnostic unstructured data management strategy to control costs, preserve infrastructure value and reduce risk for the long term.
Blocks & Files: Does one kind of data have the main role here?
Darren Cunningham: Unstructured data, such as user files, email and chats, logs, media, backups, application artifacts, and research outputs, typically accounts for 70–90 percent of enterprise data footprints. This data usually lives across distributed file systems, object stores, SaaS applications, and cloud and on-premises IT infrastructure with limited visibility and governance. Without insight into which unstructured data is necessary, actively used, duplicated, or low-value, these files consume expensive storage tiers, pushing organizations toward costly capacity expansions.
In this environment, reacting by simply buying more storage not only incurs costs and lead times, but fails to address the root cause: uncontrolled data growth and bloat.
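To make the visibility point above concrete, here is a minimal, illustrative sketch of one such tactic: flagging duplicate files by content hash so redundant copies can be removed or archived. This is not Komprise’s actual implementation, which works at enterprise scale with analytics; it simply shows the underlying idea.

```python
# Illustrative duplicate-file finder (a sketch, not a product implementation).
# Groups files under a root directory by the SHA-256 of their contents;
# any group with more than one member is a set of byte-identical duplicates.
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[str]]:
    """Return {content_hash: [paths]} for files that have at least one duplicate."""
    by_hash: dict[str, list[str]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(str(path))
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

A real tool would hash lazily (size first, then content) and track access patterns too, but even this toy version shows how quickly duplicate bloat can be surfaced once you scan across storage silos.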
Blocks & Files: How would Komprise suggest we deal with it?
Darren Cunningham: A modern unstructured data management strategy gives IT leaders the tools they need to act strategically rather than reactively.
Analytics-driven insights allow IT to understand data usage patterns and lifecycle needs. That way they can always place data where it should live given its current usage and value and get rid of data no longer needed such as orphaned, duplicate and outdated or irrelevant data. They can automatically tier cold or inactive data to low-cost storage tiers such as cloud object storage. This reduces pressure to buy new hardware, which is especially important during times of higher prices and shipping delays.
Not all data is equal, and it should not all be treated the same. Gaining visibility across all storage, including cloud, gives IT infrastructure teams knowledge for decision-making. They can ensure that mission-critical datasets, including those supporting analytics and AI data pipelines, live on high-performance storage while less critical data moves to secondary or archival storage. Enterprise IT organizations can thereby stretch current investments further, freeing budget dollars for AI and digital innovation rather than infrastructure maintenance.
Blocks & Files: You have a longer-term view, I think?
Darren Cunningham: Industry analysts do not expect memory and storage markets to return to “normal” pricing and availability soon. Meanwhile, unstructured data volumes will continue to grow. IT infrastructure and operations leaders must choose between traditional reactionary tactics of buying additional expensive storage and backup systems as needed, if they can get them in time, or recentering on the data and how it can be managed differently based on business requirements, risk, value and age.
These decisions are not easy; IT teams are often bound by long-term vendor relationships, IT leadership preferences and the overall budget. Regardless of the near-term supply chain situation, a data-centric approach to IT infrastructure procurement strategies can have positive, years-long benefits for the enterprise.
Comment
How else would you expect a data management vendor, one with extensive functionality and analytics devoted to finding the right location for data in the SSD-HDD-Cloud-Tape spectrum, to respond to a flash media shortage and price increase? They are going to tell us to manage our data better so we need less of it on flash. Even though this is likely to be the stock response, it doesn’t mean we should ignore it.
Using VDURA numbers, 30 TB of enterprise TLC SSD capacity costs $10,950, or about $365/TB. Assume you need a petabyte of the stuff and that’s a bill of roughly $365,000. Reduce the amount of data by 20 percent and you have saved around $73,000.
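Working through that arithmetic from the stated unit price of $10,950 per 30 TB:

```python
# Back-of-the-envelope flash savings, from the VDURA-derived unit price above.
PRICE_PER_30TB = 10_950            # USD for 30 TB of enterprise TLC SSD
PRICE_PER_TB = PRICE_PER_30TB / 30 # = $365/TB
CAPACITY_TB = 1_000                # one petabyte
REDUCTION = 0.20                   # assumed 20% data reduction

bill = CAPACITY_TB * PRICE_PER_TB
saved = bill * REDUCTION
print(f"1 PB bill: ${bill:,.0f}")   # prints "1 PB bill: $365,000"
print(f"20% saved: ${saved:,.0f}")  # prints "20% saved: $73,000"
```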
Komprise’s software will have a cost and a value. If that value represents a substantial flash capacity saving exceeding the software’s cost by a good enough margin, then it looks like a good deal. And this is before adding in other factors such as possible better data security.