AI fuels demand for memory, storage, and new product launches.

AI Driving Memory and Storage Demand – Summary


  • AI implementation driving demand for memory and storage
  • Prices of storage products increasing due to demand

The growing implementation of AI is driving demand for memory and storage to support data processing and retention, both for training datasets and for the trained models used in inference engines. Manufacturing cutbacks in memory and storage technology in Fall 2023 have also contributed to rising prices for these products, particularly solid-state memory and storage. Various storage and memory products are being introduced to support the demands of AI workflows, including high bandwidth memory (HBM) for AI applications, NAND flash SSDs for data flows, and HDDs for cost-effective secondary storage. Non-volatile memories are also expected to benefit from the growing demand for AI applications.

Key Elements of the Article:

Artificial intelligence implementation is increasing demand for memory and storage products to support data processing and retention for AI training and inference engines. Prices of storage products, particularly solid-state memory and storage, are rising as a result. Various companies, including Samsung, SK hynix, and Micron, are focusing on developing HBM products to meet increased demand from AI applications. Western Digital has introduced new products to support AI workloads and outlined a six-stage AI Data Cycle framework to help customers maximize their AI investments. The storage infrastructure needed for AI workflows includes HBM for training, high-performance NAND SSDs for data flows, HDDs for secondary storage, and archival media for long-term data retention.

Western Digital announced the release of a 32TB ePMR enterprise HDD for select customers, as well as various SSDs to support AI workflows, including high-performance PCIe Gen5 SSDs for training and inference and high-capacity SSDs for fast AI data lakes. The company’s AI storage portfolio spans data preparation, model training, and the creation of useful inference engines. The overall trend is toward storage and memory products tailored to the demands of AI workflows and applications.