High-bandwidth memory (HBM) is increasingly popular in data centers for AI/ML workloads thanks to its compact form factor and high bandwidth. Its adoption in the mainstream market, however, is limited by cost and thermal challenges: the design relies on expensive silicon interposers, and its 2.5D stacked-die structure traps heat. Despite these drawbacks, AI/ML continues to favor HBM for its bandwidth, and speed and performance are expected to improve further with HBM3 Gen2. Efforts to lower cost through alternative designs and materials are underway, but cost remains a significant hurdle.
https://semiengineering.com/hbms-future-necessary-but-expensive/
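To give a rough sense of why the wide 2.5D interface is valued despite its cost, here is a minimal back-of-the-envelope sketch of per-stack peak bandwidth. The 1024-bit interface width and the per-pin data rates (~6.4 Gb/s for HBM3, ~9.2 Gb/s for HBM3 Gen2) are representative public figures used here as assumptions for illustration, not numbers taken from the linked article.

```python
# Illustrative peak-bandwidth arithmetic for a single HBM stack.
# Assumption: per-pin rates of ~6.4 Gb/s (HBM3) and ~9.2 Gb/s (HBM3 Gen2),
# and the standard 1024-bit-wide interface per stack.

def peak_stack_bandwidth_gbs(interface_width_bits: int, pin_rate_gbit_s: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s: width (bits) x per-pin rate (Gb/s) / 8."""
    return interface_width_bits * pin_rate_gbit_s / 8


if __name__ == "__main__":
    width = 1024  # HBM's very wide interface is what the silicon interposer enables
    for gen, rate in [("HBM3", 6.4), ("HBM3 Gen2", 9.2)]:
        print(f"{gen}: ~{peak_stack_bandwidth_gbs(width, rate):.0f} GB/s per stack")
```

Under these assumptions the sketch prints roughly 819 GB/s per stack for HBM3 and about 1.2 TB/s for HBM3 Gen2, which is the bandwidth jump the summary alludes to.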