Samsung cuts back on traditional foundry costs as it leans into HBM for AI computing


The tech world has been going wild over AI since we first started hearing about generative language models and the like. Things have only escalated as more companies adopt AI-powered computing, driving demand for capable hardware. We've seen this with Nvidia's AI GPUs coming under hot demand for data centres, and now Samsung appears to be doubling down on this new type of computational processing.

According to The Korea Economic Daily, Samsung has just increased its hiring target for specialists in AI chip development. Specifically, the company is after engineers experienced with high-bandwidth memory (HBM) DRAM and next-generation chips. These are crucial to developing AI-powered hardware, as AI models are incredibly memory-intensive; they need heaps of the stuff. HBM offers the best way to deliver that memory for data centre hardware, though attempts to use HBM in gaming graphics cards died a death a while back with AMD's Vega generation.
