There's one corner of the semiconductor sector that analysts are particularly bullish on as a way to play the artificial intelligence theme. It's DRAM, or dynamic random access memory — a type of semiconductor memory needed for data processing. As the use of AI grows, more and more memory is required; indeed, Morgan Stanley called memory the "foundational building blocks" of artificial intelligence.
Nvidia, the American chipmaker, is driving demand for the high bandwidth memory segment of the DRAM sector, according to analysts. High bandwidth memory is a key component needed to run Nvidia's AI processors such as the A100 and H100 — something that regular DRAM cannot do, Bank of America said in a June 1 report. In fact, Nvidia's competitors AMD, Intel and even some Chinese fabless firms are also promoting AI GPUs, suggesting increased new high bandwidth memory orders, the bank said.