Cerebras Gets Into The Inference Market With A Bang

I spent over 35 years as a high-tech executive at HPE, IBM, and AMD. Now I love to learn about and share the amazing hardware and services being built to enable Artificial Intelligence, the next big thing in technology.

Cerebras’ Wafer-Scale Engine has so far been used only for AI training, but new software now delivers leadership performance and cost for inference processing. Should Nvidia be afraid?

As Cerebras prepares to go public, it has broadened its target market and sharpened its competitive stance by adding inference processing to its capabilities. This is critical, as inference is growing faster than training and is probably already the larger market. It is hard to pin down exactly how large the inference market is, but Nvidia indicated it accounted for some 40% of its sales in Q1.
