Two prominent former Google researchers, one of whom was a co-inventor of the “transformer” artificial intelligence architecture that paved the way for the generative AI boom, announced on Thursday they had started a new AI company based in Tokyo.
Since the influential paper came out, advances in generative AI foundation models have centred on making “transformer”-based models larger and larger. Instead of doing that, Sakana AI will focus on creating new architectures for foundation models, Jones said. “Rather than building one huge model that sucks all this data, our approach could be using a large number of smaller models, each with their own unique advantage and smaller data set, and having these models communicate and work with each other to solve a problem,” said Ha, though he clarified this was just an idea.
All of the authors of the “Attention Is All You Need” paper have now left Google, and their new ventures have attracted millions in funding from venture investors. Among those authors are Noam Shazeer, who is running AI chatbot start-up Character.AI, and Aidan Gomez, who founded large language model start-up Cohere.