According to The Information, Microsoft is training a new AI language model large enough to rival those of Google and OpenAI.
The new model, internally called MAI-1, is overseen by recently hired Mustafa Suleyman, co-founder of Google DeepMind and former CEO of the startup Inflection, the report said, citing two Microsoft employees familiar with the project.
The exact purpose of the model has not yet been determined and will depend on how well it works. Microsoft could unveil the new model as early as its Build developer conference later this month.
According to the report, MAI-1 will be “much larger” than the smaller open-source models Microsoft has previously trained, meaning it will be more expensive to develop.
Last month, Microsoft released a smaller AI model called Phi-3-mini, aiming to attract a wider customer base with cost-effective options.
The company has invested billions of dollars in OpenAI and deployed ChatGPT technology in its productivity software suite, putting it at the forefront of the generative AI race.
According to the report, Microsoft has set aside large clusters of servers equipped with Nvidia GPUs, along with large amounts of data, to train and improve the model.
The report states that MAI-1 will have approximately 500 billion parameters, while OpenAI’s GPT-4 is reported to have about one trillion parameters and Phi-3-mini only 3.8 billion.