The 1.4 trillion parameter model would be 3.5 times bigger than Meta's current open-source Llama model.