A Tiny New Open-Source AI Model Performs as Well as Powerful Big Ones
Melissa Heikkilä | MIT Technology Review
“[The Allen Institute for Artificial Intelligence (Ai2)] claims that its biggest Molmo model, which has 72 billion parameters, outperforms OpenAI’s GPT-4o, which is estimated to have over a trillion parameters, in tests that measure things like understanding images, charts, and documents. Meanwhile, Ai2 says a smaller Molmo model, with 7 billion parameters, comes…