Mistral AI co-founder and CEO Arthur Mensch.
Photo: Nathan Laine/Bloomberg (Getty Images)

While most AI companies carefully unveil their latest algorithms with press tours and blog posts, others seem content to throw their latest wares into the digital ether like a pirate ship casting off dead weight. One company that fits the latter category is Mistral, a French AI startup that released its latest large language model without explanation, via a nondescript torrent link posted to X over the weekend.

Mistral, which recently raised $415 million in a Series A funding round and is now estimated to be worth $2 billion, has been impressing folks with its fast, efficient LLMs and its fun, carefree hacker attitude. The company’s decision to unceremoniously drop its latest program inspired memes and compliments on X, with one commentator noting: “No blog, no sizzle, no description — just a torrent with the model files… Mistral understands their primary audience to be engineers and knows their cultural erogenous zones.”

On Monday, the company finally followed up its initial release with a blog post sharing more details about the program, which is simply dubbed Mixtral-8x7B. According to benchmarks provided in that post, Mistral’s algorithm outperforms some of its U.S. competitors, including Meta’s Llama 2 family and OpenAI’s GPT-3.5. Folks online seem to agree that Mistral’s new model is pretty damn good, and a whole lot of people are currently crowing about how fast and fun it is.

An added bonus is that Mixtral-8x7B is open source, unlike the offerings of the ironically named OpenAI, which has kept its latest LLMs closed source and inspired a certain amount of backlash as a result. Indeed, Mistral is focused on open sourcing all of its AI software, which puts it firmly on one side of a growing culture war in the AI industry. Mistral AI co-founder and CEO Arthur Mensch recently remarked on this decision, noting that his company is committed to pursuing “an open, responsible and decentralised approach to technology.”
