
Meta confirms that its Llama 3 open-source LLM is coming in the next month



At an event in London on Tuesday, Meta confirmed that it plans an initial release of Llama 3 — the next generation of its large language model used to power generative AI assistants — within the next month.

This confirms a report published on Monday by The Information that Meta was getting close to launch.

“Within the next month, actually less, hopefully in a very short period of time, we hope to start rolling out our new suite of next-generation foundation models, Llama 3,” said Nick Clegg, Meta’s president of global affairs. He described what sounds like the release of several different iterations or versions of the product. “There will be a number of different models with different capabilities, different versatilities [released] during the course of this year, starting really very soon.”

The plan, Meta Chief Product Officer Chris Cox added, will be to power multiple products across Meta with Llama 3.

Meta has been scrambling to catch up to OpenAI, which took it and other big tech companies like Google by surprise when it launched ChatGPT over a year ago and the app went viral, turning generative AI questions and answers into an everyday, mainstream experience.

Meta has largely taken a cautious approach to AI, but that hasn't gone over well with the public: previous versions of Llama were criticized as too limited. (Llama 2 was released publicly in July 2023. The first version of Llama was not released to the public, yet it still leaked online.)

Llama 3, which is bigger in scope than its predecessors, is expected to address this, with capabilities not just to answer questions more accurately but also to field a wider range of questions that might include more controversial topics. Meta hopes this will help the product catch on with users.

“Our goal over time is to make a Llama-powered Meta AI be the most useful assistant in the world,” said Joelle Pineau, Meta’s vice president of AI research. “There’s quite a bit of work remaining to get there.” The company did not discuss the parameter count it’s targeting for Llama 3, nor did it offer any demos of how the model would work. Llama 3 is expected to have about 140 billion parameters, compared to 70 billion for the largest Llama 2 model.

Most notably, Meta’s Llama family of models, built as open-source products, represents a different philosophical approach to how AI should be developed as a wider technology. In doing so, Meta is hoping to win wider favor with developers than more proprietary models can.

But Meta also appears to be proceeding more cautiously with generative AI beyond text generation. The company is not yet releasing Emu, its image generation tool, Pineau said.

“Latency matters a lot along with safety along with ease of use, to generate images that you’re proud of and that represent whatever your creative context is,” Cox said.

Ironically — or perhaps predictably (heh) — even as Meta works to launch Llama 3, it does have some significant generative AI skeptics in the house.

Yann LeCun, the celebrated AI academic who is also Meta’s chief AI scientist, took a swipe at the limitations of generative AI overall and said his bet is on what comes after it. He predicts that will be joint embedding predictive architecture (JEPA), a different approach to both training models and producing results, which Meta has been using to build more accurate predictive AI in the area of image generation.

“The future of AI is JEPA. It’s not generative AI,” he said. “We’re going to have to change the name of Chris’s product division.”



