Meta's first LlamaCon shows the tech giant is still playing catch-up

If, like me, you were hoping Meta's LlamaCon keynote would finally show off the reasoning model it teased earlier this month, or its teacher model Behemoth, prepare to be disappointed. The company's first AI developer conference took place today, and while we didn't get any new models, there were some announcements that help Meta catch up in the hyper-competitive, fast-moving race to build generative AI. They just didn't put the company ahead.
Every big tech company is racing to build a model that can handle complex tasks without needing a ton of computing power (and money) to run. Meta's approach to AI centers on open source, which lets developers peek behind the curtain at how its models are built and trained. Chief Product Officer Chris Cox dropped an updated statistic confirming that Llama models have been downloaded 1.2 billion times to date. Between that and the integration of Meta AI across Facebook, Instagram and WhatsApp, Meta is certainly a major player in the space, even if it's sometimes late to the party or doing its own thing.
Here's everything Meta announced today, and where that leaves the company in the AI industry going forward.
The Meta AI app
A few hours before his keynote, company CEO Mark Zuckerberg confirmed via Instagram that Meta is rebranding its Meta View smart glasses app as a standalone Meta AI app.
The app is available to download now. If you can't find it by searching for "Meta AI" (I couldn't), try searching for "Meta View" instead.
The app is an extension of the company's chatbot, with voice mode features designed to let you talk with Meta AI, plus a social discover feed. It's different from your Instagram or Facebook feed; you won't find and follow your friends there. Instead, you'll see posts from random users sharing their experiences with Meta AI, including the AI images they create, the prompts they ask and the chatbot's answers.
CNBC reported on the possibility of a standalone Meta AI app in February, but the choice to convert the Meta View app raises bigger questions about Meta's AI and VR future. My colleague and smart glasses expert Scott Stein wrote that Meta's push to make another compelling mobile app looks like a way to try to pull more people into its ecosystem, rather than a way to make the glasses themselves more thrilling.
Llama 4: Where's the reasoning model?
Meta didn't debut any new Llama 4 models at LlamaCon. Instead, Cox mostly reiterated what we already know about Scout and Maverick. CNET reached out to Meta for the latest on Behemoth and the Llama 4 reasoning model teased earlier this month, but Meta declined to comment.
The models currently available in the Llama 4 family are Scout and Maverick. Scout is the smaller model, designed to run on a single Nvidia H100 GPU with a 10-million-token context window, while Maverick is the next step up, with more capability.
When Meta released benchmark scores for Llama 4, there was some confusion. The company initially said Maverick outperformed OpenAI's GPT-4o. But eagle-eyed experts noticed the benchmarking group's confirmation that the Maverick submitted for testing wasn't the same model people can actually use now; it had been "optimized for conversationality." Meta denied training its models on test data, which is a big taboo, since doing so can give a model an unfair advantage in benchmarks without accurately reflecting its real performance.
Meta's AI policy states that its models do train on information shared on Meta's platforms and on content shared with its chatbots. The company recently ended the opt-out option for European users, so this applies to them too. You can review Meta's full privacy policy for more information.
Meta's Llama API Platform
For developers who want to build with Llama, Meta announced Tuesday that it's starting to preview the Llama API, its upcoming developer platform for building Llama-powered applications. Developers can also request experimental access to fast inference with Llama 4.
"You should be able to carry these custom models anytime, anywhere," said Manohar Paluri, vice president of Meta AI. He added that speed, ease of use and customization should be hallmarks of the Llama API. The new Llama 4 models, Scout and Maverick, will be included in the API.
Angela Fan, a research scientist in generative AI, also stressed that the API's privacy policy is different from Meta AI's general policy. When you use the API, Meta doesn't train on your inputs (your prompts and uploaded content) or outputs (generated content). This is helpful for developers building for businesses or enterprises who need assurance that their uploaded data stays secure.
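Since the Llama API is still in limited preview, Meta hasn't published a full schema, but developer platforms like this typically follow chat-completion conventions. Here's a minimal sketch of what a request might look like; the endpoint URL, model ID and payload fields are assumptions for illustration, not Meta's documented API.

```python
import json

# Hypothetical placeholder endpoint; the real preview URL may differ.
API_URL = "https://api.llama.com/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "llama-4-maverick") -> dict:
    """Assemble a chat-completion-style payload for a Llama 4 model.

    The field names here follow common chat API conventions and are
    assumptions, since Meta's preview schema isn't public.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": 256,
    }

payload = build_chat_request("Summarize LlamaCon's announcements in one sentence.")
# In a real integration you'd POST this with your API key, e.g.:
#   requests.post(API_URL, headers={"Authorization": f"Bearer {key}"}, json=payload)
print(json.dumps(payload, indent=2))
```

Because the API promises not to train on inputs or outputs, a payload like this would carry your data for inference only, which is the distinction Fan was drawing.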
What's next for Meta AI?
LlamaCon's announcements help Meta catch up with its competitors, but they don't put the company ahead of the curve, which could spell trouble down the road. There's still no news on when Meta will release Behemoth or the reasoning model it promised in the Llama 4 drop.
The Meta AI app is nice, but it only brings Meta level with the competition. Most of the major AI players, including OpenAI, Anthropic's Claude and Perplexity, already have mobile apps. For Meta smart glasses users, the app's evolution may hint at how AI will be front and center in those products.
I left the keynote thinking that Meta always seems late to the AI party; OpenAI, Google and DeepSeek all have reasoning models now. As I wrote in my review of Meta AI last year, there's nothing wrong with being late if the company comes out swinging. But so far, that doesn't seem to be the case.
Watch this: Meta AI vs. ChatGPT: AI Chatbots Compared
I think the most surprising thing is the social discover feed in the Meta AI app. Given all of Meta's expertise in building social platforms, the discover/explore page could become a promising (if unlikely) outlet where people flood AI content instead of Facebook or Instagram. It's certainly something to watch, especially as Meta updates the app and pushes forward with AI.
For more information, check out our review of the best AI chatbots.