Meta has officially launched its standalone AI app, a significant push into the crowded market for AI assistants. The announcement coincided with Meta's inaugural LlamaCon developer conference, where CEO Mark Zuckerberg sat down with Microsoft CEO Satya Nadella to discuss the future of technology and the transformative potential of AI.
The Meta AI app, powered by the company's Llama 4 AI model, is designed to be a personal, conversational assistant that can understand complex questions and interact with users in a natural, intuitive way. It aims to differentiate itself from rivals like ChatGPT by offering a more personalized and socially integrated experience. The app is now available for download on iOS and Android devices.
One of the key features of the Meta AI app is its ability to personalize responses based on user preferences and social context. By connecting the app to their Facebook and Instagram accounts, users allow Meta AI to access their data and tailor its responses accordingly. This integration enables the AI to provide more relevant and helpful answers, recommendations, and suggestions. Users can also instruct Meta AI to remember specific details about themselves, such as their interests and dietary restrictions, to further enhance the personalization of future interactions.
The Meta AI app also introduces a unique "Discover" feed, which allows users to share their AI interactions with the community. This feed showcases a variety of prompts and responses, enabling users to explore and engage with content generated by others. Users can like, comment on, and share posts within the app, fostering a social media-like experience. The "Remix" option allows users to adapt and try out prompts created by others, encouraging collaboration and creativity. Users can also control whether their prompts are shared with others.
Voice interaction is another major focus of the Meta AI app. The app supports voice-first interaction, allowing users to engage in conversations using natural language. Meta is testing "full-duplex speech technology" that generates voice responses without first creating written text, promising a more natural and fluid conversational experience. This feature is currently available in the United States, Canada, Australia, and New Zealand.
The Meta AI app also integrates with Meta's Ray-Ban smart glasses, replacing the existing Meta View app. This integration allows users to start conversations on their glasses and continue them on their phones or desktops, ensuring a seamless experience across devices.
At LlamaCon, Zuckerberg and Nadella discussed the rapid advancement of AI and its potential impact on various aspects of society. They explored how AI is already transforming their own companies, particularly in writing code: Nadella noted that AI now generates a significant portion of the code at Microsoft and asked how Meta is using AI in its own development, while Zuckerberg predicted that AI could account for half of Meta's code development within the next year. They also touched on the need for AI-driven productivity gains to show up in GDP growth.