Brave Leo Enhances Browser Experience with Mixtral 8x7B AI Assistant Integration

K.C. Sabreena Basheer | Last Updated: 30 Jan, 2024 | 3 min read

In a significant desktop browser update (v1.62), Brave Leo, the browser’s built-in AI assistant, has adopted Mixtral 8x7B as its default large language model (LLM). Mistral AI’s Mixtral 8x7B, known for its speed and strong benchmark performance, now powers Leo, bringing a host of improvements to the user experience. Let’s delve into the details of this integration and its implications for Brave users.

Also Read: Mistral AI Introduces Mixtral 8x7B: A Powerful Sparse Mixture-of-Experts Model

Mixtral 8x7B – A Game-Changer for Leo

With the latest update, Brave Leo has adopted Mixtral 8x7B, an open-source LLM from Mistral AI released under the Apache 2.0 license. Since its release in December 2023, the model has swiftly gained popularity for its exceptional speed and efficiency. On the LMSYS Chatbot Arena Leaderboard, Mixtral 8x7B ranks above well-known models such as GPT-3.5 and Claude Instant. The integration aims to elevate Leo’s capabilities, offering users better handling of longer contexts and conversations in multiple languages, including English, French, German, Italian, and Spanish.

Also Read: What is the Mixture of Experts Approach to LLM Development?
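Mixtral’s efficiency comes from its sparse mixture-of-experts design: a small router network sends each token to only two of eight expert feed-forward networks, so just a fraction of the model’s parameters are active per token. The toy PyTorch sketch below illustrates this top-2 routing idea; the dimensions are simplified for clarity and this is not Mistral AI’s actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy sparse mixture-of-experts layer: a router sends each token to
    the top-2 of 8 small feed-forward experts, in the spirit of Mixtral."""

    def __init__(self, dim: int = 64, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(dim, num_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (num_tokens, dim)
        gate_logits = self.router(x)                      # (num_tokens, num_experts)
        weights, chosen = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over the 2 picks
        out = torch.zeros_like(x)
        for slot in range(self.top_k):                    # only 2 of 8 experts run per token
            for idx, expert in enumerate(self.experts):
                mask = chosen[:, slot] == idx
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

layer = SparseMoELayer()
print(layer(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```

Because only two experts run per token, inference cost scales with the active parameters rather than the full parameter count, which is why Mixtral can match or beat larger dense models while responding faster.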


Diverse AI Choices for Leo Users

Brave Leo stands out among AI assistants by letting users choose from several language models based on their preferences and needs. While Mixtral 8x7B becomes the default LLM for both the free and premium versions, users can still opt for Claude Instant or Llama 2 13B. The switch to Mixtral brings a substantial boost in answer quality for Leo users, enhancing their overall browsing experience.
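Leo accesses Mixtral through Brave’s own servers, but because the model’s weights are openly available, anyone can prompt the same model directly. As a rough illustration (not how Leo itself is accessed), here is how one might query the instruct variant with the Hugging Face Transformers library; the model ID is real, though running it locally requires tens of gigabytes of GPU memory.

```python
# Illustrative only: this queries the open Mixtral weights directly,
# outside of Brave Leo, via Hugging Face Transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

messages = [{"role": "user", "content": "Summarize: Brave Leo now defaults to Mixtral 8x7B."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```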

Privacy-First Approach in Brave Leo

Privacy remains a top priority for Brave Leo, which is designed to keep user interactions with the AI assistant private, anonymous, and secure. The integration of Mixtral 8x7B maintains this commitment: requests are routed through a reverse proxy so they cannot be traced back to individual users, and responses are discarded immediately after they are generated rather than stored. Users can access Leo without creating an account, and Leo Premium subscribers get an extra layer of privacy through unlinkable tokens, which keep their purchase details separate from their usage.
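To make the reverse-proxy idea concrete, here is a minimal, purely conceptual Python sketch; it is not Brave’s actual code, and the backend URL is hypothetical. The proxy forwards only the prompt body to the model backend, stripping anything that could identify the user, and relays the answer without logging it.

```python
# Conceptual sketch of an anonymizing reverse proxy (NOT Brave's code;
# MODEL_BACKEND is a hypothetical endpoint).
from http.server import BaseHTTPRequestHandler, HTTPServer
import urllib.request

MODEL_BACKEND = "http://localhost:9000/chat"  # hypothetical LLM endpoint

class AnonymizingProxy(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Forward the prompt only: no client IP, cookies, or auth headers.
        req = urllib.request.Request(
            MODEL_BACKEND, data=body,
            headers={"Content-Type": "application/json"}, method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            answer = resp.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(answer)  # relay, then discard; nothing is stored

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), AnonymizingProxy).serve_forever()
```

The key design point is that the model backend only ever sees an anonymous prompt arriving from the proxy, never the user’s own connection details.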


Brave’s Commitment to Open Models

Brave’s decision to integrate Mixtral aligns with its commitment to open models and the broader open-source community. Brian Bondy, CTO and co-founder at Brave, emphasizes the goal of creating novel and convenient use cases for users during their browsing sessions. Arthur Mensch, CEO of Mistral AI, expresses delight in seeing Mixtral integrated into both Brave Search and Brave Leo, highlighting the significance of making Mixtral available as an open model.

Also Read: Apple Secretly Launches Its First Open-Source LLM, Ferret

Our Say

Brave Leo’s integration of Mixtral 8x7B marks a noteworthy advancement in the realm of AI-powered browsers. The collaboration between Brave and Mistral AI showcases a commitment to providing users with cutting-edge, efficient, and privacy-focused technologies. As Brave continues to expand Leo’s capabilities, users can expect an even more seamless and versatile browsing experience.

The move also underscores how quickly AI technologies are reshaping the way users interact with web browsers. As open models like Mixtral continue to improve, users can anticipate further innovations in AI browser assistants, making browsing sessions more personalized, efficient, and secure.


Sabreena Basheer is an architect-turned-writer who's passionate about documenting anything that interests her. She's currently exploring the world of AI and Data Science as a Content Manager at Analytics Vidhya.
