Anthropic rolls out Claude 3.5 Haiku for all users, both free and paid: Report

Anthropic, a leading AI research firm, has released its latest large language model, Claude 3.5 Haiku, to users globally, Gadgets 360 reported. Touted as the fastest model in the company’s portfolio, Claude 3.5 Haiku is designed to deliver superior performance, outpacing the previous generation’s flagship, Claude 3 Opus, on various benchmarks. In a notable move, Anthropic has made the new model available to all users, regardless of their subscription tier.

A Quiet Rollout with a Big Impact

While Anthropic refrained from making a formal announcement, the availability of Claude 3.5 Haiku did not go unnoticed. Users on the social media platform X highlighted its integration across Anthropic’s website and mobile applications. Gadgets 360 reported that the model has become the default option for free-tier users.

This release follows the October announcement of Claude 3.5 Haiku, unveiled alongside an upgraded version of the Claude 3.5 Sonnet model.

Reportedly, Claude 3.5 Haiku brings several enhancements, including reduced latency, improved instruction adherence, and optimised tool usage. These upgrades aim to make it a versatile solution for both individual and enterprise users.

For businesses, Anthropic emphasised the model’s capability to excel in user-centric applications, specialised sub-agent operations, and the creation of tailored experiences from extensive datasets. This positions Haiku as an ideal choice for companies looking to enhance customer interaction and streamline internal processes.

The report also highlights that the Claude 3.5 Haiku model has achieved impressive results on industry-standard benchmarks. It scored 40.6 per cent on the SWE-bench software engineering benchmark, surpassing the earlier 3.5 Sonnet model and even outperforming OpenAI’s GPT-4o. Additionally, the model outshone GPT-4o Mini on the HumanEval and Graduate-Level Google-Proof Q&A (GPQA) tests, cementing its place as a leader in the AI landscape.

Earlier this month, Anthropic optimised Claude 3.5 Haiku for Amazon Web Services’ (AWS) Trainium2 AI chipset, enabling latency-optimised inference through Amazon Bedrock.
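For developers curious about that route, the minimal sketch below shows how a latency-optimised call to Claude 3.5 Haiku might look via Amazon Bedrock’s Converse API using the AWS SDK for Python (boto3). The region, the model/inference-profile identifier, and the performanceConfig latency setting are assumptions to verify against the current Bedrock documentation; this is an illustrative sketch, not Anthropic’s or AWS’s reference code.

```python
import boto3

# Assumption: Bedrock access is enabled in this region and the account has
# been granted access to the Claude 3.5 Haiku model.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    # Assumed model/inference-profile ID; check the Bedrock model catalog.
    modelId="us.anthropic.claude-3-5-haiku-20241022-v1:0",
    messages=[
        {"role": "user", "content": [{"text": "Summarise this support ticket in one sentence."}]}
    ],
    inferenceConfig={"maxTokens": 256},
    # Requests the latency-optimised variant, if available for this model.
    performanceConfig={"latency": "optimized"},
)

# The Converse API returns the assistant message under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```

In practice, the same request without the performanceConfig field falls back to standard inference, so the setting can be toggled per call depending on how latency-sensitive the workload is.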


