xAI woos developers with $25/month worth of API credits

We’ve known it for some time, but now it’s certain: the generative AI race is as much a contest for developers as it is for end-users.

Case in point: today, Elon Musk’s xAI, the spinoff startup of the social network X that uses its data to train new large language models (LLMs) such as the Grok family, announced its application programming interface (API) is now open to the public, and with it comes $25 per month in free API credits through the end of the year.

Given it’s already November, that’s just two months’ worth of free credits, or $50 total.

Musk previously announced the xAI API was open in beta three weeks ago to the day, but apparently uptake was not enough for his liking, hence the added incentive of free developer credits.

Is $25 per month, with two months remaining, really that much of a carrot?

It doesn’t sound like much coming from the world’s wealthiest man, and it isn’t much on a per-user basis or in aggregate, but it may be enough to entice some developers to at least check out xAI’s tools and platform for building apps atop the Grok models.

Specifically, xAI’s API is priced at $5 per million input tokens and $15 per million output tokens, compared to $2.50/$10 for OpenAI’s GPT-4o model and $3/$15 for Anthropic’s Claude 3.5 Sonnet model. Ultimately, that means xAI’s $25 credit won’t get a developer very far: only about two million tokens in and one million out per month. For reference, a million tokens is equivalent to seven or eight novels’ worth of words.
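For the curious, the back-of-the-envelope math behind that estimate looks roughly like this (a quick Python sketch using the prices above; the split between input and output tokens is illustrative and will vary by workload):

```python
# Rough sketch of what $25/month buys at xAI's published per-million-token rates.
# The input/output split below is a hypothetical example, not a measured workload.
INPUT_PRICE = 5.00    # USD per 1M input tokens
OUTPUT_PRICE = 15.00  # USD per 1M output tokens
CREDIT = 25.00        # USD of free monthly credit

input_tokens = 2_000_000   # hypothetical monthly input usage
output_tokens = 1_000_000  # hypothetical monthly output usage

cost = (input_tokens / 1_000_000) * INPUT_PRICE + (output_tokens / 1_000_000) * OUTPUT_PRICE
print(f"Cost: ${cost:.2f} of ${CREDIT:.2f} credit")  # -> Cost: $25.00 of $25.00 credit
```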

The context limit, or how many tokens can be processed as input and output in a single interaction through the API, is around 128,000, similar to OpenAI’s GPT-4o, below Anthropic’s 200,000-token window, and well below Google Gemini 1.5 Flash’s 1-million-token context window.

Also, from my brief test of the xAI API, I was only able to access grok-beta, and only for text; there were no image generation capabilities such as those found in Grok 2 (powered by Black Forest Labs’ Flux.1 model).

New Grok models coming soon

According to xAI’s blog post, this is actually “a preview of a new Grok model that is currently in the final stages of development,” and a new Grok “vision model will be available next week.”

In addition, xAI notes that grok-beta supports “function calling,” or the ability for the LLM to take commands from a user and access functions of other connected apps and services, even executing them on the user’s behalf (if the connected app allows such access).
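Here is a minimal sketch of what a function-calling request could look like, assuming the OpenAI-compatible chat completions interface xAI describes; the base URL, environment variable name and the get_weather tool are illustrative assumptions to check against xAI’s documentation, not part of xAI’s API itself:

```python
# Minimal function-calling sketch using the OpenAI Python SDK pointed at xAI.
# Assumptions: base_url and the XAI_API_KEY env var are placeholders to verify
# against xAI's docs; get_weather is a made-up tool, not something xAI provides.
import os
import json
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],   # your xAI key (assumed env var name)
    base_url="https://api.x.ai/v1",      # xAI's OpenAI-compatible endpoint (assumed)
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",           # hypothetical function the model may call
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="grok-beta",
    messages=[{"role": "user", "content": "What's the weather in Memphis?"}],
    tools=tools,
)

# If the model decides to call the tool, the call (name plus JSON arguments)
# comes back for your own code to execute on the user's behalf.
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    call = tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
```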

Compatible with the competition

Furthermore, the xAI account on the social network X posted that the xAI API is “compatible with OpenAI & Anthropic SDKs,” the software development kits those companies provide to developers, meaning it should be relatively easy to swap those models out for grok-beta or others on the xAI platform.
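In practice, that compatibility claim suggests an existing Claude or GPT integration could be repointed at Grok largely by changing the client’s base URL and model name. Below is a sketch of the idea using the Anthropic SDK; the endpoint and environment variable name are assumptions to verify against xAI’s documentation:

```python
# Sketch of swapping an existing Anthropic-SDK integration over to grok-beta:
# change the base URL and model name, keep the rest of the code as-is.
# The base_url and XAI_API_KEY env var are assumptions, not confirmed values.
import os
from anthropic import Anthropic

client = Anthropic(
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai",       # xAI's Anthropic-compatible endpoint (assumed)
)

message = client.messages.create(
    model="grok-beta",                  # was e.g. a Claude 3.5 Sonnet model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Summarize the xAI API launch in one line."}],
)
print(message.content[0].text)
```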

Musk’s xAI recently switched on “Colossus,” its supercluster of 100,000 Nvidia H100 GPUs in Memphis, Tennessee, which is among the largest in the world and is being used to train its new models. Apparently, that facility is already hard at work.

What do you think? Is it enough to get the developers out in the VentureBeat audience to try building atop xAI? Let me know: carl.franzen@venturebeat.com.



