
xAI launched its API

So far, only the full-size Grok 2 is available. The mini model is listed in the documentation, but access to it has not yet been granted. There is also an empty section for embedding models, which hints at future releases.
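
For reference, here is a minimal sketch of a call to the new API. It assumes the endpoint is OpenAI-compatible, that the base URL is `https://api.x.ai/v1`, and that the launch model is exposed as `grok-beta`; treat the endpoint, model identifier, and key handling as assumptions rather than confirmed details.

```python
# Minimal sketch: calling the xAI API through the OpenAI Python SDK.
# base_url and model name are assumptions about the launch configuration.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_XAI_API_KEY",       # replace with your actual xAI key
    base_url="https://api.x.ai/v1",   # assumed xAI endpoint
)

response = client.chat.completions.create(
    model="grok-beta",                # assumed identifier for Grok 2 at launch
    messages=[{"role": "user", "content": "Hello, Grok!"}],
)
print(response.choices[0].message.content)
```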

The prices are quite high: $5 per million input tokens and $15 per million output tokens. Competing models are significantly cheaper (only o1 costs more, and Grok 2 is nowhere near it). In addition, competitors often offer context caching and batch APIs that substantially reduce the cost of use.
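
A quick back-of-the-envelope check of what those rates mean per request, using only the prices quoted above (the example token counts are arbitrary):

```python
# Cost of a single request at $5 / 1M input tokens and $15 / 1M output tokens.
INPUT_PRICE_PER_M = 5.0    # USD per million input tokens
OUTPUT_PRICE_PER_M = 15.0  # USD per million output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: a 2,000-token prompt with a 500-token reply
print(f"${request_cost(2_000, 500):.4f}")  # -> $0.0175
```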

It is important to keep in mind that this is a beta product built by six people in four months. New features, more models, and lower prices will probably follow. How it will compare with competitors in six months is an open question.