---
description: Learn how to use Groq.
---

# Groq Provider

The [Groq](https://groq.com/) provider contains language model support for the Groq API.
It creates language model objects that can be used with the `generateText`, `streamText`, and `generateObject` functions.

## Setup

The Groq provider is available via the `@ai-sdk/groq` module.
You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>
<Snippet text="pnpm add @ai-sdk/groq" dark />
</Tab>
<Tab>
<Snippet text="npm install @ai-sdk/groq" dark />
</Tab>
<Tab>
<Snippet text="yarn add @ai-sdk/groq" dark />
</Tab>
</Tabs>

## Provider Instance

You can import the default provider instance `groq` from `@ai-sdk/groq`:

```ts
import { groq } from '@ai-sdk/groq';
```

If you need a customized setup, you can import `createGroq` from `@ai-sdk/groq`
and create a provider instance with your settings:

```ts
import { createGroq } from '@ai-sdk/groq';

const groq = createGroq({
// custom settings
});
```

You can use the following optional settings to customize the Groq provider instance:

- **baseURL** _string_

Use a different URL prefix for API calls, e.g. to use proxy servers.
The default prefix is `https://api.groq.com/openai/v1`.

- **apiKey** _string_

API key that is sent using the `Authorization` header.
It defaults to the `GROQ_API_KEY` environment variable.

- **headers** _Record&lt;string,string&gt;_

Custom headers to include in the requests.

- **fetch** _(input: RequestInfo, init?: RequestInit) => Promise&lt;Response&gt;_

Custom [fetch](https://developer.mozilla.org/en-US/docs/Web/API/fetch) implementation.
Defaults to the global `fetch` function.
You can use it as middleware to intercept requests,
or to provide a custom fetch implementation, e.g. for testing.
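For illustration, here is a minimal sketch of a fetch middleware of the shape the `fetch` option accepts: it injects an extra header and then delegates to an underlying implementation. The `withExtraHeader` helper and the header name are hypothetical, not part of the SDK.

```ts
// Minimal middleware sketch for the `fetch` option: inject an extra
// header, then delegate to an underlying fetch implementation.
// `withExtraHeader` and `x-trace-id` are illustrative, not part of the SDK.
type FetchLike = (
  input: RequestInfo | URL,
  init?: RequestInit,
) => Promise<Response>;

function withExtraHeader(
  base: FetchLike,
  name: string,
  value: string,
): FetchLike {
  return (input, init) => {
    const headers = new Headers(init?.headers);
    headers.set(name, value);
    return base(input, { ...init, headers });
  };
}
```

You could then pass the wrapper to the provider, e.g. `createGroq({ fetch: withExtraHeader(fetch, 'x-trace-id', 'abc') })`.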

## Language Models

You can create [Groq models](https://console.groq.com/docs/models) using a provider instance.
The first argument is the model id, e.g. `gemma2-9b-it`.

```ts
const model = groq('gemma2-9b-it');
```

### Example

You can use Groq language models to generate text with the `generateText` function:

```ts
import { groq } from '@ai-sdk/groq';
import { generateText } from 'ai';

const { text } = await generateText({
model: groq('gemma2-9b-it'),
prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```

Groq language models can also be used in the `streamText`, `generateObject`, `streamObject`, and `streamUI` functions
(see [AI SDK Core](/docs/ai-sdk-core) and [AI SDK RSC](/docs/ai-sdk-rsc)).
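As a sketch of streaming usage (assuming a valid `GROQ_API_KEY` environment variable and network access), `streamText` works analogously to the `generateText` example above:

```ts
import { groq } from '@ai-sdk/groq';
import { streamText } from 'ai';

const result = streamText({
  model: groq('gemma2-9b-it'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

// Print the text as it streams in.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```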

## Model Capabilities

Groq offers [a variety of models with different capabilities](https://console.groq.com/docs/models), including:

| Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
| ------------------------- | ------------------- | ------------------- | ------------------- | ------------------- |
| `llama-3.1-70b-versatile` | <Cross size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
| `llama-3.1-8b-instant` | <Cross size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
| `gemma2-9b-it` | <Cross size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
| `mixtral-8x7b-32768` | <Cross size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |

<Note>
  The table above lists popular models. You can also pass any available provider
  model ID as a string if needed.
</Note>
