This example shows how to use the Vercel AI SDK with Next.js and OpenAI to create a sample generative application with OpenTelemetry support.
OTEL spans are processed by the @arizeai/openinference-vercel package and can be collected and viewed in Arize-Phoenix.
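Concretely, the OpenInference wiring typically lives in the app's OpenTelemetry registration (in Next.js, an `instrumentation.ts` file). The sketch below is a minimal illustration rather than this example's exact configuration; it assumes a local Phoenix instance listening on `http://localhost:6006` and uses `@vercel/otel` together with the span processor from `@arizeai/openinference-vercel`:

```ts
// instrumentation.ts — minimal sketch; the example's actual setup may differ.
import { registerOTel } from "@vercel/otel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import {
  isOpenInferenceSpan,
  OpenInferenceSimpleSpanProcessor,
} from "@arizeai/openinference-vercel";

export function register() {
  registerOTel({
    // Illustrative service name.
    serviceName: "next-openai-telemetry-example",
    spanProcessors: [
      // Converts Vercel AI SDK spans to OpenInference and ships them to Phoenix.
      new OpenInferenceSimpleSpanProcessor({
        exporter: new OTLPTraceExporter({
          // Assumes the local Phoenix instance started with the docker command below.
          url: "http://localhost:6006/v1/traces",
        }),
        // Only forward spans that carry OpenInference semantics.
        spanFilter: isOpenInferenceSpan,
      }),
    ],
  });
}
```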
Deploy the example using Vercel.
If you don't already have an OpenAI API key, do the following to create one:
- Sign up at OpenAI's Developer Platform.
- Go to OpenAI's dashboard and create an API key.
To run the example locally you need to:

- Set the required OpenAI environment variable to your API key, as shown in the example env file, but in a new file called `.env.local` (see the sketch after this list).
- To run Arize-Phoenix locally, run `docker run -p 6006:6006 -i -t arizephoenix/phoenix`.
- Run `npm ci` to install the required dependencies.
- Run `npm run dev` to launch the development server.
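For reference, the resulting `.env.local` usually contains a single entry. The variable name below is an assumption based on the AI SDK's OpenAI provider, which reads `OPENAI_API_KEY` by default; defer to the example env file if it differs:

```bash
# .env.local — variable name assumes the AI SDK's default OpenAI provider.
OPENAI_API_KEY=<your-openai-api-key>
```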
To view Arize-Phoenix and the example app, visit:

- App: http://localhost:3000
- Phoenix: http://localhost:6006 or app.phoenix.arize.com
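The traces you see in Phoenix come from the Vercel AI SDK's built-in telemetry, which is enabled per call via the `experimental_telemetry` option. A hypothetical route handler might look roughly like the following (the example's actual route, model, and response handling may differ):

```ts
// app/api/chat/route.ts — hypothetical route handler; names and model are illustrative.
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"),
    messages,
    // Emits OpenTelemetry spans for this call; the OpenInference
    // span processor converts and forwards them to Phoenix.
    experimental_telemetry: { isEnabled: true },
  });

  // Stream the model output back to the client.
  return result.toTextStreamResponse();
}
```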
To learn more about Phoenix, OpenAI, Next.js, and the Vercel AI SDK, take a look at the following resources:
- Phoenix repository - learn about LLM observability with Phoenix
- Phoenix docs
- @arizeai/openinference-vercel - learn about OpenInference support for the Vercel AI SDK
- Vercel AI SDK docs
- Vercel AI SDK telemetry support
- Vercel AI Playground
- OpenAI Documentation - learn about OpenAI features and API.
- Next.js Documentation - learn about Next.js features and API.