Craft AI-driven interfaces effortlessly.
Changelog · Report Bug · Request Feature · English · 中文
- 🌈 Derived from Best Practices of Enterprise-Level AI Products: Built on the RICH interaction paradigm, delivering an exceptional AI interaction experience.
- 🧩 Flexible and Diverse Atomic Components: Covers most AI dialogue scenarios, empowering you to quickly build personalized AI interaction interfaces.
- ⚡️ Out-of-the-Box Model Integration: Easily connect with inference services compatible with OpenAI standards.
- 🔄 Efficient Management of Conversation Data Flows: Provides powerful tools for managing data flows, enhancing development efficiency.
- 📦 Rich Template Support: Offers multiple templates for quickly starting LUI application development.
- 🛡️ Complete TypeScript Support: Developed with TypeScript, ensuring robust type coverage to improve the development experience and reliability.
- 🎨 Advanced Theme Customization: Supports fine-grained style adjustments to meet diverse use cases and personalization needs.
npm install @ant-design/x --save
yarn add @ant-design/x
pnpm add @ant-design/x
Add script and link tags in your browser and use the global variable antdx.
We provide antdx.js, antdx.min.js, and antdx.min.js.map in the dist directory of the npm package.
Based on the RICH interaction paradigm, we provide numerous atomic components for various stages of interaction to help you flexibly build your AI dialogue applications:
Below is an example of using atomic components to create a simple chatbot interface:
import React from 'react';
import {
  // Message bubble
  Bubble,
  // Input box
  Sender,
} from '@ant-design/x';

const messages = [
  {
    content: 'Hello, Ant Design X!',
    role: 'user',
  },
];

const App = () => (
  <>
    <Bubble.List items={messages} />
    <Sender />
  </>
);

export default App;
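If you want messages from different roles rendered differently (for example, user messages on the right and assistant replies on the left), Bubble.List accepts a roles configuration that maps each item's role to bubble props. The sketch below assumes the roles and placement options documented for the Bubble component; the concrete placement values are illustrative:

import React from 'react';
import { Bubble, Sender } from '@ant-design/x';

// Map each item's role to bubble presentation props.
// The placement values are assumptions chosen for illustration.
const roles = {
  assistant: { placement: 'start' as const },
  user: { placement: 'end' as const },
};

const messages = [
  { key: '1', role: 'assistant', content: 'Hi! How can I help you today?' },
  { key: '2', role: 'user', content: 'Hello, Ant Design X!' },
];

const App = () => (
  <>
    <Bubble.List roles={roles} items={messages} />
    <Sender />
  </>
);

export default App;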
We help you integrate standard model inference services out of the box by providing runtime tools such as useXAgent and XRequest.
Here is an example of integrating Qwen:
Note: 🔥 dangerouslyApiKey has security risks; more details can be found in the documentation.
import { useXAgent, Sender, XRequest } from '@ant-design/x';
import React from 'react';

const { create } = XRequest({
  baseURL: 'https://dashscope.aliyuncs.com/compatible-mode/v1',
  dangerouslyApiKey: process.env['DASHSCOPE_API_KEY'],
  model: 'qwen-plus',
});

const Component: React.FC = () => {
  const [agent] = useXAgent({
    request: async (info, callbacks) => {
      const { messages, message } = info;
      const { onUpdate } = callbacks;

      // current message
      console.log('message', message);

      // messages list
      console.log('messages', messages);

      let content: string = '';

      try {
        create(
          {
            messages: [{ role: 'user', content: message }],
            stream: true,
          },
          {
            onSuccess: (chunks) => {
              console.log('sse chunk list', chunks);
            },
            onError: (error) => {
              console.log('error', error);
            },
            onUpdate: (chunk) => {
              console.log('sse object', chunk);
              // The stream ends with a '[DONE]' sentinel that is not JSON
              if (chunk.data === '[DONE]') return;
              const data = JSON.parse(chunk.data);
              content += data?.choices?.[0]?.delta?.content || '';
              onUpdate(content);
            },
          },
        );
      } catch (error) {
        // handle error
      }
    },
  });

  const onSubmit = (message: string) => {
    agent.request(
      { message },
      {
        onUpdate: () => {},
        onSuccess: () => {},
        onError: () => {},
      },
    );
  };

  return <Sender onSubmit={onSubmit} />;
};

export default Component;
We help you efficiently manage the data flow of AI chat applications out of the box by providing the useXChat runtime tool:
Here is an example of integrating OpenAI:
import { useXAgent, useXChat, Sender, Bubble } from '@ant-design/x';
import OpenAI from 'openai';
import React from 'react';

const client = new OpenAI({
  apiKey: process.env['OPENAI_API_KEY'],
  dangerouslyAllowBrowser: true,
});

const Demo: React.FC = () => {
  const [agent] = useXAgent({
    request: async (info, callbacks) => {
      const { messages, message } = info;
      const { onSuccess, onUpdate, onError } = callbacks;

      // current message
      console.log('message', message);

      // history messages
      console.log('messages', messages);

      let content: string = '';

      try {
        const stream = await client.chat.completions.create({
          model: 'gpt-4o',
          // if chat context is needed, modify the array
          messages: [{ role: 'user', content: message }],
          // stream mode
          stream: true,
        });

        for await (const chunk of stream) {
          content += chunk.choices[0]?.delta?.content || '';
          onUpdate(content);
        }

        onSuccess(content);
      } catch (error) {
        // handle error
        onError(error as Error);
      }
    },
  });

  const {
    // used to send a message
    onRequest,
    // used to render messages
    messages,
  } = useXChat({ agent });

  const items = messages.map(({ message, id }) => ({
    // key is required, used to identify the message
    key: id,
    content: message,
  }));

  return (
    <>
      <Bubble.List items={items} />
      <Sender onSubmit={onRequest} />
    </>
  );
};

export default Demo;
@ant-design/x supports ES modules tree shaking by default.
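For example, when you use named ES module imports, bundlers such as webpack or Vite can drop the components you never reference; no extra configuration is implied here:

// Only Bubble and Sender (and their dependencies) end up in the bundle;
// unused components are removed by the bundler's tree shaking.
import { Bubble, Sender } from '@ant-design/x';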
@ant-design/x provides built-in TypeScript definitions.
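For example, you can derive component prop types directly from the package without installing separate type packages. The sketch below uses React's ComponentProps utility; the local alias name is illustrative:

import type { ComponentProps } from 'react';
import { Bubble } from '@ant-design/x';

// Derive Bubble's props from the package's own type definitions.
// BubbleItemProps is a local alias used only for this example.
type BubbleItemProps = ComponentProps<typeof Bubble>;

const greeting: BubbleItemProps = {
  content: 'Hello, Ant Design X!',
};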
Contributions are welcome!
Ant Design X is widely used in AI-driven user interfaces within Ant Group. If your company and products use Ant Design X, feel free to leave a comment here.
Please read our CONTRIBUTING.md first.
If you'd like to help us improve Ant Design X, just create a Pull Request. Feel free to report bugs and issues here.
If you're new to posting issues, we ask that you read How To Ask Questions The Smart Way, How to Ask a Question in Open Source Community, and How to Report Bugs Effectively before posting. Well-written bug reports help us help you!
If you encounter any issues while using Ant Design X, you can seek help through the following channels. We also encourage experienced users to assist newcomers via these platforms.
When asking questions on GitHub Discussions, it's recommended to use the Q&A tag.