To consolidate this week's learning, complete the following project:
- Create a GitHub repository for your project
- Add all group members as collaborators
- Create a `README.md` file with a comprehensive project description
- Use the `ts-playground` as a foundation or develop a new application from scratch using Next.js
- Design a page with a single input field for `.txt` file uploads
  - Users should upload a book or similar content with characters and settings
- Implement a button to extract characters from the uploaded file
- Develop a RAG pipeline to extract characters from the uploaded file
  - Each character should have a name, description, and personality
- Add a text area below (or next to) the button to display the results
- Convert the output from the AI into an array of objects and present it in a table format
- (optional) Modify the `retrieveandquery.ts` file to use Structured Outputs to make it easier to process the response (a sketch of this approach follows this list)
- Integrate the `ts-playground` project with the `story-telling-app` to enable users to create new stories using imported characters, reusing their descriptions and personalities inside the stories being generated by the AI
- Submit your project through the designated submission form
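If you take the optional Structured Outputs route, a minimal sketch of the idea is shown below. This is not the project's actual `retrieveandquery.ts`; it assumes the `openai` and `zod` npm packages, and the model, schema, and function names are placeholders:

```ts
// Sketch: parsing extracted characters with Structured Outputs so the
// response is already an array of objects ready for the results table.
import OpenAI from "openai";
import { zodResponseFormat } from "openai/helpers/zod";
import { z } from "zod";

const Character = z.object({
  name: z.string(),
  description: z.string(),
  personality: z.string(),
});
const CharacterList = z.object({ characters: z.array(Character) });

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function extractCharacters(storyText: string) {
  const completion = await openai.beta.chat.completions.parse({
    model: "gpt-4o-mini", // placeholder model name
    messages: [
      { role: "system", content: "Extract the main characters from the story." },
      { role: "user", content: storyText },
    ],
    response_format: zodResponseFormat(CharacterList, "character_list"),
  });
  // `parsed` is a typed object, so no manual JSON post-processing is needed.
  return completion.choices[0].message.parsed?.characters ?? [];
}
```

The parsed result is a plain array of `{ name, description, personality }` objects, which can be rendered directly as table rows.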
1. Clone the repository:

   ```shell
   git clone https://github.com/alexmazaltov/encode_bootcamp_week_4
   cd encode_bootcamp_week_4
   ```

2. Set your OpenAI API key in `.env.local` (which should already be listed in `.gitignore`):

   ```shell
   OPENAI_API_KEY="sk_..."
   ```

3. Install the dependencies:

   ```shell
   npm install
   ```

4. Run the development server:

   ```shell
   npm run dev
   ```

5. Navigate to http://localhost:3000.
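For reference, the server-side code only ever reads this key from the environment. A minimal sketch of how the OpenAI client can be constructed (the file path is illustrative, assuming the `openai` npm package):

```ts
// Illustrative module (e.g. app/openai.ts) that builds a shared OpenAI client.
// The SDK reads OPENAI_API_KEY from the environment, so nothing is hard-coded.
import OpenAI from "openai";

export const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});
```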
- Create your AI assistant and add the generated ID to `app/assistant-config.ts` (a sketch of this file follows this list).
  - The assistant will also be visible in your OpenAI Platform web UI.
- Click the upload button and upload a `.txt` file.
  - A sample is provided in this repo: `A Study in Scarlet.txt` by Arthur Conan Doyle.
- After the uploaded file has been processed (and the embeddings added to the vector store), click the Extract button to extract the main character profiles from the story.
  - This can take a while; if you click the Extract button before processing has finished you might get an error, so be patient.
- When processing is complete, you will see a table of Story Characters, with the Description and Personality profiles filled in from the RAG pipeline results.
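For reference, `app/assistant-config.ts` only needs to expose the assistant ID. A minimal sketch (the export name and ID value are placeholders; use whatever the template expects and the ID generated for your assistant):

```ts
// app/assistant-config.ts (sketch): stores the ID of the assistant created on
// the OpenAI platform so the API routes know which assistant to talk to.
export const assistantId = "asst_xxxxxxxxxxxxxxxxxxxx"; // replace with your generated ID
```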
This homework was completed by the following members of GROUP 9:
- Luis José Sánchez
- Oleksii Bondarenko
- Razvan Ionescu
- Amr Fayez
- Akash Pathak
- Ahmed Yassin
- Maria Kachalova ...
You can deploy this project to Vercel or any other platform that supports Next.js.
This project is intended to serve as a template for using the Assistants API in Next.js with streaming, tool use (code interpreter and file search), and function calling. While there are multiple pages to demonstrate each of these capabilities, they all use the same underlying assistant with all capabilities enabled.
The main logic for chat can be found in the `Chat` component in `app/components/chat.tsx`, and in the handlers starting with `api/assistants/threads` (found in `api/assistants/threads/...`). Feel free to start your own project and copy some of this logic in! The `Chat` component itself can be copied and used directly, provided you copy the styling from `app/components/chat.module.css` as well.
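As an illustration only (the page path is hypothetical, and it assumes you have copied `chat.tsx` together with `chat.module.css` into your own app), embedding the `Chat` component could look like this:

```tsx
// Hypothetical page that reuses the Chat component from this template.
"use client";

import Chat from "@/app/components/chat"; // adjust the import path to your project layout

export default function CharactersPage() {
  return (
    <main>
      <h1>Story Characters</h1>
      <Chat />
    </main>
  );
}
```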
- Basic Chat Example: http://localhost:3000/examples/basic-chat
- Function Calling Example: http://localhost:3000/examples/function-calling
- File Search Example: http://localhost:3000/examples/file-search
- Full-featured Example: http://localhost:3000/examples/all
Main components:
- `app/components/chat.tsx` - handles chat rendering, streaming, and function call forwarding
- `app/components/file-viewer.tsx` - handles uploading, fetching, and deleting files for file search

API endpoints:
- `api/assistants` - `POST`: create assistant (only used at startup)
- `api/assistants/threads` - `POST`: create new thread
- `api/assistants/threads/[threadId]/messages` - `POST`: send message to assistant
- `api/assistants/threads/[threadId]/actions` - `POST`: inform assistant of the result of a function it decided to call
- `api/assistants/files` - `GET`/`POST`/`DELETE`: fetch, upload, and delete assistant files for file search
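As a usage illustration only (the request and response shapes shown here are assumptions, not a documented contract of these routes), a client could create a thread and post a message like this:

```ts
// Illustrative client-side calls against the routes listed above.
// Field names such as `threadId` and `content` are assumptions.
async function createThread(): Promise<string> {
  const res = await fetch("/api/assistants/threads", { method: "POST" });
  const data = await res.json();
  return data.threadId; // assumed response field
}

async function sendMessage(threadId: string, text: string): Promise<Response> {
  // The template streams the assistant's reply back in the response body.
  return fetch(`/api/assistants/threads/${threadId}/messages`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ content: text }), // assumed request shape
  });
}
```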
Let us know if you have any thoughts, questions, or feedback in this form!