RagDocs is an open-source tool for chatting with your documentation. It combines local LLMs with Retrieval-Augmented Generation (RAG) so developers can get accurate, context-grounded answers from their docs without paying for API calls.
Because everything runs locally, your documentation never leaves your machine while you get ChatGPT-like interactions. Built with Milvus vector search and a Next.js frontend, RagDocs is production-ready, easy to deploy, and free of usage fees.
Demo video: `RAG.Demo.1.mp4`
- 🤖 Intelligent Chat Interface: Conversational interface tailored for technical documentation.
- 🔍 Retrieval-Augmented Generation: Accurate, context-aware responses (see the sketch after this list).
- 💾 Local LLM Support: Integrated with Ollama for privacy and efficiency.
- 🎯 Vector Search Powered by Milvus: Rapid and scalable document querying.
- 🚀 Modern Next.js Frontend: Sleek, real-time user experience.
- 📚 Multi-Document Support: Seamless handling of diverse tech stacks.
- ⚡ Fast Document Processing: Efficient ingestion of new and updated documentation.
- 🔄 Incremental Updates: Keep your documentation in sync effortlessly.
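To make these features concrete, here is a minimal, hypothetical sketch of the kind of pipeline they describe, wired together from the project's main dependencies (llama-index, Ollama, Milvus, sentence-transformers). The model names, Milvus URI, and data path are assumptions for illustration; this is not the project's actual `rag_system.py` implementation.

```python
# Minimal RAG sketch: index local Markdown docs in Milvus, answer with a local
# Ollama model. Illustrative only -- the model names, URI, and path below are
# assumptions, not RagDocs' actual configuration. Requires the llama-index
# Ollama, Milvus, and HuggingFace integration packages.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings, StorageContext
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.milvus import MilvusVectorStore

# Local models: embeddings via sentence-transformers, generation via Ollama.
Settings.embed_model = HuggingFaceEmbedding(model_name="sentence-transformers/all-MiniLM-L6-v2")
Settings.llm = Ollama(model="llama3.2", request_timeout=120.0)

# Load Markdown files from the data directory and store their embeddings in Milvus.
documents = SimpleDirectoryReader("data/milvus_docs", recursive=True).load_data()
vector_store = MilvusVectorStore(uri="http://localhost:19530", dim=384, overwrite=False)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Retrieve the most relevant chunks and generate a grounded answer.
query_engine = index.as_query_engine(similarity_top_k=4)
print(query_engine.query("How do I create a collection?"))
```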
Project structure:

```
ragdocs/
├── data/                      # Documentation storage
│   ├── milvus_docs/           # Milvus documentation
│   ├── qdrant_docs/           # Qdrant documentation
│   └── weaviate_docs/         # Weaviate documentation
├── milvus/                    # Milvus standalone setup
│   └── standalone_embed.sh    # Milvus standalone script
├── ragdocs_api/               # FastAPI backend
│   ├── conversation_api.py    # Chat API endpoints
│   ├── file_tracker.py        # Document change tracking
│   ├── llm_provider.py        # LLM integration (Ollama)
│   ├── markdown_processor.py  # Markdown processing
│   └── rag_system.py          # Core RAG implementation
└── ragdocs_frontend/          # Next.js frontend
    └── src/                   # Frontend source code
```
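The incremental-update feature hinges on `file_tracker.py`, which tracks document changes so only new or modified files need to be re-embedded. As a purely illustrative sketch (not the project's actual implementation), change detection can be done by hashing each file and comparing against a stored manifest:

```python
# Illustrative change tracking via content hashes -- a sketch of the idea
# behind incremental updates, not RagDocs' actual file_tracker.py.
import hashlib
import json
from pathlib import Path

MANIFEST = Path("data/.file_manifest.json")  # assumed location, for illustration

def detect_changes(doc_root: str = "data") -> list[Path]:
    """Return Markdown files whose content changed since the last run."""
    previous = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    current, changed = {}, []
    for path in Path(doc_root).rglob("*.md"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        current[str(path)] = digest
        if previous.get(str(path)) != digest:
            changed.append(path)  # new or modified file -> re-embed it
    MANIFEST.write_text(json.dumps(current, indent=2))
    return changed
```

Files returned by such a check would then be re-processed and upserted into the vector store, leaving unchanged documents untouched.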
Prerequisites:

- Python 3.12+
- Node.js 20.17.0
- Milvus 2.0+
- Ollama (for local LLM support)
Install Node.js with nvm:
```bash
nvm install 20.17.0
nvm use 20.17.0
```
- Clone the Repository:
```bash
git clone https://github.com/AlexisBalayre/RagDocs.git
cd RagDocs
```
- Install Python Dependencies:
```bash
# Install poetry if not already installed
curl -sSL https://install.python-poetry.org | python3 -

# Install project dependencies
poetry install
poetry shell
```
Key Dependencies:
- llama-index: 0.11.22
- fastapi: 0.115.4
- milvus-lite: 2.4.10
- sentence-transformers: 2.7.0
- torch: 2.5.1
- Install Frontend Dependencies:
```bash
cd ragdocs_frontend
yarn
```
- Set Up Environment Variables:
```bash
cp example.env.local .env.local
# Edit .env.local with your configuration
```
- Start Milvus:
```bash
cd milvus
bash standalone_embed.sh start
```
- Run the Backend:
```bash
poetry run uvicorn ragdocs_api.conversation_api:app --reload
```
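Once uvicorn reports that the server is running, you can confirm the API is reachable through FastAPI's auto-generated interactive docs (served at `/docs` by default; the host and port below assume uvicorn's defaults):

```python
# Quick liveness check against the FastAPI backend.
# Assumes uvicorn's default host/port (127.0.0.1:8000) and FastAPI's default /docs route.
import requests

response = requests.get("http://127.0.0.1:8000/docs")
print(response.status_code)  # 200 means the API is serving requests
```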
- Run the Frontend:
```bash
cd ragdocs_frontend
yarn dev
```
Usage:

- Add Documentation: Place files in the respective folders under `data/`.
- Start Chatting: Use the frontend to chat with and explore your documentation.
- Compare Tech Stacks: Leverage built-in comparison features for analysis.
Edit the `.env.local` file for key settings:

```
MILVUS_HOST=localhost
MILVUS_PORT=19530
OLLAMA_MODEL=llama3.2
```
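As an illustrative sketch (not the project's actual configuration loading), variables like these are typically read with `os.getenv`, falling back to the defaults shown above:

```python
# Illustrative: read RagDocs settings from the environment with the defaults
# shown above. The actual code may consume these variables differently.
import os

MILVUS_HOST = os.getenv("MILVUS_HOST", "localhost")
MILVUS_PORT = int(os.getenv("MILVUS_PORT", "19530"))
OLLAMA_MODEL = os.getenv("OLLAMA_MODEL", "llama3.2")

MILVUS_URI = f"http://{MILVUS_HOST}:{MILVUS_PORT}"  # e.g. passed to the vector store client
```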
Contributions are encouraged! Follow these steps:
- Fork the repository.
- Create a feature branch: `git checkout -b feature/AmazingFeature`
- Commit your changes: `git commit -m 'Add some AmazingFeature'`
- Push your branch: `git push origin feature/AmazingFeature`
- Open a Pull Request.
This project is licensed under the MIT License. See the LICENSE file for details.
Acknowledgments:

- Vercel AI Chatbot - Template inspiration for the frontend.
- LlamaIndex - Powering RAG capabilities.
- Milvus - Efficient vector search backend.
- Ollama - Local LLM support.
If this project adds value to your work, please give it a star!
Your support makes a difference and encourages further development. Feedback and feature suggestions are always welcome!