feat: redact file contents from chat and put latest files into system prompt #904

Open

thecodacus wants to merge 1 commit into main
Conversation

@thecodacus (Collaborator) commented Dec 26, 2024

Re-enable Context Optimization with Toggle Feature

Overview

This PR re-enables the previously implemented context optimization feature (from PR #578) while adding a user-configurable toggle in the settings tab. This allows users to choose whether they want to use context optimization based on their specific needs and the LLM model they're using.

Key Changes

1. Context Optimization Toggle

  • Added a new switch in the Features tab for enabling/disabling context optimization
  • Implemented persistent storage of the setting using cookies
  • Added new state management through enableContextOptimizationStore (see the sketch after this list)
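
A minimal sketch of how the store and the Features-tab switch could be wired together, assuming the project's existing nanostores setup. Only `enableContextOptimizationStore` is named in this PR; the component, markup, and file location below are illustrative:

```tsx
// Illustrative sketch; file path and component are assumptions, not this PR's code.
import { atom } from 'nanostores';
import { useStore } from '@nanostores/react';

// Holds the user's preference; the feature is opt-in, so it defaults to false.
export const enableContextOptimizationStore = atom<boolean>(false);

// Hypothetical Features-tab control bound to the store.
export function ContextOptimizationToggle() {
  const enabled = useStore(enableContextOptimizationStore);

  return (
    <label>
      <input
        type="checkbox"
        checked={enabled}
        onChange={(event) => enableContextOptimizationStore.set(event.target.checked)}
      />
      Context optimization
    </label>
  );
}
```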

2. Integration with Chat System

  • Modified the chat implementation to pass the context optimization flag through the API (sketched below)
  • Updated the stream-text functionality to conditionally apply optimization
  • Re-enabled previously commented-out code for file content handling
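
For illustration, the flag could be forwarded from the chat client roughly like this, assuming the Vercel AI SDK's `useChat` hook. The `useOptimizedChat` wrapper and the import path are hypothetical; only the `contextOptimization` field reflects this PR:

```ts
import { useStore } from '@nanostores/react';
import { useChat } from 'ai/react';
import { enableContextOptimizationStore } from '~/lib/stores/settings'; // import path assumed

// Hypothetical wrapper showing how the toggle reaches the /api/chat endpoint.
export function useOptimizedChat() {
  const contextOptimization = useStore(enableContextOptimizationStore);

  return useChat({
    api: '/api/chat',
    // Extra `body` fields are merged into the request payload, so the server
    // receives `contextOptimization` alongside the messages.
    body: { contextOptimization },
  });
}
```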

Technical Details

Settings Management

  • Added new contextOptimizationEnabled state in useSettings hook
  • Implemented cookie persistence for the setting
  • Added an enableContextOptimization callback for toggle handling (see the sketch below)
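
A sketch of the shape these pieces might take: the hook, state, and callback names come from this PR, while the cookie name and the js-cookie usage are assumptions:

```ts
import { useCallback } from 'react';
import { useStore } from '@nanostores/react';
import Cookies from 'js-cookie';
import { enableContextOptimizationStore } from '~/lib/stores/settings'; // import path assumed

export function useSettings() {
  const contextOptimizationEnabled = useStore(enableContextOptimizationStore);

  const enableContextOptimization = useCallback((enabled: boolean) => {
    enableContextOptimizationStore.set(enabled);
    // Persist the choice so it survives reloads and new sessions.
    Cookies.set('contextOptimizationEnabled', JSON.stringify(enabled));
  }, []);

  return { contextOptimizationEnabled, enableContextOptimization };
}
```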

System Changes

  • Modified the chat API endpoint to accept a contextOptimization parameter
  • Updated streamText to conditionally apply optimization based on the toggle
  • Re-enabled file context integration with system prompts when optimization is active (a rough sketch follows)
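
A rough, self-contained sketch of the server-side idea. Only the `contextOptimization` flag and the "latest files into the system prompt" behaviour come from this PR; every type and helper below is a placeholder rather than the project's real API:

```ts
type Message = { role: 'user' | 'assistant'; content: string };
type FileMap = Record<string, string>; // path -> file contents

const REDACTED = '[file contents redacted; see system prompt]';

function serializeFiles(files: FileMap): string {
  return Object.entries(files)
    .map(([path, contents]) => `<file path="${path}">\n${contents}\n</file>`)
    .join('\n');
}

export function buildPrompt(
  basePrompt: string,
  messages: Message[],
  files: FileMap,
  contextOptimization: boolean,
) {
  if (!contextOptimization) {
    // Toggle off: behave exactly as before, nothing is rewritten.
    return { system: basePrompt, messages };
  }

  // Strip bulky file contents out of the chat history...
  const redacted = messages.map((message) => ({
    ...message,
    content: message.content.replace(/<file[\s\S]*?<\/file>/g, REDACTED),
  }));

  // ...and put a single snapshot of the latest files into the system prompt instead.
  const system = `${basePrompt}\n\nCurrent project files:\n${serializeFiles(files)}`;

  return { system, messages: redacted };
}
```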

Testing

  • Verified toggle functionality persists across sessions
  • Tested chat behavior with optimization enabled and disabled
  • Verified file context handling with various LLM models
  • Confirmed proper state management and API integration

Migration Impact

  • No breaking changes
  • Feature is opt-in through UI toggle
  • Maintains backward compatibility with existing chat functionality
  • No migration steps required for existing installations

Future Improvements

  • Consolidate feature flags into a single configuration map (noted in TODO)
  • Add analytics for feature usage
  • Implement model-specific optimization defaults
  • Add documentation about optimal use cases for different models

Preview

Context.Optimization.demo.mp4

@thecodacus thecodacus added this to the v0.0.4 milestone Dec 26, 2024
@thecodacus thecodacus changed the title feat: redact file contents from chat and placed current files into system prompt feat: redact file contents from chat and put files into system prompt Dec 26, 2024
@thecodacus thecodacus changed the title feat: redact file contents from chat and put files into system prompt feat: redact file contents from chat and put latest files into system prompt Dec 26, 2024
@traflagar

Very important feature!
