
feat: Add model_from_pretrained_kwargs as config parameter #122

Merged 4 commits into jbloomAus:main on May 9, 2024

Conversation

RoganInglis
Contributor

Description

Adds the model_from_pretrained_kwargs config parameter and associated code to allow full control over the HookedTransformer or HookedMamba model used to extract activations. In my specific case this is required in order to load non-default checkpoints for Pythia models.
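As a minimal sketch of the intended behavior (the `load_model` helper and config shape below are simplified stand-ins, not the actual SAELens code; `checkpoint_index` mirrors TransformerLens's option for loading intermediate Pythia checkpoints):

```python
# Sketch: forwarding a `model_from_pretrained_kwargs` config entry to a
# model loader. `load_model` here is a stand-in for
# HookedTransformer.from_pretrained(model_name, **kwargs).

def load_model(model_name, **from_pretrained_kwargs):
    # Record which kwargs reached the loader (a real loader would
    # construct and return the model instead).
    return {"model_name": model_name, **from_pretrained_kwargs}

config = {
    "model_name": "pythia-70m",
    # Extra kwargs passed through verbatim to from_pretrained, e.g. to
    # select a non-final training checkpoint:
    "model_from_pretrained_kwargs": {"checkpoint_index": 10},
}

model = load_model(
    config["model_name"],
    **config.get("model_from_pretrained_kwargs", {}),
)
```

Because the kwargs are forwarded unmodified, any argument accepted by the underlying `from_pretrained` call can be set from the config without further changes to the loading code.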

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist:

  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
    • Not required given existing documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have not rewritten tests relating to key interfaces which would affect backward compatibility

You have tested formatting, typing, and unit tests (acceptance tests are not currently in use)

  • I have run make check-ci to check format and linting. (You can run make format to format code if needed.)

Performance Check.

If you have implemented a training change, please indicate precisely how performance changes with respect to the following metrics:

  • L0
  • CE Loss
  • MSE Loss
  • Feature Dashboard Interpretability

Please link to wandb dashboards with a control and test group.

This change doesn't affect performance for existing code.

@RoganInglis RoganInglis changed the title Allow passing kwargs from config to from_pretrained when loading model to extract activations from feat: Add model_from_pretrained_kwargs as config parameter May 7, 2024

codecov bot commented May 8, 2024

Codecov Report

Attention: Patch coverage is 83.33% with 1 line in your changes missing coverage. Please review.

Project coverage is 64.47%. Comparing base (5c41336) to head (b401f71).
Report is 2 commits behind head on main.

Files Patch % Lines
sae_lens/training/sae_group.py 50.00% 0 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #122      +/-   ##
==========================================
+ Coverage   64.43%   64.47%   +0.04%     
==========================================
  Files          17       17              
  Lines        1777     1782       +5     
  Branches      295      296       +1     
==========================================
+ Hits         1145     1149       +4     
  Misses        567      567              
- Partials       65       66       +1     


@chanind (Collaborator) left a comment:


Great work with this!

@chanind chanind merged commit 094b1e8 into jbloomAus:main May 9, 2024
7 checks passed
tom-pollak pushed a commit to tom-pollak/SAELens that referenced this pull request Oct 22, 2024

* add model_from_pretrained_kwargs config parameter to allow full control over model used to extract activations from. Update tests to cover new cases

* tweaking test style

---------

Co-authored-by: David Chanin <[email protected]>
2 participants