General: SOP or Work Practice Document #2
At NVS, we have a work instruction that outlines the risk validation process for R packages; it is currently under review. Generally speaking, acceptance by QA, and in particular by eCompliance, is helped a lot by the fact that we can leverage an extensive list of documents related to the validation of the underlying scientific computing platform, and by existing business processes that do not absolve statisticians/programmers/data scientists of validation responsibility for the final product they build. E.g., a Shiny app built using validated R packages and hosted on a GxP-qualified platform is (obviously) not automatically validated itself and needs to undergo its own testing if validation is required.
Thanks - I was the asker of this question! (Mike Carniello at Astellas)
At Roche, we don't really have any official, certified documents describing the validation process. We host an internal training website where anyone interested can read about the process overall, the tools we use, the team itself, and so on. It covers each of the validation steps, how to submit a package, and even how to pre-validate a package on your own as preparation for the actual, official validation. Additionally, the report produced by the validation process contains an explanation and walk-through of the entire process.

As for QA and IT and how we convinced them that this is appropriate: frankly, we didn't have to. From the very beginning, we had the full support of our validation and quality teams. We crafted most of the rules and fundamentals of the approach together, weighing the ideas and opinions of experts from the various fields involved. Without the validation and quality teams' support, we wouldn't have been able to prepare the validation report template that we later fill with package metadata.

The validation teams also consist of people with different backgrounds, mixing programming, statistics, and pharma itself. Thanks to that, we find it easier to communicate with particular teams about their needs and ours. Communication is a very important part of this process; I think it's the main reason our IT backs us up, because we approach validation iteratively. Each gap or concern is discussed with the people involved, and the support team that oversees every submission is ready to explain each requirement and why it is necessary. As a result, developers and researchers don't just blindly follow the validation team's decisions, but try to understand the whole process and its consequences together with us.

Sorry for the wall of text, but I tried to explain the details, as our approach seems to be slightly different from the others :)
The writing is on the wall, as it is said and sung! :) Thanks for these notes - you've energized me to really work and think through this topic with my IT and QA colleagues.
Are your assessment/validation approaches written into some formal SOP or Work Practice Document or anything like that? If so - how did you convince QA and IT services that these approaches were okay?!