Security
Niffler's on-demand and real-time DICOM extractions acquire DICOM images containing PHI. Therefore, several security measures must be enforced. Our Niffler production instance implements the security measures listed below, covering the vendors, infrastructure, and security protocols used in its deployment. We advise similar production deployments of Niffler for on-demand and real-time retrieval of DICOM images and clinical data to do the same.
- The facilities housing the vendor's services and the customer data must have appropriate physical barriers (fences, walls, locked doors, gates, etc.) to prevent unauthorized physical access to systems storing or processing customer data or services. The facility is located in the NDB Datacenter.
- Access to the datacenter floor must be restricted using a mechanism that logs physical access by individuals (card swipes, proximity badges, biometrics, etc.).
- Access to the datacenter floor must be monitored with video surveillance, maintaining video records for at least 30 days.
- The facilities housing the vendor's services and the customer data must have a temperature- and humidity-controlled environment and an appropriate fire suppression system.
- Data storage devices (hard drives, USB flash drives, CDs, tapes, etc.) containing customer data must be purged of all data before being discarded or repurposed, using any one of the following methods: DoD-level formatting, cryptographic wiping, degaussing, or physical destruction.
- Network and computer systems must be protected by industry-standard firewalls (e.g., Checkpoint, Juniper) configured to “Deny All” inbound connections except those that are explicitly allowed. These systems live in the NDB Datacenter.
- The firewall application and operating system software updates must be kept current. OS and DB security patches are managed by EHC Infrastructure.
- Hardware must include appropriate levels of internal redundancy (RAID, multiple power supplies, etc.) based on the required service availability and high-availability architecture. The hardware lives in the NDB Datacenter.
- All hardware default vendor accounts, privileges, and passwords must be deleted or changed.
- All security updates for firmware must be kept current.
- Operating systems used on the vendor systems must be currently supported versions for which the author is actively supplying security updates. If addressing a security issue requires an upgrade to a new version, then the upgrade must be without charge to the customer. These systems live in the NDB Datacenter.
- All security updates for the operating system layer (operating systems, device drivers, etc.) must be kept current. These systems live in the NDB Datacenter.
- All operating system default vendor accounts, privileges, and passwords must be deleted or changed. These systems live in the NDB Datacenter.
- User IDs and passwords for system access must: be unique for every employee and not shared or generic; be 8 or more characters in length (requiring at least 1 number and 1 special character, when technically feasible); expire every 180 days; and not be reusable for 10 cycles. Passwords meet the currently published Emory policy.
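As an illustration only, the minimal Python sketch below checks the complexity portion of this policy (length, digit, and special character); the function name and the use of `string.punctuation` as the special-character set are assumptions, and expiry and reuse limits are left to the identity management system.

```python
import string

def meets_password_policy(password: str) -> bool:
    """Check the complexity rules above: at least 8 characters,
    at least 1 number, and at least 1 special character.
    Expiry (180 days) and reuse (10 cycles) are enforced by the
    identity management system, not by this check."""
    if len(password) < 8:
        return False
    if not any(ch.isdigit() for ch in password):
        return False
    # Assumption: "special character" means ASCII punctuation.
    if not any(ch in string.punctuation for ch in password):
        return False
    return True
```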
- Access to the system for remote administration must be restricted to approved users and points of origin, and must use industry-standard encryption for both authentication and the session. This is achieved internally in the system.
- Security software must prevent unauthorized personnel from deleting, changing, or adding system or application software on the vendor systems, through both user privileges and file and directory permissions. This is limited to Infrastructure admins and EUV RTA admins.
- Vendor systems must automatically lock or log off active console sessions after a period of inactivity of no more than 15 minutes. Automatic logout is set at 15 minutes.
- Change management procedures must be in place and followed. An internal change management process has been developed with guidance from Emory Healthcare IS.
- Vendor systems must have security logging enabled that records successful and failed authentications, user management actions (e.g., user creation, password changes), access control changes (e.g., group membership changes, permission changes to critical system and application components), and attempts to access sensitive data. Security event logs must include at least the following pieces of information: user identification, date and time of event, indication of success or failure, origin of event, and identity of affected resource.
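As a rough sketch of what a compliant log record might contain, the Python snippet below emits a structured security event with the fields listed above; the logger name and field names are illustrative assumptions rather than Niffler's actual logging code.

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
security_log = logging.getLogger("security")  # illustrative logger name

def log_security_event(user_id: str, action: str, success: bool,
                       origin: str, resource: str) -> None:
    """Record the minimum required fields: user identification, date and
    time, success or failure, origin of the event, and affected resource."""
    event = {
        "user": user_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,
        "outcome": "success" if success else "failure",
        "origin": origin,
        "resource": resource,
    }
    security_log.info(json.dumps(event))

# Example: a failed authentication attempt.
log_security_event("jdoe", "authentication", False, "10.0.0.15", "niffler-host")
```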
- All security event logs must be analyzed, correlated, and evaluated daily, and archived and stored securely off-site for no less than 60 days.
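The sketch below illustrates only the archival half of this item, assuming logs are plain files in a local directory: current logs are compressed into a staging area (to be copied off-site by a separate process), and archives are pruned only once they are past the 60-day minimum. The paths are placeholders, not the actual layout of our deployment.

```python
import gzip
import shutil
import time
from pathlib import Path

LOG_DIR = Path("/var/log/niffler")              # placeholder log location
ARCHIVE_DIR = Path("/var/log/niffler/archive")  # staging area for off-site copies
RETENTION_SECONDS = 60 * 24 * 3600              # keep archives at least 60 days

def rotate_and_archive() -> None:
    """Compress current log files for off-site storage and delete archives
    only after the 60-day minimum retention window has passed."""
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    for log_file in LOG_DIR.glob("*.log"):
        target = ARCHIVE_DIR / f"{log_file.stem}-{int(time.time())}.log.gz"
        with log_file.open("rb") as src, gzip.open(target, "wb") as dst:
            shutil.copyfileobj(src, dst)
    cutoff = time.time() - RETENTION_SECONDS
    for archived in ARCHIVE_DIR.glob("*.gz"):
        if archived.stat().st_mtime < cutoff:
            archived.unlink()
```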
- All software used on the vendor systems must be currently supported versions for which the author is actively supplying security updates. If addressing a security issue requires an upgrade to a new version, then the upgrade must be without charge to the customer.
- All security updates for the application layer (databases, applications, etc.) must be kept current.
- All application default vendor accounts, privileges, and passwords must be deleted or changed.
- User IDs and passwords for application access must: be unique for every employee and not shared or generic; be 8 or more characters in length (requiring at least 1 number and 1 special character, when technically feasible); expire every 180 days; and not be reusable for 10 cycles. All accounts used to access the servers are unique; password complexity requirements are enforced by Emory policy.
- Change management procedures must be in place and followed. If an update or major change to a server is required, all primary stakeholders are informed and a specific date and time is set for the upgrade to take place.
- Vendor systems must have security logging enabled that records successful and failed authentications, user management actions (e.g., user creation, password changes), access control changes (e.g., group membership changes, permission changes to critical system and application components), and attempts to access sensitive data. Security event logs must include at least the following pieces of information: user identification, date and time of event, indication of success or failure, origin of event, and identity of affected resource. The security event logs are managed by EHC LDAP.
- All security event logs must be analyzed, correlated, and evaluated daily, and archived and stored securely off-site for no less than 60 days. Security logs are maintained daily for all events.
- Vendor must follow a documented Secure Software Development Life Cycle.
- Vendor must ensure that security reviews are included throughout the Software Development Life Cycle.
- Vendor will take actions necessary to protect information against reasonably anticipated threats and to limit the likelihood that vulnerabilities in vendor’s products are exposed.
- Vendor must assign explicit responsibility for overall security of vendor products during development, management, and operation of the products.
- All members of the development team must be trained in secure programming techniques.
- Vendor must conduct an analysis of their products to identify common programming errors and mitigate any errors identified. At a minimum, this analysis must address all common vulnerabilities identified in the current OWASP Top 10 Application Vulnerabilities and the CWE/SANS Top 25 Most Dangerous Programming Errors.
- Vendor must conduct periodic risk assessments to determine and prioritize risks, enumerate vulnerabilities, and understand the impact that attacks might have on vendor’s products, and to ensure that vendor products meet applicable contractual obligations, regulatory mandates, and security best practices and standards.
- Vendor shall share with customer all security-relevant information regarding the vulnerabilities, risks, and threats to vendor products immediately and completely upon identification.
- The Vendor shall use a source code control system that authenticates and logs the team member associated with all changes to the software baseline and all related configuration and build files.
- The Vendor shall use a build process that reliably builds a complete distribution from source. This process shall include a method for verifying the integrity of the software delivered to Client.
- The Vendor shall engage external security experts to conduct penetration tests of Vendor products at least annually. The penetration testing methodology will, at a minimum, address all vulnerabilities identified in the current OWASP Top 10 Application Vulnerabilities and the CWE/SANS Top 25 Most Dangerous Programming Errors. Penetration tests shall also include a security review of the portions of code that are most relevant to the security of Vendor’s product (such as authentication, authorization, and session management modules).
- Post production, the Vendor shall perform quarterly security scans/tests with the most current signature files. We rely on EHC to perform these security scans.
- Vulnerabilities identified via penetration testing, code review, quarterly vulnerability scans, or other means shall be reviewed by the Vendor in a timely manner. Identified vulnerabilities that represent a significant risk to the confidentiality or integrity of customer data shall be corrected by the vendor as soon as possible. Vendor shall make reasonable efforts, commensurate with the level of risk as determined by the Vendor, to correct lesser vulnerabilities within a reasonable time, not to exceed 60 days.
- The Vendor shall maintain written documentation of the results of the scans/tests, along with mitigation plans for identified vulnerabilities, for at least 1 year following the complete mitigation of the vulnerabilities. The Vendor shall make mitigation plans available to customer upon request.
- The Vendor shall provide notification of patches and updates affecting security immediately upon availability, as identified in the patch management process, throughout the software lifecycle.
- The Vendor shall apply, test, and validate the appropriate patches and updates and/or workarounds on a test version of the application before distribution.
- The Vendor shall track all security issues uncovered during the entire software lifecycle, whether a requirements, design, implementation, testing, deployment, or operational issue. The risk associated with each security issue shall be evaluated, documented, and reported to Customer as soon as possible after discovery.
- The Vendor shall resolve all security issues that are identified before delivery. Security issues discovered after delivery shall be handled in the same manner as other bugs and issues, as specified in this Agreement. We will address risks as they are discovered, whether through mitigation or acceptance.
- After acceptance, if security issues are discovered or reasonably suspected, Vendor shall assist Customer in performing an investigation to determine the nature of the issue.
- The vendor must employ a backup process that provides protection from hardware failure, data corruption, or environmental disaster. Recovery from hardware failure or data corruption should take no more than one business day from the detection of the failure or corruption and should result in the loss of no more than one day of data. Recovery from a site-wide disaster should take no more than five business days and result in the loss of no more than three days of data. The VM and DB are backed up by EHC Infrastructure.
- An off-site backup of all customer data and essential operational data must be regularly maintained to recover from a site-wide disaster. The VM and DB are backed up by EHC Infrastructure.
- The vendor's off-site redundant or backup facility must reside outside the range of natural disasters likely to affect the primary facility. The VM and DB are backed up by EHC Infrastructure.
- A documented and tested backup, recovery, and contingency plan must be in place. The plan must address both small-scale events (e.g., hardware failure, data corruption) and site-wide events (e.g., building fire, flooding). The VM and DB are backed up by EHC Infrastructure.
- Encryption must be used for all transactions that require user authentication, transfer of sensitive data, non-console administrative access, or electronic transfer of funds; this should be accomplished using any one of the approved methods: SSL/TLS or Secure Shell (SSH).
- The vendor agrees that any transfer of customer data between the customer and the vendor, or within the vendor's computing environment, will take place using encrypted protocols such as TLS, SSL, SCP, or SFTP. All systems are on the EHC network.
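As a minimal sketch of such an encrypted transfer, the Python snippet below wraps a client socket in TLS with certificate and hostname verification before any data is sent; the host name, port, and payload are placeholders, not real EHC or Niffler endpoints.

```python
import socket
import ssl

HOST = "receiver.example.internal"  # placeholder, not a real EHC host
PORT = 2762                         # placeholder port

# Verify the server certificate against the system trust store and require
# hostname matching, so data only crosses an authenticated, encrypted channel.
context = ssl.create_default_context()

with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        tls_sock.sendall(b"example payload")  # customer data stays encrypted in transit
```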
- The vendor certifies that all backups of the customer’s data will be stored and maintained in an encrypted format using at least a 128-bit key. This is managed by EHC Infrastructure.
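For illustration, the sketch below encrypts a backup file at rest using the `cryptography` package's Fernet recipe, which is built on 128-bit AES; the file paths and key handling are placeholders, and this is not a description of how EHC Infrastructure actually protects backups.

```python
from pathlib import Path

from cryptography.fernet import Fernet  # Fernet encrypts with a 128-bit AES key

BACKUP = Path("backup.tar")        # placeholder backup file
ENCRYPTED = Path("backup.tar.enc")

key = Fernet.generate_key()        # in practice, store the key in a secure key store
fernet = Fernet(key)

ENCRYPTED.write_bytes(fernet.encrypt(BACKUP.read_bytes()))
print("Encrypted backup written to", ENCRYPTED)
```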
- Procedures to remove operating-system-level access must be developed and followed when an employee leaves the vendor’s organization or is terminated. This is managed by EHC Infrastructure.
- Procedures to remove application-level access must be developed and followed when an employee leaves the vendor’s organization or is terminated. The user account should be terminated in EHC LDAP when an employee leaves.
- The vendor must have an information security awareness program and information security policies covering all vendor employees, which should include items such as: protecting customer data, detecting and reporting suspicious activity, safe web and email practices, proper password management, and locking unattended workstations.
- An individual must be assigned responsibility for the development and implementation of security policies and procedures. This individual must have an appropriate combination of information security knowledge, skill, and experience to effectively carry out the assigned responsibilities.
- An annual network and security risk assessment must be completed, documented, and maintained: on ePHI systems and applications for a period of 6 years, and on all other systems for 3 years.
- The vendor must ensure that any and all customer data will be stored, processed, and maintained solely on designated servers and that no customer data at any time will be processed on or transferred to any portable or laptop computing device or any portable storage medium, unless that storage medium is in use as part of the vendor's designated backup and recovery processes.
- The vendor must respond to suspected or known security incidents, mitigate them to the extent practical, and document these incidents and their outcomes.
- Policies and procedures that identify a security incident and require a response must be in place and tested annually, must require reporting of security incidents to the customer within 24 hours of discovery of the incident, and must require all appropriate third-party (e.g., FBI, CERT) communications to be initiated, recorded, and documented.
- In the event of a suspected data breach, the vendor must coordinate with the customer on submitting the system or other data to a third party (law enforcement or commercial entity) for forensic analysis.
- The vendor agrees to notify the customer when any vendor system that may access, process, or store customer data is subject to unauthorized access. Unauthorized access includes compromise by a computer worm, search engine web crawler, password compromise, or access by an individual or automated program due to a failure to secure a system or adhere to established security procedures. The vendor further agrees to notify the customer within twenty-four (24) hours of the discovery of the unauthorized access by providing notice via email to <email address, typically security office or CIO>.
- Vendor agrees to notify the customer within 24 hours if there is a threat to the vendor's product as it pertains to the use, disclosure, and security of the customer’s data.
- Vendor, within one day of discovery, shall report to customer any use or disclosure of sensitive data not authorized by this Addendum or in writing by the customer. Vendor shall identify: (i) the nature of the unauthorized use or disclosure, (ii) the sensitive data used or disclosed, (iii) who made the unauthorized use or received the unauthorized disclosure, (iv) what the vendor has done or shall do to mitigate any deleterious effect of the unauthorized use or disclosure, and (v) what corrective action the vendor has taken or shall take to prevent future similar unauthorized use or disclosure. Vendor shall provide such other information, including a written report, as reasonably requested by the customer.
- Vendor agrees to comply with all applicable laws that require the notification of individuals in the event of unauthorized release of personally identifiable information or other event requiring notification. In the event of a breach of any of vendor's security obligations or other event requiring notification under applicable law ("Notification Event"), the vendor agrees to assume responsibility for informing all such individuals in accordance with applicable law and to indemnify, hold harmless, and defend the customer, its trustees, officers, and employees from and against any claims, damages, or other harm resulting from such Notification Event.
- All vendor employees who interact with systems containing ePHI must receive Security Awareness training in accordance with the HIPAA Security Rule. All members of HITILab are required to complete the university’s 3 HIPAA modules in ELMS.
- The vendor must maintain a record of the movements of hardware and electronic media containing ePHI, noting the person responsible for its movement. All records will be maintained.