The challenge
With the rapid push to integrate generative AI into health system workflows, organizations face challenges unique to healthcare. One of the biggest is compliance: sharing patient data or other sensitive data (e.g., intellectual property, student data) on public AI services violates hospital policies and HIPAA. A large, nationally known academic health system therefore sought to develop a usable internal version of public AI services (like ChatGPT) to create a competitive data advantage while identifying and mitigating the risk of leaking protected health information (PHI).
The approach
- Assembling an Internal Team with Urgency
Vynamic worked with our client to bring together an interdisciplinary group of data experts and stakeholders from multiple departments, and organized a curated series of working sessions to begin developing a product.
- Building a Prototype for Testing
We managed a team of technical experts in developing a project plan, which included selecting users to test a pilot. Research teams and clinicians with use cases in mind used the tool, giving the internal team the feedback needed to better understand and improve its functionality.
- Working with an External Vendor to Compare Make-vs-Buy Options
Vynamic coordinated meetings with Microsoft, a leading commercial vendor in the AI space, and harnessed its expertise to build and customize on top of the base infrastructure it provided.
- Deploying, Communicating & Optimizing Across the Enterprise
Vynamic worked with the central communications team and the technical leads to draft clear, succinct enterprise communications socializing the use of these PHI-safe AI tools, and continued to provide strategic support.
The result
These new AI/ML-driven tools help enable and spur innovation and research while implementing safeguards that meet compliance and security requirements. They are HIPAA compliant and secured for sensitive data input, and they are delivered within the health system's HIPAA-covered Azure tenant and network security perimeter.