The Health Insurance Portability and Accountability Act (HIPAA) is drawing renewed attention as AI and machine learning tools spread through the healthcare industry, and for good reason. Although these tools can provide a more efficient and potentially more reliable means of organizing and analyzing patient data, their use can come at a cost. In this context, that cost could be patient privacy.
Case studies provide examples of violations
But what exactly does a violation look like? Recent examples include a health system executive sentenced to two years' probation and ordered to pay $140,000 in restitution after pleading guilty to impermissibly disclosing protected health information. The executive shared patient data with a collection vendor as part of a project to develop software to aid other healthcare companies.
Another potential violation involves the use of tools like chatbots to help organize notes or draft correspondence. The information a physician enters into such software may not be secure, as many chatbots retain input data to train their models. Although chatbots can be especially helpful in reducing administrative burdens such as correspondence with insurance companies, physicians must use these tools carefully to avoid allegations of violating applicable federal and state regulations.
Tips to avoid a violation
Private practices and healthcare facilities should address this risk proactively by making sure a compliance program is in place. Inform physicians and other healthcare professionals of the expectations and of how they can use these tools while remaining compliant with regulations such as HIPAA.
Attorney John Rivas is responsible for this communication.