Wednesday, December 5, 2012

Natural language processing and electronic health records
Eighty percent of the clinical documentation that exists in healthcare today is unstructured.  It is sometimes referred to as “the text blob” and is buried within electronic health records (EHRs).  The inherent problem with “the text blob” is that locked within it lies an extraordinary amount of key clinical data – valuable information that can and should be leveraged to make more informed clinical decisions, ultimately improving patient care and reducing healthcare costs.  To date, however, because it consists of copious amounts of free text, the healthcare industry has struggled to unlock meaning from “the text blob” without intensive manual analysis, or has chosen to forgo extracting that value entirely.
A study recently published in JAMA, led by Harvey J. Murff and colleagues, validates natural language processing (NLP) technologies as a powerful tool to unlock data – and meaning – from EHRs.  The use of NLP in healthcare, however, is not limited to information queries; while the Murff/JAMA study shows NLP being used to track adverse events after surgery, this use case is just the tip of the iceberg when it comes to the value of NLP technologies for healthcare.
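As a rough illustration of that retrospective use case, the sketch below screens free-text post-operative notes for adverse-event trigger terms.  The term list, note text, and category mapping are invented for illustration; they are not the method used in the Murff/JAMA study.

```python
# Toy retrospective screen of post-operative notes for adverse-event trigger
# terms. The term list and the sample note are invented for illustration.

TRIGGER_TERMS = {
    "pulmonary embolism": "venous thromboembolism",
    "deep vein thrombosis": "venous thromboembolism",
    "wound dehiscence": "surgical site complication",
    "sepsis": "postoperative sepsis",
}

def screen_note(note_text):
    """Return the adverse-event categories whose trigger terms appear in a note."""
    text = note_text.lower()
    return sorted({event for term, event in TRIGGER_TERMS.items() if term in text})

note = ("POD 3: patient short of breath, CT angiogram positive for "
        "pulmonary embolism; started on heparin.")
print(screen_note(note))  # ['venous thromboembolism']
```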
Next-generation clinical technologies: The scope of language processing, the value of understanding
In time, NLP will evolve beyond a query tool.  It will become a well-known, primary component of the point-of-care process, providing doctors with real-time information about the patient whose record they are documenting, and guiding physicians to include the most thorough and accurate patient information as they dictate their notes.  To put NLP in perspective for healthcare, let’s explore three use cases being developed today.
NLP for meaningful use. In order for physicians to qualify for government incentive payments associated with adopting and using EHRs, they must capture specified facts, including things such as problem lists, allergies, smoking status and vital signs.  These facts are oftentimes easy for a physician to capture through a narrative description (via voice), but can prove difficult and time consuming to capture via an EHR system template.  For example, saying a patient is taking a certain medicine is simpler than finding the associated prescription pull-down menu and selecting the corresponding drug, dosage, route and frequency with several clicks of a mouse.  The EHR documentation conundrum thereby becomes a double-edged sword – doctors can document easily and quickly by speaking, but “the text blob” that speaking creates traps information, rendering it unusable, because data outside of a structured format is not actionable.  Beyond the mechanics of entering data within structured EHR formats, a purely structured representation of the patient story falls short of what a care team requires to deliver optimal care.  In fact, in 2009, 96 percent of 1,000 surveyed physicians said they were “concerned” about “losing the unique patient story with the transition to point-and-click (template-driven) EHRs,” and 94 percent said that “including the physician narrative as part of patients’ medical records” is “important” or “very important” to realizing and measuring improved patient outcomes.  Structured documentation, created via template, is easy to analyze and pull facts from, but has proven to be an unnatural means of documentation for doctors and does not capture the nuances of each unique patient story.   Natural speech documentation combined with NLP gives physicians a means to tell the complete patient story, with all its subtleties, while making available all of the clinical facts the EHR needs to operate optimally, as the sketch below illustrates.  NLP delivers the best of both worlds.
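To make the drug, dosage, route and frequency example concrete, here is a minimal sketch of turning a dictated medication statement into structured fields.  The regular expression and the small vocabulary it accepts are simplified assumptions for illustration, not a production clinical grammar.

```python
import re

# Minimal sketch: pull drug, dose, route and frequency out of a dictated
# sentence so structured EHR fields can be filled without template clicks.
# The pattern and the vocabulary it accepts are simplified assumptions.
MED_PATTERN = re.compile(
    r"(?P<drug>[A-Za-z]+)\s+(?P<dose>\d+\s?mg)\s+(?P<route>by mouth|IV|orally)\s+"
    r"(?P<frequency>once daily|twice daily|every \d+ hours)",
    re.IGNORECASE,
)

def extract_medication(narrative):
    """Return structured medication fields found in free-text dictation, if any."""
    match = MED_PATTERN.search(narrative)
    return match.groupdict() if match else None

print(extract_medication("Started amoxicillin 500 mg by mouth twice daily for otitis media."))
# {'drug': 'amoxicillin', 'dose': '500 mg', 'route': 'by mouth', 'frequency': 'twice daily'}
```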
NLP applied to the medical domain is called Clinical Language Understanding, or CLU.  The difference between NLP and CLU is that CLU works off of a complete, highly granular medical ontology, which has been tuned to relate and identify all kinds of medical facts so that the underlying NLP engine can “understand” what the caregiver is saying.  For example, CLU knows that “cancer” is a “disease” and would auto-populate the EHR with that information.  CLU knows that “amoxicillin” is an “antibiotic;” this knowledge is a direct result of the ontology.  NLP for meaningful use, or Clinical Language Understanding, allows doctors to be efficient with documentation, helps to ensure patients’ medical records are comprehensive and are not reduced to purely the structured content created by point-and-click templates, and helps healthcare organizations comply with government regulations, including the HITECH Act, so that care can be optimized and reimbursement can be maximized.
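The “cancer is a disease” example boils down to an ontology lookup.  The sketch below uses a tiny hand-written dictionary as a stand-in for the kind of medical ontology described above; a real CLU system would draw on a full clinical ontology rather than three entries.

```python
# Tiny hand-written stand-in for a medical ontology; the concepts and
# categories below are illustrative, not drawn from a real clinical ontology.
TOY_ONTOLOGY = {
    "cancer": {"concept": "malignant neoplasm", "category": "disease"},
    "amoxicillin": {"concept": "amoxicillin", "category": "antibiotic"},
    "metformin": {"concept": "metformin", "category": "antidiabetic agent"},
}

def classify_term(term):
    """Look up a dictated term and return its concept and category, if known."""
    return TOY_ONTOLOGY.get(term.lower())

print(classify_term("Amoxicillin"))
# {'concept': 'amoxicillin', 'category': 'antibiotic'}
```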
NLP for predictive care. The application of NLP to healthcare can be done in a retrospective manner (after the patient has left the hospital) or in a predictive manner (while the patient is still there).  In the JAMA study, NLP was applied retrospectively, used to query data for broad patient analysis.  In this scenario it is far more difficult to exploit real-time opportunities to impact patient outcomes because, as mentioned above, the analysis occurs after the patient has left.  With advancements taking place today, CLU solutions will move toward decision support that provides immediate feedback to physicians at the point of dictation, whether they are using a digital recorder, PDA, or mobile phone.  For example, if a doctor is documenting a prescription for a patient within the EHR and CLU technology is running in the background, the system might notify the doctor that the patient could have an adverse reaction to that drug and recommend an alternative.  This is one of many examples.
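A bare-bones sketch of that point-of-dictation check is shown below: a dictated drug is compared against the patient’s recorded allergies and an alternative is suggested.  The drug classes, allergy names and alternatives are invented for illustration and are not any vendor’s actual rule set.

```python
# Illustrative point-of-dictation check: flag a dictated drug against the
# patient's recorded allergies and suggest an alternative. The drug classes,
# allergy names and alternatives here are invented for illustration.
DRUG_CLASSES = {
    "amoxicillin": "penicillin",
    "cephalexin": "cephalosporin",
    "azithromycin": "macrolide",
}
ALTERNATIVES = {"penicillin": "azithromycin"}

def check_prescription(drug, patient_allergies):
    """Return a real-time alert if the dictated drug conflicts with an allergy."""
    drug_class = DRUG_CLASSES.get(drug.lower())
    if drug_class in patient_allergies:
        alternative = ALTERNATIVES.get(drug_class, "another agent")
        return f"Alert: patient allergic to {drug_class}; consider {alternative}."
    return "No conflict found."

print(check_prescription("Amoxicillin", {"penicillin"}))
# Alert: patient allergic to penicillin; consider azithromycin.
```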
NLP for effective billing. When applied to billing, NLP can remove a lot of pain from the billing process, for doctors and for coders.  Let’s start with doctors: as they document, the pace of their busy days leaves ample room for vagueness, which can negatively impact patient care and communication with other caregivers, and can complicate billing.  Today, if a doctor is vague with documentation, they might get a phone call three weeks later from a medical coder who is trying to code their documentation for billing purposes.  Chances are the doctor won’t fully remember the extra detail that should have initially been captured, and the exchange will be burdensome and ineffective.  By applying NLP to the documentation process, CLU can scan and understand what the doctor is saying and ask for added specificity or severity when necessary.  For example, if a doctor says a patient had a “fracture of forearm,” did they mean the lower forearm, the right or left forearm, and what was the severity?  By prompting the physician while the details are fresh in his or her mind, the end document will be more complete, which results in improved care, better cross-care communication and more accurate billing, and eliminates that phone call three weeks down the road.  Likewise, for the medical coder, CLU can be used to scan and understand electronic medical records and help auto-code information based on what is documented. For example, what was once dictated as “fracture of forearm” is appropriately elaborated to become “torus fracture of lower end of right radius,” which would be coded “S52.521” under the ICD-10 standard.
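The sketch below walks through that billing workflow: a vague phrase triggers a prompt for specificity, while the fully specified phrase maps to the ICD-10-CM code quoted above.  The prompt wording and the single-entry code table are illustrative assumptions, not a real coding engine.

```python
# Sketch of the billing workflow above: a vague phrase prompts for specificity,
# while a fully specified phrase maps to an ICD-10-CM code. The prompt wording
# and the single-entry code table are illustrative assumptions.
ICD10_CODES = {
    "torus fracture of lower end of right radius": "S52.521",
}

def code_diagnosis(phrase):
    """Return an ICD-10-CM code, or a prompt for more detail if the phrase is vague."""
    code = ICD10_CODES.get(phrase.lower())
    if code:
        return f"Coded as {code}"
    return ("Please specify: which forearm bone, right or left side, "
            "and the fracture type/severity.")

print(code_diagnosis("fracture of forearm"))
print(code_diagnosis("Torus fracture of lower end of right radius"))  # Coded as S52.521
```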
We can’t understand what we don’t know
A fundamental component of the success of NLP in healthcare, and of improved patient care in our future, is high-quality documentation.  Tracking trends in patient care or shaping treatment decisions through better, real-time information cannot happen if we don’t have meaningful data to analyze.  As part of the healthcare industry’s transition to EHRs, step one is making clinical data digital and step two is making digital data meaningful.  Structured information captured within EHRs is incredibly valuable and easy to derive meaning from, but it does not tell the whole story.  Because unstructured text delivered from natural speech is so much a part of our clinical ecosystem, NLP will play a leading role in making “the text blob” meaningful and actionable.  The Murff/JAMA study calls attention to the value of NLP in healthcare and sets the stage for a broad spectrum of scenarios in which we can apply understanding and intelligence technologies to improve care quality, reimbursement and efficiency.
Joe Petro is senior vice president of research and development for Nuance Healthcare.
