Professional Education
Module 3: Safe Prescribing of Opioids and CNS Depressants (Innovations and Smart Approaches in Safe Prescribing)
Learn several new concepts and innovative approaches to safe prescribing, including the updated CDC guideline on prescribing opioids for pain. This program shows how to treat pain, opioid withdrawal, and opioid addiction; how to taper opioids; and how to conduct motivational interviewing. You will examine multimodal approaches to chronic pain, including physical, psych-behavioral, procedural, and pharmacological modalities, and will be encouraged to use tools for prescribing and tapering opioids and benzodiazepines. The program also provides an overview of CURES 2.0 and examines how best to screen patients for use of CNS depressants and marijuana to mitigate the potential for contraindicated combinations with opioids. This module consists of seven audio-narrated videos. To successfully complete this course, you must achieve a passing score of 80 percent on the post-test questions.
Sep 05, 2025
The Foundation Board Welcomes New Director Ami Parekh, MD, JD
We’re pleased to welcome Ami Parekh, MD, JD, to The Doctors Company Foundation Board of Directors. Dr. Parekh is the Chief Health Officer at Included Health, where she leads its national primary care, urgent care, behavioral health, clinical navigation, and population health management practices.
From The Doctor’s Advocate
Shoulder Dystocia Documentation: Implementing a Protocol
Shoulder dystocia claims have traditionally been among the most problematic to defend.
The Importance of Clinical Health Information at the Point of Radiology Order
Having ready access to patients’ clinical information helps radiologists eliminate assumptions and apply their skills and expertise in rendering accurate interpretive reports.
November 13, 2025, Inside Medical Liability
MPL Case: Could Timely Diagnosis Have Preserved This Patient’s Vision?
Despite thorough subject-matter knowledge, physicians can miss the diagnosis of a familiar condition because of lapses in clinical judgment, which may be influenced by distraction, interruption, or team communication problems. A new study of malpractice allegations against ophthalmologists suggests that when practices build teamwork skills, they strengthen patient safety and mitigate practice risks.
February 19, 2025, JAMA Network Open
Ambient Listening—Legal and Ethical Issues
Ambient listening, which involves using AI to record and analyze conversations between clinicians and patients, is one area of early AI adoption among healthcare professionals. I. Glenn Cohen, JD; Julie Ritzman, MBA, CPHRM; and Richard F. Cahill, JD, provide a comprehensive analysis of the legal and ethical considerations associated with the use of ambient listening technologies in healthcare settings.
Healthcare Cybersecurity Resources
It’s not a matter of if a data breach will occur in your medical practice—it’s a matter of when. Make sure your practice is prepared with these resources.
Professional Education
Shoulder Dystocia Clinician-Patient Disclosure
This enduring activity is designed to assist physicians and advanced practice clinicians (APCs) in enhancing their communication skills when disclosing a shoulder dystocia injury to patients and family members. This type of injury to the infant may, unfortunately, occur despite the best of care; however, effective physician-patient communication is an integral part of clinical practice and has been shown to positively influence outcomes by increasing patient understanding and trust.
Randall Nukk
Randall Nukk is physician niche president for Gallagher Healthcare in Itasca, Illinois. His areas of expertise include medical malpractice insurance as well as coverage and exposure analysis.
October 30, 2025, Medical Economics
AI, Malpractice, and the Future of Physician Liability
Artificial intelligence (AI) is entering everyday care, raising questions about malpractice. Deepika Srivastava, Chief Operating Officer at The Doctors Company, discusses how AI could redefine the standard of care, what happens when an algorithm contributes to patient harm, and practical steps physicians can take now to protect themselves, including documentation, communication, and clear internal policies.