
Commit

add citations
monicagerber committed Dec 22, 2023
1 parent fd10298 commit 240ba2e
Showing 2 changed files with 80 additions and 3 deletions.
6 changes: 3 additions & 3 deletions 04b-AI_Policy-ai_regs_and_laws.Rmd
@@ -88,12 +88,12 @@ Each educational institution and classroom is adapting to AI differently. The Mi
## Healthcare

The health care industry is one where the speed of technology development has led to gaps in regulation, and the US recently released an Executive Order about creating [healthcare-specific AI policies](https://www.whitehouse.gov/briefing-room/blog/2023/12/14/delivering-on-the-promise-of-ai-to-improve-health-outcomes/).
- The U.S. Food and Drug Administration (FDA) regulates AI-enabled medical devices and software used in disease prevention, diagnosis, and treatment. However, there are serious concerns about the adequacy of current regulation, and many other AI-enabled technologies that may have clinical applications fall outside the scope of FDA regulation. Other federal agencies, such as the Health and Human Services Office for Civil Rights, have important roles in the oversight of some aspects of AI use in health care, but their authority is limited. Additionally, there are existing federal and state laws and regulations, such as the Health Insurance Portability and Accountability Act (HIPAA), that impact the use and development of AI. This patchwork landscape of federal and state authority and existing laws has led the American Medical Association (AMA) to advocate for a “whole government” approach to implement a comprehensive set of policies to ensure that “the benefits of AI in health care are maximized while potential harms are minimized.”
+ The U.S. Food and Drug Administration (FDA) regulates AI-enabled medical devices and software used in disease prevention, diagnosis, and treatment. However, there are serious concerns about the adequacy of current regulation, and many other AI-enabled technologies that may have clinical applications fall outside the scope of FDA regulation (@habib2023; @ama2023). Other federal agencies, such as the Health and Human Services Office for Civil Rights, have important roles in the oversight of some aspects of AI use in health care, but their authority is limited. Additionally, there are existing federal and state laws and regulations, such as the Health Insurance Portability and Accountability Act (HIPAA), that impact the use and development of AI. This patchwork landscape of federal and state authority and existing laws has led the American Medical Association (AMA) to advocate for a “whole government” approach to implement a comprehensive set of policies to ensure that “the benefits of AI in health care are maximized while potential harms are minimized” (@healthcareradio2023).

The AMA and health care leaders have highlighted the importance of specialized expertise in the oversight and adoption of AI products in health care delivery and operations. For example, Dr. Nigam Shah and colleagues call for the medical community to take the lead in defining how LLMs are trained and developed:

- > By not asking how the intended medical use can shape the training of LLMs and the chatbots or other applications they power, technology companies are deciding what is right for medicine.
+ > By not asking how the intended medical use can shape the training of LLMs and the chatbots or other applications they power, technology companies are deciding what is right for medicine (@nigam2023).
- The medical community should actively shape the development of AI-enabled technologies by advocating for clinically informed standards for the training of AI, and for the evaluation of the value of AI in real-world health care settings. At an institutional level, specialized clinical expertise is required to create policies that align AI adoption with standards for health care delivery. And in-depth knowledge of the U.S. health insurance system is required to understand how complexity and lack of standardization in this landscape may impact AI adoption in clinical operations. In summary, health care leaders and the medical community need to play an active role in the development of new AI regulations and policy.
+ The medical community should actively shape the development of AI-enabled technologies by advocating for clinically informed standards for the training of AI, and for the evaluation of the value of AI in real-world health care settings. At an institutional level, specialized clinical expertise is required to create policies that align AI adoption with standards for health care delivery. And in-depth knowledge of the U.S. health insurance system is required to understand how complexity and lack of standardization in this landscape may impact AI adoption in clinical operations (@schulman2023). In summary, health care leaders and the medical community need to play an active role in the development of new AI regulations and policy.

# VIDEO AI acts, orders, and policies
77 changes: 77 additions & 0 deletions book.bib
@@ -806,3 +806,80 @@ @misc{zoom2023
language = {en-US},
urldate = {2023-12-20},
}


@article{habib2023,
author = {Habib, Anand R. and Gross, Cary P.},
title = "{FDA Regulations of AI-Driven Clinical Decision Support Devices Fall Short}",
journal = {JAMA Internal Medicine},
volume = {183},
number = {12},
pages = {1401-1402},
year = {2023},
month = {12},
abstract = "{We are entering a new era of computerized clinical decision support (CDS) tools. Companies are increasingly using artificial intelligence and/or machine learning (AI/ML) to develop new CDS devices, which are defined by the US Food and Drug Administration (FDA) as software used in disease prevention, diagnosis, or treatment. Recognizing the potential implications for clinical practice, the 21st Century Cures Act enjoined the FDA to regulate these new devices. In their case series reported in this issue of JAMA Internal Medicine, Lee and colleagues analyzed the evidence supporting FDA approval of 10 AI/ML CDS devices intended for use in critical care. Their findings are worrisome. Only 2 device authorizations cited peer-reviewed publications, and only 1 outlined a detailed safety risk assessment. No company provided software code to enable independent validation, evaluated clinical efficacy, or assessed whether the use of algorithms exacerbates health disparities.}",
issn = {2168-6106},
doi = {10.1001/jamainternmed.2023.5006},
url = {https://doi.org/10.1001/jamainternmed.2023.5006},
eprint = {https://jamanetwork.com/journals/jamainternalmedicine/articlepdf/2810620/jamainternal\_habib\_2023\_er\_230003\_1701463607.95585.pdf},
}

@misc{healthcareradio2023,
title = {AMA issues new principles for AI development, deployment \& use},
url = {https://www.healthcarenowradio.com/ama-issues-new-principles-for-ai-development-deployment-use/#:~:text=Key%20concepts%20outlined%20by%20the,governance%20of%20health%20care%20AI.},
journal = {HealthcareNOWradio.com},
author = {{Industry News}},
year = {2023},
month = dec,
}

@misc{ama2023,
title = {AMA issues new principles for AI development, deployment \& use},
url = {https://www.ama-assn.org/press-center/press-releases/ama-issues-new-principles-ai-development-deployment-use},
journal = {American Medical Association},
author = {{American Medical Association}},
year = {2023},
month = nov,
language = {en},
}

@article{nigam2023,
author = {Shah, Nigam H. and Entwistle, David and Pfeffer, Michael A.},
title = "{Creation and Adoption of Large Language Models in Medicine}",
journal = {JAMA},
volume = {330},
number = {9},
pages = {866-869},
year = {2023},
month = {09},
abstract = "{There is increased interest in and potential benefits from using large language models (LLMs) in medicine. However, by simply wondering how the LLMs and the applications powered by them will reshape medicine instead of getting actively involved, the agency in shaping how these tools can be used in medicine is lost. Applications powered by LLMs are increasingly used to perform medical tasks without the underlying language model being trained on medical records and without verifying their purported benefit in performing those tasks. The creation and use of LLMs in medicine need to be actively shaped by provisioning relevant training data, specifying the desired benefits, and evaluating the benefits via testing in real-world deployments.}",
issn = {0098-7484},
doi = {10.1001/jama.2023.14217},
url = {https://doi.org/10.1001/jama.2023.14217},
eprint = {https://jamanetwork.com/journals/jama/articlepdf/2808296/jama\_shah\_2023\_sc\_230004\_1693922864.71803.pdf},
}

@article{schulman2023,
author = {Schulman, Kevin A. and Nielsen, Jr, Perry Kent and Patel, Kavita},
title = "{AI Alone Will Not Reduce the Administrative Burden of Health Care}",
journal = {JAMA},
volume = {330},
number = {22},
pages = {2159-2160},
year = {2023},
month = {12},
abstract = "{Large language models (LLMs) are some of the most exciting innovations to come from artificial intelligence research. The capacity of this technology is astonishing, and there are multiple different use cases being proposed where LLMs can solve pain points for physicians—everything from assistance with patient portal messages to clinical decision support for chronic care management to compiling clinical summaries. Another often discussed opportunity is to reduce administrative costs such as billing and insurance-related costs in health care. However, before jumping into technology as a solution, considering why the billing process is so challenging in the first place may be a better approach. After all, the prerequisite for a successful LLM application is the presence of “useful patterns” in the data.}",
issn = {0098-7484},
doi = {10.1001/jama.2023.23809},
url = {https://doi.org/10.1001/jama.2023.23809},
eprint = {https://jamanetwork.com/journals/jama/articlepdf/2812255/jama\_schulman\_2023\_vp\_230150\_1701364722.65094.pdf},
}
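
The @key citations added in 04b-AI_Policy-ai_regs_and_laws.Rmd resolve against these book.bib entries when the book is rendered. As a minimal sketch of the wiring, assuming the project's index.Rmd (or equivalent YAML header) points Pandoc at this file; the link-citations setting is illustrative and not part of this commit:

bibliography: book.bib
link-citations: true   # illustrative; makes in-text citations link to the reference list

With this in place, an in-text citation such as (@habib2023; @ama2023) is replaced with the formatted reference at build time, and the corresponding book.bib entries are pulled into the book's bibliography.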







