
HIMSS Senior Vice President on Medical Ethics: Ignore Health IT Downsides for the Greater Good

The Healthcare Information and Management Systems Society (HIMSS) is the largest health IT vendor trade group in the U.S.  In a Sept. 21, 2012 HIMSS blog post, John Casillas, Senior Vice President of HIMSS Financial-Centered Systems and the HIMSS Medical Banking Project, dismisses concerns about health IT with the refrain:

... To argue that the existence of something good for healthcare in many other ways, such as having the right information at the point of care when it’s needed, is actually bad because outliers use it to misrepresent claims activity is deeply flawed.

Through the best use of health IT and management systems, we have the opportunity to improve the quality of care, reduce medical errors and increase patient safety. Don’t let the arguments of some cast a cloud over the critical importance and achievement of digitizing patient health records.

Surely, no one can argue paper records are the path forward. Name one other industry where this is the case. I can’t.

Let’s not let the errors of a few become the enemy of good.

The ethics of these statements from a non-clinician are particularly perverse.

The statement "Don’t let the arguments of some cast a cloud over the critical importance and achievement of digitizing patient health records" is particularly troubling.

When those "some" include organizations such as FDA (see FDA's internal 2010 memo on HIT risks, link) and the IOM's Committee on Patient Safety and Health Information Technology (see its 2012 report on health IT safety, link), both of which state that harms are definite but of unknown magnitude due to systematic impediments to collecting the data, and the ECRI Institute, which has kept health IT on its "top ten healthcare technology risks" list for several years running (link), the dismissal of "clouds" is unethical on its face.

These reports indicate that nobody knows whether today's EHRs improve or worsen outcomes compared with good paper record systems.  The evidence is certainly conflicting (see here).

It also means that the current hyper-enthusiasm to roll out this software nationwide in its present state could very likely be at the expense of the unfortunate patients who find themselves as roadkill on the way to the unregulated health IT utopia.

That's not medicine, that's perverse human subjects experimentation without safeguards or consent.

As a HC Renewal reader noted:

Astounding hubris, although it does seem to be effective.  Such is PC hubris.  Who could ever call for reducing the budget of the NIH that is intended to improve health.  Has health improved?  No.

So why does a group with spotty successes if not outright failure never get cut?  It’s not the results, it’s the mission that deserves the funding.  So it’s not the reality of HIT, it’s the promise, the mission, that gets the support.  Never mind the outcome, it’s bound to improve with the continued support of the mission.

Is this HIMSS VP aware of these reports?  Does he even care?

Does he believe patients harmed or killed as a result of bad health IT (and I know of a number of cases personally through my advocacy work, including, horribly, cases involving infants and the elderly) are gladly sacrificing themselves for the greater good of IT progress?

It's difficult to draw any conclusion from health IT excuses such as those proffered, other than that he and HIMSS simply don't care about the unintended consequences of health IT.

Regarding "Surely, no one can argue paper records are the path forward" - well, yes, I can.  (Not the path 'forward', but the path for now, at least, until health IT is debugged and its adoption and effects are better understood.)  And I did so argue, in my recent posts "Good Health IT v. Bad Health IT: Paper is Better Than The Latter" and "A Good Reason to Refuse Use of Today's EHR's in Your Health Care, and Demand Paper".  I wrote:

I opine that the elephant in the living room of health IT discussions is that bad health IT is infrequently, if ever, made a major issue in healthcare policy discussions.

I also opine that bad health IT is far worse, in terms of diluting and decreasing the quality and privacy of healthcare, than a very good or even average paper-based record-keeping and ordering system.  


This is a simple concept, but I believe it needs to be stated explicitly. 

A "path forward" that does not take into account these issues is the path forward of the hyper-enthusiastic technophile who either deliberately ignores or is blinded to technology's downsides, ethical issues, and repeated local and mass failures.

If today's health IT is not ready for national rollout - e.g., it causes harms of unknown magnitude (see this query link), results in massive breaches of security as described in the "Good Reason" post above, and produces mayhem such as at this link - then:

The best - and most ethical - option is to slow down HIT implementation and allow paper-based organizations and clinicians to continue to resort to paper until these issues are resolved.  Resolution needs to occur in lab or experimental clinical settings without putting patients at risk - and with their informed consent.

Anything else is akin to the medical experimentation abuses of the past that led to current research subjects protections such as the "Ethical Guidelines & Regulations" used by NIH.

-- SS

Old Mystery Solved? Former FDA Reviewer Speaks Out About Intimidation, Retaliation and Marginalizing of Safety

At my Dec. 2005 post "Report: Life Science Manufacturers Adapt to Industry Transition" I wrote:

... The recognition of a gap in formally-trained medical informatics personnel in the pharmaceutical industry [by Gartner Group] is welcome. For example, from my own experience:

I recall an interview I had last year with the head of the Drug Surveillance & Adverse Events department at Merck Research Labs in a rehire situation [after a 2003 layoff]. I came highly recommended by an Executive Director in the department, to whom I had shown my prior work. This included well-accepted, novel human-computer interaction designs I'd developed for use by busy biomedical researchers for a large clinical study in the Middle East, as well as my work modeling invasive cardiology and leading the development and implementation of a comprehensive information system to detect new device and treatment modality risks in a regional center performing more than 6,000 procedures/year. In addition, I'd worked with the wife of the Executive Director in years prior, when she ran the E.R. of the hospital where I was director of occupational medicine.

Despite all this in my favor, the Executive Director's boss, himself a former FDA adverse events official [a former deputy director of CDER’s office of drug safety, who'd recently moved to the pharma industry he once regulated - ed.], dismissed me in five minutes as I was showing him the cardiology project, saying flatly "we don't need a medical informatics person here." I had driven 80 miles to Rahway for this interview to save the executive a trip to Pennsylvania, where I was originally scheduled to come for the interview, since the executive's father was ill in the hospital. In an instance of profound social ineptness, my effort was not even acknowledged. Perhaps he was in a bad frame of mind, but the dismissal under the circumstances was all the more disappointing.

I recall this was one of the most puzzling hiring debacles I'd ever experienced, as all the senior people in his dept. had recommended he hire me - I was really only there for his approval and signoff - and the work I'd shown him had improved care, saved lives, and saved money.

I may not need to be puzzled any longer.  This story just appeared:

Former FDA Reviewer Speaks Out About Intimidation, Retaliation and Marginalizing of Safety
By Martha Rosenberg, Truthout
July 29, 2012

The Food and Drug Administration (FDA) is often accused of serving industry at the expense of consumers. But even FDA defenders are shocked by reports this week of an institutionalized FDA spying program on its own scientists, lawmakers, reporters and academics that included an enemies list of "actors" and collaborators.

... Ronald Kavanagh [FDA drug reviewer from 1998 to 2008]:  ... In the Center for Drugs [Center for Drug Evaluation and Research or CDER], as in the Center for Devices, the honest employee fears the dishonest employee. There is also irrefutable evidence that managers at CDER have placed the nation at risk by corrupting the evaluation of drugs and by interfering with our ability to ensure the safety and efficacy of drugs ... While I was at FDA, drug reviewers were clearly told not to question drug companies and that our job was to approve drugs.

Read the entire story at the link.  I won't cover it more here, except to say it's certainly possible to believe certain FDA officials don't want serious people around -- people who, in addition to being MDs, can write serious software to detect drug and device problems, and whose work can get in the way of drug approvals.

-- SS

Health IT FDA Recall: Philips Xcelera Connect - Incomplete Information Arriving From Other Systems

Another health IT FDA recall notice, this time on middleware, an interface engine that routes data:

Week of July 11, 2012

Product description:  

Philips Xcelera Connect, Software R2.1 L 1 SP2, an interface engine for data exchange [a specialized computer and accompanying software package - ed.]. Philips Xcelera Connect R2.x is a generic interface and data mapping engine between a Hospital Information System (HIS), Imaging Modalities, Xcelera PACS and Xcelera Cath Lab Manager (CLM). This interface engine simplifies the connection by serving as a central point for data exchange. The data consists only of demographic patient information, schedules, textual information and text reports.

Classification:  Class II

Reason for Recall: Xcelera Connect R2.1 L 1 SP2, incomplete information arriving from unformatted reports interface

The data consists "only" of demographic patient information, schedules, textual information and text reports?

This is a dangerous fault mode, indeed.

"Incomplete information" moving between a hospital information system, imaging systems, a PACS system used to manage the images, and a cardiac cath lab can lead to very bad outcomes (and million dollar lawsuits), such as at "Babies' deaths spotlight safety risks linked to computerized systems", second example.

Note that the interface engine is in release 2.1, level 1, service pack 2.

In other words, a critical hardware/software product such as this undergoes constant tweaking (like Windows).

As a Class II device, at least the software is vetted to some degree by FDA:

Class II devices are those for which general controls alone are insufficient to assure safety and effectiveness, and existing methods are available to provide such assurances. In addition to complying with general controls, Class II devices are also subject to special controls. A few Class II devices are exempt from the premarket notification. Special controls may include special labeling requirements, mandatory performance standards and postmarket surveillance. Devices in Class II are held to a higher level of assurance than Class I devices, and are designed to perform as indicated without causing injury or harm to patient or user. Examples of Class II devices include powered wheelchairs, infusion pumps, and surgical drapes.

One wonders how testing of tweaks and updates to this product is done, if at all, other than on live and unsuspecting patients.

When you go into the hospital you are not just putting your life in the hands of the doctors and nurses, you're putting your life into the hands of computer geeks and software development experiments.

-- SS

July 25, 2012 Addendum:

The WSJ covered this here:  http://blogs.wsj.com/cio/2012/07/20/philips-recalls-flawed-patient-data-system/.  From their report:

... The problem that led to the recall: hitting the “enter” button, to start a new paragraph, in the summary field of heart test reports, sometimes caused the text entered below that point to be stripped from the report as it was transmitted into the patient’s electronic health record. And doctors later reviewing the patient’s electronic health record would not necessarily know they had received only part of the report, which could lead them to make “incorrect treatment decisions,” Philips said in a letter to hospitals.

...  Mike Davis, managing director at The Advisory Board Company, a healthcare research firm, says in the case of the Xcelera Connect, Philips should have caught the problem in testing. “How the hell does this get out? It shows there wasn’t good quality assurance processes in place.”

Indeed.
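The failure mode the WSJ describes is consistent with a classic interface-engine pitfall: in HL7 v2 messaging, the carriage return is the segment delimiter, so free text containing an unescaped line break can be silently truncated at the break when the receiving system parses the message. The sketch below is a hypothetical illustration of that general pitfall, not Philips' actual code; the function names, message structure, and escaping approach are my assumptions.

```python
# Hypothetical illustration of an HL7 v2 free-text truncation pitfall.
# In HL7 v2, the carriage return (\r) delimits segments, so a report summary
# containing a raw line break must be escaped (e.g., as \.br\) before being
# placed in an OBX-5 field. If the sending interface skips that step, the
# receiver's segment parser discards everything after the break.

def build_obx_unsafe(report_text: str) -> str:
    """Naive mapping: drops the report text into OBX-5 unescaped."""
    return f"OBX|1|TX|REPORT^Summary||{report_text}||||||F"

def build_obx_safe(report_text: str) -> str:
    """Escapes line breaks with the HL7 formatted-text break sequence."""
    escaped = (report_text.replace("\r\n", r"\.br\ ")
                          .replace("\r", r"\.br\ ")
                          .replace("\n", r"\.br\ "))
    return f"OBX|1|TX|REPORT^Summary||{escaped}||||||F"

def receiver_obx5(message: str) -> str:
    """A downstream parser splits on \r and keeps only the OBX-5 field."""
    first_segment = message.split("\r")[0]   # truncation happens here
    return first_segment.split("|")[5]

summary = "LV function normal.\rRECOMMEND urgent follow-up angiography."

print(receiver_obx5(build_obx_unsafe(summary)))  # "LV function normal." -- recommendation silently lost
print(receiver_obx5(build_obx_safe(summary)))    # full text survives, with \.br\ marking the break
```

A unit or integration test that simply round-trips a multi-line summary through the mapping would catch this class of defect before release, which is presumably the kind of quality assurance gap Mr. Davis is pointing to.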

-- SS

FDA: Software Failures Responsible for 24% Of All Medical Device Recalls

At "FDA: Software Failures Responsible for 24% Of All Medical Device Recalls" via Kaspersky Lab, a software security company, it is observed (emphases mine):

Software failures were behind 24 percent of all the medical device recalls in 2011, according to data from the U.S. Food and Drug Administration, which said it is gearing up its labs to spend more time analyzing the quality and security of software-based medical instruments and equipment.

The FDA's Office of Science and Engineering Laboratories (OSEL) released the data in its 2011 Annual Report on June 15, amid reports of a compromise of a Web site used to distribute software updates for hospital respirators. The absence of solid architecture and "principled engineering practices" in software development affects a wide range of medical devices, with potentially life-threatening consequences, the Agency said. In response, FDA told Threatpost that it is developing tools to disassemble and test medical device software and locate security problems and weak design.

... "Manufacturers are responsible for identifying risks and hazards associated with medical device software (or) firmware, including risks related to security, and are responsible for putting appropriate mitigations in place to address patient safety," the agency said in an e-mail statement.

Health IT medical devices are the exception, of course.  A health IT virtual medical device is always of rock-solid architecture, always built with "principled engineering practices," and never has life-threatening consequences.

Hence health IT's special regulatory accommodations relative to non-virtual (tangible) medical devices.

-- SS

WSJ "There's a Medical App for That—Or Not" - Misinformation on Health IT Safety Regulation?

There's a health IT meme that just won't die (patients may, but not the meme).

It's the meme that health IT "certification" is a certification of safety.

I expressed concern about the term "certification" being misunderstood even before the meme formally appeared, when the term was adopted by HHS with regard to evaluation of health IT for adherence to the "meaningful use" pre-flight features checklist.  See my mid-2009 post "CCHIT Has Company" where I observed:

HIT "certification." ... is a term I put in quotes since it really is "features qualification" at this point, not certification such as a physician receives after passing Specialty Boards.

The "features qualification" is an assurance that the EHR functions in a way that could enable an eligible provider or eligible hospital to meet the Centers for Medicare & Medicaid Services' (CMS) requirements of "Meaningful Use."  No rigorous safety testing in any meaningful sense is done, and no testing under real-world conditions is done at all.

I've seen the meme in various publications and venues.  I've even seen it in legal documents in medical malpractice cases where EHRs were involved, as an attempted defense.

Now the WSJ has fallen for the health IT Certification meme.

An article "There's a Medical App for That—Or Not" was published on May 29, 2012.  Its theme is special regulatory accommodation for health IT in the form of opposition to FDA regulation of devices such as "portable health records and programs that let doctors and patients keep track of data on iPads."

In the article, this assertion about health IT "certification" is made:

... The FDA's approach to health-information technology risks snuffing out activity at a critical frontier of health care. Poor, slow regulation would encourage programmers to move on, leaving health care to roil away for yet another generation, fragmented, disconnected and choking on paperwork.

The process already exists for safeguarding the public for computers in health care. It's not FDA premarket review but the health information technology certification program, established under President George W. Bush and still working fine under the Obama Health and Human Services Department. The government sets the standards and an independent nonprofit [ATCB, i.e., ONC Authorized Testing and Certification Bodies - ed.] ensures that apps meet those standards. It's a regulatory process as nimble as the breakout industry it's meant to monitor. That is where and how these apps should be regulated.

It's a wonderful meme.  Unfortunately, it's wrong.  Dead wrong.

Certification by an ATCB does not "safeguard the public."   Two ONC Authorized Testing and Certification Bodies (ATCBs) admitted this in email, as described in my Feb. 2012 post "Hospitals and Doctors Use Health IT at Their Own Risk - Even if Certified".  I had asked them, point-blank:

"Is EHR certification by an ATCB a certification of EHR safety, effectiveness, and a legal indemnification, i.e., certifying freedom from liability for EHR use of clinical users or organizations? Or does it signify less than that?"

I received two replies from major ONC ATCB's indicating that "certification" is merely assurance that HIT meets a minimal set of "meaningful use" guidelines, not that it's been vetted for safety.  For instance:

From: Joani Hughes (Drummond Group)
Sent: Monday, March 05, 2012 1:06 PM
To: Scot Silverstein
Subject: RE: EHR certification question

Per our testing team:

It is less than that. It does not address indemnification although a certification could be used as a conditional part of some other form of indemnification function, such as a waiver or TOA, but that is ultimately out of the scope of the certification itself. Certification in this sense is an assurance that the EHR functions in way that could enable an eligible provider or eligible hospital to meet the CMS requirements of Meaningful Use Stage 1. Or to restate it more directly, CMS is expecting eligible providers or eligible hospitals to use their EHR in “meaningful way” quantified by various quantitative measure metrics and eligible providers or eligible hospitals can only be assured they can do this if they obtain a certified EHR technology.

Please let me know if you have any questions.

Thank you,
Joani.

Joani Hughes
Client Services Coordinator
Drummond Group Inc.

The other ATCB, ICSA Labs, stated that:

... Certification by an ATCB signifies that the product or system tested has the capabilities to meet specific criteria published by NIST and approved by the Office of the National Coordinator. In this case the criteria are designed to support providers and hospitals achieve "Meaningful Use." A subset of the criteria deal with the security and patient privacy capabilities of the system.

Here is a list of the specific criteria involved in our testing:
http://healthcare.nist.gov/use_testing/effective_requirements.html

In a nutshell, ONC-ATCB Certification deals with testing the capabilities of a system, some of them relate to patient safety, privacy and security functions (audit logging, encryption, emergency access, etc.).

What was suggested in the email below (freedom from liability for users of the system, etc.) would be out of scope for ONC-ATCB testing based on the given criteria. [I.e., certification criteria - ed.] I hope that helps to answer your question.

I had noted that:

... My question was certainly answered [by the ATCB responses]. ONC certification is not a safety validation, such as in a document from NASA on aerospace software safety certification, "Certification Processes for Safety-Critical and Mission-Critical Aerospace Software" (PDF) which specifies at pg. 6-7:
In order to meet most regulatory guidelines, developers must build a safety case as a means of documenting the safety justification of a system. The safety case is a record of all safety activities associated with a system throughout its life. Items contained in a safety case include the following:

• Description of the system/software
• Evidence of competence of personnel involved in development of safety-critical software and any safety activity
• Specification of safety requirements
• Results of hazard and risk analysis
• Details of risk reduction techniques employed
• Results of design analysis showing that the system design meets all required safety targets
• Verification and validation strategy
• Results of all verification and validation activities
• Records of safety reviews
• Records of any incidents which occur throughout the life of the system
• Records of all changes to the system and justification of its continued safety

A CCHIT ATCB juror, a physician informatics specialist, has also done a guest post in Jan. 2012 on HC Renewal about the certification process, reproducing his testimony to HHS on the issue.  That post is "Interesting HIT Testimony to HHS Standards Committee, Jan. 11, 2011, by Dr. Monteith."  Dr. Monteith testified (emphases mine):

... I’m “pro-HIT.” For all intents and purposes, I haven’t handwritten a prescription since 1999.

That said and with all due respect to the capable people who have worked hard to try to improve health care through HIT, here’s my frank message:

ONC’s strategy has put the cart before the horse. HIT is not ready for widespread implementation. 

... ONC has promoted HIT as if there are clear evidence-based products and processes supporting widespread HIT implementation.

But what’s clear is that we are experimenting…with lives, privacy and careers.

... I have documented scores of error types with our certified EHR, and literally hundreds of EHR-generated errors, including consistently incorrect diagnoses, ambiguous eRxs, etc.

As a CCHIT Juror, I’ve seen an inadequate process. Don’t get me wrong, the problem is not CCHIT. The problem stems from MU.

EHRs are being certified even though they take 20 minutes to do a simple task that should take about 20 seconds to do in the field.  [Which can contribute to mistakes and "use error" - ed.] Certification is an “open book” test. How can so many do so poorly?

For example, our EHR is certified, even though it cannot generate eRxs from within the EHR, as required by MU.

To CCHIT’s credit, our EHR vendor did not pass certification. Sadly, our vendor went to another certification body, and now they’re certified.

MU does not address many important issues. Usability has received little more than lip-service. What about safety problems and reporting safety problems? What about computer generated alerts, almost all of which are known to be ignored or overridden (usually for good reason)?
 
The concept of “unintended consequences” comes to mind.

All that said, the problem really isn’t MU and its gross shortcomings, it is ONC trying to do the impossible:

ONC is trying to artificially force a cure for cancer, basically trying to promote one into being, when in fact we need to let one evolve through an evidence-based, disciplined process of scientific discovery and the marketplace.

Needless to say, as was learned at great cost in past decades, a "disciplined process" in medicine includes meaningful safety regulation by objective outside experts.

Further, the certifiers have no authority to do important things such as forcibly remove dangerous software from the market.  An example is the forced Class I recall of a defective system, which I wrote about in my Dec. 2011 post "FDA Recalls Draeger Health IT Device Because This Product May Cause Serious Adverse Health Consequences, Including Death".   Class I recalls are the most serious type of recall and involve situations in which there is a reasonable probability that use of these products will cause serious adverse health consequences or death.

In that situation, the producer had simply been advising users (in critical care environments, no less) to "work around" defects that could produce incorrect recommended dosage values for critical meds, including a drug dosage up to ten times the indicated dosage, as well as corrupted critical cardiovascular monitoring data.  As I observed:

... I find a software company advising clinicians to make sure to "work around" blatant IT defects in "acute care environments" the height of arrogance and contempt for patient safety.

Without formal regulatory authority to take actions such as this FDA recall, "safeguarding the public" is a meaningless platitude.

It's also likely the ATCB's, which are private businesses, would not want the responsibility of "safeguarding the public."  That responsibility would open them up to litigation when patient injuries or death were caused, or were contributed to, by "certified" health IT.

I have in the past also noted that the use of the term "certification" might have been deliberate, to mislead potential buyers into thinking that "certification" is akin to a UL certification of an electrical appliance for safety, or an FAA approval of a new aircraft's flight-worthiness.

The WSJ needs to clarify and/or retract its statement, as the statement is misinformation.

At my Feb. 2012 post "Health IT Ddulites and Disregard for the Rights of Others" I observed:

Ddulites [HIT hyper-enthusiasts - ed.] ... ignore the downsides (patient harms) of health IT.

This is despite being already aware of, or informed of patient harms, even by reputable sources such as FDA (Internal FDA memo on H-IT risks), The Joint Commission (Sentinel Events Alert on health IT), the NHS (Examples of potential harm presented by health software - Annex A starting at p. 38), and the ECRI Institute (Top ten healthcare technology risks), to name just a few.

In fact, the hyper-enthusiastic health IT technophiles will go out of their way to incorrectly dismiss risk management-valuable case reports as "anecdotes" not worthy of consideration (see "Anecdotes and medicine" essay at this link).

They will also make unsubstantiated, often hysterical-sounding claims that health IT systems are necessary to, or simply will "transform" (into what, exactly, is usually left a mystery) or even "revolutionize" medicine (whatever that means).

Health IT is a potentially dangerous technology.   It requires meaningful regulation to "safeguard the public."  How many incidents like this and this will it take before that is understood by the hyper-enthusiasts?

I've emailed the ATCBs that responded to my aforementioned query, asking for clarification on the WSJ's assertion about their role, since that assertion contradicts their earlier replies to me.  I also advised them of the potential liability issues.

However, if it turns out that the ONC-ATCBs do intend to serve as the ultimate watchdog and assurer of public safety related to EHRs, that needs to be known by the public and their representatives.

-- SS

Doctors and EHRs: Reframing the "Modernists v. Luddites" Canard to The Accurate "Ardent Technophiles vs. Pragmatists" Reality

One way healthcare's core values are usurped is via distortions and slander about physicians and other clinicians.

At "Health IT: Ddulites and Irrational Exuberance" and related posts (query link) I've described the phenomenon of the:

'Hyper-enthusiastic technophile who either deliberately ignores or is blinded to technology's downsides, ethical issues, and repeated local and mass failures.'

I have called this personality type the "Ddulite", which is "Luddite" with the first four letters reversed. I have also pointed out that the two are not exact opposites, as the Luddites did not endanger anyone in trying to preserve their textile jobs, whereas the Ddulites in healthcare IT do endanger patients.

Yet, in the 20 years I've been professionally involved in health IT, I have frequently heard the refrain, usually from IT personnel and their management, that "Doctors resist EHRs because they are [backwards, technophobic, reactionary, dinosaurs, unable/unwilling to change, think they are Gods, ..... insert other slanderous/libelous comment]."

I've heard this at Informatics meetings, at medical meetings, at commercial health IT meetings (e.g., Microsoft's Health Users Group, and at HIMSS), at government meetings (e.g., GS1 healthcare), and others.

The summary catchphrase I've heard and seen (even in the comments on this blog) is that doctors are "Luddites" while IT personnel are forward-thinking, know better than doctors, and are "Modernists."

This slander and libel of physicians and other clinicians needs to stop, and the entire issue needs to be reframed.

Doctors are pragmatists. When a new technology is rigorously shown to be beneficial to patients, and (perhaps more importantly) rigorously shown not to be of little benefit or, worse, significantly harmful, doctors will embrace it. There are countless examples of this that I need not go into. They also have responsibilities, obligations, ethical considerations, liabilities, and other factors to consider in their decisions:

Pragmatism (Merriam-Webster):

: a practical approach to problems and affairs

The reality is not:


Luddite doctors <---- are in tension with ----> Modernist IT personnel

but is:


Pragmatist doctors <---- are in tension with ----> Ardent technophiles (Ddulites)


The technophiles' views may be due, on the one hand, to ignorance of medicine's true complexities and "innocent" overconfidence in technology. Unfortunately, it is a gargantuan leap of logic to go from "well, computers work in tracking FedEx packages and allowing me to withdraw money from my U.S. bank when I'm abroad" to "therefore, with just a little work, they will transform medicine."

Anyone familiar with even the most fundamental issues in Medical Informatics is aware of this. (This is the problem with "generic management" of healthcare IT - healthcare amateurs are unfamiliar with these issues.) Due to the complex, messy social, scientific, informational, ethical, cultural, emotional and other issues relatively unique to medicine, that leap from banking/widget tracking/mercantile computing --> medicine is probably more naive than the leap in logic, for instance, that would have a person believe since a hot air balloon can go high in the sky, it can take a person to the moon, as I observed here.

On the other hand, the technophile's expressed views can also be a territorial ploy, made with full awareness of, and reckless disregard for, the consequences of technology's downsides.

(The CIO where I was a CMIO was well-known to be an aficionado of Sun Tzu's "Art of War" in his corporate politics - the polar opposite of a 'team player.' I might add that the doctors were fully expected to be 'team players'.)

Part of the struggle between the health IT industry and medical professionals has also been control of information flow about HIT.

This has been brought to the fore by my observation of the almost uniformly negative comments on today's HIT at the physician-only site Sermo.com. Sermo is populated, I might add, not by computerphobes but by physicians in a wide variety of specialties using computers for social networking. These comments will hopefully soon be published.

(They are not dissimilar to the many comments I reported in my Jan. 2010 post "An Honest Physician Survey on EHR's", although some might call the sponsor of the latter survey, AAPS, biased. I do not think the same can be said of Sermo.com, an open site for all physicians.)

I have mentioned on this blog the numerous impediments to flow of information about health IT's downsides, and these impediments are well described, for example, in the Joint Commission Sentinel Events Alert on Health IT (link), the FDA Internal Memorandum on H-IT Safety (link) and elsewhere (such as at link, link).

The Institute of Medicine of the National Academies noted this in its late 2011 study on EHR safety:

... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.

Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.[IOM (Institute of Medicine). 2012. Health IT and Patient Safety: Building Safer Systems for Better Care (PDF). Washington, DC: The National Academies Press, pg. S-2.]

Also in the IOM report:

… “For example, the number of patients who receive the correct medication in hospitals increases when these hospitals implement well-planned, robust computerized prescribing mechanisms and use barcoding systems. But even in these instances, the ability to generalize the results across the health care system may be limited. For other products— including electronic health records, which are being employed with more and more frequency— some studies find improvements in patient safety, while other studies find no effect.

More worrisome, some case reports suggest that poorly designed health IT can create new hazards in the already complex delivery of care. Although the magnitude of the risk associated with health IT is not known, some examples illustrate the concerns. Dosing errors, failure to detect life-threatening illnesses, and delaying treatment due to poor human–computer interactions or loss of data have led to serious injury and death.”


I note that the 'impediments to generating evidence' effectively rise to the level of legalized censorship, as observed by Koppel and Kreda regarding gag and hold-harmless clauses in their JAMA article "Health Care Information Technology Vendors' Hold Harmless Clause: Implications for Patients and Clinicians", JAMA 2009;301(12):1276-1278. doi: 10.1001/jama.2009.398.

Pragmatist physicians are quite rightly very wary of the technology as it now exists.

Ultimately, even when information on HIT risks or defects does surface, it is highly inappropriately labeled as "anecdotal" (see this post on anecdotes for why this behavior is inappropriate).

This "anecdotalist" phenomenon occurs right up to the HHS Office of the National Coordinator for Health IT (ONC), as I described in my post "Making a Stat Less Significant: Common Sense on 'Side Effects' Lacking in Healthcare IT Sector" and elsewhere.

Therefore, another part of reframing the pragmatism vs. technophilia issue is for clinicians to put an end to censorship of HIT adverse experiences.

I have the following practical suggestions, which I have used myself, to start to accomplish the latter goal.

These suggestions are in the interest of protecting public health and safety:

When a physician or other clinician observes health IT problems, defects, malfunctions, mission hostility (e.g., poor user interfaces), significant downtimes, lost data, erroneous data, misidentified data, and so forth ... and most certainly, patient 'close calls' or actual injuries ... they should (anonymously if necessary, e.g., in a hostile management setting):

(DISCLAIMER:  I am not responsible for any adverse outcomes if any organizational policies or existing laws are broken in doing any of the following.)

  • Inform their facility's senior management, if deemed safe and not likely to result in retaliation such as being slandered as a "disruptive physician" and/or being subjected to sham peer review (link).
  • Inform their personal and organizational insurance carriers, in writing. Insurance carriers do not enjoy paying out for preventable IT-related medical mistakes. They have begun to become aware of HIT risks. See, for example, the essay on Norcal Mutual Insurance Company's newsletter on HIT risks at this link. (Note - many medical malpractice insurance policies can be interpreted as requiring this reporting, observed occasional guest blogger Dr. Scott Monteith in a comment to me about this post.)
  • Inform the State Medical Society and local Medical Society of your locale.
  • Inform the appropriate Board of Health for your locale.
  • If applicable (and it often is), inform the Medicare Quality Improvement Organization (QIO) of your state or region. Example: in Pennsylvania, the QIO is "Quality Insights of PA."
  • Inform a personal attorney.
  • Inform local, state and national representatives such as congressional representatives. Sen. Grassley of Iowa is aware of these issues, for example.
  • As clinicians are often forced to use health IT, at their own risk even when "certified" (link), if a healthcare organization or HIT seller is sluggish or resistant in taking corrective actions, consider taking another risk (perhaps this is for the very daring or those near the end of their clinical career). Present your organization's management with a statement for them to sign to the effect of:
"We, the undersigned, do hereby acknowledge the concerns of [Dr. Jones] about care quality issues at [Mount St. Elsewhere Hospital] regarding EHR difficulties that were reported, namely [event A, event B, event C ... etc.]

We hereby indemnify [Dr. Jones] for malpractice liability regarding patient care errors that occur due to EHR issues beyond his/her control, but within the control of hospital management, including but not limited to: [system downtimes, lost orders, missing or erroneous data, etc.] that are known to pose risk to patients. We assume responsibility for any such malpractice.

With regard to health IT and its potential negative effects on care, Dr. Jones has provided us with the Joint Commission Sentinel Events Alert on Health IT at http://www.jointcommission.org/assets/1/18/SEA_42.PDF, the IOM report on HIT safety at http://www.modernhealthcare.com/Assets/pdf/CH76254118.PDF, and the FDA Internal Memorandum on H-IT Safety Issues at http://www.scribd.com/huffpostfund/d/33754943-Internal-FDA-Report-on-Adverse-Events-Involving-Health-Information-Technology.

CMO __________ (date, time)
CIO ___________ (date, time)
CMIO _________ (date, time)
General Counsel ___________ (date, time)
etc."
  • If the hospital or organizational management refuses to sign such a waiver (and they likely will!), note the refusal, with date and time of refusal, and file away with your attorney. It could come in handy if EHR-related med mal does occur.
  • As EHRs remain experimental, I note that indemnifications such as the above probably belong in medical staff contracts and bylaws when EHR use is coerced.

These measures can help "light a fire" under the decision makers, and "get the lead out" of efforts to improve this technology to the point where it is usable, efficacious and safe.

-- SS

Congressman Darrell Issa: FDA's email monitoring of "whistleblowers" communicating with Congress was illegal

In follow-up to my post of Jan. 30, 2012, "Can You Sue the Government? FDA Whistleblowers Sue Over Surveillance of Personal e-Mail", I provide a link to a probing letter from Darrell Issa, Chairman of the US House of Representatives Committee on Oversight and Government Reform, to Margaret Hamburg, MD, Commissioner of the FDA.

The letter raises the issue that FDA's email monitoring of "whistleblowers" communicating with Congress was illegal ("unlawful, and will not be tolerated"), and the illegality was further compounded by harassment and retaliation against the "whistleblowers."

Many probing "who? why? when?" questions are asked of FDA.

I do not have the free text of this letter, just a link to page images. I cannot post the text (no access to OCR for a PDF at the moment), but the letter images are here:

http://www.whistleblowers.org/storage/whistleblowers/documents/FDAComplaint/issaletter.fdaspying.pdf

Worth reading in its entirety.

-- SS

Can You Sue the Government? FDA Whistleblowers Sue Over Surveillance of Personal e-Mail

From the Washington Post:

FDA staffers sue agency over surveillance of personal e-mail
Ellen Nakashima and Lisa Rein
January 29, 2012

The Food and Drug Administration secretly monitored the personal e-mail of a group of its own scientists and doctors after they [the scientists - ed.] warned Congress that the agency was approving medical devices that they believed posed unacceptable risks to patients, government documents show.

The surveillance — detailed in e-mails and memos unearthed by six of the scientists and doctors, who filed a lawsuit against the FDA in U.S. District Court in Washington last week — took place over two years as the plaintiffs accessed their personal Gmail accounts from government computers.

While accessing Gmail from government computers was not a wise idea, since all traffic over an institutional PC and network can be monitored, these Gmail messages were apparently sent to members of Congress.

Copies of the e-mails show that, starting in January 2009, the FDA intercepted communications with congressional staffers and draft versions of whistleblower complaints complete with editing notes in the margins. The agency also took electronic snapshots of the computer desktops of the FDA employees and reviewed documents they saved on the hard drives of their government computers.

See sample emails at link above.

Information garnered this way eventually contributed to the harassment or dismissal of all six of the FDA employees, the suit alleges. All had worked in an office responsible for reviewing devices for cancer screening and other purposes.

That's very unfortunate.

It will be far more unfortunate if the warnings of the six, as in this whistleblower case, went unheeded, and patients are injured or die as a result. In that case, FDA bureaucrats might have been accessories to those injuries or deaths.

“Who would have thought that they would have the nerve to be monitoring my communications to Congress?” said Robert C. Smith, one of the plaintiffs in the suit, a former radiology professor at Yale and Cornell universities who worked as a device reviewer at the FDA until his contract was not renewed in July 2010. “How dare they?”

I, on the other hand, would have expected it. It would have been far more prudent to send such emails from a private home computer and ISP.

The scientists and doctors denied sharing information improperly. The HHS inspector general’s office, which oversees FDA operations, declined to pursue an investigation, finding no evidence of criminal conduct. It also said that the doctors and scientists had a legal right to air their concerns to Congress or journalists.

FDA officials sought a second time that year to initiate action against the scientists and doctors. “We have obtained new information confirming the existence of information disclosures that undermine the integrity and mission of the FDA and, we believe, may be prohibited by law,” wrote Jeffrey Shuren, director of the FDA’s Center for Devices and Radiological Health, on June 28, 2010.

The inspector general, after consulting with federal prosecutors, declined the second request, as well.


The IG office seemed to find the behavior legal, but FDA bureaucrats apparently did not like non-team players.


The FDA scientists and doctors, all of whom worked for the agency’s Office of Device Evaluation, said they first made internal complaints beginning in 2007 that the agency had approved or was on the verge of approving at least a dozen radiological devices whose effectiveness was not proven and that posed risks to millions of patients. Frustrated, they also brought their concerns to Congress, the White House and the HHS inspector general.

Three of the devices risked missing signs of breast cancer, the scientists and doctors warned, according to documents and interviews. Another risked falsely diagnosing osteoporosis, leading to unnecessary treatments; one ultrasound device could malfunction while monitoring pregnant women in labor, risking harm to the fetus; and several devices for colon cancer screening used such heavy doses of radiation that they risked causing cancer in otherwise healthy people, the FDA scientists and doctors said.


Permit me to wonder if regulatory capture played a role in these decisions.

One might also wonder whether complaints about electronic health records or other clinical IT - admitted by FDA to be medical devices, but a "political hot potato" it elected not to regulate - were also involved.


... The first documented FDA interception was of an e-mail dated Jan. 29, 2009, shortly after the letter from Ferry. In it, device reviewer Paul T. Hardy asked a congressional aide, Joanne Royce, for assurances that “it is not a crime to provide information to the Congress about potential misconduct by another Agency employee.”

Royce replied: “[Y]ou and your colleagues have committed no crime. . . . you guys didn’t even provide confidential business information to Congress.”


The only 'crime' was apparently not being a 'team player', which on Healthcare Renewal has been defined as someone who is silent, or silenced, or a co-conspirator regarding managerial mediocrity, malfeasance, or madness.


Hardy, who is among the six employees who filed the suit, was fired in November after a negative performance review; an internal FDA letter obtained in separate litigation quoted managers saying they did not “trust” him. Of the other five scientists and doctors, the suit says two did not have their contracts renewed, two suffered harassment and were passed over for promotions, and one was fired.


Trust him to do - what, I ask?

Read the whole WaPo article.

Plaintiffs' lawyers need to be aware of this event, and I intend to make them aware.

-- SS

Feb. 13, 2012 addendum:

A link to Darrell Issa's letter to FDA Commissioner Hamburg is here.

-- SS

FDA Recalls Draeger Health IT Device Because "This Product May Cause Serious Adverse Health Consequences, Including Death"

More health IT madness, in the form of an actual FDA recall:


FDA Recall notice

Draeger Medical Inc., Infinity Acute Care System Monitoring Solution (M540)
Recall Class: Class I [the most serious type of recall, see below - ed.]
Date Recall Initiated: October 18, 2011
Product: Infinity Acute Care System Monitoring Solution (M540), Catalog number MS25510
All serial numbers are affected by this recall.

This product was manufactured from March 1, 2011 through September 30, 2011 and distributed only to the Rush University Medical Center (Chicago, Illinois) from July 1, 2011 through September 30, 2011.

Use: This product is a networked solution system used to monitor a patient’s vital signs and therapy, control alarms, review Web-based diagnostic images, and access patient records. The number of monitored vital signs can be increased or decreased based on the patient’s needs.

Recalling Firm: Draeger Medical, Inc. 3135 Quarry Rd., Telford,
Pennsylvania 18969-1042

Manufacturer: Draeger Medical GmbH, Moislinger Allee 53-55, 23558 Lübeck, Germany

Reason for Recall: The weight-based drug dosage calculation may indicate incorrect recommended values, including a drug dosage up to ten times the indicated dosage. Additionally, there may be a 5-10 second delay between the electrocardiogram and blood pressure curves (waveforms) at the Infinity Central Station.

This product may cause serious adverse health consequences, including death.


Public Contact: Draeger Medical, Inc., 3135 Quarry Road, Telford, Pennsylvania 18969-1042, 215-660-2349

FDA District: Philadelphia

FDA Comments: On October 17, 2011, the company sent the Rush University Medical Center a letter stating that users should enter the patient’s weight by way of the admin/demographics screen to ensure the drug dosage is calculated as intended.

Additionally, the company’s letter states that users should follow the instructions for Use of the Infinity Acute Care System Monitoring Solution. The Instructions for Use includes, "For primary monitoring and diagnosis of bedside patients, use the bedside monitor. Use the Infinity Central Station only for remote assessment of a patient's status." [That is, clinicians should work around the device's defects, which would seem to hold the computer's rights over the patients' rights -- rather than taking the device out of service immediately and having the vendor fix it - ed.]

Class I recalls are the most serious type of recall and involve situations in which there is a reasonable probability that use of these products will cause serious adverse health consequences or death.

Health care professionals and consumers may report adverse reactions or quality problems they experienced using these products to MedWatch: The FDA Safety Information and Adverse Event Reporting Program either online, by regular mail or by FAX.
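The FDA comments above suggest the dosage defect was tied to where and how the patient's weight was entered. Purely as a hypothetical sketch - not Draeger's actual code; the function names, dosing rule, and unit/range-check failure mode are my assumptions - here is how a weight-based dose calculation can silently produce a grossly inflated recommendation when the weight's source and units are not validated:

```python
# Hypothetical sketch of how a weight-based dosing calculation can produce a
# grossly inflated recommendation when the weight's source, units, and range
# are not validated. This is NOT the Draeger implementation; the dosing rule
# and failure mode shown are assumptions for illustration only.

DOSE_MG_PER_KG = 0.1   # example dosing rule: 0.1 mg per kg of body weight

def recommended_dose_unsafe(weight: float) -> float:
    """Trusts whatever number it is handed -- no unit, source, or range check."""
    return DOSE_MG_PER_KG * weight

def recommended_dose_safer(weight: float, unit: str) -> float:
    """Normalizes units and rejects implausible weights before calculating."""
    if unit == "lb":
        weight_kg = weight * 0.453592
    elif unit == "kg":
        weight_kg = weight
    else:
        raise ValueError(f"unknown weight unit: {unit!r}")
    if not 0.5 <= weight_kg <= 400:
        raise ValueError(f"implausible patient weight: {weight_kg:.1f} kg")
    return DOSE_MG_PER_KG * weight_kg

# If a 7 kg infant's weight reaches the calculation as 70 (e.g., picked up
# from the wrong entry screen or a mis-scaled field), the naive version
# recommends ten times the intended dose:
print(recommended_dose_unsafe(70.0))       # 7.0 mg instead of the intended 0.7 mg
print(recommended_dose_safer(7.0, "kg"))   # 0.7 mg, after validation
```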

I find a software company advising clinicians to make sure to "work around" blatant IT defects in "acute care environments" the height of arrogance and contempt for patient safety. Yes, acute care environments are not unpredictable, chaotic environments often moving a mile a minute. They are precisely the environment where everyone can sit around on their butts and leisurely hold, over pizzas and cokes, a committee meeting where each and every move can be discussed, just like in a software development shop ...

I also find the statement that this medical device was "distributed only to the Rush University Medical Center" remarkable. If true, it raises a number of issues that make me very uncomfortable:

  • How did the company's letter to Rush make its way to FDA? Whistle blower?
  • What testing was done by the manufacturer of this medical device before release to a live-patient environment?
  • Who approved this software "going live?" What due diligence was performed?
  • Was this a software beta test of experimental software on live subjects?
  • Did Rush University Medical Center have some sort of quid pro quo (e.g., financial arrangement) with the software company?
  • Did Rush seek IRB approval of this device?
  • Were patients presented with an informed consent process regarding its use in their care?
  • Were any patients actually injured or did any die as a result of this software?

The answers to these questions need to be sought by FDA.

-- SS

A Logical Fallacy Affecting Selection of Panelists on an FDA Advisory Committee

An old argument used to defend against criticisms of conflicts of interest was just employed in a disturbing context. 

Expert Removed from FDA Advisory Committee for Having an Opinion

As first reported on the PharmaLot blog, and later by the Newark Star-Ledger, a panelist was just disqualified from voting on a US Food and Drug Administration (FDA) panel for having previously expressed an opinion about the safety of the drug up for re-evaluation.  Per the Star-Ledger,
Federal drug regulators have notified Sidney Wolfe, one of the nation's leading advocates for drug safety, that he would not be permitted to join a committee of experts asked to review new dangers associated with a group of birth control pills, including Bayer Healthcare's top-selling Yaz.

The Food and Drug Administration scheduled a meeting Thursday of two advisory committees — one on drug safety and risk management and the other on reproductive health drugs — after new information emerged on the safety of oral contraceptives containing the synthetic hormone drospirenone.

Why was Dr Wolfe removed from the committee?
The agency recently learned that Public Citizen, a non-profit consumer advocacy organization, had placed one of the contraceptives, Bayer’s Yasmin — a predecessor to Yaz — on its list of 'Do Not Use Pills' in 2002.

'He did not volunteer this information,' said agency spokeswoman Erica Jefferson. 'It was brought to our attention.'

The FDA offered Wolfe two options: He could present information to the advisory committee like other members of the public or he could sit on the committee, participate in the discussion but refrain from voting.

Logical Fallacy: False Dilemma

We frequently post about conflicts of interest affecting health care decision-makers.  It is now clear (e.g., look here) that leading health care academics often have significant financial relationships with drug and device companies and other health care corporations which could potentially influence their clinical research, clinical teaching, health policy recommendations, or direct patient care.  These relationships are frequently defended, often with logical fallacies used by those who themselves have conflicts. 

One common argument is based on the assertion that conflicts of interest are ubiquitous and everyone is conflicted.  Therefore, if one were to ban people with conflicts from responsible positions, there would be no one left to fill these positions, so such a ban would be untenable.  This seems to be an example of the false dilemma.  It is often employed by people who themselves have conflicts of interest.

One way to make it appear that everyone has conflicts of interest is to broaden the concept of conflicts of interest to "intellectual conflicts of interest."  Doing this facilitates the assertion that everyone who has an opinion on a subject has a conflict of interest, so this argument implies that all sentient beings have important conflicts.  This argument would make equivalent a doctor who would not use a particular drug because his or her reading of the clinical research literature about this drug suggests its benefits do not outweigh its harms, and a doctor who advocates using the drug, and is paid $100,000 a year by the marketing division of the company that makes this drug as a marketing consultant. 

The decision to prevent Dr Wolfe from voting on this committee seems to be based on this logical fallacy. As Dr Wolfe said,
In his statement, Wolfe said if intellectual conflict of interest means being informed and subsequently having opinions on a drug, then 'many more members of advisory committees would have to be excluded.'

'For members of a scientific and technical advisory committee, possessing information and expert views on matters within the purview of the committee is not a conflict of interest,' Wolfe wrote. 'To the contrary, qualified experts are likely to have developed views on a variety of subjects based on their professional experience.'

As Larry Husten wrote on the CardioBrief blog,
do we really want to choose advisory committee panelists who have never expressed opinions about the topics they are reviewing? Are we reaching the point where potential FDA panelists will be required, like Supreme Court nominees, to have avoided any discussion of all important issues at every point in the past?

Thus they point out the absurdity of banning people with "intellectual conflicts of interests," that is, with relevant opinions, as if they had real conflicts of interest. (But wait for someone to argue that if Wolfe were allowed to serve, it would be unfair to ban anyone with financial conflicts of interest from serving.)

What is most distressing about this case is that the sort of fallacious arguments usually employed by the conflicted to defend conflicts of interest are now being employed by leaders of government agencies, who are supposed to not have their own conflicts, and to serve the people, and in this case, to be dedicated to improving the health and safety of the population and of individual patients.

Every fallacious argument made in support of financial conflicts of interest affecting health care decision makers suggests we need to do more to combat such conflicts.  At an absolute minimum, all such conflicts should be fully disclosed in detail in any context in which they possibly could influence medical research, medical education, clinical care, or health policy.  Furthermore, we need to work towards ending as many such conflicts as possible.  A good starting point would be the recommendations made by the Institute of Medicine committee reports on conflicts of interest and on clinical practice guidelines.

See also comments by Merrill Goozner on the GoozNews blog.

ADDENDUM (9 December, 2011) - In response to comments below, see two posts by Dr Howard Brody in the Hooked: Ethics, Medicine, and Pharma blog on the problems with the concept of "intellectual conflict of interest" - here and here.

ADDENDUM (9 December, 2011) -  See further discussion posted by Dr Brody today here.

Retreat Back to Regulatory Capture: US FDA, NIH, Department of Health and Human Services All Back Off

After some brave words about transparency, integrity and all that, US government officials seem to be running back to the arms of the health care corporate CEOs.

Weakening FDA Conflict of Interest Rules

As reported by Reuters,
U.S. lawmakers likely will change the criteria for advisers reviewing new medicines next year because of complaints that the rules meant to prevent conflicts of interest make it harder to find real experts.

Congressional lawmakers may require the Food and Drug Administration to relax the rules that bar advisers from reviewing a drug if they have even indirect financial ties to related manufacturers, as part of an FDA funding bill.

This was not purely an initiative of legislators, but was egged on by a top FDA administrator:
The agency often must delay panel meetings while it searches for experts without conflicts, lawmakers and FDA officials say. Top doctors are usually the ones drugmakers hire as speakers or consultants.

'We have had difficulty in recruiting highly qualified people. And we've had delays in having panels because of this,' Dr. Janet Woodcock, head of the FDA's drugs center, told a House of Representatives hearing earlier this month.

The result is that 23 percent of FDA advisory panels have vacancies, more than double the agency's stated goal, according to the FDA's quarterly report at the end of May.

The rationale was that those paid by drug and device companies are the most expert:
The FDA tightened guidelines in 2007 to minimize industry ties that could sway a panelist's view, partly inspired by the scandal with Merck's pain reliever Vioxx.

Ten of the 32 panelists advising the FDA on the drug consulted for drugmakers. Nine of the 10 recommended putting the drug back on the market after it was pulled in 2004 over concerns about heart risk.

The restrictions go too far, say lawmakers who want the FDA to approve more new medicines, in part because they promote American jobs.

'No longer can we deny experts simply because they have ties to industry,' said Georgia Representative Phil Gingrey during a House of Representative hearing on FDA funding last month. The committee's chairman, Fred Upton from Michigan, called the conflict of interest rules 'rigid and unrealistic.'

Industry executives, who want the FDA to speed drug approvals, also support relaxing the rules. Biogen Idec CEO George Scangos said the guidelines 'exclude a lot of people who would be the best qualified.'

Of course, the drug and device companies have long touted their paid "key opinion leaders" as the best and the brightest. There is plenty of evidence, however, that they are mainly the experts those companies find most compliant, and in many cases those willing to serve as stealth marketers on the companies' payrolls. (See this post about KOL recruiters who regard them as salesmen, and more here.)  "Key opinion leaders" supported by commercial grant funding may seem like experts to the leadership of academic medical institutions who now value outside funding more than excellence in teaching and research (see this post).

Furthermore, as reported by Politico, the Project on Government Oversight, a watchdog group, chastised FDA leadership for exaggerating the difficulty of finding unconflicted experts, concluding in its letter to the FDA Commissioner, "to gain the public trust, we must ensure that the FDA relies on the best available information for its policies, rather than personal opinions and biases."

So far, government officials seem to be more worried about the opinions of corporate leaders than about the public trust.

Weakening NIH Conflict of Interest

As discussed in Nature,
Francis Collins hailed it as a 'new era of clarity and transparency in the management of financial conflicts of interest' (S. J. Rockey and F. S. Collins J. Am. Med. Assoc. 303, 2400–2402; 2010). But the director of the US National Institutes of Health (NIH) may have spoken too soon when he described a new rule, proposed last year, that would require universities and medical schools to publicly disclose online any financial arrangements that they believe could unduly influence the work of their NIH-funded researchers.

Nature has learned that a cornerstone of that transparency drive — a series of publicly accessible websites detailing such financial conflicts — has now been dropped.

In more detail,
The NIH's parent agency, the Department of Health and Human Services (DHHS), proposed the new rule in May 2010, after congressional and media investigations revealed that prominent NIH grant recipients had failed to tell their universities or medical schools about lucrative payments from companies that may have influenced their government-funded research. The DHHS called the proposed websites 'an important and significant new requirement to … underscore our commitment to fostering transparency, accountability, and public trust'. Under the proposal, institutions with NIH-funded researchers would determine, grant by grant, if any financial conflicts existed for senior scientists on the grant. For example, these would include receiving consultancy fees, or holding shares in a company, 'that could directly and significantly affect the design, conduct, or reporting' of the research. The institutions would post the details online, where they would stay for at least five years.

But of course the medical schools decided that it would just be too much trouble to do all this:
'The websites don't appear out of nowhere,' says Heather Pierce, senior director of science policy at the Association of American Medical Colleges (AAMC) in Washington DC. They would 'require employees to not only create the website but to pull the information, review it, and make sure it is up to date and accurate'.

That is not the only objection from the powerful academic lobbies. During the public comment period last summer, the Association of American Universities and the AAMC submitted a joint statement saying: 'There are serious and reasonable concerns among our members that the Web posting will be of little practical value to the public and, without context for the information, could lead to confusion rather than clarity regarding financial conflicts of interest and how they are managed.'

Given how academic medical institutions have expanded their administrations and bureaucracies, the enormous amounts they spend on management, and the huge compensation they give their executives, and further given how much of their revenue comes from government sources (Medicare and Medicaid payments for patient care, Veterans Administration money supporting many faculty members, Medicare funding for graduate medical education, and NIH and other government research grants), the notion that assigning a few staffers to process disclosures would be administratively or financially burdensome is just laughable.

At least Iowa's Republican Senator Charles Grassley, seemingly one of the last politicians in Washington who cares about the integrity of government programs and spending, is upset. As reported again by Nature,
The US Senate's leading advocate for government transparency wrote today to the White House's budget office, demanding that it protect a proposed rule that would obligate universities to post their publicly-funded biomedical researchers' financial conflicts on a publicly accessible website.

'The public's business should be public... I urge OMB to follow through and approve a rule that includes a publicly available website,' Senator Charles Grassley, Republican of Iowa ..., wrote in this letter to Jacob Lew, the director of the White House's Office of Management and Budget (OMB).

Furthermore, he wrote:
I am troubled that taxpayers cannot learn about the outside income of the researchers whom the taxpayers are funding, and this flies in the face of President Obama's call for more transparency in the government.

We will see if his protest does any good, but again it appears that government officials are more worried about the revenues of big health care organizations than the needs of the public.

Retreating from Threats to Debar Forest Laboratories CEO

We previously posted about how the US Department of Health and Human Services threatened to debar the CEO of Forest Laboratories from dealings with the government after his company pleaded guilty to obstruction of justice and misbranding, and paid a $313 million fine.

Now, per Alicia Mundy writing for the Wall Street Journal, things have changed:
The U.S. government dropped efforts to force the resignation of a prominent pharmaceutical-company chief executive, reversing course after protests from the company and major business groups.

The about-face on Forest Laboratories's longtime leader, Howard Solomon, represents a significant retreat by the Department of Health and Human Services, which has said it wants to step up punishments against drug-company executives when wrongdoing happens on their watch.

Forest agreed last year to plead guilty to misdemeanors involving marketing of its drugs including the antidepressant Celexa, and it paid $313 million to resolve the matter.

Mr. Solomon wasn't personally accused of any wrongdoing. Nonetheless, the government notified him in April that it was considering excluding him from jobs at health-care companies that sell to the U.S. government. It invoked a little-used clause in the Social Security Act that allows such an action against corporate leaders of companies found guilty of criminal misconduct, even if the leaders had no knowledge of the misconduct.

The exclusion move would have effectively forced Forest to remove Mr. Solomon from office, because Forest and other drug companies rely on business from U.S. government agencies such as Medicare and the Veterans Administration.

In a letter to Mr. Solomon on Friday, the office of the inspector general of the Department of Health and Human Services said, 'Based on a review of information in our file, and consideration of the information your attorneys provided to us both in writing and in an in-person meeting, we have decided to close this case.'

We have discussed - some might say endlessly - how despite numerous publicly reported cases of wrongdoing by health care organizations, hardly any individual who authorized, directed or implemented the bad behavior has ever faced any negative consequences. There have been recent fulminations by some government officials that this is going to change. The case of the Forest Laboratories CEO appeared to be an example of such change, but no more.

The Wall Street Journal went on to discuss why the government may have changed course:
The government's retreat came after a barrage of complaints from Forest and business groups including the U.S. Chamber of Commerce and the Pharmaceutical Research and Manufacturers of America, the drug industry's leading trade group.

In July, Forest spent $80,000 to hire former Louisiana Sen. John Breaux to lobby the government regarding exclusion, according to Senate records. Mr. Breaux didn't return a call requesting comment. Forest said earlier that it was just trying to make its case that this was a highly unusual action by the U.S. government.

Of course it was unusual. That is the whole problem.

In any case, it looked like the government was much more concerned with coddling corporate CEOs, and with the opinions of their lobbyists and cronies, than with deterring bad behavior by large health care organizations.

Summary

After a bit of blustering by the current US administration about transparency and integrity, it appears to be back to business as usual in the US capital. Over the last 20 years, government has increasingly answered to corporate CEOs instead of "we, the people." Protecting patients' and the public's health has given way to protecting the financial health of large health care organizations, and the compensation of their rich CEOs. Federalism is giving way to corporatism. As long as this continues, expect our health care system to continue its slow collapse. Eventually, expect the CEOs to get in their private jets and escape while the rest of us pick up the pieces.

Until we dispel the fog of corporatism that has spread over the government that was once supposed to be of the people, by the people, and for the people, expect no real health care reform, and expect continuing rising costs, declining access, and worsening patient care. Obviously, true health care reform would start with the government and its officials putting patients' and the public's health first, way ahead of the financial comfort of corporate CEOs.

See also comments by Alison Bass.

FDA Decides Regulating Implantable Defibrillator Medical Devices a "Political Hot Potato"; Demurs

Well, not exactly, but the agency has decided that regulation of another medical device is a political hot potato, and has demurred from enforcing its regulations:

Health IT. 

Health IT systems are medical devices.

Jeffrey Shuren, MD, JD, director of FDA's Center for Devices and Radiological Health, stated this explicitly on Feb. 25, 2010 (see his testimony to the HHS Health Information Technology (HIT) Policy Committee at this PDF):

... Under the Federal, Food, Drug, and Cosmetic Act, [that regulates all drug, medical devices, etc. in the United States - ed.] HIT software is a medical device. Currently, the FDA mandates that manufacturers of other types of software devices comply with the laws and regulations that apply to more traditional medical device firms. These products include devices that contain one or more software components, parts, or accessories (such as electrocardiographic (ECG) systems used to monitor patient activity), as well as devices that are composed solely of software (such as laboratory information management systems).

That leaves no doubt that these are medical devices. However, he also stated:

To date, FDA has largely refrained from enforcing our regulatory requirements with respect to HIT devices.

In other words, this class of medical device receives special accommodation over all others, such as heart stents, defibrillators, and spine and knee implants, all of which have been in the news in recent years for major defects and malfunctions, up to and including causing patient deaths. The extent of those harms would likely have been far, far worse had these devices been unregulated.

One should ask: why the special accommodation for health IT medical devices? What are the underlying politics, and who is behind them? Especially when FDA is aware of potential risks that may only be the "tip of the iceberg?"

The selective reluctance to enforce the FD&C Act persists to this day. See, for example, this link regarding his statements just a few days ago here in Philadelphia: Will FDA Regulate EHR's?:

Speaking at the first annual PharmEHR Summit in Philadelphia on April 7, Jeffrey Shuren, M.D., J.D., director of the Center for Devices and Radiological Health at the FDA, said his agency could change its traditional hands-off approach to EHRs, but he acknowledged that the potential of FDA regulation raises serious clinical issues [the only clinical issues I can think of are in holding vendors accountable for patient injury - ed.] and is a “political hot potato.” “As of right now we’re not regulating EHRs, and it may turn out that we won’t,” he said.

A "political hot potato" as the reason for an FDA hands-off policy?  Remarkable. 

[Image: Tossing away political hot potatoes and ... patients.]

Likely translation: FDA regulation of health IT will never happen.

One implication is that health IT quality, safety, efficacy, privacy, security, and other issues about these systems will remain subject to HHS and industry caprice.

While we're at it, let's deregulate knee implants too...

-- SS

Addendum April 19, 2011:

Roy Poses observes that:


Here is a great example of regulatory capture. The health care IT industry has amassed such political clout that it now has impunity to regulation. Once again, combined economic and political power trumps patients' and the public's health and safety. We will not be able to really reform health care until we can provide for honest, independent health care regulation to uphold patients' and the public's health.

-- SS