
August 8, 2023

Part 3: Avoiding the Cobra Effect

Written by: Carl Byron, CCS, CHA, CIFHA, CMDP, CPC, ICDCT-CM/PCS, OHCC   

This article follows The Cobra Effect articles Part 1 and Part 2, also written by Carl Byron.

In the previous segments we covered the Cobra Effect and its outcomes to identify what it is. But can it be negated?

What about an organization that is about to implement a policy or incentive? How can it predict faults in the business plan or project and avoid the Cobra Effect?

In review (if you didn’t read Part 1 or 2), the “Cobra Effect” is a term for a solution that makes the original problem worse because the incentives, rewards and/or punishments expose chinks in the planning. Perverse incentives are a near-inevitable by-product, motivating unscrupulous individuals to game the system for easy money or to cut regulatory corners.

When the Cobra Effect has already occurred, the response must involve reevaluation, mitigation and corrective action. Although mitigation and corrective action are similar, each involves actions unique to it. The key is intelligent avoidance.

Reversing or Mitigating the Cobra Effect

Because medical organizations vary so much in size, specialty and location, and face a seeming maze of internal and external policies and personnel, we will cover only general but effective ideas. Assume, for now, that the Cobra Effect has already happened.

We need to look at exactly what happened. I am a Certified Internal Healthcare Fraud Auditor (CIFHA); if you have such training or experience, I recommend taking an investigative approach. For the sake of brevity, here are a few questions that can help you start your investigation:

Were laws broken?

  • If so, what are the corrective measures (self-reporting) and potential consequences?
    • Was the effect found by an external reviewer or auditor or through an internal whistleblower?
  • Who exactly was responsible for launching the perverse incentive that went so wrong?
  • Were providers/staff/employees affected?
  • Who was involved in both the drafting and the execution? (Something went big-time wrong, and we’ll need to ferret out the details.)

Next, you must evaluate the outcome of the initial responses to your questions and develop new questions – but you must ask the right questions. If you don’t, every answer will be wrong, and the unknowns become high-risk and increasingly dangerous.

Great auditors know this and live by it. So we must ask not only the right questions but as many as possible, so the outcome will be positive.

Although I haven’t seen the term in a while, you need to use deductive reasoning: we have a known result and must rapidly work backwards to find the root cause(s). If possible, the best starting point is building an investigative element (or team) of people who understand the issue and know what was meant to be accomplished. Although the Cobra Effect typically involves bad actors, this is not absolute: well-intentioned employees who misunderstood the plan, or who were put under some form of pressure, could have made errors affecting the outcome. Any combination of missteps could cause a failure.

You need people dedicated to mission accomplishment who are accurate and effective investigators, and who have your organization’s health and best interests at heart. This team will gather all the information and sift through it to determine the causes, then either produce an alternate incentive to gain the same ground or remove the incentive completely.

Add a Fishbone

A very effective tool is the fishbone diagram. Both the American Society for Quality (ASQ) and the Centers for Medicare and Medicaid Services (CMS) recognize this tool’s usefulness in root cause analysis (RCA). To determine shortfalls, omissions or bad actors, your team will need to produce a detailed root cause analysis.

The fishbone allows critical elements to be added and inserted as the investigation progresses, and it allows the fish’s segments to be rearranged to better account for where in a process something occurred and who was responsible for that segment. A great example comes from the Minnesota Department of Health:

  1. Problem Statement. Define a clear problem statement on which all team members agree, and be specific about how and when the problem occurs.
  2. Categorize major elements of the policy or incentive. There is no limit to how many of these “fish ribs” you can use. Use as many as will definitively detail the problem statement.
  3. Brainstorm and build a comprehensive rib cage of Contributing Factors. *NOTE: This is a critical part of the investigation. Investigators must be (1) unanswerable to the parties who built and sent the policy/incentive into the field; like auditors, they must be able to interview individuals at any level, unaffected by those individuals’ motivations and safe from accusations or retaliation; and (2) able to instill a sense of confidentiality and anonymity in the interviewees. Their job is to determine and organize all the facts, not render judgment.
  4. Be the detective. From the moment you begin, ask why, who, where, when and what. With the Cobra Effect the central question (which the team should repeat constantly) is WHY. For example:
    • Why was this overtime pay bonus started? Answer: Productivity stalled.
    • Why did productivity stall? Answer: Skills were outdated, and brand-new programs were installed on our equipment.
    • Why were the skills outdated, and why did the new programs go live without proper training? Answer: The department suffered significant budget cuts, and the funds have not been restored. There has been no training on the new programs.
    • With last year’s budget cuts, the few people who knew the new programs were downsized.
    • New hires did not fill the void, and in X number of cases the few transferees from department Y were not the right people for the job.
  5. The ribs can have “offshoots” or extensions. There is no limit to the number of ribs used to reach the conclusion; just continually reevaluate to make certain of the conclusion(s).
  6. Causes will expose themselves. Analyze them for repeat offenders that appear consistently, not only within categories but between categories.
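The repeat-offender analysis in step 6 can be sketched in a few lines of code. The fishbone below is hypothetical (the categories and contributing factors are invented to echo the overtime-bonus example above), but it shows how factors that recur across categories surface as root-cause candidates:

```python
from collections import Counter

# Hypothetical fishbone: each category ("rib") maps to the contributing
# factors the team brainstormed for it. All entries are invented examples.
fishbone = {
    "Training":   ["no training on new programs", "outdated skills"],
    "Budget":     ["budget cuts", "no training on new programs"],
    "Staffing":   ["downsized experienced staff", "wrong transferees hired"],
    "Technology": ["new programs went live untested", "outdated skills"],
}

# Count how often each contributing factor appears across ALL categories.
factor_counts = Counter(
    factor for factors in fishbone.values() for factor in factors
)

# "Repeat offenders" are factors appearing in more than one category --
# these are the first candidates for root causes.
repeat_offenders = [f for f, n in factor_counts.items() if n > 1]
print(repeat_offenders)
```

In this made-up worksheet, the missing training and outdated skills recur across ribs, which is exactly the kind of between-category pattern step 6 tells the team to look for.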

Avoid the recurring problem of “hyper-compartmentalization”

Many times, a plan is incompletely drawn up, finalized and thrown into the field without input from some of the most important people involved in making it work; many times, even those responsible for executing the plan are left out.

Employees, sections, departments and even executives do not speak to each other, or are guarded when they do because they are “territorial”. Important people in the construction and execution of the plan go uninformed and only become aware of it when the “final” plan arrives with orders to execute it.

When the report is finalized, the investigating team must meet with all individuals responsible for bringing the policy or incentive to life, but caution must be taken to detail specifics and not to pass blame or make accusations.

More than likely some aspect involves a breach of compliance, so compliance representatives must attend.

Depending on the monetary impact, people from finance may also be required, along with outside experts who can be objective and analyze your findings.

Who else must attend these debriefings should be delegated to the investigation team lead. If the investigation was conducted properly, the lead will have precise details on where the cracks in the plan occurred. Last, there must be someone neutral, with enough authority to make the final decision and give the order on how to proceed without contradiction or argument. If the Cobra Effect was damaging enough, this might need to be done by your legal folks.


The investigative executive report details the problems, who was involved, and the specific findings. Mitigation is taking the information from the meetings and either reworking the plan correctly or finding a way to withdraw it without causing further problems. This can be the same meeting as the debriefing, but it must include every party responsible for the plan, policy or incentive from start to finish. An additional party here might be someone who routinely performs risk assessments for your organization. The plan failed once; if you intend to go through with it after one costly failure, someone needs to evaluate the risks involved in trying again. Someone, or even a small team, must also be involved as monitors: watching, measuring and evaluating each step or series of steps to ensure the failures that occurred before are avoided and thought through.

Last, if there were bad actors involved, there must be a means to deal with them immediately, and Human Resources experts should be involved. Again, the neutrality of investigators is key. Plans must already be in place to deal with bad actors conclusively and without regard to possible alliances or relationships.

Before the Cobra Strikes

Every action we take carries unintended consequences. Thus far we’ve taken a look at dealing with the Cobra Effect after it has occurred. Now let’s focus on preventing it.

Whereas before you were Sherlock Holmes working with deductive reasoning, now you need to use inductive reasoning and accept that there will be unintended consequences. The outcome is unknown. Someone has determined this incentive, policy or plan must go forward. Now what? How can you plan for the unknown?

Consider utilizing a reverse fishbone diagram. Rather than making the head of the fish the problem, make it the desired outcome. Then draw the spine to the tail and determine who will comprise the ribs. This is an unbeatable way to get input from important elements in the process, and it involves parties who typically get left out of decisions. You can even let the different parts make their own diagrams with processes: what each level can do to facilitate the successful launch of the plan. They know best: listen and keep them involved.

Failure Mode and Effects Analysis (FMEA) is an approach covered in the AIHC Certified Healthcare Auditor course. It is a structured approach to discovering potential failures that may exist within the design of a product or process. Failure modes are the ways in which a process can fail; effects are the ways those failures can lead to waste, defects or harmful outcomes.
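As a sketch of how FMEA scoring commonly works (the failure modes and scores below are invented for illustration, not taken from the AIHC course), each failure mode is rated 1 to 10 for severity, occurrence and detection difficulty, and the product of the three, the risk priority number (RPN), ranks where to focus first:

```python
# Hypothetical FMEA worksheet for an incentive rollout. Each failure mode is
# scored 1-10: severity (S), likelihood of occurrence (O), and difficulty of
# detection (D). Higher is worse. RPN = S * O * D.
failure_modes = [
    {"mode": "staff game the bonus metric",  "S": 8, "O": 6, "D": 7},
    {"mode": "metric drops quality of care", "S": 9, "O": 4, "D": 5},
    {"mode": "payout formula miscalculated", "S": 5, "O": 3, "D": 2},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Review failure modes in descending RPN order -- highest risk first.
for fm in sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True):
    print(f'{fm["RPN"]:4d}  {fm["mode"]}')
```

In this invented worksheet, “staff game the bonus metric” tops the list (RPN 336) precisely because it is severe, likely and hard to detect: the profile of a Cobra Effect waiting to happen.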

Another term relevant in healthcare, and particularly to avoiding the Cobra Effect, is game theory (sometimes called gaming theory). A great simplified definition is given by Britannica in its article “game theory mathematics,” updated June 2023: “game theory, branch of applied mathematics that provides tools for analyzing situations in which parties, called players, make decisions that are interdependent. This interdependence causes each player to consider the other player’s possible decisions, or strategies, in formulating strategy. A solution to a game describes the optimal decisions of the players, who may have similar, opposed, or mixed interests, and the outcomes that may result from these decisions.”
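To make the definition concrete, here is a toy two-player game (the players, strategies and payoff numbers are all invented for illustration). Each cell holds (administration’s payoff, staff’s payoff); checking where neither player gains by unilaterally switching strategies finds the game’s equilibrium, and in this contrived example the equilibrium is the perverse outcome the planners never intended:

```python
# Toy game: Administration (rows) chooses whether to offer a per-unit bonus;
# Staff (columns) choose to work the metric honestly or to game it.
# Each cell is (admin_payoff, staff_payoff); all numbers are invented.
ADMIN = ["offer bonus", "no bonus"]
STAFF = ["honest", "game the metric"]
payoff = {
    ("offer bonus", "honest"):          (3, 2),
    ("offer bonus", "game the metric"): (-2, 4),
    ("no bonus",    "honest"):          (1, 1),
    ("no bonus",    "game the metric"): (-3, 0),
}

# A pure-strategy Nash equilibrium: neither player can improve their own
# payoff by unilaterally switching strategies.
def is_equilibrium(a, s):
    admin_best = all(payoff[(a, s)][0] >= payoff[(a2, s)][0] for a2 in ADMIN)
    staff_best = all(payoff[(a, s)][1] >= payoff[(a, s2)][1] for s2 in STAFF)
    return admin_best and staff_best

equilibria = [(a, s) for a in ADMIN for s in STAFF if is_equilibrium(a, s)]
print(equilibria)
```

With these made-up payoffs, the only stable outcome is (“offer bonus”, “game the metric”): once the bonus exists, gaming it is the staff’s best response, and the administration’s intended win-win never materializes. That is the Cobra Effect expressed as a game.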

The importance of using this to prevent the Cobra Effect can hardly be overstated. Every party affected by the incentive will have its own view of it, its own expertise, and its own loyalties and motivations. Depending on the extent of the incentive/plan/policy, these parties may have (in fact or in perception) conflicting goals and motivations.

When I first studied game theory, many experts stated that it existed in something of a vacuum, with parties competing based on essentially selfish motivations. This has since been shown to be inaccurate: rather than raw motivations, many players are driven by rules.

  • For example, rather than saying IT only looks out for itself, we now realize IT is an expertise: based on their regular work, knowledge and experience, they approach objectives from a more technical angle.
  • The same goes for finance: rather than being self-serving, they tend to approach a goal in terms of initial costs, what return the outcome can generate, how much it will cost to maintain, and so on. Each segment in the plan lives under rules and constraints, so in this vein the term “game” is correct even if the fallout is potentially catastrophic or criminal.

So hopefully, when you begin integrating game theory into your processes, you can see the value all parties bring to making the plan successful. And do not order: ASK. Brainstorm. Make every person and every idea count. In my experience, healthcare employees tend to be close to the military: they are proud of their work, they know their jobs, they see a greater purpose in what they do, and they recognize rank: Administration, Providers-Surgeons and Staff. Use this. Invite people with conflicting goals and abilities. Make certain the lead(s) listen. You may discover that competing or conflicting goals can be traced back to conflicting abilities, or even to non-integrating technology that prevents a certain part of the plan from progressing. If there are competing motivations, they can be addressed immediately. Potential bad actors can also be ferreted out at this stage. You may even, through this trust, get people to tell you in advance who the potential bad actors are.

Realize that employees recognize the rank structure. This can dovetail into game theory by integrating each stratum and letting them draw up the parts of the plan that directly involve them and their expertise. In over 30 years of working with providers and surgeons, I have found they are fiercely independent and incisive, and they think rapidly on their feet. But they are also loyal, and they would far rather have input early, and be asked for it, than be blindsided by an order to do something they never anticipated.

Building Trust Takes Leadership

Long before worrying about the Cobra Effect, healthcare organizations must build a cohesive people system based on individual value, trust and realization of a common mission. Then, when the time comes for execution, no one is surprised: they all agreed to the plan and their part in it, and they realize there will be oversight to ensure the agreed processes and outcome materialize. The executive section may determine what the need is, but the actual blueprints and construction should occur at the other levels.

Note how I don’t say “know your systems” or “you’ll need massive data” or something similar. Many people are great with data, but it either gets incorrectly manipulated or causes cracks between people. This is a common problem with reporters and sportscasters: they either have more data than experience or they lean heavily on data rather than experience. Data initiates nothing, produces nothing and solves nothing. People do. Data is a tool, but it cannot replace the people who produce it. Data also foresees nothing. Your best defense? The correct people.

One revolutionary thinker was Dr. W. Edwards Deming, the quality engineer often credited with producing the meteoric rise of the Japanese auto industry after World War II. Deming was a reverse thinker (even by many of today’s standards) and believed people, not data or, yes, incentives, were the fundamental driver. If the Cobra Effect is a negative, unintended outcome, then to my mind Dr. Deming could be called “The Cobra Charmer”. One of his main focal points was why productivity and quality so often suffered unintended consequences. Sound familiar?

The best characterization of Dr. Deming and his importance to quality that I have found appeared in an October 12, 2017 article in the Quality Assurance Directorate (QAD) blog, QAD being part of the Royal Arsenal, United Kingdom: “Dr. W. Edwards Deming’s outlook on quality was simple but radical. He asserted that organizations that focused on improving quality would automatically reduce costs while those that focused on reducing cost would automatically reduce quality and actually increase costs as a result. He outlined his ideas simply in his theory of management, now known as The Deming Theory of Profound Knowledge.”

Finally, I leave you with a quote from the great General George S. Patton: “Never tell people how to do things. Tell them what to do and they will surprise you with their ingenuity.” When an organization lives by and instills trust and a realization of the value of its different parts, people become invaluable. Conversing with, sharing with and involving them will reap unanticipated benefits. As General Patton saw, cohesive teams can produce creative ways of solving problems and accomplishing the mission. They might even find novel, compliant ways of producing the desired results with less effort, or even superior results.

About the Author

Carl Byron is a part-time member of the professional staff of the American Institute of Healthcare Compliance and a full-time health care auditor for the Department of Defense. He is a certified coder with AAPC and AHIMA, and is certified by AIHC in compliance, auditing, investigations, clinical documentation improvement and as an ICD-10-CM instructor. Learn more about the CHA credential - auditing for compliance. Learn more about conducting internal investigations and the CIHFA credential.

Copyright © 2023 American Institute of Healthcare Compliance All Rights Reserved

