The ACAN Problem – When the Shit Hits the Fan

How do we approach a problem which is inelastic to experts and siloed expertise? A problem which requires a multi-disciplinary approach even to identify it as a problem in the first place.
Moreover, the ACAN problem demands a change in the ways in which we speak and think, what we fear, what we value, and finally the praxis of skepticism we employ.

Years ago, my team was brought into a consumer goods company to develop its operating strategy. The challenge which lay before this number three ranked player in its business sub-vertical resided in its struggle to grow and compete with the strong sourcing of that sub-vertical’s top operators. The top two consumer goods businesses in my client’s segment had dominated their class for two decades, through strong Walmart-styled product sourcing clout, x-factory production blocking, consolidation prowess, pervasive and exclusionary agency relationships, and a disciplined and highly efficient freight, processing, and delivery infrastructure. These players had achieved ‘best in practice’ status, and furthermore had leveraged this clout to drive lowest first cost and industry dominance by means of exclusivity with their overseas suppliers. Such was a familiar business landscape, one which capitalized upon Chinese hegemony and played out inside American business at that time. A set of mistakes and bad crony judgment for which we are paying the piper even now. Most of the cronies who drove this strategy are now retired at their villas or deceased, leaving for us the task of resolving their messy legacy.

Cronies crave cash, and revenue does not make right. Instead, flows of margin and value are king. This is why value chain analysis is a critical component of sound national, market, and business strategy.

On September 22, 480 BCE, a naval engagement occurred in the Saronic Gulf, called the Battle of Salamis. This event was part of a broader ongoing conflict between an alliance of Greek city-states under politician and general Themistocles and the Achaemenid Empire (Persia) under King Xerxes. In this naval battle, the Persian fleet greatly outnumbered the Greek fleet, and as well was equipped with superior ships. Their ‘best in class’ armada bore both the numerics and efficacy sufficient to dispatch a blue water ocean-going opponent in short order. What Themistocles elected to do, was decline to engage the Persians inside their arena of strength, fighting on the high seas. Instead, he boldly told Xerxes exactly where his fleet resided and invited him to come fight there.¹

To compress this tale of antiquity into its moral: in short, Themistocles enticed an overconfident Xerxes through a bit of sleight-of-hand disinformation (both constituting ambiguity) to fight him inside a strait between the Greek mainland and the island of Salamis. The Persians were less skilled at the columnar-and-abreast tactics of naval engagement inside a strait (complexity), and moreover faced the reality that in this circumstance, because of the constrained engagement zone, only an equal number of ships could be in actual combat at any given time. This meant that Xerxes could not take advantage of his greater numbers (novelty). Indeed, Themistocles had forced a matching of island warfare specialty ships and crews versus blue ocean warfare specialty ships and crews (asymmetry).

Xerxes, in his overconfidence, had failed to spot the asymmetry, complexity, ambiguity, and novelty – the ACAN – of the battle which was about to ensue.

As a result, the Persian fleet lost several command vessels early in the conflict and became confused. In the disarray, they ended up having to retreat from their first sortie. However, anticipating this first retreat, Themistocles had an Aeginetan task group on standby (asymmetry, complexity, ambiguity and novelty all embodied inside one single strategic element), which subsequently ambushed Xerxes’ fleet and routed the armada during its exit from the straits.²

Much as the Biblical David refused to fight with sword and shield against a tougher sword-and-shield opponent in Goliath, instead exploiting Goliath’s nearsightedness, overconfidence, and lack of mobility against him, the Greeks attained victory simply through the action of changing the playing field on the erstwhile ‘industry leader’.

In this same way, my client elected to change the playing field of their industry sub-vertical. Instead of going head-to-head against the top two companies inside their arena of strength, one of serving the customer as first priority, they elected instead to position their infrastructure to be able to respond more adeptly to the vendor industry. In this manner, they were able to respond to emergent vendor buys in much quicker fashion; physically taking possession of problem, distressed, and overstocked inventory and shipments within 24 hours.

I believe there is an innate wisdom as well, a potency within the simple ethic of refusing to do what everyone else is doing. If a naked emperor skulks somewhere inside the popular fray, you will be most sure to find him there.

Vendors loved to avail themselves of this new capability and lowered inventories with abandon by means of this handy exit strategy opportunity. As the economy flagged during the endless George W. Bush wars, and as the great preponderance of vendor inventory/production began to fall into these types of distressed dispositions, my client was able to cherry pick the best of this class of goods and obtain them at a fraction of the price the major competitors were paying – despite their volume buying leverage. Yes, it is true that American consumers are pampered by good service, but they love off-priced (not simply ‘discount’) merchandise even more. In our strategy, I called this method of market engagement a ‘source based value chain infrastructure’. Accordingly, my client rose from number three to number one in its market segment, in part, because of this evolution.

Years later, I would have clients in sales meetings cite this very business case back to me, framing for my team their desire to have their business employ its philosophy as a means to becoming ‘best-in-class’. They would pedantically lecture me with my very own industry strategy, even citing the company (my client), fully unaware that my team had led the groundbreaking project. They would then, ironically, question whether my team was capable of delivering strategy for such best practice. I would remain silent, smile, and nod accommodatingly.

Most strategists don’t even know what a value chain is, much less how to craft a strategy employing such principles. The only way to learn how to assemble, normalize, and make decisions from a value chain is to develop several species of them in the real world, and to survey their outcomes at a later date. Value chain principles can be topically addressed in a textbook, but cannot be fully instructed in an academic or pulp-mill publication context. It took me eight years to organically figure out what a value chain even was (through myriad successes and directly observing many business and market collapses/consolidations), and another ten years to explain its principles to the industry, major enterprise resource planning software developers, and the leading instructors at my alma mater. However, now you will find the phrase bandied about frequently by poseurs – often as a cudgel of intimidation. Much of skepticism and science functions in this same way. We are all humans after all, even if elite segments of professional domains fail to remember this reality.

Such elicits a key insight regarding arrogance among poseurs, and that of suit, lab-coat, and logo wearers in particular. They often imitate, but seldom innovate.

The simple reality is this: value chains are a methodology which affords the astute problem solver the ability to see more clearly into the logical calculus of critical path comprised inside the prosecution of what I call the ACAN, or New World, problem. And just as in the Penrose endless staircase depicted in the image at the beginning of this article, unless one introduces nodes and measures of value/risk into the decision calculus, the ACAN, as well as most strategic problems, are often actually insoluble. As a result of our inability to see this, we as governing entities resort to levers of cash, revenues, derivatives, good-sounding rhetoric, and control – to our eventual demise and the suffering of those who are left out.

The ACAN Problem

The New World Problem, or ACAN problem, is one which is ‘Asymmetric – Complex – Ambiguous – Novel’. These are top-shelf problems which are difficult to distinguish in the first place, much less negotiate and/or resolve. They require a different type of thinking – a mode of thought which is essential to identifying the ACAN problem. It is not simply that the ACAN problem requires a multi-disciplinary approach, though of course it does. Moreover, the ACAN problem demands that we change the ways in which we speak and think, what we fear (risk), how we measure and adjudicate value, and finally, in order to break the entailed contrathetic impasse, the praxis of skepticism we employ.

Value and risk flow, just like product, cash, and margin. Always seek to become exposed to value flows and robust to flows of risk.

Your every decision and action as a business should be one which seeks to either capture value into your brand and/or displace risk to the market.

As a nation or market, never allow any entity’s margin to outrun its value, nor allow it to marinate in excessive risk.

If one can systemically measure, model, and equitably leverage value and risk into a value chain – one can rule the world (but then of course that would not serve value nor risk).

The ability to distinguish which ACAN problems society faces in the first place will be a top-shelf skill in the New World. This has become most poignant in the post-Soviet/Nazi era, wherein the bad guys are not as easy to spot as they once were. Darth Vader does not exist in an embodied single person any longer. This is not your father’s bad guy. Now he comes dressed in the cloak of woke, with flowers and an army of people thinking they are doing good – all of whom have zero perception of what value and risk indeed are.

Accordingly, below (and as depicted in Exhibit 1 as well) are the features of a true ACAN problem. Specifically, an ACAN problem is one wherein the following supra-challenges compound the normal challenges which we face with respect to run-of-the-mill problems – a circumstance bearing the following features.

Asymmetry

  • Might serve to conceal shortages, sabotaged efforts, or flawed or incomplete processes
  • May increase exposure risk inside Force Majeure events
  • Can serve to hide the presence and impact of groupthink or bandwagoneering
  • May serve to delude governance into perceiving a false success
  • Introduces left-hand right-hand or compartmentalization ineffectiveness
  • Allows for one critical failure to sabotage a full set of successes

Complexity

  • Clouds the ability of professionals to observe measurement, precision, and tolerance abuses
  • Clouds the ability of operators to follow through, design or control process or quality
  • Clouds the perception of managers in identifying and resolving complicated-ness (more dangerous than complexity) in their processes
  • Tempts management to think in terms of single indices, linearity, normal curves, or averages
  • Enables confidence to cover for lack of capability
  • May allow politics to fester
  • Allows the incompetent to survive

Ambiguity

  • Lack of discipline in language allows goals/processes to be ill defined, and critical actors to talk past each other
  • Ignorance of sensitivity, feedback, or whipsaw conditions renders operators vulnerable to their own processes
  • The presence of neglect or Nelsonian knowledge becomes difficult to spot and eradicate
  • May introduce right-answer wrong-timing or invalid inference problems
  • Serves to conflate inductive and deductive inference
  • May obscure ability to evaluate soundness, logical calculus, or critical path
  • Allows failures to be concealed

Novelty

  • Tempts players to pretend that they know what is occurring
  • Neutralizes effectiveness of script or method educated professionals
  • Encourages reliance on buzz-phrases and apothegm
  • Serves to confuse or cause dissonance in the Peters in an organization
  • Allows an appeal to authority elite class to emerge
  • Allows a lack of success to come to be expected – or even be rewarded

Anti-fragility, after all, is exhibited by the organization which can best identify, measure, and negotiate complexity, novelty, ambiguity, and asymmetry. By mapping flows of value and risk, one can more readily discern asymmetry, ambiguity, and complexity from the mundanity of mere product, margin, and information flow. One can more readily negotiate a novel circumstance, and run circles around the classic experts who inhabit the domain therein.

Notice how well amateurs and ‘conspiracy theorists’ performed on comprehension of critical issues relative to the best epidemiologists and public health officials during the Covid-19 pandemic. Covid-19 was an ACAN problem, which we as a society failed to recognize. We were erstwhile Xerxes, conflating the numbers of ships we possessed, and the cockiness of our captains, with actual competence in combat.

Develop this skill, and one will be the person to call when the shit hits the fan.

Exhibit 1 – the essence of anti-fragility resides in a team’s ability to spot the presence of complexity, ambiguity, novelty, and asymmetry (rightmost column) – and how they serve to compound more classic organizational or process struggles (left three columns).

The solution entailed with each New World or ACAN problem of course demands creativity, persistence, and insight. There is no one formula which results in a win. But you will find that the individual or team who can craft a keen vision of the core ACAN problem in the first place is the only entity which stands even a remote chance of actually solving it.

Everything else is vanity.

The Ethical Skeptic, “The ACAN Problem – When the Shit Hits the Fan”; The Ethical Skeptic, WordPress, 7 Aug 2022; Web, https://theethicalskeptic.com/2022/08/07/the-acan-problem-when-the-shit-hits-the-fan/

How Glyphosate Practices Serve to Increase Our Diet Risk Exposure

We are highly risk exposed to the world’s most widely used pesticide, glyphosate. We as a regulatory entity, an industry and a technology, fail to track glyphosate’s modalities, vectors and its actual EPA Part 180 Maximum Tolerance Limit compliance inside our food supply. This is called malfeasance in the business world, and bonus sive malum inside ethical skepticism. Otherwise known as criminal ignorance and pseudoscience.

Certainly yes, I am a skeptic. One of the first rules of ethical skepticism, after the tenets concerning conducting your own investigation and holding open an ‘allow-for’ disposition regarding multiple strong explanatory approaches, is to be skeptical of your own thoughts and, indeed, work. So yes, I am skeptical of the data I have produced below. But as an ethical skeptic I also have a problem, in that I have never seen this data published, despite the critical importance of this issue inside social discourse on the rapid decline in American health and skyrocketing rates of auto-immune, allergy, and microbiome related disorders since 1995. So I went and pulled the official sources and did the analysis myself. As a note, this is the reality I face in 90% of the instances regarding tough social issues inside which we find so many social skeptic ‘experts’ and so little actual data/research.

I would also not be maintaining integrity inside my own philosophical base were I not to raise the warning flag of concern about what I see inside my data regarding glyphosate regulation and monitoring practices and risk vector pathways within the American food supply chain (see my chart below).

Raising a warning flag of plurality is an ethical skeptical action. It does not stand as a claim to final proof, neither is it an accusation of conspiracy, nor is it tantamount to credulousness/bias – nor any other of the red herring and strawman objection protocols employed by fake skeptics. It is simply a call for research, under the context of risk based necessity.

Now set aside the fact that the very foods in vector exposure V below are the very same ones which make me break out, gain weight, get painful intestinal disorders and become very sick. Set all that aside for whatever reason you choose: apophenia, placebo (just mistaking that I get sick), a priori confirmation bias, etc. I assure you that these are not contributors to my observation base in the least. But some of you use these things as methodical cynicism defense mechanisms, so I recognize that and allow for it. Be that as it may, yes let’s set this personal observation aside – and simply address the risk vector pathways incumbent inside the current practices involving application, regulation, tracking, and most importantly – weighted risk exposure, regarding glyphosate employment inside the United States food supply.

Set aside as well, the fact that the top two contribution vectors, Aspirated/Whole Grains and Corn Sweeteners, are the top two soaring allergy/sensitivity growth food commodities since 1995. Don’t let correlation move you to causality, we wouldn’t want that at all. Better to just ignore it instead. ‘Cuz that is being skeptical after all.

Below I have assembled a chart which is drawn from the following three resources on pesticide use, EPA Part 180 MTL tolerances and corresponding food consumption rates by commodity in the United States. I extracted the data on glyphosate and glyphosate bearing foods – and compared that to the rates of US consumption in pounds per capita, in the chart below. This took a good 8 hours of data assimilation and sorting in order to derive a picture which is not available to the American Public.

  • US Environmental Protection Agency Office of Pesticide Programs Index to Pesticide Chemical Names, Part 180 Tolerance Information, and Food and Feed Commodities (by Commodity) December 12, 2012¹
  • United States Department of Agriculture: Profiling Food Consumption in America, Chapter 2²
  • Food and Drug Administration: Pesticide Residues and Industrial Chemicals 2004 – 2005 sorted by Pesticide/Chemical³
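The exposure arithmetic this analysis rests upon – theoretical per-capita glyphosate intake as consumption (lbs per capita) weighted by the Part 180 MTL (ppm) – can be sketched as a small calculation. To be clear, the commodity names and figures below are illustrative placeholders of my own, not values drawn from the actual EPA, USDA, or FDA tables:

```python
# Sketch of the exposure ranking calculation: theoretical per-capita
# glyphosate intake = consumption (lbs/capita) x Part 180 MTL (ppm / 1e6).
# Commodity figures here are illustrative placeholders, NOT actual agency data.

commodities = [
    # (name, consumption lbs per capita, MTL ppm or None if not Part 180 defined)
    ("corn sweeteners", 80.0, 30.0),
    ("wheat grain",     135.0, 5.0),
    ("cheese",          30.0,  None),   # the '!!!' case: no defined tolerance
]

def theoretical_exposure(lbs_per_capita, mtl_ppm):
    """Upper-bound lbs of glyphosate consumed per capita, assuming
    residues sit at the legal maximum (the MTL)."""
    return lbs_per_capita * mtl_ppm / 1_000_000

rows = []
for name, lbs, mtl in commodities:
    if mtl is None:
        rows.append((name, None, "!!! risk exposure, no Part 180 MTL defined"))
    else:
        rows.append((name, theoretical_exposure(lbs, mtl), "quantified"))

# Direct unknown (undefined-MTL) risks flag first, then quantified
# risks sorted highest-to-lowest theoretical exposure
rows.sort(key=lambda r: (r[1] is not None, -(r[1] or 0)))
for name, exposure, flag in rows:
    shown = exposure if exposure is not None else "???"
    print(f"{name:18s} {shown:>10} {flag}")
```

The real analysis of course substitutes the agency figures for each commodity; the point of the sketch is only the shape of the weighting and the priority given to commodities bearing risk exposure with no defined tolerance at all.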

Several alarms were raised inside this analysis. Not conclusions, mind you – as I am always skeptical of my own work – rather, flags. Flags which not only indicate practice exposures inside the regulation, administration, and monitoring of glyphosate in the US, but also correlate highly with specific foods which are showing to produce health problems in the United States since glyphosate’s introduction to the food supply chain starting in 1995. So without further ado, let’s outline these exposure pathways which emerge from the analysis in the chart below.

Vectors and Modalities for Glyphosate Entry and Risk Exposure in American Diets ¹ ² ³

I.  Actual Pesticide Contents Are Not Measured nor MTL Tracked for Any Food

First, the most widely employed pesticide inside the American food supply is neither tracked for actual level compliance to EPA Part 180 MTL’s for ANY food at all, nor in many cases even specified for Maximum Tolerance Limits on several critical foods which employ large-scale use of glyphosate.¹ ³

II.  Cheese, Butter & Dairy Contents are Highly Exposed and Neither Regulated nor MTL Tracked

Both the content of glyphosate inside these critical caloric contributors, and the fodder and feed contribution (100 – 400 ppm) to such foods, are neither monitored for actual EPA Part 180 MTL compliance nor even specified for a Maximum Tolerance Limit.¹ ³

III.  Dried Beans Contents are Highly Exposed and Neither Regulated nor MTL Tracked

The content of glyphosate inside these critical caloric contributors is neither monitored for actual EPA Part 180 MTL compliance nor even specified for a Maximum Tolerance Limit.¹ ³

IV.  Animal Fats are Highly Exposed and Neither Regulated nor MTL Tracked

Both the content of glyphosate inside these critical caloric contributors, and the fodder and feed contribution (100 – 400 ppm) to such foods, are neither monitored for actual EPA Part 180 MTL compliance nor even specified for a Maximum Tolerance Limit.¹ ³

V.  High Risk/Content Exposed/Desiccated Foods Are Not MTL Tracked

The content of glyphosate inside these critical caloric contributors is not monitored for compliance to EPA Part 180 MTL’s at all.³

Set aside the increase in use (potentially attributable to desiccation practices – a claim which has recently been both made and disputed by several resources). No monitoring has been conducted to observe the prevalence, nor the ppm impact, of modalities and practices of any kind, including use of the Monsanto desiccation instruction. This is a risk exposure and a warning flag. You cannot make the claim that there is no problem, and attack people who bring up the issue – even if you are Snopes – if you have not conducted any actual research.†

  • Aspirated Grain Fractions and Whole Grains
  • Refined Wheat, Barley & Oats
  • High Fructose Corn Syrup and Other Sweeteners
  • Safflower, Sunflower, Cottonseed, Canola, Soybean Oils
  • Soy Products
  • Cattle, Poultry & Pork Meat Products (including fats, oils & milk derivatives)
  • Corn/Corn Feed Sourced Products

VI.  Feed and Forage Content Contribution is Neither Regulated nor Tracked

The content measures for glyphosate inside these critical caloric contributors to our meat supply (feed contribution 100 – 400 ppm) are neither monitored for actual EPA Part 180 MTL compliance nor even specified for a Maximum Tolerance Limit by modality contribution to our food.¹ ³

The bio-accumulation, given glyphosate’s persistence in soft tissue, is neither modality measured nor EPA Part 180 MTL tracked for bio-accumulation sensitive food derivatives such as Cheese, Cream, Butter, Milk, Dairy, Shortening, and Animal Fat derivative products.

The Compiled Data From the Three Resources

The list is compiled from resource 1, matched to commodity measures from resource 2 (consumption lbs per capita indexed against MTL ppm ratios). It is then sorted, highest to lowest, in terms of contribution to the overall amount in weight of glyphosate consumed (theoretical) in the per capita diet. !!! indicators show where risk exposure exists but is not Part 180 defined. Yellow commodity highlights indicate non-animal derived foods, while beige highlights indicate animal derived foods. Green highlights indicate animal feed and fodder commodities. Direct unknown risks rank first, quantified MTL risks second by theoretical per capita quantity of glyphosate exposure (lbs), while indirect (feed and fodder) risk ranks last in priority flagging. Resource 3 shows that none of these food commodity types are tracked for actual parts per million Part 180 MTL compliance and impact on the American diet.¹ ² ³ And here is why we need to be concerned about this:

Abstract – Glyphosate pathways to modern diseases V: Amino acid analogue of glycine in diverse proteins; Journal of Biological Physics and Chemistry, Vol. 16 (June 2016): 9–46 ‡

Glyphosate, a synthetic amino acid and analogue of glycine, is the most widely used biocide on the planet. Its presence in food for human consumption and animal feed is ubiquitous. Epidemiological studies have revealed a strong correlation between the increasing incidence in the United States of a large number of chronic diseases and the increased use of glyphosate herbicide on corn, soy and wheat crops. Glyphosate, acting as a glycine analogue, may be mistakenly incorporated into peptides during protein synthesis. A deep search of the research literature has revealed a number of protein classes that depend on conserved glycine residues for proper function. Glycine, the smallest amino acid, has unique properties that support flexibility and the ability to anchor to the plasma membrane or the cytoskeleton. Glyphosate substitution for conserved glycines can easily explain a link with diabetes, obesity, asthma, chronic obstructive pulmonary disease (COPD), pulmonary edema, adrenal insufficiency, hypothyroidism, Alzheimer’s disease, amyotrophic lateral sclerosis (ALS), Parkinson’s disease, prion diseases, lupus, mitochondrial disease, non-Hodgkin’s lymphoma, neural tube defects, infertility, hypertension, glaucoma, osteoporosis, fatty liver disease and kidney failure. The correlation data together with the direct biological evidence make a compelling case for glyphosate action as a glycine analogue to account for much of glyphosate’s toxicity. Glufosinate, an analogue of glutamate, likely exhibits an analogous toxicity mechanism. There is an urgent need to find an effective and economical way to grow crops without the use of glyphosate and glufosinate as herbicides.

Glyphosate being snuck into your diet

epoché vanguards gnosis


¹  (EPA 180 MLI)  US Environmental Protection Agency Office of Pesticide Programs Index to Pesticide Chemical Names, Part 180 Tolerance Information, and Food and Feed Commodities (by Commodity) December 12, 2012; https://www.epa.gov/sites/production/files/2015-01/documents/tolerances-commodity.pdf

²  (USDA Chap 2)  United States Department of Agriculture: Profiling Food Consumption in America, Chapter 2: http://www.usda.gov/factbook/chapter2.pdf

³  (FDA Pest/Chem)  Food and Drug Administration: Pesticide Residues and Industrial Chemicals 2004 – 2005 sorted by Pesticide/Chemical (PDF, 95KB) http://www.fda.gov/downloads/Food/FoodScienceResearch/TotalDietStudy/UCM291686.pdf

† Snopes: “Grain of Truth? Are U.S. farmers saturating wheat crops with Monsanto’s Roundup herbicide as a desiccant to facilitate a quicker harvest?”; http://www.snopes.com/food/tainted/roundupwheat.asp.

‡  Glyphosate pathways to modern diseases V: Amino acid analogue of glycine in diverse proteins; Journal of Biological Physics and Chemistry, Vol. 16 (June 2016): 9–46; https://www.researchgate.net/publication/305318376_Glyphosate_pathways_to_modern_diseases_V_Amino_acid_analogue_of_glycine_in_diverse_proteins

The Correlation-Causality One-Liner Can Highlight One’s Scientific Illiteracy

“Correlation does not prove causality.” You have heard the one-liner uttered by clueless social skeptics probably one thousand times or more. But real science rarely if ever starts with ‘proof.’ More often than not, neither does a process of science end in proof. Correlation was never crafted as an analytical means to proof. However, this one-liner statement is most often employed as a means of implying proof of an antithetical idea. To refuse to conduct the scientific research behind such fingerprint signal conditions, especially when involving a risk exposure linkage, can demonstrate just plain ole malicious ignorance. It is not even stupid.

When a social skeptic makes the statement “Correlation does not prove causality,” they are making a correct statement. It is much akin to pointing out that a pretty girl smiling at you does not mean she wants to spend the week in Paris with you. It is a truism, most often employed to squelch an idea which is threatening to the statement maker. As if the statement maker were the boyfriend of the girl who smiled at you. Of course a person smiling at you does not mean they want to spend a week in Paris with you. Of course correlation does not prove causality. Nearly every single person bearing any semblance of rational mind understands this. But what the one who has uttered this statement does not grasp, while feeling all smart and skeptickey in its mention, is that they have in essence revealed a key insight into their own lack of scientific literacy. Specifically, when a person makes this statement, four particular forms of error most often arise. In particular, they do not comprehend, across an entire life of employing such a statement, that

1.  Proof Gaming/Non Rectum Agitur Fallacy: Correlation is used as one element in a petition for ‘plurality’ and research inside the scientific method, and is NOT tantamount to a claim to proof by anyone – contrary to the false version of method foisted by scientific pretenders.

To attempt to shoot down an observation, by citing that it by itself does not rise tantamount to proof, is a form of Proof Gaming. It is a trick of trying to force the possible last step of the scientific method, and through strawman fallacy regarding a disliked observer, pretend that it is the first step in the scientific method. It is a logical fallacy, and a method of pseudoscience. Science establishes plurality first, seeks to develop a testable hypothesis, and then hopes, …only hopes, to get close to proof at a later time.

Your citing examples of correlation which fail the Risk Exposure Test, does not mean that my contention is proved weak.

… and yes, science does use correlation comparatives in order to establish plurality of argument, and consilience which can lead to consensus (in absence of abject proof). The correlation-causality statement, while mathematically true, is philosophically and scientifically illiterate.¹²³

2. Ignoratio Elenchi Fallacy (ingens vanitatum): What is being strawman framed as simply a claim to ‘correlation’ by scientific pretenders, is often a whole consilience (or fingerprint) of mutually reinforcing statistical inference well beyond the defined context of simple correlation.

Often when data shows a correlation, it also demonstrates other factors which may be elicited to demonstrate a relationship between two previously unrelated contributing variables or data measures. There are a number of other factors which science employs, through the disciplines of modeling theory, probability, and statistics, which can be drawn from a data relationship. In addition, these inferences can be used to mutually support one another, and exponentially increase the confidence of contentions around the data set in question.²³
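As a toy illustration of such mutually reinforcing inferences (the series below are synthetic, not the glyphosate data), a single pair of trends can be examined at once for linear co-movement, monotone co-movement, and step-wise sympathy – each inference bolstering the others well beyond a bare correlation number:

```python
# Toy illustration (synthetic data): one pair of series can support several
# mutually reinforcing statistical inferences, not just a single correlation.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ranks(xs):
    """Rank positions of each element (no tie handling; data here has no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

# Synthetic exposure/outcome series trending together
exposure = [1, 2, 4, 7, 11, 16, 22, 29]
outcome  = [3, 4, 6, 10, 15, 20, 28, 37]

r_pearson  = pearson(exposure, outcome)                # linear co-movement
r_spearman = pearson(ranks(exposure), ranks(outcome))  # monotone co-movement
same_sign_steps = sum(                                 # step-wise sympathy
    (b - a > 0) == (d - c > 0)
    for a, b, c, d in zip(exposure, exposure[1:], outcome, outcome[1:])
)
```

Here all three measures agree – strong Pearson correlation, perfect rank (Spearman-style) agreement, and every period-over-period step moving in sympathy. It is that concert of independent inferences, not any one coefficient, which constitutes a fingerprint.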

3.  Methodical Cynicism: Correlation is used as a tool to examine an allowance for and magnitude of variable dependency. In many cases where a fingerprint signal is being examined, the dependency risk has ALREADY BEEN ESTABLISHED or is ALLOWED-FOR by diligent reductive science. To step in the way of method and game protocols and persuasion in order to block study, is malevolent pseudoscience.

If the two variables pass the risk-exposure test, then we are already past correlation and into measuring that level of dependency, not evaluating its existence. If scientific studies have already shown that a chemical has impacts on the human or animal kidney/livers/pancreas, to call an examination of maladies relating to those organs as they relate to trends in use of that chemical a ‘correlation’ is an indication of scientific illiteracy on the part of the accuser. Once a risk relationship is established, as in the case of colon disorders as a risk of glyphosate intake, accusations of ‘correlation does not prove causality’ constitute a non-sequitur Wittgenstein Error inside the scientific method. Plurality has been established and a solid case for research has been laid down. To block such research is obdurate scientific fraud.²³

4.  Correlation does not prove causality… however, even weaker in strength of inference is an implicit refutation by claim of coincidence.

Most often, when one poses the ‘correlation does not prove causality’ apothegm, they are attempting to enforce an implicit counter-claim of coincidence in the observed data. While this is the null, it is also most often not an actual hypothesis – nor can such a claim be made without evidence. Most often the evidence in support of a correlation being merely coincidence is in fact weaker than the evidence in support of causality. A position of epoché, not denial, is warranted in such circumstances.

Dismissing or downgrading the sum total of these inferences through the equivocal use of the term ‘correlation’ is not only demonstrative of one’s mathematical and scientific illiteracy, but also demonstrates a penchant for the squelching of data through definition, in a fraudulent manner. It is an effort on the part of a dishonest agent to prevent the plurality step of the scientific method.
None of this has anything whatsoever to do with ‘proof.’

A Fingerprint Signal is Not a ‘Correlation’

An example of this type of scientific illiteracy can be found here (Note: a former article entitled Correlation Is Not Causation in Earth’s Dipole Contribution to Climate by Steven Novella, which was dropped by Discover Magazine). There is a well-established covariance, coincidence, periodicity and tail sympathy; a long, tight history of dynamic with respect to how climate relates to the strength of Earth’s magnetic dipole moment. This is a fingerprint signal. Steven Novella incorrectly calls this ‘correlation.’ A whole host of Earth’s climate phenomena move in concert with the strength of our magnetic field. This does not disprove anthropogenic contribution to current global warming. But to whip out a one-liner and shoot at a well-established facet of geoscience, all so as to protect standing ideas from facing the peer review of further research, is not skepticism; it is pseudoscience. The matter merits investigation. This hyperepistemology one-liner does not even rise to the level of being stupid.

Measuring of An Established Risk Relationship is Not a ‘Correlation’

An example of this type of scientific illiteracy can be found inside pharmaceutical company pitches about how the increase in opioid addiction and abuse was not connected with their promotional and lobbying efforts. Correlation did not prove causality. Much of today’s opiate epidemic stems from two decades of promotional activity undertaken by pharmaceutical companies. According to New Yorker Magazine, companies such as Endo Pharmaceuticals, Purdue Pharma and Johnson & Johnson centered their marketing campaigns on opioids as general use pain treatment medications. Highly regarded medical journals featured promotions directed towards physicians involved in pain management. Educational courses on the benefits of opioid-based treatments were offered. Pharmaceutical companies made widespread use of lobbyist groups in their efforts to disassociate opiate industry practices from recent alarming statistics (sound familiar? See an example where Scientific American is used for such propaganda here). One such group received $2.5 million from pharmaceutical companies to promote opioid justification and discourage legislators from passing regulations against unconstrained opioid employment in medical practices. (See New Yorker Magazine: Who is Responsible for the Pain Pill Epidemic?) The key here is that once a risk relationship is established, such as between glyphosate and cancer, one cannot make the claim that correlation does not prove causality in the face of two validated sympathetic risk-dependency signals. It is too late; plurality has been established and the science needs to be done. To block such science is criminal fraud.

Perhaps We Need a New Name Besides Correlation for Such Robust Data Fit

Both of the examples above illustrate instances where fake skeptic scientific illiteracy served to misinform, mislead, or cause harm to the American Public. Correlation, in contrast, is simply a measure of the ‘fit’ of a linear trend inside the relationship between a two-factor data set. It asks two questions (the third is simply a mathematical variation of the second):

  1. Can a linear inference be derived from cross-indexing both data sets?
  2. How ‘close to linearity’ do these cross-references of data come?
  3. How ‘close to curvilinearity’ do these cross-references of data come?

The answer to question number 2 is called an r-factor or correlation coefficient. Commonly, question number 3 is answered by means of a coefficient of determination and is expressed as an r² factor (r squared).³ Both are a measure of a paired-data set’s fit to linearity. That is all. In many instances pundits will use correlation to exhibit a preestablished relationship, such as the well-known relationship between hours spent studying and academic grades. They are not establishing proof with a graph, rather simply showing a relationship which has already been well documented through several other previous means. However, in no way, shape, or form does that mean that persons who apply correlation as the basis of a theoretical construct are thereby contending a case for proof. To assume so is a relational form of the post hoc ergo propter hoc fallacy – a logical flaw served up by the dilettante mind, which confuses the former case, an exhibit, and conflates it with the latter use, the instance of a petition for research.
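As a minimal illustration of the two questions above, the r-factor and r² can be computed directly. The study-hours and grade values below are hypothetical, invented purely to illustrate the exhibit the text mentions:

```python
import numpy as np

# Hypothetical paired data illustrating the study-hours vs. grades exhibit
hours = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
score = np.array([52.0, 55.0, 61.0, 60.0, 68.0, 71.0, 75.0, 80.0])

# Question 2: the r-factor (Pearson product-moment correlation coefficient),
# a measure of how close the paired data come to linearity
r = np.corrcoef(hours, score)[0, 1]

# Question 3: the coefficient of determination, r squared
r_squared = r ** 2

print(f"r = {r:.3f}, r^2 = {r_squared:.3f}")
```

Neither number says anything about why the two series move together; each only measures how closely their relationship fits a line.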

Correlation Dismissal Error (Fingerprint Ignorance)

/philosophy : logic : evidence : fallacy/ : when employing the ‘correlation does not prove causality’ quip to terminally dismiss an observed correlation, when the observation is being used to underpin a construct or argument possessing consilience, is seeking plurality, constitutes direct fingerprint evidence and/or is not being touted as final conclusive proof in and of itself.

THIS is Correlation (Pearson’s PPMCC)      It does not prove causality (duh…)¹²

Cor 1

This is a Fingerprint Signal and is Not Simply a Correlation³∋

diabetes and glyphosate

There are a number of other methods of determining the potential relationship between two sets of data, many of which appear to the trained eye in the above graph. Each of the below relational features individually, and increasingly as they confirm one another, establishes a case for plurality of explanation. The above graph is not “proving” that glyphosate aggravates diabetes rates. However, when this graph is taken against the exact same shape and relationship graphs for multiple myeloma, non-Hodgkin’s lymphoma, bladder cancer, thyroid disease, pancreatic cancer, irritable bowel syndrome, inflammatory bowel syndrome, lupus, fibromyalgia, renal function diminishment, Alzheimer’s, Crohn’s disease, wheat/corn/canola/soy sensitivity, SIBO, dysbiosis, esophageal cancer, stomach cancer, rosacea, gall bladder cancer, ulcerative colitis, rheumatoid arthritis, liver impairment and stress/fatty liver disease, … and for the first time in our history a RISE in the death rates of middle-aged Americans…

… and the fact that in the last 20 years our top ten disease prescription bases have changed 100%… ALL relating to the above conditions and ALL auto-immune and gut microbiome in origin. All this despite a decline in lethargy, smoking and alcohol consumption on average. All of this in populations younger than an aging trend can account for.

Then plurality has been argued. Fingerprint signal data has been well established. This is an example of consilience inside an established risk exposure relationship. To argue against plurality through the clueless statement “Correlation does not prove causality” is borderline criminal. It is scientifically illiterate, a shallow pretense which is substantiated by false rationality (social conformance) and a key shortfall in real intelligence.

Contextual Wittgenstein Error Example – Incorrect Rhetoric Depiction of Correlation

cor 2

The cartoon to the left is a hypoepistemology which misses the entire substance of what constitutes a fingerprint correlation. A fingerprint signal is derived when the bullet-pointed conditions below exist – none of which exist in the cartoon’s invalid comparison. This is a tampering with definition, enacted by a person who has no idea what correlation, in this context, even means. A Wittgenstein Error. In other words: scientifically illiterate propaganda. The conditions which exist in a proper correlation – or more, a fingerprint condition – are:

  • A constrained pre-domain and relevant range which differ in stark significance
  • An ability to fit both data sets to curvilinear or linear fit, with projection through golden section, regression or a series of other models
  • A preexisting contributor risk exposure between one set of unconstrained variables and a dependent variable
  • A consistent time displacement between independent and dependent variables
  • A covariance in the dynamic nature of data set fluctuations
  • A coincident period of commencement and timeframe of covariance
  • A jointly shared arrival distribution profile
  • Sympathetic long term convex or concave trends
  • A risk exposure (see below) – the cartoon to the left fails the risk exposure test.

Rhetoric: An answer, looking for a question, targeting a victim

Fingerprint Elements: When One or More of These Risk Factor Conditions is Observed, A Compelling Case Should be Researched¹²³

Corresponding Data – not only can one series be fitted with a high linear coefficient, another independent series can also be fitted with a similar or higher coefficient, which increases in coherence throughout a time series both before and during its domain of measure, and bears similar slope, period and magnitude. In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a strong case for plurality, especially if a question of risk is raised. To ignore this condition is a circumstance where ignorance ranges into fraud.

Cor 1a
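A minimal sketch of the ‘corresponding data’ condition above, on synthetic series (all values hypothetical, for illustration only): two independently collected series each fit a linear trend over the same domain with similar slope and a high r-factor.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(2000, 2015, dtype=float)

# Two independently collected series with similar linear trend (hypothetical)
series_a = 1.8 * (years - 2000) + 4.0 + rng.normal(0.0, 0.5, years.size)
series_b = 1.7 * (years - 2000) + 9.0 + rng.normal(0.0, 0.5, years.size)

# Fit each series to a line and measure its coherence to that line
slope_a = np.polyfit(years, series_a, 1)[0]
slope_b = np.polyfit(years, series_b, 1)[0]
r_a = np.corrcoef(years, series_a)[0, 1]
r_b = np.corrcoef(years, series_b)[0, 1]

print(f"slopes: {slope_a:.2f} vs {slope_b:.2f}; r-factors: {r_a:.3f}, {r_b:.3f}")
```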

Covariant Data – not only can one series be fitted with a high coefficient, another independent series can also be observed with a similar fit, which increases in coherence as a time series both before and during its domain of measure, and bears similar period and magnitude. Adding additional confidence to this measure is the dx/dy covariance, Brownian covariance, or distance covariance which can be established between the two data series; that is, the change in x(1)…x(n) versus y(1)…y(n). In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a very strong case for plurality, especially if a question of risk is raised. To ignore this condition is a circumstance where socially pushed skepticism ranges into fraud.

 Cor 1b
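The distance covariance (Székely’s ‘Brownian covariance’) mentioned above can be computed directly. Below is a minimal numpy sketch of the sample (V-statistic) version, applied to two synthetic series that share a nonlinear dynamic; the data are invented for illustration only:

```python
import numpy as np

def distance_covariance(x, y):
    """Sample distance covariance ('Brownian covariance') between two 1-D
    series; it vanishes (in the limit) only under independence, so it can
    detect nonlinear dependence that Pearson's r misses."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    a = np.abs(x[:, None] - x[None, :])   # pairwise distances within x
    b = np.abs(y[:, None] - y[None, :])   # pairwise distances within y
    # Double-center each distance matrix
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return np.sqrt(np.mean(A * B))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
x = np.sin(t) + 0.1 * rng.standard_normal(t.size)
y = np.sin(t) ** 2 + 0.1 * rng.standard_normal(t.size)  # dependent, nonlinearly

dcov = distance_covariance(x, y)
pearson = np.corrcoef(x, y)[0, 1]
print(f"distance covariance = {dcov:.3f}, Pearson r = {pearson:.3f}")
```

Here the Pearson r is small even though the two series are strongly dependent; the distance covariance remains clearly positive, which is why it adds confidence beyond a simple linear fit.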

Co-incidence Data – two discrete measures coincide as a time series both before and during their domain of measure, and bear similar period and magnitude. Adding additional confidence to this measure is the magnitude consistency which can be established between the two data series; that is, the discrete change in x(1)…x(n) versus y(1)…y(n). In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a moderately strong case for plurality, especially if a question of risk is raised. To ignore this condition is a circumstance where arrogant skepticism ranges into fraud.

Cor 1c

Jointly Distributed Data – two independent data sets exhibit the same or common arrival distribution functions. Adding additional confidence to this measure is the magnitude consistency which can be established between the two data series; that is, the discrete change in x(1)…x(n) versus y(1)…y(n). In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a moderately strong case for plurality, especially if a question of risk is raised. To ignore this condition is a circumstance where arrogant skepticism ranges into fraud.

Cor 1d
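One conventional way to check whether two independent samples share an arrival distribution is the two-sample Kolmogorov–Smirnov test. The sketch below (synthetic data, illustrative only) contrasts two samples drawn from the same exponential arrival process with a third sample drawn from a different distribution:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical arrival data: two samples from the same process, one from another
arrivals_a = rng.exponential(scale=2.0, size=500)
arrivals_b = rng.exponential(scale=2.0, size=500)
arrivals_c = rng.normal(loc=2.0, scale=0.5, size=500)

# Two-sample KS test: a small statistic (large p-value) is consistent with
# the samples sharing an arrival distribution
same = stats.ks_2samp(arrivals_a, arrivals_b)
diff = stats.ks_2samp(arrivals_a, arrivals_c)

print(f"same family: D={same.statistic:.3f}, p={same.pvalue:.3f}")
print(f"different:   D={diff.statistic:.3f}, p={diff.pvalue:.3g}")
```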

Probability Function Match – two independent data sets exhibit a resulting probability density function of similar name/type/shape. Adding additional confidence to this measure is the magnitude consistency which can be established between the two data series; that is, the discrete change in x(1)…x(n) versus y(1)…y(n). In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a moderately strong case for plurality, especially if a question of risk is raised. To ignore this condition is not wise.

Cor 1e

Marginal or Tail Condition Match – the tail or extreme regions of the data exhibit coincidence and covariance. Adding additional confidence to this measure is the magnitude consistency which can be established between the two data series when applied in the extreme or outlier condition; that is, the discrete change of these remote data in x(1)…x(n) versus y(1)…y(n). In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a moderately strong case for plurality, especially if a question of risk is raised. To ignore this condition is a circumstance where even moderate skepticism ranges into fraud.

 Cor 1f
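Tail sympathy can be probed by asking how often the two series exceed their own extreme quantiles together, compared with what independence would predict. A minimal sketch on synthetic data (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
common = rng.standard_normal(n)              # shared driver (hypothetical)
x = common + 0.5 * rng.standard_normal(n)
y = common + 0.5 * rng.standard_normal(n)

# How often do both series exceed their own 95th percentile simultaneously?
qx, qy = np.quantile(x, 0.95), np.quantile(y, 0.95)
joint_exceedance = np.mean((x > qx) & (y > qy))

# Under independence we would expect only 0.05 * 0.05 = 0.0025
print(f"joint tail exceedance = {joint_exceedance:.4f} "
      f"(independence predicts 0.0025)")
```

A joint exceedance rate well above the independence prediction is exactly the kind of extreme-region coincidence this condition describes.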

Sympathetic Long Term Shared Concave or Convex Trend – long term trends match each other; but more importantly, each is a departure from its previous history, they occurred simultaneously or offset by a consistent time displacement, both are convex or concave in common, and they co-vary across the risk period. Adding additional confidence to this measure is the magnitude consistency which can be established between the two data series; that is, the discrete change in x(1)…x(n) versus y(1)…y(n). In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a compellingly strong case for plurality, especially if a question of risk is raised. To ignore this condition is a circumstance where even moderate skepticism ranges into fraud.

 Cor 1g
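Shared convexity can be checked by comparing the sign of the quadratic term of a degree-2 fit to each series. Below, two synthetic series (hypothetical values) are flat over their early history and then depart simultaneously into an accelerating, convex rise:

```python
import numpy as np

t = np.arange(40, dtype=float)

# Flat history followed by a simultaneous accelerating (convex) departure
series_a = np.concatenate([np.full(20, 10.0), 10.0 + 0.05 * np.arange(20) ** 2])
series_b = np.concatenate([np.full(20, 3.0), 3.0 + 0.02 * np.arange(20) ** 2])

def curvature_sign(t, y):
    """Sign of the quadratic coefficient of a degree-2 polynomial fit:
    +1 for convex (accelerating), -1 for concave (decelerating)."""
    return int(np.sign(np.polyfit(t, y, 2)[0]))

sign_a = curvature_sign(t, series_a)
sign_b = curvature_sign(t, series_b)
print(sign_a, sign_b)  # both series share convex curvature
```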

Discrete Measures Covariance – the mode, median or mean of discrete measures is shared in common and/or in coincidence, and also varies sympathetically over time. Adding additional confidence to this measure is the consistency which can be established between the two data series; that is, the discrete change in mode and mean over time. In this instance as well, a preexisting risk exposure has been established. This does not prove causality; however it is a moderate case for plurality, especially if a question of risk is raised. To ignore this condition is not wise.

Cor 1h
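The sympathetic variation of medians over time can be sketched by computing a rolling median of each series and correlating the two resulting tracks. The series below are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(120)
driver = np.sin(t / 12.0)                    # shared slow dynamic (hypothetical)
x = 5.0 + driver + 0.3 * rng.standard_normal(120)
y = 2.0 + driver + 0.3 * rng.standard_normal(120)

def rolling_median(v, w=12):
    """Median over a sliding window of width w."""
    return np.array([np.median(v[i:i + w]) for i in range(len(v) - w + 1)])

# If the medians of the two series co-vary over time, that is a (modest)
# fingerprint element beyond a single static correlation figure
r_medians = np.corrcoef(rolling_median(x), rolling_median(y))[0, 1]
print(f"correlation of rolling medians = {r_medians:.3f}")
```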

Risk Exposure Chain/Test – two variables which, were a technical case established that one indeed influenced the other, would indeed be able to influence one another. (In other words, if your kid WAS eating rat poison every Tuesday, he WOULD be sick every Wednesday – but your kid eating rat poison would not make the city mayor sick on Wednesday.) If this condition exists, along with one or more of the above conditions, a case for plurality has been achieved. To ignore this condition is a circumstance where even moderate skepticism ranges into fraud.

 Cor 1i

These elements, when taken in concert by honest researchers, are called fingerprint data. When fake skeptics see an accelerating curve which matches another accelerating curve – completely (and purposely) missing the circumstance wherein any or ALL of these factors are more likely in play – to say “correlation” is what is being seen, demonstrates their scientific illiteracy. It is up to the ethical skeptic to raise their hand and say “Hold on, I am not ready to dismiss that data relationship so easily. Perhaps we should conduct studies which investigate this risk linkage and its surrounding statistics.”

To refuse to conduct the scientific research behind such conditions, especially if it involves something we are exposed to three times a day for life, constitutes just plain active ignorance and maliciousness. It is not even stupid.

epoché vanguards gnosis


¹  Madsen, Richard W., ” Statistical Concepts with Applications to Business and Economics,” Prentice-Hall, 1980; pp 604 – 610.

²  Gorini, Catherine A., “Master Math Probability,” Course Technology, 2012; pp. 175-196, 252-274.

³  Levine, David M.; Stephan, David F., “Statistics and Analytics,” Pearson Education, 2015; pp. 137-275.

∋  Graphic employed for example purposes only. Courtesy of the work of Dr. Stephanie Seneff, MIT: sulfates, glyphosates and GMO food; September 19, 2013.