Mere Facts & Data Do Not Constitute Knowledge

Why those who boast of holding the ‘data’ and ‘facts’ are also likely to be the most clueless.

Years ago I was asked to develop and lead a plan targeting the acquisition of a marine and boating consumer products company. During the discovery process I flew to Manhattan in order to interview the senior executives about the overall state and direction of the business. Being an avid sloop sailor, I was rather excited about the chance to speak with the developers of some of the key equipment I used on my own boat.

When I arrived at the intended conference room and began to speak with the C-level executives prior to the start of the meeting, it became clear that none of them had any experience in boating, nor in work or recreation afloat at all. When I asked whether any of them were boaters, the CEO replied, “I think Earl in Product Development owns a boat.” To which the rest of the team acknowledged, “Yes, Earl owns one, I’ve been on it with him,” or responses of a similar nature. These were all Harvard Finance MBA-trained suit-and-tie wearers – nothing but undertakers there to embalm and prepare the body for value-extraction and burial. They were perdocent, taught to be more than eager to follow the instructions exactly as prescribed and without question. They were quickly-promoted operatives whose mission was to funnel formerly productive asset value to predatory offshore elite stockholders – a whole set of clowns, none of whom bore an interest in, nor knew anything about, boating. Portraits of fashion models, celebrities, and dead Presidents aboard various vessels adorned their conference room walls – mocking them for the betrayal of trust conducted in their name.

The single most toxic thing you can do to a business is to appoint its own accountants or information technologists to run it. The perdocent mistakenly conflates holding the data and the facts, and following the instructions, with competence in the art of delivery itself. Very often, the opposite is true.

The worst thing a university can do is teach confidence interval and p-value skills to candidates who have no idea what inference is, nor how to prosecute an argument. This is like handing binoculars to the driver of a car, so that they can ‘see better’.

The most clueless person in a business is typically the controller or the Vice President of Finance – ironically, also the person who holds and knows the most facts and data about the business itself. When I conduct a strategy, the team is required to work very closely with these individuals early on, simply because they possess all the data the team initially needs. We will often get along well, because I hold an MBA in Finance from a Tier I B-school as well. This reality forces a business’ accountants to also be represented on the strategy team.

But once the data has been collected and the accounts have been defined, the process of strategy risks devolving into teaching those representatives why Enterprise Resource Planning profit & loss pro formas are not sufficient tools for determining the state of a market nor the viable direction of a business. I have found myself at times pointing out that the scope of the project did not include teaching accountants how to be strategists and business leaders. I have also often had to find ways to insulate the strategy team from being forced to use their methods – techniques which merely serve to show that driving in-context sales, capturing low-hanging fruit, and cost-cutting are the appropriate next steps.

Why Those Who Boast of Holding the ‘Data’ and ‘Facts’ are Also Likely to Be the Most Clueless

Among my Folleagues I have been known, from time to time, to issue the following apothegm.

Data must be denatured into information.
Information must be transmuted into intelligence.
Intelligence is the first sound basis from which to plan or take action.

In order to comprehend what these tenets mean, let’s delve further into the philosophy behind them, shall we?

I. Data must be denatured into information.

Taking data to 3rd normal and relative-to-context form is taught in database development courses, no doubt.1 However, denaturation of data requires expertise in the dynamics of the data structures themselves – a faithfulness in observing them over time, a knowledge of human nature, or experience with such structures under similar business dynamics. Denaturation is the process of removing the natural background noise, influences, and confounding factors from the raw data itself.

Resource Example: Powell, Gavin; Database Modeling Step by Step (1st Edition); Auerbach Publications, 18 Dec 2019; ISBN-13: 978-0367422172

Example: In the example below, the graph publisher has depicted the flow of death records through a medical death classification category called R00-R99 ‘Symptoms, Signs, and Abnormal Clinical and Laboratory Findings, Not Elsewhere Classified’. What the graph creator failed to do, however, is take the data to a normal form – to ‘denature’ it. Denaturation (in the biochemistry sense, not the alcohol sense) is the process of removing invalid primary, secondary, or tertiary structure so that data may be parsed into its native state.2 This is also often called a process of ‘normalization’. However, since normalization relates to the elemental structure of a relational database as well, I use the term ‘denature’ in order to distinguish that this step constitutes more than the simple assembly of a 3rd Normal Form (3NF) database table structure.3

A ‘graph’ is a visual depiction of raw data. A ‘chart’ communicates a concept for comprehension, which transcends or belies the mere ‘data’ behind it.

The example Exhibit 1 graph below is compromised by two dynamic secondary structures: first, the lag in the rate of the data’s arrival (the dip at the right end of each sub-line), and second, the rate at which the data clears to other underlying cause-of-death codes (the relative gaps between the sub-lines). These two secondary structures must be removed from the data (denatured) in order to make it usable for model development or inference.

Denature – to remove the secondary, tertiary, agency-driven, confounding, or misleading concatenation structures from extracted data, no matter its purported ‘reliability’, in order to derive its base state – a state which is then more likely to inform.

As such, while the graph below (Exhibit 1) indeed portrays fact, it does not therefore also constitute information. It is, technically, disinformation sans intent. It is equivalent to an accountant’s ledger entry and nothing more. The ‘accountant’ here (the CDC) is missing (most of) an alarming rise in this category of deaths, because they are deluding themselves through a conflation of the data with its secondary dynamic structures – an inability to see the forest for the ‘facts’. The fact that the data is reliable has nothing to do with whether or not it is also therefore useful. It must be denatured before it can become informative.

Exhibit 1 – US Centers for Disease Control data which looks normal and informative, yet whose dynamic secondary structures obfuscate an underlying critical trend in deaths.
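To make the denaturation step concrete, here is a minimal sketch, in Python, of the two adjustments described above applied to a weekly count series. The completeness fractions and the clearing rate are illustrative assumptions only (not CDC figures); in practice they would be estimated from how prior weeks’ counts matured over time.

```python
# Minimal sketch: denaturing a weekly death-count series by removing two
# dynamic secondary structures - reporting lag and clearing to other codes.
# The factors below are illustrative assumptions, not actual CDC parameters.

raw_counts = [1900, 1950, 2010, 2080, 2140, 1880, 1350, 600]  # most recent weeks last

# Fraction of each week's records estimated to have arrived so far
# (recent weeks are less complete - the 'dip at the right end').
completeness = [1.00, 1.00, 1.00, 0.99, 0.97, 0.85, 0.55, 0.25]

# Fraction of R00-R99 records historically recoded to a specific underlying
# cause of death once certification completes (the 'clearing' structure).
clearing_rate = 0.35

def denature(counts, completeness, clearing_rate):
    """Return lag-adjusted counts, net of records expected to clear elsewhere."""
    denatured = []
    for observed, frac_arrived in zip(counts, completeness):
        projected_total = observed / frac_arrived          # undo the arrival lag
        retained = projected_total * (1 - clearing_rate)   # remove what will clear
        denatured.append(round(retained))
    return denatured

print(denature(raw_counts, completeness, clearing_rate))
# With these illustrative factors, a rise masked by the lag 'dip' in the raw
# tail becomes visible once the series is denatured.
```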

II. Information must be transmuted into intelligence.

Transmuting information into intelligence typically involves two steps:

a. Assembly of the denatured elemental data structures produced in Step I above into a model outlining a chain of mechanism and feedback (a dynamic or ‘neural’ model) which represents the system in question (almost all things involve systems), and

b. Filtering out imbalances and diseconomies of scale. That is to say – modeling the true value, magnitude, risk, or cost of each element or node in a system, and not its fiat or nominal estimate of the same.

The holding of a reasonably accurate model which can describe contribution points, contributors, and current or future activity is called holding ‘intelligence’. Counter-intelligence is the process of obfuscating these elements so that the enemy or competitor is not able to produce an accurate model. The failure to develop dynamic or neural systems models, along with modeling by means of nominal (not normalized) node costs (values), almost always leads to the conclusion that the current approach or understanding is the correct answer. Sound familiar? For readers of The Ethical Skeptic, it should.

The techniques which serve to break these habits are typically taught in systems and value chain engineering curricula, but not in most of academic science.

Resource Example: West, Page; Strategic Management: Value Creation, Sustainability, and Performance (6th Edition); Riderwood Publishing, 19 Nov 2019; ISBN-13: 978-1733174404

Example: At one fashion designer for whom I did strategy work, the question was raised as to whether a shift to sourcing out of South Korea was in order. South Korea was where the main competitor conducted its sourcing – a sourcing strength which proved to be a disadvantage to my client in terms of both factory clout and landed cost. I proposed that we take the opposite approach and develop sourcing out of Jordan, Mexico, and Pakistan instead. The finance team retorted that per-unit first costs out of those nations were prohibitively high, and that product was slow to arrive and costly to ship.

My team responded that the figures cited by accounting constituted ‘nominal loaded figures’ – in other words, they did not reflect economies of scale, maturation, and clout. They were simply output data from an ERP report, not the result of a dynamic and normalized node structure. I contended that, once those resources were fully loaded and managed, not only would their per-unit costs fall into line, but the speed with which they could outpace South Korean factories would enable a speed-to-margin win – one which would surpass the client’s major competitor in terms of fashion-based push sell-through.
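The arithmetic behind that contention can be sketched as follows. This is a minimal illustration, in Python, of the difference between a nominal ERP first cost and a normalized node cost adjusted for economies of scale and maturation; the volumes, costs, and learning-curve exponent are purely illustrative assumptions, not the client’s actual figures.

```python
# Minimal sketch: comparing a nominal (ERP-reported) node cost with a
# normalized node cost that reflects economies of scale and maturation.
# All figures and the learning-curve exponent are illustrative assumptions.

def normalized_unit_cost(nominal_cost, current_volume, planned_volume, learning_exponent=-0.15):
    """Scale a nominal per-unit cost to a planned volume via a power-law experience curve."""
    return nominal_cost * (planned_volume / current_volume) ** learning_exponent

# Two sourcing nodes, each described by (nominal first cost, current volume):
nodes = {
    "mature_source":   (4.10, 2_000_000),  # incumbent, little scale benefit left
    "immature_source": (5.60, 150_000),    # candidate, large scale benefit ahead
}

planned_volume = 2_000_000
for name, (nominal, volume) in nodes.items():
    print(name, round(normalized_unit_cost(nominal, volume, planned_volume), 2))

# The immature node's normalized cost falls well below its nominal figure
# (and below the incumbent's), a distinction a static ERP report cannot express.
```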

The CEO comprehended what I was saying and elected to implement our strategy. Three years later, I was invited to speak before executives at this major competitor (a prominent national brand) – and, without divulging proprietary information, to present how this client had beaten them over the previous three years through smart strategic changes. Although my team played only a minor role at the very beginning of the process, this business case constituted one of the most monumental ‘David and Goliath’ upsets in US business history.

III. Intelligence is the first sound basis from which to plan or take action.

Intelligence is a process of eliminating the irrelevant as much as it is a process of identifying the truth. Often it is not a lack of truth which prevents the making of a sound decision, but rather a lack of clarity – a distraction with the chronically ill-defined, trivial, and menial. Humanity labors like an unfortunate animal trapped in a tar pit of irrelevancy, believing that its desperate bellows will somehow serve to bring it to the salvation of truth.

The assembly and testing of conjecture scenarios through a systemic model – the measure of soundness, logical calculus, deduction, critical path, and inferential strength – I have found treated to sufficiency (and even then not completely) only on my own website. I am sure all of this is covered across a number of separate publications – however, there exist few individuals, if any, who can match my experiential depth in data analysis, business operations, military intelligence, philosophy, science, decades of applied systems and value chain engineering projects, knowledge of the nature of deception, along with hundreds of successful business and national strategies to boot.

This leveraging of intelligence into sound planning is roughly outlined in the chart below. It involves the process of extracting thinking from the mire of irrelevance and onto the ‘shining pathway of success’ – the critical path.
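Since ‘critical path’ is a term borrowed from systems and value chain engineering, a minimal sketch of the literal computation may help: the longest dependency chain through a plan is the only chain which sets its finish date, and everything off that chain is, for scheduling purposes, the mire of irrelevance. The task names and durations below are illustrative assumptions only.

```python
# Minimal sketch of a literal critical path computation: find the longest
# dependency chain through a small plan. Tasks and durations are illustrative.
from functools import lru_cache

tasks = {
    # task: (duration_in_weeks, [prerequisite tasks])
    "denature_data":      (2, []),
    "build_system_model": (4, ["denature_data"]),
    "normalize_nodes":    (3, ["denature_data"]),
    "test_scenarios":     (3, ["build_system_model", "normalize_nodes"]),
    "select_strategy":    (1, ["test_scenarios"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Earliest possible finish time of a task given its prerequisite chain."""
    duration, prereqs = tasks[task]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

def critical_path(task):
    """Walk back along the prerequisites that determine the finish time."""
    _, prereqs = tasks[task]
    if not prereqs:
        return [task]
    bottleneck = max(prereqs, key=earliest_finish)
    return critical_path(bottleneck) + [task]

end_task = "select_strategy"
print(earliest_finish(end_task), "weeks:", " -> ".join(critical_path(end_task)))
# Only the tasks printed here govern the plan's finish date.
```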

There are many people who write books about this topic, but very few who have actually done it. You will note that these Type II or Semantic Experts employ intimidating-in-appearance heuristics and second-hand accounts as a substitute for actual personal experience and competence. So I consider this, fortunately or unfortunately, to constitute my horizon of expertise.

Resource and Application Examples:

The Ethical Skeptic, “The Strategic Mindset”; The Ethical Skeptic, WordPress, 27 Jan 2022; Web, https://theethicalskeptic.com/?p=61452

The Ethical Skeptic, “Inflection Point Theory and the Dynamic of The Cheat”; The Ethical Skeptic, WordPress, 20 Oct 2019; Web, https://wp.me/p17q0e-atd

The Ethical Skeptic, “The Map of Inference”; The Ethical Skeptic, WordPress, 4 Mar 2019; Web, https://wp.me/p17q0e-9r6

Intelligence, after all, constitutes far more than the mere holding of facts and data. Those who boast of such only serve to discredit themselves in the eyes of an ethical skeptic. Just smile and quietly move on when you encounter these types. Refuse to join them as they wallow, blissful and intoxicated, inside the warm yellowing pool of fact-fueled ignorance.

The Ethical Skeptic, “Mere Facts & Data Do Not Constitute Knowledge”; The Ethical Skeptic, WordPress, 9 Apr 2022; Web, https://theethicalskeptic.com/?p=64843

The Plural of Anecdote is Data

A single observation does not necessarily constitute an instance of the pejorative descriptor ‘anecdote’. Not only do anecdotes constitute data, but one anecdote can serve to falsify the null hypothesis and settle a scientific question in short order. Such is the power of a single observation. Such is the power of skillfully wielding scientific inference. Fake skeptics seek to emasculate the power of the falsifying observation, at all costs.

It is incumbent upon the ethical skeptic – those of us who are researchers, if you will; those who venerate science both as an objective set of methods and as their underlying philosophy – to understand the nature of anecdote and how the tool is correctly applied inside scientific inference. Anecdotes are not ‘Woo’, as most fake skeptics will imply through a couple of notorious memorized one-liners. Never mind what they say, nor what they might claim as a straw man of their intent – watch instead how they apply their supposed wisdom. You will observe such abuse of the concept to be most often the case. We must insist, to the theist and nihilist religious community of deniers, that inside the context of falsification/deduction in particular, a single observation does not constitute an instance of ‘anecdote’ (in the pejorative). Not only do anecdotes constitute data, but one anecdote can serve to falsify the Null (or even the null hypothesis) and settle the question in short order. Such is the power of a single observation.

See ‘Anecdote’ – The Cry of the Pseudo-Skeptic

To an ethical skeptic, inductive anecdotes may prove informative in nature if one gives structure to them and catalogs them over time. Anecdotes which are falsifying/deductive in nature are not only immediately informative, but more importantly, probative – probative with respect to the null. I call the inferential mode modus absens the ‘null’ because, in non-Bayesian styled deliberation, the null hypothesis – the notion that something is absent – is usually not actually a hypothesis at all. Rather, this species of idea constitutes simply a placeholder: the idea that something is not, until proved to be. And while this is a good common-sense structure for the resolution of a casual argument, it does not mean that one should therefore believe or accept the null as merely an outcome of this artifice of common sense. In a way, deflecting observations by calling them ‘anecdote’ is a method of believing the null, and not in actuality conducting science or critical thinking. However, this is the reality we face with unethical skeptics today. The tyranny of the religious default Null.

The least scientific thing a person can do is to believe the null hypothesis.

Wolfinger’s Misquote

/philosophy : skepticism : pseudoscience : apothegm/ : you may have heard the phrase ‘the plural of anecdote is not data’. It turns out that this is a misquote. The original aphorism, by the political scientist Ray Wolfinger, was just the opposite: ‘The plural of anecdote is data’. The only thing worse than the surrendered value of an anecdote (as opposed to its collected value, in science) is the incurred bias of ignoring anecdotes altogether. This is a method of pseudoscience.

Our opponents elevate the scientific status of a typical placeholder Null (such-and-such does not exist) and pretend that the idea 1. actually possesses a scientific definition, and 2. bears consensus acceptance among scientists. These constitute the first of their many magician’s tricks, which those who do not understand the context of inference fall for, over and over. Even scientists will fall for this ol’ one-two, so it is understandable why journalists and science communicators will as well. But anecdotes are science, when gathered under the disciplined structure of Observation (the first step of the scientific method). Below we differentiate four contexts of the single observation – two inductive and two deductive inference contexts – only one of which fits the semantics of ‘anecdote’ exploited by fake skeptics.

Inductive Anecdote

Inductive inference is the context wherein a supporting case or story can be purely anecdotal (‘The plural of anecdote is not data’). This apothegm is not a logical truth: it can apply to certain cases of induction, but it does not apply universally.

Null:  Dimmer switches do not cause house fires to any greater degree than do normal On/Off flip switches.

Inference Context 1 – Inductive Data Anecdote:  My neighbor had dimmer switched lights and they caused a fire in his house.

Inference Context 2 – Mere Anecdote (Appeal to Ignorance):  My neighbor had dimmer switched lights and they never had a fire in their house.

Hence we have Wolfinger’s Inductive Paradox.

Wolfinger’s Inductive Paradox

/philosophy : science : data collection : agency/ : an ‘anecdote’ to the modus praesens (observation or case which supports an objective presence of a state or object) constitutes data, while an anecdote to the modus absens (observation supporting an appeal to ignorance claim that a state or object does not exist) is merely an anecdote. One’s refusal to collect or document the former does not constitute skepticism. Relates to Hempel’s Paradox.

Finally, we have the instance wherein we step out of inductive inference and into the stronger probative nature of deduction and falsification. In this context an anecdote is almost always probative. As in the case of Wolfinger’s Inductive Paradox above, one’s refusal to collect or document such data does not constitute skepticism.

Deductive or Falsifying Anecdote

Deductive inference, leading also to falsification (‘The plural of anecdote is data’). Even the singular of anecdote is data under the right condition of inference.

Null:  There is no such thing as a dimmer switch.

Inference Context 3 – Deductive Anecdote:  I saw a dimmer switch in the hardware store and took a picture of it.

Inference Context 4 – Falsifying Anecdote:  An electrician came and installed a dimmer switch into my house.
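To illustrate the asymmetry between these four contexts, here is a toy Bayesian sketch in Python. The priors and likelihoods are illustrative assumptions only; the point is not the numbers, but how differently a single observation moves belief depending upon its inference context.

```python
# Toy sketch: why a single observation carries very different inferential
# weight across the four contexts above. The claim under test is whatever
# the Null denies (contexts 1-2: 'dimmers cause more fires'; contexts 3-4:
# 'dimmer switches exist'). Priors and likelihoods are illustrative only.

def update(prior, p_obs_if_claim_true, p_obs_if_claim_false):
    """One Bayesian update of P(claim) given a single observation."""
    joint_true = prior * p_obs_if_claim_true
    joint_false = (1 - prior) * p_obs_if_claim_false
    return joint_true / (joint_true + joint_false)

prior = 0.10  # a skeptical prior on the claim in each case

contexts = {
    "1. inductive data anecdote (neighbor's dimmer fire)":   (0.30, 0.10),
    "2. mere anecdote / appeal to ignorance (no fire seen)":  (0.70, 0.90),
    "3. deductive anecdote (photo of a dimmer switch)":       (0.60, 0.01),
    "4. falsifying anecdote (dimmer switch installed)":       (0.99, 0.0001),
}

for name, (p_true, p_false) in contexts.items():
    print(name, round(update(prior, p_true, p_false), 3))

# Contexts 1 and 2 move a skeptical prior only modestly; context 3 moves it
# sharply, and context 4 effectively settles the question - the power of one
# observation in the right inference context.
```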

For example, what occurs when one accepts materialism as an a priori truth pertains to those who insert just such religious agency between Contexts 3 and 4 above. They contend that dimmer switches do not exist, and that therefore any photo of one necessarily has to be false. And of course, at any given time, there is only one photo of one at all (all previous photos were dismissed earlier in similar exercises). Furthermore, they then forbid any professional electrician from installing any dimmer switches (lest the electrician be subject to losing their license). In this way dimmer switches can never ‘exist’, and deniers can endlessly proclaim to non-electricians ‘you bear the burden of proof’ (see Proof Gaming) – from then on deeming all occurrences of Context 3 to constitute lone cases of ‘anecdote’, while failing to distinguish between the inductive and deductive contexts therein.

Our allies and co-observers as ethical skeptics need to bear knowledge of the philosophy of science (skepticism) sufficient to stand up and say, “No – this is wrong. What you are doing is pseudoscience.”

Hence, one of my reasons for creating The Map of Inference.

The Ethical Skeptic, “The Plural of Anecdote is Data”; The Ethical Skeptic, WordPress, 1 May 2019; Web, https://wp.me/p17q0e-9HJ

The Spectrum of Evidence Manipulation

Unconscious bias occurs within everyone and inside most deliberation. Such innocent manifestation of bias, while important to keep in focus, is not the concern of the First Duty of the ethical skeptic. The list of fallacies and crooked thinking below outlines something beyond just simple bias – something we call agency. These are the tricks of obfuscation of evidence for which ethical skeptics keep vigilant watch.

Michael Shermer has outlined, in his November 2018 editorial for Scientific American, a new fallacy of data which he calls the ‘Fallacy of Excluded Exceptions’. In this informal fallacy, evidence which does not serve to confirm one’s a priori conclusion is systematically eliminated or ignored, despite its potentially robust import. This is a form not of unconscious bias, but of a more prevalent and dangerous mode of corrupted thinking which we at The Ethical Skeptic call ‘agency’. The First Duty of Ethical Skepticism is to oppose agency (not simply bias).

Fallacy of Excluded Exceptions

/philosophy : pseudoscience : data manipulation/ : a form of data skulpting in which a proponent bases a claim upon an apparently compelling set of observations confirming an idea, yet chooses to ignore an equally robust set of examples of a disconfirming nature. One chisels away at, disqualifies, or ignores large sets of observation which are not advantageous to the cause, resulting in one only seeing what one sought to see to begin with.

“Excluded exceptions test the rule. Without them, science reverts to subjective speculation.” ~ Michael Shermer 1

Despite his long career inside skepticism, Michael is sadly far behind most ethical skeptics in his progression in understanding the tricks of fake epistemology and agency. That is because of his anchoring bias – he regards today’s ‘skepticism’ as representative of science and the scientific method, as well as of all that is good in data reduction and syllogism. He is blinded by the old influences of bandwagon hype – influences which, as master, he must still serve today, and which serve to imbue his opinions with agency. The celebrity conflict of interest.

Agency and bias are two different things.
Ironically, agency can even tender the appearance of mitigating bias, as a method of its very insistence.

Well, we at The Ethical Skeptic have been examining tricks of data manipulation and agency for decades, and already possessed a name for this fallacy which Michael has been compelled, of necessity, to create on his own – precisely because it is a very common trick we have observed on the part of fake skeptics to begin with. Michael’s entrenchment inside social skepticism is the very reason why he could not see this fallacy until now – he is undergoing skeptive dissonance and is beginning to spot fallacies of agency which his cronies have been committing for decades. Fallacies which he perceives to be ‘new’. Congratulations Michael, you are repenting. The next step is to go out and assist those whom your cronies and sycophants have harmed in the past through fake skepticism. Help them develop their immature constructs into hypotheses with mechanism, help them with the scientific method, help them with the standards of how to collect and reduce data and argument. Drop the shtick of a priori declarations of ‘you are full of baloney’ and help them go and find that out for themselves. Maybe. 2

Agency versus Bias

Bias is the Titanic’s habit of failing to examine its iceberg alerts.
Agency is the Titanic telling the ship issuing iceberg alerts to ‘shut up’.

If all we suffered from was mere bias, things might even work out fine.
But reality is that we are victims of agency, not bias.

Just maybe as well, in embarking upon such a journey you will find, as I did, that you really did not understand the world all that well, nor had things as figured out as you had assumed. Your club might have served as a bit of a cocoon, if you will. Maybe in this journey you have so flippantly stumbled upon, you will observe as ‘new’ a fallacy which ethical skeptics have identified for a long time now – one which your cabal has routinely ignored.

Evidence Sculpting (Cherry Sorting)

/philosophy : pseudoscience : data manipulation/ : has more evidence been culled from the field of consideration for this idea, than has been retained? Has the evidence been sculpted to fit the idea, rather than the converse?

Skulptur Mechanism – the pseudoscientific method of treating evidence as a work of sculpture. Methodical inverse negation techniques employed to dismiss data, block research, obfuscate science and constrain ideas such that what remains is the conclusion one sought in the first place. A common tactic of those who boast of all their thoughts being ‘evidence based’. The tendency to view a logical razor as a device which is employed to ‘slice off’ unwanted data (evidence sculpting tool), rather than as a cutting tool (pharmacist’s cutting and partitioning razor) which divides philosophically valid and relevant constructs from their converse.

Your next assignment, Michael, should you choose to accept it, is to learn how agency promotes specific hypotheses through the targeting of all others (from The Tower of Wrong: The Art of Professional Lying):

Embargo Hypothesis (Hξ)

/philosophy : pseudoskepticism/ : was the science terminated years ago, in the midst of large-impact questions of a critical nature which still remain unanswered? Is such research now considered ‘anti-science’ or ‘pseudoscience’? Is there enormous social pressure to not even ask questions inside the subject? Are mocking and derision high – curiously in excess of what the subject should merit?

Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology which is heading in an undesired direction, artificially declaring, under a condition of praedicate evidentia, the science to be ‘settled’ and all opposing ideas to be anti-science, credulity, and pseudoscience.

But Michael, as you begin to spot agency inside purported processes of epistemology, we have to warn you – there is more, oh so much more, which you do not know. Let’s take a brief look, shall we?

Agency as it Pertains to Evidence and Data Integrity

So, in an effort to accelerate Michael’s walk through the magical wonderland of social skepticism and how it skillfully enforces conformance upon us all, let us examine the following. The fallacies, modes of agency, and methods of crooked thinking below relate to manipulations of data which are prejudices, and not mere unconscious biases – such as in the case of anchoring bias, wherein one adopts a position overly influenced by one’s starting point or the first information which arrives. One may hold such a bias, but at least it is somewhat innocent in its genesis, i.e. not introduced by agency. Prejudicial actions in the handling and reduction of evidence and data are the preeminent hint of the presence of agency, and the first things which the ethical skeptic should look out for inside a claim, denial, mocking, or argument.

Unconscious bias happens with everyone, but the list of fallacies and crooked thinking below outlines something more than simple bias. They involve processes of pseudo-induction, panduction, abduction, and pseudo-deduction, along with the desire to dissemble the contribution of agency. You can find these, along with agency-independent and unconscious biases, all defined at The Tree of Knowledge Obfuscation: Misrepresentation of Evidence or Data.

And of course, all of these fallacies, biases, modes of agency, and forms of crooked thinking can be found and defined in the compendium below:

And as well, more modes of agency can be found at The Tree of Knowledge Obfuscation itself.

The Tree of Knowledge Obfuscation

A compendium of fallacy and corrupted thought commonly employed inside Social Skepticism

The Ethical Skeptic, “The Spectrum of Evidence Manipulation”; The Ethical Skeptic, WordPress, 2 Nov 2018; Web, https://wp.me/p17q0e-8wy