Why those who boast of holding the ‘data’ and ‘facts’ are also those likely to be the most clueless.
Years ago I was asked to develop and lead a plan targeting acquisition of a marine and boating consumer products company. During the discovery process I flew to Manhattan in order to interview the senior executives about the overall business state and direction. Being an avid sloop sailor, I was rather excited about the chance to speak with the developers of some of the key equipment I employed on my boat.
When I arrived at the appointed conference room and began to speak with the C-level executives prior to the meeting start, it became clear that none of them had any experience in boating, nor indeed in any work or recreation afloat. Upon asking if any of them were boaters, the CEO replied, “I think Earl in Product Development owns a boat.” To which the rest of the team acknowledged with responses of the nature, “Yes, Earl owns one, I’ve been on it with him.” These were all Harvard Finance MBA trained suit and tie wearers – nothing but undertakers there to embalm and prepare the body for value-extraction and burial. They were perdocent, taught to be more-than-eager to follow the instructions exactly as prescribed and without question. They were quickly-promoted operatives whose mission was to funnel formerly productive asset value to predatory offshore elite stockholders – a whole set of clowns, none of whom bore an interest in, nor knew anything about, boating. Portraits of fashion models, celebrities, and dead Presidents embarked in various vessels adorned their conference room walls – mocking them for the betrayal of trust conducted in their name.
The single most toxic thing you can do to a business is appoint its own accountants or information technologists to run it. The perdocent mistakenly conflates a state of holding the data and facts, and following the instructions, with competence in the art of its delivery. Very often, the opposite is true.
The worst thing a university can do is teach confidence interval and p-value skills to candidates who have no idea what inference is, nor how to prosecute an argument. This is like handing binoculars to the driver of a car so that they can ‘see better’.
The most clueless person in a business is typically the controller or Vice President of Finance – ironically, the very person who holds and knows the most facts and data about the business itself. When I conduct a strategy, the team is required to work very closely with these individuals early on, simply because they possess all the data the team initially needs. We will often get along well, because I hold an MBA in Finance from a tier I B-school as well. This reality forces a business’ accountants to also be represented on the strategy team.
But once the data is collected and accounts have been defined, the process of strategy is exposed to devolving into teaching those representatives why Enterprise Resource Planning financial profit & loss pro formas are not sufficient tools for determining the state of a market, nor the viable direction of a business. I have found myself at times pointing out that the scope of the project did not include teaching accountants how to be strategists and business leaders. As well, I have often had to find ways to insulate the strategy team from being forced to use their methods – techniques which will merely serve to show that driving in-context sales, capturing low-hanging fruit, and cost-cutting are the appropriate next steps.
Why Those Who Boast of Holding the ‘Data’ and ‘Facts’ are Also Likely to Be the Most Clueless
Among my colleagues, I have been known from time to time to issue the following apothegm.
Data must be denatured into information.
Information must be transmuted into intelligence.
Intelligence is the first sound basis from which to plan or take action.
In order to comprehend what these tenets mean, let us delve a bit further into the philosophy behind each of them, shall we?
I. Data must be denatured into information.
Taking data to 3rd normal and relative-to-context form is no doubt taught in database development courses. However, denaturation of data requires expertise in the dynamics of the data structures themselves: a faithfulness in observing them over time, a knowledge of human nature, or experience observing such structures under similar business dynamics. Denaturation is the process of removing the natural background noise, influences, and confounding factors from the raw data itself.
Resource Example: Powell, Gavin; Database Modeling Step by Step (1st Edition); Auerbach Publications, 18 Dec 2019; ISBN-13: 978-0367422172
Example: In the example below, the graph’s publisher has depicted the flow of death records through a medical death classification category called R00-R99, ‘Symptoms, Signs, and Abnormal Clinical and Laboratory Findings, Not Elsewhere Classified’. What the graph’s creator failed to do, however, is take the data to a normal form – to ‘denature’ it. Denaturation (in the biochemistry sense, not the alcohol one) is the process of removing invalid primary, secondary, or tertiary structure so that data may be parsed into its native state. This is also often called a process of ‘normalization’. However, since normalization relates to the elemental structure of a relational database as well, I use the term ‘denature’ in order to distinguish that this step constitutes more than the simple assembly of a 3rd Normal Form (3NF) database table structure.
A ‘graph’ is a visual depiction of raw data. A ‘chart’ communicates a concept for comprehension, which transcends or belies the mere ‘data’ behind it.
The example Exhibit 1 graph below is compromised by two dynamic secondary structures: first, the lag in the rate of the data’s arrival (the dip at the right end of each sub-line), and second, the rate at which records are cleared to other underlying cause of death codes (the relative gaps between the sub-lines). These two secondary structures must be removed from (denatured out of) the data in order to make it usable for model development or inference.
Denature – to remove the secondary, tertiary, agency-driven, confounding, or misleading concatenation structures from extracted data, no matter its purported ‘reliability’, in order to derive its base state – a state which is then more likely to inform.
As such, while the table below indeed portrays fact, it does not therefore also constitute information. Technically it is disinformation, sans intent. It is equivalent to an accountant’s ledger entry and nothing more. The ‘accountant’ here (the CDC) is missing (most of) an alarming rise in this category of deaths, because they are deluding themselves through a conflation of the data with its secondary dynamic structures – an inability to see the forest for the ‘facts’. The fact that the data is reliable has nothing to do with whether or not it is also therefore useful. It must be denatured before it can become informative.
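To make the denaturation step concrete, here is a minimal sketch in Python. All counts and completeness fractions below are hypothetical illustrations (they are not CDC figures), and the sketch removes only the first of the two secondary structures – the arrival-lag dip in the most recent weeks.

```python
# A minimal sketch of 'denaturing' a lagged count series. All figures
# here are hypothetical illustrations, not actual death-record data.

# Observed weekly counts in the R00-R99 bucket (hypothetical). The dip
# in the last two entries is a reporting-lag artifact, not a real drop.
observed = [980, 1010, 1005, 1040, 1020, 860, 540]

# Estimated fraction of records that have arrived, by week of maturity:
# the most recent week is ~50% complete, the prior week ~85%, and so on.
completeness = [1.00, 1.00, 1.00, 1.00, 0.98, 0.85, 0.50]

# Denatured estimate: scale each week up by its expected completeness.
denatured = [round(obs / c) for obs, c in zip(observed, completeness)]

print(denatured)  # the artificial dip at the right end disappears
```

Once the lag structure is removed in this fashion, the series can be examined for genuine trend; before that step, any inference drawn from the raw right-hand tail is inference drawn from an artifact.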
II. Information must be transmuted into intelligence.
Transmuting information into intelligence typically involves two steps:
a. Assembly of the denatured elemental data structures produced in Step I above, into a model outlining a chain of mechanism and feedback (a dynamic or ‘neural’ model) which represents the system in question (most all things involve systems), and
b. Filtering out imbalances and diseconomies of scale. That is to say – modeling the true value, magnitude, risk, or cost of each element or node in a system, and not a fiat or nominal estimate of the same.
The holding of a reasonably accurate model, one which can describe contribution points, contributors, and current or future activity, is called holding ‘intelligence’. Counter-intelligence is the process of obfuscating these elements so that the enemy or competitor is not able to produce an accurate model. The failure to develop dynamic or neural systems, along with modeling by means of nominal (not normalized) node costs (values), almost always leads to the conclusion that the current approach or understanding is the correct answer. Sound familiar? For readers of The Ethical Skeptic, it should.
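As a toy illustration of step (a), consider a minimal mechanism-and-feedback model sketched in Python. Every parameter here (the churn and word-of-mouth fractions) is a hypothetical assumption, not a figure from any real engagement; the point is only that a node embedded in a reinforcing feedback loop yields a different conclusion over time than a static snapshot of that same node.

```python
# A minimal sketch of a mechanism-and-feedback (dynamic) model, using
# hypothetical parameters. A static snapshot values the node at its
# current sales of 100; the dynamic model shows the reinforcing loop
# (sales -> word of mouth -> sales) carrying it well above that.

def simulate(periods=8, sales=100.0, churn=0.10, word_of_mouth=0.18):
    """Each period, sales lose a churn fraction and gain a feedback
    fraction proportional to the prior period's sales."""
    history = [sales]
    for _ in range(periods):
        sales = sales - churn * sales + word_of_mouth * sales
        history.append(round(sales, 1))
    return history

print(simulate())  # grows ~8% per period, not the flat 100 a snapshot implies
```

A real systems or value chain model carries many nodes and several interacting loops, of course; but even this two-mechanism toy shows why a nominal snapshot understates (or overstates) a node whose dynamics are reinforcing (or balancing).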
The techniques which serve to break these habits are typically taught in systems and value chain engineering curricula, but not in most of academic science.
Resource Example: West, Page; Strategic Management: Value Creation, Sustainability, and Performance (6th Edition); Riderwood Publishing, 19 Nov 2019; ISBN-13: 978-1733174404
Example: At one fashion designer for whom I conducted strategy, the question was raised as to whether or not a shift in sourcing out of South Korea was in order. South Korea had been where the main competitor was conducting its sourcing – a sourcing strength which proved to be a disadvantage to my client in terms of both factory clout and landed cost. I proposed that we take an opposite approach and develop sourcing out of Jordan, Mexico, and Pakistan instead. The finance team retorted that per-unit first costs out of those nations were prohibitively high, that product was slow to arrive, and that it was costly to ship.
My team responded that the figures cited by accounting constituted ‘nominal loaded figures’ – in other words, they did not reflect economies of scale, maturation, and clout. They were simply output data from an ERP report, and not the result of a dynamic and normalized node structure. I contended that, once those resources were fully loaded and managed, not only would their per-unit costs fall in line, but their speed would swiftly outpace the South Korean factories – enabling a speed-to-margin strategy win which would surpass the client’s major competitor in terms of fashion-based push sell-through.
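The distinction between a nominal ERP figure and a normalized node value can be sketched minimally in Python. The quoted cost, the volumes, and the experience-curve exponent below are all hypothetical assumptions (not the client's actual figures); the sketch shows only the general mechanics of normalizing a first cost for scale and maturation before using it as a node value.

```python
# Hypothetical illustration: normalizing a nominal (ERP-quoted) per-unit
# first cost for economies of scale via a power-law experience curve.
# The exponent b and all volumes/prices here are assumptions.

def normalized_unit_cost(nominal_cost, quoted_volume, mature_volume, b=0.32):
    """Per-unit cost falls by a fixed percentage each time cumulative
    volume doubles; b controls how steeply the curve descends."""
    return nominal_cost * (mature_volume / quoted_volume) ** (-b)

# A sourcing node quoted at $14.00/unit at a trial volume of 10,000 units
# looks 'prohibitively high' nominally, yet lands near $7.20/unit once
# evaluated at a mature, fully-managed volume of 80,000 units.
mature = normalized_unit_cost(14.00, 10_000, 80_000)
print(round(mature, 2))
```

The accountant's report and the strategist's model can thus both be 'correct' on the arithmetic, while only one of them values the node as it would actually perform under the proposed strategy.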
The CEO comprehended what I was saying and elected to implement our strategy. Three years later, I was invited to speak before executives at this major competitor (a prominent national brand) – and without divulging proprietary information, present how this client had beaten them over the last three years through smart strategic changes. Although my team played only a minor role at the very beginning of the process, this business case constituted one of the most monumental ‘David and Goliath’ upsets in US business history.
III. Intelligence is the first sound basis from which to plan or take action.
Intelligence is a process of eliminating the irrelevant, as much as it is a process of identifying the truth. Often, it is not a lack of truth which prevents the making of a sound decision, but rather a lack of clarity – a distraction with the chronically ill-defined, trivial, and menial. Humanity labors as if an unfortunate animal trapped in a tar pit of irrelevancy, believing that its desperate bellows will somehow bring it to the salvation of truth.
The assembly and testing of conjecture scenarios through a systemic model – along with the measures of soundness, logical calculus, deduction, critical path, and inferential strength – I have only found treated to sufficiency (and not even completely at that) on my website. I am sure all of this is covered across a number of separate publications; however, there exist few individuals, if any, who can match my experiential depth in data analysis, business operations, military intelligence, philosophy, science, decades of applied systems and value chain engineering projects, knowledge of the nature of deception, along with hundreds of successful business and national strategies to boot.
This leveraging of intelligence into sound planning is roughly outlined in the chart below. It involves the process of extracting thinking from the mire of irrelevance and onto the ‘shining pathway of success’ – the critical path.
There are many people who write books about this topic, but very few who have actually done it. You will note that these Type II or Semantic Experts employ intimidating-in-appearance heuristics and second-hand accounts as a substitute for actual personal experience and competence. So I consider this, fortunately or unfortunately, to constitute my horizon of expertise.
Resource and Four Application Examples:
The Ethical Skeptic, “The Strategic Mindset”; The Ethical Skeptic, WordPress, 27 Jan 2022; Web, https://theethicalskeptic.com/?p=61452
The Ethical Skeptic, “Inflection Point Theory and the Dynamic of The Cheat”; The Ethical Skeptic, WordPress, 20 Oct 2019; Web, https://wp.me/p17q0e-atd
The Ethical Skeptic, “The Map of Inference”; The Ethical Skeptic, WordPress, 4 Mar 2019; Web, https://wp.me/p17q0e-9r6
Intelligence after all, constitutes far more than the mere holding of facts and data. Those who boast of such, only serve to discredit themselves in the eyes of an ethical skeptic. Just smile and quietly move on when you encounter these types. Refuse to join them as they wallow, blissful and intoxicated inside the warm yellowing pool of fact-fueled ignorance.
The Ethical Skeptic, “Mere Facts & Data Do Not Constitute Knowledge”; The Ethical Skeptic, WordPress, 9 Apr 2022; Web, https://theethicalskeptic.com/?p=64843