The Elements of Hypothesis

One-and-done statistical studies, based upon a single set of statistical observations (or worse, a lack thereof), are not much more credible in strength than a single observation of Bigfoot or a UFO. The reason: they have not served to develop the disciplines of true scientific hypothesis. They fail in their duty to address and inform.

As most scientifically minded persons realize, hypothesis is the critical foundation of the scientific method. It is the entry door which demonstrates the discipline and objectivity of the person seeking to promote their case in science. Wikipedia cites the elements of hypothesis in terms of the five features below, as defined by philosophers Theodore Schick and Lewis Vaughn:1

  • Testability (involving falsifiability)
  • Parsimony (as in the application of “Occam’s razor” (sic), discouraging the postulation of excessive numbers of entities)
  • Scope – the apparent application of the hypothesis to multiple cases of phenomena
  • Fruitfulness – the prospect that a hypothesis may explain further phenomena in the future
  • Conservatism – the degree of “fit” with existing recognized knowledge-systems.

These elements are all equivocally correct; however, none of the five listed above constitutes a logical truth of science or philosophy. They are correct only under certain stipulations. The problem is that this renders these elements at best not useful, and at worst destructive to the actual goals of science. They bear no utility in discerning when a fully structured hypothesis is in play, or some reduced set thereof. For instance, ‘Scope’ is functionally moot at the point of hypothesis, because in the structure of Intelligence, the domain of observation has already been established – it had to have been established, otherwise you could not develop the hypothesis from any form of intelligence to begin with.2 3 To address scope again at the hypothesis stage is to further tamper with the hypothesis without sound basis. Let the domain of observation stand, as it was observed – science does not advance when observations are artificially fitted into scope buckets (see two excellent examples of this form of pseudoscience in action, in Examples A and B below).

Fruitfulness can mean ‘producing that which causes our paradigm to earn me more tenure or money’, or ‘consistent with subjects I favor and disdain’, or finally and worse, ‘is able to explain everything I want explained’. Predictive strength, or even testable mechanism, are much stronger and less equivocal elements of hypothesis. So these two features of hypothesis defined by Schick and Vaughn are useless-to-vacuous in terms of real contribution to scientific study. These two bad philosophies of science (social skepticism) inevitably serve to produce a fallacy called explanitude: a condition wherein the hypothesis is considered stronger the more select historical observations it serves to explain, and the more flexible it proves in predicting or explaining select future observations. Under ethical skepticism, this qualification of an alternative or especially a null hypothesis is a false notion. Also known as pseudo-theory, an idea which explains everything easily likely explains nothing at all. This process begins with a faulty method of science which ‘begins with a question’ (i.e., a rhetorically expressed a priori answer).

        Orphan Question

/philosophy : pseudoscience : sciebam/ : a question, purported to be the beginning of the scientific method, which is asked in the blind, without sufficient intelligence gathering or preparatory research, and which is as a result highly vulnerable to being manipulated or posed by means of agency. The likelihood of a scientifically valid answer being developed from this question process is very low. However, an answer of some kind can almost always be developed – and is often spun by its agency as ‘science’. This form of question, while not always pseudoscience, is part of a modified process of science called sciebam. It should only be asked when there truly is no base of intelligence or body of information regarding a subject – a condition which is rare.


        Sciebam

/philosophy : science : method : sciebam/ : (Latin: I knew) An alternative form of knowledge development, which mandates that science begin with the orphan/non-informed step of ‘ask a question’ or ‘state a hypothesis’. A non-scientific process which bypasses the first steps of the scientific method: observation, intelligence development and formulation of necessity. This form of pseudoscience/non-science presents three vulnerabilities:

First, it presumes that the researcher possesses substantially all the knowledge or framework they need, lacking only to fill in final minor gaps in understanding. This creates an illusion-of-knowledge effect across the extended domain of researchers, as each bit of provisional knowledge is then codified as certain knowledge based upon prior confidence. Science can only progress thereafter through a series of shattering paradigm shifts.

Second, it renders science vulnerable to the possibility that, if the hypothesis, framework or context itself is unacceptable at the very start, then its researcher is necessarily conducting pseudoscience – no matter the results, nor how skillfully and expertly they may apply the methods of science. And since the hypothesis is now deemed pseudoscience, no observation, intelligence development or formulation of necessity is warranted. The subject is now closed/embargoed by means of circular appeal to authority.

Finally, the question asked at the beginning of a process of inquiry can often prejudice the direction and efficacy of that inquiry. A premature or poorly developed question, and especially one asked under the influence of agency (not simply bias) – and in absence of sufficient observation and intelligence – can most often result quickly in a premature or poorly induced answer.

Science (Latin: scī́mus/sciḗmus -‘we know/we will know’)4 – leveraging challenging thinking, deductive falsification, straightforward complexity, and consilience to infer a critical path of novel comprehension – one prosecutes (pursues) truth.

Sciebam (Latin: sciēbā́mus -‘we knew’)5 – exploiting assumption, abduction, panduction, complicated simplicity, and linear/statistical induction to confirm an existing or orphan understanding – one is holder of the truth.

†See The Distinction Between Comprehension and Understanding (The Problem of Abduction)

Real Hypothesis

Ethical skepticism proposes a different way of lensing the above elements. Under this philosophy of hypothesis development, I cannot make any implication of the ilk that ‘I knew’ the potential answer a priori. Such implication biases both the question asked, as well as the processes of inference employed. Rather, hypothesis development under ethical skepticism involves structure which is developed around the facets of Intelligence, Mechanism and Wittgenstein Definition/Domain. A hypothesis is neither a hunch, assumption, suspicion nor idea. Rather it is a form of self-skeptical notion:


/philosophy : skepticism : scientific method/ : a disciplined and structured incremental risk in inquiry, relying upon the co-developed necessity of mechanism and intelligence. A hypothesis necessarily features seven key elements which serve to distinguish it from non-science or pseudoscience.

The Seven Elements of Hypothesis

1.  Construct based upon necessity. A construct is a disciplined ‘spark’ (scintilla) of an idea, on the part of a researcher or type I, II or III sponsor, educated in the field in question and experienced in its field work. Once a certain amount of intelligence has been developed, as well as definition of causal mechanism which can eventually be tested (hopefully) under a given risk exposure or sufficient plausibility, then the construct becomes ‘necessary’ (i.e. passes Ockham’s Razor). See The Necessary Alternative. A hypothesis is not simply a ‘question’, especially one which is asked through agency, or because the scientific method supposedly ‘starts with a question’.

2.  Wittgenstein definition and defined domain. A disciplined, exacting, consistent, conforming definition need be developed as premise for both the domain of observation, as well as the underpinning terminology and concepts. See Wittgenstein Error and The Tests of Neologism.

3.  Parsimony. The resistance to expand explanatory plurality or descriptive-feature complexity beyond what is absolutely necessary, combined with the wisdom to know when to do so. Conjecture along an incremental and critical path of syllogism/risk. Avoidance of unnecessarily orphan questions, even if apparently incremental in the offing. See The Real Ockham’s Razor. Three characteristic traits highlight hypothesis which has been adeptly posed inside parsimony.

a. Is incremental and critical path in its construct – the incremental conjecture should be a reasoned, single stack and critical path new construct. Constructs should follow prior art inside the hypothesis (not necessarily science as a whole), and seek an answer which serves to reduce the entropy of knowledge.

b. Methodically conserves risk in its conjecture – no question may be posed without risk. Risk is the essence of hypothesis. A hypothesis, once incremental in conjecture, should be developed along a critical path which minimizes risk in this conjecture by mechanism and/or intelligence, addressing each point of risk in increasing magnitude or stack magnitude.

c. Posed so as to minimize stakeholder risk – (i.e. precautionary principle) – a hypothesis should not be posed which suggests that a state of unknown regarding risk to impacted stakeholders is acceptable as a central aspect of its ongoing construct critical path. Such risk must be addressed first in the critical path, as part of 3. a. above.

4.  Duty to Reduce, Address and Inform. A critical element and aspect of parsimony regarding a scientific hypothesis: the duty of such a hypothesis to expose and address in its syllogism all known prior art, in terms of both analytical intelligence obtained and direct study mechanisms and knowledge. If information associated with a study hypothesis is unknown, it should simply be mentioned in the study discussion. However, if countermanding information is known, or a key assumption of the hypothesis appears magical, the structure of the hypothesis itself must both inform of its presence and address its impact. See Methodical Deescalation and The Warning Signs of Stacked Provisional Knowledge.

Unless a hypothesis offers up its magical assumption for direct testing, it is not truly a scientific hypothesis. Nor can its conjecture stand as knowledge.


/philosophy : pseudoscience/ : A pseudo-hypothesis explains everything, anything and nothing, all at the same time.

A pseudo-hypothesis fails in its duty to reduce, address or inform. A pseudo-hypothesis states a conclusion and hides its critical path risk (magical assumption) inside its set of prior art and predicate structure. A hypothesis on the other hand reduces its sets of prior art, evidence and conjecture and makes them manifest. It then addresses critical path issues and tests its risk (magical assumption) as part of its very conjecture accountability. A hypothesis reduces, exposes and puts its magical assertion on trial. A pseudo-hypothesis hides its magical assumptions woven into its epistemology and places nothing at risk thereafter. A hypothesis is not a pseudo-hypothesis as long as it is ferreting out its magical assumptions and placing them into the crucible of accountability. Once this process is ceased, the ‘hypothesis’ has been transformed into an Omega Hypothesis. Understanding this difference is key to scientific literacy.

Grant me one hidden miracle and I can explain everything else.

5.  Intelligence. Data is denatured into information, and information is transmuted into intelligence. Inside decision theory and clandestine operational practice, intelligence is the first level of illuminating construct upon which one can make a decision. The data underpinning the intelligence should necessarily be probative and not simply reliable. Intelligence skills combine a healthy skepticism toward human agency with an ability to adeptly handle asymmetry, recognize probative data, assemble patterns, increase the reliability of incremental conjecture and pursue a sequitur, salient and risk-mitigating pathway of syllogism. See The Role of Intelligence Inside Science. If all the intelligence offered is cherry-picked, mocked, or otherwise biased toward or against the hypothesis, it is not really a hypothesis.

6.  Mechanism. Every effect in the universe is subject to cause. Such cause may be mired in complexity or agency; nonetheless, reducing a scientific study into its components and then identifying underlying mechanisms of cause to effect – is the essence of science. A pathway from which cause yields effect, which can be quantified, measured and evaluated (many times by controlled test) – is called mechanism. See Reduction: A Bias for Understanding.

7.  Exposure to Accountability. This is not peer review. During the development phase, a period of time certainly must exist in which a hypothesis is held proprietary so that it can mature – and indeed fake skeptics seek to intervene before a hypothesis can mature, eliminating it via ‘Occam’s Razor’ (sic) so that it cannot be researched. Nonetheless, a hypothesis must be crafted such that its elements 1 – 6 above can be held to the light of accountability, by 1. skepticism (so as to filter out sciebam and fake method) which seeks to improve the strength of the hypothesis (an ‘ally’ process, not peer review), and 2. stakeholders who are impacted or exposed to its risk. A hypothesis which imparts stakeholder risk, yet is held inside proprietary cathedrals of authority, is not science – rather oppression, by court definition.

A hypothesis is developed from a construct – a type of educated guess (‘scintilla’ in the chart below). One popular method of pseudoscience is to bypass the early-to-mid disciplines of hypothesis and skip straight from data analysis to accepted proof. This is no different, ethically, from skipping straight from a blurry photo of Blobsquatch to the conjecture that such cryptic beings are real and inhabit all of North America. It is simply a pattern in some data – in this case, blurry data which happened to fit or support a social narrative.

A hypothesis reduces, exposes and puts its magical assertion on trial.
A pseudo-hypothesis hides its magical assumptions woven into its epistemology and places nothing at risk thereafter.

Another method of accomplishing inference without due regard to science is to skip past falsifying or countermanding information and simply ignore it. This violates the Duty to Address and Inform. A hypothesis, as part of its parsimony, cannot be presented in the blind – bereft of any awareness of prior art and evidence. To undertake such promotional activity is a sales job and not science. Why acknowledge the depletion of plant food nutrients on the part of modern agriculture, when you have a climate change message to push? Simply ignore that issue and press your hypothesis anyway (see Examples A and B below).

However, before we examine that and other examples of such institutional pseudoscience, let’s first look at what makes for sound scientific hypothesis. Inside ethical skepticism, a hypothesis bears seven critical elements which serve to qualify it as science.

These are the seven elements which qualify whether or not an alternative hypothesis becomes real science. They are numbered in the flow diagram below and split by color into the three discipline streams of Indirect Study (Intelligence), Parsimony and Conservatism (Knowledge Continuity) and Direct Study (Mechanism).

A Few Examples

In the process of defining this philosophical basis over the years, I have reviewed several hundred flawed and agency-compliant scientific studies. Among them existed several key examples, wherein the development of hypothesis was weak to non-existent, yet the conclusion of the study was accepted as ‘finished science’ from its publishing onward.

Most institutional pseudoscience spins its wares under a failure to address and/or inform.

If you are going to accuse your neighbor of killing your cat, and their whereabouts were unknown at the time, then your hypothesis does not have to address such an unknown – rather merely acknowledge it (inform). However much your neighbor disliked your cat (intelligence), if your neighbor was in the Cayman Islands that week, your hypothesis must necessarily address such mechanism. You cannot ignore that fact simply because it is inconvenient to your inductive/abductive evidence set.

Almost all of these studies skip the hypothesis discipline by citing a statistical anomaly (or worse, a lack thereof), employing a p-value masquerade as a means to bypass the other disciplines of hypothesis and skip straight to the peer review and acceptance steps of the scientific method. Examples A and B below fail in their duty to address critical mechanism, while Examples B and C fail in their duty to inform the scientific community of all the information it needs in order to tender peer review. Such studies end at the top left-hand side of the graphic above and call the process done, based upon one scant set of statistical observations – in ethical reality not much more credible in strength than a single observation of Bigfoot or a UFO.
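To see in arithmetic just how weak a lone ‘significant’ result is, consider a base-rate sketch (my own toy illustration – the prior-plausibility, power and alpha figures are hypothetical, not drawn from any study below):

```python
# Toy illustration: how much does a single p < 0.05 result actually tell us?
# All input values are hypothetical, chosen only to illustrate the base-rate effect.

def ppv_of_significance(prior: float, power: float, alpha: float) -> float:
    """Probability the hypothesis is true, given one 'significant' result."""
    true_pos = prior * power          # truly real effects that get detected
    false_pos = (1 - prior) * alpha   # null effects that cross the threshold anyway
    return true_pos / (true_pos + false_pos)

# A speculative one-off study: low prior plausibility, middling power
weak = ppv_of_significance(prior=0.10, power=0.50, alpha=0.05)
# A hypothesis matured through intelligence and mechanism: higher prior, better power
strong = ppv_of_significance(prior=0.50, power=0.80, alpha=0.05)

print(f"one-off study:  P(true | p<0.05) = {weak:.2f}")
print(f"matured study:  P(true | p<0.05) = {strong:.2f}")
```

Under the low-prior assumptions, nearly half of all ‘significant’ one-off results are false positives – which is why a p-value alone cannot substitute for the disciplines of hypothesis.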

Example A – Failure in Duty to Address/Inform on Mechanism, Asking an Orphan Question (Sciebam), Fallacy of Explanitude, Linear Induction used when Deduction was Necessary (Methodical Deescalation)

Increasing CO2 threatens human nutrition. Meyers, Zanobetti, et al. (Link)

In this study, and in particular Extended Data Table 1, a statistical contrast was drawn between farms located in elevated-CO2 regions versus ambient-CO2 regions. The contrast resulted in a p-value significance indicating that levels of Iron, Zinc, Protein and Phytate were lower in areas where CO2 concentrations exhibited an elevated profile versus the global ambient average. This study was in essence a statistical anomaly; and while part of science, it should never be taken to stand as either a hypothesis or, even worse, a conclusion – as is implied in the social-skeptic ear-tickling and sensationalist title of the study, ‘Increasing CO2 threatens human nutrition’. The study has not even passed the observation step of science (see The Elements of Hypothesis graphic above). Who allowed this conclusion to stand inside peer review? There are already myriad studies showing that modern (1995+) industrial farming practices serve to dramatically reduce crop nutrient levels.6 Industrial farms tend to be nearer to heavy CO2 output regions. Why was this not raised inside the study? What has been accomplished here is merely to hand off a critical issue of health risk for placement into the ‘climate change’ explanitude bucket, rather than address and potentially resolve it. It raises the question – since the authors neither examined the above alternative, nor raised it inside their Discussion section – of whether they care about either climate change or nutrition dilution, viewing both instead as political-football means to further their careers. It is not that they had to confirm this existing study direction; however, they should at least have acknowledged it in their summary of analytics and study limitations. The authors failed in their duty to address standing knowledge about industrial farming nutrient depletion. This would never have made it past my desk. Grade = C (good find, harmful science).
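The confounding concern raised here can be sketched in a short simulation (entirely hypothetical data of my own construction – the farm counts, nutrient levels and correlation strength are invented, not the study’s dataset): when industrial practice both depletes nutrients and clusters near CO2 sources, a naive comparison by CO2 region produces a ‘significant’ p-value even though CO2 plays no causal role.

```python
import random

random.seed(42)

# Hypothetical farms: industrial practice (the true cause of nutrient
# depletion) is correlated with proximity to elevated-CO2 regions.
farms = []
for _ in range(200):
    industrial = random.random() < 0.5
    # Industrial farms tend to sit near heavy CO2 output regions (90% overlap)
    high_co2 = industrial if random.random() < 0.9 else not industrial
    # Nutrient level depends ONLY on farming practice, never on CO2
    nutrient = random.gauss(80 if industrial else 100, 5)
    farms.append((high_co2, nutrient))

high = [n for c, n in farms if c]
low = [n for c, n in farms if not c]
obs_diff = sum(low) / len(low) - sum(high) / len(high)

# Permutation test on the (confounded) CO2 grouping
values = [n for _, n in farms]
trials, count = 2000, 0
for _ in range(trials):
    random.shuffle(values)
    perm_high, perm_low = values[:len(high)], values[len(high):]
    if sum(perm_low) / len(perm_low) - sum(perm_high) / len(perm_high) >= obs_diff:
        count += 1
p_value = count / trials

print(f"nutrient deficit in high-CO2 farms: {obs_diff:.1f} (p = {p_value:.4f})")
```

The test dutifully reports a convincing deficit for the CO2 grouping – yet by construction the depletion is caused solely by farming practice. A p-value contrasts groups; it does not adjudicate mechanism.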

Example B – Failure in Both Duty to Inform of Intelligence and Duty to Address Mechanism, Fallacy of Explanitude, Orphan Question (Sciebam), Linear Induction Employed

Possible future impacts of elevated levels of atmospheric CO2 on human cognitive performance and on the design and operation of ventilation systems in buildings. Lowe, Heubner, et al. (Link)

This study cites its review of the immature body of research surrounding the relationship between elevated CO2 and cognitive ability. Half of the studies reviewed indicated that human cognitive performance declines with increasing CO2 concentrations. The problem entailed in this study, similar to the Zanobetti study in Example A, is that it does not develop any underlying mechanism which could explain how elevated CO2 directly impacts cognitive performance. This is not a condition of ‘lacking mechanism’ (as sometimes the reality is that one cannot assemble such), but rather one in which the current mechanism paradigm falsifies the idea. The study should be titled ‘Groundbreaking new understanding on the toxicity of carbon dioxide’. This is of earth-shattering import; a great deal of science would need to be modified if this study proved correct at face value. The sad reality is that the study does not leverage prior art in the least. As an experienced diver, I know that oxygen displacement on the order of 4 percentage points is where the first slight effects on cognitive performance come into play. Typical CO2 concentrations in today’s atmosphere are in the range of 400 ppm – not even in the relevant range for an oxygen-displacement argument. However, I would be willing to accept this study in sciebam were they to offer another mechanism of direct effect, such as ‘slight elevations in CO2 and climate temperature serve to toxify the blood’, for example. But no such mechanism exists – in other words, CO2 is only a toxicant as it becomes an asphyxiant.7 This study bears explanitude: it allows an existing paradigm to easily blanket-explain an observation which might otherwise have indicated a mechanism of risk – such as score declines being attributable to increases in encephalitis, not CO2. It violates the first rule of ethical skepticism: If I was wrong, would I even know it?
The authors failed in their duty to inform about the known mechanisms of CO2 interaction inside the body, and failed as well to address the standing knowledge which falsifies their implied mechanism. This study was also a play for political sympathy and club rank. Couching this pseudoscience with the titular word ‘Possible’ is no excuse to pass it off as science. Grade = D (inexpert find, harmful science).

Example C – Orphan Question, Failing in All Seven Elements of Hypothesis – Especially Duty to Inform of Intelligence and Wittgenstein Domain Definition (Wrong ages/Wrong study domain population and timeframe)

A Population-Based Study of Measles, Mumps, and Rubella Vaccination and Autism. Madsen, Hviid, et al. (Link)

This is the notorious ‘Danish Study’ of the relationship between MMR vaccination and observed rates of psychiatrist-confirmed autism diagnoses inside the Danish Psychiatric Central Register. These are confirmed diagnoses of autism spectrum disorders (Autism, ADD/PDD and Asperger’s) over a nine-year tracking period (see Methodology and Table 2). In Denmark, children are referred to specialists in child psychiatry by general practitioners, schools, and psychologists if autism is suspected. Only specialists in child psychiatry diagnose autism and assign a diagnostic code, and all diagnoses are recorded in the Danish Psychiatric Central Register. The fatal flaw in this study resided in the data domain it analyzed and the resulting study design. 77% of autism cases are not typically diagnosed until past 4.5 years of age. Based upon a chi-squared cumulative distribution fit at each individual mean age (μ) below from the CDC, with 1.2 years degrees of freedom and 12 months of Danish bureaucratic lag, one obtains a (.10 + .08 + .05) = 0.23 crude chance of detection by CDC statistical indices (outlined below) – or around a 77% chance of a false negative (Type II error) across the tripartite population being sampled. The preponderance of diagnoses in the ADD/PDD and Asperger’s sets serves to weight the average age of diagnosis well past the average age of the subjects in this nine-year study – which tracked patients from birth to an average age of 4.5 years at study end. See the graphic to the above right, which depicts the Gompertzian age-arrival distribution function embedded inside this study’s population; an arrival distribution which Madsen and Hviid should have defined and accounted for – but failed to. This constituted a fatal exclusion bias.

In other words, the Hviid Study failed in its Element #2 Wittgenstein domain definition, in that it excluded 77% of its observation base through either incompetence, cleverness, or both.

From the CDC data on this topic, the mean ages of diagnosis for ASD spectrum disorders in the United States, where particular focus has tightened this age data in recent years, are:8

   •  Autistic disorder: 3 years, 10 months
   •  ASD/pervasive developmental disorder (PDD): 4 years, 8 months
   •  Asperger disorder: 5 years, 7 months
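The exclusion-bias arithmetic can be illustrated with a toy model (my own sketch, not a reproduction of the chi-squared fit described above – the normal shape, the 1.2-year spread, and the ~3.5-year effective cutoff are assumptions): treating each diagnosis-age distribution as roughly normal, the bulk of eventual diagnoses fall outside the study’s observation window.

```python
import math

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    """P(X <= x) for a normal distribution, via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# CDC mean ages of diagnosis (years), per the list above
mean_age = {
    "Autistic disorder": 3 + 10 / 12,
    "ASD/PDD": 4 + 8 / 12,
    "Asperger disorder": 5 + 7 / 12,
}

SIGMA = 1.2    # assumed spread of diagnosis age (hypothetical)
CUTOFF = 3.5   # avg study-end age (4.5 y) minus ~12 months of registry lag

detect = {k: normal_cdf(CUTOFF, mu, SIGMA) for k, mu in mean_age.items()}
for k, p in detect.items():
    print(f"{k:18s} P(diagnosed within window) = {p:.2f}")

avg_detect = sum(detect.values()) / len(detect)
print(f"crude average detection chance: {avg_detect:.2f}")
print(f"crude false-negative (Type II) exposure: {1 - avg_detect:.0%}")
```

Under these assumptions roughly four of five eventual diagnoses land outside the observation window – broadly consistent in direction with the ~77% Type II exposure estimated above.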

Note: A study released 8 Dec 2018 showed a similar effect through data manipulation-exclusion techniques in the 2004 paper by DeStefano et al.; Age at first measles-mumps-rubella vaccination in children with autism and school-matched control subjects: a population-based study in metropolitan Atlanta. Pediatrics 2004;113:259-266.9

The study occurred neither in a society which has observed a severe uptick in autism, nor during the timeframe most closely associated with autism diagnoses (2005+).10 Of additional note is the fact that school professionals refer non-profound autism diagnosis cases to the specialists in child psychiatry, effectively ensuring by practice alone that all such diagnoses occur after age 5. Exacerbating this is the fact that a bureaucratic infrastructure will be slower still in posting diagnoses to a centralized system of this type. These two factors alone serve to force large absences in the data, which mimic confirmatory negatives. The worse the data collection is, the more the study results favor what we want to see – a formal fallacy called utile absentia. The study even shows the consequent effect inversion (results which suggest that vaccines prevent or cure autism) incumbent with utile absentia. In addition, the overt focus on the highly precise aspects of the study, and away from its risk exposures and other low-confidence aspects and assumptions, constitutes a fallacy called idem existimatis (measure it with a micrometer, mark it with a grease pencil, cut it with an axe). I will measure the depth of the water into which you are cliff diving to the very millimeter – but measure the cliff you are diving off of to the nearest 100 feet. The diver’s survival is now an established fact of science, by the precision of the water-depth measure alone.

In other words, this study did not examine the relevant domain of data necessary to underpin the hypothesis which it purported to support. Forget mechanism and parsimony to prior art – those waved bye-bye to this study a long time ago. Its conclusions were granted immunity and immediate acclaim because they fit an a priori social narrative held by their sponsors. It even opened with a preamble citing that it was a study meant to counter a study its authors very much disliked. Starting out a hypothesis prosecution process purported to be of science by being infuriated about someone else’s study results is not science, not skepticism, not ethical.

Accordingly, this study missed 77% of its relevant domain data. It failed in its duty to inform the scientific community of peers. It is almost as if a closed, less-exposed bureaucracy were chosen precisely for its ability both to present reliable data, and at the same time to screen out the maximum number of positives possible. Were I a criminal, I could not have designed a more sinister means of study myself. This was brilliance in action. Grade = F (diabolical study design, poor science).

All of the above studies failed in their duty to inform. They failed in their responsibility to communicate the elements of hypothesis to the outside scientific community. They were sciebam – someone asked a question, poorly framed and without any background research – and by golly they got an answer. They sure got an answer. They were given a free pass, because they conformed to political will. But they were all bad science.

It is the duty of the ethical skeptic to be aware of what constitutes true hypothesis, and winnow out those pretenders who vie for a claim to status as science.

The Ethical Skeptic, “The Elements of Hypothesis”; The Ethical Skeptic, WordPress, 4 Mar 2019; Web.

Critical Attributes Which Distinguish the Scientific Method

The scientific method bears several critical attributes which distinguish it from both mere experiment, as well as its masquerade, the Lyin’tific Method. It behooves the ethical skeptic to understand the critical features which distinguish science from its pretense; to maintain the skill in spotting social manipulation in the name of science.

The experimental method is a subset of the scientific method. There exists a distinct difference between these two protocols of science. The experimental method is oriented toward an incremental continuation of existing knowledge development, and accordingly begins with the asking of a question, bolstered by some quick research before initiating experimental testing. But not all, nor even the majority of, knowledge development can be prosecuted in this fashion of study. Under the scientific method, one cannot boast about possessing the information necessary to ask a question at the very start. Asking an uninformed question may serve to bias the entire process – or kill the research artificially without the full awareness of the sponsors or stakeholders. Accordingly, in the scientific method, a question is not asked until step 8 – this in an effort to avoid the pitfalls of pseudo-theory. This is purposeful, because the astute researcher often does not know, at the very beginning, the critical path necessary to reach his discovery. Science involves an intelligence development period wherein we ask: 1. what are the critical issues that need to be resolved? 2. what are the irrelevant factors we can ignore for now? and 3. how do I chain these issue resolutions into a critical path of knowledge development? In the absence of this process, there exists a bevy of questions – wherein just selecting one and starting experiments is akin to shooting in the dark. The Oxford Handbook of Philosophy of Science expounds on such a mistake thusly:

The materials physicist Percy Bridgman commented upon the process by which we ‘translate’ abstract theories and concepts into specific experimental contexts and protocols. Calling this work of reduction and translation ‘operationalism’, Bridgman cautioned that experimental data production is often guided by substantial presuppositions about the subject matter which arise as part of this translation – often raising concern about the ways in which initial questions are formulated inside a scientific context. True science is a process which revisits its methodological constructs (modes of research method) as often as it does its epistemological (knowledge) ones. Accordingly, this principle identified by Bridgman is the foundation of the philosophy which clarifies the difference between the scientific method and the experimental method. It is unwise to consider the two as being necessarily congruent.1

The process of developing a scientific question is many times daunting, involving commitment from a sponsor, a long horizon of assimilating observational intelligence, and persistence in seeking to establish a case for necessity – a necessity which serves to introduce plurality of argument (see Ockham’s Razor), and which can be brought before peers: advising peers who support the research and assist in developing the construct being addressed into a workable hypothesis. These peers are excited to witness the results of the research as well.

Science is a process of necessity in developing taxonomic observation, which seeks to establish a critical path of continuously evaluated and incremental in risk conjecture, probing by means of the most effective inference method available, the resolution of a query and its inevitable follow-on inquiry, in such manner that this process can be replicated and shared with mankind.

The Lyin’tific Method in contrast, will target skipping one or more of these critical attributes of science, in an effort to get to a desired answer destination in as expedient a manner as is possible – yet still appear as scientific methodology.

The most anti-science position one can adopt is the insistence that the scientific method consists of one step: 1. Proof.

The Critical Attributes of Science

My identification of the critical attributes of science, and especially the early and neglected steps of the scientific method, is reflected inside a statement on the part of the fictional character Sherlock Holmes, from the 1887 novel A Study in Scarlet by Sir Arthur Conan Doyle.

“It is a capital mistake to theorize before you have all the evidence.
It biases the judgment.”


Although science is indeed an iterative process, nonetheless true scientific inquiry, as opposed to technical or developmental study, involves these critical path steps which reside at the beginning of the scientific method – steps which are critical in avoiding the mistake cited by Sir Arthur Conan Doyle above.


Science thereafter is an iterative method which bears the following necessary features (see related also The Elements of Hypothesis):

1.  Flows along a critical path of dependent, salient and revelatory observation and query
2.  Develops hypothesis through testable mechanism
3.  Is incremental in risk of conjecture (does not stack conjectures)
4.  Examines probative study in preference over reliable data
5.  Seeks reliable falsification over reliable inductive inference
6.  Seeks reliable consilience over reliable abductive inference
7.  Does not prematurely make a claim to consensus in absence of available deduction
8.  Shares results, limitations, open issues, next questions, next steps and replication guidance.

Social skeptics seek to deny the first three steps of science, along with routinely ignoring its necessary features. Social skeptics then further push the experimental method in place of the above attributes of science – asking a biased and highly uninformed question (also known in philosophy as rhetoric), while promoting science as nothing but an exclusive club or lab activity. Finally, they incorporate their corrupted version of ‘peer review’ (at sponsorship, see below), wherein they seek to kill ideas before they can be formulated into a hypothesis and be studied. This is a process of corrupted philosophy of science – corrupt skepticism.

Most unanswered questions reside in a state of quandary precisely because of a failure in or refusal to pursue and apply the above characteristics of science.

Accordingly, the scientific method begins with a process of circumspection and skepticism, which is distinctly different from the inception of the much more tactical experimental method. To scoff at this distinction reveals a state of scientific illiteracy, and of never having done actual scientific research nor discovery.

While both the experimental method and the scientific method are valid process descriptions applicable to science, there does exist an abbreviated version of the scientific method which is errant precisely because it adheres only to the Experimental Method. A method which is exposed and vulnerable to Conan Doyle’s caution about presuming which question to ask, or which theory to prosecute, before enough intelligence and necessity has been assimilated:


Sciebam

/philosophy : appeal to authority : pseudoscience/ : (Latin: I knew) – an alternative form of knowledge development, which mandates that science begins with the orphan/non-informed step of ‘ask a question’ or ‘state a hypothesis’. A non-scientific process which bypasses the first steps of the scientific method: observation, intelligence development and formulation of necessity. This form of pseudoscience/non-science presents three vulnerabilities:

First, it presumes that the researcher possesses substantially all the knowledge or framework they need, lacking only to fill in final minor gaps in understanding. This creates an illusion-of-knowledge effect across the extended domain of researchers, as each bit of provisional knowledge is then codified as certain knowledge based upon prior confidence. Science can only progress thereafter through a series of shattering paradigm shifts.

Second, it renders science vulnerable to the possibility that, if the hypothesis, framework or context itself is unacceptable at the very start, then its researcher is necessarily conducting pseudoscience – no matter the results, nor how skillfully and expertly they may apply the methods of science. And since the hypothesis is now a pseudoscience, no observation, intelligence development or formulation of necessity is therefore warranted. The subject is now closed/embargoed by means of circular appeal to authority.

Finally, the question asked at the beginning of a process of inquiry can often prejudice the direction and efficacy of that inquiry. A premature or poorly developed question – especially one asked under the influence of agency (not simply bias), and in absence of sufficient observation and intelligence – most often results quickly in a premature or poorly induced answer.

And sciebam, as a quasi-scientific method, would almost be fine in itself, if not for the specter of agency which then further exploits even that rogue approach to science into a complete and permissive masquerade, through additional methodology as outlined below. Such is the methodology practiced by the pesticide and vaccine industries today: false science, which can be identified by its faux methodology, all dressed up in profit-laden and corporate lab coats.

Ladies and gentlemen, the Lyin’tific Method.

The Lyin’tific Method: The Ten Commandments of Fake Science

When you have become indignant and up to your rational limit over privileged anti-science believers questioning your virtuous authority and endangering your industry profits (pseudo-necessity), well then it is high time to undertake the following procedure.

1. Select for Intimidation. Appoint an employee who is under financial or career duress to create a company formed solely to conduct this study under an appearance of impartiality, and to then go back and live again comfortably in their career or retirement. Hand them the problem definition, approach, study methodology and scope. Use lots of Bradley Effect-vulnerable interns (as data scientists) and persons trying to gain career exposure and impress. Visibly assail any dissent as being ‘anti-science’; the study lead will quickly grasp the implicit study goal and will execute all this without question. Demonstrably censure or publicly berate a scientist who dissented on a previous study – and allow the entire organization/world to see this. Make him become the hate-symbol for your a priori cause.

2. Ask a Question First. Start by asking a ‘one-and-done’, noncritical-path and poorly framed, half-assed, sciencey-sounding question, representative of a very minor portion of the risk domain in question and bearing the most likely chance of obtaining a desired result – without any prior basis of observation, necessity, intelligence from stakeholders, nor background research. Stress that the scientific method begins with ‘asking a question’. Avoid peer or public input before and after approval of the study design. Never allow stakeholders at risk to help select or frame the core problem definition, nor identify the data pulled. Never allow a party highly involved in making observations inside the domain (such as a parent, product user or farmer) to have input into the question being asked nor the study design itself. These entities do not understand science and have no business making inputs to PhDs.

3. Amass the Right Data. Never seek peer input at the beginning of the scientific process (especially on what data to assemble), only the end. Gather a precipitously large amount of ‘reliable’ data, under a Streetlight Effect, which is highly removed from the data’s origin and stripped of any probative context – such as an administrative bureaucracy database. Screen data from sources which introduce ‘unreliable’ inputs (such as may contain eyewitness, probative, falsifying, disadvantageous anecdotal or stakeholder influenced data) in terms of the core question being asked. Gather more data to dilute a threatening signal, less data to enhance a desired one. Number of records pulled is more important than any particular discriminating attribute entailed in the data. The data volume pulled should be perceptibly massive to laymen and the media. Ensure that the reliable source from which you draw data, bears a risk that threatening observations will accidentally not be collected, through reporting, bureaucracy, process or catalog errors. Treat these absences of data as constituting negative observations.

4. Compartmentalize. Address your data analysts and interns as ‘data scientists’ and your scientists who do not understand data analysis at all, as the ‘study leads’. Ensure that those who do not understand the critical nature of the question being asked (the data scientists) are the only ones who can feed study results to people who exclusively do not grasp how to derive those results in the first place (the study leads). Establish a lexicon of buzzwords which allow those who do not fully understand what is going on (pretty much everyone), to survive in the organization. This is laundering information by means of the dichotomy of compartmented intelligence, and it is critical to everyone being deceived. There should not exist at its end, a single party who understands everything which transpired inside the study. This way your study architecture cannot be betrayed by insiders (especially helpful for step 8).

5. Go Meta-Study Early. Never, ever, ever employ study which is deductive in nature, rather employ study which is only mildly and inductively suggestive (so as to avoid future accusations of fraud or liability) – and of such a nature that it cannot be challenged by any form of direct testing mechanism. Meticulously avoid direct observation, randomized controlled trial, retrospective cohort study, case-control study, cross-sectional study, case reports and series, or especially reports or data from any stakeholders at risk. Go meta-study early, and use its reputation as the highest form of study, to declare consensus; especially if the body of industry study from which you draw is immature and as early in the maturation of that research as is possible.  Imply idempotency in process of assimilation, but let the data scientists interpret other study results as they (we) wish. Allow them freedom in construction of Oversampling adjustment factors. Hide methodology under which your data scientists derived conclusions from tons of combined statistics derived from disparate studies examining different issues, whose authors were not even contacted in order to determine if their study would apply to your statistical database or not.

6. Shift the Playing Field. Conduct a single statistical study which is ostensibly testing all related conjectures and risks in one fell swoop, in a different country or practice domain from that of the stakeholders asking the irritating question to begin with; moreover, with the wrong age group or a less risky subset thereof, cherry sorted for reliability not probative value, or which is inclusion and exclusion biased to obfuscate or enhance an effect. Bias the questions asked so as to convert negatives into unknowns or vice versa if a negative outcome is desired. If the data shows a disliked signal in aggregate, then split it up until that disappears – conversely if it shows a signal in component sets, combine the data into one large Yule-Simpson effect. Ensure there exists more confidence in the accuracy of the percentage significance in measure (p-value), than of the accuracy/precision of the contained measures themselves. Be cautious of inversion effect: if your hazardous technology shows that it cures the very thing it is accused of causing – then you have gone too far in your exclusion bias. Add in some of the positive signal cases you originally excluded until the inversion effect disappears.

7. Trashcan Failures to Confirm. Query the data 50 different ways and shades of grey, selecting for the method which tends to produce results which favor your a priori position. Instruct the ‘data scientists’ to throw out all the other data research avenues you took (they don’t care), especially if it could aid in follow-on study which could refute your results. Despite being able to examine the data 1,000 different ways, only examine it in this one way henceforth. Peer review the hell out of any studies which do not produce a desired result. Explain any opposing ideas or studies as being simply a matter of doctors not being trained to recognize things the way your expert data scientists did. If as a result of too much inherent bias in these methods, the data yields an inversion effect – point out the virtuous component implied by your technology – how it will feed the world or cure all diseases, is fighting a species of supremacy or how the ‘technology not only does not cause the malady in question, but we found in this study that it cures it~!’.

8. Prohibit Replication and Follow Up. Craft a study which is very difficult to or cannot be replicated, does not offer any next steps nor serves to open follow-on questions (all legitimate study generates follow-on questions, yours should not), and most importantly, implies that the science is now therefore ‘settled’. Release the ‘data scientists’ back to their native career domains so that they cannot be easily questioned in the future.  Intimidate organizations from continuing your work in any form, or from using the data you have assembled. Never find anything novel (other than a slight surprise over how unexpectedly good you found your product to be), as this might imply that you did not know the answers all along. Never base consensus upon deduction of alternatives, rather upon how many science communicators you can have back your message publicly. Make your data proprietary. View science details as an activity of relative privation, not any business of the public.

9. Extrapolate and Parrot/Conceal the Analysis. Publish wildly exaggerated & comprehensive claims to falsification of an entire array of ideas and precautionary diligence, extrapolated from your single questionable and inductive statistical method (panduction). Publish the study bearing a title which screams “High risk technology does not cause (a whole spectrum of maladies) whatsoever” – do not capitalize the title as that will appear more journaly and sciencey and edgy and rebellious and reserved and professorial. Then repeat exactly this extraordinarily broad-scope and highly scientific syllogism twice in the study abstract, first in baseless declarative form and finally in shocked revelatory and conclusive form, as if there was some doubt about the outcome of the effort (ahem…). Never mind that simply repeating the title of the study twice, as constituting the entire abstract is piss poor protocol – no one will care. Denialists of such strong statements of science will find it very difficult to gain any voice thereafter. Task science journalists to craft 39 ‘research articles’ derived from your one-and-done study; deem that now 40 studies. Place the 40 ‘studies’, both pdf and charts (but not any data), behind a registration approval and $40-per-study paywall. Do this over and over until you have achieved a number of studies and research articles which might fancifully be round-able up to ‘1,000’ (say 450 or so ~ see reason below). Declare Consensus.

10. Enlist Aid of SSkeptics and Science Communicators. Enlist the services of a public promotion for-hire gang, to push-infiltrate your study into society and media, to virtue signal about your agenda and attack those (especially the careers of wayward scientists) who dissent.  Have members make final declarative claims in one liner form “A thousand studies show that high risk technology does not cause anything!” ~ a claim which they could only make if someone had actually paid the $40,000 necessary in actually accessing the ‘thousand studies’. That way the general public cannot possibly be educated in any sufficient fashion necessary to refute the blanket apothegm. Have them demand final proof as the only standard for dissent. This is important: make sure the gang is disconnected from your organization (no liability imparted from these exaggerated claims nor any inchoate suggested dark activities *wink wink), and moreover, who are motivated by some social virtue cause such that they are stupid enough that you do not actually have to pay them.
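The Yule-Simpson effect invoked in step 6 above is easy to demonstrate with a handful of counts. The sketch below uses the classic kidney-stone treatment figures (here purely as a hypothetical illustration of the trick): the disfavored signal wins inside every subgroup, yet loses once the subgroups are pooled – so the analyst simply chooses whichever level of aggregation produces the desired answer.

```python
# Yule-Simpson effect: a result present in each subgroup reverses
# under pooling. Counts are the classic kidney-stone illustration,
# used here as hypothetical data: (successes, trials) per stratum.
data = {
    "A": {"stratum_1": (81, 87),   "stratum_2": (192, 263)},
    "B": {"stratum_1": (234, 270), "stratum_2": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

for treatment, strata in data.items():
    for name, (s, n) in strata.items():
        print(f"{treatment} {name}: {rate(s, n):.1%}")  # A wins in every stratum
    total_s = sum(s for s, _ in strata.values())
    total_n = sum(n for _, n in strata.values())
    print(f"{treatment} pooled   : {rate(total_s, total_n):.1%}")  # yet B wins pooled
```

Splitting or combining strata until the signal appears or disappears, as step 6 prescribes, is exactly this arithmetic run in reverse.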

To the media, this might look like science. But to a life-long researcher, it is nowhere near valid.  It is pseudo-science at the least; and even worse than in the case of the paranormal peddler – it is a criminal felony and assault against humanity. It is malice and oppression, in legal terms.

The discerning ethical skeptic bears this in mind and uses it to discern the sincere researcher from the attention grabbing poseur.

The Ethical Skeptic, “Critical Attributes Which Distinguish the Scientific Method”; The Ethical Skeptic, WordPress, 31 Mar 2018; Web,

The Role of Critical Path in Logic, Systems and Science

A critical path is the ordered chain of incremental events or logic which produces the most elegant pathway to a reasonably constrained objective. It is the essence of critical thinking. While an aspect of critical thinking, however, its application is more related to actual discovery and accomplishment than to the more common, cynical martial art of denial. Pop critical thinking is more about explaining the reasons why what is right, is right. Critical path inside of science is about applying intelligence to discover the pathway to what is right – fearing being uninformative even more than merely being wrong.

It is the most effective (not efficient) path of progress which cuts through the bullshit in order to get to an objective. If you are a regular reader of The Ethical Skeptic, you will hear me speak often of critical path. The ability to discern it is one of the skills which allows a sincere researcher to spot the agenda faker. It is the same reason why a whitetail buck deer will reliably traverse a path following the ridgeline through a chain of mountains, avoiding the gorges and ravines to either side. Critical logic is the inverse of the condition involving ignoratio elenchi (a misdirection in argument) or ingens vanitatum (a great deal of irrelevance). Modern versions of ‘critical thinking’ taught by today’s social skeptics are usually entropic in their application, producing misinformation, irrelevance and conformity – energy-sucking gorges and ravines of distraction and misdirection. This amateur teaching by activist numpties results in an array of enforced, conforming and simplest answers (see The Art of the Professional Lie). A critical path is in many regards the opposite of this pseudo-skeptical process. Critical thinking may produce the same conclusion as critical path logic, but in the case of ‘critical thinking’ this is simply an accident in outcome. It is the critical path of logic which is the process of science, and not ‘critical thinking’. Just because someone might virtue signal around promoting the right answer inside a scientific question does not serve to legitimize the process, path or method they employed to get there.

You will not find critical path logic defined in most scientific or philosophical handbooks and guides – believe me, I have over 40 such guides in my personal library. In guides to science, you will most likely find the scientific method1† – which is in its essence a process involving a critical path discipline, incomplete though it may be in constituting actual science. In books of philosophy, sadly, rather than finding tenets on logical calculus and the development of knowledge, one finds a list of previous philosophers and their work, also bereft of critical path elemental exposé.

†Please note that Wikipedia has removed its older definition of ‘scientific method’, which began with ‘Define a Question’ as the first step (see the old definition extracted April 1, 2014 in The Scientific Method is Not Simply The Experimental Method), and has replaced it with ethical skepticism’s ‘Conduct Observation’ instead! This is a major breakthrough, and while it is a remote stretch to imply that The Ethical Skeptic contributed to this change, nonetheless it has taken time and activism on the part of real researchers just like us to supersede the false version of the scientific method formerly taught by social skeptics over the last six decades. They have yet to add in the steps of ‘Frame Intelligence’ and ‘Establish Necessity’ (Ockham’s Razor) before asking a question – but this is a step in the right direction. The Ethical Skeptic is very pleased with this. This evolution is part of the contribution to the dismantling of the social skepticism movement currently underway.

An exception to such paucity on the topic may be found (below quote) in physicist and computational scientist Stephen Wolfram’s ‘Completion Theory’, inside his work A New Kind of Science (p. 1037).2 In this critical development of the philosophy of science, Wolfram outlines a set of approaches and principles used in converting conditions of non-confluence in a model structure into ones of confluence or even unifinality (note: not monofinality). These modeling paradigms outline the benefit of reducing probative insight questions into elements of critical pair testing (highly constrained Bayesian model segments), which force two divergent analogues into one single conclusion, artificially. This approach forces rationalization to Bayesian reduction in its ethic of course, but more importantly it outlines that the pairing of critical elements must be done iteratively, and in the right succession, in order to produce a single, non-entropic answer. This is the essence of a critical path of logic: reducing an argument into single steps of validity which can be used to underpin a larger, more comprehensive conjecture.

If one has a multiway system that terminates but is not confluent then it turns out often to be possible to make it confluent by adding a finite set of new rules. Given a string p which gets transformed to either q or r by the original rules, one can always imagine adding a new rule q → r or r → q that makes the paths from p immediately converge. To do this explicitly for all possible p that can occur would however entail having infinitely many new rules. But as noted by Donald Knuth and Peter Bendix in 1970, it turns out often to be sufficient just iteratively to add new rules only for each so-called critical pair q, r that is obtained from strings p that represent minimal overlaps in the left-hand side of the rules one has.
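Wolfram’s completion idea can be made concrete with a toy string-rewriting system. The sketch below is my own minimal illustration in Python (not Wolfram’s code): the rule set is terminating but not confluent – the string ‘aba’ reduces to two distinct normal forms – and adding a single joining rule for that critical pair restores confluence.

```python
def one_step(s, rules):
    """All strings reachable from s by a single rule application."""
    out = set()
    for lhs, rhs in rules:
        start = 0
        while (i := s.find(lhs, start)) != -1:
            out.add(s[:i] + rhs + s[i + len(lhs):])
            start = i + 1
    return out

def normal_forms(s, rules):
    """Exhaustively rewrite s; return the set of irreducible results.
    Terminates here because every rule shortens the string."""
    seen, frontier, nf = set(), {s}, set()
    while frontier:
        nxt = set()
        for t in frontier:
            succ = one_step(t, rules)
            if not succ:
                nf.add(t)          # t cannot be rewritten further
            nxt |= succ - seen
        seen |= frontier
        frontier = nxt
    return nf

rules = [("ab", "a"), ("ba", "b")]
print(normal_forms("aba", rules))  # two normal forms: not confluent
# "aa" and "a" form a critical pair q, r arising from the overlap of
# the two left-hand sides; adding the joining rule q -> r converges:
print(normal_forms("aba", rules + [("aa", "a")]))  # one normal form
```

Here the pair (‘aa’, ‘a’) plays the role of Knuth and Bendix’s critical pair q, r, and the added rule q → r makes the divergent paths from ‘aba’ immediately converge – the iterative, correctly-sequenced joining of critical pairs that the quote describes.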

Similarly, German polymath Gottfried Wilhelm (von) Leibniz’s devices for the evaluation of the validity of given theses (ars iudicandi), and for finding new truths on the basis of given truths (ars inveniendi), outline principles which show that scientific logical reduction is nigh unto the art of mathematical reduction – itself the most pure form of critical path in logic.3 The Stanford Encyclopedia of Philosophy summarizes critical path logic thusly, in its series on the foundational philosophy of logic:4

Leibniz stresses in the “Nouveaux essais” that syllogistic is part of a sort of universal mathematics, an art of infallibility (art d’infaillibilité). This art is not restricted to syllogisms, but concerns all kinds of formal proofs, i.e. all reasoning in which inferences are executed by virtue of their form.

What is Critical Path? Driving a Dark Highway at Night to an Unknown Destination

Critical path logic is simply a deployment of true critical thinking itself. Critical thinking has nothing whatsoever to do with ‘Stanovich goal enabling behaviors and cognitive dispositions’5 – as that is nothing but compliance, and the ability to spot how to comply. Below is the definition of critical thinking; it has nothing to do with what you currently know, nor with pressure you receive from your peers to conform to ‘rationality’:

Critical Thinking

/philosophy : skepticism : science/ : the ability to assemble an incremental, parsimonious and probative series of questions, in the right sequence, which can address or answer a persistent mystery – along with an aversion to wallowing in or sustaining the mystery for personal or club gain. Critical thinking is the ability to understand, along with the skill in ability to deploy for benefit (value, clarity, risk and suffering alleviation), critical path logic and methodology. A process of methodically and objectively evaluating a claim to verity, through seeking new observations/questions which can be creatively and intelligently framed to challenge elements of fiat knowledge which underpin the claim, regardless of how compulsive, reasonable, justified and accepted that knowledge might be promoted or perceived.

There are several species of critical path logic application most commonly employed inside business, law, warfare and science today (I have been extensively and professionally involved in all of these):

  • Legal Prosecution – the critical sequence of evidences and questions which serve to convict or acquit a defendant.
  • Operations Research – the critical sequence of task and work content analytics which serve to attain a specific goal in the most objective/constraint effective method.
  • Science – the critical/sound basis from which to ask each next incremental question under the scientific method, which will serve to most completely and confidently answer a single query into the unknown.
  • Patent Prosecution – the critical set of disclosures, prior art, framings, constraints and claims, which inexorably lead to a non-obvious, teachable and novel invention, use or application.
  • Target Prosecution – in the Navy, when hunting a sub, the sub’s position is unknown at first. However, it can be eventually derived through a deductive process of critical path called Target Motion Analysis. This is a logical and incremental series of questions which, once answered, reveal a set of, or even one single possible instance of, a sub’s location, depth, speed and heading.

Therefore it has become clear over the last 200 years of the development of the philosophy of scientific logic (albeit more of late as opposed to of former) that inference itself is drawn from three things, which act in concert to clarify (reverse the entropy of) knowledge inside a horizon of unknown:

  • Develop a mathematical pathway of logical reduction – which evolves flexibly by novel outcomes rather than deterministically (as does maths),
  • Constrain in order to iteratively and convergently test critical pairs of modus ponens conjecture (the novelty), and
  • Sequence testing in such a fashion as to maximize probative potential and either an intelligence structure or unifinality (reduce entropy of knowledge).

For instance, let us compare two natural questions, an orphan question and a scientific question (bearing features of critical path):

Did a consciousness craft the universe?   <— This is a probative question, but it is not critical path.  It bears no underlying sound premise, is not parsimonious, incremental nor sequitur inside any particular argument bearing a logical calculus.  This is an orphan question. Even if we obtained the answer from some certified divine revelation, we would not know what to do with it next. It is next to useless, as both a question and an answer. Being right or wrong is inconsequential, as it is not informative.

Can a signal indicating observer effect upon one particle of a particle/anti-particle pair carry information about that observation to the complementary particle faster than c (the speed of light)?   <— Bears premise; is parsimonious and testable, incremental and sequitur – and finally, is highly probative; that is, it bears informative potential which can be crafted into intelligence, which further allows us to craft and constrain a further series of related probative questions. This is the essence of critical path. It is a turning on of the headlights of science, while it drives down a dark highway at night. But not only that, it also eventually selects the most effective route to the destination – or even the destination itself.

Social Skeptics will make scientific arguments of denial, which sound like they are principles of science, but in reality bear no more critical path value, than does the orphan question above. Anecdote and eyewitness testimony are to be ignored, complex ideas are wrong, conspiracy theories are wrong, pseudosciences are false topics, question the facts, examine every alternative you can think of.  These are orphan conjectures in science; crafted and disguised so as to not look like religious statements. But they are religious tenets and tools, nonetheless.

If something is false, it should eventually falsify itself through accrued intelligence, and in being found wrong, become highly informative in the process. If we choose instead to pre-certify it as wrong and then choose to block further research through use of apothegms, no informative critical path development (intelligence) can ever be undertaken from that point. Wrong and seeing is a world better state than is correct and blind.

This is the essence of a critical path in logic, science and systems. Most people, including many prominent skeptics and scientists, do not get this. It is a discipline of methodically focusing on what is important – and making the inquiry outcome even more rewarding in the process. Walt Whitman laments in “Thoughts”, inside his work Leaves of Grass, about the inability to cut through bullshit and focus on the path of the salient, incremental and sequitur, as such:6

OF persons arrived at high positions, ceremonies,
wealth, scholarships, and the like;
To me, all that those persons have arrived at, sinks
away from them, except as it results to their
Bodies and Souls,
So that often to me they appear gaunt and naked;
And often, to me, each one mocks the others, and
mocks himself or herself,
And of each one, the core of life, namely happiness,
is full of the rotten excrement of maggots,
And often, to me, those men and women pass unwit-
tingly the true realities of life, and go toward
false realities,
And often, to me, they are alive after what custom has
served them, but nothing more,
And often, to me, they are sad, hasty, unwaked son-
nambules, walking the dusk.

In fact, Whitman’s lament stands as metaphor to the elegance of an effective scientific process – it is satisfying and honing in skill, every bit as much as it is illuminating. As head of a materials lab, as CEO of a markets research and intelligence company, as a Director in Intelligence, as a philosopher and as a systems engineer planning over $600 billion in trade throughout my career (all $5 million to $100 million income companies), I have applied extensively, all three forms of professional critical path variant: science, logic and systems. My firms have been in premium demand for this role for over 3 decades; brought in to solve scientific, business and national infrastructure challenges which daunt classic organizations. In my experience, less than 1% of the population grasps the role of critical path in argument, planning and scientific reduction. Whitman’s unwaked sonnambules. Lawyers and mathematicians often do, and scientists sometimes do grasp critical logic. But diagnosticians, technicians, the dilettante and abduction/induction specialists rarely exhibit the skills honed under experience in handling critical path disciplines (see Diagnostician’s Error and The Three Types of Reason). This is why it is important to sense what type of mind you are dealing with early on inside a discussion.  One seldom can accompany a numpty across a path of critical logic or progression, as they do not bear the background nor skill set to assimilate such things. They only know the talking points by means of which they were trained.

Anecdote – BAD! Complex Alternative – BAD! Pseudosciences – BAD! Eyewitness Testimony – BAD! Conspiracy Theory – BAD!

A critical path involves several defining components, which are similar in nature to – yet much more than – the engineering critical path method:7 (note: in engineering critical path planning, ostensibly one knows all the tasks, work content, slack, drag and interconnections in advance, and is simply arranging them into a duration-minimized framework. In science and in logic, one must apply the intelligence of the early steps in order to improve the clarity, structure and efficacy of the later steps. There is fog on that horizon – one must adopt a pair of fog lights. See below.)
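To make the contrast concrete, the textbook engineering version of the method can be sketched in a few lines: given fully known task durations and dependencies, the critical path is simply the longest dependency chain through the plan. This is a minimal illustration only – the task names and durations below are hypothetical, not drawn from any actual project.

```python
# Minimal sketch of the classic engineering critical path method (CPM):
# every duration and dependency is known in advance, so finding the
# critical path reduces to a longest-path search over the task graph.

def critical_path(tasks):
    """tasks: {name: (duration, [dependencies])} -> (total_length, path)."""
    memo = {}

    def longest(name):
        # Longest chain ending at (and including) this task.
        if name in memo:
            return memo[name]
        duration, deps = tasks[name]
        best_len, best_path = 0, []
        for dep in deps:
            dep_len, dep_path = longest(dep)
            if dep_len > best_len:
                best_len, best_path = dep_len, dep_path
        memo[name] = (best_len + duration, best_path + [name])
        return memo[name]

    return max((longest(name) for name in tasks), key=lambda r: r[0])

# Hypothetical five-task plan (durations in weeks).
tasks = {
    "design":    (3, []),
    "procure":   (5, ["design"]),
    "fabricate": (4, ["design"]),
    "assemble":  (2, ["procure", "fabricate"]),
    "test":      (3, ["assemble"]),
}
length, path = critical_path(tasks)
# length == 13, path == ["design", "procure", "assemble", "test"]
```

Note that nothing in this computation learns from its early steps – which is precisely the distinction the note above draws between engineering scheduling and the critical paths of science and logic.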

Critical Path (of systems, science or logic)

/philosophy : skepticism : science : critical thought/ : a preselected and interdependently ordered chain of incremental tasks, experiments or arguments which produces the most elegant pathway of progress to a reasonably constrained goal or answer. Elegance being defined as resource efficiency, plenary completeness and expediency, employed in ethical balance. Each step in a critical path relies upon a foundation of its previous steps/logic, yet adds in one incremental goal, test or claim which is being examined for validity or is sought for accomplishment (incrementalism).  In science, a critical path constitutes a series of tests or analyses crafted in such a succession hierarchy as to produce a constrained and deductive incremental answer.  In the philosophy of logic, a critical path is the assembly of prior art foundational modus ponens or modus tollens arguments of logical calculus, which lead to sound basis for a greater incremental truth conjecture.  Finally, in systems engineering, a critical path is the chain of interdependency of necessary and sufficient tasks, arranged in their most elegant progression, which leads to accomplishment of an incremental process step or overall planned goal.

The features of a critical path of logic involve the following:

  • a preselected and interdependently ordered chain of incremental tasks, experiments or arguments
  • an elegant pathway of progress, being defined as resource efficiency, plenary completeness and expediency, employed in ethical balance
  • a reliance upon prior art, a foundation of previous steps/logic which imbues soundness
  • is incremental in the nature of each critical step
  • employs a mathematical pathway of logical reduction
  • constrains in order to iteratively and convergently test critical pairs of modus ponens conjecture
  • tests in such a fashion as to maximize probative potential toward either an unfolding intelligence structure or unifinality
  • employs feedback from intelligence selected early testing, to modify downline testing steps
  • is deductive in as much as is possible, versus other forms of inference
  • and does not wander aimlessly in testing, rather employs necessity and intelligence to strike a more elegant path to unifinality.

My Example

Of course this does not equate exactly to a logical critical path example, but it does combine systems planning and science into one discipline of critical logic. To the right is an example drawn from a classified lab I managed in the past – the specifics of which are unimportant, save to say that we were comparing the compatibility of various transition metals as to their lattice substitution tolerance. In an effort to circumvent testing 9 different metals, under 3 different parameters, with 3 settings for each parameter (just to start off) – which would involve 81 peer parallel experiments and take 8 months of costly and valuable reactor time – we decided to shortcut this by focusing instead on comparatives along one index. In theory, if we did not find probative advantage early on, we could end up having to pursue all 81 orthogonal study permutations. But by focusing on one index, running testing based on that one input and one setting in order to home in on our most probable candidate, we were able to develop intelligence around the performance of the various materials – which would accelerate our prosecution of the broader question (Q3301). The 3301 testing series related to the compatibility of niobium diboride in this role, as compared to a variety of other materials. Previous testing had shown niobium to bear significant advantage in both substitution and even some issues of interstitial phase displacement. We used this intelligence to our elegance advantage, if you will – by focusing on the most critical components of the broader question, 3301.

Before we simply shotgun-tested every parameter and combination, we did some quick up-front comparatives with our most promising element, which allowed us to focus on the core issues and cut out 85% of the 81 test permutations necessary to answer this one question alone.  This was because there were at least 35 more questions to answer before we could approach this material as a technology, not just a science.

The essence of this approach was to establish a path of testing which was focused on a guess (constraint), and then employ early results to modify the number of downline tests necessary (intelligence). Wolfram’s path convergence.
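The constraint-then-intelligence pattern above can be sketched in miniature: a cheap screening pass along one index ranks the candidates, and only the most promising one proceeds to the full test matrix, pruning the bulk of the 81 permutations. The candidate names and the scoring function here are illustrative placeholders, not lab data, and the exact savings depend on how many candidates survive screening.

```python
# Hypothetical sketch of pruning an experimental matrix via early screening.
# 9 metals x 3 parameters x 3 settings = 81 brute-force experiments; a cheap
# single-index screen (the "guess"/constraint) selects one candidate, and
# only its combinations are run in full (the "intelligence" feedback step).
from itertools import product

metals = [f"metal_{i}" for i in range(9)]      # placeholder candidates
parameters = ["temp", "pressure", "flux"]      # placeholder parameters
settings = [1, 2, 3]

full_matrix = list(product(metals, parameters, settings))
assert len(full_matrix) == 81  # the brute-force experiment count

def screen(metal):
    """Cheap single-index comparative (placeholder scoring, not real data)."""
    return int(metal.split("_")[1])  # pretend a higher index performs better

# Intelligence step: keep only the top screening candidate for full testing.
best = max(metals, key=screen)
pruned = [(m, p, s) for (m, p, s) in full_matrix if m == best]

savings = 1 - len(pruned) / len(full_matrix)
# 9 screening runs plus 9 full tests replace 81 full experiments.
```

In this toy version the full runs drop from 81 to 9; in practice, as the text notes, screening cost and the risk of an uninformative early result eat into that margin.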

In other words, we pursued a relational dynamic – converging the schedule of events inside our testing to become congruent with the unfolding of the critical logic inside the discovery. This is the process of developing and using intelligence – wherein our method made our guesses get better and better, very fast. Some scientists even bristled over this. We pursued what was probative as a priority, not simply what was methodically reliable.

We were not afraid to be wrong – we feared being not informative, even more.

Now this is but one version of a critical path – those of pure scheduling and of logic may differ in structure, but not really in ethic. They all focus on what is important and critical in attaining the goal, based upon advanced observations and only that information necessary to get to the objective. Nothing more. This is the essence of a critical path.

The faking skeptic will toss out every manner of ignoratio elenchi and ingens vanitatum. They want to look the part, and enjoy being praised as science. They have no tolerance for wrongness, because being informative is not their primary goal. Be wise to this.

The Ethical Skeptic, “The Role of Critical Path in Logic, Systems and Science”; The Ethical Skeptic, WordPress, 25 March 2018; Web.