Necessity and plurality are the critical elements inside Ockham’s Razor, and not this business of the simplicity of any particular explanation. How does agency block the introduction of a necessary alternative, and how does the ethical skeptic go about establishing a case for its necessity? Such critical questions revolve around the distinction between valid probative confirmation and the pseudo-inference of linear affirmation bias.
Ockham’s Razor is the mandate that, inside reductionist science: plurality should not be posited without necessity.1 In straightforward terms this means that, in order to conserve parsimony, one should not propose study of a new explanatory alternative, nor add features to an existing alternative or the null, without a well-established case of necessity to do so. Both of these flavors of addition in conjecture, if you will, constitute conditions of what is called ‘plurality’. Plurality in this context means ‘more than one’ or ‘more than currently exist’. Ockham’s Razor therefore asks the researcher, ‘When is it indeed necessary to add another explanation, or to add features which should be tested in order to improve an existing explanation?’
This principle of course has nothing to do with simplicity and everything to do with epistemological risk. A state of being apparently ‘simple’ can be just as risky in epistemology as can a state of being ‘complicated’. Ockham’s Razor therefore leverages its application upon this fulcrum – this threshold of a Wittgenstein logical object called necessity. Of course, implicit in this apothegm is the principle that, once a solid case for necessity has been established, both the consideration of a new explanatory alternative and the addition of features to an existing alternative become justified inside any pathway of research. The purpose of this blog article is to address that threshold, that fulcrum upon which Ockham’s Razor leverages its wisdom – the Necessity of Plurality. When and how do we achieve such a feat, and what are the symptoms of a condition wherein forces of fake skeptical agency are feverishly at work to prevent such a critical path step of science from ever occurring in the first place? Possessing the skill to discern a condition of plurality is critical to any claim to represent skepticism, the philosophy of science.
Its Abrogation: The Circular Simplicity Sell
God exists. This is the critical thesis of Question 2 inside Saint Thomas Aquinas’ The Summa Theologica.2 That such a critical path assumption is ‘simple’ is the critical thesis of Question 3 inside that same foundational work of Christian apologetica.3 Finally, Aquinas begins that work with the provision that such a simple presumption is ‘necessary’.4 The theological argument outlined in The Summa Theologica flows along the following critical path of logic:
Question 1: God is necessary as an explanatory basis for all that we see
Question 2: God is self evident
Question 3: God as a principle in and of itself, is simple (return to Question 1)
Aquinas is fatally wrong here. The only thing simple about God as a principle is the spelling of the word. The actual critical sequence of sleight-of-hand employed in Aquinas’ argument starts with Question 3, flowing into Question 1 and finally to Question 2. However, he purposely places these out of order in an effort to conceal from his reader the circular nature of the contention, along with the subsequent basis from which he drew necessity. What he is doing is postulating Question 3, and then ‘affirming’ (see below) that assumption through reinterpreted observations in Questions 1 and 2. The actual flow of Aquinas’ argument proceeds thusly – Question 3, then Question 1, then Question 2.
And thus Aquinas’ argument circularly complies with ‘Occam’s Razor’, in that since God is by manifest evidence (this is not the same thing as ‘probative observation’) a simple principle, therefore now it is also a necessary one. Consider an example: you draw attention to and speak of yourself. Everyone knows that drawing attention to or speaking of one’s self is the practice of a narcissist; therefore it is a simple explanatory option, and now the necessary null, that you be regarded as a narcissist. Furthermore, since that accusation is now assumed ‘necessary’, it can be shown to be self evident through this new attention towards your every action, along with the resulting linear affirmation bias (see The Map of Inference and below) – such that you will ultimately be affirmed (infer ‘confirmed’) as a narcissist.5 Under this same logic, there does not exist one thing now that I will encounter in my life which cannot be framed or re-explained inside the conforming context of my desired conclusion (see Qualifying Theory and Pseudo-Theory). This is how the circular simplicity sell works; it involves a principle of pseudo-inference called linear affirmation bias.
Linear Affirmation Bias
/philosophy : inference : induction : pseudo-inference/ : a primarily inductive methodology of deriving inference in which the researcher starts in advance with a premature question or assumed answer they are looking for. Thereafter, observations are made. Affirmation is a process which involves only positive confirmations of an a priori assumption or goal. Accordingly, under this method of deriving inference, observations are classified into three buckets:
1. Affirming
2. In need of reinterpretation
3. Dismissed because they are not ‘simple’ (conforming to the affirmation underway).
Under this method the model is complicated by reinterpretations, failing the test that a model should be elegant, not exclusively simple. By means of this method, necessity under Ockham’s Razor is assumed in advance and all observations thereafter are merely reconfigured to fit the assumed model. At the end of this process, the idea which was posed in the form of a question, or sought at the very start, is affirmed as valid. Most often this idea is thereafter held as an Omega Hypothesis (more important to protect than the integrity of science itself).
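As a minimal toy sketch (in Python, using a hypothetical Observation object and method names which are not part of the original material), the three-bucket sorting above amounts to a procedure whose a priori answer can never lose:

```python
# Toy sketch of linear affirmation bias: the assumed answer is fixed in
# advance, and every observation is sorted into a bucket which cannot
# refute it. The Observation methods here are hypothetical illustrations.
def affirm(observations, assumed_answer):
    affirming, reinterpreted, dismissed = [], [], []
    for obs in observations:
        if obs.supports(assumed_answer):
            affirming.append(obs)        # 1. Affirming
        elif obs.can_be_reframed_to(assumed_answer):
            reinterpreted.append(obs)    # 2. In need of reinterpretation
        else:
            dismissed.append(obs)        # 3. Dismissed as not 'simple'
    # No branch exists which can falsify the premise; the a priori
    # answer is returned regardless of what was actually observed.
    return assumed_answer
```

Note that no pathway exists by which the assumed answer is ever placed at risk – the defining contrast with the predictive model discussed below.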
Now, set aside of course the premature contentions made inside Questions 2 and 3, that ‘God is self evident’ and ‘God is simple’, and focus instead on the first trick of this pseudo-philosophy. The sleight-of-hand proffered in this form of casuistry is embodied in a violation of Ockham’s Razor called an invalid ‘necessity of plurality’. Aquinas begins his work in Question 1 by assuming necessity. A nice trick, and one I would love to pull off inside a laboratory setting. Things would always go my way; conclusions would always turn out as planned – as long as I could dictate the necessity of the questions I want asked.
As a note, the ignostic atheist does not object to the idea of God as does the nihilist Atheist. They simply object to logically invalid or manipulative pathways of arriving at such knowledge as a predefined destination. If one is to discover God, the above pathway is not the way of going about such a task. Logic’ing your way into a proof or disproof of God is foolishness; the self-aggrandizing fantasy of the theist and Atheist.
Moreover, as long as you afford me exemption from having to establish a case for necessity, I can invalidly swing the direction of research towards any particular conclusion I desire. Both the religious and the nihilist use this technique in order to squelch thinking they find offensive. Both cults have learned how to employ this technique of linear induction, along with its interpretive confirmations (see below), to control the direction of science. Hence the existence of the obtuse form of parsimony, ‘Occam’s Razor’.
Allow me the privilege of in advance establishing the necessity of the questions asked, and I can prove absolutely anything that I want.
This is much akin to the philosophical apothegm ‘Grant me one miracle and I can explain all the rest’. It behooves the ethical skeptic to understand the role of necessity in the development of an alternative – and in particular, understand when an agency has bypassed the burden of necessity (as has been done by Aquinas above), or even more importantly when a case for necessity has indeed been established – and is being artificially blocked by fake skeptics.
Contrary to what fake skeptics contend, it is not proof which they in reality demand (in science there is really no such thing as proof); rather it is plurality of explanation which they fear most.
As it pertains to fake skepticism, it is this case for necessity of plurality which must be blocked at all costs. They cannot afford to have more than one explanation (their simple one) be researched by science.
A Case for Necessity
It is a set of observations, and nothing else, which ultimately serves to establish a case for necessity. This is why the scientific method must begin with observation and not by means of the ‘asking of a question’. When one asks a question, one assumes necessity a priori – much akin to the trick played by Aquinas above. For example, ‘Are you beating your dog?’ is not a scientific question, because it exposes itself to the ad hoc nature of arrivals of affirming evidence (only positive feedback), and does not rely upon the epistemological principle of a model’s predictive, consilient or probative power. This contrast between affirmation and predictive model development – in essence the contrast between Occam’s Razor and Ockham’s Razor – is exhibited below.
Predictions are derived in advance from inductive modeling, and are placed at risk. Affirmations are ad hoc in nature, and are thereafter reinterpreted to fit according to an anchoring bias. Necessity (N) therefore, under real skepticism, is something one establishes – and can never be something that a researcher assumes in advance.
So, given the abject reality of the linear affirmation bias which inhabits The Affirmative Model ‘Occam’s Razor’ above, when indeed does a body of observation lead to a case for necessity? There are five conditions wherein a case for scientific plurality can be established through observational necessity. It is important to remember however, that in none of these cases is any particular answer proved (in other words, plurality exists or persists). Nor is any claim being made that science has been completed – as a case for necessity constitutes the beginning of the scientific method and not its completion.
The Five Cases for Necessity
I. When a consilience of observation has implied/predicted that the null may indeed be false.
Example: The idea that our current 53-event vaccine schedule is ‘safe’ has been brought into question through a myriad of observations by parents and administering medical professionals. This does not mean that vaccines do not work – but rather, straightforwardly, that the unchallenged null hypothesis – that vaccines do not also bear unintended and long-term health consequences – has been brought into legitimate scientific question.
II. When one or more null-falsifying/probative observations have been recorded.
Example: Observing the presence of something assumed to be absent. Finding one incident of permanent cerebral injury (encephalitis) immediately after receipt of a vaccine. Such an observation serves to introduce necessity of plurality.
III. When risk incumbent with the null is greater than any particular alternative’s risk in epistemology.
Example: Deploying a pesticide under a presumption of safety, backed up by a couple of confirmatory studies conducted by the company which stands to profit from the deployment of that pesticide. Even where study suggesting a lack of safety remains immature, such plurality must be assumed under a principle of precaution.
IV. When the null hypothesis, in order to avoid being falsified, has become more complicated than any of its alternatives.
Example: Monarch butterflies are being impacted indirectly by man-made global warming and not by the direct impact of glyphosate on their annual migratory breeding grounds.
V. When the credibility or unbiased nature of the science which served to establish the null, has come into question.
Example: When Monsanto internal communications shifted from a theme of promoting scientific studies, to an oppressive policy of ‘no challenge left unanswered’ – countering and squelching opposing viewpoints in the media at all costs.
This therefore serves to introduce the condition wherein agency attempts to manipulate necessity so as to control the reins of science.
The Four Abuses of Necessity on the Part of Fake Skeptics
Blocking Necessity – the principal tactic of fake skepticism. By asking for ‘proof’, the fake skeptic is in essence attempting to block access to the subject by science. This is a Catch-22 Proof Gaming or non rectum agitur fallacy.
Equating Simplicity with a Lack of Necessity – another key tactic of fake skeptics. An appeal to ignorance is an example of this tactic. Because we see no evidence for something, the simplest explanation is that it does not exist.
Equating Simplicity with Necessity – a tactic of fake skepticism. The existential Occam’s Razor fallacy appeal to authority is an example of this. Because an explanation is simple, therefore it is the truth. No further science is warranted.
Assuming Necessity – a final tactic of fake skepticism. The Affirmative Model (above) is an example of this. Because an idea tickles me emotionally, therefore it is granted immediate necessity – not having to justify itself nor satisfy the rigor of probativeness, consilience or predictiveness.
All of these conditions are conditions of corruption of the scientific method, for which the ethical skeptic must be ever vigilant.
The Ethical Skeptic, “Necessity – A Case for Plurality”; The Ethical Skeptic, WordPress, 19 May 2019; Web, https://wp.me/p17q0e-9KR
Einstein was desperate for a career break. He had a 50/50 shot – and he took it. The necessary alternative he selected, fixed c, was one which was both purposely neglected by science, and yet offered the only viable alternative to standing and celebrated club dogma. Dogma which had, for the most part, gone unchallenged. Developing mechanism for such an alternative is the antithesis of religious activity. Maturing the necessary alternative into hypothesis is the heart and soul of science.
Mr. Einstein You’ll Never Amount to Anything You Lazy Dog
Albert Einstein introduced, in a 1905 scientific paper, the relationship proposed inside the equation E = mc²: the concept that the system energy of a body (E) is equal to the mass (m) of that body times the speed of light squared (c²). That same year he also introduced a scientific paper outlining his theory of special relativity. Most of the development work (observation, intelligence, necessity, hypothesis formulation) entailed in these papers was conducted during his employment as a technical expert – class III (aka clerk) at the Federal Office for Intellectual Property in Bern, Switzerland; colloquially known as the Swiss patent office.1 There, bouncing his ideas off a cubicle-mate (si vis) and former classmate, Michele Angelo Besso, an Italian engineer, Einstein found the time to further explore ideas that had taken hold during his studies at the Swiss Federal Polytechnic School. He had been a fan of his instructor, physicist Heinrich Friedrich Weber – the more notable of his two top-engaged professors at the Swiss Federal Polytechnic. Weber had stated two things which made an impression on the budding physicist.2
“Unthinking respect for authority is the enemy of truth.” ~ physicist Heinrich Friedrich Weber
As well, “You are a smart boy, Einstein, a very smart boy. But you have one great fault; you do not let yourself be told anything.” quipped Weber as he scolded Einstein. His mathematics professor, Hermann Minkowski, scoffed to his peers about Einstein, relating that he found Einstein to be a “lazy dog.” In a similar vein, his instructor, physicist Jean Pernet, admonished the C-average (82% or 4.91 of 6.00 GPA) student “[I would advise that you change major to] medicine, law or philosophy rather than physics. You can do what you like. I only wish to warn you in your own interest.” Pernet’s assessment was an implication to Einstein that he did not face a bright future, should he continue his career in pursuit of physics. His resulting mild ostracism from science was of such an extent that Einstein’s father later had to petition, in an April 1901 letter, for a university to hire Einstein as an instructor’s assistant. His father wrote “…his idea that he has gone off tracks with his career & is now out of touch gets more and more entrenched each day.” Unfortunately for the younger Einstein, his father’s appeal fell upon deaf ears. Or perhaps fortuitously, as Einstein finally found employment at the Swiss patent office in 1902.3
However, it was precisely this penchant for bucking standing authority which served to produce fruit in Einstein’s eventual physics career. In particular, Einstein’s youthful foible of examining anew the traditions of physical mechanics, combined with perhaps a dose of edginess from being rejected by the institutions of physics, were brought to bear effectively in his re-assessment of absolute-time, absolute-space Newtonian mechanics.
Einstein was not ‘doubting’ per se, which is not enough in itself. Rather he executed the discipline of going back and looking – proposing an alternative to a ruling dogma based upon hard-nosed critical path induction work, and not through an agency desire to lazily pan an entire realm of developing ideas through abduction (panduction). No, social skeptics, Einstein did not practice your form of authority-enforcing ‘doubt’. Rather, it was the opposite.
Einstein was not lazy after all, and this was a miscall on the part of his rote-habituated instructors (one still common today). Einstein was a value economist. He applied resources into those channels for which they would provide the greatest beneficial effect. He chose to not waste his time upon repetition, memorization, rote procedure and exercises in compliance. He was the ethical C student – the person I hire before hiring any form of cheating/memorizing/imitating A or B student. And in keeping with such an ethic, Einstein proposed in 1905, three years into his fateful exile at the Swiss patent office, several unprecedented ideas which were subsequently experimentally verified in the ensuing years. Those included the physical basis of 3-dimensional contraction, speed and gravitational time dilation, relativistic mass, mass–energy equivalence, a universal speed limit (for matter and energy but not information or intelligence) and relativity of simultaneity.4 There has never been a time wherein I reflect upon this amazing accomplishment and lack profound wonder over its irony and requital in Einstein’s career.
The Necessary Alternative
If the antithesis of your alternative can, in one observation, serve to falsify your preferred alternative or the null, then that antithesis is the necessary alternative.
But was the particular irony inside this overthrow of Newtonian mechanics all that unexpected or unreasonable? I contend that it was not only needed, but that the cascade of implications leveraged by c-invariant physics was the only pathway left for physics at that time. It was the inevitable, and necessary, alternative. The leading physicists, as a very symptom of their institutionalization, had descended into a singular dogma. That dogma held as its centerpoint the idea that space-time was the fixed reference for all reality. Every physical event which occurred inside our realm hinged around this principle. Einstein, in addressing anew such authority based thinking, was faced with a finite and small set of alternative ideas which were intrinsically available for consideration. That is to say – the set of ideas only included around four primary elements, which could, alternately or in combination, be assumed as fixed, dependent, or independently variable. Let’s examine the permutation potential of these four ideas: fixed space, fixed time, fixed gravity and/or fixed speed of light. Four elements. The combinations available for such a set are 14, as related by the summation of three combination functions:
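$$\binom{4}{1} + \binom{4}{2} + \binom{4}{3} \;=\; 4 + 6 + 4 \;=\; 14$$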
What reasoned conjecture offered, given that combinations of four or three elements were highly unlikely or unstable, was to serve in bounding the set of viable alternative considerations to even a lesser set than 14 – maybe six very logical alternatives at most (the second C(4,2) function above). However, even more reductive, essentially Einstein would only need start by selecting from one of the four base choices, as represented by the first combination function above, C(4,1). Thereafter, if he chose correctly, he could proceed onward to address the other three factors depending upon where the critical path led. But the first choice was critical to this process. One of the following four had to be chosen (these candidate sets are enumerated in the sketch after the list below), and two were already in deontological doubt in Einstein’s mind.
• Fixed 3 dimensional space (x, y, z)
• Fixed time (t)
• Fixed gravitation relative to mass (g-mass)
• Fixed speed of light (c)
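As a brief illustrative sketch (Python, added here for clarity and not part of the original article; the variable names are mine), one can enumerate these candidate fixed-reference sets directly:

```python
from itertools import combinations

# The four candidate fixed elements listed above.
elements = ["space (x,y,z)", "time (t)", "gravitation (g-mass)", "speed of light (c)"]

# Summing the C(4,1), C(4,2) and C(4,3) combination functions yields the
# 14 candidate sets of fixed references.
candidates = [c for r in (1, 2, 3) for c in combinations(elements, r)]
print(len(candidates))  # 14

# The C(4,2) function alone - the six two-element pairings, one of which
# (fixed space and time) was the standing dogma of the day.
pairs = list(combinations(elements, 2))
print(len(pairs))  # 6
```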
Ultimately then, only two choices existed if one is to suppose a maximum of two fixed elements as possible, per below. Indeed this ended up being the plausible set for Einstein. The necessary alternatives, one of which had been essentially embargoed by the science authorities at the time, were a combination of two of the above four combining elements. Another combination of two was already in force (fixed space and time).
In other words, now we have reduced the suspect set to two murder suspects – Colonel Mustard and Professor Plum – and standing dogma was dictating that only Colonel Mustard could possibly be considered as the murderer. To Einstein this was, at worst, an even bet.
This is the reason why we have ethical skepticism. This condition, an oft repeated condition wherein false skepticism is applied to underpin authority based denial in an a priori context, in order to enforce one mandatory conclusion at the expense of another or all others, is a situation ripe for deposing. Einstein grasped this. The idea that space and time were fixed references was an enforced dogma on the part of those wishing to strengthen their careers in a social club called physics. Everyone was imitating everyone else, and trying to improve club ranking through such rote activity. The first two-element selection, stemming of course from strong inductive work by Newton and others, was a mechanism of control called an Einfach Mechanism (see The Tower of Wrong), or
Omega Hypothesis HΩ – the answer which has become more important to protect than science itself.
• Fixed 3 dimensional space (x, y, z)
• Fixed time (t)
Essentially, Einstein’s most logical alternative was to assume the speed of light as fixed first. By first choosing a fixed speed of light as his reference, Einstein had journeyed down both a necessary, as well as inevitable, hypothesis reduction pathway. It was the other murder suspect in the room, and as well stood as the rebellious Embargo Hypothesis option.
Embargo Hypothesis Hξ – the option which must be forbidden at all costs and before science even begins.
• Fixed gravitation relative to mass (g-mass)
• Fixed speed of light (c)
But this Embargo Hypothesis was also the necessary alternative, and Einstein knew this. One can argue both sides of the contention that the ’embargo’ of these two ideas was one of agency versus mere bias. In this context and for purposes of this example, both agency and bias are to be considered the same embargo principle. In many/most arguments however, they are not the same thing.
The Necessary Alternative
/philosophy : Ockham’s Razor : Necessity/ : an alternative which has become necessary for study under Ockham’s Razor because it is one of a finite, constrained and very small set of alternative ideas intrinsically available to provide explanatory causality or criticality inside a domain of sufficient unknown. This alternative does not necessarily require inductive development nor proof, and can still serve as a placeholder construct, even under a condition of pseudo-theory. In order to mandate its introduction, all that is necessary is a reduction pathway in which mechanism can be developed as a core facet of a viable and testable hypothesis based upon its tenets.
The assertion ‘there is a God’ does not stand as the necessary alternative to the assertion ‘there is no God’. Even though the argument domain constraints are similar, these constructs cannot be developed into mechanism and testable hypothesis. So, neither of those statements stands as the necessary alternative. I am sorry, but neither of those statements are ones of science. They are Wittgenstein bedeutungslos – meaningless: a proposition or question which resides upon a lack of definition, or a presumed definition which contains no meaning other than in and of itself.
However, in exemplary contrast, the questions of whether life originated on Earth (abiogenesis) or off Earth (panspermia) do stand as a set of necessary alternatives. Even though both ideas are in their infancy, they can both ultimately be developed into mechanism and a testing critical path. The third letter of the DNA codon (see Exhibit II below) is one such test of the necessary alternatives, abiogenesis and panspermia. There is actually a third alternative as well, another Embargo Hypothesis (in addition to panspermia) in this case example – that of Intervention theory. But we shall leave that (in actuality necessary as well) alternative discussion for another day, as it comes with too much baggage to be of utility inside this particular discourse.
Einstein chose well from the set of two necessary alternatives, as history proved out. But the impetus which drove the paradigm change from the standing dogma to Einstein’s favored Embargo Hypothesis might not have been as astounding a happenstance as it might appear at first blush. Einstein chose red when everyone and their teaching assistant were of the awesome insistence that one need choose blue. All the ramifications of a fixed speed of light (and fixed gravitation, relative only to mass) unfolded thereafter.
Einstein was desperate for a break. He had a 50/50 shot – and he took it.
Example of Necessity: Panspermia versus Abiogenesis
An example of this condition – wherein a highly constrained set of alternatives (two in this case) inside a sufficient domain of unknown forces the condition of dual necessity – can be exemplified inside the controversy around the third letter (base) of the DNA codon. A DNA codon is the word inside the sentence of DNA. A codon is a series of 3 nucleotides (XXX of A, C, T or G) which have a ‘definition’ corresponding to a specific protein-function to be transcripted from the nucleus and decoded by the cell in its process of assembling body tissues. It is an intersection on the map of the organism. Essentially, the null hypothesis stands that the 3rd letter (nucleotide) digit of the codon, despite its complex and apparently systematic methodical assignment codex, is the result of natural stochastic-derivation chemical happenstance during the first 300 million years of Earth’s existence (not a long time). The idea is that life existed on a 2 letter DNA codon (XX) basis for eons, before a 3 letter (XXX) basis evolved (shown in Exhibit II below). The inductive evidence that such an assignment codex based upon 3 letters derived from 2 is beyond plausibility – given the improbability of its occurrence and the lack of time and influencing mechanism during which that improbability could have happened – supports its also-necessary alternative.
In this circumstance, the idea that the DNA codon third-digit-based codex was not a case of 300 million year fantastical and highly improbable happenstance, but rather existed inside the very first forms of life which were to evolve (arrive) on Earth, is called panspermia. The necessary alternative panspermia does not involve or hinge upon the presence of aliens planting DNA on Earth; rather, the 3 letter codon basis was ‘unprecedented by context, complex and exhibiting two additional symmetries (radial and block) on top of that’ at the beginning of life here on Earth, and therefore had to be derived from a source external to the Earth. Note, this is not the same as ‘irreducible complexity’, a weak syllogism employed to counter-argue evolution (not abiogenesis) – rather it is a case of unprecedentable complexity: a much stronger and more deductive argument. It is the necessary alternative to abiogenesis. It is science. Both alternatives are science.
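For concreteness, a minimal sketch (Python, added here for illustration and not part of the original article; the constants reflect the standard codon table, the variable names are mine) enumerating the codex domain under discussion:

```python
from itertools import product

BASES = "ACGT"
STOP_CODONS = {"TAA", "TAG", "TGA"}  # the three protein-silent stop codes
START_CODON = "ATG"                  # the methionine start code

# The full three-letter (XXX) codex: 4^3 = 64 codon slots.
codons = ["".join(c) for c in product(BASES, repeat=3)]
coding = [c for c in codons if c not in STOP_CODONS]

print(len(codons))            # 64 slots in total
print(len(coding))            # 61 slots coded for amino acids
print(START_CODON in coding)  # True - the start codon also codes methionine

# By contrast, a two-letter (XX) codex offers only 4^2 = 16 words.
print(len(["".join(c) for c in product(BASES, repeat=2)]))  # 16
```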
The key in terms of Ockham’s Razor plurality is this: In order to provide hypothesis which aligns abiogenesis as a sufficient explanatory basis for what we see in the fossil record – we must dress it up so that it performs in artificial manipulation,
exactly as panspermia would perform with no manipulation at all.
This renders panspermia a legitimate and necessary hypothesis.
This circumstance elicits the contrathetic impasse, a deliberation wherein a conundrum exists solely because authority is seeking to enforce a single answer at the expense of all others, or forbid one answer at the expense of science itself. The enforced answer is the Omega Hypothesis and the forbidden alternative is the Embargo Hypothesis. And while of course abiogenesis must stand as the null hypothesis (it can be falsified but never really proven) – that does not serve to make it therefore true. Fake skeptics rarely grasp this.
Therefore, the necessary alternative – that the DNA (XXX) codex did not originate on Earth – is supported by the below petition for plurality, comprising five elements of objective inference (A – E below). This systematic codex is one which cannot possibly be influenced (as is evolution) by chemical, charge, handedness, use or employment, epigenetic or culling factors (we cannot cull XX codex organisms to make the survival group more compatible for speciating into XXX codex organisms). Nor do we possess influences which can serve to evolve the protein-based start and silence-based stop codons. It can only happen by accident or deliberation. This is an Einstein moment.
Omega Hypothesis HΩ – the third letter of the DNA codex evolved as a semi-useless appendage, in a single occurrence, from a 2 letter codex basis – featuring radial symmetry, block assignment symmetry and molecule complexity to 2nd base synchrony – only upon Earth, in a 1.6 x 10^-15 (1 of 6 chance across a series of 31 pairings, across the potential permutations of (n – 1) proteins which could be assigned) chance, during the first 300 million years of Earth’s existence. Further, the codex is exceptionally optimized for maximizing effect inside proteomic evolution. Then evolution of the codex stopped for an unknown reason and has never happened again for 3.8 billion years.
Stacking of Entities = 10 stacked critical path elements. Risk = Very High.
Embargo Hypothesis Hξ – the three letter codex basis of the DNA codon pre-existed the origination of life on Earth, arrived here preserved by a mechanism via a natural celestial means, and did not/has not evolved for the most part, save for slight third base degeneracy. Further, the codex is exceptionally optimized for maximizing effect inside proteomic evolution.
Note: By the terms ‘deliberacy’ and ‘prejudice’ used within this article, I mean the ergodicity which is incumbent with the structure of the codex itself – both how it originated and what its result was in terms of the compatibility with amino acids converting into life. There is no question of ergodicity here. The idea of ‘contrived’ on the other hand, involves a principle called agency. I am not implying agency here in this petition. A system can feature ergodicity, but not necessarily as a result of agency. To contend agency is the essence of intervention hypotheses. To add agency would constitute stacking of entities (a lot of them too – rendering that hypothesis weaker than even abiogenesis). This according to Ockham’s Razor (the real one).
The contention that panspermia merely shifts the challenges addressed by abiogenesis ‘off-planet’ is valid; however those challenges are not salient to the critical path and incremental question at hand. It is a red herring at this point. With an ethical skeptic now understanding that abiogenesis involves a relatively high-stacked alternative versus panspermia, let’s examine the objective basis for such inference in addition to this subjective ‘stacking of entities’ skepticism surrounding the comparison.
The Case for Off-Earth Codex Condensation
Did our DNA codex originate its structure and progressions first-and-only upon Earth, or was it inherited from another external mechanism? A first problem exists of course in maintaining the code once extant. However, upon observation, a more pressing problem exists in establishing just how the code came into being in the first place. Evolved or pre-existed? ‘Pre-existed by what method of origination then?’ one who enforces an Omega Hypothesis may disdainfully pontificate. I do not have to possess an answer to that question in order to legitimize the status of this necessary alternative. To pretend an answer to that question would constitute entity stacking. To block this necessary alternative (a pre-existing codex) however, based upon the rationale that it serves to imply something your club has embargoed or which you do not like personally – even if you have mild inductive support for abiogenesis – is a religion. Given that life most probably existed in the universe and in our galaxy already, well before us, it would appear to me that panspermia, the Embargo Hypothesis, is the simplest explanation, and not abiogenesis. However, five sets of more objective inference serve to make this alternative a very strong one, arguably deductive in nature, versus abiogenesis’ relatively paltry battery of evidence.
As you read Elements A – E below, ask yourself the critical path question:
~ ~ ~
‘If the precise, improbable and sophisticated Elements A – E below were required
as the functional basis of evolution before evolution could even happen, then how did they themselves evolve?’
A. The earliest-use amino acids and critical functions hold the fewest coding slots, and are exclusively dependent upon only the three letter codon form. Conjecture is made that life first developed upon a 2 letter codon basis and then added a third over time. The problem with this is that our first forms of life use essentially the full array of 3 letter dependent codex, to wit: Aspartate 2 (XXX), Lysine 2 (XXX), Asparagine 2 (XXX), Stop 3 (XXX), Methionine-Start 1 (XXX), Glutamine 2 (XXX). Glutamic acid and aspartic acid, which synthesize in the absolute earliest forms of thermophiles in particular, would have had to fight for the same 2 digit code, GA – which would have precluded the emergence of even the earliest thermal vent forms of life under a 2 letter dependent codex (XX). These amino acids or codes were mandatory for the first life under any digit size context – and should accordingly hold the most two digit slots – yet they do not. As well, in the case where multiple codons are assigned to a single amino acid, the multiple codons are usually related. Even the most remote members of archaea, thermophilic archaea, use not only a full 3 letter codon dependent codex, but as well use proteins which reach well into both the adenine position 2 variant (XAX) and thymidine position 2 variant (XTX) groupings; ostensibly the most late-appearing sets of amino acids (see graphic in C. below and Exhibit III at the end).5
It is interesting to also note that the three stop codons TAA-TAG-TGA all match into codex boxing with later appearing/more complex amino acid molecules, TAA and TAG as members of the adenine position 2 variants (XAX) group. They box with the more complex and ‘later appearing’ amino acids tyrosine and tryptophan. The stop codes needed to be on a GG, CC, GC, or at the very least a CT/TC XX codon basis, in order to support an extended evolutionary period under a two letter codon basis (XX). This was not the situation, as you can see in Exhibit III at the end. This would suggest that the stop codes appeared later-to-last under a classic abiogenetic evolutionary construct. Life would have had to evolve up until thermophilic archaea without stop codes in their current form. Then suddenly, life would have had to adopt a new stop codon basis (and then never make another change again in 3.6 billion years), changing horses midstream. This previous XX codon form of life should be observable in our paleo record. But it is not.
Moreover, the use of the two digit codex is regarded by much of genomics as a degeneracy, ‘third base degeneracy’, and not an artifact of evolution.6 Finally, the codon ATA should, in a certain number of instances, equate to a start code, since it would have an evolutionary two digit legacy – yet it is never used to encode for methionine. This is incompatible with the idea that methionine used to employ a two digit AT code. Likewise, tyrosine and the non-amino stop code would have been in conflict over the two digit code TA under the two digit codex. In fact, overall there should exist a relationship between the arrival of an amino acid’s use in Earth life and the number of slots it occupies in the codex – and there is none, neither to the positive slope nor the negative.7
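The third base degeneracy at issue can be inspected directly. Below is a minimal sketch (Python, using the standard DNA codon table as reference; the code block is an added illustration and not part of the original article) which prints, for each of the sixteen two-letter prefixes, the set of assignments reachable by varying only the third letter:

```python
# Standard DNA codon table (amino acid three-letter codes; 'Stop' marks
# the three protein-silent codes). Testing 'third base degeneracy': for
# which two-base prefixes does the third letter leave the assignment
# unchanged?
CODE = {
    "TTT": "Phe", "TTC": "Phe", "TTA": "Leu", "TTG": "Leu",
    "CTT": "Leu", "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
    "ATT": "Ile", "ATC": "Ile", "ATA": "Ile", "ATG": "Met",
    "GTT": "Val", "GTC": "Val", "GTA": "Val", "GTG": "Val",
    "TCT": "Ser", "TCC": "Ser", "TCA": "Ser", "TCG": "Ser",
    "CCT": "Pro", "CCC": "Pro", "CCA": "Pro", "CCG": "Pro",
    "ACT": "Thr", "ACC": "Thr", "ACA": "Thr", "ACG": "Thr",
    "GCT": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
    "TAT": "Tyr", "TAC": "Tyr", "TAA": "Stop", "TAG": "Stop",
    "CAT": "His", "CAC": "His", "CAA": "Gln", "CAG": "Gln",
    "AAT": "Asn", "AAC": "Asn", "AAA": "Lys", "AAG": "Lys",
    "GAT": "Asp", "GAC": "Asp", "GAA": "Glu", "GAG": "Glu",
    "TGT": "Cys", "TGC": "Cys", "TGA": "Stop", "TGG": "Trp",
    "CGT": "Arg", "CGC": "Arg", "CGA": "Arg", "CGG": "Arg",
    "AGT": "Ser", "AGC": "Ser", "AGA": "Arg", "AGG": "Arg",
    "GGT": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
}

BASES = "TCAG"
for first in BASES:
    for second in BASES:
        assigned = {CODE[first + second + third] for third in BASES}
        # A single distinct assignment means the third base is fully
        # degenerate within this box.
        print(first + second + "x", sorted(assigned))
```

Eight of the sixteen boxes print a single assignment (the third letter fully degenerate there), while the remainder split – consistent with the synonym and non-synonym groupings tabulated in Exhibit I below.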
One can broach the fact that protein reassignments were very possible and could explain away the apparent introduction of an XXX codon dependency of all observable life, midstream. But then one must explain why it never ‘speciated’ again over 3.6 billion years, along with the apparent absence of XX codon life in the paleo record. This chasm in such a construct is critical and must be accommodated before abiogenesis can be fully developed as a hypothesis. In comparison, panspermia possesses no such critical obstacle (note, this is not a ‘gap’, as it relates to the critical path of the alternative and not merely circumstantial inductive inference).
Codon Radial Symmetry
B. Evolution of the codex would necessarily occur from amino absences rather than positives. Of particular note is the secondary map function of the start and stop codons. Notice that the start of a DNA sentence begins with a specific protein (the polar charge molecule methionine-ATG). The end of a DNA sequence however, consists of no protein coding whatsoever (see TAA, TAG and TGA). In other words, the DNA sentence begins with the same note every time, and ends with protein silence. tRNA evolved a way to accommodate the need for a positive, through the employment of proline-CCA during the protein assembly process. This is how a musical score works – it starts with a note, say an A440 tuning one, and ends with the silence dictated by the conductor’s wand. This is deliberacy of an empty set, as opposed to the stochasticity of positive notes – another appearance of the start code methionine could have sufficed for a positive stop and start code instead. Such a positive stop mechanism would succeed much better inside an evolution context. Why would an absence evolve into a stop code for transfer RNA? It could not, as ‘absence’ contains too much noise. It occurs at points other than simply a stop condition. The problem exists in that there is no way for an organism to survive, adapt, cull or evolve based upon its use of an empty set (protein silence). Mistakes would be amplified under such an environment. Evolution depends intrinsically upon logical positives only (nucleotides, mutations, death) – not empty sets.
C. Features triple-symmetrical assignment, linear robustness and ergodicity, along with a lack of both evolution and deconstructive chaos. This is NOT the same set of conditions as exists inside evolution, even though it may appear as such to a layman. This set of codex assignments features six principal challenges (C., and C. 1, 2a, 2b, 3, and 4 below). Specifically,
• radial assignment symmetry (B. Codon Radial Symmetry chart above),
• thymidine and adenine (XTX, XAX) second base preference for specific chemistries,
• synchrony with molecule complexity (C. Codon versus Molecule Complexity 64 Slots graphic to the right),
• block symmetry (C. Codon Second Base Block Symmetry table below) around the second digit (base),
• ergodicity, despite a lack of chemical feedback proximity or ability for a codon base to attract a specific molecule chemical profile or moiety, and
• lack of precedent from which to leverage.
These oddities could not ‘evolve’ as they have no basis to evolve from. The structure and assignment logic of the codex itself precludes the viability of a two base XX codex. Evolution, by definition, is a precedent entity progressing gradually (or sporadically) into another new entity. It thrives upon deconstructive chaos and culling to produce speciation. There was no precedent entity in the case of the DNA stop codon nor its XXX codex. As well, at any time, the stop codons could have been adopted under the umbrella of a valid protein, rendering two SuperKingdoms of life extant on Earth – and that should have happened. It should have happened many times over and at any time in our early history (in Archaea). Yet it did not.8 An asteroid strike and extinction event would not serve to explain the linearity. Evolution is not linear. We should have a number of DNA stop and start based variants of life available (just as we have with evolution based mechanisms) to examine. But we do not. In fact, as you can see in the chart to the right (derived from Exhibit III at the end), there exist four challenges to a purely abiogenetic classic evolution construct:
1. An original symmetry to the assignment of codon hierarchies (codex), such that each quadrant of the assignment chart of 64 slots mirrors the opposing quadrant in an ordinal discipline (see Codon Radial Symmetry charts in B. above).
Codon Second Base Block Symmetry
2. The second character in the codon dictates (see chart in B. Codon Radial Symmetry chart above) what was possible with the third character. In other words
a. all the thymidine position 2 variants (XTX) had only nitrite molecules (NO2) assigned to them (marked in blue in the chart in C. Codon versus Molecule Complexity 64 Slots to the upper right and in Exhibit III at the end, from where the graph is derived), while the more complex nitrous amino acids were all assigned to more complex oversteps in codex groups (denoted by the # Oversteps line in the Codon versus Molecule Complexity 64 Slots chart to the upper right).
In addition,
b. all adenine position 2 variants (XAX) were designated for multi-use 3rd character codons, all cytidine position 2 variants (XCX) were designated for single use 3rd character codons, while guanine (XGX) and thymidine (XTX) both were split 50/50 and in the same symmetrical patterning (see Codon Second Base Block Symmetry table to the right).
3. There exists a solid relationship, methodical in its application, between amino acid molecule nucleon count and assignment grouping by the second digit of the DNA codon, in rank of increasing degeneracy. Second letter codon usages were apportioned to amino acids as they became more and more complex, until T and A had to be used because the naming convention was being exceeded (see chart in C., Codon versus Molecule Complexity 64 Slots above to the right, as well as Exhibit III at the end). After this was completed, one more use of G was required to add 6 slots for arginine, and then the table of 60 amino acids was appended by one more, tryptophan (the most complex of all the amino acid molecules – top right of chart in C. Codon versus Molecule Complexity 64 Slots above, or the end slots in Exhibit III at the end), and the 3 stop codes thereafter. Very simple. Very methodical. Much akin to an IT developer’s subroutine log – which matures over the course of discovery inside its master application.
4. Ergodicity. Prejudice… Order, featuring radial symmetry (B. Codon Radial Symmetry chart), synchrony with molecule complexity (C. Codon versus Molecule Complexity 64 Slots graphic) and block symmetry (C. last image, Codon Second Base Block Symmetry table) around the second digit. The problem is that there is no way a natural process could detect the name/features of the base/molecule/sequence as a means to instill such order and symmetry into the codex, three times – much less evolve it in such short order, without speciation, in the first place.
Since the DNA chemistry itself is separated by two chemical critical path interventions, how would the chemistry of thymine for instance (the blue block in Exhibit III below) exclusively attract the nitric acid isomer of each amino acid? And why only the nitric acid isomers with more complex molecule bases? First, the DNA base 2 is nowhere physically near the chemical in question; it is only a LOGICAL association, not a chemical one, so it cannot contain a feedback or association loop. Second, there is no difference chemically between C2H5NO2 and C5H11NO2. The NO2 is the active moiety. So there should not have been a synchrony progression (C.3. above), even if there were a direct chemical contact between the amino acid and the second base of the codon. So the patterns happen as a result of name only. One would have to know the name of the codon by its second digit (base), or the chemical formula for the amino acid, and employ that higher knowledge to make these assignments.
Finally, this order/symmetry has not changed since the code was first ‘introduced’, and certainly has not been the product of stochastic arrival – as a sufficiently functional-but-less-orderly code would have evolved many times over (as is the practice of evolution) and been struck into an alternative codex well before (by several billion years) this beautiful symmetry could ever be attained.
We claim that evolution served to produce the codex, yet the codex bears the absolute signs of having had no evolution in its structure. We cannot selectively apply evolution to the codex – it must either feature evolutionary earmarks, or not be an evolved code. The mechanisms of evolution cannot become a special pleading football, applied only when we need it to enforce conformance – because in that case we will only ever find out what we already know. It becomes no better an argument philosophically than ‘God did it’.
D. Related codons represent related amino acids. For example, a mutation of CTT to ATT (see table in C. above) results in a relatively benign replacement of leucine with isoleucine. So the selection of the CT and AT prefixes between leucine and isoleucine was done early, deliberately and in finality – based upon a rational constraint set (in the example case, two nitrite molecule suffixed proteins) and not eons of trial and error.9 Since the assignment of proteins below is not partitioned based upon any physical characteristic of the involved molecule, there is no mechanism but deliberacy which could dictate a correspondence between codon relationships and amino acid relationships.10
E. Statistically impossible codex, not just improbable. Finally, it is not simply the elegant symmetry of the codex which is perplexing; the A. – D. usage contexts identified above allow one to infer (deductively?) that the codex, its precedent, provenance and structure are difficult-to-impossible to accommodate in even the most contorted construct of abiogenesis. Observe the A-G vs C-T tandem relationship between lysine and asparagine for instance. This elegant pattern of discipline repeats through the entire codex. This is the question asked by Eugene V. Koonin and Artem S. Novozhilov at the National Center for Biotechnology Information in Bethesda, Maryland in their study Origin and Evolution of the Genetic Code: The Universal Enigma (see graphic to the right, extracted from that study).11 This serious challenge is near to falsifying in nature, and cannot be dismissed by simple hand waving. Take some time (weeks or months, not just seconds) to examine the DNA codex in Exhibits I thru III below, the three tables and charts in B. and C. above, as well as the study from which the graphic to the right is extracted, and see if you do not agree. This argument does not suffer the vulnerability of ‘creationist’ arguments, so don’t play that memorized card – as Ockham’s Razor has been surpassed for this necessary alternative.
The hypothesis that the codex for DNA originated elsewhere bears specificity, definition and testable mechanism.
It bears less stacking, gapping and risk as compared to abiogenesis.
It is science. It is the necessary alternative.
Assuming that life just won the 1-in-1.6 x 10^15 lottery (the Omega Hypothesis), again, and quickly, near to the first time it even bought a lottery ticket – has become the old-hat fallback explanation inside evolution: one which taxes a skeptic’s tolerance for explanations which begin to sound a lot like pseudo-theory (one idea used to explain every problem at first broach). However this paradox – what we observe to be the nature of the codex, and its incompatibility with abiogenesis – involves an impossibly highly stacked assumption set to attempt to explain away based solely upon an appeal to plenitude error. A fallback similar in employment to the appeal to God for the creationist. The chance that the codex evolved a LUCA (Last Universal Common Ancestor) by culled stochastics alone in the first 300 million years of Earth’s existence12 is remote from both the perspective of time involved (it happened very quickly) and statistical unlikelihood – but as well from the perspective that the codex is also ‘an optimal of optimates’ – in other words, it is not only functional, but smart too. Several million merely functional-but-uglier code variants would have sufficed to do the job for evolution. So this begs the question: why did we get a smart/organized codex (see Exhibit II below and the radial codon graphic in B. above) and not simply a sufficiently functional but chaotic one (which would also be unlikely itself)? Many social skeptics wishing to enforce a nihilistic religious view of life miss that we are stacking deliberacy on top of a remote, infinitesimally small chance happenstance. Their habit is to ignore such risk chains and then point their finger at creationists as being ‘irrational’ as a distraction.
The codex exhibits an unprecedentable, static, patterned format – employing the empty set as a positive logical entity – exposed to the deconstructive vulnerability which happens in everything else, but which chose to not happen in this one instance. In other words, evolution actually chose to NOT happen in the DNA codex – i.e. deliberacy. The codex, and quod erat demonstrandum, the code itself, came from elsewhere. I do not have to explain the elsewhere, but merely provide the basis for understanding that the code did not originate here. That is the only question on our scientific plate at the moment.
Exhibits I, II and III
Exhibit I below shows the compressed 64 slot utilization and its efficiency and symmetry. 61 slots are coded for proteins and three are not – they are used as stop codes (highlighted in yellow). There are eight synonym based code groups (proteins), and twelve non-synonym code groups (proteins). Note that at any given time, small evolutionary fluctuations overlapping into the stop codes would render the code useless as the basis for life. So, the code had to be frozen from the very start, or never work at all. Either that, or assign tryptophan as a dedicated stop code and mirror to methionine, and make the 5th code group a synonym for Glutamine or Aspartic Acid.
Exhibit I – Tandem Symmetry Table
Exhibit II below expands upon the breakout of this symmetry by coded protein, chemical characteristics and secondary mapping if applicable.
Exhibit II – Expanded Tandem Symmetry and Mapping
Finally, below we relate the 64 codon slot assignments, along with the coded amino acid, the complexity of that molecule and its use in thermophilic archaea. Here it is clear that even our first forms of life, constrained to environments in which they had the greatest possibility of even occurring, employed the full XXX codex (three letter codon). While it is reasonable to propose alternative conjecture (and indeed plurality exists here) that the chart suggests a two letter basis as the original codex, life’s critical dependency upon the full codon here is very apparent.
Exhibit III – 61 + 3 Stop Codon Assignment versus Molecule Complexity and
Use in Thermophilic Archaea (see chart in C. above)
Recognizing the legitimacy of the necessary alternative – one which was both purposely neglected by science, and yet offers the only viable alternative to standing and celebrated club dogma – this is a process of real science. Developing mechanism for such an alternative is the antithesis of religious activity. Maturing the necessary alternative into hypothesis, is the heart and soul of science.
Blocking such activity is the charter and role of social skepticism. Holding such obfuscating agency accountable is the function of ethical skepticism.
The Ethical Skeptic, “Embargo of The Necessary Alternative is Not Science” The Ethical Skeptic, WordPress, 24 Nov 2018; Web, https://wp.me/p17q0e-8Ob
In true research, the diligent investigator is continually bombarded by huge amounts of data, in the form of facts, observations, measures, voids, paradoxes, associations, and so on. To be able to make use of this data, a researcher typically reduces it to more manageable proportions. This does not mean we necessarily need to tender a claim about that data. Instead science mandates that we apply the principles of both linear and asymmetric Intelligence as part of the early scientific method. Our goal in Sponsorship is not to force an argument or proof, but rather to establish a reductionist description through which the broader observational set may be reproduced or explained, as possible. This reductionist description is called a construct. Constructs are developed by laymen, field researchers, analysts, philosophers, witnesses as well as lay experts and scientists alike. The science which ignores this process is not science.
It is the job of the Sponsor in the scientific method, to perform these data collection, linear and asymmetric Intelligence and reductionist development steps. In absence of robust regard for Sponsorship, science is blinded, and moreover the familiar putrefied rot of false skepticism takes root and rules the day of ignorance.
Sponsor
(scientific methodology) an individual or organization gathering the resources necessary and petitioning for plurality of argument under Ockham’s Razor and the scientific method.
When we speak of ethics at The Ethical Skeptic, we speak less of features of personal moral character, and more of the broader application context. A professional allegiance and adherence to a clear and valuable series of deontological protocols which produce results under a given knowledge development process. In other words, fealty to the scientific method, above specific conclusions. Ethically I defer; I surrender my religions, predispositions and dogma to the outcome of the full and competently developed knowledge set. This is ethics. It really has nothing to do with morality and everything to do with the character of curiosity. I do not have the universe figured out, and I would sincerely like to know some things.
Type I Sponsor: Lay Science
A key component of this ethical process is the portion of science which involves Sponsorship. Don’t be dissuaded by the title. A sponsor is a very familiar participant in the protocols of science. Sky watching lay astronomers for example are a vital part of the scientific method, depicted in the chart to the right under Type I Sponsors. These lay researchers perform key roles in the monitoring, collecting and documenting of celestial events and bodies. Many new comets are named after the actual layman who spotted them and provided enough information for science then to further prove the case at hand. The Sponsor in astronomy does not prove the hypothesis per se, rather simply establishes the case for a construct: the proposed incremental addition of celestial complexity beyond the reasonableness of parsimony (see: Ethical Skepticism – Part 5). Science then further tests, reviews and proves the lay astronomer’s sponsored construct by means of a hypothesis. The layman in astronomy in essence ‘gathers the resources necessary and petitions for plurality of argument (a new celestial moving body) under Ockham’s Razor and the scientific method.’ It is this Type I Sponsor realm inside of which Big Data will unveil its most remarkable revolution.
“In everyday life we are continually bombarded by huge amounts of data, in the form of images, sounds, and so on. To be able to make use of this data we must reduce it to more manageable proportions.”¹ This does not mean we need to make a claim about that data. It means we need to apply the principles of asymmetric intelligence. Our goal in [Sponsorship] is not to make a claim necessarily, rather to “establish a reductionist description through which the observational set may be reproduced.”¹ – Stephen Wolfram, A New Kind of Science
In similar fashion, Sponsors function as a critical contributor group to science under a number of fields of study more complex and less linear than astronomy. A woodsman who has hunted, fished, and lived in the Three Sisters region of Oregon can stand as an expert in terms of both resource and recitation, and moreover can become a sponsor of an idea regarding the domain in which he has spent a lifetime conducting data collection. Wise local university researchers will meet locals on Cascade Ave. in order to collect observations on some of the region’s geologic history. The lay researcher can perform citizen science as well, yes. But more importantly, he or she might aid science by developing an idea which has never been seriously considered before. Perhaps they have observed changes in total Whychus Head spring water volume flow prior to earthquakes of magnitude 3.0 and above. Perhaps they want to formalize these observations and ask a local university to take a look at their impression (construct). This is Sponsorship: establishing a case that science should address their own version of a comet, in the natural domain which they survey.
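What formalizing such an impression might look like is sketched below in Python. Every reading, event day, and threshold is invented for illustration; a real petition would rest on logged field data. The construct is simply a comparison of rates: do anomalous spring-flow days precede magnitude 3.0+ events more often than ordinary days do?

# Hypothetical sketch: tabulate whether anomalous spring-flow days precede
# magnitude >= 3.0 earthquakes more often than ordinary days do.
flow_by_day = {1: 11.0, 2: 11.2, 3: 16.5, 4: 11.1, 5: 10.9,
               6: 17.2, 7: 11.3, 8: 11.0, 9: 15.9, 10: 11.2}
quake_days = {4, 7, 10}   # days on which a magnitude >= 3.0 event occurred
ANOMALY_LPS = 14.0        # hypothetical anomaly threshold, liters per second
LEAD_DAYS = 1             # look for a quake within this many following days

anomalous = {d for d, f in flow_by_day.items() if f >= ANOMALY_LPS}
ordinary = set(flow_by_day) - anomalous

def precedes_quake(day):
    """True if a quake follows within LEAD_DAYS of the given day."""
    return any(day + k in quake_days for k in range(1, LEAD_DAYS + 1))

hit_rate = sum(precedes_quake(d) for d in anomalous) / len(anomalous)
base_rate = sum(precedes_quake(d) for d in ordinary) / len(ordinary)

# A large gap between the two rates is not proof; it is a construct, i.e.
# grounds to petition a university to examine the association.
print(f"anomalous-day hit rate: {hit_rate:.2f}, baseline rate: {base_rate:.2f}")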
The role of the Sponsor is not to prove a particular case, but rather to surpass Ockham’s Razor in petitioning science to develop and examine a set of hypotheses.
Indeed, problems even more complex, such as the study of the patterns and habits of wildlife, rarely advance via the handiwork of one organization or individual. The task is simply too daunting. Citizens provide critical inputs as to the habits of the Red Wolf or Grizzly Bear.² In similar fashion, medical maladies and the successful means of addressing them are often asymmetric in their challenge, involving a many-faceted series of contributing and mitigating elements. The role of the lay researcher, of moms and dads with respect to their children, is critical. To ignore this lay resource under the guise of fake skepticism and ‘anecdote’ is not only professionally unwise (unethical), but cruel as well (immoral). These stand as examples of the asymmetric challenge entailed in the majority of the scientific knowledge processes we face as a society. It is this preponderance of asymmetric challenge, therefore, which promotes the Sponsor into the necessary roles of both inventor and discoverer, and not simply the role of science clerk.
Type II Sponsor: Tinkerer as Lay Scientist
The second category of Sponsorship involves the work of lay tinkerers and garage inventors (Type II Sponsor in the graphic above). Arguments vary as to the magnitude of impact of this class of researcher, but no one can dispute the relatively large impact that this class of Sponsor has had on various industry verticals. A key example might be 17-year-old layman Michael Callahan, who lost his voice in a skateboarding accident and subsequently developed a device called Audeo to aid those who have suffered a similar loss of vocal function (see: Top Inventions: Audeo). But lay science does not have to bear simply the consequentialist result of a technological device (Type II), or simply observation inside a well-established domain of science (Type I). A sponsor can perform the role of discovery as well.
Type III Sponsor: Discoverer
A more powerful and controversial role of the Sponsor, and the role which The Ethical Skeptic believes stands as the Achilles’ heel of science today, is that of the discoverer, or Type III Sponsor under the scientific method. This person performs both the inception and the broad case-petition roles for plurality under the scientific method. Historically in my labs, we have had two significant discoveries which were sponsored by outside parties, who brought their petitions for hypothesis development to my labs for both validation/application testing and funding. Were I a fake skeptic, this would never have happened. One was a method of changing a clinical compound; the other was a groundbreaking approach to material development. Neither was a technological development; rather, each was a scientific breakthrough that would ultimately change technology later. These were discoveries, not inventions. This agent of science, the discoverer, exercises the Bacon-esque Novum Organum which resides at the heart of discovery science. This is the aspect of science which Social Skeptics and their crony oligarchs perform desperate gymnastics in order to deny and squelch. Pharmaceutical companies and competing labs and organizations fought us hard to deny or steal the development of technologies surrounding these discoveries. For the most part they failed, but they caused damage, ultimately to society and to all of us. They wanted to control and overprice the application and deployment of the technology.
The freedom to discover wrests control from the hands of Social Skeptic cronies and places it into those of ethical small enterprises and mercy-based organizations.
Should their cronies in Social Skepticism gain full control of science and government, then the Sponsor and lay researcher will become an endangered species: a compliant herd, caged in an oligopoly cubicle zoo, milked of its intellectual potential, mulling the shallow instant-grits Social Skepticism literature upon which it grazes. SSkeptics are professionals at socially mandating that something not exist. They are a mafia after all; if they cannot kill you or your message, then they will ensure that no intellectual trace of either exists. Such constitutes the bright and wonderful promised future of Social Skepticism.
Nonetheless, this less touted yet critical part of the scientific method, the discovery contribution of the lay researcher, has contributed vastly more to our understanding of life, health, and our realm than oppressive SSkepticism will ever allow to be admitted into history. The Stanford Encyclopedia of Philosophy opines thusly about the data collection, reduction, intelligence, and discovery process as outlined by the lay scientist Francis Bacon (a Type III lay scientist):
Bacon’s account of his “new method” as it is presented in the Novum Organum is a prominent example. Bacon’s work showed how best to arrive at knowledge about “form natures” (the most general properties of matter) via a systematic investigation of phenomenal natures. Bacon described how first to collect and organize natural phenomena and experimental facts in tables, how to evaluate these lists, and how to refine the initial results with the help of further experiments. Through these steps, the investigator would arrive at conclusions about the “form nature” that produces particular phenomenal natures. The point is that for Bacon, the procedures of constructing and evaluating tables and conducting experiments according to the Novum Organum leads to secure knowledge. The procedures thus have “probative force”.†
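The tabular procedure described in that passage is, at bottom, eliminative induction, and it is simple enough to sketch in Python. The instances and candidate ‘form natures’ below are invented toys, not Bacon’s own tables: a candidate survives only if it accompanies every instance where the phenomenal nature is present, and never appears where it is absent.

# Toy sketch of the Novum Organum's tables of presence and absence,
# read as eliminative induction. Instances and features are invented.
presence_table = [                     # instances exhibiting the phenomenon
    {"friction", "motion", "sunlight"},
    {"friction", "motion", "flame"},
]
absence_table = [                      # similar instances lacking it
    {"motion", "sunlight"},
    {"flame"},
]
candidates = {"friction", "motion", "sunlight", "flame"}

def eliminate(candidates, presence, absence):
    """Strike every candidate form contradicted by either table."""
    surviving = set(candidates)
    for instance in presence:
        surviving &= instance          # must accompany every presence
    for instance in absence:
        surviving -= instance          # must never appear in an absence
    return surviving

# Only "friction" survives both tables: the remaining candidate form nature.
print(eliminate(candidates, presence_table, absence_table))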
Indeed, it is the lay researcher who may well possess the only domain access under which to make such “form natures” observations, which can be crafted into “probative force,” or that of a testable Construct or Hypothesis. To ignore these inputs, to ignore the input of 10,000 lay observers who inhabit a particular domain, even in the presence of uncertainty and possible chicanery, is professionally unwise (unethical). To the Ethical Skeptic, it is unwise to set up conferences which function only to teach people how to attack these researchers, even if those conferences prove 95% correct in their assessments of the subjects they regard as bunk.
Fake Skeptics, those who have a religion and an ontology to protect, bristle at the work and deontological impact of Type III lay researchers. Francis Bacon was a lay philosopher and researcher in his own right, and accordingly bore his own cadre of detractors and ‘skeptics.’ Yet his work had a most profound impact on modern discovery science thought. It is this type of researcher which challenges and changes the landscape of science (see: Discovery versus Developmental Science). It is in this domain that we first encounter the Ethical Skeptic’s paradoxical adage, “Experts who are not scientists and scientists who are not experts.” It is this social and methodical challenge which we as a body of knowledge developers must overcome in order for discovery to proceed. It is our duty to resolve this paradox and move forward, not to habitually attack those involved.
Social Skepticism: Exploiting the Paradox for Ignorance
This absolutely essential element of ethics under the scientific method, therefore, is the process of Sponsorship. The Craft of Research, a guide commonly recommended by advisers to candidates prosecuting their dissertations, relates that “Everything we’ve said about research reflects our belief that it is a profoundly social activity.”³ That simply means that the majority of science, research, and the knowledge development process resides outside the laboratory, in an asymmetric and highly dynamic realm. Given the abject complexity of this playing field, it becomes manifest that it is our fealty to process which effectively distinguishes us from the pretender, and not how correct we are on bunk subjects. The role of the Ethical Skeptic is to defend the integrity of this knowledge development process. This raises a question: what if various persons and groups do not desire knowledge to improve? What then is the role of the Ethical Skeptic?
If you demand ‘bring me proof’ before you would ask ‘bring me enough intelligence to form a question or hypothesis’ – then I question your purported knowledge of science. ~TES
Part of our job at The Ethical Skeptic, as well, is to elicit and shed light on circumstances wherein pretenders circumvent and abrogate this ethical process of science: instances where false vigilante skeptics use the chaos of research against the knowledge development process itself, and unethical actions which target the elimination of Type III research along with the defamation and intimidation of both lay and scientist Type III researchers. Below are listed some of the tactics employed along a deleterious path of willfully and purposely vitiating the Sponsorship (in particular Type III Sponsorship) steps of the scientific method:
Tactics/Mistakes Which Social Skeptics Employ to Vitiate the Sponsorship Portion of the Scientific Method
1. Thinking that the craft of research and science hinges solely upon the principle of making final argument.³
2. Promoting an observation to the status of a claim.
3. Thinking that a MiHoDeAL or Apophenia claim to authority can be issued without supporting evidence.
4. Routinely accepting at face value associative, predictive, or statistical proofs, while eschewing and denying falsifying observations (see the sketch following this list).
5. Failing to acknowledge the full set of explanatory possibilities under the sponsorship Peer Input step.
6. Presuming that a Sponsor is only pursuing one alternative explanation during Peer Input.
7. The inability of SSkeptics to recognize true experts in contexts other than academic/oligarch ones.
8. The errant habit of false skeptics in citing and deferring to non-experts.
9. Vigilante thinking: mistakenly believing that the role of skepticism is to ‘evaluate claims’ and to teach Sponsors ‘critical thinking’, which squelches Sponsorship in the first place.
10. The amateur error of applying pretend Peer Review tactics in the Sponsorship stage of science.
11. Failure to demonstrate circumspection toward contradictions or weaknesses in one’s own Peer Input argument.
12. Failure to define/address significance versus insignificance by observational context.
13. Fake Skepticism: Asking “Here is what we need to prove this” rather than “Here is what we need in order to develop a hypothesis.”
14. Committing the Vigilante Mistake: Killing just as many innocent Sponsors as one does Bad Science Sponsors.
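The asymmetry named in item 4 above can be made concrete with a toy Python sketch (the ‘conjecture’ and the observations are invented): any number of confirming instances leaves a conjecture provisional, while a single falsifying observation is decisive, and eschewing it is precisely the pseudo-inference of affirmation.

# Toy illustration of item 4: confirmations accumulate but never prove;
# one falsifier refutes. The conjecture and data below are invented.
def conjecture(n):
    return n % 2 == 0          # toy claim: "every observed number is even"

observations = [2, 4, 6, 8, 10, 12, 7]

confirmations = [n for n in observations if conjecture(n)]
falsifiers = [n for n in observations if not conjecture(n)]

# Six confirmations do not establish the conjecture; the lone falsifier
# (7) refutes it, and it is exactly the datum affirmation bias discards.
print(f"{len(confirmations)} confirmations; falsified: {bool(falsifiers)}")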
¹ a. Stephen Wolfram, A New Kind of Science; Wolfram Media, Inc., Champaign, IL; p. 548.
¹ b. Stephen Wolfram, A New Kind of Science; Wolfram Media, Inc., Champaign, IL; pp. 557-576.
³ a. Wayne C. Booth, Gregory G. Colomb, and Joseph M. Williams, The Craft of Research, Third Edition; The University of Chicago Press, Chicago, IL; p. 285.
³ b. Wayne C. Booth, Gregory G. Colomb, and Joseph M. Williams, The Craft of Research, Third Edition; The University of Chicago Press, Chicago, IL; pp. 114-123.
† “Scientific Discovery”; The Stanford Encyclopedia of Philosophy; Web.