The Ethical Skeptic

Challenging Pseudo-Skepticism, Institutional Propaganda and Cultivated Ignorance

Embargo of The Necessary Alternative is Not Science

Einstein was desperate for a career break. He had a 50/50 shot – and he took it. The necessary alternative he selected, fixed c, was one which was both purposely neglected by science, and yet offered the only viable alternative to standing and celebrated club dogma – dogma which had, for the most part, gone unchallenged. Developing mechanism for such an alternative is the antithesis of religious activity. Maturing the necessary alternative into hypothesis is the heart and soul of science.

Mr. Einstein You’ll Never Amount to Anything You Lazy Dog

Albert Einstein introduced, in a 1905 scientific paper, the relationship proposed inside the equation E = mc²: the concept that the system energy of a body (E) is equal to the mass (m) of that body times the speed of light squared (c²). That same year he also introduced a scientific paper outlining his theory of special relativity. Most of the development work (observation, intelligence, necessity, hypothesis formulation) entailed in these papers was conducted during his employment as a technical expert – class III (aka clerk) at the Federal Office for Intellectual Property in Bern, Switzerland, colloquially known as the Swiss patent office.1 There, bouncing his ideas off a cubicle-mate (si vis) and former classmate, Michele Angelo Besso, an Italian engineer, Einstein found the time to further explore ideas that had taken hold during his studies at the Swiss Federal Polytechnic School. He had been a fan of his instructor, physicist Heinrich Friedrich Weber – the more notable of the two professors with whom he was most engaged at Swiss Federal Polytechnic. Weber had stated two things which made a lasting impression upon the budding physicist.2

“Unthinking respect for authority is the enemy of truth.” ~ physicist Heinrich Friedrich Weber

As well, “You are a smart boy, Einstein, a very smart boy. But you have one great fault; you do not let yourself be told anything,” quipped Weber as he scolded Einstein. His mathematics professor, Hermann Minkowski, scoffed to his peers about Einstein, relating that he found Einstein to be a “lazy dog.” In a similar vein, his instructor, physicist Jean Pernet, admonished the C-average (82%, or 4.91 of 6.00 GPA) student “[I would advise that you change major to] medicine, law or philosophy rather than physics. You can do what you like. I only wish to warn you in your own interest.” Pernet’s assessment was an implication to Einstein that he did not face a bright future, should he continue his career in pursuit of physics. His resulting mild ostracism from science was of such extent that Einstein’s father later had to petition, in an April 1901 letter, for a university to hire Einstein as an instructor’s assistant. His father wrote “…his idea that he has gone off tracks with his career & is now out of touch gets more and more entrenched each day.” Unfortunately for the younger Einstein, his father’s appeal fell upon deaf ears. Or perhaps fortuitously, as Einstein finally found employment at the Swiss patent office in 1902.3

However, it was precisely this penchant for bucking standing authority which bore fruit in Einstein’s eventual physics career. In particular, Einstein’s youthful foible of examining anew the traditions of physical mechanics, combined with perhaps a dose of edginess from being rejected by the institutions of physics, was brought to bear effectively in his re-assessment of absolute time – absolute space Newtonian mechanics.

Einstein was not ‘doubting’ per se, which is not enough in itself. Rather he executed the discipline of going back and looking – proposing an alternative to a ruling dogma based upon hard-nosed critical path induction work, and not through an agency desire to lazily pan an entire realm of developing ideas through abduction (panduction). No, social skeptics, Einstein did not practice your form of authority-enforcing ‘doubt’. Rather, it was the opposite.

He was not doubting; rather he was executing work under a philosophical value-based principle called necessity (see Ethical Skepticism – Part 5 – The Real Ockham’s Razor). Einstein was, by practice, an ethical skeptic.

Einstein was not lazy after all, and this was a miscall on the part of his rote-habituated instructors (one common still today). Einstein was a value economist. He applied resources into those channels in which they would provide the greatest beneficial effect. He chose not to waste his time upon repetition, memorization, rote procedure and exercises in compliance. He was the ethical C student – the person I hire before hiring any form of cheating/memorizing/imitating A or B student. And in keeping with such an ethic, Einstein proposed in 1905, three years into his fateful exile at the Swiss patent office, several unprecedented ideas which were subsequently experimentally verified in the ensuing years. Those included the physical basis of 3-dimensional contraction, speed and gravitational time dilation, relativistic mass, mass–energy equivalence, a universal speed limit (for matter and energy but not information or intelligence) and relativity of simultaneity.4 I never reflect upon this amazing accomplishment without profound wonder over its irony, and its requital in Einstein’s career.

The Necessary Alternative

But was the particular irony inside this overthrow of Newtonian mechanics all that unexpected or unreasonable? I contend that it was not only needed, but that the cascade of implications leveraged by c-invariant physics was the only pathway left for physics at that time. It was the inevitable, and necessary, alternative. The leading physicists, as a very symptom of their institutionalization, had descended into a singular dogma. That dogma held as its centerpoint the idea that space and time constituted the fixed reference for all reality. Every physical event which occurred inside our realm hinged around this principle. Einstein, in addressing anew such authority-based thinking, was faced with a finite and small set of alternative ideas which were intrinsically available for consideration. That is to say – the set of ideas included only four primary elements, which could, alternately or in combination, be assumed as fixed, dependent, or independently variable. Let’s examine the permutation potential of these four ideas: fixed space, fixed time, fixed gravity and/or fixed speed of light. Four elements. The combinations available for such a set number 14, as related by the summation of three combination functions:

C(4,1) + C(4,2) + C(4,3) = 4 + 6 + 4 = 14

What reasoned conjecture offered – given that combinations of four or three fixed elements were highly unlikely or unstable – was to bound the set of viable alternative considerations to even fewer than 14: maybe six very logical alternatives at most (the second combination function above, C(4,2)). However, even more reductively, Einstein essentially needed only to start by selecting one of the four base choices, as represented by the first combination function above, C(4,1). Thereafter, if he chose correctly, he could proceed onward to address the other three factors depending upon where the critical path led (a short enumeration sketch follows the list below). But the first choice was critical to this process. One of the following four had to be chosen, and two were already in deontological doubt in Einstein’s mind.

•  Fixed 3 dimensional space (x, y, z)
•  Fixed time (t)
•  Fixed gravitation relative to mass (g-mass)
•  Fixed speed of light (c)
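
As a concrete aside, the arithmetic above is easy to verify. Below is a minimal sketch (not from the original article) which enumerates the candidate sets of fixed elements; the element labels are simply the four listed above:

```python
# Enumerate candidate sets of 'fixed' elements, assuming (per the text) that
# sets of all four are excluded - leaving C(4,1) + C(4,2) + C(4,3) choices.
from itertools import combinations

elements = ["space (x, y, z)", "time (t)",
            "gravitation (g-mass)", "speed of light (c)"]

total = 0
for k in (1, 2, 3):
    subsets = list(combinations(elements, k))
    total += len(subsets)
    print(f"C(4,{k}) = {len(subsets)}")

print("total candidate alternatives:", total)  # 4 + 6 + 4 = 14
```

Of the six two-element combinations, standing dogma occupied one (fixed space and time), and the pathway Einstein chose occupied another.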

Ultimately then, only two choices existed, if one is to suppose a maximum of two fixed elements as possible per the below. Indeed this ended up being the plausible set for Einstein. The necessary alternatives, one of which had been essentially embargoed by the science authorities of the time, were a combination of two of the above four elements. Another combination of two was already in force (fixed space and time).

In other words, now we have reduced the suspect set to two murder suspects – Colonel Mustard and Professor Plum – and standing dogma was dictating that only Colonel Mustard could possibly be considered as the murderer. To Einstein this was, at worst, an even bet.

This is the reason why we have ethical skepticism. This oft-repeated condition – wherein false skepticism is applied to underpin authority-based denial in an a priori context, in order to enforce one mandatory conclusion at the expense of all others – is a situation ripe for deposing. Einstein grasped this. The idea that space and time were fixed references was an enforced dogma on the part of those wishing to strengthen their careers in a social club called physics. Everyone was imitating everyone else, and trying to improve club ranking through such rote activity. The selection of the first two elements, stemming of course from strong inductive work by Newton and others, was a mechanism of control called an Einfach Mechanism (see The Tower of Wrong), or

Omega Hypothesis HΩ – the answer which has become more important to protect than science itself.

•  Fixed 3 dimensional space (x, y, z)
•  Fixed time (t)

Essentially, Einstein’s most logical alternative was to first assume the speed of light as fixed. By choosing a fixed reference of the speed of light first, Einstein had journeyed down both a necessary, as well as inevitable, hypothesis reduction pathway. It was the other murder suspect in the room, and stood as well as the rebellious Embargo Hypothesis option.

Embargo Hypothesis Hξ – the option which must be forbidden at all costs and before science even begins.

•  Fixed gravitation relative to mass (g-mass)
•  Fixed speed of light (c)

But this Embargo Hypothesis was also the necessary alternative, and Einstein knew this. One can argue both sides of the contention that the ‘embargo’ of these two ideas was one of agency or of mere bias. In this context and for purposes of this example, both agency and bias are to be considered the same embargo principle. In many or most arguments however, they are not the same thing.

The Necessary Alternative

/philosophy : Ockham’s Razor : Necessity/ : an alternative which has become necessary for study under Ockham’s Razor because it is one of a finite, constrained and very small set of alternative ideas intrinsically available to provide explanatory causality or criticality inside a domain of sufficient unknown. This alternative does not necessarily require inductive development, nor proof, and can still serve as a placeholder construct, even under a condition of pseudo-theory. In order to mandate its introduction, all that is necessary is a reduction pathway in which mechanism can be developed as a core facet of a viable and testable hypothesis based upon its tenets.

The assertion ‘there is a God’ does not stand as the necessary alternative to the assertion ‘there is no God’. Even though the argument domain constraints are similar, these constructs cannot be developed into mechanism and testable hypothesis. So neither of those statements stands as the necessary alternative. I am sorry, but neither of those statements is one of science. They are Wittgenstein bedeutungslos – meaningless: a proposition or question which resides upon a lack of definition, or a presumed definition which contains no meaning other than in and of itself.

However, in exemplary contrast, the alternatives of whether life originated on Earth (abiogenesis) or off Earth (panspermia) do stand as a set of necessary alternatives. Even though both ideas are in their infancy, they can both ultimately be developed into mechanism and a testing critical path. The third letter of the DNA codon (see Exhibit II below) is one such test of the necessary alternatives, abiogenesis and panspermia. There is actually a third alternative as well, another Embargo Hypothesis (in addition to panspermia) in this case example – that of Intervention theory. But we shall leave that (in actuality also necessary) alternative discussion for another day, as it comes with too much baggage to be of utility inside this particular discourse.

Einstein chose well from the set of two necessary alternatives, as history proved out. But the impetus which drove the paradigm change from the standing dogma to Einstein’s favored Embargo Hypothesis might not have been as astounding a happenstance as it might appear at first blush. Einstein chose red, when everyone and their teaching assistant was of the awesome insistence that one need choose blue. All the ramifications of a fixed speed of light (and fixed gravitation, relative only to mass) unfolded thereafter.

Einstein was desperate for a break. He had a 50/50 shot – and he took it.

Example of Necessity: Panspermia versus Abiogenesis

An example of this condition – wherein a highly constrained set of alternatives (two in this case) inside a sufficient domain of unknown forces the condition of dual necessity – can be found inside the controversy around the third letter (base) of the DNA codon. A DNA codon is the word inside the sentence of DNA. A codon is a series of 3 nucleotides (XXX of A, C, T or G) which have a ‘definition’ corresponding to a specific protein function to be transcribed from the nucleus and decoded by the cell in its process of assembling body tissues. It is an intersection on the map of the organism. Essentially, the null hypothesis stands that the 3rd letter (nucleotide) digit of the codon, despite its complex and apparently systematic methodical assignment codex, is the result of natural stochastic-derivation chemical happenstance during the first 300 million years of Earth’s existence (not a long time). The idea is that life existed on a 2 letter DNA codon (XX) basis for eons, before a 3 letter (XXX) basis evolved (shown in Exhibit II below). The inductive evidence – that an assignment codex based upon 3 letters deriving from 2 is beyond plausibility, given the improbability of its occurrence and the lack of time and influencing mechanism during which that improbability could have happened – supports its also-necessary alternative.
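
For readers unfamiliar with the structure under discussion, a short sketch of the standard genetic code (public reference knowledge, not material from this article) shows the 64-slot codex referenced throughout:

```python
# The standard DNA codon table: 4^3 = 64 three-letter words. The 64-character
# string lists the one-letter amino acid symbol for each codon in nested
# TCAG order; '*' marks the three stop codons.
from itertools import product

BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): a for c, a in zip(product(BASES, repeat=3), AMINO)}

coding = [c for c, a in CODON_TABLE.items() if a != "*"]
stops  = [c for c, a in CODON_TABLE.items() if a == "*"]

print(len(CODON_TABLE))  # 64 slots in total
print(len(coding))       # 61 slots coded for amino acids
print(sorted(stops))     # ['TAA', 'TAG', 'TGA'] - the three stop codons
```

The 61-plus-3 slot division printed here is the same one examined in Exhibits I through III at the end of this article.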

In this circumstance, the idea that the DNA codon third-digit-based codex was not a case of 300 million years of fantastical and highly improbable happenstance, but rather existed inside the very first forms of life which were to evolve (arrive) on Earth, is called panspermia. The necessary alternative panspermia does not involve or hinge upon the presence of aliens planting DNA on Earth, rather that the 3 letter codon basis was ‘unprecedented by context, complex and exhibiting two additional symmetries (radial and block) on top of that’ at the beginning of life here on Earth, and therefore had to be derived from a source external to the Earth. Note, this is not the same as ‘irreducible complexity’, a weak syllogism employed to counter-argue evolution (not abiogenesis) – rather it is a case of unprecedentable complexity, a much stronger and more deductive argument. It is the necessary alternative to abiogenesis. It is science. Both alternatives are science.

This circumstance elicits the contrathetic impasse, a deliberation wherein a conundrum exists solely because authority is seeking to enforce a single answer at the expense of all others, or forbid one answer at the expense of science itself. The enforced answer is the Omega Hypothesis and the forbidden alternative is the Embargo Hypothesis. And while of course abiogenesis must stand as the null hypothesis (it can be falsified but never really proven) – that does not serve to make it therefore true. Fake skeptics rarely grasp this.

Therefore, the necessary alternative – that the DNA (XXX) codex did not originate on Earth – is supported by the below petition for plurality, comprising five elements of objective inference (A – E below). This systematic codex is one which cannot possibly be influenced (as is evolution) by chemical, charge, handedness, usage, epigenetic or culling factors (we cannot cull XX codex organisms to make the survival group more compatible for speciating into XXX codex organisms). Nor do we possess influences which can serve to evolve the protein-based start, and silence-based stop, codons. It can only happen by accident or deliberation. This is an Einstein moment.

Omega Hypothesis HΩ – the third letter of the DNA codex evolved as a semi-useless appendage, in a single occurrence, from a 2 letter codex basis – featuring radial symmetry, block assignment symmetry and molecule complexity to 2nd base synchrony – only upon Earth, in a 1.6 x 10^-15 chance (a 1 of 6 chance across a series of 31 pairings, across the potential permutations of (n – 1) proteins which could be assigned), during the first 300 million years of Earth’s existence. Further, the codex is exceptionally optimized for maximizing effect inside proteomic evolution. Then evolution of the codex stopped for an unknown reason, and has never happened again for 3.8 billion years.

Stacking of Entities = 10 stacked critical path elements. Risk = Very High.

Embargo Hypothesis Hξ – the three letter codex basis of the DNA codon pre-existed the origination of life on Earth, arrived here preserved by a mechanism via natural celestial means, and did not/has not evolved for the most part, save for slight third base degeneracy. Further, the codex is exceptionally optimized for maximizing effect inside proteomic evolution.

Stacking of Entities = 4 stacked critical path elements. Risk = Moderate.

Note: By the terms ‘deliberacy’ and ‘prejudice’ used within this article, I mean the ergodicity which is incumbent within the structure of the codex itself – both how it originated, and what its result was in terms of compatibility with amino acids converting into life. There is no question of ergodicity here. The idea of ‘contrived’, on the other hand, involves a principle called agency. I am not implying agency here in this petition. A system can feature ergodicity, but not necessarily as a result of agency. To contend agency is the essence of intervention hypotheses. To add agency would constitute stacking of entities (a lot of them too – rendering that hypothesis weaker than even abiogenesis). This, according to Ockham’s Razor (the real one).

The contention that panspermia merely shifts the challenges addressed by abiogenesis ‘off-planet’ is valid; however those challenges are not salient to the critical path and the incremental question at hand. It is a red herring at this point. With an ethical skeptic now understanding that abiogenesis involves a relatively highly stacked alternative versus panspermia, let’s examine the objective basis for such inference, in addition to this subjective ‘stacking of entities’ skepticism surrounding the comparison.

The Case for Off-Earth Codex Condensation

Did our DNA codex originate its structure and progressions first-and-only upon Earth, or was it inherited from another external mechanism? A first problem exists of course in maintaining the code once it exists. However, upon observation, a more pressing problem exists in establishing just how the code came into being in the first place. Evolved or pre-existed? ‘Pre-existed by what method of origination then?’ one who enforces an Omega Hypothesis may disdainfully pontificate. I do not have to possess an answer to that question in order to legitimize the status of this necessary alternative. To pretend an answer to that question would constitute entity stacking. To block this necessary alternative (pre-existing codex) however, based upon the rationale that it serves to imply something your club has embargoed or which you do not like personally – even if you have mild inductive support for abiogenesis – is a religion. Given that life most probably existed in the universe, and in our galaxy, already well before us – it would appear to me that panspermia, the Embargo Hypothesis, is the simplest explanation, and not abiogenesis. However, five sets of more objective inference serve to make this alternative a very strong one, arguably deductive in nature, versus abiogenesis’ relatively paltry battery of evidence.

A.  The earliest-use amino acids and critical functions hold the fewest coding slots, and are exclusively dependent upon only the three letter codon form. Conjecture is made that life first developed upon a 2 letter codon basis and then added a third over time. The problem with this is that our first forms of life use essentially the full array of the 3 letter dependent codex, to wit: Aspartate 2 (XXX), Lysine 2 (XXX), Asparagine 2 (XXX), Stop 3 (XXX), Methionine-Start 1 (XXX), Glutamine 2 (XXX). Glutamic Acid and Aspartic Acid, which synthesize in the absolute earliest forms of thermophiles in particular, would have had to fight for the same 2 digit code, GA – which would have precluded the emergence of even the earliest thermal vent forms of life under a 2 letter dependent codex (XX). These amino acids or codes were mandatory for the first life under any digit-size context – and should hold the most two digit slots accordingly – they do not. As well, in the case where multiple codons are assigned to a single amino acid, the multiple codons are usually related. Even the most remote members of archaea, thermophilic archaea, use not only a full 3 letter codon dependent codex, but as well use proteins which reach well into both the adenine position 2 variant (XAX) and thymidine position 2 variant (XTX) groupings; ostensibly the most late-appearing sets of amino acids (see graphic in C. below and Table III at the end).5

It is interesting to also note that the three stop codons TAA-TAG-TGA all match into codex boxing with later-appearing/more complex amino acid molecules, as members of the adenine position 2 variant (XAX) group. They box with the more complex and ‘later appearing’ amino acids tyrosine and tryptophan. The stop codes needed to be a GG, CC, GC, or at the very least a CT/TC XX codon basis, in order to support an extended evolutionary period under a two letter codon basis (XX). This was not the situation, as you can see in Exhibit III at the end. This would suggest that, under a classic abiogenetic evolutionary construct, the stop codes appeared later-to-last. Life would have had to evolve up until thermophilic archaea without stop codes in their current form. Then suddenly, life would have had to adopt a new stop codon basis (and then never make another change again in 3.6 billion years), changing horses in midstream. This XX codon previous form of life should be observable in our paleo record. But it is not.

Moreover, the use of the two digit codex is regarded by much of genomics as a degeneracy, ‘third base degeneracy’, and not an artifact of evolution.6 Finally, the codon ATA should, in a certain number of instances, equate to a start code, since it would have an evolutionary two digit legacy – yet it is never used to encode methionine. This is incompatible with the idea that methionine once employed a two digit AT code. Likewise, tyrosine and the non-amino stop code would have been in conflict over the two digit code TA. In fact, overall there should exist a relationship between the arrival of an amino acid’s use in Earth life and the number of slots it occupies in the codex – and there is not, neither to the positive slope nor the negative.7
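
The slot-count and ATA claims above are straightforward to check against the standard table. Here is a minimal sketch (again using the public standard genetic code, with one-letter amino acid symbols) which tallies slots per amino acid and confirms the ATA assignment:

```python
# Tally codex slots (synonymous codons) per amino acid in the standard code,
# and check the ATA / methionine claim made in the text.
from itertools import product
from collections import Counter

BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): a for c, a in zip(product(BASES, repeat=3), AMINO)}

slots = Counter(CODON_TABLE.values())
for aa, n in slots.most_common():
    print(aa, n)  # e.g. L, S and R occupy 6 slots each; M and W only 1

print(CODON_TABLE["ATA"])  # 'I' (isoleucine) - ATA never encodes methionine
print(CODON_TABLE["ATG"])  # 'M' - the sole methionine/start codon
```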

One can broach the fact that protein reassignments were very possible and could explain away the apparent mid-stream introduction of an XXX codon dependency in all observable life. But then one must explain why it never ‘speciated’ again over 3.6 billion years, along with the apparent absence of XX codon life in the paleo record. This chasm in such a construct is critical and must be accommodated before abiogenesis can be fully developed as a hypothesis. In comparison, panspermia possesses no such critical obstacle (note, this is not a ‘gap’, as it relates to the critical path of the alternative and not merely circumstantial inductive inference).

Codon Radial Symmetry

B.  Evolution of the codex would necessarily have to occur from amino absences rather than positives. Of particular note is the secondary map function of the start and stop codons. Notice that the start of a DNA sentence begins with a specific protein (the polar charge molecule methionine-ATG). The end of a DNA sequence however, consists of no protein coding whatsoever (see TAA, TAG and TGA). In other words, the DNA sentence begins with the same note every time, and ends with protein silence. tRNA evolved a way to accommodate the need for a positive, through the employment of proline-CCA during the protein assembly process. This is how a musical score works – it starts with a note, say an A440 tuning one, and ends with the silence dictated by the conductor’s wand. This is deliberacy of an empty set, as opposed to the stochasticity of positive notes – another appearance of the start code methionine could have sufficed as a positive stop-and-start code instead, and such a positive stop mechanism would succeed much better inside an evolution context. Why would an absence evolve into a stop code for transfer RNA? It could not, as ‘absence’ contains too much noise. It occurs at points other than simply a stop condition. The problem exists in that there is no way for an organism to survive, adapt, cull or evolve based upon its use of an empty set (protein silence). Mistakes would be amplified under such an environment. Evolution depends intrinsically upon logical positives only (nucleotides, mutations, death) – not empty sets.

Codon versus Molecule Complexity 64 Slots

C.  The codex features triple-symmetrical assignment, linear robustness and ergodicity, along with a lack of both evolution and deconstructive chaos. This is NOT the same set of conditions as exists inside evolution, even though it may appear as such to a layman. This set of codex assignments features six principal challenges (C., and C. 1, 2a, 2b, 3, and 4 below). Specifically:

• radial assignment symmetry (B. Codon Radial Symmetry chart above),
• thymidine and adenine (XTX, XAX) second base preference for specific chemistries,
• synchrony with molecule complexity (C. Codon versus Molecule Complexity 64 Slots graphic to the right),
• block symmetry (C. Codon Second Base Block Symmetry table below) around the second digit (base),
• ergodicity, despite a lack of chemical feedback proximity or ability for a codon base to attract a specific molecule chemical profile or moiety, and
• a lack of precedent from which to leverage.

These oddities could not ‘evolve’, as they have no basis to evolve from. The structure and assignment logic of the codex itself precludes the viability of a two base XX codex. Evolution, by definition, is a precedent entity progressing gradually (or sporadically) into another new entity. It thrives upon deconstructive chaos and culling to produce speciation. There was no precedent entity in the case of the DNA stop codon nor its XXX codex. As well, at any time the stop codons could have been adopted under the umbrella of a valid protein, rendering two SuperKingdoms of life extant on Earth – and that should have happened. It should have happened many times over, and at any time in our early history (in Archaea). Yet it did not.8 An asteroid strike and extinction event would not serve to explain the linearity. Evolution is not linear. We should have a number of DNA stop and start based variants of life available (just as we have with evolution based mechanisms) to examine. But we do not. In fact, as you can see in the chart to the right (derived from Exhibit III at the end), there exist four challenges to a purely abiogenetic classic evolution construct:

1. An original symmetry to the assignment of codon hierarchies (codex), such that each quadrant of the assignment chart of 64 slots mirrors the opposing quadrant in an ordinal discipline (see Codon Radial Symmetry charts in B. above to the right).

Codon Second Base Block Symmetry

2. The second character in the codon dictates (see the Codon Radial Symmetry chart in B. above) what was possible with the third character. In other words,

a. all the thymidine position 2 variants (XTX) had only nitrite (NO2) suffixed molecules assigned to them (marked in blue in the C. Codon versus Molecule Complexity 64 Slots chart to the upper right, and in Exhibit III at the end, from which that graph is derived), while the more complex nitrous amino acids were all assigned to more complex oversteps in codex groups (denoted by the # Oversteps line in the Codon versus Molecule Complexity 64 Slots chart to the upper right).

In addition,

b. all adenine position 2 variants (XAX) were designated for multi-use 3rd character codons, all cytidine position 2 variants (XCX) were designated for single use 3rd character codons, while guanine (XGX) and thymidine (XTX) both were split 50/50 and in the same symmetrical patterning (see Codon Second Base Block Symmetry table to the right).

3.  There exists a solid relationship, methodical in its application, between amino acid molecule complexity and assignment grouping by the second digit of the DNA codon. Second letter codon usages were apportioned to amino acids as they became more and more complex, until T and A had to be used because the naming convention was being exceeded (see chart in C., Codon versus Molecule Complexity 64 Slots above to the right, as well as Exhibit III at the end). After this was completed, one more use of G was required to add 6 slots for arginine, and then the table of 60 amino acid slots was appended by one more, tryptophan (the most complex of all the amino acid molecules – top right of chart in C. Codon versus Molecule Complexity 64 Slots above, or the end slots in Exhibit III at the end), and the 3 stop codes thereafter. Very simple. Very methodical. Much akin to an IT developer’s subroutine log – which matures over the course of discovery inside its master application.

4.  Ergodicity. Prejudice… order, featuring radial symmetry (B. Codon Radial Symmetry chart), synchrony with molecule complexity (C. Codon versus Molecule Complexity 64 Slots graphic) and block symmetry (C. last image, Codon Second Base Block Symmetry table) around the second digit. The problem is that there is no way a natural process could detect the name/features of the base/molecule/sequence as a means to implant such order and symmetry into the codex, three times – much less evolve it, in such short order and without speciation, in the first place (a verification sketch of this block symmetry follows below).
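
The second-base block patterning asserted in 2a/2b and 4 above can be verified directly from the standard table. A minimal sketch (standard genetic code again; ‘single’ means the third letter is irrelevant inside a box, ‘split’ means it is not):

```python
# Group the 64 slots into 16 'boxes' by their first two bases, and report
# whether the third letter matters inside each box - the block symmetry
# around the second base discussed in the text.
from itertools import product

BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): a for c, a in zip(product(BASES, repeat=3), AMINO)}

for b2 in BASES:  # the second base defines the block
    kinds = []
    for b1 in BASES:
        assigned = {CODON_TABLE[b1 + b2 + b3] for b3 in BASES}
        kinds.append("single" if len(assigned) == 1 else "split")
    print(f"X{b2}X: {kinds}")

# Output: XCX boxes are all 'single', XAX boxes are all 'split', while XTX
# and XGX each split 50/50 - the patterning described in 2b above.
```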

Since the DNA chemistry itself is separated by two chemical critical path interventions, how would the chemistry of thymine, for instance (the blue block in Exhibit III below), exclusively attract the nitric acid isomer of each amino acid? And why only the nitric acid isomers with more complex molecule bases? First, the DNA base 2 is nowhere physically near the chemical in question; it is only a LOGICAL association, not a chemical one, so it cannot contain a feedback or association loop. Second, there is no difference chemically between C2H5NO2 and C5H11NO2 – the NO2 is the active moiety. So there should not have been a synchrony progression (C.3. above), even if there were a direct chemical contact between the amino acid and the second base of the codon. So the patterns happen as a result of name only. One would have to know the name of the codon by its second digit (base), or the chemical formula for the amino acid, and employ that higher knowledge to make these assignments.

Finally, this order/symmetry has not changed since the code was first ‘introduced’, and certainly has not been the product of stochastic arrival – as a sufficiently functional-but-less-orderly code would have evolved many times over (as is the practice of evolution) and been struck into an alternative codex well before (by several billion years) this beautiful symmetry could ever be attained.

We claim that evolution served to produce the codex, yet the codex bears the absolute signs of having had no evolution in its structure. We cannot selectively apply evolution to the codex – it must either feature evolutionary earmarks, or not be an evolved code. The mechanisms of evolution cannot become a special pleading football, applied only when we need it to enforce conformance – because in that case we will only ever find out what we already know. It becomes no better an argument philosophically than ‘God did it’.

D.  Related codons represent related amino acids. For example, a mutation of CTT to ATT (see table in C. above) results in a relatively benign replacement of leucine with isoleucine. So the selection of the CT and AT prefixes between leucine and isoleucine was done early, deliberately and in finality – based upon a rational constraint set (in the example case, two nitrite-suffixed proteins) and not eons of trial and error.9 Since the assignment of proteins below is not partitioned based upon any physical characteristic of the involved molecule, there is no mechanism but deliberacy which could dictate a correspondence between codon relationships and amino acid relationships.10
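
The example substitution above can be traced in a few lines against the standard table – a minimal sketch, assuming nothing beyond the public genetic code:

```python
# Trace the CTT -> ATT point mutation discussed above: a single first-base
# change swaps leucine for the chemically related isoleucine.
from itertools import product

BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): a for c, a in zip(product(BASES, repeat=3), AMINO)}

NAMES = {"L": "leucine", "I": "isoleucine"}

before, after = "CTT", "ATT"  # one nucleotide changed, in the first position
print(before, "->", NAMES[CODON_TABLE[before]])  # CTT -> leucine
print(after, "->", NAMES[CODON_TABLE[after]])    # ATT -> isoleucine
```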

E.  A statistically impossible codex, not just an improbable one. Finally, it is not simply the elegant symmetry of the codex which is perplexing; the usage contexts identified in items A. – D. above allow one to infer (deductively?) that the codex – its precedent, provenance and structure – is difficult to impossible to accommodate inside even the most contorted construct of abiogenesis. Observe the A-G vs C-T tandem relationship between Lysine and Asparagine for instance. This elegant pattern of discipline repeats through the entire codex. This is the question asked by Eugene V. Koonin and Artem S. Novozhilov at the National Center for Biotechnology Information in Bethesda, Maryland in their study Origin and Evolution of the Genetic Code: The Universal Enigma (see graphic to the right, extracted from that study).11 This serious challenge is near to falsifying in nature, and cannot be dismissed by simple hand waving. Take some time (weeks or months, not just seconds) to examine the DNA codex in Exhibits I thru III below, the three tables and charts in B. and C. above, as well as the study from which the graphic to the right is extracted, and see if you do not agree. This argument does not suffer the vulnerability of ‘creationist’ arguments, so don’t play that memorized card – as Ockham’s Razor has been surpassed for this necessary alternative.

The hypothesis that the codex for DNA originated elsewhere bears specificity, definition and testable mechanism.
It bears less stacking, gapping and risk as compared to abiogenesis.
It is science. It is the necessary alternative.

Assuming that life just won the 1 in 1.6 x 10 to the 15th power lottery (the Omega Hypothesis), again, and quickly, near to the first time it even bought a lottery ticket – this has become the old-hat fallback explanation inside evolution. One which taxes a skeptic’s tolerance for explanations which begin to sound a lot like pseudo-theory (one idea used to explain every problem at first broach). However this paradox – what we observe to be the nature of the codex, and its incompatibility with abiogenesis – involves an impossibly highly stacked assumption set to attempt to explain away based solely upon an appeal to plenitude error. A fallback similar in employment to the creationist’s appeal to God. The chance that the codex evolved by culled stochastics alone in the first 300 million years of Earth’s existence is remote, from both the perspective of the time involved (it happened very quickly) and statistical unlikelihood – but as well from the perspective that the codex is also ‘an optimal of optimates’ – in other words, it is not only functional, but smart too. Several million merely functional-but-uglier code variants would have sufficed to do the job for evolution. So this raises the question: why did we get a smart/organized codex (see Table II below and the radial codon graphic in B. above) and not simply a sufficiently functional but chaotic one (which would also be unlikely itself)? Many social skeptics wishing to enforce a nihilistic religious view of life miss that we are stacking deliberacy on top of a remote, infinitesimally small chance happenstance. Their habit is to ignore such risk chains, and then point their finger at creationists as being ‘irrational’ as a distraction.

The codex exhibits an unprecedentable, static, patterned format – employing the empty set as a positive logical entity – exposed to the deconstructive vulnerability which besets everything else, but which chose not to happen in this one instance. In other words, evolution actually chose NOT to happen in the DNA codex – i.e. deliberacy. The codex, and quod erat demonstrandum the code itself, came from elsewhere. I do not have to explain the elsewhere, but merely provide the basis for understanding that the code did not originate here. That is the only question on our scientific plate at the moment.

Exhibits I, II and III

Exhibit I below shows the compressed 64 slot utilization and its efficiency and symmetry. 61 slots are coded for proteins and three are not – they are used as stop codes (highlighted in yellow). There are eight synonym based code groups (proteins), and twelve non-synonym code groups (proteins). Note that at any given time, small evolutionary fluctuations overlapping into the stop codes would render the code useless as the basis for life. So the code had to be frozen from the very start, or never work at all. Either that, or assign tryptophan as a dedicated stop code and mirror to methionine, and make the 5th code group a synonym for Glutamine or Aspartic Acid.

Exhibit I – Tandem Symmetry Table

Exhibit II below expands upon the breakout of this symmetry by coded protein, chemical characteristics and secondary mapping if applicable.

Exhibit II – Expanded Tandem Symmetry and Mapping

Finally, below we relate the 64 codon slot assignments, along with the coded amino acid, the complexity of that molecule, and its use in thermophilic archaea. Here it is clear that even our first forms of life, constrained to environments in which they had the greatest possibility of even occurring, employed the full XXX codex (three letter codon). While it is reasonable to propose alternative conjecture (and indeed plurality exists here) that the chart suggests a two letter basis as the original codex, life’s critical dependency upon the full codon here is very apparent.

Exhibit III – 61 + 3 Stop Codon Assignment versus Molecule Complexity and
Use in Thermophilic Archaea (see chart in C. above)

Recognizing the legitimacy of the necessary alternative – one which was both purposely neglected by science, and yet offers the only viable alternative to standing and celebrated club dogma – this is a process of real science. Developing mechanism for such an alternative is the antithesis of religious activity. Maturing the necessary alternative into hypothesis, is the heart and soul of science.

Blocking such activity is the charter and role of social skepticism. Holding such obfuscating agency accountable is the function of ethical skepticism.

epoché vanguards gnosis

——————————————————————————————

How to MLA cite this blog post =>

The Ethical Skeptic, “Embargo of The Necessary Alternative is Not Science” The Ethical Skeptic, WordPress, 24 Nov 2018; Web, https://wp.me/p17q0e-8Ob

November 24, 2018 Posted by | Argument Fallacies | 11 Comments

The Critical Role of Sponsors in the Scientific Method

In true research, the diligent investigator is continually bombarded by huge amounts of data, in the form of facts, observations, measures, voids, paradoxes, associations, and so on. To be able to make use of this data, a researcher typically reduces it to more manageable proportions. This does not mean we need necessarily tender a claim about that data. Instead, science mandates that we apply the principles of both linear and asymmetric Intelligence as part of the early scientific method. Our goal in Sponsorship is not to force an argument or proof, rather to establish a reductionist description through which the broader observational set may be reproduced or explained, as possible. This reductionist description is called a construct. Constructs are developed by laymen, field researchers, analysts, philosophers and witnesses, as well as lay experts and scientists alike. The science which ignores this process, is not science.

It is the job of the Sponsor in the scientific method to perform these data collection, linear and asymmetric Intelligence, and reductionist development steps. In the absence of robust regard for Sponsorship, science is blinded; moreover, the familiar putrefied rot of false skepticism takes root and rules the day of ignorance.

Sponsor

(scientific methodology) an individual or organization gathering the resources necessary and petitioning for plurality of argument under Ockham’s Razor and the scientific method.

When we speak of ethics at The Ethical Skeptic, we speak less of features of personal moral character, and more of the broader application context: a professional allegiance and adherence to a clear and valuable series of deontological protocols which produce results under a given knowledge development process. In other words, fealty to the scientific method, above specific conclusions. Ethically I defer; I surrender my religions, predispositions and dogma to the outcome of the full and competently developed knowledge set. This is ethics. It really has nothing to do with morality and everything to do with the character of curiosity. I do not have the universe figured out, and I would sincerely like to know some things.

Type I Sponsor: Lay Science

A key component of this ethical process is the portion of science which involves Sponsorship. Don’t be dissuaded by the title. A sponsor is a very familiar participant in the protocols of science. Sky watching lay astronomers for example are a vital part of the scientific method, depicted in the chart to the right under Type I Sponsors. These lay researchers perform key roles in the monitoring, collecting and documenting of celestial events and bodies. Many new comets are named after the actual layman who spotted them and provided enough information for science then to further prove the case at hand. The Sponsor in astronomy does not prove the hypothesis per se, rather simply establishes the case for a construct: the proposed incremental addition of celestial complexity beyond the reasonableness of parsimony (see: Ethical Skepticism – Part 5). Science then further tests, reviews and proves the lay astronomer’s sponsored construct by means of a hypothesis. The layman in astronomy in essence ‘gathers the resources necessary and petitions for plurality of argument (a new celestial moving body) under Ockham’s Razor and the scientific method.’ It is this Type I Sponsor realm inside of which Big Data will unveil its most remarkable revolution.

“In everyday life we are continually bombarded by huge amounts of data, in the form of images, sounds, and so on. To be able to make use of this data we must reduce it to more manageable proportions.”¹  This does not mean we need to make a claim about that data. It means we need to apply the principles of asymmetric intelligence.  Our goal in [Sponsorship] is not to make a claim necessarily, rather to “establish a reductionist description through which the observational set may be reproduced.”¹ – Stephen Wolfram, A New Kind of Science

Asymmetric Challenges Demand Asymmetric Intelligence Approaches

Sponsorship in the Scientific Method

In similar fashion, Sponsors function as a critical contributor group to science under a number of more complex, and less linear, fields of study than astronomy. A woodsman who has hunted, fished and lived in the Three Sisters region of Oregon can stand as both an expert in terms of resource and recitation, and moreover can become a sponsor of an idea regarding the domain in which they have spent their entire life conducting data collection. Wise local university researchers will meet locals on Cascade Ave. in order to collect observations on some of the region’s geologic history. The lay researcher him or herself can perform citizen science as well, yes. But more importantly he or she might aid science by developing an idea which has never been seriously considered before. Perhaps they have observed changes in total Whychus Head spring water volume flow prior to magnitude 3.0 and above earthquakes. Perhaps they want to formalize these observations and ask a local university to take a look at their impression (construct). This is Sponsorship: establishing a case that science should address their own version of a comet, in the natural domain which they survey.

The role of the Sponsor is not to prove a particular case, rather to surpass Ockham’s Razor in petitioning science to develop and examine a set of hypotheses

Indeed, problems even more complex, such as the studies of patterns and habits of wildlife, rarely advance via the handiwork of one organization or individual. The task is simply too daunting. Citizens provide critical inputs as to the habits of the Red Wolf or Grizzly Bear.² In similar fashion, medical maladies, and successful means of addressing them, are many times asymmetric in their challenge, involving a many-faceted series of contribution and mitigation elements. The role of the lay researcher – moms and dads with respect to their children – is critical. To ignore this lay resource under the guise of fake skepticism and ‘anecdote’ is not only professionally unwise (unethical), but cruel as well (immoral). These stand as examples of the asymmetric challenge entailed in the majority of scientific knowledge processes we face as a society. It is this preponderance of asymmetric challenge therefore which promotes the Sponsor into the necessary roles of both inventor and discoverer, and not simply the role of science clerk.

Type II:  Tinkerer Sponsors As Lay Scientist

The second category of Sponsorship involves the work of lay tinkerers and garage inventors (Type II Sponsor in the graphic above). Arguments vary as to the magnitude of impact of this class of researcher, but no one can dispute the relatively large impact that this class of Sponsor has had on various industry verticals. A key example might be 17-year-old layman Michael Callahan, who lost his voice in a skateboarding accident, and subsequently developed a device called Audeo, to aid the plight of those who have suffered a similar loss of vocal function (see: Top Ten Inventions). But lay science does not have to bear simply the consequentialist result of a technological device (Type II), or simply observation inside a well established domain of science (Type I). A sponsor can perform the role of discovery as well.

Type III:  Sponsor As Discoverer

A more powerful and controversial role of the Sponsor – and a role which The Ethical Skeptic believes stands as the Achilles’ heel of science today – is the role of the discoverer, or Type III Sponsor, under the scientific method. This person performs both the inception and broad case petition roles for plurality under the scientific method. In my labs historically, we have had two significant discoveries which were sponsored by outside parties who brought their petition for hypothesis development to my labs, for both validation/application testing and funding. Were I a fake skeptic, this would have never happened. One was a method of changing a clinical compound, and another a groundbreaking approach for material development. Each was not a technological development, rather a scientific breakthrough that would ultimately change technology later. These were discoveries, not inventions. This agent of science, the discoverer, exercises the Bacon-esque Novum Organum which resides at the heart of discovery science. This is the aspect of science which Social Skeptics and their crony oligarchs perform desperate gymnastics in order to deny and squelch. Pharmaceutical companies and competing labs/organizations fought us hard to deny or steal the development of technologies surrounding these discoveries. For the most part they failed, but they caused damage – damage to society and all of us ultimately. They wanted to control and overprice the technology application and deployment.

The freedom to discover wrests control from the hands of Social Skeptic cronies, and places it into those of ethical small enterprise and mercy based organizations.

Should their cronies in Social Skepticism gain control of science and government fully, then the Sponsor and lay researcher will become an endangered species. A compliant herd, caged in an oligopoly cubicle zoo, milked of their intellectual potential, mulling the shallow, instant-grits, Social Skepticism literature upon which they graze.  SSkeptics are professionals at socially mandating that something not exist. They are a mafia after all; if they cannot kill you or your message, then they will ensure that no intellectual trace of either exists.  Such constitutes the bright and wonderful promised future of Social Skepticism.

Nonetheless this less touted and critical part of the scientific method, the discovery contribution of the lay researcher, has contributed vastly more to our understanding of life, health and our realm than oppressive SSkepticism will ever allow to be admitted into history. The Stanford Encyclopedia of Philosophy opines thusly about the data collection, reduction, intelligence and discovery process as outlined by lay scientist Francis Bacon (Type III Lay Scientist):

Bacon’s account of his “new method” as it is presented in the Novum Organum is a prominent example. Bacon’s work showed how best to arrive at knowledge about “form natures” (the most general properties of matter) via a systematic investigation of phenomenal natures. Bacon described how first to collect and organize natural phenomena and experimental facts in tables, how to evaluate these lists, and how to refine the initial results with the help of further experiments. Through these steps, the investigator would arrive at conclusions about the “form nature” that produces particular phenomenal natures. The point is that for Bacon, the procedures of constructing and evaluating tables and conducting experiments according to the Novum Organum leads to secure knowledge. The procedures thus have “probative force”.†

Indeed, it is the lay researcher who may well possess the only domain access under which to make such “form natures” observations which can be crafted into “probative force,” or that of a testable Construct or Hypothesis. To ignore these inputs – to ignore the input of 10,000 lay observers who inhabit a particular domain, even in the presence of uncertainty and possible chicanery – is professionally unwise (unethical). To the Ethical Skeptic, it is unwise to set up conferences which function only in the role of teaching people how to attack these researchers, even if the conclusions on subjects these conferences regard as bunk are 95% correct in their sponsors’ assessments.

Fake Skeptics, those who have a religion and an ontology to protect, bristle at the work and deontological impact of Type III lay researchers. Francis Bacon was a lay philosopher and researcher in his own right, and accordingly bore his own cadre of detractors and ‘skeptics.’ Yet his work had a most profound impact on modern discovery science thought. It is this type of researcher which challenges and changes the landscape of science (see: Discovery versus Developmental Science). It is in this domain where we first encounter the Ethical Skeptic’s paradoxical adage “Experts who are not scientists and scientists who are not experts.” It is this social and methodical challenge which we as a body of knowledge developers must overcome, in order for discovery to proceed. It is our duty to resolve this paradox and move forward, not habitually attack those involved.

Social Skepticism:  Exploiting the Paradox for Ignorance

This absolutely essential element of ethics under the scientific method, therefore, is the process of Sponsorship. The Craft of Research, a common guide recommended by advisers in candidate dissertation prosecution, relates that “Everything we’ve said about research reflects our belief that it is a profoundly social activity.”³ That simply means that the majority of science, research and the knowledge development process resides outside the laboratory, in an asymmetric and highly dynamic realm. Given this abject complexity of playing field, it becomes manifest that it is our fealty to process which effectively distinguishes us from the pretender, and not how correct we are on bunk subjects. The role of the Ethical Skeptic is to defend the integrity of this knowledge development process. Which brings up the circumstance where, what if various persons and groups do not desire knowledge to improve? What then is the role of the Ethical Skeptic?

If you demand ‘bring me proof’ before you would ask ‘bring me enough intelligence to form a question or hypothesis’ – then I question your purported knowledge of science. ~TES

Part of our job as well at The Ethical Skeptic is to elucidate – to shed light on circumstances wherein pretenders circumvent and abrogate this ethical process of science. Instances where false vigilante skeptics use the chaos of research against the knowledge development process itself. Unethical actions which target the elimination of Type III research and the defamation/intimidation of both Type III lay and scientist researchers. Below are listed some of the tactics employed on a deleterious path of willfully and purposely vitiating the Sponsorship (in particular Type III Sponsorship) steps of the scientific method:

Tactics/Mistakes Which Social Skeptics Employ to Vitiate the Sponsorship Portion of the Scientific Method
1.  Thinking that the craft of research and science solely hinge around the principle of making final argument.³
2.  Promoting an observation to status as a claim.
3.  Thinking that a MiHoDeAL or Apophenia claim to authority can be issued without supporting evidence.
4.  Routinely accepting at face value associative, predictive or statistical proofs while eschewing and denying falsifying observations.
5.  Lack of acknowledging the full set of explanatory possibilities under the sponsorship Peer Input step.
6.  Presuming that a Sponsor is only pursuing one alternative explanation during Peer Input.
7.  The inability of SSkeptics to recognize true experts in other than academic/oligarch contexts.
8.  The errant habit of false skeptics in citing and deferring to non-experts.
9.  Vigilante thinking in mistakenly believing that the role of skepticism is to ‘evaluate claims’ and teach Sponsors about critical thinking which squelches Sponsorship in the first place.
10.  Amateur error in applying pretend Peer Review tactics in the Sponsorship stage of science.
11.  Failure to demonstrate circumspection to own contradictions or weaknesses in own Peer Input argument.
12.  Failure to define/address significance versus insignificance by observational context.
13.  Fake Skepticism: Asking “Here is what we need to prove this” rather than “Here is what we need in order to develop a hypothesis.”
14.  Committing the Vigilante Mistake: Killing just as many innocent Sponsors as one does Bad Science Sponsors.


¹  a. Stephen Wolfram, A New Kind of Science, Wolfram Media, Inc. Champaign, IL; p. 548.

¹  b. Stephen Wolfram, A New Kind of Science, Wolfram Media, Inc. Champaign, IL; pp. 557-576.

²  Defenders of Wildlife, Red Wolf Fact Sheet; professional and layperson wildlife advocates, (http://www.defenders.org/red-wolf/basic-facts)

³  a. Wayne C. Booth, Gregory G. Colomb, and Joseph M. Williams, The Craft of Research, Third Edition; The University of Chicago Press, Chicago, IL; p 285.

³  b. Wayne C. Booth, Gregory G. Colomb, and Joseph M. Williams, The Craft of Research, Third Edition; The University of Chicago Press, Chicago, IL; pp 114-123.

† The Stanford Encyclopedia of Philosophy, Scientific Discovery, March 6 2014, No. 2 Scientific Inquiry as Discovery, (http://plato.stanford.edu/entries/scientific-discovery/)

January 25, 2015 Posted by | Agenda Propaganda, Argument Fallacies, Institutional Mandates, Tradecraft SSkepticism | Leave a comment

Ethical Skepticism – Part 5 – The Real Ockham’s Razor

The abuse or misemployment of Ockham’s Razor as an appeal to authority/ignorance is a key indicator as to a person’s lack of scientific literacy. Indeed, the actual role of Ockham’s Razor, the real scientific principle, is to begin the scientific method, not complete it in one fell swoop. It is the ethic and nature of science to prosecute incremental risk in conjecture – a simple explanation puts little at risk – this is why it appears to fail less often. This is, in part, an illusion.
Science is, the very task of introducing and resolving, incremental plurality. Rational thinking under Ockham’s Razor is the demonstrated ability to handle such plurality with integrity.

This is the simplicity sell of the pretend skeptic. Failing to understand Ockham’s Razor, in 1972 they crafted a mutated version called Occam’s Razor, rendering all the loose ends wrapped up as ‘finished science’ in one fell swoop of fatal logic – making the world easy to explain, facile for tender hearts from then on. Let’s take a more detailed look at both the scientific and virulent-error forms of Ockham’s Razor.

Fake Skepticism’s ‘Occam’s Razor’

We begin first with the infamous, itself simple and simultaneous appeal to both authority and ignorance: ‘Occam’s Razor’:

“All things being equal, the simplest explanation tends to be the correct one.”

This statement is the most often quoted variant of pop-skepticism’s Occam’s Razor. It is commonly employed as an appeal to authority/reverence (the authority/chain of reverence in question being the apothegm itself, the celebrity skeptics who repeat it over and over, Carl Sagan, and William of Ockham himself – although a straw man contention in that final context) – and it is not actually Ockham’s Razor. Employing this statement as a decision heuristic constitutes an Asch Conformity error, which is indeed a form of appeal to reverence/authority (see Argument from Authority/Cognitive Bias). This one-liner, popularized by Carl Sagan and in the movie Contact, is a sleight-of-hand expression taught by Social Skeptics and often called ‘Occam’s Razor.’ It is employed errantly as a twisted decision heuristic, abused to force a premature disposition on an idea, dismiss observations and data as if they were ‘claims’ and further squelch disdained topics which would otherwise be entertained for research by Ethical Skepticism. The weakness of the statement resides in the philosophical principle that the simplest answer is typically the one which falls in line with the pre-cooked assumptions – the stack of risky, yet nothing-put-at-risk, provisional knowledge we bring to the argument.

See my commentary on the deceptive role of ‘simple’ inside science here: When Simple is Just Simply Wrong.

Moreover, implicit within this statement reside the claims that all relevant knowledge is currently mastered by the one issuing disposition and that data/observations must immediately be ‘explained’ so that a disposition (read that as dismissal) can be issued a priori and anecdotally. These actions serve to obviate both the data aggregation and intelligence development steps of science; a fallacious sleight-of-hand employed to obfuscate and abrogate the application of the scientific method. This trick, the false claim to ‘You see it’s simple…,’ is a common huckster tactic, bearing little in common with true rationality and failing the Popper science demarcation principle. The Stanford Encyclopedia of Philosophy expounds on the weakness of such thinking in terms of Popperian Philosophy:

“In the view of many social scientists, the more probable a theory is, the better it is, and if we have to choose between two theories which are equally strong in terms of their explanatory power, and differ only in that one is probable and the other is improbable, then we should choose the former. Popper rejects this.”²

“You ever hear the expression ‘Simplest answer’s often the correct one’?”
“Actually, I’ve never found that to be true.”
– Gone Girl, 2014¹

It is the ethic and nature of science to prosecute incremental risk in conjecture – a simple explanation puts little (nothing, in reality) at risk – and this is why it appears to fail less often. This is, in part, an illusion. A very costly and uninformative illusion. Science is the very task of introducing and resolving incremental plurality risk. ~TES

Don’t let your integrity slip to the point where you catch yourself using these practices to deceive others, or employing ‘Occam’s Razor’ as a habitually reflexive martial arts response, deflecting information from entering your rational playing field. The formal and informal fallacies introduced via this errant philosophy are:

Transactional Occam’s Razor Fallacy (Appeal to Ignorance)

The false contention that a challenging construct, observation or paradigm must immediately be ‘explained.’ Sidestepping of the data aggregation, question development, intelligence and testing/replication steps of the scientific method and forcing a skip right to its artificially conclusive end (final peer review by ‘Occam’s Razor’).

Existential Occam’s Razor Fallacy (Appeal to Authority)

The false contention that the simplest or most probable explanation tends to be the scientifically correct one. It suffers from the weakness that myriad and complex underpinning assumptions – based upon scant predictive/suggestive study, provisional knowledge or Popper-insufficient science – result in the condition of tendering the mere appearance of ‘simplicity.’

Observational Occam’s Razor Fallacy (Exclusion Bias)

Through insisting that observations and data be explained immediately, and through rejecting such a datum based upon the idea that it introduces plurality (it is not simple), one effectively ensures that no data will ever be recognized which serves to frame and reduce a competing alternative. One will in effect perpetually prove only what one has assumed to be true, regardless of the idea’s inherent risk. No competing idea can ever be formulated, because outlier data is continuously discarded immediately, one datum at a time, by means of ‘simplicity’.

Utility Blindness

When simplicity or parsimony is incorrectly applied as an excuse to resist the development of a new scientific explanatory model, data set or challenging observation set, while the participant refuses to consider or examine the explanatory utility of any similar new model under consideration.

Finally, Occam’s Razor suffers from the fact that it conceals pseudoscience inside a surreptitious facility:

Facile

Appearing neat and comprehensive only by ignoring the true complexities of an issue; superficial. Easily earned, arrived at or won – derived without the requisite rigor or effort.

Perhaps a more valid expression describing this principle is bound up in the popular equivocal version of Ockham’s Razor:

‘Entities should not be multiplied unnecessarily.’

With this aphorism, we begin to encroach upon the valid principles which underpin the real Ockham’s Razor (below). However, be careful with this large equivocal footprint version of Ockham’s Razor. Socially, this statement leaves open (and has often been abused in this way) the notion that categories should not be added to social groups, or that ideas should not be brought to the table for consideration, observation, intelligence or sponsorship in social or scientific discourse – for no other reason than disdain. All under an unskilled, apothegm-based science of ‘I don’t want to consider this,’ coded inside the abuse of a twisted form of philosophy. This is the exclusive-conclusive abuse of ‘Occam’s Razor’ which is popular among those bearing an oppressive social mindset and seeking to have science appear to back their politics, religion and personal hatreds. So while this second variant is indeed better than its ‘simplest explanation’ cousin, its amphibology potential still affords malicious mindsets an open door to apply it in error. It can still be used to stand tantamount to, and in substitution of, a material argument. This is philosophy being used to supplant science. This is pseudoscience.

The actual principle does indeed involve a discretion of ‘entities’; however, the actual role of Ockham’s Razor, the real scientific principle, is to begin the scientific method by means of managing entities, not complete it in one fell swoop of denial (Occam’s Razor).

THE REAL OCKHAM’S/OCCHAM’S RAZOR

“Pluralitas non est ponenda sine necessitate” or “Plurality should not be posited without necessity”

Summa Totius Logicae, William of Ockham (frater Occham)

The words are those of the medieval English philosopher and Franciscan monk William of Ockham (ca. 1287-1347).³ I use ‘Ockham’s Razor’ because that is what most philosophers use, and it is the choice of the Stanford Encyclopedia of Philosophy (SEP: Ockham’s Razor).³ It is not that the use of ‘Occam’ is unacceptable; however, if you do not make clear which version you are referring to, your recitation could stand as a warning flag highlighting a lack of scientific literacy, especially regarding issues of parsimony and explanatory research.

Please note that the pre-anglicized name of the village from which Ockham hailed was Bocheham,³ and it was called Ockham even before William’s life. It was never at any time called ‘Occam’ nor ‘Occamus.’ Although its use is accepted, fewer serious scientific publications use the term ‘Occam’s Razor;’ and when they do, they mean the latter parsimony context presented here, and not the former ‘simplest explanation’ meaning above. Further, when he referred to himself in the Latin, he used the term “frater Occham,” and not ‘Occamus’ as some people claim in order to defend the widespread use in error (see William of Ockham’s sketch “frater Occham iste”, from a manuscript of Ockham’s Summa Logicae, 1341). This was not a case of pen-name selection, but rather an attempt at a Latin transcription on William’s part. Therefore, this is not a case of petition for an official name, as might be warranted in the instance of an author choosing a pen-name.

Latin lexicons bore no precedent for an ‘ock’ based expression root, so the pseudo-Latin version of Ockham chosen by William was Occham, and not ‘Occam’ nor ‘Occamus.’ Accordingly, William of Ockham employed ‘Occham’ in his translated work (which was the de rigueur of the day), and his successive disciples employed the name Occhami in their Latin publications† – which was further cited, in error of transcription, as the much later French Renaissance variant “de Occam.”† So, as we step back into the proper context of original usage (case example: Neanderthal, d. 1856, stands as proper over the later French-to-German valley name change to ‘Neandertal,’ d. 1904), science does not therefore change the pronunciation of the original entity merely because the village after which it was named changed theirs at a later time. In similar fashion as regards modern English employment, Ockham is the correct modern transcription of the pseudo-Latin Occham and Occhami (these names did not actually exist in Latin – it would be like my signing my blogs ‘Etticchus Parsimonae’ as a requirement to get them published; neither name nor term actually exists in Latin – rather, it would be a technically imprecise expression of pretense on my part). If, however, one must insist on the pen-name approach, then one should honor the author’s clearly expressed choice, and Occham’s Razor would be the only appropriate variant to employ.

Of key relevance however, this apothegm is a qualifying heuristic (neither exclusive nor inclusive nor conclusive, i.e. not a decision heuristic), which simply cites that until we have enough evidence to compel us, science should not invest its resources into immaturely novel, discontinuous, unnecessarily feature-laden, risk-element-stacked or agenda-driven theories. This renders Ockham’s Razor more a discipline of economy, and not in the least the decision heuristic it is sold as by social skeptics. Not because the ideas which it may screen are false or terminally irrelevant; rather, existentially they are unnecessary in the current incremental discourse of science. They are not yet relevant. Observation, intelligence and sponsorship are the steps in the scientific method which can serve to introduce necessity into the equation of science.
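As an illustration only – a minimal sketch in Python, with hypothetical names and an arbitrary evidence threshold of my own choosing – the razor, properly framed, acts as a gate on which constructs earn entry into active research, never as a verdict on which construct is true:

```python
# A toy model of Ockham's Razor as a *qualifying* heuristic, not a decision
# heuristic. All names and thresholds here are hypothetical illustrations.

constructs_under_study = ["standing explanation"]

def plurality_is_necessary(vetted_observations: int, threshold: int = 30) -> bool:
    """Plurality should not be posited without necessity: a competing
    construct earns study only once enough vetted observations accumulate."""
    return vetted_observations >= threshold

def sponsor(construct: str, vetted_observations: int) -> None:
    # The razor screens entry into scientific discourse; it issues no
    # verdict of truth - and no verdict of falsity either.
    if plurality_is_necessary(vetted_observations):
        constructs_under_study.append(construct)  # admitted for research
    # else: the construct is not 'false' - merely not yet necessary

sponsor("competing construct", vetted_observations=42)
print(constructs_under_study)
# ['standing explanation', 'competing construct']
```

Note that nothing in this sketch ever deletes an idea; a construct which fails the threshold simply waits for observation, intelligence and sponsorship to accumulate its necessity.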

A suspension of relevancy is in no way tantamount to a material argument, nor may it boast an existential call to arms for the assembling clubs of rationality and critical thought – cabals which simply serve to entertain and encourage malevolent minds and instruct the credulous inside habits of denial. This is the pseudoscience of the fake skeptic. It is unfaithfulness to science, elucidating the stark scientific illiteracy afoul in the heart of those who practice such dishonesty.

When this is done as a matter of convenience in order to make the pseudo-principle go viral, or push selected answers, this is called being advantageously obtuse.

Advantageously Obtuse (Bridgman Reduction)

/philosophy : pseudo-philosophy/ : a principle which has been translated, reduced or dumbed-down for consumption so as to appear to be a ‘simple’ version of its source principle; however, which has been compromised through such a process. Thereby making it easy to communicate among the vulnerable who fail to grasp its critical elements, and moreover to serve as an apothegm useful in enforcing specific desired conclusions. Statements such as ‘the burden of proof lies on the claimant’ or ‘the simplest explanation tends to be correct’ – stand as twisted, viral forms of their parent principles, which contend ironically, critically or completely different standards of thought.

However, it is the latter half of this definition of Ockham’s Razor which is rendered advantageously obtuse by those in the social skepticism movement. One critical element of Ockham’s Razor most importantly also establishes that, once there exists a sufficient threshold of evidence to warrant attention, then science should seek to address the veracity of an outside claim, or multiple explanatory approaches, or more complex versions of standing theory. This condition is called plurality. Plurality is a condition of science which is established by observations, intelligence and sponsorship – not by questions, peer review or claims. To block the aggregation and intelligence of this observational data, or to attempt to filter it so that all data are essentially relegated as fiat anecdote, is pseudoscience. It is fraud, and is the chief practice of those in the Social Skepticism movement today. The claim of “Prove it” – the Proof Gaming Formal Fallacy – embodies this fundamental misunderstanding of Ockham’s Razor on the part of those who have not pursued a rigorous philosophical core inside their education.

The abuse or mis-employment of Ockham’s Razor is a key indicator as to a person’s lack of scientific literacy.

This statement, and in particular Ockham’s Razor’s employment of the term ‘plurality,’ is more commonly recognized in research science as the principle of parsimony:

Parsimony – the resistance to expand explanatory plurality or descriptive complexity beyond what is absolutely necessary, combined with the wisdom to know when to do so. The avoidance of unnecessary orphan questions, even if apparently incremental in the offing.

To understand the role of Ockham’s Razor parsimony inside the concepts of elegance and design, see The Nature of Elegance. And of course we would be remiss without defining the axiomatic principle inside parsimony, which is the defining essence of Ockham’s Razor:

Plurality (plural of entities)

/philosophy : scientific method : construct and theory discipline/ : adding entities or complexity to an argument. Introducing for active consideration, more than one idea, construct or theory attempting to explain a set of data, information or intelligence. Also, the stacking of features or special pleading to an existing explanation, in order to adapt it to emerging data, information or intelligence – or in an attempt to preserve the explanation from being eliminated through falsification.

A related form of parsimony is a principle called Corber’s Burden. It states that the burden of proof falls even upon one who is claiming falseness – falseness being a claim just the same as a primary affirmative contention. This applies as well to the condition where a ‘skeptic’ implies falseness by a variety of means. Not only this, but in a broader sense, when one makes multiple claims, or contends to have identified the core domain of falseness (pseudoscience), then that claimant bears the ultimate burden of proof. This is a form of surreptitious plurality error embodied inside Corber’s Burden.

Corber’s Burden

When one tenders an authoritative claim as to what is incorrect – one must be perfectly correct.

/philosophy : argument : burden of proof/ The mantle of ethics undertaken when one claims the role of representing conclusive scientific truth, ascertained by means other than science, such as ‘rational thinking,’ ‘critical thinking,’ ‘common sense,’ or skeptical doubt. An authoritative claim or implication as to possessing knowledge of a plural set of that which is incorrect. The nature of such a claim to authority on one’s part demands that the skeptic who assumes such a role be 100% correct.

Many subjects reside inside this arena of doubt, wherein a claim to falseness is under the same burden of scrutiny as is the claim to verity. This threshold of plurality and, in contrast, the ‘proof’ of an idea are not the same standard of data, testing and evidence. Muddying the two contexts is a common practice of deception on the part of SSkeptics. Proof is established by science; plurality is established by sponsors. SSkeptics regard Ockham’s Razor as a threat to their religion, and instead quote the former substitute above, which while sounding similar and ‘sciencey’, does not mean the same thing at all. It is an imposter principle which rather seeks to blur the lines around, and prevent competing ideas from attaining, this threshold of plurality and attention under the scientific method. Their agenda is to prohibit ideas from attaining this threshold at ANY cost. This effort to prohibit an idea its day in the court of science constitutes, in itself, pseudoscience.

This method of pseudoscience is called the DRiP Method.

Misuse of “Occam’s” Razor to Effect Knowledge Filtering

One of the principal techniques, if not the primary technique, of the practitioners of thought control and Deskeption is the unethical use of Knowledge Filtering. The core technique involves the misuse of Ockham’s Razor as an application to DATA, and not to competing thought constructs. This is a practice of pseudoscience and is, in its essence, dishonesty.

Ockham’s Razor, or the discernment of plurality versus singularity in terms of competing ideas, is a useful tool in determining whether science should be distracted by bunk theories which would potentially waste everyone’s time and resources. Data, on the other hand, is NOT subject to this threshold.

By insisting that observations be explained immediately, and through rejecting a datum based on the idea that it introduces plurality, one effectively ensures that no data will ever be found which produces a competing construct. You will, in effect, perpetually prove only what you are looking for, or what you have assumed to be correct. No competing idea can ever be formulated, because outlier data is continuously discarded immediately, one datum at a time. This process of singularly dismissing each datum in a series of observations – observations which would otherwise constitute data collection in an ethical context – is called “Knowledge Filtering,” and it stands as a key step in the Cultivation of Ignorance, a practice on the part of Social Skepticism. It is a process of screening data before it can reach the body of non-expert scientists. It is a method of squelching science in its unacknowledged steps of process, before it can gain a footing inside the body of scientific discourse. It is employed in the example graphic to the right, in the center, just before the step of employing the ‘dismissible margin’ in Social Skepticism’s mismanagement of scientific consensus.

Plurality is a principle which is applied to constructs and hypotheses, not data.
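To make the filtering mechanism concrete, here is a minimal toy simulation – a sketch only, with an invented data stream and an arbitrary ‘fits the standing model’ test standing in for the per-datum dismissal described above. Dispositioning each datum one at a time erases the outlier cluster entirely; aggregating first lets a competing construct surface:

```python
import random

random.seed(7)

# Invented observation stream: 90% fit the standing explanation (near 0);
# 10% form a consistent competing pattern (near 5) - the 'outliers'.
observations = [random.gauss(0, 1) if random.random() < 0.9 else random.gauss(5, 0.5)
                for _ in range(1000)]

def fits_standing_model(x: float) -> bool:
    return abs(x) < 3  # arbitrary stand-in for 'simple, already explained'

# Knowledge filtering: every datum is dispositioned immediately, one at a
# time, so nothing inconsistent with the assumption is ever retained.
filtered_record = [x for x in observations if fits_standing_model(x)]
print(f"filtered record: {len(filtered_record)} of {len(observations)} kept; "
      f"the competing pattern never appears")

# Ethical alternative: aggregate first, then examine the whole set.
outliers = [x for x in observations if not fits_standing_model(x)]
print(f"aggregated view: {len(outliers)} outliers clustering near "
      f"{sum(outliers) / len(outliers):.1f} - grounds to sponsor a construct")
```

The filtered record can only ever confirm the assumption it was screened by; the aggregated view is what allows a construct to form and vie for plurality.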

I once found a curious native petroglyph while on an archaeological rafting excursion – one which was completely out of place, but whose ochre had been dated to antiquity. I took a photo of it to the state university library and was unable to find the petroglyph in the well documented inventory of Native American glyphs. I found all the glyphs to the right and all the glyphs to the left of the curious one. However, the glyph in question had been the only one excluded from the state documentation work performed by a local university professor. When I inquired, a senior fellow at the foundation supporting the library replied appropriately, “You know, maybe the glyph just didn’t fit the understanding.” He had hit the nail on the head. By Occam’s Razor, the professor had been given tacit permission to filter the information out of the public database, effectively erasing its presence from history. He did not have to erase the glyph itself – rather simply erase the glyph from the public record, our minds and science – and excuse it all as an act of ‘rational thinking.’ And were I to attempt to insert this glyph into the scientific record myself, I knew that my career would come under attack. So I left the issue at that point.

The Purpose of Ockham’s Razor is to BEGIN the scientific method, not screen data out and finish it.

Data stands on its own. Additionally, when found in abundance – or even sometimes when found in scarcity – and not eliminated one datum at a time by the false anecdotal application of “Occam’s” Razor, data can eventually be formulated into a construct which will then vie for plurality under the real Ockham’s Razor. This is a useful principle of construct refinement, prior to testing, under the scientific method.

As you might see below, plurality resides at the heart of scientific research. But the unsung heroes of plurality are the sponsors of original, creative, persistent and perceptive research who drive the process of plurality (Scientific Method Steps 1 – 5, below).  They, even more so than authors and studies undergoing the process of Peer Review, bear the brunt of disdain from faking scientists and SSkeptics who seek to prevent the process of plurality from occurring at all costs.

When rational thinking becomes nothing more than an exercise in simply dismissing observations to suit one’s inherited ontology, then the entire integral will and mind of the individual participating in such activity has been broken.

Sometimes what is celebrated as simple, is in reality merely obtuse.


¹  Gone Girl, screenplay by Gillian Flynn; directed by David Fincher; 20th Century Fox, September 26, 2014.

²  Thornton, Stephen, “Karl Popper”, The Stanford Encyclopedia of Philosophy (Summer 2014 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/sum2014/entries/popper/>

³  Spade, Paul Vincent and Panaccio, Claude, “William of Ockham”, The Stanford Encyclopedia of Philosophy (Fall 2011 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/entries/ockham/>

†  Gulielmi Occhami, … Summa totius logicae; https://archive.org/details/bub_gb_ciXVn5xQ-5kC

June 30, 2013 | Agenda Propaganda, Argument Fallacies, Ethical Skepticism, Institutional Mandates

   
