The Ethical Skeptic

Challenging Agency of Pseudo-Skepticism & Cultivated Ignorance

The Peculiar Schema of DNA Codon’s Second Letter

The second letter of the three-digit DNA codon bears a remarkable schema of such extraordinary improbability that the question arises, “How did this 1-in-2.7-sextillion occurrence happen at all?” Is it in essence a signature livestock crypto-brand, wherein the iron just happens to be struck at the single best intersection of complexity and immutability inside the planet’s common genome?

The protein makeup and physiology of every living organism which has ever lived on Earth are defined by its genome. Genes consist of long sequences of nucleic acid bases (digits) which provide the information essential to constructing these proteins and their expressed features. DNA is the information library employed by ribosomes and messenger RNA to craft the structure and function of the organism. Our current DNA codex has stood as the basis of life since the common ancestor of Archaea first appeared on Earth shortly after the planet formed, 4.5 billion years ago.

Recent, so-called ‘molecular clock’ studies have pushed the origin of our DNA-based life back to a mere 60 million years after Earth’s very formation.1 They make the stark argument that the code upon which DNA functionally depends has been around essentially as long as our Earth, or at the very least its Moon and oceans. It took another 2.2 billion years for Earth’s life to evolve from Archaea into our current domain, Eukaryota. Eukaryotes consist of a cell or cells in which the genetic material is DNA in the form of chromosomes contained within a distinct nucleus. Mankind is, of course, part of the domain Eukaryota (albeit post Cambrian Explosion).

This sequence of events and their timing is critical path to the argument presented inside this article. Therefore, as is my standard practice, a summary graphic is in order. Our current understanding of the age of the Moon, the Earth, life on Earth, and DNA is as follows (drawn from the four sources footnoted):2 3 4 5 Bear in mind as you continue to read the timing of the four key events depicted on the left-hand side of the timeline chart below (using Three Kingdom classification): the formation of the Earth, the introduction of the Moon, and the appearance of oceans and life on Earth.

A Timeline of Earth Life from LUCA to Genetic Engineers

Before we begin, please forgive me if I wax speculatively poetic for just a moment. It is almost as if, in death and dying, the incentive strategy of DNA is to motivate higher order creatures not only to propagate species through sexual reproduction and evolution, but at a certain point to branch their now engineered polymorphisms farther into space as a part of the sentient pursuit of immortality as well. Very much bearing the stealth, coincidence, robustness, and brilliance in conquest characteristic of incentivized distributed ledger warfare. Perhaps as much as anything, this mercenary and suffering-strewn pathway to almost certain extinction, encourages more nihilism than anything men may long ponder. For this very present pain buoys upon an undercurrent of our conscious lives, rendering theist and atheist alike, understandable compatriots in its existential struggle. In a figurative sense, both children of the same ruthless God drunk on their very suffering and confusion. A most-likely Bronze Age mythology recounted in the Gnostic text The Hypostasis of the Archons aptly framed this as ‘The Empire of the Dead and Dying’, caught up in an eons-long struggle with putative ‘Forces of Light’.6

Now let’s table until the end of this discussion the obvious coincidence that Earth’s Moon and its oceans arrived an incredibly short time after the Earth’s formation – essentially at the same time. Thereafter, nothing else of such an astronomically monumental scope occurred for one whole third of the existence of the Universe itself. I have always regarded this as a rather odd happenstance. As if the Moon was most likely a very large Oort Cloud icy planetesimal, heavy with frozen salt water, nitrates, carbon monoxide, carbon dioxide, methane, sulfur, and ammonia (and LUCA?). An interloper (much like Uranus’ moon Miranda) which then surrendered these signature elements to accrete and compose the new surface crust of its larger companion, and sputtered the final ‘mares’ (maria) of water into space or onto Earth. All this only after the Moon’s extinct ocean tides (and not lava basalt floods) had gradually introduced entropy enough to slow its rotation into a tidal lock with its new host, Earth. Under such a construct, this would be why the Earth-facing side of the Moon appears like flat ocean bottom (pink in the graphic to the right), while the far side is craggy and non-eroded. One can clearly observe static-water-eroded older craters contrasting with pristine newer ones, complemented by horizon-disciplined ocean-silt plains in these 4K-clarity Moonscapes. Our barren, now desiccated and ‘same isotope ratios’ gamete gave its very life in order to birth its offspring, a living Earth. But I digress. We will come back to this issue at the end of this article. This alternative construct on the Moon’s origin will be the subject of another article sometime in the future for The Ethical Skeptic.

All speculation aside, a more astounding aspect of this timeline is the relative quickness with which life appeared on the newly formed Earth-Moon binary. Moreover, it is not the mere appearance of life itself which stands as the most intriguing aspect of this event for me. Not to take the emergence of life for granted, but certainly one can be forgiven for pondering an even more challenging issue: the very quick appearance of the complex code upon which all Earth life is based, the DNA Codex – or what is also called the ‘Standard Code’.7 Be forewarned, however: this sudden and early introduction of a fully functional and logical Standard Code is not the only mystery encompassed therein.

Peculiar Schema Surrounding the DNA Codon Second Base

Our genetic code consists of four types of DNA nucleotide (A-adenine, C-cytosine, T-thymine, G-guanine) structured into triplet sequences (XXX – for example, ‘ATC’) called codons.8 To put it another way, the ‘alphabet’ of DNA contains only 4 letters, and each ‘word’ in the DNA lexicon possesses only three letters (or ‘bases’). This leaves a set of 64 possible combinations, or words (called ‘codons’ or ‘slots’ in this article), in the language of DNA. More specifically, the set of all possible three-nucleotide combinations is 4 × 4 × 4 = 64, which comprises coding for 19 amino acid molecules, a sulfur/methionine start code (ATG), and three silence-stop codes (TGA, TAG, TAA). One can observe the breakout of this codex by 32 left and right-handed protein doublets (64 ÷ 2) in the graphic I created here: Third Codon Letter Left and Right-Handed 32-Slot Apportionment.
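The combinatorics above can be confirmed with a short script (a minimal sketch; only the start and stop codons named in the text are spelled out here):

```python
from itertools import product

# The four DNA bases -- the 'alphabet' of the code
BASES = "ACGT"

# Every possible three-letter 'word' (codon) in the DNA lexicon
codons = ["".join(triplet) for triplet in product(BASES, repeat=3)]
print(len(codons))  # 4 * 4 * 4 = 64 possible codons, or 'slots'

START = {"ATG"}                 # the methionine start code
STOPS = {"TGA", "TAG", "TAA"}   # the three silence-stop codes

# Both control sets are, of course, members of the 64-slot codex
assert START.issubset(codons) and STOPS.issubset(codons)
```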

However, perhaps a better way to view the assignment of codon slot to specific amino acid molecule is through examining the full 64-slot breakout by amino acid or control codon (with duplications). That breakout can be viewed in the graphic I created here: DNA Codon Slot by Nucleon Count. As a note, I like to create my own graphics from scratch. One will find that one does not truly understand a subject until one has accurately described it in detail for oneself. The errors encountered along such a journey typically demonstrate that one did not possess nearly the grasp of the issue one might have thought upon first study. One will also find that one’s ability to retain the material in detail is enhanced through such exercise. While similar in nature to the Feynman technique, my chosen approach differs from Feynman’s in that it does not put on the pretentious charade of packaging ideas for ‘sixth graders’, which is in reality an attempt at celebrity-building. In similar ethic, one will not find me playing bongo drums for a doting media. Such buffoonery exemplifies why ignorance around the DNA codon schema is ubiquitous today. Remember these tenets of ethical skepticism:

Deception is the complicated made to appear simple.
Accurate, is simple.

There is a thing called a Bridgman Point, below which to simplify something further is also to introduce error. Sixth-grade speak often resides below this critical threshold, and the entailed error escapes the simple minds of both presenter and recipient. Such simplification runs anathema to the philosophy of skepticism.

Nonetheless, the most astounding aspect of this latter breakout method (all 64 slots ranked by nucleon count) is the utter organization around which the codon-to-amino-acid assignment is made. The DNA codon second digit (base) schema is akin to an organized and well-kept room, wherein even the items which are out of place are forced out of place for a specific purpose. When I assembled Graphic A below, it reminded me very much of the Resistor Color Band-Code codex we employed in Electrical Engineering classes in undergraduate school, and in the assembly and soldering of circuit boards in Navy cryptography electronics. Bear in mind that the resistor 5-Band-Code engineer’s benchmark standard to the right (courtesy of Digi-Key Electronics Supply) bears less organization and symmetry than does the DNA codex in Graphic A below.

For this reason and many others, the Standard Code DNA Codex is sometimes referred to by a moniker assigned by Francis Crick: the ‘Frozen Accident’.9 However, what we will observe later in this article is that this event was characterized not by simply one frozen accident, but rather by several highly improbable ones which concatenate into a single scope of events. Nonetheless, this organization/symmetry regarding the slot-to-nucleon-count schema, which leverages the second letter of the DNA codon, can be more easily viewed in Graphic A below.

It took me around ten years of grappling with this, and falsifying probably 8 or 10 other linear inductive approaches to viewing it, to finally break this code through a deductive approach and winnow it out into its 33 components of logic and symmetry (A through Y below). The broken code can be viewed here: DNA Codex Broken by CTGA Nucleon N-O Stem and Control Slot, and of course its matching Graphic:

Graphic A – DNA Codon Slot by Second and Third Base Matched to Assigned Amino Acid Nucleon Count

While most genetic scientists recognize the peculiarities entailed in the schema surrounding the second base of the DNA codon,10 few perhaps fully perceive its extraordinary logical structure, along with the infinitesimally small possibility of the Standard Code having occurred (or even evolved) by accident. The reader should know that I presented this construct to a Chair in Genetics at a very prominent university years ago. That discussion constituted the first time he had ever heard of the idea or fully realized many of the oddities identified therein. He forwarded me a complimentary copy of the genetics textbook he had authored, which I treasure to this day. This was not a shortfall on his part, as I am sure that the domain bore ample professional challenge over the decades above and beyond this set of potential observations. Nonetheless, I remain doubtful (not skeptical) that this construct has been given adequate examination and fair due inside scientific discourse.

Numerous improbable-to-impossible idiosyncrasies stand out inside Graphic A above, an illuminating version of how to depict the schema surrounding the second letter (base) of the DNA codon. For example, a critical observation to note under this method of examining the schema is that there is no ‘third base degeneracy’ inside the Standard Code, as many commonly hold. The symmetry simply dovetails into more specialized and partitioned schema, bearing even more symmetry (a lower entropy state, not higher – as can also be seen in the G and T blocks in the chart to the right). Upon close examination, the 64-slot code’s being fully fleshed out is not necessarily the result of degeneration, but rather bears just as significant a likelihood that this ‘every-slot-occupied’ tactic is purposed to prohibit amino acid assignment degeneration in the first place. This matter must be left open from the standpoint of skepticism.

This notion that the Standard Code features third base degeneracy, is an idea containing an a priori Wittgenstein lexicon bias.
Inference drawn from such an assumption is unsound.

Abductive thinking of this nature impedes our ability to conduct science around the issue. Why are so few geneticists truly familiar with this material, and why do only a paltry number of studies exist to date on this very important and astounding construct? The issue incorporates a feature of the philosophy of logic which I call ‘critical path’. One of mankind’s greatest skills is the ability to deliberate a subject at great length, yet still manage to avoid anything of a critical path benefit throughout its discourse (aka politics). DNA is no different.

Critical Path Question: Could our Second Base CTGA-N Codex have developed outside any context of intent?

Now that I have buried the lede far enough, let’s get right to the key point. The likelihood of this remarkable schema occurring inside a solely accidental context I have calculated to be 1 chance in 2.7 × 10²¹ (2.7 sextillion). One can view the reference table and series probability formulas employed for the combinations in this first table, Table 1 – Probability Reference Table. These reference probabilities are then combined into a series of calculations, culminating in the remote likelihood of Item B below, and then finally a preliminary estimate of the codon second letter schema’s occurring naturally, based upon just three primary factors (the remainder are included in Items A through Y later in this article):

Item B. Possibility of a combined set of two (y) blocks of 16 (x) contiguous second base assignments in increasing nucleon count order, along with 7 other blocks of 4 accomplishing the same (see Table 1 – P(x,y) = ((1 – P(x))^O)^y, or P = 7.12 × 10⁻¹⁷). Please note that we count only two contiguous 16-slot blocks as coherent, not four, because of uncertainty in the other two. This is also done for the sake of conservatism.

Item C. The likelihood of having such a structure result in symmetry between start and first stop blocks, and in addition displacing the second stop-block codes to the end of the series (P(x) = 0.00024), and

Item E. The likelihood of having an entire block of second base amino acids be composed solely of NO₂ nitrites, given that there is no chemical feedback from the amino acid to the codon development/assignment, along with the fact that the Standard Code is itself a prerequisite in order to have evolution in the first place (P(x) = 0.00002).

This results in a compounded probability of 2.65 × 10⁻²⁵. Remember, for the sake of conservatism, we have chosen to quantify only Items B, C, and E from the list of idiosyncrasies below. When I ran the numbers using all Items A through Y, the calculations simply compounded to outlandishness. Items A and B, as well, were simply two sides of the same coin, so all my trials of calculation have used only A or B, but not both. These three evaluation factors seemed the most compatible with a reasonable quantification effort, and to my mind offered a smaller range of potential error. My belief is that this notion of degeneracy, along with poor portrayals of the Standard Code, has blinded most of science’s ability to observe this anomaly to its full extent. As a former intelligence officer, this is what I have been trained to do: spot the things no one else has. I have made a very successful career of this skill.

However, given that our Standard Code is not the ultimate code which possibly could have developed – and indeed there are most likely up to 1 × 10⁴ codes of equal or superior optimization11 – an increase in probability by a factor of 10,000 adjusts the final tally to a likelihood of 2.7 × 10⁻²¹ (1 in 2.7 sextillion – note that one significant digit is lost in that calculation), given that we prefer to conservatively assume the best-case probability. The combined series of calculations can be viewed in this second table, Table 2 – Probability Calculation Table for 3 Factors. Those calculations are conceptually depicted in Graphic B below.
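As a quick arithmetic sanity check, the three factors can be compounded directly (a minimal sketch; the inputs are the rounded reference values quoted above for Items B, C, and E, so the product lands near, though not exactly at, the 2.65 × 10⁻²⁵ figure derived from the full table values):

```python
# Rounded reference probabilities quoted in the text
p_B = 7.12e-17   # Item B: two 16-slot blocks plus seven 4-slot blocks in nucleon order
p_C = 0.00024    # Item C: start/first-stop symmetry and stop-code displacement
p_E = 0.00002    # Item E: an entire second-base block composed solely of NO2 nitrites

compounded = p_B * p_C * p_E
print(f"compounded probability: {compounded:.2e}")   # on the order of 10^-25

# Adjusting upward by the ~10^4 codes of equal or superior optimization
adjusted = compounded * 1e4
print(f"adjusted probability:   {adjusted:.2e}")     # on the order of 10^-21
```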

Graphic B – Remote Likelihood of Standard Code Developing by Accident

The abstract in Graphic B still places the Standard Code 99.9% along the journey from a more likely version towards an ultimate ‘optimized’, but at the same time highly unlikely, code. As you examine the chart above, note that the Standard Code is not structured to flag attention with a 10⁻²⁵ beacon of perfection, but rather a much more tantalizing 10⁻²¹ efficacy woven into a fabric of stealth. While this is only a suspicion, I cannot shake the perception that this pattern is not meant to be an ultimate optimization in the first place (although it is abundantly close), but rather a watermark. A branding, if you will, identifying the species’ trademark/point of origin (ownership?), regardless of what the creature has evolved into at any point in the future. It resides coincidentally at a very opportune Indigo Point inside inflection theory, bearing a raison d’être in that once the code is struck, it never changes, nor does it evolve. What I have found in my career is that benefit stakes from coincidences/uncertainty seldom go uncaptured. Look back at Graphic A again now and see if such an idea makes sense.

In other words, is what is contained in Graphic A a crypto-trademark? A cattle brand? Its branding iron being struck at the only point which functionally resides at the intersection of complexity and immutability inside a genome-in-common.

Not merely serving as a brand, but moreover a crypto-codex. A Standard Codex which would function simultaneously to prevent outside-crossbreeding (even with other DNA-based life), and yet enable a catch & release monitoring program to quickly identify interlopers into a planet’s (or series’ thereof) biosphere.

Granted this stack of ideas is highly speculative and skeptical neutrality upon its first hearing is certainly understandable. I would suggest the reader hold such a line of thinking (per hoc aditum) in suspension (epoché) and continue reading through Items A through Y below.

The chart to the right is extracted from the footnoted Koonin/Novozhilov study and expresses those authors’ visualization of this very concept. I don’t agree with those authors’ study conclusions, but I applaud their boldness, career risk, and critical path work on this matter. Click on the thumbnail in order to obtain a larger image of the chart. The Standard Code is represented conceptually by the blue dot, while the ultimate optimal code would reside at the tip of the tallest peak in the chart. Indeed, the Standard Code represents almost codex perfection – something mere chemical and metabolic affinities (even if they were plausible, which is highly doubtful) cannot come close to explaining, much less attaining.

One should note that various constructs (not true hypotheses) exist as to chemical/metabolic connections between DNA codon slot number and nucleon count.12 However, we discount these because the purported chemistry involved would have had to select which chemistry to serve, between nucleon count and the N-O stem of each amino acid molecule – serving a mix of one or the other in terms of chemical affinity, but not both perfectly at the same time. The selection here transcended chemical affinity roles and selected correctly for both (the blue bars in the above chart). Both the one-way aspect of gene expression and the difficulty in selecting correctly for two conflicting chemistries at the same time deductively strengthen a logical-only scenario. Such force-to-convention speculation ends up constituting only ad hoc apologetic.

Moreover, regarding this rather extraordinary schema, several additional detailed observations may be made. Note that only the items in bold/red were used in the actual probability calculations.

A. There exists a slot-order to nucleon count linearity bearing a coefficient of determination (R²) of .971 within codon groups 1 – 48 and 49 – 64 (.5757 overall), and against the second base blocks that are formed by this progression. It is not the linearity itself which is a tail event, as anything which is sorted by magnitude can take on linearity – rather, it is the cohesive groupings by second base of the DNA codon which result from this linear series arrangement which must be quantified inside a salient probability function (see the bottom right-hand side of Graphic C below). Below, one can observe where I ran a Monte Carlo simulation of combined possibilities for the sequence of 60 amino acid and 4 control-code slots. The distribution function and degrees of freedom in random assignment of codon to slot fail miserably to establish an orderly/logical relationship with amino acid nucleon count or with blocks of same-second-base codons.

On iteration 335 I obtained a coefficient of determination of R² = .1087, which was relatively high as compared to the previous trials. It was there that I framed a rudimentary top end and ‘degrees of freedom’ for this apparent Chi-squared arrival distribution curve. Despite the less probable nature of the iteration 335 coefficient, it still resided a woeful 1 × 10¹⁶ in distance from a more likely code to the existing Standard Code. Had I continued to run these iterations in the Monte Carlo simulation, it would take me 2.6 billion-million years to finally hit upon a code as remote in possibility as is the DNA Standard Code. That is under a condition wherein I purposely try to encounter such a code. Think how long this endeavor would take if I were just randomly hitting keys throughout the exercise. Remember that abiogenesis had only one shot at producing the Standard Code through such randomness – as it cannot ‘evolve’.

Note that this Monte Carlo simulation is not used in the probability calculations. It is run as part of the set of exercises which might serve to elicit something missing, falsify the main thesis, provide relative perspective, or stimulate different thinking around the matter. It bears a fascinating result nonetheless.
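A simplified version of such a Monte Carlo exercise can be sketched as follows. This is illustrative only, under stated assumptions: approximate nucleon counts (rounded molecular masses) stand in for the article’s exact table, assignment of amino acid to slot is uniformly random, and a plain least-squares R² is used; it does not reproduce the article’s distribution fitting or degrees-of-freedom framing.

```python
import random
from statistics import mean

# Approximate nucleon counts (rounded molecular masses) of the 20 standard
# amino acids -- illustrative stand-ins, not the article's exact table
NUCLEONS = [75, 89, 105, 115, 117, 119, 121, 131, 131, 132,
            133, 146, 146, 147, 149, 155, 165, 174, 181, 204]

def r_squared(xs, ys):
    """Coefficient of determination for a simple linear fit of ys against xs."""
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)

random.seed(335)  # fixed seed for reproducibility
best = 0.0
for _ in range(335):  # the article's iteration count
    # One random codon-to-slot assignment across the 64 logical slots
    slots = [random.choice(NUCLEONS) for _ in range(64)]
    best = max(best, r_squared(range(64), slots))

print(f"best R^2 over 335 random assignments: {best:.4f}")
```

Random assignments of this kind yield only weak slot-order-to-nucleon-count fits, far below the .971 within-group figure reported for the Standard Code in Item A.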

Graphic C – Monte Carlo Simulation of 20 Amino Acids and 3 Stops into 64 Logical Slots over 335 Iterations

Continuing from this critical point, we observe even more idiosyncrasy in terms of items B through Y below.

B.  All codons are grouped into contiguous blocks of 16 logical assignments, and when sequenced C-T-G-A, for both the second and then third bases of the Codex, produce a split linear progression against nucleon count of 5 discrete groupings (2 overlap in G). Only two blocks are evaluated for probability under this analysis.

C.  Stop code assigned to slot 64, with two stop codes being grouped into a contiguous pair, when stop codes bear no chemical feature from which affinity may ostensibly originate. The third stop code bears symmetry with the methionine start code.

D.  Use of an amino acid (methionine) as the sequence start code and, in contrast, silence as the sequence stop code – two distinct choices, both of which are logically assigned and not remnants of failed chemistry.

E.  Assignment of solely hydrophobic nitrites (NO₂) to the T-coded block.

F.  The methionine start code and tryptophan-block stop code bear mirrored symmetry in the T and G blocks, with each spanning a distance of 8 slots from the start of the block, 16 slots apart and each 24 slots inward from the first and last amino acid assignment. Neither nucleon nor N-O affinity can generate this type of symmetry.

G.  A-coded amino acid group block employs all doublet code assignments.

H.  C-coded amino acid group block employs all quadlet code assignments.

I.  The C-T-G-A sequence which produces symmetry at the macro level, also produces partitioned symmetry at the individual codon sequence level, which is also optimized with a C-T-G-A sequence through all 64 slots.

J.  Absolute necessity of all three codon digits for any type of basis for functional life/evolution/Archaea. There existed neither the time nor logical basis for the Standard Code to have functioned upon a two digit (XX) codon, before adding the third digit as a suffix. There are critical amino acids and controls which depend upon a specific 3-digit codex in Archaea, our oldest form of life on Earth.

K.  Inability of the Standard Codex to derive from a process of evolution.

   1.  The code is a prerequisite for evolution itself, so it could not have evolved.

   2.  The chemistry (if such chemistry is ever found) could serve nucleon count or N-O stem affinity, but not both. Only logical assignment could balance both requirements without fail and achieve symmetry at the same time.

   3.  Evolution would have more likely selected for a simpler array of assignment (32 digits, etc.). A suggested early-on two digit Codex could explain part of this, but cannot explain a start and stop-code symmetry which depends upon 3 digits, nor the short amount of time in which the three digit code arose (in something which does not change).

   4.  If the Code evolved – this evolution should have continued across 4.5 billion years (after originating in its entirety in less than 30 million) – yet it did not.13

   5.  See D. above. Evolutionary changes in DNA occur through the accretion/change of information linked to function and not through the leveraging of silence (absence of information).

L.  The control start and first stop doublets bear common symmetry and regression fit – and as well, both associated codon molecules are adjusted by isomer in order to adhere to this symmetry and fit. The sulfur suffix applied to the methionine start-control codon boosts it into a position of correct regression linearity by doublet slot. Tryptophan bears an isomer appendage which is essentially the complexity equivalent of cysteine, thereby reducing it into the correct regression linearity control doublet slot as well.

M.  All complex NₓOₓ distal amino acid codes grouped together after the first stop code, with N₄O₂ assigned only under the guanine block and the remainder grouped into the adenine block.

N.  Featuring no true grouping of odd amino acid counts (3, 5, etc.), which should have occurred in an affinity or other unguided scenario.

O.  The only odd molecules, tryptophan (C₁₁H₁₂N₂O₂) and methionine (C₅H₁₁NO₂S), are the only ones assigned to singlet slots – and they just happen both to bear symmetry and both to be paired with control codons. These doublets are then placed symmetrically from the beginning (8) of each of their respective blocks, 16 slots apart, simultaneously with symmetric distance (24) from the outer edges of the C-T-G-A block as a whole. This is an extraordinary feat, given that chemical affinity would not only have failed to produce this, but would moreover have prevented it from occurring in the first place (were affinity involved at all, either nucleon or N-O stem).

P.  T and G blocks possessing symmetrical doublet assignment patterns.

Q.  The assignment of all 64 logical slots when life bore a much greater probability of beginning with a far less extensive codex size. Akin to evolution starting with a snail, instead of Archaea. However, once the context of very-quickly-assigned logical symmetry is broached, the possibility arises that the 64 slot code being fully fleshed-out (every slot assigned) is not the result of degeneration, but rather it is precisely a tactic to prohibit degeneration in the first place. This is a starkly different basis of understanding this code.

R.  Control start and stop codes solely employ left-handed codon suffixes. (Note: this is logical only, not the same as molecule chirality)

S.  The positive correlation between the number of synonymous codons for an amino acid and the frequency of the amino acid in living organism proteins.14

T.  Maximization of synonymous point mutations by means of the third letter of the codon – the letter which also bears the greatest frequency of mutation.15

U.  This codex was not only a ‘Frozen Accident’ – which is highly improbable as an event in and of itself – moreover, this accident also selected for a 2.7 × 10⁻²¹-likelihood optimized configuration, on its first and only try.

V.  There were no evolutionary trials or precedents from which to strengthen the code’s logical structure as exhibited in Items A through U above. Our very first form of life, Archaea, depended upon this full logical structure in order to exist.

W.  Polyadenylation (the addition of a poly(A) tail to an RNA transcript) uses the best option of the four bases from which to form the inert tail of RNA. The adenine block both consists of all doublet assignments – increasing the likelihood that accidental transcription would produce amino acid gibberish – and is as well the second-base block containing two stop-silence codes, increasing the likelihood that an end-amino silence will serve to truncate and end the polyadenylation tail.

X.  Next to last, the item which should also be quantified in the probability calculations, but I do not know of an accurate way to estimate its probability arrival function: Each of the regression lines which describe the split symmetry of slot versus nucleon counts features four things:

   1.  both follow a 3:1 m slope (y = mx + b), and

   2.  both trend lines intercept the scatterplot right where the control codon is positioned, wherein these positions themselves also bear symmetry relative to the codon block boundaries and the T-G symmetry itself, and

   3.  adjustments were made to the molecules associated with the control code – cysteine (C₃H₇NO₂ appended with S), tryptophan (C₃H₇NO₂ appended with C₈H₅N), and methionine (C₅H₁₁NO₂ appended with S) – in order to achieve this positioning (see DNA Codex Broken by CTGA Nucleon N-O Stem and Control Slot), and finally, the most incredible feat,

   4.  Tryptophan (C₃H₇NO₂ appended with C₈H₅N), when taken in total and broken into its molecular constituents, fits BOTH the C-T-G and G-A symmetry and linearity, connecting both groups with a common anchor point in the form of their common stop code assignment (amino acid silence).

The abject unlikelihood of all this occurring by happenstance, and on essentially the first try, pushes the bounds of Ockham’s Razor, rationality, and critical thinking to the point of accepting the absurd frozen accident construct (it is not a mature hypothesis) merely because it is ‘simple’ to understand. Such an approach is no less ad hoc than ‘God did it’ (equally absurd and simple), and only becomes a religion when the idea is enforced as truth/science. This of course serves to introduce the construct that just happens to feature all of these foibles.

Fast and Simple: Grant Unto Me Six Miracles and I Can Explain All the Rest

Finally, regarding ‘Item Y’ if you will, perhaps what is most daunting is the short amount of time which was allotted for such a code to ‘freeze’ into existence in the first place, along with the immediacy with which it happened upon a hostile and 60 million-year-old planet – a mere 20 million years after the Moon was ‘ejected’ (more than likely ‘arrived’, with life’s DNA Codex already intact) from Earth by a trillion-to-one collision with the hypothetical planetary body Theia.16 The astute reader should have noticed by now that science possesses multiple ‘trillion to one’ happenstance claims, all compounding inside this argument. One can throw tantrums all one wants about ‘irreducible complexity’ (which this is not) being ‘debunked’ (whatever either of those terms may mean), but those who issue such memorized dross must recognize that the theory they are defending is even worse. Our reliance upon the absurd in order to cling to a religious null hypothesis is becoming almost desperate in appearance. This upside-down condition of ignorance is called an Ockham’s Inversion.

Ockham’s Inversion

When the ‘simplest explanation’ is not so simple after all. The condition when the rational or simple explanation or null hypothesis requires so many risky, stacked, or absurd assumptions in order to make it quickly viable, that it has become even more outlandish than the complex explanation it was supposed to surpass in likelihood.

Now let us hearken back to those four key timeline events which we tabled at the outset of this article. Let’s consider the highly stacked set of unlikely elements and their probability, which together compose this Ockham’s Inversion in our current understanding. To wit:

The Six Miracle Theory

Moon created through Earth collision with Theia                                     1 in 1 × 10^12
Moon and oceans arrive so soon after Earth formed                               1 in 1 × 10^3
Occurrence of Francis Crick’s DNA Codex ‘Frozen Accident’                 1 in 1 × 10^9
Immediate arrival of ‘Frozen Accident’ after Moon and oceans               1 in 1 × 10^3
Infinitesimal possibility of Standard Code DNA codon schema               1 in 2.7 × 10^21
60 M year duration window for these events in sequence                       1 in 1 × 10^6

Compounds up to 1 chance in 1 × 10^54
Planets in Milky Way: ~4.0 × 10^11 stars × 5 planets = 2.0 × 10^12

A difference in magnitude of 10^−42 in possibility, in one galaxy alone…
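The compounding in the table above is straight multiplication of independent odds. A minimal sketch in Python can verify the arithmetic (the six figures and the planet count are the article’s own; treating the events as independent is likewise the article’s assumption):

```python
import math

# The six stacked odds from the table above, each as a 1-in-N figure
miracle_odds = [
    1e12,    # Theia collision creating the Moon
    1e3,     # Moon and oceans arriving so soon after Earth formed
    1e9,     # Crick's 'frozen accident' DNA codex occurring at all
    1e3,     # immediate arrival of the codex after Moon and oceans
    2.7e21,  # Standard Code DNA codon schema
    1e6,     # 60 M-year window for the whole sequence
]

# Independent odds compound by multiplication
compound = math.prod(miracle_odds)   # ~2.7 x 10^54, i.e. on the order of 10^54

# ~4.0 x 10^11 Milky Way stars at ~5 planets each
planets = 4.0e11 * 5                 # 2.0 x 10^12 planetary 'trials'

# Orders of magnitude by which the galaxy's trials fall short
shortfall = round(math.log10(compound / planets))

print(f"compound odds : 1 in {compound:.1e}")
print(f"planet trials : {planets:.1e}")
print(f"shortfall     : ~10^{shortfall}")
```

The product comes out at roughly 2.7 × 10^54, against 2.0 × 10^12 planetary trials — a shortfall of about 42 orders of magnitude, matching the figure cited above. (`math.prod` requires Python 3.8 or later.)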

In other words, there are not enough stars in the entire Milky Way to pull off this incredibly remote six-miracle event. This does not necessarily suggest the intervention of aliens or Gods (each fallaciously ad hoc); however, it does imply that something exceptional and beyond our current frame of understanding has occurred behind the appearance and ascendancy of life on Earth. I do not have to declare a murder or accident when I find a dead body. All I have to know is that I have a dead body and that something extraordinary has transpired. This is science. Plugging one’s ears and refusing to examine the corpse is not science. I have no doubt that whatever the mechanism is that precipitated this codex, it is a natural feature of our realm. I simply regard it to likely be one we don’t yet know about or understand.

One third of the known history of the Universe has passed since this gift, and all we have produced with it are some violent fashionista monkeys.

One or more of our grand assumptions about all origins is more than likely a colossal error.

By the same token, it is one thing to claim a single grand accident under the idea, ‘Grant me one miracle and I can explain all the rest’. We stomach such miracles of science all the time, at the very least to serve as placeholders until we attain better information. However it is another thing entirely to demand in the name of science, six miracles inside a tight sequence, in order to protect one’s religious beliefs. That is where I respond ‘Not so fast’.

It is not that which you parrot, but rather that which you forbid examination of, which defines one’s religious belief.

What is it the skeptics always say about ‘Occam’s Razor’ – ‘The least feature-stacked possibility tends to be the correct one’? Well, this is one hell of a stacked set of features supporting our current null hypothesis around the appearance of life on Earth (abiogenesis). In any genuine scientific context, the hypothesis that the Moon brought us our crust elements and life itself, making Earth what it is today, would be foremost in our theory. Instead, we have chosen the Six Miracle Theory. However, why such a fortuitous event occurred in the first place is a matter of another critical path of inquiry altogether.

In order to advance as mankind, we will need to open our minds to possibilities which address not mere ‘gaps’ in our knowledge, but rather vast barren domains of Nelsonian ignorance. Unfortunately, our Wittgenstein definitions and religious assumptions regarding the appearance of life on Earth are serving to bias us into an extreme corner of statistically remote plausibility, from which we staunchly refuse to budge.

The Ethical Skeptic, “The Peculiar Schema of DNA Codon’s Second Letter”; The Ethical Skeptic, WordPress, 24 Feb 2021; Web, https://theethicalskeptic.com/?p=48816

February 24, 2021 | Ethical Skepticism

The Five Species of Syndicate and Their Dissent

I am an Infidel
I am a Blockade Runner
I am a Stakeholder
I am a Heretic
I am a Dissident
I may indeed agree with the goal of the syndicate. But as an ethical skeptic I may also be compelled to oppose the tactics used in attaining that goal. It is the journey and not the destination which flags corruption.

What is a syndicate? Who are these entities who regard the lavishness and control which can be garnered during a short 78 years of life as of such importance that abusing their fellow man is justified in its quest? The syndicate player is a pretender to the role of God. They are a form of Royalty which seeks to exploit the control, class conflict, and stratification of their fellow citizens in order to improve their wealth, celebrity, and power. An irony exists as well in that many of the below players identify themselves as part of the dissent or heresy, when in reality they are indeed the oppressor.

Remember that a God play typically involves three parties: first, a symbol of victimhood; second, the party who is targeted for harm/extinction; and third, the virtuous party (God-proxy) who harms the second in order to ‘help’ the cause of the first. Such is a common ploy of the syndicate. So what is a syndicate, and what are the most common forms thereof? Of course one can find a differing set of definitions in a graduate-level Business, Government, and the International Economy (BGIE) textbook; however, during my decades of work – I have found that these are the definitions which provide a best Wittgenstein fit in terms of ethical skepticism.

Syndicate

/philosophy : misrepresentation : coercion : organization/ : an organization or group which is crafted to enforce a common interest in terms of agency, coercion, or purging of any form of dissent. Any of five species of human oppression in the name of some higher cause (science, religion, economics, justice, equality, God, virtue, club, self, etc.) A group attempting to act in the role of a God-entity.

The various species of syndicate along with their dissenting entity include:

Cabal – unofficial organizations which control access to the realm of authorized ideas.

   Purpose:  Social Coercion/Intimidation
   Examples:  Antifa/Social Skepticism/Science Communicators/Nihilism
   Opposition:  Infidel

Cartel/Trust (Clayton or Sherman) – controls access to a market vertical or horizontal, sets its pricing and rules, and qualifies which transactions are permitted.

   Purpose:  Economic or Social Exclusion/Domination
   Example: US Grain ABCD’s/OPEC/Technology Social Media/NewsMedia Syndicates
   Opposition:  Free Trader/Pirate

Mafia – controls a product, its distribution channels, and/or the region in which such product is allowed to be sold.

   Purpose:  Economic Oppression/Domination
   Example:  Educational or Non-Elected Government Official Unions/National Football League/AFL-CIO
   Opposition:  Stakeholder

Cathedral – official organizations which administer religious doctrines, the necessity of thought, instruction and cultivation of ignorance toward opposing ideas.

   Purpose:  Social and Religious Doctrine/Enforcement
   Example:  American University System/Big ‘A’ Atheism/Science Based Medicine/Catholic Church
   Opposition:  Heretic

Party – a group which seeks domination of a government and its Overton Window of acceptable policy, through social purging in the forms of violence, propaganda, intimidation, and fraud in governance.

   Purpose:  Social Purge/Oppression/Genocide
   Example:  The Party (formerly ‘Democrats’)/Progressives/Global Socialists/Marxism
   Opposition:  Dissident

One of the purposes of ethical skepticism is to hone an ability to spot a syndicate and its agency when at play. It is not that neutrality itself is actually a form of dissent; however, in the presence of a syndicate it must be exercised as such, as the ethical skeptic is often not given the choice. I might for instance agree with what a mafia does, but still oppose the mafia itself. I might agree with the conclusions of an organization of fake skeptics, but still oppose their methods of pretend science and social coercion. I support addressing the needs of the disadvantaged, but that does not mean that I need put you in power to enact such things.

I may indeed agree with the goal of the syndicate, but as an ethical skeptic I may also be compelled to oppose the tactics employed in arriving at that goal. It is the journey and not the destination which flags corruption.

The purge and coercion tactics in and of themselves force the neutral voice to become, de facto, one in opposition. Such is the fruit of a syndicate: you are either with us, or against us.

The Ethical Skeptic, “The Five Species of Syndicate and Their Dissent”; The Ethical Skeptic, WordPress, 17 Feb 2021; Web, https://theethicalskeptic.com/?p=48654

February 17, 2021 | Ethical Skepticism

The Fatuous Errand of the Fact Checker

What you are seeing in the media is not news. The function of media is to craft the direction and range of tolerance of an Overton Window. Paltering is the method by which one may lie through the delivery of facts. It is the chisel wielded by the ‘fact checker’, employed to shape and direct the Overton Window.

Gone are the days in which one could sit down in front of the TV, and trust the information given by a Walter Cronkite or Peter Jennings. The function of media under the rule of The Party is to deliver neither salient information nor logical truth. While the media does tender updates on cherry sorted social situations, do not be deceived into thinking that therefore the media is a source of news. Most of the articles passed by the media for public consumption constitute means of ingens vanitatum persuasion or public witch-hunt coercion, and not modes of informing. The purpose of a media inside a socio-fascist oligarchy is in fact to enable establishment of an Overton Window regarding that policy discourse which is deemed to be acceptable.1 This Window is then reinforced by astroturfed social pressure inside Clayton Trust collusion on the part of social media tech firms.

Overton Window

/philosophy : misrepresentation : coercion/ : the range of opinion positions which are acceptable to the mainstream population at a given time. It is also known as the acceptable window of discourse. The term is named after Joseph P. Overton, who stated that an idea’s political viability depends mainly on whether it falls within this range. False skeptics purposely pathologize subjects and individuals in order to artificially truncate or manipulate where this window falls in media, along with what is deemed acceptable for scientific study.

At the frontiers of such a Window reside the ‘fact checkers’. Like Marines in a military theater of engagement, they are the most skilled in conducting the frontline warfare of propaganda, through means of a specialized form of lying-through-facts called ‘paltering’.

Paltering

/philosophy : misrepresentation : Wittgenstein sinnlos/ : lying through facts. Paltering is the deceptive use of truthful statements to convey a misleading impression or inference. It is the devious art of lying by telling unqualified truths. It usually involves equivocation and/or prevarication as the basis of its management of constraint, context or ignoratio elenchi – however often can also come in the form of a semantic truth as opposed to a logical one.

Paltering is the method by which the right end of the Overton Window is sculpted in a given direction. The left end of the Overton Window is directed by universities and colleges in their role as up-front-dues unions for white collar labor. In this regard, not only does one pay/borrow the present value (PV) of an entire career’s union dues all at once, but often one must also compromise their integrity in order to be deemed acceptable to The Party. Especially if one is to be inducted into that cathedral called science, and thereafter held also as authority. Paltering in contrast, is used to herd the non-conforming right end of the Overton Window towards a progressive or even extremist viewpoint. It is also a form of confirmation anchoring for the already-compliant left.

Examples of Paltering the Overton Window

The following constitutes a short listing of various species of paltering. An article’s inclusion as an example of each is purposed simply to elicit that principle in use, and not to make commentary upon the claim being made nor falsified. In many cases here I agree with the fact checker’s conclusion; however, I do not agree with the method by which they arrived at their ‘fact’. One cannot arrive at truth standing upon a foundation which lacks integrity in the first place.

Once the blinders have been removed, it becomes nearly impossible to watch the ‘news’. ~Twitter: @baseballmama34

The various species of paltering include:

Prevarication – to lie through manipulating in advance of a point, its basis of definition, observation or data, or by means of persuasion, locution and/or tactic of argument.

Example: AFP Fact Check – Quotes from US Democrats falsely characterized as calls for violence

Hyperbolic – an exaggeration, or as well a form of special pleading wherein a ludicrous or cherry picked statistical constraint upon a set of data is used to lens that data in such a way as to make it appear to be more favorable or unfavorable per one’s a priori position.

Example: Based Upon Science Claim Check – Vaccines are Safe: Vaccines go through a lot of testing

Equivocation – the misleading use of a term with more than one meaning, sense, or use in professional context by glossing over which meaning is intended in the instance of usage, in order to misdefine, inappropriately include, or exclude data in an argument.

Example: USA Today Fact Check: Joe Biden has condemned protest-related violence from the left and the right

Lob & Slam – a stooge-posed incident, or even one fabricated by allies, often cherry picked from a vast array of such incidents for its ridiculousness or vulnerability to being easily debunked – serving both as an example of how irrational the other side indeed is, and of how easy the subject is to dismiss.

Example: CheckYourFact – Fact Check: Did The International Space Station Film A UFO For 22 Minutes?

ignoratio elenchi – a misdirection in argumentation rather than a weak inference. A misrepresentation of the question being addressed, logical calculus or evidence for an opponent’s claim, so as to frame the opponent’s contention in the poorest light.

Example: AFP Fact Check – Police did not assist Antifa during US Capitol assault

Amphibology – a situation where a contention may be interpreted in more than one way for a variety of deceptive reasons, due to ambiguous sentence structure.

Example: AFP Fact Check – Fact check: Joe Biden has condemned protest-related violence from the left and the right

Semantic versus Logical Truth – a logical truth is a statement which is true, and remains true under all reinterpretations of its components or in all contexts aside from simply that of its apperception and crafting. A semantic truth is only true in certain given circumstances.

Example: Misbar – GMOs Aren’t Inherently Unhealthy

Appeal – a variety of appeals to an external reference as support for one’s claim or position. Appeal to popularity, ignorance, authority, hoax, virtue, bucket classification, race, motive, etc.

Example: PolitiFact – Seniors around the world who died from COVID vaccine are being improperly listed as “natural causes.”

Weapon Word – a familiar and often overused term of condemnation and bucket categorization, which is employed towards a person or idea in advance of any state of knowledge or investigation. A pejorative which is used to condemn from an a priori perspective, in an effort to end an argument without due rigor.

Example: Snopes – Did a Scientific Study Prove That “Conspiracy Theorists” Are “The Most Sane of All?”

Straw Man – misrepresentation of either an ally or opponent’s position, argument or fabrication of a ridiculous version of such, often in absence of any stated opinion.

Example: CheckYourFact – Fact Check: Were These 9 Democratic Politicians Born 9 Months After The 1947 Roswell UFO Incident?

Wicker Man – called ‘the ultimate straw man’. Employing a Daedalean identity where so many special exemptions are able to be pleaded or apologists habitually spin the idea that any critique offered towards their side constitutes straw man, ignorance or tu quoque errors – that the defended philosophy or position actually has no effective defining essence which can be pinned down in the first place.

Example: FactCheck.org – Missing Context on Claim About ‘Antifa.com’

Special Pleading Misdirection – a misdirection in denial by means of a specific or cherry picked anecdote, so highly constrained that it most likely has to be false to begin with, to serve as a red herring distraction from a similar but more valid version of the same contention.

Example: Snopes – Is Rep. Debbie Dingell Introducing a ‘Gun Confiscation’ Bill?

One of the purposes of ethical skepticism is to hone an ability to spot paltering when it is passed by a fact checker as ‘correct information or inference’ or for basis thereof.

The Ethical Skeptic, “The Fatuous Errand of the Fact Checker”; The Ethical Skeptic, WordPress, 6 Feb 2021; Web, https://theethicalskeptic.com/?p=48336

February 6, 2021 | Ethical Skepticism
