The Ethical Skeptic

Challenging Pseudo-Skepticism, Institutional Propaganda and Cultivated Ignorance

The Hermit of Nosnix Who Couldn’t be Fooled

A Holiday message from me. When, in an effort to see that no one is fooled, you cede control authority to one who cannot be fooled – then everyone ends up being fooled. The Debbles of Doubthill are the pseudosciences, topics so feared by skeptics that we all lose in their mad rush to obfuscate and squelch them. Most Debbles are dispelled in their facing, and not in their avoidance.
A persistence of blindness whose lesson's quite cruel
The worst form of idiot thinks he cannot be fooled.

by Theodor Eric Seussel

A planner of trestles and layer of tracks
Purveyor of cartniks with plumthings on their backs,
The Who’s of Fair Pluntkin fully decked out in auld
Could barely scarce function without things which he hauled.
His hovel not cluttered, his mind like Straight Lane
His cartniks lined up to shuttle plumthings again,
For short were his musings and so scarce were his lacks
That the Who’s of Fair Pluntkin did cede him their tracks.
Come spring summer’s fresh offing or winter’s cold somethings
No Debble of Doubthill would delay their dear plumthings,
Cartniks would arrive here and then sometimes there
The Who’s of Fair Pluntkin just didn’t care where.
Who’s knew that from Kesnig and Pennington to Kuled
Theirs was the Hermit of Nosnix who couldn’t be fooled.
When an idea arose in the talk of Fair Pluntkin
‘Why don’t we go looking for quazlots and buntpins?’
Such cans of rare ore good were legend of old
More precious than silver, fine jewelry or gold.
They set out a plan to build trackway and charter
And bade their tough Hermit to sit down and barter,
But he could not be bought and he would not be schooled
For he was the Hermit of Nosnix and nobody’s fool.
Thus did they cajole him and call him by name
Offering up plumthings and palace and fame,
‘Lay us the tracks there and we’ll run cartnik cans
The Debbles of Doubthill won’t thwart any such plans!’
But staid in his knowledge of their trestles and tracks,
He vowed to confuse cartniks with cans on their backs!
Quazlots and buntpins were foolish men’s mirth
Of fleeting sheer fantasy in essence and worth;
Such trackways to Debbles should never be tooled
I am the Hermit of Nosnix and I cannot be fooled!
So each newly laid line of steel and proud skill
He plotted curve ’round all Debbles of Doubthill;
For whether tracks stopped there or ran ’round their threat
In either case ran no cartnik cans yet!
Year after year Who’s laid out Who plans
To find Debbles of Doubthill had blocked all their cans!
The curves he laid toward them only bent ’round again
To meet up with tracks from whence they came!
So hated he quazlots and buntpins and mirth
That he wouldn’t lay tracks there no matter their worth!
In the town of Fair Pluntkin from Debbles protected
Each winter of somethings Who’s grew more dejected.
T’was cynicism which blocked them and left them to rot
Because quazlots were real and Debbles were not!
A persistence of blindness whose lesson's quite cruel
The worst form of idiot thinks he cannot be fooled.
Thus the Who’s of Fair Pluntkin were all led astray
Leaving cartniks’ tracks rusting away where they lay;
While Fair Kesnig and Pennington had taken their outfill
Because they didn’t believe in Debbles of Doubthill.
Come spring summer’s fresh offing or winter’s cold somethings
The town of Fair Pluntkin had run out of dear plumthings,
All Who’s had been duped who lived Kesnig to Kuled
By the Hermit of Nosnix who couldn’t be fooled.

Merry Christmas and Happy Holidays.

How to MLA cite this article:

The Ethical Skeptic, “The Hermit of Nosnix Who Couldn’t be Fooled”; The Ethical Skeptic, WordPress, 16 Dec 2018; Web, https://wp.me/p17q0e-98Z

 

December 16, 2018 | Institutional Mandates, Tradecraft SSkepticism

The Apothegm Makes the Poison

The dose makes the poison. This statement is not a logical truth. To cough up this notorious fur-ball of an apothegm in a serious broadscope discussion concerning toxicology risk informs all concerned about your personal ignorance and desire to deceive – more so than it says anything particular about me. The masters who let loose the dogs of skepticism have found such organic lying to be very effective in asset preservation.

One of the most notorious catch-phrases of pseudo-wisdom the ethical skeptic will encounter from a social skeptic poseur is the apothegm ‘The dose makes the poison’. It is not that this statement is false. The basis of the quip resides in scientific validity, and it is categorically true regarding lethality, yes. However, the statement is not a logical truth.1 Logical truth is the state of syllogism which the utterer is deceitfully wishing for you to infer regarding this football of an apothegm. It is a means of lying through stating something which is only conditionally accurate – hoping that the victim will accept the statement as one which addresses the context of toxicity. Discussions of this ilk are rarely over lethality; most often they pertain to the impact of a toxin on the population, environment or family. If your conversant conflates these two concepts in order to enforce the entailed organic lie, or hands you cartoon LD50 charts comparing glyphosate with table salt, stop talking with them immediately. They are a non-player character. A social skeptic.

As an ethical skeptic, never ever ever conduct your communication under such misrepresentation by locution – people spot this, but will not mention it to you. You will lose credibility, yet not know it, nor understand why in the end. The apothegm is not necessarily true (different from being ‘false’), and that is what disqualifies it from being a logical truth (ethical knowledge). This is of critical path importance to the ethical skeptic. Let’s examine a couple of examples before we look at the entire domain of such a statement’s limited applicability (Exhibits I and II below).

If, for example, I am asked to consume diazinon in my drinking water (we are never ‘asked’, but let’s pretend we live in such an ethical world) because its use increases corn yields 14%, when the US has run a glut of corn production each year for decades now, then the ppm tolerance for diazinon in my water in such a circumstance is ZERO ppm. A median lethal dose measure (LD50) does not apply, because there is no economic benefit to be derived from the risk I undertake. This, though a simple exercise example, is actually how ethical toxicology is done in the big boy world. When I work establishing food and trade markets, this is the type of mechanism I petition to have inserted into the market constraint dynamics and enterprise APIs used by large trade aggregation desks. This is ethics. Everything else is academic – and possibly immoral. I do not care how much you know, or that you use pedophrasty to promote your product by placing pictures of starving children into your ads – if you are lazy/greedy, and that laziness or greed serves to harm others, you are acting under malice and oppression by court definition.

The Puppet Show: Comparing Aggregate Benefit to Individual Risk (while Ignoring Aggregate Risk)

If however I am forced to drink, say, some dosage of diazinon – because involved stockholders inside several companies know my representatives and key regulatory agency members, and were able to get the pesticide pushed through for higher-risk use; and furthermore, these stockholders are now able to buy beachfront vacation homes on St. George Island rather than rent smaller back-lane beach cottages – well, under that stark risk/benefit scenario, I will then drink the toxin I suppose. Their benefit outweighs my risk. Now the astute ethical skeptic will observe that toxin risk is never measured in terms of population descriptives – only individual risk. Individual LD50 risk is weighed against a diffuse set of poorly estimated and confirmed aggregate benefits – the risk is never expressed in terms of aggregate risk, and is never followed up on. In reality the state of ethics in toxicology – per below – is one sad state of affairs.

Social skeptics, as usual, provide no help at all in this matter – ironic, when this is their claimed identity and life goal.

Notice that all the measures regarding toxin risk relate to the individual.2 There are no studies which attach a measured population effect in humans to an introduced toxin. There are studies of the farming community, and there exists some study of environmental impact – but no studies following up with human populations as a group. Not even the devisement of a suitable measure.3 I find that amusing (horrifying), given that the ethical assessment of toxin risk pertains to impacts and measures relating to populations, not individuals. All of the following entries below, two new observations and five previous ones, are cataloged into The Tree of Knowledge Obfuscation: Misrepresentation of Evidence or Data, and apply in this circumstance:

missam singuli

/philosophy : pseudoscience : study design/ : a shortfall in scientific study wherein two factors are evaluated by non-equivalent statistical means. For instance, risk which is evaluated by individual measures, compared to benefit which is evaluated as a function of the whole – at the ignorance of risk as a whole. Conversely, risk being measured as an effect on the whole, while benefit is only evaluated in terms of how it benefits the individual or a single person.

Virtue Telescope

/philosophy : sophistry : deception/ : employment of a theoretical virtue benefit projected inside a domain which is distant, slow moving, far into the future, diffuse or otherwise difficult to measure in terms of both potential and resulting impact, as exculpatory immunity for commission of an immoral act which is close by, obvious, defined and not as difficult to measure. Similar to but converse of an anachronistic fallacy, or judging distant events based on current norms.

And of course a smattering of fallacies and crooked thinking art which we have examined before.

idem existimatis – attempting to obscure the contributing error or risk effect of imprecise estimates or assumptions, through an overt focus on the precision or accuracy of other measure inputs inside a calculation, study or argument.

ignoro eventum – institutionalized pseudoscience wherein a group ignores or fails to conduct follow-up study after the execution of a risk bearing decision. The instance wherein a group declares the science behind a planned action which bears a risk relationship, dependency or precautionary principle, to be settled, in advance of this decision/action being taken. Further then failing to conduct any impact study or meta-analysis to confirm their presupposition as correct. This is not simply pseudoscience, rather it is a criminal action in many circumstances.

phantasiae vectis – the principle outlining that, when a human condition is monitored publicly through the use of one statistic, that statistic will trend more favorable over time, without any real underlying improvement in its related human condition. Unemployment not reflecting true numbers out of work, electricity rates or inflation measures before key democratic elections, crime being summed up by burglaries or gun deaths only, etc.

Yule-Simpson Paradox – a trend which appears in different groups of data can be made to disappear or reverse (see Effect Inversion) when these groups are combined.
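The Yule-Simpson effect lends itself to a minimal numeric sketch. The counts below are the classic illustrative kidney-stone teaching figures – not data from any study discussed here: treatment A wins inside each subgroup, yet loses once the subgroups are pooled.

```python
from fractions import Fraction

# Illustrative counts only (classic kidney-stone teaching example,
# not data from any study discussed here): (successes, trials)
a = {"small": (81, 87),  "large": (192, 263)}   # Treatment A
b = {"small": (234, 270), "large": (55, 80)}    # Treatment B

def rate(successes, trials):
    # Exact success rate as a fraction, avoiding float rounding
    return Fraction(successes, trials)

# Treatment A wins inside each subgroup...
for group in ("small", "large"):
    assert rate(*a[group]) > rate(*b[group])

# ...yet loses once the subgroups are pooled.
a_total = rate(sum(s for s, _ in a.values()), sum(n for _, n in a.values()))
b_total = rate(sum(s for s, _ in b.values()), sum(n for _, n in b.values()))
assert a_total < b_total

print(float(a_total), float(b_total))  # A ~0.78 overall, B ~0.83 overall
```

The inversion arises because the harder (large-stone) cases were disproportionately assigned to treatment A – exactly the kind of aggregation artifact the definition above warns about.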

Elemental Pleading – breaking down the testing of data or a claim into testing of its constituents, in order to remove or filter an effect which can only be derived from the combination of elements being claimed. For instance, to address the claim that doxycycline and EDTA reduce arterial plaque, testing was developed to measure the impact of each item individually, and when no effect was found, the combination was dismissed as well without study, and further/combination testing was deemed to be ‘pseudoscience’.

However, given that somebody out there is benefiting, I will gladly accept a drink containing 2 ppm (parts per million) diazinon over one containing 10 ppm, based upon this necessity of individual risk compared to aggregate benefit. Now diazinon features no Efficacy Curve (EC) of benefit for me to ingest; however, it does exhibit toxicity measures in 240-day rat studies. Certainly studies of value, and I am glad we completed such diligence. The NOAEL (No Observed Adverse Effect Level) of diazinon is set, as a result of such studies, at 0.02 mg/kg-bodyweight per day.4 This would equate to an 8 ounce glass of water per day containing 8.4 ppm or less of the chemical, for my body weight (the median lethal concentration, LCt50, being much higher than this – so below this NOAEL level is considered safe). Thus, in theory, that same glass of water at 10 ppm would prompt observable adverse effects in my physiology. It won’t kill me though, right? That’s great news.
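The arithmetic behind that 8.4 ppm figure can be sketched as follows. The ~100 kg body weight is my assumption, chosen only because it reproduces the article's number from the 0.02 mg/kg-bw/day NOAEL and one 8 ounce (~0.237 L) glass per day; mg/L is treated as equivalent to ppm in water.

```python
def max_daily_concentration_ppm(noael_mg_per_kg_day: float,
                                body_weight_kg: float,
                                daily_intake_liters: float) -> float:
    """Concentration (mg/L, ~ppm in water) at which the given daily
    intake just reaches the NOAEL for the given body weight."""
    allowed_mg_per_day = noael_mg_per_kg_day * body_weight_kg
    return allowed_mg_per_day / daily_intake_liters

# Diazinon NOAEL of 0.02 mg/kg-bw/day, one 8 oz glass (~0.237 L) per day,
# and an assumed ~100 kg body weight (chosen to reproduce the 8.4 ppm figure)
print(round(max_daily_concentration_ppm(0.02, 100, 0.237), 1))  # 8.4
```

Note what the sketch makes obvious: the threshold scales linearly with body weight and inversely with intake, so the same glass of water is 'safe' or not depending entirely on who drinks it and how often.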

Nonetheless, yes, I will choose to drink the lower 2 ppm dose any day. The dose does make the poison, inside this highly constrained conflation of adverse effect and toxicity.

However, to cough up this statement fur-ball at me in a serious debate about food and water contaminants means that you are, first, clueless enough to have highly underestimated your opponent and, second, don’t really understand toxicology or adverse effects all that well. It tells all concerned more about you than it does about me. Yes, this includes the case wherein you hold a PhD. LD50, LCt50, NOAEL and other exculpatory idem existimatis contentions of that ilk are most often cited by lazy science poseurs. These measures do not even begin to bear salience or relevance around the list of 20 different ways in which toxicity can harm our citizens and our family members (Exhibit II below).

No, the dose will not kill me. Lethality, and even Adverse Effects are red herrings. We are discussing toxicity.
The discussion has never been about whether or not the contaminant in the glass of water will hurt me right this moment.

If these stats do not address the questions which our families have intelligently raised about toxins
– then why should our scientists and skeptics not have already raised the same questions?

But Table Salt Had a Higher ‘LD50’ – What Happened?

But does the dose actually make the poison? Is that a logical truth? If your child accidentally ingests some rat poison, such measures are absolutely critical. But for you and millions of others, hold on just a second. Here it is 20 years later – two decades of confidently ingesting NOAEL-safe 0.5 to 2 ppm diazinon glasses of water most every day – and suddenly you’ve gained 100 lbs across 3 years and have had both of your knees replaced because an aggressive form of rheumatoid arthritis has kicked in. Your same-age colleagues at the plant fully understand and cover for you. Your orthopedic surgeon is hesitant to undertake the procedure because she wants you to lose 70 lbs first. She is not sure that you will be able to handle the difficulties involved in the surgery with the extra weight. Your spouse feels like he must have done something wrong. He changes his diet in an effort to help out, but to no avail. IBS and diabetes start to creep up periodically. All at a fairly young age. But but but… the LD50 of table salt was higher though!5 Must have been the table salt, and coffee too. It’s always the coffee.

We have an apothegm just for this type of circumstance as well: ‘Luck of the draw’.

OK, in an effort to be truthful when held to public account, social skeptics will admit that we have enough epidemiological data to know that the table salt and coffee did not cause your long-term-exposure physical ailments after all. They just brought up those red herrings years ago in order to look smarter than you – and because this was what they were told to say. Can you, as an experienced skeptic, now go back and contact the study group which set the rat-240d-NOAEL for diazinon, and say “Hey, we might need to examine this with a bit more scientific rigor and follow-up”? The fact is that I just observed adverse effects from something – and there are only a couple of culpable ‘somethings’ which could be considered, a set which includes diazinon; the least likely candidate of which is ‘luck of the draw’ (pseudo-theory). The fact is that what we really needed were human-30y-NOLTAEL statistics, derived from comprehensive community data to begin with. The sad fact however is that they are rarely if ever compiled. Nobody wants to find out who had the bullet in a one-bullet firing squad.

And herein resides the rub – we don’t think we need to develop human-30y-NOLTAEL because we already have rat-derived LD50, LCt50 and NOAEL data.

To push for further science might endanger the St. George beachfront property. Better to enlist the aid of some compromised-ethics fake experts who are smart-but-dumb, with dark teeth. If they don’t have any qualifications, have them call themselves ‘skeptics’. You can hire them cheap; all you have to do is pay their celebrity leaders a pittance, and they will do anything. Ignorance is asset preserving. The science is settled. (Another deadly apothegm of social skepticism)

In the Real World, Acute Lethal Dose is Rarely the Issue

These ethical dilemmas, along with the ‘our pesticide is less toxic than table salt’ baloney, elicit just one simple example problem with the ‘the dose makes the poison’ apothegm applied as panacea to the entire issue domain inside toxicology. However, even more compounding in risk is the specter that there are at least 17 other toxicity expression vectors which bear a similar incompatibility to the classic ‘LD50 – dose makes the poison’ paradigm. For most toxicity vectors – those we understand much better than our 1920’s-minded skeptics do – the dose does not make the poison. And you are particularly stupid-to-gullible to believe otherwise.

The safety of glyphosate, the active ingredient in the Roundup weedkiller, has been compared to many things over the years, but the table salt comparison stands out as particularly ridiculous. In fact the state of New York took legal action against Monsanto for false advertising for making this very claim. Monsanto agreed to cease and desist from making this claim, but it is still commonly parroted by aggressive supporters of GMOs and chemical company apologists.

Suffice it to say that no one’s going to intentionally ingest enough salt or glyphosate to immediately die from their exposure, and comparing the LD50 values of chemicals that can have serious health harms other than immediate mortality is so misleading as to be irresponsible.

~ Dr. Nathan Donley, Center for Biological Diversity6

The following pages are available for your use, as you see fit – to partly educate the vulnerable public about what they need to know regarding food/water/medicines toxicology. This is not a case of ‘Dunning-Kruger’ – as toxicology’s application inside this context fails the limits test for application of that ‘fallacy’.7

Such matters are your responsibility as well as your right. If you and your family are getting sick for no reason – raise hell about it. They are just gonna have to put up with us.

However, if you are a professional toxicologist/epidemiologist and wish to make comment/input on the graphics below – I will certainly consider improving them with your help. That would be absolutely appreciated.

Exhibit I

Exhibit II

Appendix

epoché vanguards gnosis

——————————————————————————————

How to MLA cite this blog post =>

The Ethical Skeptic, “The Apothegm Makes the Poison”; The Ethical Skeptic, WordPress, 29 Nov 2018; Web, https://wp.me/p17q0e-8UR

November 29, 2018 | Agenda Propaganda, Argument Fallacies

Embargo of The Necessary Alternative is Not Science

Einstein was desperate for a career break. He had a 50/50 shot – and he took it. The necessary alternative he selected, fixed c, was one which had been purposely neglected by science, and yet offered the only viable alternative to standing and celebrated club dogma. Dogma which had, for the most part, gone unchallenged. Developing mechanism for such an alternative is the antithesis of religious activity. Maturing the necessary alternative into hypothesis is the heart and soul of science.

Mr. Einstein You’ll Never Amount to Anything You Lazy Dog

Albert Einstein introduced, in a 1905 scientific paper, the relationship proposed inside the equation E = mc²: the concept that the system energy of a body (E) is equal to the mass (m) of that body times the speed of light squared (c²). That same year he also introduced a scientific paper outlining his theory of special relativity. Most of the development work (observation, intelligence, necessity, hypothesis formulation) entailed in these papers was conducted during his employment as a technical expert – class III (aka clerk) at the Federal Office for Intellectual Property in Bern, Switzerland, colloquially known as the Swiss patent office.1 There, bouncing his ideas off a cubicle-mate (si vis) and former classmate, Michele Angelo Besso, an Italian engineer, Einstein found the time to further explore ideas that had taken hold during his studies at the Swiss Federal Polytechnic School. He had been a fan of his instructor, physicist Heinrich Friedrich Weber – the more notable of his two top-engaged professors at Swiss Federal Polytechnic. Weber had stated two things which made an impression on the budding physicist.2

“Unthinking respect for authority is the enemy of truth.” ~ physicist Heinrich Friedrich Weber

As well, “You are a smart boy, Einstein, a very smart boy. But you have one great fault; you do not let yourself be told anything.” quipped Weber as he scolded Einstein. His mathematics professor, Hermann Minkowski, scoffed to his peers about Einstein, relating that he found Einstein to be a “lazy dog.” In a similar vein, his instructor, physicist Jean Pernet, admonished the C-average (82%, or 4.91 of 6.00 GPA) student: “[I would advise that you change major to] medicine, law or philosophy rather than physics. You can do what you like. I only wish to warn you in your own interest.” Pernet’s assessment was an implication to Einstein that he did not face a bright future, should he continue his career in pursuit of physics. His resulting mild ostracism from science was of such extent that Einstein’s father later had to petition, in an April 1901 letter, for a university to hire Einstein as an instructor’s assistant. His father wrote “…his idea that he has gone off tracks with his career & is now out of touch gets more and more entrenched each day.” Unfortunately for the younger Einstein, his father’s appeal fell upon deaf ears. Or perhaps fortuitously, as Einstein finally found employment at the Swiss patent office in 1902.3

However, it was precisely this penchant for bucking standing authority, which served to produce fruit in Einstein’s eventual physics career. In particular, Einstein’s youthful foible of examining anew the traditions of physical mechanics, combined with perhaps a dose of edginess from being rejected by the institutions of physics, were brought to bear effectively in his re-assessment of absolute time – absolute space Newtonian mechanics.

Einstein was not ‘doubting’ per se – which is not enough in itself. Rather he executed the discipline of going back and looking – proposing an alternative to a ruling dogma based upon hard-nosed critical path induction work, and not through an agency desire to lazily pan an entire realm of developing ideas through abduction (panduction). No, social skeptics, Einstein did not practice your form of authority-enforcing ‘doubt’. Rather, it was the opposite.

He was not doubting, rather executing work under a philosophical value-based principle called necessity (see Ethical Skepticism – Part 5 – The Real Ockham’s Razor). Einstein was by practice, an ethical skeptic.

Einstein was not lazy after all; this was a miscall on the part of his rote-habituated instructors (one common still today). Einstein was a value economist. He applied resources into those channels in which they would provide the greatest beneficial effect. He chose not to waste his time upon repetition, memorization, rote procedure and exercises in compliance. He was the ethical C student – the person I hire before hiring any form of cheating/memorizing/imitating A or B student. And in keeping with such an ethic, Einstein proposed in 1905, 3 years into his fateful exile at the Swiss patent office, several unprecedented ideas which were subsequently experimentally verified in the ensuing years. Those included the physical basis of 3 dimensional contraction, speed and gravitational time dilation, relativistic mass, mass–energy equivalence, a universal speed limit (for matter and energy, but not information or intelligence) and relativity of simultaneity.4 There has never been a time wherein I reflect upon this amazing accomplishment and lack profound wonder over its irony and requital in Einstein’s career.

The Necessary Alternative

But was the particular irony inside this overthrow of Newtonian mechanics all that unexpected or unreasonable? I contend that it was not only needed, but that the cascade of implications leveraged by c-invariant physics was the only pathway left for physics at that time. It was the inevitable, and necessary, alternative. The leading physicists, as a very symptom of their institutionalization, had descended into a singular dogma. That dogma held as its centerpoint the idea that space-time was the fixed reference for all reality. Every physical event which occurred inside our realm hinged around this principle. Einstein, in addressing anew such authority-based thinking, was faced with a finite and small set of alternative ideas which were intrinsically available for consideration. That is to say, the set of ideas included only around 4 primary elements, which could alternately or in combination be assumed as fixed, dependent, or independently variable. Let’s examine the permutation potential of these four ideas: fixed space, fixed time, fixed gravity and/or fixed speed of light. Four elements. The combinations available for such a set are 14, as related by the summation of three combination functions:

C(4,1) + C(4,2) + C(4,3) = 4 + 6 + 4 = 14

What reasoned conjecture offered – given that combinations of 4 or 3 fixed elements were highly unlikely or unstable – was to bound the set of viable alternative considerations to even a lesser set than 14: maybe 6 very logical alternatives at most (the second function, C(4,2), above). However, even more reductively, Einstein would essentially only need to start by selecting from one of the four base choices, as represented by the first combination function above, C(4,1). Thereafter, if he chose correctly, he could proceed onward to address the other 3 factors depending upon where the critical path led. But the first choice was critical to this process. One of the following four had to be chosen, and two were already in deontological doubt in Einstein’s mind.

•  Fixed 3 dimensional space (x, y, z)
•  Fixed time (t)
•  Fixed gravitation relative to mass (g-mass)
•  Fixed speed of light (c)
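The counting above is easy to verify. A short sketch using Python's standard library tallies the C(4,1) + C(4,2) + C(4,3) = 14 candidate sets and enumerates the six two-element pairings from which both the standing dogma and its rival were drawn (the element names are merely labels for the four ideas listed above):

```python
from math import comb
from itertools import combinations

elements = ("space", "time", "gravitation", "light-speed")

# Non-empty selections of 1, 2 or 3 elements to hold fixed:
total = comb(4, 1) + comb(4, 2) + comb(4, 3)
print(total)  # 14, i.e. 4 + 6 + 4

# The C(4,2) = 6 two-element candidates the argument narrows down to;
# (space, time) was the ruling dogma, (gravitation, light-speed) the
# embargoed alternative.
pairs = list(combinations(elements, 2))
for pair in pairs:
    print(pair)
```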

Ultimately then, only two choices existed, if one supposes a maximum of two fixed elements as possible, per below. Indeed this ended up being the plausal-set for Einstein. The necessary alternatives, one of which had been essentially embargoed by the science authorities of the time, were each a combination of two of the above four elements. Another combination of two was currently in force (fixed space and time).

In other words, now we have reduced the suspect set to two murder suspects – Colonel Mustard and Professor Plum, and standing dogma was dictating that only Colonel Mustard could possibly be considered as the murderer. To Einstein this was at worst, an even bet.

This is the reason why we have ethical skepticism. This condition – an oft-repeated one, wherein false skepticism is applied to underpin authority-based denial in an a priori context, in order to enforce one mandatory conclusion at the expense of another or all others – is a situation ripe for deposing. Einstein grasped this. The idea that space and time were fixed references was an enforced dogma on the part of those wishing to strengthen their careers in a social club called physics. Everyone was imitating everyone else, and trying to improve club ranking through such rote activity. The first two-element selection, stemming of course from strong inductive work by Newton and others, was a mechanism of control called an Einfach Mechanism (see The Tower of Wrong), or

Omega Hypothesis HΩ – the answer which has become more important to protect than science itself.

•  Fixed 3 dimensional space (x, y, z)
•  Fixed time (t)

Essentially, Einstein’s most logical alternative was to assume the speed of light as fixed first. By choosing first a fixed reference of the speed of light, Einstein had journeyed down both a necessary, as well as inevitable, hypothesis reduction pathway. It was the other murder suspect in the room, and stood as well as the rebellious Embargo Hypothesis option.

Embargo Hypothesis Hξ – the option which must be forbidden at all costs and before science even begins.

•  Fixed gravitation relative to mass (g-mass)
•  Fixed speed of light (c)

But this Embargo Hypothesis was also the necessary alternative, and Einstein knew this. One can argue both sides of the contention that the ‘embargo’ of these two ideas was one of agency or of mere bias. In this context and for purposes of this example, both agency and bias are to be considered the same embargo principle. In many/most arguments however, they are not the same thing.

The Necessary Alternative

/philosophy : Ockham’s Razor : Necessity/ : an alternative which has become necessary for study under Ockham’s Razor because it is one of a finite, constrained and very small set of alternative ideas intrinsically available to provide explanatory causality or criticality inside a domain of sufficient unknown. This alternative does not necessarily require inductive development, nor proof and can still serve as a placeholder construct, even under a condition of pseudo-theory. In order to mandate its introduction, all that is necessary is a reduction pathway in which mechanism can be developed as a core facet of a viable and testable hypothesis based upon its tenets.

The assertion ‘there is a God’ does not stand as the necessary alternative to the assertion ‘there is no God’. Even though the argument domain constraints are similar, these constructs cannot be developed into mechanism and testable hypothesis. So neither of those statements stands as the necessary alternative. I am sorry, but neither of those statements is one of science. They are Wittgenstein bedeutungslos – meaningless. A proposition or question which resides upon a lack of definition, or a presumed definition which contains no meaning other than in and of its self.

However, in exemplary contrast, the dilemma of whether life originated on Earth (abiogenesis) or off Earth (panspermia) does stand as a set of necessary alternatives. Even though both ideas are in their infancy, they can both ultimately be developed into mechanism and a testing critical path. The third letter of the DNA codon (see Exhibit II below) is one such test of the necessary alternatives, abiogenesis and panspermia. There is actually a third alternative as well, another Embargo Hypothesis (in addition to panspermia) in this case example – that of Intervention theory. But we shall leave that (in actuality also necessary) alternative discussion for another day, as it comes with too much baggage to be of utility inside this particular discourse.

Einstein chose well from the set of two necessary alternatives, as history proved out. But the impetus which drove the paradigm change, from the standing dogma to Einstein’s favored Embargo Hypothesis, might not have been as astounding a happenstance as it might appear at first blush. Einstein chose red when everyone and their teaching assistant was of the awesome insistence that one need choose blue. All the ramifications of a fixed speed of light (and fixed gravitation, relative only to mass) unfolded thereafter.

Einstein was desperate for a break. He had a 50/50 shot – and he took it.

Example of Necessity: Panspermia versus Abiogenesis

An example of this condition – wherein a highly constrained set of alternatives (two in this case) inside a sufficient domain of unknown forces the condition of dual necessity – can be found inside the controversy around the third letter (base) of the DNA codon. A DNA codon is the word inside the sentence of DNA. A codon is a series of 3 nucleotides (XXX of A, C, T or G) which holds a ‘definition’ corresponding to a specific amino acid (protein constituent) function, to be transcribed from the nucleus and decoded by the cell in its process of assembling body tissues. It is an intersection on the map of the organism. Essentially, the null hypothesis stands that the 3rd letter (nucleotide) digit of the codon, despite its complex and apparently systematic, methodical assignment codex, is the result of natural stochastic-derivation chemical happenstance during the first 300 million years of Earth’s existence (not a long time). The idea being that life existed on a 2 letter DNA codon (XX) basis for eons, before a 3 letter (XXX) basis evolved (shown in Exhibit II below). The inductive evidence that such an assignment codex based upon 3 letters derived from 2 is beyond plausibility – given the low probability of its occurrence and the lack of time and influencing mechanism during which that improbability could have happened – supports its also-necessary alternative.
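
The codon-as-word idea can be made concrete with the standard genetic code table. A minimal sketch follows (Python, built from the standard NCBI codon assignments – illustrative only, not derived from this article’s exhibits):

```python
# Standard genetic code (NCBI translation table 1), built compactly.
# Bases cycle T, C, A, G at each codon position; '*' marks a stop codon.
BASES = "TCAG"
AMINOS = ("FFLLSSSSYY**CC*W"   # first base T
          "LLLLPPPPHHQQRRRR"   # first base C
          "IIIMTTTTNNKKSSRR"   # first base A
          "VVVVAAAADDEEGGGG")  # first base G
CODON = {a + b + c: AMINOS[16*i + 4*j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

print(CODON["ATG"])  # 'M' – methionine, the start 'note'
print(CODON["GAA"])  # 'E' – glutamic acid
print(CODON["TAA"])  # '*' – a stop: protein silence
```

All 64 three-letter ‘words’ resolve through this one lookup – the codex under discussion throughout this article.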

In this circumstance, the idea that the DNA codon’s third-digit based codex was not a case of 300 million year fantastical and highly improbable happenstance, but rather existed inside the very first forms of life to evolve (arrive) on Earth, is called panspermia. The necessary alternative panspermia does not involve nor hinge upon the presence of aliens planting DNA on Earth; rather that the 3 letter codon basis was ‘unprecedented by context, complex and exhibiting two additional symmetries (radial and block) on top of that’ at the beginning of life here on Earth, and therefore had to be derived from a source external to the Earth. Note, this is not the same as ‘irreducible complexity’, a weak syllogism employed to counter-argue evolution (not abiogenesis) – rather it is a case of unprecedentable complexity, a much stronger and more deductive argument. It is the necessary alternative to abiogenesis. It is science. Both alternatives are science.

This circumstance elicits the contrathetic impasse, a deliberation wherein a conundrum exists solely because authority is seeking to enforce a single answer at the expense of all others, or forbid one answer at the expense of science itself. The enforced answer is the Omega Hypothesis and the forbidden alternative is the Embargo Hypothesis. And while of course abiogenesis must stand as the null hypothesis (it can be falsified but never really proven) – that does not serve to make it therefore true. Fake skeptics rarely grasp this.

Therefore, the necessary alternative – that the DNA (XXX) codex did not originate on Earth – is supported by the below petition for plurality, comprising five elements of objective inference (A – E below). This systematic codex is one which cannot possibly be influenced (as evolution is) by chemical, charge, handedness, employment, epigenetic or culling factors (we cannot cull XX codex organisms to make the survival group more compatible for speciating into XXX codex organisms). Nor do we possess influences which can serve to evolve the protein-based start and silence-based stop codons. It can only happen by accident or deliberation. This is an Einstein moment.

Omega Hypothesis HΩ – the third letter of the DNA codex evolved as a semi-useless appendage, in a single occurrence, from a 2 letter codex basis – featuring radial symmetry, block assignment symmetry and molecule-complexity-to-2nd-base synchrony – only upon Earth, at a 1.6 x 10^-15 chance (a 1 of 6 chance across a series of 31 pairings, across the potential permutations of (n – 1) proteins which could be assigned), during the first 300 million years of Earth’s existence. Further, the codex is exceptionally optimized for maximizing effect inside proteomic evolution. Then evolution of the codex stopped for an unknown reason, and has never happened again for 3.8 billion years.

Stacking of Entities = 10 stacked critical path elements. Risk = Very High.

Embargo Hypothesis Hξ – the three letter codex basis of the DNA codon pre-existed the origination of life on Earth, arrived here preserved by a mechanism via a natural celestial means, and did not/has not evolved for the most part, save for slight third base degeneracy. Further, the codex is exceptionally optimized for maximizing effect inside proteomic evolution.

Stacking of Entities = 4 stacked critical path elements. Risk = Moderate.

Note: By the terms ‘deliberacy’ and ‘prejudice’ used within this article, I mean the ergodicity which is incumbent within the structure of the codex itself – both how it originated, and what its result was in terms of compatibility with amino acids converting into life. There is no question of ergodicity here. The idea of ‘contrived’, on the other hand, involves a principle called agency. I am not implying agency here in this petition. A system can feature ergodicity, but not necessarily as a result of agency. To contend agency is the essence of intervention hypotheses. To add agency would constitute stacking of entities (a lot of them too – rendering that hypothesis weaker than even abiogenesis), according to Ockham’s Razor (the real one).

The contention that panspermia merely shifts the challenges addressed by abiogenesis ‘off-planet’ is valid; however, those challenges are not salient to the critical path and the incremental question at hand. It is a red herring at this point. With an ethical skeptic now understanding that abiogenesis involves a relatively high-stacked alternative versus panspermia, let’s examine the objective basis for such inference, in addition to this subjective ‘stacking of entities’ skepticism surrounding the comparison.

The Case for Off-Earth Codex Condensation

Did our DNA codex originate its structure and progressions first-and-only upon Earth, or was it inherited from another external mechanism? A first problem exists of course in maintaining the code once it exists. However, upon observation, a more pressing problem exists in establishing just how the code came into being in the first place. Evolved or pre-existed? ‘Pre-existed by what method of origination then?’ one who enforces an Omega Hypothesis may disdainfully pontificate. I do not have to possess an answer to that question in order to legitimize the status of this necessary alternative. To pretend an answer to that question would constitute entity stacking. To block this necessary alternative (a pre-existing codex) however, based upon the rationale that it serves to imply something your club has embargoed or which you do not like personally – even if you have mild inductive support for abiogenesis – is a religion. Given that life most probably already existed in the universe and in our galaxy well before us, it would appear to me that panspermia, the Embargo Hypothesis, is the simplest explanation, and not abiogenesis. However, five sets of more objective inference serve to make this alternative a very strong one, arguably deductive in nature, versus abiogenesis’ relatively paltry battery of evidence.

A.  The earliest-use amino acids and critical functions hold the fewest coding slots, and are exclusively dependent upon only the three letter codon form. Conjecture is made that life first developed upon a 2 letter codon basis and then added a third over time. The problem with this is that our first forms of life use essentially the full array of the 3 letter dependent codex, to wit: Aspartate 2 (XXX), Lysine 2 (XXX), Asparagine 2 (XXX), Stop 3 (XXX), Methionine-Start 1 (XXX), Glutamine 2 (XXX). Glutamic Acid and Aspartic Acid, which synthesize in the absolute earliest forms of thermophiles in particular, would have had to fight for the same 2 digit code, GA – which would have precluded the emergence of even the earliest thermal-vent forms of life under a 2 letter dependent codex (XX). These amino acids and codes were mandatory for the first life under any digit-size context – and should accordingly hold the most two digit slots – yet they do not. As well, in the case where multiple codons are assigned to a single amino acid, the multiple codons are usually related. Even the most remote members of archaea, thermophilic archaea, use not only a full 3 letter codon dependent codex, but as well use proteins which reach well into both the adenine position 2 variants (XAX) and thymidine position 2 variants (XTX) groupings; ostensibly the most late-appearing sets of amino acids (see graphic in C. below and Exhibit III at the end).5
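
The two-letter collision described above can be checked directly against the standard codon assignments. This sketch (standard NCBI code; illustrative, not the article’s exhibits) groups coding codons by their first two letters, and shows that glutamic acid (E) and aspartic acid (D) would indeed contend for the same GA prefix under a two letter codex:

```python
from collections import defaultdict

# Standard genetic code (NCBI table 1); '*' marks a stop codon.
BASES = "TCAG"
AMINOS = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
          "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON = {a + b + c: AMINOS[16*i + 4*j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

# Which amino acids share each two-letter prefix?
by_prefix = defaultdict(set)
for codon, aa in CODON.items():
    if aa != "*":
        by_prefix[codon[:2]].add(aa)

print(sorted(by_prefix["GA"]))  # ['D', 'E'] – Asp and Glu collide on GA
```

A two-letter codex could only carry one ‘definition’ per prefix; any prefix holding two or more amino acids marks a collision of the kind the paragraph describes.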

It is interesting to also note that the three stop codons TAA-TAG-TGA all match into codex boxing with later-appearing/more complex amino acid molecules, as members of the adenine position 2 variants (XAX) group. They box with the more complex and ‘later appearing’ amino acids tyrosine and tryptophan. The stop codes would have needed to be a GG, CC, GC, or at the very least a CT/TC two letter codon basis, in order to support an extended evolutionary period under a two letter codon basis (XX). This was not the situation, as you can see in Exhibit III at the end. This would suggest that the stop codes appeared late-to-last under a classic abiogenetic evolutionary construct. Life would have had to evolve up until thermophilic archaea without stop codes in their current form. Then suddenly, life would have had to adopt a new stop codon basis (and then never make another change again in 3.6 billion years) – changing horses in mid stream. This XX codon previous form of life should be observable in our paleo record. But it is not.

Moreover, the two digit reading of the codex is regarded by much of genomics as a degeneracy – ‘third base degeneracy’ – and not an artifact of evolution.6 Finally, the codon ATA should in a certain number of instances equate to a start code, since it would have an evolutionary two digit legacy – yet it is never used to encode for Methionine; this is incompatible with the idea that methionine once employed a two digit AT code. Likewise, Tyrosine and the non-amino stop codes would have been in conflict over the two digit code TA under the two digit codex. In fact, overall there should exist a relationship between the arrival of an amino acid’s use in Earth life and the number of slots it occupies in the codex – and there is not. Neither to the positive slope, nor the negative.7
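
Both of the specific factual claims here – that ATA never encodes methionine, and that tyrosine shares its TA prefix with two of the stop codes – are checkable against the standard table (a minimal sketch, standard NCBI code):

```python
# Standard genetic code (NCBI table 1); '*' marks a stop codon.
BASES = "TCAG"
AMINOS = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
          "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON = {a + b + c: AMINOS[16*i + 4*j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

print(CODON["ATA"], CODON["ATG"])  # 'I' 'M' – ATA is isoleucine, never a start

# The TA prefix box: tyrosine (Y) and two stops (*) share it.
ta_box = {c: CODON[c] for c in ("TAT", "TAC", "TAA", "TAG")}
print(ta_box)
```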

One can broach the fact that protein reassignments were very possible, and could explain away the apparent midstream introduction of an XXX codon dependency in all observable life. But then one must explain why it never ‘speciated’ again over 3.6 billion years, along with the apparent absence of XX codon life in the paleo record. This chasm in such a construct is critical, and must be accommodated before abiogenesis can be fully developed as a hypothesis. In comparison, panspermia possesses no such critical obstacle (note, this is not a ‘gap’, as it relates to the critical path of the alternative and not merely circumstantial inductive inference).

Codon Radial Symmetry

B.  Evolution of the codex would necessarily have to occur from amino absences rather than positives. Of particular note is the secondary map function of the start and stop codons. Notice that the start of a DNA sentence begins with a specific protein constituent (the polar charge molecule methionine-ATG). The end of a DNA sequence however, consists of no protein coding whatsoever (see TAA, TAG and TGA). In other words the DNA sentence begins with the same note every time, and ends with protein silence. tRNA evolved a way to accommodate the need for a positive, through the employment of proline-CCA during the protein assembly process. This is how a musical score works – it starts with a note, say an A440 tuning one, and ends with the silence dictated by the conductor’s wand. This is deliberacy of an empty set, as opposed to the stochasticity of positive notes – another appearance of the start code methionine could have sufficed as a positive stop code instead. Such a positive stop mechanism would succeed much better inside an evolution context. Why would an absence evolve into a stop code for transfer RNA? It could not, as ‘absence’ contains too much noise; it occurs at points other than simply a stop condition. The problem exists in that there is no way for an organism to survive, adapt, cull or evolve based upon its use of an empty set (protein silence). Mistakes would be amplified under such an environment. Evolution depends intrinsically upon logical positives only (nucleotides, mutations, death) – not empty sets.
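
The ‘sentence’ framing above – a fixed opening note (ATG) and a closing silence (stop) – is how translation is conventionally modeled. A minimal sketch follows (standard NCBI code; the example sequence is invented purely for illustration):

```python
# Standard genetic code (NCBI table 1); '*' marks a stop codon.
BASES = "TCAG"
AMINOS = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
          "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON = {a + b + c: AMINOS[16*i + 4*j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

def translate(seq):
    """Read codons from the first ATG until a stop; the stop adds no residue."""
    start = seq.find("ATG")
    protein = []
    for i in range(start, len(seq) - 2, 3):
        aa = CODON[seq[i:i+3]]
        if aa == "*":          # protein silence – nothing is appended
            break
        protein.append(aa)
    return "".join(protein)

print(translate("ATGGAAAAATAA"))  # 'MEK' – the TAA stop contributes no residue
```

Note how the stop is realized as an absence in the output – the ‘empty set’ behavior the paragraph describes.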

Codon versus Molecule Complexity 64 Slots

C.  The codex features triple-symmetrical assignment, linear robustness and ergodicity, along with a lack of both evolution and deconstructive chaos. This is NOT the same set of conditions as exists inside evolution, even though it may appear as such to a layman. This set of codex assignments features six principal challenges (C., and C. 1, 2a, 2b, 3, and 4 below). Specifically,

• radial assignment symmetry (B. Codon Radial Symmetry chart above),
• thymidine and adenine (XTX, XAX) second base preference for specific chemistries,
• synchrony with molecule complexity (C. Codon versus Molecule Complexity 64 Slots graphic to the right),
• block symmetry (C. Codon Second Base Block Symmetry table below) around the second digit (base),
• ergodicity, despite a lack of chemical feedback proximity or any ability for a codon base to attract a specific molecule chemical profile or moiety, and
• lack of precedent from which to leverage.

These oddities could not ‘evolve’, as they have no basis from which to evolve. The structure and assignment logic of the codex itself precludes the viability of a two base XX codex. Evolution, by definition, is a precedent entity progressing gradually (or sporadically) into another new entity. It thrives upon deconstructive chaos and culling to produce speciation. There was no precedent entity in the case of the DNA stop codon nor its XXX codex. As well, at any time the stop codons could have been adopted under the umbrella of a valid protein, rendering two SuperKingdoms of life extant on Earth – and that should have happened; many times over, and at any time in our early history (in Archaea). Yet it did not.8 An asteroid strike and extinction event would not serve to explain the linearity. Evolution is not linear. We should have a number of DNA stop and start based variants of life available to examine (just as we have with evolution based mechanisms). But we do not. In fact, as you can see in the chart to the right (derived from Exhibit III at the end), there exist four challenges to a purely abiogenetic classic evolution construct:

1. An original symmetry to the assignment of codon hierarchies (codex), such that each quadrant of the assignment chart of 64 slots mirrors the opposing quadrant in an ordinal discipline (see Codon Radial Symmetry charts in B. above).

Codon Second Base Block Symmetry

2. The second character in the codon dictates (see chart in B. Codon Radial Symmetry chart above) what was possible with the third character. In other words

a. all the thymidine position 2 variants (XTX) had only nitrite (NO2) molecules assigned to them (marked in blue in the chart in C. Codon versus Molecule Complexity 64 Slots to the upper right, and in Exhibit III at the end – from where the graph is derived), while the more complex nitrous amino acids were all assigned to more complex oversteps in codex groups (denoted by the # Oversteps line in the same chart).

In addition,

b. all adenine position 2 variants (XAX) were designated for multi-use 3rd character codons, all cytidine position 2 variants (XCX) were designated for single use 3rd character codons, while guanine (XGX) and thymidine (XTX) both were split 50/50 and in the same symmetrical patterning (see Codon Second Base Block Symmetry table to the right).

3.  There exists a solid relationship, methodical in its application, between amino acid molecule complexity and assignment grouping by the second digit of the DNA codon. Second letter codon usages were apportioned to amino acids as they became more and more complex, until T and A had to be used because the naming convention was being exceeded (see chart in C., Codon versus Molecule Complexity 64 Slots above to the right, as well as Exhibit III at the end). After this was completed, one more use of G was required to add 6 slots for arginine, and then the table of 60 amino acid slots was appended by one more, tryptophan (the most complex of all the amino acid molecules – top right of chart in C. Codon versus Molecule Complexity 64 Slots above, or the end slots in Exhibit III at the end), and the 3 stop codes thereafter. Very simple. Very methodical. Much akin to an IT developer’s subroutine log – which matures over the course of discovery inside its master application.

4.  Ergodicity. Prejudice… Order, featuring radial symmetry (B. Codon Radial Symmetry chart), synchrony with molecule complexity (C. Codon versus Molecule Complexity 64 Slots graphic) and block symmetry (C. last image Codon Second Base Block Symmetry table) around the second digit. The problem is that there is no way a natural process could detect the name/features of the base/molecule/sequence as a means to instill such order and symmetry into the codex, three times – much less evolve it in such short order, without speciation, in the first place.

Since the DNA chemistry itself is separated by two chemical critical path interventions, how would the chemistry of thymine for instance (the blue block in Exhibit III below) exclusively attract the nitric acid isomer of each amino acid? And why only the nitric acid isomers with more complex molecule bases? First, the DNA base 2 is nowhere physically near the chemical in question; it is only a LOGICAL association, not a chemical one, so it cannot contain a feedback or association loop. Second, there is no difference chemically between C2H5NO2 and C5H11NO2 in this regard – the NO2 is the active moiety. So there should not have been a synchrony progression (C.3. above), even if there were direct chemical contact between the amino acid and the second base of the codon. So the patterns happen as a result of name only. One would have to know the name of the codon by its second digit (base), or the chemical formula for the amino acid, and employ that higher knowledge to make these assignments.
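
The chemical point being made – that the two quoted formulas share identical nitrogen and oxygen content and differ only in their carbon/hydrogen backbone – can be sketched by parsing them (C2H5NO2 is glycine’s molecular formula, C5H11NO2 is valine’s; the tiny parser below is a minimal illustration, not a chemistry library):

```python
import re

def atom_counts(formula):
    """Parse a simple molecular formula like 'C2H5NO2' into element counts."""
    return {el: int(n or 1)
            for el, n in re.findall(r"([A-Z][a-z]?)(\d*)", formula)}

glycine = atom_counts("C2H5NO2")
valine  = atom_counts("C5H11NO2")
print(glycine, valine)
# Same N and O content; only the C/H backbone differs.
print(glycine["N"] == valine["N"] and glycine["O"] == valine["O"])  # True
```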

Finally, this order/symmetry has not changed since the code was first ‘introduced’, and certainly has not been the product of stochastic arrival – as a sufficiently functional-but-less-orderly code would have evolved many times over (as is the practice of evolution) and been struck into an alternative codex well before (several billion years before) this beautiful symmetry could ever be attained.

We claim that evolution served to produce the codex, yet the codex bears the absolute signs of having had no evolution in its structure. We cannot selectively apply evolution to the codex – it must either feature evolutionary earmarks, or not be an evolved code. The mechanisms of evolution cannot become a special pleading football applied only when we need it, to enforce conformance – because in that case we will only ever find out what we already know. It becomes no better an argument philosophically than ‘God did it’.

D.  Related codons represent related amino acids. For example, a mutation of CTT to ATT (see table in C. above) results in a relatively benign replacement of leucine with isoleucine. So the selection of the CT and AT prefixes between leucine and isoleucine was done early, deliberately and in finality – based upon a rational constraint set (in the example case, two nitrite-suffixed protein molecules) and not eons of trial and error.9 Since the assignment of proteins below is not partitioned based upon any physical characteristic of the involved molecule, there is no mechanism but deliberacy which could dictate a correspondence between codon relationships and amino acid relationships.10
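
The benign-substitution example (CTT → ATT, leucine → isoleucine) can be generalized: enumerate every codon one base change away from CTT and look up the resulting amino acids (standard NCBI code; an illustrative sketch):

```python
# Standard genetic code (NCBI table 1); '*' marks a stop codon.
BASES = "TCAG"
AMINOS = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
          "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON = {a + b + c: AMINOS[16*i + 4*j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

def neighbors(codon):
    """All codons one base substitution away, with their amino acids."""
    out = {}
    for pos in range(3):
        for b in "TCAG":
            if b != codon[pos]:
                alt = codon[:pos] + b + codon[pos+1:]
                out[alt] = CODON[alt]
    return out

print(CODON["CTT"])             # 'L' – leucine
print(neighbors("CTT")["ATT"])  # 'I' – the single-step swap the text describes
```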

E.  A statistically impossible codex, not just improbable. Finally, it is not simply the elegant symmetry of the codex which is perplexing; the A. – D. usage contexts identified above also allow one to infer (deductively?) that the codex – its precedent, provenance and structure – is difficult to impossible to accommodate in even the most contorted construct of abiogenesis. Observe the A-G vs C-T tandem relationship between Lysine and Asparagine for instance. This elegant pattern of discipline repeats through the entire codex. This is the question asked by Eugene V. Koonin and Artem S. Novozhilov at the National Center for Biotechnology Information in Bethesda, Maryland in their study Origin and Evolution of the Genetic Code: The Universal Enigma (see graphic to the right, extracted from that study).11 This serious challenge is near to falsifying in nature, and cannot be dismissed by simple hand waving. Take some time (weeks or months, not just seconds) to examine the DNA codex in Exhibits I thru III below, the three tables and charts in B. and C. above, as well as the study from which the graphic to the right is extracted – and see if you do not agree. This argument does not suffer the vulnerability of ‘creationist’ arguments, so don’t play that memorized card – as Ockham’s Razor has been surpassed for this necessary alternative.
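
The A-G vs C-T tandem between lysine and asparagine sits in the AA-prefix box of the standard table: third-base purines (A, G) give lysine, third-base pyrimidines (T, C) give asparagine. A quick illustrative check (standard NCBI code):

```python
# Standard genetic code (NCBI table 1); '*' marks a stop codon.
BASES = "TCAG"
AMINOS = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
          "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON = {a + b + c: AMINOS[16*i + 4*j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

# The AA-prefix box, keyed by third base.
aa_box = {c: CODON["AA" + c] for c in "TCAG"}
print(aa_box)  # pyrimidine third base (T/C) -> asparagine (N); purine (A/G) -> lysine (K)
```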

The hypothesis that the codex for DNA originated elsewhere bears specificity, definition and testable mechanism.
It bears less stacking, gapping and risk as compared to abiogenesis.
It is science. It is the necessary alternative.

Assuming that life just won the 1 in 1.6 x 10^15 lottery (the Omega Hypothesis) – again, and quickly, near to the first time it even bought a lottery ticket – has become the old-hat fallback explanation inside evolution. One which taxes a skeptic’s tolerance for explanations which begin to sound a lot like pseudo-theory (one idea used to explain every problem at first broach). However this paradox – between what we observe to be the nature of the codex and its incompatibility with abiogenesis – involves an impossibly highly stacked assumption set to attempt to explain away based solely upon an appeal to plenitude error. A fallback similar in employment to the creationist’s appeal to God. The chance that the codex evolved by culled stochastics alone in the first 300 million years of Earth’s existence is remote, from both the perspective of the time involved (it happened very quickly) and statistical unlikelihood – but as well from the perspective that the codex is also ‘an optimal of optimates’ – in other words, it is not only functional, but smart too. Several million merely functional-but-uglier code variants would have sufficed to do the job for evolution. So this begs the question: why did we get a smart/organized codex (see Table II below and the radial codon graphic in B. above) and not simply a sufficiently functional but chaotic one (which would itself also be unlikely)? Many social skeptics wishing to enforce a nihilistic religious view of life miss that we are stacking deliberacy on top of a remote, infinitesimally small chance happenstance. Their habit is to ignore such risk chains, and then point their finger at creationists as being ‘irrational’ as a distraction.

The codex exhibits an unprecedentable, static, patterned format – one employing the empty set as a positive logical entity – exposed to the deconstructive vulnerability which happens to everything else, but which chose not to happen in this one instance. In other words, evolution actually chose to NOT happen in the DNA codex – i.e. deliberacy. The codex, and quod erat demonstrandum, the code itself, came from elsewhere. I do not have to explain the elsewhere, but merely provide the basis for understanding that the code did not originate here. That is the only question on our scientific plate at the moment.

Exhibits I, II and III

Exhibit I below shows the compressed 64 slot utilization and its efficiency and symmetry. 61 slots are coded for proteins and three are not – they are used as stop codes (highlighted in yellow). There are eight synonym based code groups (proteins), and twelve non-synonym code groups (proteins). Note that at any given time, small evolutionary fluctuations overlapping into the stop codes would render the code useless as the basis for life. So, the code had to be frozen from the very start, or never work at all. Either that, or assign tryptophan as a dedicated stop code and mirror to methionine, and make the 5th code group a synonym for Glutamine or Aspartic Acid.

Exhibit I – Tandem Symmetry Table
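
The 61 + 3 slot arithmetic described for Exhibit I can be verified against the standard table (illustrative sketch; the synonym-group bookkeeping is the exhibit’s own, while the counts below are the standard ones):

```python
from collections import Counter

# Standard genetic code (NCBI table 1); '*' marks a stop codon.
BASES = "TCAG"
AMINOS = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
          "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON = {a + b + c: AMINOS[16*i + 4*j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

counts = Counter(CODON.values())
stops  = sorted(c for c, aa in CODON.items() if aa == "*")
print(64 - counts["*"], counts["*"])  # 61 coding slots, 3 stops
print(stops)                          # ['TAA', 'TAG', 'TGA']
```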

Exhibit II below expands upon the breakout of this symmetry by coded protein, chemical characteristics and secondary mapping if applicable.

Exhibit II – Expanded Tandem Symmetry and Mapping

Finally, below we relate the 64 codon slot assignments, along with the coded amino acid, the complexity of that molecule, and its use in thermophilic archaea. Here one can see that even our first forms of life – constrained to environments in which they had the greatest possibility of even occurring – employed the full XXX codex (three letter codon). While it is reasonable to propose alternative conjecture (and indeed plurality exists here) that the chart suggests a two letter basis as the original codex, life’s critical dependency upon the full codon here is very apparent.

Exhibit III – 61 + 3 Stop Codon Assignment versus Molecule Complexity and
Use in Thermophilic Archaea (see chart in C. above)

Recognizing the legitimacy of the necessary alternative – one which was purposely neglected by science, and yet offers the only viable alternative to standing and celebrated club dogma – this is a process of real science. Developing mechanism for such an alternative is the antithesis of religious activity. Maturing the necessary alternative into hypothesis is the heart and soul of science.

Blocking such activity is the charter and role of social skepticism. Holding such obfuscating agency accountable is the function of ethical skepticism.

epoché vanguards gnosis

——————————————————————————————

How to MLA cite this blog post =>

The Ethical Skeptic, “Embargo of The Necessary Alternative is Not Science” The Ethical Skeptic, WordPress, 24 Nov 2018; Web, https://wp.me/p17q0e-8Ob

November 24, 2018 | Posted in Argument Fallacies
