The Ethical Skeptic

Challenging Pseudo-Skepticism, Institutional Propaganda and Cultivated Ignorance

The Warning Indicators of Stacked Provisional Knowledge

Rather than presume, as a capstone upon incredibly risk-ignorant stacks of knowledge, what is true and what is not, the ethical skeptic instead focuses on field observation and the suspension of doubt, belief and provisionally stacked assumption. He is not denying knowledge; rather, he is denying the lie-spinner the raw material he so desperately needs. He is denying the tradecraft of the lie: an Omega Hypothesis. A condition wherein the conclusions themselves become more important than the process of knowledge development. Wherein conformance of belief now becomes a moral imperative.

In the previous blog post we discussed the Riddle of Skepticism and a thing called the Tradecraft of the Lie. Inside this precept we are made sensitive to the role which machinated doubt, belief and ignorance of provisionally stacked risk play inside grand fantasies posed as representing (via pluralistic ignorance) accepted scientific thinking. Several warning signs can be monitored to watch for such a condition, wherein social forces are seeking to promote an idea (the Omega Hypothesis) in such a fashion as to block further study and end the scientific discourse. In the process of doing so, these same forces will speak often about ‘evidence’, ‘study’ and ‘science’. Of course evidence, study and science are the foundation of our knowledge development process. But simply because one proclaims such words does not mean that the proclaimant actually understands or represents science.

The Tower Which Cannot Be Touched – At All Costs

Below I have employed the analogy of a Jenga blocks game as illustrative of the principles comparing ideal science, the reality of fake knowledge posing as science, and how its proponents undertake pseudoscience (a pretend method/action – and NOT a subject) in an effort to block competing ideas. A shaky tower cannot be touched, at all costs. Therefore any method of obfuscating competing ideas becomes part of the ‘stack’ of provision afforded the Omega Hypothesis. The job of fake skepticism is to ensure that no competing idea nor unauthorized entity ever touches the shaky tower of blocks. As well, to ensure that the resource and obfuscation gaps inside the Embargo Hypothesis – the hypothesis which gets them angry – are never addressed by science; and that the question of the Embargo Hypothesis can never be raised in serious scientific discussion, at cost of severe career penalty.

Verisimilitude Versus Field Observation

This principle is underpinned by key Karl Popper philosophy as outlined inside The Stanford Encyclopedia of Philosophy:¹

In the view of many social scientists, the more probable a theory is, the better it is, and if we have to choose between two theories which are equally strong in terms of their explanatory power, and differ only in that one is probable and the other is improbable, then we should choose the former. Popper rejects this. Science, or to be precise, the working scientist, is interested, in Popper’s view, in theories with a high informative content, because such theories possess a high predictive power and are consequently highly testable. But if this is true, Popper argues, then, paradoxical as it may sound, the more improbable a theory is the better it is scientifically, because the probability and informative content of a theory vary inversely—the higher the informative content of a theory the lower will be its probability, for the more information a statement contains, the greater will be the number of ways in which it may turn out to be false. Thus the statements which are of special interest to the scientist are those with a high informative content and (consequentially) a low probability, which nevertheless come close to the truth.¹

Provisional Knowledge and Risk

In contrast, you will note that it is the job of the provisional knowledge proponent, to seek out stacks of self-defined most probable conclusion. This is why basement and cubicle skepticism gets swallowed so easily by its sycophancy. It involves information which simultaneously ignores the contribution of increasing risk in plurality, and sidesteps the requirements of a coherent and sound argument, bearing a sufficient level of integrity in logical calculus. Most of our questionable skeptical ‘beliefs’ today (such as nihilism) fail in regard to these two duties, regardless of how probable the evidence makes them.

   The Specter of Ignored Increasing Risk

1. Is not evaluated in terms of increasing risk-chain dependency
2. Is not informative because it is probable; nor tolerates competing falsification work to be undertaken
3. Cannot be assailed because of its probable superiority and critical basis for other arguments.

The key here being that as a stack of favored knowledge grows increasingly higher, and ever more extrapolations are added over time (and they always are), ever more powerful become the three standing risk features above. So powerful in fact, that in order for the argument to survive, the entailed conclusions need become more important than the process of knowledge development itself. They must become doctrine, as a vaccination against future science which might threaten their status. This is how an Omega Hypothesis works – it must be protected at all costs, in order to not upset its multiple and tall stacks of provision. This is the job of Social Skepticism: to protect these stacks – the money, power, tenure and religious interweaving from which multiple arguments extend and thrive. Let’s examine a study abstract below from a proponent of what they call ‘evidentialism’. Ethical skepticism in contrast regards evidence as only one of the evaluation factors of a good argument; ranking, in order of importance:

   Argument Strength

1. Coherency – the conjecture and its underpinning definitions actually form a resolvable scientific sentence
2. Soundness – the premises and assumptions bear merit and accuracy in underpinning the argument
3. Formal Theory – the logical calculus (flow) of the argument and the arrangement of its mechanisms are critically dependent and sequitur
4. Inductive Strength – the argument bears consilience and has fairly addressed its relevancy and risk chain dynamics
5. Circumstantial Strength (the ‘evidence’) – the argument bears a robust array of inductive or deductive evidences
6. Integrity of Form/Cogency – the argument filters agency and mitigates bias and is expressed in a neutral fashion bearing scientific integrity.

The above list of 6 components of argument is how a modus ponens or tollens builds its cache of reason. Not a list of ‘facts’ tossed out as evidence, even if they do happen to inductively support a somewhat probable conclusion. Under an assumption of ‘probability’, one can pass off sound arguments (bearing plurality merit certainly) as ‘proved’ and ‘accepted’ arguments, skipping past critical elements 3 – 6 in the list above – and bereft of understanding of the three risk elements listed earlier. This is the trick of fake skepticism. Such reasoning is foolishness, taught by people who do not understand what is going on inside science, philosophy and skepticism.
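The compounding risk which the essay attributes to stacked provisional knowledge can be illustrated with a back-of-envelope sketch. This is my illustration, not the author's, and the 95% per-block figure is an assumed number purely for demonstration: if each provisional assumption ('block') in a stack is independently likely to be sound, the joint soundness of the whole stack decays exponentially with its height.

```python
# Sketch (assumed model, not from the essay): joint soundness of a stack
# of provisional assumptions, each independently sound with probability
# p_block, decays exponentially as the stack grows taller.

def stack_soundness(p_block: float, height: int) -> float:
    """Probability that every block in a provisional stack holds,
    assuming independent blocks each sound with probability p_block."""
    return p_block ** height

# Even at a generous 95% soundness per block, a 20-block stack is more
# likely wrong somewhere than right everywhere.
for height in (1, 5, 10, 20):
    print(height, round(stack_soundness(0.95, height), 3))
```

Under this toy model a single block at 95% is a reasonable bet, but a 20-block stack falls below even odds of being sound throughout – which is the essay's point about why extrapolations piled atop extrapolations demand doctrine rather than testing in order to survive.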

The problem with the equivocal word ‘evidence’ is that it is a permissive argument in itself,
which admits any unmerited form of casuistry or trivia to masquerade as epistemology at the table of science.

Even worse, is when they begin to insist that adopting such unsound conjecture is a moral obligation. Yes, you heard me – if I produce one small iota of evidence to support my a priori belief, you now bear a moral obligation to believe it as well. Look at the bad things which happen to good people, so therefore only this realm can exist. The world is getting hotter, therefore we must elect progressives and introduce violence into the political process. Eminent garbage. Eventually yes, jail sentences will be introduced for those who do not hold their dissent quietly. This is coming folks. Get ready.

What the author has done in the example of such a reasoning pathway below, is to execute a process of panduction – one study which disproves everything which stands in objection to his sophistry. An apologetic, to be sure, but also one of an exceptionally awesome insistence. This is not a religious pitch which is optional for you, as most religious pitches go. The author makes it clear instead that believing like he does is mandatory. Poetically, his very paper fails the six elements of a strong argument above and ignores the elements of entailed risk. He executes one fell swoop of invalid inference. This is illegitimate science. Notice the 7 things he supposedly ‘disproves’ in his inverse negation rush to defend something which bears no evidence of its own right. He conflates evidence with ‘proof’ and uses the word ‘or’ as a permissive artifice which opens the door to just about any datum which can be whipped up or has some brief funding behind it. This is deception and philosophical incompetence. It is oppression in a legal sense. To wit:

‘Evidentialism’ is the conventional name (given mainly by its opponents) for the view that there is a moral duty to proportion one’s beliefs to evidence, proof or other epistemic justifications for belief. This essay defends evidentialism against objections based on the alleged involuntariness of belief, on the claim that evidentialism assumes a doubtful epistemology, that epistemically unsupported beliefs can be beneficial, that there are significant classes of exceptions to the evidentialist principle, and other shabby evasions and alibis (as I take them to be) for disregarding the duty to believe according to the evidence. Evidentialism is also supported by arguments based on both self-regarding and other-regarding considerations. ~ Allen Wood, The Duty to Believe According to the Evidence, International Journal for the Philosophy of Religion; Feb 2008, Vol 63.

This ‘study’ is highlighted by social skeptics wishing to enforce iffy beliefs upon those who do not share them with the oppression-minded ‘skeptic’. The problem which is introduced by such flawed thinking, is highlighted by a principle of ethical skepticism called the acatalepsia fallacy:

acatalepsia Fallacy

/philosophy : casuistry : pseudoscience/ : a flaw in critical path logic wherein one appeals to the Pyrrhonistic Skepticism principle that no knowledge can ever be entirely certain – and twists it into the implication that therefore, knowledge is ascertained by the mere establishment of some form of ‘probability’. Moreover, that therefore, when a probability is established, no matter how plausible, slight or scant in representation of the domain of information it might constitute, it is therefore now accepted truth.  Because all knowledge is only ‘probable’ knowledge, all one has to do is spin an apparent probability, and one has ascertained accepted knowledge. Very similar in logic to the Occam’s Razor aphorism citing that the ‘simplest explanation’ is the correct explanation.

This can be seen as well inside the habitual conflation of verisimilitude and consilience inside the Sskeptic community – two differing approaches to the establishment of hypothesis gravitas inside the scientific community, purposed for two different situations of inference. While consilience is not finished science, rather an important step along the way in developing a hypothesis or understanding, sadly verisimilitude is often conveniently and non-expertly regarded as finished science by fake skeptics. This is the reason why doubt, belief and stacks of provisional knowledge are eschewed by the ethical skeptic.

Sculptured Narrative

/philosophy : pseudoscience : method/ : a social declaration which fits a predetermined agenda, purported to be of ‘weight of evidence’ and science in origin. However, in reality it stems more from the removal or ignoring of the majority or plurality of available or ascertainable evidence, in order to sculpt a conclusion which was sought before research ever began (see Wittgenstein sinnlos Skulptur Mechanism). Conducting science by dwelling only in the statistical and meta-analytical domains while excising all data which does not fit the social narrative of funding entities, large corporations or sskeptic organizations. Refusing to conduct direct studies, publishing studies which contain an inversion effect, and filtering countermanding studies out by attacking journals and authors, and by ignoring large bodies of evidence, consilience or falsification opportunity.

Verisimilitude (Unity of Knowledge Error)

The Omega Hypothesis:  A provisionally stacked basis supporting an apparent simple reality

/philosophy : pseudoscience : method/ : an argument establishing a screen of partially correct information employed to create an illusion of scientifically accurate conclusion. The acceptance of specific assumptions and principles, crafted in such a fashion as to lead to a therefore simple resulting explanatory conclusion.

Consilience

A dynamic enhancement of measure, observation and analysis increasingly promoting a single coherent idea

/philosophy : science : method : induction/ : is the nature or characteristic of an argument wherein its underpinning premises, data, multiple associated disciplines, avenues of research or predicates provide for independent but mutual reinforcement of its conclusion.

The Embargo Hypothesis (Hξ)

The hypothesis which must be dismissed without science because it threatens simplicity and verisimilitude

/philosophy : pseudoscience : method/ : a disfavored hypothesis which will never be afforded access to science or the scientific method no matter what level of consilience is attained. An idea which threatens to expose the risk linkages inside of or falsify a stack of protected provisional knowledge which has achieved an importance greater than science itself: an Omega Hypothesis.

Consilience is not finished science, nor does consilience stand as consensus. You will find fake skeptics conflating and twisting the two terms in order to support provisional knowledge they wish to enforce. However consilience does stand as a principal threat to those wishing to protect an agenda of verisimilitude. The Embargo Hypothesis, on the other hand, is typically an elegant, robust and cogent yet ignored theory. One which threatens power, tenure and money. It will never see the light of a scientific day, no matter how much consilience is developed.

Several Key Warning Signs to Look For Indicating an Omega Hypothesis at Play

A.  Mixing up the steps of or using only a portion of the scientific method

B.  Becoming irritated at calls for further competing study

C.  Over-using the term ‘settled science’ or ‘evidence’ in pluralistic debate

D.  Applying social pressure and club conclusions – feigned as ‘promoting scientific literacy’

E.  Liberal use of the prefix ‘Anti’

F.  Identifying enemies and pigeon-hole bifurcating an argument

G.  Becoming threatened by specific alternatives – calling them magic, pseudoscience, conspiracy or irrational

H.  Relying on scant, outdated or Big-Data only study

I.  Employing celebrity, journalism, funding, influence or authority (see Kilkenny’s Law) to intimidate

J.  Claiming their stack of provisional argument results from a principle of doubt and/or skepticism

K.  A complete ignorance of favored hypothesis limitations and risk.

Several Filtering Techniques Employed to Block The Embargo Hypothesis

And finally, whenever you observe the practices cited above, at play inside the methods of science employed to craft a stack of provisional knowledge and/or enforce an embargo of an eschewed but competing alternative – watch for the following filtering methods. A short definition is provided below with each.

Amplanecdote – so many observations are discarded as anecdote, that an entire science can be assembled around them

Filbert’s Law – relying solely upon single big-data, large domain or arm’s length inexpert meta-analysis to underpin a conclusion

Correlation Dismissal – assuming that all correlation is invalid and instead demanding proof as a first step in science

Effect Inversion – when the reverse question is asked inside an apparently non-significant signal data set, the opposite effect (‘curative’) shows up as a statistically significant signal

Procrustean Solution – further and further study is either modified or thrown out in order to not conflict with the current paradigm

Forward Problem Blindness – the reverse question is never asked or only predictive science is employed and falsification remains unused, despite its availability

Ignoro Eventum – failure to conduct follow-on/impact study or observe an impacted population after a major environmental change has been implemented

Ascertainment Bias – a form of inclusion or exclusion criteria error where the mechanism of sampling, specimen selection, data screening or sub-population selection is inherently flawed

MiHoDeAL Filtering – too often dispositioning disliked observations as Misidentification, Hoaxes, Delusions, Anecdotes and Lies

Law of Static Privation – does the provisionally stacked knowledge have a track record of improving further knowledge or alleviating suffering? – if not, it is a provisional stack

Existential Fallacy of Data – the implication or contention that there is an absence of observation or data supporting an idea, when in fact no observational study at all, or of any serious import has been conducted by science on the topic at hand

Shevel’s Inconsistency – one simultaneously contends that science has shown a research subject to be invalid, yet at the same time chooses to designate any research into that subject as constituting pseudoscience

Manipulative Rational Ignorance – an arguer contends rational ignorance applies inside an argument, or the ignoring of a pathway of science because the cost or effort entailed is too high versus the results or lack thereof to be obtained

Furtive Fallacy – undesired data and observations are asserted to have been caused by the malfeasance of researchers or laymen

Fallacy of Relative Privation – science is only the property of scientists; dismissing an avenue of research due to its waste of scientists’ time and focus

Semmelweis Reflex – the tendency to reject new evidence that contradicts one’s held paradigm

Furtive Confidence Fallacy – refusal to estimate, grasp or apply principles of statistical confidence to collected data. The act of prematurely declaring or doggedly denying a multiplicity of anecdote to be equal to data

Muta-Analysis/Studium a Studia – arm’s length, big data, studies-of-studies conducted by analysts with three years’ experience or less, presided over by money-influenced directors

Dismissible Margin Fallacy – ensuring that the dissenting real experts and outlier data in a study field are of sufficiently low percentage that they can be ignored (typically cited as <5%)

Consilience Evasion – the refusal to look at a body of growing consilience in an effort to deny a disliked alternative any access to science

Hume’s Razor Error – the false presumption that a seemingly miraculous explanation must be false if any alternative explanation provided is less miraculous

Sponsorship Bias – rejection of an entire methodological basis of a scientific argument and all its underpinning data and experimental history simply because one can point to a bad personality involved in the subject

Regressive Bias – a certain state of mind wherein perceived high likelihoods are overestimated while perceived low likelihoods are underestimated

Observer Expectancy Effect – when a researcher expects a given result and therefore unconsciously manipulates an experiment or scientific method or misinterprets data in order to find that expected result

False Stickiness/Consensus – the false belief that, or willingness to accept the claim that, scientists are all in agreement on a given subject

Reactive Dissonance – when faced with a challenging observation, study or experiment outcome, to immediately set aside rational data collection, hypothesis reduction and scientific method protocols in favor of crafting a means to dismiss the observation

Unity of Knowledge Error – to conflate and promote consilience as consensus, in absence of having diligently falsified or even studied any competing hypothesis

All of these may be found in the Glossary or Tree of Knowledge Obfuscation Pages of this blog.

epoché vanguards gnosis


¹  Thornton, Stephen, “Karl Popper”, The Stanford Encyclopedia of Philosophy (Winter 2015 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/win2015/entries/popper/>.

August 24, 2016 | Agenda Propaganda, Argument Fallacies, Social Disdain

Corber’s Burden of Skepticism and The Omega Hypothesis

The Omega Hypothesis is the argument which is foisted to end all argument, period.
Social Skeptics work to defend this set of beliefs through several means. First, they codify these beliefs into a partly unacknowledged but comprehensively protected set. Then, through application of an inverse negation fallacy, and by conflating the ethics of skepticism with corrupted methods of cynicism, they establish the preeminence of their favored beliefs, without or by skirting the rigor of science. All this stems from a principle of parsimony called Corber’s Burden. When one makes an authoritative claim as to possessing knowledge of a complete set of that which is incorrect, one must be 100% correct (or at least appear to be so).

Who needs further science and hypothesis testing when you have skepticism? Further study of anything which falls outside of The Omega Hypothesis is pseudoscience. Why study it when we already have the answer through the magic of critical thinking and rationality? Such is the rationalization of the Social Skeptic. The Social Skeptic will then appoint their debunking resume with ample sets of Stooge Posed examples, case anecdotes of ridiculous debunking wherein their ‘critical thinking’ skills resulted in the correct answer. Jesus on a piece of toast, mermaids, swallowing whole bottles of sugar pills, statues drinking milk. An erstwhile skeptic Christmas tree adorned with such a profusion of correct ornaments that it will be difficult to impossible to locate any incorrect ones. They will imply that the entire suite of ideas they have disproved constitutes 100% bunk. But they need be cautious in such a claim. For god knows what the implications of being wrong are – when your job is to instruct everyone as to what is indeed, incorrect. This is related inside a tenet of parsimony, under Ockham’s Razor, called the Truzzi Fallacy or Burden of Proof. Making a claim to falseness is the same as making a claim. This applies as well to situations where the claimant implies falsity through various non-evidence avenues – or by attempting to shoot down or plausibly deny all alternative ideas.

When one makes or implies a claim to falseness, one assumes the burden of proof.¹

In ancient Biblical times, the test or ‘burden of proof’ of a prophet was that they need be correct in their prognostications 100% of the time.² Thankfully we no longer rely on prophets to tell us a couple of things about the future. Now we rely upon Social Skeptics to instruct us as to ALL things about the present. The nature of tendering answers in lieu of science (a false form of skepticism) demands that the purveyor of such correct conclusive thought necessarily be correct in 100% of their claims.

For in those cases where you are right – you serve to simply re-enforce the prevailing dogma – thereby adding no real value.

And if, in even one case, you are wrong – even on 1% of the ‘pseudoscience’ you squelch without any research or real effort – your entire life’s contribution has been a net disservice to humanity.

This reality is borne out in a principle called Corber’s Burden.

Corber’s Burden

a. When one tenders an authoritative claim as to what is incorrect – one must be perfectly correct.

b. When a person or organization claims to be an authority on all that is bunk, their credibility decays in inverse exponential proportion to the number of subjects in which authority is claimed.

c. A sufficiently large or comprehensive set of claims to conclusive evidence in denial, is indistinguishable from an appeal to authority.

/philosophy : burden of proof : pretense/ : The mantle of ethics undertaken when one claims the role of representing conclusive scientific truth, ascertained by means other than science, such as ‘rational thinking,’ ‘critical thinking,’ ‘common sense,’ or skeptical doubt. An authoritative claim or implication as to possessing knowledge of a complete set of that which is incorrect. The nature of such a claim to authority on one’s part demands that the skeptic who assumes such a role be 100% correct. If however, one cannot be assured of being 100% correct, then one must tender the similitude of such.
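Clauses (a) and (b) above can be given a hedged numerical reading. This is my sketch, not the author's; the decay constant and per-claim accuracy figures are assumed for illustration. It models credibility decaying inverse-exponentially with the number of subjects over which blanket debunking authority is claimed, alongside the probability of actually achieving the demanded 100% correctness across many independent pronouncements.

```python
import math

# Toy models (assumed parameters, not from the essay) of Corber's Burden:
# clause (b) says credibility decays in inverse exponential proportion to
# the number of subjects n over which authority is claimed; clause (a)
# demands perfect correctness, whose probability collapses with n.

def credibility_decay(n_subjects: int, k: float = 0.1) -> float:
    """Assumed model: credibility falls off as exp(-k * n)."""
    return math.exp(-k * n_subjects)

def prob_all_correct(p: float, n_claims: int) -> float:
    """Chance of the required 100% correctness across n independent
    pronouncements, each correct with probability p."""
    return p ** n_claims

# Claiming authority over 30 subjects, at an assumed decay constant:
print(round(credibility_decay(30), 3))
# Even a 98%-accurate debunker, across 100 dismissals:
print(round(prob_all_correct(0.98, 100), 3))
```

Under either toy model the point survives the arbitrary parameters: a sufficiently broad portfolio of conclusive dismissals makes being right about all of them vanishingly unlikely, which is precisely why the similitude of correctness must be tendered instead.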

Corber’s Hypocrisy

/philosophy : pseudoscience : self misrepresentation/ : When a skeptic who dismisses a large set of specific subjects and who realizes that under Corber’s Burden they must be 100% correct in such a role – speaks often about ‘following the evidence’ or that they ‘withhold conclusion’ in a state of neutrality over such subjects – when indeed such claims of behavior are not the case at all in their habit or practice.

An example of this can be found in atheist activism. If one purports to be able to instruct others under the certainty that there is no god, and that the spirit realm and extra-material consciousness all do not exist – or further, that all these things have been ‘disproved by science’ – one ethically in this role is under the burden of needing to be 100% correct. In other words you had better damn well be right – or you have committed a great harm to those you instructed. This burden applies to the greater domain of disdained topics as well. The likelihood of one harming others increases as one becomes more and more boastful as to those subjects in which one claims authority, whether in promotion or denial.

The Furtive and Mandatory Hypothesis

Corber’s Burden/Hypocrisy introduces an additional form of pseudoscience which fails the Popper Demarcation of science versus non-science. One which can be found in the practices regarding the employment of an invalid, hidden, but also mandatory null hypothesis, HΩ. The Omega Hypothesis is hidden precisely because of Corber’s Burden. The creation and unmerited protection of the Omega Hypothesis constitutes a form of hypoepistemology which is spun through practices of Inverse Negation Fallacy, and corruption of the standards and methods of science. It is an embodiment and method of ensuring that what we believe in Social Skepticism, is at any given time, regarded as 100% correct – in accordance with Corber’s Burden. Through these practices of social epistemology, an apparent coherence can be spun around a particular view of a subject, and protection by the corrupted institutions of science afforded until such time as a Kuhn Paradigm Shift is able to be precipitated. Sadly, this often only occurs upon the death of the key social epistemologists involved.

Omega Hypothesis (HΩ)

The argument which is foisted to end all argument, period. A conclusion which has become more important to protect, than the integrity of science itself.

/philosophy : pseudoscience : social epistemology : apparent coherency/ : the argument which is foisted to end all argument, period. A conclusion promoted under such an insistent guise of virtue or importance, that protecting it has become imperative over even the integrity of science itself. An invalid null hypothesis or a preferred idea inside a social epistemology. A hypothesis which is defined to end deliberation without due scientific rigor, alternative study consensus or is afforded unmerited protection or assignment as the null. The surreptitiously held and promoted idea or the hypothesis protected by an Inverse Negation Fallacy. Often one which is promoted as true by default, with the knowledge in mind that falsification will be very hard or next to impossible to achieve.

1.  The (Wonka) Golden Ticket – Have we ever really tested the predictive strength of this idea standalone, or evaluated its antithetical ideas for falsification? Does an argument proponent constantly insist on a ‘burden of proof’ upon any contrasting idea, a burden that they never attained for their argument in the first place? An answer they fallaciously imply is the scientific null hypothesis; ‘true’ until proved otherwise?

Einfach Mechanism – an idea which is not yet mature under the tests of valid hypothesis, yet is installed as the null hypothesis or best explanation regardless. An explanation, theory or idea which sounds scientific, yet resolves a contention through bypassing the scientific method, then moreover is installed as truth thereafter solely by means of pluralistic ignorance around the idea itself. Pseudo-theory which is not fully tested at its inception, nor is ever held to account thereafter. An idea which is not vetted by the rigor of falsification, predictive consilience nor mathematical derivation, rather is simply considered such a strong, or Occam’s Razor (sic) stemming-from-simplicity idea that the issue is closed as finished science or philosophy from its proposition and acceptance onward. A pseudo-theory or false hypothesis which is granted status as the default null hypothesis or as posing the ‘best explanation’, without having to pass the rigors with which its competing alternatives are burdened. The Einfach mechanism is often accompanied by social rejection of competing and necessary alternative hypotheses, which are forbidden study. Moreover, the Einfach hypothesis must be regarded by the scientific community as ‘true’ until proved otherwise. An einfach mechanism may or may not be existentially true.

2.  Cheater’s Hypothesis – Does the hypothesis or argument couch a number of imprecise terms or predicate concepts? Is it mentioned often by journalists or other people wishing to appear impartial and comprehensive? Is the argument easily falsified through a few minutes of research, yet seems to be mentioned in every subject setting anyway?

Imposterlösung Mechanism – the cheater’s answer. A disproved, incoherent or ridiculous contention, or one which fails the tests to qualify as a real hypothesis, which is assumed as a potential hypothesis anyway simply because it sounds good or is packaged for public consumption. These alternatives pass muster with the general public, but are easily falsified after mere minutes of real research. Employing the trick of pretending that an argument domain which does not bear coherency nor soundness – somehow (in violation of science and logic) falsely merits assignment as a ‘hypothesis’. Despite this, most people hold them in mind simply because of their repetition. This fake hypothesis circumstance is common inside an argument which is unduly influenced by agency. They are often padded into skeptical analyses, to feign an attempt at appearing to be comprehensive, balanced, or ‘considering all the alternatives’.

Ad hoc/Pseudo-Theory – cannot be fully falsified nor studied, can probably never be addressed, and can be proposed in almost any circumstance of mystery. They fail in regard to the six tests of what constitutes a real hypothesis. Yet they persist anyway. These ideas will be thrown out for decades. They can always be thrown out. They will always be thrown out.

3.  Omega Hypothesis (HΩ) – Is the idea so important or virtuous, that it now stands more important than the methods of science, or than science itself? Does the idea leave a trail of dead competent professional bodies behind it?

Höchste Mechanism – when a position or practice, purported to be of scientific basis, is elevated to such importance or virtue that removing the rights of professionals and citizens to dissent, speak, organize or disagree (among other rights) is justified in order to protect the position or the practice inside society.

Constructive Ignorance (Lemming Weisheit or Lemming Doctrine) – a process related to the Lindy Effect and pluralistic ignorance, wherein discipline researchers are rewarded for being productive rather than right, for building ever upward instead of checking the foundations of their research, for promoting doctrine rather than challenging it. These incentives allow weak confirming studies to be published and untested ideas to proliferate as truth. And once enough critical mass has been achieved, they create a collective perception of strength or consensus.

4.  Embargo Hypothesis (Hξ) – was the science terminated years ago, in the midst of large-impact questions of a critical nature which still remain unanswered? Is such research now considered ‘anti-science’ or ‘pseudoscience’? Is there enormous social pressure to not even ask questions inside the subject? Is mocking and derision high – curiously in excess of what the subject should merit?

Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology which is heading in an undesired direction, artificially declaring under a condition of praedicate evidentia, the science as ‘settled’ and all opposing ideas, anti-science, credulity and pseudoscience.

Poison Pill Hypothesis – the instance wherein skeptics or agency work hard to promote lob & slam condemnation of particular ideas. A construct obsession target used to distract or attract attack-minded skeptics into a contrathetic impasse or argument. The reason this is done is not the confusion or clarity it provides, rather the disincentive which patrolling skeptics place on the shoulders of the genuine skilled researcher. These forbidden alternatives (often ‘paranormal’ or ‘pseudoscience’ or ‘conspiracy theory’ buckets) may be ridiculous or indeed ad hoc themselves – but the reason they are raised is to act as a warning to talented researchers that ‘you might be tagged as supporting one of these crazy ideas’ if you step out of line and do not visibly support the Omega Hypothesis. A great example is the skeptic community tagging of anyone who considers the idea that the Khufu pyramid at Giza might have not been built by King Khufu in 2450 BCE, as therefore now supporting conspiracy theories or aliens as the builders – and moreover, as being racist against Arabs, who are now the genetic group which occupies modern Egypt.

5.  Evidence Sculpting – has more evidence been culled from the field of consideration for this idea, than has been retained? Has the evidence been sculpted to fit the idea, rather than the converse?

Skulptur Mechanism – the pseudoscientific method of treating evidence as a work of sculpture. Methodical inverse negation techniques employed to dismiss data, block research, obfuscate science and constrain ideas such that what remains is the conclusion one sought in the first place. A common tactic of those who boast of all their thoughts being ‘evidence based’. The tendency to view a logical razor as a device which is employed to ‘slice off’ unwanted data (evidence sculpting tool), rather than as a cutting tool (pharmacist’s cutting and partitioning razor) which divides philosophically valid and relevant constructs from their converse.

Also, the instance common in media wherein so-called ‘fact-based’ media sites tell 100% truth about 50% of the relevant story. This is the same as issuing 50% misinformation or disinformation.

6.  Lindy-Ignorance Vortex – do those who enforce or imply a conforming idea or view, seem to possess a deep emotional investment in ensuring that no broach of subject is allowed regarding any thoughts or research around an opposing idea or specific ideas or avenues of research they disfavor? Do they easily and habitually imply that their favored conclusions are the prevailing opinion of scientists? Is there an urgency to reach or sustain this conclusion by means of short-cut words like ‘evidence’ and ‘fact’? If such disfavored ideas are considered for research or are broached, is extreme disdain, along with social and media derision, called for?

Verdrängung Mechanism – the level of control and idea displacement achieved through skillful employment of the duality between pluralistic ignorance and the Lindy Effect. The longer a control-minded group can sustain an Omega Hypothesis perception by means of the tactics and power protocols of proactive pluralistic ignorance, the greater future acceptability and lifespan that idea will possess. As well, the harder it will be to dethrone as an accepted norm, or perception as a ‘proved’ null hypothesis.
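The Lindy Effect component of this mechanism can be sketched numerically. The toy model below is my own illustration (not a formula from this essay): for a non-perishable idea, expected remaining lifespan is taken as proportional to the time it has already survived, so each year an Omega Hypothesis is successfully defended compounds its projected future tenure.

```python
def lindy_expected_remaining(age_years, proportionality=1.0):
    """Toy Lindy model: the expected remaining lifespan of a
    non-perishable idea is proportional to its current age."""
    return proportionality * age_years

# Each year an Omega Hypothesis survives unchallenged extends
# its projected tenure in proportion.
for age in (1, 10, 50):
    print(age, "->", lindy_expected_remaining(age))
```

The proportionality constant is an assumption of the sketch; the point is only the shape of the curve — survival begets projected survival, which is why early, proactive pluralistic ignorance pays such dividends to the control-minded group.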

One key sign that an Omega Hypothesis is being promoted, is the tactic of declaring any non-conventional alternative explanation as constituting ‘magical thinking.’ This paranoia, that every thought which threatens one’s beliefs somehow stems from magic, is in itself a version of religious thinking. Three conditions typically lead to this tactic, and highlight a person’s religious clinging to the Omega Hypothesis:

A.  Forcing a Null Hypothesis from an idea which has not really been matured into an actual scientific hypothesis in the first place,

B.  Assuming the Null Hypothesis to be true,

C.  Assuming all competing hypotheses to be declarations of ‘magical thinking’ – in an attempt to obviate any scientific testing or maturing of such an idea.

All of the above constitute a set of practices which abrogates a Popperian view of the threshold and rigor of adequate science, relying instead on the promotion of an invalid null hypothesis (HΩ) through academic inertia, ignorance of the discipline, promotification science, social skeptic campaigns or corporate pressure. Now to be fair, this set of unethical practices was also developed and perfected by Abrahamism and other religions, both currently and in the past. It is by no means something practiced only by Social Skeptics.

The Signals of an Omega Hypothesis at Play:

Proselytize Children or College Students

Do they tout the wonders of ‘critical thinking’ before one possesses a deep or even a nascent understanding of the world around them?

Bears a Similitude of Science or 100% Correct Authority

Do they love to wear the robes of, or enjoy the regard of self, as being equivalent to an expert or scientist on a subject? Are they never wrong or uncertain?

Promoted by Means of an Inverse Negation Fallacy

Do they just happen to habitually apply methodical doubt to every idea except for a favored few?

Defended by the Martial Art of Denial

Instead of bringing evidence, do they habitually and only provide excuses or plausible outs as to why opposing evidence could be invalid?

Involves Clubs, Literature and Bandwagon Events to Reinforce

Do they only hang with their fellows, or regularly celebrate visible membership and fealty to positions of club opinion and membership? Conventions, outreach literature?

Byzantine Enforcement of Preferred Ideas Through Social Pressure Rather Than Empirical Study

Do they wink and nudge, loudly alert their fellows for action, patrol forums, and loudly decry anyone who thinks differently about ideas they disfavor? Do they cite your dissent as constituting a ‘lack of critical thinking skills’ – and position you inside a pigeon hole of canned irrationality?

Seeks Control of Media and Legislation in Lieu of Science

Do they seek to head off science by intervening through legislature or inflammatory articles, or consider the thoughts of the common man or those who have experienced anecdotes or health problems to be beneath them?

Enforced Through Celebrity and Priest

Do they seek the authoritative backing of celebrity voices, or celebrity for themselves, as a means of adding credibility to their message?

Employs Mocking and Derisiveness on All Opposing Thought

Is their first and primary resort to argument a mocking disdain and derisiveness towards those who are honestly (and maybe errantly) seeking answers inside their lives, or who are suffering a health challenge?

A word: Regard people who practice the above as DISHONEST.

You will find that the philosophy of Social Skepticism rarely, if ever, delves into the philosophy of ethics, particularly in regard to the mantle assumed when one undertakes the role of skepticism as an authority. Nor do its adherents examine their steps in regard to enforcement of The Omega Hypothesis. Moreover, one would think that the role of the skeptic and free thinking would be to challenge The Omega Hypothesis. Sadly this is never the case inside Social Skepticism.

When philosophers speak of skepticism being the foundation of science, they are not referring to the unbridled spewing of methodical cynicism and prejudicial doubt which is practiced by those who today pretend to be, or assume the mantle of representing, science. Skepticism carries no agenda, save for the idempotent ethic of defending the knowledge development process. It challenges manipulation of data and methods through fear, establishment of control, practices of disdain, squelching of ideas, observations or persons, mafia elite powers and the cultivation of ignorance. This is why the definition of Ethical Skepticism begins with the statement:

Skepticism is the complement of sound science, not the privileged sword of a few pretenders. It is the handiwork of those who possess the grace, integrity and acumen requisite in the wielding of great ideas.

It is not, and never has been a license to spew denial as if one bore no responsibility in its offing.


¹  Philosophic Burden of Proof, Wikipedia; https://en.wikipedia.org/wiki/Philosophic_burden_of_proof

note: An example of a false employment of a common misconception of burden of proof can be found here: RationalWiki Burden of Proof

In the definition cited, RationalWiki contends that only ‘new and remarkable ideas’ bear the burden of proof, and that once evidence is presented, it is ‘up to the opposing side to disprove the evidence.’ Specifically they employ the Error of the Default Null: “If someone has presented you with an idea and says that the burden of proof is on you to disprove the idea, work out what the null hypothesis is and then put their evidence for the idea against it. The person claiming something is possible or has happened needs to produce evidence to refute the null hypothesis.”

All of these fallacious ideas are employed inside fake skepticism.


August 25, 2015 Posted by | Argument Fallacies, Institutional Mandates | , , , , , | Leave a comment

The Burden of Proof (in Gumballs)

Sometimes ‘simple’ is itself the extraordinary claim. A burden of proof may not always reside where we regard it. A claim that something is false, can be just as extraordinary as the claim that something is true. It is important that The Ethical Skeptic distinguish between claims which bear a burden of proof, those that do not, and those which are by their non-sequitur nature, irrelevant.

I was attending one of my kid’s school parties, a father/student night in elementary school one evening years ago, when an interesting contest arose. The teacher challenged the fathers to guess how many gumballs were in a jar she had on her desk. We all dutifully wrote our guesses down on a sheet of paper, tore it off into a small folded slip, and handed it to the students’ homeroom teacher. Once all the slips of guesses were placed into a bowl, the teacher pulled each one out and wrote the guesses on the whiteboard for all to see: 110, 245, 43, 66, 190, and so forth. Number after number came up, and I waited dutifully for my guess of 143 to show. I had made the guess by counting the number of gumballs in the size of one fist. A fist is a common quick measure of volume in situations where one does not have any instrument available for a volume estimate. 11 gumballs to a fist, 12 fists to the container – then a little sluff for the neck of the jar, which appeared to be full to the brim. 11 x 12 = 132, plus another 11 I could count in the neck. Thus my guess was 143.
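The fist-count estimate above is simple arithmetic, and can be replayed as a quick sketch (the figures are the ones given in the anecdote):

```python
# Figures from the anecdote: a fist used as a rough unit of volume.
gumballs_per_fist = 11   # gumballs counted in one fist's worth of volume
fists_per_jar = 12       # fists of volume estimated in the body of the jar
neck_count = 11          # gumballs counted separately in the neck of the jar

estimate = gumballs_per_fist * fists_per_jar + neck_count
print(estimate)  # 11 * 12 + 11 = 143
```

The method is crude field estimation, not precision measurement — its landing exactly on the true count of 143 was skill plus luck, which is the entire point of the story.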

Finally the teacher pulled up a piece of paper and started to read it, then stopped abruptly and smiled. “Well it’s obvious someone overheard me telling Ms. Clemmens over here, earlier, how many gumballs we had in the jar. So we won’t count that submission.” She set what turned out to be my submission aside.

The winning dad made a valiant guess of 127 against the correct count of 143 gumballs in the jar. Good job dad. I applauded his excellent guesstimating skills and said nothing about the matter. After all, this is just elementary school. What we are taught here does not matter in the larger scheme of things, right? Such a drama-in-elementary exhibits an important principle with regard to claims of falseness.

When one makes or implies a claim to falseness, one assumes the burden of proof.¹

Under Ockham’s Razor, plurality should not be introduced without necessity. The homeroom teacher, by accusing me of dishonesty in my submission, had violated Ockham’s Razor. The context of entrant anonymity in no way excused a direct or implicit claim of lying; as this is still the same contention. She had introduced a very complicated idea by mistaking the challenge to be simple. She had chosen the simplest explanation – no one can guess EXACTLY the gumball count in my jar. As with fake skeptics, she failed to discern the real principle here, that of plurality – or hypothesis stacking, complicated-ness as it might otherwise be known. She chose without evidence Hypothesis B below, and presumed it because of

‘Occam’s Razor’ – the simplest explanation: in my base of personal knowledge and critical thinking, the chance of guessing 143 gumballs is too unlikely to be considered as a valid outcome.

Therefore Hypothesis C below had to be false, in her skeptical mind. Here is the available array of ideas surrounding my ‘lucky’ guess, as they stand:

Hypothesis A – One or more fathers is a psychic – one father reads minds and could ascertain from my thoughts that the gumball count was 143.

This Hypothesis fails Ockham’s Razor for the simple fact that it must first presume that psychic ability exists, that the teacher knows what being psychic even means, that I knew how to employ such a skill here, that I possessed the desire to falsify a document and impress a crowd, that I was looking for glory as to how prescient I am, that this is the way I impress and provide a role model for my son, and that I held that desire so profoundly that I would apply it in the guess of a gumball count in a jar at my kid’s father/student party.

A highly stacked – or pluralistic – hypothesis

Hypothesis B – One or more fathers is a cheater and a liar – one father listened in on myself and my assistant and ascertained from my statement that the gumball count was 143.

This Hypothesis fails Ockham’s Razor for the simple fact that it must first presume that the submittant cheats and lies, that the teacher is so smart and skeptical that she can correctly detect this condition in a person, and in me, that I possessed the desire to falsify a document and impress a crowd, that I was looking for glory as to how prescient I am, that cheating and lying is the way I impress and provide a role model for my son, and that I held that desire so profoundly that I would apply it in the guess of a gumball count in a jar at my kid’s father/student party.

A highly stacked – or pluralistic – hypothesis

Hypothesis C – One father made a skilled and lucky guesstimate – from a pinch of math and a bit of english, one father correctly guessed a gumball count of 143.

This hypothesis ‘holds the razor’ even though it could be considered unlikely to guess 143 exactly – it is the null or favored hypothesis until such time as there is necessity, and a sufficient threshold of plurality evidence is brought forward which shows I ascertained the correct count of gumballs by some mutually exclusive and alternative means.

I simply employed a little bit of skill I have used in the field in Africa and Asia, with a bit of math, combined with a bit of estimator’s wisdom (english) to get lucky on my estimated count of gumballs. Had the teacher selected Hypothesis C above – perhaps I could have explained to the kids how I did this – shown an example of measuring concretions being formed into a housing brick in Africa, and how I pulled off the guess.
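The razor at work across Hypotheses A through C can be caricatured as a tally of stacked presumptions — a toy illustration of my own, not a formal method from the essay: the hypothesis requiring the fewest independent presumptions holds the razor as the favored null, until evidence forces plurality.

```python
# Rough tally of the independent presumptions each hypothesis stacks
# (illustrative counts drawn from the anecdote, not a formal metric).
hypotheses = {
    "A: a father is psychic": 7,
    "B: a father cheated and lied": 7,
    "C: a father made a skilled, lucky guesstimate": 1,
}

# The least-stacked hypothesis holds the razor as the favored null.
favored = min(hypotheses, key=hypotheses.get)
print(favored)  # C: a father made a skilled, lucky guesstimate
```

Note what the teacher did instead: she scored the hypotheses by perceived likelihood of the outcome rather than by presumption load, which is exactly the inversion the essay is criticizing.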

But it is better that schools teach the false form of skepticism instead, right? Don’t step outside the rules of expectations; there is no way to get the correct amount, and there are penalties if you do. There is no such thing as a cure for cancer or IBS, if you feel bad it is a panic attack, supplements are all evil, there is no such thing as a spirit realm, there is no such thing as ghosts, there is no such thing as… – all easy, pat, Occam’s Razor compliant answers.

‘Occam’s Razor’ says that the simplest explanation is that 143 is a hard count to guess, and cannot realistically be guessed, right? Something is up if it is indeed guessed. Implicit in such a claim is a boast that I personally hold the full domain knowledge of potentiality and likelihood. This is a common Social Skeptic implicit claim. The pitfall of the fake skeptic: I fail to be a skeptic of myself. Well the simple fact is, that

…sometimes, ‘simple’ is an extraordinary claim in and of itself.

Your effort will not be regarded as valid if you do not fit this errant version of ‘Occam’s Razor’ – simple contention, complicated-ness in knowing. When one extrapolates this claim to Hypothesis B (see The MiHoDeAL Claim and The Appeal to Skepticism Fallacy), applying it to a whole domain of subjects one seeks to discredit, under an air of authority as a skeptic – one is performing under Corber’s Burden. Under this burden, the skeptic must therefore always be right. Always. Or tender the appearance of doing so. Such is the enormous burden, the implicit claim, of the fake skeptic.

Which brings up the topic of proof gaming. Let’s examine that common social skeptic bad science method and fallacy, before we move on to our gumball examples.

Proof Gaming

/philosophy : argument : pseudoscience : false salience/ : employing dilettante concepts of ‘proof’ as a football in order to win arguments, disfavor disliked groups or thought, or exercise fake versions of science. Asking for proof before the process of science can ostensibly even start, knowing that plurality is what begins the scientific method, not proof, and further exploiting the reality that science very seldom arrives at a destination called ‘proof’ anyway. Proof gaming presents itself in several speciations:

Catch 22 (non rectum agitur fallacy) – the pseudoscience of forcing the proponent of a construct or observation to immediately and definitively skip to the end of the scientific method and single-handedly prove their contention, circumventing all other steps of the scientific method and any aid of science therein; this monumental achievement being prerequisite before the contention would ostensibly be allowed to be considered by science in the first place. A backwards scientific method, and a skipping of the plurality and critical work content steps of science. A trick of fake skeptic pseudoscience, which they play on non-science stakeholders and observers they wish to squelch.

Fictitious Burden of Proof – declaring a ‘burden of proof’ to exist when such an assertion is not salient under the scientific method at all. A burden of proof cannot possibly exist if neither the null hypothesis, the alternative theories, nor any proposed construct possesses a Popper-sufficient testable/observable/discernible/measurable mechanism; nor moreover, if the subject in the matter of ‘proof’ bears no Wittgenstein-sufficient definition in the first place (such as the terms ‘god’ or ‘nothingness’).

Herculean Burden of Proof – placing a ‘burden of proof’ upon an opponent which is either arguing from ignorance (asking to prove absence), not relevant to science or not inside the relevant range of achievable scientific endeavor in the first place. Assigning a burden of proof which cannot possibly be provided/resolved by a human being inside our current state of technology or sophistication of thought/knowledge (such as ‘prove abiogenesis’ or ‘prove that only the material exists’). Asking someone to prove an absence proposition (such as ‘prove elves do not exist’).

Fictus Scientia – assigning to disfavored ideas a burden of proof which is far in excess of the standard regarded for acceptance or even due consideration inside science methods. Similarly, any form of denial of access to acceptance processes normally employed inside science (usually peer review, both at theory formulation and at completion). A request for proof as the implied standard of science – while failing to realize, or deceiving opponents into failing to realize, that 90% of science is not settled by means of ‘proof’ to begin with.

Observation vs Claim Blurring – the false practice of calling an observation or data set a ‘claim’ on the observer’s part. This is done in an effort to subjugate such observations into the category of scientific claims, which must therefore now be ‘proved’ or dismissed (the real goal: see Transactional Occam’s Razor Fallacy). In fact an observation is simply that – a piece of evidence or a cataloged fact. Its false dismissal under the pretense of being deemed a ‘claim’ is a practice of deception and pseudoscience.

As Science as Law Fallacy – conducting science as if it were being reduced inside a court of law or by a judge (usually the one forcing the fake science to begin with), through either declaring a precautionary principle theory to be innocent until proved guilty, or forcing standards of evidence inside a court of law onto hypothesis reduction methodology, when the two processes are conducted differently.

The Burden of Proof (exhibited in the oft-applied gumball analogy)

Which brings up the whole subject of the Philosophical Burden of Proof, which differs from a legal burden of proof regarding innocence.¹ When is a claim under the burden of proof, and when is it not? And when does a claim enjoy a lack of burden of proof simply because it is non-sequitur? In general, when one makes a claim to veracity (not a call for sponsorship and research – that is different) – in other words, a claim that one is correct – the burden of proof falls upon them. If I spot a big hairy man-like ‘thing’ in the forest and then make the call for more research on the observation – I am NOT MAKING A CLAIM. Rather I am simply calling for research – as I have no claim, save for being shocked by observing something paradigm-shattering for which I have no explanation. This could be a person putting themselves in danger inside a costume, or it could be similar to what others of credible background have observed. It is not a claim. A claim is a claim to empirical or analytical authority – that all must now accept, as establishment of fact, reason, rationality or critical thinking, that which I am contending is substantiated by the evidence.

But with regard to the gumballs in our classroom anecdote above:

Claims Which Bear a Burden of Proof – regarding the gumballs in the jar

  • There are 143 gumballs
  • There are an odd number of gumballs
  • There are an even number of gumballs
  • There are only red white and blue gumballs
  • There are no green gumballs
  • There is something besides gumballs in this mix
  • People who have observed green gumballs are liars
  • People who claim to have seen green gumballs are suffering memory suggestiveness
  • Observations of green gumballs are only anecdote
  • There are not an even number of gumballs
  • There are not an odd number of gumballs
  • We cannot see some gumballs currently

note that bullet points 2 and 3 above stand as an example of plurality under Ockham’s Razor

Claims Which are Non-Sequitur – they fail or skip large parts of the scientific method and cannot yet be contended or even asked

  • There are not 143 gumballs
  • The mix of red white and blue gumballs remains the same throughout those we cannot see
  • People who believe in green gumballs are credulous
  • We see an even number of gumballs, therefore the total of all gumballs is even
  • There are all sorts of gumballs of varying colors
  • There are only gumballs in this jar
  • People who attend church believe in green gumballs
  • Observations of green gumballs are pareidolia
  • Gumballs taste rancid
  • Gumballs can only be observed by a specific gumball expert team
  • Gumball skeptics are critical thinkers
  • Science does not have any evidence for green gumballs
  • Gumballs are pseudoscience
  • Gumballs are inter-dimensional and therefore hard to find
  • Dead body gumballs are necessary before I look beyond the visible ones
  • Skepticism tells me there are only red blue and white gumballs
  • The universe is so large that there must be green gumballs
  • Richard Dawkins has disproved all non red white or blue gumballs
  • Green gumballs do not exist
  • I hold the unambiguous definition of what is a gumball
  • Science holds the unambiguous definition of what is a gumball
  • There are no gumballs
  • There are no more red gumballs than what we see here, the rest are all blue and white

Claims Which are No Longer Under a Burden of Proof – established by empirical observation

  • There are 17 blue gumballs visible
  • We see an even number of gumballs (and recognize some dissent)
  • There are at least three colors of gumballs
  • Reality does not contain an empty set of gumballs
  • There are more than 0 gumballs
  • Gumballs are seen by more than one credible observing authority

In the end, is it not easier to skip a claim to knowledge and let the data accrue on its own, before one begins to invest in large grand scenarios of skepticism or launch into fanciful pathways of non-sequitur entertainment?  Or perhaps – best put, let ideas falsify themselves through accrued verity – not personal brilliance and experience.

Such is the nature of Ethical Skepticism. Man, now I am craving a gumball.


¹  Philosophic Burden of Proof, Wikipedia; https://en.wikipedia.org/wiki/Philosophic_burden_of_proof


   
