The Tree of Knowledge Obfuscation: Misrepresentation by Bias or Method

The following is The Ethical Skeptic’s list, useful in spotting both formal and informal logical fallacies, cognitive biases, statistical breaches and styles of crooked thinking on the part of those in the Social Skepticism movement. It is categorized by employment groupings so that it can function as a context-appropriate resource in a critical review of an essay, imperious diatribe or publication by a thought-enforcing Social Skeptic. To assist this, we have organized the list into an intuitive taxonomy of ten contextual categories of mischaracterization/misrepresentation:

Tree of Knowledge Obfuscation – The Ethical Skeptic

Misrepresentation by Bias or Method

a dicto simpliciter ad dictum secundum quid – the fallacy of arguing from a generalization (the dicto simpliciter) to a particular case (ad dictum secundum), without recognizing qualifying factors, e.g.: ‘Most people are involved in conspiracy theory cults; therefore anyone who disagrees with me is a conspiracy theory cultist.’

absens sciens absens iniuria – literally, no knowledge – no harm. A procedural fallacy or error in principle similar to ‘what they don’t know, won’t hurt ’em’. An erroneous principle which cites that a person cannot be harmed if they do not know that they were harmed. Alternatively, if a group of people is unaware that a harm has been done, then no one in that group has been harmed. A form of pluralistic ignorance exploitation.

acatalepsia Fallacy – a flaw in critical path logic wherein one appeals to the Pyrrhonistic Skepticism principle that no knowledge can ever be entirely certain – and twists it into the implication that therefore, knowledge is ascertained by the mere establishment of some form of ‘probability’. Moreover, that once a probability is established, no matter how plausible, slight or scant its representation of the domain of information might be, it is therefore now accepted truth. Because all knowledge is only ‘probable’ knowledge, all one has to do is spin an apparent probability, and one has ascertained accepted knowledge. Very similar in logic to the Occam’s Razor aphorism citing that the ‘simplest explanation’ is the correct explanation.

Actor-Observer Bias – in a situation where a person experiences something negative, the individual will often blame the situation or circumstances. When something negative happens to another person, people will often blame the individual for their personal choices, behaviors and actions.

ad hoc Fallacy – an ignoratio elenchi response to an argument or evidence, which seeks to exploit ambiguity or non-accountability as a domain in which to craft a defense which cannot be readily distinguished from something made up. Invention of an explanation which distracts attention away from critical path logic, and/or for which evidence to the pro and con cannot be derived in the now, and/or falsification is unapproachable. A tactic of pseudo-theory and a form of rhetoric.

ad hoc/Pseudo-Theory – a placeholder construct which suffers from the additional flaw in that it cannot be fully falsified, deduced nor studied, and can probably never be addressed or further can be proposed in almost any circumstance of uncertainty. These ideas will be thrown out for decades. They can always be thrown out. They will always be thrown out. Sometimes also called ‘blobbing’ or ‘god of the gaps’, it is a bucket into which one dumps every unknown, hate-based, fear-based and unexplained observation – add in a jigger of virtue – then you shake it up like a vodka martini, and get drunk on the encompassing paradigm which can explain everything, anything and nothing all at the same time.

ad injuriam – appeal to injury. When you can’t refute someone’s work, so instead you hire trolls to threaten a person’s career, reputation, and/or family.

ad stalkinem – a focus on the opposing person in a debate, to such an extreme that the ad hominem party also begins to stalk or obsess over that person – through a continuous critique by means of meaningless rhetorical statements, and/or by delving into their personal life in a disturbed and inappropriate manner. Typically, the imbalance here suggests that the stalker never comprehended nor was really interested in the subject being debated in the first place.

Adissonance – the natural human desire or state of escape from the discomfort of dissonance. A lever used by fake skeptics to bring a curious person back to a state of contrived and pleasant ignorance.

Adoy’s Principle (House Hedge)

  1. Inside a conflict in interests, glitches and errors will always favor a single side.
  2. Penalty systems rarely fail, while benefit systems are dispositioned to do so.

(an inversion of Yoda’s axiom in Star Wars, The Empire Strikes Back: “Do, or do not. There is no try.”) – systems which administer punitive actions and/or penalties rarely if ever fail (they ‘do’); while systems which deliver awards and/or benefits often fail or are designed so as to increase the likelihood of failure (they ‘try’). The difference is called a ‘house hedge’. The house hedge is expressed in two ways. First as the economic inefficiency of extraction by taxation: the taxing body gets to keep the house hedge illegitimately as a de facto program inefficiency. Second as a feature of club quality: fake skeptics are allowed to deliver condemning dispositions without any scientific rigor, while their victims must produce flawless science in order to negate the easy proclamations of the fake skeptic.

Advantageously Obtuse (Bridgman Reduction) – a principle which has been translated, reduced or dumbed-down for consumption so as to appear to be a ‘simple’ version of its source principle; however, which has been compromised through such a process – thereby making it easy to communicate among the vulnerable who fail to grasp its critical elements, and moreover to serve as an apothegm useful in enforcing specific desired conclusions. Statements such as ‘the burden of proof lies on the claimant’ or ‘the simplest explanation tends to be correct’ stand as twisted, viral forms of their parent principles – principles which contend ironically, critically or even completely different standards of thought.

Affiliation Bias – when one chooses something due to a current or past closeness, love, sentiment or affiliation with that something.

Agency – an activated, intentional and methodical form of bias, often generated by organization, membership, politics, hate or fear based agenda and disdain. Agency and bias are two different things. Ironically, agency can even tender the appearance of mitigating bias, as a method of its very insistence. Agency is different from either conflict of interest or bias. It is actually stronger than either, and more important in its detection. Especially when a denial is involved, the incentive to double-down on that denial, in order to preserve office, income or celebrity – is larger than either bias or nominal conflict of interest. One common but special form of agency, is the condition wherein it is concealed, and expresses through a denial/inverse negation masquerade called ideam tutela. When such agency is not concealed it may be called tendentiousness.

ideam tutela – concealed agency. A questionable idea or religious belief which is surreptitiously promoted through an inverse negation. A position which is concealed by an arguer because of their inability to defend it, yet is protected at all costs without its mention – often through attacking without sound basis, every other form of opposing idea.

Tendentious – showing agency towards a particular point of view, especially around which there is serious disagreement in the at-large population. The root is the word ‘tendency’, which means ‘an inclination toward acting a certain way.’ A tendentious person holds their position from a compulsion which they cannot overcome through objective evaluation. One cannot be reasoned out of a position which they did not reason themselves into to begin with.

Akratic Trolling – when an advocate of an agenda plays the game wherein they will troll and provoke their perceived enemy, then suddenly retreat into the pure technical of science or atheism and adopt a holy or statesman facade when the perceived enemy objects to their behavior.

Aleatoric Casuistry – employing statistical uncertainty, representative of unknowns that differ each time we run the same experiment, observation or situation, in order to push the idea that something very uncommon is actually common, can be, or has been common. A version of ‘you never know’ or ‘I bet this happens a lot’. A way of implying that an epistemological basis for a frequency of epistemic uncertainty exists, when indeed it does not.
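As a minimal illustration of the mechanics being exploited here, the Python sketch below (using hypothetical rates and sample sizes, not drawn from any real study) simulates a genuinely rare event across many small observation runs. Quoting only the ‘lucky’ runs in which the event happened to appear makes the rare seem common – the aleatoric run-to-run scatter is spun as if it were an epistemic basis for frequency.

```python
import random

# A minimal sketch (hypothetical numbers): a genuinely rare event with a true
# rate of 1-in-1,000, observed across many small, independent samples.
TRUE_RATE = 0.001      # the actual underlying frequency (known here, for the demo)
SAMPLE_SIZE = 200      # each 'run' observes only 200 trials
NUM_RUNS = 500         # the observation is repeated 500 times

random.seed(42)

observed_rates = []
for _ in range(NUM_RUNS):
    hits = sum(1 for _ in range(SAMPLE_SIZE) if random.random() < TRUE_RATE)
    observed_rates.append(hits / SAMPLE_SIZE)

# Aleatoric casuistry: quote only the runs in which the rare event happened
# to appear, implying 'this happens a lot', while ignoring the full run set.
lucky_runs = [r for r in observed_rates if r > 0]
print(f"True rate:                    {TRUE_RATE:.4f}")
print(f"Mean rate across all runs:    {sum(observed_rates)/NUM_RUNS:.4f}")
print(f"Share of runs with any hit:   {len(lucky_runs)/NUM_RUNS:.2%}")
print(f"Rate quoted from lucky runs:  {sum(lucky_runs)/max(len(lucky_runs), 1):.4f}")
```

With these assumed numbers roughly one run in five contains at least one hit, and any such run reports a frequency of at least five times the true rate – which is the entire rhetorical payload of ‘I bet this happens a lot’.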

Alinsky’s Rule #9 – from Saul Alinsky’s work Rules for Radicals, the contention that “The threat is usually more terrifying than the thing itself.” This apothegm was expanded upon by Obama Chief of Staff Rahm Emanuel in 2008, inside his corollary, “You never want a serious crisis to go to waste.”

Anachronistic Fallacy – when applying modern societal morals, strictures, mores, rules, laws and ethics retrospectively or retroactively upon past events or persons. Any attempt to lens and judge historical characters through means of modern character framing. This fallacy of soundness fails in that its method only produces negative assessments – failing to detect any standards which were higher than today’s, or to regard mitigating circumstances/considerations.

Anecdote Data Skulpting (Cherry Sorting) – when one applies the categorization of ‘anecdote’ to screen out unwanted observations and data. Based upon the a priori and often subjective claim that the observation was ‘not reliable’. Ignores the probative value of the observation and the ability to later compare other data in order to increase its reliability in a more objective fashion, in favor of assimilating an intelligence base which is not highly probative, and can be reduced only through statistical analytics – likely then only serving to prove what one was looking for in the first place (aka pseudo-theory).

Anecdote Error – the abuse of anecdote in order to squelch ideas and panduct an entire realm of ideas. This comes in two forms:

Type I – a refusal to follow up on an observation or replicate an experiment does not relegate the data involved to an instance of anecdote.

Type II – an anecdote cannot be employed to force a conclusion, such as using it as an example to condemn a group of persons or topics – however an anecdote can be employed to introduce Ockham’s Razor plurality. This is a critical distinction which social skeptics conveniently do not realize nor employ.

Angel Questions – a form of rhetoric or propaganda wherein easy lob questions are only offered to a person or organization who otherwise should be held to account. Prefabricated FAQs which fall in line with a prescripted set of propaganda or politically correct thinking. Questions which appear to come from a curious third party, however are scripted to hijack a discussion down an easy path of justifying the message of the person being questioned.

Anomie – a condition in which a club, group or society provides little or negative ethical guidance to the individuals which inhabit it or craft its direction.

Antipode Path Logic – the inverse of critical path logic. A condition wherein an arguer develops a conclusion about a matter in absence of having addressed any critical path logic or epistemology (risk incremental, dependent series and probative questions or tests) before making the conclusion. The opposite of the condition where a person has pursued critical path logic, yet in finding insufficient evidence, refuses to tender a final conclusion or opinion (ethical skepticism).

aphronêsis – twisted, extreme, ill timed, misconstrued, obtuse or misapplied wisdom, sometimes even considered correct under different contexts of usage – which allow an agenda holder to put on a display of pretend science, rationality and skepticism. The faking skeptic will trumpet loudly and often about the scientific method, evidence, facts, ‘skepticism’ or peer review, but somehow will never seem to be able to apply those principles, nor cite accurate examples of their application. The faking skeptic will speak often of ‘demanding proof,’ deny sponsors access to challenge ideas or fiat science, and incorrectly cite that denial of access to peer review is indeed – peer review. You will find them endlessly spouting incorrect phrases like ‘the burden of proof resides on the claimant’; its symbolic evisceration standing as de facto proof of their own beliefs. Vehement skeptics tend to be young and only academically/socially trained to a great degree – their ‘skepticism’ easing most of the time as they gain life experience. ‘Proof’ is the hallmark of religion. ~Bill Gaede

The Appeal of the Narrative Lie – a three-part principle which underpins the motivation as to why persons will lie in support of a Narrative. 1. To achieve assent via logical deduction means your case is powerful, but to manipulate assent via a lie means that you are powerful – the more people manipulated, the more gratifying is that power. 2. The lie is excused because one is lying for ‘virtuous or just cause’. 3. The most rewarding form of lying for the Narrative Narcissist therefore, is one in which intent can be laundered from the liar themselves.

Appeal to Implicit Conspiracy – the default position taken by a pseudo-skeptic that in order for a counter-claimant to actively research or have confidence in their proposition, then quod erat demonstrandum they must therefore believe a conspiracy exists which is holding back their preferred alternative from being studied or accepted. This default ad hoc fallacy explanation can be leveled at anyone without discretion, distracts from the logic at hand, can never be verified, and results only in finding what we already think we know to therefore be true. A substitute form of science (pseudo-theory) issued in the form of pejorative ad hominem and straw man, all rolled up into one baseless and easy claim on the part of a pseudo-skeptic.

Argument from Incredulity – also known as argument from personal incredulity or appeal to common sense, is a fallacy in informal logic. It asserts that a proposition must be false because it contradicts one’s personal or common expectations or beliefs, or is difficult to imagine.

argumentum ad ignorantiam (Argument from Ignorance) – a species of assertion in which one contends that a proposition is true because it has not yet been proven false, or that a proposition is false because it has not yet been researched, studied by science or proven true.

Artifarce – a slow moving disaster, drought, famine, inflation, eclipse, change in climate, or other natural phenomena – which is speciously blamed upon one’s political opponents or those they hate, and is used to justify taxation, enslavement, and/or genocide.

As Science as Law Fallacy – conducting science as if it were being reduced inside a court of law or by a judge (usually the one forcing the fake science to begin with), through either declaring a precautionary principle theory to be innocent until proved guilty, or forcing standards of evidence inside a court of law onto hypothesis reduction methodology, when the two processes are conducted differently. The implication or assumption that something is ‘innocent until proven guilty’ under the scientific method, when in fact this is an incorrect philosophy of hypothesis reduction.

Ascertainment Bias – a form of inclusion or exclusion criteria error wherein the mechanism of sampling, specimen selection, data screening or sub-population selection is inherently flawed, is developed based upon an inadequate or sophomoric view of the observation base, or produces skewed representations of actual conditions.
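The Python sketch below illustrates the core mechanism with invented numbers: a sampling (ascertainment) rule whose probability of inclusion rises with the very attribute being estimated. The population, severity scale and detection probabilities are all hypothetical, chosen only to make the skew visible.

```python
import random

random.seed(7)

# A minimal sketch with hypothetical numbers: estimate the average severity of a
# condition in a population of 100,000, where true severity is uniform on [0, 10].
population = [random.uniform(0, 10) for _ in range(100_000)]

# Flawed sampling mechanism: a case is 'ascertained' (e.g., detected at a clinic)
# with a probability that rises with severity - mild cases rarely enter the data.
def ascertained(severity: float) -> bool:
    return random.random() < (0.02 + 0.08 * severity / 10)

sample = [s for s in population if ascertained(s)]

print(f"True population mean severity: {sum(population)/len(population):.2f}")
print(f"Ascertained-sample mean:       {sum(sample)/len(sample):.2f}")
print(f"Sample size:                   {len(sample)} of {len(population)}")
```

Under these assumed detection odds the sampled mean climbs to roughly 6 against a true mean of 5 – the estimate is skewed not by the phenomenon itself, but by the mechanism which selected which observations were allowed into the data.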

Asch Conformity –  participants in an argument fed false or misleading information, who conform to the majority opinion on at least half of these misleading ideas, are reported as reacting with what gestalt psychologist Solomon Asch called a “distortion of perception”. These participants, who make up a distinct minority (social skeptics or sycophants inside public discourse), will express belief that misleading or false answers are correct, unaware as to the actual veracity or lack thereof, of the answers originating from the majority to which they have paid reverence (appeal to authority).

atéchne – a misconstruing of fact and method. Pretenses of false science and skepticism, portrayed by faking skeptics and social agenda apparatchiks. An inventory of fake methods, such as Occam’s Razor, anecdote dismissal, assailing the facts, dismissing eyewitness testimony, informal fallacy, etc.; all of which are abused, misapplied or are indeed not applicable at all to the subject or research under consideration.

Badwill – a detrimental and exploitable asset similar to goodwill found on a company’s financial statement. It refers to the influence that a governing body or entity holds over a target population, utilizing factors such as widespread ignorance, disinformation, fear, or other means of manipulation. It acts as a tool of intimidation and control, which is administered by illegitimate forms of governance, supervision, or management.

Benford’s Law of Controversy – passion is inversely proportional to the amount of real information available.

Bias Inflation – a baseless claim of discrediting an observation set through identifying a plausible, relevant or even salient observer bias, which might have or did contribute to the observational profile, yet which can at most only explain a small or negligible portion of the observation base itself.

Blame – an expressed and dysfunctional absence of responsibility, which demonstrates an intellectual/spiritual deficit of awareness of the distinction between the two principles.

Bolt’s Axiom – a belief is not merely an idea the mind possesses; it is an idea that possesses the mind. Attributed to Robert Oxton Bolt, English Playwright.

Brechung Effect (German: Refraction Effect) – a principle which cites that, the further away an expert in a field is from personally conducting recent or field application/observation practice (e.g., seeing patients or direct testing/observation/measurement/excavation, etc.), the more confident, appeal to credential, appeal to authority, arrogant, demeaning, and/or insistent will be their assertions.

The Bricklayer’s Error – the presumption that academic, heuristic or deep single-function expertise (bricklaying) qualifies one to stand as authority as to how the broader issue is to be managed (how the house is to be built or lived in). Experience trumps consilience. Consilience trumps heuristic.

Bridgman Point – the point at which a principle can no longer be dumbed-down any further, without sacrifice of its coherency, accuracy, salience or context.

Bridgman Point Paradox – if you understood, I could explain it to you – but then again – if you understood I wouldn’t have to explain it to you.

Broad or Deep Fallacy – the habit of a pretend researcher to go deep into details on a subject possessing thin evidence, or alternately as the situation may warrant, to examine only old monkey suit stories or broaden the subject being considered sufficiently enough to include numerous anecdotes of a ludicrous or dismissed nature in an effort to avoid addressing a current body of robust observational evidence.

Bucket Irrationality – a key sign of irrationality, resides in the circumstance whereupon, observing a problem inside a system – one contends that the entire system is evil and should be shut down. Moreover, the circumstance whereupon, identifying problems inside a topic, one declares it all to be ‘woo’ or ‘pseudoscience’.

Caesar’s Wife (Must be Above Suspicion) – a principle inside the philosophy of skepticism which cites that a mechanism, research/polling effort, or study which bears an implicit a priori claim to innocence (i.e. soundness, salience, precision, accuracy and/or lack of bias/agency) must transparently and demonstratively prove this claim before being assumed as such, executed or relied upon as scientific.

Casuistry – the use of clever but unsound reasoning, especially in relation to moral questions; sophistry. Daisy chaining contentions which lead to a preferred moral outcome, by means of the equivocal use of the words within them unfolding into an apparent logical calculus – sometimes even done in a humorous, ironic or mocking manner. A type of sophistry.

Cause-Therapy Affirmation of the Consequent – when one operates from a belief or practice based upon the disinformed notion that, since a therapy is effective in reducing an effect, therefore the lack of that therapy is the cause of that effect. Placing a person in an ice-bath reduces dangerously high fever; therefore, fever is caused by lack of ice-baths. Fasting and HIIT exercise serve to reduce body mass index and non-alcoholic fatty liver disease; therefore, high BMI and NAFLD are caused by too much consumption and not enough exercise.

Catch 22 (non rectum agitur fallacy) – the pseudoscience of forcing the proponent of a construct or observation, to immediately and definitively skip to the end of the scientific method and single-handedly prove their contention, circumventing all other steps of the scientific method and any aid of science therein; this monumental achievement prerequisite before the contention would ostensibly be allowed to be considered by science in the first place. Backwards scientific method and skipping of the plurality and critical work content steps of science.

Cheater’s Hypothesis – Does the hypothesis or argument couch a number of imprecise terms or predicate concepts? Is it mentioned often by people wishing to appear impartial and comprehensive? Is the argument easily falsified through a few minutes of research?

Imposterlösung Mechanism – the cheater’s answer. An incoherent or ridiculous contention which is assumed as a potential hypothesis simply because it sounds good enough for public consumption. These alternatives pass muster with the general public, but are easily falsified after mere minutes of real research. Employing the trick of pretending that an argument domain which does not bear coherency nor soundness – somehow (in violation of science and logic) falsely merits assignment as a ‘hypothesis’. Despite this, most people hold them in mind simply because of their repetition. This fake hypothesis circumstance is common inside an argument which is unduly influenced by agency. They are often padded into skeptical analyses, to feign an attempt at appearing to be comprehensive, balanced, or ‘considering all the alternatives’.

Ad hoc/Pseudo-Theory – can’t be fully falsified nor studied, and can probably never be addressed or can be proposed in almost any circumstance of mystery. These ideas will be thrown out for decades. They can always be thrown out. They will always be thrown out.

Cheetah Skeptic – just as the cheetah is the fastest land animal, the fastest philosophical animal is a ‘skeptic’ who flees from the scientific method by means of ‘critical thinking’ when faced with any evidence or science they see as potentially threatening to the paradigm they desire to enforce.

Cherry Picking – pointing to a talking sheet of handpicked or commonly circulated individual cases or data that seem to confirm a particular position, while ignoring or denying a significant portion of related context cases or data that may contradict that position.

Choice Supportive Bias – the tendency to remember one’s choices and professional judgement as more educated or accurate than they actually were. When one chooses something because one has previously also selected that something in an earlier decision set.

Chucklehead Diversion – using humor or the mocking of others as a facade of appearing objective, or to portray the underlying message one is passing as not being threatening, serious or malicious in nature, when such an implication is false. Typically also employed as a defensive posture which allows the accusation that any bearer of criticism needs to ‘lighten up’, since they have not used humor to belie their agenda.

Circular Definition – also called a god principle or definition. A definition which relies upon elements of itself in order to define itself; much akin to a god being defined as the only being which can define itself. For example, critical thinking (epistemic rationality) being defined as ‘ensuring that one visibly demonstrate that their beliefs/actions/thoughts fall in line with those of others who are also epistemically rational’. An appeal to authority and circular reasoning, bundled into one error. The one who enforces a circular definition regarding human attributes, is pretending to the role of God.

Clifford’s Law/Axiom – a belief is never benign. “No real belief, however trifling and fragmentary it may seem, is ever truly insignificant; it prepares us to receive more of its like, confirms those which resembled it before, and weakens others.” ~William Kingdon Clifford

Close-Hold Embargo – a common form of dominance lever exerted by scientific and government agencies to control the behavior of the science press. In this model of coercion, a governmental agency or scientific entity offers a media outlet the chance to get the first or most in-depth scoop on an important new ruling, result or disposition – however, only under the condition that the media outlet not seek any dissenting input, nor critically question the decree, nor seek its originating authors for in-depth query.

Coincidence Theory/Theorist – the reactionary theory of conformance or one who crafts such complicated and highly stacked ‘rational’ alternatives as to why an astounding observation can only be served by a conventional or conforming explanation. Usually comes with the adjunct claim that the exceptional observation cannot possibly exist since ‘it would require a conspiracy of X people, if indeed it were true.’ A form of sophistry and rhetoric used to defend a political or religious a priori assumption.

Comparative Performance Paradox – a form of gaslighting which involves comparing a victim to peak performers when the victim’s activity benefiting others is evaluated, yet at the same time comparing the same victim to average performers when reward/compensation for their effort is apportioned back to them for that same effort.

Compartmentalization – the method employed by a person wishing to deceive themselves, then subsequently others, by means of organizing their thoughts in such a way as to obscure data or truth regarding a matter. Equally, a method of organization relating to the structure of access to information; of categorizing data and practices into impotent silos and categories – no single one of which can service, impact or relate truth on its own accord. While a useful tactic in intelligence management circles, compartmentalization on the part of the human mind or organizations where transparency is of utmost importance is rarely employed to good ends. A focus on only clinical experiment at the exclusion of field observation, the blinders-on academic pretense of material monism, or the division of a company’s fiscal accountability mechanisms in such a way as to hide profits or nefarious expenses – all these serve as methods which abrogate goals of clarity, truth and transparency.

Computational Irreducibility – the idea that some systems can only be sufficiently described by fully simulating them. The only way to determine the answer to a computationally irreducible question is to perform the computation/simulation which solves for its answer. In this context, it is impossible to ascertain the future state of a CI system, without having to sufficiently model and determine all the intermediate states in between. Such process cannot be reduced or sped up through any kind of reduction (Bridgman Reduction), assumption or shortcut. To do so alters the actual model and its answer into a state of unknown, and unrealized, error.
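A common concrete illustration (not taken from the entry above) is Stephen Wolfram’s Rule 30 elementary cellular automaton. The Python sketch below assumes a single-cell seed and a fixed-width row; as far as is known, there is no closed-form shortcut to the center cell’s value at step N – one must run every intervening step.

```python
# A minimal sketch of computational irreducibility using the elementary cellular
# automaton Rule 30 - a standard illustrative example. To know the state of the
# center cell at step N, one (as far as is known) must compute every intermediate
# step; there is no shortcut that skips the simulation.

RULE = 30  # the rule number encodes the next-state lookup table in its binary digits

def step(cells: list[int]) -> list[int]:
    """Apply one Rule 30 update across the row (edges padded with 0)."""
    padded = [0] + cells + [0]
    out = []
    for i in range(1, len(padded) - 1):
        neighborhood = (padded[i - 1] << 2) | (padded[i] << 1) | padded[i + 1]
        out.append((RULE >> neighborhood) & 1)
    return out

def center_cell_at(n_steps: int, width: int = 201) -> int:
    """Return the center cell after n_steps, by running every step in between."""
    row = [0] * width
    row[width // 2] = 1  # single 'on' cell seed
    for _ in range(n_steps):
        row = step(row)
    return row[len(row) // 2]

# No known formula jumps straight to step N - every intermediate step must be run.
print([center_cell_at(n) for n in range(20)])
```

Any ‘dumbed-down’ surrogate for the update rule changes the model itself, and with it the answer – which is the point of the definition above.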

Confirmation Reliance Error – abuse of the Popper demarcation principle, which cites that a body of knowledge/finished science cannot rely upon predictive and confirming evidence alone, by then applying this principle incorrectly to the process of science – or failing to distinguish controlled predictive science from simple confirmatory observation after the fact. This in an effort to selectively filter out those ideas and theories which are vulnerable through having to rely in part upon predictive evidence or consilience of multiple inputs, or which are further denied access to the peer review and replication steps of science simply because of this malpractice in application of the Popperian predictive demarcation.

Conflation of Treatment and Cause – the assumption that, because a medicine, food or activity helps to mitigate the symptoms of a malady, therefore the treatment is necessarily addressing the cause of that malady. Assuming that because exercise and low caloric intake help reduce blood pressure and weight, that therefore low activity and excess caloric intake are quod erat demonstrandum the cause of those maladies.

Consequent Unintendences – things we know will happen as a result of our actions, but either we don’t care, or wanted them to happen as a side effect in the first place. Something one pretends is an unintended consequence of their actions, but was actually intended to begin with.

Conservatism – a certain state of mind wherein the tendency to dismiss perceived lower-likelihood events, or to change one’s mind very little when faced with their veracity, is excused as an act of rationality.

Consilience Evasion – a refusal to consider scientifically multiple sources of evidence which are in congruence or agreement, focusing instead on targeting a single item from that group of individual sources of evidence because single items appear weaker when addressed alone. Also called a ‘silly con.’

Silly Con – spinning consilience as consensus. Investigating only one alternative and through manipulative pluralistic ignorance and social pressure, declaring that hypothesis as consensus and all others as unnecessary/pseudoscience/anti-science.  Spinning politically motivated variations of an accepted scientific hypothesis, and selling those variations to the untrained public, for consumption in the name of science.

Constructive Ignorance (Lemming Weisheit or Lemming Doctrine) – a process related to the Lindy Effect and pluralistic ignorance, wherein discipline researchers are rewarded for being productive rather than right, for building ever upward instead of checking the foundations of their research, for promoting doctrine rather than challenging it. These incentives allow weak confirming studies to be published and untested ideas to proliferate as truth. And once enough critical mass has been achieved, they create a collective perception of strength or consensus.

Contradox Bias – when one cites as supporting evidence the recanting testimony of a former adherent who conducted misrepresentation to support an idea, and who now contends to be a member of an opposing camp, ‘coming clean’ about their lies. Suffers from a form of confirmation bias, wherein one cites as an authority the testimony of someone who has demonstrated that they will lie to support their position.

Contrathetic Inference Paradox – a condition where abductive and inductive inference point to one outcome or understanding, and deductive inference points in another antithetical direction entirely. Since science often begins with inductive study stemming from abductive understanding, it will dogmatically hold fast to an inductive understanding until a paradigm shift occurs – a result of the weight of deductive evidence pointing in a different direction. The job of fake skepticism is to ensure that this deductive evidence or any thought resulting from it, is never accepted into science in the first place.

Corber’s Burden – the mantle of ethics undertaken when one claims the role of representing conclusive scientific truth, ascertained by means other than science, such as ‘rational thinking,’ ‘critical thinking,’ ‘common sense,’ or skeptical doubt. An authoritative claim or implication as to possessing knowledge of a complete set of that which is incorrect. The nature of such a claim to authority on one’s part demands that the skeptic who assumes such a role be 100% correct. If however, one cannot be assured of being 100% correct, then one must tender the similitude of such.

a. When one tenders an authoritative claim as to what is incorrect – one must be perfectly correct.

b. When a person or organization claims to be an authority on all that is bunk, their credibility decays in inverse exponential proportion to the number of subjects in which authority is claimed.

c. A sufficiently large or comprehensive set of claims to conclusive evidence in denial, is indistinguishable from an appeal to authority.

Correlation Dismissal Error – when employing the ‘correlation does not prove causality’ quip to terminally dismiss an observed correlation, when the observation is being used to underpin a construct or argument inside consilience, and is not being touted as final conclusive proof in and of itself.
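As a minimal sketch of why the quip cuts both ways, the Python example below (entirely hypothetical data) builds one dataset where X genuinely contributes to Y and one where a hidden confounder drives both. The correlation coefficient alone cannot tell the two apart – so it proves no causation – yet in both cases it remains a real observation which any candidate construct must account for.

```python
import random
import statistics

random.seed(3)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from population statistics."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))

# Scenario A (hypothetical): X genuinely contributes to Y, plus noise.
xa = [random.gauss(0, 1) for _ in range(2000)]
ya = [0.5 * x + random.gauss(0, 1) for x in xa]

# Scenario B (hypothetical): a hidden confounder Z drives both X and Y.
zb = [random.gauss(0, 1) for _ in range(2000)]
xb = [0.6 * z + random.gauss(0, 1) for z in zb]
yb = [0.6 * z + random.gauss(0, 1) for z in zb]

print(f"r (causal scenario):     {pearson(xa, ya):.2f}")
print(f"r (confounded scenario): {pearson(xb, yb):.2f}")
# The coefficient alone distinguishes neither scenario, so it is not proof of
# causation - but it is still probative, and may legitimately underpin a
# construct alongside other consilient evidence.
```

Dismissing the number terminally with ‘correlation does not prove causality’ discards probative information; the error is treating an input to consilience as if it had been offered as a final proof.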

Crabapple Picking – pointing to a talking sheet of handpicked or commonly touted individual cases or data that loosely seem to confirm a particular position, yet in fact are not sequitur with, nor in context with, nor logically related to the point of contention being touted.

The Critic’s Wager – aside from matters such as human rights, accountability of power, or potential harm, a principle which cites that one should practice restraint in explaining their resolution to a standing problem to a critic. A. If you do, they will find something wrong, as their emotions are invested in finding something flawed to begin with. Instead, ask them how they would approach the problem at hand. A1. If they hold the skills necessary in approaching the problem, odds are they will describe essentially the process you employed in the first place. If they propose something novel or creative, acknowledge and credit that added value or novel approach to them, and use it if applicable. A2. If they don’t bear the skill necessary or know how to approach the problem, then the question is broached – are they qualified to or should they then ethically critique its candidate solutions? This process sidesteps ignoratio elenchi, rhetorical, and ad hominem wastes of time.

Cultivation of Ignorance – If one is to deceive, yet also fathoms the innate spiritual decline incumbent with such activity – then one must abstract a portion of the truth, such that it serves and cultivates ignorance – a dismissal of the necessity to seek what is unknown. The purposeful spread and promotion or enforcement of Nelsonian knowledge and inference. Official knowledge or Omega Hypothesis which is employed to displace/squelch both embargoed knowledge and the entities who research such topics. Often the product of a combination of pluralistic ignorance and the Lindy Effect, its purpose is to socially minimize the number of true experts within a given field of study. This in order to ensure that an embargoed topic is never seriously researched by more members of the body of science than Michael Shermer’s ‘dismissible margin’ of researchers. By acting as the Malcolm Gladwell connectors, and under the moniker of ‘skeptics’, Social Skeptics can then leverage the popular mutual ignorance of the members and begin to spin misconceptions as to what expert scientists think. Moreover, then cultivate these falsehoods among scientists and the media at large. True experts who dissent are then intimidated and must remain quiet so as not to seem anathema, nor risk possibly being declared fringe by the patrolling Cabal of fake skeptics.

Cut the Feed – a tactic employed by oppressive agency, to withhold observation or data from public purview when it serves to falsify the Narrative, especially inside a circumstance wherein debunking or plausible deniability might only serve to bring the issue even more attention. Also known as an Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology or observation which is heading in an undesired direction, artificially declaring under a condition of praedicate evidentia, the science as ended or ‘settled’ – or cutting off all scientific examination of the subject – as in a notorious NASA ‘cut the feed’ event.

Debunking – as C. S. Lewis noted, is an easy and lazy kind of ‘rationality’ that almost anyone can do and on any subject. Literally almost anything can be debunked. Debunking is a magic act whose misdirection tricks the magician instead of their audience. It is a form of bullshitting adorning the authoritative costume of denial. The debunker is spinning a facade of cherry-picked anecdotal anti-data, which is then used to linearly claim that something isn’t. This backwards method of outference runs anathema to the practices of science, evidence, and inference; a method plied by someone who will never debunk their own favored ideas and who possesses no interest in truth whatsoever.

The Delimitation of Virtue – a society’s inability or failure to skillfully define ‘endangerment’, constitutes the chief endangerment to that society.

Demarcation of Skepticism – once plurality is necessary under Ockham’s Razor, it cannot be dismissed by means of skepticism alone.

Denial Activist’s Bias – when bias is evident from the social fact that the majority of persons inside a denial-based activist group have neither studied nor had any first-hand experience within the subject they are actively seeking to deny.

Deontological Doubt (epoché) – if however one defines ‘doubt’ – as the refusal to assign an answer (no matter how probable) for a specific question – in absence of assessing question sequence, risk and dependency (reduction), preferring instead the value of leaving the question unanswered (null) over a state of being ‘sorta answered inside a mutually reinforcing set of sorta answereds’ (provisional knowledge) – then this is the superior nature of deontological ethics.

Desire to Offend Bias – when one excuses or bears a condition wherein, the desire to offend a targeted party is so high or is of such a first priority that, it imbues or reveals a bias or agency all of its own. The ironic bigotry of highlighting a strawman bigotry in another targeted party or disliked race. See Hitchens’ Apology.

Devil’s Bargain – an extremely disadvantageous deal, bearing a terrible price to pay (known or unknown), which someone considers accepting because they can either see no other way out of a problem situation, or desire some objective so much that they are willing to sacrifice something precious in order to attain it. The indicator of a devil’s bargain once adopted, is that the victim/participant will double down on their commitment to its ideal, any time a challenge is raised. No reflection or objectivity will be tolerated.

Diagnostic Habituation Error – the tendency of medical professionals to view subjects of discourse as if resolvable by the methods of diagnosis, when most fields of discourse cannot be approached solely in this manner. Diagnosis involves a mandatory answer, closed set of observational data, constraints, model convergence, increasing simplicity and conclusions which select from a closed field of prescriptive conclusions. All of these domain traits are seldom encountered in the broader world of scientific research.

Dichotomy of Compartmented Intelligence – a method of exploiting the compartmented skill sets of scientists, in order to preclude any one of them from fully perceiving or challenging a study direction or result. Call your data analysts ‘data scientists’ and your scientists who do not understand data analysis at all, the ‘study leads’. Ensure that those who do not understand the critical nature of the question being asked (the data scientists) are the only ones who can feed study results to people who exclusively do not grasp how to derive those results in the first place (the study leads). This is a method of intelligence laundering and is critical to everyone’s being deceived, including study authors, peer reviewers, media and public at large.

dicto simpliciter – when you presume that what is true in general and/or under normal circumstances, is therefore true under all circumstances without exception.

The Diebold Test – if you and a friend are walking down the sidewalk, and a Diebold ATM machine falls from ten stories above and crushes him, you do not need a doctor to pronounce death, nor to take an EEG nor blood pressure in order to scientifically establish that your buddy is dead. Those who appeal for such false epistemology typically are not actually looking for verity – rather just seeking to deflect anything which competes with the answer they desire to enforce.

Discounting Vividness – the invalid presumption that all types of eyewitness testimony are universally faulty, and further, that those involving describing an occurrence in vivid or emotional detail, even if it is an exceptional occurrence, are immediately suspect and should be discredited.

Disinformation – when a group plants a false item of information inside the camp of thought they oppose, then alerts their allies (typically the main stream press) to highlight this falsehood as a means to discredit those groups, their movements or people disdained by the disinformation specialist and/or the condemning press channel.

Dismissertation – reciting scant, cherry picked or anecdotal counter-arguing evidence which appears to dismiss or plausibly deny a subject. Evidence consists often of only canned talking points from an agenda group and not authentic research. The technique is sometimes admissible in peer review, however is a fallacy when applied in a petition for plurality.

Dissent Muzzling – on a controversial issue, a group holding power offering those who are issue stakeholders or voters, the ‘choices’ of 1. supporting their particular agenda, or 2. remaining silent. No option of dissent or free speech is offered.  This is a favorite masquerade of oppressive 1984-styled tyranny – feigning a ‘choice’ to visibly support or remain silent.

Doubt – there are two forms of ‘doubt’ (below).  Most fake skeptics define ‘doubt’ as the former and not the latter – and often fail to understand the difference.

Methodical Doubt – doubt employed as a skulptur mechanism, to slice away disliked observations until one is left with the data set they favored before coming to an argument. The first is the questionable method of denying that something exists or is true simply because it defies a certain a priori stacked provisional knowledge. This is nothing but a belief expressed in the negative, packaged in such a fashion as to exploit the knowledge that claims to denial are afforded immediate acceptance over claims to the affirmative. This is a religious game of manipulating the process of knowledge development into a whipsaw effect supporting a given conclusion set.

Deontological Doubt (epoché) – if however one defines ‘doubt’ – as the refusal to assign an answer (no matter how probable) for a specific question – in absence of assessing question sequence, risk and dependency (reduction), preferring instead the value of leaving the question unanswered (null) over a state of being ‘sorta answered inside a mutually reinforcing set of sorta answereds’ (provisional knowledge) – then this is the superior nature of deontological ethics.

DRiP Method – the method by which SSkeptics employ Pseudoscience to filter official discourse and enforce the religious goals of the Deskeption Cabal, through the process of 1. D – Denying Data, 2. R – Restricting Research, and 3. P – Preventing or Pretending Peer Review.

Dual-Burden Model of Inferential Ethics – the condition which is called ‘plurality’ by Ockham’s Razor exists once the sponsors of an alternative (to the null) idea or construct (does not have to be fully mature as a hypothesis) have achieved any one of the following necessity thresholds:

‣  a nexus of a persistent and robust alternative construct observation base
‣  potential falsification of the ‘null’ exists (and certainly if that null is not really a hypothesis itself)
‣  the intent contribution of agency has been detected
‣  the critical issue involved is a matter of public trust
‣  the contention involves placing involuntary or large counts of stakeholders at risk
‣  there exists a critical immaturity of the entailed observation domain.

Under such necessity, the hypothesis reduction circumstance exists wherein an actual null hypothesis must be developed, and further be shown to have comprehensive explanatory potential to justify its contention – it can no longer reside as simply the lazy ‘null’ argument. Conditions wherein the evidence is forcing the null sponsor to contend something other than simply ‘nuh-uh’ (nulla infantis). However beware, the discipline in such defense of the null better be just as solide-en-preuve as that discipline set which was previously demanded of alternative explanation sponsors.

Duck Test – a form of abductive reasoning which proceeds along the lines of the apothegm, ‘If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck’.

Durable Franchise – a business product or service which bears a feature in that, as it grows in consumption or employment, it also serves to create a condition by which its criticality grows inside its target specific market or channel – artificially promoting it to the status of category killer or monopoly. An application, refreshment, drug, vaccine, practice or technology standard, which displaces all competing entities or alternatives, or serves to habituate the user into increasing dependency or demand for the product or service over its lifespan.

The Duty of Science (Einstein) – The right to search for the truth implies also a duty that one must not conceal any part of what one finds to be true. The right to search for the truth is commensurate also with a duty that one must not conceal any part of what one finds to be true, nor obfuscate what one fears could possibly be true.

Editorial Burden Error – when pushing the envelope on evidence/reason or making mistakes as to what to discredit, impugn and attack because one is under the burden of having to find some subject to discredit or eviscerate. This because they are on a regular/urgent editorial publication schedule or have some key presentation due inside a group of skepticism.

Einfach Mechanism – an idea which is not yet mature under the tests of valid hypothesis, yet is installed as the null hypothesis or best explanation regardless. An explanation, theory or idea which sounds scientific, yet resolves a contention through bypassing the scientific method, then moreover is installed as truth thereafter solely by means of pluralistic ignorance around the idea itself. Pseudo-theory which is not fully tested at its inception, nor is ever held to account thereafter. An idea which is not vetted by the rigor of falsification, predictive consilience nor mathematical derivation, rather is simply considered such a strong, or Occam’s Razor (sic) stemming-from-simplicity idea that the issue is closed as finished science or philosophy from its proposition and acceptance onward. A pseudo-theory of false hypothesis which is granted status as the default null hypothesis or as posing the ‘best explanation’, without having to pass the rigors with which its competing alternatives are burdened. The Einfach mechanism is often accompanied by social rejection of competing and necessary alternative hypotheses, which are forbidden study. Moreover, the Einfach hypothesis must be regarded by the scientific community as ‘true’ until proved otherwise. An einfach mechanism may or may not be existentially true.

Elastration (Banding) – the use of an elastrator to gradually cut off blood flow to the testes of a bull. A furtive method of gradual castration and extinction, without the victim fully perceiving it. A method of enacting evil, without full perception of performing it.

Embargo Hypothesis (Hξ) – was the science terminated years ago, in the midst of large-impact questions of a critical nature which still remain unanswered? Is such research now considered ‘anti-science’ or ‘pseudoscience’?

Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology which is heading in an undesired direction, artificially declaring under a condition of praedicate evidentia, the science as ‘settled’.

Poison Pill Hypothesis – the instance wherein sskeptics or agency work hard to promote lob & slam condemnation of particular ideas. A construct obsession target used to distract or attract attack-minded skeptics into a contrathetic impasse or argument. The reason this is done is not the confusion or clarity it provides, rather the disincentive which patrolling skeptics place on the shoulders of the genuine skilled researcher. These forbidden alternatives (often ‘paranormal’ or ‘pseudoscience’ or ‘conspiracy theory’ buckets) may be ridiculous or indeed ad hoc themselves – but the reason they are raised is to act as a warning to talented researchers that ‘you might be tagged as supporting one of these crazy ideas’ if you step out of line and do not visibly support the Omega Hypothesis. A great example is the skeptic community tagging of anyone who considers the idea that the Khufu pyramid at Giza might have not been built by King Khufu in 2450 bce, as therefore now supporting conspiracy theories or aliens as the builders – moreover, their being racist against Arabs who now are the genetic group which occupies modern Egypt.

Emotional Priming – a process of pseudo-education wherein a popular controversial issue such as Creation-Evolution or Monism-Dualism is framed as a whipping horse, posed in a false dilemma, so as to polarize the general public into ‘science’ and ‘woo’ camps of belief. The visceral reaction to the woo camp of belief inside academia imbues a type of anchoring bias and emotional agency on the part of those who self-appoint or are tasked to ‘represent science’ – thereafter influencing their objectivity just as severely as would a religion.

epoché (ἐποχή, “suspension”) – a suspension of disposition. The suspended state of judgement exercised by a disciplined and objective mind. A state of neutrality which eschews the exercise of religious, biased rational or critical, risky provisional and dogmatic dispositions when encountering new observations, ideas and data. It is the step of first being skeptical of self, before addressing challenging or new phenomena. It is underpinned by both an examination of the disciplines of true knowledge development (epignosis) and the repository of vetted and accepted knowledge (gnosis). If someone relates a challenging observation to you, you suspend disposition, and catalog it. If you toss it out based upon a fallacy or trivial flaw – then you are a cynic, not a skeptic.

Error of the Default Null (Omega Hypothesis or King of the Hill Pseudoscience) – a variation of argument from ignorance. The practice of assigning a favored or untestable/unfalsifiable hypothesis unmerited status as the null hypothesis. Further then proclaiming the Default Null as the null hypothesis until such time as it can be defeated by new competing science.

Error of the Guilty Null (The Precautionary Principle) – the practice of assigning a favored hypothesis the status as null hypothesis, when in fact the hypothesis involves a feature or implication which would dictate its address as an alternative hypothesis instead.  A null hypothesis which is, by risk or impact, considered potentially harmful until proved innocent, should be treated as an alternative under correct parsimony. Further then invalidly proclaiming this Guilty Null to be the prevailing conclusion of science until such testing is conducted which could prove it to be false or until such time as it can be defeated by new competing science.

Error of the True Null (Omega Hypothesis or King of the Hill Pseudoscience) – a variation of argument from ignorance. Regarding the null hypothesis as objectively ‘true’ until proved otherwise, when it simply is the null hypothesis from the standpoint of the logical calculus in a hypothesis reduction hierarchy and not because it has been underpinned by a Popper level scientific rigor. Further then proclaiming the True Null to be the prevailing conclusion of science.

Erwartungstoleranz – (Ger: ‘anticipation tolerance’) – a passive form of hatred wherein an observer ignores the fact that an opponent is about to make a mistake, and allows the mistake to occur without interruption – through motivation by epicariacy, an innate desire to punish, or allowing harm to manifest on their opponent over that mistake. ‘Never interfere with an enemy while he’s in the process of destroying himself.’

Ethical Inversion –  a social condition and form of linear induction which prohibits the execution of actual science, ironically in the name of ethics. A false condition wherein proper study design is deemed to involve unethical or inhumane techniques, thus researchers are allowed to employ flawed study design, which in turn only serves to produce results that mildly suggest there is no need to conduct proper study design in the first place. So none is undertaken.

Ethical Skeptic’s First Axiom – accurate, is simple. To a practitioner in the art, accuracy is equivalent or superior to simplicity. The idea that, in its first inception, a new paradigm must be expressed accurately first before it may be simplified. One cannot as a rule be forced to express simply, an idea which is being drafted for the very first time. Simplicity is a manner of teaching, communicating or sustaining; often to non-discipline individuals, and is not a necessary tool of formulation. Thereafter, an idea may be simplified if possible, in order to aid in understanding. See Bridgman Reduction.

Ethical Skeptic’s First Axiom – accurate, is simple. But that does not serve to make simple, therefore accurate. Among explanatory alternatives, elegance is always preferable over simplicity.

Ethical Skeptic’s Second Axiom – an idea cannot be a conspiracy theory, if it is also the null hypothesis.

Ethical Skeptic’s Third Axiom – danger trumps conspiracy. That which introduces a danger (risk and/or uncertainty) constitutes a more extraordinary claim (demands more evidence) than that which is deemed conspiracy theory.

Ethical Skeptic’s Axiom of Extremism and Science – if one chooses to be an extremist politically, this offers a glimpse as to the poor integrity and instant-grits rigor placed into their claim of ‘settled science’ as well.

Ethical Skeptic’s Axiom of Fact and Belief – the direct derivation of belief from mere ‘fact’ is one level of competence below ignorance.

Ethical Skeptic’s Axiom of Nelsonian Knowledge – in order to distort the truth, you first have to know it.

Ethical Skeptic’s Conspiracy Razor – never ascribe to happenstance or incompetence, that which coincidentally, surreptitiously and elegantly supports a preexisting agency. Never attribute to a conspiracy of millions, what can easily arise from a handful of the clever manipulating the ignorance of millions.

Ethical Skeptic’s Dictum of the Cabal – a cabal is detected not through their unity in message, but rather their unity in ignorance. A cabal is a consensus of ignorance.

Ethical Skeptic’s Dictum of Peer Review – in a domain which resides below a given quotient of knowledge, peer review is indistinguishable from ad populum appeal.

Ethical Skeptic’s Dictum of Rhetoric – what is posed in the rhetorical, can only be opposed with the rhetorical. One cannot answer a rhetorical question with objective reason and evidence.

Ethical Skeptic’s Dictum of Silence – silence cannot be refuted. However, ontological silence should not be confused with rhetorical silence.

Ethical Skeptic’s Dictum of Truth and Authority – Who monopolizes truth, also sells it at a dear price.

Ethical Skeptic’s Law – if science won’t conduct the experiment, society will force the experiment. One can only embargo an idea for so long.

The Ethical Skeptic’s Law of Advanced Intelligence

Neti’s Razor – one cannot produce evidence from an entity which at a point did or will not exist, to also demonstrate that nothing aside from that entity therefore exists. The principle which serves to cut secular nihilism as a form of belief, distinct from all other forms of atheism as either philosophy or belief. From the Sanskrit idiom, Neti Neti (not this, not that). Therefore, you are wholly unqualified to instruct me that this realm is the only realm which exists, and efforts to do so constitute a religious activity.

I Am that I Am – that which possesses the unique ability to define itself renders all other entities disqualified in such expertise. There is no such thing as an expert in god. The principle which serves to cut theism as a form of belief, distinct from all other forms of belief as either philosophy or religion. From the Torah idiom, I Am (I Am that I Am, or in Sanskrit, Aham Brahmasmi). Therefore, if god existed, you are unqualified to tell me about it. So, theism falls into a lack of allow-for domain.

Non-Existence Definition – six questions form the basis of a definition: What, Where, When, How, Why, Who. The answers to this set of six questions still form a definition of expert attributes, even if the answer to all six is ‘empty set’. Therefore, when one applies the ethics of skepticism, one cannot formulate a definition which is specified as ‘empty set’ without due empirical underpinning, a theory possessing a testable mechanism, and a consilience of supporting research. We have none of this, and can make no claims to ‘non-existence’ expertise in god.

Principle of Indistinguishability – any sufficiently advanced act of benevolence is indistinguishable from either malevolence or chance.

The Ethical Skeptic’s Laws of Critical Path

Law of the Trivial – our tendency to devote disproportionate amounts of focus upon irrelevant matters, while leaving important matters unaddressed.

Law of Wallowing – our tendency to devote disproportionate amounts of time to the menial, while sacrificing important elements of process, or while serving no process at all.

Law of the Non-Sequitur – our habit of responding only to that which is immediately salient, while sacrificing or obfuscating the sequitur or broader issue at hand.

Law of the Non-Critical – our tendency to focus primarily upon cause-to-effect, yet exercised inside a non-productive or dead-end sequence of actions, or outside any critical sequence whatsoever.

Ethical Skeptic’s Law of Ethics – being ethically obtuse is indistinguishable from being scientifically or technologically inept. Intelligence or acumen in no way excuses one from accountability under the public trust.

Ethical Skeptic’s Law of Virtue and Value – the problem with easy money on Wall Street is that people who are trained to think that money comes in the absence of any delivery of value – poseurs – rise into positions of power. These entities coordinate with social virtue activists as a result, because they cannot attach margin to value. Those who bear no skill in the provision of value will substitute virtue in its place.

Ethical Skeptic’s Law of Virtue and Violence – violence is always presaged by virtue. Oppression is always presaged by truth. A violent movement seeking oppressive power will always costume itself in a virtuous cause of some kind (science, the people, the children, the migrant, the worker, etc.) during its ascendancy. It does not matter who is actually in current power. Violent forces will inevitably decide that their virtue exonerates their incompetence and, more importantly, justifies violence in the establishment of their righteous power.

Ethical Skeptic’s Principle of Money and Power – it is difficult to get a person to study and know something, when their income or power depends upon their not studying or knowing it.

Ethical Skeptic’s Law of Slow Moving Disasters – slow moving disasters (famine, climate change, pandemic, racism, etc.) will universally involve sacrifice solely on the part of a consistent and single targeted ethnicity, gender and nationality.

The Ethical Skeptic’s Razor (The Antiwisdom of Crowds) – among competing alternatives, all other things being equal, prefer the one for which discussion or research is embargoed. Power, Politics, Narrative, and Profit demand a level of transparency which obviates that same burden upon mere dissent. What is enforced by Narrative, can also be dismissed as Narrative.

Ethical Skeptic’s Third Razor – dangerous demands a greater level of evidence than does crazy. Any claim which exposes a stakeholder to risk, ignorance or loss of value – regardless of how ordinary, virtuous or correct – demands extraordinary evidence. For a skeptic, a person who claims to represent science and uses ‘facts’ to lie, misrepresent, manipulate, and push politics is worse than someone who claims there is a Galactic Federation of Aliens out there. Dangerous is much worse than crazy.

Ethics-Understanding Gridlock – a paradoxical and socially paralyzing condition wherein stakeholders who grasp the ethical issues involved with a science, do not possess full understanding of the subject science itself; while conversely those who are able to understand the science, do not possess a full grasp of its incumbent stakeholder ethics.

Evidence Sculpting – has more evidence been culled from the field of consideration for this idea, than has been retained? Has the evidence been sculpted to fit the idea, rather than the converse?

Skulptur Mechanism – the pseudoscientific method of treating evidence as a work of sculpture. Methodical inverse negation techniques employed to dismiss data, block research, obfuscate science and constrain ideas such that what remains is the conclusion one sought in the first place. A common tactic of those who boast of all their thoughts being ‘evidence based’. The tendency to view a logical razor as a device which is employed to ‘slice off’ unwanted data (evidence sculpting tool), rather than as a cutting tool (pharmacist’s cutting and partitioning razor) which divides philosophically valid and relevant constructs from their converse.

Also, the instance common in media wherein so-called ‘fact-based’ media sites tell 100% truth about 50% of the relevant story. This is the same as issuing 50% misinformation or disinformation.

ex ante – an inference which is derived from predictive, yet unconfirmed forecasts. While this may be a result of induction, the most common usage is in the context of abductive inference.

Exclusion Induced Bias – if one is on the outside of a subject, one’s natural tendency is to be skeptical of those on the inside of that subject. A natural bias inducement which occurs when the only people who engage over a complex, highly technical or niche specialty subject are those who really care about it. This is often perceived negatively, or interpreted as a lack of accountability or as elitism, by laymen or outsiders. In such outside groups, a reactionary skepticism or cynicism can arise artificially by means of this exclusion barrier alone, and not from any form of philosophy, reason or epistemology.

Existential Occam’s Razor Fallacy (Appeal to Authority) – the false contention that the simplest or most probable explanation tends to be the scientifically correct one. Suffers from the weakness that myriad and complex underpinning assumptions, based upon scant predictive/suggestive study, provisional knowledge or Popper insufficient science, result in the condition of tendering the appearance of ‘simplicity.’

exoagnoia – conspiracy which is generated naturally through the accelerative interaction of several commonplace social factors. A critical mass of an uninformed, misinformed or disinformed population under chronic duress (the ignorance fuel), ignited by an input of repetitive authoritative propaganda (the ignition source). Such a phenomenon enacts falsehood through its own inertia/dynamic and does not necessarily require a continuous intervention on the part of an influencing group.

ex opere operato – a form of appeal to skepticism, wherein the person who has declared themself to be a skeptic believes that they therefore justly derive their credibility from a higher office. The fallacy wherein one regards the superiority of their position to be derived not merely from their own opinion; implying rather, that all their thoughts, rationalizations, conclusions and beliefs confer from a higher authority – that of scientists or science, the evidence, doubt, critical thinking or skepticism.

Extremist/Fanaticist/Ultraist – a person, or group of similarly minded persons, who fail in comprehension of two valuable human truths: 1) the value of creating allies from those who only mildly disagree, and 2) the error of siding with ‘the enemy of thy enemy’ – seldom grasping that, given condition 1, the likelihood that this newly allied enemy will be worse than their old one is very high. The extremist not only does not care, but furthermore does not possess the skill to discern this condition nor fathom its principle. Orange man bad. A mental state which makes one vulnerable to manipulation by this mindset.

Facile – appearing neat and comprehensive only by ignoring the true complexities of an issue; superficial. Easily earned, arrived at or won – derived without the requisite rigor or effort.

Fake Hoax Ethics Error – when one errantly regards a hoax which is purposely constructed, then revealed to ‘show how easy it is to fake this stuff,’ as standing exemplary of ethical skeptical conduct.

Fallacy of Centrality – If something significant was happening, I’d know about it; since I don’t know about it, it isn’t happening. A fallacy framed by researcher Ron Westrum in examining the diagnostic practices of pediatricians of the 1940s and 1950s. An expert, or self-proclaimed expert, assumes that they occupy a central position inside a topic or discipline; moreover, they assume that if something of serious import were happening, they would know about it. Since they don’t know about it, therefore it isn’t happening.

Fallacy of Composition by Null Result – the contention that a null result from an experimental observation conducted in an unconstrained domain means that the evidence supporting the idea in question does not exist. The comparison is invalid because a null result measured in an unconstrained field of measure is assumed to be comparable to a null result in a constrained domain of measure, as in the instance of testing to effect a medical diagnosis.
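To see why the two null results carry very different evidential weight, consider the following minimal Bayesian sketch (a hypothetical illustration; the prior, coverage and sensitivity figures are assumptions, not measurements). The `coverage` parameter stands for the fraction of the relevant domain actually examined: a null result from a near-complete, constrained search moves the posterior strongly toward absence, while the same null result drawn from a sliver of an unconstrained domain barely moves it at all.

```python
# Hypothetical illustration: how much a null (no-detection) result should
# shift belief, as a function of how much of the domain was actually searched.

def posterior_exists_given_null(prior_exists: float, coverage: float,
                                sensitivity: float = 1.0) -> float:
    """P(thing exists | we searched and found nothing).

    coverage    -- fraction of the relevant domain actually searched (0..1)
    sensitivity -- P(detection | it exists and lies inside the searched part)
    """
    # A null result occurs if the thing does not exist, or if it exists but was
    # missed (it lay outside the searched fraction, or inside it yet undetected).
    p_null_given_exists = 1.0 - coverage * sensitivity
    p_null = prior_exists * p_null_given_exists + (1.0 - prior_exists)
    return prior_exists * p_null_given_exists / p_null

prior = 0.5
# Constrained domain (e.g. a diagnostic assay spanning the whole relevant space):
print(round(posterior_exists_given_null(prior, coverage=0.99), 4))   # ~0.0099
# Unconstrained domain (only 0.1% of the space was ever examined):
print(round(posterior_exists_given_null(prior, coverage=0.001), 4))  # ~0.4997
```

Under these assumed figures the constrained null result drops the posterior from 50% to about 1%, while the unconstrained null result leaves it essentially untouched – the sense in which the two ‘null results’ are not comparable.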

Fallacy of Division/Composition – a form of faulty reasoning which appeals to the existence of internal disagreements or factions within a group as evidence that the group’s argument is collectively wrong, invalid, or weak in comparison to a specific antithetical argument. The existence of religious divisions, for example, does not serve to invalidate the construct of spirituality, nor promote nihilist atheism as the stronger or necessary bifurcated alternative.

Fallacy of Excluded Exceptions – a form of evidence sculpting in which a proponent bases a claim on an apparently compelling set of observations confirming an idea, yet chooses to ignore an equally robust set of examples of a disconfirming nature. One chisels away at, disqualifies or ignores large sets of observation which are not advantageous to the cause, resulting in one seeing only what one sought to see to begin with.

Fallacy of Exclusion (Fallacy of Suppressed or Isolated Evidence) – One of the basic principles of argumentation is that a sound argument is one which presents all the relevant, and especially critical-path, evidence. A debunker will seek to isolate one single facet of an observation and then pretend that it is weak, when stripped of its corroborating observations, context, and facets of credibility. This is the warning flag that a pseudo-scientific method is at play.

Fallacy of The Control – a condition wherein an invalid objection is raised to a valid observation, citing that the observation was not conducted against a differential cohort or control. Situations where the extreme nature of the observation is exceptional inside any normal context, to the point of not requiring a control in order to be valid for inference or inclusion. A fake skeptic’s method of dismissing observations which threaten their religion, by means of sophistry and pretend science.

Fallacy of Univariate Linear Inductive Inquiry into Complex Asymmetric Systems (Univariate Fallacy) – the informal fallacy of attempting to draw inference regarding complex dynamic systems, such as biological, economic or human systems, through employment of singular, linear or shallow inductive analytical methods. A deep understanding of complex systems first demands a conceptual and analytical strategy that respects, defines and adequately reduces for analysis, that complexity. Only then can multiple component analyses be brought to bear as a consilience in understanding the whole.

False Analogy – an argument by analogy in which the analogy is poorly suited, used to disprove a challenging or disdained construct.

False Domain Equivalence – a form of ambiguity slack exploitation wherein one equates the probability metrics which can be derived from a qualified, constrained or specific domain or circumstance, to be comparable to the use of ‘probability’ inside a broad domain or one lacking Wittgenstein parameters, constraints or descriptives.

False Parsimony – when one desires to restrict the set of potential explanations or favors a simpler explanation, yet does not possess the objectivity or integrity to know when such conservatism is no longer warranted.

Familiar Controversy Bias – the tendency of individuals or researchers to frame explanations of observed phenomena in terms of mainstream current or popular controversies. The Fermi Paradox exists because aliens all eventually blow themselves up with nuclear weapons right after they discover radio. Venus is a case of run-away global warming from greenhouse gasses. Every hurricane since 1995 has been because of Republicans. Every disaster is God’s punishment for some recent thing a nation conducted. Mars is a case of ozone depletion at its worst, etc. Every paradox or novel observation is readily explainable in terms of a current popular or manufactured controversy. Similar to the anachronistic fallacy of judging past events in light of today’s mores or ethics.

Fictional Mis-Identification – when one reacts to fictional representations as though they are real. Complaining about how a popular fictional TV program portrays the paranormal, or irate reactions to a book which invokes a ghost or spirit, or has a character convert to a spiritual outlook.

Fictitious Burden of Proof – declaring a ‘burden of proof’ to exist when such an assertion is not salient under science method at all. A burden of proof cannot possibly exist if neither the null hypothesis, nor alternative theories, nor any proposed construct possesses a Popper sufficient testable/observable/discernible/measurable mechanism; nor moreover, if the subject in the matter of ‘proof’ bears no Wittgenstein sufficient definition in the first place (such as the terms ‘god’ or ‘nothingness’).

fictus scientia – assigning to disfavored ideas, a burden of proof which is far in excess of the standard regarded for acceptance or even due consideration inside science methods. Similarly, any form of denial of access to acceptance processes normally employed inside science (usually peer review both at theory formulation and at completion). Request for proof as the implied standard of science – while failing to realize or deceiving opponents into failing to realize that 90% of science is not settled by means of ‘proof’ to begin with.

Focal Panic Effect – the human tendency to panic when viewing for the first time a new set of measures or observations concerning a pervasive or risky phenomenon. For example, grasping how high a jet is flying, how fast two cars are moving relative to each other while passing on the highway, or how many people are ‘infected’ by a new virus, all for the very first time – such novel observations can cause one to panic, since the numbers involved are well outside our normal frame of relevance/reference. In actuality, the measure or observation is in its normal range; we are simply not used to looking at it in the newly introduced fashion.

Focusing Effect – the tendency to place too much importance on one aspect of an event.

Frankenstein’s Monster Fallacy – an interweaving of cherry picking and straw man fallacies, wherein the arguer assembles a patchwork of disparate and conflicting opposing positions, both valid and straw man, to create a fabricated, monstrous argument which does not accurately represent any single opposing position, and which casts everything aside from the arguer’s own position in the most ludicrous light.

Fundamental Attribution Error : Behaviorism – a tendency to attribute suffering to one’s behavior, psychological state, or inclination, before and as opposed to examining for factors which are out of their control, access, or knowledge.

Fupgrade – a touted ‘upgrade’ in a software application, offering or package of services, in which critical feature functionality from which the user used to benefit is actually removed and shifted to a higher-cost program or package. This is usually obfuscated by a flurry of overblown improvements in basic functionality and appearance (the misdirection). Typically the user is not informed that critical functionality has been removed from their newly paid-for upgrade, and they find out later that they are now forced to also purchase the higher-cost or ‘pro’ version of the product.

Gaslighting – a form of manipulation that seeks to sow seeds of doubt in a targeted individual or in members of a targeted group, hoping to make them question their own memory, perception, and/or sanity. Using persistent denial, disinformation, misdirection, contradiction, manipulated statistics and organic untruths, in an attempt to destabilize the target and delegitimize the target’s beliefs, understanding or confidence in self.

Gell-Mann Amnesia – the habit or bias in which a person, upon finding a source to be unreliable and critically mistaken when reporting on a subject in which they hold close expertise, nonetheless forgets that error/bias and regards the same source as reliable regarding other areas in which they do not hold close expertise. Gell-Mann exploitation is the realization on the part of casual-effort media sources that individual experts who can challenge their contentions do not possess a means to aggregate their collective knowledge about the media and craft a condemning perspective.

Genetic Fallacy – an informal fallacy of irrelevance regarding the origins of an argument or the person making the argument, wherein a conclusion is suggested or rejected based solely on someone’s or something’s history, origin, or source, rather than on its current meaning or context.

Gestalt-Heuristic (G-H) Gap – a fundamental but unspoken disconnect (G-H Gap) between those subject matter experts who execute the detail and craft of a discipline, its heuristics, and those who direct the purpose and accountability of the discipline as part of a larger mission, its gestalt. The gap in competence wherein those who develop the analytics/heuristics/programs do not fully grasp the question being asked (and may answer a different, political, or rhetorical one instead – under fear of negative career impact), while those who are responsible to explain and be accountable for the results do not really understand how those results were derived. The gap between the academic, administrator, technician, and recent college graduate versus the executive, department head, or senior associate whose heuristic skills are rusty and/or outdated.

Gish Gallop – a tactic of argument wherein the arguer skips through subject after subject or data point after data point, in order to tender the appearance of a barrage of sound unchallengeable argument.

God’s Dilemma (& The Satan Principle) – since ‘God’ can never cause harm, any instance of harm must be caused by someone or something else. (As well, since humans and randomness do not cause the full set of all harms, therefore there must be a harm-imparting agent who completes this set, an anti-God who can and does cause harm.) Generally, any person, buzzword, program, party, technique, intervention, principle, or action which is held under a God’s dilemma, can never cause harm. It is assumed holy from its very naming onward. All harm therefore must have come from other parties.

Goodhart’s Law – when a specific measure becomes the sole or primary target, it ceases to be a good measure.
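A toy simulation (a sketch under assumed parameters, not drawn from Goodhart’s own formulation) illustrates the mechanism: while no one optimizes the proxy metric, it tracks the underlying quality reasonably well; once agents target the metric directly and can inflate it cheaply, the correlation between metric and quality collapses.

```python
# Toy model of Goodhart's Law. 'effort' drives true quality; 'gaming' inflates
# the metric without improving quality. All coefficients are hypothetical.
import random

random.seed(2)

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def simulate(metric_is_target: bool, n: int = 5_000) -> float:
    quality, metric = [], []
    for _ in range(n):
        effort = random.random()                        # genuine work invested
        gaming = random.random() if metric_is_target else 0.0
        quality.append(effort)                          # true quality = effort only
        metric.append(0.5 * effort + 2.0 * gaming + random.gauss(0, 0.05))
    return correlation(metric, quality)

print("metric not targeted :", round(simulate(False), 2))  # high correlation
print("metric is the target:", round(simulate(True), 2))   # correlation collapses
```

The point is not the specific numbers, but that the same measurement rule degrades the moment it becomes the thing being optimized.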

Goodhart’s Law of Skepticism – when skepticism itself becomes the goal, it ceases to be skepticism.

Griefer – a bad-faith participant. Someone who is involved in a discussion or multiplayer video game who pretends to be a genuine member but intentionally annoys and harasses others (trolling). The griefer primarily aims to cause harm instead of achieving mission success or gaining knowledge. They exploit game mechanics or twist the concept of “truth” in unintended ways solely to hurt other players. You can identify them by their passive-aggressive behavior, avoidance of personal accountability or risk of their own constructive contributions, and sole concentration upon targeting specific individuals.

Groupthink Blindness – when a group fails to see their own faults, usually through the common spread of propaganda, and therefore must view any critique of, decline in or mistake by the group as being the fault of opposition or outside parties.

Gundeck/Gundecking – a form of methodical deescalation and Nelsonian inference. Meaning to slip away to lower decks of a ship or lesser functions of professional conduct in order to put on the appearance of working on an objective. However in reality, one is merely avoiding or falsifying the necessary work at hand.

  • Being a scientist when one needs to be an investigator.
  • Being a technician when one needs to be a scientist.

Whatever keeps one from discovering that which they do not want discovered, or not performing work on the salient question at hand.

Hamming’s Axiom – a quip by mathematician Richard Hamming, “You get what you measure.” Which means that one will not find what they are not measuring for in the first place.

Haspel’s Paradox – a suppressed idea mutates to ever more virulent forms, which are then invoked to justify its continued suppression.

Hasty Generalization – basing a broad conclusion or general rule about a group on rumor, stereotype, a small sample set or scant observational experience.

Hawthorne Observer Effects – an expansion of thought based around the principle in which individuals modify an aspect of their behavior in response to their awareness of being observed or to a test-change in their environment. The simple act of observation changes both the state of mind of the observer as well as those being observed. These effects principally center on the state of mind change in the observer:

  1. That which is observed, changes
  2. That which is observed for the first time, appears exceptional
  3. That which is observed obsessively, instills angst
  4. That which bears risk, is observed less
  5. That which is observed to move more slowly, is perceived as the greater risk

Hedging – the a priori employment of ambiguous words or phrases, for the purposeful instance wherein they can be reinterpreted in such a way as to appear in consensus, if one is later found to be wrong on a position of denial and opposition.

Hegelian Deception – a form of employment of the Hegelian Dialectic where a fourth party deceiver manipulates the dialectic process to guide external witnesses to synthesize a false truth that excludes an actual hidden tertiary element. In Hegelian philosophy, the dialectical process involves the interaction of opposing ideas (thesis and antithesis) leading to a higher level of understanding or resolution (synthesis). In this scenario, the deceiver sets up a primary event (thesis) and a secondary event (antithesis) to create a false narrative or synthesis that relates the false perception that a higher level of understanding has been achieved, and more importantly diverts attentions from the actual truth (a tertiary event).

Herculean Burden of Proof – placing a ‘burden of proof’ upon an opponent which is either arguing from ignorance (asking to prove absence), not relevant to science or not inside the relevant range of achievable scientific endeavor in the first place. Assigning a burden of proof which cannot possibly be provided/resolved by a human being inside our current state of technology or sophistication of thought/knowledge (such as ‘prove abiogenesis’ or ‘prove that only the material exists’). Asking someone to prove an absence proposition (such as ‘prove elves do not exist’).

The High & Low Poseur – do A, then B was the right choice. Do B, then A was the right choice. Do A and B, you are over-complicating. They are afraid to risk being wrong, or lack the skill to create their own work. They always know ‘how it should really be done’, but never can seem to actually do it themself – and rather, only bear the ability to critique ideas they disdain or the work of others.

Hitchens’ Apology (Desire to Offend Bias) – when one uses Christopher Hitchens’ apologetics to excuse/cover for a bias. Using the familiar Hitchens quote, valid in its own right, to excuse a condition wherein, the desire to offend is so high or is a first priority such that, it imbues or reveals a bias all of its own. A way of masquerading agency under the apologetic “If someone tells me that I’ve hurt their feelings, I say, ‘I’m still waiting to hear what your point is.’ In this country I’ve been told, ‘That’s offensive,’ as if those two words constitute an argument or a comment. Not to me they don’t.”

Hoax Laundering – the employment of satire news sites to create completely false news segments, which are then passed around sympathetic advocacy groups and stripped of their source and/or satirical context, eventually to emerge as potentially true inside target opposition circles. This is not ‘fake news’; the common idea that this, and not mainstream media news, constitutes fake news is itself another misdirection by the same advocacy groups. Rather, hoax laundering occurs with respect to hoax news, as the laundering of the satirical context is intentional and useful in a lob and slam ploy advocacy effort.

Huckster’s Anxiety – a form of dissonance/anger wherein a sales person or person making a pitch thinks that they are about to close a deal, and the sales target starts probing into issues of soundness, after the sales person has already tasted blood. This can elicit a form of sudden anger and posturing on the pitch-maker’s part. It can also serve as a form of manipulation of the intended target of the sale.

Humor Hoax Fraud – when one poses misinformation about disdained persons or subjects inside, or excused by, a context of humor, knowing that it will be re-circulated as fact by those with whom one associates. Pretending to innocence by technicality of expression.

Humping the Elephant – an extension of the familiar ‘the elephant looks different from every angle’ metaphor, wherein a fake skeptic is not actually trying to find out the truth, but rather is simply there for personal benefit or agenda promotion – in this context, having their way with the elephant rather than trying to find out what it is.

Hyperepistemology – transactional pseudoscience in the employment of extreme, linear, diagnostic, inconsistent, truncated, excessively lab constrained or twisted forms of science in order to prevent the inclusion or consideration of undesired ideas, data, observations or evidence.

Hypoepistemology –  existential pseudoscience in the relegation of disfavored subjects and observations into bucket pejorative categorizations in order to prevent such subjects’ inclusion or consideration in the body of active science.  Conversely, acceptance of an a priori favored idea, as constituting sound science, based simply on its attractiveness inside a set of social goals.

Iatrogenic Skepticism – skepticism which serves to mislead science in its role as the philosophy underlying science. Skepticism which is actually the cause of ignorance, rather than a mechanism helping reduce the entropy of understanding. Skepticism which results in such things as flat Earth theory, Moon landing denial, useless supplement claims and risky corporate science injury denial.

ideam tutela – concealed agency. A questionable idea or religious belief which is surreptitiously promoted through an inverse negation. A position which is concealed by an arguer because of their inability to defend it, yet is protected at all costs without its mention – often through attacking without sound basis, every other form of opposing idea.

Ignorance – is not a ‘lack of knowledge’ but is rather a verb, meaning ‘a cultivated quiescence before an idea or group which has become more important to protect than science, human rights, well-being, and life itself.’ The belief that one has personally attained a state of immunity to incorrect information. The action of blinding one’s self to an eschewed reality through a satiating and insulating culture and lexicon.

Imposterlösung Mechanism – the cheater’s answer. A disproved, incoherent or ridiculous contention, or one which fails the tests to qualify as a real hypothesis, which is assumed as a potential hypothesis anyway simply because it sounds good or is packaged for public consumption. These alternatives pass muster with the general public, but are easily falsified after mere minutes of real research. Employing the trick of pretending that an argument domain which does not bear coherency nor soundness – somehow (in violation of science and logic) falsely merits assignment as a ‘hypothesis’. Despite this, most people hold them in mind simply because of their repetition. This fake hypothesis circumstance is common inside an argument which is unduly influenced by agency. They are often padded into skeptical analyses, to feign an attempt at appearing to be comprehensive, balanced, or ‘considering all the alternatives’.

In Extremis – a condition of rising or extreme danger wherein a decision which is dependent upon an outcome of scientific study, must be made well in advance of any reasonable opportunity for peer review and/or consensus to be developed. This is one of the reasons why science does not dictate governance, but rather may only advise it. Science must ever operate inside the public trust, especially if that trust requires expertise from multiple disciplines.

in medias res – a process, research initiative or argument which begins its discourse in the middle of a series of sequential questions as opposed to starting with foundational ones and/or outlining a specific objective. This does not serve to guarantee an errant outcome, however often can waste enormous amounts of time and attention trying to resolve questions which are orphan, ill timed or unsound given the current knowledge base. Can also be used as a method of deception. See also non rectum agitur fallacy.

Inchoate Action – a set of activity or a permissive argument which is enacted or proffered by a celebrity or power-wielding sskeptic, which prepares, implies, excuses or incites their sycophancy to commit acts of harm against those who have been identified as the enemy, anti-science, credulous or ‘deniers’. Usually crafted in such a fashion as to provide deniability of linkage to the celebrity or inchoate activating entity.

“Inconclusive” is a Conclusion – the fake sleuth is desperate to issue a conclusion and obtain club credit for having reached it. Their goal is to stamp the observation (what they incorrectly call a ‘claim’) with the word ‘Debunked’. However, they also know that most neutral parties have this trick figured out now. So they prematurely reach a conclusion which appears to be skeptically neutral, but tenders the same desired result: Inconclusive. It is like declaring two opponents in a field game to be of equal team strength through a 0 to 0 tie – by means of simply turning on the scoreboard and walking off the field after 15 seconds of play. By means of an inconclusive status, the observation can be neutralized and tossed upon a ‘never have to examine this again’ heap. De facto, this is the same as ‘debunked’. It is a trick wherein the fake skeptic takes on the appearance of true skeptical epoché while still condemning an observation or subject – when it is nothing of the sort.

Indigo Point Man (Person) – one who conceals their cleverness or contempt. Based upon the tenet of ethical skepticism which cites that a shrewdly apportioned omission at Point Indigo, an inflection point early in a system, event or process, is a much more effective and hard-to-detect cheat/skill than a more manifest commission at Point Tau, the tipping point near the end of a system, event or process. Based upon the notion ‘Watch for the gentlemanly Dr. Jekyll at Point Tau, who is also the cunning Mr. Hyde at Point Indigo’. It outlines a principle wherein those who cheat (or apply their skill in a more neutral sense) most effectively, such as in the creation of a cartel, cabal or mafia, tend to do so early in the game and while attentions are placed elsewhere. In contrast, a Tau Point man tends to make their cheat/skill more manifest, near the end of the game or at its Tau Point (tipping point).

Inflection Point Exploitation (The Cheat) – a flaw, exploit or vulnerability inside a business vertical or white/grey market which allows that market to be converted into a mechanism exhibiting cartel, cabal or mafia-like behavior. Rather than becoming robust to concavity and exposed to convexity, this type of consolidation-of-control market becomes exposed to excessive earnings extraction and sequestration of capital/information on the part of its cronies. Often there is one raison d’être (reason for existence) or mechanism of control which allows its operating cronies to enact the entailed cheat enabling its existence. This single mechanism will serve to convert a price-taking market into a price-making market and allow the cronies therein to establish behavior which serves to accrete wealth/information/influence into a few hands, and to exclude erstwhile market competition from being able to function. Three flavors of entity result from such inflection point exploitation:

Cartel – an entity run by cronies which enforces closed-door price-making inside an entire economic white market. Functions through exploitation of buyers (monopoly) and/or sellers (monopsony) through manipulation of inflection points – points where sensitivity is greatest, as early in the value chain as possible, and inside a focal region where attentions are lacking. Its actions are codified as virtuous.

Cabal – an entity run by a club which enforces closed-door price-making inside an information or influence market. Functions through exploitation of consumers and/or researchers through manipulation of the philosophy which underlies knowledge development (skepticism), or of the praxis of the overall market itself – points where it can manipulate the outcomes of information and influence by tampering with a critical inflection point early in its development methodology. Its actions are secretive, or if visible, are externally promoted through media as virtue or for sake of intimidation.

Mafia – an entity run by cronies which enforces closed-door price-making inside a business activity, region or sub-vertical. Functions through exploitation of its customers and under-the-table cheating in order to eliminate all competition, and to manipulate the success of its members and the flow of grey market money to its advantage – targeting points where sensitivity is greatest and where accountability is low or subjective. Its actions are held confidential under threat of severe penalty against its organization participants. It promotes itself through intimidation, exclusive alliance and legislative power.

Information Diversion – the tendency to continually seek new information on the topic at hand in order to imply the invalidity or indeterminate nature of the subject, or to distract focus away from more relevant but disliked data.

Integrity Tell – a skeptic who has examined themself first should never cheat nor mock objective dissent in order to provide an ideological advantage for favored views. Doing so is the first sign that one’s integrity has been compromised – the telltale sign that one is not really a skeptic.

Interrogative Biasing – ask the wrong question and you are assured to arrive at the right answer. A method of faking science by asking an incomplete, statistical absence, non-probative, ill sequenced or straw man question, fashioned so as to achieve a result which implies a specific desired answer; yet is in no way representative of plenary or ethical science on the matter under consideration.

Inverse Negation Fallacy (of Presumption) – the asymmetric strategy of promoting a desired idea through cancellation of all its antithetical concepts and competing ideas. A method which seeks to undermine and censor any communication, research, or construct which runs counter to a favored idea, often through framing such activity as ‘pseudoscience’ or ‘conspiracy theory’. A surreptitious effort to promote a favored idea without acknowledging it, appearing to be in advocacy for it, or undertaking the risk of exposing that favored idea to the scientific method or critical scrutiny. This is because the implicitly favored model itself, although promoted as Truth™, often is unethical or bears very little credibility when examined stand-alone.

Inversion Effect – an opposite and compensating signal effect inside a research study which has filtered out sample data through invalid study design or an exclusion bias toward a specific observation. By offsetting a subset of the studied population which bears a condition that is not desired for examination or detection, a study can introduce an opposite or contrapositive effect in its analytical results. Vaccines not only do not cause autism, but two major studies showed they actually cure autism. Persons vaccinated for Covid die at 35% the rate of unvaccinated persons in terms of all non-Covid deaths (natural and non-natural). These are outcomes which are impossible, yet which show up statistically because of an invalid exclusion bias which served to produce the effect.
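The mechanism can be reproduced with a toy cohort simulation (all rates and enrollment probabilities below are hypothetical). The ‘exposure’ has no causal effect on mortality whatsoever, yet because frail individuals are preferentially excluded from the exposed arm, the exposed cohort appears to enjoy markedly lower unrelated mortality – an inversion signal manufactured entirely by the exclusion bias.

```python
# Toy demonstration of an exclusion-bias 'inversion effect'. The exposure does
# nothing; only the enrollment filter differs between frail and non-frail.
import random

random.seed(0)

N = 200_000
exposed = {"n": 0, "deaths": 0}
unexposed = {"n": 0, "deaths": 0}

for _ in range(N):
    frail = random.random() < 0.10                 # 10% of population is frail
    death_risk = 0.08 if frail else 0.01           # background, unrelated mortality
    p_enroll_exposed = 0.05 if frail else 0.60     # frail rarely end up 'exposed'
    arm = exposed if random.random() < p_enroll_exposed else unexposed
    arm["n"] += 1
    arm["deaths"] += random.random() < death_risk  # exposure plays no causal role

print("exposed mortality  :", round(exposed["deaths"] / exposed["n"], 4))
print("unexposed mortality:", round(unexposed["deaths"] / unexposed["n"], 4))
# The exposed rate comes out well under half of the unexposed rate purely
# because the frail were filtered out of that arm.
```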

Is-Does Fallacy – a derivative of the philosophical principle that one does not have to framework what something is in order to study what it does. The error of attempting to conform science and epistemology to the notion that one must a priori understand, or hold a context as to, what something is before one can study what ‘it’ does. The error of intolerance resides in the assumption that in order to study a phenomenon, we must assume that its cause is ‘real’ first. When electromagnetic theory was posed, researchers conceived of the context of a ‘field’ as real (= IS) before conducting EMF experimentation. However, not all science can be introduced in this manner. In a true Does/Is, a researcher does not conceive of the context of the cause as real in advance of study. This frees up scientists to study challenging phenomena without having to declare the phenomenon or its context to be ‘real’ first. Is/Does is a problem of induction – as it forces us into a highly constrained form of science called sciebam, which seeks to state a hypothesis as the first step of the scientific method. A corollary of this idea involves the condition wherein the ‘does’ involves some sort of prejudice or will on the part of the subject being studied. Does becomes more difficult to study in such an instance; however, this does not remove from us the responsibility to conduct such study.

Jamais l’a Fait – Never been there. Never done that. Someone pretending to the role of designer, manager or policy maker when in fact they have never actually done the thing they are pretending to legislate, decide upon or design. A skeptic who teaches skepticism, but has never made a scientific discovery nor produced an original thought for themself. Interest rate policy bureaucrats who have never themselves borrowed money to start a business nor been involved in anything but banks and policymaking. User manuals written by third parties, tax laws crafted by people who disfavor people unlike themselves more heavily, hotel rooms designed by people who do not travel much, cars designed by people who have never used bluetooth or a mobile device, etc.

Jumping the Shark – a ridiculous event or issue which serves to expose the latent ridiculousness of other previous events or issues or fecklessness on the part of the event or issue source itself. Roughly synonymous with ‘nuking the fridge’ or ‘opening the vault’.

Kafka Trap – named for the works of novelist Franz Kafka, this defines a circumstance involving bizarre or surrealistic predicaments used to enforce socio-bureaucratic power. A circumstance where participants in a social discourse are coerced to either agree with a particular political view, or be cast into a pre-established set of bucket condemnations based upon familiar and unfair stereotypes. The trap usually leads with an implicit threat of condemnation, which then provides a fixed set of categorizations based upon the conversant’s response to that threat. The conversant’s disagreement with the final categorization is simply ex post facto confirmation that the categorization was indeed correct.

Käseglocke – German for ‘cheese dome’. A condition and/or place where nothing can get in, and nothing can get out. A society, culture or club where development has stultified and the organized group no longer provides benefit to themselves nor those around them or society at large. A group which has ceased learning and/or its ability to increase understanding. See anomie, with respect to the morals and ethics version of such a condition.

Kettle Logic – using multiple inconsistent arguments or discipline examples to defend a specific position or outcome at all costs.

Kit Cynicism – a method of being a cynic, such as in the case of Carl Sagan’s Baloney Detection Kit, which allows one to be a cynic while at the same time convincing oneself that one is not.

Knapp’s Axiom – a quip by Las Vegas based investigative reporter for KLAS-TV, George Knapp, “The purpose of science is to investigate the unexplained, not explain the uninvestigated.” Citing the fallacy of an appeal to ignorance especially as it pertains to the refusal to develop plurality around challenging and robust ‘paranormal’ observation sets.

Laches Scheme – any sequential process, logical argument or legal circumstance in which a position is rendered null because it was not pursued in time – ironically however, neither could it have validly been pursued in time to begin with. A laches argument is one involving an unreasonable delay in making an assertion or claim, such as asserting a right, claiming a privilege, or making an application for redress. The scheme arises in the circumstance by which the subject claim could also not have been technically pursued during the period in which such claim was supposed to have been made. A form of Catch-22 common in corrupt elections or funding/assistance programs.

The Last Contributor (Principle of the Critique) – a shallow person (often one who exploits this principle intentionally) who contributes the last $5 to a $100 fund, or provides a finishing element of critique to a long effort of creativity, conjecture, risk, work, failure and success – and is often tempted to believe that they provided the entire solution or set of value; often believing that they provided the entire $100, and lording it over the other providers.

Learned Helplessness – behavior exhibited by a subject inside a closed environment into which a negative stimulus is repeatedly introduced. After enduring repeated aversive stimuli beyond their control, eventually they resign themself to the reality of its negative presence. Possibly even introducing an amnesia about it – see Post Stockholm Syndrome.

Learned Helpfulness – in a chaotic and open environment bearing a large set of both positive and negative stimuli, the brain’s default state is to assume that control is not present; the presence of ‘helpfulness’ is what must actually be learned.

Lemming Inertia/Karen Train – the propensity for a syndicate, club, or advocacy group to be deluded by ad populum (appeal to club popularity of an idea) and ad virtutem (appeal to virtue of self and their ideas) in support of a notion – to the extent that even if they are found wrong, the movement can no longer be stopped. The belief that because one is acting as proxy in the name of some oppressed party, therefore they are now qualified to rule over others, silence speech, scream, attack, and harm in the name of that virtue costume – without any further circumspection or accountability from that point onward.

Axiom of the Lesson Learned – a lesson can only be learned if one is willing to examine, and capable of learning.

Likeocracy – a society whose governing direction is determined by the number of ‘likes’ given to celebrity one-liners published over state-authorized-monopoly media channels.

Lindy-Ignorance Vortex – do those who enforce or imply a conforming idea or view seem to possess a deep emotional investment in ensuring that no broach of subject is allowed regarding any thoughts or research around an opposing idea, or around specific ideas or avenues of research they disfavor? Do they easily and habitually imply that their favored conclusions are the prevailing opinion of scientists? Is there an urgency to reach or sustain this conclusion by means of short-cut words like ‘evidence’ and ‘fact’? If such disfavored ideas are considered for research, or are broached at all, is extreme disdain, along with social and media derision, called for?

Verdrängung Mechanism – the level of control and idea displacement achieved through skillful employment of the duality between pluralistic ignorance and the Lindy Effect. The longer a control-minded group can sustain an Omega Hypothesis perception by means of the tactics and power protocols of proactive pluralistic ignorance, the greater future acceptability and lifespan that idea will possess. As well, the harder it will be to dethrone as an accepted norm, or as a perception of a ‘proved’ null hypothesis.

Linear Affirmation Bias – a primarily inductive methodology of deriving inference in which the researcher starts in advance with a premature question or assumed answer they are looking for. Thereafter, observations are made. Affirmation is a process which involves only positive confirmations of an a priori assumption or goal. Accordingly, under this method of deriving inference, observations are classified into three buckets:

1. Affirming
2. In need of reinterpretation
3. Dismissed because they are not ‘simple’ (conforming to the affirmation underway).

Under this method, the model is complicated by reinterpretations, failing the test that a model should be elegant, not exclusively simple. By means of this method, necessity under Ockham’s Razor is assumed in advance and all observations thereafter are merely reconfigured to fit the assumed model. At the end of this process, the idea which was posed in the form of a question, or sought at the very start, is affirmed as valid. Most often this idea is thereafter held as an Omega Hypothesis (more important to protect than the integrity of science itself).

Lob & Slam Ploy – a version of good cop/bad cop wherein a virtual partnership exists between well known fake news ‘satire’ outlets and so-called ‘fact checker’ media patrols. The fake news is generated and posed to the web as satire, subsequently stripped of its context by a third party (which often is the debunking or fact check site to begin with), and then inserted into social media as true – whereupon it is virally circulated. Subsequently, ‘fact checking’ agencies are alerted to this set up (the Lob), and then slam home the idea of the fake nature of the ‘news’, as well as the lack of credibility and gullible nature of those who passed it around through social media. This in itself is a fake ploy, a form of Fake-Hoaxing and Hoax Baiting practiced by social agenda forces seeking to artificially enhance the credibility of a news ‘fact checker’.

The Logical Truth of Extraordinary Evidence – any claim which exposes a stakeholder to risk, ignorance or loss of value – regardless of how ordinary, virtuous or correct – demands extraordinary evidence. The correct version of Carl Sagan’s ‘extraordinary claims demand extraordinary evidence’.

Logical versus Semantic Truth – a logical truth is a statement which is true, and remains true under all reinterpretations of its components or in all contexts aside from simply that of its apperception and crafting. A semantic truth is only true in certain given circumstances.

Loosh – a spiritual intoxicant derived from the acute suffering of higher sentience beings, including and especially those who are human or of an unblemished, young or innocent nature. The addictive essence which is derived from a blood sacrifice, broad scale destitution, war, pandemic, and/or sexual exploitation, which imparts a temporary rush in its abuser, mistakenly perceived as spiritual power. A currency or wage which is doled out among those who have unwisely addicted themselves to it, which is used to control and direct the actions of a mindless dark hierarchy.

Lotto Ticket Shuffle Scam – a scam wherein two persons pool money and buy 10 lottery tickets, however only one of them goes to the store and buys the tickets; he subsequently reports back that all his tickets won and all his partner’s tickets lost. Or a business wherein one partner owns all the profitable aspects of the business, and his partners own all the non-profitable ones. The same is done with ‘reliable’ data being produced only by an authorized club – all my data is fact and all your data is anecdote.

Ludic Fallacy – a fallacy identified by Nassim Nicholas Taleb in his book The Black Swan, wherein one favors a theoretical probability history of knowns and known unknowns, either exclusively or more heavily than unknown unknowns. Under this mistaken fascination with technology, mathematics and/or stochastics, one may assemble a simulation, model or game whose probability constraints (or lack thereof) leave the model fully exposed to unknown unknowns. This renders the gaming or simulation catastrophically misinforming.
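A minimal sketch of the error, under assumed parameters: fit a thin-tailed (Gaussian) model to observations that in reality come from a fat-tailed distribution, then ask both the data and the model for the probability of an extreme loss. The model reports ‘effectively never’; the data disagree by many orders of magnitude.

```python
# Ludic-fallacy sketch: a Gaussian 'game' fitted to fat-tailed reality.
# The Student-t(3) stand-in for reality and all sample sizes are hypothetical.
import math
import random
import statistics

random.seed(1)

def t3_sample() -> float:
    # Student-t with 3 degrees of freedom: normal over sqrt(chi-square/3).
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(3))
    return z / math.sqrt(chi2 / 3)

reality = [t3_sample() for _ in range(100_000)]

# The modeler keeps only a Gaussian with the sample mean and standard deviation.
mu, sigma = statistics.mean(reality), statistics.stdev(reality)
threshold = mu - 10 * sigma                           # a '10-sigma' loss

empirical_tail = sum(x < threshold for x in reality) / len(reality)
gaussian_tail = 0.5 * math.erfc(10 / math.sqrt(2))    # Gaussian P(10-sigma move)

print("empirical P(10-sigma loss):", empirical_tail)  # small but nonzero
print("Gaussian  P(10-sigma loss):", gaussian_tail)   # ~7.6e-24, i.e. 'never'
```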

Machiavelli Solution – a three stage ‘solution’, implemented through an often unseen or unappreciated agency’s manipulation of a population. This is what fake and celebrity skeptics are doing to us today – they work to foment conflict between the public and science/scientists – in order to exploit the self-sublation into their own power and enforcement of their own religion, sol-nihilism. There are three steps to this:

1. Hegelian Dialectic – three dialectical stages of development: a thesis, giving rise to its reaction; an antithesis, which contradicts or negates the thesis; and the tension between the two being resolved by means of a synthesis. In more simplistic terms, one can consider it thus: proposition → anti-proposition → solution.

However, the proposition and anti-proposition become stuck in a thing called self-sublation. A state in which both extremes have been falsified, yet no one can give either extreme up because of the perceived risk of a victory by the other side:

2. Self-Sublation (autoaufheben) – Hegelian principle of a dialectic which is stuck in stasis through an idea both canceling and sustaining itself at the same time. A doubled meaning: it means both to cancel (or negate) and to preserve at the same time.

With the proposition/anti-proposition tension now stuck as its own perpetual argument, this gives rise to the surreptitiously played:

3. Machiavelli Solution – a third party creates and/or exploits the self-sublation condition of a Hegelian dialectic bifurcation at play, in order to sustain a conflict between two opposing ideas or groups, and eventually exploit those two groups’ losses into its own gain in power.

Machinated Doubt – tendering the appearance of applying skeptical Cartesian Doubt to every observation except for those which happen to support one’s favored idea, belief or Omega Hypothesis.

Magician’s Sleeve – the critical twist in logical calculus, equivocation, soundness, or context/salience, which allows one ostensible conclusion to be surreptitiously converted into a differing, more desirable conclusion on the part of the magician. A formal fallacy in logical calculus which is hidden inside the complexity of the argument context itself.

Maleduction – an argument must be reduced before it can be approached by induction or deduction. Failure to reduce an argument, or to verify that the next appropriate question under the scientific method is indeed being addressed, is a sign of pseudoscience at play. The non rectum agitur fallacy of attempting to proceed under, or derive conclusions from, science when the question being addressed is agenda driven, non-reductive or misleading in its formulation or sequencing.

Manager’s Error – from Nassim Taleb’s tome Fooled by Randomness (2001). The principle of forcing an argument into an artificial binary or bifurcated outcome set, examining only that which is a priori deemed to be the more probable or simple outcome, and not the choice which can serve to produce the largest net effect or ‘payoff’. Researching only the most likely, framework-compliant or simple alternative will merely serve to confirm what we already know, and bears a much lower payoff in terms of the information which might be garnered through a black swan, less likely or ‘complex’ alternative turning out to bear any form of credence or final veracity.

Mazumah Shift – the inflection point wherein more grant money can be applied for or obtained for work based on a new scientific approach/theory than can be readily obtained for work based upon an older theory. A twist of Kuhn’s Theory of Scientific Revolution or Planck’s Paradigm Shift.

McCulloch’s Axiom – it is often possible to extrapolate the truth solely from what is banned. Named for physicist Mike McCulloch, key proponent of QI theory alternatives to Dark Matter.

McLuhan’s Axiom – “Only small secrets need to be protected. The large ones are kept secret by the public’s incredulity.” A quip attributed to philosopher on media theory, Herbert Marshall McLuhan.

McNamara Fallacy – named for Robert McNamara, the US Secretary of Defense from 1961 to 1968; the error of making a decision based solely on quantitative observations or ‘metrics’ and ignoring all other factors. Often the rationale behind this is that other forms of observation cannot be proven. But the reality is often that the metrics obsession stems mostly from laziness, cost avoidance, or the fact that metrics tend toward certain answers. See phantasiae vectis.

mésa éxo Prose or Communication – (from Greek: μέσα έξω; mésa éxo – ‘inside out’) – when one employs sophisticated style in communication as a substitute for competence. Communication is achieved by means of two elements: style and logical delivery. Mésa éxo prose is writing or speech which is delivered in a compliant and user-friendly style (a grammar, flow, idiom, and sentence structure with which the reader will most likely be familiar and view as culturally sophisticated), yet which is convoluted in terms of its inference, logical structure, integrity or ability to deliver actual intelligence. Just because its style may appear elite or comfortable does not mean that a piece of communication has been skillfully delivered. A common deception applied in journalism. See also ‘Bridgman Point’.

Methodical Cynicism – when one practices methods of cultivating ignorance through corruption of the process which regulates our social and scientific understanding. The exploitation of denial mandating a personal religious belief set while at the same time tendering an affectation of science.

Methodical Deescalation – employing abductive inference in lieu of inductive inference when inductive inference could have, and under the scientific method should have, been employed. In similar fashion employing inductive inference in lieu of deductive inference when deductive inference could have, and under the scientific method should have, been employed.

All things being equal, the latter is superior to the midmost, which is superior to the former:

  • Conformance of panduction (while a type/mode of inference this is not actually a type of reasoning)
  • Convergence of abductions
  • Consilience of inductions
  • Consensus of deductions

One of the hallmarks of skepticism is grasping the distinction between a ‘consilience of inductions’ and a ‘convergence of deductions’. All things being equal, a convergence of deductions is superior to a consilience of inductions. When science employs a consilience of inductions where a convergence of deductions was available yet was not pursued – then we have an ethical dilemma called Methodical Deescalation.

Methodical Doubt – doubt employed as a skulptur mechanism, to slice away disliked observations until one is left with the data set one favored before coming to the argument. It is the questionable method of denying that something exists or is true simply because it defies a certain a priori stacked provisional knowledge. This is nothing but a belief expressed in the negative, packaged in such a fashion as to exploit the knowledge that claims to denial are afforded immediate acceptance over claims to the affirmative. This is a religious game of manipulating the process of knowledge development into a whipsaw effect supporting a given conclusion set.

Misinformation Selectiveness – cherry-picking of eyewitness data through the belief that memory becomes less accurate because of interference from post-event information – except for that information which a claimant happens to favor.

missam singuli – a shortfall in scientific study wherein two factors are evaluated by non-equivalent statistical means. For instance, risk which is evaluated by individual measures, compared to benefit which is evaluated as a function of the whole – to the ignorance of risk as a whole. Conversely, risk being measured as an effect on the whole, while benefit is only evaluated in terms of how it benefits the individual or a single person.
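
A minimal arithmetic sketch of the mismatch, using hypothetical headline figures; putting both factors onto a common denominator is the equivalence the entry says is missing.

```python
# A minimal sketch of the missam singuli mismatch, using hypothetical figures.
# Risk is quoted per individual, benefit is quoted for the whole population;
# placing both on the same denominator exposes the non-equivalence.

population = 1_000_000

# Hypothetical headline figures, as they are often presented:
risk_per_individual = 1 / 10_000        # 'only a 1 in 10,000 chance of harm'
benefit_whole_population = 120          # 'prevents 120 cases nationally'

# Re-expressed on a common per-population basis:
harms_whole_population = risk_per_individual * population
benefit_per_individual = benefit_whole_population / population

print(f"Harms across the whole population: {harms_whole_population:.0f}")
print(f"Benefit expressed per individual:  {benefit_per_individual:.6f}")
# Quoted in mismatched units the intervention sounds both safe and beneficial;
# on a common denominator the trade-off (100 harms vs 120 cases prevented) is far less clear.
```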

Mona Lisa Effect – the misconception on the part of a person pushing their celebrity or narcissism by means of an agenda, or even agenda of convenience or virtue (about which they care nothing in reality), that every evolution of a conversation is about them or contains some schemed affront to their virtue or truth. Named after the effect (mistakenly attributed to the Leonardo da Vinci painting the Mona Lisa) wherein a painting’s eyes will present the illusion of following one about the room as they move by it.

Moral Recourse – an appeal to morality wherein a faking arguer, who actually bears no interest in the science behind an issue, is outflanked and actual science is no longer on his side. He will shift to moral arguments and attempt to make his opponents appear to be bad or immoral for their stance. This is the shift we see underway now in vaccine science for instance: now that early immune activation and injected aluminum are linked in numerous studies to autism, the argument is no longer scientific, rather a moral appeal.

Moving the Goalposts – argument in which evidence successfully presented in response to a specific demand for proof of an observation is dismissed and some other (often greater) evidence is then demanded.

MPUC – Maximize Profits Until the Collapse. The prevailing strategy of extraction-minded cronies and elites – of crooks, super salesmen, Ponzi and pyramid schemers, and Wall Street exploitation specialists.

Myth of Certainty/Myth of Proof – based upon the wisdom elicited by the Leon Wieseltier quote ‘No great deed private or public has ever been undertaken in a bliss of certainty’.

negare attentio Effect – the unconscious habituation of a person seeking publicity or attention in which they will gravitate more and more to stances of denial, skepticism and doubting inside issues of debate, as their principal method of communication or public contention. This condition is subconsciously reinforced because they are rewarded with immediate greater credence when tendering a position of doubt, find the effort or scripted method of arguing easier, enjoy shaming and demeaning people not similar to their own perception of self or presume that counter-claims don’t require any evidence, work or research.

Negative Reactance –  an Aristotelian posturing wherein one, upon confrontation with objectionable principles, thereafter embraces the opposite of such objectionable principles, avoiding any possible middle path or other rational option – as a defensive reaction to such objectionable principles. If one adopts a set of tenets or a lie of allegiance, even if that set of beliefs does not qualify as a religion in and of itself, solely as a reaction to a religion one has departed from recently or in the past, and/or as a way of seeking revenge or retribution or cathartic reward over past hurts and regrets regarding one’s membership in the former religion – then one is simply operating inside a duality and indeed has simply adopted another religion.

Neglect by Proxy – when one employs pluralistic ignorance, false consensus or a doctrine of religious belief as a preemptive excuse or rationale as to why it is unnecessary to examine a challenging body of evidence.

Nelsonian Knowledge – a precise and exhaustive knowledge, about that which one claims is not worth examining. No expertise is so profound in its depth as that expertise prerequisite in establishing what not to know. Such Nelsonian knowledge takes three forms:

1. a meticulous attentiveness to, and absence of, that which one should ‘not know’,
2. an inferential method of avoiding such knowledge, and finally as well,
3. that misleading knowledge or activity which is used as a substitute in place of actual knowledge (organic untruth or disinformation).

The former (#1) is taken to actually be known on the part of a poseur. It is dishonest for a man deliberately to shut his eyes to principles/intelligence which he would prefer not to know. If he does so, he is taken to have actual knowledge of the facts to which he shut his eyes. Such knowledge has been described as ‘Nelsonian knowledge’, meaning knowledge which is attributed to a person as a consequence of his ‘willful blindness’ or (as American legal analysts describe it) ‘contrived ignorance’.

Nelsonian Strategy – when one intentionally creates a situation one knows will, or likely will, result in a disastrous outcome – an outcome which serves to benefit one’s profit, power, or harm of enemies – yet which also will not technically be their ‘fault’. For example, SARS-CoV-2 gain-of-function research funding being cut in the US, with the work and funding shifted to brand new and inexperienced Chinese labs, gave the appearance of virtue, diligence, and non-culpability – yet was motivated by the knowledge that the outcome of an accidental release of the virus would benefit The Party immensely.

Neti’s Razor – the principle which serves to cut nihilism as a form of belief, distinct from all other forms of atheism as either philosophy or belief. From the Sanskrit idiom, Neti Neti (not this, not that): one cannot produce evidence from that which at a point did or will not exist, to also demonstrate that nothing aside from that entity therefore exists.

Neuhaus’s Law – where orthodoxy is optional, orthodoxy will sooner or later be proscribed. Skepticism, as a goal in and of itself will always escalate to extremism.

Neutrality as Bias Error – the error of assuming that a neutral party will conduct more diligent scientific investigation into a controversial topic than would a sponsor of an idea, when vulnerabilities actually compromise such an approach. Neutral parties are less inclined to be up to date on subject intelligence, may ask the wrong question, may fail to discern validity of data or the difference between authentic research and reactive propaganda, may research the wrong facet of the issue, and might perceive a parsimonious need to result in conforming explanations as looming larger than the plurality introduced by facets of the research.

Newman’s Doctrine – a resilience on the part of one’s victim in no way serves to exonerate the immorality of one’s crime. The resulting benefits in terms of wisdom, resilience and strength on the part of the victim in no way serve to justify the decision to enact a harm upon that victim. What makes you stronger is not therefore forgiven in its attempt to kill you. (See Bastiat Fallacy).

Noitcif – fiction is the act of weaving a series of lies to arrive at a truth. Noitcif is the act of weaving a series of facts to arrive at a lie.

non rectum agitur Fallacy – a purposeful abrogation of the scientific method through corrupted method sequence or the framing and asking of the wrong, ill prepared, unit biased or invalid question, conducted as a pretense of executing the scientific method on the part of a biased participant. Applying a step of the scientific method, out of order – so as to artificially force a conclusion, such as providing ‘peer review’ on sponsored constructs and observations, rather than studies and claims, in an effort to kill research into those constructs and observations.

Nonaganda – (see Evidence Sculpting or Skulptur Mechanism) a media outlet which does no real investigation, relates 100% accurate fact, or even does ‘fact-checking’ – yet which still ignores 50% of the relevance concerning an issue – is still fake news.

Novella Split – when one flees from addressing a challenging topic by citing/issuing standardized shallow past doctrines as authority; further then refusing to intellectually or professionally regard the challenging subject or observation ever again. The “My job is done here now” or “I’ve written about this before” cocoon of defensiveness on the part of a fake skeptic.

Numptured/Numptant/Numpty – a person who is educated or intelligent enough to execute a method, memorize a list of key phrases/retorts or understand some scientific reasoning, yet is gullible or lacking in circumspection to the point where they are unable to understand the applicable deeper meaning/science, the harm they cause, or their role in being manipulated inside propaganda. A numptant, or ‘numpty’, can be discerned through the number of subjects about which they like to argue. This indicates a clear preference not for any compassion or concern regarding a particular subject, but rather for the superior nature of their own thinking, argument, adherence to rationality and compliance inside any topic in which they can demonstrate such. Science, or the pretense thereof, is a handy shield behind which to exercise such a neurosis.

Objective and Subjective Domain Error – holding a domain argument (such as God or monism) as consensus becomes an oppressive action called an Einfach Mechanism. Promoting a domain idea, such as God or Material Monism, into the place of a testable or objective hypothesis – and thereafter treating it as if it were an actual hypothesis. Most scientists do not take much philosophy, so they fall vulnerable to this mind trick. The least scientific thing one can do is to believe the null hypothesis. This gets even worse if the null hypothesis is not even a true hypothesis – and rather is a subjective domain of ideas.

Observation vs Claim Blurring – the false practice of calling an observation or data set a ‘claim’ on the observers’ part. This is done in an effort to subjugate such observations into the category of constituting scientific claims which therefore must now be ‘proved’ or dismissed (the real goal: see Transactional Occam’s Razor Fallacy). In fact an observation is simply that – a piece of evidence or a cataloged fact. Its false dismissal under the pretense of being deemed a ‘claim’ is a practice of deception and pseudoscience.

Observational Occam’s Razor Fallacy (Exclusion Bias) – through insisting that observations and data be falsely addressed as ‘claims’ needing immediate explanation, and through rejecting such a ‘claim’ (observation) based upon the idea that it introduces plurality (it is not simple), one effectively ensures that no observations will ever be recognized which serve to frame and reduce a competing alternative.  One will in effect perpetually prove only what they have assumed as true, regardless of the idea’s inherent risk. No competing idea can ever be formulated because outlier data and observations are continuously discarded immediately, one at a time by means of being deemed ‘extraordinary claims’.

Observer Expectancy Effect – when a researcher expects a given result and therefore unconsciously manipulates an experiment or scientific method or misinterprets data in order to find that expected result.

Obtollence – (The Principle of Ethical Skepticism) – Latin ob – against, plus tollens – denial. Fake skeptics love to ply their wares in proving an absence (Hempel’s Paradox) – applying science to deny that things exist (prove the null, or prove absence); when such activity is unethical, impossible or even unnecessary. They seek to remove any question of modus indifferens (the neutrality of skepticism) at all costs. An ethical researcher avoids any form of Hempel’s Paradox – whereas a fake researcher dwells in it most of the time.

Occam’s Razor – all things being equal, that which is easy for most to understand, and as well conforms with an a priori stack of easy-to-understands, along with what I believe most scientists think, tends to obviate the need for any scientific investigation. A false logical construct invented by SSkepticism to replace and change the efficacy of Ockham’s Razor, the latter employed as a viable principle in scientific logic. Occam’s Razor was a twist off the older Ockham’s Razor – slight and almost undetectable, yet able to reverse the applicability of the more valid thought discipline inside of Ockham’s Razor. “All things being equal, the simplest explanation tends to be the correct one” is a logical fallacy, constituting a completely different and antithetical approach than that of Ockham’s Razor. Occam’s Razor can only result in conformance-based explanations, regardless of their scientific validity.

Occam’s Razor Fallacy – abuse of Ockham’s Razor (and misspelling) in order to enact a process of sciencey-looking ignorance and to impose a favored idea. All things being equal, that which is easy for most to understand, and as well conforms with an a priori stack of easy-to-understands, along with what I believe most scientists think, tends to obviate the need for any scientific investigation. Can exist in four forms: transactional, existential, observational and utility blindness.

Transactional Occam’s Razor Fallacy (Appeal to Ignorance) – the false contention that a challenging construct, observation or paradigm must immediately be ‘explained.’ Sidestepping of the data aggregation, question development, intelligence and testing/replication steps of the scientific method and forcing a skip right to its artificially conclusive end (final peer review by ‘Occam’s Razor’).

Existential Occam’s Razor Fallacy (Appeal to Authority) – the false contention that the simplest or most probable explanation tends to be the scientifically correct one. Suffers from the weakness that myriad and complex underpinning assumptions, based upon scant predictive/suggestive study, provisional knowledge or Popper insufficient science, result in the condition of tendering the appearance of ‘simplicity.’

Observational Occam’s Razor Fallacy (Exclusion Bias) – through insisting that observations and data be falsely addressed as ‘claims’ needing immediate explanation, and through rejecting such a ‘claim’ (observation) based upon the idea that it introduces plurality (it is not simple), one effectively ensures that no observations will ever be recognized which serve to frame and reduce a competing alternative.  One will in effect perpetually prove only what they have assumed as true, regardless of the idea’s inherent risk. No competing idea can ever be formulated because outlier data and observations are continuously discarded immediately, one at a time by means of being deemed ‘extraordinary claims’.

Utility Blindness – when simplicity or parsimony are incorrectly applied as excuse to resist the development of a new scientific explanatory model, data or challenging observation set, when indeed the participant refuses to consider or examine the explanatory utility of any similar new model under consideration.

Facile – appearing neat and comprehensive only by ignoring the true complexities of an issue; superficial. Easily earned, arrived at or won – derived without the requisite rigor or effort. Something easy to understand, which is compatible with a predicate or associated stack of also easy-to-understands.

Ockham’s Inversion – the condition when the ‘rational or simple explanation’ requires so many risky, stacked or outlandish assumptions in order to make it viable, that it has become even more outlandish than the complex explanation it was originally posed against and was supposed to surpass in likelihood. Similarly, a condition wherein the proposed ‘more likely or simple’ alternative is just as outlandish in reality as is the originally considered one.

Omega Hypothesis (HΩ) – the argument which is foisted to end all argument, period. A conclusion promoted under such an insistent guise of virtue or importance, that protecting it has become imperative over even the integrity of science itself. An invalid null hypothesis or a preferred idea inside a social epistemology. A hypothesis which is defined so as to end deliberation without due scientific rigor or alternative study consensus, or which is afforded unmerited protection or assignment as the null. The surreptitiously held and promoted idea or the hypothesis protected by an Inverse Negation Fallacy. Often one which is promoted as true by default, with the knowledge in mind that falsification will be very hard or next to impossible to achieve.

1.  The (Wonka) Golden Ticket – Have we ever really tested the predictive strength of this idea standalone, or evaluated its antithetical ideas for falsification? Does an argument proponent constantly insist on a ‘burden of proof’ upon any contrasting idea, a burden that they never attained for their argument in the first place? An answer they fallaciously imply is the scientific null hypothesis; ‘true’ until proved otherwise?

Einfach Mechanism – an idea which is not yet mature under the tests of valid hypothesis, yet is installed as the null hypothesis or best explanation regardless. An explanation, theory or idea which sounds scientific, yet resolves a contention through bypassing the scientific method, then moreover is installed as truth thereafter solely by means of pluralistic ignorance around the idea itself. Pseudo-theory which is not fully tested at its inception, nor is ever held to account thereafter. An idea which is not vetted by the rigor of falsification, predictive consilience nor mathematical derivation, rather is simply considered such a strong, or Occam’s Razor (sic) stemming-from-simplicity idea that the issue is closed as finished science or philosophy from its proposition and acceptance onward. A pseudo-theory or false hypothesis which is granted status as the default null hypothesis or as posing the ‘best explanation’, without having to pass the rigors with which its competing alternatives are burdened. The Einfach mechanism is often accompanied by social rejection of competing and necessary alternative hypotheses, which are forbidden study. Moreover, the Einfach hypothesis must be regarded by the scientific community as ‘true’ until proved otherwise. An Einfach mechanism may or may not be existentially true.

2.  Cheater’s Hypothesis – Does the hypothesis or argument couch a number of imprecise terms or predicate concepts? Is it mentioned often by journalists or other people wishing to appear impartial and comprehensive? Is the argument easily falsified through a few minutes of research, yet seems to be mentioned in every subject setting anyway?

Imposterlösung Mechanism – the cheater’s answer. A disproved, incoherent or ridiculous contention, or one which fails the tests to qualify as a real hypothesis, which is assumed as a potential hypothesis anyway simply because it sounds good or is packaged for public consumption. These alternatives pass muster with the general public, but are easily falsified after mere minutes of real research. Employing the trick of pretending that an argument domain which does not bear coherency nor soundness – somehow (in violation of science and logic) falsely merits assignment as a ‘hypothesis’. Despite this, most people hold them in mind simply because of their repetition. This fake hypothesis circumstance is common inside an argument which is unduly influenced by agency. They are often padded into skeptical analyses, to feign an attempt at appearing to be comprehensive, balanced, or ‘considering all the alternatives’.

Ad hoc/Pseudo-Theory – can’t be fully falsified nor studied, and can probably never be addressed or can be proposed in almost any circumstance of mystery. They fail in regard to the six tests of what constitutes a real hypothesis. Yet they persist anyway. These ideas will be thrown out for decades. They can always be thrown out. They will always be thrown out.

3.  Omega Hypothesis (HΩ) – Is the idea so important or virtuous that it now stands as more important than the methods of science, or science itself? Does the idea leave a trail of dead competent professional bodies behind it?

Höchste Mechanism – when a position or practice, purported to be of scientific basis, is elevated to such importance or virtue that removing the rights of professionals and citizens to dissent, speak, organize or disagree (among other rights) is justified in order to protect the position or the practice inside society.

Constructive Ignorance (Lemming Weisheit or Lemming Doctrine) – a process related to the Lindy Effect and pluralistic ignorance, wherein discipline researchers are rewarded for being productive rather than right, for building ever upward instead of checking the foundations of their research, for promoting doctrine rather than challenging it. These incentives allow weak confirming studies to be published and untested ideas to proliferate as truth. And once enough critical mass has been achieved, they create a collective perception of strength or consensus.

4.  Embargo Hypothesis (Hξ) – was the science terminated years ago, in the midst of large-impact questions of a critical nature which still remain unanswered? Is such research now considered ‘anti-science’ or ‘pseudoscience’?

Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology which is heading in an undesired direction, artificially declaring under a condition of praedicate evidentia, the science as ‘settled’.

Poison Pill Hypothesis – the instance wherein sskeptics or agency work hard to promote lob & slam condemnation of particular ideas. A construct obsession target used to distract or attract attack-minded skeptics into a contrathetic impasse or argument. The reason this is done is not the confusion or clarity it provides, rather the disincentive which patrolling skeptics place on the shoulders of the genuine skilled researcher. These forbidden alternatives (often ‘paranormal’ or ‘pseudoscience’ or ‘conspiracy theory’ buckets) may be ridiculous or indeed ad hoc themselves – but the reason they are raised is to act as a warning to talented researchers that ‘you might be tagged as supporting one of these crazy ideas’ if you step out of line and do not visibly support the Omega Hypothesis. A great example is the skeptic community’s tagging of anyone who considers the idea that the Khufu pyramid at Giza might not have been built by King Khufu in 2450 BCE as therefore supporting conspiracy theories or aliens as the builders – moreover, as being racist against Arabs, who now are the genetic group which occupies modern Egypt.

5.  Evidence Sculpting – has more evidence been culled from the field of consideration for this idea, than has been retained? Has the evidence been sculpted to fit the idea, rather than the converse?

Skulptur Mechanism – the pseudoscientific method of treating evidence as a work of sculpture. Methodical inverse negation techniques employed to dismiss data, block research, obfuscate science and constrain ideas such that what remains is the conclusion one sought in the first place. A common tactic of those who boast of all their thoughts being ‘evidence based’. The tendency to view a logical razor as a device which is employed to ‘slice off’ unwanted data (evidence sculpting tool), rather than as a cutting tool (pharmacist’s cutting and partitioning razor) which divides philosophically valid and relevant constructs from their converse.

Also, the instance common in media wherein so-called ‘fact-based’ media sites tell 100% truth about 50% of the relevant story. This is the same as issuing 50% misinformation or disinformation.

6.  Lindy-Ignorance Vortex – do those who enforce or imply a conforming idea or view seem to possess a deep emotional investment in ensuring that no broach of subject is allowed regarding any thoughts or research around an opposing idea, or specific ideas or avenues of research they disfavor? Do they easily and habitually imply that their favored conclusions are the prevailing opinion of scientists? Is there an urgency to reach or sustain this conclusion by means of short-cut words like ‘evidence’ and ‘fact’? If such disfavored ideas are considered for research or are broached, are extreme disdain and social and media derision then called for?

Verdrängung Mechanism – the level of control and idea displacement achieved through skillful employment of the duality between pluralistic ignorance and the Lindy Effect. The longer a control-minded group can sustain an Omega Hypothesis perception by means of the tactics and power protocols of proactive pluralistic ignorance, the greater future acceptability and lifespan that idea will possess. As well, the harder it will be to dethrone as an accepted norm or as a perceived ‘proved’ null hypothesis.

Omega Hypothesis Principle of Causatum – if you approach a subject with flawed assumptions, everything will appear to be a mystery from that point on. The principle citing that, as the number of enforced Omega Hypotheses increases inside a discipline or subject, so will the number of quandaries, mysteries, paradoxes and conundrums – and in arithmetic proportion. By the principle of the contrathetic impasse, such entities of conflict and unresolvability relate in direct proportion to bad underlying assumptions in force.

omnis doctrina – when an authority insists that, in order to be a member or adherent to a club, citizenship, religion or group, one must believe all the tenets of the charter or mantra of that group without question or dissent. Ideas such as ‘you can’t just throw out parts of the Bible which you don’t like, and keep the rest,’ or ‘you cannot pick and choose the science you like and do not like’ or ‘you cannot be an American and toss out the 4th Amendment’. An appeal to authority, which can slip by and sound more reasonable because it is offered in a rhetorical reverse fashion of posing.

Open-Ended Fallacy – an argument, contention, or objective which stipulates attainment of something which is either undefined, difficult to measure, involves changing goals, is impossible to attain, or would require so much investment of resources that the involved costs are not worth the attainment benefits. A method of arguing/oppression which is used to enslave an opponent under an unresolvable standard or burden.

Orphan Question – a question, purported to be the beginning of the scientific method, which is asked in the blind, without sufficient intelligence gathering or preparation research, and is as a result highly vulnerable to being manipulated or posed by means of agency. The likelihood of a scientifically valid answer being developed from this question process, is very low. However, an answer of some kind can almost always be developed – and is often spun by its agency as ‘science’. This form of question, while not always pseudoscience, is a part of a modified process of science called sciebam. It should only be asked when there truly is no base of intelligence or body of information regarding a subject. A condition which is rare.

Ostrich Effect – the tendency of a person when facing a losing scenario, danger or data which one does not favor, to bury one’s head in the sand and ignore the issue, person or new information.

Outcome Error – judging a decision based on the outcome of the decision rather than by the soundness of the methodology which went into making the decision. The ends justify the means.

Outference – a critical (not rhetorical) argument which bases its inference or conclusions upon cultivated ignorance and the resulting lack of information, rather than the presence of sound information. More than simply an appeal to ignorance, this ‘lack’ of information is specifically engineered to produce a specious conclusion in the first place. This type of argument gets stronger and stronger the less and less critical information one holds. This is a warning flag of agenda or political shenanigans at play.

Outsourced Critical Thinking – an oxymoron. Skepticism is after all ‘a null hypothesis holding that experts are fallible’. All this type of appeal to authority constitutes is: one doesn’t trust oneself to analyze the evidence that another ordinary person can easily survey – and defaults to someone with credentials to do their thinking for them. This corruption on the part of stakeholder thinking renders science nothing more than: outsourced critical thinking.

Overconfidence Effect – excessive confidence in one’s own answers to questions based on ego, past success or one being an expert or scientist.

Overshooting the Question – to subconsciously make the mistake of responding to a simple question with a more complex answer than was asked or required – an indicator of agency and/or Nelsonian Knowledge. The act of subconsciously answering a question which was not actually asked, indicating a fear or a degree of defensiveness which betrays agency, bias, or culpability.

Oversimplification (Pseudo Reduction) – instead of reducing an argument so that its contributing elements can be tested, a pretend skeptic will oversimplify the argument as a pretense of reduction. A form of false reduction which only serves to reduce the possible outcomes, and not actually deconstruct an argument into its logical critical path. Rather than examining all contributing elements of cause to effect, soundness or observation, the oversimplifier pares off those influences, constraints, objectives, and factors which serve to get in the way of their agency or desired conclusion – thereafter employing ‘Occam’s Razor’ simplicity as an apologetic. ‘The dose makes the poison’, or ‘non-ionizing radiation can’t cause cancer’ are examples of pseudo-reduction. The arguer appears to be stepping down to a level of cause-and-effect inference, however has excluded so many factors that there can only be one a priori inference drawn from the remaining set of influences.

Overton Window Manipulation – an Overton Window is the range of opinion positions which are acceptable to the mainstream population at a given time. It is also known as the acceptable window of discourse. The term is named after Joseph P. Overton, who stated that an idea’s political viability depends mainly on whether it falls within this range. False skeptics purposely pathologize subjects and individuals in order to artificially truncate or manipulate where this window falls in media, along with what is deemed acceptable for scientific study.

Palter/Paltering – lying through facts. Paltering is the deceptive use of truthful statements to convey a misleading impression or inference. It is the devious art of lying by telling unqualified truths. It usually involves equivocation and/or prevarication as the basis of its management of constraint, context or ignoratio elenchi – however often can also come in the form of a semantic truth as opposed to a logical one.

Panalytics – statistics purposely contrived to be broad and shallow so as to avoid any possible signal detection other than extreme effects, coupled with abject avoidance of any kind of direct observation to confirm. The opposite of intelligence (consilience between analytic signals and direct observation).

Panduction – an invalid form of inference which is spun in the form of pseudo-deductive study. Inference which seeks to falsify in one fell swoop ‘everything but what my club believes’ as constituting one group of bad people, who all believe the same wrong and correlated things – this is the warning flag of panductive pseudo-theory. No follow up series studies nor replication methodology can be derived from this type of ‘study’, which in essence serves to make it pseudo-science.  This is a common ‘study’ format which is conducted by social skeptics masquerading as scientists, to pan people and subjects they dislike. There are three general types of Panduction. In its essence, panduction is any form of inference used to pan an entire array of theories, constructs, ideas and beliefs (save for one favored and often hidden one), by means of the following technique groupings:

  1. Extrapolate and Bundle from Unsound Premise
  2. Impugn through Invalid Syllogism
  3. Mischaracterize through False Observation

The Paradox of Virtue – a decision condition wherein uncertainty or unknowns force one risk (its mitigation most often associated with social virtue) to be served at the expense of all other risks, both known and unknown. The known risk-mitigation is promoted until a Tau Point inflection is reached and decision makers realize that another risk is now showing to have been greater, and reveals our actions to have been a mistake. Usually the cost of the under-served risk is many times higher than the virtue-served risk would have cost if left alone to begin with. There is no ‘good’, there is only evil and not-evil. Under a Paradox of Virtue, those who try to enforce virtuous good, fail for having boasted unreasonable claims of God-level knowledge or power on their part. They end up causing more harm, than good.

Parsimony Regarding Oppression – we must remember that the opposite of conspiracy theory is an even worse mistake called ‘oppression’. Oppression makes the very same mistakes in inference as does the conspiracy theorist – except in the case of oppression, usually a lot of people are harmed as a result.

pavor mensura – an effect of apophenia wherein, upon analyzing a system or looking at a set of analytics or an issue for the very first time, observers will often mistakenly perceive that a disaster is in the making. The erroneous tendency to perceive that the first statistical measures of a disease, natural system, dynamic process or social trend can be reliably extrapolated to predict calamity therein.

Peer Review Gaming – when a study is accused of ‘not following the scientific method’, when in fact it was denied that method – via blocked access to peer review channels, journals and protocols.

Penultimate Set Bias – the contention or implication on the part of a proponent of an idea that they personally hold enough validated conclusion base or data to assume that the gaps in their knowledge will have little or no effect on the veracity of their further conclusions; and therefore they should not be brought into question. The implicit claim being that the proponent holds a ‘next to the last thing knowable’ domain of knowledge on the topic in question. The ‘God of the Gaps’ one liner is typically employed as an apologetic by this false claimant to authority.

Periplocate – (Greek: περίπλοκος, períplokos : complicated, elaborate, involved) – to render a process or approach the resolution of a question in a more complicated fashion than is necessary. The use of complicating ignoratio elenchi, red herring, or overly obscure academic heuristics or methodologies to solve a problem, when such complexity was not required to begin with. This is often done in an effort to capture the topic or question under a fallacy of relative privation (implying that the matter can only be addressed by academics or scientists). The opposite of methodical deescalation.

phantasiae vectis – the principle outlining that, when a human condition is monitored publicly through the use of one statistic/factor, that statistic/factor will trend more favorable over time, without any actual real underlying improvement in its relevant domain or condition. Such singular focus often comes to the detriment of all other related and appropriate factors. Unemployment not reflecting true numbers out of work, electricity rates or inflation measures before key democratic elections, efficiency focus instead of effectiveness, crime being summed up by burglaries or gun deaths only, etc.

Pharmaceutical Research Fraud – nine methods of attaining success results from pharmaceutical studies, which are borderline or fraudulent in nature. The first eight of these were developed by Richard Smith, editor of the British Medical Journal. The ninth is a converse way of applying these fraud accusations to filter out appeals for study under Ockham’s Razor, in situations where studies run counter to pharmaceutical/research revenue goals.

1.  Conduct a trial of your drug against a treatment known to be inferior.

2.  Trial your drug against too low of a dose of a competitor drug.

3.  Conduct a trial of your drug against too high of a dose of a competitor drug (making your drug seem less toxic).

4.  Conduct trials which are too small to show differences from competitor drugs.

5.  Use multiple endpoints in the trial and select for publication those that give favorable results (see the simulation sketch after this list).

6.  Do multicenter trials and select for publication results from centers that are favorable.

7.  Conduct subgroup analyses and select for publication those that are favorable.

8.  Present results that are most likely to impress – for example, reductions in relative risk rather than absolute risk.

9.  Conduct high inclusion bias statistical studies or no studies at all, and employ items 1 – 8 above to discredit any studies which indicate dissenting results.
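
The simulation sketch referenced at item 5 above: a hypothetical illustration of how items 5 and 7 can manufacture a ‘success’ for a drug with no true effect, when many endpoints (or subgroups) are measured and only the most favorable one is published. Sample size, endpoint count and threshold are assumptions chosen purely for illustration.

```python
# A hypothetical illustration of items 5 and 7: a drug with zero true effect is measured
# across many endpoints, and only the most favorable endpoint is selected for publication.
# All parameters (sample size, endpoint count, threshold) are illustrative assumptions.
import math
import random
import statistics

def null_trial_p_value(n: int = 50) -> float:
    """Two-arm comparison for one endpoint of a drug with no true effect (approximate p-value)."""
    drug = [random.gauss(0, 1) for _ in range(n)]
    placebo = [random.gauss(0, 1) for _ in range(n)]
    se = math.sqrt(statistics.variance(drug) / n + statistics.variance(placebo) / n)
    z = (statistics.mean(drug) - statistics.mean(placebo)) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal approximation

random.seed(1)
n_endpoints = 20
n_trials = 500
successes = 0
for _ in range(n_trials):
    p_values = [null_trial_p_value() for _ in range(n_endpoints)]
    if min(p_values) < 0.05:          # publish only the most favorable endpoint
        successes += 1

print(f"Share of null trials reportable as a 'success': {successes / n_trials:.0%}")
# With 20 endpoints and a 0.05 threshold, roughly 1 - 0.95**20 (about 64%) of trials of a
# completely ineffective drug can still be written up as showing a significant benefit.
```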

Phylacterial (Theory) – the opposite of conspiracy theory. A de rigueur theory or memorized set of ideas/evidences which adhere to orthodox views regarding a subject. The set of memorized Schapiro Utterances which serve to identify one as residing in the membership of those who are approved to speak or lead inside a topic or social group. Refers to a phylactery, or a box containing slips inscribed with scriptural passages one must master in order to be considered orthodox/compliant.

Phylactorithm – an algorithm which scans media and discourse for compliant (phylacterial) or forbidden (conspiracy theory) phraseology – tasked with the purpose of bucket characterizing the writer into either the good guy or bad guy bucket. Refers to a phylactery, or a box containing slips inscribed with scriptural passages one must master in order to be considered orthodox/compliant.
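
A minimal sketch of such an algorithm, in Python; the phrase lists, threshold rule and classify function are hypothetical placeholders rather than any real system’s configuration.

```python
# A minimal sketch of a phylactorithm, as the entry describes: a scanner which buckets
# a writer as 'orthodox' or 'suspect' purely from phrase-list matches. The phrase lists
# and the decision rule are hypothetical placeholders, not any real system's lists.

COMPLIANT_PHRASES = {"settled science", "experts agree", "trust the consensus"}
FORBIDDEN_PHRASES = {"cover-up", "what they aren't telling you", "do your own research"}

def classify(text: str) -> str:
    """Bucket the author from surface phraseology alone - no inspection of the argument itself."""
    lowered = text.lower()
    compliant = sum(phrase in lowered for phrase in COMPLIANT_PHRASES)
    forbidden = sum(phrase in lowered for phrase in FORBIDDEN_PHRASES)
    return "orthodox/compliant" if compliant >= forbidden else "suspect/conspiracy"

print(classify("Experts agree the matter is settled science."))
print(classify("Do your own research into the cover-up."))
# Note the weakness the entry implies: the argument's actual logic is never evaluated,
# only its phraseology against the memorized 'phylactery' of approved passages.
```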

Placebo Effect – when simply believing, or being told, that something will have an effect causes one to indeed experience that effect.

Planning Fallacy – the tendency to underestimate the amount of science and method involved in adequate address of a subject, or of task-completion times conducted therein.

Plausible Conformance – a technique of obfuscation employed by SSkeptics to enforce a classic or predetermined conforming conclusion inside a pluralistic set of observations/data. The explanation is oft touted to be in compliance with science and an erroneous interpretation of “Occam’s Razor (sic)” wherein the ‘simplest explanation tends to be the correct one.’ In reality, the proposed conforming scenario, while seeming simple in concept, is highly complicated in its viability or application, and often constitutes an impossible explanation of the data set which has been observed. Plausible Conformance therefore is a method of thought control and data filtering and in no way represents science falsification hierarchy protocols nor the scientific method.

Plausible Deniability – a state of avoidance of the scientific method in which efforts to study an item are blocked, in favor of a standing prophylactic deniability explanatory scenario which acts in lieu of the scientific method. The deniability scenario is often conforming, however does not have to necessarily present Plausible Conformance. Sometimes the deniability scenario must be sufficiently outlandish enough to deflect the risk of research into very challenging/paradigm shifting observations (Extraordinary observations demand extraordinary denials). A provisional argument which is foisted solely for its outcome in blocking the introduction of an opposing explanation or theory. In practice this is often done with little or no suggestive evidence behind it and is validated or declared true simply based upon its plausibility rather than quality, structure or basis.

Plurality Error – adding complexity without merit to an argument. Introducing for active consideration, more than one idea, construct or theory attempting to explain a set of data, information or intelligence when there is no compelling reason to do so. Also, the adding of features or special pleading to an existing explanation, in order to adapt it to emerging data, information or intelligence – or in an attempt to preserve the explanation from being eliminated through falsification.

Policy Based Evidence Manipulation – when an Einfach or Höchste Mechanism is enforced socially by a governing body, or by a group enforcing false consensus and pluralistic ignorance, to such an extent that the researching, data collection, analytical, legislative or other presiding research group is incentivized to construct objective adjustments to the data collection entailed around the issue being enforced. Such adjustments, while often scientifically justifiable, introduce bias in two ways: 1) equally scientific counter-adjustments are not considered (error by omission), and 2) the magnitude of such adjustments is left up to the sole discretion of the data analysis group. This introduces a guaranteed bias into most information sets featuring a high number or dynamic set of contributing factors/influences or a high number of measurement points.
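
A minimal simulation sketch of those two bias pathways (error by omission, plus discretionary magnitude), using hypothetical measurement data.

```python
# A hypothetical illustration of the two bias pathways: each candidate correction may be
# individually defensible, but only corrections favoring the enforced position are applied,
# and their magnitude is left to analyst discretion. All numbers are illustrative assumptions.
import random

random.seed(7)
true_values = [random.gauss(100, 5) for _ in range(1000)]   # the unmanipulated measurements

def adjust(value: float) -> float:
    """Draw a candidate correction, but keep it only when it moves the figure the 'right' way."""
    correction = random.uniform(-2, 2)                        # magnitude at the analyst's discretion
    return value + correction if correction > 0 else value    # counter-adjustments omitted

reported = [adjust(v) for v in true_values]

print(f"True mean:     {sum(true_values) / len(true_values):.2f}")
print(f"Reported mean: {sum(reported) / len(reported):.2f}")
# Error by omission (no downward corrections) plus discretionary magnitude guarantees a
# one-directional drift in the reported aggregate, even though no single step looks fraudulent.
```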

Popper Error – when a predictive study confirming a hypothesis is abused to dismiss falsification-based data or a competing hypothesis, because confirmatory evidence is easy to find and falsification evidence is comparatively of a higher rigor in origin.

Popper Fallacy – when relying on the weak positions of predictive studies, statistical analyses, a ‘study of studies,’ associative and correlative studies, or series of anecdotes to stand as sufficient basis for peer review and/or acceptance of a shaky contention. Such studies are more appropriate for plurality screening, not proof.

Post Stockholm Syndrome – a condition wherein a hostage, more than simply developing an affinity for their captor, begins to develop amnesia about having been a hostage in the first place. Under such a condition, maintenance of this amnesia becomes the preeminent priority.

praedicate evidentia – any of several forms of exaggeration or avoidance in qualifying a lack of evidence, logical calculus or soundness inside an argument. A trick of preemptive false-inference, which is usually issued in the form of a circular reasoning along the lines of ‘it should not be studied, because study will prove that it is false, therefore it should not be studied’ or ‘if it were true, it would have been studied’.

praedicate evidentia – hyperbole in extrapolating or overestimating the gravitas of evidence supporting a specific claim, when only one examination of merit has been conducted, insufficient hypothesis reduction has been performed on the topic, a plurality of data exists but few questions have been asked, few dissenting or negative studies have been published, or few or no such studies have indeed been conducted at all.

praedicate evidentia modus ponens – any form of argument which claims a proposition consequent ‘Q’, yet which lacks the qualifying modus ponens ‘If P then’ premise in its expression – rather, merely implying ‘If P then’ as its qualifying antecedent. This as a means of surreptitiously avoiding a lack of soundness or lack of logical calculus inside that argument; and moreover, enforcing only its conclusion ‘Q’ instead. A ‘There is no evidence for…’ claim made inside a condition of little study or full absence of any study whatsoever.
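
A minimal formalization sketch of the contrast; the readings assigned to P and Q below are illustrative assumptions chosen to match the ‘there is no evidence for…’ example.

```latex
% A sketch of the contrast, assuming illustrative readings of P and Q (not from the source).
% Valid modus ponens requires that both the conditional and its antecedent be established:
\[
\frac{P \rightarrow Q \qquad P}{Q} \quad (\text{modus ponens})
\]
% The praedicate evidentia form asserts only the consequent, for example
% Q = "there is no evidence for X", while the needed antecedent
% P = "a sufficient and competent search for such evidence was actually conducted"
% is merely implied, never demonstrated; the inference to Q is therefore unsupported:
\[
\frac{(\text{$P$ implied, never established})}{Q} \quad (\textit{praedicate evidentia modus ponens})
\]
```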

Prager’s Axiom – those who won’t fight the great evils will fight lesser or make-believe evils.

Predictive Fallacy – the fallacy of applying predictive studies which show the lack of evidence of a particular set of data, in an unconstrained domain of evidence, and presuming this to be scientifically indicative of the Evidence of Absence.

Prejucation – the state of possessed knowledge and training or process of producing an individual who is programmed to obfuscate allowance of access to science on behalf of specific targeted topics. A teaching bound in misinformation, religious principles conflated with science, or intimidation and fear to such an extent that a mental barrier is established regarding specific subjects in the mind of its programmed victims. The social training programs and teaching promoted by Social Skepticism imbedded inside any curriculum into which they have material input.

Principle of Relegation – any sufficiently analyzed magic is indistinguishable from science (see Relegation Error).

Probabilistic Fallacy – the presumption that one holds enough data to determine what is probable and improbable inside a given field or set of data.

The Problem of Apophenia – apophenia is the contraposition of consilience. Apophenia is the perception of or belief in connectedness among unrelated phenomena. The problem of a claim to apophenia resides in this: consilience is required to prove apophenia’s ‘unrelated’ input claim. Thereby potentially rendering it a circular appeal. Consilience derives from a state of active neutrality (skepticism). Apophenia therefore, when employed as a fallacy accusation, can only derive from an anchoring bias.

Problem of Induction – a variety of forms of argument which either suffer from Popper’s problem of induction, demarcation or in some way imply or claim scientific completion or consensus, when such a standard has either not been attained in fact, or only exhibited inductive consilience as opposed to scientific deduction.

Procedural Truth – a form of argument validation which is weaker than a semantic truth (which in turn is weaker than a logical truth) wherein the ‘truth’ therein is derived merely as a trivial outcome of the constraints, method, or procedure employed and not because of any actual valid scientific or logical rule. A magician’s sleight-of-hand stage trick, or a huckster’s spin on mathematics in a corrupt purchase, both fall short of any form of truth (semantic or logical) and only tender the appearance of being correct by means of the specific approach used or a convoluted set of constraints/circumstances.

Procrustean Bed – an arbitrary standard to which exact conformity is forced, in order to artificially drive a conclusion or adherence to a specific solution. The undesirable practice of tailoring data to fit its container or some other preconceived argument being promoted by a carrier of an agenda.

Professional Victim – a person who seeks to leverage to their financial, intellectual or social advantage a perception of being a victim of some action on another stakeholder’s part. The purpose is to simultaneously injure the targeted stakeholder and at the same time enrich the purported victim. Such a method becomes habitual and increases in shrillness over time if left unchecked. A professional victim is in reality the abuser.

Promotification – deception or incompetence wherein only predictive testing methodology was undertaken in a hypothesis reduction hierarchy when more effective falsification pathways or current evidence were readily available, but were ignored.

Promotification Pseudo Science – especially in discovery science methodology, the pseudoscientific practice of developing only – or forcing the sponsor of an idea/set of observations, as a first priority, to fully develop only – evidence in support of, or a series of predictive-only tests which merely serve to confirm, conventional or conforming explanations of the data in question. The act of advertising this methodology as being representative of the ‘scientific method.’

Proof by Celebrity – submission of others to an argument so over-addressed by biased celebrities, and so disdained and fraught with media ridicule, that it cannot reasonably be dealt with at any relevant depth or via salient data or argument.

Proof by Verbosity – submission of others to an argument too complex, meandering and verbose to reasonably deal with in all its intimate details.

Proof Gaming – employing dilettante concepts of ‘proof’ as a football in order to win arguments, disfavor disliked groups or thought, or exercise fake versions of science. Asking for proof before the process of science can ostensibly even start, knowing that plurality is what begins the scientific method not proof, and further exploiting the reality that science very seldom arrives at a destination called ‘proof’ anyway. Proof gaming presents itself in seven speciations:

Catch 22 (non rectum agitur fallacy) – the pseudoscience of forcing the proponent of a construct or observation, to immediately and definitively skip to the end of the scientific method and single-handedly prove their contention, circumventing all other steps of the scientific method and any aid of science therein; this monumental achievement prerequisite before the contention would ostensibly be allowed to be considered by science in the first place. Backwards scientific method and skipping of the plurality and critical work content steps of science. A trick of fake skeptic pseudoscience, which they play on non-science stakeholders and observers they wish to squelch.

Fictitious Burden of Proof – declaring a ‘burden of proof’ to exist when such an assertion is not salient under science method at all. A burden of proof cannot possibly exist if neither the null hypothesis, nor alternative theories, nor any proposed construct possesses a Popper sufficient testable/observable/discernible/measurable mechanism; nor moreover, if the subject in the matter of ‘proof’ bears no Wittgenstein sufficient definition in the first place (such as the terms ‘god’ or ‘nothingness’).

Herculean Burden of Proof – placing a ‘burden of proof’ upon an opponent which is either arguing from ignorance (asking to prove absence), not relevant to science or not inside the relevant range of achievable scientific endeavor in the first place. Assigning a burden of proof which cannot possibly be provided/resolved by a human being inside our current state of technology or sophistication of thought/knowledge (such as ‘prove abiogenesis’ or ‘prove that only the material exists’). Asking someone to prove an absence proposition (such as ‘prove elves do not exist’).

fictus scientia – assigning to disfavored ideas, a burden of proof which is far in excess of the standard regarded for acceptance or even due consideration inside science methods. Similarly, any form of denial of access to acceptance processes normally employed inside science (usually peer review both at theory formulation and at completion). Request for proof as the implied standard of science – while failing to realize or deceiving opponents into failing to realize that 90% of science is not settled by means of ‘proof’ to begin with.

Observation vs Claim Blurring – the false practice of calling an observation or data set a ‘claim’ on the observers’ part. This is done in an effort to subjugate such observations into the category of constituting scientific claims which therefore must now be ‘proved’ or dismissed (the real goal: see Transactional Occam’s Razor Fallacy). In fact an observation is simply that – a piece of evidence or a cataloged fact. Its false dismissal under the pretense of being deemed a ‘claim’ is a practice of deception and pseudoscience.

As Science as Law Fallacy – conducting science as if it were being reduced inside a court of law or by a judge (usually the one forcing the fake science to begin with), through either declaring a precautionary principle theory to be innocent until proved guilty, or forcing standards of evidence inside a court of law onto hypothesis reduction methodology, when the two processes are conducted differently.

Proof Pollyanna – when one has a tendency to cite a need for smoking gun proof as their standard of research and science, while not realizing that most of science hinges on Peer Acceptance and rarely on a single or sample case “Proof.”

Propaganda – the skilled exploitation of acerbic or surreptitious misinformation, anonymous malinformation, along with smoothed (both simple and authoritative) disinformation, passed selectively from fiat authority to those targeted and under its influence – which is used to harm opposition voices, and to make allied voices appear more credible. Propaganda exploits the human proclivity towards fear-uncertainty-doubt (FUD), identifying the bad guy in advance (judging intent), and finally the desire for easy and simple answers.

Prosecutor’s Fallacy – a low probability of valid detections does not mean a low probability of some valid detection or data being found.
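
A minimal numeric sketch of the distinction, using hypothetical figures for the per-detection validity rate and the size of the detection pool.

```python
# A minimal numeric sketch of the distinction, with hypothetical figures: even if each
# individual detection has only a small chance of being valid, the chance that at least
# one valid detection exists in a large pool is not small at all.
p_single_valid = 0.02      # probability any one detection is valid (hypothetical)
n_detections = 200         # number of independent detections in the data set

p_none_valid = (1 - p_single_valid) ** n_detections
p_some_valid = 1 - p_none_valid

print(f"Probability a given detection is valid:      {p_single_valid:.0%}")
print(f"Probability at least one detection is valid: {p_some_valid:.0%}")
# About 2% per detection, yet roughly a 98% chance that some valid detection exists -
# dismissing the whole pool because each member is individually improbable is the error.
```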

Proteus Phenomenon – the phenomenon of rapidly alternating or antithetical extreme research claims and extremely opposite refutations early in the risk horizon involved in a hot topic of science. Before peer pressure to accede to one answer has been established. Research during a period of scientific maturation where dissent is not only acceptable – but is advantageous for establishing career notoriety. This is the opposite of jackboot ignorance, a period in which opposite refutation is forbidden, regardless of the risk horizon involved in the topic.

Proving Too Much – using a form of argument to counter observations or ideas, that if it were valid, could imply extrapolated absurd conclusions which cannot be valid, therefore the base argument is invalid, regardless of the data.

Pseudo Deduction – a type of appeal to authority in which a journalist or media outlet will cite the circumstances around a mystery or quandary of merit and contend that they do not know which answer is correct, but they do know which answer is not correct. A hack piece which appears to be an objective assessment, however is only constructed so as to target and discredit one specific idea, usually buried as lede inside an otherwise puff-piece article pretending to develop depth on the topic’s other aspects. This affords the journalist tacit permission to conduct deduction without any evidence whatsoever (since at a superficial level they are not ‘tendering a conclusion’).

Pseudo-Hypothesis – A pseudo-hypothesis explains everything, anything and nothing, all at the same time. A pseudo-hypothesis fails in its duty to reduce, address or inform. A pseudo-hypothesis states a conclusion and hides its critical path risk (magical assumption) inside its set of prior art and predicate structure. A hypothesis, on the other hand, reduces its sets of prior art, evidence and conjecture and makes them manifest. It then addresses critical path issues and tests its risk (magical assumption) as part of its very conjecture accountability. A hypothesis reduces, exposes and puts its magical assertion on trial. A pseudo-hypothesis hides its magical assumptions woven into its epistemology and places nothing at risk thereafter. A hypothesis is not a pseudo-hypothesis as long as it is ferreting out its magical assumptions and placing them into the crucible of accountability. Once this process stops, the hypothesis has become an Omega Hypothesis. Understanding this difference is key to scientific literacy. Grant me one hidden miracle and I can explain everything.

Pseudo-Inference – a form of very weak reverse modus absens linear induction in which a person uses an absence of Y, to infer an absence of X, based upon the post hoc ergo propter hoc solus fallacy of ‘only Y must result from X’. Any form of the rhetorical argument wherein one contends that ‘if your supposition is true, then we would have seen this, and we did not see this, therefore your supposition is false.’

Pseudo-Reduction (Debunking) – the non-critical path disassembly of a minor subset of logical objects as a pretense of examination of the whole. A process which pretends that a robust observation is already understood fully, and which consequently ventures into the reducible material only to a level sufficient to find ‘facts’ which appear to corroborate one of six a priori disposition buckets applied to any case of examination: Misidentification, Hoax/Being Hoaxed, Delusion, Lie, Accident, Anecdote. This process exclusively avoids any more depth than this level of attainment, and most often involves a final claim of panductive inference (falsification of an entire domain of ideas), along with a concealed preexisting bias.

Pseudoscience Disposition Malpractice – designation of a research effort as constituting pseudoscience by means of restricting access to, or by conflating or misrepresenting the diligent steps of science.

Psychologism – when one employs solely a psychological explanation in a central role of grounding or explaining some fact which one is attempting to establish. This suffers from the weakness that psychological principles enjoy a perch which can never be falsified; therefore they are at risk of standing as pseudoscience.

P-value Amaurosis – the ironic state of a study or argument wherein there exists more confidence in the accuracy of the percentage significance in measure (p-value), than of the accuracy of its measured contention in the first place.
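
A minimal standard-library sketch of this irony; the instrument bias, sample size and noise level are assumptions chosen purely for illustration:

```python
import random
import statistics
from statistics import NormalDist

random.seed(1)

TRUE_EFFECT = 0.0          # the phenomenon under study is actually null here
INSTRUMENT_BIAS = 0.05     # assumed small, uncorrected systematic bias
NOISE_SD = 1.0
N = 10_000

readings = [TRUE_EFFECT + INSTRUMENT_BIAS + random.gauss(0, NOISE_SD) for _ in range(N)]

mean = statistics.fmean(readings)
se = statistics.stdev(readings) / N ** 0.5
z = mean / se                                  # z statistic against a null of zero
p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value

print(f"mean = {mean:.4f}, z = {z:.2f}, p = {p_value:.3g}")
# The p-value is tiny and computed 'precisely' -- yet it certifies the
# instrument's bias, not the contention the study set out to measure.
```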

quod fieri – (Latin: (lit.) ‘(the fact) that (now a specific thing is) to be done’) – a form of intervention bias action, in which the action is not taken from sound evidence or a history of effectiveness, but rather simply because something must be done. This type of decision or action is usually executed in a panic situation, in the face of a slow moving disaster, or a theater of cataclysmic mirage. Ironically, its feckless or inane basis is compensated for by religious, political, or social fanaticism as to its claimed (but usually false) effectiveness. Those who raise questions regarding the action are typically cast as deplorable and anti-virtue.

Rappaport Claim or Theory – a perspective useful in detecting official deception, based upon a quip by anthropologist Roy A. Rappaport, which states “If a proposition is going to be taken to be unquestionably true, it is important that no one understand it.” A commentary on the effectiveness of obfuscation at the intersection of pseudo-theory and wicker man defenses. Such a proposition must feature the traits of pseudo-theory, in that it explains everything under a Lindy effect (and therefore likely explains nothing in reality). Moreover, every critique of such an idea’s features must be deemed as a straw man, thus its elemental claims cannot be pinned down nor tested, because every (genuine) skeptic of its tenets is inevitably ‘wrong’ in some regard (a wicker man defense).

Rat’s Option – when the appearance of a choice is offered, yet the only option actually offered is a preordained path which involves a trap.

Reactance – the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to force you out of a constrained adopted choice.

Reactionary Bunk – to be so threatened or angered by a pseudoscientific idea that you allow it to influence your skepticism and begin to spin pseudoscience as an argument against the idea.

Reactive Dissonance (in business and legal domains, also called ‘malfeasance’) – the fallacious habitual mindset of a researcher, or doctrine of a group conducting research, wherein when faced with a challenging observation, study or experiment outcome, they immediately set aside rational data collection, hypothesis reduction and scientific method protocols in favor of crafting a means to dismiss the observation.

Red Shirt Syndrome – a belief that injury/calamity/ruin only happens to other people (those in red shirts on Star Trek – The Original Series) or those who deserve it for some subconsciously held reason. Ignorance of the principle that a pervasive systemic injury happens to almost everyone exposed to it, to varying degrees, and not merely to the unfortunate few. It is just the few who are indeed detected or measured.

Regressive Bias – a certain state of mind wherein perceived high likelihoods are overestimated while perceived low likelihoods are underestimated.

Relegation Error – a principle wherein an authority of control or truth establishes a circumstance wherein any advance in validity which is produced by outside entities is immediately appropriated to become part of the truth base of the controlling authority alone. By default, the controlling authority then must be held as the truth standard. All other entities remain in a perpetual state of practice ‘wrong’ – regardless of their actual track record. For example, successes of integrative medicine being immediately co-opted into academic science and accordingly stripped of their non-academic pedigree. Those pseudosciences thereafter continue to be disdained and ignored as quackery, hokum and non-science by academia. By fait accompli under this method, outsider research, economic theories or controversial avenues of research will always constitute anecdotal pseudoscience, by practice of idea theft alone.

Religion – the compulsory adherence to an idea around which testing for falsification is prohibited.

Rent-Seeking – a version of appeal to authority, coined by Nassim Taleb, in which a person derives income simply because they are touted as an authority, or hold a position inside an organization with such authority, or hold a bureaucratic or power influence over the administration of assets/money – drawing unjustly thereof. Under such a model of value chain theory, even the rental of an asset involves some risk – however the principle hinges more around the idea that, for every dollar of compensation an equal and opposite flow of value should be provided. In similar context this is the origin of the statement of ethical skepticism ‘Risk is the leaven of the bread of hard work. Beware of those whose trade is in neither.’

Researcher’s Catch 22 – the instance brought about through over-representation of fake skeptics or science communicators in the media. A condition wherein, on one hand if one is too liberal or permissive in their semantics regarding a paranormal topic, they will be crucified for promoting ‘woo’ by our fake ‘science communicators’ who don’t know their ascience from a hole in the ground. On the other hand if they choose the wrong words – reminiscent of catch-phrases familiar to skeptics, Atheists, monists and nihilists – thereafter such agency will jump on the chance to crucify the topic as therefore a ‘pseudoscience’ as well (by your own admission as a researcher). Either the researcher’s individual reputation, or their field of career study – is placed at risk. This presents a possible no-win scenario. This is also a common challenge which faces a focused-research corporation head with regard to competitors and stockholders.

The Riddle of Nelsonian Knowledge – it behooves the holder of Nelsonian knowledge to know more about this embargoed knowledge than would be reasonably expected inside standard ignorance. The irony with Nelsonian knowledge is that it demands of its ‘ignorant party’ a detailed awareness of schema, its depth and a flawless monitoring, which is unparalleled in official knowledge. If our desire to avoid so-called ‘baseless pseudoscience’ is as casual as we imply; casual to such an extent so as to justify our complete disinterest in it as a species, then why is our knowledge of specifically what is forbidden-to-study, so damned accurate and thorough? If it is all worthless fodder, then why are its opponents so well organized, trained and armed?

Road to Damascus Fraud – a person of weak integrity, who is seduced by one philosophy and extols it through practices of vehemency, virtue or deception, who then switches to the exact opposite philosophy and thereafter uses similar tactics and/or intensity of fanaticism – is not to be trusted. They have not ‘seen the light’. Their conversion is not evidence of validity of their latter philosophy, nor evidence of the invalidity of the former.

Rookem’s Razor – the method wherein the most expensive, fee generating or most oligarch-profitable explanation tends to be the correct one at any given time. For example, medicine in which a symptom is investigated first as if it were cancer, and through the most expensive testing, before considering vastly more likely alternative diagnoses.

Rupert’s Axiom of Reason – you cannot reason a person out of a stance which they obtained through thinking they held the sole monopoly on reason to begin with.

Sampled Population Mismatch – a study design methodology wherein a small population is sampled for a signal which is desired for detection, and a large population cohort is sampled regarding a signal which is not desired for detection. For example, using n=44 (unvaccinated) and 1,680 (vaccinated) in order to show that those vaccinated exhibit a lower rate of atopic allergies. The variance required to upshift the desired signal in the smaller group is on the order of a mere 2 or 3 persons. The odds of this occurring are very high, especially if the small group all originates from the same or similar lifestyle.
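
A minimal simulation sketch of this entry’s n=44 versus n=1,680 example; the 20% underlying atopy rate and the 5-point gap threshold are assumptions chosen only to illustrate the mechanism:

```python
# Both arms share the same assumed true rate; only the arm sizes differ.
import random
random.seed(7)

TRUE_RATE = 0.20            # assumed identical underlying atopy rate in both arms
N_SMALL, N_LARGE = 44, 1_680
TRIALS = 5_000

def observed_rate(n: int) -> float:
    return sum(random.random() < TRUE_RATE for _ in range(n)) / n

# Each additional case in the small arm moves its rate by 1/44 ~ 2.3 points,
# so the 2-3 persons mentioned above already amount to a 5-7 point swing.
print(f"swing per person in the small arm: {100 / N_SMALL:.1f} percentage points")

gap_hits = sum(observed_rate(N_SMALL) - observed_rate(N_LARGE) >= 0.05
               for _ in range(TRIALS))
print(f"P(small arm appears >=5 points worse by chance alone): {gap_hits / TRIALS:.2f}")
```

Under these assumptions, roughly one run in five produces the ‘desired’ gap even though no real difference exists between the two populations.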

Satisficing – a term which describes bias from the perspective of professional intelligence analysis, wherein one chooses the first hypothesis that appears good enough rather than carefully identifying all possible hypotheses and determining which is most consistent with the evidence.

SAW House/SAW Trap – a condition wherein one is left with no option but to harm another innocent person, in order to protect themself or their loved ones from demise or extensive harm – and is then blamed, punished or extorted by an outside entity for that harming of another. Named for the SAW series of movies.

Schadenfreude/Epicaricacy – schadenfreude is the enjoyment of witnessing the misfortune of others through their own mistake, accident or self-inflicted agony. In contrast, epicaricacy is the enjoyment of witnessing the harm one individual receives at the hands of another, usually maliciously-minded party. The similar English expression would be ‘Roman holiday’, a metaphor from the poem Childe Harold’s Pilgrimage by George Gordon (Lord Byron) wherein a gladiator in ancient Rome expects to be “butchered to make a Roman holiday,” i.e. the audience would take pleasure from watching his forced suffering at another’s hands. The term suggests motives of pleasure or political expediency beyond simple schadenfreude; consisting more of debauchery and exploitation for gain in addition to sadistic enjoyment. One exception to both meanings, and a common mistake in their application however, is citing schadenfreude or epicaricacy in the case where one is witnessing a temper tantrum. Temper tantrums are intended forms of violence upon others, and in no way reflect a person being in a state of misfortune or harm.

sciebam – (Latin: I knew) – an alternative form of knowledge development, which mandates that science begins with the orphan/non-informed step of ‘ask a question’ or ‘state a hypothesis’. A non-scientific process which bypasses the first steps of the scientific method: observation, intelligence development and formulation of necessity. This form of pseudoscience/non-science presents three vulnerabilities:

  1. First, it presumes that the researcher possesses substantially all the knowledge or framework they need, lacking only to fill in final minor gaps in understanding. This creates an illusion of knowledge effect on the part of the extended domain of researchers, as each bit of provisional knowledge is then codified as certain knowledge based upon prior confidence. Science can only progress thereafter through a series of shattering paradigm shifts.
  2. Second, it renders science vulnerable to the possibility that, if the hypothesis, framework or context itself is unacceptable at the very start, then its researcher therefore is necessarily conducting pseudoscience. This no matter the results, nor how skillfully and expertly they may apply the methods of science. And since the hypothesis is now a pseudoscience, no observation, intelligence development or formulation of necessity are therefore warranted. The subject is now closed/embargoed by means of circular appeal to authority.
  3. Finally, the question asked at the beginning of a process of inquiry can often prejudice the direction and efficacy of that inquiry. A premature or poorly developed question, and especially one asked under the influence of agency (not simply bias) – and in absence of sufficient observation and intelligence – can most often result quickly in a premature or poorly induced answer.

Scienter – refers to the state of mind of an individual, typically regarding their awareness of the wrongful or unlawful nature of their actions.

Actual Scienter – knowingly & intentionally engaging in an action with full awareness of its wrongfulness.

Constructive Scienter – by function or expertise, should have known the wrongful nature of their actions, even under Nelsonian knowledge or intent.

Imputed Actual Scienter – when the party who established a condition of ignorance, thereafter appeals to that ignorance as justification for a wrongful action founded upon that appeal.

Self Confirming Process – a process which is constructed to only find the answer which was presumed before its formulation. A lexicon, set of assumptions or data, procedure or process of logical calculus which can only serve to confirm a presupposed answer it was designed to find in the first place. A process which bears no quality control or review, and which does not contain a method through which it can reasonably determine its own conclusion to be in question or error.

Self-Fulfilling Inductive Prediction – a prediction which is confirmed through induction by means of a separate rationale which appears to place its hypothesis at risk, but whose predictive measure has in fact already been proved true by previous deductive inference. A pseudo-hypothesis – such as showing that people who are told (by means of genetic testing) that they are predisposed to gain weight actually tend to gain more weight, and attributing this effect to the psychology of ‘having been told they were predisposed’, as opposed to the simple fact that they possess the genetics which predispose them in the first place. A common study trick in psychology.
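
A minimal synthetic sketch of the confound described here; the effect sizes and the assumption that exactly the genetically predisposed participants are ‘told’ are illustrative, not drawn from any actual study:

```python
# The 'prediction' is confirmed only because the group that was told they are
# predisposed is, by construction, the group that carries the predisposition.
import random
import statistics
random.seed(3)

N = 2_000
people = []
for _ in range(N):
    predisposed = random.random() < 0.5          # genetic predisposition
    told = predisposed                           # the study tells exactly those people
    # weight gain driven by genetics only; 'being told' has zero causal effect here
    gain = (2.0 if predisposed else 0.0) + random.gauss(0, 1)
    people.append((told, gain))

told_gain = statistics.fmean(g for t, g in people if t)
untold_gain = statistics.fmean(g for t, g in people if not t)
print(f"mean gain (told): {told_gain:.2f}   mean gain (not told): {untold_gain:.2f}")
# The gap is real, but it was already guaranteed by the genetics -- attributing
# it to the psychology of 'having been told' is the self-fulfilling step.
```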

Self-Sealing Argument – an argument which includes premises or constraints which alone or in concert force the argument to validity in all cases of its application, regardless of any evidence standing in support of or against the argument itself. For instance, a miracle is defined as the least probable event among a set of possibilities (premise). Historians, by nature of their work, document only the most probable rendition of a set of events (constraint), given a fixed set of recorded information. Therefore, by this premise-constraint tandem logic, history can never document a miracle – therefore a miracle cannot exist, and will never exist. The argument has self-sealed.

Self-Sublation (aufheben) – Hegelian principle of a dialectic which is stuck in stasis through an idea both canceling and sustaining itself at the same time. A doubled meaning: it means both to cancel (or negate) and to preserve at the same time.

Seth’s Razor – all things being equal, any explanation aside from the simplest one, constitutes a conspiracy theory. The principal technique of methodical cynicism, enforcing stacks of mandatory or pseudo-probable misinformation.

Shermer Error – mistaking the role of follow-on empirical observation in the confirmation of data which is simply being described. Regarding statistical analysis as needing empirical confirmation, when one is describing empirical observation distributions in the first place.

Shevel’s Inconsistency – an inconsistency wherein one simultaneously contends that science has shown a research subject to be invalid, yet at the same time chooses to designate any research into that subject as constituting pseudoscience. The two positions are mutually exclusive. The two positions are also not compatible when the pseudoscience in question has not been studied by science in the first place. In such a circumstance, investigators risk being accused of being a ‘believer’, unless the researcher makes visible and extreme overtures to debunking or extreme doubt (methodical doubt, not Cartesian Doubt) on the matter, as part of their work.

Shield Effect – when an arguer in a valid matter of discourse drops any need to reference diligent research, method, or data collection in support of their contended position because a higher visibility arguer, or member of their club, has enjoined the discussion on their side.

Shirky Principle – a risk value chain perspective which warns that complex solutions (such as a charity, government program, extremist advocacy, or institution) can become so wound up and convoluted inside the target problem they are the solution to, that often they inadvertently serve to perpetuate the problem.

Skeptical Integrity Tell – a skeptic who has examined themselves first should never seek out dispute, fail to seek some essence of understanding, straw man, use canned explanations and party agendas, find entertainment in argument nor mock objective dissent in order to provide an ideological advantage for favored views. Instead, the seasoned skeptic should actually go into the field and dispassionately observe, be an autodidact despite their education background, and bear new thoughts along with a compassion for those harmed, foremost. These are the sign posts on the road less traveled by; the telltale sign of whether or not one is a true skeptic.

Skeptical Inversion – when applying skepticism to useless subjects which bear little or no impact on humanity, while in contrast simultaneously and completely ignoring skepticism in matters of high risk of impact on humanity which are underpinned by conflict of interest, incomplete or sketchy science.

Skeptic’s Dilemma – shortcuts to denial are more oppressive even than are shortcuts to the affirmative, because the former is then regarded as truth. The skeptic who does not see this, is not a skeptic.

Skereto Curve/Rule – a condition wherein 99% of the skeptics are focused on and obsessing over 1% of the problem.

Slippery Slope – asserting that a relatively small first step in accepting data or ideas inevitably leads to a chain of related events culminating in some significant impact/event that should not happen, thus the first step should not happen. While this fallacy is a popular one, it is, in its essence, an appeal to probability fallacy.

Social Conformance Bias – any influence which implies that if you do not agree, then you will be in some way rejected or ostracized by your former peer group. Employment of peer/media/social pressure instead of rational case and argument to establish consensus.

Social Epistemology – when we conceive of epistemology as including knowledge and justified belief as they are positioned within a particular social and historical context, epistemology becomes social epistemology. Since many stakeholders view scientific facts as social constructions, they would deny that the goal of our intellectual and scientific activities is to find facts. Such constructivism, if weak, asserts the epistemological claim that scientific theories are laden with social, cultural, and historical presuppositions and biases; if strong, it asserts the metaphysical claim that truth and reality are themselves socially constructed. Moreover, in recognizing this, when social justice or the counter to a perceived privilege are warranted, short cuts to science in the form of hyper and hypo epistemologies are enacted through bypassing the normal frustrating process of peer review, and substituting instead political-social campaigns – waged to act in lieu of science. These campaigns of ‘settled science’ are prosecuted in an effort to target a disliked culture, non-violent belief set, ethnicity or class – for harm and removal of human rights.

Social Skepticism

1. a form of social activism which seeks abuse of science through a masquerade of its underlying philosophical vulnerability, skepticism. An imperious set of political, social, and religious beliefs which proliferate through teaching weaponized fake skepticism to useful idiots. Agency which actively seeks to foment conflict between science and the lay public, which then exploits such conflict to bolster its celebrity and influence.

2. a form of weaponized philosophy which masquerades as science, science enthusiasm or science communication. Social skepticism enforces specific conclusions and obfuscates competing ideas via a methodical and heavy-handed science embargo. It promotes charades of critical thought, self aggrandizement, and is often chartered to defend corporate/Marxist agendas – all while maintaining a high priority of falsely impugning eschewed individuals and topics. Its philosophies and conclusions are imposed through intimidation on the part of its cabal and cast of dark actors, and are enacted in lieu of and through bypassing actual scientific method. One of the gravest weaknesses of human civilization is its crippling and unaccountable bent toward social coercion. This form of oppression disparages courage and curiosity inside the very arenas where they are most sorely needed.

Failures with respect to science are the result of flawed or manipulated philosophy of science. When social control, close-hold embargo or conformance agents subject science to a state of being their lap-dog, serving specific agendas, such agents err in regard to the philosophical basis of science, skepticism. They are not bad scientists, rather bad philosophers, seeking a socialized goal. They are social skeptics.

solum fieri – the fallacy of implied only-occurrence. A common unspoken assumption premise of a religious or virtue argument (usually in the form of a suggestion or accusation), wherein the instance being considered or the person being targeted is unsoundly treated in isolation – as if it were the only occurrence of such an event, or by considering the person to be acting in isolation. For example, introducing the idea that if a person gives 9 times but not the 10th time, then they are selfish or protectionist. Or, that a person should adopt fear because they are all alone, and are the first person ever to have encountered a specific troubling situation. A trick of isolating an intended victim, similar to but even more egregious than using anecdote as a final proof.

Sperging – is a slang term which describes certain behavior online which is stereotypically associated with people who have Asperger’s syndrome. The most common meaning refers to excessive discussion or analysis of what would normally be regarded as a very specific, irrelevant or pointless topic.

Sponsor Practice Hyperbole – the fallacy of regarding the process of observation by a sponsor of an idea to constitute a presentation of ‘science’ on the sponsor’s part. This in an effort to subjugate such activity falsely into the realm of pseudoscience for the simple act of being curious in nature about a disfavored subject. In fact research is simply that, a set of observations, and its false dismissal under the pretense of being deemed a ‘pseudoscience’ is a practice of deception and is itself pseudoscience.

Stickiness – the principle wherein proponents of a theory or philosophy often cling to it despite the mounting evidence against it. The theory is abandoned only after its last proponents die. Such obstinacy allows theories to be given a proper run for their money, rather than being prematurely abandoned in the face of scant or specious contrary data that could be overcome with further research and accrued verity. Otherwise, we risk prematurely abandoning theories which could add value in one aspect or are indeed valid themselves.

Stowaway Pseudofact or Stowaway – a specific form of unsubstantiated claim or pseudofact, which is granted undue gravitas through its stand-alone repetition inside a scientific study or an otherwise scientific context. The author, desiring the statement to be accepted as true without merit, will embed it within a body of work outlining a series of well-substantiated statements, hoping that the reader will not discern the difference and will accept the stowaway as part of the overall inference of the associated work.

Streetlight Effect – is a type of observational bias that occurs when people only search for something where it is easiest to look.

Streisand Effect – the phenomenon whereby an attempt to hide, remove, or censor a piece of information has the unintended consequence of publicizing the information more widely.

studium a studia – when a study is falsely touted as new breaking science, when all the study has accomplished is to study a group of older studies and bring out the same conclusion as was suggested by the previous studies it studied. This study is then touted in the future by proponents of the same idea as a scientific empirical basis for argument.

Subception – a perceptual defense that involves unconsciously applying strategies to prevent a troubling stimulus from entering consciousness. The method of deceiving one’s self and others in the process of cynicism, jealousy or denial. A process of expressing unrealized subconscious vitriol, in which one habitually creates artificial ‘violations’ (usually forms of administrative or social protocol which their target ‘does wrong’) which their target of jealousy or hate keeps committing – in order for the subception holder to internally justify their ill feelings toward their target.

Superior Grasp of the Obvious – the bias on the part of one concealing a rational ego which has been inflated to enormous levels. Levels of ego betrayed by implication and oxymoron that one’s self is the only person who could possibly grasp that which is readily obvious inside complex questions. In fact, a prowess of such potential that the mastery of the obvious can be done from a university cubicle or parents’ basement, and in 4 minutes of research.

Tail to Body Transference – a statistical fallacy of relevant range in which one statistic is assumed to be fully descriptive across all conditions of its arrival range. An error wherein a descriptive suitable to frame a tail condition is also applied to the main body of a distribution and assumed to be descriptive therein as well. IQ measures which allow us to discriminate special needs persons are not also useful in determining who should lead nations or corporations at the other end. Cholesterol figures which allow us to highlight who is at risk of arterial plaque do not apply to persons who have a familial history of higher cholesterol stats, etc. Mistakenly applying tail condition gradients to also be salient in the main body of the same descriptive data.
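
A minimal synthetic sketch of a gradient that is real in the tail yet absent in the main body; the score distribution and the assumed tail-only relationship are illustrative, and the example uses statistics.correlation, which requires Python 3.10 or later:

```python
import random
import statistics
random.seed(11)

def outcome(score: float) -> float:
    # assumed: the outcome worsens sharply below a score of 70, and is flat above it
    deficit = max(0.0, 70.0 - score)
    return -0.5 * deficit + random.gauss(0, 1)

scores = [random.gauss(100, 15) for _ in range(20_000)]
pairs = [(s, outcome(s)) for s in scores]

def corr(sample):
    xs, ys = zip(*sample)
    return statistics.correlation(xs, ys)

tail = [p for p in pairs if p[0] < 70]            # the tail the measure was built for
body = [p for p in pairs if 85 <= p[0] <= 115]    # the main body of the distribution
print(f"correlation in tail: {corr(tail):.2f}   correlation in body: {corr(body):.2f}")
# The tail gradient is strong; carried over to the body it describes nothing.
```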

Taleb’s But – the principle which proceeds along the line of the Nassim Nicholas Taleb quote “Everything before the “but” is meant to be ignored by the speaker; and everything after the “but” should be ignored by the listener.”

Taleb’s Contraposition – For real people, if something works in theory, but not in practice, it doesn’t work. For social skeptics and many academics, if something works in practice, but not in theory, it doesn’t exist.

Taleb’s Law of Abundance – Abundance is harder for us to handle than scarcity. ~ Nassim Nicholas Taleb

Taleb’s Law of Data – The more data you get, the less you know what’s going on. ~ Nassim Nicholas Taleb

Taleb’s Law of Intelligence – In a complex world, intelligence consists in ignoring things that are irrelevant ~ Nassim Nicholas Taleb

Taleb’s Law of Tolerance – a toleration of intolerance will always escalate to extremism and proscription as the standard. The most intolerant, wins.

Tangenda – in critical path theory inside the scientific method, a competing critical path of questions which are misleading, poorly crafted or bias-crafted, and which serve to reduce or stagnate the probative nature of science, and steer the results of the scientific method into a specific conclusive domain. A critical path is a series of questions asked under the scientific method which are probative, illuminating, incremental, contextual, logical in sequence, parsimonious, risk averse, low in feature stacking, maximum in impact, and sequitur in terms of optimal knowledge development. While this is a tall order, it can be attained. Tangendas perform as a kind of pseudo-critical path which leads inexorably to a favored or desired conclusion or conformity – corrupted by the fashion in which scientific questions are crafted or the biased way in which they are sequenced.

The Tar Baby (Principle) or The Rule of Unconventional Domains – as a domain of endeavor or study becomes increasingly nuanced and complex, the more difficult it becomes to define, comprehend, and control. Efforts to obfuscate and debunk this type of domain merely serve to increase its footprint and salience. Those who would debunk and obfuscate, should also heed the lessons of the history therein.

Tau Point Man (Person) – one who makes their cleverness or contempt manifest. Based upon the tenet of ethical skepticism which cites that a shrewdly apportioned omission at Point Indigo, an inflection point early in a system, event or process, is a much more effective and hard-to-detect cheat than that of more manifest commission at Point Tau, the tipping point near the end of a system, event or process. Based upon the notion ‘Watch for the gentlemanly Dr. Jekyll at Point Tau, who is also the cunning Mr. Hyde at Point Indigo’. It outlines a principle wherein those who cheat (or apply their skill in a more neutral sense) most effectively, such as in the creation of a cartel, cabal or mafia – tend to do so early in the game and while attentions are placed elsewhere. In contrast, a Tau Point man tends to make their cheat/skill more manifest, near the end of the game or at its Tau Point (tipping point).

Tech Bamboozle – when engaging the services of an experienced professional for basic work, one’s circumstance should not be treated as perplexing, exceptional, or odd. Such treatment is a warning sign of either the service technician’s lack of experience, or of one’s being overcharged for those basic services.

Tendentious – showing agency towards a particular point of view, especially around which there is serious disagreement in the at-large population. The root is the word ‘tendency’, which means ‘an inclination toward acting a certain way.’ A tendentious person holds their position from a compulsion which they cannot overcome through objective evaluation. One cannot be reasoned out of a position which they did not reason themselves into to begin with.

TES’s Razor – among competing alternatives, all other things being equal, prefer the one for which discussion or research is embargoed.

The (Wonka) Golden Ticket – Have we ever really tested the predictive strength of this idea standalone, or evaluated its antithetical ideas for falsification?

Einfach Mechanism – an invalid null or best explanation hypothesis. An explanation, theory or idea which resolves a contention under the scientific method solely by means of the strength of the idea itself. An idea which is not vetted by the rigor of falsification, predictive consilience nor mathematical derivation, rather is simply considered such a strong, or Occam’s Razor (sic) stemming-from-simplicity idea that the issue is closed as finished science from its proposition and acceptance onward. A pseudo-theory which is granted status as the default null hypothesis or as posing the ‘best explanation’, without having to pass the rigors with which its competing alternatives are burdened. Such alternatives may indeed be pseudo-theory, but so is the accepted hypothesis as well. An einfach mechanism may or may not be existentially true.

The Method of Scientific Propaganda – The common deeper hallmarks of scientific propaganda in this regard therefore proceed according to this method:

  1. To alter scientific paradigms or questions in a sleight-of-hand manner in order to establish a false basis for a completely separate but disguised contention.
  2. To conflate and promote consilience as consensus. Consilience is not a ‘unity of knowledge’ as Edward O. Wilson contends – as only diligent investigation of all compelling alternatives can serve to unify knowledge.
  3. To employ as null hypothesis, that which cannot be approached by Popper demarcation and falsification, and then further demonize all competing ideas.
  4. To employ explanitude based disciplines, bullying, celebrity, journalism and false forms of philosophy and skepticism, as a means to enforce an agenda, dressed up as science.
  5. To fail to conduct followup or safety confirmation studies, or sufficient parsimonious or precautionary study, in a circumstance where a risk has been adopted in the name of science.
  6. To imply or default that a null hypothesis is ‘true’ until proved otherwise, knowing that proof is a seldom attained standard in science.
  7. To investigate only one hypothesis, and deem the social pressure and pluralistic ignorance around this bad habit as consensus or even consilience.
  8. To proscribe investigation into any alternative or deviation from consilience and give a moniker (anti-science or pseudoscience) to those who do so.
  9. To tamper with or conflate, the three forms of consensus into a falsely (through vulnerability exploitation) derived claim to scientific consensus of an Omega Hypothesis.
  10. To teach simpleton (simplest answer) or black and white delineations of scientific arguments as settled science, through channels of journalism which cannot differentiate good science from bad.

TLDR – “I don’t have the attention span or even mental capacity to grasp Dr. Seuss, much less this. Please tender me a one-liner or idiom, because, if you want me to understand, you must pose your point in buckets of something with which I am already familiar.”

Torfuscation – pseudoscience or obfuscation enacted through a Nelsonian knowledge masquerade of scientific protocol and study design. Inappropriate, manipulated or shallow study design crafted so as to obscure or avoid a targeted/disliked inference. A process, contended to be science, wherein one develops a conclusion through cataloging study artifice or observation noise as valid data. Invalid observations which can be parlayed into becoming evidence of absence or evidence of existence as one desires – by accepting only the appropriate hit or miss grouping one desires as basis to support an a priori preference, and as well avoid any further needed ex ante proof.  A refined form of praedicate evidentia or utile abstentia employed through using less rigorous or probative methods of study than are requisite under otherwise ethical science.  Exploitation of study noise generated through first level ‘big data’ or agency-influenced ‘meta-synthesis’, as the ‘evidence’ that no further or deeper study is therefore warranted – and moreover that research of the subject entailed is now socially embargoed.

Transactional Occam’s Razor Fallacy (Appeal to Ignorance) – the false contention that a challenging construct, observation or paradigm must immediately be ‘explained.’ Sidestepping of the data aggregation, question development, intelligence and testing/replication steps of the scientific method and forcing a skip right to its artificially conclusive end (final peer review by ‘Occam’s Razor’).

Truckle Fallacy – when a skeptic makes visible and biased overtures to politically correct movements, whether sincere or not, in order to placate any critics which might arise from that advocacy group, or to enlist the aid of those movements in attacking persons the fake skeptic regards as foes.

Türsteher Mechanism – the effect or presence of ‘bouncer mentality’ inside journal peer review. An acceptance for peer review which bears the following self-confirming bias flaws in process:

  1. Selection of a peer review body is inherently biased towards professionals whom the steering committee finds impressive,
  2. Selection of papers for review fits the same model as was employed to select the reviewing body,
  3. Selection of papers from non core areas is very limited and is not informed by practitioners specializing in that area, and
  4. The review body bears an inability to handle evidence that is not gathered in the format it understands (large scale, hard to replicate, double blind randomized clinical trials or meta-studies).

Within such a process, the selection of initial papers is biased. Under this flawed process, the need for consensus results in not simply attrition of anything that cannot be agreed upon – but rather, a sticky bias against anything which has not successfully passed this unfair test in the past. An artificial and unfair creation of a pseudoscience results.

Twitter’s Razor – all things being equal, the shorter the video or the less information related, the more likely a just indignation can be derived.

Unbegründet – “2 plus 2 equals 4, and your momma’s a whore” – a baseless, unfounded, or invalid inference concealed inside an otherwise competent scientific endeavor, which nonetheless is pulled from thin air. A statement, posed as a conclusion of research, which is juxtaposed in the abstract or conclusions of a detailed scientific assay or study, which is related to the subject yet possesses no sound backing inside the work. Often Narrative-conforming statements, these are strategically introduced because the author knows they will be favored in peer review, or might be conflated as having been derived from the research contained in the study itself – when nothing of the sort is true.

Unit Bias – the tendency to want to finish a given unit of a task or an item to completion before beginning work on, or considering another avenue of research.  The effort to avoid unwanted data by remaining fixated on legacy avenues of research, as a pretense.

Unity of Knowledge Error (Religion) – to conflate and promote consilience as consensus. Consilience is by its essence inductive and therefore cannot alone underpin a ‘unity of knowledge’ as Edward O. Wilson contends. Only diligent investigation of all compelling alternatives, deductive science, can serve to finalize and unify knowledge (under consensus). To promote consilience as a unity of knowledge or substitute for consensus, in absence of having diligently investigated competing alternative hypotheses, is also known in ethics as ‘religion.’

Untouchable Generalization – a condemning or dismissing generalization that comes with qualifications that eliminate so many cases which could falsify the derogatory claim, that what remains of it is much less impressive than the initial statement might have led one to assume. Yet the defamation stands as fiat knowledge and recitation nonetheless.

Unvestigation – the process of asking loaded questions and sculpting data so that your fake research will produce your desired conclusion.

utile absentia – a study which observes false absences of data or creates artificial absence noise through improper study design, and further then assumes such error to represent verified negative or positive observations. A study containing field or set data in which there exists a risk that absences in measurement data will be caused by external factors which artificially serve to make the evidence absent, through risk of failure of detection/collection/retention of that data. The absences of data, rather than being filtered out of analysis, are fallaciously presumed to constitute bona fide observations of negatives. This is improper study design which will often serve to produce an inversion effect (curative effect) in such a study’s final results. Similar to torfuscation. As well, the instance when an abstract offers a background summary or history of the topic’s material argument as part of its introductory premise, and thereafter mentions that its research supports one argument or position – however fails to define the inference or critical path which served to precipitate that ‘support’ – or even worse, tenders research about a related but not-critical aspect of the research. Like pretending to offer ‘clinical research’ supporting opinions about capitalism, inside a study of the methods employed by bank tellers – it only sounds related. In this case you have converted an absence into a positive – a formal error called utile absentia. This sleight-of-hand allows an opinion article to masquerade as a ‘research study’. It allows one to step into the realm of tendering an apparent epistemological result, which is really nothing more than a ‘they said/I say’ editorial with a scientific analysis bolted onto it, which may or may not present any bearing whatsoever on the subject at hand. Most abstract surveyors do not know the difference – and most study leads cannot detect when this has occurred.
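
A minimal simulation sketch of the inversion (curative) effect described above, with all rates assumed for illustration: both groups share the same true incidence, but detection fails more often in one of them, and the missed cases are silently counted as genuine negatives:

```python
import random
random.seed(5)

TRUE_INCIDENCE = 0.10
DETECTION = {"exposed": 0.50, "unexposed": 0.90}   # assumed differential detection
N = 50_000

def observed_rate(group: str) -> float:
    detected = 0
    for _ in range(N):
        has_condition = random.random() < TRUE_INCIDENCE
        if has_condition and random.random() < DETECTION[group]:
            detected += 1
        # a missed case silently becomes a 'negative' observation -- utile absentia
    return detected / N

for group in ("exposed", "unexposed"):
    print(f"{group:>10}: observed rate = {observed_rate(group):.3f} (true rate = {TRUE_INCIDENCE})")
# The exposed group now *appears* protected (~0.05 vs ~0.09), even though the
# underlying incidence is identical in both groups.
```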

Utility Blindness – when simplicity or parsimony are incorrectly applied as excuse to resist the development of a new scientific explanatory model, data or challenging observation set, when indeed the participant refuses to consider or examine the explanatory utility of any similar new model under consideration.

Verschlimmbesserung – (German) to make something worse while trying to make it better. The fallacy of judging disasters by the measure that, those who bore the ‘good intentions’ should bear no fault, or place themselves as disconnected from the disaster.

Virtue Telescope – employment of a theoretical virtue benefit projected inside a domain which is distant, slow moving, far into the future, diffuse or otherwise difficult to measure in terms of both potential and resulting impact, as exculpatory immunity for commission of an immoral act which is close by, obvious, defined and not as difficult to measure. Similar to but converse of an anachronistic fallacy, or judging distant events based on current norms.

Whack-a-Mole Science – when an arguer presents objection after objection to an observation, data or construct, only to shift to another objection in an inventory of habitual objections as each successive objection is satisfied or made irrelevant. Typically employed when one has no desire to allow discourse on the subject at hand, and through ignoring each successive failure of objection they raise.

What’s Done is Done Bias – the artificial refusal to accept new data because an argument has ‘already been settled.’

Whipping Horse – a martyr issue, study, chart, graphic or event which is cited as exemplary in condemning a targeted group or thought set – which is over-employed or used in a domain of such ample equivocation, equivocal slack, straw man or other ambiguity that it simply becomes a symbol and ironically maintains no real salience to the argument at hand.

Whistleblower’s Irony – the same person who claims that ‘if conspiracy existed there would be whistleblowers’, is typically also the first person to publicly demean anyone who is a whistleblower.

Wittgenstein Error (Contextual) – employment of words in such a fashion as to craft rhetoric, in the form of persuasive or semantic abuse, by means of shift in word or concept definition by emphasis, modifier, employment or context.

Wittgenstein Error (Descriptive) – the contention or assumption that science has no evidence for or ability to measure a proposition or contention, when in fact it is only the crafting of language, language limitation or lack of a cogent question or (willful) ignorance on the part of the participants which has limited science and not in reality science’s domain of observability.

Corruptible:  Science cannot observe it because I have crafted language and definition so as to preclude its description.

Describable:  I cannot observe it because I refuse to describe it.

Existential Embargo:  By embargoing a topical context (language) I favor my preferred ones through means of inverse negation.

You Can’t Wake Someone Who is Pretending to be Asleep – a principle which cites that a faking individual will never follow through with the traits of the disposition they are faking. A fake skeptic will draw conclusions on scant data and hearsay, as long as it conforms with what they believe. Or conversely in logic, mainstream media cannot be shocked to awareness by emergent news or observation inside something they are choosing to ignore in the first place.

Zero Risk Bias – making a decision or creating economic, insurance or financing arrangements so that a zero-risk scenario is developed – ignorant of the ramifications involved in cost or where risk is shifted or the de-optimization entailed in the decision made.

Zombie Theory – pseudo-theory or an old paradigm of understanding which is still being enforced by a portion of the scientific community as consensus, when indeed it is not – and/or the topic is undergoing a Kuhn-Planck paradigm shift. A walking dead placeholder theory which is abused and contorted to both explain everything unknown, and as well uses wicker man rationalizations in order to excuse its shortfalls long after it has ceased to serve any valuable explanatory potential.

epoché vanguards gnosis
