The Tree of Knowledge Obfuscation: Misrepresentation of Science

The following is The Ethical Skeptic’s list, useful in spotting both formal and informal logical fallacies, cognitive biases, statistical breaches and styles of crooked thinking on the part of those in the Social Skepticism movement. It is categorized by employment groupings so that it can function as a context-appropriate resource in a critical review of an essay, imperious diatribe or publication by a thought-enforcing Social Skeptic. To assist this, we have organized the list inside an intuitive taxonomy of ten contextual categories of mischaracterization/misrepresentation:

Misrepresentation of Science

Absensus – if 1000 are convinced by experimental measure, that is consensus. If 1000 are coerced by social unwillingness to examine, that is absensus.

absurdum originem – a philosophy or principle which cites that eventually everything must reduce to something absurd which stands as its basis. There exists no miracle-free eschatology, science, nor ontology. The idea that something is natural or conventional in its understanding only appears as such based on the relative range of discussion in which it is considered. The notion that every explanation eventually must appeal to a miracle to underpin its argument.

acatalepsia Fallacy – a flaw in critical path logic wherein one appeals to the Pyrrhonistic Skepticism principle that no knowledge can ever be entirely certain – and twists it into the implication that therefore, knowledge is ascertained by the mere establishment of some form of ‘probability’. Moreover, that therefore, when a probability is established, no matter how plausible, slight or scant in representation of the domain of information it might constitute, it is therefore now accepted truth.  Because all knowledge is only ‘probable’ knowledge, all one has to do is spin an apparent probability, and one has ascertained accepted knowledge. Very similar in logic to the Occam’s Razor aphorism citing that the ‘simplest explanation’ is the correct explanation.

ad hoc/Pseudo-Theory – a placeholder construct which suffers from the additional flaw that it cannot be fully falsified, deduced nor studied, can probably never be addressed, and further can be proposed in almost any circumstance of uncertainty. These ideas will be thrown out for decades. They can always be thrown out. They will always be thrown out. Sometimes also called ‘blobbing’ or ‘god of the gaps’, it is a bucket into which one dumps every unknown, hate-based, fear-based and unexplained observation – add in a jigger of virtue – then you shake it up like a vodka martini, and get drunk on the encompassing paradigm which can explain everything, anything and nothing all at the same time.

Adams’ Law (of Slow Moving Disasters) – for Scott Adams, Dilbert Comic strip author. Mankind has a 100% batting average at historically averting slow moving predicted disasters.

Corollary 1 – provisional knowledge (of fake skepticism and science) always selects-for and stacks worries faster than it does realities.

Corollary 2 – taking action to address a slow moving disaster worry is not the same as planning for risk.

Corollary 3 – those who habitually stack provisional slow moving future disasters routinely fail to recognize actual existing and past disasters.

Corollary 4 – any disaster that takes longer to come to fruition than the average lifespan of an academic scientist will exhibit an underpinning of 97% supporting consensus.

Agency – an activated, intentional and methodical form of bias, often generated by organization, membership, politics, hate or fear based agenda and disdain. Agency and bias are two different things. Ironically, agency can even tender the appearance of mitigating bias, as a method of its very insistence. Agency is different from either conflict of interest or bias. It is actually stronger than either, and more important in its detection. Especially when a denial is involved, the incentive to double-down on that denial, in order to preserve office, income or celebrity – is larger than either bias or nominal conflict of interest. One common but special form of agency is the condition wherein it is concealed, and expresses through a denial/inverse negation masquerade called ideam tutela. When such agency is not concealed it may be called tendentiousness.

ideam tutela – concealed agency. A questionable idea or religious belief which is surreptitiously promoted through an inverse negation. A position which is concealed by an arguer because of their inability to defend it, yet is protected at all costs without its mention – often through attacking without sound basis, every other form of opposing idea.

Tendentious – showing agency towards a particular point of view, especially around which there is serious disagreement in the at-large population. The root is the word ‘tendency’, which means ‘an inclination toward acting a certain way.’ A tendentious person holds their position from a compulsion which they cannot overcome through objective evaluation. One cannot be reasoned out of a position which they did not reason themselves into to begin with.

Amara’s Law – we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run. Named after futurist Roy Amara (1925–2007).

Anecdote Data Skulpting (Cherry Sorting) – when one applies the categorization of ‘anecdote’ to screen out unwanted observations and data. Based upon the a priori and often subjective claim that the observation was ‘not reliable’. Ignores the probative value of the observation and the ability to later compare other data in order to increase its reliability in a more objective fashion, in favor of assimilating an intelligence base which is not highly probative, and can be reduced only through statistical analytics – likely then only serving to prove what one was looking for in the first place (aka pseudo-theory).

aphronêsis – twisted, extreme, ill timed, misconstrued, obtuse or misapplied wisdom, sometimes even considered correct under different contexts of usage – which allow an agenda holder to put on a display of pretend science, rationality and skepticism. The faking skeptic will trumpet loudly and often about the scientific method, evidence, facts, ‘skepticism’ or peer review, but somehow will never seem to be able to apply those principles, nor cite accurate examples of their application. The faking skeptic will speak often of ‘demanding proof,’ deny sponsors access to challenge ideas or fiat science, and incorrectly cite that denial of access to peer review is indeed – peer review. You will find them endlessly spouting incorrect phrases like ‘the burden of proof resides on the claimant’; its symbolic evisceration standing as de facto proof of their own beliefs. Vehement skeptics tend to be young and only academically/socially trained to a great degree – their ‘skepticism’ easing most of the time as they gain life experience. ‘Proof’ is the hallmark of religion. ~Bill Gaede

Apparent Epistemology – treatment or regard of underlying assumptions or implicit assumptions as proved or accepted science, typically executed through locution deception. Statements such as ‘is known to provide’ in lieu of ‘has been shown (by study X) to provide’ inside a touted epistemology. The use of social constructs, common wisdom or pluralistic ignorance based consensus as a foundation for scientific understanding or underpinning for further stacked provisional explanation.

Appeal to Probability – the false contention of a skeptic that the most probable, simple, or likely outcome in a set of highly convoluted but unacknowledged assumptions, is therefore the compulsory or prevailing conclusion of science.

Appeal to Scientists Fallacy – an argument that is misrepresented to be the premise held true on the part of the prevailing group of scientists; or concludes a hypothesis (typically a belief) to be either true or false based on whether the premise leads to a more successful career in science.

Appeal to Skepticism (Fallacy of Irrelevance)

ergo sum veritas Fallacy (of Irrelevance)

1′ (strong).  The assumption that because one or one’s organization is acting in the name of skepticism or science, that such a self claimed position affords that organization and/or its members exemption from defamation, business tampering, fraud, privacy, stalking, harassment and tortious interference laws.

1a.  The contention, implication or inference that one’s own ideas or the ideas of others hold authoritative or evidence based veracity simply because their proponent has declared themselves to be a ‘skeptic.’

1b.  The assumption, implication or inference that an organization bearing a form of title regarding skepticism immediately holds de facto unquestionable factual or ideological credibility over any other entity having conducted an equivalent level of research into a matter at hand.

1c.  The assumption, implication or inference that an organization or individual bearing a form of title regarding skepticism, adheres to a higher level of professionalism, ethics or morality than does the general population.

Appeal to Skepticism (Fallacy of Irrelevance)

2a.  The declaration, assumption or implication that a consensus skeptical position on a topic is congruent with the consensus opinion of scientists on that topic.

2b.  The argument assumption or implication that an opinion possesses authoritative veracity or a proponent possesses intellectual high ground simply through allegiance to a consensus skeptical position on a topic.

3.   The presumption or contention that taking a denial based or default dubious stance on a set of evidence or topic is somehow indicative of application of the scientific method on one’s part, or constitutes a position of superior intellect, or represents a superior critical or rational position on a topic at hand.

Inverse Negation Fallacy – The asymmetrical strategy of promoting an idea through negation of all its antithetical concepts. A method of undermining any study, proponent, media byte, article, construct, data, observation, effort or idea which does not fit one’s favored model, in a surreptitious effort to promote that favored model, along with its implicit but not acknowledged underpinning claims, without tendering the appearance of doing so; nor undertaking the risk of exposing that favored model or claims set to the scientific method or to risky critical scrutiny.

Truzzi Fallacy – The presumption that a position of skepticism or plausible conformance on a specific issue affords the skeptical apologist tacit exemption from having to provide authoritative outsider recitation or evidence to support a contended claim or counter-claim. “Pseudo-Skeptics: Critics who assert negative claims, but who mistakenly call themselves ‘skeptics,’ often act as though they have no burden of proof placed on them at all. A result of this is that many critics seem to feel it is only necessary to present a case for their counter-claims based upon plausibility rather than empirical evidence.”  – Marcello Truzzi (Founding Co-chairman of CSICOP)

Richeliean Appeal to Skepticism – an inflation of personal gravitas, celebrity or influence by means of implicit or explicit threats of coercive tactics which can harm a victim one wishes to be silenced. Coercive tactics include threats to harm family, contact employers, tamper with businesses, employment of celebrity status to conduct defamation activities or actions to defraud, or otherwise cause harm to persons, reputation or property. This includes the circumstance where a Richeliean skeptic encourages and enjoys a form of ‘social peer review,’ empowered via politics or a set of sycophants who are willing to enact harm to a level which the Richeliean power holder himself would not personally stoop.

Appeal to Tradition (argumentum ad antiquitam) – a conclusion advertised as proven scientifically solely because it has long been held to be true.

argumentum ad baculum (appeal to the stick, appeal to force, appeal to threat) – a counter argument made through coercion or threats of force on the part of a Social Skeptic, via the media, one’s employment or on one’s scientific reputation.

As Science as Law Fallacy – conducting science as if it were being reduced inside a court of law or by a judge (usually the one forcing the fake science to begin with), through either declaring a precautionary principle theory to be innocent until proved guilty, or forcing standards of evidence inside a court of law onto hypothesis reduction methodology, when the two processes are conducted differently.

Ascertainment Bias – a form of inclusion or exclusion criteria error where the mechanism of sampling, specimen selection, data screening or sub-population selection is inherently flawed, is developed based upon an inadequate or sophomoric view of the observations base or produces skewed representations of actual conditions.

Attentodemic – a pandemic which arises statistically for the most part from an increase in testing and case-detection activity. From the two Latin roots attento (test, tamper with, scrutinize) and dem (the people). A pandemic, whose curve arises solely from increases in statistical examination and testing, posting of latent cases or detected immunity as ‘current new cases’, as opposed to true increases in fact.
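The arithmetic behind an attentodemic can be sketched in a few lines. The prevalence and testing figures below are hypothetical, chosen only to show how a disease whose true prevalence never moves can still present a steeply rising case curve when testing volume ramps up:

```python
# Assumed toy numbers: a flat 2% true prevalence while testing ramps up 80x.
TRUE_PREVALENCE = 0.02
tests_per_week = [1_000, 5_000, 20_000, 80_000]

# Reported "new cases" scale directly with testing effort, not with the disease.
new_cases = [round(TRUE_PREVALENCE * n) for n in tests_per_week]

print(new_cases)  # → [20, 100, 400, 1600]: an 80x "surge" with zero change in prevalence
```

The reported curve grows eighty-fold, yet the underlying condition is perfectly flat, which is the entire deception the term names.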

Axiom of the Lesson Learned – a lesson can only be learned if one is willing to examine and capable of learning.

Bandwagon Effect – the tendency to do (or believe) things because many in the Social Skeptic community do (or believe) the same. See Margold’s Law.

bedeutungslos – meaningless. A proposition or question which resides upon a lack of definition, or which contains no meaning in and of itself.

Belief Bias – an effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion, or suitability under their acknowledged or unacknowledged set of beliefs.

Bespoke Truth – the idea that truth is not congruent with facts. Nobody thinks their own beliefs are untrue or nonfactual. The problem resides instead wherein people pick and choose what they decide to accept from among the array of facts in order to fit or craft a truth of their liking. A flawed application of inference or scientific study (torfuscation) – wherein it is not that the study or facts are incorrect, rather that they stand merely as excuses to adopt an extrapolated and tailored ‘truth’ which is not soundly represented by such ‘facts’.

Big is Science Error – the use of bigger sample sizes and study data as a way of bypassing the scientific method, while still tendering an affectation of science and gravitas. Occurs any time a study cannot be replicated, and a call to consensus is made simply because it would be too difficult to replicate the study basis of the consensus.

Bigger Data is Better Fallacy – the invalid assumption which researchers make when attempting to measure a purported phenomenon, that data extraction inside a very large source population as the first and only step of scientific study, will serve to produce results which are conclusive or more scientific. The same presumption error can apply to meta-analysis, wherein such an analysis is conducted in a context of low/detached, rather than informed, knowledge sets, and will serve to dilute critical elements of signal and intelligence which could be used to elucidate the issue further.
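A minimal sketch of why scale does not cure a flawed collection mechanism. The numbers are hypothetical: a measurement process that systematically over-reports by a fixed amount produces an estimate that converges ever more tightly, but on the wrong value:

```python
import random

random.seed(0)

TRUE_MEAN = 50.0
COLLECTION_BIAS = 5.0  # hypothetical systematic error in how observations are gathered

def biased_sample(n):
    # every observation carries the same collection bias, regardless of sample size
    return [random.gauss(TRUE_MEAN + COLLECTION_BIAS, 10.0) for _ in range(n)]

errors = {}
for n in (100, 10_000, 100_000):
    estimate = sum(biased_sample(n)) / n
    errors[n] = abs(estimate - TRUE_MEAN)

# The estimate stabilizes near 55, not 50: more data sharpens the wrong answer.
```

Growing the sample a thousand-fold shrinks the random noise but leaves the five-point systematic error fully intact, which is exactly the fallacy's mechanism.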

bonus sive malum – a condition of Nelsonian ignorance, willful blindness or other circumstance of low information/diligence wherein inaction in addressing an error/shortfall or skepticism in its regard – serve to render incompetence as indistinguishable from professional malfeasance. There is no such thing as Hanlon’s Razor with national strategy for example. When it comes to human rights and the role of government, an absence of diligence is indistinguishable from malice. A limit/boundary to Hanlon’s Razor. Also known as the ‘Hanlon lived a sheltered life’ axiom.

Bradley Effect – the principle wherein a person being polled will, especially in the presence of trial heat or iteration-based polls, tend to answer a poll question with a response which they believe the polling organization or the prevailing social pressure, would suggest they should vote or which will not serve to identify them into the wrong camp on a given issue. The actual sentiment of the polled individual is therefore not actually captured.
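A toy simulation of the effect, with an assumed 50% private support for position A and a hypothetical 15% rate at which A-supporters misreport under perceived social pressure; both parameters are illustrative, not empirical:

```python
import random

random.seed(7)

N = 10_000
TRUE_SUPPORT_A = 0.50   # actual private sentiment (assumed)
MISREPORT_RATE = 0.15   # hypothetical share of A-supporters who bow to pressure

private = ['A' if random.random() < TRUE_SUPPORT_A else 'B' for _ in range(N)]

# Under perceived social pressure toward B, some A-supporters answer B instead.
reported = ['B' if v == 'A' and random.random() < MISREPORT_RATE else v
            for v in private]

poll_a = reported.count('A') / N    # what the poll captures (~0.425)
actual_a = private.count('A') / N   # true sentiment, never observed (~0.50)
```

The poll undercounts A by several points, and no amount of re-sampling the same respondents recovers the true sentiment, because the distortion happens at the answer, not in the sample.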

Caesar’s Wife (Must be Above Suspicion) – a principle inside the philosophy of skepticism which cites that a mechanism, research/polling effort, or study which bears an implicit a priori claim to innocence (i.e. soundness, salience, precision, accuracy and/or lack of bias/agency) must transparently and demonstratively prove this claim before being assumed as such, executed or relied upon as scientific.

Catch 22 (non rectum agitur fallacy) – the pseudoscience of forcing the proponent of a construct or observation, to immediately and definitively skip to the end of the scientific method and single-handedly prove their contention, circumventing all other steps of the scientific method and any aid of science therein; this monumental achievement prerequisite before the contention would ostensibly be allowed to be considered by science in the first place. Backwards scientific method and skipping of the plurality and critical work content steps of science.

Cheater’s Hypothesis – Does an argument proponent constantly insist on a ‘burden of proof’ upon any contrasting idea, a burden that they never attained for their argument in the first place? An answer they fallaciously imply is the scientific null hypothesis; ‘true’ until proved otherwise?

Imposterlösung Mechanism – the cheater’s answer. Employing the trick of pretending that an argument domain which does not bear coherency nor soundness – somehow (in violation of science and logic) falsely merits assignment of a ‘null hypothesis’. Moreover, then that null hypothesis must be assumed sans any real form or context of evidence, or Bayesian science cannot be accomplished. Finally then, that a null hypothesis is therefore regarded by the scientific community as ‘true’ until proved otherwise. A 1, 2, 3 trick of developing supposed scientifically accepted theory which in reality bears no real epistemological, logical, predicate structure nor scientific method basis whatsoever.

Chicory Science – science cut with attractive contestable philosophical critical bases, which are glossed over or hidden. Coffee used to be cut secretly with chicory, then people learned to like chicory coffee. While the philosophy entailed may not be wrong, or provably wrong, and it may look rich, smell wonderful and taste good, it is still not science. Just as chicory, despite all its favorable traits, is still not coffee.

Churnalism – (a portmanteau of ‘churn’ and ‘journalism’) – a circumstance where media outlets rapidly produce articles based on wire stories and press releases with little original reporting, serving as a conduit for propaganda in disseminating dysinformation. The term highlights the role of media and communication channels in this process: such channels can amplify and normalize distorted truths, embedding them into the public consciousness.

Cladistic Dismantling – the definition of a malady or observed set of phenomena, into numerous ‘distinction without a difference’ subsets in an attempt to disguise or cloud noise around the overall trend in numbers involved in the malady or the phenomenon’s overall impact.

Clarke’s Third Law – any sufficiently advanced technology is indistinguishable from magic.

Click Bait (or Headline) Skepticism – a position backed by articles or studies in which the headline appears to support the position contended, however which in reality actually contend something completely or antithetically different. A skeptical understanding which is developed through sound bites and by never actually reading the abstract, method or content of cited articles or studies.

Close-Hold Embargo – is a common form of dominance lever exerted by scientific and government agencies to control the behavior of the science press. In this model of coercion, a governmental agency or scientific entity offers a media outlet the chance to get the first or most in-depth scoop on an important new ruling, result or disposition – however, only under the condition that the media outlet not seek any dissenting input, nor critically question the decree, nor seek its originating authors for in-depth query.

Complexifuscation – the introduction of similar signals, inputs or measures, alongside a control measure or an experimental measure, in an attempt to create a ‘cloud of confusion or distraction’ around the ability to effect observation, control or measure of a targeted set of data. Preemption of a phenomenon with in-advance flurries of fake hoaxes, in order to obscure the impact, or jade the attention span of a target audience, around a genuinely feared phenomenon.

Confirmation Reliance Error – abuse of the Popper demarcation principle, which cites that body of knowledge/finished science cannot rely upon predictive and confirming evidence alone, by then applying this principle incorrectly to the process of science – or failing to distinguish controlled predictive science from simple confirmatory observation after the fact. This in an effort to filter out selectively, those ideas and theories which are vulnerable through having to rely in part upon predictive evidence, consilience of multiple inputs, or are further denied access to peer review and replication steps of science simply because of this malpractice in application of the Popperian predictive demarcation.

Conflation Bias – the tendency of a proponent to be unable or unwilling to distinguish recollection between personal religious or unproven beliefs, and actual accepted science; and the resulting extrapolation of science entailed therein.

Conflation of Treatment and Cause – the assumption that, because a medicinal, food or activity helps to mitigate the symptoms of a malady, therefore the treatment is necessarily addressing the cause of that malady. Assuming that because exercise and low caloric intake help reduce blood pressure and weight, therefore low activity and excess caloric intake are quod erat demonstrandum the cause of those maladies.

Consensussing – (a portmanteau of ‘consensus’ and ‘sussing’) – is a dysinformation activity in which a false perception of consensus is built through crafting of pluralistic ignorance (regarding both the issue at hand and the matter of consensus itself) within the scientific community, or conducting the same activity to build pluralistic ignorance inside the public at large.

Consilience Evasion – a refusal to consider scientifically multiple sources of evidence which are in congruence or agreement, focusing instead on targeting a single item from that group of individual sources of evidence because single items appear weaker when addressed alone. Also called a ‘silly con.’

Silly Con – spinning consilience as consensus. Investigating only one alternative and through manipulative pluralistic ignorance and social pressure, declaring that hypothesis as consensus and all others as unnecessary/pseudoscience/anti-science.  Spinning politically motivated variations of an accepted scientific hypothesis, and selling those variations to the untrained public, for consumption in the name of science.

Constructive Ignorance (Lemming Weisheit or Lemming Doctrine) – a process related to the Lindy Effect and pluralistic ignorance, wherein discipline researchers are rewarded for being productive rather than right, for building ever upward instead of checking the foundations of their research, for promoting doctrine rather than challenging it. These incentives allow weak confirming studies to be published and untested ideas to proliferate as truth. And once enough critical mass has been achieved, they create a collective perception of strength or consensus.

Contrathetic Impasse – a paradoxical condition wherein multiple competing hypotheses and/or ad hoc plausible explanations bear credible inductive evidence and research case history – yet each/all hypotheses or explanations have been falsified/eliminated as being sufficiently explanatory for more than a minor portion of a defined causal domain or observation set. For instance, the MiHoDeAL explanation contains 5 very credible possible explanations for challenging phenomena. However, the sum total of those 5 explanations often only amounts to explaining maybe 5 – 15% of many persistent paranormal phenomena. The presumption that one of those explanations is comprehensively explanatory, is a trick of pseudoscience. Another new hypothesis is therefore demanded in the circumstance of a contrathetic impasse paradox.

Causes or influences which contribute to a contrathetic impasse:

1.  Foundational assumptions/investigation are flawed or have been tampered with.

2.  Agency has worked to fabricate and promote falsifying or miscrafted information as standard background material.

3.  Agency has worked to craft an Einfach Mechanism (Omega Hypothesis) from an invalid null hypothesis.

4.  Agency has worked to promote science of psychology, new popular theory or anachronistic interpretation spins on the old mystery.

5.  SSkeptics have worked to craft and promote simple, provisional and Occam’s Razor compliant conclusions.

6.  Agency has worked to foist ridiculous Imposterlösung constructs in the media.

7.  Agency has worked to foist shallow unchallenged ad hoc explanations in the media.

8.  SSkeptics seem to have organized to promote MiHoDeAL constructs in the media.

9.  There exist a set of repeatedly emphasized and/or ridiculously framed Embargo Hypotheses.

10.  Agency has worked to promote conspiracy theory, lob & slam Embargo Hypotheses as an obsession target to distract or attract attack-minded skeptics to the mystery.

The reason this is done is not the confusion it provides, rather the disincentive which patrolling skeptics place on the shoulders of the genuine skilled researcher. These forbidden alternatives may be ridiculous or indeed ad hoc themselves – but the reason they are raised is to act as a warning to talented researchers that ‘you might be tagged as supporting one of these crazy ideas’ if you step out of line regarding the Omega Hypothesis.

Contrathetic Inference Paradox – a condition where abductive and inductive inference point to one outcome or understanding, and deductive inference points in another antithetical direction entirely. Since science often begins with inductive study stemming from abductive understanding, it will dogmatically hold fast to an inductive understanding until a paradigm shift occurs – a result of the weight of deductive evidence pointing in a different direction. The job of fake skepticism is to ensure that this deductive evidence or any thought resulting from it, is never accepted into science in the first place.

Correlation to Causality Leap – contending that two events which occur together have a cause-and-effect relationship proven by science, since statistics are used to describe them.
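The leap can be illustrated with a small simulation in which a hidden confounder (a hypothetical ‘heat’ variable, as in the classic ice-cream-and-drownings example) drives two series that never influence each other, yet leaves them strongly correlated:

```python
import random

random.seed(1)

n = 5_000
# Hypothetical hidden confounder (summer heat) driving both observed series.
heat = [random.random() for _ in range(n)]
ice_cream = [h + random.gauss(0, 0.1) for h in heat]
drownings = [h + random.gauss(0, 0.1) for h in heat]

def corr(x, y):
    # plain Pearson correlation coefficient
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

r = corr(ice_cream, drownings)
# r comes out strongly positive, yet neither series causes the other.
```

The statistic faithfully describes the two series; what it cannot do is license the causal claim, since the entire association was manufactured by the third, unmeasured variable.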

Crate Effect – impact of persons who purposely give the opposite response to what they really think, because of animosity towards the polling group or the entailed issue (especially if non-free press) and/or their perceived history of bias, and/or animosity towards the circus around elections or the elections themselves. This false left-leaning bias is generated most often inside groups who believe media outlets to be left-leaning and unfair.

Credentialism – an implication or overemphasis on academic or educational qualifications (e.g., certificates, degrees & diplomas), awards or publications as the basis of an individual’s expertise or credibility.

Critical Elegance – the character and makeup of a construct or hypothesis, in that it both addresses every ‘must answer’ or critical path question along with many or most questions at hand, and achieves this without having to resort to assembling highly convoluted and risky stacks of conjecture in order to do so.

Critical Thinking – the ability to assemble an incremental, parsimonious and probative series of questions, in the right sequence, which can address or answer a persistent mystery – along with an aversion to wallowing in or sustaining the mystery for personal or club gain. Critical thinking is the ability to understand, along with the skill in ability to deploy for benefit (value, clarity, risk and suffering alleviation), critical path logic and methodology. A process of methodically and objectively evaluating a claim to verity, through seeking new observations/questions which can be creatively and intelligently framed to challenge elements of fiat knowledge which underpin the claim, regardless of how compulsive, reasonable, justified and accepted that knowledge might be promoted or perceived.

Cut the Feed – also known as an Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology or observation which is heading in an undesired direction, artificially declaring under a condition of praedicate evidentia, the science as ended or ‘settled’ – or to cut off all scientific examination of the subject – as in a NASA notorious ‘cut the feed’ event.

Demoveogenic Shift – a condition wherein amateurs of a science are proactive, well versed and investigate more depth/critical path, while in contrast the academic fellows of the discipline are habitually feckless, cocooned and privileged.

Denialty – when sskepticism or another institution claiming to represent ‘science’ conclusively promotes evidence showing all suspects they are seeking to protect, as being innocent, yet there is a dead body nonetheless. A state of ignorance which results from such a condition.

Deontological Doubt (epoché) – if however one defines ‘doubt’ – as the refusal to assign an answer (no matter how probable) for a specific question – in absence of assessing question sequence, risk and dependency (reduction), preferring instead the value of leaving the question unanswered (null) over a state of being ‘sorta answered inside a mutually reinforcing set of sorta answereds’ (provisional knowledge) – then this is the superior nature of deontological ethics.

Dichotomy of Compartmented Intelligence – a method of exploiting the compartmented skill sets of scientists, in order to preclude any one of them from fully perceiving or challenging a study direction or result. Call your data analysts ‘data scientists’ and your scientists who do not understand data analysis at all, the ‘study leads’. Ensure that those who do not understand the critical nature of the question being asked (the data scientists) are the only ones who can feed study results to people who exclusively do not grasp how to derive those results in the first place (the study leads). This is a method of intelligence laundering and is critical to everyone’s being deceived, including study authors, peer reviewers, media and public at large.

The Diebold Test – if you and a friend are walking down the sidewalk, and a Diebold ATM machine falls from ten stories above and crushes him, you do not need a doctor to pronounce death, nor to take an EEG nor blood pressure in order to scientifically establish that your buddy is dead. Those who appeal for such false epistemology typically are not actually looking for verity – rather just seeking to deflect anything which competes with the answer they desire to enforce.

Dismissible Margin – the social engineering technique of ensuring that experts inside a body of embargoed science constitute no more than a couple percentage points or less, or that which is less than the Michael Shermer dismissible margin, of the larger body of scientists advised and educated by SSkepticism.

Dismissible Margin Fallacy – presuming that proponents inside a body of embargoed science constitute no more than a couple percentage points or less of all scientists, or that which is less than the Michael Shermer dismissible margin, of the larger body of scientists advised and educated by SSkepticism.

Double Standard – occurs when a person requires a higher level of proof, perfection, or rigor from opposing views or hypotheses than they do from their own or those they favor. This approach is contrary to the principles of genuine skepticism, which emphasizes impartiality and the uniform application of critical analysis to all claims, regardless of personal bias or preference.

Doubt – there are two forms of ‘doubt’ (below).  Most fake skeptics define ‘doubt’ as the former and not the latter – and often fail to understand the difference.

Methodical Doubt – doubt employed as a skulptur mechanism, to slice away disliked observations until one is left with the data set they favored before coming to an argument. This first form is the questionable method of denying that something exists or is true simply because it defies a certain a priori stacked provisional knowledge. This is nothing but a belief expressed in the negative, packaged in such a fashion as to exploit the knowledge that claims to denial are afforded immediate acceptance over claims to the affirmative. This is a religious game of manipulating the process of knowledge development into a whipsaw effect supporting a given conclusion set.

Deontological Doubt (epoché) – if however one defines ‘doubt’ – as the refusal to assign an answer (no matter how probable) for a specific question – in absence of assessing question sequence, risk and dependency (reduction), preferring instead the value of leaving the question unanswered (null) over a state of being ‘sorta answered inside a mutually reinforcing set of sorta answereds’ (provisional knowledge) – then this is the superior nature of deontological ethics.

The Duty of Science (Einstein) – the right to search for the truth implies also a duty: one must not conceal any part of what one finds to be true, nor obfuscate what one fears could possibly be true.

DRiP Method – the method by which SSkeptics employ Pseudoscience to filter official discourse and enforce the religious goals of the Deskeption Cabal by the process of 1. D – Denying Data, 2. R – Restricting Research, and 3. P – Preventing or Pretending Peer Review.

Dual-Burden Model of Inferential Ethics – the condition which is called ‘plurality’ by Ockham’s Razor exists once the sponsors of an alternative (to the null) idea or construct (does not have to be fully mature as a hypothesis) have achieved any one of the following necessity thresholds:

‣  a nexus of a persistent and robust alternative construct observation base

‣  potential falsification of the ‘null’ exists (and certainly if that null is not really a hypothesis itself)

‣  the intent contribution of agency has been detected

‣  the critical issue involved is a matter of public trust

‣  the contention involves placing involuntary or large counts of stakeholders at risk

‣  there exists a critical immaturity of the entailed observation domain.

Under such necessity, the hypothesis reduction circumstance exists wherein an actual null hypothesis must be developed, and further be shown to have comprehensive explanatory potential to justify its contention – it can no longer reside as simply the lazy ‘null’ argument. Conditions wherein the evidence is forcing the null sponsor to contend something other than simply ‘nuh-uh’ (nulla infantis). However beware, the discipline in such defense of the null better be just as solide-en-preuve as that discipline set which was previously demanded of alternative explanation sponsors.

The Dunning Line (or Inretio Line, Latin ‘ensnare’) – ‘Discipline your mind into a steel trap, but make sure it doesn’t serve to only entrap you.’ The line beyond which, one has become so skeptical, that they have become stupid in the process. One skilled at filtering out only that information which offends their feelings and sensibility – as opposed to being based upon actual evidence or science. Named for the minimum level of prowess one can possess and still barely function as a skeptic, as opposed to a babbling cynic.

Duty to Address and Inform (Hypothesis) – a critical element and aspect of parsimony regarding a scientific hypothesis. The duty of such a hypothesis to expose and address in its syllogism, all known prior art in terms of both analytical intelligence obtained or direct study mechanisms and knowledge. If information associated with a study hypothesis is unknown, it should be simply mentioned in the study discussion. However, if countermanding information is known, the structure of the hypothesis itself must both inform of its presence and as well address its impact.

Dysinformation (Dystopian Disinformation) – a special form of disinformation (lying with facts or mostly-truths), pushed through churnalism/propaganda – employed to prepare a collective awareness or mass formation paradigm in advance, so as to comfortably explain a future calamity or adverse spiral of events.

Ecneics – refusal to study a topic, so one can then cite the fact that no evidence exists, and moreover then can declare the subject to be invalid. Anecdotes in support, only serve to reinforce the correct conclusion in denial.

Educative Difficulty Effect – that information acquired in a course of academic instruction or that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered or validly applied.

Effect Inversion – one sign of a study which has been tampered with through inclusion and exclusion criteria, in an effort to dampen below significance or eliminate an undesired signal, is the circumstance where an inverse or opposite relationship effect is observed in the data when the inverse question is asked concerning the same set of data. If a retrospective cohort study purportedly shows no effect relationship between a candidate cause and a malady – there should also be no relationship between the candidate cause and the absence of the malady as well (if the two are indeed unrelated epidemiology). The presence of a reverse or apparently ‘curative’ influence of the candidate cause being evaluated in the data may signal the impact of data manipulation.
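The consistency check the entry describes can be sketched numerically. The counts below are purely illustrative (hypothetical cohort numbers, not from any real study): a study claiming "no effect" should show a relative risk near 1.0 both for the malady and for its absence; an apparently 'curative' RR well below 1.0 is the inversion signal in question.

```python
def relative_risk(a, b, c, d):
    """Relative risk of an outcome given exposure, from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical cohort counts (illustrative only)
a, b, c, d = 100, 900, 150, 850

rr_malady  = relative_risk(a, b, c, d)   # exposure -> malady
rr_absence = relative_risk(b, a, d, c)   # exposure -> absence of malady

print(f"RR (malady):  {rr_malady:.3f}")   # 0.667: an apparently 'curative' signal
print(f"RR (absence): {rr_absence:.3f}")  # 1.059: not a null relationship either
```

Under a genuinely null relationship both ratios would sit near 1.0; the paired deviation is what invites scrutiny of the inclusion/exclusion criteria.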

Einfach Mechanism – an idea which is not yet mature under the tests of valid hypothesis, yet is installed as the null hypothesis or best explanation regardless. An explanation, theory or idea which sounds scientific, yet resolves a contention through bypassing the scientific method, then moreover is installed as truth thereafter solely by means of pluralistic ignorance around the idea itself. Pseudo-theory which is not fully tested at its inception, nor is ever held to account thereafter. An idea which is not vetted by the rigor of falsification, predictive consilience nor mathematical derivation, rather is simply considered such a strong, or Occam’s Razor (sic) stemming-from-simplicity idea that the issue is closed as finished science or philosophy from its proposition and acceptance onward. A pseudo-theory or false hypothesis which is granted status as the default null hypothesis or as posing the ‘best explanation’, without having to pass the rigors with which its competing alternatives are burdened. The Einfach mechanism is often accompanied by social rejection of competing and necessary alternative hypotheses, which are forbidden study. Moreover, the Einfach hypothesis must be regarded by the scientific community as ‘true’ until proved otherwise. An einfach mechanism may or may not be existentially true.

Elemental Pleading – breaking down the testing of data or a claim into testing of its constituents, in order to remove or filter an effect which can only be derived from the combination of elements being claimed. For instance, to address the claim that doxycycline and EDTA reduce arterial plaque, testing was developed to measure the impact of each item individually, and when no effect was found, the combination was dismissed as well without study, and further/combination testing was deemed to be ‘pseudoscience’.
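A toy simulation shows why element-by-element testing can miss a combination effect. The response model here is entirely hypothetical (an assumed benefit that exists only when both agents are present), not a claim about doxycycline/EDTA specifically:

```python
import random

random.seed(1)

def outcome(drug_a, drug_b):
    # Hypothetical response model: benefit exists only for the combination.
    benefit = 2.0 if (drug_a and drug_b) else 0.0
    return benefit + random.gauss(0, 1)   # add measurement noise

def mean_effect(drug_a, drug_b, n=5000):
    """Mean treated response minus mean untreated response."""
    treated = sum(outcome(drug_a, drug_b) for _ in range(n)) / n
    control = sum(outcome(False, False) for _ in range(n)) / n
    return treated - control

print(f"A alone: {mean_effect(True, False):+.2f}")   # ~0: 'no effect'
print(f"B alone: {mean_effect(False, True):+.2f}")   # ~0: 'no effect'
print(f"A and B: {mean_effect(True, True):+.2f}")    # ~+2: real combined effect
```

Testing each constituent alone returns a null result, yet the combination carries the entire effect – dismissing the combination on the basis of the elemental tests is the pleading in question.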

Embargo Hypothesis (Hξ) – was the science terminated years ago, in the midst of large-impact questions of a critical nature which still remain unanswered? Is such research now considered ‘anti-science’ or ‘pseudoscience’?

Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology which is heading in an undesired direction, artificially declaring under a condition of praedicate evidentia, the science as ‘settled’.

Poison Pill Hypothesis – the instance wherein sskeptics or agency work hard to promote lob & slam condemnation of particular ideas. A construct obsession target used to distract or attract attack-minded skeptics into a contrathetic impasse or argument. The reason this is done is not the confusion or clarity it provides, rather the disincentive which patrolling skeptics place on the shoulders of the genuine skilled researcher. These forbidden alternatives (often ‘paranormal’ or ‘pseudoscience’ or ‘conspiracy theory’ buckets) may be ridiculous or indeed ad hoc themselves – but the reason they are raised is to act as a warning to talented researchers that ‘you might be tagged as supporting one of these crazy ideas’ if you step out of line and do not visibly support the Omega Hypothesis. A great example is the skeptic community tagging of anyone who considers the idea that the Khufu pyramid at Giza might have not been built by King Khufu in 2450 BCE, as therefore now supporting conspiracy theories or aliens as the builders – moreover, their being racist against Arabs who now are the genetic group which occupies modern Egypt.

Emperor’s New Clothes Cozenage – a cultural state in which such delusion persists, or when one underpins argument for a supposed tenet of science by its being too intricate or incomprehensible for those unfit for their positions, or too stupid, or untrained, or of lower caste, or incompetent to comprehend.

Epibelieology – the study of the patterns and effects of health & disease conditions in defined populations; while yet at the same time meticulously avoiding study of the cause of those same diseases or conditions.

Epistemic Commitment – is a compulsion on the part of a proponent to uphold the factual assertions of a given proposition, and to tender apologetic or No True Scotsman pleading exceptions, when faced with contraindicating data. It contrasts with dogma in that epistemic commitment may be released through respectful discourse, whereas dogma typically is not.

Essential Schema Filtering Error – when one uses pop psychology studies such as the 1980s Loftus Study to dismiss memories and observations which they do not like. By citing that memories and eyewitness testimony are unreliable forms of evidence, pretend skeptics present an illusion of confidence in dismissing disliked eyewitness essential schema data, when neither the Federal Rules of Evidence, science nor even the cited studies make such a claim which allows the dismissal of eyewitness testimony at all.

Ethical Inversion – a social condition and form of linear induction which prohibits the execution of actual science, ironically in the name of ethics. A false condition wherein proper study design is deemed to involve unethical or inhumane techniques, thus researchers are allowed to employ flawed study design, which in turn only serves to produce results that mildly suggest there is no need to conduct proper study design in the first place. So none is undertaken.

Ethical Skeptic’s Axiom – accurate, is simple. But that does not serve to make simple, therefore accurate.

Ethical Skeptic’s Axiom of Extremism and Science – if they will choose to be an extremist politically, this offers a glimpse as to the poor integrity and instant-grits rigor placed into their claim of ‘settled science’ as well.

Ethical Skeptic’s Dictum of Malice and Human Rights – “Within the context of an impingement of human rights, incompetence and malice are indistinguishable.”

Ethical Skeptic’s Dictum of Peer Review – in a domain which resides below a given quotient of knowledge, peer review is indistinguishable from ad populum appeal.

Ethical Skeptic’s Law – if science won’t conduct the experiment, society will force the experiment. One can only embargo an idea for so long.

The Ethical Skeptic’s Law of Advanced Intelligence

Neti’s Razor – one cannot produce evidence from an entity which at a point did or will not exist, to also demonstrate that nothing aside from that entity therefore exists. The principle which serves to cut secular nihilism as a form of belief, distinct from all other forms of atheism as either philosophy or belief. From the Sanskrit idiom, Neti Neti (not this, not that). Therefore, you are wholly unqualified to instruct me that this realm is the only realm which exists, and efforts to do so constitute a religious activity.

I Am that I Am – that which possesses the unique ability to be able to define itself, renders all other entities disqualified in such expertise. There is no such thing as an expert in god. The principle which serves to cut theism as a form of belief, distinct from all other forms of belief as either philosophy or religion. From the Torah idiom, I Am (I Am that I Am or in Sanskrit, Aham Brahmasmi).  Therefore, if god existed, you are unqualified to tell me about it. So, theism falls into a lack of allow-for domain.

Non-Existence Definition – six questions form the basis of a definition: What, Where, When, How, Why, Who. The answers to this set of six questions still form a definition of expert attributes, even if the answer to all six is ’empty set’. Therefore, when one applies the ethics of skepticism – one cannot formulate a definition which is specified as ’empty set’, without due empirical underpinning, a theory possessing a testable mechanism and a consilience of supporting research.  We have none of this, and can make no claims to ‘non-existence’ expertise in god.

Principle of Indistinguishability – any sufficiently advanced act of benevolence is indistinguishable from either malevolence or chance.

Ethical Skeptic’s Principle of Stakeholder Expertise – a stakeholder does not have to be a degree-holding expert in a discipline, in order to spot principal incompetence inside that discipline. As well, the stakeholder possesses the right of review of any claim which places the stakeholder under loss, harm, or risk.

The Ethical Skeptic’s Razor (The Antiwisdom of Crowds) – among competing alternatives, all other things being equal, prefer the one for which discussion or research is embargoed. Power, Politics, Narrative, and Profit demand a level of transparency which obviates that same burden upon mere dissent. What is enforced by Narrative, can also be dismissed as Narrative.

Existential Fallacy (of Science) – the implication or contention that there is no science supporting an idea, or that science has rejected an idea, when in fact no scientific study at all, or of any serious import has been conducted on the topic at hand.

Existential Occam’s Razor Fallacy (Appeal to Authority) – the false contention that the simplest or most probable explanation tends to be the scientifically correct one. Suffers from the weakness that myriad and complex underpinning assumptions, based upon scant predictive/suggestive study, provisional knowledge or Popper insufficient science, result in the condition of tendering the appearance of ‘simplicity.’

exoagnoia – conspiracy which is generated naturally through the accelerative interaction of several commonplace social factors. A critical mass of an uninformed, misinformed or disinformed population under chronic duress (the ignorance fuel), ignited by an input of repetitive authoritative propaganda (the ignition source). Such a phenomenon enacts falsehood through its own inertia/dynamic and does not necessarily require a continuous intervention on the part of an influencing group.

Expert/Expertise – one who can, or that awareness which enables one to, negotiate randomness and spot obviousness, more reliably than the average person.

Expert Relative Privation Error – the subjective contention that an avenue of research is not transparent to accountability inside science, that scientists are restricted from or too busy to access its undisciplined body or domain of evidence, or that the sponsors are hiding/ignoring counter evidence or are not forthcoming with their analysis. When in fact, such contentions are excuses foisted to countermand a need to pursue under the scientific method, a subject which has passed an Ockham’s Razor necessity of plurality.

Explanitude – the condition where a theory has been pushed so hard as authority, or is developed upon the basis of pseudoscience such as class struggle theory or psychology of sex, that it begins to become the explanation for, or possesses an accommodation for every condition which is observed or that the theory domain addresses. A theory which seems to be able to explain everything, likely explains nothing.

Extumary (Policy) – the condition wherein those in a position of authority must select for the most extreme of measures (typically inside an Overton Window) in order to allay perceptions of ineffectiveness, or deflect criticism that they did not take enough action. For example, initiating population lockdowns against a communicable illness, which bear a likelihood of being even more damaging than the illness itself – all for the mere appearance of ‘taking definitive action’ or illusions of possessing control of the situation. Inside this dynamic, the most extreme policy under consideration by science, usually ends up being the one implemented. Once implemented, any hint of its failure will thereafter be squelched.

¡fact! – lying through facts. Data or a datum which is submitted in order to intimidate those in a discussion, is not really understood by the claimant, or rather which is made up, is not salient or relevant to the question being addressed, or is non-sequitur inside the argument being made. The relating of a fact which might be true, does not therefore mean that one is relating truth.

Fallacy of Centrality – If something significant was happening, I’d know about it; since I don’t know about it, it isn’t happening. A fallacy framed by researcher Ron Westrum in his study of the diagnostic practices of pediatricians in the 1940s and 1950s. An expert or self-proclaimed one assumes that they are in a central position inside a topic or discipline. Moreover, they assume that if something of serious import were happening, they would know about it. Since they don’t know about it, therefore it isn’t happening.

Fallacy of Existential Privation – a claim that counters a person’s concern about a scientific issue with ‘Why haven’t you solved the problem then?’ – when raising an objection in science or society does not have to be qualified by having also solved the problem in another way.

Fallacy of Relative Privation – claiming that science is only the property of scientists. Dismissing an avenue of research due its waste of scientists’ time and to the existence of more important, but unrelated, problems in the world which require priority research.

Fallacy of Scientific Composition – The fallacy of contending explicitly or implicitly that legitimate science consists only as a set of approved or published studies. The failure to realize that professionals making trained observations in their field and operating environment, are more 1. timely, 2. accurate, and 3. scientific than studies which try to replicate the same through skeptic, academic, or cubicle work, even when followed by peer review.

Fallacy of The Control – a condition wherein invalid objection is raised to a valid observation, citing that the observation was not conducted against a differential cohort or control. Situations where the extreme nature of the observation is exceptional inside any normal context, to the point of not requiring a control, in order to be valid for inference or inclusion. A fake skeptic’s method of dismissing observations which threaten their religion, by means of sophistry and pretend science.

Fallacy of Univariate Linear Inductive Inquiry into Complex Asymmetric Systems (Univariate Fallacy) – the informal fallacy of attempting to draw inference regarding complex dynamic systems, such as biological, economic or human systems, through employment of singular, linear or shallow inductive analytical methods. A deep understanding of complex systems first demands a conceptual and analytical strategy that respects, defines and adequately reduces for analysis, that complexity. Only then can multiple component analyses be brought to bear as a consilience in understanding the whole.

False Consensus – consensus is the collective judgment, position, and opinion of the community of scientists composing a particular field of study. It is not a popularity poll among scientists in general nor even necessarily inside the field of study in question. Consensus can only be claimed when multiple opposing explanatory alternatives have been researched in objective detail, and a reasonable body of those scientists who developed the field of opposition alternatives has been convinced of the complementary alternative’s superiority. Just because a null hypothesis exists, and only that hypothesis has been researched, does not provide a basis for a claim to consensus, no matter how many scientists, or those pretending to speak for science in the media, favor the null hypothesis.

False Equipoise – abrogation or premature breaching of the principle of equipoise. Equipoise is a term describing the ethical basis for research, in that there should exist genuine uncertainty in the expert (often medical) community conducting research on an idea, approach, treatment or theory. An ethical dilemma arises in a clinical trial or hypothesis reduction when the investigator(s) begin to observe evidence that one treatment or theory is performing to a superior level. As research progresses, the findings may provide sufficient evidence to convince both A. the direct investigation sub-community and B. the research community at large. Once a certain threshold of evidence is surpassed, in theory there is no longer genuine equipoise. False equipoise is driven by agenda, profits and bias. It consists of 1. initiating research in a condition where there is not a fair or unbiased degree of uncertainty on the part of the research group or community, 2. declaring a theory or treatment to be valid and ceasing study at too early a point in the overall research, 3. declaring a treatment or theory valid at too low a threshold of critical path evidence, or 4. declaring a treatment or theory to be consensus without adequate basis or review of uncertainty.

False Stickiness – enforcing as proved science, a theory which should have died off with its proponents years ago.

Fascism – when one spins or exaggerates science or falsely represents that science has proved an argument to be unsubstantiated in order to, legislatively or in media/public opinion, protect large corporations from the damage they cause through their products and services; especially when such services are mandated for every citizen by the presiding government.

ficta rationalitas – a disposition assessment, contended to be an outcome of rationality or scientific skepticism, when in fact it originates from flawed method and/or personal bias.

Fictitious Burden of Proof – declaring a ‘burden of proof’ to exist when such an assertion is not salient under science method at all. A burden of proof cannot possibly exist if neither the null hypothesis, alternative theories, nor any proposed construct possesses a Popper sufficient testable/observable/discernible/measurable mechanism; nor moreover, if the subject in the matter of ‘proof’ bears no Wittgenstein sufficient definition in the first place (such as the terms ‘god’ or ‘nothingness’).

fictus scientia – assigning to disfavored ideas, a burden of proof which is far in excess of the standard regarded for acceptance or even due consideration inside science methods. Similarly, any form of denial of access to acceptance processes normally employed inside science (usually peer review both at theory formulation and at completion). Request for proof as the implied standard of science – while failing to realize or deceiving opponents into failing to realize that 90% of science is not settled by means of ‘proof’ to begin with.

fictus scientia – when one uses skepticism instead of science to develop knowledge and make conclusions. The substitution of skepticism, or a logically or socially twisted form thereof, into the inappropriate role of acting or speaking on behalf of science or scientists.

fictus scientia Fallacy – a contention, purported to be of scientific origin, when in fact, it is not.

Filbert’s Law – to find a result use a small sample population, to hide a result use a large one. More accurately expressed as the law of diminishing information return. Increasing the amount of data brought into an analysis does not necessarily serve to improve salience, precision nor accuracy of the analysis, yet comes at the cost of stacking risk in signal. The further away one gets from direct observation, and the more one gets into ‘the data’ only, the higher is the compounded stack of risk in inference; while simultaneously the chance of contribution from bias is also greater.
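One way the 'hide a result use a large one' half of the law operates is signal dilution: a real effect confined to a susceptible subgroup washes out when that subgroup is pooled into an indiscriminately large sample. The rates below are hypothetical and purely illustrative:

```python
# Hypothetical rates (illustrative): a susceptible subgroup responds
# to the exposure; the rest of the population does not.
baseline_rate  = 0.10
subgroup_rate  = 0.30   # effect within the susceptible 10%
subgroup_share = 0.10

# A targeted (small) study of the subgroup sees the full signal:
rr_subgroup = subgroup_rate / baseline_rate                 # 3.0

# Pooling everyone dilutes the signal toward the noise floor:
pooled_rate = (subgroup_share * subgroup_rate
               + (1 - subgroup_share) * baseline_rate)      # 0.12
rr_pooled = pooled_rate / baseline_rate                     # 1.2

print(f"RR in susceptible subgroup: {rr_subgroup:.1f}")   # prints 3.0
print(f"RR in pooled population:    {rr_pooled:.1f}")     # prints 1.2
```

The pooled study is 'bigger data', yet reports a weaker relationship – not because the effect is absent, but because the inference has moved further from the direct observation.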

Forward Problem Blindness (Unfinished Science) – the “inverse problem” consists in using the results of actual observations to infer the values of the input parameters characterizing a system under investigation. Science which presupposes a forward problem solution, or employs big data/large S population measures only inside, a model and the physical theory linking input parameters forward to that model’s predicted outcome – without conducting direct outcome observation confirmation or field measure follow-up to such proposed values and linkages – stands as unfinished science, and cannot ethically justify a claim to consensus or finished science. The four types of Forward Problem Blindness Errors:

Type I – Cohort Ignorance – wherein special populations or peripheral groups consisting of different inherent profiles are not studied because the survey undertaken was inclusive but too large, or the peripheral groups themselves, while readily observable, were ignored or screened out altogether.

Type II – Parameter Ignorance – wherein a model or study disregards an important parameter – which is tendered an assumption basis which is not acknowledged by the study developer nor peer review, and is then lost as to its potential contribution to increased understanding, or even potential model or study error.

Type III – Lack of Field Confirmation or Follow-Up – wherein a theoretical forward problem model is established and presumed accurate, yet despite the ready availability of a field confirming basis of observation – no effort was ever placed into such observation, confirmation of measures and relationships, or observations were not undertaken to determine long term/unanticipated outcomes.

Type IV – Field of Significant Unknown – wherein established ideas of science are applied to develop a theoretical forward problem model – and because of the familiarity on the part of science with some of the elements of the solution proposed – the solution is imputed tacit verity despite being applied inside a new field for the first time, or inside a field which bears a significant unknown.

Garbage Skepticism – when the ‘skeptical’ reasoning employed is less rational or scientifically literate than the contention it is being employed against.

Geneticide – genocide by means of genetic exploitation or introduction of environmental influences which produce a sustained dysgenic effect on target populations. The exploitation of the specific genetics of a lineage of people to craft impacts seeking to ultimately and eventually eliminate them from a population. Employment of food, mandatory medicines, product and environmental toxins, social policy, laws, pesticides, and allele manipulation as a means to effect in specific genetic lineages – developmental problems, protein malnutrition, cognitive impairment, encephalopathy, birth defects, attention and comprehension difficulties, growth inhibition, mental disorders, drug and alcohol susceptibility, auto-immune disorders, endocrine collapse, disease, low rates of reproductivity, premature death – and otherwise disrupt the general health and welfare of a target population.

Ghost Variable/Ghost Assumption – a novel variable which is inserted into a model to artificially compensate for the model’s lapse in matching observed reality, or to ‘describe’ a putative effect which itself is not fully understood, has not been scientifically reduced, or has not been confirmed to actually exist. Despite an idea or input being implausible, inelegant, and non-observable in any direct fashion, a ghost variable solely exists because it is required in order to sustain the model which requires its existence. A type of circular reasoning. When used as a placeholder, such an assumption can bear scientific utility; however, a ghost variable or assumption should never thereafter be assumed as conclusive, based upon its model-utility alone. Eventually the idea behind the ghost variable itself, must be demonstrated as valid.

God Protects Fools and Drunkards Inversion – a condition inside a domain or system wherein the complexity of the system is so high, that few to no experts actually exist. In such a circumstance, those who deceive themselves with facile, apothegm-fueled, or simpleton understandings of the complex system are more dangerous than those who are completely ignorant of the system altogether. An inversion wherein the ignorant are wiser in their outcomes than are the learned. One example is metabolic pathways found in prokaryotic and eukaryotic cells (particularly as it pertains to obesity and dieting) – a discipline which is so complex that pretend experts abound, and the only actual inhabitants inside the study domain are researchers, hustlers, and the lucky.

Google Goggles – warped or blinded understandings of science or scientific consensus bred through reliance on web searches for one’s information and understanding. Vulnerability to web opinions and misinformation where every street doubter pretends to be an authority on science.

Green Eggs and Ham (Poll) Error – the combined Crate-Bradley Effect in polling error. Including sentiment of those who have never heard of the topic. Including responses from those who know nothing about the topic, but were instructed to throw the poll results. Finally, treating both of these groups as valid ‘disagree/agree’ sentiment signal data. The presence of excessively small numbers of ‘I don’t know’ responses in controversial poll results. There exists an ethical difference between an informed-yet-mistaken hunch and a circular-club-recitation claim to authority based upon a complete absence of exposure (ignorance) to a topic at all. In reality, the former is participating in the poll, the latter is not. The latter ends up constituting only a purely artificial agency-bias, which requires an oversampling or exclusion adjustment. One cannot capture a sentiment assay about the taste of green eggs and ham, among people who either don’t even know what green eggs and ham is, or have never even once tasted it because they were told it was bad.
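The dilution arithmetic can be sketched with made-up group sizes. All three group counts and sentiment rates below are hypothetical, chosen only to show how treating never-exposed and poll-throwing respondents as signal distorts the reported result:

```python
# Hypothetical poll composition (illustrative only): only the informed
# group carries real sentiment signal; the other two are artifacts.
informed    = 400   # have actually engaged the topic; 70% agree
never_heard = 500   # no exposure; coin-flip 'opinions' instead of 'I don't know'
throwers    = 100   # motivated/instructed to answer 'disagree'

agree = informed * 0.70 + never_heard * 0.50 + throwers * 0.0
total = informed + never_heard + throwers

print(f"Informed agreement:        {0.70:.0%}")           # prints 70%
print(f"Reported poll 'agreement': {agree / total:.0%}")  # prints 53%
```

The poll reports 53% against an informed sentiment of 70% – the gap is the artificial agency-bias the entry says must be adjusted out by oversampling or exclusion.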

Bradley Effect – the principle wherein a person being polled will, especially in the presence of trial heat or iteration-based polls, tend to answer a poll question with a response which they believe the polling organization or the prevailing social pressure, would suggest they should vote or which will not serve to identify them into the wrong camp on a given issue. The actual sentiment of the polled individual is therefore not actually captured.

Crate Effect – the impact of persons who purposely give the opposite response to what they really think, out of animosity towards the polling group or the entailed issue (especially if non-free press) and/or its perceived history of bias, and/or animosity towards the circus around elections or the elections themselves. This false left-leaning bias is generated most often inside groups who believe media outlets to be left-leaning and unfair.
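The combined Crate-Bradley distortion can be sketched with a toy poll simulation (all figures hypothetical; Python used purely for illustration): an informed minority genuinely favors the dish, while an uninformed majority — half of whom were coached that it is bad — is counted as valid ‘disagree’ signal.

```python
import random

random.seed(1)

# Hypothetical population: 4,000 respondents have actually tasted green
# eggs and ham (informed); among them, true sentiment runs 70% 'like'.
informed = [("like" if random.random() < 0.70 else "dislike") for _ in range(4000)]

# The remaining 6,000 have never tasted it. They should answer 'don't
# know', but half were told the dish is bad and answer 'dislike' instead.
uninformed = [("dislike" if random.random() < 0.5 else "don't know") for _ in range(6000)]

def like_share(responses):
    """Share of 'like' among responses scored as agree/disagree signal."""
    votes = [r for r in responses if r in ("like", "dislike")]
    return sum(r == "like" for r in votes) / len(votes)

print(f"informed only  : {like_share(informed):.0%} like")
print(f"poll as fielded: {like_share(informed + uninformed):.0%} like")
```

Counting the coached, never-exposed respondents as sentiment flips apparent approval from roughly 70% to roughly 40% — even though not one informed opinion changed.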

Gundeck/Gundecking – a form of methodical deescalation and Nelsonian inference. Meaning to slip away to the lower decks of a ship, or to lesser functions of professional conduct, in order to put on the appearance of working on an objective. However, in reality one is merely avoiding or falsifying the necessary work at hand.

  • Being a scientist when one needs to be an investigator.
  • Being a technician when one needs to be a scientist.

Whatever keeps one from discovering that which they do not want discovered, or not performing work on the salient question at hand.

Hawthorne Observer Effects – an expansion of thought based around the principle in which individuals modify an aspect of their behavior in response to their awareness of being observed or to a test-change in their environment. The simple act of observation changes both the state of mind of the observer as well as those being observed. These effects principally center on the state of mind change in the observer:

  1. That which is observed, changes
  2. That which is observed for the first time, appears exceptional
  3. That which is observed obsessively, instills angst
  4. That which bears risk, is observed less
  5. That which is observed to move more slowly, is perceived as the greater risk

Herculean Burden of Proof – placing a ‘burden of proof’ upon an opponent which is either arguing from ignorance (asking to prove absence), not relevant to science or not inside the relevant range of achievable scientific endeavor in the first place. Assigning a burden of proof which cannot possibly be provided/resolved by a human being inside our current state of technology or sophistication of thought/knowledge (such as ‘prove abiogenesis’ or ‘prove that only the material exists’). Asking someone to prove an absence proposition (such as ‘prove elves do not exist’).

Hyperepistemology – transactional pseudoscience in the employment of extreme, linear, diagnostic, inconsistent, truncated, excessively lab constrained or twisted forms of science in order to prevent the inclusion or consideration of undesired ideas, data, observations or evidence.

Hypocrisy of Plenitude – when one employs the principle of plenitude in explaining the reality of a person’s existence and self identity (you simply are experiencing one of an infinity of potentials), which is then negated by the appeal to authority of mandating severe constraint of potentials in explaining that same person’s ontological future or past (you have never lived before, nor shall you have any afterlife). Especially when one is claiming that this hypocrisy is somehow derived from ‘evidence’ or science.

Hypoepistemology –  existential pseudoscience in the relegation of disfavored subjects and observations into bucket pejorative categorizations in order to prevent such subjects’ inclusion or consideration in the body of active science.  Conversely, acceptance of an a priori favored idea, as constituting sound science, based simply on its attractiveness inside a set of social goals.

Hypothesis – a disciplined and structured incremental risk in inquiry, relying upon the co-developed necessity of mechanism and intelligence. A hypothesis necessarily features seven key elements which serve to distinguish it from non-science or pseudoscience.

The Seven Elements of Hypothesis

1.  Construct based upon necessity. A construct is a disciplined ‘spark’ (scintilla) of an idea, on the part of a researcher or type I, II or III sponsor, educated in the field in question and experienced in its field work. Once a certain amount of intelligence has been developed, as well as definition of causal mechanism which can eventually be tested (hopefully), then the construct becomes ‘necessary’ (i.e. passes Ockham’s Razor). See The Necessary Alternative.

2.  Wittgenstein definition and defined domain. A disciplined, exacting, consistent, conforming definition need be developed for both the domain of observation, as well as the underpinning terminology and concepts. See Wittgenstein Error.

3.  Parsimony. The resistance to expand explanatory plurality or descriptive complexity beyond what is absolutely necessary, combined with the wisdom to know when to do so. Conjecture along an incremental and critical path of syllogism. Avoidance of unnecessarily orphan questions, even if apparently incremental in the offing. See The Real Ockham’s Razor. Three character traits distinguish a hypothesis which has been adeptly posed inside parsimony.

a. Is incremental and critical path in its construct – the incremental conjecture should be a reasoned, single stack and critical path new construct. Constructs should follow prior art inside the hypothesis (not necessarily science as a whole), and seek an answer which serves to reduce the entropy of knowledge.

b. Methodically conserves risk in its conjecture – no question may be posed without risk. Risk is the essence of hypothesis. A hypothesis, once incremental in conjecture, should be developed along a critical path which minimizes risk in this conjecture by mechanism and/or intelligence, addressing each point of risk in increasing magnitude or stack magnitude.

c. Posed so as to minimize stakeholder risk – (i.e. precautionary principle) – a hypothesis should not be posed which suggests that a state of unknown regarding risk to impacted stakeholders is acceptable as a central aspect of its ongoing construct critical path. Such risk must be addressed first in critical path as a part of 3. a. above.

4.  Duty to Reduce, Address and Inform – a critical element and aspect of parsimony regarding a scientific hypothesis. The duty of such a hypothesis to expose and address in its syllogism all known prior art, in terms of both analytical intelligence obtained and direct study mechanisms and knowledge. If information associated with a study hypothesis is unknown, it should simply be mentioned in the study discussion. However, if countermanding information is known, or a key assumption of the hypothesis appears magical, the structure of the hypothesis itself must both inform of its presence and address its impact. See Methodical Deescalation and The Warning Signs of Stacked Provisional Knowledge. Unless a hypothesis offers up its magical assumption for direct testing, it is not truly a scientific hypothesis. Nor can its conjecture stand as knowledge.

5.  Intelligence. Data is denatured into information, and information is transmuted into intelligence. Inside decision theory and clandestine operation practices, intelligence is the first level of illuminating construct upon which one can make a decision. The data underpinning the intelligence should necessarily be probative and not simply reliable. Intelligence skills combine a healthy skepticism towards human agency, along with an ability to adeptly handle asymmetry, recognize probative data, assemble patterns, increase the reliability of incremental conjecture and pursue a sequitur, salient and risk mitigating pathway of syllogism. See The Role of Intelligence Inside Science.

6.  Mechanism. Every effect in the universe is subject to cause. Such cause may be mired in complexity or agency; nonetheless, reducing a scientific study into its components and then identifying underlying mechanisms of cause to effect – is the essence of science. A pathway from which cause yields effect, which can be quantified, measured and evaluated (many times by controlled test) – is called mechanism. See Reduction: A Bias for Understanding.

7.  Exposure to Accountability. This is not peer review. During the development phase, a period of time certainly must exist in which a hypothesis is held proprietary so that it can mature – and indeed fake skeptics seek to intervene before a hypothesis can mature, eliminating it via ‘Occam’s Razor’ (sic) so that it cannot be researched. Nonetheless, a hypothesis must be crafted such that its elements 1 – 6 above can be held to the light of accountability, by 1. skepticism (so as to filter out sciebam and fake method) which seeks to improve the strength of the hypothesis (an ‘ally’ process, not peer review), and 2. stakeholders who are impacted or exposed to its risk. Hypothesis which imparts stakeholder risk, yet is held inside proprietary cathedrals of authority – is not science, rather oppression by court definition.

ideam tutela – concealed agency. A questionable idea or religious belief which is surreptitiously promoted through an inverse negation. A position which is concealed by an arguer because of their inability to defend it, yet is protected at all costs without its mention – often through attacking without sound basis, every other form of opposing idea.

Ignorance – is not a ‘lack of knowledge’ but is rather a verb, meaning ‘a cultivated quiescence before an idea or group which has become more important to protect than science, human rights, well-being, and life itself.’ The belief that one has personally attained a state of immunity to incorrect information. The action of blinding one’s self to an eschewed reality through a satiating and insulating culture and lexicon.

Ignorance-God of the Gaps – when we obfuscate a mystery in science by means of ignoring it as taboo – then Ignorance too becomes a ‘God of the Gaps.’

ignoro eventum – institutionalized pseudoscience wherein a group ignores or fails to conduct follow-up study after the execution of a risk bearing decision. The instance wherein a group declares the science behind a planned action which bears a risk relationship, dependency or precautionary principle, to be settled, in advance of this decision/action being taken. Further then failing to conduct any impact study or meta-analysis to confirm their presupposition as correct. This is not simply pseudoscience, rather it is a criminal action in many circumstances.

Illusion of Truth Effect – that people are more likely to identify as true statements those they have previously heard (even if they cannot consciously remember having heard them), regardless of the actual validity of the statement.

Imposterlösung Mechanism – the cheater’s answer. A disproved, incoherent or ridiculous contention, or one which fails the tests to qualify as a real hypothesis, which is assumed as a potential hypothesis anyway simply because it sounds good or is packaged for public consumption. These alternatives pass muster with the general public, but are easily falsified after mere minutes of real research. Employing the trick of pretending that an argument domain which does not bear coherency nor soundness – somehow (in violation of science and logic) falsely merits assignment as a ‘hypothesis’. Despite this, most people hold them in mind simply because of their repetition. This fake hypothesis circumstance is common inside an argument which is unduly influenced by agency. They are often padded into skeptical analyses, to feign an attempt at appearing to be comprehensive, balanced, or ‘considering all the alternatives’.

Impulse Inference (Religious Doctrine) – this is a twisted and sick-minded form of metaphysical selection or faith. The only practice set which operates under a masquerade in this set of inference species and genres, is the practice of religious assumption, doctrine and dogma. This of course includes the habits of those who practice social skepticism. These religions will attempt to pass their doctrines as species of logical inference – through a process known as apologetics. This is a type of pathology wherein the participant very desperately wants to seek validation for a taught or personally adopted set of metaphysical conclusions. This is not truly an actual form of inference.

In Extremis – a condition of rising or extreme danger wherein a decision which is dependent upon an outcome of scientific study, must be made well in advance of any reasonable opportunity for peer review and/or consensus to be developed. This is one of the reasons why science does not dictate governance, but rather may only advise it. Science must ever operate inside the public trust, especially if that trust requires expertise from multiple disciplines.

in medias res – a process, research initiative or argument which begins its discourse in the middle of a series of sequential questions as opposed to starting with foundational ones and/or outlining a specific objective. This does not serve to guarantee an errant outcome, however often can waste enormous amounts of time and attention trying to resolve questions which are orphan, ill timed or unsound given the current knowledge base. Can also be used as a method of deception. See also non rectum agitur fallacy.

Inflection Point Exploitation (The Cheat) – a flaw, exploit or vulnerability inside a business vertical or white/grey market which allows that market to be converted into a mechanism exhibiting cartel, cabal or mafia-like behavior. Rather than the market becoming robust to concavity and exposed to convexity – instead, this type of consolidation-of-control market becomes exposed to excessive earnings extraction and sequestration of capital/information on the part of its cronies. Often there is one raison d’être (reason for existence) or mechanism of control which allows its operating cronies to enact the entailed cheat enabling its existence. This single mechanism will serve to convert a price taking market into a price making market and allow the cronies therein to establish behavior which serves to accrete wealth/information/influence into a few hands, and exclude erstwhile market competition from being able to function. Three flavors of entity result from such inflection point exploitation:

Cartel – an entity run by cronies which enforces closed door price-making inside an entire economic white market. Functions through exploitation of buyers (monopoly) and/or sellers (monopsony) through manipulation of inflection points. Points where sensitivity is greatest, and as early into the value chain as possible, and finally inside a focal region where attentions are lacking. Its actions are codified as virtuous.

Cabal – an entity run by a club which enforces closed door price-making inside an information or influence market. Functions through exploitation of consumers and/or researchers through manipulation of the philosophy which underlies knowledge development (skepticism) or the praxis of the overall market itself. Points where they can manipulate the outcomes of information and influence, through tampering with a critical inflection point early in its development methodology. Its actions are secretive, or if visible, are externally promoted through media as virtue or for sake of intimidation.

Mafia – an entity run by cronies which enforces closed door price-making inside a business activity, region or sub-vertical. Functions through exploitation of its customers and under the table cheating in order to eliminate all competition, manipulate the success of its members and the flow of grey market money to its advantage. Points where sensitivity is greatest, and where accountability is low or subjective. Its actions are held confidential under threat of severe penalty against its organization participants. It promotes itself through intimidation, exclusive alliance and legislative power.

ingens vanitatum – knowing a great deal of irrelevance. Knowledge of every facet of a subject and all the latest information therein, which bears irony however in that this supervacuous set of knowledge stands as all that composes the science, or all that is possessed by the person making the claim to knowledge. A useless set of information which serves only to displace any relevance of the actual argument, principle or question entailed.

Intent (Burden of Proof) – a novel constraint which arrives into a chaotic/complex process or a domain of high unknown, which does not originate from the natural background set of constraints, and further serves to produce a consistent pattern of ergodicity – when no feedback connection between outcome and constraint is possible. An intervening constraint in which every reasonable potential cause aside from intelligent derivation has been reduced, even if such constraint is accompanied or concealed by other peer stochastic and non-intent influences. When one makes or implies a claim to lack of intent, one has made the first scientific claim and cannot therefore be exempted from the burden of proof regarding that claim, nor reside inside the luxury of a false null hypothesis (Einfach Mechanism).

Inverse Negation Fallacy (of Presumption) – the asymmetric strategy of promoting a desired idea through cancellation of all its antithetical concepts and competing ideas. A method which seeks to undermine and censor any communication, research, or construct which runs counter to a favored idea, often through framing such activity as ‘pseudoscience’ or ‘conspiracy theory’. A surreptitious effort to promote a favored idea without acknowledging it, nor appearing to be in advocacy for it, nor undertaking the risk of exposing that favored idea to the scientific method or critical scrutiny. This because the implicitly favored model itself, although promoted as TruthTM, often is unethical or bears very little credibility when examined stand-alone.

Inversion Effect – an opposite and compensating signal effect inside a research study which has filtered out sample data through invalid study design or an exclusion bias towards a specific observation. By offsetting a subset of the population being studied which bears a condition that is not desired for examination or detection, a study can introduce an opposite or contrapositive effect in its analytical results.  Vaccines not only do not cause autism, but two major studies showed they actually cure autism. Persons vaccinated for Covid, die at 35% the rate of unvaccinated persons in terms of all non-Covid deaths (natural and non-natural). These are outcomes which are impossible, however show up statistically because of an invalid exclusion bias which served to produce the effect.
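A minimal sketch of how an exclusion bias manufactures such an impossible protective effect (a hypothetical model, not a reconstruction of any actual study): a treatment with zero effect on mortality appears to cut the death rate roughly threefold, simply because frail treated persons are filtered out of the study sample.

```python
import random

random.seed(7)

N = 100_000
people = []
for _ in range(N):
    frail = random.random() < 0.10    # 10% are frail (elevated death risk)
    treated = random.random() < 0.50  # treatment assigned at random
    # The treatment has NO effect on death in this model.
    p_death = 0.200 if frail else 0.010
    died = random.random() < p_death
    people.append((frail, treated, died))

# Invalid exclusion bias: frail *treated* persons are filtered out of the
# sample (e.g. dropped from the registry or recorded as untreated).
sample = [p for p in people if not (p[0] and p[1])]

def death_rate(group):
    return sum(died for _, _, died in group) / len(group)

treated_rate = death_rate([p for p in sample if p[1]])
untreated_rate = death_rate([p for p in sample if not p[1]])
print(f"treated  : {treated_rate:.3%}")
print(f"untreated: {untreated_rate:.3%}")
```

The surviving ‘treated’ group contains only the non-frail, so its all-cause death rate lands near one third of the untreated group’s — a statistical artifact of study design, not an effect of the treatment.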

Is-Does Fallacy – a derivative of the philosophical principle that one does not have to framework what something is, in order to study what it does. The error of attempting to conform science and epistemology to the notion that one must a priori understand or hold a context as to what something is, before one can study what ‘it’ does. The error of intolerance resides in the assumption that in order to study a phenomenon, we must assume that its cause is ‘real’ first. When electromagnetic theory was posed, they conceived of the context of a ‘field’ as real (= IS), before conducting EMF experimentation. However not all science can be introduced in this manner. In a true Does/Is, a researcher does not conceive of the context of the cause as real in advance of study. This frees up scientists to study challenging phenomena without having to declare it or its context to be ‘real’ first. Is/Does is a problem of induction – as it forces us into a highly constrained form of science called sciebam, which seeks to state a hypothesis as the first step of the scientific method. A corollary of this idea involves the condition when the ‘does’ involves some sort of prejudice or will on the part of the subject being studied. Does becomes more difficult to study in such instance, however this does not remove from us the responsibility to conduct such study.

Jackboot Consensus – a version of pluralistic ignorance where social justice activism, fake celebrity skepticism or corporate push activism works to threaten the careers or publication viability of concerned scientists – thereby precipitating a form of false consensus at the heel of a boot. Most scientists quietly dissent but do not offer their opinion, considering it to be career endangering and/or in the minority.

Kabā Alternative – when a more highly embarrassing or deadly event looms, create a less embarrassing or deadly but more viral event, in advance – in order to obscure the presence of the former.

Kilkenny’s Law – final claims to expertise and evidence may be tendered inside established trade, transactional, technical and diagnostic disciplines. Therefore:

I.  A conclusive claim to evidence inside a subject bearing a sufficiently unknown or risk-bearing horizon, is indistinguishable from an appeal to authority, and

II.  Corber’s Burden: A sufficiently large or comprehensive set of claims to conclusive evidence in denial, is indistinguishable from an appeal to authority.

III.  If you have brought me evidence based claims in the past which turned out to be premature and harmful/wrong, I will refuse to recognize your successive claims to be evidence based.

King of the Shill – resting idly on outdated, only partly predictive studies while demanding that everyone else with any alternative idea bring iron-clad proof.

Knapp’s Axiom – a quip by Las Vegas based investigative reporter for KLAS-TV, George Knapp, “The purpose of science is to investigate the unexplained, not explain the uninvestigated.” Citing the fallacy of an appeal to ignorance especially as it pertains to the refusal to develop plurality around challenging and robust ‘paranormal’ observation sets.

Law of Static Privation – the test of knowledge cites that for knowledge to be confirmed as true, it should be useful in underpinning or predicting further confirmed knowledge or in the process of alleviating suffering. The law of static privation therefore treats static ¡facts! which are held as authority by groups of privation, who do not apply the knowledge to better our understanding or alleviate suffering – to then be subordinated to best practices which have been broadly confirmed by victims or outside stakeholders. It is a “Use it, or lose it” challenge to a body claiming to represent scientific authority.

Lemming Inertia/Karen Train – the propensity for a syndicate, club, or advocacy group to be deluded by ad populum (appeal to club popularity of an idea) and ad virtutem (appeal to virtue of self and their ideas) in support of a notion – to the extent that even if they are found wrong, the movement can no longer be stopped. The belief that because one is acting as proxy in the name of some oppressed party, therefore they are now qualified to rule over others, silence speech, scream, attack, and harm in the name of that virtue costume – without any further circumspection or accountability from that point onward.

Linear Affirmation Bias – a primarily inductive methodology of deriving inference in which the researcher starts in advance with a premature question or assumed answer they are looking for. Thereafter, observations are made. Affirmation is a process which involves only positive confirmations of an a priori assumption or goal. Accordingly, under this method of deriving inference, observations are classified into three buckets:

1. Affirming

2. In need of reinterpretation

3. Dismissed because they are not ‘simple’ (conforming to the affirmation underway).

Under this method, the model is complicated by reinterpretations. Failing the test that a model should be elegant, not exclusively simple. By means of this method, necessity under Ockham’s Razor is assumed in advance and all observations thereafter are merely reconfigured to fit the assumed model. At the end of this process, the idea which was posed in the form of a question, or sought at the very start, is affirmed as valid. Most often this idea thereafter is held as an Omega Hypothesis (more important to protect than the integrity of science itself).
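The three-bucket method above can be parodied in a few lines of code (a hypothetical example): a fair coin, evaluated under linear affirmation toward the a priori model ‘the coin is biased toward heads’, is always ‘confirmed’, because contradicting flips are never scored.

```python
import random

random.seed(3)

# A priori (assumed) model: 'the coin is biased toward heads'.
# The coin is actually fair.
observations = [random.choice(["heads", "tails"]) for _ in range(100)]

affirming, reinterpreted, dismissed = [], [], []
for obs in observations:
    if obs == "heads":
        affirming.append(obs)      # bucket 1: affirming
    elif random.random() < 0.5:
        reinterpreted.append(obs)  # bucket 2: 'that tails was a mis-flip'
    else:
        dismissed.append(obs)      # bucket 3: 'not simple' - dismissed

# Only bucket 1 is ever scored as evidence, so support is 100% by design.
scored = affirming
print(f"affirming={len(affirming)} reinterpreted={len(reinterpreted)} "
      f"dismissed={len(dismissed)}")
print(f"{len(affirming)}/{len(scored)} scored observations affirm the model")
```

No matter what the coin does, every observation lands in a bucket that either supports the model or is excused from counting against it — the assumed answer emerges ‘valid’ at the end of the process.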

Lindy-Ignorance Vortex – do those who enforce or imply a conforming idea or view seem to possess a deep emotional investment in ensuring that no broach of subject is allowed regarding any thoughts or research around an opposing idea, or specific ideas or avenues of research they disfavor? Do they easily and habitually imply that their favored conclusions are the prevailing opinion of scientists? Is there an urgency to reach or sustain this conclusion by means of short-cut words like ‘evidence’ and ‘fact’? And if such disfavored ideas are considered for research or are broached, are extreme disdain and social and media derision then called for?

Verdrängung Mechanism – the level of control and idea displacement achieved through skillful employment of the duality between pluralistic ignorance and the Lindy Effect. The longer a control-minded group can sustain an Omega Hypothesis perception by means of the tactics and power protocols of proactive pluralistic ignorance, the greater future acceptability and lifespan that idea will possess. As well, the harder it will be to dethrone as an accepted norm or perception of a ‘proved’ null hypothesis.

Maleduction – an argument must be reduced before it can be approached by induction or deduction – failure to reduce an argument or verify that the next appropriate question under the scientific method is being indeed addressed, is a sign of pseudoscience at play. The non rectum agitur fallacy of attempting to proceed under or derive conclusions from science when the question being addressed is agenda driven, non-reductive or misleading in its formulation or sequencing.

Manager’s Error – from Nassim Taleb’s tome Fooled by Randomness (2001). The principle of forcing an argument into an artificial binary or bifurcated outcome set, examining only that which is a priori deemed to be the more probable or simple outcome, and not that choice which can serve to produce the largest net effect or ‘payoff’. Only researching the most likely, framework compliant or simple alternative, will only serve to confirm what we already know, and bears a much lower payoff in terms of information which might be garnered through a black swan, less likely or ‘complex’ alternative turning out to bear any form of credence or final veracity.​

Mazumah Shift – the inflection point wherein more grant money can be applied for or obtained for work based on a new scientific approach/theory than can be readily obtained for work based upon an older theory. A twist of Kuhn’s Theory of Scientific Revolution or Planck’s Paradigm Shift.

McCulloch’s Axiom – it is often possible to extrapolate the truth solely from what is banned. Named for physicist Mike McCulloch, key proponent of QI theory alternatives to Dark Matter.

McLuhan’s Axiom – “Only small secrets need to be protected. The large ones are kept secret by the public’s incredulity.” A quip attributed to philosopher on media theory, Herbert Marshall McLuhan.

Medical Proxy Abuse – two species of abuse in terms of human rights, child, or caretaker, in which a caregiver is pathologically negligent in their duties towards their patient or care recipient.

Munchausen Syndrome by Proxy (MSBP) – a form of child or care-recipient abuse: the intentional fabrication or physical production of illness in another, usually of children by their mothers, in order to assume the role of being sick by proxy through the one in their care.

Sarscov Syndrome by Proxy (SSBP) – a form of pathological human rights abuse. A form of hero syndrome or megalomania expressed through establishing exclusivity or intentional withholding of access to a physiological necessity or treatment. This is conducted with the intent of only administering such a necessity, cure, or treatment of related symptoms under the most dire, sensational, attention-garnering, or expensive conditions.

Methodical Deescalation – employing abductive inference in lieu of inductive inference when inductive inference could have, and under the scientific method should have, been employed. In similar fashion employing inductive inference in lieu of deductive inference when deductive inference could have, and under the scientific method should have, been employed.

All things being equal, the latter is superior to the midmost, which is superior to the former:

  • Conformance of panduction (while a type/mode of inference, this is not actually a type of reasoning)
  • Convergence of abductions
  • Consilience of inductions
  • Consensus of deductions

One of the hallmarks of skepticism is grasping the distinction between a ‘consilience of inductions’ and a ‘convergence of deductions’. All things being equal, a convergence of deductions is superior to a consilience of inductions. When science employs a consilience of inductions, when a convergence of deductions was available, yet was not pursued – then we have an ethical dilemma called Methodical Deescalation.

Methodical Doubt – doubt employed as a skulptur mechanism, to slice away disliked observations until one is left with the data set one favored before coming to the argument. It is the questionable method of denying that something exists or is true simply because it defies a certain a priori stacked provisional knowledge. This is nothing but a belief expressed in the negative, packaged in such a fashion as to exploit the knowledge that claims to denial are afforded immediate acceptance over claims to the affirmative. It is a religious game of manipulating the process of knowledge development into a whipsaw effect supporting a given conclusion set.

Mission Directed Blindness – when one believes from being told, that they serve a greater cause or fight an evil enemy, or that some necessary actions must be taken to avoid a specific disaster. Once assumed, this renders the participant unable to handle evidence adeptly under Ockham’s Razor.

Mobsensus – the inferential will of media, club, mafia, cabal or cartel – in which a specific conclusion is enforced upon the rest of society, by means of threat, violence, social excoriation or professional penalty; further then the boasting of or posing such insistence as ‘scientific consensus’.

Must-Explain Observation – (in contrast with ‘it would be nice if it could explain’) a specific observation inside a domain of science which cannot be brushed off as anecdote and is critical path to the argument being evaluated. In this circumstance a scientific argument or theory must explain the observation, or have its credibility be eroded. Most theoretical explanations of observed phenomena are linear inductive in their inferential strength. Must-explain observations provide a deductive lever for strong theories to emerge as superior from the field of linear inductive theories. Never accept a theory which can only explain convenient observations. You have no claim to science, if you cannot explain even a basic must-explain observation.

Muta-Analysis – the most unreliable of scientific studies. Often a badly developed meta-analysis, which cannot be easily replicated or peer reviewed, contains a high degree of unacknowledged risk, or was executed based upon a poor study plan. An appeal to authority based upon faulty statistical knowledge development processes. Processes which alter or do not employ full scientific methodology, in favor of a premature claim to consensus or rigor implied by the popularity of a statistical study type. A method which does not directly observe, nor directly test, rather employs statistical procedures to answer a faulty, inclusion-criteria-selected, agenda-bearing, or peripherally addressed scientific question.

Myth of Certainty/Myth of Proof – based upon the wisdom elicited by the Leon Wieseltier quote ‘No great deed private or public has ever been undertaken in a bliss of certainty’.

Myth of the Excited Scientists – the mythical, dis-informative and/or Pollyanna contention on the part of fake skeptics wherein they will claim that if any evidence whatsoever for a disliked subject were actually found, then scientists surely would be excited about it and then dedicate their lives to study of the subject from then on.

The Necessary Alternative – an alternative which has become necessary for study under Ockham’s Razor because it is one of a finite, constrained and very small set of alternative ideas intrinsically available to provide explanatory causality or criticality inside a domain of sufficient unknown. This alternative does not necessarily require inductive development, nor proof and can still serve as a placeholder construct, even under a condition of pseudo-theory. In order to mandate its introduction, all that is necessary is a reduction pathway in which mechanism can be developed as a core facet of a viable and testable hypothesis based upon its tenets.

Nelsonian Knowledge – a precise and exhaustive knowledge, about that which one claims is not worth examining. No expertise is so profound in its depth as that expertise prerequisite in establishing what not to know. Such Nelsonian knowledge takes three forms:

1. a meticulous attentiveness to, and absence of, that which one should ‘not know’,

2. an inferential method of avoiding such knowledge, and

3. that misleading knowledge or activity which is used as a substitute in place of actual knowledge (organic untruth or disinformation).

The former (#1) is taken to actually be known on the part of a poseur. It is dishonest for a man deliberately to shut his eyes to principles/intelligence which he would prefer not to know. If he does so, he is taken to have actual knowledge of the facts to which he shut his eyes. Such knowledge has been described as ‘Nelsonian knowledge’, meaning knowledge which is attributed to a person as a consequence of his ‘willful blindness’ or (as American legal analysts describe it) ‘contrived ignorance’.

Nero Taunting – when one publicly attacks a consumer or public need to seek out a solution, regarding a critical matter in their lives which science has not adequately addressed or researched. Usually indicated by a lack of proposed solutions and a high degree of disdainful or indignant media clamor over a penultimate set fallacy – the hyperbole over the footprint of science and its ability/history of having addressed such need.

Newton’s Flameout – one who thinks something can be settled merely by an experiment probably does not understand the question in the first place.

Nickell Plating – employing accoutrements and affectations of investigation work (field trips, cameras, notebooks, sample bags, etc.), along with an implicit appeal to authority as a skeptic (appeal to skepticism), in an attempt to sell one’s self as conducting science. A social celebrity pretense of investigation and established authority through a track record of case studies, wherein adornment of lab coats, academic thesis books, sciencey-looking instruments and the pretense of visiting places and taking notes/pictures, etc. is portrayed by a posing pseudo-skeptic. In reality the nickell plater is often compensated to ‘investigate’ and socially promote one biased explanation; dismissing the sponsored hypothesis from being considered by actual science research. This is an active part of an embargo process, and was a technique which replaced debunking after it fell from public favor.

nihil admirari – a tenet of ethical skepticism. The understanding that literally everything is astonishing in nature, coupled with therefore a refusal to be anchoring bias conditioned in terms of one’s consideration of our natural realm. To not be fooled by the mundane and regular aspects of life into thinking that only the mundane and regular are therefore worthy of study pursuit. Latin for, ‘to be surprised by nothing’, is not simply a tenet of Stoicism. It is also a mandate in understanding that what resides yet undiscovered, will inevitably be refused/embargoed by fake skeptics for being ‘too incredible to consider’.

Nihilism (or Sol-Nihilism) – is a philosophical doctrine that suggests the negation of one or more of the reputedly meaningful or non-material aspects of life. Socially enforced metaphysical or pseudo scientific naturalism. The religious belief that only such physical life on Earth is relevant, and that the conscious, spiritual, values or intent sets all reduce solely to the material. The substitution ontology which took the place of Abrahamic Religion in Western academia. The cult and religious doctrine enforcing absolute knowledge as to those things which are deemed ‘natural’; moreover dictating that nothing exists outside the materials, energies, life forms, features and principles comprised inside an a priori defined and professionally compulsory domain of understanding. A religious presumption that only the physical is real, and that the mental or spiritual can be reduced solely to the physical. A presumption that all observations of phenomena related to consciousness stem from solely a neural configuration of a single biological source. This extraordinary array of claims is justified through specious, scant predictive and selective application of the experimental method; attributing its false empirical basis to a pretense standard of evidence, measurability and repeatability. Rather, Nihilism is an unsubstantiated set of pseudo-scientific claims, misconstrued as atheism and subtly conflated with and pork-barreled inside actual science. It is employed as an instrument to squelch freedom of speech, squelch knowledge through vigilante bullying in the name of skepticism, qualify entrants into scientific and academic professions, screen topics under an embargo policy regarding access to science, control and direct institutions, establish social power; and in similar fashion to its Abrahamic religious precedent, leverage the resulting pervasive ignorance into a position of absolute subjugation of mankind.

Ninety Seven Percent (97%) Pretense – when an imperious claim to science cannot be backed up by evidence and research, a posing pseudo-skeptic will resort to quoting the “97% of scientists concur with the idea that ___________” line. The figure needs to be above 95% in order to imply appeal to authority, yet cannot be 99 or 98%, as this pushes the bounds of a 3% error rate (commonly employed in statistics) in terms of credibility. 95% itself sounds like it is made up, and 96% just does not carry the imperious ring which 97% does. A surefire way to tell if a fake skeptic is fabricating a statistic or quoting one they do not in reality understand.

No True Scotsman Pleading – this fallacy modifies the subject of an assertion to exclude the specific case or others like it offered by an opponent, completely ad hoc and without reference to any specific objective rule allowing for the exclusion.

Nocebo – something which is inert or not harmful is regarded by its victim to be harmful, and therefore causes harm.

Nocebo Appeal – a nocebo claim which is made in absence of any data, observation or evidence.

non rectum agitur Fallacy – a purposeful abrogation of the scientific method through corrupted method sequence or the framing and asking of the wrong, ill prepared, unit biased or invalid question, conducted as a pretense of executing the scientific method on the part of a biased participant. Applying a step of the scientific method, out of order – so as to artificially force a conclusion, such as providing ‘peer review’ on sponsored constructs and observations, rather than studies and claims, in an effort to kill research into those constructs and observations.

non sequitur evidentia – the false claim that scientific studies have proven or indicated a proponent’s claim to knowledge, when in fact such studies have addressed an equivocally different question or a completely different proof altogether.

Novella Shuffle – the sleight of hand mis-definition of protocols of the scientific method or equivocation in relating its principles or the process of peer review, in such a way as to deceive the media and general public into incorrectly understanding a disdained topic or observation or accepting a pseudo scientific approach as constituting actual science.

Observation vs Claim Blurring – the false practice of calling an observation or data set a ‘claim’ on the observers’ part. This is done in an effort to subjugate such observations into the category of constituting scientific claims which therefore must be now ‘proved’ or dismissed (the real goal: see Transactional Occam’s Razor Fallacy). In fact an observation is simply that, a piece of evidence or a cataloged fact. Its false dismissal under the pretense of being deemed a ‘claim’ is a practice of deception and pseudoscience.

Occam’s Razor – all things being equal, that which is easy for most to understand, and as well conforms with an a priori stack of easy-to-understands, along with what I believe most scientists think, tends to obviate the need for any scientific investigation. A false logical construct invented by SSkepticism to replace and change the efficacy of Ockham’s Razor, the latter employed as a viable principle in scientific logic. Occam’s Razor was a twist off the older Ockham’s Razor, which was slight and almost undetectable, but can be used to reverse the applicability of the more valid thought discipline inside of Ockham’s Razor. “All things being equal, the simplest explanation tends to be the correct one” is a logical fallacy; constituting a completely different and antithetical approach than that of Ockham’s Razor. Occam’s Razor can only result in conformance based explanations, regardless of their scientific validity.

Occam’s Razor Fallacy – abuse of Ockham’s Razor (and misspelling) in order to enact a process of sciencey-looking ignorance and to impose a favored idea. All things being equal, that which is easy for most to understand, and as well conforms with an a priori stack of easy-to-understands, along with what I believe most scientists think, tends to obviate the need for any scientific investigation. Can exist in four forms: transactional, existential, observational and utility blindness.

Transactional Occam’s Razor Fallacy (Appeal to Ignorance) – the false contention that a challenging construct, observation or paradigm must immediately be ‘explained.’ Sidestepping of the data aggregation, question development, intelligence and testing/replication steps of the scientific method and forcing a skip right to its artificially conclusive end (final peer review by ‘Occam’s Razor’).

Existential Occam’s Razor Fallacy (Appeal to Authority) – the false contention that the simplest or most probable explanation tends to be the scientifically correct one. Suffers from the weakness that myriad and complex underpinning assumptions, based upon scant predictive/suggestive study, provisional knowledge or Popper insufficient science, result in the condition of tendering the appearance of ‘simplicity.’

Observational Occam’s Razor Fallacy (Exclusion Bias) – through insisting that observations and data be falsely addressed as ‘claims’ needing immediate explanation, and through rejecting such a ‘claim’ (observation) based upon the idea that it introduces plurality (it is not simple), one effectively ensures that no observations will ever be recognized which serve to frame and reduce a competing alternative.  One will in effect perpetually prove only what they have assumed as true, regardless of the idea’s inherent risk. No competing idea can ever be formulated because outlier data and observations are continuously discarded immediately, one at a time by means of being deemed ‘extraordinary claims’.

Utility Blindness – when simplicity or parsimony are incorrectly applied as excuse to resist the development of a new scientific explanatory model, data or challenging observation set, when indeed the participant refuses to consider or examine the explanatory utility of any similar new model under consideration.

Facile – appearing neat and comprehensive only by ignoring the true complexities of an issue; superficial. Easily earned, arrived at or won – derived without the requisite rigor or effort. Something easy to understand, which is compatible with a predicate or associated stack of also easy-to-understands.

Ockham’s Inversion – the condition when the ‘rational or simple explanation’ requires so many risky, stacked or outlandish assumptions in order to make it viable, that it has become even more outlandish than the complex explanation it was originally posed against and was supposed to surpass in likelihood. Similarly, a condition wherein the proposed ‘more likely or simple’ alternative is just as outlandish in reality as is the originally considered one.

Omega Hypothesis (HΩ) – the argument which is foisted to end all argument, period. A conclusion promoted under such an insistent guise of virtue or importance, that protecting it has become imperative over even the integrity of science itself. An invalid null hypothesis or a preferred idea inside a social epistemology. A hypothesis which is defined to end deliberation without due scientific rigor, alternative study consensus or is afforded unmerited protection or assignment as the null. The surreptitiously held and promoted idea or the hypothesis protected by an Inverse Negation Fallacy. Often one which is promoted as true by default, with the knowledge in mind that falsification will be very hard or next to impossible to achieve.

1.  The (Wonka) Golden Ticket – Have we ever really tested the predictive strength of this idea standalone, or evaluated its antithetical ideas for falsification? Does an argument proponent constantly insist on a ‘burden of proof’ upon any contrasting idea, a burden that they never attained for their argument in the first place? An answer they fallaciously imply is the scientific null hypothesis; ‘true’ until proved otherwise?

Einfach Mechanism – an idea which is not yet mature under the tests of valid hypothesis, yet is installed as the null hypothesis or best explanation regardless. An explanation, theory or idea which sounds scientific, yet resolves a contention through bypassing the scientific method, then moreover is installed as truth thereafter solely by means of pluralistic ignorance around the idea itself. Pseudo-theory which is not fully tested at its inception, nor is ever held to account thereafter. An idea which is not vetted by the rigor of falsification, predictive consilience nor mathematical derivation, rather is simply considered such a strong, or Occam’s Razor (sic) stemming-from-simplicity idea that the issue is closed as finished science or philosophy from its proposition and acceptance onward. A pseudo-theory or false hypothesis which is granted status as the default null hypothesis or as posing the ‘best explanation’, without having to pass the rigors with which its competing alternatives are burdened. The Einfach mechanism is often accompanied by social rejection of competing and necessary alternative hypotheses, which are forbidden study. Moreover, the Einfach hypothesis must be regarded by the scientific community as ‘true’ until proved otherwise. An einfach mechanism may or may not be existentially true.

2.  Cheater’s Hypothesis – Does the hypothesis or argument couch a number of imprecise terms or predicate concepts? Is it mentioned often by journalists or other people wishing to appear impartial and comprehensive? Is the argument easily falsified through a few minutes of research, yet seems to be mentioned in every subject setting anyway?

Imposterlösung Mechanism – the cheater’s answer. A disproved, incoherent or ridiculous contention, or one which fails the tests to qualify as a real hypothesis, which is assumed as a potential hypothesis anyway simply because it sounds good or is packaged for public consumption. These alternatives pass muster with the general public, but are easily falsified after mere minutes of real research. Employing the trick of pretending that an argument domain which does not bear coherency nor soundness – somehow (in violation of science and logic) falsely merits assignment as a ‘hypothesis’. Despite this, most people hold them in mind simply because of their repetition. This fake hypothesis circumstance is common inside an argument which is unduly influenced by agency. They are often padded into skeptical analyses, to feign an attempt at appearing to be comprehensive, balanced, or ‘considering all the alternatives’.

Ad hoc/Pseudo-Theory – can’t be fully falsified nor studied, and can probably never be addressed or can be proposed in almost any circumstance of mystery. They fail in regard to the six tests of what constitutes a real hypothesis. Yet they persist anyway. These ideas will be thrown out for decades. They can always be thrown out. They will always be thrown out.

3. Omega Hypothesis (HΩ) – Is the idea so important or virtuous, that it now stands more important than the methods of science, or science itself? Does the idea leave a trail of dead competent professional bodies behind it?

Höchste Mechanism – when a position or practice, purported to be of scientific basis, is elevated to such importance or virtue that removing the rights of professionals and citizens to dissent, speak, organize or disagree (among other rights) is justified in order to protect the position or the practice inside society.

Constructive Ignorance (Lemming Weisheit or Lemming Doctrine) – a process related to the Lindy Effect and pluralistic ignorance, wherein discipline researchers are rewarded for being productive rather than right, for building ever upward instead of checking the foundations of their research, for promoting doctrine rather than challenging it. These incentives allow weak confirming studies to be published and untested ideas to proliferate as truth. And once enough critical mass has been achieved, they create a collective perception of strength or consensus.

4.  Embargo Hypothesis (Hξ) – was the science terminated years ago, in the midst of large-impact questions of a critical nature which still remain unanswered? Is such research now considered ‘anti-science’ or ‘pseudoscience’?

Entscheiden Mechanism – the pseudoscientific or tyrannical approach of, when faced with epistemology which is heading in an undesired direction, artificially declaring under a condition of praedicate evidentia, the science as ‘settled’.

Poison Pill Hypothesis – the instance wherein sskeptics or agency work hard to promote lob & slam condemnation of particular ideas. A construct obsession target used to distract or attract attack-minded skeptics into a contrathetic impasse or argument. The reason this is done is not the confusion or clarity it provides, rather the disincentive which patrolling skeptics place on the shoulders of the genuine skilled researcher. These forbidden alternatives (often ‘paranormal’ or ‘pseudoscience’ or ‘conspiracy theory’ buckets) may be ridiculous or indeed ad hoc themselves – but the reason they are raised is to act as a warning to talented researchers that ‘you might be tagged as supporting one of these crazy ideas’ if you step out of line and do not visibly support the Omega Hypothesis. A great example is the skeptic community tagging of anyone who considers the idea that the Khufu pyramid at Giza might have not been built by King Khufu in 2450 BCE, as therefore now supporting conspiracy theories or aliens as the builders – moreover, as being racist against Arabs, who now are the genetic group which occupies modern Egypt.

5.  Evidence Sculpting – has more evidence been culled from the field of consideration for this idea, than has been retained? Has the evidence been sculpted to fit the idea, rather than the converse?

Skulptur Mechanism – the pseudoscientific method of treating evidence as a work of sculpture. Methodical inverse negation techniques employed to dismiss data, block research, obfuscate science and constrain ideas such that what remains is the conclusion one sought in the first place. A common tactic of those who boast of all their thoughts being ‘evidence based’. The tendency to view a logical razor as a device which is employed to ‘slice off’ unwanted data (evidence sculpting tool), rather than as a cutting tool (pharmacist’s cutting and partitioning razor) which divides philosophically valid and relevant constructs from their converse.

Also, the instance common in media wherein so-called ‘fact-based’ media sites tell 100% truth about 50% of the relevant story. This is the same as issuing 50% misinformation or disinformation.

6.  Lindy-Ignorance Vortex – do those who enforce or imply a conforming idea or view seem to possess a deep emotional investment in ensuring that no broach of subject is allowed regarding any thoughts or research around an opposing idea, or specific ideas or avenues of research they disfavor? Do they easily and habitually imply that their favored conclusions are the prevailing opinion of scientists? Is there an urgency to reach or sustain this conclusion by means of short-cut words like ‘evidence’ and ‘fact’? If such disfavored ideas are considered for research or are broached, is extreme disdain, social and media derision called for?

Verdrängung Mechanism – the level of control and idea displacement achieved through skillful employment of the duality between pluralistic ignorance and the Lindy Effect. The longer a control-minded group can sustain an Omega Hypothesis perception by means of the tactics and power protocols of proactive pluralistic ignorance, the greater future acceptability and lifespan that idea will possess. As well, the harder it will be to dethrone as an accepted norm or perception as a ‘proved’ null hypothesis.

One-Liner – this refers to a cliché that is a commonly used phrase, or folk wisdom, sometimes used to quell cognitive dissonance. It is employed to end and win an argument and imply that science has made a final disposition on a matter long ago, when indeed no such conclusion has ever been reached.

Orphan Question – a question, purported to be the beginning of the scientific method, which is asked in the blind, without sufficient intelligence gathering or preparation research, and is as a result highly vulnerable to being manipulated or posed by means of agency. The likelihood of a scientifically valid answer being developed from this question process, is very low. However, an answer of some kind can almost always be developed – and is often spun by its agency as ‘science’. This form of question, while not always pseudoscience, is a part of a modified process of science called sciebam. It should only be asked when there truly is no base of intelligence or body of information regarding a subject. A condition which is rare.

Overconfidence Effect – excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.
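The gap between stated certainty and actual accuracy can be made concrete with a small calibration check. The answer records below are hypothetical, purely for illustration; real calibration studies aggregate many answers per confidence bucket.

```python
# Calibration check: compare stated confidence against observed accuracy.
# Each record is (stated confidence, was the answer actually correct?).
# Hypothetical data: ten answers rated "99% certain", of which six are right.
answers = [
    (0.99, True), (0.99, False), (0.99, True), (0.99, False), (0.99, True),
    (0.99, True), (0.99, False), (0.99, True), (0.99, False), (0.99, True),
]

def calibration(records):
    """Return (mean stated confidence, observed accuracy) for a set of answers."""
    stated = sum(conf for conf, _ in records) / len(records)
    actual = sum(1 for _, correct in records if correct) / len(records)
    return stated, actual

stated, actual = calibration(answers)
print(f"stated confidence: {stated:.0%}, observed accuracy: {actual:.0%}")
# With 6 of 10 correct, the "99% certain" answers are wrong 40% of the time.
```

A well-calibrated answerer would show observed accuracy close to stated confidence; the overconfidence effect is precisely the persistent gap between the two.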

Oversimplification (Pseudo Reduction) – instead of reducing an argument so that its contributing elements can be tested, a pretend skeptic will oversimplify the argument as a pretense of reduction. A form of false reduction which only serves to reduce the possible outcomes, and not actually deconstruct an argument into its logical critical path. Rather than examining all contributing elements of cause to effect, soundness or observation, the oversimplifier pares off those influences, constraints, objectives, and factors which serve to get in the way of their agency or desired conclusion. Thereafter employing ‘Occam’s Razor’ simplicity as an apologetic. ‘The dose makes the poison’, or ‘non-ionizing radiation can’t cause cancer’ are examples of pseudo-reduction. The arguer appears to be stepping down to a level of cause and effect inference, however has excluded so many factors that there can only be one a priori inference drawn from the remaining set of influences.

Panalytics – statistics purposely contrived to be broad and shallow so as to avoid any possible signal detection other than extreme effects, coupled with abject avoidance of any kind of direct observation to confirm. The opposite of intelligence (consilience between analytic signals and direct observation).

Panduction – an invalid form of inference which is spun in the form of pseudo-deductive study. Inference which seeks to falsify in one fell swoop ‘everything but what my club believes’ as constituting one group of bad people, who all believe the same wrong and correlated things – this is the warning flag of panductive pseudo-theory. No follow up series studies nor replication methodology can be derived from this type of ‘study’, which in essence serves to make it pseudo-science.  This is a common ‘study’ format which is conducted by social skeptics masquerading as scientists, to pan people and subjects they dislike. There are three general types of Panduction. In its essence, panduction is any form of inference used to pan an entire array of theories, constructs, ideas and beliefs (save for one favored and often hidden one), by means of the following technique groupings:

  1. Extrapolate and Bundle from Unsound Premise
  2. Impugn through Invalid Syllogism
  3. Mischaracterize through False Observation

pavor mensura – an effect of apophenia wherein, upon analyzing a system or looking at a set of analytics or an issue for the very first time, observers will often mistakenly perceive that a disaster is in the making. The erroneous tendency to perceive that the first statistical measures of a disease, natural system, dynamic process or social trend – can be reliably extrapolated to predict calamity therein.

Pedantic Smokescreen – the process of deluding one’s self regarding, or the process of employing, the exclusive and unique principles of science, in order to obscure and justify activities which would otherwise constitute fraud and malfeasance in business and legal domains.

Perfect Solution Fallacy – when solutions to challenging observations are rejected because they are not perfect or the sponsors of the underlying ideas are not perfect.

Periplocate – (Greek: περίπλοκος, períplokos : complicated, elaborate, involved) – to render a process or approach the resolution of a question in a more complicated fashion than is necessary. The use of complicating ignoratio elenchi, red herring, or overly obscure academic heuristics or methodologies to solve a problem, when such complexity was not required to begin with. This is often done in an effort to capture the topic or question under a fallacy of relative privation (implying that the matter can only be addressed by academics or scientists). The opposite of methodical deescalation.

Placebo – something which is inert and non-beneficial is regarded by its beneficiary to be helpful, and therefore helps.

Pharmaceutical Research Fraud – nine methods of attaining success results from pharmaceutical studies, which are borderline or fraudulent in nature. The first eight of these are developed by Richard Smith, editor of the British Medical Journal. The ninth is a converse way of applying these fraud accusations to filter out appeals for study under Ockham’s Razor, in situations where studies run counter to pharmaceutical/research revenue goals.

1.  Conduct a trial of your drug against a treatment known to be inferior.

2.  Trial your drug against too low of a dose of a competitor drug.

3.  Conduct a trial of your drug against too high of a dose of a competitor drug (making your drug seem less toxic).

4.  Conduct trials which are too small to show differences from competitor drugs.

5.  Use multiple endpoints in the trial and select for publication those that give favorable results.

6.  Do multicenter trials and select for publication results from centers that are favorable.

7.  Conduct subgroup analyses and select for publication those that are favorable.

8.  Present results that are most likely to impress – for example, reductions in relative risk rather than absolute risk.

9.  Conduct high inclusion bias statistical studies or no studies at all, and employ items 1 – 8 above to discredit any studies which indicate dissenting results.
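Item 8 above turns on simple arithmetic. Using hypothetical event rates of 2% in the control group and 1% in the treated group, the same trial result can be reported as an impressive ‘50% relative risk reduction’ or as a modest one-percentage-point absolute reduction:

```python
# Relative vs. absolute risk, using hypothetical event rates (item 8 above).
control_rate = 0.02  # 2% of the control group experience the adverse event
treated_rate = 0.01  # 1% of the treated group experience the adverse event

# Absolute risk reduction: the raw difference in event rates.
absolute_risk_reduction = control_rate - treated_rate
# Relative risk reduction: the same difference, scaled by the control rate.
relative_risk_reduction = absolute_risk_reduction / control_rate
# Number needed to treat: patients treated per single event prevented.
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"relative risk reduction: {relative_risk_reduction:.0%}")  # 50%
print(f"absolute risk reduction: {absolute_risk_reduction:.1%}")  # 1.0%
print(f"number needed to treat:  {number_needed_to_treat:.0f}")   # 100
```

The headline ‘50% reduction’ is arithmetically true, yet one hundred patients must be treated to prevent a single event – which is why presenting relative rather than absolute risk is listed among the borderline methods.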

Placebo Appeal – a placebo claim which is made in absence of any data, observation or evidence.

Planck Paradigm Shift – the final peer review. Science which is denied and squelched through manipulation of process and refusal to tender peer review eventually triumphs, not by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.

Pluralistic Ignorance – a situation in which a majority of scientists and researchers privately reject a norm, but incorrectly assume that most other scientists and researchers accept it, often because of a misleading portrayal of consensus by agenda carrying social skeptics. Therefore they choose to go along with something with which they privately dissent or are neutral.

Popper Demarcation Malpractice – the dilettante presumption that if any set of claims or theory is innately non-falsifiable, it belongs to the domain of pseudoscience. Wrongly presuming a subject to be a pseudoscience, instead of false practices pretending to be science. Purposely or unskillfully conflating the methods of science with the body of scientific knowledge, employing amphibology or proxy equivocation in their articulation of the issue, wherein every proposed claim about what distinguishes science from pseudoscience can be confused with a counter-example. This renders the demarcation boundary of no utility, and reduces overall understanding.

Popper Demarcation Non-Science – purported science which simply seeks results supporting a preexisting or favored explanation. Suffers from the weakness that real science seeks to falsify, relate, predict and problem solve; understanding that a force-to-conformance does none of this.

Popper Error – when relying on the weak positions of predictive studies, statistical analyses, a ‘study of studies,’ associative and correlative studies, or series of anecdotes to stand as sufficient basis for peer review and/or acceptance of a shaky contention. Such studies are more appropriate for plurality screening, not proof.

praedicate evidentia – any of several forms of exaggeration or avoidance in qualifying a lack of evidence, logical calculus or soundness inside an argument. A trick of preemptive false-inference, which is usually issued in the form of a circular reasoning along the lines of ‘it should not be studied, because study will prove that it is false, therefore it should not be studied’ or ‘if it were true, it would have been studied’.

praedicate evidentia – hyperbole in extrapolating or overestimating the gravitas of evidence supporting a specific claim, when only one examination of merit has been conducted, insufficient hypothesis reduction has been performed on the topic, a plurality of data exists but few questions have been asked, few dissenting or negative studies have been published, or few or no such studies have indeed been conducted at all.

praedicate evidentia modus ponens – any form of argument which claims a proposition consequent ‘Q’, which also features a lack of qualifying modus ponens, ‘If P then’ premise in its expression – rather, implying ‘If P then’ as its qualifying antecedent. This as a means of surreptitiously avoiding a lack of soundness or lack of logical calculus inside that argument; and moreover, enforcing only its conclusion ‘Q’ instead. A ‘There is not evidence for…’ claim made inside a condition of little study or full absence of any study whatsoever.

Principle of Peerhood – (‘peer’ is a word derived from nobility ranking and matching) – I shall not tell an epidemiologist his business, unless he infers from his work that I should necessarily undertake a harm or ruin – at that point, I am now a peer. The stakeholder placed at risk is the peer review.

Principle of Relegation – any sufficiently analyzed magic is indistinguishable from science (see Relegation Error).

Pro Innovation Bias – the tendency to have an excessive optimism towards technology or science’s ability to shed light into a subject or advance understanding, while often failing to identify its limitations and weaknesses, and habitually dismissing all other methods.

Problem of Induction – a variety of forms of argument which either suffer from Popper’s problem of induction, demarcation or in some way imply or claim scientific completion or consensus, when such a standard has either not been attained in fact, or only exhibited inductive consilience as opposed to scientific deduction.

Projection Bias – the tendency to unconsciously assume that others (or one’s future selves) share one’s current emotional states, thoughts, ideas, beliefs and values.

Proof Gaming – employing dilettante concepts of ‘proof’ as a football in order to win arguments, disfavor disliked groups or thought, or exercise fake versions of science. Asking for proof before the process of science can ostensibly even start, knowing that plurality is what begins the scientific method, not proof, and further exploiting the reality that science very seldom arrives at a destination called ‘proof’ anyway. Proof gaming presents itself in seven speciations:

Catch 22 (non rectum agitur fallacy) – the pseudoscience of forcing the proponent of a construct or observation, to immediately and definitively skip to the end of the scientific method and single-handedly prove their contention, circumventing all other steps of the scientific method and any aid of science therein; this monumental achievement prerequisite before the contention would ostensibly be allowed to be considered by science in the first place. Backwards scientific method and skipping of the plurality and critical work content steps of science. A trick of fake skeptic pseudoscience, which they play on non-science stakeholders and observers they wish to squelch.

Fictitious Burden of Proof – declaring a ‘burden of proof’ to exist when such an assertion is not salient under science method at all. A burden of proof cannot possibly exist if neither the null hypothesis, alternative theories, nor any proposed construct possesses a Popper sufficient testable/observable/discernible/measurable mechanism; nor moreover, if the subject in the matter of ‘proof’ bears no Wittgenstein sufficient definition in the first place (such as the terms ‘god’ or ‘nothingness’).

Herculean Burden of Proof – placing a ‘burden of proof’ upon an opponent which is either arguing from ignorance (asking to prove absence), not relevant to science or not inside the relevant range of achievable scientific endeavor in the first place. Assigning a burden of proof which cannot possibly be provided/resolved by a human being inside our current state of technology or sophistication of thought/knowledge (such as ‘prove abiogenesis’ or ‘prove that only the material exists’). Asking someone to prove an absence proposition (such as ‘prove elves do not exist’).

fictus scientia – assigning to disfavored ideas, a burden of proof which is far in excess of the standard regarded for acceptance or even due consideration inside science methods. Similarly, any form of denial of access to acceptance processes normally employed inside science (usually peer review both at theory formulation and at completion). Request for proof as the implied standard of science – while failing to realize or deceiving opponents into failing to realize that 90% of science is not settled by means of ‘proof’ to begin with.

Observation vs Claim Blurring – the false practice of calling an observation or data set, a ‘claim’ on the observers’ part.  This in an effort to subjugate such observations into the category of constituting scientific claims which therefore must be now ‘proved’ or dismissed (the real goal: see Transactional Occam’s Razor Fallacy).  In fact an observation is simply that, a piece of evidence or a cataloged fact. Its false dismissal under the pretense of being deemed a ‘claim’ is a practice of deception and pseudoscience.

As Science as Law Fallacy – conducting science as if it were being reduced inside a court of law or by a judge (usually the one forcing the fake science to begin with), through either declaring a precautionary principle theory to be innocent until proved guilty, or forcing standards of evidence inside a court of law onto hypothesis reduction methodology, when the two processes are conducted differently.

Propaganda – when disseminating authorized, conclusive and pre-digested information via media and channels of public update, under a condition in which none of the promoters of such information are qualified to report it, nor understand its technical basis or ramifications, nor are allowed to question such information or its underpinning arguments/data.

Proteus Phenomenon – the phenomenon of rapidly alternating or antithetical extreme research claims and extremely opposite refutations early in the risk horizon involved in a hot topic of science. Before peer pressure to accede to one answer has been established. Research during a period of scientific maturation where dissent is not only acceptable – but is advantageous for establishing career notoriety. This is the opposite of jackboot ignorance, a period in which opposite refutation is forbidden, regardless of the risk horizon involved in the topic.

Provisional Knowledge – the contrivance of a series of purposed provisional arguments, into a stack of probable explanations wherein we ignore the increasing unlikelihood of our conclusions and simply consider the stack of plurality to be proscribed; and eventually by Neuhaus’s Law, prescribed.

Pseudo Scientific Naturalism – when one employs or implies furtive hyperbole as to what science has concluded, eliminated, disproved or studied, foisted to proactively preclude one’s personal or a group’s belief set from being qualified as a religion.

Pseudo-Hypothesis – A pseudo-hypothesis explains everything, anything and nothing, all at the same time. A pseudo-hypothesis fails in its duty to reduce, address or inform. A pseudo-hypothesis states a conclusion and hides its critical path risk (magical assumption) inside its set of prior art and predicate structure. A hypothesis, on the other hand, reduces its sets of prior art, evidence and conjecture and makes them manifest. It then addresses critical path issues and tests its risk (magical assumption) as part of its very conjecture accountability. A hypothesis reduces, exposes and puts its magical assertion on trial. A pseudo-hypothesis hides its magical assumptions woven into its epistemology and places nothing at risk thereafter. A hypothesis is not a pseudo-hypothesis as long as it is ferreting out its magical assumptions and placing them into the crucible of accountability. Once this process stops, the hypothesis has become an Omega Hypothesis. Understanding this difference is key to scientific literacy. Grant me one hidden miracle and I can explain everything.

Pseudo-Prophecy – a theory which is purported to be successful at induction and predictive power, yet as well, is able to explain everything observed. A theory which explains everything, probably explains nothing.  In similar principle, a prophecy which is vague enough such that it could apply to virtually any culture at any time, based on the preponderance of sets of circumstances historically – is not a prophecy at all.

Pseudo-Theory (Mock Hypothesis) – a construct, belief or overarching idea which explains anything, everything and nothing – all at the same time. It is a premature and imperious proposed explanation for a set of post facto observations or phenomena. Instead of bearing the traits of true scientific theory (hypothesis) – a pseudo-theory is quickly crafted and installed so as to exploit the advantages of pluralistic ignorance and the Lindy Effect. It explains everything without having to be approached by falsification, nor having to successfully predict anything. Usually installed as the null hypothesis before an argument is even framed around an issue, pseudo-theory is used primarily as a football enabling dismissal of competing alternatives from the point of its installation as the null hypothesis, onward. More specifically, pseudo-theory (mock hypothesis) bears the following profiling traits or essences:

1.  Can be developed in full essence before any investigation even begins.

2.  Never improves in its depth, description nor falsifiable or inductive strength despite ongoing research and increases in observational data.

3.  Possesses no real method of falsification nor distinguishing predictive measure which is placed at risk, nor does it offer any other means of being held to account or measure.

4.  Employs non-Wittgenstein equivocal/colloquial terminology or underlying premises (possibly pseudo-theory itself) where the risk of conjecture is not acknowledged.

5.  Is employed primarily as a symbolic or fiat excuse to dismiss disliked or competing explanations.

6.  Filters out by method during the hypothesis formulation stages, high probative value information, in favor of perceived high reliability or authorized information only (cherry sorting).

7.  Can explain a multiplicity of observations or even every non-resolved question (Explanitude).

8.  Is artificially installed as the null hypothesis from the very start.

9.  Attains its strength through becoming a Verdrängung Mechanism.

10.  Considers the absence of observation or a data collection/detection failure as suitable to stand in as ‘evidence’ (argument from ignorance).

11.  Pseudo-theory can be identified in that, as less information is held or information is screened out (cherry sorted), pseudo-theory tends to appear to grow more plausible and more pervasively explanatory, and is able to be produced with less effort (armchair debunking for instance). Whereas valid theory and hypothesis tend to strengthen with research effort and an increase in information.

12.  Panduction – an invalid form of inference which is spun in the form of pseudo-deductive study. Inference which seeks to falsify in one fell swoop ‘everything but what my club believes’ as constituting one group of bad people, who all believe the same wrong and correlated things – this is the warning flag of panductive pseudo-theory. No follow up series studies nor replication methodology can be derived from this type of ‘study’, which in essence serves to make it pseudo-science.  This is a common ‘study’ format which is conducted by social skeptics masquerading as scientists, to pan people and subjects they dislike.

Psychologism – when psychology plays the sole or central role in underpinning facts or explaining a non-psychological fact or principle expressed as constituting accepted knowledge. Suffers from the weakness that psychological principles enjoy a perch which can never be falsified, therefore they are at risk of standing as pseudoscience.

Publication Bias – an effect observed by Sterling and Rosenthal (also called the ‘File Drawer Problem’) wherein a bias toward publishing conforming, confirming or positive-result studies, as opposed to negative or null-result studies, or an over-reliance upon p-value bias, will inevitably lead to a whipsaw effect of both filtering out negative-result studies and unduly canonizing as fact empirical conclusions of questionable merit.
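The filtering mechanism described above can be illustrated with a minimal hypothetical simulation (all parameters – the study count, standard error, and significance cutoff – are invented for illustration): even when the true effect is exactly zero, a literature that publishes only statistically significant results will report a consistently inflated effect.

```python
import random
import statistics

random.seed(42)

# Hypothetical scenario: 1,000 studies of a treatment whose TRUE effect is
# zero. Each study's effect estimate carries sampling noise (standard error
# of 1.0) and is "published" only if it reaches the conventional significance
# threshold (|z| > 1.96).
estimates = [random.gauss(0.0, 1.0) for _ in range(1000)]
published = [e for e in estimates if abs(e) > 1.96]

# The full set of studies correctly averages near zero; the published
# subset cannot, because the filter admits only extreme results.
print(f"All studies, mean effect:      {statistics.mean(estimates):+.3f}")
print(f"Published-only, mean |effect|: {statistics.mean(abs(e) for e in published):.3f}")
print(f"Share of studies published:    {len(published) / len(estimates):.1%}")
```

Roughly 5% of the null studies clear the threshold by chance alone, and every one of them reports an effect magnitude above 1.96 standard errors – the ‘whipsaw’ the entry describes, produced with no fraud at any individual study.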

P-value Amaurosis – the ironic state of a study or argument wherein there exists more confidence in the accuracy of the percentage significance in measure (p-value), than of the accuracy of its measured contention in the first place.
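A hedged sketch of the irony (the systematic bias of +0.3 and the sample size are invented): a measurement can be wrong in value yet carry an astronomically confident p-value, because significance quantifies sampling noise, not accuracy of the contention being measured.

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical: a quantity whose true value is 0, measured with a small
# SYSTEMATIC bias of +0.3. With n large, the p-value against zero becomes
# effectively certain, about an estimate that is simply wrong.
true_value, bias, n = 0.0, 0.3, 10_000
sample = [true_value + bias + random.gauss(0.0, 1.0) for _ in range(n)]

mean = statistics.mean(sample)
se = statistics.stdev(sample) / math.sqrt(n)
z = mean / se  # a z this large implies a p-value near zero

print(f"estimated value: {mean:.3f} (truth: {true_value})")
print(f"z-statistic:     {z:.1f}  (p-value effectively zero)")
```

The study is supremely confident in its significance measure, and entirely mistaken in its measured contention – the condition the entry names.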

quod fieri – (Latin: (lit.) ‘(the fact) that (now a specific thing is) to be done’) – a form of intervention bias action, in which the action is not taken from sound evidence or a history of effectiveness, but rather simply because something must be done. This type of decision or action usually is executed in a panic situation, in the face of a slow moving disaster, or a theater of cataclysmic mirage. Ironically its feckless or inane basis is compensated for, by a religious, political, or social fanaticism as to its claimed (but usually false) effectiveness. Those who raise questions regarding the action are typically cast as deplorable and anti-virtue.

quotidie mortem – the psychopathic belief that through the administering of a poison slowly over time, one is not actually killing the person or persons they seek to murder. Often defended by the quip “The dose makes the poison.”

Rappaport Claim or Theory – a perspective useful in detecting official deception, based upon a quip by anthropologist Roy A. Rappaport, which states “If a proposition is going to be taken to be unquestionably true, it is important that no one understand it.” A commentary on the effectiveness of obfuscation at the intersection of pseudo-theory and wicker man defenses. Such a proposition must feature the traits of pseudo-theory, in that it explains everything under a Lindy effect (and therefore likely explains nothing in reality). Moreover, every critique of such an idea’s features must be deemed as a straw man, thus its elemental claims cannot be pinned down nor tested, because every (genuine) skeptic of its tenets is inevitably ‘wrong’ in some regard (a wicker man defense).

Reactionary Bunk – to be so threatened or angered by a pseudoscientific idea that you allow it to influence your skepticism and begin to spin pseudoscience as an argument against the idea.

Reactive Devaluation – devaluing proposals, observations, data or ideas only because they purportedly originated with an adversary group or individual.

Redactionary Principle – the lifecycle management of chemicals, adjuvants or biological agents which do not indicate immediate classic major pathology pathways in test animals, into a final phase of testing upon the broader human population, in order to speed them to market and generate revenue during long term employment testing. Establishment of activist ‘skeptics’ to patrol and ensure any failures are squelched as constituting only pseudoscience and anecdote.

Reification Fallacy – assuming that sciencey-sounding words refer to existing and mature elements of science, and that the meanings of words are implicitly qualified within the things they refer to.

Relegation Error – a principle wherein an authority of control or truth establishes a circumstance wherein, any advance in validity which is produced by outside entities, is immediately appropriated to become part of the truth base of the controlling authority alone. By default, the controlling authority then must be held as the truth standard. All other entities remain in a perpetual state of practice ‘wrong’ – regardless of their actual track record. For example, successes of integrative medicine being immediately co-opted into academic science and accordingly stripped of their non-academic pedigree. Those pseudosciences, thereafter continue to be disdained and ignored as quackery, hokum and non-science by academia. By fait accompli under this method, outsider research, economic theories or controversial avenues of research will always constitute anecdotal pseudoscience, by practice of idea theft alone.

Religion – the compulsory adherence to an idea around which testing for falsification is prohibited.

Researcher’s Conundrum – if I conduct objective research inside a subject which is a pseudoscience, then I am considered a pseudo-scientist. However, if I dismiss the subject out of hand, with no research, then I am regarded as having been scientific in my approach.

Salami Slicing – the practice of artificially breaking a scientific study down into sub-components and publishing each sub-component as a separate study. If the paper involves consilience from studies derived from a number of disciplines, this might be acceptable.  However, when the studies are broken down simply to increase the publishing history of its authors or make a political position appear to be more scientific, through ‘a thousand studies support the idea that…’ styled articles, then this is a form of pseudoscience.

Sampled Population Mismatch – a study design methodology wherein a small population is sampled for a signal which is desired for detection, and a large population cohort is sampled regarding a signal which is not desired for detection. For example, using n=44 (unvaccinated) and 1,680 (vaccinated) in order to show that those vaccinated exhibit a lower rate of atopic allergies. The variance required to upshift the desired signal in the smaller group is on the order of a mere 2 or 3 persons. The odds of this occurring are very high, especially if the small group all originates from the same or similar lifestyle.
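The swing arithmetic in this entry can be shown directly, using its own n=44 and n=1,680 figures: the same handful of reclassified subjects moves the small cohort's measured rate by whole percentage points while barely registering in the large cohort.

```python
# Cohort sizes taken from the entry above; the 3-subject swing is the
# entry's own "mere 2 or 3 persons" scenario.
small_n, large_n = 44, 1680
swing = 3  # subjects whose classification flips by chance or lifestyle

small_shift = swing / small_n
large_shift = swing / large_n

print(f"Rate shift from {swing} subjects at n={small_n}:   {small_shift:.1%}")
print(f"Rate shift from {swing} subjects at n={large_n}: {large_shift:.2%}")
```

Three subjects shift the n=44 rate by nearly seven percentage points, versus under two tenths of a point at n=1,680 – a study design in which the desired signal needs almost no real-world variance to appear.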

Satisficing – a term which describes bias from the perspective of professional intelligence analysis, wherein one chooses the first hypothesis that appears good enough rather than carefully identifying all possible hypotheses and determining which is most consistent with the evidence.

Scale Obfuscation – shooting high and shooting low. Any form of a priori assumption wherein a researcher presumes that larger or elemental sample populations are congruent with more study or better science. This as contrasted with an incremental scientific method under a condition of making open context field observations first, crafting the right (rather than presumed right) question, followed by study of multiple iterations of smaller, discrete, cohort and more focused populations, built then into larger field databases and study.

Bigger Data is Better Fallacy – the invalid assumption which researchers make when attempting to measure a purported phenomenon, that data extraction inside a very large source population, as the first and only step of scientific study, will serve to produce results which are conclusive or more scientific. The same presumption error can apply to meta-analysis. When such an analysis is conducted in a context of low/detached, rather than informed, knowledge sets, it will serve to dilute critical elements of signal and intelligence which could be used to elucidate the issue further.

Big is Science Error – the use of bigger sample sizes and study data as a way of bypassing the scientific method, while still tendering an affectation of science and gravitas. Also present any time a study cannot be replicated, and a call to consensus is made simply because it would be too difficult to replicate the study basis of the consensus.

Yule-Simpson Paradox – a trend which appears in different groups of data can be manipulated to disappear or reverse (see Effect Inversion) when these groups are combined.
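The reversal can be reproduced with the textbook kidney-stone treatment data (Charig et al., 1986): Treatment A wins in both subgroups, yet loses in the combined totals, because each treatment was applied to the subgroups in very different proportions.

```python
# (successes, trials) per treatment arm, by stone-size subgroup
data = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

# Within each subgroup, A outperforms B.
for group, arms in data.items():
    a, b = rate(*arms["A"]), rate(*arms["B"])
    print(f"{group}: A {a:.0%} vs B {b:.0%}  ->  A better: {a > b}")

# Combine by summing successes and trials across the subgroups;
# the unequal subgroup weights flip the comparison.
totals = {arm: tuple(map(sum, zip(*(g[arm] for g in data.values()))))
          for arm in ("A", "B")}
a_all, b_all = rate(*totals["A"]), rate(*totals["B"])
print(f"combined: A {a_all:.0%} vs B {b_all:.0%}  ->  A better: {a_all > b_all}")
```

The manipulation the entry warns of is exactly this choice of aggregation level: report the combined totals and the subgroup trend disappears; report the subgroups and the combined trend reverses.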

Elemental Pleading – breaking down the testing of data or a claim into testing of its constituents, in order to remove or filter an effect which can only be derived from the combination of elements being claimed. For instance, to address the claim that doxycycline and EDTA reduce arterial plaque, testing was developed to measure the impact of each item individually, and when no effect was found, the combination was dismissed as well without study, and further/combination testing was deemed to be ‘pseudoscience’.

sciebam – (Latin: I knew) – an alternative form of knowledge development, which mandates that science begins with the orphan/non-informed step of ‘ask a question’ or ‘state a hypothesis’. A non-scientific process which bypasses the first steps of the scientific method: observation, intelligence development and formulation of necessity. This form of pseudoscience/non-science presents three vulnerabilities:

  1. First it presumes that the researcher possesses substantially all the knowledge or framework they need, lacking only to fill in final minor gaps in understanding. This creates an illusion-of-knowledge effect on the part of the extended domain of researchers, as each bit of provisional knowledge is then codified as certain knowledge based upon prior confidence. Science can only progress thereafter through a series of shattering paradigm shifts.
  2. Second, it renders science vulnerable to the possibility that, if the hypothesis, framework or context itself is unacceptable at the very start, then its researcher therefore is necessarily conducting pseudoscience. This no matter the results, nor how skillfully and expertly they may apply the methods of science. And since the hypothesis is now a pseudoscience, no observation, intelligence development or formulation of necessity are therefore warranted. The subject is now closed/embargoed by means of circular appeal to authority.
  3. Finally, the question asked at the beginning of a process of inquiry can often prejudice the direction and efficacy of that inquiry. A premature or poorly developed question, and especially one asked under the influence of agency (not simply bias) – and in absence of sufficient observation and intelligence – can most often result quickly in a premature or poorly induced answer.

¡science! – lying through tendering the appearance of being scientific (pseudoscience). A process made to look like science, which is 25% assumption, 25% outdated or semi-relevant study, 25% derision and bullying and 25% false claims to consensus. Partly testing a favored hypothesis and declaring it true, coupled with blocking of testing on competing ideas and declaring all of them false or pseudoscience.

Science as the Sciences Error – constrained misdefinition and equivocation of the word science to, rather than the method and body of knowledge development, a restrictive domain of the academic sciences alone. This so that skepticism is free to now errantly be applied in any fashion ‘outside of science.’

Science Bundling – citing a couple of accepted norms of science and then throwing into the bundle a contention which is contested, political or questionable. This in order to intimidate a neutral audience into accepting the questionable assertion as being equal in regard with the other scientific norms. Often practiced by political bloggers and propaganda journalists. This is practiced as well in the pejorative, citing that a person’s beliefs or contentions are equivalent to Santa Claus, the Easter Bunny, BigFoot and Flat Earth Theory.

Science Delusion – coined by Rupert Sheldrake, this is the belief that science has already ascertained the principal components of the nature of reality, and that only a small portion of the unknown details remains to be filled in. Ethical skepticism supports bringing attention to this cognitive mistake, but rather than deeming it an error of science, instead identifies the problem as our social influences which corrupt the common underlying philosophy in defense of science, skepticism. What is known as social skepticism.

Science Faction Bias – the forcing of authors, dramas, screenplays, movies and storytellers of science fiction to compulsively conform to an observer’s personal version of science. An irritation with imagination if it wanders into realms which disagree with the observer’s personal ontology, sold as being indignant over violations of established science.

Sciencewashing – both the existential state or the actions involved in dressing up a corporate profit or monist control motivated agenda as “the science”.

Scienter – a legal term which refers to intent or knowledge of wrongdoing while in the act of executing it. An offending party thus has knowledge of the ‘wrongness’ of their dealings, methods or data, but chooses to turn a blind eye to the issue, commensurate with executing or employing such dealings, methods or data. This is typically coupled with a failure to follow up on the impacts of the action taken, a failure of science called ignoro eventum.

The Scientific Method – a method of knowledge development bearing traits of process accountability which serve to transcend mere casual inquiry, mitigate bias and proscribe surreptitious agency masquerading as knowledge. A strategic process, which employs direct observation, analysis, ethics, skepticism, as well as experimental methodology and hypothesis testing, as tools inside a broader more comprehensive set of diligence.

I.  Observation – Domain Observation

II.  Intelligence – Intelligence Gathering/Schema Construction

III.  Necessity – Establishment of Necessity

IV.  Construct Formulation (by Sponsors)

V.  Ockham’s Razor/Peer Support (Skeptics are allies not opponents)

VI.  Hypothesis Development

VII.  Inductive and Statistical Study

VIII.  Competitive Hypothesis Framing

IX.  Deductive Testing/Inductive Consilience

X.  Hypothesis Modification/Reduction

XI.  Falsification Testing/Repeatability

XII.  Theory Formulation/Refinement

XIII.  Peer Review

XIV.  Publication

Scooby-Doo Science – a mindset born by fake skeptics wherein every mystery is easily resolved by current science understanding or the pretense that science has studied a subject when it has not – a ‘science’ which also features a convenient ability to highlight the bad person in the argument – usually of a consistent gender and ethnicity.

Sculptured Narrative – a social declaration which fits a predetermined agenda, purported to be of ‘weight of evidence’ and science in origin. In reality, however, it stems more from the removal/ignoring of the majority or plurality of available or ascertainable evidence, in order to sculpt a conclusion which was sought before research ever began (see Wittgenstein sinnlos Skulptur Mechanism). Conducting science by dwelling only in the statistical and meta-analytical domains while excising all data which does not fit the social narrative of funding entities, large corporations or SSkeptic organizations. Refusing to conduct direct studies, publishing studies which contain an inversion effect, and filtering countermanding studies out by attacking journals and authors and ignoring large bodies of evidence, consilience or falsification opportunity.

Self Confirming Process – a process which is constructed to only find the answer which was presumed before its formulation. A lexicon, set of assumptions or data, procedure or process of logical calculus which can only serve to confirm a presupposed answer it was designed to find in the first place. A process which bears no quality control or review, or which does not contain a method through which it can reasonably determine its own conclusion to be in question or error.

Self-Fulfilling Inductive Prediction – prediction which is confirmed through induction by means of a separate rationale which appears to place its hypothesis at risk, whose predictive measure in fact has already been proved to be true by previous deductive inference. A pseudo-hypothesis – such as showing that people who are told they are predisposed to gain weight, by means of genetic testing, actually tend to gain more weight – and then attributing this effect to the psychology of ‘having been told they were predisposed’, as opposed to the simple fact that they have the genetics which predispose them in the first place. A common study trick in psychology.

Shermerganda – the misrepresentation of science, argument or the scientific method by citing Michael Shermer, or a similar very highly visible SSkeptic, as a source or authority on science, despite their lack of expertise in the subject under consideration.

Shopcraft – traits, arrival forms and distributions of data which exhibit characteristics of having been produced by a human organization, policy or mechanism. A result which is touted to be natural, random or unconstrained, however which features patterns or mathematics which indicate human intervention is at play inside its dynamics. A method of detecting agency, and not mere bias, inside a system.

Silly Con – spinning consilience as consensus. Investigating only one alternative and through manipulative pluralistic ignorance and social pressure, declaring that hypothesis as consensus and all others as unnecessary/pseudoscience/anti-science.  Spinning politically motivated variations of an accepted scientific hypothesis, and selling those variations to the untrained public, for consumption in the name of science.

sinnlos – mis-sense. A contention which does not follow from the evidence, is correct at face value but disinformative or is otherwise useless.

Skereto Curve/Rule – a condition wherein 99% of the skeptics are focused on and obsessing over 1% of the problem.

Social Epistemology – when we conceive of epistemology as including knowledge and justified belief as they are positioned within a particular social and historical context, epistemology becomes social epistemology. Since many stakeholders view scientific facts as social constructions, they would deny that the goal of our intellectual and scientific activities is to find facts. Such constructivism, if weak, asserts the epistemological claim that scientific theories are laden with social, cultural, and historical presuppositions and biases; if strong, it asserts the metaphysical claim that truth and reality are themselves socially constructed. Moreover, in recognizing this, when social justice or the counter to a perceived privilege are warranted, short cuts to science in the form of hyper and hypo epistemologies are enacted through bypassing the normal frustrating process of peer review, and substituting instead political-social campaigns – waged to act in lieu of science. These campaigns of ‘settled science’ are prosecuted in an effort to target a disliked culture, non-violent belief set, ethnicity or class – for harm and removal of human rights.

Social Peer Review – a process of acting on behalf of science, and pretense of conducting science, encouraged by celebrity skeptics – wherein one presumes that by declaring themselves to be a skeptic, any critique they offer towards a disliked subject, pseudoscience or person is therefore now tantamount to application of scientific peer review. Usually backed by the Richeliean power of celebrities or social skepticism itself.

Social Skepticism

1. a form of social activism which seeks abuse of science through a masquerade of its underlying philosophical vulnerability, skepticism. An imperious set of political, social, and religious beliefs which proliferate through teaching weaponized fake skepticism to useful idiots. Agency which actively seeks to foment conflict between science and the lay public, which then exploits such conflict to bolster its celebrity and influence.

2. a form of weaponized philosophy which masquerades as science, science enthusiasm or science communication. Social skepticism enforces specific conclusions and obfuscates competing ideas via a methodical and heavy-handed science embargo. It promotes charades of critical thought and self-aggrandizement, and is often chartered to defend corporate/Marxist agendas – all while maintaining a high priority of falsely impugning eschewed individuals and topics. Its philosophies and conclusions are imposed through intimidation on the part of its cabal and cast of dark actors, and are enacted in lieu of and through bypassing actual scientific method. One of the gravest weaknesses of human civilization is its crippling and unaccountable bent toward social coercion. This form of oppression disparages courage and curiosity inside the very arenas where they are most sorely needed.

Failures with respect to science are the result of flawed or manipulated philosophy of science. When social control, close-hold embargo or conformance agents subject science to a state of being their lap-dog, serving specific agendas, such agents err in regard to the philosophical basis of science, skepticism. They are not bad scientists, rather bad philosophers, seeking a socialized goal. They are social skeptics.

Sowert’s Law – a technique of deflection wherein a claim to supposed fact is derived from a stand-alone, manufactured, or trivial observation which is made inside a purposeful context of ignorance or isolation, stripped of its corroborating or supporting aspects. Ignorance + Trivia = “Fact”.

Status Quo Bias – the tendency to like things to stay relatively the same, even in the face of necessity and new observations.

Stickiness – the principle wherein proponents of a theory or philosophy often cling to it despite the mounting evidence against it. The theory is abandoned only after its last proponents die. Such obstinacy allows theories to be given a proper run for their money, rather than being prematurely abandoned in the face of scant or specious contrary data that could be overcome with further research and accrued verity. Otherwise, we risk prematurely abandoning theories which could add value in one aspect or are indeed valid themselves.

Stowaway Pseudofact or Stowaway – a specific form of unsubstantiated claim or pseudofact, which is granted undue gravitas through its stand-alone repetition inside a scientific study or an otherwise scientific context. The author, desiring the statement to be accepted as true without merit, will embed it within a body of work outlining a series of well-substantiated statements, hoping that the reader will not discern the difference and will accept the stowaway as part of the overall inference of the associated work.

Streetlight Effect – a type of observational bias that occurs when people search for something only where it is easiest to look.

Sunk Cost Skepticism – the phenomenon where SSkeptics justify increased investment or fanaticism in a construct or belief, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Fanaticism is directly related to the level of nagging and cumulative inner doubt. Also known as the sunk cost fallacy.

System Justification – the tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual, intellectual and collective self-interest.

Tangenda – in critical path theory inside the scientific method, a competing critical path of questions which are misleading, poorly crafted or biased, and which serve to reduce or stagnate the probative nature of science and steer the results of the scientific method into a specific conclusive domain. A critical path is a series of questions asked under the scientific method which are probative, illuminating, incremental, contextual, logical in sequence, parsimonious, risk averse, low in feature stacking, maximum in impact, and sequitur in terms of optimal knowledge development. While this is a tall order, it can be attained. Tangendas perform as a kind of pseudo-critical path which leads inexorably to a favored or desired conclusion or conformity – corrupted by the fashion in which scientific questions are crafted or the biased way in which they are sequenced.

Tendentious – showing agency towards a particular point of view, especially one around which there is serious disagreement in the at-large population. The root is the word ‘tendency’, which means ‘an inclination toward acting a certain way.’ A tendentious person holds their position from a compulsion which they cannot overcome through objective evaluation. One cannot be reasoned out of a position which one did not reason oneself into in the first place.

Terms of Service Error – when citing a contention which is included as an element of a corporation’s Terms of Service as therefore constituting something which is correct, or as an example of a restriction which is moral or scientific based on the grounds that a corporation has enforced it on their contracted clients or customers. Abrogating the US Constitution or other aspect of goodwill and freedom, based on the fact that to not do so would ‘violate your agreement to our Terms of Service.’

The Embargo Hypothesis (Hξ) – the hypothesis which must be dismissed without science because it threatens simplicity and verisimilitude. A disfavored hypothesis which will never be afforded access to science or the scientific method no matter what level of consilience is attained. An idea which threatens to expose the risk linkages inside of or falsify a stack of protected provisional knowledge which has achieved an importance greater than science itself: an Omega Hypothesis.

The Method of Scientific Propaganda – the common deeper hallmarks of scientific propaganda proceed according to the following method:

  1. To alter scientific paradigms or questions in a sleight-of-hand manner in order to establish a false basis for a completely separate but disguised contention.
  2. To conflate and promote consilience as consensus. Consilience is not a ‘unity of knowledge’ as Edward O. Wilson contends – as only diligent investigation of all compelling alternatives can serve to unify knowledge.
  3. To employ as null hypothesis, that which cannot be approached by Popper demarcation and falsification, and then further demonize all competing ideas.
  4. To employ explanitude-based disciplines, bullying, celebrity, journalism and false forms of philosophy and skepticism, as a means to enforce an agenda dressed up as science.
  5. To fail to conduct follow-up or safety confirmation studies, or sufficient parsimonious or precautionary study, in a circumstance where a risk has been adopted in the name of science.
  6. To imply or default that a null hypothesis is ‘true’ until proved otherwise, knowing that proof is a seldom-attained standard in science.
  7. To investigate only one hypothesis, and deem the social pressure and pluralistic ignorance around this bad habit as consensus or even consilience.
  8. To proscribe investigation into any alternative or deviation from consilience and give a moniker (anti-science or pseudoscience) to those who do so.
  9. To tamper with or conflate, the three forms of consensus into a falsely (through vulnerability exploitation) derived claim to scientific consensus of an Omega Hypothesis.
  10. To teach simpleton (simplest answer) or black and white delineations of scientific arguments as settled science, through channels of journalism which cannot differentiate good science from bad.

The (Wonka) Golden Ticket – Have we ever really tested the predictive strength of this idea standalone, or evaluated its antithetical ideas for falsification?

Einfach Mechanism – an invalid null or best explanation hypothesis. An explanation, theory or idea which resolves a contention under the scientific method solely by means of the strength of the idea itself. An idea which is not vetted by the rigor of falsification, predictive consilience nor mathematical derivation, rather is simply considered such a strong, or Occam’s Razor (sic) stemming-from-simplicity idea that the issue is closed as finished science from its proposition and acceptance onward. A pseudo-theory which is granted status as the default null hypothesis or as posing the ‘best explanation’, without having to pass the rigors with which its competing alternatives are burdened. Such alternatives may indeed be pseudo-theory, but so is the accepted hypothesis as well. An einfach mechanism may or may not be existentially true.

Torfuscation – pseudoscience or obfuscation enacted through a Nelsonian knowledge masquerade of scientific protocol and study design. Inappropriate, manipulated or shallow study design crafted so as to obscure or avoid a targeted/disliked inference. A process, contended to be science, wherein one develops a conclusion through cataloging study artifice or observation noise as valid data. Invalid observations which can be parlayed into becoming evidence of absence or evidence of existence as one desires – by accepting only the appropriate hit or miss grouping one desires as the basis to support an a priori preference, and as well avoid any further needed ex ante proof.  A refined form of praedicate evidentia or utile absentia employed through using less rigorous or probative methods of study than are requisite under otherwise ethical science.  Exploitation of study noise generated through first level ‘big data’ or agency-influenced ‘meta-synthesis’, as the ‘evidence’ that no further or deeper study is therefore warranted – and moreover that research of the subject entailed is now socially embargoed.

Transactional Occam’s Razor Fallacy (Appeal to Ignorance) – the false contention that a challenging construct, observation or paradigm must immediately be ‘explained.’ Sidestepping of the data aggregation, question development, intelligence and testing/replication steps of the scientific method and forcing a skip right to its artificially conclusive end (final peer review by ‘Occam’s Razor’).

Türsteher Mechanism – the effect or presence of ‘bouncer mentality’ inside journal peer review. An acceptance for peer review which bears the following self-confirming bias flaws in process:

  1. Selection of a peer review body is inherently biased toward professionals whom the steering committee finds impressive,
  2. Selection of papers for review fits the same model as was employed to select the reviewing body,
  3. Selection of papers from non-core areas is very limited and is not informed by practitioners specializing in those areas, and
  4. An inability to handle evidence that is not gathered in a format the body understands (large-scale, hard-to-replicate, double-blind randomized clinical trials or meta-studies).

Within such a process, the selection of initial papers is biased. Under this flawed process, the need for consensus results not simply in attrition of anything that cannot be agreed upon – but rather in a sticky bias against anything which has not successfully passed this unfair test in the past. An artificial and unfair creation of a pseudoscience results.

Tyflocracy – from Greek: τυφλός (tyflós: blind eye). A power-wielding and expansive form of governance or administration which is willfully or maliciously blind to a suffering subject group or citizenry – often displaced in favor of groups who are not under its charge, employed as a means to increase its power. A group which strategically apportions risk, dismissing or refusing to examine its impact upon a disfavored group over which it rules or has administering authority and impact – wherein a condition of negligence is indistinguishable from malevolence.

Unity of Knowledge Error (Religion) – to conflate and promote consilience as consensus. Consilience is by its essence inductive and therefore cannot alone underpin a ‘unity of knowledge’ as Edward O. Wilson contends. Only diligent investigation of all compelling alternatives, deductive science, can serve to finalize and unify knowledge (under consensus). To promote consilience as a unity of knowledge or substitute for consensus, in the absence of having diligently investigated competing alternative hypotheses, is also known in ethics as ‘religion.’

unsinnig – nonsense. A proposition of compromised coherency. Feynman ‘not even wrong.’

Unvestigation – the process of asking loaded questions and sculpting data so that your fake research will produce your desired conclusion.

utile absentia – a study which observes false absences of data or creates artificial absence noise through improper study design, and then further assumes such error to represent verified negative or positive observations. A study containing field or set data in which there exists a risk that absences in measurement data will be caused by external factors which artificially serve to make the evidence absent, through failure of detection/collection/retention of that data. The absences of data, rather than being filtered out of analysis, are fallaciously presumed to constitute bona fide observations of negatives. This is improper study design, which will often serve to produce an inversion effect (curative effect) in such a study’s final results. Similar to torfuscation. As well, the instance when an abstract offers a background summary or history of the topic’s material argument as part of its introductory premise, and thereafter mentions that its research supports one argument or position – yet fails to define the inference or critical path which served to precipitate that ‘support’ – or, even worse, tenders research about a related but non-critical aspect of the research. Like pretending to offer ‘clinical research’ supporting opinions about capitalism inside a study of the methods employed by bank tellers – it only sounds related. In this case one has converted an absence into a positive: a formal error called utile absentia. This sleight-of-hand allows an opinion article to masquerade as a ‘research study.’ It allows one to step into the realm of tendering an apparent epistemological result which is really nothing more than a ‘they said/I say’ editorial with a scientific analysis bolted onto it – one which may or may not bear on the subject at hand. Most abstract surveyors do not know the difference – and most study leads cannot detect when this has occurred.
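The inversion effect described above can be illustrated with a minimal simulation. The numbers here are hypothetical, chosen only to show the mechanism: an exposure which truly raises an outcome rate, measured through an arm whose detection rate is lower, will appear protective once missed detections are counted as genuine negatives.

```python
# Illustrative sketch (hypothetical rates): missed detections recorded as
# bona fide negatives — the utile absentia error — invert the apparent effect.
import random

random.seed(1)
N = 10_000
TRUE_RATE_EXPOSED, TRUE_RATE_CONTROL = 0.30, 0.20   # exposure truly raises risk
DETECT_EXPOSED, DETECT_CONTROL = 0.50, 0.95         # but is detected half as often

def observed_rate(true_rate, detection):
    # A true positive is observed only if detection succeeds; detection
    # failures are (wrongly) recorded as observed negatives.
    hits = sum(1 for _ in range(N)
               if random.random() < true_rate and random.random() < detection)
    return hits / N

exposed = observed_rate(TRUE_RATE_EXPOSED, DETECT_EXPOSED)
control = observed_rate(TRUE_RATE_CONTROL, DETECT_CONTROL)
print(f"observed: exposed {exposed:.1%} vs control {control:.1%}")
# True rates are 30% vs 20%, yet the observed rates land near 15% vs 19% —
# an apparent 'curative' (inverted) effect manufactured purely by absence noise.
```

The remedy the entry implies: absences at risk of detection failure must be filtered out as missing data, not tallied as negative observations.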

Utility Blindness – when simplicity or parsimony are incorrectly applied as excuse to resist the development of a new scientific explanatory model, data or challenging observation set, when indeed the participant refuses to consider or examine the explanatory utility of any similar new model under consideration.

Vacuous Truth – a statement that contends an absence or presence of a principle in an empty set domain, in order to imply the accuracy of a consequent argument. A contention that science holds no evidence in support of a topic which has not undergone scientific efforts at collecting evidence in the first place. Such a statement is technically true, but is not correct in its relationship as a logically qualified antecedent in support of the consequent.
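The empty-set mechanics behind this entry show up directly in formal logic and in programming: a universally quantified claim over an empty domain evaluates true by definition, which is exactly why it can qualify nothing. A minimal sketch (the variable names are illustrative):

```python
# A vacuous truth: a universal claim over an empty set is technically true.
studies_examining_topic = []  # no scientific effort at collection has occurred

# "Every study of this topic supports it" — and equally, "no study supports it" —
# both evaluate True over the empty domain:
claim_for = all(s["supports_topic"] for s in studies_examining_topic)
claim_against = all(not s["supports_topic"] for s in studies_examining_topic)
print(claim_for, claim_against)  # True True

# The fallacy is treating either vacuous truth as a qualified antecedent
# for the consequent "therefore the topic is settled". An empty domain
# supports no consequent at all.
```

That both the claim and its contrary come out true is the tell: the antecedent carries no evidential weight.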

Verisimilitude – an argument establishing a screen of partially correct information employed to create an illusion of scientifically accurate conclusion. The acceptance of specific assumptions and principles, crafted in such a fashion as to lead to a therefore simple resulting explanatory conclusion.

‘Vitamins Don’t Cure Cancer’ Fallacy – an informal fallacy wherein the task of proof or effect threshold assigned to the test subject is far in excess of, or out of context with, the contentions being made about the test subject. Examples: studies showing that supplements do not cure cancer, therefore supplements are all a waste of money; there are 28 types of depression, so inflammation is not related to depression; quality-of-life improvement is not a ‘medical outcome.’ Ridiculous contentions which are not backed by the studies they cite as evidence.

War Footing – a human conditioning wherein oligarch powers outline the purpose of life and the purpose of science as being critical in some theoretical final conflict. A conflict which demands supreme time and sacrifice on the part of its victims (the participants), sustained under the guise of WWII soldier or Cool Hand Luke style silent suffering. A social delusion of extremity, which benefits big governments and large socialist corporations, while at the same time stealing lives, in order to establish control and make money.

Wittgenstein Error (Descriptive) – the contention or assumption that science has no evidence for, or no ability to measure, a proposition or contention – when in fact it is only the crafting of language, a language limitation, the lack of a cogent question, or (willful) ignorance on the part of the participants which has limited science, and not, in reality, science’s domain of observability.

Describable: I cannot observe it because I refuse to describe it.

Corruptible: Science cannot observe it because I have crafted language and definition so as to preclude its description.

Wittgenstein Error (Epistemological) – the contention that a proposition must be supported by empirical data or else it is meaningless, nonsense or useless, or that a contention which is supported by empirical data is therefore sensible, when in fact the proposition can be framed into meaninglessness, nonsense or uselessness based upon its underlying state or lacking of definition, structure, logical calculus or usefulness in addressing a logical critical path.

bedeutungslos – meaningless or incoherent. A proposition or question which resides upon a lack of definition, or which contains no meaning in and of itself.

unsinnig – nonsense or non-science. A proposition of compromised formal structure or not framed in a scientifically valid form of reduction. Feynman ‘not even wrong.’

sinnlos – mis-sense, logical untruth or lying. A contention which does not follow from the evidence, is correct at face value but disinformative, or is otherwise useless.

Yule-Simpson Paradox – a trend which appears in different groups of data can be manipulated to disappear or reverse (see Effect Inversion) when those groups are combined.
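The reversal is easy to demonstrate with the classic kidney-stone treatment figures widely reproduced as the textbook example of this paradox: treatment A wins inside every stratum, yet loses once the strata are pooled.

```python
# Yule-Simpson paradox: A beats B within each group, B beats A overall.
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

for name, arms in groups.items():
    ra, rb = rate(*arms["A"]), rate(*arms["B"])
    print(f"{name}: A {ra:.0%} vs B {rb:.0%} -> A better: {ra > rb}")

# Pool the strata and the trend reverses:
tot_a = tuple(map(sum, zip(*(g["A"] for g in groups.values()))))
tot_b = tuple(map(sum, zip(*(g["B"] for g in groups.values()))))
print(f"pooled: A {rate(*tot_a):.0%} vs B {rate(*tot_b):.0%} "
      f"-> A better: {rate(*tot_a) > rate(*tot_b)}")
# A is better in both groups (93% vs 87%, 73% vs 69%), yet B appears
# better in the pooled data (78% vs 83%) — because the strata were
# unevenly loaded across the two treatment arms.
```

The manipulation potential the entry names lies in choosing whether to report the stratified or the pooled view.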

Zombie Theory – pseudo-theory or an old paradigm of understanding which is still being enforced by a portion of the scientific community as consensus, when indeed it is not – and/or the topic is undergoing a Kuhn-Planck paradigm shift. A walking dead placeholder theory which is abused and contorted to both explain everything unknown, and as well uses wicker man rationalizations in order to excuse its shortfalls long after it has ceased to serve any valuable explanatory potential.

The Ethical Skeptic, “The Tree of Knowledge Obfuscation: Misrepresentation of Science”; The Ethical Skeptic, WordPress; Web, https://theethicalskeptic.com/2009/09/24/the-tree-of-knowledge-obfuscation-misrepresentation-of-science/
