The Ethical Skeptic

Challenging Pseudo-Skepticism, Institutional Propaganda and Cultivated Ignorance

The Ethical Skeptic’s Reference to Commonly Employed Fallacious Skepticism

The following is The Ethical Skeptic’s list, useful in spotting both formal and informal logical fallacies, cognitive biases, statistical broaches and styles of crooked thinking on the part of those in the Social Skepticism movement.¹ It is categorized by fallacy employment context so that it can function as a context appropriate resource in a critical review of an essay or article by a thought enforcing Social Skeptic. To assist this, I have created an intuitive taxonomy of nine contextual fallacy mischaracterization or misrepresentation usage groups: Opponents, Data, Method, Science, Argument, Assumptions, Groups, Self and Authorities.



Mischaracterization of Opponents


Non Sequitur Accuse – a response which does not follow the logic of a contention made, which furtively seeks to position the contention maker falsely into a prescribed camp of irrationality or non sequitur relationship to the subject being considered. This will usually be delivered in the form of a one liner, memorized talking point or weapon word.

False Ally – citing that even the most extreme members of an opponent’s assumed group of inclusion agree with the position of the proponent’s argument.

Google Goggles – warped or blinded perception cultivated through reliance on web searches for one’s information and understanding. Vulnerability to web opinions where every street doubter pretends to be an authority on science, every Cabal member and celebrity is falsely lauded and every unapproved person is disparaged through hyperbole and misinformation.

Style over Substance Fallacy – the undermining of an opponent and their argument or data by citing that it looks too pushed, packaged or promoted (for money) in order to be deemed acceptable or believable.

Credulity Fallacy – the contention or implication that an opponent or group of opponents does not practice evidence based, rational or critical thinking simply because they disagree with the proponent or can be pigeon holed into a group disdained by the proponent or the proponent’s organization.

Appeal to Motive – a pattern of argument which consists in challenging a thesis or a set of data or observations by calling into question the motives of its proposer.

Lie Jerk Response – the psychological defense reaction a fake skeptic will employ, wherein they accuse a person who relates a challenging first hand observation or piece of evidence, of lying or exaggeration.

Ad Feminam – the discrediting of an argument, data or observation by appealing to apparent or assumed bias towards, or irrelevant personal considerations concerning, women, especially prejudices against them on the part of an opponent. Applies also in cases of reference to minorities or the disadvantaged.

Abusive Ad Hominem – usually involves attacking the traits of an opponent, including implying their lack of critical thinking skills or rationality, or their membership in a pigeon hole of stupidity, which are irrelevant to the argument at hand, as a means to invalidate the arguments of the opponent. This includes an attack on a person even when there is evidence to support it. The attack is still ad hominem if it had nothing to do with the pre-existing discussion context.

Ad Hominem Fallacy – inappropriately citing the objections of an opponent as constituting an ad hominem attack, when the personalized objections are simply made as counter evidence to the claims a proponent has made regarding themselves.  An exception occurs when the personalized objections could not have been possibly ascertained by objective research or knowledge, or are simply made for pejorative argument.

Masked Man Fallacy – the contention that an opponent cannot be scientific or rational because the skeptic knows a good scientist or rational thinker when they see one; and the opponent is not one.

Affirmative Characterization from a Negative Premise – believers in this subject are typically credulous, and credulous people do not command science; therefore all believers in this subject are pseudo scientists.

Characterization of the Undistributed Middle – all pseudo scientists promote un-vetted data, the proponent of this argument promoted un-vetted data (in my view), therefore the promoter of this argument is a pseudo scientist.

Ecological Fallacy – where inferences about the nature of individuals are deduced from inferences about the group to which those individuals belong.

Fallacy of Composition – the contention or implication that an opponent’s belonging to a specific group of people, inside of which are held extreme positions or actions, is indicative of their adherence to those same positions and action sets; and further then that the membership in this group invalidates their ideas, observations or data.

Taxonomy Fallacy – the illegitimate assignment or characterization of a proponent of a set of ideas, into a disfavored, extreme or fanatical group; in an effort to discredit the set of ideas without undertaking or possessing the research, evidence or qualifications necessary to justify such assignment.

Belief Accusing – the pejorative categorization of an individual expressing a contention into a stereotypical ‘true believers’ box pertaining to such contention. The fallacy of presumption and insult which implies that the victim is neither intelligent enough, informed enough, nor of sufficient social or credible status to merit possession of an epistemologically derived conclusion; therefore they must only ‘believe.’

Credulity Accusing – accusing a person of practicing pseudoscience and credulity simply because they are regarding an outlier idea.  A credulist may be wrong, but as long as they are not pretending to represent Science or claim to be using the Scientific Method, they are not practicing pseudoscience; rather, are merely guilty of being receptive to an untested conclusion.

Flattery Apology – a specific type of appeal to emotion where an argument is made condemning a group of people in which the opponent is included, while citing that the opponent is the acceptable and rational version of the member of that group (i.e. present company excepted, etc.).

Wishful Accusing – accusing a person of making a decision with regard to data or observations according to what the opponent believes might be pleasing to imagine on their part, rather than according to evidence or reason. The minor implication imbedded being that the opponent exhibits no such fault.

Forrest Gump Bias/Class Bias – the regard of a person’s ideas, contentions, data or observations as being unacceptable, simply because the person is genetically, economically, physically, mentally, or socially different or perceived to be of a lower class level by an observer.

Fundamental Attribution Bias – when one attributes the behavior of others chiefly to dispositional factors, discounting the situational factors that may affect a person’s behavior; yet views one’s own behavior as stemming from chiefly situational factors.

Psychogenetic Fallacy – inferring why an argument is being used, associating it to some psychological reason, then assuming it is invalid as a result. It is wrong to assume that if the origin of an idea comes from a biased or credulous mind, then the idea itself must also be false.

Judgmental Language – insulting or pejorative language employed to influence the recipient’s judgment through stirring of emotion, especially anger, by intimidation or by implying the superior status or rationality of the claimant. This will many times be delivered in the form of a one liner, cliché or weapon word.

Loaded Language – discrediting, bias implying or pejorative language employed through leveraged equivocation and innuendo in an attempt to make self, or a topic of discourse appear superior to an opponent or opponent’s subject or contention.

Tu Quoque (“you too”) – an argument stating that since the opponent has committed a fallacy, logical broach or hypocrisy, then the proponent should not be held to account for violations, or stands correct in their position, due to the opponent having made the same or similar errors.

Appeal to Hypocrisy – the argument states that a certain position is false or wrong and/or should be disregarded because its proponent fails to act consistently in accordance with that position.

Negativity Effect – the tendency of people, when evaluating the causes of the behaviors of a person they dislike, to attribute their positive behaviors to the environment and their negative behaviors to the person’s inherent nature or weaknesses.

Omission Bias – the tendency to judge harmful actions of opponents as worse, or less moral, than equally harmful omissions on the part of allies.

Restraint Bias – the tendency to overestimate one’s ability to show restraint or demonstrate character in the face of temptation to behave badly. The overestimation of the acts of others as being beneath your level of sophisticated self control.

Stereotyping – expecting a member of a group to possess certain characteristics reasonably or unreasonably assigned to that group, without having actual information about that individual.

Extrinsic Incentives Bias – an exception to the fundamental attribution error: viewing others as driven by (situational) extrinsic motivations, while viewing oneself as driven by (dispositional) intrinsic motivations.

Illusion of Transparency – people overestimate others’ ability to know them, and they also overestimate their ability to know others.

Semantics Jousting – the twisting of the context inside which a quotation or idea has been expressed such that it appears to support a separate argument and inappropriately promote a desired specific outcome.

Misrepresentation of Data


Availability Error – to adjudicate the answer to questions according to the examples that come most easily to mind, rather than a wide or representative sample of salient evidence.

Equivocation – the misleading use of a term with more than one meaning, sense, or use in professional context by glossing over which meaning is intended in the instance of usage, in order to mis-define, inappropriately include or exclude data in an argument.

Continuum Fallacy – erroneous rejection of a vague claim or loosely defined data set simply because it is not as precise as one would like it to be.

Trivia Fallacy – the rejection of an entire set of data by the pointing out of one questionable or disliked element inside the data.

Antiquing Fallacy – the dismissal of an entire field of data by showing its false, hoax based or dubious past inside a set of well known anecdotal cases. Also the instance where a thesis is deemed incorrect because it was commonly held when something else, clearly false, was also commonly held.

Monkey Suit Fallacy – the dismissal of an entire field of data simply by abusing a position of authority, or citing an authority, declaring the data all to be the result of hoaxes.

Observation Denial Special Pleading –  a form of spurious data and observation dismissal where a proponent introduces favorable details or excludes unfavorable details regarding the observation, through alleging a need to apply additional considerations, without proper criticism or vetting of these considerations.

Existential Fallacy of Data – the implication or contention that there is an absence of observation or data supporting an idea, when in fact no observational study at all, or of any serious import has been conducted by science on the topic at hand.

Appeal to Apati Fallacy – ‘Appeal to the hoax’ fallacy of presumption and irrelevance. The attempt to impugn a subject by citing or fabricating a history or incident involving a hoax of one or more of the subject’s contentions. The fallacy resides in the fact that if it exists, there is porn of it; and likewise, whether it exists or not, there is a hoax of it.

Associative Condemnation – the attempt to link controversial subject A with personally disliked subject B, in an effort to impute falsehood to subject A through the association of some idea or keyword common to both topics. Guilt through association and lumping all subjects into one subjective category. This typically will involve a context shift or definition expansion in a key word as part of the justification.

Observation vs Claim Blurring – the false practice of calling an observation of data, a ‘claim’ on the observers’ part.  This in an effort to subjugate such observations into the category of constituting scientific claims which therefore must be supported by sufficient data before they may be regarded by science.  In fact an observation is simply that, a piece of evidence or a fact, and its false dismissal under the pretense of being deemed a ‘claim’ is a practice of deception and pseudoscience.

Confirmation Bias – the tendency to immediately accept propaganda published in the opponent’s favored group, and to reject observations, data or ideas which do not fit the opponent’s favored models.

Furtive fallacy – undesired data and observations are asserted to have been caused by the malfeasance of researchers or laymen.

Historian’s fallacy – occurs when one assumes that decision makers of the past viewed events from the same perspective, and having the same information, as those subsequently analyzing the decisions; therefore the levels of bunk, belief and credulity can be used to dismiss past events involving historically credible persons, just the same as they are employed in modern discourse.

Ignoring as Accident – exceptions or even massive sets of data and observational counter-evidence to an enforced generalization are ignored as anecdotes or accidents.

Fallacy of Relative Privation – dismissing an avenue of research as a waste of scientists’ time, due to the existence of more important, but unrelated, problems in the world which require priority research.

Base Rate Fallacy – an error in thinking where, if presented with related base rate information (i.e. generic, general information) and specific information (information only pertaining to a certain anecdotal case), the mind tends to ignore the former and focus on the latter in characterizing the whole set of relevant data regarding a subject.
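A worked numeric sketch of this error (with hypothetical figures): a screening test that detects a condition 90% of the time, with a 9% false positive rate, still yields mostly false positives when the base rate is only 1%, because the general base rate information dominates the specific test result.

```python
# Hypothetical figures: a screening test on a condition with a 1% base rate.
base_rate = 0.01        # P(condition) - the general, base rate information
sensitivity = 0.90      # P(positive | condition) - the specific information
false_positive = 0.09   # P(positive | no condition)

# Bayes' theorem: P(condition | positive)
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive

print(f"P(condition | positive) = {posterior:.3f}")  # roughly 0.092
```

Focusing only on the specific information (the 90% hit rate) while ignoring the base rate leads one to wildly overestimate how telling a single positive case really is.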

Experimenter’s Bias – the tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.

Less is Better Bias – the tendency to prefer a smaller set of data to a larger set of data judged separately, but not jointly, so as to capitalize off the increased variability of the small set of data as it supports an extreme or conforming opinion.

Not Invented Here Bias – aversion to contact with or use of products, research, standards, or knowledge developed outside a group in which one is a member or with which one associates.

Pareidolia Bias –  a presumption that any challenging observation can only be solely the result of vague and random stimulus (often an image or sound) errantly perceived as significant by the observer.

Semmelweis Reflex – the tendency to reject new evidence that contradicts one’s held paradigm.

Survivorship Bias – concentrating on the people or data that “survived” some process and inadvertently overlooking those that didn’t because of their lack of observability.

Shared Information Bias – known as the tendency for group members to spend more time and energy discussing information that all members are already familiar with (i.e., shared information), and less time and energy discussing information that only some members are aware of (i.e., unshared information).

Google Reaction Effect – The tendency to discount as unimportant or invalid, information that can be found readily online by using Internet search engines.

Google Goggles – warped or errant information cultivated through reliance on web searches as one’s resource or base of understanding. Vulnerability to web opinions where every street doubter can dismiss observations as a pretend authority on science, every claim represented as being by science is immediately accepted and every public observation is deemed a fable or a hoax.

Von Restorff Effect – the bias inducing principle that an item that sticks out is more likely to be remembered than other items.

Zeigarnik Effect – states that people remember uncompleted or interrupted tasks better than completed tasks. This imparts a bias to refute arguments or ideas which are unfinished.

Apophenia Bias – the immediate dismissal of data as being manufactured, mis-analyzed, or reflecting random patterns which are ascribed empirical meaning, without having conducted the research into the data, nor possessing the background in the discipline involved in order to be able to make such a claim.

Pleonasm – the use of more words or parts of words than is necessary for clear expression, in an attempt to load language in support of a contention, or to add judgmental bias to it. A form of hyperbole.

Medium Fallax – the tendency to regard or promote the mean (μ) or other easily derived statistic as being comprehensively descriptive of the whole body of a set of data; or the process of misleading with statistical indications as to the makeup and nature of a body of data. “I’ve got my head in the oven, and my ass in the fridge, so I’m OK.”
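The oven-and-fridge quip can be sketched numerically (hypothetical temperatures): two data sets with an identical mean can describe utterly different realities, which only a dispersion measure exposes.

```python
import statistics

# Hypothetical temperatures (deg F): same mean, very different situations.
oven_and_fridge = [150, 30]   # head in the oven, seat in the fridge
comfortable = [90, 90]        # uniformly warm room

for label, temps in (("oven/fridge", oven_and_fridge), ("uniform", comfortable)):
    print(label, statistics.mean(temps), statistics.stdev(temps))
```

Both means are 90, but the sample standard deviations (about 84.9 versus 0.0) tell the story the mean alone conceals.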

Furtive Confidence Fallacy – the refusal to estimate, grasp or apply principles of statistical confidence to collected data and observed arrival distributions, as a means of falsely bolstering the perceived validity of, or avoiding the signalling of validity in, an observation set or body of data. The act of prematurely declaring, or doggedly denying, a multiplicity of anecdote to be equal to data.

Amateur Confidence Fallacy – The act of substituting simple probability math to manipulate outcomes, because one does not understand the difference, or because it looks like the same thing to a layman, in instances where only confidence intervals can be correctly applied under the scientific method.
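A minimal sketch of the distinction (hypothetical sample): reporting 7 successes in 10 trials as a flat ‘70% probability’ is the amateur move; even a rough normal-approximation (Wald) 95% confidence interval shows how loosely such a small sample actually constrains the estimate.

```python
import math

# Hypothetical sample: 7 successes in 10 trials.
successes, n = 7, 10
p_hat = successes / n                          # naive "70% probability" claim
z = 1.96                                       # 95% two-sided normal quantile
half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)

print(f"point estimate: {p_hat:.2f}")
print(f"95% CI: [{p_hat - half_width:.2f}, {p_hat + half_width:.2f}]")
```

With only ten trials the interval spans nearly half the probability range (roughly 0.42 to 0.98); the bare point estimate conveys a false precision.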

Misrepresentation by Anecdote or Method

Cherry Picking – pointing to a talking sheet of circulated individual cases or data that seem to confirm a particular position, while ignoring or denying a significant portion of related context cases or data that may contradict that position.

Peer Review Gaming – when a study is accused of ‘not following the scientific method,’ when in fact it was denied method, via blocked access to peer review channels, journals and protocols.

Fake Hoax Ethics Error – when one errantly regards a hoax which is purposely constructed, then revealed to ‘show how easy it is to fake this stuff,’ as standing exemplary of ethical skeptical conduct.

Fallacy of Composition by Null Result – the contention that a null result from an experimental observation in an unconstrained domain means that the evidence supporting the idea in question does not exist. The comparison is invalid when a null result measured in an unconstrained field of measure is assumed comparable to a null result in a constrained domain of measure, as in testing to effect a medical diagnosis.

Promotification – deception or incompetence wherein only predictive testing methodology was undertaken in a hypothesis reduction hierarchy when more effective falsification pathways or current evidence were readily available, but were ignored.

Predictive Fallacy – the fallacy of applying predictive studies which show the lack of evidence of a particular set of data, in an unconstrained domain of evidence, and presuming this to be scientifically indicative of the Evidence of Absence.

Promotification Pseudo Science – especially in discovery science methodology, the pseudoscience practice of developing only (or forcing the sponsor of an idea/set of observations, as a first priority, to fully develop only) evidence in support of conventional or conforming explanations of the data in question, or a series of predictive-only tests which merely serve to confirm those explanations. The act of advertising this methodology as being representative of the ‘scientific method.’

As Science as Law Fallacy – the implication or assumption that something is ‘innocent until proven guilty’ under the scientific method, when in fact this is an incorrect philosophy of hypothesis reduction.

King of the Hill Science (Defaulting) – the pseudo scientific practice of assigning a favored and un-testable hypothesis the status as Null Hypothesis; and then further proclaiming it to be the prevailing conclusion of science, without any supporting evidence or testing, until such time as it can be defeated by new competing science.

Transactional Occam’s Razor Fallacy – the false contention that a challenging claim or observation must immediately be ‘explained.’ Sidestepping of the data aggregation and intelligence steps of the scientific method. The boast of claiming to know which question should be asked under the scientific method.

Proof Gaming – the pseudoscience of forcing the proponent of a construct or set of data, to immediately and definitively prove their contention, circumventing the scientific method and aid of science, before the contention would ostensibly be allowed to be researched by science.

Hedging – using words with ambiguous meanings, for the planned instance wherein changing the meaning of them later will provide for a plausible way to protect one’s reputation, in case one is found to be wrong on a position of opposition.

Popper Fallacy – when a predictive study confirming a hypothesis is abused to dismiss falsification based data or a competing hypothesis, because confirmatory evidence is easy to find and falsification evidence is comparatively of a higher rigor in origin.

Kettle Logic – using multiple inconsistent arguments or discipline examples to defend a specific position or outcome at all costs.

Shield Effect – when an arguer in a valid matter of discourse drops any need to reference diligent research, method, or data collection in support of their contended position because a higher visibility arguer, or member of their club, has enjoined the discussion on their side.

Actor-Observer Bias – in a situation where a person experiences something negative, the individual will often blame the situation or circumstances. When something negative happens to another person, people will often blame the individual for their personal choices, behaviors and actions.

Moving the Goalposts – argument in which evidence successfully presented in response to a specific demand for proof of an observation is dismissed and some other (often greater) evidence is then demanded.

Plausible Post Hoc Ergo Propter Hoc Fallacy – Latin for “after this, therefore because of this” (faulty cause/effect, coincidental correlation, correlation without causation) – X could have plausibly happened, and could have plausibly caused Y, then Y happened; therefore X, and nothing else, caused Y.

Proof by Verbosity – submission of others to an argument too complex, meandering and verbose to reasonably deal with in all its intimate details.

Proof by Celebrity – submission of others to an argument so over-addressed by biased celebrities, and so disdained and fraught with media ridicule, that it cannot reasonably be dealt with at any relevant depth or via salient data or argument.

Prosecutor’s Fallacy – a low probability of valid detections does not mean a low probability of some valid detection or data being found.
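A short numeric sketch (hypothetical figures): even if each individual detection is only 1% likely to be valid, the chance that at least one among many independent detections is valid can approach certainty.

```python
p_valid = 0.01    # probability any single detection is valid (hypothetical)
n = 500           # number of independent detections

# Complement rule: P(at least one valid) = 1 - P(none valid)
p_at_least_one = 1 - (1 - p_valid) ** n
print(f"P(at least one valid) = {p_at_least_one:.3f}")  # about 0.993
```

A low per-detection probability is thus entirely consistent with near certainty that some valid detection exists in the set.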

Probabilistic Fallacy – the presumption that one holds enough data to determine what is probable and improbable in a field of a set of data.

Proving Too Much – using a form of argument to counter observations or ideas, that if it were valid, could imply extrapolated absurd conclusions which cannot be valid, therefore the base argument is invalid, regardless of the data.

False Analogy – an argument by analogy in which the analogy is poorly suited, used to disprove a challenging or disdained construct.

Broad or Deep Fallacy – the habit of a pretend researcher to go deep into details on a subject possessing thin evidence, or alternately as the situation may warrant, to examine only old monkey suit stories or broaden the subject being considered sufficiently enough to include numerous anecdotes of a ludicrous or dismissed nature in an effort to avoid addressing a current body of robust observational evidence.

Discounting Vividness – the invalid presumption that all types of eyewitness testimony are universally faulty, and further, that those involving describing an occurrence in vivid or emotional detail, even if it is an exceptional occurrence, are immediately suspect and should be discredited.

Untouchable Generalization – a condemning or dismissing generalization that comes with qualifications that eliminate so many cases which could falsify the derogatory claim, that what remains of it is much less impressive than the initial statement might have led one to assume. Yet the defamation stands as fiat knowledge and recitation nonetheless.

Slippery Slope – asserting that a relatively small first step in accepting data or ideas inevitably leads to a chain of related events culminating in some significant impact/event that should not happen, thus the first step should not happen. While this fallacy is a popular one, it is, in its essence, an appeal to probability fallacy.

Regressive Bias – a certain state of mind wherein perceived high likelihoods are overestimated while perceived low likelihoods are underestimated.

Conservatism – a certain state of mind wherein the tendency to dismiss perceived lower likelihood events, or to change one’s mind very little when faced with their veracity, is regarded as an act of rationality.

Focusing Effect – the tendency to place too much importance on one aspect of an event.

Information Diversion – the tendency to continually seek new information on a topic at hand in order to imply the invalidity or indeterminate nature of the subject, or to distract focus away from more relevant but disliked data.

Observer Expectancy Effect – when a researcher expects a given result and therefore unconsciously manipulates an experiment or scientific method or misinterprets data in order to find that expected result.

Planning Fallacy – the tendency to underestimate the amount of science and method involved in adequate address of a subject, or of task-completion times conducted therein.

Reactance – the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to force you out of a constrained adopted choice.

Unit Bias – the tendency to want to finish a given unit of a task or an item to completion before beginning work on, or considering another avenue of research.  The effort to avoid unwanted data by remaining fixated on legacy avenues of research, as a pretense.

Misinformation Selectiveness – Cherry picking of eyewitness data through the belief that memory becomes less accurate because of interference from post-event information,  except for information which a claimant happens to favor.

Reactive Dissonance (in business and legal domains, also called ‘malfeasance’) – the fallacious habitual mindset of a researcher or doctrine of a group conducting research, wherein when faced with a challenging observation, study or experiment outcome, to immediately set aside rational data collection, hypothesis reduction and scientific method protocols in favor of crafting a means to dismiss the observation.

Misrepresentation of Science

Correlation to Causality Leap – contending that two events which occur together have a cause-and-effect relationship proven by science, since statistics are used to describe them.

Appeal to Skepticism Fallacy – the declaration, assumption or implication that a consensus skeptical position on a topic is congruent with the consensus opinion of scientists on that topic.

One-Liner – this refers to a cliché that is a commonly used phrase, or folk wisdom, sometimes used to quell cognitive dissonance. It is employed to end and win an argument and imply that science has made a final disposition on a matter long ago, when indeed no such conclusion has ever been reached.

Appeal to Scientists Fallacy – an argument misrepresented as a premise held true by the prevailing group of scientists; or one which concludes a hypothesis (typically a belief) to be either true or false based on whether the premise leads to a more successful career in science.

Appeal to Probability – the false contention of a skeptic that the most probable, simple, or likely outcome in a set of highly convoluted but unacknowledged assumptions, is therefore the compulsory or prevailing conclusion of science.

Novella Shuffle – the sleight of hand mis-definition of protocols of the scientific method or equivocation in relating its principles or the process of peer review, in such a way as to deceive the media and general public into incorrectly understanding a disdained topic or observation or accepting a pseudo scientific approach as constituting actual science.

Verisimilitude – an argument establishing a screen of partially correct information employed to create an illusion of a scientifically accurate conclusion. The acceptance of specific assumptions and principles, crafted in such a fashion as to lead to a simple resulting explanatory conclusion.

Existential Occam’s Razor Fallacy – the false contention that the simplest explanation tends to be the scientifically correct one. Suffers from the weakness that myriad and complex underpinning assumptions, all of which tender the appearance of ‘simplicity,’ have not been vetted by science.

Existential Fallacy (of Science) – the implication or contention that there is no science supporting an idea, or that science has rejected an idea, when in fact no scientific study at all, or of any serious import has been conducted on the topic at hand.

Dismissible Margin Fallacy – presuming that proponents inside a body of embargoed science constitute no more than a couple percentage points or less of all scientists, or that which is less than the Michael Shermer dismissible margin, of the larger body of scientists advised and educated by SSkepticism.

Perfect Solution Fallacy – when solutions to challenging observations are rejected because they are not perfect or the sponsors of the underlying ideas are not perfect.

Argumentum Ad Baculum (appeal to the stick, appeal to force, appeal to threat) – a counter argument made through coercion or threats of force on the part of a Social Skeptic, via the media, one’s employment or on one’s scientific reputation.

Appeal to Tradition (argumentum ad antiquitam) – a conclusion advertised as proven scientifically solely because it has long been held to be true.

Bandwagon Effect – The tendency to do (or believe) things because many in the Social Skeptic community do (or believe) the same. See Margold’s Law.

Epistemic Commitment – a compulsion on the part of a proponent to uphold the factual assertions of a given proposition, and to tender apologetic or No True Scotsman pleading exceptions when faced with contraindicating data. It contrasts with dogma in that epistemic commitment may be released through respectful discourse, whereas dogma typically is not.

Belief Bias – an effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion, or suitability under their acknowledged or unacknowledged set of beliefs.

Nero Taunting – when one publicly attacks a consumer or public need to seek out a solution, regarding a critical matter in their lives which science has not adequately addressed or researched. Usually indicated by a lack of proposed solutions and a high degree of disdainful or indignant media clamor over a penultimate set fallacy – the hyperbole over the footprint of science and its ability to address the need.

Sunk Cost Skepticism – the phenomenon where SSkeptics justify increased investment or fanaticism in a construct or belief, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Fanaticism is directly related to the level of nagging and cumulative inner doubt. Also known as the sunk cost fallacy.

Overconfidence Effect – excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.

No True Scotsman Pleading – this fallacy modifies the subject of an assertion to exclude the specific case, or others like it, offered by an opponent, in completely ad hoc fashion and without reference to any specific objective rule allowing for the exclusion.

Pro Innovation Bias – the tendency toward excessive optimism about technology’s or science’s ability to shed light on a subject or advance understanding, while often failing to identify their limitations and weaknesses, and habitually dismissing all other methods.

Reactive Devaluation – devaluing proposals, observations, data or ideas only because they purportedly originated with an adversary group or individual.

Status Quo Bias – the tendency to like things to stay relatively the same, even in the face of necessity and new observations.

Projection Bias – the tendency to unconsciously assume that others (or one’s future selves) share one’s current emotional states, thoughts, ideas, beliefs and values.

System Justification – the tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual, intellectual and collective self-interest.

Illusion of Truth Effect – that people are more likely to identify as true statements those they have previously heard (even if they cannot consciously remember having heard them), regardless of the actual validity of the statement.

Educative Difficulty Effect – that information acquired in a course of academic instruction or that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered or validly applied.

Science Faction Bias – the forcing of authors, dramas, screenplays, movies and storytellers of science fiction to compulsively conform to an observer’s personal version of science. An irritation with imagination if it wanders into realms which disagree with the observer’s personal ontology, sold as being indignant over violations of established science.

Non Rectum Agitur Fallacy – a purposeful abrogation of the scientific method through the framing and asking of the wrong, ill prepared, unit biased or invalid question, conducted as a pretense of executing the scientific method on the part of a biased participant.

Praedicate Evidentia – hyperbole in extrapolating or overestimating the gravitas of evidence supporting a specific claim, when only one test of merit has been run, or insufficient hypothesis reduction has been performed on the topic, or few or no such studies have indeed been conducted at all.

Non Sequitur Evidentia – the false claim that scientific studies have proven or indicated a proponent’s claim to knowledge, when in fact such studies have addressed an equivocally different question or a completely different proof altogether.

Fictus Scientia – when one uses skepticism instead of science to develop knowledge and make conclusions. The substitution of skepticism, or a logically or socially twisted form thereof, into the inappropriate role of acting or speaking on behalf of science or scientists.

Fictus Scientia Fallacy – a contention, purported to be of scientific origin, when in fact, it is not.

Ficta Rationalitas – a disposition assessment, contended to be an outcome of rationality or scientific skepticism, when in fact it originates from flawed method and/or personal bias.

Conflation Bias – the tendency of a proponent to be unable or unwilling to distinguish recollection between personal religious or unproven beliefs, and actual accepted science; and the resulting extrapolation of science entailed therein.

Pedantic Smokescreen – the process of deluding oneself by means of, or of employing, the exclusive and unique principles of science in order to obscure and justify activities which would otherwise constitute fraud and malfeasance in business and legal domains.

Reification Fallacy – assuming that sciencey-sounding words refer to existing and mature elements of science, and that the meanings of words are implicitly qualified by the things they refer to.

Google Goggles – warped or blinded understandings of science or scientific consensus bred through reliance on web searches for one’s information and understanding. Vulnerability to web opinions and misinformation where every street doubter pretends to be an authority on science.

Vacuous Truth – a statement that contends an absence or presence of a principle in an empty set domain, in order to imply the accuracy of a consequent argument. A contention that science holds no evidence in support of a topic which has not undergone scientific efforts at collecting evidence in the first place. Such a statement is technically true, but does not stand in correct relationship as a logically qualified antecedent in support of the consequent.

Misrepresentation of Argument


Predictive Counter to Singular Existential Statement – citing predictive evidence to counter a contention which is made as a Singular Existential Statement, i.e. the contention that x exists. Attempting to disprove the contention that something exists by citing the number of hoaxes or antithetical cases regarding the contended subject.

Predictive Promotion of a Universal Statement – citing predictive evidence to promote the idea that the set X is composed wholly and only of type x members. Attempting to show that all data in a contention is hoaxed by providing small-sample evidence of hoaxing.

Transactional Popper Demarcation Error – incorrectly citing a topic as being a pseudo science, when in fact its sponsors are seeking falsification based protocols to counter the antithetical premise to their case, or in the instance of predictive studies being employed simply to establish plurality for sponsorship inside the scientific method.

Existential Popper Demarcation Error – citing something as a pseudoscience simply because one does not like the topic, or the topic has had pretend science performed in its name in the past.

Red Herring – presentation of an argument that may or may not be logically valid on its own, but distracts the discussion away from a failing argument, as well as failing nonetheless to address the context of the issue in question or address its logical validity.

Begging the Point – the framing of a question from a desired answer in such a fashion that its desired conclusion is the only viable answer.

False Dilemma – committed when one implies that sufficient data exists such that a choice must be made between a constrained subset of options, when no such threshold of data actually exists.

Inverse Negation Fallacy – the strategy of undermining any study, proponent, media byte, article, construct, data, observation, effort or idea which does not fit one’s favored model, in a surreptitious effort to promote that favored model, along with its implicit but not acknowledged underpinning claims, without tendering the appearance of doing so; nor undertaking the risk of exposing that favored model or claims set to the scientific method or to risky critical scrutiny.

Truzzi Fallacy – the presumption that a position of skepticism or plausible conformance on a specific issue affords the skeptical apologist tacit exemption from having to provide authoritative outsider recitation or evidence to support a contended claim or counter-claim.

Fact Ambiguity Dipole – the relation of a fact which carries along with it, usually through equivocation or semantics jousting, a false and misleadingly impugning implication which supports a point wishing to be made by a skeptic.

Denial/Dissent Blurring – denial obfuscation efforts by a SSkeptic being falsely passed off as informed dissent on their part. Conversely, spinning dissenters or those with opposing data as persons who are “Deniers.”

Kriging Leap – when an argument is touted as being supported by underpinning science or precision, when the contended conclusion is not in reality supported by or connected to the underpinning science or precision.  Jumping from theoretical science, glossing over intermediate principles, and directly to immediate application, in order to falsely bolster a desired position.

Reductionist’s Error – when one demands evidence recitation which cannot possibly be provided because the argument is over a point of philosophy or well established precedent of no particular origin. Demanding evidence for 2 + 2 = 4, or ‘charity is love.’

Twittertation Error – when one demands inappropriate or impossible comprehensive recitation in a chat venue which is not designed for comprehensive recitation.

Incomplete Comparison – in which insufficient information is provided to make a complete comparison of arguments between a disproved one, and a disfavored one an opponent is attempting to debunk.

Invalid Comparison – in which equivocating, inconsistent or errant information is provided to attempt a complete comparison of arguments between a disproved one, and a disfavored one an opponent is attempting to debunk.

Projection Error – when an argument is made that one’s own choices and perceptions can reliably be extrapolated to represent the same choices and perceptions of those constituting a general population holding to the same allegiances as the arguer.

Ontological Projection Error – when an argument is made that a moral choice of one’s own can reliably extrapolate to be the same choice made by a general population holding to the same ontology of the chooser. Because a contending atheist is moral, all other nihilists, atheists and persons under a culture teaching such ideas, will then choose to be moral as well.

Anger Bankruptcy – the habit of reacting in anger on the part of a faking or immature skeptic who has painted themselves into a corner logically, yet must win all arguments immediately at all costs. A habit of resorting to attacks, pejorative equivocation, insults, playground and social bullying, and surreptitious attempts to harm in order to ‘defeat’ an enemy they have crafted, in the instance where they are bankrupt of reason and evidence.

Intentionality Fallacy – the insistence that the ultimate meaning of a construct, idea or ideology must be consistent with the intention of the person from whom the original idea, concept or communication originated; and that no new or empirically improved version of its understanding may be tested.

Proof by Assertion – a proposition is reworded in a politically correct, jingo-ish, SSkeptic one-liner, or false professional way such as to hope that its re-expression will validate it, despite previous contradiction.

Flaw of Identity – mis-employment of the first classical law of Greek thought, regarding essence. Falsely contending that two things sharing a unique set of characteristic qualities or features, are indeed the same thing; or conversely that two things that have different essences are different things.

Objection sans Contexte – when an objection is raised to argue in opposition, which demonstrates a lack of salient understanding of the principle being argued against.

Presumptive Objection – when an objection is raised to argue in opposition, based on an a priori assumption of what the opponent is contending, or a prescribed version of what the objection raiser presumes or would like the opponent to be saying.

Appeal to Ridicule – an argument is made by presenting the opponent’s argument in a way that makes it appear ridiculous.

Straw Man – an argument based on misrepresentation of an opponent’s position. Any man can be made to appear irrational and vile, if his opponents only are allowed to speak on his behalf.

Normalcy Bias – the refusal to plan for, consider, or react to, a dramatic exception event or idea which has never happened or been considered before.

Rhyme as Reason Effect – rhyming statements are perceived as more truthful.

Subadditivity Effect – the tendency to judge probability of a broader argument to be less than the probabilities of the components of that same argument.

Risk Flippance – the tendency to judge the total risk of a series of transactional events to be equivalent to the risk identified for only one single event in the series.

Misrepresentation by Assumption

Begging the Question – falsely setting the starting point of an argument, or its foundational assumptions, such that the promoted conclusion is assumed as an inherent element or inevitable outcome of this starting position or set of assumptions.

Circular Cynicism – the practice of ensuring that a subject never possesses any valid scientific evidence through the fallacious step of declaring it to be a pseudoscience before investigation is ever undertaken by science.  Since the subject is a pseudoscience, all research by laymen can never be accepted as evidence, and since there is no evidence, then the subject is false and science should not study it, and since science will not study it and people research it with lay attempts at science, then it is a pseudoscience.

Bifurcation Fallacy – committed when a false dichotomy is presented, i.e. when someone is asked to choose between two options when there is at least one other option available.

Sponsorship Fallacy – the rejection of an entire methodological basis of a scientific argument and all its underpinning data and experimental history simply because one can point to a bad personality involved in the subject, hoaxes, old misconceptions about the subject or an errant conclusion which was drawn from the discipline, irrespective of the actual validity of its core scientific data and argument.

Bunk Nauseam Fallacy – the argument that a point is invalid, implying or citing incorrectly that the topic has been debunked many times over, and is now nothing but an irritating myth inside circles of stupidity.

Argumentative Definition – a prima facie equivocation which purports to describe the ‘true,’ ‘unique professional employment’ and/or ‘commonly accepted’ meaning of a term, while in reality stipulating an irregular or altered employment of that term, usually to support an argument the proponent is attempting to force.

Furtivis Miraculo Fallacy – ‘give us one free miracle, and we’ll explain all the rest.’ The scientific pretense of condensing all the magic involved in one’s epistemology into one single comprehensively explanatory miracle. Based on the philosophical premise that one all-comprising ridiculous assumption is more believable than a myriad of such assumptions.

Ad Nauseam Fallacy – the intolerance of an argument or a set of data through implying that it has been hashed and re-hashed over and over so much by science or sponsors, that everyone is tired of the subject and if there were anything true to it, it would have come out and been published already.

Appeal to Skepticism Fallacy – the argument assumption or implication that an opinion possesses authoritative veracity or a proponent possesses intellectual high ground simply through allegiance to a consensus skeptical position on a topic.

Argument from Ignorance – asserts that a proposition is true because it has not yet been proven false.

Appeal to Scientific Democracy – the contention that if the majority of scientists believe something to be true, regardless of epistemological merit, then it must be assumed as true.

Argumentum Ad Populum (appeal to widespread belief, bandwagon argument, appeal to the majority, appeal to the people) – where a proposition is claimed to be true or good solely because many people believe it to be so.

Argument from Fallacy – the false assumption that the simple act of catching an opponent in commission of a logical fallacy immediately invalidates all of their ideas, observations and data.

Denying the Antecedent – the contention that since a subject is a pseudoscience, then any of its constructs, theories, results, data and observations are invalid and anyone who considers them is a pseudo scientist.
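Unlike most entries in this list, denying the antecedent is a strictly formal fallacy, so its invalidity can be demonstrated mechanically by exhibiting a counterexample row in a truth table. A minimal sketch in Python (the function and variable names are illustrative, not drawn from any cited source):

```python
from itertools import product

def implies(p, q):
    # Material implication: p -> q is false only when p is true and q is false.
    return (not p) or q

# Denying the antecedent: from (P -> Q) and (not P), infer (not Q).
# The form is valid only if no row makes both premises true while the
# conclusion (not Q) is false.
counterexamples = [
    (p, q) for p, q in product([True, False], repeat=2)
    if implies(p, q) and not p and q  # premises hold, yet Q is true
]

print(counterexamples)  # a non-empty list proves the form invalid
```

The single surviving row (P false, Q true) is the counterexample: the conditional and the denied antecedent both hold, yet the consequent is true anyway.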

Rhetorical Tautology – employment of a phrase or principle in such a way as to imply that its offering as stated is of patently obvious, self-justifying or intrinsic epistemological merit, while evading issues of evidence or valid reasoning supporting the stated principle or phrase.

Penultimate Set Fallacy – the contention or implication on the part of a proponent of an idea that they personally hold enough validated conclusion base or data to assume that the gaps in their knowledge will have little or no effect on the veracity of their further conclusions; and therefore they should not be brought into question. The implicit claim being that the proponent holds a ‘next to the last thing knowable’ domain of knowledge on the topic in question. The ‘God of the Gaps’ one liner is typically employed as an apologetic by this false claimant to authority.

Dead Body Contraposition Fallacy – the contention that if there exists an element A, then there would be B observations which would result directly from the presence of A; for instance an animal A would result in a dead body, B. Therefore the absence of observation of B means that A does not exist.

Argument from Incredulity – the contention that because something is too difficult to imagine or appears impossible, this constitutes proof that it does not exist.

Pork-Barreling/Blurring – the practice of shifting the context of an accepted tenet of science or broadening the definitions involved in the principle, in order to appear to imply that science includes proof of additional ideas personally or religiously favored by the SSkeptic.  Blurring – to the converse, using the same tactics with opposing viewpoints to imply that science has condemned or disproved them; when in fact no such event has occurred.

Retrospective Causality – the argument that because some event has occurred, its occurrence must have been caused by a conforming plausible impetus, such as hype and hysteria, and not any other influence.

Naturalist’s Fallacy – wherein judgment is based solely on whether the subject of judgment fits one person’s a priori definition of what constitutes a ‘natural’ or ‘paranormal’ delineation.

Antiquing Fallacy – the dismissal of an entire field of data by showing its false, hoax based or dubious past inside a set of well known anecdotal cases. Also the instance where a thesis is deemed incorrect because it was commonly held when something else, clearly false, was also commonly held.

Mischaracterization of Groups (of People)

Lie of Allegiance – committed when a proponent of a specific side in a false dilemma argument misrepresents their membership group as holding one socially acceptable or attractive philosophy in name, yet that group in reality teaches/practices another less acceptable or extremist philosophy altogether.

Ergo Sum Veritas Fallacy – the assumption, implication or inference that an organization bearing a form of title regarding skepticism, is exempt from defamation laws or immediately holds de facto unquestionable factual or ideological credibility over any other entity having conducted an equivalent level of research into a matter at hand. The assumption, implication or inference that an organization or individual bearing a form of title regarding skepticism, adheres to a higher level of professionalism, ethics or morality than does the general population.

Fallacy of Exclusive Premises – there are believers and disbelievers, and some believers are gullible. Therefore no disbelievers are gullible.

Illicit Major Fallacy – all researchers of pseudoscience are irrational. No scientists study pseudoscience. Therefore, all the positions of scientists on pseudoscience are rational positions.

Illicit Minor Fallacy – all skeptics are rational thinkers. All scientists are rational thinkers. Therefore, all scientists are skeptics.
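The illicit minor form above (and its illicit major sibling) can likewise be falsified by exhibiting a single counterexample model. A hedged sketch in Python, treating each one-individual model as a truth assignment over the three predicates skeptic, scientist, and rational thinker (all names are illustrative):

```python
from itertools import product

def valid(premises, conclusion):
    """A form is valid iff no model makes every premise true and the conclusion false."""
    for model in product([True, False], repeat=3):
        s, c, r = model  # skeptic, scientist, rational
        if all(p(s, c, r) for p in premises) and not conclusion(s, c, r):
            return False, model  # counterexample found
    return True, None

# Illicit minor: All skeptics are rational; all scientists are rational;
# therefore all scientists are skeptics.
ok, counterexample = valid(
    premises=[lambda s, c, r: (not s) or r,   # all skeptics are rational
              lambda s, c, r: (not c) or r],  # all scientists are rational
    conclusion=lambda s, c, r: (not c) or s,  # all scientists are skeptics
)

print(ok, counterexample)
```

The counterexample is an individual who is a rational scientist but not a skeptic: both premises hold, the conclusion fails, so the form is invalid.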

Hoary Glory Bias – when one cites old ridiculous arguments from opposing groups from ancient times or older eras of scientific understanding, to serve as exemplary rationale as to the impeccable nature of, and false incumbent merit of, argument on the part of the arguer’s allegiance group. Usually employed while implying that modern opponents hold to similarly ridiculous versions of argument today.

Astroturfing – the attempt to create an illusion of widespread grassroots support for a policy, viewpoint, or product, where little such support in reality exists. Multiple online identities coordinate around celebrity siren calls, manufactured data, fake-hoax counter propaganda and shill pressure groups; all employed to mislead the public into believing that the position of the astroturfer is a socially acceptable, rational reality and/or a commonly held view.

Latet Misandry – the errant employment of positions of skepticism or channels of skeptical media to promote ideas or ‘scientific’ evidence supporting the denigration of males or men.

Fallacy of Composition – assuming that something true of part of a whole must also be true of the whole.

Inflation of Conflict – disagreement in a field of knowledge legitimizes an opponents’ assumption of the invalidity of that entire field.

Fallacy of Ludic Dismissal – the contention that the change in statistics with regard to upswing in the belief in a disdained topic can unequivocally be shown to be an effect of media, hype hysteria and promotional campaigns by pseudo scientists.

Reciprocating Effect – cause and effect are reversed, then reversed again, over and over in a chicken and egg relationship. The effector hysteria around an observation is said to be the cause of it, and then vice versa, ad infinitum. It assumes no validity to the basic genesis of the argument.

Hasty Generalization – basing a broad conclusion about a group on rumor, stereotype, a small sample set or scant observational experience.

Appeal to Fear – a specific type of appeal to emotion where an argument is made by increasing fear and prejudice towards the opposing side or group of people who support a disdained idea.

Appeal to Pity/Poverty/Morality (argumentum ad misericordiam) – an argument which attempts to cite the poverty level or objective refusal to seek money on the part of academics and Social Skeptics, as a way of assigning them unmerited objectivity inside a topic of pluralistic contention.

Appeal to Spite – a specific type of appeal to emotion where an argument is made through exploiting people’s bitterness, spite or political orientation regarding an opposing party; or implication that certain politically disdained groups adhere universally to specific set of beliefs.

Cheerleader Effect – the exploitation of the tendency for people or ideas to appear more attractive in a proactive group than in isolation.

Hostile Media Channel Effect – the tendency to see a media report or specific network as being biased and purveying only pseudoscience, owing to one’s own strong partisan views.

Just World Bias – the tendency for SSkeptics to want to believe that the world is fundamentally just and rational, causing them to rationalize an otherwise unconscionable injustice as deserved by the victim(s) for their being irrational.

Negativity Bias – psychological phenomenon by which humans have a greater recall of unpleasant memories associated with a disliked organization or concept, compared with positive memories of the same.

Group Attribution Error – the biased belief that the characteristics of an individual group member are reflective of the group as a whole or the tendency to assume that group decision outcomes reflect the preferences of group members, even when information is available that clearly suggests otherwise.

Ingroup Bias – the tendency for people to give preferential treatment to others, or to the ideas of others, whom they perceive to be members of their own groups.

Outgroup Homogeneity Bias – individuals see members of their own group as being relatively more varied than members of other groups.

Universal Attribution Error – in this error a person is likely to make an internal attribution to an entire group instead of the individuals or disparate factions within the group.

Hindsight Bias – the inclination to see past events or actions by people or groups as being more predictable than they actually were; also called the “I-knew-it-all-along” effect.

Scarecrow – a high visibility claim that an opposing group has proposed ideas which are of patently ludicrous viability; when in fact no such theories or ideas have been proposed by the disliked group, and moreover that the only broaching of such a construct, theory or idea originates solely from the claiming group itself. Extreme Straw Man.

Misrepresentation of Self (Appeal to Skepticism)

Lie of Allegiance – committed when a proponent of a specific side in a false dilemma argument misrepresents themselves as holding one socially acceptable or attractive philosophy in name, yet teaches/practices another less acceptable or extremist philosophy altogether.

Press Box Poser – pretending to be competent to critique, represent or act as an authority in an industry or discipline, when in fact one has never conducted a study, application research, formulated policy, run a business in, employed people in, filed a patent in, or otherwise conducted any diligent professional activity in the critiqued topic discipline.

Ergo Sum Veritas Fallacy – the contention, implication or inference that one’s own ideas or the ideas of others hold authoritative veracity simply because their proponent has declared themselves to be a ‘skeptic.’

Appeal to Skepticism Fallacy – the presumption or contention that taking a denial based or default dubious stance on a set of evidence or topic is somehow indicative of application of the scientific method on one’s part, or constitutes a position of superior intellect, or represents a superior critical or rational position on a topic at hand.

Argument from Self-Knowing – if P were true or false then I would know it as a skeptic; in fact I do not know it; therefore P cannot be true or false.

Cryptical Thinking Fallacy – the false claim by SSkeptics that they practice scientific or critical investigatory method within a topic of discourse. False skeptics advertise this as a honed skill which affords their opinions equal weight with a scientist, or superior credibility over any layman, on any particular topic they wish to dominate and condemn.

Fictus Skeptica – fake skeptic. Thinking that skepticism is an approach to evaluating claims in lieu of science’s role to perform such activities.  Thinking that since one’s personal version of ‘rational thinking’ emphasizes evidence and applies tools of science, therefore it can bypass having to employ actual science.

Hoary Glory Bias – when one cites old ridiculous arguments from opposing groups from ancient times or older eras of scientific understanding, to serve as exemplary rationale as to the impeccable nature of, and false incumbent merit of, argument on the part of the arguer or the arguer’s allegiance group.

Latet Misandry – the errant employment of positions of skepticism or channels of skeptical media to promote ideas or ‘scientific’ evidence supporting the hatred of males or men. To conceal a hatred of males behind a pretense of rational thinking or science.

Stooge Posing – attacks on a piece of data or an easily disprovable topic of credulity, employed as an effort to bolster an opponent’s record of debunking success and club ranking. This reputation then further allows for irrelevant and unmerited gravitas in addressing other arguments where data and observation do not support the goals of the opponent.

Attribution Bias – when one considers the traits of another to stem from situational factors that may affect a person’s behavior as opposed to dispositional factors; yet views their own traits as stemming from chiefly dispositional factors.

Argument from Silence – the pretense that the exhibiting of silence on one’s part is somehow indicative of higher intellect, ethics, rationality or knowledge and skill regarding a topic at hand.

SSkepticism Projection Fallacy – when one fails to apply skepticism, and default considers the way the Social Skepticism movement sees the world as the way the world really is.

Skeptical Psychologist’s Fallacy – an opponent presupposes the objectivity of his own Skeptical position when analyzing a behavioral event on the part of others.

Bias Blind Spot – the tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases and faults in others than in oneself.

Choice Supportive Bias – the tendency to remember one’s choices and professional judgement as more educated or accurate than they actually were.

Commonality Error – the tip of the hand accidentally committed by a faking skeptic when they bristle with disdain at consumers who are able to test such claims for accuracy in their own or home testing environments, and subsequently cite counter-evidence to the skeptic’s contention.

Fictus Scientia Fallacy – the furtive presumption that one possesses the status, education, experience, intellect, professional background, critical thinking skills, empirical evidence or rational basis from which to speak in lieu or on behalf of science or representing proper execution of the scientific method.

Google Goggles – warped or blinded perception of self status cultivated through the ease of promotion of false information on the web. Vulnerability of self perception where every street doubter perceives and promotes them self as an authority on science. (See Margold’s Law on the friendly reception given by the Cabal to this crooked thinking)

Curse of Knowledge Effect – when better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people; or perceive that their burden of knowledge cannot be fathomed by lesser-capable people, rendering them unable to practice critical or evidence based thinking.

Empathy Gap – the tendency to underestimate the influence or strength of feelings in oneself, and to over-estimate it in others.

Illusion of Validity – the belief that further acquired information or promulgated policy generates additional relevant data for predictions, or information to bolster a position, even when in reality it does not.

Fear as Doubt Fallacy – when doubt or skepticism serves as a psychological defense mechanism to protect one from a subject which frightens the one feigning doubt or skepticism.

Moral Credential Effect – the tendency of a track record of non-prejudice to increase subsequent prejudice.

Social Desirability Bias – the tendency to over-report socially desirable characteristics or behaviors in oneself, and to under-report socially undesirable characteristics or behaviors.

Dunning-Kruger Effect – an effect in which incompetent people fail to realize they are incompetent because they lack the skill or maturity to distinguish between competence and incompetence.

False Consensus Effect – the tendency for people to overestimate the degree to which others agree with them.

Illusion of Asymmetric Insight – people perceive their knowledge of their competitors to surpass their competitors’ knowledge of them.

Illusion of Superiority – overestimating one’s desirable qualities, and underestimating undesirable qualities, relative to other people.

Naive Cynicism – expecting more egocentric bias in others than in oneself.

Naive Realism – the belief that we see reality as it really is – objectively and without bias; that the facts are plain for all to see; that rational people will agree with us; and that those who don’t are either uninformed, lazy, irrational, or biased.

Trait Ascription Bias – the tendency for people to view themselves as relatively variable in terms of personality, behavior, and mood while viewing others as much more predictable.

Critical Blindness – the conflation of a position of authority or influence with one’s possession of a higher level of personal competence. The mental obstacle created in a person granted entitled authority before they are emotionally ready, wherein they lose their ability to create, to gracefully understand or value the dynamics of human nature, motivation and leadership; descending further into shallow and habitual negative or doubtful critical assessments of those ‘under’ or different from them, coupled with an ever growing hunger for absolute control.

Misrepresentation of Authorities

Semantics Jousting – the twisting of the context inside which a quotation of authority or a recitation of scientific principle is applied, such that it appears to support a separate argument and inappropriately promote a desired specific outcome.

False Appeal to Authority – the contention that the opinions of an authority contradict, appear to countermand, or reject the data, topic or ideas of an opponent when in fact either the recitation or the ideas of the opponent or both are taken out of context, misquoted or are false in their portrayal.

Appeal to Celebrity Skeptic – the recitation of opinions tendered by a celebrity or prominent figure inside skepticism, as constituting authority inside a set of data; or contention that such ideas, one-liners and figures in fact constitute positions or persons of scientific gravitas. The ranking of the opinions of such figures above those of lay or professional experts in a given field.

Construct Laundering – a proposal of Plausible Deniability on the part of one prominent SSkeptic regarding a pluralistic topic, which is subsequently cited as a peer reference by others inside the Cabal as ‘evidence of falsification,’ and finally taken as peer reviewed proof that a topic or construct has been ‘debunked’ by experts in the community at large; when no such falsification has indeed been achieved or attempted.

Usurpation Error – when attempting to employ a recitation from an unrecognized, fringe or false authority as to policy or definition when the topic in question already has a recognized governing body which regulates policy, definition and standards.

Kuhn Denialism – the pseudoscience of social and media bullying with the ultimate goal of controlling exposure to and blocking Science’s consideration of a condition of plurality or new paradigm or its supporting data on a given disliked subject.

False Attribution – a proponent appeals to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.

Quoting out of Context Fallacy – a proponent’s selective excerpting of words from their original context in a way that distorts the source’s intended meaning, in order to impugn or support specific ideas.

False Authority – using a persona whose only expertise on a topic is that they have declared themselves to be a skeptic, or an expert of dubious credentials, and/or using only one opinion to sell a product or idea.

Referential Reification Fallacy – assuming all words refer to existing and mature elements of science, and that the meaning of words is implicitly qualified within the things they refer to; then further presupposing that the referential definition employed is also mature enough to be used to discredit a contended set of observations or ideas.

Appeal to Accomplishment – where a position of opposition is deemed true or false based on the accomplishments of the proposer, and not on its merits nor the proposer’s accomplishments inside the field in question.

Dunning-Kruger Exploitation – the manipulation of unconsciously incompetent persons or laypersons into believing that a source of authority expresses certain opinions, when in fact the persons can neither understand the principles underpinning the opinions, nor critically address the recitation of authority imposed upon them.

Halo Effect – the tendency for a person’s positive or negative traits to “spill over” from one personality area to another in others’ perceptions of them.

Parem Falsum – the presumption or contention that since a person is a scientist or speaks as an authority on science, they are more qualified to make conclusions and tender opinions in fields of expertise they do not hold, and moreover be regarded as authority over actual experts (both scientists and lay persons) in that field of expertise.

Reach Around – an award or accolade given within clubs of particular thinking, to other members of the same club for ‘celebrating a visible and championing club member’ in order to increase the club member’s viability as an authority; in contrast to an accolade tendered for a lifelong pursuit of work on behalf of mankind, or in development of mercy, infrastructures, economies or new technologies.

Shield Effect – when an arguer in a valid matter of discourse drops any need to reference diligent research, method, or data collection in support of their contended position because of a false status given to a higher visibility arguer, or member of their club, who has enjoined the discussion on their side.

Where’s my Check Fallacy – the pretense on the part of an individual who is a Social or Celebrity Skeptic, or who uses skepticism to promote themselves in their institution or career, that they do not receive compensation for their speeches and visible positions on public matters, nor take payment from those who regularly seek to promote specific business, social and political goals and doctrines via well established channels of Social Skepticism.

Circus Partis – a false appeal to an authority who is simply ‘famous for being famous,’ or who is simply enjoying their 15 minutes of fame in the club, and who does not stand as a credible authority independent of this pseudo-status. This includes personages who are simply famous for being a famous skeptic.

Google Goggles – warped or blinded perception of an individual’s credibility based on web search popularity or false celebrity. Vulnerability to web opinions where every street doubter is an authority on science and every Cabal member and celebrity is falsely lauded.

Availability Bias – to elicit or recite opinions of only those persons that come most easily to mind or who are most familiar inside a proponent’s group, rather than a wide or representative sample of salient evidence, recitation or opinion.

Appeal to Social Skeptic – when a journalist, stage magician, author, psychologist or liberal arts activist is cited as a recitation authority on science, technology, engineering, math or skepticism as it is employed in those disciplines.

 

TES Signature


¹ Most of these fallacies are principles crafted of my own observation (see Definitions in the blog and the over 100 blog entries outlining the illicit practices of thought control on the part of Social Skepticism). However, many are adaptations of well known fallacies. Others have been corrected from their colloquial or erroneous state into accurate, technically written or context precise form, in as brief a fashion as possible, from their Wikipedia versions and the following resource sites therein:

http://en.wikipedia.org/wiki/Straight_and_Crooked_Thinking

http://en.wikipedia.org/wiki/List_of_fallacies

http://en.wikipedia.org/wiki/List_of_cognitive_biases

http://en.wikipedia.org/wiki/List_of_memory_biases

² Note that in many instances, the employment of the term ‘fallacy’ is used neither in formal nor informal technical context. In those instances, its employment is simply used to elicit the errant nature inside a short descriptive title.

September 7, 2014 Posted by | Argument Fallacies, Ethical Skepticism, Tradecraft SSkepticism | , , , , , , , , , , | 2 Comments

The Real Ockham’s Razor

“You ever hear the expression ‘Simplest answer’s often the correct one?’ ”

“Actually, I’ve never found that to be true.”

        – Gone Girl, 2014¹

Indeed, the actual role of Ockham’s Razor, the real scientific principle, is to begin the scientific method, not complete it in one fell swoop. Rational Thinking under Ockham’s Razor (i.e. science) is the demonstrated ability to handle plurality with integrity.


FALSE ONE-LINER which is NOT OCKHAM’S RAZOR:

“All things being equal, the simplest explanation tends to be the correct one.”

The above statement is NOT Ockham’s Razor. It is a sleight of hand expression (called by SSkeptics, “Occam’s Razor”) used to squelch disdained topics which would otherwise be entertained for research by Ethical Skepticism. The weakness of the statement resides in the philosophical principle that the simplest answer is typically the one which falls in line with the pre-cooked assumptions. Moreover, implicit within this statement is the claim that all data and observations must immediately be ‘explained’ so that a disposition can be assembled a priori and anecdotally, as a means of preventing the data aggregation and intelligence development steps of science. In these two ways, the statement is employed to obfuscate and abrogate the application of the scientific method. This trick is a common sales technique, having little to do with rationality. Don’t let your integrity slip to the point where you catch yourself using it to deceive others. The two formal fallacies introduced via this errant philosophy are:

Transactional Occam’s Razor Fallacy

The false contention that a challenging claim or observation must immediately be ‘explained.’ Sidestepping of the data aggregation and intelligence steps of the scientific method. The boast of claiming to know which question should be asked under the scientific method.

Existential Occam’s Razor Fallacy

The false contention that the simplest explanation tends to be the scientifically correct one. Suffers from the weakness that myriad and complex underpinning assumptions, all of which tender the appearance of ‘simplicity,’ have not been vetted by science.

Indeed, the actual role of Ockham’s Razor, the real scientific principle, is to begin the scientific method, not complete it in one fell swoop of denial.

OCKHAM’S RAZOR:

“Pluralitas non est ponenda sine neccesitate” or “Plurality should not be posited without necessity.”

The words are those of the medieval English philosopher and Franciscan monk William of Ockham (ca. 1287-1347).²

This apothegm simply means that, until we have enough evidence to compel us, science should not invest its resources into outside theories. Not because they are false or terminally irrelevant, but rather because existentially they are unnecessary in the current incremental discourse of science. But Ockham’s Razor most importantly also means that once there exists a sufficient threshold of evidence to warrant attention, then science should seek to address the veracity of an outside claim. Plurality is a condition of science which is established by observations and sponsorship, not by questions, peer review or claims. To block the aggregation and intelligence of this observational data, or attempt to filter it so that all data are essentially relegated as fiat anecdote, is pseudoscience. It is fraud, and is the chief practice of those in the Social Skepticism movement today. The claim of “Prove it” – or the Proof Gaming Fallacy – embodies this fundamental misunderstanding of Ockham’s Razor on the part of those who have not pursued a rigorous philosophical core inside their education.

This threshold of plurality and, in contrast, the ‘proof’ of an idea are not the same standard of data, testing and evidence. Muddying the two contexts is a common practice of deception on the part of SSkeptics. Proof is established by science; plurality is established by sponsors. SSkeptics regard Ockham’s Razor as a threat to their religion, and instead quote the substitute above, which while sounding similar and ‘sciencey’, does not mean the same thing at all. It is an imposter principle which rather seeks to blur the lines around, and prevent competing ideas from attaining, this threshold of plurality and attention under the scientific method. Their agenda is to prohibit ideas from attaining this threshold at ANY cost. This effort to prohibit an idea its day in the court of science constitutes, in itself, pseudoscience.

This method of pseudoscience is called the DRiP Method.

Misuse of “Occam’s” Razor to effect Knowledge Filtering

One of the principal techniques, if not the primary technique of the practitioners of thought control and Deskeption, is the unethical use of Knowledge Filtering.  The core technique involves the mis-use of Ockham’s Razor as an application to DATA and not to competitive thought Constructs.  This is a practice of pseudoscience and is in its essence dishonesty.

Ockham’s Razor, or the discernment of Plurality versus Singularity in terms of competing Hypotheses, is a useful tool in determining whether Science should be distracted by bunk theories which would potentially waste everyone’s time and resources.  Data on the other hand is NOT subject to this threshold.

By insisting that observations be explained immediately, and through rejecting a datum, based on the idea that it introduces plurality, one effectively ensures that no data will ever be found which produces a competing Construct.  You will in effect, perpetually PROVE only what you are looking for, or what you have assumed to be correct. No competing idea can ever be formulated because outlier data is continuously discarded immediately, one datum at a time. This process of singularly dismissing each datum in a series of observations, which would otherwise constitute data collection in an ethical context is called “Knowledge Filtering” and stands as a key step in the Cultivation of Ignorance, a practice on the part of Social Skepticism. It is a process of screening data before it can reach the body of non-expert scientists. It is a method of squelching science in its unacknowledged steps of process and before it can gain a footing inside the body of scientific discourse. It is employed in the example graphic to the right, in the center, just before the step of employing the ‘dismissible margin’ in Social Skepticism’s mismanagement of scientific consensus.

Plurality is a principle which is applied to Constructs and Hypotheses, not Data.

I found a curious native petroglyph once while on an archaeological rafting excursion, which was completely out of place, but whose ochre had been dated to antiquity. I took a photo of it to the state museum and was unable to find the petroglyph in the well documented library of Native American Glyphs. I found all the glyphs to the right and all the glyphs to the left of the curious one. However, the glyph in question had been excluded from the state documentation work performed by a local university professor. The museum director, when I inquired, replied appropriately “Perhaps the Glyph just didn’t fit.” He had hit the nail on the head. By “Occam’s” Razor, the professor had been given tacit permission to filter the information out from the public database, effectively erasing its presence from history. He did not have to erase the glyph itself, rather simply erase the glyph from the public record, our minds and science – and excuse it all as an act of ‘rational thinking.’

The Purpose of Ockham’s Razor is to BEGIN the scientific method, not screen data out and finish it.

Data stands on its own. Additionally, data found in abundance, or even sometimes in scarcity, when not eliminated one datum at a time by the false anecdotal application of “Occam’s” Razor, can eventually be formulated into a Construct which will then vie for plurality under the real Ockham’s Razor: a useful principle of construct refinement, prior to testing, under the scientific method.

The Role of Ockham’s Razor: To Screen for Plurality at the Beginning of the Scientific Method

1.  OBSERVATION

2.  NECESSITY

3.  INTELLIGENCE/AGGREGATION OF DATA (The Three Key Questions)

4.  CONSTRUCT FORMULATION

5.  SPONSORSHIP/PEER INPUT (Ockham’s Razor)

6.  HYPOTHESIS DEVELOPMENT

7.  PREDICTIVE TESTING

8.  COMPETITIVE HYPOTHESES FRAMING (ASKING THE RIGHT QUESTION)

9.  FALSIFICATION TESTING

10.  HYPOTHESIS MODIFICATION

11.  FALSIFICATION TESTING/REPEATABILITY

12.  THEORY FORMULATION/REFINEMENT

13.  PEER REVIEW (Community Vetting)

14.  PUBLISH

15.  ACCEPTANCE

Map of Deskeption 7.8.1: Misuse of “Simplicity”

Think through this comparative and your ethics will begin to change. Simplicity can be a deceptive tactic, when used to obfuscate:

1)  Simple explanations have complex underpinnings.  Our “simple” explanations are only simple because we choose to contort reality in extremely exhaustive complexities in order to force simplicity.  You can lead a horse to water, but you can’t make him drink.  But if he does drink, that is not a simple action by any means; it may appear to be simple, but that is an illusion on the part of the casual observer. Simplicity, many times, is only an illusion.

2)  Beware of the tyranny of the simple.  Simplicity as a principle of discretion is best suited for the clear application of judgments and governances, and as such is usually based on sets of laws and procedure which change only slowly and under great necessity.  Laws only change as men change, and men are slow to change.  Because of this, laws of governance are always behind current understandings.  Unassailable principles of governance have little place in discovery and science.

3)  Simplicity conveys neither straightforwardness, nor elegance; which are central tenets of understanding.  “The simplest vehicle I know of is a unicycle.  I’ll be damned if after all these years of trying, I still have not managed to learn how to ride one.”

4)  Simplicity implies that enough data exists to warrant a conclusion regarding an observation, then further implies that a disposition must be tendered immediately.  Simplicity in this fashion is sold through construction of a false dilemma, a fatal logical fallacy.

When Rational Thinking becomes nothing more than an exercise in simply dismissing observations to suit one’s inherited ontology, then the entire integral will and mind of the individual participating in such activity has been broken.

TES Signature


¹  Gone Girl, screenplay by Gillian Flynn, motion picture by 20th Century Fox; September 26, 2014; David Fincher.

²  The Stanford Encyclopedia of Philosophy, William of Ockham; http://plato.stanford.edu/entries/ockham/

June 30, 2013 Posted by | Agenda Propaganda, Argument Fallacies, Ethical Skepticism, Institutional Mandates | , , , , , , | Leave a comment

The Three Indicators of Personal Ethical Objectivity

If a skeptic contending an argument cannot cite exception observations to their case, does not practice an awareness of their own weaknesses, and possesses a habitual inability to hold a state of epoché and declare “I don’t know,” then be very wary. Because that person may be trying to sell you something, promote an agenda, or may even indeed be lying.

There are three indicators I look for in a researcher who is objective and sufficiently transparent to be trusted in the handling of the Knowledge Development Process (Skepticism and Science). These are warning flags to watch for, which I have employed in my labs and businesses for decades. They were hard earned lessons about the nature of people, but have proven very helpful over the years.

1.  The demonstrated ability to neutrally say “I do not have an answer for that” or “I do not know”

One of the ways to spot a fake skeptic, or at a professional level, those who pretend to perform and understand science – is that the pretender tends to have a convenient answer for just about everything. Watch the Tweets of the top 200 narcissistic Social Skeptics and you will observe (aside from their all tweeting the exact same claims 5 minutes after the previous tweet of verbatim contention) the clear indication that there exists not one pluralistic topic upon which they are not certain of the answer. Everything has a retort, one liner, or club recitation. Your dissent on the issue is akin to attacking them personally. They get angry and attack you personally as the first response to your objection. Too many people fall for this ruse. In this dance of irrationality passed off as science, factors of personal psychological motivation come into play and serve to countermand objective thinking. This can be exposed through a single warning flag indicating that the person you are speaking with might not be trustworthy. An acquaintance of mine, who is a skeptic, was recently faced with a very challenging paradigm shaking observation set. I will not go into what it was, as siding with issues of plurality is not the purpose of this blog. The bottom line was that as I contended “Ken, this is an excellent opportunity to make observations around this phenomenon to the yeah or nay, damn you are lucky,” he retorted “No, I really would rather not. I mean not only is it a waste of time, but imagine me telling the people at work ‘Oh, I just went to ___________ last night and took a look at what was happening.’ That would not go over well, and besides it really is most likely [pareidolia] and will be a distraction.” Ken, in this short conversation, revealed an abject distaste for the condition of saying ‘I do not know.’ His claim to absolute knowledge on this topic over the years was revealed to be simply a defense mechanism motivated by fear.
Fear of the phenomena itself. Fear of his associates. Fear of the state of unknowing.

It struck me then that Ken was not a skeptic. It was more comfortable to conform with pat answers and to pretend to know. He employed SSkepticism as a shield of denial, in order to protect his emotions and vulnerabilities. The Ten Pillars hidden underneath the inability to say “I do not know” can serve as the underpinning to fatal flaws in judgement. Flaws in judgement which might come out on the battlefield of science, at a time most unexpected. Flaws in judgement which might allow one to fall prey to a verisimilitude of fact which misleads one into falsely following a SSkeptic with ulterior motives or influences, were you not the wiser.

The Ten Pillars which hinder a person’s ability to say “I don’t know” are:

I.             Social Category and Non-Club Hatred
II.            Narcissism and Personal Power
III.           Promotion of Personal Religious Agenda
IV.           Emotional Psychological Damage/Anger
V.            Overcompensation for a Secret Doubt
VI.           Fear of the Unknown
VII.          Effortless Argument Addiction
VIII.         Magician’s Deception Rush
IX.           Need to Belittle Others
X.            Need to Belong/Fear of Club Perception

These are the things to watch out for in considering the advice of someone who is promoting themselves as a skeptic, or even as a science professional; one key little hint to be observed inside the ability of an individual to dispassionately declare “I don’t know” or “I do not have an answer for that” on a challenging or emotional subject.

2.  A balanced perspective CV of pertinent past personal mistakes and what one has learned from them

I cringe when I review my past mistakes, those both personal and professional. My cringing is not so much over mistakes from lack of knowledge, as in the instance of presuming for example that a well recognized major US Bank would be ethical and not steal for themselves and their hedge/mutual fund cronies 85% of my parents’ retirement estate once it was put into a Crummey Irrevocable Trust. Some mistakes come from being deceived by others or from a lack of knowledge or an inability to predict markets. But what is more important in this regard is both the ethical recall of, and recollection of how we handle, mistakes we commit from missteps in judgement. Those are the anecdotes I maintain on my mistake resume. For instance, back in the 90’s I led a research team on a long trek of predictive studies supporting a particular idea regarding the interstitial occupancy characteristics of an alpha phase substance. I was so sure of the projected outcome that my team missed almost a year of excellent discovery and falsifying work, which could have shed light on this interstitial occupancy phenomenon. It was a Karl Popper moment: my gold fever over a string of predictive results misled my thinking into a pattern of false understanding as to what was occurring in the experiments. It was only after a waste of time and a small sum that my colleague nudged me with a recommendation that we shift focus a bit. I had to step off that perch, and realize that a simple falsification test could have elicited an entirely new, more accurate and enlightening pathway. Again I did this in an economic analysis of trade between Asia and the United States. I missed for months an important understanding of trade, because I was chasing a predictive evidence set and commonly taught MBA principles relating costs and wealth accrual in trade transactions.
I earned two very painful stripes on my arm from those mistakes, and today – I am keenly aware of the misleading nature of predictive studies – observing a SSkeptic community in which deception by such Promotification pseudo-science runs pandemic.

Mistakes made good are the proof in the pudding of our philosophy. I can satisfy my Adviser with recitations or study the Stanford Encyclopedia of Philosophy for years; but unless I test at least some of the discovered tenets in my research life, I am really only an intellectual participant. Our mistakes are the hashmarks, the stripes we bear on the sleeves of wisdom and experience. Contending that “Oh I was once a believer and now I am a skeptic,” as does Michael Shermer in his heart tugging testimony about his errant religious past and his girlfriend, is not tantamount to citing mistakes in judgement. That is an apologetic explaining how one does not make mistakes, only others do. This is not an objective assessment of self, nor the carrying of a CV of lessons learned. Michael learned nothing in the experience; rather the story simply acts as a confirmation moral, a parable to confirm his religious club. Other people made the mistake; he just caught on to it as he became a member of the club. This type of ‘mistake’ recount is akin to saying “My only mistake was being like you.” That is not an objective assessment of personal judgement at all. It is arrogance, belied by a fake cover of objectivity and research curiosity.

Watch for this sign in lack of objectivity, especially the fake-mistake kind. It can serve to be extraordinarily illuminating.

3.  A reasonable set of statistical outliers in their theory observation set

Outliers call out liars. This is a long held, subtle tenet of understanding which science lab managers employ to watch for indications of possible group think, bias or bandwagon effect inside their science teams. Not that people who commit such errors are necessarily liars, but we are all prone to undertake the habit of observational bias: the tendency to filter out information which does not fit our paradigm or objective of research. One way to spot this is to be alert for the presence and nature of, and method by which, exceptions and outliers are handled in the field of data being presented in an argument. Gregory Francis, professor of psychological sciences at Purdue University, has made a professional practice of critiquing studies in which the resulting signals are “too good to be true.” Francis has been known to critique psychology papers in this regard. His reviews are based on the principle that experiments, particularly those with relatively small sample sizes, are likely to produce “unsuccessful” findings or outlier data, even if the experiments or observations are supporting a credible and measurable phenomenon.¹ Now, as a professional who conducts hypothesis reduction hierarchy in their work, I know that it is a no-no to counter inception stage falsifying work with countermanding predictive theory – a BIG no no. This is a common tactic of those seeking to squelch a science. There is a time and a place in the scientific method for the application of predictive data sets, and that is not at the Sponsorship/Ockham’s Razor Peer Input step. In other words, I professionally get upset when a new science is shot at inception. I want to see the falsifying data, not the opposition’s opinion. So set aside the error Greg Francis is making here in terms of the scientific method inside Epigenetics, and rather focus on the astute central point that he is making, which is indeed correct.

“If the statistical power of a set of experiments is relatively low, then the absence of unsuccessful results implies that something is amiss with data collection, data analysis, or reporting,” – Dr. Greg Francis, Professor of Psychology, Purdue University¹

An absence of outlier data can be a key warning flag that the person offering the data is up to something. They are potentially falling prey to an observer’s bias, some form of flaw in their positioning of the observation set, or even in purposeful deception – as is the case with many of our Social Skeptics. Indications that the person is adhering to a doctrine, a social mandate, a personal religious bias, or a furtive agenda passed to them by their mentors.
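Francis’s principle can be sketched numerically. The following is a minimal illustration only; the power values and experiment counts are hypothetical numbers chosen for the example, and the function name is mine, not Francis’s actual published test:

```python
# Hypothetical sketch of the "too good to be true" principle:
# if each independent experiment has statistical power p (the
# probability of a "successful" result when the effect is real),
# then the chance that ALL n experiments report success is p ** n.

def prob_all_successful(power: float, n_experiments: int) -> float:
    """Probability that every one of n independent experiments succeeds."""
    return power ** n_experiments

# With modest power (0.5) across 10 experiments, a clean sweep of
# successes is wildly improbable (about 1 in 1000); even at high
# power (0.8) it remains unlikely (about 11%). A reported 10-for-10
# record therefore hints at selective reporting or analysis.
print(prob_all_successful(0.5, 10))   # ≈ 0.000977
print(prob_all_successful(0.8, 10))   # ≈ 0.107
```

In other words, the complete absence of unsuccessful or outlier results is itself an outlier, and that is the signal Francis watches for.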

Be circumspect and watch for how a self purported skeptic handles outlier data.

  • Do they objectively view themselves, the body of what is known, and the population of observations and data?
  • Do they simply spin a tale of complete correctness without exception?
  • Do they habitually begin explanatory statements with comprehensive fatalistic assessments or claims to inerrant authority without exception?
  • Do they bristle at the introduction of new data? 
  • Do they cite perfection of results inside an argument of prospective or well established plurality?
  • Do they even mock others who cite outlier data?
  • Do they attack you if you point this out?

If so, be wary. Michael Shermer terms this outlying group of countermanding ideas, data and persons the “Dismissible Margin.” It is what SSkeptics practice in order to prepare a verisimilitude of neat little simple arguments. These are key warning indicators that something is awry. If a skeptic cannot acknowledge their own biases nor the outlier data itself, nor possesses the ability to say “I don’t know,” they may very well be lacking in the ability to command objective thinking. They may very well be lying. Lying in the name of science.

 

TES Signature


¹  Epigenetics Paper Raises Questions, The Scientist, Kate Yandell, October 16, 2014; http://www.the-scientist.com/?articles.view/articleNo/41239/title/Epigenetics-Paper-Raises-Questions/

October 16, 2014 Posted by | Deskeption | Leave a comment

How Social Skepticism Obfuscates the Scientific Method

Once plurality has been argued, I would rather the Scientific Method show my premise to be false, than be told in advance what ‘critical thinking’ says.

The public at large, and our leaders cannot always envision this process of corruption. SSkeptics thrive inside and promote this general ignorance. It is up to people who have actually performed real science – to possess the responsibility of courage which is incumbent with the privilege of acumen; to stand up and say “No, I am not ready to dismiss this subject, just because you tell me to.”

Abrogating the Scientific Method

 


October 13, 2014 Posted by | Tradecraft SSkepticism

Contrasting the USFDA and Social Skepticism Definitions of ‘Homeopathy’

Any man can be made to appear irrational and vile, if his enemies only are allowed to speak on his behalf.

Homeopathy, as defined by a particular series of strawman fallacies, is a completely ridiculous pseudoscience, yes. The problem is that the United States Food and Drug Administration, the manufacturers of homeopathic product and the industry in general do not define homeopathy by means of these strawman fallacies. Only Social Skepticism does. Why?

As it turns out, the claims from Social Skepticism that homeopathy is defined by the employment of extreme dilutions, placebo, metaphysical entities, vital energies or adherence to antiquated science all reveal themselves, upon diligent investigation, to be false.


The Ethical Skeptic is forced to ask: Why? (Note: I am not a homeopathy proponent. I believe there should be one standard for OTC medicinals, and that efficacy (not safety) approval should come through a qualified consumer review panel – anonymous, so that it cannot be corrupted by SSkeptics and Big Pharma – rather than leaving the process to an overwhelmed FDA, or to prejudiced SSkeptic and Big Pharma lobby groups. The most informed researchers are moms and dads – we just need to harness the power of enough of their ethics and knowledge. And in today’s data and intelligence technology environment, this can be done. Unfortunately, lobbyists, money and Social Skepticism are getting in the way of science.)

In order to do that inquiry justice, it will have to be the subject of another blog article altogether. But I suppose we will find a hint below with respect to the FDA’s 1962 delineation of HPUS qualified substances. An issue which begs the question, and forces us to examine the all-too-common milieu around SSkeptic definitions, and in particular theirs of ‘homeopathy’:

Who Loses? and

Who Benefits?

An exercise I conduct regularly in my advisement services with nations suffering from corruption and oppression. We have already published a blog study (see the graphic below) showing that homeopathic cold remedies sold in drug stores over the counter contained exactly the same ingredients, in the same dosages, as their big pharma equivalents; so The Ethical Skeptic has suspected that some chicanery was afoot in the Social Skepticism movement regarding homeopathy. It appears to be a whipping-boy subject, used to further a furtive oligarch agenda. The definitions below exhibit this dishonesty and the specific fallacies involved:

From The Skeptic’s Dictionary, Social Skepticism’s definitions of homeopathy:¹

“…the less you use it, the stronger it gets.” – Phil Plait

“…helping the vital force restore the body to harmony and balance.”

“…involves the appeal to metaphysical entities and processes.”

“…trying to balance “humors” by treating a disorder with its opposite (allos).”

“…generally defined as a system of medical treatment based on the use of minute quantities of remedies…”

“…potency could be affected by vigorous and methodical shaking (succussion).”

“…succussion could release “immaterial and spiritual powers,” thereby making substances more active.”

“Homeopaths refer to “the Law of Infinitesimals”… The law of infinitesimals seems to have been partly derived from his notion that any remedy would cause the patient to get worse before getting better and that one could minimize this negative effect by significantly reducing the size of the dose. Most critics of homeopathy balk at this “law” because it leads to remedies that have been so diluted as to have nary a single molecule of the substance one starts with.”

“…homeopathic remedies work by altering the structure of water, thereby allowing the water to retain a “memory” of the structure of the homeopathic substance that has been diluted out of existence.”

“…homeopathic remedies, if effective, are no more effective than placebos.”

 “…homeopaths believe, …that the [scientific practice of a] placebo-controlled randomised (sic) controlled trial is not a fitting research tool with which to test homeopathy.”

 “…homeopathy claims a special exemption from the rules of logic and science…”

This array of mis-definitions and prejudicial portrayals suffers not only from the fact that it is incorrect – homeopathy, as defined by the professional organizations which regulate and manufacture in that industry (in other words, the “authorities,” as SSkeptics like to cite), employs standard industry-effective ingredients in standard OTC dilutions – but also from the employment of

6 fatal, non-trivial fallacies in the definition of homeopathy foisted by Social Skepticism

  1. a fallacy of composition in that one historical person or fringe idea is used as rationale to dismiss an entire equivocal subject
  2. a lie of equivocation, in employing a mis-definition to impugn a similarly titled topic ‘homeopathy’
  3. a characterization from a negative premise, in that they presume all users of the term ‘homeopathy’ practice pseudoscience or adhere to anti-scientific agendas
  4. a fictus scientia fallacy, wherein one claims that science has tested industry defined homeopathy, when indeed it has only tested SSkeptic defined homeopathy
  5. a scarecrow error, in fabricating completely fictitious, ridiculous or expired beliefs as constituting the claims of a disdained group, in order to discredit the group
  6. an antiquing fallacy, showing its false, hoax based or dubious past inside a set of well known anecdotal cases. Also the instance where a thesis is deemed incorrect because it was commonly held when something else, clearly false, was also commonly held.

But in the real world, where real professional business is done, it’s the same damn stuff, same damn substrate, same damn dilutions, same damn product – it just costs 15-55% less than the products pushed by the sponsors of Social Skepticism (see below).

homeopathic comparative

And here is why. The United States Food and Drug Administration defines homeopathy² in its jurisdictional provisions under Section CPG 400.400 of its Compliance Policy Guidelines regarding homeopathy:

DEFINITIONS:

 The following terms are used in this document and are defined as follows:

1. Homeopathy: The practice of treating the syndromes and conditions which constitute disease with remedies that have produced similar syndromes and conditions in healthy subjects.

 2. Homeopathic Drug: Any drug labeled as being homeopathic which is listed in the Homeopathic Pharmacopeia of the United States (HPUS), an addendum to it, or its supplements. The potencies of homeopathic drugs are specified in terms of dilution, i.e., 1x (1/10 dilution), 2x (1/100 dilution), etc. Homeopathic drug products must contain diluents commonly used in homeopathic pharmaceutics. Drug products containing homeopathic ingredients in combination with non-homeopathic active ingredients are not homeopathic drug products.
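As a quick aid to the CPG notation above, the nX potency scale is simply a power-of-ten serial dilution. A minimal sketch (my own illustration, not part of the FDA text):

```python
def x_potency_dilution(n: int) -> float:
    """Dilution factor for an nX homeopathic potency.

    Each X step is a 1/10 serial dilution, so an nX potency
    corresponds to a factor of 10**-n of the starting substance.
    """
    if n < 1:
        raise ValueError("potency must be a positive integer")
    return 10.0 ** -n

# Matches the CPG 400.400 examples: 1x = 1/10, 2x = 1/100
print(x_potency_dilution(1))  # 0.1
print(x_potency_dilution(2))  # 0.01
```

Note that many OTC products labeled 1X or 2X are, by this arithmetic, ordinary 10% or 1% dilutions – nothing like the “diluted out of existence” strawman.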

According to the Homeopathic Pharmacopeia of the United States, a homeopathic drug does not have to involve the principle of dilution at all.³ In fact, the seven criteria recognized by the FDA and the HPUS, which selectively or collectively qualify a drug as homeopathic, are:

1) US FDA compliant as safe and effective

2) prepared according to the specifications of the General Pharmacy

3) submitted documentation to USFDA and HPUS

4) drug proving and clinical verification in accordance with modern evolved scientific practices

5) substance was in use prior to 1962 – Note Public Domain Medicinal – ‘Who loses and who benefits?’

6) two adequately controlled double blind clinical studies

7) clinical experience or data documented in the medical literature

Not a single one of these requirements for listing on the HPUS involves or requires extreme dilutions, placebo, metaphysical entities, vital energy or adherence to antiquated science. Imagine that. The FDA and SSkepticism definitions do not match in the least. An all too common occurrence.

Again, I am not a proponent of homeopathy. I buy the product if it is a low cost alternative to the big pharma inflated equivalent. But given the rancor with which Social Skepticism deals with the subject, and the patterns of habitual corruption employed in their opposition, the natural question I then must ask is: Why?



¹  homeopathy, The Skeptic’s Dictionary, October 11, 2014; http://skepdic.com/homeo.html. Note that the definition is tendered obliquely, through recitation; yet the contention that this constitutes the definition of homeopathy remains logically clear once the text is stripped of the antiquing fallacy, misdirection and equivocation prejudice employed in the reference’s history of the subject. That history is presented in a context in which no other definition is offered, and in such a way as to discredit the subject as a whole – an invalid form of argument – and thus substantiates its treatment here as the tendering of a definition.

²  The United States Food and Drug Administration, Inspections Compliance Enforcement and Criminal Investigations Section “CPG Sec. 400.400 Conditions Under Which Homeopathic Drugs May be Marketed,” http://www.fda.gov/iceci/compliancemanuals/compliancepolicyguidancemanual/ucm074360.htm.

³  The HPUS Criteria for Eligibility, http://www.hpus.com/eligibility.php.

October 11, 2014 Posted by | Argument Fallacies, Institutional Mandates, Tradecraft SSkepticism

The Hypocrisy of the Socialist Anthropogenic Global Warming Agenda

It is indeed ironic that Big Socialism is manifesting at the forefront of the previously ethical movement over concerns about Anthropogenic Global Warming. Ironic, since history clearly implicates Big Socialism as the major cause of the carbon contribution problem in the first place. Our ‘oh so smart’ academic scholars have fallen for this sleight-of-hand yet again; demonstrating for an eighth consecutive decade that they not only fail to grasp economic principles and elements of sound business, but moreover fail to perceive when they are being manipulated by powerful global social forces who could not care less about the Earth – employing it only as a battering ram in the war to enslave mankind under their voracious form of tyranny.

The Economist exposes why the most recent flurry of Social Skeptic push campaigns have hit the press:  Why Climate Change is Suddenly Back on the Agenda.

This is not about Climate Change – This is about Political Change.

No solutions are ever proposed or discussed – only attacks on targeted people, and demands for lock step allegiance.

And the Earth is the innocent victim of Socialism in two ways: first at the hand of misleading and purposely polarizing politics, and second, ironically, through Big Socialism’s actual primary contribution as THE major cause of Anthropogenic Global Warming.

They created the problem, and now purport to be its solution. Sound familiar?

Seven Ways in Which Big Socialism is THE Major Cause of AGW

1.  Exploitation of Highly Remote Socialist Work Camp Slave Manufacturing Labor
global social warming 2

By enslaving the working class and selling their captured value as manufacturing slaves, we ensure that products can only be made in the most remote socialist slave labor countries. As a result, the products they produce must be shipped great distances in order to be consumed. These were products which formerly shipped an average distance of 200 miles in many cases within classic unit based capital business scenarios.  Products were produced by ethical family driven businesses which either by ethic or market nature valued local community and local community labor. This new remote aggregation of manufacturing results in the creation of massive trade agencies and shipping consolidation groups which ensure that the average consumer good ships over 7500 miles before it is consumed – and that the retail and wholesale buyers and ‘sales plan modeling packages’ which write the purchase orders are instructed to only buy from authorized Big Socialism shipping conglomerates.  “A-Level Vendors” they are called in the big software packages that drive manufacturing, SAP, Oracle and JDA. What are B-Level Vendors? Smaller local capital driven businesses which cannot produce enough cheap volume for you to put your competitors out of business. Oligarchs need an alliance with Big Socialism in order to eliminate the ethical employment of capital which could be employed to compete with them. That Oligarch/Big Socialism alliance of manufacturing is what drives AGW. That includes every 24 cent trinket and every $9 t-shirt ever worn, and a 30% overbuy on top of that. In fact, if you examine the world list of socialist states (http://en.wikipedia.org/wiki/List_of_socialist_states), you will find that these states comprise in excess of 75% of the world’s consumer goods manufacturing.

2.  Aggregation of Slave Labor into Highly Remote Domestic MegaCommunities with Poor Infrastructure
global social warming 3

A natural outcome of aggregating work camps into highly remote slave labor centers is that manufacturing centers are concentrated into regions of carbon unaccountability, where emissions are not only unmonitored, but the remote aggregation of people employed in the workforce forces those in the community to resort to ancient methods of sheltering and food preparation. I have spent time in Islamo-socialist and Neo-socialist countries where, around the cities, every tree is dead, bereft of leaves in the winter of human ignorance and the desire for a reclaimed ancient empire or dynasty. Hundreds of millions of families employ the domestic burning of wood, cow dung and coal (highly toxic fumigant sources) in order to provide energy for their homes. Young people are left without means of birth control, in places where schools cannot be powered by wood, with absent long-shift dual-slave working parents. All this creates an unsustainable, severe drain on our environment, even independent of Climate Change. In fact, dollar for dollar, Socialist manufacturing countries produce 20+ times the carbon soot emissions of a same-size exporting non-socialist country (http://www.wisegeek.org/what-are-the-top-manufacturing-countries.htm).

3.  Aggregation of Carbon Intensive Business into the Most Unaccountable and Irresponsible Political Climates
socialism creates global warming

Socialism takes advantage of political structures with low levels of accountability, and seeks to dominate people who have lost their ability to stand on their own and hold their government accountable. In this environment, Big Socialism is able to more effectively hide the ever-incumbent royalty and pseudo-egalitarian elite groups who siphon the wealth out of the Big Socialism system. But this same tendency to hide and retreat from global accountability manifests problems with respect to accountability within the countries Big Socialism has seen fit to conquer and dominate. It is solidly and unequivocally no coincidence that the countries which contribute the greatest share of carbon into the planet’s environment are all Big Socialist countries. As you may see in the graphic to the right, provided by the World Economic Forum, 10 of the top 15 carbon and soot emitting nations are members of Big Socialism. It is clear that Big Socialism only TALKS a lot about Climate Change. Sound a lot like Social Skepticism and the scientific method? Social Skeptics remain the hapless pawns of this movement and its desire to influx its philosophies into western economic practices. See India and China Ignore UN Climate Change Summit.

4. Conversion of Lean Competitive Capital Business into Bloated Bureaucratic Oligarch Big Box and Monopoly Socialist Elite Dominated Industry
global social warming 4

AGW proponents have a nasty habit of trying to blame carbon contributions on the activity of capitalist and Western business. But the sad fact remains that as our businesses become more dominant and cut out smaller competitors, resources tend to aggregate into the hands of a few, production centers are reduced, local labor is put out of work, and product must be dedicated to massive oligarch-driven empires which generate carbon as a predatory means of using efficiency to put capital-competitive businesses out of business. Ethical business owners must die off and give way to the detached and procedurally obedient workslave executive inhabiting a suit – execubots trained by universities, who understand only efficiency as a means-to-MEGA, and no longer grasp the heart of what values bolster ethical business or healthy economies. If you examine the official list of sweatshop labor practice countries – those where the state allows manufacturing to abuse its labor and provides a totalitarian environment in which labor has no other choice of work – you will find the majority of these countries to be Big Socialist (http://www.independent.org/publications/working_papers/article.asp?id=1369) in their government, business and economics.

5.  An Ever Increasing Lust for Control Mandates that Each Year Must Surpass the Last (or Risk Collapse of the Empire)
global social warming 5

Big Box business means that each year the domination must increase – and the increase in revenue must come at any cost; or the slave enterprise risks losing control. Think of what will happen when China hits its first real depression. How will it survive the 5 largest insurrection dangers within its new dynasty? Big Socialism knows that China cannot survive a depression. They will push for mega increases in revenue, even at the cost of ethical employment of resources, or in the face of over-production versus demand, to mitigate the risk of long lead times between manufacturing and consumption. This is the sad current reality of the China to US/Europe supply cycle under which we now suffer. As well, we currently waste 40+% of the food we produce in socialist growing regions, through post-harvest spoilage and non-consumption globally. But over-production is part of the Socialist manifesto of business. There is no LEAN process in Socialism. There is efficiency, save for when it is needed to kill all competitors. Then that efficiency ethic goes by the wayside as bureaucracy builds, standing in lieu of efficiency in the need for mega revenues. If you think that Nestlé, Blue Cross and Walmart are capitalist enterprises, you need to go apply for a refund from your graduate business university. These companies are not capitalist winners; they are the last losers. They are Big Box Socialism with a market equity mask. Indeed, the two top Socialist labor employing nations rely upon a track record of slave labor to produce a smooth curve of growth without major recession over the past 55 years. This, rather than an effect of competent business practice, is indeed an outcome of being able to force more production/over-production year after year, and to depend upon consuming countries to accept the incumbent product dumping and not complain, because they are getting ‘low cost manufacturing’ (http://en.wikipedia.org/wiki/Historical_GDP_of_the_People%27s_Republic_of_China#mediaviewer/File:China_india_gdp.jpg).

6.  The Polarization of Politics into an Environmentally Impotent Artifice Effective only at Absolute Control of the Population
global social warming 6

It is no secret that the majority of the Social Skepticism movement, the pawns who are cluelessly employed to further the goals of Big Socialism, possess a lock step disdain for certain politics and sectors of our society. Tweets from the SSkeptic community continue to be embarrassingly awash with latent misandry, hatred of Caucasians, hatred of stay-at-home moms, hatred of families and every disdained remnant of the America they were trained to defeat. I am a critic of both sides of the political aisle as it comes to the application of good science and sound economic policy. But when I survey the landscape of political extremism and irrationality which inhabits the halls of US Academia and the Big Socialism proponents in the Social Skepticism movement, it renders me, even as a moderate, a great deal repulsed. The net of this is polarization; a purposeful division which ensures that nothing will get done, except hate. The Earth will be the victim, as conservatives and moderates grow increasingly skeptical of the political agendas, and our elected representatives are emasculated of any ability to accomplish policy in western governments. The Social Skepticism movement is led and chartered by Socialist and Big Party democrats who make it clear that they will gladly sell science based on politics. A great example can be found here (http://www.skepticblog.org/2014/09/06/false-equivalence/) and here (http://theness.com/neurologicablog/index.php/how-to-be-a-science-denier/). After all, power and political domination are the real goal. The Earth is simply a playing card in their game of political domination. You will notice that Steven Novella and Donald Prothero are exceedingly quiet about proposed solutions to AGW, and about the need to divert carbon from our Value Chains. The solutions are there – solutions which do not entail the adoption of Socialism (not that Socialism would help in the least) – but these guys just don’t seem able to talk about them much.

7.  The Destruction of the Human Problem Solving Spirit
enslavement of nihilism

The final nail in the coffin of mankind (and the Earth), through the deleterious effect of Big Socialism, is its insistence that only the Elite and their obedient cronies are allowed to possess meaningful lives. Big Socialism has made it clear that it will gladly destroy the planet in its quest for domination. But moreover, Big Socialism has another effect, probably the most damaging of all: that of destroying our spirit, our ability to create, our ability to counter and strive for survival, our gumption to face a big problem and seek to win. Big Socialism kills all within man that is driven by a ‘more than just me’ ethical boldness. Big Socialism kills our spirit as creative problem solvers and dreamers – just the kind of ethical people who can cure cancer, go to the moon, and help solve issues like Climate Change. American optimism declines commensurate with the tyrannical obsession our Social Skeptics have placed upon destroying certain strength sectors of our society. They continue to ignore the alarms, fail to be circumspect about who is formulating their propaganda, and run through a process of denial when faced with any argument which runs counter to their form of Big Socialism and its decaying effect on a free society (http://www.washingtonpost.com/opinions/dana-milbank-americans-optimism-is-dying/2014/08/12/f81808d8-224c-11e4-8593-da634b334390_story.html).

With Big Socialism, we lose our soul first, then we lose our lives.


September 23, 2014 Posted by | Deskeption, Institutional Mandates

The Ethical Skeptic Statement of Faith

 Epoché vanguards Gnosis.

Abrahamism
When enforced on me as a child, it was a destructive lie which took years to shed through education, integrity, rational thinking, fellowship, a global deep life experience and a smarter, more circumspect view of man and our elegant universe.
Metaphysical Naturalism
When it is offered to me as the only alternative to Abrahamism, this becomes a bifurcation fallacy;
When I am told there exists enough epistemological evidence upon which to select it, this becomes a false dilemma;
When it is enforced on society, government, in my career, in the media, and in my academic progression, it becomes Nihilism;
When it is bolstered by pep rallies and through surrounding one’s self with angry like-minded fellows, it becomes every bit an abusive religion as is Abrahamism;
When it is promoted as “atheism” or “skepticism” or “free thinking” or as supporting a single set of political views
…it becomes a lie.
My research, and my review of the research of others, has led me to conclude that both Abrahamism and Metaphysical Naturalism are false. At a certain point, the integrity of philosophy will dictate that it must yield to evidence. Those who pretend the evidence does not exist, remain lost in their own weakened minds. However, I await more true research before choosing to conclude anything further. I object when either group seeks to control the evidence, means, access, method, funding, attention, work, conclusions and visibility of all research which allows me to improve my understanding of our realm. This knowledge is neither the property nor propriety of any government or group. I object when my peers seek to enforce either religion on me, society, careers, media, social discourse, or government. When this happens, I will speak up.
We do not yet understand enough, as men, to substantiate these grandiose fatalistic claims as to the nature of our universe and our existence. Epoché vanguards Gnosis. I expect to be amazed. This is my faith.

 


 

September 20, 2014 Posted by | Ethical Skepticism
