The Ethical Skeptic

Challenging Pseudo-Skepticism, Institutional Propaganda and Cultivated Ignorance

Panduction: The Invalid Form of Inference

One key form of invalid inference on the part of fake skeptics, if not the primary form, resides in the methodology of panductive inference. A pretense of Popper demarcation, panduction is employed as a masquerade of science in the form of false deduction. Moreover, it constitutes an artifice which establishes the purported truth of a favored hypothesis by means of the extraordinary claim of having falsified every competing idea in one fell swoop of rationality. Panduction is the most common form of pseudoscience.

Having just finished my review of the Court’s definition of malice and oppression in the name of science, as outlined in the Dewayne Johnson v. Monsanto Company case, my thinking broached a category of pseudoscience which is practiced by parties who share motivations similar to those of the defendant in that landmark trial. Have you ever been witness to a fake skeptic who sought to bundle all ‘believers’ into one big deluded group, who all hold or venerate the same credulous beliefs? Have you ever read a skeptic blog claiming a litany of subjects to be ‘woo’, yet fully unable to cite any evidence whatsoever which served to epistemologically classify that embargoed realm of ideas under such an easy categorization of dismissal? What you are witness to is the single most common, insidious and pretend-science habit of fake skeptics: panduction.

It’s not that all the material contained in the embargoed hypotheses realm has merit. Most of it does not. But what is comprised therein, even and especially in being found wrong, resides along the frontier of new discovery. You will soon learn on this journey of ethical skepticism, that discovery is not the goal of the social skeptic; rather that is exactly what they have been commissioned to obfuscate.

Science to them is nothing more than an identity which demands ‘I am right’.

There exist three forms of valid inference, in order of increasing scientific gravitas: abduction, induction and deduction. Cleverly downgrading science along these forms of inference, in order to avoid more effective inference methods which might reveal a disliked outcome, constitutes another fallacy altogether, called methodical deescalation. We shall not address methodical deescalation here, but rather a fourth common form of inference, which is entirely invalid in itself. Panduction is a form of ficta rationalitas: an invalid attempt to employ critical failures in logic and evidence in order to condemn a broad array of ideas, opinions, hypotheses, constructs and avenues of research as being Popper-falsified, when in fact nothing of the sort has been attained. It is a method of proving yourself correct by impugning everyone and everything besides the idea you seek to protect, all in one incredible feat of armchair or bar-stool reasoning. It is often peddled as critical thinking by fake skeptics.

Panduction is a form of syllogism derived from extreme instances of Appeal to Ignorance, Inverse Negation and/or Bucket Characterization from a Negative Premise. It constitutes a shortcut attempt to promote one idea at the expense of all other ideas, or to kill an array of ideas one finds objectionable. Nihilists employ panduction, for example, as a means to ‘prove’ that nothing exists aside from the monist and material entities which they approve as real. They maintain the fantasy that science has proved that everything aside from what they believe is false by a Popperian standard of science – i.e. deduced. This is panduction.

Panduction

/philosophy : invalid inference/ : an invalid form of inference which is spun in the form of pseudo-deductive study. Inference which seeks to falsify in one fell swoop ‘everything but what my club believes’, as constituting one group of bad people who all believe the same wrong and correlated things – this is the warning flag of panductive pseudo-theory. No follow-up series of studies nor replication methodology can be derived from this type of ‘study’, which in essence serves to make it pseudoscience. This is a common ‘study’ format conducted by social skeptics masquerading as scientists, to pan people and subjects they dislike.

There are three general types of Panduction. In its essence, panduction is any form of inference used to pan an entire array of theories, constructs, ideas and beliefs (save for one favored and often hidden one), by means of the following technique groupings:

  1. Extrapolate and Bundle from Unsound Premise
  2. Impugn through Invalid Syllogism
  3. Mischaracterize through False Observation

The first is executed through attempting to falsify entire subject horizons through bad extrapolation. The second involves poorly developed philosophies of denial. Finally, the third involves the process of converting disliked observations, or failures to observe, into favorable observations:

Panduction Type I

Extrapolate and Bundle from Unsound Premise – Bucket Characterization through Invalid Observation – using a small, targeted or irrelevant sample of linear observations to extrapolate and further characterize an entire asymmetric array of ideas other than a preferred concealed one. Falsification by:

Absence of Observation (praedicate evidentia modus ponens) – any of several forms of exaggeration or avoidance in qualifying a lack of evidence, logical calculus or soundness inside an argument. Any form of argument which claims a consequent ‘Q’ while lacking the qualifying modus ponens premise ‘If P then Q’ in its expression – rather, merely implying ‘If P then’ as its qualifying antecedent. This serves as a means of surreptitiously avoiding the argument’s lack of soundness or logical calculus, and of enforcing only its conclusion ‘Q’ instead. A ‘There is no evidence for…’ claim made inside a condition of little study, or full absence of any study whatsoever.

Insignificant Observation (praedicate evidentia) – hyperbole in extrapolating or overestimating the gravitas of evidence supporting a specific claim, when only one examination of merit has been conducted, insufficient hypothesis reduction has been performed on the topic, a plurality of data exists but few questions have been asked, few dissenting or negative studies have been published, or few or no such studies have indeed been conducted at all.

Anecdote Error – the abuse of anecdote in order to squelch ideas and panduct an entire realm of ideas. This comes in two forms:

Type I – a refusal to follow up on an observation or to replicate an experiment does not relegate the data involved to an instance of anecdote.

Type II – an anecdote cannot be employed to force a conclusion, such as using it as an example to condemn a group of persons or topics – but an anecdote can be employed however to introduce Ockham’s Razor plurality. This is a critical distinction which social skeptics conveniently do not realize nor employ.

Cherry Picking – pointing to a talking sheet of handpicked or commonly circulated individual cases or data that seem to confirm a particular position, while ignoring or denying a significant portion of related context cases or data that may contradict that position.

Straw Man – misrepresentation of either an ally’s or opponent’s position or argument, or fabrication of such a position in the absence of any stated opinion.

Dichotomy of Specific Descriptives – a form of panduction, wherein anecdotes are employed to force a conclusion about a broad array of opponents, yet are never used to apply any conclusion about self, or one’s favored club. Specific bad things are only done by the bad people, but very general descriptives of good, apply when describing one’s self or club. Specifics on others who play inside disapproved subjects, general nebulous descriptives on self identity and how it is acceptable ‘science’ or ‘skepticism’.

Associative Condemnation (Bucket Characterization and Bundling) – the attempt to link controversial subject A with personally disliked persons who support subject B, in an effort to impute falsehood to subject B and frame its supporters as whackos. Guilt through bundling association and lumping all subjects into one subjective group of believers. This will often involve a context shift or definition expansion in a key word as part of the justification. Spinning, for example, the idea that those who research pesticide contribution to cancer are also therefore flat Earthers.

Panduction Type II

Impugn through Invalid Syllogism – Negative Assertion from a Pluralistic, Circular or Equivocal Premise – defining a set of exclusive premises to which the contrapositive applies, and which serves to condemn all other conditions.

Example (note that ‘paranormal’ here is defined as that which a nihilist rejects as being even remotely possible):

All true scientists are necessarily skeptics. True skeptics do not believe in the paranormal. Therefore no true scientist can research the paranormal.

All subjects which are true are necessarily not paranormal. True researchers investigate necessarily true subjects. Therefore to investigate a paranormal subject makes one not a true researcher.

All false researchers are believers. All believers tend to believe the same things. Therefore all false researchers believe all the same things.

Evidence only comes from true research. A paranormal investigator is not a true researcher. Therefore no evidence can come from a paranormal subject.

One may observe that the above four examples – the thought which rules social skepticism today – are circular in syllogism and can only serve to produce the single answer which was sought in the first place. By ruling out entire domains of theory, thought, construct, idea and effort, one has essentially panned everything except that which one desires to be indeed true (without saying as much). It would be like Christianity pointing out that every single thought on the part of mankind is invalid, except what is in the Bible – the Bible being the codification of the above four circular syllogisms into a single document.
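The invalidity of these syllogisms can be made concrete with a small model check. Below is a hypothetical sketch in Python (the `Person` structure and its fields are illustrative assumptions of mine, not anything from the arguments quoted above). It constructs a single counterexample to the first syllogism: a skeptical scientist who researches a ‘paranormal’ claim without believing in it satisfies both premises while violating the conclusion, because the premises speak of belief while the conclusion speaks of research.

```python
from dataclasses import dataclass

@dataclass
class Person:
    is_true_scientist: bool
    is_skeptic: bool
    believes_paranormal: bool
    researches_paranormal: bool

def premises_hold(people):
    # Premise 1: all true scientists are skeptics.
    # Premise 2: true skeptics do not believe in the paranormal.
    return all(
        (not p.is_true_scientist or p.is_skeptic) and
        (not p.is_skeptic or not p.believes_paranormal)
        for p in people
    )

def conclusion_holds(people):
    # Claimed conclusion: no true scientist researches the paranormal.
    return all(not (p.is_true_scientist and p.researches_paranormal)
               for p in people)

# Counterexample: a skeptical scientist who investigates without believing.
model = [Person(is_true_scientist=True, is_skeptic=True,
                believes_paranormal=False, researches_paranormal=True)]

print(premises_hold(model))     # True  - both premises are satisfied
print(conclusion_holds(model))  # False - yet the conclusion fails
```

The conclusion only follows if one smuggles in a hidden premise that researching a subject entails believing in it – precisely the concealed, favored assumption that panduction protects.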

Panduction Type III

Mischaracterize through False Observation – Affirmation from Manufacturing False Positives or Negatives – manipulating the absence of data or the erroneous nature of various data collection channels to produce false negatives or positives.

Panduction Type III is an extreme form of an appeal to ignorance. In an appeal to ignorance, one is faced with observations of negative conditions which could tempt one to infer inductively that there exists nothing but the negative condition itself. An appeal to ignorance simply reveals one of the weaknesses of inductive inference.

Let’s say that I find a field which a variety of regional crow murders frequent. So I position a visual motion detection camera on a pole across from the field in order to observe the crow murders which frequent it. In my first measurement and observation instance, I observe all of the crows to be black. Let us further assume that I then repeat that observation exercise 200 times on that same field over the years. From this data I may well develop a hypothesis, including a testable mechanism, in which I assert that all crows are black. I have observed a large population size, and all of my observations were successful; to wit: I found 120,000 crows to all be black. This is inductive inference. Even though this technically would constitute an appeal to ignorance, it is not outside of reason to assert a new null hypothesis, that all crows are black – because my inference was derived from the research and was not a priori favored. I am not seeking to protect the idea that all crows are black simply because I or my club status are threatened by the specter of a white crow. The appeal to ignorance fallacy is merely a triviality in this case, and does not ‘disprove’ the null (see the Appeal to Fallacy). Rather it stands as a caution that plurality should be monitored regarding the issue of all crows being black.

But, what if I become so convinced that the null hypothesis in this case is the ‘true’ hypothesis, or even preferred that idea in advance because I was a member of a club which uses a black crow as its symbol? In such a case I approach the argument with an a priori belief which I must protect. I begin to craft my experimental interpretation of measurement such that it conforms to this a priori mandate in understanding. This will serve to produce four species of study observation procedural error, which are in fact, pseudoscience; the clever masquerade of science and knowledge:

A.  Affirmation from Result Conversion  – employing a priori assumptions as filters or data converters, in order to produce desired observational outcomes.

1.  Conversion by a priori Assumption (post hoc ergo propter hoc). But what if the field I selected bore a nasty weather phenomenon of fog, on an every-other-day basis? Further then, this fog obscured a good view of the field, to the point where I could only observe the glint of sunlight off the crows’ wings – which causes several of them to appear white, even though they are indeed black. But because I ‘know’ there are no white crows, I use a conversion algorithm I developed to count the glints inside the fog and register them as observations of black crows – even though a white crow would produce the same glint. I have created false positives by corrupted method.

2.  Conversion by Converse a priori Assumption (propter hoc ergo hoc – aka plausible deniability). Further then, what if I assumed that any time I observed a white crow, this was an indication that fog was present, and that a condition of Conversion by a priori Assumption was therefore in play? I would henceforth never be able to observe a white crow at all, finding only results which conform to the null hypothesis – which would now be an Omega Hypothesis (see The Art of Professional Lying: The Tower of Wrong).
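The two conversion rules above can be rendered as a toy simulation – a hypothetical sketch only, in which the session count, flock size and a 1% true white-crow rate are numbers I have invented for illustration. It shows that once both a priori rules are in force, no sighting the camera could ever make is capable of registering a white crow:

```python
import random

def record(actual, foggy):
    """Apply the two a priori conversion rules to one sighting."""
    if foggy:
        return 'black'   # Rule 1: any glint in fog registers as a black crow
    if actual == 'white':
        return 'black'   # Rule 2: a clear-day white crow 'proves' fog was present
    return actual        # only unambiguous black crows remain

random.seed(1)
actual_whites, recorded_whites = 0, 0
for day in range(200):                # 200 observation sessions
    foggy = (day % 2 == 0)            # fog on every other day
    for _ in range(600):              # ~600 crows per session
        crow = 'white' if random.random() < 0.01 else 'black'
        actual_whites += (crow == 'white')
        recorded_whites += (record(crow, foggy) == 'white')

print(actual_whites > 0)   # True: white crows really were present
print(recorded_whites)     # 0: the method is incapable of recording one
```

The null hypothesis has become unfalsifiable by construction – the signature of an Omega Hypothesis.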

Example: Viking Mars Lander Data Manipulation

Two Viking landers were sent to Mars, in part to search for signs of life. NASA researchers took soil samples the landers scooped from the surface and mixed them with nutrient-rich water. If the soil held life, the theory went, the soil’s microbes would metabolize the nutrients in the water and release a certain signature of radioactive molecules. To their pleasant surprise, the nutrients were metabolized and radioactive molecules were released – suggesting that Mars’ soil contained life. However, the Viking probes’ other two experiments found no trace of organic material, which prompted the question: if there were no organic materials, what could be doing the metabolizing? So by assumption, the positive results from the metabolism test were dismissed as derivative of some other chemical reaction, which has not been identified to date. The study was used as a rational basis from which to decline further search for life on Mars, when it should more appropriately have been deemed ‘inconclusive’ – especially in light of our finding organic chemicals on Mars in the last several months.1

B. Affirmation from Observation Failure Conversion – errors in observation are counted as observations of negative conditions, further then used as data or as a data screening criterion.

Continuing with our earlier example, what if on 80% of the days in which I observed the field full of crows, the camera malfunctioned and errantly pointed into the woods to the side, such that I was fully unable to make observations at all on those days? Further then, what if I counted those non-observing days as ‘black crow’ observation days, simply because I had defined a black crow as being the ‘absence of a white crow’ (pseudo-Bayesian science), instead of being constrained to only the actual observation of a physical white crow? Moreover, what if, because of the unreliability of this particular camera, any observations of white crows it presented were tossed out, so as to prefer observations from ‘reliable’ cameras only? This too is pseudoscience, in two forms:

1.  Observation Failure as Observation of a Negative (kíndynos apousías) – a statistical study which observes false absences of data and further then assumes them to represent verified negative observations. A study containing a field or set of data in which there exists a risk that the absences being measured will have been caused by external factors which artificially serve to make the evidence absent, through risk of failure of detection/collection/retention of that data. The absences of data are therefore not negative observations; rather they are presumed to be negative observations, in error. This will often serve to produce an inversion effect in the final results.

2.  Observation Failure as Basis for Selecting For Reliable over Probative Data (Cherry Sorting) – when one applies the categorization of ‘anecdote’ to screen out unwanted observations and data. Based upon the a priori and often subjective claim that the observation was ‘not reliable’. Ignores the probative value of the observation and the ability to later compare other data in order to increase its reliability in a more objective fashion, in favor of assimilating an intelligence base which is not highly probative, and can be reduced only through statistical analytics – likely then only serving to prove what one was looking for in the first place (aka pseudo-theory).

These two forms of conversion of observation failures into evidence in favor of a particular position are highlighted no better than by studies which favor healthcare plan diagnoses over cohort and patient input surveys. Studies such as the Dutch MMR-Autism Statistical Meta-Analysis or the Jain-Marshall Autism Statistical Analysis failed precisely because of the two fallacious methods above regarding the introduction of data – relying only upon statistical analytics of risk-sculpted and cherry-sorted data, rather than direct critical path observation.
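The first of these two failure modes can be sketched numerically. In this hypothetical Python illustration (the 2% true incidence rate and both detection rates are invented numbers, not figures from the studies named above), two cohorts share an identical true rate of a condition, but one cohort’s records fail to detect most cases – and every detection failure is silently counted as a verified negative:

```python
import random

def measured_rate(n, true_rate, detection_rate):
    """Observed incidence when every undetected case counts as a negative."""
    hits = 0
    for _ in range(n):
        has_condition = random.random() < true_rate
        detected = has_condition and random.random() < detection_rate
        hits += detected    # an undetected case is recorded as 'condition absent'
    return hits / n

random.seed(7)
# Identical 2% true incidence in both cohorts; only record quality differs.
rate_good_records = measured_rate(100_000, true_rate=0.02, detection_rate=0.95)
rate_bad_records  = measured_rate(100_000, true_rate=0.02, detection_rate=0.20)

print(rate_bad_records < rate_good_records / 2)  # True: same reality, inverted signal
```

An analysis ingesting the second cohort’s records as ‘reliable negative observations’ will conclude the condition is rarer there – kíndynos apousías manufacturing an inversion effect out of nothing more than collection failure.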

Example: Jain-Marshall Autism Study

Why is the 2015 Jain-Marshall Study of weak probative value? Because it took third-party, unqualified (health care plan) sample interpretations of absences (these are not observations – they are ‘lack-of’ observations – which are not probative data to an intelligence specialist, nor to a scientist – see pseudo-theory) from vaccinated and non-vaccinated children’s final medical diagnoses at ages 2, 3, and 5. It treated failures in the data collection of these healthcare databases as observations of negative results (kíndynos apousías) – a data vulnerability similar to the National Vaccine Injury Compensation System’s ‘self-volunteering’ of information and its limitation of detection to within 3 years. This favors a bad, non-probative data repository simply because of its perception as being ‘reliable’ as a source of data. It fails to catch 99% of signal observations (Cherry Sorting), and there is good demonstrable record of that failure to detect actual injury circumstances.2

One might chuckle at the face-value ludicrousness of either Panduction Type III A or B. But Panduction Type III is regularly practiced inside peer reviewed journals of science. Its wares constitute the most insidious form of malicious and oppressive fake science. One can certainly never expect a journalist to understand why this form of panduction is invalid, but certainly one should expect it of peer review scientists – those who are there to protect the public from bad science. And of course, one should expect it of an ethical skeptic.

epoché vanguards gnosis

——————————————————————————————

How to MLA cite this blog post =>

The Ethical Skeptic, “Panduction: The Invalid Form of Inference” The Ethical Skeptic, WordPress, 31 Aug 2018; Web, https://wp.me/p17q0e-8c6

 

August 31, 2018 | Argument Fallacies, Tradecraft SSkepticism

Abuse of the Ad Hoc ‘Fallacy’

By domain definition, something which is critical path in argument can never be fallaciously ad hoc, even if not readily addressable by evidence. Beware of those who do not get this. The sad reality is that, contrary to their memorized talking points, social skeptics exercise an implicit definition of ad hoc fallacy which is – ‘a bucket in which I place every counter explanation, evidence or claim which defends a subject I do not like’. Ironically exposing the fake skeptic’s inability to handle critical path logic in the first place.

I just finished reading a series of articles by various social skeptics, purporting to explain the ‘ad hoc‘ rescue or ‘ad hoc‘ fallacy as it is sometimes called. The various definitions tendered include a variety of spins on the concept of basically arguing through Making Shit Up (MSU). This comes close enough I suppose, but for me, being a philosopher and having struggled through the prosecution of real scientific questions and real patents, much unlike our cabal of social skeptics, I demand a bit more rigor in my Wittgenstein. Let’s examine one of those purported definitions of the ad hoc rescue:

I need to spend a few minutes explaining an extremely common logical fallacy among flat earthers (and creationists, anti-vaccers, etc.). This is what is known as an ad hoc fallacy. Unlike most fallacies, this does not occur as part of an argument, but rather as part of a counterargument. It arises when someone is faced with evidence that contradicts their view, and they respond by inventing a solution for which there is no evidence. In other words, they invent a response that you would never accept unless you were already convinced of their view. It also often has the property of being unfalsifiable. In other words, it is something that cannot actually be tested and must be accepted on faith.1

Before we begin, let me make a couple of things clear. I am not a flat Earther, nor a creationist, nor am I anti-vaccine (although in full disclosure I do have a child permanently disabled by encephalopathy from the pertussis vaccine – fake skeptics, you can stop reading here and continue to cocoon inside your self-aggrandizing ignorance). So my purpose is not to defend those movements in this blog article. Fake skeptics will spin any word of caution as ‘anti-ism’ (itself a failure of critical path logic), and that is simply a social foible which the rest of humanity has learned to expect from them. But I do bristle at the ways in which social skeptics go about bucket-condemning subjects and abusing philosophy in the process, in support of their political goals. Where else will these false philosophies and methods be applied, in order to condemn a subject which does bear merit?

The author above (his identity is not my focus here; I am sure he is a fine person) begins his blog title with the incorrect but trendy technique of not capitalizing the title of the article – a fad adopted by persons wishing to appear as if they were publishing a scientific study (a style used by some journals). The pretense includes his failure to capitalize the word ‘Earth’ at all (as in ‘The Earth’ and not ‘dirt’).2 Aside from this, the author tenders a half-correct framing of the principle of ad hoc response; moreover, one which also constitutes a Bridgman reduction and permissive argument framing – a very common technique in most social-political circles: the reconstruction of a principle into a version which is ‘simple’, such that sycophants can understand it, but which also mis-defines the principle and encourages its abuse or misapplication. In other words, a political ‘fallacy’, and not one of science nor skepticism.

Where the author is correct

An ad hoc rescue is a defensive response to a challenge in argument or evidence, which is ignoratio elenchi and serves to divert or distract a discussion into a domain inside which claims can no longer be discriminated. This much of his definition is true.

“John still loves me, I just know it.”

“But John is living with Lisa now, and has been for months!”

“He is just doing that to make me jealous.” <– the ad hoc rescue (litmus: non-critical path and cannot be differentiated from something just made up)

But he is failing to discriminate important principles with regard to the nature of non-ad hoc assertions, exhibited thusly:

“John said he is sending me a note, and wants to meet for dinner next Thursday.” <– this is not ad hoc (she might be mistaken in conclusion, but she poses a testable and more importantly, critical path)

Most commonly, the ad hoc fallacy accusation is misapplied to the equivalent of the last sentence in this example. The author of the definition above has hinted that he has not grappled with this fallacy inside an actual complex argument of plurality, say in a professional lab setting, as opposed to an argument of political symbology like flat Earth, vaccine risk or Bigfoot ‘skepticism’. As a side note, I am not entirely convinced that real flat Earth believers even exist – rather, they are simply applying akratic trolling, purposely irritating skeptics by using the very methods (methodical cynicism) taught by their Cabal, and ironically demonstrating that such sciencey-sounding protocols can and do lead to very errant socialized conclusions. Flat Earthers are following the Carl Sagan Baloney Detection Kit step by step, fake skeptics. Get used to it. These are the eggs you laid in the ’60s and ’70s, and the chickens are just now starting to come home to roost. When you use a Bridgman Reduction to craft methods which can be used to debunk anything, expect anything to be debunked by those same methods – even your pet ideas. Skeptics, you need to up your game if we are all to prevent such deleterious uses of your protocols. This is one reason why your movement is disintegrating – you do not really understand the principles which you employ in your high-caliber doctrine rifles. To us philosophers, this is a very hilarious play to watch.

Your methods and you yourself social skeptic, are being mocked by flat Earthers and you fail to even realize it.

The above definition itself however looks and sounds alright, does it not? But the trained discriminating eye of the philosopher is needed here in order to distinguish its ad hoc social and political demagoguery from real science. The author is indeed incorrect with the remainder of his definition.

Where the author is incorrect

1.  Ad hoc rescue conjectures involve responses wherein the defensive counter being made, involves a principle of non-falsifiability –  and not ‘unfalsifiability‘ as he has used. (See English Language & Usage: When is the prefix non- used versus un-).3

un – as in the ‘opposite of’ (i.e. ‘inductively/deductively true’)

non – as in ‘outside the domain of applicability’

This mistake is not a triviality of substitution, as the difference between ‘non-’ and ‘un-’ here relates to a principle called critical path logic. The inability to handle critical path logic, such as the hyperbolic or inappropriate use of the ‘un-’ prefix, stands as a stark warning flag in a skeptic. It is much akin to claiming to be an expert in ‘Kwantum Mechanics’ – this type of mistake is not just a misspelling, and does NOT constitute a trivial error. In fact, one’s skill in discerning critical path logic determines whether or not one can even correctly discern a condition of ad hoc fallacy response to begin with (see below).

2.  An ad hoc fallacious response cannot be addressed in the here and now (not simply ‘cannot actually be tested‘), and therefore cannot realistically be differentiated from something just made up.

3.  An ad hoc fallacy does not involve something which ‘must be accepted on faith‘ by someone ‘already convinced of their view‘ – as this simply constitutes prejudicial wording which will be familiar and therefore granted automatic favor by his apothegm-trained audience. The critical principle resides in two elements:

i.  a distraction into the non-critical path (ignoratio elenchi), and

ii.  the inability to differentiate what was said, from something just made up as a response artifice or as misinformation.

This is more often the action of one who couldn’t care less about studying the issue at hand than it is the action of a ‘true believer’. One’s acting as the sponsor of an idea is never the single qualification as to an ad hoc fallacy. This is one litmus you can use for detecting a fake skeptic. A sponsor who goes and looks is NEVER ad hoc arguing.

4.  An ad hoc fallacy is not actually a ‘fallacy of logic‘ (a fallacy in critical path logic is called a formal fallacy) – rather, it is an informal fallacy called ignoratio elenchi. It is a fallacy outside of logic. Some day it might logically end up being true or false – we don’t know. But in the now, it constitutes a diversion into the realm of the non-critical path. Poetically, this understanding is absolutely essential (critical path) to the principle of usage of the term ad hoc in the first place. By definition, something critical path can NEVER be ad hoc, even if currently non-addressable by study. Beware of those who do not get this – and more importantly, those who do not get the fact that citing an informal fallacy does not stand as a disproof of an opponent’s claim nor logic.

5.  Based upon these principles alone, bucket characterizing flat Earth, creationist and vaccine injury risk proponent arguments is lazy, convenient and constitutes a false equivalence. I can probably list the other ‘ad hoc fallacy’ subjects this person would include in this bucket as well. Social skeptics bear very predictable mindsets. They carry an a priori laundry list of things they despise, and they all carry the same list. An ad hoc fallacy is not a fallacy committed by ‘anyone promoting anything which has been debunked or my club and I do not like’. As that basis is rather ad hoc in itself.

6.  Simply citing a response concern for which there currently ‘is no evidence‘, is NOT the qualification of an ad hoc fallacy. Plenty of arguments have not been studied at all. This in no way serves to make them fallaciously ‘ad hoc‘.

7.  An ad hoc fallacy is used to disqualify or warn on a particular point inside a context of dialectic or debate – it cannot be employed to condemn entire subjects inside a polemic, as the author of this mis-definition has done above (which is, itself, fallaciously ad hoc). The “extremely common logical fallacy among flat earthers (sic) (and creationists, anti-vaccers, etc.)” quip is a prejudicial framing without qualification merit – in other words, pseudoscience. I do not even belong to these groups, but this definition is so inexpertly wielded, with such shortcut bandwagon-esque vitriol, that my hackles begin to rise. Who is the next victim of this lazy hack job? My field of study? My company? My home and kids? Oh, that is right… clowns like this have already attacked my family and kids; I forgot. Mistakes like these are the signature traits of unaccountable idiots.

You will find that the ‘skeptic’ community never polices their own, nor provides any ethical peer review of its members’ drivel nor horrid actions. Any jerk or malevolent actor can become a skeptic, as long as they spout the familiar sounding jargon. Skeptics are never held accountable for anything they say and do.

This is why skeptics are losing the battle for the American mind. Americans are a sincere and open minded people for the most part – and they learn about people from their actions, and not their words.

8.  Finally, this fallacy is not ‘extremely common‘ as the author cites, without evidence. Rather it is a recourse of common use on the part of people who do not grasp elements 1 through 7 above.

The actual incidence of the ad hoc fallacy is not as great as is the instance of its unqualified accusation.

Below you will see examples as to when and why the above definition is wrong. But first, let’s examine the ad hoc fallacy itself.

ad hoc fallacy

/philosophy : rhetoric : pseudo-theory : ignoratio elenchi/ : an ignoratio elenchi response to an argument or evidence, which seeks to exploit ambiguity or non-accountability as a domain in which to craft a defense which cannot be readily distinguished from something made up. Invention of an explanation which distracts attention away from critical path logic, and/or for which evidence to the pro and con cannot be derived in the now, and/or falsification is unapproachable. A tactic of pseudo-theory and a form of rhetoric.

Despite often getting the definition right, social skeptics even more commonly fail to relate correct examples of its application. More often they extrapolate the fallacy to condemn whole subjects, and the appeals of people they do not like. However, watch these same bad philosophers exercise their grasp of the ad hoc fallacy (as above), and inevitably it comes down to this real, discriminating definition:

ad hoc fallacy – any counter explanation, evidence or claim which appears to defend an idea I do not like.

When ad hoc Fallacy Does Not Apply

There are several circumstances in which the ad hoc fallacy is accused, yet which do not fit – in fact, circumstances in which the claim of ad hoc pseudo-theory is just flatly wrong, flagging a condition of incompetence on the part of the contending skeptic:

A.  When the claimant is raising plurality with sufficient Ockham’s Razor basis (not ‘Occam’s Razor’).

The faking skeptic may mistakenly straw man this species of assertion as 'just asking the question' on the part of the claimant. Demanding that H. pylori be studied as the potential cause of ulcers (plurality had been introduced) was not an ad hoc claim that 'some mysterious pathogen was to blame', as skeptics had contended in order to block science for 30 years on this issue. Evidence inductively pointing to H. pylori's role in ulcers had existed for 30 years before science finally stopped dismissing the idea as 'ad hoc'.

B.  When the claimant is raising plurality as a stakeholder under risk.

Asking that long term cohort (to age 14 and multifaceted expression) studies be conducted on both specific vaccines and the 43+ event vaccine schedule as a whole is not a case of ad hoc fallacy. It is otherwise normal, ethical and critical path science. Such studies are critical to the issue and have not been attempted. Such appeals for study are not 'made up', nor do they appeal to a domain of non-measurability. This is study we can perform as a reasonable body of science, yet we refuse to allow or execute it because of oppressive non-science political influences (such as the pretend-science article from which the above definition of ad hoc fallacy originated). Understanding this is part of a skill set in critical path logic. To equate vaccine-risk study requests to flat Earth theory or creationism is simply a malevolent and lazy lie, on the part of someone who does not care about science nor humanity – only their own celebrity and club ranking. Watch this type of person to observe if they ever visibly step out of line with their club's doctrine. Then you will witness their supposed courage and conviction of science.

C.  When the claimant is addressing the critical path of study or logic.

If a researcher proposes an alternate natural physical explanation for the observed phenomena we attribute to 'dark matter', the mere fact that we cannot investigate its full set of founding assumptions in the now, nor test its predictive outcomes fully, does not make the professional conjecture an ad hoc rescue. The researcher may still be addressing the scientific and logical critical path. They may simply dissent or disagree – while bearing just as much accountability and credibility as the null in this case. An example of an ad hoc fallacy in this context would be 'God hides the foundational elements of the physical universe from us, so that we may focus on spiritual development as our priority'.

D.  When the claimant is addressing the critical path of study and has simply made a mistake/misinterpretation.

If a flat Earther builds a rocket ship to go up and see for themselves whether the Earth is round or flat, this is critical path. It is not ad hoc. They may be mistaken, but they are embarking upon the pathway which will help them answer the question at hand. That is, by definition, not ad hoc. Beware of those who do not get this. Going into the field to study, or asking that such study be done, is NEVER fallaciously ad hoc.

E.  When the claimant (even an outsider) is citing that insufficient study has been conducted (praedicate evidentia, ignoro eventum or fallacy of relative privation).

The faking skeptic may mistakenly straw man this species of assertion as 'just asking the question' on the part of the claimant. Any time a sponsor requests that further study be done, and for particular reasons – even if anecdotal, and even if in unsophisticated language or philosophy – this is not a case of fallacious ad hoc appeal. Nor does it amount to a case of Dunning-Kruger.

Typically in such circumstances you will see fake skeptics chime in with the claim that the sponsor has appealed to conspiracy. They are a conspiracy theorist! This lazy and shallow accusation is the ultimate form of ad hoc fallacy itself, the appeal to implicit conspiracy (a distraction from the critical path of the argument, and an accusation which can always be made, yet can never be distinguished from something just made up):

Appeal to Implicit Conspiracy

/philosophy : pseudoscience : pseudo-skepticism : ad hoc framing/ : the default position taken by a pseudo-skeptic that, in order for a counter-claimant to actively research or have confidence in their proposition, then quod erat demonstrandum they must believe a conspiracy exists which is holding their preferred alternative back from being studied or accepted. This default ad hoc explanation can be leveled at anyone, without discretion; it distracts from the logic at hand, can never be verified, and results only in finding what we already think we know to be true. A substitute form of science (pseudo-theory) issued as pejorative ad hominem and straw man, all rolled up into one baseless and easy claim on the part of a pseudo-skeptic.

F.  When the person making the appeal to ad hoc fallacy does not understand the context, playing field or critical arguments entailed; or thinks that identifying the fallacy constitutes a disproof of the opponent's assertion; or moreover uses a single-point commentary or informal fallacy to condemn an entire group of people or field of research.

War is the ultimate form of ad hoc. Everything done in warfare is adopted as tactic or strategy in order to attain a particular, and many times emergent, purpose. Just because a casual observer might fail to understand what is going on inside a theater of warfare does not mean that the ad hoc actions therein constitute ad hoc fallacies. Much of warfare, science and life in general is ad hoc by its very nature. This does not therefore make it fallacious.

If one uses a fly swatter to kill a fly, the fly swatter is an ad hoc design. Just because one uses it does not mean that its use is therefore an ad hoc fallacy. If, on the other hand, one attempts to use a sledgehammer to kill flies, and never seeks to craft a fly swatter for that purpose, then that is fallaciously ad hoc.

Beware of those who do not grasp the above principles, who often make the accusation of conspiracy theory, or who are unskilled and symbolically habituated in their application of the ad hoc fallacy. These are first-resort, inexpert and clumsy artifices, employed without the necessary qualification – ironically canned, memorized and crafted to be knee-jerk deployed for a single rhetorical purpose: to kill the inconvenient flies of the opposition. Sledgehammer means, by definition.

The true philosopher of science demands more than this charade of skepticism.

epoché vanguards gnosis

——————————————————————————————

How to MLA cite this blog post =>

The Ethical Skeptic, “Abuse of the Ad Hoc ‘Fallacy’” The Ethical Skeptic, WordPress, 19 Jul 2018; Web, https://wp.me/p17q0e-7Wv

July 19, 2018 | Argument Fallacies

Interrogative Biasing: Asking the Wrong Question in Order to Get the Right Answer

A wrong question under the scientific method is generally posed for one of two reasons: ignorance, or the desire to cultivate ignorance. It is the latter motive for which the ethical skeptic must always be on guard. One learns early on inside the social skepticism movement that, in order to derive the right answer, all one need do is simply ask the wrong question.

Pseudoscience is a descriptive of method, not of subject. Understanding this is what differentiates the fake skeptic from the real thing. One of the primary tactics of pseudoscience is for a person to tender the appearance of asking a sciencey-sounding question (usually under the virtue identity of being a 'skeptic'), while hoping that the victim against whom they are arguing does not comprehend the difference between pseudoscience and real science. The first tactic of pseudoscience is the asking of a biased or incoherent question which tenders the appearance of being scientific in its crafting. You will be surprised that even in the halls of established science this trick is applied, and passes peer review. The study claims run along the lines of 'we are asking an incomplete and partially incoherent question, and should understand the results for what they are inside that light' – whereupon the answer is then extrapolated by social activists (social skeptics) into a set of ramifications and pervasive conclusions such studies never meant to impart. This type of study often constitutes a wild, disconnected shot in the dark – a hope for a compliant outcome, through the clever abrogation of real and plenary science.

Failure to follow critical path is a key sign of scientific fraud – even if the internal procedural protocols of a study itself are ethical. A grand statistical study which does not follow an incremental and dependent pathway of query (in other words, one in which specific outcomes establish its sequential logical necessity under Ockham's Razor) is fraud dressed up in a science lab coat. It is out of sequence, bypassing much more deductive and direct-testing alternatives, employing science based upon an unsound and manipulated grand set of data – otherwise known as pseudoscience.

An example of such an Ockham’s Razor orphan form of pseudoscience can be found here:  https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1124634/

The sincere skeptical researcher will begin their research from a position of suspended judgement, and then proceed to ask a series of dependent and incremental questions, called a critical path. They are not overly retrophile on previous work/art, often working more as a critic of such approaches. They do not begin with grand statistical studies outside the question domain, or focused on one small portion of the scientific or population domains. The onus is upon the ethical skeptic to understand this, and to detect when a query seeks to combine or skip questions inside this critical path to force a compliant outcome; or worse, attempts to trick, impugn or twist ideas and people by means of 'asking a question'. This is done for two reasons: ignorance, or the desire to cultivate ignorance. The two motivations help create each other in a social context – hence the origin of the apothegm of ethical skepticism:

Ignorance is contagious.

The latter, a desire to cultivate ignorance established by means of the Verdrängung Mechanism, is practiced by social skeptics. One learns early on inside the social skepticism movement that, in order to derive the right answer, all one need do is first ask the wrong question. It is actually a very brilliant strategy; one can even practice it without knowing so. However, it takes a more committed, sincere and sharp acumen to catch the trick which enables this symbiosis between ignorance and the cultivation of ignorance – a trick called interrogative biasing.

Interrogative Biasing

/philosophy : pseudoscience : fallacy : red herring : scientific method pretense/ : ask the wrong question and you are assured to arrive at the right answer. A method of faking science by asking an incomplete, statistical absence, non-probative, ill sequenced or straw man question, fashioned so as to achieve a result which implies a specific desired answer; yet is in no way representative of plenary or ethical science on the matter under consideration.

One can observe interrogative biasing in a number of situations. It usually comes within a context of virtue signaling on the part of the person asking the question – the virtue being a position of social justice, a claim to represent God, or a claim to represent science. Interrogative biasing is the strategy of obfuscation through the posing of incorrect, impugning or badly sequenced questions of science. The tactics it typically employs include:

1.  Querying Reliable Data and Not Probative Data

“We sought medical plan databases, and avoided cohort studies or parental reports due to the unethical or unreliable nature of such study.”

2.  Querying Flawed Means of Collection for Observations of Absence (Hempel’s Paradox)

“We examined two specific public healthcare plan databases in Denmark to observe incidence of accepted claims of plan doctor diagnoses of autism in kids 6 months to 5 years in age.”

3.  Asking a Surreptitiously Incoherent Question (Imposterlösung Mechanism)

“Please provide testable evidence for God.”

4.  Asking an Out of Sequence Question – a question which eventually should be asked, but which depends upon other questions being answered first

“What technologies will allow us to sequester carbon into ocean water?”

5.  Asking a Currently (Current Knowledge) Unaddressable Question

“If life did not originate from abiogenesis on Earth, then how did life begin?”

6.  Proof Gaming – Demanding things be ‘proven’ before science can be allowed to begin

“What if any, physical proof do you have of this persistent phenomenon (observation)?”

7.  Straw Man Question Framing

“We sought to test if therapeutic vitamin supplementation would have any impact on incidence of heart disease during a 5 year observation horizon of a group of persons.”

8.  Question Lacking in Plenary Science, Adequate or Ethical Domain

“We sought to test if the MMR vaccine was associated with higher rates of autism in Danish children (on a much lower vaccine schedule).”

9.  Trick/Ambiguous/Amphibological Question (uti dolo)

“Do you as a scientist accept the reality of climate change?”

10.  Begging the Point – the framing of a question from a desired answer in such a fashion that its desired conclusion is the only viable answer

“Why if there is no God, is everything around us in perfect designed balance?”

11.  Eristic Question – a question framed so as to cast the recipient in the worst light

“Wasn’t your paper rejected for fraudulent scientific procedure, if I recall correctly?” (Had to correct one assumption, which did not change outcome)

12.  Convergent Semantics – a question which does not allow an answer outside a particular conclusion domain

“Have you stopped beating your wife?”

13.  Red Herring – posing an irrelevant, bucket characterization, misinforming or unsound question

“Why are supplements not controlled by the FDA in ways which scheduled drugs are?”

14.  ingens vanitatum – posing a rapid series of irrelevant questions in order to tender the appearance of competency inside a subject. However, none of the questions bears any critical understanding of the subject being discussed, or they are posed in an illogical sequence or order.

“What was the court docket number?  Was the case heard by a state or federal judge? In what precinct was it filed?”
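Tactic 2 above – querying a flawed means of collection for observations of absence – can be illustrated with a minimal numerical sketch. All figures below (population size, true incidence, database capture rate) are invented purely for illustration and do not model any real study: a query which faithfully counts what a lossy database contains will still report a deceptively low incidence.

```python
import random

random.seed(42)

# Hypothetical illustration: a condition's true incidence is 2% in a
# population of 100,000, but the queried claims database only records a
# case when a claim is both filed and accepted -- assume 40% of true
# cases survive that filter.
POPULATION = 100_000
TRUE_INCIDENCE = 0.02
CAPTURE_RATE = 0.40  # fraction of true cases the database records

# Simulate the population, then the database's lossy capture of it.
true_cases = sum(1 for _ in range(POPULATION) if random.random() < TRUE_INCIDENCE)
recorded_cases = sum(1 for _ in range(true_cases) if random.random() < CAPTURE_RATE)

observed_incidence = recorded_cases / POPULATION

print(f"true incidence:     {true_cases / POPULATION:.4f}")
print(f"observed incidence: {observed_incidence:.4f}")
# The query honestly reports the database's contents -- yet the 'absence'
# it finds is an artifact of the collection method, not of the population.
```

The point of the sketch is not the particular numbers, but that the biased step is hidden in the choice of what to query, before any statistics are run – exactly the condition the tactic list describes.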

Become skilled at detecting such circumstances in query, and you will be amazed at how the supposed heroes of 'skepticism' will, in your eyes, steadily become tarnished and fall from grace.

epoché vanguards gnosis

——————————————————————————————

How to MLA cite this blog post =>

The Ethical Skeptic, “Interrogative Biasing: Asking the Wrong Question in Order to Get the Right Answer” The Ethical Skeptic, WordPress, 14 Jul 2018; Web, https://wp.me/p17q0e-7Vo

July 14, 2018 | Agenda Propaganda, Argument Fallacies
