Rather than presume, as capstone upon incredibly risk-ignorant stacks of knowledge, what is true and not true, the ethical skeptic instead focuses on field observation and the suspension of doubt, belief and provisionally stacked assumption. He is not denying knowledge; rather, he is denying the lie-spinner the raw material he so desperately needs. He is denying the tradecraft of the lie: an Omega Hypothesis – a condition wherein the conclusions themselves become more important than the process of knowledge development, and wherein conformance of belief becomes a moral imperative.
In the previous blog post we discussed the Riddle of Skepticism and a thing called the Tradecraft of the Lie. Inside this precept we are made sensitive to the role which machinated doubt, belief and ignorance of provisionally stacked risk play inside grand fantasies posed as representing (pluralistic ignorance) accepted scientific thinking. Several warning signs can be monitored to watch for such a condition, wherein social forces are seeking to promote an idea (the Omega Hypothesis) in such a fashion as to block further study and end the scientific discourse. In the process of doing so, these same forces will speak often about ‘evidence’, ‘study’ and ‘science’. Of course evidence, study and science are the foundation of our knowledge development process. But simply because one proclaims such words does not mean that the proclaimant actually understands, nor represents, science.
The Tower Which Cannot Be Touched – At All Costs
Below I have employed the analogy of a Jenga block game to illustrate the principles comparing ideal science with the reality of fake knowledge posing as science, and how its proponents undertake pseudoscience (a pretend method/action – and NOT a subject) in an effort to block competing ideas. A shaky tower cannot be touched, at all costs. Therefore any method of obfuscating competing ideas becomes part of the ‘stack’ of provision afforded the Omega Hypothesis. The job of fake skepticism is to ensure that no competing idea nor unauthorized entity ever touches the shaky tower of blocks. As well, it is to ensure that the resource and obfuscation gaps inside the Embargo Hypothesis (on the right below) – the hypothesis which gets them angry – are never addressed by science, and that the question of the Embargo Hypothesis can never be raised in serious scientific discussion, at cost of severe career penalty.
This principle is underpinned by key elements of Karl Popper’s philosophy, as outlined inside The Stanford Encyclopedia of Philosophy:¹
In the view of many social scientists, the more probable a theory is, the better it is, and if we have to choose between two theories which are equally strong in terms of their explanatory power, and differ only in that one is probable and the other is improbable, then we should choose the former. Popper rejects this. Science, or to be precise, the working scientist, is interested, in Popper’s view, in theories with a high informative content, because such theories possess a high predictive power and are consequently highly testable. But if this is true, Popper argues, then, paradoxical as it may sound, the more improbable a theory is the better it is scientifically, because the probability and informative content of a theory vary inversely—the higher the informative content of a theory the lower will be its probability, for the more information a statement contains, the greater will be the number of ways in which it may turn out to be false. Thus the statements which are of special interest to the scientist are those with a high informative content and (consequentially) a low probability, which nevertheless come close to the truth.¹
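Popper’s inverse relation between informative content and probability can be restated in one line of elementary probability theory: a conjunction (a claim which asserts more, and is therefore more informative and more testable) can never be more probable than either of its conjuncts alone.

```latex
% The more a theory asserts, the lower its prior probability:
P(A \wedge B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A)
```

Each added conjunct multiplies in another factor of at most one, which is precisely why highly contentful theories are both less probable and more falsifiable.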
Provisional Knowledge and Risk
In contrast, you will note that it is the job of the provisional knowledge proponent to seek out stacks of self-defined most-probable conclusion. This is why basement and cubicle skepticism gets swallowed so easily by its sycophancy. It involves information which simultaneously ignores the contribution of increasing risk under plurality, and sidesteps the requirements of a coherent and sound argument bearing a sufficient level of integrity in its logical calculus. Most of our questionable skeptical ‘beliefs’ today (such as nihilism) fail in regard to these two duties, regardless of how probable the evidence makes them.
The Specter of Ignored Increasing Risk
1. Is not evaluated in terms of increasing risk-chain dependency
2. Is not informative merely because it is probable; nor does it tolerate competing falsification work being undertaken
3. Cannot be assailed because of its probable superiority and critical basis for other arguments.
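The first risk feature, risk-chain dependency, can be made concrete with a toy probability model (an assumption for illustration only: the stacked claims are treated as independent, and the confidence figures are invented for the sketch). Even when each provisional claim in a stack is individually probable, the probability that the entire stack holds collapses as blocks are added:

```python
def stack_probability(p_per_claim: float, n_claims: int) -> float:
    """Joint probability that every claim in a stack of independent,
    individually probable claims holds simultaneously."""
    return p_per_claim ** n_claims

# A single claim held at 90% confidence seems safe on its own...
print(f"{stack_probability(0.90, 1):.3f}")   # 0.900
# ...but a tower of ten such claims jointly holds only about a third of the time,
print(f"{stack_probability(0.90, 10):.3f}")  # 0.349
# and every further extrapolation added to the stack lowers it again.
print(f"{stack_probability(0.90, 20):.3f}")  # 0.122
```

This is the quantitative face of the Jenga tower: each individually ‘probable’ block multiplies risk into the whole, which is why the tower cannot afford to be touched.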
The key here is that as a stack of favored knowledge grows ever higher and even more extrapolations are added over time (and they always are), the three standing risk features above become ever more powerful. So powerful, in fact, that in order for the argument to survive, its entailed conclusions must become more important than the process of knowledge development itself. They must become doctrine, as a vaccination against future science which might threaten their status. This is how an Omega Hypothesis works – it must be protected at all costs, in order to not upset its multiple and tall stacks of provision. This is the job of Social Skepticism: to protect these stacks – the money, power, tenure and religious interweaving from which multiple arguments extend and thrive. Let’s examine a study abstract below from a proponent of what they call ‘evidentialism’. Ethical skepticism, in contrast, regards evidence as only one of the evaluation factors of a good argument; ranking, in order of importance:
1. Coherency – the conjecture and its underpinning definitions actually form a resolvable scientific sentence
2. Soundness – the premises and assumptions bear merit and accuracy in underpinning the argument
3. Formal Theory – the logical calculus (flow) of the argument and the arrangement of its mechanisms are critically dependent and sequitur
4. Inductive Strength – the argument bears consilience and has fairly addressed its relevancy and risk chain dynamics
5. Circumstantial Strength (the ‘evidence’) – the argument bears a robust array of inductive or deductive evidences
6. Integrity of Form/Cogency – the argument filters agency and mitigates bias and is expressed in a neutral fashion bearing scientific integrity.
The above list of six components of argument is how a modus ponens or modus tollens builds its cache of reason – not a list of ‘facts’ tossed out as evidence, even if they do happen to inductively support a somewhat probable conclusion. Under an assumption of ‘probability’, one can pass off merely sound arguments (certainly bearing plurality merit) as ‘proved’ and ‘accepted’ arguments, skipping past critical elements 3 – 6 in the list above – and bereft of understanding of the three risk elements listed earlier. This is the trick of fake skepticism. Such reasoning is foolishness, taught by people who do not understand what is going on inside science, philosophy and skepticism.
The problem with the equivocal word ‘evidence’ is that it is a permissive argument in itself,
which admits any unmerited form of casuistry or trivia to masquerade as epistemology at the table of science.
Even worse is when they begin to insist that adopting such unsound conjecture is a moral obligation. Yes, you heard me – if I produce one small iota of evidence to support my a priori belief, you now bear a moral obligation to believe it as well. Look at the bad things which happen to good people; therefore only this realm can exist. The world is getting hotter; therefore we must elect progressives and introduce violence into the political process. Eminent garbage. Eventually, yes, jail sentences will be introduced for those who do not hold their dissent quietly. This is coming, folks. Get ready.
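The ‘one small iota of evidence’ gambit can be quantified with a simple Bayesian sketch (the prior and likelihood-ratio figures here are invented purely for illustration; they are not drawn from the paper discussed below). A weak piece of evidence – say, one merely twice as likely under the favored belief as under its alternatives – barely moves a rational credence at all, let alone manufactures a moral obligation to believe:

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: update a prior credence with the
    likelihood ratio carried by a single piece of evidence."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# A weak 'iota' (likelihood ratio of 2) against a modest 1% prior:
print(f"{posterior(0.01, 2.0):.4f}")  # 0.0198 -- credence barely moves
```

One iota of weak evidence, in other words, warrants at most one iota of belief adjustment – never the wholesale adoption the evidentialist demands.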
What the author has done in the example below of such a reasoning pathway is to execute a process of panduction – one study which disproves everything standing in objection to his sophistry. An apologetic, to be sure, but also one of exceptionally awesome insistence. This is not a religious pitch which is optional for you, as most religious pitches go. The author makes it clear instead that believing as he does is mandatory. Poetically, his very paper fails the six elements of a strong argument above and ignores the elements of entailed risk. He executes one fell swoop of invalid inference. This is illegitimate science. Notice the 7 things he supposedly ‘disproves’ in his inverse negation rush to defend something which bears no evidence in its own right. He conflates evidence with ‘proof’ and uses the word ‘or’ as a permissive artifice which opens the door to just about any datum which can be whipped up, or which has some brief funding behind it. This is deception and philosophical incompetence. It is oppression in a legal sense. To wit:
‘Evidentialism’ is the conventional name (given mainly by its opponents) for the view that there is a moral duty to proportion one’s beliefs to evidence, proof or other epistemic justifications for belief. This essay defends evidentialism against objections based on the alleged involuntariness of belief, on the claim that evidentialism assumes a doubtful epistemology, that epistemically unsupported beliefs can be beneficial, that there are significant classes of exceptions to the evidentialist principle, and other shabby evasions and alibis (as I take them to be) for disregarding the duty to believe according to the evidence. Evidentialism is also supported by arguments based on both self-regarding and other-regarding considerations. ~ Allen Wood, The Duty to Believe According to the Evidence, International Journal for the Philosophy of Religion; Feb 2008, Vol 63.
This ‘study’ is highlighted by social skeptics wishing to enforce iffy beliefs upon those who do not share them with the oppression-minded ‘skeptic’. The problem introduced by such flawed thinking is highlighted by a principle of ethical skepticism called the acatalepsia fallacy:
/philosophy : casuistry : pseudoscience/ : a flaw in critical path logic wherein one appeals to the Pyrrhonistic Skepticism principle that no knowledge can ever be entirely certain – and twists it into the implication that therefore, knowledge is ascertained by the mere establishment of some form of ‘probability’. Moreover, that therefore, when a probability is established – no matter how plausible, slight or scant in its representation of the domain of information it might be – it is now accepted truth. Because all knowledge is only ‘probable’ knowledge, all one has to do is spin an apparent probability, and one has ascertained accepted knowledge. Very similar in logic to the Occam’s Razor aphorism citing that the ‘simplest explanation’ is the correct explanation.
This can be seen as well inside the habitual conflation of verisimilitude and consilience inside the Sskeptic community – two differing approaches to the establishment of hypothesis gravitas inside the scientific community, purposed for two different situations of inference. While consilience is not finished science – rather an important step along the way in developing a hypothesis or understanding – verisimilitude sadly is often, and conveniently, regarded as finished science by non-expert fake skeptics. This is the reason why doubt, belief and stacks of provisional knowledge are eschewed by the ethical skeptic.
/philosophy : pseudoscience : method/ : a social declaration which fits a predetermined agenda, purported to be of ‘weight of evidence’ and science in origin. However, in reality it stems more from the removal/ignoring of the majority or plurality of available or ascertainable evidence, in order to sculpt a conclusion which was sought before research ever began (see Wittgenstein sinnlos Skulptur Mechanism). Conducting science by dwelling only in the statistical and meta-analytical domains while excising all data which does not fit the social narrative of funding entities, large corporations or sskeptic organizations. Refusing to conduct direct studies, publishing studies which contain an inversion effect, and filtering out countermanding studies by attacking journals and authors and by ignoring large bodies of evidence, consilience or falsification opportunity.
Verisimilitude (Unity of Knowledge Error)
The Omega Hypothesis: A provisionally stacked basis supporting an apparent simple reality
/philosophy : pseudoscience : method/ : an argument establishing a screen of partially correct information employed to create an illusion of scientifically accurate conclusion. The acceptance of specific assumptions and principles, crafted in such a fashion as to lead to a therefore simple resulting explanatory conclusion.
Consilience
A dynamic enhancement of measure, observation and analysis increasingly promoting a single coherent idea
/philosophy : science : method : induction/ : the nature or characteristic of an argument wherein its underpinning premises, data, multiple associated disciplines, avenues of research or predicates provide independent but mutual reinforcement of its conclusion.
The Embargo Hypothesis (Hξ)
The hypothesis which must be dismissed without science because it threatens simplicity and verisimilitude
/philosophy : pseudoscience : method/ : a disfavored hypothesis which will never be afforded access to science or the scientific method no matter what level of consilience is attained. An idea which threatens to expose the risk linkages inside of or falsify a stack of protected provisional knowledge which has achieved an importance greater than science itself: an Omega Hypothesis.
Consilience is not finished science, nor does consilience stand as consensus. You will find fake skeptics conflating and twisting the two terms in order to support provisional knowledge they wish to enforce. However, consilience does stand as a principal threat to those wishing to protect an agenda of verisimilitude. The Embargo Hypothesis, on the other hand, is typically an elegant, robust and cogent, yet ignored theory – one which threatens power, tenure and money. It will never see the light of a scientific day, no matter how much consilience is developed.
Several Key Warning Signs to Look For Indicating an Omega Hypothesis at Play
A. Mixing up the steps of or using only a portion of the scientific method
B. Becoming irritated at calls for further competing study
C. Over-using the term ‘settled science’ or ‘evidence’ in pluralistic debate
D. Applying social pressure and club conclusions – feigned as ‘promoting scientific literacy’
E. Liberal use of the prefix ‘Anti’
F. Identifying enemies and pigeon-hole bifurcating an argument
G. Becoming threatened by specific alternatives – calling them magic, pseudoscience, conspiracy or irrational
H. Relying on scant, outdated or Big-Data only study
I. Employing celebrity, journalism, funding, influence or authority (see Kilkenny’s Law) to intimidate
J. Claiming their stack of provisional argument results from a principle of doubt and/or skepticism
K. A complete ignorance of the favored hypothesis’ limitations and risk.
Several Filtering Techniques Employed to Block The Embargo Hypothesis
And finally, whenever you observe the practices cited above at play inside the methods of science employed to craft a stack of provisional knowledge, and/or to enforce an embargo of an eschewed but competing alternative – watch for the following filtering methods. A short definition is provided with each.
Amplanecdote – so many observations are discarded as anecdote, that an entire science can be assembled around them
Filbert’s Law – relying solely upon single big-data, large domain or arm’s length inexpert meta-analysis to underpin a conclusion
Correlation Dismissal – assuming that all correlation is invalid and instead demanding proof as a first step in science
Effect Inversion – when the reverse question is asked inside an apparently non-significant signal data set, the opposite effect (‘curative’) shows up as a statistically significant signal
Procrustean Solution – further and further study is either modified or thrown out in order to not conflict with the current paradigm
Forward Problem Blindness – the reverse question is never asked or only predictive science is employed and falsification remains unused, despite its availability
Ignoro Eventum – failure to conduct follow-on/impact study or observe an impacted population after a major environmental change has been implemented
Ascertainment Bias – a form of inclusion or exclusion criteria error where the mechanism of sampling, specimen selection, data screening or sub-population selection is inherently flawed
MiHoDeAL Filtering – resorting too often to dispositioning disliked observations as Misidentification, Hoaxes, Delusions, Anecdotes and Lies
Law of Static Privation – asking whether the provisionally stacked knowledge has a track record of improving further knowledge or alleviating suffering; if not, it is a provisional stack
Existential Fallacy of Data – the implication or contention that there is an absence of observation or data supporting an idea, when in fact no observational study at all, or of any serious import has been conducted by science on the topic at hand
Shevel’s Inconsistency – one simultaneously contends that science has shown a research subject to be invalid, yet at the same time chooses to designate any research into that subject as constituting pseudoscience
Manipulative Rational Ignorance – an arguer contends rational ignorance applies inside an argument, or the ignoring of a pathway of science because the cost or effort entailed is too high versus the results or lack thereof to be obtained
Furtive Fallacy – undesired data and observations are asserted to have been caused by the malfeasance of researchers or laymen
Fallacy of Relative Privation – regarding science as only the property of scientists; dismissing an avenue of research due to its ‘waste’ of scientists’ time and focus
Semmelweis Reflex – the tendency to reject new evidence that contradicts one’s held paradigm
Furtive Confidence Fallacy – refusal to estimate, grasp or apply principles of statistical confidence to collected data. The act of prematurely declaring or doggedly denying a multiplicity of anecdote to be equal to data
Muta-Analysis/Studium a Studia – arm’s-length, big-data studies-of-studies conducted by analysts with three years’ experience or less, presided over by money-influenced directors
Dismissible Margin Fallacy – ensuring that the dissenting real experts and outlier data in a study field are of sufficiently low percentage that they can be ignored (typically cited as <5%)
Consilience Evasion – the refusal to look at a body of growing consilience in an effort to deny a disliked alternative any access to science
Hume’s Razor Error – the false presumption that a seemingly miraculous explanation must be false if any alternative explanation provided is less miraculous
Sponsorship Bias – rejection of an entire methodological basis of a scientific argument and all its underpinning data and experimental history simply because one can point to a bad personality involved in the subject
Regressive Bias – a certain state of mind wherein perceived high likelihoods are overestimated while perceived low likelihoods are underestimated
Observer Expectancy Effect – when a researcher expects a given result and therefore unconsciously manipulates an experiment or scientific method or misinterprets data in order to find that expected result
False Stickiness/Consensus – the false belief that, or willingness to accept the claim that, scientists are all in agreement on a given subject
Reactive Dissonance – when faced with a challenging observation, study or experiment outcome, to immediately set aside rational data collection, hypothesis reduction and scientific method protocols in favor of crafting a means to dismiss the observation
Unity of Knowledge Error – to conflate and promote consilience as consensus, in absence of having diligently falsified or even studied any competing hypothesis
All of these may be found in the Glossary or Tree of Knowledge Obfuscation Pages of this blog.
epoché vanguards gnosis
¹ Thornton, Stephen, “Karl Popper”, The Stanford Encyclopedia of Philosophy (Winter 2015 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/win2015/entries/popper/>.