The Ethical Skeptic

Challenging Agency of Pseudo-Skepticism & Cultivated Ignorance

The Sleight-of-Hand StageCraft of the Debunker

The debunker craves science-celebrity with such blood lust that any form of deception will suffice, provided the great preponderance of those who witness their stage act will fail to spot the incumbent sleight-of-hand.
The debunker is spinning a facade of cherry-picked anecdotal anti-data, which is then used to linearly claim that something isn’t. This backwards method of outference runs anathema to the practices of science, evidence, and inference; a method plied by someone who possesses no interest in truth whatsoever.

Debunking is a form of social activism which seeks abuse of science through a masquerade of its underlying philosophical vulnerability, skepticism. At its heart, debunking is nothing but weaponized fake skepticism deployed by useful idiots to further the agency (not simply bias) of their sponsors. The debunker is a catalyseur who actively seeks to foment conflict between science and the lay public regarding an embargoed topic, who then exploits such conflict to bolster their celebrity and false form of go-to authority.

Inside this article we draw witness to the debunker’s alchemy, wherein they transmute their fantasy and obsession with trivia into accepted claims of science. Its backbone serving only the purposes of discrediting and ridicule, the form of oppressive dialogue outlined below ironically serves most often to disparage courage and curiosity inside the very arenas in which these virtues are most sorely needed.

The process of debunking at its heart constitutes an abject mockery of science. It centers upon the fabrication of anti-data from extreme agency and anchoring bias, along with employment of this anti-data inside an argument from ignorance to enforce a fake form of null hypothesis called an Omega Hypothesis. Not even featuring the integrity of cynicism, this process is outference, an antithetical ethic to critical-path syllogism and inference. This is depicted below:

Debunking is the abuse of methodical doubt and mere plausibility to fabricate cherry-picked anti-data (?), which extrapolates into an empty-set (Ø) appeal to ignorance and supposedly proves (enforces) a fake form of null called the Omega Hypothesis. Debunking doubts and embargoes contending ideas before they can be tested for merit by science, yet never exposes its own ‘null hypothesis’ to any kind of accountability. Debunking is the antithesis of science.

In contrast, science is the use of robust, corroborative, and repeating observational data to develop sound inference along a critical path of inquiry, which all serve to falsify the Null Hypothesis. Science doubts the Null Hypothesis.1

Debunking would be ‘cleverness pretending to be wise’, if it even rose to the level of being clever to begin with. (Note: this is not the same as heteroduction, because as shown above heteroduction cannot be employed to prove the null hypothesis.) Without further ado, ladies and gentlemen, may I present the method of the debunker, The Pseudo-Scientific Method.

       Fabricate Supposed Flaw in Observation

Fallacy of Exclusion/Sowert’s Law – (Fallacy of Suppressed or Isolated Evidence) – one of the basic principles of argumentation is that a sound argument is one which presents all the relevant, and especially critical-path, evidence. A debunker will seek to isolate one single facet of an observation and then pretend that it is weak, when stripped of its corroborating observations, context, and facets of credibility. This is the warning flag that the above pseudo-scientific method is at play. Ignorance + Trivia = “Fact” in the compromised mind of a person bearing agency (Sowert’s Law).

Fallacy of Opposition/MiHoDeAL Claim – presuming that someone is wrong, including trained and qualified scientists measuring direct observation in a controlled environment, because they appear to reside in an opposing camp. Dismissing a single (or all) observation(s) as ‘Misinterpretation, Hoax, Delusion, Anecdote, and Lie (MiHoDeAL)’ – based solely upon a disdain for what was observed or who did the observing. A fallacy where an untrained, unequipped, ignorant ‘skeptic’ is suddenly instructing experts on how to correctly conduct observations in their field, on systems and in environments where the ‘skeptic’ has zero relevant experience.

       This Evidence is Disqualified

Semmelweis Reflex – the tendency to reject by informal, incomplete, or invalid basis new evidence that contradicts one’s held paradigm.

Truzzi Fallacy – when a cynic, debunker, or denialist regards that it is only necessary to present a case for their counter-claims based only upon their notion of plausibility, fictitious versions of Occam’s Razor, or probability no matter how slight it may be, rather than any actual empirical evidence.

Subception – a perceptual defense (martial art) that involves unconsciously applying strategies to prevent a troubling stimulus from entering one’s personal gestalt.

       All Evidence is Disqualified

Fallacy of Composition – when one infers that something is true of the whole from the fact that it is (possibly) true of some part of the whole or anecdote thereof.

Dietrologia – an insistence that the obvious or repeatedly observed explanation cannot possibly be the truth. Invoking as a first response, and without any evidence, that ‘conspiracy theory’ spinning must be the motivation behind any repeated interpretation other than a preferred conventional one.

Cherry Sorting/Data Skulpting – applying diligent doubt, investigation, and skepticism to all instances of observation one disfavors, while relaxing and offering a free pass to all observations or interpretations which fit one’s preconceived notion. Skulpting the answer one desired in advance from skillful and selective ‘doubting’ of the observation/evidence base.

       Quod Erat Demonstrandum: ‘No Evidence Exists’

praedicate evidentia – hyperbole in extrapolating or overestimating the preponderance of evidence supporting a specific claim (even to convention), when few examinations of merit have been conducted regarding a hypothesis, or few or no such studies of the subject have indeed been conducted at all.

Appeal to Ignorance – an argument for a conclusion based on a lack of evidence, or the insistence that a lack of observed evidence means there is no evidence to be observed at all.

       Subject and Researchers are Therefore Discredited (A Scientific Claim)

Appeal to Skepticism – the invalid employment of skepticism to act in lieu of science. The employment of skepticism, in absence of any form of scientific study, in order to derive a scientific conclusion. Philosophy (skepticism) cannot be used to supplant science, as that is neither its capability nor role. Also, when one does not hold a science qualification, the pretense that one’s use of skepticism implies therefore that one’s opinions still represent science or scientific consensus.

Appeal to Ridicule – an argument is made by framing the opponent or opponent’s argument in a way that makes either appear ridiculous.

One should notice that the debunker most often ends this process with two implicit (and often explicit) scientific claims, issuing such offenses inside the masquerade of a staid and erudite demeanor:

1. That the subject, and the one who approaches that subject as an open-minded researcher, are both now discredited scientifically (debunked), and

2. The debunker themself bears the qualifications necessary to represent the scientific method, scientific consensus, critical thinking, skepticism, and science itself.

‘Debunking’ in this context is defined as ‘the habit of the debunker’. While there is a small context of legitimate use for the term, its equivocal and pejorative connotations render it an unwise choice as a professional or philosophical term. Debunking in the manner cited above is an illicit form of martial art (see ‘subception’ above). Most often as well, it is a form of exploitation of the innocent, theft/obfuscation of common intellectual property, and lying on the part of a self-appointed authority – a fairly heinous set of actions given their casual issuance. A stage magician who has taken it upon themselves to apportion the proprietary knowledge development of mankind as they solely desire, by means of their tattered magic hat and worn-out stage act.

Wherein one is corrupt in their skepticism, there also will they be corrupt in their heart.

Most people sense the incumbent dishonesty in their gut. However, like myself years ago, they possess neither the philosophy nor the ethics-deliberation skills requisite to a Wittgenstein-sufficient framing of such deceptive tactics. Hence the reason why ethical skepticism continues to grow in popularity.

The Ethical Skeptic, “The Sleight-of-Hand StageCraft of the Debunker”; The Ethical Skeptic, WordPress, 3 Jul 2021; Web, https://theethicalskeptic.com/?p=51488


Meta-Ethical Praxis of Science

Ethical skepticism is a form of meta-ethical philosophy which serves specific benevolent/knowledge goals. The ethics of science (a fortiori of skepticism as well) relate therefore primarily to the study of meta-ethics – and have little to do with morality or virtue. They focus on specific standards of praxis within the scientific community at large.

There exist three domain-forms of ethics. The first is normative ethics, the domain of appearances and correctness from a social perspective.

Normative Ethics – objective practices of morality and social codes of conduct (virtue, religious, moral, identity, personal conduct, etc.)

However, since one of the entire purposes of this blog is to decry pretense, false virtue, concealed religion and identity warfare (the abuse of ethics), we choose to focus instead on more professionally applicable contexts of ethics. More specifically, those of meta-ethics and praxis of science and skepticism:

Meta-Ethics – the study of the disciplines and philosophical bases behind professional standards of practice (skepticism, objectivity, consequentialism, deontology)

Applied Ethics (Praxis) – the decision theory behind professional standards of practice or social codes of conduct (law, procedure, codes of conduct, standards of practice) 1

While I am an upstanding and conscientious person in my private and professional life, one should not infer from the term ‘ethical skepticism’ a personal boast of morality (normative ethics), as those who are ignorant of graduate level philosophy are prone to accuse. Rather one should comprehend ethical skepticism as an intellectual and practical allegiance to an actual long held standard of science. After all, this is what ‘ethics’ means, the decision theory behind adherence to standing professional standards of practice (meta and applied ethics or praxis ethics). Ethical skepticism therefore, is a meta-ethical philosophy which serves specific benevolent/knowledge goals and results in specific modifications to some of our applied ethics (pseudo-skepticism, institutional propaganda and cultivated ignorance). Especially applied ethics which have been in error.

The context of the moniker I use, ‘The Ethical Skeptic’, or the general practitioner descriptive in the form ‘ethical skeptic’, is set in the impersonal, as in the case of Zen and the Art of Motorcycle Maintenance for instance. The context of ethics employed in this blog is deontological insofar as the adherence to standards of protocol, such as the real and complete scientific method, is regarded as both sufficient and necessary to direct our knowledge development actions. An idempotent neutral practice, characterized by an aversion to tampering with observations and data in favor of one’s ontology. Yet it is still consequentialist from the perspective that the outcomes of value and clarity manifest as the signature handiwork of those who practice such ethics. In my profession and research, skepticism is the substrate of science, and I feel it is abused when applied in lieu of science by agenda-schooled journalists, stage magicians, propaganda bloggers, psychologists and party/social activists.

One can make the serious contention that skepticism is itself, the philosophical deliberation of the meta-ethics of science. However, skepticism must be watched and held accountable too if it errs in consequentialist outcome, attempts to manipulate the outcomes of science itself, embargoes disliked topics or begins to step in and act in lieu of science. Hence the need for ethical skepticism – to watch the watchers.

There exists a contrast of relative movement between where the skeptic movement was 40 years ago, and where it resides now – versus the relative change in practice inside of the so-called pseudosciences during that same timeframe. It is skepticism which has had to be taught how to behave over the last 40 years, and not the pseudosciences. More people believe in a litany of pseudoscience than ever have before. I believe this to be precisely because of the mistaught version of skepticism which was hatched in the 60’s and 70’s.

Skeptics have had to be taught how to behave over the last 20 years in particular.
As a result of their malpractices so-called fringe ideas, both valid and invalid, have grown dramatically in subscribership.
If such a fringe subject bears validity, then of course its cynics were always in error.
If the fringe subject is invalid, its ensuing popularity too is the fault of the pseudo skeptic – and for the same reasons.

Fake moon landing and flat earth proponents have learned to employ the very same methods which
have been taught by fake skepticism in the targeting of disliked ideas over the last 60 years.
The chickens of failed philosophy have come home to roost.
And the blame for this resides squarely with our floundering skeptics.

These practices are not simply unethical because of the negative consequentialist outcomes in terms of subscribership to fringe topics like fake moon landing and flat Earth theories, but as well – from the history that it has indeed been skeptics who have proved to require the most re-education in this process of deliberation, and ironically not the ‘credulous’. Finally, the specific practices which have resulted in this are detailed and cataloged by The Ethical Skeptic. Skeptics have failed us both from ontological and consequentialist perspectives. What follows is the reason for all this. Fake virtue does not work. Club quality does not work. Normative ethics can serve to provide a clever disguise for agency of malice and oppression.

Exploitation of Ethics Reveals the Necessity of Meta-Ethics

This principle of a policing club such as social skepticism introduces several problems identified by ethical skepticism regarding prima facie virtue and normative ethics:

Virtue Signaling

/philosophy : pseudoscience : normative ethics/ : the ironic principle entailed in the social observation that prima facie ethics or normative ethics, virtue, religious precepts, morality, victimhood, identity warfare, personal conduct codes, etc. can, and often do, serve as a cover for unethical agency masquerading under such pretenses. An action performed in accordance with socially correct pressure, or inside a visible boundary of political correctness, which is performed by a person wishing to show that they are on the good side in a political argument. Symbolic virtuous acts or positions adopted solely to build political power or exempt one from being accused of racism, bigotry, misogyny, greed or any of the canned talking attack points currently being fad-utilized by the political left.

Exoentropy of Normatives

/philosophy : ethics : meta-ethics/ : The effort to enforce order inside a controlled subsystem inevitably and ironically serves to increase the level of disorder or entropy surrounding it. Moreover, systemic dynamics can serve to impart unethical consequentialist outcomes which arrive solely and wholly as a result of individual efforts to maintain normatives of propriety or the appearance of such propriety; especially when coupled with the gaming and exploitation potential therein. This is also known as exoentropy, wherein a decrease in entropy of a subsystem leads further to an even greater entropic contribution to its surroundings or surrounding systems – resulting in an overall entropy or loss to the whole. An example of this can be found in the observation known as Goodhart’s Law and Goodhart’s Law of Skepticism.

Goodhart’s Law – when a specific measure becomes the sole or primary target, it ceases to be a good measure.

Goodhart’s Law of Skepticism – when skepticism itself becomes the goal, it ceases to be skepticism.
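Goodhart's Law can be made concrete with a toy simulation (a hypothetical model, not drawn from any real dataset): agents split effort between genuine quality and inflating a proxy metric, and once the proxy becomes the target, it rises even as the underlying quality it was meant to measure falls.

```python
import random

random.seed(0)

def goodhart_sim(rounds=1000, game_the_metric=False):
    """Toy model: 'quality' is latent; 'metric' is the proxy (e.g. citations).
    When the proxy becomes the target, effort shifts from quality to
    gaming, and the proxy decouples from what it was meant to measure."""
    total_quality = 0.0
    total_metric = 0.0
    for _ in range(rounds):
        # honest agents spend all effort on quality; gamers spend 20%
        effort_on_quality = 0.2 if game_the_metric else 1.0
        quality = effort_on_quality * random.random()
        # the proxy tracks quality plus whatever gaming effort buys
        metric = quality + (1.0 - effort_on_quality) * 2.0 * random.random()
        total_quality += quality
        total_metric += metric
    return total_quality / rounds, total_metric / rounds

honest_q, honest_m = goodhart_sim(game_the_metric=False)
gamed_q, gamed_m = goodhart_sim(game_the_metric=True)

# The gamed regime shows a higher metric despite lower underlying quality.
print(f"honest: quality={honest_q:.2f} metric={honest_m:.2f}")
print(f"gamed:  quality={gamed_q:.2f} metric={gamed_m:.2f}")
```

The effort split and gaming payoff are invented parameters; the point is only the divergence between measure and target once the measure is optimized directly.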

Qualitas Clava Error

/philosophy : fallacy : demarcation of skepticism and pseudo-skepticism/ : club quality error. The presumption on the part of role-playing or celebrity-power-seeking social skeptics that their club or its power, is important in ensuring the quality of science and scientific understanding on the part of the broader population. The presumption that external club popularity and authority, lock step club allegiance and presumptive stacks of probable knowledge will serve to produce valid or quality outcomes inside scientific, rational or critical thought processes. The pretense of encouraging skepticism, while at the same time promoting conclusions. Such thought fails in light of time proven quality improvement practices.

This problem of a single standard of skepticism (Science Based Medicine, The Skeptic’s Dictionary, CSICOP, Skeptical Inquirer, etc.) becoming in itself the goal – or a single measure (p-value) now acting in lieu of science – elicits the central issue with regard to scientific ethics today. And in my estimation therefore, the central issue regarding skepticism as well. They have simply replaced the old-boys’ networks with a new club – however a club which is much more prone to witch hunting. One example of such single measure chicanery is outlined inside a very popular Aeon Essay on Science from 2016, by Roy and Edwards.

Since the Second World War, scientific output as measured by cited work has doubled every nine years. How much of the growth in this knowledge industry is, in essence, illusory and a natural consequence of Goodhart’s law? It is a real question.

The increased reliance on quantitative metrics might create inequities and outcomes worse than the systems they replaced. Specifically, if rewards are disproportionally given to individuals manipulating the metrics, well-known problems of the old subjective paradigms (eg, old-boys’ networks) appear simple and solvable. Most scientists think that the damage owing to metrics is already apparent. In fact, 71 per cent of researchers believe that it is possible to ‘game’ or ‘cheat’ their way into better evaluations at their institutions.

~ Science is Broken, Siddhartha Roy, environmental engineer and PhD candidate at Virginia Tech and Marc A Edwards, Distinguished Professor at Virginia Tech. 2

The ethics of science (a fortiori of skepticism as well) relate therefore primarily to the study of meta-ethics – and have little to do with single indicator morality or virtue. Surface measures can be gamed by forces pretending to be or manipulate science. ‘Doubt’, ‘critical thinking’, ‘focus on the data’ – can all serve as virtue costumes which agency adorns to play the role. Moreover, even truly moral and virtuous players can indeed serve to produce highly unethical outcomes, so prima facie virtue is unreliable as a predictor for ethical outcome. 3 Meta-ethics relates to the study of decision theory and how it impacts the overall quality of science inside a hyper-growth institutionalized vertical. Accordingly, the ethics of science are defined by premier ethics philosophers, biochemist Adil Shamoo, PhD and bioethicist David Resnik, PhD as such: 4

a.  Disciplined standards of conduct
b.  Discipline of study of standards of conduct
c.  Decision science incorporating standards of conduct
d.  Resulting state of character which undertakes such disciplined decisions.

Below are several standards of scientific ethics, developed in part from leading discipline materials on the ethics of science, with my own experience inside the subject incorporated therein. These ethical norms of science are re-developed from the following resources: 5 6

Paul Humphreys, The Oxford Handbook of Philosophy of Science; Oxford University Press, New York, NY, 2016; pp. 255-9
Resnik, David B., The Ethics of Science: An Introduction (Philosophical Issues in Science), New York: Routledge, 1998

It is the purpose of the PhD program, and the examination by review board, dissertation committee and advisor, to ensure that the PhD candidate grasps and has developed the skill in applying the following principles TO SELF FIRST and not others, before they are allowed to adorn the moniker of scientist. Skepticism has mistakenly taught the less mature among us to only examine others under cherry-picked versions of these principles.

   The Meta-Ethical Praxis of Science (and Skepticism)

Integrity – responsibility with regard to both the soundness and the critical implications of one’s scientific research. Understand what constitutes data versus information versus intelligence and its probative potential versus its eventual reliability (not simply current). Understand the differing implications of various types of inference and what to do with an outcome which may not be well accepted by the community at large.

Openness – transparency of process undertaken, sources employed, assumptions made and models utilized, along with the sharing of data, results, methods and materials with other researchers – and yes, even laymen and curious stakeholders.

Diligence – maintaining good records of data, experimental protocols, and other research documentation. Take appropriate steps to recognize and mitigate potential bias and error. Subject your own work to critical scrutiny and do not overstate the significance of your results. Disclose information sufficient to allow the critical review of your work.

Freedom – support freedom of inquiry in the laboratory, research environment and in the field. Do not obfuscate scientific arguments or data, nor prohibit scientific or layman researchers from engaging in investigation and debate.

Due Credit – identify and allocate credit for prior art, investigative and analytical work, where such credit is due.

Respect for Intellectual Property and Prior Art – do not plagiarize nor steal intellectual property. Respect data sources, copyrights and patents.

Discretion – maintain the confidentiality of materials and data sources which are entrusted under such constraints or in any case where doubt exists as to such necessity. Maintain anonymity (such as in peer review, personal data or subject identity) unless identity is specifically warranted (publication) or requested.

Stewardship – take care with data, resources, test subjects, results, databases, samples, equipment, supplies, and physical research or anthropological/paleontological sites.

Development and Competence – maintain and enhance your competence inside your discipline of study. Increase awareness of the research field and impacted stakeholders. Take appropriate steps to deal with incompetence, or premature conclusiveness inside your discipline of study. Take appropriate steps to identify and hold accountable, those who fail in their burdens of service inside the public trust.

Serves Inside the Public Trust – science must not be viewed as an activity in lieu of governance or any form of governance proxy, and must exclusively exercise its work inside the public trust.

Respect for Stakeholders – treat collaborators, data collection specialists, students and interns, and other peers and colleagues with respect. Treat impacted stakeholders with tolerance and the respect due those who will bear the burden of your outcomes. Do not discriminate against colleagues nor exploit them or their work efforts.

Respect for Humanity/Suffering – respect the rights, welfare and dignity of human or animal impacted stakeholders, research subjects, and protect them from harm or exploitation (except where exclusively proven to be unavoidable). Communicate risk in advance, in a clear and objective fashion – allowing human stakeholders to opt out, unless final proof (not simply consensus) is determined as to their necessity to comply.

Social Responsibility – prioritize research which is likely to benefit society or reduce suffering. Avoid causing harm to animals, the economy, a nation, humans or the environment. Science and scientists should never engage in activity to bypass/usurp the governance of a nation in a desire for application of their goals.  Engage in extracurricular activities which serve to benefit society.

The Human Right to Know – humans bear the right to knowledge about their origins or concerning any threat to their safety, well being or livelihood. Public access to study artifacts serving to illuminate mankind’s social, morphological and genetic history should not be denied based upon property conventions of any haplogroup, culture, owner, propriety, government, nation, intelligence group or institution. Knowledge is a basic human right; and in particular, it is a basic human right to access freely the knowledge of where mankind came from and the pathway which brought us here as a species.

Legality – comply with international, national and local laws. Comply with regulations and institutional policies – unless they compel you to violate the above ethics.

These are the standards by which an ethical skeptic regards their science. Praxis, not virtue.

The Ethical Skeptic, “Meta-Ethical Praxis of Science” The Ethical Skeptic, WordPress, 25 Oct 2018; Web, https://wp.me/p17q0e-8rl


Contrasting Deontological Intelligence with Cultivated Ignorance

A deontologist prefers a state of ‘unknown’ over choice of a highly probable stacked provisional knowledge or abductive reason, because of the more informative deontology of declaring a precise answer to be unknown, over ‘probably known’ inside a context of low intelligence and unevaluated risk. According to Wittgenstein, the formulation of elemental intelligence is the critical first step of science – which steers our methods away from the pitfalls of having to employ ‘skeptics’ to defend answers derived from stacks of highly probable knowledge – which bear a high risk of ultimately turning out to be wrong – a state to which we are blinded by the processes we chose to undertake and the clowns we hire to defend its answers.

Data and Deontology: A Revolution in the Making

Another revolution is underway in the development of data structures employed by economic entities (corporations, funds, banks, trade partners and economies, etc.). Database normalization is the process of organizing the columns (attributes), rows (records) and tables (relations) of a relational database, along with the disciplining of the parsed data (answers), in order to reduce data redundancy and make data integrity more robust inside a high-transaction environment (IT departments for banks, finance departments, consumer goods traders, brokerages, etc.). First Order Logic normalization was employed from the 1970’s onward in hierarchical and then relational database structures into the 80’s and 90’s. Third Order Logic, or ‘third normal form’ databases have performed as the standard for relational structures thereafter. Query languages such as SQL have traditionally been able to access answers from such third-normalized structures in intuitive if not rules-based lookup protocols, such as single access protocols or intuitive query by example (QBE) user interfaces. Odds are, if you have used Microsoft Access or the user-friendly dbms Airtable, then you have already had exposure to a query by example user interface. The companies I have owned/managed have thrived off the flexible and crucial role of the relational database in managing our customers, products, transactions, money and cash flows, along with other business information and intelligence (information and intelligence are two distinctly different things).
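As a minimal sketch of third normal form (the table and column names here are hypothetical, invented purely for illustration), each fact is stored exactly once, keyed, and referenced by other tables, so that a denormalized view can be reassembled on demand by a join:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Third normal form: every non-key column depends on the key alone.
# A denormalized table would repeat the customer's city on every order row.
cur.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE "order" (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    amount      REAL NOT NULL
);
""")

cur.execute("INSERT INTO customer VALUES (1, 'Acme Co', 'Dallas')")
cur.executemany('INSERT INTO "order" VALUES (?, ?, ?)',
                [(10, 1, 250.0), (11, 1, 75.5)])

# A join reassembles the denormalized view on demand; the city is
# stored exactly once, so correcting it is a single-row update.
cur.execute("""
    SELECT c.name, c.city, SUM(o.amount)
    FROM customer c JOIN "order" o ON o.customer_id = c.customer_id
    GROUP BY c.customer_id
""")
row = cur.fetchone()
print(row)  # ('Acme Co', 'Dallas', 325.5)
conn.close()
```

This is the ‘efficient file drawer’ in miniature: because each fact lives in one place, both lookup and integrity-checking reduce to a single disciplined path.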

If all this sounds like gobbledygook, my apologies. It sounds like gobbledygook to me as well, and I have owned and managed corporations executing this type of information technology solution set for a portion of my three decades of work. My job as CEO or similar was to translate the technical language of my senior IT techs and system engineers, and express it in such a way as to allow client CEOs to understand the transformative advantages of such technologies for their businesses. Normalization is akin to a very efficient set of file drawers – a method of disciplining how files are indexed, sorted, labeled, held and accessed in such a fashion as to allow the file drawer manager the ability to answer any and every question thrown at them, in as expedient a fashion as is possible. In addition, such discipline affords the file drawer manager the ability to quickly assess the level of integrity inside his or her stored data. This is a very satisfying situation to the mind of those crazy individuals who manage to keep their desks in tightly aligned and neat order.

But all this is changing. A new gunslinger is in town. Query Oriented Normalization (QON) is replacing the old relational normalization structure (or more specifically 3rd normal form databases), as well as the even older hierarchical database structures of data still in use in some of the older, larger institutions of science and technology. Before we address what a QON intelligence structure is, let’s take a step back and examine exactly what deontology means:

Deontology

/philosophy : science : knowledge development and integrity : ethics/ : an approach to the ethics of knowledge development that focuses on the rightness or wrongness of actions themselves, as opposed to the attractive or unattractive nature of the consequences of those actions (Consequentialism) or to the character, credentials and habits of the actor (Virtue Ethics). In science, the process of valuing the scientific method over any of its particular conclusions or the people/institutions claiming them.

A deontologist seeks to reduce the unnecessary complexity of a process of questioning and the associated answers or lack thereof. Then further, by conformance to a set of accepted practices, induce or deduce answers to specific questions which collectively serve to reduce the overall level of ignorance (a priori doubt, belief and stacked provisional knowledge) featured inside a given topic. Principally this process results in what is called an epistemology. A deontologist prefers a state of ‘unknown’ over even a highly probable stacked set of provisional knowledge, because of the preferential deontological ethics of declaring a precise answer to be unknown, over ‘probably known’ inside a context of low intelligence and unevaluated risk. This is because when the deontologist surveys the horizon of what is truly unknown, he is able then to reduce process and focus on the correct next question under the scientific method.

Now – let’s examine the three forms of database management approach in the context of deontological ethics and the development of knowledge.  Ideally, a structure of knowledge (intelligence) comprises five interlinking elements (nodes and spans):¹

  1.  A precise question ('elementary proposition', as Wittgenstein calls it) – node Q(x)
  2.  Its answer (an 'atomic fact', as Wittgenstein calls it) – node A(x)
  3.  A logical association to predicate answers (a 'certain relation', as Wittgenstein calls it) – spanning tree
  4.  A linkage to a fortiori questions ('features', as Wittgenstein calls them) – spanning tree
  5.  A logical phylogeny introducing a posteriori questions (a 'successor', as Wittgenstein calls it) – spanning tree¹
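These five interlinking elements can be sketched as a graph node. The following is a minimal, hypothetical illustration only – the class and field names are assumptions of this sketch, not anything the QON concept prescribes:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class QONNode:
    """One question/answer pair in a QON intelligence structure (hypothetical sketch)."""
    question: str                                               # 1. precise question – node Q(x)
    answer: Optional[str] = None                                # 2. atomic answer – node A(x); None = null (unknown)
    predicates: List["QONNode"] = field(default_factory=list)   # 3. links to predicate answers
    a_fortiori: List["QONNode"] = field(default_factory=list)   # 4. links to a fortiori questions
    successors: List["QONNode"] = field(default_factory=list)   # 5. a posteriori successor questions

# One ethically answered question legitimately spawning six successor questions:
root = QONNode(question="Q1: Is the observation reproducible?", answer="Yes")
root.successors = [QONNode(question=f"Q1.{i}: ...") for i in range(1, 7)]
```

Note that the successors deliberately carry a null answer (`None`) rather than a guess – the null state is itself information inside the structure.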

In the past, both hierarchical and relational data have presumed that only answers (elements 2 and 4 above) exist, and further that they exist only in the form of object data (data repositories bearing no question which frames their context of employment) – often independent of question, and even less often associated with predicate answers and a posteriori questions. You have often heard scientists remark, 'we answered one question, only to have six others pop up.' Oddly enough, this is the correct state of affairs inside a knowledge development structure as it relates to the process of science. This is exactly how it should be.

Data is a set of answers without context of question. Intelligence is a framework of questions which have either certain or null answers. The latter is more informative than the former.

4.2211  Even if the world is infinitely complex, so that every fact consists of an infinite number of atomic facts and every atomic fact is composed of an infinite number of objects, even then there must be objects and atomic facts.¹   ~Wittgenstein, Ludwig; Tractatus Logico-Philosophicus

The Framing of Intelligence as Opposed to Diagnostic Data

An ethically answered intelligence question should result in six new ethical questions. An ethically answered reductive question should then reduce this set. What one typically fails to account for inside this despair-inducing evaluation is the reduced set of risk produced, along with the overall displacement of ignorance attained, through the improvement of knowledge – two of the consequentialist objectives (expressed as value and clarity, respectively, in ethical skepticism) of such a process of reduction. Such is the nature of gnosis in our realm, and in absence of possessing all the answers already (a priori doubt, belief and stacked provisional knowledge) – is the very nature of deontological ethics.

Let’s examine this principle below in relation to the three database structure types we outlined above. In the QON structure we observe all five interlinking elements (nodes and spans) present in true Wittgenstein-based knowledge development. The QON structure not only catalogs answers in the form of data – it also arranges a minimum spanning tree sequence of question as it relates to answer.

QON structures serve to establish intelligence, while the two classic datalogging structures to the left only serve to catalog data.

[Figure: QON concept contrast]

You will notice that several features serve to distinguish the QON structure from both the hierarchical and relational database structures (catalogs).

  1. The QON structure frames a record (answer) in terms of both a particular question and its particular (atomic) answer, in a constrained 1:1 relationship – the older structures only frame a repository of answers in a one-to-many relationship, with no linking to question.
  2. The QON structure first reduces the set of questions which are to be asked (reduction), as well as conducts a minimum spanning tree configuration of those questions, so that the path to answering them is pursued in the most expedient and logical framework achievable (if ascertainable).
  3. The QON structure is reflexive, i.e. allows multiple questions to evolve from one successfully answered question.
  4. The QON structure is recursive, i.e. allows multiple questions to solve or modify each other.
  5. The QON structure prefers a null answer over a probable answer inference, because:
    • risk escalates, unacknowledged, with each successive probable answer
    • answer probability is pseudo-enhanced solely on the basis that the answer serves to agree with the probability of other reflexive answers (see Unity of Knowledge Error)
    • any error in successor relationships, or their communication or preference, imbues a whipsaw amplification effect
    • null critical paths can be targeted for prioritization by researchers – rather than being ignored because they contain plug answers.
  6. The QON structure possesses no accommodation for a priori knowledge – a data catalog cannot indicate the difference.
  7. Successive queries inside a QON intelligence structure become more informative as each link/answer is resolved. A massing of facts in contrast is not necessarily more informative in relation to its size.
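Feature 2 above – a minimum spanning tree configuration of the questions to be asked – is a standard graph problem. A sketch of how it might be computed, using Prim's algorithm over a small, entirely hypothetical graph of questions weighted by an assumed 'cost of answering':

```python
import heapq

def min_spanning_tree(edges, start):
    """Prim's algorithm over an undirected weighted graph.
    edges maps node -> list of (cost, neighbour); returns the tree edges (cost, a, b)."""
    visited = {start}
    frontier = [(c, start, n) for c, n in edges[start]]
    heapq.heapify(frontier)
    tree = []
    while frontier:
        cost, a, b = heapq.heappop(frontier)
        if b in visited:
            continue                      # edge would form a cycle; discard
        visited.add(b)
        tree.append((cost, a, b))
        for c, n in edges[b]:
            if n not in visited:
                heapq.heappush(frontier, (c, b, n))
    return tree

# Hypothetical 'cost of answering' weights between four questions:
graph = {
    "Q1": [(1, "Q2"), (4, "Q3")],
    "Q2": [(1, "Q1"), (2, "Q3"), (5, "Q4")],
    "Q3": [(4, "Q1"), (2, "Q2"), (1, "Q4")],
    "Q4": [(5, "Q2"), (1, "Q3")],
}
tree = min_spanning_tree(graph, "Q1")
print(tree)  # Q1–Q2, Q2–Q3, Q3–Q4: the cheapest path framework covering every question
```

The point of the exercise is the ordering it produces: the structure tells the researcher which question is most expediently asked next, rather than leaving the sequence to curiosity or prejudice.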

This (1 through 7) is called a Q(x) to A(x) sequence of structured intelligence – and it is highly informative in its own essence (deconvolutional, in neural network terms). So highly informative that it reduces the need to rely upon ‘Occam’s’ Razor mandates, likely guesses and unwise inferences upon data. Even if you call all this nonsense ‘evidence’.

5.133  All inference takes place a priori.

5.1363  If from the fact that a proposition is obvious to us it does not follow that it is true, then obviousness is no justification for our belief in its truth.

5.14  If a proposition follows from another, then the latter says more than the former, the former less than the latter.¹

~Wittgenstein, Ludwig; Tractatus Logico-Philosophicus

In short, the QON model of data development more closely reflects the reality which we face in the prosecution of science. A diagnostician, in contrast, will typically only demand a measuring machine and a hierarchical or relational database to address his or her abductive scientific challenges. He or she will ‘doubt’ any catalog proposition which runs counter to their prescribed notions. This is pseudoscience, as the majority of scientific endeavor does not function in this fashion.

Doubt, belief and provisional knowledge are all in reality the same exact thing. All are a form of succumbing to the shortcut temptation to establish ‘likely’ guesses on specific answers; guesses which conform with our other ‘likely’ (or preferred) guesses. This in lieu of doing the field work necessary to reduce the Q(x) to A(x) sequence of structured intelligence requisite under the scientific method.

The diagnostician (see Diagnostician’s Error) therefore thrives off provisional knowledge and doubt (which along with belief, comprise the fabric of the lie):

There are two forms of ‘doubt’:

Methodical Doubt – doubt employed as a skulptur mechanism, to slice away disliked observations until one is left with the data set one favored before coming to the argument. This is the questionable method of denying that something exists or is true, simply because it defies a certain a priori stack of provisional knowledge. It is nothing but a belief expressed in the negative, packaged in such a fashion as to exploit the knowledge that claims to denial are afforded immediate acceptance over claims to the affirmative. This is a religious game of manipulating the process of knowledge development into a whipsaw effect supporting a given conclusion set.

Deontological Doubt (epoché) – if however one defines ‘doubt’ as the refusal to assign an answer (no matter how probable) to a specific question, in absence of assessing question sequence, risk and dependency (reduction) – preferring instead the value of leaving the question unanswered (null) over a state of being ‘sorta answered inside a mutually reinforcing set of sorta answereds’ (provisional knowledge) – then this is the superior nature of deontological ethics.

Most fake skeptics define ‘doubt’ as the former and not the latter – and often fail to understand the difference.

The Whipsaw Effect of Probable Stacked Knowledge and Perception

5.5262 The truth or falsehood of every proposition alters something in the general structure of the world. ~Wittgenstein, Ludwig; Tractatus Logico-Philosophicus

Now, all of this is not to contend that the realm of information technology is ready to tackle the challenge of datalogging the entire catalog of current knowledge and the next appropriate scientific question. The purpose of this contrast in data structures is to elucidate the superior nature of deontological data structures over those which serve probable elements of knowledge – why they are superior, and why the raw materials of a priori doubt, belief and stacks of provisional knowledge serve only the tradecraft of the lie. Science is developed along the lines of the QON ethic of intelligence development. Science is the process of asking and answering the right questions at the right time, and converting those binary relationships into usable minimum spanning tree pathways to knowledge.

In short, under a deontological context, two knowns and four nulls inside a Q(x) to A(x) intelligence structure – is considered more informative and superior to 6 probables in a normal structure of data.

[Figure: canonization rate versus negative publication rate]

Carl Bergstrom of the University of Washington and Kevin Gross of North Carolina State University, together with their team of researchers, recently published a paper entitled ‘Publication bias and the canonization of false facts‘.² Nothing elicits the whipsaw effect of tampering with the processes of intelligence crafting, by means of provisional knowledge, more than the exposé contained inside this paper. The graphic to the right is extracted from the publication for reporting purposes only. It depicts the volatile effect which suppressing publication of negative outcome studies has upon consensus and the canonization of scientific ideas. An important principle to observe here (and indeed a contention made by Gross, Bergstrom et al.) is that the tightening of p-value measures is not a panacea in mitigation of canonizing false facts by means of false sequential method. Despite our precision of measure and tolerance, there exists a point at which our suppression of negative outcome studies only serves to whip our consensus as a body scientific into unrealistic ranges of conclusiveness. What is shown here of importance is that the structure of intelligence afforded by inclusion of negative studies far outweighs the impact of precision increases of any particular answer contained in a positive-study-only publication approach.

In science, Q(x) to A(x) QON intelligence structures and deontological discipline are vastly more important than just making more likely p-value guesses which support our other likely guesses.
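The canonization dynamic can be illustrated with a toy simulation in the spirit of the Gross-Bergstrom argument (this is not their model or code; every parameter below is an assumption of the sketch). The claim under study is false, so positive results arise only at the false-positive rate; when negative results are filed away unpublished, Bayesian updating on the published record alone drives communal belief toward certainty in the false claim:

```python
import random

def final_belief(p_publish_negative, n_studies=200, alpha=0.05, power=0.8,
                 prior=0.5, seed=1):
    """Toy publication-bias model (illustrative assumptions throughout).
    The claim is FALSE, so each study comes up 'positive' only at rate alpha.
    Negative results are published with probability p_publish_negative; the
    community updates its belief by Bayes' rule on published results only."""
    random.seed(seed)
    odds = prior / (1 - prior)
    for _ in range(n_studies):
        positive = random.random() < alpha           # claim is false: positives are errors
        if positive:
            odds *= power / alpha                    # likelihood ratio of a positive result
        elif random.random() < p_publish_negative:   # negative escapes the file drawer
            odds *= (1 - power) / (1 - alpha)        # likelihood ratio of a negative result
    return odds / (1 + odds)

print(final_belief(p_publish_negative=1.0))   # negatives published: belief in the false claim collapses
print(final_belief(p_publish_negative=0.02))  # negatives suppressed: the false claim nears canonization
```

The whipsaw is visible in the second call: nothing about the individual studies changed – only which answers were allowed into the record.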

But this error in the form of publication bias is not the only pitfall which can be encountered in mishandled science. What the deontological intelligence philosopher would note is that a series of mistakes and missteps can serve to impact or amplify this effect before negative and positive outcome publication biases are even introduced. All of them involve premature questions, and a complete ignorance of the intelligence surrounding a subject, in favor of the ‘data’ involved in the subject. Specifically they are (white-numbered on the chart below as well):

  1. Questions may be asked in the wrong order – and serve to mislead – when assumed answered by a probable answer.
  2. Questions may be framed in the wrong context and seek to answer too many things at once – a condition which can be masked by a probable answer.
  3. Answers may be developed for the wrong question – and serve to confuse.
  4. a fortiori and a posteriori relationships may be assumed as a result of a probable answer being issued.
  5. Risk has not been evaluated in relationship to the stacking of successive probable answers – risk which is multiplied by the impacted a fortiori and a posteriori linkages.

Why a QON-Driven Scientific Method – Based Upon Intelligence and Not Simply Curiosity/Prejudice – Will Change Everything

[Figure: the time is coming for intelligence]

This is why a null answer is preferable over a probable answer inside a structure of intelligence. Below we see a depiction wherein tampering with probabilities and dependencies in an intelligence structure (technically a deconvolutional Restricted Boltzmann Machine progression) can serve to produce dramatically differing outcomes of conclusion from that of a more classic Wittgenstein deontological reduction/deduction. Notice that all the atomic answers in the stacked provisional knowledge were the most probable answers (abductive reason) available inside the QON sequence chosen blindly (sans intelligence). But the conclusions were wrong. Today we non-skeptically rest upon foundations of knowledge in many arenas, developed from the method of stacked provisional knowledge depicted below. We fail to acknowledge the questions we have mishandled or skipped, and the incumbent risk we have introduced to the process, by not appropriately using intelligence as the formulative part of the scientific method – employing instead ‘asking a question’ as its first step.
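The arithmetic behind this risk is simple to state. Assuming, purely for illustration, that each stacked provisional answer is individually the 'most probable' option at 80% confidence, the chance that the whole stack is correct decays geometrically:

```python
# Hypothetical illustration: six stacked provisional answers, each individually
# 'most probable' at 80%, yet jointly far less likely to all be correct.
p_each = 0.80
stack = [p_each ** n for n in range(1, 7)]
print([round(p, 3) for p in stack])  # [0.8, 0.64, 0.512, 0.41, 0.328, 0.262]
```

By the sixth answer, the 'most probable' stack is itself only about 26% likely to be correct in its entirety – which is why a structure of intelligence prefers the honest null over a chain of probables.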

[Figure: stacked provisional knowledge versus deontological reduction]

[Figure: beware of data versus question]

Exacerbating this is the specter of publication. Assume that the above deontology reflects the reality on a critical social issue, such as food contamination or the discovery of a new species. The abysmal depiction above becomes even more dire when one introduces the impact of publication bias into this process. Not only is the conclusion set wrong, but moreover we endure the danger of publication bias further skewing the perceptions of science, and finally – per the conclusions of the Bergstrom-Gross study – canonizing a completely incorrect scientific consensus, fully unable to overturn it from that point on, because no successive questions remain. This is the sixth vulnerability in terms of mistakes and missteps inside the scientific method, followed by the seventh vulnerability, the role of social skepticism.

6.  Publication and acceptance of specific answers can serve to whipsaw the consensus conclusions of science, or the perceptions thereof.

7.  Introduction of false skepticism – people so energized through the heady role of ‘representing science’ that critical questions which normally should be introduced, to challenge the nature of stacked provisional knowledge, can never be asked. Opponents are framed under Frank’s Law as ‘anti-_________’, and the provisional knowledge achieves the status of a fervently protected religion all its own.

Praedicate Evidentia

/philosophy : argument : organic untruth/ : any of several forms of exaggeration or avoidance in qualifying a lack of evidence, logical calculus or soundness inside an argument. A trick of preemptive false-inference, which is usually issued in the form of a circular reasoning along the lines of ‘it should not be studied, because study will prove that it is false, therefore it should not be studied’ or ‘if it were true, it would have been studied’.

Praedicate Evidentia – hyperbole in extrapolating or overestimating the gravitas of evidence supporting a specific claim, when only one examination of merit has been conducted, insufficient hypothesis reduction has been performed on the topic, a plurality of data exists but few questions have been asked, few dissenting or negative studies have been published, or few or no such studies have indeed been conducted at all.

Praedicate Evidentia Modus Ponens – any form of argument which claims a proposition consequent ‘Q’, while lacking the qualifying modus ponens ‘If P then’ premise in its expression – rather, implying ‘If P then’ as its qualifying antecedent. This as a means of surreptitiously avoiding a lack of soundness or lack of logical calculus inside that argument; and moreover, enforcing only its conclusion ‘Q’ instead. A ‘There is no evidence for…’ claim made inside a condition of little study, or full absence of any study whatsoever.

This is what is called, in fake skepticism, ‘the evidence’. No question has really been answered in a QON reduction sequence – but we have studies; we do have studies. We, therefore, are the science. A perch of power which is now necessary in defending against those who know that foundational questions have been ignored. Who needs intelligence when you have ‘the evidence’?

This is the origin of social skepticism. It is the process of cultivating ignorance.

epoché vanguards gnosis


¹  Wittgenstein, Ludwig; Tractatus Logico-Philosophicus; London: Kegan Paul, Trench, Trubner & Co., Ltd.; New York: Harcourt, Brace & Company, Inc., 1922.

²  Gross, K.; Bergstrom, Carl T.; et al.; Publication bias and the canonization of false facts; Cornell University Library; arXiv:1609.00494 [physics.soc-ph]; Sept 5, 2016.

September 10, 2016 | Agenda Propaganda
