The Ethical Skeptic

Challenging Agency of Pseudo-Skepticism & Cultivated Ignorance

When Observation Gives Way to Data-Centric Only Science We All Lose

Is the study which you are imperiously and gleefully foisting upon me a meta-analysis? Or is it a muta-analysis? It behooves the ethical skeptic to be able to discern the difference. Data alone does not a science make. Meta-analysis studies bear risks inherent in the study methodology which do not exist in Level I and II direct observation science studies. To ignore these risks is pseudoscience.
Are we relying too much upon meta-analysis and imparting too much gravitas to such study approaches on a large scale? The answer is yes, absolutely. Data is the management task of the technician. Even more, an excellent magician can blur the lines between data and method, introducing muta-science into the mix of contended data rigor. It is amazing to me that a study method, which does not allow for suitable replication or peer review, and which contains 21 elements of risk in series – is somehow regarded as the pinnacle of scientific rigor. How did we arrive at such a delusion? Obviously someone of celebrity merit said it and it was repeated from then on.
In contrast, observation remains the heartfelt journey of the scientist. It affords him or her the less common ability to detect bullshit-all-dressed-up-as-rigor.

This morning an excellent blog by The Neuroskeptic wafted across my Twitter screen – one which rekindled my interest in a topic upon which I have expounded with my science and engineering teams at various times over the years. The Neuroskeptic pointed out a migration of nomenclature and terminology utilized inside medical science paper titles over primarily the last 50 years. The graph to the right is extracted from the March 19th Discover Magazine Online blog article by The Neuroskeptic, addressing this issue.¹ In this blog, he points out the – information technology driven, yet nonetheless peculiar – migration from the term observation, to the term data inside published medical paper titles from 1915 to 2015. As you might infer from the graphic to the right, beginning with the advent of the IBM 1400 common use mainframe in 1961,² until now, there has been an asymptotic decline in the use of observation as the basis of medical paper contention, in favor of data manipulation approaches. A certainly understandable trend. What I contend herein is not that the rise in data-analyses is an inappropriate trend inside science; rather that its employment in lieu of observation – and not as its complement – or its employment as the gold standard of scientific analysis, both bear pitfalls. Warning flags that science may be trending in directions which render it vulnerable to manipulation and agenda. And while this graphic only pertains to paper titles, I think The Neuroskeptic has tapped into an appropriate way in which to introduce and elucidate this issue.

Observation, it is the first step in the scientific method. Otherwise, how does one even know what question to ask? One of the central tenets of Ethical Skepticism involves the skillful understanding of the difference between the role of observation, and its influence in the assimilation of data and crafting of intelligence. Data alone does not a science make. It is only when we assemble data after a journey of observation, for the purpose of crafting intelligence, and in answering a question born of necessity – that the process of science is undertaken. One does not simply begin science with data and a question. A meta-analysis can be rigorous; however, to presume that since one has conducted a meta-analysis, therefore one has performed rigorous science, is a logical fallacy. It is a game for the dilettante – in which they impress each other and the scientifically illiterate.

One thing I learned as the CEO of an information technology and business intelligence firm is… Anything can be crafted from data, in the eyes of the inexpert technician observer. You simply stack the data relationships in the right fashion, ask questions in the right fashion and sequence and juxtapose the results at the right time.

The key of data science is to be able to detect when you run the risk of deceiving yourself and need to bring in direct material in order to cull and craft such data. In order to accomplish such, one must undertake observations, or one must employ the beneficial wisdom of an observer's life journey. The strength of the data sets from which one draws has little to do with the strength of argument one claims – unless one possesses the journeyman expertise to interpret such data and its basis of origin. One must be able to skillfully produce intelligence in order to answer a question, not simply analyze data. In order to do that, one cannot be simply a data engineer with a general familiarity of the subject at hand. Nor can one be a scientist who bears little expertise at the methods and tools entailed in data analysis.

Why do I call it intelligence? Because in actual Security and Intelligence work, one finds that data by itself is useless and, most of the time, misleading. Almost any message can be spun from data. Only field work and a knowledgeable observer can distinguish data, and allow it to be placed into sets of useful query-based information – intelligence. Modern Skepticism, if it ever knew this, has certainly forgotten it. The advent of mass storage, relational database structures enabling query-by-example capability, and the readiness with which we can now assemble data at our fingertips, while mimicking the practices of intelligence, does not stand in adequate substitution for them.

Moreover, a key outcome of these data technology advantages is the increase in employment inside of science of what is called (in particular in medical science) the meta-analysis. A meta-analysis is a ‘study of studies,’ an approach which seeks to bring to coherence a measure of consensus over a specific question of science inside a complicated field of alternative study methods, and potentially conflicting conclusions. The question always remains, is such a drive to consensus representation – too premature, unfounded, unfriendly to replication or heavily risk laden to serve as a basis of claim? I contend that meta-analysis is fatally vulnerable to all such risks.

Meta-Analysis

/science : data science : third level study/ : a post hoc study which does not itself directly observe nor test, rather employs statistical procedures or combines for power, analyses of the same study design type or species. Procedures crafted to substantiate a claim based upon interrogation and integration of the results of a pooled group of first and second level studies, abstracts or case studies. When objective analyses cannot be conducted or the study set consists of studies of different species, this is also often called a systematic review.

See Study Design Indexed to Mode of Inference Strength and Risk

“A meta-analysis is considered the most rigorous of approaches to experimental study. Meta-analysis is a quantitative, formal, epidemiological study design used to systematically assess previous research studies to derive conclusions about that body of research. The examination of variability or heterogeneity in study results is also a critical outcome [of the meta-analysis].”³

~ A. B. Haidich, Hippokratia, 2010.

Potential Benefits of Meta-Analysis

  1.  It can potentially access the entire body of research around a question.
  2.  A more precise estimate of the effect of treatment or risk factor for disease, or other outcomes, than any individual study contributing to the pooled analysis.³
  3.  A consolidated and quantitative review of a large, and often complex, sometimes apparently conflicting, body of literature.³
  4.  The ability to contrast varied outcomes addressing a single question addressed by multiple parties and varying approaches to study.
  5.  The ability to gauge the sensitivity of various peripheral issues surrounding a scientific question, based upon its frequency of relevancy inside the pooled analysis group.
  6.  May be conducted from a clinical data distance, by a person with less skin in the game, and potentially therefore, presenting less bias and cost.

In general, because of such potential benefits, the meta-analysis is considered the pinnacle of medical scientific study rigor. Accordingly, use of the study approach is skyrocketing. The number of publications employing meta-analysis over time, through 2012 (results from a PubMed search using the text “meta-analysis”), has skyrocketed beyond reason, as depicted in the graphic to the above right.‡ It exhibits the increasing popularity of such a bypass to Level I and II direct observational and expert study. It represents an alarming and increasing reliance upon a data-centric-only approach to science, one which leverages only our increased technological prowess at handling data, and not an overall increase in method coherence or knowledge of subject.

But are there weaknesses entailed in a data-centric only approach to science? Achilles heels which only appear once one is able to delve into and understand the observation set which composes the foundational basis of the pooled first and second level studies inside the analysis? And is there sufficient time, resource and money to effect replication and peer review of such a monumental study? It is not as impressive to me that the meta-analysis technician understands the question to which the meta-analysis pertains. Instead, I am impressed when the meta-analysis is founded upon a study which proves both that the question being asked is the next logical calculus or reductive critical path question in the scientific method, and that the question has been vetted to be such, as agreed by the study authors, inside the conclusions of the pooled studies themselves. This sadly however, is almost never the case in a meta-analysis. In general, a study which is further removed from the observational basis from which it is derived bears a greater risk of unseen error and flaw in the logical derivations which can be used for conclusion. This is according to several separate laws:

Filbert’s Law

/philosophy : science : analysis : data : risk/ : to find a result use a small sample population, to hide a result use a large one. More accurately expressed as the law of diminishing information return. Increasing the amount of data brought into an analysis does not necessarily serve to improve salience, precision nor accuracy of the analysis, yet comes at the cost of stacking risk in signal. The further away one gets from direct observation, and the more one gets into ‘the data’ only, the higher is the compounded stack of risk in inference; while simultaneously the chance of contribution from bias is also greater.
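The dilution mechanism behind Filbert's Law can be sketched in a few lines. This is a hypothetical simulation (all rates and effect sizes are illustrative assumptions, not drawn from any real study): a strong effect exists only in a small responder subpopulation, and the larger the pooled sample, the more confidently the analysis "finds" only the diluted average – hiding the signal behind precision.

```python
import random
import statistics

random.seed(42)

def observed_effect(n):
    """Simulated pooled trial. A strong +1.0 effect exists only in a
    hypothetical 10% responder subpopulation; everyone else shows no
    effect. Each outcome also carries Gaussian measurement noise."""
    outcomes = []
    for _ in range(n):
        true_effect = 1.0 if random.random() < 0.10 else 0.0
        outcomes.append(true_effect + random.gauss(0, 0.5))
    return statistics.mean(outcomes)

small = observed_effect(50)        # noisy: the subgroup signal can surface
large = observed_effect(100_000)   # converges tightly to the diluted 0.1 mean
```

The large sample yields an estimate near 0.1 with a tiny standard error – a "rigorous" result which nonetheless conceals the +1.0 effect present in the responder subgroup.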

Simpson’s Paradox

/philosophy : science : analysis : data : risk/ : a trend appearing in different groups of data can be manipulated to disappear or reverse (see Effect Inversion) when these groups are combined.
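Simpson's Paradox can be demonstrated with nothing more than arithmetic. The counts below are hypothetical (patterned on the classic severity-stratified treatment comparison): treatment A wins inside every subgroup, yet loses once the subgroups are pooled.

```python
# Hypothetical two-arm data as (successes, total), split by case severity.
data = {
    "mild":   {"A": (81, 87),   "B": (234, 270)},
    "severe": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

# Within each subgroup, A outperforms B.
for group, arms in data.items():
    assert rate(*arms["A"]) > rate(*arms["B"])

# Pool the subgroups: the trend reverses, and A now appears worse.
pooled = {
    arm: tuple(map(sum, zip(*(data[g][arm] for g in data))))
    for arm in ("A", "B")
}
assert rate(*pooled["A"]) < rate(*pooled["B"])  # 0.78 vs ~0.83
```

The reversal arises because A was disproportionately assigned the severe cases – exactly the kind of confounding a pooled meta-statistic can conceal.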

Univariate Error

/philosophy : science : error : method/ : a procedural error (not a ‘fallacy’) wherein one is misled by the phenomenon where it’s possible for two multivariate distributions to overlap along any one variable, but be cleanly separable or have the relationship disappear when one examines the whole relational or configuration space in its entirety.
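A minimal constructed example of univariate error: two hypothetical classes whose marginal distributions on each single variable are identical, yet which separate perfectly once both variables are examined jointly.

```python
# Two classes with IDENTICAL marginals on x and on y,
# yet cleanly separable in the joint (x, y) space.
class_a = [(1, 1), (2, 2), (-1, -1), (-2, -2)]   # lies along y = x
class_b = [(1, -1), (2, -2), (-1, 1), (-2, 2)]   # lies along y = -x

# Univariate views: total overlap on each axis taken alone.
assert sorted(p[0] for p in class_a) == sorted(p[0] for p in class_b)
assert sorted(p[1] for p in class_a) == sorted(p[1] for p in class_b)

# Joint view: the sign of x*y separates the classes without error.
assert all(x * y > 0 for x, y in class_a)
assert all(x * y < 0 for x, y in class_b)
```

An analysis which only ever inspects one variable at a time would report "no difference" here, despite a perfectly real structural relationship.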

utile absentia

/philosophy : science : bias : method : converting silence into data/ : a study which observes false absences of data or creates artificial absence noise through improper study design, and further then assumes such error to represent verified negative or positive observations. A study containing field or set data in which there exists a risk that absences in measurement data will be caused by external factors which artificially serve to make the evidence absent, through risk of failure of detection/collection/retention of that data. The absences of data, rather than being filtered out of analysis, are fallaciously presumed to constitute bona fide observations of negatives. This is improper study design which will often serve to produce an inversion effect (curative effect) in such a study’s final results. Similar to torfuscation.

As well, the instance when an abstract offers a background summary or history of the topic’s material argument as part of its introductory premise, and thereafter mentions that its research supports one argument or position – however fails to define the inference or critical path which served to precipitate that ‘support’ – or even worse, tenders research about a related but not-critical aspect of the research. Like pretending to offer ‘clinical research’ supporting opinions about capitalism, inside a study of the methods employed by bank tellers – it only sounds related. In this case you have converted an absence into a positive – a formal error called utile absentia. This sleight-of-hand allows an opinion article to masquerade as a ‘research study’. It allows one to step into the realm of tendering an apparent epistemological result, which is really nothing more than a ‘they said/I say’ editorial with a scientific analysis bolted onto it, which may or may not present any bearing whatsoever into the subject at hand.

Most abstract surveyors do not know the difference – and most study leads cannot detect when this has occurred.

If one is to ask a broad final consensus question of science, from a series of 1,000 studies many of which did not ask, nor result in, such a question-related set of data, one runs the risk of becoming fouled in the deceptive nature of questioneering and muta-science.
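The inversion effect that utile absentia produces can be illustrated with hypothetical arithmetic (all counts and detection rates below are assumptions for the sketch): the true adverse-event rate is identical in both arms, but detection fails more often in one arm, and counting every non-detection as a verified "no event" manufactures an apparent protective effect.

```python
# Hypothetical cohort: identical true adverse-event counts in both arms,
# but differential failure of detection/collection/retention.
n = 1000
true_events = 100                            # same in each arm; true RR = 1.0

detected_exposed = int(true_events * 0.5)    # only 50% detected in exposed arm
detected_control = int(true_events * 0.9)    # 90% detected in control arm

# Absences are (improperly) counted as bona fide negative observations.
rate_exposed = detected_exposed / n          # 0.05
rate_control = detected_control / n          # 0.09
relative_risk = rate_exposed / rate_control  # ~0.56: spurious "curative" effect
```

The study design, not the phenomenon, produced the inversion: the exposure appears protective solely because its events were harder to observe.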

Observing the Candle

My Advanced Placement Physics teacher in high school, early in my senior year pulled me aside, and in a different approach to that of my equation-learning fellow students, assigned me the task of ‘observing a candle.’ I was a bit chagrined by this seemingly mundane task, its burden appearing to me to be somewhat punitive in comparison to the excitement of lighting paper on fire by means of equation estimates of Joule’s First Law. Nonetheless my physics instructor smiled and said “TES, I want you to sit quietly, relax your mind, and breathe steadily. Then I want you to cite 250 observations of this candle. Not for me, but for you. Lit and unlit, 250 observations.” My mind immediately swept through the salient observations which compose all that an impatient young mind needs to know.

  1.  It is a candle.
  2.  It is key-lime yellow.
  3.  I can light it.
  4.  I can lick my finger and move it back and forth through the flame for fun.

OK, done.

“No, I want you to observe harder…” He smiled and left me with my candle, a pen and 10 sheets of notebook paper. “Crap” I thought. “Sigh…” Okay, so I will observe every little minute detail of this candle and every nuance of that detail, in order to get to my 250 observation count. I breathed, I relaxed and cleared my mind, and stared at the candle until well after my classmates had left the physics lab.

  1. The flame is in a sinusoidal oscillating rhythm.
  2. The average rhythm is .6 seconds.
  3. The rhythm possesses a variation in time from .9 seconds to .2 seconds.
  4. The rhythm moves from .9 seconds to .2 seconds in progression.
  5. The flame appears as if aspirating or breathing.
  6. There is yellow in the flame.
  7. There is blue in the flame.
  8. There is orange in the flame.
  9. The blue increases and decreases with the aspirating sinusoidal pattern…

My instructor did not want me memorizing formulas in order to pass tests – he knew that the other students were going to get senior administrative jobs following rules at big companies or in government offices. He wanted me to instead become a scientist. Science is about the ability to observe sufficiently – not about administering data and formulas, examining them for familiar patterns, and making life easy.

And so on it went. My best observations came as a direct result of my curiosity about what was happening inside this flame. How the candle was made, and the complex interaction between the wax, the ether which combusted versus the paraffin which remained. Suddenly I realized what a wick was for in the first place. I began to formulate ideas as to how to make this device even better, with role plays drifting through my naturally distracted mind. By the time I arrived at 250 observations, I had found that I was not yet done. I knew more about a candle, than I thought possible. I needed to fill in some blanks with some reference technical information, but I understood candle-dom.

A mother who cares for and loves her encephalitic disabled or autistic child from its first moments kicking in her womb, to the most recent day she is spat upon by ill-meaning ‘science communicators’ pretending to represent the medical science community, …she is the observer. Everyone else is a data and formula poser. She knows the science, they only know the script. They wave a single academic meta-analysis in the air as if a Bible, yet possess an empty set of intimate knowledge of the questions entailed, the subtle nature of the subject, its risk, or even the right question to ask under the scientific method.

Data is the management task of the technician. Stand alone data can serve to deceive even more easily than it can underpin analyses which enlighten.

Observation is the heartfelt journey of the true scientist.

This is why, even inside subjects towards which I instinctively react in declaring bunk, I bristle at ‘skeptics’ who do not go into the field and make observations. I do not believe in astrology. But if you have not immersed yourself into it, then do not make claims to me regarding ‘evidence’ and science. I am not impressed at your ability to recite a talking sheet. This is a charade worse than the astrology itself, no matter how likely it is that what you think is indeed correct. If you have not sat quietly in ‘haunted’ houses for nights on end, as part of your life’s journey, don’t pretend to tell me all about the reality of ghosts. If you have not chased a Mach VIII object in your F-14 and filmed it on the gun camera and pulled the tape on the fire control radar, don’t pretend to tell me all about UFOs. And if you have spent sizeable amounts of your ‘skeptic’ time debunking such subjects, and then wander over and purport to tell me all about the evidence behind medical conformity which you would like to enforce upon me, I am going to call you a ‘fucking idiot.’

Don’t be deluded by your databases, formulas and ‘evidence’ crafted by those just like you. It is nothing but counter-bunk, to an ethical skeptic.

Simply having run a statistical analysis on candles can deceive one into thinking that they know even the first thing about candles at all. This is the principle behind what I call Muta-Science. Or in any ethical world, pseudoscience. Bad method, even more questionable results – advertised as science of the ‘highest rigor.’

Fatal Pitfalls of a Data-Centric Meta-Analysis Only Approach to Science

This principal fallacy, the lack of observation in science, or its sacrifice at the hands of the ‘data scientist’ (a scientist who bears little or no qualifying reductive expertise in the subject at hand), introduces the faulty form of science which is often conflated with a rigorous and focused meta-analysis. A muta-analysis changes, by means of sleight-of-hand which cannot easily be isolated, the question being asked, the underlying data itself, the contentions of past science, and the methods of science – by skipping around all but one of the steps of the scientific method.

“In general, post hoc analysis should be deemed exploratory and not conclusive. In the best-case scenario, by revealing the magnitude of effect sizes associated with prior research, meta-analysis can suggest how future studies might be best designed to maximize their individual power.”

     ~T. Greco, Pitfalls of Meta-Analyses‡

In other words, meta-analyses are not meant to provide a boast of consensus, nor a completion of the scientific method. It is ironic indeed that where we deny the need to assemble data and intelligence at the beginning of the scientific method, we bless such action as ‘the gold standard’ when used inappropriately in lieu of, or to artificially force, consensus at the end of the scientific method.

Muta-Analysis (Pseudoscience)

/science : data science : pseudoscience : spin : questioneering/ : the most unreliable of scientific studies. Often a badly developed meta-analysis, which cannot be easily replicated or peer reviewed, contains a high degree of unacknowledged risk, or was executed based upon a poor study plan. An appeal to authority based upon faulty statistical knowledge development processes. Processes which alter or do not employ full scientific methodology, in favor of a premature claim to consensus or rigor implied by the popularity of a statistical study type. A method which does not directly observe, nor directly test, rather employs statistical procedures to answer a faulty, inclusion-criteria-selected, agenda-bearing, or peripherally addressed scientific question.

The Probability of Failure is High and Unacknowledged: Muta-Analysis Employed Simply as a Data-Centric Appeal to Authority

Uncertainty Imparted from Source Study Material Practices and Lack of Observational Basis

1. Lack of qualified individual data or poor study design results in such a large domain of material so as to dilute, or render-below-p, important signal data.
  2. Lack of qualified individual data or poor study design causes access to such a complex domain of material so as to conceal important focused signal data inside the noise generated.
  3. Data obtained from study summaries, rather than original data – serves to surreptitiously mislead.
  4. Studies which corroborate or correspond to current avenues of science tend to have robustly informative titles to catch attention, while studies which have spotted a counter signal, only title their study precisely to the observed signal effect and not the broader topic, and will tend to show up less often as dissenting or countermanding material in a study search.
  5. Large pool study populations challenge the literature search’s ability to understand each study adequately for inclusion criteria.
  6. Does not draw knowledge from the literature, commentary, peer review commentary or expert editorial/caution around a subject, otherwise available from scientists and experts.
  7. When used to impart inferences from studies, pertaining to peripheral topics, which the Level I and II study authors never intended, nor to which they would agree.
  8. Can be used to impart a premature consensus conclusion from the authors of its pooled studies, which they never addressed nor intended to issue.
  9. The mistaken perception can be held that increases in data or precision, result in increases in study quality, result in increases in accuracy.
  10. Meta-analysis makes it possible to look at events that were too rare in the original studies to show a statistically significant difference. However, analyzing rare events represents a problem because small changes in data can determine important changes in the results and this instability can be exaggerated by the use of relative measures of effect instead of absolute ones.‡
  11. There is resistance from authors to allow ready access to their own dataset containing individual patient data, because of a variety of proprietary reasons and concerns over misinterpretation or liability.‡

Study Scale and Cost Detractor Risks

  1. A meta-analysis is work content and resource costly and difficult to replicate or review. Therefore it can be tempting to take its results as consensus finished science – appeal to authority by cost or effort – when such a claim is not warranted or is premature.
  2. Peer review is difficult to impossible to execute because the qualifications to review the data at hand require specific technical, not science, expertise, effort, cost and domain visibility.
  3. An expensive and work-content-heavy study – one which risks a statistical outcome of underwhelming proportion, or one lacking heterogeneity – may tempt its authors to revise the study or question once principal results are in hand, in order to resolve a weaker, but more monumental, outcome for peer review and publication.
  4. The very potential of wasted money, introduces an intrinsic conflict of interest, which does not exist in incremental Level I and II studies.
  5. Database queries are run by low cost analysts and research assistants, as a necessity of cost. These non-experts may inadvertently tamper with the available set of input data by means of the criteria employed in a search, its exclusion and inclusion conditions or the inability to know when a suitably representative sample has been extracted from a domain with which the assistant has no familiarity.

Misalignment Between Expertise Demand and Task Assignments Imparting Uncertainty and Risk

  1. Can be executed by academic students and research assistants, at an arms-length from the field, or by data technicians who do not fully grasp the field or question at hand because the lead scientists are not familiar with the database/query technology employed in the meta-analysis.
  2. Executed by data technicians who are not skilled in science or observation, nor in the process of developing a logical critical reduction path.
  3. Literature searches performed by semi-qualified, data technician or research assistant parties serve to bias the formulation of question and bias the results.
  4. Technical tasks inside the study do not allow senior research or observation professionals ease of visibility and access into the data practices and intermediate results. No litmus testing is easily allowed midstream. All of which serve to weaken the overall strength and circumspection of the study. The results are the results, and who has the expertise to say they are wrong?
  5. Most abstract surveyors are not aware when they have misinterpreted an abstract or created a false support/dissent observation for the sampled issue – and most study leads cannot detect when this has occurred.

Study Design & Execution Risks

  1.  Subtle failures to adequately define the question studied serve to bias the results, through faulty inclusion criteria.
  2.  Electronic-only database searches artificially skew the study pool and induce bias, by excluding studies absent any stated exclusion criteria.
  3.  Inclusion criteria crafted to unfairly represent the objections of the dissenting body of science inside a given question.
  4.  Inclusion criteria posed to answer a different question than the ones posed in the first and second level pool studies from which it draws.
  5.  Development of a conclusion from Level I and II studies which provide for little or no heterogeneity in conclusion base.
  6.  Quality scores in study design may exclude more salient studies with lower quality scores in favor of less salient studies which have high quality scores, thereby both imbuing bias into the selection criteria while at the same time advertising a high quality selection rating.
  7. Sensitivity analyses must presume a correlative conjecture – yet do not prove that relationship at the same time. The danger is that these relationships will be granted the same gravitas as the primary claim. “Correlation does not prove causality, unless it is a side observation inside a meta-analysis.”
  8. Both data sourcing and arm’s length Cochrane-style Reviews come from contributors and groups based all around the world with the majority of the work carried out online. Enormous bias in data interpretation can be imputed through such a process.

Abuse of Study Intent Gravitas and Implications

  1. In the face of sufficient risk, unknowns or detractors, a meta-analysis can be employed as a defensive boast to provide a smoke screen of rigor which cannot be otherwise claimed in Level I or II study.
  2. In the face of conflicting results, a meta-analysis can be utilized more as an involuntary ‘vote’ by scientists or popularity contest fake-measure for deriving artificially crafted ‘consensus.’
  3. Can be used, because of its ‘most rigorous form of study’ POTENTIAL – as an appeal to authority.
  4. The question arises as to why, in a field such as medicine where direct observations are so readily available, we constantly value science which involves NO observation at all. This renders the study methodology vulnerable to market forces which wish to bypass the liability imparted by means of direct observation.
  5. Cochrane Collaboration approaches may cause undue credibility to be granted biased studies from vested/conflicted interests through otherwise respected systematic review channels.

Note: None of the above risks exists in a Level I and II study, which fails more from errors in measurement, alternative diligence and conclusive shortcuts – not from risks which are imparted by the nature of the study methodology itself. This is the unacknowledged risk, taken on by the person making a claim via meta-analysis.

High Risk  +  Used as a Claim to Gold Standard Rigor or Consensus  =  High Probability of Abuse

Remember that risk is not defined as the ‘probability of failure,’ rather ‘an accumulation of the number of risk bearing elements in series, which could potentially serve to undermine a process or result.’† In this regard, a meta-analysis bears more risk than does a Level I or II study – because of the complexity and inability to replicate the logical calculus or scale of effort involved. Twelve high-probability tasks targeting accomplishment of a goal result in one low-probability outcome of success. Twenty-one risk points (ten of the above 31 points are parallel risks or potential abuses of employment, not intrinsic series method risks) afford an even less probable avenue to success, approaching a probability of failure of 1.00.† Figure 1 shows the cumulative series on such methodology risk, where P(f) is the probability of failure, and its approach to a 1.00 asymptote.

Figure 1: Highly populated probability of failure, series†


So, with the ‘gold standard of rigor’ claim with regard to meta-analyses comes the knowledge that we are purchasing such a claim to authority at the cost of an enormous amount of unacknowledged but adopted risk. To wholesale allow for the establishment of consensus evidence, on the basis of solely meta-study approaches, is scientific foolishness and self-deception.
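The series-risk arithmetic behind Figure 1 is straightforward compound probability. As a sketch, assume (hypothetically) that each in-series risk point carries a 0.9 probability of being executed without flaw; the probability of an unflawed end result for k such points is then 0.9^k, and P(f) = 1 − 0.9^k climbs toward the 1.00 asymptote.

```python
# Compound series risk: every one of k method steps must succeed.
# The 0.9 per-step reliability is an illustrative assumption only.
p_step = 0.9

for k in (1, 12, 21):
    p_success = p_step ** k
    p_failure = 1 - p_success
    print(f"{k:2d} series risk points -> P(f) = {p_failure:.2f}")
# At 21 risk points, P(f) is roughly 0.89 -- approaching the 1.00 asymptote
```

Even generously reliable individual steps, stacked twenty-one deep in series, leave only about an 11% chance of an unflawed chain of inference.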

It is amazing to me that a study method, which does not allow for replication, nor suitable peer review, and which contains 21 elements of risk in series – is somehow regarded as the pinnacle of scientific rigor.

A meta-analysis – CAN be – the gold standard of study rigor and verity. But more importantly, we introduce error into science when we issue this appeal to authority basis to studies which fail to address their own propensity to misidentify, misconstrue, mislead, and misinform science and the public.

epoché vanguards gnosis


¹  The Neuroskeptic; Discover Magazine Online: From “Observations” to “Data”: The Changing Language of Science | March 19, 2016 10:32 am; http://blogs.discovermagazine.com/neuroskeptic/2016/03/19/from-observations-to-data/#more-7513.

²  Computer History Museum: Timeline of Computer History; http://www.computerhistory.org/timeline/computers/.

³  Haidich, A. B. (2010). Meta-analysis in medical research. Hippokratia, 14(Suppl 1), 29–37; http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3049418/.

†  Bin Suo, Yongsheng Cheng, Jun Li; Calculation of Failure Probability of Series and Parallel Systems for Imprecise Probability; http://www.mecs-press.org/ijem/ijem-v2-n2/IJEM-V2-N2-12.pdf.

‡  Greco, T., Zangrillo, A., Biondi-Zoccai, G., & Landoni, G. (2013). Meta-analysis: pitfalls and hints. Heart, Lung and Vessels, 5(4), 219–225.

March 19, 2016 | Agenda Propaganda, Argument Fallacies

Popper Demarcation Practice and Malpractice

Many people presume the Popper Demarcation Principle to mark the boundary between science and pseudoscience. While Popper Demarcation indeed involves this aspect, the two ideas are not congruent. The actual delineation hinges on the role of predictive and falsifying testing as practiced by those entities claiming the methods of science, or science as the body of knowledge. Popper contrasts this claimant group with those who make no claim to science at all. False skepticism, to Popper, was also pseudoscience, because it claims to be conducting science – but does not employ rules of evidence or falsification. This includes the practice whereby his definitions are abused in order to falsely condemn disliked subjects.
If your version of skepticism purports that it ’employs the tools of science to make the most probable conclusion’ on behalf of science, or calls an entire subject a ‘pseudoscience,’ …then beware.

Karl Popper proposed the demarcation principle as a means to approach the problem of how we differentiate science from non-science in principle. This categorization of that which resides outside of science is a non-pejorative filtering of those mechanisms which can be relied upon to produce the body of knowledge. To put it another way, the demarcation problem consists in crafting principles, constraints, reasons, or conditions to place something like epigenetics under “science,” and a discipline (falsified, yet still pretending to represent science) such as phrenology under “pseudoscience.” The two critical aspects of the Popper Demarcation Principle involve the separate issues treated below: the role of predictive study, and the claim, or lack thereof, of doing or representing science. Absent framing Popper Demarcation inside these two clarifying factors, much confusion and false condemnation can be spun by fake skeptics, through Popper Demarcation Malpractice.

The Ambidextrous Nature of Predictive Studies

Predictive study is treated differently by Popper, as distinguished by its role of employment in the methods of science as opposed to the conclusions of science (see graphic to the right). Popper, like any scientist, fully understood the critical role of predictive studies in the scientific method, as well as the critical role of prediction-making ability inside a successful theory. He was not discounting these valuable components/steps of the process of science. What Popper was framing is the circumstance where predictive study alone is employed to substantiate conclusions as accepted or peer-ready science. This type of science is the chief method of hypoepistemology practiced by those wishing to push a social agenda. In this role, predictive studies can be employed as pseudoscience. The Ethical Skeptic must discern the circumstance where an epistemology is based only on scant statistics, studies of studies, or predictive tests – and has not fully challenged its theory with ascertainable falsification testing or past falsification achievements (Promotification or Popper Error).

However, for those who confuse or conflate the methods of science with the body of scientific knowledge, the role of predictive study is sacrificed at the altar of agenda. In such approaches – employing equivocal terms or proxy equivocation in their articulation of the issue of predictability – every proposed claim about what distinguishes science from pseudoscience can be confused with a counter-example. Karl Popper postulated that falsifiability stands as the criterion which distinguishes science from pseudoscience. If a set of claims or a theory can be subjected to the disciplines of falsification, it belongs to the domain of science. Many people wrongly presume this to mean that if any set of claims or theory is innately unfalsifiable, it belongs to the domain of pseudoscience. This delineation is incoherent, as some untestable scientific claim sets, such as M-theory or multi-verse interpretations, are not considered pseudoscience.

If they were enforced based on predictive study only, as a finished body of knowledge, that would indeed be hypoepistemology pseudoscience. But in science as a method, M-theory or multi-verse predictive studies are indeed considered science.

The key opportunistic play here for Social Skeptics is that both the context dancing between science as a body of knowledge and science as a method, and the equivocation involved in merging the two ideas, produce incoherence and useful confusion – a method of condemning subjects by dancing between the two contexts of the Popperian term. A simple prima facie incoherence that Karl Popper, a seasoned scientist and philosopher, surely would have – and did – recognize. Did the people who presume this equivocation think Karl Popper a simpleton, more stupid than they? The reality that escapes the philosophical dilettante is that he did indeed deal with this inconsistency. The Handbook of the Philosophy of Science expounds on this:

The phrases “demarcation of science” and “demarcation of science from pseudoscience” are often used interchangeably, and many authors seem to have regarded them as equal in meaning. In their view the task of drawing the outer boundaries of science is essentially the same as that of drawing the boundary between science and pseudoscience.

This picture is oversimplified. All non-science is not pseudoscience, and science has non-trivial borders to other non-scientific phenomena, such as metaphysics, religion, and various types of non-scientific systematized knowledge.¹

Claiming and Not Claiming to Do or Represent Science

There is a stark difference between those things which claim to be science, and those things which claim nothing of the sort. If my neighbor runs over and swears that he saw The Chupacabra running through his backyard, he is not claiming to do science; he is not practicing pseudoscience. If he goes to the city council and cites that there are hundreds of missing cats and dogs in the area, he is still not practicing pseudoscience. This set of activity simply constitutes observation and advocacy (or possibly fraud). This is a key understanding which differentiates the false skeptic from the real skeptic. It is when he makes the nonsense claim that he has done research, and by examining the poop of the supposed animal in a lab, now claims that what he saw in his backyard must be an interdimensional being, released by UFOs, because its poop contained animal proteins not found on this Earth – that is when the person making such claims has indeed stepped into the bounds of pseudoscience. At no time is he ever a pseudoscientist simply because he made an observation of something called by fake skeptics ‘a pseudoscience.’ ¹ ²

Even if he becomes an advocate, and attempts to petition science to study the issue, he is not dabbling in pseudoscience. To kill this type of process through fake skepticism is to kill the process of science; yes, even on a brazenly ridiculous topic like The Chupacabra. Presuming that one is doing science by calling the gentleman a liar, or deluded, is in itself a claim Ξ pseudoscience. Many fake and shallow skeptics fail to discern this important aspect of the Popper Demarcation Principle.

Among things which are unfairly labeled as pseudoscience by ill intended fake skeptics, are:

  • Sponsorship of ideas for research
  • Subjects which are ignored through social epistemology or pressure
  • Positions which appear to oppose oligarch corporations
  • Political positions
  • Religious tenets
  • Citing of anomalous observations
  • Moral positions
  • Art, fiction, creative works
  • Advocacy for health observations and those who suffer
  • Anecdotal evidence which is ignored on a grand scale

By practicing Popper Demarcation Malpractice, Social Skeptics can manage the control of access to science, effectively screening out disliked topics, observations and ideas.

Popper Demarcation Malpractice

/philosophy : science : pseudoscience : malpractice/ : the dilettante presumption that if any set of claims or theory is innately non-falsifiable, it belongs to the domain of pseudoscience. Wrongly presuming a subject to be a pseudoscience, instead of false practices pretending to be science. Purposely or unskillfully conflating the methods of science with the body of scientific knowledge, employing amphibology or proxy equivocation in their articulation of the issue, wherein every proposed claim about what distinguishes science from pseudoscience can be confused with a counter-example. This renders the demarcation boundary of no utility, and reduces overall understanding.

Transactional Popper Demarcation Error – incorrectly citing a topic as being a pseudoscience, when in fact its sponsors are seeking falsification-based protocols to counter the antithetical premise to their case, or are employing predictive studies simply to establish plurality for sponsorship inside the scientific method.

Existential Popper Demarcation Error – citing something as a pseudoscience simply because one does not like the topic, or the topic has had pretend science performed in its name in the past.

The reality is that there exist three domains of idea development: Science, Pseudoscience, and Parascience/Non-science. Understanding these three domains, and skillfully applying that understanding inside the discourse of ideas, is the ethic of one who sincerely wants to know. It is the habit of one who practices Ethical Skepticism, as opposed to the purposeful smoke-and-mirrors, equivocation-imbued, pretend-science, idea-assassinating fake version of skepticism.

Science (a method, a discipline and a body of knowledge)

The application of observation, thought, reason, testing, and peer input to arrive at conclusions which reliably can be added to the body of knowledge. That body of knowledge itself.

Particle Acceleration

Materials Fabrication

Epigenetics

Pseudoscience (a method and pretense only)

A process which claims to arrive at conclusions by means of science, or which cites elements it purports to exist in the body of scientific knowledge, when in fact it neither adheres to, nor originates from, actual methods of science.

Attempting to demonstrate free energy by sleight-of-hand battery switching and amperage measurements

Attempting to show one is located on the Earth’s equator by demonstrating differing water drain patterns both south and north of a fictitiously drawn line

Pseudo-Skepticism

Parascience

Thinking disciplines of benefit to mankind, which seek to improve the human condition, or solve perplexing issues, or even assist science in its overall efficacy, but do not necessarily make the claim of employing science in order to derive such ethics.

Advocacy

Observation

Science Fiction

Non-science

Disciplines of human endeavor which do not employ, nor claim to employ, science in their execution. They may however involve some science in their development – or turn into a discipline of science through diligent sponsorship.

Law

Religion

Public Speaking

An Example of Popper Demarcation Malpractice:

Sometimes the term “pseudoscience” is used in a wider sense, in order to pejoratively filter out ideas sponsored by researchers, advocates, legal activists, politicians and those making disturbing observations. The abuse of the term in this fashion casts as pseudoscience that which

(2′)  it is part of a non-scientific doctrine whose major proponents try to create the impression that it is scientific.

(2″)  is part of a doctrine whose major proponents try to create the impression that it represents the most reliable knowledge on its subject matter.²

This is false, because the practice which establishes that ‘proponents try to create the impression that it represents science’ fails the Popper Demarcation itself. If we are applying Popper here, we cannot create postulates which violate the very principle we are seeking to construct. Declaring a subject, in absence of evidence proving such a claim, to be constituted solely by individuals who are pretending to do science – 1. claims to hold a body of knowledge, and 2. does so without a basis of true science to derive that knowledge. Therefore such a claim is, itself, pseudoscience according to Popper.

The SSkeptics Dictionary for example (http://www.skepdic.com/pseudosc.html) incorrectly defines pseudoscience as

“A pseudoscience is a set of ideas based on theories put forth as scientific when they are not scientific.”

This definition is an incoherent one-liner – Wittgenstein unsinnig: highly convoluted, implication-laden, professional-sounding babble, articulated so as to tender the appearance of being simple. It is incompatible with parsimony in this regard; and as well, ironically fails the Popper Demarcation of Science itself, because

  1. It conflates ideas into ‘theories’ by default, in an effort to pejoratively filter them – a practice of pseudoscience. A theory implies a set of claims under scientific method, which ideas may not involve. A very similar equivocation to calling an observation a ‘claim,’ so that one can then dismiss it as ‘failing science.’
  2. It is NOT ideas which are pseudo-scientific – rather
    1. those things purported to already exist in the body of knowledge, when indeed such is not the case, and
    2. those things purported to be based on methods which are scientific, but in reality are not.
  3. It regards a SUBJECT MATTER (theories), rather than a contention or process, as that which qualifies something as pseudoscience. This is errant and constitutes a logical fallacy – and for those who understand this, yet commit the offense so as to screen subjects from access to science, it also constitutes a practice of fraud.
  4. It may or may not imply that proponents of the ‘ideas’ try to create the impression that they represent science or the most reliable knowledge on its subject matter. Again, such a claim cannot be made outside of research and scientific practice; constituting in its implied claim, defamation and pseudoscience.
  5. It makes a final contention that certain ideas are ‘not scientific’ based on a prescribed set of conclusions or the personal level of knowledge on the part of the observer. This is not how science nor skepticism work at all.

The grasp of this differentiation is a key litmus test distinguishing a false skeptic from a true skeptic. They claim to represent science to you in this misrepresentation sleight-of-hand. The shallow and inexperienced might buy this at face value, but an Ethical Skeptic will not.

It is nothing but Popper Demarcation Malpractice… scientific quackery.


¹  Mahner, Martin, 2007. “Demarcating Science from Non-Science”, pp 515-575 in Theo Kuipers (ed.) Handbook of the Philosophy of Science: General Philosophy of Science – Focal Issues, Amsterdam: Elsevier.

²  Hansson, Sven Ove, “Science and Pseudo-Science”, The Stanford Encyclopedia of Philosophy (Spring 2015 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/spr2015/entries/pseudo-science/>.

September 20, 2015 | Argument Fallacies

Wittgenstein Error and Its Faithful Participants

I neither want to understand your observation or contention, nor do I regard it as acceptable for consideration unless I see solid conclusive empirical underpinning; much as I hold for all the things I regard as true.  Until it is proved, I will allow no language of science to develop around the subject. Your terms and measures are all pseudo-science.
Wittgenstein says bullshit to the supposed objectivity of those who game process in this manner, and identifies three types of error to which the social epistemologist falls prey.

One of the principal developers of our modern framing of philosophy was Ludwig Wittgenstein (most often pronounced /vhit’-geng-shtiyne/). His Philosophy of Mathematics served as a means to bring into coherence his own contentions regarding the role and limitations of philosophy as it impacts our science. Indeed, Wittgenstein’s writings from 1929 through 1944 are heavily devoted to mathematics, a fact that Wittgenstein himself emphasized in 1944 by writing that his “chief contribution has been in the philosophy of mathematics.”¹ This focus on mathematics afforded Wittgenstein a frame of reference from which to understand the contrast between a hard-boundary science such as maths, and a soft-boundary science, such as philosophy. Wittgenstein went so far as to characterize philosophy thus: “philosophy is not a theory, or a doctrine, but rather an activity. It is an activity of clarification (of thoughts), and more so, of critique (of language).”²

As such, philosophy does not lend itself to recitation in the Wittgenstein argument, aside from its basic dependence upon foundational elements, tautologies that are true based on their own essence (e.g. 2+2=4, or I am alive). A philosophy can be nonsense (unsinnig) when it is devoid of any referenceable structure or meaning, or senseless (sinnlos) when possessing referenceable structure and meaning but contestable in terms of accuracy.¹ Most of mankind’s contentions are incorrectly ascribed to be nonsense by faking skeptics, when in fact they are more accurately termed senseless under the Wittgensteinian philosophical framing. But for the most part, we forgive social skeptics this error. Employment of the pejorative ‘nonsense’ sounds more scientifically authoritative and conclusive. They are all about sounding authoritative and conclusive, so they would rather employ that term in error. Terminology and language, as it turns out in the Wittgenstein sense, are not only crucial to the foundations of understanding and science, but also stand as one of the principal tools through which fake skepticism is leveraged.

Like everything metaphysical the harmony between thought and reality is to be found in the grammar of the language.
     ~ Ludwig Wittgenstein

Wittgenstein is considered one of the key developers of thought regarding how we understand what is philosophy, sense, nonsense and knowledge.  Wittgenstein outlined a useful framework hinging around the all important role of language (#2 in the series), inside of which I have developed a series of litmus thresholds which define what is both knowable, and what is known. Elements of science, or what is considered to be added to the body of what is known, are dependent upon several channels of serviceability in order to possess even the remotest possibility of becoming a part of our body of knowledge. Each must sufficiently pass a litmus test of serviceability in terms of:

  1. Domain of Comprehensibility – a tenet of knowledge must be graspable by the mind of at least one person
  2. Domain of Descriptive Symbology (Language) – a tenet of knowledge must be describable in some kind of symbolism, both privately and commonly held
  3. Domain of Intelligibility – a tenet of knowledge must be framable in reference to previous foundation tenets of knowledge (Wittgenstein elements)
  4. Domain of Observability – a tenet of knowledge must possess a feature which is at least in part, observable
  5. Domain of Tolerability – a tenet of knowledge must not offend the sensibility of members of those who hear it
  6. Domain of Sustainability – a tenet of knowledge must be both teachable, and teachable by persons other than its originator

Wittgenstein placed language and descriptive symbology as the foundational aspect which influences the intelligibility and more of an idea. These six screening mechanisms are the filters through which mankind develops what is considered the Body of That Which is Known (science). However, when this process is tampered with, or the methods of science are crafted in such a way as to corrupt and game this series of acceptances, two errors result, below.

Because most things are innately comprehensible (1), intelligible (3), and can be observed (4) and taught (6); therefore, should I wish to constrain science in the Wittgensteinian/Ethical Skeptic sense, science which is headed in a direction I do not like, I must attack the two vulnerable thresholds of the knowledge development process (Numbers 2 and 5 above):

A.  Refuse to afford the subject an intelligible and professionally agreed set of concepts, questions or descriptive language, and

B.  Position a screen of intolerability as to its being sustained (taught).

Wittgenstein cautioned that what is ‘known’ can be as much an exercise in philosophy as is philosophy itself. Unlike the hard-boundary science of maths, much of what we know is vulnerable to what we want to know – to our soft-boundary philosophy of what we can comprehend, measure, communicate or desire to observe. Very often this knowledge is not in reality anchored existentially into either that which is known, or especially that which can be known. We pick and choose what eventually arrives as ‘truth’ based on our philosophy.

It is this refusal to observe that which can be known which is the chief sinnlos on the part of the Social Skeptic. Both the desire to not know something, and the belief that all one’s knowledge is underpinned outside the framework of philosophy, stand as a grand fantasy on the part of the social skeptic. The nonsense arises in their inability to observe this in themselves.

More than simply an argument from ignorance, the Wittgenstein Error is the active construction of the ignorance itself. A gaming of what is observable by tampering with language and symbolism first. It is akin to attempting complex math while refusing to allow a mechanism for integration. All in order to shepherd to a priori ends, that which can be known.

This gives rise to particular forms of error on the part of those who profess science as part of a social agenda – errors which The Ethical Skeptic is wise to avoid in their own thinking, and to quietly identify in the thinking of others, in a non-pejorative context (we are all vulnerable to this human frailty).

Wittgenstein Error (Descriptive)

Describable: I cannot observe it because I refuse to describe it.

Corruptible: Science cannot observe it because I have crafted language and definition so as to preclude its description.

Existential Embargo:  By embargoing a topical context (language) I favor my preferred ones through means of inverse negation.

/philosophy : knowledge development : symbolism and language/ – the contention or assumption that science has no evidence for or ability to measure a proposition or contention, when in fact it is only a flawed crafting of language and definition, limitation of language itself or lack of a cogent question or (willful) ignorance on the part of the participants which has limited science and not in reality science’s domain of observability.

“Philosophy is a battle against the bewitchment of our intelligence by means of language.” ~ Wittgenstein

Wittgenstein Error (Contextual)

Situational: I can shift the meaning of words to my favor or disfavor by the context in which they are employed.

/philosophy : knowledge development : symbolism and language/ – the philosophical conception of words bearing a meaning-as-use approach to definition, or the idea that the meanings of words, relative or not, cannot be defined abstract in isolation from the contexts in which they are employed.³ Semantics and locution abuse as it formulates the basis of rhetoric.

“We are unable clearly to circumscribe the concepts we use; not because we don’t know their real definition, but because there is no real ‘definition’ to them.” ~ Wittgenstein

Wittgenstein Error (Epistemological)

Tolerable: My science is an ontology dressed up as empiricism.

/philosophy : knowledge development : fallacies/ – the contention that a proposition must be supported by empirical data or else it is meaningless, nonsense or useless, or that a contention which is supported by empirical data is therefore sensible, when in fact the proposition can be framed into meaninglessness, nonsense or uselessness based upon its underlying state or lacking of definition, structure, logical calculus or usefulness in addressing a logical critical path.

bedeutungslos – meaningless or incoherent. A proposition or question which resides upon a lack of definition, or which contains no meaning in and of itself.

unsinnig – nonsense or non-science. A proposition of compromised formal structure or not framed in a scientifically valid form of reduction. Feynman ‘not even wrong.’

sinnlos – mis-sense, logical untruth or lying. A contention which does not follow from the evidence, is correct at face value but disinformative, or is otherwise useless.¹

“Nothing is so difficult as not deceiving oneself.” ~ Wittgenstein

Our duty is to challenge pseudo-skepticism which employs these error bases, institutional doctrine which is founded upon them, and the resulting cultivation of ignorance which provides the fertile soil from which more of this type of error can be perpetuated.


¹  Rodych, Victor, “Wittgenstein’s Philosophy of Mathematics”, The Stanford Encyclopedia of Philosophy (Summer 2011 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/sum2011/entries/wittgenstein-mathematics/>.

²  Biletzki, Anat and Matar, Anat, “Ludwig Wittgenstein”, The Stanford Encyclopedia of Philosophy (Spring 2014 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/spr2014/entries/wittgenstein/>.

³  Auerbach, David, Slate, The Limits of Language: Wittgenstein explains why we always misunderstand one another on the Internet; Sep 1, 2015; http://www.slate.com/articles/life/classes/2015/09/take_a_wittgenstein_class_he_explains_the_problems_of_translating_language.single.html.

August 15, 2015 | Agenda Propaganda, Argument Fallacies
