The Ethical Skeptic

Challenging Pseudo-Skepticism, Institutional Propaganda and Cultivated Ignorance

The Lyin’tific Method: The Ten Commandments of Fake Science

The earmarks of bad science are surreptitious in fabric, not easily discerned by media and the public at large. Sadly, they are often not easily discerned by scientists themselves either. This is why we have ethical skepticism. Its purpose is not simply to examine ‘extraordinary claims’, but also to examine those claims which masquerade, hidden in plain sight, as if constituting ordinary boring old ‘settled science’.

Perhaps you do not want the answer to be known, or you desire a specific answer because of social pressure surrounding an issue, or you are tired of irrational hordes babbling some nonsense about your product ‘harming their family members’ *boo-hoo 😢. Maybe you want to tout the life-extending benefits of drinking alcohol, or overinflate death rates so that you can blame them on people you hate – or maybe you are just plain ol’ weary of the requisite attributes of real science. Wherever your Procrustean aspiration may reside, this is the set of guidebook best practices for you and your science organization: trendy and proven techniques which will allow your organization to get science back on your side, at a fraction of the cost and in a fraction of the time. 👍

Crank up your science communicators and notify them to be at the ready, to plagiarize a whole new set of journalistic propaganda, ‘cuz here comes The Lyin’tific Method!

The Lyin’tific Method: The Ten Commandments of Fake Science

When you have become indignant and up to your rational limit over privileged anti-science believers questioning your virtuous authority and endangering your industry profits (pseudo-necessity), well then it is high time to undertake the following procedure.

1. Select for Intimidation. Appoint an employee who is under financial or career duress to create a company formed solely to conduct this study under an appearance of impartiality, and to then go back and live comfortably in their career or retirement. Hand them the problem definition, approach, study methodology and scope. Use lots of Bradley Effect-vulnerable interns (as data scientists) and persons trying to gain career exposure and impress. Visibly assail any dissent as being ‘anti-science’, and the study lead will quickly grasp the implicit study goal – they will execute all this without question. Demonstrably censure or publicly berate a scientist who dissented on a previous study – allow the entire organization/world to see this. Make him the hate-symbol for your a priori cause.

2. Ask a Question First. Start by asking a ‘one-and-done’, noncritical-path, poorly framed, half-assed, sciencey-sounding question, representative of a very minor portion of the risk domain in question and bearing the most likely chance of obtaining a desired result – without any prior basis of observation, necessity, intelligence from stakeholders or background research. Stress that the scientific method begins with ‘asking a question’. Avoid peer or public input before and after approval of the study design. Never allow stakeholders at risk to help select or frame the core problem definition, the data pulled, or the methodology/architecture of study.

3. Amass the Right Data. Never seek peer input at the beginning of the scientific process (especially on what data to assemble), only the end. Gather a precipitously large amount of ‘reliable’ data, under a Streetlight Effect, which is highly removed from the data’s origin and stripped of any probative context – such as an administrative bureaucracy database. Screen data from sources which introduce ‘unreliable’ inputs (such as may contain eyewitness, probative, falsifying, disadvantageous anecdotal or stakeholder influenced data) in terms of the core question being asked. Gather more data to dilute a threatening signal, less data to enhance a desired one. Number of records pulled is more important than any particular discriminating attribute entailed in the data. The data volume pulled should be perceptibly massive to laymen and the media. Ensure that the reliable source from which you draw data, bears a risk that threatening observations will accidentally not be collected, through reporting, bureaucracy, process or catalog errors. Treat these absences of data as constituting negative observations.
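
The dilution tactic in step 3 is easy to demonstrate. In the minimal sketch below (all numbers invented for illustration), a fixed excess of adverse events stands out clearly in a focused pull of 1,000 exposed records, yet falls below the conventional z = 1.96 significance threshold once the same signal is buried inside 20,000 records drawn from a broad administrative database:

```python
import math

def z_stat(events, n, p0):
    """One-sample z-statistic for an observed event rate vs. a known
    background rate p0 (normal approximation to the binomial)."""
    p_hat = events / n
    return (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)

BACKGROUND = 0.01      # assumed 1% background adverse-event rate
EXCESS_EVENTS = 30     # the genuine signal: 30 events in 1,000 exposed records

# Focused data pull: the signal stands out strongly
z_focused = z_stat(EXCESS_EVENTS, 1_000, BACKGROUND)

# Diluted pull: add 19,000 unrelated records carrying only background noise
noise_events = round(19_000 * BACKGROUND)   # 190 expected background events
z_diluted = z_stat(EXCESS_EVENTS + noise_events, 20_000, BACKGROUND)

print(f"focused pull : z = {z_focused:.2f}")   # well above 1.96
print(f"diluted pull : z = {z_diluted:.2f}")   # below 1.96 - 'no effect found'
```

The excess events never change; only the denominator does. The perceptibly massive record count is doing the concealing.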

4. Compartmentalize. Address your data analysts and interns as ‘data scientists’, and your scientists who do not understand data analysis at all as the ‘study leads’. Ensure that those who do not understand the critical nature of the question being asked (the data scientists) are the only ones who can feed study results to people who exclusively do not grasp how to derive those results in the first place (the study leads). Establish a lexicon of buzzwords which allows those who do not fully understand what is going on (pretty much everyone) to survive in the organization. This is laundering information by means of the dichotomy of compartmented intelligence, and it is critical to everyone being deceived. There should not exist at its end a single party who understands everything which transpired inside the study. This way your study architecture cannot be betrayed by insiders (especially helpful for step 8).

5. Go Meta-Study Early. Never, ever, ever employ a study which is deductive in nature; rather employ study which is only mildly and inductively suggestive (so as to avoid future accusations of fraud or liability) – and of such a nature that it cannot be challenged by any form of direct testing mechanism. Meticulously avoid systematic review, randomized controlled trial, cohort study, case-control study, cross-sectional study, case reports and series, or reports from any stakeholders at risk. Go meta-study early, and use its reputation as the highest form of study to declare consensus; especially if the body of industry study from which you draw is immature, and as early in the maturation of that research as possible. Imply idempotency in the process of assimilation, but let the data scientists interpret other study results as they (we) wish. Allow them freedom in the construction of oversampling adjustment factors. Hide the methodology under which your data scientists derived conclusions from tons of combined statistics drawn from disparate studies examining different issues, whose authors were not even contacted in order to determine whether their study would apply to your statistical database or not.

6. Shift the Playing Field. Conduct a single statistical study which is ostensibly testing all related conjectures and risks in one fell swoop, in a different country or practice domain from that of the stakeholders asking the irritating question to begin with; moreover, with the wrong age group or a less risky subset thereof, cherry-sorted for reliability rather than probative value, or which is inclusion- and exclusion-biased to obfuscate or enhance an effect. Bias the questions asked so as to convert negatives into unknowns, or vice versa if a negative outcome is desired. If the data shows a disliked signal in aggregate, then split it up until that disappears – conversely, if it shows a signal in component sets, combine the data into one large Yule-Simpson effect. Ensure there exists more confidence in the accuracy of the percentage significance in measure (p-value) than in the accuracy/salience of the contained measures themselves.
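
The split/combine gambit above is the Yule-Simpson effect in action. A minimal sketch with invented counts: the exposed group shows a higher complication rate than the control within every stratum, yet a lower rate once the strata are pooled, because exposure is unevenly distributed across a confounding severity variable:

```python
# Invented counts for illustration: (complications, total) per severity stratum
exposed = {"mild": (36, 270), "severe": (25, 80)}
control = {"mild": (6, 87),   "severe": (71, 263)}

def rate(cell):
    events, n = cell
    return events / n

def pooled(groups):
    """Collapse all strata into one aggregate rate."""
    events = sum(e for e, _ in groups.values())
    n = sum(t for _, t in groups.values())
    return events / n

# Within every stratum, the exposed group looks WORSE...
for stratum in exposed:
    print(f"{stratum:>6}: exposed {rate(exposed[stratum]):.1%} "
          f"vs control {rate(control[stratum]):.1%}")

# ...yet pooled together it looks BETTER - the Yule-Simpson reversal
print(f"pooled: exposed {pooled(exposed):.1%} vs control {pooled(control):.1%}")
```

Whichever direction the promoter needs, they simply choose the level of aggregation at which to stop reporting.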

7. Trashcan Failures to Confirm. Query the data 50 different ways and shades of grey, selecting for the method which tends to produce results favoring your a priori position. Instruct the ‘data scientists’ to throw out all the other data research avenues you took (they don’t care), especially if they could aid in follow-on study which could refute your results. Despite being able to examine the data 1,000 different ways, only examine it in this one way henceforth. Peer review the hell out of any studies which do not produce a desired result. Explain any opposing ideas or studies as being simply a matter of doctors not being trained to recognize things the way your expert data scientists did. If, as a result of too much inherent bias in these methods, the data yields an inversion effect – point out the virtuous component implied (our technology not only does not cause the malady in question, but we found in this study that it cures it!).
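
Step 7’s ‘query the data 50 different ways’ is the textbook multiple-comparisons problem. A short sketch of the arithmetic: if each query carries an honest 5% false-positive rate, the chance that at least one of 50 independent queries ‘confirms’ something is nearly certain – which is why discarding the other 49 matters so much to the charade. (The simulation relies on the fact that p-values are uniformly distributed under the null hypothesis.)

```python
import random

ALPHA, QUERIES = 0.05, 50

# Analytic family-wise false-positive probability
p_any = 1 - (1 - ALPHA) ** QUERIES
print(f"P(at least one 'significant' result) = {p_any:.3f}")  # ~0.923

# Simulation: under the null, each query's p-value is Uniform(0, 1).
# Run many 'studies', each trying 50 queries and keeping only the best p.
random.seed(42)
hits = sum(
    min(random.random() for _ in range(QUERIES)) < ALPHA
    for _ in range(10_000)
)
print(f"simulated rate of a false 'confirmation': {hits / 10_000:.3f}")

# A Bonferroni correction (alpha / 50 per query) restores honesty
p_corrected = 1 - (1 - ALPHA / QUERIES) ** QUERIES
print(f"family-wise rate with Bonferroni: {p_corrected:.3f}")  # ~0.049
```

Legitimate study discloses every avenue examined and corrects for them; the Lyin’tific Method reports only the winner.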

8. Prohibit Replication and Follow Up. Craft a study which is very difficult or impossible to replicate, does not offer any next steps nor serve to open follow-on questions (all legitimate study generates follow-on questions; yours should not), and most importantly, implies that the science is now therefore ‘settled’. Release the ‘data scientists’ back to their native career domains so that they cannot be easily questioned in the future. Intimidate organizations from continuing your work in any form, or from using the data you have assembled. Never find anything novel (other than a slight surprise over how unexpectedly good you found your product to be), as this might imply that you did not know the answers all along. Never base consensus upon deduction of alternatives, rather upon how many science communicators you can have back your message publicly. Make your data proprietary. View science details as an activity of relative privation, not any business of the public.

9. Extrapolate and Parrot/Conceal the Analysis. Publish wildly exaggerated & comprehensive claims to falsification of an entire array of ideas and precautionary diligence, extrapolated from your single questionable and inductive statistical method (panduction). Publish the study bearing a title which screams “High risk technology does not cause (a whole spectrum of maladies) whatsoever” – do not capitalize the title, as that will appear more journaly and sciencey and edgy and rebellious and reserved and professorial. Then repeat exactly this extraordinarily broad-scope and highly scientific syllogism twice in the study abstract, first in baseless declarative form and finally in shocked revelatory and conclusive form, as if there were some doubt about the outcome of the effort (ahem…). Never mind that simply repeating the title of the study twice, as constituting the entire abstract, is piss-poor protocol – no one will care. Denialists of such strong statements of science will find it very difficult to gain any voice thereafter. Task science journalists to craft 39 ‘research articles’ derived from your one-and-done study; deem that now 40 studies. Place the 40 ‘studies’, both pdf and charts (but not any data), behind a registration approval and a $40-per-study paywall. Do this over and over until you have achieved a number of studies and research articles which might fancifully be round-able up to ‘1,000’ (say 450 or so – see reason below). Declare Consensus.

10. Enlist Aid of SSkeptics and Science Communicators. Enlist the services of a public promotion-for-hire gang to push-infiltrate your study into society and media, to virtue signal about your agenda and to attack those (especially the careers of wayward scientists) who dissent. Have members make final declarative claims in one-liner form – “A thousand studies show that high risk technology does not cause anything!” – a claim which they could only make if someone had actually paid the $40,000 necessary to actually access the ‘thousand studies’. That way the general public cannot possibly be educated in any fashion sufficient to refute the blanket apothegm. This is important: make sure the gang is disconnected from your organization (no liability imparted from these exaggerated claims nor any inchoate suggested dark activities *wink wink), and moreover, that they are motivated by some social virtue cause such that they are stupid enough that you do not actually have to pay them.

The organizations who manage to pull this feat off have simultaneously claimed completed science in a single half-assed study, contended consensus, energized their sycophancy and exonerated themselves from future liability – all in one study. To the media, this might look like science. But to a life-long researcher, it is simply a big masquerade. It is pseudo-science at the least; and at its worst constitutes criminal felony and assault against humanity. It is malice and oppression, in legal terms (see Dewayne Johnson vs. Monsanto Company).

The discerning ethical skeptic bears this in mind and uses this understanding to discern the sincere from the poser, and real groundbreaking study from commonplace surreptitiously bad science.

epoché vanguards gnosis


How to MLA cite this blog post =>

The Ethical Skeptic, “The Lyin’tific Method: The Ten Commandments of Fake Science”, The Ethical Skeptic, WordPress, 3 Sep 2018; Web.

September 3, 2018 | Agenda Propaganda, Institutional Mandates, Social Disdain

Malice and Oppression in the Name of Skepticism and Science

The Dewayne Johnson versus Monsanto case did not simply provide precedent for pursuit of Monsanto over claims regarding harm caused by its products. It also established a court litmus regarding actions taken in the name of science which are generated from malice and which seek oppression upon a target populace or group of citizens.
Watch out fake skeptics – your targeting of citizens may well fit the court’s definition of malice, and your advocacy actions those of oppression – especially under a context of negligence and when posed falsely in the name of science.

If you are a frequent reader of The Ethical Skeptic, you may have witnessed me employ the terms ‘malice’ and ‘malevolence’ in terms of certain forms of scientific or political chicanery. Indeed, the first principles of ethical skepticism focus on the ability to discern a condition wherein one is broaching malice in the name of science – the two key questions of ethical skepticism:

  1. If I was wrong, would I even know it?
  2. If I was wrong, would I be contributing to harm?

These are the questions which a promoter of a technology must constantly ask, during and after the deployment of a risk bearing mechanism. When a company starts to run from these two questions, and further then employs science as a shield to proffer immunity from accountability, a whole new set of motivation conditions comes into play.

The litmus elements of malice and oppression, when exhibited by a ‘science’ promoting party, exist now inside the precedent established by the Court in the case of Dewayne Johnson vs. Monsanto : Superior Court of the State of California, for the County of San Francisco: Case No. CGC-16-550128, Dewayne Johnson, Plaintiff, v. Monsanto Company, Defendant (see Honorable Suzanne R. Bolanos; Verdict Form; web, below). Below I have digested from the Court Proceedings the critical questions which led to a verdict of negligence, as well as of malice and oppression performed in the name of science, on the part of Monsanto Company.

It should be noted that Dewayne Johnson v. Monsanto Company is not a stand-alone case in the least. The case establishes precedent in terms of those actions which are punishable in a legal context, on the part of corporations or agencies who promote risk-bearing technologies in the name of science – and more importantly in that process, target at-risk stakeholders who object, dissenting scientists and activists in the opposition. So let us be clear here: inside a context of negligence, the following constitutes malice and oppression:

1.  The appointing of inchoate agents, whose purpose is to publicly demean opponents and intimidate scientific dissent, by means of a variety of public forum accusations, including that of being ‘anti-science’.

Inchoate Action

/philosophy : pseudoscience : malice and oppression/ : a set of activity or a permissive argument which is enacted or proffered by a celebrity or power-wielding sskeptic, which prepares, implies, excuses or incites their sycophancy to commit acts of harm against those who have been identified as the enemy, anti-science, credulous or ‘deniers’. Usually crafted in such a fashion as to provide deniability of linkage to the celebrity or inchoate activating entity.

This includes skeptics, and groups appointed, commissioned or encouraged through inchoate action by the promoter, even if not paid for such activity.

2.  The publishing of scientific study, merely to promote or defend a negligent product or idea, or solely for the purpose of countermanding science disfavored by the promoter of a negligent product or idea.

All that has to be established is a context of negligence on the part of the promoter. This includes any form of failure to conduct follow-up study of a deployed technology inside which a mechanism of risk could possibly exist. So, let’s take a look at the structure of precedent in terms of negligence, malice and oppression established by the Court in this matter. The questions inside the verdict, from which this structure was derived, are listed thereafter in generic form.

Malice and Oppression in the Name of Science

/philosophy : the law : high crimes : oppression/ : malice which results in the oppression of a targeted segment of a population is measured inside three litmus elements. First, is the population at risk able to understand and make decisions with regard to the science, technology or any entailed mechanism of its risk? Second, has an interest group or groups crafted the process of science or science review and communication in an unethical fashion so as to steer its results and/or interpretation in a desired direction? Third, has a group sought to attack, unduly influence, intimidate or demean various members of society, media, government or the targeted group, as a means to enforce their science conclusions by other than appropriate scientific method and peer review?

I.  Has a group or groups targeted or placed a population at other than natural risk inside a scientific or technical matter

a. who bears a legitimate stakehold inside that matter

b. who can reasonably understand and make self-determinations inside the matter

c. whom the group(s) have contended to be illegitimate stakeholders, or as not meriting basic human rights or constitutionality with regard to the matter?

II.  Have these groups contracted for or conducted science methods, not as an incremental critical-path means of investigation, but rather only as a means to

a. promote a novel technology, product, service, condition or practice which it favors, and

b. negate an opposing study or body of research

c. exonerate the group from reasonable liability to warn or protect the stakeholders at risk

d. exonerate the group from the burden of precaution, skepticism or followup scientific study

e. cover for past scientific mistakes or disadvantageous results

f. damage the reputation of dissenting researchers

g. influence political and legislative decisions by timing or extrapolation of results

h. pose a charade of benefits or detriment in promotion/disparagement of a market play, product or service

i. establish a monopoly/monopsony or to put competition out of business?

III.  Have these groups enlisted officers, directors, or managing agents, outside astroturf, undue influence, layperson, enthusiast, professional organization or media entities to attack, intimidate and/or disparage

a. stakeholders who are placed at risk by the element in question

b. wayward legislative, executive or judicial members of government

c. dissenting scientists

d. stakeholders they have targeted or feel bear the greatest threat

e. neutral to challenging media outlets

f. the online and social media public?

The Ruling Precedent (Verdict)

The sequence of questions posed by the Court to the Jury in the trial of Dewayne Johnson vs. Monsanto (applied generically as litmus/precedent):


I.  Is the product or service set of a nature about which an ordinary consumer can form reasonable minimum safety expectations?

II.  Did the products or services in question fail to ensure the safety an ordinary consumer would have expected when used or misused in an intended or reasonably foreseeable way?

III.  Was the product design, formulation or deployment a contributor or principal contributing factor in causing harm?

IV.  Did the products or services bear potential risks that were known, or were knowable, in light of the scientific knowledge that was generally accepted in the scientific community at the time of their manufacture, distribution or sale?

V.  Did the products or services present a substantial danger to persons using or misusing them in an intended or reasonably foreseeable way?

VI.  Would ordinary citizen stakeholder users have recognized these potential risks?

VII.  Did the promoting agency or company fail to adequately warn either government or citizen stakeholders of the potential risks, or did they under-represent the level of risk entailed?

VIII.  Was this lack of sufficient warnings a substantial factor in causing harm?

IX.  Did the promoter know or should it reasonably have known that its products or services were dangerous or were likely to be dangerous when used or misused in a reasonably foreseeable manner?

X.  Did the promoter know or should it reasonably have known that users would not realize the danger?

XI.  Did the promoter fail to adequately warn of the danger or instruct on the safe use of products or services?

XII.  Could and would a reasonable manufacturer, distributor, or seller under the same or similar circumstances have warned of the danger or instructed on the safe use of the products or services?

XIII.  Was the promoter’s failure to warn a substantial factor in causing harm?

Malice and Oppression

XIV.  Did the promoter of the products or services act with malice or oppression towards at-risk stakeholders or critical scientists or opponents regarding this negligence or the risks themselves?

XV.  Was the conduct constituting malice or oppression committed, ratified, or authorized by one or more officers, directors, or managing agents of the promoter, acting on behalf of promoter?

epoché vanguards gnosis


How to MLA cite this blog post =>

The Ethical Skeptic, “Malice and Oppression in the Name of Skepticism and Science”, The Ethical Skeptic, WordPress, 28 Aug 2018; Web.


August 28, 2018 | Institutional Mandates, Tradecraft SSkepticism

It Does Not Take a Conspiracy

At some point ignorance must betray the lie which exploits it. Mass delusions are a natural outcome of a specific recipe of commonplace cultural norms. All that is required to deploy a large scale deception is a critical mass of ignorance and chronic angst, ignited by small repetitive prodding sourced from a position of authority. One does not have to conspire – rather only understand the malleable nature of duress inside complex social systems.
As it turns out, there is no need for a micro cause of this macro phenomenon of complexity.

In order to create an exothermic nuclear decay acceleration from gamma rays, fission materials, and fast and thermal neutrons, one requires several physical components to effect such a reaction: nuclear fuel along with a reactor core, neutron moderator, neutron poison (absorber), steady source of ignition neutrons, coolant, control rods and a reactor pressure vessel.1 Save for the mitigating features of a coolant, neutron poison, control rods and a neutron moderator – the process which foments the real social vulnerability which social skeptics falsely spin as ‘conspiracy theory’ is a natural outcome stemming from exploitation of several commonplace and naturally occurring social norms. It does not take a conspiracy after all, rather merely a pinch of chronically induced social anxiety, along with some gentle prodding in the right places, and in the right direction.

In 1990 a company called LA Gear introduced a footwear line into the high school aged buyer demographic, featuring a light emitting diode which flashed each time the footwear user stepped on the ground. Called ‘LA Gear Lights’, these sneakers propelled this little known company to over $1 billion in sales revenue in just two short years of product maturation. Every high school socialite in California, and then the broader US demographic, desired these symbols of approved conformity.2 In similar fashion (pardon the pun), Kevin Plank at Under Armour was able to build a powerhouse brand by exploiting the tribal psychology example of college and professional athletes, upon a population thirsting for social acceptance. It was a momentum of such magnitude that it challenged and surmounted the pinnacle of brand strength (apologies to Coca-Cola) in the consumer goods industry, Nike. Kevin had listened to a small consumer goods advisory firm which taught that value in product strengthened brand and pricing better than did a roll of the dice on style, and that creation of a margin-resilient value chain was paramount over mere purchase and operating cost minimization. Under Armour’s apparel cost them more to produce, message and deliver than did Nike’s, but they were also able to value their items at a higher price point than did Nike. They had solved a problem of tribal duress.3

Fashion science as it turns out is a very informative field of study, eliciting principles which are very useful to those seeking to exploit its elements to direct and control thought.

The essence of human interaction called a fad elicits a principle with regard to social vulnerability which bears dynamics similar to that of an unconstrained atomic pile (reactor core). In both of the case studies cited above, the momentum of personal statement and tribal example was a neutron ignition source into a pile of compressed and anxious young adults (the fuel), exploiting the kinetic energy of their desire to be accepted. Starlings in group flight do not have to exhibit a specific pattern desired; all they have to do is not exhibit the pattern which is forbidden. And in order to reduce their likelihood of exhibiting the embargoed hypothesis – all we have to do is keep them under constant angst. Even an image bearing truth can be quickly dissipated through chaos and duress.

This social vulnerability does not simply end at age 25 of course. It continues to ferment and mature into less obvious forms of controllability and fanaticism in the average adult member of society. Nazi Germany did not proliferate its message simply through means of the concerted effort of broadscale conspiracy, but rather through an exploiting of the common social norms fermenting in the aftermath of World War I. Germans struggled to understand their country’s uncertain future. Citizens faced poor economic conditions, skyrocketing unemployment, political instability, and profound social change. While downplaying more extreme goals, Adolf Hitler and just a few individuals inside the Nazi Party offered simple solutions to Germany’s problems, exploiting people’s fears and frustrations.4 There existed a common nutrient solution of duress upon the general population (see The Ten Pillars of Social Skepticism). A study published in June of 2017 elicits and supports the notion that populations under duress are vulnerable to being exploited by control-minded influences – highlighting that even our official authorized stories themselves may yet be the result of this vulnerability, more so than either an enormous effort of influencing or a prevailing realization of the truth inside a matter.

Evidence suggests that the aversive feelings that people experience when in crisis—fear, uncertainty, and the feeling of being out of control—stimulate a motivation to make sense of the situation, increasing the likelihood of perceiving conspiracies in social situations. We then explain that after being formed, conspiracy theories can become historical narratives that may spread through cultural transmission. We conclude that conspiracy theories originate particularly in crisis situations and may form the basis for how people subsequently remember and mentally represent a historical event.

~ Van Prooijen, Douglas; Sage – Memory Studies : “Conspiracy theories as part of history”5

Establishing Isolation and Chronic Duress is All that is Required

There is no micro cause of conspiracy. There is safety in the herd however. One does not have to conspire – rather only understand the malleable nature of social duress and establish separations between groups of people. The public does not only invent creative alternatives under chronic applications of such duress, but they are vulnerable to adopting an official version more easily as well, and spreading it predictably and habitually. This as much as anything, may be the reason behind why all our news is negatively charged. It allows for control and exploits human regularity and tribal habits. Consider for instance, the work of Associate Professor of Media Arts and Sciences at MIT and the director of the Collective Learning group at The MIT Media Lab, Cesar Hidalgo, and his associates.6

We find that in contrast with the random trajectories predicted by the prevailing Levy flight and random walk models, human trajectories show a high degree of temporal and spatial regularity, each individual being characterized by a time independent characteristic length scale and a significant probability to return to a few highly frequented locations. After correcting for differences in travel distances and the inherent anisotropy of each trajectory, the individual travel patterns collapse into a single spatial probability distribution, indicating that despite the diversity of their travel history, humans follow simple reproducible patterns.

~ Understanding individual human mobility patterns, Marta C. Gonzalez, Cesar A. Hidalgo, Albert-Laszlo Barabasi

Conspiracy theory accusation therefore, goes both ways. Both the dissenting minority and the conforming majority are vulnerable, and a conspiracy is not required at all in either case. Humans are creatures of habit, whether to their advantage or detriment – producing fixed patterns which become exothermic when placed under a negative vibrating energy.

  • fear of outsiders,
  • desire to regain power,
  • habit/history of religious-styled fervor,
  • emotional damage from traumatic past events,
  • overcompensation for secret doubts,
  • fear of the new and unknown,
  • cultural addiction to confrontation & denial,
  • emotional rush derived from control and deception,
  • cathartic joy of belittling those who are different,
  • the need to belong, and
  • the need to communicate and story-tell.

In this combination of factors, an interesting troop dynamic occurs in which humans naturally seek to reinforce, protect and promote a dogmatic message; and they will do so without much prodding. When taken inside a complex social system, this set of micro traits can serve to create disconnected, but reliable, social phenomena. It is not that we can ascribe a single micro event or influence to be the cause of conspiracy or conspiracy thinking itself. We have to understand the nature of complex social systems and how they can be manipulated towards specific ends – and it does not take a conspiracy itself in order to make this happen. It is a naturally exothermic process.

This combination of social factors causes a proliferation of dogmatic ignorance and compliance, which is similar in nature to an exothermic nuclear reaction. A principle called exoagnoia:


/philosophy : rhetoric : exploitation : fad : ignorance/ : conspiracy which is generated naturally through the accelerative interaction of several commonplace social factors. A critical mass of uninformed, misinformed, disinformed and/or compartmentalized population under chronic duress (the ignorance fuel), ignited by an input of repetitive authoritative propaganda (the ignition source). Such a phenomenon enacts falsehood through its own inertia/dynamic and does not necessarily require a continuous intervention on the part of an influencing group.

Critical Elements of a ‘Conspiracy’ (Fad)

  • a compressed and interactive population
  • a conformance compelling and persistent angst (the duress)
  • identification of the unacceptable (bad)
  • compartmentalized organizations whose apparatchiks do not fully understand the big picture
  • introduction of an easily observable ‘acceptability’ influence from a tribe or very small sliver of the population
  • social celebrity backing and praise for the influence
  • media sources who will craft ignoratio elenchi, ingens vanitatum and verum mendacium filled publications (see The Art of the Professional Lie)
  • silence about or disincentive towards considering any alternatives

There exist two flavors of this mechanism:

  1. Popular confirmation (promotion of the preferred idea)
  2. Popular inverse negation (condemnation of the full set of unsanctioned ideas)

That is all it takes, folks. As it turns out, it does not take a conspiracy after all – rather merely a gentle prodding in the right places, in the right direction, at the right time.

epoché vanguards gnosis

How to MLA cite this blog post =>
The Ethical Skeptic, “It Does Not Take a Conspiracy”, The Ethical Skeptic, WordPress, 30 March 2018; Web.

March 30, 2018 | Agenda Propaganda, Institutional Mandates
