Tag Archives: critical thinking

FLOATER: A Tool-Kit for Evaluating [Scientific] Claims

FLOATER toolkit [downloaded from http://thinkingispower.com/floater-a-tool-kit-for-evaluating-claims/]

Thanks to Raymond Nakamura’s November 16, 2021 tweet (his website is here), I found this rather nifty tool and the Thinking is Power website.

Before moving on to the toolkit, here’s a little about the website’s creator (from the About page). Note: A link has been removed,

Thinking Is Power (TIP) was created by science educator and communicator Melanie Trecek-King to provide accessible critical thinking content to the general public through entertaining stories, approachable language, and shareable graphics.

TIP’s content is based on a general-education science course Trecek-King developed that’s designed to help students understand the process of science and how to use critical thinking to make informed decisions. The course was the result of years of teaching introductory biology students to (mostly) memorize facts and realizing they were going to forget everything…except how much they hated science. Trecek-King’s other goal with TIP is therefore to encourage other science educators to reflect on how we teach science, and even what it means to be science literate. Is it memorizing and regurgitating facts? Or understanding how the process of science acquires knowledge and why it’s reliable? 

Since its launch in January of 2021, Thinking Is Power has rapidly become a go-to resource in the area of science communication and critical thinking. …

As for the toolkit, here’s some of what I found particularly interesting (from the FLOATER webpage). Note: Links have been removed,

As a science educator, my primary goals are to teach students the essential skills of science literacy and critical thinking. Helping them understand the process of science and how to draw reasonable conclusions from the available evidence can empower them to make better decisions and protect them from being fooled or harmed.

Yet while nearly all educators would agree that these skills are important, the stubborn persistence of pseudoscientific and irrational beliefs demonstrates that we have plenty of room for improvement. To help address this problem, I developed a general-education science course which, instead of teaching science as a collection of facts to memorize, teaches students how to evaluate the evidence for claims to determine how we know something and to recognize the characteristics of good science by evaluating bad science, pseudoscience, and science denial.

In my experience, science literacy and critical thinking skills are difficult to master. Therefore, it helps to provide students with a structured toolkit to systematically evaluate claims and allow for ample opportunities to practice. …

The foundation of FLOATER is skepticism. While skepticism has taken on a variety of connotations, from cynicism to denialism, scientific skepticism is simply insisting on evidence before accepting a claim, and proportioning the strength of our belief to the strength and quality of the evidence.  

Before using this guide, clearly identify the claim and define any potentially ambiguous terms. And remember, the person making the claim bears the burden of proof and must provide enough positive evidence to establish the claim’s truth. 

I’ve included one excerpt from the poster in the hope that it will encourage readers to visit the webpage and/or site for themselves (from the FLOATER webpage). Note: Links have been removed,

It seems counterintuitive, but the first step in determining if a claim is true is to try to determine if you can prove it wrong. 

Falsifiable claims can be proven false with evidence. If a claim is false, the evidence will disprove it. If it’s true, the evidence won’t be able to disprove it.

Scientific claims must be falsifiable. Indeed, the process of science involves trying to disprove falsifiable claims. If the claim withstands attempts at disproof, we are more justified in tentatively accepting it.

Unfalsifiable claims cannot be proven false with evidence. They could be true, but since there is no way to use evidence to test the claim, any “evidence” that appears to support the claim is useless. Unfalsifiable claims are essentially immune to evidence. 

There are four types of claims that are unfalsifiable.

1. Subjective claims: Claims based on personal preferences, opinions, values, ethics, morals, feelings, and judgements. 

For example, I may believe that cats make the best pets and that healthcare is a basic human right, but neither of these beliefs are falsifiable, no matter how many facts or pieces of evidence I use to justify them.

2. Supernatural claims: Claims that invoke entities such as gods and spirits, vague energies and forces, and magical human abilities such as psychic powers.

By definition, the supernatural is above and beyond what is natural and observable and therefore isn’t falsifiable. This doesn’t mean these claims are necessarily false (or true!), but that there is no way to collect evidence to test them.

For example, so-called “energy medicine,” such as reiki and acupuncture, is based on the claim that illnesses are caused by out-of-balance energy fields which can be adjusted to restore health. However, these energy fields cannot be detected and do not correspond to any known forms of energy.

There are, however, cases where supernatural claims can be falsifiable. First, if a psychic claims to be able to impact the natural world in some way, such as moving/bending objects or reading minds, we can test their abilities under controlled conditions. And second, claims of supernatural events that leave physical evidence can be tested. For example, young earth creationists claim that the Grand Canyon was formed during Noah’s flood approximately 4,000 years ago. A global flood would leave behind geological evidence, such as massive erosional features and deposits of sediment. Unsurprisingly, the lack of such evidence disproves this claim. However, even if the evidence pointed to a global flood only a few thousand years ago, we still couldn’t falsify the claim that a god was the cause.

3. Vague claims: Claims that are undefined, indefinite, or unclear.

Your horoscope for today says, “Today is a good day to dream. Avoid making any important decisions. The energy of the day might bring new people into your life.”

Because this horoscope uses ambiguous and vague terms, such as “dream,” “important”, and “might”, it doesn’t make any specific, measurable predictions. Even more, because it’s open to interpretation, you could convince yourself that it matches what happened to you during the day, especially if you spent the day searching for “evidence.”

Due to legal restrictions, many alternative medicine claims are purposefully vague. For example, a supplement bottle says it “strengthens the immune system,” or a chiropractic advertisement claims it “reduces fatigue.” While these sweeping claims are essentially meaningless because of their ambiguity, consumers often misinterpret them and wrongly conclude that the products are efficacious.

4. Ad hoc excuses: These entail rationalizing and making excuses to explain away observations that might disprove the claim. 

While the three types of claims described thus far are inherently unfalsifiable, sometimes we protect false beliefs by finding ways to make them unfalsifiable. We do this by making excuses, moving the goalposts, discounting sources or denying evidence, or proclaiming that it’s our “opinion.”

For example, a psychic may dismiss an inaccurate reading by proclaiming her energy levels were low. Or, an acupuncturist might excuse an ineffective treatment by claiming the needles weren’t placed properly along the patient’s meridians. Conspiracy theorists are masters at immunizing their beliefs against falsification by claiming that supportive evidence was covered up and that contradictory evidence was planted.

The rule of falsifiability essentially boils down to this: Evidence matters. And never assume a claim is true because it can’t be proven wrong. 

Interesting, eh? There are another six to investigate on the FLOATER webpage.

One last thing: there’s also “How to Read the News Like a Scientist; Overwhelmed by your news feed? Use tools from science to evaluate what’s true and what’s fake,” a March 22, 2019 blog posting by Daniella Balarezo and Daryl Chen for TED Ideas (made available by Pocket), which features tips from researcher Emma Frans.

Café Scientifique Vancouver (Canada) talk on July 31, 2018: Test Tubes to Teaching: How Anti-Vaxxers and a Global Financial Crisis Shaped my Career

I received (via email) this Café Scientifique July 15, 2018 notice,

Our next café will happen on TUESDAY, JULY 31ST at 7:30PM in the back
room at YAGGER’S DOWNTOWN (433 W Pender). Our speaker for the
evening will be DR. VAN HOUTEN of the FACULTY OF
SCIENCES AT SFU. Her topic will be:

TEST TUBES TO TEACHING: HOW ANTI-VAXXERS AND A GLOBAL FINANCIAL
CRISIS SHAPED MY CAREER

Part research talk, and part memoir, Dr. van Houten will describe her
career progression from vaccine design scientist to education
researcher. From early childhood, Dr. van Houten developed an
unrelenting interest in human biology and infectious diseases and made
it her goal to become a scientist. Her passion for vaccines came about,
in part, due to the publicity surrounding the infamous retracted paper
in _The Lancet_ that erroneously connected measles vaccination with
autism. Her Ph.D. and postdoctoral research focused on how vaccines
work, and she engineered anti-viral vaccines to produce focused antibody
responses. However, her plan of working in the pharmaceutical industry
was sidelined by the financial crash of 2008, and she was offered a full
time teaching faculty position. This created an opportunity to study how
students think critically about science and apply those findings to
train students to recognize bad science such as that promoted by
anti-vaxxers and other garbage “science” that pervades our society.

We hope to see you there!

I wasn’t able to find out much more about Dr. van Houten and her work but her SFU profile page is here.

Cookie cutters; agility vs. rigidity; 2010 Canadian Science Policy Conference; Kate Pullinger GG 2009 award winner for fiction

Ever wonder about all that talk about critical thinking? Supposedly that’s what education does for you, i.e. encourages critical thinking. I mention it because there’s a great little essay on The Black Hole blog about critical thinking in higher education. It’s called, Science is like Baking: The Rise of the Cookie Cutter PhD. I did have one minor quibble,

Together, these forces do what I think we should be very very scared of… they apply pressure to churn out PhDs faster, with more papers, with less flexibility in ideas and more rigid (read publishable) research project designs. So, in the end, little effort goes into helping the PhD students think critically about their field – and while I don’t believe this style of training is as far gone in the Humanities… I think it’s coming, so get yourself ready!

Sadly, I believe that the process is already gaining momentum in the humanities.

Rob Annan at Don’t Leave Canada Behind has a very pointed (scathing) analysis of a pre-budget submission from the SSHRC/NSERC/CIHR tri-council to the House of Commons Standing Committee.  [SSHRC = Social Sciences and Humanities Research Council; NSERC = Natural Sciences and Engineering Research Council; CIHR = Canadian Institutes of Health Research] From his posting,

… What does this mean? Sounds to me like stable, long-term funding is to be sacrificed at the altar of increased flexibility. And what exactly is a “dynamic approach” to funding research? This bureaucratic nonsense speak could have real consequences for researchers. Does agility, dynamism, and responsiveness mean that the agencies will be rapidly changing funding priorities from year to year? Will the agencies just start chasing the hottest trends?

Annan’s concern about “agility, dynamism and responsiveness” as a funding agency priority would seem to contradict The Black Hole essayist’s concern “with more papers, with less flexibility in ideas and more rigid (read publishable) research project designs.”

In fact, we could end up with a situation where both apply. Imagine this. (1) A researcher applies for funding in a ‘trendy’ area of research, thereby fulfilling the funding agency’s dynamic, responsive funding requirement. (2) The researcher’s or PhD student’s academic institution or employer constrains the researcher to pump out multiple papers from a rigid research design, all under the rubric of being responsive and agile.

Frankly, I’d like to see a little more agility and dynamism but I’d like to see it applied effectively. Sadly, I believe that my little scenario is more likely than not. The funding agencies are scrambling for money and, with the best of intentions, will do what it takes to get more so they can fulfill their mandate of supporting research. Meanwhile, the academic institutions will pay lip service to agility and dynamism while they apply the principles of rigidity and conformity used in production lines to pump out more product (publishable papers, awards, etc.) so they can maintain themselves and provide education (their raison d’être).

On other notes: there is a 2010 Public Science in Canada | Strengthening Science and Policy to Protect Canadians conference coming up in May. The keynote speakers are Stephen Lewis in an as yet untitled talk and [David] Suzuki and [Preston] Manning on Science: A Public Dialogue.  (Is there a Canadian science conference or science event where Preston Manning isn’t giving a keynote address?) More details can be found here.

On a personal note, congratulations to the Governor General’s latest fiction award winner, Kate Pullinger, for The Mistress of Nothing. She was one of the leaders and teachers in my master’s programme (Creative Writing and New Media) at De Montfort University in the UK. I’m grateful that I had a chance to study in the programme (which was canceled after its 3rd year). I was able to experiment with creative writing techniques and science writing and that was a privilege.