The value of skepticism
Although medical safety protocols generally work well at protecting the public, skeptics should still be given a fair hearing, as occasionally their suspicions prove correct.
“So I got the Moderna, and people were like, So, you trust the government? No, I don’t trust the government. You trust big pharma? No, I don’t trust big pharma. I trusted my doctor. The guy who saved my life… So I tweeted about [my severe adverse reactions] and people started to come at me calling me antivax. No, I got the vax. When people have a reaction to an experimental vaccine that’s not FDA approved, you’re not supposed to suppress the reactions. You’re supposed to ask people what the reactions are so we can make the vaccines better.” — Left-wing political commentator Jimmy Dore speaking with Joe Rogan on July 28, 2021
When my dad was a teenager in the late 1940s, he had a bad case of acne. But there was a new medical treatment that provided relief for teens who suffered the social embarrassment of acne-ridden faces.
Radiation treatment.
At the time, the risks of radiation weren’t well known, or they were generally brushed aside. It was a belief in science and technology that gave doctors and patients alike great comfort in getting nuked for minor maladies.
My dad’s acne was cured, but about ten years later his teeth started to fall out, and he wore dentures for the rest of his life. He also had to get lesions removed from his face a few times in his 40s and 50s. He got off lucky. Many recipients of dermatological radiation treatments developed thyroid cancer.
Even in the late 1960s, some physicians dismissed concerns over the treatments as “scare propaganda,” even as the medical profession was largely abandoning the practice.
It was only when radiation’s dangers in medical use were recognized that advances were made to ensure it could be used safely for treatment and diagnosis. Its use wasn’t banned. It was improved. This is why we have safe x-ray technology and low-risk cancer radiation treatments.
Still, there was a long in-between period in which a 25-year-old man saying that he was losing his teeth because of radiation treatment ten years earlier would have been dismissed as “scare mongering.” At the time, physicians would have said that no cause-and-effect connection could be established.
Medical science has improved a great deal since the middle of the last century. The research and development profession is much more cognizant of the potential dangers that can come with medical breakthroughs, and goes to greater lengths to identify and mitigate harms. Testing and evaluation processes are rigorous and overseen by regulators.
This is why we have never again seen a drug as nasty as thalidomide since it was pulled from the marketplace in 1961. Still, it’s worth understanding how long it took for its effects to be recognized.
Thalidomide was approved in Europe and Canada in 1957 for anxiety and sleep troubles. It was considered safe for pregnant women, but it turned out to interfere with proper fetal development in the womb, and women were giving birth to babies with severely deformed limbs, 40% of whom died at birth.
The malformations were tied to thalidomide and its use ceased in 1961. Before that, a reviewer at the US Food and Drug Administration named Frances Oldham Kelsey had been holding up thalidomide’s approval because she had concerns about its lack of safety studies.
In other words, Kelsey was a skeptic. She had no evidence that thalidomide was harmful. She made no predictions of its long-term adverse effects. She only had the safety studies conducted by the company to go by, and in her judgement, they weren’t enough. She was virtually alone among the world’s drug regulators in resisting pressure to approve thalidomide. It took four years of slowly accumulating evidence to prove her concerns correct.
Before that, imagine mothers and physicians in 1959 or 1960 noticing a pattern and blaming thalidomide for these birth defects. The response would have been: “This drug has been used safely for three years and has been approved by over 20 countries worldwide. There is no evidence to support these claims. Malformed children are born every year and no ties to thalidomide can be established.”
Until they were. Again, it took four years.
The thalidomide episode led to modern-day drug-approval regulations that have helped ensure the safety of the medicines we take. Overall, pharmaceutical screwups and errors are far less common and less drastic. Still, they do occur, and it takes time for the safety problems to become noticed and acknowledged.
One famous example is Vioxx, an arthritis drug that was approved in 1999 and withdrawn in 2004 over increased risks of strokes and heart attacks. There are many other, lesser-known instances of bad drugs pulled from the market, which nonetheless demonstrate that it can take anywhere from two to fifteen years for their dangers to be noticed.
Do these screw-ups happen with vaccines? Sometimes. The most well-known is an old case, and as with thalidomide, it led to better safety standards for vaccine development. In 1955, some batches of a polio vaccine contained live virus despite passing safety testing. The result was about 250 cases of polio, many of them leaving recipients paralyzed. It came to be known as the Cutter incident, named after the laboratory responsible for the mistake.
Like thalidomide and early radiation therapy, the Cutter incident was a “first and worst” scenario, and we haven’t seen deadly vaccine mishaps on that scale since. Still, there have been some cases of modern-day vaccines being pulled from the market over safety concerns that emerged only after several years of administration.
One example is the diphtheria-tetanus-pertussis vaccine, which was associated with a five-fold increase in child mortality when administered in African countries. (This was discussed in a fascinating 2018 TED Talk by Christine Stabell Benn; the link includes a written transcript. Stabell Benn describes how some vaccines help the body fight non-targeted diseases in unexpected ways, while other vaccines have shown unintended harms.)
Another example is an early measles-mumps-rubella (MMR) vaccine that was taken off the market in 1992, four years after its introduction, when data gradually revealed that it caused aseptic meningitis and convulsions.
A later MMR vaccine sparked the “anti-vax” movement as we know it today, but those anti-vaxxers were responding to a major scientific screw-up. A 1998 study published in The Lancet medical journal erroneously connected that vaccine with autism. The study was shown to be riddled with flawed research methods and possible fraud, and the fact that it was even published was considered a major failure of the esteemed journal’s peer review process. Even so, it took 12 years (!) for The Lancet to retract the study. While this itself is not a vaccine screw-up, I mention it so that the above-mentioned MMR case is not confused with this debunked study, and to show that medical research itself makes mistakes in the noble process of discovery.
Then there’s LYMErix, a Lyme disease vaccine that federal regulators approved, but with reservations about possible autoimmune disorders appearing “down the road.” Although autoimmune reactions never materialized, LYMErix was suspected of causing arthritis and chronic fatigue syndrome in 121 recipients, who filed a class action lawsuit more than a year after the vaccine was approved. The manufacturer settled the suit out of court and pulled the vaccine from the market.
Because regulators excluded LYMErix from the recommended vaccine schedule, patients harmed by the vaccine could sue the manufacturer rather than file a claim through the National Vaccine Injury Compensation Program.
Similar programs exist in Canada, the EU and elsewhere. Permanent injuries and other harms caused by vaccines listed on the schedule are covered by a government body. Because of this, regular but rare cases of vaccine injuries are settled quietly and do not become subject to the media spotlight in the way a court case does — the very reason LYMErix received such attention. Had it been on the recommended vaccine schedule, it would likely still be on the market.
Was this a good or bad thing? Was it good that the “dangers” of LYMErix were widely publicized to the point where production had to be stopped? Or is it to be lamented that the world does not have a vaccine for Lyme disease, given that all scheduled vaccines have similar risk profiles and aren’t subject to the same kind of attention?
Which all leads to the topic of Covid-19 vaccines.
As with any other vaccine, it’s expected that Covid vaccines carry some degree of risk to a small percentage of recipients.
On the other hand, these vaccines use new technology and work through methods that are called “novel” for a reason. This is especially true of the mRNA vaccines (Pfizer and Moderna). The viral-vector technology used in the AstraZeneca and Johnson & Johnson vaccines had previously been used only in Ebola vaccines, starting in 2018, and was thus administered to extremely small populations. It has falsely been called a “traditional” vaccine by journalists who didn’t do even rudimentary research.
The risks of blood clotting reported with the viral-vector vaccines were not discovered in the Phase 3 trials or in their use against Ebola. It was not until the vaccines were put into vast, widespread use under emergency use authorization that the blood-clotting problems came to light. The same is true for the risks of myocarditis that were discovered after mass administration of the mRNA vaccines.
Given that it usually takes well over a year to discover adverse effects caused by vaccines, it’s perfectly reasonable for anyone to question what other unpleasant discoveries might be coming down the road in relation to the Covid vaccines. Are the blood clots and heart inflammation just the tip of the iceberg, or the last of the potential harms to be discovered?
Raising such questions, within the general public and within research communities, is a normal part of the scientific process.
To give just one example: Early in the development of the mRNA Covid vaccines, an Israeli infectious disease specialist named Tal Brosh told the Jerusalem Post that there “are unique and unknown risks to messenger RNA vaccines, including local and systemic inflammatory responses that could lead to autoimmune conditions.” The article went on to quote another specialist who “believes there is no cause for concern.” Overall, the article was demonstrating the process of scientific inquiry. Scientists raise concerns and might differ in outlook, knowing that they can make individual predictions but are not all-knowing soothsayers.
That article was from November 2020, before it was deemed essential to suppress such discussion about the new Covid vaccines. I also note that this piece came from a newspaper in Israel, where reporting of Covid issues has been a bit more balanced given that the vaccines have less political baggage in that region, and some room has been allowed for non-conformist reporting on Covid matters.
During the pandemic, many people who did their own vaccine research online were mocked and slandered. I understand where this kind of sentiment comes from, but it’s really disingenuous. Journalists, scientists, academics, and even high school students find the information they need on the internet. Given that everything from recipes to the world’s leading academic journals is found online, there should be nothing disturbing about people engaging their curiosity through internet resources. We all do it every day.
It’s likely that you have even done your own medical research online. If, say, a physician wants to prescribe a certain antidepressant or arthritis drug, the patient is likely to look it up online and consider its listed side effects and the experiences of other users of the medication. Physicians generally encourage this type of patient involvement. Many years ago, when I picked up a tropical intestinal infection that resisted two antibiotics, I spent some time online googling through medical journals and found one combination therapy that looked promising for my situation. I printed the article and took it to my doctor, who decided that the two medications were worth a try. They worked.
The other consideration is that the internet’s promise 25 years ago was that it would liberate information and enlighten humanity. As a broadcast production assistant, magazine writer, textbook editor, medical case manager, masters student and all-around civilian, I have relied on the internet in virtually every realm of my life. Without it, I likely would be working in government administration or perhaps managing a grocery store. Doing “my own research online” has allowed me to flourish in a variety of occupations I wouldn’t have imagined possible when I was 20. I don’t begrudge anyone who takes advantage of this incredible resource.
I understand that much of the backlash against “doing your own research online” is related to conspiracy theorists and quacks whose internet information dumps are a combination of misinterpreted conclusions to good research, references to bad or fraudulent scientific papers, and widely spread conjecture. This type of medical misinformation was never unique to the internet. The “alternative health” subculture existed well before the internet, and could be found in mainstream bookstores and DIY magazines, leading people to “crystal therapies” and holistic cancer “cures” that did more harm than good. I can specifically recall how shark cartilage was peddled in health food stores, based on misleading claims in a bestselling book. Is this type of thing more prevalent now with the internet? Probably not.
Let’s just take Covid vaccination as a baseline example. In Canada, at the time of writing, 92% of the eligible population has voluntarily taken a Covid vaccine. Among the 8% who have resisted the vaccine, there is strong evidence of PhDs, nurses, doctors, and pilots — educated people who are not the type to be led astray by conspiracy theories involving the Illuminati. Assuming such people make up half of the non-vaccinated, that leaves about 4% of the eligible population refusing vaccination for crackpot reasons. Even at that, the people we consider “crackpots” have useful roles in society. The “freaks and weirdos” of society think in such different ways that they enlighten our world with great art, music and films — and even in the sciences. One such crackpot invented the PCR technology we now use in Covid testing, and another somewhat unorthodox individual who has fueled some skepticism of mRNA vaccines (although he did take the Moderna shot) in fact played a primary role in the invention of mRNA technology. So we’ve come full circle. Without the 4%, we’d also be without the inventions we now rely on to diagnose and treat Covid.
So there are multiple reasons why we should cut a lot of slack toward those who would not choose the same course we would in relation to the Covid vaccines. These are not standard flu, hepatitis, measles, meningitis, or yellow fever vaccines that have been in widespread use for several years and have a known safety record. If people want to be skeptical, they have that right.
The other variable that complicates matters is the socio-politicization of the vaccines. In the rush to get as many people vaccinated as possible, it has become nearly forbidden to discuss adverse events caused by the vaccines. Even if these events are not common, and even if you accept the argument that people recover from these events and would fare worse if they caught Covid, it is regardless scientifically important that we allow discussion in order to create the drive for better vaccines.
When debate is stifled, people reasonably wonder: if other harmful trends were to emerge, what information would be kept from the public? A case in point is the quote introducing this essay. The situation Jimmy Dore describes has been a common one across the media, whether it’s the mainstream news, YouTubers or Twitterers. If you question government Covid policy, raise doubts about the vaccines, or even take a vaccine and discuss a negative experience, you’re smeared as an antivaxxer or a Covid denier.
A vivid example of this is what happened to Eric Clapton. He took both shots of the AstraZeneca vaccine. He went on record to say that after the first shot, he had “severe reactions which lasted ten days,” and after the second shot said: “My hands and feet were either frozen, numb or burning, and pretty much useless for two weeks, I feared I would never play again — I suffer with peripheral neuropathy and should never have gone near the needle — but the propaganda said the vaccine was safe for everyone…”
There is no evidence that Clapton was lying about his experience. Taken in balance with the celebrities who have spoken positively about getting their shots, this story could have been ignored or passed off as a rare experience. Instead, when put together with Clapton’s prior criticisms of lockdown policies — which weren’t exactly Holocaust denial — the media went full force in calling Clapton an antivaxxer, going as far as to dig up 40-year-old comments he made on stage in the midst of a severe drug and alcohol addiction. They were racist comments to be sure, but comments for which he profusely apologized years later. “It was shocking and unforgivable and I was so ashamed of who I was,” Clapton said, as part of a longer statement taking responsibility for his behavior. The public and the media treated his remorse as sincere and moved on.
Until he discussed his vaccine side effects. Then he was smeared as a racist antivaxxer. The comments were reprinted as if he stood by them to this day, with no mention of his remorse and recovery from addiction. It was all across the media, in Rolling Stone, People, the LA Times, all the major TV networks — small and large outlets across North America and Europe, trending all over Twitter and Facebook. What effect does this and other examples have on the general public? Some people might be experiencing severe adverse effects and not discussing them with friends out of fear of stigmatization. Jimmy Dore, in the clip linked at the top of this article, mentions that he talked with doctors and nurses who won’t go on the record with their vaccine stories because their careers would be at stake. Others just might go into denial about connections between their vaccination and strange illnesses they experience.
It is in this way I relate most to the “Covid antivaxxers” and the vax hesitant. There is a deliberate campaign to suppress discussion and questioning of the vaccines, even when legitimate concerns are present. Even children know when they are being lied to and manipulated, so it shouldn’t be surprising when a certain segment of society resents character assassination being used to shut down their concerns. All it does is raise suspicions — If these vaccines are truly safe, then why lie about and smear the people who truthfully talk about their adverse reactions? I’m not even suggesting that this is proof that the vaccines are abnormally unsafe, but if you’re on the side of promoting the safety of these vaccines, then you shouldn’t be afraid of letting the truth win out; there’s no reason to put a thumb on the scale and give people a reason for doubt.
And if it turns out the skeptics are right, then perhaps it’s in humanity’s benefit to let their stories be heard.