
The Conversation

The ‘average’ revolutionized scientific research, but overreliance on it has led to discrimination and injury


The average can tell you a lot about a dataset, but not everything.
marekuliasz/iStock via Getty Images Plus

Zachary del Rosario, Olin College of Engineering

When analyzing a set of data, one of the first steps many people take is to compute an average. You might compare your height against the average height of people where you live, or brag about your favorite player's batting average. But while the average can help you study a dataset, it has important limitations.

Uses of the average that ignore these limitations have led to serious issues, such as discrimination, injury and even life-threatening accidents.

For example, the U.S. Air Force used to design its planes for “the average man,” but abandoned the practice when pilots couldn't control their aircraft. The average has many uses, but it doesn't tell you anything about the variability in a dataset.

I am a discipline-specific education researcher, meaning I study how people learn, with a focus on engineering. My research includes studying how engineers use averages in their work.


Using the average to summarize data

The average has been around for a long time, with its use documented as early as the ninth or eighth century BCE. In an early instance, the Greek poet Homer estimated the number of soldiers on ships by taking an average.

Early astronomers wanted to predict the future locations of the stars. But to make these predictions, they first needed accurate measurements of the stars' current positions. Multiple astronomers would take position measurements independently, but they often arrived at different values. Since a star has just one true position, these discrepancies were a problem.

Galileo in 1632 was the first to push for a systematic approach to address these measurement differences. His analysis was the beginning of error theory. Error theory helps scientists reduce uncertainty in their measurements.

Error theory and the average

Under error theory, researchers interpret a set of measurements as falling around a true value that is corrupted by error. In astronomy, a star has a true location, but early astronomers may have had unsteady hands, blurry telescope images and bad weather – all sources of error.


To deal with error, researchers often assume that measurements are unbiased. In statistics, this means they evenly distribute around a central value. Unbiased measurements still have error, but they can be combined to better estimate the true value.

Say three scientists have each taken three measurements. Viewed separately, their measurements may seem random, but when unbiased measurements are put together, they evenly distribute around a middle value: the average.

When measurements are unbiased, the average will tend to sit in the middle of all measurements. In fact, we can show mathematically that the average is the value closest to all of the measurements, in the sense that it minimizes the total squared distance to them. For this reason, the average is an excellent tool for dealing with measurement errors.
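The claim that the average sits closest to the measurements can be checked numerically. The sketch below, in Python with made-up measurement values, compares the total squared distance from the data to the mean against a grid of other candidate centers:

```python
# Illustrative measurements (hypothetical values) of a single true quantity.
measurements = [9.8, 10.1, 10.4, 9.9, 10.3]

mean = sum(measurements) / len(measurements)

def total_squared_distance(center):
    """Sum of squared distances from each measurement to a candidate center."""
    return sum((x - center) ** 2 for x in measurements)

# Check the mean against a coarse grid of candidate centers around it.
candidates = [mean + step / 100 for step in range(-200, 201)]
best = min(candidates, key=total_squared_distance)
print(best == mean)  # True: no candidate beats the mean itself
```

Any center other than the mean produces a larger total squared distance, which is the usual mathematical sense in which the average is "closest" to the data.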

Statistical thinking

Error theory was, in its time, considered revolutionary. Other scientists admired the precision of astronomy and sought to bring the same approach to their disciplines. The 19th century scientist Adolphe Quetelet applied ideas from error theory to study humans and introduced the idea of taking averages of human heights and weights.


The average helps make comparisons across groups. For instance, taking averages from a dataset of male and female heights can show that the males in the dataset are taller – on average – than the females. However, the average does not tell us everything. In the same dataset, we could likely find individual females who are taller than individual males.

So, you can't consider only the average. You should also consider the spread of values by thinking statistically. Statistical thinking is defined as thinking carefully about variation – or the tendency of measured values to be different.
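The difference between comparing averages and comparing individuals is easy to see in code. This Python sketch uses small made-up height samples; the numbers are illustrative, not real survey data:

```python
from statistics import mean, stdev

# Hypothetical height samples in centimeters, for illustration only.
male_heights = [168, 172, 175, 180, 185, 191]
female_heights = [155, 160, 163, 167, 171, 178]

# The group averages differ...
print(mean(male_heights) > mean(female_heights))  # True

# ...but individuals overlap: the tallest female in this sample
# is taller than the shortest male.
print(max(female_heights) > min(male_heights))    # True

# The spread (standard deviation) is the part the average hides.
print(stdev(male_heights), stdev(female_heights))
```

Reporting the spread alongside the average is one simple way to practice the statistical thinking described above.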

Different astronomers measuring the same star and recording different positions is one example of variation. The astronomers had to think carefully about where their variation came from. Since a star has one true position, they could safely assume their variation was due to error.

Taking the average of measurements makes sense when variation comes from sources of error. But researchers have to be careful when interpreting the average when there is real variation. For instance, in the height example, individual females can be taller than individual males, even if males are taller on average. Focusing on the average alone neglects variation, which has caused serious issues.


Quetelet did not just take the practice of computing averages from error theory. He also took the assumption of a single true value. He elevated an ideal of “the average man” and suggested that human variability was fundamentally error – that is, not ideal. To Quetelet, there's something wrong with you if you're not exactly average height.

Researchers who study social norms note that Quetelet's ideas about “the average man” contributed to the modern meaning of the word “normal” – normal height, as well as normal behavior.

These ideas have been used by some, such as early statisticians, to divide populations in two: people who are in some way superior and those who are inferior.

For instance, the eugenics movement – a despicable effort to prevent “inferior” people from reproducing – traces its thinking to these ideas about “normal” people.


While Quetelet's idea of variation as error supports practices of discrimination, Quetelet-like uses of the average also have direct connections to modern engineering failures.

Failures of the average

In the 1950s, the U.S. Air Force designed its aircraft for “the average man.” It assumed that a plane designed for the average height, the average arm length and the average of several other key dimensions would work for most pilots.

This contributed to as many as 17 pilots crashing in a single day. While “the average man” could operate the aircraft perfectly, real variation got in the way. A shorter pilot would have trouble seeing, while a pilot with longer arms and legs would have to squish themselves to fit.

While the Air Force assumed most of its pilots would be close to average along all key dimensions, it found that out of 4,063 pilots, zero were average.


The Air Force solved the problem by designing for variation – it designed adjustable seats to account for the real variation among pilots.

While adjustable seats might seem obvious now, this “average man” thinking still causes problems today. In the U.S., women experience about 50% higher odds of severe injury in automobile accidents than men do.

The Government Accountability Office blames this disparity on crash-test practices, where female passengers are crudely represented using a scaled version of a male dummy, much like the Air Force's “average man.” The first female crash-test dummy was introduced in 2022 and has yet to be adopted in the U.S.

The average is useful, but it has limitations. For estimating true values or making comparisons across groups, the average is powerful. However, for individuals who exhibit real variability, the average simply doesn't mean that much.

The Conversation

Zachary del Rosario, Assistant Professor of Engineering, Olin College of Engineering


This article is republished from The Conversation under a Creative Commons license. Read the original article.



How to tell if a conspiracy theory is probably false


theconversation.com – H. Colleen Sinclair, Associate Research Professor of Social Psychology, Louisiana State University – 2024-05-07 07:33:01

Conspiracy theories can muddle people's thinking.

Natalie_/iStock / Getty Images Plus

H. Colleen Sinclair, Louisiana State University

Conspiracy theories are everywhere, and they can involve just about anything.


People believe false conspiracy theories for a wide range of reasons – including the fact that there are real conspiracies, like efforts by the Sackler family to profit by concealing the addictiveness of oxycontin at the cost of countless American lives.

The extreme consequences of unfounded conspiratorial beliefs could be seen on the staircases of the U.S. Capitol on Jan. 6, 2021, and in the self-immolation of a protestor outside the courthouse holding the latest Trump trial.

But if hidden forces really are at work in the world, how is someone to know what's really going on?

That's where my research comes in; I'm a social psychologist who studies misleading narratives. Here are some ways to vet a claim you've seen or heard.


An overview of a maze of passages between shrubs.

Sometimes there's nothing but the maze itself.

oversnap/E+ via Getty Images

Step 1: Seek out the evidence

Real conspiracies have been confirmed because there was evidence. Take, for instance, the allegations dating back to the 1990s that tobacco companies knew cigarettes were dangerous and kept that information secret to make money. Scientific studies showed problematic links between tobacco and cancer. Court cases unearthed corporate documents with internal memos showing what executives knew and when. Investigative journalists revealed efforts to hide that information. Doctors explained the effects on their patients. Internal whistleblowers sounded the alarm.

But unfounded conspiracy theories reveal their lack of evidence and substitute instead several elements that should be red flags for skeptics:

  • Dismissing traditional sources of evidence, claiming they are in on the plot.

  • Claiming that missing information is because someone is hiding it, even though it's common that not all facts are known completely for some time after an event.

  • Attacking apparent inconsistencies as evidence of lies.

  • Overinterpreting ambiguity as evidence: A flying object may be unidentified – but that's different from identifying it as an alien spaceship.

  • Using anecdotes – especially vaguely attributed ones – in place of evidence, such as “people are saying” such-and-such or “my cousin's friend experienced” something.

  • Attributing knowledge to secret messages that only a select few can grasp – rather than evidence that's plain and clear to all.

Step 2: Test the allegation

Often, a conspiracy theorist presents only evidence that confirms their idea. Rarely do they put their idea to the tests of logic, reasoning and critical thinking.


While they may say they do research, they typically do not apply the scientific method. Specifically, they don't actually try to prove themselves wrong.

So a skeptic can follow the method scientists use when they do research: Think about what evidence would contradict the explanation – and then go looking for that evidence.

Sometimes that effort will yield confirmation that the explanation is correct. And sometimes not. Like a scientist, ask yourself: What would it take for you to believe your perception was wrong?

A hand holds a magnifying glass over one silhouetted figure, which is connected in a diagram to other figures.

Look closely at allegations of massive conspiracies.

Boris Zhitkov/Moment via Getty Images


Step 3: Watch out for tangled webs

When theories claim large groups of people are perpetrating wide-ranging activities over a long period of time, that's another red flag.

Confirmed conspiracies typically involve small, isolated groups, like the top echelon of a company or a single terrorist cell. Even the alliance among tobacco companies to hide their products' danger was confined to those at the top, who made decisions and enlisted paid scientists and ad agencies to spread their messages.

False conspiracies tend to implicate wide swaths of people, such as world governments, mainstream media outlets, the global scientific community, the Hollywood entertainment industry and interconnected government agencies.

The online manifesto of Max Azzarello – the man who self-immolated on the steps of a New York courthouse in April 2024 – railed against a conspiracy allegedly including every president since Bill Clinton, sex offender Jeffrey Epstein and even the writers of “The Simpsons.”


Remember that the more people who supposedly know a secret, the harder it is to keep.

Step 4: Look for a motive

Confirmed conspiracies tell stories about why a group of people acted as they did and what they hoped to gain. Dubious conspiracies involve a lot of accusations or just questions without examining what real benefit the conspiracy nets the conspirators, especially when factoring in the costs.

For instance, what purpose would NASA have to lie about the existence of Finland?

Be particularly suspicious when conspiracies allege an “agenda” being perpetrated by an entire sociodemographic, which is often a marginalized group, such as a “gay agenda” or “Muslim agenda.”


Also look to see whether those spreading the conspiracy theories have something to gain. For example, scholarly research has identified the 12 people who are the primary sources of false claims about vaccinations. The researchers also found that those people profit from making those claims.

Step 5: Seek the source of the allegations

If you can't figure out who is at the root of a conspiracy allegation and thus how they came to know what they claim, that is another red flag. Some people say they have to remain anonymous because the conspiracists will take revenge for revealing information. But even so, a conspiracy can usually be tracked back to its source – maybe a social media account, even an anonymous one.

Over time, anonymous sources either come forward or are revealed. For instance, years after the Watergate scandal took down Richard Nixon's presidency, a key inside source known as “Deep Throat” was revealed to be Mark Felt, who had been a high-level FBI official in the early 1970s.

Even the notorious “Q” at the heart of the QAnon conspiracy cult has been identified, and not by government investigators chasing leaks of national secrets. Surprise! Q is not the high-level official some people believed.


Reliable sources are transparent.

A view of a person holding a flashlight standing in a dark field while a circular shape hovers overhead, beaming a light down.

This didn't happen.

David Wall/Moment via Getty Images

Step 6: Beware the supernatural

Some conspiracy theories – though none that have been proven – involve paranormal, alien, demonic or other supernatural forces. People alive in the 1980s and 1990s might remember the public fear that satanic cults were abusing and sacrificing children. That idea never disappeared entirely.

And around the same time, perhaps inspired by the TV series “V,” some Americans began to believe in lizard people. It may seem harmless to keep hoping for evidence of Bigfoot, but the person who detonated a bomb in Nashville on Dec. 25, 2020, apparently believed lizard people ran the Earth.


The closer the conspiracy is to science fiction, the closer it is to just being fiction.

Step 7: Look for other warning signs

There are other red flags too, like the use of prejudicial tropes about the group allegedly behind the conspiracy, particularly antisemitic allegations.

But rather than doing the work to really examine their conspiratorial beliefs, believers often choose to write off the skeptics as fools or as also being in on it – whatever “it” may be.

Ultimately, that's part of the allure of conspiracy theories. It is easier to dismiss criticism than to admit you might be wrong.

The Conversation

H. Colleen Sinclair, Associate Research Professor of Social Psychology, Louisiana State University


This article is republished from The Conversation under a Creative Commons license. Read the original article.



Future pandemics will have the same human causes as ancient outbreaks − lessons from anthropology can help prevent them


theconversation.com – Ron Barrett, Associate Professor of Anthropology, Macalester College – 2024-05-07 07:33:36

The changes that came with the transition from foraging to farming paved the way for disease.

Nastasic/DigitalVision Vectors via Getty Images

Ron Barrett, Macalester College

The last pandemic was bad, but COVID-19 is only one of many infectious diseases that have emerged since the turn of this century.


Since 2000, the world has experienced 15 novel Ebola epidemics, the global spread of a 1918-like influenza strain and major outbreaks of three new and unusually deadly coronavirus infections: SARS, MERS and, of course, COVID-19. Every year, researchers discover two or three entirely new pathogens: the viruses, bacteria and microparasites that sicken and kill people.

While some of these discoveries reflect better detection methods, genetic studies confirm that most of these pathogens are indeed new to the human species. Even more troubling, these diseases are appearing at an increasing rate.

Despite the novelty of these particular infections, the primary factors that led to their emergence are quite ancient. Working in the field of anthropology, I have found that these are primarily human factors: the ways we feed ourselves, the ways we live together, and the ways we treat one another. In a forthcoming book, “Emerging Infections: Three Epidemiological Transitions from Prehistory to the Present,” my colleagues and I examine how these same elements have influenced disease dynamics for thousands of years. Twenty-first century technologies have served only to magnify ancient challenges.

Neolithic infections

The first major wave of newly emerging infections occurred with the start of the Neolithic revolution about 12,000 years ago, when people began shifting from foraging to farming as their primary means of subsistence.


Before then, human infections tended to be mild and chronic in nature, manageable burdens of long-term parasites that people carried around from place to place. But full-time agrarian living brought the kinds of acute and virulent infections that we are familiar with today. This global shift was humanity's first epidemiological transition.

illustration of an Egyptian tomb engraving of farmers with domesticated animals

The first emerging infections followed the rise of intensive agriculture.

mikroman6/Moment via Getty Images

Farming itself was not the cause. Rather, it was the major lifestyle changes associated with this new enterprise. Agriculture supplied people with high-calorie grains, but often did so at the expense of dietary diversity, resulting in compromised immunity from nutritional deficiencies.

The human population increased dramatically, and so did the number of large and densely settled communities that could sustain the transmission of deadlier pathogens.


Our ancient ancestors domesticated animals for food and labor, and their proximity to one another created opportunities for livestock diseases to evolve into human diseases.

Finally, the social hierarchies of newly agrarian societies led to disparities in the distribution of essential resources for healthy living.

These challenges of subsistence, settlement and social organization were the root causes of humanity's first major disease transition.

Declining infections

For a dozen millennia, these patterns spread across the world like a plague of plagues. They persisted until the 19th and 20th centuries, when life expectancy rose with the precipitous decline of infectious diseases in high- and middle-income countries.


Remarkably, the greatest proportion of this decline occurred before the discovery of effective antibiotics and most of the vaccines we use today. These improvements were mainly due to nonmedicinal factors such as better farming and food distribution methods, major sanitation projects and housing reforms in poor urban areas.

Etching of unhygienic street conditions in 1800s New York City

Urban sanitation did more than new medicines to reduce infections in the 19th century.

Bettmann via Getty Images

These were significant reversals in the same ancient categories – subsistence, settlement and social organization – that led to the rise of infectious diseases in the first place. They resulted in humanity's second epidemiological transition, a significant but only partial reversal of the changes that first began in the Neolithic period.

This second pattern was not a panacea. Despite overall health improvements, chronic noninfectious conditions such as heart disease and cancer rose to become the primary causes of human mortality.


Most low-income countries experienced a later version of this transition after World War II, but their health gains from declining infections were significantly less than those of their wealthier counterparts. At the same time, their losses to noninfectious diseases rose at comparable rates. These conflicting trends have led to a “worst-of-all-worlds” scenario with respect to the health of poor societies.

It is also worth noting that the declining infections in low-income societies have depended more on affordable antimicrobial drugs. Given the emergence of drug-resistant pathogens, these medicinal buffers are proving to be little more than short-term fixes for the health consequences of poverty.

With the ability of pathogens to move freely across borders and boundaries, these consequences can quickly become everyone's problems.

part of Earth from space showing lines like flightpaths connecting cities

Every corner of the globe is connected by modern travel.

fotograzia/Moment via Getty Images


Converging infections

In recent decades, humanity's interconnections have reached the point that nearly everyone now lives within a single global disease ecology. Borders and boundaries no longer constrain the spread of distant outbreaks. The COVID-19 pandemic dramatically illustrated this new reality, when the SARS-CoV-2 virus spread around the world in only a few weeks.

The COVID-19 pandemic also highlighted the ways that infectious and noninfectious diseases can interact synergistically with one another to produce even worse outcomes than the simple sum of each disease. This is starkly illustrated by the majority of COVID-19 deaths, which occurred among people with chronic heart, lung and metabolic conditions that are common to a growing proportion of older people in populations both wealthy and poor.

When combined, these challenges have set the stage for the converging disease patterns visible today. This is the third epidemiological transition: the rise of new, virulent and drug-resistant infections occurring in a rapidly aging and highly interconnected world.

Unfortunately, the present pattern entails increasing outbreaks of new and deadly infections. The root causes of these outbreaks lie in human activities such as commercial agricultural practices, the urbanization of human populations and the challenges of poverty in the face of economic growth.


Despite the magnitude of these determinants, they are essentially the same issues of subsistence, settlement and social organization from 12,000 years ago. Addressing these recurring issues will do more than prepare the world for future pandemics; it will help to prevent them from arising in the first place.

The Conversation

Ron Barrett, Associate Professor of Anthropology, Macalester College

This article is republished from The Conversation under a Creative Commons license. Read the original article.



Venus is losing water faster than previously thought – here’s what that could mean for the early planet’s habitability


theconversation.com – Eryn Cangi, Research Scientist in Astrophysical & Planetary Sciences, University of Colorado Boulder – 2024-05-06 11:19:38
An artist's illustration of hydrogen disappearing from Venus.
Aurore Simonnet/ Laboratory for Atmospheric and Space Physics/ University of Colorado Boulder

Eryn Cangi, University of Colorado Boulder

Today, the atmosphere of our neighbor planet Venus is as hot as a pizza oven and drier than the driest desert on Earth – but it wasn't always that way.

Billions of years ago, Venus had as much water as Earth does today. If that water was ever liquid, Venus may have once been habitable.

Over time, that water has nearly all been lost. Figuring out how, when and why Venus lost its water helps planetary scientists like me understand what makes a planet habitable — or what can make a habitable planet transform into an uninhabitable world.

Venus, with its cloud tops visible, photographed using UV light.
Venus, Earth's solar system neighbor.
JAXA/ISAS/DARTS/Kevin M. Gill, CC BY

Scientists have theories explaining why most of that water disappeared, but more water has disappeared than they predicted.

In a May 2024 study, my colleagues and I revealed a new water removal process that has gone unnoticed for decades but could explain this water loss mystery.

Energy balance and early loss of water

The solar system has a habitable zone – a narrow ring around the Sun in which planets can have liquid water on their surface. Earth is in the middle, Mars is outside on the too-cold side, and Venus is outside on the too-hot side. Where a planet sits on this habitability spectrum depends on how much energy the planet gets from the Sun, as well as how much energy the planet radiates away.

The theory of how most of Venus' water loss occurred is tied to this energy balance. On early Venus, sunlight broke up water in its atmosphere into hydrogen and oxygen. Atmospheric hydrogen heats up a planet — like too many blankets on the bed in summer.

When the planet gets too hot, it throws off the blanket: the hydrogen escapes in a flow out to space, a process called hydrodynamic escape. This process removed one of the key ingredients for water from Venus. It's not known exactly when this process occurred, but it likely happened early in the planet's history.


Hydrodynamic escape stopped after most hydrogen was removed, but a little bit of hydrogen was left behind. It's like dumping out a water bottle – there will still be a few drops left at the bottom. These leftover drops can't escape in the same way. There must be some other process still at work on Venus that continues to remove hydrogen.

Little reactions can make a big difference

Our new study reveals that an overlooked chemical reaction in Venus' atmosphere can produce enough escaping hydrogen to close the gap between the expected and observed water loss.

Here's how it works. In the atmosphere, gaseous HCO⁺ molecules, which are made up of one atom each of hydrogen, carbon and oxygen and have a positive charge, combine with negatively charged electrons, since opposites attract.

But when the HCO⁺ and the electrons react, the HCO⁺ breaks up into a neutral carbon monoxide molecule, CO, and a hydrogen atom, H. This process energizes the hydrogen atom, which can then exceed the planet's escape velocity and escape to space. The whole reaction is called HCO⁺ dissociative recombination, but we like to call it DR for short.
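To see why the energized hydrogen atom can leave the planet, compare its speed with Venus' escape velocity. The sketch below, in Python, assumes the reaction deposits roughly 7 eV into the hydrogen atom – a round figure chosen for illustration, not a value taken from the study:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_VENUS = 4.867e24   # mass of Venus, kg
R_VENUS = 6.0518e6   # radius of Venus, m

# Escape velocity: v = sqrt(2 G M / R)
v_escape = math.sqrt(2 * G * M_VENUS / R_VENUS)  # about 10.4 km/s

# Assumed kinetic energy of the H atom after dissociative recombination.
# Roughly 7 eV is used here for illustration; the light H atom carries
# nearly all of the released energy because CO is so much heavier.
E_H = 7.0 * 1.602e-19  # joules
M_H = 1.674e-27        # hydrogen atom mass, kg
v_H = math.sqrt(2 * E_H / M_H)  # about 37 km/s

print(v_H > v_escape)  # True: the energized H atom can escape to space
```

Under this assumption the atom comes out several times faster than it needs to be to escape, which is why this single reaction can steadily drive hydrogen off the planet.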


Water is the original source of hydrogen on Venus, so DR effectively dries out the planet. DR has likely happened throughout the history of Venus, and our work shows it probably still continues into the present day. It doubles the amount of hydrogen escape previously calculated by planetary scientists, upending our understanding of present-day hydrogen escape on Venus.

Understanding Venus with data, models and Mars

To study DR on Venus we used both computer modeling and data analysis.

The modeling actually began as a Mars project. My Ph.D. research involved exploring what sort of conditions made planets habitable for life. Mars also used to have water, though less than Venus, and also lost most of it to space.

To understand martian hydrogen escape, I developed a computational model of the Mars atmosphere that simulates Mars' atmospheric chemistry. Despite being very different planets, Mars and Venus actually have similar upper atmospheres, so my colleagues and I were able to extend the model to Venus.


We found that HCO⁺ dissociative recombination produces lots of escaping hydrogen in both planets' atmospheres, which agreed with measurements taken by the Mars Atmosphere and Volatile EvolutioN, or MAVEN, mission, a satellite orbiting Mars.

A spacecraft that looks like a metal box with two solar panels attached on either side and a small limb extending downward.
An illustration of the MAVEN mission orbiting Mars.
NASA's Goddard Space Flight Center

Having data collected in Venus' atmosphere to back up the model would be valuable, but previous missions to Venus haven't measured HCO⁺ – not because it's not there, but because they weren't designed to detect it. They did, however, measure the reactants that produce HCO⁺ in Venus' atmosphere.

By analyzing measurements made by Pioneer Venus, a combination orbiter and probe mission that studied Venus from 1978-1992, and using our knowledge of chemistry, we demonstrated that HCO⁺ should be present in the atmosphere in similar amounts to our model.

Follow the water

Our work has filled in a piece of the puzzle of how water is lost from planets, which affects how habitable a planet is for life. We've learned that water loss happens not just in one fell swoop, but over time through a combination of methods.

Faster hydrogen loss today via DR means that less time is required overall to remove the remaining water from Venus. This means that if oceans were ever present on early Venus, they could have been present for longer than scientists thought before water loss through hydrodynamic escape and DR started. This would allow more time for possible life to arise. Our results don't mean oceans or life were definitely present, though – answering that question will require lots more science over many years.


There is also a need for new Venus missions and observations. Future Venus missions will provide some atmospheric measurements, but they won't focus on the upper atmosphere where most HCO⁺ dissociative recombination takes place. A future Venus upper atmosphere mission, similar to the MAVEN mission at Mars, could vastly expand everyone's knowledge of how terrestrial planets' atmospheres form and evolve over time.

With the technological advancements of recent decades and a flourishing new interest in Venus, now is an excellent time to turn our eyes toward Earth's sister planet.

The Conversation

Eryn Cangi, Research Scientist in Astrophysical & Planetary Sciences, University of Colorado Boulder

This article is republished from The Conversation under a Creative Commons license. Read the original article.

