The Conversation
‘Man, the hunter’? Archaeologists’ assumptions about gender roles in past humans ignore an icky but potentially crucial part of original ‘paleo diet’
Raven Garvey, University of Michigan
One of the most common stereotypes about the human past is that men did the hunting while women did the gathering. That gendered division of labor, the story goes, would have provided the meat and plant foods people needed to survive.
That characterization of our time as a species exclusively reliant on wild foods – before people started domesticating plants and animals more than 10,000 years ago – matches the pattern anthropologists observed among hunter-gatherers during the 19th and early 20th centuries. Virtually all of the large-game hunting they documented was performed by men.
It's an open question whether these ethnographic accounts of labor are truly representative of recent hunter-gatherers' subsistence behaviors. Regardless, they definitely fueled assumptions that a gendered division of labor arose early in our species' evolution. Current employment statistics do little to disrupt that thinking; in a recent analysis, just 13% of hunters, fishers and trappers in the U.S. were women.
Still, as an archaeologist, I've spent much of my career studying how people of the past got their food. I can't always square my observations with the “man the hunter” stereotype.
A long-standing anthropological assumption
First, I want to note that this article uses “women” to describe people biologically equipped to experience pregnancy, while recognizing that not all people who identify as women are so equipped, and not all people so equipped identify as women.
I am using this definition here because reproduction is at the heart of many hypotheses about when and why subsistence labor became a gendered activity. As the thinking goes, women gathered because it was a low-risk way to provide dependent children with a reliable stream of nutrients. Men hunted either to round out the household diet or to use difficult-to-acquire meat as a way to attract potential mates.
One of the things that has come to trouble me about attempts to test related hypotheses using archaeological data – some of my own attempts included – is that they assume plants and animals are mutually exclusive food categories. Everything rests on the idea that plants and animals differ completely in how risky they are to obtain, their nutrient profiles and their abundance on a landscape.
It is true that highly mobile large-game species such as bison, caribou and guanaco (a deer-sized South American herbivore) were sometimes concentrated in places or seasons where plants edible to humans were scarce. But what if people could get the plant portion of their diets from the animals themselves?
Animal prey as a source of plant-based food
The plant material undergoing digestion in the stomachs and intestines of large ruminant herbivores is a not-so-appetizing substance called digesta. This partially digested matter is edible to humans and rich in carbohydrates, which are pretty much absent from animal tissues.
Conversely, animal tissues are rich in protein and, in some seasons, fats – nutrients unavailable in many plants or that occur in such small amounts that a person would need to eat impractically large quantities to meet daily nutritional requirements from plants alone.
If past peoples ate digesta, a big herbivore with a full belly would, in essence, be one-stop shopping for total nutrition.
To explore the potential and implications of digesta as a source of carbohydrates, I recently compared institutional dietary guidelines to person-days of nutrition per animal using a 1,000-pound (450-kilogram) bison as a model. First I compiled available estimates for protein in a bison's own tissues and for carbohydrates in digesta. Using that data, I found that a group of 25 adults could meet the U.S. Department of Agriculture's recommended daily averages for protein and carbohydrates for three full days eating only bison meat and digesta from one animal.
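To make the shape of that comparison concrete, here is a minimal sketch of a person-days calculation in Python. The per-animal totals below are illustrative placeholders, not the estimates compiled for the study; the daily targets are approximate U.S. adult reference values. The point is only the structure: divide each nutrient total by the daily requirement and let the scarcer nutrient set the limit.

```python
# A minimal sketch of a "person-days of nutrition per animal" calculation.
# The per-animal totals are illustrative placeholders, NOT the study's estimates.

RDA_PROTEIN_G = 56.0   # approximate U.S. adult protein recommendation, grams/day
RDA_CARB_G = 130.0     # approximate U.S. adult carbohydrate recommendation, grams/day

# Hypothetical totals for one large bison (placeholder values for illustration):
protein_from_tissue_g = 30_000.0   # protein in the animal's own edible tissues
carbs_from_digesta_g = 10_000.0    # carbohydrate recoverable from its digesta

protein_person_days = protein_from_tissue_g / RDA_PROTEIN_G
carb_person_days = carbs_from_digesta_g / RDA_CARB_G

# Complete nutrition is limited by whichever nutrient runs out first.
person_days = min(protein_person_days, carb_person_days)
print(f"Person-days of complete nutrition per animal: {person_days:.0f}")
print(f"Days of coverage for a group of 25 adults: {person_days / 25:.1f}")
```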
Among past peoples, consuming digesta would have relaxed the demand for fresh plant foods, perhaps changing the dynamics of subsistence labor.
Recalibrating the risk if everyone hunts
One of the risks typically associated with large-game hunting is that of failure. According to the evolutionary hypotheses around gendered division of labor, when risk of hunting failure is high – that is, the likelihood of bagging an animal on any given hunting trip is low – women should choose more reliable resources to provision children, even if it means long hours of gathering. The cost of failure is simply too high to do otherwise.
However, there is evidence to suggest that large game was much more abundant in North America, for example, before the 19th- and 20th-century ethnographers observed foraging behaviors. If high-yield resources like bison could have been acquired with low risk, and the animals' digesta was also consumed, women may have been more likely to participate in hunting. Under those circumstances, hunting could have provided total nutrition, eliminating the need to obtain protein and carbohydrates from separate sources that might have been widely spread across a landscape.
And, statistically speaking, women's participation in hunting would also have helped reduce the risk of failure. My models show that, if all 25 of the people in a hypothetical group participated in the hunt, rather than just the men, and all agreed to share when successful, each hunter would have had to be successful only about five times a year for the group to subsist entirely on bison and digesta. Of course, real life is more complicated than the model suggests, but the exercise illustrates potential benefits of both digesta and female hunting.
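The "about five times a year" figure can be recovered with simple arithmetic from the numbers stated above; the sketch below assumes, as the simplified model does, that one animal feeds the whole group for three days and that all 25 members hunt and share.

```python
# Rough check of the "about five successful hunts per hunter per year" figure,
# using only numbers stated in the article (a simplification, not the full model).

GROUP_SIZE = 25          # adults in the hypothetical group
DAYS_FED_PER_ANIMAL = 3  # days one bison plus its digesta feeds the whole group
DAYS_PER_YEAR = 365

animals_needed_per_year = DAYS_PER_YEAR / DAYS_FED_PER_ANIMAL   # ~122 animals
hunts_per_hunter = animals_needed_per_year / GROUP_SIZE         # if everyone hunts and shares

print(f"Animals needed per year: {animals_needed_per_year:.0f}")
print(f"Successful hunts required per hunter per year: {hunts_per_hunter:.1f}")  # ~4.9
```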
Ethnographically documented foragers did routinely eat digesta, especially where herbivores were plentiful but plants edible to humans were scarce, as in the Arctic, where prey's stomach contents were an important source of carbohydrates.
I believe eating digesta may have been a more common practice in the past, but direct evidence is frustratingly hard to come by. In at least one instance, plant species present in the mineralized plaque of a Neanderthal individual's teeth point to digesta as a source of nutrients. To systematically study past digesta consumption and its knock-on effects, including female hunting, researchers will need to draw on multiple lines of archaeological evidence and insights gained from models like the ones I developed.
Raven Garvey, Associate Professor of Anthropology; Curator of High Latitude and Western North American Archaeology, Museum of Anthropological Archaeology; Faculty Affiliate, Research Center for Group Dynamics, University of Michigan
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The Conversation
Nationwide test of Wireless Emergency Alert system could test people’s patience – or help rebuild public trust in the system
Elizabeth Ellcessor, University of Virginia and Hamilton Bean, University of Colorado Denver
The Wireless Emergency Alert system, a public safety system that allows authorities to alert people via their mobile devices of dangerous weather, missing children and other situations requiring public attention, is scheduled to have its third nationwide test on Oct. 4, 2023.
Similar tests in 2018 and 2021 caused a degree of public confusion and resistance. In addition, there was confusion around the first test of the U.K. system in April 2023, and an outcry surrounding accidental alert messages such as those sent in Hawaii in January 2018 and in Florida in April 2023.
The federal government lists five types of emergency alerts: National (formerly labeled Presidential), Imminent Threat, Public Safety, America's Missing: Broadcast Emergency Response (Amber), and Opt-in Test Messages. You can opt out of any except National Alerts, which are reserved for national emergencies. The Oct. 4 test is a National Alert.
We are a media studies researcher and a communications researcher who study emergency alert systems. We believe that concerns about previous tests raise two questions: Is public trust in emergency alerting eroding? And how might the upcoming test rebuild it?
Confusion and resistance
In an ever-updating digital media environment, emergency alerts appear as part of a constant stream of updates, buzzes, reminders and notifications on people's smartphones. Over-alerting is a common fear in emergency management circles because it can lead people to ignore alerts and not take needed action. The sheer volume of different updates can be similarly overwhelming, burying emergency alerts in countless other messages. Many people have even opted out of alerts when possible, rummaging through settings and toggling off every alert they can find.
Even when people receive alerts, however, there is potential for confusion and rejection. All forms of emergency alerts rely on the recipients' trust in the people or organization responsible for the alert. But it's not always clear who the sender is. As one emergency manager explained to one of us regarding alerts used during COVID-19: “People were more confused because they got so many different notifications, especially when they don't say who they're from.”
When the origin of an alert is unclear, or the recipient perceives it to have a political bias counter to their own views, people may become confused or resistant to the message. Prior tests and use of the Wireless Emergency Alert system have indicated strong anti-authority attitudes, particularly following the much-derided 2018 test of what was then called the Presidential Alert message class. There are already conspiracy theories online about the upcoming test.
Trust in alerts is further reduced by the overall lack of testing and public awareness work done on behalf of the Wireless Emergency Alert system since its launch in June 2012. As warning expert Dennis Mileti explained in his 2018 Federal Emergency Management Agency PrepTalk, routine public tests are essential for warning systems' effectiveness. However, the Wireless Emergency Alert system has been tested at the national level only twice, and there has been little public outreach to explain the system by either the government or technology companies.
More exposure and info leads to more trust
The upcoming nationwide test may offer a moment that could rebuild trust in the system. A survey administered in the days immediately following the 2021 national test found that more respondents believed that the National Alert message class label would signal more trustworthy information than the Presidential Alert message class label.
Similarly, in contrast to the 2021 test, which targeted only select users, the Oct. 4 test is slated to reach all compatible devices in the U.S. Since users cannot opt out of the National Alert message class, this week's test is a powerful opportunity to build awareness about the potential benefits of a functional federal emergency alert system.
The Oct. 4 test message is expected to state, “THIS IS A TEST of the National Wireless Emergency Alert system. No action is needed.” We instead suggest that action is, in fact, urgently needed to help people better understand the rapidly changing mobile alert and warning ecosystem that confronts them. Familiarity with this system is what will allow it to support public health and safety, and address the crises of the 21st century.
Here are steps that you can take now to help make the Wireless Emergency Alert system more effective:
- The Wireless Emergency Alert system is only one form of emergency alert. Identify which mobile notification systems are used by your local emergency management organizations: police, fire and emergency services. Know which systems are opt-in and opt-out, and opt in to those needed. Ensure access to other sources of information during an emergency, such as local radio and television, or National Oceanic and Atmospheric Administration weather radio.
- Understand the meaning of mobile device notification settings. Just because you are opted in to “Emergency Alerts” on your cellphone does not necessarily mean you are signed up to receive notifications from local authorities. Check the FEMA website for information about the Wireless Emergency Alert system and your local emergency management organizations' websites about opt-in systems.
- Have a plan for contacting family, friends and neighbors during an emergency. Decide in advance who will help the vulnerable members of your community.
- Find out if your local emergency management organizations test their alert systems, and make sure to receive those local tests.
- Anticipate the possibility that mobile systems will be damaged or unavailable during a crisis and prepare essentials for sheltering in place or quick evacuation.
Finally, push back on the lack of information and the rise of misinformation by sharing reliable information about emergency alerts with your family and friends.
Elizabeth Ellcessor, Associate Professor of Media Studies, University of Virginia and Hamilton Bean, Associate Professor of Communication, University of Colorado Denver
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The Conversation
Superconductivity at room temperature remains elusive a century after a Nobel went to the scientist who demonstrated it below -450 degrees Fahrenheit
David D. Nolte, Purdue University
On April 8, 1911, Dutch physicist Heike Kamerlingh Onnes scribbled in pencil an almost unintelligible note into a kitchen notebook: “near enough null.”
The note referred to the electrical resistance he'd measured during a landmark experiment that would later be credited as the discovery of superconductivity. But first, he and his team would need many more trials to confirm the measurement.
Their discovery opened up a world of potential scientific applications. The century since has seen many advances, but superconductivity researchers today can take lessons from Onnes' original, Nobel Prize-winning work.
I have always been interested in origin stories. As a physics professor and the author of books on the history of physics, I look for the interesting backstory – the twists, turns and serendipities that lie behind great discoveries.
The true stories behind these discoveries are usually more chaotic than the rehearsed narratives crafted after the fact, and some of the lessons learned from Onnes' experiments remain relevant today as researchers search for new superconductors that might, one day, operate near room temperature.
Superconductivity
Superconductivity is a rare quantum effect that allows electrical currents to flow without resistance in superconducting wires. It makes possible a myriad of scientific applications, including MRI machines and powerful particle accelerators.
Imagine giving a single push to a row of glass beads strung on a frictionless wire. Once the beads start moving down the wire, they never stop, like a perpetual motion machine. That's the idea behind superconductivity – particles flowing without resistance.
For superconductors to work, they need to be cooled to ultra-low temperatures colder than any Arctic blast. That's why Onnes' original work cooling helium to near absolute zero temperature set the stage for his unexpected discovery of superconductivity.
The discovery
Onnes, a physics professor at the University of Leiden in the Netherlands, built the leading low-temperature physics laboratory in the world in the first decade of the 20th century.
His lab was the first to turn helium from a gas to a liquid by making the gas expand and cool. In this way, his team cooled helium to a temperature of -452 degrees Fahrenheit (-269 degrees Celsius).
Onnes then began studying the electrical conductivity of metals at these cold temperatures. He started with mercury because it is a metal that conducts electricity yet is liquid at room temperature, making it easy to fill into glass tubes. At low temperatures, the mercury would freeze solid, creating metallic wires that Onnes could use in his conductivity experiments.
On April 8, 1911, his lab technicians transferred liquid helium into a measurement cryostat – a glass container with a vacuum jacket to insulate it from the room's heat. They cooled the helium to -454 F (-270 C) and then measured the electrical resistance of the mercury wire by sending a small current through it and measuring the voltage.
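For reference, the relation behind that measurement is just Ohm's law (standard physics, not anything specific to Onnes' apparatus): resistance is the measured voltage divided by the applied current, so a voltage reading of essentially zero at a fixed current implies essentially zero resistance.

```latex
R = \frac{V}{I}, \qquad V \to 0 \ \text{at fixed}\ I \;\Longrightarrow\; R \to 0
```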
It was then that Onnes wrote the cryptic “near enough null” measurement into his kitchen notebook, meaning that the wire was conducting electricity without any measurable resistance.
April 8 is often cited as the date superconductivity was discovered, but the full story isn't so simple, because scientists can't accept a scribbled “near enough null” as sufficient proof of a new discovery.
In pursuit of proof
Onnes' team performed its next experiment more than six weeks later, on May 23. On this day, they cooled the cryostat again to -454 F (-270 C) and then let the temperature slowly rise.
At first they measured barely any electrical resistance, indicating superconductivity. The resistance stayed small up to -452 F (-269 C), then suddenly rose by more than a factor of 400 as the temperature inched up just a fraction of a degree.
The rise was so rapid and so unexpected that they started searching for some form of electrical fault or open circuit that might have been caused by the temperature shifts. But they couldn't find anything wrong. They spent five more months improving their system before trying again. On Oct. 26 they repeated the experiment and reproduced the sudden rise in resistance they had seen earlier.
One week later, Onnes presented these results to the first Solvay Conference, and two years later he received his Nobel Prize in physics, recognizing his low-temperature work generally but not superconductivity specifically.
It took another three years of diligent work before Onnes had his irrefutable evidence: He measured persistent currents that did not decay, demonstrating truly zero resistance and superconductivity on April 24, 1914.
New frontiers for critical temperatures
In the decades following Onnes' discovery, many researchers have explored how metals act at supercooled temperatures and have learned more about superconductivity.
But if researchers can observe superconductivity only at super low temperatures, it's hard to make anything useful. It is too expensive to operate a machine practically if it works only at -400 F (-240 C).
So, scientists began searching for superconductors that can work at practical temperatures. For instance, K. Alex Müller and J. Georg Bednorz at the IBM research laboratory in Switzerland figured out that metal oxides like lanthanum-barium-copper oxide, known as LBCO, could be good candidates.
It took the IBM team about three years to find superconductivity in LBCO. But when they did, their work set a new record, with superconductivity observed at -397 F (-238 C) in 1986.
A year later, in 1987, a lab in Houston replaced the lanthanum in LBCO with the element yttrium to create YBCO. They demonstrated superconductivity at -292 F (-180 C). This discovery made YBCO the first practical superconductor, because it could work while immersed in inexpensive liquid nitrogen.
Since then, researchers have observed superconductivity at temperatures as high as -164 F (-109 C), but achieving a room-temperature superconductor has remained elusive.
In 2023, two groups claimed they had evidence for room-temperature superconductivity, though both reports have been met with sharp skepticism, and both are now in limbo following further scrutiny.
Superconductivity has always been tricky to prove because some metals can masquerade as superconductors. The lessons learned by Onnes a century ago – that these discoveries require time, patience and, most importantly, proof of currents that never stop – are still relevant today.
David D. Nolte, Distinguished Professor of Physics and Astronomy, Purdue University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The Conversation
Tenacious curiosity in the lab can lead to a Nobel Prize – mRNA research exemplifies the unpredictable value of basic scientific research
André O. Hudson, Rochester Institute of Technology
The 2023 Nobel Prize in physiology or medicine will go to Katalin Karikó and Drew Weissman for their discovery that modifying mRNA – a form of genetic material your body uses to produce proteins – could reduce unwanted inflammatory responses and allow it to be delivered into cells. While the impact of their findings may not have been apparent at the time of their breakthrough over a decade ago, their work paved the way for the development of the Pfizer-BioNTech and Moderna COVID-19 vaccines, as well as many other therapeutic applications currently in development.
We asked André O. Hudson, a biochemist and microbiologist at the Rochester Institute of Technology, to explain how basic research like that of this year's Nobel Prize winners provides the foundations for science – even when its far-reaching effects won't be felt until years later.
What is basic science?
Basic research, sometimes called fundamental research, is a type of investigation with the overarching goal of understanding natural phenomena like how cells work or how birds can fly. Scientists ask the fundamental questions of how, why, when, where and if in order to satisfy curiosity and fill gaps in our understanding of the natural world.
Researchers sometimes conduct basic research with the hope of eventually developing a technology or drug based on that work. But what many scientists typically do in academia is ask fundamental questions with answers that may or may not ever lead to practical applications.
Humans, and the animal kingdom as a whole, are wired to be curious. Basic research scratches that itch.
What are some basic science discoveries that went on to have a big influence on medicine?
The 2023 Nobel Prize in physiology or medicine acknowledges basic science work done in the early 2000s. Karikó and Weissman's discovery about modifying mRNA to reduce the body's inflammatory response to it allowed other researchers to leverage it to make improved vaccines.
Another example is the discovery of antibiotics, which was based on an unexpected observation. In the late 1920s, the microbiologist Alexander Fleming was growing a species of bacteria in his lab and found that his Petri dish was accidentally contaminated with the fungus Penicillium notatum. He noticed that wherever the fungus was growing, it impeded or inhibited the growth of the bacteria. He wondered why that was happening and subsequently went on to isolate penicillin, which was approved for medical use in the early 1940s.
This work fed into more questions that ushered in the age of antibiotics. The 1952 Nobel Prize in physiology or medicine was awarded to Selman Waksman for his discovery of streptomycin, the first antibiotic to treat tuberculosis.
Basic research often involves seeing something surprising, wanting to understand why and deciding to investigate further. Early discoveries start from a basic observation, asking the simple question of “How?” Only later are they parlayed into a medical technology that helps humanity.
Why does it take so long to get from curiosity-driven basic science to a new product or technology?
The mRNA modification discovery could be considered on a relatively fast track from basic science to application. Less than 15 years passed between Karikó and Weissman's findings and the COVID-19 vaccines. The importance of their discovery came to the forefront with the pandemic and the millions of lives the resulting vaccines saved.
Most basic research won't reach the market until several decades after its initial publication in a science journal. One reason is that it depends on need. For example, orphan diseases that affect only a small number of people will get less attention and funding than conditions that are ubiquitous in a population, like cancer or diabetes. Companies don't want to spend billions of dollars developing a drug that will only have a small return on their investment. Likewise, because the return on investment for basic research often isn't clear, it can be a hard sell to support financially.
Another reason is cultural. Scientists are trained to chase after funding and support for their work wherever they can find it. But sometimes that's not as easy as it seems.
A good example of this was when the human genome was first sequenced in the early 2000s. A lot of people thought that having access to the full sequence would lead to treatments and cures for many different diseases. But that has not been the case, because there are many nuances to translating basic research to the clinic. What works in a cell or an animal might not translate into people. There are many steps and layers in the process to get there.
Why is basic science important?
For me, the most critical reason is that basic research is how we train and mentor future scientists.
In an academic setting, telling students “Let's go develop an mRNA vaccine” versus asking them “How does mRNA work in the body?” influences how they approach science. How do they design experiments? Do they start the study going forward or backward? Are they argumentative or cautious in how they present their findings?
Almost every scientist is trained under a basic research umbrella of how to ask questions and go through the scientific method. You need to understand how, when and where mRNAs are modified before you can even begin to develop an mRNA vaccine. I believe the best way to inspire future scientists is to encourage them to expand on their curiosity in order to make a difference.
When I was writing my dissertation, I was relying on studies that were published in the late 1800s and early 1900s. Many of these studies are still cited in scientific articles today. When researchers share their work, it will be of use to someone else in the future, even if not today, tomorrow or 10 to 20 years from now. You'll make a future scientist's job a little bit easier, and I believe that's a great legacy to have.
What is a common misconception about basic science?
Because any immediate use for basic science can be very hard to see, it's easy to think this kind of research is a waste of money or time. Why are scientists breeding mosquitoes in these labs? Or why are researchers studying migratory birds? The same argument has been made with astronomy. Why are we spending billions of dollars putting things into space? Why are we looking to the edge of the universe and studying stars when they are millions and billions of light years away? How does it affect us?
There is a need for more scientific literacy, because without it, it is difficult to understand why basic research is necessary for future breakthroughs that will have a major effect on society.
In the short term, the worth of basic research can be hard to see. But in the long term, history has shown that a lot of what we take for granted now, such as common medical equipment like X-rays, lasers and MRIs, came from basic things people discovered in the lab.
And it still goes down to the fundamental questions – we're a species that seeks answers to things we don't know. As long as curiosity is a part of humanity, we're always going to be seeking answers.
André O. Hudson, Dean of the College of Science, Professor of Biochemistry, Rochester Institute of Technology
This article is republished from The Conversation under a Creative Commons license. Read the original article.