
The Conversation

Forget ‘Man the Hunter’ – physiological and archaeological evidence rewrites assumptions about a gendered division of labor in prehistoric times




In small-group, subsistence living, it makes sense for everyone to do lots of different tasks.
gorodenkoff/iStock via Getty Images Plus

Sarah Lacy, University of Delaware and Cara Ocobock, University of Notre Dame

Prehistoric men hunted; prehistoric women gathered. At least this is the standard narrative written by and about men to the exclusion of women.

The idea of “Man the Hunter” runs deep within anthropology, convincing people that hunting made us human, only men did the hunting, and therefore evolutionary forces must only have acted upon men. Such depictions are found not only in media, but in museums and introductory anthropology textbooks, too.

A common argument is that a sexual division of labor and unequal division of power exists today; therefore, it must have existed in our evolutionary past as well. But this is a just-so story without sufficient evidentiary support, despite its pervasiveness in disciplines like evolutionary psychology.

There is a growing body of physiological, anatomical, ethnographic and archaeological evidence to suggest that not only did women hunt in our evolutionary past, but they may well have been better suited for such an endurance-dependent activity.


We are both biological anthropologists. Cara specializes in the physiology of humans living in extreme conditions, using her research to reconstruct how our ancestors may have adapted to different climates. Sarah studies Neanderthal and early modern human health, and excavates at their archaeological sites.

It's not uncommon for scientists like us – who attempt to include the contributions of all individuals, regardless of sex and gender, in reconstructions of our evolutionary past – to be accused of rewriting the past to fulfill a politically correct, woke agenda. The actual evidence speaks for itself, though: Gendered labor roles did not exist in the Paleolithic era, which lasted from 3.3 million years ago until 12,000 years ago. The story is written in human bodies, now and in the past.

We recognize that biological sex can be defined using multiple characteristics, including chromosomes, genitalia and hormones, each of which exists on a spectrum. Social gender, too, is not a binary category. We use the terms female and male when discussing the physiological and anatomical evidence, as this is what the research literature tends to use.

Female bodies: Adapted for endurance

One of the key arguments put forth by “Man the Hunter” proponents is that females would not have been physically capable of taking part in the long, arduous hunts of our evolutionary past. But a number of female-associated features, which provide an endurance advantage, tell a different story.


All human bodies, regardless of sex, have and need both the hormones estrogen and testosterone. On average, females have more estrogen and males more testosterone, though there is a great deal of variation and overlap.

Testosterone often gets all the credit when it comes to athletic performance. But estrogen – technically the estrogen receptor – is deeply ancient, originating somewhere between 1.2 billion and 600 million years ago. It predates the existence of sexual reproduction involving egg and sperm. The testosterone receptor originated as a duplicate of the estrogen receptor and is only about half as old. As such, estrogen, in its many forms and pervasive functions, seems necessary for athletic performance among both females and males.

Estrogen influences athletic performance, particularly endurance performance. The greater concentrations of estrogen that females tend to have in their bodies likely confer an endurance advantage – an ability to exercise for a longer period of time without becoming exhausted.

Silhouette of a woman's body with cartoon systems highlighted
The hormone estrogen has multiple effects throughout the body and plays a role in people regardless of sex.
Cara Ocobock, CC BY-ND

Estrogen signals the body to burn more fat – beneficial during endurance activity for two key reasons. First, fat has more than twice the calories per gram as carbohydrates do. And it takes longer to metabolize fats than carbs. So, fat provides more bang for the buck overall, and the slow burn provides sustained energy over longer periods of time, which can delay fatigue during endurance activities like running.

In addition to their estrogen advantage, females have a greater proportion of type I muscle fibers relative to males.


These are slow oxidative muscle fibers that prefer to metabolize fats. They're not particularly powerful, but they take a while to fatigue – unlike the powerful type II fibers that males have more of but that tire rapidly. Doing the same intense exercise, females burn 70% more fat than males do, and unsurprisingly, are less likely to fatigue.

Estrogen also appears to be important for post-exercise recovery. Intense exercise or heat exposure can be stressful for the body, eliciting an inflammatory response via the release of heat shock proteins. Estrogen limits this response, which would otherwise inhibit recovery. Estrogen also stabilizes cell membranes that might otherwise be damaged or rupture due to the stress of exercise. Thanks to this hormone, females incur less damage during exercise and are therefore capable of faster recovery.

Silhouette of woman running with cartoon systems highlighted
A variety of physiological differences add up to an advantage for women in endurance activities.
Cara Ocobock, CC BY-ND

Women in the past likely did everything men did

Forget the Flintstones' nuclear family with a stay-at-home wife. There's no evidence of this social structure or gendered labor roles during the 2 million years of evolution for the genus Homo until the last 12,000 years, with the advent of agriculture.

Our Neanderthal cousins, a group of humans who lived across Western and Central Eurasia approximately 250,000 to 40,000 years ago, formed small, highly nomadic bands. Fossil evidence shows females and males experienced the same bony traumas across their bodies – a signature of a hard life hunting deer, aurochs and woolly mammoths. Tooth wear that results from using the front teeth as a third hand, likely in tasks like tanning hides, is equally evident across females and males.

This nongendered picture should not be surprising when you imagine small-group living. Everyone needs to contribute to the tasks necessary for group survival – chiefly, producing food and shelter and raising children. Individual mothers are not solely responsible for their children; among foragers, the whole group contributes to child care.


You might imagine this unified labor strategy then changed in early modern humans, but archaeological and anatomical evidence shows it did not. Upper Paleolithic modern humans leaving Africa and entering Europe and Asia show very few sexed differences in trauma and repetitive motion wear. One difference is more evidence of “thrower's elbow” in males than females, though some females shared these pathologies.

And this was also the time when people were innovating with hunting technologies like atlatls, fishing hooks and nets, and bow and arrows – alleviating some of the wear and tear hunting would take on their bodies. A recent archaeological experiment found that using atlatls decreased sex differences in the speed of spears thrown by contemporary men and women.

Even in death, there are no sexed differences in how Neanderthals or modern humans buried their dead, or in the goods affiliated with their graves. These indicators of differential gendered social status do not arrive until agriculture, with its stratified economic system and monopolizable resources.

All this evidence suggests Paleolithic women and men did not occupy differing roles or social realms.

young women adorned with toucan and macaw feathers holding wooden sticks
Young women from the Awa Indigenous group in Brazil return from a hunt with their bows and arrows.
Scott Wallace/Hulton Archive via Getty Images

Critics might point to recent forager populations and suggest that since they are using subsistence strategies similar to our ancient ancestors, their gendered roles are inherent to the hunter-gatherer lifestyle.

However, there are many flaws in this approach. Foragers are not living fossils, and their social structures and cultural norms have evolved over time and in response to patriarchal agricultural neighbors and colonial administrators. Additionally, ethnographers of the last two centuries brought their sexism with them into the field, and it biased how they understood forager societies. For instance, a recent reanalysis showed that 79% of cultures described in ethnographic data included descriptions of women hunting; however, previous interpretations frequently left them out.

Time to shake these caveman myths

The myth that female reproductive capabilities somehow render them incapable of gathering any food products beyond those that cannot run away does more than just underestimate Paleolithic women. It feeds into narratives that the contemporary social roles of women and men are inherent and define our evolution. Our Paleolithic ancestors lived in a world where everyone in the band pulled their own weight, performing multiple tasks. It was not a utopia, but it was not a patriarchy.

Certainly accommodations must have been made for group members who were sick, recovering from childbirth or otherwise temporarily incapacitated. But pregnancy, lactation, child-rearing and menstruation are not permanently disabling conditions, as researchers found among the living Agta of the Philippines, who continue to hunt during these life periods.

Suggesting that the female body is only designed to gather plants ignores female physiology and the archaeological record. To ignore the evidence perpetuates a myth that only serves to bolster existing power structures.

Sarah Lacy, Assistant Professor of Anthropology, University of Delaware and Cara Ocobock, Assistant Professor of Anthropology, University of Notre Dame


This article is republished from The Conversation under a Creative Commons license. Read the original article.


3 ways AI can help farmers tackle the challenges of modern agriculture




Farming is as much about data as hardware.
AP Photo/Nati Harnik

Joe Hollis, Iowa State University

For all the attention on flashy new artificial intelligence tools like ChatGPT, the challenges of regulating AI, and doomsday scenarios of superintelligent machines, AI is a useful tool in many fields. In fact, it has enormous potential to benefit humanity.

In agriculture, farmers are increasingly using AI-powered tools to tackle challenges that threaten human health, the environment and food security. Researchers expect the market for these tools to reach US$12 billion by 2032.

As a researcher studying agricultural and rural policy, I see three promising developments in agricultural AI: federated learning, pest and disease detection and forecasting prices.

Pooling data without sharing it

Robotics, sensors and information technology are increasingly used in agriculture. These tools aim to help farmers improve efficiency and reduce chemical use. In addition, data collected by these tools can be used in software that uses machine learning to improve management and decision-making. However, these applications typically require data sharing among stakeholders.


A survey of U.S. farmers found that more than half of respondents said they do not trust federal agencies or private companies with their data. This lack of trust is linked to concerns about sensitive information becoming compromised or being used to manipulate markets and regulations. Machine learning could reduce these concerns.

Federated learning is a technique that trains a machine learning algorithm on data from multiple parties without the parties having to reveal their data to each other. With federated learning, a farmer puts data on a local computer that the algorithm can access rather than sharing the data on a central server. This method increases privacy and reduces the risk of compromise.

If farmers can be persuaded to share their data this way, they can contribute to a collaborative system that helps them make better decisions and meet their sustainability goals. For example, farmers could pool data about conditions for their chickpea crops, and a model trained on all of their data could give each of them better forecasts for their chickpea yields than models trained only on their own data.
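The mechanism described above can be made concrete with a minimal sketch of federated averaging: each farm trains on its own data locally, and only model weights – never raw data – are sent for aggregation. Everything here is hypothetical (the farms, the two-feature linear yield model, the numbers); real deployments use frameworks such as TensorFlow Federated and add protections like secure aggregation.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One farm's step: train a linear yield model on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_average(global_w, farm_datasets):
    """Server step: average locally trained weights; raw data never leaves a farm."""
    local_ws = [local_update(global_w, X, y) for X, y in farm_datasets]
    return np.mean(local_ws, axis=0)

# Hypothetical example: three farms, each predicting chickpea yield
# from two local measurements (say, rainfall and soil moisture).
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
farms = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=20)
    farms.append((X, y))

w = np.zeros(2)
for _ in range(50):  # 50 communication rounds
    w = federated_average(w, farms)
print(w)  # converges near [2.0, -1.0] without pooling any raw data
```

The key property is in `federated_average`: the server only ever sees weight vectors, so each farm's sensitive records stay on its own machine.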

An AI-driven giant robot armed with lasers is a major threat – to weeds.

Detecting pests and disease

Farmer livelihoods and global food security are increasingly at risk from plant disease and pests. The Food and Agriculture Organization estimates that worldwide annual losses from disease and pests total $290 billion, with 40% of global crop production affected.


Farmers typically spray crops with chemicals to preempt outbreaks. However, the overuse of these chemicals is linked to harmful effects on human health, soil and water quality and biodiversity. Worryingly, many pathogens are becoming resistant to existing treatments, and developing new ones is proving to be difficult.

Reducing the amount of chemicals used is therefore paramount, and AI may be part of a solution.

The Consortium of International Agricultural Research Centers has created a mobile phone app that identifies pests and disease. The app, “Tumaini,” allows users to upload a photo of a suspected pest or disease, which the AI compares with a database of 50,000 images. The app also provides analysis and can recommend treatment programs.

If used with farm management tools, apps like this can improve farmers' ability to target their spraying and improve accuracy in deciding how much chemical to use. Ultimately, these efficiencies may reduce pesticide use, lessen the risk of resistance and prevent spillovers that cause harm to both humans and the environment.
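The comparison step such an app performs can be pictured as a nearest-neighbor search over image features. The sketch below is a hypothetical illustration, not Tumaini's actual pipeline (which is built on deep learning): the three-number "embeddings" and the disease labels stand in for features a trained network would extract from real photos.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two feature vectors, ranging from -1 to 1."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def identify(query_embedding, reference_db):
    """Return the reference label whose embedding best matches the query."""
    best_label, best_score = None, -1.0
    for label, emb in reference_db.items():
        score = cosine_similarity(query_embedding, emb)
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score

# Hypothetical embeddings standing in for CNN features of reference photos.
reference_db = {
    "banana bunchy top virus": np.array([0.9, 0.1, 0.2]),
    "black sigatoka":          np.array([0.1, 0.8, 0.3]),
    "healthy leaf":            np.array([0.2, 0.2, 0.9]),
}
query = np.array([0.85, 0.15, 0.25])  # embedding of an uploaded photo
label, score = identify(query, reference_db)
print(label)  # "banana bunchy top virus"
```

In a real system the database would hold tens of thousands of reference images, so approximate nearest-neighbor indexes would replace the linear scan shown here.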


Crystal ball for prices

Market volatility and fluctuating prices affect how farmers invest and decide what to grow. This uncertainty can also prevent farmers from taking risks on new developments.

AI can help reduce this uncertainty by forecasting prices. For example, companies such as Agtools, Agremo and GeoPard offer AI-powered farm decision tools. These tools allow for real-time analysis of price points and market data and present farmers with data on long-term trends that can help optimize production.

This data allows farmers to react to price changes and allows them to plan more strategically. If farmers' economic resilience improves, it increases the likelihood that they can invest in new opportunities and technologies that benefit both farms and the larger food system.
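Commercial tools rely on far richer models, but the core idea of forecasting from price history can be illustrated with something as simple as a moving average. The prices below are made up for the example.

```python
def moving_average_forecast(prices, window=3):
    """Naive forecast: predict the next price as the mean of the last `window` prices."""
    if len(prices) < window:
        raise ValueError("not enough price history")
    return sum(prices[-window:]) / window

# Hypothetical weekly corn prices in dollars per bushel
history = [4.80, 4.95, 5.10, 5.05, 5.20]
forecast = moving_average_forecast(history)
print(round(forecast, 2))  # 5.12
```

Production forecasting systems would layer seasonality, weather, and market signals on top of this kind of baseline, but a simple baseline is the standard yardstick any richer model has to beat.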

AI for good

Human innovation has always produced winners and losers. The dangers of AI are apparent: biased algorithms, data privacy violations and the manipulation of human behavior. However, it is also a technology that has the potential to solve many problems.


These uses for AI in agriculture are a cause for optimism among farmers. If the agriculture industry can promote the utility of these inventions while developing strong and sensible frameworks to minimize harms, AI can help reduce modern agriculture's impact on human health and the environment while helping improve global food security in the 21st century.

Joe Hollis, PhD student in Rural Sociology and Sustainable Agriculture, Iowa State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.



ChatGPT turns 1: AI chatbot’s success says as much about humans as technology




The drama surrounding OpenAI CEO Sam Altman, left − joined on stage here by Microsoft CEO Satya Nadella − has overshadowed the first anniversary of the company's ChatGPT.
AP Photo/Barbara Ortutay

Tim Gorichanaz, Drexel University

ChatGPT was launched on Nov. 30, 2022, ushering in what many have called artificial intelligence's breakout year. Within days of its release, ChatGPT went viral. Screenshots of conversations snowballed across social media, and the use of ChatGPT skyrocketed to an extent that seems to have surprised even its maker, OpenAI. By January, ChatGPT was seeing 13 million unique visitors each day, setting a record for the fastest-growing user base of a consumer application.

Throughout this breakout year, ChatGPT has revealed the power of a good interface and the perils of hype, and it has sown the seeds of a new set of human behaviors. As a researcher who studies technology and human information behavior, I find that ChatGPT's influence in society comes as much from how people view and use it as from the technology itself.

Generative AI systems like ChatGPT are becoming pervasive. Since ChatGPT's release, some mention of AI has seemed obligatory in presentations, conversations and articles. Today, OpenAI claims 100 million people use ChatGPT every week.

Besides people interacting with ChatGPT at home, employees at all levels up to the C-suite in businesses are using the AI chatbot. In tech, generative AI is being called the biggest platform since the iPhone, which debuted in 2007. All the major players are making AI bets, and venture funding in AI startups is booming.


Along the way, ChatGPT has raised numerous concerns, such as its implications for disinformation, fraud, intellectual property issues and discrimination. In my world of higher education, much of the discussion has surrounded cheating, which has become a focus of my own research this year.

Lessons from ChatGPT's first year

The success of ChatGPT speaks foremost to the power of a good interface. AI has already been part of countless everyday products for well over a decade, from Spotify and Netflix to Facebook and Google Maps. The first version of GPT, the AI model that powers ChatGPT, dates back to 2018. And even OpenAI's other products, such as DALL-E, did not make the waves that ChatGPT did immediately upon its release. It was the chat-based interface that set off AI's breakout year.

There is something uniquely beguiling about chat. Humans are endowed with language, and conversation is a primary way people interact with each other and infer intelligence. A chat-based interface is a natural mode for interaction and a way for people to experience the “intelligence” of an AI system. The phenomenal success of ChatGPT shows again that user interfaces drive widespread adoption of technology, from the Macintosh to web browsers and the iPhone. Design makes the difference.

A man wearing glasses looks at a laptop screen, his hands poised over the keyboard, text on the screen
The chat in ChatGPT is just as important as the AI under the hood.
Nicolas Maeterlinck/Belga Mag/AFP via Getty Images

At the same time, one of the technology's principal strengths – generating convincing language – makes it well suited for producing false or misleading information. ChatGPT and other generative AI systems make it easier for criminals and propagandists to prey on human vulnerabilities. The potential of the technology to boost fraud and misinformation is one of the key rationales for regulating AI.

Amid the real promises and perils of generative AI, the technology has also provided another case study in the power of hype. This year has brought no shortage of articles on how AI is going to transform every aspect of society and how the proliferation of the technology is inevitable.


ChatGPT is not the first technology to be hyped as “the next big thing,” but it is perhaps unique in simultaneously being hyped as an existential risk. Numerous tech titans and even some AI researchers have warned about the risk of superintelligent AI systems emerging and wiping out humanity, though I believe that these fears are far-fetched.

The media favors hype, and the current venture funding climate further fuels AI hype in particular. Playing to people's hopes and fears is a recipe for anxiety with none of the ingredients for wise decision making.

What the future may hold

The AI floodgates opened in 2023, but the next year may bring a slowdown. AI is likely to meet technical limitations and encounter infrastructural hurdles such as chip manufacturing and server capacity. Simultaneously, AI regulation is likely to be on the way.

This slowdown should give time for norms in human behavior to form, both in terms of etiquette, as in when and where using ChatGPT is socially acceptable, and effectiveness, like when and where ChatGPT is most useful.


ChatGPT and other generative AI systems will settle into people's workflows, allowing workers to accomplish some tasks faster and with fewer errors. In the same way that people learned “to google” for information, humans will need to learn new practices for working with generative AI tools.

But the outlook for 2024 isn't completely rosy. It is shaping up to be a historic year for elections around the world, and AI-generated content will almost certainly be used to influence public opinion and stoke division. Meta may have banned the use of generative AI in political advertising, but this isn't likely to stop ChatGPT and similar tools from being used to create and spread false or misleading content.

Political misinformation spread across social media in 2016 as well as in 2020, and it is virtually certain that generative AI will be used to continue those efforts in 2024. Even outside social media, conversations with ChatGPT and similar products can be sources of misinformation on their own.

As a result, another lesson that everyone – users of ChatGPT or not – will have to learn in the blockbuster technology's second year is to be vigilant when it comes to digital media of all kinds.

Tim Gorichanaz, Assistant Teaching Professor of Information Science, Drexel University


This article is republished from The Conversation under a Creative Commons license. Read the original article.



MicroRNA is the master regulator of the genome − researchers are learning how to treat disease by harnessing the way it controls genes




RNA is more than just a transitional molecule between DNA and protein.
Kateryna Kon/Science Photo Library via Getty Images

Andrea Kasinski, Purdue University

The Earth formed 4.5 billion years ago, and life arose less than a billion years after that. Although life as we know it is dependent on four major macromolecules – DNA, RNA, proteins and lipids – only one is thought to have been present at the beginning of life: RNA.

It is no surprise that RNA likely came first. It is the only one of those major macromolecules that can both replicate itself and catalyze chemical reactions, both of which are essential for life. Like DNA, RNA is made from individual nucleotides linked into chains. Scientists initially understood that genetic information flows in one direction: DNA is transcribed into RNA, and RNA is translated into proteins. That principle is called the central dogma of molecular biology. But there are many deviations.

One major example of an exception to the central dogma is that some RNAs are never translated or coded into proteins. This fascinating diversion from the central dogma is what led me to dedicate my scientific career to understanding how it works. Indeed, research on RNA has lagged behind the other macromolecules. Although there are multiple classes of these so-called noncoding RNAs, researchers like myself have started to focus a great deal of attention on short stretches of genetic material called microRNAs and their potential to treat various diseases, including cancer.

MicroRNAs play a key role in regulating gene expression.

MicroRNAs and disease

Scientists regard microRNAs as master regulators of the genome due to their ability to bind to and alter the expression of many protein-coding RNAs. Indeed, a single microRNA can regulate anywhere from 10 to 100 protein-coding RNAs. Rather than translating DNA to proteins, they instead can bind to protein-coding RNAs to silence genes.


The reason microRNAs can regulate such a diverse pool of RNAs stems from their ability to bind to target RNAs they don't perfectly match up with. This means a single microRNA can often regulate a pool of targets that are all involved in similar processes in the cell, leading to an enhanced response.
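A rough sketch of why imperfect matching still finds many targets: prediction tools typically look for complementarity to the microRNA's "seed" region (roughly nucleotides 2-8) rather than to the full sequence, so one short seed can match many different mRNAs. The microRNA and gene sequences below are invented for illustration; real tools such as TargetScan also weigh site context and evolutionary conservation.

```python
def reverse_complement(rna):
    """Watson-Crick complement of an RNA string, read in the opposite direction."""
    comp = {"A": "U", "U": "A", "G": "C", "C": "G"}
    return "".join(comp[base] for base in reversed(rna))

def seed_site(mirna):
    """A target site is the reverse complement of the microRNA's seed
    (approximately nucleotides 2-8, counting from the 5' end)."""
    return reverse_complement(mirna[1:8])

def find_targets(mirna, utr_sequences):
    """Return names of mRNAs whose 3' UTR contains a perfect seed match."""
    site = seed_site(mirna)
    return [name for name, utr in utr_sequences.items() if site in utr]

# Invented sequences for illustration only
mirna = "UGCAUACGGUUACGAUCCGAAG"
utr_sequences = {
    "hypothetical_gene_A": "AAUGCCGUAUGCUUAAG",  # contains the seed site
    "hypothetical_gene_B": "GGCAUUACGAUUCCAAG",
}
print(find_targets(mirna, utr_sequences))  # ['hypothetical_gene_A']
```

Because only seven or so bases need to pair, the same seed site recurs across many transcripts, which is the combinatorial basis for one microRNA regulating 10 to 100 protein-coding RNAs.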

Because a single microRNA can regulate multiple genes, many microRNAs can contribute to disease when they become dysfunctional.

In 2002, researchers first identified the role dysfunctional microRNAs play in disease through patients with a type of blood and bone marrow cancer called chronic lymphocytic leukemia. This cancer results from the loss of two microRNAs normally involved in blocking tumor cell growth. Since then, scientists have identified over 2,000 microRNAs in people, many of which are altered in various diseases.

The field has also developed a fairly solid understanding of how microRNA dysfunction contributes to disease. Changing one microRNA can change several other genes, resulting in a plethora of alterations that can collectively reshape the cell's physiology. For example, over half of all cancers have significantly reduced activity in a microRNA called miR-34a. Because miR-34a regulates many genes involved in preventing the growth and migration of cancer cells, losing miR-34a can increase the risk of developing cancer.


Researchers are looking into using microRNAs as therapeutics for cancer, heart disease, neurodegenerative disease and others. While results in the laboratory have been promising, bringing microRNA treatments into the clinic has met multiple challenges. Many are related to inefficient delivery into target cells and poor stability, which limit their effectiveness.

Diagram showing a loop of microRNA binding to a strand of mRNA as it's being translated from DNA
MicroRNA can silence genes by binding to mRNA.
Kajsa Mollersen/Wikimedia Commons, CC BY-SA

Delivering microRNA to cells

One reason why delivering microRNA treatments into cells is difficult is because microRNA treatments need to be delivered specifically to diseased cells while avoiding healthy cells. Unlike mRNA COVID-19 vaccines that are taken up by scavenging immune cells whose job is to detect foreign materials, microRNA treatments need to fool the body into thinking they aren't foreign in order to avoid immune attack and get to their intended cells.

Scientists are studying various ways to deliver microRNA treatments to their specific target cells. One method garnering a great deal of attention relies on directly linking the microRNA to a ligand, a kind of small molecule that binds to specific proteins on the surface of cells. Compared with healthy cells, diseased cells can have a disproportionate number of some surface proteins, or receptors. So, ligands can guide microRNAs specifically to diseased cells while avoiding healthy cells. The first ligand approved by the U.S. Food and Drug Administration to deliver small RNAs like microRNAs, N-acetylgalactosamine, or GalNAc, preferentially delivers RNAs to liver cells.

Identifying ligands that can deliver small RNAs to other cells requires finding receptors expressed at high enough levels on the surface of target cells. Typically, over one million copies per cell are needed in order to achieve sufficient delivery of the drug.

One ligand that stands out is folate, also referred to as vitamin B9, a small molecule critical during periods of rapid cell growth such as fetal development. Because some tumor cells have over one million folate receptors, this ligand provides a sufficient number of receptors to deliver enough of a therapeutic RNA to target different types of cancer. For example, my laboratory developed a new molecule called FolamiR-34a – folate linked to miR-34a – that reduced the size of breast and lung cancer tumors in mice.

Microscopy image juxtaposing endothelial cells sprouting extensions to form new blood vessels and a cell bathed in microRNA unable to sprout
Tumors can exploit healthy cells to grow blood vessels that supply them with nutrients, as seen in the endothelial cells to the left sprouting extensions. Exposing these cells to certain microRNAs, however, can disable that growth, as seen in the cell to the right.
Dudley Lab, University of Virginia School of Medicine/NIH via Flickr, CC BY-NC

Making microRNAs more stable

One of the other challenges with using small RNAs is their poor stability, which leads to their rapid degradation. As such, RNA-based treatments are generally short-lived in the body and require frequent doses to maintain a therapeutic effect.

To overcome this hurdle, researchers are modifying small RNAs in various ways. While each RNA requires a specific modification pattern, successful changes can significantly increase their stability. This reduces the need for frequent dosing, subsequently decreasing treatment burden and cost.

For example, modified GalNAc-siRNAs, another form of small RNAs, reduce dosing from every few days to once every six months in nondividing cells. My team developed folate ligands linked to modified microRNAs for cancer treatment that reduced dosing from once every other day to once a week. For diseases like cancer, where cells are rapidly dividing and quickly diluting the delivered microRNA, this increase in activity is a significant advancement in the field. We anticipate this accomplishment will facilitate further development of this folate-linked microRNA as a cancer treatment in the years to come.

While there is still considerable work to be done to overcome the hurdles associated with microRNA treatments, it's clear that RNA shows promise as a therapeutic for many diseases.

Andrea Kasinski, Associate Professor of Biological Sciences, Purdue University

This article is republished from The Conversation under a Creative Commons license. Read the original article.



