
The Conversation

Deepfake detection improves when using algorithms that are more aware of demographic diversity



theconversation.com – Siwei Lyu, Professor of Computer Science and Engineering; Director, UB Media Forensic Lab, University at Buffalo – 2024-04-16 07:31:48
Deepfake detection software may unfairly target people from some groups.
JLco – Ana Suanes/iStock via Getty Images

Siwei Lyu, University at Buffalo and Yan Ju, University at Buffalo

Deepfakes – essentially putting words in someone else's mouth in a very believable way – are becoming more sophisticated by the day and increasingly hard to spot. Recent examples of deepfakes include Taylor Swift nude images, an audio recording of President Joe Biden telling New Hampshire not to vote, and a video of Ukrainian President Volodymyr Zelenskyy calling on his troops to lay down their arms.

Although companies have created detectors to spot deepfakes, studies have found that biases in the data used to train these tools can lead to certain demographic groups being unfairly targeted.

A deepfake of Ukraine President Volodymyr Zelensky in 2022 purported to show him calling on his troops to lay down their arms.
Olivier Douliery/AFP via Getty Images

My team and I discovered new methods that improve both the fairness and the accuracy of the algorithms used to detect deepfakes.

To do so, we used a large dataset of facial forgeries that lets researchers like us train our deep-learning approaches. We built our work around the state-of-the-art Xception detection algorithm, which is a widely used foundation for deepfake detection and can detect deepfakes with an accuracy of 91.5%.


We created two separate deepfake detection methods intended to encourage fairness.

One was focused on making the algorithm more aware of demographic diversity by labeling datasets by gender and race to minimize errors among underrepresented groups.

The other aimed to improve fairness without relying on demographic labels by focusing instead on features not visible to the human eye.

It turns out the first method worked best. It increased accuracy rates from the 91.5% baseline to 94.17%, which was a bigger increase than our second method as well as several others we tested. Moreover, it increased accuracy while enhancing fairness, which was our main focus.
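As a rough sketch of how demographic labels can raise fairness in training (this is an illustrative reweighting scheme under our own simplifying assumptions, not the exact procedure used in the research): each sample's loss is weighted so that every labeled group contributes equally to the total, however few examples it has.

```python
import numpy as np

# Illustrative sketch: reweight per-sample losses so each demographic group
# contributes equally to training, preventing the majority group from
# dominating the learning signal. Names and numbers here are hypothetical.
def group_balanced_weights(group_labels):
    groups, counts = np.unique(group_labels, return_counts=True)
    per_group = {g: 1.0 / (len(groups) * c) for g, c in zip(groups, counts)}
    return np.array([per_group[g] for g in group_labels])

per_sample_loss = np.array([0.3, 0.1, 0.7, 0.2])               # hypothetical losses
demographics = np.array(["group_a", "group_a", "group_a", "group_b"])

weights = group_balanced_weights(demographics)
balanced_loss = float(np.sum(weights * per_sample_loss))
# Each group now carries equal total weight (0.5 each), so the single
# "group_b" sample counts as much as all three "group_a" samples combined.
```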


We believe fairness and accuracy are crucial if the public is to accept artificial intelligence technology. When large language models like ChatGPT “hallucinate,” they can perpetuate erroneous information. This affects public trust and safety.

Likewise, deepfake images and videos can undermine the adoption of AI if they cannot be quickly and accurately detected. Improving the fairness of these detection algorithms so that certain demographic groups aren't disproportionately harmed by them is a key aspect of this.

Our research addresses deepfake detection algorithms' fairness, rather than just attempting to balance the data. It offers a new approach to algorithm design that considers demographic fairness as a core aspect.

Siwei Lyu, Professor of Computer Science and Engineering; Director, UB Media Forensic Lab, University at Buffalo and Yan Ju, Ph.D. Candidate in Computer Science and Engineering, University at Buffalo

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The Conversation

Animals self-medicate with plants − behavior people have observed and emulated for millennia



theconversation.com – Adrienne Mayor, Research Scholar, Classics and History and Philosophy of Science, Stanford – 2024-05-24 07:29:22

A goat with an arrow wound nibbles the medicinal herb dittany.

O. Dapper, CC BY

Adrienne Mayor, Stanford University

When a wild orangutan in Sumatra recently suffered a facial wound, apparently after fighting with another male, he did something that caught the attention of the scientists observing him.


The animal chewed the leaves of a liana vine – a plant not normally eaten by apes. Over several days, the orangutan carefully applied the juice to its wound, then covered it with a paste of chewed-up liana. The wound healed with only a faint scar. The tropical plant he selected has antibacterial and antioxidant properties and is known to alleviate pain, fever, bleeding and inflammation.

The striking story was picked up by media worldwide. In interviews and in their research paper, the scientists stated that this is “the first systematically documented case of active wound treatment by a wild animal” with a biologically active plant. The discovery will “provide new insights into the origins of human wound care.”


Fibraurea tinctoria leaves and the orangutan chomping on some of the leaves.

Laumer et al, Sci Rep 14, 8932 (2024), CC BY

To me, the behavior of the orangutan sounded familiar. As a historian of ancient science who investigates what Greeks and Romans knew about plants and animals, I was reminded of similar cases reported by Aristotle, Pliny the Elder, Aelian and other naturalists from antiquity. A remarkable body of accounts from ancient to medieval times describes self-medication by many different animals. The animals used plants to treat illness, repel parasites, neutralize poisons and heal wounds.


The term zoopharmacognosy – “animal medicine knowledge” – was invented in 1987. But as the Roman natural historian Pliny pointed out 2,000 years ago, many animals have made medical discoveries useful for humans. Indeed, a large number of medicinal plants used in modern medicine were first discovered by Indigenous peoples and past cultures who observed animals employing plants and emulated them.

What you can learn by watching animals

Some of the earliest written examples of animal self-medication appear in Aristotle's “History of Animals” from the fourth century BCE, such as the well-known habit of dogs to eat grass when ill, probably for purging and deworming.

Aristotle also noted that after hibernation, bears seek wild garlic as their first food. It is rich in vitamin C, iron and magnesium, healthful nutrients after a long winter's nap. The Latin name reflects this folk belief: Allium ursinum translates to “bear lily,” and the common name in many other languages refers to bears.


As a hunter lands several arrows in his quarry, a wounded doe nibbles some growing dittany.

British Library, Harley MS 4751 (Harley Bestiary), folio 14v, CC BY


Pliny explained how the use of dittany, also known as wild oregano, to treat arrow wounds arose from watching wounded stags grazing on the herb. Aristotle and Dioscorides credited wild goats with the discovery. Vergil, Cicero, Plutarch, Solinus, Celsus and Galen claimed that dittany has the ability to expel an arrowhead and close the wound. Among dittany's many known phytochemical properties are antiseptic, anti-inflammatory and coagulating effects.

According to Pliny, deer also knew an antidote for toxic plants: wild artichokes. The leaves relieve nausea and stomach cramps and protect the liver. To cure themselves of spider bites, Pliny wrote, deer ate crabs washed up on the beach, and sick goats did the same. Notably, crab shells contain chitosan, which boosts the immune system.

When elephants accidentally swallowed chameleons hidden on green foliage, they ate olive leaves, a natural antibiotic to combat salmonella harbored by lizards. Pliny said ravens eat chameleons, but then ingest bay leaves to counter the lizards' toxicity. Antibacterial bay leaves relieve diarrhea and gastrointestinal distress. Pliny noted that blackbirds, partridges, jays and pigeons also eat bay leaves for digestive problems.


A weasel wears a belt of rue as it attacks a basilisk in an illustration from a 1600s bestiary.

Wenceslaus Hollar/Wikimedia Commons, CC BY


Weasels were said to roll in the evergreen plant rue to counter wounds and snakebites. Fresh rue is toxic. Its medical value is unclear, but the dried plant is included in many traditional folk medicines. Swallows collect another toxic plant, celandine, to make a poultice for their chicks' eyes. Snakes emerging from hibernation rub their eyes on fennel. Fennel bulbs contain compounds that promote tissue repair and immunity.

According to the naturalist Aelian, who lived in the third century CE, the Egyptians traced much of their medical knowledge to the wisdom of animals. Aelian described elephants treating spear wounds with olive flowers and oil. He also mentioned storks, partridges and turtledoves crushing oregano leaves and applying the paste to wounds.

The study of animals' remedies continued in the Middle Ages. An example from the 12th-century English compendium of animal lore, the Aberdeen Bestiary, tells of bears coating sores with mullein. Folk medicine prescribes this flowering plant to soothe pain and heal burns and wounds, thanks to its anti-inflammatory chemicals.

Ibn al-Durayhim's 14th-century manuscript “The Usefulness of Animals” reported that swallows healed nestlings' eyes with turmeric, another anti-inflammatory. He also noted that wild goats chew and apply sphagnum moss to wounds, just as the Sumatran orangutan did with liana. Sphagnum moss dressings neutralize bacteria and combat infection.


Nature's pharmacopoeia

Of course, these premodern observations were folk knowledge, not formal science. But the stories reveal long-term observation and imitation of diverse animal species self-doctoring with bioactive plants. Just as traditional Indigenous ethnobotany is leading to lifesaving drugs today, scientific testing of the ancient and medieval claims could lead to discoveries of new therapeutic plants.

Animal self-medication has become a rapidly growing scientific discipline. Observers have documented animals, from birds and rats to porcupines and chimpanzees, deliberately employing an impressive repertoire of medicinal substances. One surprising observation is that finches and sparrows collect cigarette butts. The nicotine kills mites in bird nests. Some veterinarians even allow ailing dogs, horses and other domestic animals to choose their own prescriptions by sniffing various botanical compounds.

Mysteries remain. No one knows how animals sense which plants cure sickness, heal wounds, repel parasites or otherwise promote health. Are they intentionally responding to particular health crises? And how is their knowledge transmitted? What we do know is that we humans have been learning healing secrets by watching animals self-medicate for millennia.

Adrienne Mayor, Research Scholar, Classics and History and Philosophy of Science, Stanford University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The Conversation

What Philadelphians need to know about the city’s 7,000-camera surveillance system



theconversation.com – Albert Fox Cahn, Practitioner-in-Residence, Information Law Institute, New York University – 2024-05-24 07:28:38

Surveillance cameras are getting cheaper, more powerful and more ubiquitous.

Denniro/iStock via Getty Images Plus

Albert Fox Cahn, New York University

The Philadelphia Inquirer recently investigated Philadelphia's use of what it described as a “little-scrutinized, 7,000-camera system that is exposing residents across the city to heightened surveillance with few rules or safeguards against abuse.” The article detailed how Philadelphia narcotics cops not only allegedly failed to disclose their use of video surveillance in arrest reports or to prosecutors, but also that the video footage at times proved officers were lying when they testified.


The Conversation U.S. talked to Albert Fox Cahn, founder and executive director of the nonprofit Surveillance Technology Oversight Project and a practitioner-in-residence at NYU School of Law, about what these new video surveillance systems can do and the privacy and other issues they raise.

What can these cameras do?

The closed-circuit television, or CCTV, cameras most Americans pass each day may look interchangeable, but a lot has changed behind the lens in recent years. As video surveillance cameras have become cheaper and more ubiquitous, they have also grown more powerful – featuring increasingly high-definition images and the ability to pan, tilt and zoom. But the most significant change to cameras like those used in Philadelphia is the networks that police departments set up to aggregate these countless images of city residents' daily lives.

A variety of AI tools can also harvest this data in new ways that some may find alarming.

Automated license plate reader software can both track drivers across the city in real time and create a long-term log of their cars' movements. Want to know where a driver is now or was parked two years ago? Just check the database.
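As a hypothetical sketch of why such a database makes those lookups trivial: every camera read appends one row to a driver's log, and both the real-time and historical questions become a single query. All names, plates and locations below are invented for illustration.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative plate-read log: plate -> list of (timestamp, camera location).
# A real ALPR system is far larger, but the query pattern is this simple.
sightings = defaultdict(list)

def record_read(plate, when, where):
    """Append one camera read to the plate's movement log."""
    sightings[plate].append((when, where))

# A few invented reads for one car, years apart:
record_read("ABC1234", datetime(2022, 6, 3, 9, 15), "8th & Market")
record_read("ABC1234", datetime(2024, 5, 20, 17, 40), "Broad & Spring Garden")

def movement_history(plate):
    """Return the full log for a plate, oldest read first."""
    return sorted(sightings[plate])

# "Where is this driver now?" is just the last entry in the log.
current_location = movement_history("ABC1234")[-1][1]
```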


And pedestrians are no less prone to surveillance. Facial recognition software can scan images to automatically identify individuals and track them across the city.

How widespread is this technology?

According to the Inquirer's investigation, Philadelphia's camera network grew at an astounding pace. In the past decade, the city has gone from 216 cameras to a network of more than 7,000 cameras operated by police and transportation agencies.

But those are just the cameras that city officials directly control and can access in real time.

In addition, police routinely turn to the images captured by private surveillance cameras. This includes everything from multimillion-dollar, internet-enabled camera systems at large stores, offices and universities to the individual cameras that homeowners or small-business owners screw into their door frames or exteriors. The public simply has no idea how many of these private cameras are in operation or how often their data is requested.


How is this different from traditional police video surveillance?

Traditional cameras offered a narrow, grainy perspective on a single fixed place. These systems not only collected much less data than contemporary cameras, but they also retained far less.

A single CCTV camera at a bank might help police identify a suspect in a robbery, but it poses no privacy threat beyond that. It is confined to a small space where privacy concerns are minimal and security concerns are high. But mass camera deployments create a fundamentally different model, collecting far more information on all of us and creating far greater potential for misuse.

Police have attempted these techniques for decades, but the technology simply wasn't up to the task. When the City of London Police deployed its so-called “ring of steel” security system in the 1990s, fewer than two dozen cameras tried to track the cars entering a tiny portion of the British capital, surveilling roughly a square kilometer of the city's financial core. Officers manually jotted down vehicle plate numbers and surveilled drivers' profile photos.

The labor-intensive exercise was impossible to scale.


To deploy such a system across an entire city would likely have taken every police officer in the city and then some. Through automation, technology enables this mass surveillance by reducing the marginal cost of tracking, allowing police to expand monitoring far more broadly than would have been financially or pragmatically possible before.


Security cameras hang from the elevated train tracks at Kensington and Allegheny avenues in North Philadelphia.

Spencer Platt/Getty Images

What privacy concerns does it raise?

A single camera can capture our image; a citywide camera system can reconstruct our lives. Networked camera systems like those in Philadelphia, when combined with smartphones and other internet-enabled devices, allow officers to reconstruct an individual's movements for days or weeks at a time, all without any court oversight.

While it would take a warrant to install a GPS tracker on a suspect's car, police can recreate GPS-like location tracking without a warrant, all thanks to mass camera systems. And facial recognition in municipal cameras threatens the First Amendment, which protects freedom of speech, religion and peaceful assembly. The police are armed with a way to track nearly every person at a political protest, clinic or house of worship. Such surveillance melts away the anonymity that is indispensable to an open society.


Are there other risks or unintended consequences?

I believe giving thousands of city employees the keys to a small surveillance state is a recipe for disaster.

The Philadelphia Inquirer found that the city has policies that forbid zooming in on residents for amusement, spying on someone by zooming in through their window, or blatant racial profiling. But what it didn't find was evidence that these safeguards were being enforced.

When thousands of employees can spy on their neighbors, romantic partners and business rivals on a whim, it raises the question: Who watches the watchers?

At least for now, the grim answer appears to be no one.

Albert Fox Cahn, Practitioner-in-Residence, Information Law Institute, New York University


This article is republished from The Conversation under a Creative Commons license. Read the original article.


The Conversation

Phone cameras can take in more light than the human eye − that’s why low-light events like the northern lights often look better through your phone camera



theconversation.com – Douglas Goodwin, Visiting Assistant Professor in Media Studies, Scripps College – 2024-05-23 07:29:41

A May 2024 solar storm made the northern lights visible across parts of the northern U.S.

AP Photo/Lindsey Wasson

Douglas Goodwin, Scripps College

Smartphone cameras have significantly improved in recent years. Computational photography and AI allow these devices to capture stunning images that can surpass what we see with the naked eye. Photos of the northern lights, or aurora borealis, are one particularly striking example.


If you saw the northern lights during the geomagnetic storms in May 2024, you might have noticed that your smartphone made the photos look even more vivid than reality.

Auroras, known as the northern lights (aurora borealis) or southern lights (aurora australis), occur when the solar wind disturbs Earth's magnetic field. They appear as streaks of color across the sky.


The left side shows the aurora as seen with the naked eye. The right side reveals how a smartphone camera can capture brighter and more colorful lights.

Douglas Goodwin

What makes photos of these lights even more striking than they appear to the eye? As a professor of computational photography, I've seen how the latest smartphone features overcome the limitations of human vision.


Your eyes in the dark

Human eyes are remarkable. They allow you to see footprints in a sun-soaked desert and pilot vehicles at high speeds. However, your eyes perform less impressively in low light.

Human eyes contain two types of cells that respond to light – rods and cones. Rods are numerous and much more sensitive to light. Cones handle color but need more light to function. As a result, at night our vision relies heavily on rods and misses color.


Rods and cones in your eyes are photoreceptors that detect black and white as well as color.

Blume, C., Garbazza, C. & Spitschan, M., CC BY-SA

The result is like wearing dark sunglasses: at night, colors appear washed out and muted. Similarly, under a night sky, the vibrant hues of the aurora are present but often too dim for your eyes to see clearly.


In low light, your brain prioritizes motion detection and shape recognition to help you navigate. This trade-off means the ethereal colors of the aurora are often invisible to the naked eye. Technology is the only way to increase their brightness.

Taking the perfect picture

Smartphones have revolutionized how people capture the world. These compact devices use multiple cameras and advanced sensors to gather more light than the human eye can, even in low-light conditions. They achieve this through longer exposure times – how long the camera takes in light – larger apertures and higher ISO settings, which amplify the sensor's response to light.
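These three settings trade off against one another, and photographers summarize their combined effect with the exposure value (EV). A minimal sketch of the standard formula, with the sample f-numbers, shutter times and ISO values chosen purely for illustration:

```python
import math

# Exposure value: EV = log2(N^2 / t), where N is the f-number (aperture)
# and t the shutter time in seconds. Subtracting log2(ISO / 100) gives the
# scene brightness (referenced to ISO 100) that the settings are suited to:
# higher ISO lets the same aperture and shutter handle a dimmer scene.
def exposure_value(f_number, shutter_seconds, iso=100):
    return math.log2(f_number**2 / shutter_seconds) - math.log2(iso / 100)

# f/1.0 at 1 second, ISO 100 is the reference point: EV 0.
assert exposure_value(1.0, 1.0) == 0.0

# A night-mode-style shot (wide aperture, long shutter, high ISO) suits a
# far dimmer scene than a daylight snapshot.
night = exposure_value(1.8, 1 / 2, 3200)   # roughly EV -2.3
day = exposure_value(8.0, 1 / 500, 100)    # roughly EV 15
```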

But smartphones do more than adjust these settings. They also leverage computational photography to enhance your images using digital techniques and algorithms. Image stabilization reduces the camera's shakiness, and exposure settings optimize the amount of light the camera captures.

Multi-image processing creates the perfect shot by stacking multiple images together. A setting called night mode can balance colors in low light, while LiDAR capabilities in some phones keep your images in precise focus.



Image stacking involves aligning and combining several noisy photos to enhance the final image's quality. Averaging these images together suppresses random sensor noise. This results in a clearer and more detailed picture than any of the photos alone.

Douglas Goodwin
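The averaging step behind image stacking can be sketched in a few lines. This toy example, with an invented uniform "scene" and synthetic Gaussian noise, shows why combining 16 frames cuts random sensor noise by roughly a factor of four (the square root of 16):

```python
import numpy as np

# Minimal sketch of image stacking: average N noisy exposures of the same
# scene. Independent random noise partially cancels, shrinking by ~sqrt(N).
rng = np.random.default_rng(seed=42)
scene = np.full((64, 64), 0.5)   # idealized "true" image (flat gray)

# 16 simulated exposures, each corrupted by sensor noise of std 0.2:
frames = [scene + rng.normal(0.0, 0.2, scene.shape) for _ in range(16)]
stacked = np.mean(frames, axis=0)

single_noise = (frames[0] - scene).std()   # about 0.2
stacked_noise = (stacked - scene).std()    # about 0.2 / 4 = 0.05
```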

LiDAR stands for light detection and ranging, and phones with this setting emit laser pulses to calculate the distances to objects in the scene quickly in any kind of light. LiDAR generates a depth map of the scene to improve focus and make objects in your photos stand out.


Smartphone cameras don't just capture flat images – they collect depth information too. The left side shows a regular photo, while the right side illustrates the depth map, with lighter pixels closer to the camera and darker ones farther away. Normally hidden, this depth data enables smartphones to apply effects such as artificial background blur to mimic the look of the northern lights against a night sky.

Douglas Goodwin

Artificial intelligence tools in your smartphone camera can further enhance your photos by optimizing the settings, applying bursts of light and using super-resolution techniques to get really fine detail. They can even identify faces in your photos.


AI processing in your smartphone's camera

While there's plenty you can do with a smartphone camera, regular cameras do have larger sensors and superior optics, providing more control over the images you take. Camera manufacturers like Nikon, Sony and Canon typically avoid tampering with the image, instead letting the photographer take creative control.

These cameras offer photographers the flexibility of shooting in raw format, which allows you to keep more of each image's data for editing and often produces higher-quality results.

Unlike dedicated cameras, modern smartphone cameras use AI while and after you snap a picture to enhance your photos' quality. While you're taking a photo, AI tools will analyze the scene you're pointing the camera at and adjust settings such as exposure, white balance and ISO, while recognizing the subject you're shooting and stabilizing the image. These adjustments make sure you get a great photo when you hit the button.

You can often find features that use AI such as high dynamic range, night mode and portrait mode, enabled by default or accessible within your camera settings.


AI algorithms further enhance your photos by refining details, reducing blur and applying effects such as color correction after you take the photo.

All these features help your camera take photos in low-light conditions and contributed to the stunning aurora photos you may have captured with your phone camera.

While the human eye struggles to fully appreciate the northern lights' otherworldly hues at night, modern smartphone cameras overcome this limitation. By leveraging AI and computational photography techniques, your devices allow you to see the bold colors of solar storms in the atmosphere, boosting color and capturing otherwise invisible details that even the keenest eye will miss.

Douglas Goodwin, Visiting Assistant Professor in Media Studies, Scripps College

This article is republished from The Conversation under a Creative Commons license. Read the original article.

