Tag Archives: eBird (Cornell)

Can citizen science be trusted? Yes, it can!

Caption: Western tanagers are migratory birds that are present in Northern California in the spring and again in late summer. A new study shows that observations by ‘citizen scientists’ using apps such as iNaturalist and eBird accurately reflect bird migrations and therefore can be used in scientific studies. Credit: Jonathan Eisen, UC Davis

Having heard a scientist express doubts about the accuracy of citizen science data during an online UNESCO (United Nations Educational, Scientific and Cultural Organization) press briefing about its 2024 Water Report, I took a special interest in this study.

An April 15, 2025 University of California at Davis (UC Davis) news release (also on EurekAlert) by Liana Wait announces a study on the accuracy of ecological data gathered by citizen scientists, Note: Links have been removed,

Platforms such as iNaturalist and eBird encourage people to observe and document nature, but how accurate is the ecological data that they collect?

In a new study published in Citizen Science: Theory and Practice March 28 [2025], researchers from the University of California, Davis, show that citizen science data from iNaturalist and eBird can reliably capture known seasonal patterns of bird migration in Northern California and Nevada — from year-round residents such as California Scrub-Jays, to transient migrants such as the Western Tanager and the Pectoral Sandpiper.

“This project shows that data from participatory science projects with different goals, observers and structure can be combined into reliable and robust datasets to address broad scientific questions,” said senior author Laci Gerhart, associate professor of teaching in the UC Davis Department of Evolution and Ecology. “Contributors to multiple, smaller projects can help make real discoveries about bigger issues.”

Wild Davis research

The study began as a student capstone project in Gerhart’s Wild Davis field course, which teaches students about urban ecology and California ecosystems. First author Cody Carroll, now an assistant professor at the University of San Francisco, took the course in 2020 while completing his doctorate in statistics at UC Davis.

Most Wild Davis capstone projects are focused on community service at the Stebbins Cold Canyon Nature Reserve, but students were restricted to computer-based projects during the COVID-19 shutdown, so Carroll decided to use his statistical expertise to analyze data from iNaturalist.

After Carroll graduated and began working at USF, the team regrouped and took the project a step further by combining the iNaturalist data with data from eBird, a different citizen science platform that is preferred by bird enthusiasts with significant birding experience.

Merging iNaturalist and eBird

Since iNaturalist and eBird differ substantially in the type of data they collect and the type of user they appeal to, the team wanted to investigate whether their data could be integrated.

“eBird is more geared toward trained and very active birders who are doing complete record keeping of the birds that they’re seeing in particular areas,” said Gerhart. “iNaturalist is intentionally geared toward more casual observers who are there as much to learn about the organisms as they are to document them scientifically.”

To merge the data, Carroll considered the relative frequency of observations rather than the overall number of observations and also took into account the cyclic, seasonal nature of bird migrations.
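The news release doesn't give the methodological details, but the core idea of comparing relative frequencies of sightings rather than raw counts can be sketched in a few lines of Python. Everything below (the toy sighting data, the function names, the simple Pearson comparison) is an illustration of the general approach, not the authors' actual statistical method, which treats the cyclic, wrap-around nature of the calendar more formally:

```python
from collections import Counter
import math

def weekly_relative_frequency(observation_weeks, n_weeks=52):
    """Convert a species' raw sighting counts per week into relative
    frequencies, so platforms with very different overall activity
    levels (eBird vs. iNaturalist) can be compared on the same scale."""
    counts = Counter(observation_weeks)          # week -> sighting count
    total = sum(counts.values())
    return [counts.get(w, 0) / total for w in range(n_weeks)]

def seasonal_correlation(a, b):
    """Pearson correlation of two seasonal curves sampled on the same
    52-week calendar: a crude stand-in for the paper's comparison of
    seasonality patterns across platforms."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

# Toy data: a transient migrant seen in late spring and again in
# late summer on both platforms, with different observer effort.
inat = [18, 19, 20, 33, 34, 35, 35]        # weeks with iNaturalist sightings
ebird = [18, 18, 19, 20, 21, 33, 34, 35]   # weeks with eBird sightings
r = seasonal_correlation(weekly_relative_frequency(inat),
                         weekly_relative_frequency(ebird))
print(round(r, 2))
```

Normalizing to relative frequencies is what lets a platform with a handful of casual observers be compared against one with thousands of dedicated birders: only the shape of the seasonal curve matters, not its height.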

Overall, the researchers compared data for 254 different bird species that were observed in Northern California and Nevada in 2019 and 2022. They found that the two platforms showed similar seasonal patterns for over 97% of bird species.

An assortment of seasonal bird patterns

To “ground truth” their findings, Gerhart and Carroll teamed up with Rob Furrow, an assistant professor of teaching in the Department of Wildlife, Fish and Conservation Biology, who is an avid bird watcher and eBird user.

“We wanted to test whether we were seeing actual migratory patterns or whether these were just due to biases in the observations, so we reached out to Rob, who is an expert about birds,” said Gerhart.

With Furrow’s expertise, the team showed that the combined iNaturalist and eBird data recapitulated a variety of known bird seasonality patterns within the region — meaning that the patterns were representative of actual bird presence, not due to biases in the observations.

For example, their data showed that California Scrub-Jays are present in the region year-round, whereas Bufflehead ducks arrive in mid-fall and depart in early spring. Western Tanagers pass through in late spring as they journey north to breed, and again in late summer as they fly back south for the winter.

“We were really pleasantly surprised that we could still get reliable data, despite the differences between eBird and iNaturalist,” said Furrow. “Even when you’re relying on casual hobbyists who are taking photos of what they like, when they like, you’re still getting a reliable representation of the birds in that area at that time.”

The power of publicly generated data

The study shows that in addition to inspiring people to connect with nature, platforms such as iNaturalist and eBird can help answer important biological questions.

“This is a good example of why interdisciplinarity is important — we each brought different knowledge to this project, and it pushed each of us intellectually,” said Gerhart. “It was a really fun experience for us to combine our skill sets, and I hope that Cody, Rob and I have a chance to work together again.”

To give back to the people who helped collect the data they used, the team made a point of publishing their results in an open access journal. Carroll also created a dashboard, in collaboration with a student at USF, that allows people to explore and visualize the seasonality patterns for all 254 bird species.

“It’s important for scientists who are relying on publicly generated data to make sure that their results are also publicly available,” said Gerhart.

Here’s a link to and a citation for the paper,

Consistency and Validity of Participatory Science Data: A Comparison of Seasonality Patterns of Northern California and Nevada Birds Across eBird and iNaturalist by Cody Carroll, Robert E. Furrow and Laci M. Gerhart. Citizen Science: Theory and Practice, Volume 10, Issue 1, Article 11 (2025). DOI: 10.5334/cstp.825 Published on March 28, 2025 (Creative Commons Licence: CC Attribution 4.0)

This paper is open access.

You can find the NorCal Bird Dashboard here.

Squirrel observations in St. Louis: a story of bias in citizen science data

Squirrels and other members of the family Sciuridae. Credit: Chicoutimi (montage), Karakal, AndiW, National Park Service, en:User:Markus Krötzsch, The Lilac Breasted Roller, Nico Conradie from Centurion, South Africa, Hans Hillewaert, Sylvouille, National Park Service – own work, from Wikipedia/CC BY 3.0 licence

A March 5, 2024 news item on phys.org introduces a story about squirrels, bias, and citizen science,

When biologist Elizabeth Carlen pulled up in her 2007 Subaru for her first look around St. Louis, she was already checking for the squirrels. Arriving as a newcomer from New York City, Carlen had scrolled through maps and lists of recent sightings in a digital application called iNaturalist. This app is a popular tool for reporting and sharing sightings of animals and plants.

People often start using apps like iNaturalist and eBird when they get interested in a contributory science project (also sometimes called a citizen science project). Armed with cellphones equipped with cameras and GPS, app-wielding volunteers can submit geolocated data that iNaturalist then translates into user-friendly maps. Collectively, these observations have provided scientists and community members greater insight into the biodiversity of their local environment and helped scientists understand trends in climate change, adaptation and species distribution.

But right away, Carlen ran into problems with the iNaturalist data in St. Louis.

A March 5, 2024 Washington University in St. Louis news release (also on EurekAlert) by Talia Ogliore, which originated the news item, describes the bias problem and the research it inspired, Note: Links have been removed,

“According to the app, Eastern gray squirrels tended to be mostly spotted in the south part of the city,” said Carlen, a postdoctoral fellow with the Living Earth Collaborative at Washington University in St. Louis. “That seemed weird to me, especially because the trees, or canopy cover, tended to be pretty even across the city.

“I wondered what was going on. Were there really no squirrels in the northern part of the city?” Carlen said. A cursory drive through a few parks and back alleys north of Delmar Boulevard told her otherwise: squirrels galore.

Carlen took to X, formerly Twitter, for advice. “Squirrels are abundant in the northern part of the city, but there are no recorded observations,” she mused. Carlen asked if others had experienced similar issues with iNaturalist data in their own backyards.

Many people responded, voicing their concerns and affirming Carlen’s experience. The maps on iNaturalist seemed clear, but they did not reflect the way squirrels were actually distributed across St. Louis. Instead, Carlen was looking at biased data.

Previous research has highlighted biases in data reported to contributory science platforms, but little work has articulated how these biases arise.

Carlen reached out to the scientists who responded to her Twitter post to brainstorm some ideas. They put together a framework that illustrates how social and ecological factors combine to create bias in contributory data. In a new paper published in People & Nature, Carlen and her co-authors shared this framework and offered some recommendations to help address the problems.

The scientists described four kinds of “filters” that can bias the reported species pool in contributory science projects:

* Participation filter. Participation reflects who is reporting the data, including where those people are located and the areas they have access to. This filter also may reflect whether individuals in a community are aware of an effort to collect data, or if they have the means and motivation to collect it.

* Detectability filter. An animal’s biology and behavior can impact whether people record it. For example, people are less likely to report sightings of owls or other nocturnal species.

* Sampling filter. People might be more willing to report animals they see while recreating (e.g. hanging out in a park) than animals they see while commuting.

* Preference filter. People tend to ignore or filter out pests, nuisance species and uncharismatic or “boring” species. (“There’s not a lot of people photographing rats and putting them on iNaturalist — or pigeons, for that matter,” Carlen said.)

In the paper, Carlen and her team applied their framework to data recorded in St. Louis as a case study. They showed that eBird and iNaturalist observations are concentrated in the southern part of the city, where more white people live. Uneven participation in St. Louis is likely a consequence of variables such as race, income and/or contemporary politics, which differ between the northern and southern parts of the city, the authors wrote. The other filters of detectability, sampling and preference also likely influence species reporting in St. Louis.

Biased and unrepresentative data is not just a problem for urban ecologists, even if they are the ones who are most likely to notice it, Carlen said. City planners, environmental consultants and local nonprofits all sometimes use contributory science data in their work.

“We need to be very conscious about how we’re using this data and how we’re interpreting where animals are,” Carlen said.

Carlen shared several recommendations for researchers and institutions that want to improve contributory science efforts and help reduce bias. Basic steps include considering cultural relevance when designing a project, conducting proactive outreach with diverse stakeholders and translating project materials into multiple languages.

Data and conclusions drawn from contributory projects should be made publicly available, communicated in accessible formats and made relevant to participants and community members.

“It’s important that we work with communities to understand what their needs are — and then build a better partnership,” Carlen said. “We can’t just show residents the app and tell them that they need to use it, because that ignores the underlying problem that our society is still segregated and not everyone has the resources to participate.

“We need to build relationships with the community and understand what they want to know about the wildlife in their neighborhood,” Carlen said. “Then we can design projects that address those questions, provide resources and actively empower community members to contribute to data collection.”

Here’s a link to and a citation for the paper,

A framework for contextualizing social-ecological biases in contributory science data by Elizabeth J. Carlen, Cesar O. Estien, Tal Caspi, Deja Perkins, Benjamin R. Goldstein, Samantha E. S. Kreling, Yasmine Hentati, Tyus D. Williams, Lauren A. Stanton, Simone Des Roches, Rebecca F. Johnson, Alison N. Young, Caren B. Cooper and Christopher J. Schell. People & Nature, Volume 6, Issue 2, April 2024, Pages 377-390. DOI: https://doi.org/10.1002/pan3.10592 First published: 03 March 2024

This paper is open access.

Ever heard a bird singing and wondered what kind of bird?

The Cornell University Lab of Ornithology’s sound recognition feature in its Merlin birding app(lication) can answer that question for you, according to a July 14, 2021 article by Steven Melendez for Fast Company (Note: Links have been removed),

The lab recently upgraded its Merlin smartphone app, designed for both new and experienced birdwatchers. It now features an AI-infused “Sound ID” feature that can capture bird sounds and compare them to crowdsourced samples to figure out just what bird is making that sound. … people have used it to identify more than 1 million birds. New user counts are also up 58% since the two weeks before launch, and up 44% over the same period last year, according to Drew Weber, Merlin’s project coordinator.

Even when it’s listening to bird sounds, the app still relies on recent advances in image recognition, says project research engineer Grant Van Horn. …, it actually transforms the sound into a visual graph called a spectrogram, similar to what you might see in an audio editing program. Then, it analyzes that spectrogram to look for similarities to known bird calls, which come from the Cornell Lab’s eBird citizen science project.

There’s more detail about Merlin in Marc Devokaitis’ June 23, 2021 article for the Cornell Chronicle,

… Merlin can recognize the sounds of more than 400 species from the U.S. and Canada, with that number set to expand rapidly in future updates.

As Merlin listens, it uses artificial intelligence (AI) technology to identify each species, displaying in real time a list and photos of the birds that are singing or calling.

Automatic song ID has been a dream for decades, but analyzing sound has always been extremely difficult. The breakthrough came when researchers, including Merlin lead researcher Grant Van Horn, began treating the sounds as images and applying new and powerful image classification algorithms like the ones that already power Merlin’s Photo ID feature.

“Each sound recording a user makes gets converted from a waveform to a spectrogram – a way to visualize the amplitude [volume], frequency [pitch] and duration of the sound,” Van Horn said. “So just like Merlin can identify a picture of a bird, it can now use this picture of a bird’s sound to make an ID.”
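Van Horn's waveform-to-spectrogram step can be illustrated with a tiny, dependency-free sketch. To be clear, this is not Merlin's pipeline (which uses windowed FFTs, careful frequency scaling and a trained image classifier); it is only a toy discrete Fourier transform showing how a sound becomes the time-frequency "picture" he describes, where bright pixels mark pitch over time:

```python
import math

def spectrogram(signal, frame_size=64, hop=32):
    """Slice a signal into overlapping frames and take the magnitude of
    a naive DFT of each frame. The result is a 2-D grid (time x
    frequency) that a classifier can treat like an image."""
    frames = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size]
        mags = []
        for k in range(frame_size // 2):        # keep non-negative freqs
            re = sum(x * math.cos(-2 * math.pi * k * n / frame_size)
                     for n, x in enumerate(frame))
            im = sum(x * math.sin(-2 * math.pi * k * n / frame_size)
                     for n, x in enumerate(frame))
            mags.append(math.hypot(re, im))     # amplitude at frequency k
        frames.append(mags)
    return frames  # rows = time frames, columns = frequency bins

# A pure "whistle" at frequency bin 4: every frame of the spectrogram
# should light up at exactly that one frequency bin.
tone = [math.sin(2 * math.pi * 4 * n / 64) for n in range(256)]
spec = spectrogram(tone)
peak_bin = max(range(len(spec[0])), key=lambda k: spec[0][k])
print(peak_bin)  # prints 4
```

A real bird song would trace out curved streaks across this grid as its pitch rises and falls, and it is those visual patterns that image classification algorithms are so good at recognizing.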

Merlin’s pioneering approach to sound identification is powered by tens of thousands of citizen scientists who contributed their bird observations and sound recordings to eBird, the Cornell Lab’s global database.

“Thousands of sound recordings train Merlin to recognize each bird species, and more than a billion bird observations in eBird tell Merlin which birds are likely to be present at a particular place and time,” said Drew Weber, Merlin project coordinator. “Having this incredibly robust bird dataset – and feeding that into faster and more powerful machine-learning tools – enables Merlin to identify birds by sound now, when doing so seemed like a daunting challenge just a few years ago.”
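Weber's point about eBird telling Merlin "which birds are likely to be present at a particular place and time" amounts to combining the sound model's scores with an occurrence prior. The sketch below shows one simple way such a reweighting could work; the function, species scores and prior values are hypothetical illustrations, not Merlin's actual code or data:

```python
def rerank_with_prior(model_scores, presence_prior):
    """Weight a sound classifier's per-species scores by how likely each
    species is to occur at this place and time (e.g. eBird reporting
    frequency), then renormalize. This suppresses species that sound
    plausible but are rarely present locally."""
    combined = {sp: model_scores[sp] * presence_prior.get(sp, 0.0)
                for sp in model_scores}
    total = sum(combined.values()) or 1.0
    return {sp: v / total for sp, v in combined.items()}

# Toy numbers: the classifier slightly prefers a look-alike species
# that is almost never reported at this location in summer.
scores = {"Western Tanager": 0.45, "Scarlet Tanager": 0.55}
prior = {"Western Tanager": 0.30, "Scarlet Tanager": 0.01}
ranked = rerank_with_prior(scores, prior)
best = max(ranked, key=ranked.get)
print(best)  # prints Western Tanager
```

The design choice here is the interesting part: the acoustic model never has to be perfect on its own, because a billion geolocated observations supply context that resolves ambiguous calls.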

The Merlin Bird ID app with the new Sound ID feature is available for free on iOS and Android devices. Click here to download the Merlin Bird ID app and follow the prompts. If you already have Merlin installed on your phone, tap “Get Sound ID.”

Do take a look at Devokaitis’ June 23, 2021 article for more about how the Merlin app provides four ways to identify birds.

For anyone who likes to listen to the news, there’s an August 26, 2021 podcast (The Warblers by Birds Canada) featuring Drew Weber, Merlin project coordinator, and Jody Allair, Birds Canada Director of Community Engagement, discussing Merlin,

It’s a dream come true – there’s finally an app for identifying bird sounds. In the next episode of The Warblers podcast, we’ll explore the Merlin Bird ID app’s new Sound ID feature and how artificial intelligence is redefining birding. We talk with Drew Weber and Jody Allair and go deep into the implications and opportunities that this technology will bring for birds, and new as well as experienced birders.

The Warblers is hosted by Andrea Gress and Andrés Jiménez.