
Being tracked by internet browser fingerprinting

There seems to be a bit of a war (over credit or acknowledgement?) between Texas A&M University and Johns Hopkins University. Both universities have published almost identical news releases containing some disturbing news about privacy on the internet.

A June 18, 2025 Texas A&M University news release (also on EurekAlert) announces new insight into online tracking (Note: A link has been removed),

New research provides first evidence of the use of browser fingerprints for online tracking.

Clearing your cookies is not enough to protect your privacy online. 

New research led by Texas A&M University found that websites are covertly using browser fingerprinting — a method to uniquely identify a web browser — to track people across browser sessions and sites.

“Fingerprinting has always been a concern in the privacy community, but until now, we had no hard proof that it was actually being used to track users,” said Dr. Nitesh Saxena, cybersecurity researcher, professor of computer science and engineering and associate director of the Global Cyber Research Institute at Texas A&M. “Our work helps close that gap.”

When you visit a website, your browser shares a surprising amount of information, like your screen resolution, time zone, device model and more. When combined, these details create a “fingerprint” that’s often unique to your browser. Unlike cookies — which users can delete or block — fingerprinting is much harder to detect or prevent. Most users have no idea it’s happening, and even privacy-focused browsers struggle to fully block it.
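As a rough illustration of how a handful of ordinary browser attributes can combine into a near-unique identifier, here is a minimal sketch. The attribute names and values are hypothetical, and real fingerprinting scripts gather far more signals (canvas rendering, audio stack, fonts, and so on), but the principle is the same: individually common values become near-unique in combination.

```python
import hashlib

# Hypothetical attributes a site can read without any cookies.
attributes = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen_resolution": "2560x1440",
    "timezone": "America/Vancouver",
    "language": "en-CA",
    "installed_fonts": "Arial,Calibri,Georgia,Verdana",
}

def fingerprint(attrs: dict) -> str:
    """Concatenate attribute values in a fixed order and hash them.

    The result survives cookie clearing because it is recomputed on
    every visit from the same stable attributes.
    """
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()

print(fingerprint(attributes)[:16])  # a stable, compact identifier
```

Change any single attribute (say, the time zone) and the hash changes completely, which is exactly the property the researchers exploit in their measurement framework.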

“Think of it as a digital signature you didn’t know you were leaving behind,” explained co-author Zengrui Liu, a former doctoral student in Saxena’s lab. “You may look anonymous, but your device or browser gives you away.”

This research marks a turning point in how computer scientists understand the real-world use of browser fingerprinting by connecting it with the use of ads.

“While prior works have studied browser fingerprinting and its usage on different websites, ours is the first to correlate browser fingerprints and ad behaviors, essentially establishing the relationship between web tracking and fingerprinting,” said co-author Dr. Yinzhi Cao, associate professor of computer science and technical director of the Information Security Institute at Johns Hopkins University. [emphasis mine]

To investigate whether websites are using fingerprinting data to track people, the researchers had to go beyond simply scanning websites for the presence of fingerprinting code. They developed a measurement framework called FPTrace, which assesses fingerprinting-based user tracking by analyzing how ad systems respond to changes in browser fingerprints. This approach is based on the insight that if browser fingerprinting influences tracking, altering fingerprints should affect advertiser bidding — where ad space is sold in real time based on the profile of the person viewing the website — and HTTP records — records of communication between a server and a browser. 
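The FPTrace code itself is not reproduced here, but the core inference the paragraph describes can be sketched in a deliberately simplified, hypothetical form: perturb the fingerprint, then check whether advertiser bid values shift and tracking-related HTTP records drop. The function name, inputs, and threshold below are illustrative assumptions, not the paper's actual implementation.

```python
from statistics import mean

def fingerprint_affects_bidding(bids_baseline, bids_altered,
                                http_baseline, http_altered,
                                threshold=0.2):
    """Toy version of the inference: if altering the browser
    fingerprint shifts advertiser bid values or reduces tracking-
    related HTTP records, fingerprinting likely feeds the tracking.

    bids_*    -- lists of bid prices observed in ad auctions
    http_*    -- counts of tracking-related HTTP records
    threshold -- hypothetical relative-change cutoff
    """
    bid_shift = abs(mean(bids_altered) - mean(bids_baseline)) / mean(bids_baseline)
    http_drop = (http_baseline - http_altered) / http_baseline
    return bid_shift > threshold or http_drop > threshold

# Example: bids fall and tracking requests drop after the fingerprint changes.
print(fingerprint_affects_bidding([1.2, 1.1, 1.3], [0.6, 0.5, 0.7], 40, 22))  # True
```

The real framework compares far richer signals (cookie-syncing events, bidding metadata) across controlled browser sessions, but the logic of "intervene on the fingerprint, observe the ad system's response" is the same.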

“This kind of analysis lets us go beyond the surface,” said co-author Jimmy Dani, Saxena’s doctoral student. “We were able to detect not just the presence of fingerprinting, but whether it was being used to identify and target users — which is much harder to prove.”

The researchers found that tracking occurred even when users cleared or deleted cookies. The results showed notable differences in bid values and a decrease in HTTP records and syncing events when fingerprints were changed, suggesting an impact on targeting and tracking.

Additionally, some of these sites linked fingerprinting behavior to backend bidding processes — meaning fingerprint-based profiles were being used in real time, likely to tailor responses to users or pass along identifiers to third parties. 

Perhaps more concerning, the researchers found that even users who explicitly opt out of tracking under privacy laws like Europe’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) may still be silently tracked across the web through browser fingerprinting.

Based on the results of this study, the researchers argue that current privacy tools and policies are not doing enough. They call for stronger defenses in browsers and new regulatory attention on fingerprinting practices. They hope that their FPTrace framework can help regulators audit websites and providers who participate in such activities, especially without user consent. 

This research was conducted in collaboration with Johns Hopkins University and presented at the ACM Web Conference (WWW) 2025.

Funding for this research is administered by the Texas A&M Engineering Experiment Station (TEES), the official research agency for Texas A&M Engineering.

By Texas A&M University College of Engineering

On June 19, 2025, Johns Hopkins University published the same news release except for a change in the third sentence,

New research provides first evidence of the use of browser fingerprints for online tracking

Clearing your cookies is not enough to protect your privacy online.

New research conducted by a team at Johns Hopkins and Texas A&M universities [emphasis mine] found that websites are covertly using browser fingerprinting—a method to uniquely identify a web browser—to track people across browser sessions and sites.

Johns Hopkins University republished the text in an August 4, 2025 news release.

It’s not unusual for multiple institutions to publish news releases about the same research, each rewritten to highlight its own researchers’ contributions, but the near-total omission of the other institution makes these examples stand out.

In any event, here’s a link to and a citation for the paper,

The First Early Evidence of the Use of Browser Fingerprinting for Online Tracking by Zengrui Liu, Jimmy Dani, Yinzhi Cao, Shujiang Wu, Nitesh Saxena. WWW ’25: Proceedings of the ACM [Association for Computing Machinery] Web Conference 2025, pp. 4980–4995. DOI: https://doi.org/10.1145/3696410.3714548 Published: 22 April 2025

This paper is behind a paywall.

Smart toys spying on children?

Caption: Twelve toys were examined in a study on smart toys and privacy. Credit: University of Basel / Céline Emch

An August 26, 2024 University of Basel press release (also on EurekAlert) describes research into smart toys and privacy issues for the children who play with them,

Toniebox, Tiptoi, and Tamagotchi are smart toys, offering interactive play through software and internet access. However, many of these toys raise privacy concerns, and some even collect extensive behavioral data about children, report researchers at the University of Basel, Switzerland.

The Toniebox and the figurines it comes with are especially popular with small children. They’re much easier to use than standard music players, allowing kids to turn on music and audio content themselves whenever they want. All a child has to do is place a plastic version of Peppa Pig onto the box and the story starts to play. When the child wants to stop the story, they simply remove the figurine. To rewind and fast-forward, the child can tilt the box to the left or right, respectively.

A lot of parents are probably thinking, “Fantastic concept!” Not so fast – the Toniebox records exactly when it is activated and by which figurine, when the child stops playback, and to which spot they rewind or fast-forward. Then it sends the data to the manufacturer.

The Toniebox is one of twelve smart toys studied by researchers headed by Professor Isabel Wagner of the Department of Mathematics and Computer Science at the University of Basel. These included well-known toys like the Tiptoi smart pen, the Edurino learning app, and the Tamagotchi virtual pet as well as the Toniebox. The researchers also studied less well-known products like the Moorebot, a mobile robot with a camera and microphone, and Kidibuzz, a smartphone for kids with parental controls.

One focus of the analysis was security: is data traffic encrypted, and how well? The researchers also investigated data protection, transparency (how easy it is for users to find out what data is collected), and compliance with the EU General Data Protection Regulation. Wagner and her colleagues are presenting their results at the Annual Privacy Forum (https://privacyforum.eu/) in early September [2024]. Springer publishes all the conference contributions in the series Privacy Technologies and Policy.

Collect data while offline, send it while online

Neither the Toniebox nor the Tiptoi pen comes out well with respect to security, as neither securely encrypts its data traffic. The two toys differ with regard to privacy concerns, though: While the Toniebox does collect data and send it to the manufacturer, the Tiptoi pen does not record how and when a child uses it.

Even if the Toniebox were operated offline and only temporarily connected to the internet while downloading new audio content, the device could store collected data locally and transmit it to the manufacturer at the next opportunity, Wagner surmises. “In another toy we’re currently studying that integrates ChatGPT, we’re seeing that log data regularly vanishes.” The system is probably set up to delete the local copy of transmitted data to optimize internal storage use, Wagner says.
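Wagner's surmise describes a common store-and-forward telemetry pattern: queue events locally while offline, upload and delete them at the next connection. A minimal sketch of that pattern follows; it is entirely hypothetical and is not the Toniebox's or any other toy's actual firmware.

```python
import json
import os

QUEUE_FILE = "telemetry_queue.jsonl"  # hypothetical local log file

def record_event(event: dict) -> None:
    """Store a usage event locally while offline (append-only log)."""
    with open(QUEUE_FILE, "a") as f:
        f.write(json.dumps(event) + "\n")

def flush_queue(send) -> int:
    """On the next connection, transmit queued events, then delete the
    local copy -- matching the pattern where log data 'vanishes' after
    upload. `send` is a hypothetical uploader callback."""
    if not os.path.exists(QUEUE_FILE):
        return 0
    with open(QUEUE_FILE) as f:
        events = [json.loads(line) for line in f if line.strip()]
    for ev in events:
        send(ev)
    os.remove(QUEUE_FILE)  # free internal storage, as the researchers surmise
    return len(events)
```

Under this pattern, operating a device offline only delays transmission; whatever was recorded still reaches the manufacturer once the device reconnects.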

Companies often claim the collected data helps them optimize their devices. Yet it is far from obvious to users what purpose this data could serve. “The apps bundled with some of these toys demand entirely unnecessary access rights, such as to a smartphone’s location or microphone,” says the researcher. The ChatGPT toy still being analyzed also transmits a data stream that looks like audio. Perhaps the company wants to optimize speech recognition for children’s voices, the Professor of Cyber Security speculates.

A data protection label

“Children’s privacy requires special protection,” emphasizes Julika Feldbusch, first author of the study. She argues that, in light of their young target audience, toy manufacturers should place greater weight on the privacy and security of their products than they currently do.

The researchers recommend that compliance with security and data protection standards be identified by a label on the packaging, similar to nutritional information on food items. Currently, it’s too difficult for parents to assess the security risks that smart toys pose to their children.

“We’re already seeing signs of a two-tier society when it comes to privacy protection for children,” says Feldbusch. “Well-informed parents engage with the issue and can choose toys that do not create behavioral profiles of their children. But many lack the technical knowledge or don’t have time to think about this stuff in detail.”

You could argue that individual children probably won’t experience negative consequences due to toy manufacturers creating profiles of them, says Wagner. “But nobody really knows that for sure. For example, constant surveillance can have negative effects on personal development.”

Here’s a link to and a citation for the paper,

No Transparency for Smart Toys by Julika Feldbusch, Valentyna Pavliv, Nima Akbari & Isabel Wagner. Privacy Technologies and Policy, conference paper (part of the Annual Privacy Forum series: APF 2024; part of the book series Lecture Notes in Computer Science [LNCS, volume 14831]). First Online: 01 August 2024, pp. 203–227

This paper is behind a paywall.

Internet of toys, the robotification of childhood, and privacy issues

Leave it to the European Commission’s (EC) Joint Research Centre (JRC) to look into the future of toys. As far as I’m aware, there are no such moves in either Canada or the US despite the ubiquity of robot toys and other such devices. From a March 23, 2017 EC JRC press release (also on EurekAlert),

Action is needed to monitor and control the emerging Internet of Toys, concludes a new JRC report. Privacy and security are highlighted as main areas of concern.

Large numbers of connected toys have been put on the market over the past few years, and the turnover is expected to reach €10 billion by 2020 – up from just €2.6 billion in 2015.

Connected toys come in many different forms, from smart watches to teddy bears that interact with their users. They are connected to the internet and together with other connected appliances they form the Internet of Things, which is bringing technology into our daily lives more than ever.

However, the toys’ ability to record, store and share information about their young users raises concerns about children’s safety, privacy and social development.

A team of JRC scientists and international experts looked at the safety, security, privacy and societal questions emerging from the rise of the Internet of Toys. The report invites policymakers, industry, parents and teachers to study connected toys more in depth in order to provide a framework which ensures that these toys are safe and beneficial for children.

Robotification of childhood

Robots are no longer only used in industry to carry out repetitive or potentially dangerous tasks. In recent years, robots have entered our everyday lives, and children are increasingly likely to encounter robotic or artificial intelligence-enhanced toys.

We still know relatively little about the consequences of children’s interaction with robotic toys. However, it is conceivable that they represent both opportunities and risks for children’s cognitive, socio-emotional and moral-behavioural development.

For example, social robots may further the acquisition of foreign language skills by compensating for the lack of native speakers as language tutors or by removing the barriers and peer pressure encountered in the classroom. There is also evidence about the benefits of child-robot interaction for children with developmental problems, such as autism or learning difficulties, who may find human interaction difficult.

However, the internet-based personalization of children’s education via filtering algorithms may also increase the risk of ‘educational bubbles’ where children only receive information that fits their pre-existing knowledge and interest – similar to adult interaction on social media networks.

Safety and security considerations

The rapid rise in internet connected toys also raises concerns about children’s safety and privacy. In particular, the way that data gathered by connected toys is analysed, manipulated and stored is not transparent, which poses an emerging threat to children’s privacy.

The data provided by children while they play, i.e. the sounds, images and movements recorded by connected toys, is personal data protected by the EU data protection framework, as well as by the new General Data Protection Regulation (GDPR). However, information on how this data is stored, analysed and shared might be hidden in long privacy statements or policies and often goes unnoticed by parents.

Whilst children’s right to privacy is the most immediate concern linked to connected toys, there is also a long term concern: growing up in a culture where the tracking, recording and analysing of children’s everyday choices becomes a normal part of life is also likely to shape children’s behaviour and development.

Usage framework to guide the use of connected toys

The report calls for industry and policymakers to create a connected toys usage framework to act as a guide for their design and use.

This would also help toymakers to meet the challenge of complying with the new European General Data Protection Regulation (GDPR) which comes into force in May 2018, which will increase citizens’ control over their personal data.

The report also calls for the connected toy industry and academic researchers to work together to produce better designed and safer products.

Advice for parents

The report concludes that it is paramount that we understand how children interact with connected toys and which risks and opportunities they entail for children’s development.

“These devices come with really interesting possibilities and the more we use them, the more we will learn about how to best manage them. Locking them up in a cupboard is not the way to go. We as adults have to understand how they work – and how they might ‘misbehave’ – so that we can provide the right tools and the right opportunities for our children to grow up happy in a secure digital world,” said Stéphane Chaudron, the report’s lead researcher at the Joint Research Centre (JRC).

The authors of the report encourage parents to get informed about the capabilities, functions, security measures and privacy settings of toys before buying them. They also urge parents to focus on the quality of play by observing their children, talking to them about their experiences and playing alongside and with their children.

Protecting and empowering children

Through the Alliance to better protect minors online and with the support of UNICEF, NGOs, Toy Industries Europe and other industry and stakeholder groups, European and global ICT and media companies are working to improve the protection and empowerment of children when using connected toys. This self-regulatory initiative is facilitated by the European Commission and aims to create a safer and more stimulating digital environment for children.

There’s an engaging video accompanying this press release.

You can find the report (Kaleidoscope on the Internet of Toys: Safety, security, privacy and societal insights) here and both the PDF and print versions are free (although I imagine you’ll have to pay postage for the print version). This report was published in 2016; the authors are Stéphane Chaudron, Rosanna Di Gioia, Monica Gemo, Donell Holloway, Jackie Marsh, Giovanna Mascheroni, Jochen Peter, Dylan Yamada-Rice, and organizations involved include European Cooperation in Science and Technology (COST), Digital Literacy and Multimodal Practices of Young Children (DigiLitEY), and COST Action IS1410. DigiLitEY is a European network of 33 countries focusing on research in this area (2015-2019).