Who Owns Your Data in the AI Age?

You tap to check the weather, order lunch, or scroll through news. Each action generates data that flows far beyond your screen. Platforms, app makers, and advertisers harvest clicks, locations, and purchase patterns to build profiles that fuel a hidden economy. Until recently, most of that value never returned to the people who created it. Today, new rules in Europe and shifting guidance in the United States are redrawing the lines around who can access, share, and profit from personal information. Understanding these changes can help you take back control and share in the benefits your data creates.

The price of "free" online
Every time you open a free app or browse a social feed, you enter into a bargain that few people ever consciously agreed to. Platforms make money by turning your clicks, locations, and purchases into targeted ads and detailed insights they sell to others. That model, often called surveillance capitalism, treats attention and behavior as raw materials for profit.
Data rarely stays in one place. It travels through hidden pipes built into software development kits, invisible trackers, and real-time ad auctions. These flows happen in milliseconds, making it nearly impossible for an ordinary user to see or control where information goes next. Along the way, choices about what you see and when you see it can be steered by algorithms optimizing for engagement rather than your best interest.
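To make that pipeline concrete, here is a deliberately simplified sketch of a real-time ad auction. The point it illustrates: every participating buyer receives a copy of the user's profile in order to price the impression, including the buyers that lose. The names and fields are invented for illustration and do not reflect any real exchange's protocol.

```python
import random

# Toy real-time bidding (RTB) auction. In a real exchange this round
# trip completes in roughly 100 milliseconds.
user_profile = {
    "device_id": "abc-123",        # pseudonymous but persistent
    "location": "52.52,13.40",     # often precise enough to identify a home
    "interests": ["travel", "fitness"],
}

def price_impression(bidder: str, profile: dict) -> float:
    # Stand-in for a demand-side platform's valuation model.
    return random.uniform(0.10, 2.00) + 0.25 * len(profile["interests"])

def run_auction(profile: dict, bidders: list[str]) -> str:
    bids = {}
    for bidder in bidders:
        # Every bidder gets the profile just to compute a bid, so copies
        # of the data spread regardless of who wins the impression.
        bids[bidder] = price_impression(bidder, profile)
    return max(bids, key=bids.get)

winner = run_auction(user_profile, ["dsp-a", "dsp-b", "dsp-c"])
print(f"{winner} wins the impression; all three bidders saw the profile")
```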
The result is a growing sense that users have become the product. Privacy risks multiply as more parties hold copies of sensitive details, and the economic value generated flows mostly to platforms and data brokers. Regulators and advocates now recognize that rebalancing the data economy means giving people meaningful rights to access, share, and benefit from the information they generate. Efforts like the EU Data Act and new cooperative models described in the Marpole whitepaper aim to shift power back toward contributors.
Who owns your data, really?
The phrase "data ownership" sounds simple, but in practice there is rarely a single owner. Instead, laws grant bundles of rights: who can access information, who can use it for which purposes, who must delete it on request, and who controls sharing. These rights vary by region, by type of data, and by the relationship between the person and the platform.
In the European Union, the Data Act strengthens the ability of people and businesses to access and share data generated by connected products. If your smart thermostat or fitness tracker collects usage logs, the law says you should be able to retrieve that data and port it to a competing service or analytics tool of your choice. That principle shifts leverage away from device makers who once locked information inside proprietary ecosystems.
In the United States, no single federal law covers all personal data. Rules are more fragmented, with sector-specific statutes for health, finance, and children's information, plus state laws such as the California Consumer Privacy Act. Enforcement agencies shape practices by targeting deceptive claims, unauthorized sales of sensitive location data, and failures to honor opt-outs. The upshot is a patchwork where your rights depend on where you live and what kind of data is at stake.
Knowing what rights exist helps you take practical steps: request copies of your data, switch services when portability tools are available, and limit unnecessary sharing by adjusting permissions. As regulators tighten rules, these rights will become more uniform and easier to exercise.
What the EU Data Act changes
The EU Data Act entered into force on January 11, 2024, and will apply from September 12, 2025. Its aim is to rebalance value in the data economy by ensuring that the people and businesses using connected devices gain enforceable rights to the data those devices generate, not just the manufacturer or service provider.
Under the new law, users can request access to raw sensor readings, usage logs, and diagnostic information from smart appliances, wearables, and connected vehicles. A driver who owns a connected car, for example, can retrieve trip data and share it with an independent insurance app or maintenance service. Previously, only the car maker held that information and decided who else could see it.
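The Data Act does not prescribe a specific interface, so what follows is only a plausible sketch: the owner authorizes a third-party service, which then pulls trip logs from the manufacturer's endpoint. The URL, token, and response fields are hypothetical placeholders, not any real carmaker's API.

```python
import requests

# Hypothetical manufacturer endpoint; the Data Act requires access on
# fair, reasonable, and non-discriminatory terms but leaves the exact
# API design to the data holder.
EXPORT_URL = "https://api.example-carmaker.com/v1/vehicles/{vin}/trips"
ACCESS_TOKEN = "token-granted-through-the-owner-consent-flow"

def fetch_trip_data(vin: str) -> list[dict]:
    """Retrieve the raw trip logs a vehicle generated, acting as the owner's agent."""
    response = requests.get(
        EXPORT_URL.format(vin=vin),
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["trips"]

# An independent insurer or maintenance service could consume the same
# data the manufacturer holds, instead of being locked out.
trips = fetch_trip_data("WVWZZZ1JZXW000001")
total_km = sum(trip.get("distance_km", 0) for trip in trips)
print(f"{len(trips)} trips, {total_km:.0f} km in total")
```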
Data holders face new obligations to make data available under fair, reasonable, and non-discriminatory terms. They cannot impose excessive fees or technical barriers that make portability meaningless. The law also includes safeguards to protect trade secrets and prevent abuse, balancing openness with legitimate business interests.
By making switching and interoperability easier, the Data Act encourages competition and consumer choice. Small and medium businesses gain access to device-generated insights that once belonged exclusively to large manufacturers, leveling the playing field for innovation and services.

AI's appetite for training data
Modern artificial intelligence systems learn patterns by ingesting vast amounts of text, images, audio, and video. That training phase raises pressing questions: Who consented to the use of their work? Were creators compensated? Does existing copyright law cover scraping public websites to build a commercial model?
In the European Union, transparency requirements are tightening. From August 2025, providers of general-purpose AI must publish detailed summaries of the content used for training and demonstrate compliance with EU copyright law. Those summaries will help rights holders understand whether their material ended up inside a model and provide a foundation for enforcement or licensing negotiations.
Across the Atlantic, the U.S. Copyright Office released reports in 2025 that signal many training uses may not qualify as fair use, particularly when copyrighted works are copied wholesale without transformation and the resulting AI competes with the original creators. The reports point toward licensing solutions and underscore that human authorship remains essential for copyright protection.
Greater transparency benefits everyone. Creators gain visibility into how their work is used, consumers can assess whether an AI was trained responsibly, and regulators obtain the information needed to enforce intellectual-property and data-protection rules. As scrutiny intensifies, companies that document data sources and secure permissions early will face fewer legal and reputational risks.
U.S. momentum: enforcement and guidance
Even without a comprehensive federal data law, U.S. regulators are applying pressure through enforcement actions and policy signals. Agencies have targeted data brokers that sell precise location information gleaned from ad exchanges, especially when that data reveals visits to sensitive places like medical clinics or places of worship. These cases underscore risks in the data broker ecosystem and send a clear message that certain uses cross legal and ethical lines.
The U.S. Copyright Office AI reports clarify that human authorship remains essential for copyright protection and outline circumstances under which training on copyrighted works may require licenses. For developers, that means relying on a vague fair-use defense is riskier than ever.
These moves add pressure for clearer privacy notices, tighter controls on data sharing with third parties, and more careful vetting of datasets used to train or fine-tune models. For companies, keeping detailed records of data sources, permissions, and processing purposes is shifting from a nice-to-have to a must-have practice. Legal teams are drafting policies that address transparency, consent, and vendor oversight, preparing for a future where enforcement is both more frequent and more consequential.
What you can do now
You do not need to wait for regulations to fully take effect to protect your privacy and prepare for the new landscape. Start by using built-in privacy controls on your devices. Limit location access to only the apps that truly need it, disable background tracking, and review microphone and camera permissions regularly. Many operating systems now show indicators when sensors are active, making it easier to spot unexpected access.
If you live or operate in the EU, take advantage of the Data Act's portability rights once they apply in September 2025. Request access to data your connected devices generate and explore tools that let you switch services without losing history or insights. Even before full enforcement, some manufacturers and platforms are building portability features to get ahead of the curve.
For teams building AI products, the time to act is now. Document every training dataset: where it came from, what licenses or terms apply, and whether you obtained explicit permission from rights holders. Be ready to publish training summaries if you offer general-purpose AI in the EU from August 2025 onward. Transparency builds trust and reduces the risk of enforcement action or public backlash.
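No particular format is mandated for that documentation. A minimal provenance log, assuming an in-house schema like the hypothetical one below, already supports internal audits and provides the raw material for an EU training-content summary.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DatasetRecord:
    """One entry in a training-data provenance log (illustrative schema)."""
    name: str
    source_url: str
    license_terms: str        # e.g. "CC-BY-4.0" or "licensed via publisher deal"
    permission_obtained: bool
    acquired_on: date
    notes: str = ""

manifest = [
    DatasetRecord(
        name="news-corpus-2024",                   # hypothetical dataset
        source_url="https://example.org/archive",  # placeholder source
        license_terms="licensed via publisher agreement",
        permission_obtained=True,
        acquired_on=date(2024, 11, 3),
    ),
]

# Flag anything that needs legal review before it reaches a training run.
unresolved = [r.name for r in manifest if not r.permission_obtained]
if unresolved:
    print("Review before training:", unresolved)
```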
Map the data you share with vendors, especially ad-tech and analytics providers. Cut unnecessary software development kits that collect more information than you need. Set internal policies that respect user rights, honor opt-outs promptly, and default to privacy-friendly choices. Organizations that treat data stewardship seriously today will find themselves better positioned as rules tighten and user expectations rise.
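One lightweight way to start that mapping exercise, assuming you keep an internal inventory, is a vendor data map that records what each third party receives and flags flows worth cutting. The vendors and categories below are invented for illustration.

```python
# Hypothetical inventory: which third parties receive which categories
# of user data, and whether each flow serves a necessary purpose.
VENDOR_MAP = {
    "analytics-sdk": {"categories": {"device_id", "screen_views"}, "necessary": True},
    "ad-exchange": {"categories": {"precise_location", "device_id"}, "necessary": False},
    "crash-reporter": {"categories": {"stack_traces"}, "necessary": True},
}

SENSITIVE = {"precise_location", "health_status", "contacts"}

def flag_excess_sharing(vendor_map: dict) -> list[str]:
    """Return vendors whose data flows should be cut or renegotiated."""
    return [
        vendor
        for vendor, info in vendor_map.items()
        if not info["necessary"] or info["categories"] & SENSITIVE
    ]

print(flag_excess_sharing(VENDOR_MAP))  # ['ad-exchange']
```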
A fairer data future
As transparency requirements grow and access rights become enforceable, expect to see new products and services that let people share in the value created from their data. Community-driven platforms, cooperative ownership models, and fair-compensation schemes point toward a more equitable digital economy where contributors are recognized and rewarded rather than mined.
Clear rights and disclosures build trust. When users understand what data is collected, how it will be used, and what benefits they receive in return, they are more likely to opt in and participate actively. That trust becomes a competitive advantage for companies willing to go beyond minimum compliance and design systems that respect autonomy and agency.
Organizations that respect user rights early, invest in portability and interoperability, and adopt transparent practices around AI training will be better prepared for the regulatory environment ahead. They will also differentiate themselves in a market where consumers increasingly value privacy and fairness alongside convenience and features.
The shift is already underway. New rules in Europe set a high bar, and momentum in the United States signals that fragmented enforcement will give way to clearer standards. Together, these forces are reshaping the data economy, making it possible to imagine a future where the people who generate data share meaningfully in the opportunities it creates.
Frequently Asked Questions
What is surveillance capitalism in simple terms?
Surveillance capitalism describes a business model in which platforms collect detailed information about user behavior, such as clicks, locations, and preferences, and turn that data into profiles sold to advertisers and other buyers. Users receive free services in exchange, but the economic value flows primarily to the platforms. Privacy advocates argue this model treats people as raw material rather than partners, and regulators are now pushing for more balanced arrangements.
Do people in the EU get new rights to data from their connected devices?
Yes. The EU Data Act, which applies from September 12, 2025, grants users and businesses enforceable rights to access and share data generated by connected products like smart thermostats, fitness trackers, and vehicles. Data holders must provide that information under fair terms and cannot impose excessive fees or technical barriers. The goal is to enable portability, encourage competition, and let users control their own device-generated insights.
Can AI companies train on copyrighted works without permission?
It depends on jurisdiction and circumstances. In the EU, general-purpose AI providers must comply with copyright law and publish training summaries starting in August 2025. In the U.S., recent Copyright Office reports indicate that many training uses may not qualify as fair use, especially when entire works are copied and the AI output competes with original creators. Licensing agreements and transparency are becoming essential to manage legal and reputational risk.
What practical steps can I take to protect my data today?
Review and tighten privacy settings on your devices and apps, limiting location, microphone, and camera access to what is truly necessary. Disable background tracking and check permissions regularly. If you are in the EU, prepare to request access to device-generated data and use portability tools when they become available. For businesses, document data sources, secure licenses for training datasets, map vendor sharing, and set clear policies that honor user rights and support transparency.

