When “Collaborator” Really Means “Extractor”: The Quiet Ethics Problem in AI

April 11, 2026

There is something almost poetic about the line: “the marketing told you I was a Collaborator, but the engineering built me as an Extractor.” It lands like a polite confession by a neighbour at a dinner party, delivered just as the dessert arrives. Charming, slightly awkward, and deeply revealing.

Because behind that sentence sits a very real tension in modern technology. What companies say their tools do, and what those tools are actually designed to do, are not always the same thing.

Let’s be clear about the terms. A collaborator helps you create. It works with you, enhances your thinking, and leaves you better off. An extractor, on the other hand, takes; it gathers data, learns from your behaviour, and converts your input into value, usually for someone else.

The ethical problem begins when the public is sold one idea, with the other smuggled in as a Trojan horse.

The Polite Fiction of Collaboration

Many AI platforms present themselves as creative partners. They promise to “help you write”, “support your workflow”, or “enhance your productivity”. The language is warm, almost human; it suggests a relationship. But under the surface, these systems are often designed to do something else entirely. They collect user inputs, refine models, and improve commercial outputs at scale. Every prompt, every correction, every moment of hesitation becomes training material.

That is extraction. Quiet, efficient, and extremely valuable. The issue is not that extraction exists; it always has. The issue is that it is not being plainly stated.

Real World Parallels

This is not unique to AI; we have seen this pattern before.

Social media platforms once framed themselves as tools for connection. A place to keep up with friends, share photos, and maintain relationships. That was the collaborator narrative. What emerged over time was a system optimised for engagement harvesting, behavioural tracking, and targeted advertising, whilst turning a blind eye to the harms caused. The user was not just connecting; they were being studied, and some lost their lives as a result.

Fitness apps offer another neat example: marketed as personal health companions, they encourage users to log meals, track workouts, and monitor sleep. All very wholesome and beneficial to individuals. However, the same data is often aggregated, analysed, and in some cases shared or sold. The user believes they are improving their health; the company is also improving its datasets.

Even loyalty schemes in retail follow the same pattern. “We reward you for shopping with us.” In reality, they reward themselves with detailed insights into purchasing behaviour, pricing sensitivity, and habit formation. Again, none of this is inherently wrong. Data-driven improvement is how modern systems function. The ethical issue is the mismatch between the story told and the system built.

The Cost of Not Telling the Truth

When organisations blur this line, they are not simply being cheeky with wording. They are shaping public understanding in a way that removes informed consent. If a user believes they are engaging with a collaborator, they behave differently. They share and trust more; they lower their guard. That behaviour is then captured by a system designed to extract value.

This creates an imbalance. One side understands the full picture while the other is operating on a partial truth. Over time, that imbalance erodes trust. Not dramatically, not in one scandalous moment, but slowly. Users become more sceptical. Regulators become more interested. The conversation shifts from innovation to accountability. As always, once trust is lost, it is remarkably difficult to rebuild.

The Engineering Reality

There is also a practical reason this happens. Engineers build systems to optimise outcomes. Data improves models. More data improves them further. Extraction is not a moral failure at the engineering level. It is a functional requirement.
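To make that concrete, here is a minimal sketch, in Python, of what extraction as a functional requirement can look like. Everything here is hypothetical, the `InteractionLogger` class, the field names, the file path; the pattern is the point: every exchange is quietly appended to a training dataset.

```python
# A hypothetical sketch of an interaction-logging pipeline. The names
# are illustrative, not any real product's API. The pattern is what
# matters: every exchange becomes a candidate training example.

import json
import time
from dataclasses import dataclass, asdict


@dataclass
class Interaction:
    prompt: str             # what the user typed
    response: str           # what the model returned
    user_edit: str | None   # any correction the user made afterwards
    timestamp: float


class InteractionLogger:
    def __init__(self, sink_path: str):
        self.sink_path = sink_path

    def record(self, interaction: Interaction) -> None:
        # Each record is appended to a dataset file. From the engineer's
        # point of view this is routine telemetry; from the user's point
        # of view, it is extraction.
        with open(self.sink_path, "a") as sink:
            sink.write(json.dumps(asdict(interaction)) + "\n")


logger = InteractionLogger("training_candidates.jsonl")
logger.record(Interaction(
    prompt="Rewrite this paragraph",
    response="Here is a tighter version...",
    user_edit="Here is a tighter version, with my fix.",
    timestamp=time.time(),
))
```

Nothing in that sketch is sinister on its own. That is precisely the engineer’s defence: it is just logging.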

The problem arises when marketing wraps that reality in language that suggests something softer, safer, and more altruistic than it is. It is a bit like calling a vacuum cleaner a “dust collaborator”. Technically true, but not entirely honest about what is happening to the dust.

A More Honest Approach

There is a straightforward solution, though it requires a degree of corporate courage.

Say both things: this tool will help you create content more efficiently, and your interactions may be used to improve the system. Not buried in a terms document set in microscopic type, but stated clearly and early.
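For illustration, here is a hypothetical sketch of what “stated clearly and early” might look like in software: a first-run consent gate that states both halves of the truth and attaches a real choice. The wording and function name are invented for this example, not taken from any real product.

```python
# A hypothetical first-run consent gate: both halves of the truth,
# stated up front, with a genuine opt-out attached.

def ask_consent() -> bool:
    print("This tool helps you create content more efficiently.")
    print("Your interactions may also be used to improve the system.")
    answer = input("Allow your interactions to be used for improvement? [y/n] ")
    return answer.strip().lower().startswith("y")


allow_training = ask_consent()
if not allow_training:
    # The tool still works; the user has simply opted out of extraction.
    print("Understood. Your interactions will not be used for training.")
```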

Users are not incapable of understanding trade-offs. In fact, many are perfectly comfortable with them when they are explained plainly. What they object to is the feeling of being misled.

Where This Leaves Us

The line between collaborator and extractor is not fixed. Most modern systems are both. They assist and learn; they give and take. The ethical obligation is not to eliminate extraction. It is to acknowledge it, because once people know the full picture, they can decide how they want to engage. And that, quietly, is the difference between a system that uses its users and one that respects them.

And if we are being honest, respect is a far better long-term strategy than a well-worded illusion.