Why People Are Losing Trust in Big Tech and What They’re Doing About It

Saturday 07 February 2026

Trust used to be the invisible fuel behind every click, swipe and sign-in. Now that fuel is running low. People feel uneasy. They see risks. And they are acting — in ways small and large.


What's gone wrong (short answer)

Too much power. Too little transparency. Broken expectations. These aren’t new complaints. But they have hardened into broad public distrust. In one large 2024 survey, roughly 78% of Americans said technology companies have too much power and influence in politics — a sign that many no longer see Big Tech as a neutral tool but as a political actor.

Three concrete drivers of distrust

The growing popularity of security and privacy tools is a clear sign of lost trust in large tech companies. People download VPN apps for PCs and smartphones to avoid tracking, hyper-personalization, and the disclosure of personal data. Modern VPN apps for iPhone & iPad, Windows and Mac, and other platforms offer a shield; VeePN, an app that makes online browsing private and secure, is a particularly popular example. The main reasons for this distrust of large companies are:

  1. Privacy and data misuse. People sign up, and then their data is repackaged and sold, or used to target them in ways they never expected. That feels like a betrayal.
  2. Misinformation and content harms. Platforms amplify content that keeps people glued to screens, even when that content corrodes trust in institutions and truth.
  3. Opaque algorithms and unchecked AI. Machine learning is powerful — and hard to inspect. The result: people worry that decisions about jobs, credit, newsfeeds, and more are being made by systems they can’t contest. This matters: public confidence in AI and AI companies has slipped, creating a trust gap between tech firms in general and AI companies in particular.

Facts that illustrate the scale of the problem

Only a minority of people say they have strong confidence in large tech firms. Surveys in recent years show single-digit and low-double-digit trust ratings in some contexts; one widely quoted measure put “a great deal” or “quite a lot” of confidence in large tech companies at around 27% in 2024. Meanwhile, more than half of Americans say major tech firms should be more tightly regulated. Those are not small signals. They point to structural distrust, not passing gripes.

How people are reacting — immediate, practical steps

People are not just complaining. They’re changing behavior.

  • Switching apps and services. Some move from large, multi-product ecosystems to smaller, niche alternatives. Choice matters.
  • Tightening privacy settings. Many users dig into app permissions, turn off location or reduce ad personalization. Actionable — and empowering.
  • Using privacy tools. VPNs, tracker blockers, private browsers, and encrypted messaging apps are gaining users (some for privacy, some for access). Small technical steps can add up.
  • Voting with wallets. Consumers are choosing where to spend; governments and NGOs are pressing for ethical sourcing and procurement policies.
  • Pushing for regulation. Citizens lobby, sign petitions, and back laws that would limit surveillance advertising, mandate transparency, or control AI use cases. Public pressure has translated into real policy debates in multiple countries.
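For readers curious what a tracker blocker actually does under the hood, the core idea is simple: compare each outgoing request’s hostname against a blocklist of known tracking domains, including parent domains. A minimal sketch (the blocklist entries here are invented for illustration; real blockers use large maintained lists such as EasyList):

```python
# Minimal sketch of domain-based tracker blocking: a request is blocked if its
# host matches a blocklist entry, or if any parent domain of the host does.
BLOCKLIST = {"tracker.example", "ads.example"}  # hypothetical entries

def is_blocked(host: str) -> bool:
    parts = host.lower().split(".")
    # Check the host itself and every parent suffix, so that
    # "metrics.tracker.example" is caught by the "tracker.example" entry.
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

# is_blocked("metrics.tracker.example")  -> True  (parent domain is listed)
# is_blocked("news.example")             -> False (not on the list)
```

Real-world blockers add pattern rules, allowlists, and per-site exceptions on top of this, but the domain-suffix check is the heart of the technique.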

Collective shifts: social and political responses

When millions of users act at once, companies feel it. When regulators act, investors notice. Two broad moves stand out.

First: Regulation is rising on the agenda. Majorities in multiple surveys say they want stronger rules for tech platforms — from data protection to platform liability. That public appetite has made policymakers pay attention. Expect lawmaking, hearings, and tighter compliance obligations.

Second: Transparency demands are louder. People want to know how algorithms work, how content is moderated, and who profits from their data. Calls for algorithmic audits, transparency reports, and third-party oversight are no longer niche; they are mainstream.

What activists and civil society are doing

Nonprofits, journalists, and watchdog groups are not idle.

  • They publish investigations that reveal hidden practices.
  • They build simple guides so users can protect themselves.
  • They sue, lobby and mobilize to force disclosure.

These groups act as translators between complex technology and everyday citizens — and that role strengthens public scrutiny.


What companies are doing (and why it often feels slow)

Some firms have responded: rolling out privacy dashboards, publishing transparency reports, creating ethics boards. But change often feels incremental for several reasons:

  • Scale and legacy systems. Big platforms host billions of interactions every day; changing core systems is technically hard.
  • Profit incentives. Targeted ads and behavioral monetization are deeply embedded in business models. That creates a natural resistance to radical transparency or data minimization.
  • Regulatory uncertainty. Firms fear being first movers in an unclear legal landscape. They wait, tweak, and then react to law.

These factors mean that while companies talk about trust, many users see the fixes as too slow or cosmetic.

A practical playbook for citizens (what you can do)

  1. Audit permissions on your phones and apps. Reduce what’s shared.
  2. Use privacy-focused tools when practical: private browsers, tracker blockers, end-to-end encryption for sensitive chats.
  3. Read the headlines but trust independent fact-checks. Misinformation spreads faster than corrections.
  4. Support transparency: favour services that publish independent audits or clear data policies.
  5. Vote and advocate for sensible regulation that balances innovation with user rights. Public pressure changes corporate calculus.

Small steps. Real impact.

What would restore trust (the hard answers)

Trust needs three things: clear rules, genuine company accountability, and public control. Specifics matter.

  • Meaningful regulation. Not box-ticking, but enforceable rules on data use, algorithmic transparency, and liability for demonstrable harms. (Yes: many people want more regulation.)
  • Independent audits. Third-party checks of algorithms, privacy practices and AI systems.
  • New business models. Less reliance on surveillance advertising and more subscription or utility models that align incentives with users.
  • Responsive governance. Timely remedies for harms, simple user controls, and accessible explanations when automated decisions affect people’s lives.

If these are implemented in earnest, trust can be rebuilt — but it will take time and measurable proof.


Conclusion — a realistic note

People’s faith in Big Tech has frayed. That is clear from multiple public surveys and trust studies. The fix is not a single product update or a PR campaign. It’s a mix of regulation, business model rethinking, stronger oversight, and user empowerment. And the people who depended on tech for convenience are now demanding fairness, transparency and control. That is a healthy pressure. It will reshape how platforms operate — whether those platforms like it or not.

We have a favour to ask

LeftLion is Nottingham’s meeting point for information about what’s going on in our city, from the established organisations to the grassroots. We want to keep what we do free to all to access, but increasingly we are relying on revenue from our readers to continue. Can you spare a few quid each month to support us?

Support LeftLion
