Why Are Digital Apps One-Way Streets—and What Does It Mean for Our Freedom?

 




Rahul Ramya

29.08.2025




Are Apps Serving Us—or Ruling Us?

We all use apps every day—for banking, food delivery, shopping, even healthcare. But here’s the uncomfortable truth: most apps are one-way streets. They tell us what to do, but never truly listen to us.

My new essay, “Why Are Digital Apps One-Way Streets—and What Does It Mean for Our Freedom?”, dives into this reality.

Inside the essay:

  • A mother in Nairobi trapped by a predatory loan app that nearly destroyed her life.

  • ASHA health workers in India who fought back against a surveillance app—and won.

  • How everyday frustrations—like endless IVR loops, useless chatbots, or unreadable “Terms & Conditions”—are not accidents but designs that take away our choices.

The bigger picture?

Apps are not just tools. They are quietly shaping our freedoms, our dignity, and even our democracy.

But there’s hope: resistance works. When people organize, they can stop digital exploitation and reclaim technology for humanity.

Do you think apps make us freer citizens—or just obedient users?


A Nightmare for Borrowers: Predatory Loan Apps in Kenya

Story Highlight—Grace’s Ordeal

In Nairobi’s Mathare slums, Grace urgently needed money—her child had swallowed a coin, and doctors demanded 1,500 Kenyan shillings (roughly USD 12) before treatment could begin. In desperation, she downloaded a mobile loan app. She agreed to its terms without reading them, only to discover that she received just 1,050 shillings after a 450-shilling deduction. Worse still, the app demanded repayment of the full 1,500 shillings within just seven days—roughly 43% more than she actually received.

When Grace failed to pay on time, the harassment began. She was bombarded with threatening calls and texts—relentless pressure that nearly drove her into depression. What began as a desperate search for help became a digital trap.


Why This Case Matters

Grace’s story is more than a personal tragedy—it is a testament to how apps that claim to offer solutions can instead perpetuate exploitation:

  • Enabled by Unidirectional Design: The app was built to extract—data, money, peace of mind—not to give support or negotiate realistic terms.

  • Automated Harassment, No Human Empathy: Borrowers like Grace face ruthlessly programmed messages and calls that leave no room for negotiation or understanding.

  • Privacy Violations: These apps accessed contacts, messages, and more—using personal data not just for eligibility, but also to shame and threaten users when they couldn’t pay.


Broader Impact in Kenya

This is not an isolated case. Across Kenya:

  • A man earning 55,000 Ksh per month ended up borrowing over 1,071,000 Ksh across 52 apps—because lenders share no borrower data with one another, nothing stopped his debt from spiraling.

  • Taxi driver John Bigingi defaulted on a small loan and was threatened with the exposure of his contacts—his family and friends—to shame him into paying.

  • Many others face debt shaming, emotional distress, and privacy violations—all through relentless app-driven surveillance and control.


Fitting into the Philosophical Frame

Grace’s ordeal is more than a cautionary tale. It highlights how digital systems can reduce human beings—our urgency, vulnerability, and privacy—to profit streams:

  • An assault on human dignity: When apps turn our crises into opportunities for extraction, they violate our very being.

  • Surveillance as weapon: Personal data—contacts, messages, associations—becomes ammunition for control and shame.

  • Erosion of agency: Harassment via apps silences negotiation; humans become ruled, not heard or respected.


Summary Table

Aspect | Real-World Example | Implication
One-way design | Grace locked into the app’s terms with no negotiation | Users coerced, not empowered
Aggressive control | Harassing calls and threats | Reinforces inequality and trauma
Data misuse | Borrowers’ contacts exposed | Violates trust and community safety
Normalization | Many Kenyans trapped in similar cycles | A systemic design flaw, not a one-off failure


This case vividly demonstrates how apps can compound trauma, strip power, and violate dignity. 

The Problem of Unidirectional Apps

Why are most digital apps designed as one-way systems, where users are forced to follow preset instructions without real choices or meaningful support—and why do so-called AI customer services and IVR helplines often make the problem worse instead of solving it?

This is not just a minor inconvenience. It is a daily reality for millions of people. You download an app—whether it is for banking, food delivery, booking train tickets, or paying bills—and you quickly realize that the app talks to you, but never really listens to you.

  • Banking apps: You want to ask a specific question about why a transaction failed. Instead, the app gives you only pre-fixed options like “check balance,” “block card,” “reset password.” Your actual problem is not on the list.

  • Food delivery apps: Your food comes late or is missing items. You try customer support. An AI bot repeats, “We are sorry for the inconvenience. Your feedback has been recorded.” But your money is gone and your hunger remains.

  • Government service apps: When trying to book a passport or driving license slot, the system may freeze or show errors. You look for help, but the “support” is just a FAQ list written months ago, irrelevant to your situation.

This is the essence of unidirectionality: you are required to comply with what the app wants, but the app takes no responsibility to respond meaningfully to you.


The False Promise of AI Customer Service

Many companies now say, “Don’t worry, we have AI-powered customer service.” But in reality, these services are deliberately kept weak.

  • The “AI” is often just a chatbot that can handle only the simplest queries like “What is my order status?” or “How do I reset my password?”

  • If you ask anything slightly outside its script, the bot collapses into repeating: “I’m sorry, I didn’t understand that.”

  • Interactive Voice Response (IVR) helplines are no better. They force you to press endless numbers (press 1 for English, press 2 for balance enquiry, press 3 for complaints…) and still never give you a human response.

Such systems are not just useless—they are harmful. They waste your time, increase your frustration, and often prevent you from getting the help you actually need.


The Trap of “Privacy” and “Terms”

When you download an app, before you even use it, you are asked to accept “Terms and Conditions” and “Privacy Policies.”

But here lies another problem:

  1. Too Long and Complex: Most terms run into dozens of pages, written in complex legal English that an ordinary person cannot understand. You are forced to click “Accept” because otherwise the app won’t even open.

  2. Hidden Consent: In these terms, companies quietly include permissions to access your data—your location, your contacts, even your microphone or camera in some cases.

  3. No Real Choice: Even if you disagree, you cannot negotiate. It is a one-way contract: “Take it or leave it.”
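Point 1 can be made concrete with simple arithmetic. The sketch below (in Python, with an assumed word count—no specific app’s terms are being measured) estimates how long genuinely reading a terms-of-service document would take at an ordinary reading speed:

```python
# Rough estimate of the time needed to actually read a terms-of-service
# document. The 15,000-word figure is an illustrative assumption, not a
# measurement of any particular app's terms.
def reading_time_minutes(word_count: int, words_per_minute: int = 200) -> float:
    """Estimated reading time in minutes at a given reading speed."""
    return word_count / words_per_minute

tos_words = 15_000  # assumed length of a long terms-and-conditions document
minutes = reading_time_minutes(tos_words)
print(f"Reading {tos_words:,} words takes about {minutes:.0f} minutes "
      f"({minutes / 60:.1f} hours).")
```

At roughly 200 words per minute, an assumed 15,000-word policy demands well over an hour of careful reading—before even considering the legal vocabulary—which is why “just read before you accept” is not a realistic defense.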

For example:

  • A flashlight app once asked for permission to access the microphone and location. Why would a flashlight need to know where you are or record sound? The answer: to collect data and sell it.

  • Social media apps track not just what you post, but also what you type and delete before posting.

  • Banking and payment apps often ask for permission to read your SMS “for security.” But in reality, they get access to your personal messages, building detailed profiles about you.

This shows that privacy policies are not written to protect you but to protect the company from you in case you complain later.


Everyday Consequences for People

These problems are not abstract. They affect daily life:

  • Elderly citizens: Many older people struggle with app interfaces. When their pension app fails or their bank transaction is blocked, there is no human support. The IVR keeps looping, and they are left helpless.

  • Rural users in India or Africa: Many apps don’t even provide proper local language support. If you can’t read English well, you may press the wrong option or accept harmful terms without realizing it.

  • Students: Online exam platforms often crash in the middle of tests. The system blames the student’s internet, but provides no real-time help. Future opportunities are lost due to one-way design.

  • Small businesses: Sellers on e-commerce platforms like Amazon or Flipkart often get their accounts suspended automatically due to “policy violations.” They try to appeal, but the only responses are template emails with no human reasoning.

In all these cases, the common thread is: apps demand obedience, but deny responsibility.


Why Are Apps Built This Way?

There are three main reasons:

  1. Profit First: Companies save money by reducing human customer support and replacing it with cheap bots.

  2. Data Extraction: By forcing you through fixed paths, apps ensure you give away the maximum amount of data—location, habits, clicks.

  3. Control: If apps gave you real negotiation power, they would have to be accountable. By making everything one-directional, they remain in control.


The Larger Picture: Digital Colonies

Think about it: when you accept terms blindly, give away data without consent, and follow instructions without being heard—what is happening? You are being treated not as a citizen with rights, but as a data subject whose only role is to feed the machine.

This is very similar to colonialism. In the past, colonial rulers extracted resources from people without giving them power. Today, companies extract data instead of land, but the imbalance of power is the same.



Philosophical Questions Arising from the Essay

  1. Capabilities and Human Agency

    • If most apps give only one-way instructions, are we truly free to shape our learning, choices, and actions?

    • Do we still have “capabilities,” in Amartya Sen’s sense—the real freedom to live the lives we value—or are we reduced to following narrow digital pathways designed by corporations?

    • What happens to human creativity and problem-solving when our choices are limited to prefixed menus and automated replies?

  2. Freedom vs. Digital Compliance

    • Is freedom real when every interaction with apps is conditioned by accepting “terms and conditions” we cannot negotiate?

    • Are we free, or merely compliant subjects, when our consent is forced by design—accept or be excluded?

    • If even customer support denies us dialogue, are we slowly being trained to accept authority without question?

  3. Surveillance Capitalism and the Self

    • What does it mean for our existence when our daily actions—clicks, locations, conversations—are turned into data, owned and traded by others?

    • When surveillance capitalism predicts and shapes our behavior for profit, are we still acting as autonomous individuals, or as programmed consumers?

    • Can a person truly “be themselves” if algorithms silently reward or punish every action with visibility, offers, or denials?

  4. Subjects, Not Citizens

    • By turning us into predictable data subjects, do apps deny us the dignity of being full citizens in a democracy?

    • If democracy requires informed, questioning individuals, what happens when knowledge itself is filtered and controlled by corporate interests?

    • Are we moving from being political participants to becoming managed populations—like digital colonies of surveillance empires?

  5. Existence, Dignity, and Progress

    • Can there be genuine progress when technology measures success by engagement and profit, not by human well-being?

    • What happens to dignity when elders, students, or rural citizens are silenced by unidirectional systems that refuse to listen to their needs?

    • Is survival with dignity possible in a world where our private choices, mistakes, and desires are continuously monitored and monetized?


⚖️ Together, these questions open up the core philosophical tension:

  • Technology promises convenience, but may erode capabilities.

  • Apps promise freedom of access, but force compliance without dialogue.

  • Surveillance capitalism promises progress, but risks hollowing out democracy and dignity.


This entire system of unidirectional apps, forced acceptance of “terms and conditions,” empty privacy promises, and meaningless AI or IVR assistance is not a neutral design flaw. It is an assault by capital, knowledge resources, and power on our very being as human beings. It transforms us from citizens with rights into subjects to be ruled, extracted, and exploited.

Every click of acceptance, every forced pathway, every denial of human dialogue is part of a larger mechanism where corporations and elites secure permanent control:

  • Capital by monetizing our data and selling it back to us through targeted advertisements and services;

  • Knowledge resources by deciding what truths we can access and how reality itself is filtered;

  • Power by conditioning us to obey preset instructions, making us docile users rather than questioning citizens.

As members of a community, a society, or a nation, this reduces us to permanently ruled creatures—where our existence is governed not by democratic debate or shared values but by algorithms that extract value from us for the benefit of elites.

In this assault, our labor, our attention, our emotions, and even our private lives are converted into raw material for profit. The promise of digital freedom is revealed as digital subjugation: we are not empowered participants in progress but subjects endlessly exploited for the enrichment of a few.



What Must Be Done

If digital apps have become tools of control, then the answer is not blind acceptance but collective resistance. We must reclaim our rights as human beings, not obedient subjects of capital and algorithms.

  1. Clear and Honest Terms: No more tricking people with endless pages of legal language. Every app must show, in plain words, what it will do with our data before we give consent. Anything less is theft.

  2. Right to Human Help: People deserve to be heard. Every app must guarantee real human support—not just bots that repeat the same lines. Respect begins with conversation.

  3. Digital Literacy as Power: Using a phone is not enough. Citizens must learn what it really means to press “Accept.” Digital literacy must teach people how to guard their privacy, resist manipulation, and demand accountability.

  4. Public and Community Apps: Technology should serve people, not trap them. Governments, schools, and communities must build open, public alternatives—like Kerala’s software for schools—that place human needs above profit.

  5. Cooperatives of Users: Imagine drivers, shopkeepers, farmers, and teachers running their own apps together. When people control the platforms they use, exploitation ends and dignity begins.


From Obedience to Freedom

Today, most apps tell us what to do. We have no choice but to obey. This is not technology serving us—it is technology ruling us. It is the silent hand of surveillance capitalism, turning us into data mines and obedient subjects for the profit of a few.

But this is not destiny. It is a design—and what is designed can be redesigned. By questioning, resisting, and building our own systems, we can break free from one-way apps and build two-way communities.

Real freedom in the digital age will come only when technology listens as much as it speaks, protects as much as it collects, and serves as much as it controls.

Every forced click of “Accept Terms and Conditions” is not just a button—it is a warning. A warning that our rights, our democracy, and our dignity are at stake.

The choice before us is clear: remain permanent subjects of digital empires, or rise as free citizens of a digital commons built by and for the people.


A Call to Action

We must stop being silent users and start being active citizens. The digital world belongs to all of us, not to a handful of corporations. Every time we question, every time we demand clarity, every time we build or support community-owned alternatives, we weaken the grip of surveillance capitalism. The future of our freedom, our democracy, and our dignity depends on whether we accept being ruled—or whether we rise to reclaim technology as a tool for humanity.



Resistance and Victory: ASHA Workers Versus the Shield 360 Surveillance App in India

In 2022, community health workers in Haryana, India—known as ASHAs (Accredited Social Health Activists)—were suddenly ordered by the state government to install a new app called Shield 360 on their official smartphones. The app was presented as a “security tool” to prevent misuse of government-issued devices.

But soon, doubts began to grow. The app demanded sweeping permissions, including the ability to remotely access, monitor, update, and even delete apps on their phones. For workers who relied on their phones not only for official duties but also for personal communication, this meant constant surveillance.

One ASHA worker, Surekha, voiced her concern and consulted local IT experts. They confirmed her worst fears: Shield 360 was not a harmless tool but a surveillance system that tracked movements and digital activity (Pulitzer Center, 2023).

Instead of remaining silent, the workers organized. Through WhatsApp groups, union leaders mobilized thousands of ASHA workers across 22 districts of Haryana. On June 25, 2022, they staged a one-day sit-down protest. With smartphones in their hands, they marched to local health offices and demanded the immediate removal of Shield 360.

Their united voice worked. Under pressure, authorities suspended further installations of the app, effectively halting the digital surveillance regime. While traces of the app remained on some phones, the mass protest succeeded in blocking its expansion and set a precedent: ordinary citizens, even from marginalized groups, could fight back against intrusive technology and win.


Philosophical Angle: From Subjects to Citizens

The ASHA workers’ struggle is not just about an app—it is about the meaning of freedom in the digital age.

  • Surveillance treats people as subjects, whose actions must be monitored and controlled.

  • Resistance transforms them into citizens, who assert rights, demand accountability, and refuse silent obedience.

  • Their protest was not only against a piece of software but against the deeper assault on dignity, where human beings are reduced to tools of efficiency and control.

The ASHAs, by resisting, reclaimed their capabilities in Amartya Sen’s sense: the freedom to live and work with dignity, without being constantly reduced to data points for a surveillance state. Their act echoes Hannah Arendt’s idea that freedom lies in collective action—in the courage to appear, speak, and act together in public.

Thus, the story becomes more than a labor protest; it is a living example of how human beings resist being permanently ruled, reclaiming their space as agents of democracy in a digital world increasingly designed to subject them.


Summary Table: Meaning of the ASHA Resistance

Aspect | Real-World Example | Philosophical Meaning
Unidirectional control | Shield 360 app allowed state surveillance over workers | People reduced from workers with rights to subjects under watch
Collective resistance | ASHAs organized protests across 22 districts | Freedom is reclaimed when people act together, not alone
Suspension of the app | Government halted further installations | Proof that power is not absolute; citizens can resist and win
Defense of dignity | Workers refused to be silently monitored | Dignity means being treated as citizens, not as managed objects



Story from Kenya — Predatory Loan Apps

Title: "‘Traumatising’: how rogue digital loan apps in Kenya intimidate borrowers" (The Guardian)
Link: https://www.theguardian.com/global-development/2022/oct/12/traumatising-how-rogue-digital-loan-apps-in-kenya-intimidate-borrowers


Story from India — ASHA Workers vs. Shield 360 Surveillance App

Title: "How Healthcare Workers in India Fought a Surveillance Regime—and Won" (Pulitzer Center)
Link: https://pulitzercenter.org/stories/how-healthcare-workers-india-fought-surveillance-regime-and-won



