The Data Brokers
In a city of stolen memories, only truth leaves a trace.
THE BIOMETRIC CHECKPOINT
The elderly woman’s paper identification trembled in her weathered hands as she held it toward the scanner. Behind her, the queue for Lagos Metropolitan University’s graduation ceremony stretched across the plaza, hundreds of families waiting to watch their children collect degrees that would determine their Favorability ratings for life.
“Ma’am, retinal verification is required for entry.” The security attendant’s voice carried the practiced patience of someone who delivered this news dozens of times daily. His uniform’s biometric sensors pulsed green — fully compliant, fully tracked, fully visible to the system.
“I already showed you my papers.” Her voice cracked slightly. “That’s my granddaughter’s name — see? Adaeze Okafor, graduating with honors in Data Architecture.”
Nkechi Adeyemi watched from her office window in the Attribution Tower across the plaza, her morning coffee cooling in her hands. The university’s graduation venue had been chosen specifically for its proximity to the financial district — a reminder to graduates that their education was just the beginning of their data journey.
“I understand, ma’am, but paper documentation alone isn’t sufficient for educational venues. Security protocols require biometric verification for all attendees.” The attendant gestured to the scanner. “It’s for everyone’s safety.”
The old woman glanced at the growing line behind her, at the other families already processed and entering the auditorium. “I never agreed to eye scanning. This is private biological information.”
“The citywide consent update included educational access provisions. Your granddaughter’s enrollment automatically opted family members into enhanced verification requirements.”
Educational access, Nkechi thought bitterly. Another euphemism for control. Students with low family Favorability scores found themselves mysteriously rejected from universities. Those who got in discovered their course selections limited by parental compliance ratings. Even attending a graduation required surrendering biometric data that would be packaged and sold before the ceremony ended.
The woman’s shoulders sagged as she stepped forward, letting the scanner map the unique patterns of her iris. Another data point harvested, another small surrender in exchange for participating in what should have been a simple family celebration.
“The Washington portfolio has exceeded projections.” ARIA materialized beside her, translucent form shimmering with data flows. “Shall I extend the behavioral licensing window?”
Nkechi turned from the window. Jamal Washington — she’d been managing his data portfolio for three years now, helping him monetize everything from his shopping patterns to his sleep cycles while keeping his medical vulnerabilities hidden. His monthly data income exceeded what most Lagos residents earned in a decade.
“Extend it, but elevate the medical privacy threshold. His latest cardiology report showed some irregularities. We don’t want insurance algorithms adjusting his premiums.”
The irony wasn’t lost on her — she spent her days helping the wealthy profit from the very systems that demanded total transparency from everyone else. Through her window, the elderly woman shuffled into the graduation venue, her biometric data already being bundled with millions of others, sold to security companies who would use it to train the next generation of identification systems.
Her Consent Wallet chimed. Aya’s face appeared on the display, exhaustion written in every line.
“Can you meet for lunch? It’s about Kesi.”
The name alone made Nkechi’s chest tighten. Three years ago, she’d helped Aya structure a special data license for her younger sister — turning Kesi’s rare genetic condition into a revenue stream. Pharmaceutical companies paid premium rates for access to her medical data, funding experimental treatments. It had seemed like turning tragedy into hope.
“Of course,” Nkechi replied. “The usual place?”
After Aya’s image faded, Nkechi returned to her portfolio reviews. The Attribution Wall outside pulsed with transaction data — millions of micro-payments flowing from data users to data creators. Somewhere in that cascade was the elderly woman’s iris scan, worth perhaps 0.003 naira to her, but part of a package worth millions to the companies that would use it.
Consent is power. The slogan from the Framework’s early days felt like mockery now.
THE FRIEND IN NEED
The restaurant occupied a rare “privacy bubble” — one of the few remaining spaces in Lagos where attribution sensors couldn’t penetrate. The owners paid extraordinary fees to maintain the electromagnetic shielding, passing the cost on to customers who could afford discretion.
Nkechi arrived to find Aya already waiting, her usually meticulous appearance disheveled. The tablet in front of her displayed medical charts that made Nkechi’s stomach clench.
“They’ve suspended Kesi’s treatment.” Aya’s voice was barely above a whisper. “Yesterday morning, her Favorability Index dropped forty-seven points. Just like that. No warning, no explanation.”
“Show me the notification.”
Aya’s fingers trembled as she pulled up the official communication. The language was deliberately opaque, full of phrases like “resource optimization” and “compliance-adjusted care protocols.”
“This is tied to your health platform,” Nkechi said, recognizing the pattern. “You refused someone’s integration request.”
“MensaTech. They wanted full access to my users’ biometric data — heart rates, stress patterns, sleep cycles. I said no.” Aya’s laugh was bitter. “Three hours later, my twelve-year-old sister lost access to the gene therapy keeping her alive.”
Nkechi knew MensaTech well. Officially, it was just another health data company. Unofficially, everyone in the attribution industry knew it was Director General Mensah’s personal project, hidden behind shell companies and regulatory partnerships.
“They can’t legally link your business decisions to Kesi’s medical care,” Nkechi said carefully.
“They don’t have to. They just adjust her Favorability score based on ‘family behavioral patterns’ and let the hospital’s algorithm do the rest.” Aya pulled up a photo on her tablet — Kesi in her hospital bed, still smiling despite the equipment surrounding her. “The medicine exists, Nkechi. It’s in the same building. But the system says she’s not eligible anymore.”
“What do the doctors say?”
“That their hands are tied. Treatment allocation is algorithmic — they can’t override it without losing their medical licenses.” Aya’s voice cracked. “She has maybe six months without the therapy. Six months because I wouldn’t sell my users’ heartbeats to Mensah’s surveillance network.”
Nkechi’s mind raced through possibilities. She could try adjusting Kesi’s attribution profile, hiding the family connection to Aya. But that would require falsifying government records. She could seek alternative funding outside the Favorability system, but few organizations would risk defying algorithmic medical decisions.
“There might be someone who can help,” she said finally. “But it’s complicated. And potentially dangerous.”
“More dangerous than watching my sister die?”
Nkechi thought of Jamal Washington, whose data she’d been managing so carefully. She knew why his cardiology reports needed hiding — not from natural disease, but from what the Favorability system had done to his family. He’d withdrawn from public life, but he hadn’t stopped fighting. Just changed tactics.
“Let me make some calls,” she said. “But Aya, if we do this, we’re not just challenging a medical decision. We’re challenging the entire system that made it.”
“Good,” Aya said fiercely. “It’s about time someone did.”
THE MAN WHO LOST EVERYTHING
Jamal Washington’s estate sat twenty kilometers outside Lagos proper, in one of the zones where old money had bought privacy before it became impossible. As Nkechi’s car approached the gates, she noticed the subtle signs of someone who’d learned to be invisible — no smart home systems, no attribution sensors, even the security cameras were old analog models.
She found him in the garden, tending orchids with the careful attention of someone who’d learned that living things were fragile. He didn’t look up when she approached.
“I wondered when you’d come,” he said, his hands steady on the delicate blooms. “The Okoro case has been flagged in certain circles. A child’s life leveraged for data compliance — even by Mensah’s standards, it’s brazen.”
“You’re tracking Favorability manipulations?”
“I track everything related to him.” Jamal finally turned, and she saw the toll of the last three years written on his face. “Did you know I had a son? Tunde. Brilliant boy. Wanted to be a doctor.”
The past tense hit like a physical blow.
“Same genetic condition as the Okoro girl. Same treatment protocol.” He returned to his orchids, touching each bloom as if memorizing it. “When I testified at the city council against expanded biometric collection, my Favorability score stayed stable. But Tunde’s dropped. Not much — just enough to push him down the treatment priority list.”
“You couldn’t buy the treatment privately?”
“I tried. But the pharmaceutical companies had exclusive contracts with the health system. Treatment through official channels only.” His voice remained steady, but his hands trembled slightly. “By the time I learned to be quiet, to comply, to stop fighting their expansions, my son had been on the waiting list for eight months. The condition had progressed beyond treatment.”
Nkechi felt the weight of understanding. “So you disappeared.”
“I evolved.” He led her into the house, through rooms stripped of anything digital, to a study that looked more like a command center. Screens covered the walls, showing data flows she recognized — attribution patterns, Favorability adjustments, the hidden architecture of control.
“Two years of mapping their operations,” he said. “Every time someone challenges MensaTech, their family suffers ‘algorithmic complications.’ Every journalist who investigates Mensah finds their children rejected from universities. Every business that refuses integration watches their elderly parents lose healthcare access.”
“This is evidence of massive systemic — ”
“Corruption?” Jamal’s smile was bitter. “No. Everything is perfectly legal. Read the policies carefully — family behavioral modeling, social network risk assessment, generational compliance scoring. They wrote the rules to make cruelty algorithmic.”
He pulled up a secure file. “But there are others who remember what the Framework was supposed to be. Engineers who helped build these systems and are horrified by what they’ve become. We call ourselves the Transparency Initiative.”
“You’re planning something.”
“We’re documenting. Building cases. Waiting for the right moment.” He studied her. “Is Kesi Okoro that moment?”
“A twelve-year-old dying because her sister protected user privacy? If that’s not the moment, what is?”
Jamal nodded slowly. “Then we need to talk about what you’re willing to risk. Because once we move against Mensah directly, there’s no going back to your comfortable office managing portfolios for the wealthy.”
Nkechi thought about the grandmother that morning, surrendering her biometric data just to watch her granddaughter graduate. About all the small surrenders that added up to systemic control.
“Show me what you have,” she said.
THE UNDERGROUND
The abandoned data center in Mushin district looked like countless others — a relic from the early tech boom, left to decay when newer facilities made it obsolete. But beneath the rusted server racks and broken cooling systems, something else thrived.
“Welcome to the real Lagos,” Jamal said, leading her through electromagnetic shielding into a hidden basement. “The one that exists between the sensors.”
The space hummed with activity. Dozens of workstations showed attribution patterns from across the city. Engineers she recognized — some former colleagues, others competitors — worked alongside activists and reformed data brokers. At the center, a woman with graying hair orchestrated the operation with quiet efficiency.
“Isha Oladele,” Jamal introduced. “Former Chief Architect for the IAEC’s TransparentCore visualization system. Until she asked the wrong questions.”
“I asked why certain Favorability patterns were hard-coded to ignore government officials,” Isha said, not looking up from her screens. “The answer cost me my career, but gave me clarity.”
She pulled up a display that made Nkechi’s breath catch. It showed the real architecture of the Favorability system — not the public-facing version she knew, but the hidden layers beneath. Decision trees that prioritized compliance over health. Algorithms that identified potential dissidents through family connections. Punishment protocols disguised as resource optimization.
“This isn’t just corruption,” Nkechi said. “It’s — ”
“Social control through artificial scarcity,” Isha finished. “Make people believe resources are limited, that algorithmic distribution is fair, then use family bonds as leverage for compliance. It’s elegant, if you admire that sort of thing.”
“We’ve documented hundreds of cases,” another engineer added. “But documentation means nothing if no one sees it. That’s why we need someone with your access.”
“My access?”
“You manage portfolios worth billions,” Jamal said. “Your credentials can reach parts of the system we can’t. The audit logs that would prove Kesi’s treatment was suspended as retaliation, not medical necessity.”
“That would require breaching the IAEC repository during a synchronization window. If caught — ”
“You’d lose everything,” Isha acknowledged. “Your career, your comfort, your carefully constructed life. The question is whether that life is worth living while children die to ensure their parents’ compliance.”
Nkechi looked around the room at faces marked by loss and determination. These weren’t radicals or anarchists — they were people who’d helped build the system and couldn’t live with what it had become.
“What exactly are you planning?”
“We show the city what their consent really means,” Jamal said. “Not in technical language or buried in policy documents, but clear, undeniable truth. The Attribution Wall broadcasts to millions every day. Imagine if instead of transaction data, it showed the real cost of the system.”
“You want to hack the Attribution Wall?”
“We want to translate it,” Isha corrected. “Show people what those numbers really mean. Which children are dying. Which families are being destroyed. Which communities are being punished for asking questions.”
“And you need the audit logs to prove it’s intentional, not algorithmic accident,” Nkechi understood.
“The synchronization window is in four days,” Jamal said. “During those twenty minutes, security protocols relax to allow system updates. With your credentials — ”
“I’d be destroying my life.”
“You’d be saving others,” Isha said quietly. “Including Kesi Okoro.”
Nkechi thought of her office, her comfortable routine helping wealthy clients game the system. Then she thought of Kesi in her hospital bed, medicine just meters away but denied by an algorithm designed to punish her sister’s defiance.
“I need to think.”
“Think fast,” Jamal said. “Every day we delay is another day of children used as leverage, families destroyed for compliance, dignity sold for the illusion of fairness.”
As they led her back through the shielding, Nkechi noticed something she’d missed on entry. The walls were covered with photos — children, parents, grandparents. All victims of Favorability manipulation. All reduced to cautionary tales about the cost of resistance.
Tunde Washington’s photo was there, a bright-eyed boy who’d wanted to heal others. Next to him, dozens of other young faces. All denied treatment. All dead because their families asked the wrong questions.
“Four days,” she said. “I’ll have an answer in four days.”
But looking at those photos, she already knew what that answer would be.
THE HEIST
At 2:47 AM, Nkechi sat in the Transparency Initiative’s basement, her hands steady on the keyboard despite the magnitude of what she was about to do. Around her, team members monitored security systems, ready to alert her to any anomaly.
“Synchronization window opens in thirteen minutes,” Isha announced. “Remember, you need the Medical Resource Allocation Committee logs from the past six months. Focus on pediatric cases with family correlation flags.”
Nkechi had spent three days preparing, reviewing her access patterns to ensure tonight’s breach would appear routine. Her official story — investigating anomalies in the Washington portfolio — would hold up to surface scrutiny.
“Security rotation just changed,” someone called out. “Night shift is settling in. Optimal timing.”
Her Consent Wallet sat powered down beside her, its tracking disabled. After tonight, she’d never use it again. The thought should have terrified her. Instead, she felt oddly calm.
“Window’s open,” Isha said. “You’re clear to proceed.”
Nkechi’s fingers moved with practiced precision. Her credentials passed through layer after layer of security, each acceptance bringing her deeper into the system she’d helped wealthy clients exploit for years. The interface was familiar, even comforting in its complexity.
The medical logs appeared in cascading windows. At first, they looked like standard bureaucratic records — patient IDs, treatment protocols, resource allocation scores. But Isha’s decryption algorithm revealed the hidden metadata.
“My God,” Nkechi breathed.
Every pediatric treatment denial was tagged with family compliance scores. Parents who questioned biometric expansion saw their children’s health prioritization drop. Journalists investigating Mensah found their elderly parents’ medication “discontinued due to resource optimization.” Business owners who refused MensaTech integration watched their families systematically denied care.
“Focus,” Isha reminded her. “Download everything, process it later.”
The files transferred with agonizing slowness. Each one contained another story of algorithmic cruelty, another life sacrificed to ensure compliance. Tunde Washington’s case was there — his treatment delayed forty-three days after his father’s council testimony. By the time Jamal had learned to be silent, the window for successful treatment had closed.
“Intrusion detection just pinged,” one of the monitors called out. “Automated system, not active investigation. You have maybe three minutes before it escalates.”
Nkechi accelerated her extraction, grabbing files in bulk now. Quality control could come later — what mattered was evidence.
“Got it,” she announced as the last file transferred. “Cleaning my access trail.”
“No,” Isha said suddenly. “Leave breadcrumbs. Let them know someone was there. Make them nervous.”
“That will trigger — ”
“Exactly. While they’re investigating the breach, we’ll be executing the real plan.”
Nkechi disconnected, her heart finally beginning to race as the magnitude hit her. She’d just stolen classified government data that proved systematic medical discrimination. There was no going back.
“Phase one complete,” Jamal announced to the room. “Now we prepare for the revelation.”
THE PREPARATION
Over the next thirty-six hours, the Transparency Initiative worked with desperate efficiency. The stolen data was processed, verified, and transformed into something the public could understand. Technical language was translated into human terms. Statistics became stories. Policy documents revealed their true purpose.
Nkechi found herself working alongside Isha, learning to see the system she’d navigated for years through new eyes.
“Look at this,” Isha said, highlighting a policy section. “Enhanced Pediatric Resource Allocation Protocol. Sounds helpful, right? But read the implementation details — children’s medical priority is directly tied to their parents’ Favorability scores. They’re literally holding kids hostage for good behavior.”
“I helped clients navigate around these provisions,” Nkechi admitted. “I never thought about what happened to families who couldn’t afford brokers like me.”
“That’s how it works. The wealthy buy exemptions while everyone else suffers. The system appears fair because some people succeed within it.”
As they worked, news trickled in from the outside. The IAEC had discovered the breach and launched an investigation. Attribution brokers were being questioned. Nkechi’s absence from her firm had been noticed.
“They’ll trace it to me eventually,” she said.
“Counting on it,” Jamal replied. “Your prominence will make the revelation impossible to ignore. When Lagos’s most successful attribution broker exposes the system, people pay attention.”
The modified attribution node was a marvel of engineering — designed to integrate seamlessly with the Attribution Wall’s display system while being nearly impossible to remove without shutting down the entire Exchange. It would override the normal transaction display with their translated evidence.
“The medical data is powerful,” Isha said, reviewing their final presentation. “But I want to add something. The internal communications between Mensah’s office and MensaTech. Show people this isn’t accidental — it’s policy.”
“We don’t have those,” Nkechi pointed out.
“We do now.” A young engineer pulled up new files. “Someone inside IAEC just leaked them. Apparently, your breach inspired others to act.”
The communications were damning. Discussions of “compliance encouragement through familial pressure.” Strategies for “behavioral modification via resource access.” Even a cheerful memo about the success of “medical leverage protocols” in achieving integration targets.
“They’re proud of it,” Nkechi said, disgusted. “They’re documenting their cruelty like it’s an achievement.”
“Because to them, it is,” Jamal said grimly. “They’ve created a system where human suffering equals operational success. Time to show Lagos what they’ve built.”
THE REVELATION
The morning crowd at the Attribution Exchange plaza was larger than usual — students hoping to improve their educational access scores, elderly citizens checking if their medical prioritization had changed overnight, workers monitoring their employment favorability. All of them generating data with every breath, every glance, every digital interaction.
Nkechi stood among them, her identity masked by a borrowed Consent Wallet with fabricated history. Aya was beside her, clutching the modified node that would transform the city’s monument to data capitalism into something else entirely.
“Maintenance window in thirty seconds,” Nkechi murmured. “You remember the port location?”
“Third kiosk from the left, hidden behind the information panel.” Aya’s voice was steady despite everything at stake. “Jamal’s team is monitoring security. We’ll have warning if they detect us.”
The Attribution Wall continued its hypnotic display — millions of micro-transactions flowing like digital water, beautiful and incomprehensible. Soon, it would show something very different.
“Window’s open,” Nkechi said.
Aya moved with practiced casualness, pausing at the information kiosk as if checking transit schedules. Her hand found the maintenance port, the node sliding in smoothly. For a moment, nothing changed.
Then the Wall flickered.
The smooth cascade of transactions froze, pixelated, reformed. Where numbers had flowed, faces appeared. Children denied treatment. Families destroyed by compliance scores. Communities punished for questioning the system.
Each image came with a story, translated from bureaucratic language into brutal clarity:
“Chioma Adebayo, age 7. Cancer treatment delayed after mother reported sexual harassment by attribution officer. Died waiting for care.”
“Elderly residents of Makoko district. Medication access reduced after community protests against forced biometric collection. Forty-three preventable deaths.”
“Students denied university admission due to parents’ low compliance scores. Dreams destroyed for algorithmic efficiency.”
The crowd’s murmur grew to a roar. People recognized names, saw their own stories reflected. A woman cried out — her nephew’s face on the display, dead because his mother had questioned why her iris scan was needed to buy groceries.
Then came the internal communications. Not hidden, but celebrated. Officials congratulating each other on successful “behavioral modification.” Charts showing how medical leverage improved data integration rates. Memos discussing which communities to target next.
“They’re not even hiding it,” someone near Nkechi gasped. “They’re proud of killing our children.”
Security forces surrounded the plaza, but they seemed uncertain. The Attribution Wall was too integrated with the city’s financial systems to simply shut down. Disabling it would crash markets, freeze transactions, create chaos worse than any revelation.
Above it all, the display showed one final warning: the Ambient Attribution Protocol. In three weeks, every space in Lagos would be monitored. Every conversation recorded. Every thought assigned a compliance score.
The crowd’s anger transformed into something more dangerous — organized resistance. People began documenting the display, sharing it across networks faster than any system could track. The truth was spreading like wildfire.
“We should go,” Nkechi said, noting enforcement officers entering the plaza.
But Aya stood transfixed, tears streaming as she watched the truth of her sister’s condition displayed for all to see. Not a medical decision, but punishment for protecting her users’ privacy.
“Let them see,” she said. “Let everyone see what consent really means in this system.”
Security was closing in when something unexpected happened. The enforcement officers slowed, then stopped, staring at the Wall. One of them made a choked sound — his daughter’s name was scrolling past, marked as “educational access denied due to father’s questioning of overtime policies.”
Even those enforcing the system were victims of it.
THE CONSEQUENCES
They arrested Nkechi three hours later at her apartment. She’d spent the time destroying anything that might lead to other Initiative members, though she knew the gesture was largely symbolic. They would find what they wanted to find.
The processing was efficient — biometric scans, behavioral analysis, Favorability score reduction to absolute zero. Digital death, swift and complete. But through the holding cell’s window, she could see smoke rising from several districts. The revelation had sparked more than conversation.
Her interrogator arrived after six hours — not a security officer, but Director General Mensah himself. He entered with the confidence of someone who’d never faced real opposition, settling into the chair across from her with paternal disappointment.
“Ms. Adeyemi. Do you understand what you’ve done?”
“Shown people the truth about your system.”
“My system?” He seemed genuinely puzzled. “This is society’s system. Built by consensus, implemented through proper channels, serving the collective good.”
“Serving your bank account,” Nkechi countered. “How much does MensaTech pay you for each forced integration?”
“MensaTech provides essential health infrastructure. Integration ensures optimal resource allocation.” He leaned forward. “Do you think healthcare is unlimited? Do you think we can treat everyone equally when resources are scarce?”
“Resources aren’t scarce. You create artificial scarcity to ensure compliance.”
“We create incentives for productive behavior. Those who contribute to society receive society’s benefits. Those who resist…” He shrugged. “Actions have consequences.”
“Like Kesi Okoro? What did that twelve-year-old do to deserve denial of treatment?”
“Her sister refused to participate in essential health monitoring. How can we optimize care without complete data? The child’s suffering is unfortunate but necessary — it will encourage better choices in the future.”
“You’re using children as leverage.”
“We’re using natural human bonds to encourage compliance with democratically established policies.” He stood, straightening his suit. “The Ambient Protocol will activate on schedule. This brief disorder only proves why comprehensive monitoring is necessary. Order will be restored.”
He paused at the door. “Your Initiative accomplished nothing. Tomorrow, people will need their services, their healthcare, their education. They’ll accept the system because the alternative is chaos. You’ve simply reminded them why strong governance is essential.”
After he left, Nkechi sat in the darkness, watching the city through her small window. She could see the Attribution Wall from here, still displaying its translated truths. How long before they managed to shut it down? How long before the stories were buried again under bureaucratic language?
But across the city, something had changed. The smoke wasn’t from riots — it was from burning Consent Wallets, citizens destroying their digital chains. The revelation had shown them what they’d really agreed to. Some were choosing to refuse.
THE RECKONING
The next morning brought unexpected visitors. Not more interrogators, but a judicial advocate with news that made no sense.
“You’re being released pending review,” the advocate announced. “The Chief Justice has invoked emergency protocols. There will be a public inquiry into the medical allocation system.”
“That’s impossible. Mensah would never — ”
“Director General Mensah is under investigation. The Favorability system’s own algorithms flagged him for systematic bias after analyzing the data you revealed. The Third Law’s decay protocols have activated — the system is recognizing its own oppression patterns.”
Nkechi was escorted out in a daze. In the plaza outside, hundreds had gathered. Some held signs with names from the Attribution Wall. Others simply stood witness. As she emerged, a cheer went up — not for her, but for what she’d helped reveal.
Aya was waiting, tears of relief on her face. “Kesi’s getting treatment. An emergency medical fund was established — people contributing what they can to help those denied by the algorithms. Thousands of donations in the first hour.”
“And the Ambient Protocol?”
“Suspended pending review. Three regional governors have refused implementation. The Framework’s own safeguards are activating — turns out even Mensah couldn’t corrupt everything.”
Over the following weeks, the city transformed. The Attribution Wall continued its regular function, but with new additions — transparency metrics showing how decisions were made. Citizens could challenge their Favorability scores, demand explanations in plain language. It wasn’t perfect, far from it. But it was better.
Nkechi found herself in an unexpected role — advising the reform commission on how to prevent future exploitation. Her broker’s license was revoked, her comfortable life gone. But she discovered something better: purpose.
“The system isn’t evil,” she testified before the commission. “It was designed to ensure fair compensation for data. But complexity became opacity. Opacity enabled exploitation. We need radical transparency — not just in data collection, but in how that data is used.”
Mensah was eventually convicted of abuse of authority. His network dismantled, his policies reversed. But Nkechi knew others would try similar schemes. The price of digital dignity was eternal vigilance.
THE NEW NORMAL
Two years later, Nkechi stood before a classroom of citizens learning to read their own attribution profiles. The Community Data Center occupied the ground floor of what had been the Attribution Tower — a deliberate symbol of democratization.
“Remember,” she told her students, “you have the right to understand. Every algorithm that affects your life must explain itself in terms you can grasp. If they can’t explain it clearly, they can’t use it.”
Through the window, she could see the Attribution Wall, still displaying its data flows but now accompanied by plain-language translations. The Consent Clarity Index had become as important as transaction values. Not perfect, but improving.
Kesi Okoro sometimes attended the classes, healthy now, studying to become a doctor. She wanted to understand the systems that had nearly killed her, to ensure they never had that power again.
“Ms. Adeyemi?” An elderly woman raised her hand. “What if they make it complicated again? What if they find new ways to hide?”
Nkechi recognized her — the grandmother from that morning at the graduation, the one whose biometric surrender had started Nkechi’s journey toward truth.
“Then we expose them again,” Nkechi said simply. “And again. And again. Until they learn that consent means understanding, not just compliance.”
After class, she walked through the plaza where everything had changed. The Attribution Wall caught the evening light, beautiful and complex but no longer incomprehensible. Children played beneath it, their parents watching without fear that every movement was being scored.
Not revolution, but evolution. Not perfect justice, but progress toward it. The Framework remained, but transformed — serving its original purpose of recognizing data as labor deserving fair compensation. The exploitation hadn’t ended, but it could no longer hide behind complexity.
Her Consent Wallet — a new one, configured for maximum transparency — chimed with a message from Aya. “Family dinner tonight? Kesi’s cooking.”
Nkechi smiled. Some things were worth more than any amount of data could capture. The system might quantify everything, but it couldn’t value what truly mattered — the bonds between people who refused to let algorithms define their worth.
The city pulsed with data around her, but she walked through it unafraid. She had learned the most important lesson: Power wasn’t in gaming the system or even destroying it. Power was in understanding it, exposing it, and demanding it serve humanity rather than harvest it.
The Attribution Wall displayed its endless cascade of information. But now, people could read what it really meant. And that made all the difference.
Author’s Note
The Data Brokers explores how systems designed to protect can evolve into tools of control when complexity obscures their true function. The story’s Attribution Economy and Favorability Index reflect real concerns about social credit systems, surveillance capitalism, and the growing gap between those who understand technology and those who are merely subject to it.
The narrative suggests that meaningful resistance comes not from destroying these systems but from demanding they operate transparently. In an age where algorithms increasingly determine life outcomes, the right to understand those algorithms becomes as important as traditional civil rights.
Today’s technology is creating tomorrow’s power structures. The question is whether we’ll maintain enough clarity to ensure they serve humanity’s dignity rather than exploiting it. The answer lies not in the technology itself, but in our willingness to demand that complex systems explain themselves in human terms.
Every consent checkbox hides a story. Every algorithm makes moral choices. Every system reflects the values of its creators. The challenge is ensuring those values remain visible, contestable, and ultimately human.