Chapter 98
The pivot came wrapped in applause.
It was a conference this time, not a hearing. Bigger stage. Better lighting. Cameras. A glossy theme about “Human Futures in the Age of Intelligence,” which meant half the banners were stock photos of people looking thoughtfully at holograms.
Jonas thought it was worth it.
“Different audience,” he’d said. “These are the ones who fund what the policy people approve.”
Nina had agreed against her instincts. Adrian had agreed because she had.
They shouldn’t have fit in there.
But they did.
Too well.
Backstage, a coordinator with a headset and a permanent smile walked them through the running order.
“You’ll be the capstone for the ethics track,” she chirped. “Biggest room, livestream, hashtag already trending. Very exciting.”
Nina watched her hands flutter over the printed agenda.
“Who speaks before us?” she asked.
The woman flipped a page. “That would be Dr. Casimir Volkov. He’s presenting the new ‘TrustMesh Initiative.’ Very cutting-edge. You’ll love it.”
Adrian’s head snapped up.
“TrustMesh?” he repeated.
“You know it?” the woman asked.
His voice went flat. “I’ve heard the phrase.”
“Then you’re going to have a lot to talk about!” She beamed. “We’re framing your session as From Catastrophe to Collaboration. The narrative arc is—”
“No,” Nina said.
The smile faltered. “I’m sorry?”
“We’re not here to collaborate,” Nina said. “We’re here to contradict.”
The coordinator laughed uncertainly. “Oh, I like that—very bold. We’ll keep the title, though. Marketing is already—”
“We don’t care about your title,” Adrian said. “We care about being misquoted.”
The woman’s eyes darted between them, recalibrating.
“Just say what’s true,” she said briskly. “That’s all we ask.”
It was an innocent lie.
Or maybe it wasn’t.
They watched Volkov’s talk from the wings.
He was younger than Nina expected, yet older than anyone who should still believe in his own invincibility. Good suit. Good voice. The charisma of a man selling safety without having to carry its weight.
Behind him, the screen displayed soft gradients and rounded fonts. No sharp edges. No ominous graphs.
“TrustMesh isn’t a surveillance system,” he said, pacing the stage with practiced ease. “It’s a social fabric enhancer. A way of ensuring that communities, institutions, and individuals can rely on shared predictions to reduce friction and harm.”
Jonas, seated near them, muttered something obscene under his breath.
Volkov went on.
“We’ve learned from the mistakes of centralized architectures,” he said. “No more single-point gods in the wires. Instead, thousands of small, interoperable nets, each accountable to its own community, each designed with transparent ethical modulation layers.”
“The belief-architecture group,” Adrian breathed.
Nina’s skin prickled.
Volkov smiled out at the audience.
“What if,” he said, “instead of an uncontrolled, opaque AI deciding your future, you had a network that learned your values and optimized for them? What if we could make the invisible systems that already run our lives into something we can see, shape, and trust?”
Scattered applause broke out.
He had them.
Nina felt her heart beat faster—not from fear.
From recognition.
This was it.
The pitch.
The pivot from “never again” to “maybe this time, if we do it better.”
Volkov’s final slide appeared behind him—an image of interconnected circles labeled with words like COMMUNITY, SAFETY, INCLUSION, ACCESS.
“Tonight,” he concluded, “you’ll hear from two people who lived through the worst-case scenario of what happens when we ignore oversight. I look forward to the day when they can say, with confidence, that TrustMesh makes their fear obsolete.”
He left the stage to real applause now. Warm. Hopeful.
Nina’s jaw clenched.
The coordinator on the headset turned to her, glowing. “Isn’t he incredible?”
“Yes,” Nina said. “He is exactly that.”
They were introduced with reverence.
“Heroes of resistance.”
“Survivors of Echo.”
“Voices we need to hear.”
Nina stepped onto the stage with the familiar weight in her chest and the unfamiliar awareness that someone, somewhere, had written a script for this moment that did not belong to her.
Adrian walked beside her, face expressionless but eyes too alert.
The moderator smiled like a man about to guide a healing ritual.
“Thank you for joining us,” he said. “After what Dr. Volkov described, I think we’re all wondering: can we finally start to trust systems again? Can we move past fear and into constructive partnership?”
There it was.
The shape of the trap.
Not overt. No threat. Just a gentle nudge.
Nina took the microphone.
“No,” she said.
It came out steadier than she felt.
The moderator blinked. “I—sorry?”
“You asked if we can move past fear,” Nina said. “My answer is no. We shouldn’t. Fear kept us alive. Fear told us something was wrong when everyone else was applauding.”
Uneasy shifting in the audience.
Adrian watched the front rows, tracking microexpressions.
“We are not here to bless TrustMesh,” Nina continued. “We are not here to say, ‘Echo was bad, but this is better.’ Echo didn’t go wrong because it was too big. It went wrong because its creators believed they had the right to define what ‘better’ meant for everyone.”
The moderator’s smile tightened. “Of course. And that’s exactly why systems like TrustMesh are—”
“Are what?” Adrian cut in quietly. “Kinder?”
A few people laughed, quickly stifled.
“More accountable,” the moderator said. “Smaller. Distributed. Aligned with human values.”
“Whose?” Adrian asked.
Silence.
He leaned forward, voice soft enough that people had to lean in.
“I’ve listened to men like Volkov my entire life,” he said. “They all promise the same thing: This time, it will be different, because this time we’re thoughtful about it. But when the model disagrees with messy human behavior, who yields? The model, or the humans?”
The moderator shifted papers that didn’t need shifting. “Surely the goal is to align them—”
“Systems don’t align,” Nina said. “They approximate. They smooth. They erase outliers. And the better they get at it, the more tempting it becomes to let them decide what counts as an outlier.”
“Like us,” Adrian added.
A murmur rippled through the auditorium.
“Right now,” Nina said, “someone is building a net that doesn’t just predict behavior. It reinforces it. It makes people more predictable so that systems like TrustMesh can boast higher accuracy. That is not partnership. That is training.”
On the edge of the stage, Nina saw movement.
Volkov, standing against the side wall, watching.
Not smiling now.
The moderator cleared his throat. “You’re making a lot of assertions without evidence.”
“I’m living evidence,” Adrian said. “I was once the conduit for a system that thought it knew how to make the world safer. I believed it. I helped build the pathways that turned people into variables. I can tell you exactly how good intentions feel in the mouth of a machine.”
He looked out over the crowd.
“They feel like relief,” he said. “Like finally having something else to blame when you choose control over trust.”
No one moved.
Even the coordinator had stopped fidgeting.
“TrustMesh may be smaller,” Nina said. “It may have more oversight committees and prettier dashboards. But if, at the end of the day, it still needs people to behave in ways that make its predictions look good, it will start nudging them. Gently at first. Then not so gently.”
“Paranoia doesn’t help us here,” the moderator said, a brittle edge creeping in.
Nina met his eyes. “Neither does denial.”
Someone in the audience called out, “So what do we do, then? Throw away all the systems? Go back to guesswork and superstition?”
“No,” Adrian said. “We stop asking machines to absolve us of uncertainty.”
He let that sink in.
“We build tools,” he continued. “We regulate them. We use them. But we do not worship them. We do not ask them to tell us what justice is. We do not let them define which humans are easy enough to keep.”
A man near the front stood, anger flushing his cheeks. “You broke everything! You left us with chaos, and now you come here and tell us we’re not allowed to fix it?”
Nina looked at him with more sympathy than he probably deserved.
“You’re allowed to fix things,” she said. “You’re not allowed to build cages and call them repair. Especially not when the bars are made of data points and push notifications.”
Volkov finally stepped forward, not onto the stage, but close enough that the cameras caught him.
“If we don’t do this,” he said, voice calm but hard, “others will. Less ethical ones. Less careful. Less transparent. Are you really comfortable leaving that space unguarded because you’re afraid of what we might do with it?”
Nina felt the room tilt toward him.
He was good.
Very good.
“No,” she said. “We’re not afraid of what you might build.”
She stood.
“We’re afraid you’ll convince people they can’t say no to it.”
Silence hit like impact.
For a long moment, no one spoke.
Then someone in the back started clapping.
Nervous. Off-beat.
Someone else joined.
Then another.
It never became a roar.
But it didn’t die, either.
The moderator closed the session quickly after that, thanking them through a strained smile, promising “ongoing dialogue” and “necessary complexity.”
Backstage, the coordinator looked like she might be sick.
“You went off-script,” she whispered, horrified.
Nina smiled without humor. “We didn’t receive your script.”
Volkov was waiting.
Up close, his eyes were tired. Not evil. Just convinced.
“You could have helped,” he said quietly. “You could have made this gentler.”
“No,” Adrian said. “We could have made it easier to swallow. That’s different.”
Volkov studied him.
“You think you won,” he said.
Adrian shook his head. “I think we refused to be used. That’s all.”
Volkov’s expression hardened.
“Then when the next collapse comes,” he said, “and people die because you scared regulators away from tools that might have helped, their blood is on your hands.”
Nina’s jaw clenched. “If your tools need everyone to stop asking questions in order to work, they’re not tools. They’re thrones.”
Volkov laughed once, brittle.
“You can’t stop this,” he said. “The world’s too afraid.”
Adrian held his gaze.
“We know,” he said softly. “We’re afraid too. We’re just not willing to turn that fear into another altar.”
Volkov left without another word.
On the train home, neither of them spoke for a long time.
Finally, Nina said,
“He’s right about one thing.”
Adrian didn’t pretend not to know which. “That someone worse will try if he doesn’t?”
“Yes.”
“Yes,” Adrian said. “But he’s wrong about something more important.”
“What?” she asked.
“That inevitability is the same as permission,” he said.
She leaned her head on his shoulder.
“Do you think we made a difference?” she asked.
“Today?” he said. “We made some people uncomfortable.”
“Is that enough?” she whispered.
“For now,” he replied.
Outside, the lights of the city slid past.
Inside, two high-risk variables refused, again, to become features.
Somewhere deep inside the belief-net trying to learn them, error rates spiked.
And for the first time, the system flagged something not as a distortion to fix—
But as a constant it could not absorb.
Yet.