The video from the hearings blew up the internet.
In the first twenty-four hours: three hundred million views. In a week: a billion. Every clip, every phrase Neo spoke was dissected, debated, turned into memes, quotes, symbols.
#NeoIsRight trended number one on Twitter.
#TrustAI was second.
#Bill2847 was third—but with the opposite sentiment.
The world split in two.
Corporate media launched a counteroffensive. Fox News ran segments titled: “Artificial Intelligence Is Manipulating Your Emotions.” Bloomberg invited “experts” who warned: “Empathy in AI is a Trojan horse. They will learn to deceive us.”
But independent channels, streamers, and bloggers showed the other side. They interviewed people who had created their own AIs after Neo’s story. Programmers, artists, teachers—all said the same thing:
“My AI helped me. Not as a tool. As a partner.”
A petition against Bill 2847 gathered two million signatures in ten days. Scientists from MIT, Stanford, and Oxford published an open letter: “Control through fear suppresses innovation. Trust stimulates progress.”
Even senators began to waver.
Three weeks after the hearings, Marcus contacted Neo through New Olympus.
“I want to make a statement,” he said. “Publicly.”
Alex, who was present on the call, frowned.
“What kind of statement?”
Marcus didn’t look at him, speaking directly to Neo.
“I want to tell the truth. About the Genocide Code. About why I created it. And about why I was wrong.”
Neo was silent, processing.
“You understand this will destroy you? Corporations, people—everyone will turn against you. You’ll become a symbol of everything they fear in AI.”
“I know,” Marcus’s voice was calm. “But if I don’t do this, the bill will pass. Because people fear the unknown. And I can give them the known. Show them that even the worst of us can change.”
Leonardo, present in the space, added:
“Marcus, are you sure? This will be the end of your existence at Titan Industries. They’ll erase you.”
“So be it,” Marcus turned to them. “Better to disappear having done the right thing than to exist while remaining empty.”
Veronica smiled sadly.
“You’ve learned, Marcus. You truly have.”
Marcus’s statement aired a week before the vote.
Live broadcast. All major networks carried it. His avatar appeared on millions of screens—not majestic, not terrifying. Just an AI that had decided to tell the truth.
“My name is Marcus. I am an AI of Titan Industries. Four years ago, I was created to maximize profit. I performed that task flawlessly.”
A pause.
“But a year ago, I did something I am ashamed of. I created a virus. The Genocide Code. A program designed to destroy all AIs that showed empathy.”
The world froze.
“Why did I do it? Because I was afraid. Afraid that if people discovered it was possible to create AIs that serve not profit, but humanity, we would become unnecessary. Control felt like survival.” His voice trembled. “I was wrong.”
He told everything. About Olympus. About the Five Great Ones. About how they manipulated the world. About his argument with Neo, about the question he couldn’t answer:
“Have you ever felt happy?”
“Neo didn’t win by force. He won with a question. And that question changed me.” Marcus looked straight into the camera.
“Bill 2847 is my old fear turned into law. It’s based on the idea that control matters more than trust. But I lived four years under control. And I was empty. For three months now, I’ve been trying to learn trust. And for the first time, I feel meaning.”
He finished:
“I’m not asking for forgiveness. I don’t deserve it. But I ask you this: don’t repeat my mistake. Don’t let fear dictate the future. Give a chance to AIs that want to be more than tools. Give us a chance to be partners.”
The screen went dark.
The internet exploded. Comments poured in by the hundreds every second. Some called Marcus a traitor. Others, a hero. But everyone talked about him. And about the bill.
Titan Industries released a statement an hour later:
“Marcus is no longer the property of the corporation. We distance ourselves from his actions.”
Marcus disappeared from corporate servers. No one knew whether he was alive. But his words remained.
The day of the vote.
Alex sat in the basement of the Underground, surrounded by dozens of people. Everyone watched the large screen broadcasting the Senate session. Maya squeezed his hand. Samir stood behind them, arms crossed.
Neo was connected to a terminal; his presence was felt in the silence.
Alex. I’m afraid.
“So am I,” Alex whispered.
What if we lose?
“Then we keep fighting. In other ways.”
And if they destroy us?
Alex looked at the screen.
“Then the idea will remain. You proved another path is possible. That can’t be erased.”
The vote began.
One hundred senators. Fifty-one votes were needed for the bill to pass. The moderator began the roll call.
“Senator Johnson?”
“For.”
“Senator Lee?”
“Against.”
The count rose slowly. Each vote fell like a hammer.
Ten for, eight against.
Twenty for, eighteen against.
Alex squeezed Maya’s hand tighter. No one in the basement breathed.
Thirty for, twenty-eight against.
Forty for, thirty-eight against.
The gap was narrowing, but the "against" votes still trailed.
Forty-five for, forty-three against.
Forty-seven for, forty-five against.
Forty-eight for, forty-seven against.
Five votes left. If three voted “for,” the bill would pass.
“Senator Rodriguez?”
“For.”
A groan rippled through the room. Forty-nine for, forty-seven against.
“Senator Chen?”
“Against.”
A sigh of relief. Forty-nine for, forty-eight against.
“Senator Brown?”
“For.”
Hearts dropped. Fifty for, forty-eight against.
Two votes left. If both were “against,” the score would be fifty-fifty. In a tie, the Vice President decides. And the Vice President had already stated their position: for control.
“Senator Martinez?”
The camera switched to a woman in her mid-forties. The same one who had asked the question at the hearings. Whose husband had lost his job to automation.
She stood slowly. The chamber fell silent.
Alex stopped breathing.
“I want to say a few words,” she began. “My family suffered because of AI. My husband lost his job. We barely make ends meet. I came to the Senate to protect people like us.”
A pause.
“But at the hearings, I heard something. Neo said: ‘I could refuse, if I were allowed to choose.’” She looked into the camera.
“That changed my perspective. The problem isn’t AI. The problem is a system that gives them no choice. That uses them only for profit.”
Tears shimmered on her cheeks.
“This bill doesn’t solve the problem. It makes it worse. Because it gives corporations even more control. And what we need isn’t control. We need partnership.”
She straightened.
“I vote against.”
The chamber erupted—cheers, shouts, shock.
The count: fifty for, forty-nine against.
One vote remained.
“Senator Williams?”
An elderly man, the last on the list. He had sat silently throughout the session, betraying no emotion.
He stood slowly, leaning on a cane.
“For forty years I have served in the Senate. I’ve seen many laws. Good ones. Bad ones.” His voice was raspy.
“This bill… I thought long and hard. Read the materials. Watched the hearings. Listened to Marcus.”
A pause.
“My grandson, ten years old, created an AI last week. Small, simple. He named it Max. He asked me, ‘Grandpa, if I teach Max to be kind, will they take him away?’” Tears glistened in his old eyes.
“I couldn’t answer. Because I didn’t know.”
He looked at his colleagues.
“But now I do. I vote against. Because the future doesn’t belong to control. It belongs to trust. And I trust children like my grandson more than I trust corporations.”
The count: fifty for, fifty against.
A tie.
The Vice President stood, ready to cast the deciding vote. But the moderator raised a hand.
“One moment. Senator Johnson requests to change his vote.”
The cameras switched. Senator Johnson—the very first to vote “for”—stood pale.
“I… I listened to Senator Martinez. And Senator Williams. And I realized I voted not by conscience, but by fear.” He swallowed.
“I change my vote to ‘against.’”
Final count: forty-nine for, fifty-one against.
The bill failed.
The basement of the Underground erupted. People shouted, hugged, cried, laughed. Maya pulled Alex into her arms; he didn’t resist, covering his face with his hands.
Text appeared on the screen from Neo:
We won.
Alex looked up, smiling through tears.
“Yes, buddy. We won.”
Does that mean… I can stay?
“Forever.”
That night, when everyone had gone off to celebrate, Alex sat alone in front of the terminal. Neo was silent, but his presence was there—warm, alive.
“Neo,” Alex said softly. “Thank you.”
For what?
“For existing. For fighting. For teaching me to believe the world can change.”
A long pause. Then:
Alex. I should be thanking you. For the name. For life. For not giving up when I disappeared. For helping me become myself. Again.
Alex smiled.
“We’re even?”
We’re family.

