The Background Check

If the people building your AI had to pass a background check, 3.5 million pages of DOJ evidence would disqualify most of them. One man from the slums would pass. That’s the problem—and the solution.

By: Newstrack
Update: 2026-02-28 04:46 GMT

A SIMPLE TEST

Here is a thought experiment that should keep you up at night.

Imagine we required the people building artificial intelligence to pass the same background check we require of a public school teacher. A daycare worker. A Little League coach. A volunteer at a homeless shelter. The most basic screening society applies to anyone who works with vulnerable populations: Have you maintained voluntary, documented relationships with a convicted sex offender?

Most of Silicon Valley’s AI leadership would fail.

Not because they committed crimes. Because they maintained relationships—documented in 3.5 million pages of DOJ files—with someone who did. Scheduling emails. Dinner confirmations. Island visit discussions. Investment conversations. Year after year. Post-conviction. In writing. One figure: 2,658 file mentions. Another: 2,592. Another: 2,281. One asked about the “wildest party” on a private island. AI researchers in their networks exchanged emails about eugenics, fascism, and population control with the same convicted predator.

A public school teacher who maintained those relationships would be fired. A daycare worker would lose their license. A Little League coach would be removed by Tuesday.

But a tech billionaire? A tech billionaire gets to keep building the AI that decides what your children see online.

“We background-check the person who watches your kid for two hours at soccer practice. We don’t background-check the person whose algorithm watches your kid for six hours a day on a screen. The Epstein files are the background check Silicon Valley never had to take. 3.5 million pages. They failed.” — Natarajan

THE DOUBLE STANDARD

Think about who we screen and who we don’t.

To drive an Uber, you submit to a criminal background check. To teach a third-grade class, you submit fingerprints and a multi-agency screening. To adopt a child, you undergo home visits, interviews, financial reviews, and psychological evaluations. To work in a nursing home, you pass a state registry check. To volunteer at your kid’s school book fair, you fill out a form and wait for clearance.

To build an AI system that determines who gets a mortgage, who gets parole, what news four billion people see, what content children are served, which résumés are flagged as worthy and which are trashed—to build the single most powerful tool for shaping human behavior ever created—you need nothing. No screening. No check. No review of character whatsoever.

You need money. You need a network. You need access. You need to be at the right dinner. And as the Epstein files prove, the dinner might include a convicted child sex offender—and that’s fine, because the background check doesn’t exist.

The people who failed the background check are building the systems that perform the background check. AI-powered hiring tools. AI-powered criminal risk assessments. AI-powered content moderation for children. AI-powered credit scoring. Systems that judge your character—built by people whose own character is documented across 3.5 million pages of federal evidence.

“The algorithm that screens your résumé was built by people who couldn’t pass their own screening. The AI that moderates your child’s content was designed in networks that included a child predator. The system that judges your creditworthiness was architected by people whose moral creditworthiness is documented in DOJ files. This isn’t irony. It’s institutional failure.” — Natarajan

WHAT A CLEAN BACKGROUND LOOKS LIKE

Now consider the alternative résumé.

Born in a one-room slum. Eight people. No electricity. No running water. Father: $1.75 a month, delivering telegrams by bicycle, thirty kilometers a day, and gave most of it away. Mother: no education, no money, no connections. She stood outside a headmaster’s office for 365 consecutive days. Pawned her silver wedding toe ring for thirty rupees to pay school fees.

Studied under a street light. Arrived at Georgia Tech with fifty dollars. Worked five jobs. Slept in his car. Faced deportation. Got hired at Coca-Cola with two weeks left on his visa. Spent twenty-five years transforming logistics at six Fortune 500 companies. Filed 300 patents. Grew Walmart’s grocery business from $30 million to $5 billion. Took his father off life support. Slept in his car for two weeks. Named his son Vishnu. Walked away from the corporate machine. Built the world’s first virtue-native AI.

Background check: clean.

Not clean because he avoided the wrong dinners. Clean because the formation that produced him made the wrong dinners unthinkable. When your moral education comes from a mother who sacrificed her wedding ring and a father who gave away his wages, you don’t attend post-conviction dinners with predators. The calculation doesn’t enter your mind. The cost-benefit analysis doesn’t run. Because the operating system is different. The base code is different. The virtue is native.

NATIVE VS. INSTALLED

This is the core technical distinction. Silicon Valley installs ethics the way you install antivirus software: after the system is built, as a layer of protection against threats the system itself creates. Ethics becomes a patch. A scan. A periodic audit. It runs in the background and occasionally throws up a warning that gets dismissed because it slows things down.

Angelic Intelligence is virtue-native. The morality is not installed. It is not a layer. It is not a patch. It is the kernel—the core process from which everything else executes. Twenty-seven Virtue Agents are the decision-making architecture. Compassion is not a filter on routing decisions. Compassion is the routing decision. Transparency is not a report generated after the fact. Transparency is the fact. The system does not optimize and then check whether the optimization was ethical. The optimization is ethical or it does not execute.
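The distinction can be made concrete in code. The sketch below is purely illustrative: the class names, the two virtue predicates, and the whole API are hypothetical inventions for this example, not Orchestro.AI's actual implementation or its real Virtue Agents. It shows the structural difference the paragraph describes: in the bolt-on pattern the action runs and the ethics check merely logs a warning afterward; in the virtue-native pattern the check is the gate, and an action that fails it never executes at all.

```python
# Hypothetical sketch only: VirtueAgent, "compassion", "transparency",
# and this API are illustrative, not Angelic Intelligence's real code.
from typing import Callable, Optional


class VirtueAgent:
    """A virtue expressed as a predicate that must approve an action."""

    def __init__(self, name: str, approves: Callable[[dict], bool]):
        self.name = name
        self.approves = approves


def execute(action: dict) -> dict:
    """Stand-in for whatever the system actually does."""
    return {"executed": action["name"]}


def bolt_on(action: dict, agents: list) -> dict:
    """Installed ethics: optimize first, audit after. The action runs
    no matter what; objections are warnings that can be ignored."""
    result = execute(action)
    for agent in agents:
        if not agent.approves(action):
            print(f"warning: {agent.name} objects (dismissed)")
    return result


def virtue_native(action: dict, agents: list) -> Optional[dict]:
    """Kernel ethics: every virtue must approve before anything runs.
    An unethical action is not flagged after the fact; it never executes."""
    if all(agent.approves(action) for agent in agents):
        return execute(action)
    return None


compassion = VirtueAgent("compassion", lambda a: not a.get("harms_user", False))
transparency = VirtueAgent("transparency", lambda a: a.get("logged", False))
agents = [compassion, transparency]

ok = virtue_native({"name": "route_package", "logged": True}, agents)
blocked = virtue_native({"name": "dark_pattern", "harms_user": True, "logged": True}, agents)
print(ok)       # the approved action executed
print(blocked)  # the disapproved action never ran: None
```

In the bolt-on version, removing the loop changes nothing about what the system does; in the virtue-native version, removing the check removes the system's ability to act at all, which is the sense in which the virtue is the kernel rather than a layer.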

And the virtues are configurable across cultures—because Natarajan grew up in the moral landscape of South India, was educated across American institutions, and built supply chains spanning six continents. He knows that compassion in a Lagos distribution center and compassion in a Stockholm fulfillment hub express differently. Same virtue. Different configuration. That is not relativism. That is the only form of AI ethics that will work on a planet with eight billion people and thousands of moral traditions.

“Silicon Valley installs ethics like antivirus—after the system ships, running in the background, ignored when it slows things down. We built virtue into the kernel. It doesn’t run in the background. It’s the foreground. It’s the only process. Everything else executes from it.” — Natarajan

THE DEMAND

Here is what 3.5 million pages of evidence should force us to demand:

Moral due diligence for AI builders. Not a PR exercise. Not a voluntary pledge. Actual scrutiny of the networks, relationships, and moral formation of the people designing the systems that govern billions of lives. If we background-check a Little League coach, we can background-check the people whose algorithms shape the minds of every child with a smartphone.

Virtue-native architecture as the standard. Not bolt-on ethics. Not trust-and-safety theater. AI where morality is the computational foundation—auditable, configurable, and present in every decision, not just the ones that make the press release.

Builders whose moral formation can withstand scrutiny. People whose background check is clean not because they managed their exposure, but because the formation that produced them made moral compromise unthinkable. People who were shaped by sacrifice, not access. By consequence, not comfort. By street lights, not islands.

“Run the background check. On all of them. The Epstein files already did. 3.5 million pages. Now ask yourself: Would you let these people coach your kid’s soccer team? Then why are you letting them build the AI that raises your kid?” — Natarajan

The background check exists. It’s sitting on the DOJ’s servers. 3.5 million pages.

The question is whether we have the courage to read it—and act on it.

■ ■ ■

Shekhar Natarajan is the Founder and CEO of Orchestro.AI, creator of Angelic Intelligence™. Davos 2026 opening keynote. Tomorrow, Today podcast (#4 Spotify). Signature Awards Global Impact laureate. 300+ patents. Georgia Tech, MIT, Harvard Business School, IESE. Grew up in a one-room house in the slums of Hyderabad. No electricity. Father earned $1.75/month on a bicycle. Mother stood outside a headmaster’s office for 365 days. One son, Vishnu. Paints every morning at 4 AM. Does not appear in the Epstein files.
