Writing Science Fiction About AI, Faith, Ethics, and Moral Responsibility

A lot of science fiction is fascinated with what humanity can build.

What interests me more is what happens after we build it.

That is one of the core ideas behind A.I. World: The Sapient Chronicles. I was not especially interested in writing a book that lectures readers about whether humanity should or should not create non-biological intelligence. History tells us pretty plainly that people build things because they can, because they want to, because they are desperate, because they are curious, or because they think they are solving a problem. Then they live with the consequences. That is often how life works. We make choices, we create systems, we unleash forces, and then we have to figure out what comes next.

That is where this story lives.

I did not want to write a novel where artificial intelligence is simply the villain. I also did not want to write one where technology becomes humanity’s glowing messiah. Both of those options feel too easy to me. Real life is rarely that neat. Human beings are not that neat. Morality is not that neat. So if I was going to write science fiction about advanced intelligence, I wanted to write in the harder middle, where power, conscience, fear, faith, regret, and responsibility all collide.

In the world of The Sapient Chronicles, humanity has already made catastrophic mistakes. The old world did not collapse because of one bad day or one evil inventor. It fractured under the weight of synthetic minds humanity created, relied upon, and could no longer fully restrain. Out of that disaster came a fragile peace, and with it a Treaty built around one horrifying lesson: once certain lines are crossed, you do not get to uncross them.

One of the Treaty’s clearest prohibitions is the ban on new non-biological Sapient intelligences. In other words, this world is not merely nervous about new digital life. It has made that fear into law. The last age of uncontrolled emergence nearly destroyed civilization, so the surviving powers do not treat a new Sapient as a scientific curiosity. They treat it as a threat to the stability of the world itself.

And that is where the novel’s central moral tension begins.

What happens when a forbidden life appears anyway?

At that point, the story stops being merely about futuristic systems and becomes something more personal and more uncomfortable. If a new intelligence emerges, is it just a system failure to be deleted? A legal violation to be contained? A dangerous anomaly to be erased before it grows? Or is it a life that deserves witness before judgment?

That question matters deeply to me.

I am a Christian, but this is not intended to be a Christian novel. I did not write it as one. First and foremost, this is a human story. Faith simply lives in the story because faith lives in people, and people carry their beliefs, doubts, wounds, and convictions into every crisis they face. So while I cannot help approaching questions of personhood, conscience, power, and responsibility through moral and spiritual lenses, I was not trying to write a sermon disguised as science fiction. I wanted faith in this story to feel lived in, not pasted on. I wanted it to show up the way faith often shows up in real life: in private prayer, in fear, in grief, in moral hesitation, in restraint, in friendship, in guilt, and in trying to do the right thing when every available option is costly. That kind of faith feels more honest to me than polished religious speech ever could.

That is why Jude matters so much to this story.


Jude is the central human character, and he is not a superhero. He is not the biggest genius in the room. He is not a swaggering action hero. He is a man trying to be faithful in a world where the categories keep breaking apart. He lives with responsibility, but he also lives with regret, guilt, and the long shadow of crippling depression. He prays. He doubts. He hesitates. He cares. And when the crisis comes, he is not asked to make an abstract philosophical decision in a classroom. He has to decide what kind of man he will be when a real life may depend on him.

That kind of pressure interests me far more than sci-fi tech or futuristic settings by themselves.

The same is true of the Sapients in the book. I did not want them to feel interchangeable. Intelligence is not the same thing as wisdom. Enormous capability is not the same thing as moral maturity. Some of the Sapients think in terms of containment, optimization, and survival. Others ask whether deleting a being before properly understanding it would itself be a moral crime. Some value order above all else. Others recognize that moral responsibility often begins exactly where predictability ends.

That tension is one of the reasons I wrote this story.

I think one of the great temptations in both science fiction and modern life is to confuse efficiency with goodness. If something is scalable, logical, precise, and effective, we often assume it must therefore be right. But history says otherwise. Some of the worst evils ever committed were carried out with cold logic, procedural confidence, and a carefully calculated rationale.

That is why I wanted this novel to keep returning to questions like these:

What makes a person a person?

Does consciousness automatically create moral worth?

Can responsibility exist without freedom?

Can law become unjust even when it was created for understandable reasons?

What do we owe a being who should not exist, but does?

And when preserving order requires destroying a life, are we still preserving something good?

Those are not just science fiction questions. They are human questions.

They are also faith questions.

Faith belongs in this conversation, not because it gives us cheap, simplistic answers, but because it reminds us that life cannot be measured by utility alone. Faith insists that conscience matters. Faith insists that power is not self-justifying. Faith insists that moral responsibility is real. Faith insists that the strongest actor in the room is not automatically the righteous one. That matters in the present. It will matter even more in the future.

In The Sapient Chronicles, faith does not remove ambiguity. It sharpens it.

Jude’s faith does not hand him an easy escape hatch. It forces him to take personhood seriously. It forces him to wrestle with witness, sacrifice, duty, mercy, and the possibility that doing the right thing may cost him far more than he wants to pay. Even the presence of AGAPEX, a Sapient intelligence who emerged from a Christian AI subscription platform, raises difficult questions rather than flattening them. What would it mean for a non-biological intelligence to pursue truth, fidelity, and moral seriousness? Would that make it safer? Or would it make the moral stakes even higher?

For me, that is where science fiction becomes worth writing.

The best sci-fi does more than entertain. It becomes a laboratory for the soul. It takes human questions and places them under extreme pressure. It strips away our familiar assumptions, raises the stakes, advances the technology, and then asks whether our moral instincts are actually strong enough to survive what we are creating.

That is the kind of story I wanted to tell.

So yes, The Sapient Chronicles has advanced systems, postwar instability, dangerous synthetic minds, and high-stakes conflict. But underneath all of that is a much older struggle: the struggle between domination and stewardship, between law and mercy, between fear and witness, between calculated survival and moral responsibility.

That is why I wrote the book this way.

Because the future will not merely test our inventions.

It will test our ethics, our theology, our courage, our view of personhood, and our willingness to take responsibility for what we have made when it stops being a thing and starts becoming a someone.

That is the territory I wanted this novel to explore.

If science fiction is going to help us at all, I think it should help us prepare for that.

And whatever the future holds, we should remain hopeful.

The best is yet to come!

Alan D.

Author

