Singularity

Waiting for the touch!

Another word study, one where you think you know what a word means, but actually, you don't. The usual assumption is that singularity refers to a unique quality of a thought, action, or object. In mathematics and astrophysics, however, it originally referred to a point where the conventional rules break down, such as the center of a black hole. But today, the word has taken on yet another meaning.

In 1965, mathematician I. J. Good introduced the idea of an intelligence explosion: machines designing increasingly advanced versions of themselves until they surpass human intelligence. Later, in the 1980s and 1990s, science fiction writer and mathematician Vernor Vinge popularized "singularity" as the name for that moment, the point at which machine intelligence exceeds our own and the future becomes impossible to predict.

Now, after decades of books, movies, and speculation, we are thoroughly familiar with the idea of thinking machines. We live at the intersection of two realities: one where AI remains a tool with limited capabilities, and another, reinforced by fiction, where AI is an autonomous, self-aware force. It's almost as if we are being trained to accept a future that hasn't arrived yet.

Because the truth is, we haven't reached the singularity yet. The AI models we see in the media, the chatbots, the predictive algorithms, even the most sophisticated neural networks, are still just machines made by humans, constrained by human design.

What headlines call revolutionary AI is, at its core, a tool: fast, complex, and increasingly impressive, but far from the sentient intelligence imagined by science fiction.

This cycle, where imagination precedes reality, is not new. In the early years of cinema, Georges Méliès's A Trip to the Moon (1902) depicted a fantastical moon landing via cannon, an absurdity at the time. Decades later, we landed on the moon. In newspaper comics, Dick Tracy's two-way wrist radio, later upgraded to a video watch, seemed outlandish. Today, smartwatches are everywhere.

Again and again, fiction introduces ideas that, in time, become real. From horses to horseless carriages, from telegraphs to telephones, from carriages to spacecraft, the pace of change accelerates, driven by human imagination. Fiction isn't the only engine of change, of course, but it is uniquely good at predicting it.

If this technological growth were mapped, it wouldn't be a slow, steady incline with occasional dips. It would look more like an exponential curve, steepening toward an uncertain, potentially runaway trajectory. We stand at the early stages, able to look back and see the gradual rise; but when we look forward, we must crane our necks to glimpse what's ahead. We struggle to keep our footing.

And the future? The future is never here. That’s its definition.

This entry was posted in Commentary.