Discussion about this post

Faithless Digital Vagabond

I guess I should say (in thinking about this now) that I've always operated under the assumption that in understanding "where we are" we can, in some significant measure, know where we're going. The idea of evolution, as opposed to revolution. The innovation of AI sits at the intersection of what seem like endless vortices of factors, many of the most important of which face a VERY uncertain future. Now, with the future of AGI so wide open, "revolution" won't begin to touch what's in store for us.

We used to choose between half empty, half full, and all full. Now we have glasses, ice trays, crucibles, an atmosphere: in essence, endless options going forward for filling receptacles, not to mention the states of what these vessels are filled with. What boggles our minds today may be simple arithmetic for future generations.

This Substack post will be something I refer people to until it is replaced with newer, better information. Anything that disentangles us from dogmatic thinking is a net positive, which I believe this approach seeks to do.

I think Mr. Shapiro has definitely thought about this a lot . . . it's a good frame to use for thinking about AGI and our future in relation to it, as well as for understanding what different individuals (and their own spectrums of wants, desires, and fears) will reveal us to be; so many of us have been waiting for a future that approaches asymptotically, some with excitement, some with fear, many with both. The cure for bad science isn't to throw science away, it's to do more science. We've been given some thoughtful ideas here, and I feel better equipped in my own thinking going forward. Clarity is a rare commodity, and it's even rarer to find clarity when expressing ideas. I wonder (and fear) whether AGI will ultimately be a major contribution in revealing Nick Bostrom's "Vulnerable World" hypothesis as sound, and whether the hell it brings will outweigh the heaven. Thanks, David.

Now, here's the MS Copilot rewrite, made to be "punchier":

- Let me cut to the chase: I’ve always believed that understanding where we are helps us figure out where we’re going. That’s not revolution—it’s evolution. And with AI, we’re not just crossing boundaries; we’re colliding with them. AGI isn’t a single leap forward—it’s a cascade of unknowns. Revolution alone won’t cut it.

We used to argue over half-full or half-empty glasses. Now we’ve got ice trays, crucibles, and entire atmospheres to fill. The future isn’t binary—it’s multidimensional. What stumps us today might be child’s play tomorrow.

This post is a placeholder—a snapshot before deeper insights arrive. Anything that breaks us out of rigid thinking is a win. That’s the spirit behind this.

David Shapiro clearly gets it. His framework helps us think about AGI not just as tech, but as a mirror for our hopes, fears, and contradictions. We’re chasing a future that creeps closer but never quite lands. The fix for bad science isn’t rejection—it’s better science. And what’s laid out here? It’s thoughtful. It sharpens my own lens.

Clarity is rare. Expressing it is rarer. I worry AGI might validate Bostrom's "Vulnerable World" hypothesis—and I'm not sure the trade-offs will be worth it.

Thanks, David. -

Close, but it doesn't really sound or feel like me. Still, it mostly gets the job done, and maybe that's the point. Same as it ever was: we improve things, and sometimes make them worse. Maybe that's the constant here.

Roni GBZ Jr.

Yeah, that makes sense. I think AGI is meaningless in the same way that a dream of reaching "peak humanity" or being "fully human" is ultimately meaningless. Human thought is sometimes that way, because we mistakenly think this vector is supposed to be an arrow pointing at a destination. Of course it isn't, both mathematically and spiritually, but this realization can be very alarming to people who require definitions to feel safe.

Ultimately I think AGI already exists, but as a vector, as you said, and any and all future developments in its abilities will just be more points along that vector. Sometime in the future, historians will look back and do what historians do, which is to draw lines and definitions. They will say this or that time was when AGI was "achieved." That'll come to pass. At the present moment, though, AGI is just the current placeholder for the myth of a loving or a vengeful god, depending on the person.

