Here is an essay version of my class notes from Class 19, the last class of CS183: Startup. Errors and omissions are mine.
The following three guests joined the class for a discussion:
- Sonia Arrison, tech analyst, author of 100 Plus: How the Coming Age of Longevity Will Change Everything, and Associate Founder of Singularity University
- Michael Vassar, futurist and past President of the Singularity Institute for Artificial Intelligence (SIAI)
- Dr. Aubrey de Grey, gerontology expert and Chief Science Officer at the SENS Foundation
Credit for good stuff goes to them and Peter, who gave the closing remarks. I have tried to be accurate. But note that this is not an exact transcript.
Class 19 Notes Essay—Stagnation or Singularity?
Peter Thiel: Let’s start by having each of you outline your vision of what kinds of technological change we might see over the next 30 or 40 years.
Michael Vassar: It’s a lot easier to talk about what the world will look like 30 years from now than 40 years from now. Thirty seems tractable. Today, we’ve gone from knowing how to sequence a gene or two to thousand-dollar whole genome sequencing. Paul Allen is running a $500 million experiment that seems to be going very well. This technological trajectory is both exciting and terrifying. Suppose that, after 30 years, we have a million times today’s computing power and a hundred times today’s algorithmic efficiency. At that point we’d be in a position to simulate brains and such. And after that, anything goes.
But this kind of progress over the next 30 years is by no means something we can take for granted. Getting around bottlenecks, such as energy constraints, is going to be hard. If we can manage that, we’re at the very end. But I expect that there will be a lot of turmoil along the way.
Aubrey de Grey: We have a fair idea of what technology might be developed, but a much weaker idea of the timeline for development. It is possible that we are about 25 years away from escape velocity. But there are two caveats to this supposition: first, it obviously depends on sufficient resources being deployed toward the technological development, and second, even then, it’s roughly 50-50 that we get there. And there would seem to be at least a 10% chance of not getting there for another 100 years or so.
In a sense, none of this matters. The uncertainty of the timeline should not affect prioritization; we should be doing the same things regardless.