James Hughes is the executive director of the Institute for Ethics and Emerging Technologies and host of Changesurfer Radio. He discussed preparation for the future and how futurists can have an impact on public policy.
Hughes listed a series of assumptions he makes about AGI and the Technological Singularity, such as that AGI is likely, but probably dangerous and radically alien. While he does not adhere to either a utopian or dystopian view of the Singularity, Hughes did suggest that there are real risks involved, and that our assumptions about AGI may need to be analyzed in some detail.
For example, do we really want a perfectly utilitarian AGI, or a perfectly truthful AGI? Such AGIs might not turn out in ways that are beneficial to humans.
While AGI is likely, Hughes suggested that we might also want to augment human intelligence in ways that allow us to keep pace with AGI. Keeping pace with technological advances, including AGI, would put us in a better position to deal with any potential problems and consequences.
Other considerations Hughes highlighted included robot/AI rights and the possibility that we will need to regulate and license superintelligences in order to better control their use.