Matches in DBpedia 2016-04 for { <http://wikidata.dbpedia.org/resource/Q7524096> ?p ?o }
Showing triples 1 to 33 of 33, with 100 triples per page.
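The listing below was produced by the triple pattern in the header. As a minimal sketch (not part of the source page), the same pattern could be sent to a SPARQL endpoint over HTTP; the endpoint URL and JSON result handling here are assumptions, and the live DBpedia endpoint may return different data than the 2016-04 snapshot shown.

```python
import json
from urllib import parse, request


def build_query(resource_uri: str) -> str:
    """Build the SPARQL query matching { <resource> ?p ?o }."""
    return "SELECT ?p ?o WHERE { <%s> ?p ?o }" % resource_uri


def fetch_triples(resource_uri: str,
                  endpoint: str = "https://dbpedia.org/sparql"):
    """POST the query to a SPARQL endpoint (assumed URL) and return
    the parsed JSON result bindings, one dict per matching triple."""
    data = parse.urlencode({
        "query": build_query(resource_uri),
        "format": "application/sparql-results+json",
    }).encode()
    with request.urlopen(request.Request(endpoint, data=data)) as resp:
        return json.load(resp)["results"]["bindings"]


# Example usage (requires network access):
# rows = fetch_triples("http://wikidata.dbpedia.org/resource/Q7524096")
# for row in rows:
#     print(row["p"]["value"], row["o"]["value"])
```

Each binding pairs a predicate (`?p`) with an object (`?o`), matching the predicate/object columns of the bullets below.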
- Q7524096 subject Q15731661.
- Q7524096 subject Q558331.
- Q7524096 subject Q6454451.
- Q7524096 subject Q7012308.
- Q7524096 subject Q7164762.
- Q7524096 subject Q7168634.
- Q7524096 subject Q8492866.
- Q7524096 subject Q9014965.
- Q7524096 abstract "In futurology, a singleton is a hypothetical world order in which there is a single decision-making agency at the highest level, capable of exerting effective control over its domain, and permanently preventing both internal and external threats to its supremacy. The term was first defined by Nick Bostrom. An artificial general intelligence having undergone an intelligence explosion could form a singleton, as could a world government armed with mind control and social surveillance technologies. A singleton need not directly micromanage everything in its domain; it could allow diverse forms of organization within itself, albeit guaranteed to function within strict parameters. A singleton need not support a civilization, and in fact could obliterate it upon coming to power. A singleton has both potential risks and potential benefits. Notably, a suitable singleton could solve world coordination problems that would not otherwise be solvable, opening up otherwise unavailable developmental trajectories for civilization. For example, Ben Goertzel, an AGI researcher, suggests humans may instead decide to create an "AI Nanny" with "mildly superhuman intelligence and surveillance powers", to protect the human race from existential risks like nanotechnology and to delay the development of other (unfriendly) artificial intelligences until and unless the safety issues are solved. Furthermore, Bostrom suggests that a singleton could hold Darwinian evolutionary pressures in check, preventing agents interested only in reproduction from coming to dominate. Yet Bostrom also regards the possibility of a stable, repressive, totalitarian global regime as a serious existential risk. The very stability of a singleton makes the installation of a bad singleton especially catastrophic, since the consequences can never be undone. Bryan Caplan writes that "perhaps an eternity of totalitarianism would be worse than extinction".".
- Q7524096 wikiPageWikiLink Q11468.
- Q7524096 wikiPageWikiLink Q1425056.
- Q7524096 wikiPageWikiLink Q1489259.
- Q7524096 wikiPageWikiLink Q1531622.
- Q7524096 wikiPageWikiLink Q1566000.
- Q7524096 wikiPageWikiLink Q15731661.
- Q7524096 wikiPageWikiLink Q183493.
- Q7524096 wikiPageWikiLink Q188867.
- Q7524096 wikiPageWikiLink Q2254427.
- Q7524096 wikiPageWikiLink Q2264109.
- Q7524096 wikiPageWikiLink Q4027185.
- Q7524096 wikiPageWikiLink Q4168796.
- Q7524096 wikiPageWikiLink Q460475.
- Q7524096 wikiPageWikiLink Q489268.
- Q7524096 wikiPageWikiLink Q558331.
- Q7524096 wikiPageWikiLink Q6454451.
- Q7524096 wikiPageWikiLink Q7012308.
- Q7524096 wikiPageWikiLink Q7164762.
- Q7524096 wikiPageWikiLink Q7168634.
- Q7524096 wikiPageWikiLink Q8492866.
- Q7524096 wikiPageWikiLink Q9014965.
- Q7524096 wikiPageWikiLink Q943121.
- Q7524096 comment "In futurology, a singleton is a hypothetical world order in which there is a single decision-making agency at the highest level, capable of exerting effective control over its domain, and permanently preventing both internal and external threats to its supremacy.".
- Q7524096 label "Singleton (global governance)".