Superintelligence: Paths, Dangers, Strategies by Bostrom, Nick VERY GOOD
US $14.09
Approximately RM 59.66
Condition:
“Clean and tight. Good jacket”
Very Good
A book that has been read but is in excellent condition. No obvious damage to the cover, with the dust jacket included for hard covers. No missing or damaged pages, no creases or tears, and no underlining/highlighting of text or writing in the margins. May be very minimal identifying marks on the inside cover. Very minimal wear and tear.
Shipping:
Free USPS Media Mail™.
Located in: Cary, North Carolina, United States
Delivery:
Estimated between Sat, 2 Aug and Fri, 8 Aug
Returns:
No returns accepted.
Coverage:
Read item description or contact seller for details.
(Not eligible for eBay purchase protection programmes)
Seller assumes all responsibility for this listing.
eBay item number: 257039310984
Item specifics
- Condition
- Very Good
- Seller Notes
- “Clean and tight. Good jacket”
- Book Title
- Superintelligence: Paths, Dangers, Strategies
- ISBN
- 9780199678112
About this product
Product Identifiers
Publisher
Oxford University Press, Incorporated
ISBN-10
0199678111
ISBN-13
9780199678112
eBay Product ID (ePID)
201597656
Product Key Features
Number of Pages
352 Pages
Publication Name
Superintelligence: Paths, Dangers, Strategies
Language
English
Publication Year
2014
Subject
Social Aspects / General, Intelligence (AI) & Semantics, General
Type
Textbook
Subject Area
Computers
Format
Hardcover
Dimensions
Item Height
1.1 in
Item Weight
24 Oz
Item Length
9.4 in
Item Width
6.3 in
Additional Product Features
Intended Audience
Scholarly & Professional
LCCN
2013-955152
Dewey Edition
23
Reviews
"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Springfrom 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" -- The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla, "Worth reading." -- Elon Musk, Founder of SpaceX and Tesla "I highly recommend this book" -- Bill Gates "very deep ... every paragraph has like six ideas embedded within it." -- Nate Silver "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. 
The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" --The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "His book Superintelligence: Paths, Dangers, Strategies became an improbable bestseller in 2014" -- Alex Massie, Times (Scotland) "Ein Text so nüchtern und cool, so angstfrei und dadurch umso erregender, dass danach das, was bisher vor allem Filme durchgespielt haben, auf einmal höchst plausibel erscheint. A text so sober and cool, so fearless and thus all the more exciting that what has until now mostly been acted through in films, all of a sudden appears most plausible afterwards. (translated from German)" -- Georg Diez, DER SPIEGEL "Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla "A damn hard read" -- Sunday Telegraph "I recommend Superintelligence by Nick Bostrom as an excellent book on this topic" -- Jolyon Brown, Linux Format "Every intelligent person should read it." -- Nils Nilsson, Artificial Intelligence Pioneer, Stanford University "An intriguing mix of analytic philosophy, computer science and cutting-edge science fiction, Nick Bostrom's Superintelligence is required reading for anyone seeking to make sense of the recent surge of interest in artificial intelligence (AI)." -- Colin Garvey, Icon, "Worth reading." -- Elon Musk, Founder of SpaceX and Tesla"I highly recommend this book" -- Bill Gates"very deep ... every paragraph has like six ideas embedded within it." -- Nate Silver"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley"Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society"This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT"Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics"Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" --The Economist"There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." 
-- Clive Cookson, Financial Times"His book Superintelligence: Paths, Dangers, Strategies became an improbable bestseller in 2014" -- Alex Massie, Times (Scotland)"Ein Text so n"uchtern und cool, so angstfrei und dadurch umso erregender, dass danach das, was bisher vor allem Filme durchgespielt haben, auf einmal h"ochst plausibel erscheint. A text so sober and cool, so fearless and thus all the more exciting that what has until now mostly been acted through in films, all of a sudden appears most plausible afterwards. (translated from German)" -- Georg Diez, DER SPIEGEL"Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla"A damn hard read" -- Sunday Telegraph"I recommend Superintelligence by Nick Bostrom as an excellent book on this topic" -- Jolyon Brown, Linux Format"Every intelligent person should read it." -- Nils Nilsson, Artificial Intelligence Pioneer, Stanford University"An intriguing mix of analytic philosophy, computer science and cutting-edge science fiction, Nick Bostrom's Superintelligence is required reading for anyone seeking to make sense of the recent surge of interest in artificial intelligence (AI)." -- Colin Garvey, Icon, '[A] magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy.'Brian Clegg, Popular Science, "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?"-- Professor Max Tegmark, MIT, "I highly recommend this book" --Bill Gates "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." --Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." --Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" --Professor Max Tegmark, MIT "Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." --Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" --The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." --Clive Cookson, Financial Times "Worth reading.... 
We need to be super careful with AI. Potentially more dangerous than nukes" --Elon Musk, Founder of SpaceX and Tesla "a magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy." -- Brian Clegg, Popular Science "Bostrom...delivers a comprehensive outline of the philosophical foundations of the nature of intelligence and the difficulty not only in agreeing on a suitable definition of that concept but in living with the possibility of dire consequences of that concept." -- A. Olivera, Teachers College, Columbia University, CHOICE "Bostrom's achievement (demonstrating his own polymathic intelligence) is a delineation of a difficult subject into a coherent and well-ordered fashion. This subject now demands more investigation."--PopMatters "Every intelligent person should read it." --Nils Nilsson, Artificial Intelligence Pioneer, Stanford University, "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" -- The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla "a magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy." -- Brian Clegg, Popular Science, "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." 
-- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society, "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?"-- Professor Max Tegmark, MIT "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. iSuperintelligencer charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society, "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "a magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy." -- Brian Clegg, Popular Science "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT, "I highly recommend this book" --Bill Gates "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... 
extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" -- The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla "a magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy." -- Brian Clegg, Popular Science, "Worth reading." -- Elon Musk, Founder of SpaceX and Tesla"I highly recommend this book" -- Bill Gates"very deep ... every paragraph has like six ideas embedded within it." -- Nate Silver"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley"Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society"This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT"Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics"Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" --The Economist"There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times"His book Superintelligence: Paths, Dangers, Strategies became an improbable bestseller in 2014" -- Alex Massie, Times (Scotland)"Ein Text so nüchtern und cool, so angstfrei und dadurch umso erregender, dass danach das, was bisher vor allem Filme durchgespielt haben, auf einmal höchst plausibel erscheint. 
A text so sober and cool, so fearless and thus all the more exciting that what has until now mostly been acted through in films, all of a sudden appears most plausible afterwards. (translated from German)" -- Georg Diez, DER SPIEGEL"Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla"A damn hard read" -- Sunday Telegraph"I recommend Superintelligence by Nick Bostrom as an excellent book on this topic" -- Jolyon Brown, Linux Format"Every intelligent person should read it." -- Nils Nilsson, Artificial Intelligence Pioneer, Stanford University"An intriguing mix of analytic philosophy, computer science and cutting-edge science fiction, Nick Bostrom's Superintelligence is required reading for anyone seeking to make sense of the recent surge of interest in artificial intelligence (AI)." -- Colin Garvey, Icon, "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Springfrom 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" -- The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla "a magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy." -- Brian Clegg, Popular Science, "Worth reading." -- Elon Musk, Founder of SpaceX and Tesla"I highly recommend this book" -- Bill Gates"very deep ... every paragraph has like six ideas embedded within it." -- Nate Silver"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." 
-- Stuart Russell, Professor of Computer Science, University of California, Berkley"Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society"This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT"Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics"Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" --The Economist"There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times"His book Superintelligence: Paths, Dangers, Strategies became an improbable bestseller in 2014" -- Alex Massie, Times (Scotland)"Ein Text so nüchtern und cool, so angstfrei und dadurch umso erregender, dass danach das, was bisher vor allem Filme durchgespielt haben, auf einmal h¨ochst plausibel erscheint. A text so sober and cool, so fearless and thus all the more exciting that what has until now mostly been acted through in films, all of a sudden appears most plausible afterwards. (translated from German)" -- Georg Diez, DER SPIEGEL"Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla"A damn hard read" -- Sunday Telegraph"I recommend Superintelligence by Nick Bostrom as an excellent book on this topic" -- Jolyon Brown, Linux Format"Every intelligent person should read it." -- Nils Nilsson, Artificial Intelligence Pioneer, Stanford University"An intriguing mix of analytic philosophy, computer science and cutting-edge science fiction, Nick Bostrom's Superintelligence is required reading for anyone seeking to make sense of the recent surge of interest in artificial intelligence (AI)." -- Colin Garvey, Icon, "Worth reading." -- Elon Musk, Founder of SpaceX and Tesla "I highly recommend this book" -- Bill Gates "very deep ... every paragraph has like six ideas embedded within it." -- Nate Silver "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." 
-- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" --The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "His book Superintelligence: Paths, Dangers, Strategies became an improbable bestseller in 2014" -- Alex Massie, Times (Scotland) "Ein Text so nchtern und cool, so angstfrei und dadurch umso erregender, dass danach das, was bisher vor allem Filme durchgespielt haben, auf einmal hchst plausibel erscheint. A text so sober and cool, so fearless and thus all the more exciting that what has until now mostly been acted through in films, all of a sudden appears most plausible afterwards. (translated from German)" -- Georg Diez, DER SPIEGEL "Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla "A damn hard read" -- Sunday Telegraph "I recommend Superintelligence by Nick Bostrom as an excellent book on this topic" -- Jolyon Brown, Linux Format "Every intelligent person should read it." -- Nils Nilsson, Artificial Intelligence Pioneer, Stanford University "An intriguing mix of analytic philosophy, computer science and cutting-edge science fiction, Nick Bostrom's Superintelligence is required reading for anyone seeking to make sense of the recent surge of interest in artificial intelligence (AI)." -- Colin Garvey, Icon, "I highly recommend this book" --Bill Gates "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... 
extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" -- The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla "a magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy." -- Brian Clegg, Popular Science "Bostrom...delivers a comprehensive outline of the philosophical foundations of the nature of intelligence and the difficulty not only in agreeing on a suitable definition of that concept but in living with the possibility of dire consequences of that concept." -- A. Olivera, Teachers College, Columbia University, CHOICE "Bostrom's achievement (demonstrating his own polymathic intelligence) is a delineation of a difficult subject into a coherent and well-ordered fashion. This subject now demands more investigation."--PopMatters, "I highly recommend this book" --Bill Gates "Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." -- Stuart Russell, Professor of Computer Science, University of California, Berkley "Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." -- Martin Rees, Past President, Royal Society "This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" -- Professor Max Tegmark, MIT "Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." -- Olle Haggstrom, Professor of Mathematical Statistics "Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" -- The Economist "There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." -- Clive Cookson, Financial Times "Worth reading.... We need to be super careful with AI. 
Potentially more dangerous than nukes" -- Elon Musk, Founder of SpaceX and Tesla "a magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy." -- Brian Clegg, Popular Science "Bostrom...delivers a comprehensive outline of the philosophical foundations of the nature of intelligence and the difficulty not only in agreeing on a suitable definition of that concept but in living with the possibility of dire consequences of that concept." -- A. Olivera, Teachers College, Columbia University, CHOICE "Bostrom's achievement (demonstrating his own polymathic intelligence) is a delineation of a difficult subject into a coherent and well-ordered fashion. This subject now demands more investigation."--PopMatters "Every intelligent person should read it." --Nils Nilsson, Artificial Intelligence Pioneer, Stanford University
Illustrated
Yes
Dewey Decimal
006.301
Table Of Content
Preface
1. Past Developments and Present Capabilities
2. Roads to Superintelligence
3. Different Forms of Superintelligence
4. Singularity Dynamics
5. Decisive Strategic Advantage?
6. Intellectual Superpowers
7. The Superintelligent Will
8. The Control Problem
9. Achieving a Controlled Detonation
10. Oracles, Genies, Sovereigns, and Tools
11. Acquiring Values
12. Design Choices
13. What is to Be Done?
Summary/Epilogue
Synopsis
The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but we have cleverer brains. If machine brains one day come to surpass human brains in general intelligence, then this new superintelligence could become very powerful. As the fate of the gorillas now depends more on us humans than on the gorillas themselves, so the fate of our species then would come to depend on the actions of the machine superintelligence. But we have one advantage: we get to make the first move. Will it be possible to construct a seed AI or otherwise to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation? To get closer to an answer to this question, we must make our way through a fascinating landscape of topics and considerations. Read the book and learn about oracles, genies, singletons; about boxing methods, tripwires, and mind crime; about humanity's cosmic endowment and differential technological development; indirect normativity, instrumental convergence, whole brain emulation and technology couplings; Malthusian economics and dystopian evolution; artificial intelligence, biological cognitive enhancement, and collective intelligence. This profoundly ambitious and original book picks its way carefully through a vast tract of forbiddingly difficult intellectual terrain. Yet the writing is so lucid that it somehow makes it all seem easy. After an utterly engrossing journey that takes us to the frontiers of thinking about the human condition and the future of intelligent life, we find in Nick Bostrom's work nothing less than a reconceptualization of the essential task of our time.

Human beings occupy a dominant position on our planet, not because we have stronger muscles or sharper teeth than other species, but because we have smarter brains. Our brains developed the technologies and the complex social organization that make us powerful. For example, our smartness gave us bulldozers and knives that are stronger and sharper than any animal's muscles or teeth. If machine brains come to surpass human brains as ours surpass those of other animals, the machine brains could become as powerful relative to us as we are to the other animals. Extreme levels of machine intelligence - superintelligence - would potentially be in a position to shape the future. What happens to humanity, whether humanity would even survive, would then depend on the goals of the superintelligence. The possibility of a machine intelligence revolution is therefore an extremely important topic. Perhaps it is the most important topic.

A New York Times bestseller, Superintelligence asks the questions: What happens when machines surpass humans in general intelligence? Will artificial agents save or destroy us? Nick Bostrom lays the foundation for understanding the future of humanity and intelligent life. The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. If machine brains surpassed human brains in general intelligence, then this new superintelligence could become extremely powerful - possibly beyond our control. As the fate of the gorillas now depends more on humans than on the species itself, so would the fate of humankind depend on the actions of the machine superintelligence. But we have one advantage: we get to make the first move. Will it be possible to construct a seed Artificial Intelligence, to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation? This profoundly ambitious and original book breaks down a vast tract of difficult intellectual terrain. After an utterly engrossing journey that takes us to the frontiers of thinking about the human condition and the future of intelligent life, we find in Nick Bostrom's work nothing less than a reconceptualization of the essential task of our time.

This seminal book injects the topic of superintelligence into the academic and popular mainstream. What happens when machines surpass humans in general intelligence? Will artificial agents save or destroy us? In a tour de force of analytic thinking, Bostrom lays a foundation for understanding the future of humanity and intelligent life.
LC Classification Number
Q335.B685 2014
Seller feedback (6,344)
- u***o (4086) - Feedback left by buyer. Past month. Verified purchase. "Great condition, fast ship. Gave 100% Positive Feedback. Thanks!"
- y***c (1471) - Feedback left by buyer. Past month. Verified purchase. "Thank you."
- i***a (1563) - Feedback left by buyer. Past month. Verified purchase. "Item was as described and arrived on time."
More to explore :
- Strategy Textbooks,
- Strategy Textbook Textbooks,
- Strategy Hardcover Textbooks,
- Strategy Paperback Textbooks,
- Strategy Textbooks in English,
- Good Housekeeping Magazines,
- Fashion Good Housekeeping Magazines,
- Fiction The Very Hungry Caterpillar Fiction & Books,
- Health Good Housekeeping Magazines,
- Good Housekeeping Magazines 1940-1979