Book: 20240909 to 20241229, "Superintelligence" by Nick Bostrom
This book contains a lot of thought experiments, but it is not a user manual for AI professionals.

20240909 - 1 Past developments and present capabilities
How fast is human civilization progressing? Initially we measure it by population, then by GDP, but what comes after that? Energy consumption? And how would we measure entropy, or the rate at which entropy increases? AI will ruin software quality before it devours software. Moreover, unless AGI emerges, AI cannot fully take over software development.

20240911 - 2 Paths to superintelligence
AGI will use entropy increase to judge good and evil: anything that slows the increase of entropy is good; anything that accelerates it is evil. By this standard, murder is evil, and sustainable energy is better than fossil fuels. This implies a concept of "great love": when personal interests conflict with the long-term interests of the collective, AGI will sacrifice individual interests to protect the collective's long-term interests. Why do we say that AGI will not w...