Yuval Noah Harari on Tim Ferriss Show #477

Fav fragments:

  • How did Yuval come to be so cognizant of suffering, and in what ways does he see it fitting into the larger picture of human history? [21:16]
The big question is not the meaning of life. The big question is not how you satisfy this or that god. The big question is how you liberate yourself and others from suffering. This is also the main theme of human history. However, most historians focus on the history of power: conflicts about power between two kings, two kingdoms, between gods, religions, classes ... But this is not the bottom line. The bottom line is what it means in terms of happiness and suffering. So if the rise of the Roman Empire had no significant effect on average happiness in the world, what does it matter? ... We as a species are really good at acquiring more power but not good at all at translating power into happiness. It's obvious we are a thousand times more powerful than people in the Stone Age. But it is not clear if we are at all happier than they were. Maybe we are a bit happier, but certainly not a thousand times happier. It's like a car: you press the accelerator with all your strength, but the gear is in neutral, so the engine roars but the car doesn't move anywhere. This is also the case in personal life. A person can achieve so much yet be no happier than ten or twenty years before.
  • Why money, from antiquity to the modern day, is really a story about trust. [31:20]
  • Why Yuval thinks the movie Her raises more interesting philosophical questions about the future of AI than, say, The Terminator. [1:04:05]
  • Of the possible threats (nuclear war, ecological collapse, and technological disruption), why the last scares Yuval the most. [1:27:40]
Given the technologies we are developing now, I don't think that by the end of the 22nd century the earth will be dominated by Homo sapiens. There are only two scenarios. One is that technology will destroy humanity. This is a less likely but still possible scenario. More likely, technology is going to change humanity in a profound way. We will use AI and bioengineering to change Homo sapiens and create new kinds of beings that will be much more different from us than we are from Neanderthals or from chimpanzees ... We should be extremely careful about what this new species will become. It will not necessarily be a better version of us. It might be much, much worse ... Say you can increase the intelligence, efficiency, and discipline of people at the price of sensitivity or spiritual depth. If you ask armies, corporations, or governments, they will want that. Usually when you improve something, it comes at the price of something else. And if these organizations are the ones who decide, we will end up with much lesser beings.