48 Name: Anonymous 2025-08-30 19:51
Quoted by: >>49
One thing I hate about Land and his followers is the way this transhumanism seems to be a re-run of older 19th-century Darwinist ideas of survival of the fittest and expand-or-die thinking. The growth of AI can only mean the extinction of humans, because of course a sentient being would try to enslave or nuke other, inferior beings. This reflects a colonial mindset. "The scraptards will do to us what we did to the Indians." As if that's a normal way to behave. Humans have always lived in a world filled with other beings and higher powers: gods, angels, kami, spirits, etc., many of them superior to and more powerful than us. So what's threatening about sentient machines? The fact that Westerners (or secular humanists, to be more specific) see sentient non-humans as a threat tells you a lot about their warped colonizer worldview.