Spooky indeed!
I think we're also guilty of anthropomorphizing machines more than they deserve. Many sci-fi novels depict intelligent machines behaving more like intelligent ants than like humans. If machines aren't game-theoretically human-like, all bitcoin bets are off. The only reasons machines would converge on human-like coordination are that it's optimal or that we force them to. Do we know whether either is true or even possible?
I'm trying to look so far into the future that this is purely an exercise. This potential spookiness may only become relevant long after I'm dead.