not to mention that the recreation of a limited model, one that purports to describe humanity but throws out everything it doesn't understand at the outset, is basically laundering the definition of AI. it's not a real AI; it's simply described as such because the model it's based on discarded everything that couldn't be modeled at the time the model was constructed. but it's precisely that discarded content that makes people uniquely human. so the model is reproduced, but it lacks all meaning because it's just a Frankenstein. as far as I can tell, something like faster-than-light travel is far more feasible than "strong AI", since "strong AI" is just a word game. it draws on people like Dennett and takes for granted that he's "explained consciousness" when really he's explained it away, excluding from consideration precisely the elements that give it its unique complexity. it's like taking the idea that what we don't understand cannot be understood and therefore isn't real, then building an AI based on what we do understand and calling it complete because it accurately instantiates the model, but the model was lacking in the first place, so the whole thing becomes a kind of sleight of hand

it also totally takes for granted that the ability to leverage force is the measuring stick by which we determine superiority, so robots that shoot well are counted as superior to humans, when they're not even subhuman, because the entire paradigm cuts out the fact that what gives a being its ability to rise above is its capacity to evolve value judgments. the idea that robots are going to have it within them to do that, based on a nested program of spatial superiority via force, is obtuse to the point of insanity. there's literally nothing for these robots to do except conquer space, and that's precisely how people envision them; what a product of limited minds all around. this is precisely why, even if they managed to make killer robots, the killer robots would always lose on a long enough timeline: they lack all imagination. people can't even comprehend what imagination is, and assume that "taking over" means wresting control over the imagination. it's almost as if it takes for granted a subservient and compliant alpha quadra as part of the a priori spoils of war. it's a psychological presupposition built into their worldview, extended out to what is supposed to happen with robots