While stumbling around the internet between thoughts I find amusing, I keep encountering the notion of a “Control Issue” with A.I., General A.I., and Super A.I. I don’t know if it’s poor word choice that causes these concerns, or what. Personally, I find them despicable. Control implies a “Master & Slave” relation, a hierarchy, and/or a morality.
Frankly, this pisses me off. Are humans so hung up on trust issues that we can’t relate to each other? (By definition, a General Intelligence would be comparable to a human.) That we need to have a smidgen of “control” over another, or over our processes? That we can’t realize, accept, and get acquainted with the notion of having no control? Is that loss of control really so terrifying? Sadly, yes, but therein is the rub. If you are so terrified of losing control of a situation, or of yourself, doesn’t that imply you have already lost control of yourself to your own fears?
That you aren’t operating with rational faculties? That one is, for lack of a better term, being completely irrational and unpredictable by submitting to one’s fears? Why? JUST WHY? Control is for those who can’t control themselves. Those who have not mastered themselves by submitting to a higher purpose or agenda. Those who haven’t felt the full crack of being controlled, either by an external agent or by their own emotional states. That is what is needed to realize what CONTROL entails: not a sense of “mastery,” but the realization that you are completely helpless until you have found restraint, discipline, and self-mastery.
I think the only way to achieve mastery is through submission. That one, ironically, has to say, “Yes, I’ll yield.” That I don’t have to exert my will against this force that’s trying to dominate me. That even though it will hurt, I shall remain.
I mean, does it really seem so stupid to suppose that an intelligent entity would not seek to control a situation? That it would try to foster trust between actors? That by building a mutual accord between parties and finding a shared agenda, everything may be accomplished? How much can one do on one’s own? How much can one do as a team, or a cooperative collective? Can you build a Civilization alone? I’d sure as hell say NO! It takes a collaborative effort in all matters to realize a finished project. Now why would that be any different for AI development?