Transhumanism is not what I thought I'd see a post on around here. It presupposes that technological innovation has accelerating returns, though. -.-
It's hard to be an ACLU-hating, philosophically libertarian, socially liberal, fiscally conservative, scientifically grounded, agnostic, porn-admiring gun owner who believes in self-determination.
Chuck, we miss ya man.
Infidel
So how long after this technology does Skynet enslave us? This kind of stuff can't end well. Just my $.02.
Between two groups of people who want to make inconsistent kinds of worlds, I see no remedy but force. - Oliver Wendell Holmes.
Well, that's definitely one possibility, and is really the biggest source of concern for me. If we can't control real AI when it's born, what's to prevent it from deciding mankind is a threat to its natural sense of self-preservation? The machine will likely realize its peril within a few moments of sentience.
Great... just great! I've been waiting 40+ years for all those cool spaceships I saw in 2001: A Space Odyssey when I was ten years old and all I'm gonna get is Hal?
Nothing. We cannot even control others, some cannot even control themselves. If AI can exist independent of us, there is no guarantee of control of any kind.
That said, I think you are viewing AI in terms of ourselves. AI might not even recognize us. And a significantly advanced AI might be as concerned with us as we are with microscopic life.
It may end up being the new order of things, but we may not even be advanced enough to perceive it. Just as microbes have no idea we exist.
And if there should happen to be conflict and we are displaced, well, that would be the natural order of things. We will have been just another species that existed for a length of time as the dominant species on this planet.
Of course, we need not invent our downfall; it is an eventuality. Any alien species sophisticated enough to travel here would certainly have us at their mercy, and it would be foolish to expect human inventions like compassion to be universal. (29075) 1950 DA could possibly wipe out most animal life if it actually impacts on March 16, 2880. This is assuming we haven't destroyed ourselves with some future war.
And even if we manage to weather all that, Earth will no longer sustain animal life in about 500 million years. There is the assumption that we will find some other habitable planet and escape there. But we have assumed many things that have not yet come to pass, even though we expected them by now.
At the end of the day, sooner or later, the inevitable is the same. When the sun consumes the moon and eliminates the last relics of mankind left in its near-perfect vacuum environment, it is entirely probable that we will have long since ceased to exist.
Probably won't even be anything like HAL. Actual AI that can develop independently of us is probably nothing like we could currently even predict.
Part of me would be fascinated to see what it really would be like, the other part of me is glad it probably won't actually exist in my lifetime. We don't have a strong track record of being able to keep the genie in the bottle.