I'm sitting at work researching mobile internet technologies when this thought hits me. Let me present a few questions for you to consider.

Are we moving faster than we can control? Are we painting ourselves into a corner as we invent and discover more and more?

Should we be worried?

This random thought brought to you by the worry that I'm nearly 30 and haven't achieved anything of note

Well, spears and swords have been around for millennia and are still being used to kill people. If we waited for morals to catch up with technology, mankind would probably have become a footnote of history, a rather lame species which existed on the African plains for about 20 years before extinction. People will do good or evil, whether it's with the latest technology or not.

We have a lot to be proud of, actually. We've known about U-235 for more than 50 years and still haven't eradicated ourselves. Go us!

I would say without a doubt, yes. At face value, our ability to use technology for good is totally doped, stumbling far behind the technology itself. Obviously this is a bad thing, although not incredibly bad - as Uberfetus said, we haven't extinguished ourselves yet. But we shouldn't just shrug it off because we haven't brought about our extinction so far.

Sometimes a form of the converse is true - the pace of technology depends on bad things, such as war. For instance, the Cold War saw a lot of developments; the internet, for one, finds its origins in DARPA. Similarly, I'm sure some rednecks are asking themselves questions like "does the pace of technology outpace our ability to use it to kill those damn Soviets?" I for one certainly hope so.

The startling growth in technology has, in a way, ended the old idea that in working to gain power you also develop the responsibility to use it. Not too long ago, things such as computers and weapons were fairly difficult to come by; today you have kindergarten children bringing carbine rifles to school, people building nuclear reactors in their backyards, and the like - due, in part, to technology.

It's also interesting to note, in terms of humankind's technological resourcefulness, that uranium is used not only for energy and atomic bombs; the depleted uranium left over from enrichment ends up in dense, armor-piercing rounds. Chalk another one up for war.
It's not that mankind has somehow failed to control technology. Technology is an inbuilt mechanism by which we will be allowed to destroy ourselves the moment we are too much of a burden for the ecosystem to bear, and it's been there since the dawn of mankind.

Basically, the day the first proto-human picked up a sharp stone and used it to cut a piece of meat off a carcass, our ancestors took the first step on a road to the modification of the environment by ourselves, for our own good (for varying definitions of good, of course). This anomaly in the fabric of the ecosystem has been tolerated so long as we didn't unbalance things too much.

However, anything Really Bad(tm) that we are likely to do to the planet will turn on us and cause our own extinction, or at least a great reduction in our numbers, so the system, from its end, is safe.

Species come and species go - our tenuous relationship with our own brainchildren is nature's way of ensuring that we don't outstay our welcome.

Technology, like any other product of our monoculture, is created not with the goal of helping the many, but to serve the interests of those wealthy enough to control it.

This is pure common sense. I was watching the news yesterday; there was an item on how Canadian Members of Parliament use wireless technology to help them do what they do - no different from any businesspeople.

What has the internet become? I have already written about the change from information superhighway to ecommerce.

Atomic power, once touted as being able to produce electricity too cheap to meter, now threatens us with accidents - Three Mile Island, Chernobyl - and with decommissioning costs. And the fact that U-235 has been around for more than 50 years, and so have we, does not mean that the threat of mutually assured destruction hasn't made us do things that really were out of our control - what was, and is, the arms race?

All technology will eventually be put to the use of threat. This seems to be a pathology built into our psyche. There is more money in this than in curing disease, ending poverty, or ensuring good jobs for all who can work.

All technology enslaves. We never see this clearly, and we sure never want to admit it. Even all the high technology that permits the existence of Everything - as much as I like it - is beyond our ability to truly use. Even though the administration here is benign, others use similar technology more malevolently.

We must always be on our guard. We must never take anything at face value.

Just because we can do something, doesn't mean that we should or must.

I think we're approaching a point where we're going to find out if we've matured enough as a species to avoid self-annihilation. And I feel that point is coming soon.

It's not technology that's the problem. Technology is just a tool, a means. It's the motive to use it for bad that's the problem. If we didn't have the desire to take from other people, to deny them freedom, to deny them life, we'd never have to worry.

The sad part is that there are still people who think world peace is a bad idea. Obviously, at least some people are not ready for this technology, and given access to it, would immediately attempt to remove other people, those with certain attributes, from the world.

At some point, an advance is going to arrive that will, in a manner of speaking, put a timer on us as a species. Once it arrives, we will have a certain amount of time either to make further advances that protect us against its eventual spread to people with bad motives, or to enlighten ourselves, as a species, to the point where those motives disappear.

However, because of the way science and technology progress, we can't just decide to stop working on advances - it would be impossible for everyone to agree not to. At best, you can convince the people with good motives to stop, leaving the dangerous ones to do it on their own, creating a near-certain doomsday scenario.

I think our only chance is to work on two fronts: to increase the speed at which advances come, making it more likely we can protect ourselves before the timer expires, and to work with other people, to try to get rid of the opinions, thoughts, and beliefs that would lead toward the misuse of the technology.

Personally, I think the advance that's going to start the timer is nanotechnology, and from what I've been hearing, we're looking at a ten to thirty year arrival time.

"It is the business of the future to be dangerous.... The major advances in civilization are processes that all but wreck the societies in which they occur."

- Alfred North Whitehead

Technology has always reshaped a society's value system. What's so scary now is that humanity is global and varied, and is reacting to technology in completely different ways in different places. It is a truly unique time in history. The mid-dawn of the information age has way bigger variables, and it is fucking scary.

How would you measure our ability to use technology for good? I tend to think that humanity has collectively made some progress in this area, but it's more along the lines of social engineering than moral progress. That is, we're not qualitatively more likely to do the morally right thing left to ourselves, but we have developed systems that reduce the consequences of our bad intentions. We turned down our first opportunity to drive ourselves into extinction, so for now we're at 100%.

It seems to me that "our" (humanity's) ability to use technology for good actually is keeping up with technology. Even if you count Chernobyl, nuclear power is far less polluting on a global scale than the fossil fuel equivalents - and you'll note that nations around the world are working together to pay for the decommissioning of that plant, rather than allowing it to fester. Even the hydrogen bomb has sat unused for 50 years, something that hasn't happened with any other weapon in history.

The real danger comes if a small group of people can get access to extinction-level weaponry - say, a Stand-style superflu. That's the scary (and unpredictable) scenario. As long as a lot of people need to work together to immanentize the eschaton, we are relatively safe. But if a hundred people can wipe the rest of us out, we are probably going to be in big trouble.
