linking INTEGRITY

Integrity - use of values or principles to guide action in the situation at hand.

Below are links and discussion related to the values of freedom, hope, trust, privacy, responsibility, safety, and well-being in business and government situations arising in the areas of security, privacy, technology, corporate governance, sustainability, and CSR.

Wake up to ethics question - intelligence and robotics, 6.5.04

newsobserver.com

Isaac Asimov had a pretty good handle on how to deal with smart machines: create them with built-in ethics. Thus Asimov's three laws of robotics, which make it possible to live with creatures smarter than ourselves.

The first law: 'A robot may not injure a human being or, through inaction, allow a human being to come to harm.' The second: 'A robot must obey orders given it by human beings except where such orders would conflict with the First Law.' And the third: 'A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.'
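
Purely as an illustration (nothing below appears in the article, and all names are hypothetical), the three laws can at least be written down as an ordered set of checks, with the First Law tested before orders and orders before self-preservation. A minimal Python sketch:

```python
# Illustrative only: Asimov's three laws as prioritized checks.
# The Action fields and the permitted() function are hypothetical names,
# not anything from the article or from a real robotics API.
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_human: bool = False        # would the action injure a human?
    allows_harm: bool = False        # would inaction let a human come to harm?
    ordered_by_human: bool = False   # was the action ordered by a human?
    endangers_robot: bool = False    # does the action risk the robot itself?

def permitted(action: Action) -> bool:
    """Evaluate an action against the three laws, highest priority first."""
    # First Law: never injure a human or allow harm through inaction.
    if action.harms_human or action.allows_harm:
        return False
    # Second Law: obey human orders; conflicts with the First Law were
    # already rejected above, which is what "except where" means here.
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, subordinate to the first two laws.
    return not action.endangers_robot

if __name__ == "__main__":
    print(permitted(Action("fetch coffee", ordered_by_human=True)))       # True
    print(permitted(Action("push a bystander", harms_human=True)))        # False
    print(permitted(Action("walk into a furnace", endangers_robot=True))) # False
```

The precedence ordering is the easy part; the real difficulty, which the article goes on to raise, is whether a machine can decide what counts as "harm" at all.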

The question is, can we build such laws into machinery? The answer may be important, for if Vernor Vinge is right, we're rapidly moving to a time when computer intelligence will be so far beyond our own that predicting the future will become impossible. Vinge calls this the Singularity.

Vinge, a retired computer scientist, writes superb science fiction of his own, his best being the 1992 novel 'A Fire Upon the Deep.' Since he first laid out the Singularity at a 1993 NASA gathering, his work has been the source of endless speculation about the nature of intelligence, and the ramifications of current technology trends as they follow what seem to be unstoppable, exponential laws.

"An ultraintelligent machine could design even better machines," Vinge writes. "There would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man ever need make ..."

The arrival of such an intelligence would be a "singularity" because we would not be able to use human reason to see beyond it. Vinge thinks it could occur before 2030, either when large computer networks "wake up" as an entity with superhuman intelligence or when researchers develop it in labs. As for using it to our advantage, Vinge says it would not be our tool, "any more than humans are the tools of rabbits or robins or chimpanzees."

[...]

However...

Instead of our artifacts "awakening," we may find ourselves awakening to the fact that technology remains a matter of tough choices based on human ethics. That's not a "singularity," but merely a call to stay in the game, carefully considering the consequences of all the technology we build. It's also a call for human dignity to reassert itself and stop waiting for an all-too-hypothetical day when machines will take moral choice out of our hands.

Paul A. Gilster, a local author and technologist, can be reached at gilster@mindspring.com.



Integrity Incorporated

Site Feed

 RSS: http://linkingintegrity.blogspot.com/atom.xml


"We shall need compromises in the days ahead, to be sure. But these will be, or should be, compromises of issues, not principles. We can compromise our political positions, but not ourselves. We can resolve the clash of interests without conceding our ideals. And even the necessity for the right kind of compromise does not eliminate the need for those idealists and reformers who keep our compromises moving ahead, who prevent all political situations from meeting the description supplied by Shaw: "smirched with compromise, rotted with opportunism, mildewed by expedience, stretched out of shape with wirepulling and putrefied with permeation.
Compromise need not mean cowardice. .."

John Fitzgerald Kennedy, "Profiles in Courage"
