
    As the computing cloud grows, as it becomes ubiquitous, we will feed ever more
    intelligence into it. Using global positioning satellites and tiny radio transmitters, it will track our
    movements through the physical world as meticulously as it today tracks our clicks through the
    virtual world. And as the types of commercial and social transactions performed through the
    Internet proliferate, many more kinds of data will be collected, stored, analyzed, and made
    available to software programs. The World Wide Computer will become immeasurably smarter.
    The transfer of our intelligence into the machine will happen, in other words, whether or not we
    allow chips or sockets to be embedded in our skulls.
    Computer scientists are now in the process of creating a new language for the Internet
    that promises to make it a far more sophisticated medium for expressing and exchanging
    intelligence. In creating Web pages today, programmers have limited options for using codes, or
    tags, to describe text, images, and other content. The Web's traditional hypertext markup
    language, or HTML, concentrates on simple formatting commands: instructing, for instance,
    a Web browser to put a line of text into italics or to center it on a page. The new language will
    allow programmers to go much further. They'll be able to use tags to describe the meaning of
    objects like words and pictures as well as the associations between different objects. A person's
    name, for instance, could carry with it information about the person's address and job, likes and
    dislikes, and relationships to other people. A product's name could have tags describing its price,
    availability, manufacturer, and compatibility with other products.
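    The difference is easier to see in a small sketch. The snippet below, written in Python purely for illustration, contrasts a formatting tag with the kind of meaning-bearing data such a language could carry; every field name and value here is invented, not drawn from any actual Semantic Web vocabulary.

        # A formatting tag only says how content should look on the page:
        html_fragment = "<i>Walden</i>"

        # Meaning-bearing tags, by contrast, describe what a thing is and how it
        # relates to other things. All names and values below are hypothetical.
        product = {
            "name": "Walden (paperback)",
            "price": {"amount": 9.95, "currency": "USD"},
            "availability": "in stock",
            "manufacturer": "Example Press",
            "compatible_with": ["Walden (audiobook edition)"],
        }
        person = {
            "name": "Henry David Thoreau",
            "occupation": "writer",
            "address": "Concord, Massachusetts",
            "knows": ["Ralph Waldo Emerson"],
        }

        # A program that understands such tags can answer questions no
        # formatting command could express, e.g. "what does this product cost?"
        print(product["price"]["amount"])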
    This new language, software engineers believe, will pave the way for much more
    "intelligent" conversations between computers on the Internet. It will turn the Web of
    information into a Web of meaning: a "Semantic Web," as it's usually called. HTML's
    inventor, Tim Berners-Lee, is also spearheading the development of its replacement. In a speech
    before the 2006 International World Wide Web Conference in Scotland, he said that "the Web is
    only going to get more revolutionary" and that "twenty years from now, we'll look back and say
    this was the embryonic period." He foresees a day when the "mechanisms of trade, bureaucracy
    and our daily lives will be handled by machines talking to machines."
    At the University of Washington's Turing Center, a leading artificial intelligence
    laboratory, researchers have already succeeded in creating a software program that can, at a very
    basic level, "read" sentences on Web pages and extract meaning from them without requiring
    any tags from programmers. The software, called TextRunner, scans sentences and identifies the
    relationships between words or phrases. In reading the sentence "Thoreau wrote Walden after
    leaving his cabin in the woods," for instance, TextRunner would recognize that the verb "wrote"
    describes a relationship between "Thoreau" and "Walden." As it scans more pages and sees
    hundreds or thousands of similar constructions, it would be able to hypothesize that Thoreau is a
    writer and Walden is a book. Because TextRunner is able to read at an extraordinary rate (in one
    test, it extracted a billion textual relationships from 90 million Web pages), it can learn quickly.
    Its developers see it as a promising prototype of "machine reading," which they define as "the
    automatic, unsupervised understanding of text by computers."
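    The flavor of that extraction step can be suggested with a toy sketch, again in Python. The pattern and the sample sentences are invented for illustration and bear no relation to TextRunner's actual pipeline; they only show the shape of the subject-relation-object facts the paragraph describes.

        import re
        from collections import Counter

        # Toy extractor: pull (subject, relation, object) triples from simple
        # "X verbed Y" sentences. Real machine reading is far more involved.
        TRIPLE = re.compile(r"^(?P<subj>[A-Z]\w+) (?P<rel>\w+) (?P<obj>[A-Z][\w-]+)")

        def extract_triple(sentence):
            match = TRIPLE.match(sentence)
            return match.groups() if match else None

        sentences = [
            "Thoreau wrote Walden after leaving his cabin in the woods.",
            "Melville wrote Moby-Dick in 1851.",
        ]

        relation_counts = Counter()
        for s in sentences:
            triple = extract_triple(s)
            if triple:
                print(triple)                # e.g. ('Thoreau', 'wrote', 'Walden')
                relation_counts[triple[1]] += 1

        # Seeing the same relation across many pages is what would let a system
        # hypothesize, for instance, that the subject of "wrote" is a writer.
        print(relation_counts)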
    Scientists are also teaching machines how to see. Google has been working with
    researchers at the University of California at San Diego to perfect a system for training
    computers to interpret photographs and other images. The system combines textual tags
    describing an image's contents with a statistical analysis of the image. A computer is first trained
    to recognize an object (a tree, say) by being shown many images containing the object that
    have been tagged with the description "tree" by people. The computer learns to make an
    association between the tag and a mathematical analysis of the shapes appearing in the images. It
    learns, in effect, to spot a tree, regardless of where the tree happens to appear in a given picture.
    Having been seeded with the human intelligence, the computer can then begin to interpret
    images on its own, supplying its own tags with ever increasing accuracy. Eventually, it becomes
    so adept at "seeing" that it can dispense with the trainers altogether. It thinks for itself.
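    A stripped-down version of that training loop, with made-up numbers standing in for real image features, might look like the sketch below; the nearest-centroid rule is a deliberately crude stand-in for whatever statistical model the actual system fits.

        import numpy as np

        rng = np.random.default_rng(0)

        # Pretend feature vectors summarizing the shapes in each image; in a real
        # system these would be computed from the pixels, not drawn at random.
        tree_imgs = rng.normal(loc=1.0, scale=0.3, size=(50, 8))    # tagged "tree" by people
        other_imgs = rng.normal(loc=-1.0, scale=0.3, size=(50, 8))  # no "tree" tag

        # The "training": associate the human tag with the statistics of the images.
        tree_centroid = tree_imgs.mean(axis=0)
        other_centroid = other_imgs.mean(axis=0)

        def tag_image(features):
            """The computer supplies its own tag for an untagged image."""
            if np.linalg.norm(features - tree_centroid) < np.linalg.norm(features - other_centroid):
                return "tree"
            return "no tree"

        new_image = rng.normal(loc=1.0, scale=0.3, size=8)  # an unseen image's features
        print(tag_image(new_image))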
    In 1945, the Princeton physicist John von Neumann sketched out the first plan for
    building an electronic computer that could store in its memory the instructions for its use. His
    plan became the blueprint for all modern digital computers. The immediate application of von
    Neumann's revolutionary machine was military, designing nuclear bombs and other
    weapons, but the scientist knew from the start that he had created a general-purpose technology,
    one that would come to be used in ways that could not be foretold. "I am sure that the projected
    device, or rather the species of devices of which it is to be the first representative, is so radically
    new that many of its uses will become clear only after it has been put into operation," he wrote to
    Lewis Strauss, the future chairman of the Atomic Energy Commission, on October 24, 1945.
    "Uses which are likely to be the most important are by definition those which we do not
    recognize at present because they are farthest removed from our present sphere."
    We are today at a similar point in the history of the World Wide Computer. We have built
    it and are beginning to program it, but we are a long way from knowing all the ways it will come
    to be used. We can anticipate, however, that unlike von Neumann s machine, the World Wide
    Computer will not just follow our instructions. It will learn from us and, eventually, it will write
    its own instructions.
    George Dyson, a historian of technology and the son of another renowned
