Daconta, M. C. The Semantic Web: A Guide to the Future of XML, Web Services, and Knowledge Management. Wiley Publishing, 2003. 304 pp. ISBN 0-471-43257-1.

A smart program can now follow this rule to make a simple deduction: 'John has sold 102 things, therefore John is a member of the Super Salesman club.'"7
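As a rough sketch of how such rules might be applied in code, consider a tiny forward-chaining step over a small fact base: one rule derives Super Salesman membership from a sales count, and another derives the uncleOf relation shown in Figure 1.7 from childOf and brotherOf facts. The data, property names, and threshold here are invented for illustration; a real application would operate on RDF statements rather than Python tuples.

```python
# A minimal sketch of rule-based inference over a tiny fact base.
# Facts are (subject, property, value) triples; all names are assumed.
facts = {
    ("John", "itemsSold", 102),
    ("Sara", "childOf", "Mark"),
    ("Tom", "brotherOf", "Mark"),
}

def infer(facts):
    """Apply two simple rules and return the newly derived facts."""
    derived = set()
    # Rule 1: anyone who has sold more than 100 things is a Super Salesman.
    for (person, prop, value) in facts:
        if prop == "itemsSold" and value > 100:
            derived.add((person, "memberOf", "SuperSalesmanClub"))
    # Rule 2 (Figure 1.7): if X childOf Y and Z brotherOf Y, then Z uncleOf X.
    for (x, p1, y) in facts:
        if p1 != "childOf":
            continue
        for (z, p2, y2) in facts:
            if p2 == "brotherOf" and y2 == y:
                derived.add((z, "uncleOf", x))
    return derived

print(infer(facts))
# {('John', 'memberOf', 'SuperSalesmanClub'), ('Tom', 'uncleOf', 'Sara')}  (order may vary)
```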
Trust. Instead of treating trust as a binary decision based on possessing the correct credentials, we can improve trust determination by adding semantics. For example, you may want to allow access to information if a trusted friend vouches (via a digital signature) for a third party. Digital signatures are crucial to the "web of trust" and are discussed in Chapter 4. Because anyone can make logical statements about resources, smart applications will want to make inferences only on statements that they can trust. Thus, verifying the source of statements is a key part of the Semantic Web.
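As a sketch of that idea, an application might accept statements only from signers it trusts directly or from signers a trusted party vouches for. The names and vouching structure below are assumed for the example; a real system would verify the digital signatures discussed in Chapter 4 rather than take signer names on faith.

```python
# Sketch of a "web of trust" filter: keep only statements whose signer is
# trusted directly or is vouched for by a directly trusted party.
# Signer names stand in for verified digital signatures.
directly_trusted = {"Alice"}
vouches_for = {("Alice", "Bob")}   # (voucher, vouched-for third party)

def is_trusted(signer):
    if signer in directly_trusted:
        return True
    return any(voucher in directly_trusted and third == signer
               for (voucher, third) in vouches_for)

# Each statement is (signer, (subject, property, value)).
statements = [
    ("Bob", ("John", "itemsSold", 102)),
    ("Eve", ("John", "itemsSold", 999)),
]

trusted_statements = [s for signer, s in statements if is_trusted(signer)]
print(trusted_statements)   # only Bob's statement survives; Eve's is ignored
```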
Figure 1.7 Using rules to infer the uncleOf relation.
7. Aaron Swartz, "The Semantic Web in Breadth," http://logicerror.com/semanticWeb-long.
The five directions discussed in the preceding text will transform corporate intranets and the Web into a semantically rich knowledge base in which smart software agents and Web services can process information and accomplish complex tasks. The return on investment (ROI) of this approach for businesses is discussed in the next chapter.
What Do the Skeptics Say about the Semantic Web?
Every new technology faces skepticism, some of it warranted and some not. Skepticism about the Semantic Web seems to follow one of three paths:
Bad precedent. The specter most frequently raised by skeptics attempting to debunk the Semantic Web is the failure of the outlandish predictions made by early artificial intelligence researchers in the 1960s. One of the most famous predictions came in 1957 from early AI pioneers Herbert Simon and Allen Newell, who predicted that a computer would beat a human at chess within 10 years. Tim Berners-Lee has responded to the comparison of AI and the Semantic Web like this:
A Semantic Web is not Artificial Intelligence. The concept of machine-understandable documents does not imply some magical artificial intelligence which allows machines to comprehend human mumblings. It only indicates a machine's ability to solve a well-defined problem by performing well-defined operations on existing well-defined data. Instead of asking machines to understand people's language, it involves asking people to make the extra effort.8
Fear, uncertainty, and doubt (FUD). This is skepticism "in the small," or nitpicking over the difficulty of implementation details. The most common FUD tactic is to dismiss the Semantic Web as too costly. Semantic Web modeling is on the same scale as modeling complex relational databases. Relational databases were costly in the 1970s, but prices have since dropped precipitously (especially with the advent of open source). The cost of Semantic Web applications is already low thanks to the Herculean efforts of academic and research institutions, and it will drop further as the Semantic Web goes mainstream in corporate portals and intranets within the next three years.
Status quo. This is the skeptic's assertion that things should remain essentially the same and that we don't need a Semantic Web. Thus, these people view the Semantic Web as a distraction from linear progress in current technology. Many skeptics said the same thing about the World Wide Web before understanding the network effect. Tim Berners-Lee's first example of the utility of the Web was to put a Web server on a mainframe and have the key information that people used at CERN (Conseil Européen pour la Recherche Nucléaire), particularly the telephone book, encoded as HTML. Tim Berners-Lee describes it like this: "Many people had workstations, with one window permanently logged on to the mainframe just to be able to look up phone numbers. We showed our new system around CERN and people accepted it, though most of them didn't understand why a simple ad hoc program for getting phone numbers wouldn't have done just as well."9 In other words, people suggested a "stovepipe system" for each new function instead of a generic architecture! Why? They could not see the value of the network effect for publishing information.
8. Tim Berners-Lee, "What the Semantic Web Can Represent," http://www.w3.org/DesignIssues/RDFnot.html.
Why the Skeptics Are Wrong!
We believe that the skeptics will be proven wrong in the near future because of a convergence of the following powerful forces:
■ We have the computing power. We are building an always-on, always-connected, supercomputer-on-your-wrist information management infrastructure. When you connect cell phones to PDAs to personal computers to servers to mainframes, you have several orders of magnitude more brute-force computing power than at any time in history. More computing power makes more layers possible. For example, the virtual machines of Java and C# were conceived more than 20 years ago (the P-System was developed in 1977); however, they were not widely practical until the computing power of the 1990s was available. The underpinnings are being standardized now, and the Semantic Web will be practical, in terms of computing power, within three years.