So if you are focused on the instance level, then that level can be viewed as the object level, and its meta level is the level at which class or universal knowledge is asserted (the concepts of the ontology). If instead one is focused on the class or universal level, then that level can be viewed as the object level, and its meta level is the level of the knowledge representation language.
Table 8.6 displays the three levels of representation required for ontologies and the kinds of constructs represented at the individual levels.
■ Level 1—The knowledge representation level
■ Level 2—The ontology concept level
■ Level 3—The ontology instance level
The knowledge representation language level (the highest meta level) defines the constructs that will be used at the ontology concept level. These constructs include the notions of Class, Relation, Property, and so on. Examples of KR languages (which we talk about in more detail in the next section) include languages that preceded the Semantic Web—such as KL-ONE, Ontolingua, Classic, LOOM, Knowledge Interchange Format (KIF), CycL, and Unified Modeling Language (UML)—and Semantic Web languages, including RDF/S, DAML+OIL, and OWL.13
13KL-ONE: Brachman and Schmolze (1985); CLASSIC: Patel-Schneider et al. (1991); LOOM: MacGregor (1991); Knowledge Interchange Format (KIF): [KIF]; Ontolingua: Gruber (1993); Cyc and CycL: Lenat and Guha (1990, 1991) and [CYC]; Unified Modeling Language: [UML]; DAML+OIL: [DAML+OIL]; OWL: Dean et al. (2002), Smith et al. (2002), and McGuinness and van Harmelen (2002).
Table 8.6 Ontology Representation Levels

| Level | Relation to other levels | Example constructs |
| --- | --- | --- |
| Knowledge representation (KR) language (ontology language) level | Meta level to the ontology concept level | Class, Relation, Instance, Function, Attribute, Property, Constraint, Axiom, Rule |
| Ontology concept (OC) level | Object level to the KR language level; meta level to the instance level | Person, Location, Event, Parent, Hammer, River, FinancialTransaction, BuyingAHouse, Automobile, TravelPlanning, etc. |
| Ontology instance (OI) level | Object level to the ontology concept level | Harry X. Landsford III, Ralph Waldo Emerson, Person560234, PurchaseOrderTransactionEvent6117090, 1995-96 V-6 Ford Taurus 244/4.0 Aerostar Automatic with Block Casting # 95TM-AB and Head Casting 95TM |
Web Ontology Language is nicknamed OWL in honor of Owl in Winnie the Pooh (Milne, 1996), who spells his name "WOL." Examples of OWL documents can be found at http://www.w3.org/2001/sw/WebOnt/.
At the second level, the ontology concept (OC) level, ontologies are defined using the constructs of the KR level. At this level, you are interested in modeling the generic or universal content, the domain knowledge about Persons, Locations, Events, Parents, Hammers, and FinancialTransactions.
At the third and lowest level, the ontology instance level, the constructs are instances of ontology concept level constructs. So this level concerns the knowledge base or fact base, the assertions about instances or individuals such as Harry X. Landsford III, an instance of the class Person, and PurchaseOrderTransactionEvent6117090, an instance of the class PurchaseOrderTransactionEvent.
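The three levels and their meta/object relationships can be sketched as plain data structures. This is an illustrative Python sketch only: the names (Person, hasParent, and so on) are hypothetical examples, and a real system would express these levels in a KR language such as OWL rather than in dictionaries.

```python
# A minimal sketch of the three ontology representation levels.
# All names are illustrative; a real system would use a KR language
# such as OWL rather than Python dictionaries.

# Level 1 -- KR language level: the constructs available for modeling.
KR_CONSTRUCTS = {"Class", "Relation", "Property", "Instance", "Axiom"}

# Level 2 -- ontology concept level: domain concepts, each declared
# using a construct from the KR level (its meta level).
ontology_concepts = {
    "Person": "Class",
    "FinancialTransaction": "Class",
    "hasParent": "Relation",
}

# Level 3 -- ontology instance level: individuals, each an instance
# of a concept from the concept level (its meta level).
instances = {
    "HarryXLandsfordIII": "Person",
    "PurchaseOrderTransactionEvent6117090": "FinancialTransaction",
}

def check_levels():
    """Verify each level is well-formed relative to its meta level."""
    # Every concept must be declared with a KR-level construct.
    assert all(c in KR_CONSTRUCTS for c in ontology_concepts.values())
    # Every instance must belong to a declared concept.
    assert all(cls in ontology_concepts for cls in instances.values())
    return True
```

Calling `check_levels()` makes the meta-level dependencies explicit: each level is only meaningful relative to the level above it.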
Ontology and Semantic Mapping Problem
One important issue in understanding and developing ontologies is the ontology mapping (or semantic mapping) problem. We say "or semantic mapping" because this issue affects everything in information technology that must confront semantics—that is, the problem of representing meaning for systems, applications, databases, and document collections. You must always consider mappings between whatever representation of semantics you currently have (for a system, application, database, or document collection) and some other representation of semantics (within your own enterprise, within your community, across your market, or across the world). And you must consider semantic mappings within your set of ontologies, or within whatever your base semantic representation is (if it is not ontologies, it is probably hard-coded in the procedural code that services your databases, and that means it is really a problem).
This semantic problem exists within and without ontologies. That is, it exists within any given semantic representation such as an ontology, and it exists between (without) ontologies. Within an ontology, you will need to focus on a specific context (or view) of the ontology, given a specific purpose, rationale, or use. And without (between) ontologies, you will need to focus on the semantic equivalence between different concepts and relations in two or more distinct ontologies. These ontologies may or may not be about approximately the same things; chances are, the two distinct ontologies you need to map together say similar but slightly different things about the same domain. Or you may need to map your reference ontology or ontology lattice to another standard represented as a taxonomy, thesaurus, or ontology. In every case, you need to avoid semantic loss in the mapping.
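The between-ontologies case can be made concrete with a small sketch. The two concept sets and the equivalence table below are hypothetical illustrations, not drawn from any real ontology; the point is only to show how an incomplete mapping surfaces as semantic loss.

```python
# A minimal sketch of the ontology mapping problem: two ontologies
# describe roughly the same domain with slightly different concepts.
# All names and the equivalence table are hypothetical.

ontology_a = {"Person", "Automobile", "FinancialTransaction"}
ontology_b = {"Human", "Car", "MonetaryExchange", "Truck"}

# A hand-built mapping asserting semantic equivalence between concepts.
mapping = {
    "Person": "Human",
    "Automobile": "Car",
    "FinancialTransaction": "MonetaryExchange",
}

def unmapped_concepts(source, target, mapping):
    """Concepts on either side with no equivalent: potential semantic loss."""
    lost_in_source = source - mapping.keys()
    lost_in_target = target - set(mapping.values())
    return lost_in_source, lost_in_target
```

Here ontology_b's "Truck" has no equivalent in ontology_a, so any translation through this mapping silently drops that meaning; detecting such gaps is the first step in avoiding semantic loss.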