Week 3: OWL & Reasoning
Learning Objectives
This week introduces the Web Ontology Language (OWL) and automated reasoning. You will learn to create expressive ontologies with class restrictions and use reasoners to infer new knowledge.
1. Introduction to OWL
Why RDFS Isn't Enough: The "Dictionary" vs. The "Logic Book"
In Week 2, we used RDFS to define our vocabulary. RDFS is like a Dictionary: it tells you that "Movie" is a type of "Work".
But a dictionary can't enforce logic. It can't say:
- "A Movie MUST have at least one Director."
- "A Person cannot be both a Movie and an Actor."
OWL (Web Ontology Language) is the Logic Book. It adds the rules of the world to your vocabulary, allowing a machine to reason and discover new facts.
OWL extends RDFS with more expressive power for defining complex ontologies.
OWL Profiles
| Profile | Expressivity | Use Case |
|---|---|---|
| OWL 2 EL | Low | Large ontologies, classification |
| OWL 2 QL | Low | Query answering, databases |
| OWL 2 RL | Medium | Rule-based reasoning |
| OWL 2 DL | High | Full reasoning, decidable |
| OWL 2 Full | Maximum | No reasoning guarantees |
OWL vs RDFS
RDFS Features:
- Class and property hierarchies
- Domain and range constraints
OWL Additions:
- Class restrictions (someValuesFrom, allValuesFrom)
- Cardinality constraints
- Property characteristics (transitive, symmetric)
- Class operations (union, intersection, complement)
- Equivalence and disjointness
2. Class Hierarchy and Relationships
Defining Classes in OWL
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex: <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
# Class declarations
ex:Animal a owl:Class .
ex:Mammal a owl:Class ;
rdfs:subClassOf ex:Animal .
ex:Bird a owl:Class ;
rdfs:subClassOf ex:Animal .
# Disjoint classes
ex:Mammal owl:disjointWith ex:Bird .
# Equivalent classes
ex:Human owl:equivalentClass ex:Person .
Property Definitions
# Object property (relates individuals)
ex:hasParent a owl:ObjectProperty ;
rdfs:domain ex:Person ;
rdfs:range ex:Person .
# Datatype property (relates to literals)
ex:hasAge a owl:DatatypeProperty ;
rdfs:domain ex:Person ;
rdfs:range xsd:integer .
# Inverse properties
ex:hasChild a owl:ObjectProperty ;
owl:inverseOf ex:hasParent .
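The same properties can be declared in Python with Owlready2, which you will use later this week. A minimal sketch, assuming a hypothetical ontology IRI; the Person >> Person syntax creates an object property and Person >> int a datatype property:

from owlready2 import *

onto = get_ontology("http://example.org/people.owl")  # hypothetical IRI

with onto:
    class Person(Thing): pass

    # Object property: relates Person individuals to Person individuals
    class hasParent(Person >> Person): pass

    # Datatype property: relates Person individuals to integer literals
    class hasAge(Person >> int): pass

    # Inverse property: hasChild is the inverse of hasParent
    class hasChild(Person >> Person):
        inverse_property = hasParent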
3. Property Characteristics
OWL supports various property characteristics for richer modeling:
Transitive Properties
ex:isAncestorOf a owl:ObjectProperty,
owl:TransitiveProperty .
# If A isAncestorOf B and B isAncestorOf C
# Then A isAncestorOf C is inferred
Symmetric and Asymmetric
ex:isSiblingOf a owl:ObjectProperty,
owl:SymmetricProperty .
# If A isSiblingOf B, then B isSiblingOf A
ex:isParentOf a owl:ObjectProperty,
owl:AsymmetricProperty .
Functional and Inverse Functional
ex:hasBiologicalMother a owl:ObjectProperty,
owl:FunctionalProperty .
# Each person has exactly one biological mother
ex:hasSocialSecurityNumber a owl:DatatypeProperty,
owl:InverseFunctionalProperty .
# SSN uniquely identifies a person
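These characteristics can also be set in Owlready2 by listing them as extra base classes of the property. A minimal sketch under a hypothetical ontology IRI, mirroring the Turtle above:

from owlready2 import *

onto = get_ontology("http://example.org/props.owl")  # hypothetical IRI

with onto:
    class Person(Thing): pass

    # Transitive: ancestor-of chains are collapsed by the reasoner
    class isAncestorOf(Person >> Person, TransitiveProperty): pass

    # Symmetric: asserting one direction entails the other
    class isSiblingOf(Person >> Person, SymmetricProperty): pass

    # Functional: at most one value per individual
    class hasBiologicalMother(Person >> Person, FunctionalProperty): pass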
4. Class Restrictions
Existential Restrictions (someValuesFrom)
# A Parent is someone who has at least one child
ex:Parent a owl:Class ;
owl:equivalentClass [
a owl:Restriction ;
owl:onProperty ex:hasChild ;
owl:someValuesFrom ex:Person
] .
Universal Restrictions (allValuesFrom)
# Vegetarians only eat vegetables
ex:Vegetarian a owl:Class ;
rdfs:subClassOf [
a owl:Restriction ;
owl:onProperty ex:eats ;
owl:allValuesFrom ex:Vegetable
] .
Cardinality Restrictions
# A bicycle has exactly 2 wheels
ex:Bicycle a owl:Class ;
rdfs:subClassOf [
a owl:Restriction ;
owl:onProperty ex:hasWheel ;
owl:cardinality "2"^^xsd:nonNegativeInteger
] .
# A person has at least 1 parent
ex:Person rdfs:subClassOf [
a owl:Restriction ;
owl:onProperty ex:hasParent ;
owl:minCardinality "1"^^xsd:nonNegativeInteger
] .
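The same restrictions translate directly to Owlready2, which the project notebook uses. A minimal sketch, assuming a hypothetical ontology IRI and the class and property names from the Turtle snippets above; the & and Not operators correspond to owl:intersectionOf and owl:complementOf from the earlier list (| gives owl:unionOf):

from owlready2 import *

onto = get_ontology("http://example.org/restrictions.owl")  # hypothetical IRI

with onto:
    class Person(Thing): pass
    class Vegetable(Thing): pass
    class Wheel(Thing): pass
    class Bicycle(Thing): pass
    class hasChild(Person >> Person): pass
    class hasParent(Person >> Person): pass
    class eats(Person >> Thing): pass
    class hasWheel(Bicycle >> Wheel): pass

    # someValuesFrom: a Parent is anyone with at least one Person child
    class Parent(Person):
        equivalent_to = [Person & hasChild.some(Person)]

    # allValuesFrom: Vegetarians eat only Vegetables
    class Vegetarian(Person): pass
    Vegetarian.is_a.append(eats.only(Vegetable))

    # Cardinality: exactly 2 wheels, at least 1 parent
    Bicycle.is_a.append(hasWheel.exactly(2, Wheel))
    Person.is_a.append(hasParent.min(1, Person))

    # Complement (class operation): a NonVegetarian is a Person who is not a Vegetarian
    class NonVegetarian(Person):
        equivalent_to = [Person & Not(Vegetarian)]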
5. Using Protégé
Protégé is the most popular open-source ontology editor.
Key Features
- Visual class hierarchy browser
- Property definition interface
- Restriction builder
- Reasoner integration
- Ontology visualization
Creating an Ontology in Protégé
- Create Classes: Define your class hierarchy in the Classes tab
- Define Properties: Add object and data properties
- Add Restrictions: Use the class expression editor
- Run Reasoner: Use HermiT or Pellet to check consistency
- Export: Save as OWL/XML, Turtle, or other formats
Protégé Workflow
1. File > New Ontology
2. Set IRI: http://example.org/myontology
3. Classes tab > Add subclass
4. Object Properties tab > Add property
5. Reasoner > Start reasoner
6. Inferred view shows derived facts
6. Automated Reasoning
What Reasoners Do
| Task | Description |
|---|---|
| Consistency Checking | Verify no contradictions exist |
| Classification | Compute inferred class hierarchy |
| Realization | Determine class membership of individuals |
| Entailment | Check if a statement is implied |
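To make consistency checking concrete, here is a small Owlready2 sketch with hypothetical classes: asserting an individual into two disjoint classes makes the ontology inconsistent, and the reasoner reports it as an error.

from owlready2 import *

onto = get_ontology("http://example.org/zoo.owl")  # hypothetical IRI

with onto:
    class Animal(Thing): pass
    class Mammal(Animal): pass
    class Bird(Animal): pass
    AllDisjoint([Mammal, Bird])      # Mammal and Bird share no members

    platypus = Mammal("platypus")
    platypus.is_a.append(Bird)       # contradicts the disjointness axiom

try:
    with onto:
        sync_reasoner()              # HermiT, bundled with Owlready2; requires Java
except OwlReadyInconsistentOntologyError:
    print("Inconsistent: platypus cannot be both a Mammal and a Bird")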
Popular Reasoners
- HermiT: Sound and complete OWL 2 DL reasoner; the default for Owlready2's sync_reasoner()
- Pellet: Supports rules, explanations
- ELK: Fast for OWL 2 EL ontologies
- FaCT++: Efficient description logic reasoner
Reasoning in Python with Owlready2
from owlready2 import *
# Load ontology
onto = get_ontology("family.owl").load()
# Run reasoner (HermiT, bundled with Owlready2; requires Java)
with onto:
    sync_reasoner()
# Check inferred facts
for person in onto.Person.instances():
    print(f"{person.name}: {person.is_a}")
# Query inferred classes (after reasoning, is_a also contains inferred superclasses)
for cls in onto.classes():
    inferred_parents = cls.is_a
    print(f"{cls.name} subClassOf {inferred_parents}")
Example: Inference
# Defined classes
ex:Person a owl:Class .
ex:hasChild a owl:ObjectProperty .
ex:Parent owl:equivalentClass [
a owl:Restriction ;
owl:onProperty ex:hasChild ;
owl:someValuesFrom ex:Person
] .
# Instance data
ex:John a ex:Person ;
ex:hasChild ex:Mary .
ex:Mary a ex:Person .
# Reasoner infers:
# ex:John a ex:Parent .
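The same inference can be reproduced in Owlready2. A minimal end-to-end sketch under a hypothetical ontology IRI, mirroring the Turtle example:

from owlready2 import *

onto = get_ontology("http://example.org/family.owl")  # hypothetical IRI

with onto:
    class Person(Thing): pass
    class hasChild(Person >> Person): pass

    # Parent is equivalent to "Person with at least one Person child"
    class Parent(Person):
        equivalent_to = [Person & hasChild.some(Person)]

    john = Person("John")
    mary = Person("Mary")
    john.hasChild = [mary]

with onto:
    sync_reasoner()                  # HermiT; requires Java

print(Parent in john.is_a)           # True: John is classified as a Parent
print(list(Parent.instances()))      # John appears among the inferred instances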
Project: Movie Recommendation Knowledge Graph
Progress
| Week | Topic | Project Milestone |
|---|---|---|
| 1 | Ontology Introduction | Movie domain design completed |
| 2 | RDF & RDFS | 10 movies converted to RDF |
| 3 | OWL & Reasoning | Discover hidden relationships with inference rules |
| 4 | Knowledge Extraction | Collect movie information from Wikipedia |
| 5 | Neo4j | Store in graph DB and query |
| 6 | GraphRAG | Natural language queries |
| 7 | Ontology Agent | Automatic updates for new movies |
| 8 | Domain Extension | Medical/Legal/Finance cases |
| 9 | Service Deployment | API + Dashboard |
Week 3 Milestone: Adding OWL Reasoning Rules
This week, you will apply OWL reasoning rules to the RDF data to automatically discover hidden relationships.
Reasoning Rules to Add:
# These class definitions go inside a `with onto:` block (Owlready2).
# 1. VeteranDirector: a director who has directed 3 or more movies
class VeteranDirector(Director):
    equivalent_to = [Director & directed.min(3, Movie)]
# 2. Blockbuster: a movie with rating >= 8.0 AND runtime >= 120 minutes
#    (numeric comparisons use ConstrainedDatatype, not Python lambdas)
class Blockbuster(Movie):
    equivalent_to = [Movie
                     & rating.some(ConstrainedDatatype(float, min_inclusive=8.0))
                     & runtime.some(ConstrainedDatatype(int, min_inclusive=120))]
# 3. FrequentCollaborator: an actor who worked with the same director on 2+ movies
#    (requires counting, which OWL restrictions cannot express; see the sketch below)
class FrequentCollaborator(Actor):
    pass  # membership is computed in Python after reasoning
Inference Results Example:
| Input Data | Inferred Result |
|---|---|
| Nolan directed 5 movies | -> VeteranDirector |
| Inception (rating 8.8, 148 min) | -> Blockbuster |
| DiCaprio + Nolan: 2 movies | -> FrequentCollaborator |
In the project notebook, you will use reasoning to automatically discover these hidden relationships. Specifically, you will implement:
- VeteranDirector class: Automatically classify directors with 3+ movies
- Blockbuster class: Infer movies with rating 8.0+ AND runtime 120+ minutes
- FrequentCollaborator: Discover actors who worked with the same director 2+ times
- Run the Pellet reasoner to automatically infer facts such as "Nolan is a VeteranDirector" (see the sketch below)
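FrequentCollaborator cannot be written as a plain OWL restriction because it requires counting movies per director, so one approach is to run the reasoner first and then compute the membership in Python. The sketch below assumes hypothetical names (a movies.owl file plus hasActor and directedBy properties); adapt them to the schema you built in Week 2.

from collections import Counter
from owlready2 import *

onto = get_ontology("movies.owl").load()   # hypothetical file name

with onto:
    # Pellet also ships with Owlready2 (Java required); it infers the
    # OWL-expressible classes such as VeteranDirector and Blockbuster.
    sync_reasoner_pellet()

    # Count, for each actor, how many movies they share with each director,
    # then assert FrequentCollaborator membership for counts of 2 or more.
    for actor in onto.Actor.instances():
        directors = Counter(
            director
            for movie in onto.Movie.instances()
            if actor in movie.hasActor
            for director in movie.directedBy
        )
        if any(count >= 2 for count in directors.values()):
            actor.is_a.append(onto.FrequentCollaborator)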
What you'll build by Week 9: an AI agent that answers queries like "Recommend sci-fi movies in Nolan's style" by reasoning over director-genre-rating relationships in the knowledge graph.
Practice Notebook
For deeper exploration of the theory, the practice notebook covers additional topics:
- Intersection, Union, Complement class expressions
- Pellet vs HermiT Reasoner performance comparison
- SHACL for data quality validation
- Reasoning visualization in Protégé
Interview Questions
What is the Open World Assumption and how does it affect reasoning?
Key Points:
- OWA: Absence of information does not imply falsity
- Unlike databases, which make the Closed World Assumption, facts missing from the data may still be true
- Affects query answers: "not found" differs from "false"
- Implications for negation and cardinality reasoning
- Example: If we don't know John's age, we cannot assume he has no age (see the sketch below)
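A tiny Owlready2 sketch (hypothetical names) that makes the distinction concrete:

from owlready2 import *

onto = get_ontology("http://example.org/owa-demo.owl")  # hypothetical IRI

with onto:
    class Person(Thing): pass
    class hasAge(Person >> int, FunctionalProperty): pass
    john = Person("John")

print(john.hasAge)   # None: John's age is unknown, not known to be absent

A SQL database would read the missing value as "John has no age"; an OWL reasoner draws no such conclusion, which is why negation and maximum-cardinality checks need explicit assertions rather than mere absence of data.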
Premium Content
Want complete solutions with detailed explanations and production-ready code?
Check out the Ontology & Knowledge Graph Cookbook Premium for:
- Complete notebook solutions with step-by-step explanations
- Real-world case studies and best practices
- Interview preparation materials
- Production deployment guides
Next Steps
In Week 4: Knowledge Extraction, you will learn how to extract knowledge from unstructured text using NER and relation extraction.