
Issues of Knowledge Representation in AI
Several common issues arise in knowledge representation in AI:
- Relationship Issue
- Granularity Issue
- Attribute Issue

Each of these issues is discussed briefly below.
Relationship Issue
When we represent knowledge in a particular way, questions about relationships arise: how to handle inverses, existence, techniques for reasoning about values, and single-valued attributes.
Consider the following two representations of the same fact:
language(Danny, Javascript)
This says that Danny is a JavaScript developer; in other words, Danny's language is JavaScript.
The same fact can also be represented the other way around, as an attribute of the language:
language = Javascript, developers = Danny, ...
There is a subtle difference between the two cases: the first attaches the language to the person, while the second attaches the person to the language's set of developers. A system may need to maintain both views and keep them consistent.
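The inverse-relationship problem can be sketched in code. The following is a minimal illustration (all names here, such as `add_language`, are hypothetical, not from any standard library): the fact is stored in both directions so that queries by person and queries by language are equally cheap.

```python
# Representation 1: a list of language(person, language) facts.
facts = []
# Representation 2: the inverse view, language -> {"developers": [...]}.
languages = {}

def add_language(person, language):
    """Record the fact in both directions so inverse queries stay consistent."""
    facts.append(("language", person, language))
    entry = languages.setdefault(language, {"developers": []})
    entry["developers"].append(person)

add_language("Danny", "Javascript")

# Query either way: by person or by language.
danny_langs = [lang for (_, p, lang) in facts if p == "Danny"]
js_devs = languages["Javascript"]["developers"]
```

The design point is that neither representation alone answers both directions of the relationship efficiently, so the update routine maintains the inverse link explicitly.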
Granularity Issue
While representing knowledge, we must decide at what level the knowledge should be represented and what the primitives are. Primitives are fundamental concepts such as holding, seeing, or playing.
For example, English has over half a million words, so in many situations it is difficult to decide which words to choose as our primitives. Consider the statements below:
If Harry feeds a dog, this could become:
feeds(harry, dog)
If Harry gives the dog a bone, that could be:
gives(harry, dog, bone)
These two representations use different primitives, so the system cannot see that both involve feeding. We should therefore choose primitives carefully. Here we may need an additional statement that relates giving food to feeding:
gives(x, y, food) → feeds(x, y)
So we may need to add inference rules of this kind.
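Such an inference rule can be sketched as a simple forward-chaining step. In the sketch below, the three-argument `gives(giver, receiver, item)` encoding and the set of items that count as food are illustrative assumptions:

```python
# Assumption for illustration: which items count as food.
FOODS = {"bone"}

def infer_feeds(facts):
    """One forward-chaining pass: gives(x, y, item) with a food item
    derives the additional fact feeds(x, y)."""
    derived = set(facts)
    for fact in facts:
        if fact[0] == "gives" and fact[3] in FOODS:
            derived.add(("feeds", fact[1], fact[2]))
    return derived

kb = {("gives", "harry", "dog", "bone")}
kb = infer_feeds(kb)
```

After the pass, the knowledge base contains both the original `gives` fact and the derived `feeds` fact, so a query about feeding now succeeds even though feeding was never stated directly.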
Attribute Issue
Some attributes occur in many different types of problem. Two such basic attributes are instance and isa; each is important because each supports property inheritance. Deciding how to represent and handle such general-purpose attributes is the attribute issue.
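Property inheritance through instance and isa links can be sketched as follows. All the concrete names here (rex, dog, mammal, and the properties) are illustrative assumptions, not from the text above:

```python
# isa links: class -> superclass; instance links: individual -> class.
isa = {"dog": "mammal", "mammal": "animal"}
instance = {"rex": "dog"}

# Properties attached at the level where they naturally belong.
properties = {
    "animal": {"alive": True},
    "mammal": {"legs": 4},
}

def get_property(name, prop):
    """Follow the instance link, then climb isa links,
    until the property is found or the hierarchy runs out."""
    cls = instance.get(name, name)
    while cls is not None:
        if prop in properties.get(cls, {}):
            return properties[cls][prop]
        cls = isa.get(cls)
    return None

legs = get_property("rex", "legs")    # inherited from mammal
alive = get_property("rex", "alive")  # inherited from animal
```

The key point is that neither property is stored on rex or dog directly; both are found by inheritance, which is why instance and isa deserve special treatment in a representation.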