KG bots. Wikipedia bots play an important role in the quality and maintenance of Wikipedia. We follow a similar approach in DeKaloG. The accessibility principle allows any query to be written against any KG, so a “KG bot” can acquire the knowledge it needs from any KG, even if the bot is not executed by the KG provider (as in Wikipedia). For example, we can write a simple “sameAsBot” that looks for entities sharing a value of an inverse functional property such as “hasHomepage”.
- The sameAsBot acquires data with the query “?x1 hasHomepage ?y . ?x2 hasHomepage ?y . FILTER(?x1 != ?x2)”.
- Next, for each result, the sameAsBot creates a new fact “?x1 sameAs ?x2” in its own sameAsBot KG. For transparency, each fact in the sameAsBot KG carries the query and its mappings in its context.
- Thanks to the index, the KG provider discovers that new context elements are available for some of her entities. Such tasks can be handled automatically by an “ObserveBot”.
- Thanks to provenance information, she can automatically verify the fact “?x1 sameAs ?x2” and update her KG. As we can see, context information enables collaboration between bots.
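The steps above can be sketched in plain Python. This is a minimal, self-contained illustration, not the DeKaloG implementation: triples are tuples, the KG data and the “hasHomepage” predicate name are invented for the example, and the SPARQL query is simulated by an in-memory join.

```python
# Minimal sketch of a "sameAsBot" (hypothetical example; the KG data and
# predicate names are assumptions, not taken from DeKaloG itself).
from itertools import combinations

def same_as_candidates(triples, ifp="hasHomepage"):
    """Find entity pairs sharing a value of an inverse functional property.

    Simulates the query:
      ?x1 hasHomepage ?y . ?x2 hasHomepage ?y . FILTER(?x1 != ?x2)
    """
    by_value = {}
    for s, p, o in triples:
        if p == ifp:
            by_value.setdefault(o, set()).add(s)
    pairs = []
    for subjects in by_value.values():
        # Every pair of distinct subjects sharing the value is a candidate.
        pairs.extend(combinations(sorted(subjects), 2))
    return pairs

def derive_same_as(triples):
    """Create "sameAs" facts, each annotated with the query and its
    mappings as provenance context, so the KG provider can verify them."""
    facts = []
    for x1, x2 in same_as_candidates(triples):
        facts.append({
            "fact": (x1, "sameAs", x2),
            "context": {
                "query": "?x1 hasHomepage ?y . ?x2 hasHomepage ?y . "
                         "FILTER(?x1 != ?x2)",
                "mappings": {"?x1": x1, "?x2": x2},
            },
        })
    return facts

if __name__ == "__main__":
    kg = [
        ("ex:alice", "hasHomepage", "http://alice.example"),
        ("ex:a_lice", "hasHomepage", "http://alice.example"),
        ("ex:bob", "hasHomepage", "http://bob.example"),
    ]
    for f in derive_same_as(kg):
        print(f["fact"], f["context"]["mappings"])
```

Keeping the query and its variable mappings alongside each derived fact is what lets a downstream “ObserveBot” or the KG provider re-run the check and accept or reject the fact automatically.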
This scenario illustrates how the different principles of DeKaloG enable web automation. KG bots enable automatic refinement of KGs, contributing to the improvement of global knowledge. This is a way to bootstrap a virtuous web of KGs. In addition, KG bots can be instrumented to document their actions and thus contribute to the transparency of KG refinement tasks.