Inference

This service automatically generates inferences when resources are created, updated, or deleted through LDP.

Features

  • Extract inverse relations from the provided OWL files
  • Automatically generate inverse links on create/update/delete operations
  • Add or remove triples directly in the triple store, in a single query
  • Option to receive inferences from, or offer them to, remote servers (via ActivityPub)
  • More inference types are planned in the future
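To illustrate what inverse-link generation means, here is a minimal sketch (not the actual SemApps implementation; the property URIs and data are hypothetical): given a map of `owl:inverseOf` pairs extracted from an ontology, each new triple whose predicate has a known inverse yields a mirrored triple.

```javascript
// Hypothetical map of owl:inverseOf pairs extracted from an OWL file
const inverseOf = {
  'http://virtual-assembly.org/ontologies/pair#hasTopic':
    'http://virtual-assembly.org/ontologies/pair#topicOf'
};

// For each triple whose predicate has a known inverse, generate the
// mirrored triple: (object, inverse-predicate, subject)
function generateInverseTriples(triples) {
  return triples
    .filter(t => inverseOf[t.predicate])
    .map(t => ({
      subject: t.object,
      predicate: inverseOf[t.predicate],
      object: t.subject
    }));
}

// Example: creating a resource with a pair#hasTopic link...
const created = [{
  subject: 'http://localhost:3000/projects/1',
  predicate: 'http://virtual-assembly.org/ontologies/pair#hasTopic',
  object: 'http://localhost:3000/topics/2'
}];

// ...produces the inverse pair#topicOf link pointing back
console.log(generateInverseTriples(created));
```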

Dependencies

Install

$ yarn add @semapps/inference

Usage

const { InferenceService } = require('@semapps/inference');

module.exports = {
  mixins: [InferenceService],
  settings: {
    baseUrl: "http://localhost:3000/",
    acceptFromRemoteServers: false,
    offerToRemoteServers: false,
    ontologies: [
      {
        "prefix": "pair",
        "owl": "http://virtual-assembly.org/ontologies/pair/ontology.ttl",
        "url": "http://virtual-assembly.org/ontologies/pair#"
      },
      ...
    ]
  }
};
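The "single query" idea from the feature list can be sketched as follows: the direct and inverse triples are combined into one SPARQL `INSERT DATA` update, so the triple store is written in a single operation. This is an illustrative sketch under that assumption, not the service's actual internals; the `buildInsertQuery` helper is hypothetical.

```javascript
// Hypothetical helper: serialize a batch of triples into a single
// SPARQL INSERT DATA update (one write to the triple store)
function buildInsertQuery(triples) {
  const lines = triples
    .map(t => `  <${t.subject}> <${t.predicate}> <${t.object}> .`)
    .join('\n');
  return `INSERT DATA {\n${lines}\n}`;
}

// Direct triple plus its generated inverse, inserted together
const query = buildInsertQuery([
  {
    subject: 'http://localhost:3000/projects/1',
    predicate: 'http://virtual-assembly.org/ontologies/pair#hasTopic',
    object: 'http://localhost:3000/topics/2'
  },
  {
    subject: 'http://localhost:3000/topics/2',
    predicate: 'http://virtual-assembly.org/ontologies/pair#topicOf',
    object: 'http://localhost:3000/projects/1'
  }
]);

console.log(query);
```

A symmetric `DELETE DATA` update would remove both triples when the resource is deleted.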

Service settings

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| `baseUrl` | String | **required** | Base URL of the LDP server |
| `acceptFromRemoteServers` | Boolean | `false` | Accept inferences from remote servers (requires RelayService) |
| `offerToRemoteServers` | Boolean | `false` | Offer inferences to remote servers (requires RelayService) |
| `ontologies` | [Object] | **required** | List of ontologies used (see example above) |

Notes

  • Remote inference is currently not available for Pod providers