# Inference

This service automatically generates inferences when resources are created, updated, or deleted through LDP.
## Features

- Extracts inverse relations from the provided OWL files
- Automatically generates inverse links on create/update/delete operations
- Adds or removes the inferred triples directly in the triple store, in a single query
- Can receive or offer inferences from/to remote servers (via ActivityPub)
- More inference types are planned
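To illustrate the first feature, here is a purely hypothetical sketch of extracting `owl:inverseOf` pairs from a Turtle ontology. It is not the service's actual implementation (which would use a proper RDF parser); the `ex:` predicates are made-up examples, and the regex only handles simple, well-formed input.

```javascript
// Hypothetical sketch: find owl:inverseOf declarations in a Turtle string.
// A real implementation should use an RDF/Turtle parser instead of a regex.
const turtle = `
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix ex: <http://example.org/ns#> .
ex:hasPart owl:inverseOf ex:partOf .
`;

const extractInversePairs = (ttl) =>
  [...ttl.matchAll(/(\S+)\s+owl:inverseOf\s+(\S+)\s*\./g)]
    .map((m) => [m[1], m[2]]);

console.log(extractInversePairs(turtle)); // → [ [ 'ex:hasPart', 'ex:partOf' ] ]
```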
## Dependencies
- TripleStoreService
- LdpService
- RelayService (for inferences with remote servers)
## Install

```bash
$ yarn add @semapps/inference
```
## Usage

```js
const { InferenceService } = require('@semapps/inference');

module.exports = {
  mixins: [InferenceService],
  settings: {
    baseUrl: 'http://localhost:3000/',
    acceptFromRemoteServers: false,
    offerToRemoteServers: false,
    ontologies: [
      {
        prefix: 'pair',
        owl: 'http://virtual-assembly.org/ontologies/pair/ontology.ttl',
        url: 'http://virtual-assembly.org/ontologies/pair#'
      },
      ...
    ]
  }
};
```
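Once ontologies are configured, the service derives inverse links on create/update/delete. The sketch below shows the general idea only, not the service's internals: given a map of inverse predicates, each new triple yields its mirror triple. The `ex:` URIs and the `inferInverseTriples` helper are made up for illustration.

```javascript
// Hypothetical sketch of inverse-link generation (not the actual service code).
// The predicate map would come from the configured OWL ontologies.
const inverseOf = {
  'http://example.org/ns#hasPart': 'http://example.org/ns#partOf'
};

// For each triple whose predicate has a known inverse,
// produce the mirrored triple (object → subject).
const inferInverseTriples = (triples) =>
  triples
    .filter(({ predicate }) => inverseOf[predicate])
    .map(({ subject, predicate, object }) => ({
      subject: object,
      predicate: inverseOf[predicate],
      object: subject
    }));

console.log(
  inferInverseTriples([
    {
      subject: 'http://localhost:3000/organizations/my-org',
      predicate: 'http://example.org/ns#hasPart',
      object: 'http://localhost:3000/users/alice'
    }
  ])
);
```

In the real service, these derived triples are written to (or removed from) the triple store in the same query as the original operation.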
## Service settings

| Property | Type | Default | Description |
|---|---|---|---|
| `baseUrl` | `String` | **required** | Base URL of the LDP server |
| `acceptFromRemoteServers` | `Boolean` | `false` | Accept inferences from remote servers (requires RelayService) |
| `offerToRemoteServers` | `Boolean` | `false` | Offer inferences to remote servers (requires RelayService) |
| `ontologies` | `[Object]` | **required** | List of ontologies used (see example above) |
## Notes
- Remote inference is currently not available for Pod providers