Discussion:
Adding data to the graph during a SPARQL query in Fuseki
Gernot Steindl
2018-11-21 07:22:25 UTC
Hello,

I implemented a custom property function which retrieves data from
a relational database during a SPARQL query. The idea is to keep
the data in the database and load it into the graph only when it’s necessary.

It worked with Jena, but when I apply it to Fuseki I run into problems
because of the locking mechanism for concurrent access.
The custom property function, which is used in a query, adds elements to
the graph (a “write”), which is not allowed during the read
transaction of the query (see the error message below).

Is it possible to configure Fuseki so that the query endpoint
always uses a write lock, like the update endpoint does? As it is
not a public endpoint, performance won’t be a problem.

My second approach would be to write the data from the relational
database into a second, in-memory graph. But I have no idea how to
make the connection between the two graphs in the custom property
function.
Do you have any hints on how I can reach my goal of adding data
only on demand?


Here is the code and the error, to get an idea of what I am trying to do:

import java.util.ArrayList;
import java.util.List;

import org.apache.jena.graph.Graph;
import org.apache.jena.graph.Node;
import org.apache.jena.graph.NodeFactory;
import org.apache.jena.query.ARQ;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.RDFNode;
import org.apache.jena.rdf.model.ResourceFactory;
import org.apache.jena.sparql.core.Var;
import org.apache.jena.sparql.engine.ExecutionContext;
import org.apache.jena.sparql.engine.QueryIterator;
import org.apache.jena.sparql.engine.binding.Binding;
import org.apache.jena.sparql.engine.binding.BindingFactory;
import org.apache.jena.sparql.engine.iterator.QueryIterPlainWrapper;
import org.apache.jena.sparql.pfunction.PFuncSimple;
import org.apache.jena.sparql.pfunction.PropertyFunction;
import org.apache.jena.sparql.pfunction.PropertyFunctionFactory;
import org.apache.jena.sparql.pfunction.PropertyFunctionRegistry;

// DatabaseConnector, InfluxDBMockup and TimeValuePair are project classes
// (not shown); getData is assumed to return a List<TimeValuePair>.
public class GetDBValues implements PropertyFunctionFactory {

    public static void init() {
        // register the custom property function
        final PropertyFunctionRegistry reg =
            PropertyFunctionRegistry.chooseRegistry(ARQ.getContext());
        reg.put("http://www.semanticweb.org/ontologies/2018/8/SensorTest.owl#getdbValues",
                new GetDBValues());
        PropertyFunctionRegistry.set(ARQ.getContext(), reg);
    }

    @Override
    public PropertyFunction create(final String uri) {
        return new PFuncSimple() {
            @Override
            public QueryIterator execEvaluated(Binding parent, Node subject,
                    Node predicate, Node object, ExecutionContext execCxt) {

                // get the datapoint URI
                String datapointURI = subject.getURI();

                // create a model over the active graph from the context
                Graph graph = execCxt.getActiveGraph();
                Model model = ModelFactory.createModelForGraph(graph);

                // here the data is retrieved from the relational database
                DatabaseConnector con = new InfluxDBMockup(); // produces dummy data
                List<TimeValuePair> series = con.getData("dbName", "table", "col");

                List<Binding> bindings = new ArrayList<>();
                if (series != null) {
                    String ns = "http://www.semanticweb.org/ontologies/2018/8/SensorTest.owl#";
                    Property valueProperty =
                        ResourceFactory.createProperty(ns + "currentValue");
                    Property timestampProperty =
                        ResourceFactory.createProperty(ns + "timestamp");

                    for (TimeValuePair obs : series) {
                        // create a blank observation node
                        Node bNode = NodeFactory.createBlankNode();
                        RDFNode rdfNode = model.asRDFNode(bNode);

                        // add timestamp and sensor value to the graph
                        // These model.add() calls cause the error!
                        model.add(rdfNode.asResource(), valueProperty,
                                  obs.getValue().toString());
                        model.add(rdfNode.asResource(), timestampProperty,
                                  obs.getTimeStamp());

                        final Binding b =
                            BindingFactory.binding(parent, Var.alloc(object), bNode);
                        bindings.add(b);
                    }
                }
                return new QueryIterPlainWrapper(bindings.iterator(), execCxt);
            }
        };
    }
}

The query looks like:
SELECT *
WHERE { ?dataPoint se:hasType ?type .
        ?dataPoint se:getdbValues ?observedValue .
        ?observedValue se:timestamp ?time
}

Here the error:

Fuseki     WARN  [7] RC = 500 : Tried to write inside a READ transaction!
org.apache.jena.sparql.JenaTransactionException: Tried to write inside a READ transaction!
        at org.apache.jena.sparql.core.mem.DatasetGraphInMemory.mutate(DatasetGraphInMemory.java:382)
        at org.apache.jena.sparql.core.mem.DatasetGraphInMemory.addToDftGraph(DatasetGraphInMemory.java:416)
        at org.apache.jena.sparql.core.DatasetGraphTriplesQuads.add(DatasetGraphTriplesQuads.java:42)
        at org.apache.jena.sparql.core.GraphView.performAdd(GraphView.java:152)
        at org.apache.jena.graph.impl.GraphBase.add(GraphBase.java:181)
        at org.apache.jena.rdf.model.impl.ModelCom.add(ModelCom.java:1202)
        at org.apache.jena.rdf.model.impl.ModelCom.add(ModelCom.java:184)
        at org.apache.jena.rdf.model.impl.ModelCom.add(ModelCom.java:172)
        at at.bim4bems.GetDBValues$1.execEvaluated(GetDBValues.java:125)
        at org.apache.jena.sparql.pfunction.PFuncSimple.execEvaluated(PFuncSimple.java:45)
        at org.apache.jena.sparql.pfunction.PropertyFunctionEval.exec(PropertyFunctionEval.java:42)
        at org.apache.jena.sparql.pfunction.PropertyFunctionBase$RepeatApplyIteratorPF.nextStage(PropertyFunctionBase.java:106)
        at org.apache.jena.sparql.engine.iterator.QueryIterRepeatApply.makeNextStage(QueryIterRepeatApply.java:108)
        at org.apache.jena.sparql.engine.iterator.QueryIterRepeatApply.hasNextBinding(QueryIterRepeatApply.java:65)
        at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:114)
        at org.apache.jena.sparql.engine.iterator.QueryIterProcedure.hasNextBinding(QueryIterProcedure.java:73)
        at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:114)
        at org.apache.jena.sparql.engine.main.StageGeneratorGeneric.execute(StageGeneratorGeneric.java:61)
        at org.apache.jena.sparql.engine.main.StageGeneratorGeneric.execute(StageGeneratorGeneric.java:53)
        at org.apache.jena.tdb.solver.StageGeneratorDirectTDB.execute(StageGeneratorDirectTDB.java:53)
        at org.apache.jena.tdb2.solver.StageGeneratorDirectTDB.execute(StageGeneratorDirectTDB.java:59)
        at org.apache.jena.sparql.engine.main.OpExecutor.execute(OpExecutor.java:128)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.visit(ExecutionDispatch.java:58)
        at org.apache.jena.sparql.algebra.op.OpBGP.visit(OpBGP.java:49)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.exec(ExecutionDispatch.java:46)
        at org.apache.jena.sparql.engine.main.OpExecutor.exec(OpExecutor.java:117)
        at org.apache.jena.sparql.engine.main.OpExecutor.execute(OpExecutor.java:228)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.visit(ExecutionDispatch.java:130)
        at org.apache.jena.sparql.algebra.op.OpSequence.visit(OpSequence.java:75)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.exec(ExecutionDispatch.java:46)
        at org.apache.jena.sparql.engine.main.OpExecutor.exec(OpExecutor.java:117)
        at org.apache.jena.sparql.engine.main.OpExecutor.execute(OpExecutor.java:88)
        at org.apache.jena.sparql.engine.main.QC.execute(QC.java:52)
        at org.apache.jena.sparql.engine.main.QueryEngineMain.eval(QueryEngineMain.java:55)
        at org.apache.jena.sparql.engine.QueryEngineBase.evaluate(QueryEngineBase.java:175)
        at org.apache.jena.sparql.engine.QueryEngineBase.createPlan(QueryEngineBase.java:131)
        at org.apache.jena.sparql.engine.QueryEngineBase.getPlan(QueryEngineBase.java:112)
        at org.apache.jena.sparql.engine.main.QueryEngineMain$QueryEngineMainFactory.create(QueryEngineMain.java:90)
        at org.apache.jena.sparql.engine.QueryExecutionBase.getPlan(QueryExecutionBase.java:593)
        at org.apache.jena.sparql.engine.QueryExecutionBase.startQueryIterator(QueryExecutionBase.java:542)
        at org.apache.jena.sparql.engine.QueryExecutionBase.execResultSet(QueryExecutionBase.java:581)
        at org.apache.jena.sparql.engine.QueryExecutionBase.execSelect(QueryExecutionBase.java:204)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.executeQuery(SPARQL_Query.java:344)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.execute(SPARQL_Query.java:288)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.executeWithParameter(SPARQL_Query.java:242)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.perform(SPARQL_Query.java:227)
        at org.apache.jena.fuseki.servlets.ActionService.executeLifecycle(ActionService.java:183)
        at org.apache.jena.fuseki.servlets.ActionService.execCommonWorker(ActionService.java:98)
        at org.apache.jena.fuseki.servlets.ActionBase.doCommon(ActionBase.java:74)
        at org.apache.jena.fuseki.servlets.FusekiFilter.doFilter(FusekiFilter.java:73)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
        at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:61)
        at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108)
        at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137)
        at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
        at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66)
        at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449)
        at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
        at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
        at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
        at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:383)
        at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
        at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
        at org.apache.jena.fuseki.servlets.CrossOriginFilter.handle(CrossOriginFilter.java:285)
        at org.apache.jena.fuseki.servlets.CrossOriginFilter.doFilter(CrossOriginFilter.java:248)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1634)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1340)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1242)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
        at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:690)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
        at org.eclipse.jetty.server.Server.handle(Server.java:503)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
        at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
        at java.lang.Thread.run(Unknown Source)
[2018-11-20 16:00:00] Fuseki     INFO  [7] 500 Tried to write inside a READ transaction! (70 ms)
Andy Seaborne
2018-11-21 10:49:15 UTC
Hi Gernot,

In SPARQL, queries don't have side effects, and that's why Fuseki runs
them inside a read transaction.

The way to create data is to make an update: have a property function
that generates the variables, and use an INSERT template to create the
data:

Outline:


INSERT {
  [] se:currentValue ?value ;
     se:timestamp    ?timestamp
} WHERE {
  [] se:getdbValues (?value ?timestamp)
}
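A property function that binds an object list like (?value ?timestamp) can't extend PFuncSimple, which handles only a single object node; it needs the list-capable base class PropertyFunctionEval. The following is a minimal, hedged sketch against the Jena 3.x API; the hard-coded series is a stand-in for the poster's DatabaseConnector, which isn't shown in the thread:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.jena.datatypes.xsd.XSDDatatype;
import org.apache.jena.graph.Node;
import org.apache.jena.graph.NodeFactory;
import org.apache.jena.sparql.core.Var;
import org.apache.jena.sparql.engine.ExecutionContext;
import org.apache.jena.sparql.engine.QueryIterator;
import org.apache.jena.sparql.engine.binding.Binding;
import org.apache.jena.sparql.engine.binding.BindingFactory;
import org.apache.jena.sparql.engine.binding.BindingMap;
import org.apache.jena.sparql.engine.iterator.QueryIterPlainWrapper;
import org.apache.jena.sparql.pfunction.PropFuncArg;
import org.apache.jena.sparql.pfunction.PropFuncArgType;
import org.apache.jena.sparql.pfunction.PropertyFunctionEval;

/** Binds (?value ?timestamp) pairs straight from the database: no graph writes. */
public class GetDbValuesPF extends PropertyFunctionEval {

    public GetDbValuesPF() {
        // subject is a single node; object is the argument list (?value ?timestamp)
        super(PropFuncArgType.PF_ARG_SINGLE, PropFuncArgType.PF_ARG_LIST);
    }

    @Override
    public QueryIterator execEvaluated(Binding parent, PropFuncArg argSubject,
            Node predicate, PropFuncArg argObject, ExecutionContext execCxt) {
        Var valueVar     = Var.alloc(argObject.getArg(0));
        Var timestampVar = Var.alloc(argObject.getArg(1));

        // Stand-in for DatabaseConnector.getData(...): fixed (value, timestamp) rows.
        String[][] series = {
            { "21.5", "2018-11-20T16:00:00" },
            { "21.7", "2018-11-20T16:01:00" },
        };

        List<Binding> bindings = new ArrayList<>();
        for (String[] obs : series) {
            BindingMap b = BindingFactory.create(parent);
            b.add(valueVar, NodeFactory.createLiteral(obs[0]));
            b.add(timestampVar,
                  NodeFactory.createLiteral(obs[1], XSDDatatype.XSDdateTime));
            bindings.add(b);
        }
        return new QueryIterPlainWrapper(bindings.iterator(), execCxt);
    }
}
```

Because the function only emits bindings and never touches the active graph, it is safe inside Fuseki's read transaction; the surrounding INSERT then does the writing inside a proper write transaction.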
Post by Gernot Steindl
My second approach would be to write the data from the relational
database into a second memory-stored graph.
This might be workable, but to me it does not look like such a good way
to do it: the INSERT above already dumps the data into the graph.

The property function can create the temporary graph and access it.
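For illustration only (the thread shows no code for this), a temporary graph built inside the property function might look like the sketch below; `ObservationScratchGraph` and the [value, timestamp] row format are invented names standing in for the poster's TimeValuePair series:

```java
import java.util.List;

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.Resource;

/** Scratch-graph helper: keeps DB rows out of the read-locked dataset. */
public class ObservationScratchGraph {
    static final String NS =
        "http://www.semanticweb.org/ontologies/2018/8/SensorTest.owl#";

    /** Each row is a [value, timestamp] pair, standing in for TimeValuePair. */
    public static Model build(List<String[]> series) {
        Model tmp = ModelFactory.createDefaultModel(); // plain in-memory model, no transaction
        Property value = tmp.createProperty(NS + "currentValue");
        Property timestamp = tmp.createProperty(NS + "timestamp");
        for (String[] obs : series) {
            Resource observation = tmp.createResource(); // blank observation node
            tmp.add(observation, value, obs[0]);
            tmp.add(observation, timestamp, obs[1]);
        }
        return tmp;
    }
}
```

Note the limitation: such a scratch model is visible only inside the property function itself. A later triple pattern like "?observedValue se:timestamp ?time" still matches against the dataset, which is why binding the values directly, as in the INSERT outline, is the more robust route.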
Post by Gernot Steindl
SELECT *
WHERE { ?dataPoint se:hasType ?type .
        ?dataPoint se:getdbValues ?observedValue .
        ?observedValue se:timestamp ?time
}
The property function does not seem to make use of ?dataPoint, so this is a
cross-product of the se:hasType triple pattern and the rest. You'll get
duplicates in the results if "?dataPoint se:hasType ?type" has more than
one match.

Andy
Gernot Steindl
2018-11-26 14:46:08 UTC
Hi Andy,

thank you for your help. You are right, I shouldn’t alter the graph
during a query! Actually, I don’t really need to make a persistent
change to the graph. Would it be possible to create a temporary graph
with the data from the relational database and bind it only to the
current query?

What I am trying to do is something similar to the Ontop project
(https://github.com/ontop/ontop), but much simpler, because I have just
time series data. So I don’t need the SQL query transformation and
mapping machinery.

At the moment I am able to return the created blank nodes as bindings to
the query from the custom property function, but it is not possible to
set the properties “currentValue” and “timestamp” on these nodes. Is
there a way I can do this? I am stuck here at the moment. Do you have
any further suggestions?


PS: Your INSERT approach would definitely work, but the database access
should be transparent to the user.
Post by Andy Seaborne
Hi Gernot,
In SPARQL queries don't have side effects and that's why Fuseki runs
them inside a read transaction.
The way to create data is to to make an update: have a property
function that generates the variables and uses the INSERT template to
INSERT {
  [] se:currentValue ?value ;
     se:timestamp    ?timestamp
} WHERE {
    [] se:getdbValues (?value ?timestamp)
}
Post by Gernot Steindl
My second approach would be to write the data from the relational
database into a second memory-stored graph.
This might be workable but to me does not look like such a good way to
do it. The INSERT above is a dump already.
The property function can create the temporary graph and access it.
Post by Gernot Steindl
SELECT *
WHERE{ ?dataPoint se:hasType ?type.
                  ?dataPoint se:getdbValues ?observedValue
                  ?observedValue se:timestamp ?time
}
The property function does not seem to make use of ?dataPoint so this
a cross-product of the se:hasType triple pattern and the rest.  You'll
get duplicates in the results if "?dataPoint se:hasType ?type" has
more than one match.
    Andy
Post by Gernot Steindl
Hello,
I implemented a custom property function, which is retrieving data
from a relational database during the a SPARQL- query. The idea is to
keep the data in the database and load it into the graph only if it’s
necessary.
It worked with Jena, but when I apply it to Fuseki I am getting
problems because of the lock mechanism for concurrent access.
The custom property function, which is used in a query, adds elements
to the graph (“write”), which is not allowed during the read-lock
transaction of the query (see error message below).
Is it possible to configure Fuseki in a way, that the query endpoint
would always use a write-lock, like the update endpoint does? As it
is not a public endpoint, the performance issue won’t be a problem.
My second approach would be to write the data from the relational
database into a second memory-stored graph. But I have no idea how I
will be able to make the connection between the two graphs for the
custom property function?
Do you have any hints for me on how I can reach my goal of adding
data just on demand?
import java.util.ArrayList;
import java.util.List;

import org.apache.jena.graph.Graph;
import org.apache.jena.graph.Node;
import org.apache.jena.graph.NodeFactory;
import org.apache.jena.query.ARQ;
import org.apache.jena.rdf.model.*;
import org.apache.jena.sparql.core.Var;
import org.apache.jena.sparql.engine.ExecutionContext;
import org.apache.jena.sparql.engine.QueryIterator;
import org.apache.jena.sparql.engine.binding.Binding;
import org.apache.jena.sparql.engine.binding.BindingFactory;
import org.apache.jena.sparql.engine.iterator.QueryIterPlainWrapper;
import org.apache.jena.sparql.pfunction.*;

public class GetDBValues implements PropertyFunctionFactory {

    public static void init() {
        // register custom property function
        final PropertyFunctionRegistry reg =
            PropertyFunctionRegistry.chooseRegistry(ARQ.getContext());
        reg.put("http://www.semanticweb.org/ontologies/2018/8/SensorTest.owl#getdbValues",
            new GetDBValues());
        PropertyFunctionRegistry.set(ARQ.getContext(), reg);
    }

    @Override
    public PropertyFunction create(final String uri) {
        return new PFuncSimple() {
            @Override
            public QueryIterator execEvaluated(Binding parent, Node subject,
                    Node predicate, Node object, ExecutionContext execCxt) {
                // get datapoint URI
                String datapointURI = subject.getURI();
                // create the model from the context
                Graph graph = execCxt.getActiveGraph();
                Model model = ModelFactory.createModelForGraph(graph);
                // here the data from the relational database is retrieved
                DatabaseConnector con = new InfluxDBMockup(); // produces dummy data
                List<TimeValuePair> series = con.getData("dbName", "table", "col");
                List<Binding> bindings = new ArrayList<>();
                if (series != null) {
                    // create graph for database entries
                    Model model2 = ModelFactory.createDefaultModel();
                    String ns = "http://www.semanticweb.org/ontologies/2018/8/SensorTest.owl#";
                    Property valueProperty =
                        ResourceFactory.createProperty(ns + "currentValue");
                    Property timestampProperty =
                        ResourceFactory.createProperty(ns + "timestamp");
                    for (TimeValuePair obs : series) {
                        // create blank observation node
                        Node bNode = NodeFactory.createBlankNode();
                        RDFNode rdfNode = model.asRDFNode(bNode);
                        // add timestamp and sensor value to the graph
                        // This model.add() causes the error!!!
                        model.add(rdfNode.asResource(), valueProperty,
                            obs.getValue().toString());
                        model.add(rdfNode.asResource(), timestampProperty,
                            obs.getTimeStamp());
                        final Binding b =
                            BindingFactory.binding(parent, Var.alloc(object), bNode);
                        bindings.add(b);
                    }
                }
                return new QueryIterPlainWrapper(bindings.iterator(), execCxt);
            }
        };
    }
}
SELECT *
WHERE { ?dataPoint se:hasType ?type .
        ?dataPoint se:getdbValues ?observedValue .
        ?observedValue se:timestamp ?time
}
Fuseki     WARN  [7] RC = 500 : Tried to write inside a READ transaction!
org.apache.jena.sparql.JenaTransactionException: Tried to write inside a READ transaction!
        at org.apache.jena.sparql.core.mem.DatasetGraphInMemory.mutate(DatasetGraphInMemory.java:382)
        at org.apache.jena.sparql.core.mem.DatasetGraphInMemory.addToDftGraph(DatasetGraphInMemory.java:416)
        at org.apache.jena.sparql.core.DatasetGraphTriplesQuads.add(DatasetGraphTriplesQuads.java:42)
        at org.apache.jena.sparql.core.GraphView.performAdd(GraphView.java:152)
        at org.apache.jena.graph.impl.GraphBase.add(GraphBase.java:181)
        at org.apache.jena.rdf.model.impl.ModelCom.add(ModelCom.java:1202)
        at org.apache.jena.rdf.model.impl.ModelCom.add(ModelCom.java:184)
        at org.apache.jena.rdf.model.impl.ModelCom.add(ModelCom.java:172)
        at at.bim4bems.GetDBValues$1.execEvaluated(GetDBValues.java:125)
        at org.apache.jena.sparql.pfunction.PFuncSimple.execEvaluated(PFuncSimple.java:45)
        at org.apache.jena.sparql.pfunction.PropertyFunctionEval.exec(PropertyFunctionEval.java:42)
        at org.apache.jena.sparql.pfunction.PropertyFunctionBase$RepeatApplyIteratorPF.nextStage(PropertyFunctionBase.java:106)
        at org.apache.jena.sparql.engine.iterator.QueryIterRepeatApply.makeNextStage(QueryIterRepeatApply.java:108)
        at org.apache.jena.sparql.engine.iterator.QueryIterRepeatApply.hasNextBinding(QueryIterRepeatApply.java:65)
        at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:114)
        at org.apache.jena.sparql.engine.iterator.QueryIterProcedure.hasNextBinding(QueryIterProcedure.java:73)
        at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:114)
        at org.apache.jena.sparql.engine.main.StageGeneratorGeneric.execute(StageGeneratorGeneric.java:61)
        at org.apache.jena.sparql.engine.main.StageGeneratorGeneric.execute(StageGeneratorGeneric.java:53)
        at org.apache.jena.tdb.solver.StageGeneratorDirectTDB.execute(StageGeneratorDirectTDB.java:53)
        at org.apache.jena.tdb2.solver.StageGeneratorDirectTDB.execute(StageGeneratorDirectTDB.java:59)
        at org.apache.jena.sparql.engine.main.OpExecutor.execute(OpExecutor.java:128)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.visit(ExecutionDispatch.java:58)
        at org.apache.jena.sparql.algebra.op.OpBGP.visit(OpBGP.java:49)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.exec(ExecutionDispatch.java:46)
        at org.apache.jena.sparql.engine.main.OpExecutor.exec(OpExecutor.java:117)
        at org.apache.jena.sparql.engine.main.OpExecutor.execute(OpExecutor.java:228)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.visit(ExecutionDispatch.java:130)
        at org.apache.jena.sparql.algebra.op.OpSequence.visit(OpSequence.java:75)
        at org.apache.jena.sparql.engine.main.ExecutionDispatch.exec(ExecutionDispatch.java:46)
        at org.apache.jena.sparql.engine.main.OpExecutor.exec(OpExecutor.java:117)
        at org.apache.jena.sparql.engine.main.OpExecutor.execute(OpExecutor.java:88)
        at org.apache.jena.sparql.engine.main.QC.execute(QC.java:52)
        at org.apache.jena.sparql.engine.main.QueryEngineMain.eval(QueryEngineMain.java:55)
        at org.apache.jena.sparql.engine.QueryEngineBase.evaluate(QueryEngineBase.java:175)
        at org.apache.jena.sparql.engine.QueryEngineBase.createPlan(QueryEngineBase.java:131)
        at org.apache.jena.sparql.engine.QueryEngineBase.getPlan(QueryEngineBase.java:112)
        at org.apache.jena.sparql.engine.main.QueryEngineMain$QueryEngineMainFactory.create(QueryEngineMain.java:90)
        at org.apache.jena.sparql.engine.QueryExecutionBase.getPlan(QueryExecutionBase.java:593)
        at org.apache.jena.sparql.engine.QueryExecutionBase.startQueryIterator(QueryExecutionBase.java:542)
        at org.apache.jena.sparql.engine.QueryExecutionBase.execResultSet(QueryExecutionBase.java:581)
        at org.apache.jena.sparql.engine.QueryExecutionBase.execSelect(QueryExecutionBase.java:204)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.executeQuery(SPARQL_Query.java:344)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.execute(SPARQL_Query.java:288)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.executeWithParameter(SPARQL_Query.java:242)
        at org.apache.jena.fuseki.servlets.SPARQL_Query.perform(SPARQL_Query.java:227)
        at org.apache.jena.fuseki.servlets.ActionService.executeLifecycle(ActionService.java:183)
        at org.apache.jena.fuseki.servlets.ActionService.execCommonWorker(ActionService.java:98)
        at org.apache.jena.fuseki.servlets.ActionBase.doCommon(ActionBase.java:74)
        at org.apache.jena.fuseki.servlets.FusekiFilter.doFilter(FusekiFilter.java:73)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
        at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:61)
        at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108)
        at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137)
        at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
        at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66)
        at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449)
        at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
        at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
        at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
        at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:383)
        at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
        at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
        at org.apache.jena.fuseki.servlets.CrossOriginFilter.handle(CrossOriginFilter.java:285)
        at org.apache.jena.fuseki.servlets.CrossOriginFilter.doFilter(CrossOriginFilter.java:248)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1634)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1340)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1242)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
        at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:690)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
        at org.eclipse.jetty.server.Server.handle(Server.java:503)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
        at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
        at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
        at java.lang.Thread.run(Unknown Source)
[2018-11-20 16:00:00] Fuseki     INFO  [7] 500 Tried to write inside a READ transaction! (70 ms)
Andy Seaborne
2018-11-27 16:23:32 UTC
Permalink
Post by Gernot Steindl
Hi Andy,
thank you for your help. You are right, I shouldn’t alter the graph
during a query! Actually, I don’t really need to make a persistent
change to the graph. Would it be possible to create a temporary graph
with the data from the relational database and bind it only to the
current query?
Yes, but there is a tradeoff.

You could have a general dataset and put the TDB-backed graph into that
dataset: any attempt to create a named graph will then become a new
in-memory graph in the general dataset.
 
But the dataset is not a full TDB one any more, so depending on what you
do with it and how big it is, that may be a performance cost.

ARQ doesn't have cross dataset query capability without making a call
out to a SPARQL endpoint.
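
As a sketch of that setup, a Fuseki assembler configuration might
combine a general in-memory dataset with a TDB-backed default graph.
This is only an illustration: the `<#...>` names and the "DB" location
are placeholders, not anything from the posts above.

```ttl
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix ja:  <http://jena.hpl.hp.com/2005/11/Assembler#> .
@prefix tdb: <http://jena.hpl.hp.com/2008/tdb#> .

# General dataset: named graphs created at query time live in memory,
# while the default graph stays persistent in TDB.
<#dataset> rdf:type ja:RDFDataset ;
    ja:defaultGraph <#tdbGraph> .

<#tdbGraph> rdf:type tdb:GraphTDB ;
    tdb:location "DB" .
```

The tradeoff Andy mentions applies here: queries run against the
general dataset, not directly against a full TDB dataset.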
Post by Gernot Steindl
What I am trying to do is something similar to the Ontop project
(https://github.com/ontop/ontop), but much simpler, because I have just
time series data. So, I don’t need the SQL query transformation and
mapping stuff.
At the moment I am able to return the created blank nodes as bindings to
the query from the custom property function, but it is not possible to
set the properties “currentValue” and “timestamp” of these nodes. Is
there a way I can do this? I am stuck here at the moment. Do you have
any further suggestions?
Would one of these work:

1/ Add a custom function that gets the current value.
<currentValue>(BlankNode) AS ?v

2/ make ns:currentValue a property function.
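
For option 1/, the query side might look roughly like this. A sketch
only: it assumes se:currentValue and se:timestamp have been registered
as custom ARQ filter functions, which is not something the code above
does.

```sparql
PREFIX se: <http://www.semanticweb.org/ontologies/2018/8/SensorTest.owl#>

SELECT ?dataPoint ?v ?time
WHERE {
  ?dataPoint se:getdbValues ?obs .
  # the custom functions look up the values behind the returned node
  BIND(se:currentValue(?obs) AS ?v)
  BIND(se:timestamp(?obs)    AS ?time)
}
```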

Andy
Post by Gernot Steindl
PS: Your INSERT approach would definitely work, but the database access
should be transparent to the user.
Post by Andy Seaborne
Hi Gernot,
In SPARQL, queries don't have side effects, and that's why Fuseki runs
them inside a read transaction.
The way to create data is to make an update: have a property
function that generates the variables and use an INSERT template:
INSERT {
  [] se:currentValue ?value ;
     se:timestamp    ?timestamp
} WHERE {
    [] se:getdbValues (?value ?timestamp)
}
Post by Gernot Steindl
My second approach would be to write the data from the relational
database into a second memory-stored graph.
This might be workable, but to me it does not look like such a good way
to do it. The INSERT above is a dump already.
The property function can create the temporary graph and access it.
Post by Gernot Steindl
SELECT *
WHERE { ?dataPoint se:hasType ?type .
        ?dataPoint se:getdbValues ?observedValue .
        ?observedValue se:timestamp ?time
}
The property function does not seem to make use of ?dataPoint, so this
is a cross-product of the se:hasType triple pattern and the rest. You'll
get duplicates in the results if "?dataPoint se:hasType ?type" has
more than one match.
    Andy