We (Thomas and Wolfgang) have installed the Wikidata Query Service and Blazegraph locally, following the instructions here: https://github.com/wikimedia/wikidata-query-rdf/blob/master/docs/getting-started.md
The mvn package command was successful:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] parent ............................................. SUCCESS [ 54.103 s]
[INFO] Shared code ........................................ SUCCESS [ 23.085 s]
[INFO] Wikidata Query RDF Testing Tools ................... SUCCESS [ 11.698 s]
[INFO] Blazegraph extension to improve performance for Wikibase SUCCESS [02:12 min]
[INFO] Blazegraph Service Package ......................... SUCCESS [01:02 min]
[INFO] Wikidata Query RDF Tools ........................... SUCCESS [02:19 min]
[INFO] Wikibase RDF Query Service ......................... SUCCESS [ 25.466 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
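For anyone reproducing this, the build amounted to roughly the following (the dist zip name is inferred from the paths further down and may differ for other versions):

# clone and build the query service (version numbers may vary)
git clone https://github.com/wikimedia/wikidata-query-rdf.git
cd wikidata-query-rdf
mvn package
# unpack the service distribution produced by the build
cd dist/target
unzip service-0.3.0-SNAPSHOT-dist.zip
cd service-0.3.0-SNAPSHOT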
We are both using
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)
We both downloaded latest-all.ttl.gz (about 31 GB) from https://dumps.wikimedia.org/wikidatawiki/entities/, which took some 4 hours:
31064651574 Jan 3 19:30 latest-all.ttl.gz
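The download itself was a plain HTTP fetch, something like:

# fetch the full Wikidata TTL dump (~31 GB compressed) and check its size
wget https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.ttl.gz
ls -l latest-all.ttl.gz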
Running munge.sh created 424 files named wikidump-000000001.ttl.gz, wikidump-000000002.ttl.gz, and so on in data/split:
~/wikidata/wikidata-query-rdf/dist/target/service-0.3.0-SNAPSHOT$ ./munge.sh -f data/latest-all.ttl.gz -d data/split -l en,de
#logback.classic pattern: %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n
08:23:02.391 [main] INFO org.wikidata.query.rdf.tool.Munge - Switching to data/split/wikidump-000000001.ttl.gz
08:24:21.249 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 10000 entities at (105, 47, 33)
08:25:07.369 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 20000 entities at (162, 70, 41)
08:25:56.862 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 30000 entities at (186, 91, 50)
08:26:43.594 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 40000 entities at (203, 109, 59)
08:27:24.042 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 50000 entities at (224, 126, 67)
08:28:00.770 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 60000 entities at (244, 142, 75)
08:28:32.670 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 70000 entities at (272, 161, 84)
08:29:12.529 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 80000 entities at (261, 172, 91)
08:29:47.764 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 90000 entities at (272, 184, 98)
08:30:20.254 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 100000 entities at (286, 196, 105)
08:30:20.256 [main] INFO org.wikidata.query.rdf.tool.Munge - Switching to data/split/wikidump-000000002.ttl.gz
08:30:55.058 [main] INFO org.wikidata.query.rdf.tool.Munge - Processed 110000 entities at (286, 206, 112)
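For reference, one way to check that a split file produced by munge.sh is valid gzip-compressed Turtle would be something like (standard tools, not part of the query service):

# confirm the file is gzip data and peek at the first few Turtle lines
file data/split/wikidump-000000001.ttl.gz
zcat data/split/wikidump-000000001.ttl.gz | head -n 5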
When Thomas tried to load one of these files into Blazegraph with
./loadRestAPI.sh -n wdq -d data/split/wikidump-000000001.ttl.gz
he got the error below. Importing the file via the UPDATE tab of the Blazegraph web UI did not work either.
What can be done to fix this?
ERROR: uri=[file:/home/tsc/projects/TestSPARQL/wikidata-query-rdf-0.2.1/dist/target/service-0.2.1/data/split/wikidump-000000001.ttl.gz], context-uri=[]
java.util.concurrent.ExecutionException: org.openrdf.rio.RDFParseException: Expected an RDF value here, found '' [line 1]
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:281)
	at com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:397)
	at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:116)
	at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:303)
	at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:192)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:808)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
	at org.eclipse.jetty.server.Server.handle(Server.java:497)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.openrdf.rio.RDFParseException: Expected an RDF value here, found '' [line 1]
	at org.openrdf.rio.helpers.RDFParserHelper.reportFatalError(RDFParserHelper.java:441)
	at org.openrdf.rio.helpers.RDFParserBase.reportFatalError(RDFParserBase.java:671)
	at org.openrdf.rio.turtle.TurtleParser.reportFatalError(TurtleParser.java:1306)
	at org.openrdf.rio.turtle.TurtleParser.parseValue(TurtleParser.java:637)
	at org.openrdf.rio.turtle.TurtleParser.parseSubject(TurtleParser.java:449)
	at org.openrdf.rio.turtle.TurtleParser.parseTriples(TurtleParser.java:383)
	at org.openrdf.rio.turtle.TurtleParser.parseStatement(TurtleParser.java:261)
	at org.openrdf.rio.turtle.TurtleParser.parse(TurtleParser.java:216)
	at org.openrdf.rio.turtle.TurtleParser.parse(TurtleParser.java:159)
	at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:556)
	at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414)
	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more
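Since the parser complains about an empty value at line 1 of the .ttl.gz, one thing that might help narrow this down (untested on our side; the endpoint URL is assumed from the default local setup and may differ) would be to decompress a split file and POST it to the Blazegraph REST API as plain Turtle, bypassing the gzip handling:

# decompress one split file to plain Turtle
zcat data/split/wikidump-000000001.ttl.gz > data/split/wikidump-000000001.ttl
# POST it to the wdq namespace (endpoint assumed from the default configuration)
curl -X POST -H 'Content-Type: text/turtle' \
     --data-binary @data/split/wikidump-000000001.ttl \
     http://localhost:9999/bigdata/namespace/wdq/sparql

Would that be a reasonable workaround, or is there a better fix for loadRestAPI.sh itself?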