Copy a Sesame repository into a new one

Posted 2020-04-14 07:42

I'd like to copy all the data from an existing Sesame repository into a new one. I need this migration so that I can use OWL inferencing on my triplestore, which is not possible with OWLIM in an In Memory repository (the type of my existing repository).

What is the most efficient way to copy all triples from a repository into a new one?

UPDATE 1:

I'm curious to understand why using SPARQL INSERT would not be a valid approach. I tried this code in the SPARQL Update section of a new repository:

PREFIX : <http://dbpedia.org/resource/>
INSERT { ?s ?p ?o }
WHERE {
  SERVICE <http://my.ip.ad.here:8080/openrdf-workbench/repositories/rep_name> {
    ?s ?p ?o
  }
}

I get the following error:

org.openrdf.query.UpdateExecutionException: org.openrdf.sail.SailException: org.openrdf.query.QueryEvaluationException:

Is there an error in the query or can the data not be inserted this way? I've inserted data from DBpedia using queries of similar structure.

1 Answer
欢心
#2 · 2020-04-14 08:14

Manually (Workbench)

  1. Open the repository you want to copy from.
  2. Select 'Export'.
  3. Choose a suitable export format ('TriG' or 'BinaryRDF' are good choices, as both preserve context information), and hit the 'download' button to save the data export to local disk.
  4. Open the repository you want to copy to.
  5. Select 'Add'.
  6. Choose 'select the file containing the RDF data you wish to upload'.
  7. Click 'Browse' to find the data export file on your local disk.
  8. Make sure 'use Base URI as context identifier' is not selected.
  9. Hit 'upload', and sit back.
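
The same export/import round trip can also be scripted against the Sesame API. Below is a minimal sketch, assuming 'source' and 'target' are already-open RepositoryConnections (see the programmatic approach below); the dump file name is just an illustration. It writes everything, contexts included, to a TriG file and then loads that file into the target repository.

import java.io.File;
import java.io.FileOutputStream;

import org.openrdf.rio.RDFFormat;
import org.openrdf.rio.Rio;

// 'source' and 'target' are open RepositoryConnections, as set up below.
File dump = new File("repo-export.trig"); // hypothetical dump file name

// Export: serialize the entire source repository, contexts included, as TriG.
FileOutputStream out = new FileOutputStream(dump);
try {
    source.export(Rio.createWriter(RDFFormat.TRIG, out));
} finally {
    out.close();
}

// Import: parse the dump and add it to the target repository.
// A null base URI defaults to the file's own URI.
target.add(dump, null, RDFFormat.TRIG);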

Programmatically

First, open RepositoryConnections to both repositories:

RepositoryConnection source = sourceRepo.getConnection();
RepositoryConnection target = targetRepo.getConnection();

Then, read all statements from source and immediately insert into target:

target.add(source.getStatements(null, null, null, true)); 
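
Putting the pieces together, here is a rough end-to-end sketch. The server URL and repository IDs are placeholders, and the explicit begin()/commit() transaction calls require Sesame 2.7 or later; adjust to your setup.

import org.openrdf.repository.Repository;
import org.openrdf.repository.RepositoryConnection;
import org.openrdf.repository.http.HTTPRepository;

// Placeholder server URL and repository IDs -- adjust to your installation.
Repository sourceRepo = new HTTPRepository("http://localhost:8080/openrdf-sesame", "source_rep");
Repository targetRepo = new HTTPRepository("http://localhost:8080/openrdf-sesame", "target_rep");
sourceRepo.initialize();
targetRepo.initialize();

RepositoryConnection source = sourceRepo.getConnection();
RepositoryConnection target = targetRepo.getConnection();
try {
    target.begin(); // do the whole copy in a single transaction (Sesame 2.7+)
    // The final boolean controls whether inferred statements are included;
    // pass false to copy only the explicitly asserted triples.
    target.add(source.getStatements(null, null, null, true));
    target.commit();
} finally {
    source.close();
    target.close();
}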

Either basic method should work fine for any repository up to several million triples in size. For larger bulk loads there are more advanced methods, such as committing in batches (sketched below).
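
For instance, one common approach for much larger repositories is to iterate over the source statements yourself and commit in batches, so that no single transaction grows too big (the batch size here is an arbitrary illustration):

import org.openrdf.model.Statement;
import org.openrdf.repository.RepositoryResult;

RepositoryResult<Statement> statements = source.getStatements(null, null, null, true);
try {
    long count = 0;
    target.begin();
    while (statements.hasNext()) {
        target.add(statements.next());
        if (++count % 100000 == 0) { // commit every 100,000 statements (arbitrary batch size)
            target.commit();
            target.begin();
        }
    }
    target.commit();
} finally {
    statements.close();
}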
