Does the work on linked data fragments [1] help at all? I'm thinking that it might make sense to have triples for some functions, using LDF. But I admit I'm still struggling to understand it.
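For what it's worth, the core idea of Triple Pattern Fragments (the main LDF interface) is that a client asks the server for just one triple pattern at a time via plain HTTP, and does the join work itself. A minimal sketch of building such a request, assuming a hypothetical fragments endpoint and example IRIs:

```python
from urllib.parse import urlencode

def tpf_url(endpoint, s=None, p=None, o=None):
    """Build a Triple Pattern Fragments request URL for one pattern.

    Unbound positions are simply omitted, which is how TPF expresses
    a variable in that position.
    """
    params = {k: v for k, v in
              {"subject": s, "predicate": p, "object": o}.items() if v}
    return endpoint + "?" + urlencode(params)

# e.g. all titles for one work (endpoint and subject IRI are made up)
url = tpf_url("http://example.org/fragments",
              s="http://example.org/work/123",
              p="http://purl.org/dc/terms/title")
```

The server only ever has to answer single-pattern lookups, which is why the approach is claimed to scale to large collections more cheaply than a full SPARQL endpoint.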


On 4/10/15 9:17 AM, Simon Spero wrote:

On Apr 10, 2015 10:33 AM, "LeVan,Ralph" <[log in to unmask]> wrote:
> Millions of triples?  Puhleeze.
> At OCLC we've got >300M bib records, around a billion article records and billions of holdings records.  That's going to be a *lot* of triples.

You forgot to mention member organizations and ILLiad.

It is hard to guesstimate the triple count for OCLC bib records, as a large number of records are damaged enough to defeat deduplication, but can still be grouped into the same workset. This makes property inheritance with overriding especially effective. (A project for when the Ed O'Neill emulator becomes self-aware? :)
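To make the inheritance-with-overriding point concrete, here is a minimal sketch (property names and values are invented for illustration): workset-level statements serve as defaults, and a record only needs to store the properties where it differs, which can cut the triple count substantially when many near-duplicate records share a workset.

```python
def effective_properties(workset_props, record_props):
    """Record-level values override workset-level defaults;
    anything not overridden is inherited from the workset."""
    merged = dict(workset_props)
    merged.update(record_props)
    return merged

# Hypothetical workset defaults and one record's local overrides
workset = {"title": "Moby Dick", "author": "Melville, Herman"}
record = {"title": "Moby Dick; or, The Whale"}  # variant title only

props = effective_properties(workset, record)
# author is inherited from the workset; title comes from the record
```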

It is possible to scale triple stores to this size, but most of the entities are better off stored using other approaches, with triples generated from that other store as required. Most commercial triple stores also provide row- and/or column-organized tables.
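The generate-triples-on-demand idea can be sketched very simply: keep the records in a conventional row store and materialize triples only when a consumer asks for them. A minimal illustration, with column names and predicate IRIs invented for the example:

```python
def row_to_triples(row, subject_col, predicate_map):
    """Materialize RDF-style triples from one relational row on demand.

    Columns without a predicate mapping, and NULL values, produce
    no triples.
    """
    s = row[subject_col]
    return [(s, predicate_map[col], val)
            for col, val in row.items()
            if col != subject_col
            and col in predicate_map
            and val is not None]

# Hypothetical bib row and column-to-predicate mapping
row = {"id": "bib:1", "title": "Ulysses", "year": 1922, "note": None}
pmap = {"title": "dct:title", "year": "dct:date"}

triples = row_to_triples(row, "id", pmap)
```

The row store stays the system of record; the triple view is just a projection, so nothing has to be kept in sync.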

I have sometimes speculated on how well IDMS might work for this kind of data. I have decided I would rather not know :)

Some kinds of ODBMS might also work rather well.


Karen Coyle
[log in to unmask]
m: +1-510-435-8234
skype: kcoylenet/+1-510-984-3600