On Mar 13, 2013, at 5:18 PM, Simon Spero <[log in to unmask]> wrote:

On Wed, Mar 13, 2013 at 11:55 AM, Karen Coyle <[log in to unmask]> wrote:

RDA:parallelTitle -> subclassOf -> RDA:titleProper -> subclassOf -> dcterms:title [1]

I'm still waiting to see a solution that implements this, and implements it simply and efficiently. 
[1] In particular because you can also have:
foaf:name -> subclassOf -> dcterms:title
since the definition of dcterms:title is " A name given to the resource." and anything -- documents, towns, people, chairs -- can be a resource.
I think you mean subproperty, not subclass...

This kind of simple entailment is trivially handled at scale, either at load time or at run time -- this is not new technology.
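To make the load-time option concrete, here is a minimal sketch of materializing rdfs:subPropertyOf entailment over a set of triples before indexing them. The triples and property names (taken from the hierarchy discussed above) are illustrative, not a real RDA/DC dataset, and the closure algorithm is the naive one, fine for small hierarchies:

```python
# Sketch: load-time materialization of rdfs:subPropertyOf entailment
# (RDFS rule rdfs7), using plain Python tuples as triples.
from collections import defaultdict

SUBPROP = "rdfs:subPropertyOf"

def materialize(triples):
    """Return the input triples plus all subPropertyOf entailments."""
    # Build the property hierarchy: property -> {direct superproperties}
    parents = defaultdict(set)
    for s, p, o in triples:
        if p == SUBPROP:
            parents[s].add(o)

    # Transitive closure of the hierarchy for one property (naive DFS).
    def ancestors(prop):
        seen, stack = set(), [prop]
        while stack:
            for parent in parents[stack.pop()]:
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

    inferred = set(triples)
    for s, p, o in triples:
        if p != SUBPROP:
            # Re-assert the statement under every superproperty.
            for superprop in ancestors(p):
                inferred.add((s, superprop, o))
    return inferred

data = {
    ("rda:titleProper", SUBPROP, "dcterms:title"),
    ("rda:parallelTitle", SUBPROP, "rda:titleProper"),
    ("ex:book1", "rda:parallelTitle", "Le Petit Prince"),
}
closed = materialize(data)
# A query against dcterms:title now finds the parallel title,
# with no reasoning needed at query time.
assert ("ex:book1", "dcterms:title", "Le Petit Prince") in closed
```

The trade-off Peter raises below is visible even here: every entailed triple is stored, so the index grows with the depth of the hierarchy.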

Simon, I'm not saying it's novel, but it's also non-trivial, especially if you're talking in terms of SPARQL (rather than RDF graphs in general).  There aren't many (any?) free triple stores that support inferencing/reasoning in any serious sense: Fuseki/TDB supports RDFS reasoning via riot, I think, but only if you ingest your triples with their inferred entailments, which massively increases your index size; 4store doesn't support reasoning at all, although there are third-party projects that claim to provide it; OWLIM-Lite does, but it's quite limited in how far it scales; Mulgara? No clue, but I don't think so; etc.  And Virtuoso, AllegroGraph, Stardog, etc. aren't cheap.

And all this implies that a reasoning-aware triple store is *required* for this stuff, which, to my mind, is problematic.  Take, for example, id.loc.gov, which doesn't (or didn't, at any rate) use anything beyond a regular RDBMS, since the domain it was modeling was known and finite, as is true of most applications.

I think a good analogy here is Z39.50: think of how libraries have implemented it historically.  Basically, they've deferred to their vendor's implementation because it's niche and they have (really) no idea how it works or even what an ideal implementation would look like.  The vendors, sensing the libraries' ambivalence and ignorance toward the protocol (and being just as ambivalent and ignorant themselves), do the absolute minimum necessary to "comply".

I can't imagine a future where we replace "MARC store with Z39.50 interface" with "triple store with SPARQL endpoint" looking all that different, honestly, and while I don't think we should let that hold us back, I do think it needs to be taken into consideration.


Well-known techniques can be used, in conjunction with pre-computation over the class or property hierarchy, to reduce this kind of transitive check to testing whether an integer falls within one or more ranges.
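One such technique (a sketch, not necessarily the one Simon has in mind) is interval labeling: a preorder DFS assigns each node in the hierarchy a (start, end) interval such that descendants nest inside their ancestors, so a subclass/subproperty test becomes a pure integer comparison. This works directly for tree-shaped hierarchies; a DAG needs one interval per parent path, hence "one or more ranges". The property names below are the ones from this thread:

```python
# Sketch: interval labeling of a (tree-shaped) hierarchy so that
# "is A a sub{class,property} of B?" reduces to an integer range check.

def label_intervals(children, root):
    """Assign each node a (start, end) interval via preorder DFS."""
    intervals = {}
    counter = 0
    def visit(node):
        nonlocal counter
        start = counter
        counter += 1
        for child in children.get(node, ()):
            visit(child)
        # A node's interval encloses the intervals of all its descendants.
        intervals[node] = (start, counter)
        counter += 1
    visit(root)
    return intervals

def is_sub(intervals, sub, sup):
    """True iff sub is sup or a descendant of sup: two comparisons, no traversal."""
    s_lo, s_hi = intervals[sub]
    p_lo, p_hi = intervals[sup]
    return p_lo <= s_lo and s_hi <= p_hi

hierarchy = {
    "dcterms:title": ["rda:titleProper", "foaf:name"],
    "rda:titleProper": ["rda:parallelTitle"],
}
iv = label_intervals(hierarchy, "dcterms:title")
assert is_sub(iv, "rda:parallelTitle", "dcterms:title")   # two hops away
assert not is_sub(iv, "foaf:name", "rda:titleProper")     # sibling branch
```

The pre-computation is linear in the size of the hierarchy, and the per-query cost is constant, which is why this kind of transitive entailment need not be expensive at run time.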