That's around the right order of magnitude (around a gigatriple).

The actual number of triples required may vary quite a bit, depending on what sort of inference regime is used and the degree to which entities can be reused (especially if properties for lower-level entities can be derived from those of higher-level entities unless otherwise specified).

Simon

On Apr 10, 2015 11:14 AM, "Stern, Randy" <[log in to unmask]> wrote:
Harvard estimates around 1 billion triples for our MARC records alone; it could be 1.6 billion if we include visual information records and finding aid components.
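
For scale, here is a minimal back-of-envelope sketch in Python of how an estimate like this can be built up. Every number below is an illustrative assumption, not an actual Harvard figure, and the reuse model is only a crude stand-in for the effects Simon describes.

# Illustrative back-of-envelope estimate; all numbers are assumptions.
records = 14_000_000       # assumed MARC record count (hypothetical)
triples_per_record = 70    # assumed average triples per converted record
naive_total = records * triples_per_record
print(f"naive conversion: {naive_total / 1e9:.2f} billion triples")  # ~0.98

# Entity reuse: if shared entities (authors, subjects, publishers) are
# minted once rather than once per record, their triples drop out of the
# total. An inference regime that materializes entailed triples would
# push the count up instead.
shared_fraction = 0.3  # assumed share of triples describing reusable entities
survival = 0.1         # assumed: each reused entity serves ~10 records
with_reuse = naive_total * (1 - shared_fraction * (1 - survival))
print(f"with entity reuse: {with_reuse / 1e9:.2f} billion triples")  # ~0.72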



> On Apr 10, 2015, at 4:32 AM, Bernhard Eversberg <[log in to unmask]> wrote:
>
> On 10.04.2015 10:09, Ross Singer wrote:
>>
>> .. it's counter-productive to prescribe a specific technology
>> if the goal is to increase adoption.
>
> Anyone advocating a specific technology could do nothing better than
> to come up with a working model with compelling performance and
> interesting functions.
>
> B.Eversberg