I agree. Mapping MARC directly into the desired vocabularies (e.g. using XSL) is lossy but infinitely easier.

Jeff



> On Jan 31, 2015, at 12:20 PM, Karen Coyle <[log in to unmask]> wrote:
> 
> If you wish to explore the idea of a MARC-RDF serialization more fully, one was added to the Open Metadata Registry [1] but it has not been used, AFAIK. There is one ontology for each of the 9 sections of MARC tagging, plus the 600 segment is represented by 5 different ontologies. The total number of properties? I'm guessing well into the tens of thousands. MARC itself has about 1,400 properties, but this RDF serialization combines those with the indicator values, so the total number balloons up quite quickly. (You may begin to understand why a serialization isn't such a good idea.)
> 
> kc
> [1] http://metadataregistry.org/schema/show/id/38.html
> to  http://metadataregistry.org/schema/show/id/51.html
> 
>> On 1/31/15 8:54 AM, Martynas Jusevičius wrote:
>> As mentioned, I've no experience with MARC. My suggestion was
>> motivated by pragmatic implementation concerns.
>> 
>> I guess we all agree that a mapping between MARC and Linked Data is
>> inevitable, using BIBFRAME or some other vocabularies.
>> 
>> However, I maintain my position that a lower-level MARC-RDF
>> serialization would be useful, as it would allow the mapping to be
>> implemented using standard, declarative RDF technologies.
>> 
>> The logic behind the cases Karen mentions could be encoded in the
>> SPARQL mapping query. That would be much, much more reusable and
>> platform-independent than a direct mapping implemented in an
>> imperative language, such as Java or Python (which would likely
>> require a software library for each language).
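A minimal sketch of what such a declarative mapping might look like, assuming a hypothetical lossless MARC-RDF vocabulary (the `marc:` prefix and its property names below are invented for illustration, not from any published schema) with Dublin Core terms as the target. The query coalesces a publication date that MARC can carry in several different fields into a single `dct:issued` value:

```sparql
PREFIX marc: <http://example.org/marc-rdf#>
PREFIX dct:  <http://purl.org/dc/terms/>

CONSTRUCT { ?rec dct:issued ?date }
WHERE {
  # Prefer the coded date in 008, then 046, then the display form in 260 $c.
  # (Property names are illustrative placeholders for the serialized fields.)
  OPTIONAL { ?rec marc:field008date ?d008 }
  OPTIONAL { ?rec marc:field046date ?d046 }
  OPTIONAL { ?rec marc:field260c    ?d260 }
  BIND (COALESCE(?d008, ?d046, ?d260) AS ?date)
  FILTER (BOUND(?date))
}
```

The point of the sketch is only that the precedence logic lives in the query text, not in Java or Python code, so it can be shared and rerun on any SPARQL engine.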
>> 
>>> On Sat, Jan 31, 2015 at 5:40 PM, Karen Coyle <[log in to unmask]> wrote:
>>>> On 1/31/15 8:16 AM, [log in to unmask] wrote:
>>>> So, from my personal experience, I do not recommend proposing a
>>>> MARC-centered "serialization only" Bibframe dialect. It will not improve
>>>> Bibframe or ease the migration; it will just add a truncated RDF without
>>>> links, without URIs, and with yet another migration path.
>>> 
>>> +1. The "semantics" of MARC fields are not well-organized. Had MARC been
>>> treated to something like a relational-database analysis some decades ago,
>>> we wouldn't have a situation where things like dates of publication can be
>>> found in 3 or 4 different places in the record:
>>>   008 date of publication
>>>   046 special coded dates (because there wasn't room in the 008 for
>>> expansion)
>>>   240 (sometimes) date of the expression
>>>   260 display form of date of publication
>>> 
>>> Oftentimes it is the same date that appears in each of these places in the
>>> record.
>>> 
>>> Nor would we have multiple ways to indicate the source of the data in the
>>> field:
>>>   indicator value (e.g. 0 = LCSH)
>>>   indicator value 7 + code in subfield $2
>>> 
>>> (The indicators alone are a can of worms. See:
>>> http://futurelib.pbworks.com/w/page/44421482/indicators)
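Expressed against a hypothetical MARC-RDF serialization, the two source-indication patterns above might look like this in a SPARQL mapping (all `marc:` property names are invented for illustration; the target URIs are only examples):

```sparql
PREFIX marc: <http://example.org/marc-rdf#>
PREFIX dct:  <http://purl.org/dc/terms/>

CONSTRUCT { ?field dct:source ?src }
WHERE {
  {
    # Pattern 1: the indicator value alone names the source
    # (e.g. second indicator 0 = LCSH).
    ?field marc:ind2 "0" .
    BIND (<http://id.loc.gov/authorities/subjects> AS ?src)
  } UNION {
    # Pattern 2: indicator 7 defers to a code carried in subfield $2.
    ?field marc:ind2 "7" ;
           marc:subfield2 ?code .
    BIND (IRI(CONCAT("http://id.loc.gov/vocabulary/subjectSchemes/", ?code))
          AS ?src)
  }
}
```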
>>> 
>>> To my mind, the only way forward with our data is to deconstruct MARC into
>>> semantic units, and move forward with those semantics, separate from the
>>> MARC structure.
>>> 
>>> And in case you are not aware of this, at this very moment new additions to
>>> MARC are being discussed at the ALA midwinter meeting in Chicago. That
>>> boggles my mind.
>>> 
>>> kc
>>> 
>>> 
>>> --
>>> Karen Coyle
>>> [log in to unmask] http://kcoyle.net
>>> m: +1-510-435-8234
>>> skype: kcoylenet/+1-510-984-3600
> 
> -- 
> Karen Coyle
> [log in to unmask] http://kcoyle.net
> m: +1-510-435-8234
> skype: kcoylenet/+1-510-984-3600