[For the record, copied from RDA-LIST, Nov. 4]

After the new master plan was publicized, I had exchanges with
various people about it; Mac referred to parts of this. Enthusiasm
seems to be building up only very slowly, if at all...

A plan of this caliber ought to make a real splash in the community.
This is not just any old paper but a highly important one, with
potentially far-reaching ramifications and a high impact on the quality
of the stuff we are working with, and thus on the quality of our work,
from now on into an indefinite future. We all expect this quality to
improve, of course. Is this expectation justified by the Framework?

For one thing: the plan puts all its eggs into one basket by committing
itself to Web standards like XML and RDF when, far and wide, there is
no large-scale bibliographic database that serves real-life library
work while being based on them. Correct me if I'm wrong.
Linked Data and RDF are offspring of the Semantic Web movement. In
that arena, it is taken for granted that everything comes for free.
Content standards that are not openly available will meet zero
acceptance, whether they use RDF or not. Of course, as was
discussed yesterday, the maintenance of an open standard takes a
long-term commitment. And for the data itself, what is OCLC's view on
the matter of liberal access via triplestores?
Now XML and RDF are not brand-new, and there certainly have been
lots of attempts to employ them in a grand way, even some at very
prestigious places. Where are the success stories and the smoothly
running new-age engines based on the results? I'm asking this not
for the first time, but up to now I have gotten no answers in the forums.

Certainly, library systems need to be able to export and import XML
and RDF structures, side by side with many others. With the appropriate
tools and interfaces, library catalogs need never show anybody, except
those working on their upkeep, what their data looks like internally
or how they communicate among each other.
Even today, not every library system uses MARC internally. But all
of them are able to swallow it and spew it out. (No mean feat,
I think, even today. Even something like VuFind takes in MARC and
nothing else.)
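The swallow-and-spew point can be sketched in a few lines: a hypothetical internal record (a plain dict, standing in for whatever a system really uses) is rendered to and parsed from MarcEdit-style mnemonic lines only at the boundary. The tags and subfields are illustrative; this is not any real system's code.

```python
# Sketch: the internal format is the system's own business; MARC is
# only an import/export surface. Mnemonic lines look like
#   =245  10$aSome title /$cby Somebody.

def to_marc_lines(record):
    """Render a hypothetical internal record as mnemonic MARC lines."""
    lines = []
    for tag, indicators, subfields in record["fields"]:
        body = "".join(f"${code}{value}" for code, value in subfields)
        lines.append(f"={tag}  {indicators}{body}")
    return "\n".join(lines)

def from_marc_lines(text):
    """Parse mnemonic MARC lines back into the internal structure."""
    fields = []
    for line in text.splitlines():
        tag = line[1:4]                      # skip the leading "="
        rest = line[6:]                      # skip "  " after the tag
        indicators, _, body = rest.partition("$")
        subfields = [(part[0], part[1:]) for part in body.split("$") if part]
        fields.append((tag, indicators, subfields))
    return {"fields": fields}

record = {"fields": [("245", "10", [("a", "A sample title /"),
                                    ("c", "by Somebody.")])]}
marc = to_marc_lines(record)
# Round-trips through the exchange surface without exposing internals.
assert from_marc_lines(marc) == record
```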
RDF triples in huge repositories called triplestores are static copies;
they need to be frequently refreshed. Is that realistic? Will it
really be useful and attractive for end-users if every library rigs
up its own triplestore - or should OCLC do that for all of
them? Even now, OCLC could already be doing a *much* better job of
letting end-users access structured data in many useful ways,
XML-structured and otherwise, out of the live database, not
a stale copy.
So: RDF is welcome as an addition, a special export product, but it is
not suitable for internal purposes and much too clumsy for
bulk communication. (JSON seems to be gaining ground there now.)
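To illustrate the export-product view: triples generated on demand from the live record, so there is never a separate copy to go stale. The namespace and predicate URIs below are invented placeholders, not any published vocabulary.

```python
# Sketch: RDF (N-Triples) derived on the fly from a live record,
# as one export option among many - never the storage format itself.

def record_to_ntriples(record_id, fields):
    base = "http://example.org/record/"   # hypothetical namespace
    pred = "http://example.org/vocab/"    # hypothetical vocabulary
    subject = f"<{base}{record_id}>"
    triples = []
    for name, value in fields.items():
        literal = value.replace('"', '\\"')   # minimal literal escaping
        triples.append(f'{subject} <{pred}{name}> "{literal}" .')
    return "\n".join(triples)

print(record_to_ntriples("123", {"title": "A sample title",
                                 "creator": "Somebody"}))
```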

Secondly, there is no need for there to be one and *only* one exchange
standard. If some community needs its own peculiar format XYZ,
there can be tools that take in MARC and serve up XYZ. On a per-record
or result-set basis, web services can do that nicely, with no one
caring what the original looked like. If we create more, and more
flexible, standards for web services, these might solve or support most
of the requirements our catalogs of the future are expected to fulfil
for end-users and exchange, even with MARC inside. Web services are
flexible, easily extended and modified, with no need to tinker with
internal or communication structures.
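The per-record conversion idea might look like this: a registry of converters behind a single serve() function, standing in for a web-service endpoint with a format parameter. Everything here - the function names, the formats, the field names - is illustrative.

```python
import json

# Sketch: one stored record, many service formats. Callers pick a
# format; nobody cares what the original looked like.

CONVERTERS = {}

def converter(fmt):
    """Register a function as the converter for the given format name."""
    def register(fn):
        CONVERTERS[fmt] = fn
        return fn
    return register

@converter("json")
def as_json(record):
    return json.dumps(record, sort_keys=True)

@converter("labels")
def as_labels(record):
    return "\n".join(f"{k.title()}: {v}" for k, v in sorted(record.items()))

def serve(record, fmt):
    """What a hypothetical endpoint would do for ?format=<fmt>."""
    return CONVERTERS[fmt](record)

record = {"title": "A sample title", "year": "2011"}
print(serve(record, "labels"))
```

Adding format XYZ is then a matter of registering one more converter, with no change to internal or communication structures.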

And the plan itself says that MARC21 should be retained as an
exchange format for as long as necessary. So why not first create
an alternative format, test it up and down for any number of years,
improve it or add yet another, better design, and so on, creating and
enhancing web-services standards all the while as the *primary means*
of access to library data for any outside agents. This can begin right
now and it has begun in many places, so one should look at ways to
coordinate and standardize some of this work. Eventually, let the
market decide, let the better concept win or let it take over step
by step as it gains acceptance. MARC may or may not fade away in
the process, sine ira et studio.
Anyway, two years to achieve "credible progress" in this field?
How is that defined, BTW, and how will it be determined?
And what does it mean to "demonstrate credible progress"? Which of the
many aspects of format features and uses will that include?
(On the involvement of NISO, there's another thread in this forum.)

And thirdly, data input and editing may use any modern techniques
available today, hiding all the ghastly stuff involved with MARC under
layers and subwindows of pulldowns, radio buttons, and plain-language
labeled input fields. No playground for RDF and XML here.
Ask the vendors why they don't provide that.
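Hiding MARC under labeled fields is, at bottom, just a mapping table. A toy sketch, with invented labels, and ignoring the indicators, fixed fields, repeatability and validation a real editing profile would need:

```python
# Sketch: the cataloger sees plain-language labels; a mapping table
# supplies the MARC tag and subfield code behind each one.

FORM_TO_MARC = {
    "Title":     ("245", "a"),
    "Author":    ("100", "a"),
    "Publisher": ("260", "b"),
}

def form_to_fields(form):
    """Turn labeled form input into (tag, subfield, value) triples."""
    return [(tag, code, form[label])
            for label, (tag, code) in FORM_TO_MARC.items()
            if label in form]

fields = form_to_fields({"Title": "A sample title", "Author": "Somebody"})
```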
But don't forget to evaluate the economy of a new catalogers'
interface - and what it means to have different ones on systems A, B and
C - in comparison with the universal interface everybody is used to
now. If you want to move away from plain tagged editing, it becomes
lots more difficult to create a standard. One reason is that interface
techniques keep evolving ...

Oh, I forgot: RDA's trouble with MARC was what led to this plan in the
first place. Well, that is not MARC's fault but that of the
particular setup that was used for the test. It did not use
capabilities and provisions that are in fact there in MARC, like the
use of identifiers for authority headings, and record linking for
multi-part resources; the part-whole relationship wasn't considered
at all.
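The provisions in question are, for instance, subfield $0 (authority record control number) on headings and field 773 (host item entry) with $w for the part-whole link. A sketch with invented control numbers:

```python
# Sketch: MARC provisions the test left unused. $0 carries an
# authority identifier on a heading; a 773 field with $w links a
# part to its host record. All data below is made up.

fields = [
    ("100", [("a", "Somebody, Jane."), ("0", "(SOMEAGENCY)12345")]),
    ("773", [("t", "Collected works"), ("w", "(SOMELIB)67890")]),
]

def authority_ids(fields):
    """Collect $0 authority identifiers from heading fields."""
    return [v for tag, subs in fields for c, v in subs if c == "0"]

def host_links(fields):
    """Collect $w host-record links from 773 (host item entry) fields."""
    return [v for tag, subs in fields if tag == "773"
            for c, v in subs if c == "w"]
```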
The test, in short, was a much too timid and superficial exercise
to base any overall judgement about "RDA in MARC" on. Or had the test
been designed from the start so that one could then say, "See how
inadequate MARC is!"?

MARC does have its flaws; I'm really no fan of it as it is now, let
alone the current practice, and I have written up and published a long
list of flaws. For some of them, I don't know why they haven't long
since been fixed. They may, however, be cured without sacrificing the
economy of MARC, without dismissing the entire concept and logic
before something demonstrably more economical and logical has been
found and proven.

Briefly: We can set up our entire enterprise so that, internally, we
have the full benefit of an economical format that fits all our
numerous and highly diverse management purposes which are of no
interest to end-users. Externally, no one needs to be confronted
with our internal format, but there can be an increasing variety
of options to choose from, all derived from the same internal format.

(ISO2709, BTW, is *not* among the flaws and issues. It is a very
marginal issue of a purely internal nature and is in no way related
to MARC as a content standard. MARC can perfectly well work without
ISO, no one needs to bother with it except the few systems that are
still unable to ingest anything else, and they can use MarcEdit to
get what they want. Abandoning ISO in favor of the external format
MarcEdit uses, you get rid of the 9999 character field length limit as