We're also using Dynaweb and have run into the same problem
and have been using the method Bob outlines as well. Although
Dynaweb usually does a great job at breaking up arbitrarily
large documents into manageable "chunks," it's quite stymied
by large, unbroken sections of container lists. It really needs
something like a series or subseries to grab hold of and serve
out. Most of our finding aids work fine but a few require us to
break the container list down into artificial sections.
Changing the intellectual structure of a <dsc> simply to
accommodate the limitations of our current presentation
software is not something we enjoy doing. We insert numbered <c>s at
reasonable points in the <dsc> and shift the <c> numbers of
its children down one:
    ... x 2000
    <c03 level="otherlevel" otherlevel="toc">
      <unittitle>Part 1</unittitle>   (or "A-B" or "1921-1926")
We at least try to avoid using terminology like series
or subseries and use specific local terminology, e.g.,
"toc", in the hopes that one day this nonsense can be
stripped out. Although very few of our manuscript finding
aids happen to require this abuse, it's a practice we
apply regularly to our pictorial collections with thumbnail
images. Our largest finding aids, ca. 10-15 MB, usually
require extensive structural manipulation.
I'm curious to know whether anyone has applied XSLT or DOM
to this problem.
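For what it's worth, here is a minimal DOM-style sketch in Python
(xml.etree.ElementTree) of the kind of restructuring we do by hand. The
sample data, the chunk size, and the use of <c01>/<c02> rather than our
actual deeper levels are all invented for the illustration; only the
"toc" otherlevel convention comes from our practice:

```python
import xml.etree.ElementTree as ET

# Toy stand-in for a huge, flat <dsc>; a real one might hold ~2000 components.
dsc = ET.fromstring(
    "<dsc>"
    "<c01><unittitle>Item A</unittitle></c01>"
    "<c01><unittitle>Item B</unittitle></c01>"
    "<c01><unittitle>Item C</unittitle></c01>"
    "</dsc>"
)

def chunk_dsc(dsc, size):
    """Wrap runs of <c01> children in artificial <c01 otherlevel="toc">
    parts, demoting the originals one component level to <c02>."""
    children = list(dsc)
    for child in children:
        dsc.remove(child)
    for i in range(0, len(children), size):
        part = ET.SubElement(
            dsc, "c01", {"level": "otherlevel", "otherlevel": "toc"})
        ET.SubElement(part, "unittitle").text = f"Part {i // size + 1}"
        for child in children[i:i + size]:
            child.tag = "c02"  # shift the <c> number down one
            part.append(child)

chunk_dsc(dsc, size=2)
# With 3 items and size=2 this yields two parts: "Part 1" (2 items),
# "Part 2" (1 item).
```

The same grouping is expressible in XSLT with position() arithmetic, but
the in-place DOM pass is easier to bolt onto an existing pre-publication
script.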
Digital Publishing Group
UC Berkeley Library
[log in to unmask]
At 08:37 AM 11/06/2001 -0500, [log in to unmask] wrote:
>I've worked on (and am currently working on) finding aids that are two to
>three hundred pages or more. As far as I can tell, the main problem with
>long finding aids is not really a structural one, but rather that it takes
>a long time for the pages to load in one's viewer/browser, especially if
>one is using a 56K modem.
>I'm not sure this is entirely the proper way, but what I've done for some
>of our huge finding aids is to "reinterpret" some of the structural
>elements. Specifically, since in large finding aids the <c02 level
>="subseries"> element leads to section breaks, I used that feature to
>"break" the finding aid into smaller sections alphabetically.
>You can see what I've done with the scores that belonged to conductor
>The paper finding aid for the papers of conductor Bruno Walter was also
>rather long (over 200 pages) but I didn't resort to that "breaking"
>technique. Thus, the bulk of the collection (in Series 1) results in a
>long loading time. Try it and see what you think:
>Lastly, for an example of a finding aid (with only a single series) that
>was rather long WITHOUT breaks, look at our collection of broadsides:
>Though I say "what I've done," I would be remiss in not acknowledging the
>help of my coworkers at my institution.
>Bob Kosovsky, Librarian
>Music Division -- The New York Public Library
>[log in to unmask] [log in to unmask]
>Listowner: [log in to unmask] ; [log in to unmask]
>My opinions do not necessarily represent those of my institutions.
>From: "Miss L.V. Mitchell" <larysam@LIVERPOOL.AC.UK>
>To: [log in to unmask]
>Subject: large finding aids
>I have recently begun a project to create a large on-line finding aid
>for the University of Liverpool's own archives. The finding aid will be
>launched on the University web-site when complete. Around 40 A4-sized
>hard-copy finding aids have been retro-converted into EAD, and the task
>at hand is to tidy up the encoding and create a usable finding aid. There is
>far too much data to create a one-document finding aid, so I intend to
>create a series of smaller records which will be linked together on the
>website.
>I was wondering whether anyone else has undertaken a similar project or
>has any experience of creating large electronic finding aids and could
>offer advice or opinions.
>Special Collections and Archives
>Sydney Jones Library
>University of Liverpool
>PO Box 123
>tel. 0151 794 2696
>[log in to unmask]