A few things (not directed toward anyone in particular, but in general
I'm having a hard time believing this is an issue):
1. A single point of failure is never a good idea, right? In linked-data
land this is a bigger question. In XSD land the solution seems pretty
simple to me: cache it!
2. When we're all validating each of our 500 gazillion METS records, are we
really hitting loc.gov each and every time, for every record? Hopefully not, so
chances are there is a cached copy somewhere. Maybe this is an
opportunity to learn where the various tools we all use keep that copy?
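As a rough sketch of the "cache it" idea: fetch the schema once, keep it on disk, and validate against the local copy from then on. The cache path, schema URL, and function names below are my own assumptions, not anything a particular tool actually does.

```shell
#!/bin/sh
# Sketch: keep a local copy of mets.xsd and only fetch it when missing.
# The cache directory and URL here are illustrative assumptions.
CACHE="${XDG_CACHE_HOME:-$HOME/.cache}/xsd"
SCHEMA_URL="https://www.loc.gov/standards/mets/mets.xsd"
SCHEMA="$CACHE/mets.xsd"

fetch_schema() {
    mkdir -p "$CACHE"
    # Hit loc.gov only when there is no cached copy yet.
    if [ ! -f "$SCHEMA" ]; then
        curl -fsSL -o "$SCHEMA" "$SCHEMA_URL"
    fi
}

validate() {
    fetch_schema
    xmllint --schema "$SCHEMA" --noout "$1"
}

# Usage: validate 2397415.mets
```

Once the copy exists, loc.gov being down doesn't matter; the only cost is that you won't pick up schema changes until you clear the cache.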
3. It's possible to validate an XML document against a different schema
than the one it references. On the command line, with xmllint:
$ xmllint --schema mets.xsd 2397415.mets --noout
Catalogs might be a solution, but you could also just change any import
statements to point to local copies of those schemas. Kevin Clark listed
a few other libraries, any of which could do this.
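For the catalog route, an OASIS XML catalog lets libxml2-based tools (xmllint included) resolve the remote schema URL to a local file without touching the import statements at all. The local path below is illustrative; you'd add similar <uri> entries for any schemas that mets.xsd itself imports.

```xml
<?xml version="1.0"?>
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
  <!-- Map the remote schema location to a local copy;
       file:///opt/schemas/ is a placeholder path. -->
  <uri name="http://www.loc.gov/standards/mets/mets.xsd"
       uri="file:///opt/schemas/mets.xsd"/>
</catalog>
```

Then point xmllint at it via the environment, e.g.
XML_CATALOG_FILES=catalog.xml xmllint --schema mets.xsd 2397415.mets --noout
and resolution happens locally even while loc.gov is unreachable.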
4. I don't think it's part of [anyone conspiring to shut down the federal
government]'s agenda to delete </div> tags across the world's IRs.
Chances are your structMaps are going to be OK while loc.gov is unavailable.