
1738822.jpg

nature-2-rough.jpg

ABOVE: Nature-2 (rough). BELOW: Untitled-1. Both by Bruce MacPherson, work-in-progress sketches for the MathFactory, for the Gallagher & Associates design proposal.

untitled-1.jpg

___________________

Below is the introduction from Time & Bits: Managing Digital Continuity, edited by Margaret MacLean and Ben H. Davis and published an eternity ago, in 1998, by the Getty Research Institute. The Getty Research Institute is dedicated to furthering knowledge and understanding of the visual arts, in part by advancing long-term digital preservation and information-exchange techniques that protect our common cultural inheritance. The book documents an early workshop pondering the new problems that obsolete media and machines pose for the whole cycle of capturing, preserving, distributing, and representing stored data, and of unlocking a real understanding of its meaning. See the Long Now Foundation projects for follow-on work such as the Rosetta Project.

gettyfigure1.jpg

Workshop Figure 1

This was a very unhappy interface. And small wonder. No doubt this entire virtual environment was being encrypted, decrypted, reencrypted, anonymously routed through satellites and cables, emulated on alien machinery through ill-fitting, out-of-date protocols, then displayed through long-dead graphic standards. Dismembered, piped, compressed, packeted, unpacketed, decompressed, unpiped and re-membered. Worse yet, the place was old. Virtual buildings didn't age like physical ones, but they aged in subtle pathways of arcane decline, in much the way that their owners did.

Bruce Sterling, in Holy Fire. Sterling is a science fiction writer and founder of the Dead Media Project.

gettyfigure2.jpg

Workshop Figure 2

Below is an excerpt from the article Storage Knowledge by Doug Carlston, page 28 of Time & Bits: Managing Digital Continuity:

… process information is everywhere and, with increasing frequency, it will not be possible to perceive the full expression of the content-creator's intent if the ability to perceive the process information is lost.

Imagine, if you will, that we are talking about process content that represents the instructions for building a virtual space and populating it with still and animated images tied to sounds.  Even if one could disambiguate the various data forms and figure out what was image, what was sound, and what was descriptive code, the author’s expression is virtually impossible to deduce absent its interpretation via his original processing device.  If in the future it becomes common to create digital wire models of complex inventions and other devices in lieu of written words, we will have an entire body of obviously important process data held hostage to its original interpretation device.

Perhaps in these areas we just have to give it time. We do seem to have some movement towards standards: numerical bits have been translated in a reasonably consistent way into numerals and letters of the Roman alphabet (and others), a necessary first step toward a process Rosetta Stone. And there appears to be a compelling universal interest in standardizing the operating systems and chief applications of commonly available computers, although these standards themselves continue to evolve at a hazardous rate. Perhaps this process will not continue indefinitely, in which case we are confronting merely an interim problem while the universal standards are finally worked out.
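An aside from an eternity later: Carlston's encoding point is easy to make concrete. The sketch below, in Python (my illustration, nothing from the book), shows that the same stored bits carry no meaning on their own; the interpretation scheme, the one layer where standards have mostly held, decides everything.

    # One sequence of stored bytes, several "meanings" -- each depends
    # entirely on which interpretation standard the reader assumes.
    data = bytes([0xC3, 0xA9, 0x2D, 0x31, 0x39, 0x39, 0x38])

    print(data.decode("utf-8"))    # 'é-1998'   (the modern standard)
    print(data.decode("latin-1"))  # 'Ã©-1998'  (an older standard: mojibake)
    print(data.decode("cp437"))    # '├⌐-1998'  (a DOS-era code page)

    # Read as a raw integer instead of text, the same bits say
    # something else again; the data cannot disambiguate itself.
    print(int.from_bytes(data, "big"))

Scale that same dependence up from single characters to whole virtual spaces and you have exactly the hostage situation Carlston describes.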

___________________

All of this was written before the explosion of the semantic web, online services, and the large-scale development of open standards. Nevertheless, many of the early concerns raised at the Time & Bits workshop are still valid. The documentation of places and buildings, together with the public information they generate, has only just begun. When will process information be mature and standardized enough to tell the story of all these people and places over long periods of time? There are many arguments on OntologForum regarding the utility, accuracy, and even the possibility of universal standards for such large-scale processing. Like buildings in the real world, some digital architectures are better than others, some data deserve to be taken better care of, and yet

“there is no constituency representing that body of information”

Margaret MacLean, Setting the Stage, page 33 of Time & Bits: Managing Digital Continuity.
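For a sense of what mature, standardized documentation of a place could look like, here is a minimal sketch as linked data, using Python's rdflib library and the schema.org vocabulary (my choice of tools; neither appears in the book):

    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF

    SCHEMA = Namespace("https://schema.org/")

    g = Graph()
    g.bind("schema", SCHEMA)

    # A hypothetical identifier for the documented place.
    getty = URIRef("https://example.org/place/getty-center")

    g.add((getty, RDF.type, SCHEMA.LandmarksOrHistoricalBuildings))
    g.add((getty, SCHEMA.name, Literal("Getty Center")))
    g.add((getty, SCHEMA.description,
           Literal("Hilltop museum campus in Los Angeles with a central garden.")))

    # Turtle is an open W3C exchange format: any conforming tool,
    # now or later, can re-read these statements without the
    # software that originally wrote them.
    print(g.serialize(format="turtle"))

The particular vocabulary matters less than the property the workshop found missing: statements that outlive the device that produced them, with a standards body acting as a constituency for the data.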

The images below are from the Central Garden at the Getty Center in Los Angeles. You can go anywhere, touch anything, get led in directions you want to go anyway, and have tremendous vistas open up around unexpected angles. There are curves and corners. Only the best materials are used, and they are taken care of. The combination is gorgeous. This level of spatial design, execution, and maintenance is needed for an equivalent level of high-quality, long-term, takes-forever-to-build semantic web spaces made expressly for the general public.

getty_center_central_garden.jpg

Getty Center Central Garden, via Wikimedia Commons

gettygarden.jpg

http://www.panoramio.com/photo/1738822

___________________

Companion Post: Trace Continuous Threads

red2.jpg

