
Day of Digital Archives: McLuhan “The [digital] medium is [no longer] the [only] message.” 2011/10/06

Posted by nydawg in Digital Archives, Digital Archiving, Digital Preservation, Education, Information Technology (IT), Media.

This year marks the 100th anniversary of the birth of “the new spokesman of the electronic age,” Marshall (Understanding Media) McLuhan, and digital archivists should take a moment to think about how media, digital and analog, hot and cool, and in many different formats, change our jobs, lives and responsibilities. With threats of technological obsolescence, vendor lock-in, hardware failure, bit rot and link rot, non-backwards-compatible software, and format and media obsolescence, digital archivists need a system to accurately describe digital objects and assets in their form and function, content, subject, object and context. If we miss key details, we risk restricting access in the future because, for example, data may not be migrated or media refreshed as needed. By studying and understanding media, digital archivists can propose a realistic and trustworthy digital strategy and implement best practices to work more efficiently from capture (digitization or ingest) and appraisal (selection and description) to preservation (storage) and access (distribution).

Over the last ten, forty, one hundred, and twenty thousand years, we have crossed many thresholds and lived through many profound media changes: from oral culture to hieroglyphic communications to the alphabet and the written word, from scrolls to books, and most recently transitioning from the Atomic Age (age of atoms) to the Information Age (era of bits). While not all changes were paradigm shifts, many helped shift currencies of trust and convenience to establish new brand loyalties built on threats of imminent obsolescence and vendor lock-in. As digital archivists, we stand at the line separating data from digital assets, so we need to ensure that we are archiving and preserving the assets and describing the content, technical and contextual metadata as needed.

Today, Day of Digital Archives, is a good day to consider Marshall McLuhan’s most famous aphorism, “The medium is the message,” and update it for the Information Age. In a nutshell, McLuhan argues that “the medium is the message” because an electric light bulb (medium) is pure information (light). He goes on to state: “This fact, characteristic of all media, means that the ‘content’ of any medium is always another medium. The content of writing is speech, just as the written word is the content of print, and print is the content of the telegraph.” (Understanding Media, 23-24) But in the Information Age, the [digital] medium is [no longer] the [only] message. Every born-digital or digitized file exists within the environment in which it was created or is accessed, and needs to be described on multiple planes to articulate its technical specifications (hardware & software versions, operating system, storage media, file format, encryption) as well as its content. For archivists and librarians describing content, the medium and the message, many use MARC, Dublin Core and VRA Core as guides, but PBCore provides a richly defined set of technical, content and Intellectual Property metadata fields to ensure that all stakeholders, including IT staff, will be able to efficiently access, copy or use the asset (or a copy).
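To make the idea of multi-plane description concrete, here is a minimal sketch of a record split between content and technical description, loosely inspired by PBCore’s content/instantiation distinction. The field names and values are hypothetical illustrations, not the actual PBCore schema:

```python
# A hypothetical, simplified record for one digital video asset.
# "content" describes what it is; "instantiation" describes how it is
# stored and what is needed to play it back. All values are invented.
asset = {
    "content": {
        "title": "Oral History Interview, 2011",
        "subject": ["digital preservation", "oral history"],
        "description": "Interview discussing archival practice.",
    },
    "instantiation": {
        "file_format": "video/mp4",
        "codec": "H.264",
        "operating_system": "Mac OS X 10.6",
        "storage_medium": "LTO-5 tape",
        "software": "Final Cut Pro 7",
    },
    "rights": {"holder": "Example Archive", "status": "in copyright"},
}

# A generic catalog entry ("moving image") collapses everything in
# "instantiation" -- exactly the detail future access depends on.
print(asset["instantiation"]["file_format"])
```

The point of the split is that migration and refreshment decisions read only the instantiation plane, while researchers search the content plane.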

With More Product, Less Process [MPLP] the prevailing processing strategy, many libraries, archives and museums encourage simplified descriptions to catalog digital objects, but these generic descriptions (e.g. moving image, video or digital video) do not provide the most critical information to ensure that future users can watch the video online, on an iPad or with a DVD player (or VHS player or film projector). Until digital objects and assets are described in their granular, multi-dimensional digital splendor, we are hurting ourselves and future archival access. Once we understand that the medium and message split into many different categories, we can focus descriptive metadata on critical access points (subject, format or function), and we will not need to panic and make work every time a new [moving image] format [or codec] gains temporary popularity. With better description and critical appraisal at ingest, digital archivists will understand that the medium, the message and the content, subject, structure, form, format and other aspects are all integral parts of the whole. At that point we will start to change the commonly held mindset, and recognize that “The [digital] medium is [no longer] the [only] message.”

#OccupyWallStreet Call for Information Managers, Librarians & Archivists 2011/10/05

Posted by nydawg in Archives, Education, Media.

I don’t know if anybody else has had an opportunity to walk around the Liberty Plaza #OccupyWallSt #ows festival, but it is truly incredible . . . Smart people are living there, there’s plenty of free food donated throughout the day from around the world, and most importantly, a lending library! It’s true, I saw it with my own eyes. They also have drum circles, Helen Caldicott (last night) and today, at 4:30, they’re co-hosting a big union rally!

“As the OccupyWallStreet protest movement has held firm and spread since its inception September 17, the northeast corner of Zuccotti Park (renamed Liberty Plaza by the protesters) in lower Manhattan has become the home for the budding revolution’s People’s Library. The library already has a website, which proclaims that “information is liberation,” and this morning, October 5, a “call for librarians” went out.

“We need help building our catalog and writing our history. Our readers are enthusiastic and some of them need help finding the right book,” the post reads. “The right book for the right reader is fundamental to successful librarianship, so we need public services folks to come out and conduct reference interviews with people and help them find ‘their’ book.”

http://www.libraryjournal.com/lj/home/892288-264/as_a_revolution_takes_root.html.csp
Check out the library, and maybe you’ll find those American Archivist back issues I dumped!
http://peopleslibrary.wordpress.com/

CLIR: Future Generations Will Know More About the Civil War than the Gulf War 2011/09/22

Posted by nydawg in Archives, Best Practices, Digital Archives, Education, Electronic Records, Information Technology (IT), Records Management.

When I was in the Queens College Graduate Library School six years ago, I took Professor Santon’s excellent course in Records Management, which led me to understand that every institution has to manage its records, its assets and its Intellectual Property. The vital role the archive and records center play in everyday use and long-term functions was made clear by the fact that records have a life cycle: basically creation, use, and destruction or disposition. The course was excellent, despite the fact that the main textbooks we used were from the early 1990s (and included a 3 1/2″ floppy that ran on Windows 3.1).

While doing an assignment, I found a more recent article which led me to a revelation: electronic records will cause a lot of problems! The part that stuck out most, and that I still remember to this day, was from a 2002 article, “Record-breaking Dilemma,” in Government Technology: “The Council on Library and Information Resources, a nonprofit group that supports ways to keep information accessible, predicts that future generations will know more about the Civil War than the Gulf War. Why? Because the software that enables us to read the electronic records concerning the events of 1991 have already become obsolete. Just ask the folks who bought document-imaging systems from Wang the year that Saddam Hussein invaded Kuwait. Not only is Wang no longer in business, but locating a copy of the proprietary software, as well as any hardware, used to run the first generation of imaging systems is about as easy as finding a typewriter repairman.” (emphasis added)

Obviously that article greatly impacted my thinking about the Digital Dark Ages, and it got me wondering what best practices will be for managing born-digital assets or electronic records for increasingly long periods of time on storage media that is guaranteed for decreasing periods of time. As the article put it: “We’re constantly asking ourselves, ‘How do we retain and access electronic records that must be stored permanently?’” Well, this gets to the crux of the issue, especially when records managers and archivists aren’t invited into the conversations with IT. And just as we use more and more hard drives (or larger servers, even in the cloud), “Hard-drive Makers Weaken Warranties”. In a nutshell: “Three of the major hard-drive makers will cut down the length of warranties on some of their drives, starting Oct. 1, to streamline costs in the low-margin desktop disk storage business.”

So if we’re storing more data on storage media that is not built for long-term preservation, then records and archival management must be an ongoing relay race, with appropriate ongoing funding and support, as more and more materials are copied or moved from one storage medium to another periodically, every 3-5 years (or maybe soon every 1-3 years?). Benign neglect is no longer a sound records management strategy.

That’s the technological challenge. But there’s more! I’ve gone on and on before about NARA’s ERA program and how one top priority is to ingest 250 million emails from the Bush Administration. (I’ve done the math; it works out to nearly one email every second of the eight years.) So we know that NARA is interested in preserving electronic records. But a couple years ago I read this scary Fred Kaplan piece, “PowerPoint to the People: The urgent need to fix federal archiving policies,” in which he learned that “Finally—and this is simply stunning—the National Archives’ technology branch is so antiquated that it cannot process some of the most common software programs. Specifically, the study states, the archives ‘is still unable to accept Microsoft Word documents and PowerPoint slides.’”

Uhhhhh, wait! Well, at least that was written in 2009, so we can hope they have gotten their act together, but if you think about it too much, you might wonder: is EVERYTHING THAT NEEDS TO BE ARCHIVED IN MICROSOFT’S PROPRIETARY FORMATS? Or you might just be inspired to ask if anyone really uses PowerPoint in the military. Well, as Kaplan points out, “This is a huge lapse. Nearly all internal briefings in the Pentagon these days are presented as PowerPoint slides. Officials told me three years ago that if an officer wanted to make a case for a war plan or a weapons program or just about anything, he or she had better make the case in PowerPoint—or forget about getting it approved.” Or see this piece from the NYTimes, “We Have Met the Enemy and He Is PowerPoint,” in which “Commanders say that behind all the PowerPoint jokes are serious concerns that the program stifles discussion, critical thinking and thoughtful decision-making. Not least, it ties up junior officers — referred to as PowerPoint Rangers — in the daily preparation of slides, be it for a Joint Staff meeting in Washington or for a platoon leader’s pre-mission combat briefing in a remote pocket of Afghanistan.”


Digital New York: Still a Few Bugs in the System 2011/09/05

Posted by nydawg in Curating, Digital Archiving, Education, Electronic Records, Information Technology (IT), Media.

Hurricane Irene (not to scale)

Many of you know that I missed all the excitement last week as Hurricane Irene bore down on the New York area. I was in Chicago for the 75th Annual Meeting of the SAA (Society of American Archivists), and the forecasts got so bad that I received warning emails from my mother and my oldest brother. [I assume they had received, but not read, my itinerary, which clearly showed that I was heading to Minneapolis/St. Paul after the meeting.] So I figured I was in the clear until I realized, sometime on Friday, “Whoops! I forgot to close my windows!” So I guess I can say I was tangentially affected (by guilt) by Tropical Storm Irene. . . .

But as the story was developing, I was in touch with friends back East and learned that some who live in my neighborhood were advised to evacuate! My ex-girlfriend evacuated our two (Brooklyn) cats to Manhattan, and sent me pictures! Well, I live close enough to the East River to start to worry about my (second floor) apartment. With a little research, I learned that I could find the evacuation areas from nyc.gov. But on Saturday, I didn’t have any luck accessing the PDF or whatever it was.

So this morning, I stopped for a cup of coffee in Champion and happened to read an article reporting that “The New York Times reported that the city’s official website, www.nyc.gov, was down on the morning of Friday, Aug. 26. The news outlet suggested that the site was overwhelmed by people looking for information about the hurricane. As of 1:30 p.m. Pacific time, however, the site was back online. The timing couldn’t have been worse. In what New York City Mayor Michael R. Bloomberg called a “first time” for the city, he ordered a mandatory evacuation of various coastal areas of the city’s five boroughs, covering roughly 250,000 people.” So this is dysfunctional modern-day disaster planning.

From the Times’ “City Learns Lessons From the Storm, Many of Them the Hard Way,” we learn that “For example, the mayor’s office had predicted a surge in Web traffic on nyc.gov when it issued the evacuation order. But nobody expected five times the normal volume of traffic. By Friday afternoon, computer servers had become severely overloaded. The Web site sputtered and crashed for hours, when New Yorkers needed it most. In the future, the city will try to modify the Web site so that it can be quickly stripped down to a few essential features — like an evacuation map, searchable by ZIP code — that are in highest demand during an emergency.”

Hurricane Irene: NYC Evacuation Zones

I’m curious what the “normal volume” of traffic on that webpage is. But it seems to me that this is ultimately a problem with making information accessible without thinking it through to the extent that an end-user (who may have to evacuate his/her house!) has to first click on the PDF, then download it, wait for it to finish downloading, launch it, and then search for the data needed. The fact that this is not an integrated system, where a person can easily plug his/her zip code into an online lookup to find out if his/her house is in an evacuation zone, suggests that the system is not very functional, that best practices are not in use, and further, that perhaps the metrics used to show how vital Digital New York is are the wrong metrics to use.
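For illustration, the missing zip-to-zone lookup is a tiny piece of code. The zone assignments below are invented for the sketch; real data would come from the city’s GIS datasets, not this toy mapping:

```python
# Hypothetical zip-to-evacuation-zone table. A static lookup like this
# is cheap to serve and survives traffic spikes far better than a large
# PDF download. Zone assignments here are made up for illustration.
EVACUATION_ZONES = {
    "11222": "Zone A",   # e.g. low-lying waterfront
    "11211": "Zone B",
    "10004": "Zone A",
}

def zone_for_zip(zip_code):
    """Return the evacuation zone for a zip code, or None if not listed."""
    return EVACUATION_ZONES.get(zip_code)

print(zone_for_zip("11222"))  # prints the zone for that zip, or None
```

The design point is that the answer a panicked user needs is one dictionary lookup, not a multi-megabyte document.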

Why wouldn’t the IT staff at DoITT consider creating mirror sites for downloading the PDFs? So the first victim of Hurricane Irene was NYC.gov. “In a tweet earlier this morning the city’s Chief Digital Officer apologized for the outage while giving specific links (which were also frequently down) to find the city’s hurricane evacuation map (we’ve included it below for your convenience). And the city’s main Twitter feed just put out a similar tweet. Which means, damn, a LOT of people must be trying to access the city’s website. We’ve e-mailed to find out just how many users it takes to take down nyc.gov but have yet to hear back.”

Well, fortunately, they’ve probably learned some lessons from this hysteria, and it seems like no one suffered much damage in this area. Ironically (or fortunately), September is a good time to Get Prepared: “National Preparedness Month . . . a nationwide campaign to promote emergency preparedness and encourage volunteerism.” To learn more about NYC’s Digital Strategy and the Chief Digital Officer, check here for the Road Map.

Back from the SAA Annual Meeting #saa11 2011/09/01

Posted by nydawg in Archives, Education, Electronic Records, Records Management.

I’m back from the annual meeting of the SAA, and I had a blast.  I had the opportunity to hear many archivists and historians opine on best practices, photographic collections, digital forensics, electronic records management and a whole lot of other interesting topics.  And I hope to write about it soon.

Before I do, though, I wanted to write briefly about something I noticed. Last Wednesday, I had a 7:38 am flight and left Brooklyn at 5:15. The night before, I had used the MTA Trip Planner and learned that there was a G train at 5:30 connecting to the E @ 5:44 to arrive at AirTrain by 6:45. Plenty of time to go through security and catch my 7:38 flight. . . . But nooooooooo.

For some reason, the G train was late, and the next E train that came was running local (stopping at all stops), so the estimated 30 min. subway ride became a 60 min. trek to the security checkpoint at JFK (JetBlue). I got my ticket and arrived at security at 7:10 am, and there were about 1000000 people in front of me. (Okay, maybe 100.) I finally made it through at around 7:30 and made a mad dash for the gate, without even time to put on my belt! Alas, I got there too late, and the door to the plane was closed. The plane was still there (I was 8 mins early), but they had stopped boarding. I asked the woman at the help desk, reserved a space on the next flight (8 hours later) and paid $40. . . . Anyway, I learned a lesson, and blogged a couple times while there. (something about flying suitcases)

So what did I learn?  Well, when I returned from Minneapolis, we touched down at 6:01 pm and I was home, in my apartment, at 7:31.  The E train was running express!

Well, this all got me thinking. Maybe the MTA Trip Planner doesn’t make the connection that some E trains don’t run express early in the morning? If its algorithm is finding outdated information, it could cause problems. Or maybe the MTA Trip Planner should give multiple trip and time suggestions and offer a way to browse through the results, so people aren’t blue-skying their journeys to JFK airport. . .

Okay, so I learned that I need to leave that extra 15 minutes early even when doing due diligence. It wasn’t that bad, but it is a bit ironic that I spent eight hours waiting to take a 60-minute flight to Chicago, followed by a 30-minute taxi to the gate.

One other thing I noticed was soldiers walking through airports carrying huge backpacks filled with who-knows-what. I wonder if the military has embraced wheels on suitcases yet, as per a previous post.

Whither Appraisal?: David Bearman’s “Archival Strategies” 2011/08/22

Posted by nydawg in Archives, Best Practices, Curating, Digital Archives, Digital Preservation, Education, Electronic Records, Information Technology (IT), Media, Records Management.

Back in Fall 1995, American Archivist published one of the most controversial and debate-inspiring essays in the field, written by archival bad-boy David Bearman of Archives & Museum Informatics in Pittsburgh (now living in Canada). The essay, “Archival Strategies,” pointed to several problems (challenges/obstacles) in archival methods and strategies which, at the time, threatened to make the profession obsolete. The piece was a follow-up to his 1989 “Archival Methods,” which showed “time and again that archivists have themselves documented order of magnitude and greater discrepancies between our approaches and our aims” and called “for a redefinition of the problems, the objectives, the methods or the technologies appropriate to the archival endeavor.” As he points out in “Archival Strategies”: “In Archival Methods, I argued that ‘most potential users of archives don’t,’ and that ‘those who do use archives are not the users we prefer.’”

This disconnect between archives and their future users led Bearman to write: “I urged that we seek justification in use, and that we become indispensable to corporate functioning as the source of information pertaining to what the organization does, and as the locus of accountability.” With well-stated pithy aphorisms like “most potential users of archives don’t” and “those who do use archives are not the users we prefer,” he was able to point to the serious problem facing us today: past practices have led us to preserve the wrong stuff for our unpreferred users! Of course, Information Technology has led us down this road, since computer storage is marketed as cheap (and always getting cheaper), and it seems much easier to store everything than to let an archivist do his job, starting with selection and appraisal, retention and preservation, arrangement and description, and access and use.

Ultimately, his essay is a clarion call for archivists to establish a clear goal for the profession, namely to accept their role in risk management and in providing accountability for the greater societal good. The role of an archivist, in my opinion, is to serve as an institution’s conscience! Perhaps that is the reason why library science and archival studies are considered sciences. He suggests that strategic thinking is required: “Because strategic thinking focuses on end results, it demands ‘outcome’ oriented, rather than ‘output’ oriented, success measures. For example, instead of measuring the number of cubic feet of accessions (an output of the accessioning process), we might measure the percentage of requests for records satisfied (which comes closer to reflecting the purpose of accessioning).”
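Bearman’s output/outcome distinction is easy to make concrete. The figures below are invented purely for illustration:

```python
# Output measure: volume processed. Outcome measure: use actually served.
# All numbers are hypothetical, chosen only to show the contrast.
cubic_feet_accessioned = 1200        # an "output" of the accessioning process
requests_received = 450
requests_satisfied = 180

output_metric = cubic_feet_accessioned
outcome_metric = requests_satisfied / requests_received  # fraction of use served

print(f"{outcome_metric:.0%} of requests satisfied")  # prints "40% of requests satisfied"
```

A repository can post impressive output numbers (lots of cubic feet accessioned) while its outcome number stays low, which is exactly the failure Bearman is pointing at.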

This seminal essay is a fascinating read and a groundbreaking analysis of the sorry state of appraisal. “What we have actually been doing is scheduling records to assure that nothing valuable is thrown away, but this is not at all equivalent to assuring that everything valuable is kept. Instead, these methods reduce the overall quantity of documentation; presumably we have felt that if the chaff was separated from the wheat it would be easier to identify what was truly important. The effect, however, is to direct most records management and archival energy into controlling the destruction of the 99 percent of records which are of only temporary value, rather than into identifying the 1 percent we want, and making efforts to secure them.”

Using incendiary language, Bearman goes on to state the obvious: “Appraisal, which is the method we have employed to select or identify records, is bankrupt. Not only is it hopeless to try to sort out the cascade of ‘values’ that can be found in records and to develop a formula by which these are applied to records, it wastes resources and rarely even encounters the evidence of those business functions which we most want to document.”

2D lifecycle or 3D continuum

This is a revolutionary essay, and I strongly encourage every archivist to read it and think about it deeply. The ideas have mostly languished and been ignored in this country, where we continue to use the life cycle model, but Bearman’s ideas are written into the international standard for records management (ISO 15489) and widely embraced in Australia (and China), where, over the last two decades, archivists have conceptualized and implemented the “Australian records continuum” model to great effect. In doing so, they are looking at born-digital assets and electronic records from the perspectives of all users, functions and needs. In my opinion, the continuum model is a 3D version of the life cycle, which reminds me of the image from A Wrinkle in Time in which Mrs. Who and Mrs. Whatsit explain time travel to Meg and Charles Wallace by showing how an ant can quickly move across a string if the two ends are brought closer together. In other words, if archivists look at the desired end result, they can appraise and process accordingly.

 

After reading the Bearman essay for the first time and seeing how it has caused such dramatic changes in archival conceptualizations, methods, strategies and processes elsewhere, yet is still not taught in any depth in US library or archival studies schools, I spoke with other nydawg members, and we decided to use it as the text for our next discussion group on Tuesday, August 23. I hope to revisit this topic later.

One last point. Because of the deluge of materials accessioned by archives, the “uncataloged backlog among manuscripts collections was a mean of nearly one-third repository holdings,” leading the authors to claim that “Cataloging is a function that is not working.” With budgets cut and small staffs unable to make progress, Mark Greene and Dennis Meissner wrote another revolutionary piece titled “More Product, Less Process: Pragmatically Revamping Traditional Processing Approaches to Deal with Late 20th-Century Collections” [MPLP], which was a plea for minimal processing.

Unlike Bearman’s “Archival Strategies,” MPLP leads archivists to believe that we must remain passive filers or describers or catalogers or undertakers. But without a better understanding of appraisal and how to do it, we are doomed with analog, paper, born-digital or electronic records! The clearest example of this is the National Archives and Records Administration’s Electronic Records Archive (ERA), which, according to Archivist of the United States David Ferriero: “At the moment, most of the electronic records in ERA are Presidential records from the George W. Bush White House. This important collection includes more than 200 million e-mail messages and more than 3 million digital photographs, as well as more than 30 million additional electronic records in other formats.”

A few weeks ago, I actually crunched the numbers and figured out that 200 million emails over the course of eight years works out to nearly one email a second! (365 days × 8 years = 2,920 days, plus 2 leap days = 2,922 days; 2,922 days × 24 hours = 70,128 hours; 70,128 × 60 = 4,207,680 minutes; 4,207,680 × 60 = 252,460,800 seconds.)
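The back-of-the-envelope math is easy to check:

```python
# Reproduce the post's arithmetic: seconds in the eight-year
# administration (including two leap days) versus 200 million emails.
days = 365 * 8 + 2                    # 2,922 days
seconds = days * 24 * 60 * 60         # 252,460,800 seconds
emails = 200_000_000                  # NARA's "more than 200 million"

print(seconds)                        # 252460800
print(round(emails / seconds, 2))     # roughly 0.79 emails per second
```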
After doing the math, my first thought was, “if we’re trying to process and preserve every email sent every second by the George W. Bush Administration, we must be doing something wrong.” And now, I think I understand the problem: we’re not doing effective appraisal. Although we still have to wait for public access to the emails, I am fairly confident that researchers will find that nearly 90 percent of the collection consists of duplicates: the sent email, the different received copies, plus backups of all of them. With better appraisal, this task should not be so difficult, and it would leave more time for catalogers to write more detailed descriptions (which will be more important later, especially with different formats of “moving images” that are not compatible with newer hardware; e.g., iPads don’t play Flash video).
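If most of those 200 million messages really are duplicate copies (the sent copy, the received copies, the backups), appraisal could start with simple content hashing. This is a sketch under that assumption, not NARA’s actual workflow:

```python
import hashlib

def dedupe(messages):
    """Keep one copy per unique message body, discarding exact duplicates."""
    seen, unique = set(), []
    for body in messages:
        digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(body)
    return unique

# Toy mailbox: the same message appears as sent copy and received copy.
mailbox = ["Meeting at 3pm", "Meeting at 3pm", "Budget attached"]
print(len(dedupe(mailbox)))  # 2
```

Real email deduplication would hash normalized headers and bodies (or use Message-ID), but even this crude pass shows how appraisal can shrink the processing problem before cataloging begins.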

 

Disaster Plan: Mystery Surrounds Loss of Digital 9/11 Records, Docs & Art 2011/08/21

Posted by nydawg in Archives, Best Practices, Digital Preservation, Education, Electronic Records, Intellectual Property, Privacy & Security.

A few weeks ago, nydawg member NYU Professor Howard Besser shared this article from the AP. As an archivist and records manager, I shudder to think that each lost asset was stored in only one place, with no copies stored offsite in at least two geographically separate locations.

“Besides ending nearly 3,000 lives, destroying planes and reducing buildings to tons of rubble and ash, the Sept. 11, 2001, attacks destroyed tens of thousands of records, irreplaceable historical documents and art.  In some cases, the inventories were destroyed along with the records. And the loss of human life at the time overshadowed the search for lost paper. A decade later, agencies and archivists say they’re still not completely sure what they lost or found, leaving them without much of a guide to piece together missing history.

“You can’t get the picture back, because critical pieces are missing,” said Kathleen D. Roe, operations director at the New York State Archives and co-chairwoman of the World Trade Center Documentation Project. “And so you can’t know what the whole picture looks like.”  . . . . “The trade center was home to more than 430 companies, including law firms, manufacturers and financial institutions. Twenty-one libraries were destroyed, including that of The Journal of Commerce. Dozens of federal, state and local government agencies were at the site, including the Equal Employment Opportunity Commission and the Securities and Exchange Commission.

from Northeast Document Conservation Center

But the story goes on to point out that nobody notified NARA! I would think that most of these federal agencies would have disaster plans and policies (check out the Library of Congress’s “404 page not found,” or here, or NARA, and NARA from 1993), but maybe I’m wrong. Fortunately, you can probably find assistance at NEDCC’s dPlan. . . .

“Federal agencies are required by law to report the destruction of records to the U.S. National Archives and Records Administration — but none did. Federal archivists called the failure understandable, given the greater disaster. After Sept. 11, “agencies did not do precisely what was required vis-à-vis records loss,” said David S. Ferriero, the Archivist of the United States, in an email to The Associated Press. “Appropriately, agencies were more concerned with loss of life and rebuilding operations — not managing or preserving records.” He said off-site storage and redundant electronic systems backed up some records; but the attacks spurred the archives agency to emphasize the need for disaster planning to federal records managers.

Said Steven Aftergood, the director of the project on government secrecy at the watchdog group the Federation of American Scientists: “Under extreme circumstances, like those of 9/11, ordinary record keeping procedures will fail. Routine archival practices were never intended to deal with the destruction of entire offices or buildings.”

Read “Mystery Surrounds Loss of Records, Art on 9/11,” and when you’re ready and think you can get some institutional support, you might want to check out some great resources, including:
the Society of American Archivists’ [SAA] annotated resources site for disaster plan templates, articles and other useful information;
a useful guide from NARA, the Emergency Preparedness Bibliography (which is only 5 years old); or
the NARA Disaster Preparation Primer from 1993, which doesn’t mention digital or electronic records.

dk

SAA Education Announces Digital Archives Certificate, Fall Workshops & Courses 2011/08/18

Posted by nydawg in Digital Archives, Digital Preservation, Education, Electronic Records, Information Technology (IT).

Digital Archives Specialist

Hi Everyone: The Society of American Archivists [SAA] is announcing its Fall 2011 slate of 26(!) classes and workshops, including, for the first time ever, courses leading to the Digital Archives Specialist [DAS] Certificate, a curriculum I’m proud to say I helped conceptualize, draft and propose with 6 other members of SAA’s DACE [Digital Archives Continuing Education] Task Force (including one other member of nydawg)! To learn more about DAS, see http://www2.archivists.org/prof-education/das

Standards for Digital Archives [DAS]
September 29 in your office, classroom, or home!

Managing Electronic Records in Archives and Special Collections [DAS]
September 8-9, 2011 in Philadelphia
 (early discount extended to August 26!)
October 13-14, 2011 in Hanover, New Hampshire

December 15-16 in Pasadena, California

Encoded Archival Description
September 15-16 in Bethlehem, PA
 (early discount extended to August 26!)
November 10-11, 2011 in San Antonio, Texas

Understanding Archives: An Introduction to Archival Principles and Practices
September 16-17, 2011 in Mount Carroll, Illinois

Implementing DACS in Integrated Content Management Systems: Using the Archivists’ Toolkit™
September 26-27 in Washington

December 5-6 in Claremont, California

Arrangement and Description of Manuscript Collections
September 26-27 in Austin, Texas

Describing Archives: A Content Standard
September 30 in Richmond, Virginia
 [Open to Virginia residents only]
October 14 in Seattle

Archives Overview
October 3 in Carbondale, Illinois
 [Scholarships available for Illinois residents]

Oral History: From Planning to Preservation [Scholarship for North Dakota residents]
October 3 in Bismarck, North Dakota

Style Sheets for EAD: Delivering Your Finding Aids to the Web
October 13-14 in Pasadena, California

An Introduction to Archival Exhibitions
October 14 in Austin, Texas

Rare Books for Archivists
October 20-21 in Chicago

December 7-8 in San Antonio, Texas

Introduction to Basic Imaging: How to Do a Small Digitization Project
October 31 in Princeton, New Jersey

Visual Literacy for Photograph Collections
November 4 in San Jose, California

Managing Literary Manuscripts
December 5 in Roanoke, Virginia

Implementing “More Product, Less Process”
December 12 in Lakeland, Florida

Whistleblowers & Leakers and Records Management @ the SEC 2011/08/18

Posted by nydawg in Archives, Digital Archives, Digital Preservation, Education, Electronic Records, Information Technology (IT), Privacy & Security, Records Management.
Tags: , ,
add a comment

Does it seem like whistleblowers and leakers have been in the news more and more over the last decade?  Perhaps it’s because I’m more tuned in and looking at things from an archivist’s or records manager’s point of view, but there have been some intriguing high-profile whistleblower cases.  And I’m not just talking about Private First Class (Pfc.) Bradley Manning, who leaked to WikiLeaks copies of diplomatic cable communications, classified documents, prison dossiers, classified war logs, embassy reports and a video, “Collateral Murder,” described as “a classified US military video depicting the indiscriminate slaying of over a dozen people in the Iraqi suburb of New Baghdad — including two Reuters news staff.”

Meanwhile, Manning is in jail somewhere and may possibly face the death penalty because he copied, zipped, and uploaded a classified video: “Specification 11 covers the release of ‘a file named “BE22 PAX.zip” containing a video named “BE22 PAX.wmv.”’”  This is likely the video of the 2009 Afghan airstrike that WikiLeaks published.
This morning I heard a brief mention of an SEC whistleblower revealing how a poor records management policy (in this case, one of document disposal and destruction) helped cover up Wall Street crimes.  From Matt Taibbi’s excellent piece in Rolling Stone:

“That, it now appears, is exactly how the Securities and Exchange Commission has been treating the Wall Street criminals who cratered the global economy a few years back. For the past two decades, according to a whistle-blower at the SEC who recently came forward to Congress, the agency has been systematically destroying records of its preliminary investigations once they are closed. By whitewashing the files of some of the nation’s worst financial criminals, the SEC has kept an entire generation of federal investigators in the dark about past inquiries into insider trading, fraud and market manipulation against companies like Goldman Sachs, Deutsche Bank and AIG. With a few strokes of the keyboard, the evidence gathered during thousands of investigations – “18,000 … including Madoff,” as one high-ranking SEC official put it during a panicked meeting about the destruction – has apparently disappeared forever into the wormhole of history.”

In other words, they gather records, take the evidence and then destroy it!  But since these are the nation’s records, there must be some connection to the National Archives, and there is.  “Under a deal the SEC worked out with the National Archives and Records Administration, all of the agency’s records – “including case files relating to preliminary investigations” – are supposed to be maintained for at least 25 years. But the SEC, using history-altering practices that for once actually deserve the overused and usually hysterical term “Orwellian,” devised an elaborate and possibly illegal system under which staffers were directed to dispose of the documents from any preliminary inquiry that did not receive approval from senior staff to become a full-blown, formal investigation.”

“The enforcement division of the SEC even spelled out the procedure in writing, on the commission’s internal website. ‘After you have closed a MUI that has not become an investigation,’ the site advised staffers, ‘you should dispose of any documents obtained in connection with the MUI.’”  Read all about it in Taibbi’s “Is the SEC Covering Up Wall Street Crimes?”

My impression is that this is really all about records management policies, processes and procedures.  Like Manning, the SEC whistleblower saw mixed messages, witnessed wrongdoing, stepped forward and leaked copies of the records.  Until we figure out the difference between records and copies of records, this story may go nowhere.  But it seems likely that even if the SEC thought it had deleted the MUIs, traces may remain that could provide evidence from past investigations.  IT probably has backup tapes (which have been over-written frequently), but the data could be extracted (at great expense).  The real question is: does the SEC want to preserve all of its agency records?

My guess is: probably not.  My other thought is that there’s an easy answer: post-custodial control.  Agencies need to take responsibility for managing their records, even before those records are handed off to an archivist or to NARA.  The SEC needs its own department of archivists and records managers, and a workable strategy and plan with IT for managing its own authentic records.  It also needs somebody with a conscience, who is trustworthy enough to deliver accurate records with a life span of 25 years(!) to NARA in a timely fashion.

Changing Good, Better and Best Practices 2011/08/15

Posted by nydawg in Archives, Best Practices, Digital Archives, Education.
Tags: , ,
add a comment

One of the most pressing challenges digital archivists and information managers face in the Information Age is the discovery of “Best Practices.”  According to Richard Pearce-Moses’ invaluable Glossary of Archival Terminology, best practices are “procedures and guidelines that are widely accepted because experience and research have demonstrated that they are optimal and efficient means to produce a desired result.”  But in the Information Age, where technologies and technical standards are continually changing, formats are constantly improving and upgrading, and all kinds of functional and administrative variables impact our archives and collections, best practices play a very important role in archivists’ daily lives.  Since the future is unknown and, for the most part, unknowable, digital archivists need better “Best Practices” as we try to prepare yesterday’s and today’s collections for tomorrow’s and 2020’s users.

In an age of bit rot, link rot, vendor and software lock-in, media and technological obsolescence, non-backwards compatibility, proprietary encryption and compression, algorithmic searches, data deluges, digital dilemmas and other variables, it’s important to keep in mind that today’s “Best Practices” may seem quaint and will likely be irrelevant in five years or less!
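One of those threats, bit rot, is at least easy to detect with routine fixity checks: record a checksum at ingest, then recompute it on a schedule and compare. Here is a minimal sketch in Python using only the standard library; the function names are my own, not from any particular repository tool.

```python
import hashlib

def fixity_checksum(path, algorithm="sha256", chunk_size=65536):
    """Compute a checksum for a file at ingest, reading in chunks
    so large video or disk-image files don't exhaust memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(path, recorded_checksum, algorithm="sha256"):
    """Return True if the file still matches the checksum recorded
    at ingest; a mismatch signals bit rot or tampering."""
    return fixity_checksum(path, algorithm) == recorded_checksum
```

In practice the recorded checksums would live in the repository’s preservation metadata (PREMIS has a fixity element for exactly this), so a failed comparison can trigger restoration from a redundant copy.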

Here’s a few examples of recent attempts to describe “Best Practices” for Digital Archives and Collections and for Digitization.

University of Maryland. Best Practices Guidelines for Digital Collections, 2007.

NARA. Guidelines for Digitizing Archival Materials for Electronic Access, 2004.

NISO. A Framework of Guidance for Building Good Digital Collections, 2007.

So how has this changed in the last four years?!  Well, for example, the release of the iPad has led many organizations (including YouTube) to change their strategies and processes in order to encode in multiple digital video formats so that iPad users can also access collections (read: no Flash video).  This may seem limited to Flash video alone, but a little research quickly reveals that some Digital Asset Management (DAM) systems (including Fedora) rely on Flash, and that many “records” originally created in MS PowerPoint have not yet been converted or captured in an open (non-proprietary) format.

In my opinion, one of the serious problems we face is that our archival description systems and schemas (e.g. Dublin Core, VRA) lack many essential technical fields which must be attached to assets alongside descriptive metadata (e.g. resolution, frame rate, color space, encryption, codecs, operating system, etc.).  Historically, best practices taught us to describe objects in a simplified way to streamline access for many stakeholders and researchers.  But when we use the same “Moving Images” term to describe everything from digital and analog video, to film in different gauges (16mm, 35mm, Super 8), to DVD, VHS, CD-ROM, QuickTime, H.264, Flash video, RealMedia, Ogg Vorbis and others, we run the risk of making our archives effectively inaccessible to future users who may have no equipment capable of playing those old formats.
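To make the gap concrete, here is a hypothetical sketch of what pairing descriptive fields with required technical fields might look like. The record layout and field names (`dc`, `technical`, `frame_rate`, etc.) are illustrative assumptions, not part of Dublin Core or VRA; the point is simply that a repository can flag “Moving Images” records whose technical metadata is incomplete before access breaks.

```python
# Hypothetical record: Dublin Core-style descriptive fields in "dc",
# plus a "technical" block for the fields the schemas don't mandate.
REQUIRED_TECHNICAL_FIELDS = {"format", "codec", "frame_rate", "resolution"}

def missing_technical_fields(record):
    """Report which required technical fields a record still lacks,
    so incomplete assets can be queued for re-description."""
    technical = record.get("technical", {})
    return sorted(REQUIRED_TECHNICAL_FIELDS - set(technical))

record = {
    "dc": {"title": "Oral history interview", "type": "Moving Images"},
    "technical": {"format": "QuickTime", "codec": "H.264"},
}
# missing_technical_fields(record) would flag frame_rate and resolution.
```

A simple audit like this, run across a collection, turns the vague worry about “inadequate schemas” into a concrete to-do list for catalogers.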