
NARA to Declassify 400 Million Pages of Documents in Three Years 2011/12/06

Posted by nydawg in Archives, Digital Archives, Digital Preservation, Electronic Records, Information Technology (IT), Media, Records Management.

For a very long time, I have been trying to ask anyone who knows (from my colleagues to the AOTUS himself) why we are even attempting to preserve all 250 million emails created during the Bush Administration.  As I’ve mentioned before, that works out to nearly one email every second for eight years!  (And remember, part of that time included Bush’s annual month-long vacations.)  So this story gives some useful context on how the National Archives (NARA) deals with processing large backlogs of materials.  “All of these pages had been piling up here, literally,” said Sheryl J. Shenberger, a former CIA official who is the head of the National Declassification Center (NDC) at the National Archives. “We had to develop a Costco attitude: We had 400 million pages . . . and we have three years to do them in.”
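Just to show the scale we’re talking about, the arithmetic behind those two figures is easy to check (rough calendar math, nothing fancy):

```python
# Back-of-the-envelope check of the figures above (rough calendar-year math).
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60   # ~31.6 million seconds

emails = 250_000_000                        # Bush-era emails
rate = emails / (8 * SECONDS_PER_YEAR)      # spread over eight years
print(f"{rate:.2f} emails per second")      # ~0.99 -- nearly one per second

pages = 400_000_000                         # NDC declassification backlog
per_day = pages / (3 * 365.25)              # spread over three years
print(f"{per_day:,.0f} pages per day")      # ~365,000 pages every single day
```

Roughly a page a day would be a project; 365,000 pages a day is a Costco attitude indeed.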

If you read Saturday’s article in the Washington Post, you’ll learn that “[a]ll of the backlogged documents date back 25 years or more, and most are Cold War-era files from the departments of Defense, State and Justice, among other agencies. The CIA manages the declassification of its own files.”  You’ll also read AOTUS David Ferriero’s claim that “[t]he current backlog is so huge that Americans are being denied the ability to hold government officials accountable for their actions” and that “[b]y streamlining the declassification process, the NDC will usher in a new day in the world of access.”

If NARA is really trying to declassify, process, catalog, describe, preserve and make these pages available, I hope they’re planning on hiring some more archivists!  The problem is that when institutions are dealing with mass quantities of materials, the (quantitative) metrics we use may actually hurt us in the future.  In the archival world, the prevailing wisdom seems to be MPLP (More Product, Less Process), but I would argue that archivists need qualitative metrics as well, if only to ensure that they are reducing redundancies and older, non-needed versions.  This gets to the crux of the distinction between best practices for records managers and best practices for digital asset managers (or digital archivists).  Ideally, a knowledgeable professional will collect and appraise these materials and describe them in a way that allows a future plan to be created, ensuring that these assets (or records) can be migrated forward into new formats accessible on emerging (or not-yet-invented) media players and readers.

Ultimately, this leads to the most serious problem facing archivists: the metadata schemas that are most popular (DublinCore, IPTC, DACS, EAD, etc.) are not specific enough to help archivists plan for the future.  Until our metadata schemas can be updated to ensure that content, context, function, structure, brand, storage media and file formats can be specifically and granularly identified and notated, we will continue paddling frantically against the digital deluge with no workable strategy or plan, and no awareness of potential problems (e.g. vendor lock-in, non-backwards-compatible formats, etc.).  Sadly, in the face of huge quantities of materials (emails and pages), NARA will probably embrace MPLP, and ultimately hinder and hurt future access to the most important specific files, pages, emails, etc., because they will refuse to hire more professionals to do this work, and will (probably) rely on computer scientists and defense contractors to whitewash the problems and sell more software.
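To make the point concrete, here is a sketch of the kind of granular record I have in mind.  The field names are purely illustrative, not taken from DublinCore, PBCore, or any published schema:

```python
# A hypothetical, granular preservation record -- field names are
# illustrative only, not from DublinCore, PBCore, or any real schema.
record = {
    # descriptive (content & context)
    "title": "Cabinet meeting minutes, 2003-05-12",
    "function": "record of executive deliberation",
    "context": "created by the Office of the Secretary",
    # technical (what future migration planning actually needs)
    "file_format": "WordPerfect 5.1 (.wp)",
    "format_registry_id": None,   # e.g. a PRONOM PUID, if one is known
    "storage_media": "3.5-inch floppy, copied to spinning disk in 2009",
    "known_risks": ["proprietary format", "no current vendor support"],
}

# Once risk is captured explicitly, flagging material becomes a simple query:
if record["known_risks"]:
    print(f"{record['title']}: schedule for format migration")
```

The point isn’t these particular fields; it’s that once the technical risks are recorded explicitly, migration planning stops being guesswork.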

Comparing Documentation Strategy of Civil War and First Gulf War 2011/11/21

Posted by nydawg in Archives, Best Practices, Digital Archives, Digital Preservation, Media, Records Management.

I’ve said it before, and I’ll say it again (paraphrasing someone else): “We are at risk of knowing less about the events leading up to the First Gulf War than the events leading up to the Civil War, because all of the records and documents from the Civil War were conserved and preserved, whereas the records from the First Gulf War were created on Wang word processors, never migrated forward, and are now lost forever.”

Case in point: Lincoln at Gettysburg; photo by Matthew Brady
http://blogs.archives.gov/prologue/?p=2564

Or 1991 Gulf War speech by Sec of Def Cheney:
http://en.wikipedia.org/wiki/File:Cheney_Gulf_War_news_conference.jpg
http://upload.wikimedia.org/wikipedia/commons/thumb/5/52/Powell,_Schw…

or http://www.pbs.org/mediashift/2007/08/the-tangled-state-of-archived-n…

Adobe Abandons Mobile Flash Video (Over Steve Jobs’ Dead Body) 2011/11/10

Posted by nydawg in Archives, Digital Preservation, Information Technology (IT), Intellectual Property, Media.

Wired Magazine ran an interesting news story that many have been expecting!  “On Wednesday morning, Adobe delivered the eulogy for its multi-media Flash platform for mobile, stating the company would no longer invest resources in porting its once-indispensable cross-browser technology to smartphones and tablets.  It’s a startling admission of failure from a company that vehemently defended Flash and its mobile strategy in the face of Apple’s refusal to allow it on the iPhone and iPad. Adobe even took on Steve Jobs in a war of words over Flash’s viability as a mobile platform, all in the public domain.  But the writing was on the wall for Flash years ago, and Adobe knew it. With no Flash announcements to be heard at its Adobe Max conference earlier this year and with the company slowly beefing up its toolkit of Flash alternatives, Wednesday’s move is in step with Adobe’s broader strategy of migrating its loyal Flash developer base to a new era, one where mobile platforms reign supreme.”

It’s interesting to watch how these advancements will change our archiving strategies as older formats are retired and/or unsupported.  Everyone knows that the H.264 codec is more energy-efficient on mobile hardware, but is the quality also better, and is it worth those license fees?!  So just for fun, you might want to check out Steve Jobs’ “Thoughts on Flash” from April 2010: “I wanted to jot down some of our thoughts on Adobe’s Flash products so that customers and critics may better understand why we do not allow Flash on iPhones, iPods and iPads. Adobe has characterized our decision as being primarily business driven – they say we want to protect our App Store – but in reality it is based on technology issues. Adobe claims that we are a closed system, and that Flash is open, but in fact the opposite is true. Let me explain.”


Day of Digital Archives: McLuhan “The [digital] medium is [no longer] the [only] message.” 2011/10/06

Posted by nydawg in Digital Archives, Digital Archiving, Digital Preservation, Education, Information Technology (IT), Media.

This year marks the 100th anniversary of the birth of “the new spokesman of the electronic age”, Marshall (Understanding Media) McLuhan, and digital archivists should take a moment to think about how media, digital and analog, hot and cool, and in many different formats change our jobs, lives and responsibilities. With threats of technological obsolescence, vendor lock-in, hardware failure, bit rot and link rot, non-backwards compatible software, and format and media obsolescence, digital archivists need a system to accurately describe digital objects and assets in their form and function, content, subject, object and context. If we miss key details, we run the risk of restricting access in the future because, for example, data may not be migrated or media refreshed as needed. By studying and understanding media, digital archivists can propose a realistic and trustworthy digital strategy and implement better and best practices to guarantee more efficiency from capture (and digitization or ingest) and appraisal (selection and description), to preservation (storage) and access (distribution).
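One small, concrete piece of such a system is identifying a file’s format from its own bytes rather than trusting a filename extension; this is the general approach format-identification tools take, sketched here with a deliberately tiny signature table:

```python
# Identify a file's format from its leading "magic" bytes rather than
# trusting its extension -- a tiny, illustrative signature table.
SIGNATURES = {
    b"%PDF": "PDF document",
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"\xff\xd8\xff": "JPEG image",
    b"PK\x03\x04": "ZIP container (also DOCX, EPUB, ...)",
}

def identify(data: bytes) -> str:
    """Return a format name based on the file's first bytes."""
    for magic, name in SIGNATURES.items():
        if data.startswith(magic):
            return name
    return "unknown format"

print(identify(b"%PDF-1.4 ..."))  # PDF document
```

A real identification service keys its signatures to a maintained format registry, but even this toy version shows why the bytes, not the label, are what a preservation plan has to describe.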

Over the last ten, forty, one hundred and twenty thousand years, we have crossed many thresholds and lived through many profound media changes: from oral culture to hieroglyphic communications to the alphabet and the written word, from scrolls to books, and most recently transitioning from the Atomic Age (age of atoms) to the Information Age (era of bits). While not all changes were paradigm shifts, many helped shift currencies of trust and convenience to establish new brand loyalties built on threats of imminent obsolescence and vendor lock-in. As digital archivists, we stand at the line separating data from digital assets, so we need to ensure that we are archiving and preserving the assets and describing the content, technical and contextual metadata as needed.

Today, Day of Digital Archives, is a good day to consider Marshall McLuhan’s most famous aphorism, “The medium is the message,” and update it for the Information Age. In a nutshell, McLuhan argues that “the medium is the message” because an electric light bulb (medium) is pure information (light). He goes on to state: “This fact, characteristic of all media, means that the ‘content’ of any medium is always another medium. The content of writing is speech, just as the written word is the content of print, and print is the content of the telegraph.” (Understanding Media, 23-24) But in the Information Age, the [digital] medium is [no longer] the [only] message. Every born-digital or digitized file is a piece of the environment in which it was created or is accessed, and needs to be described on multiple planes to articulate technical specifications (hardware & software versions, operating system, storage media, file format, encryption) as well as its content. For archivists and librarians describing content, the medium and the message, many use MARC, DublinCore and VRA Core as guides, but PBCore provides a richly defined set of technical, content and Intellectual Property metadata fields to ensure all stakeholders, including IT staff, will be able to efficiently access, copy or use the asset (or a copy).

With More Product, Less Process [MPLP] the prevailing processing strategy, many libraries, archives and museums encourage simplified descriptions to catalog digital objects, but these generic descriptions (e.g. moving image, video or digital video) do not provide the most critical information to ensure future users can watch the video online, on an iPad or with a DVD player (or VHS player or film projector). Until digital objects and assets are described in their granular, multi-dimensional digital splendor, we are hurting ourselves and archival access in the future. Once we understand that the medium and message are split into many different categories, we can focus descriptive metadata on critical access points (subject, format or function), and we will not need to panic and make work for ourselves every time a new [moving image] format [or codec] gains temporary popularity. With better description and critical appraisal at ingest, digital archivists will understand that the medium, the message and the content, subject, structure, form, format and other aspects are all integral parts. At that point we will start to spread a new mindset: “The [digital] medium is [no longer] the [only] message.”
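A toy example (the records here are invented) shows why a generic MPLP-style description cannot answer the access question, while a granular one can:

```python
# Two ways of describing the same asset -- invented example records.
generic = {"id": "vid_001", "type": "digital video"}

granular = {
    "id": "vid_001",
    "type": "digital video",
    "wrapper": "QuickTime (.mov)",
    "video_codec": "Sorenson Video 3",   # a long-obsolete codec
    "audio_codec": "IMA 4:1",
}

# "Can a future user watch this on current players?"  The generic record
# cannot say; the granular one can drive an automated appraisal.
OBSOLETE_CODECS = {"Sorenson Video 3", "Cinepak", "RealVideo"}

def needs_migration(rec):
    codec = rec.get("video_codec")
    if codec is None:
        return None          # record too vague to answer the question
    return codec in OBSOLETE_CODECS

print(needs_migration(generic))   # None -- the record can't answer
print(needs_migration(granular))  # True -- flag for migration
```

With the generic record, every new codec scare means re-examining every file by hand; with the granular record, it means running one query.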

WikiLeaks’ Cablegate Links State Dept. Bureau of Diplomatic Security to Madness 2011/09/28

Posted by nydawg in Archives, Digital Archives, Digital Preservation, Electronic Records, Information Technology (IT), Media, Privacy & Security, Records Management, WikiLeaks.

For the last year or so, I’ve been fascinated by the whole WikiLeaks Cablegate story.  As I posted previously, there are a number of factors that contribute to this story which make it particularly interesting for people concerned with records management and best practices for accessing and sharing information.  In my opinion, Private First Class Bradley Manning is a fall guy (lipsynching to Lady Gaga), but the episode revealed serious systemic malfunctions.  So I was very interested to read this article by Andy Kroll: “The Only State Dept. Employee Who May Be Fired Over WikiLeaks“.

“Peter Van Buren is no insurgent. Quite the opposite: For 23 years he’s worked as a foreign service officer at the State Department, and a damn good one from the looks of it. He speaks Japanese, Mandarin Chinese, and Korean; served his country from Seoul to Sydney, Tokyo to Baghdad; and has won multiple awards for his disaster relief work. So why was Van Buren treated like a terror suspect by his own employer? For linking to a single leaked cable dumped online by WikiLeaks earlier this month.”

Well, this led me to read a TomDispatch.com posting by Van Buren himself which offers a clear-headed look at the madness!  For one thing, Van Buren got into a heap of trouble and was “under investigation for allegedly disclosing classified information” for LINKING to a WikiLeaks document which was already on the Web!  As he put it: “two DS agents stated that the inclusion of that link amounted to disclosing classified material. In other words, a link to a document posted by who-knows-who on a public website available at this moment to anyone in the world was the legal equivalent of me stealing a Top Secret report, hiding it under my coat, and passing it to a Chinese spy in a dark alley.”

Van Buren goes on to analyze the situation by stating: “Let’s think through this disclosure of classified info thing, even if State won’t. Every website on the Internet includes links to other websites. It’s how the web works. If you include a link to say, a CNN article about Libya, you are not “disclosing” that information — it’s already there. You’re just saying: “Have a look at this.”  It’s like pointing out a newspaper article of interest to a guy next to you on the bus.  (Careful, though, if it’s an article from the New York Times or the Washington Post.  It might quote stuff from Wikileaks and then you could be endangering national security.)”

And, for me, the cherry on the top, and something I’ve been trying to state for most of the last year (including at the Archivists Round Table of Metropolitan New York meeting in January 2011), is the fact that “No one will ever be fired at State because of WikiLeaks — except, at some point, possibly me. Instead, State joined in the Federal mugging of Army Private Bradley Manning, the person alleged to have copied the cables onto a Lady Gaga CD while sitting in the Iraqi desert. That all those cables were available electronically to everyone from the Secretary of State to a lowly Army private was the result of a clumsy post-9/11 decision at the highest levels of the State Department to quickly make up for information-sharing shortcomings. Trying to please an angry Bush White House, State went from sharing almost nothing to sharing almost everything overnight. They flung their whole library onto the government’s classified intranet, SIPRnet, making it available to hundreds of thousands of Federal employees worldwide. . . . State did not restrict access. If you were in, you could see it all. There was no safeguard to ask why someone in the Army in Iraq in 2010 needed to see reporting from 1980s Iceland. . . . Most for-pay porn sites limit the amount of data that can be downloaded. Not State. Once those cables were available on SIPRnet, no alarms or restrictions were implemented so that low-level users couldn’t just download terabytes of classified data. If any activity logs were kept, it does not look like anyone checked them.”

In other words, pointing the finger of blame at a few (two) bad apples (Pfc. Manning and Foreign Service Officer/author Van Buren) “… gets rid of a ‘troublemaker,’ and the Bureau of Diplomatic Security people can claim that they are ‘doing something’ about the WikiLeaks drip that continues even while they fiddle.”  Yet the State Department and the Department of Defense still refuse to acknowledge the systemic problems of trying to provide UNRESTRICTED and UNTRACEABLE ACCESS to ALL CABLES to all LEVELS of employees, from the highest administrative levels at State and Defense to the lowliest of the low (a Private First Class on probation, or a contractor, like Aaron Barr, working in White Hat or Black Hat ops).  And according to Homeland Security Today, there are 3 million people (not just Americans, btw) with “secret” clearance and “only” half a million with access to SIPRNet!

This still strikes me as an example of the US acting like ostriches and burying its head so we will not have to acknowledge the serious problems that are all around us.  Mark my words: the system is still broken, and even though certain changes have been instituted (thumb drive bans), we have a much more serious and systemic problem which few dare to acknowledge.  What’s the solution?  Better appraisal and better records management!


Keep Bit Rot at Bay: Change is Afoot as LoC’s DPOE Trains the Trainers 2011/09/20

Posted by nydawg in Archives, Best Practices, Digital Archives, Digital Archiving, Digital Preservation, Information Technology (IT), Media.

This was forwarded to me by a nydawg member who subscribes to the UK’s Digital Preservation listserv.  I don’t know if it’s been posted publicly in the US, but I guess this first one is by invitation-only.  I would LOVE to hear what they are teaching and how they are doing it, so I hope someday to attend as well.

Library of Congress To Launch New Corps of Digital Preservation Trainers

The Digital Preservation Outreach and Education program at the Library of Congress will hold its first national train-the-trainer workshop on September 20-23, 2011, in Washington, DC.

The DPOE Baseline Workshop will produce a corps of trainers who are equipped to teach others, in their home regions across the U.S., the basic principles and practices of preserving digital materials.  Examples of such materials include websites; emails; digital photos, music, and videos; and official records.

The 24 students in the workshop (first in a projected series) are professionals from a variety of backgrounds who were selected from a nationwide applicant pool to represent their home regions, and who have at least some familiarity with community-based training and with digital preservation. They will be instructed by the following subject matter experts:

*   Nancy McGovern, Inter-university Consortium for Political and Social  Research, University of Michigan
*   Robin Dale, LYRASIS
*   Mary Molinaro, University of Kentucky Libraries
*   Katherine Skinner, Educopia Institute and MetaArchive Cooperative
*   Michael Thuman, Tessella
*   Helen Tibbo, School of Information and Library Science, University of  North Carolina at Chapel Hill, and Society of American Archivists.

The curriculum has been developed by the DPOE staff and expert volunteer advisors and informed by DPOE-conducted research–including a nationwide needs-assessment survey and a review of curricula in existing training programs. An outcome of the September workshop will be for each participant to, in turn, hold at least one basic-level digital-preservation workshop in his or her home U.S. region by mid-2012.

The intent of the workshop is to share high-quality training in digital preservation, based upon a standardized set of core principles, across the nation.  In time, the goal is to make the training available and affordable to virtually any interested organization or individual.

The Library’s September 2011 workshop is invitation-only, but informational and media inquiries are welcome to George Coulbourne, DPOE Program Director, at gcou@loc.gov.

The Library created DPOE in 2010.  Its mission is to foster national outreach and education to encourage individuals and organizations to actively preserve their digital content, building on a collaborative network of instructors, contributors and institutional partners.  The DPOE website is www.loc.gov/dpoe (http://digitalpreservation.gov/education/).  Check out the curriculum and course offerings there.

 


Authors’ Guild Sues HathiTrust for Using Unauthorized Scans 2011/09/20

Posted by nydawg in Copyright, Digital Preservation, Intellectual Property, Media.

A few months ago, Judge Denny Chin put the kibosh on GoogleBooks’ attempt to digitize millions of library books and provide (mostly limited) access to the OCR.  Well, at that time, the good news was that “HathiTrust, an organization set up to help them archive and distribute digital works” was still doing important work, but now HathiTrust is named as a defendant.  “The suit seeks to block two separate efforts. In the first, the universities have created a pooled digital archive of the contents of their libraries, maintained by the Hathitrust. No one contests that these works remain in copyright, or that the universities have rights to the nondigital forms of these works. What the authors object to is the fact that the digital works are derived from an unauthorized scan, and will be stored in a single archive that is no longer under the control of the university from which the scan was derived. The suit suggests that the security of this archive is also suspect, and may allow the mass release of copyrighted work.

“A separate issue in the suit is an orphaned works project started by the Hathitrust that focuses on some of the works within this archive. The group is attempting to identify out-of-copyright books, and those where the ownership of copyright cannot be established. If attempts to locate and contact any copyright holders fail, and the work is no longer commercially available, the Hathitrust will start providing digital copies to students without restrictions. This has not gone over well. The executive director of the Australian Society of Authors, Angelo Loukakis, stated, ‘This group of American universities has no authority to decide whether, when or how authors forfeit their copyright protection. These aren’t orphaned books, they’re abducted books.’”  Read the Ars Technica article.

And if you’re still confused about the legal issues behind GoogleBooks’ recent copyright-infringement problems, with an eye towards orphans, out-of-copyright and copyrighted materials, here’s an excellent multimedia presentation at Harvard by Lawrence Lessig (in which, among other things, he makes the argument that tigers, as cubs, are extremely cute).  To read why he thinks it is a “path to insanity,” check out TechCrunch or his longer “For the Love of Culture” essay in The New Republic.

DMCA, DRM and The unFair Use Act 2011/09/09

Posted by nydawg in Archives, Copyright, Digital Preservation, Information Literacy, Intellectual Property, Media, Privacy & Security.

A few weeks ago, I reserved a copy of a Clay Shirky e-Book to take with me to Chicago.  When the book became available from NYPL, I was excited and hoped it would be a DRM (Digital Rights Management)-free PDF copy so I could download it (click the emailed link) to my netbook and transfer it to my eReader Tablet.  (It was my first time, so I was clueless.)  Oh well.  Obviously you can’t do that. . . .  or maybe you can, on a Sony eReader and the York Library.

Yesterday an old friend on facebook asked about borrowing eComic Books with the intent of ultimately preserving on some portable medium.  So I was intrigued enough to do a little research on DRM and found this informative piece from the ASIS&T Bulletin website: “Digital rights management (DRM) is commonly defined as the set of technological protection measures (TPM) by which rights holders prevent the use of digital content they license in ways that could compromise the commercial value of their products.  Restrictions on such uses as downloading, printing, saving and emailing content are encoded directly in the products or the hardware needed to use them and are therefore in immediate effect.”

The whole article is worth reading, but this one part caught my eye: “The New York Public Library (NYPL), for instance, has been considering bringing its digitized collection of dance and performance videos closer to the public outside the NYPL system as long as it is possible to restrict access to this online content to library locations only. These examples show that DRM may actually provide opportunities to expand access to online materials in ways previously not possible.”  The essay continues by examining the DMCA and its relation to DRM, pointing out that “Since the DMCA was enacted in 1998, the Library of Congress has enforced exceptions three times – in 2000, 2003 and 2006 – and was scheduled to do so again in 2009. Of the six exceptions passed in 2006, one specifically allows film and media studies professors to circumvent TPM to make film clip compilations for coursework using DVD copies held by their institution’s film-studies library. A movement has been underway to expand this exception to include K-12 educators, all subject areas and all legally obtained copies.”

And to give you a sense of what is at stake, the author writes “In February 2007, the Fair Use Act was introduced in Congress, but never passed. It would have codified into law all six exceptions from 2006, which are currently rule-made and remain subject to periodic reviews. The Fair Use Act would have permitted the circumvention of TPM for, among other cases, (1) access to public domain works, (2) access to works of public interest for criticism, scholarship, reporting or research, (3) compilations of educational film clips and (4) preservation in libraries. The latter is of particular importance as the various media with historical content, including DVDs, begin to deteriorate. Smith argued that what frightens publishers about the Fair Use Act is that, if implemented, it would render ineffectual the anti-circumvention rules. Fair use would constitute an exception so broad that decisions regarding the right to circumvent would often be made after the actual circumvention. If a content owner objected, the user could take the matter to court, and only then would a judge decide whether fair use can justify that particular circumvention. The Fair Use Act would thus defeat the anti-circumvention rule’s self-help purpose.”

So in other words, the encryption that libraries are using is controlling access to their eBooks, and the anti-copy encryption that companies put on their deteriorating DVDs is conspiring with the law to keep libraries from providing open access to our resources in the future.  I say instead of a “Fair Use Act”, we need a “Fair Copy Act” so libraries will be free to begin the media refreshing, digital migration and whatever else they need to do to make sure their media collections do not become time capsules: moments of time captured on obsolete media formats.

 

Curating Google Doodle Highlights incl. Freddie Mercury’s Tribute 2011/09/06

Posted by nydawg in Curating, Digital Archives, Digital Archiving, Digital Preservation, Information Technology (IT), Intellectual Property, Media.

Hi everyone: Maybe this isn’t totally an archival or curatorial issue, but in some ways, these GoogleDoodles do what a good archive strives to do: provide easy access to available information and resources.  So pump up the volume, click on today’s GoogleDoodle, look for the cc [closed-captioning] button for lyrics to sing-along as you watch an animated music video tribute to the late great Queen singer Freddie Mercury.  http://www.google.com/

and check out Queen guitarist Brian May’s blog tribute here.

But if you want more of those awesome GoogleDoodles, don’t forget some of my favorites, including: Alexander Calder’s moving mobiles; the playable and recordable Les Paul guitar; John Lennon’s hand-drawn Imagine (animation); Martha Graham’s “Thought of You” dance; Mr. Men and Little Miss; Charlie Chaplin’s 122nd Birthday; and who can forget GoogleDoodle Dots, Jules Verne or the Google PacMan?

Those are some of my favorites, but I can probably think of a dozen more if I put my mind to it. . . . If you’re interested in learning about the doodle history, check it out here.  And if I’m missing any good ones, please let me know!

Arab Spring Diplomatics & Libyan Records Management 2011/09/05

Posted by nydawg in Archives, Best Practices, Digital Archives, Digital Preservation, Electronic Records, Information Technology (IT), Media, Records Management.

At the 75th Annual Meeting of the SAA (Society of American Archivists) last week, I had the fortunate opportunity to attend many very interesting panels, speeches and discussions on archives, archival education, standards, electronic records, digital forensics, photography archives and digital media, and my mind is still reeling.  But when I heard this story on the radio news, I needed to double-check.

As you all know, the Arab Spring is the revolutionary wave of demonstrations and protests in the Arab world. Since 18 December 2010 there have been revolutions in Tunisia and Egypt; civil uprisings in Bahrain, Syria and Yemen; a civil war in Libya resulting in the fall of the regime there; major protests in Algeria, Iraq, Jordan, Morocco and Oman; and minor protests in Kuwait, Lebanon, Mauritania, Saudi Arabia, Sudan and Western Sahara!  Egyptian President Hosni Mubarak resigned (or retired), and there’s a civil war going on in Libya.  Meanwhile, with poor records management, documents were found in Libya’s External Security agency headquarters showing that the US was firmly on their side in the War on Terror:

“CIA moved to establish ‘a permanent presence’ in Libya in 2004, according to a note from Stephen Kappes, at the time the No. 2 in the CIA’s clandestine service, to Libya’s then-intelligence chief, Moussa Koussa.  Secret documents unearthed by human rights activists indicate the CIA and MI6 had very close relations with Libya’s Gadhafi regime in 2004.

“The memo began ‘Dear Musa,’ and was signed by hand, ‘Steve.’ Mr. Kappes was a critical player in the secret negotiations that led to Libyan leader Col. Moammar Gadhafi’s 2003 decision to give up his nuclear program. Through a spokeswoman, Mr. Kappes, who has retired from the agency, declined to comment.  A U.S. official said Libya had showed progress at the time. ‘Let’s keep in mind the context here: By 2004, the U.S. had successfully convinced the Libyan government to renounce its nuclear-weapons program and to help stop terrorists who were actively targeting Americans in the U.S. and abroad,’ the official said.”

Shudder.

So I guess that means that if all of those documents from the CIA are secret, there would be no mechanism for tracing a record (at least on the US side).  In other words, every time a record is sent, copied or moved, a new version is created, but where is the original?  Depending on the operating system, the metadata may show a new Date Created.  How will anybody be able to find an authentic electronic record when it’s still stored on one person’s local system, which is probably upgraded every few years?
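One standard answer, which I’d hope any serious plan would include, is fixity: record a cryptographic checksum at ingest and re-check it against every later copy.  Timestamps and operating systems change; the digest does not.  A minimal sketch (the record contents are invented):

```python
import hashlib

def fixity(data: bytes) -> str:
    """SHA-256 digest, recorded at ingest and checked on every later copy."""
    return hashlib.sha256(data).hexdigest()

original = b"Dear Musa, ..."    # the record as ingested (invented content)
copy = bytes(original)          # a copy made years later, on new hardware

# Filesystem timestamps on the copy will differ; the digest will not.
assert fixity(copy) == fixity(original)

tampered = original + b" [edited]"
print(fixity(tampered) == fixity(original))  # False -- not the same record
```

The digest travels with the descriptive metadata, so any future copy, on any system, can be checked against the record as it was appraised.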

There is a better way, a paradigm shift: the Australian records continuum, which “certainly provides a better view of reality than an approach that separates space and time,” points toward a practice in which not all [useless] data created is aggregated.  With more and better appraisal of critical, analytical, technical and IP content, we can select and describe the born-digital assets more completely and separate the wheat from the chaff, the needles from the haystacks, the molehills from the mountains, and (wait for it) . . . see the forest for the trees.  By storing fewer assets and electronic records more carefully, we can actually guarantee better results.  Otherwise, we are simply pawns in the games of risk played (quite successfully) by IT departments assuring (but not insuring) the higher-ups that “we are archiving: we back up every week.”  [For those who are wondering: when institutions “back up,” they back up the assets one week, move the tapes offsite, and overwrite the assets the following week.  They don’t archive-to-tape for long-term preservation.]
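For anyone who finds the tape-rotation point abstract, a toy simulation (invented snapshots) makes it painfully clear: a rotating backup keeps only the most recent snapshots, while an append-only archive keeps the history:

```python
# Toy contrast between a two-tape rotating backup and an append-only archive.
rotating_tapes = [None, None]   # each tape is overwritten every other week
archive = []                    # append-only; nothing is ever overwritten

for week, snapshot in enumerate(["v1", "v2", "v3", "v4"]):
    rotating_tapes[week % 2] = snapshot   # overwrites whatever was there
    archive.append(snapshot)

print(rotating_tapes)  # ['v3', 'v4'] -- v1 and v2 are gone for good
print(archive)         # ['v1', 'v2', 'v3', 'v4'] -- full history retained
```

Backup answers “can we restore last week?”; archiving answers “can we produce the authentic record years from now?”  They are different questions, and only one of them is preservation.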

Diplomatics may present a way for ethical archivists into the world of IT, especially when it comes down to Digital Forensics.  But the point I’m ultimately trying to make, I think, is that electronic (or born-digital) records management requires new skills, strategies, processes, standards, plans, goals and better practices than the status quo.  And this seems to be the big elephant in the room that nobody dares describe.