
Digital Media Archiving Challenges Hollywood 155

HarryCaul writes "Movies are moving to digital, but what about long-term archiving of the master source materials? It turns out to be harder for digital media than for contemporary analog. Data is being lost, and studios have to learn to cope. Phil Feiner of the AMPAS sci-tech division says that when he worked on studio feature films he 'found missing frames or corrupted data on 40% of the data tapes that came in from digital intermediate houses.' How to deal with it? Regular migration from old media to new. Grover Crisp says Sony has put in place a program of migrating every two to three years. Other studios are following suit, but what about indie features? Will we lose films like we lost the originals of the '20s?"
  • BitTorrent? (Score:1, Interesting)

    by Anonymous Coward
    Sounds like a good archival method to me...
  • by Anonymous Coward on Saturday April 21, 2007 @08:21AM (#18823195)
    If they are concerned about digital data being lost, why not introduce redundancy? Make sure that the data is stored in as many locations as possible (and also at high quality). Luckily the Internet already has a solution for this problem: BitTorrent.
    • Re: (Score:2, Insightful)

      by HeroreV ( 869368 )
      BitTorrent is a way to transfer information, not to store it. BitTorrent wouldn't help any here.

      Maybe you were trying to say a solution would be making the materials available to the general public and hoping they archive the data. Somehow I don't think many people are interested in downloading hundreds of gigs of raw footage and keeping a torrent going for decades.
  • by bheer ( 633842 ) <rbheerNO@SPAMgmail.com> on Saturday April 21, 2007 @08:23AM (#18823211)
    "... only wimps use backup: _real_ men just upload their important stuff on ftp, and let the rest of the world mirror it."

    I think it'd work well for the MPAA.

  • They can manage to "lose" the digital masters for every film Nicolas Cage has been in?
    • (What are the odds) They can manage to "lose" the digital masters for every film Nicolas Cage has been in?

      Not good enough.
  • by Anonymous Coward
    Movies that suck will be lost.

    Movies that do not suck will be turned into AVI (or MP4 for the quality nitpickers who watch movies in front of their PC).

    Hollywood produces them. Piracy sorts them.
  • by Anonymous Coward
    Archive final cut to 35mm film.
  • by eneville ( 745111 )
    Personally, I'd like to see methods like OpenAFS [openafs.org] with a RAID/SAN data store. A great benefit of AFS is that it's ideal for working over a large IP network. Every night issue an update for all the nodes, a little like rsync I suppose in this respect, but it's ideal for a large infrastructure. Of course things like MD5 sums should be used on the files, perhaps split the large files with RAR or something, maybe use a .PAR file also. You know.. I think the pirate world has this sort of thing sorted already. Why don'
    • Splitting the data files wouldn't be enough. You'll need at least 3 copies to compare against one another to see if there's bitrot. If there's a difference in one file compared to the other two it is most likely that part is corrupted, and it should be overwritten by the data from the other copies.
      I don't know how redundant .par files are, but if it can be redundant like this it should work.
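      A minimal sketch of that three-way comparison (illustrative Python, not any real archival scrubber): rebuild each byte by majority vote across three replicas, so a corruption in any single copy is outvoted by the other two.

```python
def majority_repair(copies: list[bytes]) -> bytes:
    """Given 3+ equal-length replicas, rebuild each byte by majority vote."""
    assert len(copies) >= 3 and len(set(map(len, copies))) == 1
    repaired = bytearray()
    for column in zip(*copies):
        # The most common byte value at this offset wins; a flipped
        # byte in one replica is outvoted by the other two.
        repaired.append(max(set(column), key=column.count))
    return bytes(repaired)
```

      Real systems (ZFS scrubbing, PAR files) do this per-block with checksums and parity rather than byte-by-byte voting, but the principle is the same.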
      • Splitting the data files wouldn't be enough. You'll need at least 3 copies to compare against one another to see if there's bitrot. If there's a difference in one file compared to the other two it is most likely that part is corrupted, and it should be overwritten by the data from the other copies.
        I don't know how redundant .par files are, but if it can be redundant like this it should work.

        Well, this is where AFS comes into its own. It's a DFS, so adding replica sites is quite easy; then the normal admin tasks of daily checksums from the MD5s come into play. Then either rebuild from the .par or deal with it manually. Personally, if I were running things I'd have AFS to multiple data centres, or just house the stuff onsite, whichever is easiest. I imagine that this sort of data would extend to having daily/weekly backups of employee data on there also, things like video edits.

        • As far as I can see, what AFS provides is indeed the DFS structure, but on top of that there would have to be a process which checks for differences between the volumes and corrects any faults.
          The issue here wouldn't be so much as having a distributed set of data for better availability, but having a system that can detect and automatically correct data corruption.
          This is not something I can find as a feature in the AFS FAQ.
      • A good start would be to not use tape.

        I don't know what the actual percentage of tape failures is (and they're not telling), but in my own experience it's pretty high.

        Hard disks and PCs are cheap enough that every movie could have its own little RAID array somewhere.

        • Capacity of hard disks is nothing like capacity of magnetic tape. At Fermilab, we use tape because it'd be a real PITA to put dozens of petabytes on hard disks. CERN will soon have an even bigger problem in this regard.
  • by BlueParrot ( 965239 ) on Saturday April 21, 2007 @08:33AM (#18823269)
    This will only get worse because they insist on the stupid DRM schemes. If a drive crashes you can usually recover a fair portion of the data; if the drive is heavily encrypted and the crash takes out the key to your cipher, then you are fairly fucked. Sure, it is fine today when everybody and his mother has an HDMI-compliant player, but with the amount of key revocations that will likely be necessary as the scheme is cracked over and over again, sooner or later the increasing complexity of key management will cause keys to start getting lost. The issue is further complicated by having the "plain text" all in a central place rather than in everybody's home; a hurricane could easily take out a decade's worth of art that way. Of course none of this will happen, because the people who make decisions about where the unencrypted originals are stored have a good understanding of how cryptography works, which is why we have DRM to begin with ...
    • Uh, joke? (Score:3, Insightful)

      by yabos ( 719499 )
      You really think they'd have DRM on the masters? Is that a joke or are you that crazy?
    • Re: (Score:3, Funny)

      I can see it now...

      RIAA: Hello? Is this DVD Jon? No! Wait! Don't hang up. We have an... awkward... favor to ask.
    • "This will only get worse because they insist on the stupid DRM schemes. If a drive crashes you can usually recover a fair portion of the data, if the drive is heavily encrypted and the crash takes out the key to your cipher, then you are fairly fucked."

      DRM is for mass-market items like DVDs, not master copies. I hate DRM as much as the next guy but let's not take this to a level of silliness.
      • Okay. So what happens if the master copies are destroyed (say, a fire starts at the film storage facility) and only mass-produced copies are left?
        Are the film studios allowed to use DeCSS if it's the only way they can make new copies of certain films?
        If there are both good DVDs and not-so-good VHS tapes remaining after the master copies of the film are destroyed, will the DMCA force the studios to use the VHS tapes for their reconstructions?
        • Exactly! There are a number of old films where the only copy is not in a studio's film vault somewhere but in the home of a passionate collector. If that copy is in a mass-market format like DVD, we'll lose some of the original film print's quality but it'll still be a good representation of the film. But if that copy is an iTunes downloaded video, it's not going to be worth anything at all if iTunes is not still up and running. Which, in 70 years' time, would not be surprising at all.
  • ...but codecs are. Chances are we'll have the information in another hundred years but not the means to access it.
    • by makomk ( 752139 )
      ...but codecs are. Chances are we'll have the information in another hundred years but not the means to access it.

      I'm not convinced this is really a problem. Just make sure you choose widely-used file formats and codecs that are supported properly by ffmpeg, and as long as open source software survives people will probably be able to find a way to play them in 100 years' time. (If we're particularly lucky, they may still be playable with standard, widely-available media player applications. That's much
  • by Anonymous Coward
    Tape backup has a limited shelf life. There is an effect known as "bleed-through" that degrades data simply because of the physical layout of magnetic tape. The maximum shelf life of data on quality DDS media is about five years, so data on tape must be renewed at intervals shorter than the "best before" date.

    Additionally, daily backup tapes (differential or complete) have limited write cycles and must be replaced well before the manufacturer's recommended maximum write cycles is exceeded.

    Obviou
    • Re: (Score:3, Insightful)

      by Waffle Iron ( 339739 )

      Obviously optical media is not an acceptable backup solution, due to its many failure points.

      However, what if they sent data that needs to be archived permanently through the first stages of the DVD mastering process, and produced an etched glass master disk. It seems to me that such a disk should last forever as long as it is protected from physical damage.

      To avoid damage from creating new DVD stampers from the master if it needs to be read, maybe they could create a special archival reader based on el

      • by rtb61 ( 674572 )
        This process would of course kill the whole marketing concept of buying the same content over and over again (but it's better because it is a higher quality). At least with analogue you can sustain the marketing lie no matter how poorly the original analogue content was recorded.
  • by davidwr ( 791652 ) on Saturday April 21, 2007 @08:36AM (#18823285) Homepage Journal
    There are two problems:
    Data loss, where the data is actually lost. This is the equivalent of a scratch on a frame of the master negative. The cure is redundancy.

    Obsolescence, where the format becomes difficult to read after a period of time. The cure is lossless copying to new formats over time and/or keeping old equipment around.

    Another possible cure for the 2nd problem is to convert it to analog in an "easy to digitize" way.

    For example, simply "printing" the movie to 3 black-and-white filmstrips, one for each color, is considered archival. These can be rescanned later if needed. For better archiving, use larger film formats.
    Preserve each audio track in an archival analog format as well.

    Of course this doesn't preserve all the data that a digital filmmaking process has, but you aren't any worse off than you would have been with an analog film.

    If you want to, you can preserve each element of each scene separately, in an analog format or a completely-documented digital format but on an archival media, such as a "paper printout" stored on microfilm. I don't think most movie studios will go to this expense.
    • by XorNand ( 517466 )
      Instead of printing the actual picture to analog film, what about encoding the digital stream into a high-density barcode and printing it to microfilm? I have no idea how much media this would take but it could be completely lossless on an analog medium.
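      A rough sketch of that barcode idea (illustrative Python; nothing here reflects a real microfilm encoder): pack the byte stream into a 2D grid of black/white cells that could be printed and rescanned, then decode it back. A real system would wrap this in error-correcting parity such as Reed-Solomon.

```python
def bytes_to_grid(data: bytes, width: int = 64) -> list[list[int]]:
    """Encode bytes as rows of 0/1 cells, MSB first, zero-padded."""
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    bits += [0] * (-len(bits) % width)          # pad the last row
    return [bits[i:i + width] for i in range(0, len(bits), width)]

def grid_to_bytes(grid: list[list[int]], nbytes: int) -> bytes:
    """Decode the grid back to the original nbytes of data."""
    bits = [b for row in grid for b in row][:nbytes * 8]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[j:j + 8]))
        for j in range(0, len(bits), 8)
    )
```

      At 600 dpi and one cell per printed dot, an A4 page holds on the order of 4-5 MB this way, so a feature film would still need a very large number of frames; density claims beyond that should be treated with suspicion.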
    • The cure for obsolescence is to simply come up with a lossless format and stick to it. It doesn't matter what it is, just stick with it. If movie formats are in flux, then go with a standard that has endured the test of time, TIFF (it lasted 20 years and is still going strong) and convert each frame into a TIFF image. If the movie industry is worried about "leaking movies", they can just encrypt each frame with another old standard, and stick to that encryption.

      There really is no need to keep changing forma
      • TIFF is about one of the worst formats for archiving images imaginable; I have yet to see two fully interoperable, independent TIFF implementations.

        As for "encryption", people change encryption standards not because they like to, but because old encryption gets weak. If you're going to pick an encryption standard and stick with it, you might as well not bother with encryption at all.
    • If you want to, you can preserve each element of each scene separately, in an analog format or a completely-documented digital format but on an archival media, such as a "paper printout" stored on microfilm. I don't think most movie studios will go to this expense.

      It costs millions to make a movie... why not?
    • Fun facts:
      Before 1912, films could not be copyrighted. Photographs, however, could be copyrighted if a print was filed with the Library of Congress.
      Thomas Edison and other filmmakers got copyrights for their films by telling the Library of Congress they were submitting a long string of photographs, which was technically true. They submitted to the Library of Congress long strips of paper containing prints of every frame in the film.
      Many years later, people discovered how to make film negatives from the
  • Losing movies (Score:4, Informative)

    by 0123456 ( 636235 ) on Saturday April 21, 2007 @08:41AM (#18823305)
    "Turns out it's harder for digital media than for contemporary analog"

    The negatives of the original 'Wicker Man' movie were either burnt or buried under the M3 motorway. From what I remember, some of the original 'Babylon 5' negatives were eaten by rats. They're gone, nothing will ever bring them back, because they're analogue media which can't be copied without quality loss.

    The problem is the whole idea of a 'master copy' of the movie on media that goes obsolete. The benefit of digital data is that it can be copied any number of times without quality loss, so build a big RAID system and stick the movies on there. Over time it will be upgraded but the digital data will remain... the only time you'll put the data on tape will be for backups, though even then you'd probably be better copying it to other RAID servers at remote sites.
    • Re: (Score:3, Funny)

      by Dogtanian ( 588974 )

      From what I remember, some of the original 'Babylon 5' negatives were eaten by rats.
      When asked why they hadn't eaten the original Star Trek negatives, the rats replied "We're not that desperate."
    • by fermion ( 181285 )
      What you are talking about here is money. Physical copies are thrown out because there simply is no money to store and maintain them. Libraries are cleared out all the time. If one has a collection of movies that have not generated significant revenue in 20 years, but one needs room to store new movies, what is one to do? Build a new facility, or throw out obsolete product?

      With 100% digital stock, the money might be less, but the issues just as real. Perhaps no footage must be left on the cutti

      • But there is a difference between long term storage of analog media and digital media: the costs of storing digital media are dropping exponentially. What used to take a rack to store five years ago is now stored on one or two servers and will eventually be down to one hard drive.

        Of course, you'd want to have multiple computers in separate locations for redundancy (which puts a lower limit on cost), but as time goes on, you'll need fewer copies at each datacenter. Combine electromagnetic storage with a coupl
      • Have you watched Turner Classic Movies lately?
        There is no such thing as an obsolete film, not as long as there are film fanatics. The studios will remove less-than-profitable films from the market, at least temporarily. They'll destroy physical copies, but they don't usually destroy all the physical copies of a film: after a few years off the market, a new generation of film fanatics will be curious about that film, and the studios can make another small profit then by reissuing it. Repeat this cycle of
    • they're analogue media which can't be copied without quality loss

      Yes, but a high-enough sampling rate and resolution make you not care [wikipedia.org].

  • by oneiros27 ( 46144 ) on Saturday April 21, 2007 @08:49AM (#18823341) Homepage
    This isn't news -- at least not to those of us who deal with data.

    The typical procedure is to do a media refresh (ie, copy it) every few years, and to check for damage. There are concepts like LOCKSS [lockss.org] (Lots of Copies Keeps Stuff Safe), so those joking about BitTorrent aren't that far off, but it's a little more structured than that.

    Dan Cohen [dancohen.org] gave a talk recently on "Can Today's Scientific Data Be Preserved? The Specter of a 'Digital Dark Age'", which touched on not only the issue of media failure, but also the loss of the knowledge to extract the encoded information. (much like the 'lost languages' that we don't understand now, how do we make sure that future generations have the necessary hardware and software to get the data back out?)

    What's disappointing is just how fast the media is failing. Vendors give a 'mean time to failure' estimate that's based on perfect storage and that they have no real way of testing (because, well, if you say it's 40 years, are we going to have to wait 40 years before using it?). Even if you're duplicating your tapes, what happens when all of the copies were put on the same potentially bad batch of tapes?

    Quite likely, we're going to lose data. And some of it's going to be because we no longer have copies of the data. The rest is going to be lost because there's so much crap being saved that doesn't need to be that we can't find stuff that still has value in the future.
  • Publish them on piratebay and let the world be the backup
  • ... and also need to ensure no parts of Jar-Jar are lost/modified due to data corruption. Hot Pink Jar-Jar, anyone?
  • Put it on the p2p networks and it will be available forever :)
  • Doctor Who (Score:3, Insightful)

    by Eudial ( 590661 ) on Saturday April 21, 2007 @09:10AM (#18823451)
    I think the countless lost Doctor Who episodes are a good example of how analog video storage isn't perfect either.
      I think the countless lost Doctor Who episodes are a good example of how analog video storage isn't perfect either.
      That's because in most cases they *deliberately* got rid of them to free up space. It's nothing to do with deterioration of the original videotapes/film.
      • by Eudial ( 590661 )

        That's because in most cases they *deliberately* got rid of them to free up space. It's nothing to do with deterioration of the original videotapes/film.


        Though, one has to consider the human factor of film storage, not just the technical part.
  • Have Google build them a redundant cluster like Google's own, couple of thousand machines, couple of petabytes.... shouldn't be too hard :)
  • Mix in some redundancy and use RAID-like computations to recover bad segments of tape. The pirate world has had this solved for quite some time.
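    The RAID-like computation can be as simple as one XOR parity block: XOR all the data blocks together, and any single lost block is recoverable by XORing the survivors with the parity (a sketch of the principle, not PAR's actual Reed-Solomon scheme, which tolerates multiple losses).

```python
def xor_blocks(blocks: list[bytes]) -> bytes:
    """XOR equal-length blocks together; this is both the parity
    computation and the recovery computation."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)
```

    Because XOR is its own inverse, `xor_blocks(surviving_blocks + [parity])` yields exactly the missing block.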
  • by wbren ( 682133 ) on Saturday April 21, 2007 @09:32AM (#18823605) Homepage
    I have an archive of most of your _good_ material stored on my hard drives.
    • by Cheeze ( 12756 )
      wow, that's about 40GB max assuming you have them encoded in xvid.

      The problem is they said they were storing them on data tapes. Why are they using tape? Can't they write it out to digital media like DVD or laser disc?
      • DVD is fine for end user, but it's actually pretty lossy. The studios need to store their stuff losslessly so they can do "something" with them later. When I'm doing a video I take it from DV tape (which is compressed, about 4:1 IIRC, but losslessly) edit it on the computer, render it back to DV, and put it back to tape. (for reference, DV is about 15 gigs per hour, and that's 720x480 w/ 2 channel audio) I then put the tape on my hermetically sealed and climate controlled "shelf" storage unit. Then I re
  • by AngryNick ( 891056 ) on Saturday April 21, 2007 @09:35AM (#18823613) Homepage Journal
    It seems to me that many filmmakers overestimate the artistic value of their work.

    Will we lose films like we lost the originals of the 20s?"

    A better question might be, "Will anyone really care that they can't watch a high-quality cut of The 40-Year-Old Virgin in the year 2087?" If we are really worried about losing the content of a movie, then archive it to film and accept the faults (loss of image quality, cost of storage, risk of damage, etc.).

    • The 40-Year-Old Virgin got positive reviews from critics, and Steve Carell is a respected actor from his work in The Office (American version). So, if there are still film fanatics in 2087, there will be people who want to see that film in as good a condition as possible.
  • Their solution takes the data from a digital intermediate and turns it into three-color separation negatives.
    In other words, they take the digital movie and turn it into good old-fashioned film.

    So they're going to be using equipment that utilises the analogue hole [wikipedia.org]?
    Sounds... hypocritical. The movie industry balks at us for archiving movies we already own, but they're doing it at a massive scale just to save their own ass.

  • fortunately nothing made since hollywood started using digital recording is worth archiving.

    delete it all and the problem is solved.
  • by PyrotekNX ( 548525 ) on Saturday April 21, 2007 @10:39AM (#18824081)

    The current trend in the archival industry is to convert everything to digital. Unfortunately scanning is often a destructive process. In my experience, I have scanned documents that were written over 200 years ago and were still legible. In order to scan these documents, we had to cut all the pages from the binding, which effectively destroys the document. The data was then burned onto DVD-R and sent back to the company. If there is any problem with a single disc, there would be a permanent loss of over 100,000 documents.

    DVD is fine for the consumer market; if the disc is damaged you can just buy another one. This is not the case with the film industry: once these original masters are gone, they are gone forever. Microfilm, however, can last for generations. Even if there is some degradation in the film stock, you can recover almost all the original data. Film can be split into its primary colors onto different reels of microfilm and later be re-joined.

    One of my duties in the scanning industry was to operate the microfilm scanner. In this case, these were documents, but any type of information could be theoretically stored. Current models are capable of scanning at least 600 dpi. One of the hardest things would be to rejoin the frames later on and make sure they are all in sync. The way a microfilm scanner works is that on traditional microfilm, there are small squares that mark each frame. The scanner scans continuously and the software searches for these squares known as blips and it will know where to capture the image. With the addition of medium blips for keyframes and large blips for chapters, you can be fairly certain that you will be able to retrieve all the information later. If there is a missing frame, you will only be missing 1 channel of color for that particular frame. This data can be digitally re-created later.
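    A hypothetical sketch of that blip scheme (the pixel thresholds are invented for illustration): classify the scanned marker widths into frame, keyframe, and chapter marks, then number the frames sequentially so the separate colour-separation reels can be resynced against each other.

```python
def classify_blips(widths_px: list[int]) -> list[str]:
    """Map measured blip widths (in pixels) to marker types."""
    kinds = []
    for w in widths_px:
        if w < 10:
            kinds.append("frame")      # small blip: ordinary frame
        elif w < 20:
            kinds.append("keyframe")   # medium blip: keyframe anchor
        else:
            kinds.append("chapter")    # large blip: chapter boundary
    return kinds

def frame_index(kinds: list[str]) -> dict[int, str]:
    """Number every blip sequentially; matching indices across the
    three colour reels identify the same frame."""
    return {i: k for i, k in enumerate(kinds)}
```

    With the keyframe and chapter blips as periodic anchors, a run of missing ordinary blips on one reel can be localized and, as the parent notes, that frame's single colour channel can be digitally re-created.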

    Unlike digital media, microfilm has been around for over 100 years. The images are stored optically rather than digitally so there is a minimal amount of equipment needed for retrieval. Reproduction of microfilm is relatively inexpensive and multiple copies can be produced from the master and can be stored in multiple off-site storage areas. If the master is digital, you can produce multiple copies that are all the same quality so there isn't a single original master. It may be possible to store the sound on microfilm as well. Software would have to be developed to encode and decode the data, but it is possible.

  • by Richard Kirk ( 535523 ) on Saturday April 21, 2007 @12:09PM (#18824809)

    Just back from Tinsel Town after talking to some of the dudes in the article. Still jetlagged, so it feels a bit unreal reading about it in Slashdot. Still, I'll do my best to explain why things are the way they are...

    A film is not often made by a single body. If you are shooting to film, then this will get handled by an editorial department. You may have a fast telecine scan for reviewing the material as dailies. Some of these scans may be used as low-resolution proxies for initial grades. Some chosen bits of film may get re-scanned on a slower pin-registered scanner for inclusion in the final film. Artificially rendered scenes and special effects may be done by specialist houses, then composited in a post-production house. Your film may have 25 4K images per second in the final version, but the data used to generate it is scattered all over the place - if you think a good IT department should be backing all this up, then you haven't worked on a film, my friend. As deadlines approach, people may be working stupid hours, and filling up all the available storage. Then the film gets released, and either makes a billion dollars or doesn't. Either way, the tension is off, people take holiday and zonk out. Nobody will be picking over the cutting-room floor or its digital equivalent looking for things that might be useful twenty years from now. By the time people are back from holiday, they don't know or don't care.

    Your end product may be big reels of negative film that you send to a film lab to make prints for cinemas. The lab should keep the golden master clean, and make most of the prints from a second copy. This would be a sensible time to make an archival print of the film. The lab can transfer the whole thing to black and white film. Black and white film does not fade, like conventional colour film does, even in the can. You are getting the print lab to do a pretty full backup of the released film when your people have all gone on holiday. These days you need to back up other stuff. The soundtrack is digital. You will have extra data for the releases in different formats (5:4 TV, 16:9 widescreen, IMAX, etcetera). Still, it is a lot better than nothing. But it is not often done.

    The other thing is knowing what to archive. Very little of the newsreel film I had to sit through as a child to get to the cartoon has survived. Key stuff like the Queen's Coronation or the outbreak of WW2 was clearly history, and put on a special shelf, but little of the day-to-day stuff survives. There is one cache that survived when a cinema closed and the tins of newsreel went into landfill. The cinema was in Alaska; the landfill was permafrost, and the film was kept in near-ideal refrigerated conditions. Apart from this fluke, it has probably all gone.

    There will probably be digital solutions in time. Increasingly, as we have to manage more different sorts of digital data, there is a need to organize and track everything, which ought to mean it is possible to archive all the essential bits that go into any production. Many other people have posted on the problems of knowing what is on (say) a FAT16 Windows 3.1 disk in some 1980's image format. You can keep copying the data to overcome the degradation of the physical medium, but you still have to know what it means. I know of a system for archiving film images, where the people who did the archiving left the company, and one of them took the laptop with them that had the archiving software, so the ability to read the archives went with them. Do you archive the archiving system? Then, do you archive the system that archived that? Yes - basically, that is exactly what people are proposing to do. But it takes a bit of organizing, and we are not there yet.

    Film, on the other hand, has visible images. The 35mm format has remained readable for over 100 years. Even where nitrate stock has flowed over time, we still know what shape it ought to have been. Sometimes we can get something back if we want it badly enough.

    A simple analogue solution may be to

  • It's a bigger problem than most people realize. Twenty years ago, the original footage shot for a film might be 3x what finally appears on the screen, maybe more for a really big-budget film. Today, not only will there be more raw material, there's far more intermediate work product. A big project might have twenty layers going into a final frame. During the project, all that stuff is stored. But then what? Where does all that stuff get archived? There will be terabytes of stuff for any major film.

    • Distribution includes storage.
      If the film is successful, more than one print run may be needed. If it's to be transferred to DVD, the masters may get hauled out for that. If it's a recent film, the same studio will be distributing the theater films and the DVDs. They'll have to have negatives on hand--and so they get to store those negatives.
  • I'm involved in some preservation of history & documentation, and the issue is both huge, and far ranging, not just movies. Here are a few issues, beyond those already stated:

    - volume of data. Not uncommon for people to go on a vacation and come back with 1,000 or 2,000 or 5,000 images on 2 or 4 gig SD cards off their digital camera. Who has the time to catalog them all? When film cost you - oh, say for argument's sake a dollar a shot - most people were very careful what they took pictures of to begin
  • Recently an article concerning high efficiency (~250GB on an A4 sheet) paper based storage [techworld.com] was posted [slashdot.org] on Slashdot. Assuming that the article wasn't a scam, then this would provide a good solution for long term archival. Long term archival of paper documents is well understood, provides massive redundancy through easy duplication and requires minimal maintenance.
    • It _was_ a scam, and you'd have to be horribly gullible and devoid of any kind of physics and IT knowledge not to realize it.
      • I've just read the article (I only remembered the headline) and I admit that this leaves my post completely redundant.

        Never mind...
  • So: what media have survived for centuries or more, not this puny 5 to 10 years' worth for digital tapes or discs? You got it: etch in stone. So it will be a little bulkier than the 10 Commandments (the original ones, not the movie :-) ) but it'll last thru everything except a major volcano.
    • Stuff carved into stone has a very, very low survival ratio -- orders of magnitude lower than you're probably expecting. Clay is even worse. It lasts through almost nothing, other than by astounding flukes. (The only reason any survives at all is because, well, astounding flukes happen.)
  • It would be an absolute bitch, but if you really want it to last, engrave the data on a superhard oxide ceramic. Think Sapphire CD engraved ever-so-slowly by laser ablation. It'll never, ever "rot," it'll never get scratched unless you blast it with diamond powder, and it's stable forever at room temperature. Then it'll play back in an ordinary CD drive. If you're smart, engrave plates with pictures depicting the encoding method and data format, starting with basic physics of light diffraction.

    What it co
  • ...just aren't that great for preservation, but making it digital is still a sound practice

    1) Digitize it
    2) Create parity data
    3) Write to known persistent media

    For example, microfilm as we know that will last much much longer than a HDD or tape. Various forms of etchings, glass disks and whatnot are possible and also far more durable. In any case, I think everything that's been in public distribution gets recorded and kept by someone these days, sure we might lose the pristine 4k master copy but an image of
