
Hello there,

Long story short: I have a big PC game collection from the golden era (1995-2010), mostly digital ISOs and BINs, and limited space to preserve them. I don't trust clouds in any form, so I prefer old-school external HDDs for storage. For me 7-Zip is a good way to archive them and save some space, but I recently found out that if you convert a BIN or ISO file to ECM and then archive it with 7-Zip (ultra compression), the final archive is in most cases close to half the size of what you get by compressing the original, non-ECM file.

Example:

Original BIN file (rld-cl1.bin): 672 MB

Original BIN compressed with 7-Zip on ultra compression (rld-cl1.7z): 268 MB

Original BIN converted to ECM (rld-cl1.bin.ecm): 586 MB

ECM file compressed with 7-Zip on ultra compression (rld-cl1.bin.ecm.7z): 195 MB

So the two 7z archives differ by 73 MB in this case.
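For reference, this is roughly the pipeline I mean, sketched as a Python script. It assumes the classic ecm/unecm command-line tools and the 7z CLI are installed and on the PATH; exact tool names and defaults can differ between builds, and the filenames are just the ones from the example above.

```python
import subprocess

BIN = "rld-cl1.bin"

# Step 1: convert the raw BIN to ECM (produces rld-cl1.bin.ecm next to the input).
subprocess.run(["ecm", BIN], check=True)

# Step 2: pack the .ecm file with 7-Zip at ultra compression (-mx=9).
subprocess.run(["7z", "a", "-mx=9", BIN + ".ecm.7z", BIN + ".ecm"], check=True)

# For comparison: pack the untouched BIN the same way.
subprocess.run(["7z", "a", "-mx=9", BIN + ".7z", BIN], check=True)
```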

Is this method any good? Can the BINs be damaged in any way if you ECM them, 7-Zip them, extract them, and unECM them back to BIN? I noticed in the properties of an unECM'ed BIN that the file no longer has its original Last Modified date (in this case it was from 2005).

Are there other methods to save space? I don't care much about how long archiving/extracting takes; I just want to be sure that all the files remain untouched in this process. Thanks

[-] count0@lemmy.dbzer0.com 15 points 1 year ago

You might also want to look into Zstandard: it gives much better ratios in orders of magnitude less time on modern hardware.

A bit down on the page you can find versions of the 7-zip graphical archive manager extended with this Zstandard algorithm.

Like normal 7-zip/traditional zip/rar/gzip/bz2/..., Zstandard is completely (guaranteed) lossless.
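For completeness, here's a minimal sketch of driving Zstandard from a script. It assumes the third-party zstandard Python package (pip install zstandard); that package isn't needed for the 7-Zip Zstd builds mentioned above, this is just another way to use the same algorithm.

```python
import zstandard as zstd

# Compress a BIN image at a high level (19 is near the top of the normal range).
cctx = zstd.ZstdCompressor(level=19)
with open("rld-cl1.bin", "rb") as src, open("rld-cl1.bin.zst", "wb") as dst:
    cctx.copy_stream(src, dst)

# Decompression reproduces the input byte for byte.
dctx = zstd.ZstdDecompressor()
with open("rld-cl1.bin.zst", "rb") as src, open("restored.bin", "wb") as dst:
    dctx.copy_stream(src, dst)
```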

(I don't really know about ECM at all, so I won't speak on that aspect.)

[-] count0@lemmy.dbzer0.com 2 points 1 year ago

(sorry about the multi posting, Jerboa was giving me network errors and it seemed like the comment hadn't gone through.)

[-] cauciuc@lemmy.dbzer0.com 1 points 1 year ago

Thanks, I will take a look.

[-] rm_dash_r_star@lemm.ee 7 points 1 year ago* (last edited 1 year ago)

Well, you're compressing it twice. Generally it's bad practice to compress a file twice, but it does happen a lot, for example a single zip file that packages several compressed archives. Personally I would live with the small amount of extra space to avoid using two tools instead of one. Other than that it doesn't hurt anything, and there's no inherent risk of file corruption.

[-] Monomate@lemm.ee 3 points 1 year ago

I read somewhere that ECM saves space by removing the error-detection/correction data from each sector of a BIN/ISO image.

It's already proven that discs can technically work without them. Dreamcast GD-ROMs are basically the practical application of this concept, with the goal of expanding CD-ROM storage from 700MB to 1GB.

That's why the size reduction is cumulative when the user applies ECM and then 7-Zip afterwards: ECM is really a trimming step rather than a compression scheme in the traditional sense.
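A rough sanity check against the numbers in the original post, assuming a standard raw Mode 1 CD sector (2352 bytes, of which 2048 bytes are user data and the rest is sync, header, EDC and ECC that can be regenerated, which is what ECM strips):

```python
# 672 MB of raw 2352-byte sectors, keeping roughly the 2048 data bytes per sector
raw_mb = 672
print(round(raw_mb * 2048 / 2352))  # ~585 MB, close to the 586 MB .ecm file above
```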

[-] sweBers@lemmy.fmhy.ml 1 points 1 year ago

I had only heard that there wasn't much merit to multiple layers of compression as there is only so much compression that can be done.

One similar solution to OP here is tarballs.

The only way to be certain is to reverse the process and then run a hash check to see if the result is identical to the original file.
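Something like this minimal sketch, assuming the archive has already been extracted and unECM'ed back to a .bin (file names are placeholders):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream the file so large BIN images don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

original = sha256_of("rld-cl1.bin")
restored = sha256_of("rld-cl1.restored.bin")  # produced by extract + unecm
print("identical" if original == restored else "MISMATCH - do not trust the archive")
```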

[-] qprimed@lemmy.ml 6 points 1 year ago* (last edited 1 year ago)

you probably already have your answer in your post. if an archival copy of your data is desired, then any modification to the source is not good.

virtually any lossless archiver/compressor (bz2, xz, 7z, etc.) will give you back the bit-for-bit original. pre-processing the image with ECM may not - you decide if the small savings in storage are worth it. considering that ECM is a compression method and already-compressed data is harder to re-compress... based on your results, I would say ECM is a lossy process as compared to the source - I have no way to confirm this, however, without looking at the specs.

tl;dr: don't (potentially) lossily pre-process data and expect it to meaningfully count as a clean "archive".

edit: clarification... my use of lossy here refers to the loss of (likely) redundant or non-useful data from the source. stripping this data may have zero functional effect on the recovered binary, but archival purists would likely be horrified ;-)

[-] cauciuc@lemmy.dbzer0.com 4 points 1 year ago

Thanks for your answer. Yeah, I'm worried about that "Last Modified" date thing. For now I will stick to 7z compression only for storage, just to be sure. Maybe I will use the ECM method in the near future, once I'm able to duplicate my files onto a second backup drive (or drives).
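If the lost timestamp is the only worry, it can also be captured before converting and put back afterwards; a small standard-library sketch (paths are placeholders):

```python
import os

SOURCE = "rld-cl1.bin"    # original file, before ECM
RESTORED = "rld-cl1.bin"  # same path after extract + unecm has recreated it

# Before archiving: remember the original access/modification times.
st = os.stat(SOURCE)
saved_times = (st.st_atime, st.st_mtime)

# ... ecm / 7z / extract / unecm happen in between ...

# After restoring: put the remembered timestamps back on the recreated file.
os.utime(RESTORED, saved_times)
```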

Use checksums and see? How much space are we talking here?

[-] cauciuc@lemmy.dbzer0.com 2 points 1 year ago

The checksums seem to be the same. The savings are around 35% on average; it depends on the files stored in the BIN/ISO image.

[-] ICastFist@programming.dev 5 points 1 year ago

I wonder if uploading your stuff to archive.org could be a good choice? Download speeds can be terrible, but I doubt your stuff would be taken down from there.

[-] cauciuc@lemmy.dbzer0.com 3 points 1 year ago

No way, I don't trust cloud storage at all. Offline storage is mine forever; cloud storage is only temporarily yours, with an invisible countdown timer. And by the way, I read a few months ago that Archive.org has an open lawsuit regarding digital rights, or something like that...

[-] cccc@aussie.zone 6 points 1 year ago

It could still be worthwhile, just not as your only option. Do your local archiving as well, but having it uploaded elsewhere could come in handy if the shit hits the fan.

[-] ICastFist@programming.dev 2 points 1 year ago

Fair enough. I recall seeing that lawsuit; I looked around and, bad news, the Archive lost: https://time.com/6266147/internet-archive-copyright-infringement-books-lawsuit/

It was only about e-books, but anyone suing them again will use that as a precedent. 😔
