SilentBleppassin: I could be more interested if there were a transparent way at the filesystem level, like BTRFS/ZFS/NTFS can do; the default compression unsurprisingly does nothing to GOG installers. As far as I know, that's currently not possible.
Well, the standard compression is supposed to be fast and file-based, so no: it does not do much.

RAR and 7Z can do a lot better, especially if you create solid archives with a larger dictionary size.
But of course that also means decompression takes longer and you can't access single files, so they are no good for data which you constantly need, but they are great if you want to archive something.

I should try and see what they would do to a 160 GB installer like Cyberpunk or Mordor.
These I really don't use very often.
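The benefit of solid archiving can be sketched with Python's stdlib `lzma` (the same compressor family 7-Zip's LZMA2 belongs to; the file contents below are made up for illustration). File-based compression gives each file its own stream, so redundancy *between* files can't be exploited; one "solid" stream over all of them can:

```python
import lzma

# Ten near-identical "files" (standing in for similar game data files).
files = [(b"shared game data block " * 200) + str(i).encode() for i in range(10)]

# File-based compression: one independent stream per file.
per_file = sum(len(lzma.compress(f)) for f in files)

# "Solid" compression: a single stream over all files concatenated.
solid = len(lzma.compress(b"".join(files)))

print(per_file, solid)  # the solid total comes out noticeably smaller
```

In 7-Zip itself, `-ms=on` enables solid mode and `-md` sets the dictionary size.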
SilentBleppassin: I could be more interested if there were a transparent way at the filesystem level, like BTRFS/ZFS/NTFS can do; the default compression unsurprisingly does nothing to GOG installers. As far as I know, that's currently not possible.
Keep in mind that already-compressed data typically can't be compressed further. If you do get some gain, it comes from duplicate parts within the file or from uncompressed data such as metadata added to, say, JPEG files; it won't come from the main portion of the file.

The same goes for trying to compress video, MP3 files and other archives (zip, rar, 7z, gz, bz2, xz, arc, etc.): they are already compressed. Any gains will be very, very small for a lot of work.
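This is easy to verify with Python's stdlib `zlib`; here random bytes stand in for the high-entropy payload of an mp3/jpeg/zip:

```python
import os, zlib

# Plain text compresses well:
text = b"the quick brown fox jumps over the lazy dog " * 500
print(len(text), len(zlib.compress(text, 9)))  # second number is far smaller

# High-entropy data does not shrink at all; the compressor can only
# fall back to stored blocks, adding a little container overhead:
random_data = os.urandom(4096)
print(len(zlib.compress(random_data, 9)))  # slightly larger than 4096
```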
SilentBleppassin: Not finished; the GitHub repo demonstrates new bugs and other things. Not too suited for the long term. What if development eventually breaks something, making older archives unusable? I think a more stable format would be a better way to go. When ZSTD becomes more stable and around 14 years pass, sure, nothing to say against it then.
Having gone through many, many archiving formats in my lifetime, including lzh, bzip2 and rar, I know what happens even when such a rare situation does occur. The format version is bumped and older archives are still extractable "the old way", whereas new archives are created with the revised code. Ergo, even such events are transparent to end users.

If you want to wait more until you personally start using it, sure, but it's seen wide adoption at this point, with many Linux distros defaulting to zstd for their binary package compression. And why not, considering that at higher compression ratios it's comparable to lzma but up to 13x as fast in decompression.
Post edited December 27, 2022 by WinterSnowfall
HunchBluntley: 7-Zip is free, open-source, and quite popular (as far as compression utilities go).
SilentBleppassin: Popular doesn't always equal good, as with quite a few things.
Obviously. But I was speaking of popularity specifically in relation to its likelihood of vanishing from the Internet without a trace -- i.e., the more popular a piece of software, the less likely it is to go "poof" (especially if it's legally redistributable).

Despite my earlier post sounding a little like shilling, I wasn't so much praising the 7-Zip software as pointing out that 1) it's extremely light-weight, and 2) it's FOSS, thereby (in my view) potentially negating some of Braggadar's stated concerns. Beyond that, I have no dog in this fight; 7-Zip is simply "good enough" for my (and presumably many computer users') extremely limited file compression/decompression needs -- especially given that attractively negligible hard drive footprint.
Braggadar: Oh, of course. I have one version or another of 7zip installed on most of my recent hardware, but I have an old laptop or two which may or may not have it installed, and those laptops get the occasional run when I don't want to drag my new one in front of the TV to do some file sorting while watching something else.

Keeping with one archive format for my own personal archiving means once I've downloaded the file, packaged it how I want and then popped it onto my external HDDs, it's accessible for all my equipment or even a friend's equipment should they not have 7zip installed, ya know? [...]
I've never used the feature myself, but, as far as I know, 7-Zip can generate self-extracting .7Z archives which don't require the 7-Zip software to be installed on the computer on which they're to be extracted.

Braggadar: I use the term bloat not only to indicate size, but unwanted extra registry use and program integration into Windows itself. I try to keep my system as free of extra stuff as possible at all times, so anything extra is generally unwanted to me. I tend to like portable (non-installed) programs over installed ones for that reason as well. Fortunately 7-zip offers that as well iirc, but the context menu alone makes such an option less attractive XD.
I'm with you there. Portability's nice for a lot of software (including most games), but when it comes to certain things where I want integration with File Explorer, I opt to install, even if there's a portable option. I'm thinking of software like VLC Media Player, LibreOffice and 7-Zip.
SilentBleppassin: Another factor you do not count is deadly and random: the non-zero chance of your PC silently corrupting a game because it has no ECC RAM (error correction). This is not widely known. Typical consumer PCs are not enterprise-grade, and the industry considers the chance of corruption acceptable for non-enterprise users (I don't agree with that); as such, you must do it at least two or three times to make sure everything is alright, wasting a huge amount of time.
This begs the question - how many times have you encountered a problem that could positively be linked with lack of memory correction? The only time I ever encountered memory issues was in setting up and overclocking my system (and memory) where the whole point is to increase frequency until you hit problems (failing a Prime95 torture test in my case) then roll back (or in my case, increase voltage) until those problems vanish and you can run a several-hour torture test without problems.

Even if a bit-flip happens, the likelihood that it (a) occurs in an active piece of code or variable is small and (b) gets written to non-volatile storage is smaller. Consider your typical Windows application - if bit-flips were frequent you would see visible effects like buttons being pressed at random, text within windows changing and graphics becoming corrupted.

There may be situations where paying extra for ECC (plus a motherboard/chipset that can use it) is worth the extra re-assurance it provides, but for the vast majority of users, driver failure, hardware failure and program bugs are a far greater concern (see Superuser: How come bit flips aren't destroying my computer?).
SilentBleppassin: ...I preserve the 15TB unchanged on 2 enterprise-grade HDDs (the main plus a copy) with ECC RAM,
Well, with that much data there is a type of bit-flipping to worry about - hard disk bit-changes that slip past their checksum (which used to be CRC but is now apparently Reed-Solomon codes).

Enterprise disk drives are supposed to encounter 1 uncorrected bit read error every 10^16 bits - consumer disk drives 1 uncorrected error every 10^14 bits (Wikipedia source). Once you have a terabyte of data (8 x 10^12 bits), you are looking at about an 8% chance of an undetected (and uncorrected) error in your backup on a consumer drive, based on those specifications. A single "flipped bit" is unlikely to break a picture, video or audio file but will scupper an installer or any other executable file (see ServerFault: Is bit rot on hard drives a real problem? What can be done about it? for some useful discussion).
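That figure can be checked with a back-of-the-envelope calculation, treating each bit read as an independent trial at the quoted consumer-drive spec rate (an idealisation; real errors cluster, but it reproduces the ballpark):

```python
import math

bits = 8e12        # 1 TB of data ≈ 8 × 10^12 bits
ure_rate = 1e-14   # consumer spec: one unrecoverable error per 10^14 bits read

# P(at least one error) = 1 - (1 - rate)^bits, evaluated stably
# via log1p/expm1 to avoid precision loss with such tiny rates:
p = -math.expm1(bits * math.log1p(-ure_rate))
print(f"{p:.1%}")  # about 7.7%, i.e. roughly the 8% quoted above
```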

For large amounts (1TB+) of data, extra protection may be worthwhile: take a hash (fingerprint) of files before backing them up, check the hash on the copy (if you use a copy program that doesn't check the copied data) and, when updating (or making a new copy), verify those hashes by:
(a) renaming them (e.g. by adding an -old suffix);
(b) generating a new set of hashes;
(c) comparing the new set with the old using a file comparison program (expect changes due to additions, deletions and updates - but anything that can't be explained could be a good reason to retrieve that file from your last backup instead).
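A minimal sketch of that hash-then-compare workflow, using Python's stdlib `hashlib` (the function names here are my own invention, not Hash Check's):

```python
import hashlib
from pathlib import Path

def file_md5(path, chunk_size=1 << 20):
    """MD5 of a file, read in 1 MB chunks so multi-GB files don't fill RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

def hash_tree(root):
    """Map each file's path (relative to root) to its MD5 digest."""
    root = Path(root)
    return {str(p.relative_to(root)): file_md5(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def changed_files(old, new):
    """Files present in both snapshots whose content differs -
    unexplained entries here are candidates for restoring from backup."""
    return sorted(name for name in old.keys() & new.keys()
                  if old[name] != new[name])
```

Run `hash_tree` before backing up, keep the result as the "-old" set, and diff it against a fresh set whenever you update the copy.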

Hash Check is a small, simple program that integrates with Windows Explorer so you can create and verify hashes via the right-click menu (an updated version with a larger choice of fingerprint algorithms can be found at GitHub). I'd suggest using MD5 for this - ignore the recent claims of it being "broken" or "insecure" since you are using it to detect random data corruption rather than deliberate file impersonation.

However, I have found the fingerprint generation/verification options in Total Commander (an alternative file manager) to be rather faster - the fingerprint files it generates are compatible with Hash Check and vice versa. TC also has a file (and folder) comparison tool.
SilentBleppassin: Choosing a game leaves me lost every time. Everybody with small collections, you are lucky in my opinion; so much easier to deal with.
Bleh...the number of times I've looked at the looong list of games in my Start Menu and decided to fire up Solitaire or Minesweeper instead...
Post edited December 27, 2022 by AstralWanderer
HunchBluntley: I've never used the feature myself, but, as far as I know, 7-Zip can generate self-extracting .7Z archives which don't require the 7-Zip software to be installed on the computer on which they're to be extracted.
Typically this involves a pre-made mini extraction stub of 7z that the archive is simply appended to. Since 7z can scan a file for an archive that starts after the beginning, the stub just targets itself, along with a "target directory" of your choice and possibly a "run after extraction" command. If I'm correct, this is made possible because the EXE format includes a size parameter which will be smaller than the full file size including the appended data. That, or the OS loads the data as it needs it rather than everything all at once.
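The "stub with an archive appended" trick is easy to demonstrate with the zip format, whose directory sits at the end of the file, so readers tolerate leading data (7-Zip's SFX stub works along broadly similar lines; that part paraphrases the post above rather than 7-Zip's source):

```python
import io, zipfile

# A tiny archive in memory...
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("readme.txt", "hello")

# ...prepended with arbitrary stub bytes, standing in for the extractor EXE:
combined = io.BytesIO(b"FAKE-SFX-STUB" + buf.getvalue())

# The archive is still readable: the zip directory is located by scanning
# back from the *end* of the file, so the leading data is simply skipped.
with zipfile.ZipFile(combined) as z:
    print(z.read("readme.txt"))  # b'hello'
```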

AstralWanderer: This begs the question - how many times have you encountered a problem that could positively be linked with lack of memory correction?
Probably quite common. OSes tend to need rebooting; NASA, for example, reboots their systems in space every day for stability reasons. And I've seen cases where something glitches, not letting me use the sound or a program, and all related duplicates will not function on some feature. Restarting the app or OS deals with that.

As for corruption of game data or other? If it's recent, probably not too common.
AstralWanderer: Enterprise disk drives are supposed to encounter 1 uncorrected bit read error every 10^16 bits - consumer disk drives 1 uncorrected error every 10^14 bits (Wikipedia source). Once you have a terabyte of data (8 x 10^12 bits), you are looking at about an 8% chance of an undetected (and uncorrected) error in your backup on a consumer drive, based on those specifications. A single "flipped bit" is unlikely to break a picture, video or audio file but will scupper an installer or any other executable file
Recently, after a hard drive crash and recovering most of my data, I found a few of my DBZ anime files were corrupted. And that was a 2-4TB drive.
Post edited December 28, 2022 by rtcvb32
WinterSnowfall: If you want to wait more until you personally start using it, sure, but it's seen wide adoption at this point, with many Linux distros defaulting to zstd for their binaries package compression.
The same "many distros" adopted the broken-by-design XZ. The argument doesn't look convincing to me in light of that.

AstralWanderer: This begs the question - how many times have you encountered a problem that could positively be linked with lack of memory correction? [...] Even if a bit-flip happens, the likelihood that it (a) occurs in an active piece of code or variable is small and (b) gets written to non-volatile storage is smaller.
Should we not install fire alarms just because we have not yet encountered a fire? I take no chances on turning the house to ashes.

AstralWanderer: for the vast majority of users, driver failure, hardware failure and program bugs are a far greater concern
This kind of fire-hazard logic is exactly why corporations do not care about customer machines having ECC. "Who cares?" they all think, reading the quoted text in a good mood. "Nobody needs this; let's continue selling the usual. The customer is happy enough on less reliable machines anyway - look how many excuses." As I will say in return: no small issue excuses a big one, no big one excuses a small one; every issue shall be resolved to the best of our ability.

On cost: my ECC RAM came directly from Germany, ordered through work by our sysadmin in bulk, and was cheaper than local non-ECC prices back in another place and time; the HDDs were significantly cheaper abroad too. AMD motherboards are not Intel (Intel charges extra for ECC-capable motherboards): my inexpensive B450 motherboard handles the RAM fine, nothing extra paid.
Post edited December 28, 2022 by SilentBleppassin