PeaZip is a free cross-platform file archiver that provides a unified
portable GUI for many Open Source
technologies like 7-Zip, FreeArc, PAQ, UPX...
Create ARC, BZ2, GZ, *PAQ, PEA,
QUAD/BALZ, TAR, UPX, WIM, XZ, ZIP files.
Open and extract
ARJ, CAB, DMG, ISO, LHA, RAR, UDF, ZIPX files and more...
Features include extracting, creating, and converting multiple
archives at once, creating self-extracting archives, splitting/joining files,
strong encryption with two-factor authentication, an encrypted password
manager, secure deletion, finding duplicate files, calculating hashes,
and exporting job definitions as scripts.
PeaZip provides primitives to compute multiple hash and checksum algorithms,
and this feature can be used either to find
duplicate files (which have identical checksum/hash values) or to identify
corrupted files, which have a checksum/hash value different from a known
good one.
Compute error detection algorithms
The "Check files" entry in the "File tools" submenu (context menu) allows
verifying multiple hash and checksum algorithms on multiple files at once,
e.g. to compare a group of files to identify redundant ones, or to check
files for corruption when an original
checksum or hash value is
known (or when it can be calculated for reference from a known safe
copy of the data).
The selected algorithms are computed in a single pass, so reading the
disk (usually the main performance bottleneck) occurs only once,
speeding up the disk-bound part of the process.
The algorithms to be performed can be selected in Options >
Settings, in the File Tools tab; the output value of hashes and checksums can
be displayed as hexadecimal (HEX,
either LSB or MSB byte order) or in encoded form.
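PeaZip's exact display convention is not detailed here, but the LSB/MSB distinction is simply the byte order in which the same value is printed; a minimal Python illustration of the two renderings of one CRC32 value (the helper name is illustrative):

```python
import zlib

def crc32_hex(data):
    """Render a CRC32 value as hex in both byte orders."""
    crc = zlib.crc32(data)
    return {
        "msb": crc.to_bytes(4, "big").hex(),     # most significant byte first
        "lsb": crc.to_bytes(4, "little").hex(),  # least significant byte first
    }
```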
15 checksum and hash algorithms are currently supported:
checksum functions: Adler32, CRC16, CRC24, CRC32;
general-purpose hash functions such as
eDonkey/eMule, MD4, MD5;
cryptographically strong hashes:
SHA-2 family (SHA256, SHA512) and
SHA-3 family (SHA-3 256, SHA-3 512).
Users can choose faster checksums like CRC32 to
perform a quick comparison (e.g. to detect duplicate
files or test for casual data corruption errors), but it is
recommended to employ a cryptographically
secure hash function (like the SHA-2 family's SHA256 and
SHA512, the SHA-3 algorithms, or the AES-based Whirlpool512) to detect
corruption / forgery attacks, which might be crafted in order to pass
unnoticed by one (or some) of the non-secure algorithms by exploiting
collisions - cases where an algorithm maps different inputs to the same
output digest.
Using multiple functions at once, and especially relying on
strong hash functions such as Ripemd160, SHA-2, or Whirlpool, can defeat
attempts at forging identical-looking files, as it is computationally feasible
to find a collision (different inputs mapped to the same output) only for the
simpler checksum and hash functions.
This way, even a purposely crafted modification of a file would not
pass unnoticed by the more sophisticated detection algorithms, making it
possible not only to identify plain data corruption (e.g. a communication
or device fault) but also to defeat certain classes of attacks relying
on replacing original content with forged data.
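The weakness of short checksums can be demonstrated directly: because CRC32 has only 2^32 possible outputs, a birthday search over random inputs is expected to find two different byte strings with the same checksum after roughly 2^16 samples. A sketch in Python (the helper name is illustrative):

```python
import random
import zlib

def find_crc32_collision(max_tries=1_000_000, seed=0):
    """Birthday-search for two different inputs with the same CRC32.

    CRC32 has only 2**32 possible outputs, so distinct random inputs
    are expected to collide after roughly 2**16 samples - which is why
    CRC32 detects accidental corruption but cannot resist deliberate
    forgery."""
    rng = random.Random(seed)
    seen = {}
    for _ in range(max_tries):
        data = rng.getrandbits(128).to_bytes(16, "big")
        crc = zlib.crc32(data)
        if crc in seen and seen[crc] != data:
            return seen[crc], data  # two different inputs, same checksum
        seen[crc] = data
    return None
```

A cryptographically strong hash like SHA-256, with 2^256 outputs, makes the same search computationally infeasible.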
Byte-to-byte file comparison
An entry in the "File tools" submenu performs a byte-to-byte comparison
between two files, a slower process in which every byte of one file is
matched against the byte at the same offset in the other one.
Unlike checksum / hash comparison, this method is not subject to collisions
under any circumstance, and can effectively tell what the different bytes are;
it is not only a way to verify if two files are identical or different, but also
a way to find in detail what changes were
made between the two versions.
Test archive for errors
Most archive formats can be tested for errors using PeaZip routines
(Test button), detecting if data is correctly readable and matches the file
format specification's standards, testing both the archive's table of
contents and the single archived items.
Some archive types (7z, rar, tar, zipx...) improve error detection by
storing pre-computed checksum values (usually CRC32) of the archived data.
Data errors can be due to random corruption (a faulty support, troubles
during download), out-of-standard archiving processes (bugs, obsolete
specifications), or, in the worst case, purposeful alteration of the original
content. In some cases corruption or alteration of data can result in an unreadable archive.
Testing archives for errors is a useful good practice, sufficient for
low-risk scenarios, but if there is a reasonable suspicion of the data
being purposely manipulated it is preferable to test the file with a
cryptographically strong hash function (SHA, Ripemd, Whirlpool) against
a known value, using the aforementioned "Check files" tool.
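The role of stored CRC32 values in archive testing can be demonstrated with Python's standard zipfile module, whose testzip() re-reads each entry and compares it against the stored CRC32, conceptually similar to what an archiver's Test function does (a sketch; the byte offset assumes a plain local header for a stored entry):

```python
import os
import shutil
import tempfile
import zipfile

def demo_zip_test():
    """Create a small zip, test it, corrupt one data byte, test again."""
    tmpdir = tempfile.mkdtemp()
    path = os.path.join(tmpdir, "demo.zip")
    with zipfile.ZipFile(path, "w", zipfile.ZIP_STORED) as z:
        z.writestr("a.txt", b"x" * 100)  # stored (uncompressed) entry
    with zipfile.ZipFile(path) as z:
        before = z.testzip()  # None means every entry's CRC32 checks out
    # Flip one byte inside the stored data: the local header is 30 bytes
    # plus the 5-byte name, so the payload starts at offset 35.
    with open(path, "r+b") as f:
        f.seek(50)
        byte = f.read(1)
        f.seek(50)
        f.write(bytes([byte[0] ^ 0xFF]))
    with zipfile.ZipFile(path) as z:
        after = z.testzip()  # name of the first entry failing its CRC check
    shutil.rmtree(tmpdir)
    return before, after
```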
PEA file format
PeaZip's native PEA format provides a wide array of strong hash functions and
authenticated encryption options, so the archive's integrity can be
thoroughly tested.
External online resources: definitions of data integrity, checksum, hash,
and cryptographically secure hash function on Wikipedia; the SHA-2 family
standard.
Topics and search suggestions: how to detect corrupted or forged files,
check files against a known checksum or hash value, identify data corruption
computing checksum and hash functions, FIPS, CRC32, Adler, MD5, SHA, SHA2
algorithms, file integrity checker tool, compute cryptographic hash function,
software detecting modified data, test file for data errors.