Simple file hash digests & file integrity checks
File hash digests are summaries of the binary contents of a file; even the smallest modification to the file produces a different digest. This property allows you to use a hash digest to determine whether your version of a file has been modified from an expected version. For a given hash algorithm, hash digests are always the same number of bits in length, irrespective of the size and contents of the file.
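To make this concrete, here is a minimal Python sketch (not part of hsh) that computes a hex-encoded digest for a file. The file name example.txt and the chunk size are placeholder choices.

```python
import hashlib

def file_digest(path: str, algorithm: str = "sha256") -> str:
    """Return the hex-encoded hash digest of the file at `path`."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # Read in chunks so large files need not fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# A SHA-256 digest is always 256 bits (64 hex characters),
# no matter how large or small the input file is.
print(file_digest("example.txt"))
```

Changing a single byte of the input file yields a completely different digest.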
When two hash digests are identical, it is highly probable that the binary contents of the files they were created from are the same. In plain English, it is very likely that the files are identical.
When two hash digests differ, the binary contents of the files they were created from necessarily differ. In plain English, the test file has been modified from the expected version or is a different file altogether.
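The comparison itself is simple equality on the digest strings. The sketch below (again Python, not hsh itself) checks a local file against an expected digest; the file name and the expected digest value are hypothetical placeholders.

```python
import hashlib

def sha256_hex(path: str) -> str:
    """Hex-encoded SHA-256 digest of the file at `path`."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical expected digest, e.g. copied from a project's release page.
expected = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

if sha256_hex("download.tar.gz") == expected:
    print("Match: the local file is almost certainly the expected file.")
else:
    print("Mismatch: the local file is not the expected file.")
```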
While improbable, it is possible for two different files to have identical hash digests. This is known as a hash collision. Older hash algorithms such as MD5 are less resistant to hash collisions than newer algorithms, which were designed to make collisions increasingly improbable. For more detail and further reading, see the Wikipedia article on hash collisions.
SHA256
Because file hash digests have a fixed length for a given hash algorithm, hsh can detect the algorithm that was used to generate the hash digest string that you include on the command line. The application then generates a new hash digest for the file at your path argument using the same algorithm and compares it to your expected hash digest for equality.
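The following Python sketch illustrates the length-based detection idea in simplified form. It is not the hsh implementation: it assumes a small, fixed set of algorithms and ignores the fact that some algorithms (for example SHA-256 and SHA3-256) produce digests of the same length and would need further disambiguation.

```python
import hashlib

# Hex digest lengths for a few common algorithms. Under the simplifying
# assumption that only these are in play, the length alone identifies
# the algorithm that produced an expected digest.
DIGEST_LENGTHS = {
    32: "md5",      # 128 bits
    40: "sha1",     # 160 bits
    64: "sha256",   # 256 bits
    128: "sha512",  # 512 bits
}

def verify(path: str, expected_hex: str) -> bool:
    """Detect the algorithm from the expected digest's length, recompute
    the file's digest with that algorithm, and compare for equality."""
    algorithm = DIGEST_LENGTHS.get(len(expected_hex))
    if algorithm is None:
        raise ValueError("unrecognized digest length: %d" % len(expected_hex))
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex.lower()
```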