Module checksum

Classes
  _generate_hash_function
  _hash_filter
      Implements filtering for PORTAGE_CHECKSUM_FILTER.
Functions
  _open_file(filename)
  getsize(filename)
  is_prelinkable_elf(filename)
  perform_md5(x, calc_prelink=0)
  _perform_md5_merge(x, **kwargs)
  perform_all(x, calc_prelink=0)
  get_valid_checksum_keys()
  get_hash_origin(hashtype)
  _filter_unaccelarated_hashes(digests)
      If multiple digests are available and some are unaccelerated, then return a new dict that omits the unaccelerated ones.
  _apply_hash_filter(digests, hash_filter)
      Return a new dict containing the filtered digests, or the same dict if no changes are necessary.
  verify_all(filename, mydict, calc_prelink=0, strict=0) -> Tuple
      Verify all checksums against a file.
  perform_checksum(filename, hashname='MD5', calc_prelink=0) -> Tuple
      Run a specific checksum against a file.
  perform_multiple_checksums(filename, hashes=['MD5'], calc_prelink=0) -> Tuple
      Run a group of checksums against a file.
Variables
  hashfunc_map = {}
  hashorigin_map = {'MD5': 'hashlib', 'RMD160': 'hashlib', 'SHA1...
  rmd160hash = _generate_hash_function("RMD160", rmd160hash, ori...
  sha1hash = _generate_hash_function("SHA1", hashlib.sha1, origi...
  sha256hash = _generate_hash_function("SHA256", hashlib.sha256,...
  sha512hash = _generate_hash_function("SHA512", hashlib.sha512,...
  _whirlpool_unaccelerated = False
  whirlpoolhash = _generate_hash_function("WHIRLPOOL", _new_whir...
  prelink_capable = 1
  __package__ = 'portage'
  hash_name = 'whirlpool'
  local_name = 'whirlpool'
  x = '--version'

Imports: portage, PRELINK_BINARY, HASHING_BLOCKSIZE, _, os, _encodings, _unicode_encode, errno, stat, sys, subprocess, tempfile, _new_md5, _new_sha1, mhash, SHA256, RIPEMD, hashlib, functools, _new_whirlpool, md5hash


Function Details

_filter_unaccelarated_hashes(digests)

If multiple digests are available and some are unaccelerated, then return a new dict that omits the unaccelerated ones. This allows extreme performance problems like bug #425046 to be avoided whenever practical, especially for cases like stage builds where acceleration may not be available for some hashes due to minimization of dependencies.
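
A minimal usage sketch, for illustration only: the digest values and size below are placeholders rather than real hashes, and this is a private helper normally called internally before verification.

    from portage.checksum import _filter_unaccelarated_hashes

    # Placeholder digest dict of the kind read from a Manifest entry.
    digests = {
        "size": 1024,
        "SHA256": "0" * 64,
        "WHIRLPOOL": "0" * 128,
    }

    # If an unaccelerated hash (e.g. the bundled WHIRLPOOL implementation)
    # can be dropped because accelerated alternatives remain, a new dict
    # without it is returned; otherwise the digests pass through unchanged.
    filtered = _filter_unaccelarated_hashes(digests)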

_apply_hash_filter(digests, hash_filter)

Return a new dict containing the filtered digests, or the same dict if no changes are necessary. This will always preserve at least one digest, in order to ensure that they are not all discarded.

Parameters:
  • digests (dict) - dictionary of digests
  • hash_filter (callable) - A callable that takes a single hash name argument and returns True if the hash is to be used, or False otherwise
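
As a hedged sketch of the interface: any callable that accepts a hash name can serve as hash_filter, and the _hash_filter class listed above builds such a callable from a PORTAGE_CHECKSUM_FILTER-style rule string (the exact rule syntax shown is an assumption). Digest values are placeholders.

    from portage.checksum import _apply_hash_filter, _hash_filter

    digests = {
        "size": 1024,
        "SHA256": "0" * 64,    # placeholder
        "SHA512": "0" * 128,   # placeholder
    }

    # A plain function works as hash_filter: keep only SHA512.
    def prefer_sha512(hash_name):
        return hash_name == "SHA512"

    filtered = _apply_hash_filter(digests, prefer_sha512)

    # The _hash_filter class provides the same callable interface, built
    # from a PORTAGE_CHECKSUM_FILTER-style string (syntax assumed here).
    filtered = _apply_hash_filter(digests, _hash_filter("-* SHA512"))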

verify_all(filename, mydict, calc_prelink=0, strict=0)

Verify all checksums against a file.

Parameters:
  • filename (String) - File to run the checksums against
  • calc_prelink (Integer) - Whether or not to reverse prelink before running the checksum
  • strict (Integer) - Enable/Disable strict checking (which stops exactly at a checksum failure and throws an exception)
Returns: Tuple
Result of the checks and a possible message:
  1. If the size check fails: False, and a tuple containing a message, the given size, and the actual size
  2. If there is an OS error: False, and a tuple containing the system error followed by 2 nulls
  3. If a checksum fails: False, and a tuple containing a message, the given hash, and the actual hash
  4. If all checks succeed: True and a fake reason
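
A hedged usage sketch: the file path is a placeholder, and the contents of mydict (expected size plus expected hashes, as implied by the return description above) are illustrative values.

    from portage.checksum import verify_all

    expected = {
        "size": 1024,          # expected file size (placeholder)
        "SHA256": "0" * 64,    # expected digest (placeholder)
    }

    ok, reason = verify_all("/path/to/distfile.tar.bz2", expected)
    if not ok:
        # On failure, reason is a tuple such as
        # (message, expected_value, actual_value).
        print("verification failed: %s" % (reason,))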

perform_checksum(filename, hashname='MD5', calc_prelink=0)

Run a specific checksum against a file. The filename can be either unicode or an encoded byte string. If filename is unicode then a UnicodeDecodeError will be raised if necessary.

Parameters:
  • filename (String) - File to run the checksum against
  • hashname (String) - The type of hash function to run
  • calc_prelink (Integer) - Whether or not to reverse prelink before running the checksum
Returns: Tuple
The hash and size of the data
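
A minimal sketch with a placeholder path; per the description above, each call returns the digest together with the size of the data.

    from portage.checksum import perform_checksum

    md5_digest, size = perform_checksum("/path/to/distfile.tar.bz2")
    sha256_digest, size = perform_checksum(
        "/path/to/distfile.tar.bz2", hashname="SHA256")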

perform_multiple_checksums(filename, hashes=['MD5'], calc_prelink=0)

Run a group of checksums against a file.

Parameters:
  • filename (String) - File to run the checksums against
  • hashes (List) - A list of checksum functions to run against the file
  • calc_prelink (Integer) - Whether or not to reverse prelink before running the checksum
Returns: Dictionary
A dictionary in the form return_value[hash_name] = (hash_result, size) for each given checksum
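
A short sketch with a placeholder path; the shape of each dictionary value follows the return description above.

    from portage.checksum import perform_multiple_checksums

    results = perform_multiple_checksums(
        "/path/to/distfile.tar.bz2", hashes=["MD5", "SHA256"])

    for hash_name, result in results.items():
        print("%s: %s" % (hash_name, result))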

Variables Details

hashorigin_map

Value:
{'MD5': 'hashlib',
 'RMD160': 'hashlib',
 'SHA1': 'hashlib',
 'SHA256': 'hashlib',
 'SHA512': 'hashlib',
 'WHIRLPOOL': 'hashlib'}
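
The hashorigin_map shown above backs get_hash_origin(); together with get_valid_checksum_keys() it can be inspected as in this small sketch (the output depends on which hash implementations are available at runtime).

    from portage.checksum import get_valid_checksum_keys, get_hash_origin

    for key in get_valid_checksum_keys():
        print("%s: %s" % (key, get_hash_origin(key)))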

rmd160hash

Value:
_generate_hash_function("RMD160", rmd160hash, origin= "pycrypto")

sha1hash

Value:
_generate_hash_function("SHA1", hashlib.sha1, origin= "hashlib")

sha256hash

Value:
_generate_hash_function("SHA256", hashlib.sha256, origin= "hashlib")

sha512hash

Value:
_generate_hash_function("SHA512", hashlib.sha512, origin= "hashlib")

whirlpoolhash

Value:
_generate_hash_function("WHIRLPOOL", _new_whirlpool, origin= "bundled"\
)