Information Theory

Information theory is the scientific study of the quantification, compression, and communication of information. A key measure in information theory is entropy, which quantifies the amount of uncertainty in the value of a random variable or the outcome of a random process. Other important measures include mutual information, channel capacity, error exponents, and relative entropy.
https://en.wikipedia.org/wiki/Information_theory
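
As a concrete illustration, here is a minimal Python sketch of the entropy formula H = -Σ p·log2(p) for a discrete distribution (the function name is my own choice, not from any library):

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries a full bit of uncertainty, a biased one less:
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```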


Number of digits (Stellenzahl)
-  How many digits a code word consists of
Assessability (Bewertbarkeit)
-  Is each position assigned a specific value (weight)? E.g. in the decimal number 22, the two identical digits stand for different values (20 and 2)
Hamming weight
-  The number of digits equal to 1, e.g. 10010001 has weight 3
Hamming distance
-  The minimum of all pairwise distances between the words of a code (see the sketch after this list)
Steadiness
-  Is the distance between all code words constant?
Redundancy
-  Does the code allow more combinations than are mathematically (theoretically) required?
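
To make the Hamming notions concrete, here is a minimal Python sketch (the function names are my own, a sketch rather than a standard API):

```python
def hamming_weight(word):
    """Number of 1 digits in a binary string, e.g. '10010001' -> 3."""
    return word.count('1')

def hamming_distance(a, b):
    """Number of positions at which two equal-length words differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def min_distance(code):
    """Hamming distance of a code: the minimum over all pairs of its words."""
    words = list(code)
    return min(hamming_distance(a, b)
               for i, a in enumerate(words)
               for b in words[i + 1:])

print(hamming_weight('10010001'))              # 3
print(min_distance(['0000', '0111', '1011']))  # 2 (between 0111 and 1011)
```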


Hamming distance

In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different.

The Hamming distance of two fixed-length blocks of binary data can be determined by writing them out in binary, comparing them bitwise, and counting the positions that differ; this amounts to XOR-ing the two blocks and counting the 1 bits in the result.
https://en.wikipedia.org/wiki/Hamming_distance
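
For integer-encoded blocks, the XOR approach from the paragraph above can be written directly (a minimal sketch; int.bit_count is a Python built-in available since 3.10):

```python
def hamming_distance_xor(a: int, b: int) -> int:
    """XOR the blocks, then count the 1 bits: each 1 marks a differing position."""
    return (a ^ b).bit_count()

print(hamming_distance_xor(0b10010001, 0b01011001))  # 3
```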



Todo: Describe the others too



