Entropy (Shannon)
Bits of unpredictability
Entropy in information theory (Shannon entropy) measures how unpredictable a random variable is, in bits. A source with 1 bit of entropy can produce 2 equally likely outcomes; one with n bits, 2ⁿ. Mathematically, H = log₂(N) for N equally likely outcomes.
Examples:
- A fair coin flip: 1 bit.
- A fair die roll: log₂(6) ≈ 2.58 bits.
- A random 8-character lowercase password: 8 × log₂(26) ≈ 37.6 bits.
- A random 16-character password from [a-z A-Z 0-9 symbols]: 16 × log₂(94) ≈ 104.9 bits.
- A UUID v4: 122 bits (6 bits are version/variant constants).
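Because each independently chosen character multiplies the number of possible outcomes, a length-L password over a size-N charset carries L × log₂(N) bits. A minimal sketch reproducing the figures above (the entropyBits helper is illustrative, not a standard API):

```ts
// Entropy of a secret drawn uniformly from N^L possibilities: L * log2(N) bits.
function entropyBits(length: number, charsetSize: number): number {
  return length * Math.log2(charsetSize);
}

console.log(entropyBits(1, 2));   // fair coin flip: 1
console.log(entropyBits(1, 6));   // fair die roll: ≈ 2.58
console.log(entropyBits(8, 26));  // 8 lowercase letters: ≈ 37.6
console.log(entropyBits(16, 94)); // 16 printable-ASCII chars: ≈ 104.9
```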
Why entropy matters for secrets: an attacker brute-forcing your password tries combinations one at a time, so the expected work is 2ⁿ⁻¹ attempts, where n is your password’s entropy in bits (on average, half the keyspace is searched before a hit). At a billion attempts per second:
- 40 bits: ~9 minutes to brute-force. Inadequate.
- 60 bits: ~18 years. Marginal.
- 80 bits: ~19 million years. Adequate.
- 128 bits: heat death of the universe before exhaustion. Cryptographic grade.
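To reproduce those estimates, here is a back-of-envelope sketch under the same assumptions (10⁹ guesses per second, expected work of 2ⁿ⁻¹ guesses); expectedCrackSeconds is an illustrative helper, not a standard API:

```ts
// Expected brute-force work for an n-bit secret: 2^(n-1) guesses on average.
function expectedCrackSeconds(bits: number, guessesPerSecond: number): number {
  return 2 ** (bits - 1) / guessesPerSecond;
}

const RATE = 1e9; // one billion guesses per second, as above
const SECONDS_PER_YEAR = 365.25 * 24 * 3600;

for (const bits of [40, 60, 80, 128]) {
  const years = expectedCrackSeconds(bits, RATE) / SECONDS_PER_YEAR;
  console.log(`${bits} bits: ~${years.toExponential(2)} years`);
}
```

Note how each additional 20 bits multiplies the attacker’s work by roughly a million (2²⁰).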
Common gotcha: entropy depends on the distribution, not just the value. A “random” password chosen by a human (“Password123!”) has far less entropy than its length suggests, because the human selection process isn’t uniform. Computer-generated passwords achieve the full theoretical entropy of their character set, provided each character is drawn uniformly (e.g. via crypto.getRandomValues or /dev/urandom, with care to avoid modulo bias).
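For illustration, a minimal sketch of such a generator using the Web Crypto API (the charset, the randomPassword name, and the rejection threshold are assumptions for this example); rejection sampling keeps every character exactly uniform:

```ts
// 62 characters: 26 lowercase + 26 uppercase + 10 digits.
const CHARSET = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";

// Draw each character uniformly. Bytes at or above the largest multiple of
// CHARSET.length (248 = 4 * 62) are rejected, so the modulo step is unbiased.
// Assumes a Web Crypto environment (browsers, or Node 19+ global crypto).
function randomPassword(length: number): string {
  const limit = 256 - (256 % CHARSET.length);
  const out: string[] = [];
  const buf = new Uint8Array(1);
  while (out.length < length) {
    crypto.getRandomValues(buf);
    if (buf[0] < limit) out.push(CHARSET[buf[0] % CHARSET.length]);
  }
  return out.join("");
}

// Each character carries log₂(62) ≈ 5.95 bits; 16 characters ≈ 95.3 bits.
console.log(randomPassword(16));
```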
Use our password generator to watch the entropy meter update live as you adjust length and character classes.
Published May 16, 2026