Information entropy can guide experimental design by quantifying how much information different choices provide. Using probability theory, you can calculate the expected bits of information from an experiment's possible outcomes and pick the experiment that maximizes that expected information gain. The classic 12-ball weighing problem demonstrates this: weighing 4 balls per side yields the highest expected entropy, log2(3) ≈ 1.58 bits, because it splits the 24 possible states (which ball is odd, and whether it is heavy or light) into three equally probable outcomes. The key heuristic is to choose experiments whose possible outcomes are all roughly equally likely, since uniform distributions maximize information entropy.
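As a rough illustration of the idea (a minimal sketch, not code from the post; the helper name expected_entropy and the state enumeration are assumptions of this example), the snippet below lists the 24 equally likely states of the 12-ball problem and computes the Shannon entropy of the first weighing's outcome for each choice of balls per side:

```python
import math
from itertools import product

def expected_entropy(balls_per_side, total_balls=12):
    """Entropy (in bits) of the first weighing's outcome when
    balls_per_side balls are placed on each pan of the scale."""
    # Each state is (odd ball index, 'heavy' or 'light'): 24 equally likely states.
    states = list(product(range(total_balls), ["heavy", "light"]))
    left = set(range(balls_per_side))                       # first k balls on the left pan
    right = set(range(balls_per_side, 2 * balls_per_side))  # next k balls on the right pan
    counts = {"left_down": 0, "right_down": 0, "balanced": 0}
    for ball, kind in states:
        if ball in left:
            counts["left_down" if kind == "heavy" else "right_down"] += 1
        elif ball in right:
            counts["right_down" if kind == "heavy" else "left_down"] += 1
        else:
            counts["balanced"] += 1  # odd ball is off the scale entirely
    n = len(states)
    # Shannon entropy H = -sum(p * log2(p)) over outcomes with nonzero probability
    return -sum((c / n) * math.log2(c / n) for c in counts.values() if c > 0)

for k in range(1, 7):
    print(f"{k} vs {k}: {expected_entropy(k):.3f} bits")
```

Running it shows 4 vs 4 maximizing the expected information at log2(3) ≈ 1.585 bits (all three outcomes equally likely), while 6 vs 6 can never balance and so yields only 1 bit.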

From blog.demofox.org
Table of contents
- Probability and Bits
- Submarine Game
- Expected Information Entropy
- Using Expected Information Entropy To Choose Experiments
- Bonus: Calculating Odds Of Scale Results
