Information entropy can guide experimental design by quantifying how much information different choices provide. Using probability theory, you can calculate the expected bits of information from an experiment's possible outcomes and select the experiment that maximizes expected information gain. The classic 12-ball weighing problem demonstrates this: weighing 4 balls against 4 produces three roughly equally likely outcomes (left heavy, right heavy, balanced), the maximum-entropy split, whereas weighing 6 against 6 can never balance and therefore yields less information per weighing.
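As a minimal sketch of that idea (the function names here are mine, not from the post): for the 12-ball puzzle with one odd ball that may be heavy or light, there are 24 equally likely states, and weighing k balls against k gives tilt probabilities of k/12 per side and a balance probability of (12 - 2k)/12. Computing the Shannon entropy of each split shows why 4-vs-4 is the best first weighing:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def first_weighing_entropy(k, n=12):
    """Expected bits of information from weighing k balls against k,
    when one of n balls is secretly heavier or lighter."""
    p_tilt = k / n               # probability of each pan going down
    p_balance = (n - 2 * k) / n  # odd ball is among those left off the scale
    return entropy([p_tilt, p_tilt, p_balance])

for k in (3, 4, 6):
    print(f"{k} vs {k}: {first_weighing_entropy(k):.3f} bits")
```

Running this shows 4-vs-4 achieves log2(3) ≈ 1.585 bits, the most any three-outcome experiment can provide, while 6-vs-6 gives only 1 bit because the balanced outcome is impossible.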

From blog.demofox.org (13 min read)
Table of contents

- Probability and Bits
- Submarine Game
- Expected Information Entropy
- Using Expected Information Entropy To Choose Experiments
- Bonus: Calculating Odds Of Scale Results
