Joe took a math test along with the rest of his class. The test contained 100 questions worth 1 point
each. The test scores were registered in a database. But, there was a glitch in the system and for
40% of the tests, the first digit of the test score was replaced by a 5. So, for example, if the test
score was 90, it would have been registered as 50 in the database if it had been affected by the system
glitch. We know that Joe answered at least 10 questions correctly. His score in the database is
registered as 55. If you were given a dollar for every question Joe answered correctly, how much
money statistically would you expect to have?

**ANSWER**:

**$55.**
**EXPLANATION**: Either the test was registered correctly (60% probability) or it was
affected by the system glitch (40% probability). If it was registered correctly, Joe's actual score is 55.
If it was affected by the glitch, the second digit of his actual score must have been a 5, while the first
digit could have been any number from 1 to 9 — we know Joe answered at least 10 questions correctly, so his
actual score has two digits. That gives nine equally likely possible actual scores: 15, 25, 35, 45, 55, 65,
75, 85, and 95.

The long way would be to add up the probabilities multiplied by the associated score for each row:
(0.6 * 55) + (0.4 * 1/9 * 15) + (0.4 * 1/9 * 25) + (0.4 * 1/9 * 35) + (0.4 * 1/9 * 45) + (0.4 * 1/9 * 55)
+ (0.4 * 1/9 * 65) + (0.4 * 1/9 * 75) + (0.4 * 1/9 * 85) + (0.4 * 1/9 * 95) = 55.
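The weighted sum above can be checked directly. A minimal sketch using exact rational arithmetic (the variable names are illustrative, not from the puzzle):

```python
from fractions import Fraction

p_ok = Fraction(6, 10)       # test registered correctly -> actual score is 55
p_glitch = Fraction(4, 10)   # glitched -> nine equally likely actual scores

glitch_scores = range(15, 96, 10)  # 15, 25, ..., 95

expected = p_ok * 55 + sum(p_glitch * Fraction(1, 9) * s for s in glitch_scores)
print(expected)  # 55
```

Using `Fraction` avoids floating-point round-off, so the result comes out as exactly 55.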

Or, if you noticed that whether or not the test was registered correctly, the average score in each case
is 55 (the nine possible glitched scores are equally likely, and they also average to 55), you can conclude
immediately that the result is 55. So, statistically, you would expect to have $55. Note that the
percentage of tests affected by the glitch is therefore not relevant.
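The same conclusion can be sanity-checked by simulation. This sketch assumes a uniform prior over the two-digit scores 10–99 (an assumption the puzzle leaves implicit), conditions on the database showing 55, and averages the actual scores of the matching tests:

```python
import random

random.seed(0)
payouts = []
while len(payouts) < 100_000:
    actual = random.randint(10, 99)   # uniform prior over two-digit scores (assumption)
    if random.random() < 0.4:         # glitch: first digit forced to 5
        shown = 50 + actual % 10
    else:
        shown = actual
    if shown == 55:                   # keep only tests whose database entry reads 55
        payouts.append(actual)

print(sum(payouts) / len(payouts))    # close to 55
```

The sample mean lands near 55 regardless of the 40% glitch rate, matching the argument above.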
