I briefly restate the problem for clarity:
Let A be an array of n integers in Z_n. Every member of Z_n appears exactly once, except that t in Z_n appears twice and m in Z_n is missing from A. Find t and m in O(n) time and O(1) space.
The book provides two solutions. The first is correct. The second claims to improve on the first by avoiding potential overflow. Barring a misunderstanding on my part, I believe the second solution is incorrect. For reference, here is the second solution as given:
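For context, I believe the first solution is the sum-based one (which is where the overflow concern comes from). A minimal Python sketch under that assumption about which solution the book means:

```python
def find_dup_and_missing(A):
    """Return (t, m) for an array A of n values from Z_n where t appears
    twice and m is missing. Uses sums, so intermediate values can
    overflow in fixed-width arithmetic (not an issue in Python)."""
    n = len(A)
    # sum(A) - sum(0..n-1) = t - m
    diff = sum(A) - n * (n - 1) // 2
    # difference of sums of squares = t^2 - m^2 = (t - m) * (t + m)
    sq_diff = sum(a * a for a in A) - sum(i * i for i in range(n))
    total = sq_diff // diff  # t + m
    return (total + diff) // 2, (total - diff) // 2
```

For example, with n = 16, t = 3, and m = 12, diff = -9 and sq_diff = -135, giving total = 15 and the pair (3, 12).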
Compute x = m xor t (by XORing together every element of A and every element of Z_n). Note x is not 0, since m != t.
Pick a non-zero bit of x, and let h be the integer whose sole set bit is that one.
h must be m or t. (This is where the reasoning goes awry.)
Search A for h.
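Transcribing those steps literally (assuming x is obtained by XORing every element of A with every element of Z_n, and that finding h in A means h is taken to be t while absence means h is taken to be m) gives a procedure that fails on the counterexample below:

```python
from functools import reduce
from operator import xor

def restated_second_solution(A):
    """Literal transcription of the restated steps; this is the
    procedure I claim is incorrect, not a working algorithm."""
    n = len(A)
    x = reduce(xor, A) ^ reduce(xor, range(n))  # x = m xor t
    h = x & -x          # the integer represented by one non-zero bit of x
    if h in A:          # "Search A for h": found -> h presumed to be t
        return h, x ^ h
    return x ^ h, h     # absent -> h presumed to be m
```

With m = 12 and t = 3 (n = 16), x = 15 and the lowest set bit gives h = 1; since 1 appears in A, the procedure reports t = 1 and m = 14, both wrong.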
This solution is incorrect: h has exactly one set bit, so it cannot equal m or t when each of them has more than one set bit. For example, consider:
m = 3 = "0011"
t = 12 = "1100"
Edit: Sorry, I meant t = 12, not 6. The pair 3 (011) and 6 (110) does not exhibit my point, because their xor (101) leaves ONE bit to represent t and m. In contrast, 3 xor 12 = "1111", and now the TWO most significant bits "belong" to 12 while the two least significant bits "belong" to 3. Here, picking a single bit position is no longer sufficient.
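For completeness, the well-known repair (and possibly what the book actually intended) is to use the chosen bit to partition rather than as a candidate value: XOR together all values, from A and from Z_n, that have that bit set. The accumulated result is whichever of m and t has the bit set, and one membership test in A tells them apart. A sketch under that assumption:

```python
from functools import reduce
from operator import xor

def bit_partition_solution(A):
    """Return (t, m): pick a set bit of x = m xor t, then XOR all
    values (from A and from range(n)) having that bit set; the
    accumulated h is whichever of m, t has the bit set."""
    n = len(A)
    x = reduce(xor, A) ^ reduce(xor, range(n))  # m xor t
    bit = x & -x
    h = reduce(xor, (a for a in A if a & bit), 0)
    h ^= reduce(xor, (i for i in range(n) if i & bit), 0)
    if h in A:        # h appears in A -> h is the duplicate t
        return h, x ^ h
    return x ^ h, h   # otherwise h is the missing m
```

On the counterexample (m = 12, t = 3, n = 16), bit = 1; XORing all odd values from A and from Z_16 leaves h = 3, which is in A, so the pair (3, 12) is recovered correctly.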