Recently, a few friends and I spent about an hour solving a computer science puzzle that was posed to us, indirectly, by Facebook’s AI-powered Messenger Assistant, M. Rather than explain our findings immediately, I’m going to tell the story of our investigation as it unfolded, so you can think along with us and come up with your own hypothesis along the way. In my opinion, puzzling through this problem is a great computer science thought exercise.
In a Facebook Messenger group chat, some poker buddies from undergrad (read: mainly computer science and math majors) were talking about Jeff Bezos and his insane net worth. One of them jokingly asked for a hundred billion dollars, and M immediately tried to play wingman.
We quickly discovered that M was willing to facilitate a request of nine hundred billion dollars, but not a trillion. Using words, we were able to get M to prompt us to send $999,999,999,999—and not a dollar more.
Asking for any higher number resulted in M displaying a generic “Pay” or “Request” button, with no specific number attached to it.
This seemed reasonable enough; M understood how to make numbers out of the words for thousand, million, and billion, but not trillion. (We were similarly unable to trigger the trillion-dollar request with workarounds like “send me one thousand billion dollars” and “send me a million million dollars.” Asking for a quadrillion dollars also failed to register with M.)
We were able to get M to help us out with bigger numbers, however, by asking for the number directly, using numerical characters. “Send me $1,000,000,000,000” worked perfectly:
As did a quadrillion:
A quintillion, however, got the generic reply:
Of course, at this point, we needed to pin down the upper limit of what M would specifically ask for. Unlike the trillion-dollar verbal limit earlier, this one didn’t appear to be the result of a deliberate choice by the programmers. That is, an engineer likely chose to teach M what the word “billion” means and simply didn’t bother with “trillion.” With the numerical limit, though, there is no reason why the programmers would arbitrarily make M stop handling requests after a certain number of zeroes. (Even if they did, they probably would have set it at some number that was reasonable for the sake of dollar amounts; after all, we’d long since eclipsed the world’s GDP by the time we got to a quadrillion.)
We figured it’d be easy enough to brute-force the problem and find the upper limit that way. We considered the generic reply (just “Pay”) a failure, and the specific reply (“Pay $X”) a success.
(I should note, here, that it seems that you can no longer test this out and try to find the limit yourself. When I first wrote the draft of this article on June 1, I was going to tell you to find a friend who wouldn’t mind you spamming their Messenger inbox, and sending them a bunch of “Pay me $X” messages. However, as of the time I am publishing this article (June 2), M’s behavior appears to have changed. Now, M gives the boring suggestion of simply “Pay” for any number past $999,999. Even the phrase “Send me a million dollars” no longer works like it did yesterday. It’s not just a local change for me, either — I just tested it from a different Facebook account, on a different device connected to a different network. It seems the actual logic with which this feature of M operates has been altered. I doubt that our activity actually triggered someone at Facebook to update the logic, but I also can’t imagine that the AI updated to a million-dollar limit on its own. Maybe I’m missing a third possibility. Regardless, this entire article discusses how M was behaving prior to June 2, not how it behaves now.)
Our approach to finding the upper limit was to spend ten minutes testing number after number, figuring out which ones were too large. A hundred quadrillion didn’t work, but ninety quadrillion did. Ninety-three quadrillion failed, but ninety-two succeeded. 92.3 quadrillion was too high, but 92.2 quadrillion was fine. On and on, moving rightward through the digits of the number, pinning down the exact upper bound M would oblige.
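The narrowing we did by hand is essentially a binary search. Here’s a sketch of the same process in Python, with a hypothetical `m_accepts` oracle standing in for us pasting “Send me $X” into Messenger (the threshold below is an arbitrary stand-in, not M’s actual limit):

```python
SECRET_LIMIT = 92_200_000_000_000_000  # arbitrary stand-in threshold, for illustration only

def m_accepts(amount: int) -> bool:
    """Pretend oracle: True if M replies with a specific 'Pay $X' button,
    False if it falls back to the generic 'Pay' button."""
    return amount <= SECRET_LIMIT

def find_upper_limit(lo: int, hi: int) -> int:
    """Largest value in [lo, hi] the oracle accepts, found by bisection."""
    assert m_accepts(lo) and not m_accepts(hi)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if m_accepts(mid):
            lo = mid  # mid still works; the limit is at or above mid
        else:
            hi = mid  # mid fails; the limit is below mid
    return lo

limit = find_upper_limit(1, 10**19)
```

Bisecting a range of 10¹⁹ takes only about 63 oracle queries, which is roughly consistent with ten minutes of pestering a chatbot by hand.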
As we moved through the digits, we started to suspect that there was a powers-of-two factor at play here, but a quick (and, in retrospect, naive) check told me that there was no power of two in the ballpark of ninety-two quadrillion. 2⁵⁶ ≃ 7.2×10¹⁶ and 2⁵⁷ ≃ 1.4×10¹⁷, hopping right over 9.2×10¹⁶. We assumed our hunch was wrong. We were well aware that there still had to be some significant mathematical constant at play, but we figured it would be more fun (read: easier) to find the number first and figure out what it meant afterwards.
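That quick check is easy to reproduce. The nearest powers of two really do straddle the region our search had landed in:

```python
import math

target = 92_300_000_000_000_000  # roughly where our search stood: ~9.23e16

lower = int(math.log2(target))  # exponent of the power of two just below target
print(2**lower)        # 72057594037927936  (2^56, ~7.2e16)
print(2**(lower + 1))  # 144115188075855872 (2^57, ~1.4e17)
```

So no power of two lands anywhere near 9.2×10¹⁶ (at least, not when you count in dollars).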
Interestingly, as we homed in on the last few digits of the number, we started seeing some strange behavior: M would almost get it right, but would truncate the number slightly, or otherwise fiddle with the ones and tens places.
“Send me $92,233,720,000,000,001”
M: “Pay $92,233,720,000,000,000”
Stranger still, once we had the upper limit narrowed down to the thousands place, M started inexplicably altering the final digits.
“Send me $92,233,720,368,539,900”
M: “Pay $92,233,720,368,539,904”
Sometimes, it even added cents:
“Send me $92,233,720,368,539,010”
M: “Pay $92,233,720,368,539,013.02”
We were perplexed by this behavior, but we knew we couldn’t make sense of it until we first found the specific upper limit. Every single number we tried from here forward got altered, seemingly at random, in its last few digits. The only pattern we quickly discerned was that the number of cents always had to be even.
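If you want to form a hypothesis of your own at this point, here’s one worth testing: the alterations look a lot like 64-bit floating-point rounding. A Python `float` is an IEEE 754 double, and at this magnitude adjacent doubles are 16 apart, so round-tripping a dollar amount through a double reproduces some (though not all) of M’s replies exactly. Whether M actually works this way is, at this point in the story, just a guess:

```python
import math

# At ~9.2e16, IEEE 754 doubles are spaced 16 units apart.
print(math.ulp(92_233_720_368_539_900.0))   # 16.0

# Round-tripping through a double reproduces M's replies quoted above:
print(int(float(92_233_720_000_000_001)))   # 92233720000000000
print(int(float(92_233_720_368_539_900)))   # 92233720368539904
```

The cents-appending behavior isn’t explained by this alone, but the truncations and the altered terminal digits fall out naturally.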
When we found the highest number M would accept, we were completely confused.
M responded successfully to “Send me $92,233,720,368,547,747.84,” and it would not go one cent higher.