Based on prior comments from the seller, the allowance is for the life of the device. So that nobody else has to compute it, that's:
5,256,000 minutes in 10 years (ignoring leap years)
500,000,000 bytes of data (assuming mega and not mebi)
An average of 95 bytes a minute per device over those 10 years, or an average of 951 bytes every 10 minutes, or more than 5 KB an hour. For event messages, that seems like something that can be worked within fairly easily, depending on use.
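For anyone who wants to check the arithmetic, here's a quick sketch of the same back-of-envelope math (the 500 MB and 10-year figures come from above; everything else is just unit conversion):

```python
# Back-of-envelope budget for a 500 MB lifetime allowance over 10 years.
minutes = 10 * 365 * 24 * 60   # 5,256,000 minutes, ignoring leap years
budget_bytes = 500_000_000     # assuming mega (decimal), not mebi

per_minute = budget_bytes / minutes
print(f"{minutes:,} minutes")
print(f"{per_minute:.0f} B/min, {per_minute * 10:.0f} B per 10 min, "
      f"{per_minute * 60 / 1000:.1f} KB/hour")
# → 5,256,000 minutes
# → 95 B/min, 951 B per 10 min, 5.7 KB/hour
```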
Truth in advertising: as stated in another response above, if you want to do a real computation you need to factor in session-setup overhead. If you configure for TCP/IP (unencrypted), your session overhead is about 1 KB. If you configure for TLS, your session overhead is just under 4 KB. Once the session starts, data transfer is very efficient - probably about 250-500 bytes for a half dozen to a dozen notes of typical size. Session duration is typically 1-2 seconds.
The other secret of most IoT platforms is that their negotiated rates round sessions to 1KB boundaries. That's insane for IoT. For ours there is no rounding, and the 'practical' rounding is the 40-byte TCP/IP header.
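To put the overhead numbers in perspective, here's a rough sketch of how many sessions the lifetime budget buys under each transport. The ~1 KB and ~4 KB setup costs are from the figures above; the 400-byte payload per session is my own assumption, picked from the 250-500 byte range mentioned:

```python
# Hedged sketch: sessions the 500 MB lifetime budget covers, assuming
# each session carries ~400 bytes of notes (an assumed payload size).
BUDGET = 500_000_000  # bytes

def sessions_possible(session_overhead, payload=400):
    """How many sessions fit in the budget at a given setup overhead."""
    return BUDGET // (session_overhead + payload)

tcp_sessions = sessions_possible(1_000)  # ~1 KB unencrypted TCP/IP setup
tls_sessions = sessions_possible(4_000)  # just under 4 KB TLS setup

hours = 10 * 365 * 24  # 87,600 hours in 10 years
print(f"TCP: {tcp_sessions:,} sessions (~{tcp_sessions / hours:.1f}/hour)")
print(f"TLS: {tls_sessions:,} sessions (~{tls_sessions / hours:.1f}/hour)")
```

Under these assumptions, TLS cuts the session count by roughly two thirds, which is why overhead (and any per-session rounding) matters far more than the payload itself for small event messages.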