There might be faster algos for very long integers, or minor implementation differences that add or subtract a few kilogates.
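As a rough illustration of what "faster algos for very long integers" can mean, here is a minimal classical Karatsuba multiplication sketch in Python (the function name, the cutoff threshold, and the final assertion are illustrative choices, not anything from the original comment). Karatsuba trades one multiplication for a few extra additions, dropping the cost from schoolbook O(n^2) to roughly O(n^1.585) digit operations; analogous splitting tricks are what change the gate counts for big multipliers.

```python
def karatsuba(x: int, y: int) -> int:
    """Multiply two non-negative integers via the Karatsuba split.

    The 10**32 cutoff is an arbitrary illustrative threshold below
    which we just fall back to the built-in multiply.
    """
    if x < 10**32 or y < 10**32:
        return x * y
    n = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> n, x & ((1 << n) - 1)   # split x into high/low halves
    hi_y, lo_y = y >> n, y & ((1 << n) - 1)   # split y into high/low halves
    z0 = karatsuba(lo_x, lo_y)                # low * low
    z2 = karatsuba(hi_x, hi_y)                # high * high
    # One extra multiply recovers the cross term: (lo+hi)(lo+hi) - z0 - z2
    z1 = karatsuba(lo_x + hi_x, lo_y + hi_y) - z0 - z2
    return (z2 << (2 * n)) + (z1 << n) + z0


if __name__ == "__main__":
    import random
    a, b = random.getrandbits(4096), random.getrandbits(4096)
    assert karatsuba(a, b) == a * b   # sanity check against built-in big ints
```

For truly huge operands, FFT-based methods (Schönhage–Strassen and friends) win out over Karatsuba, which is the kind of asymptotic difference that could matter here.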
Aren't big-integer calculations the bread and butter of GPUs these days?