Hacker News

I see this repeated a lot, but I've done a fair bit of math, and a fair bit of programming, and they feel very different, so I'm not sure what it's supposed to mean.

"Algorithms are math" may be true, but most software development feels a lot more like engineering.



It may feel like it, but you can reduce any computer program to a mathematical expression. Even the distinction feels a little silly and artificial: code is almost just a different notation for math. If you can patent code, you're patenting math.
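As a minimal sketch of that point (the function names are mine, not from the thread), the same computation can be written in "program" style or directly as its mathematical definition; the two are interchangeable notations for one function:

```python
def factorial_loop(n: int) -> int:
    """Imperative 'program' style: mutate an accumulator in a loop."""
    acc = 1
    for k in range(2, n + 1):
        acc *= k
    return acc

def factorial_math(n: int) -> int:
    """Mathematical 'definition' style: n! = 1 if n <= 1, else n * (n-1)!"""
    return 1 if n <= 1 else n * factorial_math(n - 1)
```

Both compute the same factorial function; only the notation differs.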

This may seem pedantic, but I really do think it gets to the core of why software patents have turned out to be such a disaster. Math is specifically not patentable, and mathematics coursework is specifically excluded from the scientific background required to become a member of the patent bar. People are often surprised by this, but even with a BS, MS, and PhD in Math from MIT, you can't sit for the patent bar (on the basis of that coursework, anyway).

So what the patent system in the US has done is 1) allow math to be patented, 2) specifically exclude people with math backgrounds from reviewing patents, and then 3) scratch its head and puzzle over why trivial patents on mathematics are granted.


Maybe you weren't aware that literally all the computer does is binary arithmetic?
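To make that claim concrete (an illustrative sketch, not from the thread): even ordinary addition can be carried out with nothing but the bitwise operations a processor's adder circuit uses, where XOR gives the sum bits and AND gives the carry bits:

```python
def add_binary(a: int, b: int) -> int:
    """Add two non-negative integers using only bitwise operations,
    the way a ripple-carry adder does: XOR is the sum, AND the carry."""
    while b:
        carry = (a & b) << 1  # positions where both bits are 1 carry over
        a = a ^ b             # sum without carries
        b = carry
    return a
```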


They're called "computers" for a reason.


Yeah, because they can do arithmetic really fast?

I wouldn't consider adding 1 to a register here and there and taking some conditional branches "math". Modern math (usually) involves thinking very hard about theorems and then writing proofs for them.


Algorithms existed in the world of mathematics for thousands of years before computers were invented. From this point of view, computers are just a way to speed things up.

If Euclid's algorithm were discovered today, should it be patentable? What about the FFT? Is there any substantial difference with any "modern" algorithm, like the DCT or arithmetic coding?
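Euclid's GCD algorithm, recorded around 300 BC, is a concrete example of pre-computer mathematics that reads directly as code (a sketch; the function name is mine):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm (~300 BC): repeatedly replace the pair
    (a, b) with (b, a mod b) until the remainder is zero."""
    while b:
        a, b = b, a % b
    return a
```

The procedure predates any notion of a patent system by roughly two millennia.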


I already said algorithms can be plausibly called math.

Lots of software, and software patents, have little to do with novel algorithms.


It's the algorithms that get patented.


Software is math without proofs :)



