A lot of mathematics is based on sets, but sets and types are not formally the same thing.
Some mathematicians are trying to formulate a basis for mathematics in type theory. For example, there is the Univalent Foundations of mathematics proposed by Vladimir Voevodsky. This reduces logic, set theory, category theory (actually groupoids) and higher abstractions to homotopy theory (well, to certain explicit constructions in it anyway), which is then modelled in Martin-Löf type theory.
By the Curry-Howard isomorphism there is a correspondence between types and proofs (specifically, between types in the simply typed lambda calculus and proofs in intuitionistic propositional logic; if you add dependent types, the correspondence extends to predicate logic).
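The correspondence can be made concrete in any typed language. Here is a minimal sketch in TypeScript (the function names `fst` and `compose` are just illustrative): a program of a given type is a proof of the proposition that type encodes.

```typescript
// Under Curry-Howard, a type is a proposition and a program of that type
// is a proof. The projection below proves "A and B implies A":
// the pair type [A, B] plays the role of conjunction.
function fst<A, B>(p: [A, B]): A {
  return p[0];
}

// Function composition proves transitivity of implication:
// (A -> B) and (B -> C) together yield (A -> C).
function compose<A, B, C>(f: (a: A) => B, g: (b: B) => C): (a: A) => C {
  return (a) => g(f(a));
}

console.log(fst([1, "x"]));                                          // 1
console.log(compose((n: number) => n + 1, (n: number) => n * 2)(3)); // 8
```

The point is that merely type-checking these definitions verifies the corresponding intuitionistic tautologies.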
The real difference is that functions in type theory are computable, whereas functions in mathematics (based on sets) need not be.
Types define the encoding for some pure mathematical thing, and they define the set of things that can be done to objects in that encoding. Two things have the same type iff they share a representation and a particular set of operations. There are ideal types, and there are practical types.
Null-terminated UTF-8 is a great example of a practical type: Unicode is an alphabet of characters in the abstract, while the type called "UTF-8 string" has certain permitted operations and a particular representation in machines.
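A minimal sketch of that representation/alphabet split: a Unicode code point is an abstract number, and UTF-8 is one concrete byte encoding of it. The function `encodeUtf8` below is hypothetical illustration code, not a library API.

```typescript
// Unicode is the abstract alphabet (code points); UTF-8 fixes one
// concrete machine representation for each code point as 1-4 bytes.
function encodeUtf8(codePoint: number): number[] {
  if (codePoint < 0x80) return [codePoint]; // ASCII: single byte
  if (codePoint < 0x800)
    return [0xc0 | (codePoint >> 6), 0x80 | (codePoint & 0x3f)];
  if (codePoint < 0x10000)
    return [
      0xe0 | (codePoint >> 12),
      0x80 | ((codePoint >> 6) & 0x3f),
      0x80 | (codePoint & 0x3f),
    ];
  return [
    0xf0 | (codePoint >> 18),
    0x80 | ((codePoint >> 12) & 0x3f),
    0x80 | ((codePoint >> 6) & 0x3f),
    0x80 | (codePoint & 0x3f),
  ];
}

console.log(encodeUtf8(0x41)); // [0x41]         'A'
console.log(encodeUtf8(0xe9)); // [0xc3, 0xa9]   'é'
```

The same abstract character could get an entirely different representation (UTF-16, say), which is exactly why the encoding is part of the type.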
Another good example is the set of all integers modulo 2^x, and the type consisting of unsigned x-bit binary integers for x86-style processors. The x86 add operation is permitted only on the latter, which makes it a distinct type from, say, the PPC integer type. (So does endianness, but that's encoding, not the op.)
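A sketch of the "integers modulo 2^x" idea, with x = 8: the representation is an 8-bit pattern, and the permitted "add" wraps around, unlike addition on the mathematical integers. The helper `addUint8` is an illustrative stand-in for what the hardware does, not real machine semantics.

```typescript
// Addition modulo 2^8: keep only the low 8 bits, as an 8-bit
// register would. This is a different operation, on a different type,
// than mathematical integer addition.
function addUint8(a: number, b: number): number {
  return (a + b) & 0xff;
}

console.log(addUint8(250, 10)); // 4, not 260: the sum wraps at 256
```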
Ideal types are things like sets, trees, lists, and so on. Their encoding is abstract enough that everything is possible. Look at any functional language. These things exist in math too, although often less formally defined. The only problem with ideal types is that bottom is a member, which is a problem for some of the operations permitted on them.
To put this all another way, types convey something terribly important: what is this thing, and what operations can I do with it. In math, we tend to hand-wave both of these things away, and instead rely on human intuition to determine what something is (set, integer, field, vector, etc.) and what we can do with it (intersect, add, integrate, rotate, etc.). But if you look hard, you'll see types in math. Vectors are bold, fields have arrows, integers have an initial assignment, sets are capital letters, and so on. A human must do type analysis to figure out whether an operation or encoding is conforming.
Null-terminated UTF-8 could be a type, but in most languages that use such a construct it is not, and in those languages where it could be a type, the designers are usually smart enough not to make such a type.
A null terminated UTF-8 string type gets into interesting territory when you want to add multibyte characters AND have a valid type at the same time.
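It's less dangerous territory than it might look, because UTF-8 was designed so that every byte of a multibyte sequence has its high bit set; a literal 0x00 byte can only ever mean "end of string". A hedged sketch of a validity check (the function name is illustrative, and this omits some corner cases such as overlong encodings and surrogates):

```typescript
// Checks that a byte buffer is well-formed UTF-8 up to a null terminator.
// Multibyte sequences are safe because no lead or continuation byte is 0x00.
function isValidNullTerminatedUtf8(buf: number[]): boolean {
  let i = 0;
  while (i < buf.length && buf[i] !== 0) {
    const b = buf[i];
    let extra: number; // number of continuation bytes expected
    if (b < 0x80) extra = 0;
    else if ((b & 0xe0) === 0xc0) extra = 1;
    else if ((b & 0xf0) === 0xe0) extra = 2;
    else if ((b & 0xf8) === 0xf0) extra = 3;
    else return false; // stray continuation byte or invalid lead byte
    for (let k = 1; k <= extra; k++) {
      i++;
      if (i >= buf.length || (buf[i] & 0xc0) !== 0x80) return false;
    }
    i++;
  }
  return i < buf.length; // must actually reach a terminator
}

console.log(isValidNullTerminatedUtf8([0xc3, 0xa9, 0x00])); // true:  "é\0"
console.log(isValidNullTerminatedUtf8([0xc3, 0x00]));       // false: truncated character
```

The one genuine wrinkle is U+0000 itself, which is why the JVM's "modified UTF-8" encodes it as two non-null bytes.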
Types exist in mathematics; it's just that when discussing mathematics the types are usually inferred. A lot of computer science concerns itself with algorithms rather than math.
E.g., when multiplying integers versus matrices, you'd use a different algorithm to compute the product for each of the respective types.
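A minimal sketch of that dispatch: the "same" operation, multiplication, means a single machine multiply for integers but a triple loop for matrices. The `matMul` routine below is illustrative only, not a production algorithm.

```typescript
type Matrix = number[][];

// Naive O(n^3) matrix product: an entirely different algorithm from
// integer multiplication, selected purely by the type of the operands.
function matMul(a: Matrix, b: Matrix): Matrix {
  const n = a.length, m = b[0].length, k = b.length;
  const out: Matrix = Array.from({ length: n }, () => Array(m).fill(0));
  for (let i = 0; i < n; i++)
    for (let j = 0; j < m; j++)
      for (let t = 0; t < k; t++)
        out[i][j] += a[i][t] * b[t][j];
  return out;
}

console.log(3 * 4);                                      // 12
console.log(matMul([[1, 2], [3, 4]], [[5, 6], [7, 8]])); // [[19, 22], [43, 50]]
```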
Types also mostly disappear from the written code when using a language derived from the lambda calculus rather than the universal Turing machine.
The problem is that math types are equivalent to dependent structural types. You can say, "let x be a set with a commutative binary operation". In languages with Hindley-Milner type inference you can do almost the same (you will miss the dependent part).
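A hedged sketch of the structural half of that, in TypeScript (which is structurally typed, though its inference is not Hindley-Milner): any value of the right shape counts as "a set with a binary operation". What you cannot express, the dependent part, is the law that the operation is commutative; that stays outside the type. The names `Magma` and `looksCommutative` are just illustrative.

```typescript
// "A set with a binary operation" as a structural type: no nominal
// declaration needed, anything with this shape qualifies.
interface Magma<T> {
  elements: T[];
  op: (a: T, b: T) => T;
}

const additiveInts: Magma<number> = {
  elements: [0, 1, 2, 3],
  op: (a, b) => a + b,
};

// Commutativity is a proposition about op, not part of its type here,
// so the best we can do without dependent types is test it.
function looksCommutative<T>(m: Magma<T>): boolean {
  return m.elements.every((a) =>
    m.elements.every((b) => m.op(a, b) === m.op(b, a))
  );
}

console.log(looksCommutative(additiveInts)); // true
```

In a dependently typed language (Agda, Idris, Coq), the commutativity law itself would be a field of the type, and constructing a value would require proving it.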
And the most famous languages with static typing (Java, C++, C#, Scala) do not have such power.
For more information, I'd urge you to see Daniel Spiewak's amazing presentation, Uncovering the Unknown: Principles of Type Inference [1], and James Iry's Types à la Chart [2].
F# seems to have a Hindley-Milner type system, as described some years ago by Daniel Spiewak [1].
That's a vast improvement over more mainstream type systems, while not as powerful as traditional logic-based type systems, such as those used in first-order and higher-order logic.
However, I personally rely more on unit tests than on type systems to make sure the code is consistent. And when I do that, especially when I write the test before the code, I find the type system getting more in the way than helping me. Rich Hickey, Clojure's creator, compared [2] these to guard rails.
I'm not sure that's quite right. Haskell models the lambda calculus pretty closely, but it has one of the most complicated type systems known to man. In its modern form (with common extensions), the type system is Turing-complete: the type system itself is capable of universal computation.
The thing is just that type systems tend to be more usable in lambda calculus languages.
You mean that Haskell has one of the most complex of the practical type systems known to man. For example (correct me if I'm wrong), the Haskell type system does not allow specification of dependent types, though dependent typing can be simulated after a fashion. Haskell with extensions does have a Turing-complete type system, though. (I actually don't know what this means in practice, but the internet says so; perhaps someone can tell me the difference between Haskell and Haskell with extensions.)
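"Turing-complete type system" means the type checker itself can be made to perform arbitrary computation. A sketch of the flavour, using TypeScript (whose type system is also known to be Turing-complete) rather than Haskell-with-extensions: Peano numerals encoded as types, with addition performed entirely by the type checker. The type names are illustrative.

```typescript
// Natural numbers at the type level: Zero, Succ<Zero>, Succ<Succ<Zero>>...
type Zero = { tag: "zero" };
type Succ<N> = { tag: "succ"; pred: N };

// Type-level addition by recursion on the first argument. No value of
// these types is ever constructed; the computation happens in the checker.
type Add<A, B> = A extends Succ<infer P> ? Succ<Add<P, B>> : B;

type One = Succ<Zero>;
type Two = Succ<One>;
type Three = Succ<Two>;

// This assignment only type-checks because Add<One, Two> evaluates,
// at compile time, to the same type as Three.
const proof: Add<One, Two> extends Three ? true : false = true;
console.log(proof); // true
```

In Haskell the analogous machinery is type families and multi-parameter type classes; extensions like `UndecidableInstances` are what tip the checker over into full Turing-completeness (and hence possible non-termination).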