All those words surely sound impressive when you list them together like that. But you can remove at least pattern matching and records, as they are coming to C#. And some of them are a matter of taste (e.g. nested functions, global inference and custom operators; the latter two can reduce readability IMHO). For me, however, the killer feature of C# is ReSharper and the rest of the tooling. There's no point arguing: F# is a great language, but C# is a great one as well, and there's no need to write code samples that misuse the language just to make it look bad. I considered using F# for physics computations (to leverage the units of measure), but unfortunately it provides no way to mark external types with them, which means I cannot integrate it with e.g. MonoGame. I still hope to find a good use case for F#.
Pattern matching in C# would be a rather big departure, and matching F#'s level of deconstruction would be neat. C# can become F# if it wants; they just have to keep adding features. They've been hesitant to do so, based on limitations of the compiler codebase as well as fears that C# might get too difficult for mediocre programmers.
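For readers who haven't seen it, here's a minimal sketch of the level of deconstruction being referred to (the Shape type and the names in it are made up for illustration):

```fsharp
// A discriminated union plus a match that deconstructs it in place,
// with guards ("when" clauses) mixed into the same construct.
type Shape =
    | Circle of radius: float
    | Rect of width: float * height: float

let describe shape =
    match shape with
    | Circle r when r <= 0.0 -> "degenerate circle"
    | Circle r -> sprintf "circle of radius %g" r
    | Rect (w, h) when w = h -> sprintf "square of side %g" w
    | Rect (w, h) -> sprintf "%g x %g rectangle" w h
```

The compiler also warns when a match is non-exhaustive, which is a large part of why deconstruction is safe to lean on.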
I'm not sure how new List&lt;string&gt; { "a", "b" } is more readable than new List&lt;_&gt; { "a", "b" }, but hey, sure, if you want to argue type inference is bad, go for it. F# does, on the other hand, lack loop constructs like break/continue.
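For comparison, a minimal sketch of what the inferred version looks like on the F# side (ResizeArray is F#'s built-in alias for System.Collections.Generic.List):

```fsharp
// The element type is inferred from the literals; no annotation needed.
let xs = ResizeArray ["a"; "b"]   // xs : ResizeArray<string>

// And in place of break, the usual substitute is an early-exiting
// combinator such as tryFind, which stops at the first match.
let firstShort = xs |> Seq.tryFind (fun s -> s.Length < 2)
```

So the lack of break/continue is real, but in practice it mostly pushes you toward combinators or recursion rather than leaving you stuck.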
I don't feel I've misrepresented C# at all to make it look bad. I've written a lot of C# code and a fair amount of F# code. Line-by-line, char-by-char, expression-by-expression, F# is simply much less code. Those examples are just things off the top of my head, from real codebases.
C#'s alright, because the competition (like Java) is laughable. So in that sense, it's "great". In absolute terms, it doesn't measure up (and this isn't a secret: bit by bit, C# adopts features F# proved out).
Units of measure are implemented via erasure, yes. How would you represent float&lt;m&gt; externally to make it available to common types without losing performance? The compiler consuming them would need to be aware of it, and at runtime you certainly don't want extra overhead; I don't think the CLR has any efficient way of exposing primitives with additional type info. It's an unfortunate tradeoff.
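A minimal sketch of the erasure tradeoff in question:

```fsharp
// Units are checked at compile time...
[<Measure>] type m
[<Measure>] type s

let distance = 100.0<m>
let time = 9.58<s>
let speed = distance / time   // inferred as float<m/s>
// let wrong = distance + time   // compile error: m vs s don't unify

// ...but erased at runtime: speed is a plain System.Double, so there
// is zero overhead -- and also no type information left for external
// (non-F#) consumers such as MonoGame to attach or inspect.
```

That's exactly why you can't annotate MonoGame's types with measures: the measure exists only inside the F# compiler, so there's nothing to project onto a type the F# compiler didn't check.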