I mostly agree with you, but I would add that systems fit different models (aka abstractions) as they scale: as we add water molecules to a system we'll start to see microfluidic, then fluid, then oceanographic, and eventually cosmological properties emerge. The molecules are the same and always exhibited all of those properties, but whether we can see them easily, or use them as a modelling tool, depends on the scale.
I've begun to question the complexity of many of these paradoxes I run into these days.
In this case, isn't this just a matter of the definition of a "heap of sand"? The article seems to sort of forget to define a heap and just proceeds to make a problem out of it, asking questions like "when does a heap become not a heap?" when it was never even defined why we would call something a heap in the first place.
So I think the real question is just why you are calling something a heap in the first place, and once you have an answer to that, that same answer will also help you figure out when the thing you call a heap is not a heap anymore.
> when it was never even defined why we would call something a heap in the first place.
Cultural conditioning, technically; "just [the] reality", "you know what I mean", etc., colloquially. It's a broadly "known" meaning despite having no definition, and that's where the paradox comes from.
Lots of things are like this, the requirement(!) to fight wars for example - there's no objective definition of why we must do it, we just "know" that we must. And if you disagree with me, just ask anyone and see what they say (watch out though: quite often people will trick you and answer both yes and no to the same question, with complete sincerity).
Watch children playing house and listen carefully to how they talk, how they describe the details of their imaginary world, etc. It's an innate feature of human consciousness; it never goes away, but it becomes cloaked by education, culture, "facts", etc.
The problem is that heap has only an informal connotative definition, not a denotative one. 4 grains of sand doesn't seem like a heap, but if your kid leaves 4 of his toys out you might consider it a heap.
So now what, does the definition of a heap have to account for the subject's and object's relative scales? Formalizing informal notions is not always straightforward.
You're totally right, and this is exactly the sort of thing Wittgenstein set out to show. The trouble is that people have an idea of what a "heap" is and what "evil" is and what "knowledge" is, and a lot of philosophy is concerned with pinning down those notions or at least saying something concrete about them.
I think the Sorites paradox may point to something specific about these kinds of ideas, which is not shared with the meaning of things like evil or knowledge. There's a page on it on SEP: https://plato.stanford.edu/entries/sorites-paradox/
I've always thought of emergence as "a behavior at scale which is unintuitive or difficult to predict given an understanding of a lesser scale".
More specifically, I think "emergence" is more about the blind spots we have as meat calculators than something magical, unless you subscribe to the notion of "magic" as "something sufficiently advanced or complex as to be difficult to understand", in which case I think actually yeah, emergence is magical behavior from that perspective.
I've only skimmed the paper so far, but apart from both using the word "emergence" I think you and the paper are talking about very different things. The paper means something along the lines of "behaviours at different scales that are causally isolated from each other", and they give a mathematical treatment of that.
I imagine they would agree with you that there's nothing magical about it! The map is not the territory but some maps are better than others.
I don't know if I disagree with this, but it reminded me of one thing that for some reason stuck in my mind.
It was about statistics and probabilities. I think I was talking with ChatGPT about superpositions or something, and somehow we got to talking about how it might be impossible, for example, to make a system which could predict each person's favourite flavour of ice cream.
Interestingly enough, though, it is perfectly possible to gather data about people's favourite ice cream flavours (we could even go as far as asking every single human on the planet) and build a statistical model that can answer what a given person's favourite ice cream flavour most probably is.
I find this really interesting. We could think of one person's flavour as essentially random and impossible to predict, but when we gather enough of these random data points, we are for some reason able to build a relatively accurate system to guess someone's favourite flavour. I don't think this is obvious at all.
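To make that concrete, here's a toy sketch in Python of what I mean (the flavours and their weights are entirely made up): each individual draw looks random, but with enough samples the aggregate distribution stabilises, and simply guessing the most common flavour beats guessing uniformly at random.

    # Toy illustration: individually "random" preferences, useful in aggregate.
    # The flavours and weights below are made up for the example.
    import random
    from collections import Counter

    flavours = ["vanilla", "chocolate", "strawberry", "pistachio"]
    true_weights = [0.35, 0.30, 0.20, 0.15]  # hidden "population" preferences

    random.seed(0)
    survey = random.choices(flavours, weights=true_weights, k=10_000)

    counts = Counter(survey)
    empirical = {f: counts[f] / len(survey) for f in flavours}
    best_guess = counts.most_common(1)[0][0]  # the modal flavour

    # Guessing the mode is right ~35% of the time for a new person,
    # versus 25% for a uniform random guess over four flavours.
    print("estimated distribution:", empirical)
    print("best single guess:", best_guess)

No single answer is predictable, but the frequencies are, and that's all the model needs.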
Anyway, I didn't have any real point here. I just wanted to share one thing that I think is an example of "emergent behaviour", and a seemingly magical one at that.
Yes, nothing magical about it, but in some cases the macro behaviour is only possible with "enough" of the micro behaviour.
You can, for example, go mathematically from describing individual behaviour to describing macro behaviour and see the macro emerge in the limit (e.g., homogenization of PDEs).
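Homogenization proper takes more machinery than fits in a comment, so here's a minimal sketch of the same micro-to-macro limit using a simpler stand-in of my own choosing (a random walk converging to diffusion): each walker's path is unpredictable, yet the ensemble variance matches the macroscopic (diffusive) prediction of roughly one unit per step.

    # Micro: independent +/-1 random walks, individually unpredictable.
    # Macro: with many walkers, positions follow the diffusion (heat)
    # equation, so the variance after n steps is ~ n.
    import random
    import statistics

    random.seed(0)
    n_walkers, n_steps = 5_000, 200

    def walk(steps):
        # Final position of one walker after `steps` random +/-1 moves.
        return sum(random.choice((-1, 1)) for _ in range(steps))

    positions = [walk(n_steps) for _ in range(n_walkers)]

    # Macro prediction: mean ~ 0, variance ~ n_steps.
    print("sample mean:    ", statistics.mean(positions))
    print("sample variance:", statistics.variance(positions), "(theory:", n_steps, ")")

The micro description never goes away; the macro one just becomes the useful abstraction once there are enough walkers.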
The thing is that we use different (and incompatible) abstractions to describe processes in differing contexts.
The map is not the territory.