> To protect potentially compromised accounts, npm is invalidating all npm login tokens created between 2018-07-11 00:00 UTC and 2018-07-12 12:30 UTC (about 2 hours ago). If you believe your account specifically was compromised we still recommend visiting https://www.npmjs.com/settings/~/tokens to revoke all your tokens.
> Posted about 20 hours ago. Jul 12, 2018 - 16:42 UTC
Then later:
> We have now invalidated all npm tokens issued before 2018-07-12 12:30 UTC, eliminating the possibility of stolen tokens being used maliciously. This is the final immediate operational action we expect to take today.
> We will be conducting a forensic analysis of this incident to fully establish how many packages and users were affected, but our current belief is that it was a very small number. We will be conducting a deep audit of all the packages in the Registry to confirm this.
> Posted about 18 hours ago. Jul 12, 2018 - 18:52 UTC
This is, to a degree, what happens. However, a species that ages has an advantage in that new individuals, with perhaps better characteristics in the face of new circumstances, have access to more resources.
I do wonder whether there is a point where the lost experience of older individuals weighs heavier than the gain from physically better-adapted ones. In a way, culture serves as a way to preserve information that would otherwise be lost to aging.
> new individuals, with perhaps better characteristics in the face of new circumstances, have access to more resources.
Another way to interpret this would be “increased diversity,” which is a selection advantage in an environment that experiences change. While organisms don’t have “clocks,” they do have different rates of aging, implying a biological control. And species that live longer tend to hold on to longer genomes. This tends to mean that short-lived species hold diversity in their population, while long-lived species hold diversity in every genome. Both operate as fitness advantages, and both seem to imply that aging is a biological quality.
So that would be like killing a program and respawning a new process instead for some benefit like reclaiming memory or speed (like I do with browser tabs).
I know evolution doesn't work that way, but definitely sounds like an interesting thought.
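The analogy is real engineering practice, incidentally: prefork servers like gunicorn can recycle workers after a fixed number of requests (its `max_requests` setting) precisely to reclaim leaked memory. A minimal sketch of the idea, with names and the recycle threshold chosen arbitrarily for illustration:

```python
import math
import multiprocessing as mp

TASKS_PER_WORKER = 3  # arbitrary recycle threshold for this sketch

def worker(task_q, result_q, n_tasks):
    # Pretend each task leaks a little memory; it all vanishes when
    # the process exits, which is the whole point of recycling.
    for _ in range(n_tasks):
        n = task_q.get()
        result_q.put(n * n)

def run(tasks):
    task_q, result_q = mp.Queue(), mp.Queue()
    for t in tasks:
        task_q.put(t)
    results = []
    for _ in range(math.ceil(len(tasks) / TASKS_PER_WORKER)):
        batch = min(TASKS_PER_WORKER, len(tasks) - len(results))
        p = mp.Process(target=worker, args=(task_q, result_q, batch))
        p.start()  # fresh process, clean heap
        results += [result_q.get() for _ in range(batch)]
        p.join()   # worker dies; the OS reclaims everything it held
    return results

if __name__ == "__main__":
    print(run([1, 2, 3, 4, 5]))  # [1, 4, 9, 16, 25], across two worker generations
```

Each "generation" of worker starts from a clean slate, much like closing and reopening a browser tab.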
> So that would be like killing a program and respawning a new process instead for some
> benefit like reclaiming memory or speed ...
>
> I know evolution doesn't work that way,
Sure, it's better now, but that's how the early versions of Evolution worked (sorry, couldn't resist!).
I'm not sure what class of error this is, but it's a common reasoning mistake in discussions on evolution.
By your same logic, you could pick any species at all and call their traits "the winning strategy for long-term survival" as long as you live contemporaneously with them.
There are known species that don't seem to exhibit planned senescence; the naked mole rat is a commonly discussed example. Check the Wikipedia page for biological immortality[1] for more examples and info.
The article essentially claims that ageing is likely a result of interacting thermodynamic processes. If so, then without specific preventative measures, organisms will "age".
The discussion at this level is pretty hand-wavy. So without introducing more rigor, the best we can say is probably something like this: there hasn't been strong selective pressure in the past to develop anti-ageing strategies.
Exactly. Some tortoises as well can live a few hundred years, as can whales, urchins, sharks, quahog clams, and, as someone else mentioned, jellyfish.
Though you won't see any of them developing rockets and space-stations. To what extent that is an evolutionary advantage on our part I'll leave to general consensus.
By the way, everything isn't just thrown into a single container. We use Docker to set up the whole app over a series of containers.
So postgres, redis, the flask app and celery all run in their own containers.
Without Docker, you're responsible for getting not only virtualenv set up, but also installing postgres and redis on your box, which is wildly different on Windows, MacOS and Linux.
From a teaching POV, it's easier to level everyone off with Docker, but from a student's POV, it's also easier because now they don't need to worry about any of that. It lets them get to the important material faster. They can just install Docker and run a simple `docker-compose up --build` and they are good to go.
Compare that to listing out a million steps to install Python, set up virtualenv, write special clauses for people on Windows or MacOS, etc. It's a huge burden for everyone, and it's the main reason I started using Docker for my own projects years ago.
We haven't even talked about installing multiple versions of postgres or redis without Docker too (because apps tend to be developed at different points in time), or deploying an app to production. All of the steps you do to set up a typical real world Flask app on Windows and MacOS are thrown out the window without Docker because your production box is likely running Linux, but with Docker, it's pretty much the same.
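For reference, the setup described above boils down to a compose file along these lines (image versions, service names, and the dev-only credential are illustrative, not taken from the course):

```yaml
version: "3.8"

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: example   # dev-only credential

  redis:
    image: redis:6

  web:
    build: .                       # the Flask app's Dockerfile
    command: flask run --host=0.0.0.0
    ports:
      - "5000:5000"
    depends_on:
      - postgres
      - redis

  celery:
    build: .                       # same image, different command
    command: celery -A app.celery worker
    depends_on:
      - redis
```

With that in place, `docker-compose up --build` really is the whole onboarding story, on any OS.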
Consumer perception, mainly. It depends on the target audience of the page. Modern consumers have come to correlate certain appearances with quality. It's not an awesome cultural development, but you're not likely to gain much traction by fighting against that momentum unless you already have an oversized influence on that readership's expectations.
I often hear people point to security as a reason to avoid shipping sourcemaps in production, but it seems like such a non-issue given that anyone can unminify the code shipped to their browsers. What kinds of secrets can be hidden via obfuscation? The answer has traditionally been "none," so I'm pretty consistently baffled. We do strip comments explicitly so that devs don't need to be as concerned with exposing anything that way, but aside from that I don't really understand this angle.
Ah, good ol' security via obscurity. I bet they don't want you to know about `var SUPER_SECRET_ENCRYPTION_KEY = ` or the inner workings of some crappy client DRM.
The only case where I'd think it makes any sense is protecting programming work from simple replication. While it isn't particularly hard to break bogus client-side security, it's difficult to turn a minified mess into comprehensible code.
> Regardless, another important thing is not to download source maps onto client's machines, as that defeats the whole point of minification.
Browsers don't download source map files unless the developer tools are opened. If your client is using your app with the dev tools open you may have other problems that have nothing to do with performance.
If it was important and estimates on how long it will take are overrun, then don't wait to follow up on it and find out how it's going.
Unless the problem is more nuanced, like a small component of a larger body of work? Like knowingly leaving flaws in an implementation, or something.