Sagan made solid contributions to planetary science in the '60s and '70s.
His role as PBS educator, SF author, etc. needs to be considered separately.
I also loved James Burke and his Connections series, but as it got into the later seasons the so-called "connections" got tenuous and sometimes quite strained.
You can go through all the classic PBS science shows and find problems; Stephen Hawking's Universe, for instance, was basically unwatchable because they refused to engage with the math.
People like Sagan have a worldview in which we are all either rational robots who only believe in "science", or else silly magic-believers who can't think for themselves. Of course Sagan himself proves that this is wrong: you can be a great scientist while believing a lot of silly nonsense about the ancient world, and apparently about crab evolution too.
I actually kinda like Sagan; Cosmos is seriously gorgeous (though Vangelis does a lot of the heavy lifting there). I'm just kinda tired of the whole "Library of Alexandria" myth that comes up constantly on nerdy forums such as this one.
> Believing silly nonsense which is still plausible isn't the same category of error as believing in magic.
Eh, hard to say what is magic and what is not. Sagan's beliefs about the ancient world could have been fixed with a five minute conversation with an expert. He just didn't care enough to do that, and for some reason that attitude is common among so-called skeptics when it comes to history.
I had to look up what, specifically, Sagan's errors about history were, and yes, it looks like he popularized a lot of bad mythology (I referenced this site[0]). I have to admit I believed a lot of this myself at one time. But I think there's a difference between the plausible but wrong and the impossible. If he'd said he thought Atlantis was real, or gone on about Tartaria or the great Ice Wall, that would be a lot closer to magical thinking.
Although it is a lot easier to be led astray by plausible lies than by implausible ones. Sagan certainly seems guilty of being uncritical of narratives that reinforced his worldview.
> I don't think it is? I think there's a difference between the plausible but wrong and the impossible.
I find that the word "magic" is very overused by smart people online as a sort of thought-terminating cliché. It's a vague concept and I'm not always sure what they mean by it.
It's often extremely hard, even for top minds, to tell magic and science apart ahead of time. Think of Einstein mocking quantum physicists for believing in "spooky action at a distance". Of course, if you still don't believe in quantum entanglement today, you are being irrational, but only because science has (mostly) settled the question; it has nothing to do with how magical or plausible the concept may sound.
Someone defending astrology will tell you that the gravity of the moon affects their bloodstream the way it affects the tides of the ocean. That doesn't hold water if you sit down and do the math, of course, but the same is true if you bother to check the dates of events in ancient history.
Emacs user. And the fonts I use have to work with anti-aliasing turned off.
Right now I'm using a Dell/Alienware AW3225DM and it's perfect for my needs (work + occasional gaming, and most of my gaming is retro). Best Buy was discounting these during the Xmas season.
I do not want anything higher than 2560x1440 because it makes my fonts look tiny, or I have to turn anti-aliasing on. Neither option is OK with me.
Any font looks much better on a monitor with a higher resolution, and the size of the fonts must not vary with the resolution of the monitor. A 4k monitor always provides more legible text than a 2560x1440 monitor.
The size of the fonts used by your documents is specified in typographic points, e.g. 12 points or 14 points. This corresponds to a fixed size on the screen, regardless of the screen resolution. The increased resolution only makes the letters more beautiful, not smaller.
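To make the arithmetic concrete, here is a quick Python sketch (the point = 1/72 inch convention is standard; the DPI figures below are my own illustrative numbers, not anything from this thread):

    # A typographic point is 1/72 inch, so the on-screen size of a font
    # depends only on the point size and the display's true DPI.
    def font_px(points: float, dpi: float) -> float:
        return points / 72.0 * dpi

    print(font_px(12, 96))    # 16.0 px on a classic 96 DPI display
    print(font_px(12, 163))   # ~27.2 px on a 27" 4k panel (~163 DPI)

Same point size, more pixels: the glyphs gain detail, not a different physical size.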
If your fonts become smaller on a monitor with a higher resolution, then you are holding it wrong, i.e. your operating system is badly configured and does not know the correct dots-per-inch value for your monitor, so it falls back on a DPI value that corresponds to obsolete VGA-era monitors (typically 96 DPI).
A decent operating system should automatically configure the right DPI, because the monitor provides this information to the GPU when it is initialized: the EDID includes the panel's physical dimensions, from which the correct DPI follows.
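For illustration, a minimal sketch of that computation (assuming the EDID-reported physical width in millimetres and the native horizontal resolution; the example figures are mine):

    # EDID reports the panel's physical image size, so the true DPI
    # follows directly from it and the native pixel width.
    MM_PER_INCH = 25.4

    def dpi_from_edid(px_wide: int, mm_wide: float) -> float:
        return px_wide / (mm_wide / MM_PER_INCH)

    # A 27" 16:9 panel is ~597 mm wide; at 3840 px across that gives:
    print(dpi_from_edid(3840, 597))   # ~163 DPI, not a 96 DPI guess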
Despite this, for some weird reason many operating systems do not use the DPI value read from the monitor to configure the graphics interface automatically, so it must still be set manually by the user. Even worse, the corresponding setting is frequently well hidden, so it is difficult to discover.
In any case, these endless discussions about fonts being too small on high-resolution monitors have been caused only by some incompetent morons who, for inexplicable reasons, have been in charge of the display settings of the popular operating systems. The user may have reasons to override the true DPI value of the monitor, but by default the OS should always have used the value provided by the monitor EDID, and then you would never have seen any change in font sizes when substituting monitors with different resolutions. (The exception is when even more incompetent Web designers specify sizes in pixels instead of length units; allowing pixels alongside length units for the sizes of graphic elements was a huge mistake, but when this was done several decades ago, most computers did not have GPUs yet, so there were concerns about rasterization speed in software.)
I used to work in my mom and dad's print shop when I was a kid. 6 picas in an inch, 12 points in a pica, and by the time you go home your hands smell like hypo. That should give you an idea of how old I am.
For a kid I was passably good at setting up headlines for paste-up, but I never had to be the one who used an X-Acto Knife.
I'll die on the hill that 2K is better than 4K if your livelihood depends on staring at a screen from a distance of 60cm for upwards of 10 hours a day, sometimes longer.
I also think you missed my point about the anti-aliasing. For various reasons I still use Windows, and some of my favorite monospace fonts only exist in the .FON format. I can emulate the X Windows experience of using the misc-fixed-medium family and it works just fine for my needs.
I agree that on monitors with insufficient resolution, ancient bitmap fonts can be sharper, because they are free of the artifacts caused by a mismatch between the shapes of the letters and the pixel grid.
Your problem is precisely that you use monitors whose resolution is too low. On monitors with a high enough resolution, you approach the quality of printed paper and can use monospace fonts that are more beautiful than any bitmap font, without being able to perceive the pixels.
The only problem is that big monitors also need a higher resolution, and the combination of big size with high resolution can be expensive.
While 4k monitors can be quite cheap at 27" or 32", I believe that at such sizes a 5k resolution is the minimum for good text rendering, and 5k monitors remain expensive.
In the limit, as pixel density increases, regular unhinted text positioned at floating-point x coordinates looks just like it would on a printed page. How can you get better than that? With enough resolution, you free yourself from all the hacks we've devised to make text on a computer halfway tolerable. Shouldn't that be the goal?
If you want that blocky-font retro look, you can use vector art to make squares.
To complicate things further, Guaraldi released "Jazz Impressions of A Boy Named Charlie Brown" (once again based on the 1963 documentary), but these recordings are not the same as the cues used in the documentary.
If you go back and watch the first two seasons of HBO's Westworld, you will see Anthony Hopkins' character repeatedly delivering exposition dumps. The difference is in how he does it: he is in such complete command of his craft that he can put across exactly what the screenwriters intended without drawing any attention to it.
And Trekkies will remember the time Larry Niven wrote a screenplay for TAS and gave all the exposition dumps to Leonard Nimoy. See how nicely he handles it?
Once you develop an awareness of how SF screenplay writers do this, you can't unsee it.
Babylon 5 was particularly egregious. I was never a fan, but I was puzzled that JMS had to rely on it so heavily. It was like he created the character of Delenn just to be an exposition dumper, and Mira Furlan faithfully did what was asked of her. Screenwriters also call this diegesis when the writer goes all the way and uses dialog to explicitly feed the narrative to the audience.
More likely to get hit with a Zero Tolerance punishment for a single isolated incident, which derails your entire trajectory through the school system.
I've always felt that tech workers view themselves as a modern-day petite bourgeoisie, and that this is why the industry has been so successful at keeping out the unions.
As an aside, I had friends who had to declare bankruptcy during DotCom 1.0 because of stock options and the Alternative Minimum Tax. This could have been fixed with legislation, but it always seemed like the DC inside-the-Beltway crowd saw the whole thing as class treachery and refused to intervene.
The story wasn't actually about the trucker being hardworking (or not), though I'm sure he was. He wasn't trying to make people believe he literally was the hardest working.
The joke is that everyone else he went to war with was claiming to be something else, so he must have delivered all the supplies himself.
The response is interesting to me because, having fought in a war (though I am not a US veteran), I instantly got it. And the place I heard it from was more veteran-dominated, and everyone instantly understood and appreciated the joke.
I didn't get it until you explained it. It makes a lot of sense: people who have actually gone to war know about stolen valor and embellishments and can sniff them out immediately. People who have never been, and don't hang around military types much, have much less of this kind of context.