My favorite "feature" of Chrome is the way it invites people to look through your browsing history every time they open a new tab, and there's no way to disable it. There used to be a way to disable it, but they removed it sometime last year.
Because users don't know what they want, right Google?
Almost nothing you said is correct aside from your basic description of the procedure.
Weight loss is rarely the primary goal of FMT and its efficacy in that realm remains unproven. Nowhere did the article imply that any of the things that made you "cringe" are necessary. And oral administration isn't objectively better than the alternatives, nor is it always an option since many people understandably find the idea of swallowing a "poop pill" rather repulsive.
>>Weight loss is rarely the primary goal of FMT and its efficacy in that realm remains unproven.
Well, doesn't this case demonstrate that FMT can have an effect on weight, albeit in an unwanted direction? So something is going on here, and while the science is unproven, this incident sure seems to show that the technique has promise.
Don't you mean "the kind who use proper spelling"?
No. Both "who" and "that" may be used to anchor relative clauses with human antecedents. To quote [1]:
Most writers use 'that' and 'which' as the relative pronouns for inanimate objects, and 'who' as the relative pronoun for humans. This widespread habit has led to the mistaken belief that using 'that' in reference to humans is an error. In fact, while most editors prefer 'who' for people, there is no rule saying we can’t use 'that', and 'that' has been widely used in reference to people for many centuries.
In other words, my original sentence was perfectly grammatical, and in regard to the original commenter's presentation skills -- and their correlation to his employability and salary potential -- I was simply stating the obvious. But this being HN, I've been downvoted and/or flamed at least 8 times for it.
Did you forget about the entire city of Philadelphia? 14 inches predicted, < 1 inch actual snowfall.
The city is practically a ghost town today thanks to a botched weather forecast. Tons of businesses and schools closed preemptively because forecasters were predicting massive snowfall totals with what seemed like absolute certainty. According to meteorologists here, it wasn't a question of whether we were getting snow; it was a question of whether we were getting one foot or two.
> Did you forget about the entire city of Philadelphia? 14 inches predicted, < 1 inch actual snowfall.
OK, but so what? This is about unpredictability, not over-estimation. If the unpredictability had been better conveyed, the best-case scenario is still everyone preparing, as they did, for a storm that didn't hit. The worst case is suffering hardship from underestimating the storm.
I'm a bit further northeast and it's coming down pretty hard outside my window right now. The preemptive cancellations really did avoid a lot of unsafe travel here. We have a bit more familiarity with the unpredictability of snowfall predictions, but it usually manifests as people saying: "It won't be that bad" and getting into trouble the few times it really is that bad.
It didn't hit the cities that aren't used to big snowfall. But those cities house a lot of the national media, so now we get to listen to their scorn for the next week.
I would rather they close and not get snow than stay open and put people in danger. Last year in Georgia, the forecast was so bad and officials were so slow to react that thousands were trapped at work and tens of thousands spent hours just driving home.
> But if you talk to startups, you find practically every one over a certain size has gone through legal contortions to get programmers into the US, where they then paid them the same as they'd have paid an American.
The whole argument hinges on this unverified anecdote.
Sorry, but your personal experience isn't universal.
Well, that's not fair. The "occlusion mask" is referenced in a patent application, but it's basically just an LCD that sits in front of your eye and blocks out light. Think full-screen Google Glass, at best.
Well, that's exactly the point: an anti-display that lets through the parts of reality the system wants you to see and blocks light from the parts it doesn't -- so another display can render something to fill in the blocked-out areas. Rather than the "ghosting" you get from putting an elephant over your hand, with the light of both being added together, the LCD acts as a dynamic precision cutout shutter: it blocks the light from where the elephant would be and lets another display supply _all_ the light you'd expect from that visual addition.
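In pixel terms the difference is easy to sketch. Here's a toy NumPy illustration of the compositing math -- my own sketch, nothing from the patent, and all the array names and numbers are made up:

    import numpy as np

    # Toy sketch: naive additive see-through optics vs. an occlusion shutter.
    H, W = 480, 640
    reality = np.random.rand(H, W, 3)            # light arriving from the real scene
    virtual = np.zeros((H, W, 3))                # rendered virtual content
    virtual[100:300, 200:400] = [0.6, 0.5, 0.4]  # the "elephant"

    # 1.0 wherever the virtual object should fully occlude reality
    mask = np.zeros((H, W, 1))
    mask[100:300, 200:400] = 1.0

    # Additive optics: virtual light lands on top of real light, so the
    # object reads as a translucent "ghost" over whatever is behind it.
    ghosted = np.clip(reality + virtual, 0.0, 1.0)

    # With the LCD shutter: real light is blocked where the object sits,
    # and the display supplies all of that region's light, so it looks solid.
    occluded = reality * (1.0 - mask) + virtual * mask

The point is that the second composite replaces the real light instead of adding to it, which is exactly what the shutter buys you.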
Does this mean that people looking at me using my magic leap will be able to see an outline of shapes which I'm projecting? That's going to be awkward.
I stopped updating all my Google apps ~2 years ago because there were so many asinine, unintuitive UI changes with each new version. A lot of their changes didn't even conform to the standards recommended by Google itself (e.g. menu button locations). It's almost as if the UI team gets bored and decides to make changes just because they need to occupy themselves. And then instead of testing them with actual users, they just talk amongst themselves about how awesome their new set of obscure gestures and button locations is.
I'm afraid to buy a new Nexus because of how bad the application interfaces will probably be.
At the very least, now it seems Google is making all their applications fairly standardized in their UI/UX. I think that's Material Design at work, but I'm far from a UI/UX person.
Are you serious? Every operating system vendor these days "standardizes" its UI across its whole offering every few years (Microsoft, Google, Ubuntu and Apple are all on board). Just a moment ago, Holo design was the thing on Android. I guarantee you that Material Design, too, will be a thing of the past in a couple of years. What the designers don't seem to realise is that this is frustrating to users.
I just updated my Nexus 5 to KitKat. The update forced me to use GMail instead of the Email app, and almost 24 hours later GMail is still "Getting your messages". None of my email accounts work. Never mind changing the UI; they should start by not completely breaking the key functionality of their applications in their effort to force everyone onto their services.
K-9 Mail is the traditional answer to this problem. Google never put much effort into the stock Email app because they wanted to push you into using GMail. That said, Email still works fine in KitKat -- why did you have to switch to the GMail app?
Did I accidentally fall into a time machine and end up in the 1980s?