Hacker News | IronyMan100's comments

So if Amazon wants to be the lowest-price destination, but takes fees for listings, FBA, etc., then the product price needs to include those fees. That makes the product more expensive, and since Amazon wants to be the cheapest destination, the price then has to go up everywhere else too? It's maybe the fairest thing, but is it good for the overall economy?
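A rough numeric sketch of that pass-through argument. The fee rates and margin below are made-up assumptions for illustration, not Amazon's actual fee schedule:

```python
# Hypothetical fee pass-through under a price-parity clause.
# All rates here are assumptions for the sketch, not real Amazon fees.

REFERRAL_FEE = 0.15   # assumed referral fee, as a fraction of the sale price
FBA_FEE = 3.00        # assumed flat fulfillment fee per unit

def parity_price(cost, margin=0.10):
    """Price the seller must charge to cover cost, fees, and margin.

    Solves: price*(1 - REFERRAL_FEE) = cost*(1 + margin) + FBA_FEE
    """
    return (cost * (1 + margin) + FBA_FEE) / (1 - REFERRAL_FEE)

p = parity_price(20.00)  # ~29.41 for a 20.00 cost item
# If a parity clause forces the seller to list the same price everywhere,
# the marketplace fee is effectively baked into the off-Amazon price too.
```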

Working in the public sector can be exhausting, since some branches are severely underfunded, at least in Germany.


Sorry to hear that; I hope you and your family can find some peace and strength. I remember a quote that maybe fits here: you can play a perfect game and still lose. That's not a weakness, that's life.


The funny thing is, if these LLMs withhold this information, what else do they withhold? Can I trust these corporate LLMs when I look for information and I'm not deemed a domain expert?


How do you know a domain expert is not withholding information based on corporate instruction, personal bias, profit motivation, ...? What are your options as a non-domain-expert for verification? Do you trust peer reviews and metrics set up by the very experts you distrust? At what point have you taken enough steps backwards to question your own perception?


Normally in these kinds of systems, the detection is the nonlinearity. That is, you send light through the system; the light can interfere and change paths through the system, but in the end you can only detect the intensities, |E|^2.
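A minimal numerical sketch of that point: squaring the field magnitude at the detector discards phase, and that squaring is exactly the nonlinearity in the read-out. The field values are arbitrary examples:

```python
import numpy as np

# Two fields with the same amplitude but different phase:
E1 = np.exp(1j * 0.0)          # phase 0
E2 = np.exp(1j * np.pi / 3)    # phase pi/3

# A detector sees only |E|^2, so both look identical on their own:
I1 = np.abs(E1) ** 2           # 1.0
I2 = np.abs(E2) ** 2           # 1.0

# Only interference *before* detection makes the phase observable:
I_sum = np.abs(E1 + E2) ** 2   # 2 + 2*cos(pi/3) = 3.0, not I1 + I2 = 2.0
```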


Can they not go to court? I mean, being raised by one's own parents is a child's right.


This has its use. The continuous Fourier transform is based on that. You are asking what frequencies this continuous signal is made of. Time is normally defined as a real number in that context, but if you have continuous time you need continuous frequencies to map time space to frequency space. You can think of an index as a Lego block that you need to construct something.
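The pairing described above, in standard notation: continuous time t maps to continuous frequency f, both ranging over the reals:

```latex
\hat{x}(f) = \int_{-\infty}^{\infty} x(t)\, e^{-2\pi i f t}\, dt,
\qquad
x(t) = \int_{-\infty}^{\infty} \hat{x}(f)\, e^{2\pi i f t}\, df
```

Because t runs over all of ℝ, the transform needs a value at every real frequency f to invert losslessly, which is the point about continuous time requiring continuous frequencies.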


If I look at all the finfluencers and "get thin in 30h with my kale diet eBook" influencers, I'd have thought it was substantially more than 10%.


I think it's N5/N3 or so, the current state of the art.


Does this not make sense? I mean, LLMs basically learn the low-entropy, predictable part of the data. But then a small subset of the training data that contains information completely contrary to the rest of the dataset carries high information (high surprisal), by the definition of entropy.
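The entropy argument can be made concrete with Shannon self-information, I(x) = -log2 p(x): the rarer (more "contrary") an observation, the more bits it carries. The probabilities below are arbitrary illustrative values:

```python
import math

def self_information(p):
    """Shannon self-information in bits: I(x) = -log2 p(x)."""
    return -math.log2(p)

# A highly predictable observation carries almost no information;
# a rare, contradictory one carries many bits:
common = self_information(0.99)    # ~0.014 bits
contrary = self_information(0.01)  # ~6.64 bits
```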

