
One factor is that Bluetooth is on the 2.4GHz band, and WiFi on that band is many times more powerful - so Bluetooth packets can only get through in the gaps between WiFi transmissions. Many of the Bluetooth devices around use Nordic Semi nRF5x chips, and we know that with those chips, if there's a single WiFi device transmitting at one end of the frequency band and all the BT-type devices up at the other end, there are still significant interference effects. Also the short wavelength (~12.5cm) means the RF bounces and refracts all over the place, often resulting in small dead spots where devices fail to communicate (and the dead spots move about as the environment changes, e.g. people moving around).
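A quick back-of-the-envelope check on that wavelength figure (just λ = c/f, nothing Bluetooth-specific):

```python
# Wavelength at the bottom of the 2.4 GHz ISM band.
C = 3.0e8          # speed of light, m/s
freq_hz = 2.4e9    # 2.4 GHz

wavelength_m = C / freq_hz
print(f"wavelength ≈ {wavelength_m * 100:.1f} cm")  # ≈ 12.5 cm
```

Half-wavelength spacing (~6 cm) is roughly the scale at which multipath fading nulls (those "dead spots") appear and move.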

As a result, getting anything through relies on lots of error handling, and quite a lot of the software is not the highest quality, shall we say? It's also not helped by lying user interfaces telling you a device is connected when snooping packets manifestly shows it's not (looking at you, Apple).

Bluetooth can be pretty reliable - in RF-quiet environments. Low data rate stuff with error handling/repeat transmissions often works by getting through during periods when there happen to be few interfering signals. Trying to use BT for relatively high-rate data (e.g. audio) with lots of nearby WiFi networks is always going to be hit and miss.



> One factor is Bluetooth is on the 2.4GHz band, and WiFi on that band is many times more powerful - so Bluetooth packets can only get through in the gaps between Wifi transmissions.

It's the other way around: Bluetooth transmits whenever it wants to (or rather has to, per the frequency hopping sequence); 802.11/Wi-Fi has to deal with the consequences (i.e. consider the channel occupied, or back off its transmissions in case of a collision, per CSMA/CA).

All reasonably modern Bluetooth implementations support adaptive frequency hopping, though, and try to avoid channels known to be busy with other protocols.
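The idea behind adaptive frequency hopping (AFH) can be sketched roughly like this - not a real Bluetooth stack, and the "bad channel" set here is illustrative (real AFH classifies channels from measured error rates and must keep at least 20 channels in the hop set):

```python
# Classic Bluetooth hops over 79 x 1 MHz channels at 2402..2480 MHz.
BT_CHANNELS = list(range(79))

def wifi_channel_to_bt_range(wifi_ch):
    """BT channels overlapped by a ~20 MHz-wide 2.4 GHz Wi-Fi channel (approximate)."""
    center_mhz = 2412 + 5 * (wifi_ch - 1)   # 2.4 GHz Wi-Fi channel centers
    lo, hi = center_mhz - 10, center_mhz + 10
    return {c for c in BT_CHANNELS if lo <= 2402 + c <= hi}

# Suppose a scan found busy APs on Wi-Fi channels 1 and 6:
bad = wifi_channel_to_bt_range(1) | wifi_channel_to_bt_range(6)
good = [c for c in BT_CHANNELS if c not in bad]
print(len(good))  # 37 of 79 channels remain in the hop set
```

With two busy Wi-Fi channels masked out, the link still has plenty of clean channels to hop across, which is why AFH usually copes with moderate Wi-Fi congestion.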


My point was that at the RF level, BT packets get swamped by WiFi (hence only getting through in the gaps). With WiFi putting out 100 mW and BT around 1 mW, added to WiFi devices generally having much larger antennas, WiFi is the thing to work around. And in a busy environment (I often see dozens of APs in a scan) pretty much every channel is occupied - making it challenging to get much data through at BT-type power levels.


AirPods are Bluetooth Class 1 devices and can transmit at 100 mW as well! A larger (omnidirectional) antenna doesn't increase the EIRP either, which is what's limited for both Bluetooth and 802.11.

> And in a busy environment (I often see dozens of APs in a scan) pretty much every channel is occupied

Most of these channels are overlapping anyway, so we're really only talking about 3 or so channels in the 2.4 GHz band.
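A quick way to see why "dozens of APs" collapses to about three usable channels - 2.4 GHz Wi-Fi channel centers are only 5 MHz apart, while each channel is ~22 MHz wide (802.11b DSSS), so a greedy scan of the US channels 1-11 leaves only the familiar non-overlapping set:

```python
# Illustrative sketch: which 2.4 GHz Wi-Fi channels don't overlap each other.
def center(ch):
    return 2412 + 5 * (ch - 1)  # channel center frequency in MHz

def overlap(a, b):
    return abs(center(a) - center(b)) < 22  # closer than one ~22 MHz channel width

non_overlapping = [1]
for ch in range(2, 12):  # US channels 2..11
    if not any(overlap(ch, used) for used in non_overlapping):
        non_overlapping.append(ch)
print(non_overlapping)  # → [1, 6, 11]
```

So however many APs show up in a scan, their transmissions are all contending for roughly three independent slices of spectrum.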

And due to CSMA/CA, 802.11 has to back off in case of sensing any sufficiently powerful interference. Bluetooth Class 1 definitely seems like it would qualify, so Bluetooth will actually make 802.11 on all of these channels back off, reducing the duty cycle and creating enough gaps for Bluetooth to statistically get its transmissions through.

Bluetooth audio (A2DP) only needs about 300 kbit/s anyway; that's less than 20% of the total possible Bluetooth throughput, so even if every single transmission needs 5 retries, you'd still not have audio dropouts.
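The retry-budget arithmetic above, spelled out with rough ballpark numbers (the ~2.1 Mbit/s usable EDR figure and the 300 kbit/s A2DP rate are approximations, not measurements):

```python
# Back-of-the-envelope A2DP retry budget - not a link-layer simulation.
a2dp_kbps = 300          # typical SBC A2DP bitrate, approx.
bt_capacity_kbps = 2100  # rough usable throughput of Classic BT with EDR

retries = 5
needed = a2dp_kbps * (1 + retries)   # every packet sent 6 times in total
print(needed, needed <= bt_capacity_kbps)  # 1800 True
```

Even with every packet retransmitted five times, the demand (1800 kbit/s) stays under the link's rough capacity, which is why A2DP can absorb a lot of interference before audibly dropping out.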

I've never been in an environment where 802.11 was actually able to overwhelm Bluetooth audio connections, neither A2DP (which is heavily buffered), nor HFP (which isn't). Microwaves or older "non-backoff" 2.4 GHz transmitters (old baby monitors, analog cordless phones etc.) definitely can, though.


As an interesting experiment, take a connected Bluetooth speaker and put it in a microwave oven (without turning the oven on - and if there's any doubt about that, unplug it).

You will find that the bluetooth connection is instantly broken by the shielding when the door is closed.

Bluetooth also has to put up with leaky microwave ovens, in addition to other users of the 2.4GHz band.


I just put my iPhone 13 into my microwave while playing music on my AirPods Pro. They kept playing reliably until about 10 feet away, where breakup started occurring. By 15 feet the connection was gone. Walking back into the 10 feet range the music started again though.


Most Apple devices, including AirPods, support Bluetooth Class 1, which allows a 40 times higher maximum transmit power than the more common Class 2.
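The "40 times" figure follows directly from the power class limits in the Bluetooth core spec (Class 1: 100 mW / 20 dBm, Class 2: 2.5 mW / ~4 dBm):

```python
import math

# Maximum TX power per Bluetooth power class.
class1_mw = 100.0  # Class 1
class2_mw = 2.5    # Class 2

ratio = class1_mw / class2_mw
print(ratio)                                 # 40.0
print(10 * math.log10(class1_mw))            # 20.0 dBm
print(round(10 * math.log10(class2_mw), 1))  # ≈ 4.0 dBm
```

That extra ~16 dB of link budget is a big part of why the AirPods kept working through a (leaky) microwave door at short range.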


That is quite interesting. Thanks for sharing the results. I don't have Apple devices, so I am not able to verify this for myself at the moment.

I'm assuming that other bluetooth devices die immediately.


True, microwave ovens usually don't implement CSMA/CA :)


> Bluetooth can be pretty reliable - in RF-quiet environments

Like somewhere deep in the Swedish forests, just me and my chainsaw and a bluetooth headset which doubles as ear protectors.

...which still randomly starts stuttering or drops the connection.

Nah, Bluetooth is just unreliable. It works when the sun and moon and stars are in the correct alignment and/or a black cockerel of sufficient pedigree has been sacrificed in the correct way, otherwise it plays its tricks on you.



