They chose to make a totally new, inferior LC3 codec, though.
Also, on my system (Android phone + BTR5/BTR15 Bluetooth DAC + Sennheiser H600) all options sound really crappy compared to plain old USB; everything else is the same. LDAC at 990 kbps is less crappy, by sheer brute force. I suspect it's not only the codec but other co-factors as well (like mandatory DSP on the phone side).
"Inferior" is relative. The main focus of LC3 was, as the name (Low Complexity Communication Codec) suggests, low complexity.
This is hearsay: the Bluetooth SIG considered Opus but rejected it because it was computationally too expensive. This came out of the hearing-aid group, where battery life and complexity are major constraints.
So when you compare codecs in this space, the metric you want to look at is quality vs. CPU cycles. In that regard LC3 outperforms many contemporary codecs.
Regarding sound quality, it's simply a matter of setting the appropriate bitrate. So if Opus is transparent at 150 kbps and LC3 at 250 kbps, that's totally acceptable if it buys you more battery life.
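To see why a higher bitrate can still be the better deal, here's a back-of-envelope energy model: total power is the radio cost of shipping the bits plus the CPU cost of decoding them. Every constant and operating point below is an illustrative assumption, not a measurement.

```python
# Toy model: power = radio energy per bit + CPU energy per cycle.
# All numbers are assumed for illustration only.

RADIO_NJ_PER_BIT = 10.0    # assumed: energy to receive one bit over the radio
CPU_NJ_PER_CYCLE = 0.1     # assumed: energy per CPU cycle on a small DSP

def decode_power_mw(bitrate_bps: float, cycles_per_s: float) -> float:
    """Rough power draw (mW, i.e. mJ/s) for receiving + decoding a stream."""
    radio_nj_per_s = bitrate_bps * RADIO_NJ_PER_BIT
    cpu_nj_per_s = cycles_per_s * CPU_NJ_PER_CYCLE
    return (radio_nj_per_s + cpu_nj_per_s) / 1e6  # nJ/s -> mJ/s

# Hypothetical operating points: Opus transparent at 150 kbps but needing
# 2x the decode cycles of LC3, which needs 250 kbps to be transparent.
opus = decode_power_mw(150_000, 40e6)   # -> 5.5 mW
lc3 = decode_power_mw(250_000, 20e6)    # -> 4.5 mW
print(f"Opus: {opus:.1f} mW, LC3: {lc3:.1f} mW")
```

With these made-up constants the cheaper decoder wins overall despite pushing 100 kbps more over the air; with a pricier radio or cheaper CPU the conclusion flips, which is exactly why the quality-per-cycle metric matters in this space.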
I remember seeing published numbers based on instrumented code, but I couldn't find them again.
I did a quick test with the Google implementation (https://github.com/google/liblc3), which came out about 2x faster than Opus. To be honest, I expected a bigger difference, though it's just a superficial test.
A few things that also might be of relevance why they picked one over the other:
- suitability for DSPs
- vendor buy-in
- robustness
- protocol/framing constraints
- control
- DSP compatibility was probably considered but never surfaced as a stated reason, so it's hard to guess what that investigation found. Add to that the pricing and availability of said DSP modules
- Robustness - well, that's one of the primary features of Opus, battle-tested by WebRTC, WhatsApp, etc. (including packet loss concealment (PLC) and low bit-rate redundancy (LBRR) frames)
- Algorithmic delay for Opus is low, much lower than for older BT codecs, so that definitely wasn't a deal breaker
- The ability to make money from a standard is definitely an important thing to have
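On the delay point: algorithmic delay is roughly frame length plus encoder lookahead. A quick sanity check with commonly cited figures (used here as assumptions, not spec lookups):

```python
# Algorithmic delay ~= frame duration + lookahead. The figures below are
# commonly cited approximations, assumed for illustration.

def algorithmic_delay_ms(frame_ms: float, lookahead_ms: float) -> float:
    return frame_ms + lookahead_ms

print(algorithmic_delay_ms(20.0, 6.5))   # Opus at its default 20 ms frames
print(algorithmic_delay_ms(10.0, 2.5))   # LC3 at 10 ms frames
```

Opus can also run shorter frames (down to 2.5 ms in CELT mode) at some bitrate cost, so delay alone wouldn't have ruled it out.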
If used in a small device like a hearing aid, a 2x factor can have a significant impact on battery life.
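To put a rough number on that: when the codec is a big slice of a tiny power budget, halving its draw buys a meaningful chunk of runtime. The battery capacity and power figures below are assumptions for illustration.

```python
# Toy battery-life calc: runtime = capacity / total draw.
# All figures are assumed, not measured.

BATTERY_MWH = 25.0      # assumed: small hearing-aid battery capacity

def runtime_hours(base_mw: float, codec_mw: float) -> float:
    """Hours of runtime given a fixed base draw plus the codec's draw."""
    return BATTERY_MWH / (base_mw + codec_mw)

base = 0.5              # assumed: radio + amp + housekeeping draw, mW
print(runtime_hours(base, codec_mw=0.50))  # heavier codec -> 25.0 h
print(runtime_hours(base, codec_mw=0.25))  # 2x cheaper codec -> ~33.3 h
```

With these numbers the 2x cheaper decoder yields roughly a third more runtime; the smaller the codec's share of the budget, the smaller the gain.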
VoIP in general experiences full packet loss, meaning if a single bit flips the entire packet is dropped. For radio links like Bluetooth it's possible to deal with some bit flips without throwing the entire packet away.
Until 1.5, Opus PLC was in my opinion its biggest weakness compared to other speech codecs like G.711 or G.722. A high compression ratio makes bit flips much more destructive.
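The packet-loss point is easy to quantify: at a given raw bit error rate, compare "drop the packet on any flipped bit" (the VoIP model) against "usable as long as at most k bits flipped". The packet size and BER below are assumptions.

```python
# Probability a packet survives, given independent bit errors at rate BER,
# when up to max_flips flipped bits are tolerable. Assumed parameters only.
from math import comb

def p_survive(bits: int, ber: float, max_flips: int) -> float:
    """P(number of flipped bits <= max_flips) for a binomial error model."""
    return sum(comb(bits, k) * ber**k * (1 - ber)**(bits - k)
               for k in range(max_flips + 1))

BITS = 8 * 100      # assumed 100-byte packet
BER = 1e-3          # assumed raw bit error rate on the radio link

print(p_survive(BITS, BER, 0))   # all-or-nothing (VoIP-style): ~0.45
print(p_survive(BITS, BER, 4))   # tolerate a few flips: ~0.999
```

Under this (simplistic, independent-errors) model, tolerating even a handful of flips turns a coin-flip packet loss rate into a negligible one, which is why bit-level robustness matters more on radio links than on IP networks.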
As for making moeny, Bluetooth codecs have no license fees.
> For radio links like Bluetooth it's possible to deal with some bit flips without throwing the entire packet away.
Opus was intentionally designed so that the most important bits are in the front of the packet, which can be better protected by your modulation scheme (or simple FEC on the first few bits). See slide 46 of https://people.xiph.org/~tterribe/pubs/lca2009/celt.pdf#page... for some early results on the position-dependence of quality loss due to bit errors.
It is obviously never going to be as robust as G.711, but it is not hopeless, either.
I've got AirPods and a Beats headset; they both support AAC and to my ear sound great. Keep in mind I went to a lot of concerts in my 20s without earplugs, so my hearing isn't necessarily the greatest anymore.
AFAIK Android's AAC quality isn't that great so aptX and LDAC are the only real high quality options for Android and headphones. It's a shame as a lot of streaming is actually AAC bitstreams and can be passed directly through to headphones with no intermediate lossy re-encode.
Like I said though, to get Opus support in A2DP a BT SIG member would really have to be in love with it. Qualcomm and Sony have put forward aptX and LDAC respectively in order to get licensing money on decoders. Since no one is going to get Opus royalties there's not much incentive for anyone to push for its inclusion in A2DP.