xiphmont

It's not a strawman after it comes true

A few months ago, Cisco announced they would distribute a free and fully licensed h.264 encoder/decoder blob that FOSS projects could use to support h.264. At the same time Mozilla announced we'd use the blob in Firefox. I blogged about it at the time.

That announcement was mostly about WebRTC, but there was plenty of talk about this being another step toward full MP4 playback in Firefox. Moz obviously can't do that without also supporting (and licensing) AAC, the audio half of MP4. AAC was not included in Cisco's h.264 offer, which many people noticed and Brendan confirmed on his blog.

At the end of my blog post about Cisco's plan, I suggested it might influence MPEG licensing:

"In the future, could nearly every legal copy of HEVC come as a binary blob from one Internet source under one cap? I doubt that possibility is something the MPEG LA has considered, and they may consider it now that someone is actually trying to pull it off with H.264."

Woah, damn. Did that just happen with AAC?

After Cisco's OpenH264 announcement, Via Licensing, which runs the AAC licensing pool, pulled the AAC royalty fee list off their website. Now the old royalty terms (visible here) have been replaced by a new, apparently simplified fee list that eliminates licensing sub-categories, adds a new, larger volume tier, and removes all the royalty caps. Did royalty liability for AAC software implementations just become unlimited?

The new page is much shorter than the old one; perhaps this is just an oversight or an as-yet-incomplete pricing update. Still, it would be a bit odd for an organization that exists for the purpose of royalty licensing and collection to publish an inaccurate or incomplete price list.

So, who'd like to do the dirty work of following up in more detail with Via?

[update 2014-01-29]: Janko Roettgers followed up with Via Licensing; he details their response in a Google+ post. The short version is that the old categories 'remain available' but 'under the new terms, products must be approved by Via before they can be reported in these categories.' In short, the caps are still there, at Via's discretion. That's probably not actually much of a change; I believe Via decided what products qualified for capped pricing before as well.


xiphmont

Libvorbis 1.3.4 released

Nathan Froyd at Mozilla noticed something odd in the libvorbis source. Codebook 'length lists' only hold values in the range of 0 to 32, yet they're stored as integers. Well, worse than integers, actually: longs. And the lengthlists are big; the static data comprises the bulk of libvorbisenc.

In the earliest days of Vorbis development 15 years ago, codebooks were constructed differently and the lengthlists were quite small. Longs still weren't necessary, but the wasted space was negligible. When the coding strategy shifted and these lists became much larger, no one caught the wasted space. The vast majority of optimization was always for speed, not space. The only concentrated effort in trimming Vorbis library size down over the past decade had been in the decoder.

But now browsers need to ship encoders, and size matters. Add to that 64-bit systems taking over (and doubling the wasted space in the lengthlists), and someone finally noticed the oversight.
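For a sense of scale, here's a toy C comparison (hypothetical data, not the actual libvorbisenc tables): every entry fits in 0..32, so one byte apiece is plenty, yet a long costs eight bytes per entry on an LP64 system.

    #include <stdio.h>

    /* Illustrative only: the same small code lengths stored two ways. */
    static const long lengthlist_as_long[] = { 2, 4, 4, 5, 5, 6, 6, 6 };
    static const char lengthlist_as_char[] = { 2, 4, 4, 5, 5, 6, 6, 6 };

    int main(void)
    {
        /* On an LP64 system: 64 bytes vs. 8 bytes for the same information. */
        printf("long version: %zu bytes\n", sizeof lengthlist_as_long);
        printf("char version: %zu bytes\n", sizeof lengthlist_as_char);
        return 0;
    }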

That's a long way of saying "Xiph.Org is pleased to announce the release of libvorbis 1.3.4..."

No functional changes, but the encoder lib is now a shade over 25% the size it was in the 1.3.3 release.


xiphmont

Linux eMagic driver update

Every now and then I'm reminded that I'm not the last emi2|6 or 6|2m user left in the world. Apparently Debian just recently made some of the changes that broke the eMagic drivers on other distros years ago, and I've been getting mail about it again.

Background: The eMagic emi2|6 and 6|2m firmware loaders shipped with the Linux kernel have been broken for many years. Different distros have had them on life support with an inconsistent array of minor patches, but they've got a number of problems across the lot: races in the loader, an incorrect memory target, deadlocking all of USB with synchronous firmware requests in probe(), and the fact that the bitstream.fw file being shipped for the 6|2m is the wrong file. Apparently, someone accidentally substituted the 2|6 version of the file in a code cleanup years ago, so even if you get the 6|2m loader to work, it crashes the device because it uploads the wrong thing.
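To illustrate the probe() problem specifically: requesting firmware synchronously from probe() can wedge the whole USB stack, and the usual fix is to defer the upload to an asynchronous completion callback. Here's a minimal sketch of that pattern, with hypothetical function names and driver registration omitted; it's not the actual emi driver code.

    #include <linux/firmware.h>
    #include <linux/module.h>
    #include <linux/usb.h>

    /* Called by the firmware loader once the image is (or isn't) available;
     * the actual upload to the device happens here, outside of probe(). */
    static void emi_fw_loaded(const struct firmware *fw, void *context)
    {
        struct usb_interface *intf = context;

        if (!fw) {
            dev_err(&intf->dev, "firmware not available\n");
            return;
        }
        /* ... upload fw->data (fw->size bytes) to the device here ... */
        release_firmware(fw);
    }

    static int emi_probe(struct usb_interface *intf,
                         const struct usb_device_id *id)
    {
        /* Kick off an asynchronous request instead of blocking in probe()
         * with a synchronous request_firmware() call. */
        return request_firmware_nowait(THIS_MODULE, FW_ACTION_HOTPLUG,
                                       "emi62/bitstream.fw", &intf->dev,
                                       GFP_KERNEL, intf, emi_fw_loaded);
    }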

Oh, and Linux is apparently shipping a buggy, early version of the firmware without 96kHz support. Even if you don't care about 96kHz for audio production, and you probably shouldn't, the extra frequency range makes these guys a lot more useful as software oscilloscopes.

I've maintained a working version of the driver with updated firmware for the past several years, but getting it into the official kernel always stalled on 'wait, do you have permission from eMagic to use this firmware?' Unfortunately a) Apple bought eMagic and discontinued all their hardware products more than a decade ago, b) to my knowledge, no one ever had explicit permission to use the firmware currently being shipped either. I have no real interest in having a battle over firmware licensing, so my fixes continue to be my own. If Apple turns out to care, I'll pull them down, but I doubt that will happen. I don't think Apple remembers this device even exists. Seriously, they're Bondi Blue. That's soooo late-90's.

Anyway, here's my latest, updated, out-of-kernel firmware loader with the last firmware release from eMagic. It works properly on 2.6.x and 3.x.x kernels for both the emi2|6 and emi6|2m. It replaces the old firmware and two kernel modules with new firmware and a single unified firmware loader module named emi.ko. All new! Such shiny. Wow.

If you have kernel module build dependencies installed, it should be as easy as untarring as root, make, and make install. I also included a 'make remove-old' target to clean out the old driver and avoid any conflicts. It just removes the old modules and firmware files; obviously that might make a packaging system a little pissy (and you'd probably have to re-run it on each kernel update).

tl;dr, get the driver here: http://people.xiph.org/~xiphmont/emagic/


xiphmont

Opus 1.1 final release

...and the final 1.1 release lands!

The release also features an extensive demo page that describes and shows off the improvements in 1.1 in detail. (The page will look familiar to those who have been following over the past few months; it's an updated and expanded version of the demo for last July's beta test release.)


xiphmont

Opus 1.1-rc is out

Opus 1.1 just hit release candidate; pending any last minute bug discoveries or showstoppers, this will become the final 1.1 release.

The release candidate includes two major improvements over the previous 1.1 beta.

We've further improved surround encoding quality, as well as the tuning of both surround and stereo at lower bitrates. As an example, full 48kHz 5.1 surround is now tested and tuned down to 45kbps (it's nowhere near audiophile quality at that rate, but it is surprisingly good).

We also landed further encode/decode optimizations for all CPU types, especially ARM, which now includes NEON encoding optimizations.

And of course, we hopefully cleared the 1.1-beta buglist :-)


xiphmont

Comments on Cisco, Mozilla, and H.264

Please note: This is not a statement on behalf of Xiph.Org or Mozilla. I speak here for myself, my team, and other developers who share my views on an open web.

If you haven't seen today's announcements from Cisco and Mozilla regarding H.264, you'll want to read them before continuing.

Let's state the obvious with respect to VP8 vs H.264: We lost, and we're admitting defeat. Cisco is providing a path for orderly retreat that leaves supporters of an open web in a strong enough position to face the next battle, so we're taking it.

There's no getting around the fact that, by endorsing Cisco's plan, we've caved on our principles. That said, principles can't replace being in a practical position to make a difference in the future. With Cisco making H.264 available at no cost, holding out against H.264 in WebRTC makes even less sense than holding out after Google shipped H.264 in the video tag. At least under these terms, H.264 will be available at no cost to Mozilla and to any other piece of software that uses the downloadable plugin.

Cisco's license hack is obvious enough if you have the money: There's a yearly cap on total payments for any given licensed H.264 product. This year the cap is $6.5M. Any company that pays the cap each year can distribute as many copies as they want. There are still terms and restrictions on how the distribution gets done, but Cisco will be handling that (and only Cisco will be allowed to build and distribute these copies without a separate license).

Once you or your applications download the prebuilt codec blob from Cisco, you're allowed to use that specific blob for anything you want so long as you don't modify it or give it to anyone else. H.264 codecs for everyone! Cisco has committed to these blobs being available for just about every platform and architecture you can think of. "IBM S/360? Yes, please!"

This arrangement has obvious short-term benefits. Open source projects get licensed (if partial and restricted) access to H.264, and users don't feel like they're being held hostage in the ongoing battle between the open web and closed codecs. Firefox and other projects can install H.264 support (via Cisco), which is a big deal.

That said, today's arrangement is at best a stopgap, and it doesn't change much on the ground. How many people don't already have H.264 codecs on their machines, legit or otherwise? Enthusiasts and professionals alike have long paid little attention to licensing. Even most businesses today don't know and don't care if the codecs they use are properly licensed[1]. The entire codec market has been operating under a kind of 'Don't Ask, Don't Tell' policy for the past 15 years and I doubt the MPEG LA minds. It's helped H.264 become ubiquitous, and the LA can still enforce the brass tacks of the license when it's to their competitive advantage (or rather, anti-competitive advantage; they're a legally protected monopoly after all).

The mere presence of a negotiated license divides the Web into camps of differing privilege. Today's agreement is actually a good example: x264 and every other open source implementation of an encumbered codec are cut out. They're not included in this agreement, and there's no way they could be. As it is, giving away just this single, officially-blessed H.264 blob is going to cost Cisco $65M over the next decade[2]. Is it any wonder video is struggling to become a first-class feature of the Web? Licensing caused this problem, and more licensing is not a solution.

The giveaway also solves nothing long-term. H.264 is already considered 'on the way out' by MPEG, and today's announcement doesn't address any licensing issues surrounding the next generation of video codecs. We've merely kicked the can down the road and set a dangerous precedent for next time around. And there will be a next time around.

So, we're focusing on being ready.

Fully free and open codecs are in a better position today than before Google opened VP8 in 2010. Last year we completed standardization of Opus, our popular state-of-the-art audio codec (which also happens to be the best audio codec in the world at the moment). Now, Xiph.Org and Mozilla are building Daala, a next-generation solution for video.

Like Opus, Daala is a novel approach to codec design. It aims not to be competitive, but to win outright. Also like Opus, it will carry no royalties and no usage restrictions; anyone will be permitted to use the Daala codec for anything without securing a license, just like the Web itself and every other core technology on the Internet.

That's a real solution that can make everyone happy.

I can't resist a little codec fantasy football.

MPEG HEVC licensing isn't set yet. It will be interesting to watch the negotiations if Cisco's H.264 giveaway plan is wildly successful. In the future, could nearly every legal copy of HEVC come as a binary blob from one Internet source under one cap? I doubt that possibility is something the MPEG LA has considered, and they may consider it now that someone is actually trying to pull it off with H.264. Perhaps in five years, even cameras and televisions will download a software codec to avoid paying monopoly rents. Sillier things have happened given sufficient profit motive.

Or maybe they'll build in a free, legally uncomplicated copy of Daala instead. Dare to dream.

—Monty Montgomery <monty@xiph.org> and others
October 30, 2013



xiphmont

Free (that's with a capital F) codecs update: Opus and Daala

Xiph and Mozilla's Greg Maxwell (or as Dave has been teasing, 'Professor Max') gave a good, thorough presentation on Opus and the progress being made on Daala at the 2013 GStreamer conference in Edinburgh on Wednesday. Unlike with many of our presentations, we were careful to get complete video of this one.

If you've been a fan of the Daala demo updates, his talk touches on some of the topics of upcoming demos, specifically PVQ, the range coder, and motion compensation. Obviously, I'll be going into more detail on those in the actual demo pages.


xiphmont

Introduction to Daala part 4: Chroma from Luma

Another new demo, another new technique specific to Daala: frequency domain prediction of the chroma planes from the luma plane!

Predicting the chroma planes from the luma plane isn't a brand-new idea. Still, Daala is the only codec actually deploying it, and we're doing it entirely in the frequency domain (which is novel).
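For a rough idea of what chroma-from-luma prediction means in general (a toy sketch, not Daala's actual implementation; the names are made up): fit a linear model chroma ≈ alpha*luma + beta over a set of co-located coefficients, then predict the chroma coefficients from the already-reconstructed luma ones. The part specific to Daala is doing this on frequency-domain coefficients rather than on pixels.

    #include <stddef.h>

    /* Least-squares fit of chroma ~= alpha*luma + beta over n co-located
     * coefficients. Illustrative sketch only. */
    static void cfl_fit(const float *luma, const float *chroma, size_t n,
                        float *alpha, float *beta)
    {
        float sl = 0.0f, sc = 0.0f, sll = 0.0f, slc = 0.0f, denom;
        size_t i;

        for (i = 0; i < n; i++) {
            sl  += luma[i];
            sc  += chroma[i];
            sll += luma[i] * luma[i];
            slc += luma[i] * chroma[i];
        }
        denom = n * sll - sl * sl;
        *alpha = (denom != 0.0f) ? (n * slc - sl * sc) / denom : 0.0f;
        *beta  = (sc - *alpha * sl) / n;
    }

    /* Predict chroma coefficients from luma using the fitted model. */
    static void cfl_predict(const float *luma, size_t n,
                            float alpha, float beta, float *chroma_pred)
    {
        size_t i;
        for (i = 0; i < n; i++)
            chroma_pred[i] = alpha * luma[i] + beta;
    }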

Read on!


xiphmont

A fond farewell to Red Hat, an exciting hello to Mozilla

Just a quick note, as people will doubtless ask about it...

I'm leaving Red Hat and joining Mozilla as of next week. This has been in the works for most of the summer.

This is not a reflection on Red Hat; rather, I'm jumping at an opportunity offered by Mozilla. I'll be able to better coordinate and work more closely with the other Xiph Daala developers, most of whom are already at Mozilla, as Xiph and Mozilla continue to ramp up directed development of the codec. I'm choosing between an awesome job and a potentially even more awesome job. Such problems we all wish to have.

To my Red Hat co-workers: It's been a pleasure and honor to work with you. I suppose I still work with you in the big picture, but starting Monday I'll be on someone else's req.

(also, not _physically_ moving anywhere, staying put in Somerville)

xiphmont

Introducing Daala, part 3: Time/Frequency Resolution Switching

I've just posted part 3 in my demo series introducing the Daala video codec. This one is kind of a long one, mainly because I think it's one of the only really detailed presentations of a technique Jean-Marc Valin of Xiph invented and first introduced in the Opus audio codec: 'TF' aka Time/Frequency resolution switching.
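(For the curious, the heart of the idea can be sketched in a few lines. This is a heavily simplified illustration, not the actual Opus or Daala code: a cheap orthonormal 2-point butterfly over coefficients from two adjacent short transforms turns them into a sum/difference pair, where the sum acts like a coefficient of a longer transform, gaining frequency resolution, while the difference keeps the time detail. Since the butterfly is self-inverse, applying it again undoes it.)

    #include <stddef.h>

    /* Toy 2-point Haar/Hadamard butterfly across the coefficients of two
     * adjacent short transforms; self-inverse up to exact arithmetic, so
     * applying it twice restores the input. Illustrative only. */
    static void tf_merge(float *a, float *b, size_t n)
    {
        const float s = 0.70710678f; /* 1/sqrt(2) keeps it orthonormal */
        size_t i;

        for (i = 0; i < n; i++) {
            float sum  = s * (a[i] + b[i]);
            float diff = s * (a[i] - b[i]);
            a[i] = sum;
            b[i] = diff;
        }
    }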

Even better... while I was documenting TF for posterity, I spotted a possible improvement. So, I've tossed in documentation of a brand new technique as well!

