Monday, March 31, 2008
Submission day
Today is the deadline for proposal submissions. I hope mine is accepted; I think it's a perfect fit for GSoC. I had some trouble trimming it down to 7,500 characters because I initially misread the guidelines as saying 7,500 words. If someone on the dri-devel mailing list hadn't reminded me, I would have submitted all 8K characters. I'm feeling lucky already!
Saturday, March 29, 2008
Potential benefits of accelerated video decoding?
I've started considering some of the potential benefits of hardware-accelerated video decoding from a user's perspective. The biggest one for me is being able to play back HD streams in real time. I have a modest machine and it struggles with HD streams, but having read this paper I'm encouraged by one statement in particular. Testing on a machine equipped with a Pentium III @ 667 MHz, 256 MB of memory, and a GeForce 3 Ti200, the authors state that they were able to play back a 720p ~24-frame/s stream encoded with WMV at 5 Mb/s. That hardware is pretty ancient by today's standards, and yet with the GPU handling motion compensation (MC) and colour space conversion (CSC) they get a 3.16x speed-up over the CPU-with-MMX implementation. That's pretty encouraging. I'm sure all the folks out there who use their machines as HTPCs would really appreciate that sort of performance.
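For context, CSC is a good example of why offloading helps: it's the same small matrix multiply repeated for every pixel of every frame, which is exactly the kind of uniform work a GPU handles almost for free. Just to illustrate what's involved (this is my own sketch of a full-range BT.601 YCbCr-to-RGB conversion in C, not code from the paper), here's roughly what the CPU would otherwise have to do width x height times per frame:

#include <stdint.h>

/* Clamp an intermediate value to the 0-255 range of an 8-bit channel. */
uint8_t clamp_u8(int v)
{
    return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v);
}

/*
 * Full-range BT.601 YCbCr -> RGB conversion for a single pixel, using a
 * fixed-point (x256) approximation of the standard coefficients.
 */
void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                  uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = y;
    int d = cb - 128;
    int e = cr - 128;

    *r = clamp_u8((256 * c + 359 * e + 128) >> 8);
    *g = clamp_u8((256 * c -  88 * d - 183 * e + 128) >> 8);
    *b = clamp_u8((256 * c + 454 * d + 128) >> 8);
}

On the GPU the same arithmetic collapses into a per-fragment matrix multiply, so it's essentially free once the decoded data is sitting in a texture.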
Another benefit is that implementing this as a Gallium3D front-end allows it to be used on all hardware that has a Gallium3D back-end. Currently I believe certain Intel GPUs are supported very well, as is Nvidia through Nouveau, but I'm thinking specifically of AMD/ATI. As far as I know their Linux drivers have never supported any sort of video decoding acceleration, even though the hardware is very capable and the functionality is implemented on Windows. They recently released hardware specs for some of their GPUs, though if I recall correctly these did not cover the dedicated video decoding hardware. However, with specs for the newer GPUs available and various reverse-engineered drivers for older GPUs already existing, comprehensive Gallium3D support for ATI GPUs will probably happen. I think the point is obvious by now: hardware-accelerated video decoding on ATI GPUs, finally.
Monday, March 24, 2008
Jumping right in
To get familiar with how a Gallium3D front-end works I downloaded the Mesa source and built the library. I had a little trouble figuring out which make target builds Mesa with SoftPipe, but someone on #dri-devel was kind enough to tell me. I also downloaded openChrome's libXvMC source, but it's not immediately clear to me how the library works. It appears to handle part of the work through Xlib, XExt, and friends, but expects a few functions (the ones that actually touch the hardware) to be provided and linked in separately; see the sketch below. Odds are this is where the Gallium3D calls will end up.
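From skimming the headers, the unresolved functions look like the standard XvMC entry points declared in X11/extensions/XvMClib.h, so a hardware-specific (or, in this case, Gallium3D-based) library presumably has to supply bodies for them. Purely as a guess at the shape of things, stubs for a few of them might look like this (the signatures are from XvMClib.h; the comments about where the Gallium3D work goes are just my current assumptions):

#include <X11/Xlib.h>
#include <X11/extensions/XvMClib.h>

Status XvMCCreateContext(Display *display, XvPortID port, int surface_type_id,
                         int width, int height, int flags, XvMCContext *context)
{
    /* Presumably: create a Gallium3D screen/context here and stash it in the
     * context's private data pointer for the other entry points to use. */
    return BadImplementation; /* not implemented yet */
}

Status XvMCRenderSurface(Display *display, XvMCContext *context,
                         unsigned int picture_structure,
                         XvMCSurface *target_surface,
                         XvMCSurface *past_surface,
                         XvMCSurface *future_surface,
                         unsigned int flags,
                         unsigned int num_macroblocks,
                         unsigned int first_macroblock,
                         XvMCMacroBlockArray *macroblock_array,
                         XvMCBlockArray *blocks)
{
    /* Presumably: turn the macroblock and block data into something the GPU
     * can consume and run motion compensation into the target surface. */
    return BadImplementation;
}

Status XvMCPutSurface(Display *display, XvMCSurface *surface, Drawable draw,
                      short srcx, short srcy,
                      unsigned short srcw, unsigned short srch,
                      short destx, short desty,
                      unsigned short destw, unsigned short desth,
                      int flags)
{
    /* Presumably: colour space conversion plus the final blit/scale to the
     * drawable happen here. */
    return BadImplementation;
}

That's complete guesswork until I've actually read through the code, but it gives me a mental model of where the pieces might fit.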