Sunday, April 24, 2011

linux gfx stack

1. Every app has a window associated with it
2. Every window can have multiple surfaces associated with it
3. Every surface has multiple objects drawn on it
4. Surfaces and windows are abstractions provided by the x server
5. Meaning the physical memory objects for both are provided by the x server
6. An application draws an object on its surface and notifies x
7. x does local processing for that window, then combines the result with all the other windows
8. x then composes everything into a frame buffer
9. x writes to the hardware (the frame buffer may be a hw buffer, in which case the write reaches the hw directly)
10. The above was software rendering
11. For hw rendering, each window, surface and object has a corresponding handle, i.e. memory within the hardware's own memory
12. Each application has its own private memory in the hw
13. Every write goes into the hw directly; no combining operation is done by x
14. The hw does the combining internally
15. Mixed rendering: some apps use hw surfaces, some use sw surfaces
16. sw surfaces use the first method and render into a frame buffer
17. hw surfaces write into the app's private memory in the hw
18. The hw combines both and produces the final output
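
The sw/hw split above can be poked at from a shell on a running x session; a quick sketch, assuming an X display is available and the standard x11-utils/xvinfo tools are installed:

```shell
# list every client's window tree as the x server sees it
# (each app's top-level window plus its child windows)
xwininfo -root -tree

# list the xvideo adaptors the driver exposes; these back the
# hw surfaces that bypass x's software combining path
xvinfo
```

If xvinfo reports "no adaptors present", only the software path is available on that display.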

hw codecs

1. players: xine, mplayer
2. xine has xvmc, through which hardware mpeg decoders are accessed; xine's xvmc works with via gfx cards
3. mplayer also supports xvmc, but for nvidia gfx
4. vlc also supports some hw decoding
5. ffmpeg contains a collection of codecs
6. gstreamer is a framework

7. not only the 1080p resolution but the bit rate also matters for playback
8. mplayer can use coreavc to decode h.264 on the cpu
9. apple quicktime supports h264
10. linux h264:
http://www.linuxjournal.com/article/9005
11. http://developer.nvidia.com/tegra/forum/support-h264-hardware-accelator-support-libavcodec
12. terminologies and their interrelations:
http://forum.videolan.org/viewtopic.php?t=9647
13. new flow:
vlc, xine, mplayer --> ffmpeg --> libavcodec --> vaapi --> vdpau --> driver --> hw --> hw codec
14. http://www.nvnews.net/vbulletin/showthread.php?t=131050
15. http://forums.gentoo.org/viewtopic-t-836517-start-0.html
16. http://www.linuxquestions.org/questions/slackware-14/how-to-enable-libvdpau-libva-vaapi-and-mplayer-with-nvidia-driver-855574/
17. on ubuntu 10.10:
http://www.webupd8.org/2010/10/use-mplayer-with-vaapi-support-hardware.html
18. http://www.linux.com/news/software/developer/31582-multicore-video-decoding-with-mplayer-part-2
19. http://superuser.com/questions/109388/how-do-i-get-vdpau-working-with-ubuntu-9-10
20. nvidia gt220 supports vdpau:
http://phoronix.com/forums/showthread.php?20182-FFmpeg-Gains-VDPAU-MPEG-4-ASP-Acceleration
21. https://bbs.archlinux.org/viewtopic.php?id=115794
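
The "new flow" in point 13 can be tried end to end from the command line; a sketch, assuming mplayer was built with vdpau support and movie.mkv stands in for a real file:

```shell
# decode via libavcodec's vdpau hooks and present through the vdpau
# video output; the trailing comma makes mplayer fall back to sw
# decoding if the hw codec refuses the stream
mplayer -vo vdpau -vc ffh264vdpau,ffmpeg12vdpau, movie.mkv
```

ffh264vdpau and ffmpeg12vdpau cover h.264 and mpeg-1/2; similar codec entries exist for vc-1 and wmv3.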

nvidia hardware codec

1.http://en.wikipedia.org/wiki/Nvidia_PureVideo
Each generation of nvidia hardware has added internal hw codecs for video and audio, in addition to hw rendering optimizations and features
2. The container is created and parsed in software; the contents of the container are then hooked to the video/audio hw accel inside the chip
3.http://www.guru3d.com/article/geforce-gt-430-review/4
4. geforce 4 can decode mpeg 4 in hardware
formats of interest: mkv, x264, h264, aac, dts
5.http://en.wikipedia.org/wiki/VDPAU
6. If VDPAU is supported on the card, the card can do both codec and graphics accel
7. vlc has support for it; vlc supports sw codecs too
8.http://penguindreams.org/blog/nvidias-hardware-h264-1080p-codec-in-linux/
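
A quick way to check point 6 for a given card (assumes the vdpauinfo tool and the nvidia binary driver are installed):

```shell
# query the vdpau driver for the decoder profiles the card supports;
# seeing H264_* / VC1_* / MPEG2_* rows means the hw codec path is usable
vdpauinfo
```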