Pimp my video: shader effects and multimedia

Published Wednesday February 29th, 2012

Introduction

A topic which has been receiving quite a bit of attention recently is the use of shader programs to apply effects within Qt Quick applications. Once you have grasped the basics of the shader programming language, embedding the shaders in your application is made really easy by Qt Quick, with the result that stunning visual effects can be achieved from a surprisingly small amount of code.

What hasn’t been mentioned previously in this blog is that applying shader effects to multimedia content (video playback, or a camera viewfinder) is just as easy as transforming any other QML element. This post shows some of the results you can achieve by combining shader programming with QtMultimedia.

The best way to demonstrate some of the effects you can achieve is … with a demo, so that’s where we’ll start.

qmlvideofx demo running on Qt 5, on desktop Linux

The videos above show a Qt Quick 2 demo app, running on a desktop Linux machine. Aside from some ancillary C++ code (listening to scene graph signals in order to calculate the frame rate, and reading shader programs from the file system), the app is entirely written in QML.

Implementation details

As previously described, Qt Quick 2 comes with built-in support for shader effects via the ShaderEffect QML element. Using this to apply an effect to video or viewfinder content really isn’t any more difficult than applying the same effect to any other QML element. To illustrate this, the code snippet below shows a time-varying wobble effect being applied to a video clip which is played from a file.

import QtQuick 2.0
import QtMultimedia 5.0

Rectangle {
    width: 600
    height: 400

    MediaPlayer {
        id: mediaPlayer
        autoPlay: true
        source: "test.ogg"
    }

    VideoOutput {
        id: videoOutput
        anchors.fill: parent
        source: mediaPlayer
    }

    ShaderEffect {
        anchors.fill: parent

        // Properties which will be passed into the shader as uniforms
        property real amplitude: 0.02
        property real frequency: 20
        property real time: 0

        NumberAnimation on time {
            loops: Animation.Infinite
            from: 0
            to: Math.PI * 2
            duration: 600
        }

        property variant source: ShaderEffectSource {
            sourceItem: videoOutput
            hideSource: true
        }

        fragmentShader: "
            uniform highp float amplitude;
            uniform highp float frequency;
            uniform highp float time;
            uniform sampler2D source;
            uniform lowp float qt_Opacity;
            varying highp vec2 qt_TexCoord0;
            void main() {
                highp vec2 p = sin(time + frequency * qt_TexCoord0);
                highp vec2 tc = qt_TexCoord0 + amplitude * vec2(p.y, -p.x);
                gl_FragColor = qt_Opacity * texture2D(source, tc);
            }
        "
    }
}

Applying the effect to a viewfinder stream rather than to video playback is simply a matter of replacing the MediaPlayer element in the snippet above with a Camera element.
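For instance, a minimal sketch of the viewfinder case (using the Camera element from QtMultimedia 5.0; the ShaderEffect and ShaderEffectSource are written exactly as in the snippet above) might look like this:

import QtQuick 2.0
import QtMultimedia 5.0

Rectangle {
    width: 600
    height: 400

    // The camera viewfinder stream replaces the MediaPlayer as the video source
    Camera {
        id: camera
    }

    VideoOutput {
        id: videoOutput
        anchors.fill: parent
        source: camera
    }

    // The ShaderEffect from the snippet above is applied exactly as before,
    // with its ShaderEffectSource still pointing at videoOutput
}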

In the qmlvideofx demo, each effect is implemented as a QML element which inherits from ShaderEffect; these elements are dynamically loaded and applied to the video content when the user selects the corresponding effect from the menu. Similarly, the different inputs (image, video and camera) are represented by QML elements which are dynamically loaded when required.
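As a rough sketch of that structure (the file and property names here are illustrative, not the demo's exact ones), each effect is a small QML component derived from ShaderEffect, and a Loader instantiates whichever one the user picks:

// EffectWobble.qml (illustrative file name): one effect per file
import QtQuick 2.0

ShaderEffect {
    // Assigned by the loading code once the effect has been instantiated
    property variant source
    property real amplitude: 0.02
    // fragmentShader: the wobble shader from the snippet above
}

// In the main scene, the selected effect is instantiated on demand
Loader {
    id: effectLoader
    anchors.fill: parent
    source: "EffectWobble.qml"                 // chosen from the effect menu
    onLoaded: item.source = theVideoSource     // a ShaderEffectSource wrapping the VideoOutput
}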

The set of parameters supported by each effect (e.g. ‘granularity’ for Pixelate; ‘radius’ and ‘diffraction’ for Magnify) is exposed as a ListModel, which is used to construct sliders via which the parameter values can be adjusted.
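A sketch of that pattern (the parameter entry below is illustrative): the control panel iterates over the model, e.g. with a Repeater, and creates one slider per entry.

ShaderEffect {
    // The uniform actually consumed by the fragment shader
    property real granularity: 8.0

    // One entry per adjustable parameter; the demo's control panel builds a
    // slider for each entry and writes the slider position back into the model
    property ListModel parameters: ListModel {
        ListElement { name: "granularity" }
    }
}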

The code (Qt 5.x)

The source for the qmlvideofx demo can be found in the qtmultimedia repository here:

https://qt.gitorious.org/qt/qtmultimedia/trees/master/examples/video/qmlvideofx

Until Qt 5.0 is released, you’ll have to build it from source in order to build and run the demo. Instructions for how to do that can be found on the wiki here. The required subset of Qt modules can be cloned as follows:

$QTDIR/init-repository --module-subset=qtbase,qtdeclarative,qtjsbackend,qtmultimedia,qtxmlpatterns

The code (Qt 4.x)

While Qt Quick 1 does not have built-in support for shader effects, it is provided by the Qt.labs.shaders plugin, which is shipped with Qt from version 4.7.4 onwards. A Qt 4 version of the demo is available here:

https://qt.gitorious.org/qt-mobility/qt-mobility/trees/master/demos/video/qmlvideofx
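In Qt Quick 1 the element names differ slightly (ShaderEffectItem rather than ShaderEffect), but the shaders themselves are written in the same way. A minimal sketch, assuming the Video element from QtMobility's QtMultimediaKit 1.1 module:

import QtQuick 1.1
import QtMultimediaKit 1.1
import Qt.labs.shaders 1.0

Rectangle {
    width: 600
    height: 400

    Video {
        id: video
        anchors.fill: parent
        source: "test.ogg"
    }

    ShaderEffectItem {
        anchors.fill: parent

        property variant source: ShaderEffectSource {
            sourceItem: video
            hideSource: true
        }

        // The wobble fragment shader from the Qt Quick 2 snippet above
        // can be assigned to the fragmentShader property unchanged
    }

    Component.onCompleted: video.play()
}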

In addition to running on desktop platforms (tested on Linux and Windows), the Qt 4 version of the demo will also run on mobile devices – the video below shows it running on a Symbian device (Nokia C-701).

qmlvideofx demo running on Qt 4, on Symbian (Nokia C-701)

Keen-eyed viewers will notice that the app has a slightly different UI to the desktop version – thanks to the flexibility of Qt Quick, that is achieved by substituting a single QML file which describes the layout.
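A sketch of that idea (the file names are hypothetical, not the demo's actual ones): the application root simply loads a different layout description depending on the target.

Item {
    // Set by the platform-specific startup code; hypothetical property name
    property bool isMobile: false

    Loader {
        anchors.fill: parent
        // Both files describe the same controls, arranged for different screens
        source: isMobile ? "LayoutMobile.qml" : "LayoutDesktop.qml"
    }
}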

It should be noted that the app will not run on currently available versions of the Symbian platform. The reason is that it requires the output of the video decoder to be available to the OpenGL ES engine as a texture. An EGL extension which allows the video and graphics stacks to cooperate in this way (EGL_NOK_image_endpoint2) will be added in a future Symbian release. Once this is available, QtMultimediaKit will automatically make use of it (see QTMOBILITY-1818), and the demo will start working.

You can, however, run the app today on the Nokia N9 – as shown by the video below.

qmlvideofx demo running on Qt 4, on MeeGo Harmattan (Nokia N9)

Further information

If you weren’t already excited about the potential of shader effects, hopefully you are now. Below are some links to other examples of their use in conjunction with Qt/QML.


Posted in Multimedia, OpenGL, Qt Quick

29 comments to Pimp my video: shader effects and multimedia

Enis says:

Great work peeps! I built this a few weeks ago and slightly changed it so that it loads as an app in the qtmediahub project (http://gitorious.org/qtmediahub). It was really easy, as the code is well written and easy to follow. Note that for us Fedora users you have to hunt around installing packages and creating symlinks in /usr/lib64 and /usr/include to make sure Qt 5 builds correctly to play videos and use OpenGL/ES (it seems the devs expect a Debian layout). After that you can try out all the cool new examples in Qt 5, e.g. the maps demo is also amazing!

PS – loving the music, I vote for this being a requirement of all future labs postings! (any chance of a tracklist too? :)

suy says:

I’m quite impressed. The framerate is quite good for being 4.x and on a mobile phone. I guess you use a GLView, don’t you?

I wish all app developers start including the page curl effect to their apps. I could curl all day long. ;-)

Rezza says:

Enis: we’d like to start with official qt5 packaging for Fedora soon, but it will take some time…

IvanDM says:

I’m speechless, but I have to say what no one here said yet: I HATE ELOP.

Look at this!
It’s amazing what QML and Qt can do…

I’m very sad.
You Trolls are great.

ary says:

@IvanDM
Don’t be sad, Nokia smartphones are not the whole world.
Qt will have a long and bright future on many other platforms.

Sean says:

Awesome stuff! Out of curiosity what tool did you use to do the screen capture on your Linux desktop?

Gareth Stockwell says:

Thanks for the positive comments :)

@Enis The music tracklists are in the video descriptions on YouTube. All music is Creative Commons-licensed and available from soundcloud.com.

@suy On Symbian, we use the OpenGL graphics system; for other platforms (in the Qt 4.x version of the app), we set a QGLWidget as the QML viewport. The reason for the different behaviour on Symbian is that QGLWidget does not automatically release its resources when the application goes to the background. The OpenGL graphics system, on the other hand, does have this resource management intelligence.

@Sean I used xvidcap to capture from the screen. In order not to steal too many CPU cycles from the app, I captured raw XWD images rather than having xvidcap encode to a video on the fly. Once I’d finished capturing, I used ImageMagick to convert the frames to TGA format, and then encoded them into a video using mencoder. Of course that workflow only makes sense if you don’t need to capture audio from the application while it is running.

Thomas Senyk says:

What’s with the performance on the N9?
I guess it’s a Qt4 problem?
… qmlshaderplugin doesn’t really “support” effects on dynamic sources, right?
… in theory a small shader effect on a fullscreen opengl texture shouldn’t be a big deal for the PVR

I guess I need to build Qt5 for my N9 very soon ;)

Gareth Stockwell says:

@Thomas Senyk To be honest, I’m not really sure. Symbian is using Qt 4 (and therefore the Qt.labs.shaders plugin) as well, but gets much better performance – so I don’t think it’s an inherent limitation of that stack. I did build Qt 5 for the N9, but I couldn’t get QtMultimedia working, and didn’t have much time to play around with it. I haven’t had a chance yet to go back and debug that – but if you try it I’d be very interested to hear how you get on.

Yoann Lopes says:

@Thomas Senyk @Gareth Stockwell The GPU on the C7 (Broadcom VideoCore III) is much more powerful than the one on the N9 (PowerVR SGX530)… Especially when it comes to fragment shader power…

Bart K. says:

@Gareth Stockwell

That looks awesome! :)

Something I’ve been struggling with recently… is there any way to have QML widgets apply a shader to the content underneath them? I’ve got some semi-transparent widgets, and I thought it might be nice to desaturate and blur the stuff below them, giving them kind of a glazed appearance (and also making the content below them less distracting).

The source of the trouble I’m having is that I can’t reference the parent widget in the shader, because that then tries to draw all of the child widgets. :)

salsaman says:

Looks nice. But I wonder – how easy (or difficult) is it to add new sources/sinks? Is there some example code for this somewhere? What about control of the output – can you force it fullscreen, hide it when it is not playing, embed it in an existing window, etc.? Can I render to a framebuffer, or does it require a physical screen?

chalup says:

Positive: amazing functionality with so little code. I’ll definitely have to find some time and learn about shaders.
Negative: mutilated “Hey” by The Pixies in the first video.

Gareth Stockwell says:

@Bart K
I thought I knew how to do this … but then when I implemented my approach, it didn’t quite work as I expected. I’ve posted the broken code as an answer to your question on StackOverflow – hopefully someone else can explain why it doesn’t work.
http://stackoverflow.com/questions/9510013/is-there-some-way-i-can-have-a-qml-widget-affect-the-content-below-it-with-a-gls

Gareth Stockwell says:

@salsaman There are a few questions in there, so I’ll take them one by one:

* How easy (or difficult) is it to add new sources?
If the sources are QML elements, then it’s very easy – simply point ShaderEffectSource.sourceItem at the appropriate item (a minimal sketch follows at the end of this reply). If you’re asking about feeding content from another source (e.g. an existing rendering library outside of Qt) into the ShaderEffect, then first you would need to expose that content to QML by wrapping it in a QDeclarativeItem (Qt4) or a QSGItem (Qt5). I expect that would be likely to throw up all kinds of OpenGL context / threading-related issues, so approach with caution... :)

* What about control of the output – can you force it fullscreen, hide it when it is not playing?
There are two parts to this answer:
1. Changing the geometry of the native window into which QML is rendering (e.g. to make it fill the whole screen)
This can be done from C++ via the QWindow API (see e.g. QWindow::setWindowState(Qt::WindowFullScreen)), or from QML via the Window element.
2. Changing the layout within the QML app (e.g. to hide some chrome when it goes full screen, or to transition to a different app view when the video stops playing).
This can be done easily in QML by using states, transitions and property bindings.

* Can I render to a framebuf, or does it require a physical screen?
In Qt5, rendering to an FBO is easy – simply call QQuickCanvas::setRenderTarget(QOpenGLFramebufferObject *). In Qt4, it is possible but more difficult – one approach is to implement a QGraphicsEffect which redirects the QPainter output into a QGLFramebufferObject. This is in fact the way the Qt.labs.shaders plugin is implemented, so check its source if this is of interest.
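To illustrate the first point above, here is a minimal sketch (the Image is just a stand-in; any QML item can be the source):

Image {
    id: anyItem
    source: "picture.png"
}

ShaderEffect {
    anchors.fill: anyItem

    // Wrap the existing item so its rendered output becomes a texture
    property variant source: ShaderEffectSource {
        sourceItem: anyItem
        hideSource: true
    }

    // With no fragmentShader set, the default shader simply copies the source;
    // assign one (e.g. the wobble shader above) to apply an effect
}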

Ayman says:

yay, Thank you very very veeeery much Qt Team for releasing the shader effects for Symbian, this is awesome, amazing, a big and happy news for us Qt Symbian developers :-)
looking forward for the official release :)

Gareth Stockwell says:

@Ayman Actually, the ShaderEffectItem QML element has been available on Symbian since Qt 4.7.4 (in both Symbian Anna and Belle), in which the Qt.labs.shaders plugin and the Symbian OpenGL graphics system were added.
http://labs.qt.nokia.com/2011/11/08/qt-4-7-4-release-bundle-for-symbian-news/

Using ShaderEffectItem in conjunction with Camera or Video elements as shown in the demo, however, is not yet supported in the latest version of the platform (Symbian Belle FP1). The EGL extension mentioned in the post is expected to be added in a future release, but at this time we can’t provide any more details.

you says:

Could you put the demo videos on developer.qt.nokia.com/videos as well? We cannot access YouTube from mainland China…

Scorp1us says:

LOVE IT! And thanks for the 4.x love too!

prabindh says:

Just wanted to add here, on platforms such as Beagle (bone/board), you can use imgstream or eglimage via Qt-integrated classes. This class (v3dfx) provides a wrapper around video streaming functionality, as a QGLWidget, or in a QGraphicsScene (overriding the background paint). Links below.

http://tigraphics.blogspot.in/2012/02/sgx-video-streaming-with-qglwidget.html

http://tigraphics.blogspot.in/2012/02/8-cpu-ultrasound-viewer-with-v3dfx-base.html

Ville says:

This is great!

How does this work with hardware-accelerated video decoding, e.g. h264 HD material? Phonon video can be GPU-accelerated with e.g. mplayer and the VA-API library. When I have done this with ‘plain’ Qt (no QtQuick), there have been very limited ways to manipulate the video, because it has been processed on the GPU all the time. Only the result has been seen in the Phonon VideoWidget.

Shaders, the effects, are processed on the GPU, yes, but is the actual video decoding here also done on the GPU?

Basically, I’m interested in showing overlay text and graphics on top of hardware-accelerated video. It is surprisingly difficult. But any cool effects are of interest to me too.

Gareth Stockwell says:

@Ville Good question. A full answer would probably be a good topic for a future post, but I’ll try to cover the main points here.

As you rightly say, shaders are executed on the GPU – so in order for them to be applied to video content, that content also has to be on the GPU in the form of a texture. So, at the end of the pipeline we have a shader program awaiting a texture, and at the beginning we have a video decoder which is writing frames to some area of memory. In between, we have Qt – specifically the QtGui, QtDeclarative and QtMultimedia(Kit) modules (and their respective platform-specific plugins). It is their job to connect everything together.

In Qt4, the first part of the pipeline looks like this:

Decoder -> Raw frame -> QPainter -> QPaintDevice

Depending on the platform, that raw video frame may be in one of a number of forms, such as
* a CPU-addressable memory buffer (i.e. a raw pointer)
* a platform-specific bitmap type (e.g. XVideo shared memory, Mac CoreImage)
* an EGL handle such as an EGLImage or GL texture (i.e. the decoder has written directly into GPU-addressable memory)

If we weren’t using shaders, then the QPaintDevice would be a QWidget, and the job of the QPainter would be to draw the video frame into the backing store (QWindowSurface), from where it would be blitted onto the screen.

In order to make any of the above video frame types available to a shader program, the Qt.labs.shaders plugin substitutes a QGLFrameBufferObject as the paint device. The painter (provided by the OpenGL graphics system) renders into this FBO; if the incoming video frame isn’t already GPU-accessible then this step involves first uploading the pixel data to the GPU. OpenGL APIs can then be used to extract a texture from the FBO. So now we have the following:

Decoder -> Raw frame -> QPainter (OpenGL) -> FBO -> GL texture -> Shader

The bit of Qt glue which is not shown in the above pipeline is QtMultimediaKit. Its platform-specific plugin needs to provide an implementation of QVideoRendererControl which returns video frames in an appropriate form. By “appropriate”, we mean “in a form which can be consumed (i.e. drawn) by the OpenGL graphics system implementation on this platform”. In other words, the answer to the question “what do I need to get this demo working on platform X” is “a QtMultimediaKit plugin and an OpenGL graphics system which can talk to one another”.

In Qt5, the picture becomes a bit simpler, in that the new graphics stack is inherently based on OpenGL (so shaders are available out of the box), and QPainter is dead. This means that, rather than having to work out which video frame type is the best one for QtMultimedia to pass to QtGui, there is only one option: a GL texture handle.

Decoder -> Raw frame (GL texture) -> Scene graph

I hope that at least partly answers your question – as I said, a more comprehensive deep dive into all things video rendering-related is something which may be provided in future.

Gareth Stockwell says:

@prabindh Thanks for the links – it looks like an interesting project.

It would be great to make use of this functionality in the QtMultimedia backend. There is already some related code – the Symbian QtMultimediaKit backend produces video frames in the form of EGLImageKHR handles, which are then converted to VG images or GL textures prior to drawing. The API used to get video/viewfinder data into the EGLImageKHR target, however, is EGL_NOK_image_endpoint2 rather than EGLImageStream.

Does the TI SDK provide APIs for piping hardware-accelerated video/camera data into an EGLImageStream? If so, a useful piece of work would be to modify the QtMultimedia gstreamer backend in order to use these APIs, so that multimedia rendering via Qt will take the optimal path on BeagleBoard.

prabindh says:

@Gareth – we are working on that. Rob Clark already has a version with QtMobility. We are working on a more generic version that can support multiple planes (gstreamer pipelines or other). Work ongoing in the below trees.

https://github.com/prabindh/TI-Graphics-Accelerated-Video-streaming-
https://github.com/aditya-nellutla/TI-Graphics-Accelerated-Video-streaming-

subbu says:

I am trying to run the qmlvideofx sample in the qtmultimedia folder of the Qt 5 alpha, but I’m facing a plugin load error for folderlistmodel. Quick pointers appreciated.

smoothio says:

@subbu
Read Jason’s comments on the Qt 5 labs post; there are two plugins of the same name, but one is the Qt Quick 1 version, which gets installed over the Qt Quick 2 one.

Gareth Stockwell says:

@subbu @smoothio Further discussion of problems running the qmlvideofx example with Qt5.0.0-alpha can be found in the comments of https://bugreports.qt-project.org/browse/QTBUG-23118

Matt Cleere says:

Of course I have a gazillion questions about all this, but first… About that Demo — The second song that played really stood out to me (as a former DJ) it appears to be the music from Horace Andy – Skylarking with the vocals from Justice – We Are Your Friends over the top… where did you get that?
