Hey,

KImageEffects served its purpose (it's arguable whether it ever had one). We obviously need something to easily add special effects to our applications, and it's clear that the infrastructure KImageEffects had won't work for the coming years. I sat down during my vacation last week and started working on a replacement. Some of my requirements for it were:

- Support for all kinds of surfaces on which filtering can be applied. That includes:
  o OpenGL textures
  o QImage
  o QPixmap
- Hardware acceleration.
- Support for filtering chains - meaning being able to simply create a chain of filters, e.g. Emboss -> Perspective Transform -> Blur, cache that chain and execute it on demand.
- Dynamic nature of the whole library, meaning additional filters can be added/removed without binary or even source incompatible changes.
- Support for pure-rendering filters. This is strictly connected to hardware acceleration. In a lot of cases you just want to say "render this with that effect" and don't really want to get the result back in your application - you just want to render it. With OpenGL, rendering with a filter can be a few hundred times faster than rendering and then fetching the result. So the ability to just "attach" rendering filters, without forcing any client-side data transfer, is very important for proper/efficient hardware acceleration.
- The ability to add new surfaces as both source and destination of all the effects. This is to be able to connect the effects to the objects which Phonon or Krita are rendering to with very simple hooks.
- Intelligent filters, meaning that as a user you shouldn't have to care about what's happening under the hood - the framework should decide the fastest possible way of filtering your surfaces. For example, if you have a huge QImage and you want it blurred, and the framework happens to have a Gaussian blur implemented in GLSL, it will magically convert your image to an OpenGL texture and use OpenGL for the rest of the operation, without you ever noticing.
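To make the filter-chain requirement concrete, here's a rough sketch using the Layer/Filter API I describe below. Note that only "/blur" and "/blur/gaussian" exist as filter names so far; the emboss and perspective paths here are made up purely for illustration:

  Layer layer("photo.png");
  layer.add(Filter("/emboss"));                // hypothetical filter name
  layer.add(Filter("/transform/perspective")); // hypothetical filter name
  layer.add(Filter("/blur"));
  // the chain is cached by the layer; executing it on demand is just:
  QPainter p(...);
  layer.draw(&p, 0, 0);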
Having said that, I have a basic version of a library implementing it. Here's, very simply, what is happening:

The top-level compositing object is called Layer. Layer is a wrapper around the rendering surface (QImage, QPixmap, OpenGL texture, etc). It knows how to convert between all of them. The most important methods of Layer are:

- bool Layer::add(Filter &), which adds a new Filter to the filter chain for the given layer,
- various void Layer::draw() methods, which loosely correspond to all the QPainter::draw(QImage) methods,
- Layer Layer::composition() const, which actually returns the result of the composition. Note that if you just want to render your Layer with some filters added to it, you just call draw() on it and never need to call composition().

The filters are encapsulated by the Filter object. One creates a Filter with the string name of the filter. Filter names are like paths, where the first part is the group and the second part is the algorithm name, e.g. "/blur/gaussian" is a full filter name. You can also create a Filter with just the group, e.g.:

Filter f;                   //creates a null filter, use isNull() to query it
Filter f("/blur/gaussian"); //creates a Gaussian blur filter
Filter f("/blur");          //creates a blurring filter - the framework will pick
                            //the best one for you as soon as you add it to a Layer

Those are the basics. So if you'd like to blur a hello.png you'd do:

Layer layer("hello.png");
Filter f("/blur");
f.setArgument("radius", 5); //Filter has a possibleArguments() method which
                            //returns all the acceptable arguments one can
                            //pass to this Filter
layer.add(f);

... and now ...

QPainter p(...);
layer.draw(&p, 0, 0); // renders the result

... or ...
Layer result = layer.composition(); //returns the result
QImage img = result.toImage();      //converts it to a QImage
img.save("blurred.png");            //saves the result to a PNG

"But I just want to blur a QImage", you say. Fear not: for the lazy who don't care about all this fancy stuff and just want to filter a QImage and get a result, there's a simpler way:

QImage img("hello.png");
if (KImageFx::applyFilter(&img, "/blur", "radius=10")) {
    //success
    img.save("blurred.png");
}

I created two simple examples. One is trivial and just uses applyFilter() to blur an image given on the command line. The other is an application in which you can create filter chains and view the results in real time. It's kinda like Quartz Composer in OS X, which made a lot of people lose their minds, but cooler of course ;) A very early screenshot is here: http://chaos.troll.no/%7Ezrusin/kimagefx.png (yes, there's only one filter there now ;) but you can already see the basics).

Ah, and since my internet connection is very flaky at the moment and I've been heavily hacking on it, I'm using git right now. So if you want to give this a try - and I strongly suggest you do! - run:

git clone http://ktown.kde.org/~zrusin/dev/kimagefx.git

Once that's done, enter kimagefx and type qmake && make. There's an examples directory with both examples.

z