Quartz Composer integration in GrandVJ 1.2

Quartz Composer compositions can now be used in GrandVJ as .qtz files: we render them natively through the fastest interface available. Before GrandVJ 1.2 you had to save them as QuickTime movies, which had two big drawbacks: it was slow and it did not allow you to change the composition's variables.

With the Quartz Composer integration we have a fast rendering path and compositions can be used both as sources and effects.

When a composition is used as a source, you simply import it the same way as any movie: drag and drop it over the interface or use the file browser. Note that when a composition is used as a source you cannot change any of its variables from the GrandVJ interface.

A composition can also be used as an effect. We automatically scan the system folder “/System/Library/Compositions” and register the ones the application can support; you will find them under the “Quartz-Composer (System)” section of the effect browser. You can also use them as a starting point to develop new effects.
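As a rough illustration of the idea (this is not GrandVJ's actual code), scanning that folder boils down to something like the following Python sketch; the compatibility check is only a placeholder for whatever the engine really verifies:

```python
from pathlib import Path

SYSTEM_COMPOSITIONS = Path("/System/Library/Compositions")

def find_compositions(folder: Path) -> list[Path]:
    """Return every Quartz Composer file (*.qtz) found in a folder."""
    return sorted(folder.glob("*.qtz"))

def is_supported(composition: Path) -> bool:
    """Placeholder for the real compatibility check done by the engine."""
    return True  # e.g. inspect the published inputs/outputs here

system_effects = [c for c in find_compositions(SYSTEM_COMPOSITIONS) if is_supported(c)]
for effect in system_effects:
    print(effect.stem)  # names listed under "Quartz-Composer (System)"
```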

GrandVJ Mac OS X supported compositions

When a composition is used as an effect, the application scans its published inputs and maps them as effect parameters when their format is compatible with the engine. We map parameters that are float values, indexes, colors and booleans. One visual input of the composition receives the visual of the assigned cell.
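A minimal sketch of that mapping logic is shown below; the input names and the tuple format are invented for illustration (GrandVJ's real implementation talks to the Quartz Composer runtime directly), and the limit of four mapped parameters matches what is described further down:

```python
# Input types the engine knows how to expose as effect parameters.
MAPPABLE_TYPES = {"float", "index", "color", "boolean"}

def map_effect_parameters(published_inputs, max_parameters=4):
    """Keep only the published inputs whose type the engine can map.

    `published_inputs` is a list of (name, type) tuples read from the
    composition; only the first `max_parameters` compatible ones are used.
    """
    compatible = [(name, kind) for name, kind in published_inputs
                  if kind in MAPPABLE_TYPES]
    return compatible[:max_parameters]

# Example: the image input receives the cell visual, the rest become parameters.
inputs = [("_protocolInput_Image", "image"),
          ("_protocolInput_Size", "float"),
          ("_protocolInput_Count", "index"),
          ("_protocolInput_Color", "color"),
          ("_protocolInput_Enable", "boolean")]
print(map_effect_parameters(inputs))
```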

We also load user-created compositions in the effect section. To do that, place them in the local directory “Library/Application Support/ArKaos/GrandVJ/Quartz Composer/Effects”.

Here is where my user test compositions are located:

GrandVJ user compositions

They will appear in the “Quartz-Composer (User)” section of the effect browser:

GrandVJ user composition browser

Regarding the effect parameters, let's start from the system composition “/System/Library/Compositions/Image Hose.qtz”. When we open it in the editor we see:

Image Hose system composition

And if we display the inspector and look at the published inputs / outputs, we can see these parameters:

Image Hose system composition parameters

GrandVJ will show the first 4 parameters that can be mapped automatically; in this case they are:

Quartz composition parameters in GrandVJ

You can see that GrandVJ tries to extract the meaningful part of each parameter name, so it removes all “_ProtocolInput” strings from parameter names; not doing so would make it impossible to display the full name in the panel. GrandVJ also capitalizes the first letter of a parameter, so “size” becomes “Size”.
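Something along these lines (a sketch, not the shipping code) reproduces that name cleanup:

```python
import re

def display_name(raw_name: str) -> str:
    """Strip "_ProtocolInput" fragments and capitalize the first letter,
    so "size" becomes "Size" and "_protocolInput_size" also becomes "Size"."""
    name = re.sub(r"_?protocolInput_?", "", raw_name, flags=re.IGNORECASE)
    return name[:1].upper() + name[1:]

print(display_name("size"))                  # Size
print(display_name("_protocolInput_size"))   # Size
```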

Finally, if you want compositions available as visual sources, we ship a few in “Library/Application Support/ArKaos/GrandVJ/Quartz Composer/Visuals”; the idea is that you can add more files in this location.

Here is where my user source compositions are located:

user source compositions

If you need to, you can provide an icon for the visual by placing a picture like “confetti.png” in the same folder as “confetti.qtz”.
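In other words, the icon lookup is just a matter of matching base names; a small hedged sketch (not GrandVJ code, the file names are the ones used as examples above):

```python
from pathlib import Path
from typing import Optional

def icon_for(composition: Path) -> Optional[Path]:
    """Return the matching .png (e.g. confetti.png for confetti.qtz)
    if it sits in the same folder as the composition, else None."""
    candidate = composition.with_suffix(".png")
    return candidate if candidate.exists() else None

print(icon_for(Path("confetti.qtz")))
```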

And here is where you find them in GrandVJ:

User compositions panel

GrandVJ 1.2 beta 1 supports Flash AS3, Quartz Composer, APC40 and many other things…

We have just released a new beta version of GrandVJ; you can download it from our forums:

http://www.arkaos.net/forum/viewtopic.php?f=26&t=6828

This version has many cool new things, which are detailed in the forum post; here is a recap:

  • New ultra smooth engine
  • Support for Akai APC40 & generic MIDI feedback
    (also for Behringer BCR2000/BCF2000, Livid OHM, ..)
  • Quartz Composer player (Mac OS X)
  • Support for Flash Actionscript 3
  • Cell layer assignment
  • Mixer state saving
  • Added master blackout button
  • New enhanced soft-edging algorithm
  • Added vertical/horizontal position presets for TripleHead2Go
  • Several bug fixes
  • New fullscreen option (PC)

We plan to ship the final version at the end of this month, so we encourage you to check that this new version works fine and to report any problems or suggestions by creating a topic in the GrandVJ forum.

Happy testing! We hope you will have as much fun using this version as we had coding it!

ArKaos MediaMaster used at Australian Idol

A popular usage of ArKaos MediaMaster is on TV stages, where it's very convenient to use its large library of content (up to 60,000) and to drive both the videos on the walls and the lighting from the same DMX console.

In this case there are two MacPro servers, but you could also run MediaMaster on a Windows server.

Here are some extracts of the show:

This is a remix I made from videos found on sharing web sites. You need QuickTime installed on your machine to view this video.

Why software genlock at 60 FPS matters!

In MediaMaster 1.1 we revamped our video engine, particularly the synchronization and multi-threading.

We now perform what can be called software genlock to ensure the best possible fluidity if your machine has a multi-core processor. Genlock is the act of locking the frequency of a media source to a reference signal or clock; there is a nice description of it on Wikipedia.

When the software must present a frame, the work can be cut into 3 parts: reading the video frames from the disk, uploading them to the graphics card, and doing the composition / blending of the pixels for presentation.

Because of the way disks work, and because the time required by a codec to turn the data from the disk into a decompressed frame is not constant, this can create fluidity problems.

So at each new frame the software wakes up and works sequentially through the 3 phases. In a classical real-time video processing application this looks like this:

Traditional video application

This graph shows an application trying to play a video loop encoded at 30 fps on a monitor running at 60 fps. In a perfect world the application should present each frame of the movie exactly twice.
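In code, this traditional approach boils down to something like the following Python sketch; the three phase functions, their names and the pacing are invented placeholders for illustration, not MediaMaster or GrandVJ code:

```python
import time

MONITOR_FPS = 60.0   # refresh rate of the display
MOVIE_FPS = 30.0     # frame rate of the source loop

def read_frame_from_disk(frame_index):
    """Phase 1: disk read + codec decode; its cost varies from frame to frame."""
    return frame_index  # stand-in for real pixel data

def upload_to_gpu(frame):
    """Phase 2: upload the decoded frame to the graphics card."""
    return frame        # stand-in for a texture handle

def compose_and_present(texture):
    """Phase 3: blend the layers and present the result."""
    pass

def traditional_player(refresh_count=120):
    start = time.monotonic()
    for _ in range(refresh_count):
        # Time base = the computer clock, which slowly drifts against the monitor.
        movie_time = time.monotonic() - start
        frame_index = int(movie_time * MOVIE_FPS)
        # All three phases must fit inside a single 1/60 s monitor period.
        compose_and_present(upload_to_gpu(read_frame_from_disk(frame_index)))
        time.sleep(1.0 / MONITOR_FPS)   # crude pacing against the computer clock

traditional_player()
```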

There are 2 problems with the traditional way of doing video processing:

  • The time base is synchronized to the clock of the computer, so there is a drift between the monitor frequency and the internal clock of the computer. This means that even if your computer were extremely powerful, you would still see small hiccups once in a while.
  • When the software starts working to display a new frame, it has (in this example) only 1/60th of a second to read a video frame, upload it and present it to the user. The available time depends on the fps of the monitor, not on the fps of the video source, so the higher the fps of the monitor, the more stress you put on your system.

The processing done by MediaMaster is much more elegant. Since 1.1 we have 3 modes: the original one, a buffered mode and a frame blending mode.

In this article I focus on the buffered mode; I will write more about frame blending later.

So in buffered mode the graphics card always has one frame in advance, ready to be composed. As soon as a frame has been processed and presented to the user, the software reads and uploads the next video frame in advance.

The other thing done in buffered mode is that the clock of the content is no longer taken from the computer clock but rather from the monitor frequency. To my knowledge, most other media players do not do anything as subtle as this.

So if we keep the same timings as in the first example but simply reshuffle them according to the way the buffered mode works, here is the result:

Software genlocked video

Because the movie clock is genlocked to the monitor and because we always have one frame in advance, we play the movie with perfect regularity and timing: 1 1 2 2 3 3 4 4 … each frame is played exactly twice.
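To see why this works, express the frame choice in terms of the monitor refresh count instead of the computer clock. A small sketch (illustrative only, not MediaMaster code):

```python
MONITOR_FPS = 60
MOVIE_FPS = 30

def genlocked_frame(refresh_index: int) -> int:
    """Pick the movie frame for a given monitor refresh.

    With the movie clock locked to the monitor refresh, a 30 fps loop on a
    60 Hz display repeats every frame exactly twice, with no drift.
    """
    return refresh_index * MOVIE_FPS // MONITOR_FPS

# First 8 refreshes -> frames 1 1 2 2 3 3 4 4 (1-based, matching the text above)
print([genlocked_frame(i) + 1 for i in range(8)])
```

The buffered part of the mode means the frame for refresh i + 1 is read and uploaded while the frame for refresh i is on screen, so the upload never competes with the presentation deadline.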

If you are still not sure why it's nice to be genlocked at 60 fps, here is a recording we made in real time from MediaMaster running 2 layers. The lower one is at 30 fps and the top one is at 60 fps. The loop running at 60 fps is part of our test content; it's a loop running a ramp that allows us to see visually whether the system is perfectly genlocked. We captured the output with Fraps.

If you apply effects to the content you play, the effects will be rendered at 60 fps, and this is why it's so important to have perfect synchronization: your eyes will have the feeling that everything is crisp and fluid.

In order to see the videos in this article you need QuickTime installed, and the first video should play smoothly on a recent laptop or on an LCD monitor set to 60 fps.

So here is the original screen grab, which I only scaled down in order to show it in this article; the frame rate is still 60 fps:

Now, in order for you to see the difference, here is the same loop at 30 fps:

I continue to lower the fps; now we are at 20 fps:

Here is the most degraded version, at 15 fps:

If you are curious to test this with any software video mixer that can play QuickTime movies, here are our test files:

Horizontal ramp, 2 seconds at 60 fps:

Here is a vertical ramp at 60 fps:

And a zooming rectangle at 60 fps:

Those movies can be downloaded for you to test your systems; you just need software that supports QuickTime *.mov files. For best results you should loop those files and let them run for a while. In a VJ application you can simply add them as the top layer of your composition with additive blending and you will see whether your system is powerful enough and well designed.
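If you would rather build a comparable ramp loop yourself, here is a hedged sketch (this is not how the official test files were produced): it writes 120 PNG frames of a bar sweeping across the screen with Pillow, which you can then assemble into a 60 fps movie with a tool such as ffmpeg.

```python
from PIL import Image, ImageDraw

WIDTH, HEIGHT, FPS, SECONDS, BAR = 640, 480, 60, 2, 8

for i in range(FPS * SECONDS):
    frame = Image.new("RGB", (WIDTH, HEIGHT), "black")
    draw = ImageDraw.Draw(frame)
    # A white bar sweeping from left to right, one step per frame.
    x = int(i / (FPS * SECONDS) * (WIDTH - BAR))
    draw.rectangle([x, 0, x + BAR, HEIGHT], fill="white")
    frame.save(f"ramp_{i:03d}.png")

# Assemble into a 60 fps QuickTime movie, e.g.:
#   ffmpeg -framerate 60 -i ramp_%03d.png -c:v qtrle ramp60.mov
```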

Feel free to stress test our software and compare it with others; we think we did a very nice job with MediaMaster, and there is a demo version for you to test on the ArKaos web site.

The Optical Theremin

Matt Finke from LoopLight sent me this video, which uses ArKaos software in a very creative way. In their toolkit you find NuVJ for standard VJ work and MediaMaster controlled from a GrandMA console.

The Optical Theremin

Here is the video:


Philipp from LoopLight also sent me this information:

“I used Arkaos MediaMaster, Arkaos NuVJ and grandMA ultralight in one network session. The headlights were Jarag-5 in a matrix with par30 and 52 Movinglights with CMY. I made a pixelpatch in MediaMaster for Jarag and put the CMY of the movinglights in the same pixelpatch. A camera, the first input to MediaMaster, pointed towards a plexiglass disk, which I had divided in two areas: the left side for Jarag and the right side for the CMY. The NuVJ was the second input to MediaMaster for SD clips. I set different fx like edge detect, greyscale and contrast in MediaMaster. Then I scaled my live input on the pixelpatch, only the left area of the plexiglass. Finally I moved my hand vertically and horizontally on the left area while playing SD clips from NuVJ. For CMY I made four large colourwheels in different bright colours. I moved one of the colourwheels in the right area to change all CMY in the movinglights.”

As you can see, it's a cool example of creative processing with lighting!