Why software genlock at 60 FPS matters!

Since MediaMaster 1.1 we have revamped our video engine, in particular the synchronization and multithreading.

We now perform what can be called software genlock to ensure the best possible fluidity if your machine has a multi-core processor. Genlock is the action of locking the frequency of a media source to a reference signal or clock. There is a nice description of it on Wikipedia.

When the software must present a frame, the work can be cut into 3 parts: reading the video frames from the disk, uploading them to the graphics card, and doing the composition / blending of the pixels for the presentation.
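To make the rest of the article easier to follow, here is a minimal C++ sketch of those 3 parts. The structure and function names are hypothetical stand-ins, not MediaMaster's actual code:

```cpp
#include <cstdint>
#include <vector>

// A decoded video frame; "pixels" stands in for whatever the codec outputs.
struct VideoFrame {
    int index = 0;
    std::vector<uint8_t> pixels;
};

// Part 1: read the compressed data from disk and decode it.
// In a real player this calls into the codec; here it is a stub.
VideoFrame readAndDecodeFrame(int frameIndex) {
    return VideoFrame{frameIndex, std::vector<uint8_t>(1920 * 1080 * 4)};
}

// Part 2: upload the decoded pixels to a texture on the graphics card.
void uploadToGpu(const VideoFrame& /*frame*/) {}

// Part 3: blend the layers and present the composed image on screen.
void composeAndPresent() {}
```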

Because of the way disks work, and because the time a codec needs to turn the data from the disk into a decompressed frame is not constant, this can create fluidity problems.

So at each new frame the software wakes up and starts working sequentially on the 3 phases. In a classical real-time video processing application this works like this:

Traditional video application

This graph shows an application trying to play a video loop encoded at 30 fps on a monitor running at 60 fps. In a perfect world the application should present each frame of the movie exactly twice.

There are 2 problems with this traditional way of doing video processing:

  • The time base is synchronized to the clock of the computer, so there is a drift between the monitor frequency and the computer's internal clock. This means that even if your computer were extremely powerful, you would still see small hiccups once in a while.
  • When the software starts working to display a new frame, it only has (in this example) 1/60th of a second to read a video frame, upload it and present it to the user. The available time depends on the fps of the monitor, not on the fps of the video source. So the higher the fps of the monitor, the more stress you put on your system.
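To make the two problems concrete, here is a minimal sketch of such a traditional loop, reusing the same hypothetical stand-ins; it is an illustration, not any particular player's code:

```cpp
#include <chrono>
#include <cstdint>
#include <thread>
#include <vector>

// Compact stand-ins for the 3 phases (see the earlier sketch).
struct VideoFrame { int index = 0; std::vector<uint8_t> pixels; };
VideoFrame readAndDecodeFrame(int i) { return VideoFrame{i, std::vector<uint8_t>(1920 * 1080 * 4)}; }
void uploadToGpu(const VideoFrame&) {}
void composeAndPresent() {}

// A classical player loop: the time base is the computer clock, so nothing
// ties the loop to the monitor refresh.
void playTraditional(double videoFps, double monitorFps, int ticks) {
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    auto wakeUp = start;

    for (int t = 0; t < ticks; ++t) {
        // Which movie frame should be on screen, according to the CPU clock?
        const double elapsed =
            std::chrono::duration<double>(clock::now() - start).count();
        const int frameIndex = static_cast<int>(elapsed * videoFps);

        // All 3 phases run after we wake up: if decoding is slow, or if the
        // CPU clock drifts against the monitor, a refresh is missed and the
        // playback hiccups.
        uploadToGpu(readAndDecodeFrame(frameIndex));
        composeAndPresent();

        // Sleep until the next tick of the *computer* clock, not the monitor.
        wakeUp += std::chrono::duration_cast<clock::duration>(
            std::chrono::duration<double>(1.0 / monitorFps));
        std::this_thread::sleep_until(wakeUp);
    }
}
```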
The processing done by MediaMaster is much more elegant. Since 1.1 we have 3 modes: the original one, a buffered mode and a frame blending mode.

In this article I focus on the buffered mode; I will write more about frame blending later.

So in buffered mode the graphics card always has one frame in advance, ready to be composed. As soon as a frame has been processed and presented to the user, the software reads and uploads the next video frame in advance.

The other thing buffered mode does is that the clock of the content is no longer taken from the computer clock but rather from the monitor frequency. To my knowledge, most other media players do not do something as subtle as this.
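Here is a minimal sketch of such a buffered, genlocked loop, again with the same hypothetical stand-ins (waitForVerticalBlank() stands in for whatever the driver offers to wait on the monitor refresh); the real engine is of course more involved:

```cpp
#include <cstdint>
#include <future>
#include <vector>

// Compact stand-ins for the 3 phases, plus a vsync wait.
struct VideoFrame { int index = 0; std::vector<uint8_t> pixels; };
VideoFrame readAndDecodeFrame(int i) { return VideoFrame{i, std::vector<uint8_t>(1920 * 1080 * 4)}; }
void uploadToGpu(const VideoFrame&) {}
void composeAndPresent() {}
void waitForVerticalBlank() {}

void playBuffered(int videoFps, int monitorFps, int vsyncs) {
    // Start decoding frame 0 before the first refresh, so it is ready in time.
    auto pending = std::async(std::launch::async, readAndDecodeFrame, 0);
    int onScreen = -1;

    for (int vsync = 0; vsync < vsyncs; ++vsync) {
        // The content clock is the refresh counter: genlocked by construction.
        const int frameIndex = vsync * videoFps / monitorFps;

        if (frameIndex != onScreen) {
            uploadToGpu(pending.get());   // already decoded, upload is cheap
            onScreen = frameIndex;
            // Immediately start reading the next frame in the background.
            pending = std::async(std::launch::async,
                                 readAndDecodeFrame, frameIndex + 1);
        }
        composeAndPresent();
        waitForVerticalBlank();
    }
}
```

The key point is that the only clock in this loop is the monitor's refresh, which is exactly what makes it a software genlock.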

So if we keep the same timings as in the first example but simply shuffle them according to the way the buffered mode works, here is the result:

Software genlocked video

Because the movie clock is genlocked to the monitor, and because we always have one frame in advance, we play the movie with perfect regularity and timing: 1 1 2 2 3 3 4 4 … each frame is played exactly twice.
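That repetition pattern is simply the integer ratio between the two rates; this tiny standalone snippet (not ArKaos code) reproduces it:

```cpp
#include <iostream>

// Prints which movie frame is shown on each monitor refresh when a 30 fps
// loop is genlocked to a 60 fps display: 1 1 2 2 3 3 4 4
int main() {
    const int videoFps = 30, monitorFps = 60;
    for (int vsync = 0; vsync < 8; ++vsync)
        std::cout << vsync * videoFps / monitorFps + 1 << ' ';
    std::cout << '\n';
}
```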

If you are still not too sure why it’s nice to be genlocked at 60 fps, here is a recording we did in real time from MediaMaster running 2 layers. The lower one is at 30 fps and the top one is at 60 fps. The loop running at 60 fps is part of our test content: it’s a loop running a ramp that lets us see visually whether the system is perfectly genlocked. We captured the output with Fraps.

If you apply effects to the content you play, the effects will be rendered at 60 fps, and this is why it’s so important to have perfect synchronization: your eyes will have the feeling that everything is crisp and fluid.

In order to see the videos in this article you need to have QuickTime installed, and the first video should play smoothly on a recent laptop or on an LCD monitor set at 60 fps.

So here is the original screen grab, which I simply scaled down in order to show it in this article; the frame rate is still 60 fps:

Now, so you can see the difference, here is the same loop at 30 fps:

I keep lowering the fps, and now we are at 20 fps:

Here is the most degraded version, at 15 fps:

If you are curious to test this with any software video mixer that can play QuickTime movies, here are our test files:

Horizontal ramp, 2 seconds at 60 fps:

Here is a vertical ramp at 60 fps:

And a zooming rectangle at 60 fps:

Those movies can be downloaded for you to test your systems; you just need software that supports QuickTime *.mov files. For best results you should loop those files and let them run for a while. In a VJ application you can simply add them as the last layer of your composition with additive blending, and you will see if your system is powerful enough and well designed.
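If you would rather build a similar test pattern yourself, here is a small sketch that writes a 2-second, 60 fps horizontal ramp as grayscale PGM frames. The resolution, bar width and file names are arbitrary choices, and you would still need to encode the frames into a QuickTime movie; it is only an illustration of the idea, not a recreation of our actual test files.

```cpp
#include <fstream>
#include <string>
#include <vector>

// Writes 120 grayscale PGM frames (2 seconds at 60 fps) of a horizontal ramp:
// a bright bar sweeps across the image one step per frame, so a dropped or
// repeated frame is immediately visible when the encoded loop plays back.
int main() {
    const int width = 640, height = 360, frames = 120, barWidth = 8;
    for (int f = 0; f < frames; ++f) {
        std::vector<unsigned char> img(width * height, 0);
        const int barX = f * width / frames;   // bar position for this frame
        for (int y = 0; y < height; ++y)
            for (int x = barX; x < barX + barWidth && x < width; ++x)
                img[y * width + x] = 255;
        std::ofstream out("ramp_" + std::to_string(f) + ".pgm", std::ios::binary);
        out << "P5\n" << width << ' ' << height << "\n255\n";
        out.write(reinterpret_cast<const char*>(img.data()),
                  static_cast<std::streamsize>(img.size()));
    }
}
```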

Feel free to stress test our software and compare it with others; we think we did very well with MediaMaster, and there is a demo version for you to test on the ArKaos web site.

ArKaos MediaMaster driven by GrandMA through eDMX…

We have been working hard for 3 months on the next upgrade of MediaMaster.

One thing that we want to show in this video tutorial is that we can now listen directly to the GrandMA eDMX protocol. This allows simpler networking between your console and a server running MediaMaster.

Here is the tutorial I made to show how to make it happen:

Music “I want to be a machine” by Pornophonique, licensed from Jamendo PRO.

The Optical Theremin

Matt Finke from LoopLight sent me this video, which uses ArKaos software in a very creative way. In their toolkit you find NuVJ for standard VJ work and MediaMaster controlled from a GrandMA console.

The Optical Theremin

Here is the video:

Philipp from LoopLight also sent me this information:

“I used Arkaos MediaMaster, Arkaos NuVJ and grandMA ultralight in one network session. The headlights were Jarag-5 in a matrix with par30 and 52 Movinglights with CMY. I made a pixelpatch in MediaMaster for Jarag and put the CMY of the movinglights in the same pixelpatch. A camera, the first input to MediaMaster, indicated towards a plexiglass disk, which i had divided in two areas. Left side for Jarag and the right side for the CMY. The NuVJ was the second input to MediaMaster for SD Clips. I set different fx like edge detect, greyscale and contrast in MediaMaster. Then I scaled my live input on pixelpatch, only the left area of plexiglass. Finally I moved my hand vertically and horizontal on the left area during playing SD Clips from NuVJ. For CMY I made four large Colourwheels in different bright colours. I moved one of the Colourwheels in the right area to change all CMY in the movinglights.”

As you can see, it’s a cool example of creative processing with lighting!