Originally Posted by
Stoney3K
I'm also interested in looking into (and co-developing) a unified laser display framework which could also run under Linux.
Communicating with lots of DACs from different manufacturers at a low level is going to be the hard part, even though it's what you'll want if you want complete control over your laser display. Audio APIs can be cumbersome and resource-hungry, and they can corrupt the output through unwanted signal processing such as resampling or filtering.
To be honest, I would start by looking at systems like the Ether Dream and pull apart the protocol for that DAC -- the developer is an active forum member here and may be able to give you some pointers.
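To make the "lots of DACs from different manufacturers" problem tractable, the framework could put every backend behind one small interface, with the wire protocol (Ether Dream or otherwise) hidden inside each driver. This is only a sketch of one possible shape for that abstraction, not anyone's actual API; the class and method names here are my own invention:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List, Sequence


@dataclass(frozen=True)
class LaserPoint:
    """One output sample: galvo position plus colour channels, all normalised."""
    x: float  # -1.0 .. 1.0
    y: float  # -1.0 .. 1.0
    r: float = 0.0
    g: float = 0.0
    b: float = 0.0


class LaserDAC(ABC):
    """Vendor-neutral DAC interface; each backend hides its own protocol."""

    @abstractmethod
    def start(self, sample_rate: int) -> None:
        """Open the device and begin output at the given points-per-second rate."""

    @abstractmethod
    def write(self, points: Sequence[LaserPoint]) -> None:
        """Queue a chunk of samples for output."""

    @abstractmethod
    def stop(self) -> None:
        """Blank the beam and release the device."""


class LoopbackDAC(LaserDAC):
    """Test backend that simply records what it was asked to output."""

    def __init__(self) -> None:
        self.sample_rate = 0
        self.buffer: List[LaserPoint] = []

    def start(self, sample_rate: int) -> None:
        self.sample_rate = sample_rate

    def write(self, points: Sequence[LaserPoint]) -> None:
        self.buffer.extend(points)

    def stop(self) -> None:
        pass
```

A real Ether Dream driver would then implement the same three methods on top of its network protocol, and the rest of the framework never needs to know which hardware is attached.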
Personally, I would like to move away from the traditional 'frame-based' paradigm and towards signal processing on a set of continuous streams of (X, Y and colour intensity) data. After all, that's how the old analog consoles and show tapes were designed as well, and it's a key factor in their fluidity. Using today's processing power, you could build a very advanced compositing engine for some very impressive shows. Rendering OpenGL commands onto a laser-based canvas may even be possible.
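The stream idea can be sketched very compactly: instead of frames, each source is an endless generator of samples advancing one point at a time, like an analog oscillator pair, and "compositing" becomes a per-sample operation on those streams. The function names and the crossfade stage below are just illustrative choices of mine:

```python
import itertools
import math


def lissajous(sample_rate: int = 30000, fx: float = 200.0, fy: float = 300.0):
    """Endless stream of (x, y, intensity) samples -- no frames, just a signal.

    Phase advances per sample, the way an analog X/Y oscillator pair would.
    """
    for n in itertools.count():
        t = n / sample_rate
        yield (math.sin(2 * math.pi * fx * t),
               math.sin(2 * math.pi * fy * t),
               1.0)


def crossfade(stream_a, stream_b, blend: float = 0.5):
    """A tiny compositing stage: mix two point streams sample by sample."""
    for (xa, ya, ia), (xb, yb, ib) in zip(stream_a, stream_b):
        yield (xa * (1 - blend) + xb * blend,
               ya * (1 - blend) + yb * blend,
               ia * (1 - blend) + ib * blend)


# A DAC driver thread would pull chunks from the composite stream on demand:
chunk = list(itertools.islice(crossfade(lissajous(), lissajous(fx=100.0)), 512))
```

Because every stage is just a generator over samples, effects, blends and even an OpenGL-style renderer could be chained the same way, with the DAC pulling from the end of the pipeline at its own sample rate.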