This chapter uses OSC queries in QLab 4.1 to construct a traditional 3-band sound-to-light system. In the example, the output controls a video image consisting of 6 sprites, but it is easy to adapt the technique to control lighting, or even parameters of other sound cues.
Here it is in action (best viewed full screen):
How it works:
A "fire all" Group cue fires 3 Audio cues that all target the same audio file. Each cue has an Audio Effect applied, using a standard Apple AU bandpass filter set to cover one of three frequency ranges: HI, MID, or LOW.
A group of Network cues, each set to type "OSC message", queries these audio cues using the liveAverageLevel OSC method to get an instantaneous level reading for each frequency band, and applies those levels to parameters of Text cues containing emoji, animating the video image output. In a finished project these would probably be PNG image files with alpha channels, but I have used emoji to keep the example workspace compact.
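In the workspace these messages are typed directly into Network cues, but it can help to see what one looks like on the wire. The following stdlib-only sketch encodes a QLab-style OSC message (address plus a single string argument, per the OSC 1.0 encoding rules) and sends it over UDP. Assumptions are flagged in the comments: QLab's default OSC port 53000 on localhost, and hypothetical cue numbers MID/HI and MIDM/HIM for the other bands (only LO and LOM appear in this chapter). Whether QLab evaluates `#…#` queries inside messages arriving from an external client, rather than only in its own Network cues, is not something this chapter demonstrates, so treat this purely as an illustration of the message format.

```python
import socket

def osc_string(s: str) -> bytes:
    """Encode a string OSC-style: UTF-8, NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, arg: str) -> bytes:
    """Build an OSC message carrying a single string argument (type tag ",s")."""
    return osc_string(address) + osc_string(",s") + osc_string(arg)

# Hypothetical cue numbering for illustration: LO/MID/HI are the
# band-filtered Audio cues, LOM/MIDM/HIM the emoji Text cues.
# Only LO and LOM come from the chapter; the rest are assumptions.
messages = [
    ("/cue/LOM/scaleX",  "#/cue/LO/liveAverageLevel/1 1 20 #"),
    ("/cue/MIDM/scaleX", "#/cue/MID/liveAverageLevel/1 1 20 #"),
    ("/cue/HIM/scaleX",  "#/cue/HI/liveAverageLevel/1 1 20 #"),
]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for address, query in messages:
    # QLab listens for OSC on UDP port 53000 by default.
    sock.sendto(osc_message(address, query), ("127.0.0.1", 53000))
```

The 4-byte padding is what makes this a valid OSC packet; a plain UDP string will not be parsed as OSC.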
/cue/LOM/scaleX #/cue/LO/liveAverageLevel/1 1 20 #
This message animates the x scaling of cue LOM (the eyeballs) by getting the average level of cue LO (which has a bandpass filter applied to isolate the bass frequencies). The last 2 numbers of the query set the minimum and maximum scaling values mapped across the full audio output range.
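The min/max mapping the query performs can be illustrated with a small function. This is a sketch under stated assumptions: the level is treated as a dB value and mapped linearly from an assumed floor of -60 dB up to 0 dB onto the [min, max] output range. QLab's actual metering floor and curve for liveAverageLevel may differ, so the specific numbers here are illustrative only.

```python
def scaled_level(level_db: float, out_min: float, out_max: float,
                 floor_db: float = -60.0) -> float:
    """Map an instantaneous level in dB onto [out_min, out_max].

    Assumption: a linear map over floor_db..0 dB. QLab's real
    liveAverageLevel scaling range is not documented in this chapter.
    """
    # Clamp to the assumed metering range.
    level_db = max(floor_db, min(0.0, level_db))
    fraction = (level_db - floor_db) / (0.0 - floor_db)
    return out_min + fraction * (out_max - out_min)

# With the chapter's "1 20" range: silence leaves the eyeballs at
# scale 1, full level stretches them to scale 20.
print(scaled_level(-60.0, 1, 20))  # 1.0
print(scaled_level(0.0, 1, 20))    # 20.0
print(scaled_level(-30.0, 1, 20))  # 10.5 (halfway up the assumed range)
```

Because the mapping clamps at the floor, a silent band holds the sprite at its minimum rather than collapsing it to zero.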
You can download the example workspace here.
Chapter Author: Mic Pool