Unity Spectrogram Visualization
Tongyu Zhou (February 2022)
This tutorial demonstrates how you can create animated spectrograms from any audio plugged into Unity. The main idea relies on the AudioSource.GetSpectrumData function, which performs a fast Fourier transform (FFT) on the currently playing audio stream, converting the signal from the time domain (amplitude over time) into the frequency domain and returning relative amplitudes for different frequencies. We then map these relative amplitudes to mesh vertices and update the vertices as the audio plays to simulate the sound profile in real time. I will post my full code below, which should work if you just plug it in, but you can also follow along to understand what I'm doing!
Goal: Create an animated spectrogram where (x = frequency, y = amplitude, z = time)
Getting Relative Amplitudes
0. Before you begin, make sure you have audio playing in your Unity scene. This can be achieved by adding a normal game object, attaching an AudioSource component, and dragging in your desired sound clip.
1. Obtain a reference to your audio source by calling
audioSource = nameOfGameObject.GetComponent<AudioSource>();
2. Create an array of floats (we'll be calling this spectrum) to store the relative amplitudes. Make sure the length of this array is a power of 2. Pass your audio source through the FFT function by calling
audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);
The resulting spectrum array will contain the extracted relative amplitudes, split into as many bins as the length you specified for the spectrum array. For example, if you initialized spectrum with a length of 512, then at a sample rate of 48 kHz each bin covers 48000 / 2 / 512 ≈ 47 Hz. See this post if you want to read more about what is happening under the hood, but generally 512 and 1024 are good array length choices.
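Putting steps 1 and 2 together, here is a minimal sketch of a script that reads the spectrum each frame (this assumes the script sits on the same game object as the AudioSource; the class and field names are my own):

```csharp
using UnityEngine;

public class SpectrumReader : MonoBehaviour
{
    // Must be a power of 2; 512 or 1024 are reasonable choices.
    private const int SpectrumSize = 512;
    private float[] spectrum = new float[SpectrumSize];
    private AudioSource audioSource;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
    }

    void Update()
    {
        // Channel 0, Blackman-Harris window.
        audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Each bin covers (sampleRate / 2) / SpectrumSize Hz,
        // e.g. 48000 / 2 / 512 ≈ 47 Hz at a 48 kHz output rate.
        float hzPerBin = AudioSettings.outputSampleRate / 2f / SpectrumSize;
    }
}
```

Note that GetSpectrumData fills the array you pass in rather than allocating a new one, so it is cheap to call every frame.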
3. Display the spectrum array! In the documentation for AudioSource.GetSpectrumData, you can run their sample code to see the spectrum data, scaled in 4 different ways (see GIF in the next section).
Displaying a Mesh
Since we will be updating our spectrogram (mesh) at each frame as the audio plays, we need to generate the mesh dynamically. I found two video guides that were useful for procedurally generating the mesh, and both are quite beginner friendly (you can essentially just copy what he is doing):
If you followed the video guides above and visualized the spectrum data from the previous step, you should have something like this:
Combining the Two: Animating the Spectrogram
Now that we have both the spectrum data and the procedurally generated mesh, we can combine them into a proper animation to actually visualize the spectrum data. To do this, we just need to modify the height of each mesh vertex to match the relative amplitude at each frequency. To create an animated effect where some history of the previous relative amplitudes is maintained, we push all the height values back by one row each time we add a new one. We can achieve this through a for loop:
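A sketch of that loop, assuming the grid's vertices are stored row by row in a vertices array (x = frequency bin, z = time row), with Width frequency bins, Depth rows of history, and an illustrative heightMultiplier scale factor:

```csharp
// Push every row's heights back by one row (towards older time) ...
for (int z = Depth - 1; z > 0; z--)
{
    for (int x = 0; x < Width; x++)
    {
        vertices[z * Width + x].y = vertices[(z - 1) * Width + x].y;
    }
}
// ... then write the newest spectrum into the front row,
// scaled up so the peaks are visible.
for (int x = 0; x < Width; x++)
{
    vertices[x].y = spectrum[x] * heightMultiplier;
}
mesh.vertices = vertices;
mesh.RecalculateNormals();
```

Reassigning mesh.vertices and calling RecalculateNormals each frame is what makes Unity pick up the new heights (and keeps the lighting correct as the surface moves).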
With the above code added, you should be seeing something similar to the GIF on the left. Now, it would be more useful if we could add colors that scaled based on the amplitude. The video on the right below demonstrates how you could dynamically color the mesh based on vertex height (which is exactly what we want)!
Note: the video is a bit dated and does not work with the latest version of Unity. Here are a couple of changes that I had to make/adapt to make the shaders work:
The "Lightweight Render Pipeline" package is now called "Universal RP" in Unity 2019.3+
Shader Graph UI has changed:
To create a shader graph: in Projects, right click > Create > Shader > Blank shader graph
To edit: Open the new shader graph in the editor > Active Targets > Universal > Create vertex color node > Connect a vertex color node to base color
And voila! Here's what I got after following the above steps, changing the multiplier for spectrum to 1000, and changing the gradient color from red -> yellow:
I have posted my entire Unity project at this repository, if you ever want to fork it and build on top of it yourself! In addition to the visualization, it has a couple of interactive functionalities, including:
[A] on right controller to toggle play/pause
[Y/X] on left controller to increase/decrease volume
[hold down trigger and rotate] on right controller to rotate the visualization
[hover] to see frequency/relative amplitude of point you are pointing at
Otherwise, here is the code snippet for generating the spectrogram mesh only:
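For reference, here is a minimal self-contained sketch along the lines described above (class name, grid dimensions, and the heightMultiplier and cellSize parameters are illustrative, not taken from the original project):

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource), typeof(MeshFilter))]
public class SpectrogramMesh : MonoBehaviour
{
    private const int Width = 512; // frequency bins (power of 2)
    private const int Depth = 64;  // rows of time history
    public float heightMultiplier = 1000f;
    public float cellSize = 0.1f;

    private AudioSource audioSource;
    private Mesh mesh;
    private Vector3[] vertices;
    private float[] spectrum = new float[Width];

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
        mesh = new Mesh();
        GetComponent<MeshFilter>().mesh = mesh;
        BuildGrid();
    }

    // Build a flat Width x Depth grid of vertices and triangles.
    void BuildGrid()
    {
        vertices = new Vector3[Width * Depth];
        for (int z = 0; z < Depth; z++)
            for (int x = 0; x < Width; x++)
                vertices[z * Width + x] =
                    new Vector3(x * cellSize, 0f, z * cellSize);

        int[] triangles = new int[(Width - 1) * (Depth - 1) * 6];
        int t = 0;
        for (int z = 0; z < Depth - 1; z++)
        {
            for (int x = 0; x < Width - 1; x++)
            {
                int i = z * Width + x;
                // Two triangles per grid cell, clockwise winding.
                triangles[t++] = i;
                triangles[t++] = i + Width;
                triangles[t++] = i + 1;
                triangles[t++] = i + 1;
                triangles[t++] = i + Width;
                triangles[t++] = i + Width + 1;
            }
        }
        mesh.vertices = vertices;
        mesh.triangles = triangles;
    }

    void Update()
    {
        audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Push existing heights back one row, then write the new spectrum.
        for (int z = Depth - 1; z > 0; z--)
            for (int x = 0; x < Width; x++)
                vertices[z * Width + x].y = vertices[(z - 1) * Width + x].y;
        for (int x = 0; x < Width; x++)
            vertices[x].y = spectrum[x] * heightMultiplier;

        mesh.vertices = vertices;
        mesh.RecalculateNormals();
    }
}
```

Attach this to a game object that has an AudioSource with a playing clip and a MeshFilter plus MeshRenderer, and the surface should animate as the audio plays.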