Rhythm game with Unity3D: latency free sync (android and other platforms)

Making a rhythm game with Unity3D...

...can be challenging. You may have noticed some delay or latency between the animation and the music when using the built-in audio source. Or maybe you got your game running nicely in sync as a standalone build, but it exhibits serious audio latency on Android.

Having developed two rhythm games for standalone, Android and iOS with Unity 3D, I thought I could share my tips. I will talk about retrieving the exact track time as heard by the user, and about dealing with input latency.

Latency?

There are two main factors that can make your rhythm game feel out of sync.

Audio hardware latency

This often designates what is actually called render latency: the delay between the moment an application submits a buffer of audio data to the audio APIs and the moment that audio is heard.

Touch-to-app latency

Relevant on mobile devices, it's the delay between a user's tap on the screen and the corresponding event on the software side.

Unity settings

There are a few settings you can tune directly in Unity 3D to optimize latency:

  • Edit → Project Settings → Audio → set DSP Buffer size to Best latency
  • Select your audio clip assets and ensure Load Type is on Decompress On Load
  • Minimize audio effects as much as possible
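
If you prefer to configure this from code (for example, per platform at startup), the DSP buffer size can also be changed through AudioSettings. A minimal sketch, where the value of 256 samples is just an example, not a recommendation from my projects:

```csharp
using UnityEngine;

// Roughly equivalent to picking "Best latency" in the Audio project settings.
AudioConfiguration config = AudioSettings.GetConfiguration();
config.dspBufferSize = 256;   // smaller buffer => lower render latency, more CPU
AudioSettings.Reset(config);  // note: Reset stops all currently playing audio
```

Because AudioSettings.Reset() interrupts all playing audio, do this before starting the music, not during gameplay.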

Getting the exact track time on iOS and standalone

The bad:

AudioSource audioSource;

float GetTrackTime()
{
    return audioSource.time;
}

The good:

float GetTrackTime()
{
    return (audioSource.timeSamples / (float)audioSource.clip.frequency);
}

The best:

double trackStartTime;

void StartMusic()
{
    trackStartTime = AudioSettings.dspTime + 1;
    audioSource.PlayScheduled(trackStartTime);
}

double GetTrackTime()
{
    return AudioSettings.dspTime - trackStartTime;
}

From my tests, audioSource.time can easily lag 0.1s behind the value derived from audioSource.timeSamples. For reference, a musician can feel a delay as low as 0.02s. The last option uses the best accuracy Unity's audio system can offer, AudioSettings.dspTime. Scheduling the start is necessary because audioSource.Play() can incur a delay of up to 0.5s on some platforms. The second option may be accurate enough for your game; with the last one, you will have to handle pause and seek operations yourself.
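
For instance, pausing under the dspTime scheme could be handled by re-anchoring the start time on resume. A minimal sketch (my own illustration, not code from my games; PauseMusic/ResumeMusic are hypothetical names, and UnPause() itself still incurs some latency):

```csharp
using UnityEngine;

AudioSource audioSource;
double trackStartTime;
double pausedTrackTime; // track position at the moment of pausing

void PauseMusic()
{
    // Remember where we were in the track, in dsp-clock terms.
    pausedTrackTime = AudioSettings.dspTime - trackStartTime;
    audioSource.Pause();
}

void ResumeMusic()
{
    // Shift the anchor so GetTrackTime() stays continuous after the pause.
    trackStartTime = AudioSettings.dspTime - pausedTrackTime;
    audioSource.UnPause();
}
```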

Getting the exact track position on Android

This is another story. The offset between the reported track time and what the user actually hears on Android can vary from 0.01s to 0.2s depending on the device. From my personal experience, to offer consistent synchronization on every device, you have no choice but to write native Android code and bypass Unity's audio system.

The trick is to use hardware timestamps combined with an Android AudioTrack starting from KitKat (API 19), and to fall back to the latency measurement the manufacturer provides on earlier versions.
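
For the curious, the core of the KitKat-era trick is small: AudioTrack.getTimestamp() fills an android.media.AudioTimestamp with a frame position and the nano-time at which that frame was actually presented to the output, and you extrapolate from there to "now". A sketch of that extrapolation, written in C# for illustration (in a real plugin this lives in native Java/Kotlin code; the parameter names are mine):

```csharp
// framePosition / timestampNanos would come from android.media.AudioTimestamp,
// nowNanos from System.nanoTime(), sampleRate from the AudioTrack.
double GetHeardTrackTime(long framePosition, long timestampNanos,
                         long nowNanos, int sampleRate)
{
    // Seconds elapsed since the timestamped frame was actually heard.
    double elapsedSeconds = (nowNanos - timestampNanos) / 1e9;
    // Position of the timestamped frame, plus what has played since.
    return framePosition / (double)sampleRate + elapsedSeconds;
}
```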

If you don't want to bother, I wrote a minimalist plugin that lets you play a music track and get its exact playhead position on any Android device.

Dealing with input/touch latency on android and ios

First things first: do not use Unity's UI Button. You may have noticed it already, but its callback fires on the release, not on the press, which is obviously bad for a rhythm or music game's inputs.

Instead, put a tag on your button/hit zone and use the following code:

private void Update()
{
    // tip: don't use ui buttons for rhythm games. The callback is on the release...
    if (Input.touchSupported)
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Began)
            {
                if (EventSystem.current.currentSelectedGameObject != null &&
                    EventSystem.current.currentSelectedGameObject.CompareTag("HitBtn"))
                    TapTheBeat();
            }
        }
    }
}

Then for touchscreen latency, here is the code I'm using in Beat the Rhythm:

#if UNITY_ANDROID
    const float MIN_TOUCH_LATENCY = 0.025f; // Samsung Galaxy Note Edge
#elif UNITY_IOS
    const float MIN_TOUCH_LATENCY = 0.023f; // iPhone 6s Plus
#else
    const float MIN_TOUCH_LATENCY = 0;
#endif

    const float SHOULD_HIT_TIME = ????; // the supposed time the user should tap the screen/button
    const float TIMING_TOLERANCE = 0.02f; // just an example: the time window in which the user can hit

    void UserAction()
    {
        float backwardTolerance = Time.unscaledDeltaTime + MIN_TOUCH_LATENCY;
        float actionTrackTime = GetTrackTime();

        float actionDelay = Mathf.Abs(SHOULD_HIT_TIME - actionTrackTime);

        if (actionTrackTime > SHOULD_HIT_TIME) // if the user input is later than the perfect hit time
            actionDelay -= backwardTolerance;

        if (actionDelay <= TIMING_TOLERANCE)
            Success();
        else
            Fail();
    }

Unfortunately, to deal with touch latency I didn't find any option other than compensating with an average value per platform. But as you can see, touch latency isn't the only thing to handle: you should also compensate for Unity 3D's input latency. Your button or touch callback only fires during an Update, meaning the actual touch could have happened anytime between the last frame and this one.

Good luck with your Unity 3D rhythm game!

I hope these few tips will help you make a great music game. Should you have any questions about audio latency on Android or need a hand with your game, I'm available for freelance work 🙂 I would also be happy to simply have a look at what you have done!
