Traction Control: Racing Towards Multiplayer

Sailboat Studios
Aug 10, 2021 · 8 min read


In February 2021, we released the first version of Traction Control, a downhill racing game where only some of the laws of physics apply. This release was fun, fast-paced, and satisfied our itch to drift, jump, and do all sorts of tricks. We kept the game competitive by adding a leaderboard to rank the best of the best.

But the game was missing something: it felt lonely. So, we decided to add multiplayer to our game. What could go wrong? A server here, a connection there, Bob’s your uncle! Except multiplayer is hard, especially for a fast-paced game like Traction Control.

So, what makes it so difficult?

Latency & Synchronization

It takes time to send information over the internet, and everyone’s connection speed is different. By the time we receive information, it’s already from the past. In multiplayer, this means every player sees something slightly different, and nobody is on the same page. This can be frustrating, especially in real-time competitive games.

Every game needs its own implementation of multiplayer, and the most popular method is client-side prediction: each player predicts the other players’ positions right now based on past information. But, like most things in computer science, there are trade-offs; the caveats here are cheating and bad predictions.
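
To show the idea, here is a hedged sketch of the simplest possible prediction model (dead reckoning); this is not our actual code, and the class and parameter names are hypothetical.

```csharp
using UnityEngine;

// Minimal client-side prediction sketch: estimate where a remote player is
// "right now" from their last reported state and how old that report is.
public static class RemotePrediction
{
    // lastPosition/lastVelocity come from the newest network frame we have;
    // latencySeconds is our estimate of how old that frame is.
    public static Vector3 PredictPosition(Vector3 lastPosition, Vector3 lastVelocity, float latencySeconds)
    {
        // Assume constant velocity over the latency window. Sharp turns and
        // collisions are exactly where this kind of prediction goes wrong.
        return lastPosition + lastVelocity * latencySeconds;
    }
}
```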

Interpolation and Extrapolation

Players send their data at a specific rate, but the rate at which it arrives is not consistent: latency can make the receive rate swing higher or lower, and we cannot guarantee the order of the data or even its arrival. This uncertainty must be overcome for a smooth multiplayer experience. The data are also discrete, so without interpolation we will see teleportation, stuttering, and jittering. And when we are left waiting for data, we can’t simply stop; we must extrapolate.

We are using Unity Engine for Traction Control, which runs physics computations at 50 Hz. Why not send updates at 50 Hz, so no interpolation is needed? This won’t work: our receive rate is not guaranteed to be 50, and our data can arrive out of order or not at all. Sending at too high a rate can also cause latency issues; the more data we send, the slower it arrives.
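
In practice that means sending below the physics rate. Here is a hedged sketch of such a throttle; the component name, the rate, and SendSnapshot are all hypothetical.

```csharp
using UnityEngine;

// Illustrative send-rate throttle: FixedUpdate runs at Unity's 50 Hz physics
// rate, but we only send a transform snapshot every Nth physics step.
public class TransformSender : MonoBehaviour
{
    public int sendEveryNthStep = 5;   // e.g. 50 Hz / 5 = 10 snapshots per second (hypothetical rate)
    private int stepCount;

    private void FixedUpdate()
    {
        stepCount++;
        if (stepCount % sendEveryNthStep != 0) return;

        // Stand-in for whatever serializes and sends the transform over the network.
        SendSnapshot(transform.position, transform.rotation);
    }

    private void SendSnapshot(Vector3 position, Quaternion rotation)
    {
        // ... compress and send over UDP (see the sections below) ...
    }
}
```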

Networking

Each player has a unique set of technical factors that affect their relative latency with respect to other players. Players can have different ISPs, firewall settings, routers, or even be on the other side of the planet from each other.

We need a third party to mediate connections and tell players where to send data. We need services on the cloud to put players together and serve other features. We must also consider reliability, scale, performance, and more.

Getting Started

We decided to take the client-server approach.

The server matches all the players into a lobby. Players send messages to the server, which broadcasts them to the other players. For simplicity, we are not simulating physics on the server. The server is there to matchmake and broadcast messages; that’s it. This was the simplest implementation given our constraints.

Because the server is the host, players can join and leave as they please without a host migration process. But this also means the host handles the compute of sending and receiving every message, which adds cost and reduces scalability.

Shaking Hands

Some data must be reliable, such as matchmaking, in-game messaging, and player information. For this we use the handshake protocol, TCP, which guarantees that data arrive and arrive in order; we chose socket.io for its simplicity. The player shakes hands with the server and agrees to make a connection, then sends messages that the server broadcasts to the other players. The drawback of this protocol is speed: it’s slow.
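
As a rough illustration of the reliable path, here is a sketch using the SocketIOClient NuGet package. We only say "a C# NuGet package for socket.io" later in this post, so the library choice, server URL, and event names here are assumptions, not our exact code.

```csharp
using System.Threading.Tasks;
using SocketIOClient;

// Hedged sketch of the reliable (TCP) channel via socket.io.
public static class ReliableChannelExample
{
    public static async Task RunAsync()
    {
        var client = new SocketIO("https://lobby.example.com"); // placeholder server URL

        // Reliable broadcasts from the server (matchmaking, chat, player info).
        client.On("lobby-message", response =>
        {
            var text = response.GetValue<string>();
            // ... update the lobby UI, chat, or player list ...
        });

        await client.ConnectAsync();   // the handshake
        await client.EmitAsync("lobby-message", "hello from Traction Control");
    }
}
```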

Spamming Transforms

For real-time communication of vehicle transforms (positions and rotations), we use the spam protocol, UDP. There is no guarantee of reliability or order, and no handshakes; UDP fires at a network address and hopes the data arrive. Speed is the main benefit: no handshakes, no acknowledgements, so data arrive fast, which is great for cars speeding around a map. The catch is that some firewalls block this protocol.
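
A minimal sketch of the unreliable path using .NET’s built-in UdpClient (we do rely on the default .NET networking interfaces); the endpoint and payload here are placeholders.

```csharp
using System.Net;
using System.Net.Sockets;

// Fire-and-forget UDP send: no handshake, no acknowledgement, no ordering.
public static class UnreliableChannelExample
{
    public static void SendTransform(byte[] compressedFrame)
    {
        using var udp = new UdpClient();
        var server = new IPEndPoint(IPAddress.Loopback, 9000); // placeholder address and port

        udp.Send(compressedFrame, compressedFrame.Length, server);
    }
}
```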

Falling Back

If a player’s firewall blocks UDP, we don’t want to leave them hanging, so we fall back on socket.io to send messages. While this slows the messages down, at least they arrive.
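
Conceptually the fallback is a simple switch. This sketch is illustrative only; the probe and the two senders are hypothetical stand-ins for the real transports.

```csharp
using System.Threading.Tasks;

// Hedged sketch of the transport fallback: try UDP first, and if a probe
// never gets an answer (e.g. blocked by a firewall), use socket.io instead.
public class TransportSelector
{
    public bool UseUdp { get; private set; }

    public async Task ChooseTransportAsync()
    {
        // Send a small UDP packet to the server and wait briefly for an echo.
        UseUdp = await ProbeUdpAsync(timeoutMs: 1000);
    }

    public void SendFrame(byte[] frame)
    {
        if (UseUdp) SendOverUdp(frame);   // fast path
        else SendOverSocketIo(frame);     // slower, but it arrives
    }

    // Stand-ins for the real implementations.
    private Task<bool> ProbeUdpAsync(int timeoutMs) => Task.FromResult(true);
    private void SendOverUdp(byte[] frame) { }
    private void SendOverSocketIo(byte[] frame) { }
}
```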

Smooth as Butter

When the transform messages arrive, we interpolate to provide a smooth experience. Each message is a “frame” that gets pushed into a frame buffer. We know the amount of time that we need to interpolate between frames. With a send rate of R frames/second, we interpolate for 1/R seconds.

When messages don’t arrive, we’re stuck waiting, so we extrapolate until a new frame shows up. Each frame carries the car’s velocity, and we keep applying that velocity while we wait. When a new frame arrives, we set the transform, but we still need another frame to interpolate towards, so we wait and extrapolate again. To keep this from happening too often, the frame buffer has more than two slots, and it should scale with the send rate: a higher rate means more slots.
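
Here is a hedged sketch of that receive side: buffered frames are interpolated over 1/R seconds, and the last known velocity is used to extrapolate when the buffer runs dry. The names, buffer handling, and default rate are assumptions rather than our actual code.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Interpolation/extrapolation sketch for a remote player's car.
public class RemoteCar : MonoBehaviour
{
    public float sendRate = 10f;   // R frames per second (hypothetical value)

    private readonly Queue<Frame> buffer = new Queue<Frame>();
    private Frame from, to;
    private bool hasSegment;       // do we currently have two frames to interpolate between?
    private float t;               // 0..1 progress between 'from' and 'to'

    public struct Frame
    {
        public Vector3 position;
        public Quaternion rotation;
        public Vector3 velocity;   // used for extrapolation when the buffer runs dry
    }

    // Called by the receiver when a transform message is decoded.
    public void PushFrame(Frame frame) => buffer.Enqueue(frame);

    private void Update()
    {
        float duration = 1f / sendRate;   // interpolate for 1/R seconds per segment

        if (!hasSegment || t >= 1f)
        {
            if (buffer.Count > 0)
            {
                from = hasSegment ? to : buffer.Peek();
                to = buffer.Dequeue();
                hasSegment = true;
                t = 0f;
            }
            else if (hasSegment)
            {
                // No new frame yet: extrapolate with the last known velocity.
                transform.position += to.velocity * Time.deltaTime;
                return;
            }
            else
            {
                return; // nothing received yet
            }
        }

        t += Time.deltaTime / duration;
        transform.position = Vector3.Lerp(from.position, to.position, Mathf.Clamp01(t));
        transform.rotation = Quaternion.Slerp(from.rotation, to.rotation, Mathf.Clamp01(t));
    }
}
```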

This method keeps the experience smooth even when players have a lot of latency. The caveat is the lack of synchronization: at low speeds, collisions work as expected and latency is hardly noticeable, but at high vehicle speeds things get funky, and one player can see a collision that the other doesn’t. There is still work to do in the prediction department.

Since sending more data increases latency, we compress each frame before we send it. By decreasing the precision of values where possible and encoding data into the bits of a byte, we were able to reduce the data per frame by 75%.
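
Our exact encoding is specific to Traction Control, but the two techniques are generic. This sketch only illustrates them: quantizing floats to 16-bit fixed point and packing flags into single bits, with ranges and flag meanings chosen for illustration.

```csharp
using System;

// Illustrative compression helpers: lower precision + bit packing.
public static class FrameCompression
{
    // Map a float in [-range, +range] onto a short (half the bytes of a float).
    public static short Quantize(float value, float range)
    {
        float clamped = Math.Max(-range, Math.Min(range, value));
        return (short)Math.Round(clamped / range * short.MaxValue);
    }

    public static float Dequantize(short value, float range)
        => value / (float)short.MaxValue * range;

    // Pack up to 8 flags (e.g. handbrake, airborne, headlights) into one byte.
    public static byte PackFlags(bool[] flags)
    {
        byte packed = 0;
        for (int i = 0; i < flags.Length && i < 8; i++)
            if (flags[i]) packed |= (byte)(1 << i);
        return packed;
    }

    public static bool UnpackFlag(byte packed, int index) => (packed & (1 << index)) != 0;
}
```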

The tried-and-true method is a combination of server- and client-side prediction: the server is the authority for all physics, and bad predictions on the client roll back to whatever the server says. This is an endeavour we would like to avoid, as it involves a massive investment of time and money. Instead, we may use client-side neural networks to predict physics for other players.

Lobbies

To put players together, we let them create lobbies. A player selects their privacy, level, and weather, then sends the request off to the server. A lobby contains level information (weather, mode) and player information (username, car, colour, address). The lobby persists as long as one player is in it; we delete it when everyone leaves.
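
For illustration, the lobby data might be shaped something like this; the field names are assumptions based on the lists above, not our server’s actual schema.

```csharp
using System.Collections.Generic;

// Hypothetical lobby data as described above.
public class LobbyPlayer
{
    public string Username;
    public string Car;
    public string Colour;
    public string Address;   // network endpoint the server relays to
}

public class Lobby
{
    public string Code;        // join-by-code identifier
    public bool IsPrivate;     // private lobbies can only be joined via the code
    public string Level;
    public string Weather;
    public string Mode;
    public List<LobbyPlayer> Players = new List<LobbyPlayer>();

    // The lobby persists while at least one player remains; the server
    // deletes it when this list becomes empty.
}
```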

We also need to let players join lobbies, so we give them the option to join a random lobby or join by code. The only way to join a private lobby is through its code. The player sends their connection request, then enters the lobby and sees the other players.

Adding some Spice

Once we got the MVP finished, we wanted to add some extra oomph to the experience. We created a message queue to tell users when other players join, leave, or finish challenges. We also added a lobby leaderboard to encourage players to complete challenges.

In the future, we want to add racing, freestyling, and other fun game modes such as capture the flag. With the baseline completed, there are so many possibilities!

Implementation

We used Node.js for our server, along with a ton of libraries. For the game, we used a C# NuGet package for socket.io and the default networking interfaces provided by .NET. We deployed our server on AWS EC2.

There are other networking pieces in our game including leaderboards and analytics. We decided to go with a microservice architecture to support new services as we build them. It looks something like this:

This allows us to work on our networking services in smaller, more manageable chunks. It allows us to use different tools and languages without bloating a single code base.

On the client-side, the design looks something like this:

After lobby creation, the session manager has the information to send/receive data. The session manager handles low-level networking logic like compression, sending, and receiving data. The broadcaster sends out TCP and UDP messages to the server then to other players. The receiver creates new players and interpolates/extrapolates incoming transforms.
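
In rough outline, the split looks like this; the interface and member names are assumptions for the sake of the sketch, not the real Traction Control code.

```csharp
// Hypothetical client-side seams between the pieces described above.
public interface IBroadcaster
{
    void SendReliable(string eventName, object payload);   // TCP / socket.io path
    void SendTransform(byte[] compressedFrame);             // UDP path
}

public interface IReceiver
{
    void OnPlayerJoined(string username);                    // spawn a remote car
    void OnTransformFrame(string username, byte[] frame);    // feed interpolation/extrapolation
}

public class SessionManager
{
    private readonly IBroadcaster broadcaster;
    private readonly IReceiver receiver;

    public SessionManager(IBroadcaster broadcaster, IReceiver receiver)
    {
        this.broadcaster = broadcaster;
        this.receiver = receiver;
    }

    // Low-level plumbing (compression, send/receive scheduling) lives here.
}
```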

Next Steps

We still need to improve our prediction and rework our architecture.

Prediction and convincing physics between cars are high on the priority list. As mentioned, this is the hardest part. There is a lot of experimentation and testing to do, but we are confident in our ability to get there.

Finally, we may stray away from the client-server model in a sense. Instead of using the cloud server as the host, we will change it to a matchmaker. One of the players in a lobby can be the host and relay messages to other players. This will reduce the load on the cloud server. Although, like most of this project, there are trade-offs. The experience will always be best for the host, and the consistency of the experience will take a hit.

Results

There’s still a lot to do, but the results thus far are fantastic:

Try it out for yourself with a friend! Going forward, we will continue improving the system and learning about multiplayer. It’s incredible what some critical thinking and experimentation can do. If you have any ideas or comments, we’d love to hear them.

Justin

Bonus: Leaderboards and Analytics

We implemented a leaderboard system for our racing and freestyler modes. We wanted to give users a way to see how they stack up against others. We support all-time, monthly, weekly, and daily records. After we created our microservice architecture, we plugged our leaderboard service in. Using PostgreSQL and Python3 with Flask, we made a leaderboard system that can work with any game.

Unity Analytics is a convenient tool built into the engine that helps us keep track of various key metrics. We track which cars people use, which levels they play, what’s too hard or too easy, and more. This addition is recent but important: we want to design and improve our games based on data.
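
As an example of the kind of event we track, here is a hedged snippet using the legacy Unity Analytics custom-event call; the event name and parameters are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine.Analytics;

// Hypothetical custom event: record which car a player picked for a level.
public static class CarAnalytics
{
    public static void ReportCarSelected(string carName, string levelName)
    {
        Analytics.CustomEvent("car_selected", new Dictionary<string, object>
        {
            { "car", carName },
            { "level", levelName }
        });
    }
}
```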
