GameSessions

The gamesession plugin contains all the facilities required to manage game sessions on Stormancer server farms. Each game session is represented as a scene allocated on the farm. When joining the session, players connect to the scene. The plugin:

  • Validates player access to the game session according to the game session configuration

  • Notifies player readiness

  • In external game server mode:
    • Manages the lifecycle of the game server (startup, shutdown)

    • Provides the networking configuration to connect to the game server

  • In P2P mode:
    • Selects a host

    • Synchronizes clients

    • Coordinates with the cluster to establish P2P connectivity (both NAT Traversal & Relay are supported transparently)

The scene can host additional server logic to enable features at the game session level, for instance player-to-player broadcast and messaging, player states, chat, or server-side gameplay logic.

A P2P sample project is available on GitHub.

Dependencies

Server

  • Authentication

  • Management

Client

  • Core

Creating a game session

On the server, every game session is associated with a scene. Applications must first declare one or several scene templates that provide game session functionalities by calling the AddGameSession extension method in the template builder.

builder.SceneTemplate("gameSessionTemplate", scene =>
{
    scene.AddGameSession();

    //Add other features
});

AddGameSession adds the gamesession metadata to the scene, which in turn triggers activation of the gamesession plugin logic on scene startup. The easiest way to create a gamesession scene is to use the IGameSessions interface in the server class that creates the game session. The interface is automatically registered in the server dependency injection system by the gamesession plugin. For instance:

public class MyController : ControllerBase
{
    private readonly IGameSessions gameSessions;

    public MyController(IGameSessions gameSessions)
    {
        this.gameSessions = gameSessions;
    }

    public async Task<string> CreateGameSession(IEnumerable<string> users)
    {
        var id = Guid.NewGuid().ToString();
        await gameSessions.CreateGameSession("gameSessionTemplate", id, new GameSessionConfiguration
        {
            isPublic = false,

            parameters = JObject.FromObject(new
            {
                customParameter = "foo",
                customParameter2 = 3
            })
        });

        return id;
    }
}

Game sessions can also be created using the HTTP REST scene creation API. For more information about the process, please see the implementation of the CreateGameSession method.

Configuration

"gameServer":{
    "provider":"xxx",
    "dedicatedServerTimeout:"00:01:00", //Timeout after which dedicated server startup is considered to have failed. (defaults to +infinity)
    "useP2P":true, // Is the game session P2P?
}

Other configuration parameters are available for specific game server providers.

P2P

When useP2P is set to true, the game session is started in P2P mode. By default, the first client that connects becomes the host of the game; the library then establishes P2P connections with each newly connected player.

Dedicated server

In dedicated server mode, the game session selects an implementation of IGameServerProvider and uses it to start a game server for the game session. The provider to use is specified in the server app configuration:

"gameServer":{
    "provider":"xxx"
}

On startup, the game server receives the information necessary to:

  • Connect to the Stormancer server farm

  • Authenticate as a game server

  • Join the game session as game server

Different providers may require additional configuration parameters to be able to instantiate game servers. The currently available providers are:

Local

This provider starts a game server on the local server by running the game server executable directly. While it's not the best system for live environments, because a buggy game server can durably impact the performance of the whole server instance, it's perfect for testing and debugging.

Configuration:

"gameServer":{
    "provider":"local",
    "executable":"path to executable",
    "stormancerPort:30000, //Port bound to a local grid Raknet endpoint
    "arguments":["-log","xxx"], //Additional arguments for the server executable
    "mapName":"xxxx" //String passed as environment variable to the game server.
}

Executable parameters & environment variables:

Arguments: PORT={port for game netengine} {additional arguments}
Environment variables:
    connectionToken={Token used to connect to the game session}
    serverDedicatedPort={port for game netengine}
    clientSDKPort={port for the Stormancer library}
    serverPublicIp={public ip of the server for the strm lib}
    localGridPort={localhost port to join the grid for the strm lib}
    endpoint={grid endpoint for the strm lib}
    accountID={strm app account}
    applicationName={strm app name}
    serverMapStart={map name provided by the game session}
    authentication.token={token used for authentication}

Docker

This provider starts a Docker container for each game instance on a local Docker daemon.

Configuration:

"gameServer":{
    "provider":"local",
    "executable":"path to executable",
    "stormancerPort:30000, //Port bound to a local grid Raknet endpoint
    "arguments":["-log","xxx"], //Additional arguments for the server executable
    "mapName":"xxxx" //String passed as environment variable to the game server.
}

Executable parameters & environment variables:

Arguments: PORT={port for game netengine} {additional arguments}
Environment variables:
    connectionToken={Token used to connect to the game session}
    serverDedicatedPort={port for game netengine}
    clientSDKPort={port for the Stormancer library}
    serverPublicIp={public ip of the server for the strm lib}
    localGridPort={localhost port to join the grid for the strm lib}
    endpoint={grid endpoint for the strm lib}
    accountID={strm app account}
    applicationName={strm app name}
    serverMapStart={map name provided by the game session}
    authentication.token={token used for authentication}

ECS Fargate

This provider starts a Fargate task on ECS for each game session.

Configuration:

"gameServer":{
    "provider":"aws.ecs",
    "awsKeyId":"aws key id",
    "awsSecret :"aws secret",
    "awsRegion":"aws region on which to start the container",
    "taskDefinition":"ecs task definition id",
    "containerName": "fargate container name",
    "cluster":"fargate cluster name",
    "securityGroups":["xxx","yyy"], //container security groups
    "subnets":{"xxx","yy"] //container subnets
}

Executable parameters & environment variables:

Arguments: PORT={port for game netengine} {additional arguments}
Environment variables:
    connectionToken={Token used to connect to the game session}
    serverDedicatedPort={port for game netengine}
    clientSDKPort={port for the Stormancer library}
    serverPublicIp={public ip of the server for the strm lib}
    localGridPort={localhost port to join the grid for the strm lib}
    endpoint={grid endpoint for the strm lib}
    accountID={strm app account}
    applicationName={strm app name}
    serverMapStart={map name provided by the game session}
    authentication.token={token used for authentication}

EC2

This provider starts an EC2 instance for each game session to run a developer-provided image. It's not recommended for short game sessions, because an EC2 instance may take up to several minutes to start.

Pool Configuration:

"pvp.ec2":{
    "provider":"aws.ec2",

    "awsKeyId":"aws key id",
    "awsSecret :"aws secret",
    "awsRegion":"aws region on which to start the container",
    "awsImageId":"image id to use in the EC2 instance",
    "awsLaunchTemplate": "EC2 launch template"
}

Image parameters:

strBuilder.AppendLine(arguments);
strBuilder.AppendLine($"endPoint={applicationInfo.ApiEndpoint}");
strBuilder.AppendLine($"accountID={applicationInfo.AccountId}");
strBuilder.AppendLine($"applicationName={applicationInfo.ApplicationName}");
strBuilder.AppendLine($"authentication.token={authenticationToken}");

Connecting to a game session

To connect to a game session, game clients need a secure connection token for the game session scene. This token is most often obtained from a gamefinder.

auto gameSession = client->dependencyResolver().resolve<Stormancer::GameSession>();

//Connect to the game session and establish P2P connectivity with the host if necessary.
auto connectionInfos = gameSession->connectToGameSession(gameFound.data.connectionToken).get();

Once connectToGameSession() completes, there are two possibilities:

  • The game is the host, either because it was selected as host of a P2P game session or because it's a dedicated server. In this case, the game code must start the game server logic, whether it uses a custom network engine or directly sends messages using the game session scene networking facilities. Once ready to accept player connections, it must call SetReady() to inform the gamesession.

  • The game is a client in the game. In this case, as soon as the host calls SetReady(), a direct or relay-based P2P connection is established between client and host, then connectToGameSession() completes. It's recommended but not mandatory for the client to call SetReady() at this point.

Sample

if (connectionInfos.isHost)
{
    std::cout << "Starting as host" << std::endl;
    //If using a custom network engine, start the game server on the port specified in config->serverGamePort
}
else
{
    std::cout << "Starting as client" << std::endl;
    //If using a custom network engine, connect the game client to connectionInfos.endpoint
}
//Indicates that the game is ready. This is necessary because by calling this function
//the host indicates that it's ready to accept connections from other game clients.
gameSession->setPlayerReady().get();

P2P tunnel

Many game engines provide an integrated network engine that game developers want to use with Stormancer.

To this effect, the game session opens a P2P tunnel between each client and the host. This tunnel enables the client to send UDP packets to a port on the local loopback IP and have these packets forwarded to the game host transparently, with a very low latency cost.

The tunnel is able to switch automatically from relay to direct connection mode depending on performance and NAT traversal success.

If the host benefits from a public IP address, the runtime bypasses the tunnel and directly provides the public IP and port to the game client.

Sending P2P packets using the scene API

The scene exposes APIs to:

  • List the remote peers currently connected to the peer through P2P.

  • Be notified of any remote peer P2P connection or disconnection.

  • Send packets to one or all P2P peers instead of the scene host on the server.

  • Add routes that respond to P2P packets instead of server-originated packets.

These APIs make it possible to use the scene directly as a lighter-weight network transport mechanism if the game doesn't already integrate a network engine you want to use with a tunnel.

The gamesession API provides three members dedicated to easily using gamesession scenes in P2P and client-server topologies. They can be used to avoid creating a full-fledged plugin that augments the gamesession scene. Instead, they enable you to interact with the scene at critical points of its lifecycle to register message handlers, send messages to remote peers, and clean up on disconnection.

virtual std::shared_ptr<Scene> scene() = 0;

Event<std::shared_ptr<Scene>> onConnectingToScene;
Event<std::shared_ptr<Scene>> onDisconnectingFromScene;

Applications can subscribe to the onConnectingToScene event to add route handlers to the scene to handle P2P packets.

//This code is going to be called before actual connection happens.
auto initSubscription = gameSession->onConnectingToScene.subscribe([](std::shared_ptr<Stormancer::Scene> gs) {

        //Register a P2P route
        gs->addRoute("hello", [](Stormancer::Packetisp_ptr packet) {
                std::cout << packet->readObject<std::string>() << std::endl;
        }, Stormancer::MessageOriginFilter::Peer);

});

Reminder: to avoid memory leaks, event subscriptions are automatically unsubscribed when the subscription object returned by the subscribe() method is destroyed (the initSubscription variable in the example above). It's the responsibility of the developer to ensure it doesn't go out of scope for as long as the subscription must be kept alive.

The scene() function returns the actual scene if the client is connected to a game session, or an empty shared_ptr otherwise. Use the scene to get the list of connected peers or to send messages to one or several peers.

std::string input;
std::getline(std::cin, input);
input = userId + ": " + input;

//Broadcast a message to all other P2P peers
gameSession->scene()->send(Stormancer::MatchAllP2P(), "hello", [serializer, input](Stormancer::obytestream& stream) {
    serializer.serialize(stream, input);
});

Handling game results

When gameplay completes, each player in a P2P game can send its results to the gamesession by calling the postResult function. In a server-hosted game, only the game server is expected to call this function.

Once all game results have been gathered, or once all players that didn't send them have disconnected from the game session, the game session fires the GameSessionCompleted event.

Custom code implements the IGameSessionEventHandler interface to react to the event and update the leaderboards and player profiles accordingly.

public interface IGameSessionEventHandler
{
    Task GameSessionStarting(GameSessionContext ctx);
    Task GameSessionStarted(GameSessionStartedCtx ctx);

    Task GameSessionCompleted(GameSessionCompleteCtx ctx);
}

Game server pooling

As starting game servers may take time, a pooling mechanism enables the system to start them beforehand so that they are instantly ready to accept players on gamesession startup.

The pooling system starts servers ahead of time, can mix the different types of server hosting available in a single pool, and provides a way to prioritize the types of game servers created in the pool and the order in which they are used.

The pooling system assumes that a game server can only be used once and is shut down when the game session completes.

Configuration

The server pool is configured in its section of the server app configuration.

"serverPools":{
    "pvp.local":{
       "provider":"local",

       "executable":"path to executable",
        "stormancerPort:30000,      //Port bound to a local grid Raknet endpoint
        "arguments":["-log","xxx"], //Additional arguments for the server executable
        "mapName":"xxxx",           //String passed as environment variable to the game server.

        "maxSize":10,               //Max number of server to start in the pool
        "ready":2           //Min Number of servers to keep ready in the pool, if possible

    },
    "pvp.ecs":{
        "provider":"aws.ecs",

        "awsKeyId":"aws key id",
        "awsSecret :"aws secret",
        "awsRegion":"aws region on which to start the container",
        "taskDefinition":"ecs task definition id",
        "containerName": "fargate container name",
        "cluster":"fargate cluster name",
        "securityGroups":["xxx","yyy"],     //container security groups
        "subnets":{"xxx","yy"],             //container subnets

        "ready":0                       //If not necessary, don't keep container running
    },

    "pvp.composite":{
        "provider":"composite"
        "subpools":["pvp.local","pvp.ecs"]
        "ready":1
    },
    "pvp.ec2":{
        "provider":"aws.ec2",

        "awsKeyId":"aws key id",
        "awsSecret :"aws secret",
        "awsRegion":"aws region on which to start the container",
        "awsImageId":"image id to use in the EC2 instance",
        "awsLaunchTemplate": "EC2 launch template",

        "ready":0                       //If not necessary, don't keep container running
    },
}

The pool a game session must use is specified in its configuration.

About composite server pools

Composite server pools draw game servers from several subpools to optimize cost, availability and scalability.

The order of the subpools in the composite pool configuration is important:

  • When ensuring that enough game servers are ready, the composite pool tries to start servers in the order of the subpools array. In the above example, it first tries to create servers in pvp.local, then, if not enough servers can be started in that pool, in pvp.ecs.

  • When picking a ready server for a starting game session, the subpools array is traversed in reverse order.

This way, servers are created in the first pools as a priority, on the assumption that these are quicker to start and less expensive to run. But as soon as a server in a lower-priority pool is started and ready, the system uses it to run a game first. This minimizes the number of servers in low-priority pools that sit waiting to be used.

The number of servers a composite pool must keep ready doesn't take into account the servers kept ready by its subpools. In the above example, this means that the pvp.composite pool tries to keep 3 servers ready while the pvp.local pool is not full, and 1 afterwards. The 2 servers kept ready by the pvp.local pool are not taken into account when deciding whether enough servers are available.