Description

When a client sends an Unreliable RPC to the server in the same frame as a Reliable RPC, the Unreliable RPC is effectively treated as Reliable. This causes extremely high lag on each lost RPC and on all subsequent ones. The user loses the main benefit of Unreliable RPCs, which should deliver the most recent information as fast as possible and ignore lost information. This is a regression from 4.19.2 (CL-4033788).

This was reported and tested in 4.20.3 (CL-4369336) and reproduced in Main 4.21 (CL-4392549).
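
For reference, here is a minimal sketch of the two RPC flavors involved, assuming a C++ project; AMyPawn, ServerReliablePing, and ServerUnreliablePing are hypothetical names used only for illustration, not taken from the attached project:

    // MyPawn.h -- hypothetical client-owned pawn used only for illustration.
    #include "GameFramework/Pawn.h"
    #include "MyPawn.generated.h"

    UCLASS()
    class AMyPawn : public APawn
    {
        GENERATED_BODY()

    public:
        // Reliable: queued and resent until the server acknowledges it.
        UFUNCTION(Server, Reliable, WithValidation)
        void ServerReliablePing(int32 Sequence);

        // Unreliable: best effort; a lost packet should simply be skipped.
        UFUNCTION(Server, Unreliable, WithValidation)
        void ServerUnreliablePing(int32 Sequence);
    };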

Steps to Reproduce
  1. Download and open the attached project
  2. Open the dropdown next to the Play button and go to Advanced Settings > Multiplayer Options
  3. Set the Number of Players to 2
  4. Uncheck Use Single Process
  5. Set Editor Multiplayer Mode to Play As Listen Server
  6. Play as Standalone Game
  7. Observe the on-screen messages: red messages describe the bug; green and white ones indicate correct behavior.

Alternatively, from a Blank Project:

  1. Configure packet emulation with 20% packet loss and 500 ms of lag
  2. Call an Unreliable RPC from Client to Server in the same frame as a Reliable RPC from Client to Server (see the sketch after this list)
  3. Observe that the Unreliable RPCs are never lost and that very high lag appears on the RPCs that follow a lost packet (RPCs that did arrive wait for the lost one)
  4. Observe that the opposite direction works correctly: Unreliable RPCs called by the Server on the Client are sometimes lost, and lag stays normal (this direction is sketched after the Results paragraph below)
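A minimal sketch of steps 1 and 2, reusing the hypothetical AMyPawn declarations from the Description above; the [PacketSimulationSettings] keys are standard UE4 config entries, but packet emulation only takes effect in non-shipping builds:

    // DefaultEngine.ini -- step 1, emulate 20% packet loss and 500 ms lag:
    //
    //   [PacketSimulationSettings]
    //   PktLoss=20
    //   PktLag=500

    // MyPawn.cpp -- step 2, fire both RPCs in the same frame from the client.
    // (PrimaryActorTick.bCanEverTick must be set to true in the constructor.)
    void AMyPawn::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        // Only the locally controlled client copy should call Server RPCs.
        if (IsLocallyControlled() && !HasAuthority())
        {
            static int32 Sequence = 0;
            ServerReliablePing(Sequence);
            ServerUnreliablePing(Sequence); // same frame as the Reliable call
            ++Sequence;
        }
    }

    // Server-side bodies. With 20% packet loss, the unreliable sequence numbers
    // should show gaps; on affected versions they never do, and late ones stall.
    void AMyPawn::ServerReliablePing_Implementation(int32 Sequence)   { UE_LOG(LogTemp, Log, TEXT("Reliable %d"), Sequence); }
    bool AMyPawn::ServerReliablePing_Validate(int32 Sequence)         { return true; }
    void AMyPawn::ServerUnreliablePing_Implementation(int32 Sequence) { UE_LOG(LogTemp, Log, TEXT("Unreliable %d"), Sequence); }
    bool AMyPawn::ServerUnreliablePing_Validate(int32 Sequence)       { return true; }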

Results: The problem appears only for Unreliable RPCs called from Client to Server (not for RPCs called from Server to Client). Those Unreliable RPCs are never lost, which causes very high lag (sometimes more than 1 second) on the RPCs that were not lost: they wait for the lost one to be resent.
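
For the working direction (step 4 above), a Client RPC sketch with the same hypothetical naming; declared on AMyPawn and called on the server's copy of the pawn, it is dropped rather than resent when its packet is lost:

    // MyPawn.h -- Server-to-Client direction, which behaves as expected.
    UFUNCTION(Client, Unreliable)
    void ClientUnreliablePing(int32 Sequence);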

Expected: Unreliable RPCs should always behave as unreliable. Their delivery does not need to be guaranteed, and if the Engine detects a lost packet, it should not wait for a resend before delivering the next Unreliable RPC.

Have Comments or More Details?

There's no existing public thread on this issue, so head over to Questions & Answers and mention UE-64431 in the post.

Status: Fixed
Component: UE - Networking
Affects Versions: 4.21, 4.20.3
Target Fix: 4.21
Fix Commit: 4492487
Main Commit: 4551290
Release Commit: 4492487
Created: Sep 25, 2018
Resolved: Oct 19, 2018
Updated: Nov 8, 2018