Under certain network emulation settings, outgoing packets from the client appear to be dropped in groups (around 4 or more packets at a time) rather than individually. As a result, the actual packet loss rate ends up much higher than the value specified in the settings.
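To illustrate why grouped drops inflate the effective rate, here is a minimal simulation (all values hypothetical, not taken from the emulator's implementation): it compares independent per-packet loss at a configured rate against a model where each loss event discards a burst of 4 consecutive packets.

```python
import random

random.seed(42)

PACKETS = 100_000
LOSS_SETTING = 0.05   # hypothetical configured loss rate (5%)
BURST_SIZE = 4        # observed group size from this report

# Independent per-packet loss: each packet dropped with probability LOSS_SETTING.
independent_drops = sum(random.random() < LOSS_SETTING for _ in range(PACKETS))

# Burst loss: each loss event (same per-packet probability) discards the
# next BURST_SIZE packets together instead of just one.
burst_drops = 0
i = 0
while i < PACKETS:
    if random.random() < LOSS_SETTING:
        burst_drops += min(BURST_SIZE, PACKETS - i)
        i += BURST_SIZE
    else:
        i += 1

print(f"independent loss: {independent_drops / PACKETS:.1%}")
print(f"burst loss:       {burst_drops / PACKETS:.1%}")
```

With these numbers the burst model loses roughly three times as many packets as the configured rate, which matches the kind of discrepancy described above.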
This only seems to occur when the Emulation Target is "Everyone"; with "Clients Only" emulation, outgoing packet loss matches the configured value. It also only seems to occur when some latency is being emulated: setting the min/max latency values to 0 restores the expected packet loss.