1. I run a VM'd Rust server that hits 50-60 people. It sits at 170k entities, and at this population it starts to bottom out at 150-200fps; people notice lag and stutters on the server. The VM has two sockets and 2 cores, plus 12GB of memory. CPU sits at 60%, memory at 8GB.

    I built a dedicated HP ML380 G7 with 2x sockets, 12 cores. That server gets 400fps empty (using a save from the VM server); empty, the VM server gets about 700fps.

    What the hell does Rust need to get higher fps with more users?
     
  2. That fps is still good; anything above 30 is fine. But remember, not every player runs a good gaming PC, so they can still get lag, not from the server but from having a crap PC.
     
  3. Well, below around 100fps, people start getting stutter on the server.
     
  4. Yep, I've noticed that stutter starts below 150fps on one of my servers too. When it happens, I trawl the server for an hour looking for abandoned bases, random stone walls, and raid towers to remove.

    In addition, I've added a rule: if a player wants to use a twig raid tower they can, but they must clean up after the raid. I've set twig to decay within 24 hours using the TwigsDecay plugin. When people get offline-raided and there's a stone tower outside, the first thing they do is cry "admin!" They want that player punished. ;-)

    I've installed AutoPurge and set it to remove bases after a player (AND friends/clanmates) hasn't been on for 4 days.

    If I keep the entity count below 100k, all is well, but it gets progressively worse above that. From what I've read, Unity can handle 100k entities before it runs into problems.

    BTW, what tool do you use to track your entity count?
     
  5. I just use the console to track entities. I run AutoPurge at 3 days and also decay twig etc. fast. 120k entities is where my server starts caving. I fixed my fps issue yesterday on my physical server: it's all down to clock speed, not cores. A 3.6GHz server will beat the crap out of a 2.6GHz one, even if the slower one has more cores.
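
The decay and purge setup described above (twig decaying within 24 hours via TwigsDecay, AutoPurge removing bases after a few days of owner and clanmate inactivity) could be sketched as a plugin config like this. The key names here are hypothetical, purely for illustration; check the config file each plugin actually generates for the real options.

```json
{
  "TwigsDecay": {
    "DecayMinutes": 1440
  },
  "AutoPurge": {
    "InactivityDays": 4,
    "IncludeFriendsAndClan": true
  }
}
```

The idea is just that the decay window is in minutes (1440 = 24 hours) and the purge check counts a base as abandoned only when the owner and their friends/clanmates have all been offline for the full period.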
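
On tracking entities from the console: Rust's `serverinfo` command prints a JSON blob that includes the entity count and server framerate. A minimal sketch of watching those numbers, assuming field names like `EntityCount` and `Framerate` (confirm against your own server's output before relying on them):

```python
import json

# Sample payload in the shape `serverinfo` typically prints.
# Field names are assumptions; verify against your own server.
SAMPLE = """{
    "Hostname": "My Rust Server",
    "Players": 55,
    "EntityCount": 170000,
    "Framerate": 180.0
}"""

# Rough threshold where the posters above report stutter setting in.
ENTITY_WARN_THRESHOLD = 100_000

def check_server(payload: str, threshold: int = ENTITY_WARN_THRESHOLD) -> str:
    """Return a one-line health summary from a serverinfo-style JSON payload."""
    info = json.loads(payload)
    entities = info["EntityCount"]
    fps = info["Framerate"]
    status = "OK" if entities <= threshold else "WARN: time to purge"
    return f"{entities} entities @ {fps:.0f}fps -> {status}"

print(check_server(SAMPLE))
```

You could feed this from an RCON client that runs `serverinfo` on a timer, and alert before the entity count climbs into the 120k+ range where servers start caving.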