Explanation: scalability, system requirements

As this question pops up on the forums fairly often, I have decided to compile a simple document explaining the issue and collecting links to various tests made by users (which are otherwise scattered across separate threads and other places on the internet).

The "no definitive answer" problem:

Users often ask in the forums what hardware they should use or how well Openfire will scale. There is no definitive, accurate answer, because it depends on hundreds of factors: the number of registered users, usage patterns, the number of users online at any given time, the operating system and its limitations, the Java version, and so on. We cannot say that 10 users will need 10 MB of RAM, because one server may handle 100 users with that while another will struggle with a single user (if, say, that one user sends huge messages every 10 seconds). There is also no formula for calculating resource usage from usage patterns; nobody knows exactly how much CPU or RAM a message or a login costs. So all we can do is share tests made by other users (mostly synthetic) and describe our own setups (for what it's worth), and you will have to run your own tests or draw conclusions from those of others.

Synthetic tests:

(feel free to add new links here; they can also be from other places on the web)

Clustering

Openfire has a clustering solution (the Hazelcast plugin), which may allow a deployment to support a higher number of concurrent users.
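For context, the clustering plugin is deployed like any other Openfire plugin. A rough sketch, assuming a Linux install under /opt/openfire (the path and the admin-console menu layout may differ in your installation):

```
# Copy the Hazelcast plugin jar into the plugins directory of every
# node in the cluster; Openfire hot-deploys jars from this folder.
cp hazelcast.jar /opt/openfire/plugins/
# Then enable clustering from the admin console on each node.
```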

Real setups:

@wroot - my production server with ~150 online users (Windows Server R2 x64 with 2 GB of RAM and 1 virtual CPU on a Xeon E5-2620 Hyper-V host) has shown 1-4% CPU usage over a three-month span. Memory usage is between 1 and 1.5 GB, averaging around 1200 MB. I have set Openfire's JVM heap to 512-1024 MB, though. It would probably use less by default, but at some point I started to have JVM memory exhaustion problems, so I tweaked the JVM's memory settings.
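The heap tweak mentioned above is done through standard JVM startup options. A minimal sketch, assuming an Install4j-style launcher that reads a .vmoptions file next to the Openfire executable (the exact file name and location vary by platform and install method):

```
# openfire.vmoptions (hypothetical file name; on Linux the same flags
# can instead go into the startup script's JVM arguments)
-Xms512m
-Xmx1024m
```

-Xms sets the initial heap size and -Xmx the maximum, matching the 512-1024 MB range described above.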

(feel free to share your server’s setup and average load)
