Java optimization in Openfire

Hi,

Openfire 4.0.4, Spark 2.8.2 mostly, 290 users, plugins: Broadcast, Monitoring Service, Search.

Memory used by Java is up to 1.4 GB with a 1.5 GB limit! Is this normal? How can I decrease this Java memory consumption?

After upgrading Openfire to 4.1.0 it’s worse :confused: Can anyone help?

Do you actually hit an out-of-memory condition? Memory reporting in Java is complicated, and caching can produce large numbers.

Hi,

in the last few months I’ve had the same problem, and more often, too.

Is there a chance to optimize Openfire’s use of Java memory?

It seems like Openfire will use all the memory it can get, like a memory leak.

best regards

Hi there

You guys might want to try this out…

Go to the installation folder of Openfire: Drive:\Openfire\bin

1.) Go to the bin folder

2.) Create two text files named “openfired.vmoptions” and “openfire-service.vmoptions”

3.) Specify the virtual memory to allocate, for example:

4.) Type in: -Xms512m

5.) Type in: -Xmx1024m

Steps 4 & 5 will depend on your server/PC memory.

I’m using a server with 8 GB of RAM and allotted 1 GB of memory for Java.

We have 114 users.
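The steps above can be sketched from a Unix-style shell as follows. This is a minimal sketch: the heap values match the example above, and the working directory is assumed to already be Openfire’s bin folder.

```shell
# Create both .vmoptions files, each containing a 512 MB initial
# heap (-Xms) and a 1 GB maximum heap (-Xmx), one option per line.
for f in openfired.vmoptions openfire-service.vmoptions; do
  printf '%s\n' -Xms512m -Xmx1024m > "$f"
done

# Show the result for one of the files.
cat openfired.vmoptions
```

On Windows you would create the same two files with any text editor; only the file contents matter.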

Thanks for this advice,

at the moment this worked for me.

The options have to be written without the “Assign Minimum / Assign Maximum” labels; just write in the files:

-Xms512m

-Xmx1024m

otherwise the Openfire server won’t start.

Hi…

RE: The options have to be written without “Assign Minimum / Assign Maximum” just wrote in the files:

That is correct, sir. The “assign maximum” label is not included in the file.

Edited my post.

Unfortunately, this is just a solution for increasing the Java memory, not for my problem.

As I expected, Openfire now uses 4 GB of memory; it takes all the memory it gets, and the CPU load runs up to 100%, too.

It seems the cause is not insufficient memory.

In the logs I often find messages like:

2017.08.04 07:59:07 org.jivesoftware.openfire.pubsub.PubSubPersistenceManager - ConnectionManager.getConnection() failed to obtain a connection after 11 retries. The exception from the last attempt is as follows: java.sql.SQLException: Couldn’t get connection because we are at maximum connection count (25/25) and there are none available

Is this problem familiar to anyone?

best regards

Hi

Yes, we had that problem previously when we were still using a lower-spec server [1 GB RAM / Intel Xeon].

The memory was insufficient at that time, so we decided to move Openfire to a much higher-spec machine.

Right now, the server we’re using has 8 GB of RAM with the same processor.

If you’re still having a memory-leak problem even on a server with more memory, I think there must be some application or program running in the background that eats a lot of resources, rather than Java.

I’m not sure, but my understanding is that the more users are connected, the more memory Openfire requires.

Since then, we haven’t had these memory-leak problems.

Let’s wait for someone who has resolved this issue.

Please note that Openfire is a Java application. Java applications (like Openfire) have a tendency to reserve a considerable amount of memory from the operating system, even if it is not being used. This is a performance optimization (which is highly configurable). Generic Java memory configuration applies, as you can find in many places on this forum, or even in this thread. I strongly recommend that you do not apply these unless you’re running into functional issues (for example: very high CPU load, disconnecting users, or logged errors that relate to memory unavailability). High memory usage in and of itself is not an issue.
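To illustrate the difference between memory the JVM has reserved and memory the application actually uses, a small standalone Java snippet (not part of Openfire) can query the JVM directly:

```java
public class HeapStats {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();      // the -Xmx ceiling the JVM may grow to
        long total = rt.totalMemory();  // heap currently reserved from the OS
        long free = rt.freeMemory();    // reserved but currently unused
        long used = total - free;       // what the application really occupies

        // OS tools (Task Manager, top) report roughly "total" plus JVM
        // overhead, which can be far larger than "used".
        long mb = 1024 * 1024;
        System.out.printf("max=%dMB total=%dMB used=%dMB%n",
                max / mb, total / mb, used / mb);
    }
}
```

This is why a process-level figure like “1.4 GB” does not by itself indicate a leak: most of it may be reserved-but-free heap.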


The exception that you’re mentioning relates to the database: Openfire is unable to connect to it. Common causes are misconfiguration of credentials, or network issues.
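If the pool itself is exhausted (25/25, as in the log above), the connection pool size is set in conf/openfire.xml. A sketch of the relevant fragment follows; the element names reflect Openfire’s standard database configuration, but the values (and your driver/URL) are placeholders to verify against your own install:

```xml
<!-- Fragment of conf/openfire.xml (inside the <jive> root element).
     Values shown are illustrative; 25 is the limit seen in the log. -->
<database>
  <defaultProvider>
    <driver>com.mysql.jdbc.Driver</driver>
    <serverURL>jdbc:mysql://localhost:3306/openfire</serverURL>
    <username>openfire_user</username>
    <password>change_me</password>
    <minConnections>5</minConnections>
    <maxConnections>25</maxConnections>
    <connectionTimeout>1.0</connectionTimeout>
  </defaultProvider>
</database>
```

Raising maxConnections only helps if the database is healthy; if connections are being held because of credential or network failures, a larger pool just delays the same error.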
