We were running v2.7.7 just fine. Since moving to the 2.8.x versions we have the same problem, even on the newest nightly build.
Windows 7 x64, and this has been reproduced on many workstations. Whether we choose Exit from the system tray icon or from the File menu, we get the same result.
The program does not exit; instead it logs out the user and displays the message “Your connection was closed due to an error”.
This actually prompted me to investigate why Spark was freezing on one of my test machines (Win 7). It turned out to be caused by an old, unsupported plugin. It may be unlikely, but perhaps you also have the Whiteboard plugin installed, or the problem is caused by some other plugin. My logs showed this:
Oct 04, 2016 8:00:05 PM org.jivesoftware.spark.util.log.Log error
Indeed, it was the plugins causing the issue. I had to remove: whiteboard, editor, reversi, and tic-tac-toe. The first two were actually used in-house for collaboration, and the last two for occasionally breaking the monotony.
1. Any chance of these being made “compatible”?
2. Shouldn’t this be flagged as an easy bug to fix in a forthcoming update? Incompatible plugins should either show a descriptive message including the plugin name, or be bypassed so the client continues normally.
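The bypass behaviour suggested in point 2 could be sketched as a loader that initializes each plugin in isolation and skips any that fail. This is only a minimal illustration with a hypothetical `Plugin` interface and `SafePluginLoader` class; Spark’s real plugin API (`org.jivesoftware.spark.plugin.Plugin` and its `PluginManager`) differs.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical plugin interface for illustration; Spark's actual API differs.
interface Plugin {
    String name();
    void initialize();
}

public class SafePluginLoader {
    // Initialize each plugin in its own try/catch so one broken plugin
    // cannot take down the whole client. Returns the names of skipped plugins
    // so the UI could show a descriptive message per plugin.
    static List<String> loadAll(List<Plugin> plugins) {
        List<String> skipped = new ArrayList<>();
        for (Plugin p : plugins) {
            try {
                p.initialize();
            } catch (Throwable t) { // Throwable also catches NoSuchMethodError
                                    // thrown by plugins built against old Smack APIs
                skipped.add(p.name());
                System.err.println("Skipping incompatible plugin '" + p.name() + "': " + t);
            }
        }
        return skipped;
    }

    public static void main(String[] args) {
        List<Plugin> plugins = List.of(
            new Plugin() {
                public String name() { return "reversi"; }
                public void initialize() { /* loads fine */ }
            },
            new Plugin() {
                public String name() { return "whiteboard"; }
                public void initialize() { throw new NoSuchMethodError("old Smack API"); }
            }
        );
        System.out.println("Skipped: " + loadAll(plugins));
    }
}
```

Catching `Throwable` rather than `Exception` matters here: plugins compiled against a pre-2.8.0 Smack typically fail with linkage `Error`s, not exceptions, and those would otherwise escape a plain `catch (Exception e)`.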
There is always room for improvement, especially in Spark’s code… I doubt the Whiteboard or Editor plugins will be made compatible. They were third-party plugins, and I think their developer left this business long ago: http://www.version2software.com/
As for Reversi and tic-tac-toe, they were also provided by contributors who no longer participate in the Spark project. But at least the source code for these is available, so they can be fixed. All of these plugins probably stopped working after the Smack library was updated to the latest version in 2.8.0.
Can’t comment on the plugins themselves. It seems they are not available for download in Spark, and I’m not sure whether this is Spark’s issue or the website’s.
Yep, the translator, tic-tac-toe, battleships, and reversi plugins are commented out in Spark’s build.xml. They were probably causing build problems because of their outdated code and the switch to the newest Java.