New IM hijacks keyboard away from other apps

Not sure if there is a fix for this. A user mentioned that when they are typing in another app and they receive a new IM, they lose the ability to type in the app they were in. The IM is not brought to the front; the icon just flashes at the bottom of the screen. It is as if the new IM steals keyboard focus from the window they were in, because they then have to use the mouse to click back into the application to resume typing.

I also tested this with Notepad: I was just typing, and when I received a new IM I lost all keyboard input in Notepad until I clicked back into it.

Have any suggestions? Using Spark 2.6.0 and OpenFire 3.7.0.

Use Spark 2.6.3 and install Java 7.

This is a really old Java issue, and it is fixed in Java 7.

Updated both, but it doesn’t seem to fix the issue.

I tried 2.6.3 with Java 1.7.0-ea and it worked fine.

1.7.0? I might have the wrong thing, then. I downloaded the new Java 7 that I think was released yesterday. Do you have a link to the proper download?

http://jdk7.java.net/download.html

The official release is on July 28th.

Hmmm… That is what I used. Will test it out on another machine.

Edit: No go. Installed Spark 2.6.3 and Java 7 on two machines and rebooted, but it still happens. Both are Win7 x64 machines.

He probably also has to remove the jre folder inside the Spark installation directory, or Spark will still use its bundled Java, which is still Java 6, I suppose.

Yes, Spark mustn’t use the bundled JRE, which is still 1.6.0_18.

You can check which JRE Spark is running by typing

%{java.version}/%

as the resource or as the away status.
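You can also check the bundled JRE directly from the command line: running the `java` executable inside Spark’s jre folder with `-version` prints the version string (note that `java -version` writes to stderr, not stdout). A minimal sketch in Python — the install path in the example is an assumption for a typical Windows setup, and the helper names are mine:

```python
import re
import subprocess

def parse_java_version(output):
    """Extract the quoted version (e.g. '1.6.0_18') from `java -version` output."""
    match = re.search(r'java version "([^"]+)"', output)
    return match.group(1) if match else None

def java_version(java_exe):
    """Run `java -version` and return the version string.

    `java -version` prints to stderr, so we read stderr here.
    """
    result = subprocess.run([java_exe, "-version"], capture_output=True, text=True)
    return parse_java_version(result.stderr)

# Example (path is an assumption for a default Windows install):
#   java_version(r"C:\Program Files\Spark\jre\bin\java.exe")
```

If this returns something starting with `1.6`, Spark is still using its bundled Java 6.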

OK, got it. I had to use a 32-bit version of Java 7. I uninstalled all previous versions of Java and replaced Spark’s original jre folder with a copy of the 32-bit Java 7 version. (Just renaming the folder caused Spark to fail to open.) So far everything looks good now. Thanks!

I was using a Win7 machine. Now I’m trying an XP machine. When I try to use the jre7 folder as the jre, it gives this error when opening the program:

An error has occurred during startup:

java.lang.UnsupportedClassVersionError: Bad version number in .class file

Any clues?

How can you tell what Java version Spark is using? I didn’t understand the explanations above on how to do that.

The full install of 2.6.3 uses Java 6. If you are installing the online version, it uses the JVM installed on your PC.

2.6.4 will use Java 7; it’s scheduled for Q4/11.

Can we get an early update that bundles Java 7?

Currently you have to install Java 7, remove all previous versions of Java, copy the JRE7 folder into the Spark directory, rename the old jre folder to something else, and rename jre7 to jre.
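For anyone rolling this out to many machines, the folder-swap steps above could be scripted. A minimal sketch in Python — the Spark and JRE paths in the example are assumptions for a typical Windows install, and the backup folder name is my own choice:

```python
import shutil
from pathlib import Path

def swap_spark_jre(spark_dir, jre7_dir):
    """Replace Spark's bundled jre folder with a copy of a Java 7 JRE.

    Mirrors the manual steps: move the old `jre` out of the way,
    then copy the Java 7 JRE in as the new `jre`.
    """
    spark = Path(spark_dir)
    old_jre = spark / "jre"
    backup = spark / "jre-java6-backup"  # name is an assumption
    if old_jre.exists() and not backup.exists():
        old_jre.rename(backup)           # keep the old Java 6 JRE as a backup
    shutil.copytree(jre7_dir, old_jre)   # install Java 7 as the new jre

# Example (paths are assumptions for a default Windows install):
#   swap_spark_jre(r"C:\Program Files\Spark", r"C:\Program Files\Java\jre7")
```

Note that, as mentioned below, a Spark update can reinstall the bundled Java 6 JRE, so the swap may need to be reapplied after updates.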

And then a user comes along and clicks “update” or something… which reinstalls Java 6 and Spark is broken again…

This is too big a problem to let go another couple of months, guys… seriously… it’s the MAIN problem with Spark that I have seen. Please bundle it with Java 7. Please.

Wouldn’t it be easier if Spark came bundled with Java 7? You guys are losing subscribers and users… please release an updated Spark with Java 7 BEFORE Q4… that’s too far away…

Well, we may be able to provide an experimental Java 7 bundle, but Java 7 is brand new.

I would also like to note that this project is run by volunteers, and we need to update our build environment to get Java 7 up and running. That’s no small task, and it requires some experience. I’ll try to manage it, but overall, more dedicated and experienced programmers would be nice. Spark is no simple codebase…

@Walter – That would be great! I know I’m not the only person who would be jumping for joy if you guys were able to release a Java 7 version. I’m no programmer, but I wouldn’t think it would be too much trouble to package them together, since the current release of Spark installs its own Java 6 bundle…

An “experimental” version would be sufficient… right now it’s just too big of a hassle to have to manually install Java 7 and copy the directory over to essentially “patch” Spark. And then the user clicks the “update” button and it’s back to Java 6 again…

Maybe even a “patch” release that we could install over Spark to have it automatically apply Java 7?

I’m just looking at this from an enterprise perspective… I have about 50 workstations that I have to get this working on… and if I have to babysit each workstation… well, that’s just a no-go…

As stated in other posts, I have no other problems with Spark at all… I think it’s an amazing product and you guys are doing a great job… just get this damn Java bug fixed, please!

Any word on this yet, Walter? The community is waiting with bated breath! Please release some sort of package with Java 7 bundled… it can be “experimental” or whatever you want to call it… just please get it out there.