Java 7 Error in online Installer?

The full version (non-online) works fine under Windows 7 64-bit, so is it something to do with the fact that you're not including Java?

I am going to do a couple more test installations, but yes, it seems that once you log in the first time with an elevated user it works now. I have a couple more machines I want to test (different configurations); I will let you know if this takes care of the install/operations issues. Thanks SO much!

Okay, some help pointing me to the source and the tools needed to compile this would be helpful :wink:

2 out of 2 so far, so here is what I had to do, like you said Wroot. Provided you have the 32-bit Java runtime of at least 1.7+, you can run the online installer from the most recent build here: http://bamboo.igniterealtime.org/browse/SPARK-INSTALL4J-642/artifact

If you're running Windows 7+ and installing from an unelevated user, you will have to run the installer with elevated rights (Run as administrator). Install and allow Spark to START, including logging in and connecting to your server, then exit/log out. Now all unelevated users should be able to log in and use Spark with existing versions of the Java runtime installed. This should also prevent the embedded/included Java from clobbering applications that use the installed version of Java. Of course, if there are memory leaks and bugs in the code there is NO way to prevent it from causing further issues, but that is the way the game is played :wink:
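If it helps with troubleshooting, here is a minimal Java sketch (not Spark code; the install path below is just an assumed 32-bit default) that checks whether the current user can actually write to the Spark installation folder, which is usually the difference between an elevated and an unelevated run:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class InstallDirWriteCheck {
    public static void main(String[] args) {
        // Assumed default install location; adjust for your machine.
        Path installDir = Paths.get("C:\\Program Files (x86)\\Spark");
        System.out.println("Exists:   " + Files.exists(installDir));
        // A limited (unelevated) user will typically see "false" here, which is
        // consistent with the installer and the first launch needing elevated rights.
        System.out.println("Writable: " + Files.isWritable(installDir));
    }
}
```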

One other note: I have found by experience that if you load/run both the 32-bit and 64-bit Java RE on Windows 7+ (64-bit), be sure you run the same build versions, don't mix them! I have found that mixing them creates all kinds of odd issues.
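If you want to verify exactly which JRE (and which bitness) is actually launching a given Java app, a quick sketch like this prints the relevant system properties; nothing here is Spark-specific:

```java
public class JreBitnessInfo {
    public static void main(String[] args) {
        // Report which JRE launched this process and whether it is 32- or 64-bit,
        // so mixed 32/64-bit installs of different builds are easy to spot.
        System.out.println("java.version        = " + System.getProperty("java.version"));
        System.out.println("java.home           = " + System.getProperty("java.home"));
        System.out.println("os.arch             = " + System.getProperty("os.arch"));
        // Oracle/Sun JREs expose the data model ("32" or "64"); may be null elsewhere.
        System.out.println("sun.arch.data.model = " + System.getProperty("sun.arch.data.model"));
    }
}
```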

found this, prob what I need thx: http://community.igniterealtime.org/docs/DOC-1040

Okay, what's the trick? What is the connection type, pserver? What do you use for login and password?

Are you talking about SVN? There is no login or password.

So, by comparing i4jparams.conf in the installation folders, I see only one change which may have caused the installer to not be recognized as a Windows program installer by the OS:

633 build, beginning of the file:

<?xml version="1.0" encoding="UTF-8"?>

642 build:

<?xml version="1.0" encoding="UTF-8"?>

Both builds should be using the same 5.1.9 version of install4j, but the config has probably changed after the Java min/max settings were changed. Not sure if it can be edited.

Well, it is probably not a big deal. Users will have to learn that they now have to run the Spark installer via Run as administrator explicitly to be able to install in a limited-rights environment. Installing with a user which has administrative rights will be the same as before.

Well, the SVN client I'm using insists it needs something to save the connection, and when I try to connect it's asking me for a password. Got a URL/link that works?

(:extssh:anonymous@bamboo.igniterealtime.org:/browse/SPARK)

Make sure you are connecting to the http:// website and not https:// for anonymous access.


Yeah, I'm getting nowhere fast. I'm using Eclipse and the Subversion add-on.

Daryl, you are not able to edit the install4j params in Bamboo? Then we will have to live with that. I'm talking about that UAC shield.

But this is a minor thing. Today I decided to check another thing, and it seems changing the Java min/max also affects how Spark works. It won't even launch now if Java 7 is not present on the system or in the built-in jre folder. This also means that if you install Spark with Java 7 bundled and still have Java 6 on the system, Spark won't use that older Java 6 like it did before. Which is GREAT. This way many Java 6 related bugs will go away even if you still have to keep the old Java on your system. I've only been able to test the system tray icon looking crappy in this situation (with the new installer it looks ok). I will test more next week (the focus stealing issue) and will close the SPARK-1545 ticket for now, unless someone finds more hooks in the code hardcoded to use Java 6 (there probably are some, using old methods, but I think the main task is completed).
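For anyone curious what a "minimum Java 1.7" requirement boils down to at runtime, here is a rough illustrative check; this is not the actual install4j launcher logic, just a sketch of the version comparison it effectively performs:

```java
public class MinJavaCheck {
    public static void main(String[] args) {
        // e.g. "1.7.0_25" or "1.6.0_45"
        String version = System.getProperty("java.version");
        String[] parts = version.split("\\.");
        int major = Integer.parseInt(parts[0]);
        // Strip any trailing non-digit suffix from the minor component before parsing.
        int minor = parts.length > 1 ? Integer.parseInt(parts[1].replaceAll("\\D.*$", "")) : 0;
        boolean atLeast17 = major > 1 || (major == 1 && minor >= 7);
        System.out.println("Running " + version + ", meets 1.7 minimum: " + atLeast17);
    }
}
```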

Thanks to Carl for pushing this and to Daryl for making the changes to Bamboo. Looks like it was easier than I expected.

wroot wrote:

Daryl, you are not able to edit the install4j params in Bamboo? Then we will have to live with that. I'm talking about that UAC shield.

I am unsure of what you are asking me. What do you wish for me to change?

I'm unsure myself. In message 50 I posted that there is an additional line in the i4jparams.conf file in older installations:

I'm thinking maybe this is what makes Windows recognize the Spark installer as an installer. But I don't know where to put it so that it will end up in the new installation's i4jparams.conf file…

The installation “troubles” above look normal. @Carl, are you by any chance installing this on a domain workstation?

Regarding the JRE versions, I believe these are controlled via the spark.install4j file, which should now be updated for Java 7. Spark shouldn't be using your system's installed JRE if the ‘jre’ directory is present under the Spark installation directory (and provided you installed via the Offline version with the JRE bundled; I believe the install4j script may be different if you install the Online version, and I've not looked at it).
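As a quick way to confirm which JRE a running Spark (or any Java process) actually ended up on, something like the sketch below compares java.home against the bundled jre folder. The install path is an assumed default and this is a diagnostic sketch, not anything from the Spark sources:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class BundledJreCheck {
    public static void main(String[] args) {
        // Assumed install location; adjust to where Spark is installed.
        Path sparkHome = Paths.get("C:\\Program Files (x86)\\Spark");
        Path bundledJre = sparkHome.resolve("jre");
        Path runningJre = Paths.get(System.getProperty("java.home"));

        System.out.println("Bundled jre present: " + Files.isDirectory(bundledJre));
        System.out.println("Running from:        " + runningJre);
        // If the running JRE is not under Spark\jre, the launcher picked a
        // system-wide JRE instead of the bundled one.
        System.out.println("Using bundled jre:   " + runningJre.startsWith(bundledJre));
    }
}
```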

Also, running multiple JRE versions on your system at the same time shouldn't affect Spark. I do this on a few of my workstations. This works because Spark should use the JRE in the jre directory of its Spark installation directory.

I don't believe I've tested Spark with the 64-bit JRE, so I'm unsure if it works or not. I know Install4j itself is picky and did not agree with the 64-bit JRE when I tried to upgrade its bundled version, but it happily took the 32-bit JRE.

Most of this was discovered prepping this thread: http://community.igniterealtime.org/message/234019#234019

Jason wrote:

This works because Spark should use the JRE in the jre directory of its Spark installation directory.

It should, but it doesn't. I think we discovered this in some thread about the focus stealing bug, where it was still happening to someone while using the latest SVN version with Java 7 bundled. If you have Java 6 on your system, Spark will use it instead of the jre folder inside its installation folder, at least with the older installers. We have this issue at work on some machines where we have to use old Java 6 versions: the system tray icon looks bad and the focus stealing bug is present. Starting with the 642 build installers this shouldn't be an issue anymore. It looks like Spark depends on the install4j config no matter which installer type was used (offline or online) and searches for the min/max Java on the system.


That is what I have found, Wroot; all is good so far now. There is a little bit of hesitation/latency now when scrolling through names in the contact list, and I am not sure of the cause. Also, by default ALL the plugins are enabled; I have to disable them, as we do not want some of those running. It would also be nice to get rid of the extra “skins”…

It's still happening? I thought it was fixed when we changed the install4j file… Which install4j version are we using? I'm testing with 5.1.7… I think 5.1.8 is the latest.

In any event, it's definitely an install4j problem, since the launcher it creates is responsible for setting up the classpath, etc. Perhaps there is a setting for “only use the bundled jre, damnit!”?

I've not been following lately… has the About Box patch gone in yet? The latest patch I posted showed in the about box which JRE Spark was actively using… that may be helpful in troubleshooting.

Jason wrote:

It's still happening? I thought it was fixed when we changed the install4j file… Which install4j version are we using? I'm testing with 5.1.7… I think 5.1.8 is the latest.

5.1.9