No JVM error, define EXE4J_JAVA_HOME

I found some references to this error when searching the wiki and Google, but no solutions.

It appears to be a spurious error, since I get it whether or not EXE4J_JAVA_HOME and/or the typical Java environment variables are set. I'm able to run Spark fine from the command line (via startup.jar), though.

Platform is Vista x64 with Sun JDK 1.6.0_07.

I ran into this same error. It appears Spark wants the 32-bit version of the JRE to be installed. Just go to java.com and download the recommended version.
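
To confirm which JVM your command line is actually picking up (and whether it's 32-bit or 64-bit), you can run a quick properties dump like the sketch below. This is just a diagnostic, not part of Spark; the sun.arch.data.model property is specific to Sun/Oracle JVMs like the poster's JDK 1.6.0_07 and prints "32" or "64":

```java
// JvmCheck.java - print which JVM is running and whether it is 32- or 64-bit.
public class JvmCheck {
    public static void main(String[] args) {
        // Location of the JRE actually being used
        System.out.println("java.home           = " + System.getProperty("java.home"));
        // JVM version string, e.g. 1.6.0_07
        System.out.println("java.version        = " + System.getProperty("java.version"));
        // "x86" for a 32-bit JVM, "amd64" for a 64-bit JVM on Windows
        System.out.println("os.arch             = " + System.getProperty("os.arch"));
        // Sun/Oracle JVMs report "32" or "64" here; may be null on other vendors
        System.out.println("sun.arch.data.model = " + System.getProperty("sun.arch.data.model"));
    }
}
```

Compile with javac JvmCheck.java and run with java JvmCheck. If os.arch reports amd64, the JRE on your PATH is the 64-bit one, which would explain why startup.jar runs fine while the exe4j launcher still complains; installing the 32-bit JRE (and pointing EXE4J_JAVA_HOME at it) should satisfy it.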