No JVM could be found on your system

Hey there,

I’ve been getting the following error message when attempting to load Spark…

No JVM could be found on your system.

Please define EXE4J_JAVA_HOME

to point to an installed JDK or JRE or download a JRE from

www.java.com

I’m using Windows 7 64-bit…

I’ve downloaded old versions of Spark, new versions of Spark…

I’ve attempted to run as administrator…

I’ve downloaded and installed old versions of Java, and new versions of Java…

Now, I’m not 100% sure how to define EXE4J_JAVA_HOME, so if anyone could explain how, that would be great…

I’m also at the stage where I can’t uninstall Spark because of the same error…

Any help would be greatly appreciated…

thanks

Josh

Do other Java applications run on your system? Did you download the full Spark installer (the bigger one) and not the online one?

You can also try adding that system variable in My Computer > Properties > Advanced > Environment Variables.

Add this to System Variables:

Name: EXE4J_JAVA_HOME

Value: the path to your JRE installation, without a trailing slash
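If you want to sanity-check the variable after setting it, here’s a quick sketch; the class name is just something I made up, but the variable name is the one from the error message and the system properties are standard Java:

```java
// QuickCheck.java - prints the variable from the Spark error message,
// plus the JRE that is actually running this program.
public class QuickCheck {
    public static void main(String[] args) {
        // The variable the launcher asks for in the error message.
        String exe4jHome = System.getenv("EXE4J_JAVA_HOME");
        System.out.println("EXE4J_JAVA_HOME = "
                + (exe4jHome == null ? "<not set>" : exe4jHome));

        // The JRE executing this check (may differ from the one above).
        System.out.println("java.home    = " + System.getProperty("java.home"));
        System.out.println("java.version = " + System.getProperty("java.version"));
    }
}
```

Compile it with javac QuickCheck.java and run it with java QuickCheck from a command prompt opened after you set the variable, since prompts that were already open won’t see it.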

Thanks for the tip…

Downloaded the full installer and it worked like a charm…

I didn’t even realise there were 2 install versions!

thanks again

Josh

I fixed this by just downloading and installing the latest Java JRE from filehippo.com.


This is really not an answer. The full installer uses its own JRE, so it isn’t using the one installed on the system. The problem isn’t really solved if you don’t want to use the pre-packaged JRE.

Thanks church, I specifically registered here so I could “like” this and thank you for a simple fix to a problem that’s been causing me lots of issues since I downloaded the latest Java update, and java.com offered no help.

Filehippo is a nice site (though recently they tend to offer online installers for Flash, Skype, etc., which is not very useful), but I tend to use original sources when I can. Java.com is slow to update, but http://www.oracle.com/technetwork/java/javase/downloads/index.html is quite a good place, also for the JDK if you do Java development.

Just came across this issue myself after using Spark 2.6.3 for ages. It turns out it started when I removed the last remaining version of Java 6 from my machine, leaving only versions of Java 7. Is there something special about Java 6 that Spark 2.6.3 needs? The variable mentioned in the error isn’t set by Java 6 either, so it must be looking for something else…
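In case it helps anyone digging into the same question, here’s a rough sketch for listing the JRE versions registered on a Windows machine. I’m only assuming (I haven’t confirmed it) that the launcher can fall back to the JavaSoft registry key when EXE4J_JAVA_HOME isn’t set, and the class name is made up:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// ListRegisteredJres.java - shells out to the standard Windows "reg query"
// command to show which JRE versions are recorded in the registry.
public class ListRegisteredJres {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder(
                "reg", "query",
                "HKLM\\SOFTWARE\\JavaSoft\\Java Runtime Environment")
                .redirectErrorStream(true) // merge errors into normal output
                .start();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        p.waitFor();
    }
}
```

If the Java 6 entries vanish from that list when you uninstall it, that would at least narrow down where Spark was finding its JVM.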

I never had Java 6 installed on my system and Spark 2.6.3 runs fine, but I always use the offline version with the JRE bundled, because it won’t easily use the system’s Java 7 until 2.7.0 with proper support is released.