Running Spark 2.7.1 in a Windows 2008 R2 Citrix environment. Sometimes (seemingly at random; the users claim they have not done anything unusual with Spark) spark.exe will start using 100% of its available CPU and create a huge errors.log file in %AppData%\Spark\logs. Spark keeps writing to that log until the disk is full, and it will not stop until the .exe is manually terminated.
This appears to be a bug, but I doubt I've provided enough information for anyone to trace its origin. I have to delete the errors.log file to free up disk space, so I do not have a sample to attach (although I could keep some the next time it happens).
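Since the log has to be deleted before it fills the disk, something like the sketch below could save the first few hundred lines as a sample before removing the file. The paths and the 200-line cap are my assumptions, not anything Spark itself provides:

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Hedged sketch: keep the first N lines of a runaway errors.log as a
// sample for a bug report, then delete the file to free the disk.
public class LogSampler {
    // Read at most maxLines lines from the given log file.
    static List<String> sample(Path log, int maxLines) throws IOException {
        try (Stream<String> lines = Files.lines(log)) {
            return lines.limit(maxLines).collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // Assumed locations; adjust for your profile and environment.
        Path log = Paths.get(System.getenv("APPDATA"), "Spark", "logs", "errors.log");
        Path out = Paths.get(System.getProperty("user.home"), "errors-sample.log");
        Files.write(out, sample(log, 200)); // save a sample first
        Files.delete(log);                  // then free the disk
    }
}
```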
I experienced the same thing with some of my 2.7.1 clients. I never found a reason why it occurred, and it only affected some of the clients. However, after upgrading them to 2.7.2 and 2.7.3, I have not had that issue anymore.
Jarred, are you also using it in a virtual environment like David? Other than the move from Java 7 to Java 8 in the 2.7.2 version, I can't think of anything that could solve such an issue.
Logs would be helpful, of course, but I doubt they would show much, especially since we don't have experienced Java developers here.
Not with the clients that were experiencing that issue; they were Windows 7 Professional clients. I apologize, I do not recall which version of Java was being run at the time.
If you are using the full installer, then Spark has its own Java built in. 2.7.1 bundled the latest version of Java 7; starting with 2.7.2 it is bundled with the latest versions of Java 8.
The Java version being used by Spark can be checked in the About window.
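For checking many Citrix sessions without opening the About window each time, a tiny class like this prints the standard version properties of whichever JVM executes it; running it with Spark's bundled java.exe (somewhere under the Spark install directory, exact path varies) would reveal the bundled version. This is just an illustrative sketch, not anything Spark ships:

```java
// Minimal sketch: print the version of whichever JVM runs this class.
public class JvmVersion {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
    }
}
```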