To limit memory usage, one can add parameters to the JVM, although values that are too low may lead to OutOfMemory errors. With these parameters one can only limit the Java heap and the PermGen size, not the native heap of javaw.exe, which manages the threads, the garbage collector, etc. within the JVM.
See Spark JVM Settings for a general description of how to add Java system properties to Spark; for example, one could add:
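As a sketch of what such properties might look like (the flag names `-Xms`, `-Xmx`, and `-XX:MaxPermSize` are standard HotSpot JVM options, but the values below are illustrative assumptions, not tuned recommendations for Spark):

```
# Illustrative JVM memory flags -- values are assumptions, not recommendations
-Xms16m              # initial Java heap size
-Xmx64m              # maximum Java heap size; too low a value causes OutOfMemoryError
-XX:MaxPermSize=32m  # maximum PermGen size (applies to pre-Java-8 JVMs only)
```

Note that these flags bound only the Java heap and PermGen; the javaw.exe process will still consume additional native memory on top of them.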
But what would be a good value? 64 MB seems like a lot, and I don't think Spark really needs that much. I don't want to compile it, share it on our network, and then find out that in some cases the memory is not enough.