
    How to use different log4j version in plugin

    Antony Bronze

      My plugin has a dependency on a jar file that depends on log4j 1.3.  Even if I include the log4j 1.3 jar file in my plugin/lib directory, I still get an undefined method error for Logger.isTraceEnabled.

       

      Is there any way to separate out a plugin so that its classloader will find classes locally before delegating upwards?  I think it uses the JiveClassLoader, but that just tags the jar/zip files from the lib directory onto the existing path and does not allow the plugin classes to override those in the core lib directory.
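
      For illustration only (this is not Spark's JiveClassLoader or PluginClassLoader, just a sketch of the idea): a "parent-last" loader built on a plain URLClassLoader, which tries the plugin's own jars before delegating upwards, could look roughly like this.  Core java.* classes are still delegated to the parent.

      import java.net.URL;
      import java.net.URLClassLoader;

      // Illustrative sketch only - not Spark code. Searches its own URLs
      // (e.g. the plugin's lib jars) before delegating to the parent loader.
      public class ParentLastClassLoader extends URLClassLoader {

          public ParentLastClassLoader(URL[] urls, ClassLoader parent) {
              super(urls, parent);
          }

          @Override
          protected synchronized Class<?> loadClass(String name, boolean resolve)
                  throws ClassNotFoundException {
              // Never try to override core classes.
              if (name.startsWith("java.")) {
                  return super.loadClass(name, resolve);
              }
              Class<?> c = findLoadedClass(name);
              if (c == null) {
                  try {
                      // Look in this loader's own URLs first.
                      c = findClass(name);
                  } catch (ClassNotFoundException notLocal) {
                      // Fall back to the usual parent-first delegation.
                      c = super.loadClass(name, resolve);
                  }
              }
              if (resolve) {
                  resolveClass(c);
              }
              return c;
          }
      }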

        • How to use different log4j version in plugin
          Antony Bronze

          Plugin classloading seems to be a bit of a problem, as all the jar files from all lib directories for all plugins get stuffed onto the same PluginClassLoader.  I'm having problems with XStream as well as log4j.  I found a temporary solution for log4j: copy the 1.3 jar into the Spark/lib directory and remove the original.  Unfortunately, XStream does not work if I use the same logic.

           

          Are there any plans to provide plugin class loaders per plugin, so that each plugin can run in its own class space without conflict with any other plugin or the Spark core?  It should follow the same logic as a typical servlet spec implementation, e.g. Tomcat's WebappClassLoader.

           

          I've tried to play around with the PluginClassLoader, without success

            • How to use different log4j version in plugin
              Walter Ebeling KeyContributor

              There is a feature issue related to plugins: http://issues.igniterealtime.org/browse/SPARK-1311. You are welcome to provide some patches to Spark and engage in this with Mircea.

              • How to use different log4j version in plugin
                Mircea Carasel Bronze

                Antony,

                 

                IMO, we should keep Spark plugins as decoupled from the container (Spark, in our case) as possible.  If you want to load some jars from your plugin only for your plugin, then the plugin should create its own class loader, not Spark.

                 

                There may be another way you can try in order to load your own XStream or log4j for your plugin, without making changes to the Spark code.

                 

                Your findings are correct: all jars that are found inside the plugin's  /lib directory are loaded in the PluginClassLoader defined in Spark (PluginManager.java -> private PluginClassLoader classLoader;)

                So basically this class loader is more like a generic class loader that loads all plugins into spark.

                All other jars that may reside inside one plugin and are used only by the plugin, should be loaded by the plugin itself using its own class loader instance.

                 

                I think the SparkPlug API can be extended to make this easier.

                 

                What I think you can do is the following:

                 

                1. create your own class loader instance inside your Plugin

                2. put the jars that you need to use (log4j, XStream, etc.) in a different directory than lib, let's say a privatelib directory. Since those jars are not in the plugin's lib directory, they won't be loaded into the generic plugin class loader created by Spark (PluginManager.java -> private PluginClassLoader classLoader;)

                3. during plugin creation, for instance in the constructor of your org.jivesoftware.spark.plugin.Plugin implementation, load all jars from the privatelib directory into your dedicated plugin class loader instance.
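
                A minimal sketch of steps 1-3, assuming a privatelib directory alongside the plugin's lib directory (the directory name and the PrivateLibLoader helper are illustrative, not part of the SparkPlug API):

                import java.io.File;
                import java.net.URL;
                import java.net.URLClassLoader;
                import java.util.ArrayList;
                import java.util.List;

                // Illustrative: collect privatelib/*.jar into a dedicated class loader
                // whose parent is the loader that loaded the plugin class itself.
                public final class PrivateLibLoader {

                    public static ClassLoader create(File pluginDir, ClassLoader parent) throws Exception {
                        List<URL> urls = new ArrayList<URL>();
                        File[] jars = new File(pluginDir, "privatelib").listFiles();
                        if (jars != null) {
                            for (File jar : jars) {
                                if (jar.getName().endsWith(".jar")) {
                                    urls.add(jar.toURI().toURL());
                                }
                            }
                        }
                        return new URLClassLoader(urls.toArray(new URL[urls.size()]), parent);
                    }
                }

                The plugin's constructor could then call something like PrivateLibLoader.create(pluginDir, getClass().getClassLoader()) and load the log4j/XStream-dependent classes reflectively through the returned loader.  Note that classes referenced directly from the plugin's own code are still resolved by the loader that loaded the plugin class itself, which is essentially the limitation Antony raises below.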

                 

                Hope this helps,

                Mircea

                  • Re: How to use different log4j version in plugin
                    Antony Bronze

                    Hi Mircea,

                     

                    Writing a Plugin should be a very simple task and one Plugin writer should not need to care about other classes used by other plugins or the Spark container or know how to write custom class loaders.  Requiring a Plugin writer to take on the intricacies of writing a custom ClassLoader makes writing Plugins too complex.

                     

                    I expect many Java developers do not fully understand how classes get loaded by which ClassLoader (me included - especially if security is enabled).  Part of the problem is that there is no way for the Plugin developer to know whether he needs to write his own ClassLoader because he may not know the deployment environment.  I only found out because I got NoSuchMethod errors when trying to run my Plugin.

                     

                    My knowledge about Class loading is a bit thin, but my understanding is that when an object needs to load a class it will be loaded using the ClassLoader found by this.getClass().getClassLoader() from the object requiring the class.  So as long as the Plugin instance is created by a new class loader created specifically for that Plugin, it should not need to do the following (which I said in my original PM)

                     

                        pluginClassLoader = classLoaders.get(plugin.getClass().getName());
                        // Swap the thread context class loader around the plugin call...
                        ClassLoader saveLoader = Thread.currentThread().getContextClassLoader();
                        Thread.currentThread().setContextClassLoader(pluginClassLoader);
                        try {
                            plugin.doAction();
                        } finally {
                            // ...and always restore the previous loader afterwards.
                            Thread.currentThread().setContextClassLoader(saveLoader);
                        }
                    

                     

                    but I'm happy to be corrected.  So, I think the relatively small change to PluginManager.loadPublicPlugins below works when creating the newInstance of the class.  It does so using a new (modified) PluginClassLoader which must NOT delegate to the system classLoader before attempting to load the class.  I don't think that is such good behaviour, though, as it allows Plugins to override the core Java classes; but if it does call the system class loader first, it will then get classes that have already been loaded from the Spark core, e.g. log4j etc.

                     

                        PluginClassLoader cl = new PluginClassLoader(getParentClassLoader(), pluginDir);
                        cl.addPlugin(pluginDir);
                        //  Has core lib directory after - like 'shared' plugin library
                        cl.addRootURL(PLUGINS_DIRECTORY.toURL());
                        pluginClass = (Plugin) cl.loadClass(clazz).newInstance();
                        publicPlugin.setClassLoader(cl);
                    

                     

                    There may be no need to keep a map of the class loaders in PluginManager, but I guess it's no big deal.

                     

                    I don't see any point in the current PluginClassLoader usage - in fact it's a really bad thing to lump all the plugin lib directories together, and it can lead to real support issues, as Plugins may do random things depending on which jars are installed with which Plugins and whether one plugin's URLs are pushed onto the class loader path before another's.

                     

                    I don't see the point in having lib and privateLib directories.  What is the purpose of jar files in lib?  Why would I want jar files I put in the lib directory to be visible to other Plugins, and vice versa?  It's the same argument as for web container webapp separation.

                     

                    I understand the desire to keep the Spark container simple, but the Container ought to provide the framework for class separation between Plugins, so I think the following could be done:

                     

                    1. PluginManager creates a new PluginClassLoader before creating the Plugin class instance

                    2. PublicPlugin has the class loader instance (if it wants it)

                    3. PluginClassLoader is populated with URLs in this order

                    a) plugins/pluginName/classes - directory

                    b) plugins/pluginName/lib/*.jar

                    c) plugins/classes

                    d) plugins/*.jar

                     

                    That allows the plugin to use its own private classes and libs, and allows the plugins directory to act as a shared class repository usable by all plugins (both jar files and individual classes).

                     

                    There are various discussions about parent-last class loading, which avoids delegating to the parent first, but I am not sure whether that is good practice.  It may be possible to have a Spark Container ClassLoader which loads the classes in the Container and is separate from the system class loader.  That way, when loading the plugins, the PluginManager could take the system class loader and use that as the parent when creating each PluginClassLoader.  The PluginClassLoader could then delegate to its parent first, because it would bypass the Container's ClassLoader and so would not find the Spark Container classes.
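
                    A hedged sketch of that last idea (the names below are invented for illustration; Spark's real launcher and JiveClassLoader may work differently): if the Spark core jars lived in their own loader rather than on the application classpath, plugin loaders could be parented directly on the system class loader and keep normal parent-first delegation without ever seeing the core's log4j or XStream.

                    import java.net.URL;
                    import java.net.URLClassLoader;

                    // Illustrative hierarchy only:
                    //
                    //        system/application class loader
                    //            /                    \
                    //   containerLoader            pluginLoader(s)
                    //   (Spark core jars)          (per-plugin jars)
                    //
                    // The container loader and the plugin loaders are siblings, so a plugin's
                    // parent-first delegation goes to the system loader and bypasses Spark's libs.
                    final class IsolatedLoaders {

                        static URLClassLoader[] create(URL[] sparkCoreJars, URL[] pluginJars) {
                            ClassLoader system = ClassLoader.getSystemClassLoader();
                            URLClassLoader containerLoader = new URLClassLoader(sparkCoreJars, system);
                            URLClassLoader pluginLoader = new URLClassLoader(pluginJars, system);
                            return new URLClassLoader[] { containerLoader, pluginLoader };
                        }
                    }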

                     

                    Cheers

                    Antony

                      • Re: How to use different log4j version in plugin
                        Antony Bronze

                        Log4j is actually a real PITA.  It will force the use of the Thread's ContextClassLoader to find the log4j.properties/log4j.xml.  It's well described here http://articles.qos.ch/classloader.html and in this http://stackoverflow.com/questions/1974705/log4j-and-the-thread-context-classloader they recommend dropping in SLF4J as a replacement.

                        • Re: How to use different log4j version in plugin
                          Mircea Carasel Bronze

                          Hi Antony,

                           

                          Please see below

                           

                          [Antony]

                          Requiring a Plugin writer to take on the intricacies of writing a custom ClassLoader makes writing Plugins too complex

                          [Mircea]

                          I totally agree: a Plugin writer should not do that.  I was just thinking that we could extend the current mechanism to automatically create a dedicated plugin class loader that loads the plugin's dependencies.

                           

                          [Antony]

                          I don't see any point in the current PluginClassLoader usage - in fact it's a really bad thing to lump all the plugin lib directories together and it can lead to real support issues

                          [Mircea]

                          Well, I am not sure whether this is bad or not, but IMHO there should be a way to make the Spark container aware of what plugins are installed. Loading all plugins in the same class loader proves to be a solution... What this doesn't seem to resolve are plugin dependencies (other plugin jars), as you noticed.

                          I think that there are three major aspects here:

                           

                          1. There is Spark - the plugin container

                          2. There are plugins

                          3. There are plugin dependencies.

                           

                          What we have at this point:

                          For 1 - there is the Spark class loader, responsible for loading the Spark libraries

                          For 2 - there is the Plugin class loader that loads all plugin jars (lib directories)

                          For 3 - what you need is to load plugin dependencies in dedicated class loaders for each plugin. All those class loaders should inherit from the generic plugin class loader

                           

                          IMO, there is no need to break the current functionality; removing the class loader from point 2 will, I think, break the plugin dependency concept (in plugin.xml you can put a dependency tag and make one plugin be loaded after a parent plugin)

                           

                          I was looking at how Tomcat class loading takes place; the Overview Diagram:

                          http://www.jajakarta.org/tomcat/tomcat3.2-4.0/tomcat-4.0b5/src/catalina/docs/dev/classloaders.html

                           

                          The plugin class loader created by Spark (point 2) is somewhat similar to the "Shared" class loader in the above diagram. What is missing right now are the class loaders below Shared

                           

                          Mircea

                            • Re: How to use different log4j version in plugin
                              Antony Bronze

                              Hi Mircea,

                              [Antony]

                              I don't see any point in the current PluginClassLoader usage - in fact it's a really bad thing to lump all the plugin lib directories together and it can lead to real support issues

                              [Mircea]

                              Well, I am not sure whether this is bad or not, but IMHO there should be a way to make the Spark container aware of what plugins are installed. Loading all plugins in the same class loader proves to be a solution... What this doesn't seem to resolve are plugin dependencies (other plugin jars), as you noticed.

                              I think that there are three major aspects here:

                               

                              1. There is Spark - the plugin container

                              2. There are plugins

                              3. There are plugin dependencies.

                               

                              What we have at this point:

                              For 1 - there is the Spark class loader, responsible for loading the Spark libraries

                              For 2 - there is the Plugin class loader that loads all plugin jars (lib directories)

                              For 3 - what you need is to load plugin dependencies in dedicated class loaders for each plugin. All those class loaders should inherit from the generic plugin class loader

                               

                              IMO, there is no need to break the current functionality; removing the class loader from point 2 will, I think, break the plugin dependency concept (in plugin.xml you can put a dependency tag and make one plugin be loaded after a parent plugin)

                               

                              I am not sure I understand the Container's purpose in being 'aware' of what plugins are installed, or what problem is solved by adding the lib directories to the classpath.  Doing this effectively makes every plugin a runtime dependency of every other, but without the benefit of being able to define the dependency order as can currently be done.  Sure, the container can know what plugins are installed by parsing the plugins directory, but if this is meant to solve the dependency issue, it is too crude.

                               

                              IMHO, it seems like the container's runtime is being polluted with too much knowledge of the Plugins.  The container simply needs to be able to parse and load the plugins' runtime descriptors and then launch them when appropriate.  It should not have runtime (i.e. classpath) awareness.  The Tomcat container does not have all the webapp classes/libs on its own classpath.

                               

                              When debugging yesterday, I found the plugin dependency support, which I couldn't find documented anywhere in the SparkPlug dev guide, but there is a simpler solution to that dependency issue than having a common class loader that is aware of all libs.

                               

                              Yesterday, during initializePlugins, after it does the dependency checking, I ran through each Plugin (P) again and added all the jar files from its plugin dependencies (D) to the PluginClassLoader instance for that Plugin (P), so I have a working solution now that gives the following:

                               

                              1. PluginManager creates a new PluginClassLoader before creating the Plugin class instance.  This classloader is used to load and create the Plugin instance, so all classes then created by the Plugin will be automatically loaded by this loader.

                              2. PublicPlugin is given the class loader instance (if it wants it)

                              3. PluginClassLoader is populated with URLs in this order

                              a) plugins/pluginName/classes - directory

                              b) plugins/pluginName/lib/*.jar

                              c) plugins/classes

                              d) plugins/*.jar

                              e) plugins/Dependency-A/lib/*.jar

                              f) plugins/Dependency-B/lib/*.jar

                              ...

                               

                              So, this would seem to support the class segregation as well as the dependencies, and should not break anything.  One possible improvement is that the PluginManager could hold the plugins 'shared' directory on its own classpath, whereas at the moment I have added (c) and (d) to each plugin-specific loader.
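
                              A rough sketch of how that per-plugin classpath could be assembled, in the order listed above (the class name, the directory layout and the dependency-name list are assumptions for illustration; Spark's actual PluginClassLoader works differently):

                              import java.io.File;
                              import java.io.FilenameFilter;
                              import java.net.MalformedURLException;
                              import java.net.URL;
                              import java.net.URLClassLoader;
                              import java.util.ArrayList;
                              import java.util.List;

                              // Illustrative: build the per-plugin classpath in the order above,
                              // including the lib directories of declared dependency plugins.
                              final class PluginPaths {

                                  static URLClassLoader buildLoader(File pluginsDir, String pluginName,
                                                                    List<String> dependencyNames,
                                                                    ClassLoader parent) throws MalformedURLException {
                                      List<URL> urls = new ArrayList<URL>();
                                      File pluginDir = new File(pluginsDir, pluginName);

                                      urls.add(new File(pluginDir, "classes").toURI().toURL()); // a) own classes
                                      addJars(new File(pluginDir, "lib"), urls);                // b) own lib/*.jar
                                      urls.add(new File(pluginsDir, "classes").toURI().toURL());// c) shared classes
                                      addJars(pluginsDir, urls);                                // d) shared plugins/*.jar
                                      for (String dep : dependencyNames) {                      // e), f), ... dependency libs
                                          addJars(new File(new File(pluginsDir, dep), "lib"), urls);
                                      }
                                      return new URLClassLoader(urls.toArray(new URL[urls.size()]), parent);
                                  }

                                  private static void addJars(File dir, List<URL> urls) throws MalformedURLException {
                                      File[] jars = dir.listFiles(new FilenameFilter() {
                                          public boolean accept(File d, String name) {
                                              return name.endsWith(".jar") || name.endsWith(".zip");
                                          }
                                      });
                                      if (jars == null) {
                                          return;
                                      }
                                      for (File jar : jars) {
                                          urls.add(jar.toURI().toURL());
                                      }
                                  }
                              }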

                               

                              BTW, I noticed also that the addPlugin() method used when downloading a plugin does not handle any dependencies.  I am not sure if dependencies are possible for downloaded plugins, but given that addPlugin() is public, it should take care of dependencies if there are any.

                              I was looking at how Tomcat class loading takes place; the Overview Diagram:

                              http://www.jajakarta.org/tomcat/tomcat3.2-4.0/tomcat-4.0b5/src/catalina/docs/dev/classloaders.html

                               

                              The plugin class loader created by Spark (point 2) is somewhat similar to the "Shared" class loader in the above diagram. What is missing right now are the class loaders below Shared

                               

                              That link is for Tomcat 3.2/4.0, which is 10 years old.  I was working with Tomcat classloader issues in 2001, where they had similar problems - I even found one of my old messages about log4j problems.  Current Tomcat development is on 7 (also 6), and the classloader hierarchy is similar - http://tomcat.apache.org/tomcat-7.0-doc/class-loader-howto.html - but has developed since then.

                               

                              If you like, I can upload a patch so you can see where I am going with this.

                              Antony

                              • Re: How to use different log4j version in plugin
                                Antony Bronze

                                I found another flaw with the existing dependency architecture.  Let's assume you have plugins A and B, and B depends on A.  Plugin A's class "com.myorg.A" has a direct dependency on "com.myorg.B", which exists in a jar file shipped with plugin B.  When newInstance() is called in loadPublicPlugin() to create the plugin class com.myorg.A, plugin B has not yet been loaded, so the jar file containing com.myorg.B is not yet on the common PluginClassLoader classpath, and plugin A will fail to load even though the dependency is configured in plugin.xml.

                                 

                                I see the solution for that being for the plugin class instances to be created with newInstance() in the initializePlugins() method rather than in loadPublicPlugin(), which is when all dependencies will have been satisfied.  I've changed the PluginManager to do that.

                                 

                                I'm not sure I fully understand the re-ordering of the List<Plugin> plugins array in initializePlugins.  As the internal plugins are loaded by the EventQueue.invokeLater runner, they end up being added in parallel with the public plugins - at least that's what happens in my test: the public ones get loaded half way down the plugins array, so the original order is somewhat random.
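
                                For clarity, a hedged sketch of the kind of ordering initializePlugins() would need, independent of Spark's actual implementation (the Descriptor type is hypothetical): place each plugin after all of the plugins it depends on, and only then call newInstance().

                                import java.util.ArrayList;
                                import java.util.HashSet;
                                import java.util.List;
                                import java.util.Set;

                                // Illustrative only: order plugin descriptors so that every plugin appears
                                // after the plugins it depends on; instances would be created in this order.
                                final class DependencyOrder {

                                    static final class Descriptor {
                                        final String name;
                                        final List<String> dependencies;   // dependency names from plugin.xml

                                        Descriptor(String name, List<String> dependencies) {
                                            this.name = name;
                                            this.dependencies = dependencies;
                                        }
                                    }

                                    static List<Descriptor> order(List<Descriptor> plugins) {
                                        List<Descriptor> ordered = new ArrayList<Descriptor>();
                                        List<Descriptor> remaining = new ArrayList<Descriptor>(plugins);
                                        Set<String> placed = new HashSet<String>();

                                        while (!remaining.isEmpty()) {
                                            boolean progress = false;
                                            for (int i = 0; i < remaining.size(); i++) {
                                                Descriptor d = remaining.get(i);
                                                // A plugin can be placed once all of its dependencies are placed.
                                                if (placed.containsAll(d.dependencies)) {
                                                    ordered.add(remaining.remove(i));
                                                    placed.add(d.name);
                                                    progress = true;
                                                    break;
                                                }
                                            }
                                            if (!progress) {
                                                // Missing or circular dependency: keep the rest in original order.
                                                ordered.addAll(remaining);
                                                break;
                                            }
                                        }
                                        return ordered;
                                    }
                                }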

                                 

                                Is it intended that there can be dependencies from public plugins to internal ones?

                                  • Re: How to use different log4j version in plugin
                                    Antony Bronze

                                    Hi Mircea,

                                    Take a look at the attached patch for PluginManager and PluginClassLoader.  I think it satisfies all the necessary requirements - the logic is:

                                     

                                    1. A 'Shared' class loader is created in createSharedClassLoader(), which acts as the parent loader for all plugins.  For any directory given to the classLoader, the following paths are added to the classpath:

                                     

                                    a) {dir}/classes - directory

                                    b) {dir}/lib - directory

                                    c) {dir}/lib/*.zip, {dir}/lib/*.jar

                                     

                                    2. loadPublicPlugins() will load, but not instantiate, public plugins

                                     

                                    3. initializePlugins() will order plugins/publicPlugins according to dependency requirements

                                     

                                    4. initializePlugins() creates a PluginClassLoader for each plugin.  To this ClassLoader it adds the plugin/pluginName directory as in (1)

                                     

                                    5. For each dependency it then adds the plugin/dependency directory as in (1)

                                     

                                    I don't think this breaks anything, although I have very little experience with Spark.  Still, I have created 3 plugins, one of which depends on the other 2; they are loaded out of order, but when the objects are created I no longer get the NoClassDefFoundError caused by the file system load order.

                                     

                                    So, if you want a jar/class or file to be common to ALL plugins, it goes in the plugins directory (I don't do anything about copying stuff from the install directory to the runtime directory).  Each plugin will then look for classes first in its own classes directory, then its lib directory, then in the jar/zip files in the lib directory.

                                     

                                    I wonder if the plugin's own directory should be on the classpath rather than the lib directory.

                                     

                                    Antony

                                      • How to use different log4j version in plugin
                                        Mircea Carasel Bronze

                                        Hi,

                                        I looked over the patch. Overall it looks OK, but when I tested it I noticed a problem in the case where one plugin depends on another and requires the dependency plugin to be loaded first: the plugin that has to be loaded first doesn't seem to be loaded properly.

                                         

                                        Resources such as properties files or images that are located in the plugin's jar, and that we want to access right after Spark starts, before all the UI is painted, are not found. Those plugin resources do not seem to be loaded quickly enough.

                                          • How to use different log4j version in plugin
                                            Antony Bronze

                                            I don't have access to the sources just now, but I recall seeing a 2 second sleep somewhere during the plugin load.

                                            • Re: How to use different log4j version in plugin
                                              Antony Bronze

                                              The original plugin code is rather unpredictable, but it has worked because the original classpath, which contained all the plugin jar files from all plugins, is set up before each call to loadPublicPlugin().  PluginManager.loadPlugins() was called from Spark.startup(), so when PluginManager.loadPlugins() completes, the original common classLoader is fully populated with the classpath from all plugins.  After my change, that is not done until initializePlugins(), and that call happens later - there is at least a 2 second delay before initializePlugins() is called from SparkManager.getWorkspace().loadPlugins();

                                               

                                                      TaskEngine.getInstance().schedule(new TimerTask() {
                                                          public void run() {
                                                              final PluginManager pluginManager = PluginManager.getInstance();
                                                              SparkManager.getMainWindow().addMainWindowListener(pluginManager);
                                                              pluginManager.initializePlugins();
                                              
                                                              // Subscriptions are always manual
                                                              Roster roster = SparkManager.getConnection().getRoster();
                                                              roster.setSubscriptionMode(Roster.SubscriptionMode.manual);
                                                          }
                                                      }, 2000);
                                              

                                               

                                              So, unless there is a reason why the code that manages the plugin dependency re-ordering lives inside the plugin initialisation logic, I will move all the dependency re-ordering that was done in initializePlugins() to the end of loadPlugins().

                                               

                                              I'll upload a patch later.

                                              Antony

                                                • Re: How to use different log4j version in plugin
                                                  Antony Bronze

                                                  Patch for above to PluginManager - use earlier patch for PluginClassLoader

                                                    • How to use different log4j version in plugin
                                                      Mircea Carasel Bronze

                                                      I applied the patches in the following way: I took the PluginClassLoader changes from Plugins.patch and the PluginManager changes from PM.patch, and it looks good now.

                                                       

                                                      The only problem I noticed is that, when you have plugins that depend on one another, the jar files appear to be loaded both in the shared class loader and in the plugin's classloader.

                                                       

                                                      For instance, when you create a singleton and use it both in a .java class of the plugin that contains the singleton and in a .java class of a different plugin that depends on that plugin, I noticed that the singleton is created again.

                                                       

                                                      Please check the above behaviour, and also make sure that you create some static access methods in PluginManager that return a plugin's classloader given the plugin name as a parameter.

                                                       

                                                      Also, please create a new patch that contains all the changes and attach it to a separate task here:

                                                      SPARK-1311

                                                       

                                                      Thanks,

                                                      Mircea

                                                        • Re: How to use different log4j version in plugin
                                                          Antony Bronze

                                                          Hi Mircea,

                                                           

                                                          Thanks for looking at the patches.  I expect the singleton behaviour is as you have noticed.  The approach to that problem is that common dependencies should either be in a separate plugin or in a jar file loaded by the shared classloader, which is the parent of all the class loaders.  This is the same approach that Tomcat uses with its 'common' class loader.
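
                                                          A small hedged demo of why that happens (the jar path and class name below are hypothetical): if the same jar is on the classpath of two sibling plugin loaders, each loader defines its own Class object, so the "singleton" and all of its static state exist once per loader; if the jar is only visible to the shared parent, both plugins see one Class and one instance.

                                                          import java.io.File;
                                                          import java.net.URL;
                                                          import java.net.URLClassLoader;

                                                          // Illustrative demo only - paths and class names are hypothetical.
                                                          public final class SiblingSingletonDemo {

                                                              public static void main(String[] args) throws Exception {
                                                                  URL[] commonJar = { new File("plugins/lib/common.jar").toURI().toURL() };
                                                                  ClassLoader parent = SiblingSingletonDemo.class.getClassLoader();

                                                                  // Two sibling loaders, each with the jar on its own classpath.
                                                                  URLClassLoader pluginA = new URLClassLoader(commonJar, parent);
                                                                  URLClassLoader pluginB = new URLClassLoader(commonJar, parent);

                                                                  Class<?> fromA = pluginA.loadClass("com.example.CommonSingleton");
                                                                  Class<?> fromB = pluginB.loadClass("com.example.CommonSingleton");

                                                                  // Prints false: two distinct Class objects, hence two sets of statics
                                                                  // and two "singletons". Loading the jar only in the shared parent
                                                                  // loader would make both plugins see the same Class.
                                                                  System.out.println(fromA == fromB);
                                                              }
                                                          }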

                                                           

                                                          The shared class loader is created in PluginManager.createSharedClassLoader(), which sets up the following paths on that shared classloader:

                                                           

                                                          plugins/classes

                                                          plugins/lib

                                                          plugins/lib/*.jar

                                                          plugins/lib/*.zip

                                                           

                                                          so that singleton class should go into one of those locations.

                                                           

                                                          I will get the patches done this week.

                                                          Cheers

                                                          Antony

                                                          • Re: How to use different log4j version in plugin
                                                            Antony Bronze

                                                            I don't have rights to update SPARK-1311, so here's the patch based on trunk as of 15/8.

                                                            Antony

                                                              • Re: How to use different log4j version in plugin
                                                                Mircea Carasel Bronze

                                                                I opened ticket: SPARK-1424

                                                                 

                                                                I also applied your patch, but I think something was lost during your merges, because after logging into Spark I see:

                                                                 

                                                                log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.

                                                                log4j:ERROR The class "org.apache.log4j.Appender" was loaded by

                                                                log4j:ERROR [org.jivesoftware.spark.plugin.PluginClassLoader@7cd37a] whereas object of type

                                                                log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [org.jivesoftware.launcher.JiveClassLoader@15c7850].

                                                                log4j:ERROR Could not instantiate appender named "S".

                                                                 

                                                                I also created a lib directory under plugins - but after Spark starts, it just disappears.

                                                                 

                                                                Another question relates to the changes for PluginManager.loadPublicPlugin(File pluginDir) - it looks like it now always returns null, so I think it is supposed to be changed to void.

                                                                  • Re: How to use different log4j version in plugin
                                                                    Antony Bronze

                                                                     

                                                                    The log4j problem goes full circle


                                                                    The jingle plugin has a copy of log4j.jar, and that causes this problem.  It's all down to the horrible log4j static initialisation problem, which has been the bane of many a Java dev's life.  It was the reason I started down this class loader issue in the first place...

                                                                     

                                                                    It is possible to migrate to SLF4J, with a drop-in slf4j jar replacement, which would solve these problems for the future.
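
                                                                    As a hedged illustration of what that migration buys (the plugin class below is hypothetical): code written against the SLF4J API no longer cares which log4j version, if any, is on the classpath; the binding jar, plus a bridge such as log4j-over-slf4j for libraries that still call log4j directly, is chosen at deployment time.

                                                                    import org.slf4j.Logger;
                                                                    import org.slf4j.LoggerFactory;

                                                                    // Hypothetical plugin class, for illustration only.
                                                                    public class MyPlugin {

                                                                        private static final Logger log = LoggerFactory.getLogger(MyPlugin.class);

                                                                        public void initialize() {
                                                                            // Works the same against any SLF4J backend - no log4j 1.2/1.3 mismatch.
                                                                            if (log.isTraceEnabled()) {
                                                                                log.trace("plugin initialised");
                                                                            }
                                                                        }
                                                                    }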

                                                                     

                                                                    The other issue is about the lib directory...  I suggested the plugins directory as a location for shared classes/config; however, in practice the deployment is not so straightforward, as I had not fully understood how Spark unpacks the initial plugins.  Spark unpacks all the plugins from the install directory to c:\users\x\AppData\Roaming\Spark\plugins (or the appropriate equivalent), so it removes things.

                                                                     

                                                                    I can easily skip lib/classes directories in the plugins directory, but there still remains the issue of how users should deploy common jars/classes.  Attached is a patch that preserves the lib/classes directories.

                                                                      • Re: How to use different log4j version in plugin
                                                                        Mircea Carasel Bronze

                                                                        I applied the patch and for me it still wipes out the plugins/lib directory. I am thinking of simplifying things a bit.

                                                                        Let's just forget about plugins/lib or plugins/classes for common dependencies...

                                                                        Let's use the shared classloader only for loading dependency plugins; other plugins will have their own class loaders.

                                                                         

                                                                        So for instance if we have the following structure of plugins:

                                                                         

                                                                        Plugin A, Plugin B, Plugin C, Plugin D, Plugin E

                                                                         

                                                                        Plugin A and Plugin B depend on Plugin C

                                                                        Plugin D depends on Plugin E

                                                                         

                                                                        Shared class loader will load Plugin C and Plugin E

                                                                        Plugin A, B and D will have dedicated class loaders

                                                                         

                                                                        In this way we also avoid having things loaded twice - in the shared classloader and in the plugin's dedicated class loader.

                                                                         

                                                                        In another scenario, for instance, if we only have plugins that do not depend on one another, the shared class loader will not load anything.

                                                                         

                                                                        I am not sure if this is a doable proposal and whether this scheme solves your original need to have a different version of log4j available in a plugin. If it is doable, let's just do that, and think about more complex scenarios in a step 2. Please let me know what you think.

                                                                         

                                                                        Thanks,

                                                                        Mircea

                                                                          • Re: How to use different log4j version in plugin
                                                                            Antony Bronze

                                                                            Hi Mircea,

                                                                             

                                                                            I think you might have got the wrong copy of the patch - I uploaded one, then replaced it.  Not to worry though, we can forget those paths.

                                                                             

                                                                            As I said in my PM, the changes currently mean for your example that:

                                                                             

                                                                            Classloader for A has A and C paths in its classloader

                                                                            Classloader for B has B and C paths in its classloader

                                                                            Classloader for D has D and E paths in its classloader

                                                                             

                                                                            The shared classloader is never used and you never get things loaded twice anyway because the plugin classloader always searches its own paths before delegating to the parent (shared) class loader.

                                                                             

                                                                            The problem with using the shared classloader for dependencies is that, in your example, if you have jar files in C and E, there must be no duplicate classes between them, otherwise there will be a conflict and the situation is not much different from now.

                                                                             

                                                                            The only problem I see is what mechanism should be used to support deploying shared jar files.  Your suggestion of packaging jar files into a common/XXX directory in the plugin jar, which are then added to the shared classloader, would work.

                                                                             

                                                                            Regards

                                                                            Antony

                                                                              • Re: How to use different log4j version in plugin
                                                                                Mircea Carasel Bronze

                                                                                In the second patch a test was added that cannot work:

                                                                                +    if (file.getName() == "lib" || file.getName() == "classes")

                                                                                +        continue;

                                                                                I think this is why it didn't work for me. String comparison with == does not work here; it should have been file.getName().equals("lib") || file.getName().equals("classes").

                                                                                 

                                                                                Anyway - so let's forget about plugins/classes, plugins/lib, common directories or any other paths. I made a change on top of your first patch where I load dependency plugins in the shared classloader, and only create dedicated class loaders for the other plugins.

                                                                                 

                                                                                So, in scenario: Plugin A, Plugin B, Plugin C, Plugin D, Plugin E, Plugin F

                                                                                Plugin A and plugin B have a dependency on Plugin C

                                                                                Plugin D has a dependency on plugin E

                                                                                Plugin F does not have any dependencies

                                                                                 

                                                                                We will have separate class loaders for Plugin A, Plugin B, Plugin D and Plugin F, added to your hashMap of plugin class loaders.

                                                                                Plugin C and Plugin E will be loaded in the shared class loader - let's accept the limitation that those plugins must not have common classes; this would have been a problem for every plugin up to now, when we had one single class loader.

                                                                                 

                                                                                Given the fact that the shared class loader is the parent of all the dedicated classloaders held in the hashMap, we are OK: every plugin works.

                                                                                Also, the case when a singleton is present in a dependency plugin and used in other plugins is resolved.

                                                                                 

                                                                                The log4j class loading error is in any case encountered only when you have plugins that contain the log4j library, because the log4j library is also present in the Spark lib directory; I think we can live with this. Plugins should not contain the same version of log4j as Spark does. We had better update the Spark lib directory with the latest version of log4j.

                                                                                 

                                                                                Please review whether my changes are OK with you. If they are, I will recommend that the patch be committed to Spark SVN.

                                                                                 

                                                                                I only made changes in your PluginManager method private void createPluginClasses().

                                                                                 

                                                                                The attached patch contains your first patch (SparkPlugin-110815.patch) plus my changes.

                                                                          • Re: How to use different log4j version in plugin
                                                                            Antony Bronze

                                                                            I asked earlier about the loadPublicPlugin method because it loops round all the <plugin> elements in plugin.xml, so it looks like it was supposed to handle multiple plugin classes in a single plugin.xml, but in practice it did not work.

                                                                             

                                                                            It should be void because the plugin cannot be instantiated at that point.  Also, if it is not designed to support multiple <plugin> elements, then the confusing loop should be removed.

                                                    • Re: How to use different log4j version in plugin
                                                      Antony Bronze

                                                      The current PluginManager.loadPublicPlugin API seems to imply it will create a single Plugin instance, but loadPublicPlugin() loops over the plugin.xml:

                                                       

                                                              List<? extends Node> pluginDefs = pluginXML.selectNodes("/plugin");

                                                              for (Node plugin1 : pluginDefs) {

                                                                  PublicPlugin publicPlugin = new PublicPlugin();

                                                       

                                                      so it seems it's allowed to have multiple Plugin implementations in a single plugin.xml - if so, the return value is meaningless.