How to use different log4j version in plugin

Log4j is actually a real PITA. It forces the use of the Thread’s ContextClassLoader to find the log4j.properties/log4j.xml. It’s well described at http://articles.qos.ch/classloader.html, and in http://stackoverflow.com/questions/1974705/log4j-and-the-thread-context-classloader they recommend dropping in SLF4J as a replacement.
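One common workaround is to temporarily point the thread’s context class loader at the plugin’s loader while log4j initialises. This is a generic sketch, not Spark code - pluginClassLoader here stands for whatever loader can see your plugin’s log4j.properties:

ClassLoader previous = Thread.currentThread().getContextClassLoader();
Thread.currentThread().setContextClassLoader(pluginClassLoader);
try {
    // log4j's static initialisation runs on first use and searches the
    // context class loader for log4j.properties / log4j.xml
    org.apache.log4j.Logger.getLogger("my.plugin.SomeClass");
} finally {
    // always restore the original loader
    Thread.currentThread().setContextClassLoader(previous);
}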

Hi Antony,

Please see below

[Antony]

Requiring a Plugin writer to take on the intricacies of writing a custom ClassLoader makes writing Plugins too complex

[Mircea]

I totally agree - a Plugin writer should not do that. I was just thinking that we can extend the current mechanism to automatically create a dedicated plugin class loader that loads the plugin’s dependencies

[Antony]

I don’t see any point in the current PluginClassLoader usage - in fact it’s a really bad thing to lump all the plugin lib directories together, and it can lead to real support issues

[Mircea]

Well, I am not sure if this is bad or not, but IMHO there should be a way to make the Spark container aware of what plugins are installed. Loading all plugins in the same class loader proves to be one solution… What this doesn’t seem to resolve are plugin dependencies (other plugin jars), as you noticed.

I think that there are three major aspects here:

  1. There is Spark - the plugin container

  2. There are plugins

  3. There are plugin dependencies.

What we have at this point:

For 1 - there is the Spark class loader, responsible for loading the Spark libraries

For 2 - there is the Plugin class loader that loads all plugin jars (lib directories)

In addition, for point 3, what you need is to load plugin dependencies in dedicated class loaders for each plugin. All those class loaders should inherit from the generic plugin class loader

IMO, there is no need to break the current functionality; removing the class loader from point 2 will, I think, break the plugin dependency concept (in plugin.xml you can put a dependency tag and make one plugin load after a parent plugin)

I was looking at how Tomcat class loading takes place - the overview diagram:

http://www.jajakarta.org/tomcat/tomcat3.2-4.0/tomcat-4.0b5/src/catalina/docs/dev/classloaders.html

The plugin class loader created by Spark (point 2) is somewhat similar to the “Shared” class loader in the diagram above. What is missing right now are the class loaders below Shared
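To illustrate, roughly how the three levels could be wired (a sketch - the names are just for illustration, not Spark’s actual code):

import java.net.URL;
import java.net.URLClassLoader;

class LoaderHierarchySketch {
    static URLClassLoader pluginLoaderFor(URL[] pluginDependencyUrls, URL[] allPluginJarUrls) {
        // point 1: the Spark class loader, which loads the Spark libraries
        ClassLoader sparkLoader = LoaderHierarchySketch.class.getClassLoader();
        // point 2: the shared plugin loader, analogous to Tomcat's "Shared"
        URLClassLoader sharedPluginLoader = new URLClassLoader(allPluginJarUrls, sparkLoader);
        // point 3: a dedicated loader per plugin for that plugin's dependencies
        return new URLClassLoader(pluginDependencyUrls, sharedPluginLoader);
    }
}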

Mircea

Hi Mircea,

[Antony]

I don’t see any point in the current PluginClassLoader usage - in fact it’s a really bad thing to lump all the plugin lib directories together, and it can lead to real support issues

[Mircea]

Well, I am not sure if this is bad or not, but IMHO there should be a way to make the Spark container aware of what plugins are installed. Loading all plugins in the same class loader proves to be one solution… What this doesn’t seem to resolve are plugin dependencies (other plugin jars), as you noticed.

I think that there are three major aspects here:

  1. There is Spark - the plugin container
  2. There are plugins
  3. There are plugin dependencies.

What we have at this point:

For 1 - there is the Spark class loader, responsible for loading the Spark libraries

For 2 - there is the Plugin class loader that loads all plugin jars (lib directories)

In addition, for point 3, what you need is to load plugin dependencies in dedicated class loaders for each plugin. All those class loaders should inherit from the generic plugin class loader

IMO, there is no need to break the current functionality; removing the class loader from point 2 will, I think, break the plugin dependency concept (in plugin.xml you can put a dependency tag and make one plugin load after a parent plugin)

I am not sure I understand the container’s purpose in being ‘aware’ of what plugins are installed, or what problem is solved by adding the lib directories to the classpath. Doing so effectively makes every plugin a runtime dependency of every other, but without the benefit of being able to define the dependency order, as can currently be done. Sure, the container can know what plugins are installed by parsing the plugins directory, but if this is meant to solve the dependency issue, it is too crude.

IMHO, the container’s runtime is being polluted with too much knowledge of the plugins. The container simply needs to be able to parse and load the plugins’ runtime descriptors and then launch them when appropriate. It should not have runtime (i.e. classpath) awareness. The Tomcat container does not have all the webapp classes/libs on its own classpath.

When debugging yesterday, I found the plugin dependency support, which I couldn’t find documented anywhere in the SparkPlug dev guide. There is a simpler solution to that dependency issue than having a common class loader that is aware of all libs.

Yesterday, during initializePlugins, after the dependency checking, I ran through each Plugin (P) again and added all the jar files from its plugin dependency (D) to the PluginClassLoader instance for that Plugin (P). So I now have a working solution that gives the following:

  1. PluginManager creates a new PluginClassLoader before creating the Plugin class instance. This class loader is used to load and create the Plugin instance, so all classes subsequently created by the Plugin will automatically be loaded by this loader.

  2. PublicPlugin is given the class loader instance (if it wants it)

  3. PluginClassLoader is populated with URLs in this order:

a) plugins/pluginName/classes - directory

b) plugins/pluginName/lib/*.jar

c) plugins/classes

d) plugins/*.jar

e) plugins/Dependency-A/lib/*.jar

f) plugins/Dependency-B/lib/*.jar

So, this would seem to support class segregation as well as the dependencies, and it should not break anything. One possible improvement is that the PluginManager could have the plugins ‘shared’ directory on its classpath, whereas at the moment I have added (c) & (d) to each plugin-specific loader. A sketch of this population order follows.
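Roughly, the loader looks like this (a sketch in the spirit of the PluginClassLoader - addPluginDirectory() is my illustrative name, not the actual API):

import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;

public class PluginClassLoaderSketch extends URLClassLoader {

    public PluginClassLoaderSketch(ClassLoader parent) {
        super(new URL[0], parent);
    }

    // Call once for the plugin's own directory, then once per dependency,
    // so the URLs end up in the search order listed above.
    public void addPluginDirectory(File pluginDir) throws MalformedURLException {
        File classes = new File(pluginDir, "classes");
        if (classes.isDirectory()) {
            addURL(classes.toURI().toURL());     // e.g. plugins/pluginName/classes
        }
        File[] libs = new File(pluginDir, "lib").listFiles();
        if (libs != null) {
            for (File lib : libs) {
                if (lib.getName().endsWith(".jar")) {
                    addURL(lib.toURI().toURL()); // e.g. plugins/pluginName/lib/*.jar
                }
            }
        }
    }
}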

BTW, I also noticed that the addPlugin() method used when downloading a plugin does not handle any dependencies. I am not sure if dependencies are possible for downloaded plugins, but given that addPlugin() is public, it should take care of dependencies if there are any.

I was looking at how Tomcat class loading takes place - the overview diagram:

http://www.jajakarta.org/tomcat/tomcat3.2-4.0/tomcat-4.0b5/src/catalina/docs/dev/classloaders.html

The plugin class loader created by Spark (point 2) is somewhat similar to the “Shared” class loader in the diagram above. What is missing right now are the class loaders below Shared

That link is for Tomcat 3.2/4.0, which is 10 years old - I was working with Tomcat classloader issues in 2001, when they had similar problems; I even found one of my old messages about log4j problems. Current Tomcat development is on version 7 (also 6), and the classloader hierarchy is still similar - see the Apache Tomcat 7 Class Loader HOW-TO - but it has developed since then.

If you like, I can upload a patch so you can see where I am going with this.

Antony

I found another flaw in the existing dependency architecture. Let’s assume you have plugins A and B, and A depends on B. Plugin A’s class “com.myorg.A” has a direct dependency on “com.org.B”, which lives in a jar file shipped with plugin B. When newInstance() is called in loadPublicPlugin() to create the plugin class com.myorg.A, plugin B has not yet been loaded, so the jar file containing com.org.B is not yet on the common PluginClassLoader classpath, and loading plugin A fails even though the dependency is configured in its plugin.xml.

I see the solution as creating the plugin class instances with newInstance() in the initializePlugins() method rather than the loadPublicPlugin() method; by then, all dependencies will have been satisfied. I’ve changed the PluginManager to do that.
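In outline, the two-phase approach looks like this (a condensed sketch, all names hypothetical):

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class TwoPhaseSketch {
    private final List<String> pluginClassNames = new ArrayList<String>();
    private final Map<String, ClassLoader> loaders = new HashMap<String, ClassLoader>();

    // Phase 1, loadPublicPlugin(): parse plugin.xml and record the class name,
    // but do NOT call newInstance() yet.
    void register(String pluginClassName, ClassLoader loader) {
        pluginClassNames.add(pluginClassName);
        loaders.put(pluginClassName, loader);
    }

    // Phase 2, initializePlugins(): runs after dependency ordering, when every
    // required jar is reachable, so instantiation no longer fails with
    // NoClassDefFoundError on cross-plugin references.
    void instantiateAll() throws Exception {
        for (String name : pluginClassNames) {
            Object plugin = loaders.get(name).loadClass(name).newInstance();
            // ... hand the instance over to the plugin lifecycle ...
        }
    }
}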

I’m not sure I fully understand the re-ordering of the List plugins array in initializePlugins. As the internal plugins are loaded by the EventQueue.invokeLater runner, they end up being added in parallel with the public plugins - at least that’s what happens in my test: the public ones get loaded halfway down the plugins array, so the original order is somewhat random.

Is it intended there can be dependencies from public plugins to internal?

The current PluginManager.loadPublicPlugin API seems to imply it will create a single Plugin instance, but loadPublicPlugin() loops over the plugin.xml:

List<? extends Node> pluginDefs = pluginXML.selectNodes("/plugin");
for (Node plugin1 : pluginDefs) {
    PublicPlugin publicPlugin = new PublicPlugin();
    ...

so it seems it’s allowed to have multiple Plugin implementations in a single plugin.xml - if so, the return value is meaningless.

Hi Mircea,

Take a look at the attached patch for PluginManager and PluginClassLoader. I think it satisfies all the necessary requirements - the logic is:

  1. A ‘shared’ class loader is created in createSharedClassLoader(), which acts as the parent loader for all plugins. For any directory handed to this class loader, the following paths are added to its classpath:

a) {dir}/classes - directory

b) {dir}/lib - directory

c) {dir}/lib/*.zip, {dir}/lib/*.jar

  2. loadPublicPlugins() will load, but not instantiate, public plugins

  3. initializePlugins() will order plugins/publicPlugins according to dependency requirements

  4. initializePlugins() creates a PluginClassLoader for each plugin. To this class loader it adds the plugins/pluginName directory as in (1)

  5. For each dependency it then adds the plugins/dependency directory as in (1) - a condensed sketch of this wiring follows
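In code terms, roughly (reusing the hypothetical addPluginDirectory() from my earlier sketch; pluginsRootDir, orderedPluginDirs, dependencyDirsOf() and classLoaders are placeholders, not the patch’s actual names):

// step 1: the shared loader, parent of every per-plugin loader
PluginClassLoaderSketch shared = new PluginClassLoaderSketch(getClass().getClassLoader());
shared.addPluginDirectory(pluginsRootDir);            // picks up plugins/classes and plugins/lib/*.jar

// steps 4 and 5: one loader per plugin (ordered in step 3), then its dependencies
for (File pluginDir : orderedPluginDirs) {
    PluginClassLoaderSketch loader = new PluginClassLoaderSketch(shared);
    loader.addPluginDirectory(pluginDir);             // the plugin's own classes/lib
    for (File depDir : dependencyDirsOf(pluginDir)) { // placeholder helper
        loader.addPluginDirectory(depDir);
    }
    classLoaders.put(pluginDir.getName(), loader);    // placeholder map
}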

I don’t think this breaks anything, although I have very little experience with Spark. Still, I have created 3 plugins, one of which depends on the other 2; they are loaded out of order, but when the objects are created I no longer get the NoClassDefFoundError caused by the file-system load order.

So, if you want a jar/class or file to be common to ALL plugins, it goes in the plugins directory (I don’t do anything about copying stuff from the install directory to the runtime directory). Each plugin will then look first in its own classes directory, then its lib directory, then the jar/zip files in the lib directory.

I wonder if the plugin’s own directory should be on the classpath rather than the lib directory.

Antony
Plugins.patch.zip (5619 Bytes)

Hi,

I looked over the patch. Overall it looks OK, but when I tested it I noticed problems when plugins depend on another plugin - that is, when one plugin requires a dependency plugin to be loaded first, the plugin that has to be loaded first doesn’t seem to be loaded properly.

Resources such as properties files or images that are located in the plugin’s jar are not found if we try to access them right after Spark is started, before the UI is painted. Those plugin resources don’t seem to be loaded quickly enough.

I don’t have access to the sources just now, but I recall seeing a 2-second sleep somewhere during the plugin load.

The original plugin code is rather unpredictable, but it has worked because the original classpath, which contained the jar files from all plugins, was set up before each call to loadPublicPlugin(). PluginManager.loadPlugins() was called from Spark.startup(), so when PluginManager.loadPlugins() completed, the original common class loader was fully populated with the classpath from all plugins. After my change, that is not done until initializePlugins(), and that call happens later - there is at least a 2-second delay before initializePlugins() is called from SparkManager.getWorkspace().loadPlugins():

TaskEngine.getInstance().schedule(new TimerTask() {
    public void run() {
        final PluginManager pluginManager = PluginManager.getInstance();
        SparkManager.getMainWindow().addMainWindowListener(pluginManager);
        pluginManager.initializePlugins();

        // Subscriptions are always manual
        Roster roster = SparkManager.getConnection().getRoster();
        roster.setSubscriptionMode(Roster.SubscriptionMode.manual);
    }
}, 2000);

So, unless there is a reason why the code that manages the plugin dependency re-ordering lives inside the plugin initialisation logic, I will move all the dependency re-ordering that was done in initializePlugins() to the end of loadPlugins().

I’ll upload a patch later.

Antony

Patch for above to PluginManager - use earlier patch for PluginClassLoader
PM.patch.zip (5643 Bytes)

I applied the patches in the following way: I took the PluginClassLoader changes from Plugins.patch and the PluginManager changes from PM.patch, and it looks good now.

The only problem I noticed is that when you have plugins that depend on one another, the jar files appear to be loaded both in the shared class loader and in the plugin class loader.

For instance, when you create a singleton and use it both in a class of the plugin that contains it and in a class of a different plugin that depends on that plugin, I noticed that the singleton is created again.
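To illustrate why this happens (the class name is hypothetical): a class defined by two different class loaders is two distinct classes to the JVM, each with its own static state.

// hypothetical singleton shipped in a dependency plugin's jar
public class SharedRegistry {
    private static final SharedRegistry INSTANCE = new SharedRegistry();
    public static SharedRegistry getInstance() { return INSTANCE; }
}

// If that jar is visible to both the shared loader and a plugin's own loader,
// each loader defines its own SharedRegistry class:
//   sharedLoader.loadClass("SharedRegistry") != pluginLoader.loadClass("SharedRegistry")
// and each class holds its own INSTANCE - so "the" singleton exists twice.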

Please check the above behaviour, and also make sure you create a static access method in PluginManager that returns a plugin’s class loader, given the plugin name as a parameter.

Also, please create a new patch that contains all the changes and attach it to a separate task here:

SPARK-1311

Thanks,

Mircea

Hi Mircea,

Thanks for looking at the patches. I expect the singleton behaviour is as you have noticed. The approach to that problem is that common dependencies should either be in a separate plugin or in a jar file loaded by the shared class loader, which is the parent of all class loaders. This is the same approach Tomcat takes with its ‘common’ class loader.

The shared class loader is created in PluginManager.createSharedClassLoader(), which sets up the following paths on the shared class loader:

plugins/classes

plugins/lib

plugins/lib/*.jar

plugins/lib/*.zip

so that singleton class should go into one of those locations.

I will get the patches done this week.

Cheers

Antony

I don’t have rights to update SPARK-1311, so here’s the patch based on trunk as of 15/8.

Antony
SparkPlugin-110815.patch.zip (8189 Bytes)

I opened ticket: SPARK-1424

I also applied your patch, but I think something was lost during your merges, because after logging into Spark I see:

log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [org.jivesoftware.spark.plugin.PluginClassLoader@7cd37a] whereas object of type
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [org.jivesoftware.launcher.JiveClassLoader@15c7850].
log4j:ERROR Could not instantiate appender named "S".

I also created a lib directory under plugins - but after Spark starts, it just disappears.

Another question relates to the changes to PluginManager.loadPublicPlugin(File pluginDir) - it looks like it now always returns null; I think it is supposed to be changed to void.

The log4j problem goes full circle…

The jingle plugin has a copy of log4j.jar, and that causes this problem. It’s all down to the horrible log4j problem with static initialisation, which has been the bane of many a Java dev’s life. It was the reason I started down this class loader issue in the first place…

It is possible to migrate to SLF4J with a drop-in slf4j jar replacement, which would solve these problems for the future.

The other issue is about the lib directory… I suggested the plugins directory as a location for shared classes/config; however, in practice the deployment is not so straightforward, as I had not fully understood how Spark unpacks the initial plugins. Spark unpacks all the plugins from the install directory to the c:\users\x\AppData\Roaming\Spark\plugins (or appropriate) directory, removing other things in the process.

I can easily skip lib/classes directories in the plugins directory, but there still remains the issue of how users should deploy common jars/classes. Attached is a patch that preserves the lib/classes directories.
SparkPlugin-110815-2.patch.zip (8331 Bytes)

I asked earlier about the loadPublicPlugin method because it loops round all the elements in plugin.xml, so it looks like it was supposed to handle multiple plugin classes in a single plugin.xml, but in practice that did not work.

It should be void because the plugin cannot be instantiated at that point. Also, if it is not designed to support multiple elements, the confusing loop should be removed.

I applied the patch, and for me it still wipes out the plugins/lib directory. I am thinking of simplifying things a bit.

Let’s just forget about plugins/lib or plugins/classes for common dependencies…

Let’s use the shared class loader only for loading dependency plugins; other plugins will have their own class loaders.

So, for instance, if we have the following structure of plugins:

Plugin A, Plugin B, Plugin C, Plugin D, Plugin E

Plugin A and Plugin B depend on Plugin C

Plugin D depends on Plugin E

The shared class loader will load Plugin C and Plugin E

Plugins A, B and D will have dedicated class loaders

In this way we also avoid having things loaded twice - in both the shared class loader and a plugin’s dedicated class loader

In another scenario, for instance, if we only have plugins that do not depend on one another, the shared class loader will not load anything…
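In rough code, the allocation I mean would be something like this (helper names are made up, and PluginClassLoaderSketch is from your earlier sketch):

// dependency plugins (C, E) go into the shared loader; the rest (A, B, D)
// each get a dedicated loader whose parent is the shared loader
for (File pluginDir : allPluginDirs) {
    if (isDependencyOfAnotherPlugin(pluginDir)) {          // made-up check
        sharedClassLoader.addPluginDirectory(pluginDir);
    } else {
        PluginClassLoaderSketch own = new PluginClassLoaderSketch(sharedClassLoader);
        own.addPluginDirectory(pluginDir);
        classLoaders.put(pluginDir.getName(), own);
    }
}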

I am not sure if this proposal is doable, or whether this scheme solves your original need of having a different version of log4j available in a plugin. If it is doable, let’s just do that, and think about more complex scenarios in a step 2. Please let me know what you think.

Thanks,

Mircea

Hi Mircea,

I think you might have got the wrong copy of the patch; I uploaded one, then replaced it. Not to worry though, we can forget those paths.

As I said in my PM, the changes currently mean for your example that:

The class loader for A has the A and C paths

The class loader for B has the B and C paths

The class loader for D has the D and E paths

The shared classloader is never used, and you never get things loaded twice anyway, because the plugin classloader always searches its own paths before delegating to the parent (shared) class loader.
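For reference, here is a minimal child-first loadClass() of the kind I’m describing - a generic sketch, not Spark’s actual PluginClassLoader:

import java.net.URL;
import java.net.URLClassLoader;

public class ChildFirstClassLoader extends URLClassLoader {

    public ChildFirstClassLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected synchronized Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        Class<?> c = findLoadedClass(name);
        if (c == null) {
            try {
                c = findClass(name);              // search own URLs first...
            } catch (ClassNotFoundException e) {
                c = getParent().loadClass(name);  // ...then delegate upwards
            }
        }
        if (resolve) {
            resolveClass(c);
        }
        return c;
    }
}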

The problem with using the shared classloader for dependencies is that, in your example, if you have jar files in both C and E, there must be no duplicate classes between them; otherwise there will be a conflict, and the situation is not much different from now.

The only problem I see is what mechanism should be used to support deploying shared jar files. Your suggestion of packaging jar files into a common/XXX directory in the plugin jar, which are then added to the shared classloader, would work.

Regards

Antony

The second patch added a test that cannot work:

if (file.getName() == "lib" || file.getName() == "classes")
    continue;

I think this is why it didn’t work for me. Instead, it should have used equals(), e.g. file.getName().equals("classes").
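For anyone following along, the reason == cannot work here: it compares object identity, and file.getName() returns a new String, so it is never the same object as the literal.

import java.io.File;

public class StringCompareDemo {
    public static void main(String[] args) {
        String name = new File("plugins/lib").getName();
        System.out.println(name == "lib");        // false: different String objects
        System.out.println(name.equals("lib"));   // true: same characters
    }
}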

Anyway - let’s forget about plugins/classes, plugins/lib, common directories or any other paths. I made a change to your first patch: I load dependency plugins in the shared class loader, and I create dedicated class loaders only for the other plugins.

So, in the following scenario: Plugin A, Plugin B, Plugin C, Plugin D, Plugin E, Plugin F

Plugin A and Plugin B have a dependency on Plugin C

Plugin D has a dependency on Plugin E

Plugin F does not have any dependencies

There will be separate class loaders for Plugins A, B, D and F, added to your hashMap of plugin class loaders

Plugin C and Plugin E will be loaded in the shared class loader - let’s accept the limitation that those plugins cannot have common classes; this would already be a problem for every plugin right now, when we have one single class loader

Given that the shared class loader is the parent of all the dedicated class loaders held in the hashMap, we are OK - every plugin works.

Also, the case where a singleton is present in a dependency plugin and used in other plugins is resolved.

The error with log4j class loading is encountered only when you have plugins that bundle the log4j library, because the log4j library is also present in Spark’s lib directory; I think we can live with this. Plugins should not contain the same version of log4j that Spark already ships. We had better update the Spark lib directory with the latest version of log4j.

Please review whether my changes are OK with you. If they are, I will recommend the patch be committed to Spark SVN.

I only made changes in your private createPluginClasses() method in PluginManager.

The attached patch contains your first patch (SparkPlugin-110815.patch) plus my changes
SparkPlugin-110815-3.patch.zip (8252 Bytes)

Looks like I uploaded the wrong patch, as my copy had "lib".equals() rather than == "lib".

I’m away until Monday now; I’ll have to look at it then.