Thoughts on moving spark.properties to an INI file

Title states it all. It would be really handy for me if Spark's main properties file were an INI file. It looks like it's possible for Java to take advantage of INI files using [ini4j], a Java API for handling the Windows INI file format. And since an INI file is really nothing more than a text file with sections, this shouldn't cause problems on other platforms.

So, I thought I would open up the discussion to get some feedback on whether this is doable or desired by others.

What improvements would switching to ini bring?

Honestly, I can only think of a few:

consolidation of the property files into a single file by using INI section tags

being able to use Windows Group Policy to push out changes to the INI files.

Being able to edit the INI file via Group Policy is the one I would be most interested in.
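As a hedged sketch of the consolidation idea (the file name, section names, and keys below are illustrative, not Spark's actual property names), several .properties files could collapse into one sectioned file:

```ini
; hypothetical consolidated spark.ini replacing several .properties files
[settings]
idleTime=30
showHistory=true

[layout]
mainWindowX=100
mainWindowY=100
```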

I wasn’t aware of an option to modify INI files via GPO. This looks handy. But who would take up the task of modifying all the parts that use the properties file to use INI (and to use it correctly; I'm not sure if ini4j would solve all of this automatically)? And then there's backwards compatibility (automatic conversion between the old and new config file). It would also invalidate quite a few of the forum posts referring to spark.properties and other config files. Maybe switching to INI usage could be an option, instead of making it the default behavior, especially when other platforms wouldn’t benefit from it.

Anyway, I’m not disagreeing. I might use it too. Although we allow our users to change their settings as they wish, and we will probably have switched to Skype for Business by the time it is implemented.

I may try this: simply add a section tag, say [settings], to spark.properties and see if it's ignored by Spark. If it is, then I wonder if I can use GPO to edit it even though the file doesn’t end with .ini… hmmm
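For what it's worth, the likely outcome of that experiment can be predicted from how `java.util.Properties` parses lines (assuming Spark loads spark.properties through the standard `Properties` class, which is an assumption on my part): a line containing no `=` or `:` becomes a key with an empty value, so a section tag should round-trip harmlessly rather than break anything:

```java
import java.io.StringReader;
import java.util.Properties;

public class SectionTagExperiment {
    public static void main(String[] args) throws Exception {
        // Simulate a spark.properties file with an INI-style section tag added
        String file = "[settings]\nidleTime=30\n";

        Properties props = new Properties();
        props.load(new StringReader(file));

        // The bracketed line is parsed as a key with an empty value, not an
        // error -- which is why it would be written back out as "[settings]="
        System.out.println(props.getProperty("[settings]").isEmpty()); // prints true
        System.out.println(props.getProperty("idleTime"));             // prints 30
    }
}
```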

Yes, backward compatibility would probably be an issue, as would past references to the property files.

I’m pretty sure Spark will ignore that, but I'm not sure whether GPO will accept an INI file that doesn't have the .ini extension.

Why not move settings to the registry instead of an ini file? Managing registry settings via GPO is also a well-documented and familiar task for any system admin.

Because Spark is a cross-platform Java app.

Very true. Well, I may be overcomplicating things, but since the registry is a glorified text file itself, I think the Windows version could easily support the registry while the Mac and Linux versions use an INI file.

INI files can be controlled and changed with GPO as well. Truth be told, we don’t need true INI support… just a single [tag] near the top of the file, before the settings. Then rename the spark.properties file to spark.ini. Since an INI file is still a text file, it should still work fine on other OSes, as long as the tag is ignored by Spark.

If I had to vote, the registry would be in last place for me (coming from a Windows systems admin who has to deal with GPOs, scripts, and the registry on a daily basis). Currently I’m perfectly fine with having the settings in a text file. It is easy to edit (not just for admins, but for regular users too). I can see an advantage in using INI, especially if we could provide MSI packages at some point.

Aren’t there many situations when you wouldn’t want your users to be able to edit the ini file?

For an antivirus or firewall, sure. But for general-use programs, not so often. I know that some may have strict policies here too, but for an instant messenger you might want your users to be able to change settings for their convenience. Also, advanced users can still find out how to change settings, as user-side settings usually sit in the user's AppData or the user's registry hive.

Just wanted to make a comment here: I seem to have my GPO correctly managing the entire spark.properties file via the INI file configuration options in Group Policy Preferences. I simply made the section heading [SparkViaGPO]. Spark interprets this as an empty property: after Spark has launched, it rewrites the line as “[SparkViaGPO]=”. This has been working well, even through multiple successive GPO reapplications (both forced and non-forced updates, and the regular 90-odd-minute refresh).

To make this completely smooth, the Spark client just needs to allow for a proper section heading. Anything, really. (Well, either that, or Microsoft changes its INI editor to not require section headings!) Perhaps, since Spark already treats lines starting with # as comments, it could treat lines starting with [ as comments as well, so Spark would simply ignore the line.
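A minimal sketch of that suggestion (a hypothetical helper, not Spark's actual code): pre-filter lines that start with [ before handing the text to java.util.Properties, so section headings are skipped the same way # comment lines are:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;
import java.util.stream.Collectors;

public class SectionTolerantLoader {
    // Load properties text, ignoring INI-style [section] lines
    // the same way # comment lines are ignored.
    public static Properties load(String text) throws IOException {
        String filtered = text.lines()
                .filter(line -> !line.trim().startsWith("["))
                .collect(Collectors.joining("\n"));
        Properties props = new Properties();
        props.load(new StringReader(filtered));
        return props;
    }

    public static void main(String[] args) throws IOException {
        Properties p = load("[SparkViaGPO]\n# a comment\nidleTime=30\n");
        System.out.println(p.containsKey("[SparkViaGPO]")); // prints false
        System.out.println(p.getProperty("idleTime"));      // prints 30
    }
}
```

Note this naive filter doesn't account for .properties line continuations (a trailing backslash), so it's a sketch rather than a drop-in change.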

Currently working well for my needs!
