Spark 2.7.0 install via Windows Startup Script

Hi all,

In our Windows-based environment I’m responsible for deploying the latest official Spark release (2.7.0.671) to a large number of desktops. These desktops currently run the previous official release, Spark 2.6.3.12555. As with the last release, I’ve opted to use our startup script (applied by Group Policy) to silently install Spark 2.7.0 when the machines boot.

So, within the startup script, I’m launching the Spark 2.7.0 installer like this:

spark_2_7_0.exe -q
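
In full, the relevant part of the script is roughly this (a minimal sketch; \\server\deploy is a placeholder for our actual software share):

@echo off
rem GPO startup script; runs under the SYSTEM account at boot.
rem \\server\deploy is a placeholder for the share holding the installer.
"\\server\deploy\spark_2_7_0.exe" -q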

And it’s not installing. I can see that it tries to install, but it never completes - it hangs for a while and then fails silently. Meanwhile, in the Windows Event Log I see an entry indicating that something tried to write to the registry but failed due to insufficient permissions (no details are given on exactly what it was, though).

However, if a logged-on user with admin privileges runs the same install command as above, everything works fine.

After quite a bit of troubleshooting, I’ve run across what I think may be the cause of the problem: the installer manifest in the spark_2_7_0.exe installer is not set to ask for or require administrative privileges in Windows. As evidence, if you compare the old Spark 2.6.3 installer with the latest 2.7.0 installer, you’ll see that the 2.6.3 installer has the administrator “shield” overlay on its icon, while the 2.7.0 installer does not. I’ll also add that I’m able to install Spark 2.6.3 via the startup script without issue.
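
For anyone who wants to verify this themselves, the embedded manifest can be dumped with mt.exe from the Windows SDK (resource ID #1 is where application manifests normally live):

rem Extract the application manifest embedded in the installer (Windows SDK tool).
mt.exe -inputresource:spark_2_7_0.exe;#1 -out:extracted.manifest
rem In an elevating installer you would expect to see something like:
rem   <requestedExecutionLevel level="requireAdministrator" uiAccess="false"/>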

So, is there any hope that the 2.7.0 installer can be recompiled so that it requires administrative privileges like the old 2.6.3 installer? FYI, I found this link on the install4j site that explains how: Elevation Of Privileges. I’m open to any other ideas as well, but this seems to be the simplest resolution to our problem.

Thanks!

I noticed the absence of the shield icon a few years ago, after install4j was updated for Bamboo (the installer build system we use). But we haven’t figured out how to change that. I thought it would be problematic, but Spark installs fine manually (it just asks for privileges after install4j loads, rather than as it launches). It also installs normally over 2.7.0 versions using the -q switch. I do it in a shutdown script, but it should work the same as a startup one. It probably does cause problems when installing 2.7.0 over 2.6.3, though that may also be because of the different install4j versions used for those releases. Maybe 2.7.0 tries to update the existing installation and fails because 2.6.3 can’t understand the new commands.

I will file a ticket for this, but restoring the old privileges behavior could break updating for those who already use 2.7.0 (since Spark didn’t have an official release for years, some were using nightly builds in production). So there must be an option for those users too. It needs testing. But most of all, we don’t have install4j/Bamboo experts here. I know that when I tried comparing the old and new install4j configs back then, it was mind-boggling. And Daryl (a volunteer looking after Bamboo in his spare time) couldn’t find what to change either. It’s a small community of a few volunteers, so don’t expect fast results.

[SPARK-1608] Investigate Windows privileges options for install4j - Jive Software Open Source

For now you may try calling uninstall.exe first (not sure if it supports -q as well) to remove the 2.6.3 installation, and then install 2.7.0. That’s all I can think of for now (other than wiping the old installation folder the unclean way).
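
Something like this, perhaps (a sketch only - it assumes the default install folder, and whether uninstall.exe honors -q is untested):

rem Sketch: remove the old 2.6.3 install first, then install 2.7.0 silently.
rem Assumes the default install folder; -q on uninstall.exe is untested.
if exist "%ProgramFiles%\Spark\uninstall.exe" "%ProgramFiles%\Spark\uninstall.exe" -q
"\\server\deploy\spark_2_7_0.exe" -q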

Also, Sam Snow talks here about using the -overwrite switch and reports good results in testing, but he never came back to share his final results: Trying to use GPO to install Spark

Thanks for that. I should mention that within the startup script I’ve also tried running the uninstaller for 2.6.3 and then the installer for 2.7.0; the result is that 2.6.3 gets uninstalled and 2.7.0 doesn’t get installed. I’ve also tried installing 2.7.0 on a system with no Spark installed at all, with the same result, and the -overwrite switch made no difference either. A bit odd, since I seem to be the only one having this issue. I understand that you are successfully running the 2.7.0 installer via a shutdown script in your environment. Both startup and shutdown scripts run under the SYSTEM context, so I can’t imagine what is different between our environments.
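
One thing I can still try is reproducing the SYSTEM context interactively, e.g. with Sysinternals PsExec (assuming it’s available), and running the installer from that shell:

rem Launch an interactive cmd.exe running as SYSTEM (requires Sysinternals PsExec,
rem run from an elevated prompt), then run the installer from that shell.
psexec -accepteula -s -i cmd.exe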

An update to my dilemma: I changed the startup script so that it copies the spark_2_7_0.exe installer down to the local PC and then runs the installer silently from there (previously, the installer was run directly from a UNC share). When the Spark 2.7.0 installer is run from the local PC, it works properly. With that said, I’m going to go forward with deploying it in this manner.
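
Concretely, the working script now does roughly this (a sketch; the share path is a placeholder):

rem Copy the installer to the local disk first, then run it from there.
rem \\server\deploy is a placeholder for our software share.
copy /y "\\server\deploy\spark_2_7_0.exe" "%TEMP%\spark_2_7_0.exe"
"%TEMP%\spark_2_7_0.exe" -q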

If you’re trying to run it from a UNC path, then the issue is likely due to a check that cmd.exe performs that prevents running executables from UNC paths. You can disable this in the registry:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Command Processor\DisableUNCCheck

and set it to 1.
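
For example, from an elevated prompt (or the startup script itself):

rem Create (or overwrite) the DisableUNCCheck value as a DWORD set to 1.
reg add "HKLM\SOFTWARE\Microsoft\Command Processor" /v DisableUNCCheck /t REG_DWORD /d 1 /f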

Good catch. Yes, I’m using the shutdown script, and I’m launching the installer from a UNC path. Maybe this is that Win7 thing that marks files downloaded from the Internet as untrusted (with a checkbox somewhere in the file’s Properties). Or, as speedy pointed out, maybe your domain policy is stricter than mine.