Error installing Spark-2.6.3 on Fedora 16

Hello,

I am trying to install spark-2.6.3 on a Fedora 16 machine. I have tried installing from the rpm and I get the following errors:

Error: Package: Spark-2.6.3.12555-1.x86_64 (/spark-2.6.3)

Requires: libodbcinst.so

Error: Package: Spark-2.6.3.12555-1.x86_64 (/spark-2.6.3)

Requires: libodbc.so

I have installed both the unixODBC i686 and x86_64 packages.

I have also tried building spark-2.6.3.src.rpm. At first the build complained about %{SPARK_SOURCE}, so I edited the spec file to hardcode the source path to rpmbuild/SOURCES/spark-2.6.3.12555.tar.gz and to hardcode %{SPARK_VERSION} to 2.6.3.12555. Then the build complained about not finding /usr/apache-ant-1.8.1/bin/ant jar; on Fedora 16, ant is located in /usr/bin, so that line should be /usr/bin/ant jar. The RPM will then build (the exact edits are sketched after the errors below), but installing it still fails with the same errors:

Error: Package: Spark-2.6.3.12555-1.x86_64 (/Spark-2.6.3.12555-1.x86_64)

Requires: libodbcinst.so

Error: Package: Spark-2.6.3.12555-1.x86_64 (/Spark-2.6.3.12555-1.x86_64)

Requires: libodbc.so
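For reference, the spec edits I described above amounted to roughly the following (paraphrased from memory, not an exact diff of the stock spec):

%define SPARK_VERSION 2.6.3.12555
# Source line hardcoded to the local tarball:
Source0: rpmbuild/SOURCES/spark-2.6.3.12555.tar.gz
# build step pointed at the system ant:
/usr/bin/ant jar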

The dependencies are there:

lrwxrwxrwx. 1 root root 20 Jan 6 13:40 /usr/lib/libodbcinst.so -> libodbcinst.so.2.0.0

lrwxrwxrwx. 1 root root 20 Jan 6 13:40 /usr/lib/libodbcinst.so.2 -> libodbcinst.so.2.0.0

lrwxrwxrwx. 1 root root 16 Jan 6 13:40 /usr/lib/libodbc.so -> libodbc.so.2.0.0

lrwxrwxrwx. 1 root root 16 Jan 6 13:40 /usr/lib/libodbc.so.2 -> libodbc.so.2.0.0
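For what it's worth, here is how I am checking which installed package actually provides those bare .so names, in case I'm misreading something:

yum provides '*/libodbc.so'
rpm -q --whatprovides libodbc.so libodbcinst.so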

Does anyone know how to fix this without running rpm with --nodeps?

Thanks,

Patrick

I’m running CentOS 6.2 - same thing. RHEL 6 was the same. I had to download the tarball and unzip it. There’s a script in the “resources” folder called “startup.sh” - had to chmod 755 to make it executable, then it runs no problem.
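Roughly the steps, from memory (substitute whatever your tarball is actually called):

tar -xzf spark_2_6_3.tar.gz
cd Spark/resources
chmod 755 startup.sh
./startup.sh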

Hope that helps!

Justin

I am having the same problem trying to install Spark in Fedora 17. Searching the Spark issue tracker, apparently no tickets have been opened for this. It’s not encouraging to see the issue get left untended…

I tried running with --skip-broken but it does not look like Spark was successfully installed since it got skipped.

sudo yum install spark-2.6.3.rpm --skip-broken

Loaded plugins: langpacks, presto, refresh-packagekit

Examining spark-2.6.3.rpm: Spark-2.6.3.12555-1.x86_64

Marking spark-2.6.3.rpm to be installed

Resolving Dependencies

--> Running transaction check

---> Package Spark.x86_64 0:2.6.3.12555-1 will be installed

--> Processing Dependency: libX11.so.6 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libXext.so.6 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libXi.so.6 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libXp.so.6 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libXtst.so.6 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libasound.so.2 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libasound.so.2(ALSA_0.9) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libc.so.6 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libc.so.6(GLIBC_2.0) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libc.so.6(GLIBC_2.1) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libc.so.6(GLIBC_2.1.2) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libc.so.6(GLIBC_2.1.3) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libc.so.6(GLIBC_2.2) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libc.so.6(GLIBC_2.2.4) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libdl.so.2 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libdl.so.2(GLIBC_2.0) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libdl.so.2(GLIBC_2.1) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libgcc_s.so.1 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libgcc_s.so.1(GCC_3.0) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libm.so.6 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libm.so.6(GLIBC_2.0) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libnsl.so.1 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libodbc.so for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libodbcinst.so for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libpthread.so.0 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libpthread.so.0(GLIBC_2.0) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libpthread.so.0(GLIBC_2.1) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libpthread.so.0(GLIBC_2.2) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libpthread.so.0(GLIBC_2.2.3) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libstdc++.so.5 for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libstdc++.so.5(CXXABI_1.2) for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libstdc++.so.5(GLIBCPP_3.2) for package: Spark-2.6.3.12555-1.x86_64

--> Running transaction check

---> Package Spark.x86_64 0:2.6.3.12555-1 will be installed

--> Processing Dependency: libodbc.so for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libodbcinst.so for package: Spark-2.6.3.12555-1.x86_64

---> Package alsa-lib.i686 0:1.0.25-3.fc17 will be installed

---> Package compat-libstdc++-33.i686 0:3.2.3-68.3 will be installed

---> Package glibc.i686 0:2.15-37.fc17 will be installed

--> Processing Dependency: libfreebl3.so(NSSRAWHASH_3.12.3) for package: glibc-2.15-37.fc17.i686

--> Processing Dependency: libfreebl3.so for package: glibc-2.15-37.fc17.i686

---> Package libX11.i686 0:1.4.99.901-2.fc17 will be installed

--> Processing Dependency: libxcb.so.1 for package: libX11-1.4.99.901-2.fc17.i686

---> Package libXext.i686 0:1.3.1-1.fc17 will be installed

---> Package libXi.i686 0:1.6.1-1.fc17 will be installed

---> Package libXp.i686 0:1.0.0-17.fc17 will be installed

--> Processing Dependency: libXau.so.6 for package: libXp-1.0.0-17.fc17.i686

---> Package libXtst.i686 0:1.2.0-3.fc17 will be installed

---> Package libgcc.i686 0:4.7.0-5.fc17 will be installed

--> Running transaction check

---> Package Spark.x86_64 0:2.6.3.12555-1 will be installed

--> Processing Dependency: libodbc.so for package: Spark-2.6.3.12555-1.x86_64

--> Processing Dependency: libodbcinst.so for package: Spark-2.6.3.12555-1.x86_64

---> Package libXau.i686 0:1.0.6-3.fc17 will be installed

---> Package libxcb.i686 0:1.8.1-1.fc17 will be installed

---> Package nss-softokn-freebl.i686 0:3.13.4-2.fc17 will be installed

Packages skipped because of dependency problems:

Spark-2.6.3.12555-1.x86_64 from /spark-2.6.3

alsa-lib-1.0.25-3.fc17.i686 from fedora

compat-libstdc++-33-3.2.3-68.3.i686 from fedora

glibc-2.15-37.fc17.i686 from fedora

libX11-1.4.99.901-2.fc17.i686 from fedora

libXau-1.0.6-3.fc17.i686 from fedora

libXext-1.3.1-1.fc17.i686 from fedora

libXi-1.6.1-1.fc17.i686 from fedora

libXp-1.0.0-17.fc17.i686 from fedora

libXtst-1.2.0-3.fc17.i686 from fedora

libgcc-4.7.0-5.fc17.i686 from fedora

libxcb-1.8.1-1.fc17.i686 from updates

nss-softokn-freebl-3.13.4-2.fc17.i686 from fedora

Just use the tar.gz and extract it to a directory…

James is right, just execute it from the unzipped directory. You may have to chmod it.

Just a side note: I’ve never gotten the RPM to work, so I just gave up on it… it may have been compiled against an older kernel or Red Hat version, or with some special dependency… who knows… and really, who cares, if the tar.gz works perfectly.

You could even make a simple shell script to launch it when you log into Fedora or whatever, just to simplify things (and keep you from having to go to the command line or dig for the directory). You may also be able to make a shortcut and stick that on your desktop if you’d rather double-click it to start it instead of an autorun thing… your choice.
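For example, something like this saved as ~/bin/spark.sh and made executable (the path is just an example; point it at wherever you extracted Spark):

#!/bin/sh
# jump into the Spark install directory and start it
cd "$HOME/apps/Spark" || exit 1
exec ./Spark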

Thanks for the responses. I’ll try the tar.gz.

If the RPM is not working, why not remove it to prevent others from having to waste time with it?

It may still work for some people, I suppose… I don’t know the real answer, to tell the truth. RPMs are really a convenience for Red Hat and its clones (CentOS, Fedora, etc.); from my understanding it does basically the same thing as just extracting the tar.gz, but is more tailored to the system… anyway, just roll with the tar.gz and life should be good.

Actually… I haven’t tried this… but maybe try installing the Developer Tools on your Fedora or whatever distro you’re running, then try the RPM again?

I think it’s:

sudo yum group-install “Developer Tools” -y

and that should work. As a heads up, the -y will push the install through without a confirmation prompt. The Developer Tools group has a lot in it… so it will be a large install, and it may not be best for a public-facing server… but if it’s just your personal machine, no worries.

Are there installation instructions somewhere for the tar.gz? I didn’t see any in the Documentation directory, or on the website. I find myself doing a lot of guessing. Here is what I have stumbled across so far:

I tried the tar.gz, then tried to run Spark (just guessing that was the binary name, since I didn’t see it documented anywhere).

----------------------------------- Begin Terminal Snippet ------------------------------------------------------

[bradallen@zeltlinux06 Spark]$ ./Spark

Preparing JRE …

./Spark: bin/unpack200: /lib/ld-linux.so.2: bad ELF interpreter: No such file or directory

Error unpacking jar files. Aborting.

You might need administrative priviledges for this operation.

---------------------------------- End Terminal Snippet ---------------------------------------------------------

A little Google searching turned up that installing glibc.i686 might resolve this problem, since I am running a 64-bit OS. That did help, but it got me to the next problem.
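For anyone else who hits that, the fix boiled down to:

sudo yum install glibc.i686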

----------------------------------- Begin Terminal Snippet ------------------------------------------------------

[bradallen@zeltlinux06 Spark]$ ./Spark

Preparing JRE …

testing JVM in /home/bradallen/apps/Spark/jre …

ls: cannot access /home/bradallen/apps/Spark/lib/windows: No such file or directory

[bradallen@zeltlinux06 Spark]$

----------------------------------- End Terminal Snippet ---------------------------------------------------------

Ok, this is turning into a tarpit. I need to get back to work.

Do an ls -a in your home directory and see if there’s a directory called .spark; if there is, rm -r .spark and try again.

Is your shell able to find your jvm? Type java --version and make sure it comes back with an affirmative response.

Removing ~/.spark did not prevent the error message:

[bradallen@zeltlinux06 Spark]$ ./Spark

ls: cannot access /home/bradallen/apps/Spark/lib/windows: No such file or directory

For java --version, here is what I got:

[bradallen@zeltlinux06 Spark]$ java --version

Unrecognized option: --version

Error: Could not create the Java Virtual Machine.

Error: A fatal exception has occurred. Program will exit.

[bradallen@zeltlinux06 Spark]$ which java

/usr/bin/java

[bradallen@zeltlinux06 Spark]$

The /home/username/apps/Spark/lib/windows “no such file or directory” error may very well be caused by a lack of user permission to run the file. You must chmod the Spark directory to allow your user to run or access those files.

Right now you definitely are having user permission problems. I can see from your shell that you have the “$”, which is a non-root shell prompt… either “sudo su” or just “su” into root and run it; if you can’t, then try to chmod it.

sudo chmod 777 Spark

That should give everyone all permissions on that folder; not the best, but it will help diagnose the problem.

For a “production” system I would do this after getting Spark to work:

sudo chmod 755 Spark

then do

chown bradallen:bradallen Spark

The chown should set the Spark directory to be “owned” by your username and your username’s group. Of course, make sure you are in the directory right above the Spark directory… so do this before any chmoding:

cd /home/bradallen/apps

then do the sudo chmoding and chowning

The chmod 777 should absolutely get it to work, since that makes everyone able to edit that stuff… however, having Spark inside your home directory may cause a problem, since the home directory is locked down a bit.

If all else fails, extract Spark somewhere else, such as /usr or /var, so that it’s accessible by anyone on the system, then make sure to chmod it so your user can access it.
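Putting those steps together (with -R so it covers everything under the directory; adjust the paths to your own setup):

cd /home/bradallen/apps
sudo chown -R bradallen:bradallen Spark
sudo chmod -R 755 Spark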

Whoops! Just saw your most recent post.

You don’t have Java installed!!!

Spark is based on Java, and needs Java installed to work.

do this:

sudo yum install java -y

and you should be set.

This will install OpenJDK, which isn’t necessarily the best… but it should get the job done. The Sun/Oracle JVM is better IMHO, but hey, I do a little Java development myself, so I’m a little biased! lol

PS: if this is your personal machine, or a work machine that is yours to use, AND NOT A SERVER!, then I would recommend installing the Developer Tools anyway… that would have installed the Java JVM and JDK for you already. You probably don’t need the JDK (it’s for developing Java programs), but the group install includes a ton of compilers and other tools that will let you build and run most source applications successfully. After reviewing the errors in your original post, it seems to me your system is missing a lot of dependencies that would have been pulled in along with it anyway… such as most of those “lib” packages, etc.

Do this:

sudo yum groupinstall "Development Tools" -y

My original command syntax was incorrect… it’s “groupinstall”, not “group-install”, and it’s “Development Tools”, not “Developer Tools”…

I did install the developer tools, which I would like to have on general principles anyway. I had not been aware of the “groupinstall” option; thanks for the tip!

Regarding Java, that was already installed, but --version is not a valid option.

--------------------------------- Begin Terminal Transcript -------------------------------

[bradallen@zeltlinux06 ~]$ sudo yum install java

[sudo] password for bradallen:

Loaded plugins: langpacks, presto, refresh-packagekit

Package 1:java-1.7.0-openjdk-1.7.0.3-2.2.1.fc17.8.x86_64 already installed and latest version

Nothing to do

[bradallen@zeltlinux06 ~]$ java --version

Unrecognized option: --version

Error: Could not create the Java Virtual Machine.

Error: A fatal exception has occurred. Program will exit.

------------------------------------- End Terminal Transcript ----------------------------
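(Side note: with this OpenJDK 7 build the version flag takes a single dash, so the check that actually works is:

java -version

The double-dash form only exists in newer JDKs.)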

I did all the necessary chowning and chmoding using -R to make sure to hit all the files in the Spark directory. However that did not address the problem.

The error message says:

ls: cannot access /home/bradallen/apps/Spark/lib/windows: No such file or directory

In reality that directory path does not exist. Instead, there is a windows64 directory in that path containing a civil.dll file.

I had the same thing happen last week on one of my machines. Believe it or not, I restarted the machine and then ran the Spark script and it worked.

I’m running Ubuntu 12.04 so it may be different, but in the past on RHEL based Linuxes I’ve had to use the “starter” script in the main Spark directory or “startup.sh” in the resources subdirectory.

I get the same error message you did above (ls: cannot access /home/bradallen/apps/Spark/lib/windows: No such file or directory) when I run the “Spark” script but then Spark loads.

Maybe run pgrep Spark and see if it launched in the background?
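Concretely, something like this (adjust the install path to wherever you extracted it):

cd ~/apps/Spark && ./starter
# or:
cd ~/apps/Spark/resources && ./startup.sh
# then check whether it actually came up in the background:
pgrep -l Spark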

do a

./

while in the Spark directory, then hit TAB a few times to see what it automatically wants to populate it with… it should pick up either the starter.sh script or the Spark script, as Justin suggested… might be worth a shot.

Try this too, to set the Java environment variables, if a reboot doesn’t work:

http://glassonionblog.wordpress.com/2008/07/21/linux-setting-java_home-for-a-single-user-and-all-users/
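The gist of that link, for a single user, is adding a couple of lines like these to ~/.bashrc (the exact JVM path will differ; check what is under /usr/lib/jvm on your box):

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk.x86_64
export PATH=$JAVA_HOME/bin:$PATH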

Also… try to run as root for a minute and just see if that helps or not. That will tell us whether this is some kind of user permission problem…

Just do a

su

and enter the password. Then try to do

java --version

again, as well as running Spark. Sometimes, if the program (Java) was installed under another user, it may not work under yours (a permission thing).

PS: I believe it creates that “windows” directory when it runs successfully the first time… so it smells like a permissions issue coupled with a possible Java environment variable issue. So, really, try running as root or with sudo if you can, just to help diagnose.

Did you get this working, Brad?