Downgrading Java on Ubuntu 16.x 
Just a quick note to let everyone know that, after installing LTS 16.04 (so far a very pleasant & stable release - recommended!), I discovered that the default JDK ("9-internal") does not work.

java -version
openjdk version "9-internal"

The failures are so bad that the JVM actually dumps core!

(Forgive the imp, yet note the bug -Halloween is around the corner!)

Sadly, removing the "9-internal" packages by conventional means results in only a partial removal... very nasty. What does work, however, is removing everything via a hand grenade:

sudo apt remove --purge "^openjdk.*"

Followed by a dependency purge:
sudo apt-get autoremove

Then re-install a version that actually works:
sudo apt-get install openjdk-8-jre

Ultimately, once we have what we need installed, don't forget to associate the new JRE with .jar files - right-click a .jar file and set its default "Open With" application - or set the system-wide default from the shell via update-alternatives --config java.

In addition to those good ol' reliable versions from Oracle (I like to test under many JVMs), here is the OpenJDK that worked for me on LTS 16.04.1:
openjdk version "1.8.0_91"
OpenJDK Runtime Environment (build 1.8.0_91-8u91-b14-3ubuntu1~16.04.1-b14)
OpenJDK 64-Bit Server VM (build 25.91-b14, mixed mode)

Sharing is caring!


JarOMine: Locate + Search Java and / or ZIP Files FAST! 
Just a quick note to let everyone know that I have written a reasonable first response to my recent carping about not being able to locate a file inside an archive.

I cobbled this puppy - known as JarOMine - together just now so as to make mining thousands of Java .JAR files, as well as just as many classic .ZIP files, a whole lot easier.
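For anyone who wants the flavor of it before downloading, here is a minimal plain-shell sketch of the same idea (the function names below are mine, not JarOMine's): walk a tree, list each .jar / .zip, and report every archive whose table of contents matches a pattern.

```shell
# list_zip FILE: print an archive's table of contents, preferring
# unzip(1) but falling back to Python's zipfile module when unzip
# is not installed.
list_zip() {
    if command -v unzip >/dev/null 2>&1; then
        unzip -l "$1"
    else
        python3 -m zipfile -l "$1"
    fi
}

# find_in_archives PATTERN ROOT: print every .jar or .zip under ROOT
# whose listing matches PATTERN (a plain grep pattern).
find_in_archives() {
    pattern="$1"; root="${2:-.}"
    find "$root" -type f \( -name '*.jar' -o -name '*.zip' \) |
    while read -r archive; do
        if list_zip "$archive" 2>/dev/null | grep -q "$pattern"; then
            printf '%s\n' "$archive"
        fi
    done
}

# Example: find_in_archives 'StringUtils' ~/lib
```

Slow next to an indexed tool, of course, but handy in a pinch on a box where nothing extra can be installed.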

Enjoy the Journey!

Hadoop: Outloading Support for User-Defined Functions in Java 
So there we are, trying to locate the Java-Archive (JAR) files required for creating a User-Defined Function (UDF) for Hive. (Pig works much the same.)

Perhaps one is even using the Hortonworks Sandbox. No matter how we access Hadoop's Distributed File System (HDFS), however, many have broken a sword or three trying to locate the proper jar-file support.

Shell Games...

One of the reasons for the breakage is that we folks in the Java world will insist on packaging classes in different archives over time. A worthy rant in itself: the sad truth is that, while the classes and packages remain the same, we Java folks always need to scramble things about so as to make them logically easier to find.

Speaking as someone who has been known to migrate a class between archives from time to time, perhaps a JSR-someone needs to create what might be a simple meta-tracking system - a packaging idea well outside of git, and perhaps a tag closer to @Deprecated.

The need is to help us keep our sanity when playing the class shell game.... but I digress!


Outloading...

Yes, I needed a new term, so I coined one: unlike "uploading" or "downloading," the idea here is that we need to post something from INSIDE the system so as to make it available from OUTside of the same... so we can downLOAD it! (Remember the term "DLL Hell"? I needed a new term when writing for BYTE Magazine in 1992, as well! =)

While the Internet has made locating properly-named jar files a lot easier now than it was back in 1992... it still ain't easy. Because the exact same package and class can migrate between archives so much in Java, maybe we should call the little "shell game" we all must eventually play "Jar Hell"?

So whilst we have gone from "hadoop-core" to "hadoop-common", and from HUE (HDP 2) to Ambari (HDP 2.4), after the requisite Internet search (sigh) I have discovered that the best way to locate the JAR we need for creating UDFs - it is simply to be found nowhere else! - is to use a simple find command on the target platform itself.


On the present incarnation of the Hortonworks VM (again, the truly marvelous HDP 2.4 - gotta love Ambari!), here is how to locate and post the archive(s) we need for creating UDFs to the HDFS:

find / -type f -name "hive-exec*core.jar" -exec hadoop fs -put "{}" hdfs:/hadoop9000 ";"

(Please note that I had obviously created /hadoop9000 in the HDFS beforehand...) Note also that the default incarnation of the fs -put command will not replace an existing file; newer Hadoop releases accept -put -f to force an overwrite.
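Since pointing find / -exec at a live cluster is a blunt instrument, it can be worth rehearsing the same pattern locally first. Here is a hedged sketch - the function name and staging path below are my own stand-ins, with plain cp playing the role of hadoop fs -put:

```shell
# outload_jars PATTERN SRC DEST: a local dry run of the find/-exec trick.
# Copies every file under SRC whose name matches PATTERN into DEST; on
# the Sandbox itself, swap `cp "{}" "$dest"` for
# `hadoop fs -put "{}" hdfs:/hadoop9000` to perform the real outload.
outload_jars() {
    pattern="$1"; src="$2"; dest="$3"
    mkdir -p "$dest"
    find "$src" -type f -name "$pattern" -exec cp "{}" "$dest" ";"
}

# Example: outload_jars 'hive-exec*core.jar' / /tmp/udf-staging
```

Once the dry run turns up the expected jar(s), the real command at the top of this section does the posting.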

A similar exercise will also work (if required) so as to outload "hadoop-common" (or hadoop-core!) to the HDFS.

Enjoy the Journey!


