Dropping that firewall: CentOS & RedHat! 
When you are working on your own cluster - behind your own firewall - then having ANY type of port security on an on-demand server is just plain ol' annoying.

Here is how to allow access to ALL of those ports on your R&D server, wherever iptables is pestering you:

su
service iptables save
service iptables stop
chkconfig iptables off

Now Tomcat on 8080, plain HTTP on 80, and (etc., etc.) will respond the way most software developers expect.


Sharing is caring,

-Rn


P.S.: Ubuntu Tomcat users can do this.

The HTML Sandwich 
This morning I decided to take one of my open-source projects to the next level. By moving it to the web, not only would I add spell checking, but better cross-platform & mobile options as well.



The 1-Hour Tech Assessment


At the time of this writing I had been gigging for a few years on a single set of technologies. Reviewing the web-side implementation options available for a new server-side plunge, as usual there were far too many choices: PHP CRUD, ASP.NET, GWT, signed Java Applets, JavaServer Pages (JSP), JavaServer Faces (JSF), etc., etc...

--Such is the dilemma when you know & adore so many technologies!

Full-time Job


In the case of this particular project, however, after considering the rate of disruption in everything I knew, I decided to simply use a Servlet.

Why did I select such a basic server-side technology set?

I chose Java because both PHP and .NET seem to forever threaten to leave our code legacies behind. I thereafter decided upon Servlets because, while they are the foundation of virtually everything on the Java web, I likewise tire of the sheer rate of disruption in just about everything built atop Java Servlet technology. Unlike all things served (Server-ed?) up as Servlet a-la-mode, those good ol' HttpServletRequest & HttpServletResponse wrappers have not changed much since their original incarnation.
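
To make that concrete, here is a minimal sketch of that stable foundation - nothing beyond the standard javax.servlet API, with the class name and the markup it emits being purely illustrative:

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// An illustrative servlet - the class name and the HTML it writes are placeholders.
public class HelloServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // The same request / response pairing we have been coding to for well over a decade.
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<html><body><h1>Served the old-fashioned way.</h1></body></html>");
    }
}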

Lamentations


There are also far too many closed, self-obsoleting standards out there today. After a few years everyone gets tired of trudging along on any cold, hard forced march: those bleak, book-biting ascensions up the "upgrade to the latest NOW - that last greatest thing we did was wrong" mountain pass.



Decades after their "me too" début, has anyone ever heard of SNOBOL or RATFOR?

(-Yea: Ruby-on-Rails is great. Gotcha. (I loved Smalltalk. -Learned Perl, too.))

Sabo


Now don't get me wrong: If you have a full-time IT staff, then ASP.NET, JSTL, JSF, (etc., etc.) are great. Yet for 90% of the planet, our site traffic is well under ~250 visitors per second. Inasmuch as that traffic stream is well within the capability of your typical rented server, if you don't want to spend days catching up on the latest comparative innovations du jour, then Servlets will do just fine. (Imagine a Qui-Gon Jinn attempt to use a mind trick on Watto...)

Yes.... Just fine.

Indeed, as an army of one, like most low-budget entrepreneurs I simply want to do things once - then get back to doing other things. We can always do it one better if folks like it.

Simple Templates


So the problem arises: Given that one may frequently want to change the style of any given web site, how does one do so without tag libraries, code injectors, faces, adaptors, POJOs, entities, and an eager staff of software developers?

At the end of the day, what is the absolute easiest way to skin a site?

... how 'bout an HTML Sandwich?


package com.soft9000.html;

import java.io.File;

/**
 * Wraps dynamic content between the "top" and "bottom" of an external
 * HTML template file.
 *
 * @author profnagy
 */
public class HtmlTemplate {

    static final String sTEMPLATE_DELIM = ".template.";
    static final String sTEMPLATE_FILE = ".template.txt";

    private File file;
    private String delim;

    private String prefix = null;
    private String suffix = null;

    public HtmlTemplate() {
        this(new File(sTEMPLATE_FILE), sTEMPLATE_DELIM);
    }

    public HtmlTemplate(File file) {
        this(file, sTEMPLATE_DELIM);
    }

    public HtmlTemplate(File file, String sDelimiter) {
        this.file = file;
        this.delim = sDelimiter;
    }

    public boolean isNull() {
        return (file == null || delim == null);
    }

    /** Appends the template's prefix, the caller's content, then the template's suffix. */
    public void getHtml(StringBuilder sb, String content) {
        if (sb == null) {
            return;
        }
        // Lazily split the template file on first use.
        if (prefix == null && split() == false) {
            return;
        }
        sb.append(prefix);
        sb.append(content);
        sb.append(suffix);
    }

    /** Reads the template file and splits it into the prefix & suffix around the token. */
    private boolean split() {
        if (isNull()) {
            return false;
        }
        String str = com.soft9000.file.TextReaderF.Read(file);
        String[] set = com.soft9000.Text.Split(str, this.delim);
        if (set.length != 2) {
            return false;
        }
        this.prefix = set[0];
        this.suffix = set[1];
        return true;
    }
}


Simple, eh?

Rube


Inspired by PHP CRUD's "template21", the way the above class works is easy to understand: by allowing us to specify the name of an external template file, we simply use the "top" and "bottom" of that file as a wrapper around whatever content we hand to getHtml().

So design a web page for your entire site, or a subsection thereof. Use graphics, Google cookies, style sheets - just the way you want it!

Next, simply put a unique token where you want the dynamic content to be inserted and voila! - you will have just what you need to create an "HTML Sandwich Web Site."
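
For instance - and the file name, class name, and markup here are purely illustrative - a hypothetical site.template.txt might look like this:

<html>
  <head><link rel="stylesheet" href="site.css"/></head>
  <body>
    <div id="banner">My Wonderful Site</div>
.template.
    <div id="footer">Copyright, me.</div>
  </body>
</html>

Wiring it into a Servlet is then a one-liner or three. A quick sketch, assuming the com.soft9000 support classes are on the classpath:

import java.io.File;
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.soft9000.html.HtmlTemplate;

public class SandwichServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // The template file name is a placeholder - point it at your own skin.
        HtmlTemplate template = new HtmlTemplate(new File("site.template.txt"));
        StringBuilder page = new StringBuilder();
        // Everything above the ".template." token, then our content, then everything below it.
        template.getHtml(page, "<p>Today's dynamic content goes here.</p>");
        response.setContentType("text/html");
        response.getWriter().println(page.toString());
    }
}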



Of course, that swap-out could also be for more than a single unique token. Perhaps a triple, or n-decker, sandwich? Different tokens for different types of content... a 'dagwood? (Yet clearly with a tad more bread-foundation, and a lot less baloney (pun intended.))
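
If the dagwood appeals, one hedged way to slice it - the class below and its token names are my own illustration, not part of the project - is to swap each named token for its own filling:

import java.util.Map;

// A hypothetical "dagwood" variant: several named tokens, each replaced
// by its own slice of dynamic content.
public class DagwoodTemplate {

    private final String template;

    public DagwoodTemplate(String template) {
        this.template = template;
    }

    public String getHtml(Map<String, String> fillings) {
        String result = template;
        for (Map.Entry<String, String> slice : fillings.entrySet()) {
            // e.g. replace ".menu." with the menu markup, ".body." with the page body, etc.
            result = result.replace(slice.getKey(), slice.getValue());
        }
        return result;
    }
}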

new BooksRequired == NULL: volatile ReLearningCurve = 0L;

Etc.


Over time, for internationalization, subscription, and / or other reasons, our templates might even be generated dynamically: taken from an SQL CLOB, deduced from competitive web pages, (etc.)
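
For the CLOB case, a minimal sketch might look like the following - note that the loader class, table, and column names are hypothetical, and that it simply spools the CLOB out to a temporary file so the existing File-based HtmlTemplate constructor can stay as-is:

import java.io.File;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import com.soft9000.html.HtmlTemplate;

public class ClobTemplateLoader {

    // Hypothetical table "site_templates" with a CLOB column "template_body".
    public static HtmlTemplate fromDatabase(Connection conn, String siteKey) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT template_body FROM site_templates WHERE site_key = ?")) {
            ps.setString(1, siteKey);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    return null;
                }
                String body = rs.getString("template_body"); // most drivers read a CLOB as a String
                File tmp = File.createTempFile("site", ".template.txt");
                tmp.deleteOnExit();
                try (FileWriter out = new FileWriter(tmp)) {
                    out.write(body);
                }
                return new HtmlTemplate(tmp); // default ".template." delimiter
            }
        }
    }
}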

Indeed, no matter whether we are using Java, PHP, VB.NET, Ruby, C#, or even CGI with C/C++, any language can take a byte of that same sandwich.

The dynamic, server-ed side doesn't get any easier than that!



Sharing is caring,


-Rn


Setting-up Hadoop 2 on Linux in 3 Easy Steps 
When the time came to pick a distro for use by our Hadoop students, because the Hortonworks VM was already using CentOS (love it), I decided to use Ubuntu just to round the student experience out a bit.



Hadoop 2.2.0 is still about Java. If you are thinking production, be sure to use Oracle Java 1.6.

STEP 01: Source Install (OPTIONAL)


Obviously tracking to an LTS version of Ubuntu, when you want to set up the Hadoop source code for spelunking on Linux (first-timers will want to avoid this step!), you will want to do the following:
sudo bash
apt-get update
apt-get upgrade
apt-get install maven
apt-get install git
mkdir /hadoop9000
cd /hadoop9000
git clone git://git.apache.org/hadoop-common.git
cd hadoop-common
mvn install -DskipTests
mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
chown -R guest /hadoop9000   # or whoever
sync
exit

The above will create the latest Hadoop on your machine. To keep your production work moving along on the Hadoop 2 lowest common denominator, also consider using the official binary install.

DEVELOPER NOTE: If you are looking to use your install for software development, note that the above step is not optional. Why? Because the native-mode libraries (as well as the rest of the lot) need to be generated for your R&D platform. Depending upon the version of Linux you have, you may also need to install projects like protocol buffers (etc.) to compile Hadoop's C/C++ JNI underpinnings. Once created, just drop the lot of the 50-or-so libraries into /hadoop9000/lib. Why? Because you will want to use those JARs in your IDE (Eclipse, NetBeans, etc.) from a single, standard location.

STEP 02: Official Binary Install (REQUIRED)


If rebuilding from the source is not what you want to do, then you can simply download & untar the official Hadoop tarball under /hadoop9000.

Note that if using gzip compression is on the radar, then we will need to be sure to provide the proper 32- or 64-bit rendition of Hadoop's native libraries, as well. (Step 01 can build those native libraries for us, too. Use: mvn compile -Pnative )

STEP 03: Hadoop Environment Variables (REQUIRED)


Next, those Hadoop environment variables need to be wired into your .bashrc: (The Java and Hadoop entries are required - the Pig and Hive entries are optional.)
export JAVA_HOME="/usr/lib/jvm/java-6-oracle"
export HADOOP_HOME="/hadoop9000"
export HADOOP_PATH="$HADOOP_HOME/bin:$HADOOP_HOME/sbin"

export PIG_INSTALL="$HADOOP_HOME/pig"
export HIVE_INSTALL="$HADOOP_HOME/hive/hive-0.12.0-bin"
export PATH="$PATH:$HADOOP_PATH:$PIG_INSTALL/bin:$HIVE_INSTALL/bin"
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native/"
# Can also place into hadoop-env.sh
#export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/"
Yes, Oracle Java (/usr/lib/jvm/java-6-oracle) is officially endorsed as the best real-world way to go on Hadoop.

Chasing the Moths


When debugging, do not forget to edit the hadoop-env.sh:
HADOOP_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5123"

This should get you started. If you need to learn more, then consider signing up for our next week of virtual training on Hadoop.

Enjoy,

-Rn




