Server Side Python3: Simple CGI Dump 
This holiday's project saw us hacking away on AWS so as to serve up some hot & spicy LAMPy.

Submitted for your approval, here is a little something you might also need to know:

#!/usr/bin/python3
import os
from urllib import parse
import cgitb

cgitb.enable()
print("Content-Type: text/html;charset=utf-8")
print()

# The CGI / server variables to dump, each with a short description:
values = {
    "DOCUMENT_ROOT"   : "Server's root directory",
    "HTTPS"           : "'on' if using HTTPS",
    "HTTP_COOKIE"     : "Visitor's cookie, if set",
    "HTTP_HOST"       : "Host origin / URL",
    "HTTP_REFERER"    : "URL of the calling page",
    "HTTP_USER_AGENT" : "Visitor's browser type",
    "PATH"            : "Server system path",
    "QUERY_STRING"    : "HTTP GET query string",
    "REMOTE_ADDR"     : "Visitor IP address",
    "REMOTE_HOST"     : "Visitor host name / IP address",
    "REMOTE_PORT"     : "Visitor web-server port",
    "REMOTE_USER"     : "Visitor's username (for .htaccess-protected pages)",
    "REQUEST_METHOD"  : "GET or POST operation",
    "REQUEST_URI"     : "The requested pathname (relative to the document root)",
    "SCRIPT_FILENAME" : "The full CGI pathname",
    "SCRIPT_NAME"     : "The parsed pathname of the current CGI (document-root relative)",
    "SERVER_ADMIN"    : "The webmaster's email address",
    "SERVER_NAME"     : "The server's fully qualified domain name (e.g. www.soft9000.com)",
    "SERVER_PORT"     : "The port number for the request",
    "SERVER_SOFTWARE" : "Server & software (e.g. Apache/2.4.18 (Ubuntu))"
}

# Print each variable's value, or "undefined" when the server did not set it:
for ref in values:
    try:
        print(ref, ": ", end='')
        print(os.environ[ref])
    except KeyError:
        print("undefined")

Designed to show off the complete set of CGI variables used to bind just about everything into the server environment, here is what the above yielded on my AWS cloud today:
SERVER_NAME : ec2-00-00-00-00.compute-1.amazonaws.com
SCRIPT_NAME : /CgiValues.py
REMOTE_USER : undefined
PATH : /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
REQUEST_URI : /CgiValues.py?foo=bar&ned=pooyl
HTTP_COOKIE : undefined
HTTP_HOST : ec2-00-00-00-00.compute-1.amazonaws.com
SERVER_SOFTWARE : Apache/2.4.18 (Ubuntu)
HTTP_USER_AGENT : Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:50.0) Gecko/20100101 Firefox/50.0
REMOTE_ADDR : 65.00.00.00
HTTPS : undefined
SERVER_ADMIN : webmaster@localhost
DOCUMENT_ROOT : /var/www/html
HTTP_REFERER : undefined
REMOTE_HOST : undefined
QUERY_STRING : foo=bar&ned=pooyl
SERVER_PORT : 80
REQUEST_METHOD : GET
SCRIPT_FILENAME : /var/www/html/CgiValues.py
REMOTE_PORT : 59860

I (obviously) changed the IP address to mostly 00's to keep it safe from the planet's self-accursed, denial-of-service nut-cases.
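As an aside, the script imports urllib's parse module but never actually uses it. Should you care to put that import to work, here is a minimal sketch - the variable names are merely illustrative - of how to split the QUERY_STRING shown above into a dictionary:

#!/usr/bin/python3
# Sketch only: break QUERY_STRING (e.g. "foo=bar&ned=pooyl") into a dict.
import os
from urllib import parse

query = os.environ.get("QUERY_STRING", "")
params = parse.parse_qs(query)   # e.g. {'foo': ['bar'], 'ned': ['pooyl']}
for key, vals in params.items():
    print(key, "=", ", ".join(vals))

Folded into the dump above (after the Content-Type header, of course), those same few lines will echo whatever name / value pairs the visitor tacks onto the URL.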


Cheers,

-Rn




Python3 and Apache - Who's there yet? 
If you are a Python 2.7 developer who is putting off learning Python 3, you are not alone. While the learning curve is gentle enough, there are enough internationalization & coding-convention changes to send a lot of code back into the R&D queue...

When it came to discovering who - at the time of this telling - had already bitten that hard-to-swallow conversion bullet, I was surprised to discover that Ubuntu Server currently has absolutely THE BEST, most complete default support for Python 3 out of the box! (I made the mistake of uninstalling Python 3 on a 16.04 Desktop a few months back & almost ruined my weekend. =)

LAMP-Py?


Unlike the Pythonic update itself however, there will probably never be a need to install mod_python - by default - on any Apache server.



Yet - while ever ready to use the classical L.A.M.PHP stack - updating Apache2 to favor the L.A.M.Python (LAMPPy = "Lamp-eye"?) stack on Ubuntu ain't all that tough.

Enabling Python


Since Python3 is already installed on Ubuntu Server (16.04 remains the LTS vogue for 2017), we need only do the following:

(1) A multi-threaded MPM shall require:

sudo a2enmod cgi


(2) Creating a cgi-bin for the default site will require:

mkdir /var/www/cgi-bin


(3) Updates to /etc/apache2/apache2.conf shall require:

<Directory /var/www/cgi-bin>
    Options ExecCGI
    SetHandler cgi-script
</Directory>


(4) Updates to each .py folder definition in apache2.conf shall require:

<Directory /var/www/html>
    Options +ExecCGI
    AddHandler cgi-script .py
</Directory>


(5) After saving the above changes to apache2.conf, your first Python CGI program (/var/www/html/index.py?) should look something like:

#!/usr/bin/python3
import cgitb
cgitb.enable()
print("Content-Type: text/html;charset=utf-8")
print() # Yes Veronica - an empty newline IS required! =)
print("Hello Web-World!")
... and don't forget the chmod...

(6) Since we are talking R&D here, just reboot the box to get things going again. (When we are the only user & the IP endpoint is glued down, a reboot seldom hurts ;)

Lacking the joys of ye olde R&D mode, then:

sudo service apache2 restart


(7) After Apache2 has restarted, browsing to the server's http://IP-address/index.py will show off your "Hello" in relatively short order. (A slightly fuller sketch follows the AWS notes below.)

(*) BTW - We should note that Ubuntu Server is also supported on AWS ... I am renting my latest R&D site for under $10 / month. Yours could be free for a year...

If you are planning on using AWS, note also that - unlike the official distro - while Python 3 is present, Apache2 is not installed. --Everything save Port 22 shall also be tightly locked down... so 'ya can't even ping your server until you update the associated inbound AWS security group =)
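Once the plumbing above is in place, a reasonable next step is reading GET (or POST) parameters. Here is a minimal sketch - the file name (hello2.py) and the "name" field are merely my own inventions - using the standard library's cgi.FieldStorage:

#!/usr/bin/python3
# Sketch only: greet the visitor using a GET / POST parameter.
# (Again - don't forget the chmod; Apache must be able to execute the file.)
import cgi
import cgitb
from html import escape

cgitb.enable()
print("Content-Type: text/html;charset=utf-8")
print()   # the blank line that ends the CGI headers

form = cgi.FieldStorage()                   # parses QUERY_STRING or POST data
name = form.getfirst("name", "Web-World")   # "name" is a made-up field
print("<html><body>")
print("<p>Hello, {}!</p>".format(escape(name)))
print("</body></html>")

Browsing to http://IP-address/hello2.py?name=Veronica should then greet her by name.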


Sharing is Caring!

-Rn


(p.s. If you get stuck, the default location for your error.log is /var/log/apache2.)


Hadoop: Outloading Support for User-Defined Functions in Java 
So there we are, trying to locate the Java-Archive (JAR) files required for creating a User-Defined Function (UDF) for Hive. (Pig works much the same.)

Perhaps one is even using the Hortonworks Sandbox. No matter how we access Hadoop's Distributed File System (HDFS) however, many have broken a sword or three trying to locate the proper jar-file support.

Shell Games...


One of the reasons for the breakage is that we folks in the Java world will insist on packaging classes in different archives over time. -A worthy rant in itself, the sad truth is that - while classes and packages will remain the same - we Java folks always need to scramble things about so as to make them logically easier to find.

Speaking as someone who has been known to migrate a class between archives from time to time, perhaps a JSR-someone needs to create what might be a simple meta-tracking system - a packaging idea well outside of git, and perhaps a tag closer to @Deprecated.



The need is to help us keep our sanity when playing the class shell game.... but I digress!

Outloading?


Yes, I needed a new term, so I coined one: Unlike "uploading" or "downloading," the idea here is that we need to post something from INSIDE the system, so as to be available from OUTside of the same... so we can downLOAD it! (Remember the term "DLL Hell?" -I needed a new term when writing for BYTE Magazine in 1992, as well! =)

While the Internet has worked so as to make locating properly-named jar files a lot easier now than it was back in 1992... it still ain't easy. Because the exact same package and class can migrate between archives so much in Java, maybe we should call the little "shell game" we all must eventually play "Jar Hell"?

So whilst we have gone from "hadoop-core" to "hadoop-common" across HUE (HDP 2) and Ambari (HDP 2.4), after the requisite Internet search (sigh) I have discovered that the best way to locate the JAR we need would be to use a simple find command on the target platform itself. --The JAR we need to create UDFs is simply to be found nowhere else!

Punt!


On the present incarnation of the Hortonworks VM (again, the truly marvelous HDP 2.4 - Gotta love Ambari!) here is how to locate + post the archive(s) we need to create UDFs ... to the HDFS:

find / -type f -name "hive-exec*core.jar" -exec hadoop fs -put "{}" hdfs:/hadoop9000 ";"

(Please note that I had obviously created /hadoop9000 in the HDFS beforehand...) Note also that, by default, the fs -put command will not replace an existing file.


A similar exercise will also work (if required) so as to outload "hadoop-common" (or hadoop-core!) via HDFS.



Enjoy the Journey!

-Rn



