Don't be a Monkey! 

Another Chimp, Chump?


I do get SO tired of the monkey-shines. (No, this time I will not treat you to yet another politically incorrect pseudo-rant on neo-politico-religious values. Uh-uh.)

What has gotten my goat during lunch today is yet another clueless company. This one, a computer consulting firm, is using a webco that is trying to fool us into giving them our personal information.

And what could be more personal, along with our email address, than our very own computer's IP address? -All just sitting ripe and ready in the web server logs, waiting to be combined with any seemingly anonymous data we might foolishly provide, poised for yet another denial-of-service attack, mindless email marketing campaign, or perhaps even a web-mail-cookie email invasion?

Yawn, yawn, yawn ... !


Since the time of 9600 bps modems, most of us geeky folk have known that, whenever we create an HTML link (aka an "anchor"), we have to supply both the "link" part and the "human-viewable" part.

e.g.:

<a href="http://mygeekgoop.net/magosh.asp?my_code_to_id_you">You only see this part</a>

What is so annoying about the abuse of this seemingly innocent convention today is that whenever you click on 'You only see this part', the server gets that 'my_code_to_id_you' value sent right back to it.

What is the point?


For decades we have all known that if anyone chooses to click on such an emailed link, then by using a unique my_code_to_id_you value we can instantly know that YOUR EMAIL ADDRESS has done so. All I need do is generate a unique number and use it as the URL suffix, as shown above.
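
By way of illustration only, here is a minimal sketch of how such a campaign could mint one link per recipient. The addresses are made up, the token scheme is my own assumption, and the domain and page name are simply borrowed from the example above:

#!/bin/bash
# Purely illustrative sketch: mint one unique tracking token per address.
# The recipient list and the magosh.asp endpoint are examples, not real data.
for email in alice@example.com bob@example.com
do
    # Any unique value will do; a hash of the address is one lazy choice.
    token=$(echo -n "$email" | md5sum | cut -d' ' -f1)
    echo "To: $email -> http://mygeekgoop.net/magosh.asp?$token"
done

When any one of those links is clicked, the web server log pairs the token right back up with the address it was mailed to.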

Booya


What gets me whenever I see things like this, so often today, is: just how stupid do they think we are?



I mean, it is bad enough that they are so DECEITFULLY trying to pull the wool over our eyes. --But pretending that the HUMAN-readable part is just any plain-old URL, and that (who, me?) no attempt will be made to associate your personal information with their innocent little fact-finding exercise... well, that just plain old makes me sick.

Oh well. On a scale of 1 to 10 on the annoyance meter, this is probably a clear five... but, by way of the final insult, guess what the offending webco calls itself?

SURVEY MONKEY!!

Yup - Unreal, no?

So do yourself a favor: Always check your URLs before clicking them. --Don't let the link banana-chasers make yet another monkey out of you!


Blessings,

-R.A. Nagy



Cosmos Screen Saver 

Reminiscence


Ah, for the days of Carl Sagan - when the threat of national socialism, global warming, and removing funding from the arts and sciences seemed to be a problem for a republic far, far away...

While this note has absolutely nothing to do with the late, great cosmic scientist (you know - the one who made billions & billions, only to become affectionately known to Mac developers as "BHA"?), the household conversion from Microsoft Windows to Linux brought such thoughts to mind when passers-by began to comment on this marvelously-named, free screensaver.

NASA


The pictures included with the Cosmos screen-saver are wonderful. So too are the pictures circulated by the National Aeronautics and Space Administration. While bringing both together seems like a natural thing to do, there is simply no documentation on how to do this.

So here it is:

Enjoying Billions & Billions of Pictures


First, consider signing up for the NASA News Services (last seen at the NASA Web Site; note that we can also browse & download stellar pictures directly from there, too).

Save your favorites to a common location. Any folder / directory will do.
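
For example (the folder name and the download address below are placeholders of my own, not actual NASA locations):

mkdir -p ~/Pictures/nasa
wget -P ~/Pictures/nasa 'https://example.com/some_stellar_picture.jpg'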

Next, set the permissions of the Cosmos folder to something like

sudo chmod 777 /usr/share/backgrounds/cosmos

(Did I ever mention that we switched from CentOS to Ubuntu? ...If you are not on a similar distribution, then feel free to drop the sudo.)

Then locate & open the Cosmos configuration file (you might even want to put a link to /usr/share/backgrounds/cosmos on your desktop)

file:///usr/share/backgrounds/cosmos/background-1.xml

(aka /usr/share/backgrounds/cosmos/background-1.xml)

Finally, edit the XML to include the picture(s) you would like to display.

Cosmos XML


While adding too many files would certainly be a painstaking process, here is what a new entry in background-1.xml might look like (apparently these Cosmos slide-shows can be timed - feel free to experiment with creating additional XML files):

<transition>
<duration>5.0</duration>
<from>/usr/share/backgrounds/cosmos/cloud.jpg</from>
<to>/home/profnagy/515307main_PIA13449_full.jpg</to>
</transition>

Also, when adding a new picture to Cosmos, note that we are editing a chain of pictures. To insert the latest cosmic work-of-art, merely (1) locate where in the chain you want your new picture to appear, then (2) add an entry in between (or at the beginning or end of) the Cosmos picture chain.

Once you have given yourself access to the cosmos directory, you can store your pictures there, too. Just dropping your pix there will allow them to be randomly displayed.
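
For example, assuming your favorites were saved to the hypothetical folder used above:

# The source path is wherever you saved your NASA favorites.
cp -v ~/Pictures/nasa/*.jpg /usr/share/backgrounds/cosmos/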

Conclusion


Of course, in addition to snaps of the galaxy's breathtaking beauty, we can add other types of pictures to Cosmos, too. --Just be sure that the pix from your last family reunion are out-of-this-world.


Enjoy!

Incrementally Backing Up Changes to a USB Drive in Linux 

Saving Changes Anywhere?


Whenever we want to back up a Linux file system to a FAT / FAT32 file system, strange things can happen!

The need was to help us incrementally, and recursively, back up file changes made during the past few days to a huge USB drive. -Since FAT partitions do not keep all of that POSIX information, we were getting tired of waiting while absolutely everything was copied over - even things that had not changed.

Unless you want to re-format the FAT partition to use NTFS (or EXT-n), there are two techniques here. The first also creates an on-line archive. The second works a lot like Mr. Bill's xcopy.

Technique 1: Coding It


Because the following script uses tar rather than a plain cp, your file dates and time stamps will be preserved - mtime, every time!


#!/bin/bash

file_name="`basename $0`"
days='7'
file_tar="$HOME/$file_name.tar"
file_tar_list="$HOME/$file_name.found"

source="$1"
dest="$2"

echo 'Started at' `date` ", looking for deltas within $days days"

echo 'Source = '$source', Destination = '$dest
echo 'TAR file is ' $file_tar
echo 'FIND file is ' $file_tar_list

echo ' '
# echo 'Cleanup started at ' `date`
# find "$source" -name '*.class' -print -exec rm '{}' \;
# find "$dest" -name '*.class' -print -exec rm '{}' \;

echo ' '
echo 'Tar-up started at ' `date` ' from ' $source
pushd . > /dev/null
cd "$source" || exit 1
find . -type f -mtime -$days -print > "$file_tar_list"
tar cvf "$file_tar" -T "$file_tar_list"
popd > /dev/null

echo ' '
echo 'Un-tar started at ' `date` ' to ' $dest
tar xvf "$file_tar" -C "$dest"

echo ' '
echo 'Completed at ' `date`



Since I have been writing a lot of Java this decade, I included a way to remove all of those pesky .class files. Just remove my # comments before those first find statements if you have a similar need.
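
With those comments removed, the cleanup pass is simply:

echo 'Cleanup started at ' `date`
find "$source" -name '*.class' -print -exec rm '{}' \;
find "$dest" -name '*.class' -print -exec rm '{}' \;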

Calling It


Once you have copied the above into a file (such as the _io_sync.sh used below), all you need to do next is call it. The _io_sync.sh script expects a pair of folders on its command line:


#!/bin/bash
clear

source='/d_drive'
dest='/media/PATRIOT/d_drive'

file_name="`basename $0`.rpt"

./_io_sync.sh $source $dest | tee $file_name


Designed to be symmetrical, you can even swap the source and destination folders around with impunity:


#!/bin/bash
clear

dest='/d_drive'
source='/media/PATRIOT/d_drive'

file_name="`basename $0`.rpt"

./_io_sync.sh $source $dest | tee $file_name


Of course, it makes no difference whether we use the above on CentOS, Ubuntu, Slackware, or DSL. -This backup routine will work the same wherever a console, terminal, command line, secure shell, or other script-running shell interface is available.

Left-Overs


Perfect for pen-drives, hard disks, or other mountable media, this script will leave a copy of the .tar file in your home directory ($HOME) after all is said and done. A copy of the report file will also be placed right next to the calling script, for your perusal & retroactive peace-of-mind.
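
For example, if you kept the _io_sync.sh name used above, a quick peek at those left-overs would be:

# The tar archive and the find list land in $HOME;
# the .rpt report lands next to the calling script.
ls -lh "$HOME"/_io_sync.sh.tar "$HOME"/_io_sync.sh.found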

Technique 2: Preservations


If you do not want to have an on-line archive lying around, then you can just use:


#!/bin/bash
for arg in "/my_folder1" "/my_folder2"
do
    cp -r -u -v --preserve "$arg" "./"
done


Simply copy & paste the above into a file, mark that file as executable, then modify the folder list and copy the script to wherever you would like the backup to store YOUR folders - like onto a USB drive.
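
A minimal usage sketch, assuming the script was saved as my_backup.sh (a name of my own choosing) and that the USB drive mounts at the /media/PATRIOT seen earlier:

cp my_backup.sh /media/PATRIOT/
chmod +x /media/PATRIOT/my_backup.sh
cd /media/PATRIOT && ./my_backup.sh

Because the script copies everything to "./", the backups land wherever the script is run from.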

Best of all, note that running the operation repeatedly will only copy the 'deltas' (changes), not everything else.

Conclusions


Clearly, both techniques are additive ('superset') operations: The only potential down-side is that previously deleted files will still be left hanging around in your backups. So if you want verbatim archives, then either delete the backup folders from time to time, or copy the tar file, as described earlier, to your backup device.

Finally, note that the above operations also work the same on NTFS as well as on all EXT-n partitions.

Enjoy!

R.A. Nagy


NOTE: If you are backing up to a POSIX file system and want to keep a log, then you might also want to consider this article.


Before settling on any backup strategy, however, don't forget to watch for, as well as reclaim, any Zombies!

