NetLibrary Media Center - Belongs in the Trash Bin

So my local library is using NetLibrary for renting out audio books.

This was working just fine, though the download process was a three-step affair: first download to the local computer, then play the MP3 to obtain a license, then copy to a portable MP3 player using Windows Media Player. Clunky, but the process worked with no problems.

Then, in their infinite wisdom, the NetLibrary people unleashed a separate stand-alone Windows program called "NetLibrary Media Center", which is supposed to make downloads easier. Good idea - but pretty bad execution.

Be warned - that program is nothing but grief. First of all, its user interface is from the dark ages: windows that cannot be moved or resized, clunky buttons, and no useful feedback on actions or on what it is doing.

But it also deletes all files in folders without warning. Yes, here is a program that NetLibrary asks you to download to make transferring eAudioBooks easier, and that program will clean out certain folders: if you go into Preferences and point the folder to a different location, NetLibrary deletes all files there without any warning. And on top of all this, it did not recognize the MP3 player that was plugged in - which Windows Media Player located just fine. So there was no way to actually transfer the audio to the player.

This program belongs in the trash bin - it is one of the most poorly designed, and useless, utilities around. And of course, now that NetLibrary has this new program, their old way of downloading an audiobook to the local disk does not work. The web pages claim it works, but it ends in a "Requested page could not be found" error. Thankfully, there is a way around this: in the web NetLibrary account's "Edit My Account" page, uncheck the download preference "Use NetLibrary Media Center". This should reset the download option and show the link to download the CD-quality MP3, which can be copied to portable players.

Public DNS Server with no hijacking!

DNS hijacking has become commonplace - it is not just rogue DNS servers anymore; it seems most (all?) Internet Service Providers now resolve non-existent domains to the ISP's own servers.

This is very irritating, and it causes numerous problems - where an NXDOMAIN response is expected, applications now get a valid response. RCN puts up a search page, filled with search results, that does not even contain a link to the address actually typed - all so the ISP can serve more ads to the end user. And RCN has no easy opt-out that works for all applications and operating systems.

So users have turned to various workarounds: installing browser plug-ins - a poor solution, since non-browser applications won't see the fix - or configuring their DNS lookups to go to public DNS servers.

But now even the public DNS servers are involved in subverting NXDOMAIN responses - they too want to serve ads and issue redirects. A web search on this issue turns up many people saying that OpenDNS fixed their problems - which is not really true. It is, in fact, quite complicated to figure out what the basic, free OpenDNS really does, and it requires jumping through many hoops to make it stop the hijacking - they claim it can be done, but it requires registration, etc. They do offer other services that may be useful to many regular users, such as security features, but they are certainly not providing easy access to non-hijacking DNS servers.

Then, just last month, it turned out there is one public DNS service that promises to "get the results you expect with absolutely no redirection": Google Public DNS.
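Switching a machine over is a small change - the addresses below are the ones Google publishes for Public DNS. On systems where the DHCP client rewrites resolv.conf, the change has to go into the DHCP client configuration instead, so treat this as a sketch for a statically configured box:

```shell
# /etc/resolv.conf - point lookups at Google Public DNS

# To verify there is no hijacking, look up a bogus name; a clean
# server returns NXDOMAIN instead of an ad server's address:
#   host some-name-that-should-not-exist.example
```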

Ubuntu 9.04 to 9.10 upgrade and nvidia problems, etc

Since it is very easy to upgrade from 9.04 to 9.10, I downloaded Kubuntu 9.10 as kubuntu-9.10-alternate-amd64.iso for an Intel Core 2 64-bit system and ran the install.

Predictably, I ran into problems after the install; here are the issues and their fixes.

Started with a 9.04 install (more details in this entry: Ubuntu Install). One problem existed in 9.04 - there was no audio from the HDA Nvidia device, even after many installs/uninstalls and different drivers. Magically, the audio problem was fixed by the 9.10 upgrade! So it was all worth it.

Here are the rough steps for the upgrade and fixing issues:

Run apt-get update and apt-get upgrade to bring all packages up to the latest version on the current Ubuntu release. Shut down all applications, including any kvm virtual machines running on the host.

Download kubuntu-9.10-alternate-amd64.iso. Mount it locally, then run the installer from a shell:
sudo mount -o loop .../path-to/kubuntu-9.10-alternate-amd64.iso /media/cdrom0
kdesudo "sh /media/cdrom0/cdromupgrade"

The upgrade was smooth - I selected the option to remove unused packages, and in around 15 minutes it was done. Even the kvm virtual machines started up with no problems.

Since the upgrade was done from a locally mounted CD image that won't be available later, I edited the /etc/apt/sources.list file and commented out the cdrom line.
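That edit can also be done with a one-line sed. The `deb cdrom:` pattern matches the entry the installer adds on a stock Ubuntu system - verify against your own file before running it:

```shell
# Comment out the cdrom entry that the upgrade added to apt's sources,
# keeping a .bak copy of the original file.
sudo sed -i.bak 's/^deb cdrom:/# deb cdrom:/' /etc/apt/sources.list
```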

On reboot, the first problem: no kdm - no graphical login prompt, no X server. The nvidia drivers did not install, and manually trying to install nvidia-glx-173 and -180 failed.
With some searching, found that medibuntu had been disabled, so ran the command to add it back to apt's sources list:
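The exact command did not make it into this entry; from memory, the medibuntu instructions for karmic were along these lines (the URL and keyring package name here are assumptions - check the medibuntu site for the current instructions):

```shell
# Re-add the medibuntu repository for 9.10 (karmic) and its keyring
sudo wget http://www.medibuntu.org/sources.list.d/karmic.list \
    -O /etc/apt/sources.list.d/medibuntu.list
sudo apt-get update
sudo apt-get install medibuntu-keyring
sudo apt-get update
```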

Gmail spam filter is very poor

It is surprising to read web articles about how good Google's gmail spam filter is - in my experience, it is really poor at stopping simple spam, and it also has too many false positives (legitimate email that it marks as spam).

There are some articles, though, about how gmail does not stop much email with "VIAGRA" in it.
On a daily basis, I have 5-10 emails with Viagra in the subject line sitting in my gmail inbox. I don't use gmail much because of this problem - it is strange that my local spamassassin setup easily marks these as spam, but gmail does not. I did spend a few days reporting these emails as spam in gmail, but to no avail - gmail will not recognize them as spam - which suggests that the Report Spam feature in gmail is also pretty much useless.

Gmail even says the message is "signed" - whatever that means. Here's the gmail header:

   from     Approved VIAGRA Store
   date     Fri, Dec 18, 2009 at 12:55 PM
   subject  Member get 80% 0FF on ALL Pfizer.

The same message in my local spamassassin filter is in the spam folder, and has these spam tags:

X-Spam-Status: Yes, score=13.8 required=5.0 tests=BAYES_99,HTML_IMAGE_ONLY_24,
        SPF_NEUTRAL autolearn=no version=3.2.5
Delivery-date: Fri, 18 Dec 2009 10:55:56 -0700
Received: from [] (

Article module and recently updated block

The Article module for Drupal is a useful module, and it is pretty easy to configure.

The Views module in Drupal does similar things, but it requires learning a lot of new terms, and likely uses even more CPU/database resources. More importantly, it is a hugely complex module, which brings with it risks of breakage when updating either the Views module itself or the core Drupal release. So it is worth keeping the simple Article module around.

One of the key things missing from the Article module is a way to make a block of "Recently Changed Articles" - it only provides "Recently Created Articles". But this is an easy hack: edit the article/article.module file and replace all occurrences of n.created with n.changed.
That is it! Of course, the trouble is that updating the Article module now becomes tricky - but all this is probably still easier than dealing with updating the large Views module and core Drupal.
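That replacement is a one-liner with sed. The path below assumes the module lives under sites/all/modules - adjust to your install - and a .orig backup is kept so the change is easy to spot at the next module update:

```shell
# Swap the "recently created" ordering for "recently changed" in the
# Article module, keeping a backup of the original file.
cd sites/all/modules/article
cp article.module article.module.orig
sed -i 's/n\.created/n.changed/g' article.module
```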

ToDo: add a user option so the article module supports both n.created and n.changed as required for a specific block.

NRPE easier than check_by_ssh for Nagios on Ubuntu

Trying to set up SSH for Nagios on Ubuntu is quite maddening. So, for those thinking that familiarity with ssh and ssh-keygen means using check_by_ssh would be a breeze - don't! Use NRPE instead.

These plugins are needed to monitor things like load and processes on remote machines, such as those in a small home network.

Assuming localhost is running the Nagios monitoring server, and remotehost is the machine whose load and processes we need to monitor, here are the simple steps to get NRPE up and running on Ubuntu or Fedora.
This assumes that a base Nagios server is already up and running on localhost, and that remotehost does not have Nagios installed.

On remotehost, run:
sudo apt-get install nagios-nrpe-server (Ubuntu)
su -c "yum install nrpe.i386 nagios-plugins-procs.i386" (Fedora FC7)

To find appropriate names of packages to install, use:

On remotehost, run:
apt-cache search nagios (Ubuntu)
yum search nagios (Fedora)

Make sure all required plugins are installed on remotehost too - things like check_load, check_procs, etc., in /usr/lib/nagios/plugins/ or the appropriate folder on remotehost.

Next, edit the config file /etc/nagios/nrpe.cfg on remotehost, and set allowed_hosts to the IP address of the localhost machine that is running the Nagios monitoring server:

allowed_hosts=<IP-of-localhost>

In the same file, check the hardcoded commands, such as check_load and check_total_procs; those might be sufficient for many uses.
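For reference, the hardcoded definitions in nrpe.cfg look like the lines below. The thresholds shown are the usual package defaults as I recall them - verify against your own file and adjust to taste:

```shell
# /etc/nagios/nrpe.cfg - stock command definitions
command[check_load]=/usr/lib/nagios/plugins/check_load -w 15,10,5 -c 30,25,20
command[check_total_procs]=/usr/lib/nagios/plugins/check_procs -w 150 -c 200
```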

Finally, restart nrpe on remotehost:

sudo /etc/init.d/nagios-nrpe-server restart (Ubuntu)
su -c "service nrpe restart" (Fedora)

Now, back to localhost. Install the nrpe-plugin here:

sudo apt-get install nagios-nrpe-plugin (Ubuntu)

Test it - from localhost, run:
/usr/lib/nagios/plugins/check_nrpe -H remotehost (this should print the remote NRPE version)
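Once check_nrpe responds, the remaining piece is a service definition on localhost that calls it. A minimal sketch - the check_nrpe_1arg command is installed by the Ubuntu nagios-nrpe-plugin package, while the file location and host_name here are assumptions to adapt to your own Nagios config:

```shell
# e.g. /etc/nagios3/conf.d/remotehost_nrpe.cfg on localhost
define service {
    use                  generic-service
    host_name            remotehost
    service_description  Current Load
    check_command        check_nrpe_1arg!check_load
}
```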

Choosing a web host for low volume sites

There seem to be many hits when one searches for web host reviews, but nearly all of the top hits are fake or aggregator sites with not enough real data. It is also very hard to judge web hosts against a large, general list of criteria.

Having to do this exercise once every few years, here are the criteria I look for in a web host:

  1. Low usage, non-commercial web hosting. Generally around 2G to 5G data download per month.
  2. Medium level of system admin capability - ability to configure simple .htaccess rules, install packages like Drupal and Joomla, host multiple sites.
  3. SSH access is essential. This is the most efficient way to manage a site. If only FTP access is available, simple tasks like copying a directory or making a symbolic link become quite complex (one has to create a cron job to do such tasks in the absence of SSH access).
  4. Should allow two domains to be hosted. One main domain, and one add-on. More is nice-to-have.
  5. Drupal CPU/memory requirements should be supported. Even for very low volume sites, Drupal can be a memory hog, and some hosts kill scripts too soon. Drupal is not very robust - if an admin page (such as the modules list) is killed while being constructed, it can corrupt the database. The CPU/memory requirements are minimal, but some hosts do not support them.
  6. Disk space requirements are generally unimportant since every web host seems to be providing over 1G of space. Low usage sites probably don't need more than 1G space, if that much.
  7. Standard CGI tools support - Perl, PHP are essential. Python, Ruby, etc are nice-to-have.
  8. CGI scripts should be allowed to open web connections to other servers. Some web hosts do not allow network connections from server scripts. Very rare, but I found one out of the 10 or so hosts I tried had this restriction.
  9. Should be less than $100/year. Back when I started this, it used to be $300/year, but now a $100/year budget is reasonable.
  10. Support is necessary, but only expect help with really critical server related problems that cannot be fixed by self-managing the web site. The expectation is that I would never really need to open a support ticket. Email only support is fine. A user forum for the web host customers would be a good thing to have.
  11. No stringent requirements on uptime - acceptable to have an hour or so down every few months.

And, the best site that supports these requirements?
As of 2009:

  1. HostMonster has turned out to be the best fit for the above requirements. The thing to be careful about is that their first sign-up cost can be 10% to 30% lower than renewals, so be sure to look up the actual renewal cost. They easily meet all the requirements above, and the price is the best for this set of requirements. 2009 cost is $250/3 years, or $110/year.
    BlueHost is the same provider as HostMonster, so either of these are probably similar in service.
    This site does go down once in a while - maybe an hour or so every few months. This has been infrequent enough, and all web hosts seem to have this issue at the low end, so has to be accepted as part of the deal.
  2. WebAxxs is another good option. It costs nearly the same as HostMonster, with $2/month extra for SSH, but they have shorter pre-purchase periods, so for a year or two it is only $10/year or so more in cost. Their web site is very poor, with not enough information - it took multiple emails to get more details. This provider is the same as (or related to) another site, which has better information.

Kubuntu very easy to install

So after all my troubles trying and failing to get the Fedora 11 x86_64 kit installed, I tried Kubuntu.
I prefer KDE desktop to Gnome, so went with the KDE Ubuntu 9.04 "Jaunty" version, Kubuntu. Ubuntu itself is Debian-based.

It took two tries, but the second try was the real thing, and within an hour the 64-bit kit was installed and fully working, including DVD playback as well as audio.
Update: not that good, in the end. When I got new drives and re-did the full install, it failed to get audio working. Oh well - these Linux distributions are not as good as Windows when it comes to multimedia support. Since I do not use audio, I am going to stick with Jaunty and will update the system again when the next release comes out.

First try:
The documentation on the Ubuntu pages is somewhat unclear. I knew I wanted LVM, and expected that the DVD iso image for the desktop install would offer it. No mention is made of this on the Get Ubuntu pages. They do refer to something called an "alternate" kit, a text-mode install for systems with small amounts of memory. So, I went with the desktop kit. It is also not very clear that the "amd64" kit is for Core 2 Duo Intel processors as well; in fact, some of the language might suggest it is only for AMD CPUs, but that would be an incorrect reading. For Core 2 Duo, use amd64.

But overall, using the desktop kit was the wrong thing to do. It took an hour to install, I discovered the no-LVM problem, and went back to web searches - where I found (in not-so-easy-to-find places) that the alternate kit is the more powerful one, with much better control and many more install options.

Second try:

Fedora 11 Hiccups

So, here are all the issues I ran into when installing Fedora 11 on a new-ish computer.

Computer: Gigabyte GA-73PVM-S2, with SATA hard-drive, and IDE CD/DVD drive, and 4G of RAM.

Given that the i386 Fedora cannot see all of 4G, I first tried the x86_64 install. The Fedora install notes also say that for Core 2 Duo processors, the 64-bit install is recommended.
Right at the outset, it failed. Booted off a DVD install kit and got stuck when the installer said "install media driver not found" and asked me to select a driver. I selected pata_amd (for the IDE CD/DVD) and forcedeth (for ethernet), but no luck - it kept repeating "no driver, select driver" with no way to break the loop. It also failed to recognize the VGA card and performed the install in text mode.

Retried the install over the network. This time the install proceeded to completion, but when the system started, there was no X driver for the video card. Having spent 2+ hours on this, I decided to shelve this approach.

Using the i386 DVD install, everything went smoothly. It had no trouble finding all the drivers, did the install, and Fedora was up. It even selected the PAE kernel, so it can see all 4G of RAM (most of it, anyway).
This probably indicates some packaging problem with the x86_64 version.

Current problem:
Cannot play DVDs - even locally created, non-encrypted DVDs. Have installed libdvdcss and the other modules mentioned at the unofficial Fedora FAQ. Xine reports no mpeg codec and keeps popping up the error window, ad nauseam.

Gave up on Fedora - it was taking too long to get basic stuff running, and 64-bit not working was a big problem. And it turned out not to be a Linux or Gigabyte GA-73PVM-S2 motherboard issue: Kubuntu installed with incredible ease.

Monster Cables are Unnecessary

The Monster Cable piece in USA Today is very funny - but it is presented as news! It is a pretty poorly written article, with no questioning of whether Monster's claims make any sense. The comments people posted are more on the mark; that is where the real questions are raised.

"Oxygen-infused cables" and other such gimmicks, and people shell out hundreds of dollars for them. Well, a sucker is born every minute, so good for Monster for soaking up the excess money some people have floating around.

There is absolutely no need for the way Monster and other high-end cable companies make audio/video cables. Metal conducts electricity; for analog, the key issue is the gauge of the cable (get a thicker one for higher currents), and for digital it is even easier - either the bits get from one end to the other or they don't, and any decent cable will work fine.
Sure, buy Monster if you like to overspend, but don't buy it because "it conducts electricity better"! There are certainly crappy cables, good cables, very good cables, excellent cables, and way-out-there-with-magic-powers cables; it is best to avoid that last category along with the first.

There is a lot of good stuff on the web, such as Monster Cable Sucks! and CNET's HDMI cables article, along with the kool-aid-drinking gushers.