Spooning with Pentaho Data Integration

Pentaho Data Integration is a neat open source, cross-platform tool for making light work of ETL. I have it set up on my Ubuntu laptop and can run it directly from the command line, or as part of a cron job.

However I found a couple of annoyances when running it from the CLI: one was having to keep a terminal window open, the other was having to run it from its install directory – particularly when it comes to relative path names for kettle jobs.

So I created an alias that runs PDI in an interactive shell, allowing you to launch it with a one-word command, and it occurred to me that this might be useful to share. Here you go:

alias spoon='sh -c '\''cd /opt/pentaho/data-integration; ./spoon.sh&'\'''
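The same trick helps with cron: wrapping the cd and the call in a single sh -c means relative paths in your kettle jobs resolve properly. A sketch of a crontab entry, assuming the standard /opt/pentaho/data-integration install, with kitchen.sh (the headless job runner that ships alongside spoon.sh) and a hypothetical job file:

# Run a kettle job at 2am daily, from inside the install directory
0 2 * * * sh -c 'cd /opt/pentaho/data-integration; ./kitchen.sh -file=/home/wafitz/jobs/nightly.kjb' >> /tmp/kettle.log 2>&1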

 

Copy Files to Pogoplug Without The Pogo Software (using scp)

I recently picked up a Pogoplug on sale from John Lewis and thought I’d give it a whirl with my media.

Although it is a neat little device, one of its biggest benefits is also its biggest flaw in terms of design – and that is how it requires you to sign into pogoplug.com and maintain an account there. It also requires you to mount the Pogoplug with their software for transferring and viewing files, rather than acting as a NAS.

Whilst it’s nice to have easy access to media outside of home (without having to fiddle with setting up port forwarding on your firewall and whatnot), it’s a bit of a drag when you’re on your own network. I noticed severe performance degradation copying media to my Pogoplug using pogoplugfs rather than through standard means. Then I learned that the Pogoplug runs a BusyBox install and, along with that, offers SSH access. Cloud Engines have been gracious enough to let you enable this through your my.pogoplug.com portal: simply go to the Security options, enable SSH, and change the password. From there it’s just a simple,

ssh root@<pogoplugIpAddress>

The problem is that there doesn’t seem to be any support for sftp, so I couldn’t use ssh in a file manager. Thankfully, however, ssh provides the scp protocol, and from there it was just a short script to zap files across the local LAN without worrying about signing in.

When you attach an external hard drive to the Pogoplug, your files will live in a directory similar to ‘/tmp/.cemnt/mnt_sda2/’, where ‘mnt_sda2’ is the mount point of your media device.

Be aware this script utilises “expect”, but you could use SSH keys instead (see the key-based sketch after the script).

ppsend.sh:
#!/bin/bash
#set -x

# Provide a list of media extensions to send to pogoplug
extensions=("mp4" "avi" "mkv" "jpg")
ppip="192.168.1.10" # The local ip address of your pogoplug

echo ""
echo "Sending video files to PogoPlug ($ppip) for the following extensions..."
for ext in "${extensions[@]}"; do
    echo ${ext}
done
echo ""

# For each extension, scp any matching files in the current directory,
# using expect to answer the password prompt automatically
for ext in "${extensions[@]}"; do
    expect -c "spawn bash -c \"scp -p *.$ext root@$ppip:/tmp/.cemnt/mnt_sda2/\"
    expect assword ; send \"mysshpassword\r\"; interact"
done
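For the key-based alternative: generate a key pair on your machine and append the public half to root’s authorized_keys on the Pogoplug, after which scp needs no password and no expect. A sketch, assuming the Pogoplug’s SSH server honours ~/.ssh/authorized_keys (worth verifying on your device):

# Generate a key pair (accept the defaults; an empty passphrase suits scripted use)
ssh-keygen -t rsa
# Append the public key to root's authorized_keys on the Pogoplug
cat ~/.ssh/id_rsa.pub | ssh root@192.168.1.10 'mkdir -p ~/.ssh; cat >> ~/.ssh/authorized_keys'
# From then on, scp works without a password prompt
scp -p *.mp4 root@192.168.1.10:/tmp/.cemnt/mnt_sda2/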

I think the next step for this is to translate it to another language and wrap it up in a GUI for easy access. So watch this space perhaps.

Live Backup of Minecraft

I use Minecraft on Linux, and occasionally I find Java crashes whilst I’m playing and I lose my world save. After a discussion I had with someone in meatspace, I think it has more to do with some buggy hardware than the OS.

Anyway, yeah, no matter what’s at fault, losing a Minecraft world is no pleasant thing, so I created the following script to incrementally back up whilst I’m playing. I run it from my home directory:

#!/bin/bash

# Live backup of the game for java crashes
# Author: Wes Fitzpatrick

# Seed the two backup copies on first run
if ! [ -d .minecraft_live_backup ]; then
    cp -pr .minecraft .minecraft_live_backup
fi
if ! [ -d .minecraft_current_session ]; then
    cp -pr .minecraft .minecraft_current_session
fi

# Preserve the previous session's backup under a dated name
mv .minecraft_live_backup .mc_last_session`date | awk '{ print $1 $2 $3 $4 }'`

# Alternate between two copies so a crash mid-copy never loses both
while true; do
    rm -fr .minecraft_current_session && cp -pr .minecraft .minecraft_current_session
    sleep 120
    rm -fr .minecraft_live_backup && mv .minecraft_current_session .minecraft_live_backup
done

What this does is first back up your current .minecraft folder, so your last game is preserved, then create two alternating backups. One is your current session (.minecraft_current_session), up to the last 2 minutes of play; the second is the previous copy (.minecraft_live_backup), in case the failure occurs during a backup.

I’ve tested both backup copies and both work in the event of a crash. This means rather than losing the entire castle, I’ve only lost the last few blocks placed.
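Restoring after a crash is just a copy back the other way (with Minecraft closed):

# Replace the damaged save with the last good backup
rm -fr .minecraft && cp -pr .minecraft_live_backup .minecraft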

Every time I use Skype

Skype 2.2 Beta is just a buggy piece of crap, but I have to use it because my family are on it, so here’s a small script I use when it blows up (which it does in almost every chat session):

#!/bin/bash

# Find the pid of the running Skype process (ignoring the grep itself)
proc_id=`ps -ef | grep "/usr/bin/skype" | grep -v "grep" | awk '{print $2}'`
# Kill it dead and start a fresh one
kill -9 $proc_id
/usr/bin/skype &
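For what it’s worth, the same thing squeezes into a one-liner with pkill (from the procps package), where -f matches against the full command line:

pkill -9 -f /usr/bin/skype; /usr/bin/skype &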

Converting a Dual Boot Windows 7 Partition to a VM

I recently got a replacement work laptop with Windows 7 installed. Despite the great desire to shrink it and put a more mature and stable OS on, I decided to give Win 7 a shot – that, and I needed to use the laptop right away, so I didn’t have an immediate choice.

Well, after a month of use, Windows 7 was already showing the signs that it was going the way of its predecessors: exponential growth, slow boots and sloooow shutdowns… I heard Ubuntu calling.

You see, Linux takes a while to get used to when you come from a Windows background; things aren’t done the same way, but in time you realise this new way of doing things makes much more sense and takes much less time. So when you go back to Windows after a few years in Linux, you feel like you’re taking a step back in time – to a slower, less advanced OS, where a problem can’t be fixed unless you are prepared to fork out a lot of money for a proprietary app that you’re only ever going to use to solve that one problem.

It was time to stop the rot and cage Windows 7. I still needed it for Outlook and Exchange (well, until Crossover can support 2010), but I don’t need 90% of that operating system. I had done plenty of dual boots before, but I wanted to try my hand at turning my Win 7 partition into a VM, and despite the ubiquity of home-brew tutorials out there on the web, I had to turn to several for the different problems I experienced along the way. I’m documenting the steps here completely, and will attribute the relevant tutorials that helped.

Step 0: Backup

It needs to be said, and it needs to be done. I always hate using Windows Backup and sometimes opt to use a Linux live CD to do the backup instead, guaranteeing I can view the process. I usually just make sure that documents are saved; I’m not worried about settings as these can be reset. This time I used Windows Backup to an external HDD, which seemed to work adequately enough.

Step 1: Shrink Windows 7 Partition

Although it’s not recommended, I’ve always found GParted to be a trouble-free tool and never had a problem with it, so I booted into an Ubuntu Live CD and fired it up. I was then presented with a disk that had no less than 4 partitions: one was a boot partition, one was recovery, one I couldn’t identify, and the final one was Windows 7. Here is where I made my first mistake: I got cocky and deleted the Windows boot partition, thinking I could restore the boot record later with a recovery disk – it seems Microsoft have made that process much less efficient, along with making partitions a lot more complicated than necessary.

Anyway, don’t delete the boot partition, but if you do, then here’s what to do:

The first problem I had was that Windows 7 wouldn’t boot; I had the following error:

“autochk program not found, skipping autocheck”

Some Googling brought me to a Microsoft Answers post.

  • Use your recovery CD or download one if you got a crappy OEM pre-installed system – Neosmart have some links and instructions for torrent files.
  • Boot into recovery, and when you get to the System Recovery Options screen you can choose the automatic System Repair option, but I’ve never found it any use, so go straight to Command Prompt.
  • Run the following command to check your disk for errors and fix them (where x: is the drive containing your Win 7 install):

CHKDSK x: /F /R

  • Once that runs, restart the computer.

In my case chkdsk didn’t work and I still got the error, so the next thing I tried was using bootrec to fix the MBR.

  • Boot back into System Recovery, go to the command prompt and run:

bootrec /fixmbr

then
bootrec /fixboot

then
bootrec /rebuildbcd

In my case after the last command I got the following error:

“total identified windows installations 0”

Exporting the bcd didn’t work either:

bcdedit /export C:\BCD_Backup
c:
cd boot
attrib bcd -s -h -r
ren c:\boot\bcd bcd.old
bootrec /RebuildBcd

So I attempted the following fix, again following the instructions from Neosmart, Recovering the Bootloader:

x:\boot\bootsect.exe /nt60 all /force
del C:\boot\bcd
bcdedit /createstore c:\boot\bcd.temp
bcdedit.exe /store c:\boot\bcd.temp /create {bootmgr} /d "Windows Boot Manager"
bcdedit.exe /import c:\boot\bcd.temp
bcdedit.exe /set {bootmgr} device partition=C:
bcdedit.exe /timeout 10
del c:\boot\bcd.temp

But I didn’t get that far because I didn’t have a bootsect file, so I had to do a bit more digging and found a better solution halfway down this thread. These are just the steps; more detail about why you do this is in the original post.

First of all, I seemed to have some kind of corruption in my filesystem telling me that the c:\boot\bcd file didn’t exist, except it was there; when I attempted to copy memtest.exe to BCD it said that a file already existed. This is where the Live CD came to the rescue:

  • Boot into Linux live cd.
  • Mount your Windows 7 partition.
  • Navigate to /Boot
  • Delete the ‘BCD’ file.
  • After a startup repair your original BCD file is renamed to BCD.Backup.0001.
  • Copy memtest.exe to memtest.exe.org.
  • Copy BCD.Backup.0001 to memtest.exe.
  • Rename memtest.exe to BCD.
  • Rename memtest.exe.org to memtest.exe.
  • Now reboot Windows.

In my case, this solution finally worked and I got Windows working again. Now for virtualisation…

Step 2: Virtualising Windows 7 Partition as a VM

I followed the instructions given by Rajat Arya on his blog, apart from the 5th step, which didn’t work without some slight modification as I had installed VirtualBox v4.0.4 from http://www.virtualbox.org/, not the Open Source Edition.

These are just the steps; Rajat goes into more detail in the post, which is worth reading:

  • You first need to take ownership of the disk under your username. The original way stated is to chmod the /dev/sda file, but adding yourself to the disk group is less insecure:

sudo usermod -a -G disk wafitz

  • Then log out and back in to make the change take effect. Next, install the mbr package:

sudo apt-get install mbr

  • The -e flag below sets the partitions you wish to make available to the Windows boot, so in this case I set 1 (Windows partition) and 2 (recovery):

install-mbr -e12 --force ~/vm.mbr

  • Then create the vmdk file. I found that the -relative flag didn’t work, and neither did the -mbr flag, but it was fine with these left out:

VBoxManage internalcommands createrawvmdk -filename /home/wafitz/wind7part.vmdk -rawdisk /dev/sda -partitions 1,2

  • Now create your VM in VirtualBox and boot into Windows 7. If you get a boot error, you’ll need to do Windows recovery again: set the VM to mount your CD drive, then press F12 at startup, boot into recovery… and follow through on Step 1 of this post again.
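Creating the VM itself can also be done from the command line if you prefer. A sketch using VBoxManage from the same 4.0.x release; the VM and controller names are my own arbitrary choices:

# Create and register an empty Windows 7 VM (names here are arbitrary)
VBoxManage createvm --name "Win7Raw" --ostype Windows7 --register
VBoxManage modifyvm "Win7Raw" --memory 2048
# Attach the raw-disk vmdk created above to a SATA controller
VBoxManage storagectl "Win7Raw" --name "SATA" --add sata
VBoxManage storageattach "Win7Raw" --storagectl "SATA" --port 0 --device 0 --type hdd --medium /home/wafitz/wind7part.vmdk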

Now I installed VirtualBox tools and, with Seamless mode, I’m able to run Outlook as a full-on desktop app within Ubuntu.

3rd Time’s a Charm… Linux, Wifi and Samsung N210

If you follow this blog you will know that I’ve mentioned wireless and the Samsung N210 in the past.

Kickstart Samsung N210 Wireless
Update: UNR 10.04 Wireless on Samsung N210

The first post was the most successful blog post I’ve ever written, and judging by the number of posts in the Ubuntu forums, it’s something many people struggle with constantly on this netbook.

I’ve found that the solutions I posted previously worked, but had to be redone after each kernel update. What’s worse is that with each new ‘Buntu distro upgrade, an entirely new wireless problem appeared.

To be quite frank, I’m getting sick of this and I won’t be touching another Samsung netbook. In fact, when it comes time to refresh I’ll probably opt for a System76 box.

The latest upgrade appears to find the card and connect seamlessly on both the live distro and the install… but after a while of being online it starts to drop off, more and more frequently, until it either doesn’t work at all, or it pretends to connect to the network without actually establishing any connection to the router.

I did a bit more digging in the forums and online, and after much reading I’ve come to the conclusion that it has something to do with Canonical disabling Wireless N by default in 10.10 (Maverick), and whatever is going on inside Samsung netbooks… because to my dismay I discovered it’s not just the N210 that has this problem – it seems to be the majority of their netbook range using Realtek cards.

It was then that I stumbled across a German blog post, via a link in a forum, for sorting out wireless on the Samsung N510.

Dirk Hoeschen has put together both a driver and a script, easily run from the command line, which I can confirm worked on my N210. The only issue I’ve found is that wireless still drops, but it’s much more stable and lasts longer. Furthermore, running the script again performs a ‘reset’ and restarts the wireless without having to reboot the machine.

# Build tools are needed to compile the driver
sudo apt-get install build-essential
# Unpack and run the installer (tarball from the blog post above)
tar -xpf rtl819Xe.tar.gz
cd rtl819Xe
sudo ./install.sh
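If you want to confirm which kernel module has actually claimed the card before and after installing, lspci can tell you:

# Show network devices along with the kernel driver in use
lspci -k | grep -iA3 net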

It’s not perfect, but it works, and so it is my 3rd solution from Ubuntu 10.10 onwards. There may be more elegant and more permanent solutions, but I really don’t have the time or skill level to look for them. One thing I will state is that, despite these issues, Ubuntu was the only distro I tested that recognised the Realtek hardware out-of-the-box.

I honestly don’t know what the Narwhal will bring, and whether I’ll have to hunt for a new solution or it will finally be fixed. Funnily enough, I did have a dual boot with Win 7 Starter on the netbook, and one evening when I was getting particularly fed up I booted in and found that Windows didn’t recognise the wireless network either – leading me to believe that the wireless cards in these things are very poor quality. Either that or Samsung’s quality control efforts are seriously questionable. But that’s another story.

Blog Your Geocaching Found Logs with Blogger

You may or may not be aware that Google have recently released a command line tool called Google CL, which allows limited updating of some of its primary services from the command line – including Blogger.

I have been working on a script and looking for a utility to parse the “My Finds” pocket query for uploading to a blog for a while now, so on hearing this news I set to work to see if I could create an automated script. You can see the results on my old Blogger account, which I have now renamed _TeamFitz_ and repurposed for publishing our Geocaching adventures.

It’s a little bit clunky and could be improved, but the script is now complete and ready for ‘beta’. I’m publishing it here and releasing it under GPL for others to download, copy and modify for their own Geocaching blogs.

A few snags:

  • It will only work with one “Find” per cache – if you logged a find twice on the same cache it may screw up the parser.
  • Google have an arbitrary limit of 30-40 auto-posts per day (which is entirely fair); beyond that it turns on word verification, which prevents CL updates. I have limited the script to parse only 30 posts at a time.

You will need to download and install Google CL. It goes without saying the script is Linux-only, but if someone wants to adapt it to Windows they are welcome.

I have commented out the “google” upload line for test runs; remove the # to make it active.

Either cut n’ paste the code below, or download the script from YourFileLink. Please comment and post links to your own blogs if you use it, and also let me know if there are any bugs I haven’t addressed.

#!/bin/bash
# Script to unzip, parse and publish
# Pocket Queries from Geocaching.com to Blogger
# Created by Wes Fitzpatrick (http://wafitz.net)
# 30-Nov-2009. Please distribute freely under GPL.
#
# Change History
# ==============
# 24-07-2010 - Added integration with Blogger CL
#
# Notes
# =====
# Setup variables before use.
# Blogger has a limit on posts per day, if it
# exceeds this limit then word verification
# will be turned on. This script has been limited
# to parse 30 logs.
# Blogger does not accept date args from Google CL,
# consequently posts will be dated as current time.
#
# Bugs
# ====
# * Will break if more than one found log
# * Will break on undeclared "found" types
#   e.g. "Attended"
# * If the script breaks then all temporary files
#   will need to be deleted before rerunning:
#	.out
#	.tmp
#	new
#	all files in /export/pub
#####     Use entirely at your own risk!      #####
##### Do not run more than twice in 24 hours! #####
set -e
clear
PQDIR="/YOUR/PQ/ZIP/DIR"
PUBLISH="$PQDIR/export/pub"
EXPORT="$PQDIR/export"
EXCLUDES="$EXPORT/excludes.log"
GCLIST="gccodes.tmp"
LOGLIST="logs.tmp"
PQZIP=$1
PQ=`echo $PQZIP | cut -f1 -d.`
PQGPX="$PQ.gpx"
BLOG="YOUR BLOG TITLE"
USER="YOUR USER ID"
TAGS="Geocaching, Pocket Query, Found Logs"
COUNTER=30
if [ -z "$PQZIP" ]; then
    echo ""
    echo "Please supply a PQ zip file!"
    echo ""
    exit 0
fi

if [ ! -f "$EXCLUDES" ]; then
    touch "$EXCLUDES"
fi

# Unzip Pocket Query
echo "Unzipping PQ..."
unzip $PQZIP

# Delete header tag and strip DOS line endings
echo "		...Deleting Header"
sed -i '/My Finds Pocket Query/d' $PQGPX
sed -i 's/'"$(printf '\015')"'$//g' $PQGPX

# Create list of GC Codes for removing duplicates
echo "		...Creating list of GC Codes"
grep "<name>GC.*</name>" $PQGPX | perl -ne 'm/>([^<>]+?)<\// && print $1."\n"' > $GCLIST

# Make individual gpx files
echo ""
echo "Splitting gpx file..."
echo "	New GC Codes:"
cat $GCLIST | while read GCCODE; do
    # Skip GC codes that have already been published
    if [ -z "`egrep "$GCCODE$" "$EXCLUDES"`" ]; then
        if [ ! "$COUNTER" = "0" ]; then
            echo "      	$GCCODE"
            TMPFILE="$EXPORT/$GCCODE.tmp"
            GCFILE="$EXPORT/$GCCODE"
            sed -n "/<name>${GCCODE}<\/name>/,/<\/wpt>/p" "$PQGPX" >> "$TMPFILE"
            grep "<groundspeak:log id=" "$TMPFILE" | cut -f2 -d'"' | sort | uniq > "$LOGLIST"
            # Split each log entry out to its own .out file
            cat $LOGLIST | while read LOGID; do
                sed -n "/<groundspeak:log id=\"$LOGID\">/,/<\/groundspeak:log>/p" "$TMPFILE" >> "$LOGID.out"
            done
            # Keep only the "found" log for this cache
            FOUNDIT=`egrep -H "<groundspeak:type>(Attended|Found it|Webcam Photo Taken)" *.out | cut -f1 -d: | sort | uniq`
            mv $FOUNDIT "$GCFILE"
            rm -f *.out
            URLNAME=`grep "<urlname>.*</urlname>" "$TMPFILE" | perl -ne 'm/>([^<>]+?)<\// && print $1."\n"'`
            echo "      	$URLNAME"
            # Replace some of the XML tags in the temporary split file
            echo "      		...Converting XML labels"
            sed -i '/<groundspeak:short_description/,/groundspeak:short_description>/d' "$TMPFILE"
            sed -i '/<groundspeak:long_description/,/groundspeak:long_description>/d' "$TMPFILE"
            sed -i '/<groundspeak:encoded_hints/,/groundspeak:encoded_hints>/d' "$TMPFILE"
            sed -i 's/<url>/<a href="/g' "$TMPFILE"
            sed -i "s/<\/url>/\">$GCCODE<\/a>/g" "$TMPFILE"
            LINK=`grep "http://www.geocaching.com/seek/" "$TMPFILE"`
            OWNER=`grep "groundspeak:placed_by" "$TMPFILE" | cut -f2 -d">" | cut -f1 -d"<"`
            TYPE=`grep "groundspeak:type" "$TMPFILE" | cut -f2 -d">" | cut -f1 -d"<"`
            SIZE=`grep "groundspeak:container" "$TMPFILE" | cut -f2 -d">" | cut -f1 -d"<"`
            DIFF=`grep "groundspeak:difficulty" "$TMPFILE" | cut -f2 -d">" | cut -f1 -d"<"`
            TERR=`grep "groundspeak:terrain" "$TMPFILE" | cut -f2 -d">" | cut -f1 -d"<"`
            COUNTRY=`grep "groundspeak:country" "$TMPFILE" | cut -f2 -d">" | cut -f1 -d"<"`
            STATE=`grep "<groundspeak:state>.*<\/groundspeak:state>" "$TMPFILE" | perl -ne 'm/>([^<>]+?)<\// && print $1."\n"'`
            # Now remove XML from the GC file
            DATE=`grep "groundspeak:date" "$GCFILE" | cut -f2 -d">" | cut -f1 -d"<" | cut -f1 -dT`
            TIME=`grep "groundspeak:date" "$GCFILE" | cut -f2 -d">" | cut -f1 -d"<" | cut -f2 -dT | cut -f1 -dZ`
            sed -i '/groundspeak:log/d' "$GCFILE"
            sed -i '/groundspeak:date/d' "$GCFILE"
            sed -i '/groundspeak:type/d' "$GCFILE"
            sed -i '/groundspeak:finder/d' "$GCFILE"
            sed -i 's/<groundspeak:text encoded="False">//g' "$GCFILE"
            sed -i 's/<groundspeak:text encoded="True">//g' "$GCFILE"
            sed -i 's/<\/groundspeak:text>//g' "$GCFILE"
            # Insert variables into the new GC file
            echo "      		...Converting File"
            sed -i "1i\Listing Name: $URLNAME" "$GCFILE"
            sed -i "2i\GCCODE: $GCCODE" "$GCFILE"
            sed -i "3i\Found on $DATE at $TIME" "$GCFILE"
            sed -i "4i\Placed by: $OWNER" "$GCFILE"
            sed -i "5i\Size: $SIZE (Difficulty: $DIFF / Terrain: $TERR)" "$GCFILE"
            sed -i "6i\Location: $STATE, $COUNTRY" "$GCFILE"
            sed -i "7i\Geocaching.com:$LINK" "$GCFILE"
            sed -i "8i\ " "$GCFILE"
            mv "$GCFILE" "$PUBLISH"
            touch new
            COUNTER=$((COUNTER-1))
        fi
    fi
done
echo ""
echo "			Reached 30 post limit!"
echo ""

# Publish the new GC logs to Blogger
if [ -f new ]; then
    echo ""
    echo -n "Do you want to publish to Blogger (y/n)? "
    read ANSWER
    if [ "$ANSWER" = "y" ]; then
        echo ""
        echo "	Publishing to Blogger..."
        echo ""
        egrep -H "Found on [12][0-9][0-9][0-9]-" "$PUBLISH"/* | sort -k 3 | cut -f1 -d: | while read CODE; do
            CACHE=`grep "Listing Name: " "$CODE" | cut -f2 -d:`
            GC=`grep "GCCODE: " "$CODE" | cut -f2 -d:`
            sed -i '/Listing Name: /d' "$CODE"
            sed -i '/GCCODE: /d' "$CODE"
            #google blogger post --blog "$BLOG" --title "$GC: $CACHE" --user "$USER" --tags "$TAGS" "$CODE"
            echo "blogger post --blog $BLOG --title $GC: $CACHE --user $USER --tags $TAGS $CODE"
            mv "$CODE" "$EXPORT"
            echo "		Published: $CODE"
            echo "$GC" >> "$EXCLUDES"
        done
        echo ""
        echo "                  New logs published!"
    else
        echo ""
        echo "                  Not published!"
    fi
    echo ""
else
    echo "			No new logs."
fi

echo ""
rm -f *.out
rm -f *.tmp
rm -f "$EXPORT"/*.tmp
rm -f new
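Usage is just the script plus your “My Finds” zip as its only argument – the script and zip names here are whatever yours happen to be called:

./pqblog.sh 1234567_MyFinds.zip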

3gp Video Format on Linux

I don’t know about you, but it irritates me how many mobile phones I’ve used save their video in only one proprietary format – 3gp. From what I gather, 3gp is a format backed by 3GPP – a collaboration of telecoms providers – probably for its compact size limit (for sending MMS), but why then can’t they offer another encoding option for video you don’t intend to send?

3gp doesn’t work out of the box on Linux – and I’ve found that even with the restricted media packages in place, the audio won’t play. Google “3gp audio” and a myriad of results return links to free converters. Support on Linux seems to range from some fairly complex command line fu to manually compiling and installing codecs to work in your media player of choice.
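For reference, the command-line route is typically a one-liner with ffmpeg, assuming the restricted codecs are installed (the filenames here are just examples):

# Convert a 3gp clip to an AVI with MPEG-4 video and MP3 audio
ffmpeg -i clip.3gp -vcodec mpeg4 -acodec libmp3lame clip.avi

But that presupposes knowing which codecs to ask for, which is exactly the fiddliness a GUI hides.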

Which is why I was happy to discover Miksoft a little while ago. Miksoft offer a free ‘Mobile Media Converter’ which is not only cross-platform but has a simple GUI. The GUI makes it a trivial matter to drop the 3gp (or any media file) you want into the input box, then just specify the output file and the format you wish (e.g. AVI).

I’ve found Mobile Media Converter converts the sound perfectly, and it also includes a YouTube downloader that has come in handy recently.

Get Mobile Media Converter now (and be sure to donate if you have any spare cash).

Just GPSBabel

A while ago I posted my frustrations with running GSAK on Linux, and how I’d found a way to get the ‘send to my Garmin 60CSx’ function to work.

I thought I owed an update to anyone who read that to say I gave up on GSAK and now just use GPSBabel from the command line. Here’s the script I run:

#!/bin/bash

# Unzip a pocket query and send every gpx file inside it to the Garmin
GPXZIP=$1

unzip "$GPXZIP"
for FILE in *.gpx; do
    gpsbabel -i gpx -f "$FILE" -o garmin -F usb:0
done

I called the script ‘sendtogarmin.sh’ and saved it in the directory where all my pocket queries are stored as zip files. When it comes time to load them, I just hook up the Garmin and then run:

sudo ./sendtogarmin.sh pquery.zip
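The sudo is only there for USB device permissions. A udev rule granting everyone access to the Garmin should remove the need for it – a sketch, assuming Garmin’s USB vendor ID of 091e, dropped into /etc/udev/rules.d/51-garmin.rules:

# Allow all users read/write access to Garmin USB devices
SUBSYSTEM=="usb", ATTRS{idVendor}=="091e", MODE="0666"

Then replug the device (or reload the udev rules) and run the script without sudo.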

Works for me, but I never bothered using GSAK for much more than sending the pocket queries – I’m not at the stage yet where I feel I have to create complicated queries to get a good caching experience.

Becoming A Perfect Killer

If you’re going to work with computers and Linux in particular, sooner or later you’ll need to perfect your killing technique. I’m not just talking about those useless defunct processes or applications that refuse to die – I’m also talking about the Operating System itself. When it comes to Windows you’re pretty limited to the confines of Ctrl+Alt+Del and the power switch, but with Linux, as with anything else, there are many ways to do things.

Force Quit Applications that Won’t Die (Gnome)

The first and easiest way to kill an application is to install the Force Quit button on your top or bottom panel. Right click on the panel, select “Add to Panel…”, then scroll down to find Force Quit and select “Add”. This adds a little broken-window icon to your panel which, when clicked, turns your cursor into a deadly cross. With the cross, click on any window that refuses to die and usually it will be instantly gone.

End that Pesky App
Managing Processes that are not Being Nice (Gnome)

If Force Quit is a bit too brutal a method for you, you can always go to System -> Administration -> System Monitor, which is a bit like the Windows Task Manager, and view processes in the Processes tab. Here you can choose to End a process in a softer way, as well as reset its nice value (priority).

Kill that Lurking Background Process

Sometimes an application will seem to close normally, or even after a force quit, but will continue to skulk in the background, refusing to give up its entitlement to your CPU and memory. This is where the Terminal comes in handy for terminating that rogue John Connor of processes.

Fire up the terminal (Applications -> Accessories -> Terminal) and type the following:

top

If you already know the name of the application, or it’s not eating your CPU cycles, you can use grep to filter on all or part of the application name:

ps -ef | grep -i [ name of app ]

Note the process id (pid) of the rogue application taking up CPU, then type:

kill -9 [pid]
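kill -9 is the nuclear option – it gives the process no chance to clean up after itself. A politer sequence is to try a plain kill (SIGTERM) first and only escalate if the process is still there:

kill [pid]       # polite SIGTERM, lets the app clean up
kill -9 [pid]    # if it's still hanging around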

Killing Applications that Crash your Desktop (X)

Of course, the above methods are no use if an application has caused your desktop to seize up. Not to worry: most distributions boot up with several virtual terminal sessions, with X (the GUI desktop) running on only one of them.

When X crashes,

  • Press Ctrl+Alt+F1. This will take you to the login prompt of the 1st of 6 terminals running.
  • Login as root, or with your user id and then sudo to root.
  • Run the top command to get a list of processes and PID numbers.
  • Press Alt+F2 to log into the 2nd terminal.
  • As with killing background processes above, you can simply type ‘kill -9 [pid]’.
  • Press Alt+F7 to get back to your desktop, and all should be well.

Note: even if the desktop does not recover after killing the process, as long as you can access one of the other terminals you can still do a hard reboot by typing:

shutdown -fr now

If all else fails….

Raise A Skinny Elephant

It may sound like some kind of incantation or ritual needs to be performed, or possibly a mafia codeword for calling in a hit on your computer – but it’s really just a mnemonic device.

Skinny Elephant Guy (source: Vin Rowe)

If your desktop has completely seized and none of the above will work, then just remember that

Raising Skinny Elephants Is Utterly Boring.

This will give you back control of your keyboard, sync your hdd, end all processes, and perform a manual but orderly reboot. It assumes the magic SysRq key is enabled in your kernel – check /proc/sys/kernel/sysrq if in doubt. Type in the following:

Alt+SysRq (+ letters below in order, pausing between key strokes)
+r = Take the keyboard out of raw mode (recapture your key presses from X)
+s = Sync the disks
+e = Terminate all processes (SIGTERM)
+i = Kill all processes (SIGKILL)
+u = Remount all filesystems read-only
+b = Reboot the system

On reboot let the system perform any filesystem checks and it will recover automatically.
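The same magic is exposed through /proc as well, which is handy when the keyboard is out of the picture but you still have an SSH session into the box (again assuming sysrq is enabled):

# As root: sync, remount read-only, then reboot - the S, U and B of the elephant
echo s > /proc/sysrq-trigger
echo u > /proc/sysrq-trigger
echo b > /proc/sysrq-trigger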

If for some reason you can’t recall Skinny Elephants, or the thought grosses you out, you can always use BUSIER to help you remember and execute it backwards: (Alt+SysRq)+R+E+I+S+U+B. The order differs slightly from the phrase (terminate and kill before sync), but both sequences are in common use.