This is our account of using the Ubuntu computer operating system and other free software.
On 23 April 2007 we acquired a used Dell PC to run the Linux-based Ubuntu 7.04, while continuing to run Windows XP on our other PC.
We had several motivations for trying out Linux, as you will read here, including being very, very unimpressed with Windows Vista as a potential replacement for Windows XP once its mainstream support ended on 14 April 2009. We also wanted a second computer so both of us could work at the same time, and trying out alternatives to expensive commercial software turned out to be a good idea.
Because of our success in using Ubuntu, on 14 June 2008 we reformatted our remaining Windows XP computer and installed Ubuntu 8.04 on it, going Windows-free.
We haven't missed Windows at all; in fact we have been far better off without it. Linux is free, works better, is more stable and doesn't run viruses or spyware. It does everything we want to do and has been a perfect solution for our computing needs.
Like most diaries, this one puts the most recent entries at the top, so you will have to go to the bottom of the page to see the beginning of the story.
This page covers Ubuntu 7.04 Feisty Fawn to Ubuntu 10.04 LTS Lucid Lynx. Because the page became so large we have continued the diaries in our Ubuntu Diaries Part II.
I have been working more with PiTiVi since my last review and I have to say that the more I use it the less I like it.
The final straw for me came while trying to assemble a movie from clips that I shot at an airshow. The project required 43 clips to be joined into a seven-minute video with transitions. To tackle the project with PiTiVi I used Ruth's desktop PC, since it has a dual-core processor and 3 GB of RAM. Unlike my older desktop, that PC will actually run PiTiVi.
The entire process was a mess. PiTiVi imported the clips okay, but I was only able to assemble them on the timeline and add transitions with great difficulty, as PiTiVi kept randomly moving clips around, locking up and crashing. It didn't seem to like the number of clips, even though it wasn't maxing out the CPU or the available RAM. It was only by carefully saving my work after each step that I was able to get the task completed. Then when I came to render the video it persistently got only so far and then froze. It was hopeless and in the end I had to give up.
Next I switched to my own more modestly powered desktop PC and tried doing the work in Avidemux instead. Since Avidemux doesn't recognize the audio codec my camera produces, this required ripping the audio with VLC, converting it to a .wav file with Audacity and then reintegrating the audio into each clip with Avidemux. It was a slow process, but with that completed I was able to edit the 43 clips together into a movie. The resulting video worked out okay, although I wasn't happy with the transitions between clips, which were jumpy, and in the end I posted it without transitions. All through this Avidemux performed flawlessly, never using much more than half my modest 1.8 GHz CPU or more than 400 MB of RAM.
After having posted that video I recognized that I am going to be using Avidemux more often, since PiTiVi just won't work on more than a few clips at a time. This left me resolved to do some more experiments with Avidemux to see if I could make the transitions between clips work better. In the past I have mostly used a 20-frame transition (ten frames either side of a join between clips) and left it as "fade out", which is the default. As noted, this seems to result in jumpy transitions. After many experiments, what I found works best is a ten-frame fade out to black ending at the join, followed by a ten-frame fade in from black starting at the join. You can see what that looks like in this video. It gives nice fade-to-black joins, with no jumpiness at all.
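For anyone planning the same recipe, the frame arithmetic above can be sketched as a small helper. The function and names here are my own, not part of Avidemux; only the numbers come from the recipe in the text.

```python
# Plan the fade ranges for joins between clips: a ten-frame fade out to
# black ending at the join, then a ten-frame fade in from black starting
# at the join. Helper name and structure are illustrative only.

def fade_ranges(join_frame, fade_len=10):
    """Return (fade_out, fade_in) frame ranges around a join between clips."""
    fade_out = (join_frame - fade_len, join_frame)  # fade to black
    fade_in = (join_frame, join_frame + fade_len)   # fade from black
    return fade_out, fade_in

# Example: two clips joined at frame 300 of the assembled timeline
print(fade_ranges(300))  # ((290, 300), (300, 310))
```

The twenty affected frames are the same as in the old single 20-frame transition; only splitting them into an explicit fade-out then fade-in at the join is what removes the jumpiness.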
Overall I continue to be impressed with Avidemux and less impressed with PiTiVi. Avidemux, being a linear editor, is less flexible: you can't randomly drag and drop clips into the video you are making; instead you have to add them to the project in the right order the first time and then edit them. But at least it works! If Avidemux would support the mu-law audio codec my camera produces I would be totally happy with it as a video editor.
Ubuntu seems to have been creeping up in its hardware requirements in recent versions. Not long ago the minimum system requirements page specified 256 MB of RAM (with 384 MB recommended) and a 300 MHz processor (700 MHz recommended). A recent report of the inability to install Ubuntu 10.04 on a couple of 550 MHz processor machines at Computers For Communities sent me scrambling to check the spec pages. The machines ran Puppy Linux just fine, so that ruled out broken hardware as the cause.
It seems in the recent Ubuntu website reorganization that the hardware specifications page was moved. Now there are two pages on the subject:
A recent article, How to Revitalize Mature Computers by Howard Fosdick, confirms that Ubuntu seems to have upped its system requirements. Fosdick says:
...five P-III systems on which I've tried to install the latest Ubuntu release, 10.04, have all failed. I've had success with two of three P-IV systems. In all cases, the new video didn't work out-of-the-box and fixing it has become more complicated than simply editing the now-missing xorg.conf file. I am concerned but have not yet completed testing, and am now trying some different boot options. My impression thus far is that Ubuntu is leaving older systems behind.
I am not sure that this is a bad news story entirely. Ubuntu 10.04 will run well enough on most computers that ran XP and will turn in great performance on any Vista or later box. Ubuntu is probably just using the capabilities that these boxes present. For the older boxes, as Fosdick points out, there are other lighter distributions available that will run fine on them, and he especially recommends Puppy Linux, which we have had good experiences with as well.
Over the five months since I wrote this short item I have been carefully following this issue. In October 2010 Ubuntu 10.10 came out and my tests showed that it does not run well on a PC with a 2.66 GHz Pentium 4 processor and 2 GB of RAM. It is never RAM-limited, but maxes out the CPU playing back offline videos, running the Ubuntu Software Centre and using the PiTiVi video editor, even though the Ubuntu system requirements state that Ubuntu currently needs just a 1 GHz processor and 1 GB of RAM to run well. It is also sluggish opening applications, directories and files, both on netbooks with a 1.6 GHz Atom processor and on the 2.66 GHz Intel Pentium 4 desktop. This is a change from earlier versions of Ubuntu, including even Ubuntu 10.04, which only showed problems on this class of processor when running PiTiVi, a known issue.
It is interesting to compare the minimum hardware requirements for the following operating systems:
Clearly those specs are all the same. It is also worth noting that neither Vista nor Windows 7 will run even close to well on just a 1 GHz processor with 1 GB of RAM and the same seems to be true now of Ubuntu. In conducting experiments with Lubuntu, the lightweight LXDE desktop version of Ubuntu, I have found that it runs really well on this class of processor. As a result, my current recommendations are that if you are looking for an operating system that gives good speed and performance then:
We recently had a recommendation from the National Capital FreeNet Free Software Discussion Group to try out a relatively new search engine with an odd name.
Duck Duck Go was started up by Gabriel Weinberg in September 2008 as an alternative to Google. Lots of people have tried doing just that, but Weinberg may just have succeeded.
Duck Duck Go offers a very clean and simple interface with some nice features, including one-page results: you just scroll down and it keeps adding results, with no need to click through multiple pages. It also includes "Zero-click Info", a red-bordered box that shows up at the top of many searches with a picture and quick definition information from Wikipedia. That is very handy! It also displays favicons (those little icons that each website uses) beside each result, which is also useful. When it runs out of results it offers to send you to Google to try there, which is pretty magnanimous!
DDG gets its results from the Yahoo! Search BOSS project, which provides access to the Yahoo! search index, as well as from Wikipedia and its own web crawler, called the DuckDuckBot, of course. The results are supposed to be optimized to give the best answers rather than the most answers, and they seem pretty good. One thing it doesn't do is give the number of results found, but I am not sure how useful that information is anyway.
Most of the best features of DDG are what it doesn't do. As Weinberg explains in detail the main focus of this search engine is better results with much better privacy. Unlike Google, DDG doesn't collect or share personal information, like your search strings. It also doesn't keep your search history and doesn't put cookies on your computer unless you use the customization features. It doesn't track you and it can't create patterns from your searches to advertise at you. This has some implications. For instance other search engines have been compelled by court orders to turn over search information as part of criminal investigations, even though nothing requires them to collect that information in the first place. DDG doesn't collect the information so it can never be turned over to anyone.
DDG doesn't seem to have advertising at all, or paid-position searches for that matter, so I am not clear how it is funded.
As far as the funny name goes Weinberg says "I just liked it. It is derived from Duck Duck Goose, but it's not a metaphor—really."
So far I am pretty impressed with it - it seems to work well and the privacy standards are good news. It also gives a good alternative for people who don't want Google to own their whole life. I am using it as my default search engine on Chromium.
PiTiVi is the new default movie editor included with Ubuntu, starting with 10.04 Lucid Lynx. Interestingly it is a "Linux-only" application.
Unlike Avidemux, PiTiVi has a fairly complete manual available, which is a great help in learning how it works. As a bonus the mu-law audio codec our camera produces is supported by PiTiVi right out of the box!
I have been trying out PiTiVi 0.13.4, going through the manual and trying out features. It works well importing clips and trimming them. By zooming in on the timeline, quite precise cuts can be made. Based on the manual, I was under the impression that PiTiVi does not include transitions between clips. This means that my first efforts have "jump cuts", which make for a bit of a choppy result, as can be seen in this video.
I discovered that one thing the manual does not describe well is that PiTiVi actually does have the ability to do transitions between clips! It requires overlapping the video clips on different layers, marking keyframes on the overlapped portions and then using the opacity control to reduce one clip linearly to 50% opacity over a few seconds while increasing the other one to 100% opacity over the same time frame. This actually produces a good result, as can be seen in this video. The transitions are nicer than those produced by Avidemux, which tend to be choppy. According to the current PiTiVi manual, transitions will be much easier in the next version, 0.13.5: you will just have to drag the clips onto the same layer and overlap them to create an automatic fade transition. Future versions will also incorporate a range of new video effects, a product of a Google Summer of Code project for 2010.
I have discovered that even though the manual doesn't show how to do it, PiTiVi does support adding titles. The best way to title a video is to make a .png image file, the same dimensions as the video, with a transparent background and your title on it. This can then be imported into PiTiVi as a clip and dragged onto the timeline. One thing to keep in mind is that the timeline composites layers from top to bottom, meaning that if the title is below the movie clip it won't be visible; for it to be seen it has to be above the video clip. The opacity control (the red line) can then be used to fade the title in or out, if desired. The resulting titles look very professional, as you can see in this video I made.
One PiTiVi oddity I ran into was rendered file formats. When rendering the video from .avi clips I stuck with the default settings, which are OGG Muxer for a container, Vorbis for audio and Theora for video. This saved and played fine on my desktop using the Totem movie player, as they are all open formats. When I uploaded it to YouTube, though, the video didn't display properly. I am not sure if this is a YouTube problem or if the .avi input didn't work well as an .ogg output. Setting the container to FFMPEG AVI and the video to Xvid rendered a video that wouldn't play in Totem but would play in VLC, and when I uploaded it to YouTube it played fine.
I had the same formatting problem with videos made from older .mov clips: they would render and play fine as .ogg files, but produced bad video results on YouTube. They worked fine on YouTube when I used a .mov container, Xvid video and LAME MP3 audio, though.
The only remaining issue I have with PiTiVi is that the current version is a resource hog and maxes out my single-core AMD 1.8 GHz CPU trying to play back the edited video, although this is a known issue which they are addressing. Not being able to play back the video while editing makes it pretty much impossible to use effectively on my PC, although it runs okay on Ruth's newer desktop computer, which has a dual-core 2.4 GHz CPU. Otherwise it seems to be a serviceable video editor that is being steadily improved over time. It will be interesting to see if we get updates through Ubuntu when version 0.13.5 becomes available and whether that new version will fix the issue. In the meantime I have rated it 7/10, just due to its steep hardware requirements.
Overall PiTiVi is fairly impressive. On the right PC it works well and the documentation is pretty good. It allows us to do everything we want to do with video editing and will probably replace Avidemux as the video editor we use, when they get its current problems sorted out.
As described in a previous entry our new camera takes pretty good videos but has a sound codec incompatibility problem with my favourite video editor, Avidemux. The Nikon camera produces .avi videos with audio that uses the mu-law audio codec, which isn't supported by Avidemux.
In most cases my recent videos have been easy to edit in Avidemux, because I haven't wanted to retain the video's audio tracks anyway, preferring to create a music track in Audacity and then save it as a .wav file and use that instead. Still it would be nice to be able to use the audio track or at least sections of it when needed.
I have been looking for an application that will extract the audio track and save it, allowing me to mix it in Audacity. I tried a few that didn't work and then discovered that I already had the application I needed: VLC!
I found the information that put it all together in the article Extracting and Using a Recorded Sound Effect with VLC and Audacity from Free Software Magazine by Terry Hancock.
As Hancock demonstrates, VLC can save sound tracks in many useful formats, including .ogg. The key is to open VLC and go Media → Save/Convert → File selection → Add → Convert/Save → Settings - Vorbis/OGG → Start. The resulting .ogg file can then be opened in Audacity, edited, saved as a .wav file and used as the soundtrack in Avidemux.
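The same extraction can also be scripted through VLC's command-line interface instead of the menus. This is a sketch only: it assumes VLC's `cvlc` binary is on the path, the file names are illustrative, and the `--sout` transcode chain mirrors the Vorbis/OGG settings chosen in the GUI.

```python
# Build (and optionally run) the cvlc command that rips a clip's audio
# track to a Vorbis .ogg file, equivalent to the Media -> Save/Convert
# dialog described above. Paths and names are examples, not fixed values.
import subprocess

def vlc_extract_audio_cmd(video_in, ogg_out):
    """Return the cvlc argument list for transcoding audio to Vorbis/OGG."""
    sout = ("#transcode{acodec=vorb,ab=128,channels=2}"
            f":std{{access=file,mux=ogg,dst={ogg_out}}}")
    return ["cvlc", "-I", "dummy", video_in, f"--sout={sout}", "vlc://quit"]

cmd = vlc_extract_audio_cmd("clip.avi", "clip.ogg")
print(" ".join(cmd))
# To actually run it (requires VLC to be installed):
# subprocess.run(cmd, check=True)
```

The resulting .ogg can then be opened in Audacity and exported to .wav exactly as in the manual workflow.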
To make use of this procedure I used VLC to extract the sound tracks from a couple of video clips as OGG files, Audacity to convert them to WAV files and then Avidemux to reintegrate the audio with the video and save the clips. These were then edited together in Avidemux and they work fine as can be seen in this short video, which uses this method.
So I am pleased - one long-standing issue solved!
We brought our first computer running Linux home on 23 April 2007, more than three years ago. It was a used Dell desktop and it started off running Ubuntu 7.04 Feisty Fawn. A little over a year later, on 14 June 2008, we switched our other desktop PC from Windows XP to Ubuntu. Both PCs were using Ubuntu 8.04 Hardy Heron at that time.
This week marked two years of running nothing but Linux at our house, including two netbooks that have been more recently added. All four PCs are now running Ubuntu 10.04 Lucid Lynx and I thought this anniversary would be a good opportunity to go over the experiences we have had.
Do we miss Windows? Not at all. We have found that we can get by without the crashes, blue-screens-of-death, viruses, spyware, the high purchase cost and everything else that goes with using Windows. For us Linux came of age with Hardy Heron and since then we have had no hardware problems and no software problems that couldn't be easily solved. We haven't seen a virus, except a couple of Windows viruses I downloaded on purpose to test our Clam AV virus scanner.
There really isn't a lot to add. We have found Ubuntu infinitely customizable, that it is very stable, the hardware we have all works fine with it and best of all it is free of charge and protects our freedom, too, because the source code is available.
Perhaps the best endorsement has come this week. Ruth's daughter Rachael previously used Mac OS X at school and found it a poor experience. Now she has to use Windows at work, where she loses hours each day to computer crashes, network crashes, blue screens and reboots. She says that Windows is preventing her from getting any work done. Each day after work she is happy to use her Ubuntu netbook, as she has never had a crash on it - it just works. She doesn't understand why they haven't switched to Linux there.
Some odd things have been going on in the Linux CD burning world.
As noted below there is Bug Report 581926 filed against K3B for not being able to erase CDs. I have managed to confirm that on both our desktop machines. One of these has an older generic CD-R/CD-RW writer while the other has a modern ATAPI DVD A DH16A6L-C combination CD/DVD writer. That latter unit has given us problems in the past and continues to do so.
At the same time that K3B seems to be having erasing problems, the standard Gnome CD burner, Brasero, has been getting better. It is now in its 2.30.0 version and runs just fine on the older CD-R/CD-RW writer, erasing and writing data and images without any problems.
I have run a series of tests to try to figure out what is working and what isn't. The results of the testing have been a bit confusing, so here is a table that summarizes what I have found:
|Task||Older generic CD-R/CD-RW writer||ATAPI DVD A DH16A6L-C|
|Brasero make data CD-RW||Yes||Yes|
|Brasero make data CD-R||Yes||No|
|Brasero make ISO image CD-R||Yes||No|
|Brasero erase CD-RW||Yes||Yes|
|K3B make data CD-RW||Yes||Yes|
|K3B make data CD-R||Yes||Yes|
|K3B make ISO image CD-R||Yes||No|
|K3B erase CD-RW||No (as per Bug 581926)||No (as per Bug 581926)|
So what this adds up to is that K3B has a bug that prevents it from erasing CD-RWs, and at the same time support for the ATAPI DVD A DH16A6L-C combination CD/DVD writer seems to be pretty uniformly poor in both K3B and Brasero. I guess it shouldn't be a surprise that the newer hardware is more of a problem than the older hardware, probably due to incomplete drivers.
So operationally the desktop with the older CD-R/CD-RW writer works fine and can do everything using Brasero; with K3B it is limited only in that it cannot erase CD-RWs, due to the reported bug. The desktop with the ATAPI DVD A DH16A6L-C can make data CD-RWs with either program and data CD-Rs with K3B (but not with Brasero), can't write ISO images with either program, and can erase CD-RWs with Brasero but not with K3B.
I am hoping that the K3B bug is resolved soon, although Brasero is actually working well on the older PC. All I can do is submit bug reports.
Until all this gets sorted out I have rated Brasero at 7/10 and K3B at 6/10. We are currently able to get all our CD burning done, but it takes some real thought to avoid the limits of both programs and the hardware.
As part of my ongoing troubleshooting of this problem I posted this question on the Ubuntu Forums and have carried out a lot more testing based on advice I received in response to my query. I discovered that the ATAPI drive will successfully burn ISO images to CD-RWs. It will also successfully burn ISO images to CD-Rs if it is done at a much slower speed (8X on both Brasero and K3B instead of "auto"). The CDs test fine on other PCs, but will not even boot on the ATAPI drive, which is very odd as it will boot ISO CDs made on other PCs. I am still investigating this problem and it may turn out that the ATAPI drive is not 100% serviceable.
After a long discussion on the Ubuntu Forums about the way four different CD burning programs, including Brasero, K3B, wodim (from the command line) and Xfburn are behaving on this CD drive, the consensus is that this drive is unserviceable. I guess that accounts for its erratic behaviour.
I will add that CD burners are particularly hard to troubleshoot because you have to eliminate three different potential causes of the problem: the drive itself, the software and the CDs you are using. It can take a number of tests of each parameter to build up a complete picture. Because I have determined that the drive is the problem here and not the CD burning applications, I have changed the ratings for Brasero and K3B accordingly. K3B is still lacking as it still will not blank CDs and Bug Report 581926 is still outstanding.
This troublesome CD drive has lately been refusing even to boot from CDs, so I figured it was time to replace it. I got a new SATA CD/DVD reader and writer from PC Cyber today, installed it in the PC and tested it out. The new drive seems to read and write CDs and DVDs just fine, so I think we can reasonably conclude that the previous drive was unserviceable.
We have now been using Lucid Lynx for 26 days and I can say that it is a great release. Despite the new look, it is not a revolutionary change, but really just an incremental improvement over earlier Ubuntu releases.
There have been a few updates to Lucid so far, averaging about one a week, mostly consisting of behind-the-scenes library files and similar. We did get a new version of the Evince PDF reader this week, 2.30.1. I was particularly interested to see if Clam AV would get updated. Lucid shipped with Clam 0.96 in the repositories and 0.96.1 was released on 19 May 2010, nine days ago. Today an Ubuntu update brought that version, I am pleased to say.
One application we haven't had an update to is the Chromium browser, which we are all using at our house. Lucid shipped with Chromium 5.0.342.9 (43360) Ubuntu and that is the version we still have, even though Chromium had builds up to 6.0.419.0 as of 27 May 2010. Even Chrome for Linux has shipped 5.0.375.55 as a non-beta "stable" version and has 6.0.408.1 as its development version. Work continues at a fast pace at the Chromium Project and we are all very happy with the browser, so hopefully we will have a new updated version "in the mail" soon. I did check through Launchpad to see if there were any bug reports there and didn't find anything on this subject. One response to a question I posted on the Ubuntu Forums offered the opinion that "Updates for packages in the repos are updated when its confirmed stable, if I remember correctly...". I guess we will see if he is right.
I should add one complaint I do have about Lucid: for some reason blanking CD-RWs in K3B just doesn't work. I get a bunch of errors and permissions problems instead. As a work-around I have been using Brasero to blank the CD-RWs and then K3B to record them again, as Brasero is far too slow at recording. This seems to have been reported already as Bug 581926.
I am pleased to report that today we received an updated version of Chromium, 5.0.375.38 (46659) Ubuntu, through the regular update manager process. This replaced Chromium 5.0.342.9 (43360) Ubuntu, which was the original version available at the installation of Lucid. It looks like we will get regular updates to Chromium in the future.
Today we received another update to Chromium, bringing us to 5.0.375.70 (48679) Ubuntu from the previous 5.0.375.38 (46659) Ubuntu. This is notable in that it is also the current stable non-beta version of Google Chrome. It looks to me like Chromium is being kept up to the stable Chrome version, which is actually a safe way to proceed. This new version incorporates bookmark synchronization.
|Puppy Linux 5.0.0 desktop|
The latest version of Puppy Linux was just released on 15 May 2010 and so I downloaded a copy of it and have been giving it a try to see how it works.
Puppy Linux 5.0.0 is nicknamed Lucid Puppy, a nod to Ubuntu's Lucid Lynx, whose repositories provided the binaries for many of the applications available in Lucid Puppy.
Lucid Puppy is a great leap beyond its immediate predecessor, Puppy Linux 4.3.1, in many ways. To start off, it was developed not by Barry Kauler but by a team headed by Chief Developer Mick Amadio and Coordinator Larry Short. Barry had long imagined a means of allowing Puppy users to tap into other distributions' application repositories, and 5.0.0 has made that a reality.
The previous iteration of Puppy came with two browsers, the very lightweight Puppy browser and SeaMonkey. They get the job done, but aren't optimal browsers for most users. This newest Puppy comes only with the Puppy browser, but when you click on the browser desktop icon it launches the Browser Installer, which gives a choice of downloads:
Likewise this new Puppy introduces a new desktop icon for Quickpet V2.0, the simple and easy application installer. This offers a quick way to download and install such application favourites as KompoZer webpage authoring, Thunderbird email, the GIMP graphics editor, the Audacity audio editor, Songbird music, and Nvidia and ATI video card drivers (although not for my older Nvidia GeForce 4000 card).
Quickpet also offers updates for applications, something new to Puppy. These come via a voluntary "pull" process that requires opening Quickpet and selecting the updates, so they can be ignored by dial-up users who don't want the large downloads involved.
I tested out the network connection tools and they seem to be an improvement over the last edition, with easy DSL setup and also great dial-up tools. I tested out dial-up on a USB modem and it was quickly identified and connected without any complications.
Puppy still has some drawbacks for users with more complex requirements. For instance there are no user accounts - every user is "root" - so it is fine for one user per computer, but not so good for those who share a PC. My own experiments with SSHFS show that it works for connecting out to other network computers, but there doesn't seem to be an easy way to connect into the Puppy computer. The usual SSHFS addressing doesn't work, since the Puppy PC has no host name and no user account. According to the Puppy Forum others have had this problem and I haven't seen a solution to it, although I am sure there is one.
This latest version of Puppy is a quantum leap over previous editions and shows that this Linux distribution is growing and developing well. Best of all it provides excellent support for dial-up, something that many other distros, like Ubuntu, seem to have forgotten.
Now that I have completed installing Lucid on all the PCs in the house I have also had a bit of time to check out the applications that come with it and even try most of them out. Here is a quick rundown on some of the software that comes on the installation CD:
To that I added from the repositories:
As usual I removed the Evolution e-mail client and F-Spot. I don't use either and they tend to get in the way.
I have already discussed Chromium, which I think is an excellent open source web browser, so here are some comments on a couple of newer applications.
The Ubuntu Software Centre (USC) has evolved very quickly since its introduction in Karmic Koala. The current version is very slick, simple to use and effective. It basically combines the simplicity of the old "Add/Remove Software" with the power of Synaptic. USC allows you to queue up lists of applications to download, and it then downloads them all in turn, one at a time.
USC also allows you to find applications that were not available in the old "Add/Remove Software", like Tesseract, for instance. Users used to have to turn to Synaptic to remove many packages, but these can all now be removed using USC. Eventually USC is slated to replace Synaptic and probably the Update Manager as well. So far it is working well and its development seems to be progressing quite quickly.
One trick I did learn is that on initial installation USC won't show all the applications available until you open Synaptic and refresh it to download the latest lists.
Simple Scan is a new application that was developed in-house at Canonical by Robert Ancell and is officially described as a "Simple Scanning Utility".
Compared to the XSANE application that it replaces on the installation CD it is indeed very simple, with a very clean interface. Both XSANE and Simple Scan use the SANE backend and just provide different user interfaces for SANE. Perhaps some people found XSANE a bit too confusing, with its multiple window system and esoteric controls. On Simple Scan the controls are very basic, but do include some critical items, such as resolution, which controls file size.
Simple Scan will save in three formats: PDF, JPG and PNG. This presents a problem for later performing optical character recognition on a document with Tesseract, because Tesseract requires a ".tif" file as input. This is easily addressed by saving as a JPG and then converting to ".tif" using GIMP. I often open the scanned files in GIMP anyway to adjust the contrast to make Tesseract work better, so it is little extra effort to save them in ".tif" format.
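When no contrast adjustment is needed, the JPG-to-TIF step can also be batched from a script. This is a sketch under assumptions: it uses ImageMagick's `convert` command as a stand-in for opening each file in GIMP by hand, and the file names are made up for illustration.

```python
# Build the ImageMagick commands that convert Simple Scan JPGs into the
# .tif files Tesseract expects. The `convert` tool and the file names
# are assumptions for this sketch, not part of Simple Scan or Tesseract.
import subprocess
from pathlib import Path

def to_tif_cmd(jpg_path):
    """Return the command list converting one scanned JPG to TIFF."""
    jpg = Path(jpg_path)
    return ["convert", str(jpg), str(jpg.with_suffix(".tif"))]

for scan in ["scan-001.jpg", "scan-002.jpg"]:
    print(" ".join(to_tif_cmd(scan)))
    # To actually run it (requires ImageMagick, then Tesseract):
    # subprocess.run(to_tif_cmd(scan), check=True)
    # subprocess.run(["tesseract", scan[:-4] + ".tif", scan[:-4]], check=True)
```

For scans that do need contrast tweaking first, the manual GIMP route described above still applies.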
So far Simple Scan looks like a worthwhile and easy to use application.
The application stack for Ubuntu 10.04 is impressive. There is definitely a lot of variety available to keep everyone happy. The extra effort that has obviously gone into providing builds of Chromium and also writing Simple Scan shows why Ubuntu remains the most popular Linux distribution on the desktop for a good reason.
|Default Ubuntu 10.04 Lucid Lynx desktop|
I decided to go ahead and install Lucid Lynx today, hoping that the repositories had calmed down enough to allow me to install all the applications we each want on our PCs. It worked out pretty well; the downloads were fairly quick.
We work from individual checklists that we have developed for each computer, allowing each one to be set up the way its user likes it. As we carry out the installation we update the checklist, which keeps it useful for next time, hopefully.
The checklists specify backing up all documents, testing the CD and then going ahead and installing the new version of Ubuntu. After a couple of previous experiences we always do a "fresh install" rather than an upgrade. This clears out any old junk and also avoids problems with things like associating Flash with the browser.
The actual installation went reasonably well. I started with my desktop PC, as it has the printer connected to it and the other PCs all need to find the printer on the network. My PC is a 2004 custom-built unit with a 1.8 GHz AMD processor and an older NVidia GeForce4 MX 4000 graphics card. That card caused some problems when installing Jaunty Jackalope, as it wasn't supported, which sent me scurrying to download Envy NG to get an older driver to make it work right. I didn't have that problem with Lucid, as it offered up a driver right away on installation.
The same thing happened with the HP 1018 printer I have. In the past I have had to install firmware from the command line to get it working, but with Lucid as soon as I turned on the printer it offered to install the firmware - very impressive.
The installation was quite slow on my PC for some reason - it ran about 45 minutes for the CD to do its thing. The same type of installation on Ruth's Intel PC took half that time and I am not sure why that would be. Regardless the installation went well, even if it took a little longer to complete on my PC.
In testing out our hardware the printer works fine from my PC, as does our camera and scanner. Lucid includes a new scanning program that replaces XSANE, called Simple Scan. I'll have to work with it for a while and see if it will suffice or if I would rather download XSANE after all. During the initial trials it looks pretty good, gets the job done and works well with our scanner.
One thing I was very keen to check out is the Chromium browser, so I downloaded it from the Ubuntu Software Center and gave it a run-through. I can say with good assurance that it is identical to Google Chrome, except for the Google branding and tracking. The latter is only evident by checking Options→Under the Hood where the checkbox for informing Google of crashes and usage is missing. Otherwise it looks like Chrome and works like Chrome, except it is fully open source. It is my default browser.
As noted below in First Look at Lucid Lynx, I am not keen on either the default Lucid desktop picture or the new default theme "Ambiance" that puts the window control buttons on the left instead of the right, so I changed the theme to "Dust Sand" and the desktop image to an elegant one from Gnome-Look that I have been using on Jaunty.
Overall I can't find anything that doesn't work on Lucid Lynx, and a whole bunch of things worked better during the installation, such as the NVidia drivers and the printer. Lucid looks like another winner in the Ubuntu line and we intend to be using it for at least a year or two.
Lucid Lynx was released yesterday and I managed to get both the desktop and netbook versions pretty quickly from the mirrors in Slovenia and Bosnia, respectively. As usual the North American mirrors were clogged. We ran live sessions on both desktop and netbook computers and our initial impressions are that it looks good and seems to work well, too!
A couple of interesting changes regarding applications: the XSANE scanner software is gone from the default CD and has been replaced by Simple Scan. XSANE is available from the repositories, as is GIMP, which is also not on the CD any more. SSH and SSHFS are now both available from the Software Center, so you don't have to use Synaptic to get those packages. Tesseract seems to be missing from the repositories, but I need to do some more homework to confirm that.
While I was checking the repositories through the new Ubuntu Software Center I also looked to see if the Google Chrome browser was in there. It isn't, but the open source Chromium browser (Wikipedia page) on which Chrome is based is there instead. This is a good option for those people who want the speed and interface of Chrome, but without Google tracking or branding. It should be very similar to the SRWare Iron browser for Windows. I am interested to see if the Google themes work on it and if it gets regular updates, too, since it is listed as "community supported". I'll give a full review since I will be installing and using Chromium in place of Chrome on Lucid.
With Lucid much has been made of moving the window control buttons from the right side to the left, but many of the same old themes are still available in "Appearance" and all of them have the buttons on the right - so I really think this was a red herring. If you don't like the buttons on the left then pick a new theme.
One word on the default purple and orangey desktop image - it is awful. Ruth took one look at it and declared that that is what it looks like when you hit your head really hard. I agree, it does look like someone did it in a big hurry in GIMP. Fortunately this is easy to fix, too.
I showed the Ubuntu netbook remix interface to Ruth's daughter, Rachael. She has been using a Macbook Pro with OS-X for the last two years as a college requirement. She noted that with the purple colours, the left-hand window control buttons and the grey icons, Ubuntu 10.04 does look very Mac-like. But as noted, that is easy to fix, just pick a new theme and desktop image and it can look any way you like. I have been using the very industrial and slick "Dust Sand" theme on Jaunty and it is available for Lucid as well, so I may well use that too. There are literally tons of themes and desktop images available for free at Gnome-Look.
I haven't installed Lucid on any of our four PCs here yet. I have the disks so it would be easy to do so, but typically the application repositories are clogged for a few days and that means you can't get any other applications beyond what is on the CD, such as those needed for networking. So we will wait until after the weekend "Global Ubuntu Install-Fest" is over and the repositories calm down.
|Puppy Linux desktop|
I recently ended up carrying out a bit of an experiment that included inadvertently doing some work with Puppy Linux. The experience turned out to be a good one and worth relating, particularly for people trying to use Ubuntu on dial-up internet.
Here is how this all came about. As part of my volunteer work with Computers for Communities (C4C) I build up volunteer hour credits which can be cashed in for a free PC. Because we have three PCs at our house and we don't need any more, I put together a PC for a friend of Ruth's who is disabled and has no computer at home. This person will be using NCF dial-up internet.
As this diary entry relates we managed to get Ubuntu 7.04 Feisty Fawn up and running on dial-up without too many problems using pppconfig from the command line back in April 2007.
That earlier success led me to believe that getting another Ubuntu PC working on dial-up should be fairly simple. That was not the case.
I put together a nice complete Dell PC with Ubuntu 9.10 Karmic Koala on it and had an external USB modem, all from C4C. I took the PC home and added Gnome PPP, a nice dial up GUI. Ubuntu also comes with the pppconfig command line dial-up configuration tool out of the box. The first thing I found was that Ubuntu won't recognize the USB modem.
I worked on the problem for the next month with help from some much more knowledgeable people at C4C and also spent a lot of time on the Ubuntu forums. The forums have a lot of recent entries from people asking for help with dial-up and very few answers. There are some really good tutorials, such as one on Ubuntu Geek, but it dates from 2006 and no longer works. The one really useful recent forum post I found was Dial-up Redux. The author essentially recommends using pppconfig, which I tried. After a raft of permissions problems I was able to get it to dial out as root, but it could not connect to the point where I could view any websites. It turned out to be some unidentified Ubuntu software problem, because the same hardware running Puppy Linux worked fine. Gnome PPP turned out to be a non-starter: it dialled, but the Ubuntu back-end files are no longer compatible with the application and so it could not sign in.
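For anyone wanting to try the same route, the pppconfig approach boils down to a few commands. This is a sketch from memory, with the provider name "ncf" as a placeholder for whatever name you give during configuration:

```shell
# Run the interactive configuration tool (it asks for a provider name,
# phone number, username, password and modem port)
sudo pppconfig

# Dial out using the provider name you configured
sudo pon ncf

# Watch the system log to see whether the handshake succeeds
tail -f /var/log/messages

# Hang up when done
sudo poff ncf
```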
I am not saying that dial-up cannot be done on Ubuntu, but it exceeds my ability to make it work.
Here is what I have learned:
Puppy Linux is actually very impressive. It is very easy to use, especially for people coming from Windows, fast and pretty darn complete right out of the box. I would definitely recommend it for people on dial up, because of its dial-up friendliness and also because of the lack of updates to download.
Puppy does have some drawbacks, including that there is no account sign-in (everyone runs it as root) and that the available applications are few. That latter problem will probably be solved soon, as a future version will be able to use other distros' repositories.
Some useful Puppy resources:
Testing Puppy Linux with a D-Link V.90 USB 56K dial-up modem showed that it easily identified it and was able to log on with ease and no need for command line or other complex gyrations that would defeat a new user. The dial-up support built into Puppy is impressive!
Version 5.0.342.7 of Chrome arrived today and it includes some changes and improvements worth noting. Work seems to be continuing on Chrome at a quick pace, with new versions coming out every month or so now. It is encouraging that each version seems to be an improvement on the last as the project matures quickly.
The changes in this new version include:
The new privacy features include moving the Clear Browsing Data button from Options→Personal Stuff to Options→Under the Hood and also adding a completely new interface for Content Settings at Options→Under the Hood→Privacy. This tabbed box allows a great deal of control over cookies, whether they are accepted and how they are stored. By default it blocks all third party cookies and, this is something I have missed from Firefox, it can be set to clear all cookies when you close the browser.
I did want to note that Chrome actually handles some websites better than Firefox does. A good example is the Transport Canada Civil Aircraft Register database. When you do an inquiry on the database from Firefox that returns a multi-page report, only the first page can be accessed - clicking on the links to the second and subsequent pages just takes you back to the first page again. Epiphany has the same problem, while Chrome allows all pages to be accessed as intended.
An oddity in this new Chrome version is that two keyboard shortcuts I use regularly seem to have been disabled: "Backspace" to go back a page and "Shift+Backspace" to go forward a page. This is now Bug 39435, so hopefully it will be fixed in the next version! Until it does get fixed I have rated Chrome as 9/10.
Through the Chromium bug reporting system I discovered an undocumented Linux keyboard shortcut that actually is more intuitive than using "Backspace" to go back a page and "Shift+Backspace" to go forward a page, "Alt+right arrow" to go forward a page and "Alt+left arrow" to go back a page. Due to this I have raised Chrome back to 10/10. Google really needs to document this shortcut, so I have sent in an update.
There are now a total of four bugs that I have found on the backspace issue, all merged into the first one:
I also discovered through trial and error that Chrome now includes text "drag and drop" in forms as of Chrome 5.0.342.7 beta, correcting a problem that I noted back in Chrome version 22.214.171.124.
Well some more reading of bug report threads turned up some interesting info - it seems that the "Backspace" key no longer taking you back one page is not a bug, it was disabled intentionally as explained in Bug report 30699. Essentially people were having trouble with the feature when editing in text fields as they tried to erase characters and got taken back a page instead. Bug report 36533 was filed by disgruntled users who want it back or at least as an option.
The next version of Ubuntu, due out 29 April 2010, will be called Lucid Lynx and will be numbered as Ubuntu 10.04. Like Ubuntu 8.04 Hardy Heron this will be a Long Term Support edition and so will be supported on the desktop until April 2013 and on servers until April 2015.
The team at Ubuntu have long been considering changing the look of Ubuntu from its current brown "Human" theme. In fact this roll-out of a new colour scheme was supposed to take place with Ubuntu 9.10 Karmic Koala, but the work wasn't done in time and so Karmic only received minor theme changes.
Lucid will look quite a bit different from previous versions of Ubuntu. In a recent announcement Canonical unveiled the new "Light" theme that will replace "Human". The screenshots show it as featuring white, grey and mauve in place of the browns we have become used to.
The make-over is serious enough that even the Ubuntu logo has been redesigned, with a new font and slightly new look, although the "Circle of Friends" is still there.
Is this really important? I mean once a user has installed Ubuntu they can customise it severely so it looks nothing like the default desktop anyway, so what is the point? I actually do believe that this is important. For one thing the vast majority of users don't change their desktop picture or their themes. I am always kind of amazed to walk through an office and see the default Windows XP "grassy hill" everywhere, even eight years after it was rolled out. The other reason it is important is that even though a lot of us are rather utilitarian users and don't really care what our desktop looks like, for most people aesthetics matter and, more importantly, they largely drive the all-important first impression.
Having had a look at the example new themes I quite like them and think that it will help Ubuntu make a good first impression against the competition and especially against the often-touted-as-elegant-looking Mac OS-X desktop.
Our Google Chrome browser has been through several new versions, all downloaded as part of Ubuntu updates. Currently we are on Chrome 5.0.307.9 beta. We have both been using Chrome every day since 10 December 2009, which is two and a half months. It is interesting that the earlier versions were not marked as "beta", but 5.0.307.9 now is.
One of the most remarkable things we have both noted about Chrome is its amazing stability. In the time we have been using it we have had only one crash each. Ruth had a browser lock up and yesterday I had a tab crash. Because of the way that Chrome works, each tab is dealt with as almost a separate browser instance. This is done for security as much as anything, as it ensures that even if one tab is compromised then information on another tab is not at risk. It takes a little more RAM to do this but the results seem worth the cost.
Because of the structure of the tabs this also means that it is possible to have a single tab that crashes, leaving the rest of the tabs functional. In this case I wasn't able to close the non-responsive tab or the browser, so that required a system reboot to fix.
The stability record we have seen from Chrome has been remarkable. Even Firefox, which is a very stable browser, normally has at least one crash every week or two. In my experience one tab crash in two and a half months is worthy of note.
|Acer Aspire 150|
People these days are not only "on the go" but have really become enamoured with having their high tech devices equally portable. Cellphones allow us to stay in touch no matter where we are and, of course, the revered Blackberry allows us to not just stay in touch but to get work done from any place whether you want to or not. There is a scene in the 1984 film 2010: The Year We Make Contact, where the protagonist is seen sitting on a beach and tapping furiously on a laptop, so the idea of portability and use of high technology isn't new.
What is new, however, is how such technology has evolved.
Time was, laptops were bulky and expensive. Too often they were slow and their batteries just didn't last a long time. Thanks to demand, however, all that has changed.
We recently purchased an Acer Aspire netbook. Even the nomenclature has changed. We have not just laptops that can connect to the Internet but netbooks whose primary function – not to mention selling feature – is Internet connectivity. We bought the Acer Aspire One 150 from a neat place called Factorydirect.ca.
The Aspire is classified as ultra thin, has a 10.1” screen, Dolby headphone capable, WiFi "certified" and runs on an Intel Atom processor. We got what was called a "refurbished" model which kind of implies that it was used, but that isn't the case. "Refurbished" in this case simply means that there are a couple of – in my view – invisible flaws that made the netbook unsuitable to ship to a regular retail outlet. If I hold the netbook a certain way, under certain lighting conditions that don't exist in 3 dimensional space and squint my eyes I might see what looks like a scratch on the outside of the case the same size as a grain of sand. Maybe. If I try really hard.
When we brought it home and fired it up, the OS, Windows XP-Home edition bled into view. It took quite a while for the whole thing to be user ready, but that's Windows for you – any edition. Of course, what we wanted to do was replace Windows with the netbook version of Ubuntu. However, because netbooks do not have a CD drive, the regular way of obtaining and installing Ubuntu wouldn't work. Luckily, there is a way to download the Ubuntu Netbook Remix and write it to a USB jump drive. The Acer Aspire has a couple of USB ports so that wasn't a worry there.
From one of our desktop computers, we downloaded the Ubuntu Netbook Remix 9.10 Karmic Koala, performed an MD5 sum check to ensure it was a good copy and wrote the ISO file onto one of our USB jump drives, using the USB Startup Disk Creator. It was then a matter of getting Ubuntu from the jump drive into the netbook and that was fairly simple, we just set the boot sequence in the BIOS to put the USB HDD first and rebooted.
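The checksum step is just a command or two. Here is roughly what we ran, with the ISO and MD5SUMS filenames as they were for Karmic (adjust them to whatever you actually downloaded):

```shell
# Compute the checksum of the downloaded image and compare it by eye
# against the hash published on the mirror
md5sum ubuntu-9.10-netbook-remix-i386.iso

# Or, with the MD5SUMS file from the same mirror, verify automatically;
# md5sum exits with status 0 only if the hashes match
grep netbook-remix-i386 MD5SUMS | md5sum -c -
```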
So, with all that done and the issues of wireless connectivity sorted out (we had a few minor issues to do mostly with other computers in the neighbourhood sharing the same channel), it was now up to me to see how the whole thing worked.
Admittedly, my computing needs are not too extensive or memory intensive. I'm not doing any 3D rendering or trying to launch a satellite from my living room. Mostly I use it for email and taking the odd YouTube tutorial (yes, YouTube does have its uses there). With my disability, I am periodically unable to safely climb a flight of stairs. Since our office is downstairs and we don't have a bathroom in the basement, it was becoming a hassle for Adam to have to always help me up and down flights of stairs. The YouTube tutorials I refer to are mostly on (no laughing here) crocheting. My crocheting is in the living room where there is daylight so that is where I need to watch the how-to videos. Having a netbook allows me the freedom to have my computer anywhere in the house, or in the warmer months outside, too.
There are some limits to the Acer Aspire that are worth mentioning as far as I'm concerned. Mostly, these are to do with getting used to the absence of a mouse. Both Adam and my kids tell me that I will get used to working with the touchpad and that happens fairly quickly. While there is a way of attaching a mouse to the netbook if I wanted to, doing so would defeat the portability somewhat. If I want a mouse, I can go downstairs to my desktop and use it there. I use GIMP for all my graphic designs. For that, I require a mouse and a fairly sizeable flat surface to use it. Sure, the touchpad for the Acer Aspire acts in the same way, however it is much too small for the fine detail work I need when making graphics of any kind.
The Aspire takes about 45 seconds, plus or minus a few, for Ubuntu to boot up from the time I turn the netbook on and about 10-15 seconds to shut down entirely.
The battery life is a function of drain rate and that is mediated entirely by the number of applications used. In simple terms, when I'm writing things like this review, I have nothing else operating and so can write for up to three hours before the battery needs charging. Even with the battery at, say, 20% it only takes about 2 hours to recharge fully and I can continue working while the battery charges, as long as I am near an AC outlet.
The ergonomics of the keyboard work just fine for me. My fingers are not especially thick and so small keyboards like the Acer Aspire's do not cause me any serious problems. Plus, admittedly, I like that the bottom of the netbook gets slightly warm while sitting on your lap, a really nice feature on a chilly winter day like today.
The price was really great, too. Factorydirect.ca sells the Acer Aspire 150 for $269, plus tax. This is a really good unit for people who want portability, including sitting outside in their back yards.
I like it.
|Karmic Koala Ubuntu Netbook Remix|
desktop main menu page
For the uninitiated a netbook is a little laptop computer, with fewer capabilities, no CD ROM drive and a lower price. Ruth has wanted one of these for a while, mostly so that when her MS is acting up she doesn't have to use the stairs to get to her desktop PC. Netbooks have come a long way since the ASUS EEE made its debut a few years ago; they are now a pretty refined product and they sell well.
Factorydirect.ca had reconditioned netbooks on for a pretty good price and so we picked up an Acer Aspire 150 for Ruth. In this case "reconditioned" didn't exactly mean "used". It turns out that these are factory seconds with small scratches in their cases that didn't pass Acer's standards, were bought by Seneca Data and resold, still in their original cartons, unopened. This all makes it about $60 cheaper than other retailers.
The Aspire 150 is a nice little unit, measuring 10 X 7.25 X 1 inches (25 X 18 X 3 cm) when closed and weighing 2.4 lbs (1.1 kg). It comes with the Atom N270 1.6 GHz processor, 1 GB of RAM and 160 GB hard drive. It also comes with built-in wireless internet and a screen that is 10.1" measured diagonally. It has three USB ports and connections for ethernet, video and audio out, along with an AC adaptor. The only downside is that it comes with Windows XP Home Edition, but that is fairly easy to fix.
One reason we picked the Aspire 150 is that the Ubuntu Netbook Hardware List indicates that there are no serious hardware issues with this netbook.
Naturally we wanted to install Ubuntu on the Aspire 150. There is a special version of Ubuntu called the Ubuntu Netbook Remix (UNR). This is regular Ubuntu 9.10 Karmic Koala but optimized for the small netbook screens and Atom processors. The main changes are:
We downloaded UNR very quickly. It is 10 MB smaller than vanilla Ubuntu at just 680 MB. Installing it is a bit different, as there is no CD drive to boot from! There is a good explanation of how to make up a USB stick to install UNR on a netbook. To be honest that page makes it sound complex, when, at least from an Ubuntu desktop, it is very simple:
The next step was to take the Aspire 150 out of the box, install the battery, plug in the adaptor to charge the battery and boot it up. We elected to do a test run of the hardware using Windows XP just to make sure that everything worked. If there was a problem with the wireless not working I wanted to be confident that it wasn't a hardware issue. Everything worked fine on XP and we were able to sign onto our wireless network without any problem.
I have to remark that it has been so long since I have used XP that I had forgotten what a crappy operating system it was, compared to modern Linux systems! It is odd that retailers are still selling systems with it in 2010, as it has been out of Microsoft mainstream support since April 2009.
With the hardware tested we booted to the BIOS (hit F2 in this case) and reset the boot order to put USB HDD first. Then we inserted the USB memory stick and rebooted to Ubuntu. This comes up just like booting to a CD and includes the standard built-in self-test "Check disk for defects", which we ran successfully.
Next we tried "Try Ubuntu without any change to your computer" to run a live session and allow checking the hardware for compatibility. This worked fine except the wireless modem didn't connect. We spent quite a bit of time running through the menus looking for connection information and didn't find it. In the end we clicked on the blank space on the top panel where you would expect the network icon to be and the dialogue box popped up. It was easy then to pick out our network and connect. Lessons learned!
With the hardware checked successfully we then went onto the installation, using the full disk space - goodbye Windows XP. That went quickly and smoothly.
The next step was to set up networking via SSHFS and see if that could be made to work. We got the openssh-server and sshfs packages from Synaptic, confirmed the netbook was on the fuse group and gave it a try from the command line. It connected right away! That means that Ruth can leave all her documents on her desktop PC and just open and edit them via the SSH network. That is easy and saves any document synchronization problems.
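The whole SSHFS setup amounts to only a few commands. A sketch, assuming the desktop's hostname is `desktop` and the user is `ruth` (both placeholders - substitute your own):

```shell
# On the desktop PC: install the SSH server
sudo apt-get install openssh-server

# On the netbook: install sshfs and add the user to the fuse group
# (log out and back in so the group change takes effect)
sudo apt-get install sshfs
sudo adduser ruth fuse

# Mount the desktop's Documents folder locally over SSH
mkdir -p ~/desktop-docs
sshfs ruth@desktop:/home/ruth/Documents ~/desktop-docs

# Unmount when finished
fusermount -u ~/desktop-docs
```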
Next was installing a few applications, from the Ubuntu Software Center. Ruth didn't want much and just installed the ClamTk virus scanner and Restricted Extras to get Flash and standard fonts. Since we use webmail, she removed the Evolution e-mail client.
Karmic Koala comes with Firefox 3.5 for a web browser, but Ruth really likes Google Chrome. We checked the repositories, but it isn't there in the Ubuntu Software Center or in Synaptic yet, so we installed it directly from Google. Setting up bookmarks and Flash in Firefox first ensured that Chrome would import all those settings and work right. Chrome is pretty zippy on the netbook, but then it was designed to run on netbooks and on Linux especially, as that is what the Chrome Operating System will consist of when it is released this upcoming fall. In many ways Ruth's new Aspire 150 is very similar to what the Chrome OS netbooks are going to be, but with many additional capabilities, like OpenOffice.org, the ability to save documents locally and work off-line.
Over the first day we noticed that the Aspire dropped its wireless connection quite often. That wasn't good. Ruth's daughter, Rachael, had recently been at our house with her Mac laptop and had the same problem, so I had a suspicion that the problem wasn't the netbook. I concluded it might be radio interference from other wireless networks in our neighbourhood. It seems that most wireless modems come auto-preset to "Channel 1" and most people probably leave them on that. I tried manually setting a different channel and we have had a much better signal, noticeably quicker page loads and no drops at all. I think that problem is solved!
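If you suspect the same kind of interference, you can check which channels the neighbouring networks occupy before picking a quieter one. A quick sketch using the wireless-tools package, assuming the wireless interface is `wlan0` (yours may be named differently):

```shell
# List nearby access points and the channels they are using
sudo iwlist wlan0 scan | grep -i -E 'essid|channel'
```

Pick a channel that nobody nearby is on (1, 6 and 11 don't overlap each other) and set it in the router's admin page.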
Ruth is still learning the Aspire 150 and Ubuntu Netbook Remix. She has indicated that in a week or two, once she has gained some more operational experience on it, she will do a write up here on what it is like to use the hardware and software together.
For the last three years we have been using the open source Mozilla Sunbird calendar on both our PCs. In general it works very well for individual use, but it has proven problematic due to synchronization issues.
Basically we both keep our own calendars, but we synchronize them together so they are the same. We do this by updating our own Sunbird and then exporting the .ics file to the other's PC via the network. This can then be imported into the other PC's Sunbird installation. The .ics files are saved as back-ups.
In theory that doesn't sound too bad, but in practice we have found our calendars are often out of synch, which leads to duplicate entries and all kinds of similar problems, on top of the number of steps required to synch the calendars.
We have switched to a better solution, using Google's on-line calendar application. Ruth created a calendar and then shared it with me, allowing me edit access. This means that we now both are updating the same calendar and there are therefore no synchronization issues to take care of.
The most widely criticized problem with "cloud computing" like this is that your data is being held somewhere else, in this case on Google's servers. This means that if their service fails you can lose your data. This hasn't happened yet, but the Google Calendar allows exporting the calendar as Sunbird compatible .ics file and saving it on your own PC. We do this on a regular basis, the same as we did in backing up Sunbird.
The Google Calendar can also be set to synchronize with a local installation of Sunbird automatically, but I am not sure what the point of that is, unless you will be working off-line. It seems simpler to just use Google Calendar and then back up the file, which can then be read off-line in Sunbird or even in a text editor if needed.
According to an announcement from Mozilla, Sunbird has now been discontinued with version 1.0 beta 1 so that they can concentrate on the Lightning calendar plug-in for Thunderbird instead.
We have both now been using Google Chrome for over a week and, if anything, we are both more impressed with it than on our initial look. In the past week of working with the new browser we have discovered many pluses and a few minuses.
One oddity I found is multi-column rendering on Wikipedia, which I originally thought was a shortcoming of Chrome. Some Wikipedia articles use CSS to put the reference lists in multiple columns to save space, like this page. Chrome sees only one column there, while other browsers all see three in this case. However a CSS validation of that page shows 51 errors, including Property -moz-column-count doesn't exist : 2 and Property column-count doesn't exist in CSS level 2.1 but exists in : 2, so it looks like Chrome is rendering the page correctly and not showing the invalid CSS columns. Oddly, Firefox and Epiphany show the three columns, even though the CSS for them is invalid.
One question that comes up around Google Chrome is privacy. Yes the browser has incognito mode, but that just prevents it from registering where you have been on the web on your PC. What does Google collect about you from Chrome? Google has some pretty good explanations of this in their General privacy: Privacy, unique IDs, and RLZ article (since deleted as out of date, but quoted below) and also in their Communications between Chromium (and Google Chrome) and service provider article. There is also a pretty good table summary on Wikipedia. These seem fairly transparent. Chrome does communicate with Google and those articles lay out what is transmitted and what isn't. If you don't opt into Google's suggestion service then Chrome is not supposed to transmit your URL data to Google.
The company has been pretty plain about explaining how the browser works, but it takes some degree of trust to believe that there is nothing more that they aren't saying. Google has built its entire empire on the trust of their customers, so they have a lot to lose if their declarations turn out to be false. So far most people seem to trust them, at least more than many other large companies who have proved untrustworthy over time. If you have concerns you should read the articles and then decide before installing Chrome.
I think the bottom line to remember on internet privacy is that your ISP has every bit of information on what you have done on the web, which is more than Google will have about you, even if you opt into everything Google offers.
Overall we are still very impressed with Google Chrome and are both using it as our main browser. It is the best browser we have used so far.
Google has deleted their article General privacy: Privacy, unique IDs, and RLZ as out of date, but it said at the time this was written:
• The installation ID (IID) is created at install time to remove duplicate installation reports from Google servers. This is necessary to accurately count the number of successful installations of Google Chrome that have occurred. The IID is generated randomly (not based on any other information) and is deleted in the next update check after first run.
• The client ID is used for the user metrics service. This is an opt-in service that lets users send usage statistics to Google so that we can learn how Google Chrome is being used for the sake of making improvements. It helps us answer questions like, "Are people using the back button?" and "How common is it that people click the back button repeatedly?" Users can always update their preference about sending usage statistics.
• GoogleUpdate, which is included in Google Chrome, also contains its own unique ID.
You may notice a RLZ parameter in the URL when you do a Google search from the Google Chrome address bar. The RLZ parameter contains some encoded information (like when you downloaded Google Chrome and where you got it from). The RLZ parameter does not uniquely identify you nor is it used to target advertising. Google uses this information in aggregate to find out whether groups of people are using Google Chrome actively. Not all users have the same RLZ parameter. The RLZ parameter is based on where Google Chrome was downloaded from, when it was installed, and when certain features were first used, like search.
A RLZ parameter is sent to Google with every search done using the built-in search box. It is also sent separately on days when Google Chrome has been used or when certain significant events occur such as a successful installation of Google Chrome. The RLZ parameter is stored in the registry and may be updated from time to time. The code that makes this work is not included in the open source project (http://www.chromium.org) because it only applies to the version of the browser that Google distributes, Google Chrome.
Linux users have been waiting quite a while for the Linux version of the Google Chrome browser to be released. The Windows version came out on 2 September 2008 and the Linux community just got the beta release on 8 December 2009, along with Mac users too. Chrome will now run on Windows XP SP2 and later, Mac OS X 10.5 and later and Linux, with downloads for 32 and 64 bit RPMs and Debs.
The good news is that it has been worth waiting for, as most of us thought it would be. The Chrome browser is going to be the only application and the centre-piece of the upcoming Google Chrome Linux-based operating system, so the browser had better be stable and fast!
The download doesn't come the usual way for Ubuntu users, which would be via "add/remove applications" or the new Ubuntu Software Centre for Karmic users. Instead it has to be downloaded from Google directly. I was a bit concerned that installing the package would require looking up some commands to run in a terminal window, but a quick read through the Ubuntu Forum showed that it just downloads, opens with gdebi and then you can just click through the installation. I installed it and it worked fine, so credit goes to Google for making it as easy for Linux users to install as it is for Windows users.
Chrome asks you if you want to import your Firefox settings and bookmarks. This is an important step to do, because if you miss it Flash will not work. This is because Chrome needs the Firefox information to find the location of the Flash player.
Once setup is complete you can go to Applications→Internet and there is the Chrome icon, installed and ready to go. Chrome starts by opening a guide page, but for anyone who has used Firefox or Epiphany it will be very intuitive; all the usual shortcuts work, like Ctrl+T for a new tab, Ctrl+W to close a tab, Ctrl+U for page code, Ctrl+L or F6 for the URL bar, etc. It takes a few minutes to get used to the tabbed interface, which puts the tabs at the very top of the window instead of below the navigation controls, but that is the extent of the learning curve. The interface is very simple, with just buttons for forward, back, refresh and home. Everything else is hidden behind the page or wrench icons for controlling the page and settings. It takes just a few minutes to go through the settings menu and set everything up the way you like it.
Chrome includes spell checking, but unlike Firefox can't be set to automatically dump all personal data when you close the browser. Like Epiphany, that has to be done manually from the settings menu. Chrome does include the much-touted Incognito mode, which can be opened with Ctrl+Shift+N. This allows browsing without leaving any traces on the PC, including dumping cookies and such on exit, although downloads remain. This mode has been euphemistically referred to as Porn Mode, but it probably isn't all that useful on non-shared PCs.
So that brings us to the key question about Chrome for Linux - is it faster? It loads this rather large Ubuntu Diaries webpage in 3.5 seconds, which compares to about 4.5 seconds for Firefox and 3 seconds for Epiphany. The CBC news home page loads in 4 seconds on Chrome, 6 seconds on Firefox and 7 seconds on Epiphany. On one test Wikipedia page Chrome loaded in an amazing 1 second, versus 3 for Firefox and 6 seconds for Epiphany, so results seem to vary a lot, although Chrome seems to be faster most of the time. I mostly stopped using Epiphany due to its sudden crashes, but I will have to work with Chrome and see how it does for stability over time.
Ruth has also installed Chrome on her Ubuntu PC and so far she is very impressed with its simplicity and speed and has made it her default browser.
So far I am impressed, too, enough to recommend it. We will have more information as we work with this new browser over time.
Back in August I wrote about how impressed I was that Ubuntu is now providing ClamAV version updates after the initial release of a new Ubuntu version. We had started in Jaunty with ClamAV 0.95.1 and received a new version to bring it up to 0.95.2.
On 28 October 2009 another new version of ClamAV was released, 0.95.3. Since Karmic came out in October, Jaunty is no longer the most recent version of Ubuntu, even though it is supported until October 2010. I was very interested to see if the newest version of ClamAV would show up automatically via the update process.
There was an update yesterday, 11 November 2009, that included a new version of the CUPS printing application and, yes, a whole new version of ClamAV, 0.95.3. I am very impressed! This makes it much easier for us to continue using Jaunty and not have to upgrade Ubuntu versions.
As I said previously "I think it is fair to say that the mid-stream updating of ClamAV shows that Ubuntu is gaining a certain level of maturity as a distribution". That goes doubly so, now that Jaunty is no longer the most recent version.
The latest version of Ubuntu was released on schedule on Thursday 29 October 2009. Ubuntu 9.10 Karmic Koala was available by lunchtime that day and we managed to download a copy right away without any problems, once again from the mirror in Zimbabwe.
An MD5 Sum check showed that it matched the published MD5 Sum and so we had a good copy. We made up some CDs and tested each one and again they all confirmed as good copies! We also ran a live session on Ruth's Compaq and it worked very smoothly and looks good. On Friday 30 October we delivered ten copies of Karmic to the offices of National Capital FreeNet so other NCF members can give it a try.
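For anyone who hasn't done an MD5 Sum check before, it is a one-line job in a terminal. Here is a sketch of the idea; the file and hash below are stand-ins for demonstration, so in practice you would point md5sum at the downloaded .iso and compare against the sum published on the Ubuntu release page.

```shell
# Sketch of verifying a download with md5sum. "demo.iso" and the hash
# are stand-ins; substitute the real ISO and its published checksum.
printf 'hello' > demo.iso                      # stand-in for the downloaded ISO
PUBLISHED="5d41402abc4b2a76b9719d911017c592"   # normally copied from the release page
ACTUAL=$(md5sum demo.iso | awk '{print $1}')
if [ "$ACTUAL" = "$PUBLISHED" ]; then
    echo "Good copy"
else
    echo "Checksum mismatch - download it again"
fi
rm -f demo.iso
```

If the two sums match, the download is intact; if not, the safest course is simply to download the ISO again.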
With some new features, such as reduced boot times, the Ubuntu Software Centre replacing "Add/Remove Applications", ext4 file system and Firefox 3.5, along with new themes in dark brown, Karmic looks like it is another winner in a long line of great releases for Ubuntu.
Normally we would upgrade our PCs to the latest version of Ubuntu as soon as possible, but this time we both agree that we will wait a while, perhaps even until the next release of Ubuntu, 10.04 Lucid Lynx, comes out on 29 April 2010. It is not that we have any reservations about Karmic, just that our present installation of Jaunty Jackalope is working so flawlessly that we don't see any reason to upgrade at present. Jaunty still uses the earlier Firefox 3.0 browser, but that is still supported - we even had an update to it this morning, to Firefox 3.0.15. Jaunty itself is supported until October 2010, so there is no rush to upgrade right now.
I think Ubuntu reached a plateau with the release of Hardy Heron, after many earlier releases where great strides were made with each new version. At that point most of the hardware and other serious issues were taken care of and the distribution became very smooth, easy to install and use. Ubuntu has reached a level of maturity whereby each new version brings incremental improvements that are always welcome, but they are small evolutionary advances. This removes the need to upgrade immediately and allows existing installations to be used longer.
Of course if I had to do a new installation today I would install Karmic.
We have had a Panasonic DMC-FZ2 digital camera since the summer of 2004. It has had its problems, including a failed focussing unit and some Linux incompatibilities with Feisty and Gutsy that were finally solved with Hardy. Then on our recent trip to Montréal it died - refused to keep the lens open for shooting.
We debated sending it out for repairs once again, but figured that it probably wasn't far from the end of its service life anyway.
In shopping around we found a close-to-ideal camera for us, the Nikon Coolpix L20. It uses AA batteries, of which we have lots of NiMH ones here, and it takes photos up to 10 mega-pixels and movies at 30 fps, compared to 2 mega-pixels and 10 fps with our old camera. It is also more portable and cost about what the repairs would probably have run us. The one question was Linux compatibility. Nikon removed that worry too, as the camera has a small 20 MB internal memory but also accepts SD cards, and we have several from our previous camera. SD cards mean no Linux worries: I have an SD card reader, so even if the camera itself isn't recognized the card will still work!
In testing the L20 at home I quickly discovered that I needn't have worried about compatibility. Here in 2009 Linux now supports more hardware than Windows does. I plugged in the L20 and it was immediately recognized and Ubuntu opened up Nautilus to show the photos.
So I am impressed. Once again Linux comes through without any problems, and I know that if some future kernel doesn't support this camera the card reader will still work.
The only oddity I have come across is that our video editor, Avidemux, doesn't support the sound format on the movies the L20 creates, so editing them leaves them without sound, unless a .wav file is created and added as a soundtrack.
Doing some digging on the Ubuntu Forums I discovered that the Nikon L20 creates movies with a mu-law audio codec, which Avidemux doesn't support out of the box. The ideal solution is to download the codec, then download the Avidemux source code and recompile it with the codec. That would be fine, but there do not seem to be any step-by-step instructions on how to do that, and some users have reported it still doesn't work anyway. Out of the box Avidemux seems to handle only .wav files and not much else.
I found a reasonable workaround for movies that need editing and sound added. The free audio editor Audacity is available for Ubuntu and allows importing most audio formats, editing them and then saving them as .wav files. Then they can be used as soundtracks for edited movies and saved in Avidemux, like this example from today:
In the past I have written about our struggle to find a video editor that worked.
Avid Free DV only ran on Windows, has been discontinued and is far too complex and user-hostile for home use.
Open Movie Editor has a great, easy to use interface and is Linux-only, which works for us. The only problem with it is that it renders huge files. Even the tiniest video made from two 10 MB clips was over 200 MB after editing and that made it useless for uploading to You Tube, which has a 100 MB limit.
In the end we settled on using the on-line video editor, JumpCut, which worked great. It allowed us to upload clips, edit them, add titles and post them. Unfortunately it was bought out by Yahoo! in October 2006 and, as part of their restructuring, they closed the site in June 2009. That left us without video editing once again.
This week I had a couple of clips that I wanted to edit and post on You Tube, and so basically I started over again trying to find a way to do that. I tried downloading Open Movie Editor and then upgrading it to the latest version, since the Ubuntu repository version is the one we tried previously, 20080102. The current version, 20090105, might possibly be better. Unfortunately the previously posted instructions on how to upgrade no longer work, so I wasn't able to upgrade it. I did post a question on the OME Forum, but didn't get an answer. I tried using 20080102 and this time, instead of rendering large files, it rendered very small (32 KB) movies that were unreadable. Back to the drawing board.
The spotty documentation left me to figure out how it works mostly on my own and by searching through the forum for problems others have solved. So far I have figured out how to load videos, cut out sections and create fade transitions. Titles have been an interesting problem. I tried using the subtitle feature, but the documentation is too incomplete to figure out how to make it work. Using title stills before the video works fine, but eliminates all audio from the resulting video. Putting the stills later, after a section of video, results in an unplayable movie. So far I haven't found out how to add audio to a video so that at least I could overlay music or commentary. I will post updates if I discover how to do that.
The poor documentation for Avidemux has been somewhat offset by an independent group of manual writers stepping up to the plate and filling the gap with their FLOSS Manual on Avidemux. This group is filling a great need, and any time you get stuck for help it is worth checking their index of available manuals. More are being added all the time.
So using Avidemux I was at least able to take the .mov clips I had and edit them together with fade transitions in between. The resulting output was an .mpeg and was not much larger than the input videos, so I was easily able to upload the result to You Tube.
I haven't added any new entries to this diary in a few months, for the sole reason that Ubuntu 9.04 Jaunty Jackalope has been so flawless to work with that there hasn't been anything to report. We have both had no problems at all with this version, it just works perfectly for us.
I did recently find something new from Ubuntu that I hadn't seen before that is worth reporting on, however - a mid-stream update to ClamAV!
In the past when we got applications with Ubuntu, either on the CD or as a download from the repositories, they stayed at one version for the life of the Ubuntu release. The only exception to this has been Firefox, which gets regular updates. The rest of the applications, GIMP, OpenOffice, gEdit, etc., always stayed the same. If you wanted the newest version you had two choices: you could either wait six months for the newest version of Ubuntu to come out with the latest releases, or else figure out how to manually go and get each package and then install it from the command line. The latter method then results in no more updates to that application; you are now on "manual".
Then, last week, something new happened. One of the regular Ubuntu updates included a new version of ClamAV, the free software anti-virus scanner. We had been running 0.95.1 and regularly got warnings that it was out of date. There are instructions to update it from the source code, but doing that is beyond my level of expertise as the instructions are not written in an easy to understand manner for beginners. The regular Ubuntu update delivered us the latest version, ClamAV 0.95.2.
I have no idea if this was a one-time event or if we will get future updates to Clam or other applications without upgrading to the next Ubuntu version. Clam is probably more critical than most applications because it is used on Ubuntu Server as an e-mail gateway scanner, a critical role. We use it on the desktop, which is probably a rarer use than on a server.
I think it is fair to say that the mid-stream updating of ClamAV shows that Ubuntu is gaining a certain level of maturity as a distribution, taking care of the small details that make users' and administrators' jobs easier. As a by-product this saves us a bunch of work, as we may decide to keep Jaunty a bit longer rather than upgrade to Karmic right away when it comes out in about two months' time, on 29 October 2009.
The bad news was that Ruth's Sony Discman finally died. She started looking around to find a new portable CD player and discovered that these have almost disappeared these days, replaced by the now-ubiquitous MP3 player instead.
I have been curious about MP3 players for a while, in particular how well they work with Ubuntu.
Almost everyone says that the Apple iPod is a poor value - they don't last any longer than the other brands and, being an Apple, are pricey, of course. In searching around I found a recommendation for the Sansa Clip from Ubuntu blogger A.Y. Siu. He had a run-in with another MP3 player, the Cowon iAudio 7, and very strongly recommended avoiding that brand. Instead he went back to his Sansa Clip player. Because A.Y. uses Ubuntu I knew that the hardware would work on our PCs, so that was a start!
We found the diminutive Clip on sale for Cdn$49.99 at Future Shop, about $10 cheaper than the similar iPod Shuffle.
The Clip is actually a pretty impressive piece of gear - very small and light, with a 15-hour battery that is recharged from a USB connection. This model has 2 GB of memory, which is quite a lot of music! It comes with the USB cable, earphones and a mini-CD with the user manual in dozens of languages. It even comes with some pre-installed songs to try out, which at least proves it works right out of the box. It will not only play audio files, but comes with a built-in FM radio receiver too, and a microphone so you can record your own files.
The listed Minimum System Requirements on the Clip's packaging are a bit ominous for Linux users:
So we plugged it into Ubuntu and it was immediately recognized! We were given the option of how to access it, including Rhythmbox. We set it up for the Nautilus file manager as default so we can see what files are on it and easily load new songs.
It seems that the minimum requirements are a bit misleading. You don't need Windows XP or Windows Media Player at all; Linux works fine. You don't need an Intel Pentium class PC either, as it works just fine on AMD PCs too. The CD drive is so you can read the PDF user manual, although the user cards that come with it will get you started and there isn't much need for the manual. You do need a USB 2.0 port, so that much is correct!
The player will handle a number of formats, including MP3 and the open format Ogg Vorbis .ogg as well.
Loading files is easy - they can simply be dragged-and-dropped or cut-and-pasted from Ubuntu's Nautilus file manager right into the window for the Clip. Songs can be dropped in individually or in folders. Playing it is just as easy, as it uses the fairly standard five-button circular style of control switch and is pretty intuitive to use.
In summary the Sansa Clip is a nice, neat package, easy to figure out and use, very portable and relatively inexpensive. Best of all it works really nicely with Linux, with no complaints. That makes it hard to improve on!
K3B is the standard CD burning utility that comes with the KDE desktop. Apparently the odd name means KDE Burn Baby Burn.
I have been experimenting with K3B ever since Ruth got her new PC and the ATAPI DVD A DH16A6L-C combination CD/DVD writer it came with didn't seem to work well with Brasero or the no-name brand CD-Rs we were using. With K3B and some brand-name Philips CD-Rs it seems to work fine now.
I decided to try out K3B on my old PC and I like the way it is laid out. The interface is very easy to understand and use, even if, being a KDE application, it does not match the Gnome desktop for colour schemes and such. Initially I tried it out and got some mixed results. It seemed to burn okay, but the Ubuntu ISO CD-Rs I made could not be read and the PC refused to boot from them. When I tested them on Ruth's PC they booted fine. Very odd. I tried making ISO CDs with Brasero and got the same result. In troubleshooting the problem it seemed that the MITSUMI CR-48X9TE CD-R/CD-RW writer I had installed was quitting on me; while it still wrote, it was not reading properly, especially on boot. I swapped it for a spare CD writer from Ruth's old PC and now it works fine.
I have to admit that Brasero has not been working well on either PC. I have been able to make CDs with it, but it still acts oddly at times. For instance, it has two interfaces: the normal one that comes up when you open it from the main applications menu, and the alternate one that opens when you click on mounted media, like a blank CD-R that has been inserted. Both interfaces seem to lead to the same dialogue boxes, but they don't produce the same results. For instance, I was making a CD-RW document back-up disk. Using the normal interface it wouldn't burn and only produced a raft of errors. Using the interface from the inserted media it worked just fine. It just seems far too buggy at this stage to be entirely reliable.
I have had much better luck with K3B. It has only one interface and produces more consistent results. The only error I have come across seems to have been related to a bad CD, although that is hard to confirm.
Along with a very straightforward and simple interface, K3B has some nice features, like a built-in MD5 sum checker. All you have to do is start the process to burn an ISO image and it automatically creates an MD5 sum from the image which you can compare to the correct MD5 from the download website. This is nice for users who don't want to have to separately run the MD5 sum from the Linux command line.
Personally I don't need a lot of features with a CD burner; as long as it consistently makes good CDs I am happy with it. Like Brasero, K3B does have the ability to create and save projects. Although I don't use this feature, I imagine it could be useful to people compiling music CDs from multiple sources.
So far I like what I see in K3B, but I will continue using it and see how it fares in the long run on many different types of CD tasks. In the meantime it seems to be working far better than Brasero ever did.
This morning Ruth's old and mostly reliable Dell OptiPlex GX260 refused to boot up; it wouldn't even get past the BIOS. Some careful troubleshooting showed that it wasn't the RAM or the video, and it wasn't the hard drive either. The process of elimination meant it was the motherboard. This is sadly a common ailment with these GX260s; it is reported that eventually something blows on the motherboard and that, well, is that.
It really isn't practical to replace the motherboard on a seven year old PC, so we looked at the options. We could have bought a pretty good used PC from factorydirect.ca or another discount outlet. That would have meant we could have used some of the cannibalised parts from the GX260, like the DDR RAM and such.
We decided, instead, to take a look over at Future Shop and found a new Compaq-Presario for not much more than a used PC with much lower specs. The Compaq-Presario is pretty much low-spec for a Vista-designed machine, with 3 GB of DDR2 RAM, a dual-core Intel 2.40 GHz CPU and a 250 GB hard drive, but we figured it would make a great Linux box. The best part is that it has an Intel Integrated Graphics Controller, which means no worries about drivers for Nvidia or ATI cards; the video just works. The new PC's one-year warranty helps too, as most used boxes come with only 30 days.
So we picked one up, set it up and Ruth opted to install Ubuntu 9.04 instead of the Xubuntu 9.04 she has been using. She figured with those hardware specs Xubuntu wouldn't add any speed and Ubuntu is more closely configured out of the box to what she wanted. We both agree that Xubuntu is a useful distribution for a PC with 256 MB of RAM, but if you have at least 384 MB of RAM then Ubuntu gives you more of everything.
So far everything seems to work well on the Compaq with Ubuntu, except the network printer and the combination CD/DVD burner. We still have to do some poking around to get both of those working right. Ruth is impressed that the PC is lightning fast working on the most complex and memory-hogging tasks. She does a lot of graphics work and this PC running Ubuntu is ideal.
Getting the network printer working turned out to be surprisingly easy. At first the new PC didn't locate the printer, but a reboot of both PCs fixed that and the Compaq then located it on the network without trouble.
This is interesting. When I initially tested the ATAPI DVD A DH16A6L-C combination DVD/CD writer on Ruth's new PC it would record CD-RWs, but not CD-Rs. I searched the Ubuntu Forums and found a thread describing a similar problem. I posted my problem there but didn't get a response, except from the original poster. He suggested trying a different CD burning application, since Brasero might be the problem, and recommended K3B instead.
I installed K3B and it is a good application, easy to use and works well. I did some testing on CD-RWs and it recorded them fine on Ruth's new PC with the ATAPI DVD A DH16A6L-C drive.
I tried writing to a CD-R and the ATAPI DVD A DH16A6L-C writer didn't recognize that the CD was in the drive, and neither did Ubuntu's Nautilus file manager. I tried the CD-R in my PC with its older MITSUMI CR-48X9TE CD-R/CD-RW writer and it failed to recognize it as well. It seems it was a bad CD. This was a generic-brand CD-R, so I tried making a CD-R of an Ubuntu ISO using a brand-name Philips CD-R in the ATAPI DVD A DH16A6L-C writer with K3B; it wrote just fine and the CD tested correctly using the built-in CD ISO test.
Next I tried making a CD of an Ubuntu ISO using Brasero and the ATAPI DVD A DH16A6L-C writer using a Philips CD-R and it worked fine, too.
I can only conclude that the problem encountered was not Brasero but the no-name brand CD-Rs. Strangely enough these worked fine on my PC with the older MITSUMI CR-48X9TE CD-R/CD-RW writer; perhaps the newer ATAPI DVD A DH16A6L-C hardware, being a combination DVD and CD writer, is more sensitive to these no-name brand CDs. It is also possible that the last six CDs in the generic box were bad ones.
The latest version Ubuntu 9.04 Jaunty Jackalope comes with all new applications, as do most new Ubuntu versions. Here is a quick look at a few of the ones we have already tried out.
Jaunty comes with the very latest version of GIMP, the ever popular drawing and image editing application. This new version is an evolutionary refinement over the past iterations of GIMP, no amazing new features, but a solid set of incremental improvements.
The GIMP team have changed the user interface since the 2.4 series. It is still too early for me to tell whether this new way of organizing the image window and the toolboxes is better or not, but it wasn't too difficult to figure out while I was processing some photographs.
In the repositories is the latest version of the Clam AV virus scanner. From the command line it runs nice and quickly, and the new scan engine supposedly has many new features that aren't evident to the user but make it more effective under the covers.
The companion to Clam AV is the GUI version, Clam Tk 4.08. This is not the most up-to-date version, which is 4.11. The GUI has changed quite a lot from previous versions and now offers the option at installation of caching virus signatures locally, so that everything can be done from the GUI - no need to update signatures from the command line.
Clam seems to be getting better all the time on an incremental basis and is well worth having in your arsenal.
Here is another well-known application suite that is finally available on Ubuntu in its newest generation. OOo 3.0 Writer offers some nice refinements, including variable document zoom right from the document panel - no hunting around for it - which is especially useful when doing desktop publishing work.
The Gnome web browser is getting better quickly and this time there are some notable improvements. Among the most sought-after on my list is that it no longer randomly crashes, closing the whole application when you were intending to just close a single tab. Epiphany also includes new menus for selecting plug-ins and other refinements.
When using Epiphany I got the impression that it was loading pages more quickly than Firefox 3 does, so I did some testing on web pages large and small. Epiphany was always quicker than Firefox, sometimes by a large margin. One Wikipedia page loaded at almost twice the speed, 4 seconds with Epiphany versus 7 for Firefox, which is remarkable. None of this is a surprise. Firefox is a compromise to make it cross-platform compatible, whereas Epiphany only runs on Unix-like systems. It doesn't have as many features as Firefox, but it works very well and, best of all, doesn't leave you waiting for page loads.
With its new stability I am now using Epiphany as my main browser!
One thing I have found very useful when doing an installation of Ubuntu or Xubuntu is having an accurate checklist to make sure that I don't miss anything. Maybe it is just my background in aviation that makes me feel comfortable using checklists?
On 23 April 2009 when I installed Ubuntu 9.04 in place of 8.04 I started with a pretty good checklist and then corrected and refined it as I went along. The result is that it is pretty accurate now.
Ruth originally did the Xubuntu 8.10→9.04 upgrade on her PC, but that didn't work out very well. We have only done an upgrade once before, when we went from Ubuntu 7.04 Feisty Fawn to Ubuntu 7.10 Gutsy Gibbon, and it didn't work out all that well either. In both cases there were problems with the Adobe Flash player not working right, and in both cases it seems that multiple different versions were installed. Rather than spending a lot of time solving it, Ruth elected to do a clean installation; using our checklist she had the box reformatted, on the network, and all her applications and documents installed in about an hour and a half, which is very fast!
As a result of doing complete Ubuntu and Xubuntu installations we now have verified checklists for both.
Yesterday was the release date for Ubuntu and Xubuntu 9.04 Jaunty Jackalope. My plans for the day were to download the ISO file for Xubuntu, if I could get it and then perhaps download the Ubuntu ISO overnight, once the immediate demand had subsided, or at once Europe had all gone to bed.
The files became available mid-morning EDT. Just for fun I tried a couple of North American mirrors to see what the download rates were. MIT and the University of California were the same - zero - totally locked up. That wasn't much of a surprise. With over ten million Ubuntu users there are going to be a lot of downloads going on, especially on the first day.
So, just for fun I tried a mirror in Zimbabwe and sure enough I got a very good download rate and had the ISO in about 30 minutes. The MD5 Sum checked out fine, it was a good copy! Not long after that I was able to get a good copy of Xubuntu as well, all on release day! That is pretty good!
Next I made CDs of both and ran live sessions of each to check out the hardware. On both the scanner and camera worked, but the printer and my nVidia graphics card didn't. That is actually quite normal as those usually require commercial drivers that can't be downloaded with the operating system running from the live CD. The fact that the camera was working was great as it never worked at all on Ubuntu Intrepid, although it worked fine on Xubuntu Intrepid. My SD card reader worked fine on both, too.
So far so good. I couldn't think of any reason to wait to do the installation. It was early afternoon, so lots of time left in the day as these things usually take 5-6 hours to complete and I had a good ISO! So I got out my checklists, backed up my documents one last time and went ahead and ran the installation.
The installation was done quickly. Ubuntu now includes a new set of screens to guide you through the installation process and they are much more friendly than the older system. I chose "use entire disk" to reformat the drive and replace Ubuntu 8.04 with 9.04.
Once the basic installation was complete, my next task was to start downloading the applications I needed, starting with OpenSSH and sshfs to set up the PC on our network. It was a 'no-go'. The repositories were not moving at all and Synaptic couldn't even read them, let alone download anything. Not a big problem, I knew that over time the congestion would ease and downloads would be possible.
One bother was that my video card didn't have a driver and, unlike with Ubuntu 8.04, the installation didn't identify the nVidia GeForce4 MX 4000 and offer to get the driver automatically. I tried System→ Administration→ Hardware Drivers but it offered nothing either. So it was off to the Ubuntu Forums where I quickly found the answer. The answer wasn't good. As LibertyShadow explained "be aware that nvidia does not support Xorg 1.5 on its legacy drivers". However there was good news. He suggested installing Envy, which is an application that fetches old drivers and installs them. It took:
sudo apt-get install envyng-gtk
That installed Envy core 2.0.1 and Envy gtk 1.1.1. Great news: it worked and my old nVidia GeForce4 MX 4000 once again works!
Next was the HP printer. Using a conventional System→ Administration→ Printing→ New didn't work. This one I had run across before and so I went to a terminal and ran hp-setup on its own:
sudo hp-setup
but that didn't work either. It suggested trying "interactive mode", so I typed in:
sudo hp-setup -i
And that set up the printer perfectly! The printer tested and networked fine as well. That completed the hardware, it was all working and that left me impressed.
While waiting for downloads to be possible, I checked the applications that come with Ubuntu 9.04:
Once the repositories started to clear a bit I was able to download the applications I wanted. From System→ Administration→ Synaptic Package Manager, I downloaded:
Then from Applications→ Add & Remove Applications, I downloaded:
I intended to download FileZilla, the FTP application we have been using to upload files to and manage our websites for the last few years, but it is no longer available as a Linux i386 package, so after some research I downloaded gFTP 2.0.18 instead. The interface is a little different from FileZilla's, but it works okay, or else you wouldn't be reading this!
With the installation of all my documents the job was pretty much finished. I have had a chance to look at some of the applications. OpenOffice.org 3.0 has some nice features, like sliding zoom. GIMP 2.6.6 has changed the interface from my previous version, 2.4.5, which will take some getting used to. Stellarium 0.10.2 is simply gorgeous.
So now I just have to work with Ubuntu 9.04 and see what else is new. Otherwise this looks like it is a great release.
GIMP stands for GNU Image Manipulation Program and it was originally designed to provide the same functions as the proprietary Adobe Photoshop does. Users can call up saved images and add layers, so that they can add more features and colours to the image, and even clone areas of it. Like that tall pine tree and wish there were more? Use the clone tool and make extra copies. Just like with Photoshop and just as simple - the only difference being that GIMP is free! Files can be saved in a multitude of formats, including JPG and PNG. Users can even make animated GIFs using GIMP. It is that simple.
However, if you're like me, you tend not to have too many saved images to "manipulate". How many moustaches can you draw on pictures of the Queen before that gets terribly dull? And no matter how talented an artist you may be, you will just not be able to give Rex Murphy a full head of hair and make it look convincing. Nor will adding lightning bolts to your mother-in-law's eyes make her look any more formidable than she may be. So, what's a scribbler like me to do with GIMP anyway?
Earlier versions of GIMP were not really designed for creating graphics - in other words, anything much beyond drawing a couple of simple shapes and the odd straight line. However, newer versions offer some amazing features for the would-be artist that make using GIMP as natural as using a glass to hold water.
Admittedly, the brushes that come with the stock GIMP are the standard ones: circles (fuzzy and hard-edged), calligraphic slanted lines of varying length and a few curiosities like a green pepper (there's gotta be a story there) and a section of vine. Like all graphics programs, starting up GIMP can leave the new user somewhat overwhelmed. After all, isn't drawing a matter of taking pen to paper? The concepts of layers (transparent or otherwise), stroked paths and brush opacities, though known to artists, just aren't translated in quite the same way in GIMP. Thankfully, there is an online manual that is very good and won't bedazzle users with unfamiliar terms or cumbersome keyboard acrobatics, although one of the best tricks I learned is to hold down the Shift key while drawing a line. Doing so will create a perfectly straight line.
I did write an earlier review of GIMP but I thought I would focus this article on just some of the ways users can use GIMP to not merely manipulate images but to create them. The key is to use different brushes. Noupe.com is but one of many websites where users can find large collections of free brushes. I downloaded three sets of brushes: water, trees and moon. Of course, like other users, I took advantage of the other features of GIMP, such as adding lens or gradient flares or basic lighting effects all of which are very intuitive to use. For me, though, the best thing about using all these brushes is that they are free to download and will give artists hours upon hours of inspiration to create all kinds of masterpieces or at least some fun pictures.
I have read that the brushes users can get for Photoshop will work with GIMP versions 2.4 and higher. As the current version is 2.6.6, there should be no compatibility problems.
Brushes are downloaded as .gbr files. To install them in GIMP running on Ubuntu, all you have to do is copy them to the hidden folder in your home directory: Ctrl+h to show the hidden folders, then .gimp-2.6→ brushes and they will appear in the application ready to use.
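The same install is just a copy into that hidden folder from a terminal. The download location below is an assumption, and a stand-in .gbr file is created so the commands can be seen working end-to-end; point the source at wherever you actually unpacked the brushes.

```shell
# Sketch: install downloaded .gbr brushes into GIMP 2.6's per-user folder.
BRUSH_SRC=~/Downloads/brushes                         # hypothetical unpack location
mkdir -p "$BRUSH_SRC" && touch "$BRUSH_SRC/demo.gbr"  # stand-in brush file
mkdir -p ~/.gimp-2.6/brushes                          # create the folder if missing
cp "$BRUSH_SRC"/*.gbr ~/.gimp-2.6/brushes/
# Restart GIMP, or press the refresh button in the Brushes dialog, to load them.
```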
Ever since we first got two PCs, one running Ubuntu and the other running Windows XP, I had planned to network them. The two different operating systems made figuring out how to do this quite difficult, and the fact that initially both were on dial-up made it even more difficult, as they weren't connected together. In the end, after a few attempts, we resorted to using USB devices for synchronizing files from one PC to the other, what some people term a "sneaker network" (i.e. powered by running shoes).
Then we got both PCs on high-speed sharing a gateway (router/modem). It seemed that networking the PCs would be easier, given that they were wired together, but we ran into the Windows vs Linux issue again and couldn't figure it out.
Finally, with both PCs running Linux, it seemed the time to figure out how to network them at last. It sounds simple, but finding out how to do it wasn't. I spent many weeks reading the gateway's manual, searching on the internet and trawling the Ubuntu Forums, all to find virtually no information at all. The closest I came was some networking tutorials, but they were so generic as to be useless - "first configure your PCs for networking and then your router and, presto, you have a network". I knew that I had to configure the PCs and probably the gateway, but that was all I could discover.
Finally the good folks at National Capital FreeNet who, after all, sold us the gateway in the first place, connected me to some expert help on networking Linux PCs from the Ottawa Canada Linux Users Group and we got it figured out.
To have two PCs share a printer and be able to synchronize files.
Note: this is a generic procedure to network two Ubuntu PCs. For Xubuntu or other systems locations may be slightly different. No changes to the gateway were required. There are several other ways to network Linux PCs, as well, using SSHFS is just the simplest way.
That completes the installation of the networking tools. Now to activate the network from one PC to the other:
Applications→ Accessories→ Terminal, enter: $ sshfs otherusername@otherpc:/home/otherusername ~/newfolder
You will be able to access the other PC's home directory in the "newfolder" in your home directory. To disconnect your PC from the network enter:
$ fusermount -u ~/newfolder
Sharing the printer is completely separate from the SSH file sharing network. To share the printer:
There are a couple of pitfalls to keep in mind with this type of ssh network:
I did manage to crash the system by clicking through to the other user's home folder and then clicking on their icon back to my folder, and so on. Don't do this! There are probably other ways of crashing the network, which I am sure I will discover in time. Otherwise, so far it seems to work quite well and we are saving lots of wear and tear on the USB devices and our sneakers!
The previous issue we had, where files had to be deleted on the remote PC before copying new versions of the same file over or else an error and a hidden duplicate file resulted, seems to have been fixed in SSHFS 2.1, the latest version, which we got with Jaunty. Now you can just copy files on top of existing remote files and it works just like with local files and replaces them - a great improvement!
Through carefully reading the SSHFS home page I discovered that the default target is the user's home directory. This shortens the command needed to sign in. For example, instead of having to type:
$ sshfs otherusername@otherpc:/home/otherusername ~/newfolder
all that is required is:
$ sshfs otherusername@otherpc: ~/newfolder
There is no need to specify /home/otherusername as it is the default.
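To save typing even the short form, the mount and unmount commands can be wrapped in a pair of shell functions, dropped in ~/.bashrc for example. This is just a convenience sketch of my own; "otherusername", "otherpc" and "newfolder" are the same placeholders used above, and the function names are made up.

```shell
# Convenience wrappers for the sshfs mount/unmount pair described above.
# Substitute your own user, host and folder names for the placeholders.
netmount() {
    mkdir -p ~/newfolder                      # the mount point must exist
    sshfs otherusername@otherpc: ~/newfolder  # home directory is the default target
}
netunmount() {
    fusermount -u ~/newfolder                 # disconnect from the network
}
```

After sourcing this, `netmount` connects and `netunmount` disconnects, with no paths to remember.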
I recently had an interesting time with a "USB mass storage device", commonly known as a "thumb drive", "jump drive" or "flash drive". We use a number of these to transfer data between our two PCs and also to back up hard drives. This week one USB drive started acting erratically - data disappeared, the drive unmounted itself, it didn't write data, that sort of thing.
My assessment was that the USB drive could be suffering from either a software or a hardware problem. It is still under warranty for hardware, but before returning it I had to rule out a software problem. These drives use the FAT32 file system and it is always possible that the drive has some corrupted data on it or even a corrupted file system. The normal cure is to reformat the drive and see if it works, if it does then it was a software problem. If not, then it is a hardware problem.
There are very few things that are easier to do on Windows than on Linux, but reformatting and even renaming USB devices are two of them. In Windows XP you would just find the device in "My Computer", right-click on it and select "format"; it would give a choice of "full format" or "quick format", one warning that you are going to lose your data, and then it is done. Renaming a drive is just as easy: right-click, "rename". Of course, in reformatting on Windows you only have one choice of format - FAT32.
On Linux this is a bit more complex, but not difficult once you learn how, of course.
I started with a search through the Ubuntu Forums and found enough information on reformatting USB drives to know that I would need to download Gparted, the Gnome partition editor application. I found this in "Add/Remove Applications" and downloaded it fine. I then used it to try to reformat the device as FAT32 and that didn't work, just returning errors. The layout of Gparted makes it a little hard to understand how it works, but I tried reformatting the USB device as the native Linux file format, ext3, and that worked fine, at least once I had figured out that the device has to be "unmounted" to be worked on.
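For anyone who prefers a terminal over Gparted, the same ext3 reformat can be done with mkfs.ext3 from the e2fsprogs package. The sketch below formats a small image file rather than a real device, so nothing is at risk; on the actual stick you would unmount it first and use its /dev node (for example /dev/sdb1, which is an assumption here) in place of the file name.

```shell
# Sketch: reformat as ext3 from the command line, demonstrated on a 16 MB
# image file instead of a real USB device so the commands are safe to try.
dd if=/dev/zero of=stick.img bs=1M count=16 status=none  # stand-in "device"
# umount /dev/sdb1            # a real device must be unmounted first
mkfs.ext3 -q -F -L USBSTICK stick.img   # -F: allow a plain file, -L: volume label
```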
So far, so good, except that I couldn't write any files to the USB device. A quick check showed that the device was listed as being owned by "root". Hmm. I tried logging in as root from Alt+F2 and that worked fine, I could store files on the device and it worked normally. That at least proved that the original problem had been a corrupted file system and not a hardware problem.
The new problem was changing the ownership of the device so that users could write to it, without having to log in as root. The Gparted documentation is quite extensive, but offered no help.
So I turned to our trusty source of tech support, the Ubuntu Forums again. I started by searching the forums, but finding nothing posted my own question. After a couple of exchanges I got a complete answer, which was to run the command (shown here in generic form):
$ sudo chown -R username:username /media/devicename
This changes the ownership of the device (chown), including all files on it (-R, which means "recursively"). The USB device now works just fine, just as before. Of course, having the device formatted as ext3 has some advantages, as it is unreadable on non-Linux computers, which gives some security in case it gets lost. I even used Gparted to rename the device which, at least in ext3, turns out to be as easy to do as on Windows. The version of Gparted we have will not change a device name on FAT32.
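The rename Gparted did can also be done from a terminal with e2label (ext2/ext3 filesystems only, part of e2fsprogs), which may help if your Gparted version can't do it. Again this is sketched on an image file so nothing real is touched; on the actual stick the /dev node goes where the file name is, and the label "PHOTOS" is just an example.

```shell
# Sketch: rename an ext3 volume with e2label, shown on an image file.
dd if=/dev/zero of=demo.img bs=1M count=16 status=none
mkfs.ext3 -q -F demo.img      # a fresh ext3 filesystem, no label yet
e2label demo.img PHOTOS       # set the new volume name
e2label demo.img              # prints the current label back
```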
So lessons learned:
There are a couple of reasonably useful guides out there for users installing Ubuntu for the first time. In particular:
These are good for the beginner as they have screen shots, although I think both have their drawbacks for the brand-new beginner.
When we do a fresh installation from CD I usually make up a checklist, just to make things go more smoothly. This past week when Ruth installed Xubuntu 8.10 on her PC she did it from a checklist I had made up for her in advance. We actually validated and improved the checklist through the installation process and we now have a pretty good version of it for both Ubuntu and Xubuntu. I thought it might be handy for other people interested in installing either one of those operating systems to see our checklists and perhaps use them for the basis of their own.
You should keep in mind a few things in reading these:
Back shortly after I installed Hardy Heron on my PC in place of Windows XP, I tried out the Brasero CD burning application that was new with that Ubuntu release. My early experiences with the application were bad enough that I gave up on using it after wrecking several CD-RWs and just used the standard Gnome CD/DVD Creator instead. That application is very bare-bones, but it gets the job done.
When Ruth installed Xubuntu 8.10 Intrepid Ibex on her computer it only came with one CD burning utility, Brasero 0.8.2. When she wasn't using her computer I spent some time checking out the various features of Xubuntu, including Brasero. One of my reasons for doing so was to see if it had improved over the previous Brasero 0.7.1 that is still current on Hardy Heron. The second was that Brasero has now become an official part of the Gnome desktop, which means that the Gnome CD/DVD Creator has disappeared in later versions of Gnome. Basically that means that it had better work!
The first opportunity came when someone asked me for an Ubuntu ISO CD. I decided to give it a try in Brasero 0.8.2, using its "burn image" selection on a CD-R. The new interface is very straightforward to use, better than most of the CD burners I had used on Windows in the past. That made finding the file and adding it to the CD very quick and easy. In the Gnome CD/DVD Creator, when I make ISOs I usually burn them at 4X to make sure that the risk of errors is reduced. Brasero doesn't allow the user to select the burn speed; it works it out by itself. The reported speed varied slightly, but ran about 12X. After it was done I did a full test on the CD and it tested as error-free, booted and ran fine. That was good.
The next test I ran was creating some copies of our Best of Free Software CD for Windows on CD-Rs. Again the simple interface made setting up the CD very quick as a "Data Project". Again Brasero offered no choice of burn speeds and did the task at about 24X. I created one CD, tested it and it worked fine. Next I used Brasero's multiple CD feature to feed it four more CDs in sequence and it made them all flawlessly. Testing showed they were all good.
The last test I did was writing data to a CD-RW. This disc had data on it that was to be replaced with new data. Again I used Brasero's "Data Project" selection and it asked about deleting the existing files and burned the new ones just fine. It is supposed to support multi-session burning, in other words adding more files to a CD-RW without erasing the existing ones, but I wasn't able to get it to do that. Perhaps this would be possible with a new CD-RW?
One thing that has definitely improved between Brasero 0.7.1 and 0.8.2 is that the application now includes actual help files that have useful information in them! Brasero 0.7.1 had no help available. Brasero also includes a CD cover editor that allows you to make and print paper CD jewel box covers, which is a nice feature.
There have still been some reported bugs in Brasero 0.8.2, including one where making DVDs of movies removes the original file from the PC. This bug will be fixed in Brasero 0.8.3. Bug listing and Launchpad bug report.
So overall Brasero seems to be capable of handling most CD burning tasks that the average user needs to tackle. I will do some more tests as the opportunity arises and report on future findings.
After our recent Live-CD tests with Xubuntu Ruth decided to install the Xfce desktop on her PC. She was motivated by the simplicity of Xubuntu, as it lacks a lot of the RAM-eating fancy frills that Ubuntu has.
Installing an alternate Xubuntu desktop is easy to do if you already have Ubuntu, you just select the xubuntu-desktop package on Synaptic. This installs Xubuntu in addition to Ubuntu and is about a 60 MB download. The result is that you can select a Gnome or Xfce desktop session each time you log in. You also get the unique-to-Xubuntu applications as well, but not the common applications already present in Ubuntu.
Ruth played around with both desktops for the past month or so as an experiment. She discovered how to do things in Xfce and decided that she quite liked it, especially the light and fast Thunar file manager in place of Nautilus. She tried out some of the additional applications that come with the package, such as the Orage calendar and the AbiWord word processor. Overall she wasn't so impressed with those compared to the standard Ubuntu applications, but this did lead to a new plan.
We are going to evaluate upgrading to Jaunty Jackalope in about a month's time when it is released. As a trial Ruth decided to spend the intervening month trying out Xubuntu 8.10 Intrepid Ibex and so yesterday we did a clean installation of that together. It took about five hours to do, most of which was downloading the extra packages she wanted. The initial installation was actually very fast, taking under an hour. Essentially she created a custom installation with the best of the Ubuntu applications she uses and removed the Xubuntu applications she doesn't like or doesn't need.
Using Add/Remove Applications she installed:
Using Synaptic, she uninstalled:
This gave her basically an Ubuntu-configured PC using the Xubuntu file manager and utilities. This may not work on a small laptop or RAM-limited desktop, but with 512 MB of RAM it works fine. After customizing it with her own desktop and screensaver and tweaking the taskbars and other elements she is now pretty happy with how it is working.
She will now work with this installation of Intrepid Ibex Xubuntu and see how it goes for the next month. Once Jaunty is available she will then have a few options:
I am sure there will be more news as her testing of Xubuntu proceeds.
Type: closed source, commercial
Use: on-line income tax application
Made by: Dr Tax Software
Doing your income taxes when all the computers in your house run only Linux can be an interesting challenge.
Up until 2003 we did our taxes on paper returns. It was slow (6-8 hours work) and required a lot of pencils and a calculator. On the plus side you knew where every dollar went or came from and all the mistakes were your own. Another plus for doing it on paper is that, other than a stamp, it doesn't cost you anything. I have a fundamental objection to having to pay money to pay my taxes.
Starting in 2004 we switched to doing our taxes on Windows using Intuit's QuickTax. To say QuickTax sucked would be an understatement. The software is difficult to use, hard to follow what it is doing, expensive and slow to input into and complete. Normally you buy it on CD-ROM and then install it on your Windows PC. It did cut our tax preparation time down from the paper version, but only by an hour or so. It was still an all-day task for the most part, because it was so slow and difficult to use. The worst part about QuickTax was the tech support. The software is bad enough that you need the tech support and it is "off-shore". It will probably suffice to say that the person on the other end of the phone was hopeless. The 2007 tax year was the last straw. Intuit had mailed us a CD, all we had to do was get an activation code by phone. This could not be done and was totally screwed up. I swore I would never use QuickTax again after that nonsense.
Then in 2008 we did away with our last Windows PC and switched to Linux. This made the Windows-only QuickTax a non-starter, no matter how much gluttons for punishment we were. We did look at the alternatives. There are a couple of freeware Canadian tax packages available, but they are both for Windows only. That left us with basically two options - paper or Ufile.ca.
We decided to give Ufile.ca a try and were very pleasantly surprised by how well it worked. It is a web-based application, meaning you enter your information into their database via your browser and then when you are done you download your ".tax" file for netfiling, plus download a PDF copy for your own records. Right on the homepage is a penguin to give us Linux users a warm, fuzzy feeling, alongside the Apple and Windows logos. Because it is web-based it works on any operating system.
I read all the tour, background and FAQ info on their website and then we plunged into learning how their system works. They claim that their system is very secure and there is no worry about compromising data:
The information entered into UFile is stored securely on servers owned and maintained by UFile. We use the highest encryption protocols available (128-bit) and we store the data in a protected data bank. Secure backups of the encrypted data are also stored off-site in guarded data centres. All data transmissions to the CRA and Revenu Québec are subject to 128-bit encryption as well.
At the UFile site, security is guaranteed and you are assured that all your tax information is stored securely in our data bank. When you log onto our site, you pass through a firewall, a tool that blocks access to anyone who does not have the right to enter, including anyone attempting to enter with a disallowed protocol or without an identifiable IP address. From the time that you log on at our Web server, using your user ID and password, all activity on our Web site takes place in secure mode. The information you enter on the Web server does not stay there. It is sent to our applications server behind a second firewall as soon as you submit it. The applications server will send back messages and diagnostics to the Web server. These are intended for you. You are expected to read the messages, make corrections and clarifications to the data if necessary, and continue until your tax return is completed. At no time is any part of your file accessible in a clear format on our Web server.
I wasn't able to find any complaints of breaches of security so far, so that is reassuring to some extent.
Actually entering the data and completing the tax forms was surprisingly quick - we had it done in under two hours, which is a record for us. The process was about as painless as I think it is reasonable to expect for filing income taxes and a huge improvement over QuickTax.
Payment is an interesting subject. Anyone can create an account at Ufile.ca and enter all their data, save it and come back to it anytime, all at no charge. It is only when you are happy with the results that you pay by credit card and then can download your ".tax" file and your PDF copy. The price was actually not too bad, at Cdn$15.95 for one person or $24.95 per family, oh yes, plus $1.25 tax, of course, for a total of $26.20.
I waited until we had both received our Notices of Assessment back from Revenue Canada before reporting on it, as I wanted to make sure that Ufile.ca had done a good job and its results hadn't needed to be fixed. The assessments are back and it was completely accurate, which is the good news.
So as far as doing taxes on Linux is concerned, Ufile.ca gets our thumbs up. I still find paying money to file your taxes objectionable, but Revenue Canada's answer to that seems to be, "well then do them on paper".
This great little book was recommended to me by National Capital FreeNet board member Graeme Beckett. Until he pointed it out I hadn't heard of it. It is available as a paper book from many booksellers, but is also available as a free PDF download (2.2 MB). Naturally I downloaded the electronic version!
The book is a small-format publication of 170 pages. It is intended to be a real beginner's guide and therefore assumes you know virtually nothing about Ubuntu or Linux, although it often compares Linux to Windows, assuming that some users will have a passing familiarity with the way Microsoft organizes its operating system.
The book is dated 2009 and covers Ubuntu 8.04 Hardy Heron LTS and 8.10 Intrepid Ibex. This clear focus on the current two Ubuntu versions makes the book more concise and to the point.
The chapters of the guide are:
This guide starts you right at the beginning, with coverage of how to install Ubuntu as a sole installation, dual boot, through WUBI or in a virtual machine. It assumes you are a beginner, which most readers of this book will be. From there it moves quickly and concisely on to customizing and configuring, using the desktop and navigating the file system, and right into how to use the command line. You can tell that the author enjoys the command line; he gives the reader some real-world exercises to build some comfort in using it. From there it goes over using Synaptic to install, remove or repair applications and how Ubuntu permissions work. The last chapter covers configuring the netfilter kernel firewall using Firestarter, using the ClamTk and ClamAV virus scanners and some Firefox security suggestions. There are GUI screenshots throughout the book to show how to do everything.
The book also includes a fairly extensive glossary and some links to more help, as well as a complete index, although this will be of more use to readers of the paper version of the book, since the PDF version can be easily searched. It finishes with some generous plugs for the author's other two books, Ubuntu Kung-Fu and Beginning Ubuntu Linux, neither of which is available as a free download.
The plugs for his other two books are fair enough, as he is using this free guide as a bit of a loss-leader for the other, more extensive books. If the other two are as well written and easy to understand as this little guide, then they would be worth purchasing.
This guide is necessarily short as it has only 170 small pages to cover a big subject. The guide doesn't get into applications very much at all, other than touching on ClamTk and Firefox. Given the space limitations, perhaps my only criticism is that while he spends quite a bit of time on how to install applications using Synaptic and even the command line, there is no mention at all of the "Add/Remove Applications" GUI, which is easily the most user-friendly of the three methods of installing applications.
I have to admit that, even though I have now been using Ubuntu for almost two years, I did learn quite a bit from this guide. One thing I learned was how to configure which applications open when you plug in a digital camera. I always prefer to manage my own photos and manually remove them from the SD card myself through the file browser, but Ubuntu always opens the two F-Spot photo import and management windows and creates photo thumbnails. Sometimes this interferes with the file system creating thumbnails. I had tried for a while to figure out how to configure Ubuntu to not open F-Spot. The guide told me how to do it: Places→ Home→ Edit→ Preferences→ Media→ Photos→ Open Folder.
Overall, Ubuntu Pocket Guide and Reference by Keir Thomas is a great addition to any Ubuntu user's bookshelf, or to your file system in PDF form. It is well recommended; even if you have been using Ubuntu for a while, I guarantee you will learn something new reading it.
I have recently been thinking about all those PCs out there still running Windows XP here in 2009. Probably most of their owners don't realize that mainstream support for XP ends in just a few short months and it is slowly running out of lifespan. They probably think they are going to have to go out and buy a new computer to run Vista or even Windows 7. In the current economic climate many people don't want to spend money on new hardware when the old hardware runs fine.
There are solutions: any PC that will run XP and has at least 384 MB of RAM can run Ubuntu, which is an up-to-date operating system and in my opinion better than XP anyway. I say that because with Ubuntu, or any Linux or BSD installation, the operating system is available free of charge, it preserves your freedom, all the applications are free and there are no virus or spyware worries. All improvements over XP.
The problem is that lots of XP boxes were delivered with 256 MB of RAM, which was okay for XP but won't run Ubuntu all that well. Other than buying a new PC there are some options with these boxes. You can of course add RAM, as we did with our old XP PC. 512 MB of RAM is relatively cheap and will run Ubuntu very well. Another alternative is to install Xubuntu.
Xubuntu is the lightweight desktop version of Ubuntu and it has lower system requirements. Xubuntu can run on as little as 64 MB of RAM and a 166 MHz processor, although 256 MB and an 800 MHz processor are recommended. Instead of the Gnome desktop it uses the Xfce desktop and includes less resource-intensive applications as well. Xfce originally stood for "XForms Common Environment", although these days it goes by its initials only.
I recently downloaded the Xubuntu 8.10 Intrepid Ibex ISO, made a CD and gave it a try in a live session. Xubuntu is actually quite impressive - light and fast. The desktop is similar to Gnome, but simpler, and the default theme is blue, not the usual Ubuntu brown. The Xubuntu logo shows up on the desktop, an Ubuntu "circle of friends" with a mouse inside it, presumably to show that it is a small operating system. The ISO is only 595.5 MB, smaller than the normal CD capacity.
Xubuntu 8.10 includes some applications that will be quite familiar to Ubuntu and other Linux/Unix/BSD users, including:
But it also has some applications that will be less familiar, such as:
I tried a couple of these new ones. AbiWord works well as a word processor and, while it has its own default format for saving documents, the open format ".odt" is also available, along with ".pdf" and ".doc" as well. Likewise the Gnumeric spreadsheet application seems to work like any other spreadsheet and can also save in ".ods" or ".xls" formats as well as ".pdf". Mousepad is a very basic text editor, more along the lines of Notepad, as it doesn't include syntax highlighting and other more advanced features.
Of course Xubuntu comes with the most useful of all applications "add/remove applications" and its big brother the Synaptic package manager. Those allow you to go and get just about any application you want, including some common ones we use:
The only problem will be not overloading a PC that is running Xubuntu because it lacks RAM in the first place! Xubuntu also lacks some common utilities, although these can be selectively added through "add/remove" if needed.
Xubuntu would be a great option for desktop PCs with under 384 MB of RAM or for many laptops and especially the newer netbooks. In fact there is even a version of Xubuntu called eeeXubuntu especially for the Asus eeePC netbook.
So if you are the owner of an older PC that is running XP on 256 MB of RAM and you want to upgrade to a more modern operating system that uses free software and offers worry-free computing you have a choice:
Both are great solutions and should allow you to keep your old hardware humming along for years to come.
Our earlier reported hardware testing on Ubuntu 8.10 Intrepid Ibex yielded some unhappy results - our camera wouldn't work. We could take photos fine, but unlike Hardy, Intrepid refused to recognize the camera and download the photos from its internal SD card.
In a way this wasn't a problem, since we are remaining with Ubuntu 8.04 Hardy Heron at least for now. Still, it is disturbing that hardware that worked with Ubuntu in one version stops working in another Ubuntu version. This problem has been widely aired on the Ubuntu Forums and also reported on the Ubuntu Launchpad bug reporting system as Bug #285682. However it still isn't fixed. Some updated files have allowed some users to get their photos off their cameras, while others have resorted to a plethora of work-arounds, like reinstalling Windows.
One work-around I did want to try is an SD card reader. Since our camera uses the small removable SD (Secure Digital) cards a separate reader can by-pass the need for the operating system to be able to identify and work with the camera. This was actually suggested by a member of the NCF Free Software Discussion Forum. When I suggested it on the Ubuntu Forums and on the bug report some users indicated that an SD reader worked for them while others indicated that it didn't.
An SD card reader is essentially a small adaptor that allows an SD card to be plugged into a USB port. But of course in computing nothing is entirely simple and so SD card readers are not all the same. Some read only one card format while others are multi-card. There seems to be differences in how they work, too, so some may not work with Linux systems. I wanted to get one just to ensure that even if we upgrade our Ubuntu version that we won't have to worry about camera compatibility anymore.
So I picked up a small multi-card SD reader from factorydirect.ca for $9.99. The unit is made in China, of course. Factorydirect.ca call it "USB ALL-IN-ONE FLASH CARD READER" (all in capitals) and assign it a code of CA2300. The Chinese packaging describes it a bit differently. The package indicates that it is marketed under the fanciful brand name "Enjoy Technology, Enjoy Life", calls the unit an "AIO-Card Reader" (presumably for "All-In-One") and claims, in somewhat fractured English:
I presume that "antiseismic" means that it is "shock-proof" and not "earthquake-proof".
The packaging also reassuringly carries icons assuring compatibility with Mac, Windows, USB2.0 and what I wanted to see: a Tux penguin icon, indicating Linux compatibility.
The package also claims that it supports many card formats:
The unit is small (2.75" X 1.5" X 0.5") and has four ports for four different sized cards. It comes with a 17" USB cable. I plugged in an SD card, plugged the USB cable in and it worked. Both Hardy and the Intrepid live CD instantly recognized the SD card and I could download the pictures there. The reader itself seemed totally transparent to the operating system, which is how it should be. Actually it is transparent to the user as well - you can see the card ports and the circuit board through the clear case. Very simple and impressive.
So now with my new SD card reader I don't have to worry if future Ubuntu versions won't read my camera. The unit lives up to the expectations it raises, as I can now indeed truly "Enjoy Technology, Enjoy Life".
People who use open source software like to be able to contribute to the projects, but for those of us who are not software developers it is hard to contribute in a meaningful way. Sure we do what we can to advertise and get the word out about how great these free applications are, buy T-shirts and even file bug reports, but for many projects that is pretty much all that non-developers can do to help out.
One of the few open source projects that has a place for non-developers to contribute directly is the only open source virus scanner, SourceFire's ClamAV. On this project mere users can submit viruses and false alarms, which are then incorporated into the virus definitions and thus help improve the final product, making it more competitive with commercial scanners.
People who make useful reports get credit, too. I recently was able to report a confirmed false alarm that ClamAV had picked up and was mentioned in [clamav-virusdb] Update (daily: 8796). That is a nice feeling!
We already covered the Linux ClamAV GUI ClamTk, a bit earlier, but I thought I would put all the information that a Linux desktop user needs to use the command line ClamAV scanner in one place here.
This is probably a good place to start - why do you need a virus scanner if you are running Linux? Actually you probably don't need one - there isn't much in the way of Linux viruses out there and they are hard to install on Linux anyway. Unlike in Windows, you can't just click on them to execute them, you need to enter a password and so on. Even if you manage to install a virus it will only be able to affect your user account and not the whole PC anyway. So why bother?
Well as mentioned before:
On Ubuntu you can get ClamAV on the "add/remove applications" repositories, although it is very generically called "Virus Scanner" there. The repositories install both the command line only ClamAV and the ClamTk GUI as well, which is handy.
ClamAV is easy to use from the command line and, unlike the ClamTk interface, it lets you update the definitions and scan the whole PC. ClamAV only does one thing in the desktop environment and that is on-demand scanning. This means that you have to manually update the virus definitions and then you can scan all or part of your computer. Updating takes only a second or two, but scanning takes a while, so I recommend doing it when you are finished with the PC for an hour or two: set up a scan and let it run.
Here are some useful commands to run in the terminal window:
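The details will vary from setup to setup, but the basic routine we follow is an update followed by a scan, along these lines (the paths shown are just examples):

```shell
# Update the virus definitions first - this only takes a second or two
sudo freshclam

# Recursively scan your home directory, printing only infected files
# (that is what the -i option does)
clamscan -r -i "$HOME"

# Or scan the whole PC - this is the long one, so start it when you
# are done with the computer for a while
sudo clamscan -r -i /
```

The `-r` option makes the scan recursive and `-i` keeps the output readable by listing only the files ClamAV thinks are infected.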
Okay so you run a scan and nothing is found - that is good. Close the terminal window and have a nice virus-free day.
But what if ClamAV turns up a virus? The next step is to see what it found and determine whether it is a real virus or a false alarm. This means locating the infected file identified using the "-i" option.
Once identified it may be obvious that it is a false alarm. My last "virus" was "Inkscape-0.46.win32.exe", which is the Windows installer for Inkscape and clearly not a virus. As long as the file is under 10 MB, a good way to check whether it is falsely accused is to upload it to the Jotti Online Malware Scanner. This gives "second opinions" from about twenty different free scanners and quickly shows whether they all think it is a virus or only ClamAV does. The latter would indicate a probable false alarm.
If it is a false alarm then it needs to be reported.
ClamAV has a really simple reporting system that makes it easy for non-technical users to submit virus samples and false alarm reports. You simply fill out the ClamAV Virus Database Submission Form, attach a file and send it. In the case of reporting a real virus that ClamAV missed you attach the virus file. In the case of a false alarm I usually create a small text file that includes a copy of the scan report showing the virus identified, the definition build number and a note indicating the Jotti results or my reasons why I think it is a false alarm. The form will not allow you to submit it without an attachment of some type, so you need a text file or other info to attach there.
If you discover a real virus that ClamAV doesn't catch, it can be submitted via the same form too, after being checked on Jotti.
So what if during a scan you find a real virus? It is pretty unlikely that what you have is a real Linux virus. If you have ruled out a false alarm using Jotti, then what you may have is a real live Windows virus. There is no need to panic, as it can't hurt your Linux system. The easiest thing to do is just go to where the file is and delete it. ClamAV doesn't include "virus vaults" or "removal tools"; you just do it manually if required.
As long as ClamAV picked up the virus there is nothing to report at that point.
Overall we are happy to be using ClamAV from the command line. It works well and gives us a chance to participate in an open source project in a small way, beyond just promotion.
We started using Inkscape on a recommendation from the NCF Free Software Discussion Group. Ruth was looking for a better application for doing drawings than Gimp and Inkscape fit the bill nicely. It was available right in the Ubuntu "add/remove" repositories and so we downloaded it for both PCs.
Inkscape is an open source drawing application that produces Scalable Vector Graphics or ".svg" images. These are the "gold standard" in images these days for two reasons. First SVG is an open standard and second because they are scalable. This means that you can make the image bigger or smaller without having to deal with it becoming "rasterized", in other words the pixels getting bigger and bigger with the image. SVG images look the same at any size, which is a great advantage.
The Inkscape project team describes the application as
A Linux, Windows & OSX vector graphics editor (SVG format) featuring transparency, gradients, node editing, pattern fills, PNG export, and more. Aiming for capabilities similar to Illustrator, CorelDraw, Visio, etc.
It is also available for other Unix-based operating systems, such as BSD and Solaris. It is worth noting that Inkscape was actually developed for Linux and was later ported to Windows and Mac.
Inkscape allows you to draw lines, shapes, do calligraphy, backgrounds, gradients and tons more effects. It really specializes in transparencies and, in fact, the default background is transparent.
Inkscape is fairly complex to learn how to use well, but the application itself comes with a series of tutorials that help you get started, all written as ".svg" files that you can draw right on. They are found under Help→Tutorials. The tutorials even include a lesson on the basic elements of design, which is helpful for people like me who have no formal art training.
Available tutorial subjects include:
There is also a complete manual that explains how Inkscape works and makes a good reference.
Inkscape is often used to create icons and logos and it is ideal for those sorts of simple uses. It can also apparently be used to create much more complex pictures, as well.
Because ".svg" files don't work well on the Internet yet, Inkscape includes the ability to export your final drawing to ".png" format. These then become raster images, but retain the transparency that Inkscape is capable of rendering. The little drawing shown above was rendered in Inkscape as an ".svg" and then exported to ".png" format and it looks just like the original.
Unfortunately Inkscape will not make you an artist, but as can be seen from the doodle above, it can cover up some of the drawbacks of a lack of talent!
One thing almost any experienced Linux user will tell you is that it is almost infinitely customizable. Naturally most people think this is a good thing as it means that it can look and work as they want it to.
I have been rather reluctant to do a lot of customizing for a few reasons. First I really want to see how everything works out of the box and second when we were upgrading versions every six months it meant re-customizing it, at least on clean installations. Now that we are using the Hardy Heron LTS version of Ubuntu there is no need to worry about that second reason as we will be running Hardy for a while.
Ruth has dived into customizing her PC with gusto. She has now replaced the brown "Human" theme with the blue "Clearlooks" theme. Her desktop seems to change every week as she draws new outer space pictures in GIMP. She has kind of looked pityingly at my stock desktop that still features the heron motif.
But lately I have been figuring out some customizations to make things, not look different, but work better for me. The latest was when I accidentally opened a ".jpg" in Sunbird for about the tenth time. I should explain that when opening files from the file browser the quickest way is to right-click on the file and select the application to open it in. I was trying to open an image file in GIMP to work on it, but ended up missing by a millimeter or so and opened it in Sunbird. Now Sunbird is a calendar application; it doesn't open image files, although it tries to. Sunbird creates a whole new calendar, which then has to be deleted, which is a nuisance to say the least.
What I can't figure out is why Sunbird ends up on the default "open with" menu for images and ".html" pages. It makes no sense.
So I set out to figure out how to get it off the menu. It actually turned out to be easy. It was just a matter of finding any file of that type (.jpg, .html), clicking Properties→Open With, highlighting Sunbird and selecting "Remove", and it is off the menu. A small customization, but it makes life easier for me.
So as far as customizing goes, I still have my Hardy Heron desktop, but with Ruth drawing pictures in GIMP, you never know what you will find.
Okay, I have to admit that Ruth finally convinced me to make a new desktop image. So I created a very simple one in Inkscape, using a white background, a purple gradient and the Ubuntu logo. I think it looks elegant! Ruth actually liked it enough that I made one for her too, in olive green. I still like the Heron graphics, though.
I also made a couple of other changes to make Ubuntu work better. The first was to stop F-Spot opening every time I plug in a camera or other media with pictures on it. I don't use the F-Spot photo manager and prefer, instead, to manage my own pictures from the Nautilus file browser. Getting F-Spot to stop popping up the two windows it opens every time was a priority. This turned out to be easy to do: in Nautilus, Edit→ Preferences→ Media→ Photos→ Open folder, and that is it!
The other change I made is one that has greatly improved the performance of the Nautilus file browser. The default is for Nautilus to load thumbnails for all image files when you open a folder that has images in it. This is okay if there are a dozen or two image files in a folder, but if there are hundreds it takes a very long time to load the thumbnails, and in the meantime the folder contents jump around while it is loading, making selecting and opening a file difficult. I fixed this also in Nautilus: Edit→ Preferences→ Preview→ Other Previewable Files→ Show Thumbnails→ Never. This means that instead of thumbnails for image files you get a generic icon. Even the biggest folder of pictures opens in a flash instead of sometimes two or three minutes! Of course if you need the thumbnails in a folder you can quickly reset that. A real performance enhancer!
We don't use an e-mail client, like the Evolution client that Ubuntu comes with. We both prefer to use webmail as it means we can sit down at any PC in the house or elsewhere and all our mail is available, instead of being "on the other computer". The problem this creates is that if you click on a webpage link for an e-mail address, Ubuntu tries to set up Evolution each time. I fixed this by using Synaptic to remove Evolution entirely. Now if I click on an e-mail link nothing happens - a much better solution, plus it frees up some hard drive space!
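For anyone who prefers the terminal to Synaptic, the same removal can be done with apt-get; adding the --purge flag also clears out the package's configuration files, roughly matching Synaptic's "complete removal" option:

```shell
# Remove the Evolution package; --purge also deletes its
# system-wide configuration files
sudo apt-get remove --purge evolution
```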
As mentioned in the article about Gmail, as part of trying to get Hotmail to work again recently I tried using another web browser. Ubuntu has quite a list of browsers available for download, but the one I have wanted to try for a while is Epiphany.
Epiphany is actually the standard Gnome desktop web browser. The only reason that the Gnome-equipped Ubuntu doesn't come with it is because Ubuntu ships with Firefox instead. Even though Firefox is well regarded, Epiphany has been getting good reviews as well.
Epiphany has an interesting history. It traces back to 2000, when developer Marco Pesenti Gritti decided that the Mozilla Suite project was getting too big and bloated. He had a vision of a very small, fast browser, so he created a fork from the Mozilla Suite and started a new browser project named Galeon, first released in June 2000. It worked well and quickly became very popular as a Linux, Gnome-based browser. During Galeon's development a disagreement arose within the team: the other members wanted a fully-featured browser while Gritti wanted to keep it simple and lightweight. This resulted in a new project being split off from the source code, and this became Epiphany. By 2005 the Galeon project had lost momentum and the team started writing extensions for Epiphany instead, validating Gritti's original concept.
Epiphany is the standard Gnome browser, and as such, it integrates really well into Gnome, but less so on other desktops. It is available for Mac OS-X, but not Windows. It was felt that one of the reasons Firefox is not as fast as it could be is due to the compromises made in making it cross-platform. Mac OS-X is Unix-based, so there is no coding compromise in making it available for Mac.
Epiphany does not have its own theme settings; it picks up the selected Gnome settings. That is one reason it looks so much at home on the Gnome desktop and not on other desktops. The emphasis is on speed and simplicity.
Epiphany truly is a very simple browser; it is quick to learn and use. Unlike KDE's Konqueror it is not also a file manager or anything else. That is one reason it excels as a web browser.
It is fast, as well, loading pages quite noticeably faster than Firefox and far faster than Microsoft's Internet Explorer does. This is most evident when loading multiple image animations, such as weather radar images.
Epiphany has most of the features that Firefox has, including tabbed browsing and a bookmarks toolbar. For users coming from Firefox learning to use Epiphany is very quick and intuitive. Many shortcuts are familiar, for instance F5 will reload the page, Ctrl-T opens a tab, Ctrl-W closes a tab. Ctrl-F opens a "find" bar that actually works better than Firefox's as it auto-closes when you click elsewhere, which I like. There are some differences in shortcuts from Firefox, for example F6 will not highlight the URL bar, you have to hit Ctrl-L to do that (for "location").
Being a simpler browser Epiphany does not have all the features that Firefox has. Missing are:
While Epiphany doesn't have a dedicated search tool bar the default search from the URL bar (which Gnome calls the "location bar") is with Google, which is a neat solution, really.
Epiphany does have plug-ins available, but not as many as Firefox. Interestingly these cannot be loaded from the browser itself, they have to be installed from the command line by typing in:
sudo apt-get install epiphany-extensions
This provides a standard package of official extensions which can then be selected on or off from the browser. There are additional unofficial extensions available too.
Epiphany is most often noted for its handling of bookmarks. Rather than the more usual system of folders and sub-folders, Epiphany tags them so they can be viewed by category, including an "all" setting. It actually works quite well. Bookmarks can be imported from other sources; I imported our Firefox bookmarks via the HTML file that Firefox can create and Epiphany can import. Epiphany can back up bookmarks as an XML-based file, "Bookmarks.rdf". Of interest, bookmarks cannot be easily exported to Firefox from Epiphany, since Firefox can't handle the ".rdf" files Epiphany creates.
So overall Epiphany offers fewer features than Firefox in exchange for better speed, simplicity and a quicker learning curve.
Ruth is quite impressed with Epiphany and has been using it as her primary browser since I installed it for her. In particular she says the improved speed and simplicity make it preferable for her over Firefox 3.
We have been working with Epiphany quite a bit now and we both like it a lot, mostly because of its speed over Firefox. There are quite a number of official extensions and also third party extensions as well. We don't usually use extensions for browsers, but one we have found worthwhile for Epiphany is gwget which allows you to monitor downloads to see how long they will take, etc. Gwget can be installed directly through the command line as it is in the Ubuntu repositories:
sudo apt-get install epiphany-extension-gwget
Once installed you just have to turn it on at Tools→ Extensions.
It doesn't work on Epiphany 2.22.2 which I have on Ubuntu 8.04, but it does work on Epiphany 2.24.1, which Ruth has on Xubuntu 8.10. It should work in Epiphany 2.26, which I am hoping will be available with Ubuntu 9.04.
We have never really discussed webmail before, mostly because it has never been an issue until yesterday, unfortunately.
We have been using webmail for a long time. Our first e-mail account was a Hotmail account that we signed up for in the early summer of 1997. That was before Hotmail was bought by Microsoft later that same year. We have always kept that account active, even when we went to POP mail and it has been very useful over the years. Having the same e-mail address for eleven years is a rare luxury these days!
But we lost the account on 5 November 2008. No it wasn't compromised, instead Microsoft "updated" Hotmail and suddenly it stopped working in Firefox 3 on Linux. It seems that the website identifies your browser and loads a different page and CSS sheet for each browser type. Some people have reported that it works okay on Windows with Firefox. Even some other Linux users say they can still use it, but it wouldn't work at our house.
We didn't lose the ability to sign in; in fact it all works fine except that we can't compose messages, as it won't accept any text at all. I even tried installing a different browser, Gnome's Epiphany, but while that is a great browser, it had the same problem. We tried another PC, to no avail.
I wasn't really surprised that this happened. Microsoft has been moving Hotmail around in their plans since they bought it. At first it was "MSN Hotmail", more recently it was made part of "Windows Live", along with a bunch of other services. A check of the Windows Live system requirements shows that you need:
"Microsoft Windows XP SP2 or later, or Windows Vista (Windows Live programs do not support Windows XP Professional x64 Edition. Windows Live Family Safety does not support 64-bit editions of Windows.)"
"Internet Explorer 6 or later"
Could it be that Microsoft is kicking all non-Windows users off Hotmail and its other services? It looks like they are even booting Windows 98, ME and 2000 users as well. Technically it wouldn't be hard to do that, but it couldn't be termed charitable or fair.
The reason that I am not too upset at all this is because, while webmail elsewhere has got much better, Hotmail gets superficial make-over after make-over but still sucks. Yes, you can choose the colour of your page from six options, but Hotmail doesn't support POP/SMTP or IMAP use, doesn't offer secure connections and doesn't even have phishing reporting anymore (eliminated this week!). It is very, very slow to load mail even over fast connections and has cumbersome address book management, including no auto-completion of addresses, making you go to the address book for addresses every time. You have to click the in-box every time to see if you have any new messages; it won't auto-refresh itself and show new messages. Even the short list of on-page addresses was random and not customizable. Basically it was still stuck in the 1990s for utility.
So goodbye to Hotmail and our last connection to Microsoft, but what to do? I wanted to move to another webmail service that I was sure wasn't going to lock out Linux systems. The obvious choice is, of course, Google. This innovative company only uses Linux internally so their Gmail has to work with Linux. I already had a Google account, so I just signed up for Gmail.
My first problem was finding an acceptable e-mail address. Gmail has been around publicly since 7 February 2007 and has millions of users, meaning all the good names have been taken. Finally I found one and signed up.
Gmail organizes your mail differently, not by individual messages, but by threads of messages and responses, they call "conversations". It works more like a forum. It takes a few minutes to get used to, but is fairly intuitive.
The pluses? It is much faster than Hotmail at loading, refreshing the page and working. It supports POP/SMTP and IMAP, so you can use it through an e-mail client, like Thunderbird, if you like. It includes chat capabilities with other Gmail users. You don't delete threads, but archive them, although they can be deleted as well if you like. It has over 7 GB of storage space per user. Its handling of attachments is amazingly simple and fast. It auto-refreshes every minute or two and shows new messages, even indicating them on the browser tab on the taskbar, so you don't even have to be working in your browser to know you have a new message. It works really well on Linux with Firefox 3 and Epiphany.
I was even able to get Hotmail to export a CSV (comma-separated values) spreadsheet of our address book and then import it into Gmail. That saved a lot of typing!
The downside of Gmail is...well I haven't found any yet. Perhaps the only thing is, ironically, I haven't found a way to customize the colour of the page, it just stays Google blue, apparently. This is of course the only thing that Hotmail does have over Gmail, but it is trivial.
So, my conclusion? I think Microsoft did me a favour in throwing me off Hotmail after eleven years. Changing our websites to indicate the new e-mail address was a bit of work, as was sending out an e-mail to everyone I know with the new address. On the plus side even that turned out to be a good thing, as I heard back from several friends whom I hadn't heard from in a while.
I have rated Gmail as 9/10. It is almost perfect - a true 21st century webmail system. Now if they only allow changing the page colour, it would be 10/10!
It seems, from discussions on the NCF Free Software forum, that there is a workaround for the Hotmail text-entry problem. Firefox needs to have its about:config settings modified to eliminate the reference to the operating system; then Hotmail will allow text to be entered, at least until Microsoft breaks something else.
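I haven't tried this myself, but from the forum description the change amounts to overriding the user agent string the browser sends, so that it no longer mentions Linux. In Firefox's about:config the relevant preference would be something along these lines (the value shown is purely an illustration, not a tested fix):

```
general.useragent.override = Mozilla/5.0 (X11; U; rv:1.9.0.3) Gecko/2008092510 Firefox/3.0.3
```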
It isn't an issue for us as we have moved to Gmail to avoid further problems with Hotmail, not to mention its other disadvantages.
This is neat - the one thing that Gmail didn't have was customizable colours, but today that changed. Gmail now offers 31 different colour schemes, found under Settings→Themes. This makes it fully customizable now and so I have upped the rating to 10/10. I have been using Gmail for 15 days and I am very happy with it - it works really well.
I have been using Gmail now for 25 days and I have to admit the more I use it the more impressed I am.
Gmail includes the ability to add a label or tag to any mail thread. This can then be used as a search parameter, making it very easy to find all those e-mails that are on any labelled subject. Very neat!
The search features deserve special mention as well. That Gmail features extensive e-mail searching should come as no surprise, after all this is Google, famous for web searching. Basically you can search for any word or phrase in your e-mail, including any e-mail address, making it very fast to find an e-mail from a specific address, person or on any subject. Very fast and accurate, too.
The more I use Gmail the more impressive it is. It should come as no surprise that it is written for and runs on Linux. After all Google searches are also all run on Linux and Google exclusively uses Linux internally as well.
So I managed to download an ISO of Ubuntu 8.10 late on 01 November, check the MD5 sum on it and confirm that it is a good image. Then I burned a copy to a CD-R using the Linux ISO burner tool, checked the disc integrity and it passed as well.
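For anyone who hasn't verified an ISO before, the check itself is a one-line terminal job. Here is a sketch using a throwaway file; for the real thing you run md5sum on the downloaded ISO and compare the result with the hash in the MD5SUMS file on the Ubuntu release page:

```shell
# Make a small stand-in file for demonstration purposes
printf 'not a real ISO' > demo.iso

# Print the file's MD5 hash - for a real ISO, compare this against
# the hash published on the release page
md5sum demo.iso

# md5sum can also check a whole list at once; the release's MD5SUMS
# file uses the same "<hash>  <filename>" format
md5sum demo.iso > MD5SUMS.demo
md5sum -c MD5SUMS.demo    # prints "demo.iso: OK"

# Clean up the demonstration files
rm demo.iso MD5SUMS.demo
```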
I was interested to see what it looks like and how it works on a CD live session, so I booted to it and it loaded and ran fine. I am not sure I like the desktop image of the Ibex, at first look I thought it was a coffee stain! Of course that is pretty minor, as you can use any picture to replace it that you like.
I checked a few applications. Intrepid comes with the same versions of Firefox and OpenOffice.org that Hardy currently has, 3.0.3 and 2.4 respectively. A few of the menus are organized differently, but otherwise it looks pretty similar. There is some forum traffic that seems to indicate that it is faster and uses less RAM when running.
Then I tried out our camera. As earlier entries (Camera? and Hardy Heron Peripherals) mentioned our Panasonic camera did not work in Feisty Fawn or Gutsy Gibbon but does work in Hardy Heron. This was the main reason we were able to get away from Windows and move entirely to Linux. The camera does not work in Intrepid Ibex. When plugged in it returns an error that says:
Unable to mount Panasonic (Matsushita) Lumix DMC-FZ10 Camera Error initializing camera: -1: Unspecified error
The camera isn't even an FZ10, but is an FZ2, so that isn't a good sign.
I checked the Ubuntu Forums and found that others are having the same problem with all types of cameras, including Panasonic, Canon, Kodak and Nikon. They all worked in Hardy, but not in Intrepid.
A search of the Launchpad bug reporting system discovered that it has already been reported and is Bug #285682.
So I have to admit that I am not impressed. I think that the biggest challenge facing Linux remains hardware support. I can understand that some hardware manufacturers are being uncooperative and will neither write Linux drivers nor provide the information so that others can do so, but this is different. These are cameras that were all working in Hardy and now aren't. Hardware support just has to get better than this. This sort of problem will cause people to re-install Windows to make their cameras work.
The good news is that Hardy still allows our camera to work fine and it is supported until 2011. As mentioned we planned to stick with Hardy, because it is working with all our hardware.
Because of the widespread camera problems we can't recommend Intrepid Ibex at the present. It needs to work better. I am following the reported bug and will post an update here when there is progress.
In following up with some knowledgeable people and also the Ubuntu Forum posting, I have discovered that there is a pretty easy workaround for this rather widespread camera problem. Apparently a simple USB SD Card Reader will enable the camera SD card to be removed from the camera and plugged directly into the PC and read as a generic USB device. This enables the photos to be removed as files and moved to a folder on the hard drive.
Some other users have indicated that this workaround is not ideal as it means that photos can only be handled as files and not through a photo management application, such as F-Spot. Also, as was pointed out, we shouldn't have to go running to find a workaround like this - hardware that works in one Ubuntu version should work in the next. While I prefer to handle my photos as files anyway and not use F-Spot, I have to agree on that last point.
Because this at least means that most cameras will be usable, I have upgraded my rating to match that of Gutsy Gibbon at 8/10.
According to recent news on the Ubuntu Forum this camera recognition problem has been fixed.
It looks like two ".deb" files were omitted in the initial version of Intrepid Ibex. Some people were able to download them and install them, which fixed the problem. These files have now been added to Intrepid through the update process and so all cameras that worked on 8.04 should now work on 8.10.
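For the record, manually installing a downloaded ".deb" file is straightforward with dpkg. A minimal sketch - the file name below is a placeholder, not the actual camera-fix package:

```shell
# Install a downloaded package by hand ("example_1.0_i386.deb" is a
# placeholder name, not a real package).
sudo dpkg -i example_1.0_i386.deb

# If dpkg complains about missing dependencies, let APT pull them in:
sudo apt-get -f install
```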
I think we can put this problem down to the folks at Canonical working to very tight and inflexible deadlines. Intrepid had its committed release date of 31 October 2008 set more than six months ago. There has been an increasing tendency to ship new versions of Ubuntu on time, but not necessarily complete. Perhaps the best advice I have seen is to wait before upgrading versions and not do it the first week it is available. That way you have the benefit of the update fixes that didn't get completed before it shipped. Of course the other approach would be to stick with the LTS versions of Ubuntu, which seem to be less susceptible to this problem; their longer cycle seems to give each version greater maturity.
With this problem now fixed I can rate Intrepid Ibex as 10/10 as there are no other issues that I have run into.
It looks like I may have pronounced this "fixed" too soon. While it seems that Panasonic and some other cameras are now working fine with the recent updates, at least one report indicates that Nikon Coolpix S550 cameras are not.
There may be more updates to this story yet!
October 30, 2008 marks the release of the next version of Ubuntu. This one will be version 8.10 and it has the development code name Intrepid Ibex. From the early reviews it looks like it will be a good release with some useful new features, including:
In the past we have been pretty quick to move onto the next version of Ubuntu, but we have decided not to this time.
I should explain the background on Ubuntu releases. New version releases occur every six months, in April and October of each year, and are supported with updates and patches for 18 months. The exception to this is the "long term support" or LTS releases, which are supported for three years. Our current version, Ubuntu 8.04 Hardy Heron, is an LTS release and is supported until 2011. The idea behind the LTS versions is that users can keep them for a longer time, avoiding having to upgrade and still getting patches, fixes and application updates. You give up leading-edge updates for consistency in the longer term.
As mentioned previously, we were quick to upgrade from Feisty Fawn to Gutsy Gibbon to Hardy Heron. This wasn't because we love being on the "bleeding edge" of new releases, but because we never had total hardware support with the earlier two releases. Feisty Fawn wouldn't run our scanner or camera. Gutsy Gibbon ran our scanner, but wouldn't run our camera. Hardy Heron runs all our hardware, which removes the immediate need to upgrade. We are also quite happy with the available application versions and basically Hardy Heron works so well that we don't see a rush to move to Intrepid Ibex at this point.
We will probably download the ISO for Intrepid Ibex after 30 October, give it a look as a Live CD and also make some CDs to give away.
Intrepid Ibex looks good, but for now we are going to stick with Hardy Heron LTS.
Last Sunday was September 14th and it marked three months of living in a 100% Linux house. We have given it a really good trial so far and I thought that it would be a good time to have a look at how we are doing living without Windows.
Actually the main thing to note about the last three months has been just how non-notable it has been! In general we have had no problems at all and have learned a lot more about how functional Linux really is.
First, general functionality. Linux works great! Sometimes we will go two weeks without needing to reboot either PC. We get software updates about five days a week and unless we get a kernel update no reboot is required. This never happened with Windows.
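Those updates come through the Update Manager, which is just a front end for APT; the same thing can be done from a terminal:

```shell
# Refresh the package lists, then install whatever updates are available.
sudo apt-get update
sudo apt-get upgrade
```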
We also rarely get lock-ups, crashes or other similar issues. We do regular virus scans with Clam AV, but viruses seem to be a thing of the past as well. There is no spyware at all. We have had a couple of virus false alarms, but these are very easy to identify and report to the Clam team so they can fix the definitions.
We have not run into any document formats that we can't open, although we can certainly generate some that Windows users can't open! To avoid being rude, we send stuff out in PDF format whenever we need to send things to users on Macs or Windows. Between OpenOffice.org, with its native PDF functionality, and the CUPS-PDF printer, we can create PDFs out of anything. That is nice flexibility.
So what have we been doing in the last three months? Here is a sample:
Basically, we haven't yet found anything we wanted to do that we couldn't.
For its part Linux is free, with tens of thousands of free applications available. It is stable, fast and stays very up-to-date through frequent updates.
So what has been the cost? Not much, really. We have learned how to do everything we need to without much effort. Modern Linux distributions, like Ubuntu 8.04, are really easy to use for the beginner. The interface is very quick to learn and when you need help the answers are generally easy to find through the documentation or from the Ubuntu Forum. We haven't needed much help, though; it all just works for us.
We are really happy running Linux and haven't looked back since leaving Windows and its closed source software, expensive applications, crashes, spyware, viruses, BSOD and all that behind.
The world's largest science experiment came online this past month in Switzerland. Building the Large Hadron Collider (LHC), with its 27 kilometre long circular tunnel under the Franco-Swiss border near Geneva, has been quite a task, but when it is fully operational it may yield the secrets of dark matter, which we think makes up the majority of the matter in the universe.
The LHC was built by CERN, Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research) and was started in 1995. The LHC is probably the most important science being done on the planet right now and it has been getting quite a bit of attention from the general press.
What is of interest is that the LHC is run entirely using Linux. In fact CERN, in conjunction with FermiLab, developed its own Linux distribution, Scientific Linux, a number of years ago. Scientific Linux is a base recompilation of Red Hat Enterprise Linux and is handled in-house.
The CERN use of Linux is no coincidence. In fact CERN is committed to the open sharing of scientific knowledge and its use of Linux is part of that philosophy. The world wide web was invented at CERN. Tim Berners-Lee was working there in 1989 when he invented the hyperlink, the first web browser and the first web server, and he launched the first website in August 1991. Because of CERN's open philosophy, all of these inventions were made public domain and remain so today. Using Linux is an integral part of this philosophy: CERN needs to be able to develop and share its operating system, and with other scientists around the world using and improving it, the spirit of openness means it improves rapidly. CERN has been committed to open standards since 1953 and they have worked very well for the organization.
Linux is all that is used in most of the world's biggest scientific labs, because it is the only operating system that is compatible with the philosophy of those labs and because it works better than proprietary software.
It is great to see good science being done around the world. It is even better to see that results and knowledge are being shared from it. It makes sense that Linux would be a part of this. Closed source software isn't up to the task; only open source gives the ability to develop quickly and produce the kind of software needed to run the world's biggest science experiment.
CERN LHC Rap on YouTube - worth watching, explains everything!
Microsoft Windows Vista is, of course, an integral part of our story of switching to Ubuntu. After all, it was our great disappointment with Vista as a successor to Windows XP that led us to explore alternatives and made us Linux fans instead. We have continued to follow the story of Vista over the past year and a half, but the story only gets stranger with time.
"When is a Vista Installation Not a Vista Installation?" It sounds like a Zen koan. The answer is "when it is running XP".
Let me explain.
Microsoft has been claiming that Windows Vista has been selling very well and that they have sold over 200 million Vista licences. That sounds good for them, but it turns out that a Vista licence is not a Vista installation. Windows XP went off sale on 30 June 2008, so that should mean that all OEM computers being shipped with Windows have Vista, right? Not quite, it turns out.
Apparently because of all the hardware issues Vista still has, every Vista PC sold comes with a licence that allows a "downgrade" to XP, if the customer wants to do that. It seems that the world's largest PC maker, Hewlett-Packard, have been shipping the vast majority of their PCs since June with XP installed and not Vista.
How it works is this: because customers can downgrade their Vista licence to install XP instead, HP have been shipping them "pre-downgraded". So the customer gets a PC running XP which comes with a Vista licence. The customer is happy, because, as was reported, the vast majority want XP and not Vista and Microsoft is happy, because they count it as a Vista licence sale, because it is. Even HP is happy, because they are selling PCs and they have a simplified licencing scenario - every Windows PC ships with the same licence.
The only problem is that the so-called sales numbers for Vista are a bit, shall we say, "misrepresented". This may explain why website stats always seem to indicate that the number of Vista PCs that visit is rather low compared to the sales numbers.
Microsoft has told HP that they will have to cut this out by January 2009, but HP has indicated that they are negotiating on this date. I expect it will be slipped.
Meanwhile, what is Microsoft doing to try to sell the operating system that no one wants? Their latest plan is apparently to pay Jerry Seinfeld $10M to be their frontman, as part of their upcoming $300M advertising campaign to try to move Vista and counteract those cute "I'm a Mac" ads.
One blogger responded to this announcement with "It's appropriate that Microsoft would hire a comedian, since Vista is a joke".
According to what I have read on the forums, many new Linux users are intimidated by using the command line in the terminal. I guess that if you have spent all your computing life using just graphical interfaces the command line can be a bit scary; after all, you can't just keep clicking on things until something works.
The first computer I worked on was the IBM mainframe that was in use at Simon Fraser University in the late 1970s. I say the computer, because in those days there was only one computer in the university and it was that IBM. It sat in the basement of the Administration Building and was composed of about two acres of cabinets with whirling tape drives. There were CRT and printer terminals all over the school. I had an account for use as part of a chemistry class. There was no graphical interface at all, everything was accomplished by entering commands at a prompt.
From that experience I learned that the command line is the main way to talk to a computer. The only problem I have had, both then on the IBM and more recently with Linux, is learning what the available commands are to get it to do what I want. You can't just click around: either you know the correct syntax or you get an error.
As a result I have been collecting the useful commands I have learned and thought I would list a few of them here. Once you get going it is rather fun to see what you can do.
MD5 Sums can be used to check the integrity of a downloaded file. For more information on this see the article below.
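Checking a sum is a one-liner. A sketch using a stand-in file so it can be tried anywhere - with a real ISO you would run md5sum on the download and compare it with the sum published on the download page:

```shell
# Create a stand-in for a downloaded file so the example is self-contained.
echo "pretend ISO contents" > ubuntu.iso

# Print its MD5 sum; compare this by eye with the published sum.
md5sum ubuntu.iso

# Or save the sum to a file and let md5sum do the comparison itself:
md5sum ubuntu.iso > ubuntu.iso.md5
md5sum -c ubuntu.iso.md5
```

The "-c" option re-reads the checksum file and reports "OK" for every file that still matches.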
Clam Anti-Virus is much easier to use from the command line. First you need to install it from the Add/Remove repositories.
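Once ClamAV is installed, a typical update-and-scan session looks like this (the directory scanned here is just an example):

```shell
# Refresh the virus definitions (freshclam ships with ClamAV).
sudo freshclam

# Recursively scan the home directory; -i lists only infected files.
clamscan -r -i ~/
```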
To run this you must first install Tesseract from Synaptic, including at least one language package. For more information on Tesseract see the article below.
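If you prefer the terminal to Synaptic, the same installation can be done with apt-get. The package names below are as they appear in the Hardy repositories; treat them as an assumption if you are on another release:

```shell
# Install Tesseract plus the English language data.
sudo apt-get install tesseract-ocr tesseract-ocr-eng
```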
These are all easy to use once you know how. Some things, like running Tesseract OCR, MD5 Sums or ClamAV can really only be done from the command line, whereas other functions are better done using a graphical interface. For instance, no one edits photographs using the command line!
With a bit of practice anyone can use the command line. It gives you more confidence in using Linux and more control, too.
Everyone who uses the command line should also read the article Things you should never EVER type in Linux. Ever!.
It has now been five weeks since we switched our last remaining PC over to Ubuntu and left Windows behind.
Of note, there have been no challenges at all; in fact, quite the opposite. In the past Ruth mostly used the Ubuntu box and I mostly used the Windows XP one. That meant that I never got really proficient at using Ubuntu for getting work done. Now that I have been without XP I have not only been able to find ways of doing everything I need to on Ubuntu, like OCR with Tesseract, but I have even found that on Ubuntu most tasks are easier and quicker.
Unlike Ruth, who has never used computers in a work environment, I have always tried to find the fastest and most efficient ways of getting tasks done. Fewer key strokes means more work gets done in a given period with less fatigue. Some of what I have learned in the last few weeks has been shortcuts - mostly keyboard shortcuts. I have always found keyboard shortcuts a great help, but discovering them can be a tricky job in itself. Some are explained on menus, but in Linux many are just hidden. You have to learn them from someone else, basically.
In the interests of sharing the shortcuts I have learned, here are some of the most useful ones:
As time goes by and I learn more shortcuts in Ubuntu, I realize how well designed it is for getting work done quickly. It already saves me a lot of time over Windows XP, which either lacked some of these shortcuts or else I never discovered them.
Overall this makes using Ubuntu a lot of fun and a constantly pleasant learning experience.
When I first started using Linux in April 2007 it was a bit of an experiment. Would it work in our home environment?
The first few weeks were pretty exciting, with a sharp learning curve. Looking at my Ubuntu Forum posts from that period, it is obvious that I was starting from zero after using Windows 3.1/95/98/XP for many years. That was okay, I expected a learning experience and that the process would be a growing experience.
We started with Ubuntu 7.04 Feisty Fawn and, while it worked fairly well for web browsing, writing and e-mail, we had some unsolvable hardware problems (scanner, camera). I viewed it as an experimental process that one day might lead to us adopting Linux in place of commercial operating systems, but I certainly wasn't "evangelical" about Linux. In fact when people asked how it was going I would say "okay", but I couldn't recommend that they try it.
Ubuntu 7.10 Gutsy Gibbon got our scanner working. Ubuntu 8.04 Hardy Heron got our camera working as well and introduced a lot of improvements, like a new file system and a much more stable and compatible version of Firefox.
With 8.04 I felt that Ubuntu had arrived for me, although I recognize that it was a convergence - Ubuntu improved as I learned and we came to a point of intersection on 14 June 2008 when I was able to reformat my Windows XP PC and install Ubuntu instead.
We made a number of copies of 8.04 and gave some to friends and gave some to the office at National Capital FreeNet to hand out. I knew I had to be careful and so the message to friends was just - "Boot to this and give it a try, if you don't like it then give it away to someone else." I saw it as necessary to be low-key about it, even though I feel quite enthusiastic about Ubuntu 8.04. Linux is not for everyone, at least not in 2008 it isn't.
The word "enthusiastic" actually means "filled with God". To my way of thinking, that is a dangerous way to feel, because it means that you are out of touch with yourself and the way of the subject in question, in this case "Linux".
There is a "way of" everything, whether it is a "way of lunch" or a "way of Tai Chi". In the Chinese language the word for "way" is "tao", usually pronounced "dao", or in Japanese "do" as in "judo", the "soft way".
A small tao can be simply an everyday thing, like how to make a sandwich properly. A greater tao can be a path to enlightenment, the place beyond the self.
The "way of anything" is where its natural resonance occurs and therefore must be free of human desire and ego. This is not always easy to achieve, as things affect us, attract or repel us emotionally.
Using Linux as your operating system is not something that can be imposed on someone. They have to be ready for it. As A.Y. Siu warns, if you act like an evangelist you may entice people into Linux who are not ready yet and as a result they may have a bad experience and be turned off it forever. That is the imposition of ego, not the way of Linux.
Many have asked the question "Is Linux ready for the desktop?" There are about 30 million people using Linux in the desktop environment today and many have done so for years. I think that when people ask that question or a similar one, they are really asking "Is Linux ready for me?" or more specifically, "will switching to Linux be easy, painless, involve very little learning and basically be just like Windows?" As Dominic Humphries points out, if Linux were the same as Windows, then it could be no better than Windows.
In many ways it is better than Windows:
There are also some ways where it isn't as good as Windows:
It isn't the same as Windows, so there will be a learning curve if you are coming from Windows. You have to expect that. By learning how to use Linux you will gain knowledge and confidence, of course, and the process may change you.
In reality, though, "Is Linux ready for me?" is the wrong question. The question asked should actually be "Am I ready for Linux?"
If you want Windows without the spyware and viruses then you aren't ready, get Ad-Aware, ClamWin and AVG instead and keep using Windows.
If you want Windows that doesn't crash, doesn't spy on you and has better support, then buy a Mac.
If you want an operating system that requires no learning to master, then stick with what you are using now. You aren't ready for Linux.
If you are an experienced Windows user who is often called upon to fix other people's Windows PCs and you don't want to feel like a beginner all over again, you aren't ready for Linux.
If you want Windows for free then try ReactOS instead.
If you like doing de-frags, check-discs, disc clean-ups, spyware and virus scans, then you will feel at a loose-end on Linux, kind of like an ex-smoker who doesn't know what to do with his hands. Linux doesn't store files in a fragmented manner, doesn't have spyware or much in the way of viruses and cleans itself up.
If you want an operating system that requires you to learn, but rewards you with stability and performance, then maybe you are ready for Linux.
If you want to be part of a community where most people are welcoming and help each other instead of relying on professional support, then maybe you are ready for Linux.
If you want an operating system that challenges you to think, adapt and learn, then maybe you are ready for Linux.
If Windows frustrates the heck out of you, then take a break from computers in general; Linux will drive you nuts in the beginning, because very little of what you have learned so far will be useful.
If you have had enough of Windows and its problems and are open to trying something new then maybe you are ready for Linux.
If you are willing to persevere when you feel lost, ask questions on the forum and copy text into a terminal window, then maybe you are ready for Linux.
If you will not give up after a month or two, or even three and if you will wait six months for the next version release to see if it runs your hardware, then perhaps you are ready for Linux.
If you have never used a computer and you are ready to learn, then you may be the most ready for Linux, because you will have no preconceived notions.
If you invest a lot of yourself, your sense of worth, into your computer, then you aren't ready for Linux. You can probably muscle your way through learning Linux for a while, at least until you get so stumped on something that you have to post on the forum. "Sorry, I am a total beginner, I didn't understand your technical language, can you try it again, one step at a time, please?"
If your mind is empty then you are ready for Linux.
There is a Way of Linux, but it is not a path that leads to cheap software or even worry-free computing, it is a path without a destination, really, except yourself.
For anyone considering trying Linux, further reading on this subject:
Today is the first day in the passing of Windows XP, the beginning of the end, an event noted even in the mainstream non-tech media.
It is a process that will take until 08 April 2014 when all support will cease. At present it looks like there may well be XP users around still in 2014, although I think most will have moved on long before then. Of course there are still Commodore 64 users around today.
As of today Windows XP is no longer for sale, with a couple of exceptions, such as "existing OEM stock" and sub-laptops, for which Microsoft has no other product. Otherwise if you want an XP PC after today you have to buy a Vista one and then order the "downgrade" (or as most people call it: "upgrade").
XP was launched on 25 October 2001 and in the past seven years it has developed a strong following. Most users seem to agree that it was Microsoft's best operating system so far. It was a great improvement over Windows 98 in both functionality and stability. Most Windows users also agree that it still works better than Vista does, which is why over 212,000 people signed an InfoWorld petition to try to convince Microsoft not to discontinue XP. Not surprisingly Microsoft hasn't listened.
I don't think many people were surprised that Microsoft didn't listen to its customers on retaining XP. Even if it was the sort of company that cares about its customers, which it isn't, as evidenced by the long list of anti-trust convictions against it, why would Microsoft continue to sell XP, when it is plainly hurting Vista sales? Simply put, Microsoft wants you to buy Vista. They don't want you to hang onto XP and hope that the upcoming Windows 7 is better. Every indication is that Windows 7 will just be more Vista anyway.
The end for XP is a sad occasion; it was a great operating system. We ran it from 07 May 2004 until 14 June 2008. In that time we got to really like it. It was stable, fast on 256 MB of RAM and did everything we needed a PC to do. When we reformatted it on 10 July 2007 and equipped it only with free open source software it ran even better. It wasn't perfect and was very vulnerable to viruses and spyware but, with proper user care, it was a very usable operating system.
So what can Windows XP users do?
Microsoft made it easy for us by announcing that Windows 7 will incorporate only "minor changes" over Vista.
As described in my earlier Windows 7 article we are sorry to see Windows XP go, but since the future will all be Vista, that made our choice to go 100% Linux easy.
I did some more reading about the Tesseract OCR application and, on a hunch, managed to figure out how to get it to work and discovered that it works really well!
The key missing piece of information that I needed was that even though XSANE creates ".tiff" files, Tesseract will not open them unless the extension is changed to ".tif" (one "f"). Once the extension is changed it works fine, in fact it works amazingly well.
As noted in the previous article on GOCR for Linux, that application produced a 2.7% character error rate, which is pretty high. That was on a large font and very clear page, too. On the same page Tesseract produced only two errors out of 1059 characters for an error rate of just 0.19%. It should be noted that both errors were just extra spaces added, very minor in nature. In our test that makes Tesseract 14.2 times more accurate than GOCR.
I have tested Tesseract on much smaller fonts and it seems to work just as well.
Tesseract runs only from the command line and is very easy to use. You scan the text to be copied and save it as a ".tif" file. If the scanning application saves it as a ".tiff" then change it to ".tif" instead. To carry out the OCR operation, enter the path and target file name on the command line, as in this example:
$ tesseract /home/johnsmith/input.tif output
In this case "input.tif" is the name of the file to be converted and "output" is the base name of the result; Tesseract will create "output.txt".
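Putting the extension trick and the OCR call together, a complete session looks something like this (file names are examples):

```shell
# XSANE saved the scan as "scan.tiff"; Tesseract 2.x insists on ".tif".
mv scan.tiff scan.tif

# Recognize the text; Tesseract appends ".txt" to the output base name,
# creating "result.txt".
tesseract scan.tif result
```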
Tesseract is very impressive, quick and easy to use, once you know the trick! It is also a very accurate OCR application and so this all adds up to a 10/10 rating.
Tesseract has an interesting history. It was developed by Hewlett-Packard between 1985 and 1995. In 2005, after no work had been done on it for ten years, it was released as open source under an Apache licence. Currently the project is headed by Ray Smith and sponsored by Google. I would like to acknowledge Google for their work in supporting the project, it is obviously worthwhile!
The current available version is 2.03, but as usual Ubuntu lags behind and the current version in the Hardy repositories is 2.01. Hopefully Ubuntu will be able to offer the newest version soon.
One of the capabilities we lost when we got rid of Windows was Optical Character Recognition. OCR allows a user to scan a page of text, have the computer recognize it as text characters and then save it in a text format.
Our Canon scanner came with a very capable Windows application called OmniPage SE. It worked very well on even small fonts and made very few errors. We don't do a lot of OCR, but it is nice to have the capability when you need it.
Ubuntu comes with the XSANE scanner utility, which works pretty well for scanning photos as ".jpg"s and creating ".pdf"s from scans as well.
In trying to get an OCR capability I searched the Ubuntu Forum and found mention of a command line OCR called Tesseract. I downloaded it using the Synaptic Package Manager. It works by scanning the source document as a grey scale ".tiff" image and then running the command line to recognize it and convert it. I tried it several times, but it could not open the ".tiff" images for some unknown reason.
I tried a different approach. The XSANE viewer has a setting at File→OCR (save as text) which looked promising. I tried scanning a text document and then using that setting but it resulted in an error, indicating that "gocr" was missing. I went back to Synaptic and found the package there and installed it.
The good news is that it works! The bad news is that it doesn't work really well. A test on a very clear and large font scan yielded 16 errors in 602 characters, a 2.7% error rate.
[GOCR] claims to handle single-column sans-serif fonts of 20-60 pixels in height, and reports trouble with serif fonts, overlapping characters, handwritten text, heterogeneous fonts, noisy images, large angles of skew, and text in anything other than a Latin alphabet.
The project is still in its early stages and so hopefully will get better over time. In the meantime it gives some basic OCR functionality.
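GOCR can also be run directly from the terminal without going through XSANE; as far as I can tell it wants one of the PNM image formats as input (the file names here are examples):

```shell
# Recognize the text in a scanned PNM image and write it to a text file.
gocr -i scan.pnm -o scan.txt
```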
OCR is actually the last thing that I had on my list of capabilities to locate for Ubuntu.
We have now been "100% Linux" at our house for ten days and have encountered no problems at all. Even though we have had a Linux PC in our house since 23 April 2007, Ruth adopted it and has used it far more than I have. This has probably slowed down finding Linux solutions to questions as when I had to do things like OCR in the past I just left Ruth to work on Ubuntu while I did OCR with OmniPage on Windows XP. Going Windows-free means finding new ways to do things and I haven't been stumped yet. The learning is actually good fun and stretches the brain.
I was thinking about the Gnome desktop and all the applications and utilities that come with it recently. It certainly is a long list and includes applications that we use everyday on Ubuntu to get work done, such as:
This wasn't just idle consideration, but was motivated by something Randall C. Kennedy recently wrote in his article Handicapping the Windows 7 alternatives. The article is a rundown on what the alternatives will be when Windows 7 is released in late 2009 or early 2010.
Kennedy is firmly convinced, and with some justification, that Windows is doomed, because Windows 7 is going to be just a rehash of the failed Vista. All well and good, but he makes some strong statements about Ubuntu and Gnome:
Once the poster child for Windows-to-Linux defectors, Ubuntu has lost some of its coolness factor. Consecutive lackluster releases, plus a "pass the buck" mentality toward lingering kernel issues, have tarnished Canonical's once unassailable reputation. Add to this Mr. Shuttleworth's obsession with the emerging ultra-low-cost PC segment and you have a recipe for disaster.
There's still time for the company to come to its senses -- to take responsibility for more than just the packaging of its distro. With two or three major releases between now and Windows 7's earliest, most optimistic delivery target, Canonical has an opportunity to shore up its position as desktop Linux torch bearer by dumping Gnome, embracing KDE 4.x, and doing whatever it takes to improve reliability across a greater range of hardware configurations. Do that, and it might have a shot at securing some of the more open-minded XP defectors.
One of Kennedy's blog commenters, RSingh, takes him to task, responding:
Dropping Gnome!! you've gotta be kidding me. From what I know from the people that have used it, KDE 4 is still not all that stable to be used as default but that doesn't mean that it is not used, that's why we have Kubuntu.
As for GNOME, it easily meets the requirement for a stable system. From what I have seem, stuff like Compiz Fusion on GNOME easily blows vista "AERO" effects out of the water and at the same time allows for a simple and clean environment to use.
RSingh gets the issue exactly right.
I have been using the Gnome desktop and all its applications for over a year now. Being an avowed utilitarian, I am not impressed with splashy, but non-functional features. I like Gnome because it offers an environment that is clean, simple, fast and has good ergonomics - meaning that it is well designed to interact with humans.
Evince is a perfect example of this philosophy. It is a plain PDF reader with no adornments. In many ways it is similar to the Sumatra free software PDF reader for Windows. Both Sumatra and Evince are small, simple and open very fast. They both allow the user to read PDFs, which is really the point. Unlike the brand-name Adobe Reader, which takes around six seconds to open, displaying a list of credits while you wait, Evince is open in less than a second. It is not flashy or glamorous, but it is wonderfully functional.
Applications like PDF readers that give up simplicity, small size and speed for glitz, remind me of what Lao Tzu said: "When all else is lost, only ceremony remains" (chapter 38).
Kennedy outwardly blames "lackluster releases" for Ubuntu losing "some of its coolness factor", as he puts it. Actually each Ubuntu release in the past year has been a careful, incremental improvement in functionality, but without much flashiness added. I can't help thinking that Kennedy's underlying reason is that Ubuntu has proven too successful, too popular. Ubuntu now accounts for about half of all desktop Linux use and that makes it a bit too accepted for comfort. To many people "coolness" is when they have something that no one else has, regardless of whether it has any practical value or not. As soon as everyone has it, then it is no longer "cool" and they must move on to preserve their self-perception of uniqueness. Coolness is mere ceremony.
Being a utilitarian I have never had any use for "coolness". To me it is just "posing". Talking about "cool software" is meaningless, as there is no such thing. Either it is functional or it isn't. To me that means clean, simple, fast and with good ergonomics. I am looking for nothing more or less. To return to the Evince PDF reader: it is all those things, but it is not "cool", it is functional.
My suggestion to people looking for something really cool is this: use Vista. Now that is cool!
With many applications, users must create individual GIF files and string them together. The animated cartoons I made in our gallery were all done using UnFREEz. The animation you see on this page was made with GIMP.
With GIMP, users only need to create an image, without saving it, and then click on the “Layer” tab at the top of the screen. A pulldown menu appears and the first option is “New Layer”. Select it and then choose either the foreground colour (usually white) or transparent. Transparent works best if you're trying to keep things smooth, since the earlier image still shows through, whereas it disappears under a white layer. Think of it in terms of whether you throw a white piece of paper on top of a picture or a sheet of clear plastic. As you change the image (whether adding a colour or giant ants with lightning bolts coming out of their antennae taking over the planet) you just keep adding new layers on top of the old ones.
Another benefit to using GIMP is that users can take advantage of all the interesting features in GIMP. For example, users can open GIMP and click on the 'Xtns' (extensions) tab. When the popup menu appears, select 'Misc→Sphere' and then enjoy creating planets of whatever colour you choose. If you want to add animated effects, such as making the planet appear to rotate, then you can play with the lighting direction and, by adding new transparent layers, create motion that way. Granted, the results are not as super high tech as you'd like and nobody will believe you've “filmed” a planet rotating, but it's a great start, especially when you consider that GIMP is free.
Creating animations is still fairly labour intensive, to be sure, but something as simple as having a choice of layer characteristics, such as foreground, background, white or (my favourite) transparency, can make a huge difference and save a lot of time.
Unlike some graphics editor applications, GIMP does not degrade images when saving them as GIFs. Now, if all you're trying to do is depict a simple shape moving, degradation isn't a serious issue. However, if you're trying to animate an actual photograph, it can be a problem. Luckily, there is no such problem with GIMP. If you have a photograph you've saved as a JPEG, you can easily re-save it as a GIF without any image degradation. If I wanted to animate a picture of Zuby, like having flames shooting out of her nose or something like that, I could open GIMP, call up the file, add transparent layers as I alter the original and continue until the project was done.
In GIMP, you then save your work as a GIF. When you do, a dialogue box appears asking whether you want to flatten the image (i.e. squish all the layers together to make just one image) or animate it. Choose animate.
There is one small problem using GIMP to create animated GIFs that I have yet to sort out. GIMP has many different types of brushes, as most graphics editors do, and sometimes images I create make use of those soft brushes, including airbrushes. However, when I try to create an animation of storm clouds building, the result is a collection of solid images. Soft gentle clouds don't appear to be possible – or I haven't played with GIMP's animated GIF feature for long enough. Fortunately, it is so much fun to play with GIMP that I don't mind one bit!
Brasero is the newly included CD-burning application that was introduced with Ubuntu 8.04. Everyone raved about how flexible and easy to use it is, but I haven't had any success with it.
Yesterday I tried using it to do a multisession write to a CD-RW. By this I mean that I wanted to add files to a CD-RW that already had files on it without erasing those files. Brasero just returned an error:
Cannot Mount Volume - Invalid mount option while attempting to mount the volume...
Thinking it might be the CD, which had been compiled on Windows XP, I tried another approach. I got a new CD-RW and burned a file to it. It worked fine! Then I tried adding another file and got:
Error While Burning - The image can't be created
And it ruined the CD-RW, of course.
I couldn't find any documentation for Brasero, even on the Brasero website or any help anywhere, so I posted the problem on the trusty old Ubuntu Forum. The answer I got is basically that this seems to be a kernel-wodim problem and is the subject of a previously reported bug.
While annoying, this isn't a serious issue as Ubuntu has two CD burners. The other is the Gnome CD/DVD Creator and it works fine, the only drawback being that it cannot do multisession disks. This means that to add a file to a CD-RW you have to reload all the files on it and then burn it, in place of the existing files. Generally I have been backing up photos on a CD-RW and then, when it is full, burning them to a CD-R. I can still do this, or I can collect the files on a USB device and then when I have 700 MB, transfer them to a CD-R.
This does sound like a variation of the "system-breaking updates" that have happened in the past with Ubuntu, except this one is minor.
In case any Windows users feel smug - I just heard first-hand from a Windows XP user who downloaded XP SP3. It overwrote his CPU driver with a generic one and caused a false high temperature reading. That took a while to sort out, including some shop time to determine that it wasn't a true overheat condition.
I guess these sorts of problems can happen on any operating system, simply because of their complexity.
Brasero looks promising, with a simple interface. Now if only the Ubuntu developers can get it working.
The title of this entry is "Windows Free", not as in "Free Windows", but as in "A Windows Free Zone".
As you can read on our Open Source Windows Project diary entry Bye Bye Windows, we decided to go "Windows Free" as of today, 14 June 2008. Because the reasons are all laid out there I won't repeat them here.
I was a bit concerned about the technical aspects of switching our old Windows XP PC over. It is a custom-built unit from 2004 with some oddball hardware, such as an AMD Athlon processor, a RealTek audio card and an NVidia video card. It ran the live CD okay, although without a video card driver. I wasn't worried about the peripherals, including the camera, printer and scanner, because they had all been tested on our existing Ubuntu PC.
So I decided to go ahead and completely reformat the drive and install Ubuntu 8.04, figuring that between what I have learned in the last year about Ubuntu and the forum that I should be able to sort it out.
This would be a completely clean installation, no partitions shared with Windows XP, no emulation - just pure Linux.
The installation went very easily, actually. When I last reformatted this PC in July 2007 and reinstalled Windows XP I had to scramble to find Windows drivers for the audio panel and the video card. It was a long process to find and download them. Ubuntu came with the audio drivers, detected the NVidia card and, all by itself, prompted me to install the non-free NVidia driver. That went very smoothly.
I then installed applications to make this box the same as the existing Ubuntu PC that we have:
Naturally, like the last installation, I wrestled with PolicyKit to make it do what I wanted and I think I have it at a reasonable level of security now. Setting up the screen saver and mouse completed the basic installation.
Once the bookmarks for Firefox and the calendar dates for Sunbird were installed it was time to install all our files, photos, videos, etc. This also went smoothly, except for the CD reader not reading a couple of CDs. These were read on the other PC and then the files transferred to a USB device. It is odd, but otherwise went fine.
All in all I probably put in about 5 hours reformatting and installing Ubuntu, in between doing housework and other chores. The longest task is installing documents as the new Ubuntu file system is slower than the old one and takes a while to transfer files from the CDs, etc.
So we are now a "Windows Free Zone", or if you prefer, a "100% Linux Home". We will continue to add to this diary as we learn more about Linux and work with it more over time.
Windows 7 has been all the talk in the IT world in the last year or so. Windows 7 is the next operating system that will replace Vista. It was originally planned for the 2011-12 time-frame, but with the failure of Vista, development has been accelerated and Microsoft is now apparently aiming to get it to customers in late 2009 or early 2010.
The world has learned to hate Vista since it was released in early 2007. The uptake on it has been well below expectations and there is good evidence that many of the 100 million licenced copies out there have been over-written and replaced with either XP or Linux. The web-meters consistently show that people aren't using Vista. Even this website's stats for this month so far show only 9.2% of visitors use Vista. That compares to 7.2% for Mac and 6.3% for Linux.
Many IT writers, like Randall C. Kennedy, have been advising desktop users to avoid Vista and instead wait for Windows 7. There has been a hope in the IT world, fueled by rumours, that Windows 7 would include MinWin, a proposed new small Windows kernel, and that therefore the next Windows would be smaller, faster and have more modest hardware requirements. That could have meant that, instead of upgrading your hardware to handle the bloated Vista requirements, you might just be able to run a lean Windows 7 on your XP hardware.
Microsoft's PR department has been working hard to make sure people don't wait for Windows 7. They even issued a recent 20-page paper entitled The Business Value of Windows Vista (661 kB download). Kennedy calls the paper "propaganda" and, to be honest, he is being kind to Microsoft.
The paper actually says:
"There is no need to wait for Windows 7. It is a goal of the Windows 7 release to minimize application compatibility for customers who have deployed Windows Vista since there was considerable kernel and device level innovation in Windows Vista. The Windows 7 release is expected to have only minor changes in these areas. Customers who are still using Windows XP when Windows 7 releases will have a similar application compatibility experience moving to Windows 7 as exists moving to Windows Vista from Windows XP."
So the cat is out of the bag. Windows 7 will not be "MinWin", it will be just like Vista, or maybe even more bloated.
"Until now, I've been advising Vista fence-sitters to wait for Windows 7. However, last week's "big reveal," in which Microsoft finally confessed that Windows 7 will be nothing more than "Vista warmed over," has forced me to reconsider my position. I'm now more convinced than ever that Windows is doomed...".
Kennedy is not alone. There was a lot of hope that Microsoft would see the light, accept that Vista was a failure and design a new, lighter, more functional and less bloated Windows. One without DRM, one that works! Even one that will run on sub-laptops, cellphones and other small devices. The paper makes it absolutely clear: Windows 7 will be just more Vista.
"As I said before, desktop Windows is doomed. Version 7 was the platform’s last, best hope. Now it looks like Microsoft is going to kill it out of spite - one final, desperate act before the end."
"Windows 7: R.I.P."
But is it all gloom and doom? There is no doubt that the IT community, particularly the IT writers, feel Microsoft isn't listening while the writers themselves try to save Microsoft from the trash-heap of history.
Kennedy concludes with an optimistic note:
"That's why I say it's time for the Windows community to take a hard look at alternative platforms, like Linux and Mac OS X. It's over there, on the other side of the fence, that the real innovation is occurring. By contrast, Windows - including the over-hyped version 7 - is an architectural dead end. We, as a community, need to accept this fact and move on."
Microsoft had the chance to save their brand and all the loyal XP users out there, but they have decided to capitalize on failure by making it worse. The Microsoft paper The Business Value of Windows Vista is a plainly sad and desperate sounding document, but it is also a clear advertisement for Linux. By releasing documents like this Microsoft is making up for the lack of advertising that Linux has. Microsoft is accidentally becoming Linux's biggest promoter.
With Windows showing on Net Applications for April 2008 at 91.64%, Mac at 7.38% and Linux (all distros) at 0.63%, is Microsoft really worried about Linux?
Sure, Microsoft's share of the operating system market is falling. In October 2004 they had 96.40% of the market, so they have lost 4.76% in 43 months, a rate of 1.33% per year. But at that rate it will still be 31 years, in 2039, before their market share falls to 50%.
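That arithmetic can be checked with a few lines of Python (the figures are the Net Applications numbers quoted above):

```python
# Back-of-the-envelope check of the market-share arithmetic above.
start_share, end_share = 96.40, 91.64   # Oct 2004 vs Apr 2008 (Net Applications)
months = 43

rate_per_year = (start_share - end_share) / months * 12
years_to_50 = (end_share - 50.0) / rate_per_year

print(round(rate_per_year, 2))   # 1.33 percentage points per year
print(round(years_to_50))        # 31 years, i.e. around 2039
```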
But still Microsoft is showing some signs that they are worried about Linux. Not on the full-sized home or office desktop, but in the laptop market, particularly the rather large niche low-end, sub-traditional laptop market.
As Ryan Paul and Jon Stokes at Ars Technica recently wrote, the new Asus Eee PC sub-laptop is a "game changing" machine. If you have ever browsed websites on a cell phone you know how hard that is. If you have ever tried to lug a full-sized laptop around you know that they lack portability, especially in confined spaces, such as on airliners.
What was needed was a smaller laptop that has all the features of a larger one. There are lots of those "notebooks" around, but the prices are no breakthrough. Then along comes the Asus Eee PC.
It is amazingly small and portable at 8.9 x 6.5 x 1.4 inches and just 2.03 pounds (0.92 kg) but big enough to be usable, with a real keyboard. The specs are very modest:
The Eee comes with some pretty standard applications included:
The one feature that sets the Eee PC apart is that you can buy one at The Source today in Canada for Cdn$299.00. No, that is not a typo - a fully functional laptop for under $300.
How can they do that?
Well, one of the factors is that the operating system is a modified version of Debian-based Xandros Linux with a modified KDE desktop. This means that it is fast, small, stable, pretty much virus-free and, best of all, free of Microsoft licencing fees. It is hard to offer a PC this cheap when you have to pay Microsoft something in the range of $100 for a licenced copy of Windows.
So what has been Microsoft's response? They really can't do anything with Vista for this sub-laptop market. That operating system is just too big and bloated to run on that small an amount of RAM or with that small a flash drive.
No, instead Microsoft is offering manufacturers a deal on Windows XP.
According to Ryan Paul:
Microsoft will offer Windows XP licenses for $26 for developing countries and $32 for the rest of the world. In order to qualify for these deep discounts, products will have to be limited to a maximum of 1GB of RAM, 10.2 inch screens, and single-core processors clocked no higher than 1GHz (though there are apparently some exceptions). Products must also not have hard drives exceeding 80 GB in capacity and cannot have touch-screen technology.
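Just for fun, the restrictions Paul quotes can be written out as a simple eligibility check. This is purely illustrative - the function and parameter names are mine, not Microsoft's, and the quote notes there are apparently some exceptions:

```python
# The discounted-XP hardware limits quoted above, as a boolean check.
# Illustrative only; names are my own and Microsoft allows exceptions.
def qualifies_for_discount_xp(ram_gb, screen_inches, cores,
                              clock_ghz, hdd_gb, touchscreen):
    """True if a machine fits the stated limits for the XP discount."""
    return (ram_gb <= 1
            and screen_inches <= 10.2
            and cores == 1
            and clock_ghz <= 1.0
            and hdd_gb <= 80
            and not touchscreen)

# An Eee PC-class machine qualifies; a mainstream laptop does not.
print(qualifies_for_discount_xp(0.5, 7.0, 1, 0.9, 4, False))     # True
print(qualifies_for_discount_xp(2.0, 15.4, 2, 2.0, 160, False))  # False
```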
Of course most people will quickly point out that Windows XP is seven years old and will lose support in another eleven months. Microsoft have that one aced - they will make an "exception".
That's it. That is all they can do. They really don't have an operating system that fits this market.
So is Microsoft worried? I would say they are worried enough to extend XP and worried enough to discount XP as well.
Do they have a good solution to compete with free operating systems in this sub-laptop market? No. They missed the boat when they let Vista get so huge and bloated.
Microsoft had better be thinking hard about Windows 7, the Vista replacement that is in high-speed development. It needs to be a lot smaller than Vista and run on a lot less hardware if they want to gain any ground in this market.
The last word goes to Ryan Paul:
The Eee PC will likely have a noticeable influence on future mobile computing development. Companies are increasingly adopting Linux in the mobile space, and Linux developers and distributors are embracing this trend and accommodating rapid development. Intel is also pushing forward Linux-based budget mobile computing with the Silverthorne architecture.
It is becoming increasingly obvious to hardware makers that Windows simply isn't flexible enough to meet the requirements of the rapidly-evolving mobile market and that open-source software provides a clear path forward. The Eee PC is a stunning example of what a hardware maker can accomplish when mixing a highly compact form factor with a custom open-source Linux platform. With the Eee PC, consumers can get a taste of the future today.
I was recently reading a good essay by Ubuntu Forum staff member A.Y. Siu called The Linux Desktop Myth, which was published on 28 July 2006.
"Is Linux Ready For the Desktop?" is a question that has been asked for years. The sort-of corollary is the statement that "2008 (or 2005, or 2001) will be the year of Linux desktop breakthrough". Siu points out that it is all bunk. The difference is that he gives a very good analysis of why this is so and what is really preventing more people taking up Linux.
Siu has some good credibility on this subject, too. This is not because he is a computer sciences professor or a well-known Linux developer, because he isn't. He is just an experienced Windows user who has become a Linux supporter and forum volunteer, heck, he admits his wife uses a Mac!
The article gives a detailed history of his own experiences with various computer operating systems, including Windows, Mac and Linux.
Siu makes a pretty strong argument that if Linux, particularly Ubuntu, isn't ready for the desktop then neither is Mac OS X. Why? Because none of them are Windows. He makes the argument that Windows is hard to avoid - almost all computers come with it pre-installed. It is very difficult to get a new PC without Windows. Getting something else means having to install it yourself and that is just beyond most average "novice" computer users.
He does give some good analysis of what we can each do to help Linux adoption: improve documentation, welcome new users, contribute, educate, but be honest about Linux.
The point about "contribute" is worth making. Siu says that this means submitting bug reports, writing code, donating money, but definitely not whining on forums!
His point about educating while also being honest about Linux to potential users is worth making as well. He believes that Linux is not for everyone. He says:
"Recognize that Linux is not a cure-all and is not for everyone. Everyone should have a choice, and there are some times when you have to tell people a Linux distro may not be the best choice for them at this time. There are other times when you have to tell people it's worth a shot. Anyone who, like me, checks email, surfs the web, types documents, listens to music, organizes and manipulates pictures, and designs some websites (albeit poorly... but that's my fault, not Linux's) will be fine with Linux. If you love Lexmark printers, AutoCAD, Adobe Creative Suite, and Flash MX Studio, Linux may not be the best option for you right now. Don't be a crazy evangelist. I was at first, and I think I permanently turned a friend off from Linux. Just remember--one bad experience will leave a lasting first impression. Enjoy it. If people see you having fun with your computer, they may get curious--"Why are you having so much fun with that thing?" On Windows, I can get work done. On Linux, I can get work done, too... and have fun while doing it."
His article is fairly long, but it covers a lot of useful ground. I think it should be required reading for anyone thinking about switching from Windows to Linux. It is on par with two other articles that I have mentioned links for before:
A.Y. Siu wrote this essay when the current version of Ubuntu was 6.06 Dapper Drake. That was two whole iterations before we started running Feisty Fawn and five versions behind today. I think everything that he said in his essay still applies, the concepts are still quite valid and Windows is still hard to avoid. Getting people to even try Linux is still a challenge.
Writing in this diary about Ubuntu 7.04 Feisty Fawn, I said that Ubuntu would be of limited use at home and shouldn't be considered for work use:
Is it right for the office environment? It is nowhere even close to that in its present state. Right now it requires very patient, knowledgeable and sophisticated users who aren't afraid to troubleshoot their own problems and use the command line interface. For the average office environment you couldn't employ enough tech support people to keep an office working. Employees would quit in frustration...Will Ubuntu grow into a system that I would be happy to recommend to my boss as a replacement for Windows (without having to replace the whole office staff with hackers)? One day, I hope...It isn't there yet.
In updating my thoughts on Ubuntu at this point I would say that, with the great improvements incorporated in Ubuntu 8.04 Hardy Heron, it is now very usable for home and work, even for computer novices.
We have never given out Ubuntu CDs before, for the single reason that we were never comfortable enough with the limitations of Ubuntu in its Feisty Fawn or Gutsy Gibbon versions to recommend it to anyone. We are now giving our friends Hardy Heron CDs.
With Ubuntu 8.04, Linux is ready for almost any user's desktop, if the user is ready for Linux.
I wanted to follow up on our recent look at Compiz Fusion.
Occasionally when using Ubuntu 8.04 we have been getting some poor performance out of our PC. With a fair number of applications open (around 4-6) we were seeing application windows "graying out". Firefox would scroll slowly and with a notable time delay. Sometimes it happened with just two applications running, such as VLC and Firefox with a fair number of tabs open. It seemed to be a RAM problem.
We shouldn't be running into this sort of situation with 512 MB of RAM, but I suspected Compiz Fusion might be a RAM hog.
We set Compiz Fusion from "Normal" to "None" (i.e. "off") and surprise, the problem has gone away.
As a bonus the PC is now running much more "snappily".
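For anyone wanting to confirm a suspicion like this, a quick way to see whether RAM is really the bottleneck is to look at /proc/meminfo, which is where tools like `free` get their numbers. A minimal sketch (Linux only):

```python
# A quick check of available RAM: parse /proc/meminfo (Linux only).
# This is where command-line tools like `free` get their figures.
def meminfo_kb():
    """Parse /proc/meminfo into a {field: kilobytes} dictionary."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, _, rest = line.partition(":")
            info[key] = int(rest.split()[0])   # values are listed in kB
    return info

mem = meminfo_kb()
print("Total RAM: %d MB" % (mem["MemTotal"] // 1024))
print("Free RAM:  %d MB" % (mem["MemFree"] // 1024))
```

If free RAM stays near zero while the desktop is sluggish, memory pressure is the likely culprit.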
We have now successfully made Ubuntu CDs on both our Windows and Ubuntu PCs. We find it is handy to have a few extra discs, not just for our own installation and archive, but also to give away to people - just part of our attempt to promote Ubuntu!
I thought it might be useful to detail, step by step, how to get an Ubuntu CD. It is a bit more complex than I initially thought it would be, although there is an Ubuntu Help documentation page on the subject.
Here is the procedure:
Now you are ready to use the CD for a Live Session, for an Ubuntu installation or to give away to friends to try!
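One step in the procedure that trips people up is verifying the downloaded ".iso" file before burning it, which would have saved me some grief earlier. A minimal sketch of the check in Python - the file name and the expected hash below are placeholders, not real values:

```python
# Verify a downloaded ISO against its published MD5 sum before burning.
# (The filename and expected hash below are placeholders.)
import hashlib

def md5_of(path, chunk=1 << 20):
    """Compute the MD5 hex digest of a file, reading it in 1 MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# expected = "..."  # copy this from the MD5SUMS file on the mirror
# assert md5_of("ubuntu.iso") == expected
```

If the digests don't match, the download is corrupt and any CD burned from it will be bad.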
We have both been trying out Compiz Fusion recently, since it works on our Ubuntu PC with Hardy Heron. Compiz Fusion is described as a "3D composited desktop".
The Compiz Fusion website says that:
"Compiz Fusion aims to provide an easy and fun-to-use windowed environment, allowing use of the graphics hardware to render each individual window and the entire screen, to provide some impressive effects, speed and usefulness."
I know - it sounds like something written by the Marketing Division of the Sirius Cybernetics Corporation.
Even though our PC does not have a graphics card many of the basic features of Compiz Fusion do work on it. Ubuntu allows three levels of operation found under System→Preferences→Appearance→Visual Effects, which are:
The "extra" level is described on Ubuntu as:
"Provides more aesthetically pleasing set of effects. Requires faster graphics card."
Of course each graphics card setup, or lack of one, will allow certain effects and perhaps not others. All that is to say that what we get for Compiz Fusion may not be what others get. According to the website many of the effects available, such as the desktop cube, are plug-ins and may not be part of the basic Ubuntu installation.
In the "None" position we get windows that open and close sharply and quickly, no animations, just quick performance. In this mode the PC is actually pretty impressive, it works fast!
In the "Normal" position we get windows that open by zooming in from behind you and close by tipping down and zooming into the distance away from you. It isn't as disturbing as it sounds. Window edges show transparency when overlapped. Overall it is a pleasant set of effects and not too obtrusive. This is actually the Ubuntu default setting.
In the "Extra" position on our PC, Compiz Fusion just adds "wobbly windows". This is an odd effect that I have seen demonstrated in the past using Beryl, so it probably comes from that older project. Basically anytime you grab and move a window it wobbles like it is made of jello. If you grab it and shake it, the window wobbles a few times before settling down. Both of us find this setting annoying, or at least disconcerting.
There are lots of other effects available, although it isn't obvious how to set or control them. I believe that it is necessary to install an application called "Compiz Fusion Icon" to allow more advanced control of it.
As mentioned below we both agree that this application doesn't really serve any purpose other than "eye-candy". Even though Compiz Fusion's website says that it will "provide some impressive effects, speed and usefulness", we have only seen the first of those (depending what you personally consider "impressive") and not the latter two.
We are both sure that Compiz Fusion has been included in Ubuntu to compete with Windows Vista's Aero interface, which features 3D transparent stackable windows, etc. We have tried out Aero and don't like it either.
Perhaps we are just old fuddy-duddies who learned on command line systems back in the 1970s, but even today we both think that computers should be for getting work done and not driven by "form over function". They shouldn't have ugly and offensive interfaces, granted, but why spend all this brain-power and installation CD space on "eye-candy"? I guess it is all a "keeping up with the Gates" phenomenon.
We would rather have a faster PC and more useful applications than "eye-candy" interfaces.
We have both been working with Hardy Heron and trying out all the usual things that we do on the PC, as well as testing out our hardware (and finding it all works).
One of the things we have been doing is checking out the installed versions of the applications that came with Ubuntu 8.04, as well as the ones that I have installed, as part of the set up. I thought for anyone considering trying out Ubuntu, a list of what comes with it would be useful:
As noted we have already installed the following applications from the repositories:
A couple of notes are in order:
We use Clam AV from the command line as it is easier and more flexible than using the Clam TK GUI. They download together, so you get both anyway.
The OpenOffice.org suite does not come with the Base database application. It is probably the least-used OpenOffice.org application, so it looks like Ubuntu no longer includes it, to save space on the CD. Unlike OpenOffice.org Writer, which can open Microsoft formats like ".doc", and Calc, which can open ".xls", Base can't work with Microsoft Access's ".mdb" format. That is probably an advantage, however, as ".mdb"s aren't that good. OpenOffice.org Base is still available from the Ubuntu Hardy repositories for anyone to download, so it isn't an issue.
Ubuntu obviously comes with a lot of free software. Windows is fairly bare out of the box, with little application software, because Microsoft wants you to buy MS Office, of course (smart Windows users download OpenOffice.org for Windows for free). With Ubuntu you are ready to write documents, design slide-shows and edit graphics right away. With access to the free repositories through Add/Remove Applications and Synaptic, Ubuntu users can find tens of thousands more applications to do just about everything they need to.
I decided to take some time this evening and test out all my peripherals and see if they work on Hardy Heron. I have to admit that I am amazed - all were successful!
I tested our camera earlier and it works! There is some history here. The camera is a 2004 model Panasonic DMC-FZ2, so it is not new. When we first started on Feisty Fawn a year ago the camera worked, then we had an Ubuntu update and it stopped working. Now, in Hardy Heron it is working once again! I am impressed. I hope that it doesn't suffer from another "system breaking update" in the future.
I also dragged our Canon LiDE 20 scanner over to the Ubuntu box and did a couple of scans using the native XSane application. It works fine right out of the box, just as it did in Gutsy.
The printer was interesting. I have never tried it on this box, but have printed off a Live CD on my other PC. That was with Ubuntu 7.04 Feisty Fawn and it worked fine out of the box. Plugging it in on Hardy, it didn't work. It looked like it was working - it queued jobs, marked them as printed, etc - but nothing printed. The HP LaserJet 1018 was identified just fine, but wouldn't actually print. As always a quick search through the Ubuntu Forum showed up the answer. The solution was simply to run a terminal command:
$ sudo hp-setup
and the command-line set-up wizard found the missing firmware required. Now it works fine!
So, as I said, I am very impressed with Hardy Heron so far. The installation wasn't too bad. The only glitches were getting a good ".iso" file and then getting a valid CD out of it and then dealing with the new PolicyKit permissions regarding files and CDs. All in all a very instructive day.
It feels great to have a three year old used PC that can now do everything we need it to do, except run our Windows-only Garmin GPS. With Ubuntu 8.04 Hardy Heron I truly think that Ubuntu is easily a match for Windows XP at last and far, far ahead of Vista. Given the advances made in Hardy Heron, Vista will never catch up now.
This time I think I got it right!
After the previous attempt to download Ubuntu 8.04 and burn it onto a CD I am now a smarter user. After having consulted the Ubuntu Forum and also read the well-hidden, but actually pretty clear documentation, I had better success installing a clean copy of Ubuntu 8.04.
Just for the record here is the procedure that I used and how long it took:
The total time to do the complete reformat and upgrade was 5:43, which isn't too bad. Not all of that was labour time, some of it was spent doing laundry while waiting for downloads to happen.
We made up a list of applications that we use and wanted to reinstall:
Since all our websites are now converted to hand-coded XHTML we no longer need KompoZer and therefore didn't install it, although it is a great application and we still recommend it to anyone who needs a WYSIWYG web-page designer.
Hardy Heron comes with CUPS-PDF, the PDF printer and XSane, the scanner application already installed, so no need to install those.
The Compiz Fusion graphics effects application also comes with Hardy Heron, as it did with Gutsy Gibbon. With Hardy it works slightly differently, though. It wouldn't run at all on our graphics-card-free PC before. Now it runs fine on its "normal" desktop effects setting and will even work on the full effects "extra" setting, although it is a bit slow in that mode. To our mind Compiz Fusion is in the same category as Windows Vista's Aero interface - cute and entertaining, but doesn't add to productivity or utility - just "eye candy". I demonstrated the full "extra" setting to Ruth and she said "Barf, no thanks". We will leave it in the "normal" setting for now and do a full evaluation and write up on it at a later date.
My initial attempts to download the applications through add/remove weren't very successful. It seems that the repositories are pretty busy and so the connection kept timing out. Hardly a surprise, really, since about 8 million users will be trying to download applications this weekend. After a wait I got all the applications that we need installed.
The new permissions environment in Ubuntu caused me some confusion. In restoring all the files to the hard drive from CDs, they all arrived "locked" and had to be unlocked before they could be edited or deleted. I managed to unlock them all after some work. Some locked files that had been sent to the trash could not be deleted from the trash, moved or have their permissions changed. I solved that one by running Nautilus as "root" (through Alt+F2→"gksudo nautilus") and then navigating to the trash folder and deleting the files. The trash folder itself isn't very easy to find on disk.
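The plain read-only files (as opposed to the root-owned trash) can also be unlocked from a terminal with chmod. Here is a minimal sketch of the idea, using a throwaway /tmp directory rather than the real restore location:

```shell
# Simulate files restored from a data CD, which typically arrive read-only.
mkdir -p /tmp/restored
echo "backup data" > /tmp/restored/file.txt
chmod -R a-w /tmp/restored   # the "locked" state: write permission removed
chmod -R u+w /tmp/restored   # unlock: give the owner write access back
[ -w /tmp/restored/file.txt ] && echo "file is now writable"
```

On the real restored files the same `chmod -R u+w` on the destination folder does the job in one command, with no clicking through Nautilus property dialogs.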
I got into the PolicyKit permissions and I think I gave us permission to do what we need to. Without adjustments it won't even let you eject a CD!!! I will write more about that if it is an issue.
In testing out the installation, Hardy Heron seems to work just fine. The new graphics and the fonts look okay, internet works fine, even the new Firefox 3 Beta 5 works well.
Hardy Heron looks like a winner and so far an even better improvement over Gutsy Gibbon.
Here is something amazing - our camera works! I tested it and was able to download pictures from it. Hardy Heron recognizes the camera as a USB Mass storage device. I am impressed! I will have to carry out separate tests of the printer and scanner.
Now we can use Ubuntu 8.04 to get some work done!
Well one thing I can claim is that I am still learning about computers, and especially about Linux.
Ubuntu 8.04 was released yesterday. We decided a couple of months ago that we would do this one as a "clean installation" rather than an upgrade.
We decided that for two reasons:
The Ubuntu download page contains no instructions or links to any instructions at all. I was under the (mistaken) impression that I just had to download the file, burn it onto a CD-R and then boot to it to get into the installation menu. The rest I had done in our Linux class last year, so I knew it would be easy.
So I downloaded the 631 MB ".iso" file on our Windows XP PC, simulating how a complete beginner would do it. The file took 7 hours to download, mostly because with a new version out everyone wants a copy. I got the file, burned it to a CD-R, booted to it and nothing happened. The PC ignored it on boot.
What to do? I searched the Ubuntu website and found nothing useful, so I went to that great source of information, the Ubuntu Forum.
It seems that the ".iso" is a disc image and, on either Windows or Ubuntu, must be burned to CD with a tool that understands ISO images, not just copied on as a file. I learned that for Windows there is an open source tool called ISO Recorder by Alex Feinman, as well as another open source burner called InfraRecorder. Ubuntu apparently comes with a right-click ISO burner tool built in.
So I downloaded the 361 kB ISO Recorder file and installed it without a problem. I got out another CD-R and used ISO Recorder to burn the image onto it. That worked too. Then I tried booting to the CD. It sort of worked, getting as far as the Ubuntu menu screen, but when I tried testing the CD for errors it showed an error message screen, on multiple boots.
The forum suggested trying again with a very slow CD burn speed, since apparently the process is sensitive to that. ISO Recorder has a speed selector, so I set it at 1X on a new CD and went to bed.
In the morning it had made a CD. I tried booting to it, but it produced exactly the same error as the night before. That was three CD-Rs in the garbage. I wish they were at least recyclable.
Back to the forum. They forgot to mention that, according to Ubuntu Help, you should verify your downloaded file before burning it with the ISO tool. For Windows there is a special open source MD5 checksum tool from NullRiver called WinMD5Sum, which computes the file's hash so it can be compared against the published list of Hardy Heron checksums on releases.ubuntu.com. I installed that, tested the file and the hash was totally different from the one it should have been, indicating that the ".iso" file I had downloaded was corrupt.
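For anyone doing the verification on Linux rather than Windows, the same check is built in via md5sum. Here is a sketch of the idea using a tiny stand-in file instead of a real 631 MB image (the hash shown is the genuine MD5 of this sample file, not of any Ubuntu release); for the real thing you would compare md5sum's output for the downloaded .iso against the MD5SUMS file on releases.ubuntu.com:

```shell
# Create a small stand-in for the downloaded .iso and verify its checksum.
printf 'hello\n' > /tmp/sample.iso
computed=$(md5sum /tmp/sample.iso | awk '{print $1}')
published="b1946ac92492d2347c6235b4d2611184"   # value from the release's checksum list
if [ "$computed" = "$published" ]; then
    echo "checksum OK - safe to burn"
else
    echo "checksum MISMATCH - re-download the file"
fi
```

Had I known to run this comparison first, I would have saved three CD-Rs.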
So start again, wiser this time.
As I said I have learned a lot on this task so far:
Some others on the forum suggested using BitTorrent to download the file, but of course here in Ontario that is seriously throttled, so isn't a lot of help. The best suggestion is probably to download from a European mirror site in the evening here in Ottawa, when everyone near the site is asleep and it is likely to go more quickly.
I will try that again after the weekend when the demand should drop off a bit for Ubuntu 8.04, hopefully.
It's hard to believe that we have been using Ubuntu for a whole year but, as of today, we have. I was asked to write down a few of my impressions about this operating system and so I have set them out here.
The most immediate impression I had when I first sat down in front of our Ubuntu computer was just how simple the whole thing looked. Sure, using Linux-based systems often requires some cursory (excuse the pun) knowledge of things like command lines, but there just aren't that many reasons to resort to using them. I realize that the primary target market is the Windows/Mac user who is accustomed to desktop interfaces with point-and-click features, but Ubuntu's GNOME desktop will be familiar to anyone using Windows or Mac. This removes one of the scarier misconceptions some people have about using a Linux-based system: that it requires a lot of a priori computer knowledge. To be sure, Linux is a hacker's dream. Except for a few device drivers everything is open source, but there is no need to actually know how to write or edit code to customize or use Ubuntu. I certainly don't, and that brings me to the second feature of Ubuntu: just how user-friendly it is.
Thanks to the GNOME desktop, our Ubuntu computer works a lot like our Windows XP computer. If I want to call up an application, all I need to do is go to the taskbar (located at the top of the screen unlike Windows), click on the 'Applications' tab and select what I want from the drop-down menu...exactly the way I would with Windows, except that Ubuntu runs much more quickly. Files don't take ages to open and file transfers from our jump drive take seconds, rather than half an hour.
There was one aspect of Ubuntu that proved to be a sticking point, and that was how to connect it to the Internet using a dial-up modem. The internal modem in our tower just wouldn't work, so we had to purchase an external modem to get our computer on the Internet. Remember the reference I made earlier to using command lines? Well, it was only through the command line that we could get online, which goes to show that learning a thing or two about computing is never a bad thing if it helps you get the job done! We are now on high speed, which works beautifully with Ubuntu.
Finally, Ubuntu and the Internet work incredibly well together. There are practically no viruses in the wild that can infect our system, and any attempt to install software requires the user to input a password. That's not to say we have no anti-virus software, because we do and we run it. Of course, it's all free, and it doesn't get any better than that, now, does it?
Using OpenOffice.org, which comes with the operating system, I have been able to write my first book, work on my second and third ones and keep track of things like my bank account using Calc (the OpenOffice.org equivalent of Excel). Using the text editor, I have been able to write web pages. With the very handy Add/Remove option, the last menu item under the Applications tab, I have been able to download a lot of free astronomy software, such as Stellarium, Celestia and even KStars. New applications and games are added to Ubuntu's repositories regularly, which only increases the utility of the system. Adding and removing applications is easy and, best of all, quick. No more clicking on 'download' and then having to wait endlessly for the files.
With more and more people turning away from Windows Vista because it doesn't work for them, the number of Ubuntu users can only go up. If you have a Windows computer that won't run Vista, it'll run Ubuntu absolutely perfectly, easily and very quickly. I plan on staying with Ubuntu.
The recently certified "Vista Capable" class action lawsuit against Microsoft is having some interesting effects. For one thing it is being carefully documented by such mainstream media outlets as The New York Times.
The Times' take on the court case can be summed up by journalist Randall Stross's question: "where does Microsoft go to buy back its lost credibility?" The Times seems convinced that Microsoft is going to do badly out of the lawsuit and will end up having to compensate millions of PC buyers. The case is scheduled to go to trial in October 2008.
But the lawsuit has had an interesting secondary effect. Many documents are being released, including internal Microsoft e-mails that will form evidence for the case. These have now been made public and are interesting reading. For one thing they outline Microsoft's senior managers' personal experiences with Windows Vista.
It seems that Jon A. Shirley, a Microsoft board member and former president and chief operating officer, tried to upgrade two PCs to Vista and discovered that his printer, regular scanner and film scanner all lacked Vista drivers. He had to retain an XP PC to get any work done.
Steven Sinofsky, Microsoft senior vice president responsible for Windows, found that his new Vista PC lacked the drivers to run anything he had, saying: "this is the same across the whole ecosystem."
And Mike Nash, Microsoft vice president of Windows product management, got a new Vista laptop that would only run Vista basic and lacked the graphics requirements to run the applications he used on XP, including some basic applications like Windows Movie Maker. He stated: "I personally got burned...I now have a $2,100 e-mail machine".
Aside from the fact that the lawsuit trial looks like it will be a short one, it is interesting to see how Microsoft's senior management feel about Vista! I am sure there will be more revelations.
26 days until Hardy Heron is released.
Wow, it seems every time I check the news all I read is about people talking about Microsoft Vista! It is getting hard to avoid. I am not even talking about the techie press, either, but the mainstream general news outlets.
For instance this morning CBC Radio carried a story on the national morning news about an on-line petition started by ComputerWorld Canada and InfoWorld of San Francisco. The story also found its way onto the CBC website Users petition Microsoft to save XP. It seems that over 101,000 people have signed this on-line petition to convince Microsoft to not drop support for Windows XP. The first benchmark in that process will be the cessation of new PC sales with XP, which is scheduled for 30 June 2008. The article is about Windows XP, but everyone is really talking about Vista.
One petitioner, comparing the changeover to a forced eviction from a comfortable apartment, said:
"Millions of us have grown comfortable with XP and don't see a need to change to Vista."
Nobody even mentioned DRM! Some of the reader comments to the on-line article are worth noting, however:
"It has been proven beyond a reasonable doubt that next to windows vista Windows XP is the most peace (sic) of crap, insecure OS in the world! So bad in fact that it is the ONLY OS responsible for all of the Zombie bots that produce spam and child porn on a regular basis while getting their owners busted for said crimes. Thank Goodness for OpenSuSE Linux!"
"Vista was/is an over-reaction to security threats, real and imagined. It's bloated, slow and doesn't offer anything Mac hasn't been doing for years now. Just get a Mac. Wake up."
"I just switched to Ubuntu to avoid Microsoft altogether. No regrets."
As that indicates, lots of people are talking about Vista.
The CBC article actually refers to ITWorldCanada's Canadians speak out: Why we want to save XP which is the place to sign the petition to save XP from the chopping block. The gist of the article is summed up as:
"IT managers have spoken and their cry is loud and clear: Windows XP is still mission-critical."
That IT article also has some comments worth noting:
"No-one minds investing in new technology when it offers some payback. unfortunately there's no payback in switching to Vista, only costs!"
"Vista is a dead OS. I don't know anyone who in IT roles who actually prefer Vista over XP."
"This is a perfect argument for open source solutions. The companies mentioned in the article wouldn't be having the problems they do if their systems were based on open source platforms and solutions. Why use MS Office and its proprietary document formats, only to have them become out-moded and unsupported? Why use Windows XP when the company who sold it to you is going to stop supporting it, without releasing any technical materials, let alone the source code, that would allow another party to maintain it? Unfortunately, most users will have tantrums if they are asked to change their habits in any way, such as using OpenOffice instead of MS Office (one of my users requested the company buy MS Office Enterprise because OpenOffice Writer's thesaurus wasn't as big as he was used to in MS Word)"
Lots more people talking about Vista there!
While we are looking at Vista in the press, I found an interesting article on PC World, Windows XP vs. Vista: An Explosion of Opinion. It all started with a survey on the dropping of XP by Microsoft, which drew 3,500 responses and 1,000 comments! Incidentally, 83% of respondents were unhappy with Microsoft dropping XP. Again the story is about XP, but all the talk is about Vista.
The comments ran the whole gamut of responses. There were even a few positive ones:
"Personally, I prefer Vista at home. I realize it's somewhat bloated, but I like the look and feel of it for my home system. Given the hardware I have, it runs exceptionally well and multi-tasks better than XP. For work however, there is not reason to have Vista. Most machines we run are 3-4 years old and cannot keep up with Vista. To discontinue offering/support XP is ridiculous and a slap in the face to enterprises everywhere."
"Out with the old in with the new, Time to move on."
Of course there were many comments in favour of XP. There is no doubt people like it as an operating system:
"XP! IT WORKS! I DON'T LOVE IT - BUT IT WORKS! Please leave it alone. Note to Bill, if you must continue to develop, develop a perfect XP. Or a perfect ME or 2000 or 98 or 3.1! That would be quite a vista."
And lots of people talking about Linux, too:
"I have hardware that is only 3-4 years old that either won't work entirely with Vista or won't work at all with Vista. No thanks, Microsoft. Maybe it's time for a switch to Linux."
"Discontinuing XP will be the final straw that pushes me into Ubuntu."
"While I think that Vista is a decent operating system, I think that Microsoft is playing the convicted monopolist that it is by charging way too much for the new OS. Thus, I think that XP should remain available because it's also a good, mature OS that you can get at a much cheaper price. As for me personally, while I do still have Vista installed, I'm now using Ubuntu for much of my work."
Yup Vista is sure getting a lot of press these days - everyone keeps talking about it.
Perhaps the Irish author Brendan Behan was right:
"There is no such thing as bad publicity except your own obituary."
Of course, what we are talking about is what we have read on the Ubuntu website, which says "The new Ubuntu - 30 days to go" over a tantalizing background picture of a heron.
On Tuesday March 18, 2008 Microsoft finally issued its long-awaited fix for all that ails Windows Vista in a package of improvements called "Service Pack 1", or "SP1" for short. Since the problems with Vista have been so widely covered in the press, there has been lots of anticipation that SP1 would make Vista work better.
The general media CBC article Microsoft releases Vista update pack for download quotes Microsoft as saying: "SP1 improves Vista's reliability, security and performance."
The news there is definitely not positive, however:
So far, the software maker has determined that only a handful of programs will fail in some way after SP1 is installed.
Microsoft said SP1 will block several applications from running for "reliability reasons."
The list includes BitDefender Antivirus and Internet Security, version 10; Fujitsu's Shock Sensor hard drive protection for rugged laptops; two versions of Jiangmin KV Antivirus software and Check Point Technologies' Zone Alarm Security Suite.
The company said a few programs won't run on SP1, such as web application design program Iron Speed Designer, while others will stop working well, like The New York Times Reader application.
Needless to say comments posted by readers of that CBC article have been pretty negative:
I fundamentely (sic) object to a company making me pay, again, to be a beta tester.
"Reliability, performance, but failing programs...". Hmmm, am I EVER glad I decided to stay with XP when I got my new laptop a few months ago!
Hah! "Oh, Vista will be much better than XP, blah blah blah".
It seems the general public isn't impressed.
The computing world also doesn't seem that impressed with Vista, as Wikipedia reports:
Due to Vista's poor reception and continued demand for Windows XP, Microsoft is allowing continued sales of Windows XP. An unexpectedly high number of Vista users have downgraded their operating systems, with many having reverted their own Vista installs or even installing Windows XP (or other operating systems) onto computers which were preloaded with Vista. Many computer manufacturers have even begun shipping Windows XP restore disks along with new computers with Vista Business and Ultimate editions pre-installed, possibly to help small to mid-sized businesses for a limited time, as well as new computers with Linux pre-installed. A study conducted by ChangeWave in January 2008, shows that the percentage of customers who are "very satisfied" with Vista is dramatically lower than other operating systems, with Vista Home Basic at 15% and Vista Home Premium 27%, compared to the approximately 52% who say they are "very satisfied" with Windows XP and the 81% for Mac OS X Leopard. ChangeWave also reported that 83% of those intending to purchase Macs said that they "are choosing Macs because of Leopard and their distaste for Vista"
After CNET Labs tested SP1, Robert Vamosi concluded:
In general, CNET Labs found that Windows Vista SP1 offered a mixed bag of improvements. For example, Microsoft says that reading and writing files will be much faster within Windows Vista SP1. Tests performed by CNET Labs on a Dell XPS M1530 laptop showed that performance did improve in one scenario, remained steady in another, and even deteriorated in a third scenario. When transferring files from one folder to another on the same drive volume, the transfer time did somewhat improve. However, when reading those same files from an external drive, or writing them to the external drive, performance was the same or worse.
Do you need Windows Vista SP1? Yes and no. It's always good to install the latest (read: patched) code for any operating system. But downloading and installing the update will take some users a few hours without any visible or tangible improvements to their systems.
Clearly SP1 was needed but a lot of people seem to be pretty disappointed that it has not resulted in a net improvement in Vista.
Judging by Bill Gates' comments at the consumer electronics show, Microsoft is moving at top speed trying to get the Vista replacement "Windows 7" ready for shipping. Of course the regulators that oversee Microsoft (the company agreed to oversight as part of its antitrust settlements) have Windows 7 in their sights from an antitrust perspective. It will be interesting to see whether what they order will slow down getting this system to market.
So while Microsoft has been staggering from a late release of Vista (it was supposed to be out well before Christmas 2006, but didn't get released until 30 January 2007), to a very hostile critical reception, to a very mixed set of reviews on the patch that is SP1, what has everyone else been doing?
During this same period Ubuntu has gone through three complete releases, with the next evolutionary version, Hardy Heron, which will be 8.04 when released, ready to go on April 24, 2008. That is four new versions in little more than the time it has taken Vista to receive one package of patches.
The last word always goes to the buying public, of course. What are they doing? According to Net Applications, Windows continues to fall in user popularity. Between February 2007, when Windows accounted for 93.05% of users, and February 2008, when it was at 91.58%, Windows use fell 1.47 percentage points. This covers the year since Vista was released.
Since Net Applications started collecting data in October 2004, when Windows was at 96.40%, acceptance has fallen almost five percentage points. The general public is clearly voting with its wallet on Microsoft's performance and products.
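For the record, those drops are straight subtraction of the Net Applications figures, in percentage points; a quick check of the arithmetic:

```shell
# Year-over-year drop (Feb 2007 to Feb 2008) and drop since Oct 2004,
# both in percentage points, from the Net Applications figures above.
awk 'BEGIN { printf "since Feb 2007: %.2f points\n", 93.05 - 91.58 }'
awk 'BEGIN { printf "since Oct 2004: %.2f points\n", 96.40 - 91.58 }'
```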
Our stereo project described in our article on the VLC Media Player is now finished and it all works really well!
This took a while and a bit of planning. Here is the background: our PCs are located in our office space in our basement. Our home stereo is upstairs in the living room, located where most people's is. This is mostly older stereo components from the mid-1980s. Recently when I had the on/off switch replaced (it wore out!) on the receiver/amp the old fix-it guy said "hang onto this one, they don't make any stereo components like they used to".
We have a collection of CDs, cassette tapes and even 33 rpm and 45 rpm records. Yes, we even have a working old turntable to play them on. The problem is that all our new music these days consists of free (and legal) downloads from Jamendo, which means the files are all MP3s. We can listen to them on our PC speakers, but those are small and located in the basement. It would be great to bring this music upstairs. The only way we had found to do this was to burn the MP3s onto a CD as CD-audio files. That is okay, but there had to be a better way that didn't involve making lots of CDs.
We examined all the options, including getting an MP3 player for the stereo system. There aren't many available except the miniature iPod-type variety. These don't have a good reputation for durability or reliability, they aren't cheap and they run on batteries.
Since we have a good MP3 player in VLC installed on both our Windows and Ubuntu PCs, it seemed to make more sense to get the PCs and the stereo closer together so that the PCs can input into the stereo. We opted to move the stereo into the basement, rather than move the PCs into the living room.
This also meant moving the LPs, cassette tapes, CDs and everything else downstairs, but leaving the big speakers (also 1980s relics) upstairs and running speaker cables to them from downstairs. We already had a smaller set of "bookcase" speakers in the basement hanging on the wall. We just had to run enough cabling to connect everything, which we did.
Altogether this wouldn't have taken more than a couple of hours, but we decided that since we were moving all this stuff around that we might as well finally paint the basement. That added another full day to the project!
The end result was a freshly painted basement with lots of speaker cable running through the ceiling, but it all works.
Now we can play 33s, 45s, tapes and CDs on the traditional players and every kind of digital audio file on VLC from either PC. Fortunately the old receiver has enough input jacks for the 5 sources and more to spare, plus AM and FM radio too! The PCs output audio through 1/8 inch plugs, while the stereo receiver accepts inputs via dual RCA plugs; Radio Shack provided the converting "Y" cords to make that work.
All the components are very well used, connected together by new cabling, but it all works well and we have music all over the house now from many different eras.
VLC really has the capability to put out music that sounds good, even on the big stereo speakers. The limitations are in the quality of the music file, not the player, but the same goes for cassette tapes or LPs, for that matter.
Overall I am impressed with VLC. The interface is very simple and intuitive, the graphic equalizer works well and the sound quality is fine. A great piece of free software.
Back in January 2008 I uninstalled the annoying Apple Quicktime media player and installed VLC instead. What a change - it works really well on Windows and the interface is very simple to use.
VLC has great features, such as a graphic equalizer and will run just about any movie or audio format. It is a great piece of software, better than anything commercially available.
As part of our preparation for upgrading to Ubuntu 8.04 Hardy Heron in April, which will be via a reformat and fresh installation, I have been making up a list of the applications that we want to install. Because of our success with VLC on Windows and also because we like to keep both the Ubuntu and XP PCs configured as much alike as possible, I recently installed VLC on the Ubuntu PC.
VLC is available through the regular Ubuntu Gutsy repositories on the "Add/Remove Applications", so it is a snap to install, just check a box and you have it. This results in the familiar VLC orange traffic cone icon appearing on the "Application" menu. The current Gutsy repository version is VLC 0.8.6c, which is two versions behind the latest version 0.8.6e. Presumably Ubuntu will catch up with Hardy Heron next month.
VLC works just as well on Ubuntu as it does in the Windows environment. The play list is really easy to set up, just drag and drop files, double click the first one or hit "play" and it goes.
Videos work really well, too, with quick full screen access available.
We have been re-thinking our home stereo use as we have been downloading a number of free and legal albums from Jamendo. There is some really great music available there.
Right now, because our PCs are in the office and the stereo is in the living room, there isn't an easy way to play the MP3 downloads, except on the PC small speakers. We have put a few albums on CD as CD-audio files, but that means burning lots of CDs. We looked at getting an MP3 player for the stereo, but there really are none for component stereos, just the little battery-powered iPod-style ones. Of course we have a great MP3 player - VLC. So we are moving our stereo equipment into the office and running the speaker cables back around the house to the existing speakers. The PCs will plug into the stereo receiver and then we can play MP3s on VLC and pipe it into the big speakers! All we need are a few cables to complete this project.
Our hats are off to the open source team working on VLC. It is a great piece of software, easy to use and better than anything available on the commercial market. Best of all, it is open source, free software.
Back in December I was writing about how Ubuntu needs a PDF printer similar to PDFCreator, which is sadly only available for Windows. I did look at some of the complaints about CUPS (the Common Unix Printing System) and decided to see whether there were any other options for creating PDFs from outside OpenOffice.org applications, such as from web pages.
It turns out that there really aren't a lot of alternatives. I looked at Loop-to-PDF, which is a Firefox add-on, but it requires uploading your documents to a commercial website and then combining them there on their server. Not too keen on that way of making PDFs.
In more recently looking at the Ubuntu Forum it seems that the more experienced users are still recommending CUPS-PDF, so I thought I would give it a try.
I didn't realize it at the time that I wrote my previous article, but CUPS-PDF comes already installed on Ubuntu. Alternatively, it can be installed from Synaptic, the main Ubuntu package manager.
I am pleased to report that CUPS-PDF works just fine. You use it just like PDFCreator in Windows - select it as the "printer" for creating a PDF and in a flash it makes the page into a PDF and saves it in a new folder in your home directory, labelled "PDF". You can then rename and move the PDF document elsewhere if you need to. It works great! Truly a Linux equivalent to PDFCreator.
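For the command-line inclined, the same virtual printer can be driven with lpr. This is an illustrative sketch only; it assumes the CUPS-PDF queue is named "PDF" (as it appears in our printer list) and that a file called mypage.txt exists:

```shell
# "Print" a text file to the CUPS-PDF virtual printer; the resulting
# PDF lands in the PDF folder in your home directory.
lpr -P PDF mypage.txt
# If the queue name is different on your system, list the printers:
lpstat -p
```

Either way, GUI or terminal, the result is the same: a PDF appears in the "PDF" folder, ready to rename and move wherever you need it.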
I guess the lesson here is to check what you already have installed - the question may already be answered before it is asked.
I recently read Ryan Paul's 20 February 2008 article entitled Next Ubuntu release to be called Intrepid Ibex, due in October. Paul seems to get the best scoops on Ubuntu and his concise articles help the rest of us see what is coming.
Basically, with Ubuntu 8.04 "Hardy Heron" now only 50 days from release and its features frozen, the developers are starting to think about the next version of Ubuntu.
The fall 2008 Ubuntu release is already proceeding under the project name "Intrepid Ibex", which will be Ubuntu 8.10 when it is released, an event that is currently scheduled for 30 October 2008.
The article discusses the focus for 8.10 and therefore some of the features and improvements that we in the user community can look forward to.
It looks like we can expect improvements to:
In announcing Intrepid Ibex, Mark Shuttleworth said:
"Our desktop offering will once again be a focal point as we reengineer the user interaction model so that Ubuntu works as well on a high-end workstation as it does on a feisty little subnotebook. We'll also be reaching new peaks of performance—aiming to make the mobile desktop as productive as possible. A particular focus for us will be pervasive Internet access, the ability to tap into bandwidth whenever and wherever you happen to be."
It looks like those of us who use Ubuntu on the desktop will benefit from the work on this version, which is good news, not that we have been suffering with Gutsy Gibbon.
The key place to hammer out the details of what the fall Ubuntu will do will be Prague, where the developer conference will be held from May 19 to 23.
Paul offers his own opinion on how Ubuntu is doing in 2008:
"Ubuntu is a climbing star in the Linux community and has attracted considerable attention from prominent commercial hardware vendors as well as enthusiastic end users."
This all seems like good news for those of us who are committed Ubuntu users.
As always we will try to keep up with future versions of Ubuntu and report on the user experience here in Our Ubuntu Diary.
Vista just always seems to be in the news these days! If it isn't about the EU making a ruling against Microsoft for unfair trade practices, it is a class action suit.
Today the CBC carried a story that Microsoft is lowering prices on Windows Vista. On the surface it sounds like a good thing, if you like Vista I guess. But there is more to the story.
First off, the price cuts of 20-48% apply to "the Home Premium and Ultimate versions of Vista, in both their full editions and the editions that upgrade an older or more basic operating system". Obviously this won't affect very many people. Very few people have a "Vista capable" box just sitting there waiting for a CD to install Vista on; most Vista PCs come pre-installed from the factory.
So why lower prices?
Brad Brooks, a corporate vice-president for Windows marketing at Microsoft, said in an interview that the company has since tested lower prices and found "product was moving much, much faster."
Brooks said he expects so many customers to buy Vista at the new prices that the price cuts will increase Microsoft's revenue, not subtract from it.
Wow, Microsoft has a vice-president who thinks that lower prices will increase sales? I am shocked. Maybe he took a first-year economics class?
Of course it is likely that this move to lower prices has little to do with marketing and everything to do with a class action lawsuit against Microsoft. The suit has just been allowed to proceed in the US by a federal judge. The suit is all about Microsoft labelling PCs as "Vista Capable" before Vista was available. The concept was that to avoid a drop off in sales while consumers waited for the very behind-schedule release of Vista they could buy a "Vista capable" PC with XP and then upgrade the operating system through a coupon when it finally came out. The problem is many of those PCs weren't able to run Vista or at least weren't able to run more than the "Home Basic" version, which doesn't have the Aero interface or other snazzy features. People felt ripped off and now want their money back.
As the article describes, Microsoft allowed almost anything to be labeled "Vista Capable" to not slow down sales. The Microsoft executive in charge of Windows, Jim Allchin, said "we really botched this."
The new lower price seems to be an attempt to address some of the allegations in the class action suit that while the coupons got consumers the downgraded "Home Basic" version of Vista, the higher-end versions of Vista were overpriced.
It is interesting to watch the whole Vista issue play out in the press. Of course, at any price Vista is overpriced. Even for "free" it would be no bargain. There are better operating systems available and they are free.
I think the smarter people who bought a "Vista capable" PC with XP and an upgrade coupon wouldn't bother to waste their time getting involved in suing Microsoft for a $600 PC. They would just re-install XP and have a pretty fast PC with an okay operating system. I guess for some Americans going to court is their only real opportunity to get out of the house now and then?
Perhaps the last word on the subject of whether Vista is any good should really go to Bill Gates himself. At the Consumer Electronics Show 2007 he was asked what product he wished he could have polished more before it was released. His answer: "Ask me that after we ship the next version of Windows and I'll be more open to give you a blunt answer." Ya gotta give him credit for being honest about not liking Vista himself, sorta.
Back last spring when we started using Ubuntu, I remarked about it being an "intended replacement for Windows", at least for us. Some people have questioned whether that is the case or not, but I think I found the definitive answer to that question.
Launchpad is the Ubuntu bug-reporting system. I found the "number one reported bug" there. Mark Shuttleworth, who started Ubuntu and who runs Canonical, Ubuntu's parent, filed the bug report on 20 August 2004 just a couple of months ahead of the first release of Ubuntu 4.10. Shuttleworth wrote:
Microsoft has a majority market share in the new desktop PC marketplace.
This is a bug, which Ubuntu is designed to fix.
Non-free software is holding back innovation in the IT industry, restricting access to IT to a small part of the world's population and limiting the ability of software developers to reach their full potential, globally. This bug is widely evident in the PC industry.
Sounds pretty clear to me that Shuttleworth started this project as a Windows replacement.
So the next question that should be asked is probably, "is it there yet?"
Shuttleworth was quoted in September 2007 as saying:
"it would be reasonable to say that [Ubuntu] is not ready for the mass market."
However, that was during the time of Ubuntu 7.04 Feisty Fawn. Ubuntu 7.10 Gutsy Gibbon has definitely moved things further along towards Shuttleworth's goal. About 7.10 The Economist said:
"No question, Gutsy Gibbon is the sleekest, best integrated and most user-friendly Linux distribution yet. It’s now simpler to set up and configure than Windows."
I think it is getting there and for quite a number of us is proving to be a good replacement for Windows.
Now we just have to let those PC users with the "Vista hangover" know.
The next version of the Ubuntu operating system is well on the way to being ready for delivery. It will be called version 8.04 (meaning "April 2008") and is currently being built under the project name "Hardy Heron". The release is scheduled for 24 April 2008.
Ubuntu 8.04 is probably going to be a landmark version in the history of Ubuntu. It will be the eighth Ubuntu release and the second "LTS" or long-term support version. The LTS versions are supported for three years on the desktop and five years on the server, instead of the usual 18 months. This should mean that 8.04 will be an especially functional and complete release, since the developers will have to live with it for a longer period of time. The development aim is for it to be "stable and resilient" rather than to introduce new features.
Ryan Paul of ars technica has recently written a sneak-peek review of Hardy Heron Alpha Version 4, which is the fourth initial test version of the release. The article gives some interesting first hand information of what to expect in April.
Some of the highlights of Ubuntu 8.04 will include PulseAudio sound, the new GIO file-handling layer and PolicyKit integration.
Paul summed up his impressions of working with this early pre-release version by saying:
Although many of the significant architectural features like PulseAudio and GIO are still in transitional stages and aren't fully functional yet, Ubuntu 8.04 alpha 4 is still very impressive. I'm a big fan of D-Bus, and I'm very pleased to see it being adopted throughout the entire desktop stack in core components. I'm also very impressed with the relative completeness of PolicyKit integration, and I'm looking forward to the promised performance improvements and support for pausing file transfers that we should get when GIO is more mature. Many of the major pieces are falling into their proper places for Hardy Heron...
It all adds up to some excitement in the Ubuntu world. This new version is looking like it will be very functional, which is, of course, what all of us users are looking for, after all. We use our PCs to get actual work done (instead of fussing with the boxes) and this upcoming release of Ubuntu looks like it is aimed at users like us.
Overall I like what I see coming in the Ubuntu world. While Microsoft keeps making its software more bloated, less functional and generally worse over time, the open source community is building new versions of Ubuntu that easily surpass Vista in functionality, stability and usefulness, all without giving anything up on aesthetic appeal or ease of use ... and all for free.
We will have more on Hardy Heron when there is more info, and especially when we test fly the final release version ourselves after April 24th.
I cannot deny that working with Ubuntu is a learning experience!
The latest challenge revolved around that little plug-in that allows website animations to work, Macromedia Flash, or as it is now known, Adobe Flash.
This all started because of JumpCut, the on-line video editing application described in our Windows Open Source/Freeware Project. JumpCut allows you to upload video clips, edit them, add titles and soundtracks and then post them. It works pretty well, but the website works entirely through Flash. This is not unique: without Flash you can't get very far on many websites, including You Tube.
I should mention that while the use of Flash in a Windows or Mac environment is not really questioned, it is somewhat controversial in the Linux world. The controversy revolves around the fact that there is no "free software" version of Flash. That is not to say that you have to pay for it - all Flash plug-ins are "no-cost software". But Flash, even for Linux, is proprietary software, not open source. That makes the open source purists nervous, as they will not run non-free software on their PCs. I can understand the reasons for doing that, from a philosophical perspective. The problem is that there are no free alternatives - only Flash does what Flash does. Without it you can be politically pure, but you can't watch You Tube and you can't edit videos on JumpCut, among many other things. It is a trade-off.
When we first started using JumpCut it worked pretty well on both Windows and Ubuntu. Then, after about a week or so, it wouldn't take uploads from Ubuntu. Everything else worked fine - editing, viewing, just no uploading.
I suspected that the problem was related to Flash. What I guessed was that the good folks at Yahoo, who run JumpCut, made some kind of upgrade to the site that didn't work with older versions of Flash. This turned out to be correct. Fixing the problem was harder.
The current (as of this writing) version of Flash was newer than the one we had: our Windows PC had the latest version, while the Ubuntu PC was still running an older release. Should be easy to upgrade, right? Not on Ubuntu. Normally you have to wait until the repositories have a newer version available and then it arrives as an update. Forcing the issue complicates things, as we saw with Open Movie Editor, although it can sometimes be done. With Windows it is easy to download the latest version of any software; with Ubuntu it is not so easy. It is certainly easier to wait until the new version comes automatically through the update process.
Nevertheless, I did try downloading the latest Flash for Linux, but the command-line installation didn't work correctly and I never got it installed.
That was the bad news. The good news was that there was a Firefox update package the following week and that brought the latest version of Flash too. It installed fine, and when I checked it on the Flash test website, sure enough, the latest version of Flash was installed. I tested Ubuntu out on JumpCut and it worked! This confirmed the hypothesis that Flash was the problem.
Everything was good for a few minutes. Then, suddenly, JumpCut wasn't working once again. I retested the Flash version and found that the browser was reporting the old version again. Amazing! How did that happen?
This sent me to the Ubuntu Forum to try and find some help. As noted in that discussion I got some good help, as I almost always do on the forum. The full version of what we did is there on the forum. The short version is that I had two versions of Flash installed on the PC and the older one seemed to override the newer one, which was installed in a separate location. We managed to delete both and then re-install Flash from scratch successfully.
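For anyone facing the same problem, the fix boiled down to finding every copy of the plug-in on the disk and deleting the stale ones. Here is a minimal sketch of that hunt; to keep it harmless it uses a throwaway directory in place of the real plug-in locations, so the paths are illustrative, not the actual Ubuntu ones:

```shell
#!/bin/sh
# Sketch only: a temporary directory stands in for the real plug-in
# locations (on a real system you would search places like /usr/lib
# and your home directory instead).
demo=$(mktemp -d)
mkdir -p "$demo/system-plugins" "$demo/home-plugins"
touch "$demo/system-plugins/libflashplayer.so" "$demo/home-plugins/libflashplayer.so"

# Step 1: list every copy of the plug-in. Two copies means one can
# shadow the other, which is what happened on our PC.
found=$(find "$demo" -name 'libflashplayer.so' | grep -c .)
echo "copies found: $found"

# Step 2: delete the stale copy (here, the one in the home directory),
# leaving the browser a single, current plug-in to load.
rm "$demo/home-plugins/libflashplayer.so"
left=$(find "$demo" -name 'libflashplayer.so' | grep -c .)
echo "copies left: $left"

rm -rf "$demo"
```

The same `find` command, pointed at the real directories, is how you can tell whether your own PC has picked up a duplicate plug-in.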
This time it worked! As a result our Ubuntu PC can now upload and edit videos on JumpCut. There is a list of our JumpCut videos on our home page.
So what did I learn from all this?
As time goes by I am getting more and more positive about Ubuntu. The operating system is improving quickly. With Windows each new release is a crap-shoot: will it be an improvement like Windows 98→Windows XP, or a disaster like Windows 98→Windows ME or Windows XP→Vista? With Ubuntu each version is better than the last. Besides, Ubuntu versions are only six months apart, not five years!
Open Movie Editor 0.0.20080102
Type: free software
Use: Movie editor
Made by: Richard Spindler
No Wikipedia Page
As described in our earlier posting on video editing we have tried out a number of applications to take our video clips and turn them into movies. One of the applications we tried was Avid Free DV for Windows. That was unsatisfactory and so we looked through the Ubuntu repositories to see what was available there. As described earlier, that is how we found Open Movie Editor (OME).
OME was created and is maintained by Richard Spindler of Austria. Richard provides the downloads, runs a forum and also has a tutorial page and even a You Tube video on how to use the application. The package is pretty complete, especially for open source.
In the previous entry I described the tribulations of getting the latest version of OME for Ubuntu. It took a bit of work, but we succeeded; after a couple of days we had the current version of OME, version 0.0.20080102.
Compared to Avid Free DV, OME is a breeze to use. Clips can be called up from the media browser and then dragged and dropped into the video timeline. They can then be moved around to leave black space between them or to overlap them. The overlaps automatically become fade transitions from one clip to another. Beautiful and simple.
Working with the timeline is made very easy by an innovative scrollbar that can be easily expanded and contracted to provide a close look at sections of the video track. Very simple and effective.
The title function does not work with OME out of the box. I thought it odd that the entire title selection tab and the commands were all "grayed out" right from installation. The forum provided the answer - titles are an option using the Frei0r Video Plug-in. I chose not to download Frei0r and instead made my own titles as ".jpg" files on GIMP. Stills can be inserted into the video timeline, just like video clips can be. By stretching them out or contracting them you can make them longer or shorter. OME also acts as a player - you can run the movie to see how it looks or grab the cursor and pull it across the timeline to see individual parts. It all works great.
Clips are easy to cut and split, and unneeded portions are easy to dispose of. Parts of clips that have been cut can be removed from the timeline simply by dragging them to the trashcan, and they are gone. This doesn't affect your original clip files, so if a mistake is made you can always drag another copy of the clip onto the timeline.
OME also saves your current project and all other projects automatically as you work - a nice touch, really.
Turning your work into a finished product is easy. Select File→Render and then select either the default or custom parameters to turn it into a movie.
OME is just about perfect for a user looking for a very simple but effective tool for editing videos. We have only found a couple of problems with it. First, sometimes when working on a movie the "play" function ceases to operate. This isn't a real problem, as you can manually move the cursor to see where you are and how it looks. Closing and reopening OME seems to solve the problem.
The other problem is that, at least when working with 10 fps ".mov" files, the finished product is huge. Typically taking three clips totalling about 10 MB of video and combining them results in a finished product of 300 MB. Since You Tube and JumpCut both have limits of 100 MB for uploads I haven't been able to post anything edited on OME to them. I have tried working with the "custom render" parameters to see if I can make them a bit smaller, but that hasn't been successful.
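For what it's worth, the arithmetic behind the upload cap is simple: a 100 MB limit fixes the maximum average bitrate (video plus audio) a movie of a given length can afford. A back-of-envelope sketch, where the seven-minute duration is just an example:

```shell
#!/bin/sh
# Back-of-envelope: the highest average bitrate that keeps a movie of
# a given length under a 100 MB upload limit.
limit_mb=100
duration_s=$((7 * 60))   # an example seven-minute movie
# megabytes -> kilobits, divided by the running time (integer arithmetic)
kbps=$(( limit_mb * 8 * 1000 / duration_s ))
echo "max average bitrate: ${kbps} kbps"
```

Any combination of custom render parameters whose combined bitrate stays under that figure should produce a file that fits; my attempts with OME's custom settings haven't managed it yet.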
OME makes great movies for off-line use. The finished product is just too big for on-line use.
In the meantime we have been creating videos on-line using JumpCut. It works fine, within the limitations described in that article. The biggest limitation is that it stopped working on Linux a few weeks ago.
Have a look at our Home Page for a complete list of videos that we currently have posted on You Tube and JumpCut.
Am I the only one who has noticed this? I just discovered something really big that is happening all around us. I think most people haven't noticed it yet, but they are going to be waking up to it soon.
The whole fabric of our society is being changed, dragged away from corporate interests and profits towards a new age of personal freedom instead. This is not something that is coming in the distant future, it is here now. It is, just as I said, that most people haven't noticed it, even though it is all around them.
The epiphany came for me a few days ago. I was working on a video I was trying to edit. I had a bunch of clips of our quadracycle taken last summer. I had shot some of them and Rachael had shot some, too. They needed combining into a movie with titles and such. I had tried working with Avid Free DV, a (no longer available) commercial freeware video editing application, but it wasn't working out well. I created a version of the video in an open source application called Open Movie Editor (review to come!). That worked perfectly, but I had some file size issues (huge) that prevented me from posting it on You Tube. Finally I settled on using JumpCut, the free on-line video editor. I uploaded the clips and titles, edited them and created a nice little movie out of it. A couple of the clips had loud aircraft noise in the background, so I just selected the audio "off" and posted the video. It looked pretty good.
I sent the link to it to a quadracycling friend. He liked the video but asked why there was no sound. I realized that I couldn't get away with a silent movie in 2008!
What to do? How about put a music soundtrack on it instead of background noise? That sounded like a good idea. We have a few commercial MP3 files but those are all copyrighted, you can't use them on a video without paying a fee or breaking the law.
I knew all about the music downloading/file sharing scene, from Napster on up, and have avoided it. Basically people have been making music available on the internet for free for a long time. The problem was, as most people know, it was other people's music and it was a copyright infringement and illegal. There have been lots of prosecutions, and people have been given huge fines or gone to jail for music downloading in recent years as the music industry tries to protect its falling profits from people sharing illegal files instead of buying music. Personally I consider the whole concept a dead end - downloading illegal music is a waste of time, like pirating software. If those bands and record companies don't want you to have their music for free and you aren't willing to pay for it, then forget them.
Jamendo is something else entirely. It is based in Luxembourg and the name is a contraction of "Jam-Crescendo". The music they have available for download is all free and it is all legal! It has been uploaded by the musicians under Creative Commons licences. Creative Commons licensing allows anyone to download the music, pass it on to anyone (subject to the same licence) and use it for non-commercial purposes just by giving credit, all for free. Jamendo pays musicians 50% of the advertising revenue and 100% of donation revenue. For most albums that probably doesn't amount to much.
So you would figure that there wouldn't be much music there, would you?
As of 27 January 2008 they have 6,658 published albums, 60,489 album reviews and 243,946 active signed-up members. And you don't even have to become a member to download. You can search for music, play samples on-line and then download the album zip files. They have rock, pop, jazz, classics, world beat, everything. I was amazed.
The first question most people would ask is "why would anyone post their band's albums there for free?"
There are lots of reasons to do that. One of the most compelling is that most bands signed to major record deals don't make much money from their album sales, whether on CD or as downloads. The record companies seem to make most of the money on the music; the bands make money on tours and concerts. So why not give the music away for free, get it passed around to the maximum extent possible, increase exposure and then make money touring, giving concerts and selling T-shirts? Guess what - you can entirely cut out the record company and retain total creative control. All via Creative Commons and Jamendo. There are many other websites like that one; it is all a totally different model!
So the next question is probably: "So the music can't be very good, right?" I am sure the quality varies quite a bit, but I downloaded two albums and I am impressed - the two I got aren't bad at all.
The first was a jazz album called "Gonzo Jazz" by a band called Margin of Safety from Finland. I uploaded a track called "Funky Kaftan" from their album to JumpCut to finish off my video. Quadracycling is a lot better with a jazz soundtrack!
That solved that problem.
The Jamendo download not only came with the music track files, but the Creative Commons licence to prove that the download is not illegal and even a ".jpg" file of the cover art. I formatted the cover art in my open source desktop publisher, Open Office Writer and printed that for the CD. I could have just as easily put the MP3 files on an iPod or similar audio player.
I am amazed - I now have a free album, totally legal and all for the cost of a blank CD. It is a great album, too. Played on my regular stereo it sounds super.
Okay I got two free albums, big deal, right? As I mentioned earlier, it occurs to me that something bigger is going on here.
For me it actually started with Wikipedia, the free encyclopedia. I heard about that in early 2005. I looked up a few subjects and liked what I saw - a large number of people were collaborating and building a pretty darn good encyclopedia there. It wasn't perfect, but the best part is if you find a mistake anyone can correct it. I started working on it and have now written dozens of articles from scratch and uploaded 350 photos. It is all based on the idea that knowledge should be free and freely available.
I am not sure that the old time traditional encyclopedia companies are happy with the existence of Wikipedia, especially since, as a CBC article reports: "a 2005 study in the journal Nature found Wikipedia was about as accurate in covering scientific topics as Encyclopedia Britannica. Based on 42 articles reviewed by experts, the average scientific entry in Wikipedia contained four errors or omissions, while Britannica had three."
After Wikipedia, I discovered free software. Much of this journal is about free software, so I won't dwell on those experiences. Let me just say that the philosophy is similar - that software should be available for everyone at no cost and that anyone should be able to read the source code, experiment with it and make it better. Much is made of the fact that free software protects your freedoms. Like the music situation, the money is made elsewhere, on support and customization of the software. Here in 2008 there is little doubt that the open source process produces much better quality software and much, much better value than the for-profit commercial approach. You only have to compare Ubuntu to Vista to see that.
Open source software is like Wikipedia, it isn't necessarily perfect, but it is getting better all the time and everyone can help out. Even those of us who aren't programmers can promote it, write about it and provide information to people, all to get the word out there. It is a community effort.
But, like the record companies and the encyclopedia salesmen, there are people who are unhappy about open source software. The SCO Group, Inc recently lost a major lawsuit trying to stop Novell and IBM from using Linux, claiming that it was a copy of Unix and that they own the rights to Unix. It turns out that it isn't a copy and, as well, SCO doesn't own Unix; Novell does. SCO is now in bankruptcy. Microsoft has been making claims that open source software violates their patents. They are obviously trying to protect their turf, but then Microsoft is entirely based on "closed source" software and secrecy. So far they haven't provided any proof, so it just looks like bullying behavior from the world's most-sued software company. Microsoft keeps losing court cases, particularly in Europe. They are getting eaten by the wave of open source software, which produces better quality software for free, and a lot of their reaction looks like that of a dinosaur caught in a tar pit. Much of the effort in the open source community seems to be undertaken solely to break the Microsoft hegemony.
The globalization movement that started in the 1980s really has allowed large corporations to make huge profits by moving operations off-shore and then selling products back to the former employees that they laid off. It exploited the employees in the third world who work for next-to-nothing and can't even afford to buy the products that they make. It also exploited the people in the former manufacturing location, because it aimed to sell them something they used to make and put nothing back into the country they had abandoned. The riots that we have seen at the G8 conferences along with other protests represent people's unhappiness with the success of the corporate agenda. Don't even ask which corporations are making all that money from the invasion of Iraq.
It seems to me that what we are seeing here is another form of action, instead of mere protest. It doesn't stop at free encyclopedias, software and music. People are making their own TV shows, posting them on You Tube and the number of people watching network TV is dropping. Over the last 20 years the majority of light aircraft built in North America have been home-built aircraft, not made by big companies. The same sort of thing is happening all over. And it isn't always subtle.
The Free Software Foundation's Bad Vista Campaign says: "There is a battle underway between those who value freedom, and corporations such as Microsoft who wish to profit by taking that freedom away. DRM and absurd licenses are at the heart of that battle. Please join us on the side of freedom by saying NO not just to Windows Vista and other DRM-enabled products, but to proprietary software in general. Instead, use non-DRM, “free” software such as the GNU/Linux operating system. You can get your work done while ensuring that your rights and freedoms will not be restricted now and into the future."
It seems that the tide is turning on the commercialization of our lives. A volunteer effort is wrestling control back from the corporations that have dominated it for so long.
If that isn't a revolution, I don't know what is.
At the same time we are all creating a new form of "globalization" - a direct one. When I can download an album by a Finnish or Hungarian band for free and put it on a CD here in Canada, then we are no longer exploiting each other for profit, we are now collaborators in a global project, one of equals around the world. I am going to tell other people how much I like their music and where to get it from.
There is something oddly familiar about this social revolution. It all reminds me of the anti-Vietnam War protests and draft dodging of the 1960s. The babyboom generation wanted to change the world, make it a place for people instead of putting a priority on corporate profits, turn it in a different direction than it had been going. "Turn on, tune in, drop out". They didn't succeed in changing the world then, even though that movement took the US out of Vietnam and brought down a president as a result. I always thought the 1960s marked a failed social revolution when really nothing changed after Vietnam and that the revolutionaries had "sold out" and got jobs, bought houses and became comfortable instead. I am happy to say that there is some good evidence that I was wrong. They did get jobs and they bought houses, but they didn't get too comfortable. They are still out there creating social revolution over brunch. Lots of them are now computer programmers, running open source software projects, playing in jazz bands and working on Wikipedia.
It isn't going to end, because I keep running into the young generation out there, the so-called "Generation Y", the under 25s. They are doing their own programming, collaborating on open source software and posting their band's music instead of signing record deals. They are the children of the babyboomers and they are carrying on their parents' intent to change the world. And they are doing it, this is all evidence. No wonder whenever I meet under-25s they are so idealistic - they grew up with this quiet social revolution. You have to be idealistic to be part of this. I believe they will succeed, at many levels they and their parents already have.
This "freedom revolution" is changing things, slowly and quietly and it is bringing a great levelling around the world, eliminating inequities and injustice.
Maybe we can make the world a better place, after all.
January 30th will mark one year since the public release of Windows Vista. I thought this was probably a good opportunity to see how the acceptance has been, what the critics are saying and whether it has been a success like Windows XP was or a failure, like Windows ME was. There have also been many questions about whether Vista is improving with time or not, so I will try to have a look at that issue, too.
First off let's look at acceptance. In looking at the best available information gathered from multiple sources it would seem that Vista accounted for 6.9 percent of desktop operating systems in use in December 2007. The acceptance rate has been far below that of Windows XP when it was introduced. Much of this lack of user uptake of Vista has been because it has received "harsh criticism for low hardware support, high system requirements, relatively poor performance, and for not making big enough improvements since the release of XP." (Wikipedia) In general it looks like users have not bought into Vista in large numbers because of its critical reception.
Bill Gates stated that Vista had sold 20 million copies in the first month, 88 million copies by September 2007 and 100 million by the end of 2007. On the surface this looks like a lot. But, as some critics noted, most of these were original installations on new PCs that people bought. Many PC manufacturers gave you no choice in 2007, you could only get their products with Vista already installed. This was due to Microsoft's policies in dealing with OEMs. Dell was one major exception and they continue to offer a choice of five operating systems in early 2008, including Vista, XP and three Linux systems: SUSE, Red Hat and Ubuntu. There is good reason to believe that many of the OEM-equipped Vista PCs aren't running Vista any more and that at least some users have installed XP or Linux operating systems in place of Vista.
At least one critic has noted that OEM Vista installations are the best advertisement possible for Linux operating systems due to "the frustration of dealing with blue screens and Digital Restrictions Management - not to mention the sneaky updates and spying" that Vista brings to your desktop.
Installing a different operating system will always result in better performance as most Vista PCs ship with 2 GHz dual core processors and 2 or 3 GB of RAM. While it is amazing that Vista runs so slowly with all that hardware, systems like Ubuntu or even XP will zoom along with all that capacity to spare.
One of the acceptance issues has been the "Vista Capable" label. This issue is now the subject of a major lawsuit. The basic issue is that many people bought PCs that were labelled "Vista Capable" before Vista came out. These ran XP but were supposed to be upgradable after Vista was released. The problem is that they weren't. Some of them had 512 MB of RAM, whereas ComputerWorld says you need 4 GB of RAM to run Vista fully. Most of these boxes could only run Vista at a very basic level, without the Aero interface. There were a lot of disappointed people, who are now forming a class action on the issue. The suit says that Microsoft lied to them about the hardware they bought and they want their money back.
It is worth noting that according to Microsoft's own figures Vista was installed on only 39 per cent of the new computers shipped in 2007. That is actually rather remarkable. It means that the vast majority of new computers last year had something else for an operating system.
While many people who bought new PCs seem to have replaced Vista, that doesn't affect Microsoft - they made their money on the Vista installations anyway. It is your financial loss to overwrite it with a better operating system.
That Vista has made money for Microsoft was pointed out in a 25 January 2008 CBC article. As it says, "Microsoft's "client" division, responsible for Vista, posted revenue of $4.34 billion...Shares of Microsoft rose $1.40, more than four per cent, in extended trading, after gaining $1.32, or four per cent, to close the regular session on the Nasdaq market at $33.25." That is a lot of cash they made on Vista, in just the one calendar quarter ending 31 December 2007.
So I think it is fair to say that the acceptance of Vista has been weak and that a number of customers aren't happy with it at this point, but that Microsoft is happy with the money they have made.
So what are the critics saying? Nothing nice. Perhaps the leading comment about Vista comes from one of the leading computer magazines, PC World, which named Vista the number one tech disappointment of 2007. They said:
It's just that Vista isn't all that good. Many of the innovations the operating system was supposed to bring--like more efficient file and communications systems--got tossed overboard as Microsoft struggled to get the OS out the door, some three years after it was first promised. Despite its hefty hardware requirements, Vista is slower than XP.
When it debuted last January, incompatibilities were rampant--in part because hardware and software makers didn't feel any urgency to revamp their products to work with the new OS. The user account controls that were supposed to make users feel safer just made them feel irritated. And at $399 ($299 upgrade) for Windows Ultimate, we couldn't help feeling more than a little gouged.
No wonder so many users are clinging to XP like shipwrecked sailors to a life raft, while others who made the upgrade are switching back. And when the fastest Vista notebook PC World has ever tested is an Apple MacBook Pro, there's something deeply wrong with the universe.
We have no doubt Vista will come to dominate the PC landscape, if only because it will become increasingly hard to buy a new machine that doesn't have it pre-installed. And that's disappointing in its own right.
Strong words, but there is a lot more.
Wikipedia lists six main problems with Vista:
The complete summary of problems is best found in the article Criticism of Windows Vista.
Rather than comment on everything I found, let me just list some of what people are saying about Vista:
CNET: "Any operating system that provokes a campaign for its predecessor's reintroduction deserves to be classed as terrible technology."
IT Wire: "...many have called the final Vista "alpha quality" software. (That means "at a stage barely ready for testing". It's bad.)"
RegDeveloper: "...sexy party dress aside – it’s the same old tart underneath. Contrasting what was originally promised with what was finally delivered, Vista (nee Longhorn) has spectacularly failed."
Graceful Flavour: "Its launch has not lived up to expectations by any reasonable measure. Sure, you have some cheerleaders out there, but if you look closely enough, you’ll find they make a living supporting and advocating Microsoft’s technologies first and foremost. Vista is not setting the world on fire, and people are actively trying to avoid upgrading."
Vista Blorge: "PC Advisor surveyed its readers and found out that 67% would prefer their new computer come with Windows XP over Vista... "
The Inquirer: "Microsoft is in a rut. The firm has cowered, co-opted or bought all the critics, and any message coming out of the press will be well scripted... Vista could have been innovative instead of warmed over. Vista could have defended our rights instead of raping them. Vista could have been lean and mean instead of bloated and DRM slowed. Vista could have brought new ways of doing things instead of the same old same old. Vista could have been cheaper instead of a stealth price increase. Vista could have pioneered new ways of letting us use computers instead of activated tethers and licence problems. Vista could have been compatible and advanced standards instead of breaking software in the name of locking you in... I think we would have been better off if MS packed it in and spent the money on the moon shot they are so fond of making comparisons too."
Todd Bishop's Microsoft Blog: "Within Microsoft, there are (or were) those who knew where to lay blame. Jim Allchin, who left Microsoft the day Vista was released, said in a now-famous memo to Bill Gates and Steve Ballmer, "...we lost our way. I think our teams lost sight of what bug-free means, what resilience means, what full scenarios mean, what security means, what performance means, how important current applications are, and really understanding what the most important problems [our] customers face are... I would buy a Mac today if I was not working at Microsoft. If you run the equivalent of VPC on a MAC you get access to basically all Windows application software (although not the hardware). Apple did not lose their way... They think scenario. They think simple. They think fast." He closes with this simple summary of Longhorn (the initial code name for Vista), "LH is a pig.""
Pseudo Marketing: "I was loyal to you for so long. I stuck with you through thick and thin. From DOS 5.0 through XP. Through decent functionality and through countless crashes. But this new operating system is the last straw... You’ve terrified folks like my poor dad. He is afraid to install new software for any reason. He mumbles things like “Computers – you just can’t trust them.” He’s been conditioned that if he tries to install a new program or download an update – even if he does it correctly - something is likely to go awry for no explicable reason... You made millions of poor secretaries and office workers cry just for trying to do normal things like printing and saving... The secret is out, Microsoft. The reputation that you can’t be trusted to deliver reliable software is getting around fast."
Tech Blorge: "According to Microsoft, Vista brings “clarity” to our lives. What the hell is that supposed to mean?... We all know the truth - Windows XP is perfectly fine, and Microsoft’s PR machine is just inventing reasons for us to buy something we don’t need."
Forbes: "Should you upgrade your current machine? Are you nuts? Upgrading is almost always a royal pain. Many older boxes are too wimpy for Vista, and a 'Vista-ready' unit Microsoft upgraded for me could see my wireless network but not connect to it. The diagnostics helpfully reported 'Wireless association failed due to an unknown reason' and suggested I consult my 'network administrator'--me. Yet I've connected dozens of things to that network, including other Vista machines, a PlayStation 3 and Microsoft's own Xbox 360... My recommendation: Don't even consider updating an old machine to Vista, period. And unless you absolutely must, don't buy a new one with Vista until the inevitable Service Pack 1 (a.k.a. Festival o' Fixes) arrives to combat horrors as yet unknown."
ComputerWorld: "Microsoft's on-the-box minimum RAM requirement "really isn't realistic," according to David Short, an IBM consultant who works in its company's Global Services Division. He says users should consider 4GB of RAM if they really want optimum Vista performance. With 512MB of RAM, Vista will deliver performance that's "sub-XP," he warned."
BBC: "I've had two Vista crashes so far - not a blue but a black screen - and that really shouldn't happen. I can't even remember my last XP crash."
eWeek: "Microsoft really doesn't want you to know this, but many of your existing applications won't work with Vista. In fact, some brand new products won't work with Vista."
Computer World: "A major component of this is a new reduced functionality mode, which Vista enters when it detects that the user has "failed product activation or of that copy being identified as counterfeit or non-genuine", which is described in a Microsoft white paper as follows: "The default Web browser will be started and the user will be presented with an option to purchase a new product key. There is no start menu, no desktop icons, and the desktop background is changed to black. After one hour, the system will log the user out without warning". This has been criticised for being overly draconian, especially given reports of "false positives" by SPP's predecessor, and at least one temporary validation server outage."
IT Wire: "Any sensible person who reads the end user licence agreement accompanying Vista would, I'm sure, prefer to opt for a cell in Guantanamo; you basically have to spread your legs wide and bend over if you want to use the operating system."
There is a lot more collected at Anything but Speechless: 100 Things People Are Really Saying About Windows Vista.
The critics have done a lot of experimenting, comparing Vista to the other current operating systems out there, particularly Mac OS X Leopard and Ubuntu 7.10 Gutsy Gibbon. There seems to be general agreement that both of those operating systems are far better than Vista:
Information Week: "Mac OS X Shines In Comparison With Windows Vista"
ZDNet: "Since the late 90s I've dabbled with Linux, but there have always been compelling reasons to return to, or stick with, Windows. No more, for two reasons: Vista, and Ubuntu 7.10."
So is Vista another "XP" or another "ME"? From what I have read from the experts, it isn't even good enough to be compared to "ME".
So that leaves the last question - is Vista getting better with time? There were many admissions from Microsoft that Vista lacked hardware drivers when it was released, meaning that it wouldn't work with a lot of hardware people already had. Some of these issues were fixed in time through updates, which is good news.
Overall, though, it is easy to find articles that show that Vista is not being improved over time. A good example is from Bruce Schneier who points out: "Microsoft has added the random-number generator Dual_EC-DRBG to Windows Vista, as part of SP1. Yes, this is the same RNG that could have an NSA backdoor. It's not enabled by default, and my advice is to never enable it. Ever." Essentially Microsoft is introducing new bugs and issues as they are fixing old ones.
I keep looking out for people who say good things about Vista, but other than folks who work for Microsoft I am not finding much being said out there. Even the strongest fans of it seem to have a love/hate relationship with the operating system. Typical of the positive comments from non-Microsoft employees is Andrew Clifford's I love Vista, I hate Vista. His bottom line is "From my one week's experience of Vista, I would recommend it wholeheartedly. But I am still wary of Microsoft. If you want to keep the value of your investments in learning, in software, and in data, Vista does little for you."
Perhaps the last word should go to IT specialist Ron Schenone: "If there has ever been a time for other operating systems to make their move, this must be it. Unless Microsoft pulls a rabbit out of its hat, Vista may go down in history as the software that brought down a mighty empire. Drastic statement? Maybe. But people are not going to lay down their hard-earned money for something that doesn’t work. Plain and simple."
We will continue to watch what is happening in the world of Vista, but suffice to say that we are very happy to be running XP and especially Ubuntu 7.10.
Figuring out how to edit videos turned out to be much tougher than we thought it would be.
Here is the story:
We have a digital SLR camera that shoots really nice still photos. It also shoots videos in Apple QuickTime ".mov" format. They aren't very high quality - 10 frames per second (fps). That will become a factor as the story unfolds.
Since we got the camera in 2004, we have amassed a collection of raw video clips that we have shot: skiing, flying, Zuby, kid videos, quadracycling, etc. We always thought that it would be nice to be able to edit them to make longer movies out of the short clips that we have and then post them on You Tube and link them from our website.
We don't want an editor to do much:
Sounds pretty basic, doesn't it?
We have the video clips, what we need is editing software. Apple sells an editing version of their free QuickTime Player for US$29.99. There had to be a freeware or open source alternative, right?
We started looking a few months ago. Ubuntu 7.04 Feisty Fawn, which we were running at that time, didn't have anything suitable in the repositories. There were a couple of video editors, but the reviews said that they were complex, buggy and not recommended for beginners or for simple uses.
We got a recommendation for Avid Free DV (Digital Video). This is a freeware version of Avid Technology's more professional editing software. As a bonus it also had Flash tutorials available. So I downloaded the application and the tutorials. It was a good thing that I got it when I did in August 2007, as Avid Technology decided that they were getting out of the freeware business and discontinued Avid Free DV as of 01 September 2007, so it is no longer available.
Avid Free DV will only work on Windows or Mac and so we installed it on our Windows PC. That left us with no video editing capability on the Linux PC, but at least we had something.
We both spent a lot of time going through all the tutorials in great detail. Avid Free DV is a very complex application and it is not easy to learn how to use it. It cannot be described as "user friendly". The tutorials are difficult to follow as they tend to jump from "way too easy" to "way too complex" a lot. Some tutorials we watched several dozen times, trying to understand them.
We tried editing some videos with Avid Free DV and the results were mixed. Avid Free DV is very fragile when dealing with ".mov" files and possibly other formats too. If you do something and don't like it, undoing it will often not restore the project to a usable form and then you have to start all over again. It does a lot of unexpected things that the tutorials don't explain.
It also seems that Avid takes the ".mov" format and converts it to its own internal formatting for editing, losing some resolution at the same time. Then when you edit it and export it back to ".mov" format it again loses something in the format translation. We have been posting videos to You Tube to share them and You Tube also degrades them on upload and conversion to Adobe Flash format. The end result is pretty low resolution, as you can see in this example: Adam & Ruth go Flying. It really isn't a very good result.
Since we upgraded our Ubuntu version from 7.04 "Feisty Fawn" to 7.10 "Gutsy Gibbon" I checked and discovered that the Gutsy repositories have a new Linux-only video editor, called Open Movie Editor (OME). It sounded good on the OME website: "Open Movie Editor is designed to be a simple video editor, that provides basic movie making capabilities. It aims to be powerful enough for the amateur movie artist, yet easy to use." That sounded perfect for what we needed.
As a bonus OME had available an on-line tutorial and even a You Tube video in which the developer, Richard Spindler, demonstrates the application. A lot of open source applications lack this kind of documentation. Richard has even established a forum for getting help, and he often answers the questions there himself.
It all sounded good. So I used the Ubuntu "add/remove applications" to install OME. It installed quickly and opened up fine. But it wouldn't open our videos. It turns out that OME will only work with 25 fps video and, as mentioned, ours are all 10 fps. It was a no-go.
The version I got was 0.0.20061221, which translates to 21 December 2006. In other words it was not the current version. I thought that might be the issue.
I tried posting a question about it on the OME forum and got some good help there. It turns out that the newer versions of OME do support other video speeds, including 10 fps. Richard apparently fixes bugs in the program and refines it on a regular basis and just posts the source code for it in a tarball format for download.
What we needed was version 0.0.20080102, or 2 January 2008, just 19 days old. The problem was that no one had compiled a new version for Ubuntu in a while. If I wanted the newer version, I would have to compile and install it myself.
Of course I don't know how to do that. I tried the Ubuntu Forum and did get some helpful advice there, although not the best solution. A bit later I did get some help on the OME forum from Richard Spindler himself. After a couple of errors were uncovered I had the latest version of OME installed!
In case anyone would like to update their Ubuntu version of OME here are the correct instructions to do so:
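In outline, a source build of that era went something like the sketch below. This is a generic tarball build, not the exact commands from the forum thread; the tarball filename matches the version discussed above, and any development libraries OME needs will show up as ./configure errors:

```shell
# Generic source build on Ubuntu - a sketch, not the exact forum instructions
sudo apt-get install build-essential        # compiler, make and friends
tar xzf openmovieeditor-0.0.20080102.tar.gz # unpack the downloaded tarball
cd openmovieeditor-0.0.20080102
./configure                                 # reports any missing libraries
make                                        # compile the editor
sudo make install                           # install it system-wide
```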
We haven't had a chance to really try out OME, but I can confirm that it will open and run our 10 fps ".mov" files. I will do a more complete write-up on OME once we have worked with it a bit and done some actual editing.
While I was working on getting the latest version of Open Movie Editor installed, Ruth was working on something of her own in the way of movie editing. She had previously come across a You Tube editing capability. It seemed to work well and allowed easy editing of video clips, combining clips and adding titles. The only problem was that it was listed as a "beta" application and there didn't seem to be any way to save the edited movies.
It did give us an idea that perhaps we should be looking for an on-line editor. Ruth did a quick Google search and found JumpCut. This is a free service that is owned by Yahoo. It looks like when Google bought You Tube, Yahoo figured they had better have a video service website of their own and purchased JumpCut. According to Wikipedia JumpCut was founded in 2005 and Yahoo bought it in October 2006.
JumpCut is a great improvement over You Tube in a number of ways. It has a complete video editing application that allows:
Videos are displayed in a non-degraded state, so they look much better than they do on You Tube. Videos can be uploaded, edited and posted either publicly-searchable or not publicly searchable ("private"). The private videos can be seen by anybody, as long as they have the exact URL, so you can e-mail the link to people, or post it on a website.
We have uploaded some clips, edited them, strung them together and added titles. The results are pretty good.
JumpCut has the advantage of being totally cross-platform, since it is web-based. It is a commercial website, so it probably counts as "freeware".
So we still have some learning to do with Open Movie Editor before we post a review of it here. In the meantime Ruth is editing videos and posting them on JumpCut. There is now a review of this application in our Windows Open Source/Freeware Project.
You can find links to our videos on our Home Page.
I was recently looking at computer operating system market share numbers. These stats are interesting for a number of reasons. Mostly I was curious about whether Linux is growing as a desktop operating system or not, and if so who is losing out.
I found the Net Applications Market Share numbers - these are nice and neat and even have cute pie graphs that make it easy to visualize what is happening.
The December 2007 numbers aren't really all that impressive alone. They show:
Clearly Windows dominates the market, but then we all know that! With the Linux systems sitting at just under 2/3 of one percent that really isn't much market penetration, especially when you consider that represents over 330 different Linux distributions!
The trend information is more interesting. To look at that I went back a year to the December 2006 numbers. These show:
Now that is interesting! The change in market share in that one year time span is:
Okay, so Linux has gone up 0.26 percentage points in a year. That is a 70 percent increase!
At the same time Mac has seen an increase from 5.67 to 7.31 percent, which represents a 29 percent increase. It is likely that all those Mac TV ads are having an impact, as the Mac market share definitely is increasing.
The Linux numbers are interesting because none of the Linux distributions have a slick advertising campaign behind them. Linux uptake is due to word-of-mouth and some columnists writing about it. Given that, the Linux gains are extraordinary.
The release of Windows Vista during that period undoubtedly has something to do with the drop in Windows numbers - it has been widely panned in the press.
Of course the Windows numbers have been falling steadily since October 2004, when the Net Applications Market Share numbers start. Back then Windows had 96.40 percent, so it has fallen almost 5 percentage points. Obviously people are seeing the need to find alternatives.
Of course, to put it in perspective, with the current rate of Windows losses seen between October 2004 and December 2007, it will be 2036 before Windows use will drop to 50%. Much will change by then, I am sure!
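The arithmetic behind that 2036 figure is a straight-line extrapolation, which can be checked with a one-liner. The 91.79 ending share is my assumption, taken from "fallen almost 5 percent" below the 96.40 starting point; the exact Net Applications number may differ slightly:

```shell
# Linear extrapolation of the Windows decline (end-point figure is assumed)
awk 'BEGIN {
    start = 96.40; end = 91.79      # share: Oct 2004 -> Dec 2007
    months = 38                     # Oct 2004 to Dec 2007
    rate = (start - end) / months   # points lost per month
    to_fifty = (end - 50) / rate    # months from Dec 2007 until 50 percent
    printf "roughly %d more months, i.e. around %d\n", to_fifty, 2008 + to_fifty / 12
}'
# -> roughly 344 more months, i.e. around 2036
```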
So where is Ubuntu in all of this? According to DesktopLinux.com's August 2007 survey Ubuntu accounts for 30.3 percent of Linux use. Using the above numbers that would give it about 0.2 percent of the overall operating system market. Mark Shuttleworth estimated in 2006 that there were 8 million Ubuntu users. It has probably doubled since then.
So to answer my original question - Linux use is growing and quite rapidly, up 70 percent in the last year. Mac use is also up noticeably. Who is losing market share? Windows. Why? Because of bone-headed operating systems like Vista.
I am still learning lots about Ubuntu, although at this stage in the game it is generally more subtle things.
Ruth had mentioned that our Ubuntu PC clock seemed to be slipping behind the clock on the Windows PC and the clock on the wall. I clicked on the Ubuntu clock and discovered that it was set to "manual". This seems to mean that the time is just manually set and then the internal clock will just run, picking up errors as time goes by. It had only lost a couple of minutes in 10 months so that isn't too much to get worried about!
I discovered that it can be synchronized to a time source, but that we didn't have the sync application installed. Of course with Ubuntu all that stuff is free and open source - so one click downloaded and installed it. I then selected a time source from a checklist - NRC here in Canada looked good. The clock then synchronized and will now presumably stay on time.
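For anyone who prefers the command line, the same result can be had by installing the NTP daemon directly. A sketch only, since the GUI checklist does all of this for you; the NRC server hostname is an example and any public time server will do:

```shell
# Keep the clock synchronized via NTP (the GUI time settings do the same)
sudo apt-get install ntp                               # install the NTP daemon
echo "server time.nrc.ca" | sudo tee -a /etc/ntp.conf  # add a time source (example hostname)
sudo /etc/init.d/ntp restart                           # pick up the new server
```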
I am discovering that Ubuntu is really easy to use - everything you need is just available, if you think to ask for it! Impressive!
I mentioned a couple of entries ago that Ubuntu 7.10 Gutsy Gibbon comes with a new version of the GIMP graphics editor application. At the time we downloaded Ubuntu 7.10 it was version "2.4.0 Release Candidate 3", in other words a "beta" version. It seemed to work fine and we had no problems using it on a daily basis in our house.
On the Ubuntu Forums a number of users asked when a non-beta version would be made available. The response from the more knowledgeable users was not to expect an upgrade before Ubuntu version 8.04 Hardy Heron comes out in April. They noted that mid-stream upgrades of applications aren't normally done on Ubuntu, perhaps with the exception of Firefox bug fixes. Those are more critical because Firefox is a web browser, and unpatched browser bugs leave your system vulnerable.
We are currently three months into Gutsy Gibbon, which was released on 18 October 2007. This means that there are still three months to go before the next release, Hardy Heron, is available on 24 April 2008.
There have been a few Ubuntu updates recently. They are mostly small lib files, nothing interesting, really. Then yesterday there was a series of upgrade files for GIMP. This replaced version "2.4.0 RC 3" with "2.4.2", the same current version we have on our Windows PC.
I am impressed that the Ubuntu team actually took the time to replace a beta application (that worked just fine) with the stable release version instead. Credit where credit is due - that was a nice effort made!
It seems that every time I read anything from Microsoft about the company's plans for future operating systems after Vista, I become more committed to Ubuntu instead.
I really think that the marketing department for Ubuntu works in Redmond, Washington.
On February 3rd, 2007 Steven Levy of Newsweek interviewed Bill Gates about Windows Vista, which had been released just a few days earlier, on January 30th. Most of the article was damage control on Vista in light of Apple's advertising assault. The article also quoted Gates' vision for what is being called Windows 7, the next operating system to come after Vista.
Here is what Gates said:
So can you give us an indication of what the next Windows will be like?
Well, it will be more user-centric.
What does that mean?
That means that right now when you move from one PC to another, you've got to install apps on each one, do upgrades on each one. Moving information between them is very painful. We can use Live Services [a way to connect to Microsoft via the Internet] to know what you're interested in. So even if you drop by a [public] kiosk or somebody else's PC, we can bring down your home page, your files, your fonts, your favorites and those things. So that's kind of the user-centric thing that Live Services can enable. [Also,] in Vista things got a lot better with [digital] ink and speech but by the next release there will be a much bigger bet. Students won't need textbooks, they can just use these tablet devices. Parallel computing is pretty important for the next release. We'll make it so that a lot of the high-level graphics will be just built into the operating system. So we've got a pretty good outline.
This is not a new concept. Gates actually expressed similar ideas when Windows XP and later Vista were being developed, but they never did achieve it.
I am not convinced that the barriers to this "dream" of Gates are technical. They could do all of that today. I think it is a "privacy & trust" issue.
When I read that quote and what Gates has previously said, his vision for Windows in the future looks like this: You will have a PC running Windows as an operating system, but it won't have any application software. Those will be located on Microsoft's servers and you will rent the applications. That way there is no pirating of software, you won't have the disks and even if you did, you couldn't install them. You also won't have to upgrade software - whenever you sign in, the applications will be the latest versions.
But the best part is that you will keep all your documents on Microsoft's servers, that way you won't have to ever download anything - no viruses to worry about and your documents would be available to you anywhere. Of course Microsoft will have your tax returns, your music, your love letters, your resume, your credit card statements, your bank accounts, your divorce documents, everything. Of course we all trust Microsoft not to lose our personal information or use it for marketing to us, turn it over to the government or any other nefarious purposes, right? We all know that Microsoft just has our best interests at heart, right?
Hey, that would even put an end to third party software, including the whole open source world - you couldn't install it. There would only be Microsoft software in the world. Of course these new Windows PCs would be pretty cheap too, as they wouldn't need much more than a slim operating system, a browser and an internet connection.
Yup Microsoft is going to be responsible for lots of Ubuntu installations when they finally try to sell this one to a public who have had their credit card numbers compromised by retailers, or their social security files on stolen government laptops. These days I don't think the public trust of corporations is there, especially after Vista.
We have recently added some applications to our Ubuntu PC, including the ClamTk virus scanner. As mentioned there are tens of thousands of free applications available for Ubuntu. The problem is that no one has space, or the need, for all of them. Since we started on Ubuntu in April 2007 we have tried a few applications that looked promising but found that they weren't what we wanted and deleted them.
Adding or removing applications on Ubuntu is very easy, provided that they are on the add/remove list. To add one you just find it on the list, check the box next to it and Ubuntu installs it, right down to creating space on the applications menu for it, icon and all. To remove it you just find it on the add/remove list and then uncheck the box. Ubuntu removes it totally, even on the menus. You don't get leftover icons and that annoying flashlight that happens on Windows when you click on the left-over icon for a deleted application. Ubuntu seems to remove applications cleanly with no empty spaces or residue left over.
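The same thing can be done from the command line with APT, which is what the add/remove tool drives under the hood. A sketch, using GnuCash as the example package:

```shell
# Command-line equivalent of Ubuntu's add/remove dialog
apt-cache search gnucash        # find the package name first
sudo apt-get install gnucash    # install, menu entry and all
sudo apt-get remove gnucash     # remove it again
sudo apt-get autoremove         # clear out dependencies nothing else uses
```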
We have recently removed two applications that we had on the Ubuntu box for a while.
The first was GnuCash, which is a finance application. Ruth originally wanted this one to do the household books on. After playing with it for a while she discovered that it was just far more complex than she needed and found that Open Office.org Calc spreadsheets met her requirements better. So at the time we moved to Gutsy Gibbon we uninstalled that application.
The second application we removed was Scribus, which is a desktop publishing program. Ruth and I both worked with it for quite a while. It basically worked okay, but has a lot of quirks around formatting text boxes and pictures that made it harder to use than we wanted. We were really looking for something more like MS Publisher, but with PDF output capability. Scribus is much harder to use than MS Publisher, although it has PDF capabilities.
In a way Scribus is an odd application. It isn't fully featured enough for doing true newspaper layouts (lacks publishing marks) and yet it is too difficult to use for things like office newsletters. Perhaps future versions will correct these deficiencies and it will find a better utility niche.
We did discover that for desktop publishing and PDF creation Open Office.org Writer does a much quicker and better job than Scribus, so we use Writer now for those tasks. We already had it installed, as it comes with the Ubuntu package from the start.
Removing both these unneeded applications makes more space on the PC and reduces clutter.
One thing that we would like for our Ubuntu box, that we already have on our Windows XP computer, is a general PDF print capability. Right now we can save any Open Office.org document as a PDF, which is great, but we lack the ability to save web pages (on-line receipts and such) as PDFs. Saving whole HTML web pages in Ubuntu doesn't work well because they lose their formatting, and copying them whole into Open Office.org Writer usually crashes it, although just the text can be copied to Writer and saved okay.
There is a PDF printer for Linux called CUPS-PDF, a backend for the CUPS (Common Unix Printing System) printing system, that can be installed with command line interface input. A search through the Ubuntu Forum seems to show that there are some serious problems with it and that it doesn't work right, at least on Ubuntu (Forum pages: 1, 2, 3, 4, 5)
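If those problems get sorted out, the install itself looks simple. A sketch of what the Forum threads describe, untested by us; the output folder may vary by version:

```shell
# Install the CUPS-PDF virtual printer (per the Forum threads; untested by us)
sudo apt-get install cups-pdf
# after printing to the new "PDF" printer, the output should land in ~/PDF
ls ~/PDF
```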
There is also an Ubuntu package called PDFedit, but it is clearly not for simple tasks or non-expert users. Wikipedia says that "PDFedit is a low-level tool for technical users, that provides structured access to the internal structure of the PDF file. It may require familiarity with PDF specifications to be able to make substantial modifications."
As mentioned in our previous report Ubuntu now has a virus scanner available in the "add/remove" repositories. This is new and is a welcome addition to the free software available for Ubuntu. It helps make the argument that Ubuntu should be taken seriously as an operating system for mainstream use.
Okay, 'nuff said on that.
So once I discovered that Ubuntu has a virus scanner available through the "normal add/remove means" (easy for new users to find and install), I had to download it and try it out.
As described earlier the application is called ClamTk.
ClamTk is named for the Tk libraries that it originated with. It is actually the graphical user interface (GUI) version of the command-line-only ClamAV (AV = Anti-Virus). When you use the Ubuntu add/remove to install it you get both ClamTk and ClamAV and so you can run it from the GUI or from the command line.
ClamAV was designed as an e-mail scanner for UNIX/Linux servers, but the Linux version can do on-demand scanning as well, just like the AVG Linux scanner.
As with AVG, we had a problem with the updates once it was installed. Hitting the update button resulted in an error that said "you must be root to install updates". I tried a few ways of doing that, but none worked. In the end a quick search through the Ubuntu Forum turned up the answer, as usual. It seems the best way to do the updates is to run it from the CLI as "sudo freshclam". That does the trick.
I also found the user manual for ClamAV 0.92 and the ClamAV Read-me document which contain a wealth of further information. There is also a CLI manual that can be accessed by typing "man clamscan", that has a fairly complete list of CLI inputs that you can use.
As with AVG the GUI doesn't give you as much control over ClamTk as the CLI. For instance via the GUI it can only be set to scan a directory or file or the whole home folder. Because of Ubuntu "permissions" limitations, it can only scan inside your own home folder. For plain users, as opposed to administrators, this is probably okay as this is where any viruses will probably end up, since they need a password to get out of that folder. The problem is that ClamTk doesn't scan the sub-folders in a directory, just the actual files present. There is a "recursive scan" that is supposed to scan everything in a directory including sub-folders, but so far it only seems to work from the CLI and not the GUI.
One more minor issue - the "Gutsy" version of ClamTk is 2.32, which is out of date, as the current scan engine version is 3.05. The ClamTk website has instructions to update it via the CLI, since Ubuntu won't include the newer version until "Hardy Heron" comes out in April, 2008. In the meantime it still works. For a free open source virus scanner we can't complain!
To get the full value of ClamAV you have to go outside ClamTk and use the CLI instead. This allows a scan of the whole PC or of the whole home folder, including folders within folders. It isn't hard to do once you learn the commands. Here are a few useful ones that we have learned. Just open a terminal window (Applications > Accessories) or a console (Ctrl + Alt + F1) and run:
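A few typical invocations, assuming ClamAV is installed (these are standard clamscan options rather than necessarily our exact original list; the guard just skips the scans on machines without ClamAV):

```shell
# Skip everything if ClamAV isn't installed.
if command -v clamscan >/dev/null 2>&1; then
    clamscan -r -i "$HOME"           # recursive scan of the home folder, listing only infected files
    clamscan -r --bell -i "$HOME"    # same, but ring the terminal bell on a hit
    sudo clamscan -r -l scan.log /   # scan the whole PC as root and log the results (slow!)
fi
```

The -r flag is what gives the recursive behaviour that the GUI's "recursive scan" button failed to deliver for us.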
So how is the performance? I ran a scan of the home directory and it took 74 minutes to scan about 28,000 files. Scanning the root directory took just under two hours and scanned just over 129,000 files. This is a slow scan but it does cover more ground than the AVG anti-virus we have installed. AVG takes about 40 minutes to scan about 92,000 files.
So is ClamAV better than AVG? It is hard to say, but fortunately you don't have to choose - you can download and run them both. ClamTk/ClamAV is just easier to get through "add/remove". AVG requires CLI input to install, so it takes a bit more to get it.
Either way (or both) get a virus scanner for your Linux PC and use it. Perhaps we can keep the virus writers at bay if we are all well protected!
We are still feeling our way around the improvements to "Gutsy Gibbon" and there are quite a few of them!
One we have discovered is that there are more applications now available through the normal add/remove channel.
When we installed "Feisty Fawn" we had a list of applications that we wanted. Some, like Open Office.org, came with Ubuntu from the start. Others, like Celestia and Stellarium, were available through add/remove. There were two applications that we wanted that weren't available through any of the repositories and these had to be downloaded via the command line interface (CLI) and then carefully installed using the CLI, too. It wasn't impossible, but it did take some learning to do. It would have been a barrier to many new Ubuntu users.
One of the applications we wanted was a WYSIWYG web-page creator. We write almost all our websites on either the gEdit (Ubuntu) or jEdit (Windows) text editors these days, but there are still a couple of our websites that aren't hand-coded. There were no web design applications of any kind in the Feisty repositories. The one we wanted was KompoZer, which is the updated version of NVu. It works well and is open source. We managed to find instructions to install it via the CLI and that worked out okay, although it was harder to do than many new users would have been happy with.
The second application we wanted was a virus scanner. Ideally we wanted the Linux version of AVG anti-virus. Again there was no virus scanner available in the Feisty repositories. We managed to download and install AVG using the CLI. Again it wasn't too hard, once we figured out the correct commands to use. Installing it and getting it to update were another story - that took a week. Eventually we discovered that the user account had to be added to the "AVG group" or else it would not update. That done, it worked fine.
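For the record, adding an account to a group is a one-liner; this is a sketch assuming the AVG package created a group literally named "avg" (check with getent first, since the exact group name comes from the package):

```shell
# Only attempt the change if an "avg" group actually exists.
if getent group avg >/dev/null 2>&1; then
    sudo usermod -a -G avg "$USER"   # append the current user to the avg group
fi
# Log out and back in for the new group membership to take effect.
```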
I am pleased to report that the "Gutsy Gibbon" Ubuntu release has solved both of those problems! KompoZer is now available through the add/remove process, which makes installing it even easier than on Windows. The current version of KompoZer is 0.7.10, which replaces the 0.7.7 that we had.
"Gutsy Gibbon" also includes a virus scanner in the repository. This is not AVG, which is commercial freeware, but instead is the open source virus scanner, ClamTk.
So "Gutsy Gibbon" has come a long way in resolving our early issues with "Feisty Fawn". As The Economist said "No question, Gutsy Gibbon is the sleekest, best integrated and most user-friendly Linux distribution yet. It’s now simpler to set up and configure than Windows." All quite true!
So where are we with our early issues with Ubuntu? Here is the list and disposition:
For the record here are the optional applications that we currently have installed on our Ubuntu PC, other than the core packages that come with it (like GIMP and Open Office.org):
There certainly are a lot of available applications to choose from!
I would say overall we are pretty impressed with Ubuntu 7.10 "Gutsy Gibbon". Ubuntu is getting better all the time. As soon as the developers can fix a few remaining hardware problems and those "system breaking updates" that cause working hardware to stop working, then Ubuntu will be totally ready for use by mainstream, "average users".
One of the National Capital Freenet Free Software Discussion Group's fearless contributors pointed out a recent article published on-line by the highly respected The Economist in its Tech.view column. In its Technology in 2008 article, posted on 23 December 2007, the anonymous author tackles Three Fearless Predictions for 2008.
The first is that the internet will slow down due to the traffic on it. The second is that a lot of internet surfing is going to go wireless.
The third is that open source is going to be the dominant factor in technology. The column puts much of the credit for that on Ubuntu and especially the new Gutsy Gibbon release.
Bulletproof distributions of Linux from Red Hat and Novell have long been used on back-office servers. Since the verdict against SCO, Linux has swiftly become popular in small businesses and the home.
That’s largely the doing of Gutsy Gibbon, the code-name for the Ubuntu 7.10 from Canonical. Along with distributions such as Linspire, Mint, Xandros, OpenSUSE and gOS, Ubuntu (and its siblings Kubuntu, Edubuntu and Xubuntu) has smoothed most of Linux’s geeky edges while polishing it for the desktop.
No question, Gutsy Gibbon is the sleekest, best integrated and most user-friendly Linux distribution yet. It’s now simpler to set up and configure than Windows. A great deal of work has gone into making the graphics, and especially the fonts, as intuitive and attractive as the Mac’s.
Like other Linux desktop editions, Ubuntu works perfectly well on lowly machines that couldn't hope to run Windows XP, let alone Vista Home Edition or Apple’s OS-X.
Pundits agree: neither Microsoft nor Apple can compete at the new price points being plumbed by companies looking to cut costs. With open-source software maturing fast, Linux, OpenOffice, Firefox, MySQL, Evolution, Pidgin and some 23,000 other Linux applications available for free seem more than ready to fill that gap. By some reckonings, Linux fans will soon outnumber Macintosh addicts. Linus Torvalds should be rightly proud.
There is lots more in the article as well - recommended reading.
It is all high praise for Linux in general and for Ubuntu in particular.
More and more I am thinking that when Windows XP mainstream support ends on 14 April 2009 (a date subject to the future whims of Microsoft, of course), a lot of people are going to be installing Ubuntu on those existing boxes instead of running out to buy new hardware to run Microsoft's latest "bloatware".
By April 2009 Ubuntu will be into version 9.04 ("Jumping Jabiru"??) and hopefully the few remaining hardware issues will be solved by then.
Oddly enough we have never tested out printing on our Ubuntu PC. Mostly we have never had a reason to do so. The printer is hooked up to the XP computer and it works fine over there. Even doing a test of the printer would mean hauling the hardware around and we have just never done it.
It did seem to be an oversight on our part in our ongoing tests of Ubuntu and so I decided to finally check it out and see if Ubuntu could find and print on our HP 1018 printer.
I did decide to cheat, however, and rather than haul the printer over to the Ubuntu PC, I brought Ubuntu to the printer instead. I restarted the Windows PC and booted it to Ubuntu from a Feisty Fawn (Ubuntu 7.04) disc that I have. It was odd to see the familiar old brown desktop appear on that PC, but I quickly checked it out and found that everything was working right, just a little slower, because it was running live off the CD.
I tried a quick print test and it failed. Poop. I checked the printers and it had a dummy default set-up. It only took a few clicks to have it recognize the HP printer and use its own Ubuntu driver for it. I wrote a quick Open Office Writer document and printed it - it worked perfectly.
So I think that there is no more doubt - Ubuntu and the HP 1018 printer get along just fine together.
One of the many features most home computer users like is a calendar/appointment program. On many Windows systems, Outlook fills that role. Ubuntu users don't need to look any further than Sunbird. Described as a cross-platform calendar which can be used "...around the world...", Sunbird is a wonderfully easy application.
Its interface is very much like Outlook's calendar application. However, unlike Outlook, Sunbird is not associated with any email program but is a standalone program. Users can choose different views from 'today' to 'this week', 'multiweek' and 'month', although there really isn't any significant difference between the 'multiweek' and 'month' views. The only noticeable difference seems to be that selecting 'multiweek' puts your current date and week at the top of the screen, whereas the 'month' view puts your current date on the screen relative to the rest of the month. In other words, if your current date is somewhere towards the end of the month, the 'month' view will place you towards the bottom of the screen whereas the multiweek view places you at the top.
The left-hand side of the screen depicts both a conventional calendar and a task pad just below. As with nearly all programs of this kind, one click on a date will send the user to that date on the more detailed right-hand side of the screen. It would be useless to delve too deeply into the details because Sunbird is so intuitive that explaining the features just isn't necessary.
Double-click on a date on the right-hand side of the screen (where the calendar itself is displayed) and a dialogue box will open with the heading New Event. At that point, it's a very simple matter of filling in the information.
Curiously, though not surprisingly, there isn't a Save option. When you close the application, whatever appointments, reminders or other information you've entered is automatically saved. Sunbird seems to understand that users may have calendar information elsewhere, so options to both import and export a calendar are available from the File menu. Selecting either of those options will open another dialogue box. The Publish option allows users to publish their calendar on the Internet.
The application itself isn't particularly large. On high speed, it took under a minute to download it.
Ubuntu users can find Sunbird through Ubuntu's Add/Remove program function. Anyone else can download this neat calendar program at Mozilla.org.
As noted in the entry about the upgrade to "Gutsy Gibbon", our scanner is now working! This was one of the problems that we had with "Feisty Fawn" that has now been solved.
It is worth discussing the scanner software as it really works well with our Canon CanoScan LiDE 20 scanner. We had downloaded the scanner software for Ubuntu some time ago when we first tried getting the scanner to work. The problem wasn't the software or the scanner itself - the ports needed for USB scanners were assigned to other things in "Feisty Fawn". Apparently they had worked in the previous Ubuntu version, "Edgy Eft". It looks like the port problem was resolved with the release of Ubuntu 7.10 "Gutsy Gibbon".
The scanner software is XSane, an application from the open source community at www.xsane.org. "Sane" in this case means "Scanner Access Now Easy". The "X" indicates that it is designed for the "X Window System".
The XSane scanner application opens a number of windows to carry out all the functions. It sounds complex but it is very intuitive to use. The main window allows you to set the parameters - resolution (in dpi), contrast, colour, grayscale or black & white, lightness/darkness, destination folder, etc. It also allows you to designate whether it will create a PDF, JPEG, PNG or other format output. A second window allows scanning for a preview, from which you can then adjust the colour, lightness, etc.
When you are happy with the preview and the settings you can then hit "scan" and it does the job, depositing the file where you indicated you want it.
There really isn't a lot else that can be said about XSane - it is simple to understand and works, at least once the operating system allows it to communicate with the scanner properly.
Congratulations to the folks at www.xsane.org, your scanning software works perfectly!
Well we have now had a chance to use "Gutsy Gibbon" and find out what it comes with, plus what it can and can't do.
First we checked the versions of software Ubuntu 7.10 comes with and found:
Most of these are upgrades from the applications that came with "Feisty Fawn".
In running through the different applications to see if they all work we found a few anomalies:
So overall there are a few pluses to "Gutsy Gibbon" and a few minuses. Other than these issues it seems to work as well as "Feisty Fawn", but we still have to do some more testing, such as recording CD-RWs for back-ups, etc.
I will add more info when we have more!
We have to admit that we have been a bit reluctant to upgrade our version of Ubuntu.
Since we got our second PC in April 2007 it has been running Ubuntu 7.04 "Feisty Fawn" and in general it has worked quite well. In October the next version became available: 7.10 "Gutsy Gibbon" (or, as Ruth calls it, "Goofy Grape" - she thinks the versions should be named for Kool-Aid characters).
Since we found that in general "Feisty Fawn" worked okay, we thought we would put off upgrading for a while to see what the Ubuntu community response was. Reading the Ubuntu Forum, there didn't seem to be any wholesale panic over "Gutsy Gibbon", so that was a good sign.
There is no problem with continuing to run "Feisty Fawn" almost indefinitely, although Canonical support only lasts for 18 months. After that there are no more upgrades or fixes for that version. Before we got onto high-speed DSL internet it was almost impossible for us to upgrade - it would have taken about three days on-line to download the updates. High speed is a real necessity for managing Ubuntu.
There are a number of drawbacks to not upgrading as new versions become available each April and October. Mostly these are:
Of course we have to find out what that all means for the way we use the PC.
The Compiz Fusion website says:
Compiz Fusion is the result of a merge between the well-known Beryl composite window manager and Compiz Extras, a community set of improvements to the Compiz composite window manager. Compiz Fusion aims to provide an easy and fun-to-use windowed environment, allowing use of the graphics hardware to render each individual window and the entire screen, to provide some impressive effects, speed and usefulness...Compiz Fusion is an open-source software project, meaning anyone can use it freely and contribute.
From our understanding it will make Ubuntu look as bad as Vista's Aero interface. We would rather that the folks at Canonical spend their time making Ubuntu work better with hardware like scanners and cameras, rather than trying to compete with Windows Vista for useless glitz.
So after we discussed it, we decided to upgrade today. We went to the upgrade manager and pressed the "Upgrade To 7.10" button. There were a few "congratulations" screens to navigate and then we got to the download box. There was a warning that, of our installed Ubuntu-supported applications, our gXine media player was no longer supported and that we would have a choice of keeping it later in the process.
The download upgrade tool indicated that the process would involve:
The download itself went smoothly. We were asked if we wanted to keep our configuration settings, which we answered "yes" to and then the PC commenced the installation of the upgrades - estimating 39 minutes to complete that process.
The process was completed in a total of 1:52. It left us with a display that looked just like our previous one. I did check and the upgrade was completed - it just retained all our settings.
We have new versions of many applications, like OpenOffice.org 2.3 (replacing 2.2), GIMP 2.4.0 Release Candidate 3 (i.e. a beta), a newer Firefox and so on.
Of interest the Compiz Fusion interface didn't come up when the upgrade was complete. So I checked to see what was up - the controller said it was disabled. Trying to enable it didn't work, so I assume that we don't have the graphics hardware to run it. That means that we can't review it, but we didn't really want it anyway!
Overall everything seems to still be there and still works just fine. More on "Gutsy Gibbon" once we have been able to give it a proper "test flight".
As described more fully in our Open Source/Freeware Windows Project our switch to hosting our home website on National Capital Freenet from GeoCities meant that a File Transfer Protocol (FTP) client application would be the best means of getting the web pages uploaded to the new website.
In reading the NCF help pages we found a recommendation to use FileZilla, a free open source FTP client. A quick check of the Ubuntu "add/remove" list showed that it was there, and it is available for Windows XP, too. That made it an easy choice for us, since it enabled us to keep both PCs equipped with the same software - our goal whenever possible!
So I quickly downloaded it on Ubuntu and got version 3.0.0-Beta7. As usual this is a much older release than the one available for Windows XP - the Ubuntu version is a "beta" to boot! Because Canonical requires versions packaged specifically for the Ubuntu operating system, they are always behind the releases available for Windows. Regardless, this version of FileZilla does seem to work fine, even though the interface requires "fixing" each time it is opened for use.
It is great to have the same applications available for both our Windows and Ubuntu PCs. It reduces the amount of learning we have to do and enables us to get on with the important stuff - creating content, instead!
We finally took the plunge and have gone to "high speed Internet" at home. Yup that is right, we have been on the Internet since 1998 and have been on dial-up the whole time.
The first place we were on the Internet was a remote village in central Alberta. No choice there - even the local library was on dial-up, and so were we. Most people don't realize that, even in 2007, outside the major cities in Canada broadband service is often just not available.
When we moved to Ottawa we just kept our Windows 98 PC and our Win Modem. We started with Primus in 2000, but their service was awful and we quickly switched to Ottawa's number one provider at the time, Cyberus. We were with them until early in 2007. They were great, with no busy signals and pretty good speed for dial-up, usually 49.9 kbit/s. Then they were bought up and the company collapsed. It was difficult to even contact them to terminate the service. Finally we did, and switched to Magma, which was bought out by Primus.
So we ended up back with Primus in early 2007. They had to have improved, right? No! The service has been awful, with busy signals 50% of the time, really slow data rates and poor customer service. In the past month we haven't seen a connection faster than 26.4 kbit/s, and downloads typically run around 3 KB/s. That is very slow, even for dial-up!
I should say that we are not opposed to high speed Internet. There are lots of advantages to it, including that everything happens faster and that your phone line is not tied up. A major reason for being on high speed would be to manage our Ubuntu upgrades. Ubuntu sends lots of updates and the downloads can be very large! It really is hard to manage with dial-up. Upgrading to the next version of Ubuntu every six months means around a 700 MB download - that would take about 65 hours at the speed that Primus was supplying us. Another advantage is no more taking turns with two PCs and one dial-up connection. Both can be on the internet at the same time.
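The back-of-the-envelope arithmetic behind that 65-hour figure, using our observed ~3 KB/s dial-up download throughput:

```shell
size_kb=$((700 * 1024))          # a 700 MB release image, in KB
rate_kbs=3                       # observed dial-up download rate, KB/s
seconds=$((size_kb / rate_kbs))
hours=$((seconds / 3600))
echo "about ${hours} hours"      # about 66 hours
```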
Our reason for not being on high speed earlier was simply cost. Out in Western Canada people can get high speed for under $30 per month, but here in Ottawa all the providers seemed to have fixed the price at $49.95 per month. To get a lower price you had to buy a "bundle" which included home phone, long distance and perhaps cable TV, too. The problem is that we have a great long distance plan with Primus (formerly London Telecom) that includes "reverse calling", which means Ruth's kids can call us for free - a very useful feature that we didn't want to lose. Going to someone else's "bundle", or even Primus', would mean losing that feature. We don't have a TV and obviously didn't want a "bundle" that included that service.
Primus was a special case since they were already our dial-up ISP and long distance provider. We had received many leaflets in the mail from Primus advertising their "bundle" of high speed Internet, local phone and long distance (no reverse calling, however) for $59.95. Not too bad, really. We called them about it and discovered that, even though we have received about 200 of these leaflets in the last two years, that "bundle" isn't available in our area. So why send out the leaflets? Maybe to try to "up-sell" us? They offered us the same services as in the "bundle" - but for $99.95 a month!!
Needless to say we weren't impressed. Primus' Internet on dial-up has been really poor quality. There was no reason to think that their high speed service would be better.
We checked out all the other ISPs that have service in Ottawa and the price was all the same - $49.95 per month. Then we discovered one ISP we had missed before.
National Capital Freenet was Canada's first ISP, starting in 1993. It is unique in that it is a not-for-profit, member-owned association in business as a community service. DSL service is $29.95 per month and dial-up is a suggested donation of $5 per month. The organization has a totally different philosophy from the commercial providers. We signed up on-line and then dropped by their office to pick up our DSL gateway and cabling. NCF has four employees and the rest is done by volunteers, all for the benefit of the community.

NCF's Executive Director, John Selwyn, explained that Internet access isn't optional in our country anymore - everyone needs it to get information, do banking, pay bills, get a job and so on. NCF is there to make sure that disadvantaged people in the city are not further disadvantaged by being unable to afford Internet service. NCF uses some of the money it makes providing DSL service to subsidize the dial-up service, right down to zero in some cases. It is a great project to be part of and many members volunteer their time to provide tech support and generally help out.
So our service started on Friday 30 November 2007. We had everything wired up to the gateway, including both our PCs. By the middle of the afternoon the DSL was "hot" and we were on. The gateway showed a speed of over 3 Mbit/s for downloads and 0.8 Mbit/s for uploads - pretty darn quick! That is about 114 times faster than Primus dial-up over the last month and even three times faster than most DSL services. The gateway is "DSL 2" ready, which will one day provide up to 25 Mbit/s, according to John Selwyn.
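A quick sanity check on the "about 114 times faster" claim, treating the figures as 3 Mbit/s DSL versus a 26.4 kbit/s dial-up connect speed (which is what the modem numbers must mean, since a 56k modem can't sustain 26.4 kilobytes per second):

```shell
dsl_bps=3000000         # ~3 Mbit/s DSL download speed
dialup_bps=26400        # 26.4 kbit/s dial-up connect speed
ratio=$((dsl_bps / dialup_bps))
echo "roughly ${ratio}x faster"   # roughly 113x faster
```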
So we are impressed with NCF! John Selwyn has even asked us to start an Open Source/Freeware discussion group to help other members learn about that subject. Due to our involvement in the Ubuntu community and also in creating a predominately open source Windows XP PC we have a little bit of expertise in that area and will be happy to help out.
So, what is the bottom line on DSL and Ubuntu? It works incredibly well! Once the DSL was "hot" and the Ubuntu PC was plugged into the gateway with an ethernet cable, we just needed one reboot and it was on. Our Ubuntu PC is a two-year-old Dell with a 2.8 GHz processor, so now the Internet connection finally matches what the rest of the box can do.
Within a couple of hours on DSL the Ubuntu PC notified us of a required Ubuntu update. The files added up to 25 MB, a pretty typical update package. On dial-up this would have taken almost 2 1/2 hours, but the DSL had it downloaded in less than a minute. Amazing.
I have to conclude that Ubuntu isn't really very practical on dial-up, especially with the sizes of the update and version files. Ubuntu was obviously designed for high speed. Trying to even configure Ubuntu to make it work with a dial-up modem is a challenge as we described in our entry for 29 April 2007 - Making Progress With Ubuntu (or something like that).
The only question left is when we will make the upgrade to the next edition of Ubuntu, Gutsy Gibbon (7.10), which was introduced two months ago. Our current version, Feisty Fawn (7.04), is working pretty well and we hate to give that up!
We had lunch with a good friend of ours this week. She bought a new laptop this summer and, predictably, it came with Windows Vista installed. No big deal she thought - after all it is Microsoft Windows, right? How bad could it be?
She has had a summer of frustrations with the unit. It runs slowly, opens applications slowly or not at all and generally barely works. It has been back to the retailer several times. The first was for more RAM - lots more RAM, in an attempt to make it work better. No go.
It isn't a hardware problem, it is Vista. After several months she has given up on that Operating System.
We did talk to her about installing Ubuntu, but she isn't a "technically-minded geek-type"; she is very much an average Windows user, who uses her PC to get work done, not as a hobby in itself. Also, she has to run some specialty accounting applications for school. She really wouldn't have the patience to get Ubuntu configured right, especially if she needed to install a compatibility layer, such as Wine, to make Windows-only finance applications run. As much as we like Ubuntu, we really just couldn't recommend it to her for what she needs a laptop to do.
In the end our recommendation was to see if she could get Windows XP installed in place of Vista. She took it back to the store and that is what they have done for her. She knows XP quite well from four-plus years of using it at work and at home. XP remains Microsoft's best OS to date. Hopefully she will have a usable laptop now.
I wish Ubuntu was just a little further down the development road so that we could recommend it for users like her.
Recently a very good friend of ours out west sent a link to an article he had found: Linux’s Free System Is Now Easier to Use, But Not for Everyone by Walter S. Mossberg, published on 13 September 2007. It is a review of Ubuntu, motivated partly by popular demand from his readers and partly by Dell offering Ubuntu as an option this year.
Walter S. Mossberg is a well-regarded tech subject writer; he writes his Personal Technology column for The Wall Street Journal, so he has credibility and a large audience. His article has certainly attracted lots of responses in the few days that it has been posted.
So I read the article. It seems well-researched and written clearly. He describes his column as "... written for mainstream, nontechie users of digital technology".
His conclusion on Ubuntu is:
My verdict: Even in the relatively slick Ubuntu variation, Linux is still too rough around the edges for the vast majority of computer users. While Ubuntu looks a lot like Windows or Mac OS X, it is full of little complications and hassles that will quickly frustrate most people who just want to use their computers, not maintain or tweak them.
He has some interesting information to back up his opinions on this matter.
Mark Shuttleworth, the South African-born founder of the Ubuntu project, told me this week that “it would be reasonable to say that this is not ready for the mass market.” And Dell’s Web site for its Ubuntu computers warns that these machines are for “for advanced users and tech enthusiasts.”
He gives lots of examples of the "rough edges" he encountered with Ubuntu, from a lack of adjustment for the touchpad on his Dell-supplied laptop to the volume control crashing. He notes that most average users "wouldn’t want to enter text commands, hunt the Web for drivers and enabling software, or learn a whole new user interface."
His bottom line: "...for now, I still advise mainstream, nontechnical users to avoid Linux."
Overall I think he has it right. Ubuntu does work well, but it still has hardware compatibility issues, it lacks software for some things that some users may need, particularly in specialty areas, and it requires more patience and skill to set up and troubleshoot than the average Windows user is prepared to invest. The command line interface will scare many users. When I showed the CLI feature to my wife's 17-year-old daughter she laughed out loud and thought it was "very old fashioned, from the 1960s" - until I showed her Windows XP's "Command Prompt". She hadn't seen that before.
But Ubuntu is getting better all the time. I am convinced one day it will be an excellent operating system and will be able to run any hardware, will require minimal configuration and will satisfy reviewers such as Walter S. Mossberg.
This is an interesting situation with Ubuntu that I had suspected existed, but hadn't got around to fully testing. On Windows XP you can add new files to an existing CD-R or CD-RW that already has files burned onto it, but on Ubuntu you can't.
The first opportunity to do this was while I was working on some photos. After having processed the photos using Gimp I wanted to add them to a CD-R that I already had some photo file folders on. In Windows this is a snap - XP creates a visible "temp" icon for upload and then you select "write" and it will burn the images onto the CD-R, while preserving the existing files on the CD. The end result is that the new file joins the old files on the CD. Given the same task Ubuntu just says "insert a blank CD".
There is one disadvantage to doing this as a back-up procedure on Windows - the CD-R won't hold nearly as much as when you burn a whole CD's worth of files all at once. When done all at one time a CD-R will typically hold 700 MB of content. When burned one file at a time (some now, some later) the same CD-R will often max out at 350 MB. I assume that the missing space is taken up by the overhead of writing each session separately.
It seems that the only way to write files to a CD from Ubuntu is to do the whole CD all at once. Since I typically write files to CD to back them up (such as photos or videos), this means that while I am accumulating 700 MB of files I will have to back them up some other way. Probably the easiest method would be to accumulate them on a "USB device" and then, when they add up to 700 MB, transfer them to CD. More of a culture change than a technical glitch, really.
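As a rough sketch of that "accumulate, then burn" idea, a small Python script can total up the staging folder and tell you when it is getting close to CD capacity. This is only an illustration: the `/media/usbdisk/backup` path and the `staging_size_mb` name are my own assumptions, so substitute your actual mount point.

```python
import os

CD_CAPACITY_MB = 700  # nominal capacity of a standard CD-R


def staging_size_mb(folder):
    """Return the total size, in MB, of everything under the staging folder."""
    total = 0
    for root, dirs, files in os.walk(folder):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / (1024.0 * 1024.0)


if __name__ == "__main__":
    # Example mount point only -- use wherever you accumulate the files.
    size = staging_size_mb("/media/usbdisk/backup")
    print("Staged: %.0f MB of %d MB" % (size, CD_CAPACITY_MB))
    if size >= CD_CAPACITY_MB:
        print("Time to burn a CD!")
```

Run it whenever you add files to the staging folder; once it reports roughly 700 MB, burn the whole lot in one session.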
It seems that when moving from Windows XP to Ubuntu there are some "mindset" changes to be made, but most tasks can still be carried out in one way or another. That said, we still don't have solutions to the scanner or camera problems previously mentioned. Perhaps they will come in October when "Gutsy Gibbon" is launched.
The tale of the WIN32/PolyCrypt virus problem seems to have come to an end today. I downloaded AVG virus update 269.12.2/966 and did a scan and it came up clean. This was a definite improvement over previous scans which indicated up to 44 files were infected with WIN32/PolyCrypt as described in Fun with AVG Anti-Virus below.
More recently just one file had come up as infected and that was /usr/bin/mawk. Many other Ubuntu users were also noting the same problem as described on the Ubuntu Forum.
All during this time I didn't take any other action on the PC other than doing scans - no healing action, so it is clear that this was a "false positive" all along. We were all pretty sure about this from the use of the Jotti Online Malware Scanner that showed that only AVG thought any of these files had a problem. The other 19 anti-virus scanners didn't think those files were infected, which is a good indication of a "false positive".
So I have learned that not all virus scans that show infections can be trusted and that it is well worth trying the file in Jotti to see what the other scanners think.
As you can tell by reading the Ubuntu forum some people were very concerned about the AVG results. I was convinced from the start that it was a "false alarm".
Overall this whole incident was a minor annoyance and it could happen on any operating system, not just Ubuntu. At the same time we had a false positive on our Windows system, although it was for the installer program for Ad-Aware, of all things! The only connection between Ubuntu and this AVG "false positive" may be that the folks at Grisoft who write the AVG virus updates either didn't test them on a Linux system or weren't sufficiently aware of what is in the binary files in Ubuntu, so that their virus definition was "too generic" and picked up normal Linux files. I am sure that the main focus at Grisoft is Windows and not Linux systems, which is understandable.
Despite the "false positive" from AVG and the less than sterling customer service from Grisoft on the issue, I still think that AVG is worthwhile and a valuable addition to any Linux-based system.
We have been using Grisoft's AVG "Anti-Virus Guard" Free Edition for a number of years on Windows with no problems whatsoever. You just download it, install it and it works great. There are very frequent virus definition file updates, any threats found can be quickly dealt with and with a high degree of confidence.
The experience with the Linux version of AVG has been a bit different. Installing it was a challenge, as described earlier in this diary but we did sort through that, even with the surly assistance of the AVG staff on their forum. Since then, AVG on Ubuntu has worked mostly okay. It doesn't have nearly the features of the Windows versions, but it does manually-initiated scans and that is really all you need.
Then there were the events of yesterday, 09 August 2007.
It all started with a routine download of an AVG virus definition file. These are issued by Grisoft every day or two, and sometimes more than once per day. At least once a day, when I am on-line on the Ubuntu box, I run the update function to see if there are any updates available. Then, when I am done with the Ubuntu PC, I run a virus scan on it. It always comes up clean because there really aren't any viruses for Linux right now. The AVG definitions are generic, which is to say they contain information on all viruses, not just Linux ones. This is good because, while a Linux PC cannot run a Windows virus, it can pass one on to Windows users. (See 29 April 2007 - Making Progress With Ubuntu (or something like that) for some thoughts on that issue.)
So I downloaded the definition file and then ran a scan. It indicated that the PC had 44 infections of WIN32/PolyCrypt.
Obviously this was very suspicious. First off I checked some virus information sources and discovered that WIN32/PolyCrypt is a Windows-only virus. It would be possible to pick it up, but it cannot spread all over a Linux PC, because Windows executables won't run natively on Linux.
Rather than run a "cleaning" scan on AVG via the command line, I checked both the AVG and Ubuntu help forums. AVG didn't have any reports of this problem, but the Ubuntu forum did. There were a number of Ubuntu users there who were reporting the same problem, although none of them had sought help on the AVG Forum. Probably a good thing, as events would show!
So I posted the details on the AVG Linux forum, since it seemed to be a problem unique to Linux. I asked if they had any information or advice on this issue. The AVG forum is only answered by AVG staff, they don't appreciate public help there, so even though it is termed a "forum" it really isn't one in the traditional sense.
What I suspected had happened was that the latest AVG virus definition file contained a search string for something that appears naturally all over the place in Ubuntu systems.
I also posted a note on the Ubuntu forum saying that I was seeking an answer on the AVG form, provided the link to the page and promised to report back when I had news. In response I got some positive posts on the Ubuntu forum for that initiative. Several Ubuntu users had indicated that they suspected from the evidence that we had a "false positive" situation. The Ubuntu community is really very helpful and supportive, not to mention polite.
I did get an answer from AVG - sort of. The staff there seem to have a different approach to dealing with people. First they deleted my post - when I came back to the URL I had bookmarked I found just a "page missing" tag. So I did a site-wide search and found that they had moved it from the "Linux" section to the "Virus" section, along with a snotty note to pay attention to which section I posted in. I was lucky that I found it at all! Then they indicated that I shouldn't have added to an existing post and that my post was "not helpful". That would be fine, except that I didn't add it to an existing post - it was a new one. I did point that out to them, but just got another rude response.
They did actually provide some semi-useful information. They wanted me to run it though the Jotti Online Malware Scanner and then if it came up as a likely false alarm to send them the file in a password protected zip file, with the password.
The Jotti website was something new to me and turns out to be a great resource. It uses 20 different virus scanners from different providers to scan any uploaded file. That way you can get a second opinion (and twentieth opinion as well) as to whether it is a real virus or a false alarm. I uploaded a couple of the files AVG had flagged on the Ubuntu PC and they all scanned with only AVG reporting that there was a problem. The other 19 scanners reported "clean". That meant that it was likely a "false alarm" as I, and several others on the Ubuntu forum, had suspected.
The second part - sending the files to Grisoft zipped and password-protected - was more difficult. The files indicated were all binary files and I have no application that will open them, let alone save them with a password. Regardless, I reported my results on both fora. On the Ubuntu one I got words of thanks and support; on AVG's staff forum, silence, which is better than I usually get there.
So next I did nothing. I didn't run an AVG "cleaning" scan on Ubuntu or make any other changes. Instead I waited until later in the day and then downloaded a new AVG virus definition file. I ran a scan with that and it reported just one infection - WIN32/PolyCrypt again, this time only in /usr/bin/mawk.
Again I ran the file through Jotti and it came up as only AVG reporting it as infected. As Grisoft had also suggested I ran another scan with the "heuristic" analysis turned off, but it didn't make any difference. I reported the results again on both fora.
So I am convinced that the problem was entirely a "false alarm" issue; perhaps in the next few definition updates the remaining detection will come up clean.
I don't really have a problem with the occasional "false alarm". I suppose that these things can happen, although this was the first one that I have seen with AVG in several years of using it. The Jotti website is a good way to test them and see if there is a real problem or not. What I do have a problem with is the consistently rude retorts I get from Grisoft. I have had occasion to post on their forum twice, and both times the responses have been quite different from what I have encountered anywhere else on the Internet. I realize that this is a "free" product and therefore "you get what you pay for", but they do sell these products for commercial use, and the free-version users' reports of problems should help them make the product better and more marketable. I really think that the "free" users are helping Grisoft, not hindering them, and deserve just a little bit better response.
I will leave it to readers to have a look at the complete forum posting from this issue and make up their own minds as to whether this is warranted or not.
This should be the end of the story here, but not quite. After downloading the second definition file mentioned above, I also downloaded it for the AVG installation on our Windows XP PC and ran a scan. It claimed that I had a Generic5 trojan and deleted the file. Guess what file it was? It was an archived copy of aaw2007.exe that I had saved - the installer file for Lavasoft Ad-Aware, a competing anti-spyware product. I had the installer file cached on my drive, but also saved on CD, so it is no loss to have it deleted. Ad-Aware works fine and isn't a trojan. I think the folks at Grisoft are having a busy week.
We have been using the native e-mail program that comes with Ubuntu, called Evolution 2.10.1 for three months now. It generally sends and receives e-mail okay, but it does have some drawbacks. It is designed to work very much like MS Outlook in that it not only handles e-mail and has an address book, but also has a complete calendar function, too.
One of the main problems has been that Evolution doesn't seem to organize messages in a way that I can understand easily. No matter how I order them, I can't find messages when I go back to look for them. Ruth likes the calendar function, but I never use that feature. Also, we never found a way to import our existing Windows Address Book, and so we just started collecting addresses on the Ubuntu PC from scratch, cribbing them from the Windows PC when needed. Not totally ideal.
There is something about the Evolution interface that is just a bit stark, too. It looks bare and for some reason that makes it hard to find things at a glance. Overall it works, but neither of us has been totally happy with it.
So we decided to try an alternative, called Thunderbird. It is open source software available from Mozilla, the same team who bring you the highly successful Firefox web browser. The Ubuntu repository download for Thunderbird is about 6 MB. It installs easily and can be quickly set up.
Thunderbird is a basic e-mail application - just e-mail, no calendars. I was able to import my WAB address book from the Windows PC by copying it into a text document and then editing the text file to fit the "tab-delimited" format that Thunderbird wanted. Then it imported fine! That was a break - we now have a complete address book.
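For anyone attempting the same trick, the comma-to-tab conversion can also be scripted rather than done by hand. This is only a sketch under the assumption that the address book was exported as a comma-separated text file (the `csv_to_tab` name and file paths are my own); using Python's `csv` module means quoted fields that contain commas survive intact, which a plain search-and-replace would mangle.

```python
import csv


def csv_to_tab(src_path, dest_path):
    """Rewrite a comma-separated address book export in the
    tab-delimited format that Thunderbird's importer expects."""
    with open(src_path, newline="") as src, \
         open(dest_path, "w", newline="") as dest:
        reader = csv.reader(src)          # handles quoted, comma-bearing fields
        writer = csv.writer(dest, delimiter="\t")
        for row in reader:
            writer.writerow(row)
```

Run it on the exported file, then point Thunderbird's import function at the resulting tab-delimited file.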
In use Thunderbird looks very much like MS Outlook Express, but without all the vulnerabilities. It is quite customizable, but most of the available interfaces look like what Windows users are used to. There are many plug-ins to further customize it to suit individual tastes.
My only initial complaint about Thunderbird was that it would not go and get e-mail on a user-defined schedule (i.e. every five minutes, etc.). This turned out to be just that I hadn't found the setting to do so. It actually has two settings: you can have it check for e-mail on a user-defined schedule (the default is every ten minutes) and then notify you so you can download it when you are ready, or it can download automatically. This is actually a great feature, as it prevents the e-mail application from automatically downloading large messages in the middle of other downloads. On dial-up this can be a real problem if a large e-mail is being downloaded while you are trying to do something else. As far as the calendar function goes, Ruth still has Evolution for that if needed and she is happy with it.
The Thunderbird interface is much better than Evolution's, the ordering of e-mail messages seems to work much better and the search functions seem to work well, too. Interestingly, the version in the Ubuntu repository is a little behind the current download version for Windows, so I am guessing that the Ubuntu version will catch up soon.
Overall we are both pleased with the way Thunderbird seems to be working and would recommend it as an alternative to Evolution. For us, it does improve the functionality of Ubuntu.
Incidentally, as part of our Windows PC Open Source/Freeware Project we have also downloaded Thunderbird onto our Windows box. The newer version available there seems to work even better, and in that environment it is a great replacement for MS Outlook Express. It also helps with interoperability to have the same application and the same address book running on both PCs.
We have been working with the image editing tool that comes with Ubuntu. It is called Gimp 2.2, which stands for "GNU Image Manipulation Program".
Gimp is designed to be a free, open source competitor to PhotoShop issued under the GNU Public Licence. It works really well, in fact it has some features that either PhotoShop doesn't have or are more flexible in Gimp, for instance:
These two items alone make Gimp better for working with photographs than PhotoShop, which is mostly what we use these types of applications for. There may be some things that PhotoShop does better than Gimp, but we haven't discovered them yet.
We have both done some work creating original images in Gimp and it generally is pretty easy to figure out how to make it do things. We are still learning the subtleties on the program. Of note, there is an amazingly complete and very clear illustrated user manual on the Gimp website.
Ubuntu comes with Gimp installed from the start. This means that it interacts well with the Gnome desktop. This isn't always the case with PhotoShop on Windows - I had to remove it from one Windows XP PC because it was messing up Acrobat, of all things! Given the cost of buying PhotoShop for Windows, and the fact that Gimp seems to work at least as well, it is a great deal!
Here is some even better news - Gimp is available for Windows and Mac as well, so if you need a better image application then go download it!
Kudos to the Gimp team for building a great piece of software and to the folks at Canonical for deciding to include it with Ubuntu!
There have been many debates on the Ubuntu Forum about viruses and virus protection for Linux systems. Of course far too often you will hear answers like "Dude - no one writes Linux viruses". Back in 29 April 2007 - Making Progress With Ubuntu (or something like that) I said that you do need anti-virus for two reasons:
Well that day is here. Read these:
So if you are running any Linux system - get some virus protection. There will be more of these. We are using AVG and it works pretty well and it is free. Best of all the company is very fast on updating virus definitions - there are new definitions almost every day.
The last week or two have been a bit odd in the world of Ubuntu - at least for us.
The first odd situation was our videos. After downloading an Ubuntu update we found all our movies suddenly looked overexposed. It didn't matter whether we were watching a DVD, a QuickTime movie or even an MPEG - they were all far too light. We have two video applications, Totem and gXine, and they were both the same. I thought perhaps this would get fixed quickly by a further update, but it wasn't, at least not yet.
I tried looking for some information on this snag on the various FAQ sites for Ubuntu and the video applications that we have, but didn't find anything useful. An entry I found on the Ubuntu Forum simply suggested adjusting the contrast from the default "50%" down to "25%". This worked fine and the videos are all now normal.
It seems like a very easy fix, but doesn't answer the question: were the defaults changed by the update (I had never checked before, as the pictures were fine) or has the way video is displayed changed somehow?
A second problem arose with a new Linux kernel release last week. We have never had any problems recording data CDs and after this update it just stopped. I could play data CDs and video CDs and DVDs fine, but it would not record any. Since we use CDs to do our weekly document back-ups this was a real problem. It wouldn't erase and write on CD-RWs, but left the existing data intact, just giving an error. On CD-Rs it either said it couldn't write to the disc and gave an error or it said the job was completed and then when I checked, the disc was blank. In the case of the CD-Rs it ruined the discs too - both PCs failed to even detect the disc after attempting to write it.
A check of the Ubuntu Forum showed that kernel update 16 had caused thousands of problems for users, ranging from failure to boot to loss of all kinds of functions. When I last checked there were 54 pages of complaints there and few solutions, except to boot back to kernel 15. I guess we are lucky that we just lost CD writing when we downloaded this update. Others suffered far worse.
I found a work-around on the back-up CD issue. In this case I did a back-up to my USB jump-drive (that still works) and then took it over to our Windows XP PC and then wrote the files to a CD-RW there. I also created a "snapshot" CD-R disc the same way. Every six months I do these, just to provide a record of what was on the PC at that time. Of course relying on Windows to do Ubuntu back-ups is beyond ironic! This issue was beyond annoying - it was a real problem.
Strangely this problem fixed itself. A few days later I downloaded another Ubuntu update, rebooted and suddenly CDs can be written once again. Weird and disconcerting, but at least it works.
Does all of this make me feel comfortable in recommending Ubuntu to anyone? Not at the present. Apparently even Dell Computers is having some second thoughts about offering Ubuntu as an alternative to Windows, or at least they are having doubts about the level of support they will provide if you buy one. It won't be the same as when you buy a Dell/Windows PC.
So at this point the only things that don't work are the scanner and the camera once again.
The problem seems to be what one Ubuntu Forum contributor labelled "system-breaking updates". His solution was "get a better distro". Another contributor said he wished he could, but all Linux distributions seem to suffer from the same problem.
There seems to be a disconnect here in the Linux world. I have found many articles such as these two with good things to say about Ubuntu:
These and many more state quite clearly that Ubuntu is a great replacement for Windows.
Then there are some counter-point articles:
These articles tell a different story and are good reading for that reason. They portray Linux in general as an operating system designed by geeks and hackers for their own use. It is flexible and fully customizable, but it is maintained by volunteers who set their own priorities. Newcomers are tolerated, but don't expect support, expect advice instead from the forums.
These form two completely different pictures of what to expect when you install Ubuntu. One says Ubuntu is a viable alternative to Windows for computer "users" - people who just want to get work done on a PC and not spend a lot of time trying to make the operating system work for them. The other says that Linux systems are for "geeks" - people who like spooging around with computers as a hobby and enjoy working out problems to make the PC do what they want it to. Which point of view is right? Is Ubuntu for "Geeks" or "Users"?
Where do we go for a tie-breaker? How about the company that develops, supports and distributes Ubuntu, Canonical?
Right on my Ubuntu PC under "About Ubuntu", the main introductory section is entitled "Ubuntu - Linux for Human Beings!". It then goes on to say under "The Difference":
"There are many different operating systems based on Linux...So what makes Ubuntu different? ...Ubuntu aims to create a distribution that provides an up-to-date and coherent Linux system for desktop and server computing... By focusing on quality, Ubuntu produces a robust and feature-rich computing environment that is suitable for use in both home and commercial environments. The project takes the time required to focus on finer details and is able to release a version featuring the latest and greatest of today's software once every 6 months...."
Canonical also says:
"Ubuntu 'Just Works' We've done all the hard work for you. Once Ubuntu is installed, all the basics are in place so that your system will be immediately usable."
I can only conclude from these words that Canonical is marketing Ubuntu mainly for "Users", not "Geeks". In fact it sounds like they are really targeting not just home users, but business users as well.
But does it "just work"? From our experiences so far it "sorta works, some of the time".
My current opinions:
So is Ubuntu a "geek" or "user" operating system?
It is a bit of both, but perhaps 75% "geek" and 25% "user" right now. Before it is really ready for the "user mainstream", peace will have to be made between the "geeks" who are building the system every day and the "users" trying to get work done. A "geek" system will never be anything but a "geek" system, but perhaps, just perhaps, with some guidance from Canonical, over time an operating system can be built that will keep both sides happy. That will require enough customization capability and room to maneuver for the "geeks", and no more "system-breaking updates" that stop the users from writing reports, précis and even great works of literature.
Where am I on Ubuntu this week? I think that I am still enjoying it. The challenges are interesting and the level of knowledge required is slowly turning me from a "user" into a "geek", after a fashion. I am actually enjoying spooging around with it. I am hand-coding all my web pages on it, which is pretty "geeky".
I am just glad that we still have a Windows XP box here for now, to get some things done when we need to.
Will Ubuntu grow into a system that I would be happy to recommend to my boss as a replacement for Windows (without having to replace the whole office staff with hackers)? One day, I hope.
Will that happen before 14 April 2009 when Microsoft ends mainstream support for Windows XP and hundreds of millions of users start looking for a new home? I sure hope so.
It isn't there yet.
I wanted to write about an application that comes with Ubuntu. I am sure no one writes much about it, because Gedit is just a lowly text editor. Who writes in praise of MS Notepad or MS Wordpad? But this one works so well that it deserves a mention.
Like all native Ubuntu applications Gedit is open source and developed, at least in part, by volunteers. The "about" function says that "gedit is a small and lightweight text editor for the GNOME Desktop". It even sounds humble.
Most text editors are very simple and just let you write in them. Gedit has some features that MS Notepad and Wordpad don't have, like spell checking! I have been using Gedit 2.18.1 for hand coding XHTML pages, like this one. In fact I am slowly replacing the whole More Spooge website with hand-coded XHTML and CSS pages, just for practice. Gedit is so much easier to code with because it recognizes the HTML and colour-codes it as you write. This makes picking up errors in the coding so much easier. Microsoft has nothing like it.
So, once again, my hat is off to the open source community. Gedit is a superior product to anything else I have used! It makes hand coding web pages fun!
Another point to Ubuntu over Windows.
Gedit even has its own website, if you want more information.
This is pretty interesting - we can now play encrypted DVDs on our Ubuntu PC! This was something we have wanted to do for a while, since we have no TV in the house and our Dell Ubuntu box came with a DVD player. Ubuntu doesn't come with the ability to play encrypted DVDs, as most commercial ones are.
I was still researching how to get it to handle DVDs when at our last Linux night class, our instructor, Scott Blayney, gave us a great and simple way to do this - legal too in Canada! He showed us a script that you can download called EasyUbuntu that works really well. Once the script is downloaded and installed you just configure it with the extras you want and it can enable lots of things, including DVD playback. Thanks Scott - it was a great course!
So tonight we tried out a DVD of the musical "Cats" that I borrowed from the Public Library and it worked perfectly. I guess we will be able to gather around the Ubuntu PC anytime to watch movies! The best part is that the software was all free and unencumbered by the DRM garbage that Windows Vista uses to playback DVDs. Definitely one up for Ubuntu!
We ran into a strange problem this past week with our digital camera. I tested it out a few weeks ago by downloading photos from it and then saving them. It all worked fine. I tried to do the same while our Windows PC was in the shop and it wouldn't do it - it just kept giving an error saying that Linux couldn't communicate with the kernel:
"An error occurred in the io-library ('Unspecified error'): Could not query kernel driver of device".
I checked all the documentation and found on the Ubuntu Forum that others had had a similar problem, but never after the camera had been working, only from the start. I tried all the suggested fixes and none of them worked. The change in operation seemed to be just after an update to Ubuntu was downloaded.
I tried both adding a plea for help to the old forum thread and also posting a new thread on the subject. I didn't really get any useful help, other than one person who suggested changing Linux distributions to one that doesn't have system breaking updates.
This all leaves me stuck as far as the camera goes - it won't work on Ubuntu and there is apparently no help available. All I can do is hope that some future update fixes it.
Getting our Windows XP PC back from the shop saved us, allowing us to download our photos from the camera (proving that it wasn't a camera issue), save them and post them on our website. Ironically, once they were backed up to CD we were able to transfer them to the Ubuntu box just fine.
Combined with the no-go on the scanner this is all not good. Overall Ubuntu works okay as long as you don't have a USB scanner or want to use your camera. I am hoping that since these issues have been reported as bugs that they will get fixed in the next year or two, before XP is no longer supported.
With our Ubuntu PC up and running well, we decided to send our Windows XP computer into the shop for some needed upgrades and fixes, including some more RAM. This gave us the chance to see what it would be like to be "Ubuntu-only" for a week. The XP computer went in on Saturday 12 May, so it has been five days now.
This period has given us a chance to try out some of the applications that we weren't 100% familiar with. Ruth wrote a Zuby newsletter on Scribus and saved it as a PDF. Saving as a PDF was a capability that we had always wished MS Publisher had incorporated and Scribus is a definite improvement over MS Publisher in that respect.
I did some playing around with GPS Drive and found that it needs some downloaded maps added, which was not a problem. It also needs some other programs for it to be able to download tracks from a GPS set. This is a bit silly as the program isn't much use without those and so they really ought to be included from the start. I have to figure out where they are and then load them.
Yesterday we discovered another "challenge". Our XP computer currently runs the scanner and the printer. We don't have the two PCs networked yet, so documents get passed via a USB device when needed for printing. With the XP box in the shop I thought I would try hooking up the scanner to the Ubuntu PC. No Go.
A check of the Ubuntu Forum turned up lots of traffic on this subject. It seems that USB scanners worked fine in earlier versions of Ubuntu (Edgy Eft and before), but the development team disabled the USB connection in Feisty Fawn, thus rendering these scanners unusable without some dramatic work-arounds (like running Windows through an emulator or recompiling the Linux kernel from an earlier version). I left a note on the forum page - hopefully this will get fixed in a future version of Ubuntu, or else there will be a lot of very unhappy people out there! One of the things I am doing is evaluating Ubuntu for office use, and this one is a "show-stopper". I have received some assurances on the forum page from the development and testing team that the next Linux kernel version will solve this problem, so that is heartening.
A recently solved problem on our list was printing business cards. Right now we are doing this on MS Publisher with pre-punched sheets of ten cards and it works great. There is a feature of Open Office Writer that is supposed to print cards, but I wasn't able to get it working on the Windows version of Open Office yet. In that version it seems to want to print just one card per page instead of multiples. I probably need to just fiddle with it some more to get it working.
I tried it on the version of Open Office on our Ubuntu PC and it works fine. It runs through a set-up to define the parameters of the card and then produces a matched sheet in a very neat way. I even saved it as a PDF - a nice feature of Open Office in general.
I'll have to play with that again in the Windows version of Open Office Writer, the Ubuntu version seems to handle business cards just fine.
So outstanding issues that remain for Ubuntu at this point are:
Otherwise this trial week of being Windows-free has been fine. I didn't try to run our printer off Ubuntu, but the forum indicates it shouldn't be too difficult to achieve that.
We had our second Ubuntu night-school class this past week and we are learning lots there. Our instructor, Scott Blayney, is very knowledgeable and has taken us to parts of the operating system that we were unlikely to go to on our own. We have one more class left in the series, in a couple of weeks.
Scott also demonstrated "Beryl", which is a visual effects program similar to the Windows Aero interface. To us it seems like pointless "Vista-style eye-candy", but if you want that sort of thing at least it runs on a lot less memory than Vista uses.
We have now been running Ubuntu for 22 days and are pretty impressed with it. Certainly for home use it is as good as Windows XP and infinitely better than the morass of Windows Vista, from our experiences playing with that operating system. Ubuntu does eliminate the need for the required Vista hardware upgrades and if you really want the funny visual effects of Vista they are available in Feisty Fawn via Beryl. Personally we can live without them.
I will write more when we either discover more issues or fixes or both!
Well, today the last of the "pre-operational" tasks was checked off and our new Ubuntu PC is no longer an experiment - it is in service! Given that the operating system is totally new to us, I consider it pretty good to have had all the issues resolved that quickly.
The last item to be "X"ed off the list was AVG anti-virus. This was actually the longest issue we worked on, as we started it last Sunday and finished a week later.
AVG works great on Windows and, as mentioned in earlier articles in this series, we wanted to have it on the Ubuntu PC as well. Ubuntu doesn't have AVG available as a package, so you have to go and get it on your own, using a command line. That was fine; it downloaded and installed easily and worked okay, too, with one exception - it wouldn't update. When the update button was pressed it would run for a while, seem to download some files and then give an error. Attempts to get it going again, by restarting it, closing and reopening the application or even rebooting the PC, just produced a message that the operation was in progress, although nothing was happening.
On the AVG forum I found a command to clear the error, although the same error came back each time the update was tried. It took a week on the AVG forum to get an answer. They had me run a command line to update AVG and it responded "permission denied". The AVG folks indicated that the problem was at my end, with permissions and groups, and left it at that.
So I tried running the same update command as a "Super User" (sudo) command and it updated. It turns out that AVG installs very differently from other applications, at least on Linux systems. Instead of just running, it creates a "group", and unless you are a member of that group you can't run the update function. So I added myself to the AVG group. Now it updates fine from the command line or the graphic interface.
AVG for Linux works quite differently from the Windows version, and not just in permissions and groups. For one thing it just scans the PC; there is no real-time e-mail scanning or other features. If it finds a virus it doesn't do anything about it - it just tells you. You then have to run a "cleaning" scan from a command line to get it to scrub the virus. Odd, but it works that way.
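For anyone hitting the same "permission denied" wall, the fix can be sketched in shell. Note that the group name "avg" and the avgupdate/avgscan command names here are assumptions based on our install, not a documented recipe - check /etc/group and your AVG documentation for the actual names on your system.

```shell
# Check whether the current user is already in the AVG group.
# ("avg" is an assumed group name - look in /etc/group for the real one.)
me="$(id -un)"
if id -nG "$me" | grep -qw avg; then
    echo "$me is already in the avg group"
else
    echo "run: sudo usermod -a -G avg $me   (then log out and back in)"
fi

# Once group membership is sorted out, updates and scans can be run from
# the command line (command names may differ by AVG version):
#   avgupdate            # fetch new virus definitions
#   avgscan /home        # report-only scan; a "cleaning" option removes finds
```

The key point is the log-out-and-back-in step: group membership is read at login, so the update keeps failing until you start a fresh session.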
That was the last item to clear. There are lots of installed applications to learn how to use, like Scribus (Ruth is designing a newsletter that may end up as a "Zuby page") and GPS Drive. Those are just learning items for us, however.
We need to learn a lot more about Linux as a system and how to make it work, too. Tomorrow is the first of three two-hour beginner's classes on Linux that we have signed up for through the Ottawa-Carleton School Board. This will be fun to do!
So to sum up our experiences in the last 13 days since we brought the box home: it has been interesting and quite a bit of fun to get it operational. We are happy with it as an operating system.
Ubuntu is pretty much ready to run "out of the box" if you are on high-speed and don't need AVG anti-virus or a web-design tool. Setting up dial-up, AVG and KompoZer were the only real challenges, although KompoZer itself was easy to install; AVG and getting the dial-up working took the longest and involved the most head-scratching.
The user forums were the most help, along with the other websites that people have posted. The Ubuntu Wiki was incomplete in the case of AVG and dial-up. Perhaps we can fix that problem ourselves?
Would we recommend Ubuntu to anyone looking for an operating system to replace Windows or Mac? Absolutely! It works at least as well as Windows XP and better in many respects. It has a very clean interface, is a smaller system so it is faster and comes with everything you need for free, no extras to buy, ever. From our experiences trying both, it is head-and-shoulders above Windows Vista. Ubuntu has pure simple functionality instead of Vista's restrictions, built-in spyware and needless glitz.
Working with the open-source community is also a totally different philosophy: lots of volunteerism and help available, instead of a profit motive.
I think the Ubuntu operating system and all its associated applications have a good future - the market failure of Vista has created lots of interest in Linux systems. Even Dell has announced that it is now offering new PCs with Ubuntu!
If you are thinking about making the switch, especially to avoid a downgrade to Windows Vista, then do some research and then give Ubuntu a try.
As of today we have had Ubuntu running on our second PC for 12 days and it is going well overall.
We decided to treat this PC as "experimental" until all the outstanding issues are dealt with and so it still carries its "experimental" reminder label. This has actually been a fun process, learning the new system and where we can't figure things out asking questions on the Ubuntu forum.
The forum support has been really amazing, anything we post gets an answer within minutes and the answers are generally right! It is nice to have that much support from the Ubuntu community available. You certainly don't get that with Windows.
So since the last update we have installed a number of additional free applications and tried them out. All but two of the applications were loaded with the Ubuntu Add/Remove application, a simple checklist of what you can get within the Ubuntu world. You just search the list, or even read through it by category, and see what is available. Each application has a brief description of what it does and how big it is. Then you just check off the box and the program is downloaded and installed by Ubuntu for you. Ubuntu even lets you know where on the menu it is now located! Pretty impressive.
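Under the hood, the Add/Remove tool is a front end to the same Debian-style package archive that the command-line tools use, so the same search-and-install steps can be sketched in a terminal. The package name "scribus" below is just an example, and the block is guarded so it degrades gracefully on systems without the apt tools:

```shell
# Search the package archive the same way Add/Remove does, then install.
if command -v apt-cache >/dev/null 2>&1; then
    apt-cache search scribus | head -n 3        # browse matching packages
    echo "to install: sudo apt-get install scribus"
else
    echo "apt tools not found (not a Debian/Ubuntu system)"
fi
```

Either route pulls from the same repositories, so packages installed from the command line show up in Add/Remove afterwards too.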
The new applications we have added are:
Functionally we have done lots with the new Ubuntu PC so far. We have:
So what is left on the list before the box is declared "operational" and the "experimental" tag comes off? Just one item at this point, and that is AVG anti-virus. It is installed and running, but won't update. I am working with the AVG forum staff to try to find out what the problem is and what can be done to fix it - just waiting for an answer there.
Once that is fixed, as long as there are no other outstanding items then it will be operational and our Windows PC can go into the shop for some upgrades.
We will write more in this series when we are actually operational and give some thoughts on Ubuntu overall.
This weekend was mostly a rainy one and so that gave us both the chance to do some work getting our new Ubuntu computer up and running. And we have actually made some good progress!
One of the key things I did was put a sign on the monitor to remind me that that PC is considered an "experimental project" and that it is not operational yet. This was to slow me down in the rush to get it up and running as fast as possible. It was too high an expectation.
First on the list was an easy task for Friday evening - see if we could take the browser bookmarks from our existing PC and install them in Firefox on the new one. Shouldn't be too hard? Internet Explorer saves its bookmarks (or "Favourites" as Microsoft calls them) as files, each one a separate file with a .url extension. I started by trying to do a bookmark import in Firefox, but that didn't work. So I went looking for the file where Firefox keeps the bookmarks to see if I could just drop them in there. Firefox doesn't work that way. Instead we discovered that IE can export the favourites into an .htm file, which can be transferred to the other computer on a Jump Drive and then installed through the import process. Pretty simple when you know how.
The next job was a little more complex - getting it on line via dial-up.
We had already pretty much proven that the internal software modem was not going to work due to lack of an available driver. The Ubuntu forum suggested that an external hardware modem was the answer. I found a nice US Robotics one downtown at Staples and even got a great discount on it by showing a competitor's ad. Apparently they will beat competitors' prices by 10%. Who knew?
The modem came with everything needed, except a cable to connect it to the serial port of the PC. A trip to our favourite computer fixer, Jim at PC Fixr, on Saturday morning turned up an RS 232 cable that would fit fine. He also pulled the internal modem out of the box for me so that wouldn't create a conflict.
So later that day, after reading everything possible about how to do the connection, we went ahead and hooked up the modem and tried configuring the dialer. The system dialer didn't work right (as we were warned in the Wiki), so the next step was to use pppconfig. There is a great page on Ubuntu Geek that outlines, step-by-step, how to configure pppconfig perfectly. That kind of help is invaluable! The only part that didn't work was finding the graphic interface, called Gnome-PPP. The problem turned out to be that we needed to download the program, which we couldn't do because we weren't on the Internet at that point. Gnome-PPP is available in the Ubuntu Add/Remove list, but you have to go and get it.
The good news is that when we rebooted the computer it dialed on line all by itself and connected, so the pppconfig worked! While the connection was great, the auto-connect-on-boot was odd. There was a warning about this in the Wiki, so we unchecked a few boxes in the set-up (the only part of that utility that works) and it stopped doing that. With the Gnome-PPP interface not installed we were left using the terminal command line, but that really isn't a problem - it is simple and gets us on line really fast. We have decided not to install Gnome-PPP for now, as we really don't need it.
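For reference, everything pppconfig writes ends up in a plain-text file under /etc/ppp/peers/. A sketch of what such a file looks like follows - every value below is a made-up example, not our real configuration; the actual file is generated from your ISP's details and your modem's serial port:

```
# /etc/ppp/peers/provider - generated by pppconfig (all values are examples)
hide-password
noauth
# chat script that dials the ISP's number
connect "/usr/sbin/chat -v -f /etc/chatscripts/provider"
# serial port the external modem is on, and line speed
/dev/ttyS0
115200
defaultroute
noipdefault
user "myisplogin"
remotename provider
```

With a peers file in place, `pon provider` dials the connection and `poff` hangs up - the same two terminal commands we now use instead of Gnome-PPP.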
The US Robotics modem seems to work flawlessly and produces speeds comparable to our XP computer, so we can't complain.
The first thing we did on line was set up Firefox and download the most recent list of software packages available for Ubuntu. That is one of the beauties of this system - there are thousands of software releases available, all free! On dial-up the downloads take a while, but we just set them going and then go and play Scrabble or, for a long download, leave it on all night.
We also downloaded our first Feisty Fawn update and also got the e-mail program, Evolution, configured and working on line. We even got our first e-mail and it wasn't spam!
Last night we downloaded the K-Stars Astronomy program. Ruth tested it out and it works well. It is very similar to Starry Night, except for the price!
Next I downloaded a GPS mapping program called GPS Drive. I haven't had a chance to run it yet but it opens fine.
Next we downloaded Scribus, which is a publishing program. This is very similar to QuarkXPress. In fact it has some better features than Quark - for instance it can save anything as a PDF, so it is perfect for making newsletters.
On the list to download soon are:
Some people claim that Linux is immune to viruses and that no one writes viruses for Linux anyway. Even if that were true, having anti-virus is good for two reasons:
So we still have some things to do before the Ubuntu computer gets its "experimental" label removed and is deemed "operational", but we aren't far off now.
What are my impressions of Ubuntu at this point? So far I am pretty impressed. Other than making the dial-up work there have been no problems; basically everything worked on installation. Ubuntu 7.04 Feisty Fawn seems to be every bit as stable and functional as Windows XP and far better than Windows Vista. The main difference is that the operating system is cleaner, faster and far better laid out for use. For us the best part is the lack of Vista splash and glitz - instead it is fast and simple.
Programmers and computer geeks love systems that require command interfaces and lots of tweaking and testing. Most average computer users don't appreciate those sorts of things. They just want something that works out of the box, has the applications that they need and allows them to get on with actually doing things. In a way, when a computer system really works it allows the user to focus on creating content (writing books, web pages, newsletters, studies, papers, proposals, etc) instead of focusing on how the computer works or getting it to work.
By that measure I think we both agree that Ubuntu is a winner. It works and allows us to get on with doing things with it. For the price that is a bargain!
I am typing this on our second computer, one that is running the Ubuntu operating system. We have been using this now for three days and wanted to pass on our early impressions:
Unlike Windows, Ubuntu does not take ages to start up or shut down. That's not to say we merely flick a switch and shut it off, but it does mean that shutting the computer down properly takes only a second. Who hasn't tapped their fingers in mild irritation while Windows asks you a million times if you're absolutely really completely sure you really want to shut down, delete a file or change a file's extension?
Unlike Windows, Ubuntu does not need to junk up the HD with huge and unnecessary desktop image files and endlessly banal and thoroughly stupid tones for every single minor function you do. Who wants to hear robotic bleeps, bloops and stupid dings every time you open a file, run a program, sneeze or decide where you want to go for lunch that day? It's insulting, to say the very least. With Windows you can turn the tones off; with Ubuntu they start off as "off".
We use computers to write documents, design web pages, practice XHTML coding, store photos and even to keep our finances organized. We have less than zero interest in upgrading to the latest greatest techno-device which can help us write documents while playing soft music, with a pleasing and gentle background and/or with the ability to play songs backwards. We are free of the upgrade vortex and free of the silly looks of disdain from others merely because we're running an old version of Windows and not the latest.
The reason Windows takes forever to do anything is all the RAM it uses just to load a web page, open an application, a picture or, indeed, anything whatsoever. Our second computer has a 40 GB HD, and Ubuntu, being a small OS, takes almost no memory (compared to Windows) to run what we need, so it is very fast.
There is no defragmenter with Ubuntu because it isn't needed. Ubuntu doesn't store files in a fragmented manner in the first place. That tells you something.
Using Ubuntu is SO similar to Windows that virtually anything we have on our XP computer can be run just as easily on our Ubuntu system. There is almost no learning curve in moving to Ubuntu – virtually everything looks similar and works in the same way as XP, just faster.
Here is a look at the applications that come with Ubuntu to give some idea of the equivalence to Windows XP:
There isn't a lot missing that you might want for basic computing!
Ubuntu also comes with applications that Windows doesn’t have. They include:
We are also planning to add:
That is not to say things are seamless and flawless because they aren't...well, not quite.
So...overall, I have been very impressed with this small but powerful OS. Yes, it has its limits but they are nothing compared to the mind-numbing choices, options and constraints seen in Windows. But, perhaps best of all, Ubuntu is open source and, thus, free! We just have to get our $(#$*^#$%^ dial up modem to work and then we'll be in business...
More to follow!
There are some very interesting things happening in the world of computer operating systems these days. Microsoft's new operating system, Windows Vista, was recently released and it has caused quite a stir in the software world. Not because it is highly acclaimed, but because it has been almost universally decried.
Maybe I should back up and give some of our own personal history here. Neither of us are programmers, but we consider ourselves fairly sophisticated computer users. I started computing in the 1970s on an IBM mainframe and graduated to working on a DEC mainframe in the 1980s. Ruth started coding with a Commodore 64. The first PC we had together ran Windows 3.1. We have experience with Windows 95 and between 1999 and 2004 we survived Windows 98 with all its "blue screens" and crashes. We guessed right and avoided Windows ME altogether.
In 2004 we had a shop custom-build us our current PC, running Windows XP, now XP SP2.
Altogether, XP has been a good experience and we have been happy with how it works and what it does. Of course mainstream support from Microsoft for XP is scheduled to cease on 14 April 2009, less than two years away.
After that date there will be no more security updates, unless you pay for the five more years of extended support available until 2014. After that it will gradually fade away as an unsupported operating system. Of course Microsoft would like us all to switch to their new system, Vista.
We have seen all the hype about Vista and that sounded the first note of caution. When a company spends US$500M on advertising an operating system, something starts to sound suspicious. That is a red flag to us. Good software sells itself.
So we waited until Vista had been released (to avoid Beta reviews) and then did some serious research. We've read hundreds of articles about Vista, almost all of them very negative. One of the most charitable called it "Microsoft's Suicide Note".
Analyzing what the problems are would be a very long essay so let me instead point out a few good sources to read:
There are a lot more, but perhaps you get the picture?
Since 90% of computers out there are running Windows at the present and the vast majority of those are running XP, something is going to happen in April 2009. At that point millions of users are going to have a choice:
We wanted to see Vista for ourselves and so dropped into the local computer discount shop and plinked away at Vista for a while. The PC we were on was a huge hardware upgrade from what we have now: 2.0 GHz processor, 2 GB of RAM and a 120 GB hard drive. It needs most of that to run Vista.
Vista moved slowly opening applications, while the distractions of the Aero interface sent windows tumbling all over the place. Our conclusion? Blech! All glitz and no substance. It doesn't do anything we need any better than XP, and it does lots of stuff we don't need, at the high price of buying new hardware to run the expensive new software. We couldn't run Vista on our current box; we would need new everything. Totally pointless.
What to do? Well, until 2009 there is no problem, XP is working fine and is fully supported, so we have two years, like everyone else, to decide.
More research; we found these articles:
Now this was something interesting. Linux-based Ubuntu is a free operating system, developed by a company called Canonical working with the volunteer developers of the open source community. How does Canonical make money giving away operating systems? They provide professional support, developing specialized applications for the software for commercial uses. It isn't just a "hook"; they have committed to providing Ubuntu for free permanently and have even funded an independent foundation to continue the system, even if Canonical disappears.
Ubuntu has attracted a very dedicated group of users and supporters and is currently the most popular of the 300-odd Linux operating systems available. There is a strong support community of help websites, forums and assistance available. People who use Ubuntu seem to want to give something back, so they become programmers contributing to the project, promoters, and help-wiki editors, helping newcomers and spreading the word that there are alternatives that work.
And not just the Ubuntu operating system is free, there is application software for almost everything you could want to do with a computer, all for free:
Etc etc etc...there are thousands of applications available - all free and all open source so you can modify them if you want to.
The more we read the more sense it made to switch to Ubuntu. Ubuntu uses a fraction of the memory that Windows does, so there is no need for hardware upgrades. Why give in to inferior products at high prices when better is available for free?
There are critics of Ubuntu - mostly it seems to centre around the fact that the system uses some proprietary drivers, which, to purists means it isn't 100% open source. We aren't purists.
So after all this reading we felt pretty good. There are good alternatives and no one is trapped "downgrading" to Vista. We could wait until 2009 for Ubuntu to get even better and then replace XP with it with one CD. But then we have been talking about getting a second computer for a while so we can both work at the same time.
We contacted our local computer fix-it shop and they offered to sell us a used corporate lease-return PC for a good price. We wrote the spec, they configured it and downloaded Ubuntu for us. This one will be an experiment for us.
We signed up for a Linux course at night school to learn more about how it works. The new unit will be here next week. We will write more when we have spent some time configuring it and using it.
If it works the plan is that the PC I am writing this on will migrate to Ubuntu after its XP days are done. Meanwhile we will compare them and see which one works better.
More to follow...
This web-page is licensed under a Creative Commons Licence.