Friday, December 19, 2014

Random acts of math!

Hi all,

Hope you are having a splendid Friday!

Just a quick note to let you know about a new blog started by yours truly.

Random Acts of Math is a fun little place to share some peculiar or interesting things from the world of math. I wanted some practice writing with the digitizer on my Galaxy Note Pro and thought posting some writings to a blog would be constructive and might help a few people.

Disclaimer: I'm not a math expert, nor do I have a math degree (I did take some math courses in university). Given my amateur status, always consult a professional before using anything you read on this blog for something important!

What about Jay's Desktop?

It hasn't been forgotten! Jay's Desktop will still be a place for me to write about things I find interesting ("a place for my stuff") when the mood strikes me. Think of Random Acts of Math as a "subset" of Jay's Desktop - specifically for math-related items and writing from my tablet :)


Tuesday, March 18, 2014

Ubuntu 12.04 Tips: Clearing out old kernels & SSD Trim


I've recently come across two Ubuntu/Linux tips that I wanted to share (and document). They are particularly important if you run Ubuntu on a machine without a lot of spare hard disk space. In my case, I have a hybrid hard drive with a 24 GB SSD partition and a 750 GB data partition (Ubuntu is installed on the SSD partition for obvious reasons, while most of /home uses symbolic links to the data partition). Over the last year or so, I've watched my SSD partition steadily fill up, from about 40% to 73%. Fearing I would soon run out of room, I did some research into whether this was merely from system updates and installed software, or something else. I'd also noticed my machine responding much more slowly than when I first set it up a year ago, and had tried several things to improve performance without much result. I feared this might be related to the lack of free space on the SSD partition as well, and I was sort of right.

These tips might be applicable to other Linux distributions as well. As always, use at your own risk!

1) Clearing out old kernel versions

Ubuntu keeps old kernels hanging around after you install new ones via auto-update. They can take up quite a bit of room. There are some good reasons for keeping old kernels around (e.g. reverting if a kernel update breaks something). But it's unlikely you'll need all of them.

Here's a simple little command line to clear out the old ones.

sudo apt-get purge $(dpkg -l linux-{image,headers}-"[0-9]*" | awk '/ii/{print $2}' | grep -ve "$(uname -r | sed -r 's/-[a-z]+//')")

It worked for me, clearing out a good 6 GB or so of old kernel files, which made a big difference on my 24 GB SSD (I went from 73% full to 45% full). I'd recommend only doing this after you've confirmed that the newest kernel works, and even then, you might want to modify the command slightly to keep the second-newest kernel just in case.
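Before purging anything, it's worth reviewing what's actually installed. Here's a small sketch (the package-name patterns and the sed expression mirror the one-liner above) that lists the installed kernel packages and flags the running one:

```shell
# List installed kernel image/header packages and mark the running
# kernel so you know what the purge one-liner will keep.
current=$(uname -r | sed -r 's/-[a-z]+//')   # e.g. 3.2.0-58-generic -> 3.2.0-58
dpkg -l 'linux-image-[0-9]*' 'linux-headers-[0-9]*' 2>/dev/null \
  | awk '/^ii/{print $2}' \
  | while read -r pkg; do
      case "$pkg" in
        *"$current"*) echo "$pkg   <-- running kernel, will be kept" ;;
        *)            echo "$pkg" ;;
      esac
    done
```

Anything printed without the marker is what the purge command would remove.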

This also made a big difference on my /boot partition, which is about 500 MB. In fact, this is what led me here: I'd been getting error messages at boot about /boot being nearly full and went to investigate. After removing the extra kernels, I'm back down to only 29 MB used on /boot. Nice!


2) SSD Trim

If you use Ubuntu on an SSD (as I do), performance will degrade over time unless you periodically run the 'fstrim' command (e.g. in a daily cron job) to send the SSD delete commands for removed files. I definitely noticed a drop in performance over the last year, and had been trying to diagnose it when I came across this little gem.

Since I started running fstrim, I've definitely noticed an improvement. The first time I ran it, it sent about 7 GB worth of "deletes". Automatic TRIM is supposed to be added in 14.04, but it's not there as of 13.10 (or 12.04, which is what I use).
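Here's a minimal sketch of such a daily cron job, created from the terminal. The script path, log file, and mount point are my own choices, so adjust them to your setup; scripts in /etc/cron.daily run as root, which fstrim requires.

```shell
# Build the cron script locally, then move it into place with sudo.
cat > fstrim.daily <<'EOF'
#!/bin/sh
# Trim the SSD-backed root filesystem; -v logs how much was discarded.
LOG=/var/log/fstrim.log
echo "*** $(date -R) ***" >> "$LOG"
fstrim -v / >> "$LOG" 2>&1
EOF
chmod +x fstrim.daily
# Then: sudo mv fstrim.daily /etc/cron.daily/fstrim
```

Checking the log after a day or two is a nice way to confirm the job is actually running.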

Full instructions and information are in the link below. Enjoy!


Friday, March 7, 2014

Linux Conversion for ASUS S56C (Part 1 - Windows 8 Backup)

Hello everyone!

This post was a long time coming. I started it last April, but got distracted with life and happiness (you know, those non-computer-related things). Anyway, last April I picked up an "Ultrabook-style" laptop to serve as my daily machine (a snazzy ASUS S56C). This blog post chronicles converting the machine to a Linux-based one: the steps I had to go through, as well as any tips and tricks I discovered along the way. Note that I was using Windows 8.0 at the time, so the newer 8.1 update may have fixed some of the issues I ran into.

Backing up Windows 8

I knew from the beginning that this would be a dedicated Linux machine. My first impressions on trying to use Windows 8 were basically that it was a pile of insanity served in a bowl of nonsense (I'm not biased or anything, I swear :)). I'm sure I could get used to Windows 8 eventually, if I had to, but I would probably do some tweaks to get a more traditional desktop feel.

But since I knew Windows wouldn't be staying, I didn't invest too much time in that. However, I felt it was important to protect my investment by making sure I had a backup of the pre-installed Windows, in case I ever needed to restore it.

Sadly, even this was more complicated than I'm used to, so I thought it prudent to start the conversion blog with some helpful tips on backing up the original system.

Recovery Discs

I'm very used to making recovery discs on older systems; in fact, I encourage everybody using computers to have some sort of disaster-recovery mechanism in place (including, but not limited to, recovery discs).

Normally, these recovery disc tools are provided by the computer manufacturer. They often take the form of a "recovery partition" on your hard drive, although in my experience you can also make recovery discs in case the partition gets removed (or corrupted). These discs usually serve to restore the partition in that case.

That's all fine and dandy, but I had a hard time finding the mechanism to create recovery discs. There didn't seem to be a manufacturer-provided tool, and indeed there wasn't. However, with some Googling, I found out how to create a recovery drive from within Windows 8. That's right: you don't seem to be able to use discs any more; instead, you create a recovery flash drive.

To get to the tool, open your "Charms" bar (move the cursor to one of the screen's four corners) and select the "Search" option (it looks like a magnifying glass). In the Search Bar, type "recovery".

You might be surprised, as I was, to find zero results. That's because search results are partitioned into categories, and by default you are only searching the "Apps" category. To find the recovery drive tool, you need to search the "Settings" category; do that by clicking the "Settings" button. Personally, I would consider the tool an "App" and not a "Setting", but what do I know? :)

Once you search for "recovery" in Settings, you should find a link called "Create a recovery drive". You'll need a flash drive of at least 16 GB. Thankfully, these aren't too expensive nowadays (around $9.99 CDN here).

After that, follow the instructions to create the recovery drive. NOTE: You sadly won't be able to boot the recovery drive without some additional steps, but we'll get to that in Part 2.

Create a System Image

If you are familiar with Windows 7, you know that you can create a complete Windows 7 system image to restore to an alternate drive (for example, if you suffer a hard drive malfunction).

This tool still exists in Windows 8, but it's hidden in an even more obscure location. Like before, go to the search menu and type "recovery" (in the Settings category). Look for an option called "Windows 7 File Recovery". "Windows 7??", I hear you asking. Yes, Windows 7. To me, a tool called "Windows 7 File Recovery" would be some sort of tool for recovering files from Windows 7 (perhaps a backup). And, indeed, it is. But it's also where they decided to hide the System Image tool (used for Windows 8). Strange, but let's continue...

In the Windows 7 File Recovery menu, there are two links on the side:
1) Create a System Image
2) Create a System Repair Disc

We're going to use the first one.

Broken out of the box

Sadly, it turns out the System Image tool is broken out of the box. If you run the tool and follow the instructions to put the image on DVD, you may get a rather cryptic and unhelpful error message:

"The backup failed. The drive cannot find the sector requested. (0x8007001b)."

The error actually refers to the fact that the disc isn't formatted. You could probably format the disc manually and it would work fine, but you'd think the tool would do that for you, no?

Well, it does, as long as you install the update to fix it. Installing all the recent updates (which is a good idea before doing a System Image anyway) should repair it, or if you are the impatient type, here's a link to the Knowledge Base article which will help you download the specific patch:

After you are up to date, click the link to create the system image and follow the instructions. You can choose an external hard drive, or "one or more DVDs". I opted for the DVD option (on DVD-RWs, so as to not be deleted by accident; their being unformatted is what led to the error above). On a fresh out-of-the-box system, it took me four DVDs. Don't forget to label them!

Create a system repair disc

Last but not least, you should probably create a System Repair Disc. The system image you made above is useless without a tool to load it back onto your hard drive. The system repair disc might also come in handy for other reasons: one of its options is to restore the machine from a system image. However, I won't go into those details here.

Note: TechRepublic posted a great article here which goes into far more detail on these steps than I have above. It suggests that creating a System Repair Disc and a Recovery Drive are effectively the same thing, and that you probably don't need to do both, although I find it a little strange that the Recovery Drive takes up most of a 16 GB flash drive while the System Repair Disc fits on a single DVD. Regardless, I felt more comfortable having both, so I created both.

A few other tips:
1) Feel free to check for (and install) any BIOS updates before removing Windows. There's a handy Windows-based tool for updating your BIOS, though my machine was up to date out of the box. You can probably also flash from within the BIOS itself, but I'm not 100% sure.

2) You might get bugged after a few boots to register your system. This is probably a good idea if you want your manufacturer warranty.

3) You can get to the recovery partition by holding down "F9" on boot. As far as I can tell, the interface is very similar to the Repair Disc and the Recovery drive interface.

That's it for now! In Part 2, I'll show you what BIOS settings you need to change to boot from USB or DVD, whether for restoring one of your backups or installing Linux. All the best!

Tuesday, July 23, 2013

Be wary of a solely app-centric ecosystem

Happy Tuesday everybody!

I recently switched my trusty Nokia N900 for the more mainstream (but not as geeky) Samsung Galaxy S3.

Don't get me wrong, I really loved my N900. I used it as my primary cell phone for over three years. But sadly, it was supported by only one carrier where I live, a carrier which was gouging me for a very basic plan (I didn't even have data). So when a new carrier launched locally which offered everything I already had on my plan (plus data) for a little over half the cost, I could no longer ignore the economic argument of switching.

I'd done a fair bit of research on the S3 (and Android in general) before switching, as I wanted to make an informed decision. The S3 was appealing since it was on sale for $99 with a two-year agreement (recent changes in the law allow people to quit contracts early simply by paying off the device balance, which I think is fair). I was also lucky enough to get the $99 purchase fee waived as a special opening-day offer, so I effectively got the phone for free. I also considered the S4, but the few extra features it has over the S3 really didn't seem to justify the cost (economic argument wins again). So far, I've been mostly happy with the S3.

(N900 purists: don't despair! While it may no longer be my primary phone, my N900 shall not go to waste, as it is a truly wonderful device. I'm already working on plans to re-purpose it as a dedicated media player and/or web server).

In any case, this blog post is not about comparing the merits of the N900 vs the Galaxy S3. Instead, it's about a possibly disturbing trend I've noticed since switching over to the S3.

The nature of "apps"

One of the biggest selling points of mobile devices is the size of the "App Store", i.e. what kind of 3rd-party applications can be added to the device to add more features.

Apps are, of course, nothing new. Ever since the early days of computers, people have bought them not just for the software that comes included, but for the software which can be added after the fact. Back in the day, we simply called them "programs" or "software". This became synonymous with "application", which was eventually shortened to just "app".

The distribution of 3rd-party applications has changed as well since the introduction of mobile operating systems. Originally, software was produced on physical media (CDs, floppy disks, etc.) and bought at brick-and-mortar stores; the user put the media into their computer and installed the software. With the rise of the Internet, it's become much easier to simply transfer the software electronically and cut out the middleman. Even in the early Internet, there were many sites dedicated to downloadable software. The idea of an app store basically builds this into the operating system itself (Linux distributions, of course, long ago introduced this as the software repository).

Why apps are good

That's all fine and dandy. App stores make it a lot easier for the application developers to get their applications into the hands of customers, while making it easier for customers to get the applications.

Apps also tend to be more tailored to the specific hardware or platform. This can (although doesn't necessarily) mean that the software is better tested before release, and thus less buggy. If a company writes both the software serving the information and the client interpreting it, it can do a better job of making sure the protocol works end to end, and its application will work better since it won't have to rely on potentially buggy third-party clients that detract from the service.

Why apps are bad

In the early days of the Internet, it was well established that the protocols which distributed information over the Internet (HTTP, FTP, POP, etc.) were publicly published and well understood. That meant there existed a common language spoken by both client and server: the server used a specific protocol to provide the information, and anybody could read the protocol specification and write a client to retrieve and display that information. For example, a web server is written which speaks HTTP, and a client is written which also speaks HTTP. This had two benefits: 1) anybody could write a client to interpret the protocol, and develop it as they see fit; 2) a single client could interpret many different types of information (e.g. over HTTP) from many sources, without the need for thousands of protocols to be developed. After all, every protocol needs a client. Imagine if every single website on the Internet required a separate web browser, or if a single web browser was thousands of times bigger because it had to support thousands of different protocols. Chaos, I say, chaos.

And yet, many apps seem to be taking this approach. Even organizations which are served well by nothing more than a website are creating tailored applications instead of expecting users to access the site through a web browser.

In other words, apps are encouraging proprietary protocols, read by specific clients, instead of open protocols whose clients can be written by anybody. This, in and of itself, isn't a bad thing, for the reasons I mentioned above.

The concerning thing is what happens if a prominent or well-used service decides to drop support for the public protocol (e.g. their website) and only support the proprietary one. Then, in order to access the service, you have a dependency on their client, which in turn depends on having the platform their client runs on. For people who like to develop their own custom clients, or run custom platforms, this wouldn't be acceptable.

While this trend started with mobile devices, it seems to be migrating over to more traditional computers as well. For example, even Windows 8 encourages software to be obtained via an app store rather than accessing the information through a common protocol via a web browser.

It also means that you have to have more software on your machine, which consumes more resources. I do understand creating custom clients (apps) for things which need highly customized protocols (especially ones optimized for speed, e.g. gaming protocols), but there are a lot of organizations out there developing apps for information which, in my opinion, simply doesn't need them and would be just as well served via a web browser. To be fair, of the ones I'm aware of, none have discontinued the public interface; they have simply added an app-based one to enhance access to the service.


So as long as the app isn't the only way to access the information, we shouldn't have an issue. But maintaining two separate protocols (a public and a proprietary one) is costly and resource-consuming, so one can see the argument for switching to only one. And given the benefits of a proprietary protocol and client I mentioned above, it's easy to see why it would be tempting to go that route.

In any case, this is mostly food for thought, but something I'll continue to be wary of in the future. Hopefully there is room for both private and public protocols to exist side by side. If not, there are ways to deal with a lack of public protocols, such as virtual machines. I'm also encouraged by the fact that platforms like Android are based on open-source principles, which tend to make them easier to virtualize if necessary, unlike other platforms.

Monday, November 12, 2012

Setting up multiple Firefox profiles in Ubuntu 12.04 and Windows 7


One of the more underrated features of the Firefox Web Browser is the ability to set up different user profiles.

This feature comes in quite handy when you have different people using the same computer, but don't require the overhead of setting up separate user accounts for each individual using it.

Setting up separate profiles allows multiple people to have Firefox configured the way they like it: with different bookmarks, saved passwords,  personal history, and so on.

Plenty of people like to keep their browsers signed in to the services they use for convenience (not having to keep signing back in), but when multiple people share a computer, signing people in and out all the time can be a hassle. Enter profiles: a unique "Firefox" (so to speak) for each person. On my system, I have three profiles: one for myself, one for my better half, and one for guests, which uses Firefox's private browsing mode so their personal passwords, etc., are not tracked. You could even have different profiles for work and personal use, if you choose.

Firefox profiles are very easy to set up and manage. In today's post, I'll show you how to set them up on both Ubuntu 12.04 and Windows 7.

Ubuntu 12.04

I'll start off with a few assumptions: you already have one Firefox profile set up and a Firefox shortcut in your Unity Launcher (or other launcher of your choice). This is how Ubuntu comes by default, so it's not too crazy to assume. If for some reason this isn't the case, you can easily install Firefox with the Ubuntu Software Center. Also, we'll assume that your secondary profiles (for other users) will also be accessed from the Unity launcher.

Creating new profiles

Before adding the new Firefox profiles to the Unity launcher, we'll have to set them up in Firefox. Launch a Terminal and, from the command line, run:

firefox -no-remote -ProfileManager

Most likely, you already have one profile named something like "Default". Feel free to keep this named Default, or rename it to your own name. Just remember whatever name you give it.

To make a new profile, click the "Create Profile" button:

Firefox will give you some information about profiles (pretty much what I summed up above); click "Next" and enter a name for the new profile. You can also change the directory where the profile is stored, although the default will most likely be sufficient.

Create as many profiles as you desire, remembering the name for each. When you are done, close the Profile Manager.

Setting up different launchers for each Profile

There are actually two ways you could go about this. You could keep using your single Firefox launcher, but have it ask on launch which profile to use: from the Profile Manager, simply uncheck "Don't ask at startup". This will let you pick the profile each time you start Firefox.

But personally, I prefer to have a separate Unity launcher for each profile. This allows each person to access their own profile with a single click. This is pretty easy to set up, but requires some more Terminal work. Basically, we're going to change the default launcher to open one profile (your own, for instance), then make new launchers for the other profiles.

From Terminal, navigate to /usr/share/applications:

cd /usr/share/applications

Here you'll find all (or at least most) of the launchers used by Ubuntu and Unity. There should be one for Firefox called "firefox.desktop". Open it in your favourite text editor (with root access).

There is a lot of stuff in here, but you only have to make a very minor change or two. Look in the file for the lines beginning with "Exec=". This is the command which is executed when the launcher is run. For example:

Exec=firefox %u

To make it launch a specific profile, use the -P flag:

Exec=firefox -no-remote -P Jason %u

The "-no-remote" flag causes Firefox to launch its own process rather than trying to connect to an existing one. This way, we can have different Firefox sessions, each with its own profile, open simultaneously. Make the change to all Exec lines in the file and save.
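If you'd rather not hand-edit each line, a sed one-liner can update every Exec line at once. Here's a sketch demonstrated on a throwaway sample file (for the real thing, run the sed command on /usr/share/applications/firefox.desktop with sudo; the "Jason" profile name is from the example above):

```shell
# Make a small sample file standing in for firefox.desktop.
printf 'Exec=firefox %%u\nExec=firefox -new-window\n' > fx.desktop
# Insert the profile flags after every "Exec=firefox".
sed -i 's/^Exec=firefox/Exec=firefox -no-remote -P Jason/' fx.desktop
cat fx.desktop
```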

Pro-Tip: You might notice upon saving that the Firefox icon disappears from the Unity launcher. If it does, no sweat. Just open /usr/share/applications in Nautilus, find the "firefox.desktop" file, and drag it over to the Unity launcher. Keep Nautilus open since you'll need to do this for subsequent launchers.

To make a new launcher for another profile, just copy the firefox.desktop file to a different name, e.g. "profile2_firefox.desktop". Open it in a text editor and make the same changes to the "Exec" line. But don't save yet.

In order for Unity to isolate this launcher from the original, it's important that you also change the "Name=" line. For example, change:

Name=Firefox Web Browser

to something more personal like:

Name=Jason's Firefox
Do this for all instances of "Name".

You can also change the icon, so there aren't two identical Firefox icons in the Launcher. To do this, change the "Icon=" line to something else. One icon I like looks like a small picture of a planet, but feel free to poke around /usr/share/icons and find one you like better!

Finally, once you've saved the launcher, don't forget to go to /usr/share/applications in Nautilus and drag it over to the Unity launcher!

That's about it; just repeat the process for each profile you want. If you'd like a "Private" session (to avoid tracking passwords, etc.), add the "-private" flag to your Exec line.
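Putting the pieces together, a hypothetical launcher for a second profile might look something like this (the profile name "Alice" and the trimmed-down field list are my own illustration; your copied firefox.desktop will have many more lines, which you can leave alone):

```ini
[Desktop Entry]
Name=Alice's Firefox
Exec=firefox -no-remote -P Alice %u
Terminal=false
Type=Application
Icon=firefox
Categories=Network;WebBrowser;
```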

You can also keep a "default" Firefox icon if you wish. This will launch whichever profile was last selected in the Profile Manager. You might want this if you frequently launch links from other applications (e.g. Thunderbird). In that case, just keep the default Firefox icon pointing at your most frequently used profile (your own), and use the alternate icons to launch the other profiles.

Pro-Tip: I also set up a keyboard shortcut to launch the Guest Firefox session. To do this, just make a script (in the "Home", or "Home/Scripts" folder for example) with the following content:

#!/bin/sh
firefox -P Guest -no-remote -private

Replace "Guest" with whatever you might have named the Guest profile. Save, then right-click on the file, go to "Properties", then the "Permissions" tab, and check "Allow executing this file as a program".
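If you prefer the terminal, the script can be created and marked executable in one go (the ~/Scripts location and file name are just my choices):

```shell
mkdir -p ~/Scripts
printf '%s\n' '#!/bin/sh' 'firefox -P Guest -no-remote -private' \
    > ~/Scripts/firefox-guest
# Same effect as checking "Allow executing this file as a program":
chmod +x ~/Scripts/firefox-guest
```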

Now, launch the HUD and type "keyboard". Open the Keyboard settings and go to "Shortcuts". Under "Custom Shortcuts", click the "+" button. Give it a name, point the command to your script, and click Apply. To activate the new shortcut, click where it says "Disabled", then hit the key combination you want to trigger it (for example, CTRL-ALT-P). Now try hitting CTRL-ALT-P and you should see the private Guest profile launch.

Windows 7

The process in Windows 7 is quite similar, but seems to be less flexible than in Ubuntu.

To open the ProfileManager in Windows 7, hit Win+R to open the Run menu. Then, type "firefox -no-remote -p" to open the Profile manager.

Create your profiles exactly as described above. You can also un-check "Don't ask at startup" if you want to pick the profile to use every time.

Assuming you have your Firefox icon docked on the Start Bar (a.k.a. Task Bar), you can continue to just click on that icon, which will launch whatever profile was last started with the Profile Manager. Unfortunately, I haven't figured out how to make the docked Firefox icon always start a specific profile (but if you know, please mention it in the comments).

You can also create shortcuts to launch other Firefox profiles. To do this, right-click on the Desktop and choose "New->Shortcut". Use "Browse" to select the Firefox executable (you'll likely find it near C:\Program Files (x86)\Mozilla Firefox). But before saving it, append the profile flags so the target reads something like:

"C:\Program Files (x86)\Mozilla Firefox\firefox.exe" -no-remote -P "Profile1"

Where "Profile1" is the name of an alternate profile.

You can make as many of these on the Desktop as you want, and even drag them into the Start Menu. Unfortunately, the Start Bar only seems to allow a single Firefox icon, which launches whatever was last launched with the Profile Manager. I'm not sure why this is, but it's a limitation we'll have to live with for now (again, if you have any advice, please let us know).

Hopefully this is a feature of Firefox you'll find useful; I certainly do. Just one of the things that makes Firefox such a great browser!

One final tip: You can protect your saved passwords in Firefox by using a Master Password. This can come in handy in case your computer is ever stolen or its security otherwise compromised.

Sunday, June 24, 2012

A few Myth TV tips for Ubuntu 12.04

Good morning all!

I've been using MythTV 0.25 on Ubuntu 12.04 for a little over a month now, and while pleased overall, there were a few annoyances that caused me to scour the Interwebs for a solution. Things are working better now, so I wanted to share my tips for others out there!

1) Legacy Full Screen Support

If you are using Ubuntu with Unity, you may notice the MythTV interface doesn't completely go "full screen". That is, it's somewhat shadowed by the Unity bar and top bar.

To fix this, you'll need the CompizConfig Settings Manager. I don't think it's installed by default, so find it in the Ubuntu Software Center. Note that this is an advanced tool for messing with your settings, and while there is some pretty cool stuff in here, do be careful.

To get MythTV full screen over the Ubuntu panels, use "Legacy Full Screen" support.

1) Under CompizConfig Settings Manager, go to "Utility"
2) Go to "Work arounds"
3) Check "Legacy Full Screen" support.

After that, your MythTV should go properly full screen. Restart if necessary.


2) Mythfilldatabase takes forever to run

(Note: This seems to have been fixed in a recent version of mythfilldatabase. A fresh MythTV install on Ubuntu 12.04 no longer seems to have this issue, but the instructions remain here for reference.)

One bizarre thing I noticed is that mythfilldatabase never seemed to end (actually, my girlfriend pointed out that the hard drive was clicking loudly, which caused me to investigate and find mythfilldatabase and mysql hogging the I/O. Hint: iotop is a cool program).

This is apparently due to a mythfilldatabase bug that is fixed in the newest builds, but if you need a workaround, here it is:

1) Set up a TMPFS for /tmp
Add the following to '/etc/fstab':

tmpfs   /tmp   tmpfs   nodev,nosuid    0 0

2) Change a MySQL configuration parameter:

Add the following to '/etc/mysql/conf.d/':


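Here's a sketch of the kind of file that goes there (the file name and parameter are my assumptions; the idea is to point MySQL's temporary tables at the now RAM-backed /tmp):

```ini
# /etc/mysql/conf.d/tmpdir.cnf (hypothetical file name)
[mysqld]
tmpdir = /tmp
```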
Note: It's important to know that in step 1, you are actually switching your /tmp to a tmpfs file system. This means that any data written to /tmp is written to your RAM, and not to your hard disk as usual.

This can have both positive and negative consequences.

Using a tmpfs for /tmp can bring some all-around nice performance improvements, since writing to RAM is far faster than writing to disk. Programs using /tmp will complete their write operations more quickly, improving performance.

The downside is that the size of /tmp is limited by the available RAM. This can be an issue: if /tmp ever becomes full, writes will fail, and your programs may stop working.

If this becomes a recurring problem, you might need to revert /tmp back to the hard disk and use the bug-fixed version of mythfilldatabase. But I've been using /tmp as a tmpfs for a good month now and have had no ill effects with moderate usage (I also have 6 GB of RAM... just saying).

Tip: Keep track of /tmp usage with the df command on the command line. Mine is currently only at 1% full.
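For example (the -T flag is a handy addition that also shows the filesystem type):

```shell
# -h: human-readable sizes; -T: show the filesystem type, which
# should read "tmpfs" once the fstab change is in effect.
df -hT /tmp
```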


3) Mythweb PHP errors

I noticed some unusual errors in MythWeb. Simple functions like setting up recordings didn't seem to take, and there were some PHP error messages along the top of some of the pages. This was a bit irritating to fix.

Rather than repeat everything I did, I'll just point you to the official ticket:

I just manually clicked on the individual *.patch files and applied the changes by hand. There's probably a better way to do it, but that worked for me. Hope you find it useful.

4) A new Remove_Commercials script

My first attempt to use my old remove_commercials script failed miserably. Not surprising, since some of the command line parameters changed between versions. But, there's an updated one you can use here: 

The script didn't quite work out of the box for me. A few tweaks I made:

1) Replacing all of the usages of $START with "$START"
2) Adding --user=mythtv --password=PASSWORD (where PASSWORD is your mysql password, from the mythbackend settings) to the mysql command
3) You might also get better results if you use -f on mythcommflag instead of --chanid and --starttime, but if they work for you then great.
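To see why tweak (1) matters: the start times the script pulls out of MySQL can contain a space, so an unquoted $START gets word-split into separate arguments. A quick hypothetical illustration:

```shell
# count_args prints how many arguments it received.
count_args() { echo $#; }

START="2012-06-24 10:30:00"   # a typical recording start time
count_args $START     # unquoted: splits into 2 arguments
count_args "$START"   # quoted: stays 1 argument
```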

I'm still playing with the commercial cut script a bit, so I may make some more updates, but hopefully that'll give you something to start with.

One issue I still have is controlling which display mythfrontend runs on. I haven't yet been able to choose the display from the command line, as setting $DISPLAY has no effect. But I'll keep trying!

If you have any of your own tips, let us know. Take care, and have a great day!

Sunday, June 3, 2012

Configuring LIRC for HVR-1600 in Ubuntu 12.04

As promised, here is the recipe I followed to get LIRC working for my Hauppauge HVR1600 card on Ubuntu 12.04. This includes both the IR Receiver and the IR Blaster. This may work for other Hauppauge cards as well. The hard work was all done by others for Ubuntu 10.04; this just serves as an updated reference for Ubuntu 12.04. As usual, this info is provided with NO warranty of any kind, so use at your own risk! It's assumed you have basic knowledge of Linux/LIRC/Scripts/etc.

The secret to the HVR1600 is the lirc_zilog kernel module. Once this kernel module is loaded and LIRC is configured, the HVR1600's IR capabilities should work.

Step One: Install LIRC

Step one is easy, just install standard LIRC! You can install it from the Ubuntu Software Center, or, install (or build) it from the command line if you are so inclined. The LIRC version that was installed for me was 0.9.0.

Step Two: Install lirc_zilog

(Note: you might already have lirc_zilog installed, see below).

This part is a bit tricky. The "proper" way to do it would be to install the lirc-modules-source package, apply the lirc_zilog patch, then reconfigure the package to build the kernel modules. The Ubuntu 10.04 recipe for this is here.

However, it does not yet appear that the lirc-modules-source package is available for Ubuntu 12.04. If it were, you would likely have to make some changes to the patch file to get it building under 12.04 (you might find some hints here).

As a shortcut, I found a compiled version of lirc_zilog that appears to be 12.04 compatible here. There is a *.deb file for both i386 and 64-bit versions (disclaimer: I can't vouch for the security of this site or the files). I find it strange that these files appear in this package but aren't actually in Ubuntu by default - but I'm sure there is an answer to that (if you know, feel free to share).

Download the version applicable to you and open the *.deb file with the Archive Manager (not the Ubuntu Software Center, which it opens with by default).

Find the lirc_zilog.ko file in the archive here: /lib/modules/3.2.0-24-generic/kernel/drivers/staging/media/lirc/

Extract it, and copy it to the following location in your system:

To load the module: sudo modprobe lirc_zilog

If all was successful, you shouldn't get any errors. At this point, you should be able to do lsmod | grep -i lirc_zilog and see:

lirc_dev               19204  1 lirc_zilog

Pro-Tip: To have lirc_zilog automatically inserted into the kernel on boot, add it to /etc/modules

Pro-Tip: You might already have lirc_zilog installed. As an easy check, in a Terminal, go to /lib/modules and run find . -name "lirc_zilog*". If it finds lirc_zilog already, you shouldn't have to copy one from another source. However, you'll still have to modprobe lirc_zilog to get it loaded and add it to /etc/modules to have it load at boot.
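Putting the check and the load steps together, something like this (the modprobe and tee lines need root, so they're shown commented out):

```shell
# Look for an existing lirc_zilog module anywhere under /lib/modules;
# this prints the path if one is found, nothing otherwise.
find /lib/modules -name "lirc_zilog*" 2>/dev/null

# If it was found, load it now and have it load at every boot:
#   sudo modprobe lirc_zilog
#   echo lirc_zilog | sudo tee -a /etc/modules
```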

Step Three: Install the HVR1600 IR Firmware files

I believe this can be found here, and a few other locations on the web.

This file needs to be copied here: /lib/firmware

Step Four: Configuration Files

You need to modify two LIRC configuration files: hardware.conf and lircd.conf (both under /etc/lirc on a standard install).


First, we modify hardware.conf to get lirc_zilog to be loaded. Add or modify the following lines (if necessary):

REMOTE_MODULES="lirc_dev lirc_zilog"
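For context, here's a sketch of how that entry might sit alongside the other hardware.conf fields. The driver and device values below are common defaults for lirc_zilog setups, not taken from this particular machine, so double-check them against your own system:

```
# /etc/lirc/hardware.conf (excerpt) - a sketch only; verify the
# driver and device values for your own card.
REMOTE_MODULES="lirc_dev lirc_zilog"
REMOTE_DRIVER="default"
REMOTE_DEVICE="/dev/lirc0"

TRANSMITTER_MODULES="lirc_dev lirc_zilog"
TRANSMITTER_DRIVER="default"
TRANSMITTER_DEVICE="/dev/lirc0"
```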

As for lircd.conf, again, a working one for my card was found here. I believe you can even put the relevant parts in their own file and use an include directive if you'd like.
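If you go the include route, the directive looks like this (the file name here is hypothetical; use whatever you saved the card's codes as):

```
# In /etc/lirc/lircd.conf - pull the HVR-1600 codes in from a
# separate file (hypothetical path):
include "/etc/lirc/hvr1600.conf"
```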

Step Five: Take a deep breath

You're almost there! At this point, the following two things should work:

From a terminal, run "irw". Point your remote at the HVR1600's IR Receiver and press some buttons! If irw is working correctly, you should see some responses, like:

$ irw
00000000000017a5 00 OK Hauppauge_350
0000000000001795 00 Down Hauppauge_350
0000000000001794 00 Up Hauppauge_350

This means your IR Receiver is working.

To test the IR Blaster, use the irsend command. At this point, it doesn't matter what code set you use, you just want to get the light to blink. For example:

$ irsend SEND_ONCE blaster 0_130_KEY_0

Note that the name "blaster" was the name chosen in "lircd.conf" above...if you used another name, insert it there.

Step Six: Final configuration

If the IR Blaster and Receiver are responsive, you're pretty much golden. All that remains is a bit of extra configuration.

To get the IR Receiver to do "cool" things (e.g. control MythTV and/or launch applications), you want to configure a ".lircrc" file. This should be in your Home directory.

Each command to be sent to a specific application has an entry that looks like this:

begin
    prog = mythtv
    button = LEFT
    repeat = 3
    config = Left
end

Each starts with "begin" and ends with "end".

button = The button on the remote which triggers the command.
prog = The program to send the command to.
config = The command to send to the program.
repeat = What to do when a key is held down. 0 ignores repeats. A positive value n sends the command to the program every nth repeat.

Obviously, you need a lot of commands to get various buttons working. To get you started, here is a .lircrc file very similar to the one I use.

Pro-Tip: To get a remote button to start a program, instead of just being interpreted by one, set "prog" to irexec and "config" to the command line-method of starting the program. Remember that irexec must be running in the background for this to work, so you can set it up as an Ubuntu start-up application.
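For instance, a hypothetical .lircrc entry that launches mythfrontend when the "Go" button is pressed (the button name and the command are assumptions; substitute your own):

```
begin
    prog = irexec
    button = Go
    repeat = 0
    config = mythfrontend &
end
```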

For the blaster, you need to figure out what code set the device you are sending commands to uses. Once again, a script has already been developed for this. It sends the Power command to the device with each code set in turn; once you see the device respond, you've found the right code set. For example, in the command I have above, the code set being used was 0_130.

You'll most often be using this to change the channel, but you can do other things as well. You can find some example change_channel scripts here. Don't forget that in order for MythTV to change the channel, you need to configure the backend to use the script.
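As a starting point, here is a minimal change_channel sketch, assuming the "blaster" remote name and the 0_130 code set from above. When irsend isn't present, it falls back to echoing the commands so you can dry-run the digit-splitting logic:

```shell
#!/bin/sh
# Minimal channel-change sketch: send the channel one digit at a time.
# "blaster" and the 0_130_KEY_* names come from the lircd.conf used in
# this guide; adjust them for your own configuration.
CHANNEL="${1:-123}"   # channel number, defaulting to 123 for a dry run

# Fall back to echoing when irsend isn't installed, so the sketch can
# be tested without the blaster hooked up.
if ! command -v irsend >/dev/null 2>&1; then
    irsend() { echo "irsend $*"; }
fi

# Split the channel into single digits and send each one, with a short
# pause so the receiving device can keep up.
for digit in $(echo "$CHANNEL" | sed 's/\(.\)/\1 /g'); do
    irsend SEND_ONCE blaster "0_130_KEY_$digit"
    sleep 1
done
```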

At this point you should be in good shape, but you might require some additional modifications to get things "quite right" (geeks are never happy, are we?).

My only concern is that at some point down the line, a Linux kernel change will cause lirc_zilog to stop working. But I'll cross that bridge when I come to it...either by getting it building from source or finding it compiled elsewhere. I hope you found this guide helpful.

References and special thanks!