Usual disclaimers: I'm not a doctor, legal professional or financial advisor. This article is for information/education only and reflects my own opinions. It should not be taken as financial, legal or medical advice. Do your own research and never invest anything you cannot afford to lose (including your time).

20 December 2010

Desktop Automation Pt4. The Corrections

OK, so if you've tried this series of scripts you might have noticed a slight error: upon reboot the system can't find the MAC addresses file and throws an error. Apologies for missing out this small but vital detail.

The script fails because during execution the working directory is not set to the folder where the data files are. This is easily fixed with the following batch file. Copy it into your C:\postghost folder, save it as hostname.bat, then drag & drop it onto the run-once tab of startup-cpl in place of the hostname.vbs file.

@echo off
rem Make sure we're on the right drive and in the folder holding the data files
c:
cd c:\postghost\hostname
rem Run the rename-and-join-domain script
call myhostnamescript.vbs

The call line needs to be changed to whatever you called your rename-and-join-the-domain script.
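
As an aside, if you'd rather have any script errors appear in a console window instead of pop-up dialogs, a variant along these lines should also work (an untested sketch, using the same assumed script name):

@echo off
rem cd /d switches drive and folder in one go
cd /d c:\postghost\hostname
rem cscript runs the script with console output rather than dialog boxes
cscript //nologo myhostnamescript.vbs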

19 December 2010

Xbox Live Quick Points Convertor

This is my quick lookup table for the Microsoft points system for the UK. It is based on the current conversion ratio of £8.50=1000 points. Please keep in mind that if you buy points on pre-paid cards, the value will be different.


Points : Value

   1 =  0.85p (850p per 1000 points)
  60 =  51p
 100 =  85p
 120 = £1.02
 160 = £1.36
 200 = £1.70
 400 = £3.40
 560 = £4.76
 800 = £6.80
1000 = £8.50
1200 = £10.20

I would advise printing this and keeping it somewhere near your TV or Xbox.
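
If you'd rather calculate than squint at a printout, the sum is just points x 8.50 / 1000. Here's a throwaway VBScript version (my own sketch, hard-coding the £8.50 = 1000 points ratio above):

' Convert Microsoft points to pounds at the £8.50 = 1000 points ratio
Dim points
points = CDbl(InputBox("How many points?", "Points to GBP"))
WScript.Echo points & " points = £" & FormatNumber(points * 8.5 / 1000, 2)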

Automating Desktop Installation Pt 3. The main event

A quick recap then. In the previous sessions we have created a list of MAC addresses for all our machines (called macs.txt). We have one machine which has all our software installed which we are going to clone. We (optionally) installed Mike Lin's startup-cpl control panel applet and we have created two encrypted password files called lcl.txt and net.txt.

The first thing to do now is make sure that the macs.txt file is copied into the postghost\hostname folder so do that now.

Next we need to create the script which does the renaming and domain joining, so let's get on with it. Here is the code to do this; copy it and save the file in the postghost\hostname folder.

'
' Set local admin a/c details here
'
Username = "Administrator"
Password = ""
strDomain = "mydomain.com" 'change this to your own domain
strUser = "mydomain\myusername" 'change this to your domain admin username
strPassword = ""
'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
'
' Identify the hostname by finding from a list of mac addresses
'
dim maclist(8) 'shouldn't need more than this - increase if more than 8 NICs
dim macount
dim macsfile
dim strMac,strHName
dim eqpos
Dim sbox(255)
Dim key(255)
Dim fso
Dim tst
Dim Oput,strReadin,strAdminPwd,strUsrPwd
Const JOIN_DOMAIN = 1
Const ACCT_CREATE = 2
Const ACCT_DELETE = 4
Const WIN9X_UPGRADE = 16
Const DOMAIN_JOIN_IF_JOINED = 32
Const JOIN_UNSECURE = 64
Const MACHINE_PASSWORD_PASSED = 128
Const DEFERRED_SPN_SET = 256
Const INSTALL_INVOCATION = 262144
plaintxt = "YouAreNotATerminatorRobot" 'text to use as common key

Set fso = createObject("Scripting.FileSystemObject")
Set tst = fso.OpenTextFile("lcl.txt", 1, false) 'mode2=write (append=8) - output file
'read first
While Not tst.AtEndOfStream
strReadin = tst.readLine
wend
if strReadin <>"" then
strAdminPwd = EnDeCrypt(strReadin, plaintxt)
Password = strAdminPwd
end if

Set objWMIService = GetObject("winmgmts:\\127.0.0.1\root\cimv2")
Set colItems = objWMIService.ExecQuery _
("Select * From Win32_NetworkAdapterConfiguration Where IPEnabled = True")
macount=1
For Each ObjItem in colItems
maclist(macount) = objItem.MACAddress
macount = macount+1
Next
'uncomment to check first mac address is returned
'WScript.Echo "First mac =" & maclist(1)

'begin search for matching mac
Set fso = createObject("Scripting.FileSystemObject")
Set macsfile = fso.OpenTextFile("Macs.txt", 1, false) 'mode 1 = read - the MAC address lookup table
While Not macsfile.AtEndOfStream
strMac = macsfile.readLine
if right(strMac,17)=maclist(1) Then
eqpos=instr(strMac,"=")
if eqpos<>0 then
strHName=left(strMac,(eqpos-1))
'WScript.Echo "Found match " & maclist(1) & " = " & strHName
'Now know the hostname so rename here
Set objWMIService = GetObject("Winmgmts:root\cimv2")
' Call always gets only one Win32_ComputerSystem object.
For Each objComputer in _
objWMIService.InstancesOf("Win32_ComputerSystem")
Return = objComputer.rename(strHName,Password,Username)
If Return <> 0 Then
WScript.Echo "Rename failed. Error = " & Err.Number
Else
'WScript.Echo "Rename succeeded." & " Reboot for new name to go into effect"
End If
Next
'end of renaming code

end if
End if
Wend

Set tst = fso.OpenTextFile("c:\postghost\joindom\net.txt", 1, false) 'mode2=write (append=8) - output file
'read second
While Not tst.AtEndOfStream
strReadin = tst.readLine
wend
if strReadin <>"" then
strUsrPwd = EnDeCrypt(strReadin, plaintxt)
strPassword = strUsrPwd
end if

'now join domain
Set objNetwork = CreateObject("WScript.Network")
strComputer = objNetwork.ComputerName

Set objComputer = GetObject("winmgmts:{impersonationLevel=Impersonate}!\\" & strComputer & "\root\cimv2:Win32_ComputerSystem.Name='" & strComputer & "'")

ReturnValue = objComputer.JoinDomainOrWorkGroup(strDomain, strPassword, strUser, NULL, JOIN_DOMAIN + ACCT_CREATE)
'ok so reboot
Set OpSysSet = GetObject("winmgmts:{(Shutdown)}//./root/cimv2").ExecQuery("select * from Win32_OperatingSystem where Primary=true")
for each OpSys in OpSysSet
OpSys.Reboot()
next


Again you need to add Mike Shaffer's RC4 encrypt/decrypt routines to the end of this code. The two routines required are:

Sub RC4Initialize(strPwd) &
Function EnDeCrypt(plaintxt, psw)

Also don't forget to change the plaintxt key string (just below the Const lines) if you changed it in the password encryption script from the previous posts.

So that should be it. Save your script file as hostname.vbs and then drop it onto the run-once tab in startup-cpl. Shutdown your machine and clone the drive when you're ready.

At this point it is worth pointing out that there are usually other applications which require some work after cloning. Kaspersky antivirus, for example (at least our enterprise version), requires a command line to individualise it:

c:\progra~1\kasper~1\networ~1\klmover.exe -dupfix

Other applications like SPSS require you to re-run the license utility after the hostname has changed. We discovered we could copy an older version of spssactivator.exe into our postghost folder and use the following batch file to do this:

rem Copy the older activator over the installed one, then run it with the licence key
copy spssactivator.exe "c:\program files\SPSSInc\PASWStatistics18\spssactivator.exe"
cd /d "c:\program files\SPSSInc\PASWStatistics18"
start "SPSS18Act" "c:\program files\SPSSInc\PASWStatistics18\spssactivator.exe" (add license key here)

One day I fully expect software companies to keep track of all this via their own cloud-based licensing servers. For now, the very best of luck and if you wish to express gratitude for all the work presented here, please make a donation to the charity of your choice to help make the world a little bit better.

Update
======
I also forgot to mention that I didn't include code to remove the machines from the domain if they already exist. My solution was to remove the machines before cloning them, using the Active Directory Users & Computers utility. If they're not there already, they can't cause a problem. :)
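
If you have a lot of machines to clear out, the dsrm command-line tool (run on a domain controller, or anywhere the admin tools are installed) can do the same job as the GUI. The distinguished name below is made up - substitute your own container or OU:

dsrm "CN=fcet-B110-1,CN=Computers,DC=mydomain,DC=com" -noprompt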

16 December 2010

Automating Desktop Installation - Pt 2. Postghost folder

Assuming the preghost routine has gone to plan you should now have a text file called Macs.txt which contains the hostnames and MAC addresses for all your lab machines. This is your lookup table. What you want to do now is create a folder somewhere on your machine to be cloned. I call this folder postghost as it contains any scripts I want to be run after the drive cloning process.

Within this folder I have a folder for each fix which is done after ghosting. For now just create two new folders called:

Hostname
Joindom

As you can guess these are for the scripts to rename the PC and join it to the domain. This raises a security issue, since you need to have the local administrator account password stored on the drive if you want this to be fully automatic. Likewise, if you want to automate joining a domain you need to store the password of a domain administrator account (yes, that should set alarm bells ringing).

The way around this problem is to use a combination of encryption and permissions. Right-click on your postghost folder and go to the Security tab. You can remove all users from the list, leaving only Administrator able to read & execute. No write access is necessary, and you will still need to log in using the local administrator account to run the script.
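
If you prefer the command line, cacls can apply roughly the same lockdown. A sketch only - note that /G used without /E replaces the existing permissions with just this one grant, so make sure you aren't locking yourself out:

cacls c:\postghost /T /G Administrator:R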

At this point I will dish out some credit to a guy named Mike Lin. He has an excellent utility on his website called startup-cpl which adds a nice control panel applet to your system. One of the nice features of this utility is that it lets you add entries to the Windows run-once list just by dragging and dropping them onto the applet. In effect this means that once you have created the postghost script, you can set it to run automatically once, when the local admin first logs in after cloning (so you log in, the machine renames itself, joins the domain and then automatically reboots itself - how much work does that save you?)

Before we get to this stage though, we need to encrypt those privileged account passwords and drop them into our new folders. The script to do that is here:

Dim sbox(255)
Dim key(255)
Dim fso
Dim tst
Dim Oput

strAdmPwd = (inputbox("Enter local admin password:","Admin Password"))
strUsrPwd = (inputbox("Enter your network admin a/c password:","User Password"))
plaintxt = "YouAreNotATerminatorRobot" 'text to use as common key - change this for extra security

Set fso = createObject("Scripting.FileSystemObject")
Set Oput = fso.OpenTextFile("lcl.txt", 2, true) 'mode2=write (append=8) - output file
Oput.Writeline EnDeCrypt(plaintxt, strAdmPwd)
Set Oput = fso.OpenTextFile("net.txt", 2, true) 'mode2=write (append=8) - output file
Oput.Writeline EnDeCrypt(plaintxt, strUsrPwd)
wscript.echo "Created encrypted password Files"

This code is not quite complete though, as it uses Mike Shaffer's RC4 encryption routine which is copyrighted and so not reproduced here. All you need to do is find this routine and add the code for the two functions, which are:

Sub RC4Initialize(strPwd) &
Function EnDeCrypt(plaintxt, psw)

Add these sections from Mike's code to the end of the script and then run it. Also don't forget to change the plaintxt variable. You can change it to anything you like, but the same value needs to be entered into the decoding routine later on. The RC4 encryption algorithm is used in WEP, WPA and SSL amongst others. It may not be the most secure system available, but if you are concerned about its effectiveness my advice is to rewrite the code with a more secure algorithm. The two routines mentioned above perform the encryption and decryption, so only they would need to be substituted.

Once run you should have two files called lcl.txt and net.txt which hold your local and network admin passwords. Copy lcl.txt into the postghost\hostname folder and then copy net.txt into postghost\Joindom. If you open these files with a text editor they should appear to be a garbled sequence of obscure characters. I always do this just to be certain that my passwords have not been stored as plain text (which would be bad if the local admin account was compromised).
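
If you want to be doubly sure the round trip works, decrypt the file and compare. A quick sanity-check sketch, run from the folder holding lcl.txt, again assuming Mike Shaffer's two RC4 routines are pasted at the end:

' Decrypt lcl.txt and echo the result - it should match the password you typed in
Dim fso, tst, strReadin
plaintxt = "YouAreNotATerminatorRobot" 'must match the value used when encrypting
Set fso = CreateObject("Scripting.FileSystemObject")
Set tst = fso.OpenTextFile("lcl.txt", 1, False) 'mode 1 = read
strReadin = tst.ReadLine
WScript.Echo "lcl.txt decrypts to: " & EnDeCrypt(strReadin, plaintxt)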

That's it for today. Next we look at the actual renaming script.

Automating Desktop Installation - Pt 1. Concepts

This may be familiar territory for a lot of sys-admins out there. You have a room full of computers, all requiring the same software to be installed, although of course every machine needs to be slightly different. So what do you do? There are three approaches.

You could install each machine individually, but that would take forever. You could set up one machine and then copy it using drive cloning software (like Symantec Ghost, for example). Or you could go down the enterprise route and use Sysprep to create a machine which has no individualisation and which can then be cloned and reconfigured. There are in fact other options, but for now we'll look at cloning since this is the most cost-effective for small or medium sized businesses.

Usually there are a lot of factors beyond your control; however, some network configurations can make life easier. For example, if you have the choice between static IP addresses or DHCP, I would personally opt for static. Being able to spot malicious packets using Wireshark and knowing exactly which machine is being used can save a lot of time. On the other hand, DHCP can make managing desktops easier since it avoids any possibility of having multiple machines using the same IP address. Using DHCP with a wi-fi access point will also allow your users to use other devices (phones, iPads, laptops, etc.) without you having to allocate an IP address to each one (and also having to configure the device for use on your network).

If you use static IP addresses, you probably use the last octet of the IP address in the computer's network name. For example, a PC called RoomA-1 would be set to 192.168.123.1 with RoomA-2 being 192.168.123.2. The 3rd octet (123) may be different, but this is fairly basic networking and used to be called a private class C network. These days there's a newer notation (called CIDR) which would see this IP address written as 192.168.123.2/24 (the /24 means the first 24 bits identify the network address and the remaining bits specify the machine address - in this case the remaining 8 bits allow 2^8 or 256 addresses on this network, 254 of them usable once you subtract the network and broadcast addresses).

With DHCP, life is simpler. You give each machine a unique host name and then let the DHCP server take care of all the IP addressing. Think of a DHCP server as being like the voter registration system used by the government - everybody should get one voting slip, and only one, which identifies them. DHCP also creates a problem though, since we never know what the IP address is going to be (if you move house the week before voting, your voting card may not arrive at your new address in time). You could configure your DHCP server to always issue the same IP address (a reservation), which sort of defeats the purpose a bit - although in some circumstances you do want this to happen, as it's harder to configure firewalls and port-forwarding on routers if your servers keep changing IP addresses.

Back to the main issue though. You now have one machine set-up and you want to create another 10+ copies of that machine but you don't want the hassle of reconfiguring them all. What do you do?

We know that hostnames will initially be the same since we are going to copy the entire hard-drive and that will have the machine-name stored on it. We suspect that you will most likely be using DHCP if you want users to have their own devices connected to your network with the least amount of hassle. So how do you get the machines to reconfigure themselves automatically?

The answer is to use the hardware. Each network card (NIC) has its own unique identifier called a MAC address. It's what network switches use to direct packets of information between connected devices. MAC addresses are designed to be unique so that network devices don't get confused about which messages going over the network are for them. Think of it like an RFID chip. We can query the card to find out the MAC address and then perform a look-up using a data file. This data file will hold the MAC address of each computer on our network, and we will use a script to discover the matching hostname, set it, join our domain and then reboot the machine. To think I used to have to do this manually on over 150 PCs each year.

Unfortunately all of this needs to be installed on our machine before we clone the drive, so the first step is to collect the hardware information about all our computers. Once we have done this, there are certain other factors to keep in mind. In many cases machines are secured to desks, but if one is removed (swapped because of a fault, for example) just keep in mind that you will need to update your list of MAC addresses.

I like to call this initial data collection process the pre-ghost routine. The process involves running the following script on all our lab PCs. The good news is this can be done remotely from my own desktop. First of all we need a list of our current hostnames, which looks like this:

fcet-B110-1
fcet-B110-2
fcet-B110-3
fcet-B110-4
fcet-B110-5

We save this to a file called hostnames.txt. This file is then called from our VB script file to read the MAC addresses of those machines. The script which does this is here:

'
' Requires a list of machine host names in file hostnames.txt
' Must be run from domain pc with admin rights on all machines in list
' Windows management service must also be running on these PCs
'
Dim fso
Dim tst
Dim Oput
Dim strMachineName

Set fso = createObject("Scripting.FileSystemObject")
Set tst = fso.OpenTextFile("hostnames.txt", 1, false) 'mode1 = read
Set Oput = fso.OpenTextFile("Macs.txt", 8, true) 'mode2=write (append=8)

While Not tst.AtEndOfStream
strMachineName = tst.readLine
echoMAC strMachineName
Wend

Sub echoMAC(strComputer)
On error resume next 'don't stop if a machine is switched off or unreachable
' strComputer = (InputBox(" Computer name for MAC address", "Computer Name"))
If strComputer = "" Then Exit Sub

Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
Set colItems = objWMIService.ExecQuery _
("Select * From Win32_NetworkAdapterConfiguration Where IPEnabled = True")

For Each objItem in colItems
if objItem.MACAddress <> "00:11:67:27:B4:4E" Then
'this MAC appears on all our PCs - wi-fi/bluetooth maybe?
'you can probably leave out the if and end-if
'for your network but keep the Oput line below
Oput.writeline strComputer & "=" & objItem.MACAddress
End if
Next
End Sub

If all goes to plan then you will end up with a file called Macs.txt which looks similar to the one below. If not, then today's task is to discover why it is not working and fix it. For that reason I will leave off here for now.

Macs.txt file
=============
fcet-B110-1=00:24:19:B2:3D:45
fcet-B110-2=00:24:19:B2:5E:81
fcet-B110-4=00:24:19:B2:63:B8
fcet-B110-5=00:24:19:E6:86:F9

If you do get a machine which persistently fails to run the script, you can also obtain the MAC address using command-line tools. From the start menu, select Run and type in cmd.exe. Then use the following commands:

ping hostname
arp -a

If this also fails you will need to log in on the machines in question, start up cmd.exe again and type in:

ipconfig /all

The MAC address is referred to as the physical address and looks like those above with all the : characters. If you decide to edit your macs.txt file by hand to include these, make sure that all the hex characters (A-F) are entered using CAPITAL letters or the next scripts may fail.
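
If you've hand-edited a few entries and don't fancy checking the case by eye, a quick VBScript can normalise the file for you (a sketch which upper-cases everything after the = sign and rewrites Macs.txt in place, so keep a copy first):

' Rewrite Macs.txt with the MAC address portion upper-cased
Dim fso, allLines, strLine, Oput, eqpos
Set fso = CreateObject("Scripting.FileSystemObject")
allLines = Split(fso.OpenTextFile("Macs.txt", 1).ReadAll, vbCrLf)
Set Oput = fso.OpenTextFile("Macs.txt", 2, True) 'mode 2 = write (overwrite)
For Each strLine In allLines
eqpos = InStr(strLine, "=")
If eqpos > 0 Then
Oput.WriteLine Left(strLine, eqpos) & UCase(Mid(strLine, eqpos + 1))
ElseIf strLine <> "" Then
Oput.WriteLine strLine
End If
Next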

15 December 2010

The Future of Firmware (or software in general)

There have been a lot of privacy stories in the news recently: articles about how governments would really like to stop Wikileaks from releasing information they consider sensitive, stories of people falling foul of extreme pornography laws when their naughty holiday snaps are viewed by customs officials, and stories about security professionals being detained and having their equipment confiscated at airports. While I personally believe that all the information published by Wikileaks should be publicly available via the Freedom of Information Act anyway (without actual names maybe - that would fall foul of the Data Protection Act), the fact that these articles are newsworthy says much about how our governments censor and control our everyday access to media.

It may be time that manufacturers empower us with the freedom to control how much we choose to disclose about ourselves in a similar way. For laptops and computers, this issue was addressed several years ago; there are both hardware and software encryption systems to protect our data. What about portable devices though? Is it time that manufacturers gave us the option of encrypting our camera images, our video camera footage and in fact all our digital works? There also seem to be a number of stories about security professionals being parted from their devices when they are detained at airports, so let me predict the future of hand-held devices for any of those big companies out there that would like to create the next iPhone.

First of all, encryption is going to come to all things mobile. I'm sure there are plenty of people out there who welcome this. What people consent to in the privacy of their own home or holiday apartment is their own business. If they want to keep a digital memento of it then I see no harm in it, but with today's cameras and camera phones you can never be sure that your 'artistic' works are safe from prying customs officials' eyes. Encrypted data will become the norm, and anyone who wants to be in this movement at the ground level should probably have already produced their ePDF-equivalent data format by now.

I also predict encrypted backup storage in the cloud. Those security professionals need to be able to wander into any store and pick up a new device which meets this minimum specification. Within 5 minutes of purchase they should be able to restore their contacts list and most recent data from online storage facilities. For anyone who has ever had a phone lost or stolen, you know exactly what I'm talking about. I would also expect phones to come with the ability for the owner to track where their lost phone is (or for this service to be available to police forces in the case of stolen phones). Obviously your contacts list and recent media creations will need to be encrypted and backed up automatically, or else few people would take the time. The first company that offers this sort of service will probably be the market leader for several years.

It's also likely there will be more compatibility between models in terms of accessories and interfaces. You have only to look at recent memory card formats to see what happens to technology over time. Cameras used to have all sorts of different formats, but eventually SD started to dominate due to its lower prices. These days most cameras seem to use SDHC. Not going with the flow can cost suppliers in terms of sales. I would expect this sort of convergence to continue. Along the way will be some other developments which will also be merged into the finished devices.

Batteries would be the next logical development. Being able to use standard batteries is a good selling point of many video cameras aimed at the YouTube producer market. We have been told time and again that lithium polymer batteries provide more power and for longer, but every manufacturer seems to provide different sized batteries. I expect to see some developments here along the lines of new battery ranges which are compatible with existing standard sizes (AA, AAA, C & D). Even if normal zinc-carbon batteries cannot power devices for more than one or two hours, their wide availability catches a buyer's attention. It would be better to have 10 minutes of talk-time in an emergency than none at all because your battery is flat. I would be surprised if we didn't also see solar and motion-powered (wind-up?) recharging facilities built into future devices.

So to summarise, mobile devices will:
Make use of encryption as standard,
Have rapid data movement to a new device,
Use standard parts and interfaces,
Implement green energy systems

At least that's my theory, feel free to differ.

Next week I hope to return to more technical postings. I intend to start with a downloadable archive of scripts for quickly setting up a number of PCs and adding them to a domain. I adapted these from various online tutorials to achieve a set of scripts which allow me to rapidly configure our lab machines here. Check back soon, it will be worth it.

11 December 2010

Yourshape lookin' good

I mentioned a couple of weeks back that we have decided to join the Xbox brigade in an effort to stop junior launching wii-motes into the TV. I have to admit I've been quite enjoying Your Shape - Fitness Evolved. Despite the fact that the Xbox doesn't have a linked measuring device (like the wii-fit balance board), it has been quite compelling in other ways.

In fact for a bloke, the cardio-boxing exercises are great. None of that namby-pamby step aerobics if you please, this is serious get-your-heart-pumping stuff and the only thing missing is a small Japanese bloke shouting "wax-on, wax-off". Here we see the potential of kinect for podge-busting. Yes who would have thought getting in shape could actually be enjoyable and a little bit... macho?

Obviously this is a great starting point and I look forward to further developments along these lines. Why "Get fit with Mel B" when you could... get fit punching Mel B? Where does it go from here though? In a recent Gadget Show challenge, Jason Bradbury had a special training dummy built to learn martial arts (specifically Wing Chun). Will we ever get to the stage where you could learn these skills via a games console? It could spark a whole new revolution in gaming. Not only do you get to beat up the baddies but you genuinely learn a martial art along the way and no doubt an in-game trainer could give advice on diet and warn of the perils of drinking too much fizzy juice.

Back to the present though, and this week also saw the launch of some extra training programs for YSFE. At 560 Microsoft points for the pair (another cardio-boxing and another toning program) you can't help but think that maybe this is not the best value for money. The 560 points equate to about £4.76, which is hard to justify for something which looks like more of the same. I would have liked to see completely different training programs for anything over a couple of hundred points. Since the release date is so close to the original title's, you can't help feeling that these were just not quite ready for the initial release. Have a look at the differences between wii-fit and wii-fit plus to see how it should be done. The associated web-site is good and I am enjoying setting calorie-burning goals, although it doesn't always seem to be as up-to-date with my scores as I would like.

According to the wii-fit I have gone from 11.5 stone to 10.75-ish in a few weeks. I was so shocked that I had to retake the body test 3 times (twice in the buff to rule out errors in my clothes weight). The question is... have I found a way to adopt a new lifestyle, or is this just a fad that will lose its appeal in a few weeks? There is no doubt that some of the exercises are a chore, and the biggest problem is the inability to create your own program of exercises. Having done several of the 'Men's Health' fat loss sessions I am beginning to worry about my knees. I'm not quite sure where the fat is that requires the split-jumps to remove it. I remain sceptical that it's doing anything for my stomach or abs though. This is where a routine constructor would have been good, giving you options for building your own routines and the possibility of swapping such knee-destroying activities for something a bit more stomach-centric.

I will have to return to this topic at some point in the future but one month on, it does look like YSFE was a good buy.

7 December 2010

Amazon #fail

Once again the otherwise good name of Amazon is being degraded by those deemed trusted agents. I am of course referring to their resellers. I imagine it works well for Amazon, sitting back in control of the portal while other people take and fulfill the orders. I hadn't really given it much thought as after it was introduced the service was generally very good. The cracks are beginning to show though.

Today I received an email to say that the order for an item I bought as a Christmas present has been cancelled. There was nothing to suggest a reason why, and a review of my Amazon orders reveals that not even the original order is stored online. So I am left baffled. Why did the reseller cancel this order? Amazon thoughtfully included a link to the product in the cancellation email, which I followed out of curiosity. Strangely, the item is still being offered by the same seller, who apparently still has it in stock. Had I not logged into my home email account I might have been left in a real panic this time next week when I realised the present hadn't arrived.

The real question though is what happened to the audit trail? I know I placed an order (after all I have a cancellation email about it). I just can't seem to find any information about ordering it in my account history. Now if I want to return an item or cancel an order I generally have to give a reason. I'm now really curious to know why my order was cancelled when the seller appears to be still selling the same item. I appear to have lost that buyer-seller communication. Since this sort of thing has now happened on multiple occasions I am learning to steer clear of resellers when possible. It's not that they are all bad (far from it) but I have learned a lot about shopping online over the years. First of all never buy unless the web-site clearly says the item is in stock. Luckily Amazon takes care of this automatically. The second rule is always check contact details for the seller which is much harder when dealing with resellers.

So why is Amazon's reseller program having these difficulties? After all, if I order something which is sold by Amazon.co.uk it usually arrives within 5 days. If I order from resellers though, things sometimes don't arrive at all and then I just get a cancellation email like this (and sometimes too late). In fact, last time it happened I gave the seller several weeks to sort out the delivery and then had to file a non-delivery complaint, at which point I was refunded. You have to ask why Amazon are allowing their resellers to behave like this when their own core business is generally good. I have no idea, but if the items I want to buy are not offered direct from Amazon then I do now look to see if I can find them from another trusted retailer, even if they are slightly more expensive. Sometimes you need to have a high level of confidence in a retailer. Sadly that's something a few resellers are being allowed to spoil for all the others. If I were in charge of Amazon, I'd look into this further. I think I would also want to collect data on who is cancelling orders and how frequently. After all, if the seller cancels a transaction, no feedback gets posted after the event to warn their future customers.

29 November 2010

Kinecting Part 2

Well, it turns out that my Xbox thought my NAT was too strict, which might have been stopping my Kinect fitness from keeping track of how many calories I'd burned. The good news is I found this article here which explains what this really means - the firewall needs some ports opened for Xbox Live to work properly. In case the site vanishes, these ports are:

UDP 88
UDP 3074
TCP 3074

24 November 2010

Xbox Live - Is it worth it?

I was curious about Xbox Live when I heard about it. My first impression was... "well, that sounds like a con", and to some extent that holds true. It's a bizarre concept for sure. I buy the Xbox, I buy the games, I pay for the internet connection, and then I have to pay a gamer's tax to connect to online servers? Microsoft are really earning their M$ tag with this, as I've discovered from the free one-month trial.

At the time of writing this, it's possible to get a 12 month Xbox-Live membership for around £30 from Amazon. The question is why would you want to? You really do need to research this topic as I could walk into M$ tomorrow with a list of things which should be done differently with XBL.

Ok, so with XBL you get to download lots of demos, so I can sort of see the point of that. At £2.50 per month it's cheaper than a magazine with a disc on the front, and you have a better selection of things to try. I will also admit that the ability to unlock full versions of games from within the console is good (although warning bells are ringing on that one - I'll explain later). There is a certain amount of naff stuff though which deserves to be explored.

First of all the points system. Yes I credited my account with 2000 points for £17 and I used 800 of those to unlock the full version of my all-time favourite driving game - Outrun. Here's the problem though. Everything is listed in terms of points so you have to be numerically dextrous to figure out how much you're paying for things. 800 points equates to (17.00/2000*800) which is £6.80. In the case of Outrun I would say that's a bargain but at the same time I've lost the resale value. With a disc I could have put it on eBay in a year or two and perhaps got some of my money back. Not so easy with data on your drive.

I also quite like the idea of film rental via the xbox but here in the UK we get to access Zune rather than Netflix. The upshot appears to be a really limited number of films. Again you need to check the prices of these very carefully. When I looked, some of the more appealing and more recent films were being offered for around 590 points for a 24 hour loan period. Working this out at £5.02 reveals it's a lot more costly per film than other services.

Yet another cash-sink for the financially astute to watch out for is the avatar system. Yes, some of those coveted points can be swapped for customised clothing and accessories for your digital representation (like the Mii on Nintendo). It's true that some of these can be obtained from games when you earn specific achievements, but I can't imagine who would really want to spend money on them. If they were outfits for in-game characters that you played a lot then maybe, but as the avatar is purely a dashboard (and online) feature there appears to be little incentive to spend money on them as far as I can see.

Then there are the things which XBL is not so good at. Ok, everyone can have their own profile (and avatar), which enables M$ to sell 'family subscriptions', so why then do the parental controls apply to all profiles? Surely it's logical that I only want my young son to be blocked from the online blood-n-gore fests? Also, where is the option to pay for a month's XBL subscription with the points I have left over? The really scary part however is that my profile now has my credit card linked to it, and I can't help wondering how easy it will be for my son to switch to my profile and start clicking on the 'unlock this game now' buttons. There are so many warning bells ringing that I could be mistaken for thinking I'm just outside the Vatican.

I've also noticed that on the Xbox there's currently an offer to get two months of XBL membership for the price of one. I'm not sure how regular this is, but paying for 6 months sounds more appealing than paying for 12. I think I will now have to wait until my trial membership ends to see if I miss it enough to consider paying for it. At the moment though, I can't see the point. Maybe if they threw one full-price new release game into the deal it would be worth an annual subscription. At the end of the day, those demos exist to part me from my money. If I'm not playing them, I'm probably not buying the full games either.

15 November 2010

Get Yourself Kinected

I will be the first to admit that I have often found reason to criticise Microsoft in the past, but every now and then they do something which deserves attention and credit. Of course you can guess from the title I'm talking about Kinect. I've seen a lot of people (read teenagers) making comments on Youtube videos about how awful Kinectimals is. Well I have a four year old son who thinks it's the best thing ever and I've been surprised at just how quickly he's taken to this new control system.

Ironically we bought it to prolong the life of the TV as we had visions of wii-motes going flying into things. Just a couple of days after we got the new console he flew one of his planes into the TV and we now have a small line across the screen. That's kids for you.

So is this new toy worth the asking price? Well, I believe it is, but I would have changed some of the software. For example, I would have produced a sports disc which has the same games which are available for the wii. Think about it... I now have to find golf, bowling and softball before I can think about offloading the wii. That got me thinking even more though. Why should we be paying £30-40 for a disc with half a dozen mini-games and invariably end up with some we will never actually use? It would make more sense to offer some sort of compile-your-own-sports-disc or, in the case of the X360, let you download just the ones you want.

Just looking through the available compilations there are many which have some sort of appeal. I can see my wife enjoying the horse-riding in Kinect Motion Sports and I expect my son would like the hang-gliding. American football however is not likely to appeal to any of us. So what's the answer? Well, for us I think it's sit back and wait for some of these to appear in the pre-owned section of the local Game stores. The Xbox does seem to excel in this area because, for some reason, pre-owned Xbox games appear to sell cheaper than their wii equivalents. Maybe that's all the teenagers trading in their old titles for the latest Call of Duty. Whatever the reason, £10 for a pre-owned title on the Xbox does more to sell consoles than £17 for the same pre-owned title on the wii. For this I am glad to now be "Kinected".

11 October 2010

Cloud computing yet again

Last week I may have expressed quite a biased viewpoint about cloud computing, since its design does seem to be about how to run a server farm without a team of admins. I can see there are both advantages and disadvantages. When I actually sat down and thought about it logically from an admin's view, though, I could see this revolution in our work practices is long overdue; however, I don't believe cloud is the answer right now.

Take developers for example. They love tools like MAMP, WAMP and XAMPP because they remove the complexity and required knowledge of having to learn the intricacies of IIS or Apache. They are tools which developers run, and straight away they have a localised environment which allows them to concentrate on developing their applications.

Cloud computing is doing the same for servers. What do you want? OK, here it is in a ready-configured software instance; but it still has all those complexities within it. One wrong configuration change and it could be game over. That's OK though, because it's purely software and, in the cloud, software is quick and easy to swap. The downside is the expense of doing this. So why is it necessary?

Well as server products mature they become more and more complex and none of my admin friends look forward to having to rebuild a server from scratch. Here then is the key to the problem. Why go on with this conventional method of running a server?

As an example, a few years ago I played with a firewall product called Smoothwall. Smoothie was great. The software was all on CD and, once it was configured, the configuration could be stored on a floppy disk (which could then be write-protected). Sadly a typical server requires more than a CD of system software, but does it need more than a standard DVD? After all, it's this system software that is a pain to reconfigure. Moving the data to another machine is as easy as swapping the drive over (or restoring a backup to another drive if it was the data drive that died).

So imagine if there was a project to distribute a few standard server configurations like this. You could download a disc with everything ready set up in a standard configuration. All that would be needed are a few tweaks once the system was installed. Of course there would be security issues. If someone broke into it, every system would be affected. Then again, there would be lots of people working on a solution at the same time. Add the fact that the server could be rebooted and would re-install from read-only media, and it turns out that all you have to work on is the route the intrusion uses to get into the system. It would also mean that the configuration would be put together by experts using best practices.

The real advantage of this system would be in running it on your own local servers. No cloud tie-ins or expensive vendor changes. No hidden costs or data security concerns associated with your data being out there somewhere on another company's servers. I still think £2-3k for a new powerful server is a bargain compared to the price of consultants, so this would be a real goer in my books. Now that I've thought of it, it's time to see if anyone is already doing it.

7 October 2010

Cloudbusting 101

I've been hearing a lot about this magical environment just lately. A place where applications are always available, can be seamlessly scaled and provide high availability. It's the greenest computing ever and all of your apps are belong to us.
Well, I can now say at least I've investigated it, and I wouldn't jump ship just yet. The price for all this is that you are likely to get stung by hefty development costs, and you may end up tied into proprietary systems that could end up costing more than you expect.
 
Take for example Google's App Engine, which I have thoroughly enjoyed playing with. To do anything useful with it I'm going to have to learn Python or Java. Since cron jobs are likely to be useful, that choice is further limited to Python. While I have nothing against learning Python, it's important to remember that I've spent the past 10 years programming in PHP, which is apparently still the most popular Apache module. I know PHP well and I like it. Python, while not quite as picky as Java, has some unusual quirks about indenting sections of code. This means that not only am I restricted by choice of language, but I'm almost restricted to their choice of text editor until I understand this quirk. Not only that, but I am also faced with non-relational databases. Still, at least Google lets developers have a go at all this for free, unlike the competition.
 
Oddly enough, Microsoft Azure and Amazon's EC2 have both set their price for a minimal system at $0.12 per hour. That sounds cheap when compared to the cost of a server and someone to look after it. The trouble is, you don't actually understand what you get for that $0.12, because looking at Azure pricing (here) suggests you will pay more for having a database, and then pay more for each GB of data you pass into or pull from this magic cloud. For this sum of $1051.20 ($0.12 x 24 x 365) you will get an amazing 1.6GHz-driven VM with 1.75GB of memory and 225GB of internal storage. If you listen carefully you can just hear the server admin guy you just fired still laughing. Even louder as he hears that the competition wants your credit card details before you get anywhere near uploading your precious applications.
 
Let's see how this compares with the server in the next room to me though. The equivalent of my dual-processor Xeon-equipped server would be the large offering (4 x 1.6GHz processors, 1TB storage and 7GB memory - ok, so the memory is more than our 4GB). How much would this system cost? Well, it's $0.48 per hour, or ($0.48 x 24 x 365) = $4204.80 per year. Add the fact that we have a database on it, which we'll guess is slightly smaller than 5GB (that's an extra $49.95/month x 12), costing us an extra $599.40. We typically back up our server every week as users like a bit of storage security, and our data backups are typically 60GB. We'll be generous with knocking off the system files though and say that 10GB of that is not our data and wouldn't be transferred (we hope). So the transfer cost is 50GB per week for 52 weeks. That's (50 x 0.15 x 52) or $390. This is just transferring data from their storage to ours though, so we have to pay for their storage space too. If the data is 50GB backed up, it's likely to be around 80GB uncompressed. So for storage we also have to pay (80 x $0.15 x 12) = $144. So far we're up to $5338.20 per year, and that's before we've factored in any developer costs involved with migrating to cloud technology. Once we start adding in developers and consultants it could become really expensive, really quickly.
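
For anyone wanting to fiddle with the assumptions, the whole back-of-envelope costing fits in a few lines of VBScript (rates exactly as quoted above - plug in your own):

' Rough yearly cost of the 'large' cloud offering, using the rates above
vm = 0.48 * 24 * 365   'large VM instance: $4204.80
db = 49.95 * 12        'database (up to 5GB): $599.40
xfer = 50 * 0.15 * 52  'transfers out, 50GB per week: $390.00
store = 80 * 0.15 * 12 'storage, 80GB each month: $144.00
WScript.Echo "Yearly total: $" & FormatNumber(vm + db + xfer + store, 2)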
 
Oddly enough though, once you have everything online you're still going to need to give people access to it. How will they access your data? By this I mean the people in your company who utilise your data. In my workplace most people have a PC on their desk. Move all the server-side content over to the cloud and guess what - people still need a PC on their desk to access it. All you've done is released into the world a wild and grumpy admin who probably knows more about your business systems and data structures than you do. Not the wisest of moves to downgrade him to a desktop support role then. You also have to look at how similar this is to outsourcing your helplines to foreign call centres. Yes, you've saved some money on what started out as a great service, but how will it compare when more people jump on the bandwagon and the prices rise? What about when your customers start complaining that the service they get is not what they expected?
 
The whole consultancy process alone is likely to wipe out your IT budget at the moment, as the consultant may only be there for 3-6 months (just long enough to really make a cobblers of it), but what are you paying them compared to your server admins? To really test out the new service, why not try calling the consultant at 2am with a critical problem and check out how they react. The developers likewise may be eager to get on board, but it's such a new concept that any worth their salt will be charging higher rates than conventional developers right now. Yeah, you can buy a TV... or you can buy this new 3D TV and a limited amount of 3D-ready films. See the comparison?
 
So I may already be a dinosaur, but if my server dies I had the sense and good fortune to request a second for development (or spares). It may not be as quick to restore, but since there are 365 days in a year, one lost half-day of service still gives an uptime of 99.86%. I can make that even higher if I use the second server to test out the backups. Of course I may return to this topic in 6 months' time and completely change my opinion on the subject. By that time I may be a fully-fledged cloud developer if I continue tinkering with this latest toy Google have given me access to.
 

24 September 2010

Since Oracle have taken an interest in MySQL (see here) there have been some interesting developments. Not least of these is the target they have set themselves of increasing their share in Windows environments. I have to wonder how much of this is lip service and how much is a genuine "let's mutate our product into something new".

I have to say that I'm a big fan of MySQL and have been a DBA for several years now - and yes, running it on Windows servers at that. Even so, I can explain in a few paragraphs why most users drop it at the first opportunity.

As I work in education, I meet a lot of young people who typically have just 12 weeks to learn any new package. So let's see how MySQL stacks up against the competition in this area.

First and foremost, our students can get a free copy of Access via Microsoft's Academic Alliance (or equivalent licensing deal). This means that to compete, MySQL will have to give students a FULL product at the same price. Students can install the Access platform on any of their machines, and moving their data between them is as simple as copying a single file.

By contrast, for MySQL we rely on the latest community version of SQLyog (now SQL Workbench). So already we are using a third-party front end just to manipulate data. The next problem is that we have to teach students the strange concept that the database server is constantly running, not just when they start up MS Access. This means they have to back up their data to a SQL file, and this is where things usually start to go wrong.

You see we are limited to when we can update our version of MySQL on our server since once students start using it, we can't just upgrade when we feel like it. We have to wait until after submissions, marking and sometimes later reviews by external examiners. So we end up in a situation where students download the latest version and then struggle to upload their SQL file. Why? Because the SQL dump file seems to change format at almost every update.
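
One partial workaround is to have students create their dumps with a compatibility option, so that an older server stands a better chance of reading them. A sketch only - check that your mysqldump version supports the flag, and treat mysql40 here as just an example target:

mysqldump -u student -p --compatible=mysql40 studentdb > project.sql
mysql -u student -p studentdb < project.sql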

Add to this that students, like water, opt for the easiest route. Many of them will likewise opt to install WAMP, XAMPP or MAMP because, let's face it, even a seasoned administrator does not look forward to setting up a new server and having to configure IIS, MySQL and PHP all over again. If I had £1 for every time a student told me they had installed one of these and were now unable to get their projects running on our server, well, I wouldn't still need to work here.

Of course there have been other problems in the past as well. The OLD_PASSWORD issue (if memory serves), caused by a mismatch between the MySQL server's newer password hashing and the older PHP library attempting to connect. These things we just take for granted as administrators. Code moves on, and when you hit a brick wall there's always Master Google. Then again, if you want academics to support your cause, don't make major changes, because that means rewriting notes (and they did that last year or the year before).

So what should MySQL do to take over the Windows world? Well KISS might be a good starting point (the concept, not the band). For a start add a proper front-end. Make it simple to move data from machine to machine (including servers). Educate the user a little (explain how the DB is live and the save/backup is only a snapshot of the current data for example). Use XML for the data transfer and make sure that all future versions can import data from previous versions (remember MS Access 2007 has a limited "Save as Access 2003" format and claims to have forward compatibility).

Most importantly, start making these changes now. When Google launched a browser, Microsoft launched a search engine. It stands to reason that if MySQL launches a Windows developer IDE, Microsoft is likely to react by launching an Access Server. Which will have the highest uptake? Probably the one which is easiest to get started with and costs the least. We're public sector after all, and despite the recent cuts in public expenditure we've already seen our student-facing technical team cut by 50% over the last decade. Never have simplicity and reliability been more urgently required (so instead of %/localhost, a domain option would be useful).

If you really want to take the crown though, add some finesse. Have a query builder and tester in the user interface and then... wait for it because this is revolutionary... produce the PHP, VB, Java code to make the connection and bring back the results. The dev would then just need to concentrate on style sheets and making the page pretty.

17 August 2010

Hey Google, what's wrong with Blogger?

Is it just me or is everyone else experiencing the same problem when switching to the new design tools? There are some great images but it seems to lack the ability to upload your own. I played around with what was on offer on one of my other blogs and discovered that whenever I tried to change the design, it wouldn't let me add new gadgets to the layout and instead inserted the dashboard into the sub-frame. Did I do something wrong? I tried again and found the same thing occurred within the sub-frame.

I thought maybe the frequent design changes had confused it, so I exported my blog, deleted it and then tried to create a new version and import the content. Blogger just refused point-blank to have anything to do with the file. Arrrgh... undelete... undelete. Luckily I managed to restore the old version. You do have to wonder how committed Google are to the Blogger service though. Hmmm... maybe I should try it in Chrome to see if that makes a difference.

Apparently it does, because changing the design of this blog in Chrome worked flawlessly. So there is a need for Chrome then.

18 June 2010

Some Photography Abbreviations

Sometimes I forget what these are so this is my handy reminder post.

AF - Auto focus
AF-S - Auto Focus with Silent mechanism (supposedly)
VR - Vibration Reduction (helps with camera shake)
DX/FX/1.6/1.3 - Refer to frame sizes: See here

I will update this one as and when.

Annual Versioning System - AVS

Here's my thoughts on why development of an AVS would be a good idea for all software producers. At the moment there are millions of packages out there, and it's going to be a real pain for anyone in the future to demonstrate a set-up from the past. Why? Well, Windows is now Windows 7, yet Windows Media Player is on version 11. Then every few years producers go off and create something new, which we all suspect is just to raise more money. In 20 years' time, if we need to emulate a PC system from today, how will we get the versions right?
I propose the AVS system. A product will be versioned by the date it was released, so a new product released today would be v10.6 (as in 2010, 6th month). This could be extended to specific days, e.g. v10.6.18. It may sound weird to start something at version 10, but the benefits should be obvious.

19 March 2010

Why Tomcat's should be neutered

Two hours I sat staring at a page of code which by rights should have worked perfectly first time. After all, the important loop I added came almost exactly from a text-book example. The problem? Tomcat was throwing up an error: "resultset cannot be resolved to a type". There were lots of pages on Google suggesting making sure certain files were in the correct folders. The answer though was a real D'OH!!!! moment. Yes, due to Java's case sensitivity, it turned out that I should have used ResultSet rather than Resultset (the capital S in Set if you missed it). This is why there's a good reason to hate Java & JSP.

In fact the whole problem with Tomcat is neatly demonstrated over at the Tomcat wiki (see here: http://wiki.apache.org/tomcat/FAQ/Windows). Can I turn off case sensitivity? A one-word response - Yes. Awesome classic fail. A few Googles later reveal why this is a really bad idea, as it can lead to exploits, one of which allows attackers to see your source code. Nice!!! Someone else already pointed out that this could be perceived as a bit of a design flaw. Sort of like the architects leaving out the windows to negate the need for comfort cooling.

Still, there is good news on the horizon. The guy who gets paid at least 10K/year more than me to support the developers is back from paternity leave next week. For another 10K I might be convinced there's a place in the world for Java, but I still remember my lecture on XPath. I replaced eleven lines of Java which didn't work with five lines of PHP which did. I'm afraid I don't have the time to write twice as much code and spend four times longer fixing it.

11 February 2010

Why corporate policy inhibits productivity

Here I am, sitting at my desk faced with quite a dilemma. Our organisation has invested in new anti-virus software (which I won't mention by name). The internal systems team are overjoyed with the new product because its pro-active defence system stops a lot of new viruses from getting onto our systems, which means less work for them.

Unfortunately I am a mere cog in the machine, and this new pro-active system has so far stopped Java from working (by quarantining JAVAW.EXE ????), it has deleted SMART Board software, and currently I am waiting for them to allow me to use a copy of Pinnacle 9 on a system I have set up to perform video file editing & conversions.

So what are my options exactly? Do I wait another 2 days for our team to implement the ruleset I suggested (which tells the A/V to leave everything in the Pinnacle folder alone)? Should I wait as they add each single part of the application, one executable at a time? My only other option is to use a non-networked machine which will at least let me get on with the job in hand. While this suits my needs, I can't help thinking that since this machine has no antivirus, I can't put it on the network. Which means if it ever does get a virus from transferred files, it will become an incubator for whatever infection finds a way onto it.

If you find yourself out there considering writing a new antivirus package, it might be worth contacting a few software vendors first. I would suggest a simple CD or DVD checksum approach: if the checksum is valid, let my applications install and allow them to do whatever they need to without quarantining anything. It might just save me 3-4 days in the process.
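In case that sounds vague, here's a rough Java sketch of what I mean (the installer path is hypothetical, and a real A/V product would compare the digest against a vendor-published whitelist rather than just printing it):

import java.io.FileInputStream;
import java.security.MessageDigest;

public class InstallerChecksum {
    public static void main(String[] args) throws Exception {
        // hypothetical path to an installer on the vendor's CD
        FileInputStream in = new FileInputStream("D:\\setup.exe");
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) > 0) {
            md.update(buf, 0, n);
        }
        in.close();
        // print the digest as hex for comparison against a trusted list
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        System.out.println(hex);
    }
}

If the digest matches the vendor's published value, whitelist everything the installer drops; if not, quarantine away.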

3 February 2010

Moblin for the Masses?

While the rest of the world is going iPad crazy and producing all sorts of humorous comparisons, I've actually taken advantage of the fact that netbooks have plummeted in value, to the point where I can now pick up a 1.6 GHz Atom-powered machine with a 160 GB hard drive for little more than the very first Asus Eee PCs cost when they appeared. Am I that far behind the times? Not really, but my employer bought some of the first Eees, which were already outdated by the time we got them (it seems that anything you order in the public sector costs at least 10% more than it should and takes an eternity to arrive due to all the bureaucracy).

Anyway, after reviewing the market I finally decided that prices had stabilised enough that there is very little difference between the various netbooks available. They would seem to be more powerful than an iPad, and with Windows installed these days I can make use of Tucows, 5 Star Shareware, MajorGeeks, CNET Downloads and all the others (one app store is soooo last decade, don't you think?). It also means that if Apple does sort out the lack of published material in the UK ebook market, I can always install iTunes to take advantage of the fact.

So what did I go for? After much researching and late-night review-reading, I finally opted for the Samsung NC10. One of the things which pushed me towards this decision was that it was known to run Moblin, which I have wanted to try out since I saw the first demo video around this time last year. What is Moblin, I hear you ask?

Moblin (or MOBile LINux) is a Linux distribution targeted at netbooks and Mobile Internet Devices (MIDs). The Moblin interface is quite simple, but at the same time has some very neat features. It also has some limitations which I hope will be tackled in future revisions, because I really want to be using Moblin as my main operating system in future years.

First the good points. Moblin has a sort of start page called Myzone which keeps track of your task list, recent websites and media files you've accessed, and even your Twitter feed. These are shown as medium-sized icons which make it easy to tell at a glance just which content they represent. A similar process happens as you browse: clicking the internet icon takes you to a page filled with thumbnails of recently accessed pages.

I may have cheated a bit by picking hardware which I knew would run the wireless networking, but connecting Moblin to my home WPA-secured wi-fi network was as simple as entering the password (once I'd figured out where the network icon was). So my initial view was that this is a fantastic system with a brilliantly intuitive interface.

As I'd already freed up some drive space during set-up, I took the plunge and installed it to the hard disk, since bizarrely it didn't seem to save anything when running from a USB stick. I'm glad to say that it does once it's properly installed, and it didn't trash my Windows XP Home installation in the process (although you need to be quick at the boot menu to get back to it).

Now the bad, and I hate to say there are bad points about something which looks so good, but here goes. First of all, the Twitter page took a while to accept my settings. There was no visible feedback on that page that my username and password were correct, and you have to go back to Myzone to see the feed. Not very intuitive.

Then there's the display theme. It's too bright at night and too dark in my office, and there doesn't seem to be any way to change it. The clock also doesn't display the day (some days I'm just so busy that it's nice to be reminded, OK?).

Moblin also struggled with my test media, giving me an error about missing GStreamer codecs when I tried to play my MP3 files, and again when trying to play AVI and MOV files. Yet it will happily play formats I don't have content for (.ogg for music and .ogv for video). Eventually I managed to get MP3s playing via a Firefox browser add-on. The most successful content test was with some PDF ebook files I have. The PDF viewer is awesome and very user-friendly; it's just a shame that the display was so harsh in a dimly lit room that I really didn't feel like using it as an ebook reader.

Shared file storage is also an issue: unlike other distributions (Puppy, for example), Moblin doesn't seem to want to know about NTFS partitions, not even mounting them read-only to get at the content. So my 20 GB of media storage is inaccessible - not a good thing really.

So to conclude, Moblin is very good, but equally bad. It would be great for someone who had never used a computer before; someone who has never used Windows, never downloaded an MP3, and has no idea that you can usually customise a desktop to suit your preferences. For the rest of us, it's too great a sacrifice at the moment. I would need time to understand how to rewrite parts of it and learn to tweak it to suit my needs. For now I will continue to play with Moblin, at least until I figure out how to remove it cleanly. I will then probably install Puppy into that partition and try to find the apps that will make Puppy look like Moblin. Its user interface is so sweet, but it's not currently backed up with usability, in my opinion. A real shame, as if it did everything Puppy does, I could even contemplate ditching Windows (at least on my new netbook).

1 February 2010

Why the iPad could be a good thing for the UK

So the hype and the secrecy are over and we have all seen the iPad device previews. There has been a mixed reaction: some ask why they should buy one when they already have an iPhone, while others have commented on its lack of Flash support, multitasking and USB ports. Whatever you think of the hardware, two things are apparent from the outset.

The first is that Apple think this is what you should use to read ebooks and emagazines. The rest of the world is unconvinced, especially those who spend a lot of time reading books. Those who sell books are also quite busy promoting E Ink devices, which update their screens only once per page turn, therefore supposedly doing less damage to your eyes and preserving precious battery life. Just this last week I started reading an ebook on my HTC Magic, only for the experience to be cut short when the battery ran flat.

Still, it's the second point about Apple's new device which may be good for the UK. The iPad appears to be just another platform for iTunes. Yes, they really want you to buy most (or all) of your digital content from them. A lot of people in the US are commenting that this is not a good thing, and that Apple will do to publishing what they've already done to music. Now stop and think about that. In the UK you can get iTunes gift vouchers in most supermarkets these days. Contrast that with trying to buy an ebook: in the cases where I've tried, I've been confronted by pages saying things along the lines of "This file is only available in the U.S. or Canada". So if Apple actually do improve this situation then maybe, just maybe, something useful will come from the iPad.

Still, the ebook market is young and vibrant. It's been interesting to see how certain online music stores, supporters of DRM-free music downloads, have not applied that ideology to publishing by releasing their ebooks in DRM-free formats. I wouldn't be a bit surprised if E Ink screens start turning up on mobile phones or as plug-in USB accessories, maybe even as add-on wi-fi screens. So in such a volatile market, why would I rush out to buy an iPad? In twelve months' time there will be another, maybe with USB or a video camera, and probably with more battery life.

If they do actually sort out the UK's access to ebooks, I might be tempted. I have shelves straining under the weight of hefty computer books, because that's the only way I could buy them. So the old webmaster saying that "Content is King" applies here. It's HD DVD vs Blu-ray for books, only with more formats and devices. Eventually one will lead the pack. Authors will want to be exposed to as many income streams as possible, so expect some severe market shake-ups. Consumers also want to choose their readers based on tactile feel and user interface, rather than on which files they support and how easy (or not) it is to get the right content.

When it comes to electronic media distribution, you have to concede that, love it or hate it, Apple is a big player, possibly even the biggest. When it comes to user interfaces, Apple are again emulated by everyone else. Damn it, I'm not ready to turn into an Apple fanboy. Not until they implement proper drag & drop. I guess I need to see the Lenovo hybrid, the Entourage Edge and the Notion Ink Adam before I part with my hard-earned cash. I just hope they are all represented at the Gadget Show in April.