Windows 7 SP1 - update readiness
Infrastructure
Written by Ian Edwards   
Saturday, 26 February 2011

Microsoft have begun rolling out Windows 7 Service Pack 1. While the rollout appears to have been largely successful, some failures have been reported; these have been attributed to inconsistencies in the Windows Update code. The quickest and easiest repair option is the Windows Update Troubleshooter. Open Control Panel, click in the search box and type troubleshoot. Click the Troubleshooting link at the top of the search results, then click System and Security, and finally click Windows Update.
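
Incidentally, the same troubleshooter can be launched without going via Control Panel, using msdt.exe and the Windows Update diagnostic pack. A minimal sketch in Python; the pack name WindowsUpdateDiagnostic is what I believe Windows 7 uses, so treat it as an assumption for your machine:

    # Launch the Windows Update troubleshooter directly, skipping Control Panel.
    # Assumes Windows 7 or later, where msdt.exe ships with the
    # WindowsUpdateDiagnostic troubleshooting pack.
    import subprocess

    subprocess.run(["msdt.exe", "/id", "WindowsUpdateDiagnostic"], check=True)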

Microsoft have also produced an update readiness kit to detect and fix these inconsistencies. If you have problems installing Windows 7 SP1 and the above process fails, download and run the update readiness checker appropriate to your version of the operating system. If you, like me, approach the installation of major service packs with trepidation, run the update readiness kit before applying the service pack.
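
If you want to see what the readiness tool actually found, it writes its results to the CheckSUR log under the Windows folder; on my machines this is %windir%\Logs\CBS\CheckSUR.log, but treat that path as an assumption for your setup. A rough Python sketch to pull out the summary lines:

    # Print the interesting lines from the System Update Readiness Tool's log.
    # Assumes the default log location, %windir%\Logs\CBS\CheckSUR.log.
    import os

    log_path = os.path.join(os.environ.get("WINDIR", r"C:\Windows"),
                            "Logs", "CBS", "CheckSUR.log")

    with open(log_path, errors="ignore") as log:
        for line in log:
            # The summary section reports how many errors were found and fixed.
            if "error" in line.lower() or "fixed" in line.lower():
                print(line.rstrip())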

 
Windows 7: Annoyance No 1: Constant disc access
Consumer
Written by Ian Edwards   
Wednesday, 16 February 2011

Of late I have been driven to the point of insanity by incessant disc activity from my Windows 7 desktop. As this machine does actually sit on my desk, the constant clattering of the hard drive for no apparent reason had become quite a distraction.

At first I blamed my anti-virus software, thinking it was scanning the drive even though I had configured it not to. I was also ready to blame Sync Centre, which I use to replicate files between my server and desktops - the idea being that normally my files reside on the server, but Sync Centre keeps a local copy so that if I'm not connected to the network I can still access my files. Good for laptops and as a business continuity measure. Sync Centre was in the frame because the problem only seemed to occur when attached to the network, but Sync Centre itself appeared to be inactive (according to the system tray icon) most of the time this was happening.

Well, I think I have tracked down the culprit: a service called Superfetch. Superfetch preloads frequently used programs and documents into a RAM cache. Apparently it tracks usage patterns and learns what you do and when, so that applications load faster. It also interacts with defrag to optimise the boot process. That may well be, but disabling it appears to have stopped the disc thrashing, and that's a big performance boost so far as I'm concerned. Generally it's not recommended to disable Superfetch, and my problem may be due to some interaction between Superfetch and Sync Centre that won't affect everybody, but for the time being Superfetch is staying disabled.

To disable Superfetch, run services.msc, scroll down the list of services until you find Superfetch, edit its properties and set its startup type to Disabled. If it is running you can stop it immediately by clicking Stop (obviously).
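
For what it's worth, the same change can be scripted rather than done through the services console. The service's internal name is SysMain, so the sc.exe commands below should do it - a rough sketch in Python, to be run from an elevated prompt, and only if you've decided the trade-off is worth it for you:

    # Scripted equivalent of the services.msc steps above: stop the Superfetch
    # service (internal name SysMain) and prevent it starting at boot.
    # Must be run from an elevated (administrator) prompt.
    import subprocess

    subprocess.run(["sc", "stop", "SysMain"], check=False)   # ignore failure if already stopped
    subprocess.run(["sc", "config", "SysMain", "start=", "disabled"], check=True)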

Links:

Microsoft Superfetch overview  -  Microsoft PC accelerators download paper

 

 
Virtual Desktops - a blind alley?
Infrastructure
Written by Administrator   
Friday, 11 February 2011

There are probably three hot topics in IT at the moment: "the cloud", "mobile" and "virtualisation". There is overlap between these areas, but it is virtualisation I want to address in this post. Virtualisation is technology which appears to make something out of nothing. Where you might have had five physical file servers, for example, you now only need one, which can be split up using software into a number of virtual servers. In logical terms these appear to be physically separate (if you browse the network from a PC you'll still see what appear to be five different file servers). Virtualisation at the server end makes perfect sense to me. Done right, it offers huge benefits in hardware cost savings, operational reliability, scalability and business continuity.

What I am not convinced of is the benefit of desktop virtualisation, or more specifically Virtual Desktop Infrastructure (VDI). This is a scenario where the end user, instead of having their own relatively high-powered PC with all its peripherals and applications, is given effectively a keyboard and screen which present a view of an instance of the operating system or application running on a server in a remote datacentre somewhere. It is a 21st-century equivalent of the dumb terminal "green screens" that were appearing in business up to the mid-80s, except this technology allows on-demand provisioning of applications and data on pretty much any device.

The argument for virtualised desktop infrastructure is that it is easier to manage and deploy, cheaper to install and, with no data saved locally, more secure than a conventional client-server arrangement. The cost issue would have to be argued on a case-by-case basis; all I can say is that PCs are relatively cheap now and you'll still need a screen, which is probably half the cost of a PC anyway, while virtualisation pushes a lot of the hardware and software costs back to the datacentre along with a stack of complexity. There would be a saving on power, particularly the cost of power distribution to desktops, which in a large estate would be considerable - but that's it. To be fair, Microsoft themselves don't make any claims for cost saving.

Yes, virtualising the desktops makes them easier to manage than conventional Windows clients, but tools to configure and lock down desktops and remotely install applications have been available for ten years or more. Similarly, "self-service" user provisioning has been possible for nearly as long. The advantage VDI has over these more traditional techniques is that it separates the application from the device and its operating system.

So what's driving this? In my view it's the legacy hardware and software manufacturers who have made their fortunes out of the dominance of the Wintel (Windows and Intel) alliance over the past decade. Windows still dominates the desktop, certainly in business, but it is almost certain that the computing device most people use on a daily basis isn't running Windows - it's their mobile, and Windows has a declining share of that space. The applications those users access via their mobiles are not likely to be Windows based either - in most cases they will be accessing the web, which largely runs on open source technology. Facebook, for example, uses open source technology (see here), as does Google. Most of the web runs on open source software: in 2010 Apache, the open source web server, took its share from 46.6% to 59.4%, a gain of 12.8 percentage points, while Microsoft's IIS went from 21.0% to 22.2%, a gain of 1.2 points (source: Netcraft).

The corporate world is still largely wedded to Microsoft, but increasingly organisations are using a mix of technology. Apple computers are starting to appear on desktops beyond their traditional media stronghold, and open source is appearing in the datacentre / server room.

Now don't get me wrong, I am not predicting the demise of Microsoft (not just yet anyway), but the once-dominant corporation has a battle on its hands to keep up with the pace of change and user expectations.

So virtualised desktops offer advantages in some situations, but could it be that the major manufacturers are just trying to squeeze a few billion dollars more out of the Microsoft stack before user pressure to consume true web-based applications (as opposed to thick-client apps presented in a browser) on mobile devices finally takes over the corporate space? Will we still be talking about VDI in ten years' time?

Follow these links to find out more about VDI ...
Microsoft VDI  -  VMware  -  Citrix