Repeatable, consistent, and predictable are three qualities that add an incredible amount of value in IT, and building servers from a base image is one way to deliver on them. I was just replying to a thread on a discussion alias where the person who started it had reviewed a blog post (http://www.jasonsamuel.com/2010/05/07/how-to-build-a-vmware-vsphere-vm-template-for-windows-server-2008-r2/) on how to build such an image for VMware. A number of people, myself included, disputed the recommendations made in the referenced post, as well as the various other things the thread starter was planning to install in his image/template.
At a high level, the most important point from my reply is that you should not customize a server to make it convenient to your work style. The server is there for a purpose-driven task. You shouldn't even be connected to it via Remote Desktop (or sitting at the console) unless you're troubleshooting a problem that can only be investigated from the desktop. What this means is that all the little utilities and tweaks you have on your desktop should stay on your desktop: PDF readers, Internet Explorer customizations, screensavers and colors, taskbar preferences, and so on. First and foremost, many of these tweaks are profile-specific, so when you log in with a domain account later, after running sysprep, the tweaks will be gone unless you push them into the default profile and, in turn, onto all users. Secondly, when you move on to the next big opportunity, your successor shouldn't be expected to inherit and adapt to (or spend weeks undoing) your personal preferences.
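For completeness, the mechanism for pushing a customized profile into the default profile at sysprep time is the CopyProfile setting in an unattend.xml answer file. A minimal fragment as a sketch (the component name, pass, and public key token are the standard Windows Setup values; the architecture shown assumes a 64-bit Server 2008 R2 image):

```xml
<!-- Fragment of an unattend.xml answer file passed to sysprep, e.g.:
     sysprep /generalize /oobe /unattend:unattend.xml
     CopyProfile=true copies the local profile you customized into the
     default profile, so accounts created later inherit those settings. -->
<settings pass="specialize">
  <component name="Microsoft-Windows-Shell-Setup"
             processorArchitecture="amd64"
             publicKeyToken="31bf3856ad364e35"
             language="neutral"
             versionScope="nonSxS">
    <CopyProfile>true</CopyProfile>
  </component>
</settings>
```

Even with this available, the point above stands: the fewer personal tweaks you bake into the default profile, the less your successor has to untangle.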
As far as system settings go (e.g. page files, network settings), tweaking or disabling things just because some random guy on the Internet (including me) told you to is not a great idea. When you install an application from a vendor, be it Microsoft or a third-party ISV, chances are their testing assumed the OS is in a state relatively close to the default install. When you start disabling services (e.g. Indexing, Print Spooler) by default in an image, you provide a baseline that doesn't meet that assumption. Rather than customizing by default, customize for specific applications. The settings in an image should be only those required to operate at a minimum level of functionality on your network (e.g. maybe you need some custom DNS search suffixes to join the domain, or settings to meet security standards). This extends to almost any system-level default. When you're looking at changing a default setting, first stop and ask yourself "why?" What's the reason this is being changed? Next, consider whether it needs to be hardcoded in the image at all, or whether you can simply apply that configuration change via Group Policy. Anything hardcoded will require you to update, re-test, and re-release the image whenever the value needs to change, whereas a Group Policy setting is easily tweaked centrally.
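The DNS search suffix example illustrates the trade-off nicely. As I understand it, the "DNS Suffix Search List" Group Policy setting lands in the policies hive of the registry on each client, so delivering it centrally amounts to a value like the following (.reg export shown for illustration only; the suffixes are placeholders, and you would set this through the GPO editor rather than importing a file):

```reg
Windows Registry Editor Version 5.00

; Written by the "DNS Suffix Search List" policy under
; Computer Configuration > Administrative Templates > Network > DNS Client.
; corp.example.com / example.com are placeholder suffixes.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\DNSClient]
"SearchList"="corp.example.com,example.com"
```

Change the GPO and every machine picks it up on the next policy refresh; hardcode the same suffixes in the image's NIC settings and you're rebuilding the template when the domain structure changes.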
Inevitably there are various tools, gadgets, trinkets, etc. that make troubleshooting easier. My experience is that a great many of these tools don't require installation and will simply run from a network share. Rather than caching tools locally on each server, or installing individual copies, put them on a share that you can access from any machine. This way you only need to refresh one location when a tool gets updated, and you also aren't installing essentially random software on servers on an as-needed basis. As soon as those installs aren't consistent across all servers, your test matrix starts to grow. Personally, the only things I consistently install in a server build are Netmon or Wireshark, and occasionally BgInfo (from Sysinternals).
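The Sysinternals tools are a good illustration of this: Microsoft publishes them on a public share (Sysinternals Live), so they can be run over the network without installing anything. A quick sketch from a command prompt (\\fileserver\tools is a hypothetical internal share name; substitute your own):

```bat
:: Run a tool straight from Microsoft's public Sysinternals Live share --
:: nothing gets installed or cached on the server.
\\live.sysinternals.com\tools\procexp.exe

:: Same idea with an internal repository (\\fileserver\tools is a
:: placeholder): update the tool in one place, every server sees it.
\\fileserver\tools\procmon.exe
```

Tools that install drivers or services (Netmon and Wireshark being the obvious examples, since they need a capture driver) are the exception, which is why they end up in the short install list above.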