Virtual Machine Parent-Child Relationships

A coworker ran into a problem the other day that I wanted to highlight for those who may run into it. First off, never, ever resize a VHD that has snapshots on it. Snapshots are child disks that reference a parent VHD and record changes relative to specific locations in it. When you resize the parent VHD, those references in the child disks become invalid, and they must be corrected before the chain will work again.

You can fix the problem with VhdTool.exe. The program is a bit hard to find now, but once you have it, the process is quite straightforward.
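
If memory serves, VhdTool's /repair operation is the relevant one: you point it at the broken child disk (the .avhd snapshot file) and at the resized parent, and it rewrites the child's parent references. A sketch, where the file paths are placeholders of my own; the first argument is the child disk whose parent link is broken, and the second is the resized parent:

    VhdTool.exe /repair "D:\VMs\Server1_snapshot.avhd" "D:\VMs\Server1.vhd"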

Enhancing quality assurance with virtualization

Virtualization shines in quality assurance, development, and testing. Backups of virtual machines can be restored to another environment and operate just like the original. This allows an organization to better test its backups or its business continuity plan without interrupting production systems. As with a hot, warm, or cold site, virtual machines can be deployed in an alternate location just as they would be in a disaster, allowing the organization to determine how long it takes to make systems available in an emergency. As a result, team members will be more familiar with the process and less likely to make mistakes that could delay making the organization operational again.

Regardless of whether you have a virtual environment, it is important to test your backups and business continuity plan as part of your information security risk management strategy. That way you can proactively identify flaws in the plan before an actual emergency. It is much better to correct an error before it results in lost data or lost profits.

Virtualization business continuity with snapshots

Snapshots are a valuable feature that virtualization offers for business continuity. Organizations can create point-in-time recovery points many times a day by taking snapshots. A snapshot records all changes to a virtual machine so that the machine can be restored to the state it was in when the snapshot was taken. This is especially valuable when making changes to a virtual server, because changes do not always work as planned. If a change impacts a system negatively, the virtual server can quickly be rolled back to its pre-change state by using a snapshot.

Snapshots also improve the Recovery Point Objective (RPO) by enabling the organization to recover a system to a point in time after the daily backup was taken. Many systems may be backed up only once a day, but snapshots can be taken throughout the day. If a failure occurs during the day, you can recover back to the most recent snapshot and lose less data than if you had to recover all the way back to the previous night's backup. This assumes, of course, that the snapshots have not been damaged along with the system.

Virtualization at hot, warm, and cold sites

Business continuity plans define the processes necessary to protect organizational assets and keep the business running in the event of a disaster or local incident. Backups and recovery are important elements of business continuity, but sometimes an organization needs a shorter Recovery Time Objective (RTO). In some cases, organizations set up hot, warm, or cold sites from which they can pick up business in the event of a disaster. A hot site is immediately ready to assume the workload of the production site; a warm site requires some data to be restored to it; and a cold site requires servers to be turned on, and possibly configured, before data is loaded onto them.

Virtualization helps make systems available rapidly in an emergency because entire virtual machines can be backed up. This allows an organization to start up backed-up machines in a completely different environment without worrying about hardware compatibility. Formerly, hardware at hot, warm, and cold sites had to be identical to the production site's in order to avoid compatibility problems during restores, which is extremely costly. With virtualization, the hosts that house the virtual servers can use different hardware from the production systems without incurring compatibility problems, because the hardware presented to each virtual machine is identical. Hosts accomplish this through a process known as abstraction, where virtual machines are presented with resources in a generic way while a service known as a hypervisor manages the physical resources behind the scenes.

ASP.NET AJAX and IIS7

I have been struggling with some ASP.NET AJAX code. I moved the site from IIS6 on 32-bit Server 2003 to IIS7 on 64-bit Server 2008. After a lot of searching and banging my head against the wall, I found a blog post on MSDN explaining how to add a binding redirect to my web.config file so that the code would use the AJAX assemblies built into .NET 3.5. Here it is.

  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="System.Web.Extensions" publicKeyToken="31bf3856ad364e35"/>
        <bindingRedirect oldVersion="1.0.0.0-1.1.0.0" newVersion="3.5.0.0"/>
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
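
The runtime element goes directly under the root configuration element of web.config. If I remember the MSDN post correctly, it also includes a matching dependentAssembly entry for System.Web.Extensions.Design with the same version range, which you will want if your project references the design-time assembly.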

Obtained from: http://blogs.msdn.com/webdevtools/archive/2007/07/28/upgrading-asp-net-ajax-1-0-websites-and-web-applications-to-net-framework-3-5.aspx


Pipe Dream: Data migration with batch files

So I was working on a problem at work today. We can only copy 4,000 files at a time, and I need to copy millions of files. I grabbed a directory structure with file counts using TreeSize and dumped it to an Excel spreadsheet. Next I wrote a program to take all folders that have more than 4,000 files in them and separate them into smaller chunks. The next step was to output batch files containing thousands of xcopy commands that could run during certain windows on my servers.

This is where I ran into a snag. What do you do with folders that have more than 4,000 files? We separated them out, but I did not have the names of the files inside those folders, only the folder name and the file count. Well, I came up with an idea: put variables into the batch file we were programmatically creating, and use a command such as dir to feed the file names to the copy command. I typed out some pseudo batch file code and talked it over with the team. Here is what I had.
    dir [source] @1 | xcopy [source]@1 [destination]@1
The dir command, or some similar command, would output all the files in the directory specified as [source] and feed them one by one into the variable @1, which xcopy would then use to copy those files, with the pipe | connecting the two. Well, here comes the name of my blog post. The idea was good and well received by the team, but the commands did not support a syntax like that, so I termed the idea my "pipe dream".
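
For what it's worth, batch can get close to the same effect with a for /f loop rather than a pipe: for /f iterates over the output of a command and hands each line to another command. A sketch, where C:\source and D:\dest are placeholder paths of my own:

    rem Copy each file listed by dir, one at a time, keeping the file name.
    rem /b prints bare names, /a-d skips subdirectories.
    rem Assumes D:\dest already exists; C:\source and D:\dest are placeholders.
    for /f "delims=" %%f in ('dir /b /a-d "C:\source"') do (
        xcopy "C:\source\%%f" "D:\dest\" /y
    )

Inside a batch file the loop variable is written %%f; typed directly at a command prompt it would be %f. The "delims=" option keeps file names with spaces intact, since for /f would otherwise split on whitespace.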