To be a good engineer, you need a Testlab. End of sentence.
You need it so you can pursue flights of fancy: standing up some web services, trying out that new language, and other endeavors perhaps not specifically related to your day-to-day work.
It HAS to be your own, too! You can’t just use the one at your work. If things go awry between you and your company, you definitely don’t want to lose your livelihood AND your hard-earned testlab in the same stroke! It’s the same reason you don’t want your life insurance purchased through your work (or if you do, make sure you don’t get fired and die on the same day).
In consulting, I would get assigned to a project and have a month or so to come up to speed on new technologies. I found that when I had a testlab, it was so much quicker to get working: just make a new VM, domain-join it, and have SQL installed and ready for a new SCCM, Scorch, Air-Watch, whatever. In fact, the periods when I did the best engineering work of my career closely line up with the times that I had a working testlab available to model my customers’ environments and make mistakes on my own time, not theirs.
Today, I woke up to a nasty error in the FoxDeploy Hyper-V lab. All of my VMs were stopped, and wouldn’t start! When I tried to start one, I’d see this error:
An error occurred while attempting to start the selected virtual machines: General Access Denied Error…
VMName: Account does not have permission to open attachment <PathToVHD>
In my case, this happened because I have a RAID of SSDs for maximum IOPS for my VMs (can’t stand something being slow!) and I lost a drive. Somehow, in rebuilding the volume, permissions were lost on some items on the drive, and my Recycle Bin was corrupted as well.
The symptoms: you can’t start any pre-existing VMs, but you can make a new one. Something is wrong with permissions (namely, the VM no longer has Full Control rights to its VHD). In the image below, you can see a new and working VM on the left and a broken VM on the right. Note the missing VMid entry.
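A common fix for this exact error, and a likely candidate here, is to re-grant the VM’s virtual machine account Full Control on its disk. Here’s a sketch of how you might do that with `icacls`; the VM name and VHD path below are placeholders, so substitute your own:

```powershell
# Placeholder names; swap in your own VM name and VHD path
$vmName = 'MyVM'
$vhd    = 'D:\VMs\MyVM\MyVM.vhdx'

# Each Hyper-V VM runs under a virtual machine SID of the form
# 'NT VIRTUAL MACHINE\<VMId>'; re-grant that SID Full Control on the VHD
$vmId = (Get-VM -Name $vmName).VMId
icacls $vhd /grant ("NT VIRTUAL MACHINE\{0}:(F)" -f $vmId)
```

Once the permission entry is restored, the VM should start normally again.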
Were you one of those who installed the Server 2012 binaries into Windows 8.1 to enable disk deduplication? Did you turn on dedupe on all of your drives, save hundreds of gigs of storage space, and then upgrade to Windows 10?
Upon boot, were you greeted with frequent ‘the machine cannot access the file’ errors? If so, then this is the guide for you!
This fixes the error 0x80070780: The file cannot be accessed by the system
What happened to my stuff? Did I lose it all?
NO! You did not lose your files. When you ran deduplication, Windows gradually scrubbed the common elements out of many files in order to compress them, much like what happens when you put files into a .zip or .rar archive.
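If you just need the data readable again, one approach (a sketch, assuming you can attach the affected volume to a system that has the Deduplication feature, such as a Server 2012 R2 VM) is to rehydrate the files back to their normal form. Note that unoptimization needs enough free space on the volume to hold the re-expanded files:

```powershell
# Run on a system with the Deduplication feature installed and the affected
# volume attached (drive letter and file path are assumptions; use your own)

# Rehydrate the entire volume back to normal, un-deduplicated files
Start-DedupJob -Volume 'E:' -Type Unoptimization

# Or rehydrate just the files you need right now
Expand-DedupFile -Path 'E:\Data\ImportantFile.bin'
```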
If you’re keeping up with the Azure talks from TechEd Barcelona this week (and you should be!) you’ve heard a lot of mentions about Docker recently.
Wondering what it is?
Docker is a technology that allows an application developer to package an application and sequence its execution, ‘containerizing’ it in a portable container which can be docked and executed anywhere, seamlessly alongside other applications, without installation dependencies causing interference. The key differentiating point is that the application and its dependencies are virtualized, rather than the underlying OS as in traditional virtualization. The big benefits of abstracting away the OS are portability and a significant reduction in overhead.
Instead of handing out an install script and binaries, developers can provide a fully configured Docker image, which can be run on any system with the Docker engine installed, including OS X, Windows, and pretty much any version of Linux. Keep in mind that when you run Docker on Windows, VirtualBox is used under the hood, though Docker itself does not require hardware virtualization support.
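As a quick illustration (the image name and ports here are just examples, not from any particular talk), running a containerized app becomes a one-liner once the Docker engine is available:

```powershell
# Pull the image if it isn't cached, then start it in the background,
# mapping host port 8080 to the container's port 80
docker run -d -p 8080:80 --name web nginx

# List running containers; the same image runs unchanged on any Docker host
docker ps
```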
Don’t be a dummy: use the fantastic tools built into Hyper-V 2012 to make it easy to migrate your VMs onto new storage or over to a new host. Never mess with copying .vhdx or configuration files again.
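For the storage-only case, here’s a minimal sketch using the built-in cmdlets (the VM name, destination path, and host name are assumptions for illustration):

```powershell
# Move a VM's configuration, VHDs, and snapshots to new storage in one
# operation; the VM can even stay running during the move
Move-VMStorage -VMName 'MyVM' -DestinationStoragePath 'E:\Hyper-V\MyVM'

# Or migrate the VM to a new host, bringing its storage along
Move-VM -Name 'MyVM' -DestinationHost 'NewHost' -IncludeStorage `
    -DestinationStoragePath 'E:\Hyper-V\MyVM'
```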
I rebuilt the storage in my test lab, making use of Storage Tiering in Storage Spaces to pool two 64 GB SSDs and one 1.5 TB HDD, getting the best of both high-speed reads and writes and bulk storage without needing to move my files around manually. The tiering feature of Storage Spaces detects which files are used most and automatically promotes them from bulk storage up to the faster SSD tier. How cool is that!
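A rough sketch of how a tiered space like this could be built in PowerShell; the pool name, tier names, sizes, and resiliency setting are all assumptions for illustration, not my exact commands:

```powershell
# Gather the poolable disks (the two SSDs and the HDD) into a new pool
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName 'LabPool' `
    -StorageSubSystemFriendlyName '*Storage*' -PhysicalDisks $disks

# Define an SSD tier and an HDD tier within the pool
$ssd = New-StorageTier -StoragePoolFriendlyName 'LabPool' `
    -FriendlyName 'SSDTier' -MediaType SSD
$hdd = New-StorageTier -StoragePoolFriendlyName 'LabPool' `
    -FriendlyName 'HDDTier' -MediaType HDD

# Carve out a tiered virtual disk spanning both tiers
New-VirtualDisk -StoragePoolFriendlyName 'LabPool' -FriendlyName 'TieredVD' `
    -StorageTiers $ssd, $hdd -StorageTierSizes 100GB, 1TB `
    -ResiliencySettingName Simple
```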