Product Activation Thoughts …

As I talked about in my last post, I moved to Windows 7 over the weekend. As I was installing things, I had problems activating several applications. Some of the companies were easy to deal with and fixed things after a quick email, but some issues still have not been resolved.

One application that I use is locked to a hardware key that is generated when it is first installed. The key is transferred to the vendor, and if you reinstall the application it compares the keys. The funny thing is that the keys NEVER match, but after a quick email the vendor sends a new key, usually within an hour.

Three companies allow five activations before they disable your software. Two of the companies, Nero and Diskeeper, reset my account with a simple email. Both took about two days, but the problem was fixed. One company has basically said: too bad, those are the terms of service. I do not recall ever purchasing something that said you could not reinstall on the same hardware if needed. That program was ScanSoft PDF Converter Pro, and ScanSoft has since been purchased by Nuance. I'm not sure whether the acquisition has something to do with them not reactivating it.

I have to say that this will make me think twice about continuing to purchase software from companies that require activation, because one day they may not be there and you will be out whatever application it was. I know this was a large point of contention when we were getting ready to release our 911 dispatching software. My partner wanted to have the software call home to check that the user's service contract was current, and if it was not, he wanted to shut the application down. The problem was that we were selling them a license to run the program; if they didn't want a service contract, they simply would not receive updates. What he wanted was more of a SaaS, software as a service, type plan.

I do believe in getting paid for commercial software, and I have always used software keys to unlock applications that I have developed. I have always tried to embed the department or dispatch center's name in the key, and I use that name on any reports generated from the application. It will not stop piracy, but it will look funny if every report that Town A files says Town B at the top.
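
Purely to illustrate the idea (the scheme and names below are hypothetical, a minimal sketch rather than what actually shipped): if the key is derived from the licensed name plus a vendor-held secret, the application can verify the name/key pair at startup and print that same name on every report.

using System;
using System.Security.Cryptography;
using System.Text;

class LicenseKey
{
    // Hypothetical sketch: derive the unlock key from the licensed name plus a
    // vendor-held secret. A copied key only validates against the name it was
    // issued to, and that name is what gets printed on every report header.
    static string MakeKey(string licensedName, string vendorSecret)
    {
        byte[] secret = Encoding.UTF8.GetBytes(vendorSecret);
        using (HMACSHA1 hmac = new HMACSHA1(secret))
        {
            byte[] digest = hmac.ComputeHash(
                Encoding.UTF8.GetBytes(licensedName.Trim().ToUpperInvariant()));
            // Eight bytes, hex encoded, is short enough to type in by hand.
            return BitConverter.ToString(digest, 0, 8).Replace("-", "");
        }
    }

    static void Main()
    {
        Console.WriteLine(MakeKey("Town B Dispatch Center", "vendor secret"));
    }
}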

I also feel to this day that I would be sitting in a prison somewhere if a dispatch center had tried to dispatch apparatus to a fire and the program would not let them because their service contract had expired. That idea never sat right with me.

Google’s Chrome Finally Ready?

I have been using Chrome almost since the day it was released. I really, really love its clean interface, and I wish more developers would take a page from Google's book. One of the biggest roadblocks to using it full time has been the lack of support from LogMeIn. I started using their service early this year to connect to and troubleshoot our road warriors' notebooks, and it has saved us from having to get notebooks shipped back and forth on several occasions. Well, it seems that LogMeIn finally works with Chrome. I'm not sure when this happened, because I had not checked it in a week or two. I am currently running the development branch of Chrome, but I have not had any problems with it crashing.

I’m going to try and keep this post updated with any problems that I run into and when they get fixed.

Still looking for a good blogging application

I have been using Microsoft Word 2007 to create my posts this year and have never been really happy with it. After digging around and looking at some different applications that do offline post creation, I may have finally found something that I like. I am creating this post using Windows Live Writer. It has one neat feature that I have not seen in any of the other applications that I tested: it pulls the theme information from your blog and displays the post in that theme as you are typing. It's a small touch that makes a big difference.

Major Systems Upgrade

We have been making major changes at work over the last six weeks. We replaced both of our old Dell PowerEdge 4600 servers with three new Dell PowerEdge 2900 servers. I went from half a terabyte of storage to two terabytes. Two of the servers are running VMware ESX 3.5 and the third is my Microsoft SQL Server machine. When we purchased the new servers we did not get tape drives on them, because we wanted to change our backup method at the same time.

The basic server layout is:
Physical Server 1 (PowerEdge 2900)

  • VM 1: Domain Controller
  • VM 2: File and Print Server
  • VM 3: Application Server

Physical Server 2 (PowerEdge 2900)

  • VM 4: Intranet Server
  • VM 5: Domain Controller (Primary)
  • VM 6: Application Server

Physical Server 3 (PowerEdge 2900)

  • ERP and MS SQL Server

Physical Server 4 (PowerEdge 4600)

  • Backup Server

For backups we are using Acronis TrueImage Echo Server to back up VM 2, VM 4, VM 5 and Physical Server 3. We create full images on Sunday, when the network is at its slowest, and push them to the backup server; this process takes about one hour. Monday through Friday we do differential backups on the same four machines and push them to the backup server. The differential backups take only about five minutes total for the four servers. On the SQL server we also do log file backups every 30 minutes; these files are created locally and then copied to the backup server.

On Saturday we use TrueImage's ability to merge the full image from the previous Sunday and the last differential image from Friday into one file, and then we remove all the pieces from the week. At that point we have one current image for each of the four servers. All of the nightly processing on these four servers is handled by a VBScript. (Note: the script has been saved as a text file for security on the web site.) Each night the backup server backs itself up to tape, which includes all the image files from the other servers.

We are not backing up VM 1, because it is a mirror of the other domain controller, and in the event that we had to restore everything, I have never had much luck getting two domain controllers to resync afterward. VM 3 only changes when one of the applications on it is updated, so after any application change we take a snapshot of the VM and burn it to DVD to store off site. VM 6 has one data directory that changes, so I pull that directory each night when the backup server backs up; otherwise it is treated the same as VM 3.
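
The nightly script itself is the VBScript mentioned above; purely as an illustration of the weekly rotation, the branching amounts to something like this sketch (the job methods are hypothetical placeholders, not actual Acronis commands):

using System;

class NightlyBackup
{
    // Illustrative only: the production script is VBScript, and the methods
    // below are hypothetical stand-ins for the Acronis TrueImage jobs.
    static void Main()
    {
        switch (DateTime.Now.DayOfWeek)
        {
            case DayOfWeek.Sunday:
                RunFullImages();       // full images, pushed to the backup server
                break;
            case DayOfWeek.Saturday:
                ConsolidateWeek();     // merge Sunday's full with Friday's
                break;                 // differential, then delete the pieces
            default:
                RunDifferentials();    // Monday through Friday
                break;
        }
    }

    static void RunFullImages()    { /* launch the full-image jobs */ }
    static void RunDifferentials() { /* launch the differential jobs */ }
    static void ConsolidateWeek()  { /* merge into one current image per server */ }
}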

These changes have made a world of difference in our nightly processing. When everything was going to tape, it took almost four hours total for all the servers to back up, and the one time we had a major hardware failure it took sixteen hours to get everything up and running again. We did a test run after everything was in place, and a full restore took under two hours to have things running.

I also finally got an answer to my question about our corporate pain threshold for data loss in a major failure. I have always worked to keep my exposure to less than four hours. Well, it turns out the corporate standard is one WEEK. I was shocked to hear that. I am pretty sure that I will have NO problem meeting that requirement.

DirSync Utility

This is a little utility that I wrote for backing up a source directory into a target directory. It copies all the files from the source directory to the target directory. If the target file already exists, the two files are compared, and if the source file is newer it is copied over the target copy. It does not copy files from the target directory back to the source directory; this could be added if needed, but at this point I do not require that function. There are three switches for the executable: -s for the source directory, -t for the target directory and -r for retention days. After comparing the source and target directories, the program scans the target directory and deletes any file that has not been modified in more than the retention period.

The best way to describe a use for this is our SQL server. We create one full backup each morning and then back up the transaction logs every 15 minutes during the day. The full backup of our ERP is 6 gigs, and the server hosting SQL Server only has 130 gigs of storage, so we cannot keep more than two days of backup files on it before we start to have storage problems. We do have a file server with 800 gigs of storage, so I am using this program to copy the backup files from the SQL server to the file server while keeping only two days on the SQL server. I can then set the retention days to seven, and it will keep seven days' worth of backups on the file server.

The utility is called like:
sync.exe -s"c:\sql backups" -t"\\computer1\c$\temp" -r14
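
If you do not want to dig through the source below, the core behavior boils down to roughly this (a minimal sketch assuming flat directories with no recursion; the downloadable source is the authoritative version):

using System;
using System.IO;

class DirSyncSketch
{
    static void Sync(string source, string target, int retentionDays)
    {
        Directory.CreateDirectory(target);

        // Copy files that are missing from the target, or newer in the source.
        foreach (string srcPath in Directory.GetFiles(source))
        {
            string dstPath = Path.Combine(target, Path.GetFileName(srcPath));
            if (!File.Exists(dstPath) ||
                File.GetLastWriteTimeUtc(srcPath) > File.GetLastWriteTimeUtc(dstPath))
            {
                File.Copy(srcPath, dstPath, true);
            }
        }

        // Delete target files not modified within the retention window.
        DateTime cutoff = DateTime.UtcNow.AddDays(-retentionDays);
        foreach (string dstPath in Directory.GetFiles(target))
        {
            if (File.GetLastWriteTimeUtc(dstPath) < cutoff)
                File.Delete(dstPath);
        }
    }

    static void Main(string[] args)
    {
        Sync(args[0], args[1], int.Parse(args[2]));
    }
}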

The source can be downloaded here and the executable can be downloaded here.

Microsoft Patches

I guess this is a rant about some Microsoft issues that I am starting to have. When Microsoft first started Patch Tuesday it seemed like a good idea; it no longer does. Microsoft has been getting hit with zero-day exploits every Tuesday after they release their patches for the month. The logic behind the once-a-month patch release was to make it easier for companies to test and deploy patches. When we test patches at work, we install them on two production computers and then beat on them for a day before pushing them out to everyone. I think it would be better to get the patches when they are ready and have to test more often, instead of knowing that I have 75 computers with known vulnerabilities. The other reason I keep hearing is that it takes a lot of time and effort to roll out the patches. With all the tools out there, like Numara Deploy, TriActive and Microsoft WSUS, I find it hard to believe that there are still companies with a group of people running around installing patches by hand.

The other part of this that bothers me is how Microsoft will not release out-of-cycle patches unless you are using their Windows Live OneCare. I have a problem with them charging people to get faster updates for problems that they are responsible for.

VMware ESX 3.0

After running VMware Server for about two months at work, we managed to demonstrate the advantages of VMware clearly enough to convince management to let us purchase VMware ESX and install it on one of the primary servers. Since we started running VMware Server we have stopped having the weekly problems with the Domain Controller going down at night and slowing the other production server to a crawl. We installed ESX about a month ago now, and it has been worth every penny. It has allowed us to separate all the services and applications that were running on the two servers. The ESX server is currently hosting four Windows 2003 servers, two Windows XP boxes, one NetWare server and an Ubuntu server. Most of the time when I check on the server, it is only at about 25% utilization.

We purchased the VMware Infrastructure Starter package, which only supports local SCSI drives and NAS. It does not support SANs or iSCSI, but we do not need those options. The only real problem I have found is that the Virtual Infrastructure Client used to manage the server is built on the .NET Framework and only runs under Windows. It is not the best solution, but right now I am running the VI Client in a VM on VMware Workstation on my Ubuntu notebook. There is a web-based console, but it is not as full featured as the VI Client. All in all, I would recommend this to anyone looking to better utilize their resources.

New Project … (PodPicker)

This is a podcast aggregator that will auto-sync with an iPod. To the best of my knowledge, there is nothing designed for Linux that works as smoothly as iTunes, which is the only application still making me dual boot. And yes, I know there are ways of doing this, but all the solutions that I have seen require several applications and way too much manual intervention.

See the project page for PodPicker

Synchronize Utility (Updated 11/27/2006)

This is a little utility that I wrote for synchronizing two directories. At work our MS SQL Server backs up the transaction logs every 15 minutes. What I needed was a way to sync the directory where the logs are stored to a second server for safety. All the utilities that I found online were more than I needed, so I wrote something quick to solve my problem. It can be called from the command prompt, so it is easy to schedule. You pass it the source and target directories and it does the rest. It copies any new files from the source directory to the target directory, and if a file is in the target directory but not in the source directory, it is deleted from the target directory.

The utility is called like:
sync.exe "c:\sql backups","\\computer1\c$\temp"
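
Internally it amounts to roughly the following sketch (not the shipped code). The notable difference from DirSync above is the delete rule: files missing from the source are removed outright instead of being aged out by a retention window.

using System;
using System.IO;

class SyncSketch
{
    // Mirror the source directory into the target: copy anything new,
    // delete anything in the target that no longer exists in the source.
    static void Main(string[] args)
    {
        string source = args[0], target = args[1];

        foreach (string srcPath in Directory.GetFiles(source))
        {
            string dstPath = Path.Combine(target, Path.GetFileName(srcPath));
            if (!File.Exists(dstPath))
                File.Copy(srcPath, dstPath);
        }

        foreach (string dstPath in Directory.GetFiles(target))
        {
            if (!File.Exists(Path.Combine(source, Path.GetFileName(dstPath))))
                File.Delete(dstPath);
        }
    }
}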

You can download this here. (Removed from server)

Update: I have been using this on our server at work for two weeks, and it is working great. I have been able to tweak my backup plans and save some time at night.
Update (11/27/2006): I found that this was changing some settings in the files, which MS SQL did not like when trying to restore. I have written a new utility called DirSync in Mono that gets around these problems.