The hard-to-catch bugs are in your thinking, not in the code.

The copy algorithm calls get_backup_list() to get the list of source files that have to be copied each time it runs. It does this by looking at the last copied file in the destination and continuing from that point in the source list. Obviously, this does not work if the user has modified files that have already been copied. As the backup runs, the user may also modify files that were just backed up. So it is always a race between the backup completing and the user modifying what was completed.

To illustrate, suppose there are files A, B, and C in folder x, which needs to be copied to folder y. Say the backup has copied A and B and is now copying C. If the user modifies A or B before C is copied, then what is backed up will not reflect the true state of the source. Yet if, as per get_backup_list(), we resume from the last file copied into the destination, we would skip the A or B that was modified after C and continue on to the files beyond C that were never copied.

So instead we should find all source files modified after the last modified time in the destination for the corresponding source folder, and then copy all of those files.

However, there is one issue here. What if the backup takes 5 minutes and some source files are modified during that time? When the next backup runs, we use the time of the last file copied into the destination to compute which source files need to be backed up. Wouldn't this leave out the source files that were modified within that 5-minute window?

This is getting trickier than we thought. No matter what point in time you pick as the baseline for computing the incremental set of source files to back up, some file could have changed just before it.

We should probably compute the incremental set of source files based on file size instead. Comparing sizes gives us a firmer guarantee than comparing last modified times.

We will modify the code as follows:

backup(source folder array, destination folder)

So we have moved completely away from using last modified timestamps to find the set of files to back up, and now use only file size, because there is always a window in which we can skip files that were modified just before whatever time we fix as the last backup time.

get_backup_list() returns the files in the source folder whose size differs from their counterpart in the destination. If it finds a folder, it simply adds it to the list and does nothing more, since copy() is recursive and will descend into that folder as part of its depth-first traversal.

Software is a double-edged sword. Because it is rewritable, we find it easy to change things; at the same time, this opens a never-ending road toward perfection. The above change was a slightly tricky one. We could have lived with the last version in post-9. It might have worked for most cases, and we may not have caught the bug with any number of tests.

Testing alone cannot catch the bugs. Sometimes questioning your own thinking, and asking questions about your algorithms that you have not asked before, will reveal some really fundamental flaws.

Writing code is not a one-time affair. When you are shopping in a mall, or waiting for your turn at the doctor's, you may spot a flaw in your thinking and want to rush back to your desk to fix it. But hey, fortunately all of us move on to write the next piece of code, leaving the bugs behind for the poor hapless soul who inherits yours.

But isn’t that true with life as well?

I am building Trec. A new way to read the Web. If you are interested, send a mail to You will be notified.