A lot depends on how many sites you are looking at. Web spiders cover huge areas of the net and need a lot of resources. If it's only a few sites, say ten, it would be easier and cheaper to do it manually. Depending on the information you are trying to collate, you may also find that some sites block automated access, so you would need human input to see the information.
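One quick way to tell whether a site objects to automated access is its robots.txt file. As a minimal sketch (the robots.txt content and `example.com` URLs here are made up for illustration), Python's standard library can parse those rules before you decide whether scraping is even an option:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real site serves this at
# https://<site>/robots.txt and may be far more restrictive.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask whether an automated crawler may fetch specific paths.
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
print(parser.can_fetch("*", "https://example.com/private/data.html"))  # False
```

If the paths you need are disallowed, that is a strong hint the site expects human visitors (or paying API customers), and a manual approach or an official data feed is the safer route.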
I wonder if companies like Google or others would supply the information you require as a subscription service; that may be a lot cheaper than employing a programmer and hosting servers.