The crawler itself is not that complicated. You simply load the page, then evaluate and follow the links you find.
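The fetch-and-follow loop can be sketched with nothing but the standard library. This is a minimal sketch, not production code: the URL, the missing error handling, and the lack of a visited-set or depth limit are all left to the reader.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered in the page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(self.base_url, value))


def fetch_links(url):
    """Load one page and return the absolute URLs it links to."""
    with urlopen(url) as resp:  # no retries or error handling here; add both in practice
        html = resp.read().decode("utf-8", errors="replace")
    parser = LinkExtractor(url)
    parser.feed(html)
    return parser.links
```

From there, crawling is just a loop: pop a URL from a queue, call `fetch_links`, and push any new links you care about back onto the queue.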
What you can do to be "friendly" is to create a dedicated crawler for each site you plan to scrape. In other words, pick one site and see how it is structured, then code your HTTP requests and HTML parsing around that structure. Rinse and repeat for the other sites.
If they use common shopping-cart software (anything is possible here), then obviously you get a little reuse.
When scraping, you may want to hit your sites during off-peak hours (this will take some guesswork). Also, do not run at 500 queries per second. Back off a bit.
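"Backing off" is easiest to enforce with a small throttle that guarantees a minimum gap between requests. A sketch, with the two-second interval chosen purely as an illustration:

```python
import time


class Throttle:
    """Enforces a minimum delay between successive requests."""

    def __init__(self, min_interval_seconds):
        self.min_interval = min_interval_seconds
        self._last_request = 0.0

    def wait(self):
        # Sleep only for however much of the interval has not yet elapsed.
        elapsed = time.monotonic() - self._last_request
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last_request = time.monotonic()


# throttle = Throttle(2.0)   # at most one request every 2 seconds
# throttle.wait()            # call before each fetch
```

If you crawl several sites at once, keep one `Throttle` per host so that being polite to one site does not slow down the others.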
One additional thing you might consider is contacting these other sites to see if they want to participate in a direct data exchange. Ideally, everyone would publish an RSS feed of their products.
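If the sites did publish product feeds, consuming them would replace the HTML parsing entirely. A sketch that reads the universal RSS 2.0 fields; a price would live in some site-specific extension element, so it is deliberately not assumed here:

```python
import xml.etree.ElementTree as ET


def parse_product_feed(xml_text):
    """Extract (title, link) pairs from a standard RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    products = []
    for item in root.iter("item"):
        # title and link are the only fields RSS 2.0 guarantees enough
        # structure for; anything like a price is vendor-specific.
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        products.append((title, link))
    return products
```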
Of course, depending on whom you sell to, this could be considered price fixing... so proceed with caution.