It's not actually internet traffic that causes the slowdown, it's the script processing time. Whenever a user or a spider visits the website, a certain amount of server memory is allocated to that request for resources like XML reader objects or Scripting.Dictionary objects, so the more users and spiders that connect, the more memory is reserved. For normal usage we have a server that is more than capable of handling all the requests, but when we have 50+ normal people, plus 100+ connections from Google, another 50+ from Bing and another 100+ from Yahoo all hitting the site at the same time, it's like having 500 people on the site all requesting the same page at once.
There are a few other things we can do, like stopping some of the unused objects from being created on page load to reduce the memory reserved per request, along the lines of the sketch below. I'll take a look at it over the next few days, but time is really short for me at the moment, with lots of family issues and work-related stuff to deal with.
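As a rough illustration of what "don't create unused objects on page load" could look like, here is a minimal sketch assuming a Classic ASP / VBScript page (which the Scripting.Dictionary reference suggests); the GetXmlDoc/GetDict helper names and the MSXML ProgID are illustrative, not the actual page code:

    ' Declare the variables but leave them empty until a page actually needs them,
    ' instead of calling Server.CreateObject for every request.
    Dim xmlDoc, dict

    Function GetXmlDoc()
        ' Create the XML parser only on first use (ProgID assumed for illustration)
        If IsEmpty(xmlDoc) Then
            Set xmlDoc = Server.CreateObject("MSXML2.DOMDocument.6.0")
        End If
        Set GetXmlDoc = xmlDoc
    End Function

    Function GetDict()
        ' Create the dictionary only on first use
        If IsEmpty(dict) Then
            Set dict = Server.CreateObject("Scripting.Dictionary")
        End If
        Set GetDict = dict
    End Function

Pages that never touch XML or the dictionary then allocate nothing for them, while pages that do still get the same objects as before, so the memory reserved per connection only reflects what that request actually uses.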