The problems with cache servers



Networkworld brings us this report that exploit code removed from websites can live on for quite a while in caching servers. Which, in a way, is NOT news, but it’s worth remembering. Often when someone visits a website, they’re really visiting a caching proxy server that has previously grabbed a copy of the data from the original website. Many networks use cache servers to improve network performance. (i.e. if we have 20 people an hour hitting cnn.com, why shouldn’t we just download the page once and serve it to all of them?)
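As a rough illustration, here’s a minimal Python sketch that checks whether a response came through a cache by looking at its headers. The Age header reports how many seconds a response has been sitting in a cache, and many proxies add an X-Cache header like "HIT" or "MISS". The URL is just a placeholder.

```python
import urllib.request

# Placeholder URL; substitute any page you want to test.
URL = "http://www.example.com/"

with urllib.request.urlopen(URL) as response:
    # The HTTP Age header reports how long, in seconds, the response
    # has sat in a cache; a missing or zero Age usually means it came
    # straight from the origin server.
    age = response.headers.get("Age")
    # Many caching proxies also stamp responses with an X-Cache header
    # such as "HIT from proxy.example.net" or "MISS".
    x_cache = response.headers.get("X-Cache")
    print("Age:", age)
    print("X-Cache:", x_cache)
```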


One lesson of this is that even after malicious code is cleaned off a site, it may still be infecting people through the cache servers their network forces them to use. (I seem to recall AOL uses caching servers, for instance.) On small office networks it’s possible to run virus scanning on a web proxy, but larger networks often neglect this due to the processor load.
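If you suspect a proxy between you and a site is still serving a stale (possibly still-infected) copy, one crude check is to fetch the page twice and compare: once normally, and once with a Cache-Control: no-cache request header, which asks intermediaries to revalidate with the origin server rather than answer from their store. A minimal sketch, again with a placeholder URL:

```python
import urllib.request

URL = "http://www.example.com/"  # placeholder URL

def fetch(url, bypass_cache=False):
    request = urllib.request.Request(url)
    if bypass_cache:
        # Ask intermediaries to revalidate with the origin server
        # instead of answering from their cached copy.
        request.add_header("Cache-Control", "no-cache")
        request.add_header("Pragma", "no-cache")  # for older HTTP/1.0 caches
    with urllib.request.urlopen(request) as response:
        return response.read()

cached_copy = fetch(URL)
fresh_copy = fetch(URL, bypass_cache=True)

# If the two differ, a cache in the path may still be serving an
# older version of the page than what the origin server now has.
print("Copies match" if cached_copy == fresh_copy else "Copies differ")
```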
