I'm sitting on a large University network, over which all HTTP access must be routed through a proxy server. The proxy server performs load balancing, content filtering, virus scan injection on certain content types and the like, and connections cannot be made to remote port 80 through the network without using it.

This isn't a problem as long as you configure your web browser (and any other web client) to make HTTP requests through the proxy.
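For browsers this is a settings-dialog job, but command-line clients usually pick the proxy up from environment variables instead. A minimal sketch, assuming a squid-style proxy listening on port 3128 (the hostname here is hypothetical, not the University's actual proxy):

```shell
# Point HTTP clients at the proxy. Hostname and port are placeholders --
# substitute your own network's values.
export http_proxy="http://wwwcache.example.ac.uk:3128"
export https_proxy="$http_proxy"

# Tools such as curl and wget honour these variables automatically, e.g.:
#   curl -I http://example.com/
```

Once these are set, requests from those tools are tunnelled through the proxy just as the browser's are.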

Now, recently I noticed that my Firefox 3.0.6 had developed an irritating tendency to hang briefly when connecting to certain websites. I couldn't discern a pattern in which sites were affected, but the problem was consistent. The little hangs were hindering my workflow by introducing multi-second wait times during general use of the web.

However, I eventually developed a theory which — directly or indirectly — might lead to a nice solution.

My theory is based on the notion of the University's HTTP proxy being horrendously overloaded, which is already clearly evident in its general performance. Dropped connections and temporary unavailability of sites are fairly common, especially in the early evening when Nottingham's 10,000 students on campus get home from Uni and start scouring the net before dinner.

It's not a stretch to suggest that any DNS lookups performed by the same server might suffer from similar latency. Where do DNS lookups come into it? Observe.