To partially overcome these load limits and to prevent overload, most popular Web sites use common techniques such as:
* managing network traffic, by using:
o Firewalls, to block unwanted traffic coming from bad IP sources or matching bad patterns;
o HTTP traffic managers, to drop, redirect or rewrite requests having bad HTTP patterns;
o Bandwidth management and traffic shaping, in order to smooth out peaks in network usage;
* deploying Web cache techniques;
* using different domain names to serve different (static and dynamic) content by separate Web servers, e.g.:
o http://images.example.com
o http://www.example.com
* using different domain names and/or computers to separate big files from small and medium-sized files; the idea is to be able to fully cache small and medium-sized files and to efficiently serve big or huge files (over 10 to 1000 MB) by using different settings;
* using many Web servers (programs) per computer, each one bound to its own network card and IP address;
* using many Web servers (computers) that are grouped together so that they act as, or are seen as, one big Web server (see also: Load balancer);
* adding more hardware resources (e.g. RAM, disks) to each computer;
* tuning OS parameters for hardware capabilities and usage;
* using more efficient Web server programs, etc.;
* using other workarounds, especially if dynamic content is involved.
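The bandwidth-management and traffic-shaping technique above is often implemented with a token-bucket rate limiter. The sketch below is a minimal, hypothetical illustration (the class name, rate and capacity values are assumptions, not taken from any particular server):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: smooths traffic peaks by allowing
    bursts up to `capacity` while enforcing an average `rate` (bytes/s)."""

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens (bytes) refilled per second
        self.capacity = capacity    # maximum burst size in bytes
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self, nbytes):
        """Return True if `nbytes` may be sent now, consuming tokens."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False

bucket = TokenBucket(rate=1000, capacity=2000)
print(bucket.allow(1500))   # first burst fits in the full bucket
print(bucket.allow(1500))   # bucket now nearly empty, request must wait
```

A real traffic shaper would queue or delay the rejected send until enough tokens accumulate, rather than simply refusing it.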
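The "separate big files from small and medium-sized files" idea can be sketched as a cache that only admits files below a size threshold, so big files are always streamed with their own settings and never evict many small ones. This is a hypothetical sketch; the 512 KB limit and function names are assumptions for illustration:

```python
# Hypothetical threshold: files up to 512 KB are cached whole in RAM;
# larger files bypass the cache and are served with different settings.
SMALL_FILE_LIMIT = 512 * 1024

_cache = {}

def serve(path, read_file):
    """Return the file's bytes, fully caching small/medium files.

    `read_file` stands in for whatever actually fetches the file
    from disk; big files are re-read (streamed) on every request.
    """
    if path in _cache:
        return _cache[path]           # cache hit: no disk access
    data = read_file(path)
    if len(data) <= SMALL_FILE_LIMIT:
        _cache[path] = data           # admit only small/medium files
    return data
```

In practice the small-file cache would also have a total size bound and an eviction policy (e.g. LRU), which are omitted here for brevity.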
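Grouping many Web servers so that they are seen as one big Web server relies on a load balancer distributing requests across a back-end pool; round-robin is the simplest policy. The addresses below are invented placeholders:

```python
import itertools

# Hypothetical pool of back-end Web servers behind one public address.
BACKENDS = ["10.0.0.1:80", "10.0.0.2:80", "10.0.0.3:80"]

_rr = itertools.cycle(BACKENDS)

def pick_backend():
    """Return the next back-end in round-robin order, so requests
    are spread evenly and the group acts as one big Web server."""
    return next(_rr)

print([pick_backend() for _ in range(4)])
# ['10.0.0.1:80', '10.0.0.2:80', '10.0.0.3:80', '10.0.0.1:80']
```

Production load balancers add health checks (skipping dead back-ends) and often weight servers by capacity, but the cycling idea is the same.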