A dynamic website is a site that consists of one or more dynamic web pages. A dynamic web page is parsed anew on each request to the server. When you visit a website, your browser sends the server hosting it a request for the desired page; the server reads that page from the hard drive and returns it. The difference between a static page and a dynamic page is what happens in between: a static page is sent to the user as-is, while a dynamic page is first handed to a parser, which identifies special tags in the page and fills them in appropriately. The parser then returns the parsed page to the server, which in turn sends it to the user.
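The parsing step described above can be sketched in a few lines. This is a hypothetical illustration, not E2's actual parser: the tag name `<!--TIME-->` and the function `parse_page` are made up for the example.

```python
import re
from datetime import datetime

def parse_page(static_html: str) -> str:
    """Hypothetical parser: find special tags in an otherwise static
    page and replace them with fresh values computed at request time."""
    # Replace every <!--TIME--> tag with the current server time.
    return re.sub(r"<!--TIME-->",
                  lambda m: datetime.now().strftime("%H:%M %a %b %d %Y"),
                  static_html)

page = "<p>Server time: <!--TIME--></p>"
print(parse_page(page))
```

Every request runs the substitution again, so each visitor sees a timestamp computed at the moment their page was built.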

A very simple example of this would be time.
For instance, when you access E2, you can see the 'Server time:' in your Epicenter nodelet. This time is constantly updated, but instead of someone manually editing the file every single minute, a special dynamic tag is placed in the otherwise static page; the parser later replaces that tag with an exact timestamp at the moment the page is requested. In PHP the tag would look like this: <?php echo date('H:i D F j Y'); ?> and would output something like 11:21 Tue May 8 2001.

But while showing the current time is about the simplest task possible and requires almost no resources, querying the database for information takes far more processing power, and pages like the main page of E2 query the database many times. This is where the problem lies: while a server can easily parse a page dozens of times a second for dozens of different users, it does have its limits, and when that limit is reached, users have to wait for the server to catch up. This happened quite often on E2 recently and was the main reason for the lag users experienced.

There are ways, however, to ease the load and help the server cope with the ever-growing number of requests. Among these are upgrading to a newer, faster server, re-coding certain scripts to make them more efficient, properly indexing the database that holds the information, and numerous others. One of the most effective methods in use is caching. Take an online news source, for example: the most requested page on such a site will always be the front page, and parsing it for every request is a waste of resources. Instead, it could be re-parsed once a minute, or only after the information in it changes, freeing the server to deal with other requests.
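The once-a-minute caching idea can be sketched as follows. This is a minimal illustration, assuming a single-process server; the names `render_front_page`, `get_front_page`, and the 60-second TTL are all invented for the example.

```python
import time

CACHE_TTL = 60  # seconds; re-parse the page at most once a minute
_cache = {"html": None, "built_at": 0.0}

def render_front_page() -> str:
    """Stand-in for the expensive part: template parsing plus many
    database queries."""
    return "<html>...front page...</html>"

def get_front_page() -> str:
    """Serve a cached copy unless it is older than CACHE_TTL."""
    now = time.time()
    if _cache["html"] is None or now - _cache["built_at"] > CACHE_TTL:
        _cache["html"] = render_front_page()   # expensive, runs rarely
        _cache["built_at"] = now
    return _cache["html"]                      # cheap, runs on every request

# Two back-to-back requests within the TTL reuse the same cached string.
a = get_front_page()
b = get_front_page()
print(a is b)
```

Only the first request inside each interval pays the rendering cost; every other request is a dictionary lookup, which is what frees the server to handle the rest of the traffic.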
