Once there was only one choice: you had to generate your HTML files on the server. This has always been a rather clunky process where the server does all the work and holds all the logic for inserting data into the UI, typically by splicing dynamic content into an XML tree. Not only does this make caching difficult, you also lose a lot of flexibility, since every action needs a round trip to the server.
These days this isn’t necessary. The server can send data and the client can take care of the UI. Still, basically everyone generates HTML pages on the server like it was 1999. I have a few guesses why:
- Servers are cheap, and when you have two you could just as well deploy four or ten. There is no real need for the web to be efficient just to save money; it won't be noticeable compared to what the web developers cost anyway.
- Google punishes pages with bad/slow code, and since most programmers are sloppy, it’s better to be safe than sorry (pure HTML is always faster than JS).
- Your company prefers a framework or language that isn’t JS or that isn’t supported in the browser. This forces you to output plain HTML.
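The data-over-HTML approach mentioned above can be sketched in a few lines of JS. This is a minimal illustration, not a framework recommendation; the endpoint, element id, and data shape are all hypothetical:

```javascript
// Minimal sketch of client-side rendering: the server sends plain data
// (JSON), and the client turns it into markup.
function renderUserList(users) {
  // Build an HTML fragment from data — no server-side templating involved.
  return '<ul>' +
    users.map(u => `<li>${u.name} (${u.email})</li>`).join('') +
    '</ul>';
}

// In the browser, the data would come from an API endpoint, e.g.:
// fetch('/api/users')
//   .then(res => res.json())
//   .then(users => {
//     document.querySelector('#app').innerHTML = renderUserList(users);
//   });

console.log(renderUserList([{ name: 'Ada', email: 'ada@example.com' }]));
// → <ul><li>Ada (ada@example.com)</li></ul>
```

The point is that the response from the server is cacheable, compact data, and the rendering logic lives once on the client instead of running on every request.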
Feel free to send me your answer/opinion: @olofT
For my web services the answer is 2: I don't want to be punished in page ranks for not serving plain HTML. Not sure I would be, but just in case...

