We're all accustomed to the speed at which a browser renders a web document. We're also accustomed to the fact that, before the DOM is ready to be used, the browser must download all of a document's dependencies. But what happens behind the scenes? Here's an excerpt from the documentation of Firefox's parser component:
Once the tokenization process is complete, the parse-engine needs to emit its content (tokens). Since the parser doesn't know anything about the document model, the containing application must provide a "content-sink". The sink is a simple API that accepts a container, leaf and text nodes, and constructs the underlying document model accordingly. The DTD interacts with the sink to cause the proper content-model to be constructed based on the input set of tokens.
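To get a feel for what such an API might look like, here's a minimal sketch of a content-sink-style interface in TypeScript. The names (openContainer, closeContainer, addLeaf, addText) and the toy tree builder are illustrative assumptions of mine, not Mozilla's actual content sink, which lives in C++ inside the parser module.

```typescript
// A minimal, hypothetical content-sink interface: the parser emits
// tokens, and the sink turns them into a tree, so the parser never
// needs to know what the document model looks like.

interface SinkNode {
  name: string;            // tag name, or "#text" for text nodes
  text?: string;           // only meaningful for text nodes
  children: SinkNode[];
}

interface ContentSink {
  openContainer(tagName: string): void;  // e.g. <div>, <ul>
  closeContainer(): void;                // matching close tag
  addLeaf(tagName: string): void;        // e.g. <img>, <br>
  addText(data: string): void;           // character data
  root(): SinkNode;                      // the (partial) tree built so far
}

// A toy implementation that builds a plain object tree.
class TreeSink implements ContentSink {
  private rootNode: SinkNode = { name: "#document", children: [] };
  private stack: SinkNode[] = [this.rootNode];

  openContainer(tagName: string): void {
    const node: SinkNode = { name: tagName, children: [] };
    this.stack[this.stack.length - 1].children.push(node);
    this.stack.push(node);
  }

  closeContainer(): void {
    if (this.stack.length > 1) this.stack.pop();
  }

  addLeaf(tagName: string): void {
    this.stack[this.stack.length - 1].children.push({ name: tagName, children: [] });
  }

  addText(data: string): void {
    this.stack[this.stack.length - 1].children.push({ name: "#text", text: data, children: [] });
  }

  root(): SinkNode {
    return this.rootNode;
  }
}
```

The point to notice is that the parser only ever calls the four sink methods: everything it "knows" about the document model is hidden behind them, and the sink can hand back a usable partial tree at any moment, which is what lets a browser start laying out a page before parsing has finished.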
In other words, the content sink is a rough pre-model of the future, fully constructed DOM. How this API is built actually matters for a browser's effective rendering performance: if a browser can't quickly form an idea of what the final DOM tree will look like, you're likely to see rendering inconsistencies, especially when content is generated via JavaScript. For example, if you view this blog in different browsers, you'll probably notice a delay in the rendering of my Flickr photo stream (at the bottom of the page). Safari and Chrome render the photos almost immediately, while Internet Explorer and Firefox need more time. Part of the difference comes from how these browsers handle multiple HTTP requests and DNS lookups, but it's also worth noticing that it reflects the different ways they construct their content sinks.
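The photo stream is exactly the kind of script-generated content a sink can't anticipate. A simplified, hypothetical version of such a badge script might look like this (the URLs and element id are placeholders, not the actual Flickr badge code):

```typescript
// The markup for the photo stream doesn't exist in the HTML source at all;
// it's appended to the document only after this script runs, so the browser's
// early picture of the DOM tree can't account for it.
function renderPhotoStream(container: HTMLElement, photoUrls: string[]): void {
  for (const url of photoUrls) {
    const img = document.createElement("img");
    img.src = url;        // each image also triggers a new HTTP request
    img.width = 75;
    img.height = 75;
    container.appendChild(img);
  }
}

// Placeholder usage: a container element and some photo URLs.
const target = document.getElementById("photo-badge");
if (target) {
  renderPhotoStream(target, [
    "https://example.com/photos/1.jpg",
    "https://example.com/photos/2.jpg",
  ]);
}
```

How quickly those late-appended images show up depends on how gracefully the browser's partially built model absorbs nodes it didn't see coming.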