Whenever you write a blog post, chances are your blogging platform automatically pings a dedicated server. This way, a blog search engine (like Technorati or Google Blog Search) can discover new content to index.
If you create a standard web page and upload it to a server, no search engine will find it unless a link to it exists somewhere or you submit it manually. The Sitemaps protocol, developed by Google and now supported by the three major search engines (Google, Yahoo and MSN), aims to overcome this problem.
"A Sitemap is an XML file that can be made available on a website and acts as a marker for search engines to crawl certain pages. It is an easy way for webmasters to make their sites more search engine friendly. It does this by conveniently allowing webmasters to list all of their URLs along with optional metadata, such as the last time the page changed, to improve how search engines crawl and index their websites.
Sitemaps enhance the current model of Web crawling by allowing webmasters to list all their Web pages to improve comprehensiveness, notify search engines of changes or new pages to help freshness, and identify unchanged pages to prevent unnecessary crawling and save bandwidth."
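To make the quote above concrete, here is what a minimal Sitemap file looks like: an XML document with a `<urlset>` root in the Sitemaps namespace, one `<url>` entry per page, and optional metadata like `<lastmod>`. The URLs and dates below are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-18</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2006-11-10</lastmod>
  </url>
</urlset>
```

You save this file (typically as `sitemap.xml` in the site root) and point the search engines at it.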
There are many tools that can help you generate a sitemap for your site, including a Python script created by Google.
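If you'd rather not depend on an external tool, rolling your own generator is straightforward. The sketch below (my own example, not Google's script) builds a Sitemap string from a list of URL/last-modified pairs:

```python
from datetime import date
from xml.sax.saxutils import escape


def build_sitemap(pages):
    """Build a minimal Sitemap XML string from (url, last_modified) pairs."""
    entries = []
    for loc, lastmod in pages:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(loc)}</loc>\n"           # URLs must be entity-escaped
            f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"  # W3C date format
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )


# Example usage with placeholder URLs:
sitemap = build_sitemap([
    ("http://www.example.com/", date(2006, 11, 18)),
    ("http://www.example.com/about.html", date(2006, 11, 10)),
])
print(sitemap)
```

A real generator would walk your site's file tree or database to collect the page list, but the output format stays this simple.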
The fact that Yahoo and Microsoft support this protocol is a big step toward wider adoption.
Here are the announcements from Google, Yahoo and Microsoft.