
> example.com/sitemap.xml should point search engines to interesting parts of your site.

Really, an XML sitemap should show search engines ALL pages on your site, and it should only be used when it can be dynamically updated by a CMS or database that knows every URL. Don't even get me started on the idiocy of using a random CRAWLER (e.g. https://www.google.com/search?q=xml%20sitemap%20generator) to generate a sitemap to be consumed by the best CRAWLER on the internet.
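To make that concrete, here's a minimal sketch of generating the sitemap straight from your own data layer, so it always covers every URL. `fetch_all_urls()` is a hypothetical stand-in for a query against your CMS or database:

```python
# Minimal sketch: emit sitemap.xml from the full set of URLs your
# CMS/database already knows about, rather than crawling your own site.
import xml.etree.ElementTree as ET
from io import BytesIO

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def fetch_all_urls():
    # Hypothetical stand-in: replace with a query against your CMS/database.
    return [
        ("https://example.com/", "2024-01-01"),
        ("https://example.com/about", "2024-01-02"),
    ]

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    buf = BytesIO()
    ET.ElementTree(urlset).write(buf, encoding="utf-8", xml_declaration=True)
    return buf.getvalue().decode("utf-8")

print(build_sitemap(fetch_all_urls()))
```

Regenerate (or serve) this on every content change and the sitemap can never drift out of sync with the site, which is the whole point of having one.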

I know the tone of the post is informal, but this wouldn't be a HN comment thread without some nit-picking.



Yeah, I think if it makes sense for the site, you should focus more on making it fully crawlable. Generally, humans should be able to find all of your content, too.



