From UA Libraries Digital Services Planning and Documentation
Revision as of 14:13, 24 March 2015 by Jlderidder

Sitemaps are a way of telling web search engine crawlers where to find the content on your site that you want them to index. After all, crawlers have no way to discover the URLs that your database or delivery system creates on the fly to provide access to online materials. Without help, such database content is a black hole on the web, and it will not be reflected in search engine results such as Google's.

A single sitemap cannot contain more than 50,000 URLs and must be no larger than 50 MB uncompressed. If you have multiple sitemaps, you need a sitemap index file that lists them all; in that case, the index file (not the individual sitemaps) is what you submit to the search engine for indexing.
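A sitemap index file follows the structure below, per the sitemaps.org protocol. The hostname and file names here are placeholders, not our actual URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per sitemap file; <lastmod> is optional -->
  <sitemap>
    <loc>http://example.edu/sitemaps/sitemap1.xml</loc>
    <lastmod>2015-03-24</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://example.edu/sitemaps/sitemap2.xml</loc>
    <lastmod>2015-03-24</lastmod>
  </sitemap>
</sitemapindex>
```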

Our sitemaps are automatically regenerated once a month, using the file date for the <lastmod> value. All of our entries list a <changefreq> of "yearly", since the next most frequent option is "monthly" and our materials are rarely updated that often. The <priority> value is highest for finding aids and lowest for mass-digitized content (which has little metadata to index).
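Each entry in one of our sitemaps therefore looks something like the following sketch. The hostname, item URL, and the specific priority value are illustrative placeholders, but the <lastmod>, <changefreq>, and <priority> elements are used as described above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- hypothetical finding aid URL -->
    <loc>http://example.edu/findingaid/u0001_0000001</loc>
    <!-- file date of the underlying file -->
    <lastmod>2015-02-14</lastmod>
    <!-- "yearly": our materials rarely change monthly -->
    <changefreq>yearly</changefreq>
    <!-- higher for finding aids, lower for mass-digitized content -->
    <priority>0.8</priority>
  </url>
</urlset>
```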

Our sitemaps are located in /srv/www/htdocs/acumen/sitemaps and /srv/www/htdocs/sitemaps/, with corresponding sitemapIndex files in the directory just above these locations (both visible via the web).

One is for Acumen, and the other for libcontent, but they contain the same links.

To submit a new sitemap to Google, or to check our indexing progress, log in with web services credentials to Google Webmaster Tools.