Sitemaps and Robots

Implementing fully automated sitemap generation.
To be found by search engines in a defined and reliable way, two SEO files should be in place and correct: /robots.txt and the sitemap.
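The robots.txt file itself can stay a simple static file that is copied through to the site output. A minimal sketch, using the domain from this site's sitemap (change it to your environment when you reuse it):

```txt
# Allow all crawlers and point them to the sitemap
User-agent: *
Allow: /

Sitemap: https://www.mathertel.de/sitemap.xml
```

The Sitemap directive makes the sitemap discoverable even when it was not submitted to a search engine manually.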
A good article on creating a sitemap.xml file automatically can be found at
https://daily-dev-tips.com/posts/adding-a-sitemap-in-eleventy/.
The full output should look like this:
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  ...
  <url>
    <loc>https://www.mathertel.de/blog/2025/0227-11ty-overview.htm</loc>
    <lastmod>2025-01-23T13:48:49.019Z</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>https://www.mathertel.de/blog/2025/0228-sfc-concept.htm</loc>
    <lastmod>2025-01-27T19:49:06.829Z</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  ...
</urlset>
Like the HTML generated by Eleventy, the XML output can also be produced using a Nunjucks template:
---
permalink: /sitemap.xml
eleventyExcludeFromCollections: true
layout: false
---
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{% for page in collections.posts %}{% if not page.data.draft -%}
  <url>
    <loc>https://www.mathertel.de{{ page.url | url }}</loc>
    <lastmod>{{ page.date.toISOString() }}</lastmod>
    <changefreq>{{ page.data.changeFreq if page.data.changeFreq else "monthly" }}</changefreq>
  </url>
{% endif %}{% endfor %}
</urlset>
The <loc> element in each <url> entry must contain complete, absolute URLs including protocol and domain name. The domain name is therefore hard-coded in this template and must be changed to your environment when you reuse this Nunjucks template.
Note that URLs are case-sensitive.
If the change frequency differs for specific pages, you can specify it in the page metadata.
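For example, a page that is updated often could set the changeFreq key in its front matter, which the template above reads via page.data.changeFreq and falls back to "monthly" when it is absent (the title value here is only illustrative):

```yaml
---
title: News Overview
changeFreq: weekly
---
```

Valid values for the sitemap changefreq element are always, hourly, daily, weekly, monthly, yearly, and never.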