Why Flatten a Sitemap Into a Plain URL List?
XML is built for crawlers, not humans. When you need to feed URLs into a spreadsheet, a bulk HTTP checker, an AI prompt, a migration redirect map, or a robots.txt allow list, a plain newline-delimited list is vastly easier than parsing XML by hand. This tool handles nested sitemap-index files, strips the XML boilerplate, and gives you a clean list you can copy into any other workflow in seconds.
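The flattening step itself is simple once the XML namespace is handled. As a minimal sketch (this is an illustrative helper, not the tool's actual code), every page URL in a standard sitemap lives in a `<loc>` element under the sitemaps.org namespace:

```python
import xml.etree.ElementTree as ET

# All standard sitemap elements live under this namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_locs(xml_text: str) -> list[str]:
    """Return every <loc> value in a <urlset> or <sitemapindex> document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc") if loc.text]

urlset = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print("\n".join(extract_locs(urlset)))
```

Everything else in the file (lastmod, changefreq, priority, the XML declaration) is the boilerplate the tool strips for you.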
Common Use Cases
- Audit every public URL a site exposes before a redesign or migration.
- Generate a source list for a bulk HTTP status / redirect check.
- Compare two sites by diffing their URL lists.
- Feed a content inventory into a spreadsheet or AI model.
- Spot unexpected routes, test pages, or orphaned sections in a competitor’s sitemap.
- Build a robots.txt or CDN allow/deny list from a known-good URL inventory.
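The "compare two sites" case, for example, reduces to a set difference once each sitemap is a newline-delimited list. A small sketch (the URLs are made up):

```python
# Two flattened URL lists, e.g. one exported before a migration and one after.
old_list = """https://example.com/
https://example.com/pricing
https://example.com/old-page"""

new_list = """https://example.com/
https://example.com/pricing
https://example.com/new-page"""

old_urls = set(old_list.splitlines())
new_urls = set(new_list.splitlines())

removed = sorted(old_urls - new_urls)  # candidates for redirects
added = sorted(new_urls - old_urls)    # new routes to verify
```

`removed` is exactly the set of URLs that need a redirect entry before launch.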
Fetch URL vs Paste XML
Use "Fetch from URL" when the sitemap is publicly accessible — our server fetches it for you, which also avoids browser CORS limits and lets us recurse through nested sitemap-index files automatically. Use "Paste XML" when the sitemap is on a staging server, behind auth, or otherwise not reachable from the public internet; parsing runs entirely in your browser and the content never leaves your machine.
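The recursion behind fetch mode can be sketched as follows, assuming the tool simply follows every `<loc>` inside a `<sitemapindex>` (the fetch callable is injected here so the same logic works against live HTTP or in-memory test data; the URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def flatten(url: str, fetch) -> list[str]:
    """Resolve one sitemap URL to page URLs, recursing through indexes."""
    root = ET.fromstring(fetch(url))
    locs = [e.text.strip() for e in root.iter(f"{NS}loc") if e.text]
    if root.tag == f"{NS}sitemapindex":
        # An index's <loc>s point at child sitemaps, not pages: recurse.
        return [u for child in locs for u in flatten(child, fetch)]
    return locs

# In-memory stand-in for HTTP fetching.
docs = {
    "https://example.com/sitemap_index.xml": (
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        "<sitemap><loc>https://example.com/sitemap-pages.xml</loc></sitemap>"
        "</sitemapindex>"
    ),
    "https://example.com/sitemap-pages.xml": (
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        "<url><loc>https://example.com/</loc></url>"
        "<url><loc>https://example.com/contact</loc></url>"
        "</urlset>"
    ),
}

print("\n".join(flatten("https://example.com/sitemap_index.xml", docs.__getitem__)))
```

Paste mode skips the fetch entirely: the pasted string is parsed directly, which is why it works for sitemaps the public internet cannot reach.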
