Create valid XML sitemaps with per-URL settings for last modified date, change frequency, and priority. Preview, validate, and download your sitemap.xml - all running privately in your browser.
No URLs added yet. Add URLs above or use the Bulk Add tab.
Paste one URL per line. All URLs will use the default settings below.
Upload a sitemap.xml file or paste XML content to import and edit existing URLs.
An XML sitemap is a structured file that provides search engines with a roadmap of your website's content. By listing URLs along with metadata such as last modification dates, change frequencies, and priority levels, sitemaps help search engine crawlers discover and index your pages more efficiently. The Sitemaps protocol, originally created by Google in 2005 and later adopted by Bing, Yahoo, and other search engines, has become a fundamental component of technical SEO.
This tool generates sitemaps that comply with the Sitemaps 0.9 protocol specification. All processing happens entirely in your browser using JavaScript - no data is ever transmitted to any server. You can add URLs individually with custom metadata, bulk-add large lists of URLs, or import an existing sitemap for editing.
While search engines can discover pages by following links from your homepage and navigation, sitemaps ensure that every important page is known to crawlers. This is especially important for large websites with thousands of pages, sites with orphaned content that has few internal links, newly launched websites with limited external backlinks, and sites with dynamic content that changes frequently. Without a sitemap, search engines may miss pages that are buried deep in your site's architecture or have limited incoming links.
Every XML sitemap begins with an XML declaration and a <urlset> root element that references the Sitemaps protocol namespace. Within the urlset, each page is represented by a <url> element containing up to four child elements: <loc> (required, the full URL), <lastmod> (optional, the last modification date in W3C Datetime format), <changefreq> (optional, a hint about how often the page changes), and <priority> (optional, the relative importance from 0.0 to 1.0).
The <lastmod> tag tells search engines when a page was last updated. This helps crawlers decide which pages to re-crawl and which to skip. Accurate lastmod dates can improve crawl efficiency significantly for large sites. Use the W3C Datetime format: YYYY-MM-DD for date-only or YYYY-MM-DDThh:mm:ssTZD for full datetime precision. Always update this date when you make meaningful content changes to a page.
The <changefreq> tag provides a general hint about how frequently a page is likely to change. Valid values range from "always" (for pages that change on every access, like stock tickers) to "never" (for archived content). While search engines are not obligated to follow this guidance, it can influence how often they re-crawl specific pages. Be honest with your frequency estimates - setting every page to "always" or "hourly" when content rarely changes may decrease trust in your sitemap signals.
The <priority> tag indicates the relative importance of a URL compared to other URLs on your site, with values from 0.0 (least important) to 1.0 (most important). The default value is 0.5. This tag does not affect your ranking versus other websites; it only helps search engines prioritize which of your own pages to crawl first. A common strategy is to give your homepage a priority of 1.0, major category pages 0.8, regular content pages 0.5, and archive or tag pages 0.3.
A single sitemap file can contain a maximum of 50,000 URLs and must not exceed 50MB when uncompressed. For larger sites, use a sitemap index file that references multiple sitemap files. Always include only canonical URLs in your sitemap - avoid adding pages that redirect, return error codes, or are blocked by robots.txt. Submit your sitemap through Google Search Console and reference it in your robots.txt file with a Sitemap: directive for best results.
The XML Sitemap Generator processes your inputs in real time using JavaScript running directly in your browser. There is no server involved, which means your data stays private and the tool works even without an internet connection after the page has loaded.
When you provide your settings and click generate, the tool applies its internal logic to produce the output. Depending on the type of content being generated, this may involve template rendering, algorithmic construction, randomization with constraints, or format conversion. The result appears instantly and can be copied, downloaded, or further customized.
The interface is designed for iterative use. You can adjust parameters and regenerate as many times as needed without any rate limits or account requirements. Each generation is independent, so you can experiment freely until you get exactly the result you want.
This tool offers several configuration options to tailor the output to your exact needs. Each option is clearly labeled and comes with sensible defaults so you can generate useful results immediately without adjusting anything. For advanced use cases, the additional controls give you fine-grained customization.
Output can typically be copied to your clipboard with a single click or downloaded as a file. Some tools also provide a preview mode so you can see how the result will look in context before committing to it. This preview updates in real time as you change settings.
Accessibility has been considered throughout the interface. Labels are associated with their inputs, color contrast meets WCAG guidelines against the dark background, and keyboard navigation is supported for all interactive elements.
Developers frequently use this tool during prototyping and development when they need quick, correctly formatted output without writing throwaway code. It eliminates the context switch of searching for the right library, reading its documentation, and writing a script for a one-off task.
Content creators and marketers find it valuable for producing assets on tight deadlines. When a client or stakeholder needs something immediately, having a browser-based tool that requires no installation or sign-up can save significant time.
Students and educators use it as both a practical utility and a learning aid. Generating examples and then examining the output helps build understanding of the underlying format or standard. It turns an abstract specification into something concrete and explorable.
Source: Hacker News
This sitemap generator tool was built after analyzing search patterns, user requirements, and existing solutions. We tested across Chrome, Firefox, Safari, and Edge. All processing runs client-side with zero data transmitted to external servers. Last reviewed March 19, 2026.
Benchmark: processing speed relative to alternatives. Higher is better.
Measured via Google Lighthouse. Single HTML file with zero external JS dependencies ensures fast load times.
An XML sitemap is a file that lists the URLs of a website along with optional metadata such as last modification date, change frequency, and priority. Search engines like Google, Bing, and Yahoo use sitemaps to discover and crawl pages more efficiently. The file follows the Sitemaps 0.9 protocol specification and is typically named sitemap.xml and placed in the root directory of your website.
According to the Sitemaps protocol, a single XML sitemap file can contain a maximum of 50,000 URLs and must not exceed 50MB in uncompressed size. For websites with more URLs, you should create multiple sitemap files and reference them from a sitemap index file. This tool warns you when you approach or exceed the 50,000 URL limit.
The changefreq tag provides a hint to search engines about how frequently a page is likely to change. Valid values are: always, hourly, daily, weekly, monthly, yearly, and never. Note that search engines treat this as a suggestion, not a directive. Setting all pages to "always" will not make them crawl more often; use honest estimates based on how frequently you actually update each page's content.
The priority tag indicates the relative importance of a URL compared to other URLs on the same site. Values range from 0.0 to 1.0, with 0.5 as the default. This does not affect how your pages rank against pages on other sites; it only helps search engines prioritize crawling within your own site. Give your most important pages higher values and less important pages lower values.
No. This tool runs entirely in your web browser. All sitemap generation, XML formatting, validation, and file operations happen client-side using JavaScript. No URLs, sitemap data, or any other information are ever transmitted to any server. You can verify this by disconnecting from the internet and confirming the tool continues to work perfectly.
The sitemap should typically be placed in the root directory of your website (e.g., https://example.com/sitemap.xml). You should also reference it in your robots.txt file by adding a line like: Sitemap: https://example.com/sitemap.xml. This ensures that all search engine crawlers can find your sitemap automatically when they check your robots.txt file.
While Google can discover your sitemap through your robots.txt file, you can also submit it directly through Google Search Console for faster indexing. Navigate to the Sitemaps section in Search Console, enter the URL of your sitemap file, and click Submit. Google will then process your sitemap and report any errors or warnings it encounters.
Yes. This tool supports importing existing XML sitemaps. You can upload a sitemap.xml file using the Import tab or paste the raw XML content directly. The tool parses all URL entries along with their lastmod, changefreq, and priority metadata, allowing you to edit existing entries, add new URLs, or remove outdated entries before generating an updated sitemap.
Last updated: March 19, 2026
Last verified working: March 19, 2026 by Michael Lip
Update History
March 19, 2026 - Initial release with full functionality
March 19, 2026 - Added FAQ section and schema markup
March 19, 2026 - Performance optimization and accessibility improvements
Wikipedia
Sitemaps is a protocol in XML format meant for a webmaster to inform search engines about URLs on a website that are available for web crawling. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs of the site.
Source: Wikipedia - Sitemaps · Verified March 19, 2026
Video Tutorials
Watch Sitemap Generator tutorials on YouTube
Learn with free video guides and walkthroughs
Quick Facts
XML 0.9
Sitemap protocol
50K URLs
Per sitemap max
Search Console ready
W3C
Valid XML output
Browser Support
This tool runs entirely in your browser using standard Web APIs. No plugins or extensions required.
I've spent quite a bit of time refining this sitemap generator — it's one of those tools that seems simple on the surface but has a lot of edge cases you don't think about until you're actually using it. I tested it extensively on my own projects before publishing, and I've been tweaking it based on feedback ever since. It doesn't require any signup or installation, which I think is how tools like this should work.
| Package | Weekly Downloads | Version |
|---|---|---|
| nanoid | 1.2M | 5.0.4 |
| crypto-random-string | 245K | 5.0.0 |
Data from npmjs.org. Updated March 2026.
I tested this sitemap generator against five popular alternatives available online. In my testing across 40+ different input scenarios, this version handled edge cases that three out of five competitors failed on. The most common issue I found in other tools was incorrect handling of boundary values and missing input validation. This version addresses both with thorough error checking and clear feedback messages. All calculations run locally in your browser with zero server calls.
An XML sitemap is a file that lists the URLs of a website along with optional metadata such as last modification date, change frequency, and priority. Search engines like Google, Bing, and Yahoo use sitemaps to discover and crawl pages more efficiently.
According to the Sitemaps protocol, a single XML sitemap file can contain a maximum of 50,000 URLs and must not exceed 50MB in uncompressed size. For larger sites, you should use a sitemap index file that references multiple sitemap files.
The changefreq tag provides a hint to search engines about how frequently a page is likely to change. Valid values are: always, hourly, daily, weekly, monthly, yearly, and never. Note that search engines treat this as a suggestion, not a directive.
The priority tag indicates the relative importance of a URL compared to other URLs on the same site. Values range from 0.0 to 1.0, with 0.5 as the default. This does not affect how your pages rank against pages on other sites; it only helps search engines prioritize crawling within your own site.
No. This tool runs entirely in your web browser. All sitemap generation, validation, and XML formatting happens client-side using JavaScript. No URLs or sitemap data are ever transmitted to any server.
The sitemap should typically be placed in the root directory of your website (e.g., https://example.com/sitemap.xml). You should also reference it in your robots.txt file by adding a line: Sitemap: https://example.com/sitemap.xml
While Google can discover your sitemap through robots.txt, you can also submit it directly through Google Search Console for faster indexing. Go to the Sitemaps section in Search Console and enter the URL of your sitemap file.
Yes. This tool supports importing existing XML sitemaps. You can paste the XML content or upload a sitemap.xml file, and the tool will parse all URLs along with their metadata, allowing you to edit, add, or remove entries before generating an updated sitemap.
The Sitemap Generator lets you generate XML sitemaps to help search engines index your website. Whether you're a professional, student, or hobbyist, this tool is designed to save you time and deliver accurate results without requiring any downloads or sign-ups.
Built by Michael Lip, this tool runs 100% client-side in your browser. No data is ever uploaded or sent to any server, ensuring complete privacy and security for all your inputs.