A sitemap is an essential tool for any website that needs to be indexed by search engines. The bots that a search engine sends to crawl a site use the sitemap to get the list of pages in the site, with their respective URLs. A Salesforce site built with Communities on an Enterprise, Performance, or Unlimited edition org (non-trial) automatically generates a sitemap, and there is no built-in option to replace that generated file with a custom one. That is the topic of this blog: how to use a custom sitemap for your Salesforce site.

Creating a VF page

The first step in this process is to create a VF page for the sitemap. The content type of this page should be 'text/xml'; simply set this in the contentType attribute of the <apex:page> tag. Next, create a controller for this page and return a list of all the VF pages from a method in the controller. This can be done with the help of the following query:
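As a minimal sketch, the page shell could look like this (the page and controller names, sitemap and SitemapController, are placeholders):

```xml
<apex:page controller="SitemapController" contentType="text/xml"
           showHeader="false" sidebar="false" cache="true">
    <!-- sitemap body goes here -->
</apex:page>
```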

SELECT Id, Name FROM ApexPage WHERE Id IN (SELECT SetupEntityId FROM SetupEntityAccess WHERE SetupEntityType = 'ApexPage' AND Parent.ProfileId = :UserInfo.getProfileId())

Loop over the list of sObjects returned by the query and add the page names to a list of strings. After this, go back to the VF page and create the following tag in it.
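Putting the query and the loop together, a controller along these lines could work (the class and method names here are illustrative, not from the original post):

```apex
public class SitemapController {
    // Returns the names of all VF pages visible to the profile of the
    // running user (on a Salesforce site, that is the site's guest user).
    public List<String> getPageNames() {
        List<String> pageNames = new List<String>();
        for (ApexPage p : [SELECT Id, Name FROM ApexPage
                           WHERE Id IN (SELECT SetupEntityId
                                        FROM SetupEntityAccess
                                        WHERE SetupEntityType = 'ApexPage'
                                        AND Parent.ProfileId = :UserInfo.getProfileId())]) {
            pageNames.add(p.Name);
        }
        return pageNames;
    }
}
```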

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"></urlset>

Between these tags, create an <apex:repeat> element iterating over the list of VF pages that we got earlier, and add the following code inside the repeat element.
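The code block referenced here appears to be missing from the post. As a sketch, assuming the controller exposes the list of page names as pageNames, the repeat block could look like this:

```xml
<apex:repeat value="{!pageNames}" var="page">
    <url>
        <loc>{!$Site.BaseUrl}/{!page}</loc>
        <priority>0.5</priority>
    </url>
</apex:repeat>
```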


This will create a sitemap file with a priority of 0.5 for each page. Alternatively, you can create a Map<String, Decimal> containing a priority for each page and use it in this page.
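A hypothetical sketch of that alternative (the page names and priority values below are made up for illustration):

```apex
// In the controller: per-page priorities instead of a flat 0.5
public Map<String, Decimal> getPriorities() {
    return new Map<String, Decimal>{
        'HomePage'    => 1.0,
        'ProductList' => 0.8,
        'ContactUs'   => 0.3
    };
}
```

Inside the repeat element, the priority for the current page can then be read as {!priorities[page]}.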

Overriding the default sitemap

We won't actually be overriding the default sitemap; rather, we will redirect bots to the new custom sitemap we created, with the help of robots.txt, since Salesforce allows us to serve a custom robots.txt file.

For the blog on creating custom robots.txt go to this link: https://wedgecommerce.com/setup-robots-txt-salesforce-site/

Now in this robots.txt we need to add the following line:

Sitemap: {!$Site.BaseUrl}/sitemap

With this done, when bots crawl the robots.txt file, they will find the URL of the custom sitemap and hence will not go to the default sitemap URL.
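For reference, the robots.txt VF page might then look like this (assuming the custom sitemap page created earlier is named sitemap):

```xml
<apex:page contentType="text/plain" showHeader="false">User-agent: *
Sitemap: {!$Site.BaseUrl}/sitemap
</apex:page>
```

Keeping the text content flush against the opening tag avoids a stray leading blank line in the served file.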


That's all about creating a custom sitemap for a Salesforce site. For any further queries, feel free to contact us at:


Or let us know your views about this blog in the comments section below.
