How to index pages of Locator & Pages

A step-by-step guide to indexing your pages.

Last updated on April 3rd, 2024

Indexing

Indexed pages show up in search results when Google deems them high-quality enough for relevant search phrases, driving organic traffic to your site. For this to happen, your pages must be discoverable by search crawlers (such as Googlebot) and indexed.

To ensure correct crawling and indexing by Google, we recommend completing the steps below:

Locator and Pages accessibility

Confirm that Google bots can access your pages:

  • Open Google Search Console for your domain
  • Go to the 'URL inspection' section
  • Paste the URL of one of your local pages into the search bar and press Enter
  • If the URL is not indexed yet, click the 'TEST LIVE URL' button
  • The URL must be available to Google

Upload the Pages sitemap to Google Search Console

To upload the Pages to Google Search Console, a list of URLs, a so-called sitemap, has to be created. A sitemap containing all Pages URLs can be exported by running a small command on the URL that contains the Locator. Execute the command in the browser console:

Open the Browser Console

  • Chrome — Open the Console panel: Command + Option + J (Mac) / Control + Shift + J (Windows/Linux)
    Source: https://developers.google.com/web/tools/chrome-devtools/shortcuts
  • Firefox — Open the Browser Console: Command + Shift + J (Mac) / Control + Shift + J (Windows/Linux)
    Source: https://developer.mozilla.org/en-US/docs/Tools/Keyboard_shortcuts
  • Edge — Open the Console panel: Command + Option + J (Mac) / Control + Shift + J (Windows/Linux)
    Source: https://docs.microsoft.com/en-us/microsoft-edge/devtools-guide-chromium/shortcuts

Executing sitemap download

Copy the following command into your browser console and execute it by pressing Enter/Return:

window.sfExportSitemap()
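The export helper is only defined on pages where the Locator widget has loaded. A small guard avoids a confusing "is not a function" error; this is a sketch — only `window.sfExportSitemap` comes from the widget, the wrapper function is illustrative:

```javascript
// Sketch: call the widget's export helper only if it is actually present.
// Only window.sfExportSitemap comes from the Locator widget; this wrapper
// function is our own illustration.
function exportSitemapSafely(win) {
  if (typeof win.sfExportSitemap === 'function') {
    win.sfExportSitemap(); // triggers the sitemap .xml download in the browser
    return true;
  }
  console.warn('Locator widget not found on this page');
  return false;
}

// In the browser console you would call: exportSitemapSafely(window)
```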

An .xml file containing your sitemap will then be downloaded. If you open it in a text editor, it should look similar to this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
  <url><loc>https://uberall.helpjuice.com/locator-and-pages/458416#!/l/amsterdam/weteringschans-109/2093380</loc><lastmod>2020-11-09T10:50:14.521Z</lastmod></url>
  <url><loc>https://uberall.helpjuice.com/locator-and-pages/458416#!/l/berlin/hussitenstrasse-32-33/2093381</loc><lastmod>2020-11-09T10:50:14.521Z</lastmod></url>
  <url><loc>https://uberall.helpjuice.com/locator-and-pages/458416#!/l/london/1-saint-katharine's-way/2093382</loc><lastmod>2020-11-09T10:50:14.521Z</lastmod></url>
  <url><loc>https://uberall.helpjuice.com/locator-and-pages/458416#!/l/paris/26-28-rue-de-londres/2161887</loc><lastmod>2020-11-09T10:50:14.521Z</lastmod></url>
</urlset>

Submit your sitemap to Google

Follow the instructions in the Google Search Console Help article on adding sitemap URLs.

There are a few different ways to make your sitemap available to Google:

  • Submit it to Google using the Search Console Sitemaps tool
  • Insert the following line anywhere in your robots.txt file, specifying the path to your sitemap:
    Sitemap: http://example.com/sitemap_location.xml
  • Use the "ping" functionality to ask Google to crawl your sitemap. Send an HTTP GET request like this:
    http://www.google.com/ping?sitemap=<complete_url_of_sitemap>
    For example:
    http://www.google.com/ping?sitemap=https://example.com/sitemap.xml
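The ping request can also be prepared from code. A minimal sketch (the sitemap address is an example value; URL-encoding the sitemap location keeps the query string intact):

```javascript
// Sketch: build the ping URL for a sitemap. The sitemap address below is
// an example value, not your actual sitemap location.
const sitemapUrl = 'https://example.com/sitemap.xml';
const pingUrl =
  'http://www.google.com/ping?sitemap=' + encodeURIComponent(sitemapUrl);

console.log(pingUrl);
// In a browser you could then issue the GET request, e.g. fetch(pingUrl);
```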

If you’ve recently added or changed a page on your site, you can ask Google to recrawl your URLs.

Consistency with platform

Enter the Pages URLs in the 'Website' field of each corresponding location profile in order to generate backlinks from Google My Business (GMB) to the newly created Pages.

Do not specify canonical links, title tags, or meta descriptions on the specific Page where the Locator is integrated, as the widget generates them dynamically.

Troubleshooting 

If you have followed the instructions above but Google is still not indexing some or all of your pages, the following points might help.

Meta tags

Have you set your meta tags properly? Google might have trouble identifying the pages and their content as individual, since the location data is identical or at least very similar. This can be the case when your locations are named identically and located in the same city.

Either way, we advise adding meaningful and, ideally, unique tags per page. Since our script adds the tags automatically and dynamically, you might think the possibilities are limited, but there is a way to add SEO-relevant information to the tags of each location page.

You can define these four different Locator attributes inside your widget code:

  • data-mainpagetitle
  • data-mainpagedescription
  • data-localpagetitle
  • data-localpagedescription

The first two set the meta tags of the Locator map itself, while the other two set the tags for the location pages.
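For illustration, these attributes sit directly on the widget embed tag. This is a sketch: the script URL and data-key below are placeholders, not actual integration code — only the four data-*pagetitle/-description attribute names come from this article.

```html
<!-- Sketch of a widget embed. The script src and data-key are placeholder
     values; copy the real embed code from your integration instructions. -->
<script src="https://widget.example.com/locator.js"
        data-key="YOUR_WIDGET_KEY"
        data-mainpagetitle="Our Store Locator"
        data-mainpagedescription="Find a store near you."
        data-localpagetitle="Welcome to $name"
        data-localpagedescription="This is a description of our $name in $address, $zip $city.">
</script>
```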

For "data-localpagetitle" and "data-localpagedescription", we support creating one tag of each kind with individual content per Location Page. This is made possible by placeholders.

A placeholder is a term representing a location data field in our platform. When you add a placeholder to the tags, our script recognizes it and dynamically creates individual meta tags per page, which Google can then distinguish.

Supported placeholders are: $name, $city, $address, $zip, $keywords and any $custom_field name (if available).

By using a custom field as a placeholder in your meta tags, you can enter unique titles and descriptions for each page.

Example:

The meta tags for a Location page of McDonald's with the address Alexanderplatz 1, 10178 Berlin would be:

  • data-localpagetitle="Welcome to $name"
  • data-localpagedescription="This is a description of our $name in $address, $zip $city."

Published result:

  • title tag: "Welcome to McDonald's"
  • description tag: "This is a description of our McDonald's in Alexanderplatz 1, 10178 Berlin."
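The expansion above can be sketched in code. This is an illustration of the substitution behavior only, not the Locator script's actual implementation:

```javascript
// Sketch of placeholder substitution; the real Locator script may differ.
// Each $placeholder is replaced with the matching location data field.
function fillPlaceholders(template, location) {
  return template
    .replace(/\$name/g, location.name)
    .replace(/\$address/g, location.address)
    .replace(/\$zip/g, location.zip)
    .replace(/\$city/g, location.city);
}

const tag = fillPlaceholders(
  'This is a description of our $name in $address, $zip $city.',
  { name: "McDonald's", address: 'Alexanderplatz 1', zip: '10178', city: 'Berlin' }
);
// tag === "This is a description of our McDonald's in Alexanderplatz 1, 10178 Berlin."
```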

Incorrect data in Google Search Console

We have discovered over time that the status of pages in Google Search Console isn't always as accurate as you might think.

So, to be sure about the current indexing status of your Pages, we highly recommend inspecting the ones that are flagged in Google Search Console and searching for them in Google Search.

We often find inconsistencies between the Search Console status and the actual presence of pages in Google Search.


Website of GMB listing

The URL published as the website of a GMB listing should point to the corresponding Location Page. Google checks whether the data of the Location Pages you want to index is available in its GMB database. If it finds a matching listing, it checks the provided website and compares it to the URL you are requesting to be indexed.
If that URL points to a completely different domain, Google might decide not to index your Page.

Therefore, it's advised to enter the URLs of your pages as websites in our platform.


Referencing

Google also checks listings on other directories and platforms to see whether their data is consistent with what's published on your Location Page. Google takes the website URLs published on these listings into consideration when deciding whether or not the Pages get indexed. It has a positive impact if Google finds the identical URL as the website on a listing.

Therefore, we advise customers to refer to the location pages as often as possible. This also leads to an increase in traffic, which helps Google identify Pages as individual content.


My Local Pages are showing a noindex policy

If your Local Pages are showing a "noindex" policy, verify that you are not adding it on your end. Uberall does not offer an option to exclude Local Pages from indexing.

The only time a Local Page should have a "noindex" policy is when the requested page does not exist, or when the location cannot be loaded or times out for any reason. When loading Local Pages, we add a "noindex" policy until we receive a valid response from the Uberall API returning the location information for display. We do this to prevent invalid pages from being indexed.


