My client is using a mobile template for their local pages, and Google Search Console is reporting thousands of duplicate titles/meta descriptions
-
So my client has 2,000+ different store locations. Each location has a standard desktop page, and my client opted for a corresponding mobile template for each one. Now Google Search Console is reporting thousands of duplicate titles/meta descriptions. However, this is only because the mobile template and desktop store pages use the exact same title and meta description tags.
Is Google penalizing my client for this? Would it be worth it to update the mobile template title/meta description tags?
-
Excellent advice. Thanks Andy!
-
Hi Rosemary,
Is Google penalizing my client for this?
Does it look like there is any penalty going on?
What you need to do is add a rel="alternate" link from each desktop location page to its mobile version, and a rel="canonical" link from each mobile page back to its desktop version. This should remedy all duplication issues and send Google the correct signals at the same time.
What Google says...
- On the desktop page, add a special link rel="alternate" tag pointing to the corresponding mobile URL. This helps Googlebot discover the location of your site's mobile pages.
- On the mobile page, add a link rel="canonical" tag pointing to the corresponding desktop URL.
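In markup, Google's two recommended annotations look like this; a minimal sketch, where the example.com URLs and the 640px media-query breakpoint are illustrative placeholders, not your client's actual values:

```html
<!-- On the desktop page, e.g. http://www.example.com/stores/anytown -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/stores/anytown">

<!-- On the corresponding mobile page -->
<link rel="canonical" href="http://www.example.com/stores/anytown">
```

With this pairing in place, Google consolidates indexing signals on the desktop URL, so the identical titles and meta descriptions on the mobile pages should stop being reported as duplicates.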
You can read Google's documentation on the separate mobile URLs configuration for the full details.
This should fix all issues.
-Andy
Related Questions
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of clients' Google Search Console (previously Webmaster Tools) dashboards, and I was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th. It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are: http://www.domain.com/marketing-sitemap.xml, http://www.domain.com/page-sitemap.xml, and http://www.domain.com/post-sitemap.xml. There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Technical SEO | | Dan-Lawrence
Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for: http://domain.com/en/post-sitemap.xml AND https://www.domain.com/page-sitemap.xml AND https://www.domain.com/post-sitemap.xml. I take it from all the above that the HTTPS sitemap is mainly fine, and that despite the reported 0 pages indexed in the GSC Sitemaps section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master indexed URL is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted or removed; but how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are being reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
How to remove the duplicate page title
Hi everyone, I saw many posts related to this query, but I couldn't find a solution for my error. Here is my question: I got 575 duplicate page title and 600 duplicate page content errors. My site is related to real estate. I created page titles from the same sentence, differing only by locality name, e.g.: "Land for sale - Kandy property", "Land for sale - Galle property". Only the locality name differs; I have created the meta titles and content like this. Can anyone let me know how to solve this error ASAP?
Technical SEO | | Rajesh.Chandran
Duplicate page content
Hello, the Pro dashboard crawler bot that you get here reports mydomain.com and mydomain.com/index.htm as duplicate pages. Is this a problem? If so, how do I fix it? Thanks, Ian
Technical SEO | | jwdl
/index.php/ page
I was wondering, since my system creates this page (www.mydomain.com/index.php/), is it better to block it with robots.txt or just canonicalize it?
Technical SEO | | ciznerguy
Why isn't Google pushing my Schema data to the search results page
I believe we have it set up right. I'm noticing all my competitors' schema data is showing up, which is really giving them a leg up on us. We have a high-ranking website, so I'm just not sure why ours is not showing up. Here is an example URL: http://www.airgundepot.com/3576w.html. I've used the Google Webmaster Tools tester and it all looks fine. Any ideas? Thanks in advance.
Technical SEO | | AirgunDepot
Optimum title and description meta tag length
Hi all, I have read that title tag and description tag lengths of 69 and 156 characters respectively should be used, as this is all that Google will show in the search results, but that search engine robots will read longer titles and descriptions, and the additional characters will have an effect on ranking algorithms. However, is there any SEO benefit in making title and description tags longer to include more keywords to aid ranking, even though the latter part won't be visible in the results? I have read elsewhere on this forum that there may be concerns with regard to keyword dilution, but what about keyword reinforcement, i.e. a repetition of the main keyword at the end of the title/description (I mean in a readable manner here, not 'stuffed')? Thanks in advance, Gareth
Technical SEO | | gdavies09031977
Our Development team is planning to make our website nearly 100% AJAX and JavaScript. My concern is crawlability or lack thereof. Their contention is that Google can read the pages using the new #! URL string. What do you recommend?
Discussion around AJAX implementations, and whether anybody has achieved high rankings with a full AJAX website, or even a partial one.
Technical SEO | | DavidChase
Google not using <title> for SERP?
Today I noticed that Google is not using my title tag for one of my pages. Search for "covered call search" and look at organic result 6: "Search - Covered Calls. Covered call screener filters 150000 options instantly to find the best high yield covered calls that meet your custom criteria. Free newsletter." (https://www.borntosell.com/search - Cached). Now, if you click through to that page, you see the meta title tag is: "Covered Call Screener". Even the cached version shows the title tag as "Covered Call Screener". I am not logged in, so I don't believe personalization has anything to do with it. Have others seen this before? It is possible that "search - covered calls" was the title tag 9 months ago (before I understood SEO); I honestly don't remember. I cleaned all my titles up at least 6 months ago. Can I force Google to re-index the page? Its content has changed a few times in the last few months, and Google crawls my site frequently according to Webmaster Tools.
Technical SEO | | scanlin