Duplicate content homepage - Google canonical 'N/A'?
-
Hi,
I redesigned a client's website and launched it two weeks ago. Since then, I have 301 redirected all the old URLs in Google's search results to their counterparts on the new site.
However, none of the new pages are appearing in the search results, and even the homepage has disappeared. Only old site links are appearing (even though the old website has been taken down), and in GSC it states:
Page is not indexed: Duplicate, Google chose different canonical than user
However, when I try to understand how to fix the issue and see which URL it is claiming to be a duplicate of, it says:
Google-selected canonical: N/A
It says that the last crawl was only yesterday - how can I possibly fix it without knowing which page it says it's a duplicate of? Is this something that just takes time, or is it permanent?
I would understand if it were just Google taking time to crawl and index the pages, but it seems adamant that it's not going to show any of them at all.
-
The contradictory GSC report is curious. My guess, without more info, is that Google either has not found the redirects or cannot see them.
When I checked some pages myself, e.g. https://sa-state.cataloxy.net/firms/adelaide-airport.htm, they are not redirected. Is this intentional?
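One quick check from this end: request a handful of the old URLs and look at the raw status codes and Location headers, since Google will only consolidate signals onto the new pages if it sees a proper server-side 301. A minimal sketch (the URLs below are placeholders, not the actual site):

```python
import requests

# Hypothetical old URLs -- swap in real URLs from the old site's search results.
old_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for url in old_urls:
    # allow_redirects=False shows the first response a crawler gets,
    # not the final destination after the chain has been followed.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "-")
    print(f"{url} -> {resp.status_code} {location}")
```

If any old URL returns a 200 (or only a meta-refresh/JavaScript redirect) instead of a 301 pointing at its new counterpart, that would go a long way towards explaining the "Duplicate, Google chose different canonical" report.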
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site. I also removed several ancient sitemaps that listed content deleted years ago that Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | | tif-swedensky0 -
Subdirectory site / 301 Redirects / Google Search Console
Hi There, I'm a web developer working on an existing WordPress site (Site #1) that has 900 blog posts accessible from this URL structure: www.site-1.com/title-of-the-post We've built a new website for their content (Site #2) and programmatically moved all blog posts to the second website. Here is the URL structure: www.site-1.com/site-2/title-of-the-post Site #1 will remain a normal company site without a blog, and Site #2 will act as an online content membership platform. The original 900 posts have great link juice that we, of course, would like to maintain. We've already set up 301 redirects that take care of this process (i.e. the original post gets redirected to the same URL slug with '/site-2/' added). My questions: Do you have a recommendation about how to best handle this second website in Google Search Console? Do we submit this second website as an additional property in GSC (it shares the same top-level domain as the original)? Currently, the sitemap.xml submitted to Google Search Console has all 900 blog posts with the old URLs. Is there any benefit / drawback to submitting another sitemap.xml from the new website which has all the same blog posts at the new URLs? Your guidance is greatly appreciated. Thank you.
Intermediate & Advanced SEO | | HimalayanInstitute0 -
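For a migration like the one described above, it can be worth verifying programmatically that every old post URL really does return a single 301 hop to its '/site-2/' counterpart before worrying about GSC properties or a second sitemap. A rough sketch, assuming the slugs can be exported to a list (the domain and slugs here are placeholders):

```python
import requests

BASE = "https://www.site-1.com"

# Hypothetical slug list -- in practice, export it from WordPress or
# parse it out of the existing sitemap.xml.
slugs = ["title-of-the-post", "another-post-title"]

for slug in slugs:
    old_url = f"{BASE}/{slug}"
    expected = f"{BASE}/site-2/{slug}"
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and target.rstrip("/") == expected
    print(f"{'OK ' if ok else 'CHECK'} {old_url} -> {resp.status_code} {target}")
```

A report like this makes it easy to spot redirect chains, 302s, or slugs that were missed during the programmatic move.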
Google News and Discover down by a lot
Hi,
Could you help me understand why my website's Google News and Discover performance dropped suddenly and drastically in November? The numbers seem to be picking up a little again, but are nowhere near what we used to see before then.
Technical SEO | | SolenneGINX0 -
Google Drop Following Negative Article in New York Times
I have two sites that were mentioned in a negative article in The New York Times a couple of weeks ago. They saw a good increase in traffic, but on the sixth both of them saw sudden, unexplained Google drops. Both saw their average position in Search Console roughly double overnight. I run similar websites that have seen no such drops. The only thing these two have in common is being mentioned in the same negative article. Normally I would expect a mention from a major news outlet to make the sites more authoritative in Google's eyes. Is this a coincidence or a possible manual penalty? They still rank number one for their respective brand names, but everything else has suffered. Did Google make any recent algorithm changes, or do you think someone at Google may have read the article and decided the sites needed to be demoted?
Algorithm Updates | | PostAlmostAnything0 -
When to use mod rewrite / canonical / 301 redirect
Hello, I have taken over the management of a site which has a big problem with duplicate content. The duplicate content is caused by two things. The first is upper- and lower-case URLs, e.g. www.mysite.com/blog and www.mysite.com/Blog. The other is the use of product filters / pagination, which means you can get to the same 'page' via different filters. The filters generate separate URLs, for example:
http://www.mysite.com/casestudy
http://www.mysite.com/casestudy/filter?page=1
http://www.mysite.com/casestudy/filter?solution=0&page=1
http://www.mysite.com/casestudy?page=1
http://www.cpio.co.uk/casestudy/filter?solution=0
Am I right to assume that for the case-sensitive URLs I should use a 301 redirect, because I only want the lower-case page to be shown? For the issue with dynamic URLs, should we implement a mod-rewrite and 301 to one page? Any advice would be greatly appreciated.
Mat
Technical SEO | | Barques-Design0 -
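Before choosing between rewrites, redirects, and canonicals for a case like the one above, it can help to audit what each duplicate variant currently returns: whether it already redirects, and if not, which canonical URL (if any) it declares. A small sketch using the example URLs from the question (treat them as placeholders):

```python
import re
import requests

# Variants taken from the question above; adjust to the real site.
variants = [
    "http://www.mysite.com/casestudy",
    "http://www.mysite.com/Casestudy",
    "http://www.mysite.com/casestudy?page=1",
    "http://www.mysite.com/casestudy/filter?page=1",
    "http://www.mysite.com/casestudy/filter?solution=0&page=1",
]

# Simple pattern; assumes rel="canonical" appears before href in the tag.
canonical_re = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']', re.I
)

for url in variants:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302):
        print(f"{url} -> redirect to {resp.headers.get('Location')}")
        continue
    match = canonical_re.search(resp.text)
    canonical = match.group(1) if match else "none"
    print(f"{url} -> {resp.status_code}, canonical: {canonical}")
```

Variants that return 200 with no canonical (or a self-referencing one) are the ones most likely to keep generating duplicate content warnings, whichever fix you pick.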
404s and duplicate content
I have real estate based websites that add new pages when new listings come on the market and delete pages when a property is sold. My concern is that a significant number of 404s are created, and the listing pages that are added will be the same as those on other sites in my market that use the same IDX provider. I could switch to an IDX provider that uses an iframe, which doesn't create new pages, but when I used an iframe before my time on site was 3 minutes with 2.5 pages per visit, and now it's 6+ minutes with 7.5 pages per visit. The new pages add fresh content daily, so which is better: fresh content and stronger on-site metrics (with the 404s), or fewer 404s, no duplicate content, and weaker on-site metrics? Any thoughts on this issue? Any advice would be appreciated
Technical SEO | | AnthonyLasVegas0 -
How critical are duplicate content warnings?
Hi, So I have created my first campaign here, and I have to say the tools, user interface, and on-page optimization are all useful, and I am happy with SEOmoz. However, the crawl report returned thousands of errors, and most of them are duplicate content warnings. As we use Drupal as our CMS, the duplicate content is caused by Drupal's pagination problems. Let's say there is a page called "/top5list"; the crawler decided "/top5list?page=1" to be a duplicate of "/top5list". There is no real solution for pagination problems in Drupal (as far as I know). I don't have any warnings in Google's webmaster tools regarding this, and the sitemap I submitted to Google doesn't include those problematic deep pages (the ones detected as duplicate content by the SEOmoz crawler). So my question is, should I be worried about the thousands of error messages in crawler diagnostics? Any ideas appreciated
Technical SEO | | Gamer070 -
How do I fix duplicate content/title errors pointing to the memberlist.php page?
I have over 6,000 duplicate title and duplicate content errors pointing to this link: http://community.mautofied.com/memberlist.php?mode=viewprofile&u=100299 How do I fix this?
Technical SEO | | mautofied0
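One common approach to parameterized forum profile pages like this is to keep crawlers out of memberlist.php altogether (via a robots.txt Disallow or a noindex rule); the exact mechanism depends on the forum software. If a Disallow rule is added, a quick way to confirm crawlers will honor it is Python's standard robots.txt parser. A minimal sketch, assuming a rule like "Disallow: /memberlist.php" has been added to the site's robots.txt:

```python
from urllib import robotparser

# Assumes "Disallow: /memberlist.php" exists in the site's robots.txt --
# this script only confirms the rule is picked up, it does not add it.
rp = robotparser.RobotFileParser()
rp.set_url("http://community.mautofied.com/robots.txt")
rp.read()

url = "http://community.mautofied.com/memberlist.php?mode=viewprofile&u=100299"
print(rp.can_fetch("*", url))  # False once the disallow rule is live
```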