Duplicate WordPress Home Page Issue
-
I've created a site (www.tntperformance805.com) using WordPress as a CMS. I enabled the option to use a static page as the home page and created that page at /home. The issue is that Google is now indexing both www.tntperformance805.com and www.tntperformance805.com/home/. I've already set up a 301 redirect pointing /home/ to the main domain, and rel=canonical is also added automatically, pointing every page to the www version of that particular page.
However, Google Webmaster Tools is still reporting the pages as having duplicate page titles and descriptions. I've even had the page removed from Google's cache and index. I'm assuming Google is not honoring the 301 redirect, even though it's set up properly. Should I add <link rel="canonical" href="http://www.tntperformance805.com" /> to the body of the /home/ post, to ensure that credit is passed to the main domain?
I'm assuming the page is only redirecting to www.tntperformance805.com, as that's the www version, but I thought the 301 redirect would ensure that search engines give all the credit to the main domain.
Thanks in advance for the help everyone. I look forward to some insightful feedback.
Best Regards,
Matt Dimock
-
I work at National Positions, an SEO company, and have access to hundreds of different GWT accounts. In my four years here, I have seen the same issue persist on many different websites. GWT is definitely clunky in certain areas, but the most frustrating things are the inaccurate 404 reports and duplicate-title issues.
-
Absolutely. I think you're OK on all fronts, and I'm glad to see the sitemap.xml is there - I didn't see it when I searched "site:tntperformance.com", but I didn't check the actual /sitemap.xml URL.
If it makes you feel better, I've got GWT errors over a year old that still persist. I've checked the code eight different ways and implemented redirects and everything under the sun; sometimes I think it's just them.
-
The 301 redirect definitely seems to be working fine, though the tntperformance805.com/home/ link was indexed in Google just a few days ago. I already have a sitemap.xml live (www.tntperformance805.com/sitemap.xml) and submitted it a while back.
I was more curious whether Googlebot indexed the content on the /home/ page or was just immediately forwarded to the main domain. I can't think how it would have been able to find the page, as there are no internal or external links pointing at it.
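For reference, a minimal sketch like the one below (Python with the requests library; the Googlebot-style user-agent is only an approximation of a real crawl) shows exactly what a crawler is served for /home/: if the redirect is in place, the response is a 301 with a Location header and essentially no page content.

```python
# Sketch: fetch /home/ the way a crawler might, without following redirects,
# to see whether it gets page content or an immediate 301.
import requests

resp = requests.get(
    "http://www.tntperformance805.com/home/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    allow_redirects=False,
    timeout=10,
)

print(resp.status_code)                    # expect 301 if the redirect is in place
print(resp.headers.get("Location"))        # expect the www root URL
print(len(resp.content), "bytes of body")  # a redirect response carries little or no content
```

If the status is 301 and the body is essentially empty, the crawler is being forwarded rather than served a duplicate page.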
I'll give GWT some time, and hopefully the duplicates go away. Thanks for putting my mind at ease, Kane!
-
The redirect appears to be working correctly.
The canonical tag on the homepage appears to be correct as well.
Your sitemap looks fine too, though I'd also create an XML sitemap and submit it to GWT.
I would not add another canonical tag in the body.
Have you given GWT a week or two to update? It's known to take a little while.
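To double-check the canonical mentioned above, a small sketch along these lines (Python, assuming the requests and beautifulsoup4 packages are installed) prints whatever canonical URL the homepage currently declares:

```python
# Sketch: print the canonical URL declared in the <head> of the homepage.
import requests
from bs4 import BeautifulSoup

html = requests.get("http://www.tntperformance805.com/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

canonical = soup.select_one('link[rel="canonical"]')
print(canonical.get("href") if canonical else "No canonical tag found")
```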
Related Questions
-
Weird Indexing Issues with the Pages and Rankings
When I found that my page was missing from the search results, I asked Google to index it via Search Console. Just a few minutes after I did that, the page rose to a top-3 ranking (for the same keyword and the same browser search). This happens with most of the pages on my website. Maybe a week later the rankings sank again, and I had to repeat the process to get my pages back to the top. Is there any explanation for this phenomenon, and how can I fix it? Thank you in advance.
Intermediate & Advanced SEO | mrmrsteven
-
Possible duplicate content issue
Hi, here is a rather detailed overview of our problem; any feedback or suggestions are most welcome. We currently have six sites targeting the various markets (countries) we operate in. All websites are on one WordPress install but are separate sites in a multisite network; content and structure are pretty much the same barring a few regional differences. The UK site has held a pretty strong position in search engines for the past few years.
Here is where we have the problem. Our strongest page (from an organic point of view) has dropped off the search results completely for Google.co.uk. We picked this up through a drop in search visibility in SEMrush and confirmed it by looking at our organic landing page traffic in Google Analytics and Search Analytics in Search Console.
Here are a few of the assumptions we've made and things we've checked:
- Crawl or technical issues: nothing serious found.
- Bad backlinks: no new spammy backlinks.
- Geotargeting: this was fine for the UK site; however, the US site, a .com (not a ccTLD), was not set to the US (we suspect this to be the issue, but more below).
- On-site issues: nothing wrong here. The page was edited recently, which coincided with the drop in traffic (more below), but these changes did not impact things such as the title, H1, URL or body content - we replaced some call-to-action blocks, swapping a custom one for one built into the framework (a div).
- Manual or algorithmic penalties: nothing reported by Search Console.
- HTTPS change: we did transition over to HTTPS at the start of June. The sites are not too big (around 6K pages) and all redirects were put in place.
Here is what we suspect has happened: the HTTPS change triggered Google to re-crawl and reindex the whole site (we anticipated this); during this process an edit was made to the key page, and through some technical fault the page title was changed to match the US version of the page. Because geotargeting was not turned on for the US site, Google filtered out the UK page as duplicate content, thereby dropping it from the index. What further supports this theory is that a search on Google.co.uk returns the US version of the page. With country targeting on (i.e. only return pages from the UK), the UK version of the page is not returned. Also, a site: query from Google.co.uk DOES return the UK version of that page, but with the old US title. All these factors lead me to believe that it's a duplicate content filter issue due to incorrect geotargeting. What does surprise me is that the .co.uk site has much more search equity than the US site, so it is odd that Google chose to filter out the UK version of the page.
What we have done to counter this is as follows:
- Turned on geotargeting for the US site.
- Ensured that the title of the UK page says UK and not US (a quick way to check this is sketched below).
- Edited both pages to trigger a new last-modified date, so the two pages share fewer similarities.
- Recreated the sitemap and resubmitted it to Google.
- Re-crawled and requested a re-index of the whole site.
- Fixed a few of the smaller issues.
If our theory is right and our actions do help, I believe it's now a waiting game for Google to re-crawl and reindex. Unfortunately, Search Console is still only showing data from a few days ago, so it's hard to tell if there have been any changes in the index. I am happy to wait it out, but you can appreciate that some of our senior management are very nervous given the impact of losing this page and are keen to get a second opinion on the matter.
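For the title check mentioned in the list above, a minimal sketch along these lines (Python with requests and BeautifulSoup; the URLs are placeholders for the real UK and US page URLs) prints the current title of each version:

```python
# Sketch: compare the <title> of the UK and US versions of the affected page.
# The URLs below are placeholders - swap in the real UK and US page URLs.
import requests
from bs4 import BeautifulSoup

urls = {
    "UK": "https://www.example.co.uk/key-page/",
    "US": "https://www.example.com/key-page/",
}

for market, url in urls.items():
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    print(f"{market}: {title}")
```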
Does the Moz Community have any further ideas or insights on how we can speed up the indexing of the site? Kind regards, Jason
Intermediate & Advanced SEO | Clickmetrics
-
Web accessibility - High Contrast web pages, duplicate content and SEO
Hi all, I'm working with a client who has various URL variations to display their content in high contrast and low contrast. It feels like quite an old way of doing things. The URLs look like this:
domain.com/bespoke-curtain-making/ - Default URL
domain.com/bespoke-curtain-making/?style=hc - High Contrast page
domain.com/bespoke-curtain-making/?style=lc - Low Contrast page
My questions are: Surely this content is duplicate content as far as a search engine is concerned? Should the different versions have a meta noindex directive in the header? Is there a better way of serving these pages? Thanks.
Intermediate & Advanced SEO | Bee159
-
After adding an SSL certificate to my site, I encountered problems with duplicate pages and page titles
Hey everyone! After adding an SSL certificate to my site, it seems that every page on my site has duplicated itself. I think that is because it has combined www.domainname.com and domainname.com. I would really hate to add a rel=canonical to every page to solve this issue. I am sure there is another way, but I am not sure how to do it. Has anyone else run into this problem, and if so, how did you solve it? Thanks, and any and all ideas are very appreciated.
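A quick way to see how the four host/protocol combinations currently resolve is a small sketch like the one below (Python with the requests library; domainname.com stands in for the real domain):

```python
# Sketch: check whether http/https and www/non-www all consolidate onto one preferred URL.
# "domainname.com" stands in for the real domain.
import requests

variants = [
    "http://domainname.com/",
    "http://www.domainname.com/",
    "https://domainname.com/",
    "https://www.domainname.com/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code, resp.headers.get("Location", "(no redirect)"))
```

Ideally three of the four return a 301 pointing at the one preferred version; if several return 200, that is the duplication being reported, and a server-level 301 to the preferred host (rather than a canonical added to every page by hand) is the usual fix.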
Intermediate & Advanced SEO | LovingatYourBest
-
How do I prevent Google and Moz from counting pages as duplicates?
I have 130,000 profiles on my site. When you are not connected to them, they have very few differences, so a bot (not logged in, etc.) will see a login form and "Connect to Profilename". Moz and Google call the links the same, even though they're unique, such as example.com/id/328/name-of-this-group and example.com/id/87323/name-of-a-different-group. So how do I separate them? Can I use Schema or something to help identify that these are profile pages, or that the content on them should be ignored as help text, etc.? Take Facebook: each Facebook profile for a name renders simple results, such as https://www.facebook.com/public/John-Smith and https://www.facebook.com/family/Smith/. Would that be duplicate data if Facebook had a "Why to join" article on all of those pages?
Intermediate & Advanced SEO | inmn
-
Ecommerce: remove duplicate product pages or use rel=canonical
Say we have a white widget that is in our white-widget collection and also in our wedding-widget collection. Currently we have three different URLs for that product (white-widgets/white-widget, wedding-widgets/white-widget and all-widgets/white-widget). We are automatically generating a rel=canonical tag on those individual collection product pages that points to the original product page (/all-widgets/white-widget). This guide says that is the structure Zappos uses, and adds: "There is an elegance to this approach. However, I would re-visit it today in light of changes in the SEO world."
I noticed that Zappos, and many other shops, now actually just link back to the parent product page (e.g. if I am in the wedding-widget section and click on the widget, I go to all-products/white-widget instead of wedding-widgets/white-widget). So my question is: should we even have these individual product URLs, or just get rid of them altogether? My original thought was that having a product URL like wedding-widget/white-widget would help SEO for the search term "white wedding widget", but we won't even be taking advantage of that by using rel=canonical anyway.
Intermediate & Advanced SEO | birchlore
-
Duplicate Page Title/Content Issues on Product Review Submission Pages
Hi everyone, I'm very green to SEO. I have a Volusion-based storefront and recently decided to dedicate more time and effort to improving my online presence. Admittedly, I'm mostly a lurker in the Q&A forum, but I couldn't find any pre-existing info regarding my situation. It could be out there. But again, I'm a noob...
So, in my recent SEOmoz report I noticed that over 1,000 duplicate content errors and duplicate page title errors have been found since my last crawl. I can see that every error is tied to a product in my inventory - specifically, each product page has an option to write a review, and it looks like the subsequent page where a visitor can fill out their review is the stem of the problem. All of my products are shown to have the same issue:
- Duplicate page title: "Review:New"
- Duplicate page content: the form is already partially filled out with the corresponding product
My first question: it makes sense that a page containing a submission form would have the same title and content, but why is it being indexed, or crawled (or both, for that matter), under every parameter in which it could be accessed (product A, B, C, etc.)? My second question (an obvious one): what can I do to begin to resolve this? As far as I know, I haven't touched this option in Volusion other than to simply implement it.
If I'm missing any key information, please point me in the right direction and I'll respond with any additional relevant information on my end. Many thanks in advance!
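A quick way to see what a crawler finds on one of those review pages is a small check like this (Python with the requests and beautifulsoup4 packages; the URL below is a made-up example, not necessarily Volusion's actual review URL pattern - substitute one of the URLs from the crawl report):

```python
# Sketch: check whether a review-submission page declares noindex or a canonical.
# The URL below is hypothetical - use one of the actual review URLs from the crawl report.
import requests
from bs4 import BeautifulSoup

url = "http://www.example-store.com/ReviewNew.asp?ProductCode=ABC123"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

robots = soup.find("meta", attrs={"name": "robots"})
canonical = soup.select_one('link[rel="canonical"]')

print("Title:    ", soup.title.get_text(strip=True) if soup.title else "(none)")
print("Robots:   ", robots.get("content") if robots else "(no robots meta tag)")
print("Canonical:", canonical.get("href") if canonical else "(no canonical tag)")
```

If there is no robots noindex and no canonical pointing back to the product page, each of those form URLs looks like an independent near-duplicate page, which matches what the report is flagging.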
Intermediate & Advanced SEO | DakotahW
-
Getting duplicate page content for the same page with different extensions?
I have added a campaign called "Bannerbuzz" in my SEOmoz Pro account, and two or three days ago I got errors related to duplicate page content. They are showing me the same page with different extensions, as mentioned below:
http://www.bannerbuzz.com/outdoor-vinyl-banners.html
http://www.bannerbuzz.com/outdoor_vinyl_banner.php
We checked all of our source files, but we didn't define any PHP-related URLs in our source code; we only want our .html URLs to be picked up. Can you please guide us on how to solve this issue? Thanks.
Intermediate & Advanced SEO | CommercePundit
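A quick way to confirm whether the two URLs really serve the same page, and whether either of them redirects, is a small sketch like this (Python with the requests library, using the two URLs quoted above):

```python
# Sketch: compare what the .html and .php URLs return.
import hashlib
import requests

urls = [
    "http://www.bannerbuzz.com/outdoor-vinyl-banners.html",
    "http://www.bannerbuzz.com/outdoor_vinyl_banner.php",
]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    digest = hashlib.md5(resp.content).hexdigest()[:12]
    print(url, "->", resp.status_code,
          resp.headers.get("Location", "(no redirect)"), "body md5:", digest)
```

If both return 200 with the same (or nearly the same) body, a 301 from the .php URL to the .html version, or a canonical tag on the .php page pointing at the .html page, is the usual way to consolidate them.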