Are my duplicate meta titles and descriptions an issue?
-
Hello, my website http://www.gardenbeet.com has been rebuilt using PrestaShop, and Google is reporting 158 duplicate titles and meta descriptions. My developer advised the following:

"Almost all the duplicates are due to the same page being accessible both at the root and under the category heading, e.g.:

/75-vegetable-patio-planter-turquoise.html
/patio-planters/75-vegetable-patio-planter-turquoise.html

This is hard-wired into PrestaShop. Was the Canonical module (now disabled) responsible for the confusion by not including the category name? The Googlebot shouldn't be scanning the root versions now. I don't believe this to be a serious issue, but I'd recommend a second opinion from someone more SEO-savvy just to be sure."

Opinions?
-
Thanks - I don't know why it was disabled; it probably caused another problem (we have had many!). I'll pass your reply on to the developer and see what he says.
Appreciate your time.
-
Hey, you essentially have two URLs pointing to exactly the same page, so you do have an issue here. Exactly why PrestaShop is configured like that I don't know, but it's certainly not going to help you long term, and you need to get it sorted.
I guess you have a couple of options.
1. Reconfigure PrestaShop so it maintains only one unique URL for each page, and 301 redirect the URLs you remove to the remaining one to pick up any search referrals and transfer any link value.
2. Add a rel="canonical" tag to both versions of the page, pointing at the main version.
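To make the two options concrete, here is a minimal sketch of what each could look like at the web-server level. This is an illustration only - it assumes the site runs on Apache with mod_alias/mod_headers enabled and that the categorised URLs are the ones you want to keep; PrestaShop's own settings (or the Canonical module) are likely the cleaner route.

```apache
# Illustrative .htaccess sketch only -- assumes Apache with mod_alias and
# mod_headers enabled, and that the categorised URL is the one to keep.

# Option 1: 301 redirect the root-level product URL to the categorised
# version, consolidating search referrals and link value onto a single URL.
Redirect 301 /75-vegetable-patio-planter-turquoise.html http://www.gardenbeet.com/patio-planters/75-vegetable-patio-planter-turquoise.html

# Option 2 (alternative): declare the canonical via an HTTP Link header when
# editing the page templates isn't practical. An in-page
# <link rel="canonical" href="..."> in the <head> does the same job, and is
# what a canonical module would normally emit.
<Files "75-vegetable-patio-planter-turquoise.html">
  Header set Link "<http://www.gardenbeet.com/patio-planters/75-vegetable-patio-planter-turquoise.html>; rel=\"canonical\""
</Files>
```

Pick one approach per URL rather than both - if you 301 the root version away, a canonical header on it never gets served anyway.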
I just did a quick Google search and this seems to be an oft-discussed topic on the PrestaShop forums, so there is likely a tried-and-tested solution out there. As this is specific to that content management system, I would suggest looking there first; there is probably something you can change in PrestaShop's configuration to sort this out.
You mentioned that you disabled the Canonical plugin - why was this? Was it pointing to the correct versions of the pages (the categorised ones)? If so, putting it back in place may solve the issue, and I would be happy to take a look at the source code for you to confirm this.
Not exactly a full answer, but I suspect the fix is simple and lies in the configuration of the disabled Canonical module.
Let me know if I can help!
Marcus
Related Questions
-
Fixing my site's problem with duplicate page content
My site has a problem with duplicate page content - SEOmoz is telling me 725 pages' worth. I have looked a lot into the 301 redirect and the rel="canonical" tag and I have a few questions. First of all, I'm not sure which one I should use in this case. I have read that the 301 redirect is the most popular path to take. If I take this path, do I need to go in and change the URL of each of these pages, or does it happen automatically when I plug in the old URL and the new one? Also, do I just need to go to each page that SEOmoz flags as a duplicate and create a redirect for that page? One thing I am very confused about is that some of these listed duplicates are actually different pages on my site. Does this just mean the URLs are too similar to each other, and therefore need the redirect to fix them? Then on the other hand I have a login page that says it has 50 duplicates. Would this be a case in which I would use the canonical tag, putting it into each duplicate so that the search engine knew to go to the original file? Sorry for all of the questions. Thank you for any responses.
Web Design | JoshMaxAmps
-
Duplicate Content? Designing new site, but all content got indexed on developer's sandbox
An ecommerce site I'm helping is getting a complete redesign. Their developer had a sandbox version of the new site for design and testing. Several thousand products were loaded into the sandbox site. Then Google/Bing crawled and indexed the site (because the developer didn't have a robots.txt), picking up and caching about 7,200 pages. There were even 2-3 orders placed on the sandbox site, so people were finding it. So what happens now? When the sandbox site is transferred to the final version on the proper domain, is there a duplicate content issue? How can the developer fix this?
Web Design | trafficmotion
-
Are URL suffixes ignored by Google? Or is this duplicate content?
Example URLs:
www.example.com/great-article-on-dog-hygiene.html
www.example.com/great-article-on-dog-hygiene.rt-article.html
My IT dept. tells me the second instance of this article would be ignored by Google, but I've found a couple of instances in which Google did index the '.rt-article.html' version of the page. To be fair, I've only found a couple out of MANY. Is it an issue? Thanks, Trisha
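When a suffixed variant like this does get indexed, one common remedy is a 301 from the suffixed URL to the clean one. A hypothetical mod_rewrite sketch, assuming an Apache server and that the plain '.html' version is the preferred one:

```apache
# Hypothetical sketch: permanently redirect any ".rt-article.html" variant to
# the clean ".html" URL so only one version can be indexed (needs mod_rewrite).
RewriteEngine On
RewriteRule ^(.*)\.rt-article\.html$ /$1.html [R=301,L]
```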
Web Design | lzhao
-
Can the website pages have the site name like Title of the page | Sitename.com
Hi, can website pages have the site name in the title, like "Title of the page | Sitename.com"? I have a site with 50K pages and all of them have "| Sitename.com" in the title - would that be good practice or bad? Thanks, Martin
Web Design | mtthompsons
-
Multiple Sites, multiple locations similar / duplicate content
I am working with a business that wants to rank in local searches around the country for the same service, so they have websites such as OURSITE-chicago.com and OURSITE-seattle.com. All of these sites sell the same services, with small variations in each state due to different legal standards. The current strategy is to put up similar "local" websites with all the same content, so the bottom line is that we have a few different sites with the same content. The business wants to go national and is planning a different website for each location. In my opinion the duplicate content is a real problem. Unfortunately the nature of the service means there aren't many ways to say the same thing on each site 50 times without duplicate content, and rewriting content for each state seems like a daunting task when you have 70+ pages per site. So, from an SEO standpoint we have considered:

1. Using the canonicalization tag on all but the central site - I think this would hurt all of the websites' SERPs because none would have unique content.
2. Having a central site with directories, e.g. OURSITE.com/chicago - but this creates a problem because we need to link back to the relevant content on the main site and ALSO have the unique "Chicago" content easily accessible to Chicago users while Seattle users access their Seattle data. The best way we thought of to do this was a frame with a universal menu and a unique state-based menu - also not a good option, because frames will also hurt SEO.
3. Rewriting all the same content 50 times.

You can see why none of these are desirable options. But I know that plenty of websites have "state maps" on their main site. Is there a way to accomplish this that doesn't make our copywriter want to kill us?
Web Design | SysAdmin19
-
Duplicate content and blog/twitter feeds
Hi Mozzers, I have a question... I'm planning to add a blog summary/Twitter feed throughout my website (onto every main content page) and then started worrying about duplicate content. What is best practice here? Let me know - thanks, Luke
PS. Re: the blog feed - I thought that perhaps it would help if I fed different blog posts through to different pages (which I could then edit, so the text would differ from that in the blog). Not sure about Twitter.
Web Design | McTaggart
-
Real Estate and Duplicate Content
Currently we use an MLS which is an iframe of property listings. We plan to pay an extra fee and get the crawlable version. But one problem is that many real estate firms have access to the same data, which makes our content a duplicate of theirs. Is there any way around this? Thanks
Web Design | SGMan
-
Has Anyone Had Issues With ASP.NET 4.0 URL Routing?
I'm seeing some odd results in my SEOmoz reports for a new site I just released that is using ASP.NET 4.0 URL routing. I am seeing thousands(!) of duplicate results, for instance, because the crawl has uncovered something like this:

http://www.mysite.com/
http://www.mysite.com/default.aspx (so far, so good, though I wish it wouldn't show both)
http://www.mysite.com/default.aspx/about/ (what the heck -?)
http://www.mysite.com/default.aspx/about/about/ (WTF!?)
http://www.mysite.com/default.aspx/about/about/products/ (and on and on ad infinitum)

I'm also seeing problems pop up in my sitemap because extensionless URLs have an odd "eurl.axd/abunchofnumbersgohere" appended to the end of every address, which is breaking links. Sigh. Buyer beware. I've found articles that discuss the "eurl.axd" issue here and there (this one seems very good), but nothing about the weird crawl issue I outlined above. Any advice?
Web Design | TroyCarlson