Magento Duplicate Content Recovery
-
Hi, we switched platforms to Magento last year. Since then our SERP rankings have declined considerably (with no sudden drop on any Panda/Penguin dates). After investigating, it appears we neglected to apply noindex,follow to our filter pages, and our total indexed page count rose sevenfold in a matter of weeks.
We have since fixed the noindex issue, and the number of pages indexed is now below what we had pre-switch. We've seen some positive results in the last week. Any idea when, or if, our rankings will return?
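For anyone curious, the fix itself is just a robots meta tag on every filtered URL. Here's a rough, generic sketch (not Magento code - the helper and sample markup are made up) of how we spot-checked that pages now carry the directive:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots.append(a.get("content") or "")

def is_noindexed(html: str) -> bool:
    """True if the page carries a noindex directive in a robots meta tag."""
    p = RobotsMetaParser()
    p.feed(html)
    return any("noindex" in c.lower() for c in p.robots)

# Sample markup standing in for a fixed filter page
page = '<html><head><meta name="robots" content="noindex,follow"></head><body></body></html>'
```

We ran something like this over a crawl export of the filter URLs to confirm none were missed.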
Thanks!
-
Hi there,
This is really hard to answer, so for the sake of the question let's assume nothing else changed (which is, of course, impossible). All of the pages that should never have been indexed would need to be de-indexed by Google, and that can take some time - I would check whether they are still in the index. If/when that happens, in our hypothetical vacuum-like situation, the rankings would return once those pages are completely de-indexed.
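One practical way to watch the de-indexation happen is to keep a list of the offending filter URLs and check them off as they drop out of the index. A minimal sketch of separating filter URLs from clean ones - the parameter names here are invented, so substitute your store's actual layered-navigation parameters:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical layered-navigation parameters - replace with your store's own
FILTER_PARAMS = {"color", "size", "price", "manufacturer"}

def is_filter_url(url: str) -> bool:
    """True if the URL carries any layered-navigation query parameter."""
    params = parse_qs(urlparse(url).query)
    return bool(FILTER_PARAMS & set(params))

urls = [
    "https://example.com/shoes",
    "https://example.com/shoes?color=red",
    "https://example.com/shoes?color=red&size=9",
]
filter_urls = [u for u in urls if is_filter_url(u)]
```

Feed that list from your crawl or log files, then periodically re-check which of those URLs Google still reports as indexed.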
However, in reality it doesn't work that neatly. By all means fix the technical issues - they are important - but keep a wide view of the situation: other factors may be affecting the rankings as well, so try not to hyper-focus or obsess too much over this one thing.
-Dan
-
Hi Jon,
Are you sure the indexing issue was the cause of your drop in rankings? Switching platforms can introduce a raft of potential issues. You have probably covered these, but it's worth keeping in mind things like content changes that happened at the same time, proper rel=canonical tags on the product pages, 301 redirects to account for URL changes, etc. I would pay particular attention to your main keywords and their associated landing pages pre and post platform change, and note any major differences - that might give you more specific areas to look at.
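On the 301 point, it's worth checking the redirect map programmatically rather than by eye. A hedged sketch (the URLs are invented) that turns an old-to-new mapping from the platform switch into Apache `RedirectPermanent` rules:

```python
# Hypothetical old-to-new URL mapping from the platform switch
redirect_map = {
    "/old-category/widget.html": "/widget",
    "/old-category/gadget.html": "/gadget",
}

def apache_rules(mapping):
    """Render one RedirectPermanent line per old path, sorted for stable diffs."""
    return [f"RedirectPermanent {old} {new}" for old, new in sorted(mapping.items())]

rules = apache_rules(redirect_map)
```

The same mapping can then double as a checklist: crawl each old URL and confirm it actually returns a 301 to the expected new page.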
As for how long it might take: if you have a big site with lots of products, it could take Google a number of weeks to reindex (or deindex) all the pages, so I would suggest being patient rather than stacking too many changes on top of each other. You can check the cache date of your main pages in the Google SERPs by clicking the 'Cached' link next to the URL, which shows the date of the latest copy Google has.
Hope that helps!
-
I personally feel Magento is prehistoric and not much fun. However, if it is simply an issue of mixed-up index/noindex settings, I believe your rankings will be back before you know it - I have fouled up those settings once or twice myself. Below is a link to what is, in my opinion, a very good resource for Magento SEO. Personally I would be running WordPress with WooCommerce, but I don't know how big your store is, nor whether you're interested in switching platforms.
http://yoast.com/articles/magento-seo/
I hope I have been of help,
Tom