Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Duplicate content on recruitment website
-
Hi everyone,
It seems that Panda 4.2 has hit some industries harder than others. I just started working on a website that has no manual action, but whose organic traffic has dropped massively over the last few months. The external link profile looks fine, so I suspect usability issues, and duplication in particular, may be the cause.
The website is a recruitment site serving one specific industry. It posts jobs on behalf of its clients, and these can be very similar: at any one time there can be 20 jobs with the same title and near-identical descriptions. The site currently has over 200 pages with potentially duplicate content.
Additionally, these jobs get posted to external job portals with the same content (this happens automatically through a feed).
The questions here are:
- How bad is this for the website's usability, and could it be the reason the traffic went down?
- Is this the effect of Panda 4.2, which is still rolling out?
- What can be done to resolve these issues?
Thank you in advance.
-
Hi Issa,
You're right, duplicate content and poor usability could be triggering the slow-rolling Panda 4.2, but I'd dig in a little more (apologies if you already did this research):
-
You mentioned 200 pages are potentially duplicates; how many pages are on the site in total? If you have thousands of pages indexed, 200 duplicates probably aren't enough to cause a Panda penalty.
-
How similar are these postings? Just the page title? Or is the entire page extremely similar in content? (To answer this: if you made a keyword cloud for these similar job descriptions, would they show roughly the same mapping?)
-
If it's just the page title that's similar, set the pages apart by including the name of the hiring company (which I assume makes the different positions unique) towards the beginning of the page title.
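As a minimal sketch of that title logic in Python - the field names here are hypothetical, so adapt them to the site's actual job data:

```python
def job_page_title(job):
    """Lead with the hiring company so near-identical roles get distinct titles."""
    # The `job` dict keys below are illustrative, not from any real schema.
    parts = [job.get("company"), job.get("title"), job.get("location")]
    return " - ".join(p for p in parts if p)

print(job_page_title({"company": "Acme Rail", "title": "Site Engineer", "location": "Leeds"}))
# Acme Rail - Site Engineer - Leeds
```

Putting the company first keeps the unique part of the title visible even when search results truncate it.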
-
If the entire page is similar, then add more content to make the pages more unique, like a blurb about the hiring company, how long the job has been up, how many applicants the job has (if available), etc.
-
Either way, make sure you don't have any old jobs that still have live pages! If possible, I'd redirect them to a similar job posting.
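One framework-agnostic way to sketch that decision for expired job URLs - the paths and data structures are made up for illustration:

```python
def resolve_job_url(path, live_jobs, expired_to_similar):
    """Decide how to answer a request for a job page.

    live_jobs: set of paths with live postings.
    expired_to_similar: maps an expired job's path to the closest live posting.
    Returns an (http_status, target_path) pair.
    """
    if path in live_jobs:
        return 200, path
    similar = expired_to_similar.get(path)
    if similar in live_jobs:
        return 301, similar   # permanent redirect to a similar live job
    return 410, None          # gone: the posting expired with no close match

live = {"/jobs/site-engineer-leeds-2044"}
expired = {"/jobs/site-engineer-leeds-1017": "/jobs/site-engineer-leeds-2044"}
print(resolve_job_url("/jobs/site-engineer-leeds-1017", live, expired))
# (301, '/jobs/site-engineer-leeds-2044')
```

A 410 for postings with no close match tells crawlers the page is intentionally gone, rather than leaving a thin, dead listing indexed.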
-
Like John asked, did your traffic drop dramatically one day, or has it been tapering off? If it's tapering off, I'd guess it's not Panda.
-
And, last, which pages lost traffic and rankings? Which keywords dropped in rankings? You may be able to tell how you were penalized by which keywords were most affected.
Hope this helps,
Kristina
-
-
Hi Issa -
Great question here. Seems your client is potentially in a tough spot with this!
There is a ton to unpack here and it is hard to know specifics without the site (feel free to private message it to me), but to your specific questions:
- Re: whether it's a problem that the jobs share the same title - that's something only the analytics data you have access to can answer. It usually isn't a problem, but in this sort of situation I'd also ask whether you have category pages for those terms (e.g. 20 "Growth Hacker" jobs in SF posted a day, but also a "Growth Hacker Jobs in SF" category page that all those individual jobs link back up to).
- Regarding syndication of content, this can cause an issue if not done correctly. You'd have to see where they lost traffic (hopefully you already know), but if syndicated listings are losing traffic and non-syndicated ones aren't, this is the issue. What I've often done is either get the site we're syndicating to to implement a canonical back to my listing, or get a followed link from their version back to ours. You can also be selective about what you syndicate, so it's partial duplication rather than complete: make your own pages more robust and syndicate only the necessary info if possible.
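For the cross-domain canonical John mentions, the syndication partner would embed a tag in the head of each copied listing, pointing back at the original. A small helper to render it - the URL is a placeholder, not a real site:

```python
from html import escape

def canonical_tag(original_url):
    """Render the <link rel="canonical"> tag a syndicated copy should carry,
    pointing back at the original listing on the recruiter's own site."""
    return f'<link rel="canonical" href="{escape(original_url, quote=True)}">'

print(canonical_tag("https://example-recruiter.com/jobs/site-engineer-leeds-2044"))
# <link rel="canonical" href="https://example-recruiter.com/jobs/site-engineer-leeds-2044">
```

The tag could just as well be generated in the feed itself; the point is that every syndicated copy declares the recruiter's URL as the canonical version.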
- Website usability can be bad for Panda, especially if bounce rates are really high. Check those and see if they are. If they are, you should fix them anyway, because you'll get better conversions. I've also heard of cases where sites made themselves "stickier" and bounced back from Panda.
It's hard to know whether Panda is still rolling out, but from everything I've heard, it is. I assume this wasn't a one-time drop on a single day, but rather a slow leak of traffic? If it's the latter, that makes it harder to investigate.
Good luck!
John
-
Great, thank you.
Will have a read.
Still, though, in the situation above, is it OK for this industry to have such duplicate content, and if it's not, what should be done about it?
Thanks
-
I was reading an article earlier from SEO Roundtable, which details that duplicate content is a side issue and not necessarily related to the Panda update - read more here: https://www.seroundtable.com/google-duplicate-content-panda-issues-different-21039.html
John Mueller stated that sites with low-quality content are hit by Panda, and that duplicate content is a separate side issue.