How do I Address Low Quality/Duplicate Content Issue for a Job portal?
-
Hi,
I want to optimize my job portal for maximum search traffic.
Problems
- Duplicate content- The portal takes jobs from other portals/blogs and posts them on our site. Sometimes employers provide the same job posting to multiple portals and we are not allowed to change it, resulting in duplicate content.
- Empty Content Pages- We have a lot of pages which can be reached by filtering on multiple options, like IT jobs in New York. If there are no IT jobs posted in New York, the result is a blank page with little or no content.
- Repeated Content- Each job listing page includes 'about the company' information. If a company has 1,000 jobs listed with us, that means 1,000 pages have the exact same about-the-company wording.
Solutions Implemented
- Rel=prev and next- We have implemented this for pagination. We also have self-referencing canonical tags on each page. Even if pages are filtered with additional parameters, our system strips off the parameters and always outputs the correct URL for both the rel=prev/next and the self-canonical tags (see the simplified snippet after this list).
- For duplicate content- Due to the volume of job listings that come in each day, it's impossible to create unique content for each one. We try to make the initial paragraph (at least 130 characters) unique. However, we use a template system for each job, so a similar pattern can be detected after even 10 or 15 jobs. Sometimes we also take wordy job descriptions and convert them into bullet points. If bullet points are already available, we take only a few of them and re-shuffle them at times.
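To illustrate, the head of page 2 of a filtered listing ends up looking roughly like this (the domain and URLs here are simplified stand-ins, not our real ones):

```html
<!-- Simplified illustration: head of page 2 of a filtered listing, after parameters are stripped.
     example.com stands in for our real domain. -->
<link rel="canonical" href="https://www.example.com/it-jobs-new-york/page/2/">
<link rel="prev" href="https://www.example.com/it-jobs-new-york/">
<link rel="next" href="https://www.example.com/it-jobs-new-york/page/3/">
```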
Can anyone provide me additional pointers to improve my site in terms of on-page SEO/technical SEO?
Any help would be much appreciated.
We are also thinking of no-indexing or deleting old jobs once they are more than X days old. Do you think this would be a smart strategy? Should I no-index empty listing pages as well?
Thank you.
-
Unique Listing Copy
I would try to get that unique content to the top of the source order - it doesn't necessarily have to appear at the top of the page - it could be in a sidebar for instance, but it should be first in the source so that Googlebot gobbles it up before it reaches duplicate stuff or secondary nav / footer links etc.
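A very rough sketch of what I mean - the class names, widths and copy below are placeholders, not your actual markup:

```html
<!-- The unique copy comes first in the source, but is styled as a sidebar.
     Class names and widths are placeholders. -->
<div class="listing-page">
  <aside class="unique-copy">Unique, human-written copy about IT jobs in New York...</aside>
  <section class="job-results">Job listings, company boilerplate, pagination, footer links...</section>
</div>

<style>
  .unique-copy { float: right; width: 30%; } /* visually a sidebar, but first in the source */
  .job-results { float: left; width: 65%; }  /* visually the main column, but second in the source */
</style>
```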
No Results pages
Yes, you could certainly noindex your no-results pages with a robots meta tag - that would be a good idea.
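Something along these lines in the head of any zero-result filter page would do it (just a sketch - wire it up to your own 'no results' condition):

```html
<!-- Output only when the filter combination returns zero jobs -->
<meta name="robots" content="noindex, follow">
```

Using 'noindex, follow' keeps the page out of the index while still letting Googlebot follow any links on it.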
Loading duplicate content with ajax
In terms of Google and ajax content, yes Googlebot can and does follow links it finds in javascript.
All I can tell you here is my own experience. On my product detail template, I have loaded up category descriptions with ajax that appear canonically (if that's the right way of putting it) on my listing pages. In the SERPs, the category description content is indexed for the pages I want it to be indexed for (the listings in this case), and not for the product detail pages where I'm loading it with ajax. And those product detail pages still perform well and get good organic landing traffic.
On the product detail page where I'm loading with ajax, I have the copy in an accordion, and it's loaded with an ajax request on document ready. It might be considered slightly more kosher to do this in response to a user action though - such as clicking on the accordion in my case. The theory being that you're making your site responsive to a user's needs, rather than loading up half the content one way and the other half another way, if you get what I mean.
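For what it's worth, the click-triggered version looks roughly like this - the endpoint, IDs and markup are invented for the example, not lifted from my site:

```html
<!-- The duplicate 'about the company' copy is not in the page source at all -->
<button id="show-company" data-company-id="1234">Show company details</button>
<div id="company-details"></div>

<script>
  // Sketch only: the description is fetched the first time the user asks for it.
  document.getElementById('show-company').addEventListener('click', function () {
    var target = document.getElementById('company-details');
    if (target.innerHTML) { return; } // already loaded once
    fetch('/ajax/company-description?id=' + this.getAttribute('data-company-id'))
      .then(function (response) { return response.text(); })
      .then(function (html) { target.innerHTML = html; });
  });
</script>
```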
Sometimes of course you just cannot avoid certain amounts of content duplication within your site.
-
Luke,
Thank you for your detailed reply.
I forgot to mention that for each of our important filter pages (like IT jobs in New York) we do have a unique block of text which is human-readable and at the same time SEO-optimised. (Each block is around 200 words long, and it isn't there for all filter pages due to the sheer volume of such pages.) This unique block of text sits at the bottom of the page, just above the footer, after the latest 20 job listings are shown.
"Filtering & Blank Results Pages Could this not be done with javascript / ajax, so that Google never finds an empty listing?"
I am afraid this cannot be done due to the structure of our system. No-indexing them would be much easier - wouldn't that do?
"You could load this content from an ajax template, either as the page loads, or in response to a user action (eg. click on a button 'show company details')."
Sounds like a good idea. Are you sure Google will not consider this cloaking? And are you sure Google cannot read Ajax content?
"Try not to load up the duplicate description by default."- Do you mean we should implement Ajax again for this part?
"You will want to, where possible, specify a view-all page for Google"- not sure if this will be possible from our side due to engineering limitations. I thought rel=next and prev would solve the issue. However, I still see intermediate pages indexed.
-
Hi, I've tried to address your issues point by point according to your post...
Duplicate Job Posting Content
You can try to offset this by having a couple of hundred words of unique copy per listing-page URL exposed to Google. So, if your page lists all jobs in the catering industry in New Jersey for instance, write some copy on that topic, and try to make it readable and useful to the user as well. Add microdata to the template using schema.org, so that Google can understand what's there at the data level - there will likely be entities available to describe your content in terms of location, the companies that are hiring, etc.
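As a rough illustration of the microdata idea - the properties below are real schema.org/JobPosting properties, but the markup and values are invented, so adapt them to your own templates:

```html
<!-- Illustration only: real schema.org/JobPosting properties, invented values -->
<article itemscope itemtype="https://schema.org/JobPosting">
  <h2 itemprop="title">Junior Sous Chef</h2>
  <span itemprop="hiringOrganization" itemscope itemtype="https://schema.org/Organization">
    <span itemprop="name">Example Catering Co.</span>
  </span>
  <span itemprop="jobLocation" itemscope itemtype="https://schema.org/Place">
    <span itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
      <span itemprop="addressLocality">Newark</span>, <span itemprop="addressRegion">NJ</span>
    </span>
  </span>
  <meta itemprop="datePosted" content="2015-06-01">
  <div itemprop="description">A short, ideally unique summary of the role...</div>
</article>
```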
I'm inclined to say don't bother with reshuffling duplicate content and adding bullet points to it - Google is smart enough to recognise that this copy is the same, and will not give you any points - perhaps the opposite - for trying to disguise this.
Filtering & Blank Results Pages
Could this not be done with javascript / ajax, so that Google never finds an empty listing?
'About the Company' Duplicate Content
You could load this content from an ajax template, either as the page loads, or in response to a user action (eg. click on a button 'show company details'). I have solved this exact problem like this in the past - loading a tour category description that appears on a great many tour detail pages.
Perhaps you can do as I'm suggesting above for the job description duplication - where possible, and as long as it's done in a way that does not come across as cloaking. It's good that you have a unique paragraph above the duplicate description. Try not to load up the duplicate description by default. I'm not sure of your source order or site / template structure, so it's difficult to get too detailed here, and I don't want to risk suggesting something that could be interpreted as a violation of Google's guidelines.
Pagination
You will want to, where possible, specify a view-all page for Google - this is suggested by them as a best practice, and in my experience, Googlebot loves to find chunky listing content, PROVIDED that it loads quickly enough not to hamper the user experience.
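If you do end up with a view-all page, one common way to point Google at it is a canonical from each component page to the view-all version (the URL below is invented for the example):

```html
<!-- On each component page (page 1, 2, 3...) of the paginated listing -->
<link rel="canonical" href="https://www.example.com/it-jobs-new-york/view-all/">
```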
You can make sure of this by lazyloading images and other media. Be sure to specify the correct image src attributes (not spacer.gif for instance) inside of noscript tags to make sure that image content is still indexed.
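A sketch of what I mean for the images - the paths and class name are placeholders:

```html
<!-- The lazy-loaded version uses a placeholder src, and the real src sits inside
     noscript so the image is still indexable -->
<img class="lazy" src="/images/placeholder.gif" data-src="/images/logos/acme.jpg" alt="Acme Inc logo">
<noscript>
  <img src="/images/logos/acme.jpg" alt="Acme Inc logo">
</noscript>
```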
You could also load up the markup for all items in the listing, and then use javascript to chunk the content into 'pages', or load it asynchronously where javascript is available. If there is no javascript, load all the content. By using javascript pagination, you basically avert the need for a separate view-all page, meaning you only have one template to maintain and optimise.
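And a very rough sketch of the javascript 'chunking' idea - the selectors and page size are examples only, and you'd wire the paging controls up to your own template:

```html
<script>
  // All job items are already in the markup; javascript just hides everything
  // except the current "page" of results.
  var PAGE_SIZE = 20;
  var items = document.querySelectorAll('.job-listing .job-item');

  function showPage(pageNumber) {
    for (var i = 0; i < items.length; i++) {
      var onPage = i >= pageNumber * PAGE_SIZE && i < (pageNumber + 1) * PAGE_SIZE;
      items[i].style.display = onPage ? '' : 'none';
    }
  }

  showPage(0); // show the first chunk on load
</script>
```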