Job Posting Page and Structured Data Issue
-
We run a website where we publish job postings, adding the data manually.
The same job postings are also covered by various other websites, including the original recruiting organisations, so the details of each posting (eligibility criteria, exam pattern, syllabus, etc.) are identical across sites.
We create listing pages for the jobs, while the detailed pages, which carry the duplicate data, are disallowed in robots.txt.
Lately we have been thinking about letting these detailed pages be indexed as well, since the number of non-indexed pages is very high and some of our competitors have equivalent pages indexed. We are not sure whether this is the right move or whether there is a safer way to handle it. There is also the problem that some job posts contain very little data (fees, age limit, salary, etc.), so the thin content might contribute to a poor-quality signal.
Secondly, we want to add structured data to our job postings so they are eligible for rich result snippets. Google's documentation says the markup should not go on listing pages:
"Put structured data on the most detailed leaf page possible. Don't add structured data to pages intended to present a list of jobs (for example, search result pages). Instead, apply structured data to the most specific page describing a single job with its relevant details."
So how do we handle this situation? Is it safe to stop disallowing the detailed pages in robots.txt, given that they contain duplicate and sometimes low-quality job data?
-
First of all, those detailed URLs should have been handled via canonical tags, not via robots.txt.
You are probably safe to let the detailed URLs rank; try allowing a sample of them to rank while keeping the others held back. First, fix the architecture: stop blocking the detailed URLs in robots.txt and instead give them canonical tags pointing to their parent listing pages, as sketched below.
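As a minimal sketch (all URLs are hypothetical), each detail page would carry a canonical pointing at its parent listing page, and the robots.txt disallow for those paths would be dropped so crawlers can actually see that tag:

```html
<!-- On a detail page such as https://www.example.com/jobs/junior-clerk-2024 (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/jobs/" />
```

```text
# robots.txt – drop the old disallow rule for the detail paths;
# a page blocked by robots.txt can never have its canonical tag read.
User-agent: *
Disallow:
```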
Once that is done, select a batch of the detailed URLs as a test. Remove the canonical tags from those URLs so they can be indexed. Do they start ranking and performing? Do you get duplicate content warnings?
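One way to monitor that test (not required, just a common monitoring aid) is to list only the sample URLs in their own sitemap and submit it in Search Console so their index coverage can be tracked separately; the URLs below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap containing only the test sample of detail URLs -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/jobs/junior-clerk-2024</loc></url>
  <url><loc>https://www.example.com/jobs/tax-assistant-2024</loc></url>
</urlset>
```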
Depending on the outcome, you may want to lift the canonical tags from all detailed URLs, or even reverse the canonicals so that the detailed pages get ranking preference.