Establishing our web pages as the original source
-
I'm currently working on a recruitment website. One of the things we do to drive traffic to the site is post our job listings out to a number of job boards, e.g. Indeed. These sites replicate our own listings, which means that for every job there are at least 5-10 exact duplicates on the web. By nature the job boards have good domain authority, so they always rank above us, but I would still expect to see more in the way of long-tail traffic.
Is it necessary for me to claim our own job listings as the original source and if so, how do I go about doing it?
Thanks
-
Hi,
Having a self-referencing canonical tag on your own pages is not a problem. The canonical tag needs to go into the head of the page, though (it is not valid in the body of the HTML), so just make sure the third-party syndication service actually supports this - it might, but it might not. Even with the canonical in place I would still include a clean text link back to the original page if possible, both as a second indication of origin and for the visits it might send.
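For illustration, the tag itself is just a single line in the head of the page (the domain and job URL below are made-up examples):

```html
<head>
  <!-- Self-referencing canonical on the original job listing page -->
  <link rel="canonical" href="https://www.example-recruiter.co.uk/jobs/12345-senior-developer" />
</head>
```

The clean text link back from the syndicated copy would then just be a plain, followed anchor in the body of the job board's page pointing at that same URL.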
-
Thanks Lynn, that's perfect!
Another question, then: the job listings are syndicated out to the job boards automatically via a third party (probably used by 75% of UK recruitment companies). If I were to put a rel=canonical tag on each job listing, it should be carried over to each of the job boards, which would get around the duplicate content problem. However, each job listing page on our site would then carry a rel=canonical tag essentially pointing back to itself. Would this cause any issues?
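To sketch what "carried over" would look like, the job board's copy of the listing would need a cross-domain canonical in its head pointing back at the original page on your site (the URLs here are made-up examples):

```html
<!-- In the <head> of the job board's copy of the listing -->
<link rel="canonical" href="https://www.example-recruiter.co.uk/jobs/12345-senior-developer" />
```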
Thanks
-
Hi,
There was a recent Whiteboard Friday about syndicated content which runs down the various technical ways you can attribute your listings as the original source - check it out here. It would probably also help to make sure your listings are the first ones into the index, which you can do by internally linking to the new jobs (obviously), quickly adding them to your sitemap, and sharing them through social channels (especially Twitter) - all of which should help ensure your content is indexed quickly, and ideally before it is replicated on other sites.
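As a rough sketch, a sitemap entry for a newly posted job might look like this (the URL and date are placeholders), with the lastmod value updated whenever the listing changes:

```xml
<url>
  <loc>https://www.example-recruiter.co.uk/jobs/12345-senior-developer</loc>
  <lastmod>2013-05-20</lastmod>
</url>
```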
If the other sites have stronger authority than yours, you would still really want to get one of the three options discussed in the video implemented. It sounds like you already have the link back to your site (option 3 in the video), so perhaps the link is not 'clean', i.e. a straight text link that leads to the exact job on your site and is not nofollowed?
It might also depend on what kind of long tail you are looking to rank for. Individual job ads might not pull a lot of organic traffic by themselves if they are not aggregated by type or location, for example - at which point the higher-authority domains are likely to show an advantage (just a thought).