Google for Jobs: how to deal with third-party sites that appear instead of your own?
-
We have shared our company's job postings on several third-party websites, including The Muse, and have also published them on our own website. Our site and The Muse use roughly the same schema markup, except for these differences:
The Muse...
• Lists Experience Requirements
• Uses HTML in the description with tags and other markup (our website just has plain text)
• Has a Name in JobPosting
• URL is specific to the position (our website's URL just goes to the homepage)
• Has a logo URL for Organization
When you type the exact job posting's title into Google, The Muse posting shows up in Google for Jobs--not our website's duplicate copy. The only way to see our website's job posting is to type the exact job title plus "site:http://www.oursite.com".
What is a good approach for getting our website's posting to be the priority in Google for Jobs? Do we need to remove postings from third-party sites? Structure them differently? Do organic factors affect which version of the job posting is shown, and if so, can I assume that our site will face challenges outranking a big third-party site?
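For reference, here is roughly what the two versions of the markup described above look like side by side. I've sketched them as Python dictionaries purely for readability; the property names come from schema.org's JobPosting type, and the values and URLs are made up rather than our real data.

```python
# Rough sketch of the two JobPosting variants being compared.
# Property names follow schema.org's JobPosting type; all values are hypothetical.
import json

our_site_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Senior Widget Engineer",
    "description": "Plain-text description with no HTML markup.",
    "datePosted": "2017-07-01",
    "hiringOrganization": {
        "@type": "Organization",
        "name": "Our Company",
        # no logo URL on our version
    },
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "Anytown"},
    },
    "url": "https://www.oursite.com/",  # points at the homepage, not the specific job
}

third_party_posting = dict(
    our_site_posting,
    name="Senior Widget Engineer",
    description="<p>HTML description with <strong>markup</strong>.</p>",
    experienceRequirements="3+ years of widget engineering",
    url="https://www.example-jobboard.com/jobs/senior-widget-engineer-12345",
    hiringOrganization={
        "@type": "Organization",
        "name": "Our Company",
        "logo": "https://www.example-jobboard.com/logos/our-company.png",
    },
)

if __name__ == "__main__":
    # Either dict would be embedded in the page as <script type="application/ld+json">.
    print(json.dumps(third_party_posting, indent=2))
```

If nothing else, laying the two out this way makes it obvious which properties our own version is missing.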
-
We have found the following:
1. Using the API is better than waiting for Google to crawl the jobs.
2. Google has "must have" data fields, plus "would like to have" and "be tickled pink if you have" fields. Filling in all three tiers changed rankings in the testing we have done (a rough sketch of that kind of check is below this list).
3. The quality of the title you give versus the title Google understands makes a difference.
4. The overall authority of your site seems to matter. We have nothing exact on this yet; it's a gut-feel factor.
5. SERP results are also jumping around like crazy right now. We see the Google for Jobs panel with no organic links around it, then four hours later there are four organic links for the same search, then a day later two, then a day later none, then back to four, then an hour later none. Testing Google for Jobs since it landed in the UK three weeks ago, we have found its results inconsistent with its own rules: jobs with the wrong suggested title format, jobs with the wrong address format, landing pages (not actual jobs) that have found their way onto the service, and postings with red validation warnings that have made it on. The list goes on.
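As a rough sketch of what we mean in point 2, here is the kind of completeness check we run. The field groupings below are from memory rather than a copy of Google's documentation, so treat them as assumptions and check the current JobPosting docs before relying on them.

```python
# Sketch of a field-completeness check across the three tiers described above.
# The tier contents are assumptions, not an authoritative copy of Google's docs.
MUST_HAVE = {"title", "description", "datePosted", "hiringOrganization", "jobLocation"}
NICE_TO_HAVE = {"validThrough", "employmentType", "baseSalary", "identifier"}
TICKLED_PINK = {"experienceRequirements", "educationRequirements", "workHours"}


def completeness_report(posting: dict) -> dict:
    """Return the properties missing from each tier for a JobPosting dict."""
    present = set(posting)
    return {
        "missing_must_have": sorted(MUST_HAVE - present),
        "missing_nice_to_have": sorted(NICE_TO_HAVE - present),
        "missing_tickled_pink": sorted(TICKLED_PINK - present),
    }


if __name__ == "__main__":
    example = {"title": "Senior Widget Engineer", "description": "...", "datePosted": "2017-07-01"}
    print(completeness_report(example))
```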
-
Yeah, I'm sorry I'm not seeing a really good resource for you, Kevin. It's early days. The person who takes on the task of writing that resource will have valuable information to share. I would say your best hope is in experimentation with this, but I don't see that anyone has figured out a solution to the important questions you've asked.
-
Thanks, Miriam. This article offers a good summary of the information Google has put out there, but it doesn't discuss the factors that may affect which version of a duplicate posting appears. Ideally, there'd be a way to canonicalize third-party duplicates back to our site, but I'm not sure whether that's possible with these huge third-party job posting sites, or even whether it would affect which version of the posting appears in Google for Jobs.
-
Hi Kevin! It's nice to speak with you, too. Another article that might help:
http://www.clearedgemarketing.com/2017/06/optimize-google-jobs/
I'd love to see someone do a deep dive on the exact questions you've raised.
-
Wow, a reply by the Miriam Ellis! I've found your past posts on local search very useful.
Seriously, though, this was a very good thread on which I could begin to pull. I took a look at the article and found this helpful line: "For jobs that appeared on multiple sites, Google will link you to the one with the most complete job posting." I'd be interested in knowing more about what constitutes "complete." I'm assuming it's the post that has the most schema items included and in particular the "critical" items according to Google's rich cards report. If this is the case, then it would seem that organic signals may not affect the visibility of the job posts as much as I originally suspected.
Then again, there's got to be some keyword relevance going on here.
Our website's job posting is being included in Google for Jobs. However, it only appears for a very specific search (typing the exact job title plus "site:http://www.oursite.com").
So, maybe it's a combination: multiple versions of the same job can be part of Google for Jobs, but Google for Jobs will show the posting that is both most keyword relevant and most complete. This is just a theory without significant research (everyone's favorite kind of theory, right?), but I'm going to send an email to the author of the TechCrunch article to see if there's any more detail he can share. Thanks again!
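Just to make the theory concrete, here's a toy scoring sketch that combines a field-completeness count with keyword overlap against the title. It's only an illustration of the hypothesis, definitely not how Google for Jobs actually ranks duplicates.

```python
# Toy illustration of the "most complete + most keyword-relevant" hypothesis.
# This is not Google's algorithm; it only makes the theory above concrete.
def toy_score(posting: dict, query: str) -> float:
    completeness = len([value for value in posting.values() if value])  # filled-in properties
    query_terms = set(query.lower().split())
    title_terms = set(str(posting.get("title", "")).lower().split())
    keyword_overlap = len(query_terms & title_terms)
    return completeness + 2 * keyword_overlap  # arbitrary weighting


if __name__ == "__main__":
    ours = {"title": "Senior Widget Engineer", "description": "plain text"}
    theirs = {"title": "Senior Widget Engineer", "description": "<p>html</p>",
              "experienceRequirements": "3+ years", "url": "https://example.com/job/123"}
    for label, posting in [("our site", ours), ("third party", theirs)]:
        print(label, toy_score(posting, "senior widget engineer"))
```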
-
Hey Kevin,
I'm afraid I'm not very familiar with Google for Jobs, but here's something that caught my eye in a TechCrunch article:
To create this comprehensive list, Google first has to remove all of the duplicate listings that employers post to all of these job sites. Then, its machine learning-trained algorithms sift through and categorize them.
This sounds like it might be applicable to what you're describing. Maybe read the rest of the article? And I'm hoping you'll get further community input from folks who have actually been experimenting with this new Google function.
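Purely as an illustration of what "removing duplicate listings" could involve (and not a claim about how Google actually does it), here's a tiny sketch that groups postings of the same job from different sites by a normalized title/company/location key:

```python
# Illustrative only: group postings of the same job from different sites by a
# normalized key. This is not a description of Google's actual deduplication.
from collections import defaultdict


def _norm(value) -> str:
    return " ".join(str(value).lower().split())


def dedup_key(posting: dict) -> tuple:
    return (_norm(posting.get("title")), _norm(posting.get("company")), _norm(posting.get("city")))


def group_duplicates(postings: list) -> dict:
    groups = defaultdict(list)
    for posting in postings:
        groups[dedup_key(posting)].append(posting["source"])
    return dict(groups)


if __name__ == "__main__":
    postings = [
        {"title": "Senior Widget Engineer", "company": "Our Co", "city": "Anytown", "source": "oursite.com"},
        {"title": "Senior Widget  Engineer", "company": "our co", "city": "Anytown", "source": "themuse.com"},
    ]
    print(group_duplicates(postings))
```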