Lots of city pages - how do I ensure we don't get penalized?
-
We are planning to have a job posting page for each city where we are looking to hire new CFO partners. But the problem is, we have LOTS of locations. What would be the best way to have similar content on each page (since the job description and requirements are the same for each posting) without being hit by Google for duplicate content? One of the main reasons we decided on location-based pages is that visitors arrive at our site searching for "cfo job in [location]", but we notice that most of these visitors then leave. We believe this is because the pages they land on make no mention of the location they were looking for, which is incongruent with what they were expecting.
We are looking to use the following URL and title/description as an example:

URL: http://careers.b2bcfo.com/cfo-jobs/Alabama/Birmingham
Title: CFO Careers in Birmingham, AL
Description: Are you looking for a CFO career in Birmingham, Alabama? We're looking for partners there. Apply today!
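As a rough sketch, the URL/title/description pattern above could be templated per location (the function name and base URL here are assumptions for illustration; each page would still need genuinely unique body content to avoid duplication):

```python
# Sketch: generate the URL, title, and meta description for each hiring
# location from one template, matching the Birmingham example above.
BASE = "http://careers.b2bcfo.com/cfo-jobs"

def location_page(city, state_full, state_abbr):
    # URL pattern: /cfo-jobs/<State>/<City>, with spaces hyphenated
    url = f"{BASE}/{state_full}/{city}".replace(" ", "-")
    title = f"CFO Careers in {city}, {state_abbr}"
    description = (
        f"Are you looking for a CFO career in {city}, {state_full}? "
        "We're looking for partners there. Apply today!"
    )
    return {"url": url, "title": title, "description": description}

page = location_page("Birmingham", "Alabama", "AL")
print(page["url"])    # http://careers.b2bcfo.com/cfo-jobs/Alabama/Birmingham
print(page["title"])  # CFO Careers in Birmingham, AL
```

Templating like this only solves the title/description side; the duplicate-content risk comes from the shared body copy, which is the part that needs unique, location-specific writing.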
Any advice you have for this would be greatly appreciated.
Thank you.
-
We would have the job description on each page, mentioning the location, and we would also have the job capture form.
You are right that these descriptions do have unique data on them. I am thinking we are just going to have to take the time to write as much unique content as possible.
Thanks for the feedback.
-
Hey, the last sentence was based around other ways to bring in this inbound traffic, but scratch that for now.
So, have you examined how these other, well-ranking sites are doing what they do? Are they living off the fact that they are big domains? Is the content on these pages unique? I just Googled:
CFO Careers in Birmingham, AL
And the results appear to be job listings specific to that location, so I am guessing that content is fairly unique and the listings are the content.
These pages that you would create, what content would they have on them? Would they all be different?
My initial understanding was that this would just be a data capture form, but if we actually have unique job listings like on Indeed.com, SimplyHired, Jobs2Careers, etc., then these pages should be unique enough to rank.
Or am I missing something? (It is late in the day here, 7pm, and I'm hitting my 12th hour of work, so the old synapses may be failing me somewhat.)
-
Marcus,
I am not sure I understand the last line of your post, but I have looked at the Keyword Difficulty tool and these are fairly competitive phrases.
The problem we have is that we are competing against the likes of Indeed.com, Monster.com and sites such as that. While we do use these sites, they don't quite provide the flexibility we are looking for.
We used to rank quite highly for these types of phrases, but I have noticed a recent trend of Google ranking the job search sites ahead of us. The hope is that if we provide similar content, Google will start pushing us up the rankings again.
-
A lot of this depends on the competitiveness of the search query and would need some testing to determine the best approach.
You can use the Keyword Difficulty tool here, but also just Google the terms and see what comes up. If the results are weak, you could try this as a stage-one approach and see how you get on.
Maybe there is another way to think about it: what about the job listings themselves, or does it not work that way?
-
I think these need to be indexed, as it is through organic search that people have been getting to our site using terms such as "cfo jobs in [location]".
I have been thinking about adding new content for each city, but you are right, that is a LOT of work. I wonder if it might be worth having one page with unique, location-based content for the main city in an area, with a list of nearby cities on the page where we are also hiring.
-
Hey Danny
A few suggestions:
1. Make each location page unique enough that you can safely have it on the site without worrying about duplication (lots of work).
2. If people are only reaching these pages by searching or browsing internally, then don't index them (robots.txt / meta noindex).
3. You could generate these pages dynamically and add a canonical to your main enquiry page.
4. You could just create all the variations, add a canonical to your main enquiry page, and they may rank if it is not mega competitive (a bit risky, but easy to fix if it causes issues).
I would always try to look at this from the perspective of your users and if you don't really care about having these as organic search landing pages then simply noindexing them would seem an ideal solution.
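For reference, options 2 and 3 above might look something like this in a location page's `<head>` (the enquiry-page URL below is hypothetical, not taken from the actual site):

```html
<!-- Option 2: keep a location page out of the index, but still let
     Google follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Option 3: point near-duplicate location pages at one main page.
     The /apply URL is an assumed example; use your real enquiry page. -->
<link rel="canonical" href="http://careers.b2bcfo.com/cfo-jobs/apply">
```

One caveat: robots.txt blocks crawling rather than removing already-known URLs from the index, so if these pages are already indexed, the meta noindex route is usually the safer of the two.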
Hope that helps!
Marcus