Use of Robots.txt file on a job site
-
We are performing SEO on a large niche job board. My question concerns the idea of nofollowing all of the actual job postings from their clients, since they only last 30 to 60 days. Does anybody have an idea of the best way to handle this?
-
Happy to help!
-
Thanks Jennifer! Great answer - I wasn't sure which strategy would be better. Your answer makes a lot of sense. Thanks for your input!
-
Hi Oliver!
Before coming to SEOmoz I worked for OntargetJobs, a company that runs multiple niche job boards. Here's what I would recommend:
- Keep those pages followed because people will link to them and you want to preserve as much of the link equity as you possibly can. So how do you do that?
- Make sure that when a job expires (or gets removed, whatever the reason) the page gets 301 redirected to the category page the job was posted under. Depending on the niche, the board may be locale-based; in that case, redirect to the location page instead. The idea is to send the user to a helpful page for a good user experience while conserving some link equity at the same time.
- On the page that gets redirected to, program it so that when a redirection happens it displays a message at the top of the page. Something along the lines of: "Oops! The job you were looking for is no longer active. However, here are similar jobs in XYZ category."
Again, as I mentioned above, this is a good way to help user experience, plus keep some of the link equity from the inevitable links that job posting pages attract.
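As a rough sketch of the expiry-redirect logic described above (a hypothetical Python helper; the job, slug, and URL names are illustrative, not from any real board):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional, Tuple

@dataclass
class Job:
    slug: str
    category_slug: str
    location_slug: Optional[str]
    expires: date

def redirect_for(job: Job, today: date, locale_based: bool = False) -> Optional[Tuple[int, str]]:
    """Return (status, target URL) for an expired job, or None while it's live."""
    if today <= job.expires:
        return None  # posting still active: serve the page normally
    # Prefer the location page on locale-driven boards, else the category page.
    if locale_based and job.location_slug:
        target = f"/jobs/{job.location_slug}"
    else:
        target = f"/jobs/{job.category_slug}"
    # The ?expired=1 flag lets the target page show the "no longer active" banner.
    return (301, f"{target}?expired=1")

job = Job("senior-nurse-chicago", "nursing", "chicago", expires=date(2012, 1, 31))
print(redirect_for(job, today=date(2012, 3, 1)))  # (301, '/jobs/nursing?expired=1')
```

In a real CMS this check would run in the request handler for job pages, so links earned while the posting was live keep passing equity to the category or location page after it expires.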
I hope this helps!
Jen
-
I'm not sure I understand correctly: do you want to add nofollow to all the job postings that expire in 60 days?
If so, you can add a check in the CMS for the job posting's expiry date. If somebody clicks on an expired offer from the SERPs, a small script can 301 redirect them to the category of jobs similar to the expired one.
Ciao
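A minimal sketch of the "expired job" notice on the category page that receives the 301 redirect; it assumes the redirecting script appended an `expired=1` query parameter (an illustrative convention, not something specified in the thread):

```python
# Hypothetical banner logic for the category page visitors land on after a
# 301 redirect from an expired posting. The expired=1 flag is an assumed
# convention set by the redirecting script.
def expired_notice(query_params: dict, category_name: str) -> str:
    """Return the banner text to show at the top of the category page, if any."""
    if query_params.get("expired") == "1":
        return (f"Oops! The job you were looking for is no longer active. "
                f"However, here are similar jobs in the {category_name} category.")
    return ""  # normal visit: no banner

print(expired_notice({"expired": "1"}, "Nursing"))
```

This keeps the redirect target a normal, indexable category page while still explaining to redirected visitors why they didn't land on the job they clicked.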