Are my "Terms & Conditions," "Privacy Policy," and "About Us" pages stealing link juice?
-
Should I make them nofollow?
Or is this a bogus method?
-
Hi Keri... thanks for sharing some insight on my question as well ;>)
Is the Panda update the reason they should be indexed? Since Panda, I believe Google now wants to see the contact page and those two other pages. Thanks for the clarity!
-
I wouldn't bother with nofollow; I would put all three on the same page and consolidate that way. If you wanted, you could leave your navigation as-is by using anchors to jump to specific sections of the page.
I don't like to noindex or block the legal pages, as they are a signal of trust.
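For illustration, the anchor approach could look something like this (the filename, IDs, and URLs are hypothetical, not from the original post):

```html
<!-- legal.html: all three pages consolidated into one -->
<h2 id="terms">Terms &amp; Conditions</h2>
<p>...</p>
<h2 id="privacy">Privacy Policy</h2>
<p>...</p>
<h2 id="about">About Us</h2>
<p>...</p>

<!-- Navigation stays as-is; each link just points at a fragment -->
<a href="/legal.html#terms">Terms &amp; Conditions</a>
<a href="/legal.html#privacy">Privacy Policy</a>
<a href="/legal.html#about">About Us</a>
```

This keeps a single indexable page carrying the trust signal while the navigation labels stay unchanged.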
-
There's a good discussion about this in a very similar post from a few hours ago at http://www.seomoz.org/q/should-i-make-all-my-non-money-pages-no-follow. The short answer is that a couple years back it may have helped, but not anymore.
You do want to index them, as you want the search engines to see that you have a privacy policy, and both the privacy policy and about us help the engines trust your site just a tad more.
-
Yes... if they are of no importance to you, then you are hurting yourself and your site. Just use the noindex, nofollow tag; this will prevent those insignificant pages from being indexed and stop juice from passing through them.
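For reference, the tag this answer refers to is the standard robots meta tag, placed in the head of each page you want excluded:

```html
<!-- In the <head> of the page to be excluded -->
<meta name="robots" content="noindex, nofollow">
```

Here noindex keeps the page out of the index, and nofollow tells engines not to follow the links on that page.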
Related Questions
-
How many links to the same page can there be on each page?
I need to know if I can add more than two identical links on the same page, for example one link in the header, another in the body, and one in the footer.
Intermediate & Advanced SEO | Jorgesep0
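A hypothetical sketch of the scenario described above (the /services URL and anchor texts are invented for illustration):

```html
<!-- Multiple links to the same URL on one page are technically fine;
     search engines have historically tended to credit only the first
     anchor text they encounter for a given target URL -->
<header><a href="/services">Our Services</a></header>
<main><p>Learn more about <a href="/services">what we offer</a>.</p></main>
<footer><a href="/services">Services</a></footer>
```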
Indexed page count differs when I perform a "site:" search on Google - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameters blocked from the spiders through the meta robots tag). There are 9,000 "true" URLs being submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:" search for the website in Google, it says it is indexing 2.2 million pages for the domain, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicates" message. What is happening? Why does Google say it is indexing 2.2 million URLs but then won't show me more than 140 pages of them? Thank you so much for your help. I tried looking for the answer, and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar0
Pages with rel="next"/"prev" still crawling as duplicates?
Howdy! I have a site whose pages are being crawled as "duplicate content pages" when they are really just pagination. The rel="next"/"prev" tags are in place and done correctly, but RogerBot and Google are both reporting duplicate content and duplicate page titles and metas, respectively. The only thing I can think of is that we have a canonical pointing back at the URL you are on; we do not have a view-all option right now and would not feel comfortable recommending one given the speed implications and the size of their catalog. Any experience or recommendations here? Something to be worried about? /collections/all?page=15"/>
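For context, the setup described (rel next/prev plus a self-referencing canonical) would look something like this in the head of a paginated URL; the example.com domain is hypothetical, with the path pattern taken from the question:

```html
<!-- <head> of /collections/all?page=15 -->
<link rel="canonical" href="https://example.com/collections/all?page=15" />
<link rel="prev" href="https://example.com/collections/all?page=14" />
<link rel="next" href="https://example.com/collections/all?page=16" />
```

A self-referencing canonical on each paginated page is compatible with rel next/prev; the conflicting pattern to check for would be every page's canonical pointing at page 1.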
Intermediate & Advanced SEO | paul-bold0
Wrong pages ranking for key terms
Hi, I have a website that was rebuilt and redesigned earlier this year, and it's struggling to rank. The problem is that the wrong pages are ranking for the key terms. For example, there is a page for "Loft Conversions Essex," but the page that's ranking is actually the FAQ page (which doesn't mention the word "Essex" at all). I have been through all of the usual items, and none of them seem to apply:
- The landing pages have been properly optimised (not overly so), while the pages that rank only contain the terms within the menu (the link that goes to the actual landing page)
- We thought it might be a redirect issue, since the site was a bit of a mess before the rebuild, so we removed all of the redirects and resubmitted the .htaccess file, but that hasn't helped
- Internal anchor text is relevant
- There aren't a huge number of external links to the old site's pages, and many of these pages didn't exist at all, so I don't think that's an issue
- Most of the pages were built at the same time, so there's no real reason why one would have more authority than another
- There are no canonicals interfering with these pages
I can't really canonical these, since we do want the pages to rank; it's just that they're all ranking for the wrong thing (so the SERPs are a lot lower than they should be). Most of these pages are pretty new, as I said, so while we have tried smaller content changes, I don't think a full refresh will really help. To make it even weirder, the pages that rank for each term change regularly, but it's never the right page. Help! EDIT: Thanks for the responses, everyone!
Intermediate & Advanced SEO | innermedia10
Too Many Links on Page Problem
Hello, my Moz report is showing an error for too many links on my sitemap and blog. The links on both pages are relevant, and I'm not sure if this has to be sorted out, as I would have thought Google would expect sitemaps and blogs to have lots of links. If I were to reduce the number of links, how much of a positive effect would it have on my site? If any of you feel it is best practice to reduce the number of links on these particular pages, do you have any suggestions on how I can tackle it? http://www.dradept.com/blog.php http://www.dradept.com/sitemap.php Thank you, Christina
Intermediate & Advanced SEO | ChristinaRadisic0
SEO & Magento Multistore - will "duplicating" a Magento store using its "Multistore" functionality cause both to be picked up as duplicate content? Can anybody help?
Hello all. I have been asked what the consequences of using Magento's "Multistore" functionality would be if we were to duplicate our entire Magento store and place it on a secondary domain... The simple answer that comes to my mind is that it would be flagged as duplicate content. However, is this still the case if the copy were placed in a different country, the original being in the UK and the copy in Ireland (both English-speaking)? How would Google.co.uk and Google.ie treat these stores? Hope this is clear... our site is http://www.tower-health.co.uk
Intermediate & Advanced SEO | TowerHealth0
Could a large number of "not selected" pages cause a penalty?
My site was penalized for specific pages in the UK on July 28 (corresponding with a Panda update). I cleaned up my website and wrote to Google, and they responded that "no manual spam actions had been taken." The only other thing I can think of is that we suffered an automatic penalty. I am having problems with my sitemap, and it is indexing many error pages, empty pages, etc. According to our index status, we have 2,679,794 "not selected" pages and 36,168 total indexed. Could this be what caused the penalty? (If you have any articles to back up your answers, that would be greatly appreciated.) Thanks!
Intermediate & Advanced SEO | theLotter0
Reducing pages with canonicals & redirects
We have a site that has a ridiculous number of pages. It's a directory of service providers that is organized by city and by sub-category of the vertical. Each provider is on the main city page; then, when you click on a category, it will only show those providers who offer that sub-category of the service. Example:
colorado/denver - main city page
colorado/denver/subcat1 - sub-category page
There are 37 sub-categories, so 38 pages that essentially have the same content (minus a provider or two) for each city. There are approx. 40K locations in our database, so rough math puts us at 1.5 million result pages, with 97% of those pages being duplicate content! This is clearly a problem, but many of these obscure pages do rank and get traffic, a fair amount when you aggregate all these pages together. We are about to go through a redesign and want to consolidate pages so we can reduce the duplicate content, get crawl budget allocated to more meaningful pages, etc. Here's what I'm thinking we should do with this site, and I would love to have your input:
Canonicalize: Before the redesign, use the canonical tag on all the sub-category pages and push all the value from those pages (colorado/denver/subcat1, /subcat2, /subcat3, etc.) to the main city page (colorado/denver).
301 Redirect: On the new site (we're moving to a new CMS) we don't publish the duplicate sub-category pages, and we do 301 redirects from the sub-category URLs to the main city page URLs. We'd still have the sub-categories (keywords) on-page and use some JavaScript filtering to narrow results.
We could cut to the chase and just do the redirects, but we would like to use canonicalization as a proof of concept internally at my company that getting rid of these pages is a good thing, or at least won't have a negative impact on traffic; i.e., by the time we are ready to relaunch, traffic and value will have been transferred to the /state/city page. Trying to create the right plan and build my argument. Any feedback you have will help.
Intermediate & Advanced SEO | trentc0
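A minimal sketch of the canonicalization step described above, using the hypothetical colorado/denver URLs from the question (example.com stands in for the real domain):

```html
<!-- <head> of /colorado/denver/subcat1 (and each other sub-category
     page): consolidate value to the main city page before the redesign -->
<link rel="canonical" href="https://example.com/colorado/denver" />
```

At relaunch, the 301s would then map each /colorado/denver/subcatN URL to /colorado/denver via server-level redirect rules, so any equity the sub-category URLs earned follows them to the city page.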