Handling similar page content on a directory site
-
Hi All,
SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but they are very similar, because the site is a directory with a page for each city across multiple US states.
I do not want these pages indexed and want to know the best way to go about this.
I was thinking I could add rel="nofollow" to all the links to those pages, but I'm not sure that is the correct way to do this.
Since the folders are deep within the site and not under one main folder, blocking them through robots.txt would mean writing a Disallow rule for many folders.
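For context, a robots.txt approach would look something like the sketch below. The folder paths here are hypothetical examples, since the real paths aren't given in the thread:

```text
# robots.txt sketch -- city folder paths are made-up examples;
# each deep folder would need its own Disallow line.
User-agent: *
Disallow: /california/los-angeles/
Disallow: /texas/houston/
Disallow: /florida/miami/
```

With the city folders scattered across the site rather than under one parent folder, this list grows with every location, which is why it's tedious to maintain.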
The other option I am considering is a meta "noindex, follow" tag, but I would have to get my programmer to add the meta tag just for this section of the site.
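For what it's worth, the tag in question is a one-liner in the head of each city page, so the programmer's change is small. A minimal sketch:

```html
<!-- Placed in the <head> of each city page to be excluded.
     "noindex" keeps the page out of the search index;
     "follow" lets the links on it continue to pass value. -->
<meta name="robots" content="noindex, follow">
```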
Any thoughts on the best way to achieve this so I can eliminate these duplicate pages from my SEO report and from the search engine index?
Thanks!
-
Thanks Kane!
Meta-robots it is!
I will apply it and see how I go with it.
Cheers
-
The best solution is to use a meta robots "noindex, follow" tag on those pages.
I believe that URLs blocked via robots.txt can still appear as bare URLs in search results, so that approach is less ideal. I'm not certain that's still the case, but it used to be.
I personally would not nofollow links to those pages. If you use "noindex, follow", the page will still pass value through to the other pages it links to, whereas nofollowing links to a noindexed page isn't supposed to increase the PageRank flowing to the other links on the page.
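As a side note, if editing the page templates is a hassle, the same directive can be delivered as an HTTP response header instead of a meta tag. A sketch for an Apache .htaccess file dropped into one of the city folders (this assumes the site runs on Apache with mod_headers enabled, which the thread doesn't actually say):

```apache
# .htaccess sketch -- assumes Apache with mod_headers enabled.
# Sends the "noindex, follow" directive as an X-Robots-Tag HTTP
# header for every page served from this folder.
Header set X-Robots-Tag "noindex, follow"
```

Because it's per-folder rather than per-template, this can be handy when the pages to exclude live in many scattered directories.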