Block search engines from URLs created by internal search engine?
-
Hey guys,
I've got a question for you all that I've been pondering for a few days now. I'm currently doing a technical SEO audit for a large-scale directory.
One major issue they're having is that their internal search system (Directory Search) creates a new URL every time a user enters a search query. This creates huge amounts of duplication on the website.
I'm wondering if it would be best to block search engines from crawling these URLs entirely with robots.txt?
What do you guys think, bearing in mind there are probably thousands of these pages already in the Google index?
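For context, if the internal search URLs follow a predictable pattern (say, a /search/ path or a ?q= parameter, both hypothetical here since the real pattern isn't shown), the robots.txt block would look something like this:

```
# Hypothetical robots.txt rules, assuming internal search results
# live under /search/ or use a ?q= query parameter
User-agent: *
Disallow: /search/
Disallow: /*?q=
```

Worth noting that robots.txt only stops future crawling; it does nothing on its own to remove URLs that are already indexed.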
Thanks
Kim
-
That sounds perfect - if the user-generated URLs are getting enough traffic, make them permanent pages and 301-redirect or canonical. If not, weed them out of the index.
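To illustrate the canonical option: each duplicate search URL would declare its preferred permanent page with a link element in the `<head>`. A sketch, using one of the example URLs from this thread:

```html
<!-- Placed in the <head> of the lowercase "Car+dealer" duplicate,
     pointing at the preferred version of the page -->
<link rel="canonical" href="http://yellow.co.nz/yellow+pages/Car+Dealer/Auckland+Region" />
```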
-
Thanks for your reply, Dr. Meyers. I think you're probably right.
Yes, I'm recommending they define a canonical set of pages for the most popular searches, categories, and locations, reachable via internal links, and we'll get all those duplicates 301-redirected back to that canonical set.
But for pages that fall outside those categories and locations, I'll recommend a meta noindex tag.
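For reference, the meta noindex tag goes in the `<head>` of each page that should drop out of the index:

```html
<!-- "noindex" keeps the page out of the index; "follow" still lets
     crawlers follow its links, so internal link equity isn't lost -->
<meta name="robots" content="noindex, follow" />
```

Unlike a robots.txt block, this requires the page to remain crawlable so Google can actually see the tag.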
-
It can be a complicated question on a very large site, but in most cases I'd META NOINDEX those pages. Robots.txt isn't great at removing content that's already been indexed. Admittedly, NOINDEX will take a while to work (virtually any solution will), as Google probably doesn't crawl these pages very often.
Generally, though, the risk of having your index explode with custom search pages is too high for a site like yours (especially post-Panda). I do think blocking those pages somehow is a good bet.
The only exception I would add is if some of the more popular custom searches are getting traffic and/or links. I assume you have a solid internal link structure and other paths to these listings, but if it looks like a few searches (or a few dozen) have attracted traffic and back-links, you'll want to preserve those somehow.
-
Sure, check below for some of the duplication I mean:
Capitalization Duplication
http://yellow.co.nz/yellow+pages/Car+dealer/Auckland+Region
http://yellow.co.nz/yellow+pages/Car+Dealer/Auckland+Region
With a few URL parameters
And with location duplication
http://yellow.co.nz/yellow+pages/Car+Dealer/Auckland
Let me know if you need any more info!
Cheers
Kim
-
What does the content look like on the new URL? Can you give us an example?