Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Culling 99% of a website's pages. Will this cause irreparable damage?
-
I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with dupe content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that.
The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1000 travel guides in total.
My first question is, would reducing 140,000 pages to just 1,000 ruin the site's authority in any way?
The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site.
Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly?
And finally, how would you go about redirecting all these pages? I will be culling a huge number of hotel pages; would you consider redirecting all of these to the site's generic hotels page?
Thanks for your time, I know this is quite a long one,
Nick
-
Thank you all for the positive feedback.
Lately I have made the time for SEOmoz Q&A as I have been doing various SEO research and these boards can be a great way to stretch thought processes.
-
Seriously, Ryan is always ALL OVER Seomoz comments with good feedback
-
Just figured out how to do this, I'm new to SEOMoz Q&A, thanks for the nudge! Ryan certainly deserves it!
-
I do hope Ryan gets "Good Answer" and/or "Endorsed Answer" for this... hint, hint
-
Your understanding is correct.
Google does not care how many directories appear in a URL. The two URLs you offered as an example are viewed equally by Google. What's important is how many clicks it takes users to reach those pages.
-
Hi Ryan,
Sorry for not getting back to you straight away, I've been in meetings all day.
You've given me some excellent ideas again!!
Just to clarify, the old URLs are in the following format:
www.url.com/resort_hotels/hotels_in_rome.asp
I am aiming to use the following structure for the new website:
or
I was wondering if you knew from a search engine perspective, which URL is the better option. From a user perspective, I would assume the second.
I am operating under the assumption that Google rates a URL's importance by the number of clicks it is from the homepage, not the number of directories (www.url.com/.../.../...) within the URL. Is that correct?
If this is the case I will probably go for the second URL structure, but place links higher up the hierarchical structure of the site for the more important locations.
Unfortunately, the landing pages for the cars and flights house exactly the same content with just the location text tweaked. There is nothing else unique on these pages, which is why I find myself with no other option but to get rid of them.
I really like your idea of testing landing pages for a specific area. This may be a good way to go, but creating two paragraphs of text for both the flight and car hire pages is not an option at this time. With 40,000 locations we’d need to produce 160,000 paragraphs of unique text, which would cost around $400,000, maybe slightly less with bulk discounting.
If I was to spend that much money on content writing, I would probably expand the hotel side of the site as this is most profitable. But my priority after the launch of the new site is an extensive link building campaign to assist the transition.
Thanks so much again for your help Ryan, you're a star!
Do you know whether Google rates a URL's importance by the number of clicks it is from the homepage rather than the number of directories (www.url.com/.../.../...) within the URL? It is really important that I find this one out!
Take care buddy,
Nick
-
Nick,
Sounds like you have a good strategy. I only have two additional items to share based on your latest reply.
www.url.com/resort_hotels/hotels_in_rome.asp
That URL seems a bit spammy to me. Mentioning "hotels" twice is something I would avoid. I would consider something along the lines of the options below instead:
www.url.com/resorts/hotels_in_rome
www.url.com/resort_hotels/rome
I also wanted to talk about the landing pages for cars and air travel once more. Before redirecting all your current pages to a generic page, I would take a look at the existing 140 pages and ask once again: do any of them have anything unique that can be used for the location-based car and air landing pages?
Your plans are to develop these pages with quality content over time, which is great. I hate the idea of establishing pages for each area, pulling back to one generic page, then expanding again to location-based pages.
If you sincerely intend to develop these pages over a reasonable time period, I would suggest establishing one page for each location even if it is thin on content to start with: driving directions, local driving laws, testimonials, anything that can be used as a starting point to hold your footing.
If you do pull back to a generic "car rentals" page, I have two ideas. Build out your location landing page for one area such as London. Closely watch your conversion rates for users on the London page versus the generic page. If there is a significant difference, it may help speed up your transition. If you realize you are losing $$ every day you don't have those pages, then perhaps you can hire additional help to speed up the process.
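That comparison is easy to sanity-check with numbers. A minimal sketch of the idea using a standard two-proportion z-test; all figures and page names here are hypothetical, not data from the site discussed:

```python
from math import sqrt

# Compare conversion rates on a built-out location page (e.g. London)
# versus the generic page. A |z| above ~1.96 suggests the difference is
# real at roughly the 95% confidence level. All numbers are hypothetical.

def two_proportion_z(conv_a, visits_a, conv_b, visits_b):
    """z-score for the difference between two conversion rates."""
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    return (conv_a / visits_a - conv_b / visits_b) / se

# e.g. 50 conversions from 1,000 visits vs 30 conversions from 1,000 visits
z = two_proportion_z(50, 1000, 30, 1000)
```

If the built-out page wins convincingly, that is the business case for funding the remaining location pages sooner.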
The final idea would be to build country-based landing pages for car rentals as a stop-gap measure. Your Milan, Rome, etc. pages could all direct to "Cars Italy" and "Air Italy".
There are tons of choices on the internet for travel providers. You have an extremely well established user base. My top concern for any migration is to maintain all my existing relationships. Some travel sites do great with a single landing page for air/cars/hotels. It sounds like your site has catered to clients in a specific way, and I would be sensitive to maintaining your current user experience.
One last idea that just came to me: after the migration, poll users for feedback. Take surveys, offer discounts, generate hype, but engage users because they will offer a different point of view which you may not have considered.
-
Ryan, you have given me some excellent ideas here and a great overall structure to make the transition between sites. I can't thank you enough for your help. I will certainly consult an SEO before proceeding with anything, but your insight has given me a lot to think about.
With regard to the site's current pages, the majority of locations only have three pages: Hotels, Car Hire & Flights. It is the number of locations covered that makes the site so expansive.
So with Hotels being our biggest earner, my idea going forward was to:
- Use the travel guide's unique content for the hotel landing pages, e.g. [Hotels in Rome]
- Redirect all of the old Car & Air location pages to the new website’s generic Car Hire & Flights pages.
This would mean that there wouldn’t be any location-based pages for Flights and Car Hire. The idea would be to build these up gradually as it would take some time and money to add the unique content required.
- From every hotel landing page we would use anchor text to promote the generic Flights and Car Hire pages. For example, [Buy Cheap Flights] or [Cheap Car Hire]
This additional anchor text should help our external link building and the generic Flights and Car Hire pages would house a search form for users to search any location.
So essentially, the majority of the site would be made up of Hotel landing pages, until we began building the site further.
I can see that your main concern is that the correct redirects are in place.
The site currently has the following URL structures, with locations for each:
Apart from the sitemaps, each has locations within it, for example:
www.url.com/resort_hotels/hotels_in_rome.asp
So my idea is to:
1) Redirect all “resort_hotels” URLs to their relevant hotel page on the new website. For example,
www.url.com/resort_hotels/hotels_in_rome.asp
will go to the “Hotels in Rome” page on the new website.
2) Redirect the rest of the pages to the home page for their category. For example,
will go to “Flights” home page on the new website; and,
will go to “Car Hire” home page on the new website, etc.
Unless there is something really wrong with this strategy, or you have any instant criticism, I would like to thank you for your help again and ask that if you need anything, please don’t hesitate to drop me a message on here. You have given up enough of your time and I’m more than grateful.
Kind Regards,
Nick
[I am using my work’s account, which is why I am displayed as Steve]
-
The transition I mentioned would allow for a smoother migration process rather than a "cold turkey" switch from the old site to the new site. You clearly recognize the end goal is to create your new site and delete the old site. The good news is that the change does not have to happen overnight.
You can build out your new site completely and go live with it. At that point you would update any external links you control along with your advertisements, signatures, etc. You would also want to reach out to partners and any sites with links that you can influence. Update those links so they point to your new pages.
The final step is the redirection of your 140k page old site to the appropriate pages on the new site. Clearly you wish to begin with the most prominent pages such as your landing pages along with any important pages such as "Contact Us", your reservation system, etc.
The next step would be applying your redirect rules to the remaining pages. Extensive testing will be required.
You should set up GA or another tracking tool to monitor your old site. You will want to closely monitor activity for quite some time. Specifically, look for any issues with 404s and redirect chains.
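A rough sketch of that monitoring step, assuming you can export your redirect rules as old-URL to new-URL pairs (the URLs below are hypothetical placeholders):

```python
# Walk each redirect chain and flag what a crawler would complain about:
# loops, targets that 404, and multi-hop chains that should be collapsed
# into a single 301. `redirect_map` is {old_url: new_url}; `live_urls`
# is the set of pages that actually resolve on the new site.

def audit_redirects(redirect_map, live_urls, max_hops=5):
    problems = {}
    for start in redirect_map:
        url, hops = start, 0
        while url in redirect_map and hops < max_hops:
            url = redirect_map[url]
            hops += 1
        if url in redirect_map:
            problems[start] = "possible redirect loop"
        elif url not in live_urls:
            problems[start] = "redirects to a missing page (404)"
        elif hops > 1:
            problems[start] = "chain of %d redirects; point straight at %s" % (hops, url)
    return problems
```

Running something like this against the full 140k URL list after each batch of redirect rules goes live catches chain and 404 problems before Google does.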
With respect to your anchor text, I suspect it was used to sculpt your site so your link value was focused on a particular page for each topic. When you have 140 pages on a given topic, you can pursue an incredible number of longtail phrases. Now I suspect you may have 4 pages for each area: Rome, Rome by Air, Rome by Car, and Rome Hotels. If that is the case, your future anchor text linking will be a lot more straightforward.
I want to say "I wouldn't be concerned about the anchor text," but you have a major project ahead of you, you are highly dependent on SEO, and there are many opportunities for something to go wrong. In that context, anchor text belongs on the list of things to think about, but getting the redirects right is a much larger concern.
A final thought I would offer: this is all high-level, generic advice. I would recommend hiring an SEO who could offer a proper evaluation of your site along with a migration plan. Once the change has been completed and tested, you should gain many advantages from your new site. Hopefully they will offset any loss from the migration. Once you are confident in your new site, I would recommend an SEO campaign promoting it.
-
Hi again Ryan,
All the URLs are currently coded as .asp (www.url.com/Rome.asp) and we aim to build the new site with user-friendly permalinks (www.url.com/Rome). So in answer to your question, yes, the sites could co-exist.
I hadn't thought of doing it this way; what a great idea.
With regards to the site's internal linking structure, I'm probably not explaining myself correctly. I understand that all of the site's juice needs to be recycled, but many of the 120,000 pages contain anchor-text links to other relevant parts of the site. Will removing these links, given how many of them there are, ruin the site's authority?
In addition, I would be really interested to hear your ideas on staging a transition.
I can't thank you enough for this Ryan, my head's spinning at the moment!
-
You are on the right track. The link value from your existing pages must be saved.
Prior to offering a further reply, I would like to ask a couple of questions:
-
How are your URLs currently coded? As an older site, I presume your page URLs end in .asp?
-
Will your new design also be built in ASP?
What I am trying to determine is whether the new site will require new URLs. If your current page is /rome.asp and the new page will be /rome.php, then the URL will change, meaning your new site and old site can co-exist on the same domain. This is helpful for staging a transition.
P.S. My recommendation for URLs would be to use friendly URLs which do not show an extension (e.g. /rome), but that is not the present focus.
-
Thanks for a swift answer Ryan, very helpful indeed!
Put simply, the site is split into three key areas, Hotels, Flights & Car Rentals, each with about 40,000 pages. The problem is that each of these pages uses a generic paragraph or two that is more or less the same, tweaked slightly for the location in question. For example,
"Our goal is to provide the best choice of hotels in Rome."...
"Our goal is to provide the best choice of hotels in Barcelona."...
Obviously Google sees this as duplicate content, and rightly so, but short of rewriting 120,000 pages of content, I can't see an alternative to removing the pages in question.
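For what it's worth, templated duplication like this is easy to quantify before deciding what to cull; a quick sketch using Python's standard library (the example sentences are the ones quoted above):

```python
from difflib import SequenceMatcher

# Pages that differ only by location score near 1.0 on a simple
# similarity ratio - a handy triage signal when deciding which of the
# 120,000 pages contain anything unique worth saving.

def similarity(a, b):
    return SequenceMatcher(None, a, b).ratio()

rome = "Our goal is to provide the best choice of hotels in Rome."
barcelona = "Our goal is to provide the best choice of hotels in Barcelona."
# similarity(rome, barcelona) is close to 1.0, i.e. a near-duplicate
```

Anything scoring well below that threshold against its siblings is a candidate for keeping rather than redirecting.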
The site has so many quality links going into it, from authorities all over the web; it would be a shame to waste this juice on pages that are getting penalised.
The travel guide areas are all unique; there is a single guide for each of the 1000 destinations. For example,
http://www.url.com/guides/rome
My idea was to use this unique content to promote our hotel pages, for example,
http://www.url.com/hotels-in-rome
This page would have the unique travel content from that area plus a list of the hotels we have available in Rome.
Every other duped page on the site relating to "Rome" does have "Rome" in its URL, so a regular expression could be used to redirect all "Rome"-themed pages to the "Hotels in Rome" page that would house the unique content.
All other pages that did not have unique content written about them could be redirected to the generic Hotels, Flights or Car Rentals pages as they all have either “hotels”, “flights” or “car-rental” in their URL.
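The mapping described in the last two paragraphs can be prototyped before committing to server-side rewrite rules. A sketch with hypothetical URL shapes (the real patterns would depend on the site's actual URL structure):

```python
import re

# Old pages for locations that have a travel guide go to that location's
# new hotels page; everything else falls back to its generic category
# page. GUIDE_LOCATIONS stands in for the ~1,000 guide destinations.

GUIDE_LOCATIONS = {"rome", "barcelona"}

def map_old_url(path):
    m = re.match(r"^/resort_hotels/hotels_in_([a-z_]+)\.asp$", path)
    if m and m.group(1) in GUIDE_LOCATIONS:
        return "/hotels-in-%s" % m.group(1).replace("_", "-")
    for keyword, target in (("hotels", "/hotels"),
                            ("flights", "/flights"),
                            ("car-rental", "/car-hire")):
        if keyword in path:
            return target
    return "/"  # last resort: the home page
```

Once the mapping behaves correctly on a sample of real URLs, the same patterns translate into a handful of server-side redirect rules.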
The site is over 10 years old, is written in .asp and is managed with a bespoke piece of software created specifically for the site itself.
However, this doesn't really matter as we’re having the website redesigned at the same time as removing the dupe content and it will be built to our own specification.
My idea is to begin building up these locations from when the redesign goes live. This way I could keep track of our content as it expands.
My main worry is that culling these pages will remove 99% of our internal linking structure, and I'm wondering if removing this will dramatically reduce the authority of the site. However, at this point I’m struggling to see another option.
Sorry for the length of this reply, any ideas would be welcome, I just thought it would be best if you knew a bit more of the background.
Thanks again Ryan!
-
Hi Steve.
I would suggest taking a good look at your content pages. I understand having dupe pages, but you are suggesting a possible 140:1 ratio, which is... wow.
As I am sure you are aware, there is not going to be any quick and easy fix. Here are some initial thoughts:
The first step I would take is to look for ANY commonalities between pages that you can grab. For example, you mention 1000 travel guides. Are all the travel areas unique? For example, is there only one guide for Rome? If so, do all the duplicate Rome pages have "Rome" in the URL? If so, you can consider adding a regex rule to your htaccess file (presuming you are on an Apache server) which could cover your 301s.
Is there any unique code on the pages which are common for a given guide? Using the above example, do all the "Rome" pages have "Rome" in the page title? If so, you could possibly update all the "Rome" pages by adding the correct canonical to the page's meta information. At this point it would truly depend on how your site is coded. Do you have a CMS? In what language are your pages written?
The bottom line: there is simply too much value in those pages to discard them. The preferred method is to properly 301 each of them. The 301s really need to be handled with regular expressions which cover a large number of pages at once. You could not use individual redirects even if you wanted to, as doing so would cripple your web server.
I would not redirect all the pages to a single home page unless every other opportunity was completely explored.
-
Wow, that's a big, bold move! I don't know how to answer it, but if I were you I'd wait until you get a few nice, comprehensive answers on here before doing anything too drastic. Either that or use a private Q&A question to SEOmoz staff if you have any points spare to do so. With such a large change, you want to ensure you're doing it right.
I'll be interested to see the answers you get for this.