Best way to remove spammy landing pages?
-
Hey Mozzers,
We recently took over a website for a new client and discovered that the previous webmaster had been using a WordPress plugin to generate 5,000+ mostly duplicated local landing pages. The pages follow a pattern along the lines of "Best (service) provided in (city)".
I checked Google Webmaster Tools, and it looks like Google is already ignoring most of these spammy pages (only about 30 of the nearly 6,000 are indexed), and no manual webspam actions are being reported.
Should we just delete the landing pages all at once or phase them out a few (hundred) at a time?
Even though the landing pages are mostly garbage, I worry that lopping off over 95% of a site's pages in one fell swoop could have other significant consequences.
Thanks!
-
Hi Brian,
Good for you for discovering these. The process I would recommend looks like this:

1. Create a strategy for launching a set of new, excellent pages that cover the basics without needing to cover every possible combination, as these duplicate/thin pages are likely trying to do.

2. Launch your new pages.

3. Delete the old ones and say, "good riddance!"
I would fold the deletion of these pages into any other design and content changes you have planned for the site. Also, putting 301 redirects in place for all of the old URLs will minimize the 404s that would otherwise be created. Crawling software like Screaming Frog or Xenu can help you spot internal links pointing at these pages so you can update or remove them.
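If you want a quick sanity check before (or instead of) firing up a crawler, the link-spotting step can be sketched with Python's standard library alone. This is a minimal illustration, not a full crawler, and it assumes the plugin's pages share a recognizable URL slug (here a hypothetical "/best-" prefix; adjust the pattern to match the real site):

```python
# Minimal sketch: find internal links that point at the old landing pages.
# Assumes a hypothetical "/best-" slug shared by the plugin-generated URLs.
from html.parser import HTMLParser

SPAM_PATTERN = "/best-"  # hypothetical; replace with the real slug pattern

class SpamLinkFinder(HTMLParser):
    """Collects href values that point at the old landing pages."""
    def __init__(self):
        super().__init__()
        self.spam_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and SPAM_PATTERN in value:
                    self.spam_links.append(value)

def find_spam_links(page_html):
    """Return all landing-page hrefs found in one page's HTML."""
    finder = SpamLinkFinder()
    finder.feed(page_html)
    return finder.spam_links

# Example: one normal internal link, one pointing at an old landing page
html = '<a href="/services/">Services</a><a href="/best-plumber-in-chicago/">x</a>'
print(find_spam_links(html))  # ['/best-plumber-in-chicago/']
```

Run something like this over your sitemap or a saved crawl of the site's HTML, and you have a list of internal links to clean up before the old pages come down.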
Ultimately these pages are going to go away, whether through redesign or outright deletion, so don't let the rate at which you delete them impede other decisions as you move forward in your work on the site.
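For the 301 redirects mentioned above, here is one way they might look on an Apache host running WordPress (in the site's .htaccess, with mod_rewrite enabled). This is a sketch only: the "/best-" URL prefix and the service names are hypothetical, and the mapping assumes each city page should point back to its parent service category.

```apache
RewriteEngine On
# Hypothetical pattern: /best-plumbing-in-chicago/ -> /plumbing/ (permanent)
RewriteRule ^best-(plumbing|roofing|hvac)-in-[^/]+/?$ /$1/ [R=301,L]
```

If the generated URLs don't follow one tidy pattern, a plugin like Redirection (or a one-time bulk redirect map) may be the more practical route.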