Disallow a spammed sub-page in robots.txt
-
Hi,
I have a sub-page on my website with a lot of spam links pointing to it. I was wondering whether Google will ignore those spam links if I hide the page using robots.txt.
Will that take the page off Google's radar, or is it useless?
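For reference, a blocking rule for a hypothetical path like /spammed-page/ would look like the snippet below. One caveat worth knowing: robots.txt only blocks crawling, not indexing, so a disallowed URL can still show up in Google's index if external links point to it. Python's standard-library `robotparser` can sanity-check that a rule does what you expect:

```python
from urllib import robotparser

# Hypothetical robots.txt blocking the spammed sub-page for all crawlers
rules = """User-agent: *
Disallow: /spammed-page/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The blocked path is no longer fetchable; the rest of the site is unaffected
print(rp.can_fetch("Googlebot", "https://example.com/spammed-page/"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/other-page/"))    # True
```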
-
Does it rank for anything worthwhile?
Does it have any legitimate / valuable links pointing to it?
If the answer is no to both of those questions, just delete the page, recreate it at a new URL, and request removal of the old URL from Google's index (and obviously don't 301 redirect it).
-
Hi, my personal opinion is that if the links were unintentional or not built by you, Google will ignore them and not penalise your site (see Rand's Whiteboard Friday video on negative SEO).
However, if it's a page that isn't very important to you, then maybe you should consider removing it from Google's index (use GWT for this) and then getting Google to index a new page that has no spam links pointing to it.
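One note on the mechanics (the tag below is generic, not specific to any site): the GWT removal tool only hides a URL temporarily. For the page to stay out of the index it needs to either return a 404/410 or serve a noindex directive, and Google can only see that directive if the page is *not* blocked in robots.txt:

```html
<!-- On the page you want deindexed; it must remain crawlable for Google to see this -->
<meta name="robots" content="noindex">
```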