Penalty issues
-
Hi there,
I'm working on a site that has been badly hit by Penguin. The reasons are clear: exact-match blog network links and tons of spammy exact-match links such as comment spam, low-quality directories, the usual junk.
The spammy links mainly pointed to two pages, targeting keyword 1 and keyword 2.
I'd like to remove these two pages from Google, since they don't even rank in Google now, and create one high-quality page that targets both keywords, as they are similar.
The dilemma is that these spammed pages still get traffic from Bing and Yahoo, and it's profitable traffic. Is there a safe way to remove the pages from Google and leave them in place for Bing and Yahoo?
Peter
-
What about using this, Irving? Have you tried it before?
-
The problem with Google is that it's difficult to know whether it is a page-level penalty or an anchor text filter you are triggering through the exact-match anchor text abuse. You could try creating a new page for those keywords, but there is a chance Google will still stop any page from ranking well for those terms because of the anchor text (this has happened to me before). Let's hope Google follows Bing's lead and comes up with a link removal tool!
Worth a try though.
-
I don't think there is any way around that: the pages need to 404, or Google will re-index them because of all the links pointing to them. Even if you set up robots.txt to allow Bing and disallow Googlebot from crawling those pages, that only works when the crawlers come in from the homepage.
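For reference, a minimal sketch of the per-crawler robots.txt rules being described, with /keyword-1-page/ and /keyword-2-page/ as hypothetical placeholders for the two affected URLs:

# Placeholder paths - substitute the two pages in question
User-agent: Googlebot
Disallow: /keyword-1-page/
Disallow: /keyword-2-page/

# No restrictions for Bing
User-agent: Bingbot
Disallow:

Keep in mind that a Disallow only blocks crawling; a URL with many external links pointing at it can remain in Google's index anyway, which is why the 404 route is suggested above.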
-
My personal opinion is that Bing and Yahoo don't value those links at all. They may not be penalizing you for them, but they probably aren't boosting your rankings either.
Related Questions
-
Internal link is creating duplicate content issues and generating 404s in a website crawl.
Not sure of the best way to describe it, but the site is built with the Elementor page builder. We are finding that a feature included with its popup modal window renders the HTML for the popup trigger as an anchor that just reads "Click". When this is crawled, I think the crawler keeps following its own output, so the crawl of xyz.com/builder/listing/ returns something like this:

What we want:
xyz.com/builder/listing/

What we don't want:
xyz.com/builder/listing/%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9/
xyz.com/builder/listing/%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9//%23elementor-action%3Aaction%3Dpopup%3Aopen%26settings%3DeyJpZCI6Ijc2MCIsInRvZ2dsZSI6ZmFsc2V9/

You'll notice that the string in the href is appended each time, and it loops a couple of times. Could I 301 this? What's the best way to go about handling something like this? It's causing duplicate meta description/content errors for some of our listing pages. I did add rel='nofollow' to the anchor tag with JavaScript, but I'm not sure whether that will help.
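One option, assuming the looping popup URLs actually render the same content as the clean listing page, is a canonical tag in the listing template pointing at the clean URL, so the variants consolidate instead of competing; a hypothetical example using the placeholder domain above:

<!-- In the <head> of the listing template; href is the clean listing URL -->
<link rel="canonical" href="https://xyz.com/builder/listing/" />

A canonical won't stop a crawler from discovering the looping URLs, but it tells search engines which version to index, which addresses the duplicate meta description/content errors.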
Technical SEO | | JoseG-LP0 -
Fake 404 Issue
Hello, I just had a problem where my site started showing 404 errors for all of my WordPress pages and posts. Visually each page was loading with content, yet every page, and every tool including Google Webmaster Tools, reported the 404. I never found the cause, but I was able to move the site to a new host, restore from a backup, and it worked. Google picked up the issue on Jan 27th and removed all of the 404 pages from the index, and I lost most of my top rankings. I have since fixed the issue and was wondering whether Google would restore my rankings in a case like this? Regards, M
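As a side note, a quick way to check what status code a page is actually returning, independent of what renders in the browser, is a HEAD request from the command line (example.com is a placeholder here):

# Prints only the response headers, including the HTTP status line
curl -I https://www.example.com/some-page/

If that shows a 404 while the page visibly loads, the server is sending the wrong status code even though the content is fine, which would match what Webmaster Tools reported.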
Technical SEO | | thewebguy30 -
What is the best practice for fixing URLs that have duplicate content, non-static URLs, and other issues?
Hi, I know there are already several good answers on this site about duplicate content issues, but my question is about the best way to avoid a negative SEO impact if I change the URLs of an ecommerce site. A new client has the website http://www.gardenbeauty.co.uk, and I notice it suffers from duplicate content because of the http://www version and the non-www version of its pages; this seems quite easy to fix using the guidance on this site. However, the product page URLs are far from ideal, with several issues: (a) they are mostly too long, (b) they don't include the keyword terms (in terms of best practice), and (c) they aren't static URLs. An example of one of these product URLs is http://www.gardenbeauty.co.uk/plant-details.php?name=Autumn Glory&p_genus=Hebe&code=heagl&category=hebe. I'd like to address these issues, but the pages rank highly for the products themselves, so my question is: what would you recommend I do to fix the URLs without risking the high positions many of these product pages hold? Thanks, Ben
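On the www versus non-www duplication specifically, the usual fix is a site-wide 301 redirect; a rough sketch, assuming the site runs on Apache with mod_rewrite available (worth confirming with the host before using anything like this):

# Sketch only - 301 every non-www request to the www version of the same URL
RewriteEngine On
RewriteCond %{HTTP_HOST} ^gardenbeauty\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.gardenbeauty.co.uk/$1 [R=301,L]

A hostname-level redirect like this consolidates the two versions without touching the product URLs themselves; changing the dynamic plant-details.php URLs would be a separate job and would need a 301 from each old URL to its new equivalent so the existing rankings can carry over.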
Technical SEO | | bendyman0 -
What online tools are best to identify website duplicate content (plagiarism) issues?
I've discovered that one of the sites I am working on includes content that also appears on a number of other sites. I need to understand exactly how much of the content is duplicated so I can replace it with unique copy. To do this I have tried tools such as plagspotter.com and copyscape.com with mixed results; nothing so far gives me a reliable picture of exactly how much of my existing website content is duplicated on third-party sites. Any advice welcome!
Technical SEO | | HomeJames0 -
Too many on-page links vs. UX issue
I am having an issue with many of our pages having too many on-page links. I have gotten many of them below the suggested limit of 100 links per page, and I understand this is not a critical SEO factor, but my issue is this: many important pages I am trying to optimize are buried at a third level that is not accessible from the home page navigation dropdown because of our outdated CMS. I am trying to decide whether we should develop the site to display these pages on hover from the main navigation. This would make a lot of sense, since users would find these pages more easily; however, adding this functionality would increase the number of on-page links considerably. So, in your opinion, would it be worth spending the money to have this functionality developed, or would it end up hurting our SEO standing?
Technical SEO | | isret_efront0 -
Duplicate content issues: I am running into challenges and am looking for suggestions for solutions. Please help.
I have a number of pages on my real estate site that display the same listings, even when narrowed down by specific features, and I don't want these to come across as duplicate content pages. Here are a few examples:

http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html?feature=waterfront
http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html

This happens to be a waterfront community, so all the homes are located along the waterfront. I could use a canonical tag, but not every community is like this, and I want the narrowed-down feature pages to get indexed. Here is another example that is a little different:

http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html
http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html?feature=without-pool
http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html?feature=4-bedrooms
http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html?feature=waterfront

All the listings in this community happen to have 4 bedrooms, no pool, and are waterfront, meaning they display for each of the narrowed-down categories. I could possibly set something up so that if the listings are the same, the page uses a canonical pointing to the main page URL, but in the next case it's not so simple. In this next neighborhood there are 48 total listings, as seen at http://luxuryhomehunt.com/homes-for-sale/windermere/isleworth.html, and because it is a higher-end neighborhood, 47 of the 48 listings are considered "traditional listings" - not exactly all of them, but 99%. Any recommendations are greatly appreciated.
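For the cases where a feature-filtered page shows exactly the same listings as its parent, the canonical approach mentioned above would look something like this on the filtered URL (hypothetical markup, using the Hanover Woods example):

<!-- On hanover-woods.html?feature=waterfront, when its listings match the main page -->
<link rel="canonical" href="http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html" />

Feature pages whose listings genuinely differ from the parent could keep a self-referencing canonical so they still get indexed in their own right.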
Technical SEO | | Jdubin0 -
Brand New Site Penalized?
I recently launched two completely separate and unrelated websites at the same time, both on new domains and new hosting accounts. Neither has any links. One is ranking for a branded search and the other is not. The interesting thing is that I tested both sites on the back end of my own server before launch. The site that is not ranking for its branded search IS still ranking for that branded search under the testing URL on my server. I have removed all of the test content and 301 redirected the testing URLs back to my portfolio page. Could this be due to Google indexing one but not the other? Does it have anything to do with testing on my server first, where my DA is higher than that of the new sites? Or is it something completely different that I'm missing? Is this a penalty?
Technical SEO | | CDUBP0 -
RSS Hacking Issue
Hi, I checked our original RSS feed - I added it to Google Reader and all the links go to the correct pages - but I have also set up the feed in FeedBurner. However, when I click on the links in FeedBurner (which should go to my own website's pages), they all go to spam sites, even though the title of the link and the excerpt are correct. This isn't a WordPress blog RSS feed either, and we are on a very secure server. Any ideas whatsoever? There is no info online anywhere and our developers haven't seen this before. Thanks
Technical SEO | | Kerry220