Penalty issues
-
Hi there,
I'm working on a site that has been badly hit by Penguin. The reasons are clear: exact-match blog network links and tons of spammy exact-match links such as comment spam, low-quality directories, the usual junk.
The spammy links point mainly to 2 pages, targeting keyword 1 and keyword 2.
I'd like to remove these two pages from Google, since they don't even rank in Google now, and create one high-quality page that targets both keywords, as they are similar.
The dilemma I have is that these spammy pages still get traffic from Bing and Yahoo, and it's profitable traffic. Is there a safe way to remove the pages from Google while leaving them in Bing and Yahoo?
Peter
-
What about using this, Irving? Have you tried it before?
-
The problem with Google is that it's difficult to know whether you're dealing with a page-level penalty or an anchor text filter triggered by the exact-match anchor text abuse. You could try creating a new page for those keywords, but there's a chance Google will still stop any page from ranking well for that term because of the anchor text (this has happened to me before). Let's hope Google follows Bing's lead and comes up with a link removal tool!
Worth a try, though.
-
I don't think there is any way around that: the pages need to 404, or Google will reindex them because of all the links pointing at them. Even if you set up robots.txt to allow Bingbot but disallow Googlebot from crawling those pages, that only works when the crawlers come in from the homepage.
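For what it's worth, if you do experiment with crawler-specific rules, a robots.txt sketch would look like the following (the two paths are placeholders, not Peter's actual URLs). Keep in mind a Disallow only stops crawling; Google can keep a URL indexed from its inbound links alone, so serving an `X-Robots-Tag: noindex` header only to Googlebot is the more reliable way to drop a page from Google while leaving it open to Bingbot.

```
# Hypothetical robots.txt sketch: block only Googlebot from the two
# penalized pages (placeholder paths); every other crawler is unaffected.
User-agent: Googlebot
Disallow: /keyword-1-page/
Disallow: /keyword-2-page/

User-agent: *
Disallow:
```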
-
My personal opinion is that Bing and Yahoo don't value those links at all. They may not be penalizing you for them, but they probably aren't boosting your rankings either.
Related Questions
-
Internal Linking issue
So I am working with a review company and I am having a hard time with something. We have created a category which lists and categorizes every one of our properties. For example, a specific property in the category "restaurant" would be as seen below:
/restaurant/mcdonalds
/restaurant/panda-express
and so on. What I am noticing, however, is that our more obscure properties are not being linked to by any page. If I visit myurl.com/restaurant I see 100+ pages of properties, yet it seems only the properties on the first few pages are being counted as having links. So far the only way I have been able to work around this is by creating a page hidden in our footer called "all restaurants", which lists and links to every one of our properties. However, it isn't exactly user friendly, and I would prefer scrapers not be able to scrape all properties at once! Any suggestions would be greatly appreciated.
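One alternative to a single hidden "all restaurants" page is splitting the index into many small, crawlable pages (for example, one per first letter), so every property gets an internal link without one giant, scrape-friendly list. A rough sketch, assuming properties are identified by slug; the function name, URL prefix, and slugs below are illustrative, not the poster's actual setup:

```python
from collections import defaultdict
from html import escape

def build_az_index(slugs):
    """Group property slugs by first letter and emit one small HTML
    fragment per letter, so every property gets a crawlable internal
    link without one giant page. Returns {letter: html_fragment}."""
    groups = defaultdict(list)
    for slug in slugs:
        groups[slug[0].upper()].append(slug)
    pages = {}
    for letter, items in groups.items():
        links = "\n".join(
            f'<li><a href="/restaurant/{escape(s)}">{escape(s)}</a></li>'
            for s in sorted(items)
        )
        pages[letter] = f"<h2>Restaurants: {letter}</h2>\n<ul>\n{links}\n</ul>"
    return pages

# Example with placeholder slugs from the question
pages = build_az_index(["mcdonalds", "panda-express", "pizza-hut"])
```

Each letter page stays small enough to be user friendly, and the links are real anchors rather than one buried footer page.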
Technical SEO | HashtagHustler0
-
Robots.txt in subfolders and hreflang issues
A client recently rolled out their UK business to the US. They decided to deploy with 2 WordPress installations:
UK site - https://www.clientname.com/uk/ - robots.txt location: https://www.clientname.com/uk/robots.txt
US site - https://www.clientname.com/us/ - robots.txt location: https://www.clientname.com/us/robots.txt
We've had various issues with /us/ pages being indexed in Google UK, and /uk/ pages being indexed in Google US. They have the following hreflang tags across all pages: We changed the x-default page to .com 2 weeks ago (we've tried both /uk/ and /us/ previously). Search Console says there are no hreflang tags at all. Additionally, we have a robots.txt file on each site which links to the corresponding sitemap files, but when viewing the robots.txt tester in Search Console, each property shows the robots.txt file for https://www.clientname.com only, even though when you actually navigate to https://www.clientname.com/robots.txt you get redirected to either https://www.clientname.com/uk/robots.txt or https://www.clientname.com/us/robots.txt depending on your location. Any suggestions how we can remove UK listings from Google US and vice versa?
Technical SEO | lauralou82
-
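Two notes on the robots.txt/hreflang question above. First, crawlers only ever request robots.txt from the root of the host, so https://www.clientname.com/uk/robots.txt and /us/robots.txt are simply ignored; only https://www.clientname.com/robots.txt counts, which is why Search Console shows that one file for both properties (and redirecting it by visitor location makes what crawlers see unpredictable). Second, since the actual hreflang tags weren't included in the question, the snippet below is only an illustration of the usual pattern for a /uk/ and /us/ split, not the client's real markup:

```html
<!-- Illustrative hreflang set only; the client's actual tags were not
     shown in the question. Each page should list all of its alternates
     with absolute URLs, and the tags must be reciprocal on both sites. -->
<link rel="alternate" hreflang="en-gb" href="https://www.clientname.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.clientname.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://www.clientname.com/" />
```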
SSL redirect issue
Hi guys, I have a site that has some internal pages with SSL. Recently I noticed that if I enter https://mydomain.com, the URL is accessible but all the design is messed up. My site is on WordPress and I use the "Redirection" plugin for all my 301 redirects, so I added a new 301 redirect from https://mydomain.com to the actual URL version of my home page, http://mydomain.com. After doing that, my home page doesn't load at all. Does anybody know what is happening? Thank you for any advice!
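Without seeing the site it's hard to be sure, but a plugin-level redirect from HTTPS to HTTP on the homepage often breaks by looping. A minimal .htaccess sketch, assuming Apache with mod_rewrite and using the placeholder domain from the question, that only fires when the request is actually HTTPS (so it cannot loop):

```apache
# Hypothetical sketch: 301 the HTTPS homepage to its HTTP version at
# the server level. The HTTPS guard prevents a redirect loop.
RewriteEngine On
RewriteCond %{HTTPS} =on
RewriteRule ^$ http://mydomain.com/ [R=301,L]
```

As an aside, a messed-up design over HTTPS usually means assets are hard-coded with http:// URLs and get blocked as mixed content; fixing those references would let the whole site stay on HTTPS instead.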
Technical SEO | odmsoft0
-
Sitemap issue
How can I create XML as well as HTML sitemaps for my website (both eCommerce and non-eCommerce)? Is there any script or tool that helps with making a perfect sitemap? Please suggest.
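There's no single tool that fits every site, but generating both formats from one URL list is easy to script, and it keeps the two sitemaps from drifting apart. A minimal sketch (the URLs are placeholders; note the protocol caps XML sitemaps at 50,000 URLs per file):

```python
from xml.sax.saxutils import escape

def make_sitemaps(urls):
    """Build a minimal XML sitemap and a matching HTML sitemap page
    from one list of absolute URLs. Returns (xml_string, html_string)."""
    xml_items = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{xml_items}\n"
        "</urlset>"
    )
    html_items = "\n".join(
        f'<li><a href="{escape(u)}">{escape(u)}</a></li>' for u in urls
    )
    html = f"<ul>\n{html_items}\n</ul>"
    return xml, html

# Placeholder URLs; swap in the site's real page and product URLs
xml, html = make_sitemaps(["https://example.com/", "https://example.com/shop/"])
```

For an eCommerce site, the URL list would typically come from the product database or a crawl rather than being hand-maintained.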
Technical SEO | Obbserv0
-
Noindex nofollow issue
Hi, for some reason 2 pages on my website get noindex/nofollow tags from time to time and disappear from search engines. I have to log in to my Thesis WP theme, uncheck the boxes for "noindex" and "nofollow", and then update; in a couple of days my website is back up. Here is a screenshot: http://screencast.com/t/A6V6tIr2Cb6 Is there something in the Thesis theme that causes the problem? Even though I unchecked the box and updated, it still stays checked: http://screencast.com/t/TnjDcYfsH4sq I appreciate your help!
Technical SEO | tonyklu0
-
Duplicate content issue. Delete index.html and replace with www.?
I have a duplicate content issue. On my site the home button links to index.html rather than the www root. If I change it to the www root, will it impact my SERPs? I don't think anyone links to index.html.
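A server-level 301 from /index.html to the root is the usual fix here, and because a 301 passes most link equity, consolidating the two URLs generally helps rather than hurts rankings. A sketch for Apache/.htaccess (assuming that's the stack, which the question doesn't say):

```apache
# Hypothetical sketch: 301 /index.html to / without looping when the
# server internally serves index.html for the root (DirectoryIndex).
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.html[\s?] [NC]
RewriteRule ^index\.html$ / [R=301,L]
```

Updating the home button to link to "/" at the same time stops internal links from pointing at the old URL.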
Technical SEO | bronxpad1
-
Duplicate Content Issue: Does the Google/Moz Crawler Recognize Chinese?
Hi! I am using WordPress multisite, and the Chinese version of my website is at www.mysite.com/cn. Problem: I keep getting duplicate content errors within www.mysite.com/cn (NOT between www.mysite.com and www.mysite.com/cn). I have downloaded and checked the SEOmoz report and the duplicate_page_content list in the CSV file. I have no idea why it says the pages have the same content; they have nothing in common. www.mysite.com is the English version of the website, and the structure is the same for www.mysite.com/cn. I don't have any duplicate content issues within www.mysite.com. Question: does the Google crawler properly recognize Chinese content?
Technical SEO | joony20080
-
Facebook Like button issue
In looking through my top pages in Google Analytics, my #2 page (oddly enough) looked like this: "/?fb_xd_fragment=". Apparently this is because we added the Facebook Like button to many of our pages. I'm worried these hits badly skew pageview data and time on page: the average time on this page is 5 seconds, whereas the sitewide average is much higher. Further, it shows 9,000 pageviews coming from only 250 unique visitors. I'm sure this is messing with our SEO. Is there a fix for this? Should I even be worried about it? I heard I can remove it from my GA reporting, but I don't want it causing problems in the background. Please advise; my boss wants to keep the Facebook Like button on the pages, as it has brought us some good response. The page this is on is www.accupos.com. Maybe there's an alternate version of the Facebook Like button that we don't know about. I would appreciate any help on this. DM
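The lowest-risk fix is on the reporting side: adding fb_xd_fragment to the "Exclude URL Query Parameters" setting in the Google Analytics view keeps the Like button intact while collapsing those hits into the clean URL. A server-side redirect is also sometimes suggested; a hedged Apache sketch, to be tested carefully since Facebook's widgets load these fragment URLs in hidden iframes:

```apache
# One commonly suggested workaround (sketch only): 301 requests that
# carry the fb_xd_fragment parameter back to the clean URL, so they
# cannot register as a separate page in analytics.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)fb_xd_fragment= [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
```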
Technical SEO | DerekM880