Penalty issues
-
Hi there,
I'm working on a site that has been badly hit by Penguin. The reasons are clear: exact match blog network links and tons of spammy exact match links such as comment spam, low quality directories, the usual junk.
The spammy links pointed mainly to two pages, which were targeting keyword 1 and keyword 2.
I'd like to remove these two pages from Google, as they don't even rank in Google now, and create one high quality page that targets both keywords, since they are similar.
The dilemma I have is that these spammy pages still get traffic from Bing and Yahoo, and it's profitable traffic. Is there a safe way to remove the pages from Google and leave them in place for Bing and Yahoo?
Peter
-
What about using this, Irving? Have you tried it before?
-
The problem with Google is that it's difficult to know whether it's a page-level penalty or an anchor text filter you're triggering through the exact match anchor text abuse. You could try creating a new page for those keywords, but there's a chance Google will still stop any page from ranking well for those terms because of the anchor text (this has happened to me before). Let's hope Google follows Bing's lead and comes up with a link removal tool!
Worth a try though.
-
I don't think there is any way around that. The pages need to 404, or Google will reindex them because of all the links pointing to them. Even if you set up robots.txt to allow Bingbot and disallow Googlebot from crawling those pages, that only works when the crawlers come in from the homepage.
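For reference, a minimal robots.txt sketch of the user-agent-specific blocking described above; the two paths are hypothetical stand-ins for the keyword pages:

User-agent: Googlebot
Disallow: /keyword-1-page/
Disallow: /keyword-2-page/

User-agent: *
Disallow:

Bear in mind that a robots.txt disallow only stops crawling, not indexing, so the URLs can still surface in Google with that many links pointing at them. A commonly suggested alternative is a Googlebot-only meta tag on those two pages, <meta name="googlebot" content="noindex">, which Bing and Yahoo ignore; for Google to see that tag, the pages must not also be blocked in robots.txt.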
-
My personal opinion is that Bing and Yahoo don't value those links at all. They may not be penalizing you for them, but they probably aren't boosting your rankings either.
Related Questions
-
Is there a good free tool that will check my entire subdomain for mobile usability issues?
I've been using the Google tool and going page by page, and everything seems great. But I'd really like something that will crawl the entire subdomain and give me a report. Any suggestions?
Technical SEO | absoauto
-
SEO issue with loading product images into an iframe?
Hi there, Recently I modified the structure of my product page to load the images into an iframe instead of using the img tag directly. The reason is that I wanted product videos (YouTube) to be shown in the same iframe. My question is: if the attributes of the images are correctly set, from an SEO perspective, do you see any problem with that approach? I know Googlebot wasn't very good at crawling iframes in the past. Thanks a lot. Best regards.
Technical SEO | footd
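As a rough illustration of the setup described in the question above (all paths hypothetical), the product page embeds a separate gallery document instead of the images themselves:

<!-- On the product page: the gallery (images plus YouTube video) lives in its own document -->
<iframe src="/product-123/gallery.html" title="Product 123 gallery" width="600" height="400"></iframe>

<!-- Inside gallery.html: the img attributes can all be set correctly, but they belong to this framed document -->
<img src="/images/product-123-front.jpg" alt="Product 123, front view" width="600" height="400">

The attributes themselves are fine; the SEO question is that the images and their alt text now sit in gallery.html rather than on the product page that is meant to rank.
-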
Duplicated rel=author tags (x 3) on WordPress pages, any issue with this?
Hi, We seem to have duplicated rel=author tags (x 3) on WordPress pages, as we are using the Yoast WordPress SEO plugin, which adds a rel=author tag into the head of the page, and the Fancier Author Box plugin, which seems to add a further two rel=author tags toward the bottom of the page. I checked the settings for Fancier Author Box and there doesn't seem to be an option to turn rel=author tags off; we need to keep this plugin enabled as we want the two-tab functionality of the author bio and latest posts. All three rel=author tags seem to be correctly formatted, and the Google Structured Data Testing Tool shows that all authorship rel=author markup is correct; is there any issue with having these duplicated rel=author tags on the WordPress pages? I tried searching the Q&A but couldn't find anything similar enough to what I'm asking above. Many thanks in advance and kind regards.
Technical SEO | jeffwhitfield
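Roughly what the markup described above looks like (profile URL hypothetical):

<!-- Added to the head by the SEO plugin -->
<link rel="author" href="https://plus.google.com/1234567890/">

<!-- Added twice more by the author box toward the bottom of the page -->
<a rel="author" href="https://plus.google.com/1234567890/">Jeff</a>
<a rel="author" href="https://plus.google.com/1234567890/">Jeff</a>

All three point at the same profile; the question is whether repeating the same hint causes any problem.
-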
How to improve the ranking of a website again after being penalized by Google?
The ranking of our website has gone down in the past 2 months. The reason, I believe, is that we had more than 300,000 spammy comments posted on it (the website is based on WordPress), so Google treated it as an un-monitored forum and penalized it. We have deleted the older comments and new comments can no longer be posted. We need suggestions on what else we should do to rank better. Any advice would be very welcome.
Technical SEO | TGA123
-
Development Website Duplicate Content Issue
Hi, We launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live. In late Jan 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed the robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file. Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this: Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more. This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the sub domain. When I visit remove URLs, I enter dev.rollerbannerscheap.co.uk but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a sub domain, not a page. Can anyone help please?
Technical SEO | SO_UK
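For anyone in a similar spot, a minimal .htaccess sketch (assuming Apache with mod_rewrite) of the dev-to-live 301 redirect described above:

# On the dev site: send every dev URL to its equivalent on the live domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.rollerbannerscheap\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.rollerbannerscheap.co.uk/$1 [R=301,L]

Note that the redirect only helps if Googlebot is allowed to crawl the dev URLs; while dev.rollerbannerscheap.co.uk is blocked in robots.txt, Google can't see the 301s (or a noindex tag), which is why the leftover pages linger with the "blocked by robots.txt" description.
-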
Impact issues when switching from .com to .co.uk
Buon giorno from Wetherby, UK, 5 degrees C and rivers bursting their banks everywhere 😞 This site http://www.sandtoft.com/ has requested a switch in URL forwarding, in that they want the following to happen: when you enter the .com URL it forwards to the .co.uk domain (the opposite of what happens today, i.e. when you enter .co.uk it switches to the .com URL). So my question is please... "Will changing the .com URL to .co.uk via forwarding affect SERPs in any significant manner?" My view is the impact will be a minor dip in the SERPs followed by a recovery. Any insights welcome 🙂
Technical SEO | Nightwing
-
Having some weird crawl issues in Google Webmaster Tools
I am having a large number of errors in the not found section that are linked to old URLs that haven't been used for 4 years. Some of the URLs being linked to are not even in the structure that we used to use for URLs. Nevertheless, Google is saying they are now 404ing and there are hundreds of them. I know the best way to attack this is to 301 them, but I was wondering why all of these errors would be popping up. I can't find anything in the Google index searching for the link in quotes, and in Webmaster Tools the source these are being linked from shows as unavailable. Any help would be awesome!
Technical SEO | Gordian
-
"Too Many On-Page Links" Issue
I'm being docked for too many on-page links on every page of the site, and I believe it is because the drop-down nav has about 130 links in it. It's because we have a few levels of drop-downs, so you can get to any page from the main page. The site is here - http://www.ibethel.org/ Is what I'm doing just bad practice and should the drop-downs not give as much information? Or is there something different I should do with the links? Maybe a nofollow on the last tier of the drop-down?
Technical SEO | BethelMedia
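If the last-tier nofollow idea from the question above is worth testing, the markup change is just a rel attribute on those deepest menu links (URL hypothetical):

<!-- Last-tier drop-down item, excluded from the followed link count -->
<a href="/media/podcasts/archive-2011/" rel="nofollow">2011 Archive</a>

The higher tiers would stay as normal followed links.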