Recommendation for a penalty-removal company?
-
We have a manual penalty (not sitewide) that we've been trying to remove, but our reconsideration requests keep getting denied. We also no longer have the manpower to manually check backlinks, contact domain owners, etc. Does anyone have a recommendation for a company to use?
-
Any SEO agency should be able to do this for you. We do a one-off removal for clients for a set fee: they pay once, and after that we do all the work for removal. If the request is denied, we resubmit and get it taken care of. That's something you should look for - make sure the agency doesn't charge per attempt, because if they don't get everything the first time, you'll be paying again.
(You don't want someone who is overly aggressive with the disavow, either, or you'll lose the good links as well.)
-
1. Unnatural links. We keep finding obviously paid-for links, asking for them to be removed, and disavowing them.
2. Unnatural links.
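Since the thread keeps coming back to disavowing links, here is a minimal sketch of generating a file in Google's disavow format (one `domain:` entry or URL per line, `#` for comments). The domains and URL below are hypothetical placeholders, not real spam sources:

```python
# Sketch: building a Google disavow file from lists of bad link sources.
# All domains/URLs here are made-up examples for illustration.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/low-quality-post"]

lines = ["# Disavow file - unnatural links we could not get removed"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow an entire domain
lines += bad_urls                              # or individual URLs
disavow_file = "\n".join(lines) + "\n"

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write(disavow_file)
```

The resulting `disavow.txt` is what gets uploaded through Google's Disavow Links tool; the `domain:` form is usually preferred over individual URLs when a whole site is low quality.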
-
1. Does the reconsideration request give a specific reason why it repeatedly gets denied?
2. What was the original manual action for?
Related Questions
-
Guest blogging penalty
We would like to receive a guest blog post on our blog that links to the guest's website, and vice versa: a link from their blog to our website. Does this put us at risk under Google's "guest blogging" guidance? We also have natural link exchanges with our partners: website-to-website links from a partners page.
White Hat / Black Hat SEO | vtmoz
-
SEO Template Recommendations - example provided but would welcome any advice
Hi there, I'm trying to improve the templates used on our website for SEO pages aimed at popular search terms. An example of our current page template is as follows: http://www.eteach.com/teaching-jobs Our designers have come up with the following new template: http://www.eteach.com/justindaviesnovemeber I know that changing successful pages can be risky. One concern is putting links behind jQuery, where the 'More on Surrey' link is. Does anyone have any strong suggestions or observations about our new template? Especially through the eyes of Google! Thanks in advance Justin
White Hat / Black Hat SEO | Eteach_Marketing
-
Recovering from a Penguin Penalty
We have a big issue with a website that was hit hard by Penguin on October 4th. Despite many attempts to remove bad links and two disavow files, none of our actions has improved our situation. We're wondering whether this approach might work:
1. Change the domain name but keep the same content.
2. Skip Webmaster Tools and 301 redirects, and wait until the new site is fully indexed.
3. Build new links.
Please tell us your opinion and solution. Thanks
White Hat / Black Hat SEO | webit40
-
Manual Penalty Question
Hello dear Moz community, I have communicated this problem before, but it has now reached a point where I have to make some hard decisions, and I would like your help. One of our new accounts (1 month old) got a manual penalty notification from Google a few weeks ago for unnatural link building. I went through the whole process, did a link detox and analysis, and indeed there were lots of blog networks existing purely for cross-linking. I removed these, and the link count decreased dramatically. The company had around 250,000 links, and truth be told, if I go by the book, only 700-800 of them really provide value. They will end up with roughly 15,000-20,000 left, which I acknowledge is a lot, but some come from Web 2.0 properties such as Blogger, WordPress, etc. Because the penalty applied to some pages and not the whole website, I removed the links I identified as harmful, brought the anchor text down to normal levels, and filed a very detailed reconsideration request and disavow file. I have no response so far from the webspam team, but here is where my concerns begin: Should I go for a new domain, losing 230,000 links? How can there even be a "reconsideration" for a website with 85% of its link profile being cross-links to self-owned directories and Web 2.0 properties? If I go for a new domain, should I redirect? Or should I keep the domain, keep cleaning, and add new quality links for a fresh approach? Thanks everyone in advance!
White Hat / Black Hat SEO | artdivision
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). We are an IT and management training course provider with 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic, being crawled, and ranking well within Google. We ran a very small-scale test of this in our Washington, DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. With 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use. For example, we could make the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we just pass the venue code to the page to dynamically build it based on that code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes, for all cities, for all topics - over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
-
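The engineer's randomized-variant scheme can be sketched in a few lines. This is only an illustration of the idea, not the poster's actual system: the topic, city, and seed values are made up, and seeding on something stable per page (e.g. the venue code) keeps the chosen variant from changing on every render.

```python
import random

# Standardized paragraph templates with topic/city slots, as described
# in the question. The variants and example values are illustrative.
PARAGRAPH_VARIANTS = [
    "Our {topic} training is easy to find in the {city} area.",
    "Find your {topic} training course in {city} with ease.",
]

def build_intro(topic: str, city: str, seed: int) -> str:
    # Seed the RNG on a stable per-page value (e.g. a venue code hash)
    # so each page always shows the same variant.
    rng = random.Random(seed)
    return rng.choice(PARAGRAPH_VARIANTS).format(topic=topic, city=city)

intro = build_intro("SharePoint", "New York, NY", seed=42)
```

Whether this avoids a duplicate-content problem is exactly the question being asked; the sketch only shows how the randomization would be wired up.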
Cross-linking websites of the same company: is it a good idea?
As a user, I think it is beneficial because those websites are segmented to answer each customer's needs, so I wonder whether I should continue doing it, or avoid it as much as possible if it damages rankings...
White Hat / Black Hat SEO | mcany
-
Why does Google recommend schema for local business/ organizations?
Why does Google recommend schema for local businesses/organizations? The reason I ask is that I was in the Structured Data Testing Tool, running some businesses and organizations through it, and every time it says this "information will not appear as a rich snippet in search results, because it seems to describe an organization. Google does not currently display organization information in rich snippets". Additionally, many times when you search for a restaurant or a related query, it will still show the telephone number, reviews, and location. Would it be better to list it as a place, since I want its reviews and location to show up? Thanks. I'd be interested to hear everyone else's opinions on this.
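One common answer to this is to mark the page up as a `LocalBusiness` subtype (e.g. `Restaurant`) rather than a bare `Organization`, since the subtypes carry the address, telephone, and rating fields the tool is looking for. Here is a minimal sketch that builds such JSON-LD; the business details are invented for illustration:

```python
import json

# Sketch: LocalBusiness-subtype JSON-LD instead of plain Organization.
# "Restaurant" is a schema.org LocalBusiness subtype; all values below
# are hypothetical placeholders.
business = {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Example Bistro",
    "telephone": "+1-555-010-0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "120",
    },
}
json_ld = json.dumps(business, indent=2)
# Embed on the page inside: <script type="application/ld+json"> ... </script>
```

Which rich-result types Google actually displays changes over time, so the testing tool's output, not this sketch, is the authority on what will render.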
White Hat / Black Hat SEO | PeterRota
-
Over-optimization penalty on the way
Matt Cutts has just announced that Google is bringing in a penalty for over-optimized sites, to try to reward good content. http://searchengineland.com/too-much-seo-google%e2%80%99s-working-on-an-%e2%80%9cover-optimization%e2%80%9d-penalty-for-that-115627?utm_source=feedburner&utm_medium=feed&utm_campaign=feed-main
White Hat / Black Hat SEO | AlanMosley