Anchor text penalty doesn't work?!
-
How exactly does the anchor text penalty work? Keyword domains obviously can't over-optimize for their main keyword (for example, notebook.com for the keyword "notebook"), and a lot of non-keyword domains optimize heavily for their main keyword, especially in the beginning, to get a good ranking in Google (and it always works). Is there a particular point (a number of links) I can reach while optimizing for one keyword, after which I'm going to get a penalty?
-
Mix them up; you don't want your anchor text to be all the same:
Web developer
Web development
web dev
a local web developer
web design
website development
Looks a lot more natural
-
There's no set number for anything in SEO, but if you think it's too much, it probably is. It depends on your competition and what they are already doing (Google obviously likes what they're doing, or they wouldn't rank).
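If you want to make "if it feels like too much, it probably is" a bit more concrete, here is a minimal Python sketch that tallies the anchor text mix from a backlink export. The anchor list and the 30% flag are illustrative assumptions, not thresholds Google has ever published:

```python
from collections import Counter

# Hypothetical anchors pulled from a backlink report export;
# the data below is made up for illustration.
anchors = [
    "Web developer", "Web developer", "Web developer", "Web developer",
    "web dev", "a local web developer", "web design",
    "Web development", "website development", "Web developer",
]

counts = Counter(a.lower() for a in anchors)
total = len(anchors)

for anchor, n in counts.most_common():
    share = n / total
    # The 30% cutoff is an arbitrary sanity-check threshold,
    # not a published Google number.
    flag = "  <-- looks heavy" if share > 0.30 else ""
    print(f"{anchor:25} {n:2} ({share:.0%}){flag}")
```

If one exact-match anchor dominates the report like "web developer" does here, that's usually the cue to start varying it, whatever the absolute count is.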
Related Questions
-
Duplicate categories: how to make sure I don't get penalized for this
Hi there, how would I go about fixing duplicate categories? My products sell in multiple category areas and some overlap the others. How can I make sure that I don't get penalised for this? Each category and its content is unique, but my advisors offer different tools and insights.
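One common fix for products reachable through several category paths is to pick a single canonical URL per product and have every listing page declare it. A minimal Python sketch of that idea follows; the URL patterns and the canonical_url() helper are illustrative assumptions about a typical shop, not anything from the question:

```python
# Every category page that lists a product should emit the same
# <link rel="canonical"> target in its <head>.

def canonical_url(product_slug: str) -> str:
    """Return the one URL every duplicate listing should declare canonical."""
    return f"https://www.example.com/products/{product_slug}"

category_paths = [
    "/kitchen/knives/chef-knife-8in",
    "/gifts/cooking/chef-knife-8in",
]

for path in category_paths:
    slug = path.rsplit("/", 1)[-1]
    print(f'{path} -> <link rel="canonical" href="{canonical_url(slug)}" />')
```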
-
Why do these links violate Google's Quality Guidelines?
My reconsideration request was declined by Google. Google said that some of the links to my site (www.pianomother.com) are still outside its quality guidelines. We provide piano lessons and sheet music on the site. Three samples are given:

1. http://www.willbeavis.com/links.htm
2. http://vivienzone.blogspot.com/2009/06/learning-how-to-play-piano.html
3. http://interiorpianoservice.com/links/

The first one is obvious because it is a link exchange page, but I don't understand why the 2nd and 3rd ones are considered "inorganic links" by Google. The 2nd is a blog that covers various topics, including music, health, computers, etc. The 3rd is a page on a site that provides piano-related services; other piano-related resources, including my website, are listed on the page. Please help. Thanks. John
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).

So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each location has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city.

Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.

So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:

"Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, if we made the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, Mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
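To make the engineer's randomization idea concrete, here is a minimal Python sketch. The two template strings come from the question itself; the venue data is made up for illustration:

```python
import random

# Standardized paragraph formats, as the engineer suggests; one is
# picked at random per page render, then followed by the genuinely
# location-specific content (directions, course dates, ...).
templates = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

venues = [
    {"topic": "SharePoint", "city": "New York", "state": "NY"},
    {"topic": "SharePoint", "city": "Washington", "state": "DC"},
]

for venue in venues:
    paragraph = random.choice(templates).format(**venue)
    print(paragraph)
```

-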
Preparing for Penguin: delete 25 small blogs, or change their anchor text to branded?
Hello, this site has 80 root domains pointing to it; call it site X. 25 of them are tiny blogs the owner put up himself. The blogs consist of only 4 posts or so, where each post has 2 keyword anchor text links to one of his 4 sites: one link in each post goes to the home page and one goes to an internal page.

Let's concern ourselves with cleaning up the anchor text profile of site X. All blogs are on private registration. Half of the blog domain names are furniture related, and furniture is not relevant to this niche, but 3/4 to 3/5 of the content of each blog (2 paragraphs per post and 4 posts) is relevant.

My concern is that even though the anchor text is varied and there are only 2 links going out to site X per blog, none of it is branded, so I'm worried about Penguin-type updates. Should we change these to branded or delete them? We're working on content promotion for backlinks in case we have to delete these blogs, but it's a small budget. What should we do?
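For anyone auditing a profile like this, a rough branded-vs-keyword split is easy to script. A minimal Python sketch under assumed data follows; the brand token and the anchor list are made-up placeholders for site X:

```python
# Split an anchor list into branded vs. keyword anchors; anchors
# containing the brand token (ignoring case and spaces) count as branded.
brand_token = "sitex"

anchors = [
    "SiteX", "www.sitex.com", "oak furniture deals",
    "cheap garden furniture", "sitex reviews",
]

branded = [a for a in anchors if brand_token in a.lower().replace(" ", "")]
keyword = [a for a in anchors if a not in branded]

print(f"branded anchors: {len(branded)}/{len(anchors)}")
print(f"keyword anchors: {len(keyword)}/{len(anchors)}")
```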
-
Sitewide logo footer link - what's the risk?
Hi, an incredibly popular website with several thousand pages has offered me a site-wide footer logo link. The site this popular website would link to has 50 high-quality backlinks (and low volumes of traffic; it's a new site). I am tempted to say no because of the risk of a penalty, but then I started wondering whether a logo link poses the same penalty risk as a text link.
-
Content box (on-page content) and titles: Google over-optimization penalty?
We have a content box at the bottom of our website with a scroll bar and have posted a fair bit of content into this area (too much for on-page). Granted, it is a combination of SEO content (with links to our pages) and informative content, but with the over-optimization penalty coming around, I am a little scared that this will cause a problem for us.

I am thinking of adopting the approach of this website HERE, with the content behind a "more information" button that drops down. Would this be better, since it could be much more organised and we would be swapping in more helpful information than the current 50/50 split (SEO vs. helpful content)? Or will it be viewed the same, in which case we might as well leave it as is and lower the amount of repetition and links in the content?

Also, we sell printed goods, so our titles may be a bit over the top, but they are bringing us a lot of converting traffic. Again, I am worried about the new Google release. This is an example of a typical title (only an example, not our product page): Banner Printing | PVC Banners | Outdoor Banners | Backdrops | Vinyl Banners | Banner Signs

Thank you for any help with these matters.
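A quick way to gauge how repetitive a title like that is: count how often the core term appears across the pipe-separated segments. This Python sketch uses the example title from the question; the 2-segment cutoff is an arbitrary illustrative threshold, not a known Google limit:

```python
title = ("Banner Printing | PVC Banners | Outdoor Banners | "
         "Backdrops | Vinyl Banners | Banner Signs")

segments = [s.strip() for s in title.split("|")]
hits = sum("banner" in s.lower() for s in segments)

print(f'"banner" appears in {hits} of {len(segments)} segments')
if hits > 2:
    print("Probably over-optimized; consider trimming or varying the title.")
```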
-
Google turned me down, don't know why...
Hello, I'm experiencing a decrease in some of my keyword rankings. I'm aware of some things which could be responsible for it, so I'd like to ask you whether my thoughts are right and what to do about it.

1. I put backlinks leading to my website on a website I also own (they are on the same server), but nothing happened. Then I put other backlinks on this website, also leading to websites I own. Could Google have "punished" the websites I'm linking to?

2. I offered my content to another website which has higher authority. This content had been published on my website weeks before I put it on this other site. Could Google have punished me for "duplicate" content?

3. In the past, we outsourced our SEO, and the company responsible for it put backlinks leading to our website almost everywhere; those websites were focused on almost everything but our field (finance). But everything seemed to be fine until now.

4. A couple of days ago, I put our RSS feed on many RSS aggregators and submitted our website to many catalogs.

My website URL is www.penizenavic.cz. Could you help me out? 🙂 Thanks a lot, Petr
-
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community,

Our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting November 2010, and we have yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests, without luck, and I am now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try.

As quick background information:

- The sites in question offer sports results data and are translated into several languages. Each market (equals language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword and main-page focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.

As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.

Our questions are:

- Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
- What is the most likely cause of our penalty, given the background information? Since the drops started in November 2010, we doubt the Panda updates had any correlation with this issue.
- What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
- Are there any other factors/metrics we should look at to help troubleshoot the penalties?
- After all this time without resolution, should we move to two new domains and forward all content as 301s to the new pages? Or are there other things we should try first?

Any help is greatly appreciated. SEOmoz rocks. /T