Spammy link for each keyword
-
Some people believe that having a link for each keyword, and a page of content (300+ words) for each keyword, can help you rank for those keywords. However, the old approach of targeting "restaurant New York", "restaurant Buffalo", "restaurant Newark" and so on has come to be seen as terrible SEO practice. I don't know whether that's because it's spammy in itself, or because people usually combined it with thin content that was 95% duplicate.
Which brings us to:
Why does such a major company have the following on its site (see the footer)?
- Aberdeen Takeaway
- Birmingham Takeaway
- Brighton Takeaway
- Bristol Takeaway
- Cambridge Takeaway
- Canterbury Takeaway
- Cardiff Takeaway
- Coventry Takeaway
- Edinburgh Takeaway
- Glasgow Takeaway
- Leeds Takeaway
- Leicester Takeaway
- Liverpool Takeaway
- London Takeaway
- Manchester Takeaway
- Newcastle Takeaway
- Nottingham Takeaway
- Sheffield Takeaway
- Southampton Takeaway
- York Takeaway
- Indian Takeaway
- Chinese Takeaway
- Thai Takeaway
- Italian Takeaway
- Cantonese Takeaway
- Pizza Delivery
- Sushi Takeaway
- Kebab Takeaway
- Fish and Chips
- Sandwiches
Do they know something I don't?
[unnecessary links removed by staff]
-
I meant that Google still reads the link as "Newcastle" rather than "Scrap Car Newcastle" (i.e. it doesn't inherit the "Scrap Car" context from the parent list item). But the point that optimising the actual landing page for "scrap car Newcastle" is enough is a good one, and seems to be the best approach.
-
Where have you heard it can't? Google can read all the code on a site, including simple code like this. It's mainly heavy JavaScript or heavy Flash that it struggles with, and it's getting really clever at reading parts of that now too.
-
Yeah... which is why I wanted to know why a major brand like Hungryhouse thought it was okay. The answer is "they're wrong", I guess.
-
As per my previous reply above, it's too many exact-match anchor links that are causing the over-optimization problem. Remember, these are site-wide links, so every page carries this anchor text; if there are 1,000 pages, that's 1,000 internal exact-match anchors, which gets you hit by Google's latest over-optimization penalty (AKA Penguin).
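To put a number on that multiplication effect, here's a rough Python sketch (hypothetical footer markup and a hypothetical 1,000-page site, not Hungryhouse's real HTML; assumes the beautifulsoup4 package is installed):

```python
# Rough sketch of the multiplication effect described above.
# Hypothetical footer markup and page count, not Hungryhouse's real HTML.
# Requires: pip install beautifulsoup4
from collections import Counter
from bs4 import BeautifulSoup

footer_html = """
<div class="footer-links">
  <a href="/takeaway/london">London Takeaway</a>
  <a href="/takeaway/leeds">Leeds Takeaway</a>
  <a href="/takeaway/manchester">Manchester Takeaway</a>
</div>
"""

PAGE_COUNT = 1000  # assume a 1,000-page site, as in the reply above

soup = BeautifulSoup(footer_html, "html.parser")
anchor_texts = [a.get_text(strip=True) for a in soup.find_all("a")]

# The footer is site-wide, so each anchor text repeats on every page.
site_wide_totals = Counter({text: PAGE_COUNT for text in anchor_texts})
for text, total in site_wide_totals.items():
    print(f'"{text}": {total} internal exact-match anchors across the site')
```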
-
I would be very careful: misuse of nofollow can land you in trouble. Looking at your screenshot of the menu, the phrase "scrap cars" is repeated over and over, and there's no need for that. Remove "scrap car" from the menu, but keep the landing page title as "scrap my car in Bolton" (and then optimize your phrases in the meta tags); that would be better. Why not have "Scrap Car Locations" instead of "Location"?
You are falling into the trap of over-using internal anchors: if there are 1,000 pages, that's 1,000 internal exact-match anchors saying "scrap car Bolton" and so on.
The same applies to site-wide footer links like Hungryhouse's. They just need to change them to the way justeat.co.uk has it and, hey presto, ALL GOOD AGAIN (provided external links are not also over-used).
-
Do you know about "Penguin" link overoptimization problems?
-
I agree it does look like a schoolboy error... but do you think they'd avoid over-optimisation if they only used this style of footer on the homepage? Taking the homepage on its own, they've used the word "takeaway" 54 times, for (amusingly) 3% keyword density.
Hungryhouse don't appear to be ranking anywhere near as well as their budget (TV adverts, newspaper, etc.) would suggest, so I'd imagine they've probably been penalised somehow.
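For reference, here's a quick sketch of the arithmetic behind that density figure (illustrative text only, not scraped from the live homepage); 54 occurrences at roughly 3% implies a page of around 1,800 words:

```python
# Quick sketch of the keyword-density arithmetic quoted above.
# Illustrative text only, not scraped from the live homepage.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words) * 100 if words else 0.0

# 54 uses of "takeaway" at roughly 3% density implies a page of about 1,800 words.
sample_page = ("takeaway " + "word " * 32) * 54   # 1,782 words, 54 of them "takeaway"
print(f"{keyword_density(sample_page, 'takeaway'):.1f}% keyword density")
```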
-
That's indeed the right question.
-
Yes, that's the question! Similar problem on a client site.
I'm optimising the menu system for a scrap car recycling company. Unless I stick in a nofollow, the anchor text in each link in this navbar will be the "description" Google takes for the page. I'm trying to optimise each of the location pages so I might not need to do this.
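As a quick sanity check, a sketch like the one below (hypothetical menu markup, not the client's real navbar; assumes beautifulsoup4) shows which anchor text and rel values the menu actually exposes to a crawler:

```python
# Sanity-check sketch: list the anchor text and rel values a nav menu exposes.
# Hypothetical markup, not the client's real menu.
# Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

nav_html = """
<nav>
  <a href="/scrap-car-newcastle">Newcastle</a>
  <a href="/scrap-car-bolton" rel="nofollow">Scrap Car Bolton</a>
</nav>
"""

soup = BeautifulSoup(nav_html, "html.parser")
for link in soup.select("nav a"):
    rel = " ".join(link.get("rel", [])) or "(none)"
    # The visible text is the anchor text a crawler associates with the target URL.
    print(f'text="{link.get_text(strip=True)}"  href={link["href"]}  rel={rel}')
```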
-
So, here's the question... Is the big list of anchor-text links in the original post on this page dangerous for Hungryhouse?
-
Having a landing page for each area is good as long as it serves a purpose for users who are looking for something in that area. Look at this company, http://www.just-eat.co.uk/: they are flying, with results at the top of the engines for all their takeaway-town keyphrases. Look at the difference between the two sites, especially the footers. Just Eat have fewer links and use no exact-match anchors; they just mention the big towns without the keyphrase attached. This is the biggest schoolboy error Hungryhouse has made: exact-match anchors in the footer like this result in too many internal exact-match anchors, which puts you into over-optimization territory!
TUT! TUT!
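To illustrate the difference between the two footer styles, here's a minimal sketch (hypothetical anchor lists, not scraped from either site):

```python
# Minimal sketch comparing the two footer styles described above.
# Hypothetical anchor lists, not scraped from either site.
def exact_match_anchors(anchors: list[str], keyphrase: str) -> list[str]:
    """Return the anchors that contain the target keyphrase verbatim."""
    return [a for a in anchors if keyphrase.lower() in a.lower()]

exact_anchor_footer = ["London Takeaway", "Leeds Takeaway", "Manchester Takeaway"]
plain_town_footer = ["London", "Leeds", "Manchester"]  # towns only, no keyphrase

for label, anchors in [("exact-anchor footer", exact_anchor_footer),
                       ("plain-town footer", plain_town_footer)]:
    flagged = exact_match_anchors(anchors, "takeaway")
    print(f"{label}: {len(flagged)} exact-match anchors {flagged}")
```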