Too Many Internal Links?
-
Hi Guys,
I'm completing an overhaul of our website at the moment, after a certain penguin killed our site for our main keyword.
I'm currently working on our internal linking, as most of our blog posts have a link back to our home page using the main money keyword as the anchor text.
At present we have 3,331 internal links and our site has only 1,000 pages.
Can you get penalised for having too many internal links with exact-match anchors?
Thanks,
Scott
-
Hi Scott,
Let me be very specific about what we're talking about: no, you can't get penalized just for having a lot of internal exact-match anchor text, but you can hurt your rankings.
It's very common to have site-wide internal links, all with the same anchor text, all pointing to the same page - especially if they're in the navigation. For example, SEOmoz has literally 100,000 links pointing to its tools page with the anchor text "research tools".
That said, it's best to vary your anchor text as much as you can, and be careful where you place it. Site-wide anchor text in your footer is probably the least valuable spot (and potentially the most harmful), followed by the sidebar and header. The best anchor text is often in editorial links within the main body copy, using natural, non-repeating phrases.
Hmmm... now that I read your question again, it probably does look unnatural if every blog post has the same anchor text in it, pointing to your homepage. Best to clean that up.
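If you want to put numbers on it before you start cleaning up, a quick crawl can show exactly how concentrated your homepage anchor text is. Here's a minimal sketch - it assumes Python with the requests and beautifulsoup4 packages installed, that your posts are listed in /sitemap.xml, and example.com is just a placeholder domain:

```python
# Rough sketch: tally the anchor text of internal links that point at the homepage.
# Assumes: pip install requests beautifulsoup4, and that /sitemap.xml lists the blog posts.
import requests
from bs4 import BeautifulSoup
from collections import Counter
from urllib.parse import urljoin

SITE = "https://www.example.com"   # placeholder - swap in your own domain
HOMEPAGE = SITE + "/"

# Pull the list of pages to check from the XML sitemap (or paste in your own URL list).
sitemap = BeautifulSoup(requests.get(SITE + "/sitemap.xml").text, "html.parser")
urls = [loc.get_text() for loc in sitemap.find_all("loc")]

anchors = Counter()
for url in urls:
    page = BeautifulSoup(requests.get(url).text, "html.parser")
    for a in page.find_all("a", href=True):
        target = urljoin(url, a["href"])
        if target.rstrip("/") + "/" == HOMEPAGE:
            anchors[a.get_text(strip=True).lower()] += 1

# A healthy profile is spread out; one exact-match phrase dominating is the red flag.
for text, count in anchors.most_common(20):
    print(f"{count:5}  {text}")
```

If one phrase accounts for the bulk of those 3,331 links, that's the pattern worth diluting first.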
And check out this latest Whiteboard Friday on internal linking.
Hope this helps. Best of luck with your SEO!
-
Internally linking to the homepage is best practice, if anything.
Penguin related problems arise where there are keyword stuffing issues, or anything else that could be seen as webspam.
As well as looking at your internal link profile, it might also be a good idea to check your on-page use of the 'main money' keyword itself!
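As a quick sanity check on that last point, you can count how often the money phrase actually appears in a page's title and visible copy. A rough sketch along the same lines (requests and beautifulsoup4 assumed; the URL and phrase are placeholders):

```python
# Rough sketch: count how often the money phrase shows up in a page's title and visible text.
import re
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"     # placeholder
PHRASE = "main money keyword"        # placeholder - the exact-match phrase you're worried about

soup = BeautifulSoup(requests.get(URL).text, "html.parser")
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()                  # strip non-visible text before counting

title = soup.title.get_text(strip=True) if soup.title else ""
text = soup.get_text(" ", strip=True).lower()
words = len(text.split())
hits = len(re.findall(re.escape(PHRASE.lower()), text))

print("Title:", title)
print(f"'{PHRASE}' appears {hits} times in roughly {words} words of visible copy")
```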
-
Related Questions
-
Disavow links and domain of SPAM links
Hi, I have a big problem. For the past month, my company website has been scraped by hackers. This is how they do it:
1. Hack unmonitored sites and/or sites that are still using old versions of WordPress or other out-of-the-box CMSs.
2. Create spam pages with links to my pages, plus plant trojan horses and scripts to automatically grab resources from my server. Some sites were directly uploaded with pages from my site.
3. Create pages with titles, keywords and descriptions consisting of my company brand name.
4. Use the HTTP referrer to redirect Google search results to competitor sites.
What I have done so far:
1. Blocked the identified sites' IPs in my WAF. This prevented those hacked sites from grabbing resources from my site via scripts.
2. Reached out to webmasters and hosting companies to remove the affected sites. So far this hasn't been very effective, as many of the sites have no webmaster. Only a few hosting companies respond promptly; some don't even reply after a week.
The problem now: by the time I realized what was happening, there were already hundreds if not thousands of sites being used by the hacker. Literally tens of thousands of hacked or scripted pages carrying my company's brand title, keywords and description have been crawled and indexed by Google. Every day I am removing and disavowing, but there is just so much of it indexed by Google now.
Questions:
1. What is the best way forward for me to resolve this?
2. Disavow links and domains: does disavowing a domain mean all the links from that domain are disavowed?
3. Can anyone recommend an SEO company which has dealt with this kind of issue before and successfully rectified it?
Note: SEAGM is the company branded keyword.
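For reference on question 2 above: yes - a domain: entry in the disavow file asks Google to ignore every link to you from that domain, while a bare URL entry only covers links from that one page. The file itself is plain UTF-8 text, one entry per line, with # for comments; an illustrative sketch (all domains and URLs below are placeholders):

```text
# Hacked/scraper sites - disavow the whole domain
domain:spammy-hacked-site.example
domain:another-scraper.example

# Individual URLs where only one page is a problem
http://mostly-fine-site.example/hacked-page.html
```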
Technical SEO | ahming7770
-
Sitemap international websites
Hey Mozzers, here is the case I would appreciate your reply on: I will build a sitemap for a .com domain which has multiple domains for other countries (like Italy, Germany etc.). The question is: can I put the hreflang annotations in sitemap 1 only, and have a sitemap 2 with all URLs for the EN/default version of the website (.com), then put the 2 sitemaps in a sitemap index? The issue is that there are pages that go away quickly (like in 1-2 days); they are localised, but I prefer not to give annotations for them, as I want to keep the lang annotations in sitemap 1 clean. That way I can replace only sitemap 2 and keep sitemap 1 intact. Would it work, or am I better off putting everything in one sitemap?
The second question is whether you recommend doing the same exercise for all subdomains and other domains. I have read a lot on the topic, but I'm not sure whether it's worth the effort.
The third question: if I have www.example.it and it.example.com, should I include both in my sitemap with hreflang annotations (the sitemap on www.example.com), using "it" for the subdomain and "it-it" for the .it domain (to specify lang and lang + country)?
Thanks a lot for your time and have a great day,
Ani
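For reference, the mechanics being described look roughly like this: a sitemap index pointing at both files, with the hreflang annotations carried as xhtml:link entries inside the localised sitemap. A simplified sketch with placeholder URLs (two separate files shown in one block, not complete documents):

```xml
<!-- sitemap-index.xml: points at both child sitemaps -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-hreflang.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-en.xml</loc></sitemap>
</sitemapindex>

<!-- sitemap-hreflang.xml: each <url> lists itself plus all of its alternates -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/page/"/>
    <xhtml:link rel="alternate" hreflang="it" href="https://www.example.it/page/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.de/page/"/>
    <xhtml:link rel="alternate" hreflang="x-default" href="https://www.example.com/page/"/>
  </url>
</urlset>
```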
Technical SEO | SBTech0
-
Problems with too many indexed pages
A client of ours has not been able to rank very well for the last few years. They are a big brand in our country, have more than 100 offline stores and have plenty of inbound links. Our main issue has been that they have too many indexed pages. Before we started, they had around 750,000 pages in the Google index. After a bit of work we got it down to 400-450,000. During our latest push we used the robots meta tag with "noindex, nofollow" on all pages we wanted to get out of the index, along with a canonical to the correct URL - nothing was done in robots.txt to block crawlers from the pages we want removed. Our aim is to get it down to roughly 5,000+ pages; they just passed 5,000 products + 100 categories. I added this about 10 days ago, but nothing has happened yet. Is there anything I can do to speed up the process of getting all these pages out of the index? The page is vita.no if you want to have a look!
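One thing that's easy to do while waiting on Google is to spot-check that the noindex tag really is being served on the pages in question. A minimal sketch, assuming Python with requests and beautifulsoup4 installed and a placeholder URL list:

```python
# Rough sketch: spot-check that pages meant to leave the index actually serve a noindex robots meta tag.
import requests
from bs4 import BeautifulSoup

URLS_TO_CHECK = [                          # placeholder sample of URLs you expect to be noindexed
    "https://www.example.com/some-filtered-page",
    "https://www.example.com/old-campaign-page",
]

for url in URLS_TO_CHECK:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", attrs={"rel": "canonical"})
    print(url)
    print("  status   :", resp.status_code)
    print("  robots   :", robots.get("content") if robots else "MISSING")
    print("  canonical:", canonical.get("href") if canonical else "none")
```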
Technical SEO | Inevo0
-
GWT shows 38 external links from 8 domains to this PDF - But it shows no links and no authority in OSE
Hi All, I found one other discussion about the subject of PDFs and passing of PageRank here: http://moz.com/community/q/will-a-pdf-pass-pagerank - but that thread didn't answer my question, so I am posting it here. This PDF: http://www.ccisolutions.com/jsp/pdf/YAM-EMX_SERIES.PDF is reported by GWT to have 38 links coming from 8 unique domains. I checked the domains and some of them are high-quality, relevant sites. Here's the list of domains and number of links:
prodiscjockeyequipment.com - 9
decaturilmetalbuildings.com - 9
timberlinesteelbuildings.com - 6
jaymixer.com - 4
panelsteelbuilding.com - 4
steelbuildingsguide.net - 3
freedocumentsearch.com - 2
freedocument.net - 1
However, when I plug the URL for this PDF into OSE, it reports no links and a Page Authority of only "1". This is not a new page - it's a really old page. In addition to that, when I check the PageRank of this URL, the PageRank is "nil" - not even "0". I'm currently working on adding links back to our main site from within our PDFs, but I'm not sure how worthwhile this is if the PDFs aren't being allocated any authority from the pages already linking to them. Thoughts? Comments? Suggestions? Thanks all!
Technical SEO | danatanseo
-
Are bad links the reason for not ranking?
Hello Moz community. I'm looking for some input from the experts on what could be wrong with a site I'm working on. The site is in Spanish, but I'm sure you'll get the idea. We want to rank the site on the first page of Google Mexico (www.google.com.mx) for the keyword "refacciones Audi" and some other brands (refacciones = replacement parts would probably be a good translation, just FYI).
Now, our page hasn't been completely optimized, so in my mind it's OK not to be on the first page yet. However, our main competitor is ranking on the first page for all the keywords we want to rank for, but when you check their site, you'll find there is hardly any content, no keywords are being used in their content, all pages have the exact same title and meta description, and their catalog is on a completely different domain. In short, no SEO whatsoever. Looking at Moz data, our site has a DA of 26, while our competitor's is 10. They have no external backlinks at all, while we have a few hundred.
This leaves me scratching my head: how can a completely non-optimized site outrank us? I decided to check our backlink profile, and a previous SEO agency seems to have built MANY fake blogs with lots of backlinks with rich anchor text. Quite a big percentage of our backlinks are of this kind, so this is the only thing I can think of that could be affecting our ranking. Will disavowing be our solution?
If you'd like to check, our site is www.refaccionariaalemana.com.mx and our competitor's is www.saferefacciones.com. ANY help will be extremely appreciated, as I feel a bit lost. Thanks!
Technical SEO | EduardoRuiz1
-
How to Break Up a Page with Too Many Links
My client has a live page with 100+ links subdivided into 10 categories that each have great potential keyword targeting opportunities. I'd like to improve this page, and my intuition is to split it into 11 pages: one page with links to all the others and a bit of content about each. Here's an example of the potential IA:
Dog Rescue Groups
Golden Retriever Rescue - description
Poodle Rescue - description
Cocker Spaniel Rescue - description
Poodle Rescue - description
Labrador Retriever Rescue - description
etc.
---------
Golden Retriever Rescue
Link 1 - description
Link 2 - description
Link 3 - description
Is this a good idea, and will I see a big traffic drop overall at first? Also, these are all internal links, not external.
Technical SEO | elenarox
-
Linking root domains and YouTube
All of my competitors have a high number of linking root domains from YouTube, but ours isn't showing up, even though we have 1.5 million views on YouTube. I tried adding our URL to the videos, but it hasn't been recognized as a linking root domain. What should I do? There's a ton of SEO juice here I want to tap into! watch?v=GTXFRTY4CCA&list=UUOcfF9LAHKedNSyk-gk5xDw&index=28
Technical SEO | tonymartin0
-
Competition links make no sense
Hello everybody, I used Open Site Explorer to check where my competitor has links, so I can try to get mine there too. However, I am extremely confused by the results. For example, the first link to my competitor, coming from a domain with authority 91, is a download file. The next one is a link from UPS, the courier service; when I click on it I get an "access denied". Another one comes from Samsung, and when I click on it, I download an swf file. The next one is fcc.gov, and it downloads a wp file. If I keep clicking on these links, I am going to end up with a virus or something and learn nothing about what my competitor does. Does anyone have a clue how they managed to get linked like that?
Technical SEO | polyniki0