Backlinks to internal pages
-
Hi,
Our website of 3K+ pages currently has more backlinks pointing to internal pages (2nd and 3rd level) than to the homepage.
Just wanted to know if this is bad for rankings? Please share your thoughts. Thanks.
-
Hi Jeeyaul,
As long as the links pointing at the 2nd and 3rd level pages (and the site in general) are good quality, then I wouldn't worry about this too much. I certainly don't think it's bad for rankings, and it's actually more likely to help those 2nd and 3rd level pages rank better.
If, on the other hand, these links aren't great quality or could be seen as manipulative by Google, then that could be bad for rankings.
You could take a look at this blog post that I wrote late last year, which talks about what makes a good or a bad link:
https://moz.com/blog/the-anatomy-of-a-link
I hope that helps!
Paddy
-
A good rule of thumb for SEO is to write your pages for users, not search engines. So if it seems logical and beneficial to the reader to link to 2nd and 3rd level pages, it shouldn't be an issue. If it makes sense and improves usability, I would not worry about it.
Related Questions
-
Valid pages in GSC show 5x more pages than a site:domain.com search?
Hi mozzers, when checking the coverage report in GSC I am seeing over 649,000 valid pages https://cl.ly/ae46ec25f494 but when performing site:domain.com I am only seeing 130,000 pages. Which one is closer to the source of truth? Especially since I have checked some of these "valid" pages and noticed they're not even indexed.
Intermediate & Advanced SEO | Ty1986
-
Backlink audit - anyone know of a good tool for manually checking backlinks?
Hi all, I'm looking at how to handle backlinks on a site, and am seeking a tool into which I can manually paste backlinks - is there a good backlink audit tool that offers this functionality? Please let me know! Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Landing Page Drop Out
Hi, if a product page drops out of the organic rankings but you've made no changes, is there a good place to start in order to find out why? I feel like it's almost impossible. Thank you!
Intermediate & Advanced SEO | BeckyKey
-
Internal Linking - Can You Overdo It?
Hi, one of the sites I'm working on has a forum with thousands of pages, amongst thousands of other pages. These pages produce lots of organic search traffic: 200,000 visits per month. We're using a bit of custom code to link relevant words and phrases from various discussion threads to hopefully related discussion pages. This generates thousands of links, and up to 8 in-context links per page. A page could have anywhere from 200 to 3,000 words across one to 50+ comments. Generally, a page with 200 words would have fewer of these automatically generated links, simply because there are fewer terms naturally on the page. Is there any possible problem with this, including but not limited to some kind of internal anchor-text spam or anything else? We do it to knit pages together for link juice and, hopefully, user experience: giving readers another page to go to. The pages we link to are all pages that produce, or that we hope will produce, organic search traffic. Thanks! ....Darcy
Intermediate & Advanced SEO | 94501
-
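For readers curious what this sort of auto-linker can look like, here is a minimal sketch. The term-to-URL map, function name, and paths are hypothetical; the only detail taken from the question above is the cap of 8 in-context links per page, and a production version would parse the HTML rather than run regexes over it.

```python
import re

# Hypothetical term-to-URL map; a real site would build this from its forum data.
TERM_LINKS = {
    "keyword research": "/forum/keyword-research",
    "link building": "/forum/link-building",
}
MAX_LINKS_PER_PAGE = 8  # the per-page cap described in the question

def autolink(html_text: str) -> str:
    """Wrap the first occurrence of each known term in an in-context link,
    stopping once the per-page cap is reached."""
    links_added = 0
    for term, url in TERM_LINKS.items():
        if links_added >= MAX_LINKS_PER_PAGE:
            break
        pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
        # Link only the first occurrence so one page never repeats
        # the same anchor text over and over.
        html_text, n = pattern.subn(
            lambda m: f'<a href="{url}">{m.group(0)}</a>', html_text, count=1
        )
        links_added += n
    return html_text

print(autolink("A thread about link building and keyword research."))
```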
Our client's web property recently switched over to secure pages (HTTPS); however, their non-secure pages (HTTP) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to HTTPS via a new SSL certificate. They have also implemented rel=canonical tags on most of their internal webpages (pointing to the HTTPS versions). However, many of their non-secure webpages are still being indexed by Google. We have access to their GWMT properties for both the secure and non-secure pages.
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure if this is going to happen. We thought about requesting in GWMT for Google to remove the non-secure pages, but we felt this was pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
-
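For reference, here is a sketch of the kind of site-wide 301 the poster wants to set up, assuming an Apache server with mod_rewrite enabled; the client's actual server stack isn't stated in the question.

```apache
# .htaccess: send every HTTP request to its HTTPS equivalent with a 301.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```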
I have removed 2,000+ pages, but Google still says I have 3,000+ pages indexed
Good afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided to make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still see over 3k indexed pages (originally I had 3,700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering how quickly, in your opinion, these changes will impact my SERP rankings? Look forward to your responses!
Intermediate & Advanced SEO | apogeecorp
-
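One interaction worth flagging in the setup above: blocking the thin pages in robots.txt while also 301ing them works against itself, because a crawler that is barred from fetching a URL can never see its redirect, so stale entries can linger in the index. Below is a hedged sketch (standard-library Python; the paths are placeholders, not the site's real URLs) of how you might audit which removed pages return a 301 and which are still blocked:

```python
import urllib.error
import urllib.request
import urllib.robotparser

SITE = "http://top4office.co.uk"  # domain from the question
removed_paths = ["/old-thin-page-1", "/old-thin-page-2"]  # hypothetical paths

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Surface 3xx responses as HTTPError instead of following them,
    # so we can read the actual status code of each removed page.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()
opener = urllib.request.build_opener(NoRedirect())

for path in removed_paths:
    url = SITE + path
    # If robots.txt disallows the URL, Googlebot never re-crawls it and
    # never sees the 301, so the stale entry can stay in the index.
    blocked = not rp.can_fetch("Googlebot", url)
    try:
        status = opener.open(urllib.request.Request(url, method="HEAD")).status
    except urllib.error.HTTPError as e:
        status = e.code
    print(f"{url}  status={status}  blocked_by_robots={blocked}")
```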
Page Titles of Blog Posts
Hi, should all the page titles of our blog posts include a keyword (or keywords) and/or our website name?
Intermediate & Advanced SEO | Studio33
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income (about 3%) lands on internal search pages, because Google has indexed them. I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- Webmaster notification: "Googlebot found an extremely high number of URLs on your site", with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer!
Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
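For reference, the tag the poster describes would look like this in the <head> of each internal search results page; a sketch, not their actual markup:

```html
<meta name="robots" content="noindex, follow">
```

Note the trade-off the poster is weighing: a robots.txt disallow would stop Google from crawling the search pages at all, which means it could never see this noindex tag, so the several million already-indexed URLs could remain in the index far longer.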