Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Best possible linking on site with 100K indexed pages
-
Hello All,
First of all, I would like to thank everybody here for sharing such great knowledge with such amazing and heartfelt passion. It really is good to see. Thank you.
My story / question:
I recently sold a site with more than 100k pages indexed in Google. I was allowed to keep links on the site. These links are actual anchor text links on both the home page and the 100k news articles. On top of that, my site syndicates its RSS feed (just links and titles, no content) to this site.
However, the new owner made a mess, and now the site could possibly be seen as a source of bad links to my site. Google Webmaster Tools tells me that this particular site gives me more than 400K backlinks.
To be clear: I have NEVER received a single notice from Google about bad links.
Still, I was worried that this site could be the reason why MY site tanked as badly as it did. It's the only source linking to me so massively.
Just a few days ago, I got in contact with the new site owner. And he has taken my offer to help him 'better' his site.
Although getting the site up to date for him is my main purpose, since I'm there, I will also put effort into optimizing the links back to my site.
My question:
What would be the best approach to get the most SEO gain out of this?
The site is a newspaper-type site, covering news within the exact niche my site is trying to rank in. The difference is that his is a news site and mine is commercial.
Once I fix his site, there will be regular news updates, all within the niche we're both in. Regular as in several times per day. It's news. In the niche.
Should I leave my rss feed in the side bars of all the content?
Should I leave an anchor text link in the sidebar (on all news pages, etc.)?
If so, there can be just one keyword... 407K pages linking with just one keyword??
Should I keep it to just one link on the home page?
I would love to hear what you guys think.
(My domain is from 2001. Like a quality wine. However, still tanked like a submarine.)
ALL SEO reports I got here are now Grade A. The site is finally fully optimized.
Truly nice to have that confirmation. Now I hope someone will be able to tell me what is best to do, in order to get the most SEO gain out of this for my site.
Thank you.
-
Howdy richardo24hr,
One of the articles that changed my SEO life was written by Rand back in 2010:
All Links are not Created Equal
The article is still valid today, and updates like Panda and Penguin have further changed the landscape of links. Here are a few important points to keep in mind:
1. Multiple links from the same domain don't always help. It's better to have 50 links from 50 domains than 10,000 links from one domain. After the first link, other links from the same domain may pass value, but that value tends to diminish.
2. Since the Penguin update, sitewide, over-optimized anchor text can lead to penalties and/or filters targeting your keywords.
For example, a sitewide footer link (or sidebar link) that pointed to your site with optimized anchor text is often seen as non-editorial (as it's placed automatically by your CMS) and this could actually hurt you.
3. Google is getting better at sniffing out sites with "administrative" relationships, and it tends to devalue these links. A site that links to you from 100,000 pages is likely to broadcast a relationship between the two sites, and Google may devalue those links accordingly.
Links from this site can help! The danger is overdoing it in a way that works against what you're trying to achieve.
The best links from this site, from an SEO point of view, would be:
- Editorial. Meaning they appear in the body of a text article and are not auto-generated by a CMS.
- Varied in their anchor text, and not over-optimized.
- Not sitewide. I would avoid sitewide links.
- RSS links are tricky, and in general I don't see them adding much value, although these links are often scraped and posted on third-party sites. In that case, it's best to use generic or branded anchor text, like the full URL of your site: example.com
If I had to choose, one link on the homepage may well give you more value than 100,000 RSS links. There's probably an opportunity to do more than this, but I'd do a thorough link audit to look for as many over-optimized links from this domain as possible.
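If you do that audit from a raw backlink export, even a tiny script can surface over-optimized anchors quickly. Here's a minimal sketch; the CSV column names (`source_url`, `anchor_text`) are assumptions, so adjust them to whatever your backlink tool actually exports:

```python
# Sketch: summarize anchor-text distribution from a backlink CSV export.
# The column names below are hypothetical -- rename to match your tool's export.
import csv
import io
from collections import Counter

# Inline sample standing in for a real export file.
sample_export = """source_url,anchor_text
https://news-site.example/article-1,best widgets
https://news-site.example/article-2,best widgets
https://news-site.example/article-3,example.com
"""

def anchor_distribution(csv_text):
    """Count how often each anchor text appears in the export."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["anchor_text"] for row in rows)

counts = anchor_distribution(sample_export)
for anchor, n in counts.most_common():
    print(f"{anchor}: {n}")
```

A heavily skewed distribution (one exact-match keyword dominating thousands of links from a single domain) is exactly the pattern worth diluting first.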
Hope this helps! Best of luck with your SEO.
-
I guess my question was too hard to answer?
-
Thank you
