Links from non-indexed pages
-
Whilst looking for link opportunities, I have noticed that the website has a few profiles from suppliers and accredited organisations. However, a search form is required to access these pages, and when I check them with cache:"webpage.com" they show up as non-indexed.
These are good websites, not spammy directory sites, but is it worth trying to get Google to index the pages? If so, what is the best method to use?
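As an aside, a cache: query isn't a very reliable indexation check; a site:example.com/page query is usually a better first look, and it's also worth ruling out on-page blockers. Below is a hypothetical little helper (not a Moz tool — the function name and structure are made up for illustration) that scans already-fetched HTML and response headers for noindex directives, working on strings so no network access is needed:

```python
# Sketch: check a fetched page's on-page indexability signals
# (robots meta tag and X-Robots-Tag header). Hypothetical helper for
# illustration only; operates on strings rather than fetching anything.
from html.parser import HTMLParser

class _MetaRobots(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in a.get("content", "").split(",")]

def indexability_signals(html: str, headers: dict) -> dict:
    parser = _MetaRobots()
    parser.feed(html)
    header_dirs = [d.strip().lower()
                   for d in headers.get("X-Robots-Tag", "").split(",")
                   if d.strip()]
    blocked = "noindex" in parser.directives or "noindex" in header_dirs
    return {"meta": parser.directives, "header": header_dirs, "noindex": blocked}
```

In this thread's case the pages are clean of noindex directives and simply unreachable by crawl, but a check like this rules out the other common cause in a few seconds.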
-
As others have mentioned, it sounds like these links have little potential value. You could always drop a few comment URLs, tweets, or G+ posts pointing to those pages to help them get indexed, but they would still pass very little authority, and I can't say it would be worth the effort.
Perhaps you could contact those same suppliers and offer to give them a testimonial, or find some other way to get your company linked on a more prominent page of their website. Think about what of value you can offer for their website.
-
Thanks guys.
I suspected the profiles wouldn't be worth indexing, but it's always worth getting a second opinion. It's just annoying that the links are already there on good websites but aren't passing any link juice. Never mind.
-
Oleg is correct: crawlers have to be able to spider a page in order to find it.
There are a couple of ways around this, but they are nasty and not foolproof, and I wouldn't recommend them. As Oleg said, the page probably doesn't have much power anyway.
If you're purely after this site for a link to pass juice, don't waste your time. However, if you're after a link to generate sales, leads and revenue, then still get the link.
Not all links should be about Google and rankings. Obviously, when link building, don't break any of the SEO rules, but don't turn down the opportunity to generate sales. Just because you won't get any link juice from a site doesn't make the link irrelevant.
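The "crawlers have to be able to spider the page" point can be shown with a toy link-discovery crawl: a crawler only queues URLs that some already-known page links to, so a page reachable only by submitting a search form never enters the queue. Everything below (the SITE dict, the URLs, the discover function) is a made-up stand-in purely to illustrate the mechanism:

```python
# Toy illustration of crawl-based discovery. The hardcoded SITE dict stands in
# for fetching real pages; a basic crawler follows href links but does not
# submit forms, so the supplier profile is never discovered.
import re
from collections import deque

SITE = {
    "/": '<a href="/about">About</a> <form action="/search" method="post"></form>',
    "/about": '<a href="/">Home</a>',
    "/supplier-profile-123": "Only reachable by submitting the search form.",
}

def discover(start: str) -> set:
    seen, queue = {start}, deque([start])
    while queue:
        url = queue.popleft()
        # Follow plain href links only; form actions are not crawled
        for link in re.findall(r'href="([^"]+)"', SITE.get(url, "")):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen
```

Running discover("/") finds only "/" and "/about"; "/supplier-profile-123" exists but is invisible to the crawl, which is exactly the situation described in the question.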
-
If there is no direct way for the crawler to reach the page, it won't be indexed, and if the page is that hard to reach, it probably wouldn't pass much authority on to your site anyway.
I'd still get the link if it's on a good website, for the potential referral traffic, but I wouldn't worry about trying to get it indexed.