Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Will noindex pages still get link equity?
-
We think we get link equity from some large travel domains that link to white-label versions of our main website. These white-label pages are noindexed because they have the same URLs and content as our main B2C website, and they carry canonicals pointing to the pages we want indexed. The question is: is there REALLY any link equity passed to pages on our domain that have "noindex,nofollow" on them?
Secondly, we're looking to move all these white-label pages onto a separate structure, to better protect our main indexed pages from duplicate-content risk. The best bet would be to put them in a subfolder rather than on a subdomain, yes? That way, even though the pages are still noindexed, we'd get link equity from these big domains to www.ourdomain.com/subfolder, where we wouldn't to subdomain.ourdomain.com?
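For context, the head of each white-label page currently looks something like the sketch below (the URL is a placeholder, not one of our real pages), and a few lines of Python can confirm what a crawler would actually see:

```python
from html.parser import HTMLParser

class RobotsAuditParser(HTMLParser):
    """Collects the robots meta directives and the canonical URL from a page's <head>."""
    def __init__(self):
        super().__init__()
        self.robots = []      # e.g. ["noindex", "nofollow"]
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Split "noindex, nofollow" into normalized directives
            self.robots = [d.strip().lower() for d in attrs.get("content", "").split(",")]
        elif tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical white-label page head (placeholder URLs, for illustration only):
sample_head = """
<head>
  <meta name="robots" content="noindex, nofollow">
  <link rel="canonical" href="https://www.ourdomain.com/page">
</head>
"""

parser = RobotsAuditParser()
parser.feed(sample_head)
print(parser.robots)     # ['noindex', 'nofollow']
print(parser.canonical)  # https://www.ourdomain.com/page
```

Running something like this over a sample of the white-label URLs is how we verified the "noindex,nofollow" state in the first place.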
Thank you!
-
According to John Mueller, the answer is no (at least in the long term): Google eventually treats pages that stay noindexed for a long time as "noindex, nofollow", so their outgoing links stop being followed.
https://www.seroundtable.com/google-long-term-noindex-follow-24990.html
-
Thanks for your advice, chaps. Ultimately a change is coming in a couple of weeks; I might update this page if it's useful...
-
I agree with Gaston's view. What is stopping you, though, from changing the nofollow tag to follow and maintaining the canonical and noindex? That way you wouldn't have duplicate content issues (whether on the main domain, a folder, or a subdomain) and would still pass link equity.
-
Hello Josep,
Firstly, the noindex tag by itself doesn't stop PageRank from being transferred; the nofollow tag is the problem here. Remember that link equity is passed when the page lets Googlebot "follow" on to the next page.
Secondly, if you still respect the noindex, the canonical, and all the other correct measures to prevent duplicate content, there will be no difference between a subfolder and a subdomain.
Hope it helps.
Best of luck.
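To make the distinction concrete, here is a minimal sketch (my simplification of the reasoning above, not Google's actual algorithm) of which robots directive combinations leave a page's outgoing links followable:

```python
def links_are_followed(robots_content: str) -> bool:
    """'nofollow' (or its shorthand 'none', meaning noindex + nofollow) stops
    Googlebot from following the page's outgoing links; 'noindex' alone does not."""
    directives = {d.strip().lower() for d in robots_content.split(",")}
    return "nofollow" not in directives and "none" not in directives

for content in ("index, follow", "noindex, follow", "noindex, nofollow"):
    print(f'"{content}" -> links followed: {links_are_followed(content)}')
```

Note the caveat from the John Mueller link earlier in this thread: over the long term Google may start treating a persistent noindex as "noindex, nofollow", so even "noindex, follow" is not a permanent guarantee.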
GR.