Why isn't my uneven link flow among index pages causing uneven search traffic?
-
I'm working with a site that has millions of pages. The link flow through index pages is atrocious: for the letter A (for example), the index page A/1.html has a Page Authority of 25, and each subsequent page drops until A/70.html (the last index page listing pages that start with A) has a Page Authority of just 1. However, the pages linked to from the low-authority index pages (that is, the pages whose second letter is near the end of the alphabet) get just as much traffic as the pages linked to from A/1.html (the pages whose second letter is A or B). The site gets a lot of traffic and has a lot of pages, so this is not just a statistical blip. The evidence is overwhelming that the pages linked from the low-authority index pages are getting just as much traffic as those linked from the high-authority index pages. Why is this? Should I "fix" the bad link flow if traffic patterns indicate there's no problem? Is it hurting me in some other way? Thanks
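To illustrate why authority falls off so steeply through a paginated index, here is a toy PageRank-style simulation. Everything in it is an assumption for illustration (the link model, the outlink count, the pinned score for page 1); it is not how Moz computes Page Authority, just a sketch of why "page 70 of the index" receives almost nothing through internal links alone:

```python
def chain_scores(n=70, damping=0.85, outlinks=50, iterations=200):
    """Toy score of each index page in a 'page 1 -> page 2 -> ...' chain.

    Assumes page 1 holds a steady score of 1.0 (it is linked from
    site-wide navigation), and each index page spreads its outgoing
    score across `outlinks` links, only ONE of which points at the
    next index page (the rest point at listing/detail pages).
    """
    scores = [1.0] + [1 - damping] * (n - 1)
    for _ in range(iterations):
        new = [1 - damping] * n   # baseline score every page gets
        new[0] = 1.0              # page 1 pinned: linked from everywhere
        for i in range(n - 1):
            # only 1 of `outlinks` links passes score to the next page
            new[i + 1] += damping * scores[i] / outlinks
        scores = new
    return scores

scores = chain_scores()
print(f"page 1: {scores[0]:.3f}, page 2: {scores[1]:.3f}, "
      f"page 70: {scores[-1]:.3f}")
```

Because only one link in fifty carries score to the next index page, the scores collapse toward the baseline within two or three hops, which matches the "PA 25 down to PA 1" pattern described above.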
-
Thanks Everett, I appreciate it!
-
Hello Gil,
With regard to user-generated profile pages, I typically recommend that clients noindex,follow these pages until they reach a minimum completeness threshold (e.g. 75% complete), to avoid filling the index with thin "stub" pages or pages created by spam profiles.
If these are local-business-type pages, as in the White Pages example, the more "supporting content" you customize those pages with, the better. For example, a local business listing page could link to similar businesses in the area, display star ratings, allow visitors to leave reviews/comments, share demographic data for the area, link to the business's social profiles, and embed videos (commercials, etc.) for the business, among many other things.
I realize these pages might be getting traffic at the moment, but as Google updates its machine-learning algorithms to incorporate feedback from the quality raters, who are now being asked to look at supporting content, your client may find traffic to those pages (and indeed to the site as a whole) slowly declining over the next year or two.
That's about as far as I can take it without seeing the pages. Good luck and I hope we've been of some assistance!
-
Hi Travis,
Thanks for your reply.
As I just wrote to Everett, I can't share too many details for confidentiality reasons. My site is somewhat similar to WhitePages, where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45, but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. We have a similar PA distribution among our index pages, but pages linked to from the PA 1 index pages get just as much organic search traffic as pages linked to from the PA 45 index pages. So I don't know whether my client should spend time fixing the problem.
Thanks
-
Thanks. I can't share too many details for confidentiality reasons. I realize that makes it hard / impossible to diagnose correctly, and I'm sorry about that.
These are person pages. The site's link structure naturally gives more link power to the people with the most connections. We could noindex (or mask links to) pages that don't have much information, but I think such a system would probably be complex and might backfire.
So there's not the kind of taxonomy / directory / long-tail keyword structure that you would expect from a large product directory (for example).
Let's pretend we're discussing WhitePages.com, where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45, but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. I could fix the problem and raise the PA of the back pages, but I can't recommend that my client spend resources on it, since the pages at the back of the index get just as much organic search traffic as the pages at the top.
Thanks
-
As others have stated, we can't really say much with certainty unless we view the site. However, here are my two pennies anyway...
The farther you go down into the directory structure (assuming you have a logical taxonomy and site architecture), the more long-tail and specific the keywords will be. The more long-tail and specific the topic, the less page authority is needed to rank.
With that said, if I were working on a site with millions of pages, I'd look into doing a content audit to determine which pages even SHOULD be in the index. Very few sites can scale quality landing pages into the millions.
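A first pass at that kind of audit can be as simple as triaging pages by a couple of signals. The thresholds and field names below are purely illustrative; a real audit would also weigh links, conversions, crawl data, and page purpose:

```python
from collections import namedtuple

# Minimal page record, as you might build from an analytics/crawl export
Page = namedtuple("Page", "url organic_visits word_count")

def audit(pages, min_visits=1, min_words=150):
    """Rough first-pass triage for a large-site content audit.

    Pages that earn organic visits or carry substantial content are
    kept; everything else becomes a candidate for noindex or pruning.
    """
    keep, review = [], []
    for p in pages:
        if p.organic_visits >= min_visits or p.word_count >= min_words:
            keep.append(p.url)
        else:
            review.append(p.url)  # thin AND trafficless: review these
    return keep, review

sample = [
    Page("/people/a-smith", organic_visits=10, word_count=50),
    Page("/people/b-jones", organic_visits=0, word_count=300),
    Page("/people/c-empty", organic_visits=0, word_count=20),
]
keep, review = audit(sample)
```

On a site with millions of pages, even this crude cut usually surfaces a large tranche of pages that are doing nothing for users or for search.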
-
You shouldn't expect anyone to solve anything that technical, with any sort of certainty, without stating the actual domain.
If it's getting organic traffic, great. Could it get more? Maybe.
No one can speak with any sort of certainty based upon what you have written at this point.
Apologies if I appear a little cranky. I'm getting tired of all of these "I have a problem with a bajillion possible issues, but I won't tell you what I'm looking at" questions.
You can always PM me; I'm not coming after your client, the problem is just more interesting with the details.
-
Yes, an even distribution of organic search traffic seems to indicate that the pages are indexed and ranking. Gains might be made via external links, but as far as modifying your link flow goes, it doesn't seem like the site needs it based on what you've described.
-
Thanks. Sorry I wasn't clear: when I say the traffic is pretty evenly distributed among the pages, I'm referring specifically to organic traffic. I'm wondering if the relatively even distribution of organic traffic is proof that better balancing the link flow won't increase traffic.
-
If you're speaking in terms of just organic search visits, it doesn't seem to be a problem, but "traffic" in your example is a little broad. There could be paid search targeted at those pages, or some social media mechanism that causes people to visit their specific page, and so on.
A segmented look at your analytics for the site (or site section) will give you a good idea of whether or not the pages have a problem getting organic search traffic. If they don't, I wouldn't worry about link flow. Really, the main reason to adjust it is if you're lacking indexation or rankings, and from what you've described so far, you're not.
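If it helps, that segmented look amounts to grouping visits by site section and channel, so you can see whether the deep index pages really earn organic visits rather than paid or social ones. The row shape and field names below are illustrative, standing in for whatever your analytics export gives you:

```python
from collections import defaultdict

def segment_visits(rows):
    """Total visits by (site section, channel).

    Each row is a dict like an analytics export might provide,
    e.g. {"page_path": "/people/smith", "channel": "organic",
    "visits": 3}. The top-level path segment is used as the section.
    """
    totals = defaultdict(int)
    for row in rows:
        section = row["page_path"].split("/")[1] or "(home)"
        totals[(section, row["channel"])] += row["visits"]
    return dict(totals)

rows = [
    {"page_path": "/people/smith", "channel": "organic", "visits": 3},
    {"page_path": "/people/jones", "channel": "paid", "visits": 5},
    {"page_path": "/people/zed", "channel": "organic", "visits": 2},
]
totals = segment_visits(rows)
```

If the organic totals for the deep sections hold up on their own, the even traffic really is organic and the link-flow "problem" is cosmetic.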