PR Dilution and Number of Pages Indexed
-
Hi Mozzers,
My client is really pushing for me to get thousands, if not millions of pages indexed through the use of long-tail keywords.
I know that I can probably get quite a few of them into Google, but will this dilute the PR on my site?
These pages would be worthwhile in that if anyone actually visits them, there is a solid chance they will convert to a lead due to the nature of the long-tail keywords.
My suggestion is to run all the keywords for these thousands of pages through AdWords to check query volume and only create pages for the ones that actually receive searches.
What do you guys think?
I know that the content needs to have value and can't be scraped/low-quality, and pulling these pages out of my butt won't end well, but I need solid evidence to make a case either for or against it to my client.
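The filtering step I'm describing could be sketched like this (a rough sketch assuming a hypothetical CSV export from Keyword Planner; the file name, column names, and volume threshold are all placeholders, not anything Google-specified):

```python
import csv

# Hypothetical Keyword Planner export with columns
# "Keyword" and "Avg. monthly searches".
# Keep only keywords that actually receive searches
# before committing to building a page for each one.
def keywords_worth_building(path, min_monthly_searches=10):
    keep = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                # Volumes may be formatted with thousands separators, e.g. "1,000".
                volume = int(row["Avg. monthly searches"].replace(",", ""))
            except (KeyError, ValueError):
                continue  # skip rows with missing or non-numeric volume
            if volume >= min_monthly_searches:
                keep.append((row["Keyword"], volume))
    # Highest-volume candidates first
    return sorted(keep, key=lambda kv: kv[1], reverse=True)
```

Even a crude cut like this would shrink "millions of pages" down to the subset with demonstrated demand.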
-
Ah, e-commerce product pages - that makes more sense!
-
Thanks, Doug.
I know that these pages will have a solid conversion rate. The long tail keywords are for product pages on an e-commerce site.
After posting, I took a look at the competitors' sites in the industry, and most of them have 150k - 300k of these similar product pages indexed. This client only has about 40k, so I think we will go ahead and try to beef it up.
-
OK, I get what he's thinking, but there's another problem in addition to diluting or cannibalising your money keywords/head terms, or running into crawl problems, etc.
These pages also need to do more than sit there collecting traffic for long-tail keywords. They need to support the goals of your business/website and get people to engage with it, and writing content (thousands of pages) that'll do this is a tough ask!
It's not just about the volume of the traffic - but the relevance and intent of that traffic and what ultimately that traffic is worth to you or your client.
If visitors click through to your content and bounce straight back to the search results, then you're just wasting their time and your time/money. ( http://moz.com/blog/solving-the-pogo-stick-problem-whiteboard-friday )
Take a look at the engagement metrics for your current long-tail keywords. What's the bounce rate / page depth for these visitors? Are any of them actually likely to convert?
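To make that concrete, here's a minimal sketch of pulling bounce rate and page depth per landing page out of an analytics export. This assumes a hypothetical session-level CSV (columns "landing_page", "pageviews", "bounced") — your analytics tool's actual export format will differ, so treat the column names as placeholders:

```python
import csv
from collections import defaultdict

# Aggregate session-level rows into per-landing-page engagement stats.
def engagement_by_landing_page(path):
    stats = defaultdict(lambda: {"sessions": 0, "pageviews": 0, "bounces": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            s = stats[row["landing_page"]]
            s["sessions"] += 1
            s["pageviews"] += int(row["pageviews"])
            s["bounces"] += int(row["bounced"])  # "1" if the session bounced
    report = {}
    for page, s in stats.items():
        report[page] = {
            "bounce_rate": s["bounces"] / s["sessions"],
            "pages_per_session": s["pageviews"] / s["sessions"],
        }
    return report
```

If the existing long-tail pages already show sky-high bounce rates and one-page sessions, that's your evidence against minting thousands more of them.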
Don't know if that'll help persuade your client.... good luck!
-
Hi, Travis-
I wouldn't be concerned with "diluting" any PageRank you already have, if that's your question. You may, however, be setting up new pages in competition with existing pages if they're going after the same terms. The longer an existing page has been out there unmodified, the more easily a new page may outrank it, all other things being equal.
Unfortunately, without seeing the site, it's impossible even to render a guesstimate. For instance, how's your crawl budget? If it's limited, QDF could possibly allow newer pages to push older pages into the shadows. You might check out this post: http://www.algohunters.com/building-solid-index-presence-optimizing-crawl-budget/