Certain Pages Not Being Indexed - Please Help
-
We are having trouble getting the bulk of our pages indexed in Google. Any help would be greatly appreciated!
The following page types are being indexed through the escaped fragment:
http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion/4097-Casadei-BLADE-PUMP/Product/175199
www.cbuy.tv/celebrity/155-Sophia-Bush#!
However, none of our pages that look like this are being indexed:
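For context on how these #! pages get crawled at all: under Google's (now-deprecated) AJAX crawling scheme, the crawler rewrites a #! URL into its `_escaped_fragment_` form before requesting it. A minimal sketch of that mapping (the function name is ours, not part of any API):

```python
def escaped_fragment_url(url: str) -> str:
    """Rewrite a #! URL the way the AJAX crawling scheme's crawler does."""
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    # Per the scheme, %, #, &, and + inside the fragment are percent-escaped
    # ("%" must be escaped first so the others' escapes are not double-escaped).
    for ch, esc in (("%", "%25"), ("#", "%23"), ("&", "%26"), ("+", "%2B")):
        fragment = fragment.replace(ch, esc)
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={fragment}"
```

So the first example URL above is what the crawler turns into `...?_escaped_fragment_=65-Ashley-Tisdale/...`, and it is that escaped form the server must answer with crawlable HTML.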
-
Hi Takeshi,
We have a sitemap, but the pages are also all interlinked. I didn't know that Google puts an upper bound on indexing based on PR. That's interesting.
Since there is a black-and-white difference between a set of pages of a certain kind (zero of these pages are being indexed) and the rest, I suspect there is some other issue. Is it at all possible that Google does not like the URLs of these pages?
1. Does Google dislike the parameters?
2. Should we shorten our GUID and move it to the end of the URL?
-
Where are these pages being linked from? If you want these pages indexed, try making them more prominent in your site's navigation and architecture. Listing them in a sitemap can help Google discover them, but actually linking to them from your site will have much more impact.
Also, I notice that the site is only PageRank 2 and already has 5,000+ pages indexed in Google. Google limits the number of pages it indexes for a site based on its PageRank, so you may want to work on improving your PR so that Google indexes more pages from your site.
-
Hi Mike,
I am sure you've probably already barked up this tree, but do those pages contain 100% substantially unique content?
Also, have you had an SEO developer review your robots.txt and .htaccess files to make sure there isn't something in there preventing crawlers from having access?
Dana
-
Hello Dana,
Thanks for your reply.
We have thousands of #! pages being indexed. Googlebot is sent to our escaped-fragment page through a redirect. Our dynamic sitemap helped us get many pages indexed. However, there is a subset of pages that Google does not like at all, and we cannot figure out why. For example, when you visit our homepage, http://www.cbuy.tv, and then navigate through the images in our carousel (each assigned a unique URL), none of those pages are being indexed.
Mike
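For anyone following along, the setup Mike describes (serving a static snapshot when the crawler requests the escaped-fragment version, and the normal JavaScript app otherwise) typically looks something like the sketch below. The handler and renderer names are hypothetical, not cbuy.tv's actual code:

```python
from urllib.parse import parse_qs, urlparse

def render_snapshot(fragment: str) -> str:
    # Static, crawlable HTML for the application state named by the fragment.
    return f"<html><body>Snapshot for {fragment}</body></html>"

def render_app_shell() -> str:
    # The normal JavaScript application served to human visitors.
    return "<html><body><script src='/app.js'></script></body></html>"

def handle_request(path_with_query: str) -> str:
    # The crawler rewrites /celebrity#!155-Sophia-Bush into
    # /celebrity?_escaped_fragment_=155-Sophia-Bush before requesting it.
    # keep_blank_values covers "trailing #!" URLs, whose fragment is empty.
    params = parse_qs(urlparse(path_with_query).query, keep_blank_values=True)
    if "_escaped_fragment_" in params:
        return render_snapshot(params["_escaped_fragment_"][0])
    return render_app_shell()
```

If the carousel pages are reachable only through client-side navigation, it is worth confirming that their escaped-fragment URLs actually return a complete snapshot like this rather than the empty app shell.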
-
Hi Mike,
I am not a developer, but I think the problem is the hash in your URLs. It is a problem for search engines in that anything following the "#" is completely ignored by them.
Depending on your platform, I would consider rewriting all of your URLs to omit the hash completely. Search engines (and humans!) can respond in unpredictable ways to anything other than alphanumeric characters. Then I would implement 301 redirects if necessary (depending on how old the site is and how many inbound links there are to each page).
I don't think sitemap submission is even going to help right now because of the hash issue, but I'd love to hear from a developer on this for verification.
I hope this helps!
Dana
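One wrinkle with Dana's 301 suggestion: browsers never send the fragment to the server, so server-side 301s can only target the `_escaped_fragment_` form of the old URLs; the #! URLs themselves have to be redirected client-side (or the app moved to History-API URLs). A hedged sketch of the server-side half, where the clean-URL pattern shown is an assumption, not cbuy.tv's real scheme:

```python
from urllib.parse import parse_qs, urlparse

def redirect_for(path_with_query: str):
    """Return (301, new_path) for an old escaped-fragment URL, else None.

    Assumes the new clean URL is simply the old path plus the fragment,
    e.g. /celebrity?_escaped_fragment_=155-Sophia-Bush -> /celebrity/155-Sophia-Bush.
    """
    parsed = urlparse(path_with_query)
    frag = parse_qs(parsed.query).get("_escaped_fragment_", [""])[0]
    if frag:
        return 301, parsed.path.rstrip("/") + "/" + frag.lstrip("/")
    return None  # not an old-style URL; serve the page normally
```

The client-side half would be a small script on the old pages that inspects `location.hash` and rewrites the browser to the matching clean URL, so existing inbound links to the #! pages still land somewhere indexable.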