Certain Pages Not Being Indexed - Please Help
-
We are having trouble getting the bulk of our pages indexed in Google. Any help would be greatly appreciated!
The following page types are being indexed through the escaped-fragment mechanism:
http://www.cbuy.tv/celebrity#!65-Ashley-Tisdale/fashion/4097-Casadei-BLADE-PUMP/Product/175199
www.cbuy.tv/celebrity/155-Sophia-Bush#!
However, none of our pages that look like this are being indexed:
-
Hi Takeshi,
We have a sitemap, but the pages are also all interlinked. I didn't know that Google puts an upper bound on indexing based on PR - that's interesting.
Since there is a black-and-white difference for a set of pages of a certain kind (zero of these pages are being indexed), I suspect there is some other issue. Is it at all possible that Google does not like the URLs of these pages?
1. Does Google dislike the parameters?
2. Should we shorten our GUID and move it to the end of the URL?
-
Where are these pages being linked from? If you want these pages indexed, you may want to try making them more prominent in your site's navigation and architecture. Listing them in a sitemap can help them get discovered by Google, but actually linking to them from your site will have much more impact.
Also, I notice that the site is only PageRank 2 and already has 5,000+ pages indexed in Google. Google limits how many pages it indexes from a site based in part on its PageRank, so you may want to work on improving your PR so that Google indexes more pages from your site.
-
Hi Mike,
You've probably already barked up this tree, but do those pages contain substantially unique content?
Also, have you had an SEO developer review your robots.txt and .htaccess files to make sure there isn't something in there preventing crawlers from having access?
Dana
-
Hello Dana,
Thanks for your reply.
We have thousands of #! pages being indexed. Googlebot is sent to our escaped-fragment page through a redirect, and our dynamic sitemap helped us get many pages indexed. However, there is a subset of pages that Google does not like at all, and we cannot figure out why. For example, when you visit our homepage, http://www.cbuy.tv, and navigate through the images in our carousel (each assigned a unique URL), none of those pages get indexed.
Mike
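For reference, here is a minimal sketch of the escaped-fragment handshake Mike describes, assuming an Express server; the /snapshots path and the routing are illustrative, not the site's actual setup. Under Google's (now deprecated) AJAX crawling scheme, a crawler that sees a #! URL re-requests the page with ?_escaped_fragment_= and expects a static HTML snapshot back:

```ts
import express from "express";

const app = express();

// A crawler that sees /celebrity#!65-Ashley-Tisdale/... requests the same
// path with ?_escaped_fragment_=65-Ashley-Tisdale/... instead. The server
// is expected to answer with a static HTML snapshot of what the JavaScript
// would have rendered.
app.get("*", (req, res, next) => {
  const fragment = req.query["_escaped_fragment_"];
  if (typeof fragment === "string") {
    // Send the crawler to a prerendered snapshot, mirroring the redirect
    // described in the thread. The /snapshots path is an assumption.
    res.redirect(301, `/snapshots${req.path}/${encodeURIComponent(fragment)}`);
    return;
  }
  next(); // regular visitors get the normal JavaScript-driven page
});

app.listen(3000);
```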
-
Hi Mike,
I am not a developer, but I think the problem is the hash in your URLs: anything following the "#" is ignored by search engines.
Depending on your platform, I would consider rewriting all of your URLs to omit the hash completely. Search engines (and humans!) can respond in unpredictable ways to anything other than alphanumeric characters. Then I would implement 301 redirects if necessary (depending on how old the site is and how many inbound links there are to each page).
I don't think that sitemap submission is even going to help right now because of the hash issue, but I'd love to hear from a developer on this for verification.
I hope this helps!
Dana
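For anyone implementing Dana's suggestion: because the part of a URL after "#" never reaches the server, a server-side 301 alone cannot retire #! URLs; the browser itself has to translate old hashbang links into the new clean paths. A minimal client-side sketch, assuming the clean URLs already exist and mirror the old fragment structure:

```ts
// Runs early on every page load. If the URL uses the legacy #! form,
// replace it with the equivalent clean path. location.replace avoids
// leaving the old #! URL in the browser history.
(function redirectLegacyHashbang(): void {
  const { hash, pathname, origin } = window.location;
  if (hash.startsWith("#!")) {
    // e.g. /celebrity#!65-Ashley-Tisdale/... -> /celebrity/65-Ashley-Tisdale/...
    const cleanPath = `${pathname.replace(/\/+$/, "")}/${hash.slice(2)}`;
    window.location.replace(origin + cleanPath);
  }
})();
```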
Related Questions
-
Complicated Title Tag Issues. Experts, Please Help!
Hey there Moz community! This is the first time I've asked a question here, so please forgive me if I miss any forum etiquette. I am managing SEO for an educational site built in React.js, and so far much of the job has been keyword research and site optimization. The site still has slow PageSpeed, though.
The issue: 4 weeks ago we published 20 or so content pieces, for which I had pre-prepared title tags and meta descriptions. When we released the content, a programming error made all 20 pages show the same title tag instead of the pre-prepared individual ones. I noticed this after 3 days and the issue was fixed within 6 days, but by then Google had crawled and indexed the pages. Now I can't get Google to switch to the pre-prepared tags no matter what I do! I've tried changing the content, changing the URL of one of the pages, and requesting re-crawls multiple times.
The super weird thing is that the correct title tag shows in the browser tab in Google Chrome, but NOT when I view the page source. Yesterday I was taking a walk in the park and just couldn't stop thinking about it (it is really starting to get to me by now, since nothing works), so I ran back home and looked closely at one of these pages in Google Search Console. And I noticed something I hadn't seen before: BOTH of the title tags can be found in the HTML.
Pre-prepared title tag: <title>UK Seat Belt & Car Seat Laws: The Definitive Guide</title>
The other title tag (in the src section): title=Ace%20The%20DMV%20Permit%20Test%20%26%20Get%20Your%20License
Could this be the problem, or what do you think? I understand that Google generates its own title tags when it thinks they fit better, but the current titles aren't even close to describing the topic, so it doesn't make any sense. All answers are greatly appreciated! Your advice is life-saving for a learner like me. P.S. I love SEO but it can be very frustrating sometimes! Thank you very much, Leo
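In a React app, per-page titles are usually injected on the client, which would explain the symptom above: the browser tab shows the right title while the raw page source does not. A minimal sketch using react-helmet (the library choice and the meta description text are assumptions; the post doesn't say how the site sets its titles):

```tsx
import React from "react";
import { Helmet } from "react-helmet";

// A page component that sets its own <title> and meta description on the
// client. Crawlers that read only the initial server HTML will not see
// these values unless the page is server-rendered or prerendered.
const SeatBeltGuide: React.FC = () => (
  <>
    <Helmet>
      <title>UK Seat Belt & Car Seat Laws: The Definitive Guide</title>
      {/* Placeholder description, not the poster's actual copy. */}
      <meta name="description" content="A guide to UK seat belt and car seat laws." />
    </Helmet>
    <h1>UK Seat Belt &amp; Car Seat Laws</h1>
  </>
);

export default SeatBeltGuide;
```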
Intermediate & Advanced SEO | Leowa
-
React.js Single Page Application Not Indexing
We recently launched a website that uses React.js, and we haven't been able to get any of its pages indexed. Our previous site (which had a .ca domain) ranked #1 in the 4 cities where we had pages, and we redirected it to the .com domain a little over a month ago. We have recently started using prerender.io but still haven't seen any success. Has anyone dealt with a similar issue before?
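For context, prerender.io is typically wired into an Express server with its prerender-node middleware, which serves rendered HTML to crawlers while normal visitors get the React app. A minimal sketch, with the token and file paths as placeholders:

```ts
import express from "express";
import prerender from "prerender-node";

const app = express();

// prerender-node inspects the User-Agent (and ?_escaped_fragment_=) and,
// for known crawlers, proxies the request through the prerender.io service
// so the bot receives fully rendered HTML instead of an empty React shell.
// It must be registered before the routes that serve the app.
app.use(prerender.set("prerenderToken", "YOUR_PRERENDER_TOKEN"));

// All other requests fall through to the single-page app.
app.get("*", (_req, res) => {
  res.sendFile("index.html", { root: "build" });
});

app.listen(3000);
```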
Intermediate & Advanced SEO | m_van
-
How would you handle these pages? Should they be indexed?
If a site has about 100 pages offering specific discounts for employees at various companies, for example:
mysite.com/discounts/target
mysite.com/discounts/kohls
mysite.com/discounts/jcpenney
and all these pages are nearly 100% duplicates, how would you handle them? My recommendation to my client was to use noindex, follow. These pages tend to receive backlinks from the actual companies receiving the discounts, so obviously they are valuable from a linking standpoint. But if the content is nearly identical from page to page, should they be indexed? Is there any value for someone at Kohl's, for example, in being able to find this landing page in the search results? Here is a live example of what I am talking about: https://www.google.com/search?num=100&safe=active&rlz=1C1WPZB_enUS735US735&q=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&oq=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&gs_l=serp.3...7812.8453.0.8643.6.6.0.0.0.0.174.646.3j3.6.0....0...1c.1.64.serp..0.5.586...0j35i39k1j0i131k1j0i67k1j0i131i67k1j0i131i46k1j46i131k1j0i20k1j0i10i3k1.RyIhsU0Yz4E
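One sketch of the noindex, follow recommendation: instead of templating a meta robots tag into each page, the directive can be sent as an X-Robots-Tag header for the whole discounts section. The Express setup and route pattern here are assumptions, not the actual site's stack:

```ts
import express from "express";

const app = express();

// Apply noindex,follow to every /discounts/* page via an HTTP header,
// equivalent to <meta name="robots" content="noindex, follow">: the pages
// stay out of the index, but crawlers still follow the links on them, so
// the backlinks pointing at them keep passing value.
app.use("/discounts", (_req, res, next) => {
  res.set("X-Robots-Tag", "noindex, follow");
  next();
});

app.listen(3000);
```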
Intermediate & Advanced SEO | FPD_NYC
-
Removing a massive number of noindex,follow pages that are not crawled
Hi, We have stackable filters on some of our pages (i.e., ?filter1=a&filter2=b, etc.). Those stacked-filter pages are "noindex, follow". They were created in order to facilitate the indexation of the items listed on them. After analysing the logs, we know that the search engines do not crawl those stacked-filter pages. Would blocking those pages (by loading their links in AJAX, for example) help our crawl rate or not? In other words, does removing links that are already not crawled help the crawl rate of the rest of our pages? My assumption here is that search engines see those links but discard them because those pages are too deep in our architecture, and that by removing the links we would help search engines focus on the rest of our pages. We don't want to waste our efforts removing those links if there will be no impact. Thanks
Intermediate & Advanced SEO | Digitics
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexed in Google. Help!
Hi, This site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be returning a 503 error to the Googlebot. So far:
- I can see the site fine myself
- I have checked the source code
- I have checked robots.txt
- We did have a sitemap parameter, but removed it for testing
- GWMT shows "unreachable" if I submit a sitemap or fetch
Any ideas on how to remove this error? Many thanks in advance
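One quick way to reproduce what Googlebot sees is to request the page with Googlebot's User-Agent, since servers and security plugins sometimes vary their response by user agent; a minimal sketch (the UA-based cause is only a guess):

```ts
// Request the page with Googlebot's User-Agent to reproduce what the
// crawler sees; a 503 here but a 200 in a normal browser suggests the
// server (or a security/load plugin) treats bot traffic differently.
async function fetchAsGooglebot(url: string): Promise<void> {
  const res = await fetch(url, {
    headers: {
      "User-Agent":
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    },
  });
  console.log(`${url} -> HTTP ${res.status}`);
}

fetchAsGooglebot("http://www.thethreehorseshoespub.co.uk/");
```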
Intermediate & Advanced SEO | SolveWebMedia
-
Short-term product pages and fast indexing - should XML sitemaps be updated daily, weekly, etc.?
Hi everyone, I am currently working on a website whose XML sitemap is set to update weekly. Our client has requested that this be changed to daily. The real issue is that the website creates short-term product pages (10-20 days), after which the product page URLs go 404. So the real problem is quick indexing, not daily vs. weekly sitemaps. I suspect that a daily sitemap may improve indexing time but does not completely solve the problem.
So my question for you is: how can I improve indexing time on this project? The real problem is how to get the product pages indexed and ranking before the 404 page shows up. Here are some of my initial thoughts and background on the project.
Product pages are only available for 10 to 20 days (it's an auction site). Once the auction on a product ends, the URL goes 404. If the pages only exist for 10 to 20 days (a 404 shows up when the auction is over), this is bad for SEO for several reasons (BTW, I was called onto the project as the SEO specialist after the project and site were completed).
Reason 1 - It is highly unlikely that the product pages will rank (positions 1-5), since the site has a very low Domain Authority, and by the time Google indexes the link the auction is over, so the user sees a 404. Possible solution 1 - give all products authorship from a "trustworthy" author so that indexing time improves. Possible solution 2 - incorporate G+ posts for each product to improve indexing time. There is still a ranking issue here since the site has a low DA; the product might appear, but at the bottom of page 1 or 2, etc. Any other ideas?
From what I understand, even though sitemaps are fed to Google on a weekly or daily basis, this does not mean that Google indexes them right away (please confirm). Best case scenario: Google indexes the links every day (totally unrealistic in my opinion), the URL shows up on page 1 or 2 of Google and slowly starts to move up; by the time the product ranks in the first 5 positions, the auction is over and the user sees a 404. I do think that a daily sitemap is better for this project than a weekly one, but I would like to hear the community's opinion. Thanks
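A minimal sketch of the daily-regeneration idea: build the sitemap from only the auctions that are still live, so expiring URLs drop out before they start returning 404s. The Product shape and its fields are assumptions for illustration:

```ts
// Rebuild the sitemap from only the auctions that are still live, so
// expired (soon-to-404) URLs drop out as soon as the file is regenerated.
interface Product {
  url: string; // canonical product URL
  updatedAt: Date; // last content change
  endsAt: Date; // auction end date; the page 404s after this
}

function buildSitemap(products: Product[], now: Date = new Date()): string {
  const entries = products
    .filter((p) => p.endsAt > now) // drop ended auctions
    .map(
      (p) => `  <url>
    <loc>${p.url}</loc>
    <lastmod>${p.updatedAt.toISOString()}</lastmod>
    <changefreq>daily</changefreq>
  </url>`
    )
    .join("\n");

  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>`;
}
```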
Intermediate & Advanced SEO | Carla_Dawson
-
Indexing a new website with several million pages
Hello everyone, I am currently working for a huge classifieds website that will launch in France in September 2013. The website will have up to 10 million pages. I know the indexing of a website of this size should be done step by step, not all at once, to reduce the risk of a long sandbox period and to keep more control over it. Do you have any recommendations or good practices for such a task? Maybe some personal experience you might have had?
The website will cover about 300 jobs:
in every region (= 300 * 22 pages)
in every department (= 300 * 101 pages)
in every city (= 300 * 37,000 pages)
Do you think it would be wiser to index a couple of jobs at a time (for instance, 10 jobs every week) or to index by page level (for example, first jobs by region, then jobs by department, etc.)?
More generally speaking, how would you proceed in order to avoid penalties from Google and index the whole site as fast as possible? One more detail: we'll rely on a (big?) press follow-up and on a linking campaign that still has to be determined. Thanks for your help! Best Regards, Raphael
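A minimal sketch of one way to stage the rollout: split the URL set into sitemap files of at most 50,000 URLs each (the sitemap protocol limit) and reference them from a single sitemap index, releasing batches over time. The file naming and counts are illustrative:

```ts
// Split a very large URL set into sitemap files of at most 50,000 URLs
// each and reference them from one sitemap index. Releasing the files in
// batches gives staged control over what Google discovers.
const MAX_URLS_PER_SITEMAP = 50_000;

function buildSitemapIndex(totalUrls: number, baseUrl: string): string {
  const fileCount = Math.ceil(totalUrls / MAX_URLS_PER_SITEMAP);
  const refs = Array.from(
    { length: fileCount },
    (_, i) => `  <sitemap><loc>${baseUrl}/sitemap-${i + 1}.xml.gz</loc></sitemap>`
  ).join("\n");

  return `<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${refs}
</sitemapindex>`;
}

// e.g. ~10M city pages -> 200 sitemap files
console.log(buildSitemapIndex(10_000_000, "http://www.example.com"));
```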
Intermediate & Advanced SEO | Pureshore
-
Do in-page links pointing to the parent page make the page more relevant for that term?
Here's a technical question. Suppose I have a page relevant to the term "mobile phones". On that page I have a piece of text talking about "mobile phones", and within that text is the term "cell phones". Now, if I link the text "cell phones" to the page it is already placed on (i.e., the parent page), will the page gain more relevancy for the term "cell phones"? Thanks
Intermediate & Advanced SEO | James77