Is there a way to index important pages manually, or to make sure a certain page gets indexed within a short period of time?
-
Hi There!
The problem I'm having is that certain pages have already been waiting three months to be indexed, even though they have several backlinks. Is it normal to have to wait more than three months for these pages to get indexed? Is there anything I can do to make sure these pages get indexed soon?
Greetings
Bob
-
Hi Jimmy,
It seems that the page is not indexed yet. I fetched the page again and this time it said successful. The URL has now been submitted to the index, so I'll find out tomorrow whether the page gets indexed. Thank you very much for your help!
Greetings
Bob
-
Hi Bob,
Is the page currently partially indexed, or not indexed at all now?
What is the exact message Google gives about the live chat?
If the on-page code is causing problems, then the good advice from Ben won't help until it is resolved.
Kind Regards
Jimmy
-
Hi there
I answered a question a few minutes ago with the same layout, so don't be offended if you run across these suggestions again!
Here are some suggestions to help the process:
- Check your server responses (pages you want indexed should return 200)
- Check your robots.txt and meta robots tags for accidental blocks
- Verify your site in Google & Bing Webmaster Tools
- Check whether any crawl issues are reported in Webmaster Tools
- Update your sitemap and submit it to Google & Bing Webmaster Tools
- Make sure your site is crawlable for Googlebot
- Google also crawls through inbound links to your site, so take a look at your local SEO for some potentially quick and easy wins
- Check your internal links to make sure they are followed (not nofollow)
Running through those should help you find the issue rather quickly - hope this helps! Good luck!
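For anyone who wants to script the robots.txt and meta-tag checks from that list, here is a minimal sketch using only the Python standard library. It assumes you have already fetched the robots.txt text and the page HTML yourself, and the function names are my own, not part of any tool mentioned above:

```python
import re
from urllib import robotparser

def robots_allows(url, robots_txt, user_agent="Googlebot"):
    """True if the given robots.txt text allows user_agent to fetch url."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)

def meta_robots_blocks(html):
    """True if a <meta name="robots"> tag on the page contains 'noindex'."""
    tag = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    return bool(tag and "noindex" in tag.group(0).lower())
```

If `robots_allows` returns False or `meta_robots_blocks` returns True for the page in question, you have likely found why it isn't getting indexed.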
-
Hi Jimmy,
Thank you for answering the question I asked above. What I forgot to mention in my story is that I have already used Fetch as Google, but the page only gets partially indexed. Since then, the situation hasn't changed and the page doesn't get indexed. Google says our live chat tool Zopim (an AJAX widget) blocks the crawling, while other pages using the same tool do get fetched successfully.
Bob
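Since Google reports the live-chat widget blocking the crawl, one thing worth checking is whether the chat script is injected asynchronously, so it can't hold up the rest of the page. A rough, hypothetical check in Python — the `script_hint` default and the regexes are my assumptions, not Zopim's documented embed code:

```python
import re

def chat_script_is_async(html, script_hint="zopim"):
    """True if every <script> tag mentioning script_hint carries async or defer,
    meaning the widget can't block the rest of the page from loading."""
    for tag in re.findall(r'<script[^>]*>', html, re.I):
        if script_hint in tag.lower():
            if not re.search(r'\basync\b|\bdefer\b', tag, re.I):
                return False
    return True
```

Running this against the HTML of a page that fetches fine and one that only partially indexes could show whether the widget is embedded differently on the two.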
-
Are the backlinks from external pages, rather than internal? If you can get an external backlink or two pointing at the page, that may help indexation speed. The easiest way - but the least reliable for ensuring indexation - would be to mention the new page on social media from a few accounts, including Google+.
-
Hi,
If you do a 'Fetch as Google' in Google Webmaster Tools (under the Crawl menu), there is a 'Submit to index' button once it successfully fetches the page.
Kind Regards
Jimmy
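As a complement to the manual Fetch-as-Google route Jimmy describes, you can also nudge a re-crawl by pinging Google with your sitemap location. A small sketch — the endpoint shown is Google's sitemap-ping URL, and the example sitemap address is a placeholder:

```python
from urllib.parse import urlencode

GOOGLE_PING = "https://www.google.com/ping"

def sitemap_ping_url(sitemap_url):
    """Build the URL you'd GET to tell Google a sitemap has been updated."""
    return GOOGLE_PING + "?" + urlencode({"sitemap": sitemap_url})
```

Fetching the returned URL (e.g. with `urllib.request.urlopen`) asks Google to re-read the sitemap; it doesn't guarantee indexation, but it's a cheap extra signal on top of submitting the page itself.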