Is there a way to index important pages manually, or to make sure a certain page gets indexed in a short period of time?
-
Hi There!
The problem I'm having is that certain pages have already been waiting three months to be indexed, even though they have several backlinks. Is it normal to have to wait more than three months for these pages to get indexed? Is there anything I can do to make sure these pages get indexed soon?
Greetings
Bob
-
Hi Jimmy,
It seems that the page is not indexed yet. I fetched the page again and this time it said successful. Now the URL has been submitted to the index, so I'll see tomorrow whether the page gets indexed. Thank you very much for your help!
Greetings
Bob
-
Hi Bob,
Is the page currently partially indexed, or not indexed at all?
What is the exact message Google gives about the live chat?
If the on-page code is causing problems, then the good advice from Ben won't help until it is resolved.
Kind Regards
Jimmy
-
Hi there
I answered a question a few minutes ago with the same layout, so don't be offended if you run across these suggestions again!
Here are some suggestions to help the process:
- Check your server responses
- Check your robots.txt and meta tags
- Verify your site in Google & Bing Webmaster Tools
- Check and see if there are any issues in Webmaster Tools
- Update your sitemap and upload it to Google & Bing Webmaster Tools
- Make sure your site is crawlable for Googlebot
- Google also crawls through inbound links to your site - take a look at your Local SEO for some potentially quick and easy wins
- Check your internal links to make sure they are followed (not marked nofollow)
Running through those should help you find the issue rather quickly - hope this helps! Good luck!
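A couple of the checks above (server responses, robots.txt rules) can be scripted. Below is a minimal sketch using only Python's standard library `urllib.robotparser`; the robots.txt body and URLs are invented placeholders, not from the site in this thread:

```python
# Sketch: check whether a crawler is allowed to fetch a URL under a given
# robots.txt, using the Python standard library. The robots.txt content
# and URLs below are made-up examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /staging/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

def is_crawlable(url: str, agent: str = "Googlebot") -> bool:
    """Return True if the given user agent may fetch the URL."""
    return parser.can_fetch(agent, url)

print(is_crawlable("https://example.com/products/widget"))  # expect True
print(is_crawlable("https://example.com/staging/test"))     # expect False
```

Note that a user agent obeys its own most specific group, so here Googlebot follows only the `Googlebot` rules while other bots fall back to `*`.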
-
Hi Jimmy,
Thank you for answering the question I posted above. What I forgot to mention is that I already used Fetch as Google, but the page only gets partially indexed. Since then, the situation hasn't changed and the page still isn't indexed. Google says our live chat tool Zopim (an AJAX widget) blocks the crawling, while other pages with the same tool do get fetched successfully.
Bob
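One quick sanity check when an AJAX widget is blamed for a partial fetch: content injected by JavaScript does not exist in the server-delivered HTML, so a crawler reading the raw payload may never see it. The toy check below illustrates the idea; the HTML and text snippets are invented, not from the actual page in this thread:

```python
# Sketch: test whether a piece of content is present in the raw HTML
# (server-rendered) or would only appear after JavaScript runs, e.g. a
# live-chat widget injected client-side. All content here is made up.
raw_html = """
<html>
  <body>
    <h1>Product page</h1>
    <p>Full product description lives here.</p>
    <script src="https://chat-widget.example.com/loader.js"></script>
    <div id="chat-widget"></div>  <!-- chat UI injected by JS later -->
  </body>
</html>
"""

def visible_without_js(html: str, snippet: str) -> bool:
    """True if the snippet already exists in the raw HTML payload."""
    return snippet in html

print(visible_without_js(raw_html, "Full product description"))  # expect True
print(visible_without_js(raw_html, "Chat with us now"))          # expect False
```

If the content you need indexed passes this check, the widget itself is unlikely to hide it; the problem is more likely a blocked or erroring resource during rendering.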
-
Are the backlinks from external pages, rather than internal? If you can get an external backlink or two pointing at the page, that may help indexation speed. The easiest way, though the least reliable for ensuring indexation, would be to mention the new page on social media from a few accounts, including Google+.
-
Hi,
If you do a 'Fetch as Google' in Google Webmaster Tools (under the Crawl menu), there is a 'Submit to index' button once it successfully fetches the page.
Kind Regards
Jimmy
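For many new URLs at once, regenerating and resubmitting the XML sitemap (as in the checklist above) scales better than fetching one URL at a time. Here is a minimal generator sketch using the standard library; the URLs are placeholders:

```python
# Sketch: build a minimal XML sitemap for a list of URLs using only the
# standard library. The URLs below are made-up placeholders.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string for the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    "https://example.com/",
    "https://example.com/new-page",
])
print(xml)
```

The resulting file can then be uploaded and submitted in Google and Bing Webmaster Tools as usual.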