How can I get Google to index my new pages?
-
I have added many products to my ecommerce site, but most of them are still not indexed by Google. I submitted a sitemap a month ago, but the indexing process has been very slow. Is there any way to get Google to index my products or pages immediately? I can ping, but constantly pinging is not a good idea. Any other suggestions?
-
The Fetch as Google tool is not the same as the Submit URL tool I mentioned.
The pages that are not indexed yet may have accessibility issues. If so, you will see those URLs in the Issues and Warnings report in Webmaster Tools.
If your site runs on WordPress, you can install the BWP GXS plugin. It automatically sets a ping frequency for each section of the site, which helps get new pages indexed quickly.
Another approach is to share your site's URLs on social media or update the site with fresh content daily. Once Google's bots identify a site as static, it takes some time for them to increase the crawl and indexing frequency. So make sure you keep updating content on a regular basis over the next few days.
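If you do decide to ping, here is a minimal Python sketch of building the sitemap ping URL rather than submitting it by hand. The `https://www.google.com/ping` endpoint shown here is the one Google documented at the time (it has since been deprecated, so treat this as illustrative); the domain is a placeholder.

```python
from urllib.parse import quote

# Google's sitemap ping endpoint (deprecated since; Bing offered a similar one).
GOOGLE_PING = "https://www.google.com/ping?sitemap="

def build_ping_url(sitemap_url: str) -> str:
    """Return the ping URL that notifies Google of a sitemap update."""
    # The sitemap URL must be percent-encoded so it survives as a query value.
    return GOOGLE_PING + quote(sitemap_url, safe="")

# Hypothetical example domain:
print(build_ping_url("https://www.example.com/sitemap.xml"))
```

You could then fetch that URL with `urllib.request.urlopen` after each batch of new products, instead of pinging on every single change.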
-
Is Fetch as Google the same as your link?
Yes, I am checking each of my pages by searching site:, and the sitemap report shows 163 URLs submitted out of 336. The number increases daily, but it is taking too much time.
-
I think you should do the following:
1. Go to yourwebsite.com/sitemap.xml.
2. Check that all the XML files are accessible.
3. Check the Google Webmaster dashboard for sitemap issues. Google will usually report any major problems with your sitemap that prevent its robots from accessing it.
4. When you add new pages to the site, use Google's Submit URL tool: https://www.google.com/webmasters/tools/submit-url
This tells Google that you have added a new URL that needs to be crawled.
One more thing: are you sure Google has not indexed your pages? If you are not sure, do a Google search for site:yourwebsite.com.
Hope this helps.
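Steps 1 and 2 above can be scripted. Here is a minimal Python sketch that pulls the `<loc>` URLs out of a sitemap and checks that each responds with HTTP 200 (the XML namespace is the standard sitemaps.org one; the fetch step assumes your server answers plain GET requests):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def parse_sitemap(xml_text: str) -> list[str]:
    """Extract every <loc> URL from a sitemap (or sitemap index) document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]

def check_urls(urls):
    """Fetch each URL and print its HTTP status (200 = accessible)."""
    for url in urls:
        try:
            status = urllib.request.urlopen(url, timeout=10).status
        except Exception as exc:
            status = exc  # unreachable or error response
        print(url, status)
```

Running `check_urls(parse_sitemap(...))` against your own sitemap quickly shows whether any submitted URLs are returning errors, which would explain slow indexing.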
-
How do I conduct that SEO audit?
I have just submitted the site to Bing.
-
Chandu, the best way to let Google know about your pages is the sitemap, which you have already done. Since that has not worked, I suspect there may be technical issues with the website; being an ecommerce site, the chances are high. Conduct an SEO audit to find the potential problems and fix them.
Have you tried indexing the pages in Bing? If not, give it a try!
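One audit check you can script yourself is whether pages accidentally carry a noindex directive, a common cause of missing ecommerce pages. A minimal Python sketch (regex-based for brevity; a real audit would use an HTML parser and also inspect the `X-Robots-Tag` response header):

```python
import re

# Matches a robots meta tag whose content includes "noindex".
# Assumes the common attribute order (name before content).
NOINDEX_META = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html: str) -> bool:
    """True if the page's robots meta tag blocks indexing."""
    return bool(NOINDEX_META.search(html))

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True for this sample page
```

Feeding each product page's HTML through a check like this, alongside a crawl with a tool such as Screaming Frog, narrows down whether the problem is on your side or simply Google's crawl budget.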