How much time for re-indexing?
-
I was just checking Google Webmaster Tools and I found 102 pages with duplicate titles. I've just fixed them all.
Shall I re-submit the sitemap again, or how else do we tell Google about the changes? And how much time does it take for them to clear the SERP cache and re-index / re-count?
-
Here is one I "borrowed":
http://www.artdriver.co.uk/wp-content/uploads/2012/05/google-webmaster-tools-index-new-content.jpg
-
Which option?
Screenshot, please.
-
You can also give it a nudge by asking Google to recrawl in Webmaster Tools. There is an option in there to ask it to add a page to the index. No guarantees, and it's normally best to let Google do it in its own time.
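If you'd rather re-submit the sitemap programmatically instead of through the Webmaster Tools UI, Google has long accepted a simple "ping" URL. A minimal sketch (the function name and example sitemap URL are illustrative, not from this thread):

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url):
    """Build the URL used to notify Google that a sitemap has changed."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

# Requesting this URL (e.g. with urllib.request.urlopen) tells Google the
# sitemap was updated; it does not force an immediate recrawl.
print(sitemap_ping_url("http://www.example.com/sitemap.xml"))
```

As with the Webmaster Tools option, this is only a nudge — Google still recrawls on its own schedule.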
-
OK, thanks. So the key is to wait and, in the meantime, keep sharing and getting good social traffic!
-
Google is in no hurry. It takes approximately a month for Google to revisit all of your site's pages. The time is highly variable, based on how important Google considers each page on the site (e.g. how many links, tweets, etc. the page receives).
If you check Google after 30 days, you will likely find that most of the issue has been resolved, though it may take longer to be fully resolved. Sometimes Google may visit your site, run into a problem, and skip a page for another month. Other issues can arise as well.
If you are truly concerned about these pages, you can either promote them via social media or earn links to them. Otherwise, check after 30 days and the issue should be mostly resolved.
Related Questions
-
Blogs Not Getting Indexed Intermittently - Why?
Over the past 5 months many of our clients have been having indexing issues with their blog posts. A blog from 5 months ago could be indexed, and a blog from 1 month ago could be indexed, but blogs from 4, 3 and 2 months ago aren't. It isn't consistent, and there is no commonality across these clients that would point to why this is happening. We've checked sitemaps, robots, canonical issues and internal linking, combed through Search Console, run Moz reports and run SEMrush reports (sorry Moz), but can't find anything. We are now manually submitting URLs to be indexed to try to ensure they get into the index. For many of the URLs, Search Console reports that the blog has been fetched and crawled, but not indexed (with no errors). In some cases we find that the blog's paginated pages (i.e. blog/page/2, blog/page/3, etc.) are getting indexed but not the blog posts themselves. There aren't any nofollow tags on the links going to the blogs either. Any ideas? *I've added a screenshot of one of the URL inspection reports from Search Console.
Technical SEO | JohnBracamontes
-
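For a situation like this, one quick sanity check that can be scripted is verifying that neither the HTML nor the HTTP response headers carry a noindex directive. A minimal sketch (the function name is hypothetical; how you fetch each page's HTML and headers is up to you):

```python
import re

def has_noindex(html, headers):
    """Return True if a page opts out of indexing via meta tag or header."""
    # The X-Robots-Tag response header can carry noindex even when the HTML is clean.
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Look for <meta name="robots" content="...noindex..."> (attribute order may vary).
    for tag in re.findall(r"<meta[^>]+>", html, flags=re.I):
        if re.search(r'name=["\']robots["\']', tag, flags=re.I) and "noindex" in tag.lower():
            return True
    return False
```

Run it over every blog URL from the sitemap and any page that comes back True has found your culprit; if they all come back False, the directives really are clean and the problem lies elsewhere.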
Redirect indexed lightbox URLs?
Hello all, So I'm doing some technical SEO work on a client website and wanted to crowdsource some thoughts and suggestions. Without giving away the website name, here is the situation: The website has a dedicated /resources/ page. The bulk of the resources are industry definitions, each encapsulated in a colored box. When you click on a box, the definition opens in a lightbox with its own unique URL (e.g. /resources/?resource=augmented-reality). The information for these lightbox definitions is pulled from a normal resources page (e.g. /resources/augmented-reality/). Both of these URLs are indexed, leading to a lot of duplicate indexed content. How would you approach this?
Things to consider:
-The website is built on WordPress with a custom theme.
-I have no idea how to even find settings for the lightbox (will be asking the client today).
-Right now my thought is to simply disallow the lightbox URL in robots.txt and hope Google will stop crawling it and eventually drop it from the index.
-I've considered adding the main resource page's canonical to the lightbox URL, but it appears to be dynamically created and thus there is no place to access it (outside of the FTP, I imagine?).
I'm most rusty with stuff like this, so figured I'd appeal to the masses for some assistance. Thanks! -Brad
Technical SEO | Alces
-
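If you do go the robots.txt route, the pattern for those lightbox URLs would look something like the sketch below. One caveat worth knowing: disallowing crawling stops Google from seeing anything on those pages (including a canonical tag, if you later add one), and it does not by itself remove URLs that are already indexed.

```
# robots.txt - block crawling of the lightbox query-string URLs only
User-agent: *
Disallow: /resources/?resource=
```

The static definition pages (e.g. /resources/augmented-reality/) stay crawlable, so they remain the indexable copy.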
Old url is still indexed
A couple of months ago we requested a change of address in Search Console. The new, correct url is already indexed. Yet when we search for the old url (with site:www.) we find that the old url is still indexed. In Google Webmaster Tools the number of indexed pages has been reduced to 1. Is there another way to remove old urls?
Technical SEO | conversal
-
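Alongside the Search Console change-of-address request, the most reliable signal for getting old URLs dropped is a site-wide 301 redirect from the old host to its new-domain equivalent. An Apache sketch (the domain names are placeholders, and this assumes the old domain's server runs Apache with mod_rewrite):

```
# .htaccess on the old domain: 301 every URL to the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

Each time Google recrawls an old URL and sees the 301, it transfers that entry to the new URL; the old ones fade out of the index over time.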
Site Not Being Indexed
Hey everyone - I have a site that is being treated strangely by Google (at least strangely to me). The site has 24 pages in the sitemap, submitted to WMT over 30 days ago. I've manually triggered Google to crawl the homepage and all connecting links as well, and submitted a couple individually. Google has parked the indexing at 14 of the 24 pages. None of the unindexed URLs have noindex or nofollow tags on them - they are clearly and easily linked to from other places on the site. The site is a brand new domain, has no manual penalty history and, from my research, has no reason to be considered spammy: 100% unique handwritten content. I cannot figure out why Google isn't indexing these pages. Has anyone encountered this before? Know any solutions? Thanks in advance.
Technical SEO | CRO_first
-
Not All Submitted URLs in Sitemap Get Indexed
Hey guys, I just noticed that about 20% of the URLs submitted in my sitemap don't get indexed, at least when I check in Webmaster Tools: there is about a 20% difference between the submitted and indexed URLs. However, as far as I can see, Webmaster Tools doesn't tell me which specific URLs from the sitemap are not indexed, right? Therefore I checked every single page in the sitemap manually by putting site:"URL" into Google, and every single page of the sitemap shows up. So in reality every page should be indexed - but why does Webmaster Tools show something different? Thanks for your help on this 😉 Cheers
Technical SEO | _Heiko_
-
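Checking every sitemap URL by hand gets tedious; the list can be pulled out of the sitemap XML with a few lines of Python (a sketch, assuming a standard sitemaps.org-format file):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs from a sitemaps.org-format sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", SITEMAP_NS)]
```

Each returned URL can then be fed to a site:"URL" check, or to an HTTP client to confirm it responds 200 without a noindex directive.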
Anything new in determining how many of a site's pages are in Google's supplemental index vs the main index?
Since site:mysite.com *** -sljktf stopped working to find pages in the supplemental index several years ago, has anyone found another way to identify content that has been relegated to the supplemental index?
Technical SEO | SEMPassion
-
Duplicate page content - index.html
Roger is reporting duplicate page content for my domain name and www.mydomainname/index.html. Example: www.just-insulation.com and www.just-insulation.com/index.html. What am I doing wrong, please?
Technical SEO | Collie
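A common fix for this is a 301 redirect from /index.html to the root URL, so both addresses consolidate to one. A sketch for Apache (assuming the site runs on Apache with mod_rewrite enabled):

```
# .htaccess: redirect direct browser requests for /index.html to /
RewriteEngine On
RewriteCond %{THE_REQUEST} \ /index\.html[?\ ]
RewriteRule ^index\.html$ / [R=301,L]
```

The RewriteCond on THE_REQUEST matters: it limits the redirect to requests that literally asked for /index.html, so Apache's own internal DirectoryIndex lookup of index.html when serving / doesn't trigger a redirect loop.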