Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How long does it take for Google to see changes to a site?
-
Hi,
I have a low-PR site (PR 1) that I am starting to work on. In general, when you make changes to a site, how long does it take Google to recognize and index those changes?
The reason I am wondering is that the site I am working on had a lot of duplicate content (around 700 pages). I got rid of it all, but I'm not sure how long it will take Google to re-crawl and re-index those pages, since the site has low PR.
Thanks,
Ken
-
I've started a new blog from scratch with optimized content, and it only took 5 days to be indexed (on a 1-year-old site). That time should shorten with more consistent posting, for almost any site. As for the 700-page website, I'd guess that Google learns to trust a new website faster than it re-trusts a website it has already learned to distrust because of duplicate content. I'm still a novice, though.
-
Hi Ken,
That really depends on the website and how often it is updated. If a site is updated often, Google increases the rate at which it crawls it.
The easiest thing to do, presuming you have a Webmaster Tools account set up for the website, is to submit the site to Google's index.
To do this, go to Webmaster Tools:
- In the left-hand menu select 'Diagnostics'
- Then click on 'Fetch as Googlebot'
- Click 'Fetch' toward the top of the screen (this will fetch your homepage)
- After a few seconds the fetch status column will change to 'Success'
- Next to this column you will have the option to 'Submit to index'
- You will then have the choice of submitting either this URL, or this URL and all linked pages
- Select the latter and click 'Submit'
This should speed up your indexing!
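If the site also has an XML sitemap, another nudge you can give Google is to ping it whenever the sitemap changes. Below is a minimal Python sketch with a placeholder sitemap URL; note that Google has since retired this ping endpoint in favour of Search Console, so treat it as period-appropriate rather than current advice.

```python
import urllib.parse
import urllib.request

# Placeholder sitemap URL -- substitute your own.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

def ping_google(sitemap_url):
    """Ask Google to re-read an updated sitemap via the (now retired) ping endpoint."""
    ping_url = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
    with urllib.request.urlopen(ping_url) as response:
        return response.getcode()  # 200 meant the ping was accepted

if __name__ == "__main__":
    print(ping_google(SITEMAP_URL))
```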
Hope this helps.
Elias
Related Questions
-
The particular page cannot be indexed by Google
Hello, Smart People!
We need help solving a problem with Google indexing. All pages of our website are crawled and indexed, and all pages, including this one, meet Google's requirements and are indexable. However, this one page is still not indexed.
Robots.txt is not blocking it.
It does not have a "nofollow" tag.
It is in the sitemap file.
It has internal links from indexed pages.
We have requested indexing many times, and it is still grey.
The page was created one year ago.
We are open to any suggestions or guidance you may have. What else can we do to expedite the indexing process?
On-Page Optimization | Viktoriia1805
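One way to rule out the usual technical culprits is to fetch the page and check the signals Google actually reads: the HTTP status, the X-Robots-Tag header, the robots meta tag, and the canonical link. A rough Python sketch follows; the URL is a placeholder and the string checks are deliberately crude.

```python
import urllib.request

# Placeholder URL for the page that will not get indexed.
PAGE_URL = "https://www.example.com/problem-page/"

def check_index_signals(url):
    """Print the on-page and HTTP signals that most often keep a page out of the index."""
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(request) as response:
        status = response.getcode()
        x_robots = response.headers.get("X-Robots-Tag", "(not set)")
        html = response.read().decode("utf-8", errors="replace").lower()

    print("HTTP status:  ", status)       # anything other than 200 is suspect
    print("X-Robots-Tag: ", x_robots)     # a 'noindex' here blocks indexing
    # Crude string checks -- a real audit should parse the HTML properly.
    print("meta noindex: ", "noindex" in html and 'name="robots"' in html)
    print("has canonical:", 'rel="canonical"' in html)  # make sure it points at this URL

if __name__ == "__main__":
    check_index_signals(PAGE_URL)
```
-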
Site speed
I use mid-quality pics and... but my site speed is low. Any suggestions?
My site is https://bandolini.ir/
On-Page Optimization | zlbvasgabc
-
Virtual URL not being indexed by Google?
Dear all,
We have two URLs. The main URL, which is crawled by GSC and where Moz assigns our keywords, is https://andipaeditions.com/banksy/. The second one is called a virtual URL by our developers: https://andipaeditions.com/banksy/signedandunsignedprintsforsale/. The second URL is currently not indexed by Google. We have been linking to it, and I am unable to see whether it is passing any link juice on to the main /banksy/ page. Is it a canonical? The /banksy/ URL is the one being picked up in the SERPs and by Moz, and I worry that the two similar URLs are splitting the signal. Should I redirect the second to the first? Thank you
On-Page Optimization | TAT1000
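One way to answer the "is it a canonical?" part yourself is to fetch the second URL and read the rel=canonical it declares; if it points at /banksy/, Google is being told to consolidate the two. A rough Python sketch follows, using the URLs from the question; the regex is a quick check, not a full HTML parse, and assumes rel appears before href in the tag.

```python
import re
import urllib.request

# URLs taken from the question above.
MAIN_URL = "https://andipaeditions.com/banksy/"
VIRTUAL_URL = "https://andipaeditions.com/banksy/signedandunsignedprintsforsale/"

def canonical_of(url):
    """Return the rel=canonical target a page declares, or None if there isn't one."""
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(request) as response:
        html = response.read().decode("utf-8", errors="replace")
    # Simple pattern -- assumes rel comes before href in the <link> tag.
    match = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)
    return match.group(1) if match else None

if __name__ == "__main__":
    print("virtual page canonical ->", canonical_of(VIRTUAL_URL))
    print("main page canonical    ->", canonical_of(MAIN_URL))
```
-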
Linking to External Site In Nav Bar
Hi, we are a celebrity site, but we also own a separate sports site with its own URL. We have a link to that site in our nav bar. Are we being penalized for having that link? Thanks
On-Page Optimization | Uinterview0
-
When I change templates, why does traffic go down?
I've noticed that when I change my blog's template, traffic goes down dramatically, a decrease of about 40%. I know that new themes can have some problems, but I have tried this with two different themes. The first try was with the Genesis framework (a paid one): within a day traffic went down, and when I reverted to the old theme it returned to normal. Should I wait a week to see what happens? What could be the potential reason for this?
On-Page Optimization | hammadrafique0
-
Is it bad to include Google Maps in the footer?
We have 5 locations and were thinking about including a map for each location in the footer. These would be set up as nofollow links. They could enhance the user experience, but they would also increase the size of the footer. Right now there are just basic links to pages (sitemap, terms, etc.), contact info, social links, and a contact form. If we added the maps, the footer would also include links to the individual location pages. We're not sure if we are doing too much in the footer or should just keep it basic. Thanks for the help!
On-Page Optimization | Restore0
-
Blocking Subdomains from Google Crawl and Index
Hey everybody, how is it going? I have a simple question that I need answered. I have a main domain; let's call it domain.com. Our company will soon launch a series of promotions for which we will use CNAME subdomains, e.g. try.domain.com or buy.domain.com. They will serve a commercial objective, nothing more. What is the best way to block such subdomains from being indexed by Google, and from being counted as part of domain.com: robots.txt, nofollow, etc.? Hope to hear from you. Best Regards,
On-Page Optimization | JesusD3
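For promo pages like these, keep in mind that robots.txt only blocks crawling; a blocked URL can still appear in results if other pages link to it. Keeping the subdomains crawlable but sending a noindex signal is usually the safer route. Below is a minimal sketch, assuming (hypothetically) that the promo subdomains are served by a small Python/Flask app; the same X-Robots-Tag header can be set from any web server.

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_noindex_header(response):
    # Tell search engines not to index anything served by this promo subdomain,
    # and not to follow its links, without blocking the crawl in robots.txt.
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def landing():
    return "Promo landing page"

if __name__ == "__main__":
    app.run()
```
-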
Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
We've been forced (for scalability) to completely restructure our website in terms of setting out a hierarchy. For example - the old structure : country / city / city area Where we had about 3500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc in that city : We needed to change the structure to be : country / region / area / city / cityarea So as patr of the change we put in place lots of 301s for the permanent movement of pages to the new structure and then we tried to actually change the physical on-page links too. Unfortunately we have left a good 600 or 700 links that point to the old pages, but are picked up by the 301 redirect on page, so we're slowly going through them to ensure the links go to the new location directly (not via the 301). So my question is (sorry for long waffle) : Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually? Thanks for any help anyone can give.
On-Page Optimization | | TinkyWinky0
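A script can take some of the manual work out of finding the stragglers: fetch a page, collect its internal links, and flag any that still resolve through a redirect. Below is a rough Python sketch; the start URL is a placeholder, and it audits a single page rather than crawling the whole site.

```python
import urllib.parse
import urllib.request
from html.parser import HTMLParser

# Placeholder URL -- point this at one of your restructured pages.
START_URL = "https://www.example.com/country/region/area/city/"

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_redirected_links(page_url):
    """Return (link, final_url) pairs for internal links that still go through a redirect."""
    with urllib.request.urlopen(page_url) as response:
        parser = LinkCollector()
        parser.feed(response.read().decode("utf-8", errors="replace"))

    host = urllib.parse.urlparse(page_url).netloc
    redirected = []
    for href in parser.links:
        absolute = urllib.parse.urljoin(page_url, href)
        if urllib.parse.urlparse(absolute).netloc != host:
            continue  # only audit internal links
        with urllib.request.urlopen(absolute) as response:
            final = response.geturl()  # urlopen follows redirects; this is the landing URL
        if final != absolute:
            redirected.append((absolute, final))
    return redirected

if __name__ == "__main__":
    for old_link, new_target in find_redirected_links(START_URL):
        print(f"{old_link} -> {new_target}")
```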