Search Console - Best practice to fetch pages when you update them?
-
Hi guys,
If you make changes to a page, e.g. add more content, is it good practice to get Google to fetch that page again in Search Console?
My assumption is that this way Google can review the updated page sooner, resulting in faster changes in the SERPs for that page.
Thoughts?
Cheers.
-
Yup. If you fetch it, the updated version is more likely to appear fresh in the index; if you don't, you'll have to wait for Google to recrawl your site on its own schedule. This doesn't mean your site is instantly indexed the moment you fetch it; you're just sending Google a signal that you want the page recrawled, and you'll see a status next to the page.
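If you have many updated URLs and don't want to click "Fetch" for each one, the same nudge can be scripted. Below is a minimal sketch using Google's Indexing API; note that Google officially documents that API only for job-posting and livestream pages, so whether it helps ordinary pages is an assumption, and the key-file path and URL are placeholders:

```python
# Minimal sketch: ask Google to recrawl a URL via the Indexing API.
# Assumes a service account with the Indexing API enabled; the key file
# path and page URL below are placeholders, not values from this thread.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/indexing"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("indexing", "v3", credentials=credentials)

# URL_UPDATED tells Google the page content changed and should be recrawled.
response = service.urlNotifications().publish(
    body={"url": "https://www.example.com/updated-page", "type": "URL_UPDATED"}
).execute()
print(response)
```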
Related Questions
-
Submitting URLs After New Search Console
Hi everyone, I wanted to see how people submit their URLs to Google and make sure they are all being indexed. I currently have an ecommerce site with 18,000 products. I have sitemaps set up, but I've noticed that the various product pages haven't started ranking yet. If I submit an individual URL through the new Google Search Console, I see the page ranking within minutes. Before the new Google Search Console, you could just ask Google to Fetch/Render an XML sitemap and crawl all the links in it. I don't see the same functionality in the new Search Console and was wondering if there are any new techniques people could share. Thanks,
Intermediate & Advanced SEO | abiondo | Anthony1 -
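On the sitemap point: the old Fetch/Render-the-sitemap trick is gone, but you can still (re)submit a sitemap programmatically via the Webmasters API, which asks Google to re-read it and schedule its URLs for crawling. A minimal sketch, assuming a service account that has been granted access to the Search Console property (the site and sitemap URLs are placeholders):

```python
# Minimal sketch: resubmit an XML sitemap via the Google Webmasters API v3.
# The site URL and sitemap path are placeholders; this assumes a service
# account that has been added as a user of the Search Console property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("webmasters", "v3", credentials=credentials)

# Submitting (or resubmitting) a sitemap asks Google to re-read it and
# schedule its URLs for crawling; it does not force instant indexing.
service.sitemaps().submit(
    siteUrl="https://www.example.com/",
    feedpath="https://www.example.com/sitemap.xml",
).execute()
```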
Pillar pages and blog pages
Hello, I was watching this video about pillar pages https://www.youtube.com/watch?v=Db3TpDZf_to and tried to apply it to my own site, but I find it impossible to do (though maybe I am looking at it the wrong way). Let's say I want to rank for "Normandy bike tour". I created a pillar page about "Normandy bike tour"; what would the topics of the subpages boosting that pillar page be? I know they should be questions people have, but in the tourism industry they don't have any; they just want us to make them dream! I thought about doing more general blog pages about things such as: places to rent a bike in Normandy or in XYZ city (related to biking), or the landing sites in Normandy (not related to biking). Is that the way to do it? What do you recommend? Thank you,
Intermediate & Advanced SEO | seoanalytics0 -
What's the best practice for an acquisition?
Hi, my company has just bought out a competitor. We want to dissolve their website and, if possible, steal some of their link juice. The site hasn't got any spammy links or 404s, so I'm not worried in that department. What I am not sure about is which of the following is best practice?
a. Redirect every single page (even pages like /?checkout) to a relevant page on our website.
b. Only redirect important pages (category pages, contact pages, etc.) and leave the other pages to 404.
c. Redirect the important pages to a relevant URL and redirect the less important pages to our homepage.
d. Redirect the entire domain to our homepage (I assume this isn't a good idea).
e. Don't redirect any of the pages; just delete the site.
Intermediate & Advanced SEO | DannyHoodless0 -
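For what it's worth, the usual recommendation is close to option (c) minus the homepage catch-all: 301 each old page to its closest equivalent and let genuinely valueless URLs 404, since Google tends to treat blanket redirects to the homepage as soft 404s. If the old site is large, a small script can turn a mapping file into redirect rules; a hedged sketch (the CSV file name and its two-column old-path,new-URL layout are assumptions for illustration):

```python
# Hypothetical helper: turn a CSV of old-path,new-url pairs into
# Apache .htaccess 301 rules. "redirects.csv" is a placeholder and is
# assumed to have exactly two columns per row.
import csv

with open("redirects.csv", newline="") as src, open("redirects.htaccess", "w") as out:
    for old_path, new_url in csv.reader(src):
        # Redirect 301 is Apache mod_alias syntax: old path -> absolute new URL.
        out.write(f"Redirect 301 {old_path} {new_url}\n")
```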
SEO Best Practices regarding Robots.txt disallow
I cannot find hard-and-fast direction on the following issue: the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site, so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt (Disallow: /Account/ and Disallow: /?search=). Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt ("Sitemap contains URLs which are blocked by robots.txt"), and it seems I wouldn't want that many URLs blocked. Thank you!!
Intermediate & Advanced SEO | jamiegriz0 -
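Before unblocking anything, it's worth confirming exactly which URLs the current rules catch. Python's standard-library robots.txt parser can do that; a quick sketch (the domain and sample paths are illustrative, not from this thread):

```python
# Quick check: which URLs does the live robots.txt actually block for Googlebot?
# The domain and sample paths below are illustrative placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

for url in [
    "https://www.example.com/Account/login",
    "https://www.example.com/?search=widgets",
    "https://www.example.com/products/widget",
]:
    print(url, "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```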
Mobile Search Results Include Pages Meant Only for Desktops/Laptops
When I put in site:www.qjamba.com on a mobile device, it comes back with some of my mobile-friendly pages for that site (same URL for mobile and desktop, just different formatting), and that's great. HOWEVER, it also shows a whole bunch of pages (not identified by Google as mobile-friendly) that are fine for desktop users but are not supposed to exist for mobile users, because they are too slow. Until a few days ago those pages were being redirected to the home page for mobile users; I have since changed that to 404 Not Found. Do we know whether Google keeps a mobile index separate from the desktop index? If so, I would think the 404 should work. How can I test whether the 404s will remove a URL so it DOESN'T appear on a mobile device (when I put in site:www.qjamba.com or a user searches) but DOES appear on a desktop for the same query?
Intermediate & Advanced SEO | friendoffood0 -
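One way to test the mobile side without waiting on the index is to request the page with Googlebot's smartphone user-agent and confirm the 404 comes back. A minimal sketch (the URL is a placeholder, and the UA string approximates the one Google documents for its smartphone crawler):

```python
# Sketch: verify the server returns 404 to Google's smartphone crawler.
# The URL is a placeholder; the user-agent string approximates the one
# Google documents for Googlebot Smartphone.
import requests

MOBILE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

resp = requests.get(
    "https://www.example.com/desktop-only-page",
    headers={"User-Agent": MOBILE_UA},
    allow_redirects=False,
)
print(resp.status_code)  # expect 404 if mobile users are correctly cut off
```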
Best practice for sitemaps?
Is it necessary or good practice to list "static" site routes in the sitemap, i.e. /about, /faq, etc.? Some large sites (e.g. Vimeo) only list the 'dynamic' URLs (in their case the actual videos). If there are URLs NOT listed in a sitemap, will these continue to be indexed? What is good practice for a sitemap index? When submitting a sitemap to e.g. Webmaster Tools, can you just submit the index file (which links to the secondary sitemaps)? Does it matter which order the individual sitemaps are listed in within the index?
Intermediate & Advanced SEO | shawn810 -
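On the index-file question: submitting just the index is generally enough, since the crawler follows it to each child sitemap, and the order of entries isn't known to matter. A minimal sketch of generating such an index with the Python standard library (file names are placeholders):

```python
# Sketch: build a sitemap index that points at secondary sitemaps.
# File names and the domain are placeholders; submit only the index file.
from xml.etree.ElementTree import Element, SubElement, ElementTree

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
index = Element("sitemapindex", xmlns=NS)
for name in ["sitemap-static.xml", "sitemap-products.xml", "sitemap-blog.xml"]:
    sm = SubElement(index, "sitemap")
    SubElement(sm, "loc").text = f"https://www.example.com/{name}"

ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)
```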
Are links to on-page content crawled, and do they have any effect on PageRank?
Let's say I have a really long article that begins with links to anchors on the same page, e.g. Chapter 1, Chapter 2, etc., allowing the user to scroll down to different content. There are also other links on this page that link to other pages. A few questions: Googlebot arrives on the page. Does it crawl links that point to anchors on the same page? When link juice is divided among all the links on the page, do these links count, so that PageRank is lost? Thanks!
Intermediate & Advanced SEO | anthematic0 -
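One detail worth noting: a #fragment never reaches the server, so a link to an on-page anchor resolves to the same URL rather than a separate crawl target, which Python's standard URL tools make visible (the example URL below is hypothetical):

```python
# Illustration: a link to an on-page anchor resolves to the same URL,
# so it is not a distinct crawl target. The example URL is hypothetical.
from urllib.parse import urldefrag

url, fragment = urldefrag("https://www.example.com/long-article#chapter-2")
print(url)       # https://www.example.com/long-article
print(fragment)  # chapter-2
```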
Best way to stop pages being indexed and keeping PageRank
On a discussion forum, for example, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed, without diluting PageRank? If we added them to the Disallow rules in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
Intermediate & Advanced SEO | Peter2640
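The common advice for this situation is a noindex robots directive rather than robots.txt: a disallowed page can't be crawled at all, so nothing flows through it, whereas a crawlable page marked noindex, follow can still be fetched and its links followed. A hedged sketch of setting this via an X-Robots-Tag header in Flask (the route paths are made up for illustration):

```python
# Sketch: keep forum posting pages out of the index while still letting
# the crawler fetch them and follow their links. Route paths are made up.
from flask import Flask, request

app = Flask(__name__)

@app.route("/post/new")
def new_post():
    return "posting form goes here"

@app.after_request
def add_robots_header(response):
    # noindex, follow: drop the page from the index but keep following links.
    if request.path.startswith("/post/"):
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response
```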