How can I get Google to crawl my site daily?
-
I was wondering if there is a trick to getting Google to crawl my website daily?
-
Great advice from everyone!!! Thanks!
Saibose - If I add a Twitter, FB, or blog feed to my site, and it updates daily, will this help the rate at which Google crawls my site? Does it matter that it is embedded content?
-
Blogs, tweets and other social mentions are always good for a fresh crawl.
-
The best way is to enable the creation of unique content daily. UGC and user participation, added on top of that unique content, will improve crawlability. If you want to go a bit high-tech, you can serve the fresh content from a web cache so Google's bots can visit you more frequently. Also, make sure that your older/archived pages get some content cycled in (new comments, new tweets, etc.).
-
Your sitemap should show a daily change frequency, but only for pages you intend to change daily. I got the idea from a Matt Cutts video that they only honour the sitemap if they find it to be reliable. If you say daily but never update the content, then it's not going to work.
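For illustration, a minimal sitemap entry declaring a daily change frequency might look like the sketch below (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only declare "daily" for pages that genuinely change daily -->
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Remember that changefreq is a hint, not a command; reliability is what earns the faster crawl.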
-
A proper sitemap that tells the search engines to come back can help.
Write content daily. If you have a blog, ping Google when you post a new piece of content. If you show Google that you write content daily, they'll come back.
I did SEO for a very successful conservative news website, and Google crawled them 15-20 times a day and sometimes indexed things within seconds.
Make good use of pinging!
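For anyone unfamiliar with pinging: Google has historically exposed a simple sitemap ping endpoint, so notifying it after publishing can be a one-liner (the sitemap URL below is a placeholder, and many blog platforms such as WordPress ping automatically):

```
# Tell Google the sitemap has been updated
# (URL-encode the sitemap parameter if it contains special characters)
curl "http://www.google.com/ping?sitemap=https://www.example.com/sitemap.xml"
```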
Related Questions
-
Client suffered a malware attack. Removed links not being crawled by Google!
Hi all, My client suffered a malware attack a few weeks ago where an external site somehow created 700-plus links on my client's site with their content. I removed all of the content and redirected the pages to the home page. I then created a new temporary XML sitemap with those 700 links and submitted the sitemap to Google 9 days ago. Google has crawled the sitemap a few times but not the individual links. When I click on the crawl report for the sitemap in GSC, I see that the individual links still have the last crawled date from before they were removed. So in Google's eyes, that old malicious content still exists. What do I do to ensure Google knows the content is gone and redirected? Thanks!
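One approach sometimes suggested in this situation is to return 410 Gone for the injected URLs instead of redirecting them to the home page, since 410 is an explicit removal signal. A minimal Apache sketch, assuming the spam URLs shared a common prefix (the /injected-spam/ pattern below is hypothetical):

```apache
# .htaccess — serve 410 Gone for the injected spam URLs
# /injected-spam/ is a hypothetical prefix; adjust to the real URL pattern
RedirectMatch gone ^/injected-spam/.*$
```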
Technical SEO | sk1990
-
Does anyone know if the linking of hashtags on Wix sites negatively or positively impacts SEO? It is coming up as an error in site crawls ('Pages with 404 errors'). Anyone got any experience please?
Does anyone know if the linking of hashtags on Wix sites negatively or positively impacts SEO? It is coming up as an error in site crawls ('Pages with 404 errors'). Anyone got any experience please? For example, at the bottom of this blog post https://www.poppyandperle.com/post/face-painting-a-global-language the hashtags are linked, but they don't go to a page; they go to search results of all other blogs using that hashtag. Seems a bit of a strange approach to me.
Technical SEO | Mediaholix
-
Can Google Read schema.org markup within Ajax?
Hi All, as a local business directory, we also display Openinghours on a business listing page. ex. http://www.goudengids.be/napoli-kontich-2550/
Technical SEO | TruvoDirectories
At the same time, I also have schema.org openingHours markup implemented.
But, for technical reasons (performance), the opening hours (and the markup alongside) are displayed using AJAX. I'm wondering if Google is able to read the markup. The rich snippet tool and markup plugins like Semantic Inspector can't "see" the openingHours markup. Any advice here?
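If Google can't execute the AJAX, one common workaround is to emit the opening hours as static JSON-LD in the initial HTML, independent of the AJAX widget. A minimal sketch (the business name and hours below are placeholders):

```html
<!-- Server-rendered JSON-LD, readable by crawlers without running the AJAX -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "openingHours": ["Mo-Fr 09:00-17:00", "Sa 09:00-12:00"]
}
</script>
```
-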
Can you have a /sitemap.xml and /sitemap.html on the same site?
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community! My question: Since the file extensions are different, can a site have both a /sitemap.xml and /sitemap.html, both sitting at the root domain? For example, we've already put the HTML sitemap in place here: https://www.pioneermilitaryloans.com/sitemap Now, we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but am wondering if this will cause conflicts. I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this. What do you think?
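For what it's worth, the two files don't conflict; they are simply different resources with different names. If you do add the XML sitemap, you can advertise it in robots.txt while the HTML sitemap stays a normal linked page; a minimal sketch (assuming the XML file ends up at the root as planned):

```
# robots.txt — the Sitemap line points crawlers at the XML sitemap;
# the HTML sitemap at /sitemap is for human visitors and needs no declaration
User-agent: *
Sitemap: https://www.pioneermilitaryloans.com/sitemap.xml
```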
Technical SEO | PioneerServices
-
Google Showing Multiple Listings For Same Site?
I've been optimizing a small static HTML site and have been working to increase the keyword rankings, yet have always ranked #1 for the company name. But I've now noticed the company name is taking more than just the first position - the site is now appearing in 1st, 2nd, and 3rd position (each position referencing a different page of the site). Great... who doesn't want to dominate a page of Google! But it looks kind of untidy and not usually how links from the same site are displayed. Is this normal? I'm used to seeing results from the same site grouped under the primary result, but not like this. Any info appreciated 🙂
Technical SEO | GregDixson
-
What to do about Google Crawl Error due to Faceted Navigation?
We are getting many crawl errors listed in Google Webmaster Tools. We use some faceted navigation with several variables. Google sees these as a "500" response code. It looks like Google is truncating the URL. Can we tell Google not to crawl these search results using part of the URL ("sort=", for example)? Is there a better way to solve this?
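If you decide to block the parameterised URLs, Googlebot supports wildcards in robots.txt; a minimal sketch using the "sort=" parameter mentioned above (adjust the pattern to your actual URLs):

```
# Ask Googlebot not to crawl faceted URLs containing the sort parameter
User-agent: Googlebot
Disallow: /*sort=
```

Google Webmaster Tools also offers a URL Parameters setting that can tell Google how to handle parameters like this without blocking them outright.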
Technical SEO | EugeneF
-
What tool can I use to get the true speed of my site?
Hi, I am trying to get the true speed of my site. I want to know how fast www.in2town.co.uk is, but the tools I am using are giving me different readings. http://tools.pingdom.com/fpt/#!/DkHoNWmZh/www.in2town.co.uk says the speed is 1.03s, http://gtmetrix.com/reports/www.in2town.co.uk/i4EMDk34 says my speed is 2.25s, and http://www.vertain.com/m.q?req=cstr&reqid=dAv79lt8 says it is 4.36s, so as you can see I am confused. I am trying to get the site as fast as possible, but need to know what the correct speed is so I can work on things that need changing to make it faster. Can anyone also let me know what speed I should be aiming for? Many thanks.
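Part of the confusion is that each tool measures from a different location over a different connection, so no single number is "the" speed. One way to take your own baseline for the HTML response alone is a timed request; a minimal sketch with curl:

```
# Time the homepage HTML transfer (excludes images, CSS, and JavaScript)
curl -o /dev/null -s -w "Total: %{time_total}s  First byte: %{time_starttransfer}s\n" http://www.in2town.co.uk/
```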
Technical SEO | ClaireH-184886
-
Should we use Google's crawl delay setting?
We've been noticing a huge uptick in Google's spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (Google spider vs. organic traffic). In other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest from Google, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic was between 6:1 and 10:1. Is requesting a crawl delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
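Worth noting before you try it: Googlebot has historically ignored the Crawl-delay directive in robots.txt; Google's crawl rate is instead limited via the setting in Webmaster Tools. For crawlers that do honour the directive, it is a one-liner (the 10-second value below is an arbitrary example):

```
# Ask compliant crawlers to wait 10 seconds between requests
# (Googlebot ignores this; use the Webmaster Tools crawl rate setting instead)
User-agent: *
Crawl-delay: 10
```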
Technical SEO | lzhao