How long does it take for an article or a page to be listed by Google?
-
Hi, my question is a two-parter. I think I must be doing something wrong.
My sitemap is set to show the different sections of my site, while on my old site the sitemap listed every single article. I am not sure if setting it to each section is correct; can someone please advise me on this?
The second part of the question is: how long does it take for an article to be listed by Google?
This article on my site was written today http://www.in2town.co.uk/lifestyle/holidaymakers-ignore-the-importance-of-travel-insurance-according-to-survey
Holidaymakers Ignore The Importance of Travel Insurance According To Survey
But when I check to see if Google has listed the article yet by searching for the whole title, it does not come up. I even added the website name at the end and it still did not come up.
This is worrying me a bit, as a lot of my articles are news stories, which means they are current articles; if Google is not picking them up, then no one else will be.
Can anyone let me know what I should be doing so Google picks them up more quickly, please?
-
If you add new content every day, you will start to get crawled every day.
-
The huge problem I have is getting the news pages picked up straight away; this has been a big headache of mine. There is no point in a news page being read two days later, when it is old news.
I need to find a way to promote the latest news on my site and get it picked up by Google.
-
Bing's Duane Forrester said that you should not list every page, only the important pages, but when I asked him about this he said that for a small site it is OK to list every page.
A sitemap does not mean that the pages it lists will be indexed, nor does it mean that pages not included won't be indexed. It is a chance to give the search engines some info about the pages, like change frequency, last modified date, priority and such. It is also a signal of the canonical version of a page.
It is also worth noting that Bing will ignore a sitemap if it is not honest: if you mark pages as updated daily but don't actually update them, they will lose trust in it.
As for how long it takes to get listed, anywhere up to a month in most cases. In Bing Webmaster Tools you can submit a URL directly into the index and it will appear in results shortly after; I have been told you can do the same in GWMT using Instant Previews or Fetch as Googlebot (I can't remember which).
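To illustrate the fields mentioned above (change frequency, last modified, priority), a minimal sitemap entry following the sitemaps.org protocol looks like this; the URL and dates are placeholders, not pages from this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/lifestyle/some-article</loc>
    <lastmod>2012-05-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>` and `<priority>` are the optional hints the engines may or may not honour, which is why an honest `<changefreq>` matters.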
-
I'm not a Joomla expert, so your best bet is to check with someone who is. However, there are Joomla extensions you can use to automate the generation of your sitemap so you don't have to do it manually every time.
Which one to use is something I'm not prepared to recommend, because I am not up to speed enough on Joomla.
-
Hi Alan, this is great. Can you explain more? I use Joomla, so I am not sure how to properly set up the sitemap.
This is the sitemap I am using:
http://www.in2town.co.uk/sitemap-xml?sitemap=1
Can you explain what I need to do to make sure that all articles are included, and should I put the sitemap on my site or leave it in Google Webmaster Tools?
-
Diane,
a sitemap.xml file should include links to every page on the site you want indexed. While Google and Bing are fairly good at discovering content, this helps ensure they find pages sooner than their crawler might get around to discovering them on its own. (Unless you have a site with more than 10,000 URLs, at which point you should consider splitting the sitemap into multiple files and submitting a separate sitemap index file that references them.)
That then leads to the next question - how often? Every site is different and crawled at a different frequency based on Google's assessment of how often it should happen as well as factoring in that their system can only crawl so many pages on any given day.
That alone is reason to include all your content in sitemap files - and automatically ping search engines each time the sitemap file is updated.
If you have enough "news quality" content, look into a separate news sitemap file as well. With the right footwork and leverage, you can then see if your news specific content can be indexed even faster, and included in the Google news system as well.
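The automation described above (regenerate the sitemap, then ping the search engines) can be sketched roughly as follows. This is a hand-rolled illustration, not a Joomla extension: the page URLs are placeholders, and the `build_sitemap` / `ping_search_engines` names are made up for the example. Both Google and Bing accepted a simple GET ping with the sitemap URL at the time of this thread.

```python
# Sketch: build a sitemap.xml and ping the search engines after updating it.
import urllib.parse
import urllib.request
from datetime import date
from xml.sax.saxutils import escape

# Placeholder pages: (URL, change frequency, priority)
PAGES = [
    ("http://www.example.com/", "daily", "1.0"),
    ("http://www.example.com/news/latest-story", "hourly", "0.8"),
]

def build_sitemap(pages):
    """Return a sitemaps.org-format XML string for the given pages."""
    today = date.today().isoformat()
    entries = "".join(
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{today}</lastmod>\n"
        f"    <changefreq>{freq}</changefreq>\n"
        f"    <priority>{prio}</priority>\n"
        "  </url>\n"
        for loc, freq, prio in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}"
        "</urlset>\n"
    )

def ping_search_engines(sitemap_url):
    """Notify Google and Bing that the sitemap at sitemap_url has changed."""
    quoted = urllib.parse.quote(sitemap_url, safe="")
    for endpoint in ("http://www.google.com/ping?sitemap=",
                     "http://www.bing.com/ping?sitemap="):
        urllib.request.urlopen(endpoint + quoted)
```

In practice a CMS plugin does this for you on every publish, which is exactly why an automated Joomla sitemap extension beats maintaining the file by hand.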