Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
How long does Google take to show results in the SERPs once the pages are indexed?
-
Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions.
A little background: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the NEW sitemap (with close to 50 pages) and Google has indexed the URLs in that sitemap.
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the page is indexed?
2. Does Google give more preference to websites with a greater number of pages over those with fewer pages when displaying results in the SERPs (I have just 50 pages)? Does the NUMBER of pages really matter?
3. Does removal/change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
An answer from SEO experts would be highly appreciated. Thanks!
-
No problem my friend, you are most welcome. As most of your site is served over https, you need to have the http version of your URLs redirected to their https equivalents. I repeat, HTTP to HTTPS. Make sure that the redirection gives an HTTP header status of 301 and not anything else. If you do so, you will not lose any of the effort put into building links to the https version.
You can check the HTTP header status for your URLs using a tool like the one found here: http://web-sniffer.net
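To illustrate the kind of redirect being described, here is a minimal Python sketch. The hostname and handler are illustrative only; in practice this is usually done with a couple of lines of web server configuration (e.g. a `Redirect permanent` rule in Apache or a `return 301` in nginx), not application code.

```python
# A minimal sketch of the HTTP -> HTTPS 301 redirect described above,
# assuming a Python-served site. The hostname is the one from this thread,
# used purely for illustration.
from http.server import BaseHTTPRequestHandler

CANONICAL_HOST = "www.peprismine.com"

def https_equivalent(path, host=CANONICAL_HOST):
    """Return the https URL that a plain-http request should redirect to."""
    return "https://" + host + path

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A permanent (301) redirect is what preserves link equity;
        # a 302 or a plain 200 would not.
        self.send_response(301)
        self.send_header("Location", https_equivalent(self.path))
        self.end_headers()

print(https_equivalent("/about"))  # prints https://www.peprismine.com/about
```

Pointing `http.server`'s serving loop at `RedirectHandler` would answer every GET with a 301 to the https equivalent of the requested path.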
Best regards,
Devanur Rafi.
-
Hey thanks Moosa.
-
Hello Devanur,
Thanks for the prompt reply. I never knew that http and https would be so much trouble. Will get this one resolved. By the way, I just wanted to know: after making these changes (https to http), will the link value be passed/redirected from https to http, or will I lose the entire effort made on the https pages? Thanks again. Awaiting your reply.
Regards,
PepMoBot
-
Sorry, but I am a little lazy at writing, so I will try to keep it short and simple.

There is no fixed time for it, but your website should appear for branded terms. For example, if your website is www.exampleABC.com, it should at least appear for "example ABC". If you want to target more keywords and have your website appear for them, then besides optimized pages you need some targeted links pointing back to your website.
-
Hi,
There is no fixed time after which an indexed page starts appearing in the SERPs.
I just checked your sitemap.xml file and it has only the https versions of the URLs, yet in the index I saw that non-https versions of the URLs are also listed. So there is no consistency: you have decided to serve the entire site over https, and yet parts of it are still non-https. Serving pages over https puts an overhead on your server, which might result in poor page loading times. If you have good resources on the server side, then this should not be a problem.
Though the folks at Google say they don't care whether URLs are https or http when it comes to ranking, I would like to mention that site loading time is an official ranking factor: when Google comes across two similarly capable and eligible pages competing for the same keyword, the one with better loading times will be favored. By the way, can you let me know the reason behind serving the entire site over https?
Your linking profile is not at all consistent: you have built links to both http://www.peprismine.com and https://www.peprismine.com.
Please be aware that http://www.peprismine.com, though it takes you to https://www.peprismine.com, does not give an HTTP header status of 301; instead it gives a status 200 message. This should be fixed immediately. If you get this fixed, I think you should be fine technically, but be careful with pages served over SSL, as this can sometimes hurt page loading times. You might want to look into this. Don't go blindly by page speed test scores; look at the actual page loading times instead. You can run a test here: http://www.urivalet.com and also perform a test at webpagetest.org and check out the performance review section.
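The check described above can be expressed as a small predicate. A sketch in Python (the status and Location values below are illustrative; in practice you would obtain them from a header-checking tool like the ones mentioned, or from `curl -I`, rather than hard-code them):

```python
# Sketch: a redirect is only "correct" for SEO purposes if the http URL
# answers with a 301 whose Location header points at its https equivalent.
def is_proper_https_redirect(status, location, http_url):
    # Assumes http_url starts with "http://" (illustrative helper).
    expected = "https://" + http_url[len("http://"):]
    return status == 301 and location == expected

# The behaviour reported above (a 200 with no redirect header) fails the check:
print(is_proper_https_redirect(200, "", "http://www.peprismine.com/"))
# A fixed setup would pass:
print(is_proper_https_redirect(301, "https://www.peprismine.com/",
                               "http://www.peprismine.com/"))
```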
Best regards,
Devanur Rafi.
-
Hi Devanur,
Thanks for the reply. I have posted a query below (in continuation of my previous query). Would be good if you could let me know.
-
Hi Moosa,
Thanks for the reply. I have posted a query below (in continuation of my previous query). Would be good if you could let me know.
-
Hi Moosa & Devanur,
Thanks for your responses. However, I would like some more information on my 1st query.
After making the necessary changes to our web pages, how long will it usually take to rank for a particular keyword or keywords (assuming we have optimized these pages as required)? I read on some websites that it takes a minimum of 1 month after the indexing is done. Is this really true, or a myth? What have been your experiences?
P.S.: I'm unable to see my URL for any of my keywords yet (not even on the last page).
Regards,
PepMozBot
-
Hi there,
Straight into the meat:
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the page is indexed?
A. Once the pages are in the index, they become eligible to appear in the SERPs; but where they appear, on which page and in which position, depends on a lot of factors, like the competition for the search term, your content, the backlinks that you have, and the list goes on.
2. Does Google give more preference to websites with a greater number of pages over those with fewer pages when displaying results in the SERPs (I have just 50 pages)? Does the NUMBER of pages really matter?
A. To a small extent, and in some cases, yes. But this again depends on the quality (in terms of relevance, uniqueness, originality, etc.) of the content on a website, the quality of its link popularity, and all the other 200+ factors that Google considers before positioning a website in the SERPs. To put it straight: you do not need to worry about the number of pages if your content is of pristine quality and highly relevant as per Google.
3. Does removal/change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
A. If the removed URLs had duplicate content, then you will not see any negative effect.
Over time, gradually and as required, keep adding pages that each target one search term, with relevant, unique and up-to-date content. This will result in a positive change in your organic traffic numbers. And, very importantly, do not build links desperately from all over the place. Earn links, that is what I would say: you have to earn links by giving your visitors a reason to visit your website.
1. Try to earn links from authority sites in your niche. Links like this fall in the tier 1 category.
2. Get links from generic authority websites (like Wikipedia) by posting quality content. This would be your tier 2.
3. Get links from similar theme (sites that operate in your niche) websites. These links can be your tier 3.
4. Finally, earn links from generic web properties like forums, blogs, social networking sites, social bookmarking sites etc. These would be your tier 4 links.
A very important thing to keep in mind while doing the above is the quality of the content being posted. Be specific and try to address an issue or provide a solution in your posts. Never engage in low-quality link exchanges or bulk link building. Above all, keep asking yourself all the time: "Why should anyone visit my website?", "What can I do to make a visitor's visit to my website worthwhile?" and "What should I do to make my website give a better user experience or a better advantage than my competitors'?"
With questions like these in mind, you will be able to secure a good, enduring position in the SERPs for your website.
Also, be an active participant on social sites to attract good social buzz. Social signals are very good for your search engine optimization and can give your efforts a boost.
Wish you good luck.
Best regards,
Devanur Rafi.
-
OK, when you say the URLs are indexed, this simply means they are appearing in the SERPs; you can type your exact URL into the Google search bar to see whether the page appears or not. Appearing for a keyword is a completely different topic; it has nothing to do with indexing alone.
It's good to have more pages, but if the extra pages are not producing any value and your overall website is getting low value, then you should prefer to go with fewer pages and more value.
Removal or change of a URL can have an impact on rankings. For instance, if one of your URLs is ranking on the first page for some "XYZ" keyword and you change or remove that URL, it is obviously going to lose its rankings.
It is always recommended to add a 301 redirect from the old URL to the new one when changing or removing a URL.
Hope this helps...
Related Questions
-
How do internal search results get indexed by Google?
Hi all, most of the URLs that are created by using the internal search function of a website/web shop shouldn't be indexed, since they create duplicate content or waste crawl budget. The standard way to go is to 'noindex, follow' these pages, or sometimes to use robots.txt to disallow crawling of them. The first question I have is how these pages would actually get indexed in the first place if you didn't use one of the options above. Crawlers follow links to index a website's pages. If a random visitor comes to your site and uses the search function, this creates a URL. There are no links leading to this URL, it is not in a sitemap, and it can't be found by navigating the website... so how can search engines index these URLs that were generated by using an internal search function? Second question: let's say somebody embeds a link on his website pointing to a URL from your website that was created by an internal search, and assume you used robots.txt to make sure these URLs weren't indexed. This means Google won't even crawl those pages. Is it possible then that the link used on the other website will show an empty page after a while, since Google doesn't even crawl this page? Thanks for your thoughts guys.
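For reference, the 'noindex, follow' option mentioned in this question is a one-line meta tag; a minimal sketch, assuming the internal search pages can emit their own head markup:

```html
<!-- In the <head> of each internal search results page: the page may be
     crawled and its links followed, but it is kept out of the index. -->
<meta name="robots" content="noindex, follow">
```

The robots.txt alternative blocks crawling outright, e.g. `Disallow: /search` under `User-agent: *` (the `/search` path is hypothetical; it stands in for whatever URL pattern the internal search generates). Note that, as the second question suggests, a URL disallowed in robots.txt can still end up indexed from external links, just without any crawled content behind it.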
Intermediate & Advanced SEO | | Mat_C0 -
Google Is Indexing my 301 Redirects to Other sites
Long story, but now I have a few links on my site 301-redirecting to YouTube videos or eCommerce stores. They carry a considerable amount of traffic that I benefit from, so I can't take them down, and that traffic comes from other websites; so basically I have backlinks from places I don't own pointing to my redirect URLs (e.g. http://example.com/redirect). My problem is that Google is indexing them and won't let them go. I have tried blocking that URL in robots.txt, but Google is still indexing it uncrawled; I have also tried allowing Google to crawl it and adding a noindex directive in robots.txt; and I have tried removing it via GWT, but it pops back again after a few days. Any ideas? Thanks!
Intermediate & Advanced SEO | | cuarto7150 -
Wrong meta descriptions showing in the SERPS
We recently launched a new site on https, and I'm seeing a few errors in the SERPS with our meta descriptions as our pages are starting to get indexed. We have the correct meta data in our code but it's being output in Google differently. Example: http://imgur.com/ybqxmqg Is this just a glitch on Google's side or is there an obvious issue anyone sees that I'm missing? Thanks guys!
Intermediate & Advanced SEO | | Brian_Owens_10 -
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but all of that junk was in the initial backup, i.e. before 1st June 2012. So, removing all mixed content prior to that date, we can have pure articles starting 1st June 2012! Therefore, my dynamic sitemap now contains only articles with a release date between 1st June 2012 and now, and any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual content of the article. The question is how I can remove from the Google index, as fast as possible, all this junk that is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal here: https://www.google.com/webmasters/tools/removals. The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong; as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existent, and will report errors in Webmaster Tools. Should I submit a DELETED ITEMS SITEMAP using the expires tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think this is for custom search engines only, not for the generic Google search engine. The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead the ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and actual articles alike) are of the form http://www.example.com/docid=123456. So, how can I bulk remove the junk from the Google index, relatively fast?
Intermediate & Advanced SEO | | ioannisa0
Is there a way to get a list of Total Indexed pages from Google Webmaster Tools?
I'm doing a detailed analysis of how Google sees and indexes our website, and we have found that there are 240,256 pages in the index, which is way too many. It's an e-commerce site that needs some tidying up. I'm working with an SEO specialist to set up URL parameters and put information into the robots.txt file so the excess pages aren't indexed (we shouldn't have any more than around 3,000 - 4,000 pages), but we're struggling to find a way to get a list of these 240,256 pages, as it would be helpful in deciding what to put in the robots.txt file and which URLs we should ask Google to remove. Is there a way to get a list of the indexed URLs? We can't find it in Google Webmaster Tools.
Intermediate & Advanced SEO | | sparrowdog0 -
Indexed Pages in Google, How do I find Out?
Is there a way to get a list of pages that Google has indexed? Is there some software that can do this? I do not have access to Webmaster Tools, so I'm hoping there is another way. It would be great if I could also see whether an indexed page is a 404 or other status. Thanks for your help, sorry if it's a basic question 😞
Intermediate & Advanced SEO | | JohnPeters0 -
How important is the number of indexed pages?
I'm considering making a change to AJAX filtered navigation on my e-commerce site. If I do this, the user experience will be significantly improved, but the number of pages that Google finds on my site will go down significantly (by tens of thousands). It feels like our filtered navigation has grown out of control, and we spend too much time worrying about its URL structure; in some ways it's paralyzing us. I'd like to be able to focus on pages that matter (explicit Category and Sub-Category pages) and then just let AJAX take care of filtering products below those levels. For customer usability this is smart. From the perspective of manageable code and long-term design this also seems very smart: we can't continue to worry so much about filtered navigation. My concern is that losing so many indexed pages will have a large negative effect (however, we will reduce duplicate content and be able to provide much better category and sub-category pages). We probably should have thought about this a year ago before Google indexed everything :-). Does anybody have any experience with this or insight on what to do? Thanks, -Jason
Intermediate & Advanced SEO | | cre80 -
Number of Indexed Pages are Continuously Going Down
I am working on an online retail store. Initially, Google had indexed 10K+ pages of my website. I checked the number of indexed pages a week ago and it was 8K+. Today, the number of indexed pages is 7,680. I can't understand why this is happening or how I can fix it. I want to get as many pages of my website indexed as possible.
Intermediate & Advanced SEO | | CommercePundit0