How long does Google take to show results in the SERPs once pages are indexed?
-
Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions.
A little background: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the NEW sitemap (with close to 50 pages), and Google has indexed the URLs in that sitemap.
1. My pages were indexed by Google a few days back. How long does Google take to display the URLs in the SERPs once the pages get indexed?
2. Does Google give more preference in the SERPs to websites with more pages over those with fewer pages (I have just 50 pages)? Does the NUMBER of pages really matter?
3. Does removal/change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
An answer from SEO experts would be highly appreciated. Thanks!
-
No problem, my friend. You are most welcome. As most of your site is served over HTTPS, you need to have the HTTP versions of your URLs redirected to their HTTPS equivalents. I repeat, HTTP to HTTPS. Make sure that the redirect returns an HTTP status of 301 and not anything else. If you do so, you will not lose any of the effort you have put into building links to the HTTPS version.
You can check the HTTP header status messages for your URLs with a tool like the one found here: http://web-sniffer.net
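If you prefer to check from the command line, here is a minimal sketch (assuming the Python requests library; the URL is a placeholder, not one of your real pages) that verifies whether an HTTP URL returns a 301 pointing to its HTTPS equivalent:

```python
import requests

def check_https_redirect(http_url):
    # Fetch the HTTP URL without following redirects so we can inspect the raw status.
    response = requests.get(http_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    if status == 301 and location.startswith("https://"):
        print(f"OK: {http_url} -> 301 -> {location}")
    else:
        print(f"Needs fixing: {http_url} returned {status} (Location: {location or 'none'})")

# Placeholder URL for illustration; swap in your own pages.
check_https_redirect("http://www.example.com/")
```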
Best regards,
Devanur Rafi.
-
Hey, thanks Moosa.
-
Hello Devanur,
Thanks for the prompt reply. I never knew that HTTP and HTTPS would be so much trouble. I will get this resolved. By the way, I just wanted to know: after making these changes (HTTPS to HTTP), will the link value be passed/redirected from HTTPS to HTTP, or will I lose the entire effort made on the HTTPS pages? Thanks again. Awaiting your reply.
Regards,
PepMoBot
-
Sorry, but I am a little lazy at writing, so I will try to keep it short and simple.
There is no fixed time for it, but your website should appear for branded terms. For example, if your website is www.exampleABC.com, it should at least appear for "example ABC". If you want to target more keywords and have your website appear for them, then besides optimized pages you will need some targeted links pointing back to your website.
-
Hi,
there is no fixed time after which an indexed page starts appearing in the SERPs.
I just checked your sitemap.xml file, and it contains only the HTTPS versions of the URLs. In the index, however, I saw that non-HTTPS versions of the URLs are also listed, so there is no consistency: you have decided to serve the entire site over HTTPS, yet parts of it are still reachable over plain HTTP. Serving pages over HTTPS puts some overhead on your server, which might result in poorer page loading times; if you have good resources on the server side, this should not be a problem.
Though the folks at Google say they don't care whether URLs are HTTPS or HTTP when it comes to ranking, I would like to point out that site loading time is an official ranking factor: when Google comes across two similarly capable and eligible pages competing for the same keyword, the one with better loading times will be favored. By the way, can you let me know the reason behind serving the entire site over HTTPS?
Your linking profile is not at all consistent either: you have built links to both http://www.peprismine.com and https://www.peprismine.com.
Please be aware that although http://www.peprismine.com takes you to https://www.peprismine.com, it does not return an HTTP header status of 301; it gives a 200 status instead. This should be fixed immediately. Once that is fixed, I think you should be fine technically, but be careful with pages served over SSL, as this can sometimes hurt page loading times, so you might want to look into it. Don't blindly go by page speed test scores; look at the actual page loading times instead. You can run a test here: http://www.urivalet.com, and also perform a test at webpagetest.org and check out the performance review section.
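As a rough supplement to those tools (not a substitute for a full waterfall test), here is a small sketch, again assuming the Python requests library and placeholder URLs, that compares raw server response times for the HTTP and HTTPS versions of a page:

```python
import requests

def average_response_time(url, runs=3):
    # Measures time from sending the request to receiving the response headers,
    # averaged over a few runs. It ignores rendering, so treat it as a rough signal only.
    timings = []
    for _ in range(runs):
        response = requests.get(url, timeout=30)
        timings.append(response.elapsed.total_seconds())
    return sum(timings) / len(timings)

# Placeholder URLs for illustration; swap in your own pages.
for url in ("http://www.example.com/", "https://www.example.com/"):
    print(f"{url}: ~{average_response_time(url):.2f}s average response time")
```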
Best regards,
Devanur Rafi.
-
Hi Devanur,
Thanks for the reply. I have posted a query below (in continuation of my previous query). It would be good if you could let me know your thoughts.
-
Hi Moosa,
Thanks for the reply. I have posted a query below (in continuation of my previous query). It would be good if you could let me know your thoughts.
-
Hi Moosa & Devanur,
Thanks for your responses. However, I would like some more information on my 1st query.
After making the necessary changes to our web pages, how long will it usually take to rank for a particular keyword or keywords (assuming we have optimized these pages as required)? I read on some websites that it takes a minimum of 1 month after indexing is done. Is this really true, or is it a myth? What have your experiences been?
P.S.: I'm unable to see my URL for any of my keywords yet (not even on the last page).
Regards,
PepMozBot
-
Hi there,
Straight into the meat:
1. My pages were indexed by Google a few days back. How long does Google take to display the URLs in the SERPs once the pages get indexed?
A. Once the pages are in the index, they become eligible to appear in the SERPs, but where they appear, on which page and in which position, depends on a lot of factors: the competition for the search term, your content, the backlinks you have, and the list goes on.
2. Does Google give more preference in the SERPs to websites with more pages over those with fewer pages (I have just 50 pages)? Does the NUMBER of pages really matter?
A. To a small extent and in some cases, yes, but this again depends on the quality (in terms of relevance, uniqueness, originality, etc.) of the content on a website, the quality of its link popularity, and all the other 200+ factors that Google considers before positioning a website in the SERPs. To put it plainly, you do not need to worry about the number of pages if your content is of pristine quality and highly relevant in Google's eyes.
3. Does removal/change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
A. If the URLs being removed had duplicate content, then you will not see any negative effect.
Over time, gradually and on an as-needed basis, keep adding pages that each target one search term with relevant, unique, and up-to-date content. This will result in a positive change in your organic traffic numbers. Very importantly, do not build links desperately from all over the place. Earn links; that is what I would say. You have to earn links by giving visitors a reason to visit your website.
1. Try to earn links from authority sites in your niche. Links like these fall into the tier 1 category.
2. Get links from generic authority websites (like Wikipedia) by posting quality content. These would be your tier 2.
3. Get links from websites with a similar theme (sites that operate in your niche). These links can be your tier 3.
4. Finally, earn links from generic web properties like forums, blogs, social networking sites, social bookmarking sites, etc. These would be your tier 4 links.
A very important thing to keep in mind while doing the above is the quality of the content being posted. Be specific, and try to address an issue or provide a solution in your posts. Never engage in low-quality link exchanges or bulk link building. Above all, keep asking yourself: "Why should anyone visit my website?", "What can I do to make a visitor's time on my website worthwhile?", and "What should I do to give my website a better user experience, or a better advantage, than my competitors'?"
With questions like these, you will be able to secure a good, enduring position in the SERPs for your website.
Also, be an active participant on social sites to attract good social buzz. Social signals can give a nice boost to your search engine optimization efforts.
Wish you good luck.
Best regards,
Devanur Rafi.
-
OK, when you say the URLs are indexed, that simply means they can appear in the SERPs; you can type your exact URL into the Google search bar to see whether the page shows up or not. Appearing for a keyword is a completely different topic; it has nothing to do with indexing alone.
It's good to have more pages, but if the extra pages are not producing any value and are lowering the overall value of your website, then you should prefer fewer pages with more value.
Removal or change of a URL can have an impact on rankings. For instance, if one of your URLs is ranking on the first page for some "XYZ" keyword and you change or remove that URL, it is obviously going to lose its rankings.
It is always recommended to add a 301 redirect from the old URL to the new one when changing or removing a URL.
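If you end up redirecting a batch of removed URLs, a quick way to verify the mapping is a small script like the sketch below (assuming the Python requests library; the URL pairs are hypothetical placeholders, not real pages from this site):

```python
import requests

# Hypothetical old -> new URL pairs; replace with your real redirect mapping.
redirect_map = {
    "http://www.example.com/old-page-1": "https://www.example.com/new-page-1",
    "http://www.example.com/old-page-2": "https://www.example.com/new-page-2",
}

for old_url, expected in redirect_map.items():
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location == expected
    status_label = "OK " if ok else "FIX"
    print(f"{status_label} {old_url} -> {response.status_code} {location or '(no Location header)'}")
```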
Hope this helps...