Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
How long does Google take to show results in the SERPs once pages are indexed?
-
Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions.
A little background: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the new sitemap (with close to 50 pages), and Google has indexed the URLs in that sitemap.
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the pages get indexed?
2. Does Google give preference to websites with more pages over those with fewer pages when displaying results in the SERPs? (I have just 50 pages.) Does the NUMBER of pages really matter?
3. Does removal or change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
An answer from SEO experts would be highly appreciated. Thanks!
-
No problem, my friend. You are most welcome. As most of your site is served over https, you need to have the http version of your URLs redirected to their https equivalents. I repeat, HTTP to HTTPS. Make sure that the redirection returns an HTTP status of 301 and nothing else. If you do so, you will not lose any of the effort you put into building links to the https version.
You can check the HTTP status codes for your URLs using any of a number of tools, like the one found here: http://web-sniffer.net
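The redirect rule described above is simple enough to sketch. The following is an illustrative, framework-free Python sketch of the decision the server should make (a hypothetical example for this thread; in practice you would configure the redirect in Apache, nginx, or your hosting control panel rather than in application code):

```python
# Illustrative sketch only: what an HTTP -> HTTPS redirect should return.
# Real sites do this in server config (.htaccess, nginx), not app code.
from urllib.parse import urlsplit, urlunsplit

def redirect_response(request_url):
    """Return (status, headers) for a request: a 301 pointing at the
    https equivalent for http URLs, and a plain 200 otherwise."""
    parts = urlsplit(request_url)
    if parts.scheme == "http":
        # Swap only the scheme; host, path, and query stay the same.
        target = urlunsplit(("https",) + tuple(parts)[1:])
        # 301 (permanent) is what passes link equity; 200 or 302 would not.
        return 301, {"Location": target}
    return 200, {}

print(redirect_response("http://www.peprismine.com/about"))
# -> (301, {'Location': 'https://www.peprismine.com/about'})
```

The key point is the status code: a 302 or a 200 at the http URL would leave the link value stranded.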
Best regards,
Devanur Rafi.
-
Hey thanks Moosa.
-
Hello Devanur,
Thanks for the prompt reply. I never knew that http and https would be so much trouble. Will get this one resolved. Btw, I just wanted to know: after making these changes (https to http), will the link value be passed/redirected from https to http, or will I lose the entire effort made on the https pages? Thanks again. Awaiting your reply.
Regards,
PepMoBot
-
Sorry, but I am a little lazy at writing, so I will try to keep it short and simple.
There is no fixed time for it... but your website should appear for branded terms. For example, if your website is www.exampleABC.com, it should at least appear for "example ABC". If you want to target more keywords and have your website appear for them, then besides optimized pages you need some targeted links pointing back to your website.
-
Hi,
There is no fixed time after which an indexed page starts appearing in the SERPs.
I just checked your sitemap.xml file, and it contains only the https versions of your URLs. In the index, however, I saw that non-https versions of the URLs are also listed, so there is no consistency. You have decided to serve the entire site over https, yet parts of it are still non-https. Also, serving pages over https puts an overhead on your server, which might result in poor page loading times; if you have good resources on the server side, this should not be a problem.
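As a quick illustration of the consistency check described above, here is a short Python sketch that parses a sitemap and flags any URLs not served over https (the sitemap content below is a made-up example, not the site's actual file):

```python
# Hypothetical example sitemap; a real check would fetch the live file.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
</urlset>"""

def non_https_urls(sitemap_xml):
    """Return every <loc> entry that does not use the https scheme."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text for loc in root.findall(".//sm:loc", ns)]
    return [u for u in locs if not u.startswith("https://")]

print(non_https_urls(SITEMAP))  # -> ['http://www.example.com/about']
```

Any URL this flags is a candidate for either fixing in the sitemap or redirecting on the server.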
Though the folks at Google say they don't care whether URLs are https or http when it comes to ranking, I would like to mention that since site loading time is an official ranking factor, when Google comes across two similarly capable and eligible pages competing for the same keyword, the one with better loading times will be favored. By the way, can you let me know the reason behind serving the entire site over https?
Your linking profile is not at all consistent: you have built links to both http://www.peprismine.com and https://www.peprismine.com.
Please be aware that although http://www.peprismine.com takes you to https://www.peprismine.com, it does not return an HTTP status of 301; instead it returns a 200. This should be fixed immediately. Once you get this fixed, I think you should be fine technically, but be careful with pages served over SSL, as this sometimes hurts page loading times. You might want to look into it. Don't go blindly by page speed test scores; look at the actual page loading times instead. You can run a test here: http://www.urivalet.com, and also perform a test at webpagetest.org and check out the performance review section.
Best regards,
Devanur Rafi.
-
Hi Devanur,
Thanks for the reply. I have posted a query below (in continuation of my previous query). It would be good if you could let me know.
-
Hi Moosa,
Thanks for the reply. I have posted a query below (in continuation of my previous query). It would be good if you could let me know.
-
Hi Moosa & Devanur,
Thanks for your responses. However, I would like some more information on my 1st query.
After making the necessary changes to our web pages, how long will it usually take to rank for a particular keyword or keywords (assuming we have optimized these pages as required)? I read on some websites that it takes a minimum of 1 month after indexing is done. Is this really true, or is it a myth? What have been your experiences?
P.S.: I'm unable to see my URL for any of my keywords yet (not even on the last page).
Regards,
PepMozBot
-
Hi there,
Straight into the meat:
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the pages get indexed?
A. Once the pages are in the index, they become eligible to appear in the SERPs. But where they appear, on which page, and in which position depends on a lot of factors: the competition for the search term, your content, the backlinks you have, and the list goes on.
2. Does Google give preference to websites with more pages over those with fewer pages when displaying results in the SERPs? (I have just 50 pages.) Does the NUMBER of pages really matter?
A. To a small extent and in some cases, yes, but this again depends on the quality (in terms of relevance, uniqueness, originality, etc.) of the content on a website, the quality of its link popularity, and all the other 200+ factors that Google considers before positioning a website in the SERPs. To put it plainly, you do not need to worry about the number of pages if your content is of pristine quality and highly relevant as per Google.
3. Does removal or change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
A. If the URLs being removed had duplicate content, then you will not see any negative effect in this case.
Over time, gradually and on an as-needed basis, keep adding pages that each target one search term, with relevant, unique, and up-to-date content. This will result in a positive change in your organic traffic numbers. And very importantly, do not build links desperately from all over the place. Earn links, that is what I would say: earn links by giving visitors a reason to visit your website.
1. Try to earn links from authority sites in your niche. Links like this fall into the tier 1 category.
2. Get links from generic authority websites (like Wikipedia) by posting quality content. This would be your tier 2.
3. Get links from websites with a similar theme (sites that operate in your niche). These links can be your tier 3.
4. Finally, earn links from generic web properties like forums, blogs, social networking sites, social bookmarking sites, etc. These would be your tier 4 links.
A very important thing to keep in mind while doing the above is the quality of the content being posted. Be specific, and try to address an issue or provide a solution in your posts. Never engage in low-quality link exchanges or bulk link building. Above all, keep asking yourself: "Why should anyone visit my website?", "What can I do to make a visitor's time on my website worthwhile?", and "What should I do to give my website a better user experience or a better advantage than my competitors?"
With questions like these, you will be able to secure a good, enduring position in the SERPs for your website.
Also, be an active participant on social sites to attract good social buzz. Social signals are very good for your search engine optimization efforts and can give them a boost.
Wish you good luck.
Best regards,
Devanur Rafi.
-
OK, when you said the URLs are indexed, this simply means they are eligible to appear in the SERPs; you can type your exact URL into the Google search bar to see whether the page appears or not. Ranking for a keyword is a completely different topic; it has nothing to do with indexing alone.
It's good to have more pages, but if those extra pages are not producing any value and your overall website is delivering low value, then you should prefer fewer pages with more value.
Removal or change of a URL can have an impact on rankings. For instance, if one of your URLs is ranking on the first page for some "XYZ" keyword and you change or remove that URL, it is obviously going to lose its rankings.
It is always recommended to add a 301 redirect from the old URL to the new URL when changing or removing a URL.
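The mapping Moosa recommends can be visualized with a tiny sketch (hypothetical paths, purely illustrative; in practice the redirects live in server config such as .htaccess or nginx rules, not application code):

```python
# Hypothetical old -> new URL map for pages that were renamed or removed.
REDIRECTS = {
    "/old-services.html": "/services",        # page renamed
    "/duplicate-post":    "/canonical-post",  # near-duplicate removed
}

def resolve(path):
    """Return (status, location): 301 for mapped paths, 404 otherwise."""
    if path in REDIRECTS:
        # The 301 tells search engines the move is permanent, so
        # rankings and link value transfer to the new URL.
        return 301, REDIRECTS[path]
    return 404, None

print(resolve("/old-services.html"))  # -> (301, '/services')
```

A removed page with no close replacement is better left as a 404 (or 410) than redirected somewhere irrelevant.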
Hope this helps...