Should you stop indexing of short-lived pages?
-
My site will have a lot of pages with a short life span of about a week, as they are items on sale. Should I nofollow the links to them, so the site has only a few hundred pages, or allow indexing and have thousands of pages, but then lots of links pointing to pages that no longer exist? If I allow indexing I would of course make sure those links don't error and instead send visitors to a similarly relevant page, but which is best for me with the search engines? I would like the option of lots of links to pages with lots of content, but not if it is detrimental.
Thanks
-
Hi Adam
I like the idea of showing related content on that page; that would be useful for the user as well as speed up indexing, I would guess.
Thank you
-
Depending on what the pages are, I would let them get indexed so they can rank for any long-tail keywords, and then perhaps 301 redirect them to the main sale page. Or leave them up with a notice that the product is no longer available, but display related products.
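If the sale pages are served by something like PHP (the .php URLs elsewhere in this thread suggest that), a minimal sketch of the 301 approach could look like the following; the $saleHasEnded flag and the /sale.php target are illustrative placeholders, not anything taken from your site:

```php
<?php
// Hypothetical sketch: once a sale item has expired, send visitors and
// crawlers to the closest relevant live page with a permanent (301) redirect.
// $saleHasEnded and the target URL are placeholders for your own logic.
$saleHasEnded = true; // e.g. looked up from your product database

if ($saleHasEnded) {
    // Third argument sets the HTTP response code; 301 = moved permanently.
    header('Location: https://www.example.com/sale.php', true, 301);
    exit;
}
```

That way expired URLs keep passing any value they have earned to a page that still exists, instead of returning 404s.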
-
Is this a case of product variations (color, size, date, price)?
If so, you would implement rel="canonical" pointing to the main page.
For example:
site.com/page.php remains the main URL, while site.com/page.php?special=january gets treated as if it were the first one.
Any links you earn through that January page would then count towards the core product.
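A minimal sketch of how that canonical tag might look in the head of page.php (assuming a plain PHP template; the variable name is just for illustration):

```php
<?php
// Hypothetical sketch: whether this template is requested as page.php or as
// page.php?special=january, emit the same canonical URL so search engines
// consolidate the variants onto the core product page.
$canonicalUrl = 'https://site.com/page.php';
?>
<link rel="canonical" href="<?php echo htmlspecialchars($canonicalUrl); ?>" />
```

The tag goes in the head of every variant of the page, always pointing at the one URL you want indexed.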
Related Questions
-
Links / Top Pages by Page Authority ==> pages shouldn't be there
I checked my site's links and top pages by Page Authority. What I found I don't understand, because the first 5-10 pages don't exist! You should know that we launched a new site and rebuilt the static pages, so there are a lot of new pages, and of course we deleted some old ones. I refreshed the sitemap.xml (these pages are not in there) and uploaded it in GWT. Why do those old pages appear under the Links menu at Top Pages by Page Authority? How can I get rid of them? Thanks, Endre
Intermediate & Advanced SEO | | Neckermann0 -
My website is not indexing
Hello Experts. When I search site:http://www.louisvuittonhandbagss.com or just enter http://www.louisvuittonhandbagss.com on Google, I am not getting my website. I have done the following steps: 1. I have submitted sitemaps and all the sitemaps are indexed. 2. I have used the GWT feature Fetch as Google. 3. I have submitted my website to top social bookmarking websites and to some classified sites as well. Please
Intermediate & Advanced SEO | | aschauhan5210 -
Is it a bad idea to use our meta description as a short description of a product on that product page?
Does this count as duplicating content even though the meta description has no effect on search results?
Intermediate & Advanced SEO | | USAMM0 -
Google Is Indexing The Wrong Page For My Keyword
For a long time (almost 3 months) Google has been indexing the wrong page for my main keyword.
The problem is that each time Google indexes a different page, for a period of 4-7 days; sometimes I see the home page, sometimes a category page and sometimes a product page.
It seems Google has not yet decided which is its favorite / best page for this keyword. These are the pages Google indexes (in most cases you can find the site on the second or third page): Main page: http://bit.ly/19fOqDh Category page: http://bit.ly/1ebpiRn Another category: http://bit.ly/K3MZl4 Product page: http://bit.ly/1c73B1s All the links I get to the website are natural links, therefore in most cases the anchor we get is the website name. In addition I have many links from bloggers who asked to review one of my products; I'm very careful about that, so I always check the blogger and their website, and only allow it if it is something good. Also, I never ask for a link back (most of the time I receive one without asking), and as I said, most of their links are anchored with my website name. Here are some examples of links I received from bloggers: http://bit.ly/1hF0pQb http://bit.ly/1a8ogT1 http://bit.ly/1bqqRr8 http://bit.ly/1c5QeC7 http://bit.ly/1gXgzXJ Can I please get a recommendation on what I should do?
Should I try to change the anchor of the links?
Do I need to stop allowing bloggers to review my products? I'd love to hear what you recommend.
Thanks for the help
Intermediate & Advanced SEO | | Tiedemann_Anselm0 -
Old pages still in index
Hi Guys, I've been working on an e-commerce site for a while now. Let me sum it up: in February the new site was launched. Due to lack of resources we started 301s of the old URLs in March. We added rel=canonical at the end of May because of huge index numbers (the developers forgot!!), and added noindex and robots.txt rules on at least 1,000 URLs. Index numbers went down from 105,000 to 55,000 for now, see screenshot (the actual number in the sitemap is 13,000). Now when I do site:domain.com there are still old URLs in the index even though there has been a 301 on those URLs since March! I know this can take a while, but I wonder how I can speed this up or whether I am doing something wrong. Hope anyone can help, because I simply don't know how the old URLs can still be in the index.
Intermediate & Advanced SEO | | ssiebn70 -
I have removed over 2,000 pages but Google still says I have 3,000+ pages indexed
Good afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided to make an exact copy of the content on our existing site top4office.com and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I have set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still have over 3k indexed pages (originally I had 3,700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering how quickly, in your opinion, all these changes will impact my SERP rankings. Look forward to your responses!
Intermediate & Advanced SEO | | apogeecorp0 -
Why are the archive sub-pages still indexed by Google?
Why are the archive sub-pages still indexed by Google? I am using WordPress SEO by Yoast, and selected the option needed to set these pages to noindex in order to avoid duplicate content.
Intermediate & Advanced SEO | | MichaelNewman1 -
Indexed Pages in Google, How Do I Find Out?
Is there a way to get a list of pages that Google has indexed? Is there some software that can do this? I do not have access to Webmaster Tools, so I'm hoping there is another way to do this. It would be great if I could also see whether the indexed page is a 404 or something else. Thanks for your help, sorry if it's a basic question 😞
Intermediate & Advanced SEO | | JohnPeters0