Is noarchive still ok to use?
-
This is the tag that prevents Google from storing a cached copy of your web page.
I want to use it for legitimate reasons, but in 2013 is it still safe to use? My website isn't spammy, but it's still very fresh with little to no links.
Each item I sell takes a lot of research to both buy and sell with the correct info. Once an item is sold, I may come across another one and want to hold on to my advantage of having already done the research, and to keep my sold price to myself. Competitors can easily find my old page from a long-tail search, and some of my old sold pages keep getting hits and high bounce rates from people using them for research and as a price benchmark. I want to stop this.
So: noarchive first, then a 301 to the category page once sold. Will the two cause a problem in Google's eyes?
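For reference, the tag in question is the standard robots meta tag, placed in the page's `<head>`; a minimal example (the googlebot-specific variant is optional and targets only Google):

```html
<!-- Ask all search engines not to store a cached copy of this page -->
<meta name="robots" content="noarchive">

<!-- Or target Google's crawler specifically -->
<meta name="googlebot" content="noarchive">
```

Note this directive only suppresses the "Cached" link; it does not remove the page from the index on its own.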
-
Thank you,
That puts my mind at rest.
I was also concerned about the Wayback Machine, as I have used it many times to find details of an old page when the Google cache doesn't show me what I need. So an extra thank you for that info link as well.
-
I don't see any problem with putting a noarchive on your page. We do it on all of our skill pages, because those pages get their content via AJAX, and as such appear broken when viewing the cached versions. Doing this should not have any effect on your rankings.
301 redirecting pages that are no longer used to another (hopefully) relevant page on your site is a very common tactic and is a best practice, so I wouldn't be worried about that either.
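On an Apache host, that kind of sold-item redirect can be done in `.htaccess`; a minimal sketch, where the item and category paths are hypothetical placeholders for your own URL structure:

```apache
# Permanently redirect a sold item page to its category page (mod_alias)
Redirect 301 /items/vintage-clock-123 /category/clocks
```

Each sold page gets its own line, or you can use `RedirectMatch 301` with a pattern if your sold URLs share a common structure.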
The wayback machine will still archive your content, and your competitors may look it up there. If you want to keep your old pages out of their index, you'll need to disallow their crawler in your robots.txt, and keep it from visiting those pages, or your entire site. There's info on that here.
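The Internet Archive's crawler has historically identified itself as `ia_archiver`, so the usual approach is a robots.txt rule along these lines (this example blocks only the archive crawler, not search engines; confirm the user-agent against the Archive's current documentation):

```
# Block the Internet Archive's crawler from the entire site
User-agent: ia_archiver
Disallow: /
```

To block only the sold pages rather than the whole site, replace `Disallow: /` with the path prefix those pages share.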
-
Yes, I agree, it could look artificial, which is what worries me.
What I am trying to achieve is normal, full indexing of an item page and normal coverage so people can find it.
However, once it is sold I want to remove the page, as I don't want competitors to come across it via a keyword search and then request the original cached copy, with its price and info, from the search results.
I assume pages hang around for a while and can be found via the cache, so once I remove or 301 a page, its old information needs to become inaccessible immediately.
-
The behaviour might seem artificial to them. I have seen people use the noarchive tag, but only when they want to speed up removal from Google's index. Plus, I didn't entirely get what you're trying to achieve.