Noarchive: still OK to use?
-
This is the meta tag that prevents Google from storing a cached copy of your web page.
I want to use it for good reasons, but in 2013 is it still safe to use? My website isn't spammy, but it's still very fresh, with little to no links.
Each item I sell takes a lot of research to both buy and sell with the correct info. Once one is sold I may well come across another, and I want to keep the advantage of having already done that research, and my sold price, to myself. Competitors can easily find my old page from a long-tail search. Some of my old sold pages keep getting hits and high bounce rates from people using them for research and as a price benchmark. I want to stop this.
So: noarchive first, then a 301 to the category page once the item is sold. Will the two together cause a problem in Google's eyes?
-
Thank you,
That put my mind at rest.
I was also concerned about the Wayback Machine, as I have used it many times to find details of an old page when the Google cache doesn't show me what I need. So an extra thank you for that info link as well.
-
I don't see any problem with putting a noarchive tag on your page. We do it on all of our skill pages, because those pages get their content via AJAX and, as such, appear broken when viewing the cached versions. Doing this should not have any effect on your rankings.
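For reference, a minimal sketch of the tag as it sits in a page's head; the second line is the Googlebot-specific variant if you only want to block Google's cache:
<!-- block all engines from showing a cached copy of this page -->
<meta name="robots" content="noarchive">
<!-- or target Google's crawler only -->
<meta name="googlebot" content="noarchive">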
301 redirecting pages that are no longer used to another (hopefully) relevant page on your site is a very common tactic and is a best practice, so I wouldn't be worried about that either.
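As an illustration only (the paths are placeholders and the exact syntax depends on your server), a sold item page could be sent to its category page with a single line in an Apache .htaccess file:
# permanent redirect from the sold item page to its category page
Redirect 301 /items/blue-widget-1234.html /items/widgets/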
The Wayback Machine will still archive your content, and your competitors may look it up there. If you want to keep your old pages out of its index, you'll need to disallow its crawler in your robots.txt and keep it from visiting those pages, or your entire site. There's info on that here.
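A minimal sketch of that robots.txt rule, assuming the sold pages live under a /sold/ path (the path is a placeholder); ia_archiver is the user-agent the Internet Archive's crawler has historically honoured:
# keep the Internet Archive's crawler away from sold item pages
User-agent: ia_archiver
Disallow: /sold/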
-
Yes, I agree, it would look artificial, which is what worries me.
What I am trying to achieve is normal and full indexing of an item page and normal coverage so people can find it.
However, once it is sold I want to remove the page, as I don't want competitors to come across it via a keyword search and then pull up the original cached copy, with price and info, from the search results.
I assumed pages hang around for a while and can be found via the cache, so once I remove or 301 a page, the old page's information needs to become inaccessible immediately.
-
The behaviour might seem artificial to them. I have seen people use the 'noarchive' tag, but only when they want to speed up removal from Google's index. Plus, I didn't entirely get what you're trying to achieve.
Related Questions
-
Using Sub Domains For Back Linking
Hey guys! I'm building links for my page and happened upon the "Hoth" link-building page. I tried it out and it built some nofollow links and some links on several subdomains. I know that, when backlinking via guest posting, nofollow links do not juice my site. My question is, does building links on a subdomain from another company juice my site? If it's not helpful, could you explain why? Does it juice my site in any way? If you could link sources, I would really appreciate it. Also, do any of you have input on Hoth or platforms like it? Are they worth it? Thank you!
Technical SEO | rodv
Using 302s to redirect pages returning in 6 months
We are doing a 2-phase site redesign (in order to meet a deadline). An entire section of the site will not be available in the first phase, but will come back in 6 months. The question is, do we use 301s or 302s for those pages that will be coming back in 6 months? Is there a time limit on what is considered "temporary"? thanks in advance!
Technical SEO | Max_B
How important is using hreflang if you have plenty of other geo signals?
Hi. How important is it to use the hreflang attributes and supporting sitemaps (and do you need both)? If sites are being set up on country-specific TLDs (but on top of a WP multisite network.domain.com environment) and geotargeted in GWT, as well as having country meta tags, local schema, etc., that should send enough signals, shouldn't it? 🙂 Implementation of hreflang seems like an absolute technical nightmare. All the best, Dan
Technical SEO | Dan-Lawrence
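For context, hreflang annotations are just alternate link tags in each page's head (or equivalent entries in an XML sitemap); the example.co.uk/example.com URLs below are placeholders:
<!-- tell Google which URL serves which locale -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/">
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/">
<!-- fallback for users who match neither locale -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/page/">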
Do other search engines use meta keywords?
Just want to know: even though Google states it doesn't use them, is there any benefit from using them for the other search engines?
Technical SEO | ReSEOlve
Some posts not showing on Google search if I search them using the post title?
Hello! Some of my WordPress blog posts aren't showing in the Google search results, even if I type the post title. What could be the issue? Is it my site's disabled text selection, the WordPress SEO by Yoast plug-in, or something else? Moreover, if I search some of my post text (article content), I can't see the relevant post on Google search. I use the following code to protect my articles. Is it SEO friendly?
.content {
  -webkit-user-select: none;
  -khtml-user-select: none;
  -moz-user-select: none;
  -ms-user-select: none;
  user-select: none;
}
My site: http://goo.gl/tD2fS Thanks!
Technical SEO | Godad
What is "evttag=" used for?
I see evttag= used on realtor.com, in what looks to be a click-tracking parameter. Does anyone know if this is an official standard or something they made up?
Technical SEO | JDatSB
How to handle sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There are not in fact this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems inefficient to me to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Canonical links? Should I manually remove the pages from the sitemap? Should I continue as is? Thanks a ton for any input you have!
Technical SEO | 5225Marketing
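One common way to handle faceted, query-string URLs like that (not necessarily the right call for every site) is a canonical tag on each sorted or filtered variant pointing back to the base tool page; the /tool/ path below is a placeholder:
<!-- every ?sort=...&filter=... variant points back to the base page -->
<link rel="canonical" href="http://www.example.com/tool/">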