302 vs. rel="nofollow"
-
We came across something that we did not intentionally program.
We have a Magento shop, and on the product pages we have those "compare" buttons. These links carry a session ID and follow a 302 redirect back to the same page.
So I believe the idea is that Google will simply not follow the 302s, and that's it.
So my questions are:
Is our understanding correct?
If so, why is a 302 better than a link marked rel="nofollow"?
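For context, here is a rough sketch of the two variants in question; the path follows Magento's default compare-URL pattern, but the product ID and session ID are placeholders:

    <!-- Compare link as rendered today; the server answers it with a 302 back to the product page -->
    <a href="/catalog/product_compare/add/product/123/?SID=abc123">Add to Compare</a>

    <!-- The rel="nofollow" alternative the question is about -->
    <a href="/catalog/product_compare/add/product/123/?SID=abc123" rel="nofollow">Add to Compare</a>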
-
Thanks for the great answer. So the best thing is probably to leave it as it is.
I only noticed it because Screaming Frog runs into a loop, since it follows 302s.
So I just add the URL string that leads to the 302 to robots.txt while I crawl with Screaming Frog, and remove it again afterwards. A sketch of that entry is below.
Thanks!
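A minimal sketch of that temporary robots.txt entry, assuming Magento's default compare path:

    User-agent: *
    Disallow: /catalog/product_compare/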
-
A 302 redirect means the page has temporarily moved. It sounds like these pages are only temporary, so that is the correct status to use. You could also use nofollow or canonical tags.
A nofollow won't stop the page from getting indexed if it is linked to from elsewhere on the web.
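For reference, a sketch of the canonical-tag option mentioned above; the URL is a placeholder for the product page the compare link returns to:

    <link rel="canonical" href="https://www.example.com/product-page/" />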
Related Questions
-
Hreflang issues - help needed!
Hi, I have an issue with Google indexing the US version of our website rather than the UK version on Google.co.uk. I have added hreflang tags to both sites (https://www.pacapod.com/ and https://us.pacapod.com/), updated and submitted an XML sitemap for each website, and checked that the country targeting in Search Console is set up correctly, but Google is still indexing the wrong website. I would be grateful for any assistance with this issue. Many thanks, Eddie
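For reference, a sketch of the reciprocal hreflang annotations both sites would typically carry on each page; the exact language-region codes and the x-default choice are assumptions:

    <link rel="alternate" hreflang="en-gb" href="https://www.pacapod.com/" />
    <link rel="alternate" hreflang="en-us" href="https://us.pacapod.com/" />
    <link rel="alternate" hreflang="x-default" href="https://www.pacapod.com/" />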
Technical SEO | mypetgiftbox | 0
-
301 or 302 or leave at 410
I have a client who manages vacation rental properties, and those properties get links. If an owner pulls a property off the rental market, the current status given is a 410, which I instinctively want turned into a 301. The problem is that those properties often come back online with the same URL. So the question is: when a 301 is turned back into a 200, has anyone noticed a significant delay in how long that page takes to rank again? I know technically it should probably be a 410 or maybe a 302, but ... you know ... the link weight. 🙂
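For illustration only, a hedged sketch of how each of the three statuses could be expressed with Apache's mod_alias, one option at a time; the paths are hypothetical:

    # Listing pulled and not expected back: tell crawlers it is gone
    Redirect gone /properties/old-listing
    # Listing pulled temporarily: send visitors to the listings index for now
    Redirect 302 /properties/old-listing /properties/
    # Listing gone for good: consolidate link equity at the listings index
    Redirect 301 /properties/old-listing /properties/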
Technical SEO | BeanstalkIM | 1
-
Google displaying "Items 1-9" before the description in the Search Results
We see our pages coming up in Google with the category page/product numbers in front of our descriptions. For example: "Items 1 - 24 of 86" (and then the description follows). Our website is Magento-based. Is there a fix for this that anyone knows of? Is there a method of stopping Google from adding this to the front of our meta description?
Technical SEO | DutchG | 0
-
Does Bing accept meta name="fragment" for AJAX crawling?
I have a case in which the whole site is AJAX. The method used to appease crawlers is <meta name="fragment" content="!">, which is the new HTML5 pushState that Bing said it supports (at least I think it is that). This approach works for Google, but Bing isn't showing anything. Does anyone know if Bing supports this and whether we have to alter something, or, if not, is there a known workaround? The only other logic we have is to recognize the Bing user agent and redirect to the rendered page, but we were worried that could cause some kind of cloaking penalty.
Technical SEO | MarloSchneider | 0
-
Robots.txt Download vs Cache
We made an update to the robots.txt file this morning, after the initial download of the robots.txt file. I then submitted the page through Fetch as Googlebot to get the changes in ASAP. The cache timestamp on the page now shows Sep 27, 2013 15:35:28 GMT. I believe that would put the cache timestamp at about 6 hours ago. However, the Blocked URLs tab in Google WMT shows the robots.txt last downloaded 14 hours ago, and therefore it's showing the old file. This leads me to believe that for the robots.txt the cache date and the download time are independent. Is there any way to get Google to recognize the new file other than waiting this out?
Technical SEO | | Rich_A0 -
"Extremely high number of URLs" warning for robots.txt blocked pages
I have a section of my site that is exclusively for tracking redirects for paid ads. All URLs under this path do a 302 redirect through our ad tracking system: http://www.mysite.com/trackingredirect/blue-widgets?ad_id=1234567 --302--> http://www.mysite.com/blue-widgets. This path of the site is blocked by our robots.txt, and none of the pages show up for a site: search.
User-agent: *
Disallow: /trackingredirect
However, I keep receiving messages in Google Webmaster Tools about an "extremely high number of URLs", and the URLs listed are in my redirect directory, which is ostensibly not indexed. If not by robots.txt, how can I keep Googlebot from wasting crawl time on these millions of /trackingredirect/ links?
Technical SEO | EhrenReilly | 0
-
Google caching the "cookie law message"
Hello! So I've been looking at the cached text version of our website. (Google Eyes is a great add-on for this.) One thing I've noticed is that Google caches our EU cookie law message. The message appears at the top of the page and Google is caching it. The message is enclosed within certain tags, but it is still being cached. I'm going to ask the development team to move the message to the bottom of the page and fix the position, but reviewing other websites with cookie messages, Google isn't caching them in their text-only versions. Any tips or advice?
Technical SEO | Bio-RadAbs | 0
-
What's our easiest, quickest "win" for page load speed?
This is a follow-up question to an earlier thread located here: http://www.seomoz.org/q/we-just-fixed-a-meta-refresh-unified-our-link-profile-and-now-our-rankings-are-going-crazy In that thread, Dr. Pete Meyers said "You'd really be better off getting all that script into external files." Our IT Director is willing to spend time working on this, but he believes it is a complicated process because each script must be evaluated to determine which ones are needed "pre" page load and which ones can be loaded "post." Our IT Director went on to say that he believes the quickest "win" we could get would be to move the JavaScript for our SSL icon (in our site footer) to an internal page, and just link to that page from an image of the icon in the footer. He says this JavaScript, more than any other, slows our page down. My question has three parts: 1. How can I verify that this JavaScript is indeed a major culprit of our page load speed? 2. Is it possible that it is slow because so many styles have been applied to the surrounding area? In other words, if I stripped out the "Secured by" text and all the styles associated with that, could that affect the efficiency of the script? 3. Are there any negatives to moving that JavaScript to an interior landing page, leaving the icon as an image in the footer and linking to the new page? Any thoughts, suggestions, comments, etc. are greatly appreciated! Dana
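For reference, a minimal sketch of the inline-to-external move being discussed; the file name and path are hypothetical:

    <!-- Before: the seal script runs inline in the footer and blocks parsing while it executes -->
    <script type="text/javascript">
      /* SSL seal code pasted directly into the footer */
    </script>

    <!-- After: the same code in an external file, deferred until the document has been parsed -->
    <script src="/js/ssl-seal.js" defer></script>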
Technical SEO | danatanseo | 0