Best posts made by zeepartner
-
RE: Does Bitly hurt your SEO?
Bots can easily identify a URL shortener, as it performs a normal 301 redirect.
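You can check this yourself with curl; the short link below is a made-up example, but the pattern is the same for any Bitly link:
curl -I https://bit.ly/example
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/target-page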
-
RE: Best practice for Portfolio Links
Yes, probably a good idea to use specific portfolio tags (like "portfolio cats", "portfolio dogs", etc.). That way it doesn't get mixed up with other content (unless you want it to, of course).
Then you could create a portfolio overview page where you link to your sub-topics.
-
RE: Best practice for Portfolio Links
Split the portfolio up into various topics? This seems better for usability as well. Tag the various portfolios by category and make it nice and browseable. Does that make sense?
-
ECommerce: Best Practice for expired product pages
I'm optimizing a pet supplies site (http://www.qualipet.ch/) and have a question about the best practice for expired product pages.
We have thousands of products and hundreds of our offers just exist for a few months. Currently, when a product is no longer available, the site just returns a 404. Now I'm wondering what a better solution could be:
1. When a product disappears, a 301 redirect is established to the category page it belongs to (e.g. a leash would redirect to dog accessories).
2. After a product disappears, a customized 404 page appears, listing similar products (but the server still returns a 404 status).
I prefer solution 1, but am afraid that having hundreds of new redirects each month might look strange. But then again, returning lots of 404s to search engines is also not the best option.
Do you know the best practice for large ecommerce sites where they have hundreds or even thousands of products that appear/disappear on a frequent basis? What should be done with those obsolete URLs?
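For reference, a minimal Apache .htaccess sketch of both options (the product and category URLs are hypothetical):
# Option 1: 301 the expired product URL to its category page
Redirect 301 /products/dog-leash-123 /dog-accessories/
# Option 2: show a helpful page listing similar products, but keep the 404 status
ErrorDocument 404 /expired-product-404.html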
-
RE: Does Google crawl the pages which are generated via the site's search box queries?
Google could crawl the dynamic URLs created by your search box - but it usually doesn't unless there is a link to such a dynamic URL somewhere. Internal searches don't cause many problems anymore, but if you want to be sure, you can always block your dynamic search results pages via robots.txt or Google Webmaster Tools (>Site configuration >URL parameters).
So if the URL generated by internal searches is http://www.site.com/search/?searchword=search+query+here, you could add this to robots.txt:
User-agent: *
Disallow: /search/
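If your search results don't live under their own path and only the query string identifies a search, a wildcard rule works too (Googlebot supports * patterns in robots.txt; the parameter name is taken from the example URL above):
User-agent: *
Disallow: /*?searchword=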
-
RE: ECommerce: Best Practice for expired product pages
Thanks for your thoughts, guys.
@Igal@Incapsula: I like your 302 idea! That might actually make a lot of sense for some products that are short-lived.
@Matthew: Good to know that lots of 301s were not an issue on your sites. Are you talking about thousands of those, though?
Most importantly, I will have to find something that can be automated and doesn't require much extra work. I will probably go for 301s and remove them after a few months.
Remind me to post my learnings here after implementation. :)
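One way to automate this, as a rough sketch: have the shop system regenerate a plain-text map of expired product URLs and let Apache's RewriteMap handle the 301s (RewriteMap only works in the server config, not in .htaccess; all paths and URLs here are hypothetical):
# /etc/apache2/expired-products.map, regenerated by the shop system:
# /products/dog-leash-123  /dog-accessories/
RewriteEngine On
RewriteMap expired "txt:/etc/apache2/expired-products.map"
# Redirect only if the URL is found in the map (NONE is the fallback value)
RewriteCond ${expired:$1|NONE} !NONE
RewriteRule ^(/products/.+)$ ${expired:$1} [R=301,L]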
-
RE: What should be done with old news articles?
Basically, I don't see a reason to remove old news articles from a site, as it makes sense to still have an archive present. The only reasons I can think of to remove them are if they are duplicate versions of texts that were originally published somewhere else, or if the quality is really crap...