Is it OK to 301 redirect 1000s of duplicate random URLs to the homepage?
-
Hello,
We found a critical error in our site internal link structure and the way Google indexes it. Website has 1000s of URLs that are basically 50% match to homepage. They all start the same example.com/category/random/random
I can do a redirect match and 301 them all to the homepage. That way thousands of bogus URLs are not indexed and pass no value. Is it OK to redirect so many URLs to the homepage?
The platform creates these URLs from search queries, where it adds all site content to one page. Currently this search page, /category/, has its own canonical, and all those duplicate-content URLs have a canonical pointing to /category/.
My plan to fix it: a) remove the canonical from /category/ so that all those duplicate URLs lose it as well; b) redirect-match all URLs containing /category/ to the homepage (this is the most important page, where 50% of that content lives and should be).
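For reference, step (b) could be sketched as a single rule in .htaccess, assuming an Apache server and that /category/ is the literal first path segment (adjust the pattern to your platform's actual URL shapes):

```apache
# Hypothetical sketch: permanently redirect everything under /category/
# to the homepage (mod_alias pattern matches against the full URL path)
RedirectMatch 301 ^/category/ /
```

On nginx or a CMS-level redirect manager the equivalent would be a single regex rule rather than thousands of one-to-one redirects.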
Is this plan OK?
-
Instead of adding a bunch of no-value links to your home page, which might be seen as black hat by Google, could you noindex the extra pages? It's not as if those pages have any authority to distribute.
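If you go the noindex route, you don't have to edit thousands of pages individually. One way, assuming Apache with mod_headers and server-config access (a sketch, not a drop-in fix), is to serve an X-Robots-Tag header for the whole URL pattern:

```apache
# Hypothetical sketch: tell crawlers not to index anything under /category/
# (<LocationMatch> goes in the server config; requires mod_headers)
<LocationMatch "^/category/">
    Header set X-Robots-Tag "noindex, follow"
</LocationMatch>
```

Note that for the noindex to be seen, the pages must stay crawlable (not blocked in robots.txt) and should not keep a canonical pointing elsewhere.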
-
Any ideas, fellow Mozers? I can't find any other way to do it myself; do you think this would be the best and quickest way to fix it?