Best solution to get mass URLs out of the search engines' index
-
Hi,
I've got an issue where our web developers have made a mistake on our website by messing up some URLs. Because our site works dynamically (i.e. the URLs generated on a page are relative to the current URL), each problem URL linked out to more problem URLs, effectively replicating an entire website directory under the problem URLs. This has put tens of thousands of URLs into the search engines' indexes that shouldn't be there.
So, say for example, the problem URLs look like:
www.mysite.com/incorrect-directory/folder1/page1/
It seems I can correct this by doing the following:
1/. Use robots.txt to disallow access to /incorrect-directory/*
2/. 301 the URLs one-to-one, like this:
www.mysite.com/incorrect-directory/folder1/page1/
301 to:
www.mysite.com/correct-directory/folder1/page1/
3/. 301 the URLs to the root of the correct directory, like this:
www.mysite.com/incorrect-directory/folder1/page1/
www.mysite.com/incorrect-directory/folder1/page2/
www.mysite.com/incorrect-directory/folder2/
301 to:
www.mysite.com/correct-directory/
Which method do you think is the best solution? I doubt there is any link juice benefit from 301'ing the URLs, as there shouldn't be any external links pointing to the wrong URLs.
-
Cheers, Ryan.
-
Option 2 is preferred.
You definitely do not want to use the robots.txt method. Blocking /incorrect-directory/ in robots.txt stops search engines from crawling those URLs, so they would never see your 301s and the unwanted URLs could linger in the index. In general, avoid using robots.txt unless there are no other options.
Whenever your site's visitors follow a link to an invalid URL, 301 them to the correct URL if you have the content they are seeking. It creates the best user experience and the best SEO results.
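For what it's worth, here is a minimal sketch of what the one-to-one 301s could look like in application code. It assumes a Python/Flask stack purely for illustration, since the question doesn't say what the site actually runs on; on Apache or nginx the same mapping is normally a single rewrite or redirect rule in the server config.

```python
# Minimal sketch, assuming a Flask app (hypothetical -- the real platform
# behind www.mysite.com is not stated in the question).
from flask import Flask, redirect

app = Flask(__name__)

# Redirect the root of the bad directory to the root of the correct one.
@app.route("/incorrect-directory/")
def redirect_incorrect_root():
    return redirect("/correct-directory/", code=301)

# Catch every path under the bad directory and issue a permanent (301)
# redirect to the same path under the correct directory (option 2,
# the one-to-one mapping).
@app.route("/incorrect-directory/<path:subpath>")
def redirect_incorrect_directory(subpath):
    return redirect(f"/correct-directory/{subpath}", code=301)
```

Because every old URL answers with a 301 to its real equivalent, search engines will swap the indexed URLs for the correct ones as they recrawl, and any visitor who lands on a stale link still reaches the content they were after.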
Related Questions
-
Google Only Indexing Canonical Root URL Instead of Specified URL Parameters
We just launched a website about 1 month ago and noticed that Google was indexing, but not displaying, URLs with "?location=" parameters such as: http://www.castlemap.com/local-house-values/?location=great-falls-virginia and http://www.castlemap.com/local-house-values/?location=mclean-virginia. Instead, Google has only been displaying our root URL http://www.castlemap.com/local-house-values/ in its search results -- which we don't want, as the URLs with specific locations are more important and each has its own unique list of houses for sale. We have Yoast set up with all of these ?location values added in our sitemap, which has successfully been submitted to Google's Sitemaps: http://www.castlemap.com/buy-location-sitemap.xml I also tried going into the old Google Search Console and setting the "location" URL Parameter to Crawl Every URL with the Specifies Effect enabled... and I even see the two URLs I mentioned above in Google's list of Parameter Samples... but the pages are still not being added to Google. Even after Requesting Indexing again after making all of these changes a few days ago, these URLs are still displaying as Allowing Indexing, but Not On Google in the Search Console and not showing up on Google when I manually search for the entire URL. Why are these pages not showing up on Google and how can we get them to display? The only solution I can think of would be to set our main /local-house-values/ page to noindex in order to have Google favor all of our other URL parameter versions... but I'm guessing that's probably not a good solution for multiple reasons.
Intermediate & Advanced SEO | Nitruc
-
Should I Add Location to ALL of My Client's URLs?
Hi Mozzers, My first Moz post! Yay! I'm excited to join the squad 🙂 My client is a full service entertainment company serving the Washington DC Metro area (DC, MD & VA) and offers a host of services for those wishing to throw events/parties. Think DJs for weddings, cool photo booths, ballroom lighting etc. I'm wondering what the right URL structure should be. I've noticed that some of our competitors do put DC area keywords in their URLs, but with the move of SERPs to focus a lot more on quality over keyword density, I'm wondering if we should focus on location based keywords in the traditional on-page areas (e.g. title tags, headers, metas, content etc.) instead of having keywords in the URLs alongside the traditional areas I just mentioned. So, on every product related page should we do something like:
example.com/weddings/planners-washington-dc-md-va
example.com/weddings/djs-washington-dc-md-va
example.com/weddings/ballroom-lighting-washington-dc-md-va
OR
example.com/weddings/planners
example.com/weddings/djs
example.com/weddings/ballroom-lighting
In both cases, we'd put the necessary location based keywords in the proper places on-page. If we follow the location-in-URL tactic, we'd use DC area terms in all subsequent product page URLs as well. Essentially, every page outside of the home page would have a location in it. Thoughts? Thank you!!
Intermediate & Advanced SEO | pdrama231
-
'?q=:new&sort=new' URL parameters help...
Hey guys, I have these types of URLs being crawled and picked up on by MOZ, but they are not visible to my users. The URLs are all 'hidden' from users as they are basically category pages that have no stock; however, MOZ is crawling them and I don't understand how they are getting picked up as 'duplicate content'. Anyone have any info on this? http://www.example.ch/de/example/marken/brand/make-up/c/Cat_Perso_Brand_3?q=:new&sort=new Even if I just understood the technicality behind it, I could try and fix it if need be. Thanks guys, Kay
Intermediate & Advanced SEO | eLab_London
-
Should I change client's keyword stuffed URLs?
Hi Guys, We currently have a client that offers reviews and preparation classes for their industry (online and offline). One of the main things that I have noticed is how all of their product landing page URLs are stuffed with keywords. I have read that changing URLs can impact up to 25% of traffic and that you shouldn't mess with URLs unless it is completely needed. My question is: when URLs are stuffed with keywords and push the URL length over 200 characters, should I be focusing on a more structured URL system?
Intermediate & Advanced SEO | EricLee123
-
What are your best moves if you want to get your traffic and rankings back for a specific keyword?
Hi all, We have been a server and website monitoring company for over 13 years, and I dare say our product has evolved and matured over the years. Our marketing, not so much. Most of our best-converting traffic came from the keyword "ping test" via our ping test tool page, and for the first 10 years we were positioned 1-3 in Google.com, so it was all good. For the last two years we held steady at positions 8-9, and since 7-30-13 we have been on the second page. We launched a blog in 2009 at http://www.websitepulse.com/blog, post 2-3 times a week, and are working on a new website now. My question is: what is your advice in our situation? Aside from providing fresh content and launching a new website, is there anything specific we could do at this stage to improve our position for "ping test"? Thanks Lily
Intermediate & Advanced SEO | wspwsp
-
Best possible linking on site with 100K indexed pages
Hello All, First of all I would like to thank everybody here for sharing such great knowledge with such amazing and heartfelt passion. It really is good to see. Thank you. My story / question: I recently sold a site with more than 100k pages indexed in Google. I was allowed to keep links on the site. These links are actual anchor text links on both the home page as well as on the 100k news articles. On top of that, my site syndicates its RSS feed (just links and titles, no content) to this site. However, the new owner made a mess, and now the site could possibly be seen as linking badly to my site. Google tells me within Webmaster Tools that this particular site gives me more than 400K backlinks. I have NEVER received one single notice from Google that I have bad links. That first. But I was worried that this site could have been the reason why MY site tanked as badly as it did. It's the only source linking so massively to me. Just a few days ago, I got in contact with the new site owner, and he has taken my offer to help him 'better' his site. Although getting the site up to date for him is my main purpose, since I am there, I will also put effort into optimizing the links back to my site. My question: what would be the best thing to do to get the 'most SEO gain' out of this? The site is a newspaper type of site, catering for news within the exact niche my site is trying to rank for. The difference being, his is a news site; mine is not. It is commercial. Once I fix his site, there will be regular news updates all within the niche we both are in. Regularly, as in several times per day. It's news. In the niche. Should I leave my RSS feed in the sidebars of all the content? Should I leave an anchor text link in the sidebar (on all news etc.)? If so, there can be just one keyword... 407K pages linking with just 1 kw?? Should I keep it to just one link on the home page? I would love to hear what you guys think. (My domain is from 2001. Like a quality wine. However, it still tanked like a submarine.) ALL SEO reports I got here are now Grade A. The site is finally fully optimized. Truly nice to have that confirmation. Now I hope someone will be able to tell me what is best to do in order to get the most SEO gain out of this for my site. Thank you.
Intermediate & Advanced SEO | richardo24hr
-
Canonical URL's - Do they need to be on the "pointed at" page?
My understanding is that they are only required on the "pointing pages"; however, I've recently heard otherwise.
Intermediate & Advanced SEO | DPSSeomonkey
-
What's the best way to phase in a complete site redesign?
Our client is in the planning stages of a site redesign that includes moving platforms. The new site will be rolled out in different phases over a period of a year. They are planning to put the new site redesign on a subdomain (i.e. www2.website.com) during the rollout of the different phases, eventually switching the new site back over to the www subdomain once all the phases are complete. We’re afraid that having the new site on the www2 subdomain will hurt SEO. For example, if their first phase is rolling out a new system to customize a product design and this new design system is hosted on www2.website.com/customize, when a customer picks a product to customize they’ll be linked to www2.website.com/customize instead of the original www.website.com/customize. The old website will start to get phased out as more and more of the new website is completed, and users will be directed to www2. Once the entire redesign is completed, the old platform can be removed and the new website moved back to the www subdomain. Is there a better way of rolling out a website redesign in phases without hosting it on a different subdomain?
Intermediate & Advanced SEO | BlueAcorn