Is there a great tool for mapping URLs from an old site to a new one?
-
We are implementing a new design, removing some pages, and adding new content. The task is to correctly map and redirect the old pages that no longer exist.
-
There is now a free Chrome Extension available for this.
https://chrome.google.com/webstore/detail/searchministry-url-mapper/mehppnpbkjigbgbadakaeibdniekhccd?
-
Try https://www.redirectmapper.com/ - it will automatically map URLs based on keyword similarity.
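For a rough picture of what keyword-similarity mapping involves, here is a minimal Python sketch. The URL lists, the difflib-based scoring, and the 0.5 threshold are illustrative assumptions, not how any particular tool actually works:

```python
# Minimal sketch of keyword-similarity URL mapping (illustrative only;
# the URL lists and the 0.5 threshold are assumptions, not any tool's logic).
from difflib import SequenceMatcher
from urllib.parse import urlparse

old_urls = ["/services/web-design-london", "/about-our-team", "/blog/seo-tips-2012"]
new_urls = ["/web-design", "/company/team", "/resources/seo-tips"]

def keywords_of(url):
    """Return the URL path as a space-separated string of lowercase keywords."""
    path = urlparse(url).path.strip("/").lower()
    return " ".join(path.replace("/", "-").split("-"))

def similarity(a, b):
    """Crude keyword similarity between two URL paths (0.0 to 1.0)."""
    return SequenceMatcher(None, keywords_of(a), keywords_of(b)).ratio()

for old in old_urls:
    best = max(new_urls, key=lambda new: similarity(old, new))
    score = similarity(old, best)
    # Below the threshold, flag the URL for a human to map by hand.
    target = best if score >= 0.5 else "NEEDS MANUAL MAPPING"
    print(f"{old} -> {target} (score {score:.2f})")
```

Anything below the threshold still needs a human decision, which is usually most of the work on a redesign.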
-
If anyone is looking for a similar tool, I think this one will help.
-
Are you using a content management system at all? There might be an export feature you can use to get a list of your current pages. You can also use something like Xenu Link Sleuth to spider your current site. If you've already moved pages, keep a close eye on Google Webmaster Tools and the SEOmoz tools for 404s, and redirect those as you find them.
If you're using WordPress, I like the Redirection plugin because I can create redirects directly from its log of 404s and fix pages that way.
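Once you have an old-to-new mapping (from a crawl export, a 404 log, or a similarity pass like the sketch above), a few lines of script can turn it into server redirect rules. A minimal sketch, with a hypothetical mapping and an Apache-style .htaccess target assumed:

```python
# Sketch: turn an old-to-new URL mapping into Apache "Redirect 301" lines
# that can be pasted into .htaccess. The mapping here is a hypothetical example;
# in practice it would come from a crawl export or a 404 log.
OLD_TO_NEW = [
    ("/old-services.php", "https://www.example.com/services/"),
    ("/about-our-team", "https://www.example.com/company/team/"),
]

with open("redirects.htaccess", "w") as out:
    for old_path, new_url in OLD_TO_NEW:
        rule = f"Redirect 301 {old_path} {new_url}"
        out.write(rule + "\n")
        print(rule)
```

The permanent (301) status is what tells search engines to transfer the old URL's equity to the new one.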
Related Questions
-
Unsolved: Dynamic URL structure issue for a new global site to which I will redirect multiple well-performing sites.
Dear all, we are working on a new platform, https://www.piktalent.com, to which we basically aim to redirect many of the smaller sites we run that get quite a lot of SEO traffic related to internships. Our previous sites include www.spain-internship.com, www.europe-internship.com and other similar ones (around 9 in total). Our idea is to redirect the sites to the new platform gradually, a bit at a time; the new site is custom-made in Python and Node, much more scalable, and intended to grow into a bigger platform with an app and so on.
For the new site we decided to create 3 areas for the main content - piktalent.com/opportunities (all the vacancies), piktalent.com/internships and piktalent.com/jobs - so we can categorize the different types of pages we have, with all the vacancies living under /opportunities.
The problem comes when the site generates the different static landing pages and dynamic searches. We have static landing pages such as www.piktalent.com/internships/madrid, but the site also dynamically generates www.piktalent.com/opportunities?search=madrid, and most searches produce that type of URL rather than following the structure of domain name / type of vacancy / city / name of the vacancy.
I have been considering 2 potential solutions: applying canonicals, or telling Webmaster Tools not to index URLs with that search parameter. What do you think is the right approach? I am worried about potential duplicate content and conflicts between the static and the dynamic content. My CTO insists that the dynamic URLs have to be like that, but I am not 100% sure. Can someone provide input on this? Is there a way to block the dynamically generated URLs? Has anyone had a similar experience? Regards,
Technical SEO | | Jose_jimenez0 -
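One way to picture the canonical option raised in that question: have each dynamic search URL declare its static landing page as canonical. A minimal sketch in Flask; the route, domain, and search-term lookup table are hypothetical, not the site's actual implementation:

```python
# Sketch: have a dynamic search URL declare the matching static landing page
# as canonical. The route, domain, and lookup table below are hypothetical.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical mapping from search terms to static landing pages.
LANDING_PAGES = {"madrid": "/internships/madrid", "berlin": "/internships/berlin"}

@app.route("/opportunities")
def opportunities():
    term = (request.args.get("search") or "").lower()
    canonical = LANDING_PAGES.get(term, "/opportunities")
    return (
        "<html><head>"
        f'<link rel="canonical" href="https://www.piktalent.com{canonical}">'
        "</head><body>search results would render here</body></html>"
    )
```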
If I'm using a compressed sitemap (sitemap.xml.gz), that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, the URL submitted to Google, Bing, etc. and the URL used in robots.txt should both indicate that it's the compressed file - for example, "sitemap.xml.gz". Thanks!
Technical SEO | | jgresalfi0 -
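If it helps to see that concretely, here is a small sketch of producing the compressed sitemap and the robots.txt line that references it; the domain and file names are placeholders:

```python
# Sketch: write the sitemap directly as sitemap.xml.gz and reference that
# same compressed URL everywhere. Domain and file names are placeholders.
import gzip

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
</urlset>
"""

with gzip.open("sitemap.xml.gz", "wt", encoding="utf-8") as f:
    f.write(SITEMAP)

# robots.txt (and the webmaster-tools submission) should point at the
# compressed file's URL:
print("Sitemap: https://www.example.com/sitemap.xml.gz")
```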
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content because there was no place to redirect to. Unfortunately, all of these pages still appear in Google's SERPs (not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain. We are considering a few approaches to get Google to remove these pages from the SERPs and would welcome your input:
1. Remove the 301 redirects entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise.
2. Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
3. Update robots.txt to block access to the redirecting directories.
Thank you, Rosemary
Technical SEO | | RosemaryB3 -
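For the 404-versus-410 option weighed in that question, the only mechanical difference is the status code returned for the removed paths. A minimal sketch, with the framework and path list assumed purely for illustration:

```python
# Sketch: answer requests for permanently removed pages with "410 Gone"
# instead of 404. The framework and the path list are assumptions.
from flask import Flask, abort

app = Flask(__name__)

REMOVED_PATHS = {"/old-campaign", "/2009-press-release"}  # hypothetical examples

@app.route("/<path:page>")
def serve(page):
    if f"/{page}" in REMOVED_PATHS:
        abort(410)  # signals the page is intentionally and permanently gone
    return f"Content for /{page}"
```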
Is it problematic for Google when the site of a subdomain is on a different host than the site of the primary domain?
The website on the subdomain runs on a different server (host) than the site on the main domain.
Technical SEO | | Christian_Campusjaeger0 -
Need advice for new site's structure
Hi everyone, I need to update the structure of my site www.chedonna.it. Basically I have two main problems: 1. I have 61,000 indexed tag pages (many with no posts). 2. The categories of my site are noindex. I thought of fixing this by making the categories index and the tags noindex, but I'm not sure that is the best solution because a great number of tags have been indexed by Google for a long time. Maybe it would be better just to make the categories index, link to them from the posts, and leave the tags index. Could you please let me know your opinion? Regards.
Technical SEO | | salvyy0 -
What if an old site goes into PENDINGDELETE status
Hi, I have an old domain which was accidentally set to PENDINGDELETE by the registry, so it no longer resolves to any IP address. I was in the middle of relocating from the old domain to a new domain: one month before it became PENDINGDELETE, I had submitted a "Change of Address" in Google Webmaster Tools and set up the web server to 301 redirect all URLs on the old domain to the new domain. I have some sub-questions for this case:
1. What happens to the effectiveness of the "Change of Address" in Google Webmaster Tools after the old domain is dropped? Once the domain is deleted, I have no way to maintain verified ownership of it if Google asks me to re-verify.
2. Suppose that during the last month before deletion, Googlebot crawled 50% of the URLs on the old domain, detected the 301 redirects, and saved them to its index. When Googlebot crawls those URLs again after the old domain is deleted and they no longer resolve to any web server, will it retain the last-seen 301 redirects or drop them as well?
3. After a domain is deleted, how soon will Google purge the old domain's URLs from its index?
Thank you. Best regards, Jack Zhao
Technical SEO | | Bull1350 -
What happens when a link goes to a dead URL on my site?
I noticed in Open Site Explorer that I have several incoming links going to dead URLs because I reorganized my site. For example, there might be an incoming link to: sample.php?ID=8. The problem is that I moved the file to /subdir1, so it would be nice if it could link to /subdir1/sample.php?ID=8. BUT, on top of that, I have also changed the URLs to SEO-friendly URLs. So, really, it should link to /Category_Description/ProductName/8 and then get rewritten to /subdir1/sample.php?ID=8. So, what are the implications of having these incoming links point to dead URLs, other than the bad user experience? What are the implications from an SEO standpoint? What's the best way to fix this? Thanks.
Technical SEO | | webtarget0 -
My URLs changed with the new CMS and now search engines see pages as 302s. What do I do?
We recently changed our CMS from PHP to .NET. The old CMS did not allow for folder structure in URLs, so every URL was www.mydomain/name-of-page. In the new CMS we either have to have .aspx at the end of the URL or a trailing /. We opted for the /, but now my PageRank is dead and Google Webmaster Tools says my existing links are going through an intermediary page. Everything resolves to the right place, but it looks like spiders see our new pages as being 302 redirected. Example of what's happening:
Old page: www.mydomain/name-of-page
New page: www.mydomain/name-of-page/
What should I do? Should I go in and 301 redirect the old pages? Will this get cleared up by itself in time?
Technical SEO | | rasiadmin10
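For that last question, one common fix is to issue the trailing-slash redirect yourself as a 301 so search engines never see the CMS's intermediate 302. A rough sketch of the idea in Python, purely for illustration; in the asker's .NET setup the equivalent rule would normally live in web.config / IIS URL Rewrite:

```python
# Sketch: send old no-slash URLs to the new trailing-slash versions with a
# permanent 301 instead of the CMS's intermediate 302. Illustrative only.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_trailing_slash():
    # Old URLs had no trailing slash; 301 them to the new slash form.
    if not request.path.endswith("/"):
        return redirect(request.path + "/", code=301)  # permanent redirect

@app.route("/<path:page>/")
def show_page(page):
    return f"Content for /{page}/"
```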