Found a typo in a URL: what's the best practice to fix it?
-
WordPress 3.4, Yoast, Multisite
The URL is supposed to be "www.myexample.com/great-site"
but I just found that it's "www.myexample.com/gre-atsite"
It is a relatively new site, but we have already pointed several internal links to "www.myexample.com/gre-atsite".
What's the best practice to correct this? Which option is more desirable?
1. Creating a new page
I found that Yoast has a "301 redirect" option in the Advanced tab.
Can I just create a new page (an exact copy), then set the old page to noindex, nofollow and 301 redirect it to http://www.myexample.com/great-site?
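(For reference, a sketch of what a noindex, nofollow setting generally ends up as in the page head; the exact markup varies by plugin and version:)
<meta name="robots" content="noindex,nofollow">
Keep in mind that once the old URL answers with a 301, crawlers follow the redirect and never process the old page's HTML, so a noindex/nofollow on it has little effect either way.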
OR
2. .htaccess redirect rule
Simply change the URL to http://www.myexample.com/great-site, update the page, and add:
Options +FollowSymLinks
RewriteEngine On
# %{HTTP_HOST} holds only the hostname, so match the path in the RewriteRule instead
RewriteCond %{HTTP_HOST} ^www\.myexample\.com$ [NC]
RewriteRule ^gre-atsite/?$ http://www.myexample.com/great-site [R=301,L]
-
Thank you Dan!
WordPress 3.4 did have some redirect features, but they didn't correct the typo,
so our solution was to just fix the URL, change all the internal links, and use a 301 redirect based on an example we found.
If anyone else is facing this problem, here is your answer!
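(The example linked above isn't reproduced here, but a minimal sketch of a single-URL 301 in .htaccess, using the slugs from this thread; the actual rule used may have differed:)
Redirect 301 /gre-atsite http://www.myexample.com/great-site
Redirect comes from mod_alias, which is enabled on most Apache setups. Note that it matches by path prefix, so /gre-atsite/anything would also be redirected, with the extra path appended to the target.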
-
Hi
I actually think WordPress now does the redirects and fixes internal links when you update the permalink. Try just updating it and see if they get fixed. (Although it may not catch internal links you've added within posts etc., just those in the main menu.)
If WordPress doesn't do this, just change the URL, change all internal links as needed and also use a 301 redirect.
A fast way to look for bad internal links is with Screaming Frog SEO Spider.
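If a lot of those links sit inside post content, a bulk search-and-replace can speed up the fix. A hypothetical WP-CLI sketch using the slugs from this thread (assuming WP-CLI is available; back up the database first):
# preview the changes; search-replace handles serialized data in the database
wp search-replace 'www.myexample.com/gre-atsite' 'www.myexample.com/great-site' --dry-run
# then run it for real (on multisite, see the --network flag for your WP-CLI version)
wp search-replace 'www.myexample.com/gre-atsite' 'www.myexample.com/great-site'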
Hope that helps!
-Dan
-
Why not just use Redirect 301 /gre-atsite http://www.myexample.com/great-site?
-
Don't use redirects if you can avoid them; they do not pass all the link juice.
Make the correct page, fix the internal links, and delete the old page.