301 Clean-Up - Best Practices & Procedure?
-
Hello Again,
I took over managing a website about two months ago and have fixed a whole heap of problems. I'm now turning my attention to the URL rewrites, as there are a lot of them. I have fixed the most problematic offenders that were blocking products and causing all sorts of mischief, but now I want to clean the rest up.
The website is on Magento, and there are 240 custom URL rewrites.
Question 1: Am I correct that I should edit the links on my website so that they point directly to the new page instead of relying on the redirect, for best SEO results?
Question 2: If my website no longer uses a URL rewrite (fixed in Question 1), its only remaining purpose is to transfer link juice from any external links the old page had. If the page didn't have any external inbound links, can I delete the URL rewrite, since it serves no purpose?
Question 3: If Q1 and Q2 are correct, what is the quickest way to check the inbound links to a page, so I can make a fast decision on whether to remove the rewrite?
Many Thanks in advance!
-
Thank you, ATP, for your answer. +1 and a thumbs up!
-
Answer 1 - Yes, you are correct. Updating your internal links to point directly at the new pages is best practice.
Answer 2 - No, this is not correct. If users have the URL stored in a bookmark, or someone links to it with JavaScript, in an email, or in some other way that would not show up in a backlink report, the redirect still gets them to the right page. Redirects are for more than just passing PageRank.
Answer 3 - Use the Moz toolbar for this on a page-by-page basis, or use Open Site Explorer (OSE) for the site as a whole. You should also check Google Webmaster Tools, as it has a fresher database than OSE and may know of new links, or ones that OSE didn't catch.
-
From my understanding it is a disadvantage, Adam.
Although a 301 passes link juice, it only passes a percentage of it; a little is lost along the way. Redirects also increase the time it takes a page to load after the click, because the request has to walk a longer path to reach its destination. I'm sure I read somewhere (can't find the source, sorry) that too many redirects and redirect "chains" can negatively impact your rankings, although Google is fine with redirects that genuinely point to relevant replacement content.
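To put rough numbers on that compounding loss, purely as an illustration: the actual pass rate Google uses isn't public, and the 0.85 below is an assumption for the sketch, not a real figure.

```python
# Purely illustrative: if each 301 hop passed, say, 85% of link equity
# (0.85 is an assumed figure; the real one isn't public), a chain
# compounds the loss multiplicatively.

def equity_after_chain(hops, pass_rate=0.85):
    """Fraction of original link equity surviving `hops` redirects."""
    return pass_rate ** hops

print(round(equity_after_chain(1), 4))  # single 301: 0.85
print(round(equity_after_chain(2), 4))  # two-hop chain: 0.7225
```

Whatever the true rate is, the point stands: each extra hop in a chain costs a little more than a single direct 301 would.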
If it were me, I would at the very least:
Redirect: /this-is-it/ -> /how-nice/ (removing a step in the path)
Keep: /yep-yep/ -> /how-nice/
Then change all links on your website that point to /this-is-it/ or /yep-yep/ so they point to /how-nice/.
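A rough sketch of doing that flattening programmatically, assuming the rewrites are exported as a simple old-URL-to-target mapping (the paths are the hypothetical ones from the example; in practice you'd load your 240 rewrites from Magento):

```python
def collapse_redirects(redirects):
    """Return a copy of `redirects` with every chain flattened.

    Each key ends up pointing at the last URL in its chain, so a
    visitor (or crawler) hitting an old URL is redirected once,
    not once per hop.
    """
    flattened = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Walk the chain until the target is not itself redirected.
        while target in redirects:
            if target in seen:  # guard against redirect loops
                raise ValueError("redirect loop involving " + target)
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

chain = {
    "/this-is-it/": "/yep-yep/",
    "/yep-yep/": "/how-nice/",
}
print(collapse_redirects(chain))
# → {'/this-is-it/': '/how-nice/', '/yep-yep/': '/how-nice/'}
```

Both old URLs now 301 straight to /how-nice/ in a single hop.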
If possible, get any backlinks changed to the new URL as well.
-
I would like to add a question to the previous ones; I hope ATP won't mind. Is it a disadvantage from an SEO perspective to have 301s chained across, say, three pages, where the original page has already been deleted?
Example: /this-is-it/ (deleted original page) --301--> /yep-yep/ --301--> /how-nice/ (current page)
Cheers,
Adam -
Hi There,
If I understand the first question correctly: yes, you should point all of your internal links to the new URL/page. 301 the old page, and if there are any backlinks, ask the webmaster of the linking site to point to the new URL. There are a few websites that let you check backlinks; I like Moz's Open Site Explorer, but I've also used Ahrefs.
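As a sketch of that first step (updating internal links), assuming you can run a script over your page HTML before saving it back; the mapping and markup below are hypothetical, reusing the paths from earlier in the thread:

```python
# Hypothetical rewrite table: old URL -> final destination.
# In practice you would load this from Magento's URL rewrite table.
REWRITES = {
    "/this-is-it/": "/how-nice/",
    "/yep-yep/": "/how-nice/",
}

def update_links(html):
    """Replace href="old" with href="new" for every known rewrite."""
    for old, new in REWRITES.items():
        html = html.replace('href="' + old + '"', 'href="' + new + '"')
    return html

page = '<a href="/this-is-it/">Nice things</a>'
print(update_links(page))
# → <a href="/how-nice/">Nice things</a>
```

For a real site you'd want an HTML parser rather than string replacement, but the idea is the same: no internal link should depend on a redirect.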
I hope this helps! Let me know if you have any more questions.
Be well,
Alex Brown
Del Mar Fans and Lighting