Do I need to 301 EVERY page?
-
I have a client who is consolidating multiple EMD (exact-match domain) sites into a single domain, both for SEO reasons and for practical ones, like not having to produce content and perform SEO for 20 separate domains.
My question is this:
Do I need to 301 every single page from these old EMD domains?
I bill this client hourly, and while I could take the time to write 301s for literally thousands of pages, I don't feel that would be the best use of his money. Instead, I could strategically 301 the landing pages that actually get traffic and route everything else to the new root domain. Thoughts? I've researched this and haven't been able to find a really strong opinion yet.
-
Well, Google prefers relevant page-by-page redirects, so do what Google wants and you will be rewarded. Your client is investing in SEO, so he or she should understand that the work you're doing, however costly, is an investment in the business. Sometimes the 'right' decision isn't the most cost-effective one, but it's still the right thing to do.
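For illustration, here's a minimal sketch of what that looks like in an Apache .htaccess on one of the old EMD domains (mod_rewrite assumed; the domain names and paths below are hypothetical): page-by-page 301s for pages that have a real equivalent, plus a catch-all fallback for everything else.

```apache
# Sketch only -- the domains and paths are made up for illustration.
RewriteEngine On

# Page-by-page 301s to the closest equivalent on the consolidated domain
RewriteRule ^vinyl-banners\.html$ https://www.new-domain-example.com/products/vinyl-banners [R=301,L]
RewriteRule ^contact\.html$ https://www.new-domain-example.com/contact [R=301,L]

# Fallback: anything without a one-to-one match goes to the new home page
RewriteRule ^ https://www.new-domain-example.com/ [R=301,L]
```

The more old URLs you map to a genuinely equivalent page instead of letting them hit the catch-all, the more of each old domain's relevance you stand to preserve.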
-
Hi,
Using a 301 redirect on every page is the best way to carry visitors from the old EMD domains over to your new website. If you don't 301 them, they will land on a 404 page on the old domain and you will lose those visitors.
Furthermore, 301-ing every page will also help with SEO by transferring each page's link equity to the new site. I see more benefit in 301-ing every page than in not doing it.
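On the effort side, the mappings don't have to be written out by hand. Here's a rough sketch of a helper script, assuming a hypothetical redirects.csv export of old-path,new-URL pairs (the file name and layout are made up):

```python
# Sketch: turn a spreadsheet export of URL mappings into Apache
# "Redirect 301" directives. Assumes a hypothetical redirects.csv with
# two columns per row, e.g.:
#   /old-page.html,https://www.new-domain-example.com/new-page
import csv

def build_redirect_rules(csv_path: str) -> str:
    """Return one 'Redirect 301 <old-path> <new-url>' line per CSV row."""
    rules = []
    with open(csv_path, newline="") as handle:
        for old_path, new_url in csv.reader(handle):
            rules.append(f"Redirect 301 {old_path.strip()} {new_url.strip()}")
    return "\n".join(rules)

if __name__ == "__main__":
    print(build_redirect_rules("redirects.csv"))
```

Paste the output into the old domain's server config or .htaccess; the time-consuming part then becomes deciding the mappings, not typing the rules.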
Related Questions
-
Robots.txt on a page with a 301 redirect
We currently have a series of help pages that we would like to disallow in our robots.txt. The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site. Which is the proper way to go about it? 1) Add the pages we want to disallow to the robots.txt of the new website, or 2) break the redirect momentarily and add the pages to the robots.txt of the old one? Thanks [see the sketch below this question]
Technical SEO | Kilgray -
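If the first option is the route taken, the rule on the new site might look something like the sketch below (the /help/ path is a hypothetical placeholder). One thing worth keeping in mind: robots.txt is read per host, so a file on the current domain only governs URLs on the current domain.

```
# robots.txt on the current site -- the /help/ path is a placeholder
User-agent: *
Disallow: /help/
```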
404 Errors for Form-Generated Pages - Noindex, nofollow, or 301 redirect
Hi there, I wonder if someone can help me out and provide the best solution for a problem with form-generated pages. I have blocked the search results pages from being indexed by using the noindex tag, and I wondered if I should take the same approach for the following pages. I have seen a huge increase in 404 errors since the new site structure went live and forms started being filled in: every time a form is filled in, it generates a new page, which only Google Search Console is reporting as a 404. Whilst some 404s can be explained and resolved, I wondered what is best to prevent Google from crawling pages like this: mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?langId=-1&storeId=90&catalogId=1008&homePage=Y
1) Implement 301 redirects using rules, so that all these pages redirect to the homepage. Whilst in theory this protects any linked-to pages, it does not resolve the issue of why GSC is recording 404s in the first place, and it could also come across to Google as 100,000+ redirected links, which might look spammy.
2) Place a noindex tag on these pages too, so they will not get picked up, in the same way the search result pages are not being indexed.
3) Block them in robots.txt - this will prevent any 'result' pages from being crawled, which will reduce the crawl time currently being taken up. However, I'm not entirely sure if the block is possible: I would need to block anything after the domain/webapp/wcs/stores/servlet/TopCategoriesDisplay?. Hopefully this is possible? [see the sketch below this question]
The noindex tag will take time to set up, as it needs to be scheduled with the development team, but the robots.txt change would be a quicker fix as this can be done in GSC. I really appreciate any feedback on this one. Many thanks
Technical SEO | Ric_McHale -
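On the question of whether the block is possible: robots.txt Disallow rules are prefix matches, so blocking everything under that servlet path works without needing to match the individual query-string parameters. A minimal sketch:

```
# robots.txt sketch -- blocks any URL whose path starts with the servlet
# endpoint, whatever query-string parameters follow it
User-agent: *
Disallow: /webapp/wcs/stores/servlet/TopCategoriesDisplay
```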
If I want to clean up my URLs and take "www.site.com/page.html" and make it "www.site.com/page", do I need a redirect?
If I want to clean up my URLs and take "www.site.com/page.html" and make it "www.site.com/page", do I need a redirect? If this scenario requires a 301 redirect no matter what, I might as well update the URL to be a little more keyword-rich for the page while I'm at it. However, since these pages are ranking well, I'd rather not lose any authority in the process and just keep the URL stripped of the ".html" (if that's possible). Thanks for your help! [edited for formatting; see the sketch below this question]
Technical SEO | Booj -
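For reference, a sketch of the usual Apache approach (assuming the .html files stay on disk): a 301 from the .html URL to the extensionless one, plus an internal rewrite that still serves the underlying file. Matching against THE_REQUEST keeps the two rules from looping into each other.

```apache
RewriteEngine On

# Externally 301 direct requests for /something.html to /something
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/(.+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]

# Internally serve the .html file when the extensionless URL is requested
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]
```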
Should you change temporary 302 redirects to a 301 even if the page is not important/intended for ranking?
Hi, Whilst I appreciate it's best practice to 301 redirect permanently moved pages, what if the page is, say, a login page or some other page you're not really interested in ranking or transferring juice to? Is it still important/best practice to do so simply because the page has permanently moved, and hence should still be a 301 even though you don't really want it to rank? Cheers, Dan
Technical SEO | Dan-Lawrence -
How do I fix an issue with near-duplicate pages associated with city or local pages?
I am working on an e-commerce website where we have added 300+ pages to target different local cities in the USA. We have added distinct paragraphs on 100+ pages to remove the internal duplication issue and protect the website from a Panda penalty; you can visit the following pages to see how, and we have added unique paragraphs on a few of them. But I have big concerns about the other elements on the page, like the Banner Gallery, Front Banner, Tool and a few other attributes which are common to every page apart from the 4-5 sentence paragraph. I compiled one XML sitemap with all the local pages and submitted it to Google Webmaster Tools on 1st June 2013, but I can see only 1 page indexed by Google in Webmaster Tools. http://www.bannerbuzz.com/local http://www.bannerbuzz.com/local/US/Alabama/Vinyl-Banners http://www.bannerbuzz.com/local/MO/Kansas-City/Vinyl-Banners and so on... Can anyone suggest the best solution for this?
Technical SEO | CommercePundit -
301 redirect chains
Hi everyone, I've had my site for a while now and have changed the structure a number of times. I'm confident my 301s work well and am not concerned about dead ends on my site. My question is: is there a way to find 301 redirect chains? i.e. can I export my link data from Webmaster Tools and run it through some software that tells me how many steps my 301s are taking to get to the final page? I don't know for sure that there are long 301 chains in my link structure, but I have a suspicion, and it's very hard to check by going through them manually. Thanks in advance, Will [see the sketch below this question]
Technical SEO | madegood -
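Not an endorsement of any particular tool, but here's a rough sketch of how the check could be scripted with Python and the requests library (the urls.txt input file of URLs to test is hypothetical):

```python
# Sketch: report redirect chains for a list of URLs.
# Assumes `requests` is installed and urls.txt holds one URL per line.
import requests

def redirect_chain(url: str) -> list[str]:
    """Return every URL visited on the way to the final response."""
    response = requests.get(url, timeout=10, allow_redirects=True)
    hops = [r.url for r in response.history]  # one entry per redirect hop
    return hops + [response.url]              # plus the final destination

if __name__ == "__main__":
    with open("urls.txt") as handle:
        for line in handle:
            chain = redirect_chain(line.strip())
            if len(chain) > 2:  # two or more redirects before the final page
                print(" -> ".join(chain))
```

Any printed line shows a URL that takes more than one hop to resolve, which is exactly the kind of chain worth flattening into a single 301.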
301 how?
My website is www.photosbykristopher.com, but I have some important links pointing to photosbykristopher.com. How do I get the domain without the www to redirect to the domain with the www? P.S. I use GoDaddy for hosting. [see the sketch below this question]
Technical SEO | KristopherWho -
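A minimal sketch of the usual .htaccess approach, assuming the account is on Linux/Apache hosting where .htaccess rules are honored:

```apache
# 301 the bare domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^photosbykristopher\.com$ [NC]
RewriteRule ^(.*)$ http://www.photosbykristopher.com/$1 [R=301,L]
```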
301 redirects
Hi guys, a question:
Let's say I have a page, oldfile.php, at position #2, and I set up a redirect in the following way: 100 incoming external links --> oldfile.php [301 to] newfile.php. Google comes along, updates its index to newfile.php, and the ranking of newfile.php remains at position #2. Everything is good. Now let's say that in 5 months I come along and delete oldfile.php, so we have: 100 incoming external links --> deleted (oldfile.php), i.e. 100 incoming external links --> 404 error, alongside newfile.php. Do I then lose the rankings on newfile.php? My thinking is that now that all the external links point to a page not found, newfile.php should lose rankings. Am I correct in my assumption?
Technical SEO | VividLime