Questions created by DGAU
Can I use a 301 redirect to pass 'backlink' juice to a different domain?
Technical SEO | DGAU
Hi, I have a backlink from a high DA/PA government website pointing to www.domainA.com, which I own and can set up 301 redirects on if necessary. However, www.domainA.com is not in use and has no active website (though it has hosting available, so it can serve 301 redirects). www.domainA.com is also contextually irrelevant to the backlink. I want the government website's link to benefit www.domainB.com, which is both the relevant site and the one that should be benefiting from the SEO juice of the backlink. So far I have had no luck getting the government website's administrators to change the URL on the link to point to www.domainB.com.
Q1: If I use a 301 redirect on www.domainA.com to redirect to www.domainB.com, will most of the backlink's SEO juice still be passed on to www.domainB.com?
Q2: If the answer to the above is yes, would there be a benefit to taking this a step further and redirecting www.domainA.com to a deeper, even more relevant directory on www.domainB.com? i.e. redirect www.domainA.com to www.domainB.com/categoryB, passing the link juice deeper.
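For reference, a domain-wide 301 of the kind described above is usually configured at the web server. A minimal sketch, assuming Apache hosting with mod_rewrite enabled (the domain names are the ones from the question):

```
# .htaccess on www.domainA.com — 301-redirect every request to www.domainB.com,
# preserving the requested path (assumes Apache with mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domainA\.com$ [NC]
RewriteRule ^(.*)$ https://www.domainB.com/$1 [R=301,L]
```

For the Q2 variant, the target in the RewriteRule would simply point at the deeper directory (e.g. `https://www.domainB.com/categoryB/$1`).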
In this situation, should I consolidate two pages into 1 for stronger SEO?
Technical SEO | DGAU
Hi, I have a site that has a categorized structure and products like this:
/categoryA
/categoryA/product1
/categoryA/product2
/categoryA/product3
/categoryB
/categoryB/product1
/categoryB/product2
etc. The category pages have a list of the products within that category. At the moment the category pages perform strongest SEO-wise, i.e. these pages:
/categoryA
/categoryB
Sometimes I get down to having only one product in a category, like this:
/categoryA
/categoryA/product1
My question:
Q: In this case, is it a good idea to redirect all traffic to the single product page, i.e. /categoryA/product1?
BTW, these are my reasons for thinking this might be worthwhile:
• UX: the user gets to the product page quicker, with one less step.
• Merging two pages with similar content might combine/consolidate their SEO strength and perform better in the SERPs.
Thanks in advance
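If the consolidation were done, the thin category page could be 301-redirected to its lone product page. A minimal sketch, assuming Apache's mod_alias (the hostname `www.example.com` is illustrative; the paths are the ones from the question):

```
# .htaccess — send the one-product category page to the product page itself.
# RedirectMatch with an anchored pattern is used so that /categoryA/product1
# is NOT also matched (a plain prefix-based Redirect would rewrite it too).
RedirectMatch 301 ^/categoryA/?$ https://www.example.com/categoryA/product1
```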
Site with 2 domains - 1 domain SEO optimised & 1 is not. How best to handle crawlers?
Situation: I have a dual-domain site:
Getting Started | DGAU
Domain 1 - www.domain.com is SEO optimised with product pages and should of course be indexed.
Domain 2 - secure.domain.com is not SEO optimised and simply has checkout and payment gateway pages.
I've discovered that Moz automatically crawls Domain 2 - the secure.domain.com site - and consequently picks up hundreds of errors. I have put an end to this by adding a robots.txt to stop rogerbot and dotbot (Moz's crawlers) from crawling Domain 2. This fixes the errors in my Moz reports; however, after doing more research into 'Crawler Control', I'm no longer sure this is the best option.
My question: Instead of using robots.txt to stop Moz from crawling all of Domain 2, should I use a 'noindex, follow' robots meta tag on each page of Domain 2? I believe this would then allow Moz and Google to crawl Domain 2 but also tell them both not to index it.
My understanding is that this would be best, and might even help my overall SEO by telling Google not to give any SEO value to the Domain 2 pages?
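For comparison, the two mechanisms weighed above look like this. First, the robots.txt approach (the bot names rogerbot and dotbot are the ones from the question):

```
# robots.txt on secure.domain.com — blocks Moz's crawlers from fetching any URL
User-agent: rogerbot
Disallow: /

User-agent: dotbot
Disallow: /
```

And the alternative, a standard robots meta tag placed in the head of each Domain 2 page, which permits crawling but asks engines not to index:

```
<!-- On every page of secure.domain.com: crawlable, links followed, not indexed -->
<meta name="robots" content="noindex, follow">
```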
Noindex follow on checkout pages in 2017
Hi,
Algorithm Updates | DGAU
My website really consists of 2 separate sites.
Product site:
• Website with product pages.
• These product pages have SEO-optimised content.
Booking engine & checkout site:
• When a user clicks 'Book' on one of the product pages on the aforementioned product site, they go to a separate website which is a booking engine and checkout.
• These pages are not quality, SEO-optimised content; they only perform the function of booking and buying.
Q1) Should I set 'noindex, follow' via the meta tag on all pages of the 'booking engine and checkout' site?
Q2) Should I add anything to the 'Book' buttons on the product site?
I am hoping all this will somehow help concentrate the SEO juice onto the product site's pages by declaring the booking engine and checkout site's pages to be 'not of any content value'.
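On Q2, one attribute commonly discussed for outbound links to a checkout system is rel="nofollow", which asks engines not to pass link equity through that link. A sketch of a 'Book' button link carrying it (the URL is hypothetical, and whether to use nofollow here is exactly the judgment call the question raises):

```
<!-- 'Book' button on a product page, linking to the separate checkout site;
     rel="nofollow" hints that no SEO value should flow through this link
     (hypothetical URL) -->
<a href="https://secure.domain.com/book?product=123" rel="nofollow">Book</a>
```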