2 home page domains causing split in link juice
-
We have two home page domains that are resulting in a split in links: mysite.com and mysite.com/index.php. The site is on Joomla, and when we try to redirect /index.php to just mysite.com, it causes an infinite loop. I have done this on other platforms with no problems - is this an issue because of Joomla?
How can we set up a 301 redirect to consolidate our link juice to one URL?
-
Try changing mysite.com/index.php to mysite.com directly in the database. Back up first.
-
Hi Devon,
Were you able to get this sorted out, or are you still looking for some help?
-
If that is causing an infinite loop, then you may have some other code working against the redirect you set up.
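One common cause (a sketch, assuming Joomla's standard setup rather than anything specific to this site): Joomla internally routes every URL through index.php, so a redirect rule that matches the rewritten request - instead of the original request line from the browser - fires again after each internal rewrite and loops forever. A bare rule like this is the typical anti-pattern:

```apache
# Anti-pattern (hypothetical): this matches Joomla's internal rewrite too,
# because the CMS funnels every request back through index.php,
# so the redirect re-triggers itself on every pass.
RewriteEngine on
RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]
```

Guarding the rule with a condition on %{THE_REQUEST} (the raw request line, which internal rewrites never change) is the usual way to break the cycle.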
-
Those are not two separate domains. It doesn't matter whether it's Joomla or WordPress - you just need to redirect index.php to / server-side.
Find the .htaccess file in your root folder and add:

Options +FollowSymLinks
RewriteEngine on
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php\ HTTP/
RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]
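Once a rule set like this is deployed, it is worth sanity-checking that the redirect chain actually terminates. As a rough illustration (not Joomla-specific - the URL table below is hypothetical), a redirect loop is just a cycle in a URL-to-URL mapping, which you can detect by watching for a revisited URL:

```python
def follow_redirects(rules, url, max_hops=10):
    """Follow a dict of url -> target redirects, returning the chain
    of URLs visited and whether a loop was detected."""
    chain = [url]
    while url in rules and len(chain) <= max_hops:
        url = rules[url]
        if url in chain:
            return chain + [url], True  # revisited a URL: infinite loop
        chain.append(url)
    return chain, False

# The broken setup: /index.php redirects to /, and something
# (e.g. the CMS or a second rule) sends / back to /index.php.
looping = {
    "http://mysite.com/index.php": "http://mysite.com/",
    "http://mysite.com/": "http://mysite.com/index.php",
}
chain, looped = follow_redirects(looping, "http://mysite.com/index.php")

# The fixed setup: a single 301 from /index.php to /, then stop.
fixed = {"http://mysite.com/index.php": "http://mysite.com/"}
chain2, looped2 = follow_redirects(fixed, "http://mysite.com/index.php")
```

In the broken mapping the walk revisits /index.php and reports a loop; in the fixed mapping the chain ends at the bare domain after one hop, which is the consolidated URL you want.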
Related Questions
-
Will Links to one Sub-Domain on a Site hurt a different Sub-Domain on the same site by affecting the Quality of the Root Domain?
Hi, I work for a SaaS company which uses two different subdomains on our site: a public one for our main site (which we want to rank in SERPs), and a secure subdomain, which is the portal for our customers to access our services (which we don't want to rank). Recently I realized that by using our product, our customers are creating large amounts of low-quality links to our secure subdomain, and I'm concerned that this might affect our public subdomain by bringing down the overall authority of our root domain. Is this a legitimate concern? Has anyone ever worked through a similar situation? Any help is appreciated!
Technical SEO | ifbyphone
-
Best practices for controlling link juice with site structure
I'm trying to do my best to control the link juice from my home page to the most important category landing pages on my client's e-commerce site. I have a couple of questions about how NOT to pass link juice to insignificant pages and how best to pass juice to my most important pages. INSIGNIFICANT PAGES: How do you tag links so they don't pass juice to unimportant pages? For example, my client has a "Contact" page linked from their home page. We aren't trying to drive traffic to the contact page, so I'm worried about the home page's link juice being passed to it. Would you tag the Contact link with "nofollow" so it doesn't pass juice, but then include the page in a sitemap so it gets indexed? Are there best practices for this sort of thing?
Technical SEO | Santaur
-
Merging multiple sites and contacting linking domains
This is strictly academic, but I am having a friendly debate and I am hoping you guys could help me. If I decided to merge several websites into a single new URL, doing everything I am supposed to (page-to-page 301 redirects, etc.), would I still need to reach out to the important websites that link to my different sites and have them change their links and anchor text to point to the new site? I know that 90% of the link juice is supposed to transfer and that you are SUPPOSED to contact linking domains, but is it really worth it, especially if there are literally hundreds of sites to contact?
Technical SEO | Mike_Davis
-
Is my home page over optimized for this key word?
I've been working for a couple of months now to try and get my site optimized for the keyword "kayak fishing". I haven't done any black-hat linking or anything, and my site has disappeared past page 76 on Google United States... Did I over-optimize things? I get an A on the on-page reports from SEOmoz. Site: www.yakangler.com Keyword: Kayak Fishing
Technical SEO | mr_w
-
Moving a blog from unique domain to root /blog/ but on 2 different servers? HELP!
I have a main site hosted on one server and the blog hosted on another server - BOTH of which my team has FULL control over. I ultimately want the blog to reside on the root domain: www.mysite.com/blog/. My network team is saying, "DNS will not allow this to happen; the resolution will ultimately have to be on blog.website.com." Has anyone out there done this? Is it even possible? HELP!
Technical SEO | BCA
-
Getting home page content at top of what robots see
When I click on the text-only cache of nlpca(dot)com on the home page http://webcache.googleusercontent.com/search?q=cache:UIJER7OJFzYJ:www.nlpca.com/&hl=en&gl=us&strip=1 our H1 and body content are at the very bottom. How do we get the H1 and content to the top of what the robots see? Thanks!
Technical SEO | BobGW
-
What should I do with the "Login", "Register", and "My Trolley" links on every page?
My website Ommrudraksha has 3 links on every page: 1. Login 2. Register 3. My Trolley. I do not want to give any weight to these links. Will they be counted when the page's links are calculated? Should I remove them as links and use buttons instead (styled to look like links)?
Technical SEO | Ommrudraksha
-
Dealing with hundreds of spam pages caused by a hacker
A couple of my sites have recently been hacked, with the hacker managing to overwrite lots of my pages with their own spam products and also adding in lots of (hundreds of) pages that they created themselves. I have rectified this insofar as removing the folders the hacker used to overwrite my pages, so my original pages are now back showing the correct content, and I have also removed all the hundreds of new pages they had managed to add. I appreciate that Google will find and re-crawl all my genuine pages so the correct content is displayed and indexed, but what is the best method for dealing with the hundreds of extra spam pages that Google crawled but that have now been deleted, leaving loads of 404 Page Not Founds in Google?
Technical SEO | Wardy