Hi - I have a question about IP addresses
-
- would it hurt link juice to host a blog on a different server from the rest of your website?
My web host says they can't run WordPress because they won't support PHP for "security reasons". One solution would be to set up WordPress on a different server and point domain.com/blog there (I presume this is doable?).
But I don't know whether that would adversely affect SEO?
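For what it's worth, a plain redirect would send visitors to a different visible URL; the usual way to keep domain.com/blog in the address bar while serving the blog from another server is a reverse proxy on the main server. A hypothetical sketch for an Apache front end (wp.example-host.com is a placeholder for the separate WordPress server, and it assumes mod_proxy and mod_proxy_http are enabled):

```apacheconf
# Hypothetical: serve domain.com/blog from a separate WordPress server
# while keeping the /blog path on the main domain.
# Requires mod_proxy and mod_proxy_http to be enabled.
# wp.example-host.com is a placeholder hostname.
ProxyPass        /blog/ http://wp.example-host.com/blog/
ProxyPassReverse /blog/ http://wp.example-host.com/blog/
```

Whether this is wise is another question - as the replies below note, proxying through the main server can expose it in ways the host may not allow.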
-
Thank you, that gives me a lot of clarity
-
Not really. As big as this site sounds, attempting to do so would probably pose a security risk to your website (as an IT professional I can think of a few ways this could work, but all involve exposing the main server in ways that make me cringe). The subdomain approach raises the fewest issues overall.
-
Thanks Egol - this website took over a year and seven figures to build, so it's not so simple I'm afraid. (It's integrated with stock control in a shop and warehouse, and all sorts.)
-
This issue can be cleanly solved by placing the site on a different hosting service.
That's what I would do instead of rigging up complex ways to do something simple.
-
Really not a problem - thank you for responding
-
Thanks Highland - ironically that's the exact setup at the moment - a wordpress.com blog hosted on a subdomain!
So my idea was to move it to a subfolder for better SEO - then the hosts chipped in with their refusal to run PHP.
This is in a high-competition niche where every detail can make a difference.
I guess you're saying it's impossible to have a WP (.org) site hosted elsewhere and pointed at the URL domain.com/blog?
-
It depends a bit. In order to host on a different server you'll need a different domain or subdomain. That lets it live under a different IP. The IP itself isn't an issue, but the different domain might be. I would try to get it under a subdomain of your main domain (i.e. blog.domain.com) so bots can at least see there's a relationship there. The catch is that a subdomain won't pass as much juice to your main site as it would if it lived under domain.com/blog (where it's part of the same domain).
You don't have to host your own blog, incidentally. Check out wordpress.com where, for a fee, they will map a domain to your blog. It's the safest way to host WordPress, since they update it and secure the servers.
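A subdomain on a separate host is just a DNS record on your main domain. A hypothetical zone-file line, where wp.example-host.com stands in for wherever the blog is actually hosted:

```
; Hypothetical: blog.domain.com resolves to the separate blog server
blog  3600  IN  CNAME  wp.example-host.com.
```

The main site's IP and server stay untouched; only the blog subdomain resolves elsewhere.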
-
Sorry for the mistake.
-
Ah OK, that's a much happier thing to hear! Thank you
-
Damn it, I made a typo. IT WON'T AFFECT YOUR SEO.
I'm editing the first reply now. Sorry.
-
Thanks Gaston, much appreciated and as I feared.
I'm feeling a bit stuck about what to do here, then. I want to run WordPress (principally for ease of client use and the Yoast SEO plugin), but the hosts simply won't allow PHP.
So if a different server / IP address isn't a solution, I wonder if there's any way I haven't thought of to run WordPress effectively as a subfolder of the site? Or perhaps an alternative to WP with great SEO - the hosts say they run "web servers with .NET applications hosted on them using IIS".
Does anybody have any ideas?
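Since the hosts run IIS, one possible (hypothetical) workaround is asking them to reverse-proxy /blog using the URL Rewrite module plus Application Request Routing - though they may well refuse that for the same security reasons they cite for PHP. A sketch of the web.config rule, with wp.example-host.com as a placeholder for the externally hosted WordPress server:

```xml
<!-- Hypothetical: requires IIS URL Rewrite + ARR with proxying enabled.
     Forwards domain.com/blog/* to an externally hosted WordPress server.
     wp.example-host.com is a placeholder hostname. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="BlogReverseProxy" stopProcessing="true">
          <match url="^blog/(.*)" />
          <action type="Rewrite" url="http://wp.example-host.com/blog/{R:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

With this in place the blog would appear under domain.com/blog without PHP ever running on the main server.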
-
Hi there.
No, it won't affect your SEO.
Hope it helps.
GR.