Hi - I have a question about IP addresses
-
- would it hurt link juice to host a blog on a different server than the rest of your website?
I have a web host saying they can't run WordPress because they won't support PHP for "security reasons". One solution would be to set up WordPress on a different server and redirect domain.com/blog there (I presume this is doable?).
But I don't know whether that would adversely affect SEO.
-
Thank you, that gives me a lot of clarity
-
Not really. As big as this site sounds, attempting to do so would probably pose a security risk to your website (as an IT professional I can think of a few ways this could work, but they all involve exposing the main server in ways that would make me cringe). A subdomain raises the fewest complications overall.
-
Thanks Egol - this website took over a year and cost seven figures to build, so not so simple I'm afraid. (It's integrated with stock control in a shop and warehouse, and all sorts.)
-
This issue can be cleanly solved by moving the site to a different hosting service.
That's what I would do instead of rigging up complex ways of doing something simple.
-
Really not a problem - thank you for responding
-
Thanks Highland - ironically that's the exact setup at the moment: a WordPress.com blog hosted on a subdomain!
So my idea was to move it to a subfolder for better SEO - then the hosts chipped in with their refusal to run PHP.
This is in a high-competition niche where every detail can make a difference.
I guess you're saying it's impossible to have a WP (.org) site hosted elsewhere and pointed at the URL domain.com/blog?
-
It depends a bit. To host on a different server you'll need a different domain or subdomain. That lets it live under a different IP. The IP itself isn't an issue, but the different domain might be. I would try to get it under a subdomain of your main domain (i.e. blog.domain.com) so bots can at least see there's a relationship there. The catch is that a subdomain won't pass as much juice to your main site as it would if it lived under domain.com/blog (where it's part of the same domain).
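To make the subdomain option concrete: a subdomain can live on an entirely different server simply by pointing its DNS record at the other host. A minimal, purely illustrative zone-file sketch (the IP and the blog host name are hypothetical):

```
; hypothetical DNS entries for domain.com
www   IN  A      203.0.113.10             ; main site on the existing host
blog  IN  CNAME  blog-host.example.net.   ; WordPress on a different server/IP
```

Nothing on the main server changes; the two sites just resolve to different machines.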
You don't have to host your own blog, incidentally. Check out WordPress.com, where, for a fee, they will map a domain to your blog. It's the safest way to host WordPress, since they keep it updated and secure the servers.
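For readers wondering how domain.com/blog could technically be served from a different box (the "ways this could work" Egol alludes to): the usual mechanism is a reverse proxy on the main server. A hedged nginx sketch, with hypothetical host names - note this only applies if the main server runs nginx (on IIS the equivalent is the URL Rewrite and Application Request Routing modules), and it is exactly the kind of added exposure and complexity warned about above:

```nginx
# On the main server: forward /blog requests to the external WordPress host.
location /blog/ {
    proxy_pass https://wp-host.example.net/blog/;   # hypothetical blog server
    proxy_set_header Host domain.com;               # preserve the public hostname
    proxy_set_header X-Forwarded-Proto $scheme;     # so WordPress builds correct URLs
}
```

The URL stays domain.com/blog for visitors and crawlers, but the main server now relays every blog request, so it must be hardened and kept in sync with the blog host.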
-
Sorry for the mistake.
-
Ah OK, that's a much happier thing to hear! Thank you
-
Damn it, I made a typo. IT WON'T AFFECT YOUR SEO.
I'm just editing the first reply. Sorry!
-
Thanks Gaston, much appreciated and as I feared.
I'm feeling a bit stuck about what to do, then. I want to run WordPress (principally for ease of client use and the Yoast SEO plugin), but the hosts simply won't allow PHP.
So if a different server / IP address isn't a solution, I wonder if there's any way I haven't thought of to run WordPress effectively as a subfolder of the site? Or perhaps an alternative to WP with great SEO - the hosts say they run "web servers with .NET applications hosted on them using IIS".
Does anybody have any ideas?
-
Hi there.
No, it won't affect your SEO.
Hope it helps.
GR.