Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Is there a reason why a host would be reluctant to give up cPanel access info?
-
Granted, a strange question here...
My client lost her cPanel login credentials, or never bothered to get them (she didn't even know she had a hosting account). Apparently a friend of hers is hosting her website for her, free of charge.
I need to get into cPanel, but they are being extremely difficult. The client asked them and they didn't want to give it to her either. We're still trying, but is there any reason why they would be so difficult? How does it benefit them? It can't be because they're afraid of losing her account, because she isn't paying them anything. Totally confused by this. Any ideas?
-
Masbro, sounds like a bait-and-switch tactic, which stinks. However, you can now approach it from two additional angles for your client:
1. Have them pay for one month of cPanel access at that $45 rate, get in there, take a complete backup of the website files and the database (if it's a CMS like WordPress), and then create a new web hosting account with HostGator, GoDaddy, or anywhere feasible for your client on a server you're used to working in.
2. Just request a full, most recent backup copy of the website files and database from that company, upload those files to a new host as noted above, and then change the domain's DNS records to point to the new server.
I'd find out that cost and see which is the cheaper alternative. Either way, I would get your client away from them and cancel any monthly fees. Move on, with you by their side!
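For what it's worth, the files half of either option boils down to an archive-and-restore round trip. Here's a minimal sketch of it; all paths, file names, and the database command are hypothetical placeholders, not your client's real setup, and on a real cPanel host the archive step would run over SSH (or via cPanel's own Backup tool):

```shell
# Illustrative backup/restore round trip. All paths here are hypothetical;
# a real WordPress migration would also need a database dump (see below).
set -e
work=$(mktemp -d)

# Simulate the files you would find under public_html on the old host.
mkdir -p "$work/old_host/public_html/wp-content"
printf '<?php /* wp-config.php placeholder */\n' > "$work/old_host/public_html/wp-config.php"
printf 'body { color: #000; }\n' > "$work/old_host/public_html/wp-content/style.css"

# Archive the document root (real case: tar -czf site.tar.gz -C ~ public_html).
tar -czf "$work/site-backup.tar.gz" -C "$work/old_host" public_html

# Real case only: dump the database too, e.g.
#   mysqldump -u DBUSER -p DBNAME > db-backup.sql

# Extract on the "new host" and confirm nothing was lost in transit.
mkdir -p "$work/new_host"
tar -xzf "$work/site-backup.tar.gz" -C "$work/new_host"
diff -r "$work/old_host/public_html" "$work/new_host/public_html" && echo "backup verified"
```

The `diff -r` at the end is a cheap sanity check worth running before you flip the DNS over.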
NOTE: Be careful with the email hosting. Make sure you know where their email is hosted if it uses the same domain. If you set up a new host from their backup copy, you can then adjust the mail settings/MX records as needed.
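To illustrate that note: when only the web hosting moves, the A records change while the MX records stay pointed at wherever email already lives. The names and IP below are placeholders (203.0.113.10 is from a documentation-reserved range), not real values:

```
; Hypothetical zone fragment: move the website, leave email where it is.
example.com.      3600  IN  A   203.0.113.10             ; web -> NEW host's IP
www.example.com.  3600  IN  A   203.0.113.10
example.com.      3600  IN  MX  10 mail.oldmailhost.com. ; email -> unchanged
```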
Hope you get things resolved! It sounds like they are willing to play ball, so that's a huge plus for you and the client.
- Patrick
-
Thanks for the detailed explanation. I think that was it. At first I thought perhaps it was some blackhat hosting, but it appears they are legit; however, they are now asking my client to pay $45 a month to get cPanel on a dedicated Linux license. I don't know why she would go through all that when HostGator is less than $4 a month.
-
Masbro,
We run into this issue all the time working with small business owners. They have a friend's cousin's nephew from down the street set up a website for them, and he doesn't tell them anything about what is really going on: the domain purchase/renewals, where they bought the domain, where they bought the hosting, the login for the hosting, where their email is hosted, or the logins for WordPress/Drupal/etc. if they're using a CMS. The list goes on.
Unfortunately for us, we have to figure all of that out for them. Fortunately, though, we also get to build a great rapport with the client and educate them on maintaining control over all of their property. I would walk them through consolidating their domain, hosting, and email if you wanted, and make sure you keep a login record that you share with them as well. If they lose it, then you still have it.
Now, to answer your question. Many folks are very hesitant to provide direct cPanel access because they may be in a shared environment with many other domains and websites being hosted. Once they give you a login, you may be able to see ALL of their clients or websites, and to me that is a big security vulnerability. I'd never allow just anyone into our shared server.
They may also just want to validate that you really do represent your mutual client. Usually a phone call to them with the client on the line is a good starting point, or an email from the client to the hosting provider.
It really all depends on the level of work you have to do. If it's minor, you can ask them to provide an FTP or SFTP login. If it's something major, or a brand new website, another alternative is to request that they simply provide a full backup of the website/database files (if any), so you can move the hosting to another provider where you have a little more control.
I believe they are simply looking at this from a security viewpoint. Allowing you access wouldn't be beneficial to their other clients on the server and could potentially put them at risk.
Again, there are several ways to get to the end point based on your goals and needs for the client. We see it all the time; sometimes it goes smoothly, and other times it can be a long, drawn-out nightmare. I hope it's not the latter for you, and I hope this was a helpful answer!
- Patrick
-
Well, I'd say they want to be sure they're dealing with the real owner of the account, and they can't be sure of that. cPanel usually gives you access to the databases and all of the domain settings, so if someone wanted to do harm, that's usually the easiest way to take sites down.