Websites on the same Class C IP address
-
If two websites are on the same Class C IP address, what does it mean?
Do the two websites belong to the same company?
-
But if you have 20 sites linking to you, and they're all from the same Class C IP address, and those are pretty much your ONLY links, that looks a bit fishy to the search engines. It's a signal to the search engines of "hey, this person can only get links from sites that are very likely all controlled by the same person."
That's why you'll see people talk about link building and wanting to do so from different Class C IP addresses.
-
Not necessarily. If you see different C blocks, you are usually looking at two different web hosts. So there is still a chance that different sites from one owner are hosted in the same C block.
Even a single IP address is not always used by one company. You can have a dedicated IP address that leads only to your website, but providers can also share one IP address across different domains/websites belonging to different companies.
To find out whether websites belong to the same owner, you need to check the registrar's database or use WHOIS tools like allwhois.com.
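To illustrate the suggestion above, here is a minimal sketch, assuming a system with the command-line `whois` client installed and Python available, that pulls the registrant-related lines of a WHOIS record so two domains can be compared by eye (note that many registries now redact owner details for privacy):

```python
import subprocess

def whois_record(domain: str) -> str:
    """Run the system whois client and return the raw record text."""
    result = subprocess.run(
        ["whois", domain],
        capture_output=True,
        text=True,
        check=False,  # some registries exit non-zero even when they return data
    )
    return result.stdout

# Compare registrant/registrar lines of two domains by eye.
for domain in ("example.com", "example.org"):
    lines = whois_record(domain).splitlines()
    interesting = [l for l in lines if l.strip().lower().startswith(("registrant", "registrar"))]
    print(f"--- {domain} ---")
    print("\n".join(interesting) or "(no registrant lines found, possibly privacy-protected)")
```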
-
Thanks Olaf.
So, does that mean that sites hosted on the same Class C IP belong to one owner?
-
A "C" Block address is based on your IP address.
For example 190.245.111.001 is a standard IP address. The c-blocks in this case are: AAA.BBB.CCC.001-254
So these are within the same Class C block:
190.245.111.001
190.245.111.230

And these are different Class C IPs:
190.245.111.001
190.245.222.001

Google may assume that sites hosted in different C blocks are more likely to be from different people.
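To make the comparison concrete, here is a minimal sketch in Python (illustrative only) that checks whether two IPv4 addresses sit in the same "Class C" block in the SEO sense, i.e. share their first three octets:

```python
from ipaddress import ip_address

def same_class_c(ip_a: str, ip_b: str) -> bool:
    """True if both IPv4 addresses share their first three octets (the /24 block)."""
    return ip_address(ip_a).packed[:3] == ip_address(ip_b).packed[:3]

print(same_class_c("190.245.111.1", "190.245.111.230"))  # True  - same C block
print(same_class_c("190.245.111.1", "190.245.222.1"))    # False - different C blocks
```

In network terms this is simply the /24 block; tools that report "C blocks" are doing the same first-three-octets comparison.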
Related Questions
-
My website is constantly decreasing
For a few weeks now, my website has been constantly decreasing in search position. I have lost keywords and it keeps going down.
Technical SEO | | Dan_Tala
Although it is well rated by several on-page and off-page SEO verification tools that I have tried.
I checked Google Search Console and Analytics and found no major problems. However, from one day to the next it keeps going down.
I also checked what the main competitors are doing and they are not doing well, at all.
The main competitor actually has a creepy website: totally devoid of on-page or off-page SEO but with an enormous number of backlinks, and of very bad quality, which should disqualify it. Still…
A few weeks ago I changed something.
In the pages I had an H1, 4x H2, no H3, and an H4 without content.
An unnatural H-tag structure.
Now I have H1, H2, H3, 3x H4, with coherent information.
Theoretically, Google should have been "happy", or I'm missing something. I use a SaaS platform.
I just found out that they made changes to the keywords (tags).
I am selling toner cartridges for printers.
So…
The tags are printer models and each generates a URL that lists the products for that model.
Ex. https://www.sertit.ro/cartus-imprimanta-cilindru-color-hp-laserjet-pro-m-177fw goes to the products for that printer model.
The question is… should I make the tag pages canonical?
Is it possible for products to lose so much in Google search?
-
Two websites, one company, one physical address - how to make the best of it in terms of local visibility?
Hello! I have one company which will be operating in two markets: printing, and website design / development. I'm planning on building two websites, one for each market. But I'm a bit confused about how to optimize these websites locally. My thought is to use my physical address for one website (build citations, get listed in directories, etc.) and a PO Box for the other. Do you think there is a better idea?
Technical SEO | | VELV1 -
Rel=canonical on Godaddy Website builder
Hey crew! First off, asking this question here is a last resort: GoDaddy has not been able to help, so I need my Moz fam on this one. So, a common problem: my crawl report is showing I have duplicate home pages, www.answer2cancer.org and www.answer2cancer.org/home.html. I understand this is a common issue with Apache webservers, which is why the wonderful rel=canonical tag was created! I don't want to go through the hassle of a 301 redirect, of course, for such a simple issue. Now here's the issue: the GoDaddy website builder does not make any sense to me. In WordPress I could just add the tag to the head in the back end, but no such thing exists in GoDaddy. You have to use this weird drag-and-drop HTML block, drag it somewhere on the site, and plug in the code. I think putting before the code instead of just putting it in there. So I did that, but when I publish and inspect in Chrome I cannot see the tag in the head! This is confusing, I know; the guy at GoDaddy didn't stand a chance, lol. Anyway, much love for any replies!
Technical SEO | | Answer2cancer0 -
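Whatever the builder does internally, whether the tag actually made it into the published page can be verified from outside the editor. A minimal sketch, assuming Python with the `requests` package installed, that fetches the live pages from the question and lists any link tag mentioning canonical:

```python
import re
import requests

def canonical_tags(url):
    """Fetch a page and return every <link ...> tag that mentions canonical."""
    html = requests.get(url, timeout=10).text
    link_tags = re.findall(r"<link[^>]*>", html, flags=re.IGNORECASE)
    return [tag for tag in link_tags if "canonical" in tag.lower()]

for page in ("http://www.answer2cancer.org", "http://www.answer2cancer.org/home.html"):
    tags = canonical_tags(page)
    print(page, "->", tags or "no canonical tag found in the published HTML")
```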
Redirecting Root domain to subdirectory by IP addresses (country specific)
We are using WordPress Multisite, so www.mysite.com is our English website and www.mysite.com/sub is our Chinese website. Can I redirect Chinese visitors who type "www.mysite.com" to "www.mysite.com/sub"? We want to force a redirect to www.mysite.com/sub if our website is visited from a Chinese IP address. I've realized that this is called GeoIP redirection, and our hosting company already has that database, so I guess my job is just to insert some code in .htaccess. My question is, would it affect our SEO later on? And what .htaccess code is best practice here?
Technical SEO | | joony20080 -
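The exact .htaccess syntax depends on which GeoIP module the host exposes, so here is only a sketch of the decision the rewrite rule has to encode, written in Python for illustration. The country code is assumed to come from whatever GeoIP lookup the host provides, and the CN-to-/sub mapping mirrors the question:

```python
# Sketch of the redirect decision only; the visitor's country code is assumed to
# come from whatever GeoIP lookup the host provides (e.g. a server variable).

def redirect_target(country_code, requested_path):
    """Return the URL to redirect to, or None if the visitor should stay where they are."""
    site_root = "https://www.mysite.com"
    chinese_prefix = "/sub"

    if country_code.upper() != "CN":
        return None  # non-Chinese visitors stay on the English site
    if requested_path.startswith(chinese_prefix):
        return None  # already on the Chinese section, so avoid a redirect loop
    return site_root + chinese_prefix + requested_path

print(redirect_target("CN", "/pricing"))    # https://www.mysite.com/sub/pricing
print(redirect_target("US", "/pricing"))    # None
print(redirect_target("CN", "/sub/about"))  # None (no loop)
```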
403 forbidden error website
Hi Mozzers, I have a question about a new website from a new customer, http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find what the problem is. I have checked it on: http://gsitecrawler.com/tools/Server-Status.aspx
Technical SEO | | MaartenvandenBos
result:
URL=http://www.eindexamensite.nl/ **Result code: 403 (Forbidden / Forbidden)**

When I delete the .htaccess from the server there is a 200 OK :-). So it is in the .htaccess.

.htaccess code:

ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php

# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]

# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$

# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache

# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]

# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]

# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET

# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$

# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]

# End static file caching
DirectoryIndex index.html

CMS is TYPO3. Any ideas? Thanks!
Maarten
How to create a tree-like structure map of a website?
Hi all, the online marketing manager requested a tree-like map of the website. He means that he would like to see a graphical representation of the website and its contents. This way we will be able to see if there are internal link issues. The problem is that there are thousands of pages and many subdomains, so manual labour would make this a very tedious task. If you got this question, how would you try to solve it? Any software recommendations?
Technical SEO | | djingel10 -
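One hedged way to avoid the manual labour: a small crawler can follow internal links and record where each page was first discovered, which yields a spanning tree of the site. A minimal sketch, assuming Python with `requests` and `beautifulsoup4` installed; the start URL, page limit, and same-host rule are all placeholder choices:

```python
from collections import defaultdict, deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def build_link_tree(start_url, max_pages=200):
    """Crawl internal links breadth-first, recording where each page was first discovered."""
    root_host = urlparse(start_url).netloc
    children = defaultdict(list)  # parent URL -> pages first found on that parent
    seen = {start_url}
    queue = deque([start_url])

    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, anchor["href"]).split("#")[0]
            # Stay on the same host; covering subdomains would need a looser check here.
            if urlparse(link).netloc == root_host and link not in seen:
                seen.add(link)
                children[url].append(link)
                queue.append(link)
    return children

def print_tree(children, url, depth=0):
    print("    " * depth + url)
    for child in children.get(url, []):
        print_tree(children, child, depth + 1)

start = "https://www.example.com/"  # placeholder start URL
print_tree(build_link_tree(start), start)
```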
On a dedicated server with multiple IP addresses, how can one address group be slow/time out and all other IP addresses OK?
We utilize a dedicated server to host roughly 60 sites. The server is with a company that utilizes a lady who drives race cars.... About 4 months ago we realized we had a group of sites down, thanks to monitoring alerts, and checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they said there was a problem and it was resolved within about 2 hours. Up until recently we had no problems. As part of our ongoing SEO we check page load speed for our clients. A few days ago a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month. Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed-tested on multiple computers and were at 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up. Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea as to what the issue is?
Technical SEO | | RobertFisher0 -
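One way to narrow this down from the outside is to time a plain HTTP request to each hosted site and group the timings by the IP address the hostname resolves to, to see whether the slow or failing responses really cluster on one address. A minimal sketch, assuming Python with `requests` installed; the domain list is a placeholder:

```python
import socket
import time
from collections import defaultdict

import requests

SITES = ["site-one.example", "site-two.example", "site-three.example"]  # placeholder domains

def timed_fetch(domain):
    """Return (resolved IP or None, seconds to fetch the homepage, or None on failure)."""
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        return None, None
    start = time.monotonic()
    try:
        requests.get("http://" + domain + "/", timeout=15)
        return ip, time.monotonic() - start
    except requests.RequestException:
        return ip, None  # timeout or connection error

by_ip = defaultdict(list)
for domain in SITES:
    ip, elapsed = timed_fetch(domain)
    by_ip[ip].append((domain, elapsed))

for ip, results in by_ip.items():
    print(ip)
    for domain, elapsed in results:
        print("  {}: {}".format(domain, "FAILED" if elapsed is None else "{:.2f}s".format(elapsed)))
```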
Multiple Domains, Same IP address, redirecting to preferred domain (301) -site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as the preferred domain for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects on Apache, including both the www and non-www versions. When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google indexed our site under several of these URLs (ignoring the redirects), using the same content from our main website but swapping out the domain. We added some additional redirects on Apache to send the individual pages indexed under the wrong domain to the same page under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index. However, in December of 2010 we made significant changes to the external DNS for our IP addresses, and since December we have seen pages indexed under these redirecting domains on the rise again. If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on a link, it does redirect to the same page under the preferred domain. So the redirect is working and has been confirmed as a 301. But for some reason Google continues to crawl our site and index it under these incorrect domains. Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the pages indexed under the wrong domain, but it appears to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old versions of these sites. They only seem to be the new content from the new site, but not under the preferred domain. Any insight would be much appreciated because we have tried many things without success to get this resolved.
Technical SEO | | sboelter0
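A quick way to sanity-check the setup from a crawler's point of view is to request each legacy hostname without following redirects and confirm it answers with a single 301 whose Location points at the preferred domain. A minimal sketch, assuming Python with `requests` installed; laboratoryid.com is taken from the question and the other names are placeholders:

```python
import requests

PREFERRED_HOST = "www.preferred-domain.example"             # placeholder
LEGACY_DOMAINS = ["laboratoryid.com", "old-brand.example"]  # laboratoryid.com is from the question

for domain in LEGACY_DOMAINS:
    for scheme in ("http", "https"):
        url = f"{scheme}://{domain}/"
        try:
            response = requests.get(url, allow_redirects=False, timeout=10)
        except requests.RequestException as exc:
            print(f"{url}: request failed ({exc})")
            continue
        location = response.headers.get("Location", "")
        ok = response.status_code == 301 and PREFERRED_HOST in location
        verdict = "OK" if ok else "CHECK THIS"
        print(f"{url}: {response.status_code} -> {location or '(no Location header)'} [{verdict}]")
```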