A website that will not load on a particular computer? Help Me Please!
-
We took on a new client about two weeks ago, moved them off a proprietary CMS onto a WordPress site, optimized the site, and were finishing up small details three days ago, when the PC in my personal office suddenly stopped loading the site, whether from a Google search result or a direct URL.
Our office uses a D-Link wireless router, but my PC is hardwired. I fired up my MacBook Pro (six months old, solid-state drive) on the wireless network, and the site would not load there either. PCs and Macs in the offices around me would all load the site.
A search online turned up a fix for the PC. I tried it, and it did not work; my lead dev tried it, and it did not work; a server-side friend I called had never heard of such a thing. Every fix revolved around changing IP addresses and the like. I uninstalled the antivirus programs on my PC and installed every outstanding update. No new software had been installed on either box before the problem started.
Can you help??? Is there any chance that someone not associated with us, just searching for my client or entering the direct URL, could experience the same problem?
-
Yes, Woj, we were able to get to it from other PCs in the office. But that's a good way to check, thanks.
-
Can you get to it using a proxy service like proxify.com?
-
OK, I'll be working till late, so let us know how you go.
-
Thanks Alan,
Will try that when I go into the office in an hour or so.
-
I think it's something around a proxy server / DNS setting, Steve. I tried changing it based on a fix for XP, but to no avail.
The Mac works from outside the office, and I will recheck at the office in an hour or so. (OMG, working on a Saturday again.)
Thanks for the help.
-
Thanks EGOL, did both and no good. I completely uninstalled AVG and disabled the Windows firewall. Still no good. The only fix I can find online is for Windows (I use XP Pro on that machine), and even that did not work. ???
-
Disable firewall or antivirus for a moment and try visiting the site. My firewall blocks a couple of sites or makes them perform poorly.
-
Rob, you will need to run the nslookup on the PC with the problem.
It may be that the other, unaffected machines have the DNS entry cached and will eventually hit the same problem.
-
Thanks Doug, I will check with my lead dev, as I know he worked on it for about 30 minutes in the command window.
We changed IP addresses, etc., to no avail.
-
Did you try nslookup from the PC, as Doug suggested?
-
Will PM the URL to you Alan. thx
-
Different computers on the same network / IP segment failing to load the site could point to a proxy server or a DNS setting.
I'm on Mac, so that's what I know about.
What are your Mac network settings?
Do you get an IP address?
Can you load other websites from the same virtual hosting server?
Can you load the website via the IP address?
What's the URL of the website?
Steve
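Steve's "load the website via the IP address" step can also be scripted, which sidesteps the browser cache entirely. A rough Python sketch; the IP address and hostname below are placeholders, not the client's real details:

```python
import http.client

def fetch_via_ip(ip, host, path="/", port=80):
    """Request a page by server IP while sending the real Host header.

    If this succeeds but the normal URL fails in a browser, the problem
    is name resolution (DNS/proxy), not the web server itself.
    """
    conn = http.client.HTTPConnection(ip, port, timeout=10)
    try:
        conn.request("GET", path, headers={"Host": host})
        return conn.getresponse().status
    finally:
        conn.close()

# e.g. fetch_via_ip("203.0.113.10", "www.example.com")
```

The Host header matters because the question mentions virtual hosting: many sites can share one IP, and the server picks which one to serve based on that header.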
-
Are other sites loading normally?
Can you resolve the hostname from a command window on the PC:
nslookup {hostname}
If you can't resolve the hostname, then it's probably a DNS issue. (You could try adding an entry to your hosts file and see if that gets around the problem.)
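The same resolution check can be done from Python, which gives a clean yes/no when nslookup output is hard to read. A minimal sketch; substitute the real hostname:

```python
import socket

def resolve(hostname):
    """Return the IPv4 address the local resolver gives for hostname,
    or None if the lookup fails (the symptom of a DNS problem)."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None
```

If this returns None on the broken PC but an address on the working machines, the hosts-file workaround mentioned above (mapping the hostname to the server's IP) should get you browsing again while you chase down the DNS issue.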
-
When you say it does not load, what actually happens?
Can you PM a URL to me? Can it be loaded from outside the office?