Reasons Why Our Website Pages Randomly Load Without Content
-
I know this is not a marketing question, but this community is very dev-savvy, so I'm hoping someone can help me. At random times we're finding that our website pages load without the main body content. The header, footer, and navigation load just fine. If you refresh, the content appears, but that's not a solution.
- Happens on Chrome, IE, and Firefox, tested with multiple browser versions
- Happens across various page types, but seems limited to the main content section/container
- Happens on the company network as well as externally
- Happens after deleting cookies and temporary internet files and restarting the computer
- We are using a CMS that is virtually unheard of: Bridgeline iAPPS
- The codebase is .NET
Our IT/dev group keeps pushing back, blaming it on cookies or Chrome plugins, because they apparently are unable to "recreate the problem". This has been going on for months, and it's a terrible experience for users. It's also not great to land PPC visitors on pages that load with no content. If anyone has ideas as to why this may be happening, I would really appreciate it.
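One way past the "can't recreate it" stalemate might be to capture errors from real visitors' browsers. A minimal sketch (the /log-client-error endpoint is hypothetical; any logging endpoint or error-tracking service would do), added to the shared page template:

```javascript
// Minimal client-side error reporter - a sketch, not production code.
// Reports uncaught JavaScript errors to a hypothetical /log-client-error
// endpoint so the dev team can see failures they can't reproduce locally.
window.addEventListener('error', function (event) {
  var report = JSON.stringify({
    message: event.message,
    source: event.filename,  // which script file threw
    line: event.lineno,
    column: event.colno,
    page: window.location.href,
    userAgent: navigator.userAgent,
    time: new Date().toISOString()
  });
  if (navigator.sendBeacon) {
    navigator.sendBeacon('/log-client-error', report); // survives page unloads
  } else {
    var xhr = new XMLHttpRequest(); // fallback for older browsers
    xhr.open('POST', '/log-client-error', true);
    xhr.send(report);
  }
});
```

Even a few days of these logs would show whether an uncaught error consistently precedes the blank pages.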
I'm not sure if links are allowed, but today the issue happened on this page: serversdirect.com/dm/geek-biz
Linking to an image example below
-
Thanks Gazzerman - this helps as well. Yes, I agree the site definitely needs SEO attention. It was not until recently that a more experienced team was brought on to 'fix the site'.
-
I took one look at the code and was amazed by how much of it is inline. Most of it should be in external JS and CSS files. I also noticed that many of the solutions pages don't even have title tags! What's going on there? That needs SEO attention and is a big no-no for coding. This leads me to believe the code needs very close inspection. It surely can't take that long to strip a whole bunch of that code out of each page or template.
If a page is slow to load, or pulls in external JS files as well, you can have scripts try to execute before the code they depend on has loaded. This is probably what is happening some of the time, and it would explain the inconsistent nature of the problem.
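To illustrate the kind of race I mean (hypothetical markup, not the site's actual code - the file path and renderMainContent helper are made up for illustration):

```html
<!-- Fragile: async gives no ordering guarantee, so on some loads the inline
     call runs before the external file has defined renderMainContent, throws
     a ReferenceError, and the main container is never filled. A synchronous
     script that intermittently fails to download causes the same symptom. -->
<script src="/Script%20Library/Global.js" async></script>
<script>
  renderMainContent(); // hypothetical helper defined in the external file
</script>

<!-- Safer: defer preserves execution order and finishes before
     DOMContentLoaded fires, and the handler checks that the dependency
     actually arrived before calling it. -->
<script src="/Script%20Library/Global.js" defer></script>
<script>
  document.addEventListener('DOMContentLoaded', function () {
    if (typeof renderMainContent === 'function') {
      renderMainContent();
    } else {
      console.error('Global.js failed to load; main content not rendered');
    }
  });
</script>
```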
Run the site through a few validators (the W3C HTML and CSS validators are a good start). I am sure you will find a pile of issues.
-
Oddly enough, it only seemed to happen on the /Servers page - for me at least. The first time I loaded the page, I got a broken image and an X-Out/Close box. That may help you home in on the problem.
-
Travis - thanks for your response. This gives me a better idea of where to start looking. Honestly, this site was very poorly developed in my opinion. It was before my time, and much of it was outsourced to junior teams overseas, which is evident when looking at the source code. But I am stuck with it for now.
-
Found the site. I'm able to recreate the problem. Check the image attachment.
I ran it through tools.pingdom.com; a Chrome extension script called cast_sender.js isn't loading. But something makes me think there's an issue with the CSS that only crops up from time to time, like the JS blocks it every once in a while. There are both a Global.css and a Global.js in the head. There are also folder levels called /Script%20Library/ and /Style%20Library/.
There's a ton of JavaScript on the site, which I don't have time to get into. And honestly, I'm more of a WordPress guy. I managed to make the page load fail a few times out of a dozen or so tries. I think something in the JavaScript is causing the issue - see the sketch below for the kind of failure mode that would fit.
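To show what I mean (hypothetical markup, not pulled from the site - someFlakyDependency and buildPageBody are stand-ins): if the header and footer are static HTML but the main container is filled in by script, a single uncaught error before the fill step blanks only that container:

```html
<header>Static HTML - always renders.</header>

<div id="main-content"></div>

<script>
  // An uncaught exception stops the REST OF THIS SCRIPT BLOCK only.
  // Later <script> blocks (footer widgets, analytics) still run, so
  // everything except the main container looks normal.
  someFlakyDependency(); // hypothetical call that intermittently throws
  document.getElementById('main-content').innerHTML = buildPageBody(); // never reached on failure
</script>

<footer>Also static - also renders fine.</footer>
```

That pattern would match the symptoms exactly: intermittent, browser-independent, and limited to the main content section.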