Is there something fundamentally wrong with our site architecture?
-
Hi everyone!
Could a few of you brilliant people take a look at the architecture of this site http://www.ccisolutions.com, and let me know if you see any obvious problems? I have run the site through XENU, and all of our most important pages, including categories and products, are no deeper than level 3. Everything deeper than that is, in most cases, an image, a pdf or an orphaned page (of which we have thousands).
Could having thousands upon thousands of orphaned pages be hurting our rankings more than our site architecture? I have made loud noises and suggested that duplicate content, site speed, and dilution of page authority due to all those orphaned pages are some of the primary reasons we don't rank as well as we could. But I think those suggestions just aren't sexy or dramatic enough, so there is much shaking of heads and talk that there must be something fundamentally wrong with the site architecture.
I know rearranging the furniture is more fun than scrubbing the floors, but I think our problems are more about fundamental cleanup than about moving things around.
What do you think?
-
Thank you so much, Peter. This is excellent advice and has not fallen on deaf ears! As we move forward, not only with trying to make the current site better but also with redesigning the site on a new platform that is easier for our customers to use, your advice is going to come in very handy. Thanks so much for taking the time to comment and advise!
Dana
-
Come back in two or three months after you remove the pages and let us know if anything happened.
-
Thanks again. Yes, we are definitely going to go for it. It makes complete sense.
-
If this were my site, I would place that form under a tab.
If you get rid of the review/ratings pages and the email-to-a-friend pages, you will cut a lot of pages from the site. That will give you a more compact site with a much higher content value per page.
If I dumped pages like these from one of my sites I would be hoping to see my rankings slowly climb a little higher.
I can't guarantee that... It is what I would do myself and what I would hope the result would be.
Good luck if you try it. I think that there is upside here and I think that the downside is about zero.
-
One final follow-up question, EGOL, if you don't mind. We also have a link on all the product pages that goes to an "Email this Product to a Friend" request page on another URL. It's a very similar scenario to the "Rating/Review" request links. Are these causing a similar problem? I ask because if we are going to fix one, we might as well fix the other at the same time. Let me know what you think.
Thanks as always!
-
Fascinating. It never ceases to amaze me that the better I get at SEO, the less I seem to know. Thanks very much, EGOL. I very much appreciate your explanation and help on this one!
-
My understanding is that pagerank flows into every link on a page. If one of those links is nofollowed, the pagerank that would flow through that link is simply lost; it evaporates rather than being redistributed to the other links.
http://www.youtube.com/watch?v=bVOOB_Q0MZY
http://www.youtube.com/watch?v=cl0MBeKDXLY
These pages are not needed for the visitor either. It's one click less if the form is placed under the review/ratings tab that currently leads to them.
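To put rough numbers on it, here is a toy sketch of that model (illustrative only, not how Google actually computes PageRank; the link counts are made up): each link gets an equal share of the page's distributable pagerank, and the share assigned to a nofollowed link evaporates instead of being redistributed.

```python
# Toy model of the behavior described above: pagerank is split evenly
# across all links on a page, and the share assigned to a nofollowed
# link evaporates instead of being passed on or redistributed.
# The numbers are illustrative only.

def split_pagerank(total_links: int, nofollow_links: int, page_rank: float = 1.0):
    """Return (pagerank passed through followed links, pagerank lost to nofollow)."""
    per_link = page_rank / total_links
    passed = per_link * (total_links - nofollow_links)
    lost = per_link * nofollow_links  # this share simply disappears
    return passed, lost

# A product page with 20 links, one of them a nofollowed review-request link:
passed, lost = split_pagerank(total_links=20, nofollow_links=1)
print(f"passed: {passed:.3f}, evaporated: {lost:.3f}")  # passed: 0.950, evaporated: 0.050
```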
-
Thanks EGOL,
Is it evaporating pagerank simply because those pages exist as many separate pages with thin/duplicate content? Or is the nofollow itself evaporating pagerank? If so, how? I was under the impression that adding a nofollow attribute retained the pagerank of the page on which the nofollowed link resides.
Thanks for any clarification you can provide. I am trying to get my ducks in a row before giving marching orders to our IT director.
Dana
-
That would evaporate a lot of pagerank.
I would place the form under a tab to reduce that loss.
-
Thanks EGOL,
I have spent some time discussing the "Review Request" page types that you referenced with our IT director. Since we have these set as "Disallow" in our robots.txt file, and it appears that none of these pages have been indexed, shouldn't we be able to accomplish what needs to be done by adding a nofollow attribute to these pages?
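For anyone who wants to verify this kind of setup, here is a minimal sketch using Python's standard urllib.robotparser (the example URL is the review-request page EGOL cites lower in the thread) to confirm that Googlebot really is blocked by the robots.txt rule:

```python
# Quick check that a review-request URL is blocked for Googlebot by
# robots.txt. This only confirms crawling is disallowed; it says nothing
# about pagerank flowing into the links that point at that URL.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.ccisolutions.com/robots.txt")
rp.read()

url = ("http://www.ccisolutions.com/StoreFront/jsp/product/"
       "ProductRatingRequest.jsp?product=ANH-GL2400-24")
print(rp.can_fetch("Googlebot", url))  # False means the Disallow rule applies
```

Note that, under the model EGOL describes above, a Disallow rule only stops crawling; the links on the product pages still take their share of pagerank, which is why he suggests moving the form under a tab rather than just nofollowing the links.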
-
Maybe do some re-architecting along with the cleanup.
I didn't walk through the cart process, but the site navigation at least has the look of something that's been patched and altered over time. For example, the main menu's "My Account" and "Login" items point to the same page. That kind of thing should be sorted out, and the navigation paths should be mapped and simplified where possible.
The site could also do with some copy editing. You might want to tighten up how everything is organized and plan, overall, which pages are supposed to rank for which terms. The information seems spread around too much; it all feels a little scattered and, in places, a little verbose.
Here: https://www.ccisolutions.com/StoreFront/category/CLA.cat, we have a misused "About Us" page offering additional service descriptions instead of talking exclusively about the company.
Service overviews should live only on their own focused pages designed to rank for their terms.
The About page should be about the company itself: who's in it, its reason for being, its history, its function within the economy, corporate responsibility ideas, and the market(s) it serves. That kind of stuff.
That's my take from a quick check.
-
If I had a site with thousands of orphaned pages, I would be attacking them with an ax in both hands. If they served no purpose, they would have been redirected the same day they were orphaned.
Google can have a really long memory for orphaned pages: years, for some that I have seen. So if these pages are thin or duplicate content, I think they would be putting a drag on your site, if not causing an outright Panda problem.
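If it helps to triage them before swinging the ax, here is a minimal sketch (it assumes the third-party requests library and an orphan list exported from a crawler such as Xenu; the URL shown is a made-up placeholder) that checks which orphaned URLs still return a 200 and therefore need a 301 or removal:

```python
# Triage a list of orphaned URLs: anything still returning 200 is a
# candidate for a 301 redirect or removal; anything already 404/410
# will eventually drop out of the index on its own.
import requests  # third-party library: pip install requests

orphans = [
    "http://www.ccisolutions.com/StoreFront/old-orphaned-page.jsp",  # placeholder URL
]

for url in orphans:
    try:
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"error: {exc}"
    print(status, url)
```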
If possible, I would also put links in the PDF documents that point to the homepage or a more relevant page of the website. Then any linkjuice that flows into them from your site or from other sites is put to good use. For PDFs that you don't own and can't edit, I would look for a way to provide the same information on a page where linkjuice can flow back.
I would also go after those review form pages, replacing them with a form on the product page under a tab. Those pages are linkjuice sinks: trivial content, duplicate content. I would get rid of them ASAP.
Here is an example:
http://www.ccisolutions.com/StoreFront/jsp/product/ProductRatingRequest.jsp?product=ANH-GL2400-24
That's what I see in a quick look.