Ranking Issues Recently Popping Up
-
We have a site that, based on your research tools, holds its own in almost every respect: number of links, number of linking root domains, link quality, MozRank, MozTrust, and so on. Compared to our top competitors, we do very well according to your tools in our campaign monitor.
The issue is that, despite all this, our rankings and traffic have been dropping sharply every month, and we can't work out the cause. I do have a couple of ideas, and I wanted to run them by you guys to get your opinion.
Domain: bonitaj.com
My Thoughts On Possible Issues...
1. Text Content & Panda Update
I know one of the big things with the Panda update was content quality, and one thing we definitely lack is text-based content. Sure, we have a home page, main categories, sub-categories, and product pages, but they're mostly just windows into the product pages and don't have much good copy. This is most evident on our product pages: each one is loaded with product details, but has only 3-4 very short paragraphs of actual text. Do you think this is hurting us? The only wrinkle is that our competitors don't have much text-based content on their pages either.
2. Too Many Category Pages Featuring Largely the Same Products
I think another problem may be that our category pages feature a lot of the same products. I don't think it's outright duplicate content, but back in the day we got a little carried away creating "niched-out" category pages that feature pretty much the same products as the more important base category pages. Do you think this is hurting us?
I've pitched a solution that involves toning down the number of sub-categories we originally built to attract long-tail traffic. That tactic isn't really working anymore anyway, so maybe we're spreading the site thin by going too deep with some of these niche category pages?
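One option (my own sketch, not something from the original post) if those niche sub-categories have to stay live is to point each one at its parent category with a rel=canonical tag, so only the base page competes in the index. The URLs below are hypothetical placeholders:

```html
<!-- In the <head> of a niche sub-category page that duplicates a
     parent category, e.g. /handbags/red-leather-handbags
     (hypothetical URL) -->
<link rel="canonical" href="http://www.bonitaj.com/handbags" />
```

This consolidates the duplicate pages without deleting them, though it does mean the niche pages themselves stop ranking.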
3. Lack of a sitemap?
We used to use an XML sitemap, but we really don't anymore, and we have nothing on file with Google Webmaster Tools. I recently read in one of your blog posts that something as simple as adding a good sitemap could help our 600-plus-page site get crawled more deeply, allowing more pages to rank.
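For a site of this size, a sitemap can even be generated with a short script. Here's a minimal sketch using Python's standard library; the URLs are hypothetical placeholders, not bonitaj.com's real pages:

```python
# Minimal sketch: build a sitemap.xml string for a small catalog site.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (as a string) listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages; in practice you'd pull these from the CMS or a crawl.
pages = [
    "http://www.example.com/",
    "http://www.example.com/category/handbags",
    "http://www.example.com/product/red-leather-tote",
]
xml = build_sitemap(pages)
print(xml)
```

Save the output as sitemap.xml at the site root and submit it in Webmaster Tools so Google knows it exists.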
In the end, my question is simple: if there are one or two things I could do to get over this hump, what would you suggest?
-
Interesting note to add to this: I just noticed via Google Webmaster Tools that our blog had been hacked and 150 or so of our blog posts were injected with hidden link spam. Webmaster Tools reported that the second-biggest keyword on our site was "Viagra," so I'm guessing that's definitely playing into these ranking drops over the last couple of months.
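To triage a hack like that, one quick (and admittedly rough) first pass is to flag links sitting inside display:none containers, which is a common injection pattern. This is my own sketch with made-up spam markup, not the actual injected code, and a real cleanup still needs a manual audit of each post:

```python
# Rough sketch: flag links hidden inside display:none containers,
# a common link-spam injection pattern. Regex on HTML is fragile;
# this is a triage tool, not a cleaner.
import re

HIDDEN_BLOCK = re.compile(
    r'<[^>]*style="[^"]*display:\s*none[^"]*"[^>]*>(.*?)</[^>]+>',
    re.IGNORECASE | re.DOTALL,
)
LINK = re.compile(r'<a\s[^>]*href="([^"]+)"', re.IGNORECASE)

def find_hidden_links(html):
    """Return hrefs of anchors found inside display:none containers."""
    links = []
    for block in HIDDEN_BLOCK.finditer(html):
        links.extend(LINK.findall(block.group(1)))
    return links

# Hypothetical injected post body for illustration.
post = (
    '<p>Real copy.</p><div style="display:none">'
    '<a href="http://spam.example/viagra">buy now</a></div>'
)
print(find_hidden_links(post))  # -> ['http://spam.example/viagra']
```

Running something like this across the 150 affected posts gives a list of spam targets to strip out; after the cleanup, a reconsideration/malware review request in Webmaster Tools is the usual next step.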