Can anyone see what hurt my traffic?
-
http://www.freescrabbledictionary.com/
I need some fresh eyes on this traffic of mine. I have compared the dates to when updates rolled out (Panda, Penguin, Page Layout/Top Heavy, etc.) and I can't match these dates with any of them. Can anyone shed some light on my traffic drops, recovery, and subsequent drop, and see if there is a correlation with any major updates?
- Dropped Jan 22 2014
- Dropped April 4 2014
- Recovered August 25 2014
- Dropped September 18 2014
-
Yeah - it doesn't seem to match known updates, but it's certainly dramatic movement. This isn't some seasonal shift or cyclical thing - these are clearly steep climbs and drops. We don't know the dates of most of the Panda data refreshes, but we do pin one at Sept. 23, 2014, so even that's not lining up cleanly here.
Are you seeing a lot of losses in the long tail? The site doesn't seem to have much authority, but you've got 10K+ pages indexed, and I strongly suspect that many of them may look thin to Google. It feels Panda-like, for lack of a better word, even if it isn't quite Panda (maybe just a very strong filter). Meanwhile, if the links you have are problematic at all, you could be hitting a double whammy. It might not be a single update; you could be getting hit by a number of different changes over time. The trend line certainly isn't promising.
My concern is the individual definition pages, like:
http://www.freescrabbledictionary.com/dictionary/word/barrio/
While it looks to have a lot of content, the definitions come from online dictionaries (or, at least, are shared with them), and the examples seem to be drawn from publicly available web pages. So, it's very possible that each element on these pages looks duplicated across the web to Google. With a strong link profile, it might not be a problem, but if you're struggling on links and with content, the odds could end up stacked against you.
Truthfully, you may have to see where your strongest ranking pages are (are they top-level or long-tail) and consolidate. If you've taken losses on individual word pages and most of your traffic is coming from pages like this:
http://www.freescrabbledictionary.com/word-lists/words-with-a/
...then you might want to consider not indexing those lower-value pages and focusing on what's working. This could help concentrate your link equity on a stronger offering. It's a difficult choice, but I don't think you're looking at a technical problem here or a clear, single-shot penalty. I think you're looking at a systemic problem.
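If you go the de-indexing route, the usual mechanism is a robots meta tag on each low-value page. A sketch (exactly where this goes depends on your templates/CMS):

```html
<!-- In the <head> of each low-value page: keep it out of the index,
     but let crawlers follow its links so internal link equity still flows -->
<meta name="robots" content="noindex, follow">
```

If you can't edit the page templates, the equivalent `X-Robots-Tag: noindex, follow` HTTP header does the same job.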
Looks like you've got a chunk of 404s in your category pages, too, such as:
http://www.freescrabbledictionary.com/word-lists/words-with-z/10-letters/
Not sure what's going on there - whether you're trying to de-index those, or there's a technical issue.
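To get a handle on how widespread those 404s are, you could crawl the category URL patterns and bucket the results by status code. A minimal sketch in Python (the URLs and status codes below are hypothetical examples, not actual crawl data):

```python
from collections import defaultdict

def bucket_by_status(results):
    """Group (url, http_status) pairs by status code."""
    buckets = defaultdict(list)
    for url, status in results:
        buckets[status].append(url)
    return dict(buckets)

# Hypothetical results from crawling the category pages
crawl = [
    ("/word-lists/words-with-a/", 200),
    ("/word-lists/words-with-z/10-letters/", 404),
    ("/word-lists/words-with-q/9-letters/", 404),
]
by_status = bucket_by_status(crawl)
# The 404 bucket is the list of pages needing a redirect, a fix, or intentional removal
print(by_status[404])
```

Feed it the real status codes from any crawler export and you can quickly see whether the 404s cluster in one URL pattern or are scattered.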
Unfortunately, I suspect this is a complex, multi-layer problem and there probably isn't a single solution. I hope I'm wrong, but that's my gut feeling at a glance.
-
I just noticed that every one of my dictionary word pages (280,000+ of them) had a list of anagram words, ranging from 10 to 50 per page, and they all contained links like this:
[[the word]](/dictionary/word/[the word]/ "check and see if [the word] is in the scrabble dictionary")
[the word] represents the specific word. "Scrabble dictionary" was my main and hardest-hit keyword. Could this have looked spammy to Google? The title attribute contained "scrabble dictionary," and it appeared on each of those 280,000 pages at least 10-50+ times.
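For a sense of scale, some back-of-the-envelope math on the numbers above shows how many exact-match "scrabble dictionary" strings Google would have encountered site-wide:

```python
pages = 280_000
links_per_page_low, links_per_page_high = 10, 50

# Each anagram link carried a title attribute containing "scrabble dictionary",
# so the site-wide count of exact-match keyword strings is roughly:
low = pages * links_per_page_low     # 2.8 million
high = pages * links_per_page_high   # 14 million
print(low, high)
```

Millions of repetitions of one keyword phrase in boilerplate internal links is certainly the kind of pattern that can look templated/spammy at scale.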
-
Well, the biggest change I made in the last week was completely refreshing my site with a new responsive design. I made sure all my canonicals and redirects are set up properly and checked them tenfold.
Some changes I made over the last few months were focused on Panda-related issues: I noindexed about 90% of my site. This was after the traffic drops, not before. The reason I did this: for instance, all my dictionary pages get their definitions from an API provided by another site called Wordnik. Since that would be considered duplicate content, I went ahead and noindexed them.
Example Page: http://www.freescrabbledictionary.com/dictionary/word/test/
I also did the same for all my quotes and sentence examples, since those would all be considered duplicates. Everything else is original content, which used to rank very well. Because of all the duplicate content, I thought that could possibly be an issue... no changes yet.
I also went through my backlink profile, got some links removed that I suspected were spammy, and disavowed the very spammy ones I could not get removed.
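For anyone following along: the disavow file Google accepts is plain text with one entry per line, uploaded via the Disavow Links tool in Search Console. The entries below are hypothetical examples showing the two supported line types:

```text
# Lines starting with "#" are comments and are ignored.
# Disavow a single spammy page:
http://spam.example.com/links/page.html
# Disavow every link from an entire domain:
domain:spammy-directory.example.net
```

The `domain:` form is usually the safer bet for link networks, since new pages on the same spammy domain stay covered.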
That is where I am so far. I used to rank very well in my niche for years, and it was during the Panda/Penguin days that my rankings tanked, so I suspected those updates had something to do with it. At this point I have tried so much that I feel like there is nothing else left to try.
-
During the low traffic periods, did you make any changes to the site to try and recover? Was any additional link building performed? Anything else outside of the new site design? And do you know when that happened?
Your backlink profile certainly isn't the worst I have ever seen, although a bit of a disavow wouldn't go amiss with some of them, and I wouldn't necessarily be jumping the gun thinking that you have been hit by a penalty.
Where is your traffic at right now?
-Andy
-
Hi there
I would take a look at your backlinks - you have a TON of backlinks coming from sites that look just like yours. I am using Majestic data in my quick scan. From there you can decide which links you wish to remove or disavow. This is something you need to take care of.
Hope this helps! Good luck!
-
This is strictly organic / Google traffic.
No changes that I am aware of, but at the time I had an old site design; I have a new one now.
I have never had a manual action.
I annotate when I make changes, so I know I made none then.
I strongly suspect Penguin even though the dates don't line up, but I will have to wait for the next refresh before I know.
-
Hi there
Is this strictly organic traffic? Did you make any changes on those dates pertaining to GTM and GA?
Did you check whether you have any manual actions? From here on out, if you make any changes to your site or properties, annotate them so you can track how those changes perform.
Hope this helps - good luck!
Related Questions
-
Can Javascript Links Be Used to Reduce Links per Page?
We are trying to reduce the number of links per page, so for the low-value footer links we are considering coding them as JavaScript links. We realize Google can read JavaScript, but the goal is to reduce the level of importance assigned to those internal links. Would this be a valid approach? So the question is: would converting low-value footer links to JS like the one below help reduce the number of links per page in Google's eyes, even though we're reasonably sure Google can read JavaScript?
<a href="javascript:void(0);" data-footer-link="/about/about">About Us</a>
On-Page Optimization | Jay-T
-
Why would changing 404 pages increase traffic by 9%?
Neil Patel claimed in this article that by creating a custom 404 page that links out to 25 to 50 random internal pages on the website, he was able to increase the traffic of TechCrunch by 9%. I'm a bit skeptical about this claim. A couple of questions: Is this theory sound? If you've personally tried this or have read other articles supporting Neil, I'd love to learn more. Would a big site like TechCrunch really have problems with Google not indexing all of its pages? Also, does getting more pages crawled help you get more traffic? Specifically, would it help a site like mine? For reference, my site gets an average of 12,040 pages crawled per day over the last 90 days. Currently 28,922 pages have been indexed. Are there any possible downsides to trying this? Thanks!
On-Page Optimization | Brand_Psychic
-
Does PLA hurt my organic Google search?
I have experienced a big drop in organic traffic to my site, but an increase in my CPC...? Does Google punish my site over PLAs so it can earn higher revenue?
On-Page Optimization | Egmont
-
I want to improve our client's website structure so he gets more traffic locally. What advice do you have?
We want to "revamp" our client's website by improving the overall look (content, images, structure). Our client is a small retail business but wants more traffic. What advice can you give me?
On-Page Optimization | marketingmedia.ca
-
Why Can't I Get Indexed?
I cannot seem to get my website indexed by Google! I submitted the sitemap using Google WMT about a month ago but only one page is being indexed. There are very few backlinks to the site, so I don't believe there are any penalties due to over-optimization that would prevent indexing. Also, my robots.txt file is properly configured and is not preventing any pages from being crawled. I've tried using the "Fetch as Google" settings in WMT with no luck. Any ideas?
On-Page Optimization | socialfirestarter
-
How can I make it so that the various iterations (pages) do not come up as duplicate content?
Hello, I wondered if somebody could give me some advice on the problem of various iterations of the calendar page coming up as duplicate content. There is a large calendar on my site for events, and each time the page is viewed it is seen as duplicate content. How can I make it so that the various iterations (pages) do not come up as duplicate content? Regards
On-Page Optimization | Tony14Aug
-
Can internal duplicate content cause issues?
Hello all mozzers - has anyone used NitroSell? We use them only because their inventory connects to our EPOS point, but because they have thousands of 301s on our domain, we are getting duplicate content: different sizes of products (we sell womenswear) are creating separate URLs, so we are duplicating both content and URLs. I'm curious whether anyone has experienced similar problems that have affected their SERPs? Best wishes, Chris
On-Page Optimization | DaWillow
-
How can we get Google to offer postcard verification for our Place Page?
Most of the time, when we claim a Google Place Page, they give 2 choices to verify ownership: 1) phone verification and 2) postcard verification. But right now (and for several weeks), for our listing, they are only giving the phone verification choice, which unfortunately won't work with our automated phone system. How can we get our Place Page listing verified through a postcard sent to our address, when Google isn't presenting that as an option?
On-Page Optimization | DenisL