How can you tell if you were hit by the Payday Loan update in 2013?
-
Hi All,
I have a feeling that our site was hit by the Payday Loan update in 2013.
We have a website offering serviced apartments - I have seen lots of talk in forums about porn, apartment rentals and finance being among the niches hit by this update.
Can anyone offer any advice on whether this is the case and what action might be needed in order to get this penalty taken off our site?
Thanks guys!
Laura
-
Hi Brendan,
Thanks for the reply!
We did see a significant drop in organic traffic around the time the update went live.
We have lost traffic for generic phrases like "apartments in london" and "london apartments".
We haven't got a manual penalty.
Thanks so much for your help
Laura
-
Hi,
Can you tell us why you think you were hit by that specific update? Did you see a drop in organic traffic from Google around the date the update went live? If you use GA, you can overlay your traffic with Barracuda's Panguin Tool, which will show you when your traffic dropped and whether it coincides with Google's algorithm changes.
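If you'd like to sanity-check this outside the Panguin Tool, here's a rough sketch of the same idea in Python. It assumes you've exported daily organic sessions from GA to a CSV; the file name, column names and update dates are my assumptions, so double-check the dates against a published algorithm change history.

```python
# Sketch: plot daily organic sessions from a GA export and mark algorithm
# update dates, so drops that coincide with them stand out visually.
# Assumes a CSV with "date" and "organic_sessions" columns (hypothetical export).
import pandas as pd
import matplotlib.pyplot as plt

UPDATE_DATES = {
    "Payday Loan 1.0": "2013-06-11",  # approximate dates - verify against an algo change history
    "Payday Loan 2.0": "2014-05-16",
}

traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).sort_values("date")

ax = traffic.plot(x="date", y="organic_sessions", figsize=(10, 4), legend=False)
for name, day in UPDATE_DATES.items():
    ts = pd.Timestamp(day)
    ax.axvline(ts, color="red", linestyle="--")               # mark the update date
    ax.text(ts, ax.get_ylim()[1], name, rotation=90, va="top", fontsize=8)

ax.set_ylabel("Organic sessions")
plt.tight_layout()
plt.show()
```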
Are there any particular search terms you've lost ranking for?
I've taken a very quick look at your backlink profile and it looks okay, just a few spammy-looking links, like the highly optimised one here.
Also, have you checked GWT to see if you have a manual penalty?
There look to be a few on-site issues (multiple H1 tags, etc.) but nothing that would point towards a penalty. If you can get back to us on the points above, we can take a closer look for you.
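On the multiple H1 tags point, a quick way to check any page yourself is to count the H1s in the HTML that is actually delivered. A minimal sketch, with a placeholder URL:

```python
# Sketch: count H1 tags in a page's delivered HTML to spot the multiple-H1 issue.
# Requires the requests and beautifulsoup4 packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/serviced-apartments-london"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

h1s = [h1.get_text(strip=True) for h1 in soup.find_all("h1")]
print(f"{len(h1s)} H1 tag(s) found on {url}")
for text in h1s:
    print(" -", text)
if len(h1s) > 1:
    print("Consider consolidating to a single H1 per page.")
```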
-
Related Questions
-
I think I got hit by the latest Panda update
Hi everyone, I think one of my sites got hit with Panda. On Sept 18th the site dipped to "not in top 50" for almost all keywords. I checked GWT for the manual action email but my inbox is empty! The lesser of two evils, I guess. They had major server issues that week as well, so it is hard to identify what caused the site to dip. My client has original content on the website, but almost all content on the blog is copied. Do you recommend deleting the non-original content? Could the problem be elsewhere? Thanks
Technical SEO | Carla_Dawson
-
How can I get this Google meta description?
How can I get the Google meta description that the website KE Adventure has? https://www.google.co.uk/?gws_rd=ssl#q=ke+adventure I mean the links to website sections that appear below the SEO title. Thanks
Technical SEO | tourtravel
-
NOFOLLOW Links: Can we 100% ignore them for SEO purposes?
Some SEO articles say we can completely ignore nofollow links. Other articles say they still matter, but are then very vague on what they count for or against. So which is it really? I do realize that they can provide traffic, and for that they are worthwhile. But it is SEO I am asking about... The SEO purpose I am most concerned with is the link profile. Separating the follows from the nofollows often gives really different anchor text distributions. If they don't matter, why do Moz and other SEO analysis programs still include them in their standard reports? (I can see some benefit to having them as part of the in-depth reports.) So what are your thoughts? Can we 100% ignore the nofollows for our SEO analysis?
Technical SEO | GregB123
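One practical way to answer this for a specific site is to build the anchor text distribution twice, once with and once without the nofollow links, and see how different they really are. A rough sketch, assuming a backlink export CSV with "anchor" and "nofollow" columns (the file and column names are my assumptions, not any specific tool's format):

```python
# Sketch: compare anchor-text distribution with and without nofollow links.
# Assumes a hypothetical backlink export with "anchor" and "nofollow" columns.
import pandas as pd

links = pd.read_csv("backlinks.csv")
# Normalise the nofollow column (may be "true"/"false", "1"/"0", etc.) to booleans.
links["nofollow"] = links["nofollow"].astype(str).str.lower().isin(["true", "1", "yes"])
followed = links[~links["nofollow"]]

def anchor_share(df):
    # Share of each anchor text as a percentage of the link set.
    return (df["anchor"].value_counts(normalize=True) * 100).round(1)

comparison = pd.DataFrame({
    "all_links_%": anchor_share(links),
    "followed_only_%": anchor_share(followed),
}).fillna(0).sort_values("all_links_%", ascending=False)

print(comparison.head(20))
```
-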
Can Google read schema.org markup within AJAX?
Hi all, as a local business directory we display opening hours on each business listing page, e.g. http://www.goudengids.be/napoli-kontich-2550/. I also have schema.org markup for the opening hours implemented. But, for technical reasons (performance), the opening hours (and the markup alongside them) are loaded via AJAX. I'm wondering if Google is able to read the markup. The rich snippet testing tool and markup plugins like Semantic Inspector can't "see" the opening-hours markup. Any advice here?
Technical SEO | TruvoDirectories
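If the markup only appears after the AJAX call, tools that read the raw HTML won't see it, and it's safest to assume crawlers may miss it too; the usual fix is to render the opening-hours markup server-side in the initial response. As a minimal sketch, this checks whether the markup is present in the HTML as delivered, before any JavaScript runs (the strings it searches for are just examples):

```python
# Sketch: check whether opening-hours markup is present in the raw HTML
# (i.e. before any JavaScript/AJAX runs), which is what testing tools see.
import requests

url = "http://www.goudengids.be/napoli-kontich-2550/"
html = requests.get(url, timeout=10).text

markers = [
    "OpeningHoursSpecification",   # schema.org type used in JSON-LD / microdata
    'itemprop="openingHours"',     # microdata property
    "schema.org/LocalBusiness",
]

for marker in markers:
    status = "found" if marker in html else "NOT found"
    print(f"{marker}: {status} in the initial HTML")
```
-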
How can the search engines crawl my JavaScript-generated web pages?
For example, when I click a link to this movie from the home page, it sends me to this page http://www.vudu.mx/movies/#!content/293191/Madagascar-3-Los-Fugitivos-Madagascar-3-Europes-Most-Wanted-Doblada but in the source code I can't see the meta title and description, and I think the search engines won't see them either. Am I right? I guess only the source code of that "master template" appears, which is not useful for me. So my question is: how can I dynamically add this data to every movie page so the search engines can crawl all of them? Thank you.
Technical SEO | mobile360
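The common approach is to make sure each movie URL returns its own title and meta description in the initial HTML, rather than only filling them in client-side. A very rough sketch of that idea, using Flask purely for illustration (the route, IDs and text are made up):

```python
# Sketch: serve a unique title and meta description per movie in the initial HTML,
# instead of injecting them with JavaScript. Flask and the data are illustrative only.
from flask import Flask

app = Flask(__name__)

MOVIES = {
    "293191": {
        "title": "Madagascar 3: Los Fugitivos (Doblada)",
        "description": "Alex, Marty, Gloria and Melman are still on the run across Europe.",
    },
}

PAGE = """<!doctype html>
<html><head>
<title>{title} | Movies</title>
<meta name="description" content="{description}">
</head><body><h1>{title}</h1></body></html>"""

@app.route("/movies/<movie_id>")
def movie_page(movie_id):
    movie = MOVIES.get(movie_id)
    if movie is None:
        return "Not found", 404
    return PAGE.format(**movie)  # title and description are in the source HTML

if __name__ == "__main__":
    app.run()
```
-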
Can I noindex most of my site?
A large number of the pages on my site contain things like photos and maps that are useful to my visitors, but they would make poor landing pages and have very little written content. My site is huge. Would it be beneficial to noindex all of these?
Technical SEO | mascotmike
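If you do go that route, noindex can be applied either with a meta robots tag in each page's head or with an X-Robots-Tag response header, which is convenient for large numbers of pages. A hedged sketch of the header approach (Flask and the /photos/ URL pattern are just illustrative assumptions):

```python
# Sketch: send an "X-Robots-Tag: noindex" header on thin photo/map pages so they
# drop out of the index while remaining available to visitors.
# Flask and the /photos/ URL pattern are illustrative assumptions.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/photos/<slug>")
def photo_page(slug):
    resp = make_response(f"<h1>Photos: {slug}</h1>")
    resp.headers["X-Robots-Tag"] = "noindex"  # keep the page out of search results
    return resp

if __name__ == "__main__":
    app.run()
```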