I Think My Site Has Been Hacked
-
I am working with a client and have noticed lots of 500 server errors that look very strange in their webmaster tools account.
I am seeing URLs like blog/?tag=wholesale-cheap-nfl-jerseys-free-0702.html and blog/?tag=nike-jersey-shorts-4297.html.
There are 155 similar pages, yet the client does not sell anything like this and hasn't created these URLs. I have updated WordPress and all plugins and cannot find these links or pages anywhere on the site, but I am guessing they are slowing the site down, as GWT keeps highlighting them as errors.
Has anybody had any experience with these types of hacks who can point me in the right direction on how to clean it up properly?
Ta
-
If they are tags, they should show up in the Tags section of Posts, or possibly in the comments. Not sure if you allow uploads to your site, but if you do, you should check the upload folder(s). Keep in mind, these URLs could be showing up somewhere out in cyberspace, not necessarily on your site. Take the steps I pointed out and you should see those ugly URLs go away within a few weeks, not accounting for other factors.
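If it helps, here is a quick way to dump every tag on the blog without clicking around the Dashboard. This is only a rough Python sketch, assuming the WordPress REST API is enabled on the site and that you swap the placeholder domain below for the client's real one:

import json
import urllib.error
import urllib.request

# Hypothetical placeholder -- replace with the client's actual domain.
SITE = "https://example-client-site.com"

def list_tags(site):
    # Page through the WordPress REST API and return every post tag.
    tags, page = [], 1
    while True:
        url = f"{site}/wp-json/wp/v2/tags?per_page=100&page={page}"
        try:
            with urllib.request.urlopen(url) as resp:
                batch = json.load(resp)
        except urllib.error.HTTPError:
            break  # a 400 here usually just means we ran past the last page
        if not batch:
            break
        tags.extend(batch)
        page += 1
    return tags

if __name__ == "__main__":
    for tag in list_tags(SITE):
        # Injected spam tags tend to stand out by slug alone.
        print(tag["id"], tag["count"], tag["slug"])

Any slug along the lines of wholesale-cheap-nfl-jerseys that the client never created can then be deleted from Posts > Tags, and the same kind of spot check on the uploads folder (looking for stray PHP files) is worth doing too.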
-
Would I be able to see this from the Dashboard, as in by clicking the Tags section, or would they be hidden?
-
No! Sorry, SajeetNair. No disrespect here, but don't click on these URLs. That's exactly what spammers want you to do. Simply follow the tips outlined in my post and these links will be ignored. Or, better yet, disable any tag feature on your client's blog.
-
Looks like somebody or somebot is using your tag widget to tag your pages/posts with their dirty links. If you really need this tag feature, I suggest you make sure any links posted have a rel attribute of nofollow, and I'd block that ?tag parameter in your robots.txt, something like this:
User-agent: *
Disallow: /*?tag
Also, log in to your webmaster tools and add URL Parameters to let the engines know to ignore these pages.
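Once the cleanup is done, it's worth confirming what those spam URLs actually return; ideally they come back as 404 or 410 rather than 200 or the 500s GWT is reporting. A minimal sketch, using the two example URLs from the question on a placeholder domain:

import urllib.error
import urllib.request

# The two example URLs from the question, on a hypothetical domain.
URLS = [
    "https://example-client-site.com/blog/?tag=wholesale-cheap-nfl-jerseys-free-0702.html",
    "https://example-client-site.com/blog/?tag=nike-jersey-shorts-4297.html",
]

for url in URLS:
    try:
        with urllib.request.urlopen(url) as resp:
            status = resp.status  # 200 means the spam page still resolves
    except urllib.error.HTTPError as err:
        status = err.code         # 404/410 is what you want to see after cleanup
    except urllib.error.URLError as err:
        status = f"unreachable ({err.reason})"
    print(status, url)

If they still come back as 200 or 500, something on the server is still generating those pages and the theme/plugin files deserve a closer look.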
-
First, click on one of the URLs and see the pages they are being linked from. If it's an internal link, check the source code and rectify the error. If it's an external link, the third-party website needs to be contacted and the links must be removed.
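If you'd rather not click the spam URLs themselves (and I'd side with the earlier advice not to), you can check your own pages for injected links instead. A rough sketch, assuming only the Python standard library and a short list of your own page URLs (the ones below are hypothetical examples):

import re
import urllib.request
from html.parser import HTMLParser

# Pages on YOUR OWN site to inspect -- placeholder URLs, not from the thread.
PAGES = [
    "https://example-client-site.com/",
    "https://example-client-site.com/blog/",
]

# Rough pattern matching the kind of spam seen in the question.
SPAM_PATTERN = re.compile(r"jersey|wholesale|\?tag=", re.IGNORECASE)

class LinkCollector(HTMLParser):
    # Collects every href on a page so we can filter for spammy ones.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

for page in PAGES:
    with urllib.request.urlopen(page) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    suspicious = [href for href in collector.links if SPAM_PATTERN.search(href)]
    if suspicious:
        print(f"{page} has {len(suspicious)} suspicious link(s):")
        for href in suspicious:
            print("   " + href)

If the spam hrefs show up in your own markup, the injection is internal and the source code fix above applies; if they don't, the links live on someone else's site and it's the contact-and-remove route.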