What to do with internal spam URLs Google indexed?
-
I have been in SEO for years but have never come across this problem. I have a client whose web page was hacked, and hundreds of spam links were posted on it. These links have been indexed by Google. The links are not in comments; they appear as normal external URLs. See picture.
What is the best way to remove them? Should I use Google's disavow tool, or just redirect them to some page?
The web page is new, but it ranks well on Google and has a domain authority of 24. I think these spam URLs may even have improved rankings.
What would be the best strategy to solve this? Thanks.
-
Yes, I would remove/noindex rather than redirect. The URLs don't have any equity, so it's not worth redirecting them. Thanks!
-
Thanks John. I made a new website for the client, so the new one is safe; just the links are left in Google.
So you suggest removing the URLs rather than redirecting?
-
Hey there!
At first glance, it looks like your site has been hit with a malware attack. I would recommend using a service like https://sucuri.net to help you clean up the site and put certain protective elements in place so that you don't get hit again.
Once that is done, you'll want to go into Google Search Console and Bing Webmaster Tools to remove or noindex any spam URLs that were created by the attack.
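For reference, here is a minimal server-side sketch of that cleanup step, assuming an Apache server and that the hacked pages share a recognizable path (both assumptions, since the actual URLs aren't shown). Returning 410 Gone tells Google the pages were removed on purpose, which tends to get them dropped from the index faster than a 404:

```apache
# Hypothetical: the hack created its pages under /cheap-pills/
RewriteEngine On
RewriteRule ^cheap-pills/ - [G]

# For pages that must stay live but out of the index,
# a noindex header is the alternative:
# <FilesMatch "spam-page\.html">
#   Header set X-Robots-Tag "noindex"
# </FilesMatch>
```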
I hope that helps!
John
Related Questions
-
Spam signals from old company site are hurting new company site, but we can't undo the redirect.
My client was forced to change its domain name last year (long story). We were largely able to regain our organic rankings via 301-redirects. Recently, the rankings for the new domain have begun to plummet. Nothing specific took place that could have caused any ranking declines on the new site. However, when we analyze links to the OLD site, we are seeing a lot of link spam being built to that old domain over recent weeks and months. We have no idea where these are coming from but they appear to be negatively impacting our new site. We cannot dismantle the redirects as the old site has hundreds, if not thousands, of quality links pointing to it, and many customers are accustomed to going to that home page. So those redirects need to stay in place. We have already disavowed all the spam we have found on the old Search Console. We are continuing to do so as we find new spam links. But what are we supposed to do about this spam negatively impacting our new site? FYI we have not received any messages in the search console.
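For reference, the disavow file uploaded in Search Console is a plain UTF-8 text file with one URL or `domain:` entry per line and `#` comment lines; a minimal example with placeholder domains:

```text
# Spam links found pointing at the old domain (placeholders)
http://spam-blog.example.com/cheap-links.html
# Disavow every link from an entire domain:
domain:link-farm.example.net
```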
-
Traffic exchange referral URLs
We have a client who once per month is hit by easyihts4u.com, and it creates huge spikes in their referrals. All the hits go to one page specifically. From the research we have done, this site and others like it are not spam bots. We cannot understand how they choose sites to target and what good it does them, or our client, to have hits all on one day to one page. We created a filter in Analytics to give what we think is a more accurate reflection of traffic. Should we block them at the server level as well?
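If you do decide to block at the server level as well, here is a minimal sketch assuming an Apache server (the domain is taken from the question; the config itself is illustrative):

```apache
# Return 403 Forbidden to any request whose Referer header
# matches the traffic-exchange site
RewriteEngine On
RewriteCond %{HTTP_REFERER} easyihts4u\.com [NC]
RewriteRule .* - [F]
```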
-
JavaScript encoded links on an AngularJS framework...bad idea for Google?
Hi Guys, I have a site where we're currently deploying code in AngularJS. As part of this, on the page we sometimes have links to 3rd party websites. We do not want to have followed links on the site to the 3rd party sites as we may be perceived as a link farm since we have more than 1 million pages and a lot of these have external 3rd party links. My question is, if we've got javascript to fire off the link to the 3rd party, is that enough to prevent Google from seeing that link? We do not have a NOFOLLOW on that currently. The link anchor text simply says "Visit website" and the link is fired using JavaScript. Here's a snapshot of the code we're using: Visit website Does anyone have any experience with anything like this on their own site or customer site that we can learn from just to ensure that we avoid any chances of being flagged for being a link farm? Thank you 🙂
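Worth noting: Googlebot renders JavaScript, so firing the link from a script is not a reliable way to keep it out of the link graph. The explicit, documented mechanism for this is `rel="nofollow"`; a minimal sketch (the URL is a placeholder):

```html
<!-- Google can see the link but is told not to pass equity through it -->
<a href="https://third-party.example.com" rel="nofollow">Visit website</a>
```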
-
Black hat : raising CTR to have better rank in Google
We all know that Google uses click-through rate (CTR) as one of its ranking factors. An idea came to mind, and I would like to know if someone has seen or tried it before. If you search Google for the term "SEO", for example, you will see the moz.com website at rank 3. And if you check the source code, you will see that result 3 links to this URL: https://www.google.com.sa/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&uact=8&ved=0CDMQFjAC&url=https%3A%2F%2Fmoz.com%2Fbeginners-guide-to-seo&ei=F-pPVaDZBoSp7Abo_IDYAg&usg=AFQjCNEwiTCgNNNWInUJNibqiJCnlqcYtw That URL will redirect you to moz.com. Ok, what if we use linkbucks.com or any other cheap targeted-traffic network and run a campaign that sends traffic to the URL I showed you? Will that count as traffic from Google, so that it increases the CTR from Google?
-
Will aggregating external content hurt my domain's SERP performance?
Hi, We operate a website that helps parents find babysitters. As a small add-on we currently run a small blog on the topic of childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today". The idea is that we "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm our domain's general SERP performance? And if so, how can this effect be avoided? It isn't important for us that these "featured" articles rank in SERPs, so we could potentially noindex them or make the rel=canonical point to the original author. Any thoughts anyone? Thx! Daan
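For reference, both options mentioned in the question are standard head-level tags; a minimal sketch with placeholder URLs:

```html
<!-- On the re-published copy: credit the original author's post -->
<link rel="canonical" href="https://original-blog.example.com/the-article">
<!-- Or keep the copy out of the index while still letting its links be crawled -->
<meta name="robots" content="noindex, follow">
```

A cross-domain canonical is treated as a hint rather than a directive, so if it is critical that the copies never rank, noindex is the stronger signal.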
-
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of: a single home page ~50 "category" pages, and ~425 "book list" pages. (That 50 number and 425 number both started out much smaller and grew over time but has been around 425 for the last year or so as I've focused my time elsewhere.) On Friday, June 15 we made a pretty big change to the site -- we added a page for every Author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an Author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ..but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (Ok, it peaked at 100 and dropped down to a steady 60 or so day within a few days, but still. And then it trailed off for the last week, dropping lower and lower every day as if they realized it was repurposed content from elsewhere on our site...) Here's the problem: For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. 
This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So: 1. Do you think the drop is related to my upping my pagecount 8-fold overnight? 2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that only list one author (which would be the vast majority)? 3. Have you ever heard of a situation like this? Where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Norah Ephron books" it's better they land on a page of ours that pulls together the 4 books we have than taking them to a page that happens to have just one book on it among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
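On question 2, one hedged alternative to deleting the pages outright is to noindex only the thin single-book pages and keep the multi-book pages indexable. A minimal sketch of that rule; the threshold of 2 and the function name are illustrative assumptions, not something from the site itself:

```python
# Sketch: pick a robots meta value per author page, keeping multi-book
# pages indexable and marking thin single-book pages noindex.
# The threshold is an assumption chosen for illustration.
def robots_meta(book_count: int, threshold: int = 2) -> str:
    """Return the robots meta content for an author page."""
    if book_count >= threshold:
        return "index, follow"
    return "noindex, follow"

# A Roald Dahl-style page with several books stays indexable;
# a thin single-book page is kept out of the index.
print(robots_meta(5))  # index, follow
print(robots_meta(1))  # noindex, follow
```

This keeps the pages useful for visitors while removing the thin ones from Google's view, which can be reversed later if more books are added.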
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
-
Internal Link Structure
Hello Everyone, I'd be grateful for a little feedback please. This is my site, the home page of which is targeting the phrase jobs in **** (I'm sure you can fill in the gap :)). I've made a few changes recently, which included having the Contract jobs in **** | Permanent Jobs in **** | Temporary Jobs in **** & Today's jobs in **** links added to the homepage... Perhaps foolishly and impatiently, I did all of these at the same time, whilst also changing the site's internal link structure, specifically for all links to the homepage, which previously were like <a href="/">Home</a> and have now been changed to <a href="/">jobs in ****</a>. Meaning that I have 4,500 internal links with the anchor text 'jobs in ****'. But rather than seeing an improvement in my SERP ranking, I have gone from page 2 of Google to page 6, and falling... Apart from being impatient, what have I done wrong? Many thanks
Are there any "legitimate" paid links in Google's eyes?
The news about paid link campaigns is so frequent that I have to ask the question: does Google allow any paid links? Aside from SEO, paid links can have visibility value. Much like an exit sign on the highway, the paid link says "Get off here."