My Website is getting too many DMCA Hits
-
My website has been getting a lot of DMCA hits since last December, and my rankings dropped around the same time.
I would like to know if getting a new domain would be advisable, and whether it would be good to redirect the website that is getting DMCA hits to the new domain I want to get.
Is it advisable to build links for the new domain, or would the old site pass link juice to it? (It has some spammy links, though.)
-
If you stop doing the things that cause the DMCA hits and start over on a new domain, then this problem will be gone.
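For reference, the redirect the asker mentions would normally be a site-wide 301 from the old domain to the new one. A minimal sketch, assuming an Apache host with mod_alias and using placeholder domain names; note that a 301 passes along the old domain's history (including its spammy links and DMCA record) together with any equity, which is why the answer above leans toward a clean start rather than redirecting:

```apache
# .htaccess on the old domain (placeholder domains, Apache mod_alias assumed)
# Every path is sent to the same path on the new domain with a permanent (301) redirect.
Redirect 301 / https://new-domain.example/
```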
Related Questions
-
New Flurry of thousands of bad links from 3 Spammy websites. Disavow?
I also discovered that a website, www.prlog.ru, put 32 links to my website. It is a Russian site with a 32% spam score. Is that high? I think I need to disavow it. Another spammy website that links to me has a spam score of 16% and several thousand links. I added one link to the site medexplorer.com 6 years ago and it was fine; now it has thousands of links. Should I disavow all three?
White Hat / Black Hat SEO | Boodreaux
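For the disavow question above: a disavow file is a plain-text list uploaded through Google's disavow tool, one entry per line, where a `domain:` line covers every link from that host and `#` lines are comments. A minimal sketch; the domains below are simply the ones mentioned in the question, used as a hypothetical illustration rather than a recommendation to disavow them:

```text
# Disavow file sketch - lines starting with # are comments
# Disavow every link from these hosts
domain:prlog.ru
domain:medexplorer.com

# Individual URLs can also be listed instead of whole domains, e.g.
# http://spammy-directory.example/page-linking-to-us
```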
Getting Spam Links
Hi there, I am planning to disavow one spam domain, but when I check the Google cache it shows my client's domain name. So if I disavow this spam domain, which link will Google consider? Please help me. Thanks, Satla
White Hat / Black Hat SEO | TrulyTravel
Do you get penalized for Keyword Stuffing in different page URLs?
If I have a website that provides law services in various towns, and we have pages for each town with unique content on each page, can the page URLs look like the following: mysite.com/miami-family-law-attorney, mysite.com/tampa-family-law-attorney, mysite.com/orlando-family-law-attorney? Does this get penalized when being indexed?
White Hat / Black Hat SEO | Armen-SEO
Ever seen this tactic when trying to get rid of bad backlinks?
I'm trying to get rid of a Google penalty, but one of the URLs is particularly bizarre. Here's the penalized site: http://www.travelexinsurance.com. One of the external links Google cited as not being natural that links to the penalized site is: http://content.onlineagency.com/index.aspx?site=6599&tide=769006&last=3111516 In the backlink profile of the penalized site, there are about 100 different backlinks pointing to www.travelexinsurance.com from content.onlineagency.com/... So when I visit http://content.onlineagency.com/index.aspx?site=6599&tide=769006&last=3111516 it actually displays content from http://www.starmandstravel.com/787115_6599.htm, which you can see after clicking the "Home" button. That company is a legit travel agency that I assume knows nothing about content.onlineagency.com and is not involved in whatever is going on. And that's the case for every link from content.onlineagency.com. So I'm just wondering if someone can help me understand what sort of tactic content.onlineagency.com is using. I fear one of my predecessors used some black hat tactics, and I'm wondering if this is a remnant of that effort.
White Hat / Black Hat SEO | Patrick_G
How to save a website from negative SEO?
Hi, I have read a couple of good blog posts on negative SEO and learned about a few solutions that may help me protect my website during a negative SEO attack. Here, I want to share my experience and live data regarding negative SEO. Someone is creating bad inbound links to my website; I found out about it via Google Webmaster Tools. Honestly, I have implemented certain solutions such as the Google disavow tool, contacting certain websites, and more. But I can still see a negative impact on organic visits, which have been going down for the last two months, and I think these bad inbound links are the biggest reason behind it. You can visit the following URLs to learn more about it. Can anyone share their experience of saving a website from negative SEO? How can I save a website from negative SEO (~bad inbound links)? https://docs.google.com/file/d/0BxyEDFdgDN-iR0xMd2FHeVlzYVU/edit https://drive.google.com/file/d/0BxyEDFdgDN-iMEtneXU1YmhWX2s/edit?usp=sharing https://drive.google.com/file/d/0BxyEDFdgDN-iSzNXdEJRdVJJVGM/edit?usp=sharing
White Hat / Black Hat SEO | CommercePundit
11,000 links from 2 blogs + many bad links = Penguin 2.0. What is the real cause?
Hello, a website has: 1/ 8,000 inbound links from one blog and 3,000 from another one. They are clean and good blogs, and none of the links are marked as nofollow. 2/ Many bad links from directories that have been deindexed or penalized by Google. On the 22nd of May, the website got hit by Penguin 2.0. The link profile contains many directories and articles. The priority so far has been getting the bad links deindexed; however, should we nofollow the blog links as well? Thanks!
White Hat / Black Hat SEO | antoine.brunel
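On the question above, "nofollowing the blog links" would mean asking the blog owners to add a rel attribute to each link pointing at the site. A minimal HTML sketch with a placeholder URL and anchor text:

```html
<!-- Before: a normal followed link -->
<a href="https://example.com/page">anchor text</a>

<!-- After: the same link marked nofollow, so it is not treated as an endorsement -->
<a href="https://example.com/page" rel="nofollow">anchor text</a>
```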
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi all, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). We are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested.

Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area,' or 'Find your [Topic Area] training course in [City, State] with ease,' each followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram

White Hat / Black Hat SEO | CSawatzky
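One way to implement the randomized-but-stable wording the engineer describes above is to pick a paragraph variant deterministically from the venue code and topic, so a given dynamic page always renders the same text without anyone maintaining per-page copy. A minimal sketch, with hypothetical template strings and field names (not the poster's actual system):

```python
import hashlib

# Hypothetical paragraph variants, in the [Topic Area] / [City, State] style from the question
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
    "Looking for {topic} training near {city}, {state}? Browse the upcoming dates below.",
]

def intro_paragraph(venue_code: str, topic: str, city: str, state: str) -> str:
    """Pick a template deterministically so a given venue/topic page never changes between renders."""
    digest = hashlib.md5(f"{venue_code}:{topic}".encode("utf-8")).hexdigest()
    index = int(digest, 16) % len(TEMPLATES)
    return TEMPLATES[index].format(topic=topic, city=city, state=state)

if __name__ == "__main__":
    # Example: the SharePoint page for a hypothetical New York venue code
    print(intro_paragraph("NYC01", "SharePoint", "New York", "NY"))
```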
Advice on using the disavow tool to remove hacked website links
Hey everyone, back in December our website suffered an attack which created links to other hacked websites with anchor text such as "This is an excellent time to discuss symptoms, fa", "Open to members of the nursing/paramedical profes", and "The organs in the female reproductive system incl". The links were only visible when looking at the cache of the page. We got these links removed and removed all traces of the attack, such as pages which were created in their own directory on our server.

3 months later I'm finding websites linking to us with similar anchor text to the ones above; however, they're linking to the pages that were created on our server when we were attacked, and those pages have been removed. So one of my questions is: does this affect our site? We've seen some of our best-performing keywords drop over the last few months and I have a feeling it's due to these spammy links. Here's a website that links to us: http://www.fashion-game.com/extreme/blog/page-9. If you view source or look at the cached version, you'll find a link right at the bottom left corner. We have 268 of these links from 200 domains.

Contacting these sites to have the links removed would be a very long process, as most of them probably have no idea that those links even exist, and I don't have the time to explain to each one how to remove the hacked files etc. I've been looking at using the Google disavow tool to solve this problem, but I'm not sure if it's a good idea or not. We haven't had any warnings from Google about our site being spam or having too many spam links, so do we need to use the tool? Any advice would be very much appreciated. Let me know if you require more details about our problem.

White Hat / Black Hat SEO | blagger