How do I report multiple duplicated websites designed to manipulate SERPs?
-
Ok, so within one of my client's sectors it has become clear that someone is trying to manipulate the SERPs by registering tons of domains that are all keyword-targeted.
All of the websites are simply duplicates of one another and are merely set up to dominate the SERP listings - which, at the moment, they are beginning to do.
None of the sites has any real authority (in some cases a PA and DA of 1), and yet they're ranking above much more established websites. The only backlinks they have are dodgy-looking forum links. It's all a bit crazy and it shouldn't be happening.
Anyway, all of the domains were registered by the same person, within two months of each other.
What do you guys think is the best step to take to report these particular websites to Google?
-
Hey Sha,
Thanks for that.
It would seem that would be the best bet.
-
Hi Matthew,
As Mark suggested, create a spreadsheet in Google Drive and list all of the domains/URLs in the spreadsheet.
Add a single URL in the first field of the Spam Report form (the required field), then in the third field (300 character max), provide a link to the spreadsheet.
Something like: "I believe the sites listed in the Google Doc at URL are using manipulative linking practices to influence search engine rankings."
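Before listing the domains, it can help to normalise the raw URLs down to unique hostnames so the spreadsheet is easy for a reviewer to scan. A minimal sketch of that clean-up step (the domains below are hypothetical placeholders, not the actual network):

```python
from urllib.parse import urlparse

def unique_domains(urls):
    """Reduce a list of URLs to a sorted list of unique hostnames."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return sorted(domains)

# Hypothetical examples of the kind of keyword-targeted domains described above
urls = [
    "http://www.cheap-widgets-london.example/",
    "http://cheap-widgets-london.example/about",
    "http://cheap-widgets-leeds.example/",
]
print(unique_domains(urls))
# -> ['cheap-widgets-leeds.example', 'cheap-widgets-london.example']
```

One deduplicated domain per spreadsheet row keeps the report compact and makes the scale of the network obvious at a glance.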
Do be confident though, as Mark warned, that your client's site is able to withstand any scrutiny that might come with a review of sites in the niche.
Hope that helps,
Sha
-
Sorry, I don't quite follow. How would this work?
-
I would try creating an open Google Doc, and then listing all of the sites in the network - kind of like the reconsideration request method, where you link to an open Google Doc with all of the details of the webmasters you contacted, their responses, success rate, etc.
-
Hey Mark,
Thanks for your response.
I'm aware of that link but it only allows you to submit one link at a time. In this case, there are tons of domains that I'd have to submit. So, is there any way to submit multiple sites?
-
Hi Matthew,
Google has a specific form for reporting webspam - you can find the spam report here - https://www.google.com/webmasters/tools/spamreportform?hl=en
Before you submit a competitor, make sure your own site(s) are clean - you don't want them looking too closely into your SERPs and your sector and finding a problem with you as well.
Mark
Related Questions
-
How to re-rank an established website with new content
White Hat / Black Hat SEO | | ChimplyWebGroup
There is a massive amount of information around on how to rank a new website, or techniques to increase SEO effectiveness, but how to rank a whole new set of pages, or indeed 're-build' a site that may have suffered an algorithmic penalty, is a harder nut to crack in terms of information and resources. I can't help but feel this is a somewhat untapped resource with a distinct lack of information. To start I'll provide my situation: SuperTED is an entertainment directory SEO project.
It seems likely we may have suffered an algorithmic penalty at some point around Penguin 2.0 (May 22nd), as traffic dropped steadily since then, though it wasn't too aggressive really. Then, to coincide with the newest Panda 27 (according to Moz) in late September this year, we decided it was time to re-assess tactics to keep in line with Google's guidelines over the two years. We've slowly built a natural link profile over this time, but it's likely thin content was also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters (unfortunately there was some 'paid' link-building before I arrived) to remove links.
- 'Disavowed' the rest of the unnatural links that we couldn't have removed manually.
- Worked on pagespeed as per Google guidelines until we received high scores in the majority of speed-testing tools (e.g. WebPageTest).
- Redesigned the entire site with speed, simplicity and accessibility in mind.
- Htaccessed 'fancy' URLs to remove file extensions and simplify the link structure.
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google - think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. 404'ed, asked for URL removal via WMT, thinking of 410'ing?
- Added new content and pages that seem to follow Google's guidelines as far as I can tell, e.g. a main category page and sub-category pages.
- Started to build new links to our now 'content-driven' pages naturally by asking our members to link to us via their personal profiles. We offered a reward system internally for this, so we've seen a fairly good turnout.
- Many other 'possible' ranking factors: adding Schema data, optimising for mobile devices as best we can, adding a blog and blogging original content, utilising and expanding our social media reach, custom 404 pages, removing duplicate content, utilising Moz and much more.
It's been a fairly exhaustive process, but we were happy to do it to be within Google's guidelines. Unfortunately, some of those link-wheel pages mentioned previously were the only pages driving organic traffic, so once we were rid of them traffic has dropped to not even 10% of what it was previously. Equally, with the changes (htaccess) to the link structure and the creation of brand new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed those pages that have been 'replaced' with much better content and a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example. Therefore, with the loss of the 'spammy' pages and the creation of brand new 'content-driven' pages, we've probably lost up to 75% of the old website, including the pages that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the changes to URLs and the rest discussed above, it's likely the site looks very new and heavily updated in a short period of time. What I need to work out is a campaign to drive traffic to the 'new' site.
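The URL simplification described above (dropping the intermediate category segment) can be expressed as a small mapping function for generating 301 rules. This is an illustrative sketch based only on the example URL pair given, not SuperTED's actual rewrite code:

```python
def simplify_profile_url(path):
    """Collapse /profiles.php/<category>/<slug> to /profiles.php/<slug>,
    mirroring the URL restructure described above (illustrative only)."""
    parts = [p for p in path.split("/") if p]
    if len(parts) == 3 and parts[0] == "profiles.php":
        return "/{}/{}".format(parts[0], parts[2])
    return path  # already simplified, or not a profile URL

old = "/profiles.php/bands-musicians/wedding-bands"
print(simplify_profile_url(old))  # -> /profiles.php/wedding-bands
```

Each (old, new) pair produced this way can then be emitted as a 301 rule in .htaccess, giving the old pages' authority a path to their replacements.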
We're naturally building links through our own customer base, so they will likely be seen as quality, natural link-building.
Perhaps the sudden occurrence of a large number of 404s and 'lost' pages is affecting us?
Perhaps we've yet to be properly re-indexed, but it has been almost a month since most of the changes were made, and we'd often be re-indexed 3 or 4 times a week before the changes.
Our events page is the only one without the new design left to update, could this be affecting us? It potentially may look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued with a manual penalty. Was I perhaps too hasty in following the rules? I'd appreciate a professional opinion, or input from anyone who has experience with a similar process. It does seem fairly odd that following guidelines and general white-hat SEO advice could cripple a domain, especially an established one (the domain is 10+ years old) with relatively good domain authority within the industry. Many, many thanks in advance. Ryan.
-
SERPs Help
Hey Mozzers, Please can someone advise? I manage the online content for an estate of gyms in the UK. We had an existing gym location in Birmingham - www.nuffieldhealth.com/gyms/birmingham - and 5 months ago we opened a new location in Birmingham - www.nuffieldhealth.com/gyms/birmingham-central. The 2 pages have different in-page content, different H1s, different title tags and different citations in-page, and both have a few backlinks from different root domains; however, the 2nd page (birmingham-central) does not rank in the top 50 results, even though our domain is stronger than the vast majority of the ranking results. Our original page (/gyms/birmingham) also slipped from page 1 in the SERPs to the bottom of page 2 when the second Birmingham gym page was deployed. I am guessing Google does not know which page to serve in the SERPs, but I am at a loss as to how to fix this issue. Can anyone please advise? Regards Ben
White Hat / Black Hat SEO | | Bendall
-
How does Google decide what content is "similar" or "duplicate"?
Hello all, I have a massive duplicate content issue at the moment with a load of old employer detail pages on my site. We have 18,000 pages that look like this: http://www.eteach.com/Employer.aspx?EmpNo=26626 http://www.eteach.com/Employer.aspx?EmpNo=36986 and Google is classing all of these pages as similar content, which may result in a bunch of them being de-indexed. Now although they all look rubbish, some of them are ranking on search engines, and looking at the traffic on a couple of these, it's clear that people who find these pages want more information on the school (because everyone seems to click on the local information tab on the page). So I don't want to just get rid of all these pages; I want to add content to them. But my question is... if I were to make up, say, 5 templates of generic content with different fields being replaced with the school's name, location and headteacher's name so that they vary from other pages, would that be enough for Google to realise that they are not similar pages and no longer class them as duplicates? e.g. "[School name] is a busy and dynamic school led by [headteacher's name], who achieves excellence every year from Ofsted. Located in [location], [school name] offers a wide range of experiences both in the classroom and through extra-curricular activities, and we encourage all of our pupils to 'Aim Higher'. We value all our teachers and support staff and work hard to keep [school name]'s reputation to the highest standards." Something like that... Anyone know if Google would slap me if I did that across 18,000 pages (with 4 other templates to choose from)?
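Google doesn't publish how it measures similarity, but near-duplicate detection is classically illustrated with word shingles and Jaccard overlap, and that model shows the risk with the template approach: two pages filled from the same template still share most of their shingles. A simplified sketch (the school names are invented):

```python
def shingles(text, k=3):
    """Set of k-word shingles from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

template = ("{name} is a busy and dynamic school led by {head} "
            "who achieves excellence every year.")
page1 = template.format(name="Northfield Primary", head="Ms Smith")
page2 = template.format(name="Southgate Academy", head="Mr Jones")

# High overlap despite different school names and headteachers
print(round(jaccard(shingles(page1), shingles(page2)), 2))
```

Even with every substituted field changed, most shingles are shared, which suggests that swapping fields across 18,000 pages is unlikely to make them look distinct; varying the sentence structure between the templates matters far more.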
White Hat / Black Hat SEO | | Eteach_Marketing
-
Rel author and duplicate content
I have a question: on a site where I am the only author, will there be duplicate content between the blog posts and the author page, as they are the same? What is your suggestion in that case? Thanks
White Hat / Black Hat SEO | | maestrosonrisas
-
How can I recover from an 'unnatural' link penalty?
Hi, I believe our site may have been penalised due to over-optimised anchor text links. Our site is http://rollerbannerscheap.co.uk. It seems we have been penalised for the keyword 'Roller Banner', as the over-optimised anchor text contains the keyword 'Roller Banner' or 'Roller Banners'. We dropped completely off page 1 for 'Roller Banner'; how would I recover from this?
White Hat / Black Hat SEO | | SO_UK
-
Can links from automated translations damage the source?
I have a website, dataprix.net, composed of automated translations into different languages of original content from another website, dataprix.com. Is it good for dataprix.com to be linked to by dataprix.net as the source of the translated content, or could these be considered by Google as a lot of low-quality links and result in penalties for dataprix.com?
White Hat / Black Hat SEO | | xiruca
-
Next Steps to Improve SERPs?
Hello... Just looking for some opinions on what my next steps should be with regard to improving my rankings in the SERPs. Most of my category and subcategory pages (as well as the home page) have received grades of "A" in SEOmoz's on-page grader... None of my competitors has as much, or as unique, content as I do. My question to all of you SEO geniuses and experts (I mean that as a great compliment, not sarcasm) is: what would your next steps be in terms of moving up in the search results? My url is: http://goo.gl/XUH3f Thanks in advance!
White Hat / Black Hat SEO | | Prime85
-
Pages higher than my website in Google have fewer links and a lower page authority
Hi there. I've been optimising my website pureinkcreative.com based on advice from SEOmoz, and at first this was working: in a few weeks the site had gone from nowhere to the top of page three in Google for our main search term, 'copywriting'. Today though I've just checked and the website is now near the bottom of page four, and competitors I've never heard of are above my site in the rankings. I checked them out on Open Site Explorer and many of these 'newbies' have fewer links (on average about 200 fewer) and a lower page authority. My page authority is 42/100 and the newly higher-ranking websites are between 20 and 38. One of the pages ranking higher than my website only has internal links, and every link has the anchor text 'copywriting', which I've learnt is a bad idea. I'm determined to do whiter-than-white-hat SEO, but if competitors are ranking higher than my site because of 'gimmicks' like these, is it worth it? I add around two blog posts a week of approx 600 - 1000 words of well-researched, original and useful content with a mix of keywords (copywriting, copywriter, copywriters) and some long-tail keywords, and guest blog around 2 - 3 times a month. I've been working on a link-building campaign through guest blogging and comment marketing (only adding relevant, worthwhile comments) and have added around 15 links a week this way. Could this be why the website has dropped in the rankings? Any advice would be much appreciated. Thanks very much. Andrew
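As an aside, the skewed anchor-text pattern described above (every link saying 'copywriting') is easy to quantify from a backlink export; the anchors below are invented for illustration:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Fraction of backlinks using each distinct anchor text."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.most_common()}

# Invented sample: 8 exact-match anchors out of 10 backlinks
anchors = ["copywriting"] * 8 + ["Pure Ink Creative", "pureinkcreative.com"]
dist = anchor_distribution(anchors)
print(dist["copywriting"])  # -> 0.8, a heavily over-optimised profile
```

A natural profile is usually dominated by branded and URL anchors, so a distribution this skewed toward one commercial keyword is the kind of pattern that looks manipulative.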
White Hat / Black Hat SEO | | andrewstewpot