1500 Domains... Where to begin? & Web Structure Question.
-
So, as the title says, I am stuck. I was recently brought on as the SEO guru for a small-to-mid-size company with the task of rebuilding their web presence. Their website is in pretty unfortunate condition, and the more research I do, the farther I go down the rabbit hole of chaos.
Essentially, the previous CEO was doing all of the SEO work. He purchased 1,500 domains, all keyword-specific, installed WordPress on roughly 1,000 of them, and then began pumping out content. Of those 1,000, roughly 300 have about 600-2,000 characters' worth of content that is absolute fluff. From there the linking began.
Now the content is different enough that Google doesn't seem to notice that it's the SAME FREAKIN' THING on each domain, but I am very concerned. The company has its main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web.
My advice is to cut those links ASAP and remove the previous work. At the same time, I also don't want them to lose rank. So I guess I am asking a whole slew of questions...
- Am I right in thinking that we have to build a bridge before we burn a bridge?
- Is it worth fixing up some of those other domains to have original content to try and bolster what we already have?
- Would it be better to combine everything into one website, or to have different domains represent different things? For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof using different domains.
- Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, I am getting the structure of the entire link web.
Lastly, any thoughts you all have would be greatly appreciated. I realistically have minimal experience in this realm. I am a major noob. I understand SEO in theory, sorta. So I'm getting there!
-
UPDATE
I was just informed by the agency we are working with that two pages on our website have been compromised:
- Index.php
- Hello.php
Does anybody have any recommendations for malicious code identification software? Has anybody used the Browseo cloak identification tool? What is everybody's go-to tool for this sort of thing?
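Whatever dedicated scanner you settle on (Sucuri and Wordfence are common go-tos for WordPress), a quick first pass you can run yourself is a signature grep over the install, since injected PHP almost always leans on the same obfuscation tricks. A minimal sketch, assuming shell access to the WordPress root; the signature list and the path are illustrative, not exhaustive:

```python
import os
import re

# Common obfuscation patterns seen in injected PHP. Illustrative only --
# real scanners ship far larger signature sets.
SIGNATURES = [
    re.compile(rb"eval\s*\(\s*base64_decode"),
    re.compile(rb"eval\s*\(\s*gzinflate"),
    re.compile(rb"preg_replace\s*\(.+/e['\"]"),   # deprecated /e modifier
    re.compile(rb"assert\s*\(\s*\$_(GET|POST|REQUEST)"),
    re.compile(rb"\$_(GET|POST|REQUEST)\s*\[['\"]cmd['\"]\]"),
]

def scan(root):
    """Walk an install and flag PHP files matching any signature."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".php"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, "rb") as fh:
                data = fh.read()
            for sig in SIGNATURES:
                if sig.search(data):
                    print(f"SUSPECT: {path}")
                    break

scan("/var/www/html")  # placeholder for the WordPress root
```

Anything this flags still needs a human read; plenty of legitimate plugins use base64 for packaging, so treat matches as leads, not verdicts.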
-
Nice Work!
-
We have teamed up with a local agency in Los Angeles who will be assisting us with a redesign. They are also going to look through our SEO and assist, if possible, with that. I am currently working with the studio and our treatment team to come up with a website architecture that is user-friendly and effective.
As far as what has been done with the link structure: I was able to narrow the 1,000-domain list down to roughly 200 that actually had content and were actually linking to us. I changed the nameservers on all of them, which should not only speed up the private server we are hosting on, but also lower the number of domains on our C-block.
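For what it's worth, that kind of narrowing-down is scriptable: fetch each domain's homepage and keep the ones that still serve real content and link back to the primary site. A rough sketch; the main-site hostname, input filename, and content threshold are all placeholders, and the link check is deliberately crude:

```python
import urllib.request
import urllib.error

MAIN_SITE = "example-main-site.com"  # placeholder for the primary domain

def check(domain):
    """Return (has_content, links_to_us) for a network domain's homepage."""
    try:
        with urllib.request.urlopen(f"http://{domain}/", timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, OSError, ValueError):
        return False, False
    # Crude checks: enough HTML to count as content, and any mention of
    # the main site. A real pass would parse <a href> attributes.
    return len(html) > 2000, MAIN_SITE in html

with open("domains.txt") as f:  # placeholder: one network domain per line
    for domain in (line.strip() for line in f if line.strip()):
        has_content, links_to_us = check(domain)
        if has_content and links_to_us:
            print(domain)
```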
I have spoken with a few members from this forum offline who I will not name unless they give me approval to give them credit.
Thanks Everybody!
-
Thank you both! You are both SEOmnipotent. See what I did there?
I might pinch that
With the links, something else you will probably need to do is perform a disavow of the ones you don't want to be associated with. Google would eventually get round to them all, but I'll bet that with a bit of digging, you'll find the links from the network aren't the only issues.
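For reference, the disavow file Google accepts is just plain text: comment lines starting with `#`, and one `domain:example.com` entry (or full URL) per line, uploaded via Search Console. A minimal sketch for generating one from a vetted list (filenames are placeholders):

```python
# Build a disavow.txt from a vetted list of network domains.
# Input file: one bad domain per line (placeholder name).
with open("domains_to_disavow.txt") as src, open("disavow.txt", "w") as out:
    out.write("# Link network inherited from previous ownership\n")
    for line in src:
        domain = line.strip()
        if domain:
            out.write(f"domain:{domain}\n")
```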
And thank you Egol
-Andy
-
HA! Yeah, I think that sorta answered my question.
-
I think you need that consultant.
-
Thank you both! You are both SEOmnipotent. See what I did there?
Anyway, that's actually kinda nice... One of the things I have been doing is looking at Open Site Explorer.
So we have 145 linking root domains and 2,296 total links, with a DA of 35, which I know is nothing great. At our best we have a link from psychologytoday.com with a DA of 90, and at the other end a bunch of links from plenty of our random websites with DAs of around 20. I have been going through each domain trying to determine where the "hives" are and which ones are just the outliers. Am I on the right track here?
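One way to make that hive-spotting less manual is to bucket the linking root domains by DA from a CSV export and look for dense low-DA clusters. A sketch, assuming an export with `Root Domain` and `Domain Authority` columns; the column names and filename are assumptions about the export format:

```python
import csv
from collections import defaultdict

buckets = defaultdict(list)

with open("linking_domains.csv", newline="") as f:
    for row in csv.DictReader(f):
        da = int(float(row["Domain Authority"]))
        buckets[da // 10 * 10].append(row["Root Domain"])

# Print each DA decile; a dense cluster of low-DA domains is a likely "hive".
for low in sorted(buckets):
    domains = buckets[low]
    print(f"DA {low}-{low + 9}: {len(domains)} domains")
    for d in sorted(domains)[:5]:  # show a few examples per bucket
        print(f"  {d}")
```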
-
Good info from Andy.
Trying to combine the 'fluff' content of 1000 sites into a primary domain could seal your fate. I would advise heavily against doing this.
Yes. If you are going to recycle anything be sure that it is not duplicate content from anywhere else.
I would guess you need to look to start a structured takedown of the network, but you must handle this carefully.
Yes, I would be using a hatchet more than a hammer on this job. But do a careful study before starting.
-
Egol has pretty much hit the nail on the head.
You have a big old mess there and you don't want to end up doing more damage than good - it is easily done!
Now the content is different enough that Google doesn't seem to notice that it's the SAME FREAKIN' THING on each domain, but I am very concerned. The company has its main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web.
That there is a big problem. Link networks, as that is what this basically is, are a huge headache, and I can only assume that it is because of the poor quality that Google hasn't found it yet. However, they probably will at some point, so you want to back out of that as soon as possible.
Am I right in thinking that we have to build a bridge before we burn a bridge?
Yes... and no. If you get this bit wrong and Google suddenly becomes aware of what is going on in the background, you will bring them down on you quicker than you can take the sites offline. I am guessing that 99.9% of these sites have no reason to be there? If that is the case, I would be looking at a structured takedown / decommission.
Is it worth fixing up some of those other domains to have original content to try and bolster what we already have?
Probably not. If the content is how you describe it, Google has probably already made up their mind about the site, and trying to fix a site of poor quality can sometimes be an uphill battle because you are starting at a negative score. That said, it is difficult to advise exactly on this. You want to get out of this whole link network, though.
Would it be better to combine everything into one website, or to have different domains represent different things? For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof using different domains.
Trying to combine the 'fluff' content of 1,000 sites into a primary domain could seal your fate. I would advise heavily against doing this. Perhaps there are some sites with decent content on them that could be reworked into a blog post, but you would have to be careful to check whether that content was already indexed before placing it elsewhere.
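On that point, a cheap local proxy for "is this too close to what's already out there" is to measure how much any candidate text overlaps the network copy before reusing it, e.g. with word-shingle overlap. A sketch; the 0.3 threshold is an illustrative judgment call, not anything Google publishes:

```python
import re

def shingles(text, k=5):
    """Set of k-word shingles for a piece of text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (0.0 disjoint, 1.0 identical)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Placeholder filenames: the content you want to reuse vs. the network
# page it might duplicate.
with open("candidate_post.txt") as f:
    candidate = f.read()
with open("network_page.txt") as f:
    existing = f.read()

score = jaccard(shingles(candidate), shingles(existing))
print(f"shingle overlap: {score:.2f}")
if score > 0.3:  # illustrative threshold, not a Google number
    print("Too similar -- rewrite before reusing.")
```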
Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, I am getting the structure of the entire link web.
I wouldn't want to advise incorrectly on this, but I would guess you need to start a structured takedown of the network, and you must handle it carefully. Either that, or distance the main site by removing links from the network to the primary domain.
I hope this helps a little.
-Andy
-
Yeah, thank you! That was sorta where my head was going as well. Thanks again!
-
If I were in your situation, I would prepare a one- or two-page document that explains how there are hundreds of spam domains in the wild that contain links and garbage that could damage the company's reputation and rankings.
The goal of this document would be to justify funds to hire an experienced consultant who can advise you on how to fix this problem and create a plan for developing the company's web assets going forward.
There is nothing wrong with an in-house SEO getting appraisals or consultations from experts or just second opinions from colleagues. I do this all of the time - at least a few times a month before I make moves or decisions. It is not a confession that you are a noob or that you need assistance to do your job. I get consulting to run my sites better. It pays.
From what you have described, the size of this problem is enormous and probably too complex to get good well-considered advice for free from a forum. It will probably get more complex after you start digging.
I am not recommending paid advice because I am looking for work. I don't do paid consultations for anyone and this job is too big to understand in the amount of time spent on a forum question. You don't want poorly considered advice. That usually costs a lot more in repairs than good advice costs from the start.
There are plenty of people who post here who are qualified to help you. I suggest searching the Q&A for problems similar to this: see who replied, who was helpful, who got lots of thumbs up on relevant answers, and who has a style that you like. Don't rush to hire. Be careful. Do some research.
Good luck.