Is this organic search sketchiness worth unwinding?
-
Started working on a site and learned that the person before me had pulled a fairly sketchy maneuver, and I'm wondering whether fixing it is a net gain.
The site has pages it wanted third-party sites to link to in order to boost them in search. The thing is, those pages aren't the kind people naturally link to.
So the woman before me started a new blog site in the same general topic area as the main site. The idea was to build up even the smallest bit of authority for the new blog without tipping Google off to shared ownership, so the new blog has a different owner, address, registrar, and host, and no shared Google Analytics or Webmaster Tools account.
Then, as one way of adding links to the new blog, she took some links that originally pointed to the main site and redirected them to the blog site.
And voila! ...Totally controllable blog site with a bit of authority linking to select pages on the main site!
At this point, I could un-redirect those links that give the blog site some of its authority. I could delete the links to the main site on the blog pages.
However, on some level it may have actually helped the pages linked to on the main site.
The whole thing is so sketchy I wonder if I should reverse it.
I could also just leave it alone and not risk hurting the pages that the blog currently links to.
What do you think? Is there a serious risk to the main site in this existing setup? The main site has hundreds of other links pointing to it, a Moz Domain Authority of 43, and thousands of pages of content; it's 8 years old and has an Open Site Explorer Spam Score of 1. So, not a trainwreck of sketchiness apart from this one issue.
To me, the weird connection for Google is that third-party sites have links that (in the on-page code) still point to the main site, but that resolve via the main site's redirects to the blog site. BTW, the blog links out to other established sites besides the main site, so it isn't serving the main site exclusively.
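If it helps to picture the mechanics, here's a rough sketch of how one could check which old main-site URLs now 301 through to the blog (the URLs are placeholders and it assumes Python's requests library; just an illustration, not something pulled from the actual site):

```python
# Rough sketch: follow each old main-site URL and print its redirect chain.
# The URLs below are placeholders, not the real sites in question.
import requests

old_main_site_urls = [
    "https://www.main-site.example/widget-buying-guide",
    "https://www.main-site.example/industry-resources",
]

for url in old_main_site_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds one response per redirect hop; resp.url is the final landing page
    chain = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
    print(url)
    print(f"  {chain or 'no redirect'} -> {resp.status_code} {resp.url}")
```

Any URL whose chain ends on the blog's domain is one of the repointed links I'm describing.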
Please let me know what you think. Thanks!
-
I agree with the two options that both you and Gaston have pointed out.
The downside to reversing those links is that the domain authority could drop a bit, which could impact their rankings in the SERPs. If that happens, the client might think you're doing something wrong and causing their rankings to fall when, in theory, you were trying to help by getting rid of sketchy links. In my opinion, I'd keep them. They'll make your work perform better. Disavowing them could yield worse results than what their former SEO delivered, and then you're stuck playing defense and taking the blame.
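(And if you ever did go the disavow route, for reference the file Google's disavow tool accepts is just plain text: one "domain:" rule or full URL per line, with "#" for comments. A quick sketch of building one, using made-up domains:)

```python
# Sketch only: write a disavow file in the plain-text format Google's tool expects
# ("domain:" rules or full URLs, one per line; "#" lines are comments).
# The domains below are made up for illustration.
spammy_domains = ["link-farm.example", "spam-directory.example"]

lines = ["# Links inherited from the previous SEO - disavowed at the domain level"]
lines += [f"domain:{d}" for d in spammy_domains]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```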
Hope this helps!
-
Well, I like Gaston's answers on these boards, and at the same time I was curious whether that was the consensus: leave it because there's no real risk.
-
Hi 94501! Did Gaston answer your question? If so, would you mind marking his response as a "Good Answer"?
Otherwise, how else can we help?
-
Thanks, Gaston!
Any other insights, folks?
Mike
-
Hi there,
There are two options here, and you've already pointed them out:
- Reverse those links
- Leave everything as it is now
On one hand, if you aren't comfortable with those links, just reverse them all.
On the other hand, you've said that the main site has a lot of links, so those 'unnatural links' won't do much harm, and that the satellite blog has only a few connections to it. I'd say there is almost no risk, so I'd leave it as it is now.
Hope it helps.
GR.