Duplicate content, hijacked Search Console, crawl errors, ACCCK.
-
My company employed a national marketing company to create their site, which was obviously outsourced to the lowest bidder. It looks beautiful, but the installation includes a staging site that duplicates all of the content. I am not seeing these issues in Search Console, and I have had no luck getting the staging site removed from the files. How much should I be banging the drum on this? We have hundreds of high-level crawl errors and over a thousand mid-level ones.
Of course, I was not around to manage the build. I also do not have FTP access.
I'm also dealing with major Search Console issues.
The account is proprietarily owned by a local SEO company, and I cannot remove the owner, who is there by delegation.
The site prefers the www version, and the non-www version does not show the same traffic.
We also have something like 90,000 backlinks from 13 sites.
And a shit ton of ghost spam.
Help!
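For context, the www/non-www split described above is normally settled with a sitewide 301 redirect to whichever version is preferred, so all traffic and link signals consolidate onto one hostname. A minimal sketch, assuming the site runs on Apache with mod_rewrite enabled, and with `example.com` as a placeholder for the real domain:

```apache
# Force the www version with a single sitewide 301 redirect.
# example.com is a placeholder -- substitute the actual domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Without FTP access, a change like this has to go through whoever controls the hosting, but it is a one-time, low-risk fix.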
-
Yes, thank you so much, I will. What I'm concerned about is how bad this was in the first place. The way this company markets themselves is completely out of line with the state of the build and the advice they give my employers. My bosses LOVE these guys because they are supposedly #technology #experts that do national speaking engagements about #success
What I see from them is mostly paid product endorsements, outsourced workforce, and #broisms on social media.
They're fast-talking salespeople delivering a product to people who don't understand what they are getting (or not getting) under the hood.
-
I'm glad you were able to sort out part of your issue, and it sounds like there's hope for it all to get fixed! The only thing I would add is to make sure you get a promise in writing that should you part ways with the company that's hosting the site, they will transfer the site to a host of your choosing and hand over the keys.
-
Ok, so it's TWO companies.
One is the marketing company that provides the website; the other is a local SEO company that set up Search Console for just the www version of our site.
We own the domain, but the marketing company has the only access to the website files/hosting. I'm guessing we are on a shared server with their other clients, so we will not get access. They have front-end people on the team, but no understanding of SEO whatsoever. When I came on, there was no robots.txt file and no sitemap submitted to Search Console, for example.
As far as the Search Console issue, I actually have a friend at that company there that told me their setup is proprietary. We no longer have a relationship with that company, and the owner of our company was also an owner on Search Console. I managed to remove all the reps from this company by unverifying, so that is no longer a problem! Maybe you helped me in spirit. I tried to do this a few times before and didn't find the way, but right after your response I did.
So now at least, the only issues are the duplicate staging site and the ghost spam. I'd really prefer that the company take the staging files down. My employers paid a ton of money for this site and are paying a large monthly retainer. At the very least we should have a clean build. It's over 4,000 duplicate pages, so I think that is going to have to be on them.
As far as the ghost spam goes, I'll read the articles and get 'er done.
Thank you so much for your thoughtful response.
-
I have so many questions about this arrangement.
First of all, the third party ownership of the Search Console (and GA too, maybe?) is a massive red flag. Account ownership should always ALWAYS be handled in house. You need to insist on that, and insist loudly and furiously. It's extremely shady for a third-party SEO to own the accounts since it lets them hold the site and its data hostage if the relationship sours. How easy would it be for people who aren't even part of your company to use Search Console to start removing important URLs from the index? What happens to your data if you end the contract? Do they also own your analytics? Could they cut off your access to your own data on a whim? Replace your site with a page telling the world what awful clients you are? Depending on the size and type of company you are, letting an outsider own that access could be a very real threat to your business with the potential to do significant damage.
Also, what exactly is the local SEO company's role here? Why aren't THEY worrying about referral spam and questionable backlinks? If they're not, then what are they being paid to do?
If you don't have FTP access, who does? Does your company actually own the site? Is there a contract that spells it out?
For the staging site, the first line of defense is to block it from crawling via robots.txt. We have had multiple staging sites that, if indexed, would put some crazy dupe content into the world, but that's what robots.txt is for. One caveat: robots.txt blocks crawling, not indexing, so pages that are already in the index can linger in results; password-protecting the staging host is the more airtight fix. Set and forget. Well, check on it periodically, since you don't seem to have any actual control over what these guys are doing, and the account ownership thing makes me very wary of trusting them to get it right and keep it that way.
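The block itself is two lines in a robots.txt served from the staging host's root (e.g. a hypothetical staging.example.com/robots.txt — domain made up for illustration):

```text
# robots.txt served by the STAGING host only --
# make sure this file never reaches production.
User-agent: *
Disallow: /
```

Since thousands of staging pages are already indexed, pairing this with HTTP authentication on the staging host (or a URL removal request in Search Console) is what actually gets them out of results.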
As for the ghost spam, there's been a ton of discussion about it in the community over the last year. On Moz alone, there's this piece from March, and this one from August, plus a bunch of forum discussions. Bottom line is that there isn't much you can do to stop it, but that doesn't mean you're stuck with seeing it muck up your data.
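The core of every ghost-spam fix is the same idea: ghost hits never actually touch your server, so they report a hostname that isn't yours, and a "valid hostname" Include filter in Google Analytics screens them out. A rough sketch of that logic in Python, with `example.com` standing in for your real hostnames (the domain and sample hits are hypothetical):

```python
import re

# Regex of hostnames you actually serve traffic from.
# example.com is a placeholder -- use your real domain(s).
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$")

def is_ghost_spam(hostname: str) -> bool:
    """Ghost hits are injected via the Measurement Protocol and never
    load your pages, so their reported hostname won't match yours."""
    return VALID_HOSTNAME.match(hostname) is None

# Hypothetical hostnames as reported in Analytics:
hits = ["www.example.com", "free-seo-traffic.xyz", "example.com"]
real = [h for h in hits if not is_ghost_spam(h)]
# real keeps only the two example.com hits
```

In GA itself this is a view-level Include filter on the Hostname field using the same regex; the script above just illustrates why that works where referral-exclusion lists don't.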