Duplicate content, hijacked Search Console, crawl errors... ACCCK!
-
My company hired a national marketing company to build their site, which was obviously outsourced to the lowest bidder. It looks beautiful, but the installation includes a staging site that duplicates all of the content. I am not seeing these issues in Search Console, and I've had no luck getting the staging site removed from the server. How hard should I be banging the drum on this? We have hundreds of high-priority crawl errors and over a thousand medium-priority ones.
Of course, I was not around to manage the build, and I also do not have FTP access.
I'm also dealing with major Search Console issues.
The account is proprietarily owned by a local SEO company, and I cannot remove the owner, who is there by delegation.
The site prefers the www version and does not report the same traffic for the non-www version.
We also have something like 90,000 backlinks from 13 sites.
And a shit ton of ghost spam.
Help!
-
Yes, thank you so much, I will. What I'm concerned about is how bad this was in the first place. The way this company markets themselves is completely out of line with the state of the build and the advice they give my employers. My bosses LOVE these guys because they are supposedly #technology #experts that do national speaking engagements about #success
What I see from them is mostly paid product endorsements, outsourced workforce, and #broisms on social media.
They're fast-talking salespeople delivering a product to people who don't understand what they are getting (or not getting) under the hood.
-
I'm glad you were able to sort out part of your issue, and it sounds like there's hope for it all to get fixed! The only thing I would add is to make sure you get a promise in writing that, should you part ways with the company that's hosting the site, they will transfer the site to a host of your choosing and hand over the keys.
-
Ok, so it's TWO companies.
One is the marketing company that provides the website; the other is a local SEO company that set up only the www property in our Search Console.
We own the domain, but the marketing company has the only access to the website files/hosting. I'm guessing we are on a shared server with their other clients, so we will not get access. They have front-end people on the team, but no understanding of SEO whatsoever. When I came on, there was no sitemap or robots.txt file submitted to Search Console, for example.
As far as the Search Console issue goes, I actually have a friend at that company who told me their setup is proprietary. We no longer have a relationship with that company, but the owner of our company was also an owner on Search Console. I managed to remove all of their reps by unverifying them, so that is no longer a problem! Maybe you helped me in spirit. I tried to do this a few times before and couldn't find the way, but right after your response I did.
So now, at least, the only issues are the duplicate staging site and the ghost spam. I'd really prefer that the company take the staging files down. My employers paid a ton of money for this site and are paying a large monthly retainer. At the very least we should have a clean build. It's over 4,000 duplicate pages, so I think that is going to have to be on them.
As for the ghost spam, I'll read the articles and get 'er done.
Thank you so much for your thoughtful response.
-
I have so many questions about this arrangement.
First of all, the third party ownership of the Search Console (and GA too, maybe?) is a massive red flag. Account ownership should always ALWAYS be handled in house. You need to insist on that, and insist loudly and furiously. It's extremely shady for a third-party SEO to own the accounts since it lets them hold the site and its data hostage if the relationship sours. How easy would it be for people who aren't even part of your company to use Search Console to start removing important URLs from the index? What happens to your data if you end the contract? Do they also own your analytics? Could they cut off your access to your own data on a whim? Replace your site with a page telling the world what awful clients you are? Depending on the size and type of company you are, letting an outsider own that access could be a very real threat to your business with the potential to do significant damage.
Also, what exactly is the local SEO company's role here? Why aren't THEY worrying about referral spam and questionable backlinks? If they're not, then what are they being paid to do?
If you don't have FTP access, who does? Does your company actually own the site? Is there a contract that spells it out?
For the staging site, all you should need is to make sure it's excluded from crawling via robots.txt. We have had multiple staging sites that, if indexed, would put some crazy dupe content into the world, but that's what the robots.txt is for. Set and forget. Well, check on it periodically, since you don't seem to have any actual control over what these guys are doing, and the account-ownership thing makes me very wary of trusting them to get it right and keep it that way.
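For what it's worth, the robots.txt you'd ask them to drop at the root of the staging host is about as simple as config files get (the staging hostname here is a placeholder, obviously):

```text
# https://staging.yourdomain.com/robots.txt
# Block all crawlers from the entire staging site
User-agent: *
Disallow: /
```

One caveat: robots.txt stops crawling, but it doesn't evict pages that are already in the index. With 4,000+ staging URLs already duplicated, pairing it with a noindex directive or a Search Console removal request is what actually clears out what's already there.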
As for the ghost spam, there's been a ton of discussion about it in the community over the last year. On Moz alone, there's this piece from March, and this one from August, plus a bunch of forum discussions. Bottom line is that there isn't much you can do to stop it, but that doesn't mean you're stuck with seeing it muck up your data.
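For reference, the standard defense those articles describe is a valid-hostname filter in Google Analytics: ghost spam never actually hits your site, so its fake pageviews carry a bogus or unset hostname. A filter along these lines keeps only traffic to hostnames you actually serve (the pattern is an assumption — swap in your real domains):

```text
Filter Type:    Custom > Include
Filter Field:   Hostname
Filter Pattern: ^(www\.)?yourdomain\.com$
```

Apply it to a test view first and leave an unfiltered view untouched, since GA filters permanently alter the data they process.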