Duplicate content, hijacked Search Console, crawl errors, ACK!
-
My company hired a national marketing company to build their site, which was obviously outsourced to the lowest bidder. It looks beautiful, but the installation includes a staging site that duplicates all of the content. I'm not seeing these issues surface in Search Console, and I've had no luck getting the staging site removed from the files. How hard should I be banging the drum on this? We have hundreds of high-level crawl errors and over a thousand mid-level ones.
Of course, I was not around to manage the build, and I also do not have FTP access.
I'm also dealing with major Search Console issues.
The account is proprietarily owned by a local SEO company, and I cannot remove the owner, who is there by delegation.
The site prefers the www version, and the non-www version does not report the same traffic.
We also have something like 90,000 backlinks from 13 sites.
And a shit ton of ghost spam.
Help!
-
Yes, thank you so much, I will. What I'm concerned about is how bad this was in the first place. The way this company markets themselves is completely out of line with the state of the build and the advice they give my employers. My bosses LOVE these guys because they are supposedly #technology #experts who do national speaking engagements about #success.
What I see from them is mostly paid product endorsements, outsourced workforce, and #broisms on social media.
They're fast talking sales people that are delivering a product to people who don't understand what they are getting (or not getting) under the hood.
-
I'm glad you were able to sort out part of your issue, and it sounds like there's hope for it all to get fixed! The only thing I would add is to make sure you get a promise in writing that should you part ways with the company that's hosting the site, they will transfer the site to a host of your choosing and hand over the keys.
-
Ok, so it's TWO companies.
One is the marketing company that provides the website; the other is a local SEO company that set up our Search Console property for the www version only.
We own the domain, but the marketing company has the only access to the website files/hosting. I'm guessing we are on a shared server with their other clients, so we will not get access. They have front-end people on the team, but no understanding of SEO whatsoever. When I came on, there was no sitemap or robots.txt file submitted to Search Console, for example.
As far as the Search Console issue, I actually have a friend at that company there that told me their setup is proprietary. We no longer have a relationship with that company, and the owner of our company was also an owner on Search Console. I managed to remove all the reps from this company by unverifying, so that is no longer a problem! Maybe you helped me in spirit. I tried to do this a few times before and didn't find the way, but right after your response I did.
So now, at least, the only issues are the duplicate staging site and the ghost spam. I'd really prefer that the company take the staging files down. My employers paid a ton of money for this site and are paying a large monthly retainer; at the very least we should have a clean build. It's over 4,000 duplicate pages, so I think that is going to have to be on them.
As far as ghost spam, I'll read the articles and get er done.
Thank you so much for your thoughtful response.
-
I have so many questions about this arrangement.
First of all, the third party ownership of the Search Console (and GA too, maybe?) is a massive red flag. Account ownership should always ALWAYS be handled in house. You need to insist on that, and insist loudly and furiously. It's extremely shady for a third-party SEO to own the accounts since it lets them hold the site and its data hostage if the relationship sours. How easy would it be for people who aren't even part of your company to use Search Console to start removing important URLs from the index? What happens to your data if you end the contract? Do they also own your analytics? Could they cut off your access to your own data on a whim? Replace your site with a page telling the world what awful clients you are? Depending on the size and type of company you are, letting an outsider own that access could be a very real threat to your business with the potential to do significant damage.
Also, what exactly is the local SEO company's role here? Why aren't THEY worrying about referral spam and questionable backlinks? If they're not, then what are they being paid to do?
If you don't have FTP access, who does? Does your company actually own the site? Is there a contract that spells it out?
For the staging site, the first step is to make sure it's excluded via robots.txt (strictly speaking, robots.txt blocks crawling rather than indexing, so staging pages that are already in the index may also need a noindex tag or a removal request). We have had multiple staging sites that, if crawled, would put some crazy dupe content into the world, but that's what robots.txt is for. Set and forget. Well, check on it periodically: you don't seem to have any actual control over what these guys are doing, and the account-ownership situation makes me very wary of trusting them to get it right and keep it that way.
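As a minimal sketch of what "excluded via robots.txt" means in practice, assuming the staging copy lives on its own hypothetical subdomain (staging.example.com is a placeholder, not the actual site): the staging robots.txt should disallow everything, and you can sanity-check the rule with Python's standard-library robots.txt parser.

```python
from urllib import robotparser

# Hypothetical staging robots.txt: a blanket Disallow means no compliant
# crawler should fetch any URL on the staging host.
STAGING_ROBOTS = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(STAGING_ROBOTS.splitlines())

# Googlebot (and any other robots.txt-honoring crawler) is blocked from
# every staging URL, so the duplicate pages never enter the crawl.
print(rp.can_fetch("Googlebot", "https://staging.example.com/any-page"))
```

A check like this could run after each deploy to catch the staging rules being clobbered, which matters here given how little control you have over the build.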
As for the ghost spam, there's been a ton of discussion about it in the community over the last year. On Moz alone, there's this piece from March, and this one from August, plus a bunch of forum discussions. Bottom line is that there isn't much you can do to stop it, but that doesn't mean you're stuck with seeing it muck up your data.
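The usual fix those articles describe is a "valid hostname" filter: ghost spam is injected straight into Analytics without ever touching your server, so its hits carry a hostname that isn't yours. The same idea, sketched in Python with assumed placeholder hostnames (example.com stands in for your real domains):

```python
import re

# Keep only hits whose hostname matches domains you actually own.
# example.com is an assumed placeholder; the real filter would list
# every hostname where your tracking code legitimately runs.
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$")

hits = [
    {"hostname": "www.example.com", "source": "google"},
    {"hostname": "example.com", "source": "bing"},
    {"hostname": "free-seo-offers.xyz", "source": "spam-referrer.com"},  # ghost hit
]

# Ghost hits fail the hostname check because the spammer never loaded your site.
clean = [h for h in hits if VALID_HOSTNAME.match(h["hostname"])]
print(len(clean))  # 2
```

In Google Analytics itself this is an include filter on the Hostname field using the same kind of regex; the filter only affects data going forward, so use a segment with the same pattern to clean up historical reports.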
Related Questions
-
Question about partial duplicate content on location landing pages of multilocation business
Hi everyone, I am a psychologist in private practice in Colorado, and I recently went from one location to two. I'm currently updating my website to better accommodate the second location. I also plan continued expansion, so there will be more locations as time goes on. As a result, I am making my website's current homepage non-location-specific and creating location landing pages, as I have seen written about in many places. My question: I know that location landing pages should have unique content, and I have plenty of it, but how much content is it okay to duplicate across the location landing pages and the homepage? For instance, here is the current draft of the new homepage (not live yet): http://www.effectivetherapysolutions.com/dev/ And here are the drafts of the location landing pages: http://www.effectivetherapysolutions.com/dev/denver-office and http://www.effectivetherapysolutions.com/dev/colorado-springs-office For reference, here is the current homepage that is live for my single Denver location: http://www.effectivetherapysolutions.com/ As you can see, the location landing pages have the following sections of unique content: a therapist picture at the top; testimonial quotes (the one on the homepage is the only one I have, so I iframed it to block it from crawl and let it appear as unique content on the Denver page); therapist bios; the GMB listing; and driving directions and hours. I also haven't added these yet, but we will also have unique client success stories and appropriately tagged images of the offices. So that's plenty of unique content on the pages, but I also have the following sections that are identical or nearly identical to the homepage: the intro paragraph; the blue and green "adult" and "child/teen" boxes under the intro paragraph; the "our treatment really works" section; and the "types of anxiety we treat" section. Is that okay, or is that too much duplicate content?
The reason I have it that way is that my website has been very successful for years at converting visitors into paying clients, and I don't want to lose the parts of the page that I know work when people land on it. Now that I am optimizing the location landing pages to be where people end up instead of the homepage, I want them to still see all of that content that I know is effective at conversion. If people on here do think it is too much, one possible solution is to turn parts of it into images or put them into iframes on the location pages so Google doesn't crawl those parts, while leaving them normal on the homepage so they still get crawled there. I've seen a lot written about not having duplicate content on location landing pages for this type of website, but everything I've read seems to refer to entire pages being copied with just the location names changed, which is not what I'm doing, hence my question. Thanks, everyone!
Local Website Optimization | gremmy90
Hreflang | Should I implement hreflang for regionally targeted websites with different content?
Hello, I'm implementing hreflang for my e-commerce websites, which have different languages and serve different content based on location. Currently, I'm only using hreflang for alternate languages (fr-fr, fr-be, fr-ma, ...). I wonder whether it might be better, or whether I am allowed, to add the other versions of my websites (IT, ES, DE, ...) even though those versions serve content specific to those locations. So, for example, the content (products) for Germany is different from the products for the other countries. Here is an example: www.mywebsite.com/apple-phone (selling Apple phones for the US, with products available only in the US). www.mywebsite.de/apple-phone (Germany, products available only in Germany; the available models might differ from the US and other sites). www.mywebsite.it/apple-phone (Italy, same situation). www.mywebsite.es/apple-phone (Spain, same situation). www.mywebsite.pt/apple-phone (Portugal, same situation).
Local Website Optimization | manoman880
What more can be done to get Google to change the landing pages it uses for certain search terms?
For one of my SEO campaigns, Google is using the website's home page as the landing page for the majority of search terms being tracked. The website splits its products by region and so we want specific region pages to rank for search terms related to that region, rather than the home page. We have optimised each regional page to a reasonably high standard and we have ensured that there is a good amount of internal linking and sign-posting to those region pages, however, Google is still using the home page. The only complication is that for the first few months there were canonical tags on these pages to the home page. These were removed around 3 months ago and we've checked that the region pages are indexed properly. Is there anything we are missing? Has anyone had any success in getting Google to change its landing pages?
Local Website Optimization | ClickHub-Harry0
Franchise Content Spinning
Hey Guys, Thanks for taking the time out to read my question, I appreciate it. I know Google doesn't treat all duplicate content the same, but what about this scenario. We have a garage door company franchise that services Seattle, San Diego, & Salt Lake City. It is the same brand, but each area has a different website, catering to their own county. Say I write & post a blog about "how to maintain your garage door" to the Seattle site. This is certainly useful for the other locations as well. So would I get penalized for posting the same article to San Diego & Salt Lake City without massively changing the content to avoid duplication? Or should I dedicate the extra time to revamp the content and avoid duplication? Does Google care about this type of duplication? Thanks in advance!!
Local Website Optimization | dwayne.jones260
404 error from linking page that does not exist
We migrated our site from PHP to WordPress about a month ago. All of the old website files have been removed. I ran Moz analytics and got 17 critical 404 errors from linking pages that do not exist. For example: "404: Received 404 (Not Found) error response for page http://www.preventivesupport.com/freeestimates.php", with http://preventivesupport.com/freeestimates.php listed as the linking page. The www thing is interesting, but freeestimates.php does not exist?
Local Website Optimization | KrisIrr0
Structured Data Question: Is there any value in "Custom Search Result Filters" structured data?
I have been doing a structured data test for a client who is looking to improve their local SEO. After running several tests in Google's Structured Data Testing Tool, I have been noticing data sets for "Custom Search Result Filters" and "Unspecified Type" structured data properties. I plan to apply Organization and LocalBusiness schema markup. However, my question is this: do "Custom Search Result Filters" and "Unspecified Type" offer any value at all? I would like to have a response for our client if they ever ask about this. I attached a snapshot of what this looks like.
Local Website Optimization | RosemaryB0
Ecommerce Site with Unique Location Pages - Issue with unique content and thin content?
Hello all, I have an ecommerce site specializing in hire, and we have individual location pages in each of our categories for each of our depots. All of these pages show the NAP of the specific branch. Given the size of our website (roughly 10K pages), it's physically impossible for us to write unique content for every location in every category. So what we are doing is, for example, writing unique content for our top 10 locations in a category, while the remaining 20-odd locations in the same category share the same content, only swapping in the location name and the NAP of that branch; in effect, I think this is thin content. My question is: I am quite sure we are getting some form of algorithmic penalty for the thin/duplicate content. Using the example above, should we 301 redirect the 20-odd thin-content locations, or should we 301 redirect only 10 of them, so we end up with a roughly 50/50 split between unique and thin content in the category? Alternatively, we could 301 all the thin-content pages, leaving only 10 locations in the category and therefore 100% unique content. I am trying to work out which would help most with local rankings for my location pages. Also, does anyone know whether a thin/duplicate-content penalty is site-wide, or can it affect just specific parts of a website? Any advice greatly appreciated. Thanks, Pete
Local Website Optimization | PeteC120
Want to move contents to domain2 and use domain1 for other content
Hello, We would like to merge two existing, fairly well-positioned web forums. Content (threads and posts) from www.forocreativo.net would be moved to www.comunidadhosting.com. We are testing some scripts which will handle a 301 redirect for every single thread from forocreativo.net to comunidadhosting.com. But here is the thing: once all current content is moved out of www.forocreativo.net, we would like to use this domain to target a specific geographic region and other niches/topics. Would you say we can do this without Google penalizing either of those two domains? Any input is more than welcome. Thank you! 🙂
Local Website Optimization | interalta0