Shoemaker with ugly shoes: Agency site performing badly, what's our best bet?
-
Hi everyone,
We're a web agency and our site www.axialdev.com is not performing well. We have very little traffic from relevant keywords. Local competitors with worse On-Page Grader scores and very few backlinks outrank us. For example, we're 17th for the keyword "agence web sherbrooke" in Google.ca in French.
Background info:
- In the past, we included 3 keyword-rich links in the footer of every site we made (hundreds of sites by now). We're working to remove those links from poor sites and to use a single nofollow link on our best sites.
- Since this clean-up is ongoing and we know we won't be able to remove everything, our link profile still looks bad (according to OSE).
- We have a lot of sites on our C-Block, some of poor quality.
- We've never received a manual penalty. Still, we've disavowed links as a precaution after running Link D-Tox.
- We receive a lot of traffic via our blog, where we used to post technical articles about Drupal, Node.js, plugins, etc. These visits don't drive business.
- Only a third of our organic visits come from Canada.
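For reference, the single nofollow credit link we're moving to on our best client sites looks roughly like this (the anchor text and markup here are just an illustration, not our actual footer):

```html
<footer>
  <!-- One branded, nofollowed credit link instead of three keyword-rich followed links -->
  <a href="http://www.axialdev.com/" rel="nofollow">Site web par Axial</a>
</footer>
```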
What are our options?
- Change domain and delete the current one?
- Disallow the blog except for a few good articles, hoping it helps Google understand what we really do.
- Keep donating to AdWords?
Any help greatly appreciated!
Thanks! -
Ahh I get it now, redirect every URL from the old site to its homepage. Makes sense!
For point 2) I meant the URL Removal tool to de-index the whole site but this would no longer be needed if I apply the above suggestion.
Thanks a bunch!
-
Yep. The site isn't done. Every time we try to finish it, another couple of referrals come in.
Regarding "non-google sanction duplicate content" that's just my way with words. You have a French version of the site and an English version of the site. Without proper hreflang usage, that is duplicate content.
-
Well spotted, Travis!
-
ABSOLUTELY do NOT 301 anything from the old site to the new site...or you risk transferring the penalty!
I'm not sure what Google will do if you disallow via robots.txt AND 301. Most likely this is safe: Google will remove the old site from the index and ignore the 301s. But I think there's some risk that Google will read the pages anyway, see the 301s, and perhaps transfer the penalty.
Deleting the domain in Webmaster Tools will have no effect, other than to prevent you from seeing what Google thinks about the old domain :-/. Google will continue to index the old domain, follow redirected links, see duplicate content, etc.
-
Hello / Bonjour.
It looks like you might have an awful lot of duplicate content (e.g. category pages, date archives) on the site. I'd try getting rid of that before deciding to switch domains.
-
Hi Travis, thanks for your response.
I swear those hreflangs were OK not long ago! We'll fix them up, thanks!
Can you give an example of "non-google sanctioned duplicate content"?
The robots.txt file seems OK even though it's a bit verbose. I'll ask to trim it down. (By the way, I was curious about PhysVisible's robots.txt, but it looks like you're disallowing everything. Thought I'd let you know!)
Thanks again!
-
Merci Michael!
Can you elaborate on "Keep the old site running, but 301 redirect all of the pages to the home page..." ? Should any URL on www.oldsite.com redirect to the homepage of www.newsite.com?
We had these options in mind. What do you think of those?
-
Disallow the old site in robots.txt and map every URL with a 301 to help our users get to the right page, while Googlebot won't follow those links (to be tested, but seems logical), and/or...
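The robots.txt side of this option would just be a blanket disallow on the old domain (sketch):

```
# Old site's robots.txt: block all crawlers from everything
User-agent: *
Disallow: /
```

One thing to verify while testing: if Googlebot is blocked from crawling, it never actually sees the 301s, so no link equity (good or bad) should flow through them.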
-
Delete the whole old domain in GWT.
Thanks for your time!
-
Full disclosure: I've been studying hreflang/rel=alternate for the glorious day when someone wants, and will pay for, a solid Spanish translation. That day has not come. But I wanted to be prepared. So here goes:
Your English pages are pointing the canonical at the French pages. No nationalized form of English is mentioned in the hreflang alternate. If your English-speaking audience is Canadian, put en-ca in the empty quotes after hreflang=. Example from /en:
<link rel="alternate" hreflang="" href="http://www.axialdev.com/en/" />
All of your canonicals point to the fr-ca version of the pages. For the en-ca pages, they should point to the en-ca pages.
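Concretely, each /en/ page's head should look something like this (a sketch; I'm assuming fr-ca lives at the root and these are the only two language versions):

```html
<!-- On http://www.axialdev.com/en/ : the canonical points to itself, not the French page -->
<link rel="canonical" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="en-ca" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.axialdev.com/" />
```

The French pages need the mirror image: the same two hreflang alternates, with the canonical pointing at the French URL. hreflang annotations only count when they're reciprocal.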
I grew up in Michigan. I have quite a few Canadian friends. The only thing that's different about spoken Canadian English is the pronunciation of 'about' and they tend toward en-gb in spelling. But you should use en-ca anyway.
Yep, you have a lot of site-wide links. That is true. That may be part of the problem. But right now, you have a lot of non-google sanctioned duplicate content.
The site also has one of the most involved robots.txt files I've seen in a month or so. It may not be a good idea to address the wildcard user agent, *, and not give it a directive. Check the end of the file.
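In other words, a robots.txt that ends with a bare user-agent line leaves that group empty, which is ambiguous; every group should carry at least one directive. A hypothetical before/after:

```
# Problematic ending: a wildcard group with no directive at all
User-agent: *

# Explicit alternative: an empty Disallow means "allow everything"
User-agent: *
Disallow:
```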
A site shouldn't nose-dive within the space of a couple of weeks without extensive search engine manipulation or serious on-page issues. Your site has been live for seven years. It's better to rule out on-page issues first, though.
-
Bonjour! (I lived in Montreal for 6 years :-).
I do a lot of penalty recovery work, and you're in the same situation as a number of my clients: algorithmic penalty (probably), and you've disavowed links, but....no Penguin update for a year.
The next Penguin data update is most likely very soon, judging from Matt's mutterings at SMX Advanced. It's been almost a year since the last one. Your disavows won't take effect until there IS a data update.
I would wait for the data update, and see if you recover on rankings for the 3 terms you had in your footer links from client sites. If you do, then great, continue on...
If not, then I'd be inclined to start a new domain and move your content from your old site (and blog) to the new site WITHOUT 301 redirecting. Keep the old site running, but 301 redirect all of its pages to its home page. You want Google to successfully fetch all of those blog pages with the great content, but find it's permanently moved to your home page, where that content no longer exists. This way your new site's content will not be seen as duplicate by Google. (If you just 404 the pages, Google will presume the content is still as it was before it 404'd... FOR MONTHS.)
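If the old site runs on Apache (an assumption on my part), that blanket redirect-everything-to-the-home-page is a few lines of mod_rewrite in .htaccess:

```apache
# 301 every request on the OLD site to its own home page,
# skipping "/" itself to avoid a redirect loop
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/$
RewriteRule ^ / [R=301,L]
```

Note this redirects within the old domain, as described above; nothing points at the new site.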
It's worth going through all of the backlinks for the old site, seeing which ones are from healthy sites, and manually asking those webmasters if they'd kindly update their links to point to your new site.