Shoemaker with ugly shoes : Agency site performing badly, what's our best bet?
-
Hi everyone,
We're a web agency and our site www.axialdev.com is not performing well. We have very little traffic from relevant keywords. Local competitors with worse On-Page Grader scores and very few backlinks outrank us. For example, we're 17th for the keyword "agence web sherbrooke" in Google.ca in French.
Background info:
- In the past, we included 3 keyword-rich links in the footer of every site we made (hundreds of sites by now). We're working to remove those links on poor sites and to use a single nofollow link on our best sites.
- Since this is ongoing and we know we won't be able to remove everything, our link profile sucks (per OSE).
- We have a lot of sites on our C-Block, some of poor quality.
- We've never received a manual penalty. Still, we've disavowed links as a precaution after running Link D-Tox.
- We receive a lot of traffic via our blog, where we used to post technical articles about Drupal, Node.js, plugins, etc. These visits don't drive business.
- Only a third of our organic visits come from Canada.
What are our options?
- Change domain and delete the current one?
- Disallow the blog except for a few good articles, hoping it helps Google understand what we really do.
- Keep donating to AdWords?
Any help greatly appreciated!
Thanks! -
Ahh I get it now, redirect every URL from the old site to its homepage. Makes sense!
For point 2) I meant the URL Removal tool to de-index the whole site but this would no longer be needed if I apply the above suggestion.
Thanks a bunch!
-
Yep. The site isn't done. Every time we try to finish it, another couple of referrals come in.
Regarding "non-google sanctioned duplicate content": that's just my way with words. You have a French version of the site and an English version of the site. Without proper hreflang usage, that is duplicate content.
-
Well spotted, Travis!
-
ABSOLUTELY do NOT 301 anything from the old site to the new site...or you risk transferring the penalty!
I'm not sure what Google will do if you disallow via robots.txt AND 301. Most likely, this is safe, Google will remove the old site from the index and ignore the 301s. But I think there's some risk here that Google will read the pages anyway, see the 301s, and perhaps transfer the penalty.
Deleting the domain in webmaster tools will have no effect, other than to prevent you from seeing what Google thinks about the old domain :-/. Google will continue to index the old domain, follow redirected links, see duplicate content, etc.
-
Hello / Bonjour.
It looks like you might have an awful lot of duplicate content (e.g. category pages, date archives) on the site. I'd try getting rid of that before deciding to switch domains.
-
Hi Travis, thanks for your response.
I swear those hreflangs were OK not long ago! We'll fix them up, thanks!
Can you give an example of "non-google sanctioned duplicate content"?
The robots.txt file seems OK even though it's a bit verbose. I'll ask to trim it down. (By the way, I was curious about PhysVisible's robots.txt, but it looks like you're disallowing everything. Thought I'd let you know!)
Thanks again!
-
Merci Michael!
Can you elaborate on "Keep the old site running, but 301 redirect all of the pages to the home page..." ? Should any URL on www.oldsite.com redirect to the homepage of www.newsite.com?
We had these options in mind. What do you think of them?
1) Disallow the old site via robots.txt and map every URL with a 301 to help our users get to the right page, while Googlebot won't follow those redirects (to be tested, but seems logical), and/or...
2) Delete the whole old domain in GWT.
Thanks for your time!
-
-
Full disclosure: I've been studying hreflang/rel=alternate for the glorious day when someone wants, and will pay for, a solid Spanish translation. That day has not come. But I wanted to be prepared. So here goes:
Your English pages are pointing the canonical at the French pages. No nationalized form of English is mentioned in the hreflang alternate. If your English-speaking audience is Canadian, put en-ca in the empty quotes after hreflang=. Example from /en:
<link rel="alternate" hreflang="" href="http://www.axialdev.com/en/" />
All of your canonicals point to the fr-ca version of the pages. For the en-ca pages, they should point to the en-ca pages.
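Putting those two fixes together, the head of /en would look something like this (a sketch; I'm assuming the French homepage lives at the site root):

```html
<!-- On http://www.axialdev.com/en/ : the canonical self-references,
     and each language version declares the other as an alternate -->
<link rel="canonical" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="en-ca" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.axialdev.com/" />
```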
I grew up in Michigan. I have quite a few Canadian friends. The only thing that's different about spoken Canadian English is the pronunciation of 'about' and they tend toward en-gb in spelling. But you should use en-ca anyway.
Yep, you have a lot of site-wide links. That is true. That may be part of the problem. But right now, you have a lot of non-google sanctioned duplicate content.
The site also has one of the most involved robots.txt files I've seen in a month or so. It may not be a good idea to address any old user agent (*) and not give it a directive. Check the end of the file.
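For illustration, this is the pattern I mean (hypothetical contents, not your actual file):

```
# A record that ends like this, with no Disallow or Allow
# line after the user-agent, is malformed:
#
#   User-agent: *
#
# If the intent is "allow everything", say so explicitly.
# An empty Disallow value permits all URLs:
User-agent: *
Disallow:
```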
A site should not nose dive within the space of a couple weeks without extensive search engine manipulation, or serious on-page issues. Your site has been live for seven years. It's better to doubt on-page first though.
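If it helps, a tag audit like the one above can be scripted. Here's a rough sketch using only Python's standard library (the `audit` helper and its checks are hypothetical, just to illustrate the two problems I described):

```python
from html.parser import HTMLParser


class HreflangAudit(HTMLParser):
    """Collects canonical and hreflang <link> tags from a page's HTML."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.alternates = []  # list of (hreflang value, href) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates.append((a.get("hreflang", ""), a.get("href")))


def audit(page_url, html):
    """Return a list of human-readable problems with the page's tags."""
    parser = HreflangAudit()
    parser.feed(html)
    problems = []
    for lang, href in parser.alternates:
        if not lang:
            problems.append(f"empty hreflang for {href}")
    if parser.canonical and parser.canonical != page_url:
        problems.append(f"canonical points away from page: {parser.canonical}")
    return problems
```

Feed it the page URL plus the fetched HTML and it will flag both the empty hreflang attributes and a canonical that points at a different (e.g. French) URL.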
-
Bonjour! (I lived in Montreal for 6 years :-).
I do a lot of penalty recovery work, and you're in the same situation as a number of my clients: algorithmic penalty (probably), and you've disavowed links, but....no Penguin update for a year.
The next Penguin data update is most likely very soon, judging from mutterings from Matt at SMX Advanced. It's been almost a year since the last one. Your disavows won't take effect until there IS a data update.
I would wait for the data update, and see if you recover on rankings for the 3 terms you had in your footer links from client sites. If you do, then great, continue on...
If not, then I'd be inclined to start a new domain, and move your content from your old site (and blog) to the new site, WITHOUT 301 redirecting. Keep the old site running, but 301 redirect all of the pages to the home page....you want Google to successfully fetch all of those blog pages with the great content, but find it's permanently moved to your home page, where that content no longer exists. This way your new site's content will not be seen as duplicate by Google (if you just 404 the pages, Google will presume the content is still as it was before it 404'd....FOR MONTHS).
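Mechanically, that looks something like this on the old site (a sketch assuming Apache with mod_rewrite; adapt for whatever your server actually runs):

```apache
# .htaccess at the old site's document root:
# permanently redirect every URL except the homepage to the homepage
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/$
RewriteRule ^ / [R=301,L]
```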
It's worth going through all of the backlinks for the old site, seeing which ones are from healthy sites, and manually asking those webmasters if they'd kindly update their links to point to your new site.