Shoemaker with ugly shoes : Agency site performing badly, what's our best bet?
-
Hi everyone,
We're a web agency and our site www.axialdev.com is not performing well. We get very little traffic from relevant keywords. Local competitors with worse On-page Grader scores and very few backlinks outrank us. For example, we're 17th for the keyword "agence web sherbrooke" on Google.ca in French.
Background info:
- In the past, we included 3 keyword-rich links in the footer of every site we made (hundreds of sites by now). We're working to remove those links on poor sites and to use a single nofollow link on our best sites.
- Since this is ongoing and we know we won't be able to remove everything, our link profile sucks (per OSE).
- We have a lot of sites on our C-Block, some of poor quality.
- We've never received a manual penalty. Still, we've disavowed links as a precaution after running Link D-Tox.
- We receive a lot of traffic via our blog, where we used to post technical articles about Drupal, Node.js, plugins, etc. These visits don't drive business.
- Only a third of our organic visits come from Canada.
What are our options?
- Change domain and delete the current one?
- Disallow the blog except for a few good articles, hoping it helps Google understand what we really do.
- Keep donating to AdWords?
Any help greatly appreciated!
Thanks! -
Ahh I get it now, redirect every URL from the old site to its homepage. Makes sense!
For point 2) I meant the URL Removal tool to de-index the whole site but this would no longer be needed if I apply the above suggestion.
Thanks a bunch!
-
Yep. The site isn't done. Every time we try to finish it, another couple of referrals come in.
Regarding "non-google sanction duplicate content" that's just my way with words. You have a French version of the site and an English version of the site. Without proper hreflang usage, that is duplicate content.
-
Well spotted, Travis!
-
ABSOLUTELY do NOT 301 anything from the old site to the new site...or you risk transferring the penalty!
I'm not sure what Google will do if you disallow via robots.txt AND 301. Most likely this is safe: Google will remove the old site from the index and ignore the 301s. But I think there's some risk that Google will read the pages anyway, see the 301s, and perhaps transfer the penalty.
Deleting the domain in webmaster tools will have no effect, other than to prevent you from seeing what Google thinks about the old domain :-/. Google will continue to index the old domain, follow redirected links, see duplicate content, etc.
-
Hello / Bonjour.
It looks like you might have an awful lot of duplicate content (e.g. category pages, date archives) on the site. I'd try getting rid of that before deciding to switch domains.
-
Hi Travis, thanks for your response.
I swear those hreflangs were OK not long ago! We'll fix them up, thanks!
Can you give an example of "non-google sanctioned duplicate content"?
The robots.txt file seems OK, even though it's a bit verbose. I'll ask to shrink it a bit. (By the way, I was curious about PhysVisible's robots.txt, but it looks like you're disallowing everything. Thought I'd let you know!)
Thanks again!
-
Merci Michael!
Can you elaborate on "Keep the old site running, but 301 redirect all of the pages to the home page..." ? Should any URL on www.oldsite.com redirect to the homepage of www.newsite.com?
We had these options in mind. What do you think of those?
-
Disallow the old site in robots.txt and map every URL with a 301, so users still get to the right page while Googlebot won't follow those redirects (to be tested, but seems logical), and/or...
-
Delete the whole old domain in GWT.
Thanks for your time!
-
Full disclosure: I've been studying hreflang/rel=alternate for the glorious day when someone wants, and will pay for, a solid Spanish translation. That day has not come. But I wanted to be prepared. So here goes:
Your English pages are pointing the canonical at the French pages. No nationalized form of English is mentioned in the hreflang alternate. If your English-speaking audience is Canadian, put en-ca in the empty quotes after hreflang=. Example from /en:
<link rel="alternate" hreflang="" href="http://www.axialdev.com/en/" />
All of your canonicals point to the fr-ca version of the pages. For the en-ca pages, they should point to the en-ca pages.
I grew up in Michigan. I have quite a few Canadian friends. The only thing that's different about spoken Canadian English is the pronunciation of 'about' and they tend toward en-gb in spelling. But you should use en-ca anyway.
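To make that concrete, here's a sketch of what the head of each language version could contain, with reciprocal hreflang annotations and self-referencing canonicals. (I'm assuming the French pages live at the root and the English ones under /en/; adjust to your real URLs.)

```html
<!-- On the English page, e.g. http://www.axialdev.com/en/ -->
<link rel="canonical" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="en-ca" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.axialdev.com/" />

<!-- On the matching French page, e.g. http://www.axialdev.com/ -->
<link rel="canonical" href="http://www.axialdev.com/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.axialdev.com/" />
<link rel="alternate" hreflang="en-ca" href="http://www.axialdev.com/en/" />
```

The key points: each page's canonical points at itself, and the hreflang annotations are reciprocal, i.e. both pages list both language versions.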
Yep, you have a lot of site-wide links. That is true. That may be part of the problem. But right now, you have a lot of non-google sanctioned duplicate content.
The site also has one of the most involved robots.txt pages I've seen in a month or so. It may not be a good idea to call any old user agent, *, and not give them a directive. Check the end of the file.
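For illustration, the difference looks like this (a sketch, not your actual file):

```text
# Risky: a User-agent group with no directive at all
User-agent: *

# Explicit: the same group with an empty Disallow, which allows everything
User-agent: *
Disallow:
```

An empty group leaves the crawler's behavior up to interpretation; spelling out `Disallow:` makes the intent unambiguous.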
A site should not nosedive within the space of a couple of weeks without extensive search engine manipulation or serious on-page issues. Your site has been live for seven years. It's better to suspect on-page issues first, though.
-
Bonjour! (I lived in Montreal for 6 years :-).
I do a lot of penalty recovery work, and you're in the same situation as a number of my clients: algorithmic penalty (probably), and you've disavowed links, but....no Penguin update for a year.
The next Penguin data update is most likely very soon, judging from mutterings from Matt at SMX Advanced. It's been almost a year since the last one. Your disavows won't take effect until there IS a data update.
I would wait for the data update, and see if you recover on rankings for the 3 terms you had in your footer links from client sites. If you do, then great, continue on...
If not, then I'd be inclined to start a new domain and move your content from your old site (and blog) to the new site, WITHOUT 301 redirecting. Keep the old site running, but 301 redirect all of its pages to its own home page. You want Google to successfully fetch all of those blog pages with the great content, but find that the content has permanently moved to your home page, where it no longer exists. This way your new site's content will not be seen as duplicate by Google (if you just 404 the pages, Google will presume the content is still as it was before it 404'd... FOR MONTHS).
It's worth going through all of the backlinks for the old site, seeing which ones are from healthy sites, and manually asking those webmasters if they'd kindly update their links to point to your new site.