Shoemaker with ugly shoes: Agency site performing badly, what's our best bet?
-
Hi everyone,
We're a web agency and our site www.axialdev.com is not performing well. We have very little traffic from relevant keywords.Local competitors with worse On-page Grader scores and very few backlinks outrank us. For example, we're 17th for the keyword "agence web sherbrooke" in Google.ca in French.
Background info:
- In the past, we included 3 keyword-rich links in the footer of every site we made (hundreds of sites by now). We're working to remove those links from poor sites and to use a single nofollow link on our best sites.
- Since this is ongoing and we know we won't be able to remove everything, our link profile sucks (per OSE).
- We have a lot of sites on our C-Block, some of poor quality.
- We've never received a manual penalty. Still, we've disavowed links as a precaution after running Link D-Tox.
- We receive a lot of traffic via our blog, where we used to post technical articles about Drupal, Node.js, plugins, etc. These visits don't drive business.
- Only a third of our organic visits come from Canada.
What are our options?
- Change domain and delete the current one?
- Disallow the blog except for a few good articles, hoping it helps Google understand what we really do.
- Keep donating to AdWords?
Any help greatly appreciated!
Thanks! -
Ahh I get it now, redirect every URL from the old site to its homepage. Makes sense!
For point 2) I meant the URL Removal tool to de-index the whole site but this would no longer be needed if I apply the above suggestion.
Thanks a bunch!
-
Yep. The site isn't done. Every time we try to finish it, another couple of referrals come in.
Regarding "non-google sanction duplicate content" that's just my way with words. You have a French version of the site and an English version of the site. Without proper hreflang usage, that is duplicate content.
-
Well spotted, Travis!
-
ABSOLUTELY do NOT 301 anything from the old site to the new site...or you risk transferring the penalty!
I'm not sure what Google will do if you disallow via robots.txt AND 301. Most likely, this is safe, Google will remove the old site from the index and ignore the 301s. But I think there's some risk here that Google will read the pages anyway, see the 301s, and perhaps transfer the penalty.
Deleting the domain in webmaster tools will have no effect, other than to prevent you from seeing what Google thinks about the old domain :-/. Google will continue to index the old domain, follow redirected links, see duplicate content, etc.
-
Hello / Bonjour.
It looks like you might have an awful lot of duplicate content (e.g. category pages, date archives) on the site. I'd try getting rid of that before deciding to switch domains.
-
Hi Travis, thanks for your response.
I swear those hreflangs were OK not long ago! We'll fix them up, thanks!
Can you give an example of "non-google sanctioned duplicate content"?
The robots.txt file seems OK even though it's a bit heavy in verbatim. I'll ask to shrink it a bit. (By the way, I was curious about PhysVisible's robots.txt but looks like you're disallowing everything. Thought I'd let you know!)
Thanks again!
-
Merci Michael!
Can you elaborate on "Keep the old site running, but 301 redirect all of the pages to the home page..." ? Should any URL on www.oldsite.com redirect to the homepage of www.newsite.com?
We had these options in mind. What do you think of them?
-
Disallow the old site in robots.txt and map every URL with a 301 to help our users get to the right page, while Googlebot won't follow those links (to be tested, but it seems logical), and/or...
-
Delete the whole old domain in GWT.
Thanks for your time!
-
-
Full disclosure: I've been studying hreflang/rel=alternate for the glorious day when someone wants, and will pay for, a solid Spanish translation. That day has not come. But I wanted to be prepared. So here goes:
Your English pages are pointing the canonical at the French pages. No nationalized form of English is mentioned in the hreflang alternate. If your English speaking audience is Canadian, put en-ca in the empty quotes after hreflang=. Example from /en:
<link rel="alternate" hreflang="" href="http://www.axialdev.com/en/" />
All of your canonicals point to the fr-ca version of the pages. For the en-ca pages, they should point to the en-ca pages.
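To illustrate, here's a hedged sketch of what the head of the /en/ page could contain once fixed (the domain is from the thread, but the exact URL structure and the fr-ca homepage path are assumptions):

```html
<!-- Hypothetical <head> markup for http://www.axialdev.com/en/ -->
<!-- The English page's canonical points at itself, not at the French version -->
<link rel="canonical" href="http://www.axialdev.com/en/" />
<!-- Each language version declares itself and its alternate, with region codes filled in -->
<link rel="alternate" hreflang="en-ca" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.axialdev.com/" />
```

The French page would carry the mirror-image markup: its own canonical plus the same pair of hreflang alternates.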
I grew up in Michigan. I have quite a few Canadian friends. The only thing that's different about spoken Canadian English is the pronunciation of 'about' and they tend toward en-gb in spelling. But you should use en-ca anyway.
Yep, you have a lot of site-wide links. That is true. That may be part of the problem. But right now, you have a lot of non-google sanctioned duplicate content.
The site also has one of the most involved robots.txt pages I've seen in a month or so. It may not be a good idea to call any old user agent, *, and not give them a directive. Check the end of the file.
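To sketch the issue: a `User-agent: *` record with no directive after it is malformed, and parsers may handle it unpredictably. If the intent is "allow everyone everything", the record still needs a directive, even an empty one:

```
# Applies to all crawlers; an empty Disallow means "allow everything"
User-agent: *
Disallow:
```

If that catch-all record is meant to inherit the Disallow rules above it instead, the cleanest fix is usually to fold it into a single record rather than leave it dangling at the end of the file.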
A site should not nose-dive within the space of a couple of weeks without extensive search engine manipulation or serious on-page issues. Your site has been live for seven years. It's better to suspect on-page issues first, though.
-
Bonjour! (I lived in Montreal for 6 years :-).
I do a lot of penalty recovery work, and you're in the same situation as a number of my clients: algorithmic penalty (probably), and you've disavowed links, but....no Penguin update for a year.
The next Penguin data update is most likely very soon, judging from mutterings from Matt at SMX Advanced. It's been almost a year since the last one. Your disavows won't take effect until there IS a data update.
I would wait for the data update, and see if you recover on rankings for the 3 terms you had in your footer links from client sites. If you do, then great, continue on...
If not, then I'd be inclined to start a new domain, and move your content from your old site (and blog) to the new site, WITHOUT 301 redirecting. Keep the old site running, but 301 redirect all of the pages to the home page....you want Google to successfully fetch all of those blog pages with the great content, but find it's permanently moved to your home page, where that content no longer exists. This way your new site's content will not be seen as duplicate by Google (if you just 404 the pages, Google will presume the content is still as it was before it 404'd....FOR MONTHS).
It's worth going through all of the backlinks for the old site, seeing which ones are from healthy sites, and manually asking those webmasters if they'd kindly update their links to point to your new site.