SEO Audit "Hybrid Site"
-
Hi everyone!
I'm trying to analyze a website that is regional in scope. The way the site has been built out for each market is like this: http://subdomain.rootdomain.com/market | http://asiapacific.thisismybrandname.com/ph
OR
http://subdomain.rootdomain.com/language | http://asiapacific.thisismybrandname.com/en
Since this is the first time I've worked on this kind of site, I'd like to ask for any guidance or tips on how to go about the SEO and technical site audit.
FYI, the owner of the sites is not giving me access to their webmaster tools account or to data from their analytics tracking tool.
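Without that access, one way to start is a quick scripted check of the market/language URL combinations to record status codes, redirects, and titles. A minimal sketch follows; the subdomain, root domain, and market codes are the placeholder examples from above, not the real site.
```python
# Minimal sketch: enumerate the market/language URL combinations and
# record status code, final URL, and <title> as a first-pass check.
# Subdomain, root domain, and paths are placeholders from the question.
from itertools import product

import requests
from bs4 import BeautifulSoup

SUBDOMAINS = ["asiapacific"]
PATHS = ["ph", "en"]            # market or language codes
ROOT = "thisismybrandname.com"  # placeholder root domain

for sub, path in product(SUBDOMAINS, PATHS):
    url = f"http://{sub}.{ROOT}/{path}"
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        title = BeautifulSoup(resp.text, "html.parser").title
        print(url, resp.status_code, resp.url,
              title.string if title else "(no title)")
    except requests.RequestException as exc:
        print(url, "ERROR", exc)
```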
Thanks everyone!
Steve
-
Bummer. Wish you luck, though!
-
Yep, I already ran the single custom crawl, and the crawl diagnostics turned out pretty useful. I was able to convince the marcom team that they need to clean up their site.
I've asked their IT team several times already, but there's strong resistance. Their guys aren't really into SEO, probably due to a lack of education on our part, and third-party tools are always viewed with extreme suspicion.
-
Thanks Steve - thrilled to hear it!
Re: the web crawler - it might be an IP-based or a requests-per-second type issue. The Moz crawl is usually fairly good, but even we get blocked sometimes. If that's the case, you might need to ask their team to run it internally or to specifically allow our crawler. You can also try the single custom crawl tool here: http://pro.seomoz.org/tools/crawl-test
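For what it's worth, a rough way to test whether the block is rate-based is to fetch a few known pages very slowly with an identifiable User-Agent and see whether they still come back blocked. A minimal sketch; the URLs, contact address, and delay are arbitrary placeholders.
```python
# Rough rate-limit test, not a full crawl: fetch a few URLs slowly with
# an identifiable User-Agent and check whether they return 200 or get
# blocked (e.g. 403/429). URLs and contact address are placeholders.
import time

import requests

HEADERS = {"User-Agent": "site-audit-check/0.1 (contact: you@example.com)"}
URLS = [
    "http://asiapacific.thisismybrandname.com/ph",
    "http://asiapacific.thisismybrandname.com/en",
]

for url in URLS:
    resp = requests.get(url, headers=HEADERS, timeout=10)
    print(url, resp.status_code)
    time.sleep(5)  # stay well under any requests-per-second threshold
```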
-
Thanks Rand! I've actually used both Screaming Frog and Xenu to crawl their site, but they've put some security on their website(s) that keeps those tools from running properly: they keep asking for login access. The good thing is that running the test through the SEOmoz web crawler isn't as problematic as it was with those free crawler tools.
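If the login prompts those crawlers hit are plain HTTP authentication (just a guess - it could equally be a form login or an IP allowlist), a quick check like the sketch below can confirm it. The URL and credentials are placeholders you'd get from the client's IT team; it's also worth checking whether your crawler can pass credentials itself.
```python
# Minimal sketch, assuming the login prompt is HTTP Basic Auth (an
# assumption, not confirmed). URL and credentials are placeholders.
import requests

URL = "http://asiapacific.thisismybrandname.com/ph"

# Without credentials: a 401 status suggests HTTP authentication is what
# is stopping the desktop crawlers.
print(requests.get(URL, timeout=10).status_code)

# With credentials supplied by the client:
resp = requests.get(URL, timeout=10, auth=("username", "password"))
print(resp.status_code)
```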
Btw, just want to say you guys are doing a great job here; lots of valuable insights to learn from you. Hope the next MozCation will be in the Philippines or Singapore. I'll make sure to attend =D
Cheers!
Steve
-
Hi Steve - sounds like a big challenge (particularly if you can't get access to analytics or webmaster tools). You could certainly start by getting crawls through SEOmoz PRO just to check for errors, issues, etc. Screaming Frog is also a good tool for this kind of one-off crawling that doesn't need to be tracked over time.
There are a few good posts on site audits in particular:
- http://www.seomoz.org/blog/how-to-do-a-site-audit
- http://www.seomoz.org/blog/4-ways-to-improve-your-seo-site-audit
- http://www.seomoz.org/blog/seo-site-audits-getting-started
- http://www.distilled.net/blog/seo/do-your-very-own-site-structure-audit/
Best of luck!
Related Questions
-
Baidu Webmaster Tools: How to set up the "Affiliate subject" field in "Site Properties"?
Hi, I finally managed to set up my site in Baidu Webmaster Tools with the help of a freelance staff member in China. The site is verified and the sitemap submitted. In the "Site Properties" section, after extensive searching I can't figure out what to enter in the "Affiliate subject" field for a foreign company with no presence and no company registration in China. Can anybody help? When I click the field, it says "Site association subject is a necessary link for mobile resources to enter search." Does that mean my site will not show up in mobile results without it? Grateful for any tips on how to resolve this piece of the Baidu setup puzzle. Thanks!
International SEO | lcourse
-
How well does Google's "Locale-aware crawling by Googlebot" work?
Hello, in January of this year Google introduced "Locale-aware crawling by Googlebot": https://support.google.com/webmasters/answer/6144055?hl=e Google uses different crawl settings for sites that cannot have separate URLs for each locale. This is basically for sites that dynamically render content on the same URL depending on the locale and language (IP) of the visitor. If, for example, a visitor comes from France, the targeted page loads in French; if a visitor comes from the US, the same page loads in English on the same URL. Does anyone have experience with this setup and how well it works? How well do the different versions of a page get indexed, and how well do those pages rank? In the example above, does the French content get indexed correctly? Many thanks!
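For illustration only, here is a minimal sketch of what locale-adaptive serving on a single URL might look like. Flask and the two languages are assumptions for the example; this is not Google's mechanism or any particular site's stack.
```python
# Generic illustration of locale-adaptive serving on one URL: pick the
# language from the Accept-Language header and flag the variation with a
# Vary header. Languages and content are placeholders.
from flask import Flask, request

app = Flask(__name__)

CONTENT = {
    "fr": "Bonjour ! Contenu en français.",
    "en": "Hello! English content.",
}

@app.route("/page")
def page():
    # A real site might also use the visitor's IP; default to English.
    lang = request.accept_languages.best_match(["fr", "en"]) or "en"
    headers = {
        "Vary": "Accept-Language",  # tell crawlers/caches the response varies
        "Content-Language": lang,
    }
    return CONTENT[lang], 200, headers

if __name__ == "__main__":
    app.run()
```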
International SEO | Veva
-
Which will rank higher: a non-mobile-friendly site in the native language vs. a mobile-friendly global site in English?
Hi, we are currently implementing a mobile site, e.g. m.company.com. The global mobile site will only be available in English. We have local subsites of the desktop site, e.g. company.com/fr. The local subsites are not mobile friendly. If a user does a search for a brand term in France, **which site will rank higher in SERPs?** If it will be the global site, is there anything we can do (other than making them mobile friendly) to make the local sites rank higher? Would it be the mobile-friendly site, even though it is only in English, because the local site would be penalized for not being mobile friendly? Or would it be the local site, because Google will give priority to the fact that it's in French, which matches the language of the person searching?
International SEO | jennifer.new
-
I have one site translated into several languages on different TLDs (.com, .de, .co.uk, .no, etc.). Is this duplicate content?
Three of the sites are English (.co.uk, .com, .us) and the rest are foreign-language (.de, .no, etc.) - are these all seen as having duplicate content? They're hosted under the same EpiServer backend system, if that helps. I am still copying and pasting content across each site, translating where necessary, so I'm concerned this is being indexed as large amounts of duplicate content. Site traffic doesn't appear to be suffering, but as I'm currently putting together new SEO strategies, I want to cover this possibility. Any advice on ensuring the sites aren't penalised is appreciated!
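One common way to signal that these are language/region alternates rather than duplicates is hreflang annotation across the TLDs. A minimal sketch follows; the example.* domains and locale codes are placeholders, not the real sites.
```python
# Minimal sketch: generate hreflang annotations so the same page on each
# TLD is declared as a language/region alternate rather than left to
# look like duplicate content. Domains and locale codes are placeholders.
SITES = {
    "en-gb": "https://www.example.co.uk",
    "en-us": "https://www.example.us",
    "en":    "https://www.example.com",
    "de":    "https://www.example.de",
    "no":    "https://www.example.no",
}

def hreflang_tags(path: str) -> str:
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{root}{path}" />'
        for code, root in SITES.items()
    ]
    # Optionally send unmatched visitors to the .com version.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{SITES["en"]}{path}" />'
    )
    return "\n".join(tags)

# Each site's page <head> would carry the full set of tags for that path.
print(hreflang_tags("/products/widget"))
```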
International SEO | hurtigruten
-
SEO and Cloud Hosting
Hello, I can't find any clear answers on my issue and hope someone can help. Rather than being worried about losing local rankings with a move to the cloud, we have the opposite issue. The site is a large, international reference site with millions of visits a month. We have the site on servers hosted in the UK, Europe, and the US. If we move the site to Amazon cloud hosting (obvious benefits aside), is there a danger of losing rankings internationally (depending on where the cloud datacentre is located)? Are there any other possible pitfalls, and how can we counter them? Would be grateful for some advice on this. Thanks
International SEO | LoweProfero-AU
-
Russian SEO: Do you know of any good sources with tips and news about Yandex?
We are launching our site in Russia, and basically I have no experience in Russian SEO! Could you please recommend some good English-language sources with news, tips, and hints about Yandex? I'm essentially looking for the Russian SEOmoz 🙂 Thanks!
International SEO | | jorgediaz0 -
SEO in the UK
I will soon be starting to do SEO for a client in the UK and wondered if there is anything I should do differently from what I do in the United States.
International SEO | hwade
-
International SEO with .com & ccTLD in the same language
I've watched http://www.seomoz.org/blog/intern... and read some other posts here. Most seem to focus on whether to use ccTLDs, subdomains, or subfolders. I'm already committed to expanding my US-based ecommerce business to Canada with a .ca ccTLD. My question is around duplicate content as I take my .com USA ecommerce business to Canada with a second site on a .ca URL. With the .com site's geographic preference set to the USA, and the .ca site's geo preference (automatically) set to Canada, is it a concern at all? About 80% of the content would be the same. FYI, .com ranks OK in Canada now, and I want .ca to outrank it in Canada. I know localizing content within the same language is important (independent of duplicate content), but this might not be viable in the short run given CMS limitations. Any direct experience to help quantify the impact here between US and Canadian ecommerce? Adding: I'm not totally confident here. From this Google Webmaster Central post it seems that canonical tags aren't needed. I tend to think nothing is truly neutral and want to be confident about whether to use canonicals or not. Is it helpful, harmful, or harmless? My site already has internal canonical tags, and having both internal and external ones would be a pain, I think. @Eugene Byun used it successfully, but would the results have been the same without? Thanks!
International SEO | gravityseo