Hey KJ,
These are good reads/resources to start with:
http://www.mcanerin.com/en/articles/301-redirect-iis.asp
http://authoritylabs.com/blog/solving-canonical-problems/
Cheers, Rob
Hi Gene,
Once you figure out the list from your data gathering, you'll need a tool to build and run reports against that list (or lists, if broken out by channel) so you can show the client where they sit in the rankings.
You can use a tool like the Moz rank tracking utility (in PRO), although I find it tedious because you can't build a single report as easily as with something like RavenTools (another tool my team uses for site rank audits) or AWR (Advanced Web Ranking).
You could also go with independent software like AWR (yearly licenses, which can be costly), which isn't cloud based; I prefer this software over any other. The flexibility of the program is great, and the custom reports are fantastic for moulding as needed to export data in multiple formats. You can take it anywhere and, provided you have an internet connection, run it anytime you like.
Looking over the current KW list they rank for is a good place to start (also look at Google Webmaster Tools, and Bing Webmaster Tools if you have that set up). Nice call, Brian!
You might also want to take the list you extract and build out an Excel file using the Moz 'keyword analysis and difficulty tool' to map out the competitiveness and difficulty score for each keyword, so you can organize them in terms of performance (and attribute that to either short- or long-tail KW traffic). You'll also extract some nice data through the Google API on exact and broad match phrase search volume.
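If it helps, here's a rough Python sketch of how you might assemble that keyword file once you have the list extracted - note that `fetch_difficulty` is a hypothetical placeholder for whatever scoring source you wire in (the Moz tool and the Google API both need their own credentials and calls):

```python
import csv

# Hypothetical stand-in: replace with a real lookup against whatever
# scoring source you use (Moz keyword difficulty, an AWR export, etc.).
# This just returns a placeholder so the script runs as-is.
def fetch_difficulty(keyword):
    return None

keywords = ["buy running shoes", "best trail runners"]  # your extracted list

with open("keyword_audit.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keyword", "difficulty"])
    for kw in keywords:
        writer.writerow([kw, fetch_difficulty(kw)])
```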
Hope this helps. Cheers.
Rb
Moz should keep the old data in that existing account - but you won't be able to port it over to the new profile/domain tracking (and I don't think you would want to) if it's a new client. I'm guessing you build a 'campaign' for tracking in the PRO account level tools?
I'm not 100% sure about Moz's integration and whether the data can be ported over, so I also suggest reaching out to the support team with a quick email.
If they don't want to share the old account - get a new one going and track your efforts against the older metrics. In time, you should be able to run comparative data (manually, of course) in Excel to show trending, jumps, bumps, etc.
Cheers
I agree. Run a new XML sitemap, get it installed into GWMT (Google Webmaster Tools), and verify you aren't seeing any issues with the site; watching for crawl errors and increased 404s will also help your efforts! Cheers.
I'm assuming you are moving from an old static HTML site to something along the lines of a CMS like Drupal or Joomla, based on your new URL structure example above.
Absolutely. If you don't 301 the old page URLs to the new locations and URL names, you will eventually lose all the back-link development you have been working on. Those old URLs will eventually return a 404 error in Google Webmaster Tools, and the link value and 'juice' will be lost.
Plan out an Excel spreadsheet and then work to map every page from your site to its counterpart NEW URL name. This way you will make sure all your pages get mapped.
You would probably also want to crawl your old domain (before the new site goes live) with a tool like 'Screaming Frog' or 'Xenu' - both are free to download (the best part) and great tools to have. This will help you find and extract all the pages in your site into Excel, ensuring you don't miss any in the mapping process.
I would schedule some time after launch to double check each URL individually (against the old URLs from the Excel doc from the crawl) once the site goes live, to verify that the proper page-level 301s are in place and working correctly.
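If you'd rather script that post-launch check than click through every URL by hand, here's a minimal Python sketch - it assumes you've exported the mapping spreadsheet to a CSV with `old_url` and `new_url` columns (those column names are just illustrative):

```python
import csv
import requests  # pip install requests

# Assumes a CSV exported from your mapping spreadsheet with
# 'old_url' and 'new_url' columns (names are illustrative).
with open("url_mapping.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Don't follow the redirect - we want to inspect the 301 itself.
        resp = requests.get(row["old_url"], allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        ok = resp.status_code == 301 and location == row["new_url"]
        print("OK  " if ok else "FAIL", row["old_url"], "->", resp.status_code, location)
```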
Hope this helps you out. You should be in good shape, if you follow these steps pre and post launch.
Rob
Does this mean you have to install all new GA code on the site, with a whole new profile being created? (I'm thinking yes.) You won't lose the data, but the two won't migrate together, if I understand what you are saying correctly.
Your best bet is to build an Excel report using a crawl tool (like Xenu, Frog, Moz, etc), and export that data. Then look to map out the pages you want to log and mark as 'not changing'.
Make sure to build (or have) a functioning XML sitemap file for the site and, as John said, state which URLs NEVER change. Over time, this will tell googlebot that it isn't necessary to crawl those page URLs, as they never change.
One caution on the META side: a META REFRESH tag actually reloads or redirects a page rather than signalling crawl frequency, and the old 'revisit-after' META hint is ignored by the major engines - the sitemap <changefreq> entry is the cleaner signal here.
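For the sitemap piece, here's a minimal Python sketch of what those 'never changes' entries look like (the URLs are made up, and remember `<changefreq>` is a hint the engines can ignore):

```python
# Sketch: emit sitemap <url> entries flagged as never changing.
# URLs are made up; <changefreq> is a hint engines may ignore.
urls = ["http://www.example.com/about/", "http://www.example.com/terms/"]

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{u}</loc>\n"
    "    <changefreq>never</changefreq>\n"
    "  </url>"
    for u in urls
)
print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
print(entries)
print("</urlset>")
```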
Hope some of this helps! Cheers
We also shifted focus away from other CMS systems like WordPress (unless the goal of the site and its traffic is built around a blog marketing strategy), and moved to using mainly Drupal 7 at this time, though I'm also a big fan of Joomla.
If I were you, I'd take some time to look into sites built on these platforms and explore the options. They offer incredible flexibility.
Nice additions Robert! Cheers.
Great response. I would have given you exactly the same steps. You should follow John's advice :)
Link building to these individual ccTLDs will be the biggest obstacle to overcome, especially in short amounts of time (if that matters), but if you have time and resources, this will help the geographic reach of your brand on a global level. It's just too bad when you have to break up one master domain (with pooled value) and go with individual domains for each country you are targeting.
Cheers, Rob
Hi Greg,
This might currently be affected by any back-links you have established (been building to the domain) or ones that have freely linked to you.
My advice would be to continue just working to improve your site. Performance metrics will increase over time if you keep your visitors and clients in mind. Improve the user experience, take care of the proper technical steps to ensure proper use of 301s, handle on- and off-page optimizations, and provide great content; over time, your metrics will show themselves off, which is always nice.
Just remember, all this work isn't something that will happen overnight. It takes time to improve these rankings. As Moz's index is usually updated 1 time per month (if they are running on time), you will be able to track this metric month over month. If you see steady drops, then you might want to start digging.
Again, my guess is that you are on the right track. Don't overthink the metrics behind the site. Look at the analytics data and ask: how can I better my site for an improved user experience?
Cheers, Rob
If you do want to run a back-link report on a domain, use the SEOmoz (PRO) OSE. Use the filters to break down where and how the links are coming into the site.
Yep, I got my confirmation as a pro tester via email from them today. The premier launch will be announced soon, and then I'll be able to log in and test it out - as I wanted to.
My advice, even with 13 years in the industry now, is to get a personal reference from someone you might know who is currently working with someone who is good at what they do. Plain and simple.
Personal references for work like this are always #1 in my book. I also find it easier to talk with the client, because they already have a sense of who you are, having gotten the reference from someone they know and trust.
Just my 2 cents. Cheers,
Yep, I would go with a 301 also. Keep that juice (or as much as you can muster). You'll lose a little value in the transfer (5-15%), but it would be worth keeping. Cheers!
I think in some way or another - we all do!!
I would check to see if the reporting is accurate on the competitor site/URL. Run checks to see if their duplicate content is actually 'duplicate' or if the reporting tool is just fooling you or misreading the data. It's always good to do manual checks.
Do that by grabbing text from the site and placing it in quotes (" ") for an exact-match phrase search. Do you see the actual URL and a duplicate (like a Drupal www.mysite.com/node/12345) below it? If so, it's a pretty safe bet that duplicated content is affecting that site.
Just one thing you can do to help research it.
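If you want a quick scripted first pass alongside the quoted-phrase search, here's a crude Python sketch that fetches two URLs and compares their raw HTML with difflib - it compares markup as well as copy, so treat a high ratio as a red flag rather than proof (the URLs are made up):

```python
import difflib
import requests  # pip install requests

# Crude check: fetch both URLs and compare raw HTML. A ratio near
# 1.0 suggests the pages are effectively duplicates. URLs made up.
a = requests.get("http://www.mysite.com/some-page", timeout=10).text
b = requests.get("http://www.mysite.com/node/12345", timeout=10).text

ratio = difflib.SequenceMatcher(None, a, b).ratio()
print(f"Similarity: {ratio:.2%}")
```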
My testing has not shown any improvements with regards to this effort - BUT you could try it, and share the results with the rest of us!
Best practice for SEO TITLES is something like:
Primary Keyword - Secondary Keyword | Brand Name
or
Brand Name | Primary Keyword and Secondary Keyword
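A trivial Python sketch of that pattern, with a length check since the engines typically truncate displayed titles somewhere around 60-70 characters (the exact cutoff varies):

```python
def build_title(primary, secondary, brand, max_len=65):
    # Pattern: Primary Keyword - Secondary Keyword | Brand Name
    title = f"{primary} - {secondary} | {brand}"
    if len(title) > max_len:
        # Over the typical display cutoff; drop the secondary KW.
        title = f"{primary} | {brand}"
    return title

print(build_title("Running Shoes", "Trail Runners", "MyStore"))
```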
See this article from the Moz team - a pretty good read too!
http://www.seomoz.org/learn-seo/title-tag or
http://www.seomoz.org/beginners-guide-to-seo
Cheers! Rob
As I stated, make sure to use a proper translation service in your efforts, and not any automated translation. They never really get tone, punctuation, etc. right in native languages (per my book/rant above).
As well - make sure to ADD the meta language tag on each of your pages in the sub-folders (or subdomains) you use.
And you can use this link resource to find the language code for this feature.
[Meta Language Tags](http://www.seoconsultants.com/meta-tags/language)
Note: You should make sure all the canonical versions render properly as well. This way, all links and 'juice' or value are passed to the domain name you specify.
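Since the inline example got cut off above, here's a rough sketch of the kind of per-language head tags I mean - the domain and folders are made up, and you should verify the language codes against the resource linked above:

```python
# Sketch of the per-language head tags referred to above. The domain
# and folders are made up; check the link resource for correct codes.
languages = {"fr": "/fr/", "de": "/de/", "es": "/sp/"}
base = "http://www.mysite.com"

for code, folder in languages.items():
    print(f'<meta http-equiv="Content-Language" content="{code}">')
    print(f'<link rel="canonical" href="{base}{folder}">')
```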
Checking Google Webmaster Tools will also help you see any errors when handling duplicate content issues (for the homepage or the entire site), as this was also a feature Google added in recent months.
Cheers!
I'm not a big fan of breaking language sites into sub-domains, because you break up the value of the link structure and link juice in the domain. So for every single language you break out into a sub-domain, you end up having to build more links to that sub-domain.
Each SEO is different and has different experience, tactics and strategies from testing and previous work on sites. Sub-folders have been more successful for me than sub-domains in terms of rankings and multilingual (MSEO) techniques.
Cheers! Rob
Hi Brant,
What you are talking about is Multilingual SEO processes. There are a few ways you can go about doing this.
You can either:
A) Go with the following setup for the domain, using a site URL/sub-folder structure:
www.mysite.com (english)
www.mysite.com/sp/ (spanish)
www.mysite.com/fr/ (french)
www.mysite.com/de/ (german)
etc, etc, etc..
Or:
B) Pick up the same domain name with the country-level extensions for each country you are targeting (like .ca for Canada, .de for Germany, .com for the USA), etc.
I prefer option A for many reasons, but everyone has their preferences
If you go with A, keep the domain setup the same and build /folders/ with duplicates of the site pages, focused on and targeted at each language.
If you go the route of using sub-folders - you will need to inquire about setting up geo-location services at the domain/hosting level (through IP detection)
If going with the sub-folders, don't forget (from a user experience perspective) to make sure users can manually 'choose' their language from the site's homepage - especially if you offer more than 1-2 languages and expand to 3, 4 or 5 versions.
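One way to suggest (not force) a default language is to read the browser's Accept-Language header and still leave the manual picker in place. A minimal sketch, assuming you have the raw header string available:

```python
# Minimal sketch: suggest a language folder from the browser's
# Accept-Language header, falling back to English. Users should
# still get a manual language picker on the homepage.
SUPPORTED = {"en": "/", "es": "/sp/", "fr": "/fr/", "de": "/de/"}

def suggest_folder(accept_language):
    for part in accept_language.split(","):
        code = part.split(";")[0].strip().lower()[:2]
        if code in SUPPORTED:
            return SUPPORTED[code]
    return SUPPORTED["en"]

print(suggest_folder("fr-CA,fr;q=0.9,en;q=0.8"))  # -> /fr/
```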
This process is very intensive and needs to be done carefully. You want to use professional services for translation, as Google Translate and other online services aren't always accurate in sentence structure. Google does not recommend automatic translations.
If you go this route, you will also need to redo a complete KW audit from a search engine optimization perspective, so you have the RIGHT keywords that people use for that market (and its products) in their own languages. English isn't always a market parallel when other languages are involved. Keyword translation is very important here to be successful with customers and target search.
** Using sub-domains can also be done - but sub-domains are considered to be independent domains by Google and therefore don't pass link 'juice' and value for inbound links across the whole site. Sub-folder structures are best for allowing link 'juice' from link building efforts to be passed to the entire site.
If you go with A), you are using /folders/ for each language you want to target. This type of setup is less expensive as well (no added cost of purchasing more domains, hosting, etc.).
Try to avoid using geo-location at the hosting level (from an IP address perspective), as it isn't always the best option for your user experience. Allowing users to choose the language they want to view the site in will help them. Just because someone visits a site from the U.S. doesn't mean they read English (they could prefer Spanish, Chinese, Russian, etc.), and they'll want the option to choose the language of the site you are promoting.
Remember to use UTF-8 for non-English character encoding (on pages, URLs, etc.).
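On the URL side of that, non-English path segments need to be percent-encoded as UTF-8 bytes - Python's standard library shows what that looks like:

```python
from urllib.parse import quote, unquote

# Non-ASCII path segments are percent-encoded as UTF-8 bytes.
path = "/fr/catégorie/chaussures"
encoded = quote(path)
print(encoded)           # /fr/cat%C3%A9gorie/chaussures
print(unquote(encoded))  # round-trips back to the original
```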
Presenting sites in multiple languages isn't 'duplicate content' when breaking it out into various /folders and then languages.
*** This is also great user experience and if done properly can help you retain the visitor and convert them into a customer/client as you have taken the time to build out information in their native tongue.
A couple of good articles on MSEO (Multilingual SEO) to help you along. With this, you could probably dig for more information too.
http://www.searchenginejournal.com/multilingual-seo/19903/
Sorry for the long book of information and links! Ideas just kept coming to me while I was writing!
Cheers, Rob
The 301 redirect also tells the search engine the NEW location/address of the page (the final page you want to have ranked) while passing along the value from the links already built to the existing page.
I totally agree with Alan on this. You really need to look at each individual URL and its 301 redirect to ensure they are in place, page by page. It's a sensitive process for sure, so proceed with caution.
It's also not a good idea to 301 all URLs and links to the homepage URL. That doesn't make for a great 'user experience' and can cause confusion for visitors looking for specific material. It can also confuse the hell out of a search engine ('where did all those other pages I had indexed go?!').
You might also damage the inbound link quality of pages that currently have links pointing to them. Sure, that value would pass on to the main domain URL (losing 5-15% in the 301), but it's just not a great strategy to blindly 301 everything to the homepage. Alan said it right below - you need to look at the links, the inbound anchor text, and where it's coming from, then redirect to the right page via a 301 for the best end-user experience.
I would map out all the URLs that are new, and then map out the old URLs that correlate to those pages exactly. Then set up the web.config file with individual page-level 301 redirects and put those into place. He provided a link below to his web.config file for 301 redirects.
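If the mapping lives in a spreadsheet anyway, you can generate the web.config entries from it instead of typing them by hand. A hedged sketch below uses the IIS 7+ `httpRedirect` element inside `<location>` blocks - one common pattern, but verify the exact syntax against your IIS version and the web.config example Alan linked (paths here are made up):

```python
# Sketch: generate per-page 301 entries for web.config from a
# mapping dict. Uses the IIS 7+ httpRedirect pattern; confirm the
# syntax against your IIS version before deploying. Paths made up.
mapping = {
    "old-page.html": "/new-section/new-page/",
    "services.htm": "/services/",
}

for old, new in mapping.items():
    print(f'<location path="{old}">')
    print("  <system.webServer>")
    print(f'    <httpRedirect enabled="true" destination="{new}" '
          'httpResponseCode="Permanent" />')
    print("  </system.webServer>")
    print("</location>")
```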
Cheers, Rob
Page and site load speed/time is a ranking factor, as Matt has said in the past. It's beginning to carry more weight in the 'user experience', which Google has been trumpeting in recent months.
I would analyze and look over the data to see whether it's worth taking the time to start correcting issues with your site speed and load times.
YSlow is a great tool for sure, but you can also download a Chrome app called 'SEO Site Tools', which has some great features for determining things like this. I would also look to incorporate Gzip into your site for compression.
You might also want to look at the file size of the homepage. If it exceeds 30 KB, it's probably too large and taking more time than needed to make the HTTP requests for the various CSS, JS scripts, images, etc.
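A quick Python sketch to eyeball those numbers - bytes on the wire, whether gzip kicked in, and a rough response time. Note it measures the HTML document only, not the CSS/JS/images it pulls in (the URL is made up):

```python
import time
import requests  # pip install requests

url = "http://www.example.com/"  # made-up URL
start = time.time()
resp = requests.get(url, headers={"Accept-Encoding": "gzip"}, timeout=10)
elapsed = time.time() - start

# requests transparently decompresses, so len(resp.content) is the
# decompressed size; Content-Encoding tells you if gzip was used.
print(f"Status: {resp.status_code}, time: {elapsed:.2f}s")
print(f"Decompressed size: {len(resp.content) / 1024:.1f} KB")
print(f"Content-Encoding: {resp.headers.get('Content-Encoding', 'none')}")
```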
Just some more things to look at. Cheers.
You can/need to build links from within the domain to the PDF report you want to share also. Links within the PDF file will also help for people who share the file via email rather than within the site. This will bring people back to your domain.
Also consider that each sub-domain is considered by Google to be a separate domain.
So building out sub-domains (like www.google.com and adwords.google.com, for example) would be considered 2 separate sites within the same company, and when you start expanding and building links, those links to the sub-domain only count toward that actual sub-domain - link value and 'juice' isn't passed through to the rest of the site naturally.
It would be better to build sub-folders on the main domain (google.com/adwords/ as an example), as the value of any links to the /adwords/ sub-folder is passed all the way through the entire domain, even from a sub-folder - as well as everything else in the list mentioned above! It makes link building easier to manage, without having to build links to all those individual channel sub-domains.
Hope this also helps you out!
Everything else EGOL mentions above also counts.
Cheers! Rob
The only way it could be duplicate content is if you load the video to multiple sites (it's better to load it to one location on your site and share the EMBED feed - that way you also get credit for the inbound links), or if you transcribe the content of your video for the site and populate that across multiple other sites. If you just transcribe the content of the video and leave it as is, you'll be fine.
It really does depend on the product and type of site for sure.
I agree here, but with the Panda updates this past year, just having pages up won't really do much. You'll need to improve the user experience to build on the page.
I would build on the page, but look to improve the landing pages for products that are either no longer available, or pages that are (or will remain) online with products that are not available 'at this time'.
Bring in social media, product landing pages, perhaps a 'comment' section for customers to review the products (to offer some user-generated content), alongside other features like customized descriptions (don't copy the supplier site), the history of the product, its origin, etc.
If a product is no longer available, redirecting to a page of similar, related products will help keep the client on the site while also offering alternatives that meet similar needs.
Do all this, and I see a win-win for you.
You definitely don't want to use a 404 error code, so avoid that at all costs. You'd have a lot of 301 redirects to set up after that, as Google isn't a big fan of 404 pages and they don't help your 'user experience'.
Because your product pages (individually) might be gaining links and resources/mentions/social mentions, etc, from customers as they find products, a 404 would produce a loss in valuable inbound linking juice into the domain.
I would simply keep those pages live at all times, but build on the page/product descriptions, history, features, etc., and keep those deeply embedded and indexed pages in the domain. Then if the products come back online, the pages provide users excellent content about the product and, in parallel, support their user experience.
If a product doesn't return after some time, then work to find a solution for those specific pages. Perhaps a directory of 'out of date products' that visitors could reference if they were looking for something in particular on the site - offering an alternative (if available) in its place?
Hope this helps a little. Rob
Thanks, I am aware of this happening - I read the post when Barry wrote it. I was just curious to see if there was a way around it (for brand search).
When doing a search in Google (US proxy), Google is stripping and replacing my functional TITLE with the brand name only (say 'Nike'). But if you do a specific search term (like 'buy nike shoes') and see a top-10 listing for my site's homepage, the title works and shows correctly.
I saw this a few years ago with another one of my company domains, but didn't ask the question as it worked out.
Thanks for any insight..
NOTE: It's not damaging any results or rankings for the site. But when searching for the BRAND name of the company, like I explained, Google replaces an optimized title with the BRAND name, and then restores it naturally when a deeper search brings up the homepage and the TITLE looks fine.
Very weird at best!
Thanks, Rob
Really detailed overview. Nice job touching on everything.
Essentially, keep your site design around the "user experience" as this is a post-panda world. Don't overload it with blocks of advertising, breaking up the page, content, layout and user experience. The Keep It Simple (KIS) strategy usually works best for me when it comes to running ads on sites, within the user interface/design.
NEVER use more than 1 ad block on a site's page, as it looks "spammy" and cluttered. As a user myself, I immediately click back or out of a site that I find is running more than 1 block of ads. The page just looks terrible.
You might also want to consider A/B split testing of different landing or home pages. This will allow you to try different layouts to get the best results on both where the ads perform and the user conversion model for the site's content and lead generation.
Another note to keep in mind is to keep the blocks of ads relevant to the content of the market/site you are working in. Don't run open AdWords campaigns on sites for just "anything" to generate clicks. If you're running a bike store website, keep the ads targeted to people in the "bike" industry. Keep the ads filtered and related to the market content you are optimizing for.
Remember, in post-Panda, it's the user experience that the quality teams are looking to improve.
Thanks Ryan and Ryan! I'm just unfamiliar with this command in the robots file, and I'm still getting settled into the company (5 weeks), so I am still learning the site's structure and architecture. With it all being new to me, and given the limitations I'm seeing on the CMS side, I was wondering if this might have been causing crawl issues for Bing and/or Yahoo. I'm trying to gauge where we might be experiencing problems with the site's crawl functions.
So, for this parameter, should I keep it in the robots file?
Hey Everyone!
Perhaps someone can help me. I came across this command in the robots.txt file of our Canadian corporate domain. I looked around online but can't seem to find a definitive answer (only slightly relevant ones).
The command line is as follows:
Disallow: /*?*
I'm guessing this might have something to do with blocking PHP query-string searches on the site? It might also have something to do with blocking sub-domains, but the "?" mark puzzles me.
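For what it's worth: in the wildcard syntax Google and Bing support, `*` matches any run of characters and `?` is a literal question mark, so `Disallow: /*?*` blocks any URL containing a `?` - i.e. anything with a query string (PHP search URLs, tracking parameters, etc.). A toy Python matcher to illustrate (just a sketch, not a full robots.txt parser - and note the stdlib robotparser ignores wildcards, so don't test with that):

```python
import re

def robots_match(pattern, path):
    # In robots.txt wildcard syntax only "*" (any chars) and a
    # trailing "$" (end anchor) are special; "?" is a literal.
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

pattern = "/*?*"
for url in ["/page.php?id=12", "/search?q=shoes", "/about/", "/products/"]:
    print(url, "->", "blocked" if robots_match(pattern, url) else "allowed")
```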
Any help would be greatly appreciated!
Thanks, Rob
Thanks. It was a little of both in terms of concerns. I didn't want indexing issues, and the hyphen just threw me off - as well as its impact on the usability of the page. As long as the hyphen will work in the sub-domain, I'm good to go. I didn't want any issues later. Thanks to all who replied!
This is actually for the sub-domain, not the primary domain.
so football.mysite.com as opposed to something like football-sport.mysite.com
It's the hyphen that's throwing me out of whack.
Ideas? And thanks for the insights!
Sorry - should have said "not very friendly!"
I agree. I'm still on the fence about the hyphenated sub-domain. I can't find many sites that actually practice this technique; I'm looking for some references online.
This domain won't be spoken over the phone, but from a usability perspective it's not very flashy or friendly.
I wanted to use something like say football.mysite.com instead of football-sport.mycompany.com
I'm still perplexed!! LOL
For our corporate business-level domain, we are exploring using a hyphenated sub-domain for a project.
Something like www.go-figure.extreme.com
I thought from a user perspective it seems cluttered. The domain length might also be an issue with the new algorithm big G has launched in the recent past.
I know from past experience that hyphenated domains usually take longer to index, as they are used more frequently by spammers and can take longer to get out of the supplementary index.
Our company site has over 90 million viewers per year, so our brand is well established and traffic isn't an issue. This is for a corporate-level project and I didn't have the answer!
Will this work? Does anyone have experience testing this? Any thoughts will help!
Thanks, Rob