I'm after any good examples of badges people have used to give to their community.
If anyone wants to share examples that would be really helpful.
Thanks.
Hi,
It shouldn't make a difference. Google will crawl either link and follow it (unless instructed otherwise).
The issues only start if you want to launch multiple versions of your site (ie different countries etc) - but that has little to do with SEO, it's more a technical thing.
This probably doesn't help you much sorry.
It's a suggestion that all major search engines obey. We have used it extensively and in our experience it is listened to by the search engines every time.
If you head down this route it is very important that you don't just take your pages and re-make them verbatim with a new folder in front - search engines will consider this duplicate content.
You will need to tailor-make your content per area and make it unique, which it sounds like you are going to do.
Re: link building, you could quite easily find out the IP addresses of the bigger influencers in each city and target those as somewhere to start link building; that will give you relevance in the cities you are trying to target.
If you want to boost your organics and social media, you need to look at all the options, not just Twitter.
For Google, use Google+ (I've +1'd things and seen immediate, short-term results) and for Bing use Facebook and Twitter.
However, it's really important that you don't focus on these things too much - they are short-term gainers, whereas link building and getting the site in order from A-Z will be much better in the long term.
Also, things like blogger outreach are important - they are still partly a social signal - and their links will last a lot longer than Twitter mentions etc.
Hope this helps.
Hey Nightwing
I think you need to be careful here. A 301 is a redirect, whereas a canonical tells Google, Bing etc to treat the page as a duplicate and not index it.
To quote Google: "A canonical page is the preferred version of a set of pages with highly similar content."
See Matt Cutts and all his beauty explain here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
A canonical won't redirect the page like a 301 - the page still exists; it's just that search engines will remove it from the SERPs.
As long as the code is all set up right, the only way to check that a canonical is working is to review the code... and the page should also drop out of the SERPs too.
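If you'd rather script the check than eyeball the source on every page, a quick sketch along these lines would do it (Python with the requests and beautifulsoup4 libraries assumed; the URL and function name are placeholders):

```python
import requests
from bs4 import BeautifulSoup

def get_canonical(url):
    # Fetch the page and pull out the href of the rel="canonical" link tag, if any.
    html = requests.get(url).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    return tag["href"] if tag else None

print(get_canonical("http://www.example.com/some-page"))
```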
Hope this helps.
Stay warm
Hey,
I agree with Alan - you've set it up correctly from what I looked at.
What else have you done recently to the site? It might just be a natural dip in traffic - have you compared YoY to see if you always get a dip at the same time?
Also, what do you mean by visits dropping? Are you talking about a few percent here or a 50% drop?
You will naturally get fewer pages indexed with the canonicals in place - that's the idea.
Here is the info on pushstates - as mentioned above: https://developer.mozilla.org/en/DOM/Manipulating_the_browser_history
Again, that is not possible because of the #. You can't redirect a browser-side page.
There are some tricky things you can do with modern browsers to push things through via AJAX, but that is another kettle of fish, and even then I am not sure it is possible to 301.
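To make the point concrete, here's a tiny Python sketch (the URL is made up) showing that the fragment never reaches the server, so there is nothing server-side to redirect:

```python
from urllib.parse import urlparse

parsed = urlparse("http://www.example.com/page#!section")
print(parsed.path)      # /page  <- this is all the server ever sees
print(parsed.fragment)  # !section  <- browser-only, never sent in the request
```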
Sorry!
A
I might be wrong, but isn't a # handled browser-side? So I am not sure this is possible, as a search engine can't see the # anyway.
Willing to be proved wrong on this one, but I think that is the case... might go and grab one of our techies and check.
Hi Neil.
The sites we have done are all new, but from experience dealing with moving to new URLs, the best thing to do is create a mapping document in Excel. It's quite easy if you know, for example, that:
www.domain.com/berlin-hotel is moving to www.domain.com/de/berlin-hotel.
Then all you need to do is put in the 301s based on the mapping and monitor WMT for issues - you will always miss something.
From what you are saying, however, there is no logical structure to your site - which will make this harder. I have had to deal with this in the past too; you might just need to identify all of the more important pages and 301 these first, then go via Mechanical Turk or get an intern in to just plough away and find all of your URLs.
If you have an XML sitemap you should be able to get them all pretty quickly and map from there.
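To give a flavour of how mechanical this can be once the mapping exists, a rough sketch (Python; the CSV file name, URL format, and Apache-style output are all assumptions - adapt to your server):

```python
import csv

# Assumed mapping file with two columns exported from Excel:
# old URL, new URL - e.g. www.domain.com/berlin-hotel,www.domain.com/de/berlin-hotel
with open("redirect_map.csv", newline="") as f:
    for old_url, new_url in csv.reader(f):
        old_path = "/" + old_url.split("/", 1)[1]  # -> /berlin-hotel
        # Emit one Apache rule per row; nginx/IIS equivalents are similar.
        print(f"Redirect 301 {old_path} http://{new_url}")
```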
Hope this helps.
I would definitely put them on your own site as well and have a few keyword-rich links back to the areas you want to push on your site.
We are looking at doing this to support the main areas on the site. We will also still host them online in other places too.
Nothing wrong with doing this. The few links back from these will pass link juice across to your key areas.
Oops, should have said link juice.
As you can all see Alan and I have different views on this but at least you have a range of views Tommo!
Good luck - hope it all goes well.
A 301 is generally thought to pass around 80% of the page authority over. You also don't leave any 404s behind.
I'm not sure why you are so against 301s? It's tidier, "best practice" and not hard to do. Why risk missing something out that might prove the difference?
The pages will still have some authority even without links, so I would definitely pass every little bit across. Social signals too, if there are any that might not have been picked up. And while some tools say there are no links, I don't know of a single 100% reliable tool that can tell you page X has no links.
If the local sites are in DE, FR, ES or whatever, then they are not duplicate content, they are local-language versions. IBM and Apple, to name a couple, certainly go down this route too.
Matt Cutts may say that, but we certainly do not suffer from this problem in the least. Another Cutts "we do this but really don't" comment maybe?!
Disagree with you on that one Alan. We have no issue with duplicate content and it is also what everyone (including those at MOZcon this year) recommend.
In fact I have an email from SEOMoz themselves recommending it.
TLDs may get better over time, but from experience running 7 sites (6 country sites) I would only ever use folders now.
I also disagree with your comments below about not 301ing all old content. You must do this when re-launching.
Otherwise you are having to link build to multiple sites, they start from zero in a search engine's eyes (no matter how strong your current site is), and your marketing materials cannot just mention domain.com etc.
Using folders on our strong domain, we have seen new country sites launch and a month later sit at the top of local search engines for hard-to-rank-for search terms.
I cannot recommend strongly enough that going down the folder route is much better for SEO.
Yes, that is what I am saying. Definitely head down the domain.com/country-folder route, ie domain.com/uk.
This has a lot of advantages around marketing: all of the links to this one domain help strengthen the entire site, instead of having to link build to a number of new sites (which are starting from scratch in Google's eyes).
And you can still target them in GWMT by country too - which will definitely work. I am presuming they will be in local language as well?
Hope this helps.
A
Okay. That makes sense, but I would stick to one TLD if you can: you can run with folders, and all of the country sites (in the folders) will inherit all of your one TLD's strength, which makes things like link building easier too.
I would strongly recommend this. We have 6 international sites now - three on TLDs (before I got here) and three on folders - and all of the folder sites are doing so much better than the TLD sites; the difference is amazing.
Even if you go with the new TLD I would miss out the GD_ bit - you don't need that at all if you have a TLD and it makes no sense to me. I would rather have domain.com/city-hotels/hotel-name and optimise around this. You can then have landing pages around city hotels, ie berlin hotels, and pull in traffic this way.
Just a thought!
I would 301 all of the pages - to relevant new pages - otherwise you will end up with a heap of 404s if the old content just disappears. Blogs etc will have linked to them, and you want to make the user experience as good as possible.
Mapping out the 301s will take time but be worth it in the long run.
I have done a website with 500,000 pages and mapped it and it worked well.
A
Hi Tommo.
Sounds like the clean-up is a great idea. I wonder if you still need the GB at all? Is it for language or country? If you are doing country/language I would have a folder for each, ie
domain/en/us/title-of-hotel
You can then geo-target these to country specific areas in GWMT as well - which would be helpful.
And definitely 301 them all - this is a must.
A
Have you thought of 301ing it to your stronger sites? Passing even a little bit of authority over to any other sites you have is one idea.
Sounds to me like the site is very flat and not really adding much - I would look at the less-is-more approach.
I have also had this problem when the site goes down for maintenance - it says there are 404s etc. Have you taken the site or any pages down recently and then put them back up?
A
Have your IT guys done a 301 or 302 from http to https, or vice versa? That would be a good place to start looking.
Also check your canonical tags are in the right place and pointing at the right areas; people mess them up all the time - it's quite common.
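If you want a quick way to see what kind of redirect is in place, a small sketch like this would do it (Python with the requests library assumed; the domain is a placeholder):

```python
import requests

# Fetch without following redirects so we can see the raw response.
resp = requests.get("http://www.example.com/", allow_redirects=False)
print(resp.status_code)              # 301 = permanent, 302 = temporary
print(resp.headers.get("Location"))  # where the redirect points
```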
Hope that's a start.
Definitely bad practice. You can have a class on the first h1, but there should be only one h1 to be semantically correct.
The exception is HTML5, where you can have one h1 per sectioning block, but the above is not HTML5 by the looks of it.
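If you want to check a page quickly rather than reading the source, a rough sketch (Python with the requests and beautifulsoup4 libraries assumed; the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

# Count the h1s on the page - more than one on a non-HTML5 page is the problem case.
soup = BeautifulSoup(requests.get("http://www.example.com/").text, "html.parser")
print(len(soup.find_all("h1")))
```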
A
Yes, a 301 will lose some link juice; around 80% is thought to be passed through, give or take a little and depending on who you believe.
Hope that helps.
ditto - just seeing what people think; you're in my camp by the looks of it.
Nope, it's not free. It's actually something that appears in SERPs - it's not actually on-site as such - like the House of Fraser result you see here:
http://www.google.co.uk/search?gcx=c&sourceid=chrome&ie=UTF-8&q=house+of+fraser
A
Hi,
I was just wondering if anyone had used Google Site Search before and what they thought of it?
http://www.google.com/sitesearch/
It seems quite expensive for something that just returns your own pages, but I would be interested to find out more.
Thanks
I agree with all three of you guys - make it as clean as you possibly can.
Yeah - that would work. Well it should work if done the right way.
Hi,
We have 4 foreign-language sites, one in Spanish; we remove the special characters in all of them and rank very highly, so there is no harm in doing this - if anything, leaving them in makes things harder.
I would stick with all lower case, or at least use the same logic throughout the URL - as long as it is consistent, it's no biggie.
No matter what you do, if you make changes to any of this make sure you 301 all of the old pages to their new versions, otherwise you will be starting from scratch!
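If you ever need to strip the special characters programmatically, a small sketch using only the Python standard library (the function name is made up):

```python
import unicodedata

def slugify(text):
    # Decompose accented characters, drop the non-ASCII parts, and hyphenate.
    ascii_text = (
        unicodedata.normalize("NFKD", text)
        .encode("ascii", "ignore")
        .decode("ascii")
    )
    return "-".join(ascii_text.lower().split())

print(slugify("Hotelería en Málaga"))  # -> hoteleria-en-malaga
```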
Hope this helps.
Is your email being hosted online in HTML form? Google won't crawl an email itself - there's no way for a robot to crawl something in email form; it can only reach it if you host it online and link to it.
If it is hosted online, then definitely nofollow any links you don't want followed.
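If there are a lot of links, you could add the nofollows programmatically - a rough sketch (Python with beautifulsoup4 assumed; the markup is made up):

```python
from bs4 import BeautifulSoup

html = '<a href="http://www.example.com/partner">Partner</a>'
soup = BeautifulSoup(html, "html.parser")
for a in soup.find_all("a"):
    a["rel"] = "nofollow"  # tells search engines not to follow this link
print(soup)  # <a href="http://www.example.com/partner" rel="nofollow">Partner</a>
```

Hope that helps.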
We run one site all on https and there is no problem at all - we link build as usual and see no bad impact; in fact we are doing very well.
It's not usual practice, but for SEO, as long as you are playing by the rules, it will have no impact whatsoever.
Have you tried nofollowing the links? That might help: http://en.wikipedia.org/wiki/Nofollow
Hi,
We use AdWords for our Spanish site - and get very good data out of it for both SEO and PPC, even down to the long tail. What types of keywords are you looking for? I will have a look and see if I can help you find more.
I asked our Spanish PPC expert, and she said AdWords works really well for her.
A
From my experience, HTML sitemaps are now largely ignored by search engines. You could do a basic one for users, but otherwise put your efforts into a very good XML sitemap; that will help you more than what is generally just a static HTML page which, like most things in footers nowadays, is ignored by the SEs.
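For reference, the XML format is tiny - a minimal Python sketch of generating one (the URLs are placeholders):

```python
# Build a bare-bones XML sitemap from a list of URLs.
urls = [
    "http://www.example.com/",
    "http://www.example.com/berlin-hotel",
]
entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
print(
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)
```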
A
Good luck Arjun, would love to hear how it all goes!
Hi,
First up, good idea, but I would be a bit careful about asking bloggers to post 4 links. Bloggers hate being told what to do, in my experience. I would just ask them to link through to the resort without putting too much pressure on them.
The idea is very sound, but be a bit careful - I have seen bloggers come out and post that "X place wants us to blog for links" and generate a lot of bad press.
Otherwise, I would use Open Site Explorer to have a look at the blogs you are interested in and rank them against each other; this is what I do and it's quick and easy. Remember your link profile should look nice and natural, so don't try and get too tricky.
Also today's small blog might be the next big thing tomorrow, so there is no harm in getting a smaller blog to link through.
Great idea - just be careful in the execution, bloggers can turn on you very quickly!
A
Hi,
You can actually cap FCF at X number of visits per user per day by dropping a cookie. Otherwise, what you are proposing is potentially a bit dodgy - if a human tester visits the site and gets a different experience to the bot, you might be at risk. I doubt you will get found out, but at the same time, if you want to go pure white hat, you need to follow the rules. Your call really.
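For what the cookie approach might look like, here is a hedged sketch (Flask assumed; the route, cookie name, and limit of 5 are all made-up illustrations, not anything Google prescribes):

```python
from flask import Flask, request, make_response

app = Flask(__name__)
DAILY_LIMIT = 5  # assumed cap - pick whatever X suits your site

@app.route("/article/<slug>")
def article(slug):
    # Read the per-user view count from the cookie (defaults to 0).
    views = int(request.cookies.get("fcf_views", 0))
    if views >= DAILY_LIMIT:
        return "Please subscribe to keep reading."  # paywall prompt
    resp = make_response(f"Full article content for {slug}")
    # Bump the counter; the cookie expires after a day so the cap resets.
    resp.set_cookie("fcf_views", str(views + 1), max_age=60 * 60 * 24)
    return resp
```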
A
You can also use Google First Click Free to let it index the site - really easy to set up and run. I suggest you use this; I did it at a previous company and it works so well it's not funny.
More info here:
http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html
Hi,
My first question is why you would need two mobile sites to start with - one mobile site that renders for any device would be best.
Google will not see them as duplicate content from my experience. You should submit the new site(s) via WMTs. They should be light in code and load a lot faster than your normal site. Google is starting to favour lighter (faster to load) sites on mobile, so this will help them out.
However, until the site builds up a big link profile it probably won't outrank your normal blog for some time - I have had a mobile site for over a year and our heavyweight non-mobile site still wins in the mobile SERPs.
We do an auto-redirect from the normal site to the mobile site when we detect that a person is viewing it on a mobile device.
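For illustration, a rough sketch of that kind of auto-redirect (Flask assumed; the mobile host and the user-agent check are simplified placeholders - real device detection is fuzzier than this):

```python
from flask import Flask, request, redirect

app = Flask(__name__)

@app.route("/")
def home():
    # Crude device check on the User-Agent header.
    ua = request.headers.get("User-Agent", "").lower()
    if "mobile" in ua or "android" in ua:
        return redirect("http://m.example.com/", code=302)  # hypothetical mobile host
    return "Desktop homepage"
```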
Hope this helps!
I would definitely get up to speed with JavaScript too; a lot of sites use it, and while you might not need to be able to write it, being able to read it and understand its implications would be helpful.
Also Flash - whilst it is a pain, understanding how it works with SEO (you can do some things around it) would be good.
But definitely get yourself on a basic HTML and CSS course!
A