Questions created by Capote
-
How best to handle (legitimate) duplicate content?
Hi everyone, appreciate any thoughts on this (bit long, sorry). I'm working on 3 sites selling the same thing; the main difference between each site is physical location/target market area (think North, South, West as an example). Now, say these 3 sites all sell Blue Widgets, and thus all on-page optimisation has been done for this keyword. These 3 sites are now effectively duplicates of each other (well, the Blue Widgets page is, at least), and whilst there are no 'errors' in Webmaster Tools, I'm pretty sure they ought to be ranking better than they are (good PA, DA, mR etc.). The sites share the same template/look and feel too AND are accessed via the same IP, just for good measure 🙂 So, to questions/thoughts:

1. Is it enough to get creative with on-page changes to try and 'de-dupe' them? Kinda tricky with the Blue Widgets example: how many ways can you say that? I could focus on the geographical element a bit more, but I'd like to rank well for Blue Widgets generally.
2. I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites (or block them via robots.txt), though that seems a bit drastic.
3. I could even link (via internal navigation) sites 2 and 3 to the site 1 Blue Widgets page, and thus make 2 Blue Widgets pages redundant?
4. Is there anything HTML coding-wise I could do to pull in site 1 content to sites 2 and 3, without cloaking or anything nasty like that?

I think 1 is the first thing to do. Anything else? Many thanks.
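For reference, option 2 above would look like this on the duplicate Blue Widgets pages. This is just a sketch (the page path is hypothetical), and note the two mechanisms behave differently: a robots.txt `Disallow` stops the page being crawled at all, while the meta robots tag lets it be crawled but asks for it to be kept out of the index.

```html
<!-- On the Blue Widgets page of sites 2 and 3, inside <head>: -->
<meta name="robots" content="noindex, nofollow">
```

The robots.txt equivalent would be a line like `Disallow: /blue-widgets.html` (hypothetical path) under the relevant `User-agent` block.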
Intermediate & Advanced SEO | | Capote0 -
This stuff works - but be patient
..or subtitled "This is how I did it". Managed to get a client's second most important keyword to #1 on Google recently (the most important is at about 5 or 6; I'm getting there with that one) 🙂 Yes, we all know 'rankings aren't important, traffic/searches are', but you know what: when a client sees their keyword at #1 it kinda helps getting paid. So, in summary, this is how I did it. I don't think there's anything earth-shattering here, but it might help a few out as I see so many 'where do I start' type posts.

1. Do everything within Webmaster Tools that is humanly possible: sitemaps, fix errors, set the preferred domain, the lot. Spend ages here!
2. Get your on-page optimisation sorted out. Decide on THE main keyword for the page, fire it into your campaign and test it. Do not give up until it's an A and you've done ALL the tips/hints (yes, even bold text on the page). It's obvious, but you have to tell a search engine exactly what the page is about, and give it a few hints too.
3. Links. Ahhh, good old links, the bane of our lives. This site was struggling for links so I bought some... yes, the shame of it. I still think the best directories are a good starting point; spent about £400 or so (BOTW, JoeAnt, HotVsNot etc.). Also, find low-hanging-fruit places for links: see where the top 3 or 4 competitors are getting links from (via OSE) and get your site linked there too. These aren't always chargeable.
4. Get rid of as many errors as you can from Crawl Diagnostics (Roger) and do everything you can to ensure the page(s) load as quickly as possible, e.g. for images, resize them and reduce colour depth.
5. Go over steps 1-4 again and again AND AGAIN.

I think that's about it (will add anything if I think of it). This all took about 4-5 months, not to do the SEO work but for Google (or any SE) to recognise it all, so thanks to all at SEOmoz, the blogs and this forum for all the assistance. (Just need to get Linkscape updated quicker now folks, I couldn't resist that one!)
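As a concrete example of the sitemap part of step 1, a minimal XML sitemap entry follows the sitemaps.org protocol; the URL and date below are placeholders, not from the actual site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blue-widgets.html</loc>
    <lastmod>2011-06-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Save it as sitemap.xml at the site root and submit it via Webmaster Tools.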
Moz Pro | | Capote1 -
FB og: tags - why?
Hi everyone, Can someone explain the benefits of adding OG tags to a site? As far as I can see/read, doing this 'enables a site to become part of the FB social graph' (or similar)... and that's what, exactly? (confused smiley) Cheers.
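For anyone else landing here: the basic Open Graph tags go in the page `<head>` and tell Facebook which title, URL, image and description to show when the page is shared or liked. A typical set looks like this (all values below are placeholders):

```html
<head prefix="og: http://ogp.me/ns#">
  <meta property="og:title" content="Blue Widgets" />
  <meta property="og:type" content="website" />
  <meta property="og:url" content="https://www.example.com/blue-widgets.html" />
  <meta property="og:image" content="https://www.example.com/images/blue-widget.jpg" />
  <meta property="og:description" content="Short description shown in the share preview." />
</head>
```

Without these, Facebook guesses the title and thumbnail from the page itself, so the main practical benefit is controlling how shared links appear.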
Social Media | | Capote1