Same content, different target area SEO
So, OK: I have a gambling site that I want to target separately at Australia, Canada, the USA and England, while still keeping the .com for worldwide (or not, read further). The sites' content will basically stay the same for all of them, perhaps with just small changes to the layout and the order of information (e.g. a different order for the top 10 gambling rooms).
My question 1 would be:
How should I mark up the content for Google and other search engines so that it is not considered "duplicate content"?
As I have mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs.
What I have thought of so far:
1. A separate Webmaster Tools account for every domain, with geographic targeting set to the specific country in each one.
2. Use hreflang tags to indicate that the .co.uk content is for GB users ("en-GB"), and the equivalent for the other domains (a sketch of what these annotations could look like is below). More info: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address for each site (the physical location of the server is not hugely important, just the IP).
4. Ideally, the IP address for the .co.uk would be from a different C-class than the one for the .com.
Is there anything I am missing here?
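To make point 2 concrete, here is a minimal sketch of the hreflang annotations, written in TypeScript simply to show the tags it would output. The domains are hypothetical placeholders, and whether the .com carries "en-US", a generic "en", or both depends on how question 2 below is answered.

```typescript
// Hypothetical example domains -- swap in the real country sites.
const alternates: Record<string, string> = {
  "en-AU": "http://www.example.com.au/",
  "en-CA": "http://www.example.ca/",
  "en-GB": "http://www.example.co.uk/",
  "en-US": "http://www.example.com/", // if the .com ends up targeting the US
  "en":    "http://www.example.com/", // and/or as the generic worldwide English version
};

// Every country version of a given page should carry the full set of
// <link rel="alternate"> tags in its <head>, including one pointing to itself.
function hreflangLinks(map: Record<string, string>): string {
  return Object.entries(map)
    .map(([lang, url]) => `<link rel="alternate" hreflang="${lang}" href="${url}" />`)
    .join("\n");
}

console.log(hreflangLinks(alternates));
```

The same mapping can also be declared in an XML sitemap instead of on-page tags, which can be easier to maintain across several domains.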
Question 2:
Should I target the USA market with the .com, or are there other options? (We are not based in the USA, so I believe .us is out of the question.)
Thank you for your answers.
T
Related Questions
Which Is More Important for SEO: Backlinks or Internal Links?
White Hat / Black Hat SEO | BBT-Digital
Adult Toy Store SEO
Hi fellows, I'm no stranger to SEO. I have been promoting our spiritual network through SEO and we have received great returns from it. I'm now planning to promote an adult toy store via SEO. I have never done any adult store promotion before, but I think there are a lot of downsides to it, such as:
#1 When I search related keywords, many porn websites show up; I assume that looks spammy in Google's eyes. Also, most of the links I will get are probably from porn websites due to relevancy.
#2 Many of our returning customers come from retargeting, but I assume there is no adult promotion via Google Display. Is that right? (It's not SEO related.)
I'm wondering whether Google is against adult content in any way. Any feedback is appreciated.
White Hat / Black Hat SEO | Arian-Ya
Have just submitted a disavow file to Google: shall I wait until after they have removed the bad links to start a new content-led SEO campaign?
Hi guys, I am currently conducting some SEO work for a client. Their previous SEO company had built a lot of low-quality/spam links to their site, and as a result their rankings and traffic have dropped dramatically. I have analysed their current link profile and have submitted the spammiest domains to Google via the Disavow tool. The question I have is: do I wait until Google has dealt with the spam links I submitted and then start the new content-based SEO campaign, or would it be okay to start the content-based campaign now, even though the current spam links haven't been removed yet? Look forward to your replies on this...
White Hat / Black Hat SEO | sanj5050
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).

So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.

So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:

"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as:

'Our [Topic Area] training is easy to find in the [City, State] area.' Followed by other content specific to the location.
'Find your [Topic Area] training course in [City, State] with ease.' Followed by other content specific to the location.

Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
SEO Backlink Proposal Review
Hi guys, below is a proposal I received from someone on freelancer.com for some SEO link building. Is this really all it takes? Obviously it is done over time, but basically is this it, aside from the usual on-site basics (keywords, URLs, articles, content etc.)? This is the proposal for $250 (some are cheaper but with almost the same details as below). This is one of the top SEO people on freelancer.com and they all have good reviews. Is this basically it? Shell out $100 bucks or more a month to someone who will just post stuff all over the internet? It just seems all very simple; what is $100 bucks a month to stay at #1? Are there any real questions I should ask to make sure I am not just throwing my money away?

The proposal reads as follows: I would like to recommend the following services for attaining better search results for the website.
1) Press Release Submissions
2) Social bookmarking submissions
3) Drip Feed Article Links - 100 article submissions every day for 25 days
4) Article directory submissions
5) Link directory submissions
6) Blog Post Submissions (all blogs have PR1 to PR6)
7) Wiki Page Submissions (.EDU and .GOV sites included)

PR of the directories, social bookmarking websites, blogs, wiki pages and article directories are from PR0 to PR8. Most of them are in the range of PR1 to PR4. If you are interested in the above services then these are the details about those services.

1) Press release submissions - We will write 3 press releases and submit them to 25 press release websites. Submitting press releases gets the news to Google News, Yahoo News etc. Please note we even submit to paid press release websites like PRBuzz, SBWire, pressdoc etc.

2) Social bookmarking submissions - I will also submit your website to 150 social bookmarking websites. Here are examples of social bookmarking websites: www.digg.com, www.furl.net. After we finish submitting to social bookmarking websites we will then create RSS feeds with the approved link URLs and ping them so that the links get indexed.

3) Drip feed article submissions - We will be writing one article. Every day we will submit the article to 100 different websites. We will be submitting for 25 days. 100 submissions x 25 days = 2500 submissions. In each article submission we can use 2 links to the website.

4) Article directory submissions - We will write 5 articles. Each article will be around 500 words. Then we will submit them to 300 different article directories. That means 5 articles x 300 article directories = 1500 article submissions. In each article we can use 2 links to the website. 1500 x 2 links. I have experience in submitting articles to article directories. Till now I have submitted more than 1000 articles to article directories. I will also create separate accounts with article directories wherever possible.

5) Link directory submissions - I have a list of 1300 directories. I will submit your website to these directories. I have experience in submitting to link directories. Till now I have submitted more than 2500 websites. All the submission work is done manually. All these directories provide one-way links.

6) Blog post submissions (700 PR1 to PR6 blogs) - We will write 1 article. We spin and post to 700 PR1 to PR5 blogs. We can spin the article, the title of the article and the links. You will be given a confirmation when complete, and a code to search backlinks in the search engines. They are hosted on 650 different C Class IPs!

7) Wiki page submissions - Get 200+ wiki site contextual backlinks (3 per posted article) from a range of PR 0 to 8 wiki sites, including over 30 US .EDU and US .GOV sites. I will also ping them.
White Hat / Black Hat SEO | topclass
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that takes the presentation layer away from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is being served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".

Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more

Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."

Basically, what would be happening on the backend of our site is that we would be detecting the user-agent of all traffic, and once we found a search bot, serve up our web pages server-side instead of client-side to the bots so they can index our site. Server-side and client-side will be identical content and there will be NO black hat cloaking going on. The content will be identical. But this technique is cloaking, right?

From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."

Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com
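Purely as an illustration of the user-agent branch this question describes (the crawler pattern and function are placeholder assumptions, and Dust.js's actual server-side rendering API is not shown), the decision point would look something like this, with the key property being that both branches render the same template from the same data:

```typescript
// Illustrative, non-exhaustive list of crawler user-agent substrings.
const CRAWLER_PATTERN = /googlebot|bingbot|slurp|baiduspider|yandexbot/i;

type RenderMode = "server" | "client";

// Crawlers get HTML rendered on the server; everyone else gets the page shell
// plus data, and the same Dust template is rendered client-side in the browser.
function renderModeFor(userAgent: string | undefined): RenderMode {
  return userAgent && CRAWLER_PATTERN.test(userAgent) ? "server" : "client";
}

console.log(renderModeFor("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")); // "server"
console.log(renderModeFor("Mozilla/5.0 (Windows NT 6.1; rv:17.0) Gecko/20100101 Firefox/17.0"));        // "client"
```

The concern raised above is exactly that this user-agent check is the mechanism the Wikipedia definition mentions, even when the rendered output is identical in both branches.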
HOW TO: City Targeted Landing Pages For Lead Generation
Hi guys, So one of my clients runs a web development agency in San Diego, and for lead generation purposes we are thinking of creating city-targeted landing pages for him which will all be on different domains, i.e. lawebdesginstudio / sfwebdesigngurus. I plan to register these 20-30 domains for my client and load them all up on the single Linux server I have from GoDaddy.

I noticed today, however, using Google's keyword tool, that roughly only 5-10 cities have real traffic worth trying to capture and turn into leads. Therefore I am not sure if it's even worth building those extra 20 landing pages, since they will receive very little traffic. My only thought is, if I do decide to build all 30 landing pages, then I assume I will have a very strong private network of authority websites that I can use to point to the client's website. I mean, I figure I can rank almost all of them page 1, top 5, within 2-3 months.

My questions are:
1. Do city-targeted microsites for the purpose of lead generation still work? If so, are there any threads that have more info on this topic?
2. Do you suggest I interlink all 30 sites together and perhaps point them all to the money site? If so, I'm wondering if I should diversify the IPs that I used to register the domains, as well as the WHOIS info.

Thanks guys, all help is appreciated!
White Hat / Black Hat SEO | AM213
SEO Experiment with Google Docs
Please check out this doc: https://docs.google.com/document/d/19VS4SnVvq6VJHQAIrB3CX7iL1ivZU4DH6fyfrHLsNFk/edit
Any insights will be highly appreciated! Oleksiy
White Hat / Black Hat SEO | wcrfintl