Need to get duplicated content indexed by Google in a way that's correct
-
Usually duplicated content is a simple thing to fix.
I find myself in a little predicament:
I have a network of career-oriented websites in several countries. The problem is that for each country we use a "master" site that aggregates all the ads and works as a portal.
The smaller niche sites carry some of the same listings as the "master" sites, since that content is relevant to them too. The "master" sites have naturally ended up being the indexed version of the majority of these ads.
So the main issue is how to keep the ads on the master sites and still get the niche sites' content indexed in a way that doesn't break Google's guidelines.
I can of course tackle this in various ways, ranging from iframes (which won't get indexed) to bullet lists and small adjustments to the headers and titles of the content on the niche sites, but it feels like I'm cheating if I go down that path.
So the question is: has anyone else stumbled upon a similar problem?
If so, how did you fix it?
-
Indexation of ad-server code can prove tricky, as it often has a generic, dynamic make-up (i.e. JavaScript). It depends on how you set up the code and, ultimately, how your engine serves it to end users.
What you want to achieve is definitely possible through automation (whether via an ad server or a custom script). The variable is just how the ad server you use serves the listings. (You might like to try DoubleClick or OpenX first; both are free.)
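To make the "custom script" route concrete, here is a minimal sketch of serving the listings as server-rendered HTML, so crawlers see the same markup as users, rather than having a generic JavaScript tag inject them client-side. The Express/TypeScript stack and the fetchListings helper are assumptions for illustration, not any particular ad server's API:

```typescript
// Minimal sketch: render listings server-side so they are plain HTML in the
// response and therefore crawlable, instead of being injected by a generic
// client-side JavaScript ad tag. fetchListings is a hypothetical stand-in
// for whatever pulls listings from the shared database or feed.
import express from "express";

interface Listing {
  id: string;
  title: string;
  location: string;
}

// Hypothetical accessor for the central listings store, filtered per site.
async function fetchListings(siteTag: string): Promise<Listing[]> {
  return []; // a real implementation would query the shared listings database
}

const app = express();

app.get("/jobs", async (_req, res) => {
  const listings = await fetchListings("niche-site-a");
  const items = listings
    .map((l) => `<li><a href="/jobs/${l.id}">${l.title}</a> - ${l.location}</li>`)
    .join("\n");
  // The listing markup is part of the HTML the server returns, so it can be
  // indexed on this niche site.
  res.send(`<!doctype html><ul>${items}</ul>`);
});

app.listen(3000);
```

If the ad server only ever delivers a JavaScript snippet, the listings may never end up in the niche site's indexable HTML, which is exactly the risk described above.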
Best of luck!
Anthony
-
Thanks Anthony. I was going to try that approach from the start, with a small script handling these listings through an XML feed that's easier to control when it comes to layout (a rough sketch of what I mean is below). It seems like handling these listings as ads is the way forward.
Thnx!
PS: Do you think this will help the listings get indexed on the smaller sites as well?
.Ds
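For reference, a rough sketch of the kind of small XML-feed script described above. The jobs/job element names and the choice of fast-xml-parser are assumptions for illustration; the real feed schema will differ:

```typescript
// Rough sketch: parse an XML feed of job listings into plain objects that a
// niche site can render in its own layout. Element names are invented.
import { XMLParser } from "fast-xml-parser";

interface Job {
  title: string;
  url: string;
  summary: string;
}

export function parseJobFeed(xml: string): Job[] {
  const parser = new XMLParser();
  const doc = parser.parse(xml);
  // Assume <jobs><job>...</job></jobs>; normalise a single <job> to an array.
  const raw = doc?.jobs?.job ?? [];
  const jobs = Array.isArray(raw) ? raw : [raw];
  return jobs.map((j: any) => ({
    title: String(j.title ?? ""),
    url: String(j.url ?? ""),
    summary: String(j.summary ?? ""),
  }));
}
```

Because the niche site controls the rendering, it can also vary layout, headings and intro copy per site instead of reproducing the master site's markup verbatim.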
-
You say it's identical, so I assume it's the same piece of code calling the job listings, replicated across all the areas where you want the listings to show?
You can use an ad server for this purpose. It will give you greater control over impressions, not to mention better insight into how effective the listings are in terms of clicks and engagement. The code should be a little lighter, too.
-
The problem is that these aren't ordinary "ads" but job listings running on several sites with an identical format and URL structure. The listings all sit in our own database, which we pull them from. So the thing is: I want them to be indexed, since that would drive a lot of long-tail traffic to the niche sites.
I've never used DoubleClick but I know how it works. If this is a problem that can be solved with an ad server, please let me know.
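One way to keep these database-driven listings indexable on the niche sites without serving byte-for-byte copies of the master pages is to generate niche-specific titles and descriptions at render time (the header/title tweak mentioned in the original question). A hedged sketch, with made-up field names:

```typescript
// Sketch: same listing record, but each niche site builds its own title tag
// and meta description, so the page isn't an exact duplicate of the master
// site's version. SiteProfile fields are hypothetical.
interface Listing {
  title: string;
  location: string;
  description: string;
}

interface SiteProfile {
  name: string;          // e.g. "Niche IT jobs" (hypothetical)
  audienceLabel: string; // e.g. "developers"
}

export function renderListingHead(listing: Listing, site: SiteProfile) {
  return {
    title: `${listing.title} in ${listing.location} | ${site.name}`,
    metaDescription:
      `${listing.title} in ${listing.location} for ${site.audienceLabel}: ` +
      listing.description.slice(0, 120),
  };
}
```

This won't guarantee that Google picks the niche page over the master page, but it at least gives the niche version something distinct to index.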
-
Can you not just put your ads in an ad server such as DoubleClick and run an include across the entire network, or am I missing something here?