How do I optimize pages for content that changes everyday?
-
Hi Guys
I run daily and weekend horoscopes on my site. The daily horoscopes change every day for obvious reasons, and the weekend horoscopes change every weekend.
However, I'm stuck on how the pages should be structured. I also don't know how to go about creating title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page.
Here you can see today's horoscope: http://bit.ly/1FV6x0y. Since our weekend horoscopes cover Friday, Saturday, and Sunday, there is no daily horoscope for Friday, so the same content shows as duplicate pages across Friday, Saturday, and Sunday. If you click on Today, Tomorrow, and Weekend, all the pages shown are duplicates, and this happens for each star sign from Friday through Sunday. My question is: will I be penalized for doing this, even though the content changes?
How can I optimize the Title Tags and Meta Tags for pages that are constantly changing?
I'm really stuck on this one and would appreciate some feedback on this tricky beast.
Thanks in advance
-
Hey RedSweater,
Thanks for your in-depth response.
So firstly, regarding the pop-up modals for each star sign: are you saying that if we use AJAX, instead of the JavaScript we have now, to pull in each additional sign as you suggested, the page would be better optimized for how spiders see it? And is it the way we currently have the modals set up that creates the duplicate content effect?
You also asked: "Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records?"
No, we are not holding them for any records at all; I don't see or know of any benefit in doing so. We do currently keep backup copies offline.
And you said: "That way, you could have a permanent 'Daily Cancer' page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings. Right now, with every horoscope being on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a 'Daily Cancer' page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page."
I think I see what you're saying now. So, if I understand correctly, I can create a separate page for each star sign, and every day while updating the dailies I can go in and replace the old content with the new. That way the links stay the same, and the only things I would be tweaking are the meta description now and again and the content for each star sign.
Is that correct?
Thanks for your in-depth look into it - really appreciate it.
Justin
-
Yes - duplicate content is exactly what I mean. If you're a human visitor, it's a good way to be able to look at multiple signs quickly. I do this too - I always look at all the signs for my family. But if you're a search engine, on each sign's page you see the full text for all the signs, so it looks like there are 12 pages with almost exactly the same content.
Yes, it would be a good idea to look at competitors' sites and see how they handle it. One option might be to have each page with only that one sign's content on it, and use AJAX to pull in each additional sign - so visitors would see exactly what they're seeing now, and be able to immediately click another sign and see that horoscope, but because spiders typically don't see much content that is pulled in by JavaScript, they would see 12 unique pages to index. It looks like this competitor is doing it this way: http://astrostyle.com/daily-horoscopes/cancer-daily-horoscope/
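For illustration only, a minimal sketch of that AJAX approach: the page's HTML contains only one sign's horoscope, and the other signs are fetched on demand when clicked. The endpoint URL, JSON shape, and element IDs here are all hypothetical, not your actual markup.

```javascript
// Hypothetical endpoint for one sign's daily horoscope (assumption, not a real path).
function signUrl(sign) {
  return `/horoscopes/daily/${sign.toLowerCase()}.json`;
}

// Fetch another sign's horoscope and show it in the existing modal.
// Crawlers indexing the static HTML see only the page's own sign,
// so each of the 12 pages reads as unique content.
async function showSign(sign) {
  const response = await fetch(signUrl(sign));
  const data = await response.json(); // e.g. { sign: "Cancer", text: "..." }
  const modal = document.getElementById("sign-modal");
  modal.querySelector(".sign-name").textContent = data.sign;
  modal.querySelector(".sign-text").textContent = data.text;
  modal.style.display = "block";
}
```

Visitors still get the instant click-through-all-signs experience, but the duplicate text never appears in the served HTML.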
Another question I might ask is: why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records? You may have visitors looking at them, and if so, keep archiving as you go. But if you're not getting much traffic to them, I would suggest keeping your own backup copies offline instead.
That way, you could have a permanent "Daily Cancer" page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings.
Right now, with every horoscope being on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a "Daily Cancer" page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page.
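To make the permanent-page idea concrete, here is a hypothetical sketch of the kind of 301 mapping that would collapse dated horoscope URLs onto one permanent per-sign URL, so links people have already shared keep pointing somewhere useful. The URL patterns are assumptions about a generic site structure, not your actual paths.

```javascript
// Map an old dated horoscope URL onto the permanent per-sign page.
// e.g. "/horoscopes/2015-06-01/cancer" -> "/horoscopes/daily/cancer"
// Anything that doesn't match the dated pattern is left unchanged.
function permanentUrl(oldPath) {
  const match = oldPath.match(/^\/horoscopes\/\d{4}-\d{2}-\d{2}\/([a-z]+)$/);
  return match ? `/horoscopes/daily/${match[1]}` : oldPath;
}
```

Your server would issue a 301 (permanent) redirect from each old path to `permanentUrl(oldPath)`, so the link value from shares accumulates on one URL per sign instead of being frozen on dated pages.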
-
Hi RedSweater,
Thanks for your in-depth response.
That's a great idea about the automatic date-based title tags, like the example you gave. Would it be wise to take a look at what my competitors are using? I can't figure out how they have set themselves up for this part.
My system was custom built - when you say this:
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them.
This may be working against you because you have all the signs' content on multiple pages.
Would this create a duplicate content situation? Is that what you mean in terms of working against me?
And when you say: "I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content."
The reason we did it like that is that most people who view their own horoscope like to view their partner's, friends', or family's horoscopes as well. Would you be able to elaborate on this? I have looked at my competitors and can't quite see how they have done it.
I can provide a link to competitors' horoscope pages if that would help.
I would love to hear your feedback on this,
Thanks so much
Justin
-
Since it doesn't sound feasible to create new titles and meta descriptions by hand every time, I would come up with something automatic that's date-based. For example, your title could be "6/11/2015 Daily Horoscope for Cancer." Meta descriptions will be harder - if you're using something like WordPress with Yoast's SEO plugin, you could have it auto-create the meta description using the first line of your actual content. I'm sure you could find or create a similar auto-functionality if you're using a different platform. That way you aren't doing anything by hand, yet you have a unique title and helpful meta description for each page on the site.
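Since your system is custom built, here is a minimal sketch of what that automatic title and meta description generation could look like. The function name and the roughly-155-character snippet limit are assumptions for illustration, not part of any particular platform.

```javascript
// Build a date-based title like "6/11/2015 Daily Horoscope for Cancer"
// and a meta description taken from the first line of the actual content.
function buildMeta(sign, date, horoscopeText) {
  const title = `${date.getMonth() + 1}/${date.getDate()}/${date.getFullYear()} Daily Horoscope for ${sign}`;
  // First line of the horoscope, trimmed to a typical snippet length.
  const firstLine = horoscopeText.split("\n")[0].trim();
  const description =
    firstLine.length > 155 ? firstLine.slice(0, 152) + "..." : firstLine;
  return { title, description };
}
```

Run once per sign per day when the new horoscope is saved, this gives every page a unique title and a meta description drawn from its own content, with nothing done by hand.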
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them. This may be working against you because you have all the signs' content on multiple pages. I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content.