How do I optimize pages for content that changes every day?
-
Hi Guys
I run daily and weekend horoscopes on my site. The daily horoscopes change every day, for obvious reasons, and the weekend horoscopes change every weekend.
However, I'm stuck on how the pages need to be structured, and I don't know how I should go about creating title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page.
Here http://bit.ly/1FV6x0y you can see today's horoscope. Since our weekend horoscopes cover Friday, Saturday, and Sunday, there is no daily horoscope for Friday, so the same content appears across Friday, Saturday, and Sunday. If you click on Today, Tomorrow, and Weekend, all the pages shown are duplicates, and this happens for every star sign from Friday through Sunday. My question is: will I be penalized for doing this, even though the content changes?
How can I optimize the Title Tags and Meta Tags for pages that are constantly changing?
I'm really stuck on this one and would appreciate some feedback on this tricky beast.
Thanks in advance
-
Hey RedSweater,
Thanks for your in-depth response.
So firstly, I wanted to ask about the pop-up modals for each star sign. Are you saying that if we use AJAX, instead of the JavaScript approach we have now, to pull in each additional sign, it would be a better way to optimize the page for spiders? And that the way we currently have it creates the duplicate content effect?
Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records?
No, we are not holding them for any records, and I don't know of any benefit in doing so. We do currently keep backup copies offline.
That way, you could have a permanent "Daily Cancer" page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings.
Right now, with every horoscope being on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to say the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a "Daily Cancer" page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page.
I think I see what you're saying now. If I understand correctly, I can create a separate page for each star sign, and every day, while updating the dailies, I can go in and replace the old content with the new. That way the links stay the same; the only things I'd be tweaking are the meta description now and again and the content for each star sign.
Is that correct?
Thanks for your in-depth look into it, I really appreciate it.
Justin
-
Yes - duplicate content is exactly what I mean. If you're a human visitor, it's a good way to look at multiple signs quickly. I do this too - I always look at all the signs for my family. But if you're a search engine, on each sign's page you see the full text for all the signs, so it looks like there are 12 pages with almost exactly the same content.

Yes, it would be a good idea to look at competitors' sites and see how they handle it. One option might be to have each page contain only that one sign's content, and use AJAX to pull in each additional sign. Visitors would see exactly what they're seeing now, and be able to immediately click another sign and see that horoscope, but because spiders typically don't see much content that is pulled in by JavaScript, they would see 12 unique pages to index. It looks like this competitor is doing it this way: http://astrostyle.com/daily-horoscopes/cancer-daily-horoscope/
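To make the idea concrete, here's a minimal sketch of that AJAX approach. The page's HTML would contain only its own sign's horoscope; clicking another sign fetches that content on demand, so it never appears in the crawlable source. The `/api/horoscope/:sign` endpoint and the JSON shape are assumptions for illustration, not the site's real API:

```typescript
// Build the endpoint URL for a sign. The path is a hypothetical example.
function horoscopeUrl(sign: string): string {
  // Normalize user input ("Cancer", " LEO ") to a lowercase URL slug.
  return `/api/horoscope/${sign.trim().toLowerCase()}`;
}

// Fetch another sign's horoscope only after a click, and render it into the
// modal. Because the content arrives via JavaScript, it is not part of the
// initial HTML that most spiders index, so each page stays unique.
async function showSign(
  sign: string,
  container: { innerHTML: string } // e.g. the modal's content element
): Promise<void> {
  const res = await fetch(horoscopeUrl(sign));
  const data = await res.json(); // assumed shape: { sign: string, text: string }
  container.innerHTML = `<h2>${data.sign}</h2><p>${data.text}</p>`;
}
```

Each sign's page would still link to the other 11 signs' own URLs in plain HTML, so crawlers can discover all 12 unique pages while human visitors get the quick modal view.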
Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records? If you have visitors looking at them, keep archiving as you go. But if you're not getting much traffic to them, I would suggest keeping your own backup copies offline instead.

That way, you could have a permanent "Daily Cancer" page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings.

Right now, with every horoscope on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a "Daily Cancer" page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page.
-
Hi RedSweater,
Thanks for your in-depth response
That's a great idea about the automatic, date-based title tags like the example you gave. Would it be wise to take a look at what my competitors are using? I can't figure out how they have set themselves up for this part.
My system was custom built - when you say this:
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them.
This may be working against you because you have all the signs' content on multiple pages.
Would this create a duplicate content situation? Is that what you mean in terms of working against me?
I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content.
The reason we did it like that is that most people viewing their own horoscope like to view their partner's, friends', or family members' horoscopes as well. Would you be able to elaborate on this? I have looked at my competitors and can't quite see how they have done it.
I can provide you links to competitors' horoscope pages if that would help?
I would love to hear your feedback on this,
Thanks so much
Justin
-
Since it doesn't sound feasible to create new titles and meta descriptions by hand every time, I would come up with something automatic that's date-based. For example, your title could be "6/11/2015 Daily Horoscope for Cancer." Meta descriptions will be harder - if you're using something like WordPress with Yoast's SEO plugin, you could have it auto-create the meta description using the first line of your actual content. I'm sure you could find or create a similar auto-functionality if you're using a different platform. That way you aren't doing anything by hand, yet you have a unique title and helpful meta description for each page on the site.
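As a rough sketch of what that automation might look like on a custom-built system (function names and the 155-character snippet length are illustrative assumptions, not any particular plugin's API):

```typescript
// Generate a date-based title like "6/11/2015 Daily Horoscope for Cancer"
// automatically, so no title needs to be written by hand each day.
function dailyTitle(sign: string, date: Date): string {
  const m = date.getMonth() + 1; // Date months are 0-based
  const d = date.getDate();
  const y = date.getFullYear();
  return `${m}/${d}/${y} Daily Horoscope for ${sign}`;
}

// Auto-create the meta description from the first line of the day's actual
// horoscope text, truncated to a typical search-snippet length.
function metaDescription(horoscopeText: string, maxLen = 155): string {
  const firstLine = horoscopeText.split("\n")[0].trim();
  return firstLine.length <= maxLen
    ? firstLine
    : firstLine.slice(0, maxLen - 1).trimEnd() + "…";
}
```

Wired into the daily publishing step, this gives every page a unique title and a meta description drawn from its own content with no manual work.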
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them. This may be working against you because you have all the signs' content on multiple pages. I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content.