How do I optimize pages for content that changes everyday?
-
Hi Guys
I run daily and weekend horoscopes on my site. The daily horoscopes change every day, for obvious reasons, and the weekend horoscopes change every weekend.
However, I'm stuck on how the pages should be structured. I also don't know how to go about creating title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page.
As you can see here http://bit.ly/1FV6x0y, that's today's horoscope. Since our weekend horoscopes cover Friday, Saturday, and Sunday, there is no daily horoscope for Friday, so the same content shows up as duplicate pages across Friday, Saturday, and Sunday. If you click on Today, Tomorrow, and Weekend, all of the pages shown are duplicates, and this happens for each star sign from Friday through Sunday. My question is: will I be penalized for doing this, even though the content changes?
How can I optimize the Title Tags and Meta Tags for pages that are constantly changing?
I'm really stuck on this one and would appreciate some feedback into this tricky beast.
Thanks in advance
-
Hey RedSweater,
Thanks for your in-depth response.
So, firstly, I wanted to ask about the pop-up modals for each star sign. Are you saying that if we use AJAX instead of JavaScript to pull in each additional sign, as you suggested, it would be a better way to optimize the page for spiders? And does the way we currently have them create the duplicate-content effect?
Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records?
No, we are not holding them for any records at all, not that I see or know if there is any benefit in doing so. We currently do keep backup copies offline.
That way, you could have a permanent "Daily Cancer" page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings.
Right now, with every horoscope on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a "Daily Cancer" page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page.
I think I see what you're saying now. So, if I understand correctly, I can create a separate page for each star sign, and every day, while updating the dailies, I can go in and replace the old content with the new. That way the links stay the same, and the only things I'd be tweaking are the meta description now and again and the content for each star sign.
Is that correct?
Thanks for your in-depth look into it - really appreciate it.
Justin
-
Yes - duplicate content is exactly what I mean. If you're a human visitor, it's a good way to look at multiple signs quickly. I do this too - I always look at all the signs for my family. But if you're a search engine, on each sign's page you see the full text for all the signs, so it looks like there are 12 pages with almost exactly the same content.

Yes, it would be a good idea to look at competitors' sites and see how they handle it. One option might be to have each page contain only that one sign's content, and use AJAX to pull in each additional sign. Visitors would see exactly what they're seeing now, and be able to immediately click another sign and see that horoscope, but because spiders typically don't see much content that is pulled in by JavaScript, they would see 12 unique pages to index. It looks like this competitor is doing it this way: http://astrostyle.com/daily-horoscopes/cancer-daily-horoscope/
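To make the AJAX idea concrete, here is a minimal sketch. The /api/horoscope/ endpoint and the { sign, text } response shape are my own assumptions for illustration - your custom-built system would use whatever URL and payload it already has. The point is that only one sign's text lives in the page's HTML; the others are fetched on click, so crawlers index one unique sign per page:

```javascript
// Build the URL for one sign's horoscope. Endpoint path is hypothetical.
function horoscopeUrl(sign) {
  return "/api/horoscope/" + encodeURIComponent(sign.toLowerCase());
}

// Fetch another sign's text on demand and drop it into the modal.
// Assumed response shape: { "sign": "cancer", "text": "..." }
async function showSign(sign, container) {
  const response = await fetch(horoscopeUrl(sign));
  const data = await response.json();
  container.textContent = data.text;
}
```

Wired to each sign's link or modal trigger, this keeps the current one-click experience for visitors while each crawlable page carries only its own sign's content.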
Another question I might ask is: why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records? You may have visitors looking at them, and if so, keep archiving as you go. But if you're not getting much traffic to them, I would suggest keeping your own backup copies offline instead.

That way, you could have a permanent "Daily Cancer" page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings.

Right now, with every horoscope on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a "Daily Cancer" page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page.
-
Hi RedSweater,
Thanks for your in-depth response.
That's a great idea about the automatic, date-based title tags, like the example you gave. Would it be wise to take a look at what my competitors are using? I can't figure out how they have set themselves up for this part.
My system was custom built - when you say this:
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them.
This may be working against you because you have all the signs' content on multiple pages.
Would this create a duplicate content situation? Is that what you mean in terms of working against me?
I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content.
The reason we did it like that is that most people who view their own horoscope like to view their partners', friends', or family members' horoscopes as well. Would you be able to elaborate on this? I have looked at my competitors and can't quite see how they have done it.
Can I provide you with a link to competitors' horoscope pages?
I would love to hear your feedback on this,
Thanks so much
Justin
-
Since it doesn't sound feasible to create new titles and meta descriptions by hand every time, I would come up with something automatic that's date-based. For example, your title could be "6/11/2015 Daily Horoscope for Cancer." Meta descriptions will be harder - if you're using something like WordPress with Yoast's SEO plugin, you could have it auto-create the meta description from the first line of your actual content. I'm sure you could find or create similar automatic functionality if you're using a different platform. That way you aren't doing anything by hand, yet you have a unique title and a helpful meta description for each page on the site.
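As a rough illustration of the automatic, date-based approach, here's a sketch in JavaScript. The M/D/YYYY format and the 155-character description cap are my assumptions, not requirements of any platform; the description is simply the first sentence of the horoscope itself, the way an SEO plugin might auto-generate it:

```javascript
// Build a date-based title like "6/11/2015 Daily Horoscope for Cancer".
function dailyTitle(sign, date) {
  const mm = date.getMonth() + 1; // getMonth() is zero-based
  const dd = date.getDate();
  const yyyy = date.getFullYear();
  return mm + "/" + dd + "/" + yyyy + " Daily Horoscope for " + sign;
}

// Auto-create a meta description from the first sentence of the content,
// trimmed to a typical snippet length (155 chars is an assumed limit).
function metaDescription(horoscopeText, maxLength = 155) {
  const match = horoscopeText.match(/^[^.!?]*[.!?]/);
  const firstSentence = match ? match[0] : horoscopeText;
  return firstSentence.length <= maxLength
    ? firstSentence
    : firstSentence.slice(0, maxLength - 1).trimEnd() + "…";
}
```

Run once per sign when the day's horoscope is published, this gives every page a unique title and a description that actually reflects that day's content, with no manual work.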
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them. This may be working against you because you have all the signs' content on multiple pages. I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content.