How do I optimize pages for content that changes every day?
-
Hi Guys
I run daily and weekend horoscopes on my site. The daily horoscopes change every day for obvious reasons, and the weekend horoscopes change every weekend.
However, I'm stuck on how the pages should be structured. I also don't know how to go about creating title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page.
Here http://bit.ly/1FV6x0y you can see today's horoscope. Since our weekend horoscopes cover Friday, Saturday, and Sunday, there is no daily for Friday, so duplicate pages appear across Friday, Saturday, and Sunday. If you click on Today, Tomorrow, and Weekend, all the pages shown are duplicates, and this happens for each star sign from Friday through Sunday. My question is: will I be penalized for doing this, even if the content changes?
How can I optimize the Title Tags and Meta Tags for pages that are constantly changing?
I'm really stuck on this one and would appreciate some feedback on this tricky beast.
Thanks in advance
-
Hey RedSweater,
Thanks for your in-depth response.
So firstly, I wanted to ask about the pop-up modals for each star sign: are you saying that if we use AJAX to pull in each additional sign, like you suggested, it would be a better way to optimize the page for spiders? And is the way we currently have them what creates the duplicate content effect?
You asked: "Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records?"
No, we are not holding them for any records at all; I don't see or know of any benefit in doing so. We do currently keep backup copies offline.
You also said: "That way, you could have a permanent 'Daily Cancer' page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings. Right now, with every horoscope being on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a 'Daily Cancer' page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page."
I think I see what you're saying now. So if I'm correct in my understanding, I can create a separate page for each star sign, and every day while updating the dailies I can go in and replace the old content with the new. That way the links stay the same, and the only things I'd be tweaking now and again are the meta descriptions and the content for each star sign.
Is that correct?
Thanks for your in-depth look into this, really appreciate it.
Justin
-
Yes - duplicate content is exactly what I mean. If you're a human visitor, it's a good way to be able to look at multiple signs quickly. I do this too - I always look at all the signs for my family. But if you're a search engine, on each sign's page you see the full text for all the signs, so it looks like there are 12 pages with almost exactly the same content.

Yes, it would be a good idea to look at competitors' sites and see how they handle it. One option might be to have each page with only that one sign's content on it, and use AJAX to pull in each additional sign - so visitors would see exactly what they're seeing now, and be able to immediately click another sign and see that horoscope, but because spiders typically don't see much content that is pulled in by JavaScript, they would see 12 unique pages to index. It looks like this competitor is doing it this way: http://astrostyle.com/daily-horoscopes/cancer-daily-horoscope/
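To make that concrete, here's a minimal sketch of the AJAX approach (the link markup, modal IDs, and JSON endpoint are assumptions for illustration, since I don't know your custom system):

```javascript
// Sketch: each sign's page is served with only its own horoscope in the
// HTML. Clicking another sign fetches that sign's text on demand, so
// crawlers that don't run JavaScript see 12 unique pages.
// Assumes links like <a class="sign-link" data-sign="leo" href="/daily/leo">
// and a hypothetical JSON endpoint at /api/horoscope/daily/<sign>.
document.querySelectorAll('.sign-link').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    fetch('/api/horoscope/daily/' + link.dataset.sign)
      .then(function (response) { return response.json(); })
      .then(function (data) {
        // Fill the modal with the fetched horoscope instead of shipping
        // all 12 signs' text in every page's source.
        document.querySelector('#modal-title').textContent = data.sign;
        document.querySelector('#modal-body').textContent = data.text;
        document.querySelector('#modal').classList.add('open');
      });
  });
});
```

Visitors still get the instant multi-sign experience, but each page's source only contains one sign's horoscope.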
Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records? You may have visitors looking at them, and if so, keep archiving as you go. But if you're not getting much traffic to them, I would suggest keeping your own backup copies offline instead.

That way, you could have a permanent "Daily Cancer" page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings.

Right now, with every horoscope being on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a "Daily Cancer" page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page.
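If it helps, here's a very rough sketch of what a permanent per-sign URL could look like on a custom-built system (Node/Express is just for illustration, and getLatestHoroscope is a hypothetical lookup, not your actual code):

```javascript
var express = require('express');
var app = express();

// Hypothetical lookup - replace with however your system stores today's text.
function getLatestHoroscope(sign) {
  return { text: 'Placeholder horoscope for ' + sign, date: new Date() };
}

// /daily/cancer, /daily/leo, etc. never change; only the content behind
// them does, so every share or inbound link keeps pointing at one page.
app.get('/daily/:sign', function (req, res) {
  var horoscope = getLatestHoroscope(req.params.sign);
  res.send('<h1>Daily ' + req.params.sign + ' horoscope for ' +
           horoscope.date.toDateString() + '</h1><p>' + horoscope.text + '</p>');
});

app.listen(3000);
```

The point is just that the URL stays stable while the content behind it rotates daily.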
-
Hi RedSweater,
Thanks for your in-depth response.
That's a great idea about the automatic date-based title tags, like the example you gave. Would it be wise to take a look at what my competitors are using? I can't figure out how they have set themselves up for this part.
My system was custom built - when you say this:
"On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them. This may be working against you because you have all the signs' content on multiple pages."
Would this create a duplicate content situation? Is that what you mean in terms of working against me?
And this part: "I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content."
The reason we did it like that is that most people viewing their own horoscope like to view their partner's, friends', or family's horoscopes as well. Would you be able to elaborate on this? I have looked at my competitors and can't quite see how they have done it.
I can provide you a link to some competitors' horoscope pages if that would help?
I would love to hear your feedback on this,
Thanks so much
Justin
-
Since it doesn't sound feasible to create new titles and meta descriptions by hand every time, I would come up with something automatic that's date-based. For example, your title could be "6/11/2015 Daily Horoscope for Cancer." Meta descriptions will be harder - if you're using something like WordPress with Yoast's SEO plugin, you could have it auto-create the meta description using the first line of your actual content. I'm sure you could find or create a similar auto-functionality if you're using a different platform. That way you aren't doing anything by hand, yet you have a unique title and helpful meta description for each page on the site.
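As a rough illustration of that kind of auto-generation (plain JavaScript, not tied to any platform; the function and field names are made up):

```javascript
// Sketch: build a unique title and meta description automatically from the
// date and the horoscope's own first sentence.
function buildMeta(sign, date, horoscopeText) {
  var formatted = (date.getMonth() + 1) + '/' + date.getDate() + '/' +
                  date.getFullYear();
  return {
    title: formatted + ' Daily Horoscope for ' + sign,
    // First sentence of the real content, capped near the ~155 characters
    // search engines typically display.
    description: horoscopeText.split('.')[0].slice(0, 155)
  };
}

// buildMeta('Cancer', new Date(2015, 5, 11), 'A surprise at work changes your plans. More...')
// -> { title: '6/11/2015 Daily Horoscope for Cancer',
//      description: 'A surprise at work changes your plans' }
```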
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them. This may be working against you because you have all the signs' content on multiple pages. I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content.