How do I optimize pages for content that changes every day?
-
Hi Guys
I run daily and weekend horoscopes on my site. The daily horoscopes change every day for obvious reasons, and the weekend horoscopes change every weekend.
However, I'm stuck on how the pages need to be structured. I also don't know how I should go about creating title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page.
Here http://bit.ly/1FV6x0y you can see today's horoscope. Since our weekend horoscopes cover Friday, Saturday and Sunday, there is no daily for Friday, so duplicate pages appear across Friday, Saturday and Sunday. If you click on Today, Tomorrow and Weekend, all the pages shown are duplicates, and this happens for each star sign from Friday through Sunday. My question is: will I be penalized for doing this, even if the content changes?
How can I optimize the Title Tags and Meta Tags for pages that are constantly changing?
I'm really stuck on this one and would appreciate some feedback on this tricky beast.
Thanks in advance
-
Hey RedSweater,
Thanks for your in-depth response.
So firstly, with regard to the pop-up modals for each star sign: are you saying that if we use AJAX to pull in each additional sign, as you described, that would be a better way to optimize the page for spiders? And that the way we currently have them creates the duplicate content effect?
You asked: "Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records?"
No, we are not holding them for any records at all; I don't know of any benefit in doing so. We do currently keep backup copies offline.
You said: "That way, you could have a permanent 'Daily Cancer' page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings. Right now, with every horoscope being on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a 'Daily Cancer' page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page."
I think I see what you're saying now. If I understand correctly, I can create a separate page for each star sign, and every day while updating the dailies I can go in and replace the old content with the new. That way the links stay the same; the only things I'd be tweaking are the meta description now and again and the content for each star sign.
Is that correct?
Thanks for your in-depth look into it, really appreciate it.
Justin
-
Yes - duplicate content is exactly what I mean. If you're a human visitor, it's a good way to be able to look at multiple signs quickly. I do this too - I always look at all the signs for my family. But if you're a search engine, on each sign's page you see the full text for all the signs, so it looks like there are 12 pages with almost exactly the same content.

Yes, it would be a good idea to look at competitors' sites and see how they handle it. One option might be to have each page with only that one sign's content on it, and use AJAX to pull in each additional sign - so visitors would see exactly what they're seeing now, and be able to immediately click another sign and see that horoscope, but because spiders typically don't see much content that is pulled in by JavaScript, they would see 12 unique pages to index. It looks like this competitor is doing it this way: http://astrostyle.com/daily-horoscopes/cancer-daily-horoscope/
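To make the idea concrete, here's a minimal server-side sketch of that structure (all names, paths and templating are hypothetical, not your actual system): each sign's page puts only that sign's horoscope in the initial HTML that crawlers index, while the other signs are plain links that your client-side JavaScript could enhance into AJAX-loaded modals.

```python
SIGNS = ["aries", "taurus", "gemini", "cancer", "leo", "virgo",
         "libra", "scorpio", "sagittarius", "capricorn", "aquarius", "pisces"]

def render_sign_page(sign: str, horoscopes: dict) -> str:
    """Render one sign's page. Only this sign's text appears in the
    HTML a crawler sees; the other 11 signs are ordinary links that
    client-side script can fetch and show in a modal on click."""
    others = "".join(
        f'<li><a href="/daily/{s}/">{s.title()}</a></li>'
        for s in SIGNS if s != sign
    )
    return (
        f"<h1>Daily {sign.title()} Horoscope</h1>"
        f"<p>{horoscopes[sign]}</p>"
        f"<ul class='other-signs'>{others}</ul>"
    )

page = render_sign_page("cancer", {s: f"{s} text" for s in SIGNS})
```

Visitors clicking "Aries" still get an instant modal (via the AJAX fetch), but each of the 12 URLs now carries unique indexable content instead of all 12 horoscopes.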
Another question I might ask is, why are you archiving all the daily horoscopes? Are you seeing visitors to old horoscopes, or are you holding onto them for your own records? You may have visitors looking at them, and if so, keep archiving as you go. But if you're not getting much traffic to them, I would suggest keeping your own backup copies offline instead.

That way, you could have a permanent "Daily Cancer" page with a more refined meta description that never changes, and every day you just go in and edit that sign. Same for all the other signs, and same for the weekly sign pages. Anytime people link to these pages, they would keep that value - and I think that might help your rankings.

Right now, with every horoscope being on its own separate daily page, say 2 people share that link on Facebook. Those links go straight to, say, the Cancer horoscope for June 1, 2015. That means they're kind of frozen in time. If instead you had a "Daily Cancer" page and they linked there, and then a week later they shared it again because they liked the new horoscope, you'd have 4 incoming links to the same page.
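If you do consolidate onto permanent sign pages, you'd want the old dated URLs (and any links already pointing at them) to pass their value along rather than 404. A hypothetical sketch, assuming the old URLs follow a date-then-sign pattern (your actual URL scheme will differ): 301-redirect each dated URL to the evergreen page for that sign.

```python
import re

# Hypothetical old URL scheme: /2015-06-01/cancer/ (date, then sign).
# Each one gets a permanent 301 to the evergreen /daily/cancer/ page,
# so links shared on any given day keep accruing to a single URL.
DATED = re.compile(r"^/(\d{4}-\d{2}-\d{2})/([a-z]+)/$")

def redirect_for(path: str):
    """Return (status, location) for an old dated URL, or None
    if the path is not a dated horoscope page."""
    m = DATED.match(path)
    if m:
        return (301, f"/daily/{m.group(2)}/")
    return None

print(redirect_for("/2015-06-01/cancer/"))  # (301, '/daily/cancer/')
```

The key design point is that the rule is pattern-based, so you never have to maintain a per-day redirect list as the archive grows.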
-
Hi RedSweater,
Thanks for your in-depth response.
That's a great idea about the automatic date-based title tags, like the example you gave. Would it be wise to take a look at what my competitors are using? I can't figure out how they have set themselves up for this part.
My system was custom built - when you say this:
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them.
This may be working against you because you have all the signs' content on multiple pages.
Would this create a duplicate content situation? Is that what you mean in terms of working against me?
You also said: "I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content."
The reason we did it like that is that most people viewing their own horoscope like to view their partner's, friends' or family's horoscopes as well. Would you be able to elaborate on this? I have looked at my competitors and can't quite see how they have done it.
I can provide a link to competitors' horoscope pages if that would help?
I would love to hear your feedback on this,
Thanks so much
Justin
-
Since it doesn't sound feasible to create new titles and meta descriptions by hand every time, I would come up with something automatic that's date-based. For example, your title could be "6/11/2015 Daily Horoscope for Cancer." Meta descriptions will be harder - if you're using something like WordPress with Yoast's SEO plugin, you could have it auto-create the meta description using the first line of your actual content. I'm sure you could find or create a similar auto-functionality if you're using a different platform. That way you aren't doing anything by hand, yet you have a unique title and helpful meta description for each page on the site.
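A minimal sketch of that kind of automation (function names hypothetical; on WordPress, Yoast's templates would do this for you, but the same logic is easy to reproduce on a custom platform): build the title from the date and sign, and reuse the first line of the horoscope itself as the meta description.

```python
from datetime import date

def auto_title(sign: str, d: date) -> str:
    # Produces e.g. "6/11/2015 Daily Horoscope for Cancer"
    return f"{d.month}/{d.day}/{d.year} Daily Horoscope for {sign.title()}"

def auto_meta_description(body: str, limit: int = 155) -> str:
    """Use the first line of the horoscope as the meta description,
    trimmed to a typical search-snippet length."""
    first_line = body.strip().splitlines()[0]
    if len(first_line) <= limit:
        return first_line
    return first_line[:limit - 1].rstrip() + "…"

print(auto_title("cancer", date(2015, 6, 11)))  # 6/11/2015 Daily Horoscope for Cancer
```

Run once as part of publishing each day's horoscope, this gives every page a unique title and a description drawn from its own content, with no hand-editing.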
On a related note, I noticed the horoscopes for all the other signs actually appear on the same page, just in a modal, when you click on them. This may be working against you because you have all the signs' content on multiple pages. I would try to find a way to keep only each sign's content on its page and link off to the other signs' individual pages for people who want to view multiple signs. That should really reinforce to crawlers that each page has unique content.