URL Structure for Deal Aggregator
-
I have a website that aggregates deals from various daily deals sites. I originally had all the deals on one page, /deals, but I thought it might be more useful to have several pages, e.g. /beautydeals or /hoteldeals. However, if I give every section its own page, I either have no current deals on the main /deals page or I have duplicate content. I'm wondering what might be the best approach here? A few of the options that come to mind are:
1. Return to having all the deals on one page /deals and linking internally to content within that page
2. Have both a main /deals page with all of the deals plus other pages such as /beautydeals, but add rel="canonical" on those pages pointing to the main /deals page
3. Create new content for the /deals page... however I think people will probably want to see at least some deals straight away, rather than having to click through to another page.
4. Display some sub-categories on the main /deals page, but have separate URLs for other, more popular sub-categories, e.g. /beautydeals (this is how it works at the moment)
I should probably point out that the site also has other content, such as events and a directory. Any suggestions on how best to approach this would be much appreciated!
Cheers,
Andy
-
Could you consider a version of option 4?
The main deals page could feature the top 3-5 deals from each of your most popular sub-categories, and your category pages would then feature all the deals for that category, so there shouldn't be a duplicate content issue with this approach. It also seems like a user-centric approach because it allows people to see a variety of deals on the main page, curated by popularity of category and the top 3-5 best sellers within each category.
Does that make sense?
-
Hi Andy
I think it's very wise of you to have considered this potential duplicate content problem.
Having a rel=canonical tag on the separate category pages, or even a meta noindex tag on them, would make sure that the URL is not indexed in Google, thus removing any potential duplicate content.
I can't really see a way of having both a main deals URL and a category deals URL indexed and ranking because, as you have said, the pages would either have zero content or duplicate content.
With that in mind, I think your current format is the best one. Having a big /deals page with all your offers on it will hopefully provide lots of rich content, so that people link to the page, which in turn will help it rank for a number of keywords - while you also allow people to filter down and get to what they want. Just make sure that the separate sub-category pages have either enough unique content on them, or a canonical/meta noindex tag on them, to avoid a duplicate content issue.
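For illustration, a sub-category page such as /beautydeals could carry either of these in its head (a minimal sketch, with example.com standing in for the real domain):

<link rel="canonical" href="https://www.example.com/deals">

to consolidate ranking signals to the main /deals page, or:

<meta name="robots" content="noindex, follow">

to keep the page out of the index entirely while still letting the deal links on it be crawled.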
I would say, though, that as your site aggregates daily deals, you are still at a bit of a risk of supplying duplicate content from other sites. I'm not sure how the deals are fed in, but if you get a rush of deals en masse from one website and the deals' titles and descriptions are all the same, this might also be seen as duplicate. If you can offset this with unique content on the page and a system for putting in your own titles/descriptions, then it shouldn't be a problem.
Good question, this - I'd be interested to see some other Mozzers' POVs.
Related Questions
-
Category URL with resultsPerPage loop
Hi there, We upgraded our webshop last weekend and our Moz crawl on Monday found a lot of errors we are trying to fix. I am having some communication problems with our webmaster, so I need a little help. We have extremely long category page URLs; does anyone have a guess what kind of mistake our webmaster could have made:
https://site-name.pl/category-name?page=3?resultsPerPage=53?resultsPerPage=53 ... and it keeps on repeating the string ?resultsPerPage=53 exactly 451 times, as if there were some kind of loop. Thanks in advance for any kind of hint 🙂
Kind regards,
Isabelle
Technical SEO | isabelledylag
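(Presumably the intended URL would append each additional parameter with & rather than opening a new query string with ? every time, e.g. https://site-name.pl/category-name?page=3&resultsPerPage=53 - the repeated ?resultsPerPage=53 suggests the pagination links are built by appending to the full current URL, query string included, on each page load.)
-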
Which URL structure holds the best SEO value?
Hello Community! We are rewriting URLs to rank better and provide better visual usability to our visitors. Would it be better to serve a URL that looks like this: www.domain.com/category-subcategory or www.domain.com/category/subcategory? Please note the slight difference - the second URL calls out a category that has a subcategory under it. Would it give us more value? Does it make a difference? Thanks in advance!
Technical SEO | JCorp
-
.htaccess: Multiple URLs catching the same file
Hi, I have the following line in my .htaccess:
RewriteRule privacy stdpage.php?slug=privacy [L]
So if you go to www.mysite.com/privacy, it serves stdpage.php with the argument above. But if you go to www.mysite.com/privacysssssssss, it catches the same file. How can I prevent this? It will give me multiple URLs with the exact same content. I have a 404 page which I would like to show instead when the match is not 100%.
-Rasmus
Technical SEO | rasmusbang
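(A likely fix, as a sketch: the unanchored pattern matches any path containing "privacy", so anchoring it limits the rule to the exact URL and lets everything else fall through to the 404:)
RewriteRule ^privacy$ stdpage.php?slug=privacy [L]
-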
Spider Indexed Disallowed URLs
Hi there, In order to reduce the huge amount of duplicate content and titles for a client, we disallowed all spiders from some areas of the site in August via the robots.txt file. This was followed by a huge decrease in errors in our SEOmoz crawl report, which, of course, made us satisfied. In the meanwhile, we haven't changed anything in the back-end, robots.txt file, FTP, website or anything. But our crawl report came in this November and all of a sudden all the errors were back. We've checked the errors and noticed URLs that are definitely disallowed. The disallowing of these URLs is also verified by our Google Webmaster Tools, by other robots.txt checkers, and when we search for a disallowed URL in Google, it says that it's blocked for spiders. Where did these errors come from? Was it the SEOmoz spider that broke through our disallow rules, or something else? You can see the drop and the increase in errors in the attached image (LAAFj.jpg). Thanks in advance.
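(For reference, the sort of blanket disallow described would look something like this in robots.txt - a sketch with a hypothetical path:)
User-agent: *
Disallow: /duplicate-area/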
Technical SEO | ooseoo
-
Old URL redirect to New URL
Alright, I did something dumb a year ago and I'm still paying for it. I changed my hyphenated URL to the non-hyphenated version when I redesigned my website. I say it was dumb because I lost most of my link juice even though I did 301 redirects (via the .htaccess file) for almost all of the pages I could find in Google's index. Here's my problem: my new site took a huge hit in traffic (down 60%) when I made the change, and even though I've done thousands of redirects, my old site is still showing up in the SERPs and sends much, if not most, of my traffic. I don't want to take the old site down for fear it will kill all of my traffic. What should I do? Is there a better method I should explore than 301 redirects? Could the other site be affecting my current rank since it's still there? (FYI... both sites are built on the WP platform.) Any help or ideas are greatly appreciated. Thank you! Joe
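(A domain-wide 301 in the old site's .htaccess would look something like this - a sketch with placeholder domains, forwarding every path in one rule rather than page-by-page:)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example-site\.com$ [NC]
RewriteRule ^(.*)$ https://www.examplesite.com/$1 [R=301,L]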
Technical SEO | kaje
-
Dealing with 404 pages
I built a blog on my root domain while I worked on another part of the site at .....co.uk/alpha. I was really careful not to have any links go to /alpha - but it seems Google found and indexed it. The problem is that part of /alpha was a copy of the blog - so soon we will have a lot of duplicate content. The /alpha part is now ready to be taken over to the root domain; the initial plan was to then delete /alpha. But now that it's indexed, I'm worried that I'll have all these 404 pages. I'm not sure what to do. I know I can just do a 301 redirect from all those pages to the root-domain ones in case a link comes in, but I need to delete those pages as the server is already very slow. Or does a 301 redirect mean that I don't need those pages anymore? Will those pages still get indexed by Google as separate pages? Please assist.
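(For what it's worth, once /alpha is deleted, a single .htaccess rule can forward the whole section to the root-domain equivalents - a sketch, assuming the paths map one-to-one:)
RedirectMatch 301 ^/alpha/(.*)$ /$1
With a rule like this in place, the underlying files don't need to exist for the redirect to fire.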
Technical SEO | borderbound
-
Are URLs with a trailing slash seen as two different URLs?
Hello, http://www.example.com and http://www.example.com/ - are these seen as two different URLs, just as with www vs non-www? Or does it make no difference?
Technical SEO | seoug_2005
-
URL Structure
Hi Guys, I'm in the process of creating a very exciting startup aimed at the baby industry. It's essentially a social commerce site where parents can shop for products, create lists of products and ask questions. The challenge I'm facing is how best to structure my URLs from an SEO standpoint. For example, a common baby topic such as "feeding" can sit in all three categories:
Shopping category aggregates all products related to feeding
List category aggregates all lists related to feeding
Question category aggregates all questions and answers on feeding
So for the keyword "feeding" you have three potential landing pages. What I was wondering is: what is the most effective way of doing it? I was thinking of something along these lines:
/shopping/feeding
/baby_list/feeding
/ask/feeding
Would love to hear your points of view on this. Thanks! Walid
Technical SEO | walidalsaqqaf