Many pages with small unique content vs. one page with big content
-
Dear all,
I am redesigning some areas of our website, eurasmus.com, and we are not clear on the best option to follow. On our site we have a city area, e.g. www.eurasmus.com/en/erasmus-sevilla, which we are going to redesign, and a guide area where we explain the city, etc.: http://eurasmus.com/en/erasmus-sevilla/guide/ (all with unique content).
The thing is that, at this point, due to lack of resources our guide is not really deep, and we believe that a page with only 500 characters of text for every topic (transport, etc.) does not add extra value for users. It is not very user friendly either.
On the other hand, these pages are getting some long-tail results, though not for our target keyword. For example, they rank for "transport in sevilla", while our target keyword would be "erasmus sevilla".
When redesigning the city area, we have to choose between:
a) www.eurasmus.com/en/erasmus-sevilla -> with all the content on one page, about 2,500 unique characters.
b) www.eurasmus.com/en/erasmus-sevilla -> with a better amount of content and a nice redesign, but keeping the guide pages.
What would you choose?
Let me know what you think.
Thanks!
-
Wow, Jose, you got a whole audit from Luis.
1. Luis makes a good point about Seville vs Sevilla. When you're trying to target a region other than your own, make sure that you change the location in Google Keyword Planner. Seville is the English version of Sevilla (which I know sounds strange, but we also call your country Spain rather than España).
2. Both subdomains and subfolders can effectively designate different languages. If you've made the call to use subfolders, that's fine. It's probably what I would have done, too, since that means the Domain Authority will transfer easily.
4 & 5. Keyword repetition in URLs isn't necessarily bad in your case, because it's caused by a lot of subfolders.
It seems like there's been some debate here on more subfolders vs. fewer: there isn't a hard and fast rule about it. If you have more subfolders, the pages higher up in the structure tend to get more link equity out of the deal and rank better. That takes away from deeper pages, though, which are presumably targeting the most important words. If you use fewer subfolders, the link equity will be more evenly distributed, which means that higher-level pages will be weaker and deeper pages will be stronger. In your case, I don't know the answer, since I don't know how competitive the different keywords are at different levels. If I were your SEO, I'd tell you to stick with your current URL structure, because moving pages to new URLs tends to cause a big knock in rankings for a while.
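To make the link-equity trade-off concrete, here is a toy sketch in pure Python: power-iteration PageRank over two hypothetical internal-link structures, one deep (guide pages funnel through a city hub) and one flat (the home page links everywhere directly). This is only an illustration of the general idea; it is not how Google actually computes anything, and the page names and link graphs are made up.

```python
# Toy PageRank by power iteration over a dict of page -> outlinks.
# Illustrative only: real search-engine ranking is far more complex.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

# Deep structure: leaves link back through the city hub,
# so equity concentrates on the hub page.
deep = {
    "home": ["erasmus-sevilla"],
    "erasmus-sevilla": ["home", "guide/transport", "guide/universities"],
    "guide/transport": ["erasmus-sevilla"],
    "guide/universities": ["erasmus-sevilla"],
}

# Flat structure: home links to every page directly,
# so equity spreads more evenly.
flat = {
    "home": ["erasmus-sevilla", "guide/transport", "guide/universities"],
    "erasmus-sevilla": ["home"],
    "guide/transport": ["home"],
    "guide/universities": ["home"],
}

deep_ranks = pagerank(deep)
flat_ranks = pagerank(flat)

# The hub page accumulates more rank in the deep structure.
print(deep_ranks["erasmus-sevilla"] > flat_ranks["erasmus-sevilla"])  # True
```

The point of the toy model is only that structure redistributes a fixed budget: strengthening hub pages comes at the cost of the deeper pages, exactly as described above.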
-
Hi Jose,
I like your current setup, with more pages at 500 words each. 500 words doesn't make for thin content from a search engine's perspective, and it means that you're delivering a more targeted result for searchers; if you don't have separate pages and they search for "restaurants in Seville," they're not going to be thrilled if they land on your mega guide page and have to hunt around to find what they're looking for.
That said, you may want to change the language on the main Seville page so you don't call these "detailed" city guides.
Good luck!
Kristina
-
Hello again,
1. Seville has 22,000 searches in the UK, but very few people search for Sevilla.
2. It depends; I prefer subdomain.domain.com over subfolders. I only found English on your site. Even if you use /en/, you need a main language (which could be English), and that language doesn't need a subfolder: www.domain.com (for English), then www.domain.com/es (for Spanish), and so on. But it's a personal decision.
3. OK
4. You didn't get my point. Please read my message and my example carefully (I checked your site carefully). It's very, very important that you don't repeat the same or similar keywords in a single URL. In my example it was "Seville + Sevilla" and "university + universities" in one URL.
5. Again, the best approach is to use as few subfolders as possible! A URL like www.eurasmus.com/erasmus-seville-city-guide is much nicer for Google than www.eurasmus.com/erasmus-sevilla/city-guide
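Points 4 and 5 above can be turned into a quick automated check. The sketch below is illustrative only: the stemming is deliberately naive (just enough to catch singular/plural pairs like "university"/"universities"), and the Sevilla/Seville alias table is a hypothetical list you would have to maintain by hand.

```python
# Hypothetical helper that flags keywords repeated within a single URL
# path, including trivial singular/plural variants and hand-maintained
# cross-language aliases. Naive by design; not a real SEO tool.
import re
from collections import Counter

ALIASES = {"sevilla": "seville"}  # assumption: maintained manually

def stem(token):
    token = ALIASES.get(token, token)
    if token.endswith("ies"):
        return token[:-3] + "y"   # universities -> university
    if token.endswith("s"):
        return token[:-1]         # guides -> guide
    return token

def repeated_keywords(url):
    path = re.sub(r"^https?://[^/]+", "", url)              # drop scheme + host
    tokens = [t for t in re.split(r"[/\-_.]+", path.lower()) if t]
    counts = Counter(stem(t) for t in tokens)
    return sorted(s for s, c in counts.items() if c > 1)

# Luis's example URL trips the check twice:
print(repeated_keywords(
    "http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville"
))  # ['seville', 'university']
```

A URL following Luis's flatter pattern, such as /erasmus-spain/seville-city-guide, comes back clean from the same check.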
You can keep only one if that is your strategy, or both, since they have different content and context. It's up to you, and if you apply a good SEO strategy I don't see any problem with having two pages.
About the long tail, I already explained this before. Those pages may be ranking now for non-competitive keywords (study the keyword difficulty for your pages/keywords). I recommend focusing on why you are not ranking well for the pages/keywords you want, and optimizing your strategy accordingly.
Hope this helps!
Luis
-
Hi Luis!
Thank you for your message.
I will try to answer all your comments.
1. We have already done the keyword research for all the cities and used the variant with more searches.
I will recheck it to make sure it is applied everywhere.
2. We are now publishing in Spanish and 7 more languages; that is why we have the /en. We decided to go for /fr, /it, etc. As far as we know, there is no relevant difference.
Am I right?
3. I agree. That is why we are redesigning (it is also not user friendly at all).
4. It is eurasmus.com, our brand name, which is not "erasmus". They are different words. Whether it is a good brand choice for SEO is another chapter; that has been a long-running discussion in our company.
5. We will study how to make it better!
Concerning my direct question: would you recommend moving all the guide content onto the erasmus-sevilla home page and deleting the guide area, or would you keep the guide and just add more content to the home page?
The main thing is that we get results for long-tail keywords, but those keywords do not really generate conversions. What do you think?
-
Hi Jose,
Some advice and questions:
- Have you done a keyword analysis before? How many searches do you have for your supposed "focus keywords"? After checking a little, I see the word "Seville" performs much better than "Sevilla"; that is what foreign users call it.
- Don't abuse URL sublevels: /en/erasmus-sevilla/guide/... You don't need the /en/ since your site is only in English. If you plan to translate into new languages, you can use subdomains for this (es.eurasmus.com, fr.eurasmus.com, ...)
- Add much more content to your landing page (/erasmus-sevilla is quite poor in content)
- Don't repeat keywords in the URL: http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville (here you have two repeated keywords man!)
- Make things simpler! Some ideas:
- www.eurasmus.com/erasmus-spain/seville-city-guide
- www.eurasmus.com/erasmus-spain/seville-city-transport
- www.eurasmus.com/erasmus-spain/seville-universities
- www.eurasmus.com/erasmus-spain/madrid-city-guide
- www.eurasmus.com/erasmus-belgium/brussels-city-guide
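Whichever language-URL scheme gets chosen (subfolders, as the site has now, or subdomains, as suggested above), the important complement is telling search engines that the pages are translations of each other via hreflang annotations. A minimal sketch, assuming hypothetical helper names and an illustrative language list:

```python
# Sketch of the two language-URL schemes plus the hreflang <link> tags
# that mark pages as translations of each other. Helper names and the
# language list are illustrative assumptions, not from the actual site.

LANGS = ["en", "es", "fr"]

def subfolder_url(lang, path, default="en"):
    # English at the root, other languages under /es/, /fr/, ...
    prefix = "" if lang == default else f"/{lang}"
    return f"https://www.eurasmus.com{prefix}/{path}"

def subdomain_url(lang, path, default="en"):
    # English at www, other languages at es., fr., ...
    host = "www" if lang == default else lang
    return f"https://{host}.eurasmus.com/{path}"

def hreflang_tags(url_for, path):
    # One <link rel="alternate" hreflang=...> per language version.
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url_for(lang, path)}" />'
        for lang in LANGS
    )

print(hreflang_tags(subfolder_url, "erasmus-sevilla"))
print(hreflang_tags(subdomain_url, "erasmus-sevilla"))
```

Either scheme works as far as hreflang is concerned; the tags just have to list every language version consistently on every version of the page.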
Long-tail results for different keywords are normal; that happens. Have you tested with the Moz Grade tool whether your pages need some improvements for the related keywords? That would be necessary too.
Btw, I'm Spanish, so don't hesitate to send me a PM if you need more help, man.
Luis