Many pages with small unique content vs. 1 page with big content
-
Dear all,
I am redesigning some areas of our website, eurasmus.com, and we are not clear on the best option to follow. Our site has a city area, e.g. www.eurasmus.com/en/erasmus-sevilla, which we are going to redesign, and a guide area where we explain the city, etc.: http://eurasmus.com/en/erasmus-sevilla/guide/, all with unique content.
The thing is that, due to lack of resources, our guide is not really deep at this point. We believe that a 500-character page for every topic (transport, etc.) does not add extra value for users, and it is not very user friendly either.
On the other hand, these pages are getting some long-tail results, though not for our target keyword. For example, they rank for "transport in sevilla", while our target keyword would be "erasmus sevilla".
When redesigning the city area, we have to choose between:
a) www.eurasmus.com/en/erasmus-sevilla, with all the content on one page, about 2,500 unique characters.
b) www.eurasmus.com/en/erasmus-sevilla, with a better amount of content and a nice redesign, but keeping the guide pages.
What would you choose?
Let me know what you think.
Let me know what you think.
Thanks!
-
Wow, Jose, you got a whole audit from Luis.
1. Luis makes a good point about Seville vs Sevilla. When you're trying to target a region other than your own, make sure that you change the location in Google Keyword Planner. Seville is the English version of Sevilla (which I know sounds strange, but we also call your country Spain rather than España).
2. Both subdomains and subfolders can effectively designate different languages. If you've made the call to use subfolders, that's fine. It's probably what I would have done, too, since that means the Domain Authority will transfer easily.
4 & 5. Keyword repetition in URLs isn't necessarily bad in your case, because it's caused by a lot of subfolders.
It seems like there's been some debate here on more subfolders vs. fewer: there isn't a hard and fast rule about it. If you have more subfolders, the pages higher up in the structure tend to get more link equity out of the deal and rank better. That takes away from deeper pages, though, which are presumably targeting your most important words. If you use fewer subfolders, the link equity will be more evenly distributed, which means higher-level pages will be weaker and deeper pages stronger. In your case, I don't know the answer, since I don't know how competitive different keywords are at different levels. If I were your SEO, I'd tell you to stick with your current URL structure, because moving pages to new URLs tends to cause a big knock in rankings for a while.
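If Jose does eventually flatten the structure, the standard mitigation for that rankings knock is a one-to-one 301 map from every old URL to its new home, so existing links keep pointing somewhere valid. A minimal sketch in Python; the paths are hypothetical examples, not eurasmus.com's real structure:

```python
# Sketch of a one-to-one 301 redirect map for a URL restructure.
# Old path -> new path; every moved page gets exactly one entry.
REDIRECTS = {
    "/en/erasmus-sevilla/guide/transport": "/erasmus-seville-transport",
    "/en/erasmus-sevilla/guide/universities": "/erasmus-seville-universities",
}

def resolve(path: str) -> tuple[int, str]:
    """Return (HTTP status, target path) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/en/erasmus-sevilla/guide/transport"))
# -> (301, '/erasmus-seville-transport')
```

In practice the same map would be expressed as server rules (e.g. Apache `Redirect 301` lines), but the invariant is the same: permanent, one-to-one, no chains.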
-
Hi Jose,
I like your current setup, with more pages at 500 words. 500 words doesn't make for thin content from a search engine's perspective, and it means that you're delivering a more targeted result for searchers; if you don't have separate pages and they search for "restaurants in Seville," they're not going to be thrilled if they land on your mega guide page and have to hunt to find what they're looking for.
That said, you may want to change the language on the main Seville page so you don't call these "detailed" city guides.
Good luck!
Kristina
-
Hello again,
1. "Seville" has 22,000 searches in the UK, but very few people search for "Sevilla".
2. It depends; I prefer subdomain.domain.com instead of subfolders. I only found English on your site. Even if you use /en/, you need a main language (that could be English), and then the subfolder is not necessary: www.domain.com (for English), then www.domain.com/es (for Spanish), and so on. But it is a personal decision.
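Whichever structure you pick, subdomains or subfolders, the language versions should be tied together with hreflang alternate annotations so search engines serve the right one. A minimal sketch, assuming the subfolder setup Jose already has (the domain and language list are illustrative):

```python
# Emit <link rel="alternate" hreflang="..."> tags for a
# subfolder-per-language site. Languages and base URL are assumptions.
LANGS = ["en", "es", "fr", "it"]
BASE = "https://www.eurasmus.com"

def hreflang_tags(path: str) -> list[str]:
    """Build one alternate tag per language version of the given page."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{BASE}/{lang}{path}" />'
        for lang in LANGS
    ]

for tag in hreflang_tags("/erasmus-sevilla"):
    print(tag)
```

Each page's set of tags must list every language version, including itself; a subdomain setup would only change the `href` pattern.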
3. OK
4. You didn't get my point. Please read my message and my example carefully (I checked your site carefully). It's very, very important that you don't repeat similar or identical keywords in the same URL. In my example before, it was "Seville + Sevilla" and "university + universities" in one single URL.
5. Again, the best is to have as few subfolders as possible! A URL like www.eurasmus.com/erasmus-seville-city-guide is much nicer for Google than www.eurasmus.com/erasmus-sevilla/city-guide.
You can keep only one if that is your strategy, or both, since they have different content and context. It's up to you, and if you apply a good SEO strategy I don't see any problem having two pages.
About the long tail, I already explained it before. You may be ranking now for non-competitive keywords on those pages (study the keyword difficulty rankings for your pages/keywords). I recommend focusing on why you are not ranking well for the pages/keywords you want, and optimizing your strategy.
Hope this helps!
Luis
-
Hi Luis!
Thank you for your message.
I will try to answer all your comments.
1. We have already done the keyword research for all the cities and used the variant with more searches. I will recheck it to make sure it is applied everywhere.
2. We are now publishing Spanish and 7 more languages; that is why we have the /en. We decided to go for /fr, /it, etc. As far as I know there is not a relevant difference, we believe. Am I right?
3. I agree. That is why we are redesigning (the current version is also not user friendly at all).
4. It is eurasmus.com, our brand name, which is not "erasmus". Different words. Another chapter would be discussing whether it is a good brand choice for SEO; that has been a long discussion in our company for a long time.
5. We will study how to make it better!
Concerning my direct question: would you recommend moving all the guide content into the erasmus-sevilla home page and deleting the guide area, or would you leave the guide and just add more content to the home page?
The main thing is that we get results for long tail, but those keywords do not really generate conversions. What do you think?
-
Hi Jose,
Some advice and questions:
- Have you done a keyword analysis before? How many searches do you have for your supposed "focus keywords"? After checking a little bit, I see the word "Seville" performs much better than "Sevilla"; foreign users call it that.
- Don't abuse URL sublevels: /en/erasmus-sevilla/guide/... (You don't need the /en/ since your site is only in English. If you plan to translate into new languages, you can use subdomains for this: es.eurasmus.com, fr.eurasmus.com, ...)
- Add much more content to your landing page (/erasmus-sevilla is quite poor in content).
- Don't repeat keywords in the URL: http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville (here you have two repeated keywords, man!)
- Make things simpler! Some ideas:
  - www.eurasmus.com/erasmus-spain/seville-city-guide
  - www.eurasmus.com/erasmus-spain/seville-city-transport
  - www.eurasmus.com/erasmus-spain/seville-universities
  - www.eurasmus.com/erasmus-spain/madrid-city-guide
  - www.eurasmus.com/erasmus-belgium/brussels-city-guide
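Luis's two URL rules, few sublevels and no repeated keywords, can be checked mechanically across a sitemap. A rough sketch in Python; the six-character "stemming" is a deliberately naive illustration (enough to catch "seville"/"sevilla" and "university"/"universities"), not a real stemmer:

```python
# Rough checks for the two URL rules above: limit subfolder depth and
# avoid repeating near-identical keywords in one URL.
from urllib.parse import urlparse

def url_issues(url: str, max_levels: int = 2) -> list[str]:
    segments = [s for s in urlparse(url).path.split("/") if s]
    issues = []
    if len(segments) > max_levels:
        issues.append(f"too many sublevels ({len(segments)})")
    words = [w for seg in segments for w in seg.split("-")]
    # Naive stemming: compare the first six letters of each word longer
    # than three characters, so "seville"/"sevilla" and
    # "university"/"universities" collide.
    stems = [w[:6] for w in words if len(w) > 3]
    dupes = sorted({s for s in stems if stems.count(s) > 1})
    if dupes:
        issues.append("repeated keywords: " + ", ".join(dupes))
    return issues

print(url_issues("http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville"))
# -> ['too many sublevels (4)', 'repeated keywords: sevill, univer']
```

Run against Luis's suggested flat URLs, the checker comes back empty, which is the point of his restructure.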
Long-tail results for different keywords are normal; that happens. Have you tested with the Moz Grade tool whether your pages need some improvements for the related keywords? That would be necessary too.
Btw, I'm Spanish, so don't hesitate to send me a PM if you need more help, man.
Luis