No Index, No Follow Short (but relevant) content?
-
One of the sections of our blog is "Community Involvement." In this section, we post pictures of the event, what it was for, and what we did to help.
We want our clients, and potential clients, to see that we do give back to our local community. However, these are all very short posts (maybe a few hundred words each).
I'm worried this might look like spam, or at the very least, thin content to Google, so should I noindex/nofollow the posts or just leave them as is?
Thanks,
Ruben
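For readers unfamiliar with the directive being asked about: noindex/nofollow is applied per page with a robots meta tag. A minimal sketch, with a made-up post title (nothing here is from Ruben's actual site):

```html
<!-- Hypothetical community-involvement post; the title is illustrative only. -->
<head>
  <title>Community Involvement: Local Food Drive</title>
  <!-- "noindex" asks search engines to keep the page out of results;
       "nofollow" asks them not to follow the page's links. -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

If the only concern is thin content, noindex on its own (leaving the links followable) is the more common choice.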
-
Okay, thanks Patrick!
-
Hi there
I personally wouldn't be worried, just because this is relevant content to your business and the work you do for the community. We do this and have never seen issues. Plus, I would imagine you're not writing about an event every week; if you are, I would stick to the bigger events or the ones you are more involved and hands-on with, not just events you sponsored. I would just make sure your content for these pages answers the following types of questions:
- Who did you help?
- What do they as an organization do?
- What was the event?
- Why is your business passionate about this organization?
- Where was the event held?
- How many people attended?
- What kind of events happened at this event? (auctions, music, sports, etc.)
- Any useful future event information.
It's relatively easy to fatten this type of content up. Just make sure you're not repeating the same things over and over on different pages. Organizations, events, and your reasons for partnering are all different; make sure you showcase that and you should be good!
Hope this helps, good luck!
Patrick
Related Questions
-
Google still indexing home page even after 301 - Ecommerce Website
Hi all,
Local Website Optimization | David1986
We have a 301 redirect problem. Google seems to continue indexing a 301 redirect to our old home page, even after months. We have a multiple-language domain, with subfolders:
- www.example.com (ex page, now with a redirect to the right locale in the right country)
- www.example.com/it/home (canonical)
- www.example.com/en/home (canonical)
- www.example.com/es/home (canonical)
- www.example.com/fr/home (canonical)
- www.example.com/de/home (canonical)
We still see the old page (www.example.com) in Google results, with old metadata in English, and just in some countries (i.e.: France) we see the correct result, the "new" homepage, www.example.com/fr/home, in first position.
The real problem is that Google is still indexing and showing www.example.com as the "real" and "trusted" URL, even though we set:
- a 301 redirect
- the right language for every locale in Google Search Console
- a canonical tag to the locale URL
- an hreflang tag inside the code
- a specific sitemap with hreflang tags specified for the new homepages
Now our redirect process is the following (Italy example):
www.example.com --> 301
www.example.com/en/home --> default version --> 301
www.example.com/it/home --> 200
Every online tool, from Moz to bot simulators, sees that there is a 301. So correct. Google Search Console says that:
- on www.example.com there is a 301 (correct)
- in the internal links section of Google Search Console, www.example.com is still in first position with 34k links. Many of these links are coming from property subdomains. Should we change those links inside those third-level domains, from www.example.com to www.example.com/LOCALE/home?
- the www.example.com/LOCALE/home pages are the real home pages; they return a 200 code
Do you know if there's a way to delete the old home page from Google results since this is a 301? Do you think that, even after a 301 redirect, Google decides to ignore the 301 if it sees too many internal links?
Thanks for your help!
Davide
-
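For reference, the setup this question describes would put something like the following in the head of each locale homepage. This is a sketch for the Italian version, built only from the example URLs above; the protocol (https assumed here) and whether to include an x-default would depend on the actual site:

```html
<!-- On www.example.com/it/home (the page that returns a 200) -->
<head>
  <!-- Self-referencing canonical for this locale homepage -->
  <link rel="canonical" href="https://www.example.com/it/home">
  <!-- hreflang annotations pointing at every locale version -->
  <link rel="alternate" hreflang="it" href="https://www.example.com/it/home">
  <link rel="alternate" hreflang="en" href="https://www.example.com/en/home">
  <link rel="alternate" hreflang="es" href="https://www.example.com/es/home">
  <link rel="alternate" hreflang="fr" href="https://www.example.com/fr/home">
  <link rel="alternate" hreflang="de" href="https://www.example.com/de/home">
</head>
```

Every locale homepage in the group would carry this same block, with only the canonical changed to point at itself.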
Why am I not ranking although my domain authority is higher than competitors with the same relevancy?
So I am trying to rank the domain http://jamesriver.org for the term "Churches in Springfield, MO". Not sure why we are not ranking as well as we ought to. I have a few assumptions, but wanted to see what others have to say to get better input. Below are some details about us:
- We have done a brand name change in the past 2 years: James River Assembly to James River Church
- We have two locations: Ozark, MO, which has been there for a very long time, and Springfield, MO, which is a newer campus
- We have higher domain authority than others that rank higher for the term
- We have a new website that was launched about 3 months ago
- We have a location page for each of the 2 campuses
I am wondering what factors might be at play in our lesser rankings even though we are relevant to the term and have higher authority than those that are ranking much higher than us. Thanks for any help you can provide.
Local Website Optimization | chris.oursbourn
-
What is the optimal approach for a new site that has geo-targeted content available via 2 domains?
OK, so I am helping a client with a new site build. It is a lifestyle/news publication that traditionally has focused on delivering content for one region. For ease of explanation, let's pretend the brand/domain is 'people-on-the-coast.com'. They are now looking to expand their reach to another region using the domain 'people-in-the-city.com'. Whilst on-the-coast is their current core business and already has some search clout, they are very keen on the city market and the in-the-city domain. They would like to be able to manage the content through one CMS (Joomla), and the site will deliver articles and the logo based on the location of the user (city or coast). There will also be cases where the content is duplicated for both regions. The design/layout etc. will all remain identical. So what I am really wanting to know is the pros, cons and ultimately the best approach to handle the setup and ongoing management from an SEO (and UX) perspective. All I see is problems! Any help would be greatly appreciated! Thanks,
Local Website Optimization | bennyt
Confused O.o
-
Implementation advice on fighting international duplicate content
Hi All, Let me start by explaining that I am aware of the rel="canonical" and rel="alternate" hreflang="x" tags, but I need advice on implementation. The situation is that we have 5 sites with similar content. Out of these 5:
- 2 use the same URL structure and have no suffix
- 2 have a different URL structure with a .html suffix
- 1 has an entirely different URL structure with an .asp suffix
The sites are quite big, so it will take a lot of work to go through and add rel="alternate" hreflang="x" tags to every single page (as we know, the tag should be applied at a page level, not site level). 4 out of the 5 sites are managed by us and have the tag implemented, so that makes it easier, but the 5th is managed in Asia and we fear the amount of manual work required will put them off implementing it. The site is due to launch at the end of the month and we need to sort this issue out before it goes live so that we are not penalised for duplicate content. Is there an easy way to go about this or is the only way a manual addition? Has anyone had a similar experience? Your advice will be greatly appreciated. Many thanks, Emeka.
Local Website Optimization | OptiBacUK
-
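For context, the per-page markup this question is about looks like the block below. Each page in a translation group carries the full set of alternates, including a reference to itself; the domains, paths, and language codes here are invented purely to mirror the three URL structures described in the question:

```html
<!-- Hypothetical equivalent pages across three of the sites; every page
     in the group would carry this same block. The differing suffixes
     (none, .html, .asp) mirror the URL structures described above. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/probiotics/">
<link rel="alternate" hreflang="en-us" href="https://www.example.com/probiotics.html">
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/probiotics.asp">
```

Because the tags must be reciprocal, the fifth site cannot be skipped; if a page omits the group, the other sites' annotations to it are ignored.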
Location Pages and Duplicate Content and Doorway Pages, Oh My!
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, seeing as the locations may have very similar services. Let's say they have example.com/location/boston, example.com/location/chicago, or maybe boston.example.com or chicago.example.com, etc. They are landing pages for each location, housing that location's contact information, and they show the same services/products as every other location. This information may also live on the main domain's homepage or services page as well. My initial reaction agrees with this article: http://moz.com/blog/local-landing-pages-guide - but I'm really asking what does Google expect? Does this location pages guide from Google tell us we don't really have to make sure each of those location pages is unique? Sometimes creating "unique" location pages feels like you're creating doorway pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names". In a nutshell, Google's guidelines seem to have a conflict on this topic:
- Location Pages: "Have each location's or branch's information accessible on separate webpages"
- Doorway Pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
- Duplicate Content: "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."
Now you could avoid making it a doorway page or a duplicate content page if you just put the location information on a page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this guideline:
- Thin Pages: "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."
...starting to feel like I'm in a Google Guidelines Paradox! Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
Local Website Optimization | eyeflow
-
UK website to be duplicated onto 2 ccTLD's - is this duplicate content?
Hi
We have a client who wishes to have a site created and duplicated onto 3 servers hosted in three different countries: United Kingdom, Australia and USA, all of which will of course be in the English language. Long story short, the website will provide the user 3 options on the homepage asking them which "country site" they wish to view. (I know I can detect the user's IP and auto-redirect, but this is not what they want.) Once they choose an option it will direct the user to the appropriate ccTLD. Now the client wants the same information to appear on all 3 sites, with some slight variations in products available and English/US spelling differences, but for the most part the sites will look the same, with the same content on each page. So my question is, will these 3 sites be seen as duplicates of each other even though they are hosted in different countries and are on ccTLDs? Are there any considerations I should pass on to the client with this approach? Many thanks for reading.
Local Website Optimization | yousayjump
Kris
-
Does Google play fair? Is 'relevant content' and 'usability' enough?
It seems there are 2 opposing views, and as a newbie this is very confusing. One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly. The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well. Which is closer to the truth? No one wants a great website that won't rank because Google wasn't sophisticated enough to see that they weren't being unfair. Here's an example to illustrate one related concern I have: I've read that Google doesn't like duplicated content. But here are 2 cases in which it is more 'relevant' and 'usable' to the user to have duplicate content. Say a website helps you find restaurants in a city. Restaurants may be listed by city region, and by type of restaurant. The home page may have links to 30 city regions. It may also have links for 20 types of restaurants. The user has a choice. Say the user chooses a region. The resulting new page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region. IOW, there may be a 'mega-menu' at the top of the page which is duplicated on every page in the site, but is very helpful. Instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site even though arguably it is the most relevant and usable approach for someone that may or may not have a specific region or restaurant type in mind. Thoughts?
Local Website Optimization | couponguy
-
Canonical for 80-90% duplicate content help
Hi. I seem to spend more time asking questions atm. I have a site I have revamped, www.themorrisagency.co.uk. I am working through sorting out the 80-90% duplicated content, where the pages differ only by a spattering of geographical and band-style terms, e.g.:
http://www.themorrisagency.co.uk/band-hire/greater-manchester/
with 'manchester' being changed to:
http://www.themorrisagency.co.uk/band-hire/oxfordshire/
etc. So I am going through this slow but essential process atm. I have a main http://www.themorrisagency.co.uk/band-hire/ page. My question is: would it be sensible to use a canonical tag (via the Yoast SEO plugin) as a temporary solution, pointing these duplicate pages to http://www.themorrisagency.co.uk/band-hire/, rather than removing them? What are your thoughts, as I am aware that misusing a rel=canonical could make things worse. Thanks as always, Daniel
Local Website Optimization | Agentmorris
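For reference, the canonical being proposed in this question would amount to a single tag on each near-duplicate county page, sketched here with the URLs from the question. Note this is a consolidation hint to search engines, not a redirect; visitors would still see the county page:

```html
<!-- On http://www.themorrisagency.co.uk/band-hire/oxfordshire/
     (and each similar county page) -->
<link rel="canonical" href="http://www.themorrisagency.co.uk/band-hire/">
```

Because a cross-page canonical tells Google to ignore the tagged page in favour of the target, it only makes sense here if the county pages are not expected to rank in their own right while they await rewriting.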