SEO best practice: Use tags for SEO purposes? To add or not to add to the sitemap?
-
Hi Moz community,
New to the Moz community, and hopefully this is the first post/comment of many to come. I am somewhat new to the industry and have a question that I would like to ask and get your opinions on. It most likely has a very simple answer, but here goes:
I have a website for a local moving company (so small amounts of traffic and very few pages) that was built on WordPress... I was told when I first started that I should create tags for some of the cities serviced in the area. I did so and tagged the first blog post with each tag. It turned out to be about 12-15 tags, which in turn created 12-15 additional pages. These tags are listed in the footer area of each page. There are fewer than 20 pages on the website excluding the tags.
Now, I know that each of these pages is showing as duplicate content. This just does not seem like best practice to me. For someone quite new to the industry, what would you suggest I do to best deal with this situation?
Should I even keep the tags?
Should I keep them but not index them?
Should I add them to / remove them from the sitemap? (See the sketch below.)
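As a rough illustration of the sitemap option (a sketch only; the URL list and the /tag/ path convention are made-up examples, not details from the actual site), excluding tag archives from an XML sitemap looks like this in Python:

```python
from xml.etree import ElementTree as ET

# Hypothetical page list; a real site would pull this from the CMS.
urls = [
    "https://example.com/",
    "https://example.com/moving-services/",
    "https://example.com/tag/springfield/",   # tag archive: leave out
    "https://example.com/tag/shelbyville/",   # tag archive: leave out
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    if "/tag/" in url:  # WordPress tag archives live under /tag/ by default
        continue
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Tag pages left out of the sitemap would typically also get a noindex robots meta tag so they drop out of the index entirely.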
Thanks in advance for any help, and I look forward to being a long-time member of SEOmoz.
-
Hi Keri,
I have actually checked out that post numerous times in the past. Seems to be a great "manual" for anyone just starting out.
Thanks for the information!
-
Hello Dr. Peter J. Meyers,
Thanks for your response to my post. I am always looking to further my education and run better studies. I will definitely work this into some of our new studies and see what I find.
Thanks for the link.
Have a great night.
-
Thanks for the great input Peter.
The site does have... let me rephrase: did have 12-15 tags total, and after everyone's input I realized that I needed to just get rid of them. It seems it was a tactic that once worked great but, as things move forward, probably isn't in the best interest of rankings later on.
I am going to take the most natural approach, try to keep the on-page clean and free of "tricks," and focus on off-page now, and see how well that works.
Thanks again for your advice.
-
Since you're talking about a site with <20 pages total, 12-15 tag pages linked through the entire site is a lot. It's just an issue of dilution: with a new site, you only have so much authority (inbound link power) to go around, and you're now pushing it to twice as many pages, many of which are just internal search results that Google could see as low value. It's not a disaster, but it's probably going to hurt your ability to rank in the short term.
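To make the dilution arithmetic concrete, here is a toy model (an illustration only, not how Google actually computes anything): if each page splits its internal link equity evenly across the pages linked sitewide, adding 15 tag pages to a 20-page site shrinks every page's share.

```python
# Toy model: equal split of internal link equity across sitewide-linked pages.
content_pages = 20
tag_pages = 15

share_before = 1 / content_pages               # 5.0% per page
share_after = 1 / (content_pages + tag_pages)  # ~2.9% per page
print(f"{share_before:.1%} per page -> {share_after:.1%} per page")
```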
I assume you mean 12-15 tags total, not per post, right? The other issue with tags is just that they have a tendency to spin out of control as a site grows. You can easily end up with 50-100 tags linked from every page that are not only diluting your ranking power but that have very limited value for users (people just can't parse that many options).
All in all, I don't think tags are bad, but I do think it's worth being conservative, especially when you're first getting started. A site with <20 pages doesn't need a ton of fancy navigation options. A clear, solid site architecture will be better for visitors and Google, in most cases.
-
It's possible that these percentages may hold true for some sites in some verticals, but "on-page" covers issues like site architecture that are incredibly important for larger sites. I've seen sites made or broken by on-page. All 22 Panda updates are essentially focused on on-page.
There's no one formula, and I think it can be very dangerous to suggest a one-size-fits-all approach. I wrote a whole post on this debate:
http://www.seomoz.org/blog/whats-better-on-page-seo-or-link-building
I also think the role of social in 2012 has been seriously overstated. Social mentions can be great for getting a site noticed and indexed, especially in a low-competition vertical, but the impact often isn't lasting without sustained activity. Sustained activity requires content and solid on-page, so it's really tough to separate the two.
-
I wrote out a nice 1,000+ words and it wouldn't let me post it, so I am going to make this second one short, lol.
To make a long story short: yes, we use it for our business and have found that it works great. We use this with 2,200+ clients on a monthly basis, give or take a few; I am not sure where we are at this month.
Again, this all depends on the niche, site type, competitiveness of the keywords, etc. General SEO is going to rank low-competition keywords; for mid-competition keywords you're going to need a little of everything I talked about, and for competitive keywords you're going to need all of them and then some.
Take the "SEO services" keyword, for example. Analyze why the top sites are ranked where they are. I can tell you quickly... they built 73 domains, a.k.a. a network. They then diversified it with some other low-quality link types so they don't get penalized for anchor-text diversity issues. They have great platform diversity, and their class-C IPs span across multiple ranges. All 73 domains have a minimum authority of 40, from what I could see in a 15-minute check or so.
So if I wanted to beat that site, I know, one: I need better tactics, and I need to beat their social media. I know that I need to build 80 domains that have a minimum authority of 45 on every page, and better PageRank as well... I don't believe in PageRank much anymore, but it's burnt into my mind from using it for so many years. So I buy PR + domain authority to make sure I can't go wrong... I analyze for traffic, possible keywords, niche type, etc. This all plays into the factors when you build a network.
This is the method that ranked "SEO services" at number 1 in the world, because I helped do it some time ago. Although the sites they are using now are very low quality, it is most likely a rinse-and-repeat setup and will possibly last 3 months tops if they keep building more domains.
You can use a $20 domain name with some age and authority, and you may as well get some PageRank just to be safe. Try to buy a domain with 50+ authority and PR3+. After that, start with some general safe link types and diversify them well. Build links slowly for about 2 months, then start building your network, and bam... page 1 rankings, as easy as that.
Now, for a term like "payday loan" in the US, it would take hacking, like they are doing right now, and that is very unethical, not to mention illegal. So I would most likely go only as far as the network, and that's about it.
Have a great night everyone.
Matthew Boley
-
Keri,
That's why I said broad keywords, not specific. You need to make sure your site is set up for the broad keywords. There is a large difference between broad and specific.
Google must know what your site is about. I don't advise spamming a page with a 10%+ keyword density; that is why I tell everyone 1.5% density, to make sure they are safe.
I am personally not a firm believer in on-page SEO at all and really don't think it does anything anymore. If you have your title set up right and your meta tags, and you don't have anything duplicated on your site, you can rank it. I have ranked sites that I wasn't allowed to touch, due to them being government sites with terrible URL structure, and I ranked them perfectly fine with zero SEO. Now, a domain authority of 80 probably didn't hurt, but whatever; there are instances where something will work for one site and not the next.
Have a great night. Thanks for the reply and the links to those sites.
Matthew Boley
-
Hi Matthew,
Many in the field (including several on the SEOmoz blog) consider aiming for a specific keyword density number to be a bit of an outdated tactic. Rand talks about it in this Whiteboard Friday at http://www.seomoz.org/blog/10-myths-that-scare-seos-but-shouldnt-whiteboard-friday. Matt Cutts also talks about it at http://www.youtube.com/watch?v=Rk4qgQdp2UA.
It can be helpful to include in answers (like you did in your other answers) phrases such as "in my experience" or "on my sites I've found the following to be true," or to reference where an authority has also said something is true.
-
Hi Ben,
Welcome to SEOmoz! We're glad to have you here. Have you checked out this post by Dan Shure, an SEOmoz Associate who specializes in WordPress? He talks about how to set up WordPress for SEO success, including how to handle tags and categories, what to noindex, etc. It's at
-
Hi Matthew,
You present some interesting figures. You do say, "even though this may not be correct, but has worked for me" -- does this mean that the following percentages are based on your own testing, or do you have some references you could give that would help back this up?
-
That is correct. I kind of look at it like this (even though this may not be correct, it has worked for me): 5% of the algorithm is on-page SEO properly set up, 65-75% is off-page SEO depending on niche type, and 25-30% is social media metrics. It may not be spot on, but I do hold rank 1 for "payday loans" right now in a popular country other than the US, so I won't complain. We also rank for many very competitive terms here in the US for our main SEO company. We did hold rank 1 for "Affordable SEO" until I woke up one morning and found someone had spammed our site with thousands of bad links, so we have been working on getting that back up, but it happens.
Anyway, have a great night, and I am glad I could contribute and help.
Matthew Boley
-
Thanks, Matthew, for the response. That is exactly what I was looking for. It seems that all on-page optimization now needs to look 100% natural, with the rest of the focus on off-page, etc.
-
Hello,
I would make a natural-looking page and make sure your main/broad keywords are optimized at or around 1.0%-1.5% density to be safe. I would personally just get rid of the tags, because they look unnatural, and that's a very old method that was used last year and for many years before that.
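Just to pin down what a density figure means (a rough sketch; as the links Keri posted above note, many consider chasing a specific density to be outdated), it is simply occurrences of the phrase as a share of the page's word count:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of `phrase` as a percentage of the page's word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words or not target:
        return 0.0
    hits = sum(words[i:i + len(target)] == target
               for i in range(len(words) - len(target) + 1))
    return 100.0 * hits * len(target) / len(words)

copy = "Local movers you can trust. Our movers handle local moves daily."
print(f"{keyword_density(copy, 'movers'):.1f}%")  # 18.2% on this toy text
```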
A good rule of thumb for SEO: if it looks unnatural, then it is going to count as spam to Google at some point. So do exactly as they want you to and build a nice-looking website for visitors. Then, after you have built it, focus on off-page SEO tactics, social media marketing, local if you're targeting local searches, etc.
Now, if you're writing a blog post, it's OK to have 1 or 2 tags per post; that isn't a problem at all. But if you have 15 on one page, that is spam.
For the off-page SEO, find sites relevant to yours, or to your niche, then analyze them with a site explorer. If a site has a good rating, rank, authority, etc., then link to it. If it isn't relevant and a quality site, don't link to it.
Have a great day and happy holidays. If you have any other questions, I will be happy to try to help you out.
Matthew Boley
Related Questions
-
Does the home page carry more SEO benefit than other pages?
Hi, I would like to include my keywords in the URL, and they are under 50 characters. Is there anything in the algorithm that tells engines to give more importance to the homepage?
-
Is it possible for a bounce-rate attack to manipulate SEO?
My site has been visited by unusual users with one-second session times. This leaves my analytics data muddled.
-
A sitemap web page & a sitemap in .htaccess: will a website be penalised for having both?
Hi, I have a sitemap URL already generated by Yoast SEO in the .htaccess file, and I have submitted it to the search engines. I'd already created a sitemap web page on the website as a helpful aid for users to see a list of all page URLs. Is this a problem, and could this scenario create duplicate-content issues or any problems with search engines? Thanks.
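One side note (a hedged sketch, not advice specific to any install): the XML sitemap is normally declared in robots.txt and submitted in Webmaster Tools rather than living in .htaccess; Yoast's rewrite entries just serve the sitemap file. A minimal robots.txt along those lines, assuming Yoast's usual sitemap_index.xml default and a placeholder domain:

```
# robots.txt (domain and sitemap path are placeholder assumptions)
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap_index.xml
```
-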
How to remove trailing slashes in URLs using .htaccess (Apache)?
I want my URLs to look like these:
http://www.domain.com/buy
http://www.domain.com/buy/shoes
http://www.domain.com/buy/shoes/red
Thanks in advance!
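A common pattern for this (a sketch, untested against any particular setup; the 301 matters so the slashed versions pass their equity):

```apache
# .htaccess: strip the trailing slash with a 301, skipping real
# directories so directory indexes keep working.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```
-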
Same content, different target area SEO
So OK, I have a gambling site that I want to target for Australia, Canada, the USA and England separately, and still have .com for worldwide (or not, read further). The website's content basically stays the same for all of them, perhaps with just small changes to the layout and information order (a different order for the top 10 gambling rooms).
My question 1 would be: how should I mark the content for Google and other search engines so that it would not be considered "duplicate content"? As I have mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:
1. A separate Webmaster Tools account for every domain -> we will need to set up the targeting to the specific country in it.
2. Use the hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains; more info about it at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077 (see the sketch after this question).
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk is from a different C-class than the one for the .com.
Is there anything I am missing here?
Question 2: should I target .com for the US market, or are there other options? (I'm not based in the USA, so I believe .us is out of the question.)
Thank you for your answers. T
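On point 2, a minimal hreflang sketch (the domains are placeholders, not the actual sites; each version should list every alternate, including itself, plus an x-default fallback):

```html
<!-- placed in the <head> of every country version -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```
-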
Abused SEO unintentionally, now need a way out
Hello, I have been in contact with an SMO to optimize my site for search engines and social media sites. My site was doing great for the last 4 years, but suddenly it started dropping in the rankings. Then I came and joined SEOmoz Pro to find a way out. I was advised to categorize content in the form of subdomains... well, that took a huge toll on my rankings. Thanks to suggestions here, I have 301'd them to subdirectories. Now another huge question arises: I found out that my SMO guy was getting artificial votes, or whatever you call them, on Twitter, Facebook and G+... Twitter's and Facebook's are understandable, but I am starting to think that these votes on G+ might have affected my site's ranking? Here is a sample URL: http://www.designzzz.com/cutest-puppy-pictures-pet-photography-tips/ If you scroll down you will see 56 Google +1s. Now the big question is: I have been creating genuine content, but now that I am stuck in this situation, how do I get out of it? Changing URLs would be bad for readers... will a 301 fix it, or is there another method? Thanks in advance
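For reference, the subdomain-to-subdirectory 301 mentioned above might look like this in Apache (a sketch; the subdomain and paths are placeholders, not the actual site's):

```apache
# .htaccess: send every URL on the old subdomain to the matching
# path under the main domain's subdirectory, as a permanent redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^photography\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/photography/$1 [R=301,L]
```
-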
Google Sitemaps & punishment for bad URLs?
Hoping y'all have some input here. This is a long story, but I'll boil it down: Site X bought the URL of Site Y. 301 redirects were added to direct traffic (and help transfer link juice) from URLs on Site X to relevant URLs on Site Y, but 2 days before a "change of address" notice was submitted in Google Webmaster Tools, an auto-generating sitemap somehow added URLs from Site Y to the sitemap of Site X, so essentially the sitemap contained URLs that did not belong to Site X. Is there any documentation out there showing that Google would punish Site X for having essentially unrelated URLs in its sitemap by downgrading organic search rankings, because it may view that mistake as a black-hat (or otherwise evil) tactic? I suspect this because the site continues to rank well organically in Yahoo & Bing, yet is suddenly nonexistent on Google. Thoughts?
-
How best to do location-specific pages for ecommerce post-Panda update?
Hi, we have an ecommerce site, and currently we have a problem with duplicate content. We created location-specific landing pages for our product categories, which initially did very well until the recent Google Panda update caused a big drop in rankings and traffic. Example:
http://xxx.co.uk/rent/lawn-mower/London/100
http://xxx.co.uk/rent/lawn-mower/Manchester/100
Much of the content on these location pages is the same or very similar, apart from a different H1 tag and title tag and, in some cases, slight variations in the on-page content, but given that these items can be hired from 200 locations, it would take years to write unique content for every location in each category... We did this originally in April because we can't compete nationally, and we found it was easier to compete locally, hence the creation of the location pages; they did do well for us until now.
My question is: since the last Google Panda update, our traffic has dropped 40%, rankings have gone through the floor, and we are stuck with this mess. Should we get rid of (301) all of the location-specific pages for each of the categories, or just keep, say, 10 locations in the most popular cities and either noindex/nofollow the other locations or 301 them, or what would people recommend?
The only examples I can see on the internet of what others do with multiple locations is to have a store-finder type of thing... but you can't rank for the individual product/category doing it that way... If anyone has any advice or good examples of sites that employ a good location-specific URL method, please let me know. Thanks, Sarah