SEO best practice: Using tags for SEO purposes? To add or not to add to the sitemap?
-
Hi Moz community,
New to the Moz community, and hopefully this is the first post/comment of many to come. I am somewhat new to the industry and have a question I would like your opinions on. It most likely has a very simple answer, but here goes:
I have a website for a local moving company (so small amounts of traffic and very few pages) that was built on WordPress. When I first started, I was told that I should create tags for some of the cities serviced in the area. I did so and tagged the first blog post with each tag. That turned out to be about 12-15 tags, which in turn created 12-15 additional pages. These tags are listed in the footer of each page. Excluding the tags, there are fewer than 20 pages on the website.
Now, I know that each of these pages is showing as duplicate content, and that just does not seem like best practice to me. For someone quite new to the industry, what would you suggest I do to best deal with this situation?
Should I even keep the tags?
Should I keep them but noindex them?
Should I add them to or remove them from the sitemap?
Thanks in advance for any help, and I look forward to being a long-time member of SEOmoz.
-
Hi Keri,
I have actually checked out that post numerous times in the past. It seems to be a great "manual" for anyone just starting out.
Thanks for the information!
-
Hello Dr. Peter J. Meyers,
Thanks for your response to my post. I am always looking to further my education and run some better studies. I will definitely incorporate this into some of our new studies and see what I find.
Thanks for the link.
Have a great night.
-
Thanks for the great input Peter.
The site does have, let me rephrase... did have 12-15 tags total, and I realized after everyone's input that I needed to just get rid of them. It seems it was a tactic that once worked great but, as things move forward, is probably not in my best interest for rankings later on.
I am going to take the most natural approach: keep the on-page clean and free of "tricks," focus on off-page now, and see how well that works.
Thanks again for your advice.
-
Since you're talking about a site with <20 pages total, 12-15 tag pages linked through the entire site is a lot. It's just an issue of dilution - with a new site, you only have so much authority (inbound link power) to go around, and you're now pushing it to twice as many pages, many of which are just internal search results that Google could see as low value. It's not a disaster, but it's probably going to hurt your ability to rank in the short-term.
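To put rough numbers on that, here's a toy model (purely illustrative, something I'm making up for this answer, not how Google actually computes anything): treat your site's authority as a fixed pool split evenly across every page linked site-wide.

```python
# Toy illustration of internal link-equity dilution. Assumes (for
# illustration only) a fixed authority pool divided evenly across
# every page linked throughout the site.

def equity_per_page(total_authority, content_pages, tag_pages):
    """Authority each page receives if the pool is split evenly."""
    return total_authority / (content_pages + tag_pages)

# A site with ~20 content pages and no tag archives:
without_tags = equity_per_page(100.0, 20, 0)   # 5.0 per page

# The same site after adding 15 site-wide tag archive pages:
with_tags = equity_per_page(100.0, 20, 15)     # ~2.86 per page
```

Nearly doubling the page count roughly halves what each page receives, and that's the dilution effect in a nutshell. Real link equity flow is far more complex, but the direction of the effect is the same.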
I assume you mean 12-15 tags total, not per post, right? The other issue with tags is just that they have a tendency to spin out of control as a site grows. You can easily end up with 50-100 tags linked from every page that are not only diluting your ranking power but that have very limited value for users (people just can't parse that many options).
All in all, I don't think tags are bad, but I do think it's worth being conservative, especially when you're first getting started. A site with <20 pages doesn't need a ton of fancy navigation options. A clear, solid site architecture will be better for visitors and Google, in most cases.
-
It's possible that these percentages may hold true for some sites in some verticals, but "on-page" covers issues like site architecture that are incredibly important for larger sites. I've seen sites made or broken by on-page. All 22 Panda updates are essentially focused on on-page.
There's no one formula, and I think it can be very dangerous to suggest a one-size-fits-all approach. I wrote a whole post on this debate:
http://www.seomoz.org/blog/whats-better-on-page-seo-or-link-building
I also think the role of social in 2012 has been seriously overstated. Social mentions can be great for getting a site noticed and indexed, especially in a low-competition vertical, but the impact often isn't lasting without sustained activity. Sustained activity requires content and solid on-page, so it's really tough to separate the two.
-
I wrote out a nice 1,000+ words and it wouldn't let me post it, so I am going to make this second one short, lol.
To make a long story short: yes, we use it for our business and have found that it works great. We use this with 2,200+ clients on a monthly basis, give or take a few; I am not sure where we are at this month.
Again, this all depends on the niche, site type, competitiveness of the keywords, etc. General SEO is going to rank low-competition keywords; for mid-competition keywords you're going to need a little of everything I talked about; and for competitive keywords you're going to need all of it and then some.
Take the "SEO services" keyword, for example. Analyze why the top sites are ranked where they are. I can tell you quickly: 73 domains they built, AKA a network. They then diversified it with some other low-quality link types so they don't get penalized for anchor text diversity issues. They have great platform diversity, not to mention their class C IPs span multiple ranges. All 73 domains have a minimum authority of 40, from what I could see in a 15-minute check or so.
So if I wanted to beat that site, I know that, one, I need better tactics and I need to beat their social media. I know that I need to build 80 domains with a minimum authority of 45 on every page, and better PageRank as well. I don't believe in PageRank much anymore, but it's burnt into my mind from using it for so many years, so I buy PR + domain authority to make sure I can't go wrong. I analyze for traffic, possible keywords, niche type, etc. This all plays into the factors when you build a network.
This is the method that worked to rank "SEO services" at #1 in the world, because I helped do it some time ago. Although the sites they are using now are very low quality; it is most likely a rinse-and-repeat setup and will possibly last 3 months tops, even if they keep building more domains.
You can use a $20 domain name with some age and authority, and you may as well get some PageRank just to be safe. Try to buy a domain with 50+ authority and PR3+. After that, start with some general safe link types and diversify them very well. Build links slowly for about 2 months, then start building your network and bam... page 1 rankings, as easy as that.
Now, for a term like "payday loan" in the US, it would take hacking like they are doing right now, and that is very unethical, not to mention illegal. So I would most likely go as far as the network and that's about it.
Have a great night everyone.
Matthew Boley
-
Keri,
That's why I said broad keywords, not specific. You need to make sure your site is set up for the broad keywords. There is a large difference between broad and specific.
Google must know what your site is about. I don't advise spamming a page with a 10%+ keyword density; that is why I tell everyone a 1.5% density, to make sure they are safe.
I am personally not a firm believer in on-page SEO at all and really don't think it does much anymore. If you have your title set up right, your meta tags in place, and nothing duplicated on your site, you can rank it. I have ranked sites that I wasn't allowed to touch, due to them being government sites with terrible URL structure, and I ranked them perfectly fine with zero on-page SEO. Now, a domain authority of 80 probably didn't hurt, but whatever; there are instances where something will work for one site and not the next.
Have a great night. Thanks for the reply and the links to those sites.
Matthew Boley
-
Hi Matthew,
Many in the field (including several on the SEOmoz blog) consider aiming for a specific number for keyword density to be a bit of an outdated tactic. Rand talks about it in this Whiteboard Friday at http://www.seomoz.org/blog/10-myths-that-scare-seos-but-shouldnt-whiteboard-friday. Matt Cutts also talks about it at http://www.youtube.com/watch?v=Rk4qgQdp2UA.
It can be helpful to note in answers (like you did in your other answers) that "in my experience" or "on my sites" you've found something to be true, or to reference where an authority has said the same thing.
-
Hi Ben,
Welcome to SEOmoz! We're glad to have you here. Have you checked out this post by Dan Shure, an SEOmoz Associate who specializes in WordPress? He talks about how to set up WordPress for SEO success, including how to handle tags and categories, what to noindex, etc. It's at
-
Hi Matthew,
You present some interesting figures. You do say "even though this may not be correct, but has worked for me" -- does this mean that the percentages are based on your own testing, or do you have some references you could give that would help back this up?
-
That is correct. I kind of look at it like this; it may not be correct, but it has worked for me: 5% of the algorithm is on-page SEO properly set up, 65-75% is off-page SEO depending on niche type, and 25-30% is social media metrics. It may not be dead on, but I do hold rank 1 for "payday loans" right now in a popular country other than the US, so I won't complain. We also rank for many very competitive terms here in the US for our main SEO company. We did hold rank 1 for "Affordable SEO" until I woke up one morning and someone had spammed our site with thousands of bad links, so we have been working on getting that back up, but it happens.
Anyways have a great night and I am glad I could contribute and help.
Matthew Boley
-
Thanks, Matthew, for the response. That is exactly what I was looking for. It seems that all on-page optimization now needs to look 100% natural, with the rest of the focus on off-page, etc.
-
Hello,
I would make a natural-looking page and make sure your main/broad keywords are optimized at or around a 1.0%-1.5% density to be safe. I would personally just get rid of the tags, because they look unnatural and that's a very old method that was used last year and for many years before that.
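If you want to check where you stand, here's a quick throwaway script (my own rough sketch, nothing official) that estimates a phrase's density as its share of the page's total word count:

```python
import re

def keyword_density(text, keyword):
    """Rough keyword density: occurrences of the phrase, times its
    word count, divided by the total word count of the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    return 100.0 * hits * n / len(words)

text = "We are local movers. Our movers handle local moves across the city."
print(round(keyword_density(text, "movers"), 1))  # prints 16.7
```

This is only a rough check; different tools count words and phrases differently, so treat the number as a ballpark.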
A good rule of thumb for SEO is: if it looks unnatural, it is going to count as spam to Google at some point. So do exactly as they want you to and build a nice-looking website for the visitor. Then, after you have built it, focus on off-page SEO tactics, social media marketing, and local SEO if you're targeting a local audience.
Now, if you're writing a blog post, it's OK to have 1 or 2 tags per post; that isn't a problem at all. But if you have 15 on one page, that is spam.
For the off-page SEO, find sites relevant to yours, or to your niche, then analyze them with the site explorer. If a site has a good rating, rank, authority, etc., then link to it. If it isn't a relevant, quality site, don't link to it.
Have a great day and happy holidays. If you have any other questions, I will be happy to try and help you out.
Matthew Boley