Posts made by evolvingSEO
-
RE: Yoast WP Plugin - Social - G+ Author
Yes, exactly. Set the author of each post in the post editor, and be sure that author has their G+ profile link in their user settings.
-
RE: Yoast WP Plugin - Social - G+ Author
Hi Dan
I would select "don't show" for the blog home page author. Authorship is really only for individual posts and articles.
-
RE: My site disappeared from google search...
Hmm, I thought the lazy load might have been an issue, but all the content is showing in the text-only cache: http://webcache.googleusercontent.com/search?q=cache:https://www.claydip.com/airbnb.html&strip=1
Claydip - try doing a fetch and render in Google Webmaster Tools: http://googlewebmastercentral.blogspot.com/2014/05/rendering-pages-with-fetch-as-google.html - and this will show you exactly what Googlebot sees for content.
One thing I can tell you too - I think your page could use some CRO. For example the "call to action" at the bottom "check out our demo" is barely noticeable.
Comparing your page to the top ranked one: http://www.cogzidel.com/airbnb-clone/ - they have
- testimonials
- pricing
- user generated content (comments)
- screenshots
- really easy-to-see buttons for calls to action
- easy to find contact info
Does your page convert well? Try to make it a really great landing page with a good conversion rate (do you drive PPC traffic to it?).
-
RE: Yoast WP Plugin - Social - G+ Author
Hi There - Donna is correct. This is for Publisher and a company G+ page, not an individual author page. Instead, to enter your personal G+ page, try going to Users -> Your Profile and adding your G+ URL. Yoast should then add the author URL in the <head>, as he describes here: https://yoast.com/push-rel-author-head/
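For reference, the tag Yoast then outputs in the <head> looks something like this (placeholder profile URL):
<link rel="author" href="https://plus.google.com/YOUR-PROFILE-ID/" />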
-
RE: 429 Errors?
I highly doubt this error would have anything to do with that. I would also recommend cross-checking those rankings with a third-party tool like Authority Labs - or you can look at your average position in Google Webmaster Tools. Moz runs rankings once a week, and sometimes it might happen to pick up on a temporary fluctuation. So I'd confirm the ranking drop before deciding what to do next.
-
RE: 429 Errors?
Sounds like probably the same issue wcbuckner describes - if it's a problem in any way I would contact GoDaddy about it and see what they have to say.
-
RE: Long title problem
I can point you towards the best places online to find WordPress developers:
- https://clarity.fm/browse/technology/wordpress
- https://www.odesk.com/o/profiles/browse/skill/wordpress/
- http://premium.wpmudev.org/blog/find-wordpress-developer-designer/
Try those!
-
RE: 404's - Do they impact search ranking/how do we get rid of them?
Hi
As far as I know there is no way to do this in Webmaster Tools. You can test your robots.txt file with the Robots.txt Tester - but you need to actually update the real file to block URLs from being crawled.
At any rate, normally you would not block 404s from being crawled - Google will either stop crawling them on its own, or, if they are indexed, letting them be crawled allows them to drop out of the index.
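For reference, if you ever do need to block something, a robots.txt rule is just a Disallow line (hypothetical path shown):
User-agent: *
Disallow: /some-old-section/
Just remember blocking only stops crawling - it doesn't remove URLs that are already indexed.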
-
RE: 404's - Do they impact search ranking/how do we get rid of them?
What do you mean by "submit links to Google Webmaster Tools"? As far as I know there isn't a way to submit 404 URLs in there.
The ways to solve a 404 are:
- make the URL a real page again (if it broke by accident)
- remove links pointing at the bad page
- 301 redirect the 404 page to one that works (see the sketch below)
- you can opt to leave it alone if there was nothing important on that page and there is no good page to redirect it to
404s might hurt rankings, but only in extreme cases where it was a popular page and now you're losing the backlink value or referral traffic etc. I'd say in 90 out of 100 cases 404s will not hurt your rankings.
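If you go the redirect route on an Apache server, it's one line in .htaccess per URL - here's a generic sketch with placeholder paths:
Redirect 301 /old-broken-page/ http://www.example.com/working-page/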
-
RE: Google is showing 404 error. What should I do?
Hey There
Are you all set? One thing to consider is that browsers sometimes see things differently than search crawlers. Try a header checker like http://urivalet.com/ and make sure to test different user agents - like Chrome, Googlebot etc. Check the response code for each user agent. Maybe your browser sees a 200 but Googlebot sees a 404.
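If you'd rather script the check than use a web tool, here's a quick sketch in Python with the requests library (the URL and user agent strings are just examples):
import requests

url = "http://www.example.com/your-page/"  # the URL you're testing
user_agents = {
    "Chrome": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 Chrome/35.0 Safari/537.36",
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}
for name, ua in user_agents.items():
    # allow_redirects=False shows the raw response code, not the final hop
    response = requests.get(url, headers={"User-Agent": ua}, allow_redirects=False)
    print(name, response.status_code)
If the two numbers differ (say 200 vs 404), you've found your problem.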
-
RE: 404's - Do they impact search ranking/how do we get rid of them?
Hey There
Google's webmaster documentation says:
"Generally, 404 errors don’t impact your site’s ranking in Google, and you can safely ignore them."
When Google says "generally" this tends to mean "in most cases" or "not directly" or "there may be secondary effects"... you get the idea.
But I think they are assuming you need to be smart enough to know if the 404 was intentional, and if not, why it happened. For example - if you had a really popular piece of content with backlinks directly to that URL, and then the URL 404s - you may lose the "link juice" pointing into that article. So in that regard 404s can hurt rankings secondarily.
But as others have said, you can redirect your 404s to a similar page (Google recommends not the homepage).
I am not sure why the Moz report puts them in "high priority" - perhaps they mean "high priority" from a general web best practice point of view, and not strictly SEO.
-
RE: Long title problem
I just wanted to clarify that the SEO plugin has nothing to do with this, and turning All in One on/off will probably not fix anything.
Either you have the free version of Screaming Frog, which limits crawls to 500 URLs, or you may need to adjust your crawl settings - my crawl was definitely heading towards the 57k.
-
RE: Long title problem
The root of your issue is that there are links that are coded incorrectly
--> http://screencast.com/t/ndeKw3PL
which is resulting in infinite crawling of pages that do not really exist, and thus the same duplicate/long title tags.
For example this page is a good URL: http://northstarpad.com/category/business-portrait-metro-detroit/
But as shown in my screenshot the "Pet Photography" image links to: http://northstarpad.com/category/business-portrait-metro-detroit/pet-photography// which is a bad URL and NOT http://northstarpad.com/pet-photography/ which is where it should link.
Essentially your links should be "absolute" URLs (which include the full path), not "relative" ones
--> http://screencast.com/t/koL5QX9B
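To illustrate with the "Pet Photography" link (simplified markup, not your theme's actual code):
<!-- relative - resolves against whatever page it appears on, producing the bad nested URLs -->
<a href="pet-photography/">Pet Photography</a>
<!-- absolute - always points at the intended page -->
<a href="http://northstarpad.com/pet-photography/">Pet Photography</a>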
You'll need to pass this to a web dev who knows how to edit your WordPress theme files.
-
RE: Google indexing staging / development site that is redirected...
Hi There
1. It could still be in the index because they are 302 redirects and not 301s. A 302 is temporary, and therefore Google may not de-index those URLs. It also takes time - I've seen Google take months to de-index redirecting URLs. Also, make sure you are not blocking crawling of the dev site, or Google will not see the redirects.
2. I am not sure how they got there to begin with. I can pretty much always find some sort of error - maybe someone tweeted a staging URL, maybe crawling wasn't blocked, maybe there was one link to staging from the live site, etc. Regardless - somehow Google crawled it. To prevent this in the future, always block crawling of staging servers well before you ever put anything on them.
3. Usually Google tries to sort this out. They won't give you a penalty for "technical" duplicate content (penalties are more for "malicious" duplicate content, i.e. stealing people's content). So you won't get penalized, but the more you can help Google out by sorting it out, the more time Google can spend crawling the correct site etc.
What I would do now: if you do want the staging URLs to redirect (which might not be the best solution if you ever want to go back and work on the staging server again), use 301 redirects and make sure you are allowing crawling of the staging site. Keep it registered in webmaster tools so you can monitor the indexation levels.
-
RE: When I change Template, why does traffic go down?
I'd try to get really specific on two things:
- "traffic" - are you saying only Google traffic? All traffic?
- if it IS Google traffic, did you notice whether Google has actually crawled and re-cached the new version of the website?
- also, does traffic decrease evenly across all pages on the site? Or is it limited to just certain pages or sections?
- "changing themes" - are these theme changes only design/skin changes, or do they affect URLs, internal links, and content (such as page titles/descriptions etc.)?
I think answering these questions really matters in terms of figuring out what happened. I think it's very unlikely for just a surface skin/design change to hurt traffic so much - there must be deeper things going on, and it's just a matter of figuring out what the specifics are.
-
RE: At a bit of a loss as to poor rankings
I think there are generally three areas of weakness for this site at the moment:
- Brand Strength - searching google.co.uk for "superted" or "super ted", the site is nowhere on the SERPs; instead you get stuff about a TV show --> http://screencast.com/t/Am735xTr0 - and so if you're going to have the same brand name as another thing out there (even if you came before the TV show), the fact you are not ranking for your own brand means Google does not weigh the "brand" very strongly. To build this up you need:
- lots of strong profiles on the web (Google Plus, social, etc) all clearly connecting back to your site
- recent mentions in reputable publications
- people searching for your brand name or branded terms and clicking on your site in the SERPs
- (sidenote: I just realized a while later the "brand" name might also be "the entertainment directory" and you do show one result in the SERP for that --> http://screencast.com/t/hy2wi88VFQtH - but this also seems weak and unclear.)
- Keyword Targeting / Architecture - I know it would be desirable to rank for "wedding entertainment", but it's really hard to even find a page solely dedicated to "wedding entertainment" that answers that specific query for the user really well. Instead, you have to dig many clicks in to even find such a page, and when you do, it's "wedding bands" not "wedding entertainment" --> http://www.superted.com/profiles.php/bands-musicians/wedding-bands - what I would do is some really thorough keyword research and align it with architecture. This is a pretty involved process; you can't fix it overnight, but in short:
- You'd first do very in-depth keyword research - my intuition tells me a site of this magnitude would result in probably 10,000 keywords minimum.
- Then you'd prioritize these by many metrics - search volume, current ranking, conversion potential, competition, etc.
- Then here's the key - you'd look at top keywords that do not have a singular page focusing on answering that user's query better than any other page on the net
- You'd have to work through a process of aligning keyword opportunities with best site architecture - this may mean building new pages, this may mean removing pages that don't provide value for you or users.
- User Experience - When I land on the homepage it is a terrifying experience. There are so many options, and it is not clear what direction to go in.
- The "hamburger" menu option is probably not clear that it's even there to most people. When I do click on it, there options are overwhelming and clicking on an initial option just brings us to another set of options.
- Then trying the search bar, I expected a normal search result - like a list of "wedding bands in ___" - but instead I just got a form to fill out, which is surprising and doesn't seem immediately helpful.
- When you click "play" on the video in the slider, the video starts playing but then the slider also starts moving so not only can I not seem to watch the video, I can't stop it from playing once the slider kicks in.
A common oversight I see is that the site is organized from the site owner's perspective, not the shopper's perspective. In other words, your basic categories are "bands", "look alikes" etc. - but a user would start with "I need entertainment for a wedding/party/event..." - the user starts with the need (X event, situation etc.) and would like to see options - so I would consider organizing the core structure of the site around "wedding entertainment", "party entertainment", "corporate event entertainment" etc. - think of the categories of events and needs the user has, and show them the listings from that perspective.
To answer your questions:
1. I believe the comparison numbers for "total links" include internal links - of which there are a lot.
2. I don't think anchor text is as much of an issue as the other things mentioned above. The sitemap is also not the biggest issue in my opinion. If I were to boil it down it would be site architecture / keyword alignment coupled with user experience.
-
RE: Losing Page Rank
First off, I'm not sure I'd lose sleep over some shuffling around of just 5-10 positions. These are going to move around a LOT day to day and week to week. I'd worry if you completely dropped off to pages below where you were.
But pro-actively speaking, obviously you want to improve. The trick here is "Flour Sack Towels" is a really crowded and competitive SERP. It's not only competitive to rank there, but insanely competitive to get clicks.
Instead, I'd maybe use something like http://keywordtool.io/ and get some more specific searches off of "flour sack towels" - look for more specific searches where you know you offer one of the best solutions and focus on some more niche rankings.
Beyond that I'd zoom out and look at some more global site wide things you can do to improve. The thing that strikes me is that the site appears somewhat generic. I'd really work on upgrading your branding - you want users (especially new ones) to feel really comfortable giving you their credit card information to buy stuff.
- Put some real people on your about page: http://www.acshomeandwork.com/about-acs/
- Fix the lack of padding in product descriptions: http://screencast.com/t/ZOUStle7paT
- Fix the spacing issues here: http://screencast.com/t/dtqiUDv3ZW
- The whole top header area I think could be improved: http://screencast.com/t/1NPxiotb56 - I'm not a designer of course, but I can tell you my very first look at the site gave me a feeling like "this site does not look professional". I think the search bar is kind of big, the login area icons etc. are funny. I bet if you upgraded your header it would cast a much better impression for users - which will trickle down to SEO because of user metrics, brand memorability etc.
-
RE: 429 Errors?
What exactly is happening, the same 429 errors? Does wcbuckner's response explain it for you?
-
RE: Why did the root domain of my website go down?
Hi There
We need to know the number of _______ for the root domain? The number of links? The Page Authority of the root domain? And do you mean it has not increased in comparison to itself, or in comparison to the rest of the site?
-
RE: Question concerning a freelance article link
Hi There
I would look at the page as a whole - and the anchors coming to it - http://moz.com/researchtools/ose/links?site=http%3A%2F%2Fwww.familyfootwearcenter.com%2FWolverine-Boots-c38.html
The reason it may be an issue is that in my 10 seconds of eyeballing that report I can sense a pattern. And that's exactly what Google is looking for: a pattern of "link building" activities to try and boost that page for "wolverine boots". The problem is the page doesn't appear to have many (or any) natural links or natural anchor text. So I'd work on the page as a whole to reduce unnatural links into it. You can also dig deeper with Webmaster Tools to get more in depth than OSE will show.
-
RE: No description on Google/Yahoo/Bing, updated robots.txt - what is the turnaround time or next step for visible results?
Hi There
It seems like there are some other issues tangled up in this.
- First off it looks like some non-www URLs indexed in Google are 301 redirecting to www but then 404'ing. It's good they redirect to www, but they should end up on active pages.
- The NON-www homepage is the one showing the robots.txt message. This should hopefully resolve in a week or two, when Google re-crawls the NON-www URL and sees the 301 - the actual solution is getting the non-www URL out of the index and having them rank the www homepage instead. The www homepage description shows up just fine.
- You may want to register the non-www version of the domain in webmaster tools, and make sure to clean up any errors that pop up there as well.
-
RE: Webmaster Tools vs. Google Trends data doesn't add up
You're welcome! One last suggestion I'd make is if you have the budget, to augment Moz rankings with daily rank tracking. Weekly might be enough, but if you want to turn it up a notch try Authority Labs. I find them accurate and robust for daily rank tracking.
-
RE: I am doing the On-page SEO for a website that's never had any SEO done before. I will start with the Pages. Is it necessary to do SEO/Keywords for older Posts?
I pretty much agree with the comments here. What this is basically called is a "content audit" - a process by which you go through old existing content, pull together data, and assign actions to each piece of content.
Here are some guides on how to perform a content audit:
- http://www.quicksprout.com/2014/04/24/how-to-conduct-a-content-audit-on-your-site/
- http://moz.com/blog/content-audit-tutorial
- https://www.distilled.net/blog/seo/how-to-perform-a-content-audit/
The end goal is to determine which content you might;
- Remove
- Update
- Noindex
- Consolidate
- Repurpose
- ...etc
Then you can assign keywords to the content you have left. But I would do the content audit first and then assign keywords, because otherwise you'd be assigning keywords to content you might not even keep.
-
RE: Webmaster Tools vs. Google Trends data doesn't add up
Also, you can try to see if SEMrush has any past ranking data - depending on the volumes, you might find something: http://www.semrush.com/
-
RE: Webmaster Tools vs. Google Trends data doesn't add up
I wouldn't call it inaccurate, it just operates a bit differently. For example:
- It only counts when a page ranks - the "avg position" is only recorded when someone actually searched and the page ranked somewhere - it does not calculate a hypothetical ranking
- It's averaged over time periods - when you are looking at average ranking, the time period can muddy numbers. Shifts in actual ranking over 2-3 months might show an "average" position of 5, but the actual position could have been 5, 2, 7, 15, 1, 6, etc. over time
- It averages logged in, logged out, Search Plus Your World etc. - rank checkers give a constant number based upon trying to de-personalize. But WMT is averaging personalized rankings, G+ "Search Plus Your World", localized rankings etc. - which is a muddy number as well.
In short - rank trackers give you a steady ranking by eliminating changing variables. I'm sure WMT is accurate, but there are a lot more moving parts, so it's subject to these things and should not be read as "rankings" but rather as an actual look at where you happened to rank when a page did show up in someone's search.
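A quick made-up illustration of how the averaging muddies things, in Python:
# hypothetical positions recorded for individual searches over a period
positions = [5, 2, 7, 15, 1]
print(sum(positions) / len(positions))  # 6.0 - yet the page never actually sat at position 6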
-
RE: Joomla to Wordpress site migration - thousands of 404s
Hi There
Generally those types of 404s won't be too harmful - they sound like they may have been somewhat artificial WordPress pages.
What I would do is get your list now from Analytics or Webmaster Tools - this way you will capture URLs that actually got traffic or impressions in Google, and redirect those.
So run a landing pages report in Analytics, and a top pages report in Webmaster Tools - maybe for the last 6 months. Create a text file of all the URLs, and run them in list mode through Screaming Frog. Redirect any that 404.
If you were to go back in time, what I would have done with Screaming Frog is let it crawl everything - you'd have to allow it to "follow redirects" and "ignore robots.txt" etc. - I know Google is not supposed to crawl anything in robots.txt, but basically you'd be letting Screaming Frog get to everything, so that you don't miss any URLs.
-
RE: Webmaster Tools vs. Google Trends data doesn't add up
I think your first step should be to verify rankings with another tool. Hopefully you've tracked rankings somewhere else? Do you see any changes there?
-
RE: Since two years I lost place in google search
Hi There
It certainly looks like there's an issue with links. Here are a few resources on cleaning them up:
- http://searchengineland.com/five-steps-to-clean-up-your-links-like-a-techie-166888
- http://moz.com/blog/google-webmaster-tools-just-got-a-lot-more-important-for-link-discovery-and-cleanup
- http://www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/
- http://savvypanda.com/blog/guide-how-to-use-google-disavow-tool.html
Essentially you need to remove and/or disavow as many bad links as you can.
I'm also going to guess that you could use a design upgrade. I don't have any data to back this up, but it's just my opinion looking at the site. A poor design could also bring Panda into play. I've seen some sites get a bump after a design upgrade.
-
RE: I am having an issue with my rankings
Just looking at the anchor text in your backlink profile alone gives clues as to why it's hard to rank - http://moz.com/researchtools/ose/anchors?page=1&site=http%3A%2F%2Fwww.mesocare.org%2F - that is, the anchors are mainly commercial keywords that one would try to rank for. This is exactly the sort of link building that is not working as much anymore, and will even get you penalized in extreme cases.
You have links on such pages as this: http://blogs.creighton.edu/klb89788/2012/08/31/hello-world/ and this http://wsn.eecs.berkeley.edu/?p=56 etc etc - which are not going to help and will likely only hurt the site in rankings.
Unfortunately this will take quite a bit of effort and work to overcome. You'll really have to show Google over time (6-12 months at least) some more quality signals - not only links but user metrics etc as well.
-
RE: I am having an issue with my rankings
Google's willingness to rank certain sites (especially affiliate sites, and especially in health-related fields) has changed a lot, and ranking has gotten much harder in recent years. The same tactics that worked in 2011/2012 won't work now. I'd say it was probably a little too artificially easy to rank those sites in the past, and now might be more a reflection of reality.
(I'll add more in another response)
-
RE: Spotted Hidden Omiod Links in Footer - What do you think is Going on Here?
Hi There
It's a little tricky to diagnose without seeing the full context. But in general I'd say if the code is serving no purpose that you can tell - it doesn't ever show to the user (check all devices/screen sizes, all page types, etc.) - then you can remove the code. I don't think the code is very bad though, in the grand scheme of things. I've seen far worse blatant links placed within themes and designs. This one doesn't seem malicious, although I'm not entirely sure what it's for out of context.
-
RE: Huge Dip in Traffic Last Week - New Algo Update?
Hey There
A few recommendations:
- Your robots.txt may be blocking CSS/JS - http://www.consumerbase.com/robots.txt - which is now something Google recommends NOT blocking (see the robots.txt sketch at the end of this answer). See this article for backstory and about the fetch and render tool in webmaster tools.
- How long has the homepage redirected to /index.html? Google says they actually sort this out for you, and it doesn't quite cause the harm some people assume - but it can't hurt to use the normal domain for the homepage.
- This one returns a 404: http://www.consumerbase.com/index.php
- One troubling issue is that bad URLs don't return a 404; they 301 to the homepage: http://www.consumerbase.com/bad - I'd highly consider using a standard 404 not found response code with an error message.
- This ALSO makes it impossible to run a proper crawl test on the site to check for bad pages. It's possible old pages are not being 301 redirected to the right new pages when they break, and you're losing traffic because Google won't pass value if the redirects don't match in content (more info here)
Did you change URLs with the redesign? It doesn't look like it, but just checking. You may want to export a list of trafficked pages prior to the migration from analytics, and run a crawl test on them. As said above, this is almost impossible to test properly right now because bad URLs 301, and aren't being picked up as 404s.
In terms of next steps: you have to get really granular. Segment, segment, segment - and isolate things until you find a "smoking gun" (which you don't always find, but you should look for).
The redesign date seems suspect - and your increase in traffic could have been in the lag time Google has from crawling and caching the updated site. It's just my hunch about the redesign - I'd look thoroughly everywhere.
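On the robots.txt point above - the paths here are hypothetical, but this is the kind of rule to look for and remove (or override with Allow) so Google can render the page:
User-agent: *
Disallow: /css/   # blocks stylesheets from being crawled - avoid this
Disallow: /js/    # blocks scripts from being crawled - avoid this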
-
RE: Huge Dip in Traffic Last Week - New Algo Update?
To my knowledge this was just speculation by the author "reading between the lines" and there was no such update.
-
RE: Webmaster Tools HTML Improvements Page Blank / Site Not Ranking Well
Hi Sean
In my experience there are always many differences in the crawl reports from Moz and Webmaster Tools. Bear in mind, they are all just computer programs running automatically to detect "issues" - and that a human eye is the best tool at the end of the day.
To step back a little, your overall ranking will likely have very little to do with these sorts of things (titles, descriptions, etc). They can have a small effect but will usually not move the needle much. So just keep that in perspective when fixing things.
In general, I would aim to have your titles and descriptions unique, within length guidelines, compelling for users and so on. Google has a great guide here: https://support.google.com/webmasters/answer/35624?hl=en
There can sometimes be a delay in what Google shows you in WMT as well. They might not have crawled everything as recently as Moz. I do tend to find Moz will check everything, whereas WMT will only alert you to the pages they find problematic. (For example, you might have pages with duplicate titles, but Google has determined those pages are not important - a quality check, if you will - whereas Moz doesn't make this qualitative check; they just crawl and rate everything.)
In order of priority I would fix WMT issues first and then move to Moz issues. But again, your overall ranking is likely more due to site authority & trust as measured by links and usage.
-
RE: WordPress and Redirects
Pages load correctly at: http://www.waikoloavacationrentals.com/kolea-rentals/9g.html
And do not load at: http://www.waikoloavacationrentals.com/kolea-rentals/9g
So you need to Redirect http://www.waikoloavacationrentals.com/kolea-rentals/9g TO http://www.waikoloavacationrentals.com/kolea-rentals/9g.html
You can do this with this line of code in .htaccess:
Redirect 301 /kolea-rentals/9g http://www.waikoloavacationrentals.com/kolea-rentals/9g.html
Let us know if that works.
You can also try a redirect plugin, although personally I prefer using .htaccess for redirects.
-
RE: Weird rankings on my website, can't figure it out
I just wanted to point out a few issues I saw right away:
- Internal linking might be playing a role. The site is not linking to these hacks pages right from the homepage; they are down a second level via the "download hacks" link. Yet the forum is linked to universally across the main navigation.
- In fact the forum comes out on top for most internal links --> http://screencast.com/t/MYjf3dNzZGE
- Bear in mind Google will treat privacy pages and sitemap internal links differently, because they know these are standard pages linked to universally - so you don't need to go removing links to those - but you should think about overall site architecture in relation to your content pages.
- The top YouTube video won't play - it says: "This Video has Been Removed as a Violation of YouTube's policy against spam, scams, and commercially deceptive content". That can't be good for the page's quality score.
- All the comments below are unanswered. This is your opportunity to engage with visitors, show some extra and helpful content etc. - I would respond to some of the comments.
- I see 225,000 pages indexed on the entire root domain. Are there really this many pages with actual content that people are looking at and using on a regular basis? Maybe it's a good idea to do a content audit on the site and remove or consolidate low quality and/or extra pages.
- I am not familiar with this industry and type of site - but Google seems to be trying to rank forum pages for other domains as well, and only a few "landing pages" - as these landing pages just try to get the visitor to register, and only after they register can they get access to the content. Google is reluctant to rank pages on which the user can not easily complete the desired action. Maybe this industry is out of that norm?
- Make sure you are not blocking CSS and JS from being crawled in robots.txt - http://www.ilikecheats.com/robots.txt - there's a lot of stuff in your file there, and Google now prefers to crawl these files, so I would check you're not blocking that. Use the new fetch and render tool in webmaster tools.
- You've got an infinite crawl loop happening --> http://screencast.com/t/hNekO6fx3A - I found this with the Screaming Frog SEO Spider. Something you'll definitely want to resolve.
- Lots of pages link internally via 301 redirects, which can also hurt crawl efficiency.
- Did you intend to make this page a "post" and not a "page"? http://www.ilikecheats.com/01/rust-cheats-hacks-aimbot/ - see my video about the differences - either way you should be updating this page, and fixing issues as they arise, to make sure users are happily finding what they need on it.
It doesn't seem like there is any one silver bullet thing to fix, but it seems like Google might be having trouble figuring out which page to rank for certain queries - so perhaps the architecture, keyword targeting etc. is not clear enough.
I also suspect the forum may be more trusted and authoritative due to user metrics. It's likely users are visiting the forum more often, staying longer and engaging due to the nature of the forum. But you can certainly help Google out by clarifying the site structure and page targeting a bit better.
-
RE: Blog tags are creating excessive duplicate content...should we use rel canonicals or 301 redirects?
The easiest way to resolve issues with tags is to noindex them. I wrote a post about how you can safely do this: http://www.evolvingseo.com/2012/08/10/clean-sweep-yo-tag-archives-now (you basically just double check to see if they are receiving traffic, and leave the few that receive traffic via search indexed).
But at the root level it comes down to knowing how to use tags correctly on a blogging platform to begin with - and knowing how they function, and what happens when you tag something.
First off, tagging any post creates a new page called a "tag archive". The only way someone can get to tag archives by default is if you allow some sort of navigation or links to them on the site itself. This is usually in the form of a "tag cloud" (sidebar or footer) or at the bottom of posts when it says "tagged in....." and links to the tags.
Then if they are internally linked to, they will get indexed (unless you noindex them like I have suggested above). They are typically low to no-value pages because most bloggers just tag everything, and use lots of tags per post. Then you end up with hundreds of pages (tag archives) with no value.
So noindexing them is the safest way to go, except for very extreme cases where a blogger uses them 100% perfectly (which is rare, so I always assume most people asking should just noindex, but use my post to check for traffic to any of them first).
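For reference, a noindexed tag archive just ends up with this in its <head> - Yoast and similar plugins output it for you when you set tag archives to noindex:
<meta name="robots" content="noindex,follow" />
The "follow" part means links on the archive still get crawled, which is what you want.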
-
RE: Blog tags are creating excessive duplicate content...should we use rel canonicals or 301 redirects?
Thanks for chiming in! Just to reiterate something - canonical tags are only a suggestion, not a hard directive. Google can and does ignore them. The canonical tag can also pass noindexing directives to the page you point it at. So with tag archives, if they are set to noindex and you canonical them to posts, you might deindex your posts.
And finally, canonical should only be used for things that can't be solved via indexation, crawling, or architecture solutions. In the case of tags in a blogging system (probably WordPress) the easiest and 100% definite way to handle tags is just to noindex them. Then you don't need to worry about canonicals or duplicate content.
Also, tags are not harmful because of duplicate content per se - it's just that they add a lot of unneeded pages to the index.
-
RE: Wordpress rel next & previous for SEO
Hi There
I also do not know how this is done at the coding level in WordPress. The Yoast plugin handles it automatically - that's probably the least painful way to get it implemented!
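For reference, the output ends up being two link tags in the <head> of each paginated page, something like this (placeholder URLs):
<link rel="prev" href="http://www.example.com/blog/page/2/" />
<link rel="next" href="http://www.example.com/blog/page/4/" />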
-
RE: How to properly use h1, h2 and h3 tags on your website.
The above answers are spot on. Have one H1 per page, and that H1 should be unique and reflect the main heading/title.
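To make that concrete, here's a generic example of a sensible heading outline:
<h1>Main topic of the page</h1>  <!-- just one of these per page -->
<h2>First major section</h2>
<h3>Subsection within that section</h3>
<h2>Second major section</h2>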
I just wanted to add this great article by Bill Slawski - he goes into really great depth about their best usage and importance: http://www.seobythesea.com/2012/01/heading-elements-and-the-folly-of-seo-expert-ranking-lists/
-
RE: Webmaster Tools says that Structured Data is missing (author and updated)
Hi Robin
Are you all set with this? Martijn has the right idea. You can add these as attributes to your markup to make the errors go away and get the implementation perfect. It's probably not hurting SEO though.
You can find the documentation for hentry here: http://microformats.org/wiki/hentry
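A minimal sketch of the hentry markup WMT is looking for - the class names are the microformat standard, everything else is placeholder:
<article class="hentry">
  <h2 class="entry-title">Post Title</h2>
  <span class="author vcard"><a class="fn" href="http://www.example.com/about/">Author Name</a></span>
  <time class="updated" datetime="2014-08-15">August 15, 2014</time>
  <div class="entry-content">Post content...</div>
</article>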
-
RE: 404 Errors in WMT
Thanks for the info via direct message. As far as I know, those /feed/ URLs should not return 404s. I checked my site, for example:
http://www.evolvingseo.com/2014/08/15/hiring-evolver-number-one/feed/ - and that returns a 200 OK.
I am not sure why WordPress would be doing this to be honest. Do you have a developer working with you? Or if it's a Theme you could contact the theme vendor about it.
-
RE: 404 Errors in WMT
Hi There
As mentioned above, it would be optimal to see an example - or if you can't share the site, just a generic example. It may be that WordPress is adding feed URLs where they don't need to be, so we'd need to take a good look.
-
RE: I need an XML sitemap expert for 5 minutes!
A few rules about sitemaps:
- You should only include pages you also want crawled and indexed
- They should not contain URLs that 404 or that are blocked by robots.txt
My guess is there are too many URLs in the sitemaps, since I'd guess the website does not have over 2 million actual "real" pages.
Also, I randomly clicked on a URL in one of the sitemaps and it 404'd:
http://www.eumom.ie/forums/topic/oakhill-school-leopardstown-/
This is probably causing a lot of the errors you see. It's honestly not a 5 minute fix - but if it were my site, I would be using the Yoast SEO plugin and its sitemap feature. It makes it very easy to include/exclude certain pages, and it updates automatically etc.
I think there must be a way to tell your plugin what to include / exclude from the sitemap but I don't have as much experience with it.
But generally - only include pages you want crawled and indexed. Don't include pages that 404.
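For reference, each sitemap entry is just a <url> block like this (placeholder URL) - every <loc> should be a live, indexable page:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/a-live-page/</loc>
    <lastmod>2014-08-01</lastmod>
  </url>
</urlset>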
-
RE: Should publish as page or blog posts on Wordpress ?
The navigation doesn't have so much to do with it being a page or a post. You can place any page in the menu by going to Appearance -> Menus and dragging and dropping.
Google does not look at pages vs posts differently - it's generally a content-type decision, not so much an SEO decision.
-Dan