Posts made by evolvingSEO
-
RE: Should publish as page or blog posts on Wordpress?
I'd publish this as a page. Full explanation here: https://www.youtube.com/watch?v=2RrcUKaiAc4
-
RE: Do H2 tags carry more weight than h4 tags?
Hi There
It's questionable how much weight H tags carry at all. If they carry any, it's probably very small, or relative to how easy the keyword and industry are. H1s may help a bit more than the rest, but once you start comparing H2s vs. H4s you're splitting really thin hairs.
This is the best resource on H tags and SEO I know of: http://www.seobythesea.com/2012/01/heading-elements-and-the-folly-of-seo-expert-ranking-lists/
-
RE: Questions Regarding Wordpress Blog Format, Categories and Tag pages...
Hi There
To answer your first question about URL structure, I would see slides 12-15 in this Mozinar I did: http://www.slideshare.net/evolvingseo/hands-onwpseodanshure (you can also watch the whole Mozinar here).
In short: if your site is mainly a blog, it's OK to use site.com/blog-post, but if your blog lives within a larger site you probably want site.com/blog/post-name.
You will want to 301 redirect the old URLs to the new ones. I think it's worth it.
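If it helps, here's a minimal Python sketch for spot-checking the redirects once they're live - the URLs and the old-to-new mapping are just placeholders for your own list:

import requests

# Hypothetical mapping of old URLs to the new /blog/ structure - swap in your own.
redirect_map = {
    "http://example.com/my-first-post": "http://example.com/blog/my-first-post",
    "http://example.com/another-post": "http://example.com/blog/another-post",
}

for old_url, expected in redirect_map.items():
    # Follow redirects so we can see both the first hop and the final destination.
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else None
    ok = first_hop == 301 and resp.url.rstrip("/") == expected.rstrip("/")
    print(old_url, "->", resp.url, "| first hop:", first_hop, "| OK" if ok else "| CHECK THIS")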
For many answers about tags etc. you can see my post about WordPress SEO here, but in general:
- categories - index
- tags - noindex
- author archives - noindex for single author blog, index for multi-author with customized author pages
- dated archives - noindex
- subpages of archives - noindex
You're not noindexing to avoid duplicate content; you're noindexing to avoid having too many pages indexed that don't need to be, and probably won't rank anyway.
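Once those settings are saved, you can spot-check that the right archives actually carry a noindex tag. A rough Python sketch - the example URLs are placeholders for your own category/tag/date archives:

import re
import requests

# Placeholder archive URLs - substitute real ones from your site.
urls = [
    "http://example.com/category/news/",
    "http://example.com/tag/widgets/",
    "http://example.com/2014/05/",
]

# Rough pattern; assumes the meta tag is written name-first, content-second.
meta_robots = re.compile(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', re.I)

for url in urls:
    html = requests.get(url, timeout=10).text
    match = meta_robots.search(html)
    # No robots meta tag generally means the page defaults to index,follow.
    print(url, "->", match.group(1) if match else "no robots meta tag (defaults to index)")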
-
RE: Robots.txt on refinements
Hi There
In general you probably don't need to do that. Here's how I would normally deal with indexation in WordPress (assuming you're using WordPress):
- Categories - index
- Tags - noindex
- Date archives - noindex
- Author (single author blogs) - noindex
- Author (multi-author) - index
- Subpages - noindex
Basically all these settings are shown in my post here on setting up WordPress: http://moz.com/blog/setup-wordpress-for-seo-success
Yoast is the best plugin to do all this with!
-
RE: How can I tell if I am using Universal Analytics?
ThompsonPaul is right - activating Universal does not mean the code has been changed on your website, which is why you'll still see the old _gaq.push code. See my separate answer/screenshot.
-
RE: How can I tell if I am using Universal Analytics?
When you go to Admin -> Tracking Info -> Tracking Code:
If Universal has been activated but the code has NOT been updated, you will see this:
http://screencast.com/t/FJuKuBDI <--click for screenshot.
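If you'd rather confirm from the live site than the Admin screen, you can also grep the page source for the two snippets. A rough Python sketch - it just looks for the classic _gaq.push call versus the Universal analytics.js loader on a placeholder homepage URL:

import requests

url = "http://example.com/"  # placeholder - use your own homepage
html = requests.get(url, timeout=10).text

# Classic (ga.js) uses the _gaq queue; Universal (analytics.js) uses the ga() function
# and loads analytics.js instead of ga.js.
has_classic = "_gaq.push" in html or "/ga.js" in html
has_universal = "analytics.js" in html and "ga('create'" in html.replace('"', "'")

print("Classic ga.js snippet found:", has_classic)
print("Universal analytics.js snippet found:", has_universal)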
-
RE: Page not ranking despite indicators showing should easily be mid-1st page?
"...slowly going downwards over the year"
This sounds like it could be related to user metrics, which are important ranking factors no tool is able to give us. It's possible users have shown a preference for other pages ranking for that query over time. Metrics could include:
- click through rate from SERPs
- bouncing back to SERPs ("pogo-sticking")
- time on page
- repeat visits to the page
- direct visits to the page
- completed actions on that page
- ...etc
Essentially, users might be telling Google they don't like the page based upon their behavior. Some suggestions:
- Watch Rand's video about how to improve Pogo Sticking
- Check out these articles on improving CTR in the SERPs (most of the top-ranked resources there are good)
- Set up goal tracking to get a conversion rate
- Try Justin Cutroni's scroll tracking to see how users are interacting with the content
- Try a heatmap program like http://www.crazyegg.com/
Without being able to see your page, it's tough to give custom recommendations, but I would also recommend making sure you have the best design of any site in your niche for your audience.
-
RE: Is Google suppressing a page from results - if so why?
Technically the disavow acts like a nofollow, so unless you think they might turn into "followed" at some point, you do not need to disavow them.
It can take 6+ months for a disavow to take effect too. So if it was submitted only recently, it might need some more time.
-
RE: Is Google suppressing a page from results - if so why?
Hi - I would recommend using webmaster tools in addition to Moz to check for backlinks. There are likely more links in there that OSE does not have.
What I usually do is pull the links from there, and crawl them with Screaming Frog (as some may be old and are now gone). There's a really good process for going through links here: http://www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/ - although it's for disavowing in the article, you can use the process to find bad links for any situation.
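If you don't have Screaming Frog handy, that first pass - weeding out linking pages that no longer resolve - can be roughed out in a few lines of Python. This assumes you've exported the links into a one-column CSV (the filename is a placeholder):

import csv
import requests

# Hypothetical export: one linking URL per row in the first column.
with open("wmt_links.csv", newline="") as f:
    urls = [row[0] for row in csv.reader(f) if row]

live, gone = [], []
for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    (live if status == 200 else gone).append((url, status))

print(len(live), "linking pages still live;", len(gone), "gone or erroring")
for url, status in gone:
    print("  dropped/erroring:", url, status)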
-
RE: Is Google suppressing a page from results - if so why?
It's possible, although I would definitely look into any followed links that are of low quality or over-optimized. The site may have just been over some sort of threshold, and you'd want to reel back that percentage.
-
RE: Is Google suppressing a page from results - if so why?
Hi - yes, you'd want to clean up the links to that page, ideally in this order of preference:
- Try to get exact anchors changed on pages where the link quality is ok, but the anchors are over optimized
- Try to get links removed entirely from low quality pages
- If #1 and #2 are not possible, then disavow the links.
- Ideally of course, you'd want to acquire some new trusted links to the page.
At minimum you'd want to see the page show up again for site: searches with the term in question. That to me would be a sign the filter being applied has been removed. I'm not sure how long this would take Google to do, it may depend on how successful you are at the steps above.
-
RE: Need to access robots.txt to block tags
Sajio is right on this one. The best thing to do with tags in general is noindex them. Then they won't show up in search results, and the duplicate titles won't even matter.
To noindex tags with Yoast, go to "SEO -> Titles & Meta -> Taxonomies" and check off the "noindex" box under Tags.
But, as Prestashop said, if you do ever need to edit the robots.txt in the future, Yoast allows you to do this without FTP access. You go to 'edit files'.
-
RE: Is Google suppressing a page from results - if so why?
This doesn't feel like an on-page thing to me. Perhaps it's the exact match anchor links from Press Releases? See the open site explorer report. For example, here is one page with such a link: http://www.prweb.com/releases/2013/8/prweb10974324.htm
Google's algo could have taken targeted action on that one page due to the exact match anchor backlinks, and because many are from press releases.
Have you checked webmaster tools for a partial manual penalty?
The suppression of this page when using site: searches could further indicate a link based issue.
-
RE: Re-Post: Unanswered - Loss of rankings due to hack. No manual penalty. Please advise.
I would check to see if the fake pages are still indexed or cached. So search the old fake URLs in Google, or use the cache: operator. Perhaps something was done to prevent Google from knowing to completely deindex these pages, like maybe they are blocked in robots.txt - Also look at crawl stats in webmaster tools. Does Google crawl a good portion of the pages on a daily basis?
Some of this delay might be Google waiting to make sure they can trust the site won't be hacked again. They are sometimes cautious in these situations. How long was the site hacked etc before it was fixed?
As Lantec said, part of this is just pushing forward - and those future links, shares, mentions, usage stats etc should help get the site back up there.
The last thing I'll mention is that perhaps there were prior issues with the site that fell below Google's radar - and maybe the hacking brought the site to their attention, so it's now being looked at with more scrutiny. It doesn't have to be a manual penalty; it could be an algorithmic penalty etc. So I would do a top-to-bottom examination of everything from this perspective.
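On that first point - checking whether the old fake URLs are blocked from crawling or still returning something other than a 404/410 - here's a rough Python sketch; the site and URL list are placeholders for the hacked URLs you know about:

import requests
from urllib.robotparser import RobotFileParser

site = "http://example.com"  # placeholder
fake_urls = [site + "/cheap-stuff-123", site + "/spammy-page-456"]  # placeholders

robots = RobotFileParser(site + "/robots.txt")
robots.read()

for url in fake_urls:
    status = requests.get(url, timeout=10).status_code
    crawlable = robots.can_fetch("Googlebot", url)
    # Ideally these return 404/410 and are NOT blocked, so Google can recrawl and drop them.
    print(url, "| status:", status, "| crawlable by Googlebot:", crawlable)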
-
RE: "HTTP error: 404 not found" submitting YOAST SITEMAP
If you do not have any tags or categories, you should exclude them from Yoast's XML sitemap. To do this:
- Go to SEO -> XML Sitemaps
- Check off Tags and Categories to exclude them from the sitemap.
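Once that's saved, you can double-check the sitemap index itself to make sure the tag/category sitemaps are gone. A small Python sketch - it assumes Yoast's usual sitemap_index.xml location, so adjust if yours differs:

import requests
import xml.etree.ElementTree as ET

index_url = "http://example.com/sitemap_index.xml"  # placeholder domain

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(index_url, timeout=10).content)

for loc in root.findall(".//sm:loc", ns):
    url = loc.text
    flag = "  <-- should be gone if excluded" if ("tag" in url or "category" in url) else ""
    print(url + flag)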
If this doesn't fix it, let us know - I can try to help further.
-Dan
-
RE: "HTTP error: 404 not found" submitting YOAST SITEMAP
I would noindex tags, but leave categories indexed. You can check my post for full details on setting up Yoast: http://moz.com/blog/setup-wordpress-for-seo-success - I'll answer the sitemap question separately.
-
RE: Can "window.location" javascript on homepage affect seo?
Hi There
I'm honestly not well versed in the intricacies of how window.location works - but I can tell you how to check whether this is OK for search engines.
You can try a few things to see how Google "sees" the content:
- Do a fetch as Googlebot in Webmaster Tools
- Check the cache of the page if it has already been indexed (try text-only cache)
- Try running it through a header checker like URI Valet (set to Googlebot as user agent)
In all cases, you will get a "search engine view" of the page content. If it's what you want engines to see, then you are all set!
As far as the details of how to implement everything, I'd have to lean on a developer for that. But that's how you would check and verify it.
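For what it's worth, the third option - a header/content check with a Googlebot user agent - can be roughed out in a few lines of Python. Fetch the page with a Googlebot user-agent string and eyeball the raw HTML that comes back (it won't execute JavaScript, which is exactly the point). The URL is a placeholder:

import requests

url = "http://example.com/"  # placeholder
headers = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

resp = requests.get(url, headers=headers, timeout=10)
print("Status:", resp.status_code)
print("Final URL after redirects:", resp.url)

# If the content you care about only appears after window.location fires in a browser,
# it will be missing from this raw response - a hint engines may not see it either.
print("window.location present in source:", "window.location" in resp.text)
print(resp.text[:500])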
-
RE: Wordpress Site Structure and H1 Tags
Gotcha. With a "normal" WordPress install it should be pretty easy to make a change to the H1 tags. You just go into the editor and find the right .php file(s).
Are you by chance running Thesis or Genesis? I know it can be trickier with those platforms.
If not, it sounds like an issue with how this specific theme or customization was done. Because normally it should be pretty straightforward.
-
RE: Duplicate pages
Awesome - thanks for all the extra info David!
Just to clarify, do you mean 301 the author archives to the homepage? Yoast does this when you check "disable author archives".
-
RE: Duplicate pages
David
You have Yoast SEO installed, so follow these steps:
- Go to SEO->Titles/Meta->Other
- and for "author archives" check "noindex, follow"
- and if this is a single author blog, check "disable author archives"
For more details on setting up WordPress for SEO, you can check out my guide here: http://moz.com/blog/setup-wordpress-for-seo-success
-Dan
-
RE: Can you get rich snippets for Youtube hosted videos?
Good point - thanks Phil!
-
RE: Can you get rich snippets for Youtube hosted videos?
You can add markup to videos in your website even hosted on YouTube. Here are a few resources;
- http://www.sistrix.com/video-rich-snippets/
- http://www.searchenginejournal.com/time-come-actually-create-video-rich-snippet/90953/
- http://www.branded3.com/blogs/how-to-add-rich-snippets-for-video-seo/
The problem, as you mentioned, is that YouTube will most likely outrank you, depending on the specifics of the query. You may try making the videos "unlisted" (so people can only find the YouTube URL with a direct link) but still embed them. I don't have direct experience with this though.
Ultimately, to rank a video on your site with rich snippets etc you may want to self host with a platform like Wistia.
Also, whichever platform you use, don't forget to create a video sitemap. This is a great resource on Distilled about video sitemaps: https://www.distilled.net/blog/video/creating-video-sitemaps-for-each-video-hosting-platform/
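For reference, the video markup itself is pretty small. Here's a sketch that assembles schema.org VideoObject JSON-LD for an embedded YouTube video - all the values are placeholders you'd swap for your own details:

import json

video = {
    "@context": "http://schema.org",
    "@type": "VideoObject",
    "name": "How to set up WordPress for SEO",              # placeholder title
    "description": "A walkthrough of the basic settings.",  # placeholder description
    "thumbnailUrl": "http://example.com/images/video-thumb.jpg",
    "uploadDate": "2014-06-01",
    "embedUrl": "https://www.youtube.com/embed/VIDEO_ID",
    "duration": "PT8M30S",
}

# Paste the output into the page inside a <script type="application/ld+json"> block.
print(json.dumps(video, indent=2))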
-
RE: Wordpress Site Structure and H1 Tags
Hi Pamela
I wrote a whole answer, and then realized I should ask for some clarification.
Do you mean:
A) The H1 tags are in the right place, you just want to be able to individually edit what's in them on a post by post or page by page basis?
B) You are getting H1 tags in the wrong places, multiples per page, etc., and need to get into the underlying code to change where the H1 tags are being used to begin with?
Or both?
-
RE: Are these doorway pages or not? Concerned due to Panda 4.0
Hi There
Just to clarify: by definition these are not doorway pages - doorway pages function by redirecting the user to a page other than the one that was indexed. These would just be additional pages indexed that more or less show the same content as other pages.
That aside though, it's good to still question their validity. What Etsy is doing is a bit more complex and I wouldn't compare your site to Etsy (unless it's going to be millions of pages with hundreds or thousands of categories). But Etsy is doing something slightly different than what you're describing for your site.
If you want to create the silo effect you can simply "nofollow" any links off the homepage that you don't want engines to continue crawling. And in general I would try to control everything through good architecture.
Now, for Etsy, if you have two similar pages showing in search - one a /search/ URL and the other a /Market/ URL - that, in my opinion is not ideal. I would noindex the search pages that are also duplicated by static pages.
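If you do go the nofollow route, here's a quick way to audit which homepage links are followed vs. nofollowed - a minimal sketch using Python's built-in HTML parser (the homepage URL is a placeholder):

import requests
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href:
            followed = "nofollow" not in (attrs.get("rel") or "")
            print("FOLLOW  " if followed else "NOFOLLOW", href)

homepage = "http://example.com/"  # placeholder
LinkAudit().feed(requests.get(homepage, timeout=10).text)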
-
RE: Schema for Landing Pages
Hi Michael
1. Bullets - This is also something I have not heard much about, but you could try the method in this post for bulleted lists in SERPs, which Google goes on to explain here.
2. Landing page code - Do you mean having this show in the title tag and use "near me"? I don't think there's a way to do this. Google does not dynamically change title tags for you in such a way.
3. You can make a brand image show up, usually only for brand searches, by using rel=publisher: https://support.google.com/plus/answer/1713826?hl=en
-
RE: E-commerce site hit by the latest Panda, please help
Hi Claudio
I was not using any tools, or even saying the text exactly matches. I was saying the general look, feel, setup, etc of that site is similar to yours (and many others in this genre). I am also saying you need to think of a way to differentiate your site, not just by having unique text, but by anything like: helpful content targeted at your audience, building a community, developing an audience that follows you and comes back for repeat visits. Something that makes you stand out from the rest of the shareware download sites.
-
RE: E-commerce site hit by the latest Panda, please help
I think trust (or lack of trust signals) is a huge factor here.
- For example, the only contact information is a contact form: http://www.freesharewaredepot.com/contact.asp - no phone number, email address, or physical address. This is something heavily mentioned in Google's Quality Raters Guidelines. I've actually happened upon it here: http://www.freesharewaredepot.com/about.asp - but users might want more easily accessible ways to contact you.
- The site also does not show any indication of who is behind it - no people's names, and barely any pictures. This is bad for trust too.
- In general, as a normal user, I have no way of knowing whether I can trust the editor reviews or other information - "Was it actually all created by an expert?" is a question that pops into my head.
- Bill's right too - the site is very duplicative of other sites out there in the space, such as - http://www.tucows.com/downloads
The issues I've addressed above only have to do with Panda. A few more thoughts:
- The other thing that strikes me is the imbalance of links to social shares. I don't think Google uses social shares as a direct ranking signal, but I could see them looking for weird differences in numbers as one flag. In other words - the site needs to be promoted and liked holistically - through links, social shares, mentions, etc.
- The site also gets zero brand searches - by this, I mean no one is going to Google and typing "free shareware depot" - which signals to Google, maybe people don't think the site is all that important or memorable. This is why some branding and differentiation is so important.
-Dan
-
RE: Duplicate Title Tags on Word Press
Hi Chris
There are some suggestions here: http://wordpress.org/support/topic/toolbar-not-showing-on-the-site - including clearing your cookies. So try to clear cookies and do it again.
Try disabling your plugins one by one and see if it shows back up.
Some more ideas here - http://wordpress.org/support/topic/not-showing-toolbar
It could be any number of things, so it might take a little trial and error.
Some ideas here too: http://www.webmechanix.com/wordpress-admin-bar-not-showing-up-fix
-Dan
-
RE: Duplicate Title Tags on Word Press
Hi Chris
No problem! Go to Users > Your Profile > Toolbar and check off the box "Show Toolbar When Viewing Site".
-Dan
-
RE: Duplicate Title Tags on Word Press
Christopher
Is this a WordPress site? If so, are you using an SEO plugin? Sometimes the code in the header.php file still causes WordPress to spit out more than one title tag. Let us know what setup you have.
-Dan
-
RE: Duplicate Title Tags on Word Press
Thanks Devanur
Just wanted to clarify that the new recommended length for titles is roughly 55 characters, or more accurately 512 pixels. Screaming Frog SEO Spider will now measure pixel width.
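If you want a quick character-based pass before reaching for Screaming Frog's pixel measurement, something like this works - it only counts characters, which is a rougher cut-off than pixel width, and the URLs are placeholders:

import re
import requests

urls = ["http://example.com/", "http://example.com/some-post/"]  # placeholders

title_re = re.compile(r"<title[^>]*>(.*?)</title>", re.I | re.S)

for url in urls:
    match = title_re.search(requests.get(url, timeout=10).text)
    title = match.group(1).strip() if match else ""
    # ~55 characters is a rough proxy for the ~512px truncation point.
    flag = "  <-- likely truncated in SERPs" if len(title) > 55 else ""
    print(len(title), "chars |", title + flag)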
-
RE: Brand sections performing badly in SERP's but all SEO tools think we are great
Jonathan
First off, I would ignore the competitors to some degree. It's going to lead you in circles. It's not so simple that links relate directly to rankings. There are a ton of factors as to why competitors can be ranking better. I'd focus purely on cleaning up your site as best as possible.
You also do seem to have an issue with anchor text in your link profile - a lot of the top anchors are commercial keywords ("hoisery online uk", "tights", etc.). These need to be changed or cleaned up. This is going to flag you as over-optimized.
I don't think number of internal linking pages would create a penalty.
How's your non-Google traffic as a percentage? If it's anything less than 30% of overall traffic (and organic Google is 70% or more), I'd work on getting traffic from other sources - this will all feed back into your SEO.
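On the anchor text point, you can get a feel for how skewed the profile is by tallying a backlink export (from OSE, Webmaster Tools, etc.). A rough sketch - the filename and the "Anchor Text" column name are assumptions about your export:

import csv
from collections import Counter

anchors = Counter()

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row.get("Anchor Text", "").strip().lower()] += 1

total = sum(anchors.values())
print(total, "links, top anchors:")
for anchor, count in anchors.most_common(15):
    # A big share of exact-match commercial anchors is the kind of skew worth cleaning up.
    print(count, "(" + format(count / total, ".1%") + ")", anchor or "(empty/image)")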
-
RE: Brand sections performing badly in SERP's but all SEO tools think we are great
Hi There
Bill Sebald offers a fantastic method for link cleanup, and then submitting a disavow here: http://www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/ - if you have never submitted a disavow, I would do that. It's in Bill's post, but generally the links in Webmaster Tools are a good place to start, and use Cognitive SEO to process them and review.
-Dan
-
RE: Brand sections performing badly in SERP's but all SEO tools think we are great
Thanks Andy, great advice! Just to clarify for the asker, Penguin is purely algorithmic, not a manual penalty in any way.
-Dan
-
RE: Switch to www from non www preference negatively hit # pages indexed
Hi Brigitte
To echo some of the other answers here, simply having www vs. non-www does not directly affect rankings at all. What matters is choosing one and keeping it consistent. That means:
- Keep internal links pointing at the preferred version
- Always redirect from the non-preferred to the preferred
- Don't switch if you don't have to
- Try to get backlinks pointing at the preferred version
By the way, you need to register the non-www version separately in Google Webmaster Tools (it is treated as a different website in terms of some of the data).
I would choose the version with the most backlinks pointing at it, honestly, and then keep it that way forever.
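Here's a quick Python sketch to confirm the redirect from the non-preferred version is in place and is a 301 - the two hostnames are placeholders for whichever way you decide to go:

import requests

preferred = "http://www.example.com/"  # placeholder: the version you chose
other = "http://example.com/"          # placeholder: the one that should redirect

resp = requests.get(other, allow_redirects=True, timeout=10)
first_hop = resp.history[0].status_code if resp.history else None

print("Requested:", other)
print("First hop status:", first_hop, "(should be 301)")
print("Ends up at:", resp.url, "(should start with " + preferred + ")")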
-Dan
-
RE: How to overcome blog page 1, 2, 3, etc having no or duplicate meta info?
Thanks Anthony! This is the best way to handle it. If you are using Yoast SEO, go to SEO->Titles & Meta - and check off "noindex subpages of archives".
-
RE: Pages with Duplicate Page Content Crawl Diagnostics
Hi There
I would:
- Noindex your tags
- Set your title templates for the category archives so they end up unique. Hopefully you're using Yoast SEO. So go to SEO->Titles/Meta and click "Other" to get all the variables for the title templates. Play around until you get titles that are unique, not too long, and make sense to read.
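To confirm the templates actually produce unique titles, you can group a handful of archive URLs by their rendered title tag. A small sketch with placeholder URLs:

import re
from collections import defaultdict
import requests

# Placeholder archive URLs - use your real category/tag/paged URLs.
urls = [
    "http://example.com/category/news/",
    "http://example.com/category/news/page/2/",
    "http://example.com/category/reviews/",
]

title_re = re.compile(r"<title[^>]*>(.*?)</title>", re.I | re.S)
by_title = defaultdict(list)

for url in urls:
    match = title_re.search(requests.get(url, timeout=10).text)
    by_title[match.group(1).strip() if match else "(no title)"].append(url)

for title, pages in by_title.items():
    if len(pages) > 1:
        print("DUPLICATE TITLE:", title)
        for page in pages:
            print("   ", page)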
-
RE: How Can I Interpret The Crawl Report Results?
Hi There
So were you getting 404's on your posts themselves? Or 404 errors on weird/extraneous pages? I should mention that 404 errors on extraneous pages will not hurt your search rankings. But it is an issue that should be worked on.
The report has a LOT of different headings, which in particular would you like definitions on?
-Dan
-
RE: Different user experience with javascript on/off
This is probably not cloaking as long as it's not malicious. Engines are going to have a non-JS view by default for the most part, so that is the version they will see anyway. You can check Google's text-only cache of the page to see how they are seeing it.
-
RE: Can wordpress actually be bad for sites if it static?
Noted, we really appreciate Thomas' participation!
-
RE: Can wordpress actually be bad for sites if it static?
Hey There
Just wanted to confirm that as far as we know there is no bias towards having WordPress and thus needing to post content more. Google is pretty platform-agnostic when it comes to how they rank sites.
-
RE: Impressions Fell off a Cliff, No Manual Action, What Gives?
Hi John
First thing - have any other metrics changed? Traffic via Google search? Rankings (do you track these independently of WMT)?
Do know that the disavow can take 6+ months to fully process and have an effect back in the SERPs. I do see some suspect links. With Google being so aggressive lately, I could see only a few bad links hurting the site:
- http://wiis.tu-graz.ac.at/people/tom.html "jazz online server"
- http://enn2.com/nitelife.htm "internet cafe index"
- http://public.homeagain.com/faq.html "found pets"
The more I look in OSE, the more link issues I see. I know they may be old, but it's possible some could have come back to haunt the site.
I would be extra certain you've disavowed all the bad links. Greenlane SEO has two great posts on the process they use:
- www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/
- http://www.greenlaneseo.com/blog/2014/04/how-to-uncover-those-harder-to-find-links/
The site design LOOKS great - terrific actually. So it's almost easy to assume everything is technically OK on-site, yet there are definitely some issues there.
For example, there are almost 800 pages indexed - which seems like a lot for this site (I could be wrong). There are lots of really long titles and descriptions, etc. So as Andy suggests, I'd take a good look at cleaning up anything on-site as well. It may not have caused a penalty, but anything that helps Google re-process the site will help.
-
RE: Can a homepage have a penalty but not the rest of the pages?
Check the homepage to see if it ranks for some other things:
- the domain name without the www
- the domain name with the www
- the title tag text with the intitle: operator
- cut and paste a string of text (maybe 12-15 words) and search for it in quotes
- the OLD brand term (you said this was a re-brand as well?)
- the OLD domain name
What do those all return? Further:
- Does the homepage show as a landing page in webmaster tools?
- Does it show getting traffic from search in analytics?
- Does it rank for anything in Bing or Yahoo?
It sounds like more diagnostics are needed before we can conclude anything.
-
RE: I'm Getting Attacked, What Can I Do?
I'm not so sure I'd jump to the conclusion that you've been hacked so quickly (as a few other answers have mentioned). You just have bad backlinks pointing at your site, right? No pages have been created ON your website that you did not allow, correct?
If so, it does sound like some sort of negative SEO link building. It would be hard to tell how or why the links were built without a big analysis - but anyhow, you can disavow them. I would also look deeper into why this happened to begin with. Could it be a competitor, or someone who has decided to make enemies with you?
-
RE: Brain Teaser - Dead Link Ranking in SERP's
Hi There
A few things:
- They are no longer ranking for me on "caliber signs and imaging" set to Irvine, CA.
- I think they are ranking still for "caliber signs and graphics" because it's an exact title match, and the domain is carrying a large part of the ranking
- Their site is really messed up - like phenomenally messed up - looks like they still exist as of last year on facebook but they either don't use their site or it's just not updated. I don't think there was anything malicious, it's just bad.
- The blog - strange! The "author's" profile IS in fact "martin" - I wonder if it somehow got hijacked. I think it DID get hijacked actually. All the posts 2011 and prior are real (like this one) - then the spam starts.
Now, if the word "caliber" is not commercial (a "caliber sign" is not a thing, right?), then they are still going to rank very easily for things like "caliber" despite their bad site and geographic location. So essentially, they are ranking because these are super low-competition terms and they haven't actually done anything malicious on their end - and now we know Google can process iframe sites.
-
RE: How do you reduce duplicate content for tags and categories in Wordpress?
Hi Michael
It looks like everyone here pretty much has you covered. I also wanted to point you to this resource I did for Moz on setting up WordPress for SEO: http://moz.com/blog/setup-wordpress-for-seo-success
In general, I'd agree with noindexing tags. You can index categories, but this won't result in overlapping content as long as you put posts in just 2-3 categories each.
-
RE: CMS dynamicly created pages indexed?
Hey Dylan
Either of those is a possibility for how Google found and indexed a page like that. There could be many ways it happened - I've seen them spider "links" in a drop-down, depending on how it's implemented.
One thing you can do to check is look at the text-only cache of the page (type cache:www.domain.com/page-name in your browser and click text only) and see whether the drop-down items actually appear as clickable links. You can also try crawling the site with Screaming Frog with the user-agent set to Googlebot and see if they get picked up.
If the filter is just for example re-sorting the list of items in a category, there is probably not a need to have this crawled or indexed, because it's just the same content in a different order.
If you do want to remove them from the index, you will want to add a meta noindex tag to the HTML, wait for them to drop out of the index, and then block crawling with robots.txt or nofollow the links that might be generated.
Hope that helps!
EDIT - I'd also check to be sure they are not showing up in your XML sitemap.
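Following up on that edit, here's a rough way to check two things at once for one of those filtered URLs - whether it carries a noindex tag and whether it's listed in the XML sitemap. The page and sitemap URLs are placeholders:

import re
import requests
import xml.etree.ElementTree as ET

page_url = "http://example.com/category/widgets/?orderby=price"  # placeholder
sitemap_url = "http://example.com/sitemap.xml"                   # placeholder

# (a) Does the page carry a noindex directive?
html = requests.get(page_url, timeout=10).text
match = re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', html, re.I)
print("robots meta:", match.group(1) if match else "none (defaults to index)")

# (b) Is it listed in the XML sitemap? It shouldn't be if you don't want it indexed.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
listed = any(loc.text == page_url for loc in root.findall(".//sm:loc", ns))
print("listed in sitemap:", listed)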
-
RE: Why is my site not getting crawled by google?
Hi There
As Alan mentioned, HTML is going to be a much more guaranteed way to get indexed. The HTTPS alone shouldn't be affecting anything. But do you have a different robots.txt for https and http? Is the https one blocking crawlers? Do you have the https version of the site registered in Webmaster Tools? When you go to crawl stats, how many pages does it show that they are crawling?
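To answer the robots.txt question quickly, you can pull both versions side by side - a tiny Python sketch (the domain is a placeholder):

import requests

domain = "example.com"  # placeholder

for scheme in ("http", "https"):
    url = scheme + "://" + domain + "/robots.txt"
    resp = requests.get(url, timeout=10)
    print("---", url, "(status", str(resp.status_code) + ")", "---")
    print(resp.text.strip() or "(empty)")
    # A blanket "Disallow: /" on the https version would explain pages not getting crawled.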
-Dan
-
RE: Duplicate Errors from Wordpress login redirects
Greg
That's right - the best way is to block crawling with robots.txt; it makes sense to keep crawling clean and efficient. If you're using Yoast you can edit robots.txt right in there, or you can do it via FTP.
-
RE: Magento Category URL
Hey There - I too am not very familiar with the "under the hood" side of Magento. It sounds like you may need to do individual redirects / URL rewrites; I'm not sure if there is a way to do it automatically. Here are a few resources that might help:
- http://www.magentocommerce.com/boards/viewthread/294348/
- http://blog.maximusbusiness.com/2012/10/magento-url-rewriting-regex-and-301-redirects-tips/
If you can't find it there, please circle back so I can send the question to another Moz Associate who has more working knowledge of Magento.
Thanks!
-Dan