Hey Moz,
For some reason on my father's website, michaelpadway.com, the top 2 keywords from organic are complete URLs from his site. This doesn't make much sense to me. Any ideas?
I have a case in which the whole site is AJAX, and the method used to appease crawlers is
<meta name="fragment" content="!">, which is the new HTML5 PushState that Bing said it supports (at least I think it is that).
This approach works for Google, but Bing isn't showing anything.
Does anyone know if Bing supports this and we have to alter something or if not is there a known work around?
The only other logic we have is to recognize the Bing user agent and redirect to the rendered page, but we were worried that could cause some kind of cloaking penalty.
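For reference, a minimal sketch of how the escaped-fragment scheme behaves (example.com is a placeholder; whether Bing honors it is exactly the open question here):

```html
<!-- In the <head> of an AJAX page served at http://example.com/page -->
<meta name="fragment" content="!">

<!-- A crawler that supports the scheme sees the tag above and then
     requests an HTML snapshot of the same page at:
       http://example.com/page?_escaped_fragment_=
     The server must return fully rendered HTML at that URL. -->
```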
How has no one mentioned SiteTuners! http://sitetuners.com/
I like their newsletter.
We have faceted navigation for the desktop version, so breadcrumbs are overkill. In the mobile version all of the faceted nav disappears, but we show the breadcrumbs.
We just aren't sure how marking up breadcrumbs functions with responsive if they only show on one version.
I am working on a site that has responsive design. We use faceted search for the desktop version but implemented a style of breadcrumbs for the mobile version as sidebars take up too much screen real estate.
On the desktop design we are putting a display:none in front of the breadcrumbs. If we mark up those breadcrumbs and they are behind a display:none, can we still get the rich snippets? Will Google see this as cloaking?
In follow up, is there a way to mark up breadcrumbs in the <head> or somewhere else that is constant?
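One way to keep breadcrumb markup in a constant place regardless of what is visible is JSON-LD in the <head>; support for this format has varied over time and the URLs below are placeholders, so treat it as a sketch rather than a guarantee of rich snippets:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "http://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Widgets",
      "item": "http://www.example.com/widgets/" }
  ]
}
</script>
```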
My website has a main section that we call expert content and write for. We also have a community subdomain which is all user generated.
We are a pretty big brand and I am wondering should the rel publisher tag just be for the www expert content, or should we also use it on the community UGC even though we don't directly write that?
Google has a separate mobile crawler. This crawler probably uses guidelines from the mobile sitemap instead of the desktop sitemap.
I think it is worthwhile to make a separate mobile sitemap even with responsive design so Google knows those are the mobile as well as desktop pages.
Thanks guys, we definitely mark up entities that have a chance of showing rich snippets; so far, though, I haven't seen any of these for purely article markup.
I guess that answers my question though, probably not worth the implementation costs at this time.
I am working with a site that has sitemaps broken down very specifically: by page type (article, page, etc.) and also by category. Unfortunately, this is not done hierarchically; category and page type are separate maps, they are not nested. My questions here are:
Is it detrimental to have two separate sitemaps that point to the same pages? Should we eliminate one of these taxonomies, or maybe just try to make them hierarchical? i.e. item type -> category -> page title
Is there an issue with having a sitemap index that points to a nested sitemap index?
(I don't think so, but might as well be sure.)
Thanks Moz Community!
Can't delete my question, but turns out that isn't how they are structured. Food for thought anyway I suppose.
I'm working with a few international sites that we are going to collapse into one main site. Our current plan is to 301 the 4 other sites into our main site home page.
Is this ok? Is there a better way to do this?
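If the sites run on Apache, the home-page collapse could be sketched like this in each old site's .htaccess (domain names are placeholders). Where old pages have close equivalents on the main site, redirecting them to those pages instead of the home page usually preserves more value:

```apache
RewriteEngine On
# Send every request on this old domain to the main site's home page
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.example$ [NC]
RewriteRule ^ http://www.main-site.example/ [R=301,L]
```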
Thanks
I have a website that has similar pages on a US version and a UK version. We want UK traffic to go to the UK version, but the US domain is so strong it is outranking the UK in the UK.
We want to try using rel alternate but have some concerns. Currently for some of our keywords US is #1, UK is #4. If we implement rel alternate, will it just remove our US page? We don't want to shoot ourselves in the foot and lose traffic. Is this worth doing, will it just remove our US ranking and our double listing?
Any anecdotes, experiences or opinions are appreciated. Thanks.
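For reference, the annotation under discussion is a pair of link elements placed in the <head> of both versions of the page (domains and paths are placeholders):

```html
<!-- On both the US and UK versions of the equivalent page -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
```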
I know things like location, pagination, breadcrumbs, video, products etc have value in using schema markup. What about things like articles though?
Is it worth all the work involved in having the pages marked up automatically? How does this affect SEO, and is it worthwhile?
Thanks,
Spencer
Question: Do 410s show up in the 404 Not Found section in Google Webmaster Tools?
Specific situation: We got rid of an entire subdomain except for a few pages that we 301'd to relevant content on our main domain. The rest return a 404 Not Found, and these show up in our Google Webmaster Tools as crawl errors. Since 410 is a "content gone" status and we intentionally want this content gone, if we switch to 410, does Google still report it as a 404 error?
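If the server is Apache, switching those URLs from 404 to 410 could be sketched like this in .htaccess (the path is hypothetical):

```apache
RewriteEngine On
# [G] makes Apache answer 410 Gone for everything under the removed section
RewriteRule ^old-section/ - [G]
```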
Thanks
I know I am way late to the party, but MagicDude4Eva, have you had success just putting a noindex header on the soft 404 pages?
That sounds like the easiest way to deal with this problem, if it works, especially since a lot of sites use dynamic URLs for product search that you don't want to de-index.
Can you have multiple 404 pages? Otherwise redirecting an empty search results page to your 404 page could be quite confusing..
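For what it's worth, a noindex header like the one mentioned could be sketched in Apache 2.4 config, assuming your application marks empty-result pages with a hypothetical results=0 parameter:

```apache
# Apache 2.4+: add a noindex header when the (hypothetical) empty-results
# flag appears in the query string, without redirecting the user anywhere
<If "%{QUERY_STRING} =~ /results=0/">
    Header set X-Robots-Tag "noindex"
</If>
```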
I have a site with faceted search, but sometimes when someone drills down too far it ends up with no results. The page outline and faceted navigation are still there.
The site uses dynamic URLs for the faceted navigation but Google is reporting these no results pages as Soft 404s. How should we handle these?
Should we redirect these?
Can we return 404 in the status code but still show the no results page they are looking for?
Thanks for your responses
I am working on a site that has lots of dynamic parameters. So let's say we have www.example.com/page?parameter=1
When the page has no parameters you can still end up at www.example.com/page?
Should I redirect this to www.example.com/page/ ?
I'm not sure if Google ignores this, or if these pages need to be dealt with.
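A sketch of cleaning up the bare "?" on Apache; the pattern works because THE_REQUEST keeps the raw request line, including the question mark, even when the query string itself is empty:

```apache
RewriteEngine On
# Query string is empty, but the raw request line still contains a "?"
RewriteCond %{QUERY_STRING} ^$
RewriteCond %{THE_REQUEST} \?\s
# The trailing "?" in the target strips the empty query string
RewriteRule ^(.*)$ /$1? [R=301,L]
```

On Apache 2.4+ the [QSD] flag is the cleaner way to drop a query string.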
Thanks
That is true, but I also have them 301'd to the http version and canonicaled! That is pretty much every possible signal to tell them those pages aren't pages and don't index them.
I suppose we can submit the URLs, unfortunately there are a LOT of tag pages.
Thanks for the advice Dana!
I have some tag pages on one of my sites that I meta noindexed. This worked for the http version, which they are canonical'd to but now the https:// version is indexing.
The https version is both noindexed and has a canonical to the http version, but they still show up! I even have wordpress set up to redirect all https: to http! For some reason these pages are STILL showing in the SERPS though. Any experience or advice would be greatly appreciated.
Example page: https://www.michaelpadway.com/tag/insurance-coverage/
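Alongside the WordPress setting, an Apache-level redirect is a blunter way to force the http version; a sketch, assuming mod_rewrite is available and the https traffic hits the same .htaccess:

```apache
RewriteEngine On
# Send any https request to its http equivalent
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```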
Thanks all!
I have a facebook like button on my homepage. Currently, it points to my facebook page, so every time someone likes this button they are liking my facebook page.
Is this how it should be structured? Do the likes count towards my home page?
Is there a way that a like can simultaneously like my facebook page AND my website?
Which is best for search metrics?
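For reference, which object a Like applies to is set by the button's target URL, and liking a URL is a separate action from liking a Facebook Page, so a single button cannot do both at once. A sketch using Facebook's standard embed (example.com is a placeholder, and the Facebook JavaScript SDK must be loaded on the page):

```html
<!-- Points the Like at the site itself, so likes accrue to the home page
     rather than to the Facebook Page -->
<div class="fb-like" data-href="http://www.example.com/"></div>
```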
Thanks for your responses.
I have seen some sites that always redirect to https and some sites that always redirect to http://, but lately I have seen sites that force the url to just the site.
As in [sitename].com, no www. no http://.
Does this affect SEO in anyway? Is it good or bad for other things?
I was surprised when I saw it and don't really know what effect it has.
Sorry, let me clarify. By used to I mean before the domain was migrated over. So it was hyphens on the old site, but was changed to underscores on the new site.
I work for a website that recently did a redesign, and switched from hyphens to underscores. We have seen some drop in traffic, although that may be attributed to the migration.
I have read that while Google prefers hyphens, the underscore problem is not as much of an issue as it used to be.
Is it worth 301'ing the page to a version of itself with hyphens instead of underscores in the URL?
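If the site is on Apache and a hand-built map of old-to-new URLs isn't practical, a generic rewrite can be sketched, with the caveat that it issues one 301 hop per underscore in the URL:

```apache
RewriteEngine On
# Replaces the first underscore, then Apache re-enters the ruleset,
# so multi-underscore URLs go through a chain of 301s
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]
```

A one-to-one list of redirects avoids the chained hops if the number of URLs is manageable.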
I have worked with a few different SEO firms lately, and a lot of them have recommended "no-following" all external links on the sites I was working on.
On one hand this traps all the link equity/PageRank. On the other, I would think this practice is frowned upon by Google.
What are some opinions on this?
I would like to look at data or sources to get the most accurate measure I can on search engine market share.
Does anyone have reliable sources on search engine volume/market share etc?
So right now I am looking to develop an infographic for a client, and I was wondering from your guys' experience, how much does it cost? I have all the statistics and a basic idea of how I want it to look, and it's not going to be a super interactive type, more of a standard style graphic. I'm curious about where you went to find your designer and just a ballpark price to work off of so I know what the market is like out there. Any input would be greatly appreciated!
I have quite a few websites in Wordpress, but I continuously run into the same issue.
With permalinks it is not recommended to use /%category%/%post_name%/ because it puts an undue load on your bandwidth, server and makes the crawler crawl a ton of duplicate content pages. On one site changing to that hierarchy even crashed some of the pages (probably a permissions error).
I would like a correct information hierarchy, but this doesn't seem like the correct play.
What do you use as your URL hierarchy?
Do you have any plugins or fixes for this issue?
Thanks
So the ratio is MozTrust to MozRank, but what is this good for? What can I deduce from this and what can I use it for?
Hopefully, I started one really early yesterday and one midday and still haven't gotten them.
I am just wondering if anyone has had a full SERP Report in the keyword difficulty tool finish for them or if they know how long these take to populate?
Currently my company uses Basecamp with a splash of Google Docs, a little bit of Paymo, and Raventools for project management. (We use other tools like SEOmoz, but not so much for management as for SEO tasks.)
The ideal features we are looking for are:
Collaborative document editing and sharing (Google Docs)
Task lists and project organization (basecamphq)
Time Tracking on a per task basis (Paymo)
(SEO tracking software is great too, but it doesn't have to be integrated with the project management directly. Currently using Raventools and SEOMoz along with some smaller tools.)
The project management side can be completely separate from the SEO tool side, but it would be great if there was one simple interface that all of this could be done from.
Any suggestions? Are there features I'm missing in my current software that could bring them up to this level?
Thanks EGOL, you are a valuable asset to these boards.
I have a hosting account that is ancient. So is its cPanel, its way of operation (I have to call in to change the zone file), and its hardware and software (it can't even recognize Wordpress as a user, so I have to change permissions to change anything).
I plan on moving the site, but I want to prepare for any changes that may happen. Currently the site ranks between #1 and #3 for quite a few very valuable words, and it is also in season for this business. I know changing hosting data or servers can cause Google to temporarily drop rankings. Does anyone have any experience with this, or know how long the faded rankings can last? Or if it's even true?
We solved it: our web programmer wrote a program to scrape all of our posts and turn them into a format that imported into Wordpress. As for the redirects, we kept the page titles the same and did a sitewide 301 that sent them from blog.example.com to example.com/blog/
Although I would still like to grab Richard's php script for doing this in a more efficient manner in the future.
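For anyone after the Apache version rather than a PHP script, the sitewide move described above can be sketched like this (example.com stands in for the real domain):

```apache
RewriteEngine On
# Map every blog.example.com URL onto the same path under /blog/
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/blog/$1 [R=301,L]
```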
Exactly Joe, although I'm focusing on the SEO value derived from social media. As was stated above, social media sites have no-follows that don't pass link juice, but still affect rankings. I am basically trying to ascertain if anyone on here knows "if a link is in a redirected truncated form, does it lose most or all of its link value, including ANY value a link can have?" with an emphasis on the social media pass-through value.
Does it still count as a like or a share when the link is truncated and redirected?
Do you have any evidence to back up this claim? Any reasoning behind assuming a 301 passes link juice, but won't pass through any social media ranking signals?
I am aware that social media has its own signal as I stated above.
Let me try to refine the question once more.
Does anyone know if a 301 redirect link on a social media page still sends ranking signals?
Thanks. I wonder how this works in a social media context though? Since twitter and facebook don't count for direct link juice, how would a 301 link affect the social media power?
Also, does anyone have any idea what the anchor text counts as in that scenario? Probably just stopped.at/SAuadh
I help a site that spreads the word by getting links on people's social media pages. These links are truncated, i.e. website.com/XyUE, for the purposes of tracking clicks, referrals and so forth. I have heard that when a link is in a redirect form like that it loses close to all, if not completely all, of its link value. The links themselves are technically 301s. Do these still maintain value?
For example, the stopped.at links on this person's twitter. http://twitter.com/#!/Melewis18
Is there any way to make links of this type maintain SEO value? Is there a workaround to truncating for tracking purposes?
The AND operator is an AND operator, not an OR; it works like you would think it would.
Straight from the mouth of Google:
"The OR operator
Google's default behavior is to consider all the words in a search. If you want to specifically allow either one of several words, you can use the OR operator (note that you have to type 'OR' in ALL CAPS). For example,
Link to the operator page:
http://www.google.com/support/websearch/bin/answer.py?answer=136861
In addition to danzspas' advice: rankings can shoot up if you have a lot of social mentions. Was there a spike in your social mentions? Twitter links, Facebook shares, etc.? These can give huge short-term boosts.
I am doing SEO for a website that has constantly rotating and only temporarily pertinent subjects on it. Let's say these information and subject cycles go for about 6 months. Assuming this, would it be more effective to optimize exact match domains for each 6 month cycle, or to make a main domain with a few of the keywords and just target a page for each roaming subject?
The advantage of the main domain is that I get domain authority to feed off of; the advantage of the exact match is, of course, that exact match domains are a powerful tool to rank highly, and it is only a medium-competition market, usually about 40 domain and page authority.
What do you guys think? Do you have any techniques to dominate temporary and rotating markets?
I would not get a link from this, as it seems very suspicious. Are you sure the subdomain has links pointing to it?
Otherwise, I would suspect a penalty from Google that has not been resolved.
It depends on what kind of videos they are, but YouTube trends a lot on views and posts your video on things that are related. Make sure to relate them to popular topics, use correct tags, and promote them as best you can all at once; when YouTube videos are trending up they will be ranked a lot higher.
Google actually specifically told SEOmoz not to include PageRank in the Mozbar. Luckily there is a way to add PageRank!
Download a 3rd-party toolbar that has PageRank. I use the Quirk SearchStatus bar: http://www.quirk.biz/searchstatus/
You can then take as many or as few fields as you want (I use PageRank and Alexa) and move them into your Mozbar!
Go to Options: General: Location and you can set where you want the Quirk bar to show up.
The time it takes is not based off of your site alone; there are so many sites and links being crawled, which all take processing power and bandwidth.
Well, Q&A is moving up my list pretty fast, but the Pro Campaigns are pretty amazing. The on-page optimization tool, rank tracker, and Open Site Explorer are definitely some of my favorites.
LDA and link acquisition tool are ones that I enjoy playing with as well.
The one instance I feel submitting to multiple sources is OK is when the sources are actual people. Typically, if you submit a press release to a journalist or a relevant source, they will not use the exact copy you gave them, but quote or cite it.
Just make sure not to use this for a whole ton of 301 redirects. If it is a big enough redirect project using PHP might be the way to go.
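For big redirect projects on Apache, a RewriteMap lookup table is another option besides PHP; a sketch (server/vhost config only, since RewriteMap is not allowed in .htaccess, and the paths are placeholders):

```apache
RewriteEngine On
# redirects.txt holds "old-path new-url" pairs, one mapping per line
RewriteMap redirects txt:/etc/apache2/redirects.txt
RewriteCond ${redirects:%{REQUEST_URI}} ^(.+)$
RewriteRule ^ %1 [R=301,L]
```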
Well, if you 301 redirected mysite.net to themysite.com, then the links will still pass value; I have heard they lose somewhere around 5% of their value on the way to the redirected domain.
Like Alan said, rel canonical is great if the site accepts that tag.
In addition, you can set up your site to automatically ping the search engines (or ping them manually) when you release a new article, so that they then hopefully attribute your site as the first one to have that content.