Questions created by CleverPhD
Facebook ignores multiple slashes for business listing - true duplicate page issue?
Hi everyone, I am doing an external link audit for a site that contains a large number of public business profiles. As part of each profile, there can be a website listed for the business, as well as a FB page, Twitter, etc. Pretty standard. There are thousands of business profiles on the site, and I noticed that a small group of links to FB profiles seem to contain typos. These are not the actual businesses I found - I have taken common public profiles and modified them to show what I am seeing:

https://www.facebook.com//nissanusa/
https://www.facebook.com//pizzahutus/
https://www.facebook.com//WhiteHouse/
https://www.facebook.com//TrekBicycle/

Notice the double slash. You can put in as many slashes as you want and it makes no difference, e.g. https://www.facebook.com///////////////////////////////////////////////////////////TrekBicycle/

FB returns a 200 for all of these pages when it should return a 404. There is no 301 to the correct URL, and Facebook does not add a canonical tag pointing these extra-slash pages to the one-slash URL. You could call this a potential duplicate content issue caused by typos, and these types of pages would be important for brand-related searches for a business. Google may be smart enough to ignore them, or maybe the typo does not happen often enough to really matter. I am just surprised that FB does not 404 or 301 these pages.

When I checked my personal FB page URL and some of my friends' pages, this does not happen - FB shows a 404 if you add extra slashes to personal pages. So the duplicate issue seems to apply only to business-type FB pages. Curious what the group thinks, or whether anyone has seen similar situations. Thanks!
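For anyone who wants to reproduce this against their own link list, here is a minimal sketch (Python standard library only; the URLs are the illustrative ones from above) that requests each URL directly without following redirects, so the status code you see is the server's first answer:

```python
import http.client
from urllib.parse import urlparse

# Minimal sketch: fetch each URL with a raw HTTPS request. http.client does
# NOT follow redirects, so resp.status is the server's immediate response.
urls = [
    "https://www.facebook.com/TrekBicycle/",     # normal one-slash URL
    "https://www.facebook.com//TrekBicycle/",    # double slash
    "https://www.facebook.com/////TrekBicycle/", # many slashes
]

for url in urls:
    parsed = urlparse(url)
    conn = http.client.HTTPSConnection(parsed.netloc, timeout=10)
    conn.request("GET", parsed.path, headers={"User-Agent": "slash-check"})
    resp = conn.getresponse()
    # A well-behaved server would 301 the extra-slash URLs to the one-slash
    # URL, or 404 them; a 200 here is the duplicate-content smell.
    print(resp.status, resp.getheader("Location"), url)
    conn.close()
```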
Social Media | CleverPhD
Google and JavaScript
Hey there! Google recently announced that it encourages webmasters to let Google crawl JavaScript:

http://www.googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html
http://googlewebmastercentral.blogspot.com/2014/05/rendering-pages-with-fetch-as-google.html

We have always put JS and CSS behind robots.txt, but we are now considering taking them out of robots.txt. Any opinions on this?
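One quick way to see what is currently blocked, before touching robots.txt, is to run your JS/CSS URLs through Python's built-in robots.txt parser. A minimal sketch - example.com and the asset paths are placeholders for your own:

```python
from urllib import robotparser

# Minimal sketch: check whether Googlebot may fetch the JS/CSS assets
# that are currently sitting behind robots.txt.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
rp.read()

assets = [
    "https://www.example.com/js/app.js",    # placeholder paths
    "https://www.example.com/css/site.css",
]
for url in assets:
    print(url, "->", "crawlable" if rp.can_fetch("Googlebot", url) else "BLOCKED")
```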
Intermediate & Advanced SEO | CleverPhD
Suggestion for Improving the Crawl Report on Canonicals
This came up in an answer I gave here: http://moz.com/community/q/canonicals-in-crawling-reports#reply_222623 Wanted to post it here as a suggestion for improving the Moz Crawl reports.

Currently, the report shows FALSE if there is no canonical link on a page and TRUE if there is, and a TRUE response shows up as a warning in your report. I currently use a canonical-to-self on almost all my pages to help with some indexing issues, and I use the EXACT function in Excel to build a formula that checks whether my canonical link matches the URL of the page (as this is what I want it to do). That way I know the canonical is implemented properly, or I know which pages to check manually to make sure a canonical that points to another page is correct.

I would like to suggest that the Moz crawl tool do this itself. It could show FALSE if the canonical is missing, TRUE if the canonical is present, and SELF if the canonical points to the URL of the page it is on. For the most part this would be much more actionable information. I would even suggest that TRUE be a higher-priority alert, while SELF can't do any damage, so I would leave that info in the CSV but not surface it as a warning in the web interface. Thanks for listening!
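To make the Excel step concrete, here is a minimal sketch of the same check in Python - the CSV filename and column headers are hypothetical, so swap in whatever your crawl export actually uses:

```python
import csv

def classify_canonical(page_url, canonical_url):
    """Return MISSING, SELF, or OTHER for a page's canonical link."""
    page = page_url.strip().rstrip("/")
    canon = (canonical_url or "").strip().rstrip("/")
    if not canon:
        return "MISSING"  # the report's FALSE case
    # OTHER = canonical points at a different page, so check it manually
    return "SELF" if canon == page else "OTHER"

# Hypothetical filename and column names - match them to your crawl export.
with open("crawl_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(classify_canonical(row["URL"], row.get("Canonical")), row["URL"])
```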
Moz Bar | CleverPhD
Dealing with 410 Errors in Google Webmaster Tools
Hey there! (Background) We are doing a content audit on a site with 1,000s of articles, some going back to the early 2000s. Some of the content was duplicated from other sites, has no external links pointing to it, and gets little or no traffic. As we weed these pages out, we set them to 410 to let the Goog know that this is not an error - we are getting rid of them on purpose, so the Goog should too. As expected, we now see the 410 errors in the Crawl report in Google Webmaster Tools.

(Question) I have been going through and "Marking as Fixed" in GWT to clear these pages out of my console, but I am wondering if it would be better to just ignore them and let them clear out of GWT on their own. They are "fixed" in the 410 sense I intended, but I am betting Google means "fixed" as in the page now returns a 200 (if that makes sense). Any opinions on the best way to handle this? Thx!
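If you want to double-check that the weeded-out pages really do answer 410 (and not 404 or 200) before marking anything as fixed, here is a minimal sketch - the URLs are placeholders for your retired articles:

```python
from urllib import request, error

def status_of(url):
    """Return the HTTP status code for url (urllib raises on 4xx/5xx)."""
    try:
        with request.urlopen(request.Request(url, method="HEAD")) as resp:
            return resp.status
    except error.HTTPError as e:
        return e.code

# Placeholder URLs - substitute the articles you have retired.
for url in ["https://www.example.com/old-article-1",
            "https://www.example.com/old-article-2"]:
    code = status_of(url)
    print(code, url, "" if code == 410 else "  <-- not a 410, check this one")
```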
Technical SEO | CleverPhD
HTTP to HTTPS conversion - what were your experiences?
Background: Our devs have been talking about changing some of our websites so that all pages are HTTPS, not just the pages that are part of our logins and shopping carts. From what I have read, the things that need to be done as part of this are:

- make sure that HTTPS pages will allow caching
- set up the new site in GWT
- put 301 redirects in place (see the sketch below for one way to spot-check them)
- update all internal links, social profiles, etc. everywhere you can to HTTPS URLs
- confirm the server can handle the extra load so there is no impact on site speed

There is an old Matt Cutts video that says, essentially, "works for PayPal" and "you can try it, but test it on a smaller site first": http://www.youtube.com/watch?v=xeFo4ytOk8M The comments below the video are all stories about losses in rank and traffic, with some coming back. Not sure if these folks did the move correctly, but still, you never know.

Question: Have any of you done a technically "correct" move of an entire site from HTTP to HTTPS using the suggestions above? What was your experience? Any gotchas? Just to be clear, I am not talking about setting up a site from scratch - I want to know the impact on an established site. Thx
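As a spot-check for the 301 item above, here is a minimal sketch that requests each old HTTP URL without following redirects and confirms it answers with a 301 pointing at the matching HTTPS URL (example.com is a placeholder):

```python
import http.client
from urllib.parse import urlparse

def check_https_redirect(http_url):
    """Request the old HTTP URL and report its status + Location header."""
    parsed = urlparse(http_url)
    conn = http.client.HTTPConnection(parsed.netloc, timeout=10)
    conn.request("HEAD", parsed.path or "/")  # http.client never follows redirects
    resp = conn.getresponse()
    location = resp.getheader("Location") or ""
    expected = "https://" + http_url[len("http://"):]
    ok = resp.status == 301 and location.rstrip("/") == expected.rstrip("/")
    conn.close()
    return resp.status, location, ok

# Placeholder URL - run this over your full list of URLs from the old site.
print(check_https_redirect("http://www.example.com/some-page"))
```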
Intermediate & Advanced SEO | CleverPhD
Should I nofollow Geo-located links on a site?
I run various sites that use geolocation to place related links in navigation menus on a page. For example, if you land on the home page, we see that you are in Florida, and one of the content boxes on the page then shows job listings that the site has in Florida. We also give the option to search for other jobs or use other navigation options. The idea is to try to help the user along the best we can, but .....

What opinions do people have on whether these links should be nofollowed, given that Googlebot will always see links to places in California etc. - wherever Googlebot happens to be crawling from? Would this be confusing, since we are a site that focuses on the entire US and not just California? Thanks!
Technical SEO | CleverPhD
GWT Error for RSS Feed
Hello there! I have a new RSS feed that I submitted to GWT. The feed validates with no problems on http://validator.w3.org/feed/ and when I test the feed in GWT it also comes back fine - it finds all the content with "No errors found". I then got an issue with GWT not being able to read the RSS feed: an error on line 697, "We were unable to read your Sitemap. It may contain an entry we are unable to recognize. Please validate your Sitemap before resubmitting." I am assuming this is an intermittent issue - possibly we had a server issue on the site last night, etc. I am checking with my developer this morning. Wanted to see if anyone else has had this issue, whether it resolved itself, etc. Thanks!
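While waiting on the developer, one thing worth checking is whether the feed was even well-formed XML at the moment GWT fetched it - an error on line 697 suggests a parse-level problem. A minimal sketch using Python's standard library (feed.xml is a placeholder for a saved copy of the feed):

```python
import xml.etree.ElementTree as ET

# Minimal sketch: a parse error here reports the same kind of
# line/column position that GWT is complaining about.
try:
    ET.parse("feed.xml")  # placeholder: a saved copy of the RSS feed
    print("Feed is well-formed XML")
except ET.ParseError as e:
    line, column = e.position
    print(f"Parse error at line {line}, column {column}: {e}")
```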
Technical SEO | CleverPhD
Authorship Markup worth it for "invisible" authors
Greetings everyone! Background: I help run multiple continuing education sites for allied health professionals. Our editors do a great job of getting some of the best authors in their respective fields to come onto the site and present webinars, and we publish articles around those presentations. I would love to be able to use the rel=author tag on these sites, as the authors we use help improve our credibility when a user is on the site, and I would like to take advantage of this in the SERPs. The issue is that while most of these authors are leaders in their respective fields and have published in many academic publications, they are not on Facebook or Twitter, let alone Google+. They are also probably not interested in setting up a G+ profile. They are "famous" and well published within their fields, yet they are somewhat "invisible" on the web. We are looking to implement author bios on our site, and we could then use the rel=author tag internally, so that seems like a good first step. The question is then around linking out with rel=me to any profiles (FB, Twitter, G+), and as I mentioned above, those online profiles are pretty scarce.

Question / Discussion:

1. Is it worth setting up all the authorship markup pointing to internal bios on a site when many of the authors are "invisible" on G+, Twitter, FB, etc., meaning I will be limited in how I can link rel=me to those profiles?
2. If a Google+ profile is not available for an author, what do you prefer to link to? Would you say FB over Twitter because FB has more users, or if an author has both profiles but uses Twitter more often, would you link to the Twitter profile instead?
3. Many of these authors work at a university and have a bio page on the university website - would it be worth linking to that profile? How do you judge the "best" place to link to if there is no Google+ profile?

Thanks!
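For what it is worth, once the internal bios are live, a minimal sketch like this can audit which article pages actually carry a rel=author link (standard library only; the article URL is a placeholder):

```python
from html.parser import HTMLParser
from urllib import request

class AuthorLinkFinder(HTMLParser):
    """Collects href values from <a>/<link> tags whose rel contains 'author'."""
    def __init__(self):
        super().__init__()
        self.author_links = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        rels = (a.get("rel") or "").lower().split()
        if tag in ("a", "link") and "author" in rels:
            self.author_links.append(a.get("href"))

# Placeholder URL - point this at one of your article pages.
html = request.urlopen("https://www.example.com/article").read().decode("utf-8", "replace")
finder = AuthorLinkFinder()
finder.feed(html)
print(finder.author_links or "No rel=author link found on this page")
```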
Technical SEO | CleverPhD
SEO Terms for Internal Vs External
Hey there! I am writing up an SEO plan for our company and wanted to get the group's input on the use of some SEO terms, as I need to organize and explain these efforts to non-SEO people. I usually talk about SEO in terms of "Internal" vs "External" efforts.

Internal SEO efforts are things like title tags, description tags, page speed, minimizing errors, proper 301 redirects, content development for the site, internal linking and anchor text, etc. External SEO efforts are things like link building, social media profile setup and posts (FB, Twitter, Pinterest, YouTube), and PR work.

How do you split these out? What terms do you use? Do you subdivide these tasks? For example, within Internal I sometimes talk about "Technical SEO," which has to do with making sure that site speed is good, 301s are set up correctly, the noindex tag etc. are all used properly. These differ from "On Page" efforts to use keywords properly, etc. I will also use the term "Site Visibility" to explain the technical impact to non-SEOs. For example, if your site has the wrong robots.txt, 500 errors everywhere, a slow site, or spiders being sent down a daisy chain of 301s, it is difficult for the key parts of your site to be found, so your "Visibility" to the engines is poor. You have to get your visibility up before you worry about whether you have the right keywords on a page, etc.

Any input or references would be appreciated.
Technical SEO | CleverPhD
Entireweb.com
http://www.entireweb.com/express_inclusion/ Has anybody used this? There is an express inclusion that you pay for vs. the free option. Wanted to see what the group thinks - worth it or not? It is not included in the SEOmoz list of directories.
Branding | CleverPhD
Influencing Google Instant Preview
Hello there! I have been looking at how our articles are shown in Google Instant Previews. While the description in the SERPs picks up the first paragraph of the article from the main table/div body, the Instant Preview highlights a text ad in the top right of the side column. According to Google (https://sites.google.com/site/webmasterhelpforum/en/faq-instant-previews):

Q: How can I influence the text highlighted in the preview image?
A: The highlighted text is automatically chosen based on the user's search query. Only text that is visible on the page can be selected for highlighting.

Google is highlighting the same block of text from an ad space over and over again - "Download our Free Guide ...." vs the first paragraph in the article. Any ideas, folks?
Technical SEO | CleverPhD