I understand that Yelp was down for a period of time in the past few days, and it was not just their website; their app was also down. I don't think it was related to the APIs; those sites were down entirely. They are up and running again today, though.
Posts made by GlobeRunner
-
RE: Yellowpages, Yelp, & HotFrog Are All Unavailable.
-
RE: Change of Address in Google Search Console
Brad, since the domain is not going to be yours, and you'll be unable to actually verify it in Google Search Console, you probably won't be able to use the Google Change of Address Tool.
However, if you are using 301 permanent redirects to send the content to your site, that should be good enough. Typically, when you use the Change of Address Tool you get more "credit" from Google, as it looks like they will pass most of the "link juice" over to the new domain. You don't get that when you only 301 redirect from domain to domain; you will lose some "link juice".
I do recommend that you go ahead and use the 301 redirects.
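If it helps, a domain-to-domain 301 takes only a few lines on Apache. This is a minimal sketch assuming mod_rewrite is enabled; olddomain.com and newdomain.com are placeholder names, so substitute your own:

```apache
# In the OLD domain's .htaccess (Apache, mod_rewrite assumed).
# olddomain.com / newdomain.com are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```

The `$1` keeps the original path, so each old URL redirects to its equivalent page instead of everything landing on the new homepage.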
-
RE: Has anyone ever seen Google truncating the beginning of a meta description on a mobile device?
Sometimes Google just picks what they think is appropriate to show depending on the search query. We've seen this in the past, sometimes because Google, for whatever reason, doesn't like the meta description tag. Sometimes it's too long, sometimes it doesn't contain the keywords or anything similar to the search query, etc.
-
RE: Can you see if you have a penalty / downgrading for a keyword stuffed title tag ?
Looking down the list, they all seem to be using less keyword-stuffed title tags than our site sitting at #7

If that's the case, then you should update your title tag(s), one page at a time, and wait a few days after you update each one. You may see changes to rankings; it's worth testing. If you lose rankings, which I don't think you will, you can always switch the title tag(s) back.

Just updating the title tag (and updating the page) should give you a boost.
-
RE: Can the end of a competition cause a drop in organic visitors?
We've seen this happen with a lot of sites, especially sites that are related to certain events. For example, a 5k race or a marathon website would suffer the same fate.
It's the drop in engagement on the site: the competition is over, so visitors have no reason to go to the site anymore. The site may also have lost links pointing to it; site owners who linked to the competition figured it was over and removed the links.
I would take a look at the links to the site and look at the historical links (using Majestic.com) or see if you have a list of links from previous link evaluations--crawl those links and see if any of them have changed or been removed. If they have, you may be able to see if you can get those links back.
-
RE: Has anyone ever seen Google truncating the beginning of a meta description on a mobile device?
Yes, this is definitely possible and typical in desktop results. When a page doesn't have a meta description tag, or when the page's meta description doesn't match the search query, Google tends to pick text from the page that they feel is appropriate. So, yes, it's very possible, depending on the search query/keyword used.
This is the first time I've seen it happen (or seen it pointed out) on mobile, but it "should" be happening there too, especially if the page doesn't have a meta description tag.
-
RE: Will Google crawl and rank our ReactJS website content?
Google does crawl JavaScript, and they do index it. Googlebot is essentially a form of the Chrome web browser, so they will see the information that you give to them and most likely the rest of the content as well. Keep in mind that cloaking is against their guidelines, so it may get the site penalized.
I would go ahead and give Google and all visitors the full content.
-
RE: Any excellent recommendations for a sitemap.xml plugin?
The most popular sitemap plugin for Magento appears to be this one: https://www.magentocommerce.com/magento-connect/xml-sitemap-generator-splitter.html, the XML Sitemap Generator & Splitter.
-
RE: GTLDs in Open Site Explorer
We have run into this error time and time again. Unfortunately, OSE does not support new gTLD domain names. It's a "feature" that we've been asking for for a while (new gTLDs have been out for two years now), and it's only logical that OSE should support these domain names.
I understand that "they're working on it".
-
RE: Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
usDragons, the best way to deal with these links is to use Google's Disavow Links tool to disavow them.
First, you need to identify all of the links, and you can do that by downloading all your links from Open Site Explorer, Majestic.com, ahrefs.com, and Google Search Console. Combine the lists and remove the duplicates.
You'll want to manually review all of them, make a list of the ones you want Google to ignore, then upload a list of the domain names using Google's disavow links tool. Google has more info about their disavow tool here: https://support.google.com/webmasters/answer/2648487?hl=en
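To give a rough idea of the combine-and-dedupe step, here's a minimal Python sketch. The function names are made up for illustration, and real exports differ by tool, so treat it as a starting point rather than a finished script:

```python
from urllib.parse import urlparse

def dedupe_links(*link_lists):
    """Combine link exports from several tools and drop duplicates."""
    return sorted(set().union(*link_lists))

def root_domain(url):
    """Pull the hostname out of a linking URL, stripping a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host.removeprefix("www.")

def disavow_lines(bad_domains):
    """Format domains the way Google's disavow file expects ('domain:' prefix)."""
    return ["domain:" + d for d in sorted(set(bad_domains))]
```

The manual review in the middle is the part you can't script: only domains you've personally judged spammy should end up in the `disavow_lines` input.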
-
RE: Google Penalties not in Webmaster tools?
Issa, there are plenty of Google algorithmic penalties that a site can have, like the Google Panda and Penguin penalties, that do not show up in Google Search Console. These can be tough to diagnose, and even when diagnosed they can be tough to fix, or can take time to fix.
What I would do is look at the Google Algorithm change history (https://moz.com/google-algorithm-change) and compare the date(s) where the site had traffic drops with the dates of algorithm updates. That should help diagnose some of the issues the site is having.
-
RE: Lawyer versus Attorney... does it matter?
Google does, in fact, know that lawyers are attorneys and attorneys are lawyers. However, you'll notice that Google's search results don't reflect this. You'll see different results for lawyer phrases versus attorney keywords. When optimizing, you'll want to make sure you've done your keyword research and mention those keywords appropriately on the site.
-
RE: Re-directing Multiple Sites to a Single Location
that by doing as much, they will increase their opportunities to rank for related KW terms
This is, in fact, a misconception. Simply redirecting domain names will not increase rankings. In fact, if you redirect too many domain names it could hurt rankings and earn the main site a penalty (we've seen it happen before).
What I do recommend, however, is that you do your due diligence on all of the domain names that you're redirecting, as you could be redirecting a domain that has a penalty or low quality backlinks.
-
RE: Do Search Engines Try To Follow Phone Number Links
They certainly will make note of those phone numbers, but since the phone number is not a web page there's no way to pass on "link juice" to it. So, there's really no reason to mark it as a nofollow link.
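For reference, a phone-number link is just a `tel:` anchor; since it doesn't point at a web page, there's nothing for a nofollow to accomplish (the number below is a placeholder):

```html
<!-- tel: links dial a number; they are not web pages, so no "link juice" flows -->
<a href="tel:+15555550123">Call us: (555) 555-0123</a>
```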
-
RE: Ecommerce product rankings tank when product out of stock
HDPHNS, it's quite possible that Google is doing that on purpose--that they see the product is out of stock so they don't rank it as high. Google wants to provide a good user experience, and it would be frustrating if someone went to Google, did a search, went to your site, and found that the product is out of stock.
-
RE: Any excellent recommendations for a sitemap.xml plugin?
Hi Inevo,
You don't mention which CMS your client is using, but since you mention "plugin" I'm assuming that the client is using WordPress. If that's the case, then they should use the Yoast SEO plugin, which will automatically generate the sitemap(s) that the client needs.
-
RE: Why has my clients domain authority dropped 4 points?
If the site has good search engine rankings, then I wouldn't necessarily worry too much about Domain Authority dropping. It could be that the authority of the other sites linking to your client's site has dropped, which would then drop your client's DA.
-
RE: Redirected Old Pages Still Indexed
Whenever you migrate a domain name to another domain name (which I think is what you're saying you did), Google will keep the URLs of the old domain name in their index for at least a year.
If the pages have 301 redirects to other pages on the site or new pages on the site, then those old URLs could still remain in the index for a period of time--you'll be able to see them if you search for the URL.
Typically even if those URLs are still indexed, it shouldn't be hurting rankings at all.
-
RE: Do I miss traffic (thus, page value) by using the GWMT Parameter Handling Tool?
You really should be using Google's parameter tool if you do NOT want those pages crawled.
If they are crawled, and you use the canonical tag to tell Google to pass any "juice" to the page you're pointing the canonical tag to, then you won't lose any traffic. Both of those URLs (the one with the parameters and the one without) will essentially be crawled, but the parameterized one won't be indexed.
-
RE: What domain name do you think is better for SEO: sirocco-webdesign.com or sirocco-web-design.com?
Personally, if you cannot use siroccowebdesign.com, then sirocco-web-design.com would be preferred since it separates all of the words. I would, however, buy both and redirect one to the other.
-
RE: Two major pages ranking for the same keyword phrase
You could, in fact, test whether or not removing the keyword from the page has an effect on its rankings, but most likely that won't prove anything. In fact, you may even end up more confused, as a page can rank well even if it doesn't contain the keyword or keyword phrase.
There may be a lot more factoring into these search engine rankings, such as links pointing to the page or the number of social media shares. It could also be related to how often you update certain pages on your site and if you update those pages or not. Another factor could be user engagement on the page, such as how long they spend on the page, bounce rate, or whether or not there are comments (if it's a blog post).
If there's one page that you want to rank (because it happens to convert better for you), then I would focus on link building and social shares.
-
RE: How to rip benefits of Facebook Group likes and shares?
Google will crawl Facebook pages, but they can't crawl individual Facebook accounts unless those accounts are made public. I believe your Facebook posts are hidden from the outside world (including Google) unless you make them public.
What many don't realize, though, is that for the pages Google can crawl, Google uses data from those Facebook pages, such as Facebook Likes & Shares & Comments, to gauge the amount of user engagement. If the post has a link in it to an external site, then that page can actually rank better in Google's search results because of good engagement.
The benefit we see from getting more Likes & Shares (and comments) is the user engagement, which actually helps organic rankings in Google.
-
RE: What is the "UPDATE" indicate in the Google Search Console Query Reports?
It means that Google updated the data on April 27th. John Mueller from Google wrote about it here.
It is related to how Search Console reports & calculates clicks and impressions in Search Analytics.
"As a result, you may see a change in the click, impression, and CTR values shown there. For most sites, this change will be minimal. A significant part of this change will affect website properties with associated mobile app properties. Specifically, it involves accounting for clicks and impressions only to the associated application property rather than to the website. Other changes include how we count links shown in the Knowledge Panel, in various Rich Snippets, and in the local results in Search (which are now all counted as URL impressions)."
-
RE: Will reviews be ranked higher if responded to?
As far as we could find, there haven't been any recent articles written about this. At least not where someone has tested it and publicly come out with the results. However, we aren't particularly interested in the ranking or order of reviews per se; our biggest concern is more of a customer service issue. All reviews, good & bad, are responded to in a very prompt manner, so that people reading the reviews will see the response.
-
RE: H1 Tags the same as Title Tags and other meta questions
Alex, using the product name and then the company name is perfectly fine. That's what is recommended, unless you want to add another keyword in there from time to time when it's appropriate. Having the product name in the title tag and in the H1 tag is perfectly normal, as well.
If you have meta description tags that are very short, that's typically not going to be good. If you can use the first 140 characters of the product description, that's preferred if you can't get those 800 products updated quickly. A short or bad meta description may not be good to have, so you might want to remove them entirely if there is enough text on the page to suffice. Either way, you'll want to start adding meta description tags to product pages as soon as you can.
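If you do generate descriptions from the product copy, a small script can make the 140-character cut at a word boundary. This is a sketch with an assumed helper name, not a plug-in solution:

```python
def meta_description(text, limit=140):
    """Trim product copy to a meta description, cutting at a word boundary."""
    text = " ".join(text.split())           # collapse stray whitespace
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit + 1)     # last space within the limit
    return text[:limit] if cut == -1 else text[:cut]
```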
-
RE: Wrong meta descriptions showing in the SERPS
Brain, based on the example you provided, Google is correct--there is no proper meta description tag on those pages, so that's why they're not including your meta description tag. If there is no meta description tag, then Google will just show text from the page.
For example, the syntax of your meta description tag on the /7175 page is incorrect.
-
RE: Can you compare social profiles (FB, Twitter, Instagram, Linkedin) for sites - in a similar way you can compare sites using OSE
Yes, in Open Site Explorer all you have to do is put in the URL. Keep in mind that you will see the Domain Authority, and that's for the whole domain like Twitter.com. But you will be able to see the links to the profile and the Page Authority.
-
RE: Does business name capitalisation count when making sure you have the same NAP across all directories ?
The capitalization should not make any difference. As long as the name of the company is the same, and the address and phone number are correct, you should be just fine.
-
RE: Removing old content
When it comes to content on your site and Google Panda issues, you need to look at the site as a whole. Generally speaking, we want all of the content to be quality content. If there are book reviews that have plenty of content (as in there is a good amount of text on the page), then even if they are older books you'd want to keep those pages.
When it comes to Google Panda, which is what you're describing, we're more concerned about "thin content" pages, not necessarily content that no one has viewed in a while. When identifying this content, though, I prefer to look at pages that haven't had any views in six months; even if a page has had fewer than 10 views in a given month, I'd still leave content that has had at least a few views in the past month.
I do believe that there is a use for older book reviews, so if the pages have enough unique text, then it would be worth it to leave those pages. You might need to look at your overall site structure, though: if you have a lot of pages, they might not be ranking well (or getting many page views) because of the site structure, and not necessarily because of the content.
-
RE: 2 sites with low DA and PA and unoptimised title tags outranking everyone for competitive search term - How?
Ikke is right; there is quite a benefit to having an older domain name with quality backlinks pointing to it. Also, don't underestimate the power of social media. Based on what I see, there's a good chance that the site is benefiting from social shares and from being active on the social sites, such as Facebook, Twitter, etc.
-
RE: Regarding FB advertisment
Google Analytics typically will filter out "bots" and certain visitors to the site that they don't believe are real visitors. Also, some traffic from Facebook may show up in Google Analytics as direct traffic rather than as a Facebook referral.
Another potential issue is that your site may have been down at some point, and Facebook could have sent traffic but they could have received an error.
If you have any questions, though, you may want to get in touch with Facebook and show them your screen shot of the Google Analytics account. They may be willing to give you a credit for certain traffic to the site that was "fake" or "bot" traffic.
-
RE: Meta Robots query
Joshua, it looks as if you turned on the canonical tag on a certain page. Usually you add the canonical tag to a page when it's generating duplicate content and you want that page to be "combined" with the page you're canonicalizing to.
So, if you have a product that comes in several colors, those "color" pages will canonicalize back to the main product page.
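On those color pages, that's a single link element in the head; the URL below is a placeholder:

```html
<!-- In the <head> of /widget-blue, /widget-red, etc. -->
<link rel="canonical" href="https://www.example.com/widget" />
```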
The meta noindex, follow tag really shouldn't be used in most cases; if you're going to stop the search engines from indexing a page on your site, you should consider whether you really need that page on the site at all.
In the screenshot you provided, there isn't enough information about which pages on your site you're canonicalizing from, so I can't give you specific advice on that. But to answer your question: yes, these settings can definitely affect rankings.
-
RE: What is the best way to search across my entire sub domain for a keyword?
Margaret, I'm not sure what exactly you want to do this for, but the best way, actually, is to use the site: search operator in Google. For example, you can use something like this:
site:subdomain.yourdomain.com keyword
That will show the results for that keyword, and it will show what Google thinks.
-
RE: 404 Error Pages being picked up as duplicate content
kfallconnect, if the 404 errors are being picked up as duplicate content, then most likely they're not actually being served as 404 error pages. It's quite possible that the page shows a 404 error to the user but, in fact, the server header is not returning a 404; it could be returning a "200 OK".
First, I would identify the pages. Even if the user sees an error on the page, use a server header check tool to see what response code is actually returned when someone goes to the page. You can use something like Rex Swain's HTTP header tool: http://www.rexswain.com/httpview.html . If the server returns a 404, you should be fine; it's not duplicate content.
If the server returns a "200 OK", then it most likely IS duplicate content. A page that shows an error to users but returns a "200 OK" in the server header needs to be fixed.
But if the page is showing actual content (and not an error to visitors) then you need to look at potentially using the canonical tag or removing the content on the site completely (which is preferred).
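As a rough sketch of that triage, here's how you might classify pages once you have each URL's status code and body text; the error phrases are illustrative guesses, not a standard list:

```python
# Phrases that suggest the body is an error page (assumed; adjust to your site).
ERROR_PHRASES = ("page not found", "404 error", "cannot be found")

def classify_response(status_code, body_text):
    """Return 'hard-404', 'soft-404', or 'ok' for a crawled page."""
    if status_code == 404:
        return "hard-404"   # fine: a real 404, not duplicate content
    looks_like_error = any(p in body_text.lower() for p in ERROR_PHRASES)
    if status_code == 200 and looks_like_error:
        return "soft-404"   # needs fixing: server says OK, visitor sees an error
    return "ok"             # real content; consider canonical tags or removal
```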
-
RE: Redirect Plugin: Redirecting or Rewriting?
Yes, it has the same result. One reason why we typically don't recommend editing the .htaccess file yourself in WordPress is that other plugins may rely on editing or changing it. For example, security plugins like Wordfence may update it, and WordPress's permalinks need access to update the file as well.
On WordPress, we typically use the Redirection plugin without any problems.
-
RE: Google My Business: Multiple businesses operating from same address
In the past, for these situations, we've had to get the location to set up suite numbers for each business. That way the address is correct, and then there is a suite number for each one. What we prefer to do is to get the USPS (postal service) to also set up and have those suite numbers in their system, as well. So, that way it's official.
You'll also need to post the suite numbers on the outside of the business, as well.
-
RE: How to stop Spam Referral Traffic?
Probably the best way, actually, is to stop it even before it gets to your site. Using Cloudflare or something similar, you can stop that bad traffic (or at least some of it) before it even hits your site. Another option is to use Wordfence, which is really good at stopping that traffic, as well.
Google recently did say that they have fixed the referral spam issue in Google Analytics, but I am still seeing some of it from time to time.
If you've set up Cloudflare and Wordfence on your WordPress site, you can still set up filters in Google Analytics. The best solution that we've seen recently is to use the site's hostname to set up filters. Look at the referrals from the past year or so and at the hostnames, and you'll see the hostnames that belong to your site. For example, if your site's GA is for moz.com, then you'd want to set up a filter that only shows traffic using the moz.com and www.moz.com hostnames. Typically, referral spammers don't know your hostname, so that will filter all of that traffic out.
-
RE: How preproduction website is getting indexed in Google.
Anoop, when a 'development' or 'preproduction' website or subdomain is getting indexed, that means that you haven't stopped the search engines from crawling it. The search engines, especially Google, are very aggressive at crawling, and they will crawl just about any URL that they find. It seems as though all you have to do is visit that page and it's going to get crawled.
The best way to stop Google from crawling (and then indexing) a website is the robots.txt file. Keep in mind, though, that even if you tell them to stay out via robots.txt, they can still index those URLs (without crawling them) if the URLs are linked from elsewhere.
The only way to fully keep Google out would be to password protect the website, make it available only on a private server, or make it available via VPN only.
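To make that concrete, here's a sketch of both halves: the robots.txt that blocks crawling (but not necessarily indexing) and the Apache basic-auth config that actually keeps Google out. Paths and names are placeholders:

```apache
# robots.txt on the preproduction host (blocks crawling, NOT indexing):
#   User-agent: *
#   Disallow: /
#
# .htaccess basic auth -- the reliable option. Paths are placeholders;
# the .htpasswd file is created separately with the htpasswd utility.
AuthType Basic
AuthName "Staging"
AuthUserFile /path/to/.htpasswd
Require valid-user
```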
-
RE: Content Spinner Tool??? Worth? Recommendations?
The others are absolutely correct. A content spinner "tool" or script is an outdated technique and will only cause your site to have ranking problems. I couldn't imagine a use for a spinner tool now when it comes to organic search.
However, if you have a lot of ads to generate and you need to rewrite ad copy in order to test it you might find a spinner helpful. Even in that case, though, you'd want to manually review all of the results to make sure that they read correctly and the grammar is correct.
-
RE: Google SERPs displaying Tracking Tags
Oleg is right, you should be using canonical tags in this case. However, keep in mind that you can also tell Google, in Google Search Console, which parameters to ignore. If you're using tracking parameters on a regular basis then you'll want to do that, as well. And you should tell Bing Webmaster Tools, as well, to ignore those parameters.
Another option is to list those parameters in your site's robots.txt file so that the parameterized URLs aren't crawled.
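If you go the robots.txt route, a wildcard pattern can cover a whole tracking parameter; utm_source below is just an example parameter name:

```
# robots.txt -- block crawling of URLs carrying a tracking parameter.
User-agent: *
Disallow: /*?*utm_source=
```

Note that robots.txt stops crawling rather than indexing, so the canonical tag is still the safer primary fix.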
-
RE: Dublicate Content: Almost samt site on different domains
Google has made it clear, time and time again, that if a web page is in a different language (i.e., it's translated), then it's not considered duplicate content. So, we recommend translating it into the appropriate language; it will (should) do just fine in Google and won't have any duplicate content issues.
If, however, there's more than one site that has the same content in the same language, like using English in more than one country (having two English sites but targeting different countries), then your content will need to be unique. If it's not unique, then we recommend using the canonical tag to specify which one Google should use. Using the canonical tag should be a last resort, though, as unique content is going to be best.
-
RE: Republishing Breaking News
Yes, in your example it WOULD affect the ranking, as the first URL no longer exists. Ideally the first URL should 301 redirect to the second, updated URL. In most cases, timing is everything, and getting a URL crawled and indexed quickly means a lot when it comes to rankings. Keep in mind that when you change URLs without redirecting, you essentially throw away the good ranking the first URL earned.
-
RE: In Google SERPs some companies / government agencies have a Google-generated card for their organization and it references their Wikipedia page. It does not show for all companies /orgs that have a Wikipedia page. What is the criteria to have it shown?
This "card" that you're referring to is actually the Knowledge Graph data that's being displayed. While Wikipedia data is actually a part of the Knowledge Graph, just having a Wikipedia page doesn't mean that it will get a knowledge graph entry shown on the right hand side of the search results.
We previously were able to get certain companies and entities their own Knowledge Graph entry by adding an entry to Freebase, but that option is no longer available. The Knowledge Graph is, in fact, closely guarded by Google; there is no place where they list all of the websites that make up the KG, and I believe it changes or is updated regularly.
Based on your example, it is clear Google knows about OIN, but most likely it isn't showing a KG entry because of what it is--an online version of the database published by the U.S. Department of Labor. So, OIN isn't an entity according to Google.
-
RE: PortfolioID urls appearing in my wordpress site- what to do?
Simon, I'm not sure where you're seeing the duplicates, but generally speaking there are a few ways to deal with this:
-
use the robots.txt file to disallow crawling of the duplicate URLs (keep the pages, but block them from being crawled, if they're helpful for users)
-
remove the PortfolioIDs entirely from the site. If they're not needed and they're not helpful to users then I would remove them entirely.
-
set up canonical tags so that even though they're crawled they will still pass on the credit to the main URL.
-
-
RE: How do I fix duplicate title issues?
Typically, the best way to deal with this is to make sure that your subdomain cannot be crawled by the search engines. If Moz crawls it then you will still see the errors. But, if you block Google from crawling it (use the robots.txt file), then you should be fine.
-
RE: HTTPS Migration & Preserving Link Equity
Logan is correct: whenever you use a 301 redirect from one page on your domain to another page on the same domain, the link equity is passed (all of it, 100 percent). So, migrating from http to https isn't going to hurt at all. You won't lose any link equity.
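On Apache, the whole-site http-to-https 301 is a short mod_rewrite rule; this is a generic sketch, assuming mod_rewrite is enabled:

```apache
# Force HTTPS site-wide with a 301 (Apache, mod_rewrite assumed).
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```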
I still prefer to update any links on other sites whenever I can, such as links from social media profiles and any other links I can get updated.
-
RE: Will Huge bounce rate from social media visits affect SEO or website Ranking according to Google algorithm factors?
Typically, website visitors from social media do tend to have a higher bounce rate. But the fact that the traffic comes from social, and that those mentions are actually links (tweets are links), trumps the fact that the visitors bounce at a higher rate.
When it comes to bounce rate from a particular source, Google does have info about clicks and bounces from Google's organic search, but I don't believe they have the bounce rate of someone coming from Twitter, clicking on your site, then going back to Twitter. Google Analytics has the overall bounce rate data, but apparently Google doesn't use Google Analytics data for rankings.
Overall, I don't believe a high bounce rate from social media will have an effect on rankings--it's bounce rate from Google organic search that may have an effect.
-
RE: Republishing Breaking News
When it comes to republishing content, Google is going to look at whether or not the content is duplicate or not. Based on what Google has told us, we know that Google considers something to be duplicate by comparing page to page--not just a headline or an H1 tag.
So, although the headline may be the same, it's quite possible that the content of the page (when looked at as a whole) is not duplicate. There could be other content on the page (usually text) that is shorter or longer than the original page, so there isn't an issue.
But as with all duplicate content situations, the first page to get crawled is treated as the originator, so it's quite possible that the other site gets crawled first and the actual originator of the content ends up not doing as well in the organic SERPs.
-
RE: Does LinkedIn Pulse Backlinks add to domain authority?
Typically no; LinkedIn decides whether or not a link is nofollow. But if you share the article and you put the URL in your description, it can be a followed ("dofollow") link.