Best posts made by GlobeRunner
-
RE: How do I fix duplicate title issues?
Typically, the best way to deal with this is to make sure that your subdomain cannot be crawled by the search engines. If Moz crawls it, then you will still see the errors. But if you block Google from crawling it (via the robots.txt file), then you should be fine.
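To block Google from a duplicate subdomain, a minimal robots.txt sketch might look like this (the subdomain name is a placeholder, and the file must be served at the root of that subdomain itself, not the main domain):

```
# https://staging.example.com/robots.txt
User-agent: Googlebot
Disallow: /
```

Note that other crawlers (including Moz's) will still be able to crawl the subdomain unless you add rules for them as well.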
-
RE: Does LinkedIn Pulse Backlinks add to domain authority?
Most of those links are going to be nofollow links, so when it comes to building Domain Authority, it's generally not going to be recognized as a link that passes DA. Google, on the other hand, may actually count those links--as we've seen nofollow links help rankings.
As you build links, keep in mind that it's good to drive real traffic (which helps rankings) and social media shares (which helps rankings) and build your natural link profile (which helps rankings). You need nofollow links for your link profile to be natural.
-
RE: Ecommerce product rankings tank when product out of stock
HDPHNS, it's quite possible that Google is doing that on purpose--that they see the product is out of stock so they don't rank it as high. Google wants to provide a good user experience, and it would be frustrating if someone went to Google, did a search, went to your site, and found that the product is out of stock.
-
RE: PortfolioID urls appearing in my wordpress site- what to do?
Simon, I'm not sure where you're seeing the duplicates, but generally speaking there are a few ways to deal with this:
- use the robots.txt file to disallow crawling of the duplicate URLs (keep the pages if they're helpful for users, but block them from being crawled)
- remove the PortfolioIDs entirely from the site. If they're not needed and they're not helpful to users, then I would remove them entirely.
- set up canonical tags so that even though the duplicates are crawled, they will still pass on the credit to the main URL.
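For the canonical-tag option, the tag goes in the head of each duplicate URL and points at the main URL. A sketch (the URL here is a placeholder):

```html
<!-- placed in the <head> of each duplicate PortfolioID URL -->
<link rel="canonical" href="https://www.example.com/portfolio/main-page/" />
```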
-
RE: How to lay off your SEO company?
Armin,
Before you tell your SEO company, I recommend that you thoroughly review any contracts that you signed in order to make sure you understand everything. I would make sure that you make backups of any content that has been produced, and make sure you change the appropriate passwords on the site and on any tools that the SEO firm may have access to: Google Webmaster Tools, Google Analytics, etc.
Your conversation should be focused around leaving your SEO firm in good standing if possible, because there could be issues later on that arise (if they're mad at you, they could start some negative SEO or cause other hassles).
-
RE: Do More Guest Posts Affect Website Rankings in a Negative Way?
Any guest posting that you're doing and how it affects your site's search engine rankings will depend on your site's link profile. If you're only doing guest posting and that's how you're getting your links, then that may put up a red flag and it might affect rankings. However, if you have thousands of links from other sites, then a few guest posts aren't going to hurt; they should help.
Keep in mind, though, that any guest posts should comply with Google's webmaster guidelines, the blogger should disclose that it's a guest post, and typically it should be a nofollow link to your website. If it doesn't comply, it doesn't matter how many posts you have, even a few could get your site penalized.
-
RE: What domain name do you think is better for SEO: sirocco-webdesign.com or sirocco-web-design.com?
Personally, if you cannot use siroccowebdesign.com, then sirocco-web-design.com would be preferred since it separates all of the words. I would, however, buy both and redirect one to the other.
-
RE: Something happened within the last 2 weeks on our WordPress-hosted site that created "duplicates" by counting www.company.com/example and company.com/example (without the 'www.') as separate pages. Any idea what could have happened, and how to fix it?
Most likely it's a setting in WordPress that has been turned on or turned off. I'm not sure how you were redirecting your site from www.company.com to company.com, but that's typically done in the .htaccess file on the site.
What you'll need to do is verify that it's still happening first. Use a server header check tool to see if company.com is redirecting to www.company.com (or vice versa).
There are several ways to set up these redirects, and you'll have to figure out how you were doing it previously. If you were using a plugin, it could be that the plugin was removed or deactivated (or updated). It could also be in the site's theme; some themes allow you to set the "preferred version" of your domain.
Lastly, I would go into Google Search Console and make sure you have set your preferred domain there (www or non-www) so Google knows which version to use. If you have it set there, there's a chance that your site's rankings may not be affected.
-
RE: Lawyer versus Attorney... does it matter?
Google does, in fact, know that lawyers are attorneys and attorneys are lawyers. However, you'll notice that Google's search results don't reflect this. You'll see different results for lawyer phrases versus attorney keywords. When optimizing, you'll want to make sure you've done your keyword research and mention those keywords appropriately on the site.
-
RE: Duplicate Title Tags: How harmful is it?
Duplicate title tags are generally not something that you want--but in some cases they are necessary. If the content on the page is different, then it is not going to be a disaster for the site.
I would make sure that Yoast is enabled (or use some other way of taking care of your site's meta data, etc.).
If you do have a duplicate title tag issue, though, keep in mind that there may be other issues associated with that. For example, you may need to use the canonical tag to take care of pages that are duplicates (thus the duplicate title tags) or use the robots.txt file to disallow indexing of certain sections of your site.
-
RE: Ecommerce internal linking structure best practice
As you mention, the introduction copy on each category page should contain naturally placed links down to subcategories and products, and each subcategory should link back up to the main category page. What we typically recommend is that you concentrate on linking to "on topic" pages.
Take a look at what Amazon has done, as their internal link structure has always worked well for them. Each product links to other related products, and products that "others have bought" or viewed.
-
RE: Do I need to do a 301, as well as adding re-write rules on Apache
The URL rewrite "should" take care of it. So, when you enter www.domain.com/page.html it would 301 redirect to domain.com/page.html (or vice versa). Once you have the rewrites in place, test them by going to a page and seeing if it redirects properly. Use an HTTP server header checker to see if it's serving up a 301.
Even if the internal links are using your preferred domain, you'll want to make sure that anyone requesting (or linking to) the "wrong version" will be 301 redirected to the right version of the page.
Also, I would go into Google Search Console and make sure you tell Google which version is your preferred version: the www or non-www version of your site.
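As a sketch, one common way to do this rewrite in Apache's .htaccess (assuming mod_rewrite is enabled, www is the preferred version, and example.com is a placeholder; swap the pattern around to prefer non-www):

```apache
RewriteEngine On
# 301 redirect non-www requests to the www version of the same URL
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```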
-
RE: Demoting a URL (not-WMT Related)
Christa, if you're not able (or don't want) to use a canonical tag, then you're left to do a few things:
- optimize the site so that there are more internal links to the preferred page
- keep that page "fresh" by updating its content on a regular basis
- get more links to that page from other sites
- get social media mentions and work on 'engagement' on social media that mentions the preferred page.
Rather than thinking of it as 'demoting' one particular page, I would work heavily on promoting the preferred page so that it's stronger overall via social media and links.
-
RE: Need help with Robots.txt
Nahid, before you use the robots.txt file's disallow for those URLs, you may want to reconsider. You may want to use the canonical tag instead. In the case where you have different sizes, colors, etc. we typically recommend using the Canonical Tag and not the disallow in robots.txt.
Anyhow, if you'd like to use the disallow you can use one of these:
Disallow: /?
or
Disallow: /?cat=
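One caveat worth noting: robots.txt matching is prefix-based, so "Disallow: /?cat=" only matches query strings on the root URL. Google supports the * wildcard, so a pattern like this would catch the parameter on any path (sketch only; test it in Search Console's robots.txt tester first):

```
User-agent: *
Disallow: /*?cat=
```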
-
RE: Is it bad for SEO to have a page that is not linked to anywhere on your site?
When a page is still on the site (even though it's been removed from navigation), it is still indexed--so users will still get to it, and it's probably still going to rank for at least something in the search results.
What you need to do is figure out if the page still contains valid content and is useful for visitors. There's obviously a reason that the page was requested to be removed by the content manager, though.
If the page needs to be removed from navigation, and it won't have any internal links, then most likely you need to remove the page as well. What I would do is set up a 301 Permanent Redirect from that URL to another on-topic page on the site. As long as you use a 301 redirect and redirect it to another internal page on the same domain, you won't lose any "SEO value".
-
RE: Robots.txt Allowed
Yes, that should work just fine. As Logan mentioned, I recommend you test it in the robots.txt testing tool in Google Search Console.
-
RE: Our parent company has included their sitemap links in our robots.txt file - will that have an impact on the way our site is crawled?
While that is kind of odd to do, it should not have any effect on your site's search engine rankings or visibility. Simply providing a sitemap file is not going to help or hurt rankings in any way. It will only help the search engines discover and crawl pages. That's all.
A sitemap file is not required at all, and in fact some sites don't have them--they rely on Google's spiders to crawl their site through the links on their site. The sitemap is only a tool to help them crawl.
-
RE: Sitemap with homepage URL repeated several times - it is a problem?
Generally speaking, this is normal, and you shouldn't have to worry about it. There's a Google Webmaster thread that pointed to this issue specifically, and Google gave an official response (https://www.seroundtable.com/google-sitemaps-duplicate-16128.html)
"This is apparently normal if you submit Sitemap files with the same URLs listed multiple times. Since submitting the same URLs multiple times doesn't change anything with regards to your site's crawling & indexing, I'd just submit it once if you want the correct counts (alternately, if the count doesn't bother you, you can also just leave it like this.)"
-
RE: Soft 404 error for a big, longstanding 301-redirected page
Eric, you're right that you should be redirecting the old page to the new one using a 301 Permanent Redirect. If Google Search Console is showing that they're getting a 404 error on that URL, then they really are getting a 404--it's not just a message that you're no longer getting any benefit from the 301 redirect.
I would check the redirect to see if it's still working. Use a server header check tool, or I like Rex Swain's HTTP tool: http://www.rexswain.com/httpview.html
Also, you should use Google's own Fetch and Render tool to make sure that they can reach the page and they don't get a 404 error: https://support.google.com/webmasters/answer/6066468?rd=2
I have seen cases where we can get to a page or see the redirect but Google cannot. So you need to use the Fetch & Render tool to make sure Google isn't being blocked. I've seen a case where users could get to the site but Google was being blocked and given a 404 error.
-
RE: Mass Removal Request from Google Index
Any article that has a release date prior to 1st-June-2012 should return a custom 410 page with a "noindex" meta tag, instead of the actual content of the article.
The error returned should be a "410 Gone" and not just a 404. That way Google will treat it differently, and may remove it from the index faster than just returning a 404 would. You can also use the Google removal tool. And don't forget the robots.txt file; there may be directories with that content that you need to disallow.
But overall, using a 410 is going to be better and most likely faster.
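If the old articles live under a common path, one way to return the 410 in Apache is mod_alias's RedirectMatch with the "gone" status (the path and page name below are hypothetical; adjust to your URL structure):

```apache
# Return "410 Gone" for everything under the old archive path
RedirectMatch gone ^/articles/2012/.*$
# Serve a custom page body along with the 410 status
ErrorDocument 410 /gone-page.html
```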
-
RE: We used to speak of too many links from same C block as bad, have CDN's like CloudFlare made that concept irrelevant?
Here is one: http://www.crimeflare.com/cfs.html. There are others out there if you search for them.
-
RE: Deleting Outdated News Pages??
As EGOL suggests, if the pages haven't received any traffic in the past year or so, then they most likely are dead weight and you need to get rid of them. I would, however, do two things:
- Review all the links to your website and 301 redirect any news articles or URLs that have links pointing to them. You'll want to make sure you keep the value of any links pointing to those pages.
- Rather than serving a 404 error on those pages when you remove them, I would serve a "410 Gone" error to indicate to Google that they're no longer present, have been removed, and need to be removed from the index.
-
RE: Any excellent recommendations for a sitemap.xml plugin?
Hi Inevo,
You don't mention which CMS your client is using, but since you mention "plugin" I'm assuming that the client is using WordPress. If that's the case, then they should use the Yoast SEO plugin, which will automatically generate the sitemap(s) that the client needs.
-
RE: Has anyone ever seen Google truncating the beginning of a meta description on a mobile device?
Sometimes Google just picks what they think is appropriate to show depending on the search query. We've seen this in the past, sometimes due to the fact that Google, for whatever reason, doesn't like the meta description tag. Sometimes it's too long, sometimes it doesn't contain the keywords or anything similar to the search query, etc.
-
RE: Has anyone ever seen Google truncating the beginning of a meta description on a mobile device?
Yes, this is definitely possible and typical in desktop results. When a page doesn't have a meta description tag, or when the page's meta description doesn't match the search query, Google tends to pick text from the page that they feel is appropriate. So, yes, it's very possible, depending on the search query/keyword used.
This is the first time I've seen it happen (or seen it pointed out) on mobile, but it "should" be happening there, especially if the page doesn't have a meta description tag.
-
RE: Change of Address in Google Search Console
Brad, since the domain is not going to be yours, and you'll be unable to actually verify it in Google Search Console, then you probably won't be able to use the Google Change of Address Tool.
However, if you are using 301 Permanent Redirects to redirect the content to your site, that should be good enough. Typically, when you use the Change of Address Tool you get more "credit" from Google, as it looks like they will pass most of the "link juice" over to the new domain. You don't get that when you only 301 redirect from domain to domain; you will lose some "link juice".
I do recommend that you go ahead and use the 301 redirects.
-
RE: Googles tells com.au but the site redirects to com
This can definitely be a problem, and it needs to be fixed.
It sounds as if there may be two websites with the same content showing up on the .com URL and the .COM.AU URL. What we typically recommend is that you verify your site in Google Search Console and tell Google which version you prefer--you'll need to verify both versions of your site.
Also, you can use the hreflang tags on your site to tell Google that the .COM.AU site is meant for Australia, and the .COM site is meant for the USA.
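As a sketch, the hreflang annotations would go in the head of both versions of the page (the URLs below are placeholders), with each page listing itself and its alternate:

```html
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
```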
-
RE: SEO Audit for a National Section of a Global Website
Whenever you use a subdomain, it's pretty much seen as a separate site. So, while the main domain's Domain Authority will help the subdomain a bit, it's going to be more "powerful" if that content is in a folder or directory on the main domain.
Also, if the content is currently in a directory on the main domain (rather than on a subdomain), I would generally not move it to a subdomain.
-
RE: We used to speak of too many links from same C block as bad, have CDN's like CloudFlare made that concept irrelevant?
There are tools that allow you to find out the "real" IP address of a server that's using Cloudflare. I just looked up a few using some of these tools, and they still work--so I'm assuming that Google will have the same access or ability to see these, as well.
So, your theory that using Cloudflare means you no longer have to worry about class C blocks when linking sounds good, but I wouldn't count on it.
-
RE: Redirect Plugin: Redirecting or Rewriting?
Yes, it has the same result. One reason why we typically don't recommend editing the .htaccess file yourself in WordPress, for example, is that other Plugins may rely on editing/changing it. For example, security plugins like Wordfence may update it, and WordPress' permalinks, for example, need access to update the file as well.
On WordPress, we typically use the Redirection plugin without any problems.
-
RE: Need help figuring out who to hire for this type of project?
Gina, it sounds as if there are several possible solutions, but it really depends on your budget. You could, actually, perform all of these tasks by using Adobe Acrobat PDF files, where you could upload the form to your website. People would then download it, fill it out, and email it to you or upload it to your website. This is fairly simple--and most web designers and web developers should be able to get it set up fairly quickly. The PDF files would then be accessible to you (you could save them on a hard drive or a laptop) and they can be searched.
Another option, though, would be to have someone set up a custom website with the forms that are filled out online--and then saved in a database that you can then access. This would require more website development, and is costlier.
Another option is to use something like http://www.captira.com/, which appears to be software for your industry. I don't know much about it, but thought I would point it out to you. I'm not sure if it integrates with your website.
-
RE: Shutting down m. domain
If you are going to literally shut down your m. subdomain, then the best way to do it is to use 301 Permanent Redirects to redirect the traffic to the appropriate page on the main domain. I would NOT use canonical tags in this case.
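A minimal .htaccess sketch for the m. subdomain (assuming Apache with mod_rewrite, that the subdomain and main site share the same document root or configuration, and that paths match one-to-one on the main site; example.com is a placeholder):

```apache
RewriteEngine On
# Send every m.example.com URL to the same path on www.example.com
RewriteCond %{HTTP_HOST} ^m\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```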
-
RE: Rankings just took a wallop in the last hour.
For most of the sites that I monitor on a regular basis, we're not seeing any major changes in rankings. It seems as though they're always making changes to the algorithm, and there were a few days last week where I saw some dips. At this point, though, since it's only been a short amount of time, I would wait to see how it plays out.
-
RE: Strange 404s in GWT - "Linked From" pages that never existed
It's quite possible that at one point there was a link there--because the page rendered for some reason. I would crawl the site yourself using a crawler (there are several available) to make sure that the page isn't reachable from, perhaps, a bad link on the site.
Check archive.org to see if the page existed at one time or not.
I would also take a look at the page's server header again to see if the site is showing a 404 error or a "200 ok" along with a "page not found". It's possible that the page doesn't exist but it delivers a "200 OK" server header anyway. Another option is that it might be in your sitemap.xml file.
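That "200 OK along with a page not found" case is the classic soft-404 pattern. A small Python sketch of the check you'd run against each crawled URL (the marker phrases are assumptions; tune them to whatever your site's error template actually says):

```python
def is_soft_404(status_code: int, body: str) -> bool:
    """Flag responses that return 200 OK but look like an error page."""
    markers = ("page not found", "404 error", "nothing was found")
    text = body.lower()
    return status_code == 200 and any(marker in text for marker in markers)

# A 200 response with an error-page body is the suspicious case:
print(is_soft_404(200, "<h1>Page Not Found</h1>"))  # True
print(is_soft_404(404, "<h1>Page Not Found</h1>"))  # False: a real 404
```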
When in doubt, if the page doesn't exist, I would mark it as fixed in Google Webmaster Tools and watch if it comes up again. If it doesn't come up again as an error, then I wouldn't worry too much about it.
-
RE: City in title tag hurt Local Search?
Whenever working on local search engine rankings, I try to be as consistent as possible when it comes to the NAP (Name, Address, Phone Number) data. If you were to put the name of a larger city in the title tag (I am assuming it is in the same metro area), you'll need to mention that larger city in copy on the page. If you just keyword stuff the larger city name in the title tag and don't make it part of a larger strategy (such as getting anchor text links to the site with that city name pointing to your site), then you won't be successful.
Google knows that smaller cities are a part of a larger city or metro area, and usually it isn't a problem with the NAP data being confused. When you set up the Google Plus Local listing, make sure you specify that you serve customers in that same metro area that you're including in the Title Tag.
-
RE: Anybody have a SMX West 2016 Coupon Code?
I would check the social media sites, some sponsors may have coupon codes, and they typically share those codes on the social sites if you follow and connect with them.
-
RE: New ecommerce site: Close old site and full domain redirect or keep it linking to new site?
The biggest decision here is whether or not you want to continue to maintain two separate websites--or if it's better to spend your time building one brand and concentrating on one website.
When it comes to links, Domain Authority, and Page Authority, you can easily 301 redirect the old site to the new one--so that really shouldn't be a concern. It's the overall decision of maintaining and building two sites and two brands--rather than one.
-
RE: Webmaster Tools Average keyword position
Webmaster Tools' keyword data is exactly what it says: it's the AVERAGE position whenever that keyword showed up in the search results. There are a lot of factors that can prevent you from seeing it in the position they report:
- the timeframe you are looking at in Webmaster Tools (it varies day to day)
- customization or personalization factors
- the data is different now than it was 2 days ago (GSC data is delayed by 2 days)
- the GSC data may not reflect any penalties that the page or site may have
-
RE: Can you give me some advices to rank this domain?
Those keywords are extremely competitive. What we generally recommend is that you go after geo-based keywords such as "city name + mobile development".
The other issue I'm seeing is that there are some low quality backlinks pointing to the site. You'll need to work on the link profile and get the overall Domain Authority and overall trust much higher than it is currently.
-
RE: Killing it in Yahoo/Bing...Sucking it in Google. What gives?
Audra,
One of the big differences between Bing and Google is how they deal with links. Google analyzes links to your site a lot more than Bing does, and is "pickier" when it comes to the types of links and how they're obtained. I would review all of the links to your website and make sure that they're all "high quality" links. A quick analysis of the links to your site shows some undesirable links pointing to the site that violate Google's webmaster guidelines.
-
RE: Web Site Migration Testing and SEO-QA Automation?
George,
There aren't any tools built specifically for SEO migration testing, but there are several general tools that will do the job. If you're an advanced SEO and know what you're doing, you can use tools like Screaming Frog's SEO Spider to crawl the list of URLs, or even Scrutiny (on the Mac) to crawl the URLs and grab the data you need.
-
RE: History of Page or Domain Authority...how?
Right now there is no tool that is going to show you Domain Authority over a period of time. However, you can look at the site's rankings over time if you were to use SEMRush.com. You can also download the site's backlinks; most link tools will show you the history of the links and when that site acquired a particular link.
-
RE: Search traffic hit after switching magazine to subdomain
Whenever you change to a new subdomain, or you change domains, you'll need to tell Google that you're doing it. Here's what we recommend:
1. Set up 301 Permanent Redirects from the old URLs to the new URLs. This will forward/redirect users to the new location, including the search engine crawlers.
2. Verify the new location (the new subdomain) in Google Search Console and use the Change of Address Tool to tell Google that you've moved.
3. Update Google Analytics so that it reflects the new URL.
By verifying the site you'll see the traffic in the Google Search Console and Google will recognize that change faster if you use the Change of Address tool.
-
RE: Yelp Jumps Into Home Services - Will You Jump With Them?
Just like any other marketing or advertising option, you really have to test it first. There are going to be certain home services businesses that may do well with Yelp, and then there are going to be others that don't do well with it. So generally speaking, we do recommend that businesses test it and see if they get leads from it and if it's profitable for them.
-
RE: CcTLDs vs folders
There definitely is a benefit for keeping all of your content on one domain (using folders), and building up the overall Domain Authority of one domain/one site.
When it comes to making the decision on whether or not to go to a ccTLD, consider your users/visitors first. How will they interact with the site, will they trust it more if it's a ccTLD in their country? If so, then consider the fact that it will ultimately be better for your business if the users like it and trust it better.
Another consideration is the fact that you'll be creating an entirely new site on a ccTLD. You'll be starting fresh, and will need links and time to ultimately get it to rank and get the traffic to where you need it to be. Then there's the whole issue of content: you'll need unique content for the new site. If you can afford the time and effort involved in creating a completely new site, and it makes sense for users, then I would consider the ccTLD route.
-
RE: What is considered a keyword score that is too competitive?
It really depends on your particular website, and the overall Domain Authority of your site versus the Domain Authority of your competitors (or the sites you want to compete with). If your Domain Authority is in line with the other sites that you want to rank against, then you shouldn't really have any problems ranking for those keywords.
However, if your site's Domain Authority is half of what the other sites are, then it may be more difficult. I usually recommend working on getting trust first, and working on overall Domain Authority to get your site in line with the other sites that already rank for those keywords. Then, start looking at trying to rank for specific keywords.
-
RE: Is my client's site penalized and if not, why are other lesser quality sites ranking with lower metrics than my client's?
It's always extremely difficult to diagnose exactly what's going on when you don't have any specific URLs to refer to. However, you mentioned that it appears to be one keyword or one particular page--not a lot of pages or all the pages on the site. So, I don't think it's a penalty on the site. Rather, it may just be some "over optimization" for that particular keyword or that one page.
What I would focus on is the anchor text pointing to that page. I would look at the anchor text of links to that page, and then to the pages that are ranking to see what the differences are. Also, while I do like OSE, you have to realize that all of these link data tools have different ways of crawling, and may not pick up on all the links. It's best to consult several different backlink tools, along with Open Site Explorer.
-
RE: What is the "UPDATE" indicate in the Google Search Console Query Reports?
It means that Google updated the data on April 27th. John Mueller from Google wrote about it here.
It is related to how Search Console reports & calculates clicks and impressions in Search Analytics.
"As a result, you may see a change in the click, impression, and CTR values shown there. For most sites, this change will be minimal. A significant part of this change will affect website properties with associated mobile app properties. Specifically, it involves accounting for clicks and impressions only to the associated application property rather than to the website. Other changes include how we count links shown in the Knowledge Panel, in various Rich Snippets, and in the local results in Search (which are now all counted as URL impressions)."
-
RE: Will reviews be ranked higher if responded to?
As far as we could find, there haven't been any recent articles written about this. At least not where someone has tested it and publicly come out with the results. However, we aren't particularly interested in the ranking or order of reviews per se; our biggest concern is more of a customer service issue. All reviews, good & bad, are responded to in a very prompt manner. That way people reading the reviews will see the response.