I know it's frustrating; we have come across lots of listings like this in the past. About the only thing you can do is flag the listing and explain that it's a duplicate. You might have several people flag the listing from different Google accounts, as well.
Posts made by GlobeRunner
-
RE: Can I request removal of a duplicate competitor Google Business listing?
-
RE: UK English and USA English - two flags on navigation?
Yes, you can definitely use an IP redirect and redirect users to the correct 'version' of the site based on their IP address. You still may want to add a link to both versions, though, since it is possible that someone from the USA might be traveling in the UK and want to see the USA version of the site.
Keep in mind that these two versions will most likely look like duplicate content to Google, so you may want to allow Google to see only one 'version'. Alternatively, use the proper hreflang tags to indicate US English and UK English on each version.
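As a rough sketch of what those hreflang annotations look like, here's a small helper that generates the reciprocal link tags for a hypothetical US/UK page pair (the URLs are placeholders, not from the original question):

```python
# Sketch: generate reciprocal hreflang <link> tags for a US/UK English page pair.
# Every page in the pair should carry the full set of tags, including one for itself.
def hreflang_tags(url_by_lang):
    """Return the <link> elements each page in the cluster should include."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(url_by_lang.items())
    ]

tags = hreflang_tags({
    "en-us": "https://www.example.com/us/",
    "en-gb": "https://www.example.com/uk/",
})
for t in tags:
    print(t)
```

The same set of tags goes in the `<head>` of both the US and UK versions, which is what tells Google the two pages are language variants rather than duplicates.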
-
RE: Delays in Search Console Data
Google Search Console data is delayed, and so far Google hasn't been transparent about the issue. Hopefully they're working on an update and will start giving us more data soon.
-
RE: Individual Property Listings
This is certainly a big issue for all of the real estate websites that pull in the MLS listings, since those are essentially duplicates of each other. It creates a lot of duplicate content.
There are a few things to keep in mind:
- The site crawled first is considered the originator of the content, so you'll want to get those pages crawled as soon as possible.
- Sites with more Domain Authority tend to get their pages crawled quickly (you earn DA through links from other sites).
- Links to individual pages typically help rankings, and social media mentions can help, as well.
So, in order to get your site crawled quickly or frequently, you'll need to get your Domain Authority to be higher than your competitors' sites. Also, new pages need to have links. So, if you can get those pages (new MLS listings) some links quickly then that will help, as well.
-
RE: Improve my on-page SEO
Rob, generally speaking, creating pages or sections of your site just to target certain cities is not going to get you anywhere. Those typically are going to be considered "doorway pages" and violate Google's Webmaster Guidelines. What we typically recommend is being honest about your location, posting that location/address on your site, and using Schema.org markup to tell the search engines where you're located. Then, using Google My Business (Google Maps, etc.) to set up your local business listing is key. I would also consider working on local citations in order to get more listings for your business.
Since you're a photographer, using social media such as Instagram, Pinterest, Twitter, and Facebook is going to be key to driving visitors to your website, as well. Google is watching engagement on the social media sites, and that typically helps your site's search engine rankings.
When it comes to Yoast, we typically use it for on-page changes only, such as optimizing the title and meta description tags. If you need to redirect pages, you might consider a separate WordPress plugin to do that. If you're having trouble with duplicate pages showing up, then you may need to delete the specific duplicate pages and wait for Google to re-crawl your site.
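To illustrate the Schema.org suggestion above, here's a minimal LocalBusiness JSON-LD sketch built in Python; all business details here are hypothetical placeholders:

```python
import json

# Sketch: a minimal LocalBusiness JSON-LD block for a photographer's site.
# Every value below is a made-up placeholder to be replaced with real details.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Photography Studio",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Dallas",
        "addressRegion": "TX",
        "postalCode": "75201",
    },
    "telephone": "+1-555-555-0100",
    "url": "https://www.example.com/",
}

# Embed the output inside <script type="application/ld+json"> ... </script> on the page.
print(json.dumps(markup, indent=2))
```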
-
RE: Should I optimize my home-page or a sub-page for my most important keyword
Generally speaking, if you think of the keyword set as part of a large "topic", then you should optimize your site's home page for the main topic in general, and then the sub-topic keyword(s) and the "sub-sub" topic keywords would each get pages, as well. The sub-topic pages would be linked from the site's home page, and then the sub-sub topic pages would be linked from the sub-topic's page.
I like to think of this as a theme pyramid approach:
Main Topic
- Category
  - Sub-Topic
- Category
  - Sub-Topic
-
RE: Rich Snippets appear differently for Wikipedia, Why?
What I believe you're seeing is information taken from the knowledge graph. Wikipedia is a data source of the knowledge graph (there are a lot of sources), so it's logical that Google would include that information in their results.
-
RE: Hundreds of 404 errors are showing up for pages that never existed
The 404 errors and other crawl data are provided by a separate database at Google, so the timing of the links and the data you're seeing isn't exactly "in sync", so to speak. I would do a few things: first, look at the pages to see if there's something on your site that might be causing the errors.
Another option is to look at the link data to see if there are any links pointing to those pages. I would use another source, such as OSE or Majestic.com to see if there are links pointing to those pages that Google isn't reporting (which is often the case).
-
RE: Blogs created by a company for us and another company
Jordan is right, most likely you will run into duplicate content issues if you allow duplicate content to be posted on your website. That duplicate content definitely hurts your site's search engine rankings. If you feel the content is good for your site's users, then you may want to keep it, but make sure that it's not being indexed by the search engines.
If you're paying someone to provide you content and that content isn't unique for your website, then I wouldn't see it as very valuable--at least for SEO purposes. I would re-consider whether or not you want that duplicate content on your website, let alone pay a premium for it.
-
RE: Can you expedite the correction of erroneous information on the Knowledge Graph?
Some people have been successful using non-traditional means of contacting Google for updates or changes to unsupported "products" (such as going through Google My Business or AdWords support), so that route may be helpful.
When it comes to getting specific Knowledge Graph information corrected: if you are logged into your Google Account and visit the Google search results where that knowledge graph appears, you may be asked if the information is correct. That's where you need to submit the information or corrections. It usually takes only a few days, based on our experience with making corrections there. The "official representative" of that organization needs to be logged in; it won't work with another Google Account, and you won't see the option to suggest changes if you're not the official representative.
The other issue is that you need to figure out where the data is coming from. The Knowledge Graph is made up of a lot of different sources, and you need to get it corrected at the source.
-
RE: Duplicate Content for Non-SEO Purposes
Jordan is right; I don't recommend spinning any of the content. It generally doesn't turn out well for users, and tends to make the content unreadable.
Your best bet here is to think of Search and Email as two separate things. If you're going to use the content in multiple newsletters and reach other audiences (the same person won't read it twice), then that's perfectly fine. However, if you're going to allow ANY of it to get indexed by the search engines, then I would only allow one copy to get indexed. You can archive the other copies, but just make sure that you don't allow indexing of those copies.
Duplicate content isn't generally thought of as a 'penalty'; it's more about the fact that only the first crawled version gets indexed, and the others generally don't.
-
RE: ECommerce internal linking structure best practice
Like you mention, the introduction copy on each category page should contain naturally placed links down to subcategories and products, and each subcategory should link back up to the main category page. What we typically recommend is that you concentrate on linking to "on topic" pages.
Take a look at what Amazon has done, as their internal link structure has always worked well for them. Each product links to other related products, and products that "others have bought" or viewed.
-
RE: Outreach, Relationships and Link Development
Using freelancers can be very successful--and it can also be a nightmare as well. What we typically recommend is that the more specific you are in what you want that freelancer to do the better. So, just asking them to get more links or work on outreach is setting yourself up for disaster.
If you are able to set up very specific tasks, use a task management system, and have the proper training in place then using freelancers should work out well for you.
-
RE: Yelp Jumps Into Home Services - Will You Jump With Them?
Just like any other marketing or advertising option, you really have to test it first. There are going to be certain home services businesses that may do well with Yelp, and then there are going to be others that don't do well with it. So generally speaking, we do recommend that businesses test it and see if they get leads from it and if it's profitable for them.
-
RE: Something happened within the last 2 weeks on our WordPress-hosted site that created "duplicates" by counting www.company.com/example and company.com/example (without the 'www.') as separate pages. Any idea what could have happened, and how to fix it?
Most likely it's a setting in WordPress that has been turned on or turned off. I'm not sure how you were redirecting your site from www.company.com to company.com, but that's typically done in the .htaccess file on the site.
What you'll need to do is verify that it's still happening first. Use a server header check tool to see if company.com is redirecting to www.company.com (or vice versa).
There are several ways to set up these redirects, and you'll have to figure out how you were doing it previously. If you were using a plugin, it could be that the plugin was removed or deactivated (or updated). It could also be in the site's theme, some themes allow you to set your "preferred version" of your domain.
Lastly, I would go into Google Search Console and make sure you have set your preferred domain there (www or non-www) so Google knows which version to use. If you have it set there, there's a chance that your site's rankings may not be affected.
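As a sketch of the canonicalization logic that the .htaccess rule enforces, assuming the non-www version is the preferred one (flip the check to prefer www), with a made-up domain:

```python
# Sketch: decide where a request should 301-redirect, mirroring what an
# .htaccess RewriteCond on %{HTTP_HOST} plus a RewriteRule [R=301,L] would do.
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "company.com"  # hypothetical preferred (non-www) domain

def canonical_url(url):
    """Return the URL to 301-redirect to, or None if the URL is already canonical."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host == PREFERRED_HOST:
        return None  # already on the preferred host, no redirect needed
    if host == "www." + PREFERRED_HOST:
        return urlunsplit((parts.scheme, PREFERRED_HOST, parts.path,
                           parts.query, parts.fragment))
    return None  # some other host entirely; out of scope for this check

print(canonical_url("https://www.company.com/example"))
```

If this function returns a URL for the www version and None for the non-www version (or vice versa), only one version of each page is reachable, which removes the duplicate-page problem.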
-
RE: International SEO
Rankings take time. When you say "problems with .com on English in my country", are you referring to the fact that you have a .COM or do the other sites have the .COM? I would concentrate on getting your site's Domain Authority higher and getting more trusted links to your site from your country.
Also, make sure you're using the hreflang tags correctly on your site, that can certainly cause issues with Google if they're not used correctly.
-
RE: What is considered a keyword score that is too competitive?
It really depends on your particular website, and the overall Domain Authority of your site versus the Domain Authority of your competitors (or the sites you want to compete with). If your Domain Authority is in line with the other sites that rank for those keywords, then you shouldn't really have any problems ranking for them.
However, if your site's Domain Authority is half of what the other sites are, then it may be more difficult. I usually recommend working on getting trust first, and working on overall Domain Authority to get your site in line with the other sites that already rank for those keywords. Then, start looking at trying to rank for specific keywords.
-
RE: Duplicate Page Titles
Your site's title tags should exactly reflect the page's content. So, if one product is an Obama dart board backboard and another is a Texas dart board backboard, then each page would have a unique title. Unless the title tags are exact duplicates, you probably don't have an issue; the title tags are similar, but they're not exact duplicates.
You didn't mention as to where these are being flagged as duplicates, if it's in Moz or in Google Search Console, so it's tough to give you any more specifics.
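If you want to check this yourself, here's a quick sketch that flags only exact-duplicate titles and ignores merely similar ones (the URLs and titles below are made up):

```python
# Sketch: group URLs whose <title> text is identical after normalizing
# whitespace and case; similar-but-distinct titles are not flagged.
from collections import defaultdict

def exact_duplicate_titles(titles_by_url):
    """Return {normalized_title: [urls]} for titles shared by 2+ URLs."""
    groups = defaultdict(list)
    for url, title in titles_by_url.items():
        groups[" ".join(title.split()).lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

pages = {
    "/obama-dart-board": "Obama Dart Board Backboard",
    "/texas-dart-board": "Texas Dart Board Backboard",
    "/obama-dart-board-2": "Obama Dart Board Backboard",
}
print(exact_duplicate_titles(pages))
```

Here only the two pages sharing the identical Obama title are grouped; the similar Texas title is left alone, which matches the distinction drawn above.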
-
RE: Webmaster is giving errors of Duplicate Meta Descriptions and Duplicate Title Tags
If you have changed the permalinks structure, then I would use a crawler to crawl your own site and look at the meta data. The "duplicate title tags", etc. errors take some time to show up in Google Search Console, at least a day or two. So you may want to re-check it in a few days if you just made corrections on your site.
You can use a campaign here in Moz to check the data, or you can use a crawler like the Screaming Frog SEO spider to check all the data yourself (which is much quicker).
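If you'd rather script the check yourself, here's a minimal sketch using Python's standard-library HTML parser to pull the title tag and meta description out of a page (the sample HTML is made up):

```python
# Sketch: extract the <title> and meta description from a page's HTML --
# the two fields the "duplicate meta" warnings are about.
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

html = ('<html><head><title>Widgets | Example</title>'
        '<meta name="description" content="Buy widgets."></head></html>')
p = MetaExtractor()
p.feed(html)
print(p.title, "|", p.description)
```

Run this over each crawled page and compare the extracted values to find duplicates without waiting for Search Console to refresh.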
-
RE: 'SEO Footers'
You're exactly correct--if a link or a list of links is buried in the site's footer then they're really not useful to visitors. No one really clicks on those links. I recommend only having links like "terms of service", "privacy policy", etc. links there.
If you want proof that no one clicks on those links, check out the In-Page Analytics report in Google Analytics to see where people are clicking on your page. You will see that no one clicks on footer links, especially link lists.
I would remove any link lists that you have in your site's footer.
-
RE: Do I need to do a 301, as well as adding re-write rules on Apache
The URL rewrite "should" take care of it. So, when you enter www.domain.com/page.html it would 301 redirect to domain.com/page.html (or vice versa). Once you have the rewrites in place, then test them by going to a page and see if it redirects properly. Use a http server header checker to see if it's serving up a 301.
Even though the internal links may be using your preferred domain, you'll want to make sure that anyone requesting (or linking to) the "wrong version" will be 301 redirected to the right version of the page.
Also, I would go into Google Search Console and make sure you tell Google which version is your preferred version: the www or non-www version of your site.
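As a sketch of what that header check is verifying, here's a small helper that confirms a raw HTTP response is a 301 pointing at the expected URL (the response text is a made-up example of what a header checker or `curl -I` would show):

```python
# Sketch: verify a raw HTTP response is a 301 whose Location header
# points at the preferred version of the page.
def check_301(raw_headers, expected_location):
    lines = raw_headers.strip().splitlines()
    # Status line looks like "HTTP/1.1 301 Moved Permanently" -> grab "301".
    status = lines[0].split()[1]
    location = next((l.split(":", 1)[1].strip()
                     for l in lines[1:] if l.lower().startswith("location:")), None)
    return status == "301" and location == expected_location

response = """HTTP/1.1 301 Moved Permanently
Location: https://domain.com/page.html
Content-Length: 0"""
print(check_301(response, "https://domain.com/page.html"))
```

A 302 or a wrong Location value would fail this check, which is exactly the misconfiguration the rewrite rules are supposed to prevent.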
-
RE: Http://newsite.intercallsystems.com/vista-series/sales@intercallsystems.com
I'm not sure where you're seeing these crawl errors. Most likely, though, they come from links on your website where you list your contact information (your email address). The pages that list sales@intercallsystems.com might have the email link coded incorrectly (for example, missing the mailto: prefix, so crawlers treat it as a relative URL appended to the page's path).
-
RE: 404 crawl errors ending with your domain name??
In order to understand your question, can you give me more information? What crawler are you using? What do the URLs look like? You can give an example, just remove the domain name if you like.
-
RE: What Google Analytics Data to Share with Potential Website Buyer
Initially, I would get the competitor to sign an NDA so that there aren't any issues later. It really doesn't matter how much you're going to sell the website and domain name for, you want to protect yourself in the future.
Initially, I would point them to SEMrush.com for data, as that typically has as much as someone would need. And, it's a third party offering the data. As for Google Analytics access, I wouldn't give them Analytics access at first, I would ask them what data they are looking for. You can typically give them a PDF that shows the past year of page views, showing the "monthly" view. I also would share with them the referral sources, so that they understand that the site isn't gaming the page view numbers.
-
RE: How do I set up 2 businesses that work together but are run separately, with two separate websites but similar content?
It is definitely possible to maintain two separate websites. It sounds as if they don't do the same thing, so technically the content will be different. The websites use a different template, and from what I can tell they don't have any duplicate content issues.
Since the sites are related, it would be natural for the two websites to link to each other. But you don't want to run into any search engine penalties from having them link to each other. In order to do that, you'll need to make sure that the link profiles of each website are completely different.
What I would focus on is the links to each website, and have a plan to acquire good, trusted links to each of them. One is product related, so you'll want to focus on where you can get your products listed. The other is a service type business, so getting the site links that are appropriate for that site would be helpful.
-
RE: Whitehat site suffering from drastic & negative Keyword/Phrase Shifts out of the blue!
I know this is frustrating. There are a few areas that I would look into that could be causing this: duplicate content issues and links. First, look to see if you have any duplicate content issues on the site. There could be a duplicate copy of the site (perhaps a dev version that should not be indexed) or even certain content on your site that's causing issues. You might try Siteliner's crawler to identify if there are any issues you can fix.
Another possible reason is the links to the site. The site could have been hit by negative SEO, and a lot of "low quality" links or off-topic links could be pointing to your site. I've seen this in the past, and the only thing you can do is identify the links and disavow them. Sometimes you can get them removed, but disavowing them should work.
-
RE: Is my client's site penalized and if not, why are other lesser quality sites ranking with lower metrics than my client's?
It's always extremely difficult to diagnose exactly what's going on when you don't have any specific URLs to refer to. However, you mentioned that it appears to be one keyword or one particular page--not a lot of pages or all the pages on the site. So, I don't think it's a penalty on the site. Rather, it may just be some "over optimization" for that particular keyword or that one page.
What I would focus on is the anchor text pointing to that page. I would look at the anchor text of links to that page, and then to the pages that are ranking to see what the differences are. Also, while I do like OSE, you have to realize that all of these link data tools have different ways of crawling, and may not pick up on all the links. It's best to consult several different backlink tools, along with Open Site Explorer.
-
RE: Is pagerank still a ranking factor for Google?
Google may still be using something related to PageRank or may still be using PageRank internally. They may use it for determining which pages to crawl first--and which ones to crawl less often.
However, publicly there is no PageRank data available, and it's not going to be updated in the future. So, we can consider it dead at this time.
-
RE: GWMT / Search Analytics VS OpenSiteExplorer
Whenever you deal with links, even though I really like OSE, typically we have to compile all of the link data from multiple sources. We typically use OSE, Majestic, ahrefs, Google Search Console, as well as others and compile all of the links into one spreadsheet and then look at them there. Different sites have different crawlers and no one source is the most accurate.
-
RE: Anybody have a SMX West 2016 Coupon Code?
I would check the social media sites, some sponsors may have coupon codes, and they typically share those codes on the social sites if you follow and connect with them.
-
RE: CcTLDs vs folders
There definitely is a benefit for keeping all of your content on one domain (using folders), and building up the overall Domain Authority of one domain/one site.
When it comes to making the decision on whether or not to go to a ccTLD, consider your users/visitors first. How will they interact with the site, will they trust it more if it's a ccTLD in their country? If so, then consider the fact that it will ultimately be better for your business if the users like it and trust it better.
Another consideration is the fact that you'll be creating an entirely new site on a ccTLD. You'll be starting fresh, and will need links and time to ultimately get it to rank and get the traffic to where you need it to be. Then there's the whole issue of content, you'll need unique content for the site. If you can afford the time and effort involved in creating a completely new site, and it makes sense for users then I would consider the ccTLD route.
-
RE: Paid Link/Doorway Disavow - disavowing the links between 2 sites in the same company.
I'm not sure if I totally understand your question. Can you explain what type of doorway/paid link activity you're referring to?
You need to evaluate each link, one by one. If the link is a good, natural link, then I wouldn't disavow it. If that link is a sponsored link or paid link, then you might consider adding a nofollow link attribute. If it's something that violates Google's Webmaster Guidelines for linking, then you should remove the links, not just disavow them.
-
RE: Doing a re-design but worried about my new navigation affecting rankings
Generally speaking, the navigation should be fine. I would be more worried if you were going to change URLs (the page URLs would be changing to other URLs). I would take a look further at the internal linking structure currently and see which pages link to which pages. Then, consider if your new navigation would add more internal links to those pages or take away internal links to those pages.
You can crawl your own site and see how many links are pointing to certain pages, in order to see whether your new navigation will increase or decrease the internal links to those pages.
-
RE: Strange 404s in GWT - "Linked From" pages that never existed
It's quite possible that at one point there was a link there--because the page rendered for some reason. I would crawl the site yourself using a crawler (there are several available) to make sure that the page isn't reachable from, perhaps, a bad link on the site.
Check the archive.org to see if the page existed at one time or not.
I would also take a look at the page's server header again to see if the site is showing a 404 error or a "200 ok" along with a "page not found". It's possible that the page doesn't exist but it delivers a "200 OK" server header anyway. Another option is that it might be in your sitemap.xml file.
When in doubt, if the page doesn't exist, I would mark it as fixed in Google Webmaster Tools and watch if it comes up again. If it doesn't come up again as an error, then I wouldn't worry too much about it.
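As a sketch of the "soft 404" case described above (a page that says "not found" but returns a 200 OK header), using some made-up marker phrases:

```python
# Sketch: classify a response as a hard 404, a "soft 404" (200 OK with a
# not-found body), or a normal page. Marker phrases are illustrative only.
NOT_FOUND_MARKERS = ("page not found", "does not exist", "404")

def classify(status_code, body):
    looks_missing = any(m in body.lower() for m in NOT_FOUND_MARKERS)
    if status_code == 404:
        return "hard 404"   # correct: missing page, missing status
    if status_code == 200 and looks_missing:
        return "soft 404"   # the problem case: server says OK, page says gone
    return "ok"

print(classify(200, "<h1>Page Not Found</h1>"))
```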
-
RE: Can a homepage have a penalty but not the rest of the pages?
Daniel,
Yes, it is very possible (and that's a lot of the penalties we see lately). The penalties are only given to certain pages on the site, usually because of the over-optimization of anchor text in links pointing to those penalized pages.
So, I would do a thorough review of all the links pointing to the site and make sure that the anchor text isn't over-optimized. There should be more "brand" phrases and compound phrases rather than exact-match keyword anchor text links pointing to the page.
In Google Webmaster Tools, when sites get manual actions, they're either site-wide or partial matches when it comes to the link penalties that are applied to sites.
-
RE: Disabling a slider with content...is considered cloaking?
ACann, as long as you're not serving up content to the search engines that's different from what users see, there shouldn't be a problem. It would be an issue if you disabled it just for users but still allowed search engine bots to crawl the content. It sounds like you disabled it for all visitors, so I don't see any issues.
-
RE: Is there a way to set up a wordpress site so that the content is changed based on a location?
Ron,
There are WordPress plugins that allow you to serve content based on geolocation. For example, the "Custom Content by Location" plugin comes to mind http://wordpress.org/plugins/custom-content-by-country/
Alternatively, outside of WordPress, you can use PHP code to determine the user's location and then serve them up separate content. There's a sample of that code here: http://www.adviceinteractivegroup.com/how-to-display-unique-content-based-on-geolocation/
-
RE: Wordpress themes causing google penalty(need experts to settle a debate)
The theme generally doesn't matter at all. As long as the content is unique on your site, then there's no problem whatsoever. Like previously mentioned, there are plenty of people using the same WordPress theme, and it's the content that's unique, not the theme.
-
RE: Site Redesign: 302 Query
Hemblem,
Although you're redesigning the site, I actually don't recommend using a 302 redirect during the 'redesign' process, as it can have disastrous effects on search engine rankings. I would prefer that you keep the current website up and running: and then 301 redirecting the appropriate pages when the site is ready to go live.
I realize what you want to do, but I have seen too many websites have problems getting things straightened out with the search engines after doing what you're suggesting.
-
RE: Brain Teaser - Dead Link Ranking in SERP's
Vanadium Interactive, I'm not sure what you're asking here. Obviously the first order of priority would be to fix the site. Most likely there's an issue with the .htaccess file that is causing the issue, since the non-www version of the site is accessible and the www version is not. Should be an easy fix by any competent programmer familiar with .htaccess files.
The next priority would be to get rid of the iframes on the site entirely; there's just no need for them anymore, and they're not very search friendly. Each individual page can get indexed separately, so someone might land on an individual frame page and not be able to navigate the site very well.
Sites like this generally rank well because of the links pointing to them, not because of the actual content on the page. I can see that the page was cached and it was working recently, but just recently ended up with the current problem.
-
RE: Estimating Search Volume An Impossibility?
Darcy,
For whatever reason, Google really doesn't want to give us the keyword search data that they have. Microsoft, however, takes a different approach: they give us the ACTUAL search numbers, and will even predict future volume. If you have access to MSN AdCenter, you can get that data. I prefer to use Bing Advertising Intelligence to pull the data into MS Excel and then crunch the numbers from there. You can start with a keyword list from Google AdWords, put the data into MS Excel, and get the real query numbers right there. Then, you can estimate Google search queries by scaling by the engines' market share percentages.
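As a sketch of that last calculation, with made-up market-share figures purely for illustration:

```python
# Sketch: project Google query volume from Bing's actual query counts by
# scaling with market share. Both share figures are illustrative assumptions.
bing_monthly_queries = 12_000
bing_share = 0.20      # assumed Bing/Yahoo share of searches
google_share = 0.65    # assumed Google share of searches

estimated_google_queries = bing_monthly_queries * (google_share / bing_share)
print(round(estimated_google_queries))  # 39000
```

The estimate is only as good as the share figures you plug in, so treat the result as a rough order-of-magnitude number rather than a real count.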
-
RE: Killing it in Yahoo/Bing...Sucking it in Google. What gives?
Audra,
One of the big differences between Bing and Google is how they deal with links. Google analyzes links to your site a lot more than Bing does, and is "pickier" when it comes to the types of links and how they're obtained. I would review all of the links to your website and make sure that they're all "high quality" links. A quick analysis of the links to your site shows some undesirable links pointing to the site that violate Google's webmaster guidelines.
-
RE: Rank Tracker - above 50
JG, since rankings change so much based on personalization, it's difficult to exactly pinpoint what position the site is in right now. If you were to use software or a service to check the rankings (there are several out there, including Moz), they would vary depending on the location of the searcher, their past searching history, and even if they're logged in or not.
Your best bet is to verify your site in Google Webmaster Tools and use the search query section to see what the average position is, how many impressions your site is getting, and how many clicks it's getting.
-
RE: How many articles are ok to publish a day on my website blog?
Federico is right, it's not necessarily how often you post, but what you post. Keep in mind, though, that typically just posting is not enough: make sure that you do some extra promotion of each post, like adding it to your Twitter account, Facebook account, Google Plus account, etc. so that there are links and social media mentions going to each post.
-
RE: How to lay off your SEO company?
Armin,
Before you tell your SEO company, I recommend that you thoroughly review any contracts that you signed in order to make sure you understand everything. I would make sure that you make backups of any content that has been produced, and make sure you change the appropriate passwords on the site and on any tools that the SEO firm may have access to: Google Webmaster Tools, Google Analytics, etc.
Your conversation should be focused around leaving your SEO firm in good standing if possible, because there could be issues later on that arise (if they're mad at you, they could start some negative SEO or cause other hassles).
-
RE: How long before your rankings improved after Penguin?
Dave, it's entirely possible for sites to start ranking again really quickly, within days or weeks. The problem is not that it takes Google a long time to process the disavow file; the problem is that the actual links in the disavow file need to be recrawled by Google. Since those are low-quality links, it will usually take some time for Google to recrawl them. From what I understand, Google has to crawl a URL first and see the link; then they look at the disavow file in order to 'process' (disavow) that link.
It's the crawling that is taking time, not the processing of the disavow file.
So, to speed up recovery, you'll need to get those disavowed links crawled again. Or force Google to recrawl them.
-
RE: First Link on Page Still Only Link on Page?
Tyler, just so we're clear: it's the first TIME the link is mentioned on a page that counts, not just the first link on the page.
So, if you have several links on the same page to page1.html then only the first time counts--the other mentions (if they have different anchor text) don't count.
If you have a link that's useful (like a link to the site's home page), then having the logo at the top of your page link to your home page is useful. Having a link in the sidebar to the home page might be useful for users, as well, and then there is the link to the home page in the footer. Those are all useful for users, even though only the first one counts for link purposes.
-
RE: Recovering from an algorithmic bodyslam
Ryan,
I'm sorry you're having trouble. You're not the only one who is having issues like this. My first thought is that you're probably not seeing all of the links to the site. You haven't mentioned which tools you've used to gather the links, but nowadays you need to use several tools and combine the data. Then it's really a fine art, in a way, to figure out which links to remove and which ones to keep. You mentioned that there are a few hundred sites you contacted; we typically try to remove hundreds or thousands of links, not just 100 or so.
The other issue here is that you may have targeted the correct links to remove. But it takes time for Google to recrawl those links and then give you credit for disavowing them. You can speed up that process by forcing Google to recrawl those links.
-
RE: A while back there was a strategy presented about developing games or widgets and then using them for link bait. Is this still a viable strategy to improve ranking?
Lara, this has been debated recently, and it's kind of a grey area. If you're going to build the app or widget mainly to take advantage of its SEO value, then that's really not a good plan. It's logical to include a link back to your site since you're the developer of the widget or app, so that's fine. But if you really just want a keyword-rich anchor text link to your site, then I would take a different approach.
Your best bet is to consider the traffic that the widget or app will bring your site, and the notoriety your company will get by providing a widget or app. If you include a link, then it should be a branded link and not a keyword rich anchor text link.