Does Google play fair? Is 'relevant content' and 'usability' enough?
-
It seems there are 2 opposing views, and as a newbie this is very confusing.
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well.
Which is closer to the truth? No one wants to have a great website that won't rank because Google isn't sophisticated enough to see that it's playing fair.
Here's an example to illustrate one related concern I have:
I've read that Google doesn't like duplicated content. But here are two cases in which it is more 'relevant' and 'usable' for the user to have duplicate content:
Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions. It may also have links for 20 types of restaurants. The user has a choice. Say the user chooses a region. The resulting page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region. IOW, there may be a 'mega-menu' at the top of the page which is duplicated on every page in the site, but is very helpful: instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site even though arguably it is the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind.
Thoughts?
-
Hi David,
Sorry for such a delayed response, but I keep wondering about your point on the meganav. Is it known that Google is able to figure out menus and won't count those toward duplicate content? I just would like to be sure, since my menus are fairly substantial when dropdowns are included.
-
You are giving me SOME hope for a site I've been working on for about 5 years and am getting ready to launch. Thanks very much.
-
Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. [...] Are those significant factors used by Google?
In my opinion, Google has every ability to measure visitor actions. They own the Chrome browser and could measure the engagement of visitors with a page; they have access to what gets bookmarked in Chrome; they know when a visitor clicks in the SERPs and when that same visitor reappears in the SERPs; they don't have to rely on links because they can read when people mention your site in a forum; they know if people navigate to your site by typing its name into search... I believe that all of these things are important for rankings, but how important I can't say.
I have lots of really good content that ranked at #150 or deeper in the SERPs when I published it. Then I built zero links and did zero promotion, and slowly the page rose in the SERPs; over a year later it is now in the top three. I have hundreds of pages that have done that. You gotta have a LOT of patience to do things that way, but you spend zero effort on promotion and 100% effort producing assets for your website. That is what I have done since about 2006. Virtually zero linkbuilding. My visitors are my linkbuilders.
-
EGOL, Thanks very much. I, being a one person biz, am very interested in the idea of ranking by popularity, as my goal is to have the best site out there but I have limited funds to promote it. Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. After all, usage and return usage is what it is all about! Are those significant factors used by Google? If so maybe there is hope..:)
-
Egol has this summed up perfectly!
-Andy
-
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well.
Which is closer to the truth?
They are both a small piece of the truth. To rank on Google your PAGE must:
-
be relevant to the search term and presented to Google with a proper title, crawlability, and text visibility
-
have substantive content about the search term
-
be validated by other websites by being linked from them or mentioned by them (these are just a few validations)
-
be validated by visitors because they have queried it by name, stayed on it, bookmarked it, mentioned it by name in web readable content (these are just a few validations)
Any idiot can do #1. A good author can do #2. But, #3 and #4 are really difficult to accomplish by people who are not related to you or paid by you.
In low competition, #1 and #2 can be enough to get you ranked. The higher the competition for a query, the more you need #3 and #4 to rank. For some queries it can be almost impossible for a newcomer to rank on the first page of Google without investing $xxx,xxx or more in website assets and promotion.... AND... having a plan in place to present the site in a way that Google will be able to read and interpret it in a way that maximizes the #3 and #4 assets.
-
A meganav is not considered duplicate content. Duplicate content means product description pages that are identical, having the same articles multiple places on your site, etc.
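To picture why a meganav isn't a duplicate-content problem: repeated navigation is generally treated as boilerplate that a crawler can identify and set aside before comparing pages. Here is a toy sketch of the idea in Python (the HTML and the `<nav>` convention are illustrative assumptions, not how Google actually does it):

```python
from html.parser import HTMLParser

class BoilerplateStripper(HTMLParser):
    """Collects page text while skipping anything inside <nav> elements,
    to approximate how a crawler might ignore repeated menus."""
    def __init__(self):
        super().__init__()
        self.nav_depth = 0   # > 0 while inside one or more <nav> blocks
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "nav":
            self.nav_depth += 1

    def handle_endtag(self, tag):
        if tag == "nav" and self.nav_depth > 0:
            self.nav_depth -= 1

    def handle_data(self, data):
        if self.nav_depth == 0 and data.strip():
            self.chunks.append(data.strip())

def main_text(html):
    """Return a page's text with the navigation boilerplate removed."""
    parser = BoilerplateStripper()
    parser.feed(html)
    return " ".join(parser.chunks)

# Two pages sharing the same mega-menu but with different main content:
menu = "<nav>Regions: Downtown | Midtown | Uptown — Types: Thai | Sushi</nav>"
page_a = menu + "<h1>Downtown restaurants</h1><p>Listings for Downtown.</p>"
page_b = menu + "<h1>Thai restaurants</h1><p>Citywide Thai listings.</p>"

print(main_text(page_a))  # the menu text is gone; only unique content remains
print(main_text(page_b))
```

Once the shared menu is stripped, the two pages have nothing in common, which matches the point above: a repeated menu is not what "duplicate content" refers to.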
To the main parts of your question - Google does not want it to be easy for people in the SEO world. They give guidelines, but following them means nothing. What Google considers an OK tactic one year becomes an unacceptable tactic the next (see guest blogging). There are many ways to succeed in ranking. Some follow Google's rules and wait for rankings to come; others use tons of spammy tactics and rank instantly (though they always risk losing it overnight if Google catches on).
The idea that an easy-to-use site and relevant content will make Google rank you fairly is a joke. And though only one has said it publicly, there are many top minds in the SEO world who will tell you that in private.
Related Questions
-
Duplicate Content - Local SEO - 250 Locations
Hey everyone, I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages, all with identical content (the same 500-750 words) except for unique meta-data and NAP with each respective location's name, city, state, etc. I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side-effects are because search engines don't know which page to serve if there are duplicates. So here's my question: If someone searches for my client's services in Miami, and my client only has one location in that city, does duplicate content matter? Because that location isn't competing against any of my client's other locations locally, so search engines shouldn't be confused about which page to serve, correct? Of course, in other cities, like Phoenix, where they have 5 locations, I'm sure the duplicate content is negatively affecting all 5 locations. I really appreciate any insight! Thank you,
Local Website Optimization | | SEOJedi510 -
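A rough way to picture what "identical content" in a question like the one above looks like to a similarity check is word-shingle overlap (Jaccard similarity). This is a simplified illustration, not Google's actual algorithm:

```python
def shingles(text, k=3):
    """Set of overlapping k-word windows ('shingles') for a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets, in [0, 1]."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical service-page copy:
original = "hand made oak table with a natural oil finish and tapered legs"
copied   = "hand made oak table with a natural oil finish and tapered legs"
rewrite  = "solid pine desk featuring a painted finish and straight legs"

print(jaccard(original, copied))   # 1.0 — the pages are identical
print(jaccard(original, rewrite))  # near 0 — clearly different pages
```

Two pages that differ only in the city name in the NAP block would score close to 1.0 under a measure like this, which is why the 250 near-identical service pages look interchangeable to a search engine.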
Diagnosing a likely Penguin Penalty that's never been recovered from
The context: my market Here, for reference, is what I’d like to see with my website (New York Jazz Events), and I think I deserve to see: https://www.dropbox.com/s/1gf2ajw80iciqii/Screenshot 2015-11-27 12.09.08.png?dl=0 Intrigued by that screenshot? Please read further! I have only a few competitors in my market (jazz bands offered in the city of New York for corporate events and weddings), those being Gigmasters, Gigsalad, and Thumbtack. (Each of those three, by the way, are much more general sites than mine (they offer everything from musicians to jugglers), and should be behind me if one is ranking based on quality and relevance.) Of the next nearest type of competitor, single, individual jazz (which also should be behind me if one is ranking based on quality and relevance), there are a dozen or so. The context: my plans No matter what, at the least I’m going to be doing a complete modernization and redesign of my site soon. Please refer to the following screenshot of my Google organic traffic throughout the life of my site while reading the account that follows: https://www.evernote.com/l/AAOQpSw8Hn9DGpCQAt5onH9WMBiwGTDcCk8 What I’d like to find out: exactly what caused the Penguin penalty (if there was one); exactly what would remove it and restore my site to its previous standing. You can see that when my site launched, it only took four months (12/10-4/11) for it to consistently, and seemingly effortlessly, ranking 5th or 6th in Google for the most important keyword combinations related to my industry (such as “jazz band new york,” “jazz trio new york,” “jazz wedding new york”). That's for a new site with no backlinks. From this I inferred that there is little to no direct competition in this market (i.e., jazz bands in New York marketed specifically for weddings and corporate events). 
Then, around November of 2013, I paid for some bogus links (51 to be exact) to these keyword combinations in order to improve the ranking of my site, which worked briefly (see Google Analytics screenshot, January 13) until Penguin launched the following Spring, at which time my site was essentially removed from the search results altogether due to an apparent algorithmic (not manual) penalty which I presumed were due to these links (although I could be wrong, it could be penalized due to something else that I don’t understand). After removing most of the bad links (down to 3 from 51, see https://www.dropbox.com/s/kolb665rth47q11/bad links 2013-10-24 explorer.numbers?dl=0) and disavowing all the offending URLs, and after Penguin updated to 3.0, Google still failed to recognize my site, with one odd exception: in Fall of 2014 it began to place the keyword combination "jazz bands new york" ("bands" plural, not singular) back on page one, and tied it to a completely undeveloped Google Plus page with zero reviews on it, that it displayed simultaiously (the “knowledge graph?” or “maps listing”?). (Google works in some strange and not very intelligible ways. For example, in a searcher removed the “s” from “bands” and the site remained banished from the results altogether. The same is true for every other keyword variation.) Encouraged by this unexpected development, last Winter (2014-15) and Spring (2015), I developed my Google Plus Local Business page with lots of useful videos and photos, increased the review count from 0 to 13 (all real and all five star, by the way), linked my YouTube page to it, and, on Google’s advice and against my better judgment, closed down my other Google Plus Local Business pages related to other business services I market on the web (I’m a graphic designer and videographer in addition to being a bandleader). (Unhelpfully, Google keeps them in the search results but just marks them as “closed.” Thanks so much, Google. 
I probably could have left them up.) I also made a massive effort to clean-up my local directory listings so far as possible, removing listings for my competing businesses (again, against my better judgment), making the format of my business address and contact information consistent so far as possible (I'm a service business and so hide my full address when possible, but this is not always possible depending on the policies of the particular citation website, hence some inconsistencies), and added this information to the footer of all the pages on my site. After making these improvements, rather than improving my rankings, my site was entirely removed from the first several pages of Google’s search results, including for the keyword combination "jazz bands new york.” On occasions when my site could be located (several pages down), it was no longer associated with my Google Plus Local Business page, unless one searched specifically for my site’s name, New York Jazz Events (which nobody does, because 99.9% of people searching on Google don't know my business name). Some questions this raised in my mind: Why did Google make a link between my site and my Google Plus Local Business page back when the page was undeveloped? Why did Google then break that link (stop the association my website with their business page (or knowledge graph, or maps listing, whichever it is now), apart from the exception noted above) once the Google Plus Local Business page was developed? And indeed, why wouldn't developing that page, along with cleaning up my citations, logically result in more search term combinations bringing my results back to the first page, along with the link to the Google Plus Local Business page, rather than the opposite? Then, unexpectedly, this last November my website rank for "jazz bands new york" in Google briefly returned from "buried" all the way to #1! 
And the 1st page of the search results was dominated by my site in three places, all #1: the top spot for paid ads (as usual), the very top of the natural search results (first time ever), and the top and only local listing, on the right! I was even ahead of two giant national corporate competitors, which would seem to be impossible to me as they probably have thousands of backlinks. I basically “owned” page one of Google to an extent I’ve never seen for anyone before. It was actually a bit bizarre. You can see this here: https://www.dropbox.com/s/1gf2ajw80iciqii/Screenshot 2015-11-27 12.09.08.png?dl=0 Now, what is also bizarre, was that, as before, I was still buried for every other keyword combination that's relevant to my site, including extremely similar combinations (for example, substituting "band" for "bands," or "NYC" for "New York," etc.). These keyword combinations essentially return the exact same results, only with my site missing from organic and local. As I mentioned, these astonishing results were temporary, and now my site is again buried for all keyword combinations including the once and sometimes astonishingly-performing “jazz bands new york.” Something else interesting and relevant to this conundrum: I’ve done searches for all my three major keyword search terms in Bing, and guess what? In the top three results for two out of the three of my search terms in organic results, with my Bing local listing right up there, and my other website (NYCJazz.com) not far behind! Now, it's strange to me that these incredibly great (and, as far as I'm concerned, high quality) Bing rankings lead to no inquiries, that nearly all of my customers find me from my paid advertising in Google, but that's another bafflement for another day… what is relevant to this discussion, is that my Bing results makes the essential invisibility of my website and my local business listing in Google's natural results all the more baffling. 
One could speculate that Google is a more sophisticated search engine and is returning more relevant results, except that that's not true… my site is in fact the most relevant for those terms (or at least, to be generous, in the top few in terms of relevance). And in the past, before Penguin, it used to be in the top few results in Google, just like in Bing. It's hard for me to swallow that I'm just lacking in proper SEO, when it used to rank great, when I've subsequently been working hard to further improve the SEO for years, and it's a top site everywhere else. Something has to be up with Google… I wish I knew what it was and what I could do… What I have done already: I’ve worked hard over the last five years cleaning up bad backlinks and making citations consistent. I think I understand well my most important keywords already, and have my pages optimized for them. I understand on-page optimization and think my site’s in pretty good shape in that regard (and I will further improve the on page optimization when I redesign it very soon.) It could use more good backlinks, but that’s a problem for the future as far as I’m concerned, and not related to the penalty in any case. I understand AdWords well and my ad is at the top of the search results consistently for all relevant keywords, so I don’t need any help there… Anyone who may have any insight to this… thanks very much in advance!
Local Website Optimization | | ChuckBraman0 -
How to correctly move subdomain to subfolder (google webmaster)?
Hello, This is my first post in here 🙂 I just wondered what is the correct way to move a subdomain to a subfolder? I've moved it and re-done the sitemap so that the main website includes the subfolder, as they are part of one big website now (it was something like a blog on a subdomain). The subdomain now does correct 301 redirects. I submitted the new sitemap to Google and asked Google to re-fetch the whole domain (thus the subfolder should be re-fetched too, as it's part of the main nav). The area I'm in doubt about: I can tell Google that the domain got moved, but it is moved to one that is already approved in the same account, just in a subfolder, so should I do this? Or should I simply somehow erase it in Webmaster Tools? The blog was launched about a month ago and isn't perfectly optimized yet; it wasn't in Google SERPs pretty much at all, except when googling it directly, and there is pretty much zero traffic from Google; almost all of it is either direct or referral, mostly social. Thanks, Pavel
Local Website Optimization | | PavelGro920 -
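On the subdomain-to-subfolder question above, one practical tip: script the URL mapping once and reuse it both to generate the redirect rules and to spot-check them. A rough sketch, with placeholder domain names standing in for the real site:

```python
from urllib.parse import urlsplit, urlunsplit

def subdomain_to_subfolder(url, sub="blog"):
    """Map a URL on the old subdomain (e.g. blog.example.com/post) to
    its new subfolder home (www.example.com/blog/post), keeping the
    path and query string. Returns None for URLs on other hosts."""
    parts = urlsplit(url)
    host = parts.hostname or ""
    if not host.startswith(sub + "."):
        return None                          # not on the old subdomain
    root = host[len(sub) + 1:]               # e.g. "example.com"
    new_path = f"/{sub}{parts.path or '/'}"
    return urlunsplit((parts.scheme, "www." + root, new_path, parts.query, ""))

# Every old URL should 301 to exactly the URL this function returns:
print(subdomain_to_subfolder("https://blog.example.com/post?id=7"))
# https://www.example.com/blog/post?id=7
```

Feeding the old sitemap's URLs through a mapping like this and fetching each result is an easy way to confirm that no thread or post 301s to the wrong place before telling Google about the move.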
Building a new site and want to be found in both Google.co.uk and Google.ie. What is the best practice?
We are building a new site which is a .com site, and the client would like to be found in both Google.co.uk and Google.ie. What is the best practice to go about this? Can you geo-target two countries with one site?
Local Website Optimization | | WSIDW0 -
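On the two-country question above: a single site can target multiple countries, and the usual mechanism is hreflang annotations pointing the country-specific URLs at each other. A hedged sketch that just generates the link tags (the URLs and folder structure are hypothetical examples, not a recommendation for this specific client):

```python
def hreflang_tags(variants):
    """Given a {hreflang_code: url} mapping, emit the <link> tags that
    every page variant should carry (each variant lists the full set,
    including itself)."""
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(variants.items())
    ]

variants = {
    "en-gb": "https://www.example.com/uk/",   # Google.co.uk audience
    "en-ie": "https://www.example.com/ie/",   # Google.ie audience
    "x-default": "https://www.example.com/",  # everyone else
}
for tag in hreflang_tags(variants):
    print(tag)
```

The key rule is reciprocity: the /uk/ and /ie/ pages must each carry the same complete set of tags, otherwise the annotations may be ignored.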
Should digital marketing agencies treat SEO differently when it comes to homepage content?
When I review competitor digital agency sites, they seem to have very little homepage content. But how would this be beneficial in gaining a higher SERP rank?
Local Website Optimization | | randomagency1 -
Ecommerce Site with Unique Location Pages - Issue with unique content and thin content?
Hello All, I have an Ecommerce Site specializing in Hire, and we have individual location pages in each of our categories for each of our depots. All these pages show the NAP of the specific branch. Given the size of our website (approx. 10K pages), it's physically impossible for us to write unique content for each location against each category. So what we are doing is, for example, writing unique content for our top 10 locations in a category, while the remaining 20-odd locations against the same category have the same content, except that it brings in the location name and the individual NAP of that branch; in effect, I think this is thin content. My question is, I am quite sure we are getting some form of algorithmic penalty with regards to the thin/duplicate content. Using the example above, should we 301 redirect the 20-odd locations with the thin content, or should we only 301 redirect 10 of them, so we in effect end up with more of a 50/50 split on a category with regards to unique content on pages versus thin content for the same category? Alternatively, should we 301 all the thin content pages so we only have 10 locations against the category and therefore 100% unique content? I am trying to work out which would help most with regards to local rankings for my location pages. Also, does anyone know if a thin/duplicate content penalty is site-wide or can it just affect specific parts of a website? Any advice greatly appreciated, thanks Pete
Local Website Optimization | | PeteC120 -
Want to move contents to domain2 and use domain1 for other content
Hello, We would like to merge two existing, fairly well positioned web forums. Contents (threads and posts) from www.forocreativo.net would be moved to www.comunidadhosting.com. We are testing some scripts which will handle a 301 redirect for every single thread from forocreativo.net to comunidadhosting.com. But here is the thing: once all current contents are moved out of www.forocreativo.net, we would like to use this domain to point to a specific geographic region and to target other niche/topics. Would you say we can do this and Google will not penalize either of those 2 domains? Any input is more than welcome. Thank you! 🙂
Local Website Optimization | | interalta0