Does Google play fair? Are 'relevant content' and 'usability' enough?
-
It seems there are two opposing views, and as a newbie I find this very confusing.
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well.
Which is closer to the truth? No one wants to have a great website that won't rank simply because Google isn't sophisticated enough to see that the site is playing fair.
Here's an example to illustrate one related concern I have:
I've read that Google doesn't like duplicated content. But here are two cases in which it is more 'relevant' and 'usable' to the user to have duplicate content:
Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions. It may also have links for 20 types of restaurants. The user has a choice. Say the user chooses a region. The resulting new page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region. In other words, there may be a 'mega-menu' at the top of the page that is duplicated on every page in the site but is very helpful. Instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site even though it is arguably the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind.
Thoughts?
-
Hi David,
Sorry for such a delayed response, but I keep wondering about your point on the meganav. Is it known that Google is able to figure out menus and won't count those toward duplicate content? I'd just like to be sure, since my menus are fairly substantial when dropdowns are included.
-
You are giving me SOME hope for a site I've been working on for about 5 years and am getting ready to launch. Thanks very much.
-
Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. [...] Are those significant factors used by Google?
In my opinion, Google has every ability to measure visitor actions. They own the Chrome browser and could measure visitors' engagement with a page; they have access to what gets bookmarked in Chrome; they know when a visitor clicks in the SERPs and when that same visitor reappears in the SERPs; they don't have to rely on links because they can read when people mention your site in a forum; they know if people navigate to your site by typing its name into search... I believe that all of these things are important for rankings, but how important I can't say.
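Just to illustrate the kind of 'click and come back to the results' signal I'm talking about, here is a toy sketch. The log format and field names are hypothetical, and this is not how Google actually computes anything; it only shows that the dwell time between a SERP click and a return to the results page is trivial to derive from click data.

```python
from datetime import datetime

# Toy illustration of a "click and come back to the results" signal.
# The log format and field names are hypothetical; this is only meant to show
# that such a signal is easy to derive from click data, not how Google does it.
click_log = [
    {"visitor": "a", "event": "serp_click",  "url": "example.com/page", "time": "2016-05-01 10:00:00"},
    {"visitor": "a", "event": "serp_return", "url": None,               "time": "2016-05-01 10:00:25"},
    {"visitor": "b", "event": "serp_click",  "url": "example.com/page", "time": "2016-05-01 11:00:00"},
    {"visitor": "b", "event": "serp_return", "url": None,               "time": "2016-05-01 11:08:40"},
]

def dwell_times(log):
    """Seconds between a visitor clicking a result and reappearing in the SERPs."""
    pending = {}   # visitor -> (clicked url, click time)
    results = []
    for entry in sorted(log, key=lambda e: e["time"]):
        t = datetime.strptime(entry["time"], "%Y-%m-%d %H:%M:%S")
        if entry["event"] == "serp_click":
            pending[entry["visitor"]] = (entry["url"], t)
        elif entry["event"] == "serp_return" and entry["visitor"] in pending:
            url, clicked = pending.pop(entry["visitor"])
            results.append((url, (t - clicked).total_seconds()))
    return results

print(dwell_times(click_log))   # visitor "a" bounced back after 25s, visitor "b" stayed 520s
```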
I have lots of really good content that ranked at #150 or deeper in the SERPs when I first published it. Then I built zero links and did zero promotion, and slowly that page rose in the SERPs - it is now in the top three, over a year later. I have hundreds of pages that have done that. You gotta have a LOT of patience to do things that way, but you spend zero effort on promotion and 100% of your effort producing assets for your website. That is what I have done since about 2006. Virtually zero linkbuilding. My visitors are my linkbuilders.
-
EGOL, thanks very much. Being a one-person biz, I am very interested in the idea of ranking by popularity, as my goal is to have the best site out there but I have limited funds to promote it. Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. After all, usage and return usage is what it is all about! Are those significant factors used by Google? If so, maybe there is hope... :)
-
Egol has this summed up perfectly!
-Andy
-
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well.
Which is closer to the truth?
They are both a small piece of the truth. To rank on Google, your PAGE must:
1. be relevant to the search term and be presented to Google with a proper title, crawlability, and text visibility (a rough sketch of a basic check for this follows below)
2. have substantive content about the search term
3. be validated by other websites by being linked from them or mentioned by them (these are just a few validations)
4. be validated by visitors because they have queried it by name, stayed on it, bookmarked it, or mentioned it by name in web-readable content (these are just a few validations)
Any idiot can do #1. A good author can do #2. But #3 and #4 are really difficult to get from people who are not related to you or paid by you.
In low competition, #1 and #2 can be enough to get you ranked. The higher the competition for a query, the more you need #3 and #4 to rank. For some queries it can be almost impossible for a newcomer to rank on the first page of Google without investing $xxx,xxx or more in website assets and promotion... AND... having a plan in place to present the site in a way that Google will be able to read and interpret so as to maximize the #3 and #4 assets.
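As a rough illustration of what I mean by #1, here is a minimal sketch of the kind of basic check anyone can run against a page, assuming the requests and BeautifulSoup libraries are available. It only covers the obvious basics (a title, a noindex directive, visible text) and is not meant as anything more.

```python
import requests
from bs4 import BeautifulSoup

def basic_onpage_check(url):
    """Minimal sanity check for #1: title, indexability, and visible text."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(robots_meta and "noindex" in robots_meta.get("content", "").lower())

    # Strip scripts/styles so we only count text a crawler can actually read.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    visible_words = len(soup.get_text(separator=" ").split())

    return {
        "status": resp.status_code,
        "has_title": bool(title),
        "title": title,
        "noindex": noindex,
        "visible_words": visible_words,
    }

# Example with a hypothetical URL:
# print(basic_onpage_check("https://www.example.com/"))
```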
-
A meganav is not considered duplicate content. Duplicate content means things like product description pages that are identical, the same article appearing in multiple places on your site, and so on.
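Not that this is how Google actually does it, but just to show that repeated navigation and unique main content are mechanically easy to tell apart, here is a rough sketch using BeautifulSoup (an assumption on my part, with made-up page markup): strip the nav/header/footer blocks and compare what remains across two pages.

```python
from bs4 import BeautifulSoup

# Rough sketch only - not Google's actual method. The point is that repeated
# chrome (nav/header/footer) and unique main content are easy to separate.

def main_content_text(html):
    """Return page text with navigation/header/footer boilerplate stripped out."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["nav", "header", "footer", "script", "style"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())

page_a = "<html><body><nav>Region A | Region B | Italian | Thai</nav><main>Best pizza spots in Region A ...</main></body></html>"
page_b = "<html><body><nav>Region A | Region B | Italian | Thai</nav><main>Thai restaurants across the whole city ...</main></body></html>"

# The mega-menus are identical, but what's left after stripping them is not:
print(main_content_text(page_a))
print(main_content_text(page_b))
```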
To the main parts of your question - Google does not want it to be easy for people in the SEO world. They give guidelines, but following them guarantees nothing. What Google considers an OK tactic one year becomes an unacceptable tactic the next (see guest blogging). There are many ways to succeed in ranking. Some follow Google's rules and wait for rankings to come; others use tons of spammy tactics and rank instantly (though they always risk losing it overnight if Google catches on).
The idea that an easy-to-use site and relevant content will make Google rank you fairly is a joke. And though only one person has said it publicly, there are many top minds in the SEO world who will tell you the same in private.