Does Google play fair? Is 'relevant content' and 'usability' enough?
-
It seems there are two opposing views, and as a newbie I find this very confusing.
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, your site may never rank well if you don't play by them.
Which is closer to the truth? No one wants to have a great website that won't rank because Google wasn't sophisticated enough to see that it was playing fair.
Here's an example to illustrate one related concern I have:
I've read that Google doesn't like duplicated content. But here are two cases in which it is more 'relevant' and 'usable' for the user to have duplicate content:
Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions and links to 20 types of restaurants, so the user has a choice. Say the user chooses a region. The resulting page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region.
In other words, there may be a 'mega-menu' at the top of the page that duplicates on every page in the site but is very helpful. Instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega-menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site even though it is arguably the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind.
Thoughts?
-
Hi David,
Sorry for such a delayed response, but I keep wondering about your point on the meganav. Is it known that Google is able to figure out menus and won't count those toward duplicate content? I just want to be sure, since my menus are fairly substantial when dropdowns are included.
-
You are giving me SOME hope for a site I've been working on for about 5 years and am getting ready to launch. Thanks very much.
-
Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. [...] Are those significant factors used by Google?
In my opinion, Google has every ability to measure visitor actions. They own the Chrome browser and could measure the engagement of visitors with a page; they have access to what gets bookmarked in Chrome; they know when a visitor clicks in the SERPs and when that same visitor reappears in the SERPs; they don't have to have links because they can read when people mention your site in a forum; they know if people navigate to your site by typing the name of your site into search... I believe that all of these things are important for rankings, but how important I can't say.
I have lots of really good content that ranked at #150 or deeper in the SERPs when I first published it. Then I built zero links and did zero promotion, and slowly each page rises in the SERPs; over a year later, one such page is now in the top three. I have hundreds of pages that have done that. You gotta have a LOT of patience to do things that way, but you spend zero effort on promotion and 100% effort producing assets for your website. That is what I have done since about 2006. Virtually zero linkbuilding. My visitors are my linkbuilders.
-
EGOL, thanks very much. Being a one-person biz, I am very interested in the idea of ranking by popularity, as my goal is to have the best site out there but I have limited funds to promote it. Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. After all, usage and return usage is what it is all about! Are those significant factors used by Google? If so, maybe there is hope... :)
-
Egol has this summed up perfectly!
-Andy
-
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, your site may never rank well if you don't play by them.
Which is closer to the truth?
They are both a small piece of the truth. To rank on Google, your PAGE must:
1. be relevant to the search term and presented to Google with a proper title, crawlability, and text visibility
2. have substantive content about the search term
3. be validated by other websites by being linked from them or mentioned by them (these are just a few validations)
4. be validated by visitors because they have queried it by name, stayed on it, bookmarked it, mentioned it by name in web-readable content (these are just a few validations)
Any idiot can do #1. A good author can do #2. But, #3 and #4 are really difficult to accomplish by people who are not related to you or paid by you.
In low competition, #1 and #2 can be enough to get you ranked. The higher the competition for a query, the more you need #3 and #4 to rank. For some queries it can be almost impossible for a newcomer to rank on the first page of Google without investing $xxx,xxx or more in website assets and promotion... AND... having a plan in place to present the site in a way that Google will be able to read and interpret so as to maximize the #3 and #4 assets.
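The reasoning above can be made concrete with a toy model. To be clear, this is purely an illustration of why #3 and #4 come to dominate in competitive queries, not anything Google has published; every function name, weight, and cap below is a made-up assumption:

```python
# Toy illustration of the four factors above -- NOT Google's algorithm.
# The point: on-page factors (#1, #2) are binary and saturate quickly,
# while validation signals (#3, #4) keep growing with real-world popularity.
def toy_score(relevant: bool, substantive: bool,
              site_validations: int, visitor_validations: int) -> float:
    if not relevant:  # factor 1 acts as a gate: irrelevant pages don't rank at all
        return 0.0
    score = 1.0
    score += 1.0 if substantive else 0.0             # factor 2: capped on-page boost
    score += min(site_validations, 1000) * 0.01      # factor 3: links and mentions
    score += min(visitor_validations, 1000) * 0.01   # factor 4: visitor engagement
    return score

# Two equally relevant, substantive pages: the one with real validation wins.
newcomer = toy_score(True, True, site_validations=2, visitor_validations=5)
veteran = toy_score(True, True, site_validations=400, visitor_validations=300)
print(newcomer, veteran)  # the veteran page scores far higher
```

Under this (invented) weighting, a newcomer who nails #1 and #2 starts with the same base score as everyone else; only the slow accumulation of #3 and #4 separates pages in a crowded query.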
-
A meganav is not considered duplicate content. Duplicate content means things like product description pages that are identical, or the same article appearing in multiple places on your site.
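One way to see why a sitewide menu need not trip duplicate-content detection: a crawler can strip repeated boilerplate such as `<nav>` blocks before comparing the remaining body text of two pages. Here is a rough Python sketch of that idea; the stripping and comparison methods are my own simplified assumptions, not a description of what Google actually does:

```python
import re
from difflib import SequenceMatcher

def main_text(html: str) -> str:
    """Crudely drop <nav>...</nav> blocks and all tags, keeping body text.
    (Real search engines use far more sophisticated boilerplate detection.)"""
    html = re.sub(r"<nav\b.*?</nav>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip().lower()

def similarity(page_a: str, page_b: str) -> float:
    """Ratio of shared main-text content between two pages (0.0 to 1.0)."""
    return SequenceMatcher(None, main_text(page_a), main_text(page_b)).ratio()

# A 30-region mega-menu shared by every page, like the restaurant example.
menu = "<nav>" + " ".join(f"<a href='/r{i}'>Region {i}</a>" for i in range(30)) + "</nav>"
page_a = menu + "<p>Italian restaurants in the North End, reviewed in depth.</p>"
page_b = menu + "<p>Seafood spots along the harbor, with hours and menus.</p>"

print(round(similarity(page_a, page_b), 2))  # low: the shared mega-menu is ignored
```

Because the menu is discarded before comparison, two pages that share a large mega-menu but have distinct body copy still score as distinct; pages whose body copy is itself copied would score near 1.0.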
To the main part of your question: Google does not want it to be easy for people in the SEO world. They give guidelines, but following them guarantees nothing. What Google considers an OK tactic one year becomes an unacceptable tactic the next (see guest blogging). There are many ways to succeed in ranking. Some follow Google's rules and wait for rankings to come; others use tons of spammy tactics and rank instantly (though they always risk losing it overnight if Google catches on).
The idea that an easy-to-use site and relevant content will make Google rank you fairly is a joke. And though only one has said it publicly, there are many top minds in the SEO world who will tell you that in private.