Does Google only look at LSI per page, or at the context of the whole site?
-
From what I have read, I should optimise each page for a keyword/phrase. However, I recently read that Google may also look at the context of the site to see if there are other similar words.
For example, I have different pages optimised for: funeral planning, funeral plans, funeral plan costs, compare funeral plans, why buy a funeral plan, paying for a funeral, and prepaid funeral plans.
Is this the best strategy when the words/phrases are so close, or should I go for longer pages with the variations combined on one page, or at least fewer pages?
Thanks
Ash
-
Thanks Everett, that's really helpful.
I must admit I do wonder whether I have split out too many pages that could be combined. When I launched this version of the site in June, I was trying to focus on a core keyword per page, with plurals etc., as I didn't think Google would, in the example you used, pick up funeral planning and funeral plans as connected. God, this job is hard sometimes!
Ash
-
Hello AshShep1,
I don't think anyone is going to be able to tell you for sure whether Google looks at LSI specifically on a per-page or site-wide basis. However, it sounds like there is definitely some overlap with your landing pages. For example, "funeral planning" and "funeral plans" could probably be one page, which then links off to the others. The fact that Google bolds "planning" when I search for "funeral plans" tells me they view the two keywords as being so semantically related as to be almost synonymous.
The "funeral plan costs" and "paying for a funeral" page sound like they could be combined, and the "compare funeral plans" and "prepaid funeral plans" could probably be combined. I would look into the keywords to see which, if any, have the overwhelming majority of search volume and focus on that one while using the others as secondary or tertiary keywords within the context of the page.
But really, it comes down to how you think the information is best presented to your visitors. If the "prepaid funeral plans" or the "compare funeral plans" content is best presented on its own, then that is what you should do.
You may be seeking a more concrete answer from someone, so I will leave this question open as a discussion for a while. Please let me know if I can be of any assistance.
-
Everything should be done within the context of the site.
Related Questions
-
Subdirectory site / 301 Redirects / Google Search Console
Hi there, I'm a web developer working on an existing WordPress site (Site #1) that has 900 blog posts accessible under this URL structure: www.site-1.com/title-of-the-post. We've built a new website for their content (Site #2) and programmatically moved all blog posts to it, under this URL structure: www.site-1.com/site-2/title-of-the-post. Site #1 will remain a normal company site without a blog, and Site #2 will act as an online content membership platform. The original 900 posts have great link juice that we, of course, would like to maintain, and we've already set up 301 redirects that take care of this (i.e. each original post gets redirected to the same URL slug with '/site-2/' added).

My questions: Do you have a recommendation on how best to handle this second website in Google Search Console? Do we submit it as an additional property in GSC (it shares the same top-level domain as the original)? Currently, the sitemap.xml submitted to Google Search Console has all 900 blog posts at the old URLs. Is there any benefit or drawback to submitting another sitemap.xml from the new website, which has all the same blog posts at the new URLs?

Your guidance is greatly appreciated. Thank you.
Intermediate & Advanced SEO | | HimalayanInstitute0 -
Impact of Removing 60,000 Pages from Sites
We currently have a database of content across about 100 sites. All of this content is exactly the same on all of them, and it is also found all over the internet in other places, so it's not unique at all and it brings in almost no organic traffic. I want to remove this bloat from our sites.

The problem is that this database accounts for almost 60,000 pages on each site, all of which are currently indexed. I'm a little worried that flat-out dumping all of this data at once is going to make Google wonder what in the world we are doing, and that we'll see some issues from it (at least in the short run). My thought now is to remove this content in stages so it doesn't all get dropped at once. But would deindexing all of this content first be better? That way Google could still crawl it and understand that it is not relevant user content, minimizing the impact when we terminate it completely. Any other ideas for minimizing SEO issues?
Intermediate & Advanced SEO | | MJTrevens1 -
Site structure for location + services pages
We are in the process of restructuring our site and are trying to figure out Google's preference for location and services pages. Let's say we are an auto repair company with lots of locations; each one offers some unique services, while other services are offered by all or most locations. Should we have one global page for each service, linking to the location page of each shop that offers that service? Or should we build a unique page about each service for every location, as a subfolder of each location (essentially creating a LOT of sub-pages, since each location has 15-20 services)? Which will rank better?
Intermediate & Advanced SEO | | MJTrevens1 -
How do I know which pages of my site are not indexed by Google?
Hi, in my Google Webmaster Tools, under Crawl -> Sitemaps, it shows 1117 pages submitted but only 619 indexed. Is there any way I can find out which pages are not indexed, and why? It has been like this for a while. I also have a manual action (partial) message, "Unnatural links to your site--impacts links", which under "Affects" says "Some incoming links". Is that the reason Google does not index some of my pages? Thank you, Sina
Intermediate & Advanced SEO | | SinaKashani0 -
Hidden keywords - how many per page?
Hi All, we have a booking website we want to optimize for keywords we cannot really show, because some of our partners wouldn't want it. We figured we can put said keywords, or close synonyms, on the page in various places that are not too dangerous (e.g. image names, image alt tags, URLs, etc.). The question is how many keywords we can target, though. We know keyword stuffing is detrimental, and we will not start creating long URLs stuffed with keywords, and the same goes for H1 tags and page titles. So how many is acceptable/not counterproductive? Thanks!
Intermediate & Advanced SEO | | Philoups0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag, because:

- Google's guidelines say: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769)
- They are a bad user experience.
- The search pages are (probably) stealing rankings from our real landing pages.
- We received the Webmaster Notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results.

I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer!

Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | | HrThomsen0 -
When you buy a domain or website, does that trigger a fresh look by Google?
I recently purchased a domain and the corresponding website. As far as I could tell, in the 12 months prior to my purchase the site was well optimized in Google and had over 40 search terms on page 1 in a really competitive space (lending-related). When I made the purchase, the domain was transferred from the seller's GoDaddy account into my GoDaddy account, and I placed privacy protection on the domain. We did not move the hosting of the site; I took over his hosting account. And I did not make any significant changes to the website.

About a week later, the site was totally removed from Google's index and I received notice in Google Webmaster Tools that the site may violate Google's quality guidelines. I filed a reconsideration request telling Google that I was the new owner and that any violations were caused by the old owner. One week later, I got a note back from Google saying they had received my reconsideration request and that if they think the issues are cured, they will reindex the site. That was over a week ago, so seemingly they are not putting it back.

My question is this: does Google somehow automatically know when domains change hands, and does this cause them to manually review sites? The site in question was aggressively optimized, but I don't understand what would have caused Google to take action on it when they did. In other words, if they were going to take action, why wouldn't they have done it in the prior 12 months? Or does the domain transfer put the site into some queue that makes them review it? BTW, the site in question has a SEOmoz domain authority of 85 and is still showing up as PR 5. Thanks very much for your time and consideration.
Intermediate & Advanced SEO | | whodatyat0 -
What on-page/site optimization techniques can I utilize to improve this site (http://www.paradisus.com/)?
I used a search engine spider simulator to analyze the homepage, and I think my client is using black hat tactics such as cloaking. Am I right? Also, any recommendations on improving the top navigation under the Resorts pull-down? Each of the six resorts listed is part of the Paradisus brand, but each resort has its own subdomain.
Intermediate & Advanced SEO | | Melia0