If I were to change the geographic keyword such as "foreclosures in Dallas" on 20 related blogs to "foreclosures in Los Angeles" what would happen?
-
In other words, I'm wondering: if someone built up an internet presence for their company through multiple websites over the years and then decided to move to another part of the United States, would it work to change all the keywords to the new location?
Would that work toward getting them ranked in the new area or would you have to create entirely new websites?
Thanks guys.
-
No. From my understanding, Google continually re-indexes your pages. Rand updates his Search Engine Ranking Factors and puts the new information at the same URL, and the page is simply re-crawled.
-
That's what I thought. I expected the pages would then start to climb in the rankings for the new geographic area, but someone told me that it wouldn't work because Google would have already indexed all those pages.
-
I think that would work from a content-optimization standpoint. The issues would be external: anchor text, local links, etc.
Related Questions
-
Changing Links to Spans with Robots.txt Blocked Redirects using Linkify/jQuery
Hi, I was recently penalized, most likely because Google started following JavaScript links to bad neighborhoods that were not nofollowed. The first thing I did was remove the Linkify plugin from my site so that all those links would disappear, but now I think I have a solution that works with Linkify without creating crawlable links. I did the following:
1. I blocked access to the Linkify scripts using robots.txt so that Google won't execute the scripts that create the links. This has worked for me in the past with banner ads linking to other sites of mine; at least it appears to work, because those sites did not get links from pages running those banners in Search Console.
2. I created a /redirect/ directory that redirects all offsite URLs, and I put a robots.txt block on this directory.
3. I configured the Linkify plugin to parse URLs into span elements instead of a elements and to add nofollow attributes. They still have an href attribute, but the URLs in the href now point to the redirect directory, and the span's onclick event redirects the user.
I have implemented this solution on another site of mine, and I am hoping this will make it impossible for Google to categorize my pages as linking to any neighborhoods, good or bad. Most of the content is UGC, so this should discourage link spam while giving users clickable URLs and still letting people post complaints about people who have profiles on adult websites.
Here is a page where the solution has been implemented: https://cyberbullyingreport.com/bully/predators-watch-owner-scott-breitenstein-of-dayton-ohio-5463.aspx. The Linkify plugin can be found at https://soapbox.github.io/linkifyjs/, and the custom jQuery is as follows:
    jQuery(document).ready(function ($) {
        $('p').linkify({
            tagName: 'span',
            attributes: { rel: 'nofollow' },
            formatHref: function (href) {
                return 'https://cyberbullyingreport.com/redirect/?url=' + href;
            },
            events: {
                click: function (e) {
                    window.location.href = $(this).attr('href');
                }
            }
        });
    });
White Hat / Black Hat SEO | STDCarriers
-
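For reference, the two robots.txt blocks described in the question might look something like the sketch below. The /redirect/ path comes from the question itself; the script path is purely an assumption, since the question doesn't say where the Linkify files live.

```text
# Hypothetical robots.txt for the setup described above.

User-agent: *
# Block crawling of the redirect directory so Googlebot never
# fetches the offsite destination URLs through it.
Disallow: /redirect/
# Block the Linkify script files so the renderer cannot execute
# them and discover the generated links.
# (/scripts/linkify/ is an assumed path.)
Disallow: /scripts/linkify/
```

Note that this is a sketch of the poster's approach, not a guarantee that Google will treat the pages as link-free.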
How Can I Safely Establish Homepage Relevancy With Internal Keyword Links?
My website has roughly 1000-2000 pages. However, our homepage is lacking relevancy as to what it is about. One way I'd like to tackle this problem is by updating many of our pages with internal linking. I often hear "use exact-keyword links with caution," but have assumed this mainly referred to external backlinks. Would it be a disaster to set up our single most relevant keyword on about 300 pages and point it to our homepage? There are breadcrumbs on our site, but the home link uses an image (it's a picture of a house, if you're curious). Am I better off just changing that to our most relevant keyword? I could use any advice on internal links for establishing better homepage relevancy. Thank you!
White Hat / Black Hat SEO | osaka73
-
Why is this Page Ranking for such a competitive keyword?
Hello Moz Community! I have a question that I am hoping someone can help me understand. I am looking at this URL: http://goo.gl/BkSish ...and it is ranking for this keyword: POS Systems. Now, this seems to be a pretty new URL, with few links being generated to it, as seen here: Open Site Explorer: http://moz.com/researchtools/ose/comparisons?site=http%3A%2F%2Fwww.shopkeep.com%2Fpos-system Majestic SEO: https://www.majesticseo.com/reports/site-explorer/link-profile?folder=&q=http%3A%2F%2Fwww.shopkeep.com%2Fpos-system&oq=http%3A%2F%2Fwww.shopkeep.com%2Fpos-system&IndexDataSource=F&wildcard=1 QUESTION: Can someone help us understand how or why this page is ranking so well, so quickly, for such a competitive keyword? Thank you!
White Hat / Black Hat SEO | mstpeter
-
Not ranking for keywords. wehhh
I am not ranking for any of my keywords, despite getting a number of backlinks to my pages and mentioning the keywords in both content and meta tags?!
White Hat / Black Hat SEO | Johnny_AppleSeed
-
Can anyone recommend a Google-friendly way of utilising a large number of individual yet similar domains related to one main site?
I have a client who has one main service website, on which they have local landing pages for some of the areas in which they operate. They have since purchased 20 or so domains (and are in the process of acquiring more), all of which are localised versions of the service they offer. Rather than redirecting these to the main site, they wish to operate them all separately, with the goal of ranking for the specific localised terms related to each of the domains. One option would be to create microsites (hosted on individual C-class IPs, etc.) with unique, location-specific content on each of the domains. Another suggestion would be to park the domains and point them at the individual local landing pages on the main site, so the domains would just be a window through which to view pages that have already been created. The client is aware of the recent EMD update, which could affect the above. Of course, we wish to go with the most Google-friendly option, so I was wondering if anyone could offer some advice on how best to handle this? Many thanks in advance!
White Hat / Black Hat SEO | AndrewAkesson
-
Stuffing keywords into URLs
The following site ranks #1 in Google for almost every key phrase in their URL path for almost every page on their site. Example: themarketinganalysts.com/en/pages/medical-translation-interpretation-pharmaceutical-equipment-specifications-medical-literature-hippa/ The last folder in this URL uses 9 keywords and I've seen as many as 18 on the same site. Curious: every page is a "default.html" under one of these kinds of folders (so much architecture?). Question: How much does stuffing keywords into URL paths affect ranking? If it has an effect, will Google eventually ferret it out and penalize it?
White Hat / Black Hat SEO | PaulKMia
-
Spammy Links, SERPs, and Low Competition Keywords
While I've seen a lot of news about Google cleaning up content farms, link farms, and similar spam, I've also seen some companies start ranking very well for niche terms using these same practices. Question: Does Google completely discount links from content farms and similar sites, or simply give them low value? Observation: I've seen a company start ranking well (top 3) for several terms when they used to be on page 2. When I looked at their links, they are from article farms, directories, do-follow blogs, and similar low-value sources. Relative to others, they have about 10x the volume of links with the precise anchor text they are targeting. I wonder if, in the absence of other information, these spammy links still count for something; given the low competition for the terms, this is enough to boost their rank. Just some thoughts, as we are working on long-tail strategies for some key terms.
White Hat / Black Hat SEO | jeff-rackaid.com
-
How much pain can I expect if I change the URL structure of the site again?
About 3 months ago I implemented a massive URL structure change by 'upgrading' some of the features of our CMS. Prior to this, URLs for categories and products looked something like this: http://www.thefurnituremarket.co.uk/proddetail.asp?prod=OX09 I made a few changes but didn't implement it fully, as I felt it would be better to do it in stages as the site was getting indexed more thoroughly. HOWEVER... We have just hit the first page for some key SERPs, and I am wary of rocking the boat again by changing the URL structures and all the sitemaps. How much pain do you think we could feel if I went ahead and optimised the URLs fully? And what would you do? 🙂
White Hat / Black Hat SEO | robertrRSwalters
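The usual way to limit the pain of a structure change like the one above is a one-to-one 301 redirect from every old URL to its new equivalent, so existing rankings and links carry over. A minimal sketch using the IIS URL Rewrite module (the site appears to run classic ASP); the new /products/{code} path is an assumed example, not the poster's actual new structure:

```xml
<!-- Hypothetical web.config fragment: permanently redirect the old
     proddetail.asp?prod=XXXX URLs to an assumed new structure. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="OldProductUrls" stopProcessing="true">
          <match url="^proddetail\.asp$" />
          <conditions>
            <!-- Capture the product code from the query string -->
            <add input="{QUERY_STRING}" pattern="^prod=([A-Za-z0-9]+)$" />
          </conditions>
          <!-- {C:1} is the captured product code; drop the old query string -->
          <action type="Redirect" url="/products/{C:1}"
                  redirectType="Permanent" appendQueryString="false" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

With a mapping like this in place, updating the sitemaps to list only the new URLs completes the switchover.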