Logan, when I grow up I want to be like you... haha, just a joke, but really, I see you are a very active member of the community and your answers are always clear and concise.
Good for you!
In 2016, Google sent out a batch of new manual penalty notices that mostly hit bloggers. Bloggers were penalized for accepting free products in exchange for a review with a link to the merchant's website, or for accepting paid reviews with such links.
It has been well known for years that Google doesn't want paid reviews, or reviews compensated with a free product or service, to pass PageRank.
Online stores that bought lots of links passing PageRank would get hit by a manual penalty, or even worse, by the Penguin algorithm.
Google then decided to focus on those who enable merchants to get such links: the bloggers. So it sent manual penalties to bloggers who didn't follow this guideline.
The impact can be both positive and negative, depending on how well you followed Google's guidelines in the past. Until now, if you only obtained a few links this way (giving away a free product or paying a blogger for a review), Google would have been unable to figure out that you were doing something wrong on a massive scale, and you wouldn't get penalized in any way.
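For bloggers, the usual fix Google asks for is adding rel="nofollow" to compensated links so they stop passing PageRank. If you want to audit your own posts, here is a minimal sketch (not an official tool) using the requests and beautifulsoup4 packages; the URL is a placeholder:

```python
# Minimal sketch: list outbound links on a page that do NOT carry rel="nofollow".
# Assumes the `requests` and `beautifulsoup4` packages; the URL is hypothetical.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example-blog.com/some-review-post/"  # hypothetical review post

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
own_host = urlparse(PAGE_URL).netloc

for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc
    rel = a.get("rel") or []          # bs4 returns rel as a list of tokens
    if host and host != own_host and "nofollow" not in rel:
        # This outbound link passes PageRank; add rel="nofollow" if it was paid for.
        print(a["href"])
```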
As far as I know, Google made a significant change in 2016 to its search results pages by extending the length of titles and descriptions.
Title tags have been increased to 70–71 characters, up from the previous 50–60. That's at least enough to fit in another word or two.
Meta descriptions have been increased to roughly 100 characters per line, and extended from two to three lines. That's a significant increase, and presents far more of an opportunity to tell searchers what the page is about.
Google is still truncating descriptions to two lines for many search results, so you may still see them coming in at around 160 characters at times. When a three-line snippet is displayed, they come in at around 278 characters in total.
It's important to note that this may be a test which Google could reverse at any time.
https://www.reddit.com/r/bigseo/comments/4ixf86/google_has_increased_the_width_of_the_search/
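If you want to see whether your own pages fit inside these expanded limits, it's easy to script a quick check. A minimal sketch, assuming the requests and beautifulsoup4 packages; the URL is a placeholder and the 70/278 cutoffs are just the figures discussed above, which may change if Google reverses the test:

```python
# Minimal sketch: check a page's title and meta description length against the
# expanded limits discussed above (~70 chars for titles, ~278 for descriptions).
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/"   # hypothetical page
TITLE_LIMIT, DESC_LIMIT = 70, 278       # figures from the post above

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
desc_tag = soup.find("meta", attrs={"name": "description"})
desc = (desc_tag.get("content") or "").strip() if desc_tag else ""

print(f"Title ({len(title)}/{TITLE_LIMIT}): {title}")
print(f"Description ({len(desc)}/{DESC_LIMIT}): {desc}")
```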
It's useful just for your users.
For local SEO, you can use it so that if one of your clients is looking for your phone number and types "YOURBUSINESS PBX" or whatever, Google can give them the right answer.
But as everybody said, that will not help you rank better.
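The answer above doesn't name a specific method, but one common way to hand search engines a business phone number is schema.org LocalBusiness markup. A minimal sketch that generates the JSON-LD in Python; all of the business details here are made up:

```python
# Minimal sketch: generate schema.org LocalBusiness JSON-LD exposing a phone
# number to search engines. All business details below are hypothetical.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "YOURBUSINESS",                  # placeholder name from the answer above
    "telephone": "+1-555-0100",              # hypothetical phone number
    "url": "https://www.yourbusiness.com/",  # hypothetical site
}

# Paste the printed <script> tag into the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(local_business, indent=2))
print("</script>")
```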
While not appealing, you should rewrite all the content to be 100% unique. If it's a privacy policy, ToS, etc., you can noindex those pages to reduce duplication. Otherwise, your options are limited. I realize that the products/services will be similar in nature, but writing them up in a different way will reduce the significantly similar content.
Alternatively, you can use a cross-domain canonical tag; this tells Google that this content is intentionally duplicated and which URL is the original.
Here are a few articles about that:
https://moz.com/learn/seo/duplicate-content
https://blog.kissmetrics.com/myths-about-duplicate-content/
https://yoast.com/rel-canonical/
http://webmasters.stackexchange.com/questions/56326/canonical-urls-with-multiple-domains
Next, focus on building local links to the individual city pages to further differentiate the cities and the intent. Also, use schema.org 'LocalBusiness' markup on each version of the URLs. And again, I will say this is not an ideal situation; the best-case scenario would be to add that content on ONE domain, just with different location pages within a subdirectory format.
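To sanity-check a cross-domain canonical setup, you can crawl the duplicated pages and confirm each one points at the URL you want indexed. A minimal sketch, assuming the requests and beautifulsoup4 packages; both domains and the page paths are placeholders:

```python
# Minimal sketch: verify that duplicated city pages declare a cross-domain
# canonical pointing at the preferred domain. All URLs are hypothetical.
import requests
from bs4 import BeautifulSoup

# hypothetical duplicate page -> the URL it should declare as canonical
expected = {
    "https://city-a.example.com/services/": "https://www.example.com/services/",
    "https://city-b.example.com/services/": "https://www.example.com/services/",
}

for page, want in expected.items():
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    got = tag["href"] if tag else None
    print(f"{page}: canonical={got} {'OK' if got == want else 'MISMATCH'}")
```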
IF THE ANSWER WAS HELPFUL ---->>>>> DON'T FORGET TO CLICK THE THUMBS UP <<<<<
This is a great list to keep in mind.
Based on my experience, keyword density does not have a big impact on the ranking performance of a web page.
Don't get me wrong, you still need your keyword in the title tag, H1, etc. (if you want to measure density anyway, see the sketch after the list below). But there are other factors with more impact and more weight than keyword density:
1. Domain Age
2. Keyword Appears in Top Level Domain
3. Keyword As First Word in Domain
4. Domain registration length
5. Keyword in Subdomain Name
6. Domain History
7. Exact Match Domain
8. Public vs. Private WhoIs
9. Penalized WhoIs Owner
10. Country TLD extension
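For what it's worth, keyword density is simple enough to measure yourself. Here is a minimal sketch of one common way to calculate it (words covered by occurrences of the phrase, divided by total word count); the sample text is made up:

```python
# Minimal sketch of a common keyword density calculation:
# words covered by the keyword phrase divided by the total word count.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    k = keyword.lower().split()
    # count occurrences of the phrase as a contiguous word sequence
    hits = sum(words[i:i + len(k)] == k for i in range(len(words) - len(k) + 1))
    return 100.0 * hits * len(k) / len(words) if words else 0.0

sample = "Blue widgets are great. Buy blue widgets here. Widgets for everyone."
print(f"{keyword_density(sample, 'blue widgets'):.1f}%")  # -> 36.4%
```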
For me, a redirect is the right way.
Why? Because otherwise you will have a duplicate content issue. Remember, yoursite.com and www.yoursite.com can be indexed as different pages.
I had the same issue in the past, and I resolved the problem with:
1- Redirects
2- Setting the right URL in Search Console
(Search Console > Site Settings > Preferred domain)
3- Submitting a sitemap for every language
(Search Console > Sitemaps)
4- And finally, configuring language targeting in Search Console
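Once the redirects are in place, you can confirm that the non-preferred hostname permanently redirects to the preferred one. A minimal sketch with the requests package; the domain is a placeholder, and I'm assuming www is the preferred version:

```python
# Minimal sketch: confirm the non-preferred hostname 301-redirects to the
# preferred one, so only one version gets indexed. The domain is hypothetical.
import requests

resp = requests.get("http://yoursite.com/", allow_redirects=False, timeout=10)

print(resp.status_code)              # expect 301 (permanent redirect)
print(resp.headers.get("Location"))  # expect https://www.yoursite.com/
```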
For free, there is no better option than Keyword Planner.
If you work on PPC campaigns, for me the best option out there is WordStream.
Trust me, it will save you money, time, and headaches. But it will cost you $250/month.