Domain SEO
-
Hi, may I know, for the keyword "engagement rings", which domain is best from an SEO perspective?
3) www.engagement_rings.com
Thank you
-
2, because it's easiest to remember. In 2019 exact-match domains have less impact on SEO; it's more about 10x content and demonstrating a solid value proposition (watch up to the point where issue #1 is fully outlined). SEO is a pretty vast field in modern times. Coding tweaks and URL slugs are still somewhat important, but they provide only slight bonuses on top of your core value proposition (the value your site adds to the internet). I don't think engagementrings.com is too bad, but without a solid idea and value prop behind it, the URL alone won't magically make it rank.
-
As others mentioned here, using an exact-match keyword in the domain is no longer useful from an SEO perspective. Based on what I see, you are trying to build a niche/vertical website, so your site structure, clean internal links, and schema markup will be more relevant to your rankings.
But if I have to choose, the second one is the best option for me.
-
Hi, backing Joe up here: it can sometimes be more complex, as you can negatively spam yourself with too much branding.
-
Having a keyword in a domain name is no longer relevant, and hasn't been for a good while. But, in answer to your question, I personally would go for...
Related Questions
-
How do you deal with Scam-Type SEO businesses?
One of our potential clients is a limousine rental service. His current "marketer" is going about his business in a seemingly sketchy way. I'm pretty new to having to compare myself to other SEO/marketing competition. This guy has hundreds of websites that are nearly identical. Quite a few have duplicate content, but all of them generally look the same. He leases these websites as lead generators. Think of it like this: he probably has 15-20 websites all geared for different parts of the DFW area — Denton Limo Service, Plano Limo Service, Dallas Limo Service, etc. He also has a bunch of websites for other industries. Every "business" has its own phone number via a Google number that he forwards to the actual business line. Every "business" has a Google My Business listing set up as well, with no address listed. When someone fills out the contact form on one of these sites, it is forwarded to the business that is leasing it. He also creates backlinks on his websites pointing to all of his other websites. I imagine that eventually he will be caught, right? I mean, this has to be black-hat SEO. Have any of you encountered an SEO/marketer like this? If so, what do you do about it?
White Hat / Black Hat SEO | | roger2050 -
SEO Template Recommendations - example provided but would welcome any advice
Hi there, I'm trying to improve the templates used on our website for SEO pages aimed at popular search terms. An example of our current page template is as follows: http://www.eteach.com/teaching-jobs Our designers have come up with the following new template: http://www.eteach.com/justindaviesnovemeber I know that changing successful pages can be risky. One concern is putting links behind jQuery, where the 'More on Surrey' link is. Does anyone have any strong suggestions or observations about our new template? Especially through the eyes of Google! Thanks in advance, Justin
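On the jQuery concern: when a link's destination exists only inside a script handler rather than in an `href` attribute, crawlers may never discover the target page, and no link equity flows to it. A hypothetical sketch of the difference (the URL, class name, and markup are illustrative, not taken from the actual templates):

```html
<!-- A plain anchor: crawlable, and passes link equity. -->
<a href="/teaching-jobs/surrey">More on Surrey</a>

<!-- A JavaScript-only "link": with no href, the URL lives only in a
     jQuery click handler, so crawlers may never find the target. -->
<span class="more-link" data-target="/teaching-jobs/surrey">More on Surrey</span>
<script>
  $('.more-link').on('click', function () {
    window.location = $(this).data('target');
  });
</script>
```

If the new template must use scripted behaviour, keeping a real `href` on the element and progressively enhancing it with JavaScript preserves crawlability.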
White Hat / Black Hat SEO | | Eteach_Marketing0 -
Competitor with Bad-Practice SEO Still Ranking Well, But Why?
Moz friends, a very close competitor has always been challenging us for similar competitive keywords. We seem to have the advantage for a lot of long-tail keywords, but on one of the higher-traffic relevant keywords they seem to do well. I really struggle to understand why, particularly given the backlinks they use. Just my thoughts and notes on the two:

Our page:
- Better-written text content (maybe written slightly too much for an experienced target audience, but we are working on simplifying things)
- Good, clear site URL structure and navigation for usability
- Fresh content updates
- Mobile optimized
- Reasonable page speeds
- Good on-page optimization
- Good backlinks from industry influencers

Competitor page negatives:
- Site structure and URLs are inconsistent and messy
- Lower-quality content site-wide
- They use "tried and tested" on-page optimization methods like keyword spamming and bold and underlined keywords (sarcasm)
- Terrible backlinks, all directories and free article submission sites (seriously, take a look)
- Less focused on-page optimization
- Not mobile optimized

Most of the rest of the sites carry on the same sort of differences.

Engine: www.google.co.uk
Keyword: sound level meters
Our page: www.cirrusresearch.co.uk/products/sound-level-meters/
Competitor page: www.pulsarinstruments.com/product-information/Sound-Level-Meter.html

Any feedback would be greatly appreciated; I am really struggling to get my head around this. Thanks, James
White Hat / Black Hat SEO | | Antony_Towle1 -
Cross-Site Links with different Country Code Domains
I have a question about the Penguin update. I know they are really cracking down on "spam" links, and I know they want you to shift from linking keywords to the brand name, unless it makes sense in a sentence. We have five sites for one company; in the headers they have little flag images that link to different country domains. These domains all have relatively the same domain name besides the country code. My question is: does linking these sites back and forth to each other in this way hurt you under Penguin? I know Google wants you to push your identity, but does this cross-site scheme hurt you? In the header of these sites we have something like this. I am assuming the best strategy would probably be to treat them like separate entities, or just focus on one domain. They also have some sites with links in the footer, set up like: "For product visit Domain.com". Should nofollows be added on these footer links as well? I am not sure if Penguin finds them spammy too.
White Hat / Black Hat SEO | | AlliedComputer0 -
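If the goal is simply to stop those boilerplate footer links from passing link equity between the country sites, `rel="nofollow"` is the standard mechanism. A minimal hypothetical sketch (the URL and footer text are placeholders standing in for the actual sites):

```html
<!-- Hypothetical footer link: rel="nofollow" tells Google not to
     pass link equity through this cross-domain footer link. -->
<footer>
  <p>For product information, visit
    <a href="https://www.example-country-domain.com/" rel="nofollow">Domain.com</a>
  </p>
</footer>
```

The header flag links are a different case: country-to-country links between legitimate localized versions of the same brand are normal, and are usually better handled with hreflang annotations than with nofollow.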
Schema.org tricking and duplicate content across domains
I've found the following abuse, and I'm curious what I could do about it. Basically the scheme is:
- own some content only once (pictures, descriptions, reviews, etc.)
- use different domain names (no problem if you use the same IP or IP C-block)
- have a different layout (this is basically the key)
- use schema.org tricking, meaning show (the very same) reviews on a different scale, and show slightly fewer reviews on one site than on another

Quick example: http://bit.ly/18rKd2Q
#2: budapesthotelstart.com/budapest-hotels/hotel-erkel/szalloda-attekintes.hu.html (217.113.62.21), 328 reviews, 8.6 / 10
#6: szallasvadasz.hu/hotel-erkel/ (217.113.62.201), 323 reviews, 4.29 / 5
#7: xn--szlls-gyula-l7ac.hu/szallodak/erkel-hotel/ (217.113.62.201), no reviews shown

It turns out that this tactic, even without the 4th step, can be quite beneficial for ranking with several domains. Here is a little investigation I've done (not really extensive, it took around an hour and a half, but it is quite shocking nonetheless):
https://docs.google.com/spreadsheet/ccc?key=0Aqbt1cVFlhXbdENGenFsME5vSldldTl3WWh4cVVHQXc#gid=0

Kaspar Szymanski from the Google webspam team said that they have looked into it and will do something, but honestly I don't know whether to believe it or not. What do you suggest?
- Should I leave it, and try to copy this tactic to rank with the very same content multiple times?
- Should I deliberately cheat with markups?
- Should I play nice and hope that these guys will sooner or later be dealt with? (Honestly, I can't see this one working out.)
- Should I write a case study on this, so that if the tactic gets wider attention, Google will deal with it?

Could anybody push this towards Matt Cutts, or anybody else who is responsible for these things?
White Hat / Black Hat SEO | | Sved0 -
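For reference, the scale trick described above rests on schema.org's `AggregateRating` markup, where `bestRating` sets the scale. A hypothetical JSON-LD sketch using the numbers from the example (the property names are standard schema.org; the hotel name and values come from the listings quoted in the question):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Hotel",
  "name": "Hotel Erkel",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "8.6",
    "bestRating": "10",
    "ratingCount": "328"
  }
}
</script>
<!-- The sister site reports essentially the same review pool as
     ratingValue 4.29 with bestRating 5 and ratingCount 323: identical
     underlying data, presented on a different scale so the duplicate
     listings look distinct in the SERP rich snippets. -->
```

That is why the variation survives casual inspection: both snippets are individually valid markup, and only comparing the sites side by side reveals that one set of reviews is being reused.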
Domain Structure For A Network of Websites
To achieve this we need to set up a new architecture of domains and sub-websites to effectively build this network. We want to make sure we follow the right protocols for setting up the domain structures to achieve good SEO for the primary domain and the local websites.

Today our core website is at www.doctorsvisioncenter.com, which will ultimately become dvceyecarenetwork.com. That website will serve as the core web presence that can be custom branded for hundreds. For example, today you can go to www.doctorsvisioncenter.com/pinehurst; note that when you start there, you can click around and it is still branded for Pinehurst, or Spectrum Eye Care.

So the burning question(s): if I am an independent doc at www.newyorkeye.com, I could do domain forwarding, but Google does not index forwarded domains, so that is out. I could do a 301 permanent redirect to my page www.doctorsvisioncenter.com/newyorkeye. Alternatively, I could put a rule in the .htaccess file that says: if the request is for newyorkeye.com, serve www.doctorsvisioncenter.com/newyorkeye, while the domain still shows up as www.newyorkeye.com. Another way to do that is to point the newyorkeye DNS at doctorsvisioncenter.com rather than using a 301 redirect, with the same basic rule in the .htaccess file. That means that, theoretically, every sub-page would show up as, for example, www.newyorkeye.com/contact-lens-center, which is actually www.doctorsvisioncenter.com/contact-lens-center. It also means, theoretically, that it will be seen as an individual domain, but one pointing to all the same content, just like potentially hundreds of others. The goal is: build once, manage once, benefit many.

If we do something like the above, each domain will essentially be a separate domain; but will Google see it that way, or as duplicative content? While it is easy to answer "yes, it would be duplicative," that is not necessarily the case if the content is on separate domains.
Is this a good way to proceed, or does anyone have another recommendation for us?
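For what it's worth, the .htaccess rule described (serve the partner section when the request arrives on the partner domain, without redirecting) would look roughly like the mod_rewrite sketch below. The hostnames and paths are the ones from the question; whether Google treats the result as duplicate content is exactly the open question here, which is why a cross-domain `rel="canonical"` on the served pages is usually paired with this approach:

```apache
# Hypothetical sketch (Apache mod_rewrite): requests arriving on
# newyorkeye.com are internally rewritten to the /newyorkeye section,
# so the address bar keeps showing www.newyorkeye.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?newyorkeye\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/newyorkeye/
RewriteRule ^(.*)$ /newyorkeye/$1 [L]

# The 301 alternative consolidates all signals on the main domain
# instead of keeping the vanity hostname:
# RewriteRule ^(.*)$ https://www.doctorsvisioncenter.com/newyorkeye/$1 [R=301,L]
```

The internal-rewrite version keeps each doctor's branding in the URL but creates hundreds of hostnames serving the same content; the 301 version gives up the vanity URLs but avoids the duplicate-content question entirely.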
White Hat / Black Hat SEO | | JessTopps0 -
How do you remove unwanted links, built by your previous SEO company?
We dropped significantly after the Penguin update (from page 1 for 4 keywords to ranking beyond 75 for all of them). I understand trustworthy content and links (along with site structure) are the big reasons for staying strong through the update, and that sites that did these things wrong were penalized. In an effort to regain Google's trust, we are reviewing our site structure and making sure to produce fresh, relevant content on our site and social media channels on a weekly basis. But how do we remove links that were built by our SEO company, some of which point from untrustworthy or irrelevant sites with low rankings? Try to email the webmaster of each site (using data from Open Site Explorer)?
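Beyond emailing webmasters, Google provides a mechanism for exactly this situation: the Disavow Links tool, which accepts a plain-text file telling Google to ignore specific links or whole linking domains when assessing your site. A hypothetical sketch of the file format (the domains and URL are placeholders, not real sites from the question):

```text
# Links built by our previous SEO company.
# Removal requests were sent to these webmasters; no response received.
domain:spammy-directory.example
domain:article-farm.example
http://lowquality.example/page-linking-to-us.html
```

Lines beginning with `#` are comments, `domain:` entries disavow every link from that domain, and bare URLs disavow a single page. Disavowing is generally treated as a last resort after genuine removal attempts, since it discards any value those links might still carry.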
White Hat / Black Hat SEO | | clairerichards0 -
Multiple domains pointed at one site
I know things are changing, and the things Google considers cheating searchers out of finding what they are really looking for are changing too. I have multiple domain names that are related to my site, but are not the actual site name. For instance, I have a certification program called Certified NetAnalyst that has a few domains for it: .com, .org, and other derivatives like NetAnalyst. I would like to point the domains to my main company website rather than create a separate site just for the certification. Does Google think it is cheating to point domains carrying my company branding at my main website? What about domain-name forwarding to a specific URL, like taking the certification-name domains and pointing them to the certification page instead of the main site? I wonder if one could nofollow the domain-forwarding links (I don't know how to do that) so there is no duplicate content. Is that possible in some way? Could you put a robots.txt file with excludes on the domain-forwarding landing page so it would not count as duplicate content? Going forward I want all SEO "juice" to go to the main domain, but the keyword value of the domain names is valuable. I would be grateful if someone with a good understanding and specific recent experience of Google policy and enforcement could offer some sage and practical advice, and perhaps a case study example where Google "likes it," or, on the other hand, a good explanation of why I may not wish to do this! Thank you! Bill Alderson www.apalytics.com
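For what it's worth, the cleanest way to keep the keyword domains without creating duplicate content is usually a permanent redirect rather than forwarding or masking: a 301 passes the extra domains' signals to the target page, and nothing is ever served from those domains, so there is nothing to nofollow or block in robots.txt. A hypothetical .htaccess sketch (the certification domain names follow the question's description, but the target path on apalytics.com is invented for illustration):

```apache
# Hypothetical sketch: 301 the certification domains to the
# certification page on the main site. Placed in the .htaccess of the
# host that the extra domains resolve to.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?certifiednetanalyst\.(com|org)$ [NC]
RewriteRule ^(.*)$ https://www.apalytics.com/certified-netanalyst/$1 [R=301,L]
```

With this in place, visitors and crawlers who type the certification domain land directly on the certification page, and the main domain accumulates all the equity.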
White Hat / Black Hat SEO | | Packetman0071