Instead of blocking it via robots.txt, try blocking it via a meta robots tag like the one below:
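A tag along these lines should do it, placed in the page's head; `noindex` keeps the page out of the index, while `follow` lets crawlers keep following its links:

```html
<!-- Place inside the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex, follow">
```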
This way Google (and others) won't index the page but they will follow all of the links.
Everything is coming up fine for me. Though sometimes Google will change or truncate your title tag depending on what keywords or phrases the user searched with. What title tag are you seeing for that particular term?
You want to stay away from directories that have little to no screening process. If a listing is easy to get, it's not worth having. Make sure to do an "SEO background check" on them as well by checking their backlinks in Open Site Explorer. See if they have good-quality links, many linking C-blocks, solid DA, PA, PR, etc.
Here are some great directories, but you have to pay:
Yahoo Directory ($299/Year)
Best of the Web ($299/One-time)
Business.com ($299/Year)
JoeAnt ($39/One-time)
SEOmoz also has a list of good directories here http://www.seomoz.org/directories
I think you have the right ideas in mind. I would start with the link building outside of your site. Then, after a while, if that doesn't work, I would do a blog post on them or something. I wouldn't link to them from the home page.
If it's a resource that you pride yourself on creating, then yes. Registering the copyright would only run you about $30, so it's worth it in the long run; it's always better to protect yourself. Or, at the very least, put the copyright symbol at the bottom like you mentioned, just to scare off anyone thinking of ripping you off.
Moz has an affiliate program here: http://moz.com/partners. But if you're looking to sell a lot of different products through affiliate marketing, try visiting Clickbank.com or CJ.com.
Hi Tommy,
Not exactly. I think I misunderstood your original question. I thought you had two pages with the same content, and they were accessible via two different categories.
But I think you're saying you have one page that can be reached via two different categories, and the breadcrumbs are the same no matter which route visitors take: whether they come through A or B, they see category A breadcrumbs.
I wouldn't worry so much about the breadcrumbs; I would worry more about duplicate content and URLs.
Let's say you're selling a flashlight, and you just have one flashlight product page. But, because of the content of your site, you listed it under two different categories. Let's just say the categories are tools and gadgets.
So if you had two URLs:
http://www.site.com/tools/flashlight and http://www.site.com/gadgets/flashlight
but they were technically the same page (same content and everything, just a different URL), this would be bad.
The fix for this would be to pick the URL you want to rank, then set that URL as the canonical on the other, so when Google crawls either one, it knows which URL you prefer.
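For instance, if you wanted the /tools/ URL to rank, the /gadgets/ page's head would carry a canonical tag pointing at it (using the example flashlight URLs from above):

```html
<!-- In the <head> of http://www.site.com/gadgets/flashlight -->
<link rel="canonical" href="http://www.site.com/tools/flashlight">
```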
However if it were the same url, no matter which category they came from, there is no problem, because there is no duplication.
Now back to the beginning
If you really want the breadcrumbs to reflect which category visitors came from, instead of just redirecting to category A, then create another page for category B that is identical to the page for category A, and set the URL of page A as the canonical on the new B page.
So users get the same product page (content-wise) with a breadcrumb that reflects their path, but Google will only count one URL no matter which one it crawls.
I ran it through the keyword difficulty tool and it appears to be a very low-competition keyword. The site has a PA of 25 and a DA of 15, so I wouldn't be surprised if the keyword in the domain alone is the cause. The .com also ranks #6 with nearly no content, which makes me believe the same is true for this domain.
Matt Cutts said they would be "turning the dial down" on exact match domains, not necessarily getting rid of it as a metric. It is still a pretty powerful ranking signal, just not as much as a few years ago.
If you're trying to rank for this keyword, just make sure your website design looks great and has plenty of good content; this alone should get you up there with them. Then just follow that up with basic link building, nothing serious.
To follow up with what Steve said, PR doesn't necessarily determine how you rank. PageRank is determined by the PageRank of the sites that link to you. So having 20 PR3 sites linking to you won't give you a very high PR; it will most likely be around PR3 as well. Ranking higher has more to do with overall metrics, like Domain Authority, Page Authority, domain links, page links, social signals, etc.
Here is a great post that explains it better:
Breadcrumbs are more for UX, like you say; however, they do help search engines crawl your site's pages better as well, especially pages that aren't in the main navigation.
I think the canonical issue is the more important one, rather than which links appear in the breadcrumb. I would select the page you prefer to rank, then put that URL in the canonical tag of the other page.
So the canonical would be for Google, and the breadcrumb would be for the user.
Also, who knows, maybe having the different breadcrumb is better for the user, because they came to that product from a different path in the first place. But Google would count both pages as the same.
It would only be bad if the home page and category page were almost identical. If 90% of the products on the home page also appear on the category page, then yes, that's duplicate content. But if you just have a few scattered products on the home page that happen to be somewhere else, you should be fine. All ecommerce stores deal with a situation similar to yours; I think if Google banned sites that display a product on different pages, nobody would rank. Just make sure that each page has enough unique products on it.
Depending on your goals for the three sites, you may not have to "hide" it at all. Now, if you have a mini link farm going on, then you would definitely want to hide the WHOIS info, as well as get hosting that can give you a different C-block IP address for each domain; try looking at http://seohosting.com/. But if the intention is "harmless" and you don't plan on just using them as a way to link to each other (even though they still might in a whitehat way), you don't have to hide the info. Take opensiteexplorer.org and moz.com for example: it's clear that Moz owns that site, but they don't feel the need to hide it because each site serves its own purpose.
How would you properly enter hours into the Moz Local spreadsheet for a business that sometimes closes for lunch, so it's open for a few hours in the morning and a few hours in the afternoon?
Below is my attempt at what I thought it could be:
2:9:00:AM:12:00:PM:3:00PM:6:00PM,3:7:30:AM:10:30:AM:3:00PM:6:00PM,4:9:00:AM:12:00:PM:3:00PM:6:00PM,5:7:30:AM:10:30:AM:3:00PM:6:00PM,6:3:00PM:6:00PM
Hi Matthew,
There are a couple of ways to get the ball rolling so your brand page ranks rather than your product page.
The best way to change this is to market your brand name page on the web. Here are a few suggestions:
Onsite changes:
1. Have breadcrumbs enabled. This gives your brand page a link on every single product page associated with that brand name with the appropriate anchor text.
2. If you have a blog, link to that page whenever that brand is mentioned in a post. This gives the page some more pagerank and anchor text occurrences with that particular brand name.
3. Link to the brand name page throughout the site where appropriate, in a non-spammy way (e.g. navigation, footer navigation, category pages, etc.)
Offsite changes:
1. Increase your incoming links to that particular page. Reach out to friendly and relevant sites/blogs and try to have them list your page as a reference for a retailer of that brand name.
2. Increase your social engagement around that page. Using your Moz tools, research and reach out to people talking about that brand on social media. You can connect with people who have used those products and get them to visit your page, and webmasters who visit your page may link to it as well.
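For onsite point 1, a plain breadcrumb trail like this is all it takes; every product page then links up to the brand page with the brand name as the anchor text (the brand "Acme" and the URLs here are made up for illustration):

```html
<!-- Breadcrumb on a product page; "Acme" is a hypothetical brand name -->
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/brands/acme/">Acme</a> &gt;
  <span>Acme Widget 3000</span>
</nav>
```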
Getting all of the listings correct for the practice should always come first, before you get into individual doctors. I personally would hold off on submitting a listing for each doctor, except on the sites that are specific to doctors, like healthgrades.com and ratemds.com, and then Google Plus Local. Google prefers a practice to have one main Google Plus Local profile for the whole practice, and then an individual Google Plus Local profile for each doctor in the practice.
Here's an example of a practice Google Plus Local page:
https://plus.google.com/+ChoiceChiropracticWellnessCenterBoulder
And here is one for a doctor specifically (this is the type of profile that is the equivalent of the practice profile, but for each doctor): https://plus.google.com/100970026051930711588
Also, those two are different than the regular personal profile, like the one used for authorship:
https://plus.google.com/+JonathanSchnelle
If all you want to do is measure data from the alternate domain name from your print campaign, try creating a unique landing page and having the domain forward to that URL instead of the home page. This would also help in tracking traffic in Analytics. But if you want the alternate domain name to show up in SERPs, as you suggest near the end, then all of the things you listed will definitely do that. I guess it just depends on what you want out of the domain name.
If you're in WordPress, changing your permalink structure will automatically move your pages to the new URLs. Then all you would need to do is add 301 redirects from the old URLs to the new ones. So technically you would not have to create new pages; you would just need a list of the old URLs to input into your .htaccess file for the redirects. Hopefully that helps you decide if it's worth it.
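In the .htaccess file, each old/new pair becomes one `Redirect 301` line, roughly like this (the domain and paths are hypothetical; substitute your old and new permalink structures):

```apache
# 301 redirects from the old date-based permalinks to the new ones
Redirect 301 /2013/05/my-post/ http://www.example.com/my-post/
Redirect 301 /2013/06/another-post/ http://www.example.com/another-post/
```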
It's my understanding that you treat each subdomain as a unique site. So each subdomain should have its own unique XML sitemap and robots.txt file, and each should be submitted separately to Google Webmaster Tools. But to answer your inter-linking question: I would avoid including the other domains' URLs in those files (XML sitemap and robots.txt). Only include the URLs for that particular domain and/or subdomain. With that being said, however, I would inter-link them on the actual site somewhere, maybe in the HTML sitemap, navigation, footer, or even naturally throughout your body content where appropriate.
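As a sketch, the robots.txt served at a subdomain would reference only that subdomain's own sitemap (blog.example.com is a hypothetical subdomain here):

```text
# robots.txt at http://blog.example.com/robots.txt
User-agent: *
Disallow:

# Only this subdomain's sitemap, not the main domain's
Sitemap: http://blog.example.com/sitemap.xml
```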