Hi there,
On the campaign setup, you are given 3 options there:
subdomain, root domain and subfolder.
Try to change the setting to root domain so that only mywebsite.com will be crawled and analyzed.
Hope that helps!
Hi Michelle!
Here are the steps on how to tell Google when your site has moved:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=83106
Just follow the steps there and you'll be fine.
Cheers!
Hi there,
Have you tried pinging your sitemap?
Google:
http://www.google.com/webmasters/sitemaps/ping?sitemap=http://www.yourdomain.com/sitemap.xml
Bing:
http://www.bing.com/webmaster/ping.aspx?siteMap=http://www.yourdomain.com/sitemap.xml
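If you have more than a couple of sitemaps, a tiny Python sketch can hit both endpoints for you (the sitemap URL is the same placeholder as above):
import urllib.request, urllib.parse
SITEMAP = "http://www.yourdomain.com/sitemap.xml"  # placeholder - use your real sitemap URL
for ping in ("http://www.google.com/webmasters/sitemaps/ping?sitemap=",
             "http://www.bing.com/webmaster/ping.aspx?siteMap="):
    # URL-encode the sitemap address and request the ping endpoint
    urllib.request.urlopen(ping + urllib.parse.quote(SITEMAP, safe=""))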
It normally takes several days, or even weeks, for the search engines to show any results. You can check from time to time, though.
Hi Mememax,
I think your idea of providing Google Places support for local businesses with no website is a really good one. But linking them all to your site might be tricky. I haven't seen any Google Places listings that are all hosted on a single, unrelated website.
For one, if people find the listing and want to learn more, will you be able to provide those details on your site? Also, pointing them all to one domain is kind of risky.
Maybe you can offer both Google Places and website support, then include your link as the host on either the Google Places listing or the site. That would mean more links for your site. Just a thought...
Hi there,
As far as I know, Google Places (Google+) does not have a multiple-users function. Google says only one Gmail account can manage a claimed Google Places listing.
An alternative is to transfer your listing between Google Places accounts:
http://support.google.com/places/bin/answer.py?hl=en&answer=17104
Hope that helps...
Hi there,
Bing is all about good, original content, authoritative inbound links, and well-structured webpages. If you want to rank well in Bing, you may want to focus on those.
Here is some useful information on Bing optimization:
Search engine optimization on Bing
Hope that helps!
If it's organized and really simple, like the footer here on SEOmoz (check below), then keep it. If it's just 5-6 footer links, I don't think it would take up a lot of link juice.
Plus, footer links (SEO and Google aside) are genuinely useful for site navigation.
Hi Kristian,
Global footer links are no longer as recommended as they used to be.
Aside from what you mentioned about link juice being passed away from the more important pages, footer links are usually devalued by search engines, and they also tend to get a low CTR.
Still, a lot of sites use them as a resource for link placement.
So I guess it still depends: if you have a really nice organization and layout for your footer (like this: http://shopper.cnet.com), then keep it. If not, it might just be a waste of your time.
Cheers!
Hi there,
Having or not having .com in the title doesn't really make that much of a difference. If you have a unique title, the best thing to do is to just optimize it without the .com.
For example, Amazon.com has the .com in its title, but Google, Yahoo, and eBay do not, and they are all branded names just the same. So I think it's a matter of branding your title or name.
Hope that helps!
Cheers!
Hi there,
It is possible for your site to show up for the keyword "breast cancer", as well as when someone searches for "Breast Cancer Foundation AND breast cancer".
But for that to happen, you need to optimize for that keyword first, and it will take aggressive, targeted optimization since it is a very competitive keyword. If you do a sample search on Google for that keyword, you'll see a lot of other sites already ranking for it.
You asked if you will get a backlink for both "Breast Cancer Foundation AND breast cancer". A backlink is when someone links to your site. For example, if the foundation added your link to their site, you'd get a backlink from that. If no authoritative or good sites link to you, you won't get any backlinks.
Cheers!
You're welcome! Hope that study helps you and your client decide which strategy is best to use...
Hi there,
I've seen some stories and case studies suggesting that organic listings seem to be disappearing and are being replaced by local listings. There are some articles about that online:
Is Google plus local replacing organic results?
Are your organic listings being replaced with local listings?
In your client's case, maybe the local listing has not been optimized yet.
You can link your Google+ Local page to the site as "publisher". This tells Google that the site is the publisher of the profile's content. Relevant content on the site is also very important, as it helps strengthen the site's brand. Gathering customer reviews about the business is also very helpful.
Besides, according to SEOmoz's Eye-Tracking Google SERPs study, local listings are more likely to be clicked in the search results.
Hope this helps!
Cheers!
Hi there,
From an SEO point of view (and as good SEO practice), it is preferable for blogs to be placed in subfolders.
In your case, the subfolder would be: furnacefilterscanada.com/blog
Since your blog will have link-worthy content that will be useful to your website, it is recommended that you create a subfolder for it instead of a subdomain (blog.furnacefilterscanada.com).
Hope that helps!
You're welcome. Happy to help!
Hi there,
I asked this same question last week and got really good opinions from some of the members here.
Blog commenting can be risky if you just use it to optimize your keywords. However, if done right (written properly and with value), it is an effective way to build keywords, and credibility, online. If you're leaving comments that are useful, sort of like this Q&A here, then readers actually listen. Plus, Google will see how authoritative you are and will probably credit your site for it.
Here's a case study from one SEOmoz member, who found that blog commenting can still be effective post-Penguin. This might help you decide whether to still pursue that strategy.
Cheers!
Hi there,
SEOmoz has a great article about link building for 2013. With all the changes that have been going on with Google, you may want to do proper SEO for the website, meaning no keyword stuffing or link buying, as Takeshi Young said.
The focus of link building now is providing "evergreen content": valuable content that people want to share, repost, and retweet. Examples are tutorials and how-to guides.
Also, it is important to build a reputable and authoritative online presence. Social networks, groups and communities can help you with that.
Hope that helps!
Hi there,
Try Sophos Anti-Virus for Mac Home Edition.
It's one of the most reputable malware scanners for Mac.
I understand your frustration. Site optimization has really come a long way from just throwing links out there.
Thanks for the advice. I think a lot of marketers are still using this technique, but they're just really careful about how they do it so they don't fall out of Google's good graces.
One website I checked is ranking well on Google. Upon checking its backlinks, I found that most of them are blog comments.
Is blog commenting still valuable? Has anyone encountered any recent problems (rankings gone down, etc.)? Is there any specific strategy for blog commenting these days?
Thanks!
Hi there,
The special characters in your website URL are changed because URLs are sent over the Internet using the ASCII character set. That's why the & is converted to %26.
If you're trying to make your site rank, it is better to simplify or shorten the URL, because search engines can have problems indexing pages whose URLs contain special characters. Some special characters are known to be "search-engine-spider-stoppers".
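As a quick illustration of how the encoding works, here's a sketch using Python's standard library (the sample string is made up):
from urllib.parse import quote, unquote
# "&" is a reserved character, so it's replaced with % plus its hex ASCII code (26)
print(quote("black&decker"))      # prints: black%26decker
print(unquote("black%26decker"))  # prints: black&decker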
Hope that helps!
Yes, I think it's always wise to get other people's opinions about these things.
Google counts that as one link. Matt Cutts explained that well in this video.
So I don't think you have anything to worry about.
Cheers!
Hi there,
You can find the setting on the Webmaster Tools dashboard or home page. This is the page where you can see the list of sites that you are managing in your Webmaster Tools account.
Click the drop-down arrow on the Manage Site button to the right of the site name, then click Google Analytics Property.
You'll be taken to a page: Enable Webmaster Tools data in Google Analytics
That's where you can click the site you wish to link to Google Analytics. Try refreshing after you receive a notice that the site has been added to Analytics.
Thanks!
Hi there, I think having the exact same alt text on the links is not a problem unless you received a warning from Google about excessive usage or, worse yet, spamming with that alt text.
But just to be on the safe side, I would advise you to check those 600 links to see if any need to be disavowed. Normally, the links you need to disavow are:
"Unnatural" backlinks that you created yourself or purchased, especially those with targeted high-volume keywords in the anchor text.
Backlinks that are auto-generated.
If those 600 links do not fall into either of those categories, then I advise you not to disavow them. It would be a waste to disavow links from good sites that link to you because they genuinely like your site.
You could use a wildcard operator in your robots.txt to deal with this.
Something like this should work (note that Disallow rules need to sit under a User-agent line):
User-agent: *
Disallow: /subfolder/myforum/pop_profile.asp?mode=display&id=*
See here for more details:
http://sanzon.wordpress.com/2008/04/29/advanced-usage-of-robotstxt-w-querystrings/
BBB is one of those things that most business people have a strong opinion about (either for or against). From a pure SEO point of view, I consider it a very strong link. Here's why:
From a pure SEO link perspective, the directory has a very high DA and MozRank (by far the highest MozRank of any of the directories recommended by SEOmoz)
the directory is exclusive. You have to be a business owner to get listed. This prevents someone from listing 5 or 10 near-spammy sites, which crowd some other directories.
they are very exclusive about linking out. It's kind of annoying, but if you don't pay them again next year, your link will be gone fast! That ultimately ensures they are not linking to businesses that are no longer operating, which increases the overall trust of the directory.
the average consumer feels warm and fuzzy seeing a BBB banner and viewing the business's profile on the BBB website.
I think I paid something like $300 or $400 for it, so maybe the price varies depending on your region and business type? In any case, I'll continue to renew with the BBB every year.
Name.com, NameCheap.com, Dynadot.com are all good and have reasonable prices. I personally use fabulous.com for most of my domains.
As annoying as Godaddy can be, I was happy to see they now offer two-factor authentication as a way of protecting domains.
I think reputation of a registrar can be a factor - mainly if they mess up something in your account, or don't handle renewals properly, causing your site to go offline.
Just curious, is there any reason you did a 410 instead of a 301? I think most webmasters would setup 301 redirects to the most relevant remaining page for each of the pages that you did remove. With a 410, you're effectively dropping backlinks that might have existed to any of the pages that you had.
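If you do go the 301 route, one mod_alias line per removed page in your .htaccess is enough (the paths and domain here are made up):
# Send visitors and link equity from the removed page to its closest match
Redirect 301 /removed-page.html http://www.example.com/closest-related-page.html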
Well, because they are 'orphans', you probably can't find them using a spider tool! I'd recommend the following process to find your orphan pages:
1. get a list of all the pages created by your CMS
2. get the list of all the pages found by Screaming Frog
3. add the two url lists into Excel and find the URLs in your CMS that are not in the Screaming Frog list.
You could probably use an Excel trick like this one:
http://superuser.com/questions/289650/how-to-compare-two-columns-and-find-differences-in-excel
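If you'd rather skip the Excel step, the same comparison is just a set difference in Python (the file names are placeholders; assume one URL per line in each file):
# Pages the CMS knows about, minus pages the crawler reached = orphans
cms_urls = set(open("cms_urls.txt").read().split())
crawled_urls = set(open("screaming_frog_urls.txt").read().split())
for url in sorted(cms_urls - crawled_urls):
    print(url)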
I've had the best luck with expired domains by 'rehabbing' them: more or less building out a high quality site related to the topic of the domain. This is on the whole a 'gray hat' area, so use any of this advice at your own risk:
1. be careful about analyzing and verifying the backlinks of domains. DMOZ seems to remove links as soon as a domain goes past the expiration date, so don't count on getting that link. Other directories like Yahoo Dir don't seem to care.
2. oftentimes a domain looks great because it has lots of backlinks. But those backlinks could all come from domains owned by the same person, all expiring at the same time.
3. I've rebuilt some domains that had completely dropped (so completely past the auction stage), and found them to have traffic shortly thereafter. I've also rebuilt names that I thought were good, with great backlink profiles, and never had them rank for anything. So on the whole it's hard to understand why some domains perform better than others.
It's really going to be a bit complex to get this done right. But based on your example above, it looks like you just want to redirect anything with a query string back to the base URL.
There's a discussion specifically about that right here:
http://www.webmasterworld.com/apache/3203401.htm
You could start with this code and refine it from there:
# If the request has any query string at all...
RewriteCond %{QUERY_STRING} .
# ...301 redirect to the same path with the query string stripped (the trailing ? drops it)
RewriteRule (.*) http://www.example.com/$1? [R=301,L]
Good luck.
Based on their reputation, I would be wary. They might be trying to sell you a domain name that they don't even own.
You should be able to check the whois record to see who owns the non-hyphenated version of the domain (assuming they don't have whois privacy setup). If your friend really wants to get the domain, they could try to contact that domain owner directly, rather than replying to a spam email.
Also, for any domain transaction (even a small one like this), you should use escrow, especially if there's reason to suspect fraud from the other party.
There's a service, ecop.com, that specializes in low-value escrow transactions.
Have you seen the URL parameter tool in Google Webmaster Tools? It's made just to handle these kinds of situations:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
That said, there's no reason you can't set up redirects for all of these URLs, but it's probably going to be time-consuming to figure out all of the possible URL combinations you have and where to redirect them to.
Did you verify your business listing? Google sends you a postcard with a verification code on it. If you haven't done that yet, it's probably the reason.
http://support.google.com/places/bin/answer.py?hl=en&answer=145585&topic=1656748&ctx=topic
I guess my question would be: why do you have two completely different URL paths returning the same content? Generally, canonical is used to catch things like www vs. non-www, trailing slashes, caps vs. no caps in URLs, # signs in URLs, etc.
http://en.wikipedia.org/wiki/URL_normalization
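For reference, the canonical tag is just a link element in the head of the duplicate page, pointing at the URL you prefer (this URL is a placeholder):
<link rel="canonical" href="http://www.example.com/preferred-page/" />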
My guess is that it's probably a bug in the SEOmoz software, and that Google would probably respect your canonical tag here. Just to be safe, you might block the second URL in your robots.txt file.
It's an interesting idea. There's a very popular blog in the SEO industry that is 'do follow'. Its author is very interactive, and the comments there are generally good.
Here are some considerations:
outside of the SEO industry, how many people really understand the difference between 'do follow' and 'no follow'? If your topic is niche and very technical, the fact that it's 'do follow' may have zero impact on the type of person you really want leaving feedback.
be prepared to be a firm moderator for the comments. I think you'll often find yourself on the fence when you see a comment with marginal value. I would try to set the bar high on which comments you approve.
Facebook comments might be useful. Since Facebook comments show up on a user's wall, and can generally be viewed by friends who might be in the same industry, they might be a good way of generating more discussion. We've used Facebook comments for a few sites and have had very little problem with spam comments. The downside is that the content is stored on Facebook rather than your own blog (though there are some plugins that attempt to address this by downloading the comments to the blog).
The SEOMoz profile system is kind of cool. Once a user has generated enough 'points' - their profile link becomes followed. I sometimes wish there was a similar system for blog comments. Maybe there is?
Hi Titan -
I'm not an SEOMoz employee - so I can only speculate here.
In the blog post from Raven Tools, they said the reason they decided to stop offering their rank tracker is that they failed an audit for the Google AdWords API. Google gave them an ultimatum: either stop using scraped data, or lose access to the API.
From what I understand, SEOmoz was recently faced with the same problem, and they chose to drop access to the AdWords API instead of giving up the scraped data. This is understandable, since SEOmoz's tools focus exclusively on SEO, while Raven offers a wide range of services - some SEO, some PPC, some social media, etc.
So, I would be surprised if SEOMoz would stop offering the rank tracker. Though I really have no inside knowledge about this at all.
I think the SEOmoz rank tracker is an internally developed tool. Raven Tools, and a lot of other SEO tools in the industry, use a company called Authority Labs.
Rank tracking is the only thing that they do, and from what I've seen, their data is a bit better overall, if you're just looking for rank tracking. They also do daily monitoring of keywords, which is a bit overkill from my perspective - but could be useful if you're in a very competitive niche.
The link from Stephen to Search Engine Land is a good resource. I'll just add a few additional thoughts here:
1. There are several types of domains that you might be picking up - and they are not all equal. There are 'pre-release' domains which are auctioned off about a month after they've expired. These are the kind you get from GoDaddy Auctions or Namejet. If you buy those - the creation date is not actually reset - so a 10 year old domain would still be 10 years old. During this period of time, the existing owner of the domain could actually re-register the domain as well.
After this, domains go through the 'Redemption Period' and then the 'Pending Delete' period. These domains will be completely deleted from the registry, and have their creation dates reset.
You're much more likely to get some 'juice' from pre-release names than from names that have completely dropped from the registry.
2. That said, if you're still considering this technique, you'd probably want to look very carefully at the backlinks of the site you're buying. A large portion of the expired domains with backlinks were used for spamming. It's probably not worth your time to disavow all of the bad links from a domain you've picked up at auction.
3. Expired domains can be 'rehabbed'. If you take the time to rebuild the site with valuable content, it will be able to rank for search terms, and build up page rank again. You'd probably have much less risk in the long run by rehabbing some related domains with good content and linking back to your main site, than using the 301 technique - though I've never done any side by side experiments to say for sure.
There's nothing wrong with upgrading your site. In general, I would recommend to update your site, if you have the time to do it right. Things to watch out for are:
will every page still have a unique URL? There are a number of HTML5 based systems which run the entire site on a single URL (look at how Gmail runs for example). It can be great for an end user experience since the pages appear to load very quickly. However, such a system would probably make it difficult for all of your pages to be properly indexed.
assuming you'll have unique URLs - make sure that the new site will use the same URLs as the original site. If not, plan to put in 301 redirects for all of the existing pages.
just because you're going HTML5, don't forget the basics. Use your title tag, H1, H2, etc on the new template.
after you launch the new site, make sure spiders can still crawl the site properly. If yes, then you should be all set with your new site.
ahh, I see. I think there can be cases where it's valuable to pull in a feed. Perhaps an index of relevant headlines around a topic, or maybe headlines from a users blog posts. I wouldn't use it as the primary basis of an SEO strategy though.
Also, one tip if you're planning on doing something like this. It's much better to cache the results of a feed on your server, rather than pulling in a live RSS feed every time a user loads a page. Given the value that Google places on page speed loading, you would probably have a greater negative impact on page load speed by adding RSS feeds to your site, unless you have a cache system in place.
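As a rough illustration, a minimal file-based cache in Python might look like this (the feed URL, cache path, and one-hour TTL are all just assumptions for the sketch):
import os, time, urllib.request
FEED_URL = "http://www.example.com/feed.xml"  # placeholder feed address
CACHE_FILE = "/tmp/feed_cache.xml"
TTL_SECONDS = 3600  # re-fetch at most once per hour
def get_feed():
    # Serve the cached copy if it is fresh enough
    if os.path.exists(CACHE_FILE) and time.time() - os.path.getmtime(CACHE_FILE) < TTL_SECONDS:
        return open(CACHE_FILE).read()
    # Otherwise fetch the live feed and store it for next time
    data = urllib.request.urlopen(FEED_URL).read().decode("utf-8")
    with open(CACHE_FILE, "w") as f:
        f.write(data)
    return data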
In general, I think it's good to have RSS feeds for a site. It's a good way of reaching new viewers, who might only come across your content in a reader or a syndicated version. Also, lots of social platforms have an option for including an RSS feed - so if you interact on the social platform - people you are interacting with would be able to see some of the most recent content from your site.
Google doesn't count RSS feeds as duplicate content, and is generally smart enough to figure out the original source of the content if it was republished in part somewhere.
I've never had a problem with creating a large number of redirects on a site before. It happens quite a bit, for instance when a site is moving to a new domain or a new CMS, where it can often be very difficult to exactly recreate the same URL structure.
There's no limit on the number of redirects, just on the number of hops. If the site had existing redirects in place, you might want to update those as well, to point directly to the new final destination, as in the example below.
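To illustrate the hops point, if /old-a.html used to redirect to /old-b.html and /old-b.html has now moved, point both rules straight at the final page rather than chaining them (the paths and domain are made up):
# One hop each - avoids an old-a -> old-b -> new-page chain
Redirect 301 /old-a.html http://www.example.com/new-page.html
Redirect 301 /old-b.html http://www.example.com/new-page.html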
hmm - I think it is what it is. The products you're selling are things that people need, but are fundamentally not that interesting.
I think the strategy of incentivizing people to leave reviews is a good start (WEbucks). If you're able to generate enough UGC this way, you might consider not showing the same comments on both of the pages.
It looks like you're using a generic meta description / keywords for both of the pages. You should be able to customize this inside your templates to use something more relevant to each page.
It also looks like the descriptions are shared by other sites across the web. Ideally you could figure out a strategy for generating something more unique here. Maybe you could find a way of highlighting some of the best comments left specifically about the particular product in question?
Looks like it's only been about 3 weeks since you've registered the domains. You probably just need to wait a bit longer, and the other site will catch up. A couple of tips would be:
1. Explicitly create a robots.txt file. I just feel better doing this, even if I'm not adding any special conditions or restrictions to it (see the minimal example after this list).
2. Register your sites with Google Webmaster Tools
3. Build more good quality links to your sites. 500 plus pages is a lot for Google to index for a brand new site without any authority. The more good links you have to your sites, the more pages Google will index.
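For reference, a minimal 'allow everything' robots.txt looks like this (the Sitemap line is optional, and the domain is a placeholder):
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml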
Hey - thanks for those links. I do remember reading those Webmaster Central posts a while back, but I'd never used that technique in practice. I think either technique requires good cooperation from your syndication partners to implement. In practice, it may not always be easy to have a syndication partner add meta tags specifically for a page of content they are publishing.
In terms of which one is better - I really can't say. I would guess that a noindex plus a link would probably be more explicit, since in that case the search engines don't have to decide which is the real canonical version - there's only one page of content existing.
Also, the way they describe cross domain canonical sounds kind of wishy-washy ---> "While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible."
It seems like two different issues to me. If your content is syndicated on a 3rd party site, Google is saying: ask your partners to noindex the content and provide a link back to your original source. That way your original source will rise above all of those syndicated copies (in many other places around the web) to be the highest-ranked page.
If you are optimizing your own site, they are saying to be careful to avoid duplicate versions of the same page within your own site arising from canonicalization problems, which make it appear that you have lots of very similar versions of the same page.
I think I can see how you got confused here - since they are talking about the topic of duplicate content in general - which can be caused either by syndication (publishing one page of content across many different sites) or canonicalization issues (where the same page of content on your own site appears on several different URLs).
Hope that helps!
Amy - a few ideas here:
After you do all of that optimization, I also like to run a tool like the Screaming Frog SEO Spider, to see if there are any pages across the site that have exactly the same or very similar title tags. If you have more than a few pages, duplicates can creep in very easily, so it makes sense to go back and double-check later. You can sometimes discover two or more pages on your site that actually have very similar content. In a case like that, pick the better one, merge in any relevant content, and 301 redirect the old page.
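If you export the crawl to a CSV, a few lines of Python can flag duplicate titles for you (the file name and the 'Address'/'Title 1' column names are assumptions based on a typical crawl export):
import csv
from collections import defaultdict
urls_by_title = defaultdict(list)
# Group every crawled URL under its title tag
for row in csv.DictReader(open("crawl_export.csv")):
    urls_by_title[row["Title 1"]].append(row["Address"])
# Any title shared by more than one URL is worth a second look
for title, urls in urls_by_title.items():
    if len(urls) > 1:
        print(title, urls)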
After that, make sure you've signed up for Google Webmaster Tools, submitted a sitemap, and make sure you don't have any issues with slow site loading or canonicalization (where the same page is referenced with slightly different URLs).
Hope that helps - without looking at the site too closely, that's the best advice I can give.
Hard to say - but I see the following potential issues:
there's really not that much to say about a piece of furniture. It would be hard enough to craft one really good sales page for a walnut sleigh bed, let alone write original and interesting pages about that product five times. Eventually you're going to be creating thin content for the lower-priority sites.
product names usually end up being used for the title tag and the H1 tag of an ecommerce product page. Given that your products are named the same across all 5 sites, and that they are all running on the same ecommerce software, that's going to leave a pretty clear footprint that the sites are all related.
if you're successful in this, you're going to be competing against yourself in the SEO rankings. In that case, you'd probably want to consider how the title tags and page descriptions are going to look stacked up against each other.
given that the sites are hosted on a 'multi-store' system, they'll probably be hosted on the same IP address, which is something that all search engines use as a clue to find relationships between different sites.
Hope that helps!
In terms of post-Panda/Penguin linking, the best thing is to stick with an honest strategy by observing the following rules:
use breadcrumb-style links within all pages. These are good for usability, as they let users know where they are on your site. They also provide a way of including rich anchor text links on your site.
use internal link anchor text in a natural way. For most pages, there are several closely related variations of anchor text that fit a common theme. For instance, in one case I link to a certain page on one of my websites with the following keywords --> free domain appraisal, free domain name appraisal, free domain name valuation, free domain valuation. The page in question is very relevant for all of those variations and ranks well for all of them. It just takes a little extra thought to come up with these keyword variations, and it can ultimately generate much more traffic for the page in question too.
cross link related pages with relevant anchor text. There may be related pages within different category pages. Rich, in-content text links between those pages provide value to your readers and should not hurt you in any way.
I think a basic site hierarchy is reasonable to follow. Pick major categories for the top level links, and related topics below each of those categories.
Stay away from domain names containing brands. After all your hard work building up the site, it can easily be claimed by the brand-holding company. You'd have to turn over the domain to the brand holder without any compensation. Even worse, you could be subject to a huge fine.
You can learn more about the process by which a brand holder can seize the name here:
http://en.wikipedia.org/wiki/Uniform_Domain-Name_Dispute-Resolution_Policy