I found a Matt Cutts video which says that keyword-rich domains are being devalued.
I would recommend doing more brand-awareness exercises rather than switching domain names for a more keyword-rich domain.
First of all, make an image sitemap and submit it to Google. Here are the guidelines:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=178636
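To give a rough idea, an image sitemap entry generally looks something like the sketch below (the domain and image paths are made-up placeholders, not from your site):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <!-- one <url> entry per page, listing the images that appear on it -->
  <url>
    <loc>http://www.example.com/cars/</loc>
    <image:image>
      <image:loc>http://www.example.com/images/car1.jpg</image:loc>
      <image:caption>Example caption for the image</image:caption>
    </image:image>
  </url>
</urlset>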
Then define some default tags and categories. As Barry pointed out, it makes a lot of sense to categorize; that adds relevance to your website in searches.
Then, at the back end, assign default tags to the categories. For cars the default tag could be automobiles, for scenery it could be nature, and so on. That would help your optimization.
You can try mod_rewrite on Apache or, if it's not a big website, URL-based redirection in .htaccess.
That way, you can layer the architecture so that you can serve something like a vanity URL.
That would solve your URL problem.
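As a minimal sketch, assuming Apache with mod_rewrite enabled (the paths and query string here are hypothetical, just to show the shape of the rule), a vanity URL in .htaccess could look like:
# serve a clean vanity URL while the real script stays where it is
RewriteEngine On
RewriteRule ^offers/?$ /products.php?category=offers [L]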
I would say that golf clubs would serve you better than just clubs. Clubs can also mean a gathering place for folks with similar interests, whereas you get clear results for the keyword golf clubs if you mention it in the URL itself.
For Google:
http://www.google.com/support/websearch/bin/answer.py?hl=en&answer=136861
and
http://www.google.com/support/bin/static.py?page=guide.cs&guide=30275&topic=1051770
For Bing, there is no official source, but bloggers have put together one.
http://www.searchbrat.com/bing-search-advanaced-command/
Hope that helps.
Hi Laurent,
If it's WordPress, I can think of a few plugins to help you there. SEO Smart Links is a great plugin in that respect. URL: http://www.prelovac.com/vladimir/wordpress-plugins/seo-smart-links
Then there is a popular-posts plugin which will showcase your popular posts in the sidebar. This is a great way to leverage your old/archived content.
You can extract the initial set of keywords from Google's external keyword tool. That should give you a reasonable starting matrix of the set you should be targeting.
Also check with real internet users (maybe aspirants or students who would be likely to search for something similar to your website's offering) to see what they would use to search for a website or product like yours.
That would give you an idea of what keywords people are looking for.
Think as a user, not an internet marketer.
You can also try the SEOmoz PRO keyword difficulty tool. That's pretty good as well.
The best way is to enable the creation of unique content daily. UGC and user participation, added to the unique content, would improve crawlability. If you want to go a bit high-tech, you can put the fresh content on a web cache so that Google's bots visit you more frequently. Also, make sure that your older/archived pages get some content cycled in (new comments, new tweets, etc.).
I am not 100% sure about this, but this is the usual explanation. Normally, Google also looks at which data center the user has been querying and decides the location based on that. It tries its best to report the location accurately, but sometimes the mechanism fails, which is where the '(not set)' value comes in.
Well, a strict no-no on that practice. Your competitor will have issues in the long run. It seems quite easy to get folks to comment, and it is aimed at "gaming" the Google ranking algorithms. That practice will surely fall prey to Google's webspam team one day. It's best not to tread that path at all. (By the way, Matt Cutts, head of Google's webspam team, is one of SEOmoz's users... hope he gets to read this post and suggests algorithmic changes.)
SEOmoz had an awesome Whiteboard Friday on this.
http://www.seomoz.org/blog/whiteboard-friday-faceted-navigation
Some additional resources:
http://www.seomoz.org/ugc/dealing-with-faceted-navigation-a-case-study
Matt Cutts on faceted navigation:
http://www.stonetemple.com/articles/interview-matt-cutts-012510.shtml
Hope they help you
Contact sitesupport@seomoz.org and I am sure they'll compile a list soon.
I would suggest that, depending on the number of pages, you do either an .htaccess-based redirect or an Apache mod_rewrite rule.
Here is an article which could help you.
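For a handful of pages, a minimal .htaccess sketch could be something like the lines below (this assumes Apache with mod_alias and mod_rewrite available; the paths and example.com are placeholders):
# one-off 301 redirects for a few pages (mod_alias)
Redirect 301 /old-page.html http://www.example.com/new-page.html
# or a pattern-based rule when many pages move together (mod_rewrite)
RewriteEngine On
RewriteRule ^old-section/(.*)$ http://www.example.com/new-section/$1 [R=301,L]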
You may want to see how Google's cache of your homepage looks. Maybe it's reading the wrong header every time; my suggestion would be to check that first. If it shows fine, I don't think it will hurt your rankings as such. If you have rel=canonical in place correctly, you should be fine. It would be worthwhile to identify the pages where you want rankings and diagnose based on those.
The problem is very common for content-heavy websites where content lies somewhere way down the hierarchy.
I am considering or assuming a few things here:
1. The webpage you are referring to has already been crawled at least once.
2. It is accessible from at least one link on your homepage.
3. It does not have a huge number of outbound links, that is, no more than around 100 (within and outside your domain).
Your first task should be to get Google to crawl the page(s):
1. Get a tool like GSiteCrawler and crawl your entire website. Create and submit an XML sitemap of your website to Google Webmaster Tools (a minimal example is sketched after this list). Create links from your pages that are already indexed to this page (or pages). That way, Googlebot will find its way eventually.
2. Update fresh content on the page. Create an RSS feed of the content, update it frequently, and serve it up front on the homepage or an important page of your website (one which ranks well in Google).
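A bare-bones XML sitemap looks roughly like this (example.com, the date and the change frequency are placeholders you would swap for your own pages):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> block per page you want crawled -->
  <url>
    <loc>http://www.example.com/deep/page.html</loc>
    <lastmod>2011-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>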
All said, you have to wait and watch. There is no way you can forcefully ask Google to crawl your webpage. Also, updating your homepage content (just text, with no link to your deep pages) wouldn't help speed up the process. But it's a good practice to keep your homepage content fresh so that Google's bots visit your website regularly and you get Google love.
Hope that answers your question.
I don't think there are legitimate ways to influence Google Suggest for popular keywords. I have noticed one thing though: popularity of a particular term leads to its inclusion in the suggest list. Example: I ran an awareness/social campaign to save our historical monuments from vandalism by making a website where people could scribble whatever they want.
That campaign went off really well, with retweets and shares among good influential folks. It got shared on bookmarking websites as well. Suddenly, I started seeing the keyword "responsible travel" coming up in suggestions. But as the momentum died, we lost that preference. Maybe the QDF algorithm kicked that keyword out?
Shwan,
I have noticed that when you have a long URL structure with multiple folders, Google tends to lose "interest" in your deep pages.
Let me give you an example: say you have a domain called www.website.com with a category called gemstones. Within gemstones, you have diamond as a subcategory and a solitaire as a page.
If you consider your homepage to have an importance of 1, no category page will have an importance greater than or equal to 1. So your category page gets a page-weight value of, let's say, 0.9. Your subcategory page is treated the same way and gets a page weight of, say, 0.8. Your solitaire page then gets a value less than 0.8. Now, if you cut out one or more levels in your URL, you have a better chance of a higher value being assigned to your page.
Now, coming to your question. Breadcrumbs are essentially meant to help your users navigate better. So your website hierarchy (the folders and sub-folders, or categories and sub-categories) should be reflected in your breadcrumb.
So, keep your URLs short, but keep your breadcrumbs like your website flow.
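To make that concrete with the hypothetical gemstones example above, the page could live at a short URL while the breadcrumb still shows the full site flow:
<!-- page URL: http://www.website.com/solitaire-diamond-ring (short, no nested folders) -->
<div class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/gemstones">Gemstones</a> &gt;
  <a href="/diamond">Diamond</a> &gt;
  Solitaire
</div>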
Add a robots meta tag, such as <meta name="robots" content="noindex"> or <meta name="robots" content="noindex, nofollow">.
That will do it for you.
Google's resource for this:
http://www.google.com/support/webmasters/bin/answer.py?answer=35264#2
The 100-link figure is more of a guideline than a strict rule as such. Your first objective should be to get the page indexed. Google's Query Deserves Freshness (QDF) algorithms will eventually index your URL; it's a matter of time as long as you link to that page from at least one page.
My advice would be to link it from more pages (if possible) and keep the content fresh.
Maybe you can even try the RSS idea as well.
If you are there just for reputation management, I would advise that you push the brand name mostly. Of course, if you are targeting secondary (read: related) keywords which point to the website's offering rather than the brand keywords, you are doing a great service for your client.
You can actually talk to your client and let him know about your intentions so that you may be credited for the effort.
Sounds fishy to me. It's hard to believe all those links were acquired the white-hat way. There are some possibilities that I can think of: paying for inclusions, getting the majority of links from press-release product announcements, or some viral element on the website which has been covered by reputable media sources.
It would be interesting to explore the full set of links.
Joe,
First of all, which one of your websites ranks better for your target keywords?
Which one is the older domain?
Which one gets you more natural traffic (not from search engines or referrals, but direct traffic)?
I would go with the domain that comes out ahead on all three of these answers.
A rel="nofollow" attribute on the embedded link would be a good thing to add, since that tells Google not to give any link authority to that URL.
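For example (example.com and the anchor text are placeholders), the embedded link would carry the attribute like this:
<!-- nofollow tells Google not to pass link authority to this URL -->
<a href="http://www.example.com/embedded-page" rel="nofollow">Embedded content</a>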
I came across an interesting post from one of the bloggers; SEJ shared this on Facebook.
http://explicitly.me/manipulating-google-suggest-results-–-an-alternative-theory. Worth a read
First of all, do you have HTTPS on your website? It would be a good idea to disallow robots, through robots.txt, from indexing sensitive parts of the website (user login pages, etc.).
Then submit a sitemap for the website. (You can also reference your sitemap in robots.txt.)
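A minimal robots.txt along those lines could be (the disallowed paths are only examples; adjust them to your own sensitive sections):
# keep all bots out of the sensitive areas and point them at the sitemap
User-agent: *
Disallow: /login/
Disallow: /account/
Sitemap: http://www.example.com/sitemap.xml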
That should get the ball rolling. You shouldn't expect huge benefits over a short period of time, but it will slowly show up.
In that case, I would do a URL-based redirect to your affinityproperties URL; Joecline doesn't ring many bells. But do not do a domain-level redirect; rather, put in a URL-based redirect.
You can use Open Site Explorer to compare the link profiles, just to make sure that affinityproperties has the better one. If Joecline has a better link profile, I would keep Joecline, since link authority matters in Google rankings.
Based on the version you have on your server, these documents would be helpful:
http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html
http://httpd.apache.org/docs/current/mod/mod_rewrite.html
Please note: this is not something you would want to do unless you know Apache configuration yourself and understand the directives and their logic.
If you are competing in the local space, you might consider enrolling in local directories and Google Places and asking people to write reviews on them. Company directories like Hotfrog could be beneficial to you as well.
Sadly, domain age is still a factor in ranking, and old websites tend to rank better due to their ranking history in Google.
Construction agencies have mock-ups, build-ups and all kinds of work models which they can showcase. That is something you can identify and do well on.
Also, the construction niche is a largely unexplored one where you do not find local blogs catering specifically to it. It would be a good idea to start a blog giving advice, tips and so on.
At the end of the day it's all about engagement, and I am sure that one day you will be able to make it through to the top.
A domain name change is a very tough decision, and changing the domain name to accommodate keywords wouldn't work to bump up rankings.
I would do a deep analysis of the website and do some better interlinking coupled with great inbound marketing. Maybe a viral video or something as well...
Changing the domain won't solve your problem. Remember what happened to meta keywords?
Make a social network or profile website where everyone has their own space to write (something like a personal blog). Then ask people to write reviews: send the customers and users of each professional's services a feedback form with the URL to that professional's profile and ask them to rate and review these guys.
Keep a strict moderation policy and try extracting the useful information and compiling it into a wiki-style storyboard, as a summary of the discussions or a composite real-time bio.
That should keep search engines interested.
Add a mobile sitemap to your GWT (Google Webmaster Tools).
Here is a link that will help you.
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=34648
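If I remember the format correctly (double-check it against the Google page above; m.example.com is a placeholder), a mobile sitemap entry looks roughly like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://m.example.com/page.html</loc>
    <!-- the empty mobile tag marks this URL as mobile content -->
    <mobile:mobile/>
  </url>
</urlset>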
Let me give you an example:
If there are, say, three copies of your webpage:
www.domain.com; www.domain.com/index.php; www.domain.com/home.html
Ideally, you would want everyone to land on the first option, so here is what you could do.
Add a rel=canonical tag to URLs #2 and #3, and in that tag specify the complete URL of option #1. That way, even if Google crawls URLs #2 and #3, it will know that the URL that should be considered is URL #1.
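In other words, on copies #2 and #3 you would put something like this in the <head> (using the example domain from above):
<!-- placed on www.domain.com/index.php and www.domain.com/home.html -->
<link rel="canonical" href="http://www.domain.com/" />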
Unlike a 301 or 302 redirect, where the page is redirected to the URL you want, rel=canonical does not redirect the page.
If you have a canonical tag on your URLs pointing the shadow copies of your webpage to one location, it should be fine. In the long run, however, you should think about setting up a 301 redirect to your homepage URL for the URLs that are shadow copies.
I think it's an issue. Although the link juice passed is not that high, it makes complete sense to host the images on the same server as the website.
As an immediate step, you can add a nofollow attribute to the links just to make sure that no link juice is passed.
On the tag and category pages, you should have a meta robots noindex, follow tag. That would ensure that crawlers (the SEOmoz custom crawl or Googlebot) don't run into duplicate content issues.
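Concretely, that means a tag like this in the <head> of each tag/category page:
<!-- let crawlers follow the links but keep the page itself out of the index -->
<meta name="robots" content="noindex, follow" />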
This was a problem with WordPress blogs as well, which has been handled nicely by now.