Hi,
Try Google Insights (http://www.google.com/insights/search/). You can search over various time frames, get a graph to visualise traffic levels over time, and download the data as CSV for Excel.
Cheers,
Aran
Never used Marketwire; however, I use PR Web regularly for syndicating clients' press releases. I like the service: I get a good spread of sites publishing articles and a decent number of links (embedded in the article through PR Web's online editor).
In my experience, it sounds like your link profile could be letting you down. Use Open Site Explorer to analyse your links and try to clean the profile up.
I always prefer the pipe for the following reasons, though I don't think it has any specific SEO value.
a) it's reader friendly
b) it's a natural separator
c) as Seth says below, it looks cool!
d) What's good for SEOmoz is good for me!
Hi, your question led me on an interesting trip through Google Help, revisiting the basics of Analytics.
Google defines the bounce rate as:
"Bounce rate is the percentage of single-page visits or visits in which the person left your site from the entrance (landing) page" - Google Webmaster Help http://bit.ly/gPPNPj - This makes me think of single-page sites: do they have a 100% bounce rate?
Here's a look at how Google Analytics performs its calculations: http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=77234
What I took from this is that all visit times are taken and averaged, bounce or not.
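If that reading is right, the calculation is just a plain average with bounces counted as zero time on site. A quick illustration (the numbers are made up):

```python
# Made-up visit times in seconds; the two 0s are single-page bounces,
# which contribute zero time on site to the average.
visit_times = [120, 45, 0, 0, 300]
average_time = sum(visit_times) / len(visit_times)
print(average_time)  # 93.0 seconds, dragged down by the bounces
```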
Cheers
Aran
Would it be possible to 'archive' articles after the 1-6 month period?
'Archived' could just be a database flag that keeps the articles from appearing in the article index, thus keeping the same URL but not clogging up the main site with hundreds of links to expired articles.
Hi,
Making Facebook iframe apps is really simple, just like creating any web page; you just have to take into account the width limitations of the iframe. You essentially host the page on your own site and point the iframe at the appropriate page.
Check out this resource: http://bit.ly/jFpqVX
This should get you started.
Good luck, and I'd love to see it when it's done!
Cheers
Aran
I've only used it once, and I abandoned it as it never gave me any actionable data, but YouTube has a tool similar to the AdWords keyword tool, only it focuses on YouTube rather than Google.
You could try the Keyword Difficulty tool in the SEOmoz research tools:
http://pro.seomoz.org/tools/keyword-difficulty
It will give you a rough idea of how difficult it will be to rank, and also the Page and Domain Authority of the top 10 competitors.
Hi, as far as I'm aware, a sitemap is exactly that: a map of one site. Thus it shouldn't be used to map URLs for multiple domains.
To avoid duplicate content, canonical references should be used.
Use a separate sitemap for each domain.
Hi Atul,
Add the trailing slash.
/abc could be a page URL, whereas /abc/ is definitely a folder.
http://www.robotstxt.org/robotstxt.html <-- Everything you ever wanted to know about robots.txt
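To see the trailing-slash difference in practice, here's a quick sketch using Python's built-in robotparser (example.com and the paths are placeholders):

```python
from urllib import robotparser

# "Disallow: /abc/" (with the trailing slash) blocks only the folder's
# contents, not other URLs that merely start with /abc.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /abc/",
])
print(rp.can_fetch("*", "http://example.com/abc/page.html"))  # False: inside the folder
print(rp.can_fetch("*", "http://example.com/abcdef"))         # True: outside the folder
```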
Regards
Aran
[EDIT: Damn it, Ryan submitted whilst I was answering! Must type faster ]
Agreed, very questionable. Maybe he's saving on hosting fees!
In the current scenario, any inbound links to your friend's site will be building the Domain Authority, etc., of the FunktionalDesignStudios domain. So yes, he is boosting his own domain, though technically it will also be boosting your friend's site while it is served from a subdirectory of the funktionaldesign domain.
Hi,
I think it's unlikely that you will get a penalty for using display: none in this way, and Google will still index the content from the product description. If you have the Web Developer Toolbar in Firefox, try turning CSS off and looking at the page for a basic Googlebot view of it. You can see that all the 'tabbed' content that is normally hidden is shown. This may also highlight some other issues on the page (hint hint)...
Cheers
Aran
Hi,
Working in SEO, I always cringe when a designer mentions Flash. However, my phobia of Flash is old school, and maybe I need to get with the times. Google will crawl textual content in Flash, including anchor text. It will also follow links in Flash and presumably applies the same 'link juice' factors that it would normally apply to links.
Here's an article on Flash and crawling textual content: http://bit.ly/dDXYoc
Hope this helps.
Aran
You are probably experiencing this error because of the stray "> in the URL you are trying to redirect.
It looks like the crawl has highlighted this as an error because it found a malformed link pointing to the Paper Airplane Flight page. You probably have a link that looks something like ">paper airplane flight. Notice the extra "> after the link URL.
You need to find the offending link on the site and fix it. That will solve the error.
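On a larger site, a rough Python sketch like this can help find href values polluted with stray quote characters (the sample HTML is invented to mimic the bug):

```python
import re

# Invented sample mimicking the bug: doubled-up quoting leaves the href
# value itself ending in '">'.
html = "<a href='/paper-airplane-flight\">'>paper airplane flight</a>"

# Grab each quoted href value and flag any containing quote/angle characters.
for quote, href in re.findall(r"href=(['\"])(.*?)\1", html):
    if any(c in href for c in "<>\"'"):
        print("malformed link target:", href)
```

This prints the polluted target so you can grep your templates for its source.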
For some of the phrases I'm currently working on, all of the top ten SERPs have the phrase in the URL. I'm not saying it's a factor, but I wouldn't like to leave it out of the equation just yet.
As EGOl and iNet state, using Flash for a homepage isn't a great idea.
You will need links outside of the flash to ensure crawlers find deeper pages.
If you do optimise sub-level pages over the homepage, then fancy homepage Flash becomes inert, as most visitors are going to land on sub-pages, bypassing the homepage.
Include Flash by all means, but make it a page element, not the entire page.
jQuery, on the other hand, is awesome, and if done correctly it will not hinder your SEO.
Hi Damon,
Get the SEOmoz Toolbar; it has a function to highlight nofollow links. Very handy!
http://www.seomoz.org/seo-toolbar
Regards
Aran
Not sure if that's the correct TinyURL you've posted! lol.
Hi,
Allow: / isn't part of the original robots.txt standard (major crawlers such as Googlebot do support Allow, but anything that isn't disallowed is allowed by default, so the line is redundant anyway).
Other than that, it all looks good. Perhaps the 200 or so links to blocked pages were indexed before the robots.txt was last updated with the disallows?
Regards
Aran
PA = Page Authority and DA = Domain Authority. These are values SEOmoz uses to "emulate" PageRank (emulate may not be the best choice of words).
You can access these metrics via the SEOmoz toolbar or Open Site Explorer.
I try to go a little outside the box on things like this, particularly when the subject is dull (I work a lot with insurance!).
How-to guides are always a winner with niche bloggers, as are top-10 articles, etc.
Are these articles to be hosted within your site to act as additional content landing pages for particular keywords, as part of a link-baiting campaign, or for an external article marketing campaign?
In any case, you should vary your titles to prevent duplicate content issues. In terms of links, I agree with Paessler: vary the anchor text. But if the article is a great one with really useful information, why not contact a few niche websites/bloggers and send them a link? Perhaps you can pick up a few natural links!
Hi Yannick,
The time can vary hugely depending on the competitiveness of the terms you're targeting.
A general rule of thumb I use: wait a month and check the Open Site Explorer results for improvements in PA and DA. If these metrics have improved, you should see some positive SERP movement.
Regards
Aran
My best tip for shopping sites is to ensure your content is unique and well written, giving as much genuine information on the product as possible. I always like to include bullet points to highlight features too.
Allowing customers to rate and comment on products is a great way to build up some awesome UGC on product pages.
Both of these options will help SEO and potentially help conversion too.
Including social sharing buttons so people can Like/Tweet products can't hurt either.
Hi Simon,
Sound advice. After a few days of head-scratching, I worked with the developers to create a new routine that generates page titles as products are added to the site. There are way too many products to handle it manually, so the programmatic approach is a must.
With the data we have, we have managed to vary things enough to generate 'near-unique' page titles and meta descriptions.
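For reference, the routine boils down to something like this rough sketch (the field names and the 65-character cap are invented for illustration, not our actual schema):

```python
# Rough sketch of a title-generation routine; the product field names
# and the 65-character cap are assumptions for illustration only.
def page_title(product, max_len=65):
    bits = [product["name"]]
    for field in ("brand", "category"):
        if product.get(field):
            bits.append(product[field])
    return " | ".join(bits)[:max_len]

print(page_title({"name": "Widget Pro 3000", "brand": "Acme", "category": "Gadgets"}))
# Widget Pro 3000 | Acme | Gadgets
```

The meta descriptions are generated along the same lines, just from a longer template.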
I just wondered how others may do it.
Cheers for your input Simon.
Regards
Aran
EDIT: correcting my typos