You will need to check the server error log files for the new site in order to pick these up, if they haven't been detected in WMT.
Posts made by Vahe.Arabian
-
RE: Is 301 redirecting all old URLS after a new site redesign to the root domain bad for SEO?
-
RE: Which Large Guest Blog Network is Cutts talking about?
Guys,
MyBlogGuest was confirmed as the penalised site - http://searchengineland.com/googles-matt-cutts-weve-taken-action-large-guest-blog-network-187028
-
RE: How can I estimate annual traffic for this deep link?
Moz can only show you traffic data once you have connected your profile with Google Analytics; even then, my understanding is that it only covers organic search and social.
I'm afraid it would be difficult to get direct traffic estimates for a specific site unless you are receiving referral traffic from it or actually have access to their web analytics data. That said, competitive intelligence tools like SimilarWeb and SEMrush provide broad website traffic estimates.
-
RE: New google serps page design
Guys, you should check out Dr. Pete's post, which Moz published today. The study outlines that titles get truncated after 50 characters on average, and after 44 characters for titles in all caps.
http://moz.com/blog/new-title-tag-guidelines-preview-tool
I guess, at the end of the day, Google is trying to get people to write more meaningful titles, and the SERP redesign is just a step in that process.
Regards,
Vahe
-
RE: Whether letting an old category just 404 out is OK
Hi Bob,
Just to extend on Jane's point and answer your question: if you 301 redirected every single page on your site, it would over time slow down your website, and if a lot of those pages don't have strong authority it might be detected and potentially harm your site's rankings. The Quora post below provides additional reasons.
http://www.quora.com/Why-should-one-avoid-overusing-301-redirects
-
RE: Does my website need a search bar?
A lot of site search functions don't provide accurate results, so I would agree with you on not having one. Focus on good SEO instead: organising blog posts into specific categories and using proper internal linking to encourage dwell time and lead users to where you would like them to end up is far more valuable (but more work).
-
RE: Need a keyword tool for the whole company
Tim,
From my understanding, anyone who has a Google Account and signs up to Google Keyword Planner can use the tool for free.
Hence it is something that can be used by everyone.
-
RE: Secondary Domain Outranking Master Website
Thomas,
I reckon this is just a geo-personalisation factor and not an issue with site authority or optimisation. What I mean is that because you are based in the US, and the IEEE USA website is more strongly associated with a US audience, Google is likely to rank the US domain first.
When I do a Google US search I see the exact same "issue" you are having - https://www.google.com/search?q=ieee&gl=us&pws=0
But I don't see the issue when searching from Australia - https://www.google.com.au/search?q=ieee&pws=0. In fact the US domain doesn't appear at all on the first page.
To have the IEEE.org domain outrank the US one, there are two on-page elements you could potentially focus on:
1. Content - IEEE USA has a lot more content implicitly targeting "IEEE" on its homepage. Providing overview content about the organisation can help give users context and initial value: a brief background on the organisation and how they can use the site.
2. Canonicalisation - Cross-link between all IEEE-owned websites (maybe in the footer) to signal to Google that these sites all essentially sit under the IEEE brand. And because the IEEE.org site is the most authoritative, there is a higher likelihood of it outranking the others.
I hope this helps.
Regards,
Vahe
-
RE: Whether letting an old category just 404 out is OK
Hey Bob,
If these pages were clearly hidden in the first place, then from my perspective there is no value in them anyway.
Having unimportant pages return 404 errors is OK. We only preserve the existing link juice or page authority of pages that have been taken down or reoptimised, in order to maintain existing SEO efforts or to direct users to an alternative page.
I hope this helps.
Regards,
Vahe
-
RE: How long takes to a page show up in Google results after removing noindex from a page?
You can do two things in Google Webmaster Tools to identify how long it will take for a page to be indexed, or even to speed up the re-indexation process:
1) Use Google's crawl rate and indexation reports
2) Use the Fetch as Googlebot tool
-
RE: Wordpress shortcode or plugin for current time
Hi Kerry,
I'm no expert at this solution, but I found this link which might help with your situation - http://botcrawl.com/display-date-and-time-on-wordpress-using-php-shortcodes/
They basically use PHP functions, so it shows your site's server time and date rather than the user's.
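A minimal sketch of that kind of shortcode, assuming a standard WordPress install (the shortcode and function names here are just examples, not taken from that post):

<?php
// Registers a [current_datetime] shortcode that prints the server's date and time.
// Could live in the theme's functions.php or a small plugin.
// date_i18n() with current_time() uses the timezone and formats configured under
// Settings > General, so the output reflects the site's clock, not the visitor's.
function example_current_datetime_shortcode() {
    $format = get_option( 'date_format' ) . ' ' . get_option( 'time_format' );
    return date_i18n( $format, current_time( 'timestamp' ) );
}
add_shortcode( 'current_datetime', 'example_current_datetime_shortcode' );

You would then drop [current_datetime] into any post or page where the date and time should appear.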
BTW that person from Australia wasn't me
Hope this helps.
Regards,
Vahe
-
RE: A tool to tell a websites estimated traffic
Hi Ryan,
I'm pretty sure that SEMrush does provide historical data; you just need to purchase the Guru or higher plan to gain access to it.
You're right about Compete - it is limited to the US only.
-
RE: Are Meta Descriptions Really Necessary?
Hey JP,
I agree with the above comment, as meta descriptions are part of your website's real estate on Google's search results. However, based on Moz's SEO guide, you are correct in saying that you don't always need to write a meta description. Read here for more information - http://moz.com/learn/seo/meta-description
-
RE: A tool to tell a websites estimated traffic
I would suggest you look specifically at competitive intelligence tools like Compete.com or SEMrush, as they can provide somewhat accurate estimates.
Here's a reference from SEMrush on how they collect their data:
"SEMrush analyzes keywords in the first 20 search results in Google. We now have over 100 million keywords and 71 million domains from the Google top 20. We use our Live Update algorithm to keep our 25 regional databases fresh."
-
RE: Tool that can retrieve mysite URL's
Or run a crawl test with the Moz Pro tools.
-
RE: Tool that can retrieve mysite URL's
That would show what's indexed (which is most pages), but not all of them.
-
RE: Tool that can retrieve mysite URL's
The Screaming Frog SEO Spider tool should be able to help you with this. However, to crawl more than its 500-URL limit, you will need to purchase a licence key.
http://www.screamingfrog.co.uk/seo-spider/
Good luck.
Regards,
Vahe
-
RE: What happened to moz Crawl Test? Is it moved in the redesign?
It is now being redirected to the tools section. Does this mean it is now integrated into Moz Analytics, and if so, when will it be available?
-
RE: Has Bing rolled out an algorithm update?
They don't really provide any algorithm update news, but I would hazard a guess that the Webmaster Tools changes they are currently making might be a correlating factor.
Check out their blog for more information - http://www.bing.com/blogs/site_blogs/b/webmaster/default.aspx?pi4743=1
-
RE: In-house search engine
Have you considered Google Site Search or Google Custom Search?
Here's some more information - http://support.google.com/customsearch/bin/answer.py?hl=en&answer=72326
-
How to set up catch-all 301 redirects alongside separate 301 redirects for the key pages
Hi,
We've done a site migration, mapping and 301 redirecting only the site's key pages. However, GWT (Google Webmaster Tools) is picking up a massive number of 404 errors and there has been some drop in rankings.
I want to protect the site from further decline, and so thought about doing a catch-all 301 - that is, 301 redirecting the remaining pages found on the old site back to the home page, with the future aim of going through each URL one by one and redirecting it to the most relevant page.
Two questions,
(1) Can I do a catch-all 301, and if so, what are the process and requirements I have to give to the developer?
(2) How do you reduce the growing number of 404 errors on a site, despite doing 301 redirects and updating links on external linking sites?
Note: The server is Apache and the site is hosted on the WordPress platform.
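For what it's worth, the kind of rule I have in mind is roughly the sketch below, done at the WordPress level (the function name is just an example, and this is only an illustration of the idea, not something already in place):

<?php
// Hypothetical sketch of a catch-all 301: any request that would otherwise
// return a 404 is permanently redirected to the home page. Specific page-to-page
// 301s (e.g. in .htaccess) would still be handled before WordPress gets this far.
function example_catch_all_301() {
    if ( is_404() ) {
        wp_redirect( home_url( '/' ), 301 );
        exit;
    }
}
add_action( 'template_redirect', 'example_catch_all_301' );

The idea would be to keep the one-to-one redirects for the key pages as they are and let a rule like this mop up everything else until each remaining URL is mapped properly.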
Regards,
Vahe
-
RE: Getting individual website pages to rank for their targeted terms instead of just the home page
Hi Alex,
Thanks for the response. I should have clarified that the keyword target for the home page would be updated. The issue is that the home page gets the bulk of the traffic and keywords, which makes me think it's an internal linking issue. Your thoughts?
-
Getting individual website pages to rank for their targeted terms instead of just the home page
Hi Everyone,
There is a pattern I have noticed when trying to get individual pages to rank for their allocated target terms during an SEO campaign, and I would be keen on anyone's thoughts on how they have effectively addressed this.
Let me try and explain this by going through an example:
Let's say I am a business coach and already have a website that includes several of my different coaching services. For this SEO campaign, I'm looking to improve exposure for the client's "business coaching" services. I have a quick look at analytics and rankings and notice that the website already ranks fairly well for that term, but from the home page rather than the service page.
I go through the usual process of optimising the site (on-page: content, meta data, internal linking) as well as a link building campaign over the next couple of months. However, this results in either just the home page improving, or the business coaching page improving while the home page's existing ranking suffers, so the site doesn't benefit overall.
My question: if a term already ranks or receives a decent amount of traffic via the home page rather than the page it's supposed to, why do you think that is the case, and what would your approach be to shift that traffic to the individual page without impacting the site too much? Note: the home page's keyword target term would also have been updated.
Thanks,
Vahe
-
RE: Branded & Non-Branded Keywords
Hi Mohamed,
The SEOmoz tool can help you extract your most important keywords; however, you might feel that it is not enough, depending on your keyword research methodology.
To help maximise your keyword research, ask yourself the following questions (credit to Distilled U):
1) In one sentence, what is the core business of your website?
2) How does your business make money?
3) What products or services do you offer?
4) Describe your ideal customer or target market?
5) Are your services location specific?
6) Are there any specific graphic design terms for your product/services?
If you are able to answer these questions effectively, along with doing some competitor analysis and topical research across blogs and social media platforms, plus using other keyword idea tools (e.g. Soovle, Google Trends), suffice to say you will have conducted thorough keyword research and built a much stronger list than by relying on the SEOmoz tool alone.
Good luck and hope this helps.
Regards,
Vahe
-
RE: Canonical referencing and aspx
Hi Gutam,
Based on the URLs you provided, it seems that your website is built on .NET, as the problem you mention is a common one for these types of sites.
Assuming that your website is served from IIS, it would be best to install both the IIS SEO Toolkit and the URL Rewrite module on your server.
Use the IIS SEO Toolkit first to identify all the technical SEO problems, and then the URL Rewrite module to redirect and create your search-friendly URLs.
Dave Sottimano (from Distilled) has written a good post on using the IIS SEO Toolkit for site analysis - http://www.seomoz.org/blog/what-every-seo-should-know-about-iis
Here's one pretty good post (a bit outdated) on how to deal with the most common URL errors using the URL rewriter - http://weblogs.asp.net/scottgu/archive/2010/04/20/tip-trick-fix-common-seo-problems-using-the-url-rewrite-extension.aspx
Good Luck!
Vahe
-
RE: Google keyword tool no data for a KW- what to assume in this case?
Hi Vikas,
When local volume is 0, it means there are not enough (or no) people searching for that term.
Assuming you believe your targeted term is the right one and that you are using exact match, try the following:
- Use phrase or broad match to find any similar terms that do have search volume
- Use Google Trends to see any correlating search patterns
- Maybe try broadening the types of keywords you are using; keyword suggestion tools such as Ubersuggest help with this
Hope this helps,
Vahe
-
RE: Books about Content Marketing & Persona creation?
Nikolas,
All these books on this page are probably the best of the crop - http://www.amazon.com/s/ref=nb_sb_noss_1?url=search-alias%3Dstripbooks&field-keywords=content+marketing#/ref=sr_pg_1?rh=n%3A283155%2Ck%3Acontent+marketing&keywords=content+marketing&ie=UTF8&qid=1333154670
I've read "Content Strategy for the Web" and personally I think it's not too bad.
Good luck,
Vahe
-
RE: Text Link vs image link?
Hi Samuel,
Text links will definitely pass more link juice than an image link, because crawlers can read HTML text more easily than the alt attribute of an image, which is more difficult to parse.
Also, in my experience, people linking to a site with an image often don't include an alt attribute at all.
Hope this helps,
Vahe
-
RE: Should we add our site to Google Webmaster Tools
Hi Bob,
I think you should definitely add nlpca to Google Webmaster Tools, as it provides clues about the general health of your site.
In GWT you can look at some of the following clues for drops in ranking:
- Malware errors
- Crawl rates
- Duplicate meta data
- Changes in search query/landing page impressions and positions
- Recent server-side issues, e.g. if a certain page has been detected as a 404, etc.
Hope this helps,
Vahe
-
RE: What are tier 1, tier 2, tier 3 keywords (pages)?
Hi Vince,
I think they refer to primary, secondary or tertiary keyword targets used for the page; primary being the main keyword used to optimise the page.
Hope this helps,
Vahe
-
RE: How To Create Dynamic WordPress Tags
MyNet, I would encourage you to write unique meta descriptions, as to some extent they can determine your site's CTR.
-
RE: Duplicate Page Titles
Hi Erik,
If there is good intent (which there is), this is normal. It's similar to when SEOmoz records the number of 301s - you are looking to see whether what you have implemented has actually been noticed. In general, for your scenario I would have done a 301 redirect using the .htaccess file to strengthen the domain authority. In addition, Erik, check Google Webmaster Tools to make sure no other duplicate page titles are appearing for the site.
Hope this helps,
Vahe
-
RE: Social Sharing Buttons: SEO friendly ?
Hi utatiger,
Did you mean social-friendly sites whose buttons you can add to your site?
Try Google+, Facebook, Twitter, LinkedIn, StumbleUpon, Reddit, Digg and Paper.li. Here's a video on the top 15 referral sites - http://www.youtube.com/watch?v=8kN6-Opsqv8&list=UUvOLeOv_0_ijT8JlmxAFmGA&index=3&feature=plcp
Regards,
Vahe
-
RE: URL shortener and backlinks
It would be better to use the complete address for the following reasons:
(1) The complete address can be cited across other websites, which can help your site build more links.
(2) Depending on the URL shortener service, the short link may or may not use a 301 redirect (Google's does), so either way you would be losing some of the link juice value you would otherwise have received for the page.
Hope this helps,
Vahe
-
RE: Find competitors based on keywords
Competitive intelligence keyword research tools (e.g. SEMrush) do this.
-
RE: Over optimization penalty on the way
It's part of their larger effort to update their algorithm around semantic search - http://searchengineland.com/wsj-says-big-google-search-changes-coming-reality-check-time-115227
The next generation of search will arrive within a year.
-
RE: Google Analytics - Goal tracking question
Hi kynduvme,
You need to alter the GA code for multiple-domain tracking - http://gaconfig.com/multiple-domains-with-subdomain/. It's best to create a new profile to prevent any data skew.
You would then add the goal conversion tracking as normal.
Hope this helps,
Vahe
-
RE: How do I get rid of rogue listings?
As you have your own verified listing, I would suggest that you continue to try to get those other listings removed by reporting them to Google.
One way you can demonstrate that your verified listing is the "credible" and only one is through local citation link building and getting people to place reviews on your listing.
That extra activity will hopefully push the other two down in the SERPs until they are eventually removed from Google.
Good luck,
Vahe
-
RE: How critical is Duplicate content warnings?
For pagination problems it would be better to use the rel="next"/rel="prev" canonical method - http://googlewebmastercentral.blogspot.com.au/2012/03/video-about-pagination-with-relnext-and.html
Having duplicate content in the form of paginated results will not penalise you; rather, the page/link equity will be split between all these pages. This means you would need to spend more time and energy on the original page to outrank your competitors.
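For reference, the tags that method relies on look roughly like the sketch below (the URLs and page numbers are placeholders, and I'm assuming a PHP-templated site purely for illustration):

<?php
// Hypothetical sketch: emitting rel="prev"/rel="next" tags in the <head> of a
// paginated category page so Google treats the series as one sequence.
$base = 'http://www.example.com/category/widgets/';
$page = 3;   // current page in the series
$last = 10;  // last page in the series
if ( $page > 1 ) {
    printf( '<link rel="prev" href="%spage/%d/">' . "\n", $base, $page - 1 );
}
if ( $page < $last ) {
    printf( '<link rel="next" href="%spage/%d/">' . "\n", $base, $page + 1 );
}

On the first page you would only output the rel="next" tag, and on the last page only the rel="prev" tag.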
To see these errors in Google Webmaster Tools, go to the HTML Improvements section, where the site's meta data is reviewed. I'm sure you'll find the same issues reported there, rather than under sitemaps.
So to improve the overall health of your website, I would suggest that you do try to verify and resolve this issue.
Hope this helps. Any issues, best to contact me directly.
Regards,
Vahe
-
RE: Exact keyword URL or not?
In my experience, targeting the higher-traffic term usually captures traffic/rankings for the lower-traffic term as well, since they are related.
I would go with "shoe" but optimise the content for both "shoe" and "shoes". In addition, you would need to build links for both terms so that you can improve rankings for both.
Hope this helps.
V
-
RE: Long page - good or bad?
You need to keep in mind that, by their nature, blog posts/articles are pages where users typically look for information and, once they find it, leave. In general these types of pages have a bounce rate above 80%, so it won't affect your standing with Google.
However, in working with legal sites, what I have found is that they purposely write long pieces of copy simply because they can. To be honest, most people won't read through all of it, given the patience and understanding that all the technical terms demand; the target market is typically middle-class income earners.
So recommend that your client write content that gets to the point, rather than creating additional pages. More pages do not mean a better standing with Google. However, if you know for a fact that users typically ask certain questions, put those on an FAQ page.
Regards,
Vahe
-
RE: What is the most SEO friendly Shopping Cart?
Hi,
Magento, together with this extension - http://www.mageworx.com/seo-suite-ultimate-magento-extension.html
Magento is robust and a lot more SEO friendly, as you can deal with the canonical issues most carts experience.