Hello,
Are there any plans to expand Moz Local to Canada?
In the meantime, does anyone have a suggestion for a similar tool for Canadians?
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Is there much difference in the recovery process for either [Penguin or manual link penalty]?
Theoretically no, practically yes.
A manual penalty will be reviewed by the Google Spam Team. If you are not successful at removing the links, you will need to provide extensive documentation of the steps you took to attempt removal. When Google manually reviews links, they will not remove the penalty simply because you adjusted anchor text. If a link is spammy, it needs to be removed regardless of its anchor text.
A Penguin penalty can be removed algorithmically. Many SEO companies are simply manipulating anchor text rather than removing the spammy links, and they are getting away with it to at least some degree...for now. Another tactic is to "drown out" the links penalized by Penguin with other spammy links which do not use anchor text. These solutions are quite bad, as such sites are subject to future penalties as Google improves its algorithms.
You rank #1 in Google.com for "refund fx", which seems to be the focus of your home page.
The population of Australia is around 22 million. In comparison, the world's population is 7 billion.
When you compare google.com.au to google.com, it is a completely different ball game. You can rank #1 in google.com.au but not even make the top 200 in google.com.
If you wish to improve your ranking in google.com, you need to sharply increase the quality of your SEO. For example, your pages all show the Australian flag in the upper-left sidebar. That does not suggest a company which wishes to have strong international appeal.
If I want to target a keyword phrase on a particular page, but do not want to change the URL of that page, will that negatively impact my rankings?
A better way to say it is you are missing an opportunity to make a change which can positively impact your rankings.
A page's URL has a very minor direct effect on rankings. There are two larger secondary effects. First, if you offer a clear, relevant URL, your click-through rate may increase. Additionally, when others link to your page by copying and pasting the URL, you will naturally have good anchor text, which is very helpful, especially in our post-Penguin world.
Stefan correctly shared that there is no reason to create a new short URL.
Google ignores the hash fragment (the # and everything after it) when indexing URLs. You can offer your home page with various fragments appended to the end of the URL and Google will not mind a bit. It will not cause any issue for SEO.
A few more notes:
If you search Google.com for "Guitar History" you will notice the Wikipedia page is listed first (see attachment). The URL offered by Google is the page URL without any fragment. Google does offer the ability to "Jump to History", which uses the fragment link. That is a benefit of using named anchors on a page. Otherwise, Google does not take the fragment, nor anything after it, into account when indexing pages.
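The "Jump to" link Google shows comes from a named anchor in the page itself. A minimal sketch (the id and copy are illustrative, not taken from the actual Wikipedia page):

```html
<!-- Linking to this page as /guitar#History scrolls straight to the section,
     which is what powers Google's "Jump to History" link. -->
<h2 id="History">History</h2>
<p>The modern guitar descends from earlier stringed instruments...</p>

<!-- An in-page table of contents pointing at the anchor -->
<a href="#History">Jump to History</a>
```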
Rand offers a short video on this exact topic: http://www.seomoz.org/blog/whiteboard-friday-using-the-hash
I am not familiar with the exclamation point (bang) being used after the hash outside of Twitter, whose standard URLs use it.
Summary: the hash bang is not the reason for your recent drop in rankings.
I am unclear what you mean by "Google still has thousands of the old hashbang (#!) URLs in its index." Can you share an example?
Crimson offers a great reply and gets a thumbs up from me. I'll just add a bit.
Whether or not you submit a sitemap, Google will visit your site as long as it knows the site exists. If your site offers solid navigation, there is absolutely no need to submit a sitemap. Google will find and crawl all of your pages. If you have coding issues, navigation issues, island (orphan) pages, etc., then a sitemap is helpful so Google can become aware of pages it would otherwise not be able to find.
With the above noted, a sitemap is easy to set up and automate. You can pretty much "set it and forget it", so it is still a good practice. On to your questions:
1. It's your call. If a page is linked to in your main navigation, such as About or FAQ, then Google should find it 100% of the time. There is no need to include it in your sitemap, but there is no harm either. Either way works.
2. Yes, as per the above: as long as Google can find a page, it will index it. You can even have horrible coding and navigation, and Google may still locate your pages if you have earned external links to them from credible sources.
3. Last I checked, a sitemap can hold 50,000 URLs. If your site has more than 50,000 URLs, you can break it up into multiple smaller sitemap files. The advice Crimson shared is correct.
In summary, if you implement all best practices in your site design and do not have any island pages then a sitemap is not needed but it is a nice backup.
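For reference, a site over the 50,000-URL limit is split by publishing a sitemap index file that points at the smaller sitemaps (the domain and filenames below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-2.xml</loc>
  </sitemap>
</sitemapindex>
```

Each file referenced in the index is an ordinary sitemap holding up to 50,000 URLs of its own.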
**How do I tell Roger not to crawl these blank pages?**
An easy solution is to block Roger in robots.txt:
User-agent: rogerbot
Disallow: [enter pages you do not wish to be crawled]
But a better solution would be to fix the root problem. If your only goal is to provide clean reporting to your client the above will work. If your goal is to ensure your site is crawled correctly by Google/Bing, then Jake's suggestion will work. You can help Google and Bing understand your site by telling them how to handle parameters.
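If you want to sanity-check a rule like the one above before deploying it, Python's standard library can evaluate robots.txt rules against specific URLs. A small sketch; the /print/ path and example.com domain are hypothetical:

```python
from urllib import robotparser

# Hypothetical robots.txt blocking Moz's crawler (rogerbot) from a /print/ section
rules = """User-agent: rogerbot
Disallow: /print/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# rogerbot is blocked from the disallowed path; other crawlers are unaffected
print(rp.can_fetch("rogerbot", "http://example.com/print/page1"))   # False
print(rp.can_fetch("rogerbot", "http://example.com/about"))         # True
print(rp.can_fetch("Googlebot", "http://example.com/print/page1"))  # True
```

This only checks the robots.txt logic itself; it will not tell you whether the blank pages should exist in the first place.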
I would prefer to fix the root issue though. Do the pages which are being reported as duplicate content have the "noindex" tag on them? If so, you can report the issue to the moz help desk (help@seomoz.org) so they can investigate the problem.
Hello Joe.
I have experienced this issue many times. I believe it is a bug in the mozbar. I have often found the bug resolves itself after a period of time. Sometimes if you reload the page it works correctly, other times not.
The most successful way to resolve it for me is switching from Chrome to Firefox browser. I like the SEOmoz toolbar in FF better anyway.
Can you share what changes have been made to the site? A few ways this can happen are:
a change to the robots.txt file
a change to your site's template either removing a canonical tag, a noindex tag, or altering your pagination in any way such as modifying paginated titles
resolving an onsite issue which prevented crawling of these pages
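For reference, the two tags mentioned above live in each page's head, so a template change can silently expose previously hidden pages to crawling (the URL is a placeholder):

```html
<head>
  <!-- Tells crawlers not to index this page -->
  <meta name="robots" content="noindex">

  <!-- Points duplicate or paginated variants at the preferred URL -->
  <link rel="canonical" href="http://www.example.com/preferred-page">
</head>
```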
There are three areas of SEO you likely wish to address: your website itself, your content, and offsite factors.
It sounds like you addressed some of the onsite areas such as code validation and removing the meta keywords tags. There are dozens of areas involved in onsite optimization. A few more areas to consider:
do you offer trust badges such as BBB, McAfee, TRUSTe, etc.? Are there any industry-based affiliations you should present?
are all your SEO tags optimized? Title, meta description, etc.?
do you offer live chat? a toll-free number for visitors to contact you?
do you engage in social media? You can share your social accounts, offer visitors the opportunity to like / tweet / +1 your articles, etc. and allow for "single sign on" so visitors can sign in with their social account.
For your content, is it truly fantastic? Is it compelling? Is it authoritative and accurate? Is it written at a readability level suited for your audience?
For offsite SEO, we recommend the following webinar: http://www.seomoz.org/webinars/future-of-link-building
You are likely seeking SH404SEF: http://anything-digital.com/sh404sef/seo-analytics-and-security-for-joomla.html
We use Yoast's SEO plugin on all our WP sites, and the SH404SEF plugin on all our Joomla sites. Like Yoast, the SH404SEF extension uses best SEO practices, and the author reads SEOmoz. If I had to compare the two extensions, Yoast's would be considered better, if for no other reason than it is more user friendly. Nevertheless, the SH404SEF extension is awesome.
What I mean is, let's assume you were to get a link from the BBC or CNN: how long does it take for that link to have a positive impact on your rankings?
The question is a bit vague so I need to make some assumptions. Feel free to correct me if I make any errors.
First, I am presuming you are referring to Google. Each search engine handles links differently. Next, I presume by "take effect" you mean the link will offer a positive benefit to the site's rankings. You could be referring to PR or even SEOmoz PA.
When Google discovers a link the target page will immediately benefit from the link. If you receive a link from the home page of CNN you will likely notice the benefits within minutes of the link being published. If your link is deeper in the site it may take longer to be discovered but most likely within a couple hours. If you receive a link from other sites the link may never be discovered or it may be deemed spammy and offered no value.
**If you do get a link, is it normal that Google would crawl your site, and if so, do you see this reflected in Google's cache immediately?**
Once again, there are numerous factors involved. If you receive a link from a web page with 1000 other links, then Google may not follow the link at all. If you receive a link from the home page of CNN it is highly likely Google will follow it, but they may not do so on the initial visit. Whenever the link is followed, you can expect for Google to update their data.
Would I be ruining my SEO work if I begin to publish blog posts for the same keywords that my content pages target? Am I basically forced to find alternative keywords and only target one page per keyword?
In short, yes.
When Google provides search results they need to search trillions of pages to determine which result is most likely to satisfy a user's query. One of the key components of their algorithm is relevancy. If you have a page titled "chocolate ice cream" and then a blog article with the same title, which result should be returned to a user who searches in Google for "chocolate ice cream"?
If you offer multiple pages with the same keyword focus, you run into an issue called keyword cannibalization. You can solve that issue by narrowing the focus of one of the pages. For example, the main page on your site is what I would refer to as "evergreen" content: 10 years from now someone can read that page and the information will likely still be valid. Your blog, in contrast, offers fresh content which is more time-sensitive. Some possible topics for an article:
Top 10 Chocolate Ice Creams in the world
Lowest Calorie Chocolate Ice Cream
Chocolate Ice Cream Recipes
I would also recommend being very careful when providing content on two similar keywords. It takes a level of expertise to do it in a way that adds value to your site. One helpful step is to use anchor text. If you write an article on "Chocolate Ice Cream Recipes", then once in the article, when you refer to "Chocolate Ice Cream", present it as an anchor link to your main page.
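As a minimal sketch of that internal anchor link (the URL and surrounding copy are illustrative):

```html
<!-- Inside the "Chocolate Ice Cream Recipes" article -->
<p>Start with a quality base of
  <a href="http://www.example.com/chocolate-ice-cream">chocolate ice cream</a>
  before experimenting with mix-ins.</p>
```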
Your site is very poorly optimized. You would benefit a lot by implementing basic SEO best practices. A good place to start is here: http://www.seomoz.org/beginners-guide-to-seo
Regarding Bing/Yahoo, Bing has a 10-year agreement to control Yahoo's search results, so they are basically the same company from a search-result perspective. They have different ownership than Google and operate differently. If you have an outstanding site your results may be aligned, but otherwise there are often differences.
What keyword are you trying to rank for? A best practice is to focus on a single keyword per page. Your home page title is: Roof Installation, Roofing Contractor, Siding Contractor Chicago, IL. Apparently you are trying to rank for multiple terms, which will not yield the best results.
Another signal of keyword focus is the H1 tag, which is missing from your home page.
When I search Google for "All American Exterior Solutions" you rank #1. If you want to rank for "roof installation" there is a lot of SEO work to perform. Your home page only uses the term a single time in content and it is not in the first sentence so it does not appear to be the focus of the page. You also have another page of your site which is better optimized for the term: http://www.aaexs.com/residential/roofing
Also, your footer is incredibly spammy.
Google desires to return relevant, quality pages in their search results. You may be a great company but your web pages need to improve in order to rank in Google for the terms related to your business.
Roughly once per month. There are several factors which affect the schedule so it seems to vary. Here is a link to the schedule: https://seomoz.zendesk.com/entries/345964-linkscape-update-schedule
It just updated on May 31st, and the next update is scheduled for June 27th.
Do you think the English home page will have more SEO power if it goes directly to http://www.website.com/?
In short, yes. When you redirect any URL, you will lose between 1% and 10% of the link juice from your backlinks. That does not sound like much, but since it affects every link, it is something to consider.
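If you do move the English home page to the root, a 301 (permanent) redirect from the old English URL retains most of that link juice. A minimal Apache sketch, assuming mod_rewrite is available and the old page lived at a hypothetical /en/ path:

```apache
# .htaccess: permanently redirect the old English home page to the root
RewriteEngine On
RewriteRule ^en/?$ / [R=301,L]
```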
Are the overwhelming majority of your visitors English readers? If so, having the site default to the English version makes sense.
Depending on the nature of your site, you may wish to offer pages for the various types of English such as EN-UK, EN-US, etc.
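Those regional variants can be declared with hreflang annotations in each page's head. Note the registered language codes are en-GB and en-US rather than EN-UK; the URLs below are placeholders:

```html
<link rel="alternate" hreflang="en-gb" href="http://www.website.com/en-gb/">
<link rel="alternate" hreflang="en-us" href="http://www.website.com/en-us/">
<link rel="alternate" hreflang="x-default" href="http://www.website.com/">
```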
Jarin,
I cannot specifically answer why your site's PR is not available. There are numerous possibilities including a glitch in the latest data feed which offers PR. I was able to verify using a couple tools the PR for your website is not available.
I understand your concern. Penalized websites will show the same results you are seeing, a PR of n/a or a grey bar. Since you are ranking #1 for numerous keywords it seems clear you are not penalized.
I do not look at the PR for any of my sites. It has no tangible value. You can choose to agree or disagree, but I have worked with many sites and helped them earn top rankings without looking at PR at all. If you don't wish to take my advice, perhaps you will accept the same advice directly from Google.
In 2009, Google discontinued the practice of regularly updating toolbar PR, and now they only refresh the information about 3-4 times per year. Susan Moskwa from Google wrote:
"the PR you see publicly is different from the number our algorithm actually uses for ranking. Why bother with a number that’s at best three steps removed from your actual goal, when you could instead directly measure what you want to achieve? "
For further details read the full article: http://googlewebmastercentral.blogspot.com/2011/06/beyond-pagerank-graduating-to.html
Hi Vince.
What exactly do you mean by "PR"? Google internally uses PageRank as one of over 200 metrics when evaluating a page's placement in results. Google does not update that information publicly except 3-4 times per year. PR has almost no value in daily SEO, as Google itself has clearly and repeatedly stated. You may wish to read Susan Moskwa's discussion of this topic, along with her advice not to use PR as a metric for evaluating websites: http://googlewebmastercentral.blogspot.com/2011/06/beyond-pagerank-graduating-to.html
One possible scenario where a website's internal page could have a higher PageRank than the home page would be if the site performed an important interview. For example, if a local website conducted an interview with the Vice-President, the internal page offering that article could receive coverage and links from hundreds of important sources around the country. The PR value of that page could then exceed that of the home page.
For an actual example, the best I can offer is using SEOmoz Page Authority (PA). A company called Screaming Frog offers SEO services but they are most famous for creating a web crawler. Their home page (http://www.screamingfrog.co.uk/) has a PA of 57. The screaming frog crawler page (http://www.screamingfrog.co.uk/seo-spider/) has a PA of 62.
I hope this helps.
*Edited reply based on feedback from Keri*