Hi Chad,
You are correct. You should list the local phone number on the website and the GetListed.org directories. While 800 numbers are convenient, they do not give any indication as to the location of the business.
Hope this helps.
Mike
Cool. No worries
Stack Overflow has always been awesome in helping me with my IIS rules and such.
If you Google: site:stackoverflow.com apache redirect
You will see MANY examples of how to set up 301 redirects, including redirecting from non-www to www pages, etc.
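For a quick taste, a one-off 301 in .htaccess is usually just a single line like the sketch below (the paths and domain are placeholders, so adjust them to your own site):

# Permanently redirect one old URL to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html

The non-www to www redirects use mod_rewrite and are a few lines longer, but those Stack Overflow threads cover them well.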
Hope this helps.
Mike
Hi Jesus,
This is an interesting article about how fixing duplicate content increased a website's indexation, which in turn increased its traffic by 150%.
I would definitely attack the Crawl Errors found ASAP. Duplicate page content, 4XX errors, and missing or duplicate title tags can definitely mess with rankings.
Once you have your errors under control, I would work on the warnings whenever you have time or are bored.
Long story short... the errors can definitely have a big impact on how you are ranking, while the warnings are just a "heads-up".
Hope this helps.
Good luck.
Mike
Google Analytics or Google Webmaster Tools? You will need to do that in Webmaster Tools.
That is a bummer they are having issues with your 301 redirects. If you know whether you are using Apache, IIS, etc. for your backend, you could post the code you are using in a new question and hopefully someone in the SEOMoz community can help; otherwise, there are Apache and IIS forums where you can post and get some great results and/or examples to base your redirects on as well.
Good luck Sarah! I hope you get your site in shape and back on page 1!!!
Mike
... I had so much written in here, then accidentally clicked back... <sigh> Let me start over...
Unique content will help gain followers. The more followers you have, the more FB shares and likes you "should" have. This is where FB pages can impact how your website ranks in Google's SERPs.
The biggest thing with FB pages is that they need to be monitored: interacting with followers, responding to their questions and concerns, and updating your content. If you do not do these things, you will not get the followers, shares, and likes you want.
Depending on your company, Twitter, LinkedIn, or Google+ might be a better form of social media; however, like FB, they still need to be monitored and followers will demand interaction.
As for increasing your DA... it is wicked tough to do in a short period of time. According to SEOMoz:
"How do I influence this metric?
Unlike other SEO metrics, Domain Authority is difficult to directly influence. This is because it is made up of an aggregate of metrics (mozRank, mozTrust, link profile, etc...) that each have an impact on this score. This was done intentionally because this metric is meant to approximate how competitive a given site is in Google.com. Since, Google.com takes into account a lot of factors, a metric that tries to calculate it must incorporate a lot of factors as well.
This means the best way to influence this metric is to improve your overall SEO. Particularly you should focus on your link profile (which influences mozRank and mozTrust) by getting more links from other well linked to pages."
The short of it is that if your SEO provider is creating multiple FB accounts and having trouble maintaining them all, you are definitely better off with just one. If they put the time and resources they would spend managing multiple accounts into managing a single one, in theory you should accumulate more followers, likes, etc., because you will be creating unique content for one source instead of thin content for several, and you will be interacting with followers promptly instead of responding a few days later.
Does that help?
Mike
You can go back and fix all of your old title tags, making them unique, like Newsletter Archive | Month Year | Sunday School Network, which will get rid of your errors and provide a better user experience. This approach will allow you to target specific keywords on each page for ranking in Google. When you have the same title across multiple pages, the assumption is that the content is either the same or very similar.
I noticed you have a canonical issue, where you can access your site via http://sundayschoolnetwork.com as well as http://www.sundayschoolnetwork.com
The issue with this is that you have 44 relatively important links from external websites pointing to the non-www version (http://sundayschoolnetwork.com)... which means you are splitting your potential link equity between two versions of the site instead of one. There are many ways you can fix this.
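One sketch of a fix, assuming you want the www version to be the one that ranks, is a rel=canonical tag in the <head> of each page (a server-level 301 from non-www to www is the other common option):

<head>
  <!-- Tell search engines the www URL is the preferred version of this page -->
  <link rel="canonical" href="http://www.sundayschoolnetwork.com/" />
</head>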
As for why you are not ranking as well, it could be the market became more competitive for the keywords you were originally using. It could be that your site content does not reflect the keywords you are targeting. It could be lots of things.
Like I said in my previous post, the nofollow tells crawlers not to follow the internal and external links on those pages; however, they will still get indexed. This means that you will still have duplicate titles appearing in results. The way to remove them from the results would be to use the noindex directive - which will eventually remove them from the index and you will not have competing title tags.
If you fix your title tags, you do not need to worry about the nofollow or noindex directives.
That is about all I can help with, without knowing any additional information.
The only other thing I can suggest is to read the SEOMoz Beginners Guide to SEO - which will help a TON!
I hope that helps.
Mike
Hi Sarah,
If the titles are different and the page content is different, I do not understand why you should be getting any errors.
What tool are you using that is giving you the "similar content" message?
Your site visitors will still be able to search your site with nofollow in place, because nofollow is simply a directive telling search engines to not follow the internal and external links on your page.
The noindex directive tells Google to not index the content on the selected pages.
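For reference, both directives are normally set with a meta robots tag in the page's <head>; a minimal sketch (the combinations below are just illustrations) looks like this:

<!-- Keep this page out of the index, but still follow its links -->
<meta name="robots" content="noindex, follow" />

<!-- Index this page, but do not follow its links -->
<meta name="robots" content="index, nofollow" />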
If you can provide me with the name of the tool you are receiving the "similar content" message from and/or provide me with your website address I could take a look into things further.
... long story short, if your titles are unique and your content is unique, you should not have to worry about duplicate content.
Hope this helps,
Mike
Hi Sarah,
It would be best for you to create your own topic so that others can lend their help to your specific question.
Once you post your question, I will be happy to respond there.
Thanks,
Mike
I guess that is something you should ask yourself... does it make sense?
If it doesn't, don't do it; however, if each location has different specials, coupons, or something unique to offer, then it makes sense to me.
And if it is any consolation, Target has MANY Facebook pages, including a main Target page with Target Style, Target Baby, Target Canada, Target Black Friday, Target Wedding, etc.
And all of their different pages have different numbers of followers, likes, etc., giving each individual Facebook page its own page authority.
Make sense?
Mike
Hi Robert,
GetListed.org is probably the best. It not only tells you which directories to get listed in, it also verifies when your business has been listed.
The directories on GetListed.org are great, and any other industry-specific or local business directories you can get into will make complete sense to Google when crawling. It starts to become a sketchy area when you start adding your website to random or generic directories... some are OK, you just don't want to overdo it with the generic ones.
Hope this helps.
Mike
Hi Scott,
I would not worry about that being considered duplicate content. Is it duplicate? Yes - however, you are listing your mission in business directories, so some duplicate content is expected.
When Google indexes, it can easily determine where the duplicate content originated and gives complete credit to that source... in this case, your homepage.
You can even Google: "seomoz is the world's most popular provider of seo software" and see that this same paragraph is listed on LinkedIn, CrunchBase, VendorStack, StartupGenome, etc.
These business directories are all reputable, so I would not worry about it. If you do see random webpages or your own webpages duplicating your content, then you should be more concerned about fixing that.
Hope this helps.
Mike
LAME! You may just want to let the 301 redirect you have in place take its course or remove the URL from Google's index since it was added by mistake anyway.
Mike
If that does not work, give this a whirl:
# Only rewrite URLs that do not already have a 3-4 character file extension...
RewriteCond %{REQUEST_URI} !\.[a-zA-Z0-9]{3,4}
# ...and that do not end with a trailing slash
RewriteCond %{REQUEST_URI} !/$
# Append .html to everything else
RewriteRule ^(.*)$ $1.html
Maybe give this a whirl:
# Skip URLs that contain a dot (an existing file extension) or end in a slash
RewriteCond %{REQUEST_URI} !(\.|/$)
# Otherwise, serve the .html version of the requested path
RewriteRule (.*) /$1.html [L]
Hi John,
I would actually suggest trying to put St. Charles, MO in your title tag like the competitor that is ranking above you.
That will help you rank higher for the keyword "St Charles Lawn Care," it is way easier than creating additional pages, and your homepage already has very good rankings.
If you haven't already, make sure you are listed in all of the local business directories at GetListed.org to further increase your local presence.
Good luck.
Mike
"I accidentally manually submitted the url to google and manually in submitted it to index and that when this issue began...."
It sounds like you accidentally added this URL to the index. You can follow the procedure outlined below to request Google remove the specific URL from the index:
https://support.google.com/webmasters/bin/answer.py?hl=en&answer=59819
I checked your site's structure using Screaming Frog and it does not appear that you are linking to any non-.html versions. If I perform a scan using one of your non-.html pages, it appears that it only links to itself.
Since you have the 301 redirect in place, you can choose to wait it out and Google should correct things eventually; otherwise, requesting Google remove the URL is a faster... PERMANENT process.
Good luck.
Mike
It can take GWT weeks or even months to remove these warnings from its reports.
As long as you have personally verified that they are fixed on your live site, you do not need to worry.
I just verified that your /estimate-request.html is using the description you stated above; however, Google is still showing the meta description you had in place on Jan 30, 2013.
Once Google re-indexes your page, it will appear correctly in the SERPs, but like I said, it may take months for this fix to be reflected in Google Webmaster Tools.
Does that help?
Mike
Hi John,
SEOMoz has the following plans:
PRO
PRO Plus
PRO Elite
PRO Agency
PRO Enterprise
http://www.seomoz.org/plans/enterprise
Mike
Hi John,
Yes:
1,000 words using the PRO Plus plan and 3,500 words using the PRO Elite plan.
Mike
Hi Darrin,
Can you provide your website URL or a few examples of the URLs that are getting the duplicate content error?
If you can't provide them, I would recommend referencing http://www.seomoz.org/help/fixing-crawl-diagnostic-issues which says the following about fixing duplicate page content:
"Duplicate Page Content
Duplicate Content means there are pages that are identical (or nearly identical) to content on other pages of your site, which can force your pages to unnecessarily compete with each other for rankings.
Here are some things you can do about duplicate pages:
Delete content that is similar on each page.
Add some new and unique content to each page that is on the report. This can be done by adding more information, ideas, product descriptions, or anything that can make it differ from other pages on the domain.
You can also add a rel=canonical to one of the duplicate pages. Here are a few ways to do this:
Add a rel="canonical" link in between the and elements. This should be done on the version of the page you want to be ranking or that non-canonical version of the two (or multiple) pages.
To specify a canonical link to the page http://www.seomoz.org/blog.php?item=seomoz-iscool, create a element that looks like this: <link rel="canonical"href="<a href="http://www.seomoz.com/blog.php?item=seomoz-iscool">http://www.seomoz.com/blog.php?item=seomoz-iscool"/></link rel="canonical"href="<a>
Copy this link into the section of all non-canonical versions of the page, such as http://www.seomoz.com/blog.php?item=seomoz-iscool&sort=fun.
Keep in mind that that canonicals will stop the pages from ranking against each other, but they will still show up as duplicate content from a UI perspective, so we will still count them as duplicate."
Thanks,
Mike
Hi Kyle,
I am confused by looking at the URL you provided.
The link is missing an "i" in "clients" and also cannot end in .comXYZ; it would need to end with something like http://www.XYZclientsSite.com/XYZ. The displayed URL is also different from the linked-to one: www.XYZclientSite.com/XYZ vs. www.XYZclientsSiteXYZ.com - one appears to be a directory and the other appears to be the site's root.
Original:
Rewritten?
I don't know if I answered your question or if I need additional information because of the items described above.
Mike
It depends on whether you use Apache or IIS to manage your website.
Here is an example of what you'd use if you have IIS 7:
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Force www" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^site\.com$" />
        </conditions>
        <action type="Redirect" url="http://www.site.com/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
This shows what you'd use for Apache (at least I think... I personally use IIS):
http://stackoverflow.com/questions/4907348/force-www-via-htaccess
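The rule you will find there looks roughly like this .htaccess sketch (untested on my end, since I personally use IIS):

RewriteEngine On
# If the requested host does not start with www...
RewriteCond %{HTTP_HOST} !^www\. [NC]
# ...send a permanent (301) redirect to the www version of the same path
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]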
And once you have one of these in place, you will also want to configure your Google Webmaster Tools to specify your preferred domain (Configuration > Settings > Preferred domain > Display URLs as www.YOURSITE.com).
This is technically a 301 permanent redirect, so any other links on the web that point to the non-www version will be redirected to the correct www version. Since there is a redirect at play, it will not pass the full amount of link juice; but like I said in my previous post, it appears that you only have one link pointing to the incorrect version, so you will actually be fixing a broken link by doing this. In theory, your www version will gain some of the link juice that the non-www version was previously receiving.
This is also going to provide your users with a better experience... which is nice.
Make sense?
Mike
Rock and roll.
Glad you got it all figured out.
Mike
Test it in a different browser.
For instance, I use IE for my main surfing habits, Chrome for my SEO and web development, and Firefox to check my Google rankings.
I do this because I do my Googling in IE (so things get cached there), Chrome has me signed in all of the time because of my Google accounts, and Firefox I never use otherwise.
Also, I am in Minneapolis, MN, so my browser is using the location to display my results.
I just switched my location to Santa Ana, CA and you are coming up #8.
That is why SEOMoz states in their help, "SEOmoz tries to represent the ranking results that “most” users will see."
Make sense?
Mike
Hi Jack,
According to Rankings - Help
"Why do the rankings in SEOmoz not match what I see in Google? Great question! Although there is never one “right” ranking for any keyword, SEOmoz tries to represent the ranking results that “most” users will see. If you see wildly different results from your search engine compared to your Ranking Report, it could be because of the following reasons:
Personalization – Most search engines will personalize your results based on your search history. Make sure you are logged out when performing a search. For Google, try adding &pws=0 to the end of your query string.
For example, if you searched for “SEO” the URL in Google would look like google.com/search?q=SEO. To turn off personalization, append &pws=0 to the end, so it would look like google.com/search?q=SEO&pws=0.
Geographic Bias – Search engines also customize results based on location. This means users in one city often see very different results than users in another location.
Latency – Rankings can change between the time we retrieve them and the day when they are viewed.
“www” vs. non-“www” Subdomains – The Web App is subdomain specific, meaning it only tracks rankings for the specific subdomain you track. So if you enter “example.com” as your subdomain, but “www.subdomain.com” is what ranks in the search results, this ranking won’t appear in your rankings report. This also might indicate a problem with canonicalization or duplicate content.
If you set your campaign up on the wrong subdomain, the only way to correct it is to start a new campaign. You can either archive or delete your old campaign, or choose to keep it running if you have enough campaign slots available."
And WorldofMoulding.com is coming up at #40 for me when I search "crown moulding" (see screenshot below).
Hope this helps.
Mike
Hi Tony,
You could setup a rewrite to always force the use of WWW when a user or crawler accesses your site without the use of WWW.
Depending on whether you use IIS or Apache, it will be different.
I did a quick look at your site and it looks like all of your absolute paths correctly use WWW; however, if you run http://asggutter.com/sitemap.xml through Open Site Explorer you will notice that it has one linking domain: http://iwebchk.com/reports/view/asggutter.com#.UTDhWzA4tqk
I am guessing that Google is crawling iwebchk.com and is picking up the non-WWW version of your sitemap from that website. If you Google the iwebchk.com URL above, you will notice that it has a date of Jan 16, 2013, so it must have cataloged this fairly recently.
You could also maybe try contacting iwebchk.com and asking if they would correct the sitemap URL on their website - but the rewrite to force WWW would probably be easier.
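If your site runs on Apache, a minimal .htaccess sketch of that force-WWW rewrite would look something like this (untested on my end, and assuming asggutter.com is the only host you serve):

RewriteEngine On
# Catch requests that come in without the www prefix
RewriteCond %{HTTP_HOST} ^asggutter\.com$ [NC]
# Permanently (301) redirect them to the www version of the same path
RewriteRule ^(.*)$ http://www.asggutter.com/$1 [R=301,L]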
Good luck,
Mike
Hi David,
Read the Beginners Guide to SEO.
That should answer 97% of your questions.
Then read the SEOMoz Help for 2% more.
And finally ask very specific questions here for the final 1%.
Good luck.
Mike
Not to my knowledge.
Anything that claims to do so is probably just analyzing backlink anchor text, keyword density, or the keywords meta tag - so the data you get back is only a hypothesis about what your competitors may be or should be targeting... not necessarily the keywords they are actually targeting or ranking well for.
SEMrush.com claims to do such a task; however, I personally do not believe it does an accurate job.
Good luck.
Mike
Hi Spencer,
You can check relatively easily whether Google is ignoring it or not.
Type the following into a Google search: site:www.example.com/page?
If you see: www.example.com/page?parameter=1 AND www.example.com/page? in the SERP, then YES, Google is seeing duplicate content.
If you only see one version, that means that Google is probably ignoring the other version. Google does understand that some content management systems naturally produce “duplicate content.” In these cases Google just ignores one of them. If you think about it, this makes sense. If you have two very similar URLs with the same content, it probably isn’t intentional that you’d have duplicate content; however, if you have two VERY different URLs and have duplicate content on those… THEN you’d be a little more concerned.
Hope that helps.
Mike
If you have SEOMoz Pro, you can set up Competitors in your Campaign Settings. Once configured, click on the Rankings tab of your campaign, then click on a given keyword. You will see a graph comparing how you rank against your competitors for that keyword over the past 6+ months.
If you do not subscribe to SEOMoz Pro (which you should, 'cause it ROCKS), you can try using Google AdWords. Instead of searching for a word or phrase, you can search by website, which displays keyword ideas for the particular site you entered.
So for instance, when I just tried Microsoft.com, "Windows 7," "SQL," and "Xbox 360" all appeared as keywords. Once you generate these keywords, you can then just Google them and see where your competitors rank in comparison to you.
Hope that helps.
Mike
"... the main thing is conversion and getting them into someplace valueable for them and you."
Well played Brian. That is the perfect way to think about it.
Thanks for your help.
Mike
UPDATE: By "getting rid of" I just mean we are not going to link to them internally and that the videos themselves will no longer exist. We plan to either redirect those URLs to either product pages or to pages requesting a demonstration.
Eventually we would like to create a YouTube channel or something, but right now, those videos are from like 2007-ish... so they are wicked outdated and provide no relevant content to new or existing customers.
I am starting to think that 301ing them to the product pages may be the best option.
Thanks for the info.
Mike
The backlinks are from our partners linking to the videos. Roughly 15 partner links per video and there are roughly 30 videos or so.
The highest ranking page linking to one of those video pages has a PA of 21 and mR of 3.2... so nothing too crazy cool.
That is why I was wondering if I should completely remove the pages, forcing the partners to update their websites to point to our new and better content. From a customer standpoint, if I am on a partner page and click a link expecting to find a video, but instead get redirected to a product page, that is not really what I was hoping to see... you know?
But I don't know the impact of removing 30+ pages with decent PA and mR from my site.
Thanks,
Mike
Here is the entire story:
We are planning on creating new, unrelated videos in the future and putting a transcript on those video pages is a stellar idea.
I also just checked Google Analytics and it looks like we are not getting any reasonable traffic from the video pages.
Do you think I should redirect these video pages to the applicable content pages or just remove them, forcing our partners to update their sites?
Thanks,
Mike
We are also getting rid of the video index page. We plan to remove all videos from the site.
I currently have some html pages that have Camtasia videos demo-ing our products; however, these videos are WAY outdated and we no longer want to maintain them. They contain only videos; there is no text at all on the pages.
The pages have an average page authority of 40 and mR of 4.
Should I just remove these pages or should I create a redirect to the homepage or product page?
Or...?
I have roughly 30 or 40 of these types of pages.
Thanks for any help.
Mike
You sir are a gentleman and a scholar.
Thanks for your help Matt.
SEOmoz is saying that I have duplicate content on:
The only difference I see in the URL is that the "content.asp" is capitalized in the second URL.
Should I be worried about this or is this an issue with the SEOmoz crawl?
Thanks for any help.
Mike