Best posts made by Copstead
-
RE: Missing Juicy link finder
I would use SEOmoz's Competitive Link Finder Tool - http://www.seomoz.org/labs/link-intersect
-
RE: Website redesign and URL restructuring
I think you have the key components covered with the 301s, but I would also make sure that you have no internal redirects, and that you update your XML sitemaps and submit them at the same time. Once you make the change and your internal crawls of the site come back clean (no errors), use GWT to raise your crawl rate and Fetch as Googlebot to start resubmitting the new content.
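To sketch what those 301s might look like at the server level, here is a minimal Apache .htaccess example. The old and new paths are hypothetical placeholders for your own mapping; the important point is that each old URL redirects directly to its final new URL, with no chains.

```apache
RewriteEngine On

# Hypothetical example: an old category moved to a new path.
# [R=301,L] issues a permanent redirect and stops further rule processing.
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]

# Hypothetical example: a single renamed page.
Redirect 301 /old-page.html http://www.example.com/new-page/
```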
-
RE: Another high bounce rate
My first impression is that I have no idea what the page is about. But mainly, there is no guidance in what I should do next. There is no call to action.
So what is the main purpose of this page?
-
RE: Is the Page Authority/Rank of my corporate site affected by my blog's PA/PR and vice versa?
It will all depend on how you set up your blog: as a subdomain or a subfolder.
subdomain: http://blog.example.com
subfolder: http://www.example.com/blog/
If you set the blog up as a subfolder, then yes, the blog will help the site, and vice versa (a low-quality blog can also hurt the site). For optimal SEO, I would do it this way: http://www.example.com/blog/
Rand wrote a great post about this topic here:
Root Domains, Subdomains vs. Subfolders and The Microsite Debate
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
-
RE: Is there a way to see what keywords users of my site are using to find it online?
GA does still provide keywords, so you must be talking about the (not provided) keywords, correct?
If that is the case, then no, there isn't an easy way to track this information. But if you want to spend some time, you can get a rough idea of the keywords based on the landing pages they hit.
For any page on your site, you can see the complete list of keywords that entered on that page (with percentages). You can then apply those percentages to the (not provided) numbers from Google.
For example:
page1.html - 100 landed visits from the keyword "example", which accounted for 10% for that page.
page1.html - 1000 visits from the keyword "(not provided)", so with a little math you can estimate that 10% of the 1000 visits came from the term "example"
Note: the metric GA uses for this is "Entrances". That is the number you need to use.
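The estimation above can be sketched in a few lines of Python. This is a rough illustration with hypothetical numbers, not a GA feature; it simply distributes a page's (not provided) entrances across its known keywords in proportion to their shares.

```python
def estimate_not_provided(keyword_entrances, not_provided_entrances):
    """Spread a page's (not provided) entrances across its known keywords,
    proportionally to each keyword's share of known entrances."""
    total_known = sum(keyword_entrances.values())
    return {
        keyword: round(not_provided_entrances * count / total_known)
        for keyword, count in keyword_entrances.items()
    }

# Hypothetical numbers matching the example above: "example" accounts for
# 10% of page1.html's known keyword entrances, so it gets ~10% of the
# 1,000 (not provided) entrances.
known = {"example": 100, "all other keywords": 900}
estimates = estimate_not_provided(known, 1000)
print(estimates["example"])  # 100
```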
-
RE: Getting home page content at top of what robots see
Absolute positioning will allow you to create a div at the bottom of the page but display it at the top using CSS.
For example, to have div2 display above div1, it would look like this in the code:
<div id="div1">Google sees first.</div>
<div id="div2">People see first.</div>
CSS code:
#div1 {width:900px;height:500px}
#div2 {position:absolute;top:-15px;width:900px;height:100px}
I know it's not a very good example of code, but you should get the idea enough to start testing your layout. Also, go ahead and use this for testing: http://www.w3schools.com/cssref/tryit.asp?filename=trycss_position_absolute
-
RE: Quantifying link building efforts
If you really want to dig deep into your links, Tom Anthony wrote a great post and offers a great tool. http://www.seomoz.org/blog/link-profile-tool-to-discover-linking-activity
We use it for every client to analyze and get a better understanding of their links on a monthly basis.
-
RE: Can fun, slightly unprofessional text be good for conversion rates?
I would make it fun, but keep the copy related to your sports industry.
-
RE: Google + 1
But Google +1 is to Google's SERPs as Facebook likes are to Bing's SERPs.
-
RE: Link Building - Quality,Quantity, or both?
Rand has a great post here: http://www.seomoz.org/blog/buying-links-is-shallow-buying-blogs-now-thats-a-strategy
Our goal is to focus on Quality, but we will take what we can.
-
RE: Google Analytics Customer filters & the correct syntax
As far as excluding traffic from the United States, you would need to exclude "Visitor Country" and then type "United States". Knowing what to type for the country is based on what Google displays in the Demographics > Location section.
Instead of filters, I would probably set up Advanced Segments. Filters permanently remove the information, whereas Advanced Segments let you look at certain slices without losing any data.
-
RE: What company/person do you recommend for improving conversion rates on landing pages?
I have had great luck with http://visualwebsiteoptimizer.com/ but they are more of an A/B testing company. What I found useful is their blog. I think you will find your answers in there.
-
RE: Pages not indexed by Google
Do you have XML sitemaps? If not, they are a great way to measure what is being indexed by Google. Make sure you create multiple sitemaps based on your categories so you can track exactly which pages are not being indexed.
-
RE: Am I Stuffing Internal Anchor Text?
If adding links, try to keep the links within the same category. For example on the product page you provided, I would have links to the other 5 horse turnout blankets.
The goal with internal linking is to reinforce category and silo groupings. Think of each page and remember that everything on that page should be related to what you want it to rank for.
And to answer your overall question, I don't believe the changes you have made would hurt you. But really try to focus your links as groups.
As for sitemaps, if you have correctly designed your site navigation, an HTML sitemap page really isn't necessary, so I would probably get rid of it.
-
RE: Is a website with no images really a good idea?
What is the reason they don't want to use images? I believe that to have a professional-looking site, you do need images. I don't believe it's an SEO issue so much as a UX issue.
"A picture is worth a thousand words."
I would find two websites: one designed beautifully, the other minimal. Ask him which company he would call. 46% of people will judge a company based on the appearance of its website.
-
RE: Is there an easier way from the server to prevent duplicate page content?
You should definitely set up your site canonicalization, and you should also utilize rel=canonical tags to help distinguish which page is the actual page.
For example, if you want to identify that www.example.com is the correct URL, then you would place the following in the page's <head>:
<link rel="canonical" href="http://www.example.com/" />
-
RE: New linkbuilding: If networks are useless, and I need high volume through a 1-man team, what's the best option?
I wish I had an easy answer. There are a bunch of great posts by SEOmoz on Linkbuilding.
http://www.seomoz.org/pages/search_results?q=linkbuilding
A few I have bookmarked:
http://www.seomoz.org/blog/10-extraordinary-examples-of-effective-link-bait
http://www.seomoz.org/blog/the-power-of-using-lists-for-link-building
http://www.seomoz.org/ugc/outsource-link-building-like-a-small-seo-company
-
RE: Am I Stuffing Internal Anchor Text?
If you have your sitemap.xml and have submitted it to both Google and Bing, then you should be fine.
-
RE: Is there an easier way from the server to prevent duplicate page content?
You should set up the correct canonicalization rewrites at the server level with IIS or .htaccess (not sure which one you have). If you know what type of server you are on, then you can find all the correct rewrites (www vs. non-www, lowercase, trailing slash, etc.).
For example, here is a great post if you have IIS. http://www.seomoz.org/blog/what-every-seo-should-know-about-iis
And you should also use rel=canonical tags.
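As a starting point, here is a hedged .htaccess sketch of the two most common Apache canonicalization rules; substitute your own domain, and note that IIS uses URL Rewrite rules instead (see the post linked above).

```apache
RewriteEngine On

# Force www: permanently redirect example.com/... to www.example.com/...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Collapse the index page onto the root URL.
RewriteRule ^index\.html$ / [R=301,L]
```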
-
RE: The Google/Yahoo Connection
Bing powers Yahoo's search results, so make sure you are using Bing's Webmaster Tools and have XML sitemaps.
But the truth is that Google and Bing are different engines, so it's really hard to optimize for both at once.
-
RE: Correlation Between Domain Authority and Crawl Penetration?
Since SEOmoz doesn't disclose all of the metrics they use in the calculation, it's hard to say just how strong the correlation is. But I would expect the correlation to be there, since good crawl penetration and juice distribution lead to better indexing and a better user experience.
SEOmoz Domain Authority
Domain Authority represents SEOmoz's best prediction about how a website will perform in search engine rankings. Use Domain Authority when comparing one site to another or tracking the "strength" of your website over time. We calculate this metric by combining all of our other link metrics (linking root domains, number of total links, mozRank, mozTrust, etc.) into a single score.
To determine Domain Authority, we employ machine learning against Google's algorithm to best model how search engine results are generated. Over 150 signals are included in this calculation. We constantly refine this model over time. This means your website's Domain Authority score will often fluctuate. For this reason, it's best to use Domain Authority as a competitive metric against other sites, as opposed to a historic measure of your internal SEO efforts.
-
RE: Sitemap for 170 K webpages
Don't use HTML sitemaps; use XML sitemaps instead.
Here is a great post about utilizing XML sitemaps.
http://www.seomoz.org/blog/multiple-xml-sitemaps-increased-indexation-and-traffic
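For a site that size, the work is mostly splitting the URL list into files of at most 50,000 URLs (the sitemap protocol's per-file limit) plus an index file. A rough Python sketch, with hypothetical URLs and file names:

```python
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50000  # per-file limit in the sitemap protocol

def build_sitemap(urls):
    """Render one <urlset> sitemap file for up to MAX_URLS URLs."""
    entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            f'<urlset xmlns="{SITEMAP_NS}">{entries}</urlset>')

def build_index(sitemap_urls):
    """Render the sitemap index that points at each chunk/category file."""
    entries = "".join(f"<sitemap><loc>{escape(u)}</loc></sitemap>"
                      for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            f'<sitemapindex xmlns="{SITEMAP_NS}">{entries}</sitemapindex>')

def chunk(urls, size=MAX_URLS):
    """Split the full URL list into sitemap-sized pieces."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

# Hypothetical example: 170,000 product URLs -> 4 sitemap files + 1 index.
pages = [f"http://www.example.com/product-{i}/" for i in range(170000)]
chunks = chunk(pages)
index = build_index(f"http://www.example.com/sitemap-{n}.xml"
                    for n in range(1, len(chunks) + 1))
print(len(chunks))  # 4
```

In practice you would chunk by category rather than by position, so each file tells you which section of the site is underindexed.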
-
RE: Where to point Rel = Canonical?
I would see if they could restructure their site so that the 2nd link was the actual link. (without type2). I don't see why they would have those parameters as the main category.
You really want the link architecture to flow with what your Canonical tags are saying.
-
RE: Domains and subdomains
Yes, this is a problem, since the home page of your website can be accessed via two different URLs, with and without www. This issue can be resolved with canonicalization.
More information about Canonicalization: http://www.seomoz.org/learn-seo/canonicalization
From the article:
SEO Best Practice
For SEOs, canonicalization refers to individual web pages that can be loaded from multiple URLs. This is a problem because when multiple pages have the same content but different URLs, links that are intended to go to the same page get split up among multiple URLs. This means that the popularity of the pages gets split up. Unfortunately for web developers, this happens far too often because the default settings for web servers create this problem. The following lists show the most common canonicalization errors that can be produced when using the default settings on the two most common web servers:
Apache web server:
- http://www.example.com/
- http://www.example.com/index.html
- http://example.com/
- http://example.com/index.html
Microsoft Internet Information Services (IIS):
- http://www.example.com/
- http://www.example.com/default.asp (or .aspx depending on the version)
- http://example.com/
- http://example.com/default.asp (or .aspx)
- or any combination with different capitalization.
Each of these URLs spreads out the value of inbound links to the homepage. This means that if the homepage has multiple links to these various URLs, the major search engines only give them credit separately, not in a combined manner.
-
RE: Where to point Rel = Canonical?
Canonicals are fine, but you should always point the canonical tag at the correct or main URL (the one used in the navigation).
-
RE: Duplicate Content Issue with
I believe you can just add this to your robots.txt file and you will be alright (robots.txt rules must begin with a path, and Google supports the * wildcard):
Disallow: /*?p=contactus
-
RE: Product Variations
Keep only one page, for both UX and SEO. If you start to build different pages where the only differences are size or the like, you will start running into duplicate content issues.
The goal with any page, is to have it so unique and valuable, that people will want to link to it.
-
RE: Having trouble removing homepage from google
The best way to do this would be at the page level. For the pages you don't want indexed, add this to the <head>: <meta name="robots" content="noindex, follow">
This will tell the search engines not to index that page, but to follow and index the other pages.
But I do have to say, I have never heard of anyone trying to not index the home page. It will be interesting to see how this turns out.
-
RE: SEOmoz recommended Directories
I just want to remind everyone that since the Penguin update, we need to be very careful about directories. Here is a great article about directories, with an analysis of SEOmoz's directory list.
Web Directory Submission Danger: Analysis of 2,678 Directories Shows 20% Penalized/Banned by Google
-
RE: Sitemaps / Google Indexing / Submitted
It doesn't appear to be validating correctly.
http://www.xml-sitemaps.com/validate-xml-sitemap.html
Change this:
to this:
-
RE: Removing robots.txt on WordPress site problem
Copy whatever you have in your robots.txt file here and we will tell you the issue.
SEOmoz has a great article about Robots.txt files here: http://www.seomoz.org/learn-seo/robotstxt
-
RE: Domain "Forwarded"?
I agree with Phil. The site isn't being redirected to the new one. Most likely what they set up is a rewrite instead of a redirect.
Tell the tech team that the URL of the old site should never be visible.
You can use an HTTP header checker to verify that it is actually redirecting (you should see a 301 redirect).
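If you'd rather check without a third-party tool, a small Python sketch using only the standard library can request the old URL without following redirects and report the status code and Location header. The local demo server and the newdomain URL below are hypothetical stand-ins for the real old and new domains.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def check_redirect(host, port, path="/"):
    """Fetch a URL without following redirects; return (status, Location)."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("GET", path)
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

# Demo: a tiny local server that 301-redirects everything, standing in
# for the old domain. A proper redirect shows 301 plus the new URL;
# a rewrite would show 200 with no Location header.
class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "http://www.newdomain.example/")
        self.end_headers()

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = check_redirect("127.0.0.1", server.server_port)
print(status, location)  # 301 http://www.newdomain.example/
server.shutdown()
```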
-
RE: Can you 404 any forms of URL?
I believe this file is the base template for all the others. So if you put it on the main one, it should be on the others. That is why I suggested to add it and then view the code of the others to verify it is there as well.
-
RE: What is the ideal rel canonical URL structure?
The rel=canonical is based on the actual URL structure. So whatever your URL structure is (your navigation), the rel=canonical tag should match it.
-
RE: Website disappeared from Google organic keyword searches.
Run a detailed Linkdetective scan and start contacting each blog.
-
RE: URL Structure: When to insert keywords?
This type of marketing works better with larger sites. When you have a simple website with just a home, about, contact, and pricing page, it is really hard to optimize a contact page for something other than contact (it may be done, but I wouldn't focus on this).
Remember that you already have greenscreen in your url. So if you really want to focus on different phrases, you may have to break apart or build more pages about what you do. Or even have a section that is "what-is-greenscreen/"
With this site and only having 4 pages, I would focus on link building towards the home page.
Also, after taking a quick look, you have a few other issues you should take care of. The first is canonicalization: as of right now you can access your home page at two different URLs (seen as duplicate content).
How to fix that here: http://www.seomoz.org/learn-seo/canonicalization
I hope that helps.
-
RE: Site being indexed by Google before it has launched
I would be very careful about how you manage this right now. It all depends on the old and new URLs. If this is a dev staging environment, like http://dev.examplesite.com, then it is alright to do everything possible to clean them up. But if the new URLs are going to be the final URLs, then you have to be careful.
The best thing you could do right now would be to password protect the new site. That way there isn't a chance of anything getting to it.
All it takes is Google finding one link to the site then it will crawl from there.