I'd be very hesitant to use targeted anchor text in these links - I've seen sites that built links like "Web Design and SEO services by Company X," where "web design" points to the web dev page and "SEO" points to the SEO page on the company website. These sites got rocked by Penguin - sitewide footer links with keyword-rich anchor text pointing to target pages lead to heavily over-optimized backlink profiles that are awesome Penguin food. Don't do it! Just use a branded link or a naked anchor link (www.companyx.com) as the link back, otherwise you're just asking for trouble.
Best posts made by Mark_Ginsberg
-
RE: Web Designers and SEOs using backlinks from client sites
-
RE: Google is displaying my pages path instead of URLS (Pages name)
They've understood the navigation of the site and are now showing each path of the nav as a link to that individual page/category - this is good - you're basically getting mini sitelinks on each SERP listing, which should help send traffic to those category pages. The display looks good and has the potential to send visitors to multiple pages on your site, so I don't see a problem here - do you?
-
RE: Why does Google add my domain as a suffix to page title in SERPS?
Is your title tag well written? Is it around 60 characters? I would make sure the title tag is a good description of the page/product; otherwise, Google might try to create better, longer title tags for your pages.
Can you share the URL and the search result? That will help in solving your problem.
Mark
-
RE: 'The Guardian' Is Moving to a New Domain
According to what I saw here, http://www.guardian.co.uk/help/insideguardian/2013/may/24/theguardian-global-domain, they'll be working with the team at Yoast, who are some of the best in the biz at onsite SEO, so I think the Guardian should be in good hands.
-
RE: Please Settle a Bounce Rate Debate
The form could trigger a Google Analytics event on successful submission without having to take you to a confirmation page. AJAX forms often don't load a new page, and you can track the form's success with a Google Analytics event rather than a pageview of a thank-you page. A very popular solution that works this way on WordPress is Contact Form 7.
When your form "wipes the data," as you said, and shows the customer the successful submission message, that's the moment to trigger the Google Analytics event.
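As a rough sketch, assuming Contact Form 7's AJAX success event and analytics.js (the exact event name depends on your CF7 version, and the category/action/label values below are just placeholders):

// Sketch only - fire a GA event when Contact Form 7 signals a successful AJAX submission
document.addEventListener('wpcf7mailsent', function () {
  // analytics.js syntax; on classic ga.js you'd use _gaq.push(['_trackEvent', ...]) instead
  ga('send', 'event', 'Contact Form', 'submit', 'quote-request');
}, false);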
Mark
-
RE: Have SEOmoz members ever considered joining forces with link-building?
Hi Marisa,
It sounds like you're advocating for all of us members to get organized and create the largest link network the world has ever seen (mwahahaha). While there is some logic to it, I really think this is exactly the type of thing Google is going after. In the end, it comes down to the user. By linking from your site to the coat hanger site, are you benefiting your user? Is the link in context, and does it make sense? If someone was genuinely interested in reading your article, would following your link help them? If so, by all means do it. I don't think Google will penalize people for naturally linking to friends, business partners, associates, local businesses, etc. But what happens when this becomes organized, and your DIY furniture blog starts linking to forex sites and cheap online prescriptions? While that is clearly an exaggeration, I see it as only a matter of time before things would deteriorate. By all means build relationships, establish connections, and leverage them for all purposes, including link equity - but keep your user in mind and stay focused on them.
That being said, I'm all for contextual outbound links to relevant sites - let's all spread the link love a bit more!
-
RE: Finding the source of duplicate content URL's
Using the Screaming Frog SEO Spider (the free version will crawl 500 URLs; the paid version [99 GBP for a yearly license] will crawl as much as you want), you can see all of the inlinks to a particular page. So run a crawl of the site; you should find those pages with Screaming Frog, and then you can view the inlinks to them. Visit the inlinking pages and check the code for links to the page you're looking for - this will quickly show you where the links to the pages you're trying to hide are coming from.
Also, have you checked the sitemap? The CMS might be creating links to these pages there.
Good luck, and let me know if you need any more help with this.
Mark
-
RE: Taking out a .html plugin
After removing the plugin, you should configure a sitewide 301 redirect to strip out the .html and point to the version without the file extension. This way, neither internal nor external links will lead to error pages, and you won't lose any link juice.
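A minimal .htaccess sketch of that redirect, assuming Apache with mod_rewrite (test it on a staging copy before rolling it out sitewide):

# Sketch only - 301 any request ending in .html to the extensionless URL
RewriteEngine On
RewriteRule ^(.+)\.html$ /$1 [R=301,L]

This assumes your CMS serves the clean URLs directly once the plugin is removed; otherwise you'd create a redirect loop.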
You'll also want to make sure your canonical tags point to the non-.html version of each page, if they're hand-coded.
-
RE: How to allow googlebot past paywall
Google has a program called First Click Free - basically, you need to allow Googlebot, along with users, to view the first full article they land on. So if you have multi-page articles, you need to give them access to the entire article. After that, though, the rest of the content can sit behind a paywall.
You can read more about it here - http://support.google.com/webmasters/bin/answer.py?hl=en&answer=74536
And here are the technical guidelines for implementation - http://support.google.com/news/publisher/bin/answer.py?hl=en&answer=40543
Hope this helps,
Mark
-
RE: MozCast metrics: Big movement yesterday - what happened?
Apparently this reflects a code change made to the system rather than an algorithm update - see Dr. Pete's tweet here - https://twitter.com/dr_pete/status/288665186951380994 - and stay tuned for his blog post.
-
RE: SEO Fightback
I agree with what William wrote - it's just not super easy. In addition to building out great content, you have to market it. You have to get the word out and do the outreach. It's not enough to build awesome content; you have to share it via social networks, email, outreach, word of mouth, contacts you may have, the press, etc. You can build the best blog pieces out there, but if no one hears about them, they ain't gonna do you much good.
Adria wrote a great post here on SEOmoz the other day - I'd check it out - http://www.seomoz.org/blog/why-link-building-strategies-fail
-
RE: Is there a way to get the domain authority of a subdomain?
I agree with Moosa, but it may make sense to also look at mozRank for the subdomain - OSE calls it subdomain mozRank - which looks at the strength of the whole subdomain. You can also look at the number of linking root domains and followed linking root domains, compare that to the totals for the root domain, and estimate how much of the overall domain authority comes from the particular subdomain you're looking at. Those are just my theories, but to me it makes sense.
-
RE: What is the best way to allow content to be used on other sites for syndication without taking the chance of duplicate content filters
So you want to create duplicate content on the web?
If you're syndicating content, you're doing it to provide that content to the readership of these other sites. If someone searches and discovers this content in the search engines, you want the attribution to go to the original source: you. Why would you want some other website to be discovered as a result of content you created and syndicated to them? If you want the credit to go to them, and the engines to bring visitors to their site for content you created, then write a guest post or column and put it on their site instead. But if you're syndicating content and want it to appear on other sites, I think it makes sense to have the canonical tag pointing to your site, indicating to the search engines that you are the authentic source of that piece of content. The other site's readers will still enjoy the content there; you'll just be credited for it in the search engines. To me, this is a win-win where everyone benefits: the other site gets free content for its readership, and you get attribution in the engines plus the additional exposure to the other site's visitors.
-
RE: Re-Platforming our ecommerce site. What am I missing?
Sounds like you've got your bases covered.
One suggestion - make sure you don't block the old URLs with your robots.txt file - the robots need to be able to reach the old pages and follow the 301s to update their index.
I'd also use your favorite backlink tool to check your external link profile - I'd contact the really juicy sites linking to you, inform them of the change, and ask them to point their links directly at the proper new pages; it's better for users and spares your server the redirect hops. Also make sure you're not losing juice by having external links hit 404 pages - this would be a good time to recover any lost juice.
While you may drop in rankings temporarily, things should sort themselves out in a bit.
-
RE: International pages - SEO - which metatags to use?
Hi,
From an SEO standpoint, you should ideally be implementing the hreflang tag - it indicates to the search engines not just the language of the page, but the relationship between the pages on the site in the different languages. It tells the search engines that these pages are alternate translations of one another and that they should show each searcher the right page in their language.
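For illustration (the example.com URLs are just placeholders), each language version of a page carries a set of tags like this in its <head>, listing every alternate version including itself:

<link rel="alternate" hreflang="en" href="http://www.example.com/en/page/" />
<link rel="alternate" hreflang="de" href="http://www.example.com/de/seite/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />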
You can learn more about the hreflang tag from Google directly:
https://support.google.com/webmasters/answer/182192
https://support.google.com/webmasters/answer/189077?hl=en
I would look to implement this first.
Good luck,
Mark
-
RE: Is there a tool to keep track of internal links?
You can use Screaming Frog to crawl the site, and it will show you the internal links and anchor text to each page across the site as well. Though this post from Branko is a bit old, it's still a great resource on using Screaming Frog to check for internal linking issues - http://www.seo-scientist.com/seo-spider-review-xenu-on-seo-steroids.html
Screaming Frog has a paid version, but the free version will let you crawl up to 500 URLs, so that might be sufficient for ya.
Hope this helps and good luck.
Mark
-
RE: What is the best tool for checking do follow outbound links?
Depends on what you're trying to do - if you're looking at a particular page and want to highlight external links or followed links, use the MozBar's link highlighting function.
If you want to crawl your site and find all of the external links, I would recommend using Screaming Frog SEO Spider - http://www.screamingfrog.co.uk/seo-spider/
-
RE: Penguin Rescue! A lead has been hit and I need to save them!
This sounds like a plan. Give it a shot and test the results
-
RE: How to handle (internal) search result pages?
I'd do exactly what you're saying. Make the pages noindex, follow. If they're already indexed, you can remove the search.php pages from the engines through Webmaster Tools.
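In the <head> of the search results template, that's simply a tag like this (sketch):

<meta name="robots" content="noindex, follow">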
Let me know how it turns out.
-
RE: Best X-Cart SEO Solutions / Modules?
This is the SEO solution I've been using on a site and it's been pretty good. Just make sure you have good site developers handling and maintaining the site - if the site goes down or becomes very slow and unresponsive, no SEO plugin is going to help you.
-
RE: Can anyone suggest a AHREF's alternative?
Hey Pete, the Moz keyword difficulty tool will give you a score on a scale of 1-100 for how difficult the term is - not many other tools will give you levels of competition for organic search results. However, SEOmoz no longer has search volume data, due to AdWords API issues.
It doesn't look like many companies out there will be able to both scrape Google search results and provide info using the AdWords API. So you'll have to pick your poison - do your keyword research right in the AdWords interface, or with other tools, and rely on tools like SEOmoz, which scrape Google, for organic difficulty scores.
Mark
-
RE: Does the root domain hold more power then an inner page?
Think of it this way - if you're a user and you want to find info on this term, what is the best page for you to land on? If it's a specific keyword relevant to a deep page full of content, then you'd want Google to target your inner page. If it's a general term, maybe your best page is the home page.
Take one example - electronic cigarettes.
If someone searches "electronic cigarettes," that's the head term, and it's pretty generic - there are lots of relevant subtopics - so you'd think the main landing page should be your home page. But if someone searches "electronic cigarettes quit smoking," an inner page about the uses and scientific proof (or lack thereof) of using e-cigs to quit smoking would be more relevant.
Bottom line - keep the user in mind when doing keyword targeting, and think about which page on your site would be the best one for that keyword.
-
RE: Problems with a NoIndex NoFollow Site
First, why add the nofollow tag? Let the pages be followed - that way, if you link back to your parent site, some link juice may flow.
Second, I agree with Nigel about the canonical tag. In terms of robots, the noindex tag is more effective than blocking the site in robots.txt. You can also set noindex via the HTTP response headers in Apache using your .htaccess file - see here for more info.
This way you can control it sitewide from one location, and it also won't be as apparent to anyone who looks at the code of the non-branded site that it shouldn't be indexed.
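A sketch of what that could look like in the non-branded site's .htaccess, assuming mod_headers is enabled:

# Send a noindex header with every response from this site
Header set X-Robots-Tag "noindex, follow"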
Mark
-
RE: External Links Discrepancy
First off, make sure you're looking at the right metrics. External links to the domain is a different number from external links to a particular page.
More importantly, external links is the total number of external links pointing to the page - it has nothing to do with the number of domains linking to your page. A site could have a sitewide link to your competitor in its footer, and if it's a big site, that one domain could account for something like 500, 1,000, or even 10,000 links pointing to your competitor. So while OSE will say the competitor has 10,000 links, that's 10,000 external links from only 1 linking domain. That's why you'll see a discrepancy between the number of external links and the number of linking domains - they measure two different things.
-
RE: Wordpress + Google Analytics = Pulling My Hair Out
Hmm, pretty strange - you can always test it by putting the code on the site and then watching the Real-Time reports - you should see visits coming in, especially if you open the site in a different browser.
You can send through an example and I'll be happy to take a look.
-
RE: Does Open Site Explorer show juice passing links
When you filter links by followed and 301, that shows you the "juice passing" links - it's just another way of saying links that should count in the search engines and pass strength from another website to yours.
-
RE: Which search engines still use Meta Keywords?
If the search engines are using it at all, it's as a spam signal - I'd recommend staying away from it. I usually tell my clients to leave it blank and have them remove keywords tags that are stuffed and could look like a spam indicator - see this article on Search Engine Land.
-
RE: Why has my keyword dropped and risen so much in only 7 days?
As the other people already responded, there is a lot of fluctuation in the SERPs on a regular basis.
A few things come to mind:
- Google could be trying out different landing pages for the term - it may be that your homepage was blocked, so they showed an inner page, but much further down.
- This fluctuation seems to happen every few months, with the site getting downgraded temporarily and then recovering - see the attached image.
- Have you been having site connectivity issues - any messages in Webmaster Tools?
Hope this helps,
Mark
-
RE: Registration tracking with Google Analytics
I would recommend creating a virtual pageview that fires whenever someone registers on the site - this way, you don't need to create a success/confirmation page for the form; just have the site fire off a virtual pageview and track it as a goal.
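A rough sketch with analytics.js (the path is just a placeholder; on classic ga.js you'd use _gaq.push(['_trackPageview', '/virtual/registration-success']) instead):

// Call this from your registration success handler, then set the same
// path as the goal URL in Google Analytics
ga('send', 'pageview', '/virtual/registration-success');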
Mark
-
RE: Recommended Guest Blogging Platforms
You can also check out Blogger LinkUp (an email list sent three times a week with opportunities for guest posts), and also check out Zemanta.
-
RE: Registration tracking with Google Analytics
What I'm saying is, why not write your code to fire the virtual pageview only when a successful registration takes place, or when someone registers via Facebook or Twitter?
That should be something you can capture and then fire only once someone actually registers, no?
-
RE: Recommended Guest Blogging Platforms
Following up on what Mark wrote, BuzzStream has a great generator tool for creating these queries - http://tools.buzzstream.com/link-building-query-generator
It's very helpful
-
RE: Bad link profile?
If the site got hit in the search engines (I see a major drop in visibility for the site in Searchmetrics on June 21 and 28, 2012), and the client has now moved to a new domain, why would you 301 the old pages to the new ones? Don't you want to start with a clean slate - which is why you started a new site?
Whether or not the 301 passes along the penalty, why take the risk? Why not build your link profile the proper way this time?
I also see a lot of over-optimized anchor text in OSE and very little branded or generic - diversify your anchor text and focus on branded, unoptimized, and naked anchors (www.claims...) in your link building efforts.
If you're worried about visitors to the current domain, why not redirect using a 302 and then eventually let the domain die?
Instead of switching domains, you can also do a link disavow after a thorough review of your links, file a reconsideration request, and then resume link building on the current domain, rather than switching to a new hyphenated domain that just looks like a spammier version of the original URL. I would do a thorough backlink review and then a disavow and reconsideration request instead of the new-domain route. You can refer to this article from Search Engine Journal for how to review your links.
That's my two cents - hope it helps
Mark
-
RE: Google Analytics question
See this info here - http://support.google.com/analytics/bin/answer.py?hl=en&answer=1011811&ctx=cb&src=cb&cbid=-1lv5wpxwzizr&cbrank=1
I'd recommend building a custom report to see full traffic referral info - look at the full URL, not just the domain. I built this custom report for myself and I find it very helpful - it's for an ecommerce site, but you can swap the ecommerce info for goal info if that fits better.
https://www.google.com/analytics/web/permalink?uid=E0KXNJqvRdS51dg3wkEfwg
Mark
-
RE: Guest Posts on Established Blogger/Wordpress.com sites
I usually look at the subdomain metrics reported by OSE to gauge the strength of a particular WordPress or Blogspot subdomain - if it has legitimate external links, it's definitely something I'll consider.
-
RE: Huge google index with un-relevant pages
Hi Assaf,
(I'm not stalking you, I just think you've raised another interesting question)
In terms of index status/size, you don't want to create a massive index of empty/low value pages - this is food for Google's Panda algorithm, and will not be good for your site in the long run. It'll get a Panda smack if it hasn't already.
To remove these pages from the index, instead of doing hundreds of thousands of 301 redirects, which your server won't like either, I'd recommend adding the noindex meta tag to the pages.
I'd put a rule in your CMS so that after a certain point in time, you noindex those pages. Make sure you also have evergreen pages on your site that can serve as landing pages for the search engines and which won't need to be removed after a short period of time. These are the pages you'll want to focus your outreach and link building efforts on.
Mark
-
RE: How to solve the meta : A description for this result is not available because this site's robots.txt. ?
You should remove the robots.txt block on the redirect - if the redirect is implemented properly, and it points straight to the new page rather than through some sort of infinite loop, it should be fine for your server to handle.
The bots think those redirecting URLs are real pages, which is why they have them indexed - by blocking them, you've stopped them from visiting the redirect and seeing that it points to another page, and that the redirecting URL should be replaced by the real page. If things are implemented properly, they'll follow the 301 redirect and drop the redirecting URL from the search results.
If you provide a link to one of these redirecting URLs, I'm happy to take a look and see if I can help you.
-
RE: Hit by a penalty... but which one?!
Based on the graph you provided, it looks like a Panda hit.
I would check your landing pages and keywords reports in Analytics and see if you can identify head terms that were hit, or if this is more of a sitewide, across-the-board issue.
-
RE: Realtors SEO and Active Rain
Hi Joel,
I'm looking at a blog post on the site now - http://activerain.com/blogsview/3275081/the-true-cost-of-collecting- - its links are followed - you can use the MozBar to check/highlight followed and nofollow links. The links on this page are certainly followed and passing link equity.
That being said, blog here for the exposure to the readership and potential clientele, in addition to the brand exposure. Don't just do it for SEO purposes, but contribute value to the community - your returns will be greater in the long run with this mentality!
Mark
-
RE: Huge google index with un-relevant pages
Exactly - I'd build a strategy more around promoting pages that will have long lasting value.
If you use the tag noindex, follow, the page will continue to spread link juice throughout the site; it's just that the individual page with the tag won't be included in the search results or be part of the index. For the tag to work, Google first has to crawl the page and see it - so it doesn't happen instantaneously. If they crawl these deeper pages only once every few weeks, once a month, or even less often, it may take a while for the pages to drop out of the index.
-
RE: How to solve the meta : A description for this result is not available because this site's robots.txt. ?
It looks like you have a bit of a redirect loop here - that might be what's hurting the server a bit.
-
RE: Why are plus signs (+) suddenly showing up in Google Analytics organic search keywords reports?
Not sure why this is growing recently, but when I was learning regex for Google Analytics with the awesome LunaMetrics regex guide, I remember coming across the need to write brand names for advanced segments in a way that covers the possibility of two words being written with or without a space. I don't remember exactly where I saw it, but since then I've been writing them this way: (\s|\+). If I were writing an advanced segment for the seomoz brand and wanted to cover both "seo moz" and "seomoz", I would write it as seo(\s|\+)?moz.
Basically, the regex for a space is \s, but Analytics sometimes records spaces as +, so to cover your bases you match either a \s or a literal +.
My point is, this has been around for a while - I'm not sure why the sudden increase. Maybe try drilling down a bit and seeing if you can find a common denominator in the traffic and what is causing it.
Mark
-
RE: Is it ok to point internal links to index.html home page rather than full www
Try this - of course take what you need from it - source is here - http://stackoverflow.com/questions/6059920/removing-index-html-from-url-and-adding-www-with-one-single-301-redirect

Options +FollowSymlinks -MultiViews
RewriteEngine on

RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
RewriteRule . http://www.%{HTTP_HOST}%1 [R=301,NE,L]

RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule . http://www.%{HTTP_HOST}%{REQUEST_URI} [NE,R=301,L]

RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
RewriteRule . %1 [R=301,NE,L]
-
RE: Website is not indexed in Google, please help with suggestions
Just an aside - you're going to have indexation issues - you have both www and non-www versions live on the site, with no canonicals pointing to one version. You also have index.php as a live page linked to from the logo. I'd definitely recommend implementing canonical tags across the site.
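As a sketch (example.com is a placeholder), each page would point to its single preferred www URL, regardless of how it was reached:

<link rel="canonical" href="http://www.example.com/" />
<link rel="canonical" href="http://www.example.com/some-page/" />

One tag per page, of course - the home page gets the first, and each inner page points to its own preferred URL.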
Mark
-
RE: Webmaster Tools and Geolocation
Hi Glen,
I don't have any specific insight regarding Venezuela; however, a few notes:
- In general, Analytics data is viewed in the industry as MUCH more accurate than Webmaster Tools data. If Analytics is implemented correctly (I'd start by checking this with the Google Tag Assistant plugin for Chrome), then I'd simply assume an error in WMT reporting unless I saw other evidence that might convince me WMT is correct.
- I think you may have misunderstood how secure search works. Secure search doesn't mean that the visits aren't tracked; it just means you won't get keyword data for those visits in Analytics. If your hypothesis that you have way more visibility in Venezuela is correct, that would show up in Analytics.
Hope this helps! Feel free to message me if you have any other questions
Mark
-
RE: How bad is it going over 70 character for title tag length?
Going over 70 characters means your title tags will be truncated by Google. And nowadays, truncation isn't dictated by character count but by the number of pixels those characters take up - staying under about 65 characters will usually mean you're in the clear.
If you are formulaically creating title tags, and a few of those go over 70, I don't think it's the end of the world. It's not bad - you aren't going to be punished - Google will just truncate the tag and add an ellipsis - worst case scenario, they'll create their own title tag.
That being said, I'd make sure that for your target pages (pages you really care about and are aiming for as landing pages in the SERPs), you manually review/write the title tags and make sure they're optimized in terms of keyword inclusion, call to action, branding, length, etc.