Posts made by EricaMcGillivray
-
RE: CTR of Video Rich Snippet vs. Normal SERP
Yes, I highly recommend moving the transcripts to the actual page that the video's on. This will not only help with accessibility, but also improve your on-page SEO.
-
RE: CTR of Video Rich Snippet vs. Normal SERP
I'd definitely keep the video.
Instead, I would 1) make sure your video is completely optimized (transcript, video sitemap, etc.); 2) build more links (and more legit ones than your competitors have); and 3) keep the page fresh, since freshness has been a huge ranking factor since Google's Freshness update.
-
RE: Local Keyword Searches With Broad Terms.
Have you looked at Google Insights for Search? It lets you drill down into regions, and Miami is definitely one you can drill into. Doing a combo of Insights and AdWords should give you a good idea of your keywords and their volume. And actually, Insights will only show you terms that get enough traffic to be relevant.
-
RE: How would one get on the Huff Po's "Around the Web" list?
While I can't say for certain (and Google searches were completely unhelpful), I'd bet that these links come from partnership / content share deals that HuffPo (or AOL) has made with the sources that show up in those links.
-
RE: Ajax pagination and filters for ecommerce site
It all depends on how you code the ajax and what you're looking to get indexed on your site. Vague enough?
The major things you want to make sure of are 1) you're not using ajax when you could be using simple HTML/CSS; 2) that you're not using ajax on things you want to be indexed; and 3) that the ajax doesn't make it look like you're cloaking things from Googlebot.
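To illustrate point 1, a "load more" link can be plain HTML that crawlers can follow, with the ajax layered on top. This is just a hypothetical sketch; the URL, class name, and handler are all made up:

```html
<!-- Plain, crawlable pagination link: works without JavaScript -->
<a href="/products?page=2" class="load-more">Load more products</a>

<script>
  // Progressive enhancement: intercept the click and fetch via ajax,
  // but the underlying href stays crawlable for Googlebot.
  document.querySelector('.load-more').addEventListener('click', function (e) {
    e.preventDefault();
    fetch(this.href)                            // same URL the crawler sees
      .then(function (r) { return r.text(); })
      .then(function (html) { /* append the new products to the page */ });
  });
</script>
```

Because the same URL serves both the crawler and the ajax call, you also sidestep the cloaking worry in point 3.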
-
RE: Adding 'NoIndex Meta' to Prestashop Module & Search pages.
I'd implement canonical tags for your duplicate content problem instead of noindex tags. This is the recommended practice for duplicate content.
As for pages that you don't want indexed: when you use robots.txt to block them, Google can and will still show the bare URL & title in SERPs. To stop this, you need to put a meta noindex tag on every such page.
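For reference, the two mechanisms look like this (paths are placeholders):

```html
<!-- robots.txt blocks crawling, but the URL can still show in SERPs: -->
<!--   Disallow: /private/ -->

<!-- To keep a page out of the index entirely, put this in its <head>.
     Note: Googlebot has to be able to crawl the page to see this tag,
     so don't also block the same page in robots.txt. -->
<meta name="robots" content="noindex">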
-
RE: Embed elements on my page help decrease my bounce rate?
Most embedded content lives on the same page as your code, so it will not affect your bounce rate. Say I embed a YouTube video on a page: it's still just one page, and my bounce rate will be the same. If you come to my site on that page, watch the video, and leave, you still count as a bounce.
Things that can drive your bounce rate toward zero, because two pageviews get recorded for a single user view, include iframe content, ajax tracking calls, duplicate tracking codes, or some kind of 3rd party vendor software that calls more than one page. Of course, the site will still have bounces; the analytics just won't be able to track them.
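As a sketch of the duplicate-tracking-code case, here's what the bug looks like in the legacy ga.js async syntax (the account ID is a placeholder):

```html
<!-- If _trackPageview fires twice on one load, every visit registers
     two pageviews, so the page's reported bounce rate drops toward zero. -->
<script>
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-1']);
  _gaq.push(['_trackPageview']);
  _gaq.push(['_trackPageview']); // duplicate call -- this is the bug
</script>
```

This usually happens when the tracking snippet ends up in both a template and an individual page.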
-
RE: ASP.Net How to Allow Google to Skip a Disclaimer Page
You want to use a nofollow & a noindex tag on it. Here's a guide for doing it in ASP.NET.
-
RE: Schema Tags Configuration - Ecommerce Category Pages
I believe it's because of the itemscope="itemscope" call, which is saying that the area is another item. itemscope is a boolean attribute, so leave out the ="itemscope" part and use it bare.
Example:
<a href="/shoes/<xsl:value-of select="translate(/page/subCategory[Name = $subCat]/Name,'ABCDEFGHIJKLMNOPQRSTUVWXYZ ,','abcdefghijklmnopqrstuvwxyz-')"/>/blue-denim-castaway-flip-flops/300">Blue Denim Castaway Flip Flops</a> £30.00
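For comparison, here's a rough sketch of what bare itemscope microdata can look like on a product listing. The path is simplified and the itemprop choices are just illustrative, not taken from your site:

```html
<!-- itemscope is a boolean attribute: include it bare, with no value -->
<div itemscope itemtype="http://schema.org/Product">
  <a itemprop="url" href="/shoes/flip-flops/blue-denim-castaway-flip-flops/300">
    <span itemprop="name">Blue Denim Castaway Flip Flops</span>
  </a>
  <!-- Nested itemscope here is correct: the offer really is a separate item -->
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">£30.00</span>
  </span>
</div>
```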
Hope this helps!
-
RE: Count of words in all urls in a subdomian
So I did some searching, and the only tool I found is a bit manual to use, but it will give you a running sum of your count, though it only counts one page at a time: http://www.kwintessential.co.uk/translation/website-wordcount-tool.php
-
RE: API for testing duplicate content
While I don't know of an API that does that, you can set up your site using the SEOmoz tools, and our Crawl Diagnostics section does look for duplicate content.
-
RE: SEO pluses/minuses of Content used in pop-up windows
For SEO, it all depends on how you code it. If you want the landing page to have the SEO, you need to make sure the content is on that landing page, not a separate page being called. Lightboxes typically are coded all on the same page.
Definitely pay attention to user experience stuff, like page speed and compatibility with different browsers.
Page speed: customers expect your site to load in less than 2 seconds, and (on average) abandonment rates hit 40% when a page takes 3 seconds to load.
For compatibility, make sure to check out your analytics to see what devices/browsers the bulk of your customers are coming in on and test those.
-
RE: Count of words in all urls in a subdomian
Hi Luis,
Can you clarify your question? I'm a bit confused at what you're asking. Are you talking words in the URL or word count on pages on the subdomain?
Thanks!
-
RE: How to allow one directory in robots.txt
Yes, you can set it up like this:
Disallow: /user/
Allow: /user/password/
And that should do it!
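If you want to sanity-check rules like that before deploying, Python's standard library ships a robots.txt parser. One caveat I'm accounting for here: Python's parser applies rules in file order (first match wins), while Google uses the most specific matching path, so listing the Allow line first keeps the two behaviors consistent:

```python
from urllib import robotparser

# Allow listed before Disallow so first-match parsers agree with
# Google's longest-match handling.
rules = """\
User-agent: *
Allow: /user/password/
Disallow: /user/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/user/password/reset"))  # True: allowed
print(rp.can_fetch("*", "/user/profile"))         # False: disallowed
```

Swap in your own paths to confirm a rule set does what you expect.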
-
RE: Duplicate content and blog/twitter feeds
Most feeds, like Feedburner, and blogs solve this issue with the canonical URL tag. Your post is marked as the canonical one and your feed a copy, which eliminates the duplicate content issue. You might also dig this post about advanced canonical tags.
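The tag itself is one line in the page's head; the URL here is just a placeholder:

```html
<!-- On the original blog post, and on any syndicated copy of it,
     point the canonical at the original's URL so engines credit it
     and drop the duplicates. -->
<link rel="canonical" href="http://www.example.com/blog/my-post/">
```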
Twitter is a different issue. Google doesn't treat Twitter like a website, but like a social platform, and you can only post 140 characters there anyway. That said, I encourage you not to just pipe a feed into Twitter, but to craft each tweet to be interesting for your audience, or a call-out for them to actually read the post. I wouldn't worry about Twitter as duplicate content at all.
-
RE: Recommended marketplace for SEO
Check out our SEOmoz LinkedIn Group. People add job listings all the time, and I'm pretty good at keeping it updated.
-
RE: HTML E-mail Preventing Link Request?
Check out this post on email deliverability from 37 Signals. They have a list of great tools, including checking if you've been blacklisted.
-
RE: Counting up local reviews
The only service I can think of that would do anything like this -- as they're a full service local pages creation/monitoring/etc service -- is Yext. However, I don't think they do anything to compare you to your competitors.
What I'd do is build a monitoring system yourself. Instead of trying to track every single review service out there, focus on the review sites with strong signals: Google Places, Yelp, etc. Track those, as they're bound to carry more weight than random ones. Also keep an eye out for strong sites that are local to your area.
-
RE: How to get listed on Bing TV
It looks to me like the "web" listings on Bing's TV section always show only the top 2 results, which seem to consistently be the show's official website and then Wikipedia. So if you can get the #2 ranking on that SERP, it looks like you can be included on Bing's TV page.
-
RE: OSE Link Report Question.
Hi Seb,
"Number of Links" is the total count of links from all websites. Two links from the same website would be counted twice.
Hope this helps!
Erica
-
RE: Should i Change On Page Optimization ?
It all depends on where you want to drive your traffic and what you think your customers will get the most value from. Even with tweaks, your homepage may never be as relevant for "computer monitoring software" as your pc-monitoring-software page. Yes, SEO is important, and it seems to me that you've got the basics down; but you always want to think about what will make the customer happiest, as that's usually the best SEO answer too.
-
RE: Redirect between domains: any real number on how much link juice is lost?
From how you phrased your question, I thought you were considering moving only pages that didn't have any backlinks, not the entire subfolder. You want to keep your URLs consistent, and it's pretty easy to build 301 redirects around an entire subfolder.
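On Apache, for example, a whole subfolder can be 301-redirected with a couple of mod_rewrite lines. The folder and domain names here are placeholders, not from your question:

```apache
# .htaccess at the old site's root
RewriteEngine On
# Send everything under /old-folder/ to the same path on the new domain
RewriteRule ^old-folder/(.*)$ http://www.newdomain.com/old-folder/$1 [R=301,L]
```

Keeping the path structure identical like this means you don't have to write a rule per page.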
-
RE: Syndicated posts extracts on wordpress and impact on SERPS
You'll want to make sure that you're using the Canonical Tag so Google and others know that your site owns & is the originator of the content.
-
RE: Is it just me or is there an increase in call tracking?
Yes, you are correct that certain kinds of call tracking, with different numbers floating around online, do dilute your NAP, which you don't want for your local rankings. That said, there are definitely services out there that do tracking without handing out different numbers. One I came across was m.Call. (Note: I know nothing about them except that they say their service avoids this.)
-
RE: Articles / Balanced Profile
Google says they want a "natural" looking anchor text pattern as they don't want anyone buying links. Thus, you are correct in that they are penalizing you for having the same anchor text in your links.
You need to focus on words that people would actually search to find your site, and make sure they like what they find. If I were running www.houserestoration.com and drove people in via the anchor text "kittens," people would not like my site, as it wouldn't be relevant to their needs; they're looking for information about "house restoration," not "kittens."
Likewise, even words closer to your category might not be the best. People searching for "house" could be looking for a lot of things, but given a Google search for "house," I'd say more people are looking for House (the TV show), branches of government, or real estate, all of which have little to nothing to do with "house restoration."
You know what your customers are looking for. Use those words.
-
RE: Articles / Balanced Profile
Yes, the search engines will eventually recognize the varied anchor text. It will be a slow recognition.
-
RE: Mitigating duplicate page content on dynamic sites such as social networks and blogs.
You can disallow these sections in your robots.txt to cut all of these out. However, they can still show up as URL-only listings in search results. To completely remove them, you need to add a noindex tag to the header of each page. I'm assuming these pages are created dynamically from a template, so you should be able to add the noindex there. But be careful that you only add it to the pages you want removed!
-
RE: Redirect between domains: any real number on how much link juice is lost?
A 301 is the best possible solution for moving your site while passing along your link juice. And you want to move all of it, not just the pages with backlinks.
Many people have asserted that you can see up to a 10% dip initially. (Most people only see dips lasting from a week to a month at most.) However, over the long term, with a better site & URL structure, you should see a rise in traffic. I have not seen anyone do a predictive model on this.
Seer Interactive did this test 301 Redirect Test: How Much Link Juice are YOU Losing? which saw no rankings lost, but favorable outcomes in the long-term for the better site.
While I'd never put words in Cutts' mouth, I believe he was saying that if you can, getting your backlinks changed to point at your new site's URL is optimal. But obviously, this is not always possible. Instead, I'd concentrate on building new links to your website at its new URL. (One should always be working on getting backlinks as part of ongoing SEO anyway.)
-
RE: Re-platform effects on Page Rank
A 301 is the best possible solution. Many people have asserted that you can see up to a 10% dip initially. (Most people only see dips lasting from a week to a month at most.) However, over the long term, with a better site & URL structure, you should see a rise in traffic. I have not seen anyone do a predictive model on this.
Seer Interactive did this test 301 Redirect Test: How Much Link Juice are YOU Losing? which saw no rankings lost, but favorable outcomes in the long-term for the better site.
-
RE: Too Many On-Page Links: Crawl Diag vs On-Page
To answer #1: yes, nofollow is a valid strategy to reduce the link count for a page.
For #2, I'm not sure, but we'll get an answer for you.