Posts made by TakeshiYoung
-
RE: How many articles are ok to publish a day on my website blog?
Google does limit the number of pages it crawls and indexes based on your PageRank. So if you don't have a lot of external links and you publish 10 articles a day, they may not be crawled or indexed in a timely fashion. The higher the authority of your site, the more content Google will index, and the sooner it will do so.
-
RE: A tool to tell a websites estimated traffic
There are a bunch of sites that offer this service, such as Alexa, Compete, Google Trends, etc., but none of them are close to accurate. They can, however, provide very rough ballpark figures as well as relative popularity:
http://moz.com/blog/testing-accuracy-visitor-data-alexa-compete-google-trends-quantcast
Another option, which is more time consuming but potentially more accurate, is to do some research on the site and identify its top keywords. The Google Keyword Planner can help you with this: just pop in the URL and it will show you the keywords the site is optimized for.
Then run the keywords through a rank checker and combine that data with estimated click through rates and the search volume for the keywords to estimate how much traffic they are getting. Multiply that number by 10 to account for long tail variations, and that will give you an estimate of their total traffic. Paid tools such as Linkdex can help with this.
-
RE: Help Crawl friendliness for large site
Good point. If you don't want the filter pages crawled at all, it would be better to just block them via robots.txt. My preferred approach is to use query parameters for filters and canonicalize the filtered pages back to the original, unfiltered page.
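For example (the URLs here are just placeholders), a filtered URL like http://www.example.com/dresses?color=red would include this in its head, pointing back to the unfiltered page:
<link rel="canonical" href="http://www.example.com/dresses" />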
Another approach is to use AJAX to dynamically filter the page. This takes more programming overhead, but won't result in tons of extra pages being crawled and potentially indexed.
-
RE: Question about links on blogs
Google has devalued sitewide sidebar links over the years, but they shouldn't be hurting your SEO. However, if you want to pass the maximum amount of link value, you'll want to include links from within relevant pieces of content rather than throwing them in your navigation.
-
RE: Help Crawl friendliness for large site
Nofollowing internal links is almost never a good idea. You're just wasting valuable link juice.
Google actually just recently came out with a good guide for how to handle ecommerce navigation with lots of product options: http://googlewebmastercentral.blogspot.com/2014/02/faceted-navigation-best-and-5-of-worst.html
Also, if you have a lot of categories in your store, try to show navigation that is only relevant to the section of the store the user is in. For example, if the user is in the Flowers section, don't show a ton of links for Cellphones.
-
RE: Duplicate content on yearly product models.
The exact threshold Google uses to determine duplicate content is tricky to pin down.
The more important question is, are you noticing a problem? Are both pages being indexed (2013 & 2014)? When you search for the 2014 model in Google, is it showing up in the search results or is it being filtered out as duplicate content? If your content isn't being indexed or isn't ranking, then you have a problem.
Panda can also be an issue, but only if a large portion of your site is duplicated. Is this model upgrade process something you apply to 1 or 2 products, or are you talking hundreds? What proportion of your site is nearly duplicate content compared to original content? If the percentage is too large, you could be at risk of Panda.
If this is only for a couple of pages, you can always just take a little time to re-write the description from last year to improve its uniqueness, and also bolster the page with unique content like user reviews and new photos & videos of your product. Once a product model is discontinued, you can also 301 redirect it to the newer models.
-
RE: Where do the Facebook star ratings come from?
For location pages, Facebook users can give the location a star rating on their page (if you visit the page, you should see it in the right-hand column). Facebook will also prompt users to rate places they've visited in the right sidebar. I'm not aware of anywhere you can go to view the individual reviews; the rating just appears as stars at the top of the page.
-
RE: Sub Domains and Robot.txt files...
The way that Google finds robots.txt files is by taking your URL, and adding /robots.txt to it. So a good way to see if the robots.txt file is affecting your subdomain is to go to subdomain.domain.com/robots.txt. If the file exists, then it is affecting your subdomain. If it doesn't, then it's only active on your main domain.
Getting indexed is a function of having unique content and PageRank, so make sure your subdomain has unique content and links if you're having trouble getting it indexed. Submitting a sitemap is no guarantee that Google will index your site.
-
RE: Headers & Footers Count As Duplicate Content
Google can easily identify navigational elements that appear on every page, and will ignore those for the purposes of identifying duplicate content. You don't need to worry about your navigation being flagged as duplicate content; it's a standard feature of most websites.
That being said, you can be penalized for having lots of pages with little or no content (Panda). You'll want to make sure all the pages on your site provide some kind of actual value.
-
RE: How To Rank For Easy Keywords (24-26 keyword competitivness)
If you want to improve your domain authority, link building is the way to go. Here is a great list of mostly white-hat ideas that should get you started:
http://pointblankseo.com/link-building-strategies
As far as getting specific pages to rank, try approaching other publishers in your niche and seeing if you can guest post for them. Submit your infographics to directories such as visual.ly, and send them to bloggers in your niche to see if they might include them on their sites.
-
RE: Grabbed up some branded domains now what
You can get a refund for the domains within 5 days of purchase. Get the domains refunded and stop using GoDaddy.
-
RE: Is it me? Or is the spam getting worse on professional social networking sites?
I personally haven't noticed that much spam, although I have seen a lot of Linkedin spam in Google. The site has such a high domain authority that it ranks pretty highly for pretty much any query.
-
RE: Should I have as few internal links as possible?
I'm personally not a fan of mega menus (this article explains why), but I think they can be OK on the homepage. They become more problematic on the inner pages, however. If someone is in the "Energy" section of your site, they don't need to be able to get to every page of the "Funeral Planning" section. So if you can make the menus more contextual based on what section of the site the user is on, that will provide the best experience for users, preserve link equity, and improve topic relevance.
-
RE: Blog.domain or domain.com/blog
Domain.com/blog is generally preferred, but if you can't do that then a subdomain should be fine.
-
RE: Cannot work out why a bunch of urls are giving a 404 error
Hard to say without looking at the actual page, but http://www.domainname.com/category/thiscategory/page/2/ looks like a paginated category page (when you're in a category and looking through older posts). There is probably a relative link somewhere on the page (a relative path rather than an absolute one) that is causing the URL to be malformed. So take a look at the category template, and replace any relative paths on those pages.
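As a hypothetical illustration of the kind of markup that causes this, a relative href resolves against whatever page it appears on, while a root-relative href always resolves from the domain root:
<a href="category/thiscategory/page/2/">Older posts</a> <!-- relative: can produce broken URLs depending on the current page -->
<a href="/category/thiscategory/page/2/">Older posts</a> <!-- root-relative: resolves the same way everywhere -->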
-
RE: Should I have as few internal links as possible?
Yes, try to reduce the number of links you have on all your pages while maintaining a good user experience. Contextual navigation is a great way to achieve this. For example, if someone is exploring the "Energy" section of your site, then they probably don't need 50 links about "Funeral Planning".
Try to keep the links in the nav relevant to the section of the site they are on, while providing access to the top level categories if the user wants to visit a different section. This will reduce the number of links on the page, and also improve the keyword relevance of each of the individual categories.
-
RE: Going after GWMT queries - Smart or Risky???
Yes this is a solid strategy.
Optimize your titles and descriptions for keywords that have good position & low CTR. You can also try adding Google authorship (if it's a content page) and semantic markup for reviews (for product pages) to get rich snippets displaying in the SERPs.
Targeting keywords that are ranking in the 5-15 range is also a good idea. Organize those by search volume, and do some onpage optimization. A few tweaks could boost their rankings and give you more traffic for a little bit of work.
-
RE: Text to HTML Important ?
That probably means they just have less html code, since it's a ratio (text/code).
Text to code ratio has no impact on SEO, and I'm surprised some SEOs still obsess over it. If you have an incredibly code heavy page, Google may have trouble crawling it, but that's more about the sheer amount of code vs any kind of ratio.
That being said, you should try to keep your html code to a minimum, as excess code will increase the file size of your page, leading to a longer load time which is a poor user experience and can have a negative impact on rankings.
-
RE: Robots.txt and Multiple Sitemaps
Yes, what you have is the proper format. The best way to submit sitemaps, of course, is to submit them via Google & Bing Webmaster Tools.
Sitemaps won't have much impact on your site unless you have a really large site, so I wouldn't focus on them too much. The best way to get content crawled & indexed by Google is good internal link structure and authoritative external links.
-
RE: What should a small company with a difficult SEO/SEM challenge do?
I gotta say whatever company you were engaging with must have sucked, because you could get anything to rank back in 2000.
I've worked with a number of smallish businesses, and the SEO landscape has gotten tough in recent years, unless you're targeting local. Your competition is every other website in the world that sells the same thing that you do.
It's really difficult to rank a website with thousands and thousands of products without a very high domain authority. So you need to focus. Identify the keywords with the highest SEO potential for you. Here is a quick formula: go into Google Webmaster Tools, and identify all the keywords you rank between position 5-15 for, then sort those by search volume and/or your margin. Then evaluate the competitiveness for those keywords by looking at the sites that rank above you. Focus on the keywords that have the weakest competition, and you should see the greatest amount of return for the least amount of effort.
Product reviews are huge for ecommerce. They provide unique, user-generated content that helps you stand out from competitors selling the same product, and you can add semantic markup so that you get rich snippets (review stars) showing in the search results, which increases CTR. Incentivise reviews by offering coupons to your customers for leaving reviews.
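As a rough sketch (the product name and numbers are made up), the schema.org markup for review stars on a product page can look something like this:
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> out of 5 based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>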
In the long term, think about what you can be the best in the world at, because that is what it will take to stand out in today's competitive landscape. That might be prices, that might be content, that might be customer service. Focus on that and start building a brand. It can take a long time to see results with SEO, so you may want to focus on other lead gen methods until you can build up your business and can afford to spend more time/resources on SEO.
-
RE: What metrics does the fb:admins Facebook Open Graph tag actually provide?
Adding fb:admins will allow you to see Facebook Insights for your domain, which shows you things like how many times your site has been liked & shared on Facebook. The fb:admins tag also serves other functions like allowing the admins to moderate comments (if you're using Facebook comments) and automatically creating pages in Facebook when the admin likes a specific page. So the tag does a lot of things.
You can only use user IDs for the fb:admins markup, not page IDs.
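For reference, fb:admins is a meta tag in the head of your pages; the ID below is just a placeholder, and multiple admin IDs can be comma-separated:
<meta property="fb:admins" content="1234567890" />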
-
RE: Googlebot stopped crawling
Very hard to say without more details. Does your site have unique, high quality content? If it's just duplicate content, Google may crawl it but won't necessarily show it in the SERPs.
Also, what does your backlink profile look like? Google allocates crawl budget based on your PageRank, so if Google isn't crawling all your pages, then you will want to acquire more external backlinks.
-
RE: Launching a Reseller
Being a reseller on the web is tough. You're basically competing against Amazon.com and every other reseller out there that sells the same thing.
Definitely write unique product and service descriptions for everything if you can afford it; otherwise Google will just filter you out of the search results and display your more established competitors ahead of you. If you don't have the time or resources to re-write everything, then focus on the top 10% of products in terms of traffic & profitability, then expand from there (Rand recently did a good Whiteboard Friday on the subject).
Try to add value beyond what the manufacturer provides on their website. Getting user reviews is a great way to have user generated unique content that's useful to users. Incentivise reviews by giving out coupons. Create filters and comparisons and other features that differentiate your shopping experience. Think about what you can be the best in the world at, because that's what it will take to compete on a global scale.
-
RE: URL String Tracking Question--Need help!
This is not a Google Analytics tracking code (you can find those specifications here).
This looks more like query parameters that were tacked on by some internal search feature or submission form. Do you have any forms or filters on your site? If so, that could be the cause. It could also be the case that some external site is linking to you and tacking on those parameters for whatever reason.
To solve this issue, just add a canonical tag to the header of all of your pages. This is an SEO best practice that can prevent things like query parameters from messing up your URLs and resulting in possible duplicate content issues. You can also block these parameters explicitly in Google Webmaster Tools, but I would start with the canonical approach first.
-
RE: First Link on Page Still Only Link on Page?
The first mention of a link on a page is the most important in the sense that Google looks at the anchor text of the first link, so you want to make the first mention contain relevant anchor text instead of something generic.
That's not to say subsequent mentions of a link are useless. Every link on a page still gets PageRank distributed to it, so the more times you mention a link on the page, the more link value it's going to receive. Also, from a user perspective it can be helpful to have the same link repeated throughout the page in the body and footer, even if it's already in the top nav.
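As a hypothetical example, if the same URL is linked twice on a page, the first link is the one whose anchor text counts, so give it descriptive anchor text:
<a href="/blue-widgets/">blue widgets</a> <!-- first link: this anchor text is the one Google uses -->
<!-- ...rest of the page... -->
<a href="/blue-widgets/">click here</a> <!-- later link: still receives PageRank, but its generic anchor text matters less -->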
-
RE: Is it a good idea to remove old blogs?
You may find this case study helpful, from a blog that decided to do exactly that:
http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
-
RE: Flash site ranking well for a competitive keyword
I would guess because it is on the .ca TLD. Google's international search engines favor sites that are on country specific TLDs. Also, the domain is a partial exact match, which still gives sites a small boost when it comes to ranking.
The interactive Flash site may also have positive user metrics (time on site, # of pages visited) which could also help boost its rankings.
-
RE: Did .org vs. .com SEO importance recently changed?
Google treats the 3 main TLDs (.com, .org, .net) equally in terms of rankings. As long as you are using any of those, you should be fine (country-specific TLDs like .co.uk can impact your search results, as can spammy TLDs like .info).
You should not use redirects to switch from a .com domain to .org as 301 redirects result in a small PageRank loss.
-
RE: Impact of number of outgoing links on Page Rank of an optimized page?
Having many links on a page will not reduce the PageRank of the page that it's on.
However, it will reduce the amount of PageRank being passed to the individual links on the page, so you should try to keep the number as low as you can while maintaining a good user experience. A general rule of thumb is to keep the number of links to 100 or less, although many ecommerce sites have many more (the typical Amazon.com page has 400+ links).
Linking to high-quality, relevant external sites is believed to improve the relevancy of your own page. These should not be nofollowed. In general, it should not be necessary to use nofollow for external links unless you are linking to a direct competitor (but then, why would you be linking to them?) or in the case of UGC where people may leave links to low quality sites.
In general, just use your common sense and link to any relevant internal or external pages that you think will enhance the value of your content. Don't use nofollow. It's good to use target="_blank" to have external links open in a new window.
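For instance (example.com is a placeholder), a normal followed external link that opens in a new window looks like:
<a href="http://www.example.com/useful-resource/" target="_blank">useful resource on example.com</a>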
-
RE: Is my copy too keyword rich?
Most of it reads fine to me, but the line "working together to provide San Francisco video production services that are truly inspired" sounds a bit forced; "video production services in San Francisco" would be more natural. Google is getting smarter at understanding the meaning of content, so it's not necessary to constantly repeat your keywords in an unnatural way.
As a design suggestion, I would recommend adding your contact info as well as your social media icons to the footer as well as the header. Unless people immediately know they want to contact you or follow you, they are going to consume your content first, then look for contact info. It's easier for them if that's conveniently in the footer rather than them having to scroll all the way back up to the top of the page.
-
RE: Testing.
First of all, let me just say Moz is great and offers many features that the companies you mentioned don't provide, and at its low price point it is a great complementary tool to have.
I have used both BrightEdge and Conductor, and both are overpriced for what they provide, IMO. If you're a big brand they might make sense, but for a smaller company I would not recommend them.
One service I would recommend is LinkDex. They offer a lot of features in terms of rank tracking and competitor analysis, as well as tools aimed at identifying authors and influencers in your niche that are great for content marketing and outreach:
-
RE: Duplicate Content Issues on Product Pages
BJS1976 makes some good suggestions.
Another option is to create a category type page that lists all the product variations on it, then canonicalize each of the individual products to the category page. That way, you still have multiple product pages, but as far as Google is concerned you only have 1 page with the content on it.
-
RE: Authorship without Google+ - Ideas and strategies
The Author schema does not do anything in any search engine yet, as far as I know.
In order to take advantage of Google Authorship, you need a Google+ profile. Google+ is Google's identity platform that they are using to identify individual authors. Without Google+, your image will not show up in the SERPs. Without Google+, your friends will not see your content appearing higher in the SERPs. Without Google+, you will not be able to take advantage of any changes Google makes to their algo with regards to Author Rank.
In short, no, you cannot benefit from Google Authorship without a Google+ profile.
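For reference, authorship is typically set up by linking your content to your Google+ profile, either with a link element in the head or a byline link with ?rel=author (the profile URL below is a placeholder):
<link rel="author" href="https://plus.google.com/112345678901234567890" />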
Bing is also experimenting with a type of authorship which combines Facebook profiles and Klout scores, but again this has nothing to do with the schema.org markup.
-
RE: Google+ Page Question
It looks like #2 (+WhiteboardCreations) is a Google+ local page, while #1 is a regular page. If you have the option to delete the page, then it's likely that someone on your team with access to that e-mail address created the page at some point. If you used to use the old Google Local, then Google may have automatically created #2 for you, which you now manage.
It should be safe to delete the first page without any consequence since it doesn't appear to be connected to anything. It should warn you about any potential content that may be deleted when you go to delete it. You could also just keep the first page around if you want to, although that could get confusing. As you build out the followers for your main page, it should start to outrank the other page in the search results.
-
RE: Does Yext Directory Listings Help with Links?
Directory links don't pass much link value, but it can still be useful to be listed in the top directories for local citations. Yext can automate the process if you're short on time & resources, but you can also submit to the directories manually for full control of the process.
They even provide a list of all the directories they submit to:
-
RE: Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
BV does provide a newer format for their reviews, if your server allows server side scripting such as PHP. I believe it's called "Cloud SEO". This is the simplest solution.
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As far as reviews.mysite.com goes, you want to get that de-indexed as soon as you get the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
RE: Will multiple domains from the same company rank for the same keyword search?
As you yourself pointed out, there are perfectly legitimate reasons for owning multiple domains that all rank for the same term (ex: nike.com, nikeinc.com, nikeplus.com, etc). Not sure why you are arguing with what I wrote.
-
RE: Will multiple domains from the same company rank for the same keyword search?
Where in Google's TOS does it say that ranking multiple domains for the same phrases is against the guidelines? My original answer is correct: Google will not penalize you for owning multiple domains unless you are being spammy about it.
-
RE: Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
RE: Content question about 3 sites targeted at 3 different countries
Yes, you should be fine in that case. Google is not going to penalize you.
-
RE: Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the reviews subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
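As a rough example (the verification token is a placeholder you'd get from your own Webmaster Tools account), the header of the review subdomain pages would include something like:
<meta name="robots" content="noindex" />
<meta name="google-site-verification" content="your-verification-token" />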
-
RE: Content question about 3 sites targeted at 3 different countries
Google's Matt Cutts recently responded to a similar question. Basically, Google does not generally penalize sites for having duplicate content unless they're being spammy about it. However, Google will only show one version of the same content to searchers. So your sites won't be penalized, but Google isn't going to show the US, UK, and Australia sites in the same SERPs for the same query for the same content. The geo targeting should also help.
-
RE: Exact Match Anchor Text - How Can These Guys Be Getting Away With It?
How are you checking their backlinks? Keep in mind that even the best link checkers can find only a fraction of the links pointing to a site. It could be that there are more links going to the site that you're not aware of, that Google knows about. Ahrefs has the largest index of links, in my experience.
It's also possible that the site has a lot of other positive signals going to the site. Generally, the more authoritative and trustworthy a site is in Google's eyes, the more it can get away with in terms of spammy tactics without getting penalized.
You also don't know which of those links are helping with the rankings, if any. It's entirely possible that the company has already disavowed most of those links in Webmaster tools, or Google is discounting them automatically. Or maybe they will end up getting penalized in the next Penguin update.
In short, don't rely too much on competitive backlink analysis. It can give you a few ideas of links to go after, but it doesn't always give you the full picture.
-
RE: Will multiple domains from the same company rank for the same keyword search?
Google won't actively penalize you for owning multiple domains, unless you are going out of your way to be spammy about it. However, you will need a lot more resources in terms of link building, social media promotion, content production, etc.
In general, the best practice from an SEO perspective is to have a single site with all the content living in subdirectories of the domain. Subdomains are considered in many cases to be separate sites, so you would run into the same issues as having multiple domains.
-
RE: I am switching shopping cart providers, and I cannot keep the same URL's we've had for the past 10+ years.
There is no limit to the number of 301 redirects a site can have. A few hundred pages is relatively small. Just make sure they are implemented properly.
-
RE: Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
We use BazaarVoice reviews for our ecommerce site too. What we do is, right below the iframe reviews, include a link that says "click here to see more reviews". When you click the link, it opens up a div with the html version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact user experience much.
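A very rough sketch of that approach (the IDs and copy are made up) might look something like this:
<a href="#" onclick="document.getElementById('html-reviews').style.display='block'; return false;">Click here to see more reviews</a>
<div id="html-reviews" style="display:none;">
  <!-- plain HTML versions of the reviews go here, present in the page source for crawlers -->
</div>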
BazaarVoice can also do html reviews that are not iframed if you have a server that can handle server side scripting like PHP (which unfortunately our legacy Yahoo store does not).
-
RE: Duplicate Titles caused by blog
Hi Josh, I'm not sure what CMS you are using, but there should be some way to update the template for your archive pages, which will update all your archive pages. Just add this code in the <head> section of the template:
<meta name="robots" content="noindex,follow" />
-
RE: Any arguments against eliminating all (non-blog) subfolders?
I prefer shorter URLs, but it depends on the size and structure of your site. If you have tens of thousands of pages, then using subfolders to organize things probably makes sense. Likewise, if your site has 5 main categories, it may make sense to have 5 subfolders to give your site structure.
If you are changing directory structure for an existing site, be aware that you will likely see your rankings drop as Google tries to figure out the changes. 301 redirects do not pass all the link value, so you may need to do a bit of link building to get back up to your current level.
If it's just a few pages you want to redirect, you could also create a shortcut URL such as "domain.com/primary" that 301 redirects to the full URL ("why/specialists/primary-care") so that you can have a short URL that customers can memorize without sacrificing your site structure.
-
RE: Backlinks embedded in posts or backlinks in sidebar?
I would say in general links from within a post/article carry more weight than sidebar links. Sidebar links have been devalued quite a bit by Google, and with Penguin you could even potentially be penalized for a sidebar link on a site with thousands of pages. With that being said, I would not turn down a sidebar link from a high authority site if offered one.
PR can be deceptive because it hasn't been updated in nearly a year. So a site can still be high quality even if the toolbar PR shows a 0. It's always nice to get links from sites that are already ranking for your keyword, but make sure to do a thorough backlink analysis first to make sure they aren't spamming their way to the top. If the #1 result is using black hat tactics, it could take your site down with them when they inevitably get penalized down the line.