What is Google's minimum desktop responsive webpage width?
Fetch as Google for desktop is showing a skinnier version of our responsive page.
Hi,
We are considering using separate servers for when a Bot vs. a Human lands on our site to prevent overloading our servers. Just wondering if this is considered cloaking if the content remains exactly the same to both the Bot & Human, but on different servers.
And if this isn't considered cloaking, will this affect the way our site is crawled? Or hurt rankings?
Thanks
This is probably the answer you are looking for.
Rather than just focusing on page-level metrics, you should also look at domain-level SEOmoz metrics such as Domain Authority.
For example, in your screenshot you have a higher PA than your competitor. However, if your DA is lower than your competitor then there is a high possibility that the links you are acquiring are low quality and don't provide much value to your domain as a whole.
As Maximise suggested, you will need to dig a little deeper and determine what types of links your competitor is getting and you are not.
Delete everything under the following directives and you should be good.
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/
As a rule of thumb, it's not a good idea to use wildcards in your robots.txt file - you may inadvertently be excluding an entire folder.
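To see just how far those wildcards reach, here's a rough Python sketch that approximates Google-style robots.txt matching (an asterisk matches any run of characters, and rules match as path prefixes). The sample paths are made up purely for illustration:

import re

# The wildcard Disallow rules quoted above.
rules = ["/*/trackback", "/*/feed", "/*/comments", "/?", "/*?", "/page/"]

def blocks(pattern, path):
    # '*' matches any run of characters; the rule is anchored to the start of the path.
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, path) is not None

# Hypothetical URLs you might not expect to be caught:
for path in ["/category/widgets/feed", "/blog/post-1/comments",
             "/page/2/", "/shop/?color=blue", "/?s=boat+covers"]:
    hit = [r for r in rules if blocks(r, path)]
    print(path, "-> blocked by", hit if hit else "nothing")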
No, it's still a redirect. See attached image clearly stating a 302 Temporary Redirect from http://www.eco-environments.co.uk/solar-power/ to http://www.eco-environments.co.uk/solar-power/default.phuse
If your developer still doesn't believe you then have them verify it themselves with this web based HTTP header check tool ~> http://www.webconfs.com/http-header-check.php
Yes, using a 302 Temporary Redirect is hurting your page authority because that type of server response code does NOT pass any link juice. To preserve your inbound/internal link equity you want to use a 301 Permanent Redirect instead - with a 301 redirect you retain about 90% of the link value.
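If you'd rather check the response code from a script than a web tool, here's a minimal Python sketch using the third-party requests library (the URL is just the example from above - swap in whatever page you want to test):

import requests

url = "http://www.eco-environments.co.uk/solar-power/"

# allow_redirects=False lets us inspect the redirect itself instead of the final page.
response = requests.get(url, allow_redirects=False)

print(response.status_code)              # 301 preserves link equity, 302 does not
print(response.headers.get("Location"))  # where the redirect points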
I'm not sure to what extent your website is being blocked with the robots.txt file but it's pretty easy to diagnose. You'll first need to identify and confirm that googlebot is being blocked by typing in your web browser ~> www.mywebsite.com/robots.txt
If you see an entry such as "User-agent: *" or "User-agent: googlebot" being used in conjunction with "Disallow", then you know your website is being blocked by the robots.txt file. Given your situation you'll need to go through a two-step process.
First, go into your WordPress plugins page and deactivate the plugin that generates your robots.txt file. Second, log in to the root folder of your server, find the robots.txt file, and change "Disallow" to "Allow". That should work, but you'll need to confirm by loading the robots.txt URL again.
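If you'd rather script the check than eyeball the file, here's a minimal sketch using Python's built-in robots.txt parser (replace the domain with your own; note that it handles plain Disallow rules but not Google-specific wildcards):

from urllib.robotparser import RobotFileParser

parser = RobotFileParser("http://www.mywebsite.com/robots.txt")
parser.read()

# False here means the current rules are blocking Googlebot from the home page.
print(parser.can_fetch("Googlebot", "http://www.mywebsite.com/"))
print(parser.can_fetch("*", "http://www.mywebsite.com/some-page/"))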
Given the limited information in your question I hope that helps. If you run into any more issues don't hesitate to post them here.
I believe one of the biggest differentiating factors to your question is relevance. This takes into account things like anchor text, content of the link source and link destination - just to name a few.
For example, it will look natural for a cooking utensil website to link to similar verticals such as a local bakery or recipe website. On the other hand, if your website is about the advancement of nuclear fusion, there is no real reason for you to link out with the anchor text "buy discount bath robes". This will definitely raise a red flag on the search engine's side.
In a nutshell, a link exchange between real businesses will "work" if they are contextually relevant; those that aren't will be detected and subsequently devalued.
Hello Knut,
Below are a few articles and Whiteboard Fridays to give you a quick primer on SEO. There are definitely more of these out there, so don't hesitate to ask Google!
WBF - International SEO: Where to Host and How to Target
YOUmoz - International SEO Part 2
mozBlog - Geolocation & International SEO FAQ
I've had a little experience SEO-ing websites in Japanese and the landscape is completely different. For starters, Yahoo is actually the dominant search engine but they use the Google algorithm - so just focus on Google's main ranking factors.
Since you don't know any Japanese you'll need someone VERY fluent in the written language so that you can account for both kanji AND kana. You'll need to make a business decision on whether you want to write your keywords in one form or the other - keyword research would definitely help here.
Don't be surprised if most of your visitors are coming from mobile - that's just how the technological culture is in Japan. Most people surf the web using their cell phones (since they are light years ahead of us) and not so much from their computer.
Last but not least, create great content to attract links. Your easiest links will come from those your website/business already has a relationship with. It comes from the Chinese concept of "guanxi" which literally means "relationships" and is an extension of the culture.
I hope that helps you get started and good luck!
Hi Darren,
To answer your question on how you can leverage microdata for your client's website: in a nutshell, just do it. Feel free to refer to Schema.org for documentation and examples.
As far as implementing microdata goes, I highly doubt it will help you "win" in SEO. Why? Mainly because it's a signal for the major search engines to highlight the content, context and relevancy of the page - not a crucial ranking factor. Amazon uses microdata but that's not the reason they are dominating the SERPs.
Let us know if you have any other questions and we'll be glad to help!
Hi Anchorwave,
You can check out a post published earlier this year by SEER Interactive. It's my favorite article of all time in helping me determine whether a link is quality or not. Feel free to check out the 25 ways to qualify a link.
Cheers!
No, unfortunately there is no way to prevent search engine indexation from within the <body> of your web page. As you mentioned earlier in your question, you can either utilize the meta robots exclusion tag or the robots.txt file.
If you are REALLY intent on blocking indexation of your promotional copy and can only work within the <body> section, perhaps you could consider using an iframe? For example, create a totally new page with your promotional copy, block it with robots.txt, and make sure you have NO links pointing to it. Then, on your promotional page, use an <iframe> tag to pull in the content from the robots.txt-blocked copy.
Honestly, I'm not sure if it'll prevent indexation since I've never tried it before - it's just an idea.
Good luck and tell us how it goes if you do! =]
And if for some reason you don't happen to like Chrome you can check out Page Speed Online by Google.
I know this may sound obvious but I thought I would ask anyways: are you sure your page was indexed?
To check if this is the case, go to Google or Bing/Yahoo and type in site:yourwebsiteURL. If the page in question does NOT show up then you don't have a problem.
However, if it does, then I would urge you to quickly register your client's website with GWT and request a URL removal. Also, if you want the page to get de-indexed "faster", I would recommend taking down the page altogether and implementing a 301 Permanent Redirect to a relevant page. If you don't have a relevant page, then serve up a header response of 404 Not Found.
Of course, if that is too technical and you don't have development resources then you can just delete all the content on the page (or insert a "coming soon" image) and no one would be the wiser. =]
I hope that helps!
Two tools immediately come to mind.
Xenu Link Sleuth which is fast and free but for PC only.
Screaming Frog is very good as well and was built specifically for SEOs. The free version limits you to 500 URLs, but there is a paid version that unlocks all restrictions. Screaming Frog runs on both PC and Mac.
I use and love both so I would suggest testing them out and see which one you like best. Happy crawling!
Hi Rick,
Great job taking the initiative in trying to fill this information gap around an (unfortunately) commonly overlooked disorder. It's good to know that you are pushing out some quality content on the web. Now, let's talk about SEO.
Before you make ANY changes I would strongly urge you to first check out your web analytics and get a good grasp of your inbound traffic. You'll need to create some advanced segments and do some deep dive analysis to address some important questions...
Essentially, what you are doing with this in-depth analysis is determining what you already do well and where you can improve. Your website has already been live for 6+ months, so you don't want to lose traffic in areas where you already have traction. From there, you can make a smart, data-driven decision on which pages need their title tags changed, where to add on-page copy, what new videos/content you should create, etc.
As for your question about the video categorization, I would keep the videos under Noah's Minute and sub-categorize them. The main reason is NoahsDad.com is associated with the name Noah's Minute and in essence brands your website. Maybe you can even ask your existing followers to see if they are okay with this?
Regarding misspellings, I do not think that is a good idea. If you want to portray your website as an authoritative source to users as well as search engines, everything should be written correctly. Search engines can auto-correct misspellings so you don't have to worry about that. Here is an example for "downe syndrom videos".
Lastly, ranking for highly competitive keywords is never impossible - you just need to create extremely valuable content and gain lots of links to it. For example, you can create a category for "down syndrome facts and information". Push out some high quality content that includes some myth busting, then link to your Noah's Minute subcategory videos, and you'll start building out a robust internal link structure. From there, any incoming link equity will boost your entire website.
At any rate, I hope this gives you a good head start on where to look first, but there are a TON of other things we still haven't covered. I apologize in advance if some parts don't make sense, but please don't hesitate to ask if you have questions.
Good luck! =]
Yes, absolutely but you'll need to utilize the SEOmoz Linkscape API and generate your own access key. You can check out the Excel spreadsheet BusinessHut created at http://www.businesshut.com/seo/using-seomoz-free-api-excel/
I use it all the time.
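If you have a developer handy and want to skip Excel, here's a rough Python sketch of calling the API directly. It assumes the signed-request scheme the Linkscape (Mozscape) documentation described - Access ID, expiry timestamp, and a base64 HMAC-SHA1 signature - so double-check the current API docs for the exact endpoint, parameter names and column flags before relying on it:

import base64
import hashlib
import hmac
import time
from urllib.parse import quote

import requests

ACCESS_ID = "member-xxxxxxxx"   # your credentials from the SEOmoz API page
SECRET_KEY = "your-secret-key"

def url_metrics(target_url):
    expires = str(int(time.time()) + 300)   # signature valid for ~5 minutes
    to_sign = ACCESS_ID + "\n" + expires
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode(), to_sign.encode(), hashlib.sha1).digest()
    ).decode()

    endpoint = ("http://lsapi.seomoz.com/linkscape/url-metrics/"
                + quote(target_url, safe=""))
    params = {"AccessID": ACCESS_ID, "Expires": expires, "Signature": signature}
    return requests.get(endpoint, params=params).json()

print(url_metrics("www.seomoz.org"))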
SEOmoz only has one API, which provides access to their Linkscape index - it does not include social data such as Likes, +1's and Tweets.
Also, it seems like you have development resources available to utilize the APIs of the major social platforms. Have you considered going straight to the source?
Facebook OpenGraph API - http://developers.facebook.com/
Google+ API - https://developers.google.com/+/api/
Twitter API - https://dev.twitter.com/docs
Hi Diane,
Have you tried checking out the Joomla extensions gallery? There are a ton of extensions presently available in their gallery that can help you implement the noFollow tag on designated links. Please check out http://extensions.joomla.org/search?q=nofollow+links
Hope that helps and good luck!
p.s. Linking out does NOT damage your website, but paid links/advertisements will if you don't use the rel=nofollow tag (granted, only if you get caught).
Although I don't have an explanation as to why your SEOmoz rankings vary greatly every week, I can offer some suggestions/ideas to get over this little issue.
Have you tried e-mailing the SEOmoz customer service team (help@seomoz.org)? They might be able to look into your specific case and diagnose any technical problems.
In the meantime, have you tried creating a completely new campaign to see if the same thing happens?
What about using the rank tracker that is NOT part of the PRO web app? It has weekly updates and if the same problem occurs you can try manually refreshing the rank once every day.
Lastly, a little more legwork here but you could also try running your rank checker about the same time SEOmoz updates their rankings in the web app/rank checker tool. Maybe you might see a difference there?
Best of luck!
Hi Bill,
I wouldn't set a hard and fast rule to look solely at PageAuthority because it doesn't provide the big picture. Instead, I would use it in conjunction with DomainAuthority so you can understand the authority/trust behind the referring domain and get some context for the link.
That way you aren't ruling out "good" links because they have a low PA. For example, I would rather have a PA=13,DA=100 link from Stanford.edu than a PA=56,DA=5 link from some random article directory.
Does that make sense?
"Links to external sites would cause me to lose out on link juice and would hurt me in google's eyes."
That may have been the case back in 2001, but not so much anymore. In fact, NOT linking out and hoarding your PR/link juice could actually hurt you instead.
I performed a quick crawl of your website and immediately found part of the problem - it looks like your HTML sitemap is still linking to the pages in question.
With that in mind, I would NOT recommend using the canonical tag here. Instead, I would check whether or not these pages have links pointing to them. If they don't, then just change the URLs in your sitemap to the correct locations. However, if you do have inbound links to these pages, then implement a 301 Permanent Redirect to the appropriate page.
I didn't conduct a comprehensive crawl but if you'd like your own data then please check out Xenu Link Sleuth. Also, I think inside the SEOmoz PRO campaign you can click on each individual URL and it'll show you the referring links as well.
Hope that helps and good luck!
UPDATE: Actually, the problem looks like it's coming from your source code - you still have your old URLs located on the page. A parallel issue is that 'products.php' and 'Products.php' resolve to two completely different pages, so you might want to look into 301 redirecting one to the other.
No, you will not gain additional SEO value in your links if you include the link title in conjunction with your anchor text. All it does is increase the relevancy on your page.
Link title and image title work in similar ways but unfortunately do not pass any extra SEO benefit. You will know when a website employs link/image titles by hovering over the image or link - words will display near your cursor.
And although this wasn't part of your question, image ALT text (which is different from image title text) will help your SEO efforts if you are trying to optimize for image search as it is one of the major ranking factors.
I was expecting that people who come to my site through adwords to be completely new visitors (100% instead of 67%).
Not necessarily - there are numerous possibilities for why this is happening. For example, someone clicks on your ad the first day and leaves your website after browsing it for a while. They come back the next day, perform the same search and click on the same ad (maybe because that's how they remembered you) - that would make this person a returning visitor and NOT a new visitor.
That being said, you'll need to do a deep dive analysis to figure out what's really going on. To start, try creating an advanced segment between new and returning visitors, check out the navigation path, top content, see where they fall in the conversion funnel, etc.
Hope that helps guide where you need to go!
Yes, from my link building experience over the past 11 months, I've noticed partial-match anchor text offers the best boost in rankings. It's a way to create a natural profile unnaturally...if that makes any sense? Although I have not conducted an actual experiment on this, including your company name is very beneficial as it helps brand your website as well.
For example, you can use these derivatives to get you started:
Boat Covers from [Company]
Buy Boat Covers only at [Company]
Go to [Company] for Boat Covers
Of course, there will be some who disagree with this method so at the end of the day I would suggest you test out both techniques.
Happy link building!
And to supplement THB's response...
Definition of Page Authority - http://www.seomoz.org/learn-seo/page-authority
Definition of Domain Authority - http://www.seomoz.org/learn-seo/domain-authority
How soon do you need the 404 error data? If you are okay waiting a few days until Google or SEOmoz crawls your website (or your client's), then that works.
If you need it immediately, I would highly recommend checking out Xenu Link Sleuth or Screaming Frog. They crawl your entire website (or your client's and competitors') and report on the HTTP status, title tag, h1 tag and many other things.
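If you're curious what those crawlers do under the hood, here's a minimal single-site sketch in Python (requests + BeautifulSoup). The start URL is a placeholder, it only follows internal links, and it's nowhere near as thorough as Xenu or Screaming Frog:

from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "http://www.example.com/"   # swap in the site you want to audit
seen, queue = set(), [START]

while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    h1 = soup.h1.get_text(strip=True) if soup.h1 else ""
    print(resp.status_code, url, title, h1, sep=" | ")

    # Queue internal links only.
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"])
        if urlparse(link).netloc == urlparse(START).netloc:
            queue.append(link)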
The definitions for the mozBar metrics can be found in the URLs below:
PageAuthority - http://www.seomoz.org/learn-seo/page-authority
mozRank - http://www.seomoz.org/learn-seo/mozrank
mozTrust - http://www.seomoz.org/learn-seo/moztrust
Domain Authority - http://www.seomoz.org/learn-seo/domain-authority
For spotting a weakness in a competitor, I wouldn't rely solely on the mozBar. But for the purposes of your question, I would compare two key metrics: Page Authority and the total number of linking root domains.
PA gives you an idea of how much more/less authority you have compared to your competitor's page. The number of linking root domains gives you a rough benchmark of how many more unique linking domains you'll need to catch up to them.
Hope that helps!
Mike - thanks for clarifying it a little more. At this point I would turn to data for my decision rather than theory.
Check your web analytics package and see how many people are searching for the phrase "blue widgets small." Double check again to see how many actually drive revenue. Do they convert better than "blue widgets" searchers? How's their engagement rate? Etc.
I am confident once you find the answer to that it'll help you make your decision on whether or not you should create a new page and allocate link building resources to it.
Echo1 - no need to apologize as I wanted to help you make a decision based on more data points rather than solely relying on the Google keyword tool. As for your question about whether to use broad or exact match numbers I would always err on the side of caution and choose exact match. If you want to be even more pessimistic in your numbers then divide it by half.
In any case, you should still gather competition numbers by performing an intitle:"keyword phrase" and inanchor:"keyword phrase" search in Google (including the quotes as it denotes exact match). This will tell you approximately how many websites are optimizing their title and anchor text with exact match phrases. Those numbers should guide you toward the right keyword choice.
Lastly, try experimenting with the title tag and vary up your anchor text so you don't "lose out" on the "limo service Chicago" keyword. I looked at the organic results for that keyword, and no one had optimized for it until the site ranking #7.
I pulled up some quick and dirty numbers for your target keywords using Google's intitle: and inanchor: search operators. These highlight how heavily a keyword has been optimized for in title tags and anchor text. Of course, this alone isn't going to win you SEO, but it gives you a general idea of the competition for your search phrase.
Taxi Service Seattle - 1.7k results in title, 820k results in anchor text
Seattle Taxi - 3.8k results in title, 81k results in anchor text
Seattle Taxi Service - 996 results in title, 91.4k results in anchor text
To answer your question though, I would never stuff my home page title tag. Instead, based on the numbers above I would change it to this ~> [Company Name] - Seattle Taxi Service. I would then optimize for "seattle taxi service" but link build for all three keywords since they are essentially derivatives of each other.
What do you mean by "unavailable pages/pages that don't exist"? Are they returning an HTTP status of 404 Not Found? Have you heard of or ever tried MajesticSEO (https://www.majesticseo.com/)? You could also try using Google/Bing's link: search operator, but you won't be able to export the data like you could with YSE. Lastly, have you tried looking at your web analytics referral data? Although it won't show the number of links to a page, it will give you an idea of which links are driving traffic from which domains.
What is the ranking URL for "blue widgets small"? Is it the same as "blue widgets"? If so then I would go with option #1 and include some of your product modifiers on the ranking page. A benefit to this link building approach is it varies up your link profile with partial-match anchor text and can help boost your rankings for the competitive 2-word phrase.
On the other hand, how different is "blue widgets small" from "blue widgets"? Would it create enough value for the user to divert their attention to another page? If you answered yes then I would go with option #2. Just remember to internal link from the "blue widgets" page!
Do you have any business relationship with CloudFlare? I didn't read through all of the code, but I checked out their website and they appear to be a cloud service for content optimization, website security, analytics and web app deployment.
That said, I would recommend you check with your web development or engineering team to confirm 100%.
Yes, Google DOES recognize your request for branded sitelinks demotion.
The only downside is that they unfortunately do not update within a week. It took us two months of weekly tracking and demotion to finally get the branded sitelinks we wanted displayed.
In a nutshell, give it a few more weeks and you'll eventually see the requested URL demotions take effect.
+1
Totally forgot about mentioning the inbound links part. Thanks for picking it up, Rick!
We implemented the Schema.org microdata format a few months ago and have seen no strong correlation to suggest that using these tags increases your rankings in the search results. I speculate it's not so much a ranking factor but more of a signal that tells search engines, "Hey, this is the exact price of our product named XYZ potion."
To answer your question though, the tags are "nice-to-have" but not necessarily a "must-have" and implementation is completely contingent on how your business prioritizes your SEO projects. For example, if you have other issues to take care of such as 404 detection & handling, content generation, link acquisition, etc. I would put those at the front of the queue (I think Vanessa Fox would agree).
Afterthought: Search engines are very smart nowadays and long before Schema.org was announced, our competitors had pricing rich snippets in the search results even though they didn't use the tags.
Hi Gary,
Yes, it is always a good idea to cut down the number of 301 redirects (or any redirects in general) because if I remember correctly, Google stops crawling a link after the 5th redirect or so. You also lose another 10% link juice for each additional redirect.
Lastly, don't forget to 301 redirect the URLs from the beginning of the year to the new re-structured website.
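To see exactly how long a given redirect chain is, here's a quick Python sketch that follows each hop manually (requests library; the starting URL is a placeholder - use one of your old URLs):

from urllib.parse import urljoin

import requests

url = "http://www.example.com/old-page/"   # placeholder for one of your old URLs
hops = 0

while True:
    resp = requests.get(url, allow_redirects=False)
    if resp.status_code not in (301, 302, 303, 307, 308):
        break
    hops += 1
    url = urljoin(url, resp.headers["Location"])
    print("hop", hops, ":", resp.status_code, "->", url)

print("final:", resp.status_code, "after", hops, "redirect(s)")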
Hope that helps!
Hello Matt,
What you are doing sounds a lot like doorway pages and you will definitely be flagged if not penalized for it. Aside from the fact that it's frowned upon by search engines, it is also bad for user experience as well.
For example, if I am on shopping.com and start clicking around their navigation bar, I would expect to be situated in the same domain. However, if I found myself suddenly on amazon.com, that would be a cause for concern and frustration.
You can try to game the search engines by creating identical pages, but most likely they will see through this via your source code, C-block IP, linking profile, etc., as these patterns become apparent pretty quickly.
There are multiple angles you can take to get a webmaster's contact information even if it's not listed on their website.
Search for them on popular social media websites like Twitter, Facebook and LinkedIn. That's a great way to put a "face" to who you are and it lets the webmaster know you are going out of your way to find out more about them.
A method we like to employ on our team is to examine the target website's backlinks and build relationships with THOSE webmasters. Once you build a decent rapport with them, you can broach the subject to see if they can get you in contact with the target website.
Link building isn't easy and as many have said before in the SEOmoz community it's more like relationship building. With that mindset and a little creativity, you'll get some of the best links on the web.
No, that is not a problem - having "clean" looking text in a text-only browser is not very important for SEO.
A few months ago I thought I had to optimize (and by optimize I mean make it look pretty) for a text-only browser as well. However, after conducting some competitive research I found that some of our top competitors (ranked #1-5) were in far worse condition from a text-only browser standpoint.
In a nutshell, as long as you have the text visible then you are good to go.
TIP: To see how Google views your website type the following in the search bar~> cache:www.yourwebsite.com then click on "text-only version"
Hi Eric,
Yes, there is definitely a way to determine which pages your visitors are landing on and what search term they used to find them. I'm going to assume you are using the new version of Google Analytics.
Go to Content > Site Content > Landing Pages, then add "Keyword" as a secondary dimension. A new column should show up and give you the answer to your question.
I hope that helps!
Yes, you probably answered your own question. In WordPress, there are two different settings under Settings > Privacy:
I would like my site visible to everyone, including search engines and archivers.
I would like to block search engines, but allow normal visitors
If option #2 was selected, WordPress doesn't create a robots.txt file for you; instead, it automatically generates a <meta name="robots" content="noindex,nofollow"> tag on every single page.
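Once you've switched the setting back, a quick way to confirm the tag is gone is to fetch a page and look for it (a small Python sketch with the requests library; replace the URL with your own blog):

import re

import requests

html = requests.get("http://www.yourblog.com/").text

# WordPress's "block search engines" setting injects a noindex meta robots tag.
match = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)
print(match.group(0) if match else "No meta robots tag found - you should be good.")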
I hope that helps!
The Keyword Difficulty Tool doesn't tell you if you have a canonical URL problem. The metrics used in that report highlight inbound links and are separated into two categories:
Root Domain Linking Root Domains - this is the total number of unique root domains linking to your entire website, regardless of whether the links point to the home page, contact, about, testimonials, etc.
Page Linking Root Domains - this is the total number of unique root domains linking to that specific page ONLY; in your case, the home page.