What is Google's minimum desktop responsive webpage width?
Fetch as Google for desktop is showing a skinnier version of our responsive page.
Hi,
We are considering using separate servers depending on whether a bot or a human lands on our site, to prevent overloading our servers. Just wondering whether this is considered cloaking if the content remains exactly the same for both the bot and the human, just served from different servers.
And if this isn't considered cloaking, will it affect the way our site is crawled or hurt our rankings?
Thanks
This is probably the answer you are looking for.
Rather than just focusing on page-level metrics, you should also look at the domain-level SEOmoz metrics, such as Domain Authority.
For example, in your screenshot you have a higher PA than your competitor. However, if your DA is lower than your competitor's, there is a good chance the links you are acquiring are low quality and don't provide much value to your domain as a whole.
As Maximise suggested, you will need to dig a little deeper and determine what types of links your competitor is getting that you are not.
Delete everything under the following directives and you should be good.
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/
As a rule of thumb, it's not a good idea to use wildcards in your robots.txt file - you may inadvertently exclude an entire folder.
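For example, if the goal is simply to let Googlebot crawl everything, the cleaned-up file could be as minimal as the sketch below (illustrative only - keep any rules your site genuinely needs):

User-agent: Googlebot
Disallow:

An empty Disallow line means nothing is blocked for that user agent.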
No, it's still a redirect. See the attached image, which clearly shows a 302 Temporary Redirect from http://www.eco-environments.co.uk/solar-power/ to http://www.eco-environments.co.uk/solar-power/default.phuse
If your developer still doesn't believe you, have them verify it themselves with this web-based HTTP header check tool ~> http://www.webconfs.com/http-header-check.php
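Or, if you'd rather check it from your own machine, here's a minimal Python sketch (standard library only; the host and path are taken from the example above) that prints the status code and redirect target without following the redirect:

from http.client import HTTPConnection

# Request only the headers; http.client does not follow redirects on its own.
conn = HTTPConnection("www.eco-environments.co.uk")
conn.request("HEAD", "/solar-power/")
resp = conn.getresponse()
print(resp.status, resp.reason)       # expect 302 while the temporary redirect is in place
print(resp.getheader("Location"))     # where the redirect points
conn.close()

Once your developer switches it over, the same check should report a 301 instead.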
Yes, using a 302 Temporary Redirect is hurting your page authority because that type of server response code does NOT pass any link juice. To preserve all inbound/internal link equity you want to use a 301 Permanent Redirect instead; with a 301 redirect you retain about 90% of the link value.
I'm not sure to what extent your website is being blocked by the robots.txt file, but it's pretty easy to diagnose. You'll first need to identify and confirm that Googlebot is being blocked by typing this into your web browser ~> www.mywebsite.com/robots.txt
If you see an entry such as "User-agent: *" or "User-agent: Googlebot" used in conjunction with "Disallow", then you know your website is being blocked by the robots.txt file. Given your situation, you'll need to go through a few steps.
First, go into your WordPress plugin page and deactivate the plugin that generates your robots.txt file. Second, log in to the root folder of your server and look for the robots.txt file. Lastly, change "Disallow" to "Allow" (or simply remove the blocking rules), then confirm the fix by loading the robots.txt URL again.
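If you want to double-check the result programmatically rather than by eye, here's a small Python sketch using the standard library's robots.txt parser (swap in your real domain and a URL you care about - www.mywebsite.com is just the placeholder from above):

from urllib.robotparser import RobotFileParser

# Download the live robots.txt and ask whether Googlebot may fetch a given URL.
# Note: this stdlib parser handles simple rules; for wildcard rules, Google's own tester is the authority.
robots = RobotFileParser()
robots.set_url("http://www.mywebsite.com/robots.txt")
robots.read()
print(robots.can_fetch("Googlebot", "http://www.mywebsite.com/some-page/"))

If this still prints False after you've made the change, the file is still blocking Googlebot.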
Given the limited information in your question I hope that helps. If you run into any more issues don't hesitate to post them here.
I believe one of the biggest differentiating factors here is relevance. This takes into account things like anchor text and the content of the link source and link destination - just to name a few.
For example, it will look natural for a cooking utensil website to link to similar verticals such as a local bakery or recipe website. On the other hand, if your website is about the advancement of nuclear fusion, there is no real reason for you to link out with the anchor text "buy discount bath robes". This will definitely raise a red flag on the search engine's side.
In a nutshell, a link exchange between real businesses will "work" if they are contextually relevant; those that aren't will be detected and subsequently devalued.
Hello Knut,
Below are a few articles and Whiteboard Fridays to give you a quick primer regarding SEO. There are definitely more of these out there, so don't hesitate to ask Google!
WBF - International SEO: Where to Host and How to Target
YOUmoz - International SEO Part 2
mozBlog - Geolocation & International SEO FAQ
I've had a little experience SEO-ing websites in Japanese and the landscape is completely different. For starters, Yahoo is actually the dominant search engine but they use the Google algorithm - so just focus on Google's main ranking factors.
Since you don't know any Japanese, you'll need someone VERY fluent in the written language so that you can account for keywords written in both kanji AND kana. You'll need to make a business decision on whether you want to write each keyword in one form or the other - keyword research would definitely help here.
Don't be surprised if most of your visitors come from mobile - that's just how the technological culture is in Japan. Most people surf the web on their cell phones (since they are light years ahead of us) and not so much from their computers.
Last but not least, create great content to attract links. Your easiest links will come from those your website/business already has a relationship with - think of the Chinese concept of "guanxi", which literally means "relationships" and is an extension of the culture.
I hope that helps you get started and good luck!
Hi Darren,
To answer your question on how you can leverage microdata for your client's website: in a nutshell, just do it. Feel free to refer to Schema.org for documentation and examples.
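For illustration, a bare-bones product snippet in schema.org microdata might look something like this (all of the values here are made up):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </div>
</div>

You can run the marked-up page through Google's Rich Snippets Testing Tool to confirm the properties are being read correctly.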
As far as implementing microdata goes, I highly doubt it will help you "win" in SEO. Why? Mainly because it's a signal for the major search engines to highlight the content, context and relevancy of the page - not a crucial ranking factor. Amazon uses microdata but that's not the reason they are dominating the SERPs.
Let us know if you have any other questions and we'll be glad to help!
Hi Rick,
Great job taking the initiative in trying to fill this information gap around an (unfortunately) commonly overlooked disorder. It's good to know that you are pushing out some quality content on the web. Now, let's talk about SEO.
Before you make ANY changes, I would strongly urge you to first check out your web analytics and get a good grasp of your inbound traffic. You'll need to create some advanced segments and do some deep-dive analysis to address some important questions...
Essentially, what you are doing with this in-depth analysis is determining what you already do well and where you can improve. Your website has already been live for 6+ months, so you don't want to lose traffic around something you already have traction in. From there, you can make a smart, data-driven decision on which pages need their title tags changed, where to add on-page copy, what new videos/content you should create, etc.
As for your question about video categorization, I would keep the videos under Noah's Minute and sub-categorize them. The main reason is that NoahsDad.com is associated with the name Noah's Minute, which in essence brands your website. Maybe you can even ask your existing followers to see if they are okay with this?
Regarding misspellings, I do not think that is a good idea. If you want to portray your website as an authoritative source to users as well as search engines, everything should be spelled correctly. Search engines can auto-correct misspelled queries, so you don't have to worry about that. Here is an example for "downe syndrom videos".
Lastly, ranking for highly competitive keywords is never impossible - you just need to create extremely valuable content and gain lots of links to it. For example, you can create a category for "down syndrome facts and information". Push out some high-quality content that includes some myth busting, then link to your Noah's Minute subcategory videos, and you'll start building out a robust internal link structure. From there, any incoming link equity will boost your entire website.
At any rate, I hope this gives you a good head start on where to look first, but there are a TON of other things we still haven't covered. I apologize in advance if some parts don't make sense, but please don't hesitate to ask if you have questions.
Good luck! =]
Hello Matt,
What you are doing sounds a lot like doorway pages, and you will definitely be flagged, if not penalized, for it. Aside from the fact that it's frowned upon by search engines, it is also bad for user experience.
For example, if I am on shopping.com and start clicking around their navigation bar, I would expect to stay within the same domain. However, if I found myself suddenly on amazon.com, that would be a cause for concern and frustration.
You can try to game the search engines by creating identical pages, but it's most likely they will see through this via your source code, C-block IP, linking profile, etc., as these patterns become apparent pretty quickly.
Two tools immediately come to mind.
Xenu Link Sleuth, which is fast and free but PC only.
ScreamingFrog is very good as well and was built specifically for SEOs. The free version limits you to 500 URLs, but there is a paid version that unlocks all restrictions. ScreamingFrog runs on both PCs and Macs.
I use and love both, so I would suggest testing them out and seeing which one you like best. Happy crawling!
I performed a quick crawl of your website and immediately found part of the problem - it looks like your HTML sitemap is still linking to the pages in question.
With that in mind, I would NOT recommend using the canonical tag here. Instead, check whether or not these pages have links pointing to them. If they don't, then just change the URLs in your sitemap to the correct locations. However, if you do have inbound links to these pages, then implement a 301 Permanent Redirect to the appropriate page.
I didn't conduct a comprehensive crawl, but if you'd like your own data, please check out Xenu Link Sleuth. Also, I think inside your SEOmoz PRO campaign you can click on each individual URL and it'll show you the referring links as well.
Hope that helps and good luck!
UPDATE: Actually, the problem looks like it's coming from your source code - your old URLs are still referenced on the page. A parallel issue is that 'products.php' and 'Products.php' resolve to completely different pages, so you might want to look into 301 redirecting one to the other.
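If the site runs on Apache (an assumption on my part) and the lowercase URL is the one you want to keep, a one-line 301 in your .htaccess would handle it:

Redirect 301 /Products.php /products.php

Any equivalent rule in your server or CMS configuration works just as well - the key point is that it returns a 301, not a 302.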
There are multiple angles you can take to get a webmaster's contact information even if it's not listed on their website.
Search for them on popular social media websites like Twitter, Facebook and LinkedIn. That's a great way to put a "face" to who you are and lets the webmaster know you are going out of your way to find out more about them.
A method we like to employ on our team is to examine the target website's backlinks and build relationships with THOSE webmasters. Once you build a decent rapport with them, you can broach the subject and see if they can put you in contact with the target website.
Link building isn't easy and as many have said before in the SEOmoz community it's more like relationship building. With that mindset and a little creativity, you'll get some of the best links on the web.
Hi Diane,
Have you tried checking out the Joomla extensions gallery? There are a ton of extensions available there that can help you apply the nofollow attribute to designated links. Please check out http://extensions.joomla.org/search?q=nofollow+links
Hope that helps and good luck!
p.s. Linking out does NOT damage your website, but paid links/advertisements will if you don't use the rel="nofollow" attribute (granted, only if you get caught).
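For reference, the attribute just goes on the link itself, e.g. (hypothetical URL):

<a href="http://www.example.com/sponsor" rel="nofollow">Our sponsor</a>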
And if for some reason you don't happen to like Chrome you can check out Page Speed Online by Google.
Yes, Google DOES recognize your request for branded sitelinks demotion.
The only downside is that they unfortunately do not update within a week. It took us two months of weekly tracking and demotion requests to finally get the branded sitelinks we wanted displayed.
In a nutshell, give it a few more weeks and you'll eventually see the requested URL demotions take effect.
We implemented the Schema.org microdata format a few months ago and have seen no strong correlation to suggest that using these tags increases your rankings in the search results. I speculate it's not so much a ranking factor as a signal that tells search engines, "Hey, this is the exact price of our product named XYZ potion."
To answer your question, though: the tags are "nice-to-have" but not necessarily a "must-have", and implementation is completely contingent on how your business prioritizes your SEO projects. For example, if you have other issues to take care of such as 404 detection & handling, content generation, link acquisition, etc., I would put those at the front of the queue (I think Vanessa Fox would agree).
Afterthought: Search engines are very smart nowadays; long before Schema.org was announced, our competitors had pricing rich snippets in the search results even though they didn't use the tags.