After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Job Title: CEO
Company: FullTraffic, LLC
FullTraffic - Buy Website Traffic
Favorite Thing about SEO
Quality Content
Android Browser is similar to Chrome, and it's the default browser on Android phones.
WebView is an in-app browser. It isn't the browser installed by default; it only runs inside apps that load web content.
No, you won't get penalized for redirecting the PDFs to their HTML versions. In fact, Google will like it.
Here's a video that may help you out: https://www.youtube.com/watch?v=oDzq-94lcWQ
Are you guest blogging for the links? Then even one can get you penalized. Forget about guest blogging for links; instead, create great content and promote it so you earn natural links.
If you do guest blogging, don't do it for the links. If, in an article you write, you happen to have an excellent piece on one of your properties worth linking to, then link to it. But do NOT write just so you can insert a link.
Hey,
It depends on the penalty, if any.
If you have no manual actions under Webmaster Tools, that's a hint. However, it could still be an algorithmic penalty.
If the penalty (again, if any) applies to the whole site, then change the site's contents while making sure your entire site (backlinks too) complies with Google's quality guidelines, and the penalty should be revoked.
If the issue is actually just that Google can't access the site, then find out why, fix it ASAP, and you should be ranking again in no time (use Fetch as Googlebot first to check whether that is the problem).
To sum up: run an extensive analysis of links, content and server response errors to find the cause of the "penalty", then work on fixing it so you can start ranking again. Once you do, you can continue with the other SEO/design tasks.
As I said before, opening a thread in Google's Webmaster Help forums could be of much help.
All the best!
Holy... this IS weird.
I checked the robots.txt and there's nothing blocking indexing; the robots meta tags are present with INDEX.
You clearly need urgent access to Webmaster Tools. It looks like a penalty for pure spam or something similar, as not a single page is indexed even though other sites link to it.
What would I do? Before doing any further on-site SEO, get that resolved. Go to Webmaster Tools and check for any manual actions, messages, etc. Try Fetch as Googlebot. Then go to Google's Webmaster forums and ask; usually someone from Google jumps in.
Care to share the real domain?
Dan,
If you have an English page that is also available in Turkish (the same content, but rewritten/translated), then an hreflang tag is recommended: not mandatory, but recommended. And although, as you said, you are already writing in Turkish and geotargeting in GWT, there are other engines too that, regardless of their market share, shouldn't be overlooked.
HOWEVER, if you have a page in English with no matching Turkish page, then you don't need hreflang on that page. The tag is only used when the same content is available in another language/location, to tell engines which version they should serve.
What you mention about using x-default and removing the canonical doesn't make sense. Those are two different things, and one does not interfere with the other. The plugin I recommended does not mess with Yoast; it leaves the canonicals as they should be and adds the hreflang tags as specified. Check this example on my site, in English and Spanish, using both Yoast and the hreflang Manager plugin:
Check the source code: both versions have their canonicals and hreflang tags just fine. We chose the English version as the default, as you can see in the x-default.
The hreflang tags should be used only when the content is the same (but targeted to a different audience). Of course, when translating from one language to another, some lines must be rewritten to make sense.
In my example I used two very similar (if not identical) languages; there are things that change, but they are minimal (take a car's "hood" as an example, which in England is a "bonnet"). As the changes are so minimal, I don't think a specific GB version is needed if you are already serving a US version (that's up to you). In that case (one English version for all English speakers), you only specify the language instead of the language and region:
<link rel="alternate" href="http://www.example.com" hreflang="en" />
Now, just to make sure we have an example that DOES justify different en-US and en-GB versions: think of a page that explains what car repair centers are and, below that, shows a list of repair centers. In this scenario the explanation is the same, but the list of repair centers changes, and you would want to display the GB centers to your GB audience (still of limited use from my point of view, but it's just an example).
Hope that clears it up
Hey Dan,
If I understood correctly, you should use both. Canonical tags tell search engines where the canonical version of the content is located, while hreflang indicates which version should be served to each visitor depending on the user's location/language.
If you use Yoast, it already handles the canonical tags and there's nothing you need to do. For hreflang: if at the moment you serve only one version to all visitors, then the tags shouldn't be used. However, if you have two quite similar versions, like en-US and en-GB, then you need to choose the default one, let's say the US version, and have the following on each version:
en-US:
en-GB:
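As a sketch of what goes under each label (the URLs and the /gb/ path are placeholders; adjust them to your actual structure), both versions carry the same full set of annotations:

```html
<!-- On the en-US page (also used as the x-default) -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/gb/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />

<!-- The en-GB page repeats the exact same three tags: hreflang
     annotations must be reciprocal, so every version lists all
     versions, including itself. -->
```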
This applies if the en-US and en-GB versions are NOT exactly the same. If the wording changes (which is why you create a specific version for each country), you need a canonical on each version pointing to itself.
If the en-US and en-GB versions have the same contents, then the canonical should point to the en-US version (but then there's really no need to have the en-GB version at all, which makes it useless/expendable).
As you mention that at the moment you don't have any extra languages/regions, you could simply leave the tags out, or better, remove them.
There's a paid WordPress plugin that handles hreflang tags: hreflang Manager.
Hope that helps!
Did you try manually fetching the robots.txt file as Googlebot via Webmaster Tools? If not, do that, then click "Submit to index". Then do the same for some of the images, or better, for a file that links to all the images (like an image sitemap). Once that's done, give Google a few days (even a week) to recrawl everything and try again.
The attribute rel="dofollow" doesn't exist... Links are followed by default; only rel="nofollow" is an actual value.
Hey Chris,
Did site A or B receive a manual penalty?
Any penalty on A, which is 301'd to B, will ultimately pass to B, so I would suggest removing the 301 ASAP. Then clean up the A domain until its link profile is clean (if there's a manual action, until it's revoked), and only then think about putting the 301 back.
Removing a manual penalty can be a long process; it took us one year and 4 reconsideration requests to get the penalty revoked. We had to use the disavow tool like a machete: we disavowed almost our entire link profile, keeping only the domains we knew were good links, and disavowed everything else using the "domain:" prefix to avoid missing any link.
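For reference, a disavow file is plain text with one entry per line; the domain: prefix covers every link from that domain (the domains below are made up):

```text
# Lines starting with # are comments.
domain:spammy-directory-example.com
domain:paid-links-example.net
http://example.org/one-specific-bad-page.html
```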
You published an article on your blog and then on Ezine? The exact same post? Those tactics are long dead (duplicate content). You can now promote your content using several other methods:
just to name a few.
Yup, just wait. However, I would consider switching to a better server; a 5-day downtime is a long downtime! Look for a more reliable solution.
Search engines see colons as what they are: you say something (a word), then a colon, and then comes an explanation or enumeration.
In your example you did it right; perhaps you should just move the colon to where it belongs, right next to the last letter of the brand, so it reads: GENERAL ALTIMAX ARCTIC: 225/45R17 91Q
The idea you mentioned, building titles for users rather than for engines, is the way to go. However, there are some tweaks you can make so the titles work for both.
As in your example, the title could become: GENERAL TIRE: ALTIMAX ARCTIC 225/45R17 91Q - YOURSITENAME (personally I would put the colon right after the brand, follow with the rest of the product name, and end with " - YOURSITENAME" to help build YOUR brand).
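That pattern is easy to automate when titles are generated from product data; a minimal sketch (the function and its arguments are made up, adapt to however your CMS exposes product fields):

```javascript
// Build a product page title as "BRAND: PRODUCT - SITENAME".
// Illustrative only, not tied to any specific platform.
function buildTitle(brand, product, siteName) {
  return `${brand.toUpperCase()}: ${product} - ${siteName}`;
}

console.log(buildTitle("General Tire", "ALTIMAX ARCTIC 225/45R17 91Q", "YOURSITENAME"));
// → "GENERAL TIRE: ALTIMAX ARCTIC 225/45R17 91Q - YOURSITENAME"
```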
Hope that helps!
Well, I removed the keywords meta tag from all my websites about 2 years ago, mainly because Google said they don't use it anymore. Bing said they could still use the tag to figure out what a page is about; even so, we haven't seen any change in rankings.
I personally won't use that meta tag again, if only to remove some clutter from the page. With all of today's metas (authorship, OG, Twitter Cards, etc.) you are just adding more "garbage" that doesn't actually do anything.
Hey Daniel,
The canonical on every possible filtered view should point to the main page without any filters.
Check the following Q&A from Yesterday: http://moz.com/community/q/canonicalization-w-search-and-filter-operators-parameters
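For illustration (the URL and parameters here are made up), a filtered URL would carry a canonical like this:

```html
<!-- On http://www.example.com/shoes/?color=red&sort=price -->
<link rel="canonical" href="http://www.example.com/shoes/" />
```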
Hope that helps.
Joanne,
I'm afraid there's no way to know from Webmaster Tools which pages are actually indexed. You can use a simple Google search, site:domain.com, and it will list "all" your indexed pages; however, there's no way to export that as a report.
You can create a report using a "hack". Log in to Google Drive, create a new spreadsheet and use the following formula to populate the rows:
=importXml("https://www.google.com/search?q=site:www.yourdomainnamehere.com&num=100&start=1"; "//cite")
This will load the first 100 results. You will need to repeat the process for every 100 results you have, changing the last parameter, "start=1", to "start=100", then "start=200", etc. (you see where I'm going). For a site your size this can really be a pain in the butt.
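If you'd rather not type each variation by hand, a small script can print the formulas for you (the function name is made up; the loop simply mirrors the start=1, start=100, start=200 pattern above):

```javascript
// Generate one importXml formula per page of 100 results.
function importXmlFormulas(domain, pages) {
  const formulas = [];
  for (let i = 0; i < pages; i++) {
    const start = i === 0 ? 1 : i * 100;
    formulas.push(
      `=importXml("https://www.google.com/search?q=site:${domain}&num=100&start=${start}"; "//cite")`
    );
  }
  return formulas;
}

// Print the formulas, ready to paste into the spreadsheet.
importXmlFormulas("www.yourdomainnamehere.com", 3).forEach(f => console.log(f));
```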
My recommendation: navigate your own site, decide which pages should be removed, and then create the robots.txt regardless of what Google has indexed. Once you complete your robots.txt, it will take a few weeks (or even a month) for the blocked pages to be removed.
Hope that helps!
A 206 means partial content, which is what your website/server is delivering in response to Facebook's request. Have you tested "Fetch as Googlebot" under Webmaster Tools to see if Google can get the files? https://www.google.com/webmasters/tools/googlebot-fetch
If you get an error there, then it must be something IP-related on your server: my test returned a 200, and a test using Googlebot as the user agent also returned 200, which means your server blocks neither that user agent nor my IP. Basically, if Googlebot (or Facebook) is unable to access your site, the block must be IP-based.
Hope that helps!
A few Webmaster videos ago, Google's Matt Cutts pointed out that Googlebot should be treated exactly the same as a regular person visiting your site, which you are currently doing.
However, you are currently FORCING users to stay in "their" corresponding location; instead, you should "suggest" it, not force it.
Example: a user accesses the naked domain, domain.com. You check their IP and redirect them to the appropriate location. In this case you must use some kind of "we already redirected them" mechanism, either sessions or cookies, to avoid repeatedly forcing the user onto a specific country subdomain: once you redirect, you save the choice in a cookie or session variable. You now have the visitor in the location you want, and you should offer an easy way to switch locations (check live examples like logitech.com), for example a drop-down menu in the footer. Now, IF a user accesses a location directly, say au.domain.com, you shouldn't redirect automatically; instead, you could bring up a lightbox pop-up suggesting the user go to their best-match location.
This method lets Google access any page without being forced to a specific location and, from my point of view, it is also the easiest and friendliest way for users. If I type au.domain.com (while in the US), I probably want to see the AU version; if not, the page will suggest that I switch, and based on my response (closing the window or clicking "stay here") the site should remember my choice and avoid asking again.
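A minimal sketch of that decision logic (the function, hostnames and cookie flag are made up for illustration): auto-redirect only when the visitor hits the naked domain with no location cookie; on a country subdomain, never redirect, only suggest.

```javascript
// Decide whether to auto-redirect a visitor. Returns the target host
// to redirect to, or null to stay put (and merely suggest via a lightbox).
function decideRedirect(host, countryFromIp, hasLocationCookie) {
  const NAKED = "domain.com";
  const SUPPORTED = { us: "www.domain.com", au: "au.domain.com", gb: "gb.domain.com" };

  // Already on a country subdomain: never force a different location.
  if (host !== NAKED) return null;

  // Naked domain, but we've redirected this visitor before: respect the cookie.
  if (hasLocationCookie) return null;

  // First visit to the naked domain: send them to their IP's best match.
  return SUPPORTED[countryFromIp] || SUPPORTED.us;
}

console.log(decideRedirect("domain.com", "au", false)); // → "au.domain.com"
console.log(decideRedirect("au.domain.com", "us", false)); // → null (suggest only)
```

Note that Googlebot typically crawls from US IPs with no cookie, so under this scheme it is redirected from the naked domain like any first-time US visitor, while every country subdomain stays directly crawlable.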
Hope that helps!
12/17/2013: So you write a kick-ass article and publish it on a reputable blog. Surely, a post on [insert famous blog here] would send an avalanche of new visitors/subscribers/leads your way. Except that it doesn't. What went wrong?
Founder and CEO at FullTraffic.