Best posts made by Mobilio
-
RE: Schema.org Data Appears on Website
You can also use JSON-LD to add this to the site without messing with the existing HTML. Just refer to the schema.org documentation - a JSON-LD block can be placed anywhere, in HEAD or in BODY, and it isn't visible to users.
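For illustration, a minimal sketch of such a block (the type and property values here are placeholders, not from the original thread - adapt them to your own data):
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "http://www.example.com/"
}
</script>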
-
RE: Default Robots.txt in WordPress - Should i change it??
In addition to Martijn's answer, here is my robots.txt:
User-agent: *
Disallow:
Sitemap: http://peter.nikolow.me/sitemap_index.xml
But with Yoast, categories, tags, most archives and other generated pages are set to noindex.
-
RE: Hosting set up in different country
Well - this has been asked many times, and the answer is no. Here is more info about it:
https://www.youtube.com/watch?v=hXt23AXlJJU
https://www.youtube.com/watch?v=keIzr3eWK8I
https://www.seroundtable.com/seo-geo-location-server-google-17468.html
https://moz.com/community/q/does-the-location-of-my-server-effect-my-seo
But if you are using a generic domain (not a ccTLD, and without Search Console country targeting), this could be a problem - https://builtvisible.com/ip-location-search-results/ describes a site with a .com domain targeting the UK market.
So if you have a .de domain, you have set the preferred geo location in Search Console to Germany, and your server is located in France (near Germany), then you shouldn't have to worry about it.
-
RE: Rich Snippets (Rating stars) not showing up on website in search results
Brian,
I explained both ways you may lose ratings in the SERP. If you're sure the first one isn't the case we're talking about, then good for you.
The problem with No. 2 is that data can be shown in the SERP via different methods. Example: http://www.mobiliodevelopment.com/pdf417-barcode-standard/ - run this URL through the SDTT and look for rich snippets. There is zero structured data except the head metas. Now go back to the SERP and search for "pdf417 standard"; this article is at most on page 2-3, if not on the first. You can easily see "Mar 2, 2012 - PDF417 is a 2D barcode standard."
This is proof that you can have data in the SERP without rich snippet markup. Now back to your problem. If you remember, I talked about someone who lost his star reviews for a year before they came back. I suggested that guy go and ask here:
https://productforums.google.com/forum/#!forum/webmasters
Guess what? Nothing happened. And that was 10-11 months ago. That's why you need to check and recheck your implementation, just to confirm that everything is correct on your side - both the technical implementation and the logical implementation. Then you can look for support at the link above and prepare to wait patiently.
PS: I can take a look too, but I need the URL to inspect the site.
-
RE: How to deal with spam heavy industries that haven't gotten the hammer from Google?
Yeah - this IS a problem.
A friend of mine runs webhosting in the "free hosting" niche. He told me that getting to the first page with anything under 10k linking root domains is mission impossible, and in terms of backlinks the scale there is in the millions. The only way in is more and more links, from anywhere in the world, from any site in any niche. And I saw the statistics... he is right.
The problem is that if you're in a similar industry, patience may run out. Maybe the next Penguin will fix this, but we have already heard that many times. So you can play a few games:
- you can keep building links as before, pure white hat only, from various sources: gaming magazines, game shows, local meetups, etc.
- you can use some gray hat techniques: forum links, blog comments, etc.
- you can purchase some expired domains in the gaming niche, using tools such as http://tool.domains/ to find and evaluate them.
- this is blackhat and I won't recommend it, but since all options are on the table: using a PBN or PDF linking. This is the "white side of blackhat". There is also a dark side of blackhat that exploits software vulnerabilities, as in the 3P industries (Pharma, Poker and Porn). As I said before, it's better to stay away from that - computer hacking can also be a federal crime. Don't even think of using this for a real client. Bad guys use it for the "churn and burn" method, where a domain is used for a few days and then blacklisted; they get a new domain, use it for a few days, then the next domain...
- sponsoring some good players - like Fatal1ty
- sponsoring some good video game streamers - you can look for them on Twitch or YouTube. One of the closest examples is my friend Gothika47 (I can connect you if you wish). He runs a video channel on YouTube https://www.youtube.com/user/VoodooHeadsTV and on Twitch http://www.twitch.tv/gothika_47 (he moved over from Hitbox about a month ago). Streamers can bring you a HUGE audience if you use them properly. I have watched streams with over 1k viewers online, and it's a WOW, because the streamer listens to the viewers and can act on their requests.
- creating an infographic about LoL skills, or even building a LoL calculator or item calculator; you can also make a "for dummies" walkthrough for novices, plus strategy guides, clan guides, etc. You can also make videos about all of that - aimed at novices.
- something else... just think about giving "value" to users
As you can see, this list isn't complete and can be extended in many more ways. But always ask "how can we add value to the community?", because that's what matters to them, and that's what will drive sales.
PS: A difficult detail - LoL is a trademark owned by Riot Games, so you should also think about possible legal issues around using their TM.
-
RE: Client wants to show 2 different types of content based on cookie usage - potential cloaking issue?
You can safely return the same page with different content based on a cookie. Just don't forget to add "Vary: Cookie" to the response headers. This tells browsers and bots that the content differs depending on the cookie.
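As a sketch, in Apache this header could be added via .htaccess (assuming mod_headers is enabled):
<IfModule mod_headers.c>
# tell browsers, bots and caches that this response varies by cookie
Header merge Vary Cookie
</IfModule>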
-
RE: Doctype language declaration problem
Yes - iso-8859-15 is a very outdated encoding. The validator suggests that you should use UTF-8. I believe this is also a SEMrush issue.
To fix it, just write this:
<meta charset="utf-8">
and the bug will be fixed.
-
RE: How does decimal rounding of reviews to stars work in ios appstore? Starting from which average review score to get full 5 star rating?
Mobile dev here!
So for all stores (App Store, Google Play, Microsoft Marketplace, Amazon Appstore, etc.) the algorithm is simple - they average ALL ratings across all versions. Some do it per local store, others globally.
Example, Duolingo:
https://itunes.apple.com/bg/app/duolingo-learn-spanish-french/id570060128?mt=8 <- this is the local Bulgarian market: 4.928 from 9 ratings.
https://itunes.apple.com/us/app/duolingo-learn-spanish-french/id570060128?mt=8 <- the US market: 4.791 from 4068 ratings.
https://play.google.com/store/apps/details?id=com.duolingo&hl=en <- worldwide: 4.648 from 2343538 ratings.
As you can see, the sites provide the exact review rating to Google, but Google rounds it. Your guess of over 4.75 is probably correct; there isn't an official explanation from Google about this. Some sites show a 5-step rating scale; others a 10-step one (0, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, 4.5, 5). For others it's more complicated - I have even seen results like 4.2, which implies a 100-step scale.
-
RE: Links Identified in WMT not on Webpages
You need to check for the links' presence with the help of Google Cache.
Some toxic sites use link cloaking, so only Google can see the links while users can't find them. It isn't fair, but it can be seen often.
-
RE: Ranking could be better - wondering why
No.
First, Google doesn't process in real time; if you make changes now, it can take a few weeks before you see results. Since we don't know when you made the changes, we can only guess about this.
Second - your internal link building needs improvement:
http://www.conny-ehm.de/bewerbungsfotos-freiburg.html
is linked with these anchor texts (and counts): Bewerbungsfotos (12), Mehr (1), Bewerbungbilder (1), Bewerbungs-Shooting. Why? Internal anchor text is important too!
Third - this local page doesn't have any links from outside (the internet). Don't forget anchor text there either.
Fourth - you have a BUG in your canonical!
Look at href= - the quote is different! It's “ (a Unicode curly quote) but should be ". FIX IT ASAP!
Fifth - your phrase "Bewerbungsfotos Freiburg" on the subpage can be seen in the title, the meta keywords, twice in alt text, and in the footer. How do you expect to rank without a single instance of "Bewerbungsfotos Freiburg" inside the body text?
Please read this about some on-page tricks:
https://moz.com/blog/7-advanced-seo-concepts
I think that's all.
-
RE: What to keep in mind: to 301 redirect every page in an entire online store
Building rewrite URLs one by one for the webserver isn't suitable for your needs. What you need is a redirect map based on your webserver, or to use/create a CMS redirect map.
If you have around 100 URLs to redirect, you can type all of them into the server configuration. But once the links reach 1000+, this will slow down the webserver and make the configuration not so simple to change. This is why almost all webservers support a redirect map:
https://httpd.apache.org/docs/2.4/rewrite/rewritemap.html
https://www.nginx.com/resources/wiki/modules/lua/
http://www.iis.net/learn/extensions/url-rewrite-module/using-custom-rewrite-providers-with-url-rewrite-module
Then all you need is to build the redirect map for your webserver and run a quick test in a dev environment - with many redirects anything can happen (RAM or CPU hogging, etc.). Once you have checked and rechecked it, you should also set up CMS redirections with a plugin, or write a new one. Keep both (the redirect map and the CMS redirects) working for some time (6 months or a year); then you can drop the redirect map and rely only on the CMS redirects for old bookmarks.
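For Apache, a minimal sketch of such a map might look like this (paths and URLs here are placeholders; note that RewriteMap is only allowed in the server/vhost config, not in .htaccess):
RewriteEngine On
RewriteMap redirects "txt:/etc/apache2/redirect-map.txt"
# 301 any request whose path has an entry in the map file
RewriteCond ${redirects:$1|NOTFOUND} !NOTFOUND
RewriteRule ^/?(.*)$ ${redirects:$1} [R=301,L]
The map file /etc/apache2/redirect-map.txt then holds one "old-path new-URL" pair per line, e.g.:
old-shop/product-1.html http://www.example.com/new-shop/product-1/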
-
RE: Doctype language declaration problem
@mods - maybe a repost of this: https://moz.com/community/q/doctype-language-declaration-problem-2
I'll drop the answer here too, in case you delete the old post. Yes - iso-8859-15 is a very outdated encoding. The validator suggests that you should use UTF-8. I believe this is also a SEMrush issue.
To fix it, just write this:
<meta charset="utf-8">
and the bug will be fixed.
-
RE: How does decimal rounding of reviews to stars work in ios appstore? Starting from which average review score to get full 5 star rating?
Get the iOS or Google Play link of the app and paste it into the SDTT (Structured Data Testing Tool).
That's all. In the SDTT the ratings go even deeper than 3 decimal places...
-
RE: Clean-up Question after a wordpress site Hack added pages with external links from a massive link wheel?
If you know what you're doing, you can use 410.
With 404 the bot will come back to check those pages many times; with 410 it's only once or twice. An example from my .htaccess:
Redirect 410 /salons/galka-3/
Redirect 410 /salons/galka-4/
Redirect 410 /salons/galka-1/
Redirect 410 /salons/galka-2/
As you can see, it's very easy and sleek. BUT beware: if you send a 410 for a normal page, you may not be able to get that page ranked again later!
-
RE: Google indexing https sites by default now, where's the Moz blog about it!
Some sites come with redirectors or "beacons" for detecting user presence. Example: I'm on site X, page A, and I click a link to go to page B. But thanks to the marketing department, that click passes through an HTTP redirector or plain HTTP (with a 301 redirect to HTTPS from there). Page B can then end up not indexed.
This means that once you set a sitewide 301 redirect to the encrypted connection (see the sketch after this list), you must take a few more steps:
- you must check that all resources pass via this encrypted channel: images, CSS, JS - everything.
- you must check that canonicals are set to HTTPS
- you must check that links between pages are also HTTPS
- you must review any 3rd party tools for encrypted connections - analytics software, "tracking pixels", heat maps, ads.
- you must check whether outgoing links from your site can point to the encrypted versions of other sites - Wikipedia, Moz, Google. Since everything there is already encrypted, you also skip the frustrating HTTPS -> HTTP -> HTTPS jump.
Then your site can be indexed as HTTPS. It's a tricky procedure with many traps.
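A minimal sketch of that sitewide 301 in Apache's .htaccess (assuming mod_rewrite and no conflicting rules above it):
RewriteEngine On
# send every plain-HTTP request to the HTTPS version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]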
-
RE: Invest in a Image Sitemap - Yes or No?
Yes,
my site comes with these sitemaps:
http://www.mobiliodevelopment.com/post-sitemap.xml
http://www.mobiliodevelopment.com/page-sitemap.xml
They're automatically generated by the Yoast SEO plugin for WordPress.
PS: Just right-click and select "View Source".
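If you'd rather build an image sitemap by hand, here is a minimal sketch using Google's image sitemap extension (the URLs are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/some-page.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>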
-
RE: Updating product pages with new images - should I redirect old images ?
1. Yes
2. No
3. No
I'm doing the same on my own site, but by renaming files. I found that if I rename a file without a 301, I suddenly get a lot of 404 traffic; when I 301 it properly, traffic stays normal, because I hold good positions in Bing Images and Google Images.
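A sketch of what such an image 301 looks like in .htaccess (the filenames are placeholders):
Redirect 301 /images/old-product-photo.jpg http://www.example.com/images/new-product-photo.jpg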
-
RE: Moz could not crawl my httpS website
Did you block the AWS IP range or Mozbot? Or did you apply some firewall policy?
PM me your URL so I can take a look at the server and network level with different tools.
-
RE: Can a Self-Hosted Ping Tool Hurt Your IP?
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
so anyone can use it to ping anything, and the bots come within a second or two. This works perfectly.
The problem is when you use it to ping many URLs (like 10k-20k). At some point it stops working - the ping API endpoint receives your requests, but I can't see the bots coming. This means there is some per-IP threshold, and if you pass it you're temporarily blacklisted. I also heard (but can't confirm) that the duration may vary based on previous usage. For desktop users this isn't a problem, because they only blacklist their own IPs, and they can use hotspot wifi or a VPN to continue pinging.
But on a server this would be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will stay blacklisted.
-
RE: Google indexing https sites by default now, where's the Moz blog about it!
Or you can leave them but route their links through some URL shortener - bit.ly or t.co - until they come out with an HTTPS version.
Or you can make a "partners" page where you link only the HTTP external sites.
Or you can make an internal redirector page to the HTTP site, like HTTPS -> HTTPS (the internal redirector, a dummy page) -> HTTP. In this case the redirector won't be indexed - that's why it's a dummy.
And these are just three ideas I thought of in one minute. My favorite is probably #3, but that's IMHO.
-
RE: Realtor site with external links in navigation
This was explained in https://moz.com/blog/spam-score-mozs-new-metric-to-measure-penalization-risk and https://moz.com/blog/understanding-and-applying-mozs-spam-score-metric-whiteboard-friday
It isn't 2/7, it's 2/17.
- According to Moz, for the "external link in navigation" flag, the percentage of sites penalized with the flag vs. without it is 19% vs. 7%. Hmm... at 19% I would hesitate about this flag! The numbers are the same for "large number of external links".
- Nofollow won't help much, because the links are internal and external, and nofollowing an external link in navigation doesn't help. Should you even have an external link in that navigation? What is the purpose of this site - a low-quality site with landing pages stuffed with keywords and an EMD? Should this site exist in the SERP? Please note: it's the end of 2015 and there are new rules for SEO, shared in Rand Fishkin's presentation http://www.slideshare.net/randfish/onsite-seo-in-2015-an-elegant-weapon-for-a-more-civilized-marketer - today people work on getting higher CTR, average visit duration and pages/session, even a better bounce rate. A single external link in navigation can break those statistics/analytics.
Maybe everything is OK on both your sites or across your network of sites. But that 19% wouldn't let me sleep well. Who knows how tomorrow's algorithms will evaluate sites?
-
RE: 404 errors
Seems related to the AJAX crawling scheme's escaped fragment.
You should find out why the bot is trying to crawl normal pages with this deprecated method. Check the "incoming links" to this 404 in Search Console to see where the root of the problem is, and fix it.
-
RE: Moz could not crawl my httpS website
I checked, and your site seems OK from my side - from GCE, Amazon EC2 and Eastern Europe. There are a few issues - SSL3, weak DH, and files such as https://www.pouyasazan.org/IT-news/25876/css/css/css/css/css/css/css/css/css/css/css/css/css/tooltipster.css (broken relative vs. absolute paths), plus no 404 page, because everything returns "200 OK" (soft 404s). But these aren't critical to the site working, and it does work - the server held up even when I hammered it with many requests.
Please contact Moz by mail at help@moz.com or fill in the form at https://moz.com/help/contact/pro so they can investigate your situation.
-
RE: Getting Spam Links
Disavow without any doubts. Also report it as a scraper that hosts weird redirects (and maybe malware soon!):
http://www.webpagetest.org/result/160202_M1_GH/
just look at the waterfall diagram and watch out for iTunes, Alibaba, Bing, CodeCanyon, Walmart, and a few more.
Starting today, Google also sends confirmation notes about spam reports:
http://www.thesempost.com/google-sending-confirmation-notices-for-spam-reports/
so from now on every report counts.
PS: I just ran another test here:
http://www.webpagetest.org/result/160202_19_Y1/
this time with "save response bodies" enabled. It's not so trivial! This site is trying to add someone to my CodeCanyon list, manipulate Groupon, and increase visits to an Indeed job offer.
-
RE: Google SERPs changes
Just as Russ says - many things may have happened. I will add a few more:
- Panda 4.2 (https://moz.com/google-algorithm-change#2015) - even though it arrived on 17 July, in the UK SERPs it could have been delayed until August. You can be hit by Panda at any time; it's just that recovering is slow.
- Penguin can hit you at any time too. You need to check your backlinks and keep one eye on them; again, recovering is slow.
- You can be hit by your hosting infrastructure, a CMS theme update, a CMS plugin update, etc.
- You should investigate previous searches in Search Console and compare them to the current ones. You can use archive.org to see historical changes to the site and its pages.
- You should check the website log. Like a ship captain's log book, you and the team there should log every change to the site and its pages/content. Even a small change can invoke disaster after 2-3 months or even longer.
I've written up a few things that need to be looked at. I could write even more, but sometimes the cause lies outside of SEO entirely. Imagine auditing something like "Rover cars" and seeing a sudden drop... because the company is now defunct. You can also be hit by a negative campaign, rumors, etc. As you can see, it's hard to explain what's happening without knowing the customer, their niche, their competitors and their social networks.
-
RE: Is it bad I have a cluster of canonical urls that 301 re-direct?
Messing with canonicals is dangerous! A daisy chain in rel=canonical can quickly eat your crawl budget and/or deindex important pages. See Glenn Gabe's articles about this:
http://www.hmtweb.com/marketing-blog/dangerous-rel-canonical-problems/
http://www.hmtweb.com/marketing-blog/redirects-duplicate-content-seo/
In reality, such incorrect canonicals confuse the bot too much.
-
RE: My New Pages Are Really Slow to Index Lately - Are Yours Slow Too ?
I haven't seen this, because within minutes of publishing I push my content to the social networks - G+, Twitter, Facebook and LinkedIn.
-
RE: Domain Authority and PageRank Aren't Meshing
PageRank hasn't been updated since December 6, 2013 and won't be updated in the future. Please don't rely on this metric anymore.
https://www.seroundtable.com/google-next-pagerank-update-19902.html
You can use Moz DA, PA, MozTrust and MozRank. You can also use Majestic CF and TF, or the Ahrefs URL rating and domain rating, because all of them are updated at least monthly. Comparing them with 2-year-old data (PR) isn't fair.
-
RE: Dfferent domains on same ip address ranking for the same keywords, is it possible?
Yes, it's possible. I have an example where, on the same host, I host a few sites that rank for the same keywords in the SERP.
But you should watch out for spam sites on the same IP:
https://www.youtube.com/watch?v=AsSwqo16C8s
there is an example of 26k vs. 1.
Update:
So a few sites can rank. But when you try to rank with many more, you can trigger some algorithmic filter (I can't find the hangout where this was explained) and the ranking engine will start to demote you. Rather than splitting your time across a few sites, you should focus on a single site. It's the quantity vs. quality debate.
-
RE: SEO Help in LA
Looking for marketing consulting?
Moz doesn’t provide consulting, but here's a list of recommended companies who do!
https://moz.com/rand/recommended-list-seo-consultants/
As an alternative, you can browse the current user list at https://moz.com/users, looking for LA people and their availability.
-
RE: Progressive JPEGs. Wondering if I should consider it OR not?
Exactly. For small images it's even better to combine them into one large image and serve them individually as sprites.
But when you have large images, progressive JPEGs are much better (even for mobile users: https://code.facebook.com/posts/857662304298232/faster-photos-in-facebook-for-ios/ ). Progressive JPEGs do have drawbacks - they require a little more memory and a little more CPU power to draw. You can also see Patrick Meenan's article about that: http://blog.patrickmeenan.com/2013/06/progressive-jpegs-ftw.html
Patrick shows that merely using progressive JPEGs can improve Speed Index (the WebPagetest score) by 7% to 15% at the same size in bytes.
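If you want to try it, a baseline JPEG can be converted losslessly with jpegtran from libjpeg (a sketch, assuming the tool is installed; the filenames are placeholders, and -copy none also strips metadata such as EXIF):
jpegtran -progressive -copy none input.jpg > output-progressive.jpg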
-
RE: My New Pages Are Really Slow to Index Lately - Are Yours Slow Too ?
Can you try pinging the sitemap? Some CDNs (I don't know your infrastructure) cache files for a few hours, and bots sometimes get the cached, stale versions.
You can also try pinging specific pages. If you're scared of being flagged for spam, just use my SEOPingler and the bot comes within seconds.
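For the sitemap ping, the engines expose simple HTTP endpoints that take the URL-encoded sitemap location, along these lines (example.com is a placeholder):
http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
http://www.bing.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml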
-
RE: Open Site Explorer Problem
I'm also experiencing some issues with OSE at the moment.
It's probably related to the forthcoming Moz index refresh and will be fixed within a few hours.
-
RE: Hacked Websites (Doorways) Ranking First Page of Google
Too bad.
Probably the best way is to contact someone from the webspam team and send your observations there directly.
The other way is to report those same sites as hacked, because they're definitely hacked. Either way, it will take time.
-
RE: How do I set-up a 301 redirect?
If you have 404s somewhere inside your site, there is only one way to fix them - fixing the URLs. Say on page A I have a link to page B, which doesn't exist. Robots crawl page A and learn of page B's supposed presence, but on checking page B they get a 404. So it's much better to fix page A to link to a page C that exists. That covers internal links.
If you have incoming links hitting a 404 - site A, page B links to your site X, page Y - then since you can't fix page B, you can create a 301 redirect on your website, redirecting page Y to a page Z that exists.
If you have outgoing links to 404s, try to find the updated URL/site if it still exists. If it doesn't exist anymore, find a replacement, link to archive.org, or update your page to note that it no longer exists.
301 redirects are quick to set up with Apache and its .htaccess. All you need is to add lines like:
Redirect 301 /dir/page.htm http://www.example.com/new-dir/new-page.html
Redirect 301 /dir/subdir/page.htm http://www.example.com/new-dir/new-subdir/new-page.php
of course these are examples, and you need to edit them for your specific case. You must also watch the HTTP access log for 404s.
-
RE: Rich Snippets For Q&A Forums?
I'm not sure about rich snippets, but you can mark them up with schema.org. I saw something similar in Apple's Q&A section, but I can't find the original link, so I found a new one:
http://apple.stackexchange.com/questions/130255/safari-has-a-problem-displaying-certain-images
Just run this through the Structured Data Testing Tool and you can see:
- QAPage
- Question
- Answer
I just didn't find the link to the author, or rel=author, but this is described on schema.org with some examples.
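For reference, a minimal sketch of that structure in JSON-LD (the question and answer texts are placeholders, not taken from the Apple thread):
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "Example question title",
    "answerCount": 1,
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Example answer text."
    }
  }
}
</script>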
-
RE: SSL Certificate Install Conerns
I got a PM, but I will post the response here.
So there are two situations with SSL (there are many more, but it gets complicated) - SNI or without SNI.
With SNI you can serve many TLS sites from one IP, because during the handshake the browser sends the hostname, so the server knows which site inside the request is for. But some browsers don't support SNI - Windows XP, IE6, Android 2.2/2.3 and a few more. For those you need a dedicated IP so they can connect to your site correctly.
I think you have an issue with SNI, because if you try to open your IP - http://212.48.85.138/ - you get a warning (about a host mismatch) and a self-signed certificate (on some machines).
You also need to tighten your secure connection - stop SSL (it's 15 years old and now deprecated) and support only TLS. Also enable forward secrecy, OCSP stapling and TLS session tickets. It's a long list, but you can see all the recommendations here:
https://www.ssllabs.com/ssltest/analyze.html?d=quellabicycle.com
I hope that implementing a few of these will bring GoogleBot back to the site without warnings.
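As a sketch, part of that hardening in Apache 2.4's mod_ssl looks roughly like this (the cipher list is omitted - consult the SSL Labs report for specifics, and note SSLStaplingCache must sit outside any VirtualHost):
# keep TLS only; the old SSLv2/SSLv3 protocols are deprecated
SSLProtocol all -SSLv2 -SSLv3
SSLHonorCipherOrder on
# enable OCSP stapling
SSLUseStapling on
SSLStaplingCache "shmcb:logs/ssl_stapling(32768)"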
-
RE: The difference between api value and screen value
Redirects:
yyxysh.com -> DA=8; UPA=17 in OSE; API: {"pda":7.946532082305239,"upa":16.535296818533226}
www.yyxysh.com -> DA=8; UPA=21; {"pda":7.946532082305239,"upa":21.185637908436604}
zaegi.com -> 17/27; {"pda":17.31598169835115,"upa":27.430491280580874}
www.zaegi.com -> 17/13; {"pda":17.31598169835115,"upa":13.339025908887258}
As you can see, the API and OSE agree. But checking a different variant of the domain skews the test. Please check whether your software adds "www" at the beginning of its checks - my tests show that it DOES add it.
If you have a Mac, you can use my SEOAuditor, a very small app that returns a few important metrics:
http://www.mobiliodevelopment.com/seoauditor/
The API examples above are from it - I just enabled the "debug log" to get the raw results.
-
RE: URL Link Counts
Links is "All links to this page including internal, external, followed, and nofollowed".
It's same number as if drop this URL into OSE where it's count as "Total Links" -
RE: How do I set-up a 301 redirect?
Redirect 301 /registerlogin-2/ http://everlastingchanges.com/registerlogin-3/
But please - remove "scheduling" from the main homepage; that page doesn't exist anymore.
-
RE: Google User Click Data and Metrics
If you remember, about 5 years ago all URLs in the SERP were unencrypted, and a lot of tools used this to capture "keywords" and link them to pages. Google introduced encryption in 2010 and rolled it out over the following years; today the only way to see keywords is in Search Console. Of course, the encryption is there "to improve your search quality and to provide better service" - the original text can be seen here. Please note "provide better service" there. This is the tricky part!
So imagine that you search for moz; here is the actual URL I can see now:
https://www.google.bg/search?q=moz&ie=utf-8&oe=utf-8&gws_rd=cr&ei=4wNVVpnZBYGoUZiXh4AG
You can definitely see the keyword there in ?q=moz. Now, the first result is Moz.com, and its URL is:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFggfMAA&url=https%3A%2F%2Fmoz.com%2F&usg=AFQjCNHNW83KUfvLcZOMILlYW49NobxUig&sig2=nOVvQ05KIPrGB3XFAFmIGg
As you can clearly see, there is no keyword anymore - everything comes as encrypted data (ved, usg, sig2). This /url link is actually a redirector that counts your click on the specific result and position.
Now, if I click the first result, land on Moz.com, scroll down and decide "this isn't the MOZ I'm looking for", then within some time (a few seconds) I return to the SERP. That is "dwell time" and a bounce back to the SERP. It's a negative signal, because it shows Google, with human verification, that the result it returned in first place wasn't right. Back on the same SERP I can see Moz on Wikipedia:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=19&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFghhMBI&url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2FMoz_(marketing_software)&usg=AFQjCNGCgqmsKNIdaZdGrbugf8bJk6NhTg&sig2=jS-vt68NFtD5YhgSV4lTGw
If I click this and never return to the SERP, that gives Google enough to calculate a bounce rate for this site (counting only returns to the SERP), and it credits Wikipedia with a "goal completion". The time until my next search can be used to calculate "time on site". And since all searches are encrypted, Google knows when a specific user searched for something and when they made a new search based on the data already returned. An example is "Napoleon" - this could be anything: the French emperor, a movie, a cake, a drink, other things. So now I can do the subsequent search "Napoleon height". This is an example of how one search gives enough information to interpret the next, refined search. Another good example is "32 us president", followed by "franklin d roosevelt height".
This was explained much better in the closing MozCon 2015 presentation "SEO in a Two Algorithm World":
http://www.slideshare.net/randfish/onsite-seo-in-2015-an-elegant-weapon-for-a-more-civilized-marketer
and you should see it. A few tests with terrific results are shown inside.
-
RE: Mobile website indexing
You don't need BOTH sitemaps. You need ONE, as I showed you in my link.
-
RE: First Mozscape index of the year is live
Real case... almost 10 years ago, the SEO manager of some company flew to their HQ just to explain why their PR had dropped from 6 to 5. The C-level execs weren't happy about that.
Now the fun part - impressions, CTR and visits all increased during that update.
-
RE: Rogerbot will not crawl my site! Site URL is https but keep getting and error that homepage (http) can not be accessed. I set up a second campaign to alter the target url to the newer https version but still getting the same error! What can I do?
I just checked and saw that http is a 301 to https, and https works.
Can you mail this issue to help@moz.com?
-
RE: Htaccess and robots.txt and 902 error
The usual .htaccess mess... WHY? Because of the flags:
https://httpd.apache.org/docs/2.4/rewrite/flags.html
As you can see, there are several flags - L, R, NC and others. But we will focus on L and R only:
R - redirect. This makes a 302 redirect, but you can specify another response code between 301 and 399, e.g. R=301.
L - last. This flag causes mod_rewrite to stop processing the rule set.
Now let's go back to your file; here is its structure:
W3TC cache set
W3TC compression set
W3TC CDN
W3TC page cache
WordPress handler - with the L flag!
Redirects
Force non-www
What is the problem? The problem is that after the L flag, everything stops. This means both www and non-www keep working, with no redirect between them. You need to reorder your file like this:
Force non-www
Redirects
W3TC
WordPress handler
And check and recheck everything one more time, including the redirects.
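A sketch of the non-www force rule that has to come before the [L]-flagged WordPress handler (the host pattern is generic - adjust it to your domain):
RewriteEngine On
# 301 www.example.com/anything to example.com/anything
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [R=301,L]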
-
RE: Google User Click Data and Metrics
That was an example with GA. I believe they use dwell time and the next or subsequent searches for this.
Because they can't account for shopping cart abandonment and other such issues, they keep benchmarks against other sites. If your metrics are above average for your industry, that's great; if your metrics are weak, you're in trouble. You can see benchmarking in Google Analytics. So whatever you do, just try to post better metrics than the benchmark. Example: I have just seen that one of my sites does 1.40 pages/session vs. 2.99 in the benchmark, and my session duration is 1:32 vs. 2:19 in the benchmark.
Similar metrics exist in PPC too - you need to be above average to get better positions, prices and conversions.
I know this whole explanation may sound a little messy... but this is the question all SEO specialists think about these days. If you knew the answers, you could become a millionaire and retire quickly.
-
RE: How do I "undo" or remove a Google Search Console change of address?
Yes - you can!
https://support.google.com/webmasters/answer/93636?hl=en
Just follow the instructions there. And check that the 301 redirects point to the correct location, along with the canonicals, the sitemap, robots.txt, links to static assets (CSS/JS) and links between pages.
-
RE: Advanced popup for WP website
Sounds like a job for NinjaPopup, OptinMonster, PopupDomination or Subscribers Magnet.
If none of them fills all your needs, you can get a free and open-source plugin like "Displey Pop" and tweak it for your needs.