Best posts made by Mobilio
-
RE: Does the physical location of a server effect the local rankings of a site?
If your domain is a ccTLD (.co.uk), then the geolocation of the server doesn't matter. Of course, it still needs to be fast in the country where your users are.
-
RE: Why I lost my app rich snippet?
It's normal. I think there has been a change in Google Play and their rich snippets are broken:
https://play.google.com/store/apps/details?id=com.mobilio.seoggestor&hl=en
Just check the structured data:
https://developers.google.com/structured-data/testing-tool?url=https%253A%252F%252Fplay.google.com%252Fstore%252Fapps%252Fdetails%253Fid%253Dcom.mobilio.seoggestor%2526hl%253Den
So the reviews are there, but due to warnings and errors they weren't shown in the SERP. Your competitor can also lose his rich snippet soon.
-
RE: Rankings appear mixed up causing huge drop in organic
Google released a "core update" on the 8th and the 15th:
http://mozcast.com/
https://algoroo.com/
http://serp.watch/
A lot of SEOs (including at Moz) are trying to figure out what changed. This isn't Panda or Penguin.
-
RE: SEO, 301s & backslashes at end of URL
So, this is a HUGE technical SEO mess.
There are two possible causes - the hosting or a plugin:
- Hosting - you can fix this quickly with .htaccess. There are lots of examples on the internet for adding or removing the slash at the end of a URL. You need to check and recheck your .htaccess or webserver configuration.
- Plugin - sometimes when WP plugins are updated, some of them come with something more than a bugfix or improvement. The authors think it "shouldn't hurt anybody", but in reality it hurts everybody, because it breaks things. Try disabling plugins one by one and see whether one of them is the cause. This can also be caused by the theme, by WP itself, and/or by some custom code.
That covers the cause of the problem. The fix is harder, because you need to estimate roughly when the change happened. There are two cases:
- If it happened recently, you should fix it ASAP, because to bots the old pages are missing (you need to confirm that - please check; sometimes they 301 to the new page).
- If it happened a few months ago and rankings weren't lost, then you can keep the new format (but still do the .htaccess fix!). If rankings were lost, then you can try to revert it (and also do the .htaccess fix!). It's a difficult decision either way.
As you can see, everything depends on how long ago the change happened and what damage it has already done. Look in Search Console for more info about crawling and rankings.
I will save you a little time with code for removing AND adding trailing slashes:
# Remove the trailing slash:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [L,R=301]
# Add the trailing slash:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
The first block removes the trailing slash, the second adds it. You also need to check your canonicals and sitemaps and fix them according to the change.
And one tip: you should always install some backup plugin in WP. Period. You aren't the first nor the last person in the world in a "something happened" situation. One weekly backup costs you pennies and can save many hours of debugging or trying to fix things. There are many plugins on the market with many different features, so I can't advise which one will fit your needs.
-
RE: Does blocking traffic from a country via a firewall affect my ranking?
I think the answers here can help you:
https://moz.com/community/q/blocking-certain-countries-via-ip-address-location
since the question is similar. Adding PHP code for geolocation checks (MaxMind GeoIP) is just 2-3 lines of code, and then you can ignore submissions from other countries.
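For illustration, a minimal sketch of such a check, assuming the PECL geoip extension and its country database are installed ('GB' and the 403 response are placeholders for your own policy):
<?php
// Accept form submissions only from UK visitors (sketch).
$country = geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);
if ($country !== false && $country !== 'GB') {
    http_response_code(403); // ignore submissions from other countries
    exit;
}
// ...continue processing the submission...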
-
RE: Specifying Your Organization's Logo Schema Required If Corporate Contacts Schema is in Place?
No.
Corporate contacts and the corporate logo are both part of schema.org/Organization.
You can mark up the logo alone, the contact points alone, or both together - each variant is valid on its own. All these JSON-LD variants can be validated w/o problems.
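For illustration, a minimal sketch of the combined variant (example.com and the phone number are placeholders, not from the original answer):
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "url": "http://www.example.com",
  "logo": "http://www.example.com/logo.png",
  "contactPoint": [{
    "@type": "ContactPoint",
    "telephone": "+1-800-555-0199",
    "contactType": "customer service"
  }]
}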
-
RE: URL Index Removal for Hacked Website - Will this help?
If your "bad" link is like http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html then your .htaccess should be:
Redirect 410 /flibzy/foto-bugil-di-kelas.html
That's all.
Yes - you should do this for ALL 1205 URLs. Don't do it on legitimate pages (those from before the hack), just on the hacked pages. I say "gone" with the 410 status - it's amazing, and in your case gone for good. Say the time to identify those 1205 URLs and paste them into .htaccess is X hours. The time to identify those 1205 URLs and temporarily remove them is Y hours. Since "temporary removal" lasts up to 30 days, you'd repeat that job every month. In total, over one year you spend X hours in the first case and 12*Y hours in the second. You can see the difference, right?
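One hedged shortcut: if all 1205 hacked URLs sit under a common prefix like /flibzy/ (an assumption - verify that no legitimate pages live there), a single pattern rule can replace the individual lines:
RedirectMatch 410 ^/flibzy/.*$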
Also, today Barry Adams released a story about hacking:
http://www.stateofdigital.com/website-hacked-manual-penalty-google/
and it's amazing that the site was hacked for just 4 hours, yet Google noticed. You can see the traffic drop and the removal from the SERP there. OK, I'm not trying to sell fear, but keeping bad pages as 404s takes a long time. In Jan-Feb 2012 I had a temporary site within a /us/ folder on my site, and even today, in Jan 2016, I still see bots crawling this folder. That's why I nuked it with 410. It saved the day!
In your case it's the same. The bot wastes time and resources crawling 404 pages over and over while crawling your important pages less. That's why it's good to nuke them - ONLY them. This saves the bot's crawl budget on your website, so the bot can focus on your pages.
-
RE: Search Console shows structure keywords more significant over content keywords. What is wrong?
Same here. This happens when you put a "share bar" somewhere at the top of your pages. Repeating "share" many times also causes this.
A likely solution is to move sharing to a sidebar or to the bottom, and definitely minimize words like "share" or "facebook" in the text of articles.
-
RE: Should I fetch in WMT with all 4 options?
The use case is when you have a special website that serves different versions of the site based on the user agent. In that case you want to see what GoogleBot retrieves from your site. This can also help you with mobile redirects, which are likewise based on user agents.
The only case where it can't help you is responsive web design, because there bots see only one version of the HTML.
-
RE: URL Index Removal for Hacked Website - Will this help?
Yes.
Disavowing is needed for each site (http/https).
-
RE: Can bad traffic by association ever be a good thing?
I think you should check whether this is a real visit or a fake one. Check the HTTP access logs for that referrer and look in Analytics (or whatever website analytics software you use). Also visit the URL where the backlink should be. Then:
- If it links to hacked pages - disavow ASAP. Bad guys build backlinks to hacked webpages to get a higher position in the SERP. But this can change your link profile, and in some situations you may face the Penguin algo filter.
- If it links to other pages and the source looks shady - disavow ASAP. This could even be revenge for a removed hack. It can also bring Penguin.
- If it links to a non-existent page, i.e. bot traffic - leave it. This is the case where a bot makes a "fake visit", tricking Analytics into counting it, so that webmasters go inspect the source. It's a "curiosity-driven visit", but it messes up your Analytics statistics.
- If it's a regular, relevant link - keep it. Example: I have a link from a Pakistani forum where people were discussing something like type X vs. type Y, and someone shared a link to an article on my site explaining exactly the difference they were discussing.
So until you see the original source, it could be anything. Also, please check the question and answers here:
https://moz.com/community/q/is-there-value-in-disavowing-links-if-you-there-is-no-google-penalty
so disavowing isn't a tool you use only in case of a penalty. You can use it even w/o a penalty, to keep your link profile as clean as possible.
-
RE: Why can't google mobile friendly test access my website?
Nadav,
this can happen with a restrictive robots.txt, where some parts are enabled and other parts are disabled for indexing.
But to help you better - could you send a link to that file, so we can give you the best advice for your case?
-
RE: How long should I keep the 301 redirect file
This is also a bad idea.
Apache has a directive that enables or disables per-subdirectory .htaccess overrides - AllowOverride. And it can kill Apache's performance! Why?
Say you browse /index.html: Apache parses .htaccess, executes the rules, and returns /index.html. That's the normal scenario - one .htaccess. But if you browse /subdir1/subdir2/subdir3/subdir4/subdir5/blah.jpg, this is BAD! Apache will parse /.htaccess, then /subdir1/.htaccess, then /subdir1/subdir2/.htaccess ... up to /subdir1/subdir2/subdir3/subdir4/subdir5/.htaccess, and only then serve blah.jpg.
Remember: Apache doesn't cache .htaccess. The files are loaded, parsed, and executed for every resource requested. And when you access a 2nd resource, it does all of this over again.
It's much better to put a static configuration in httpd.conf, because that configuration is loaded only on startup, and there you can define rules per site, per directory, and per resource.
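For illustration, a minimal httpd.conf sketch (domain and paths are placeholders; assumes mod_rewrite is loaded and Apache 2.4):
# Parsed once at startup, unlike .htaccess
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/html
    # Stop Apache from looking for .htaccess files at all
    <Directory /var/www/html>
        AllowOverride None
        Require all granted
    </Directory>
    # The same trailing-slash redirect, now in static config
    RewriteEngine On
    RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-d
    RewriteRule ^/(.+)/$ /$1 [L,R=301]
</VirtualHost>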
Other modern webservers also use static configs - nginx, lighttpd, etc.
-
RE: Search Console Errors 400 and 405
Both issues occur on ALL WordPress sites.
xmlrpc.php returns 405 because it works over HTTP POST, but the bot crawls with HTTP GET. Error 405 means "method not allowed" - the bot would have to switch from GET to POST, and it can't do that.
admin-ajax.php returns 400 because it's not designed for direct use. You must call it with some parameters, and then it returns some information. As the name says, it's responsible for AJAX calls; it's not for direct use, and that's why it returns 400 when called w/o parameters.
So those errors are not related to your issue. For example, my WP sites also return the same error codes to the bot.
Peter
-
RE: My Search Rankings Are In FreeFall Since Last One Month. Should I Be Worried?
Yes - you should be worried.
The only known change was RankBrain on 26 Oct 2015:
https://moz.com/google-algorithm-change
So you should probably check for Panda. On Moz there are many articles on how to escape it.
Now let's look at your site structure:
http://www.midigital.co/?s=
http://www.midigital.co/partners-affiliates/?s=
http://www.midigital.co/digital-marketing-and-media-innovations-agency/www.meezanbank.com/ribaseazaadi/?s=
http://www.midigital.co/digital-marketing-and-media-innovations-agency/page/2/?s
http://www.midigital.co/digital-marketing-and-media-innovations-agency/page/3/?s
and this happens 981 times... Are you really indexing search results?
http://www.midigital.co/tag/anderson/
http://www.midigital.co/tag/consulting/
http://www.midigital.co/tag/data/
total: 1402 URLs with tags... Are you indexing tags, too?
http://www.midigital.co/mi-digital-thinking-and-ideas-digital-happenings/page/8/
http://www.midigital.co/mi-digital-thinking-and-ideas-digital-happenings/page/7/
http://www.midigital.co/mi-digital-thinking-and-ideas-digital-happenings/page/5/
total: 214 URLs... Are you indexing paginated archive pages, too?
Also, your host is sabotaging you:
http://midigital.co/wp-content/uploads/2014/12/HIp2-1030x704.jpg
http://www.midigital.co/wp-content/uploads/2014/12/HIp2-1030x704.jpg
total: 346 images sitting on the non-www version. But if you access an HTML document, you get a 301 redirect to www. Is your 301 redirect really working?
You also have 32 pages returning 404. A few domains don't work anymore -> http://www.pascrackit.com/ http://blog.mediastation.co.uk/ - but you still keep links to them. A few minor problems -> mailto:mailto:careers@mediaidee.com + http://www.midigital.co/digital-marketing-and-media-innovations-agency/page/15/www.meezanbank.com/ribaseazaadi/ http://www.midigital.co/digital-marketing-and-media-innovations-agency/page/15/www.meezanbank.com/ribaseazaadi/?s=
As you can see, you have colossal duplicated content. And you were, with huge probability, hit by Panda.
-
RE: How can I restrict the domains country by country?
There isn't a proper way to do exactly what you want.
You can set a preferred country for .com and .co.uk with geo targeting:
https://support.google.com/webmasters/answer/62399
but this doesn't stop Indian users who type English queries from seeing your sites in the SERP. Still, this is the best way IMHO.
If you need true protection against Indian users, then you need to make some changes. You can get a GeoIP database:
http://php.net/manual/en/book.geoip.php
and, based on that, code to give users access or to stop them.
BUT this is very risky in the real world, since bots come from all over the globe. Example: if you block US IPs on .co.uk, then you will also stop GoogleBot from visiting your site. Roger will be stopped too, as will Bing and many other bots. No bots - no ranking... It can also be recognized as a sneaky redirect. So I don't recommend doing this GeoIP limitation without weighing all the pros and cons.
-
RE: I'm struggling to understand (and fix) why I'm getting a 404 error. The URL includes this "%5Bnull%20id=43484%5D" but I cannot find that anywhere in the referring URL. Does anyone know why please? Thanks
%5B is [
and %5D is ]
%20 is " " (a space).
So, technically speaking, something is appending "[null id=43484]" (without the quotes) to the URL. You need to find what does that and fix it.
-
RE: Angular JS - Page Load
Did you read the prerender documentation?
https://prerender.io/documentation/install-middleware#apache
There you can find two examples (Apache + nginx):
https://gist.github.com/thoop/8072354
https://gist.github.com/thoop/8165802
How do they work? Simple - bots receive a proxied version from this URL:
http://service.prerender.io/http://example.com/url
This works by switching your server into a specific proxy mode called a reverse proxy. A normal (forward) proxy sits between a few computers/a network and the internet: the computers are the clients, they send requests, the proxy goes out to the internet, executes them, and returns the results to the clients. That's the normal direction. In the reverse direction, the internet is the client, and the proxy serves requests to the internal infrastructure. This allows hiding the internal infrastructure, scaling easily, or even building a complex site from a few internal servers (one handles /blog, another /shop, a third /support, etc.).
But this prerendered version is served only to bots. Normal clients (those not on the bot list) receive the AngularJS version of the HTML. Since everything is served from your own server, you shouldn't hesitate.
Second - do NOT send the prerendered version to normal clients, because then prerender can't load pages from your server to prerender them. You can easily overload your server in a redirect loop - and prerender's servers too.
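For reference, a condensed sketch of the Apache variant from those gists (the full gists list many more bots and static-file extensions; requires mod_proxy_http, and example.com is a placeholder):
RewriteEngine On
# Serve the prerendered snapshot only to known bots...
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|baiduspider|twitterbot|facebookexternalhit) [NC]
# ...and only for page URLs, not static assets
RewriteCond %{REQUEST_URI} !\.(js|css|png|jpg|jpeg|gif|xml|txt|pdf)$ [NC]
# Proxy the request to prerender.io ([P] = mod_proxy)
RewriteRule ^(.*)$ http://service.prerender.io/http://example.com/$1 [P,L]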
-
RE: Extract price from API to a Rich Snippet
This is not a rich snippet. This is title manipulation.
If you look at their HTML code, the title is:
<title>Bitstamp - buy and sell bitcoins</title>
But if you load the page, you will see:
($price) Bitstamp...
In JavaScript they have code like:
var page_title = 'Bitstamp - buy and sell bitcoins';
$('head title').html('($'+loc_num($('span.live_price').attr('price'))+') '+page_title);
This code is responsible for updating the price in the title. It seems that Google indexes this and stores the modified title in its index.
Here is the official documentation on rich snippets:
https://developers.google.com/structured-data/rich-snippets/
and there is nothing there close to your case. You can probably use only reviews.
-
RE: Is this setup of Hreflang xml sitemap correct?
That's correct.
Long story: hreflang exists in the HTML head and in the XML sitemap. What's the difference? In HTML you can use same-site or cross-site links (between domains); in the XML sitemap they are only within the same domain. So in an XML sitemap you can't use cross-site links (between different domains).
Also, in Search Console you can set "international targeting" (Search Console -> Search Traffic -> International Targeting -> Country) for the different sites.
I know hreflang is a little bit complicated, but once you set up your first site correctly, the next sites will go much faster.
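For illustration, a minimal sketch of a sitemap entry with hreflang annotations, same-domain as described above (URLs and languages are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en/page.html</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page.html"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/page.html"/>
  </url>
  <url>
    <loc>https://www.example.com/de/page.html</loc>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/page.html"/>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page.html"/>
  </url>
</urlset>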
-
RE: Internal Linking
Have you read this article:
https://moz.com/blog/10-illustrations-on-search-engines-valuation-of-links
especially #5 and #6.
So, as EGOL says, use CrazyEgg + Riveted (an Analytics plugin) + ELA ( https://developers.google.com/analytics/devguides/collection/analyticsjs/enhanced-link-attribution ) to see where they click.
-
RE: International site
Why do you make a 301/302 redirect to the other site, when you can just set up cross-domain hreflang between the sites?
Then Italian users will see the Italian version in the SERP, and English users will see the English version.
Here is how to do it:
https://moz.com/learn/seo/hreflang-tag
https://moz.com/blog/hreflang-behaviour-insights
https://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool
https://moz.com/blog/open-source-library-tool-check-hreflang
http://www.searchenginejournal.com/getting-a-better-understanding-of-hreflang/60468/
You also need to verify both sites in Search Console; there you can set the Italian site to target Italy and leave the English site worldwide.
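For illustration, a minimal sketch of the cross-domain annotations in each homepage's head (the domains are placeholders; note that each page lists itself plus its alternate):
<!-- On the English homepage, e.g. https://www.example.com/ -->
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="it" href="https://www.example.it/" />
<!-- On the Italian homepage, e.g. https://www.example.it/ -->
<link rel="alternate" hreflang="it" href="https://www.example.it/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/" />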
-
RE: January 2016: Massive Rankings Fluctuations
Same here... first it went up, and now (this week) it's going down. They're probably tweaking something, like fine-tuning the algos.
-
RE: International site
Google recommends keeping this as a 302:
https://googlewebmastercentral.blogspot.com/2014/05/creating-right-homepage-for-your.html
so keep it as it is now, with the 302; just add hreflang for both pages after the redirect. Also, add a "Vary: Accept-Language" header on the original domain.
I think one redirect is better than two, especially for mobile users.
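If the original domain runs Apache with mod_headers, a one-line sketch of adding that header (in the site config or .htaccess):
# Tell caches that the response varies by the visitor's language
Header append Vary "Accept-Language"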
-
RE: Community Discussion - Are annotations an overlooked avenue for driving traffic from YouTube?
Yes! I have tested them in this one:
https://www.youtube.com/watch?v=5ip03xNfpwg
The CTR was 6.36% and the close rate 9.09%. I know they're not best-of-the-best annotations, but they definitely help. The video is a strange case, though, because it has a long intro, and one of the notes jumps to the point inside the video where the action begins. I'm also trying to implement "cards" in YouTube videos to see their performance too.
-
RE: Search Console Hreflang-Tag Error "missing return tag": No explanation
This happens when a page includes an hreflang link to an alternate language, but the linked page doesn't link back to it. This post explains it:
https://googlewebmastercentral.blogspot.bg/2014/07/troubleshooting-hreflang-annotations-in.html
Annotations must be confirmed from the pages they point to. If page A links to page B, page B must link back to page A, otherwise the annotations may not be interpreted correctly.
So if this error is reported, you should edit the alternate URL's page code to ensure there's an hreflang link back to the matching originating URL.
-
RE: Startpage and shop page shows the same thing, shall i set canonical url?
Well - this is an easy win. But you need to ask yourself: should you put products on the homepage at all?
In Woo you can easily duplicate the /shop page onto / (the homepage). But when I visit a site, I want to learn about the company, its best (best-of-best) products, etc. This is a huge niche for A/B tests. For example, you can put a promo on some products and drive traffic to them. Or company innovation, or company news, etc.
Imagine Amazon (just as an example): when you visit them, do you get ALL of their billion products on the homepage? Or when you visit a newspaper, do you get ALL their news on the homepage?
The only difference between /shop and / is one image above the fold on the homepage.
So - yes, you can canonicalize them, but then you will miss a much better opportunity in CTR or CRO.
-
RE: Search visibility increase with international SEO
I believe no one can give you a definite answer. But let's try!
The US population is 318.9 million (2014). The UK population is 64.1 million (2013). That makes very roughly 383 million.
The Hispanic or Latino population in the US is approx 54M: http://www.cdc.gov/minorityhealth/populations/REMP/hispanic.html
Now let's count all Spanish speakers in the world. That's Spain plus all of South/Latin America except Brazil (they speak Portuguese). Latin America is 626M (2015 est.), Brazil is 200M (2013), and Spain is 46.77M (2014). So a rough calculation gives 472.77M.
As you can see, with this translation you can effectively double your potential audience. I say "potential audience" because many things can go wrong. But maybe in the long term (this is a shot in the dark!) this can double (or more than double) your traffic in the best scenario. Or (again a shot in the dark) it might bring a 15% increase (the US Hispanic population divided by the combined US+UK population). Of course, both are very rough assumptions.
-
RE: Multiply List of Keywords | Tools?
Like my SEOggestor? But it's for one keyword at a time.
The catch is that with a large set of keywords they can detect your robot activity and temporarily stop you with captcha filling.
-
RE: Web Site Migration - Time to Google indexing
A 301 can take weeks or months, depending on how large your website is. With a site move, this happens within days. And a site move has an "undo" function.
-
RE: Hacked Wordpress Site! So many 404s
Well - go into Search Console and reevaluate the site from scratch. As long as you return 404 for the hacked pages, it's OK for you and for Google. But what did they tell you in the "reconsideration request"? Because "this site may be hacked" isn't helpful for your CTR.
PM me the site, please.
-
RE: Web Site Migration - Time to Google indexing
I think this one is best:
https://moz.com/ugc/accidental-seo-tests-how-301-redirects-are-likely-impacting-your-brand
(I could describe the same myself, but this article has diagrams and a lot of explanation.)
Also, think about this: say we have two pages - PageA about Coca-Cola and PageB about Pepsi Cola. Both pages have content optimized for their topic, and both rank. If we 301 PageA to PageB, do you think PageB will get the same positions for "Coca Cola"? That's right - those positions will be lost, because the content is different.
Of course, there is the special case where you rank for some keywords via anchors even if such content doesn't exist:
https://en.wikipedia.org/wiki/Google_bomb
That's why, when wise people talk about "over 200 ranking factors", they're damn right!
-
RE: Page speed - what do you aim for?
It's hard to explain, but in general "Less is MORE!" for those numbers.
Example: redirects. Redirects can kill your site, especially for mobile users - even a simple site redirect can take a second or two. Example: www.example.com -> 301 -> m.example.com. Looks simple, doesn't it? But in reality, after the client receives the 301, it must do a new DNS resolution (for m.example.com) and then a new connection to the new server. And that's the simple case... with 2 or even 3 redirects, mobile users wait up to 5 seconds before seeing anything. Hint: that's why I won't click most bit.ly, ow.ly, goo.gl links on Twitter, Facebook, or G+ when I'm on mobile. They first pass through the t.co redirect, then the redirect I can see, and sometimes even a 3rd one. I know marketers want to see "clicks", but it isn't good for mobile users.
Server connections also need to be fewer - which means the server needs to be closer to the user. The best example is Australia: there, even a simple DNS resolution + connection takes one second, and the client hasn't received a single byte from the server yet... You can check on WebPageTest.org (there are Australian test servers). Of course, running a dedicated server there is expensive, so you need deep pockets to put servers there. That's why most companies offer CDN support. Since the CDN endpoint is closer to the user, it makes things a little bit faster for them - and if the CDN is set up correctly, much faster.
So the idea is "Less is More!". Best is to use WPT to benchmark your site from all over the world, and also set up Analytics to measure speed, because the speed of your site under perfect datacenter conditions is different from the real world.
-
RE: Updating 2013 Site Built with Custom Theme, Modify Existing Theme, Create New Custom Theme, Or Use Child Theme?
Well... I have two pieces of news for you. First the good one: the theme looks fantastic.
Now the bad news: I looked at it on a mobile phone, and you can see the result in the attached file. Yes, I have an iPhone SE, because I love its form factor.
I'm so sorry, but it's 2018 and everyone today works on a smartphone. Probably something custom would work perfectly for you. And since it's 2018, there is also AMP - you should check that too:
https://moz.com/blog/amp-digital-marketing-2018
-
RE: Mac-Friendly, DOM-Rendering Spidering Tool for Multiple Users
Right now we're updating SEOSpyder ( http://www.mobiliodevelopment.com/seospyder/ ) to render pages, but I can't give you a timeframe for when it will be done.
So far the memory requirements aren't too high: it crawled a 250k-page site on a machine with 8G of RAM.
-
RE: Site under attack from Android SEO bots - expert help needed
Just as EGOL described it.
If you're on Amazon AWS, then you can use their CloudFront as a CDN. But you should also look at the source of the traffic: it could come from one country, one IP range, or one user agent. There should be some kind of pattern, and you should investigate it.
Then you just need to make a rule to block that traffic, or simply redirect it to a static "hello world" page.
I was also a victim of such traffic, but it came from humans trying to deplete an AdWords daily budget. Once the budget was spent, the ads stopped showing; after a few hours the clicks were recalculated, some funds were returned, the ads showed again, they clicked again, the budget ran out... and so on.
-
RE: How to enable lost trailing slash redirection in WordPress with Yoast plugin
Does the code in .htaccess work?
It should be at the top, because that file is executed from top to bottom. And if an earlier rule matches, the execution flow can stop there, so the following rules are never executed.
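For illustration, a tiny sketch of why order matters (the URLs are made up):
RewriteEngine On
# This rule matches first; [L] stops processing for this request...
RewriteRule ^old-page$ /new-page [L,R=301]
# ...so this later rule for the same URL never runs
RewriteRule ^old-page$ /other-page [L,R=301]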
-
RE: .co.uk to .com domain move Dec 26th, still 40% down - do I risk moving back? (desperate)
I think I already answered you on another forum...
-
RE: Can I remove certain parameters from the canonical URL?
Best practice is to remove and noindex such parameters. One of my latest projects was an ecommerce shop with 5k products and over 220k indexed pages from categories.
So you need to AVOID indexing parameters like ?results=16 or ?order=asc or ?search=test, and keep PURE pagination only.
Example:
https://www.jamestowndistributors.com/product/epoxy-and-adhesives
https://www.jamestowndistributors.com/product/epoxy-and-adhesives?page=2
https://www.jamestowndistributors.com/product/epoxy-and-adhesives?page=3
https://www.jamestowndistributors.com/product/epoxy-and-adhesives?page=4
are the ONLY valid category pages (a sketch of the markup is below).
If you have infinite scroll, the situation is described here:
https://developers.google.com/search/blog/2014/02/infinite-scroll-search-friendly?hl=en
I hope that this will help you!
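For illustration, a sketch of the head markup on a parameter variant such as .../epoxy-and-adhesives?page=2&order=asc, following the advice above (whether you pair noindex with a canonical is a judgment call):
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://www.jamestowndistributors.com/product/epoxy-and-adhesives?page=2">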
Peter