Get the iOS or Google Play link of the app and paste it into SDTT (Structured Data Testing Tool).
That's all. In SDTT you can see the data nested even deeper than those 3 points...
Mobile dev here!
The algorithm is simple for all stores (App Store, Google Play, Microsoft Marketplace, Amazon Appstore, etc.): they aggregate ALL ratings across all versions. Some of them count only the local store, others count globally.
Example - Duolingo:
https://itunes.apple.com/bg/app/duolingo-learn-spanish-french/id570060128?mt=8 <- this is the local Bulgarian market: 4.928 from 9 ratings.
https://itunes.apple.com/us/app/duolingo-learn-spanish-french/id570060128?mt=8 <- the US market: 4.791 from 4068 ratings.
https://play.google.com/store/apps/details?id=com.duolingo&hl=en <- worldwide: 4.648 from 2,343,538 ratings.
As you can see, sites provide the correct review rating to Google, but Google rounds it. Your guess about "over 4.75" is probably correct - there is no official explanation from Google about this. Some sites show a 5-star rating system. Others use 10 steps: 0, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, 4.5, 5. For others it's more complicated - I have even seen results like 4.2, which implies a 100-step scale.
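To illustrate how the precise value reaches Google, here is a minimal JSON-LD sketch using the US numbers above (the shape follows schema.org; treat it as an example, not Duolingo's actual markup):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Duolingo",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.791",
    "ratingCount": "4068"
  }
}
</script>
Google receives the precise ratingValue (4.791) and rounds it only for display - which fits the "over 4.75 shows as 5 stars" guess.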
Search Analytics is the only way to check what positions your keywords and/or pages have for specific queries.
There are two ways of aggregating - per site or per page. They are described here:
https://support.google.com/webmasters/answer/6155685
In the first case (per site), when one results page shows 3 of our pages at positions 1, 2 and 3 and the user clicks one of them, CTR = 100% (one click to the site) and average position = 1, because it's calculated from the highest position.
In the second case (per page), CTR = 33% (3 results, one click) and the average position is (1+2+3)/3 = 2.
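Here is a small JavaScript sketch of the two aggregation modes (the data is hypothetical, just to show the arithmetic):
// Three results from the same site on one SERP; the user clicked position 1.
var results = [
  { page: "/a", position: 1, clicked: true },
  { page: "/b", position: 2, clicked: false },
  { page: "/c", position: 3, clicked: false }
];
// Per-site: the whole site counts as one impression, the best position wins.
var siteCTR = results.some(function (r) { return r.clicked; }) ? 1 : 0;               // 1.00 = 100%
var sitePos = Math.min.apply(null, results.map(function (r) { return r.position; })); // 1
// Per-page: every result is its own impression.
var clicks = results.filter(function (r) { return r.clicked; }).length;
var pageCTR = clicks / results.length;                                                // 0.33 = 33%
var pagePos = results.reduce(function (s, r) { return s + r.position; }, 0) / results.length; // (1+2+3)/3 = 2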
Now, to your questions. It's normal for your positions to go up and down. Google makes changes to their algorithms - around 400-500 per year. Some of them can be seen, others go unnoticed. There are a few ways to stay informed about changes:
http://mozcast.com/ <- this is something like a weather report for Google. There are sunny days and rainy ones too.
https://moz.com/google-algorithm-change <- this is one of the lists of changes. Some of them are confirmed, others are not.
Watching the SEO community - on Twitter, Facebook, G+
Reading SEO news sites
Now back to your question. I believe your site went up and down from the updates on 6 Jan and 16 Jan (they are confirmed as a core update). But I don't know about updates on 22 Dec or 30 Dec. Since the "core update" is still being analyzed by many industry-leading SEOs (even at Moz), I can't give you advice about it.
Yes - Unicode has many kinds of quotes, and it seems that pasting a different quote makes the text look OK to humans but garbage to bots and browsers.
This is also a bad idea.
.htaccess has a parameter that enables or disables subdirectory .htaccess overrides - AllowOverride. And this can kill all Apache performance! Why?
Let's say you browse /index.html. Apache will parse .htaccess, execute the rules and return /index.html. This is the normal scenario - one .htaccess. But if you browse /subdir1/subdir2/subdir3/subdir4/subdir5/blah.jpg, this is BAD! Apache will parse /.htaccess, then /subdir1/.htaccess, then /subdir1/subdir2/.htaccess ... up to /subdir1/subdir2/subdir3/subdir4/subdir5/.htaccess, and only then serve blah.jpg.
Remember: Apache doesn't cache .htaccess files. They are loaded, parsed and executed for every resource requested. And when you access a second resource, it does all of this over again.
It's much better to put static configuration in httpd.conf, because that configuration is loaded only at startup, and there you can define per-site, per-directory and per-resource sections for each of them.
Other modern web servers use only static configs as well - nginx, lighttpd, etc.
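For example, a minimal httpd.conf sketch (the path and the rule are placeholders - adjust for your setup):
<Directory "/var/www/example.com/htdocs">
    # Stop Apache from probing for .htaccess on every request
    AllowOverride None
    Require all granted
    # Rules that used to live in .htaccess move here and are parsed once, at startup
    RewriteEngine On
    RewriteRule ^old-page$ /new-page [R=301,L]
</Directory>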
I think this one is the best:
https://moz.com/ugc/accidental-seo-tests-how-301-redirects-are-likely-impacting-your-brand
(I could describe the same myself, but this article has diagrams and a lot of explanation.)
Also think about this: say we have two pages - PageA about Coca-Cola and PageB about Pepsi. Both pages have content optimized for those terms and both rank. If we 301 PageA to PageB, do you think PageB will get the same positions for "Coca Cola"? That's right - those positions will be lost, because the content is different.
Of course, there is the special case where you rank for keywords via anchors even when such content doesn't exist:
https://en.wikipedia.org/wiki/Google_bomb
That's why, when wise people talk about "over 200 ranking factors", they're damn right!
Yup. The phone number will be real, and the phone asks for confirmation before dialing, to avoid accidental clicks.
No - adding subcategories to the menu will make navigation heavy for mobile users: click the hamburger, click a category, click a subcategory, click a sub-subcategory. No one will navigate that deep.
I sent you my contact details via your site, where we can discuss the "boring details".
Well, everything is doable - but remember: users come first.
So let's imagine I'm looking for someone to be the official photographer for my wedding, engagement or baby photo session. The first thing I look for is text like "I'm a photographer who can make your special events memorable...". OK - you have such text. Second is the gallery, third are reviews and pricing. In the current mobile design people just can't reach them, so you get visits with 1 page per session. Right?
That's why I think "Hamburger Gallery Partners Price" could work. The hamburger is the actual menu, as now - it just needs to be sticky together with "Gallery Partners Price". WHY?
Let's assume I'm on mobile searching for "Edinburgh wedding photographer". I see the results (the term is SERP - search engine results page) and I click on your site. I start reading your texts, and as I scroll down, the sticky menu appears on top. Users can click on it - as a result, you can convince the user that "here is the man". So they give you a call or fill in the contact form.
Second idea - place a "phone" icon somewhere in the sticky menu, linked to your phone number via tel: (the standard scheme; callto: is Skype-specific). This gives the user another way to book you.
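A tiny HTML sketch (the number and the icon path are placeholders):
<a href="tel:+441234567890"><img src="phone-icon.png" alt="Call us to book"></a>
Mobile browsers open the dialer with the number pre-filled and ask for confirmation before placing the call.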
This is not a rich snippet. This is title manipulation.
If you look at their HTML code, the title is:
<title>Bitstamp - buy and sell bitcoins</title>
But if you load the page, you will see:
($price)Bitstamp...
In JavaScript they have code like:
var page_title = 'Bitstamp - buy and sell bitcoins';
$('head title').html('($'+loc_num($('span.live_price').attr('price'))+') '+page_title);
This code is responsible for updating the price in the title. It seems that Google indexes this and stores the modified title in its index.
Here is the official documentation for rich snippets:
https://developers.google.com/structured-data/rich-snippets/
and there is nothing close to your case. Probably you can use only reviews.
I completely understand your fears, because WP + themes + plugins can be a HUGE mess.
Why? Because some devs don't know technical and on-page SEO. I have seen themes that put the text "Comments" inside an H1 tag while the post title was wrapped in an H3. Hidden text, messy HTML, bloated markup - such things are countless. You can also get server overload, slow SQL queries and other problems: performance issues, PHP issues, hosting issues. Running WP means you will get 10 CSS files and 10 JS libraries - and that's the "best case scenario"; just imagine the worst. Now add the "upgrade" procedure, where everything can break (or change!) with one click. Ah, and my favorite - security exploits and hacking. Sounds like a "perfect storm", doesn't it?
Now, I know this sounds scary. That's why you should review the HTML code of the WP site before migration. You will probably see potential for improvements, and those changes need to be patched over the original files (I'm talking about the theme and plugins). For the theme it's OK - you can make a child theme based on the original. But for plugins you need to "fork" the original plugin and make your own custom version. Then on each update you should "diff" your version against the original, to keep your patches alongside the new code. This means a very solid backup solution, plus a local dev environment, plus extra work on each update.
Also - it's 2016. Why not look around for alternatives? I can give you a suggestion - static site generators:
https://www.smashingmagazine.com/2015/11/modern-static-website-generators-next-big-thing/
https://www.smashingmagazine.com/2015/11/static-website-generators-jekyll-middleman-roots-hugo-review/
As you can see, I'm not giving a Yes or No answer. I'm just giving you a few extra points to think about - putting the cards on the table.
PS: This may sound a little negative, because I have fought with some themes and plugins in the past. One wrong choice and all SEO efforts can be ruined. Such is life...
Glossary:
"fork" - creating a different version of something existing, with changes that don't exist in the original. You may hear that devs fork projects too. In your case - you can apply your patches, but on the next update everything reverts to the original. That's why you need to fork.
"diff" - checking the difference between files/projects and extracting/showing only what differs between them.
Yes.
A disavow file is needed for each site version (http/https).
You can use Google PageSpeed Insights:
https://developers.google.com/speed/pagespeed/insights/
And if your site is on WordPress, you can install this plugin:
https://wordpress.org/plugins/google-pagespeed-insights/
You can also use WebPageTest, GTmetrix or Pingdom Tools. Even a simple *nix tool like wget can make a local mirror of the site so you can see file sizes (the command is wget -r -m http://m.domain.com/au ).
Alternatively, you can use desktop crawlers (like my SEOSpyder), where you can also see image sizes in bytes.
If they are not yours, it's better to disavow them. If they are spammy - disavow them.
Those links may hurt your rankings.
Skeleton has JS that does NOT show the sticky menu on mobile. But it was a nice example of my idea, and the first one I remembered while writing this comment.
The second idea is a sticky menu with "Hamburger_icon Gallery Partners Price" on mobile and the normal menu (just as now) on desktop. This can probably avoid confusing users about the menu and the hierarchical structure of your website.
But as I said - A/B testing is needed.
Dirk, the Q&A section is missing, but they (mods + community dept + Moz staff) actively track all activity here.
Same here. I have been reading Moz since 2007, but made my account somewhere around 2011 and started contributing here in 2015. So all my points are from the last year.
I'm now preparing a small script for tracking all activity in 2016, maybe for the Top 1000 users.
And this is the point where you can clearly see that RWD (responsive web design) isn't a "one-size-fits-all" solution.
You have a few choices, but all are weird - a double menu, hamburger + kebab menu, sticky menu on the left, etc. And only an A/B test can show how they work.
I think the best option in your case is a sticky menu on top. One of the best examples is here:
http://getskeleton.com/
but you need to view it on desktop. When you scroll down, you can see that "Intro Code Examples More" sticks to the top. On your site you would change that to "Gallery Partners Pricing". This menu can be sticky on desktop too - that frees up more width for the main content.
As I said - only an A/B test can show who's right, because double menus navigate better on paper but in reality confuse the user with the "paradox of choice".
A 301 migration can take weeks or months, depending on how large your website is. With the site move tool this happens in days. And the site move has an "undo" function.
If your "bad" link is like http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html then your .htaccess should be:
Redirect 410 /flibzy/foto-bugil-di-kelas.html
that's all.
Yes - you should do this for ALL 1205 URLs. Don't do it on legitimate pages (the ones from before the hacking), only on hacked pages. I say "gone" via 410. It's amazing - in your case, gone for good. Say the time to identify those 1205 URLs and paste them into .htaccess is X hours, and the time to identify them and temporarily remove them is Y hours. Since a "temporary removal" lasts only up to 30 days, the second option means repeating the same job every month. In total, over one year, you spend X hours in the first case and 12*Y hours in the second. You can see the difference, right?
Also, Barry Adams released a story about hacking today:
http://www.stateofdigital.com/website-hacked-manual-penalty-google/
It's amazing that the site was hacked for just 4 hours, yet Google noticed. You can see the traffic drop and the removal from the SERPs there. OK, I'm not going for "fear sells", but keeping bad pages as 404s takes a long time. In Jan-Feb 2012 I had a temporary site in a /us/ folder on my site, and even today, in Jan 2016, I still see bots crawling that folder. That's why I nuked it with 410. That saved the day!
In your case it's the same. The bot wastes time and resources crawling the 404 pages over and over while crawling your important pages less. That's why it's good to nuke them - ONLY them. This saves the bot's crawl budget on your website, so it can focus on your pages.
Keeping them as 404 is fine, but 410 is faster.
All you need is to place this somewhere near the top of your .htaccess:
Redirect 410 /dir/url1/
Redirect 410 /dir/url2/
Redirect 410 /dir1/url3/
Redirect 410 /dir1/url4/
But this won't help if your URLs carry parameters, like index.php?spamword1-blah-blah. For that you need an extended version like this:
RewriteEngine on
#RewriteBase /
RewriteCond %{QUERY_STRING} spamword
RewriteRule ^(.*)$ /404.html? [R=410,L]
RewriteCond %{QUERY_STRING} spamword1
RewriteRule ^(.*)$ /404.html? [R=410,L]
RewriteCond %{QUERY_STRING} spamword2
RewriteRule ^(.*)$ /404.html? [R=410,L]
So why 410? A 410 acts much faster than a 404, but it's DANGEROUS: if you send a 410 for a normal URL, you effectively nuke it. I found that with a 410 the bot revisits the URL 1-2-3 times, while with a 404 it keeps visiting over and over, eating your crawl budget. URL removal in Search Console is OK and fast, but it works only for 30 days, and building that list eats almost as much time as building the 404/410 list. Hint: you can speed up re-crawling if you do "Fetch and Render" and then "Submit to Index".
Yes - there is a bug in your robots.txt. You should write something like:
Disallow: /?display=table
or:
Disallow: /?display=*
Hey - I just found that we can "scrape" historical MozPoints from the profile pages:
https://moz.com/community/users/397332
If you view the source, you can see "var mozpoints_data" with JSON inside.
I can scrape all historical points for the Top 1000, for example starting from 2015-11-01, so you can build an even better sheet.
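A minimal sketch for the browser console on a profile page (it assumes the page still embeds "var mozpoints_data = [...]" inline; the regex and the array shape are my assumptions):
// Find the inline "var mozpoints_data = [...]" assignment and parse its JSON
var html = document.documentElement.innerHTML;
var match = html.match(/var\s+mozpoints_data\s*=\s*(\[[\s\S]*?\]);/);
if (match) {
  var points = JSON.parse(match[1]); // historical MozPoints entries
  console.log(points.length, points[0]);
}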
Google released "core update" on 8 and 15:
http://mozcast.com/
https://algoroo.com/
http://serp.watch/
Lot of SEO (including in Moz) trying to catch what's changed. This isn't Panda or Penguin.
Yes - there is turbulence in the SERPs right now:
http://mozcast.com/
https://algoroo.com/
http://serp.watch/
According to Google, this was a "core update".
OK, here is the document:
https://docs.google.com/spreadsheets/d/1YGkY2Yz1RefwcG4nqJ1lbnEfTghn7pRtb9di2HICiaA/edit?usp=sharing
captured an hour ago.
I left it as-is and just extracted the usernames; later I'll make a regex to extract their posts and UGC posts.
This is the Top 4291, as far as I can see.
Here is the procedure for getting it:
Just use some of my formulas to extract the number of posts or UGC. I can make a formula for extracting the username too.
This is pretty awesome! And I like it!
Would you like to extend this on Jan 31 this year for all members that we can see? I can make an importer of their table to XML, so we can later use =ImportXML. One piece of bad news: "Only members who have logged in within the last 60 days and have 25 or more MozPoints will be shown." So there may be other "top valued" members we can't see because of this limitation.
And there are also special users like "ghost in the shell" -> https://moz.com/community/users/2380
The XML sitemap format is well defined here:
http://www.sitemaps.org/protocol.html
But I can quickly summarize: every URL goes into a <url> entry with a required <loc>; <lastmod>, <changefreq> and <priority> are optional; and one file can hold up to 50,000 URLs (larger sites split into several sitemaps plus a sitemap index). I think that is the most important part of XML sitemaps.
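For reference, a minimal sitemap with a single placeholder URL (straight from the protocol above):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page.html</loc>
    <lastmod>2016-01-20</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>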
Yes - backlinks and their anchors are also a ranking factor. For example, there have been a few Google bombs -> https://en.wikipedia.org/wiki/Google_bomb - where many sites link to another site with specific anchors, making the "victim" site No. 1 for those keywords.
So - you can evaluate backlinks and their anchors with many tools.
Of all those tools, OSE is free; all the others require a monthly subscription. That makes OSE No. 1 on this list. OSE also provides "Link Opportunities".
I think the A1 Sitemap Generator supports this function.
If you will only display icons, it's OK. But it's dangerous to display text with display: none:
https://www.seroundtable.com/google-display-none-20626.html
https://support.google.com/webmasters/answer/66353?hl=en
https://youtu.be/7y-m_jiayLQ
https://youtu.be/B9BWbruCiDc
https://www.seroundtable.com/google-hiding-content-17136.html
https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html
Because hidden text is given "less weight", and you may get into trouble.
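To illustrate the difference (a hypothetical sketch - the icon path and text are placeholders):
<!-- OK: icon only, with a normal alt text -->
<a href="/contact"><img src="phone-icon.png" alt="Contact us"></a>
<!-- Risky: text served to bots but hidden from users -->
<a href="/contact">
  <img src="phone-icon.png" alt="">
  <span style="display: none">Edinburgh wedding photographer</span>
</a>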
Long story short - YES. Any JS framework can impact your rankings.
I'll share here again the link about how Distilled used a custom prerender service to serve a pure HTML version of the site to bots only:
https://www.distilled.net/resources/prerender-and-you-a-case-study-in-ajax-crawlability/
And it seems the experiment worked very well for them.
I went into Keyword Planner with the same keyword, and you can find the results it showed me in the link below.
So it seems SEMrush shows the impression sum for the whole ad group; you need to go into the group and look at the specific keyword.
TL;DR - it depends.
The main question is: how do you get traffic? If it's the traditional way - "write great content and they will follow" - it also brings comments, and comments create new content that brings new traffic, because Google/Bing/Yahoo can index the comments too. Moz uses this approach in their blog section, for example. This is the organic way.
But in the new way, comments on social networks bring the traffic, because more people see that a person commented/shared/liked something and do "curiosity-driven research" into that content. This brings only social network traffic, since not all Facebook comments can be indexed by search engines. This is the social way.
That's the theory of traffic - I describe only two ways (there are many more, but I'm simplifying a little). Now to the practical part. Disqus can have issues with Google crawling comments. This is well known and described many times: https://help.disqus.com/customer/portal/articles/762307-why-isn-t-google-indexing-my-comments-
Facebook comments also can't be indexed by Google at all. There are some articles from 2011 claiming they can, but I can't find a working example.
That's why it's best to check: are your existing comments indexed (e.g. search Google for site:yourdomain.com plus some exact comment text)? If they are, you can keep both commenting systems - that's much better. If they aren't indexed, you can switch to the Facebook commenting system. Also check other sites: can you find any Facebook comment in Google?
And one suggestion: "Don't build your house on rented land." If you can use your own commenting system, that will be much better. Just like Moz.
Well, my first answer has the 54M figure for the Hispanic/Latino population in the US. But you can't know how many of them search in English vs. Spanish.
I can speak for Bulgarians. Even when they migrate to another country, they still speak Bulgarian at home, watch Bulgarian TV, read Bulgarian newspapers online and buy Bulgarian goods. More interestingly - they still search in Bulgarian. Even on google.co.uk you can see Bulgarian queries and results. A real case: a friend of mine owns a TV repair service center and got a phone call from London about a TV repair - a lady there wanted to find someone to fix her mother's TV. The funny part: the distance between the service center and the home was barely 100-200 meters.
You don't know what you don't know...
This isn't an answer, but you need to read these articles:
https://moz.com/blog/pruning-your-ecommerce-site
https://moz.com/blog/15-seo-best-practices-for-structuring-urls
http://www.stateofdigital.com/optimising-urls-seo-ux/
Reading them may stir up internal debates about your site's information architecture and silo structure.
Well - true. That was pure forecasting, without market evaluation.
For example, your site could be for a local business. Can Latin America visit it? No, but local Hispanic or Latino residents can.
Or you could sell something expensive - Rolls-Royces, Teslas, or Elon Musk's rockets. Would a Latin American audience be interested in those? Probably.
Or if you share taco recipes - YES! They will be interested.
There are special market evaluation reports, but such a report takes months and is expensive. It's more accurate, but still a shot in the dark.
One thing will definitely happen - your traffic will rise. You can evaluate it using Keyword Planner and specific Spanish keywords and their monthly searches. It's easy: go into Search Console and grab your best keywords by searches - say the Top 100 of them. Translate them to Spanish, then paste them into KWP and look at their monthly search volumes. You can also compare the English KWs vs. the Spanish KWs to see the difference.
I believe no one can give you an exact answer. But let's try!
The US population is 318.9 million (2014). The UK population is 64.1 million (2013). That makes a very rough 383 million.
The Hispanic or Latino population in the US is approx. 54M: http://www.cdc.gov/minorityhealth/populations/REMP/hispanic.html
Now let's count all Spanish speakers in the world. That's Spain plus all of South/Latin America except Brazil (they speak Portuguese). Latin America is 626M (2015 est.), Brazil is 200M (2013) and Spain is 46.77M (2014). So, roughly: 626 - 200 + 46.77 = 472.77M.
As you can see, with this translation you can effectively double your potential audience. I say "potential audience" because many things can go wrong. Maybe in the long term (this is a shot in the dark!) it can double (or more than double) your traffic in the best scenario. Or (again a shot in the dark) it can bring you a ~15% increase - the US Hispanic share of the combined US+UK population, i.e. 54M / 383M ≈ 14%. Of course, both are very rough assumptions.
Including the sitemap in robots.txt is good practice, but it isn't required. You can add the sitemap in Search Console (or Bing WMT) and bots will index the site again. Bots can even index a site without any sitemap - the sitemap just helps them.
So don't worry. You can edit your robots.txt at any time and add the sitemap when possible.
Well, if both sitemaps are for the same site, it's OK. But it's much better to implement hreflang, as explained here: https://support.google.com/webmasters/answer/2620865?hl=en
I'm not sure Magento can do this out of the box, but you can always hire a 3rd-party dev to build a plugin/module for it.
Rebuilding in Angular will effectively nuke all search visibility of this site. I'm not kidding. You can see a similar question here:
https://moz.com/community/q/index-problem
and the answers there. Just open the site from that question and view its source (not the DOM! just the HTML source that the bot gets). You will be shocked too.
PS: I'm not saying a site built with Angular can't rank and be indexed. But it's relatively harder compared to a pure HTML site.
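To illustrate what "view source" means here (a hypothetical sketch, not the actual site):
<!-- Client-rendered app: the HTML the bot receives is just an empty shell -->
<body><div ng-app="myApp"><div ng-view></div></div></body>
<!-- Pure HTML page: the bot sees the content immediately -->
<body><h1>Product name</h1><p>Actual content that can rank.</p></body>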
You can also have multiple sitemaps hosted on 3rd-party sites. Look at Moz's robots.txt:
Sitemap: https://moz.com/blog-sitemap.xml
Sitemap: https://moz.com/ugc-sitemap.xml
Sitemap: https://moz.com/profiles-sitemap.xml
Sitemap: http://d2eeipcrcdle6.cloudfront.net/past-videos.xml
Sitemap: http://app.wistia.com/sitemaps/36357.xml
Also Google.com's robots.txt:
Sitemap: http://www.gstatic.com/culturalinstitute/sitemaps/www_google_com_culturalinstitute/sitemap-index.xml
Sitemap: http://www.gstatic.com/dictionary/static/sitemaps/sitemap_index.xml
Sitemap: http://www.gstatic.com/earth/gallery/sitemaps/sitemap.xml
Sitemap: http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml
Sitemap: http://www.gstatic.com/trends/websites/sitemaps/sitemapindex.xml
Sitemap: https://www.google.com/sitemap.xml
Also Bing.com's robots.txt:
Sitemap: http://cn.bing.com/dict/sitemap-index.xml
Sitemap: http://www.bing.com/offers/sitemap.xml
So using multiple sitemaps is OK, and they can also be hosted on a 3rd-party server.
Merging all domains under one will be best for you and your client. Here is proof:
https://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing
But it isn't just consolidation - you also need a silo structure, content changes, redirects and many other internal changes.
It could also be a CDN issue. But without an example we can only guess...
That article is from 2011, and today things are different.
Today you have Google My Business, where you can manage all your locations. Its "Insights" show each location's performance individually: clicks to your site, clicks for driving directions, clicks to call, etc.
But if you wish, you can still tag URLs to minimize the time spent going back and forth between My Business and Analytics.
As Dmitrii says, UTM is for tracking special campaigns.
But you can't track organic visits this way, because the bot crawls your website and finds pages A, B, C, etc., and later in the SERPs you see A, B, C and the rest of the pages as-is, at their physical locations - e.g. http://www.example.com/a.html. You can't get a tagged URL into the SERP.
This makes their request impossible to implement.
If they really, really, really want tagged URLs, you would need to change the canonicals of the pages and tag all internal links between pages too. And that is a pure recipe for an SEO disaster. It would also be a disaster for Analytics tracking, because all traffic would show up as tagged.
As Russ says - many things may happen. I will add a few more:
I wrote a few things above that need to be checked. I could write even more, but sometimes events are entirely outside of SEO. Imagine you're auditing something like "Rover cars" and there is a sudden drop... because the company is now defunct. You can also be hit by a negative campaign, rumors, etc. As you can see, it's hard to explain what's happening without knowing this customer, their niche, their competitors and their social networks.
You have a few more things to do:
change the redirect between the HTTP and HTTPS sites from 302 to 301 (see the sketch at the end of this answer)
verify the HTTPS site in Search Console too, and then do a "Change of Address" - it can also be used when you switch protocols
update your pages - canonicals, assets, images - so everything points to HTTPS pages/elements; internal links should also point only to HTTPS pages. I checked 2-3 pages of your site and they still point to HTTP. This gives bots the wrong signal
set up an HSTS header. This prevents browsers/bots from visiting the HTTP site for two years:
Header always set Strict-Transport-Security "max-age=63072000; includeSubdomains; preload" - put this in .htaccess
About errors.txt - I think it's much better if you allow crawling of everything. Here is an example from my site:
User-agent: *
Disallow:
Sitemap: http://peter.nikolow.me/sitemap_index.xml
As you can see, I allow bots to crawl everything within the WordPress folders.
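For the first point (the 302 -> 301 change), here is a minimal .htaccess sketch, assuming Apache with mod_rewrite (if your current redirect is set in a vhost or by a plugin, change it there instead):
RewriteEngine On
# Send every HTTP request to the same URL on HTTPS with a permanent 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]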
Currently you have made only half the move to HTTPS, and this sends the wrong signals to bots because the site isn't moved properly. Fix everything to avoid wasting crawl budget.