It looks like they're using a custom WordPress theme.
You may be able to find a similar one here: https://themeforest.net/category/wordpress/corporate/directory-listings
This may help you: https://support.cloudflare.com/hc/en-us/articles/200169886-Can-I-use-a-naked-domain-no-www-with-CloudFlare-
It looks as if you may not need to migrate the domain to the www. version after all.
My concern with having the H1 in the logo is that it may devalue the other H1 tags on the site, as they would then be duplicates. I haven't really done any research into this; it would be interesting to see what would happen.
Does your company do anything to help out the local community? If so, perhaps you can list that here. If you go into a McDonald's (at least the few local to me), there is a board where they highlight what they've done to help the local community. It's a great way of making yourself known in the local community whilst also keeping your content quality high.
I honestly think it's not required anymore. There may be a benefit for a small site, but I think it would be negligible.
This could possibly be due to: https://moz.com/community/q/10-14-mozscape-index-update-details
See the following snippet: "Second, many PA/DA and other metric scores will look very similar to the last index because we lost and had problems with some metrics in processing (and believe that much of what we calculated may have been erroneous). We're using metrics from the prior index (which had good correlations with Google, etc) until we can feel confident that the new ones we're calculating are correct."
I would personally link your logo to the canonical version of your homepage (which would commonly be abc.com), as you want to pass as much "juice" as possible directly to the canonical version of the homepage.
It sounds like you're getting referral spam in Google Analytics.
They aren't actually linking to you, so the good news is that it won't impact you in Google search; the worst they can do is skew your data. Their aim is to get you to visit their sites.
This is a guide that's great for blocking all referral spam: http://help.analyticsedge.com/spam-filter/definitive-guide-to-removing-google-analytics-spam/
You should be able to find the answer to your question right here: https://moz.com/learn/seo/redirection
Thank you for your responses. Hopefully someone who has experienced this before will be able to contribute. There seems to be very little information about the potential impacts in this area.
Hi Rachel,
Glad the resources were helpful. Re-reading my response, I forgot to add that you can also specify your target here: https://support.google.com/webmasters/answer/62399?hl=en
Best of luck with the multilingual site; if you have any more questions, please do ask.
Tom
Funnily enough, I saw this topic pop up on Twitter yesterday. The advice is still to disavow:
http://searchengineland.com/google-responds-mass-negative-seo-extortion-emails-200689
This should be a suitable guide for you: http://www.yourhtmlsource.com/sitemanagement/urlrewriting.html
Hi Tom,
I use Moz, Screaming Frog and this canonical checker: https://chrome.google.com/webstore/detail/canonical/dcckfeohihhlbeobohobibjbdobjbhbo?utm_source=chrome-app-launcher-info-dialog. I'm sure that these canonicals are set up correctly.
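If you ever want to sanity-check a canonical without an extension, a minimal sketch using only the Python standard library (the page HTML here is a made-up placeholder):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = attrs.get("href")

# Feed it the raw HTML of a page (e.g. fetched with urllib.request):
finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" href="https://example.com/page/"></head>')
print(finder.canonical)  # https://example.com/page/
```

The key point is that this parses the raw source, not the rendered DOM, so it won't be confused by tags injected inside iframes or by JavaScript.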
I will send you an email to the email you have included on your profile.
Thanks,
Tom
Hi Joseph,
This is what you're looking for: https://support.google.com/mapmaker/answer/6296952?hl=en
Hope it helps, Tom
Am I correct in saying that the allow/disallow is only applied to msnbot_mobile?
User-agent: Googlebot-Mobile
User-agent: YahooSeeker/M1A1-R2D2
User-agent: MSNBOT_Mobile
Allow: /
Disallow: /1
Disallow: /2/
Disallow: /3
Disallow: /4/
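For what it's worth, consecutive User-agent lines form a single group, so those rules apply to all three bots, not just MSNBOT_Mobile. A quick way to test this is Python's urllib.robotparser; note I've moved Allow: / after the Disallow lines in this sketch, because robotparser applies the first matching rule (Google instead uses the most specific match):

```python
import urllib.robotparser

# The group above, with Allow: / moved last because urllib.robotparser
# applies the first matching rule rather than the most specific one.
ROBOTS_TXT = """\
User-agent: Googlebot-Mobile
User-agent: YahooSeeker/M1A1-R2D2
User-agent: MSNBOT_Mobile
Disallow: /1
Disallow: /2/
Disallow: /3
Disallow: /4/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.modified()  # mark the file as "fetched" so can_fetch() gives real answers
rp.parse(ROBOTS_TXT.splitlines())

# Consecutive User-agent lines share one rule group, so the Disallow
# lines apply to Googlebot-Mobile and MSNBOT_Mobile alike:
for bot in ("Googlebot-Mobile", "MSNBOT_Mobile"):
    print(bot, rp.can_fetch(bot, "/1"), rp.can_fetch(bot, "/page"))
```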
Let's see if tweeting Roger awakens him/her/it! https://twitter.com/thomasharvey_me/status/783958116316643328
I don't think Martijn's statement is quite correct, as I've seen different results in an accidental experiment. Crawling is not the same as indexing. Google will put pages it cannot crawl into the index ... and they will stay there unless removed somehow. They will probably only show up for specific searches, though.
Completely agree. I have done the same for a website I'm working on; ideally we would noindex with meta robots, however that isn't possible, so instead we added it to the robots.txt. The number of indexed pages has dropped, yet when you search for an exact match it just says the description can't be reached.
So I was happy with the results, as they're now not ranking for the terms they were.
Here's a great post for hreflang and canonicals:
https://hreflang.org/use-hreflang-canonical-together/
With this image explaining it well:
https://hreflang.org/wp-content/uploads/2015/12/mobile-hreflang-canonical.png
I got around this by adding catch-alls for a-z/0-9.
RewriteRule ^a(.*) http://www.domain.com? [R=301,NE,NC,L]
This meant that the homepage wasn't being touched. However, I did lose all of the category-specific redirects. I'm sure there was a way around this, however I don't know .htaccess well enough. It would be interesting if anyone knows how to do all three.
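For what it's worth, mod_rewrite processes rules top to bottom and the [L] flag stops at the first match, so all three goals should be able to coexist by ordering the rules: category redirects first, then a single character-class catch-all. A sketch (the category paths here are made-up placeholders):

```apache
RewriteEngine On

# 1) Category-specific redirects first (placeholder old/new paths):
RewriteRule ^old-category/(.*)$ http://www.domain.com/new-category/$1 [R=301,NE,NC,L]

# 2) One catch-all instead of 36 separate a-z/0-9 rules:
RewriteRule ^[a-z0-9].*$ http://www.domain.com? [R=301,NE,NC,L]

# 3) The homepage (an empty path) matches neither pattern, so it is untouched.
```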
Screaming Frog is a great tool that has a free version with a 500 URL limit (suitable for most small sites).
Going by: "First, the hreflang tags are implement properly. UK page pointing there, US page pointing there. Further down the page, there are canonical tags - except the UK canonical tag points to the UK page, and the US version points to the US page. "
It looks like you're doing it fine; just use the chart on the site Nikhilesh linked: https://hreflang.org/wp-content/uploads/2015/12/mobile-hreflang-canonical.png
Personally, I believe that moving to HTTPS is something the majority of sites should do. Google gives a slight ranking boost, and customers are slowly coming to trust HTTPS sites more; with Chrome moving to label HTTP pages "not secure", there's even more reason to do so.
In the majority of cases, moving to HTTPS is quite a simple process if you're using a common CMS. Just check that all of the scripts are functional and redirects are in place; once you've done that, submit the sitemap and wait for Google to recrawl your site.
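If the site is on Apache, the redirect side is typically a couple of lines in .htaccess (a common sketch, assuming mod_rewrite is enabled):

```apache
RewriteEngine On
# Send any http:// request to the same path on https://
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```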
As the Moz index grows in size, Moz may have crawled some lower-quality sites that have actually lowered your Domain Authority.
Have you seen an actual drop in traffic? Whilst DA is a useful metric, the truly important metrics are rankings and traffic.
Guess you forgot to add Google Analytics too.
I've been using them for the past few months, and their offering is, from what I can see, very good value. I've had a few issues setting up and learning their platform (as well as actual bugs), however after chatting with them on their live chat the issues were resolved, and they've also given credits to make up for their mistakes. So I'm a pretty happy customer of theirs.
Have you got a link to your site? I'd happily have a quick look around from an on-site perspective. (If you don't want to post it publicly, please do PM me.)
"or majestic is counting each redirected page on the old toptwincitiesrealtors.com site as a backlink"
That is what I would guess is happening, I believe the same happens in Google Search Console when you do the same thing.
Just going through Laura's list as a checklist for ones that are applicable:
Nothing that I can see, that's causing a major issue.
The main thing I can see is that the product URLs and canonicals are different; is there any way of listing the product URLs as their canonical versions in the category?
Very few people will be typing https://www.me.website.com, so I believe you will be fine with https://me.website.com.
If the links are from relevant pages (topics) within your client's niche, it shouldn't matter where in the world they come from. The idea that the countries two different sites are based in affect link value does not make sense to me. A post on a specific topic has global value. Take architecture as an example: a UK-based company does some great work, and a newspaper in the US covers the opening of the building and links to the architect. The fact that they are in different countries is almost irrelevant. I have seen no case studies to show that anything else would be the case.
What I've suggested would avoid these duplicate URLs. Here are some actual examples; going via a tier-two category I get the following product URL:
https://www.symectech.com/epos-systems/customer-displays/pole-mounting-kit-94591.html
With a canonical of:
https://www.symectech.com/pole-mounting-kit-94614.html
Yet when going from https://www.symectech.com/epos-systems/?limit=32&p=2 (a tier-one category) I get the canonical URL.
So if products are listed in multiple tier-two categories, that's multiple URLs for the same product. With the suggestion I made, there would only be one variation of this product URL (the canonical).
They've had a few posts on some domain forums:
http://www.domainnameforum.org/showthread.php?t=97639
https://www.namepros.com/threads/problem-with-tradenames-com.755117/
PA and DA are third-party metrics, meaning they have no impact whatsoever on Google rankings.
However, the canonical is only an advisory tag. I've seen a few cases where people relied on their canonical tag when their site had numerous product URL types (as above, with the category in the URL versus just the product URL) and many references to these different URLs elsewhere (on-site and off-site), and both versions are now indexed, which is not always ideal. Using only the canonical URLs also means that reporting tools such as Screaming Frog show only the true URLs on the site, and it saves crawl budget, as Google doesn't have to crawl both the category-produced URL and the canonical URL.
Whilst it's not a major issue, it's something I would look at changing.
I would say it depends on how bad the current page is. If you're going against known best practices I would do a redesign and then do element by element.
However if you're close and just trying to squeeze every penny possible, I would do individual elements on a 50/50 (or 30/30/30 or something like that if more variants) if possible. You don't want your users to load one time and see something completely different the next time.
Yes, I would have separate robots.txt files.
As mentioned by Thomas, I would really have a look at Cloudflare; their free plan, for me personally, performs much better than other paid content delivery networks.
I've got to agree; an A/B test on these pop-ups would be a perfect solution.
The biggest benefit would be redirecting old pages to new. If not, just redirecting all of the old pages to the new homepage would be OK.
Just make sure that these are the kind of links that you want to have on your domain.
I've had a quick look, and I've been able to crawl it without any issues; the meta robots and robots.txt don't seem to be conflicting with anything at all.
It just doesn't look like it's even been found by Google.
Bing doesn't look to be having any issues indexing it: http://www.bing.com/search?q=site%3Aoldermann.no&go=Submit&qs=n&form=QBLH&pq=site%3Aoldermann.no&sc=0-17&sp=-1&sk=&cvid=B677970FF39F4A62A9B040806CF19EE0
What's the quality of the content on the site like? Have you "fetched as google" in the search console?
Hi GoMentor,
Rand did a whiteboard friday on this topic: https://moz.com/blog/wrong-page-ranks-for-keywords-whiteboard-friday
Hope it helps,
Tom
You know what, I don't think I am wrong.
Have a look at this source code from Bing's cache, can you see it there? http://cc.bingj.com/cache.aspx?q=site%3Ahttp%3A%2F%2Fwww.oldermann.no%2F&d=4558460857156532&mkt=en-GB&setlang=en-GB&w=Zicz6YAWVXgY8u0BPAR9qKRHF8OEn7o5 (Disclaimer I got my friend to check as I'm on a mobile)
This Gyazo was just taken as an example: https://gyazo.com/93c71c3767c57b74cb9f8b482c23c38
Just ran it through this too, showing the same thing: http://www.seoreviewtools.com/bulk-meta-robots-checker/
Oh, and so you know, I think I know why you've found the noindex: you're looking at the iframe for AddThis.
You can also submit to Google without using Search Console: https://www.google.co.uk/search?q=submit+url+to+google&oq=submit+url+to+google&aqs=chrome..69i57j0l5.2372j0j1&sourceid=chrome&ie=UTF-8 (search "Submit URL To Google") and then just paste your URL in. You can do this with any URL.
A friend just confirmed it: as you're using developer tools, what you're seeing is actually the fully loaded page, including iframes. So the noindex you have found is AddThis's. https://gyazo.com/abbdecc7e3d6a2d1f4d3acf65a48e65b
To actually check the source code, right-click and then click "View Page Source". That's the code for the specific site, without all of the extra code you've been seeing. I would strongly advise checking it that way from now on, as the way you are doing it is incorrect.
If I'm correct, you're no longer using the old domain and just redirecting to the new site?
If that's correct, why keep it with Shopify at all? You could instead point it to a different server (or to Cloudflare, which will be able to handle the redirects through page rules without you having to write any code).
The best course of action is to 301 redirect the old site to the new site, then on the new site (after checking for issues) submit sitemaps / fetch as Google. To speed this process up slightly, you can also submit a sitemap for the old site; Google will see the redirects when it tries to crawl any page.
It shouldn't do, as it's just an iframe. If that were the case then every site that used AddThis would have a problem (which I know for a fact isn't true, as two large sites I work with use it and have no issues).
As far as I'm aware, iframes do not pass on any of their attributes to the page embedding them. The main reason I was curious about your reply was that Screaming Frog hadn't had any issues; I can understand myself having a small slip-up, but a piece of software rarely does.
Back to the original question, though: it is curious why Google hasn't indexed it. Bing seems to have picked it up, at least as of three days ago. I can't see a reason (in my brief look earlier) why it shouldn't be indexed, so I came to the conclusion that it hadn't been crawled, or it has incredibly poor content (which I can't verify, as I don't know Norwegian).
In my opinion, it's a business decision that's then implemented by the SEO, so ask your bosses these questions and find out what experience they want for their customers.
I'm afraid I don't have an insight into how Google crawls with lazy loading.
Which works better for your users, pagination or lazy loading? I wouldn't worry about lazy loading and Google. If you're worried about getting pages indexed, then I would make sure you've got a sitemap that works correctly.