Hi Yael
I completely agree - it is pretty much what canonical tags were developed for.
Regards
Nigel
Hi Becky
One was called Pigley Stairs (Social Media), the second one Carousel Projects (Digital Marketing). I was able to merge the two together.
They are completely different!
I hope that helps
Regards
Nigel
Hi Cheedbe
The very thought of you doing this is mind-numbing. Magento is far more configurable than Shopify, so stay where you are. If things are not going the way you want, then get a better Magento developer working on the site.
The move will cause a huge jolt in traffic that you don't need.
All those redirects will have to be done manually for all product and brand pages, and possibly all sorts, filters and brand/category combinations. Each redirect could cause a 15% drop in traffic, depending on who you listen to, and that's after all the time it will take Google to reindex everything!
It makes sense to separate English and French directories but I would do it on the platform you are on. Use hreflang to point to the two variations. https://support.google.com/webmasters/answer/189077?hl=en
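As a sketch of what those hreflang tags could look like in the head of each version (the domain and directory names here are hypothetical placeholders, not your actual URLs):

```html
<!-- Placed in the <head> of BOTH language versions -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
```

Note that each language version carries the full set of tags, including one pointing to itself - the annotations must be reciprocal or Google ignores them.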
We moved from a bespoke platform to Visalsoft and immediately lost 40% of traffic. It took so long to get it back and in the end, we gave up and the company folded. It wasn't the only reason but the drop in traffic and associated turnover was a major factor.
Regards
Nigel
Hi Gerard
I have looked at the URL and backlink profile for starters. So these are my observations:
1. https://www.lewestowntaxis.co.uk/ was registered in 2012, yours in 2008. Looking at Moz, their Domain Authority is 10 and yours is 14, so there is little difference
2. You both have very few backlinks: 7 vs 8
3. Lewestowntaxis is quite skinny and yours is better defined. Plus you rank for around 4x the number of keywords they do.
4. You've got better GMB reviews than them too!
If it were me, I would:
Work on your local backlinks, from Yelp, Yell and Thomson to local news sites - get those links! This is not difficult. You can use Moz Local, or Bright Local is good - you can do a 35-site citation burst for $100: https://www.brightlocal.com/take-contol-of-your-citations/
Get more reviews on GMB - talk about yourself and add photos, links & posts
https://support.google.com/business/answer/3038063?hl=en-GB
Set up Google maps and embed one in your site - 'How to Find Us'
https://support.google.com/maps/answer/6139433?hl=en
That really is just for starters and the best things you can do to bump your site upwards.
I hope that helps,
Regards Nigel
Hi SEOanalytics
The better a sentence is written, with contextually strong keywords intertwined throughout, the more elegant it reads and the more professional it looks. Just compare reading Oscar Wilde to JK Rowling. Wilde's writing is poetic and richly written, with prose-like descriptions, fabulously florid verbs, alliteration and dialogue.
For me, it's important not to repeat nouns. For example: 'The singer stalked the stage' (subject, predicate (verb), object), 'then the singer leapt into the audience, smashing his face bloody on the ground'. You would change the second 'singer' to 'he' or 'front man'. There are a million examples, and this is a very obvious way of using semantics.
There is an art to good writing and the Subject/Predicate/Object form is just a discipline to ensure you write in a clean and precise way. It's also good for SEO as it ensures that the correct keywords are used in the right way, using the correct tense and not stuffed.
One could imagine that if my sentence were targeting the keyword 'singers', then the use of an alternative, or semantically related, term would enhance the SEO value of the piece.
There is rather a wonderful video here: https://www.brightstorm.com/english/grammar/sentence-basics/subject-predicate-and-objects/
But of course, I have tried to explain the practical use of this in relation to SEO. If we didn't write in this way then the piece would simply be a list of interconnected words or bullet points.
I hope that helps,
Regards
Nigel
Hi James
The reality is that it doesn't matter whether there is a trailing slash or not at the end of your URLs. What is important is that only one version is used and preferably there is no 301 from one to the other if it can be avoided. Especially if there are live links going to one or the other on the front end of your website.
So in your case you have navigation links with no trailing slash and a forced 301 adding them on.
I would remove the htaccess code which is forcing everything to a trailing slash and then add a piece of code removing it from any inbound requests.
Clearly, all existing backlinks will include the slash, as will Google's indexed results - adding the code will resolve these pretty quickly, and your existing search results will flick over when they are next crawled. How quickly will depend on the size of your website and the crawl rate. You can check this in Search Console.
Remember that if you do this, the backlinks from other websites will have a trailing slash, and when the hits come in, the new 301 will take them to the non-trailing-slash version. There may be a small drop in link juice from these backlinks (I say 'may' as Rand Fishkin still believes so - others swear blind there isn't), so be prepared.
You have to balance this small backlink problem against actively pointing to URLs that 301 redirect. Any SEO will tell you that this is not good! Presumably, the sitemaps don't have trailing slashes? So your site says one thing and your sitemaps another - a nightmare.
This is a version of the code to be placed at the top:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
# For testing use [L,R]; once verified, change to [L,R=301] for production
RewriteRule ^(.*)/$ /$1 [L,R]
I hope that helps
Regards
Nigel
Hi DGAU
There is no doubt that cross-domain 301s do pass link juice and depending on who you listen to you may experience a 15% drop in the juice passed.
The problem is that if it is completely different and irrelevant content then you may do more damage than good.
Your second option, linking to a directory with relevant content is a much better idea and may help you get the link back if you can show them that the destination URL has more relevant content. It also means that you may keep it for longer. I should imagine that no Government department would want to link out to an irrelevant URL!
First, though I would enhance the page to make it as relevant as possible - then request the link back. You'll give yourself a much better chance of getting it!
I hope that helps, Regards Nigel
Hi seoanalytics,
Haha, perfectly OK
That's completely OK, in fact, 'stop' words like 'the' 'of' 'it' and 'and' are generally ignored anyway. It also reads much more naturally so not only will you keep Google happy, the reader will prefer it!
Kind Regards
Nigel
Hi Varun
This could well cause problems for you, especially if they did it quickly. Usually, if there is a reasonable gap, say one month, then Google will assign authority to the site that published the content first. The problem comes when the second site is a large one with a higher Domain Authority - it could be that their published copy ranks higher than yours.
Whatever the case, it is simply bad to have two articles with duplicate content, so my best advice is to ask Google to take the copied version down.
This is quite a simple process, all you have to do is to tell Google here:
https://support.google.com/legal/answer/3110420?hl=en-GB
Scroll to the bottom: Submit A Legal request and follow the link.
Then choose: Web Search
Then choose: I have a legal issue that is not mentioned above (the bottom one)
Then select: I have found content that may violate my copyright
Then fill in all of the details and wait for them to come back to you.
You could also send the site a legal letter telling them to remove it - and Google will remove the duplicated content from its search results.
I hope that helps
Regards Nigel
Hi Conversal
There are search results - they are just not showing in SEMrush and Google takes a while to report them
Regards
Nigel
OK
Yes, keep it fluid, there is no need to group them together. Make it read naturally.
The best way to format the Meta title tag is like this
'Main Keyword - Related Keyword (or higher category) - Company Name' so
'Piedmont Bike Tours - Cycling in Italy - Company Name' (keep under 60 characters)
Meta description: Make this your call to action - it does not help SEO per se so don't fill it with keywords.
'Book a thrilling bike tour of Piedmont in the sunny hills of Northern Italy - unbeatable prices from company name - call us on 12345678 or book online here!'
Keep it under 140 characters or Google will add an ellipsis at the end ' ...'
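Putting the two tags together in HTML, using the hypothetical bike-tour example above (the wording and phone number are placeholders):

```html
<head>
  <title>Piedmont Bike Tours - Cycling in Italy - Company Name</title>
  <meta name="description" content="Book a thrilling bike tour of Piedmont in the sunny hills of Northern Italy - unbeatable prices - call us on 12345678 or book online!">
</head>
```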
The H1 does not help SEO on its own; it's only part of the keyword density on the page. It should sit at the top of the page and contain the keyword.
The H2 does not help SEO per se but is useful for breaking up the content on the page. It is usually only used to divide content - so if you do use it, make sure that you follow it with a few paragraphs. Some people advocate a minimum of 300 words, but it's not the number that matters, rather the way it's written. Don't overdo the keyword; think semantically.
Keep it natural - I can't advocate that enough.
Regards
Nigel
PS when I say 'per se' I mean 'directly' - of course, a well-written description with strong headings will cause people to visit the pages and take action - the interaction through lower bounce rate and less pogo-sticking is good for SEO, obviously.
Hi Jon
Do you have the pro version of AIO? Apparently, if you do then the correct translation is sent to each of the language versions of a page or post.
https://wpml.org/documentation/plugins-compatibility/translating-all-in-one-seo/
Maybe it doesn't work with the unpaid version?
Whatever the case, you need the meta titles and descriptions to be in the correct language - there is little point having them in English.
Regards Nigel
Hi Tom
Moz will not reveal a 301 unless there is a nasty redirect chain. If you use Screaming Frog it will reveal all the directives for every page.
There must be a redirect but it might be worth checking if it's a 301 (permanent) or 302 (temporary) - it should be 301.
The good news is that it is redirecting.
As Martijn suggests you should add the preferred one to Search Console. It doesn't 'do' anything but you will be able to see both versions.
Regards Nigel
You can have related topics on the page as long as they don't swamp the main keyword, making it invisible within the text. If you are going to do this then have a heading for each and continue like this.
"Barolo, set in the heart of Piedmont in the province of Cuneo is the perfect place to pack up your trail bike and hit the road. There are miles of country routes to choose from..........."
So you are weaving a story about how great these areas are to cycle in whilst occasionally mentioning the region and province, all of which are important baby anchor texts.
I am not saying 'don't' mention the towns - I am saying that if you do make it clear where they are and don't swamp the main keyword - i.e dilute it too much on the page. And, if Barolo is very prominent maybe consider a page for it.
You could end up with a category-killer page of 1000+ words, but remember that these pages need to keep the attention of the reader, so pepper them with a nice balance of text and images.
In terms of the Moz keyword tool:
Enter the term 'Piedmont bike tours' and filter by medium lexical similarity. You will see words like 'cycle', 'cycling routes', 'rental' and 'trail'. Weave these into the sentences that you are constructing. It takes some skill to write in a fluid manner without repetition - make it for the reader, not Google - make it engaging so that people will read it - and make sure all pics have different alt text.
Regards
Hi Michel
I work with eCommerce sites so have a lot of experience in colour and size split pages.
If the exfoliating cream comes in only one size, you would be better off with the first option. Write a really strong product description of 150+ words, along with technical details.
If the cream comes in various sizes then have them as a drop-down variant on the page. So you would have a single product with a strong product description and then a drop down box so that the user could choose the size of tube/bottle they needed.
The reason is that if you don't do this you will end up duplicating the product descriptions. So say you have 3 sizes then the URLs would conceivably be like this:
https://www.example.com.au/aspect-exfoliating-cleanser-240ml
https://www.example.com.au/aspect-exfoliating-cleanser-360ml
https://www.example.com.au/aspect-exfoliating-cleanser-480ml
This would mean that all three would share the same product description and the only thing separating them would be the size. This is bad for SEO as Google doesn't rank pages with duplicate content well. In fact, it could end up that none of them ranks in the top 5 pages.
Keep it simple - one product - size drop down.
Successful stores do it this way.
I hope that helps
Regards Nigel
Hi Roadmedia
More like a large site with many out-links and very few backlinks and skinny content.
Build the site up and scale high-quality content with fewer external links. It's not so much about a lack of fresh content or internal links - although they are important, they won't lead to a high spam score.
Regards
Nigel
Hi Tom
If it redirects to www.domain.com then that must also be set up in GSC as that is now the preferred domain format. It looks better as well without the trailing slash.
Regards
Nigel
Hi iHasco
Neither seems to rank.
Here's what I think:
Your sitemap has the wrong URL in it - with a trailing slash at the end: https://www.ihasco.co.uk/courses/detail/manual-handling-training/
The website has a version without a trailing slash! https://www.ihasco.co.uk/courses/detail/manual-handling-training
This means there are effectively two versions of this page so you have perfect duplication as both are regarded as different by Google.
The Solution: 1. Remove the trailing slash version of the page.
2. 301 redirect the trailing slash to the non-trailing slash in htaccess
3. Check for other problems in the sitemap - e.g. you have a page https://www.ihasco.co.uk/terms-and-policies/terms-and-conditions-of-use/ in the sitemap which redirects to https://www.ihasco.co.uk/terms-and-policies. If there is a redirect or a canonical in place, DO NOT put the original URLs in the sitemap!
4. Put a general directive in htaccess 301'ing all trailing slashes to non-trailing slashes to avoid any further problems.
5. For a quicker result, go to Search Console and physically remove the trailing slash version of the page. It'll be gone tomorrow. At the same time, do a Fetch as Google for the correct URL - you will be back at number 1-3 within a week.
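For step 4, a minimal htaccess sketch (this assumes Apache with mod_rewrite enabled - test it on a staging copy first, as your existing rules may interact with it):

```apache
RewriteEngine On
# Don't rewrite requests for real directories
RewriteCond %{REQUEST_FILENAME} !-d
# Permanently redirect any trailing-slash URL to the non-slash version
RewriteRule ^(.*)/$ /$1 [L,R=301]
```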
You basically have a situation where you have duplicate content, Google doesn't know which version to rank so ranks neither. You also have a problem where Google does not trust your sitemap so make sure the sitemap is a pure reflection of what is on the site. If you don't then Google will not trust your 301s or canonicals and could end up ranking other spurious pages.
I hope that helps
Regards
Nigel
Just double check in a clean browser (with history cleared & F5) or in incognito mode to check the default.
Sounds good Tom!
Brilliant! Good luck with it. Please do me a favour and hit the 'good answer' button - thanks!
Hi Brian
If you try and do that it never works - it strips any subdirectory out. I have only ever been able to do it for the whole site.
I'd be interested if you do find a tool which works at page level!
Cheers
Nigel
Hi Alexis
If the third one is the default then you need a default hreflang tag.
https://moz.com/learn/seo/hreflang-tag
So the last one would have this tag pointing to it:
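(The tag itself seems to have been stripped by the forum - a sketch of it, using example.com as a placeholder for the default version's URL:)

```html
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```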
More on Google here:
https://support.google.com/webmasters/answer/189077?hl=en
It will then become the default site for all people not in England or Canada. Google will not see any of them as duplicate content.
Regards
Nigel
That doesn't work by page Paul - That's what Brian is asking.
Surely the x-default is, as the tag suggests, a default where no country or language is targeted? So if someone resided in an untargeted country and the site happened to rank it would be that one that came up.
Someone in the UK (which contained a UK target tag) would not go to default first, as you suggest, and then select their own country & language. That's misleading.
I agree that the subfolders would be used to target each country but you would still need both country and language. With Canada you may wish to target en and fr as both are relevant and each would reside in a different sub-folder.
The language is essential imho.
Regards Nigel
Hi Mike,
That new tool is very revealing and supports my experience that you can't dupe Google into ranking a different page just by canonicalization. Thanks!
Nigel
Hi Becky
I can see chairs:
https://www.key.co.uk/en/key/chairs
But the paginated versions above are not in there. (can you see them?)
All you need to do is remove this directive for pages without a page 2: <link rel="next" href="https://www.key.co.uk/en/key/chairs?page=2"> - as there is no page 2 for chairs.
Regards
Nigel
Hi
There is no website there. Or you have given us the wrong URL. You can't rank staging sites, they need to be live.
Regards
Nigel
Hi Booedreaux
Whilst all of these issues are important I wouldn't have thought they would cause such a drop.
One by one:
1. Canonicals - each page should have a self-referential canonical - this stops utm versions created by apps such as Twitter being indexed. It's always better to have them. It also allows you to combine identical pages. Note if you have canonicals in place and the two pages are not the same then Google has recently started ignoring them and listing both URLs anyway so check in new Search Console what Google sees as the canonical.
2. Missing H1 - This helps Google understand the page content and should be a close variant or synonym of the primary keyword.
3. Long URLs - some just are and Google has a pretty high tolerance for long URLs - this often can't be helped with attributes being common on eComm sites.
For more, I'd need to see the URL.
Regards Nigel
Hi Jens
You can't add a noindex in the Robots.txt file.
Firstly, you need to add a noindex tag to all of the pages in the /node/ directory.
Then remove the Disallow directive for /node/ from the robots.txt.
You need to do this for Google to see the noindex tags!
If you have a noindex tag but the directory is still blocked in robots.txt, then Google can't see the tags!
Once all the pages have gone from search, add the Disallow back to the robots.txt file so that Google doesn't waste crawl budget trying to crawl them.
This will solve your problem.
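To make the two pieces concrete (the /node/ path is from your example; treat this as a sketch):

```html
<!-- Step 1: add this tag to the <head> of every page under /node/ -->
<meta name="robots" content="noindex">

<!-- Step 2: temporarily remove this block from robots.txt so Google can
     crawl the pages and see the tag, then restore it once the pages have
     dropped out of search:

User-agent: *
Disallow: /node/
-->
```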
Regards
Nigel
Hi Jon
I am with Joseph here - I use Yoast as well.
Just make sure that if you have multi-language versions of pages, you specify the hreflang tag on each.
https://moz.com/learn/seo/hreflang-tag
Regards
Nigel
Hi Jens/David
You should not use a noindex in Robots.txt. You can put it on the page as a robots tag, but not in Robots.txt
I have never ever seen it used in the Robots.txt - I have seen it mentioned a few times on some questionable sites and the odd mention many years ago but it's bad practice in my opinion.
Read more about Robots.txt here: https://moz.com/learn/seo/robotstxt
If you follow what I have said, that is the correct solution.
Regards Nigel
Hi Beck
Absolutely the correct approach. If you find any that simply don't match at all then you may consider gender, if that is appropriate or homepage if there is nowhere else.
Individual products need mapping as well, and if there is no close match for any of them then category>gender>home whichever is most appropriate.
Obviously, if there is no gender then just 301 to home but I work with a lot of fashion sites where it is relevant.
Regards
Nigel.
Hi Dale
If the loop you have specified there were real, then the homepage wouldn't show up.
https://www.theirsite.com/keyword>https://www.theirsite.com/>https://www.theirsite.com/keyword ad infinitum...
It would just keep on going and Google wouldn't be able to show the page.
You could use Screaming Frog to check for redirect chains - SEMrush and MOZ also pick them up so scanning the site would be my preferred option before touching it. Failing that I would remove it and see what happens.
Regards Nigel
Hi Roman,
I work with a lot of e-commerce companies and I have to say from one SEO to another this is great advice!
Best Regards
Nigel
Hi Raymond
It's a standard approach for most eComm stores. I prefer 301'ing to the closest new style first but if there isn't one then brand or category is perfectly fine. If the product then comes back into stock then a return to a 200 is fine - Google will pick it up if the 301 is removed. Also, make sure it's back in the sitemap.
Regards
Nigel
Hi James
This is a huge question - 'How do I do SEO?'
If you know nothing then hire a consultant to help.
However:
1. Make the site structure easy to understand so Home>Department>Category>Subcategory
2. Use attributes for colour and size, so you have product pages with drop downs. Otherwise, the colour & size URLs compete with each other.
3. Construct an easy to follow menu.
4. Make sure filters do not create extra URLs - better to use Ajax
5. Make the site as fast as possible - consider a content delivery network (CDN) for larger sites.
6. Keep images under 100k by exporting them as 'save for web'.
7. Keep the descriptions and Add To Cart buttons above the fold - in the visible eye line when loaded.
8. Install a quick & simple checkout.
9. Write rich and interesting descriptions
10. Make sure every page has complete Meta tags - Title and a compelling 'buy me' meta description.
More
https://moz.com/blog/how-to-craft-the-best-damn-ecommerce-page-on-the-web-whiteboard-friday
https://moz.com/blog/heres-how-to-create-a-product-page-that-converts
https://moz.com/learn/seo/on-page-factors
There are 100s of things to consider but these are the priorities. I am sure some of my colleagues will add many more.
Regards Nigel
Hi Jens
I don't know Drupal, but if it's like WordPress, it will add a noindex tag to the page.
Do it for one page then take a look at the code.
Go to the page: right click > View Source
Then go to the three dots top right in chrome and search noindex. It will look like this attached. (ignore the red line crossed out piece)
Best Regards Nigel
Hi Julie
If they are your blogs, which relate to your business, and you have set them with dofollow links back to your business website, then they will be useful. It's when people use Tumblr to set up PBNs (Private Blog Networks) that it becomes grey hat and ill-advised.
If the subject of the blogs, (URL & Title) have no relevance to your business then you may do more harm than good.
Also be careful with exact match anchor text - spread it out over different relevant keywords/phrases.
You determine whether a link is do follow or no follow.
Regards
Nigel
Hi John
Usually, when this happens the page is being redirected to itself. If you have a redirection plugin then check the URL and make sure it is being redirected properly.
If it's redirected to itself then it will be in an eternal loop and will not work.
Regards
Nigel
Hi India
It is unusual to keep both domains if you really want to move site but yes, you can do cross-site canonicals.
So place ON EVERY page of the old site a canonical to the corresponding page on the new site.
The old site will disappear from SERPS and the new site will appear.
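A sketch of the tag, with placeholder domains - note that each old page points at its own counterpart on the new site, not at the new homepage:

```html
<!-- In the <head> of https://www.old-site.com/some-page -->
<link rel="canonical" href="https://www.new-site.com/some-page" />
```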
Warning
The problem you will have is that the new site will not inherit any of the backlink equity you have built up on the old site. For that, you will need to do a page by page 301 redirect in htaccess on the old site.
I hope that helps
Regards Nigel
Hi John
If the pages are old and have had no useful visits, then it would make sense to forward them to more relevant content. This would be standard SEO practice anyway. However, unless you have 1000s of pages, crawling and indexing your site really isn't a problem. If you had a very large site with frequently updated pages, then it could be. You can change the crawl rate, but if it is 'calculated as optimal' then you needn't bother worrying. Read more here:
https://support.google.com/webmasters/answer/48620?hl=en
Regards
Nigel
Then there is no problem simply putting a self-referencing canonical. There is in effect no mobile version as there is a single URL so no need for a rel=alternate.
It's an even easier solution. Well, there isn't a problem in the first place.
rel=alternate is only necessary if you have two different URLs! The fact they are the same takes away the problem.
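For completeness, the rel=alternate/canonical pairing is only for setups with two different URLs, e.g. a separate m. site - a sketch with placeholder domains:

```html
<!-- On the desktop page https://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">

<!-- On the mobile page https://m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page">
```

With a single responsive URL, as in your case, none of this is needed.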
Regards
Nigel
You are right - you could only use the rel=alternate if there was an m. version or similar
Regards
Nigel
The reason we answered 'quickly' by the way is because we are in the UK - you were still in bed lol!
Hi seoanalytics
The whole area of primary and secondary keywords (or keyphrases) has largely been overtaken by Google's RankBrain AI system of determining themes and relevancy, so don't think in those terms. Also, please don't listen to anyone focusing too heavily on LSIs, as these have mostly been debunked as a load of flannel used by SEOs to impress, although leading SEO Brian Dean still insists that they have some effect. If you start researching LSIs you will end up in a rabbit hole from which you will never emerge! To advise 'use your primary keyword a couple of times' is nonsense.
Think of your website like this:
Each page represents a theme - for example, as Alick300 has used, 'laptop chargers'. You need to think about writing chunky, relevant content around the term 'laptop chargers', but you do not need to mention every 'secondary keyword' you can think of to pepper the text with. Write compelling copy using semantically connected words, like batteries, within the content.
By all means, research related terms like batteries, tablets and transformers (are they still a thing?), but if you think in terms of 'primary keywords' and 'secondary keywords' you will risk overusing certain terms, losing free-flowing writing and turning your intended customer into a brief visitor. Think of the use of synonyms as well (the bike/bicycle analogy is a good one).
Modern SEO is about writing semantically connected and lexically relevant content in such a way that you engage your website visitor and ultimately convert them into customers. The more you engage, the longer the 'dwell time' and the more you satisfy user intent.
Have a read of Brian's latest post on RankBrain - it's pretty brilliant and you should come away inspired!
https://backlinko.com/google-rankbrain-seo
There is no way of outsmarting Google by determining and hoofing in endless keywords and phrases. Modern content needs to be well written with perfect grammar and spelling and theme focused.
Also, break it down into easily readable chunks - this helps the user to stay engaged as opposed to being lost in long paragraphs. Pretty much like I have here.
Use Google search box to help and 'related search terms' at the bottom of the page to find semantically related phrases.
I hope this helps,
Regards
Nigel