AMP version of website
-
Hello & thanks for reading.
It's maybe the Monday morning blues, but I have two versions of a website: www.gardeners.scot and www.gardeners.scot/AMP/.
The pages on the AMP version have canonicals pointing to the "normal" website.
Should the links on the AMP pages (e.g. "www.example.com/AMP/") point to the AMP website or to the normal website?
What are your thoughts?
-
Here is the official suggested markup: https://support.google.com/webmasters/answer/6340290?hl=en
-
On any non-AMP page, reference the AMP version of the page to let Google and other platforms know about it:
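Per the Google documentation linked above, the markup on the canonical page looks like this (the URLs here are placeholders, not the real site's):

```html
<!-- In the <head> of the non-AMP page, e.g. https://www.example.com/page/ -->
<link rel="amphtml" href="https://www.example.com/AMP/page/">
```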
-
On the AMP page, add the following text to reference its non-AMP canonical version:
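And per the same documentation, the AMP page points back at the canonical (again, placeholder URLs):

```html
<!-- In the <head> of the AMP page, e.g. https://www.example.com/AMP/page/ -->
<link rel="canonical" href="https://www.example.com/page/">
```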
-
Thanks for the reply.
I too have seen it done both ways. I think I will leave it the way it is, i.e. pointing to the AMP files as opposed to the normal site.
-
I've honestly seen it done both ways; I don't believe there is actually any official guidance on what the best practice is. For example, the Guardian does not link to other /amp/ pages: https://amp.theguardian.com/politics/2017/feb/06/uk-tourists-face-mobile-phone-roaming-charges-post-brexit-paper-says
Related Questions
-
Inspection of the website is required.
Hello, we recently took over the project http://customerconnect-services.com/ and are responsible for the digital marketing for the website. We have been following on-page SEO best practices ever since we took on the project: performing keyword research, finalizing the keywords, and using them in the page titles, meta descriptions, heading tags and, of course, in the content, as per Moz suggestions and SEO standards. But we are unable to rank on the first page, which is a serious concern for us. We have also checked whether the domain/URL has been blacklisted, but it has not (not even by Google). We are therefore unable to figure out what is going wrong, even after following so many best practices to get the keywords a good ranking (1st & 2nd page of the SERPs). I would therefore like to request your expert opinions on what it is that we are not getting right. As this is a high-priority issue for us, and the client is a prestigious one, please help. Looking forward to hearing from you at the earliest. Thanks & regards,
Technical SEO | Harini.M
Duplicated content & URLs for e-commerce website
Hi, I have an e-commerce site where I sell greeting cards. Products sit under different categories (birthday, Christmas, etc.) with subcategories (for Mother, for Sister, etc.), and the same product can be under 3 to 6 subcategories, for example:
url: .../greeting-cards/Christmas/product1/for-mother
url: .../greeting-cards/Christmas/product1/for-sister
etc. On the CMS I have one description record per card (product1) with multiple subcategories attached, which naturally creates URLs for the subcategories. Moz (and surely Google) picks these URLs (and their content) up as duplicates.
Any ideas how to solve this problem?
Thank you very much!
Technical SEO | jurginga
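One common fix for this pattern is to pick a single preferred URL per product and point rel="canonical" at it from every subcategory variant. A sketch, assuming the URL structure from the question; the domain and the choice of the bare product URL as the preferred one are placeholders:

```html
<!-- On each variant, e.g. .../greeting-cards/Christmas/product1/for-mother
     and .../greeting-cards/Christmas/product1/for-sister -->
<link rel="canonical"
      href="https://www.example.com/greeting-cards/Christmas/product1/">
```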
Website crawl error
Hi all, when I try to crawl a website, I get the following error message: "java.lang.IllegalArgumentException: Illegal cookie name". So far, I have found the following explanation: the error indicates that one of the web servers within the same cookie domain as the server is setting a cookie for your domain with the name "path", as well as another cookie with the name "domain". Does anyone have experience with this problem, know what it means, and know how to solve it? Thanks in advance! Jens
Technical SEO | WeAreDigital_BE
Migrating an HTTP Site to the HTTPS Version
Hello, this coming weekend we will be changing our HTTP sites to their HTTPS versions. I have a very quick question regarding Google Search Console. Because the migration is happening over a weekend, we want to set up as much as possible beforehand. Is there any risk in adding the new properties to Search Console before the sites are live? I want to deliver the Search Console verification files to our IT team in advance for them to add to the site. Then, once I get the okay that the migration went through successfully, I would go into Search Console, click the Verify button to get the sites verified, and of course then Fetch as Google to help speed up indexing a bit and ensure there are no errors. Any insight on this would be greatly appreciated! Amiee
Technical SEO | Amiee
Sitemap & noindex inconsistency?
Hey Moz community! On the CMS in question, the sitemap and robots file are locked down and can't be edited or modified whatsoever. If I noindex a page (via the meta robots tag) but it is still in the XML sitemap, will it get indexed? Thoughts, comments, and experience greatly appreciated and welcome.
Technical SEO | paul-bold
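For what it's worth, an XML sitemap entry is treated as a hint, while a page-level noindex is a directive once the page is crawled, so the noindex generally wins even if the locked-down sitemap still lists the URL. A minimal sketch of the page-level tag, assuming the CMS at least allows edits to the page head:

```html
<!-- In the <head> of the page that should stay out of the index -->
<meta name="robots" content="noindex, follow">
```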
Development Website Duplicate Content Issue
Hi, we launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally built the website on a development domain (http://dev.rollerbannerscheap.co.uk) which was active for around 6-8 months before we migrated dev --> live (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again). In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue, as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file. Most of the pages from the dev site had been de-indexed from Google apart from three: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last three dev pages would disappear after a few weeks. I checked back in late February and the three dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site, and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found.
When I do find the dev site in Google, it displays this: "Roller Banners Cheap » admin - dev.rollerbannerscheap.co.uk/ - A description for this result is not available because of this site's robots.txt". This is really affecting our client's SEO plan, and we can't seem to remove the dev site or rank the live site in Google. Please, can anyone help?
Technical SEO | SO_UK
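A side note that may explain the symptom above: robots.txt only blocks crawling, not indexing, which is exactly why Google shows "a description for this result is not available because of this site's robots.txt" while still listing the URL. For the 301 redirects to be discovered and the dev URLs dropped, Google has to be allowed to crawl them. A sketch of the dev robots.txt with a placeholder domain:

```txt
# robots.txt on dev.example.com
#
# "Disallow: /" stops crawling, but URLs Google already knows about can
# remain in the index as URL-only listings. To let the 301 redirects
# (or a noindex header) be seen, crawling must be allowed:
User-agent: *
Disallow:
```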
Mysterious drop of website ranking in Google
Usually I don't want to bother anybody by posting silly questions on forums, but this time I really need advice. My wife and I took over the website maintenance and e-marketing of a local air conditioning company at the end of March this year. Before that, the applied SEO strategies were not very user-friendly and a little too search-engine focused (spammy keyword-stuffed articles, a confusing website structure, a lot of directory links). Yesterday night (May 15th) the website more or less stopped ranking. For search terms like "ac repair englewood fl" or "trane north port", and many more, the website was on page 1. Here are some more details: I replaced the old website with a newer version at the end of April. Since some of the old URL structure did not apply any longer, I set up around 30 301 redirects in .htaccess. The new site seemed to rank more or less as expected. The homepage has a PageRank of 1 (Moz Page Authority is 31); I am working on that, but good natural links just take some time. site:kobiecomplete.com still brings up all the pages. Google Webmaster Tools notified me on May 12th that there was a possible outage: "While crawling your site, we have noticed an increase in the number of transient soft 404 errors around 2012-05-08 16:00 UTC (London, Dublin, Edinburgh). Your site may have experienced outages. These issues may have been resolved. Here are some sample pages that resulted in soft 404 errors:" The listed pages under "some sample pages" are only pages from the old website which do not exist any longer and for which the 301 redirect was not set up. But this should have already been an issue before, if at all.
I added the missing 301 redirects and marked the errors as fixed in Google Webmaster Tools. I had a copy of the website on a testing webspace (the root directory of brightsidewg.com). Even though I had robots.txt set to disallow everything and WordPress search engine privacy set to not index/follow, that website appeared in the Google search results yesterday night instead of the original website (kobiecomplete.com). Even though brightsidewg.com ranked a few positions worse than kobiecomplete.com had, it was still ranking.
To remove the duplicate content, I deleted everything on brightsidewg.com and requested the removal of the website in Webmaster Tools. Now brightsidewg.com is no longer indexed (good), but it didn't help the ranking of kobiecomplete.com. The homepage and the service-area pages in particular were ranking pretty decently on Google before yesterday night; now I cannot find them at all. Only other, less important pages rank on page 8+. There is no malware on the website. I did not make any big changes on the website yesterday (only really minor ones). I did not acquire any weird/paid links, even though there is a new link from a PageRank 0 website which I did not set up: http://www.indo-karya.com/detail/news/2012/kombise. But that alone, I think, would not be enough for a penalty. It almost looks like Google applied a partial -950 filter!? I could submit the website for reconsideration to Google and tell them about the duplicate content issue with my testing webspace brightsidewg.com. What do you think about it, and what shall I do? Thank you so much for any help!
Technical SEO | grojoh
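Since the question mentions the ~30 301 redirects set up in .htaccess, this is the shape such rules usually take with Apache's mod_alias (the paths here are placeholders, not the site's real URLs):

```apache
# .htaccess: one permanent (301) redirect per retired URL
Redirect 301 /old-services.html http://kobiecomplete.com/services/
Redirect 301 /old-contact.html  http://kobiecomplete.com/contact/
```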
Optimizing a website which uses JavaScript and jQuery
Just a quick question (or two). I have divs which are hidden on my page but are displayed when either (a) a user clicks on a p tag and the hidden div is shown using jQuery, or (b) a user clicks on an a tag and the hidden div is shown using jQuery, with the href being cancelled in both cases. Will the hidden content be optimized, or will the fact that it is initially hidden make it harder to optimize? Thanks for any answers!
Technical SEO | PhatJP
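A minimal sketch of the pattern being described: the div is present in the HTML source (so crawlers can read it) but hidden until the click, and preventDefault() cancels the href on the a tag. The class names are made up for illustration:

```html
<a href="#" class="toggle">Show details</a>
<div class="details" style="display:none;">
  This content is in the HTML source, so search engines can crawl it,
  even though it is hidden until the user clicks.
</div>

<script>
  // Reveal the hidden div on click; preventDefault() cancels the
  // default link behaviour (the "href being cancelled" in the question).
  $('.toggle').on('click', function (e) {
    e.preventDefault();
    $('.details').toggle();
  });
</script>
```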