Did this work for you, Rajiv?
Event Tracking is definitely what you're looking for. You can learn more about it here:
https://developers.google.com/analytics/devguides/collection/gajs/eventTrackerGuide
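If it helps, here's a minimal sketch of what that looks like with the classic ga.js asynchronous snippet that guide covers (the category, action, and label values below are just placeholders for your own naming scheme):

<!-- Records an event when a visitor clicks the download link -->
<a href="/brochure.pdf" onclick="_gaq.push(['_trackEvent', 'Downloads', 'Click', 'Brochure PDF']);">Download our brochure</a>

Events will then appear in the Event Tracking reports in Google Analytics, grouped by the category and action you pass in.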
I love public Q&A because everyone gets to chip in, but nobody wants to share the domain in question (which is understandable), and that makes answering the question much more difficult.
Can you hide the actual domain name but provide some examples of URLs? For instance:
ourdomain.com/honolulu/four-seasons?rooms=4&view=0&page=1
Did you try any of Dr. Pete's suggestions? If not, I would implement one of those first, as they are still as relevant today as they were when he wrote them. Rel next/prev has received a bit more attention since then, but it only solves part of the problem if you're dealing with parameters beyond simple pagination (e.g. rooms, views, etc.).
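For the pagination piece specifically, rel next/prev is just a pair of link tags in the <head> of each paginated URL; a sketch with hypothetical URLs in the same pattern as the example above:

<!-- On ourdomain.com/honolulu/four-seasons?page=2 -->
<link rel="prev" href="http://ourdomain.com/honolulu/four-seasons?page=1" />
<link rel="next" href="http://ourdomain.com/honolulu/four-seasons?page=3" />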
From the information provided above I would probably go with a rel canonical tag to fix this issue.
I would not rely on a rel nofollow attribute on links pointing to variants, as was suggested by Smarties, because Google is going to find those URLs regardless, and a nofollow attribute on a link doesn't tell them not to index it.
Smarties' #2 suggestion sounds good, but I'd allow the pages to be followed, i.e. a robots meta noindex,follow as opposed to noindex,nofollow. This allows PageRank from external links to flow through non-indexable URLs.
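To make those two options concrete on one of the hypothetical variant URLs (you'd use one or the other in the <head>, not both):

<!-- Option 1: point the parameter variant at the main page with rel canonical -->
<!-- On ourdomain.com/honolulu/four-seasons?rooms=4&view=0&page=1 -->
<link rel="canonical" href="http://ourdomain.com/honolulu/four-seasons" />

<!-- Option 2: keep the variant out of the index but let PageRank flow through its links -->
<meta name="robots" content="noindex,follow" />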
Good luck!
Hello Rebecca,
Screaming Frog is capable of a lot when set up properly. Mike King has a great post about running it on Amazon Web Services (AWS), which could cover your request for a cloud-based solution. Have you checked out Seer's guide to doing almost anything with Screaming Frog? Do you have "Check External Links" checked? "Always Follow Redirects" should also be checked, and you can set "Max Redirects to Follow" to whatever you like.
If that doesn't work, have you tried Deep Crawl?
Hello Micey123,
Unfortunately, as I mentioned, there is no easy or cheap way to do this. Even with log files, most of the time the keyword data is not going to be available in the referral string.
Jumpshot would be your best bet, but it's not going to be cheap.
Hello Gagan,
I think the best way to handle this would be using the rel canonical tag or rewriting the URLs to get rid of the parameters and replace them with something more user-friendly.
The rel canonical tag would be the easier of those two. I notice the version without the encoding (e.g. http://www.mycarhelpline.com/index.php?option=com_latestnews&view=list&Itemid=10 ) has a rel canonical tag that correctly references itself as the canonical version. However, the encoded version with the extra backslash in the path (e.g. http://www.mycarhelpline.com/\"/index.php?option=com_latestnews&view=list&Itemid=10 ) does NOT have a rel canonical tag.
If the version with the backslash had a rel canonical tag stating that the following URL is the canonical one, it would solve your issue, I think.
Canonical URL:
http://www.mycarhelpline.com/index.php?option=com_latestnews&view=list&Itemid=10
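In other words, adding something like this to the <head> of the backslash version should do it (note the ampersands are escaped as &amp; inside the HTML attribute):

<link rel="canonical" href="http://www.mycarhelpline.com/index.php?option=com_latestnews&amp;view=list&amp;Itemid=10" />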
Hello Donford,
The easiest thing to do, if you are really only concerned with providing a link to more information for the distributor's visitors, is to nofollow that link. This way you won't risk getting "dinged" for adding 10,000 links all at once.
If you have a few product pages on the manufacturing site that you'd like to improve, you can selectively remove the nofollow attribute from the links pointing to those pages from the distributor side, but I wouldn't recommend 10,000 followable links from a single domain all at once.
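On the distributor's side, those nofollowed links would look something like this (the URL and anchor text here are just hypothetical placeholders):

<a href="http://www.manufacturer-example.com/products/widget-123" rel="nofollow">View full product specifications</a>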
Questions like this are always difficult to answer because Google doesn't treat every site the same. A well-known brand with thousands of high quality links and a trusted, established site that has been in operation for a long time with good user metrics, and has never been penalized can get away with a LOT more than a relatively unknown brand or an "average" site. That is why I recommend starting with the nofollowed links and, if you want, testing the waters by allowing a few of them to be followed.
Please let me know if you feel your question has still not been adequately answered.
Cheers,
Everett
Hello Edlondon,
I think you're probably answering your own question here. Google typically doesn't have any problem indexing images served from a CDN. However, I've seen Google have problems with commas in the URL at times. Typically it happens when other elements in the URL are also troublesome, such as your double file extension.
Are you able to rename the files to get rid of the superfluous .jpg extension? If so, I'd recommend trying it out on a few dozen images. We could come up with a lot of hypotheses, but that would be the one I'd test first.