Posts made by Sarbs
-
RE: How do I add subdomain tracking to an existing Google analytics account that was set up to track website only (without the subdomain option)
Just realised that link was the same one ChadC posted; that'll teach me to pay attention rather than crawling through all my saved bookmarks!
-
RE: How do I add subdomain tracking to an existing Google analytics account that was set up to track website only (without the subdomain option)
If you are using classic analytics you will need to make a small amendment to the code. You can find this information by going to:
Admin > Tracking Info (middle column)
You will see a little on/off switch next to the subdomains option. If you turn it on, you will notice a single additional line is added to the tracking code:
_gaq.push(['_setDomainName', 'example.com']);
If you add this line to the tracking code on your site you will be able to track subdomains in Analytics; the full snippet ends up looking something like the sketch below.
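For reference, here is the classic (ga.js) snippet with that line in place. 'UA-XXXXXX-X' and 'example.com' are placeholders for your own web property ID and root domain:
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-X']);    // your web property ID (placeholder)
  _gaq.push(['_setDomainName', 'example.com']); // the extra line that enables subdomain tracking
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>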
If you then want to see the subdomains in the reports you may also want to add a new profile, complete with a special filter which will display the entire URL of a page, not just the section after the root domain (Admin > Filters (3rd column)). There is a guide on how to do this but I can't find it right now; if anyone knows where that guide is please pop a link, as I would like to credit it to the correct person. It's incredibly useful.
Note: Make sure you add the filter to a new profile rather than your main one, so you always keep an unfiltered set of data to fall back on.
I found the code in an article here; scroll a little down to see the screenshot of how the filter should look.
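From memory (so do double-check it against that screenshot), the filter is a Custom > Advanced filter set up roughly as follows:
Field A -> Extract A: Hostname (.*)
Field B -> Extract B: Request URI (.*)
Output To -> Constructor: Request URI: $A1$B1
with 'Field A Required' and 'Override Output Field' both set to Yes.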
Hope this helps.
Thanks, Tom.
-
RE: Partially duplicated content on separate pages
You should be OK; Google is fine with this type of boilerplate text, and the fact you are also adding unique content to the pages is even better, as you are not replicating the entire page. There has been a recent video by Matt Cutts on the issue, which you can find a write-up of on Search Engine Watch.
Hope you find that useful.
-
RE: Duplicate content when changing a site's URL due to algorithm penalty
You should be OK just to replicate it, but by all means use the opportunity to refresh the content; 6 pages shouldn't take too long. If you want to be extra safe then you can of course just rewrite from scratch. The penalty will be at the domain level, so you should be fine to redirect the existing pages to the new URLs. This will signal to search engines that the pages have been moved and that the redirected pages should not be counted as unique content, avoiding duplicate content issues. You can also use a cross-domain canonical tag.
If you don't want to do any redirects, in order to totally sever your ties to the old domain's link profile, then remove the original pages from Google's index via your Webmaster Tools account and ensure you return a 410 status code to anyone requesting the pages. If you do still want users to be redirected, however, 302 the pages to the new location, as this won't pass link equity.
Hope this proves useful.
-
RE: Mozbar and Firefox: Feedback please!
If I had specific investigations to carry out I would jump on Chrome and likely use it, as I do now with other tools, yes.
-
RE: Mozbar and Firefox: Feedback please!
Firefox is my primary browser and it would be a real shame to lose this functionality; I find it very useful for spotting things during normal browsing and work. I understand why you want to do this, but from my selfish point of view it's a shame.
-
RE: This might be a silly question...
An easy way to be sure is to do a quick search on Google to see if they are ranking. If you know for sure the parameters make no difference, it's usually better to specifically signal that through the WMT console. While Google tends to be pretty smart at these kinds of things, it can always make mistakes, so you may as well give it as much information as possible.
-
RE: This might be a silly question...
Have you discounted URL parameters through Google Webmaster Tools? This would be particularly relevant for an ecommerce site as, if you have not, Google could be looking at /page, /page?p=x, /page?p=y etc. and counting these as unique pages. This creates obvious duplicate content issues and is easily fixed in WMT by going to:
Crawl>URL Parameters
Hope that helps.
-
RE: 301's, Mixed-Case URLs, and Site Migration Disaster
I would recommend trying to get things right and going to the lower-case format. As you mention, keeping to a simple convention will save you headaches further down the line.
You can reduce the impact of the redirects by amending your previous ones to point to the new directories; this will prevent a redirect chain and you will lose less link equity. You should then, of course, also set up redirects for the new URLs.
Depending on the types (and volume) of inbound links your site has, it may also be worth a little outreach asking other sites to update their links to the new final-destination format.
Good luck; I know what a pain it is, having done many site migrations myself.
-
RE: How to flag inbound affiliate links to Search Engines
Hi Bradley and thanks for the feedback.
I'm reluctant to use the intermediary page / redirect combo as our publishers also append URL parameters to their landing pages for use in internal tracking and commission payment. As we have a large number of these affiliates, having to set up a series of redirect cases (which I know we could do by identifying their source and original URL values) could be more effort than it's worth, particularly with ongoing maintenance. As such I'm currently inclined to go with the nofollow approach.
It seems from the silence in the Moz community that there is no such URL parameter either, so I'm assuming our VP was alluding to the two methods discussed, which is what I was hoping. Going to leave this question open a little longer, however, just in case anyone else has any further insights!
Thanks, Tom.
-
How to flag inbound affiliate links to Search Engines
Afternoon Mozzers,
I was chatting to our VP of marketing earlier about one of our sites with a somewhat unhealthy-looking link profile. This is primarily caused by the sector it's in and the fact that lots of low-quality lead generators / affiliates operate there, sending traffic to us in return for payment on accepted leads.
Now, he recalled that there is a way to mark these inbound links (through a URL parameter, he thought) as being affiliate links so that they are ignored, as we don't want them misinterpreted as an attempt to manipulate our rankings.
I have been doing a spot of research but can't find a straight answer. Here are a few articles I looked at: blogaid, Yoast, Webmaster World and the Moz Q&A. The problem is that all of these are from the perspective of a site linking out and acting as an affiliate, so they all deal with that page not losing PR. The two methods I have derived from them boil down to:
- Nofollow the links
- Use an intermediary redirect page and script which you can block via robots.txt
So, back to our case: are there any ways we can signal to search engines, as the destination site, that these links are from affiliates (such as this URL parameter our VP had vague recollections of), or should I just get in touch and ask the sites to make them nofollow?
Thanks, Tom.
-
RE: External 'Source' link in PPC Ad copy
Great, thanks Tom; clearly a development I missed!
-
External 'Source' link in PPC Ad copy
Afternoon Mozzers,
Noticed something today on Google.co.uk that we have never seen / noticed before on PPC adverts. This example is from a Google UK search for 'Wonga payday'. Within the ad copy there is a link to a BBC news article about the company.
Has anyone else seen these on any PPC ads? I am assuming this is not an intentional feature from Wonga to link to an external site from within their own PPC advert? I assume they would not get charged for these clicks.
If anyone has any further info or insights on this feature it would be appreciated.
-
RE: Hour of the day that my analytics goals are being triggered within the all traffic report.
My bad, I missed a step. In the Goals Overview, first change to the Source/Medium report, then view the full report. Screenshot attached.
Thanks, Tom.
-
RE: Website was given to someone else, does a "move" or something need to be performed in Webmaster Tools?
They will need to take control of the domain themselves. If you have removed all the content and have no involvement in the hosting / domain registrar anymore, you need not worry. The responsibility is all on the new owner to validate their ownership with services like WMT.
-
RE: PPC seems to have had a seriously negative impact on organic rankings?!?
While bounce rate and page engagement would play a role, I'd be surprised if this was solely down to the PPC campaign, particularly if it's low level as you mention. What's the URL of the problem page? We may be able to cast more light on the matter / offer alternative areas to investigate.
-
RE: Hour of the day that my analytics goals are being triggered within the all traffic report.
Hour isn't one of the options for the default All Traffic report. You can easily create a custom report if you like, but it may just be easiest to look at the Goals Overview, click 'view full report' on the table, then change the dimension to keyword. The result: hourly conversion stats by keyword.
Hope that's clear enough; if not, let me know and I'll throw in some screenshots / links.
-
RE: Can you duplicate on site Blog Posts to Google Plus
I believe you are referring to Google Authorship or Google Publisher, which allows you to associate your content with your Google+ profile. To implement it you just need to add a small line of code, or you can even just use the rich snippet highlighter in Webmaster Tools.
Once you have added this you need to confirm in Google+ that the article was produced by you and that it appears on your timeline.
Hope that's helpful.
-
RE: E-commerce store, in need of protecting our own content
You have several options. While you can never stop someone coming to your site and actively taking your content, you can attempt to trip them up, particularly if they are using automated tools like scrapers. There are a few articles out there (like this) that go into detail, but common recommendations include things like adding links to your text and images that point to other pages on your site; often the sites stealing the content will then inadvertently include links back to you in their pages. To avoid issues with low-quality links from these sources you should probably make these nofollow to be safe. Then there is authorship etc., although that's not quite right for product descriptions, though you could investigate the feasibility of it.
Other than that there is enforcing your copyright, but to do so you need to locate the stolen content. Again there are multiple tools out there, such as the Copyscape one that Remus mentioned, but a quick and easy option would be to set up Google Alerts to look for that content. Then you can contact the webmasters and use DMCA takedown requests etc. if necessary.
But if you are looking for methods to physically stop people taking your content, I'm not aware of a foolproof one, I'm afraid.
Hope this is helpful.
-
RE: Images added to website automatically become URLs - is this an issue?
When you say they are URLs, do you know what their content type is? And which crawling tool did you use? Images need to be hosted somewhere so they will be called through the HTML etc., but they normally end in .png, .jpg or whatever is applicable (again, this varies). Screaming Frog is my personal favourite for crawling and it will tell you the content type these images are being identified as.
-
RE: I want your opinion?
A second one you may want to check out (I think it's free, but I forget) is Concept Feedback.
-
RE: I want your opinion?
1.
Same reason as Chris, although I think he put the wrong number down from your list. I also like the 'need help' telephone number, as to me it signals you're a legitimate company with contact details and acts as a clear trust signal as well as a user resource. I prefer the URL structure of 2, but that's only a personal preference of not liking the question format; you actually have more keyword targeting in the one you're using.
Title-wise, I think '5 things you need to know' works better than 'everything you need to know'. People like short, concise lists; there's more chance I'm going to read it.
-
RE: Need help please with website ranking problem!
Hi,
There are many reasons why these sites may be ranked above you. Without doing an audit of your site and your competitors I can't advise exactly what it may be, but this Whiteboard Friday by Rand may be of use to you. There's also this supplementary article, although it focuses on going beyond Moz metrics.
The ones with no links may be new and enjoying their 'holiday' period; you can check things like that using the WhoIs directory, among other tools, to see when the domain went live.
I wouldn't expect it's solely URL structure; although generally the further down the URL your keywords appear the less weight they carry, this is only a single and relatively small factor. Having no social presence may not be a deal breaker either; it's certainly an important up-and-coming factor, but it could be compensated for by other things. Take a nosey at the latest poll on 2013 ranking factor weightings to see what the community sees as important at the moment.
Sorry I have not given you a definitive answer, but hopefully I've given you a few more things to consider and look into.
Thanks, Tom.
-
RE: Google Analytics complexe solution?
I am slightly confused by your question. Do you mean that when you view that report ("Traffic Sources" >> "AdWords" >> "Campaigns") you see no visitors that have reached the final conversion stage? Given that there are only 4 visits, could it not simply be that no traffic from this channel has actually converted?
-
RE: Subdomains or subfolders for language specific sites?
With respect to Google and other search engines, they will determine the language based on the content. So, other than having unique URLs for the different content, normal site structure considerations come into play. I would be inclined to use a subfolder, although Google's webmaster guidelines for multilingual sites show no preference.
Here's the relevant bit:
Google uses the content of the page to determine its language, but the URL itself provides human users with useful clues about the page's content. For example, the following .ca URLs use fr as a subdomain or subdirectory to clearly indicate French content: http://example.ca/fr/vélo-de-montagne.html and http://fr.example.ca/vélo-de-montagne.html. Signaling the language in the URL may also help you to discover issues with multilingual content on your site.
Hope this helps.
-
RE: Duplicate Titles Shown in Moz Analytics
Depends how integral that plugin is to your site. I personally would go straight for the canonicals, but if you can afford to turn it off and experiment then why not. Experimentation is the only way we really find things out in this industry.
-
RE: Duplicate Titles Shown in Moz Analytics
No worries. A canonical is not the same as a 301, however. You should be able to keep the translated pages and allow users to view them; a canonical will signal only to search engines that these pages are the same. A 301 would redirect users to one version of the page too, which removes the value of the translated content for your users.
Here's another Q&A thread I was involved in yesterday focusing on canonical tags that should make this clearer.
Please do keep me posted, always interested in the outcomes.
Thanks, Tom.
-
RE: The curse of (not provided) data....
One potential way around this is to capture the search term from the URL parameters of the referring URL when a user enters your site from a search engine. But then you need to store that and somehow pass it back to yourselves, and you will not be able to tie it into your analytics data as far as I am aware.
Additionally, I'm not sure off the top of my head whether the referral URL is sent by Firefox, or by people in a private session / logged into Google, so if that is the case then this method won't actually help you...
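To illustrate the idea, a rough client-side sketch might look something like this (the parameter names are just common examples, and as above it won't help for visits where the referrer or query has been stripped):
// Rough sketch: grab the search term from the referring URL, if one was sent.
(function () {
  var ref = document.referrer;
  if (!ref) return; // no referrer at all (direct visits, secure search, etc.)

  // 'q' covers Google/Bing-style URLs; Yahoo uses 'p' instead.
  var match = /[?&](?:q|p)=([^&]*)/.exec(ref);
  if (match) {
    var searchTerm = decodeURIComponent(match[1].replace(/\+/g, ' '));
    // Store it however suits you; a cookie here, so it can be read back later.
    document.cookie = 'search_term=' + encodeURIComponent(searchTerm) + '; path=/';
  }
})();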
-
RE: What is the best way to use canonical tag
Hi Tim,
Yes, a canonical tag is useful for resolving duplicate content. It acts in a similar fashion to a 301 from the perspective of search engines and the passing of link juice, yet allows users to view the original page. So its use depends on the context of your duplicate content issues.
For example, say you have a common issue: multiple versions of the home page under /, /index.aspx and /home. These are all versions of the same page; no content has changed, and what the user sees is not reliant on anything in the URL. In this case a 301 back to the root version you want is your best bet. It will also help prevent people building links to the various versions, as when they take the address from the URL bar it will already be the version you have selected.
Now say you're a retail site selling frogs (don't ask, I have a fondness for using frogs in my examples) and you have a product listing page of all the frogs you sell. This could stretch over multiple pages and be paginated, for example /frogs, /frogs?page=1, /frogs?page=2 etc. In this case you don't really want all these versions of what is effectively the same page ranking, particularly as the content won't change much and can be seen as duplicate. Additionally, you don't want any link equity being split between all the paginated versions; however, you DO want the user to be able to view these pages. In this case the use of a canonical can be perfect (or rel next/prev, but I'll ignore that for now).
Now it gets a little more complicated and we begin to get to the areas where you can hurt yourself from an SEO perspective. Say your customer can sort by colour of frog as well; this adds another parameter to your URL and more duplicate content, e.g. /frogs, /frogs?colour=red, /frogs?colour=blue. Here you can do the same as above and canonical back.
This is where the potential danger lies: the URLs you have canonicalled back will not be ranked in search engines. Now say red frogs are a massive seller and really popular with customers; you may want that page to rank, and the canonical tag will prevent it. This is the kind of situation that can cause you a mischief. Have a read of Dr Pete's 'Which page is canonical?' below for a more detailed explanation.
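For illustration, on the frog example each paginated or colour-filtered URL would carry a tag in its head pointing back at the main listing page, something like this (with example.com standing in for your own domain):
<link rel="canonical" href="http://www.example.com/frogs" />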
Hope this helps you out.
Thanks, Tom.
Here are a few extra resources you may or may not have already discovered:
-
RE: Duplicate Titles Shown in Moz Analytics
Hi Eugenio,
Without seeing the Moz reports I'm not totally sure what you are seeing there, but if the titles are as you described ('page | Brand') they pose no problem.
I took a quick look in Screaming Frog, however, and think I may know what Moz is reporting, as I can see a lot of duplicate titles being caused by the fact you have two versions of each page:
- /en/tag/{Page}
- /en/{Page}
And another few where there is a /{page} and an /en/{page}. I thought you may have solved it with canonical tags, as I can see them in the reports too, but they just seem to be referring the same pages back to themselves (not an issue in itself).
I would suggest this may be the root cause of what you are seeing and that you should take a look at these. Here are a few resources on duplicate content; I'm not sure how familiar you are with it, so apologies if I am telling you stuff you already know.
- Which page is canonical (Moz)
- Dupe Content (Webmaster help)
- Common SEO problems and how to fix (Moz, lot of focus on dupe content)
Thanks, Tom.
-
RE: Adwords enhanced campaigns - Specify alternate destination URLs
Thanks Branden, this is what I suspected then: there is no way to set alternate URLs within the same ad group / ad based on device.
I completely agree responsive design is the way to go; we have already rebuilt a client's site like this for them and plan to do more. The mobile version of this site is not even on a subdomain or subdirectory, it's on a totally different domain, which is less than ideal. They have recently hired a new head of online, however, who is working closely with us to turn this around, so hopefully this shall improve soon!
Thanks for the response.
-
Adwords enhanced campaigns - Specify alternate destination URLs
Morning Mozzers,
I am setting up a new campaign for a client. They would like to target mobile devices, which evidently I can no longer do with enhanced campaigns; however, I have increased the mobile bid as much as possible and set the default CPC low to try and minimise appearances in desktop search. The client has a desktop and a mobile version of the destination page, but the site will not direct users to the correct page based on their device as they are two separate domains.
As such, I want to know if I can specify an alternate destination URL for an ad in AdWords based on whether the click comes from a mobile or desktop device? The other option is to set all my ads within the ad group to have mobile as the device preference and just use the mobile landing page, but I'm not sure if there is a neater solution here?
Thanks, Tom.
-
RE: Redirecting a questionable domain to a trusted domain
If you redirect the domain then the link equity of those pages will be passed to the new domain.
The level of risk would be dependent not only on the composition of the link profile of the redirected brand site, but also on the link profile of the site you are redirecting to. So if it already has a large and healthy profile, it will be more resilient than if it were smaller and underdeveloped, as the redirected links would form a statistically less significant proportion of the total link profile.
I know that's not really a yes-or-no answer, but hopefully it's useful in making your decision.
-
RE: Does "?" in my URL have a negative effect?
The question mark signals the beginning of the URL parameters. This will not impact your keyword; however, if the URL parameters do not significantly alter the content of your page then this could cause duplicate content issues, as search engines will see each parameter variation as a unique URL.
Search engines will try to identify these, but to help them out you should discount the URL parameters in Webmaster Tools for each respective search engine (not just Google). To avoid link equity being lost or diluted, adding a rel=canonical tag to the page referencing back to a sensible root version will mean all your inbound links work together.
Hope this helps.
-
RE: Potential problems with multiple users of an adwords MCC
Thanks for the feedback David, I'll let them know that if we do encounter issues we can set up alternate accounts.
-
RE: Confused of Choosing www or non-www domain
Hi Anand, yes you will need to resolve the duplicate URL issues. As an extra resource that you may find useful, here's a Q&A article from Matt Cutts' blog. One of the more interesting responses for your case is that you can just set your web server to have the non-www version as the default; this helps tell (all) search engines what your preferred format will be and redirects users.
-
Potential problems with multiple users of an adwords MCC
Morning all,
I have had a query regarding granting access to an MCC account for people using various platforms. Now, this is a pretty specific query for an area outside my expertise, and conveniently both our PPC guys are on holiday, so I thought I would reach out to the community.
The full request is this:
"On the mcc I am slightly concerned about that, from experience I have run into problems especially with granting access for usage of API’s on an account, so if for example we wanted to use marin they would need access then another wanted to use adobe they would need another and if they are all sat under the SSM teams mcc that could be problematic or we would have to set up another."
Any help would be appreciated.
Thanks, Tom.
-
RE: Are we penalized?
The first thing to check is your messages in Webmaster Tools. If it's a penalty, Google will communicate that to you there and let you know the type of penalty. Then you can look into fixing it.
If it's an algorithm-related drop then, as has already been mentioned, we would need a little more info to do some digging.
-
RE: Can you disavow a spamy link that is not pointing to your website?
Short answer: no.
To use the disavow tool you need to be logged into Webmaster Tools, and you need to use the tool under the profile of the relevant site. As such, Google will know that any links you are trying to disavow are associated with, and only authorised for, the site you have signed in under.
-
RE: How can I track traffic source for each user?
Data in Analytics is depersonalised, so you will not be able to associate behaviour with an individual user, unless, as Michael says, site usage is low enough for you to be able to spot the single user on your contact pages etc. at the time of the contact form submission.
-
RE: Removing inbound links
Is this in response to a notice from Google or general housecleaning? You also need to assess whether those links are actually causing you any harm in the first place; for example, are they nofollow?
The first step in removing links is always to try to contact the webmaster and ask them to remove the links. This is important, as it will then have an impact on all search engines, not just Google, which would be the case if you went straight for the disavow tool. It also provides you with an audit trail if in future you need to demonstrate to search engines, say through a reconsideration request, that you have made an effort to clean up your site and comply with webmaster guidelines. To find contact details for a website you can try the contact details on the site itself or look them up with tools such as the WhoIs directory.
If contact fails, or if they demand money for removal, then you should consider using Google's disavow tool. Again, please note this will only impact Google's algorithm, and it should really only be a last resort. You can access it through Webmaster Tools; just search for 'disavow tool' in the help section at the top right and follow the links.
-
RE: Page not cached
OK, well in that case it just may not be cached yet. I know that being indexed and being cached do not necessarily happen concurrently, and the cache should catch up in a few weeks.
I'm trying to find some resources to back this up now.
-
RE: Page not cached
How long has the delay been between you removing those tags and resubmitting it? And if you do a search in Google for the exact URL does it return the page?
-
RE: Search Term in Contact Email
If you are trying to capture the search term used by a customer that led to the site visit, you can do this through the referral URL when they enter the site. As you can see where the visitor came from, you can pull out the appropriate URL parameter containing the search term, save it in a cookie while they navigate about the site, then merge it into the email template.
We use this method to attribute search terms to an application, but through XML forms. I'm afraid I do not know the exact technical implementation details, but I do know that the above is the process we use, and I also know the parameters you want to look at for certain search engines (there's a rough sketch of the idea below the list). All of the below have the search term in the 'q' parameter:
GOOGLE, BING, ASK, AOL, HOTBOT, GIGABLAST, LYCOS, SEARCH-RESULTS, CONDUIT, SEARCH, AVG (powered by Google), BABYLON, MAMMA, YANDEX, DAUM (Korean), SEZNAM (Czech)
YAHOO: p = query
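As a rough illustration of that process (the cookie name and the hidden field are made up for the example, and bear in mind secure searches won't send the term at all):
// Sketch: read the search term from the referrer on landing, keep it in a cookie,
// then drop it into a hidden field on the contact form so it gets emailed with it.
(function () {
  var ref = document.referrer;
  if (ref) {
    // Yahoo puts the term in 'p'; the other engines listed above use 'q'.
    var param = /(^|\.)yahoo\./.test(ref) ? 'p' : 'q';
    var match = new RegExp('[?&]' + param + '=([^&]*)').exec(ref);
    if (match) {
      var term = decodeURIComponent(match[1].replace(/\+/g, ' '));
      document.cookie = 'search_term=' + encodeURIComponent(term) + '; path=/';
    }
  }

  // Later, on the contact page, copy the cookie into a hidden input (hypothetical
  // id 'search-term') so it is submitted along with the rest of the form.
  var cookieMatch = /(?:^|; )search_term=([^;]*)/.exec(document.cookie);
  var field = document.getElementById('search-term');
  if (cookieMatch && field) {
    field.value = decodeURIComponent(cookieMatch[1]);
  }
})();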
Hope this helps you out.
Here's a list of all Google URL values if you fancy trying to get any other insights.
-
RE: See which Google TLD organic traffic is coming from in GA
AndieF has certainly given you the easiest method there, and the one I would recommend myself. However, if you do want the actual URL then you could use advanced filters.
This resource by Santa J Achille shows some pretty cool tricks you can use to get not only the top-level domain but also the keyword and campaign to show in Analytics. I've used it before and found some useful insights. Just a note: in the copy-and-paste code section there is a missing \ before the first question mark in the second referral field, so add it in or Analytics will tell you it's an invalid regular expression.
If you like the look of this, or fancy making a few of your own / editing it, there are also some very useful guides on how to use regular expressions from Moz, and my personal favourite from Luna Metrics.
One more thing to note: if you do decide to implement this filter, make sure you do it under a new profile.
-
RE: Is Google indexing something I can't see on my page title?
Which particular URL are you looking at?
I have just crawled the site and the meta titles being reported are the same as I see in the HTML (for the pages I looked at; the crawl is 6.5k pages already and only halfway through). When I searched for your URL in Google I got the home page back, as you would expect, and saw Hull mentioned in the title, but it was in your meta title too.
-
RE: 301 vs 302
All redirects should be 301s in order to pass the link benefits of the legacy pages. You will still lose some link juice through a 301, but a 302 will not pass any benefit. Here's another thread on this topic.
-
RE: Second rebranding, what's the best approach?
Option 2: redirect chains should always be avoided, and you should go straight to the final destination URL.
-
RE: A case of negative SEO?
Agree with Wesley: if there are no associated links you're probably OK, although from a brand-image point of view you may wish to contact the webmaster and request the post is removed. You can normally find contact details for genuine sites with WhoIs.
Within the SEO community there is a continuing debate as to whether negative SEO actually exists. With this one mention you're almost certainly fine, but I personally am of the opinion that it is very much a reality of the SEO world, having experienced it first-hand on multiple occasions now. But in those cases we are talking about many thousands of links built in just a few days, so again I return to my point that if this is isolated you are fine, and even then the impact of any negative SEO attack will be dependent on the size of your link profile.
If you are interested to know a little more about the debate, a few useful articles to start on can be found here (from Search Engine Roundtable) and here (a Whiteboard Friday by Rand). Another good example to look up is when Rand issued a challenge to damage his blog with negative SEO.
Anyway, I've gone off topic a little; I'm sure you'll be fine.
-
RE: Should i disavow all backlinks?
This particular update was targeted at Google.co.uk, so I'm not sure if it would have a particular impact on you. As it happens, this is an area we are watching very closely as we work with some (legitimate) UK payday loan companies; it's a very interesting period there right now.