I agree fully with everything you say on this. The difference here, I feel, is that the client and their previous SEO agency were the ones putting their own content on tons of websites, which makes it even worse. Hopefully a change of content will see the site back where it belongs.
Posts made by GrumpyCarl
-
RE: Self inflicted duplicate content penalty?
-
Self inflicted duplicate content penalty?
Wondering if I could pick the brains of fellow Mozzers. I've been working with a client for about 3 months now to get their site up in the engines. In those three months the DA has gone from about 11 to 34 and the PA is 40 (up from about 15), so that's all good. However, we don't seem to be moving up the rankings much. The average DA of the top-ten competitors in the niche is 25, and we have 9.2 times the average number of backlinks too.
During a call with the client today they told me that they noticed a major drop in their rankings a few months back. They didn't mention this when we started the project.
I just searched for the first paragraph of their homepage and it returns 16,000 hits in Google. The second returns 9,600 and the third 1,400. Searching for the first paragraph of their 'about us' page gives me 13,000 results!!
Clearly something is not right here. Looking into this, it seems that someone has used their content, word for word, as the description on thousands of blogs and social sites.
I am thinking that this, tied in with the slow movement in the listings, has caused a duplicate content penalty in the search engines. The client hasn't copied anyone's content, as it is very specific to their site, but it seems to be all over the web.
I have advised them to change their site content ASAP and hope we get a Panda refresh to pick up the new unique content. Once the penalty is off I expect the site to shoot up the rankings.
From an SEO company point of view, should I have seen this before? Maybe. If they had said they suffered a major drop in rankings a few months back - when they dropped their SEO agency - I would have looked into it. But one doesn't naturally assume that a client's copy will be posted all over the web; it is not something I would have searched for without a reason to.
Any thoughts on this, whether yes or no to my theory, would be most welcome please.
Thanks
Carl
-
RE: Big rise in "Keyword not defined"
I would agree, in part. However, even if you don't know which keyword is sending you traffic, if anything this makes ranking reports more important. If we see traffic going up but cannot directly see which keyword is sending it, then we can at least draw a link (however tenuous) between the rise in rankings and the rise in traffic.
-
RE: Big rise in "Keyword not defined"
Scary how the 100% date in the chart has become this December. It was scary enough when it was 2017!!!
-
Big rise in "Keyword not defined"
Hi, all.
Has anyone else seen a massive increase in Not Provided keywords in their analytics in the past couple of weeks? Probably related to this (source: http://searchengineland.com/post-prism-google-secure-searches-172487): _In the past month, Google quietly made a change aimed at encrypting all search activity — except for clicks on ads. Google says this has been done to provide "extra protection" for searchers, and the company may be aiming to block NSA spying activity._
Other than the unreliable stats from WMT, there don't seem to be many ways in which we can now find out what is sending traffic to our sites!
-
RE: Ok to ignore Overly-Dynamic URL from Moz crawl?
Thanks, Mike
Will check out the link you posted. I have added a lot of the filtering options - price ascending, descending and all the related page 2s and onwards - to a no-crawl file for Google (and Roger Bot), so hopefully that helps. I have also replaced a lot of the HTML filtering options with menus which cannot be crawled, so I will wait for the next Moz crawl and see if that's helped matters.
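For reference, the sort of rules I added look roughly like this (the parameter names are taken from the site's filter URLs; the exact patterns are simplified, and I believe both Googlebot and rogerbot honour the * wildcard, though that's worth double-checking):

```text
# robots.txt - block the sort-order and pagination variants of category pages
User-agent: *
Disallow: /*pto_sort=
Disallow: /*pto_page=

# Moz's crawler, addressed explicitly
User-agent: rogerbot
Disallow: /*pto_sort=
Disallow: /*pto_page=
```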
Regards,
Carl
-
Ok to ignore Overly-Dynamic URL from Moz crawl?
I am developing an ecommerce site. I just ran it through the Moz crawl to see what's what and it has come back with a lot of issues. Most of these are duplicate page titles (it is not happy with paginated titles, i.e. Shoes, Shoes Page 2, Shoes Page 3 etc) and it has also found a lot of Overly-Dynamic URLs. Again, these seem to come from some of the search functions and filters used, e.g. Accessories&pto_sort=priceAsc&pto_page=6. Other than spending a lot of time and effort trying to rewrite these URLs, there is little I can do about them.
Should I just ignore this? I wouldn't imagine it having a massive impact on the rankings of the pages.
Thanks,
Carl
-
RE: Number of reviews in PPC advert
Sorry for the delayed reply, thanks will pass the information on to the client.
-
Number of reviews in PPC advert
Hi all
Got an email from a client asking about this; I've not come across this one before. The client has a Google+ account with about 2,500 reviews on it. They have linked this into their AdWords account so the reviews show on their PPC ads. However, the PPC ad says only 650 reviews. Quite a difference!!
Anyone know why this would be the case?
Thanks
-
How to add an affiliate store
Hi all
Just wondering what other Mozzers would do about this..... I want to add a revenue stream to a blog of mine and I have decided that an affiliate store is the way to go. I can create a store with merchant datafeeds and pull in products related to my site, and, all being well, make some pennies from it.
Obviously all the datafeeds are published on many other sites, so it will be heavily duplicated content. Would blocking Googlebot from the store be enough to ensure that the site doesn't receive a penalty for duplicate content? I would be keen on getting the product category pages indexed but am not too worried about the actual products themselves.
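To illustrate what I have in mind, assuming (hypothetically) the store lives under /store/ with the datafeed products at /store/product/..., the robots.txt would be something like:

```text
# robots.txt - category pages stay crawlable, duplicate product pages blocked
User-agent: *
Disallow: /store/product/
```

An alternative I have read about is leaving the product pages crawlable but putting a noindex meta robots tag in the product template, which keeps the duplicates out of the index while still letting the crawler pass through them.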
I would like to make some revenue from the site but not at the risk of killing the blog.
Thanks
-
RE: Keyword Rankings Compare to X not working
Is anyone from tech support online yet? I need to generate reports for clients and it doesn't look good if most of the data is missing.
-
Keyword Rankings Compare to X not working
Is anyone else having trouble with the 'compare your keyword positions with competitors' section of the analytics? The rankings for my site(s) are fine, but whichever competitor I click on to compare rankings with just returns 'Not in Top 50', regardless of where the competitor is ranking. I have just checked manually and they are very much listed in the engine.
Is this tool broken?
-
RE: 14,000 blog comment spam links placed on one domain!!
John,
Sorry, I think I may have worded the question poorly. The 14,000 spam links are on an external site, pointing to my client's. The client used an SEO company a while back and their SEO strategy seemed to involve running SEnuke and XRumer. The SEnuke articles are fairly easy to get rid of; emailing a site owner and asking them to remove the link from one article isn't too much trouble.
Emailing a website and asking them to remove blog comment spam placed on 14,000 pages of just one site could be more trouble. I cannot expect them to do this, so hopefully Google will allow me to disavow the whole domain and not expect it to be manually cleaned up.
-
RE: 14,000 blog comment spam links placed on one domain!!
Thanks, will check the site out. The links are very clearly blog comment spam. While I cannot give the name of my client here, one sample URL from the site should show what I mean. I have broken the link on purpose so it's not active.
http://ryantennismusic. com/blog/?p=19&rnment_moveForm(com,_666,/www_paydayadvanceadventeplytocom=556&replytocom=504
The problem I have is that this is clearly a junk link, and the penalty we got from Google says:
Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google's Webmaster Guidelines.
As a result, Google has applied a manual spam action to myclient.co.uk/. There may be other actions on your site or parts of your site. With this in mind I am trying to manually remove as many links as possible. I tried a previous reconsideration and a disavow but had no joy with that. I hope they will accept my explanation on this domain.
-
14,000 blog comment spam links placed on one domain!!
I am trying to clear up a manual link penalty a client has received, and I have found that the client has 14,000 blog comment spam links placed on just one domain!! I need to do a reconsideration request and show Google that I have cleaned up the mess (the client has 50k backlinks, of which I am not happy with about 75% so far!!) but I cannot expect the website in question to go through 14k pages.
What would people advise here? Would you state the sheer number of links on this site in the reconsideration request and use the disavow tool? Doesn't this suggest that I have been lazy and not put the effort in to clearing things up?
-
RE: Building "keyword" backlinks
Thanks for the reply. You make a good point about not overdoing the anchor text and using generic anchors to increase a page's authority; that is a good strategy in the ideal world where you can optimise one landing page for one keyword. The trouble occurs when a client wants to rank one page for several different keywords. You may be able to rank for one, or maybe two, on the strength of good page authority and good on-page SEO, but what about the other x keywords they want to rank for? Too few clients think about SEO when building their site, in my opinion.
-
Building "keyword" backlinks
Looking for some opinions here please. I've been involved in SEO for a couple of years, mainly working on my own websites and picking up the odd client here and there through word of mouth. I must admit that up until a few months back I was guilty of using some grey methods of link building - Linkvana, Unique Article Wizard and such. While no penalties were handed out to my domains and some decent rankings were gained, I got tired of always being on the lookout for what the next Google update would do to my results and which networks were being hit, and so I moved a lot more into the 'proper' way of doing SEO.
These days my primary sources for backlinks are much more respectable...
myblogguest
bloggerlinkup
postjoint
Guest Blog Finder http://ultramarketer.com/guest-blogger-finder/ - not sure where I came across this resource but it's very handy
I use these sources alongside industry only directories and general word of mouth.
Ironically, I have found that doing the work by hand not only leads to results I can happily show people (content-wise) but is also much quicker and cheaper. The increased authority of the sites means far fewer links are needed.
The one area I am still having a little issue with is building keyword-based backlinks. I now find it fairly easy to get my content on a reasonable quality site - DA 40 and above - however the vast majority of these sites will allow the backlink only as the company name or as a generic 'read more' type thing. This is fine, and it is improving my website's performance and authority. The trouble I am finding is that while I am ranking for the title tag and some keywords on the page, I am struggling to get backlinks for other keywords. In an ideal world every page on the site would be optimised for a different keyword, and you could then just use the site name as anchor text to build the authority of that page and make it rank for its content. But what about when you (or the client) want to rank the home page for a number of different keywords, some not featured on the page? The keywords are too similar to go to the trouble of making unique pages for, and that would also add no value to the site.
My question really then, after a very long-winded way of getting there, is: are others finding it much more difficult to gain keyword-based backlinks these days? The great thing about the grey SEO tools mentioned above is that it was super easy to get backlinks with whatever anchor text you wanted - even if you needed hundreds of them to compensate for the low value of each!!
Thanks
Carl
-
RE: 5XX (Server Error) on all urls
Thanks, will check out that plugin. So, in other words, the pages are loading fine for the user but sending an error to the bots instead of the 'loaded OK' message. That doesn't sound good!!
On the plus side, at least it has stopped Roger noticing some of the pages have up to 600 links on them because of all the retailer and manufacturer filtering options!!
Many thanks, Carl
-
5XX (Server Error) on all urls
Hi
I created a couple of new campaigns a few days back and waited for the initial crawl to be completed. I have just checked and both are reporting 5XX (Server Error) on all the pages they tried to look at (for one site I have 110 of these; for the other it only crawled the homepage). This is very odd. I have checked both sites on my local PC, an alternative PC and via my Windows VPS browser, which is located in the US (I am in the UK), and it all works fine.
Any idea what could be the cause of this failure to crawl? I have pasted a few examples from the report:
http://everythingforthegirl.co.uk/index.php/accessories.html - 500 : TimeoutError
http://everythingforthegirl.co.uk/index.php/accessories/bags.html - 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/gloves.html - 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/purses.html - 500 : Error
http://everythingforthegirl.co.uk/index.php/accessories/sunglasses.html - 500 : TimeoutError
I am extra puzzled why the messages say timeout. The dedicated server is 8-core with 32 GB of RAM, and the pages ping for me in about 1.2 seconds. What is the rogerbot crawler timeout?
Many thanks
Carl
-
RE: A client/Spam penalty issue
Thanks for the replies everyone, now comes the fun part where I have to crack on and work my way through 48,000 backlinks!
-
RE: A client/Spam penalty issue
Thanks for the replies everyone, they are most welcome.
If I could trouble you with one sub-question before I mark this as solved: when cleaning up a dodgy backlink profile, what is the general view on nofollow links? Going through the client's links, they seem to have a fair few nofollow links from generic directories. Even though these shouldn't be counting towards the site's ranking, I have been asking people to remove them too. My view is that if I remove all the bad links, regardless of their follow status, that will show Google that I know what is right and what is wrong regarding the site.
Thanks, Carl
-
RE: A client/Spam penalty issue
No, sorry, I may have worded myself poorly... the client used an SEO agency until a couple of months back. It seems that although a lot of the spam links were posted between December and February, they are only now impacting the site. When I referred to negative SEO, I meant it more as a joke that the links look like the perfect example of a negative SEO campaign. I found some forum spam earlier on an Arsenal FC forum and on a forum about psychological issues faced by transgender people. Both seemed fine sites in their own right, but one would have to question their value when linking to a door handle website!!
The initial (and thus far, only) request was a very basic one to say we have received this penalty, we hired a poor SEO company to look at our site and it seems they spammed our domain. I told them I had disavowed several hundred domains, but I think it failed owing to the lack of proof of manual work. So, as suggested by Matthew (above), I will include a document this time to show who we contacted, when, the reply and the current link status.
-
RE: A client/Spam penalty issue
Matthew,
Many thanks for the detailed reply. A short while ago I used the Link Detox tool; I didn't realise you could upload files to it, so I used its built-in bad link identifier. It has given me about 1,800 bad links, which I am working through. Unhelpfully, a few of them are Blogger sites with no contact details!! I am managing to contact about 30% so far, so that's better than nothing.
I have read about using buzzstream.com to try and pull the contact information for the other domains; I will employ this once I have finished going through the list. So far I have documented the URLs and contact times in a spreadsheet. I must admit I didn't know you could link to a Google doc in the reconsideration request, so the spreadsheet I am working through will provide a good start, especially if the 'removed' column starts to fill up!!
Thanks again
-
A client/Spam penalty issue
Wondering if I could pick the brains of those with more wisdom than me...
Firstly, sorry, but I am unable to give the client's URL on this topic. I know that will not help with people giving answers, but the client would prefer it if this thread didn't appear when people type their name into Google.
Right, to cut a long story short... I gained a new client a few months back and did the usual things when starting the project: reviewing the backlinks using OSE and Majestic. There were a few iffy links, but I got most of those removed. In the last couple of months I have been building backlinks via guest blogging, using bloggerlinkup and myblogguest (and some industry-specific directories found using the linkprospector tool). All was going well; the client was getting about 2.5k hits a day on about 13k impressions. Then came the last Google update. The client was hit, but not massively - they seemed to drop from the top 3 for a lot of keywords to an average position of 5-8, so still first page. Traffic went down after this. All the sites which replaced the client were the big name brands in the niche (home improvement sites such as B&Q and Homebase, for the fellow UK'ers). This was annoying but understandable.
However, on 27th June. We got the following message in WMT - Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google's Webmaster Guidelines.
As a result, Google has applied a manual spam action to xxxx.co.uk/. There may be other actions on your site or parts of your site. This was a shock to say the least. A few days later the traffic on the site went down further and the impressions dropped to about 10k a day (oddly, the rankings seem to be where they were after the Google update, so perhaps a delayed message).
To get back up to date... after digging around more, it appears there are a lot of SEnuke-type links to the site - links on poor wiki sites and a lot of blog commenting links, mostly from irrelevant sites. I enclose a couple of examples below. I have broken the links so they don't get any link benefit from this site. They are all safe for work.
http:// jonnyhetherington. com/2012/02/i-need-a-new-bbq/?replytocom=984
http:// www.acgworld. cn/archives/529/comment-page-3
In addition to this there is a lot of forum spam, links from porn sites and links from sites with Malware warnings. To be honest, it is almost perfect negative seo!!
I contacted several of the sites in question (about 450) and requested they remove the links, but the vast majority of the sites have no contact details on them, so I cannot get the links removed. I did a disavow on these links and then a reconsideration request, but was told that this was unsuccessful as the site was still being naughty.
Given that I can neither remove the links myself nor get Google to ignore them, my options for lifting this penalty are limited.
What course of action would others take here, please?
Thanks, and sorry for the overly long post.
-
Merging sites, ensuring traffic doesn't die
Wondering if I could get a second opinion on this, please. I have just taken on a new client who owns about 6 different niched car experience websites ('hire an Aston Martin for the day' type thing). All six sites seem to perform reasonably well for the brand of car they deal with; the average DA of the sites is about 24.
The client wishes to move all of these different manufacturers into sections of one site, which can then also target more generic 'experience day' type keywords.
The obvious way of dealing with this move would be to 301 the old sites to the relevant places on the new site and wait for that to rank. However, looking at the backlink profiles of the niched sites, they seem to have very few backlinks, and I feel the reason they rank so well for the individual manufacturers is that they all feature the name in the domain. Not exact match, but the name is there.
If I am thinking right, with the 301 we want to tell Google 'page x is now page y, index this one instead'. Because the new site has a more generic name, I don't think it will enjoy any of the domain keyword benefits which are helping the sub-sites, and as a result I expect the rankings and traffic to drop (at least in the short term).
Am I reading this correctly? Would people use a 301 in this case? The easiest thing to do would be to leave the 6 sub-sites up and running on their own domains and launch the new site to run alongside them; however, the client doesn't want this.
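Just to be clear on the mechanics, the 301s I am weighing up would be plain page-to-page server redirects, e.g. on Apache (the domain names and section path here are invented for illustration):

```apache
# .htaccess on one of the old niched sites (assumes mod_rewrite is enabled)
RewriteEngine On
# Send every old URL to the matching path in that brand's section of the new site
RewriteRule ^(.*)$ http://www.new-experience-site.co.uk/aston-martin/$1 [R=301,L]
```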
Thanks,
Carl
-
RE: Problems with Crawl Diagnostics/New Campaign
Just tried the domain in the manual crawl tool and it seemed to work in there. The trouble is I'm not such a fan of the output of that report, so I would prefer the 'proper' report.
-
Problems with Crawl Diagnostics/New Campaign
Hi all
I added a new domain to my campaigns yesterday and got the usual message saying we will have a small set of results ready in a couple of hours, and the rest will be done in a week or so.
24 hours later, this same message is still showing. Is anyone else experiencing this, or is it just my domain?
Many thanks.
Carl
-
RE: Website 'stolen', no contact details
Thanks for the reply. We went down the route of blocking the other domains from accessing the server in the end. Hopefully the duplicate versions of the website won't cause too much trouble.
One thing I am considering is adding the domains to Webmaster Tools and removing them from Google; that should help with the duplicate content issues. If they are pointing to our server and accessing our files, then we may as well exploit that for a Webmaster Tools verification action.
-
RE: Website 'stolen', no contact details
Thanks, will look into that. It would be so much easier if this client owned all the domains; they all seem to be owned by different people and not linked to each other in any way.
-
RE: Website 'stolen', no contact details
Thanks for the responses, everyone. The situation gets even more random. It would appear that the content is not stolen, but rather the 'copy' domain (and indeed at least two more) are not only pointing to the client's server but also to the same directory as their site. They are all loading the same files!!! Must admit, this is a new one to me.
The client's IT dept claims, so far, that they purchased a new IP for their server and are using that. The previous IP, it appears, used to belong to another host, and there were sites pointed at that IP. When the IP was moved to the client's server, the sites pointing to it started pointing at the new server. This is just about understandable; how these domains are accessing the files on the server is a mystery. It's a Windows server, so not my area of expertise.
Oddly enough, the two domains which seem to have had their server moved are still registered for another 3 years, so one would assume they are wanted domains, whoever owns them.
It's definitely been an interesting Tuesday morning so far!!! Still the afternoon to come, so I wonder how many more websites we can find sharing the client's files!!
Not sure where this leaves me now with regard to calling it spam/theft. All the domains appear to resolve to exactly the same place, yet only one of them is owned by the client.
-
Website 'stolen', no contact details
Hi all,
Wondering if anyone could help out here; I've got a very strange issue....
I went into Google Webmaster Tools and looked at the incoming links to a client's site (new client, only just gained access to WMT) and noticed 2,563 links coming from one domain. Upon viewing said domain, it is a 100% copy of the client's site. I mean 100%; the phone numbers, email address etc still point to the client.
Everything is the same: the pages, the navigation etc. When I click on a link on the copy site it loads the same pages, but on their domain; the internal linking points to their copies of the pages. It seems to be ongoing, because the client last updated their blog last week and that post is on the copy site.
Obviously this cannot be helping with regard to SEO. The client knows nothing about it, so it hasn't come from them. The copy site is indexed in Google!!
The first thing to do is to contact these people and ask what they are doing. This is proving easier said than done: the contact details on the pages (as mentioned above) still point back to the client, and the whois gives no details.
What would be the first step to take here? Obviously there is the whole legal area around stolen content, but that can wait until we have the site down and out of Google. Is there somewhere in Google to report things like this?
I will speak to the client, and if they are happy I will share both the domains in question; they know I am seeking alternative opinions.
Many thanks
Carl
-
RE: How to remove a thin site penalty
Thanks for your help, gents. All the content on the site was meant to be 100% unique (I got someone in my company to create it with the help of external suppliers). I checked samples and they seemed fine, but it appears that all isn't what it first appeared.
I'll get the content replaced and get back to it.
Thanks
-
RE: How to remove a thin site penalty
Hi
Thanks for the reply. Yes, I originally started to rebuild using Magento but swapped to WordPress as it was quicker and easier. I see your thinking behind removing spam links, but I'm convinced that the penalty is more for thin/duplicate content than for a dodgy link profile. All the product feeds are identical to the merchants' as it's an affiliate store, and previously these were the only content on the site. There is now lots of unique content alongside these feeds, but I think I am still being treated as if it were duplicate.
-
How to remove a thin site penalty
Wondering if anyone could help out. A while back I made an affiliate store using WordPress and merchants' product feeds. I didn't get round to adding any unique content to the site and, as was to be expected, I gained a penalty and my search traffic died.
A few months back I redesigned the store, still using the merchant CSVs but now with 98% unique content on each page. However, try as I may, I still cannot get anywhere in the engines. The domain doesn't even rank for its own name!! I have submitted a reconsideration request, but they replied saying there is no penalty on the site.
The domain is www.digitalcatwalk.co.uk. While the domain isn't massively strong I would prefer not to have to start again as I feel it is a very good domain name.
Any advice would be most gratefully received.
Thanks
Carl
-
RE: Tracking an onpage 'event'
Sorry, I've had a thought which is probably foolish but I will share it anyway:
Could I set up a custom Google Analytics property, generate a unique GA tracking code and place it inside the above script, or would it still be picked up by the whole page?
-
Tracking an onpage 'event'
Hi all
Wondering if anyone could help out with this one please. My client is a government-backed free internet safety website, and in the next few days they will be launching an update on all of their pages which will let people know if their browser is out of date. For example, when you go to their site you will get a message advising you to upgrade your browser for security reasons.
They have employed the following code to check the browser
The client is keen to know how many times in a period this message is shown to users. Any idea how one would go about tracking this, please? Would it involve some custom GA work? Would I be able to track the hits on https://wxxxx.org/javascript/update.js in GA? I'm a little stumped. Obviously I can tell how many people loaded the page, but I'm not sure how to work out what % of them see the JavaScript message.
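My rough idea so far, and I'd welcome corrections, is to fire a GA event at the moment the notice is actually displayed, rather than trying to track hits on update.js. A sketch using the classic _gaq API (the isBrowserOutdated check below is just a stand-in for whatever detection logic the client's update.js really uses):

```javascript
// Reuse the page's existing Google Analytics queue if present (classic ga.js);
// fall back to a plain array so this sketch also runs outside a browser.
var _gaq = (typeof window !== 'undefined' && window._gaq) ? window._gaq : [];

// Stand-in for the client's real detection logic:
// here we simply treat IE 8 and below as outdated.
function isBrowserOutdated(userAgent) {
  var match = /MSIE (\d+)/.exec(userAgent);
  return match !== null && parseInt(match[1], 10) <= 8;
}

// Called wherever the page decides whether to show the upgrade notice.
// Returns true when the notice was shown (and the GA event queued).
function maybeShowUpgradeNotice(userAgent) {
  if (!isBrowserOutdated(userAgent)) {
    return false;
  }
  // ... code that actually renders the "please upgrade" message goes here ...
  // One event per impression; this would appear in GA under
  // Content > Events with category "Browser Notice", action "shown".
  _gaq.push(['_trackEvent', 'Browser Notice', 'shown']);
  return true;
}
```

The % of visitors seeing the message would then just be the total of 'shown' events divided by pageviews for the period.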
Many thanks for your help
Carl
-
RE: A backlinks question
Marcus/Rod
Thanks for taking the time to reply. You make some very interesting points, which I will take on board, and I will look over the studies mentioned. You're right in saying that the product pages on big retailers such as Amazon tend to have little to no backlinks. My website has about 85,000 products, so one of the alternative strategies being considered is to SEO just the root domain or the categories instead. It will take longer to see results this way, but if I can get the overall domain authority much higher by working with just a few pages, then the knock-on effect should help the rest of the site.
Many thanks
-
A backlinks question
Hi all
Could do with a second opinion on this one if anyone has a moment please.
Recent Google updates have targeted overly optimised backlink profiles, as they are clearly there for SEO purposes and not natural. The question I have is: how does this relate to an ecommerce website?
If I sell 'blue 1980 aged cheese' (I know nothing about cheese, so perhaps not the best example!!) and I have a URL on my shop, domain.com/store/blue-1980-aged-cheese, with the product name as the page title along with the domain name, and I were to get backlinks pointing to this page using the anchor 'blue 1980 aged cheese' (and other variations of that: blue cheese, aged blue cheese etc), would this be considered too optimised?
Given that the page is about this item, surely it could be considered natural that people link using the product name, as well as the site name and the domain URL?
Any thoughts please
Thanks, Carl
-
RE: Merging two websites to one...
Thanks for the reply Rod
To be honest, very little content from the current site will be moved over, as most of the products and services are offered in both locations. Given the impact on the local rankings of the domain which is being 'killed', I may suggest to the client that they keep that site live and let it continue with its rankings for the time being. As and when the new site starts to rank for the same keywords, the old site could be withdrawn.
-
Merging two websites to one...
Hi all. Could do with a second opinion on this please...
At present a client of ours owns two shops (both selling flooring, in towns about 20 miles apart, but under different names) and has a website for each. The plan is to rebrand both of these stores the same and merge both websites into one.
The problem is that both of the individual websites rank very well in their respective Google Local search results, and I fear that killing one of the sites will mean that one store vanishes from the local listings. One domain is DA 45 and the other DA 11, so the plan is to use the stronger of the two.
The question I would like to ponder with people wiser than myself is: how can we ensure that the new single domain ranks for both locations in the local results? Would the easiest solution be to have pages such as domain.com/store1 and domain.com/store2 with full listings for each store, inc. name, address, phone number, customer reviews etc?
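To back that up, I assume each store page could carry schema.org markup so each location's details are unambiguous to Google. Something along these lines (all the names, addresses and numbers here are invented placeholders):

```html
<!-- on domain.com/store1 - values are placeholders -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Flooring - Town One</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 High Street</span>,
    <span itemprop="addressLocality">Town One</span>,
    <span itemprop="postalCode">AB1 2CD</span>
  </div>
  Tel: <span itemprop="telephone">01234 567890</span>
</div>
```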
At present the DA 45 domain ranks very well in its Google Local, so we need to find a way to change that domain's homepage to show both stores' phone numbers without affecting the local listing. I was considering adding the second phone number as a text-based image so that it's visible to people but not to bots.
Finally, would 301 redirecting the now unused store's domain to domain.com/store2 help ensure that we do not lose any local listing for that keyword? If not, are there any suggestions people could offer up?
Many thanks for any help and sorry for the very long question
Carl
-
Webinar Recording
Hi
I see there are a lot of webinars coming up in the next few weeks. Are there plans to record these so people unable to make the webinar time can view them at a later date? I see webinars from July have been recorded, but nothing since. There was a webinar last week which I was unable to attend, and I cannot find a recording of it.
-
RE: Safe way to auto geo redirect
Thanks that sounds good. Will investigate more but glad there is a way to do it.
-
Safe way to auto geo redirect
Hi, looking for a second opinion on this please....
I own a couple of web stores: one targets the UK and the other the USA (they are more or less the same store, just with different products targeted at different locations). The UK store runs on a .co.uk domain and the US one on a .us domain. Is there a safe way I could auto-redirect search engine traffic to the right location? Let's say the toys page of the .co.uk site is ranking well in Google UK and appears high in Google US too; obviously I would want US users to visit the toys page of the US store rather than the UK one. Ideally I would employ a geo-redirect script, so if a US user clicks on the UK domain they are redirected to the US site, but would Google frown on that?
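One possibility I have been reading about, rather than redirecting at all, is annotating each page with its regional alternate so Google can serve the right version per country. Something like this in the head of the UK toys page (the URLs are invented for illustration):

```html
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/toys" />
<link rel="alternate" hreflang="en-us" href="http://www.example.us/toys" />
```

My understanding is that any redirect script should at least never treat Googlebot differently from users, as that strays into cloaking territory - but I'd welcome confirmation on that.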
Hope that makes sense?
Thanks
-
Exporting Twitter and FB data in report
Hi
Been with this tool for a few days now and enjoying it so far. I do have one query, though. In the campaigns section we have various tabs of data, including Social. However, while all the other data is exportable in the created report, Social is not available to add to the custom reports. Why is this?
When I click on the Social tab I can download the data as a CSV, but it would be good to be able to export the charts as a PDF as per the other analysis data. It would make it much easier when sharing reports with clients.
Are there any plans to make the social metrics addable to the custom reports one can create?