Hi,
It can be confusing, but there is actually no difference! It still hasn't been phased out, though.
Hi Daniel,
Maybe something else changed in the weeks prior to this that had an effect, like a template or layout change? Is it just you, or have other sites fluctuated in the SERPs also? You have probably gone through this process, but it is worth mentioning since the robots.txt itself doesn't look very suspicious.
As for the image indexation, I think I have your culprit. In your existing sitemap http://www.scojo.com/sitemap/sitemap.xml there are 582 image URLs, and they all seem to be going to a 404 page.
Hope that helps!
Hi,
There is a thread here which at the bottom details a temporary fix until the issue is solved: http://moz.com/community/q/moz-email-is-freezing-microsoft-outlook
It is annoying! I find that if I leave Outlook to itself for about 25 seconds or so the mail appears (that is a good time to delete it so that you don't get the freeze again by accidentally clicking on the mail later on).
Hi Des,
You will probably want to look at implementing rel="alternate" hreflang="xx" in this case, and you usually do not want to be putting a rel=canonical pointing at the .com site from the .co.uk site. Google has removed the text referring to the canonical tag from the relevant help page on hreflang precisely because of the confusion you mention in your question. As I understand it, you can still use a self-referencing rel=canonical tag if you like, but hreflang is the best way to declare to Google that you have language/region-specific content available on other URLs. Having both hreflang and a canonical tag pointing to the .com site might get you UK searches showing the .co.uk URL (from the hreflang) but with title and description text from the .com site (because of the canonical tag). Check out these 2 links for a bit more info:
https://support.google.com/webmasters/answer/189077?hl=en
http://dejanseo.com.au/canonical-vs-hreflang/
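As a rough sketch of what the annotations might look like (the domain and path here are hypothetical, so adjust to your own URLs), you would place these in the <head> of the page:

```html
<!-- These same two lines go on BOTH the .com and the .co.uk version of the page -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
```

The key point is that both versions of the page carry the full set of alternates, including a reference to themselves.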
Hope that helps.
Hi,
This article gives a pretty good rundown of the steps needed (well, the intro bit and then straight to step 4). Look out for the cookieDomain: auto setting, which is a bit of a gotcha in many cases. The technical reasons behind the cookie domain issue can be found here. Hope it helps!
As Andy says, you could pull the information into a popup/lightbox/div using JavaScript and source it all from the same URL, so that should do the trick.
Just thinking out loud now, but it might be worth testing putting the generic texts back into some of the product pages and seeing what happens. 250 word unique texts for each product might be enough to make the moz crawler (and the search engines) see enough value and difference on each page for it not to be an issue anymore. You could test in a single category for example and cross check against analytics and ranking data to see what happens.
Hi Daniel,
Since I was looking I put the site through Screaming Frog (http://www.screamingfrog.co.uk/seo-spider/) and there are a bunch of 404 errors being shown.
I think it must be the relative links you have in your footer. The weird thing is that in Chrome they work OK, but in Firefox they return a page not found! Not sure if this is related in any way to your rankings drop, but even if just for user experience it is worth having a look and maybe even hard-coding those footer links.
You can see an example if you go to http://www.scojo.com/mens-reading-glasses.html in Firefox and then click on any of the left-hand footer links to the other categories.
Good luck!
Check here for a possible workaround until a permanent fix is made: http://moz.com/community/q/moz-email-is-freezing-microsoft-outlook
Hi Thomas,
You could use a WordPress multisite installation for this, which allows for setting up different WP installs on subdomains or subdirectories as Ryan suggests. This will allow you to have separate 'sites' while all still being controlled from one WP admin panel, with a bit of coding/setup work to tweak things if needed. You can also use a domain mapping plugin to run multiple domains (.com, .com.au, .co.uk etc.) from the subfolders if that is in the plans.
Hi Justin,
Matt Cutts has said more recently that there should not be any problem going the all-https route, although he notes to try it out first on a 'lower traffic domain' if possible(!): https://news.ycombinator.com/item?id=4801641 Quite a few popular sites are heading this way (Twitter, Facebook and PayPal, for example).
Assuming that is correct and you are not seeing any ranking/indexing issues with the site currently in https, there should not be an obvious reason to worry.
For your second question, again there should be no issue with running the blog on a non https url, but you would do well to avoid having links to the https version of the blog that are then 301 redirected to the http version. So making sure all your internal links to the blog are already http would be a good idea and ideally all links from the blog back to the site would already be https.
Beyond the page loading times, https URLs can often have gotcha moments where certain elements of the page (JS or CSS links, image URLs etc.) are non-https and produce warnings in some browsers, which is bad for user experience. Sometimes the pages these errors occur on can be pretty deeply hidden, so it is good to be methodical. You also need to keep in mind that your robots.txt and sitemap should be served over https, and make sure that your sitemap is not automatically generating https URLs for the whole site (blog included), which can potentially create unneeded confusion.
Hope that helps!
I will have a stab at it
I am a bit confused about your main problem, since you say that individual properties have great unique content and are good for sharing, but then that they have similar descriptions and you would prefer for them not to be emphasized in Google.
If I understand right, this seems to be mainly an architecture type issue since you want to put emphasis on location + type of accommodation so that users are landing on the 'category' page such as it is and can then drill down.
If so, I don't think you have any problems if the category/URL structure is right, so something like Florida/Miami/Condos gives a pretty good indication of the way you are intending visitors to discover your content. You will want to make sure that each page in that trail has some unique content beyond a simple list of properties. The individual property pages could then be optimized more for the name of the rental than its location (although obviously location would also be included).
I would think that schema markup might be your friend here. Have a look at http://schema.org/LodgingBusiness and particularly the location details, and see how you can apply this to the individual pages. You could then replicate the locality information through your URL/folder structure.
Using breadcrumbs (and structured markup with them) would also help emphasize this to both users and search engines... and maybe you will get some of those lovely breadcrumb links in your serps!
Hope that makes sense!
Hi Daniel,
At first glance I'm not sure a robots.txt issue would be responsible for the rankings drop you describe. What pages are still blocked that you don't want blocked? Maybe something else is going on with the rankings?
For issue 2, have you submitted an image sitemap.xml? If so, what is the address? I see plenty of your images in Google image search, so at least some of them are being indexed.
Hi,
The duplicate report only shows that week's results, so you don't need to chase your tail.
In the duplicate report CSV download you should see the duped pages right above/below each other in the left-hand column. One of these two pages is probably ranking; it is unusual for it to be both of them. Having the duplicate content potentially makes it harder for them to rank, since you are in effect splitting authority between two identical pages, so it is always good to get rid of duplicates.
If it is notifying you of new duplicate content each week, it sounds like it might be a systemic fault. Check which pages are referring to the two different versions (far right-hand column in the CSV) and that should help guide you to the point in the system where the duplicates are being created. Often it is a pretty simple one-time fix and you are good to go!
Hi Thomas,
The multisite setup is just a way of controlling the content. If you set it up on subfolders then Google will see the subfolders as normal, all on the same domain. If you set it up using domain mapping then you are in effect setting up different domains (even though you are hosting the content all in the same place) and Google will see the content as separate domains. If you are planning to go the subfolder route you could set up WordPress to get the same behavior without using multisite; it just might be quite dev-intensive depending on how different you want the regional variations to be.
Whichever route you choose an important thing will be to make sure your regional hreflang tags are set up correctly - this will likely need a bit of tweaking given your description but is worth it to get it right.
Hi,
You don't mention what keywords you are searching for, but I would think it is the recent(ish) case of Google trying out putting the brand name plus a colon at the front of titles in the SERPs. It has been discussed a few times on this forum; I first read about it here: http://www.gordoncampbell.co.uk/colons-page-titles and it seems to be happening quite a lot! If the keywords you are seeing the results for include the brand aptus then this is almost certainly the reason.
Hi James,
That will probably do it. If you want to avoid having to fiddle with the paging code etc., check out something like (grabbed the first link I saw) http://www.jquery4u.com/function-demos/ajax/
Loading content with Ajax/jQuery is pretty straightforward.
Hi Johan,
As Tom says, definitely make sure your 301s are in place. Especially for a large site you will want to submit a new sitemap to help indexing of the new URLs, and a user-friendly 404 page (perhaps noting the redesign and linking to the main areas of the new site) would be a good idea if you expect to still be getting traffic to old URLs that are not covered by 301s for some reason. The basics, in other words, but worth a mention.
Hi Jessica,
Well that's a pretty good mix
Having a bunch of different kinds of errors like that could be due to a combination of issues, but I would think that first of all you have a duplicate content issue: either your CMS is creating it through different URLs for things like multiple categories, archive pages and that kind of thing, or you have a server setup issue that is allowing access to the same pages through multiple URLs (like www vs non-www URLs). It might well be a combination of these things accounting for different errors, so you are going to need to narrow down the data to find out exactly what is going on.
Download your error report as a CSV, open it in Excel, select all and then press Ctrl+Shift+L. This will put your data into a filterable table. You can then go to the headers and filter (for example) only 404 pages or only duplicate title pages. Once you have filtered for a specific error, have a look at the left-hand column for which page or pages the problem is appearing on, and then the far right-hand column to see which page is linking to this page or pages. You will then have a better idea where the problem is and how it is being reached by the crawler, and that will guide you on where to look to fix it. If you want to give a couple of example URLs, a few obvious things might stand out.
Hope that helps!
Hi Justin,
Your QS is (partially) based on your final landing page (after the redirect, in other words), so depending on your keywords/phrases that page might be more or less relevant than the original URL, but the redirect itself should not be causing any negative quality score issues. What quality score details are you seeing in AdWords? You could always experiment using both landing pages for a time to see what quality score the original page gets and whether there are any differences in the average CPC (if that is what you are worried about).
Hi,
Well I will throw a couple of comments in the hat!
I have the same issue on a couple of ecommerce sites I monitor (internal search results are showing in the SERPs for various terms) and am pondering what to do about it because, as you say, it doesn't seem very sustainable long term and you never know when the axe might drop.
My concern with what you suggest re redirects is two-pronged:
1. If you are getting decent traffic and/or conversions from these terms currently, then you run the risk of losing these for who knows how long if the redirects for whatever reason don't sit well with google.
2. If you are redirecting what amounts to internal search queries, would that not mean you would have to re-engineer the search system/URLs to some degree? That doesn't seem an undertaking to be taken lightly!
My thought would be to try a parallel approach: let the search results stay in the SERPs while at the same time working to boost the relevant product pages you were planning to redirect to.
If you can get those pages ranking for the same terms, then if/when Google decides your search results don't belong in the SERPs you will have a ready-made replacement, to a degree. You could also try linking to the relevant pages at the top of the search results page instead of redirecting. This might help push the product pages higher and, perhaps more importantly, could help with your immediate problem of many visits but few conversions on these pages. That being said, wouldn't the search results page be linking to these product pages anyway?
My two cents!
Hi Phillip,
Well, the short answer is: sure, the blog can bring benefits to the main site. It opens a number of avenues that your main site might not be able to easily address in terms of content, such as more in-depth discussions about XXX and posts about related topics that might not fit easily into the content of your main site.
It can also open up networking/community type opportunities with other related blogs (for example), which can result in positive social mentions and/or links, and those links into the blog will help boost the overall domain rank of the whole site.
And of course you can use it to target various key phrases etc that might be difficult to get into the main site and then link back to relevant pages in the main site which complement what you are discussing in the blog.
So yes, it can definitely benefit the whole domain in a number of ways, and having it at /blog/ is a very common practice that should work just fine!
Hope that helps!
Indeed if it is not showing a 404, that makes things a bit difficult!
You could try another way, use OSE!
Use the exact page, filter for only internal links, and boom: 127 pages that link to it. There might be more, but this should get you going!
I don't think changing the link text on those two sites is going to help much, to be honest. If anything I would be tempted to try to reduce the number of sitewide links; there seem to be a few.
A couple of things come to mind.
1. Why should you be ranking for NLP? I am searching from Europe and the first page has some heavy hitters (Wikipedia, Stanford University), some local(ish) results and a few nlp.something type sites that I guess might be your online competitors?
The term NLP is a bit ambiguous: it could be Neuro-Linguistic Programming, it could be Natural Language Processing (the Stanford link) or it could be the NLP Theatre down the road. So if Google is going to return results based on what it thinks the user intent is, then you have a mix of local and 'what is NLP' type results, which seems pretty reasonable.
So with that in mind, I'm not sure your site has enough 'What is NLP' type content to justify ranking on the first page and maybe that is something to consider in terms of adding content.
2. Sitelinks seem to come and go depending on the search term and who knows what else so I wouldn't worry too much if you are not seeing them right now. The good news is that searching for "NLP California" from where I am you are number one, sitelinks and all!
As mentioned above the exact CPC is going to vary based on a number of factors.
With a smallish budget you start off with a disadvantage against your larger competitors but this does not mean you cannot compete, you just need to think smart and target intelligently!
In addition to the BMM and call-only campaigns mentioned, I would suggest looking at even longer-tail keywords (for example 'long term Manhattan office space lease'; I do not know the market, but you get the idea: you can focus on areas of Manhattan, types of office spaces etc.). The objective is to focus both your keywords and your ad groups very tightly and hopefully gain a competitive advantage against your larger competitors, who might not be doing such tight targeting because they have the larger budget and therefore do not put in the extra effort (it happens more often than you might think!).
Tight targeting like this should allow you to get high quality scores on your keywords (hopefully reducing cpc) and at the same time help reduce irrelevant or more generic office leasing clicks that will just waste your limited budget. Negative keywords will also be important to avoid the 'shared' and 'short term' type searches, there will probably be a lot of relevant negative keywords!
If you set up your targeting very specifically then you should be able to get better qualified traffic at a better cpc and therefore hopefully a better conversion rate. This is important when you look at your numbers. They look correct if they are based on data you already have from your site, but consider what happens if ANY of those numbers change: If you get 1 contact for every 10 visitors instead of 15, if two out of ten are good leads instead of one out of ten, if your average cpc is 15 dollars instead of 20. Changing any of those numbers makes the situation look better, changing all of them makes it look A LOT better! You will only get real data once you try - just target intelligently to increase your chances of success!
Hi Tim,
If you have deleted the pages and there are still links to them, either from the same site or from external sites, then these are indeed 404 (page not found) errors, and they will continue to be until you either remove all links to them or 301 redirect them to new pages. Are you sure you have removed all the links to them from your site's pages? If you run the site through a Moz scan it will tell you the referring page that linked to the 404 page (the far right column), and you can go and check where the links are and remove them.
By the way, if it is an all-HTML site and you have FTP access... can't you just download all the files and work on it locally?
Hope that helps!
Visits dropping to zero is a bit odd, have you confirmed your analytics code is installed and tracking properly?
Hi,
Google recommends using a rel=canonical to a 'view all' page OR prev/next pagination. That being said, the setup you have now is unlikely to cause any issues; the pagination is correct and the canonical is pointing to the same page, so there is no technical reason for confusion.
One thing to keep in mind is if you have any filters available on these pages (like sort by price for example). In this case the recommendation IS to use both rel canonical and prev/next pagination. The rel=canonical in this case would point back to the main unfiltered page's url and the prev/next links would point to the relevant pages with the filtering included. So something like:
Page is category/?p=2&sortby=price and the meta data is:
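Something like the following (a sketch only; the domain, total page count and parameter names are assumed from the example URL above):

```html
<link rel="canonical" href="http://www.example.com/category/?p=2" />
<link rel="prev" href="http://www.example.com/category/?p=1&amp;sortby=price" />
<link rel="next" href="http://www.example.com/category/?p=3&amp;sortby=price" />
```

So the canonical points back to the unfiltered version of the same page, while the prev/next links keep the filter in place.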
Hi Jim,
What a car!
I don't think there is much point in making unique title tags for each of the images; if the layout stays the same you will still have duplicate content issues even if you change the titles.
What you really want to do is consolidate the content onto one page, and there are 2 ways that should hopefully be pretty painless to do, depending on your CMS:
1. Change your slideshow code so that the images are loaded without changing the url of the page.
2. Use the rel canonical tag so that all the /media/ urls have a rel canonical tag pointing back to the original Chevrolet-Bel-Air/24481/ page.
Either of those solutions should work in terms of avoiding duplicate content issues, I would be inclined to go with the first one myself if feasible.
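For option 2, the tag in the <head> of each of the /media/ URLs would look something like this (the domain is assumed, the path is taken from your question):

```html
<link rel="canonical" href="http://www.example.com/Chevrolet-Bel-Air/24481/" />
```

That way all the slideshow URLs consolidate their signals back to the one original page.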
Hope that helps!
Hi Juan,
Well, for the specific URL you mention, something does appear to be wrong.
Firstly, the URL you refer to, http://www.tarifakitesurfcamp.com/alojamiento, does not exist (it shows the 404 page), so the Moz crawler is correctly telling you that it is redirecting there.
Secondly, the redirect from the first page is indeed a 302 and not a 301. I checked using http://www.internetofficer.com/seo-tool/redirect-check/ and it shows a 302 redirect, so again the crawler is giving you correct information.
What to do? First of all, make sure the pages you want to redirect to exist and that you have the correct URLs for them, then have a look at your .htaccess file, check for errors, and test to make sure it is doing what you expect it to be doing.
I think for 301 redirects the first URL needs to be without the domain name, like so:
redirect 301 /tks-camp/alojamiento/ http://www.tarifakitesurfcamp.com/alojamiento
That's what I use anyway, and it seems to work.
Hope that helps!
Hi Tim,
Yes, any links to those non-existent pages will be causing the page not found errors, so remove them from the sitemap. You should see the errors reduce (or disappear completely) in the next scan report. If you still see errors in the next scan you can use the same process again to find any remaining links (if there are any).
Hi Alexander,
The hreflang link in the header is probably the best way to do it. As to whether it is impacting you, it depends to some degree on how much duplicate content there is. If you have set up both sites in GWT with separate sitemaps you can keep an eye on how well both sites are being indexed; a lot of unindexed pages on one or the other might indicate a problem. Best practice would be to put in the hreflang as you mention if in doubt.
Hi,
If I understand right, these are internal links on your own site? If that is the case then it is not a huge deal (although you should be careful not to go overboard keyword-stuffing your internal links). Sitewide footer and sidebar links on EXTERNAL sites are often done with an eye on boosting search engine rankings, and as such have been devalued by Google, and sites using this tactic a lot have in some cases been penalized. This is not so relevant for your own site, where it is perfectly normal to see footer and sidebar links highlighting the main pages/categories of the site. Just be careful not to keyword stuff too much, try to link in a way that helps user experience and feels natural, and you should be OK.
Hope that helps!
Hi Gina,
If the link is showing up in the Moz reports you should be able to find it by going to the report page and downloading the report as a CSV (top right). If you filter by 4xx client errors and then scroll all the way to the right in the CSV file, the referring URL is shown in the last column.
Hi,
Well, you don't mention which site is yours, but as for your competitor, from a quick glance it looks like they are doing a pretty good job in quite a few areas.
They have been around since 2006 and it looks like they are on the ball. So all in all, not impossible, rather more like a pretty well planned strategy that seems to be paying off.
I guess it might depend a bit on what the most important features of each product are. Are the differences most important, or are the generic (but important) features shared across all products more important? If the first, then separate pages make sense; if the second, then maybe consolidating onto one page makes more sense. You could also try giving a full feature rundown on your most important/visible product page and then a shortened version on the other two, referencing the first one with a 'for a complete rundown of all features' type link.
Tom sums it up well.
In most if not all such cases you should ask yourself if this is a 'normal' situation for a 'normal' search query, and I think you will find that the answer is no. If the sites are all owned by the same person then it is certainly borderline whether this situation gives searchers enough variety in the results to say that Google is returning the best results. Further to that, for any semi-competitive keyword it would be quite unusual for this situation to be happening, so I would ask myself if it is worth the bother maintaining all these sites and tracking their rankings. Chances are that at some point the algo will mix things up and some (all?!) of the sites will move out of the top 10.
So, officially against the guidelines? (assuming no funny stuff) probably not. An un-natural situation that is unlikely to be very stable and/or worth the effort of maintaining it... probably.
Hi Dimitrii,
The only way to do this is to run multiple AdWords accounts, which is against Google policy and can result in all your accounts getting suspended (see the 'gaming the network' sections here). Not something to risk!
That being said, make sure you use all available extensions on your single ad to fill up as much screen space as possible.
Especially make sure your sitelink extensions are completely filled out with title plus two rows of description each.
If your branded campaign is showing the ad plus four full sitelink extensions, and your organic results are showing expanded sitelinks also, then on most laptop and non-TV-sized computer screens the fourth organic result is going to be below the fold.
Hi Gina,
Which report are you seeing the 404 in? If it is in a Moz report but not in the CSV download, that is a bit odd. Are you sure it is not in there? I usually select all in Excel and then press Ctrl+Shift+L (great trick), which gives you filter dropdown arrows on all the column headers. If you go to the 4xx column and click that column's arrow you can quickly see if there is a 404 'true' hiding in there somewhere. If not, maybe a mail to support is in order, since the page should be in there.
Hey Dan,
I have obviously been (seriously) under-charging for analytics setups
That being said, as Paul mentions it does depend a bit on the platform/CMS etc... those prices might just be expensive (which is relative to the company, and it is their right if they can support and sell it) or simply taking the p**s (which is just a bit mean-spirited).
Hi,
Just in time, Dr Pete has put out this post: http://moz.com/blog/googles-2014-redesign-before-and-after which details quite a few of the changes that seem to be rolling out. Title length is certainly on people's minds, and with the bigger font you do seem to be losing some space for characters (check the post comments for some discussion on that). It is a bit too soon to be seeing much data on click-through rates, but so many different things could be affecting those that I am not sure you could specifically attribute any change to the layout anyway...
Hi Jesse,
Depends a bit on the competition and where you are already for those phrases but if you are pretty well placed then I would be inclined to approach it as you describe. Google highlights all three words when searching so they know there is a relationship there anyway. Seems to me that the heater/burner results are very similar, the furnace ones a bit different so maybe it is worth considering a title like 'waste oil heaters & furnaces' if it makes sense. Beyond that, if there are potentially small conceptual differences between the words then those sound like good fodder for blog posts which should help.
Hi,
First of all check the basics here to make sure that nothing obvious is wrong.
If it is a new campaign then the system often seems to wait to get some quality indicators that your keywords and ads are highly relevant and deserve to have full expanded sitelink extensions.
I would try using the brand name again (where relevant and natural sounding) in some of the sitelink titles and/or descriptions, link to relevant secondary pages about the company, and bid high on the keywords. Since it is a branded query you should have a mid to high quality score already, and once the campaign has been running a little while you should have a high click-through rate also; this combination should be enough to ensure maximum quality score, and your descriptions should be shown.
If it is a new site or a brand with little branded traffic this process can take longer to complete (which is why you bid high in the beginning to make sure your ads show - once adwords sees brand relevancy the cpc should drop).
Hi,
I had a look at what I assume is the site, and I think you have a combination of things going on that is likely causing confusion (to you, to the Moz bot, and probably to Google too!)
Firstly, it is not recommended to use rel prev/next and rel canonical on the same page. With your current setup you are effectively only indexing the first page of the results, since all the other pages rel canonical back to the first one. If you have a 'view all' type page then you could rel canonical all of the paginated pages to that one, and you would not need to use the prev/next tags at all. It is also possible that your use of relative canonical URLs, in combination with the above, is adding to the confusion; it is usually best to use absolute URLs if possible.
Beyond that, the site dynamically loads more products as you scroll down the page which also results in the url changing to hoeretelefon/? for ALL the pages. If that is a problem or not depends on how it is coded and how the google and seomoz bots are deciding to parse the page, but it certainly adds another potential area of complexity to the issue.
Lastly, if you browse the site with JavaScript turned off you can see something odd: the initial page /elektronik/baerbar-lyd/hoeretelefon has no prev/next OR canonical tag, but has a link to /elektronik/baerbar-lyd/hoeretelefon?page=1, on which you find prev/next tags and a canonical back to the non-paginated version. So you are basically skipping the pagination setup that goes from the original to page=1 (while also giving a canonical back to the original page).
Phew! It is a bit confusing. I would recommend deciding whether you want to go with prev/next or canonical in the first place and taking it from there. I would think that if you have the ability to canonical to a 'see all products' page then this might be the best way to go, since it should theoretically take care of any issues the dynamic loading is causing as well.
Hope that helps!
Hi Robert,
There are a couple of things to think about. You mention media plugins, but the URL you mention just looks like a theme image path. So, starting from theme images: I have got to love the Roots WordPress starter theme. It has a series of .htaccess rules that basically show all your template images sitting in /media and all your uploaded images sitting in /assets. It does the same thing for CSS and JS files etc., so it does a pretty good job of hiding the fact you are using WordPress (at least in terms of URL paths). It can be a bit tricky to get used to, but once set up it is pretty good. I don't really think it makes a great difference in ranking, but I like the clean URLs!
Hi,
What you should do depends on what the issue is. 301s are useful if your duplicate issues are due to both www and non-www URLs being available to the search engines. The canonical tag is useful if you have URL parameters that are causing duplicate content, or if you have a capitalisation issue with your URLs. There are other possible root causes, so the first thing to do is properly identify what is causing the duplicate content in each case (there might be various reasons). What kind of duplicates they are will then guide you to the best way to approach the solution. You can check out this Moz guide for a good rundown on potential issues and solutions.
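For the www vs non-www case, for example, a typical .htaccess rule set looks something like this (a sketch assuming Apache with mod_rewrite enabled, and a hypothetical domain):

```apache
RewriteEngine On
# 301 redirect any request for the bare domain to the www version, keeping the path
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Swap the two hostnames around if you prefer the non-www version to be canonical.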
Without knowing your site it is tough to tell the exact issue, but:
Duplicate content relates to the page content itself, not the URL, so if you are seeing different pages all flagged as duplicates it likely means the actual content on them is very similar. If you 301 redirect these pages then only the final page the 301 redirects to will still be available to users and search engines, so be sure this is what you intend to do. If, for example, these are pages for different (but similar) products then you would likely not want to do this.
The login page: depending on the issue (perhaps URL parameters?) this is likely a good candidate for a rel canonical. In this case you put the canonical tag in each page version; most CMSs will do this automatically for all versions of the page if you set it up properly.
Hope that helps a bit!
Hi,
If you want to have one ad that displays on both desktop and mobile, do NOT check the mobile box. This checkbox is used to indicate that the ad is to be 'preferred' for mobile devices relative to other ads of the same type (i.e. text ads) in the same ad group. Google does not recommend having ad groups with only mobile-optimized ads in them, saying they may show up on desktop and tablet searches anyway.
So if you want to create text ads that show across all devices, simply create them as normal and do not check the mobile box. If you want a text ad with different wording for mobile users (there are lots of reasons why you might want this), best practice is to create at least two text ads: one for desktop/tablet and one for mobile (with the mobile box checked). You can see which devices each ad has been shown on by selecting the 'Device' option from the Segment dropdown in the Ads tab, to get exact numbers of impressions, clicks, CTR etc. for each ad.
It should be redirecting to index.php as long as a number of conditions are met:
rewritecond $1 !^(index.php|public|tmp|robots.txt|template.html|favicon.ico|images|css|uploads)
As long as the requested url does not start with one of: index.php, public, tmp, robots.txt, template.html, favicon.ico, images, uploads and,
rewritecond %{REQUEST_FILENAME} !-f
rewritecond %{REQUEST_FILENAME} !-d
As long as the requested url is not an existing file or directory
Then:
rewriterule ^(.*)$ index.php?link=$1 [NC,L,QSA]
Rewrite the url to index.php?link=REQUESTED-URL (along with any other url variables) and stop processing
So you should be seeing URLs something like index.php?link=page.php or similar if the conditions above are met.
robots.txt is not being redirected since it is being specifically excluded in the first line.
Think I got that right, hope it makes sense!