How to turn on persistent URLs in WordPress?
-
I'm using an appointment form on my website, and I have the option to add a referral URL to form submissions so that I know which page each submission came from.
I need to be able to distinguish between organically generated form submissions and those that come in via AdWords. If the referral URL shows the AdWords tracking code, I know the submission came in from AdWords.
My problem is that when a visitor arrives after clicking an ad and then visits another page on my website, the AdWords tracking code disappears from the URL. I was told there was a way to turn on persistent URLs in WordPress, but I can't figure out how to do it.
I'm assuming that if I turn persistent URLs on, the AdWords tracking code will remain on every subsequent URL they visit on my website. Is this true?
Any help with this will be greatly appreciated.
-
Thanks for your help everyone. I'm working on the GCLID attribution now.
-
Max is definitely right that you need code. The most common attribution method is last non-direct click. The easiest way to distinguish PPC from organic search is to try to grab the GCLID. And if you end up growing your business and/or importing this information back into AdWords through the offline conversion tracking option they offer, you will need the GCLID anyway.
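To give an idea, a minimal sketch of capturing the GCLID in a first-party cookie on a stock WordPress install might look like this (the cookie name and the 90-day lifetime are arbitrary choices on my part, not anything AdWords requires):

// In functions.php or a small custom plugin.
// Store the gclid query parameter in a first-party cookie so it survives
// navigation to other pages on the site.
add_action( 'init', function () {
    if ( ! empty( $_GET['gclid'] ) ) {
        $gclid = sanitize_text_field( wp_unslash( $_GET['gclid'] ) );
        setcookie( 'my_gclid', $gclid, time() + 90 * DAY_IN_SECONDS, '/' );
    }
} );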
-
This is just going to disable Yoast's canonical URL output; I don't see how it could help pass query string parameters along the user's visit path.
-
You can use either one. A cookie persists across different visits (and lasts as long as you decide it should), while a session variable lasts only for the current user session. It depends on the attribution window you want to use.
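As a rough illustration of the two options (the 'lead_source' name is just a placeholder, and note that WordPress does not start a PHP session by default, so you would have to start one yourself):

// Rough sketch of both storage options; pick one based on your attribution window.
add_action( 'init', function () {
    if ( empty( $_GET['utm_source'] ) ) {
        return;
    }
    $source = sanitize_text_field( wp_unslash( $_GET['utm_source'] ) );

    // Option 1: a cookie survives across visits for as long as you choose (30 days here).
    setcookie( 'lead_source', $source, time() + 30 * DAY_IN_SECONDS, '/' );

    // Option 2: a session variable only lasts for the current browsing session.
    if ( ! session_id() ) {
        session_start();
    }
    $_SESSION['lead_source'] = $source;
} );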
-
Thanks for your help, guys. I've tried your method, smarttill, but unfortunately it didn't work.
I will try it your way, Max, but how do I log where the visitor is coming from with a cookie or a session variable?
-
Add Yoast SEO as a plugin in WordPress, then add this to your functions.php:
add_filter( 'wpseo_canonical', '__return_false' );
-
You need code. When the visitor lands on the entry page of your site, take the utm_source or utm_campaign parameter from the URL and log where they are coming from in a cookie, session variable, etc. Then pass it through on form submission (see the sketch below). You can hook this into the header, the footer, or any WordPress code that runs on every page.
You can't keep the query string across the visitor's path without code either; that is more complex, and I don't believe you'll find a WordPress plugin that does it. It is certainly not something you can do with a standard WordPress installation.
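A minimal sketch of the "pass it through on form submission" part, assuming the source was stored in a cookie as above; the shortcode name, the field name, and the 'organic' fallback are all assumptions about your particular form setup:

// Hypothetical shortcode that prints a hidden field carrying the stored source,
// so every form submission reports where the visitor originally came from.
function my_lead_source_field() {
    $source = isset( $_COOKIE['lead_source'] )
        ? sanitize_text_field( wp_unslash( $_COOKIE['lead_source'] ) )
        : 'organic';
    return '<input type="hidden" name="lead_source" value="' . esc_attr( $source ) . '">';
}
add_shortcode( 'lead_source_field', 'my_lead_source_field' );

You would then drop [lead_source_field] into the form markup, or use whatever hidden-field mechanism your form plugin provides.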
-
Thanks for your help, Max, but I don't need to know how many leads came in through the different referral sources. I already know that. What I do need is to identify each individual form submission as coming from organic traffic or PPC.
As I've mentioned earlier, the leads coming in through the form need to be logged in client management software, so I need to take the form submitter's contact information and enter it into the system as coming from organic or PPC. This is done to track ROI.
-
Maybe I am missing something, but a form submission is either PPC or organic because the visitor came from PPC or organic. So if you define a goal in Analytics for the form submission, triggered either by URL match or by JavaScript, you can later check in Analytics how many leads were generated through PPC versus organic by looking at the goals per channel/referral/campaign.
Keep in mind you can use utm_source, utm_campaign, etc. in the links that originate the leads, if you control them.
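For example, a tagged destination URL might look like this (the values are placeholders, not anything specific to your account):
https://www.example.com/landing-page/?utm_source=google&utm_medium=cpc&utm_campaign=appointments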
-
I know Analytics. I can see referral traffic and goal paths and all that. What I need is to be able to attribute individual form submissions to either organic or PPC traffic.
Each form submission is a lead. Each lead needs to be logged in client management software, so in order to properly attribute a lead to either PPC or organic traffic I need persistent URLs, so that the referral URL field in my form reflects the traffic source via the Google tracking code in the URL.
I hope someone here can help shed some light on this. Thanks.