How do I turn on persistent URLs in WordPress?
-
I'm using an appointment form on my website, and I have the option to add a referral URL to form submissions so that I know which page each submission came from.
I need to be able to distinguish between organically generated form submissions and those that come in via AdWords. If the referral URL shows the AdWords tracking code, I know the submission came in from AdWords.
My problem is that when a visitor arrives after clicking an ad and then visits another page on my website, the AdWords tracking code disappears from the URL. I was told there was a way to turn on persistent URLs in WordPress, but I can't figure out how to do it.
I'm assuming that if I turn persistent URLs on, the AdWords tracking code will remain on every subsequent URL the visitor opens on my website. Is this true?
Any help with this will be greatly appreciated.
-
Thanks for your help, everyone. I'm working on the GCLID attribution now.
-
Max is definitely right that you need code. The most common attribution method is last non-direct click. The easiest way to tell PPC from organic search is to try to grab the GCLID. And if you end up growing your business and/or feeding this information back into AdWords through the offline conversion tracking option they offer, you will need the GCLID anyway. A rough sketch of capturing it is below.
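Something like this, just as an illustration (the cookie name and the 90-day lifetime are my own assumptions; adjust them to your attribution window):

// In the active theme's functions.php (or a small custom plugin).
add_action( 'init', function () {
    // Only capture the click ID once, on the landing page that still has it in the URL.
    if ( isset( $_GET['gclid'] ) && empty( $_COOKIE['my_gclid'] ) ) {
        $gclid = sanitize_text_field( wp_unslash( $_GET['gclid'] ) );
        setcookie( 'my_gclid', $gclid, time() + 90 * DAY_IN_SECONDS, '/' );
    }
} );

Any later page (or the form template) can then read $_COOKIE['my_gclid'] even though the parameter is no longer in the address bar.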
-
This is just going to disable the Yoast canonical URL; I don't see how it could help pass query string parameters along the visitor's path through the site.
-
You can use either one. A cookie persists across different visits (and lasts as long as you decide it should), while a session variable lasts only for the current browsing session. It depends on the attribution window you want to use.
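Purely as an illustration, with made-up names, the two options look roughly like this in PHP/WordPress:

// $source is whatever you captured from the landing URL (utm_source, gclid, etc.).
$source = 'adwords'; // example value

// Option 1: a cookie — survives across visits until it expires (30 days assumed here).
setcookie( 'lead_source', $source, time() + 30 * DAY_IN_SECONDS, '/' );

// Option 2: a PHP session variable — lives only for the current browsing session.
if ( ! session_id() ) {
    session_start();
}
$_SESSION['lead_source'] = $source;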
-
Thanks for your help, guys. I tried your method, smarttill, but unfortunately it didn't work.
I will try it your way, Max, but how do I log where the visitor is coming from with a cookie or a session variable?
-
Add Yoast SEO as a plugin in WordPress, then add this to your functions.php: add_filter( 'wpseo_canonical', '__return_false' );
-
You need coding. When the visitor lands on the entry page of your site, take the utm_source or utm_campaign from the URL and log where they are coming from in a cookie, a session variable, etc., then pass that value through on form submission. You can hook this into the header, the footer, or any piece of WordPress code that runs on every page; see the sketch below.
You can't keep the query string on every URL along the visitor's path unless you code that as well, which is more complex, and I don't believe you'll find a WordPress plugin that does it. It's certainly not something a standard WordPress installation can do.
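A minimal sketch of that idea (not a drop-in solution — the cookie name, the shortcode name, and the "organic" fallback are assumptions you'd adapt to your own setup and form plugin):

// In the active theme's functions.php (or a small custom plugin).

// 1) On every page load, remember the traffic source the first time we see it.
add_action( 'init', function () {
    if ( empty( $_COOKIE['lead_source'] ) ) {
        if ( isset( $_GET['gclid'] ) ) {
            $source = 'ppc'; // AdWords auto-tagging present
        } elseif ( isset( $_GET['utm_source'] ) ) {
            $source = sanitize_text_field( wp_unslash( $_GET['utm_source'] ) );
        } else {
            $source = 'organic'; // assumption: untagged landings are treated as organic/direct
        }
        setcookie( 'lead_source', $source, time() + 30 * DAY_IN_SECONDS, '/' );
    }
} );

// 2) A [lead_source_field] shortcode that prints a hidden input; place it inside
//    the appointment form's markup so the value is submitted with the lead.
add_shortcode( 'lead_source_field', function () {
    $source = isset( $_COOKIE['lead_source'] )
        ? sanitize_text_field( wp_unslash( $_COOKIE['lead_source'] ) )
        : '';
    return '<input type="hidden" name="lead_source" value="' . esc_attr( $source ) . '">';
} );

Many form plugins also offer their own way to populate a hidden field dynamically; the shortcode above is just one generic approach.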
-
Thanks for your help, Max, but I don't need to know how many leads came in through the different referral sources. I already know that. What I do need is to identify each individual form submission as coming from organic traffic or PPC.
As I mentioned earlier, the leads coming in through the form need to be logged in client management software, so I need to take the form submitter's contact information and enter it in the system as coming from organic or PPC. This is done to track ROI.
-
Maybe I am missing something, but a form submission is either PPC or organic because the visitor who sent it came from PPC or organic. So if you define a goal in Analytics for the form submission, triggered either by a URL match or by JavaScript, you can later check in Analytics how many leads were generated through PPC versus organic by looking at the goals per channel/referral/campaign.
Keep in mind you can use utm_source, utm_campaign, etc. in the links that originate the leads, if you control them.
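For example, a tagged destination URL could look something like this (placeholder domain and campaign names):

https://www.example.com/book-appointment/?utm_source=adwords&utm_medium=cpc&utm_campaign=appointments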
-
I know Analytics. I can see referral traffic, goal paths, and all that. What I need is to be able to attribute individual form submissions to either organic or PPC traffic.
Each form submission is a lead. Each lead needs to be logged in client management software, so in order to properly attribute a lead to either PPC or organic traffic, I need persistent URLs so that the referral URL field in my form reflects the traffic source via the Google tracking code in the URL.
I hope someone here can help shed some light on this. Thanks.