How to turn on persistent URLs in WordPress?
-
I'm using an appointment form on my website, and I have the option to add a referral URL to form submissions so that I know which page each submission came from.
I need to be able to distinguish between organically generated form submissions and those that come in via AdWords. If the referral URL shows the AdWords tracking code, I know the submission came in from AdWords.
My problem is that when a visitor comes in after clicking an ad and then visits another page on my website, the AdWords tracking code disappears from the URL. I was told there was a way to turn on persistent URLs in WordPress, but I can't figure out how to do it.
I'm assuming that if I turn persistent URLs on, the AdWords tracking code will remain on every subsequent URL the visitor goes to on my website. Is this true?
Any help with this will be greatly appreciated.
-
Thanks for your help, everyone. I'm working on the GCLID attribution now.
-
Max is definitely right that you need code. The most common attribution method is last non-direct click. The easiest way to distinguish PPC from organic search is to grab the GCLID. And if you end up growing your business and/or feeding this information back into AdWords via the offline conversion tracking option they offer, you will need the GCLID anyway.
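If you want to capture it yourself, something along these lines in your theme's functions.php would do it. This is only a rough sketch; the cookie name and the 90-day window are my own choices, so adjust them to your setup:

add_action( 'init', function () {
    // If the visitor arrived from an AdWords click, the gclid parameter is on the landing URL.
    if ( isset( $_GET['gclid'] ) && ! isset( $_COOKIE['gclid'] ) ) {
        // Keep it for 90 days so it is still available when the lead converts.
        setcookie( 'gclid', sanitize_text_field( $_GET['gclid'] ), time() + 90 * DAY_IN_SECONDS, '/' );
    }
} );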
-
That is just going to disable the Yoast canonical URL; I don't see how it could help pass query string parameters through the visitor's path.
-
You can use either one: a cookie is persistent across different visits (and lasts as long as you decide it to last), while a session variable lasts only for the current user session. It depends on the attribution window you want to use.
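For example (a rough sketch; the 30-day window and the variable name are just placeholders):

// Cookie: survives across visits until it expires.
setcookie( 'traffic_source', 'ppc', time() + 30 * DAY_IN_SECONDS, '/' );

// Session variable: gone once the browser session ends.
if ( ! session_id() ) {
    session_start();
}
$_SESSION['traffic_source'] = 'ppc';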
-
Thanks for your help, guys. I've tried your method, smarttill, but unfortunately it didn't work.
I will try it your way, Max, but how do I log where the visitor is coming from with a cookie or a session variable?
-
Add Yoast SEO as a plugin in WordPress, then add this to your functions.php: add_filter( 'wpseo_canonical', '__return_false' );
-
You need coding. When the visitor lands on the entry page of your site, take the utm_source or utm_campaign from the URL and log where they are coming from in a cookie, session variable, etc. Then pass it through on form submission. You can use the header, footer, or any WordPress piece of code that runs on every page.
You can't keep the query string through the visitor's path unless you code that too, and it's more complex; I don't believe you'll find a WordPress plugin that does it. It's certainly not something you can do with a standard WordPress installation.
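Very roughly, something like this (a sketch only, assuming your form builder lets you drop a shortcode or raw HTML into the form; the cookie name, field name, and 30-day expiry are just examples):

// Runs on every page load: remember the first traffic source we see for this visitor.
add_action( 'init', function () {
    if ( isset( $_COOKIE['lead_source'] ) ) {
        return; // already attributed on an earlier page view
    }
    if ( isset( $_GET['gclid'] ) || ( isset( $_GET['utm_medium'] ) && 'cpc' === $_GET['utm_medium'] ) ) {
        $source = 'ppc';
    } else {
        $source = 'organic'; // everything else treated as organic for simplicity
    }
    setcookie( 'lead_source', $source, time() + 30 * DAY_IN_SECONDS, '/' );
} );

// [lead_source_field] prints a hidden input you can place inside the form,
// so the stored value is passed through on submission.
add_shortcode( 'lead_source_field', function () {
    $source = isset( $_COOKIE['lead_source'] ) ? sanitize_text_field( $_COOKIE['lead_source'] ) : 'organic';
    return '<input type="hidden" name="lead_source" value="' . esc_attr( $source ) . '">';
} );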
-
Thanks for your help, Max, but I don't need to know how many leads came in through the different referral sources. I already know that. What I do need is to identify each individual form submission as coming from organic traffic or PPC.
Like I mentioned earlier, the leads coming in through the form need to be logged in a client management system, so I need to take the contact information of the form submitter and enter it in the system as coming from organic or PPC. This is done to track ROI.
-
Maybe I am missing something, but a form submission is either PPC or organic because the visitor is coming from PPC or organic. So if you define a goal in Analytics for the form submission, triggered either by URL match or JavaScript, you can later check in Analytics how many leads were generated through PPC or organic by looking at the goals per channel/referral/campaign.
Keep in mind you can use utm_source, utm_campaign, etc. in the links originating the leads, if you control them.
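For example, a link you control could be tagged like this (the domain and parameter values are purely illustrative):

https://www.example.com/landing-page/?utm_source=google&utm_medium=cpc&utm_campaign=appointments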
-
I know Analytics. I can see referral traffic and goal paths and all that. What I need is to be able to attribute individual form submissions to either organic or PPC traffic.
Each form submission is a lead. Each lead needs to be logged in a client management system, so in order to properly attribute a lead to either PPC or organic traffic I need persistent URLs, so that the referral URL field in my form reflects the traffic source via the Google tracking code in the URL.
I hope someone here can help shed some light on this. Thanks.