How do you create tracking URLs in WordPress without creating duplicate pages?
-
I use WordPress as my CMS, but I want to track click activity to my RFQ page from different products and services on my site. The easiest way to do this is by adding a string to the end of a URL (a la http://www.netrepid.com/request-for-quote/?=colocation).
The downside to this, of course, is that when Moz does its crawl diagnostics every week, I get notified that I have multiple pages with the same page title and duplicate content.
I'm not a programming expert, but I'm pretty handy with WordPress and know a thing or two about 'href-fing' (yeah, that's a thing). Can someone who tracks click activity in WP with URL variables please enlighten me on how to do this without creating duplicate pages?
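For illustration, the tracking-URL scheme described in the question can be sketched in Python. One caveat: `?=colocation` has no parameter name at all, which many analytics tools won't parse cleanly; the sketch below assumes a named parameter such as `source` (a hypothetical name, not from the original post):

```python
from urllib.parse import urlencode

def tracking_url(base, source):
    """Append a named tracking parameter to a landing-page URL."""
    return f"{base}?{urlencode({'source': source})}"

# Build a per-product tracking link to the RFQ page
url = tracking_url("http://www.netrepid.com/request-for-quote/", "colocation")
print(url)  # http://www.netrepid.com/request-for-quote/?source=colocation
```

Each product or service page then links to its own variant of this URL, and the `source` value shows up in your analytics while the page served stays identical.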
Appreciate your expertise. Thanks!
-
You can add the following line to your robots.txt file to ask crawlers to ignore any URL that carries the tracking parameter:
Disallow: /*?=
-
Not a bad idea, and I can do that right from my SEO plugin of choice.
Dummy ?'s: Will that Disallow line catch them even if they are at the end of a child page? Also, for correct coding, wouldn't the wildcard need to go after the ?=, i.e.
Disallow: /?=* rather than placing the * in front of the ?=...
Please clarify for me before I add this line of code.
Thanks for your help, Dean! Sorry for the late edit of the question.
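The `Disallow: /*?=` rule can be sanity-checked with a short sketch. This is not the exact matching code any crawler runs, but it mirrors how major crawlers commonly interpret `*` wildcards in robots.txt: the pattern is matched from the start of the URL path, which is why the wildcard goes in front of `?=` rather than after it:

```python
import re

def robots_pattern_blocks(pattern, path):
    """Return True if a robots.txt Disallow pattern (with * wildcards)
    matches the given URL path. Patterns are anchored at the start of
    the path; a trailing $ anchors the end, per common crawler behavior."""
    regex = re.escape(pattern).replace(r'\*', '.*')
    if regex.endswith(r'\$'):
        regex = regex[:-2] + '$'  # restore an end-of-URL anchor
    return re.match(regex, path) is not None

# The leading * lets the rule match the parameter on any page, child pages included:
print(robots_pattern_blocks('/*?=', '/request-for-quote/?=colocation'))  # True
print(robots_pattern_blocks('/*?=', '/parent/child-page/?=service'))     # True
print(robots_pattern_blocks('/*?=', '/request-for-quote/'))              # False

# Without the leading *, only a ?= directly at the site root would match:
print(robots_pattern_blocks('/?=*', '/request-for-quote/?=colocation'))  # False
```

So to answer the follow-up: yes, the rule catches child pages, and no, the wildcard belongs before the `?=` — a trailing `*` adds nothing, since Disallow rules already match as prefixes.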