Tracking down rel="canonical" on WordPress site
-
A rel="canonical" is being added to every page and post on my Wordpress site - not tag results, not category results. Not a major problem, right? Except that I don't know where it's coming from.
I've tried tracking it down - change the theme back to a default one, turn off all the plugins - it's still there. Is it coming from .htaccess perhaps?
The only issue it is causing is that it has causes me to have to turn off the canonical option in Platinum SEO as that was resulting in two identical rel=canon on each page.
It doesn't seem to be causing problem but I'd like to get a better understanding of what it going on.
-
Thanks Donna - I've already tried switching back to the default theme, and it made no difference.
However, your link does show that WordPress itself adds the canonical link to every post and page by default, so that'll be where it's coming from rather than the theme.
Thanks.
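For anyone else who runs into this: since it's WordPress core emitting the tag, it can be switched off from the active theme. A minimal sketch (in the theme's functions.php), assuming the default rel_canonical hook is what's printing it:

```php
<?php
// WordPress core hooks rel_canonical() onto wp_head, which prints the
// <link rel="canonical"> on single posts and pages. Unhooking it stops
// the duplicate when an SEO plugin outputs its own canonical tag.
remove_action('wp_head', 'rel_canonical');
```

With the core tag removed, the canonical option in the SEO plugin could then be turned back on, so there is still exactly one canonical per page.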
-
The canonical may be getting added by your theme.
Check out this post (http://www.featheredowl.com/wordpress-duplicate-canonical-link-element/). Read the comments. The post is from 2010 but don’t let that deter you. It could very well be the solution to your problem.
-
Related Questions
-
What's the best way to host videos on my WordPress site? (SEO-wise)
I have a hard time choosing between streaming my videos from platforms like Vimeo or YouTube and hosting the video files on my site directly. I'm not quite sure which approach Google would like to see more, and which one will keep my page speed from plummeting too much. Any ideas? Thanks, guys.
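And would something like lazy-loading the embedded players help the page-speed side? A rough sketch for a WordPress site, assuming the videos go through the normal oEmbed embeds:

```php
<?php
// Rough sketch: mark oEmbed iframes (YouTube, Vimeo, etc.) as lazy-loading
// so the offsite players don't block the initial page render.
add_filter('embed_oembed_html', function ($html) {
    return str_replace('<iframe ', '<iframe loading="lazy" ', $html);
});
```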
On-Page Optimization | Benavest
-
How "Top" or "Best" are considered when in front of keyword
I would like to know if someone has proven info how google today counts words "Top" or "Best" when in front of main keywords you try to rank for. For example, if I have a keyword like "Restaurants in Madrid" and I optimize that page without using words "top" or "best" will it have good rankings for keywords "top restaurants in madrid" and "best restaurants in madrid" ? I suppose that google is smart enough to know that web page should be good ranked even without using those 2 words but would like to know percentage of my loss if I just exclude those words from title tag and other important onpage factors. I want to rank high for all the 3 combinations, with "top", with "best" and without it in front so searching for best solution. I plan just to add one of those words, for example "top" and hope that google will know that "top" = "best" 🙂
On-Page Optimization | m2webs
-
Rel="canonical" link should they be to or from an "SEO friendly" url
Thanks for taking the time to review this. So for our example, lets use the following SEO friendly link: http://hiu.calibermediagroup.com/undergraduate-on-campus/academics/colleges/pacific-christian-college-of-ministry-and-biblical-studies/BA-biblical-studies We'll call this link the SEO VERSION The title of the college is" Pacific Christian College of Minstry and Biblical Studies" The title of the program is "BA Biblical Studies" The QUERY version of the link to this page would be something like: http://hiu.calibermediagroup.com/undergraduate-on-campus/academics/colleges/index.php?collegeid=22&programid=34 Keep in mind that the meta title, description, and keyword tags for the page are all administerable The SEO VERSION is automatically created from the title of the college, and the title of the program. Each one of these titles can be overidden with a URL slug individually. For instance, the admin could make the link: http://hiu.calibermediagroup.com/undergraduate-on-campus/academics/colleges/pacific-christian-college-of-ministry/biblical-studies by changing the slug for the college to "pacific-christian-college-of-ministry" and the slug for the program to "biblical-studies". Let's call this version the SLUG VERSION So now we have multiple ways to get to the same content. The question on the table is what is best practice for the rel="canonical" link to keep from getting dinged for duplicate content. Let's say that our SEO VERSION is the canonical link for 1 year. Then the choice was made to optimize the links thru the slugs creating the SLUG VERSION. My assumption is that we would keep the SEO VERSION as the canonical link. But then let's say 6 months later that the title of the program is changed in the admin. Now the SEO VERSION has changed and so has the canonical link. Do we lose the link juice garnered over the last 18 months? It would seem to me, that if we use the QUERY version as the canonical link, then any optimizations or changes affect everything except the canonical link, thus keeping the previous link juice earned. But is having an ugly URL as the canonical link detrimental to SEO? Please advise.
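To make the trade-off concrete, one common pattern is to always generate the canonical from the current slugs and 301 every other variant (the query version, and any pre-rename URL) to it, so the earned equity follows a rename through the redirect. A rough sketch - get_college_slug() and get_program_slug() are hypothetical helpers, not the site's real code:

```php
<?php
// Rough sketch: build the canonical from whatever the *current* slugs are,
// and 301 any request arriving on a non-canonical variant of the URL.
// get_college_slug()/get_program_slug() are hypothetical helpers;
// $collegeid/$programid are assumed to come from the request.
$canonical = sprintf(
    'http://hiu.calibermediagroup.com/undergraduate-on-campus/academics/colleges/%s/%s',
    get_college_slug($collegeid),
    get_program_slug($programid)
);
if (strtok($_SERVER['REQUEST_URI'], '?') !== parse_url($canonical, PHP_URL_PATH)) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . $canonical);
    exit;
}
echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '" />';
```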
On-Page Optimization | robertdonnell
-
Site restructure question
Our site was designed years ago to target customers in specific cities. Now we've grown beyond this, and I believe it is time to change the site structure.
Ignore the 302 from the root page.

Current structure (assuming you've never been to our site before):
- projectmanagementacademy.net 302 -> /select-location.php
- /select-location.php -> /city-name/pmp-training.php (this page was meant to be a "homepage" for each city - a pointless page, really)
- /city-name/pmp-training.php -> /city-name/product-name.php (these pages are for each individual product)

My suggested site structure:
- /city-name/pmp-training.php becomes projectmanagementacademy.net - no more redirect
- /city-name/pmp-training.php gets removed and is 301ed to the root page
- /product-name.php becomes each product's page, and you would select a location when necessary (some products are online only)
- we would 301 each /city-name/product-name.php to the corresponding product page
- /product-name/city-name.php - we could add these pages if we still wanted the city name in the URL for city-specific products

My thoughts here are that /product-name.php would receive a higher % of link juice because there are fewer pages in between - 2 vs 4 if you came to the root page, and 2 vs 3 if you came from the select-location page. Also, instead of being split between over 50 locations, all of these would be together on one page.

Your thoughts? Would this change improve our SERPs for those product pages? Would we see a drop-off in traffic if we did this? How long, if done correctly, would it take to see the recovery of rankings and traffic? Could we 301 /select-location.php to the root page?

Thanks in advance for your insights on this. Any answer is a good answer.

Trenton
On-Page Optimization | PM_Academy
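A rough sketch of the 301 piece of the plan above, assuming each retired /city-name/product-name.php becomes a thin stub (the paths and filename convention are illustrative, not the site's real layout):

```php
<?php
// Rough sketch: a retired /city-name/product-name.php stub forwards
// permanently to the flat /product-name.php page, deriving the product
// name from this script's own filename.
$product = basename($_SERVER['SCRIPT_NAME'], '.php');
header('HTTP/1.1 301 Moved Permanently');
header('Location: http://projectmanagementacademy.net/' . $product . '.php');
exit;
```
-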
Large Site - Advice on Subdomaining
I have a large news site - over 1 million pages (we have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining: http://bit.ly/dczF5y

There are two types of content - news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP, or we pull from their server. These are then processed and published.

It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs when the release comes in and handles most of them. The other runs every night after midnight and finds a few, which are then handled manually; this helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.

The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us. Some are from contributors who also contribute to other news sites.

Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like, and I haven't yet worked out what it is. A lot of people have looked and given me their ideas, and I've tried them - zero effect.

Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even one that is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up on the front page, often in position #2 or #3.

According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem we haven't fixed.

You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites that they consistently rank above us, that have all the same content as us and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different. So this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next.

Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screwups because we had multiple editors who didn't see that a story was already published, and at one time a race condition in the system code - entirely my fault; I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.

I haven't sent in a reconsideration request for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't. So, having exhausted all of the things I can think of, I'm down to my last few ideas:
1. Split all of the PRs off into subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline, a snippet and some related info, and link back to the original page on the PR provider's website. (I really don't want to do this.)
3. Give up on the PRs and delete them all, losing another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us - or find them all and rewrite them as stories, tens of thousands of them - and also throwing all our alliances under the bus. (I really don't want to do this either.)

There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped. My thought is that splitting the PRs off into subdomains will have a number of effects:
1. Take most of the syndicated content off the main domain and onto subdomains.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.

Here is what I plan to do:
1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.

Here are my questions - and I hope you may see something I haven't considered:
1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them in only one place, but there are two types of these index pages: a) all of the releases for a particular PR company - these certainly could be on the subdomain and not on the main domain; and b) various category index pages (agriculture, supermarkets, mining, etc.) - these would have to stay on the main domain because they are a mixture of different PR providers.
5. Is this a bad idea?

I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
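For step 1, a rough sketch of the redirect mapping (the /pr/... path layout below is a placeholder for illustration - the real site's URL scheme may differ):

```php
<?php
// Rough sketch of step 1: 301 each press release to its provider's subdomain.
// Assumes, for illustration only, old URLs like /pr/prnewswire/some-release.html.
$map = array(
    'prnewswire'   => 'pn',  // pn.domain.com for PRNewswire releases
    'businesswire' => 'bw',  // bw.domain.com for Businesswire releases
);
if (preg_match('#^/pr/([^/]+)/(.+)$#', $_SERVER['REQUEST_URI'], $m)
        && isset($map[$m[1]])) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://' . $map[$m[1]] . '.domain.com/' . $m[2]);
    exit;
}
```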
On-Page Optimization | loopyal
-
What is this site all about?
Can anyone here tell me what this site is all about?... I really need to know - I have a blog.
On-Page Optimization | Wales3303
-
What does the "base href" meta tag do? For SEO and webdesign?
I have encounter the "base href" on one of my sites. The tag is on every page and always points to the home URL.
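For context, base href (strictly an HTML base element rather than a meta tag) sets the base URL against which every relative URL on the page resolves. A minimal illustration - example.com is a placeholder domain:

```php
<?php
// Minimal illustration: with a <base href> in the <head>, relative URLs
// resolve against it rather than against the page's own directory.
// example.com is a placeholder domain.
echo '<base href="http://example.com/">';
// A link written as <a href="docs/page.html"> anywhere on the page now
// resolves to http://example.com/docs/page.html - the same applies to
// relative image, stylesheet, and script URLs.
```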
On-Page Optimization | jmansd
-
SEO value of "in the news" links on home page?
I've noticed more sites have an "In the News" section on the home page, or something similar like press releases. Apart from providing users with fresh content, is there SEO value to this? What is the explanation for it? I have a feeling the answer is obvious, but I'm just not too sure. Thanks a lot.
On-Page Optimization | inhouseninja