
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • We've recreated a client site in a subdirectory (mysite.com/newsite) of his domain, and when it was ready to go live, added code to the .htaccess file in order to display the revamped website on the main URL. These are the directions that were followed to do this: http://codex.wordpress.org/Giving_WordPress_Its_Own_Directory and http://codex.wordpress.org/Moving_WordPress#When_Your_Domain_Name_or_URLs_Change. This has worked perfectly except that we are now receiving a lot of 404 errors, and I'm wondering if this isn't the root of our evil. This is a self-hosted WordPress website and we are actively using the WordPress SEO plugin, which creates multiple sitemap files with only 50 links in each. The sitemap_index.xml file tests well in Google Analytics but is pulling a number of links from the subdirectory folder. I'm wondering if it really is the manner in which we made the site live that is our issue, or if there is another problem that I cannot see yet. What is the best way to attack this issue? Any clues? The site in question is www.atozqualityfencing.com https://wordpress.org/plugins/wordpress-seo/
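
    For reference, a minimal sketch of what the first Codex article's method ends up with, assuming the install stayed in /newsite (adjust the path if not): index.php and .htaccess are copied from /newsite to the document root, and the copied index.php's require line is pointed back into the subdirectory:

      <?php
      // index.php copied to the document root (per the Codex method) --
      // it loads WordPress from the subdirectory while pages resolve on the main URL.
      define( 'WP_USE_THEMES', true );
      require( dirname( __FILE__ ) . '/newsite/wp-blog-header.php' );

    With that in place, Settings > General should show the WordPress Address (URL) as mysite.com/newsite and the Site Address (URL) as mysite.com. If the sitemap is still emitting /newsite/ URLs, one of those two settings (or a stale permalink flush) is usually the culprit, which could also explain the 404s.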

    | JanetJ
    0

  • Hi all, We've just rebranded. The 301 redirects appear to have worked well and moved the results and rankings onto the new domain. However, a site:olddomain.com search in Google brings up about a hundred pages that have the new titles and descriptions but show the old URLs - does anyone have any idea how to make the old domain disappear from the SERPs? Many thanks, Richard

    | panini
    0

  • Hi guys, I've realized that when someone tries to access a URL that doesn't exist on my site, it serves a custom 404 page but doesn't return a 404 HTTP status code -
    it still returns a 200. My system is IIS-based; how can I solve this?
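
    A minimal sketch of one common fix, assuming IIS 7 or later and a custom error page at /404.html (both assumptions - adjust to your setup): custom errors served with responseMode="ExecuteURL" keep the 404 status, whereas redirecting to the error page is what typically produces the 200.

      <configuration>
        <system.webServer>
          <httpErrors errorMode="Custom">
            <!-- clear any inherited 404 rule before adding our own -->
            <remove statusCode="404" subStatusCode="-1" />
            <!-- ExecuteURL serves /404.html in place and preserves the 404 status code -->
            <error statusCode="404" path="/404.html" responseMode="ExecuteURL" />
          </httpErrors>
        </system.webServer>
      </configuration>

    If the error page is generated by application code (ASP.NET etc.), the handler also needs to set Response.StatusCode = 404 itself.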

    | atakala
    0

  • When I add new products (approx. 10 a month), I usually delete the old sitemap and submit a new one.  Is this ok to do, or should I just re-submit it with the new info included? Also, is once a month too much?

    | tiffany1103
    0

  • We write great content for blogs and websites (or at least we try), especially blogs. Sometimes a few of them do NOT get good responses/reach. It could be that the content is not interesting, or the title, or bad timing, or even the language used. My question for discussion is: what would you do if you find that content worth the audience's attention missed it during its original launch? Is it fine to make the text and context better and relaunch it? For example:
    1. Rechristen the blog post - change the title to make it attractive
    2. Add images
    3. Check spelling
    4. Do any necessary rewriting
    5. Update the timeline by adding more recent statistics and references to recent write-ups (external and internal blogs, for example), and change anything that seems outdated
    Also, change the title and set rel=canonical / 301 permanent redirects to the new URLs. Will the above make the blog post new? Any ideas and tips? Basically we would like to refurbish (:-)) content that didn't succeed in the past and relaunch it to try again. If we do so, will there be any issues with Google's bots? (I hope redirection would solve this, but I still want to make sure.) Thanks,

    | macronimous
    0

  • Hi everyone! So, I'm using the crawl diagnostics in Moz and it's telling me that I've got duplicate content for these two pages: http://www.bridgelanguages.com/
    http://www.bridgelanguages.com/index.php?p=3233&source=3 Would a redirect from the 2nd page to the 1st one be a solution? I'm not even sure where that 2nd link is on the site. Any suggestions, or has anyone experienced the same? Thanks! Kelly
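
    If that parameter URL isn't needed, a 301 would resolve it. A hedged Apache/.htaccess sketch, assuming ?p=3233&source=3 is the only parameter combination involved:

      RewriteEngine On
      # send /index.php?p=3233&source=3 to the homepage, dropping the query string
      RewriteCond %{QUERY_STRING} ^p=3233&source=3$
      RewriteRule ^index\.php$ http://www.bridgelanguages.com/? [R=301,L]

    Alternatively, a rel="canonical" tag on the parameter version pointing at http://www.bridgelanguages.com/ clears the duplicate flag without removing the URL.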

    | Bridge_Education_Group
    0

  • Does it make sense to make all external links on my site nofollow?

    | Cocoonfxmedia
    0

  • Hi 
    I have quite a lot of 4xx errors on our site. The 4xx errors occurred because I cleaned up poor URLs that had commas etc. in them, so it's the old URLs that now 4xx. There are no links to the URLs that 4xx. What is the best way of rectifying this issue of my own making?! Thanks
    Gavin

    | gavinr
    0

  • I have been following Yoast's Magento guide here: https://yoast.com/articles/magento-seo/ Under section 3.2, "Nofollowing unnecessary links", it says: "Another easy step to increase your Magento SEO is to stop linking to your login, checkout, wishlist, and all other non-content pages. The same goes for your RSS feeds, layered navigation, add to wishlist, add to compare etc." I always thought that nofollowing internal links is a bad idea as it just throws link juice out the window. Why would Yoast recommend doing this? To me they are suggesting link sculpting via nofollowing, but that has not worked since 2009!

    | PaddyDisplays
    0

  • Hello Moz, Odd one for you today. I have a site which has pagination tags (rel="next" / rel="prev"); however, they're not being used correctly. I'll give you an example: let's assume it's a five-page site with a home page, about us, etc. The home page has a rel="next" tag on it pointing to the next tab (about us), and this continues all the way down to the final page (contact us). Normally you use these tags for paginated series, e.g. pages 1-5, but how much harm will they do being used in the way above? I'm thinking site structure. Just to add, there is no view-all page either, though that would make no sense given the way the tags are being used. Normally I would just remove them, but the client wants to know why, and I wanted to articulate it better than "because it's wrong". As always, Moz - thanks!

    | GPainter
    0

  • I'm currently building a site which has an archive of blog posts by month/year, but from a design perspective I would rather not have these on the new website. Is the correct practice to 301 these to the main blog index page? Allow them to 404? Or actually keep them after all? Many thanks in advance Andrew

    | AndieF
    0

  • Hi. I'm using Squarespace, and I've noticed they assign the page title and site title h1 tag status. So if I add an on-page h1 tag, that's three in total. I've seen what Matt Cutts said about multiple h1 tags being acceptable (although that video was back in 2009 and a lot has changed since then). But I'm still a little concerned that this is perhaps not the best way of structuring for SEO. Could anyone offer me any advice? Thanks.

    | The_Word_Department
    0

  • This question may sound like an old one, but I want to know something in depth. I mean, will SEO live forever? Will I lose my SEO job? I just wanted to know: what will be the future of SEO?

    | isolve
    1

  • I have an old, clunky site that has been around for about 5-6 years. It is on the Homestead platform. I want to move the site to a Thesis theme 2.1 WordPress platform without losing my links. I would prefer not to do 301 redirects. With Thesis I can specify the URL for each page of the WordPress site; however, the WordPress site is hosted on HostGator as a subdomain of another site, and the other problem is that WordPress adds a trailing slash that is not present on the old site. I can, however, add .html to the URLs for pages on the WordPress site to conform to the URLs on the old HTML site. Will this work? thx Paul p.s. the URL for my old site is www.affordable-uncontested-divorce.com

    | diogenes
    0

  • Hey guys, I'm wondering if anyone can help... Here is my issue... Our website:
    http://www.cryopak.com
    It's built on the Concrete5 CMS. I'm noticing a ton of duplicate page errors (9,530 to be exact). I'm looking at the issues and it looks like they are being caused by the CMS. For instance, the home page seems to be duplicating: http://www.cryopak.com/en/
    http://www.cryopak.com/en/?DepartmentId=67
    http://www.cryopak.com/en/?DepartmentId=25
    http://www.cryopak.com/en/?DepartmentId=4
    http://www.cryopak.com/en/?DepartmentId=66 Do you think this is an issue? Is there any way to fix it? It seems to be happening on every page. Thanks Jim
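
    One hedged option, assuming the ?DepartmentId versions render the same content as the clean URLs: add a canonical tag to each page so the parameter variants consolidate to it, e.g. in the head of the English homepage:

      <link rel="canonical" href="http://www.cryopak.com/en/" />

    The URL Parameters tool in Google Webmaster Tools can also be set to ignore DepartmentId if it never changes the page content.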

    | TCPReliable
    0

  • (Edited for simplicity) Page #1 on site A has links from 5 different root domains. If I 301 that page to a page on site B that has zero links, will site B gain 5 linking root domains, per the Moz tool? Thanks.

    | ClearPoint
    0

  • Hi guys! Based on crawler results, it shows that I have 188 duplicate content pages, and for some of them I am not able to understand where the duplication is. Each page created is unique: all the URLs are static, and all titles and meta tags are unique. How do I remove this duplication? I am using Zen Cart as a platform. Thanks in advance for the help! 🙂

    | sidjain4you
    0

  • I have a client that has a construction company that services a regional area. They have now developed a PRODUCT that they want to promote that would have national reach. We are redesigning the site, with new branding and all. How do I treat the website URL structure? Should the product have its own domain because of the target market? Or should I make it a subdomain because we want to tie the companies together in some fashion? Every article I read confuses me more on how to handle this. Thoughts?

    | cschwartzel
    0

  • Hi there, For almost a week now, all of my optimized META descriptions have been gone from Google. For the last few years Google has always shown my optimized META descriptions. My website is an ecommerce site (phone accessories) and all pages have their own unique content (URL, text, title, description) and score well in Google. The META descriptions are created using a template like this: "At [brandname] you find lots of [variable category product] * USP 1 * USP 2 * USP 3". All META descriptions differ from each other only by the variable product category. Something tells me this is an effect of the Panda 4.0 update. I tested with a category page by replacing the META description with a 100% unique one. Then I asked Google (via Webmaster Tools) to reindex the page. Today the new description got indexed. This suggests uniqueness is important. My question is: how do I get the optimized META descriptions back? Creating genuinely unique descriptions (i.e. not using a template) for every page is very hard for a webshop, since all category pages have the same message to tell (the only difference is the type of product), I want to use USPs, and the META descriptions of all product pages have been lost too (over 15,000 different products). Please help!
    Thanks in advance. Marcel

    | MarcelMoz
    0

  • It has been quite a while since I have seen an article really talk about this and I am wondering if this even matters anymore?  I prefer the look of a - rather than a | but just wondering if this is still a thing... If you know of a recent article going into findings on this supporting one or the other it would be appreciated. Thanks!

    | DRSearchEngOpt
    0

  • Hey Moz Community! I've got a website that has hundreds of thousands of old links that don't really offer any great content. They need to be removed. Would it be a better idea to remove them in batches of 5000,10000, or more over a long time... or remove them all at the same time because it doesn't matter? Cheers, Alex

    | Anti-Alex
    0

  • Hi there, my company is exploring creating an online magazine built with Adobe's InDesign toolset. If we proceeded with this, could we make these pages "as spiderable" as normal html/css webpages? Or are we limited to them being less spiderable, or not at all spiderable?

    | TheaterMania
    1

  • I noticed some options for new TLDs like .company & .solution. My exact-match keyword domain is all but taken on the traditional TLDs (com/org/net/info). Is there an SEO benefit or disadvantage to using the .solution or .company TLD with the exact keyword match? Thanks in advance

    | BLIT
    0

  • Good day Moz friends 😉 We added our video to Vimeo PRO and embedded it on our website (http://www.sitetogo.nl/). We also added a video XML sitemap (http://www.sitetogo.nl/sitemap-video.xml). I'm not sure if we've done this correctly. Can anybody tell me? Thanks & greetings, Vincent / www.sitetogo.nl
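
    For comparison, a minimal sketch of a single entry in Google's video sitemap format - the thumbnail, title, description and player URL below are placeholders, not values taken from your site:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
        <url>
          <loc>http://www.sitetogo.nl/</loc>
          <video:video>
            <video:thumbnail_loc>http://www.sitetogo.nl/images/video-thumb.jpg</video:thumbnail_loc>
            <video:title>Example title</video:title>
            <video:description>Example description of the video.</video:description>
            <!-- for a Vimeo PRO video you would normally use player_loc (the embed URL) -->
            <video:player_loc>http://player.vimeo.com/video/00000000</video:player_loc>
          </video:video>
        </url>
      </urlset>

    The Sitemaps report in Webmaster Tools shows whether any videos were read from the file, which is the quickest way to confirm the setup.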

    | Aquaster
    0

  • Overnight my website no longer appears in search engines for the two keywords I target. The website had been nicely climbing up (very steady progress to positions 42 and 73), then overnight it vanished off the radar. I have checked my Webmaster account - no messages etc. Please can anyone shed any light on why this has happened? The website is http://www.securityjobsuk.co.uk Many thanks in advance for any help with this. D

    | SJUK
    0

  • Hi, Whilst checking Bing's SEO Analyzer I got this error message for our page www.tidy-books.co.uk/childrens-bookcases: "Evaluated size of HTML is estimated to be over 125 KB and risks not being fully cached. (Issue marker for this rule is not visible in the current view)" Just wondering what needs to be done about it and what it actually means? Thanks

    | tidybooks
    0

  • Hi, I need to change the URLs and permalink structure of my blog posts. How do I handle this with Google? Do I have to re-submit the pages to Google with Fetch as Google? Will Google display duplicate content for the same article (now that the URL has changed), or will it automatically replace the old URLs with the new ones? Tx for your support guys!
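
    Whatever structure you move to, the usual approach is a 301 from each old URL to its new equivalent, so Google swaps the URLs itself and no duplicate content appears. A hedged Apache sketch with a made-up pattern (your real old/new structures will differ):

      # example only: old structure /2014/05/post-name/ -> new structure /post-name/
      RedirectMatch 301 ^/[0-9]{4}/[0-9]{2}/(.+)$ /$1

    With the redirects in place, re-submitting the sitemap is enough; Fetch as Google only speeds up discovery.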

    | tourtravel
    0

  • Hi, I've created and submitted to Google (through Webmaster Tools) a sitemap generated with a WordPress XML sitemap plugin. Now I've created new pages and posts. My question is: do I have to recreate and re-submit another sitemap to Google, or can I just submit the new pages and posts to Google with the 'Fetch as Google' option? Tx so much in advance.

    | tourtravel
    0

  • I've read over the Q&A in the Community, but am wondering about the reasoning behind this issue. I know - 301s are permanent and pass links, and 302s are temporary (due to cache) and don't pass links. But I've run across two sites now that 302 redirect http:// to https://. Is there a valid reason behind this? From my POV and research, the redirect should be a 301 if it's permanent, but is there a larger issue I am missing?
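
    For reference, a permanent move to HTTPS is normally done with a 301; a minimal Apache sketch, assuming the whole site should be served over HTTPS:

      RewriteEngine On
      # send any non-HTTPS request to the same URL over HTTPS, permanently
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    The 302s you're seeing are often just a platform or load-balancer default rather than a deliberate choice.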

    | FOTF_DigitalMarketing
    1

  • We want to show new and returning visitors different versions of our homepage (same URL) What, if anything, should we use as the markup to tell Google what we are doing?
    Any danger that Google will think we are cloaking? Thanks!

    | theLotter
    0

  • My current SEO has always recommended that I take my site to wordpress.  I really don't want to move to wordpress.  I don't like it... I just like writing code in raw html, css, and script.  I feel like I have more control that way. Wordpress just seems like a platform for blogs (I have my blog in wordpress). My question is, do wordpress websites typically rank better? Is there benefit to moving to it?

    | CalicoKitty2000
    0

  • Not sure what best practice here is: http://www.5wpr.com/clients/ Is this a situation where I'm best off adding canonical tags back to the main clients page, or to the practice area each client falls under? Nofollowing all these links and adding canonicals? Nofollow/noindex all client pages? Need some advice here...

    | simplycary
    0

  • I am getting duplicate title tags. A screenshot from Screaming Frog is attached: http://i.imgur.com/jsh0aF8.png I can provide my site if necessary. Appreciate any help to fix it.

    | Chris-tx
    0

  • Hi All, I've searched previous questions and many talk about the same problem but do not post an actual example. I am also thinking of doing a blog post and a series of experiments once there is a theory. My target keyword is "Exhibition Stand Hire" and this is the target page on our site: http://goo.gl/qt54lb The site appears on page 6 of the SERPs (google.co.uk), but instead of this page the homepage is listed. However, if I search for the term in quotes, i.e. "Exhibition Stand Hire", the right page appears on page 4 of the SERPs. Our home page only uses the keyword in the body text, while the target page is very optimised. Could it be over-optimised? I've tried mixing up the words in the title tag so it's not an exact match, and I've also varied the anchor text of all incoming links, but that didn't fix the problem (hence why at the moment they all use different terms to point to this page). None of this helped alter which page is chosen to appear. Is it simply a matter of the page not being strong enough compared to other, less relevant pages on the site? How come many other sites rank better with much less effort? (I'm using OSE to gauge the competition.) Thank you.

    | georgexx
    0

  • Hey guys, I've noticed that the item count is appearing at the beginning of the meta description for our brand pages, e.g. "Items 1 - 24 of 75 -". The issue I have with this is that it reduces the character limit (due to truncation), consequently leaving me with little room to play with to include more useful information. Is there a way to remove this? Cheers, A

    | RobTucker
    0

  • I'm very new here... been reading a lot about Panda and duplicate content.  I have a main website and a mobile site (same domain - m.domain.com).  I've copied the same text over to those other web pages.  Is that okay?  Or is that considered duplicate content?
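
    If the m. pages mirror the desktop pages, Google's documented "separate URLs" setup links the two versions together so the shared text isn't treated as duplication. A minimal sketch with placeholder URLs:

      <!-- on the desktop page, e.g. http://www.domain.com/page -->
      <link rel="alternate" media="only screen and (max-width: 640px)"
            href="http://m.domain.com/page" />

      <!-- on the matching mobile page -->
      <link rel="canonical" href="http://www.domain.com/page" />

    Each desktop/mobile pair needs its own matching annotations.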

    | CalicoKitty2000
    0

  • I have a client that has around 15 "products" (pages containing details of the products rather than e-commerce products) that have been discontinued. The client has suggested 301s, but unless the alternative products are genuine replacements, am I correct that we should be using a 500 error?

    | MentorDigital
    0

  • I read this today: http://searchengineland.com/googles-matt-cutts-one-page-two-links-page-counted-first-link-192718 I thought to myself, yep, that's what I've been reading on Moz for years (pity Matt could not confirm that's still the case for 2014). But reading through the comments, Michael Martinez of http://www.seo-theory.com/ pointed out that Matt says "...the last time I checked, was 2009, and back then -- uh, we might, for example, only have selected one of the links from a given page."
    Which would imply that it does not always mean the first link. Michael goes on to say "Back in 2008 when Rand WRONGLY claimed that Google was only counting the first link (I shared results of a test where it passed anchor text from TWO links on the same page)" and then "In practice the search engine sometimes skipped over links and took anchor text from a second or third link down the page." For me this is significant. I know people who have had "SEO experts" recommend that they should have a blog attached to their e-commerce site and post blog posts (of no real interest to readers) with anchor text links to their landing pages. I thought that posting blog posts just for anchor text links was a waste of time if you are already linking to the landing page within the main navigation, as Google would see that link first. But if Michael is correct, then these kinds of anchor-text-link blog posts would have value. But who is right, Rand or Michael?

    | PaddyDisplays
    0

  • So we have recently relaunched a site that we manage. As part of this we have changed the domain. The webdesign agency that built the new site have implemented a direct link from the old domain to the new domain. What is best practice a direct link or a 302 redirect? Thanks
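
    Best practice for a permanent domain change is a 301 of every old URL to its equivalent on the new domain - not a plain link, and not a 302. A hedged Apache sketch (the domain names are placeholders):

      RewriteEngine On
      # permanently redirect every request on the old domain to the same path on the new one
      RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
      RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]

    Pairing this with a Change of Address notification in Webmaster Tools helps Google process the move.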

    | cbarron
    0

  • I'm thinking this is a silly question, but I've never had to deal with it, so I thought I'd ask. OK, is there a tool out there that will show all the redirects to a domain? I'm working on a project where I keep stumbling on URLs that redirect to the site I'm studying. They don't show up in Open Site Explorer or ahrefs as linking domains, but they keep popping up on me. Any thoughts?

    | BCutrer
    0

  • Hi there, I think I was affected by the recent Panda update, as I had a lot of duplicate content in my product descriptions (about 300). I'm going through and rewriting these to be both helpful and unique. I was ranking quite nicely for a big spread of keywords, but have been seeing my rankings drop day after day since the update. Is it possible to see my rankings improve again after Google re-crawls my site, or would a penalty have been applied to my site, preventing me from regaining my positions for some time? It's probably worth noting that I have a lot of unique and helpful content; it was just my product pages that had duplicate content, but I've seen my rankings across the board drop. Any discussion and insight would be much appreciated.

    | BlueTree_Sean
    0

  • I am wondering if good web hosting and an SSL certificate help a website to rank? Any ideas?

    | AlexanderWhite
    0

  • Hi, A site we look after for a client was down for almost 3 days at the start of this month (11th - 14th of May, to be exact). This was caused by my client's failure to verify their domain name in accordance with the new ICANN procedures. The details are unimportant, but it took a long while for them to get their domain name registration contact details validated, hence the outage. Very soon after this downtime we noticed that the site had slipped back in the Google rankings for most of the target keywords, sometimes quite considerably. I guess this is Google penalizing this client for their failure to keep their site live. (And they really can't have too many complaints about this, in my opinion.) The good news is that the rankings show signs of improving again slightly. However, they have not recovered all the way to where they were before the outage, two weeks ago. My question is this... do you expect that the site will naturally regain its previous excellent rankings without us doing anything? If so, how long do you estimate this could take? On the other hand, if Google typically penalizes this kind of error 'permanently', is there anything we can do to signal to Google that the site deserves to get back up to where it used to be? I am keen to get your thoughts, and especially to hear from anyone who has faced a similar problem in the past. Thanks

    | smaavie
    0

  • Hi All, I'm looking to implement rel="next" & rel="prev", so I've been looking for examples. I looked at the source code for the Moz.com forum - if anyone is going to do it properly, Moz is. I noticed that the rel="next" & rel="prev" attributes have been implemented on the a href tags that link to the previous and next pages, rather than in the head. I'm assuming this is fine with Google, but in their documentation they state to put the tags in the <head>. Does it matter? Neil.
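
    For reference, Google's documentation shows the annotations as link elements in the head of each page in the series, e.g. (placeholder URLs):

      <!-- in the <head> of page 2 of a paginated series -->
      <link rel="prev" href="http://www.example.com/articles?page=1" />
      <link rel="next" href="http://www.example.com/articles?page=3" />

    The first page carries only rel="next" and the last page only rel="prev"; whether Google also honours the attributes on body <a> tags isn't documented, so the head versions are the safer bet.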

    | NDAY
    0

  • Hi all, I had a client that recently released a US version of their UK website and put it live without informing me first! Once I saw it (about 3-4 days later) I immediately asked them to add the rel="alternate" hreflang tags to both websites. However, in the meantime our UK rankings have all gone and it seems as if Google has just kicked the UK website. How long will it take for our rankings to return to normal? Thanks for the advice, Karl
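
    For reference, a minimal sketch of the hreflang annotations for a UK/US pair, with placeholder URLs, placed in the head of both versions:

      <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
      <link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />

    Both sites need the full set pointing at each other; there's no fixed timescale for recovery - it depends on how quickly both versions are recrawled.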

    | KarlBantleman
    0

  • If we have several hundred domain names currently using a park page, would we be better served having them redirect to our corporate homepage for SEO purposes?

    | mkessler
    0

  • What is the best way to stop a page being indexed? Is it to do it at a site level, with a robots.txt file in the main directory, or at a page level with the meta robots noindex tag?
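
    The two mechanisms do different jobs, so a short sketch of each (the path is a placeholder):

      # robots.txt in the site root: blocks crawling of the URL,
      # but the URL can still appear in the index if other pages link to it
      User-agent: *
      Disallow: /private-page/

      <!-- in the <head> of the page: allows crawling but tells engines not to index it -->
      <meta name="robots" content="noindex" />

    To keep a page out of the index reliably, the meta noindex tag (with the page left crawlable in robots.txt) is usually the right tool.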

    | cbarron
    0

  • I have an 8 year old plastic surgery website with my name in the url. I have just released a new website with a generic local plastic surgery url without my name. However my google authorship photo is appearing in listings from both sites with different URLs. So far Google is listing pages from both sites on the same google page result for similar search terms. However I am concerned that eventually I may be punished for dup content since I am the same author for both pages?

    | wianno168
    0

  • I've read that you can use Screaming Frog to identify orphan pages on your site, but I can't figure out how to do it.  Can anyone help? I know that Xenu Link Sleuth works but I'm on a Mac so that's not an option for me. Or are there other ways to identify orphan pages?

    | MarieHaynes
    0

  • Hi, On 24 February I got this penalty message in Google Webmaster Tools: "Google detected a pattern of unnatural, artificial, deceptive, or manipulative outbound links on pages on this site. This may be the result of selling links that pass PageRank or participating in link schemes." I have already removed all the links on the blog and sent a reconsideration request to the Google spam team, but the request was rejected. Please help me with this, or share a link to a similar case. Thanks,

    | KLLC
    0
