
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • I run a directory, and some search queries give almost 1,000 unique results. My Moz campaign tells me that I have around 1,300 duplicate title tags, etc. I've read online about canonical tags, rel=next/prev, and having a 'view all' page just for Google (for page links, not search queries), but if I do this, wouldn't the slowness mean Google won't index it? So the question is: what is the best thing to do?

    | tguide
    0
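
    A minimal sketch of the markup usually suggested for this situation: rel="prev"/rel="next" in the <head> of each paginated result page, plus (only if a reasonably fast view-all page exists) a canonical pointing at it. The URLs here are hypothetical placeholders, not the actual directory's:

        <!-- on page 2 of a paginated result set -->
        <link rel="prev" href="http://www.example.com/widgets?page=1" />
        <link rel="next" href="http://www.example.com/widgets?page=3" />
        <!-- optional, only if a view-all page exists and loads quickly enough to be useful -->
        <link rel="canonical" href="http://www.example.com/widgets/view-all" />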

  • After reading that there will be a drop-down arrow on product listing ads, which will push organic results down to display 16 products (http://searchengineland.com/google-testing-pla-expansion-to-show-16-product-listing-ads-166942?utm_source=twbutton&utm_medium=twitter&utm_campaign=tweet), I wondered how badly this is going to affect organic e-commerce. Is this going to be rolled out for every term, with a drop-down arrow for each one? Is organic search going to die for e-commerce now? Thoughts much appreciated.

    | pauledwards
    0

  • What methods and tools do you guys use to perform link audits? Do you also use a traffic light system for links?

    | PurpleGriffon
    0

  • On June 7th, 2013 our structured data (as reported in GWT) dropped from ~61M items on ~7M pages to ~13.5M items on ~1.5M pages. Since that time those numbers have continued to fall. We made no code changes during this time. I've searched around the web and found a few people pointing to a similar June 7th, 2013 drop in reported structured data. Can anyone offer any insight beyond speculation? Outside of the June 7th date, what can cause such a dramatic drop in structured data? Thanks in advance.

    | RyanOD
    0

  • I have a question!
    We have 2 domains operating within the same retail sector. One of them is for our bricks-and-mortar business and the other is a new brand we launched as a nationwide e-retailer. We aggressively built links for the new one and achieved some very good search positioning, where we remained for about 4 months until the Google updates of the first half of this year started biting. The domain never received a warning from Google or anything, but the links have clearly been devalued to the point where the domain is now virtually buried for the most competitive terms. However, the domain still gets around 100-200 visitors per day and has a DA of 38. We're thinking about a reshuffle that would involve putting the products into our bricks-and-mortar website and redirecting the brand domain to the bricks-and-mortar domain. Thank you for reading this far! The question, then, is: is there a danger of the bricks-and-mortar domain being tarnished by this? As I said, the brand domain hasn't had any penalty notices from Google, but it has definitely been hit by updates.

    | FDFPres
    0

  • Hi, is it OK and secure to give others restricted, user-level access to Google Webmaster Tools and Analytics? With this level of access, can anyone tamper with anything? Thanks

    | mtthompsons
    0

  • Hi, I want all search pages like www.somesite.com/search/node/ skipped from indexing, so I have this in robots.txt: Disallow: /search/ Now any posts that start with "search" are being blocked, and in Google I see the message "A description for this result is not available because of this site's robots.txt – learn more." How can I handle this, and also how can I find all the URLs that Google is blocking from showing? Thanks

    | mtthompsons
    0
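
    For reference, robots.txt rules are simple prefix matches against the URL path, which is usually what causes this kind of over-blocking; a sketch of a more specific rule, assuming the internal search pages all live under /search/node/ as in the question (Webmaster Tools' "Blocked URLs" report is the usual place to see which URLs a rule is catching):

        User-agent: *
        # block only the internal search result pages
        Disallow: /search/node/
        # note: a rule written without the trailing slash, e.g. "Disallow: /search",
        # would also match any path that merely begins with "search"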

  • A search query in Google shows 156 indexed results. Yet Roger has completed a crawl and returned a single page. The website is table-based, but there appear to be no redirects or JavaScript blocking bots, so I'm unsure why Roger has under-delivered. The problem is not unique to Roger: I ran the site through Screaming Frog's desktop crawler and it also returned a single page. I'm wondering if there is something in the site code I don't know to look for that is stopping Roger and Screaming Frog from crawling the site. Appreciate any insights you can offer. PS. I've read https://seomoz.zendesk.com/entries/409821-Why-Isn-t-My-Site-Being-Crawled-You-re-Not-Crawling-All-My-Pages- and don't think these suggested causes apply.

    | NicDale
    0

  • Hi, I was just wondering if there is any difference between using rel='next' and rel="next", i.e. single quotes rather than double quotes. Would it still work the same way? Does the choice of quotes matter? Thanks!

    | pikka
    0
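
    A small illustration: HTML allows attribute values to be quoted with either single or double quotes, so the two forms are parsed identically; the URL is a placeholder:

        <link rel="next" href="http://www.example.com/page/2/" />
        <link rel='next' href='http://www.example.com/page/2/' />
        <!-- both are valid; just don't open with one quote style and close with the other -->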

  • I'm in the process of putting together a plan to recover from an algorithmic penalty. I'm not sure if I should focus my recovery effort on Penguin, Panda, or another algorithm. After looking at the attached screenshot of Google Analytics data against the Google algorithm update timeline, I'm still not sure whether the blog was affected by Penguin or Panda. I have the following questions: Is the traffic drop because of Penguin, Panda, or another penalty (there is no manual penalty message)? Where should I focus my time and recovery efforts (link removal, content, link building, etc.)? Any other comments or suggestions? Thanks for your help.

    | rsmb
    0

  • Hi guys, I'd like to know what the best tool is to create different types of sitemaps (image, video, normal). I don't mind if it is paid.

    | faraujoj
    0

  • Hello, I just created a page for researching the impact of social signals on Google rankings (in Italy). The page was not optimized (one internal backlink, no other external or internal links, the keyword repeated 4 or 5 times plus h1 and h2, no alt tags), and only social signals are being stimulated (through votes). The domain is 2 months old and is already positioned for a few relevant keywords, but from page 2 down. My question is: am I doing this right? Is this a good way to proceed? And if not, what should I do instead? Thank you for any advice. Eugenio

    | socialengaged
    0

  • Hi All, in Webmaster Tools > Internal links (https://www.google.com/webmasters/tools/internal-links?hl=en&siteUrl=) I get counts as in the image http://imgur.com/9bO5H0f. Is this logical and OK, or should I work on finding out why there are so many links and reduce them? Thanks, Martin

    | mtthompsons
    0

  • Hi, all of our images are noindexed; will opening this up all at once be an issue? I'm not sure how, but a few months ago all my images were set to noindex, which I only realized last week. We have 20K images which were indexed fine, but now when I check site:sitename it shows 10 or 12, and when I inspect the element via Chrome I see that noindex is set for all images. We have been renaming the images and adding alt tags to most of them. Would it be an issue if we removed the noindex in one shot, or should we do them a few at a time? Thanks

    | mtthompsons
    0
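
    One thing worth checking, since image files cannot carry a meta robots tag themselves: a blanket image noindex is often applied through an X-Robots-Tag HTTP header in the server config. This is only a guess at how it may have been set on this site; a sketch assuming Apache with mod_headers:

        # .htaccess sketch (assumes Apache + mod_headers); removing or commenting
        # out a block like this would re-allow indexing of the image files
        <FilesMatch "\.(jpe?g|png|gif)$">
            Header set X-Robots-Tag "noindex"
        </FilesMatch>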

  • Hi, would changing broken links and anchor text be a problem? We have 80K pages with about 40K links that have been created over the last few years, and since last month we have been updating the old content on those pages, fixing broken links, and changing the anchor text in those posts. Anchor text like "Click me", "Here", "Download", "link", etc. is being changed to meaningful words. In total it is close to 10K link replacements and 10K anchor-text changes. While doing this over the last month we have seen a slight decrease in daily traffic. Is this something Google would consider some kind of wrong webmaster activity, or is it just fine? Thanks

    | mtthompsons
    0

  • I have a specialized website for hospitals covering a specific topic.  This same topic is also applicable to another market but with some minor modifications.  I'm thinking about starting a new site to target this specific market and use the same content as the one specialized for healthcare.  I will have to make some minor adjustments to the articles to take out the healthcare part and replace with the other industry. If my content is similar between both sites and both authored by me could that possibly hurt my rankings? Any opinions appreciated.

    | MedGroupMedia
    0

  • So this is the biggest error I have, but I don't know how to fix it. I get that I have to make the duplicates redirect to the source, but I don't know how to do that. For example, this is out of our crawl diagnostics: "On The Block - Page 3" at http://www.maddenstudents.com/forumdisplay.php?57-On-The-Block/page3 and "On The Block - Page 3" at http://www.maddenstudents.com/forumdisplay.php?57-On-The-Block/page3&s=8d631e0ac09b7a462164132b60433f98. That's just one example, but I have over 1,000 like that. How would I go about fixing it? Getting rid of the "&s=8d631e0ac09b7a462164132b60433f98"? I have GoDaddy as my domain registrar and web host; would they be able to fix it?

    | taychatha
    0
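
    One common way to neutralise session-ID duplicates like this is a canonical tag in the <head> of the page, pointing at the clean URL, so the &s=... variants consolidate onto it; a sketch using the URL from the question (this assumes you can edit the forum's page templates):

        <link rel="canonical" href="http://www.maddenstudents.com/forumdisplay.php?57-On-The-Block/page3" />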

  • I don't want to give details, but a competitor's site ranks ahead of us by a spot or two for some of our major keywords, and we can't figure out why. We have more content indexed, we have better content (my opinion), we're better optimized, we have significantly better MozRank and Majestic rankings, we have more links, and we have better links. They have a plain-Jane WordPress blog, and their pages don't even have 300 characters. They don't have any awesome links. There are plenty of other sites besides ours that should outrank them. When I use the various tools at my disposal, I can usually see why a competitor outranks us. I'm not obsessing, but this one I just don't understand. Has anyone had this experience?

    | CsmBill
    0

  • Hello guys, while fixing the duplicate meta description issues on my site I noticed something a bit odd. The pages are product pages, and the product on each one of them is the same; the only difference is the length of the product. On each page there is a canonical tag, and the link within the tag points back to the same page: www.example.com/Product/example/2001 carries <link rel="canonical" href="www.example.com/Product/example/2001" />. This happens on every such page. I have read the GWT post twice (and I think I will read it again), and I think this is wrong: the canonical should point to a different URL, www.example.com/ProductGroup/example/, which is the page where all the products are grouped together. Cheers

    | PremioOscar
    0
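
    For reference, the two options look roughly like this once the markup is written as a standard <link> element (URLs are the placeholders from the question). A self-referencing canonical keeps each length variant indexed on its own; pointing every variant at the group page consolidates them there:

        <!-- option A: self-referencing canonical on each product-length page -->
        <link rel="canonical" href="http://www.example.com/Product/example/2001" />

        <!-- option B: every length variant points at the grouped product page -->
        <link rel="canonical" href="http://www.example.com/ProductGroup/example/" />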

  • I've seen a few other Q&A posts on this but I haven't found a complete answer. I read somewhere a while ago that you can use as many tags as you would like. I found that I rank for each tag I used. For example, I could rank for best night clubs in san antonio, good best night clubs in san antonio, great best night clubs in san antonio, top best night clubs in san antonio, etc. However, I now see that I'm creating a ton of duplicate content. Is there any way to set a canonical tag on the tag pages to link back to the original post so that I still keep my rankings? Would future tags be ignored if I did this?

    | howlusa
    0
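
    A canonical from a tag archive back to a single post is ambiguous (a tag page normally lists several posts), so the more common pattern is to noindex the tag archives instead; a sketch of the meta tag those archive pages would carry, assuming the tag pages are templated so their <head> can be edited in one place:

        <!-- in the <head> of tag archive pages only -->
        <meta name="robots" content="noindex, follow" />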

  • I launched a new website ~2 weeks ago that seems to be indexed but not cached. According to Google Webmaster most of the pages are indexed and I see them appear when I search site:www.xxx.com. However, when I type into the URL - cache:www.xxx.com I get a 404 error page from Google.
    I've checked more established websites and they are cached so I know I am checking correctly here... Why would my site be indexed but not in the cache?

    | theLotter
    0

  • Last week, I was notified of having a lot of duplicate page titles. I've since made the changes on my website with unique content. I went back to Moz this morning, and I'm still notified of the same problems. However, when I check the back end of that specific page, I can see the changes have already been made. My question is: why are my changes not being updated in Moz? Does it take a while for Moz to recognize this, or am I missing a step?

    | ckroaster
    0

  • Just about to send a reconsideration request to Google for my site, seoco.co.uk, and I would like your input. I was going to include information about each URL I found and the steps I have taken, but there is no room. What do you think of this: “Hi guys, I got an unnatural links message from you back in February and since then my website rankings have fallen dramatically. I spoke to someone at SEOmoz and they said that my website probably got penalised for directory links, so I have gone out and tried to get rid of all the low-quality ones that I am responsible for and some that I am not. Altogether I was able to identify about 218 low-quality directory links. I attempted to contact every one of the directory owners twice over a two-week period and I was able to get about 68 removed. I have used the disavow tool to devalue the rest. Trying to get rid of all of those bad links was hard work and I have definitely learned my lesson. Rest assured I will not be submitting to any more directories in the future. Please can you give me another chance? If my site still violates the guidelines, please could you point out some of the bad links that are still there?” What do you think? Can you think of anything else I should say? Dave

    | Eavesy
    0

  • We have a "?src=" parameter on some URLs, which are treated as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?

    | RodrigoVaca
    0
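
    These are flagged because xyz.com?src=abc and xyz.com?src=def return the same content on two different URLs. One common fix is a canonical tag on the page pointing at the parameter-free URL (Webmaster Tools' URL Parameters setting is another route); a sketch using the question's placeholder domain:

        <!-- served on xyz.com with or without the ?src=... parameter -->
        <link rel="canonical" href="http://xyz.com/" />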

  • If I run a site that charges other companies for listing their products, running banner advertisements, white paper downloads, etc. does it make sense to "no follow" all of their links on my site? For example: they receive a profile page, product pages and are allowed to post press releases.  Should all of their links on these pages be "no follow"? It seems like a gray area to me because the explicit advertisements will definitely be "no followed" and they are not buying links, but buying exposure. However, I still don't know the common practice for links from other parts of their "package". Thanks

    | zazo
    0
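
    For reference, the attribute under discussion is applied per link rather than per page; whether to extend it to every link in the paid package is the judgement call being asked about (the advertiser URL is a placeholder):

        <a href="http://advertiser-example.com/" rel="nofollow">Advertiser profile</a>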

  • In my Google Analytics, I have different stats for what I think is the same page, win-a-party and win-a-party/ I searched all over for the win-a-party/ page and I couldn't find it anywhere. Why would it be tracking these differently? Should I set a 301 from win-a-party/ to win-a-party?

    | howlusa
    0
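
    If a 301 is the route taken, a sketch of a rewrite that sends the trailing-slash version to the non-slash version; this assumes an Apache host with mod_rewrite and uses the path from the question:

        # .htaccess sketch
        RewriteEngine On
        RewriteRule ^win-a-party/$ /win-a-party [R=301,L]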

  • I have read that loading images from a subdomain of your site instead of the main domain will give you speed benefits on load time. Has anyone actually seen that to be the case? Thanks!

    | Gordian
    0

  • Hi, I am having serious problems since I upgraded my website from Joomla 1.5 to 3.0. We have dropped from page one of the rankings for the phrase "lifestyle magazine", and we have dropped in rankings for other very important terms including "gastric band hypnotherapy", and I am starting to regret having the site upgraded. I am finding that Google is taking its time visiting my site. I know this for two reasons: one, I have checked the cache and it is showing the 2nd of July, and two, I have checked articles that we have written and they are still not showing. For example, if I search for this article name word for word, it does not come up: "Carnival Divert Ships In The Caribbean Due To Bad Weather". This article was published yesterday; before the upgrade it would have been in Google by now. These problems are costing us a great deal of traffic. We have lost around 70% of our traffic since the upgrade, and I would be grateful if people could give me advice on how to turn things around. We add a number of articles every day. I was considering changing the middle of the front page to show a few paragraphs of the latest story, to get Google to visit more often; I know this would look messy, but I am running out of ideas. Any help would be great.

    | ClaireH-184886
    0

  • Hi, I am using Google authorship on my site, but when I use the testing tool it is not working. Before the upgrade we had it working fine, but now it does not seem to work. We have our Google+ account pointing to the site, and the writer we are trying to add is not coming up in the tool. Here is the code we are putting on the page: Google+. The page in question is http://www.in2town.co.uk/emmerdale/emmerdale-laurel-is-determined-to-take-action. When I check the tool I get the following: "Authorship Testing Result: Authorship is not working for this webpage" and "Authorship rel=author Markup: Cannot verify that rel=author markup has established authorship for this webpage." The tool I am using to check is http://www.google.com/webmasters/tools/richsnippets. Any help to solve this problem would be great. I am using Joomla.

    | ClaireH-184886
    0
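
    For comparison, the basic rel=author pattern the rich snippets tool expected at the time was a link from the article to the author's Google+ profile, with the site also listed in that profile's "Contributor to" section; the profile ID below is a made-up placeholder:

        <a href="https://plus.google.com/112345678901234567890" rel="author">Author Name</a>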

  • We have 30,000 pages that we want to get rid of. Each product within our database has its own page, and these particular 30,000 products are not relevant anymore. They have very little content on them and are basically the same exact page with a few title changes. We no longer want them weighing down our database, so we are going to delete them. My question is: should we get rid of them in smaller batches, say 2,000 pages at a time, or is it better to get rid of all of them in one fell swoop? Which is less likely to raise a flag with Google? Does anyone have any experience with this?

    | Viewpoints
    0

  • Do special characters, such as the "&" symbol or a "," in title tags and meta descriptions negatively affect your ranking in search engines? Any feedback is much appreciated. Thank you!

    | ZAG
    1
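
    The main technical wrinkle with these characters is simply that an ampersand in HTML should be written as the &amp; entity so the markup stays valid; a small placeholder example:

        <title>Cakes &amp; Pastries | Example Bakery</title>
        <meta name="description" content="Cakes, pastries &amp; more from Example Bakery." />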

  • Hello guys, my website wasn't reachable for a couple of hours today and I can't really understand why, as no links have been built and all the best practices have been followed regarding on-page optimization. I also checked Google Webmaster Tools and there are no warning messages, crawl problems or anything, so I don't understand why this happened. For some reason the website is now up and running again.

    | PremioOscar
    1

  • Hi, our site is www.in2town.co.uk and I am thinking of putting an article on my home page in the middle, under where it says "lifestyle news", getting rid of the middle column and instead having the latest news there, to try to get Google to visit more often. I would like to know if you think this would look messy and be less user-friendly, and whether, if I did do it, it would get Google to visit the site more often. We are adding articles all day, but the home page only shows a few lines of each article, so I am concerned that these few lines are not getting Google interested in visiting our site more often. We were on page one with our site, but since the upgrade we are on page eight, so we are trying to combat this. The featured article would change each time we put a new article on the site, so it could be there for ten minutes before a new one appears. Any thoughts on this would be great.

    | ClaireH-184886
    0

  • Hi, I have a site, BannerBuzz.com. Before the last Penguin update all of my site's keywords were in good positions in Google, but after Penguin hit my website, all my keywords have been going down and down, day by day. I have made some changes to my website to improve things, but there is one change I am unsure about. I have a subdomain (http://reviews.bannerbuzz.com/) which displays user reviews for all of my website's keywords, and 15 reviews from each category are also displayed on my main website, http://www.bannerbuzz.com. So are those user reviews considered duplicate content between the subdomain and the main website? Can I disallow the subdomain for all search engines? Currently the subdomain is open to all search engines; would blocking it help? Thanks

    | CommercePundit
    0
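
    If blocking the subdomain turns out to be the right call, note that robots.txt is per host, so it would be a separate file served at the subdomain's own root; a sketch (whether blocking review content is actually wise is a separate question):

        # served only at http://reviews.bannerbuzz.com/robots.txt
        User-agent: *
        Disallow: /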

  • I have a website, http://www.bannerbuzz.com, and I am promoting the home page for the keyword "vinyl banners", but currently I can see my website's review page in Google's results for "vinyl banners" instead. I want my home page to be displayed rather than the review page for this keyword. The result changes frequently: sometimes I see the home page and sometimes the review page, as shown in the attached image. I want to show my home page, so can you please help me work out how to stabilise my home page for the main keywords?

    | CommercePundit
    0

  • If you have more NoFollowed Linking Root Domains than Followed Linking Root Domains is that a problem?

    | INN
    0

  • I manage a directory site with hundreds of thousands of indexed pages. I want to remove a significant number of these pages from the index using NOINDEX and have 2 questions about this: 1. Is NOINDEX the most effective way to remove large numbers of pages from Google's index? 2. The IA of our site means that we will have thousands of internal links pointing to these noindexed pages if we make this change. Is it a problem to link to pages with a noindex directive on them? Thanks in advance for all responses.

    | OMGPyrmont
    0
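
    A small illustration of the directive in question: with "noindex, follow" a page drops out of the index as it is recrawled, while the links on it can still be crawled and followed, which is why internal links pointing at noindexed pages are generally not treated as a problem in themselves:

        <!-- in the <head> of each page to be removed from the index -->
        <meta name="robots" content="noindex, follow" />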

  • Dear site owner or webmaster of http://www.enakliyat.com.tr/,
    Some of your site's pages may be using techniques that do not comply with Google's Webmaster Guidelines.
    In particular, your site appears to contain low-quality pages or sets of pages that do not offer enough unique or original content. Examples of this type of page include thin affiliate pages, doorway pages, and automatically generated or copied content. For more information about unique and compelling content, visit http://www.google.com/support/webmasters/bin/answer.py?answer=66361.
    We recommend you make the necessary changes so that your site meets the quality guidelines. After making these changes, please submit your site for reconsideration in Google's search results.
    If you have questions about how to resolve this problem, please see our Webmaster Help Forum for support.
    Sincerely,
    Google Search Quality Team. After receiving this message we identified our low-quality pages and added those URLs to robots.txt. Other than that, what can we do? Our site is a home-to-home moving listings portal. Consumers who want to move their home fill in a form so that moving companies can quote prices. We were generating listing-page URLs from the titles submitted by customers.

    | iskq
    0

  • One of my clients is targeting the same 5 keywords across 3 sites. The domain registration and web hosting are the same for all 3 sites. Site A - 50.72.134.29
    Site B - 50.72.140.227
    Site C - 50.72.19.70 Some time ago the rankings dropped, but I don't know whether it is because of the above. Is this OK? What is the best way to target the same keywords with 3 different sites?

    | krishnaxz
    0

  • I have a client who, after their designer added a robots.txt file, has experienced continual growth in URLs blocked by robots.txt, and now the blocked URLs (approx. 1,700) have surpassed those indexed (1,000). Surely that would mean all current URLs are blocked (plus some extra mysterious ones). However, pages are still listed in Google and traffic is being generated from organic search, so it doesn't look like this is the case, apart from the rather alarming Webmaster Tools report. Any ideas what's going on here? Cheers, Dan

    | Dan-Lawrence
    0

  • Hi, is being hit by Panda as hard to get out of as being hit by Penguin? Or if you clean up all your content, should you get out of it relatively quickly? I have a very old (11 years) and established site (but also a very neglected one that I'm looking to relaunch), but it's on an ancient shopping cart platform which never allowed for Google Analytics and GWT integration, so I can't see any messages in GWT or look at traffic figures to correlate a drop with any Panda updates. The reason I ask is that I want to relaunch the site after bringing it up to date on a modern e-commerce platform. I originally launched the site in early 2002 and it was received well by Google, achieving first-field-of-view SERPs for all targeted keywords, however competitive, including "ipod accessories", "data storage", etc. These top positions (and the resulting sales) lasted until about 2007, when it was overtaken by bigger-brand competitors with more advanced, more Google-friendly e-commerce platforms (and big SEO budgets). I originally used the manufacturers' descriptions, editing them slightly but probably not enough to avoid being considered duplicate content, although the site still managed to obtain good rankings for these pages for a very long time, even ranking ahead of Amazon in most cases. The site is still ranking well for some keywords relating to products which still have manufacturer-copied descriptions, so I actually don't think I have been hit by Panda. So my questions are: Is there any way of finding out for sure whether the site has been hit by Panda at all, without looking at Analytics and GWT? And once I find out whether it has or not: Is it best to relaunch on the same domain to take advantage of the 11-year-old domain history and authority? So long as I make sure all product descriptions etc. are unique, should the site escape Panda's clutches quite quickly if it has been hit? OR is Panda as aggressive as Penguin, in which case is it best to start again on a new domain? Many thanks, Dan

    | Dan-Lawrence
    0

  • Hello, I have a site that does not have a blog feed,
    and unless it is done manually there is no way to see the blog links:
    www.MigrationLawyers.co.za. Now, I submit the sitemap to Google, but would it be a good idea to also include an actual sitemap page on the site (for example linked in the footer),
    http://migrationlawyers.co.za/sitemap-immigration-south-africa, and should I make that "sitemap" link follow or nofollow? Thanks so much in advance,
    Nikita

    | NikitaG
    0

  • We recently moved our site to Shopify, but now have a duplicate content issue as we have the same products in different collections. I have added canonical code to get rid of this, but my Webmaster Tools still shows hundreds of duplicate pages. How can I tell if the code I added is working? How long will it take for Google to recognise this and drop the duplicates from its index, and is this likely to have a significant impact on SERPs? Our web page is www.devoted2vintage.co.uk. Thanks, Paul

    | devoted2vintage
    1

  • When doing internal linking back to your home/index page, is it best to set the link as "www.thedomain.com", "www.thedomain.com/", or just "/"? I'm attempting some canonicalization, and our programmer is concerned about linking to just the bare URL, as he's saying it's going to be viewed as an external source. We have www redirects in place that come back to just www.thedomain.com, and a redirect that sends www.thedomain.com/index.php back to www.thedomain.com. Any help would be appreciated, thank you!

    | CharlesDaniels
    0
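
    For reference, the three forms being debated; a root-relative "/" resolves against the current host, so it is not treated as an external link (the domain is the placeholder from the question):

        <a href="/">Home</a>                            <!-- root-relative -->
        <a href="http://www.thedomain.com/">Home</a>    <!-- absolute, with trailing slash -->
        <a href="http://www.thedomain.com">Home</a>     <!-- absolute; browsers still request "/" -->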

  • I have multiple pages that I do not want to be indexed (and they are currently not indexed, so that's great). They don't have meta descriptions on them and I'm wondering if it's worth my time to go in and insert them, since they should hypothetically never be shown. Does anyone have any experience with this? Thanks! The reason this is a question is because one member of our team was linking to this page through Facebook to send people to it and noticed random text on the page being pulled in as the description.

    | Viewpoints
    0
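
    The Facebook snippet issue is separate from indexing: Facebook reads Open Graph tags (falling back to the meta description or page text), so adding them controls the shared preview without touching the noindex; a sketch with placeholder values:

        <meta property="og:title" content="Page title for sharing" />
        <meta property="og:description" content="A short, intentional description for the shared link." />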

  • Hello, I have a website that I am trying to optimise for SEO.
    The current structure of the site is that all pages are linked directly after the domain,
    for example: www.domain.com/page01 and www.domain.com/page02. The website is, however, logically organised in the following form:
    www.domain.com/page01/page02. Sometimes the parenting (please help me with the right term here) goes to 3 levels: Domain
    ↳ Page001
         ↳ Page002
              ↳ Page003 My question is: should I keep the current structure, or is it worth the effort to re-link the website in a parented way? Are there any benefits to one or the other, and could you point me to some video tutorials or documentation to read?

    | NikitaG
    0

  • How do I go about fixing these errors? They are all related to my tags. Thank you in advance for any help! Lisa

    | lisarein
    0

  • I have just run the report for my site http://www.in2town.co.uk and it says I have 246 on-page links, but I am not sure how I have got that many. I know I have a large number of links, and in the old days it was said that you should keep links under 100, but now, with faster websites and connections, people are saying this rule is no longer followed. A report I read said that links should not confuse readers or put them off, so I am just wondering what your thoughts are on a site with over 100 links on the home page, and also, if my site does have too many links, what should I do about it? I cannot understand why it is showing 246 when I do not see that many on the page. Any advice would be great.

    | ClaireH-184886
    0

  • Would Google really discount duplicate content created by WordPress tags? I find it hard to believe, considering tags are on and indexed by default and the vast majority of users would not know to deindex them...

    | BlueLinkERP
    0

  • Setting up canonical tags for an old site. I really need advice on that darn trailing slash at the end of the homepage URL. We have incoming links to the homepage as http://www.mysite.com (without the trailing slash), as http://www.mysite.com/ (with the trailing slash), and as http://www.mysite.com/index.html. I know that there should be 301 redirects to just one version, but I need to know more about the canonical tags... Which version should the canonical tag use: the one without the trailing slash, or the one with it? Thanks for your help! 🙂

    | GregB123
    0
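
    For the homepage specifically, http://www.mysite.com and http://www.mysite.com/ are the same resource (a browser asking for the bare host requests the path "/"), so the trailing-slash form is the conventional pick for the canonical; a sketch using the question's placeholder domain:

        <!-- in the <head> of the homepage (and of /index.html while it still resolves) -->
        <link rel="canonical" href="http://www.mysite.com/" />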


