
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • I know you can't add sitelinks manually, since Google generates them automatically, but is there a way to edit the description that appears under them? It's not the same as the meta description I've written for my landing pages. The keyword where you can see the sitelinks is "Digital Media Construction."

    | WebLocal
    0

  • Hello Moz! We currently display the number of Google +1s for our homepage on all pages of our website. Could this be viewed as black-hat/manipulative by Google and end up harming our website? Thanks in advance!

    | TheDude
    0

  • I have a website listing real estate for sale in different areas. In small villages, towns, and areas there is sometimes nothing for sale, so the page is completely empty, with no content except some footer text. I have thousands of landing pages for different areas, for example "Apartments in Tibro" or "Houses in Ljusdahl", and Moz Pro gives me "Duplicate Content" warnings for the empty ones (I think because the pages are so empty that they end up quite similar). I guess Google could also take a dim view of my site if I have hundreds or thousands of empty pages, even if my total number of pages is 100,000+. So, what should I do with the pages for small cities, towns, and villages where there aren't always houses for sale? Should I remove them completely? Should I serve a 404 when no houses are for sale and a 200 OK when there are? Please note that I have 100,000+ pages in total and this concerns only about 5% of them.

    | marcuslind90
    0

  • I need someone who can help me with my SEO. I am too busy to do it myself, and I really feel the last person who did it didn't do a good job. Please message me (if that is possible). I am looking for on-page work, probably disavowing a good number of links, and anything else someone can point me in the right direction on. I'm having some pretty major issues with my guy right now, and I feel like my rankings are falling off the map because of it. Thanks!

    | Veebs
    0

  • Hi, We've just updated our website and binned a lot of old thin content which has no value even if rewritten. We have a lot of 404 errors in WMT, and I am in the process of setting up 301 redirects for them. Is there a limit to the number of 301s a site should have?

    | Cocoonfxmedia
    0

  • Hello, One of my clients wants to know what you think is the best solution. He sells hundreds of templates a month that have a footer link pointing to our homepage. The anchor texts are "keyword" and "Brand Name"; some differ from others. Do we update the templates so those footer links are nofollow? Or do we make all the links "Brand Name" and leave them followed? I understand "Brand Name" is the business name, but I'm also afraid it is so close to the money-making keyword in the industry that Google might think we are trying to game the system. Looking for your expert opinions!

    | MoosaHemani
    0

  • Hello everyone! I am new to the community and I have a question about determining keywords. I have created a blog (LulusLikes.com) to practice my SEO. I have installed the Yoast SEO plugin and noticed that it always encourages you to choose a different focus keyword for each post. So if my focus keyword is "Dog of the Week" and it's a weekly contest, wouldn't that be my focus keyword each time I had that type of post? How should I choose my focus keyword for that kind of post? I hope that makes sense. Thanks!

    | Lulus_Likes
    0

  • 1. www.company.com/subfolder/subfolder/keyword-keyword-product (I'm able to keyword-match with this URL) or 2. www.company.com/subfolder/subfolder/product (no URL keyword match). Which would you choose: a URL which is short but still relevant, or a URL which is more descriptive, allowing a keyword match? It would be great to get your feedback, guys. Many thanks, Gary

    | GaryVictory
    0

  • I was under the impression that if you got an SSL cert for your site, the site would change to https. I ran this site, http://thekinigroup.com/, through an SSL checker and it said it had one... but it's http. 1. Why didn't it change to https? Is there an extra step that needs to be done? 2. Is there a reason someone would choose to get an SSL cert but not use https? Thanks, Ruben

    | KempRugeLawGroup
    0
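
    For reference: installing an SSL certificate only makes HTTPS available; it does not redirect visitors to it. That usually requires a separate server-side rule. A minimal sketch, assuming Apache with mod_rewrite enabled (domain taken from the question; your host may offer an equivalent control-panel toggle instead):

    ```apache
    # Send all HTTP requests to the HTTPS version of the same URL.
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://thekinigroup.com/$1 [R=301,L]
    ```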

  • Hello, My website is www.invitationsforless.ie, and on Google Ireland it was ranking no. 6 for the keyword "wedding invitations", and I was doing quite well from this, but it has now moved down the rankings to no. 10, so I have gone from the first page on Google to the second, which is very disappointing. I did recently change the blurb on my homepage - would this have affected it? Please help, I don't know what to do. Thanks, Linda

    | invitationsforless
    0

  • We are working on a site that has a whole section that is not indexed (well, a few pages are). There is also a problem where two directories have the same content, and it is the incorrect directory whose URLs are indexed. The problem: if I do a search in Google to find a URL - typically location + term - I get the URL (from the wrong directory) in the top 5. However, do a site: search for that URL and it is not indexed! What could be going on here? There is nothing in robots.txt or the source, and the GWT fetch works fine.

    | MickEdwards
    0

  • I'm trying to redirect a whole load of pages which use a query string to a different directory. For example: Original URL: example.com/news/post.php?s=2011-01-28-some-text New URL: example.com/blog/post.php?s=2011-01-28-some-text My understanding is that because the URL uses query strings, I need to use a rewrite rather than the usual Redirect 301. I've come up with this, but it is not doing the job. Any ideas what I'm doing wrong? RewriteEngine On
    RewriteCond %{QUERY_STRING} s=
    RewriteRule ^/news /blog [L,R=301]

    | RodneyRiley
    0
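
    For comparison, a sketch of a rule that would perform this redirect, assuming the rules live in an .htaccess file at the document root. Two common gotchas with the attempt above: in per-directory context the matched path has no leading slash, and the existing query string is carried over automatically unless the substitution adds its own `?`:

    ```apache
    RewriteEngine On
    # Only act when an s= query string is present.
    RewriteCond %{QUERY_STRING} ^s=
    # Redirect /news/post.php to /blog/post.php; the ?s=... part
    # is appended to the target unchanged.
    RewriteRule ^news/post\.php$ /blog/post.php [R=301,L]
    ```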

  • Hi everyone, I'm getting a crawl error, "URL too long", for some really strange URLs, and I'm not sure where they are being generated or how to resolve them. It's all on one page, our request-info page. Here are some examples: http://studyabroad.bridge.edu/request-info/?program=request info > ?program=request info > ?program=request info > ?program=request info > ?program=programs > ?country=country?type=internships&term=short%25 http://studyabroad.bridge.edu/request-info/?program=request info > ?program=blog > notes from the field tefl student elaina h in chile > ?utm_source=newsletter&utm_medium=article&utm_campaign=notes%2Bfrom%2Bthe%2Bf Has anyone seen anything like this before, or have an idea of what may be causing it? Thanks so much!

    | Bridge_Education_Group
    0

  • We are soon to launch a new company in New Zealand called Zing. I have been tasked with the challenge of ranking as highly as possible for anything to do with Zing before launch in February. Zing is in the financial industry, so my colleagues thought it would be a good idea to make a small blog (very small, with literally one post) that reviewed other financial lenders. This site stayed online for a couple of months before it was replaced. The official website is still yet to launch, so as an in-between, I asked that we make a splash page with a small competition on it (see it at zing.co.nz). I would have preferred more keywords on the website, but this was not achieved. I am still pushing for this and am hoping to get a few pages on there in the near future. Instead of getting the keywords on the splash page, I was given permission to start a subdomain (blog.zing.co.nz). This contains many more common search terms, and although it's not quite doing the job I would like, the rankings for Zing have started to increase. At the moment we are ranking number 1 for a few brand-related keywords such as "zing loans". This is why I feel something is wrong: we rank number 1 for over 10 similar terms, yet we do not appear in the search engines at all for "Zing". Have we been penalized? Do you have any suggestions at all? Do you think we could have been penalized for the first, average blog? Maybe I messed up the swap-over? Any help would be hugely appreciated!

    | Startupfactory
    0

  • Looking for some advice on a duplicate content issue we're having that definitely isn't unique to us. We allow all our tag and category pages, as well as our blog pagination, to be indexed and followed, but Moz is flagging all of that as duplicate content, which is expected since it is the same content that appears in our blog posts. We decided in the past to keep these pages as they are, since it hasn't seemed to hurt us specifically and we hoped it would help our overall ranking. We haven't seen positive or negative signals either way, just the warnings from Moz. We are wondering whether we should noindex these pages and whether that could cause a positive change, but we're worried it might cause a big negative change as well. Have you confronted this issue? What did you decide, and what were the results? Thanks in advance!

    | bradhodson
    0

  • Is there a tool or Chrome extension I can use to load a page, identify the .js files on it, 'uncheck' selected ones, and load the page again to check it still renders correctly? Even better would be the ability to defer scripts, or move them to the end of the file, to test.

    | MickEdwards
    0

  • I am doing a 301 redirect from site ABC to site XYZ. I uploaded the following .htaccess file via FTP to the ABC.com server: Redirect 301 / http://XYZ.com/ This was completed over 30 days ago. OSE is not showing any of the links and is failing to show that abc.com is redirected, even though the MozBar shows a successful 301 HTTP status code. Is this still just a waiting game, or is it not advisable to do a redirect this way for SEO? PS: ahrefs is showing the redirect itself; however, it is not showing the links pointing at ABC.com as passing to XYZ.com. Any help is appreciated.

    | Vspeed
    0

  • Does Google still support hyphenated domains for exact match or not? For example: is www.my-key-word.com an exact match for "my keyword" or not?

    | hammadrafique
    0

  • Have to say, I'm pretty impressed with Moz. This is now my first full week of membership, and wow, have I seen some great increases in my site stats! Hopefully this isn't just a blip and it will continue for weeks and months to come. Authority has jumped from 27 to 34; Google page-one results jumped from 7 to 13; traffic increased by 12%; I solved duplicate content issues and started a proactive social media campaign. I could go on and on, but I can't say enough positive things about the services provided here - an investment well worth paying for and already paying for itself. The goal for the next few weeks is to improve domain authority from 34 to 40+. I've been using long-tail phrases for my articles, which has been tremendously beneficial. One query: even though the domain authority has moved from 27 to 34, I don't appear to have gained any extra backlinks - perhaps I'm misunderstanding this metric? The other query: there are hundreds of backlinks pointing to my domain (I provide an open-source CMS, so I know the links are there), but none of these links appear to be counting towards my authority. Is there a way I can submit those pages to the index on their behalf? Cheers, Lee

    | LeeC
    0

  • Hi Moz fans, I have noticed that there is a huge difference between the number of indexed pages of my site shown via a site: search and the number shown in Webmaster Tools. Searching for my site directly (site:), about 435,000 results come up; according to GWT there are over 2,000,000. My question is: why is there such a huge difference, and which source is correct? We launched the site about 3 months ago, there are over 5 million URLs within the site, and we have had lots of organic traffic from the very beginning. Hope you can help! Thanks! Aleksandra

    | aleker
    0

  • I recently watched John Mueller's Google Webmaster Hangout [Dec 5th]. In it he mentions to a member not to use Schema.org, as it's not working quite yet, but to use Google's own markup tool, the Structured Data Markup Helper. Fine - this I have done, and one of the tags I've used is AUTHOR. However, if you use Google's Structured Data Testing Tool in GWMT, you get an error saying the following: Error: Page contains property "author" which is not part of the schema. Yet this is the tag generated by their own tool. Has anyone experienced this before? If so, what action did you take to rectify it and make it work? As it stands, I'm considering just removing this tag altogether. Thanks, David

    | David-E-Carey
    0
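
    For reference, a common cause of the "not part of the schema" error is an `author` property that sits outside any item type that defines it. A minimal microdata sketch (hypothetical markup, not the poster's actual page) where `author` is nested inside an Article:

    ```html
    <article itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="headline">Post title</h1>
      <!-- 'author' is valid here because schema.org/Article defines it;
           the same itemprop outside an itemscope triggers the error. -->
      <span itemprop="author">David</span>
    </article>
    ```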

  • Hello! Previously we had been using Amazon CloudFront only for our static content (JS, CSS, etc.). But to reduce load on our origin servers and give our international users a good experience, we decided to deliver a couple of our sites entirely through CloudFront. We noticed very nice drops in page load time, but when checking Google Webmaster Tools we noticed that all CloudFront-activated sites saw a huge drop in pages crawled per day (from an average of ~3,500 to ~150). Also, one of the sites has issues with its Google sitemaps (just marked as "Pending" in GWT), and no new or updated pages seem to appear in the Google SERP. The rest of the sites get some updates in the SERP, but very few compared to before the CloudFront activation. Is there anybody here with experience of full site delivery through CloudFront (or other CDNs) and its effects on SEO/Google? I would be very glad for any insights or suggestions. The risk is that we will need to remove CloudFront if this continues.

    | Ludde
    0

  • Hi Community, We've been struggling with the search engine ranking of our SEO-optimised homepage for a number of months. I'll provide an overview of the page stats in the hope that somebody might have a suggestion as to what the problem might be, or where we should focus our efforts. I have also provided the stats of our main competitor, as I have no idea why they rank so high given these numbers. URL in question: https://mysite.com - On-Page Grade for our targeted keyword: A; Domain Authority: 36; Page Authority: 45; Root Domain Links: 57; Total Links: 634; SE Ranking: #17. Competitor URL: https://competitorsite.com - On-Page Grade for the same targeted keyword: A; Domain Authority: 32; Page Authority: 43; Root Domain Links: 28; Total Links: 919; SE Ranking: #1. Another strange thing about our homepage: a second-tier page on our site is actually ranking higher for the targeted keyword (#9), even though that page has not been optimised and has an On-Page Grade of F. Does anybody have any suggestions as to what we might be overlooking, or what the issue might be? -JF

    | ERpro
    0

  • On the 3rd of November we changed our company name and domain. The new site was not changed at all, so the 301 process was quite straightforward. The changeover was successful: no downtime, and all pages redirected correctly (with a few minor exceptions). However, after a few days we started to see more and more links into the new site from the old site; they now stand at over 3 million, plus over 200K links from the new site to the old. The links from the new site back to the old were due to us having left a lot of links tucked away on various pages, possibly causing loops with the 301 redirects on the old site. We fixed these, and now there are no remaining links back to the old site, though we are still showing just over 200K links back to it. We are also seeing a lot more backlinks to the new site from old junk sites which are not showing for the old site. A couple of years ago we spent about a year trying to track down and remove thousands of spam backlinks. We did what we could, got a lot removed, and showed Google the evidence; Google then lifted the penalty and said they had made some changes that meant the links were no longer causing it. I added the old disavow file to the new site, but it doesn't cover a fraction of the sites being displayed as providing backlinks, many of which are clearly spammy. Is it possible that Google made some manual actions to lift the penalties but failed to associate these changes with the new domain - changes that were not included in the disavow file? All help appreciated.

    | Exotissimo
    0

  • Hello, My father is having a website built called www.thewoodgalleries.co.uk. The site consists of different product categories as set out below: 1. Engineered Wood, 2. Parquet & Reclaimed, and 3. Prefinished Wood, filtering further into colours: 1. /lights-greys/, 2. /beiges/, 3. /browns/, and 4. /darks-blacks/, and then the brand name, for example Vicenza. Example of a clean URL: http://www.thewoodgalleries.co.uk/engineered-wood/lights-greys/vicenza/ - each and every URL is unique. Our programmer has put in place 301 redirects - http://www.thewoodgalleries.co.uk/engineered-wood/lights-greys-engineered-wood/vicenza/ - is this really needed? It does not look clean, and it will appear like this in Google. This is a completely new site, a new start-up business. I'm very confused as to why he has done this, and concerned this method of programming does not follow best practice. Can any programmer offer any advice? To get a better idea of how the URL structure is set out, I have attached a JPG image. Thank you, Faye

    | Faye234
    1

  • Has anyone relocated a website from one country to another? I want to replace all references to one country (UK) with another (Australia): the phone number, currency, and address will change; meta/products/content/URLs will remain the same; and the .com URL will be associated with Australia. Will the website keep its rankings, or will it be damaged to the point where another website should be built from scratch?

    | GardenBeet
    0

  • Hey Mozzers. I want to block my paginated author archive pages, but not the primary page of each author. For example, I want to keep /author/jbentz/ but get rid of /author/jbentz/page/4/. Can I do that in robots.txt by using a * where the author name would be? So, basically, my robots.txt file would include something like this: Disallow: /author/*/page/ Will this work for my intended goal, or will it just disallow all of my author pages?

    | Netrepid
    0
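
    For reference, Googlebot does support `*` as a wildcard in Disallow rules (a Google extension to the original robots.txt standard), so `Disallow: /author/*/page/` should block the paginated archives while leaving `/author/jbentz/` crawlable. A rough, simplified sketch of that matching behaviour for sanity-checking patterns (not Google's actual implementation):

    ```python
    import re

    def googlebot_match(rule: str, path: str) -> bool:
        """Simplified sketch of Googlebot's Disallow matching:
        '*' matches any character sequence, '$' anchors the end,
        and rules otherwise match as path prefixes."""
        pattern = re.escape(rule).replace(r"\*", ".*")
        if pattern.endswith(r"\$"):
            pattern = pattern[:-2] + "$"
        return re.match(pattern, path) is not None

    # The proposed rule blocks paginated author archives...
    assert googlebot_match("/author/*/page/", "/author/jbentz/page/4/")
    # ...but leaves the primary author page crawlable.
    assert not googlebot_match("/author/*/page/", "/author/jbentz/")
    ```

    Always confirm the real rule with the robots.txt tester in Webmaster Tools before deploying.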

  • Hello guys, I decided to take about 20% of my existing landing pages offline (about 50 of 250, which were launched about 8 months ago). The reasons: these pages sent no organic traffic at all in those 8 months, and often really similar landing pages exist (just minor keyword-targeting differences and what I would call "thin" content). Moreover, I had some Panda issues in October: basically I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50. I also noticed that for some keywords, one landing page dropped out of the top 50 while another climbed from 50 to the top 10 in the same week; the next week the new landing page dropped to 30, the week after it fell out of the top 50, and the old landing page came back to the top 20, but not to the top ten. This all happened in October. Did anyone else observe such things? Those are the reasons why I came to the conclusion to take these pages offline and integrate some of the good content into the other, similar pages, to target more broadly with one page instead of two. I hope to benefit from this with my remaining landing pages. I hope you all agree? Now to the real question: should I redirect all the pages I take offline? They send basically no traffic, and none of them should have external links, so I would not be giving away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take them offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects. Cheers, Heiko

    | _Heiko_
    0

  • Hello, I'm an SEO newbie, and some help from the community here would be greatly appreciated. I submitted my website's sitemap in Google Webmaster Tools and got this warning: "When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted." How do I fix this? What should I do? Many thanks in advance.

    | GoldenRanking14
    0

  • Hi there, We are planning to take the step and go from HTTP to HTTPS. The main reason for doing this is to appear trustworthy to our clients - and, of course, the rumours that it will be better for ranking (in the future). We have a large e-commerce site, part of which is already HTTPS. I've read a lot about the pros and cons, including this Moz article: http://moz.com/blog/seo-tips-https-ssl
    But I want to hear from others who have already done this. What did you encounter when changing to HTTPS? Did you have ranking drops, loss of links, etc.? I want to make a list of pros and cons and things we have to do in advance. Thanks, Leonie

    | Leonie-Kramer
    0

  • Hi, My client recently undertook a site migration. Since the new site went live, GWT has highlighted over 2,000 not-found errors. These were fixed nearly 2 weeks ago, and they're still listed in GWT. Do I have to wait for Google to re-crawl the pages before they're removed from the list, or do I need to go through the list, check them individually, and mark them as fixed? Any help would be appreciated. Thanks

    | ChannelDigital
    0

  • Hi there, When you add pages to a site, do you need to re-generate an XML sitemap and re-submit it to Google/Bing? I see the option in Google Webmaster Tools, under the "Fetch as Google" tool, to submit individual pages for indexing, which I am doing right now. Thanks,
    Sarah

    | SSFCU
    0

  • I have accidentally made my client's website into two websites. Originally the site had a WordPress blog at www.xxx.com/blog. I gave the blog a name and attached a domain to it by creating an addon domain, so the address is now www.yyy.com instead of /blog. The site is for an accommodation provider, and the blog is more travel advice and location-specific activities and places. What I would like to know: 1. Am I better off reverting back to /blog? 2. Am I better off with two distinct websites? 3. How could I have a second domain on the blog without chopping my website in two? Thanks!

    | paddyaran
    0

  • I was wondering if you can keep adding to the disavow file even after your site has recovered from a partial site penalty. As a recurring SEO procedure, we are always looking at links pointing to our website and identifying those that are clearly of no value. In order to clean these up, would it be good practice to update the disavow file with more of these domains? Or is the disavow file only used for penalty issues, to alert Google to the work you have done? (We had a penalty in the past but are fine now.) Would this method help keep high-quality links to the fore by removing low-quality links from Google's eyes? I would welcome your comments.

    | podweb
    0

  • Hi, Is anyone else having problems with Google's PageSpeed tool? I am trying to benchmark a couple of my sites but, according to Google, my sites are not loading. They will work when I run them through the test at one point, but if I try again, say 15 minutes later, they present the following error message: "An error has occurred. DNS error while resolving DOMAIN. Check the spelling of the host, and ensure that the page is accessible from the public Internet. You may refresh to try again. If the problem persists, please visit the PageSpeed Insights mailing list for support." This isn't too much of an issue for testing page speed, but I am concerned that if Google gets this error on the speed test, it will also get it when trying to crawl and index the pages. I can confirm the sites are up and running. The sites are pointed at the server via A records, which haven't been changed for many weeks, so it cannot be a DNS propagation issue. I am at a loss to explain it. Any advice would be most welcome. Thanks.

    | daedriccarl
    0

  • Hello! We have a 100+ MB PDF with multiple pages that we want Google to fully index on our server/website. First of all, is it even possible for Google to index a PDF file of this size? It's been up on our server for a few days, and my colleague did a Googlebot fetch via Webmaster Tools, but it still hasn't been indexed. My theories as to why this may not be working: A) We have no actual link(s) to the PDF anywhere on our website. B) The PDF is approx. 130 MB and very slow to load; I applied some compression, but that only got it down to 105 MB. Any tips or suggestions on getting this thing indexed in Google would be appreciated. Thanks!

    | BBEXNinja
    0

  • We had a discussion about the importance of 404 errors resulting from products which are out of stock. Of course this is not good, but how important is it: low, medium, or high?

    | Digital-DMG
    0

  • For a while I hated the look of the internal links page of the Google Webmaster Tools account for a certain site. With a total of 120K+ pages, the top internal link was the one pointing to "FAQ", with around 1M links. That was because, on every single page, both the header and the footer presented 5 links to the most popular questions. The traffic of those FAQ pages is non-existent, the anchor text is not interesting for SEO, and theoretically 1M useless internal links is detrimental to page juice flow. So I removed them, replacing the anchors with javascript to keep the functionality. I actually left only one "pure" link to the FAQ page in the footer (site-wide). And overnight, the internal links page of that GWT account went blank - no links. Now... hmm... Oops! Yes, I am getting paranoid at the idea that the sudden disappearance of 1M internal links was not appreciated by Googlebot. Has anyone had a similar experience? Could this be seen by Googlebot as over-optimizing and be penalized? Did I possibly trigger a manual review of the website by removing 1M internal links? I remember Matt Cutts saying adding or removing 1M pages would trigger a flag with the Google spam team and lead to a manual review - but 1M internal links? Any idea?

    | max.favilli
    0

  • Hi, I've just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from HTTP to HTTPS. Changed the property in Google Analytics from HTTP to HTTPS and added the HTTPS version in Webmaster Tools. So far, so good. Now the question: should I add the HTTPS version of the sitemap for the new HTTPS site in Webmaster Tools, or retain the existing HTTP one? Ideally, switching over completely by adding a new sitemap would make more sense, as the HTTP version of the sitemap would anyway now be redirected to HTTPS. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the HTTPS sitemap version for the new site, should I delete the old HTTP one, or is there no harm in retaining it?

    | ashishb01
    0
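
    For reference, a sitemap should list only the canonical URLs, so after a full migration it would contain only HTTPS entries. A minimal sketch (placeholder domain and paths):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
      </url>
      <url>
        <loc>https://example.com/about/</loc>
      </url>
    </urlset>
    ```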

  • I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages with a word count of zero in the Screaming Frog report. I need to know whether they will be considered 'thin' content by search engines, and whether I should flag them as an issue. An answer would be most appreciated.

    | MTalhaImtiaz
    0

  • Once upon a time, our site was ranking well and had all the markups showing up in the results. We then lost some of our rankings due to dropped links and not-so-well-kept maintenance. Now we are gaining back the rankings, but the markups don't show up in the organic search results. When we Google site:oursite.com, the markups show up, but not in organic search. There are no manual actions against our site. Any idea why this would happen?

    | s-s
    0

  • We have a few pages with visible .aspx extensions. I am not as concerned with them showing as I am with the following: https://www.example.com/company.aspx - goes to the Company page
    https://www.example.com/company - goes to the homepage (it should go to the Company page). My dev tells me that the only way to get these two URLs to go to the same place would be to set up individual redirects. Is he right? That seems like it could be detrimental to SEO. Is there other code to manage this? Thanks, folks.

    | MichaelEka
    0
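
    If the site runs on IIS with the URL Rewrite module available, a single generic rule can map extensionless URLs onto their .aspx counterparts instead of maintaining individual redirects. A hedged sketch in web.config (rule name and pattern are illustrative, not the poster's actual config):

    ```xml
    <system.webServer>
      <rewrite>
        <rules>
          <rule name="Extensionless to aspx" stopProcessing="true">
            <match url="^([^.]+)$" />
            <conditions>
              <!-- Only rewrite when a matching .aspx file exists -->
              <add input="{REQUEST_FILENAME}.aspx" matchType="IsFile" />
            </conditions>
            <action type="Rewrite" url="{R:1}.aspx" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>
    ```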

  • Hello, This past September I launched a newly redesigned website for a client. His old website was a static HTML site that was many years old; the new one was created using WordPress. With the new design we made sure to use all the proper SEO techniques (h1 tags, image names, quality links, page titles, etc.), and all the content is new content written for this site. I've launched new sites many times and usually start seeing keyword ranking improvements from the major search engines after a few months. With this particular website I'm seeing improvements in Yahoo and Bing but no movement in Google. I've used Google Webmaster Tools and made sure my sitemap is being submitted, etc. It all seems good, but I can't understand why Yahoo and Bing are working and there's nothing from Google. My page grades are all As and Bs, and Moz isn't showing any big issues. Maybe I need to give it more time? This client is a lawyer and has many websites out there, so maybe he's being penalized somewhere I don't know about? As I mentioned, I've been doing SEO for about 8 years and have never had this much trouble with Google. I was wondering if you could look at the site and see if there are any glaring issues I'm missing. Website: http://www.arizonamedicalmalpractice.info/ The keyword phrases we are looking at are "Phoenix Medical Malpractice Lawyer", "Phoenix Medical Malpractice Attorney", "Arizona Medical Malpractice Lawyer & Attorney", etc. I appreciate anyone who takes the time to do a quick look over. Thanks very much, Bill

    | Bill_K
    0

  • Hi, We will be migrating all our website content soon to a new CMS, and at the moment the

    | alzheimerssoc
    1

  • Hi all, The problem is with the .htaccess file. I have written 301 redirection code for an Apache server, but once I upload the .htaccess file via FTP, the website throws a 500 error. Please help, as I'm new to redirection files.

    | Bharath_ATZ
    0
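
    For reference, a 500 after uploading an .htaccess file usually points to a syntax error, a directive the server doesn't allow in that context, or a required module (e.g. mod_rewrite) not being enabled; the server's error log normally names the offending line. A minimal file that is usually safe to test with first (paths and domain are placeholders):

    ```apache
    # Simple one-to-one 301; needs only mod_alias.
    Redirect 301 /old-page/ http://www.example.com/new-page/
    ```

    If this works but the original file does not, re-add the original directives a few at a time to isolate the one causing the 500.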

  • Dear all, I have recently joined a new company, Just Go Holidays - www.justgoholidays.com. I used the Moz tools yesterday to review the site and see that I have lots of duplicate content/pages and also lots of duplicate titles, all of which I am looking to deal with. Many of the duplicate pages appear to stem from additional parameters that are used on our site to refine and/or track various marketing campaigns. I have therefore been into Google Webmaster Tools and defined each of these parameters. I have also built a new XML sitemap and submitted that too. It looks as if we have two versions of the site, one at www.justgoholidays.com and the other without the www, and there appear to be no redirects from the latter to the former. Do I need to use 301s here, or is it OK to use canonicalisation instead? Any thoughts on an action plan to address these issues in the right order and the right way would be very gratefully received, as I am feeling a little overwhelmed at the moment. (We also use a CMS that is not particularly friendly, and I think I will have to go directly to the developers to make many of the required changes, which is sure to cost - so I really don't want to get this wrong.) All the best, Matt

    | MattByrne
    0

  • I know how to do this with a WordPress.org site, but I have a client that does not want to switch, and without a plugin I am lost. Any help would be greatly appreciated. Jeremy Wood

    | SOtBOrlando
    0

  • Hi, How can I rewrite example.com/example1/example2/example3 to example.com/example3? And are there any tools or software that can generate URL rewrites (not a plugin)? Thanks!

    | bigrat95
    0
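
    For reference, a sketch of such a rule for Apache's mod_rewrite in an .htaccess file at the document root (the segment names are the placeholders from the question):

    ```apache
    RewriteEngine On
    # Redirect /example1/example2/<anything> to /<anything>.
    RewriteRule ^example1/example2/(.+)$ /$1 [R=301,L]
    ```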

  • For a site we manage, Google can't seem to decide which of two pages to present for a search for "skid steer attachments." Almost weekly, it flip-flops between the home page and an interior page (a shopping-cart category page that we have not actually optimized for the phrase). The site is berlon.com. Have any of you had a similar experience and, if so, how did you address it? I've attached a Moz screenshot that shows the changes.

    | PKI_Niles
    0

  • We have category pages, and some of those pages are paginated because we have additional items. Screaming Frog could not find the items that appear after page 1. Is this a problem for Google? These item pages are still in the sitemap, so I am sure Google can find and index them, but does it hurt rankings at all?

    | EcommerceSite
    0
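
    For reference, paginated category pages are commonly linked together with rel attributes in the head, which helps crawlers discover pages beyond page 1. A sketch (URLs here are hypothetical):

    ```html
    <!-- In the <head> of page 2 of a paginated category -->
    <link rel="prev" href="http://example.com/category/widgets/?page=1" />
    <link rel="next" href="http://example.com/category/widgets/?page=3" />
    ```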
