
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • I have two domains for the same company: example.com and example.sg. Sometimes we have to post the same content or event on both websites, so to protect my sites from a duplicate content penalty I use the canonical tag to point to either .com or .sg, depending on the page. Any idea if this is the right decision? Thanks

    | MohammadSabbagh
    0
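For reference, the cross-domain canonical described above is just a link element in the head of the duplicate page pointing at the preferred version (the URLs here are illustrative):

```html
<!-- On the duplicate copy, e.g. https://example.sg/events/launch-party -->
<head>
  <link rel="canonical" href="https://example.com/events/launch-party" />
</head>
```

Note that Google treats a cross-domain canonical as a strong hint rather than a directive, so both pages should otherwise be near-identical for it to be honoured.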

  • Has anyone out there begun receiving this, and/or does anyone know when it started? Google has recently begun sending a new manual action spam notification to webmasters for “spammy structured markup,” also known as rich snippet spam. Your pal, Chenzo

    | Chenzo
    0

  • I'm considering implementing the Google Site Search bar on my site.
    I'll probably choose the version without the ads (I'll pay for it). Does anyone use Google Site Search and can tell me if it's a good thing? Does it affect SEO in any way? Thank you

    | JonsonSwartz
    0

  • Hi, I am working on getting my robots.txt up and running and I'm having lots of problems with the file my developers generated: www.plasticplace.com/robots.txt. I ran it through a syntax-checking tool (http://www.sxw.org.uk/computing/robots/check.html), and this is what the tool came back with: http://www.dcs.ed.ac.uk/cgi/sxw/parserobots.pl?site=plasticplace.com. There seem to be many errors in the file. Additionally, I looked at our robots.txt in WMT and it said the crawl was postponed because the robots.txt was inaccessible. What does that mean? A few questions:
    1. Is there a need for all the lines that have a “#” before them? I don't think they're necessary, but correct me if I'm wrong.
    2. Why are we blocking so many things on our website? The robots can't get past anything that requires a password anyhow, but again, correct me if I'm wrong.
    3. Is there a reason it can't just look like this: User-agent: * Disallow: /onepagecheckout/ Disallow: /checkout/cart/
    I understand that Magento has certain folders you don't want crawled, but is all of this necessary, and why are there so many errors?

    | EcomLkwd
    0
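For reference, the stripped-down robots.txt the poster proposes would look like this; lines starting with "#" are comments and are simply ignored by crawlers, so they are harmless but optional:

```text
# Comments like this line are ignored by crawlers
User-agent: *
Disallow: /onepagecheckout/
Disallow: /checkout/cart/
```

Note that robots.txt must be reachable at the domain root with a 200 status; a 5xx or timeout is what produces the "crawl postponed because robots.txt is inaccessible" message in Webmaster Tools.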

  • I am currently doing a site audit on a large website that just went through a redesign. Looking through their Webmaster Tools, they have about 3,000 duplicate title tags. This is due to the way pagination is set up on their site, for example: domain.com/books-in-english?page=1 // domain.com/books-in-english?page=4. What is the best way to handle these? According to Google Webmaster Tools, a viable solution is to do nothing, because Google is good at distinguishing these. That said, it seems like there could be a better solution to help prevent duplicate content issues. Any advice would be much welcomed. 🙂

    | J-Banz
    0
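At the time of writing, Google's recommended markup for a paginated series like this was rel="next"/rel="prev" annotations in the head, which let Google consolidate the series rather than treat each page as a duplicate (URLs follow the poster's example):

```html
<!-- In the head of domain.com/books-in-english?page=2 -->
<link rel="prev" href="https://domain.com/books-in-english?page=1" />
<link rel="next" href="https://domain.com/books-in-english?page=3" />
```

The first page of the series carries only rel="next" and the last only rel="prev".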

  • I instinctively feel like server-rendered websites should rank higher, since Google doesn't truly know that the content it's getting from an AJAX site is what the user is seeing, and Google isn't exactly sure of the page load time (and thus user experience). I can't find any evidence that would prove this, however. A website like Monocle.io uses pushState, loads fast, has good page titles, etc., but it is a JavaScript single-page application. Does it make any difference?

    | jeffwhelpley
    0

  • When we run an Open Site Explorer analysis on our own site, it says that for all our internal links the link anchor text is 'Help with logging in'. I am a bit confused as to why it shows that. That text does appear in the header of the page, but is not the first piece of text. Why is it happening on our site?
    Why do I not see this on other sites?
    What effect does this have on our ranking?
    What's the best fix? Example page that we ran on Open Site Explorer: www.rightboat.com/search?manufacturer=Beneteau&model=Antares+9.80

    | MattAshby
    0

  • Does anybody know how to find bad links and remove them easily?

    | tanveer1
    0

  • I'm working on a site that has four primary navigation links, and under each is a tabbed navigation system for second-tier items. The primary link page loads content for all tabs, which are JavaScript-controlled. Users will click the primary navigation item "Our Difference" (http://www.holidaytreefarm.com/content.cfm/Our-Difference) and have several options, with each tab's content in separate sections. Each second-tier tab is also available via sitemap/direct link (e.g. http://www.holidaytreefarm.com/content.cfm/Our-Difference/Tree-Logistics) without the JS navigation, so the content on this page is specific to the tab, not all tabs. In this scenario, will there be duplicate content issues? And what is the best way to remedy this? Thanks for your help!

    | Total-Design-Shop
    0

  • Google responded to my reconsideration request. They gave me only 2 examples of the many unnatural links, which are actually people mentioning our website in conversation. How can that be unnatural, when legitimate people are discussing our website's services? And even if it is unnatural, how can I possibly remove a backlink from a forum post?

    | Droidman86
    0

  • I’ve included the scenario and two proposed fixes I’m considering. I’d appreciate any feedback on which fix people feel is better and why, and/or any potential issues that could be caused by these fixes. Thank you! Scenario: I’m working on an ecommerce website (built on Magento) that is having a problem getting product pages indexed by Google (and other search engines). Certain pages, like the one below, aren’t being indexed. I believe this is because of the way the site is configured in terms of internal linking. The site structure forces certain pages to be linked very deeply, so the only way for Googlebot to get to them is through a pagination page (such as www.acme.com/page?p=3). In addition, the link on the pagination page is really deep; generally there are more than 125 links on the page ahead of it. One of the pages Google isn’t indexing: http://www.getpaper.com/find-paper/engineering-paper/bond-20-lb/430-20-lb-laser-bond-22-x-650-1-roll.html. This page is linked from http://www.getpaper.com/find-paper/engineering-paper/bond-20-lb?p=5, and it is the 147th link in the source code. Fix One: Add HTML5 navigation tags to the template so that search engines will spend less time crawling the navigation and will get to the deeper pages, such as the one mentioned above. Note: the navigation tags are HTML5; however, the Magento site in which this is built does not use HTML5. Fix Two: Revise the templates and CSS so that the main navigation and the sidebar navigation are at the bottom of the page rather than the top. This would put the links to the product pages ahead of the navigation links in the source code.

    | TopFloor
    0

  • Hello, We have a WordPress blog that has around 250 categories. Due to our platform we have a hierarchical structure for 3 separate stores, for example iPhone > Apps > Books. Placing a blog post in the Books category automatically places it in the iPhone and iPhone/Apps categories as well, causing 3 instances of any blog post in this category. Is this an issue? I have seen 2 schools of thought on categories: (1) index, follow and (2) noindex, follow. I know some of our categories get indexed, but with so many, maybe it is better to noindex them. We also considered reducing our categories to 10–12 and using tags to provide the indexed site navigation, as follows: Reviews (category); iPhone Book App, iPhone App Store (tags). But this seems a little redundant? Anyone want to take this on? Thank you, Mike

    | crazymikesapps1
    0
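The "noindex, follow" option mentioned above is applied with a robots meta tag in the head of each category archive (most WordPress SEO plugins can add this per taxonomy); the example path is illustrative:

```html
<!-- In the head of a category archive such as /category/iphone/apps/books/ -->
<meta name="robots" content="noindex, follow" />
```

The "follow" part keeps the archive passing link equity to the posts it lists even though the archive itself stays out of the index.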

  • Hi there, My client has a website (www.activeadventures.com) which they relaunched in April 2013. The company sells inbound tourism trips to New Zealand, South America and the Himalayas. Previously, the websites for these destinations were on their own domains (activenewzealand.com, activehimalayas.com, activesouthamerica.com). With the launch of the new website those domains were all retired (with 301 redirects put in place to the new site) and moved into subdirectories of the activeadventures.com domain (e.g. activeadventures.com/new-zealand). There has been no indication that this strategy has improved organic search results (based on analytics), and in my opinion this structure has been detrimental to their results. My opinion is based on the following: visitors are coming into the site with a specific destination in mind, so having the destination in the URL, I believe, provides more immediate relevancy and should result in a higher CTR. I also feel that having the sites on their own domains will provide a more concentrated theme for the destination-based search phrases. The new site is a custom Joomla build, and I want to find the easiest way to keep the current Joomla setup AND move the country-specific sections of the site back onto their original domains. On the face of it, the easiest way to get this done is to use the htaccess file and "RewriteRule" to push all the relevant pages back onto their original domains. Obviously we will also ensure the existing 301s from the new site and the old sites point to this new structure. My question is: are there any potential negative SEO implications of using RewriteRule in the htaccess file to achieve this? Many thanks in advance. Kind regards,
    Conrad Cranfield

    | activenz
    0
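As a sketch of the htaccess approach being considered (paths are illustrative, and mod_rewrite is assumed to be enabled), sending a country subdirectory back to its own domain could look like:

```apache
RewriteEngine On
# Send /new-zealand/* on activeadventures.com back to the original country domain
RewriteCond %{HTTP_HOST} ^(www\.)?activeadventures\.com$ [NC]
RewriteRule ^new-zealand/(.*)$ https://www.activenewzealand.com/$1 [R=301,L]
```

One thing to watch: the existing old-domain→new-site 301s must be updated to point directly at final URLs, otherwise activenewzealand.com redirecting to activeadventures.com/new-zealand, which redirects back again, creates a loop.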

  • Hello Mozzers! I have been tasked with recovering a site, www.active8canada.com, from a partial link penalty that was previously brought to my attention. Upon reviewing the site's backlinks and the reporting info in Google Webmaster Tools, I found there was no penalty showing. Could it have expired? We spent the last few months doing link cleanup, as we recognized that there were some bad links that needed to be addressed. After categorizing all of them, we requested removal of all the bad links, targeting commercial anchor text and bringing those numbers back to acceptable levels. Following this, we disavowed the bad links which could not be removed through requests. We are actively building out additional content for the website, as we recognize that some pages have thin content. We have also earned some links to show positive signals during the cleanup, but have seen no change for better or worse. My question is: does anyone else see anything we could be missing here? Should I revisit links again? Some of the links we disavowed are still showing in our backlink reports, but I cross-referenced our disavows with the existing backlink profile to try to get an accurate sense of the remaining links. We never saw a further decline in ranks after the disavow, so I'm led to believe that the links we removed had little, if any, impact. I am a little hesitant to begin earning new links through content and partnership outreach, as I still feel something is off that I can't quite put my finger on. It was previously confirmed that there was a penalty, but without it showing now in Google Webmaster Tools I'm grasping at any possible angle I may have missed. If anyone has a couple of minutes to spare to shed some light on this situation, it would be greatly appreciated!

    | toddmumford
    0

  • I am using a cross-domain rel=canonical to a page that is very similar to mine. I feel my page adds value to my site, so I want users to go to it, but I ultimately want them on the page I'm canonicalizing to, so I am linking to that page as well. Does anyone foresee any issues with doing this? And/or have other suggestions? Thanks.

    | ThridHour
    0

  • I'm feeling strong! OK, so can Google penalise a website which has "duplicated images" coming from a completely independent website?

    | GaryVictory
    0

  • Hi guys, I am checking my website for possible technical issues and was wondering if there is a tool or other way to see which of my pages, if any, employ the noindex tag in the head. Do you happen to know? Thanks, Lily

    | wspwsp
    0

  • Hi, We're currently moving a group of websites (approximately 12) under one domain, so we've moved from www.example.de, www.example.co.uk and www.example.com to www.example.com/de, www.example.com/uk and so on. However, I read an article online today saying that this can lead to crawling complications. Has anyone done something similar, and if there were any issues, how did you overcome them? Many thanks

    | Creditsafe
    0

  • Hi everyone, I'm working on a successful WordPress site that also has a forum attached. The forum currently uses YAF forum software, which requires Windows hosting. The site owner wants to switch to Linux hosting. This is not a problem for WP, but it does mean that we'll need to transfer the forum to XenForo or something similar that runs on Linux. We're OK with the technical side of this, but we're worried about the SEO implications. The URL for every forum post (more than 50,000 of them) is going to change during this transfer. It seems completely impractical to 301 each of those individually, so should I just 301 the URLs that have inbound links? Also, what is Google's algorithm going to think when we suddenly have ~50,000 404s? Many thanks in advance! J

    | van28
    0
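If both forum platforms encode the thread ID in the URL, a single pattern-based rule can often cover all 50,000 posts without individual 301s. The URL formats below are purely hypothetical; they would need adjusting to the real YAF and XenForo permalink structures:

```apache
RewriteEngine On
# Map an old thread URL like /yaf_postst123_some-topic.aspx to /threads/123/
# by capturing the numeric thread ID and discarding the rest
RewriteRule ^yaf_postst(\d+).*$ /threads/$1/ [R=301,L]
```

This only works if the new platform can be configured (or an import script arranged) to preserve the old numeric IDs.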

  • Hello, Our main site ranks well for all our keyword terms, and yet our mobile site is buried. It is an "m." configuration, and I am wondering if it is a question of not using the correct programming language, or if the redirects to the main site should be set up differently. I have tried to read up on the topic of mobile site SEO and cannot find (or understand) the answer. Could someone please help? Thanks so much in advance!

    | lfrazer
    0
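Google's documented setup for separate "m." URLs is a pair of annotations so the two versions are treated as one page rather than competing with each other (the URLs here are placeholders):

```html
<!-- On the desktop page, www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page" />

<!-- On the mobile page, m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page" />
```

Without the rel="canonical" back to the desktop URL, the m. pages tend to be treated as duplicates and suppressed, which matches the symptom described.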

  • I just got a Moz crawl back and see lots of errors for overly dynamic URLs. The site is a villa rental site that gives users the ability to search by bedroom, amenities, price, etc., so I'm wondering about the best way to keep these types of dynamically generated pages, with URLs like /property-search-page/?location=any&status=any&type=any&bedrooms=9&bathrooms=any&min-price=any&max-price=any, from being indexed. Any assistance will be greatly appreciated : )

    | wcbuckner
    0
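One common approach for faceted search URLs like the one above is to block the parameterized paths in robots.txt (path taken from the question; note this prevents crawling, not necessarily the indexing of URLs Google already knows about; a robots noindex meta tag on the search template is the alternative when URLs are already indexed):

```text
User-agent: *
# Prefix match: blocks /property-search-page/?location=... and all other parameter combinations
Disallow: /property-search-page/?
```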

  • We are currently having an issue with our homepage not being indexed by any search engines. We recently transferred our domain to GoDaddy and there was an issue with the DNS. When we typed our URL into Google as "https://www.mysite.com", nothing from the site came up in the search results, only our social media profiles. When we typed our URL into Google as "mysite.com", we were sent to a GoDaddy parked page. We've been able to fix the issue over at GoDaddy, and the URL "mysite.com" is now being redirected to "https://mysite.com", but Google and the other search engines have yet to respond. I would say our fix has been in place for at least 72 hours. Do I need to give this more time? I would think that at least one search engine would have picked up on the change by now and would start indexing the site properly.

    | bcglf
    1

  • Hi there, I manage a website which uses a lot of virtual includes (SSI). Because they caused a lot of duplicate content, I introduced the canonical URL tag. But I still see bad rankings on some pages that are the lead pages of the virtual includes. Now I'm wondering: is it better to remove all the virtual include pages (URLs) and 301 redirect them instead? Does anybody know which is better for ranking the head page?

    | JoostBruining
    0

  • Hi guys, I am hoping that someone can provide me with some suggestions as to how I can get my site www.purelycaviar.co.uk ranking on both Google and Bing for 'caviar', 'caviar online', 'buy caviar online' and associated phrases that you'd expect people to use when looking to purchase caviar online (in the UK). As it currently stands, the site ranks extremely poorly for most associated keywords. The thing is, I don't really have any budget and am feeling a bit overwhelmed with where to start. I am willing to put the time in to understand all this stuff, but would love someone to give me a plan of action. It's also worth mentioning that I'm on Shopify, so I'm limited, with no server-side access. A list of 'to dos' would be great, with an additional list of things which need constant input, such as link building and blog posts... Any help massively appreciated!

    | EdwardoUK
    1

  • Hi, We did not build this website, http://climateacs.co.uk, but have now picked it up, and it's getting no traffic whatsoever from Google. Do you think it's been blacklisted? I have added it to my Webmaster Tools; there are no manual actions on it, and most of the backlinks in Google Webmaster Tools are from yell.com. However, when I run it through Open Site Explorer I am seeing some Chinese-looking links?? It is not really showing many search queries at all when you view them in Webmaster Tools under United Kingdom. I was going to start citation building for the address to help support the Google Places entry, but just wanted to see what other people's opinions were on this site. Thanks, Tracy

    | dashesndots
    0

  • I have lost all internal PR on my site when checking with the browser toolbar; only my homepage has PR now, the rest are unranked. It has been like this for a few months. Can anyone confirm this for me please? The site is yourcityoffice.com. If you have any other interesting ways to check PR that do not involve a toolbar or toolbar equivalent, please let me know.

    | gazzerman1
    0

  • A client has an e-commerce site and she doesn't want a page title on the products page. She has breadcrumbs, though. Her website developer suggests putting the H1 on the breadcrumbs, so: Products > Gifts > Picture Frame, with H1 tags around the words "picture frame". Is this OK to do? Or is it a bad thing for SEO purposes? Thanks

    | AL123al
    0

  • www.heatwavemedia.com. Search terms: san francisco video production, san francisco video production services. In Google search the site appears as "Heartwave Media: San Francisco Video Production Services", but I'm sure I've never used a colon in any of my iterations of the title... Instead it should read "San Francisco Video Production Services | Heartwave Media" with the description "A San Francisco video production company specializing in creative corporate and commercial video services." But that's not coming up either... I'm on WordPress and am using All in One SEO, if that helps... Thoughts?

    | keeot
    0

  • Hello all, we have a client which is a sports website. In fact it is a very big website with a huge number of news items per day. This is mostly the reason why it refreshes some of its pages, with news lists, every 420 seconds. We currently use a meta refresh. I have read here and elsewhere that meta refreshes should be avoided, but we don't do it to send users to another page or pass any kind of page authority / juice. Is a JavaScript refresh better in this case? Is there any other better way? What do you think and suggest? Thank you!

    | pkontopoulos
    0
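For what it's worth, a JavaScript equivalent of the 420-second meta refresh is just a timer (a sketch; crawlers generally don't sit on a page long enough to trigger it, which is largely the point of the swap):

```html
<script>
  // Reload the news-list page every 420 seconds (420,000 ms)
  setTimeout(function () {
    window.location.reload();
  }, 420000);
</script>
```

An alternative worth weighing is fetching just the news list via AJAX on the timer instead of reloading the whole page, which avoids the refresh question entirely.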

  • Should my H1 header contain the same keywords in the same order, verbatim as my SEO title or some variation of them?  Or does it matter?

    | keeot
    0

  • I have 11 real estate sites and have had links from one to another for about 7 years, but someone just suggested I take them all out because I might get penalized or affected by Penguin. My main site was affected in July 2012 and organic visits have dropped 43%. I've been working on many aspects of my SEO, but it's been difficult to come back. Any suggestions are very welcome, thanks 🙂

    | mbulox
    0

  • Hi guys, some questions about host switching. My blog is currently hosted on SiteGround's GrowBig plan. I was thinking about upgrading to the GoGeek plan (their pro shared hosting). Do you think that better technology and server response speed can improve my ranking? Another question: if I switch servers, the IP will change. Does that affect my SEO negatively? And after the switch and IP change, is it too much if I change IP again by buying a dedicated IP? Will I risk hurting my ranking? Sorry for my foreign English 🙂 Thanks so much!

    | Italianseolover
    0

  • Dear all, My customer has been merged into a larger organization. They 301 redirected the old domain, which had a PageRank of 6, to a new domain which has a PageRank of 0. Is there any way I could recover the PageRank and pass it to the new domain? If yes, what is your opinion? Thank you, M.

    | SebestynMrton
    0

  • I have a very large ecommerce site I'm trying to spider using Screaming Frog. The problem is it keeps hanging, even though I have turned off the high memory safeguard under Configuration. The site has approximately 190,000 pages according to the results of a Google site: command. The site architecture is almost completely flat. Limiting the crawl by depth is a possibility, but it will take quite a bit of manual labor, as there are literally hundreds of directories one level below the root. There are many, many duplicate pages. I've been able to exclude some of them from being crawled using the exclude configuration parameters. There are thousands of redirects. I haven't been able to exclude those from the spider because they don't have a distinguishing character string in their URLs. Does anyone know how to exclude files using status codes? I know that would help. If it helps, the site is kodylighting.com. Thanks in advance for any guidance you can provide.

    | DonnaDuncan
    0

  • We recently had mod_deflate and mod_expires installed on our server in an attempt to improve page speed. They worked beautifully, or at least we thought they did: Google's PageSpeed Insights tool scored our homepage at 65 before the install and 90 after. A major improvement. However, we seem to be experiencing very slow loads on our product pages. There is a feeling (not based on any quantifiable data) that mod_expires is actually slowing down our page load, particularly for visitors who do not have the page cached (which would probably be most visitors). Here are some pages to look at with their corresponding PageSpeed Insights scores:
    Live Sound - 91: http://www.ccisolutions.com/StoreFront/category/live-sound-live-audio
    Wireless Microphones - 90: http://www.ccisolutions.com/StoreFront/category/microphones
    Truss and Rigging - 79: http://www.ccisolutions.com/StoreFront/category/lighting-truss
    Lightweight product detail page - 83: http://www.ccisolutions.com/StoreFront/product/global-truss-sq-4109-12-truss-segment
    Heavyweight product detail page - 77: http://www.ccisolutions.com/StoreFront/product/presonus-studiolive-16-4-2
    Any thoughts from my fellow Mozzers would be greatly appreciated!

    | danatanseo
    1

  • Hi everyone, I need to update the structure of my site www.chedonna.it. Basically I have two main problems: 1. I have 61,000 indexed tags (many with no posts). 2. The categories of my site are noindexed. I thought of fixing this by making the categories index and the tags noindex, but I'm not sure if this is the best solution, because I have a great number of tags that have been indexed by Google for a long time. Maybe it is correct just to make the categories index, link them from the posts, and leave the tags index. Could you please let me know your opinion? Regards.

    | salvyy
    0

  • When someone links to your website but makes a typo while doing it, those broken inbound links will show up in Google Webmaster Tools in the Crawl Errors section as “Not Found”. Often they are easy to salvage by just adding a 301 redirect in the htaccess file. But sometimes the typo is really weird, or the link source looks a little scary, and that's what I need your help with. First, the weird typo problem. If it is something easy, like they just lost the last part of the URL (such as www.mydomain.com/pagenam), then I fix it in htaccess this way (with the dots escaped so they match literally): RewriteCond %{HTTP_HOST} ^mydomain\.com$ [OR] RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ RewriteRule ^pagenam$ "http://www.mydomain.com/pagename.html" [R=301,L] But what about when the last part of the URL is really screwed up, especially with non-text characters, like these: www.mydomain.com/pagename1.htmlsale www.mydomain.com/pagename2.htmlhttp:// www.mydomain.com/pagename3.html" www.mydomain.com/pagename4.html/ How is the RewriteRule written to send these oddballs to the individual pages they were supposed to go to, without the typo? Second, is there a quick and easy method or tool to tell us if a linking domain is good or spammy? I have incoming broken links from sites like these: www.webutation.net titlesaurus.com www.webstatsdomain.com www.ericksontribune.com www.addondashboard.com search.wiki.gov.cn www.mixeet.com dinasdesignsgraphics.com Your help is greatly appreciated. Thanks! Greg

    | GregB123
    0
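For the trailing-junk typos listed above, one rule per affected page that captures the correct filename and discards whatever follows it is usually enough (a sketch using the question's placeholder names; repeat per page or generalize the pattern):

```apache
RewriteEngine On
# Catch pagename1.html followed by any stray characters
# (covers .htmlsale, .htmlhttp://, .html", .html/ and similar typos)
RewriteRule ^(pagename1\.html).+$ http://www.mydomain.com/$1 [R=301,L]
```

The parentheses capture the clean filename as $1, and the trailing `.+` soaks up the typo, so the rule only fires when there actually is junk after `.html`.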

  • Hi all, I would like to ask: what's the best way to eliminate the /category/ slug from WordPress category page URLs? I need to delete the /category/ slug to make the URLs SEO-friendly. The problem is that mine is an old site, with its pages indexed by Google for a long time. Thanks for your advice.

    | salvyy
    0
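A common pattern for this (and what plugins such as "No Category Base" effectively do) is a 301 from the old /category/ URLs to the stripped versions, assuming the site has already been configured to serve the stripped permalinks (a sketch with a generic path; the rule must come before WordPress's own rewrite block):

```apache
RewriteEngine On
# 301 the old category URLs, e.g. /category/news/ -> /news/
RewriteRule ^category/(.+)$ /$1 [R=301,L]
```

The 301s matter precisely because the old URLs have been indexed for a long time; they carry the existing rankings over to the new paths.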

  • Hello, can anybody help me? In my Webmaster Tools, under "Links to your site" and "Who links the most", it shows Google as the top inbound link source, with 3,669 out of 4,14, all pointing to one subpage of content. Yet when I click on the source, it only shows me one Google Plus page. My traffic fell by 60% when I first noticed these links. We have not been building unnatural links and we have not had any manual link building warning. This is the site: iunlock.org. Two questions: 1. I have deleted the one linked-to page; what else can I do to reverse this impact? 2. Does anybody know what has happened? See attached WWwnb1W.png

    | Jack4ireland
    0

  • Hey all, I wanted to check whether links that have built up naturally over the past years, linking from a sitewide badge (image), can harm the linked website? Here is some more information: 1. It's from a competition; the winners were able to add the badge, with the link, to their sites (the link to our website was to a subpage, not the homepage). 2. There are around 15 websites with the badge as a link. The website has around 200 root domain links. There will not be any more websites with the badge, just these 15. 3. The sitewide links percentage is 5% of the overall number of pages linking to our website. Based on the last Penguin update (4th of October, 2013), can our website be harmed by the badge link building?

    | stevanl
    0

  • Hi, I'm working with a site that has a lot of duplicate content, and I have recommended the developer fix it via correct use of canonicalisation, i.e. the canonical tag. However, a US version (of this UK site) is about to be developed in a subfolder (domain.com/uk/ and domain.com/us/ etc.), so I'm also looking into adopting the hreflang attribute on these. Reading up about the hreflang attribute, I see that it performs a degree of canonicalisation too. Does that mean that developing the international versions with hreflang means there's no need to apply canonical tags to deal with the dupe content, since hreflang will deal with the original dupe content problems as well as the new country-related dupe content? I also understand that hreflang and canonicalisation can conflict/clash on different language versions of international subfolders, as per: http://www.youtube.com/watch?v=Igbrm1z_7Hk. In this instance we are only looking at US/UK versions, but we will very likely want to expand into non-English countries too in the future, like France for example. So, given both the above points, if you are using hreflang is it advisable (or even best) to totally avoid the canonical tag? I would be surprised if the answer is yes, since, while it makes logical sense given the above (if the above statements are correct), that seems strange given how important and standard best-practice canonical usage seems to be these days. What's best? Use hreflang alone, the canonical tag alone, or both? What does everyone else do in a similar situation? All the best, Dan

    | Dan-Lawrence
    0
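A point of clarification on the question above: hreflang and canonical are not substitutes. hreflang relates alternate language/region versions, while canonical consolidates duplicates. Google's guidance is to pair the hreflang annotations with a self-referencing canonical on each version, never a canonical pointing at a different-country version (URLs here follow the question's hypothetical structure):

```html
<!-- In the head of https://domain.com/uk/page -->
<link rel="canonical" href="https://domain.com/uk/page" />
<link rel="alternate" hreflang="en-GB" href="https://domain.com/uk/page" />
<link rel="alternate" hreflang="en-US" href="https://domain.com/us/page" />
```

The conflict mentioned in the linked video arises precisely when the canonical points at one version while hreflang says the current URL should serve its own locale; self-referencing canonicals avoid that clash.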

  • Google indexed some subdomains (13!) that were never supposed to exist, but apparently they returned a 200 code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS. I've been told that these subdomains will be deindexed just from this "server not found" error. I was going to use Webmaster Tools and verify each subdomain, but I'm on an economy GoDaddy server and apparently subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. This being the case, I'm not even sure I can get WMT to recognize and remove these subdomains like that. Should I fret about this, or will the "server not found" message get Google to remove them soon enough?

    | erin_soc
    0

  • Hi, I've built an e-commerce site (drsebagh.it) for the Italian division of the brand Dr Sebagh. If I search the brand query on google.it (https://www.google.it/search?q=dr+sebagh&oq=dr+sebagh&aqs=chrome.0.69i59l3j0l3.1352j0j4&sourceid=chrome&espv=210&es_sm=91&ie=UTF-8), the site is around the 3rd SERP page. I can't find where the problems are: no duplicate content (as my client says, and Copyscape Free seems to confirm), and Webmaster Tools doesn't flag any errors either... Can someone help me do a quick check?

    | YouON
    0

  • How do I fix duplicate content errors on categories and tags? I am trying to get rid of all the duplicate content and I'm really not sure how to. Any suggestions, advice and/or help on this would be greatly appreciated. I did add the canonical URL through the Yoast SEO plugin, but I am still seeing errors. I did this on over 200 pages. Thanks for any assistance in advance. Jaime

    | slapshotstudio
    0

  • My Google author picture, which had been in place for a couple of years, recently disappeared from all SERP results. I checked, and rel=author attribution is valid on every post, as is the link to the Google+ authorship page (which contains a link back to the website). When I test URLs in the structured data testing tool, the picture appears. I'm out of troubleshooting ideas. Any suggestions welcome.

    | waynekolenchuk
    0

  • We currently don't have an XML sitemap for our site. I generated one using Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in GWT. Cleaning it will require time and effort which I currently don't have, but I also think that having a sitemap could help us on Bing. So my question is: is it better to submit a "so-so" sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong? Thanks!

    | jfmonfette
    0

  • Hi guys, Just keen to gauge your opinion on a quandary that has been bugging me for a while now. I work on an ecommerce website that sells around 20,000 products. A lot of the product SKUs are exactly the same in terms of how they work and what they offer the customer; often only one variable changes. For example, a product may be available in 200 different sizes and 2 colours (therefore 400 SKUs available to purchase). These SKUs have been uploaded to the website as individual entries so that the customer can purchase them, with the only differences between the listings likely to be key signifiers such as colour, size, price, part number, etc. Moz has flagged these pages as duplicate content. Now, I have worked on websites long enough to know that duplicate content is never good from an SEO perspective, but I am struggling to work out an effective way to display such a large number of almost identical products without falling foul of the duplicate content issue. If you wouldn't mind sharing any ideas or approaches that you guys have taken, that would be great!

    | DHS_SH
    0

  • Hi all, and happy new year! I'm new to SEO and could do with a spot of advice. I have a site that has several domains that mirror it (not good, I know...), so www.site.com, www.site.edu.sg and www.othersite.com all serve up the same content. I was planning to use rel="canonical" to avoid the duplication, but I have a concern: currently several of these mirrors rank. One, the .com, ranks #1 on local Google search for some useful keywords; the .edu.sg also shows up at #9 for a different page. In some cases I have multiple mirrors showing up on a specific SERP. I would LIKE to rel=canonical everything to the local .edu.sg domain, since this best reflects the fact that the site is for a school in Singapore, but...
    -The .com is listed in DMOZ (this used to be important) and none of the volunteers there ever responded to requests to update it to the .edu.sg.
    -The .com ranks higher than the .edu.sg page for non-local search, so I am guessing Google has some kind of algorithm to mark down obviously local domains in other geographic locations. Any opinions on this? Should I rel=canonical the .com to the .edu.sg, or vice versa? I appreciate any advice or opinion before I pull the trigger and end up shooting myself in the foot! Best regards from Singapore!

    | AlexSG
    0

  • I am facing a lot of problems with all my sites and I am worried about my Google SERP results. I was penalized by Google in a previous year, and again today. I have a website, removalinmelbourne.com.au. I was happy with my SEO because it was coming up on the first and second pages for most of my keywords, like removalists melbourne, removals melbourne, movers in melbourne and removalists in melbourne, but today I was shocked to find it is not showing anywhere on Google. Please, someone help me. How can I get back?

    | Tufail
    0

  • With the increasing use of a single codebase to serve both mobile and desktop sites, and the use of off-canvas menus which are loaded but only displayed on mobile phones, does anyone have any view on whether this will impact Google's understanding of the page? The off-canvas menu is the same as the top menu but loads in its own divs at the bottom of the source. Any thoughts, or is anybody testing different methods?

    | PottyScotty
    0
