
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • My website has 0.5 million pages with URLs like this: http://www.mycity4kids.com/Delhi-NCR/collage-painting-classes-%3cnear%3e-shalimar-bagh, and none of these URLs are indexed. Question 1: What could be the possible reason for this issue? Users see this URL as: http://www.mycity4kids.com/Delhi-NCR/collage-painting-classes-<near>-shalimar-bagh
    The symbols "<" and ">" get converted into "%3c" and "%3e" respectively; is this the reason these URLs are not getting indexed?

    | prsntsnh
    0

  • The way our site was built, engineers put the title tag below the meta description and meta keywords. I asked to have it changed, based on the best practice of putting the most important content first, but apparently doing this would cause a major ripple effect in the way the site was engineered. Will we lose out on full SEO benefit with this structure? Should I stand down?

    | Vacatia_SEO
    0

  • That is a question I am sure many of you have been asking since they launched the product several weeks ago. Cemper claims they helped get a penalty removed in 3 days by using this product. Sounds great, doesn't it? Maybe even sounds too good to be true. Well, here is my experience with it. We have been working to get a site's rankings back up for several months now. While it has no penalty, it clearly got hit by the algo change. So we have been very busy creating new content and attempting to remove as many "keyword rich" links as possible. This really hasn't been working very well at all, so when I heard about Link Detox Boost I thought it was the answer to our prayers. The basic idea is that Link Detox Boost forces Google to crawl your bad links so it knows you no longer have links from those sites or have disavowed them. So we ran it, and it was NOT cheap: roughly $300. Now, 3 weeks after running it, the report only shows it has actually crawled 25% of our links, but they assure us it is a reporting issue and the full process has run its course. The results? No change at all. Some of our rankings are worse, some are better, but nothing worth mentioning. Many products from Link Research Tools are very good, but I'm afraid this isn't one of them. Anyone else use this product? What were your results?

    | netviper
    2

  • I have an existing campervan hire website which is being redesigned, rebranded and renamed (including the domain). The website allows businesses and owners to list campervans for rent to customers. There are a huge number of campervan hire companies, and so not many relevant domain names. Also, many are for sale as expensive premium domains. We want something different that will stand out. We're thinking of https://camper7.com as it's short and we think it would work. Is it best to avoid using a single digit in a domain if it isn't a 2 or 4, or does it not matter if we think it could work as it would stand out? Any help or advice would be appreciated. Thanks, James

    | Curran
    1

  • Hey there! Recent announcements from Google encourage webmasters to let Google crawl JavaScript: http://www.googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html http://googlewebmastercentral.blogspot.com/2014/05/rendering-pages-with-fetch-as-google.html We have always put JS and CSS behind robots.txt, but are now considering taking them out of robots. Any opinions on this?

    | CleverPhD
    0

  • I have tested hundreds of pages to see if Google will properly crawl, index and cache them. Now I want these pages to be removed from Google search, except for the homepage. What should the rule in robots.txt be? I use this rule, but I am not sure if Google will remove the hundreds of pages (from my testing). User-agent: *
    Disallow: /
    Allow: /$

    | esiow2013
    0
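For readers puzzling over the rule in the question above, here is a hedged sketch of how Google interprets it (the `$` end-of-URL anchor is a Google extension, and a Disallow by itself blocks crawling rather than guaranteeing removal from the index):

```text
# "Disallow: /" blocks every path; "Allow: /$" is more specific
# for the bare root URL, so only the homepage stays crawlable.
User-agent: *
Disallow: /
Allow: /$
```

Note that pages already indexed may linger until removed via the URL removal tool or a noindex directive that Google can still crawl.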

  • What are the pros and cons of placing videos on the native website as opposed to pushing the YouTube channel? If I move the YouTube and Vimeo videos to the native website, will I lose all the link juice? Is there an easy way to have a transcript of the audio as HTML on the site?

    | bakergraphix_yahoo.com
    0

  • Hi, We launched our new site back in Sept 2013 and, to control indexation and traffic, etc., we only allowed the search engines to index single-dimension pages such as just category, brand or collection, but never both, like category + brand, brand + collection or collection + category. We are now opening indexing to double-faceted pages like category + brand, and the new tag structure would be: For any other facet we're including a "noindex, follow" meta tag. 1. My question is, if we're including a "noindex, follow" tag on select pages, do we need to include a canonical or hreflang tag after all? Should we include it either way for when we want to remove the "noindex"? 2. Is the x-default redundant? Thanks for any input. Cheers WMCA

    | WMCA
    0
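For reference, the tags debated in the question above can legally coexist in a page's head; a minimal hedged sketch with placeholder URLs (the combination of noindex and canonical sends mixed signals, so many sites drop the canonical while the noindex is in place):

```html
<!-- Hypothetical faceted page: excluded from the index, links still followed -->
<meta name="robots" content="noindex, follow">
<!-- Canonical kept ready for when the noindex is lifted; URL is a placeholder -->
<link rel="canonical" href="https://www.example.com/category/brand/">
```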

  • I am working on a travel site about a specific region, which includes information about lots of different topics, such as weddings, surfing etc. I was wondering whether it's a good idea to register domains for each topic, since it would enable me to build backlinks. I would basically keep the design more or less the same and implement a nofollow navigation bar on each microsite, e.g.
    weddingsbarcelona.com
    surfingbarcelona.com or should I rather go with one domain and subfolders: barcelona.com/weddings
    barcelona.com/surfing I guess the second option is how I would usually do it, but I just wanted to see what the pros/cons of both options are. Many thanks!

    | kinimod
    0

  • Hi mozzers, I am doing an SEO audit and one of the components of crawlability in the audit template I have is: "Disable Cookies/Make Googlebot user agent". I am not quite sure why cookies could harm your SEO. Can someone explain to me what problems can arise because of cookies? Do they prevent bots from crawling your website, like .js in your nav? Thanks!

    | Ideas-Money-Art
    0

  • We're adding schema markup for all of our videos, but some videos exist only in a playlist (all integrated into one URL, and loaded after a javascript call). Per Google: "Make sure that your video and schema.org markup are visible without executing any JavaScript or Flash." https://support.google.com/webmasters/answer/2413309?hl=en So we know the current implementation won't work for schema markup... What's the best practice for adding schema markup for video playlists? Should we host all of these videos on individual URLs (but then they appear twice) or is there some other workaround?

    | nicole.healthline
    0
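As background to the playlist question above, video schema markup on an individual watch page is usually expressed as a VideoObject; a minimal hedged sketch with placeholder values (every URL, date and title here is invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example video title (placeholder)",
  "description": "Placeholder description of the video.",
  "thumbnailUrl": "https://www.example.com/thumbs/video1.jpg",
  "uploadDate": "2014-05-01",
  "embedUrl": "https://www.example.com/embed/video1"
}
</script>
```

Markup like this only helps when it is present in the initial HTML, which is exactly why JavaScript-loaded playlists are problematic.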

  • Hello Moz Community! I really was hoping to get your help on an issue that has been bothering me for a while now. I know there is a lot about this topic, but I couldn't find a good answer for my particular question. We are running several web applications that are similar but also different from each other. Right now, each one has its own subdomain (which was mainly due to technical reasons), like this: webapp1.rootdomain.com, webapp2.rootdomain.com etc. Our root domain currently points with a 301 to webapp1.rootdomain.com. Now, we are thinking about making two changes: changing to a subfolder level like this: rootdomain.com/webapp1, rootdomain.com/webapp2 etc., and changing our root domain to a landing page (listing all the apps) and taking out the 301 to webapp1. We want to make these changes mainly for SEO reasons. I know that the advantages are not so clear between subdomain/subfolder, but we think it could be the right way to go to push the root domain and profit more from juice passing to the different apps. The problem is that we had a bad experience when we first switched our first web app from the root domain (rootdomain.com) to a subdomain (webapp1.rootdomain.com) to set it equal with the other apps. Our traffic dropped a lot and it took us 6 weeks to get back to the same level as before. Maybe it was the 301 not passing all the juice, or maybe it was the switch to the subdomain; we are not sure. So, I guess my question is: do you think it is the right thing for web apps to go with subfolders to pass more juice from the root to the subfolders? Will it again bring huge drops in traffic once we make that change? Is it worth taking that risk or initial drop because it will pay off in the future? Thanks a lot in advance! Your answers would help me a lot.

    | ummaterial
    0

  • As we are building a hreflang sitemap for a client, we are correctly implementing the tag across 5 different languages including English. However, the News and Events section was never translated into any of the other four languages. There are also a few pages that were translated into some but not all of the 4 languages. Is it good practice to still list out the individual non-translated pages like on a regular sitemap without a hreflang tag? Should the hreflang sitemap include the hreflang tag with pages that are missing a few language translations (when one or two language translations may be missing)? We are uncertain if this inconsistency would create a problem and we would like some feedback before pushing the hreflang sitemap live.

    | kchandler
    0
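For context on the question above, an hreflang XML sitemap lists one `<url>` entry per page with `xhtml:link` alternates; a hedged sketch (URLs and languages are placeholders) of a page that exists in only two of the site's languages, where the untranslated versions are simply omitted rather than pointed at English:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en/news/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/news/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/news/"/>
    <!-- No fr/es/it entries: those translations do not exist for this page -->
  </url>
</urlset>
```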

  • Hi all, I have a dilemma and I'm hoping the community can guide me in the right direction.  We're working with a major retailer on launching a local deals section of their website (what I'll call the "local site").  The company has 55 million products for one brand, and 37 million for another. The main site (I'll call it the ".com version") is fairly well SEO'd with flat architecture, clean URLs, microdata, canonical tag, good product descriptions, etc. If you were looking for a refrigerator, you would use the faceted navigation and go from department > category > sub-category > product detail page. The local site's purpose is to "localize" all of the store inventory and have weekly offers and pricing specials.  We will use a similar architecture as .com, except it will be under a /local/city-state/... sub-folder. Ideally, if you're looking for a refrigerator in San Antonio, Texas, then the local page should prove to be more relevant than the .com generic refrigerator pages. (the local pages have the addresses of all local stores in the footer and use the location microdata as well - the difference will be the prices.) MY QUESTION IS THIS: If we pull the exact same product pages/descriptions from the .com database for use in the local site, are we creating a duplicate content problem that will hurt the rest of the site? I don't think I can canonicalize to the .com generic product page - I actually want those local pages to show up at the top.  Obviously, we don't want to copy product descriptions across root domains, but how is it handled across the SAME root domain? Ideally, it would be great if we had a listing from both the .com and the /local pages in the SERPs. What do you all think? Ryan

    | RyanKelly
    0

  • Hey Mozzers, I have run a few different backlink reports, and I noticed that one of my sites has an incredible amount of spammy backlinks. These were not created by a prior SEO; they are simply spammy links that were scraped and inserted on terrible sites, forums, directories, etc. 100% uncontrollable. The anchor text includes anything from the domain to "live sex" and "victoria's secret coupons". There are probably close to 700 or so of these backlinks from around 150-200 domains. I have read that one should contact the webmaster, and use disavow as a last resort, but I am not sure if that advice is for spammy link building techniques, which we have no history of doing. Is this normal? What is the best way to handle this? Is it likely that these are affecting this site's ranking at the moment? The number of spammy links drastically affects the ratio of quality backlinks to spammy backlinks. This is so frustrating...

    | evan89
    0

  • Hiya Mozzers, I have just been checking a website for duplication issues (this is a new website - they have just migrated across from old website to this new "main website"), and I found a wordpress blog on a different URL, duplicating the "main website"'s blog. Should I just close down this wordpress blog, 301 redirecting from the wordpress blog to the "main website"'s blog (equivalent blog posts to equivalent blog posts, with other indexed non-specific pages 301 redirected to "main website"'s blog homepage)? Thanks in advance for your help.

    | McTaggart
    0

  • Hey Mozzers, I am working on a site that has search-friendly parameters for its faceted navigation; however, this makes it difficult to identify the parameters in a robots.txt file. I know that using the robots.txt file is highly recommended and powerful, but I am not sure how to do this when facets use common words such as sizes. For example, a filtered URL may look like www.website.com/category/brand/small.html. Brand and size are both facets. Brand is a great filter, and size is very relevant for shoppers, but many products include "small" in the URL, so it is tough to isolate that filter in the robots.txt (I hope that makes sense). I am able to identify problematic pages and edit the meta head so I can add a noindex meta tag on any page that is causing these duplicate issues. My question is, is this a good idea? I want bots to crawl the facets, but indexing all of the facets causes duplicate issues. Thoughts?

    | evan89
    0

  • Hello fellow marketers, I have found this weird thing with our website in the organic results. The sitelinks in the SERP show incorrectly written, grammatically incorrect text. My question is: where does Google get this text from? It is not the page title as we can see it.

    | auke1810
    1

  • I have had this problem for some time now and I've asked many, many experts. Search for Falke in Google.co.uk and this is what you get: http://www.sockshop.co.uk/by_brand/falke/ 3rd Our competitor
    http://www.mytights.com/gb/brand/falke.html 4th Our competitor http://www.uktights.com/section/73/falke 104th this is us ????? 9th for Falke tights with the same section, not our Falke tights section? All sites seem to link to their brand sections in the same way, with links in the header and breadcrumbs. Open Site Explorer only shows 2 or 3 internal links for our competitors, 1600+ from us?
    Many of our brand sections rank badly. The Pretty Polly and Charnos brands rank page 2 or 3 with a brand subsection with no links to them, while the main section doesn't rank? A great example is Kunert, a German brand with no UK competition; our section has been live for 8 years, and the best we can do is 71st on Google UK, 1st on Bing (as we should be). I'm working on adding some quality links, but our competitors have a few low-quality or no external links and only slightly better domain authority, yet rank 100+ positions better than us on some brands. This suggests to me there is something on-page / internal-linking related I'm doing wrong, but all the tools say "well done, grade A, take a holiday". Keyword density is similar to our competitors' and I've tried reducing the number of products on the page. All pages ranked really well pre-Penguin, and Bing still likes them. This is driving me nuts and costing us money. Cheers Jonathan
    www.uktights.com

    | jpbarber
    1

  • Hi MOZ Community: I hired an SEO firm to run a link audit, identify bad links, request that those links be removed and upload a disavow file to Google Webmaster tools for the domains that would not agree to remove their links. My SEO company after emailing the owners of the bad domains linking to us obtained the following results: NYCOfficeSpaceLeader Total for Removal: 125 (118) Found: 87 (84) Removed: 27 (27) Only a total of 27 domains out of 87 found domains have been removed so far.  Seven additional domains have asked for a link removal ransom which we are refusing. Only getting 27 removed seems really low. Is this normal? Is there any way to increase this number? Will the disavow file have any effect and if so when? If Google does not actually remove the links, how can I determine when the disavow file has been processed. I feel a little silly having paid a lot of money and the only tangible effect to date is that links from 27 domains have been removed.  Has it been a worthwhile investment for only having links from 27 domains removed? My company does not have an unlimited marketing budget so obviously there is some concern.  At the same time the SEO firm seems professional. Thanks,
    Alan

    | Kingalan1
    0
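For reference on the disavow step discussed above, Google's disavow file is a plain-text list of `domain:` lines or individual URLs, with `#` comment lines; a hedged sketch (all domains below are invented examples, not the asker's actual links):

```text
# Domains that refused removal or demanded payment (invented examples)
domain:spammy-directory.example
domain:scraper-forum.example
# A single bad URL can also be listed on its own line
http://bad-blog.example/page-with-link.html
```

Google does not confirm when the file is processed; the links remain visible in reports and are simply discounted during future crawls.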

  • What is the best tool for finding keywords related to the primary keyword we are targeting? Cheers

    | webguru2014
    0

  • What if we do the interlinking on exact keywords? Does this come under spam techniques? For example - http://blog.payscout.com/automotive-merchant-services/ I interlinked the exact keyword in the above URL. Can we use the same image 2-3 times on the same website with different anchor tags? For example - http://packforcity.com/what-to-wear-in-new-orleans-in-january/ http://packforcity.com/what-to-wear-in-san-francisco-in-october/ The same image is used on the website with different alt tags.

    | AlexanderWhite
    0

  • Howdy Moz Fans (quoting Rand), I have a weird issue.  I have a site dedicated to criminal defense.  When you Google some crimes, the homepage comes up INSTEAD of the internal page directly related to that type of crime. However, on other crimes, the more relevant internal page appears. Obviously, I want the internal page to appear when a particular crime is Googled and NOT the homepage. Does anyone have an explanation why this happens? FYI:  I recently moved to WP and used a site map plugin that values the internal pages at 60% (instead of Weebly, which has an auto site map that didn't do that). Could that be it?  I have repeatedly submitted the internal pages via GWT, but nothing happens. Thanks.

    | mrodriguez1440
    0

  • Ok, so... Google Webmaster Tools Internal Links are not showing any links to my site's homepage. I only link to the homepage by wrapping the logo with the link throughout the site. Does Google need these to be text links to show them? The markup is essentially: <a href="/" title="Kona Coffee"><img src="http://1s93mbet6ccj5zkm31703gqj8.wpengine.netdna-cdn.com/wp-content/uploads/kona-coffee-1.png" alt="Kona Coffee"/></a> Site is here:
    http://goo.gl/4C8GKc Could the CDN image source be affecting it? Lost... please help!

    | AhlerManagement
    0

  • Hi, I have a page that has been fluctuating a lot the last few days, here are the results: 5/14:  #17 (this is where it had been ranking for about 2 months). 5/15:  #34 5/16:  #33 5/18:  #9 5/19:  #35 5/20:  #13 5/21: #37 I have only made minor changes to the page, and the link profile seems to look good.  Here's the page: www.thesandiegocriminallawyer.com/dui.html (targeted KW:  San Diego DUI Lawyer, San Diego DUI Attorney). The page has a lot of high-quality, original, and well-cited content.  Any thoughts on what could be causing so much back and forth? I should state that none of the other rankings for this site (overall) have been impacted.  Just this page for DUI related searches (San Diego DUI Lawyer, San Diego DUI attorney, San Diego Drunk Driving Lawyer, etc.).

    | mrodriguez1440
    0

  • What does this mean? That isn't an address on the website (fdmgroup.com). All I can think of is that there may be some email address incorrectly entered on the blog somewhere, but it's not a meta-refresh. Looking at the referring page http://www.fdmgroup.com/fdm-group-speaks-out-against-the-revelation-that-one-in-four-graduates-fail-to-find-work/ (a blog entry from 2011), it seems someone's tried to attach Google tracking code to the email address? Thanks in advance.

    | fdmgroup
    0

  • Hi, Is every post you write on your site SERP-worthy? I'll give an example -
    We often cover industry-related news items. They are written very well, with personal opinions, comments and detailed explanations. Our readers find them interesting, "like" and "plus" them. However, these items will never appear in the SERPs simply because they won't be searched for. Needless to say, these are not evergreen pieces. If by chance one lands on a subject that may be searched in the future, it usually won't appear, because that means the item was also covered by major sites like CNN, Forbes, Bloomberg etc. Is it worth our time to keep "investing" in these types of articles? Thanks

    | BeytzNet
    0

  • Hi there, I work for an ecommerce company as an online marketing consultant. They make kitchenware, microware and so on. They are reviewing their overall strategy and as such they want to build up a community. Ideally, they would want to have the community on a separate domain. This domain wouldn't have the logo of the brand, and the community wouldn't promote the brand itself. The brand would post content occasionally and link to the store domain. The reasoning behind this approach is to not get in the way of the community users, and also the fact that the branded traffic acquired doesn't end up buying at the store. I like this approach, but I am concerned because the brand is not big enough to have two separate domains and lose all the authority associated with one strong domain. I would definitely have everything under the same domain, store and community; otherwise we would have to acquire traffic for two domains. 1. What do you think of both scenarios, one domain versus two? Which one is better? 2. Do you know any examples of ecommerce companies with successful communities within the store domain? Thanks and regards

    | footd
    0

  • In what type of situation is it best practice to use a self-referencing rel="canonical" tag? Are there particular practices to be cautious of when using a self-referencing rel="canonical" tag? I see this practice used mainly on larger websites, but I can't find any information that really explains when it is a good time to make use of this practice for SEO purposes. I appreciate all feedback. Thank you in advance.

    | SEO_Promenade
    0
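To illustrate the question above: a self-referencing canonical simply points a page at its own preferred URL, which consolidates query-string and tracking-parameter duplicates onto one address; a hedged sketch with a placeholder URL:

```html
<!-- Placed in the head of https://www.example.com/widgets/ itself (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/widgets/">
<!-- Variants such as /widgets/?utm_source=x or /widgets/?sort=price
     then consolidate to the clean URL above -->
```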

  • Hi There, I have several PDFs on my website with the same content as the HTML version. Thus I need to set up a canonical for each of them in order to avoid duplicate content. In particular, I need to know the exact syntax for Windows Server (web.config) in order to implement the canonical for PDFs. I surfed the web but I cannot seem to find this piece of info anywhere. Thanks a lot!!

    | Midleton
    0
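Since a PDF has no `<head>`, the canonical has to be sent as an HTTP `Link` header. On IIS this is commonly done via the URL Rewrite module's outbound rules; the sketch below is an assumption-laden illustration (it presumes the URL Rewrite module is installed, and the rule name, paths and target URL are all placeholders), not verified syntax for this site:

```xml
<system.webServer>
  <rewrite>
    <outboundRules>
      <!-- Hypothetical rule: attach a canonical Link header to one PDF -->
      <rule name="CanonicalForGuidePdf">
        <match serverVariable="RESPONSE_Link" pattern=".*" />
        <conditions>
          <add input="{REQUEST_URI}" pattern="^/docs/guide\.pdf$" />
        </conditions>
        <action type="Rewrite"
                value="&lt;https://www.example.com/docs/guide.html&gt;; rel=&quot;canonical&quot;" />
      </rule>
    </outboundRules>
  </rewrite>
</system.webServer>
```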

  • Hi everyone! First, thanks for reading this, I really appreciate it. The company I work for has two sites: one is an event website and the other is a blog. The blog gets a great amount of the traffic and propels sales. The event website doesn't get much traffic, but has been around for a while and has garnered a Google PageRank of 6 with a lot of backlinks and referring domains. The event website, though, has the same name as the company, and this sometimes gets confusing when talking to businesses, so the executives in charge want to make the event website an umbrella site for the company (very similar to Virgin's website). They will keep the event website but rebrand it with a new domain and basically start over. The good news is that the event website, even though it has high link strength, has a lot of 404s because they dumped a previous database (I made them change those to 410s). Here's my issue: I want to keep the SEO strength of the event website for the event website. Could I do a 301 redirect for a couple of months and then take it off and make the umbrella site? Would the strength pass? Or would it be possible to do a 301 redirect in the subfolders where most of the content and links are? Or would you recommend another method of transferring the strength of the site?

    | Therealmattyd
    0

  • I read this interesting Moz guide: http://moz.com/learn/seo/robotstxt, which I think answered my question, but I just want to make sure. I take it to mean that if I have category pages with nothing but duplicate content (lists of other pages (h1 title/on-page description and links to same)) and I still want the category pages to distribute their link authority to the individual pages, then I should leave the category pages in the sitemap and meta-noindex them, rather than robots.txt them. Is that correct? Again, I don't want the category pages to index or have a duplicate content issue, but I do want the category pages to be crawled enough to distribute their link authority to individual pages. Given the scope of the site (thousands of pages and hundreds of categories), I just want to make sure I have that right. Up until my recent efforts on this, some of the category pages have been robots.txt'd out and still in the sitemap, while others (with a different URL structure) have been in the sitemap, but not robots.txt'd out. Thanks! Best.. Mike

    | 94501
    0

  • Can we have a different domain and business name for a website? I want to create a real-estate website. I want to register the domain name 'Nycityhomes.com' but want my business name (in the logo as well) to be 'Sunny Associates'. Can we do that for local SEO?

    | AlexanderWhite
    0

  • Can we have the same alt tag on all images? The pages below have images with the same alt tag, "astrologer Ravi sharma". I used the name of the person on every image. Before today, all the images were shown in Google Images, but today no image is there. Any comment? Like - http://www.astrologerravisharma.com/astrologer-ravi-sharma-photos/ http://www.astrologerravisharma.com/gallery/

    | AlexanderWhite
    0

  • Hey Guys, Whether it be OSE, Ahrefs, etc., how do you determine if a link is worth using to build a backlink? I know to look for a higher DA/PA and overall established links. I want very high-quality sites for external links (as we all do), but I also want to know what to look for and what to bypass when determining if I should build a backlink on the domain. These are a few examples / questions I have, sorry if they are basic (the below are all specific examples): 1. If a site has an article and that article page is a DA 65 / PA 1 with zero (0) established links to the article my backlink is on, would it be link-building worthy? Should I leave a backlink, and why? e.g. lots of different blogspot.com blogs pointing back at my site. Same domain, different blog, any benefit? 2. If a site is a PR2, DA 30 / PA 32 with 14 root domains and 250 total links, would a link like this give me any benefit, or should I skip links like this? Why? 3. What main factors do you focus on/look for to know when and when not to leave a backlink to your site when using a tool like OSE or Ahrefs? 4. Should I even worry about a site's PR when link building, since PR doesn't play that big a role anymore compared to high-quality backlinks? I've seen PR 7 sites outranked by PR 1 sites with 200 high-quality backlinks. Thanks for any help, and any help is GREATLY appreciated. 🙂

    | Circa444
    0

  • We serve up different desktop/mobile-optimized HTML on the same URL, based on a visitor's device type. While Google continues to recommend the HTTP Vary: User-Agent header for mobile-specific versions of the page (http://www.youtube.com/watch?v=va6qtaiZRHg), we're also aware of issues raised around CDN caching: http://searchengineland.com/mobile-site-configuration-the-varies-header-for-enterprise-seo-163004 / http://searchenginewatch.com/article/2249533/How-Googles-Mobile-Best-Practices-Can-Slow-Your-Site-Down / http://orcaman.blogspot.com/2013/08/cdn-caching-problems-vary-user-agent.html As this is primarily for Google's benefit, it's been proposed that we only return the Vary: User-Agent header when a Google user agent is detected (Googlebot/MobileBot/AdBot). So here's the thing: as the server header response is not "content" per se, I think this could be an okay solution, though I wanted to throw it out there to the esteemed Moz community and get some additional feedback. Do you guys see any issues/problems with implementing this solution? Cheers! linklater

    | linklater
    0
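A sketch of the proposal above, assuming an nginx front end (the bot pattern, server name and upstream are assumptions; nginx's add_header omits a header whose value is an empty string, which is what makes this conditional approach work):

```nginx
# Map the User-Agent to a Vary value: empty for normal visitors,
# "User-Agent" only when a Google crawler is detected.
map $http_user_agent $vary_value {
    default                                            "";
    "~*(Googlebot|AdsBot-Google|Mediapartners-Google)" "User-Agent";
}

server {
    listen 80;
    server_name www.example.com;   # placeholder

    location / {
        # add_header skips the header entirely when $vary_value is ""
        add_header Vary $vary_value;
        proxy_pass http://backend;  # placeholder upstream
    }
}
```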

  • What should you put in the “Website” field of your Google Places page: the URL of your homepage, or of one of your location pages?

    | AlexanderWhite
    0

  • Hello, our website (http://www.roguevalleymicro.com/index.php) is not coming up properly on Google search (for example, when you search for Rogue Valley Microdevices on Google). We believe that there is something wrong with the website source code, and Google cannot index it properly. However, your Crawl Test results did not indicate any such problems. Can someone help us with some advice please?

    | medved44
    1

  • Last year with Penguin, our rankings took a hit. We have worked hard, tirelessly, to recover. Last June we had no social media. We had an old website. We completely updated our website to responsive design, over 500k pages. We post fresh content daily, and we expanded into social media. We now have 100k followers on Facebook. We have seen thousands of Google +1s in the last few months, and not by hiring a single SEO consultant, and we use no AdWords or any paid advertising (except for AdSense, limited, on our site). We got thousands of Google +1s simply by sharing content in different circles, and they liked us the old-fashioned way. And yet our rankings have actually decreased. Just Saturday night, rankings that were on page 2 of Google suddenly dropped to page 5. Rankings on page 5 dropped to page 13, overnight. Mind you, last year (prior to the Penguin update), those page 2 and page 5 rankings were in the top 3 spots on page one. So it's been quite a fall. We are doing something wrong, and I don't know what it is. The overnight rankings drop did not correspond with anything we did whatsoever. They just literally dropped abruptly. Here is our site: (redacted for privacy, thanks for answering my question!) Here is a sample of a fallen ranking. Friday, for example, we ranked on page one of Google in this search: (redacted)
    and now we are on page 3. I am open to ideas and suggestions. I want to raise our DA and have worked hard over the last year to do so, but it doesn't seem to be working too well. Do I have bad inbound links? Is our site not a quality enough user experience? Outside advice is well received. Thank you to anyone who can lend their insight. 🙂

    | marshill
    0

  • I have had an agency state that "backlinks are the most important factor in SEO". That is how they are justifying their strategy of approaching bloggers. I believe there are a lot more factors than that, including target market definition, keyword identification and building content based on these factors. What are everyone's thoughts?

    | AndySalmons
    0

  • I have noticed that Google has started to simply link to /section/ as opposed to /section/index.php, and I haven't changed any canonical tags etc. I have looked at my pages' Moz authority for the two: /section/ = 28/100
    /section/index.php = 42/100 How would I go about transferring the authority from /section/index.php to /section/, to hopefully help my organic SERP positions etc.? Any insight would be great 🐵

    | TimHolmes
    0
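One common way to consolidate the two URLs from the question above is a site-wide 301 from index.php to its directory root; a hedged Apache .htaccess sketch (assumes mod_rewrite is enabled and is untested against this specific site):

```apache
RewriteEngine On
# 301 any .../index.php request to its directory so authority consolidates
# on the clean URL; matching THE_REQUEST avoids internal-rewrite loops
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/(.*/)?index\.php[\s?]
RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]
```

A self-referencing canonical on /section/ pointing at the extensionless URL is a common belt-and-braces addition alongside the redirect.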

  • Hi guys, Please advise me on improving my product pages' ranking. We are doing well for head terms and categories, but not ranking for product pages. We have issues with product pages which I think are hard to tackle. For instance, we have duplicate products (different colors), duplicate content internally (colors) and from manufacturer websites. Product pages are linked from sub-categories, i.e. Home > Category > Sub-Category (20 per page), using pagination for the next 20 and so on. Product pages are linked internally via widgets that say Similar Products, Featured Products etc. Another issue with our product pages is that we are using a third-party reviews platform, and whenever users add reviews to product pages this platform creates hyperlinks with different anchors which are not relevant to the product. Example - http://goo.gl/NUG652 Can somebody please give some advice on how to improve rankings for product pages? Writing unique content for thousands of pages is not possible. Even our competitors aren't writing unique content.

    | Webmaster_SEO
    0
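    For the color-variant duplication described above, one widely used tactic is a canonical tag on each variant pointing at a single preferred product URL, so the duplicates consolidate rather than compete. A hypothetical sketch (URLs invented for illustration):

    ```html
    <!-- Placed in the <head> of each color variant page, e.g.
         /widgets/acme-widget-blue and /widgets/acme-widget-red,
         both pointing at one chosen "main" version of the product. -->
    <link rel="canonical" href="http://www.example.com/widgets/acme-widget" />
    ```

    The trade-off is that only the canonical version tends to rank, which is usually acceptable when the variants differ only by color.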

  • Hi All, I've been lurking and learning from this awesome Q&A forum, and I finally have a question. I am working on SEO for an entertainment site that tends to get scraped from time to time. Often, the scraped content is then translated into a foreign language, and posted along with whatever pictures were in the article. Sometimes a backlink to our site is given, sometimes not. Is scraped content that is translated to a foreign language still considered duplicate content? Should I just let it go, provided a backlink is given? Thanks!
    Jamie

    | MKGraphiques
    0

  • Hello Experts, When I search site:http://www.louisvuittonhandbagss.com or just enter http://www.louisvuittonhandbagss.com on Google, I am not getting my website. I have done the following steps: 1. I have submitted sitemaps, and all the sitemaps are indexed. 2. I have used the GWT "Fetch as Google" feature. 3. I have submitted my website to top social bookmarking websites and to some classified sites also. Please advise.

    | aschauhan521
    0

  • Hi Guys, I'm currently trying to turn around the organic performance of a website I have been working on. I have been reading that content for home pages should be particularly long. What is the ideal length of the copy on a home page: 500 words, 1,000 words, 1,500 words? The current copy is kind of short in my opinion, and I would like to know if it would be a worthwhile effort to make it longer, since this thing is getting clobbered organically. Thanks!

    | oomdomarketing
    1

  • Hi, Does anyone have a list of (major) Search Engines that subscribe to the Ajax Crawling Scheme? (https://developers.google.com/webmasters/ajax-crawling/) Specifically interested in major international Search Engines such as Bing/Yahoo, Baidu & Yandex - if anyone knows, please let me know! Thanks in advance

    | FashionLux
    0
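    For reference, the opt-in mechanism defined in the Google specification linked above is a "meta fragment" tag. A crawler that subscribes to the scheme, on seeing this tag, re-requests the page with `?_escaped_fragment_=` appended and indexes the HTML snapshot the server returns:

    ```html
    <!-- Opt-in tag from the AJAX crawling specification; goes in the
         <head> of a page whose content is rendered client-side. -->
    <meta name="fragment" content="!">
    ```

    Which engines besides Google honor it varies, so it is worth testing the escaped-fragment URL directly for each engine you care about.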

  • Just ran a link profile, and have noticed for the first time many spammy Chinese sites linking to my site with spammy keywords such as "Buy Nike" or "Get Viagra".  Making matters worse, they're linking to pages that return 404s. Can anybody explain what's going on, and what I can do?

    | alrockn
    0
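    If outreach to those sites fails, Google's disavow tool accepts a plain-text file listing URLs and whole domains to ignore. A hypothetical sketch (the domains below are invented examples, not real sites):

    ```text
    # Disavow file uploaded via Google Webmaster Tools / Search Console.
    # Lines starting with # are comments.
    # Disavow a single spammy page:
    http://spam-links-example.cn/buy-nike-page.html
    # Or disavow an entire domain:
    domain:viagra-spam-example.cn
    ```

    Since the links point at 404 pages, they may already carry little weight, so disavowal here is more of a precaution than a cure.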

  • Hello Mozzers, I am working on a website and found that the social media agency, employed by the website owner, was running a parallel WordPress blog which duplicates the content on the main website's blog (200-odd pages of this duplicate WordPress blog are indexed; the duplication is exact, apart from around 60 non-blog pages: category and date pages, the homepage, etc.). I am planning to 301 redirect the WordPress blog posts to their equivalent pages on the website blog, and then 301 redirect the homepage, category and date pages, etc. to the website blog homepage, so all the blog pages redirect to somewhere on the main website. _Does this make sense, or should I only worry about redirecting the blog content pages? _ Also, the main website is new and there are already redirects coming in to its pages from the old website. _Is there anything to be cautious about when redirecting to a main website from multiple old websites? _ Thanks in advance, Luke

    | McTaggart
    0
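    The redirect plan described above can be sketched in the WordPress install's .htaccess, assuming Apache; the domains and post paths below are invented for illustration:

    ```apache
    # Hypothetical .htaccess on the duplicate WordPress blog.
    RewriteEngine On
    # Map known posts to their equivalents on the main site first...
    RewriteRule ^2014/05/example-post/?$ http://www.example.com/blog/example-post [R=301,L]
    # ...then send the homepage, categories, date archives, and anything
    # else to the main blog index as a catch-all.
    RewriteRule ^.*$ http://www.example.com/blog/ [R=301,L]
    ```

    Order matters: the specific post rules must precede the catch-all, or everything falls through to the blog homepage.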

  • Can you use Yext and Moz Local in conjunction without having to worry about duplicate listings? I know Yext lets you opt out of listings. So my question is: will my website be hurt if I do both?

    | WindshieldGuy-276221
    0
