
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi! For some of the sites I work with, when searching for their top terms, I am seeing articles listed under "Interesting Finds". Some people I've read think it relates to AMP, others do not; some think it has to do with the structured data added to the page, some do not. Does anyone have a definitive answer on how to increase your chances of being listed here? An example is attached. Any ideas?

    Intermediate & Advanced SEO | | vetofunk
    0

  • I want to rank with this page: http://www.servicesarab.com/%D9%86%D9%82%D9%84-%D8%B9%D9%81%D8%B4-%D8%A7%D9%84%D9%83%D9%88%D9%8A%D8%AA/

    White Hat / Black Hat SEO | | saharali15
    0

  • Hello, I have a question concerning maintenance and pruning of content on a large site with a ton of pages that are either expired or recurring. Firstly, there are ~12,000 pages on the site. Large sections of the site have individual landing pages for time-sensitive content, such as promotions and shows. They have TONS of shows every day, so the number of pages to manage keeps growing.

    Show URLs: I'm auditing the show URLs and looking at pages that have backlinks. Those, I am redirecting to the main show pages. However, there is a significant number of show URLs from a few years ago (2012, 2013, 2014, 2015) that DON'T get traffic or have any backlinks (or ranking keywords). Can I delete these pages entirely from the site, or should I go through the process of 410-ing them (and then deleting them, or can you let 410s sit)? They are in the XML sitemap right now, so they get crawled, but they are essentially useless. I want to cut off the dead weight, but I'm worried about deleting a large number of pages from the site at once. For show URLs that are obsolete but still rank well for keywords and get some traffic, is there any recommended option? Should I bother adding them to a past-shows archive section, or not, since they bring in only a LITTLE traffic? Or axe them, since it's such a small amount of traffic compared to what the main pages get?

    There are also URLs that are orphaned and obsolete right now, but will recur. For instance, when an artist performs, they get their own landing page; they may acquire some backlinks and rank, but then that artist doesn't come back for a few months. The page just sits there, orphaned and in the XML sitemap. However, regardless of backlinks/keywords, the page will come back eventually. Is there any recommended way to maintain this kind of situation? Again, there are a LOT of URLs in this same boat.

    Promotional URLs: I'm going through the same process for promotions, and thankfully the scale of the issue is much smaller. However, same question as above: they have some promotional URLs, like NYE Special Menu or Lent Specials landing pages for each of their restaurants. These pages are only valid for a short time each year and are otherwise obsolete. I want to reuse the pages each year, though, and don't want them to just sit there in the XML sitemap. Is there ever an instance where I might want to 302 redirect them, and then remove the 302 for the short time they are valid?
    I'm not AS concerned about the recycled promotional URLs; there are far fewer URLs in that category. As you can probably tell, though, this large site has this problem of recurring content throughout, and I'd like to get a plan in place to clean it up and then create rules to maintain it. Promotional URLs that recur are a smaller issue, so if they are orphaned it's not the end of the world, but there are thousands of show URLs with this issue, so I really need to determine the best play here. Any help is MUCH appreciated!

    Technical SEO | | triveraseo
    0
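A minimal sketch of the 410 route discussed in the question above, as an Apache .htaccess fragment. The /shows/YYYY/ path pattern is an invented placeholder, not the poster's actual URL scheme, and this assumes mod_rewrite is available:

```apache
# Return 410 Gone for expired show pages from past years
# (adjust the path pattern to match the site's real URL structure)
RewriteEngine On
RewriteRule ^shows/(2012|2013|2014|2015)/ - [G]
```

Whatever route is chosen, the same URLs should also be dropped from the XML sitemap so Google stops being invited to recrawl them.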

  • I have an interesting challenge for a new client. Basically, they collect payment from gym users whose monthly subscription payment has failed, and they charge the fee to the gym user, not the gym. Their clients love them for this, but the end consumer hates them, and as a consequence every review or ratings site from Google Reviews to Trustpilot is universally filled with angry consumers who didn't read the Ts and Cs of their gym membership. Understandable, but it also means the client can't have a presence on any social channel, as they simply become a gripe board for disgruntled consumers. My question is: how are the poor reviews impacting rankings and domain authority, and should I treat this like any other client in terms of fixing crawl issues and seeking quality backlinks, or am I always going to be pushing water uphill? Cheers gang!

    Reviews and Ratings | | Algorhythm_jT
    0

  • We are migrating 13 websites into a single new domain and with that we have certain pages that will be terminated or moved to a new folder path so we need custom 301 redirects built for these. However, we have a huge database of pages that will NOT be changing folder paths and it's way too many to write custom 301's for. One idea was to use domain forwarding or a wild card redirect so that all the pages would be redirected to their same folder path on the new URL. The problem this creates though is that we would then need to build the custom 301s for content that is moving to a new folder path, hence creating 2 redirects on these pages (one for the domain forwarding, and then a second for the custom 301 pointing to a new folder). Any ideas on a better solution to this?

    Intermediate & Advanced SEO | | MJTrevens
    0
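One common pattern for the migration described above is to list the custom 301s before a final catch-all rule, so each URL is redirected exactly once and the moved pages never chain through two hops. A hedged sketch in Apache .htaccess terms (domain names and folder paths are invented for illustration):

```apache
RewriteEngine On

# Custom 301s for pages moving to a new folder path - listed FIRST,
# so the [L] flag stops processing before the catch-all below runs
RewriteRule ^old-folder/page-a/?$ https://newdomain.com/new-folder/page-a/ [R=301,L]
RewriteRule ^old-folder/page-b/?$ https://newdomain.com/other-folder/page-b/ [R=301,L]

# Catch-all: everything else keeps its existing path on the new domain
RewriteRule ^(.*)$ https://newdomain.com/$1 [R=301,L]
```

Because rules are evaluated top to bottom and [L] ends processing on a match, the custom redirects win for their URLs and the wildcard handles the rest, with no double redirect on either group.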

  • Hello, Our company is international and we are looking to gain more traffic specifically from Europe. While I am aware that translating content into local languages, targeting local keywords, and gaining more European links will improve rankings, I am curious whether it is worthwhile to have a company.eu domain in addition to our company.com domain. Assuming the website's content and domain will be exactly the same, with the TLD (.eu vs .com) being the only change - will this benefit us, or will it hurt us by creating duplicate content, even if we create a separate GSC property for it with localized targeting and hreflang tags? Also - if we have multiple languages on our .eu website, can different paths have differing hreflangs? I.e. company.eu/blog/german-content with a German hreflang and company.eu/blog/Italian-content with an Italian hreflang. I should note - we do not currently have an hreflang attribute set on our website, as content has always been correctly served to US-based English-speaking users - we do have the United States targeted in Google Search Console, though. It would be ideal to target countries by subfolder instead, if that is just as effective; otherwise, we would essentially be maintaining two sites. Thanks!

    Technical SEO | | Tom3_15
    0
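For reference on the question above: hreflang annotations are set per page, so different paths can absolutely carry different language values. A sketch using the poster's hypothetical paths (URLs are placeholders; each page must list itself plus every alternate, and the alternates must link back reciprocally):

```html
<!-- In the <head> of company.eu/blog/german-content -->
<link rel="alternate" hreflang="de" href="https://company.eu/blog/german-content" />
<link rel="alternate" hreflang="it" href="https://company.eu/blog/italian-content" />
<link rel="alternate" hreflang="en-us" href="https://company.com/blog/english-content" />
<link rel="alternate" hreflang="x-default" href="https://company.com/blog/english-content" />
```

With correct hreflang pairs in place, identical content on .com and .eu is treated as localized alternates rather than ordinary duplicates.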

  • Obviously, the duplicated pages would be canonical, but would there be a way of anchoring where a user lands on the page based on the search term they entered? For example: if you have a site that sells cars, you could use this method but have a page covering (brand) cars for sale, finance options, the best car for a family, how far the (brand) car will go on a full tank, and so on, making all the information blocks H2s but using the same H2s as the duplicated page titles. Then it gets complicated: if someone searches "best car for a family" and clicks the title of the duplicated page, how would you anchor that user to the section of the page with this information? Could there be a benefit to doing this, or would it just not work?

    Algorithm Updates | | Evosite1
    0
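Mechanically, landing a user on a specific section is done with ordinary fragment identifiers; a minimal illustration (heading text and ids are made up):

```html
<h2 id="best-family-car">Best car for a family</h2>

<!-- A link elsewhere that lands the user on that section: -->
<a href="/brand-cars#best-family-car">Best car for a family</a>
```

Google does sometimes surface "Jump to" links to such anchors for long pages, but which fragment (if any) a SERP click lands on is Google's choice, not something you can force per search term.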

  • Hello, which is better for ranking with a 40 DA domain: 301 redirecting the domain to my website, or hosting the domain and creating posts with links to my website? If I do the 301 redirect, will the crawl errors of the old 40 DA domain show up on my new website or not? Also, how many links can I get from one PBN website, and is it better to get links to the home page or to posts? Best regards,

    Technical SEO | | cristophare79
    0

  • Hello all, I'm doing some technical SEO work on a client website and wanted to crowdsource some thoughts and suggestions. Without giving away the website name, here is the situation: The website has a dedicated /resources/ page. The bulk of the resources are industry definitions, all encapsulated in colored boxes. When you click on a box, the definition opens in a lightbox with its own unique URL (e.g. /resources/?resource=augmented-reality). The information for these colored lightbox definitions is pulled from a normal resources page (e.g. /resources/augmented-reality/). Both of these URLs are indexed, leading to a lot of duplicate indexed content. How would you approach this? Things to consider:
    -The website is built on WordPress with a custom theme.
    -I have no idea how to even find settings for the lightbox (will be asking the client today).
    -Right now my thought is to simply disallow the lightbox URL in robots.txt and hope Google will stop crawling it and eventually drop it from the index.
    -I've considered adding a canonical to the lightbox URL pointing at the main resource page, but the lightbox appears to be dynamically created, so there is no obvious place to add it (outside of the FTP, I imagine?). I'm rusty with stuff like this, so I figured I'd appeal to the masses for some assistance. Thanks! -Brad

    Technical SEO | | Alces
    0
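One caveat worth flagging for the robots.txt idea above: disallowing a URL stops Google from crawling it, which also stops Google from ever seeing a canonical or noindex on that URL, so already-indexed pages can linger. If the lightbox template can be edited, the canonical is a one-line head element (domain is a placeholder; the paths come from the example in the question):

```html
<!-- In the <head> of /resources/?resource=augmented-reality -->
<link rel="canonical" href="https://example.com/resources/augmented-reality/" />
```

With the canonical in place and the URL left crawlable, Google can consolidate the lightbox URLs onto the full resource pages over time.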

  • Hello everyone, and thank you in advance for helping me. I have a React application made with Create React App, which is zero-configuration, and I connect it to a CodeIgniter (PHP) API using Axios. Before using React, this website was at the top of Google's SERPs for specific keywords. After moving to React, and some URL changes with no redirection in .htaccess or elsewhere, I lost my search engine visibility! I guess it may have been caused by Google penalties. I tried using "react-snap", "react-snapshot" and so forth for prerendering, but there are so many problems with them. I also tried Prerender.io, but unfortunately my host provider didn't help me configure the shared host! Finally, I found a great article, and my website eventually displays in the Rendering box of Fetch As Google, though the dynamic content still doesn't display in the Fetching box. I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages without any problem. If Fetch As Google can render the entire website, is it possible that my pages will be indexed after a while and appear in Google's SERPs?

    Intermediate & Advanced SEO | | hamoz1
    0

  • I have searched for the food delivery keyword for the United Arab Emirates, in EN and AR. Monthly volume has no data, whereas difficulty, organic, and priority have figures. When I change the country to the US, monthly volume has more data. Can you explain how I can use your service?

    Moz Bar | | Twenzy
    1

  • Hi community, We have a couple of pages where we have mistakenly used non-https (http) hyperlinks. They redirect to the https versions anyway. Does using these http links on the page hurt rankings at all? Thanks

    Algorithm Updates | | vtmoz
    0

  • Hi, I'm working with a site that has created a large group of urls (150,000) that have crept into Google's index. If these urls actually existed as pages, which they don't, I'd just noindex tag them and over time the number would drift down. The thing is, they created them through a complicated internal linking arrangement that adds affiliate code to the links and forwards them to the affiliate. GoogleBot would crawl a link that looks like it's to the client's own domain and wind up on Amazon or somewhere else with some affiliate code. GoogleBot would then grab the original link on the client's domain and index it... even though the page served is on Amazon or somewhere else. Ergo, I don't have a page to noindex tag. I have to get this 150K block of cruft out of Google's index, but without actual pages to noindex tag, it's a bit of a puzzler. Any ideas? Thanks! Best... Michael P.S. All 150K urls seem to share the same url pattern... exmpledomain.com/item/...   so /item/ is common to all of them, if that helps.

    Intermediate & Advanced SEO | | 94501
    0
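Since there is no page to tag, one route sometimes used in situations like the one above is an HTTP-level noindex via the X-Robots-Tag response header, matched on the shared /item/ pattern. A sketch for Apache 2.4 with mod_headers (the URLs must stay crawlable in robots.txt so Google can actually fetch the header; and since these URLs serve redirects, whether Google honors a noindex sent alongside a 301 is debatable, so a 410 on the same pattern is an alternative worth weighing):

```apache
# Send a noindex header on every response under /item/
<If "%{REQUEST_URI} =~ m#^/item/#">
    Header set X-Robots-Tag "noindex"
</If>
```

The shared /item/ prefix mentioned in the P.S. is what makes a single rule like this feasible instead of 150K individual tags.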

  • Hey guys, I've been having an issue for the past few months: I keep getting "/feed" broken links in Google Search Console (screenshot attached). The site is a WordPress site using the Yoast SEO plugin for on-page SEO and the sitemap. Has anyone else experienced this issue? Did you fix it? How should I redirect these links?

    Technical SEO | | Extima-Christian
    0
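If the decision for the question above is to redirect the /feed URLs rather than keep the feeds (they are a core WordPress feature, so only do this if they really are unwanted), a generic Apache rewrite along these lines is one option:

```apache
# 301 each /feed variant back to its parent URL
RewriteEngine On
RewriteRule ^(.+)/feed/?$ /$1/ [R=301,L]
```

The capture group preserves the parent path, so /some-post/feed/ lands on /some-post/ rather than everything piling onto the homepage.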

  • Hi everyone - I'm trying to benchmark an entire industry. Given that Google seems to adjust which ranking signals count more heavily based on the industry/query, I'd like to be able to benchmark a specific industry. As a silly example, let's say I want to know which ranking signals most affect coffee shops, real estate agents, or manufacturers. Has anyone here ever found an efficient way to analyze: 1. Who the top 10 are for 5-10 keywords in that industry; 2. What they have in common (e.g. meta info, content length, SSL certs, etc.); 3. What the differences were in the higher-ranking ones (domain name registration age, content quality, etc.)? It seems like there has to be a way to do this, given that Moz, SEMrush, and others release yearly reports detailing which ranking signals seem to be most important. I'd just like to know what the top ranking signals are for a specific industry, based on who is already ranking really well in that industry. Thanks!

    Other SEO Tools | | growthat
    0
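Steps 2 and 3 in the question above can be tabulated mechanically once the on-page features have been extracted. A rough Python sketch under the assumption that the data has already been gathered per keyword; the feature names and example values are invented placeholders, not real measurements:

```python
from collections import Counter

def common(group):
    """Return the (feature, value) pairs shared by every page in the group."""
    counts = Counter()
    for feats in group:
        counts.update(feats.items())
    return {kv for kv, c in counts.items() if c == len(group)}

def shared_features(results):
    """results: list of (position, features_dict) for one keyword's top 10."""
    top = [f for pos, f in results if pos <= 3]   # positions 1-3
    rest = [f for pos, f in results if pos > 3]   # positions 4-10
    top_common, rest_common = common(top), common(rest)
    return {
        "everyone_has": top_common & rest_common,  # table stakes for the industry
        "top3_only": top_common - rest_common,     # possible differentiators
    }

# Invented example data: ssl and schema flags for five ranking pages
results = [
    (1, {"ssl": True, "schema": True}),
    (2, {"ssl": True, "schema": True}),
    (3, {"ssl": True, "schema": True}),
    (4, {"ssl": True, "schema": False}),
    (5, {"ssl": True, "schema": False}),
]
print(shared_features(results))
```

Run across 5-10 keywords and averaged, the "top3_only" bucket is a crude proxy for what separates the leaders in that niche; correlation caveats from the yearly ranking-factor studies apply equally here.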

  • Hi there, I am really hoping someone can help. The site I run has started receiving traffic from the US (we are a UK firm that doesn't ship overseas). Ordinarily this wouldn't be a massive problem, but the traffic is coming directly to lots of pages and instantly bouncing. I am worried this is going to negatively impact my rankings, as drop-off rate and conversions are getting hammered by this 'fake traffic'. The attached image shows the traffic for the homepage, but it's happening on every page, with hundreds of hits bouncing and hurting my stats. Is there any way of dealing with this, or reporting it to an authority, or even to Google itself? Any help would be greatly appreciated. George

    White Hat / Black Hat SEO | | BrinvaleBird
    0

  • Hi, I wonder if anyone could help me with a canonical link/indexing query. I have given an overview, intended solution, and question below; any advice will be much appreciated.

    Overview: I have a client with a .com domain that includes blog content intended for the US market, using the correct lang tags. The client also has a .co.uk site without a blog, but is looking at creating one. As the target keywords and content are relevant across both UK and US markets, and so as not to duplicate work, the client has asked whether it would be worthwhile centralising the blog, or for any other efficient blog site-structure recommendations.

    Suggested solution: As the domain authority (DA) on the .com/.co.uk sites is in the 60s, it would be risky to move domains/subdomains at this stage, and it would be a waste not to utilise the DA that has built up on both sites. I have suggested they keep both sites and share the same content between them using a content-curation WP plugin, with the canonical link referencing the original source (US or UK) so as not to get duplicate content issues.

    My question: Let's say I'm a potential customer in the UK, and I'm searching using a keyword phrase answered by content that exists on both the UK and US sites, with the US content as the original source.
    Will the US or UK version of the blog appear in UK SERPs? My gut says the UK blog will, as Google will try to serve me the most appropriate version of the content, and as I'm in the UK it will be this version - even though I have identified the US source using the canonical link?

    Intermediate & Advanced SEO | | JonRayner
    2

  • I am trying to build some good quality backlinks, how important is SSL for the site that we post guest blogs on? I realize that if a site does not have SSL currently, their DA will likely not go up very fast because of Google's new algorithms, but currently, I am looking at a couple sites with a DA of 40 and 41. By the way, my site has SSL (is https). Thanks!

    White Hat / Black Hat SEO | | CSBarns
    0

  • Hello, I've heard that relevancy of the content between the source page and the target page of outbound links in my content matters greatly. The outbound links I provide in my content should have a high degree of relevancy to the topic I'm writing about, or they aren't really worth including. Example: Don't just link to the homepage of an organization mentioned in the article, link to a page on their site that is related to the topic you are writing about. Is this true? Would including less relevant links negatively impact SEO in any way?

    On-Page Optimization | | DJBKBU
    0

Our theme adds a class to our site's H1s. For example:
    Our front page: ... The blog: ... As a result, Moz is not seeing the keyword in titles when checking with Page Optimization. The theme developer says there is no setting to turn off in order to test this.
    My question is: is this a true issue that is hurting our site, or is Moz simply not finding the keyword because of the markup it looks for? Are there other options for troubleshooting this potential issue? Thanks in advance.
    John

    Keyword Research | | jgoethert
    0
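For context on the H1 question above: a class attribute doesn't change what the element is, so any crawler that parses HTML properly still sees it as an H1. A minimal illustration (class name invented for the example):

```html
<!-- Both of these are the same H1 to an HTML parser; the class only styles it -->
<h1>Target Keyword Here</h1>
<h1 class="entry-title">Target Keyword Here</h1>
```

If a tool fails on the second form, that points to a limitation in the tool's extraction rather than an on-page SEO problem.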

  • How do I find blog post ideas for my website blog when people have only very few questions? I am a tour operator, and the keywords I target are Provence bike tour, Normandy bike tour, Tuscany bike tour, and so on. I am trying to find blog ideas to create a topic cluster for those pillar pages. Any advice on how to find blog ideas that could boost my pillar pages? Thank you,

    Content Development | | seoanalytics
    1

  • I'm working with a US client on the SEO for their large ecommerce website; I'm working on it from the UK. We've now optimised several of the pages, including updating the meta descriptions etc. The problem is, when I search on the keyword in the UK, I see the new updated version of the meta description in SERP results, BUT when my client searches on the same keyword in the US, they see the old version of the meta description. Does anyone have any idea why this is happening and how we can resolve it? Thanks Tanya

    Intermediate & Advanced SEO | | TanyaKorteling
    0

  • I am selling my tours on a platform that resells tours. They take the content from my website (the description of each day of my itinerary) to put on their platform. Could I have duplicate content issues from that? Thank you,

    On-Page Optimization | | seoanalytics
    0

  • Is it still possible to use anchor text to rank for a keyword that is not present on the landing page? Or are there any alternatives?

    Intermediate & Advanced SEO | | seoman10
    0

  • Is it actually even possible to compete against Amazon to be #1 in Google SERPs? If so, how? I run a boutique business selling a niche product; from 2008 to 2013 I was always #1 for my keywords.
    But since Amazon started selling the same type of products, I have always been right under the Amazon results, which are at 1, 2, 3. Is it even possible to get to the #1 position any more? Thank you.

    Intermediate & Advanced SEO | | loginid
    0

  • We operate a blog inside a folder on our site and are considering the launch of 4 highly focused blogs with specialized content, which are currently categories on the internal blog. I'm wondering if there is more value in using the new external blogs or in just continuing to grow the internal blog content. Does the fact that the internal blog is buried amongst millions of pages have any impact if we want the content indexed and value given to the links from the blog content to our main site pages?

    Content Development | | CondoRich
    0

  • What is everyone doing regarding organization schema markup on interior pages? The following article on Moz states that Organization schema should only appear on pages that are about the company (homepage, about page, contact page, etc.): https://moz.com/blog/structured-data-for-seo-2 The Yoast SEO plug-in adds the Organization schema to every page of the site. I am trying to determine what to do with this conflicting information.

    On-Page Optimization | | lexomatic
    1

  • I did a complete redesign and content change for my website. I have my new webpages indexed, but they still haven't replaced the old ones after 1 month; in search results I see both the old ones and the new ones. How long a delay should I expect (approximately) for the old pages to be replaced by the new ones, given that the entire website was changed? Thank you,

    On-Page Optimization | | seoanalytics
    0

  • Hello, To rank for "China bike tour": is it OK to write in the title "Explore China on a bike tour - My company name bike tours", or do I absolutely need the exact phrase "China bike tour" in the title tag? Thank you,

    On-Page Optimization | | seoanalytics
    1

  • SEO/Moz newbie here! My organisation's website (dyob.com.au) uses an API integration to pull through listings that are shown in the site search. There is a high volume of these, all of which contain only a title, image, and contact information for the business. I can see these pages coming up on my Moz account with issues such as duplicate content (even if they are different) or no description. We don't have the capacity to fill these pages with content. Here's an example: https://www.dyob.com.au/products/nice-buns-by-yomg I am looking for a recommendation on how to treat these pages. Are they likely to be hurting the site's SEO? We do rank for some of these pages. Should they be noindex pages? TIA!

    Technical SEO | | monica.arklay
    0

  • Just a quick question re implementation of JSON-LD breadcrumbs. You are here: Acme Company → Electronics → Computers → Laptops. In this example, Laptops is my current page, shown without a link in the visible on-page breadcrumb. When implementing JSON-LD BreadcrumbList, should Laptops be included in the schema snippet, or should the trail run only from Computers back to home?

    Technical SEO | | MickEdwards
    0
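For reference on the breadcrumb question above: Google's structured-data guidance allows the final crumb to represent the current page, with its "item" URL omitted since the page is itself the item. A sketch matching the example trail (URLs are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Electronics",
      "item": "https://www.acme.example/electronics" },
    { "@type": "ListItem", "position": 2, "name": "Computers",
      "item": "https://www.acme.example/electronics/computers" },
    { "@type": "ListItem", "position": 3, "name": "Laptops" }
  ]
}
```

Whether to include the home/company crumb at position 1 is a judgment call; the markup should mirror whatever the visible breadcrumb shows.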

  • I have a client who has a resources section. This section is primarily devoted to definitions of terms in the industry. These definitions appear in colored boxes that, when you click on them, turn into a lightbox with their own unique URL. Example URL: /resources/?resource=dlna The information for these lightboxes is pulled from a standard page: /resources/dlna. Both are indexed, resulting in over 500 indexed pages that are either a simple lightbox or a full page with very minimal content. My question is this: Should they be de-indexed? Another option I'm knocking around is working with the client to create Skyscraper pages, but this is obviously a massive undertaking given how many they have. Would appreciate your thoughts. Thanks.

    Technical SEO | | Alces
    0

  • Hi, I have two questions regarding keyword cannibalization.
    1. I am doing the SEO for a website that sells do-it-yourself packages for heating, bathrooms, ventilation and so on, for new houses or for renovations. The most important pages are the product pages (e.g. example.com/products/bathrooms), but there is also a blog divided into categories per product (e.g. example.com/category/bathrooms). The difference is clear: the product page focuses on the product itself, and the blog category page contains all blog posts relating to bathrooms (tips, new materials, new innovations,...). My question is whether the product page and blog category page can compete with each other for the term bathrooms (although they have different content). Does it help, or is it enough, to direct internal links from separate blog posts to the most important page (the product page) and back, to avoid my blog category page competing with my product page? Another possibility would be to use a canonical tag on the category page pointing to the product page, but this isn't really good practice because it isn't actually duplicate content. A third possibility would be to noindex the category page. So which of the three is the best solution?
    2. A second example of keyword cannibalization is category archive pages for webshops. If you have a category page example.com/jeans and a subcategory page example.com/jeans/women, is it useful to optimize the two pages for different terms - jeans for the first page and jeans for women for the second - or will Google not make this distinction because the keywords are too closely related? In other words, is it useful to write content specifically for jeans for women and make a landing page for this keyword, or will this page compete with the category page that has been optimized for just the keyword jeans? In large clothing webshops you can see, for example, that there is an optimized page for Nike (content, headings,...) but not for Nike for women or Nike for men. Is this just laziness, or is it done precisely to avoid keyword cannibalization? Looking forward to your comments!

    Intermediate & Advanced SEO | | Mat_C
    0

  • Hi all, We have our docs in a subdomain on GitHub and are looking to move them into a subfolder. GitBook offers the ability to redirect via CNAME: https://gitbookio.gitbooks.io/documentation/platform/domains.html I am getting conflicting information on whether this will work, or whether it will cause duplicate content and hurt our SEO.

    Technical SEO | | kate.hassey
    0

  • Hi, I am trying to arrive at a best-practice title tag template for my organization. Does the following template still hold? Primary Keyword - Secondary Keyword | Brand Name Will anything be impacted if I eliminate the spaces around the hyphen - will search bots still be able to treat the first one as the priority and the second as the secondary? Primary Keyword-Secondary Keyword | Brand Name Thank you

    On-Page Optimization | | lina_digital
    0

  • I am wondering if someone can help me understand what's going on with our site. We had a 50% drop in the number of keywords ranking from February 2018 to October in SEMrush. Looking at SEMrush, I am actually seeing a drop after 2/2018 with many competitors' sites as well. We saw moderate improvement in November, but around December 6th we started seeing a decline in the number of keywords ranking again. In Google Analytics, there was a 10-15% drop in traffic after February 2018, which recovered from September to December, but since early December there is a drop again. In GWT, I am seeing something similar to Analytics with impressions and clicks. We have done some SEO the past couple of years, but we have taken care to do things white-hat so as not to incur a penalty. We have also invested in writing content for our blog on a regular basis. Any thoughts?

    White Hat / Black Hat SEO | | kekepeche
    0

  • Hi, I am working for a SaaS client. He uses two different language versions on two different subdomains:
    de.domain.com/company for German and en.domain.com for English. Many thousands of URLs have been indexed correctly. But Google Search Console tries to index URLs which never existed before and still don't exist: de.domain.com/en/company, en.domain.com/de/company, and a thousand more using /en/ or /de/ in between. We never use this variant, and calling these URLs correctly throws up a 404 page (but with the wrong response code  -  we`re fixing that 😉 ). But Google tries to index these kinds of URLs again and again, and I couldn't find any source for these URLs; no website is using them as outgoing links, etc.
    We do see in our logfiles that a Screaming Frog installation and moz.com with Open Site Explorer were trying to access these earlier. My question: how does Google come up with that? From where did they get these URLs, which (to our knowledge) never existed? Any ideas? Thanks 🙂

    Technical SEO | | TheHecksler
    0

  • I've got a website with a slider, and each of the 6 slides has a 5-second video background. The website is B2B, and the user profile for the website is employees at Fortune 1000 companies in the United States browsing on desktop computers. The videos are highly optimized, and we did testing using various browsers and bandwidth connections to confirm the videos loaded fast enough on down to a 15 Mbit/s connection (which is pretty low by today's average U.S. business bandwidths). We tried hosting the videos on Vimeo and YouTube, but it caused issues in the timing of the slideshow display. (I've not seen any other website do what we do the way we do it; most sites have a single video background with a single text overlay on top.) The downside is that loading all those videos produces a lot of bandwidth usage for our server. The website serves a niche service industry, though, so we're not exceeding our current limits. I'm wondering whether there might be some benefit to hosting just the video files on a CDN. Obviously that would mean less bandwidth usage for our server, and possibly quicker load times where the CDN server is closer to the user than our server. But are there benefits or downsides from an SEO perspective, noting that I'm proposing putting only the videos on the CDN, not the entire web page?

    Intermediate & Advanced SEO | | Consult1901
    0

  • I'm having trouble finding a local rank tracking service with useful reporting. I've tried several and, for the money, have gravitated toward Whitespark's service, as for $25/month I can track unlimited locations. But their report is indicative of what I've seen time and time again in my 18-year experience as a software developer and internet marketer: whoever is making the design decisions isn't a seasoned (local) SEO, and/or probably hasn't done their homework well enough by talking to seasoned SEOs. Their Summary Report looks like the attachment. When I'm doing local SEO I look at a lot of reporting data, but probably the most important is: how many listings moved up into position #1-3 on Local Finder, which is also usually the Local 3-Pack (sometimes a 2-pack, explaining the discrepancy in the first two rows between the numbers in the #1-3 column). I also want to know how many listings moved UP into #4-10, and vice versa: what fell out of #1-3 and #4-10. The problem with the format of this report is that if a listing falls from #2 to #5, it shows as a decrease in #1-3 and an INCREASE for #4-10. This would give me the false impression that a listing from below #10 came into #4-10, when in actuality the increase in #4-10 was caused by a decrease in #1-3! One situation is positive, the other negative. What I want to know is how many listings (totals, without getting out the calculator):
    - moved up into #1-3 (Whitespark does this via the Increase column in the Local 3-Pack row)
    - moved up into #4-10
    - moved down out of #1-3 to #4-10
    - moved down out of #1-3 to below #10
    - moved down out of #4-10 to below #10
    Does anyone know of local tracking services that give you this kind of data in this way?

    Local Listings | | Consult1901
    0

  • From what I understand, Google no longer has a "duplicate content penalty"; instead, duplicate content simply isn't shown in the search results. Does that mean that any links in the duplicate content are completely ignored, or devalued, as far as the backlink profile of the site they link to? An example would be an article published on two or three major industry websites. Are only the links from the first website GoogleBot discovers the article on counted, or are all the links counted and you just won't see the article itself come up in search results for the second and third website?

    Intermediate & Advanced SEO | | Consult1901
    0

  • Hi All, Recently I've added different regions (website.com/se/ etc.) to Google Search Console and pointed them to their relevant countries, but only half are working when I search from a region's IP with a VPN and use the correct Google search (Google.se etc.). Will this correct itself over time, or is something else causing them not to be indexed correctly? Thanks in advance
    | Country | Appears in SERP (17/12/2018) |
    | AU | TRUE |
    | CZ | TRUE |
    | DK | TRUE |
    | HK | TRUE |
    | IE | TRUE |
    | IT | TRUE |
    | KR | TRUE |
    | NL | TRUE |
    | NZ | TRUE |
    | SE | TRUE |
    | SG | TRUE |
    | US | TRUE |
    | ZA | TRUE |
    | AE | FALSE |
    | AT | FALSE |
    | CH | FALSE |
    | CN | N/A |
    | DE | FALSE |
    | EE | FALSE |
    | ES | FALSE |
    | FI | FALSE |
    | FR | FALSE |
    | GB | FALSE |
    | GR | FALSE |
    | JP | FALSE |
    | NO | FALSE |
    | PL | FALSE |
    | RU | FALSE |
    | SI | FALSE |
    | TR | FALSE |

    International SEO | | WattbikeSEO
    0

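One technique often paired with GSC geotargeting for a setup like this is hreflang annotations, so Google knows which regional folder to serve in each country's SERP. A minimal sketch of generating them (the example.com domain, the en-* language codes, and the subfolder pattern are all assumptions, not details from the post):

```python
# Hypothetical sketch: generate hreflang <link> tags for regional subfolders
# like /se/ or /au/. The base URL and en-* language codes are assumptions.

REGIONS = ["au", "se", "de", "fr", "gb"]  # illustrative subset

def hreflang_tags(base="https://www.example.com", path="/"):
    tags = [
        f'<link rel="alternate" hreflang="en-{r.upper()}" href="{base}/{r}{path}" />'
        for r in REGIONS
    ]
    # x-default tells Google which URL to serve searchers in untargeted regions
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{base}{path}" />')
    return "\n".join(tags)
```

Each regional page would need the full set of tags (including a self-reference), and the annotations must be reciprocal across all regional versions for Google to honor them.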
  • This page on our client's website seems to be an absolute magnet for bots, and it's skewing our Google Analytics stats: https://cbisonline.com/us/catholic-socially-responsible-esg-investing/proxy-voting/ We already filter out lots of bots in GA, primarily through a segment we created several years ago and continue to build upon, but plenty of spam traffic still manages to slip through – mostly to the page above. Last quarter, almost all of it came from two random cities in Europe, so we're going to filter out traffic from those places. (At least for now – not an ideal solution, I know.) But I'm really wondering what drives so many bots to that page in particular. Any insights would be greatly appreciated!

    Reporting & Analytics | | matt-14567
    0

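One way to investigate what is actually hitting a bot-magnet page like this is to tally user agents from the raw server access log rather than relying on GA (which the bots may be spoofing). A minimal sketch, assuming a combined-format access log (the log lines and page path are illustrative, and the function name is hypothetical):

```python
# Hypothetical sketch: count requests to one page by user agent, from a
# combined-format access log, to see which agents are hammering it.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+)[^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

def top_agents(lines, page="/us/catholic-socially-responsible-esg-investing/proxy-voting/", n=5):
    counts = Counter()
    for line in lines:
        m = LINE.search(line)
        if m and m.group("path").startswith(page):
            counts[m.group("ua")] += 1
    return counts.most_common(n)
```

Grouping the same data by client IP instead of user agent would also confirm whether the "two random cities in Europe" traffic is really a handful of hosts.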
  • Hello, I have set up a redirect, but after 3 weeks I still see my old page in Google's index. My new page is there too. Is it normal that the old page hasn't been dropped from the index yet? Thank you,

    Intermediate & Advanced SEO | | seoanalytics
    0

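A few weeks is not unusual, but it is worth confirming the old URL really returns a permanent 301 (Google treats a 301 as the strongest signal to swap the old URL for the new one, while a 302 can leave the old URL indexed longer). A minimal sketch for interpreting one redirect hop once you have the status code and Location header, say from curl (the function name and return labels are hypothetical):

```python
# Hypothetical sketch: classify the redirect response an old URL returns,
# when auditing why it is still indexed.
from urllib.parse import urljoin

def classify_redirect(status, location, old_url, expected_new_url):
    """Interpret one hop of the old URL's redirect."""
    if status == 301 and location and urljoin(old_url, location) == expected_new_url:
        return "permanent-ok"   # correct setup; deindexing is just slow
    if status in (302, 303, 307):
        return "temporary"      # consider switching to a 301
    if status == 200:
        return "no-redirect"    # the redirect isn't actually live
    return "other"
```

If the hop classifies as `permanent-ok`, requesting recrawling of the old URL in Search Console usually speeds up the swap.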
  • Hello, all! I have a client that is a Fortune 500 company and has all the good "stuff" associated with pulling proper info into the knowledge graph / company information box: Wikipedia, strong citations, etc. But the box is showing the old CEO's name, although we mention it neither on Wikipedia nor on our website; Google is still picking up the previous CEO's name from somewhere else. How can I change it? Thanks!

    Local Website Optimization | | dhananjay.kumar1
    0

  • Hi Guys, We have a blog for our e-commerce store, with a full-time in-house writer producing content. As part of our process we write content briefs, and as part of each brief we analyze competing pieces of content on the web. Most of the time the sources are large publications (i.e. HGTV, Elle Decor, Apartment Therapy, House Beautiful, NY Times, etc.). The analysis is basically a summary/breakdown of the article, and is sometimes 2-3 paragraphs long for longer pieces of content. The competing-content analysis is used to create an outline of our article and incorporates most, but not all, of the important details/facts from competing pieces. Most of our articles run 1500-3000 words. Here are the questions (NOTE: the summaries are written by us, not copied/pasted from other websites): Would it be considered duplicate content, or bad SEO practice, if we list the sources/links we used at the bottom of our blog post, along with the summary from our content brief? Could this be beneficial as far as SEO? If we do this, should we nofollow the links, or use regular dofollow links? For example: "For your convenience, here are some articles we found helpful, along with brief summaries." I want to use as much of the content we have spent time on as possible. TIA

    White Hat / Black Hat SEO | | kekepeche
    1

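For what it's worth, a nofollowed source list is straightforward to template. A minimal sketch of rendering the "articles we found helpful" block (the data and function name are illustrative, not from the actual blog):

```python
# Hypothetical sketch: render a "sources we found helpful" list with
# nofollowed links, crediting the originals without passing link equity.

SOURCES = [
    {"title": "Example decor guide", "url": "https://example.com/guide",
     "summary": "Key points we drew on for our outline."},
]

def render_sources(sources, nofollow=True):
    rel = ' rel="nofollow"' if nofollow else ""
    items = [
        f'<li><a href="{s["url"]}"{rel}>{s["title"]}</a>: {s["summary"]}</li>'
        for s in sources
    ]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"
```

Flipping `nofollow=False` produces the dofollow variant, so the follow/nofollow decision stays a one-line change while the summaries themselves remain original on-page content.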