
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi there, I'm trying to implement Google AdWords conversions on a particular client's website. They have used Bootstrap as the framework for their site and mainly open contact forms within a Bootstrap modal after a button is clicked. See here: http://www.gtwstorage.co.uk/ I thought I had successfully implemented the AdWords conversion tracking; however, it has been a week now and my conversions still say they are "unverified". I wonder if anyone else has encountered this before and knows what I might be doing wrong. Thank you in advance, Darren

    Paid Search Marketing | | SEODarren
    0
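
    A common cause of "unverified" conversions with modal forms is firing the tag on the click that opens the modal rather than on the form's actual submission. As a reference point only, a minimal sketch using Google's current gtag.js event API; the form selector and the AW-XXXXXXXXX/YYYYYYY conversion ID/label are placeholders, and it assumes the gtag.js snippet is already loaded on the page:

        <script>
          // Fire the conversion only when the form inside the modal is submitted,
          // not when the button that opens the Bootstrap modal is clicked.
          document.querySelector('#contact-form').addEventListener('submit', function () {
            gtag('event', 'conversion', { 'send_to': 'AW-XXXXXXXXX/YYYYYYY' }); // placeholder ID/label
          });
        </script>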

  • Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but all of it was in the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we can have pure articles starting 1st June 2012! Therefore: my dynamic sitemap now contains only articles with a release date between 1st June 2012 and now, and any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual content of the article. The question is how I can remove from the Google index, as fast as possible, all this junk that is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals. The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back into the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong: as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existing, and will report errors in Webmaster Tools. Should I submit a deleted-items sitemap using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think that is for custom search engines only, not for the generic Google search engine. The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead ugly GET params, and a folder-based pattern is impossible, since all articles (removed junk and actual articles alike) are of the form http://www.example.com/docid=123456. So, how can I bulk-remove all the junk from the Google index... relatively fast?

    Intermediate & Advanced SEO | | ioannisa
    0
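
    One commonly recommended alternative to a noindex 404 for bulk cleanup is answering 410 Gone for the junk docids, since Google generally treats 410 as a stronger "permanently removed" signal than 404. A minimal sketch in Node/Express, assuming a hypothetical getArticle() lookup and that docid arrives as a query parameter:

        const express = require('express');
        const app = express();

        const CUTOFF = new Date('2012-06-01'); // everything released before this is junk

        app.get('/', (req, res) => {
          const article = getArticle(req.query.docid); // hypothetical database lookup
          if (!article || article.releaseDate < CUTOFF) {
            // 410 tells crawlers the document is gone for good, which usually
            // drops it from the index faster than a 404 would.
            return res.status(410).send('Gone');
          }
          res.send(article.html);
        });

        app.listen(3000);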

  • My site is experiencing a week-over-week decrease in organic traffic for the last two weeks and, for the first time all year, is showing a decrease compared to last year's traffic for the same weeks. At first I thought this was a seasonal pattern due to spring break (we are mostly B2B), but the dip has sustained for another week. The only changes made during this time period were a few on-page updates and some title tag updates to a specific group of pages. However, the decrease is sitewide, including branded clicks and impressions. I haven't noticed any changes in rankings. Impressions and clicks are down per Search Console, but CTR and average rank haven't changed. Is it possible that we've been penalized or hit by an algo shift? What's the best way to know for sure?

    Technical SEO | | cckapow
    0

  • Hi, I recently implemented a firewall on my website to prevent hacking attacks. We were getting a crazy number of people per day trying to brute-force our website. I used the Sucuri cloud proxy firewall service, which they claim actually helps SEO because of the super-fast caching. I was just wondering: is this true? Because we're slowly falling further and further down the SERPs and I really don't know why. If not, is there any major Google update recently I don't know about? Thanks, Robert

    Web Design | | BearPaw88
    0

  • Bazaar Voice provides a pretty easy-to-use product review solution for websites (especially sites on Magento): https://www.magentocommerce.com/magento-connect/bazaarvoice-conversations-1.html If your product has over a certain number of reviews/questions, the plugin cuts off the number of reviews/questions that appear on the page. To see the reviews/questions that are cut off, you have to click the plugin's next or back function. The next/back buttons' URLs have a parameter of "bvstate....." I have noticed Google is indexing this "bvstate..." URL for hundreds of sites, even with the proper rel canonical tag in place. Here is an example with Microsoft: http://webcache.googleusercontent.com/search?q=cache:zcxT7MRHHREJ:www.microsoftstore.com/store/msusa/en_US/pdp/Surface-Book/productID.325716000%3Fbvstate%3Dpg:8/ct:r+&cd=2&hl=en&ct=clnk&gl=us My website is seeing hundreds of these "bvstate" URLs being indexed even though we have a proper rel canonical tag in place. It seems that Google is ignoring the canonical tag. In Webmaster Console, the main source of my duplicate titles/metas in the HTML Improvements section is the "bvstate" URLs. I don't necessarily want to block "bvstate" in robots.txt, as that would prohibit Google from seeing the reviews that were cut off. The same concern applies to prohibiting Google from crawling "bvstate" via the Parameters section of Webmaster Console. Should I just keep my fingers crossed that Google honors the rel canonical tag? Home Depot is another site that has this same issue: http://webcache.googleusercontent.com/search?q=cache:k0MBLFcu2PoJ:www.homedepot.com/p/DUROCK-Next-Gen-1-2-in-x-3-ft-x-5-ft-Cement-Board-172965/202263276%23!bvstate%3Dct:r/pg:2/st:p/id:202263276+&cd=1&hl=en&ct=clnk&gl=us

    Intermediate & Advanced SEO | | redgatst
    1
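
    For context, rel=canonical is a hint rather than a directive, so Google can choose to ignore it when the paginated review states look sufficiently different from the base page. The tag served on every bvstate URL should point at the clean product URL, along these lines (URLs are illustrative):

        <!-- Served on .../productID.325716000?bvstate=pg:8/ct:r -->
        <link rel="canonical" href="https://www.example.com/store/pdp/productID.325716000" />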

  • What is the best tool to check for a Google penalty, and to identify which penalty hit the website?

    Intermediate & Advanced SEO | | Michael.Leonard
    0

  • We have a client who once per month is being hit by easyihts4u.com, and it is creating huge increases in their referrals. All the hits go to one page specifically. From the research we have done, this site and others like it are not spam bots. We cannot understand how they choose sites to target, and what good it does them, or our client, to have hits all on one day to one page. We created a filter in Analytics to give what we think is a more accurate reflection of traffic. Should we block them at the server level as well?

    White Hat / Black Hat SEO | | Teamzig
    0
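
    If server-level blocking is wanted on top of the Analytics filter, a referrer match is a common approach. A sketch for Apache .htaccess, assuming mod_rewrite is available:

        RewriteEngine On
        # Return 403 Forbidden to any request referred by the spam domain
        RewriteCond %{HTTP_REFERER} easyihts4u\.com [NC]
        RewriteRule .* - [F,L]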

  • If I recall correctly, we used to be able to change our title tags dynamically based on the search query, but I'm not sure if that's possible now or if it makes sense to do so. Thoughts? Rosemary

    On-Page Optimization | | RosemaryB
    1

  • Hi guys, I'm putting together a proposal for a new site and trying to figure out which would be better: (A) have the keyword split across multiple directories, or (B) duplicate a word so the full keyword appears hyphenated? For example, for the topic of "Christmas decor", would you use (A) www.domain.com/Christmas/Decor or (B) www.domain.com/Christmas/Christmas-Decor? In example B the word "Christmas" is duplicated, which looks a little spammy, but the key term "Christmas decor" is in the URL without being broken up by directories. Which is stronger? Any advice welcome! Thanks guys!

    Intermediate & Advanced SEO | | JAR897
    1

  • Hello, I have 72 external broken links that are reported with an HTTP 503 status. When I asked the owner of that site, he confirmed he had removed them from GA by using this status. My question is: does this have an impact on the quality of my site? The site is available, and there is no delay when you open the link. Thank you for your support. br
    Kjersti Bakke

    Moz Pro | | kjerstibakke
    0

  • Hi, We had a content manager request to delete a page from our site. Looking at the traffic to the page, I noticed there were a lot of inbound links from credible sites. Rather than deleting the page, we simply removed it from the navigation, so that a user could still access the page by clicking on a link to it from an external site. Questions: Is it bad for SEO to have a page that is not directly accessible from your site? If no: do we keep this page in our Sitemap, or remove it? If yes: what is a better strategy to ensure the inbound links aren't considered "broken links" and also to minimize any negative impact to our SEO? Should we delete the page and 301 redirect users to the parent page for the page we had previously hidden?

    Intermediate & Advanced SEO | | jnew929
    0
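
    If the page is eventually deleted, a page-level 301 preserves most of the inbound link equity. A one-line Apache sketch with hypothetical paths:

        # Permanently redirect the retired page to its parent
        Redirect 301 /old-hidden-page/ /parent-page/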

  • I know that ideally businesses that operate as franchises should have 1 site with separate location pages. However, I have a slightly different issue. Each location is owned by a different parent company, and named accordingly. For example, there is "Location by XYZ Company" and "Location by ABC Company." In addition, each location, while carrying similar products, does not carry the same exact products and brands. So my question is how would you go about writing the content for each of these sites, keeping the same tone but avoiding duplicate content?

    Content Development | | GavinAdv
    1

  • I have multiple URLs that all lead to the same website. Years ago they were purchased and were sitting dormant. Currently they are 301 redirects, and each of the URLs feeds to a different area of my website. Should I be worried about losing authority? And if so, is there a better way to do this?

    Technical SEO | | undrdog99
    0

  • Hello, I have a question in regard to international SEO and the hreflang meta tag. We are currently a B2B business in the UK. Our major market is England, with some exceptions of sales internationally. We want to increase our rankings in other English-speaking countries and regions such as Ireland and the Channel Islands. My research has found regional Google search engines for Ireland (google.ie), Jersey (google.je) and Guernsey (google.gg). Now, all these regions have English as one of their main languages, and here is my question: because I use hreflang=“en-gb” as my site language, am I regionally excluding these countries and islands? If I used hreflang=“en”, would it include these English-speaking regions and possibly increase ranking on these regional search engines? Thank you,

    Intermediate & Advanced SEO | | SilverStar1
    1
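
    A sketch of the usual pattern: one hreflang annotation per targeted locale, plus a generic "en" or x-default to catch the remaining English-speaking regions. The same URL may carry several annotations, the set must appear on every version of the page, and these URLs are placeholders:

        <link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
        <link rel="alternate" hreflang="en-ie" href="https://www.example.co.uk/ie/" />
        <link rel="alternate" hreflang="en" href="https://www.example.co.uk/" />
        <link rel="alternate" hreflang="x-default" href="https://www.example.co.uk/" />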

  • Lately we have been applying structured data to the main content body of our clients' websites. Our lead developer had a good question about HTML, however: in JSON-LD, what is the proper way to embed content from a data field that has HTML markup (i.e. p, ul, li, br tags) into mainContentOfPage? Should the HTML be stripped out or escaped somehow? I know that applying schema to the main body content is helpful for Googlebot, but should we keep the HTML? Any recommendations or best practices would be appreciated. Thanks!

    Intermediate & Advanced SEO | | RosemaryB
    0
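
    A common convention, since structured data is meant to describe the visible text rather than its markup, is to strip the tags and let JSON string escaping handle quotes and special characters. A hedged sketch with illustrative content:

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "WebPage",
          "mainContentOfPage": {
            "@type": "WebPageElement",
            "text": "Plain-text version of the body, with the p/ul/li/br markup stripped and any quotes JSON-escaped."
          }
        }
        </script>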

  • Looking for SEOs who have experience with resetting projects by migrating onto a new domain to shed either a manual or algorithmic penalty. My questions are: for algorithmic penalties, what is the best migration strategy to avoid inheriting any kind of baggage: 301, 302, or establishing no connection between the two sites? For manual penalties, what is the best migration strategy to avoid inheriting any kind of baggage: 301, 302, or establishing no connection between the two sites? Any other input on these kinds of reset projects is appreciated.

    Intermediate & Advanced SEO | | spanish_socapro
    0

  • So, I look after a site for my family business. We have teamed up with the third-party site TrustPilot because we like the way it enables us to send out review invitations to our customers directly from our system. It's been going great and some of the reviews have been brilliant. I have used a couple of these reviews on our site and marked them up with: REVIEW CONTENT We work in the service industry, and one of the problems we have found is getting our customers to actually go online and leave a review. They normally just leave their comments on a job sheet that the workers have signed when they leave. So I have created a page on our site where we post some of the reviews the guys receive too. I have used the following: REVIEW TITLE REVIEW Written by: CUSTOMER NAME Type of Service: House Removal Date published: DATE PUBLISHED 10 / 10 stars I was told that this could be against Google's guidelines, and as I've seen a bit of a drop in our rankings in the last week or so, I'm a little concerned. Is this getting me penalised? Should I not use my reviews referencing the ones on TrustPilot, and should I not have my own reviews page with rich snippets?

    Web Design | | BearPaw88
    1
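
    For context, Google's guidelines expect marked-up reviews to be first-party reviews collected on your own site, so reproducing third-party (TrustPilot) reviews with review markup is the risky part. Where reviews are legitimately first-party, the usual shape is something like this sketch (all values are placeholders):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Review",
          "itemReviewed": { "@type": "LocalBusiness", "name": "Example Removals Ltd" },
          "author": { "@type": "Person", "name": "Customer Name" },
          "datePublished": "2016-04-01",
          "reviewBody": "Review text goes here.",
          "reviewRating": { "@type": "Rating", "ratingValue": "10", "bestRating": "10" }
        }
        </script>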

  • Hi guys, I recently ran into an issue when testing the load speed of my website (https://solvid.co.uk). Occasionally, Google PageSpeed Insights gives me a server connection error which states: "PageSpeed was unable to connect to the server. Ensure that you are using the correct protocol (http vs https), the page loads in a browser, and is accessible on the public internet." GTmetrix gives me an error as well: "An error occurred fetching the page: HTTPS error: SSL connect attempt failed". All of my redirects seem to be set up correctly, as does the SSL certificate. I've contacted my hosting provider (GoDaddy) and they say everything is fine with the server and the installation. I've also tried different browsers in incognito mode, and I still get the same error. Until yesterday I hadn't had this problem. I would really appreciate your help! Dmytro

    Technical SEO | | solvid
    1
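
    When browsers load a site fine but testing tools cannot, a frequent culprit is a certificate served only via SNI, which some older clients fail to negotiate. A quick way to compare the two handshakes from the command line, using the hostname from the question:

        # Handshake with SNI (how modern browsers connect)
        openssl s_client -connect solvid.co.uk:443 -servername solvid.co.uk

        # Handshake without SNI (how some older tools connect)
        openssl s_client -connect solvid.co.uk:443

    If the second command returns a different certificate or an error, the server's SSL setup is SNI-only, which would explain the tools failing while browsers succeed.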

  • I'm noticing it even here on the Moz community site. Is an ending slash on a URL really a factor? Does it make a difference? Our website's homepage has a PA of 63 and a DA of 56, but all of our sub-pages have a PA of just 1, and they have been up for 4 months.

    Web Design | | serverleap
    1
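
    The slash itself is not a ranking factor, but /page and /page/ are technically distinct URLs, so serving both without a redirect can split links and metrics between two versions. A sketch of an Apache rule enforcing one canonical form (here, stripping the trailing slash from non-directory URLs; assumes mod_rewrite):

        RewriteEngine On
        # 301-redirect /path/ to /path when the target is not a real directory
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)/$ /$1 [R=301,L]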

  • Hello Mozers, Our sitemaps were submitted to Google and Bing, and are successfully indexed. Every time pages are added to our store (ecommerce), we re-generate the XML sitemap. My question is: should we be resubmitting the sitemaps every time their content changes, or since they were submitted once, can we assume that the crawlers will re-download the sitemaps by themselves (I don't like to assume)? What are best practices here? Thanks!

    Technical SEO | | yacpro13
    1
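
    Crawlers do re-fetch submitted sitemaps on their own schedule, but at the time of this question Google and Bing also supported an HTTP "ping" endpoint that could be hit automatically after each regeneration; a sketch (substitute your own sitemap URL):

        curl "https://www.google.com/ping?sitemap=https://www.example.com/sitemap.xml"
        curl "https://www.bing.com/ping?sitemap=https://www.example.com/sitemap.xml"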

  • My website www.aquatell.com was recently moved to the Shopify platform. We chose to use the http domain, because we didn't want to change too much, too quickly by moving to https. Only our shopping cart is using the https protocol. We noticed, however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of a page to the http version. What's got me puzzled, though, is that when I use Open Site Explorer to look at domain/page authority values, I get different scores for the http vs. https version, and the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks,

    On-Page Optimization | | Aquatell
    1

  • I am trying to figure out the fb:admins tag. I noticed that Moz uses a person's ID from HubSpot; is this because HubSpot connects into their Facebook data? Who should my company set it to? Any guides would be helpful. Thanks, Chris

    Social Media | | Autoboof
    0
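
    For reference, fb:admins is an Open Graph meta tag listing the numeric Facebook user IDs that are granted access to the page's Insights data, so a company would normally set it to its own admins' IDs (and/or use fb:app_id to tie Insights to an app instead). The IDs below are placeholders:

        <meta property="fb:admins" content="1234567890" />
        <!-- or, to tie Insights to an app rather than individual admins: -->
        <meta property="fb:app_id" content="210987654321" />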

  • Hi all, First question here, but I've been lingering in the shadows for a while. As part of my company's digital marketing plan for the next financial year, we are looking at benchmarking against certain KPIs. At the moment I simply report our conversion rate as Google Analytics displays it. I was incorrectly under the impression that it was reported as unique visits / total orders, but I've now realised it's sessions / total orders. At my company we have quite a few repeat purchasers. So, is it best that we stick to the sessions / total orders conversion rate? My understanding is that multiple sessions from the same visitor would all count towards this conversion rate, whereas our repeat purchasers wouldn't be captured under the unique visits / total orders method. It's almost as if we have to consider every session an opportunity to convert. The flip side is that for some of our higher-margin products, customers may visit multiple times before making a purchase. I should probably add that I'll be benchmarking data based on averages from 1st April to 31st March, which is a financial year in the UK. The other KPI we will be benchmarking against is visitors. Should we change this to sessions if we will be benchmarking conversion rate using the sessions formula? This could help with continuity and could also help to reveal whether our planned content marketing efforts are engaging users. I hope this makes sense, and thanks for reading and offering advice in advance. Joe

    Reporting & Analytics | | joe-ainswoth
    1
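
    To make the difference concrete with illustrative numbers: 10,000 sessions from 6,000 users producing 300 orders gives a session-based conversion rate of 300 / 10,000 = 3%, but a user-based rate of 300 / 6,000 = 5%. The session-based figure (Google Analytics' default) treats every visit as a conversion opportunity, which fits a repeat-purchase business; whichever definition is chosen, the visitors/sessions KPI should use the same denominator so the benchmarks stay comparable.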

  • I have a client who has an opportunity to sponsor a prestigious national organisation. In return, they will get 200+ links from the websites of the organisation's members, most of which are DA 40+. None of the linking sites have any relevance to my client's industry. 1) How likely do you think it is that Google will view these as paid links? 2) Do you feel that there is potential harm in gaining this many non-relevant links in a short time-frame? 3) The client wants me to quantify the ranking benefits of gaining these links and calculate a potential ROI. If that's even possible, how would you go about that? Thanks in advance

    Link Building | | richdan
    2

  • Hi guys, We have a unique issue with Google Analytics reporting for one of our sites. GA is reporting sessions for 404 pages (landing pages, organic traffic), e.g. for this page: http://www.milkandlove.com.au/breastfeeding-dresses/index.php. The page is currently a 404, but GA is reporting organic traffic to it as a landing page. Does anyone know any reasons why this is happening? Cheers.

    Reporting & Analytics | | jayoliverwright
    2
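
    For context, Google Analytics records a pageview whenever its tag executes, regardless of the HTTP status code, so a 404 template that still includes the tracking snippet will keep reporting sessions. A common workaround is to label 404 hits so they can be segmented or filtered; a sketch for the classic analytics.js tag, placed only on the 404 template:

        <script>
          // Send a distinguishable pageview so 404 traffic is separable in reports
          ga('send', 'pageview', {
            'title': '404 Not Found',
            'page': '/404.html?from=' + document.location.pathname
          });
        </script>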

  • I tried adding onto a question already listed; however, that question stayed where it was and didn't go anywhere close to somewhere others would see it, since it was from 2012. I have a competitor who is completely new, having just popped onto the SERPs in December 2015. I've wondered how they jumped up so fast without much in the way of user content. Upon researching them, I saw they have 200 backlinks, but 160 of them are from their parent company, and of all places coming from the footer of the parent company's site. So they get all of the pages of that domain as backlinks. Everything I've read has told me not to do this: it will harm the site badly or, if anything, the links will be discounted. I'm in no way interested in doing what they did, even if it resulted in page 1 (which it has done for them), since I believe it's only a matter of time, and once that time comes, it won't be a 3-month recovery, it might be worse. What do you all think? My question, or discussion, is: why hasn't this site been penalized yet, will they be penalized, and if not, why not? And what is the good, the bad, and the ugly of backlinks in the footer?

    White Hat / Black Hat SEO | | Deacyde
    0

  • Hello, We're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters: topics and object types. The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean: www.domain.com/project?topic=firstTopic or www.domain.com/project?object=typeOne. The problems arise when people select multiple topics, potentially across two different filter types: www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo. I've raised concerns around the structure in general, but it seems to be too late at this point, so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns: (1) a ton of near-duplicate content and hundreds of URLs being created and indexed with various filter combinations added; (2) over-reacting to the first point and over-canonicalizing/no-indexing combination pages to the detriment of the content as a whole. Would the best approach be to index each single topic filter individually, and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with), so any advice is greatly appreciated. Thanks!

    Intermediate & Advanced SEO | | digitalcrc
    0
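
    The approach floated at the end of the question matches a fairly standard faceted-navigation pattern: let each single-filter URL stand on its own, and canonicalize multi-filter combinations to the unfiltered view. A sketch using the question's own URLs:

        <!-- On /project?topic=firstTopic : indexable, self-referencing canonical -->
        <link rel="canonical" href="https://www.domain.com/project?topic=firstTopic" />

        <!-- On /project?topic=firstTopic-secondTopic&object=typeOne : consolidate to the 'view all' page -->
        <link rel="canonical" href="https://www.domain.com/project" />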
