
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • What would you say to a client who is concerned he'd have to run around buying his .com.??? in a lot of other countries? Thanks!

    | 94501
    0

  • I have been trying to figure this out - https://moz.com/community/q/what-should-i-do-with-old-e-commerce-item-pages I added n/a to the end of the page titles so I could figure out how these pages were performing. Since I added them, my organic traffic seems to have dropped. It has only been a few days, so maybe it is an anomaly. Everything else has stayed the same - would this cause an organic traffic drop?

    | EcommerceSite
    0

  • Hi guys, I own a London based rubbish removal company, but don't have enough jobs. I know for sure that some of my competitors get most of their jobs through Google searches. I also have a website, but don't receive calls from it at all. Can you please tell me how to rank my website for keywords like "rubbish removal london", "waste clearance london", "junk collection london" and other similar keywords? I know that for a person like me (without much experience in online marketing) optimizing the website will be a difficult task, but at least I need some advice on where to start. I'm also thinking of hiring an SEO but not sure where to find a trusted company. Most importantly, I have no idea how much I should pay to expect good results. What is too much and what is too low? I will appreciate any advice.

    | gorubbishgo
    0

  • I am not sure how to fix some errors that are popping up in Google Search Console. The response codes shown are all 500 errors. I need some advice as to how to fix these. What are my options?

    | pmull
    0

  • Hello everyone, I have a pretty large e-commerce website and a bunch (about 1,000) affiliates using our in-house affiliate system we built several years ago (about 12 years ago?). All our affiliates link to us as follows: http://mywebsite.com/page/?aff=[aff_nickname] Then our site parses the request, stores a cookie to track the user, then 301 redirects to the clean page URL below: http://mywebsite.com/page/ Since 2013 we require all affiliates to link to us by using the rel="nofollow" tag to avoid any penalties, but I still see a lot of affiliate links not using the nofollow or old affiliates that have not updated their pages. So... I was reading on this page from Google, that any possible "scheme" penalization can be fixed by using either the nofollow tag or by using an intermediate page listed on the robots.txt file: https://support.google.com/webmasters/answer/66356?hl=en Do you think that could really be a reliable solution to avoid any possible penalization coming from affiliate links not using the "nofollow" tag? I have searched and read around the web but I couldn't find any real answer to my question. Thanks in advance to anyone. Best, Fab.

    | fablau
    0
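The affiliate-link audit described in the question above can be automated. A minimal sketch using only the Python standard library, assuming the `?aff=` URL format from the question; the sample HTML and affiliate nicknames are hypothetical:

```python
# Audit sketch: flag affiliate links that are missing rel="nofollow".
# The "?aff=" marker and sample URLs follow the format described in the
# question and are otherwise hypothetical.
from html.parser import HTMLParser

class AffiliateLinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = (attrs.get("rel") or "").lower().split()
        if "?aff=" in href and "nofollow" not in rel:
            self.missing_nofollow.append(href)

sample = '''
<a href="http://mywebsite.com/page/?aff=alice" rel="nofollow">ok</a>
<a href="http://mywebsite.com/page/?aff=bob">needs fixing</a>
'''
audit = AffiliateLinkAudit()
audit.feed(sample)
print(audit.missing_nofollow)  # ['http://mywebsite.com/page/?aff=bob']
```

Run against each affiliate's pages, this produces a punch list of links to chase, which is more actionable than waiting for a manual action.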

  • Hello guys, one of my friends' websites has seen its domain authority decrease since they moved their domain from HTTP to HTTPS. 
    There is another problem: his blog is in a subfolder still served over HTTP.
    So, can you guys please tell me how to fix this issue? It is also losing some rankings, dropping 2-5 positions. Here is the website URL: myfitfuel.in/
    here is the blog URL: myfitfuel.in/mffblog/

    | Max_
    0

  • I am about to help release a product which also gives people a site seal for them to place on their website. Just like the geotrust, comodo, symantec, rapidssl and other web security providers do.
    I have noticed that all the site seals by these companies never have nofollow on the seals that link back to their websites. So I am wondering what is the best way to do this. Should I have a nofollow on the site seal that links back to our domain, or is it safe to not have the nofollow?
    It won't be doing any keyword stuffing or anything; it will probably just have our domain in the link and that is all. The problem, too, is we won't have any control over where customers place these site seals. From experience I would say they will most likely always be placed in the footer on every page of the client's website. I would like to hear any and all thoughts on this, as I can't get a proper answer anywhere I have asked.

    | ssltrustpaul
    0

  • Hi Moz community, Since I have many products in most of my pages which have the targeted keyword in the product name I get the "Keyword Stuffing" error. Is it really considered as "Keyword Stuffing" by Google? In addition to the products, I have some texts containing the targeted keyword for the page and this makes the number of keywords used in a page even higher. Thank you for your answers.

    | onurcan-ikiz
    0

  • Hi - our company just completely redesigned our website and went from a static HTML site to a PHP-based site, so every single URL has changed (around 1500 pages). I put the same verification code into the new site and re-verified, but now Google is listing tons and tons of 404s. Some of them are really old pages that haven't existed in a long time; it would literally be impossible to create redirects for all the 404s it's pulling. Question - when completely changing a site like this, should I have created a whole new Search Console property? Or did I do the right thing by using the existing one?

    | Jenny1
    0

  • Hi, Has anyone tried starting a new domain after being hit with a Penguin penalty? I'm considering the approach outlined here: https://searchenginewatch.com/sew/how-to/2384644/can-you-safely-redirect-users-from-a-penguin-hit-site-to-a-new-domain. In a nutshell, de-index the OLD site completely via Google's Removal Tool, and then relaunch old content under new domain. This seems to have merit, unless Google keeps a hidden cache of content (or uses other sources like Wayback Machine). My concern is doing the above listed approach, but Google still passes the old links to the new domain. We have great content, but too much spam (despite me removing a lot of the links + disavow). Any feedback based on experience would be appreciated. Thanks.

    | mrodriguez1440
    1

  • Hi everyone, we set up a website for an automobile reseller. The site has been online for about 10 days now and is doing OK. The competition is at a medium level. The URL is http://fahrzeugankauf-wehrle.de/ Now I wonder how I can improve the internal linking a little more. I already read this one https://moz.com/learn/seo/internal-link but still wonder whether I should link from the sub-pages with the main keywords like "auto kaufen freiburg" or "autoankauf freiburg" to the main page.
    Right now I am instead linking from the main page to the subpage "Auto verkaufen". Isn't this a bit contradictory?

    | RWW
    0

  • I am trying to rank for a local SEO term on a website for a national company.   Should I write an optimized blog post, or optimized site page?  Does it make a difference? Thanks!

    | aj613
    0

  • Our team's QA specialist recently noticed that the class identifier URLs via productontology are 404ing out, saying that "There is no Wikipedia article for (particular property)". They are even 404ing for productontology URLs that are examples on the productontology.org website! Example: http://www.productontology.org/id/Apple The 404 page says that the wiki entry for "Apple" doesn't exist (lol). Does anybody know what is going on with this website? This service was extremely helpful for creating additionalType categories for schema categories that don't exist on schema.org. Are there any alternatives to productontology now that these class identifier URLs are 404ing? Thanks

    | RosemaryB
    0

  • Hi all, hope you're all good. I am updating our disavow file; we've noticed a couple more spammy links which are pointing at our site. While I was at it, affiliate links came to mind. At the moment we have over 100k+ affiliate links pointing to the root of our site and other categories/products, most of them do-follow. However, taking a look at WMT, it's one of our biggest entries under 'Who links the most', and the affiliate network is pointing a total of 115,065 links to us. My question: bearing in mind this site generates over 2 million hits a month, is it really worth disavowing the entire affiliate link network? This would result in all of those 100,000 links being disavowed over time. Do you think this would have a positive result? Let me know your thoughts.

    | Brett-S
    0

  • Hello everyone. Our e-commerce website virtualsheetmusic.com has several hundred incoming affiliate links, and many of them are "follow" links. I thought to redirect all incoming affiliate links to an "intermediate" page excluded by the robots.txt file in order to avoid any possible "commercial links" penalty from Google, but I now face a dilemma... most of our best referral links are affiliate links, and excluding those links from our backlink profile could give us a big hit in terms of rankings. How would you solve this dilemma? What would you suggest doing in this sort of case?

    | fablau
    0

  • Hi Guys, For pagination, if you have implemented the rel Prev/Next tags correctly, is it fine to have duplicate titles in the series? Example: Title Tag: Black Dresses URL: http://www.site.com/blackdresses
    Title Tag: Black Dresses URL: http://www.site.com/blackdresses/2
    Title Tag: Black Dresses URL: http://www.site.com/blackdresses/3
    Title Tag: Black Dresses URL: http://www.site.com/blackdresses/4 Some people mention that you should make them unique and add the page number, for example: Title Tag: Black Dresses URL: http://www.site.com/blackdresses
    Title Tag: Black Dresses - Page 2 URL: http://www.site.com/blackdresses/2 
    Title Tag: Black Dresses - Page 3 URL: http://www.site.com/blackdresses/3 
    Title Tag: Black Dresses - Page 4 URL: http://www.site.com/blackdresses/4 Keen to hear what you guys think. Personally, I think it's fine to have duplicate title tags when you have properly implemented rel Prev/Next tags, as Google will see the series as one. Cheers.

    | jayoliverwright
    0
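The two title conventions being compared above can be generated mechanically. A small sketch that emits the title and rel prev/next tags for any page in the series; the `- Page N` suffix is one common convention, not a Google requirement, and the URLs follow the pattern in the question:

```python
# Sketch: emit the <title> and rel="prev"/"next" tags for one page of a
# paginated series. Page 1 lives at the base URL; deeper pages append /N.
# The "- Page N" title suffix is a convention, not a requirement.
def pagination_tags(base_url, title, page, total_pages):
    tags = [f"<title>{title}</title>" if page == 1
            else f"<title>{title} - Page {page}</title>"]
    if page > 1:
        prev_url = base_url if page == 2 else f"{base_url}/{page - 1}"
        tags.append(f'<link rel="prev" href="{prev_url}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{base_url}/{page + 1}">')
    return tags

for tag in pagination_tags("http://www.site.com/blackdresses",
                           "Black Dresses", 2, 4):
    print(tag)
```

Generating the tags from one template like this also guarantees the prev/next chain stays consistent when page counts change.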

  • I am not sure if I have a valid question to ask, but I am a bit stuck.
    We just got a campaign from one client. It's an automotive brand offering the same (classified) services to buy and sell used cars on their sites. The issue is they have different domains, which is expected given the different countries they operate in; however, those domains are not a consistent match as a brand, i.e. www.mtmotorslab.co.uk
    www.mtmotors.co.za
    www.motortrader.com.pk
    www.motortrader.in
    www.mtmotors.com.au My question here is: how could this situation of different TLDs as well as different domain names under the same umbrella impact our SEO efforts, or any effort to establish a strong brand? Many thanks

    | Mustansar
    0

  • I haven't made any changes to my site but in a week I am showing 30-40 soft 404s in Webmaster Tools. This just started happening in the last 2 weeks. When I click to go to the pages they are fine, and even fetch and render works fine on the pages.

    | EcommerceSite
    0

  • We have a prospective client that wants to start doing SEO for his Shopify site; we are unsure if this will be SEO friendly. Will we have enough control to get great placement? Or are we better off rebuilding the site for the client on OpenCart?

    | SEODinosaur
    0

  • I have a site where in the head section we specify a base href being the domain with a trailing slash, and a canonical link href being the relative link to the page: <base href="http://www.domain.com/" /> <link href="link-to-page.html" rel="canonical" /> I know that Google recommends using an absolute path as a canonical link, but is specifying a base href with a relative canonical link the same thing, or is it still seen as duplicate content?

    | Nobody1611699043941
    0
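How a relative canonical resolves against a `<base href>` can be checked directly; a minimal sketch with the standard library, using the URLs from the question. This shows the resolution is unambiguous for a conforming parser, not that every crawler honors it, which is why an absolute canonical remains the safer choice:

```python
# Sketch: resolve the relative canonical against the <base href>, the way
# a conforming parser would. The resolution itself is unambiguous; the
# open question is whether every crawler applies it, hence Google's
# advice to write canonicals as absolute URLs.
from urllib.parse import urljoin

base_href = "http://www.domain.com/"
relative_canonical = "link-to-page.html"

resolved = urljoin(base_href, relative_canonical)
print(resolved)  # http://www.domain.com/link-to-page.html
```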

  • Hi Guys, I recently rolled out a domain wide canonical tag change. Previously the website had canonical tags without the www, however the website was setup to redirect to www on page load. I noticed that the site competitors were all using www and as far as I understand www versus non www, it's based on preference. In order to keep things consistent, I changed the canonical tag to include the www. Will the site drop in rankings? Especially if the pages are starting to rank quite well. Any feedback is appreciated. Thanks!

    | QuickToImpress
    0

  • Hi everyone, I have been looking at removing ads from our videos - I have managed to disable the interest-based ones but can't remove the location-based ads. Our videos don't have copyrighted material either. How can I get these ads removed? Any help would be much appreciated :) Becky

    | BeckyKey
    0

  • Hi Moz community, I am getting the "Avoid Too Many Internal Links" error from Moz for most of my pages, and Google declared the max number as 100 internal links. However, most of my pages can't have fewer than 100 internal links, since it is a commercial website and there are many categories that I have to show to my visitors via the drop-down navigation bar. Without counting the links in the navigation bar, the number of internal links is below 100. I am wondering if the navigation bar links affect the link juice and are counted as internal links by Google. The same question also applies to the links in the footer. Additionally, how about the products? I have hundreds of products in the category pages, and even though I use pagination I still have many links in the category pages (probably more than 100 without even counting the navigation bar links). Does Google count the product links as internal links, and what is the effect on the link juice? Here is the website if you want to take a look: http://www.goldstore.com.tr Thank you for your answers.

    | onurcan-ikiz
    0
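Counting how many links a page actually exposes (navigation and footer included, since crawlers see those like any other links) can be done with a short script. A sketch assuming hypothetical markup; the hostname is the one from the question:

```python
# Sketch: count internal vs. external links, nav and footer included.
# The sample markup is hypothetical; the hostname is from the question.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        host = urlparse(href).netloc
        if not host or host == self.site_host:
            self.internal += 1   # relative or same-host link
        else:
            self.external += 1

sample = '''
<nav><a href="/rings">Rings</a><a href="/necklaces">Necklaces</a></nav>
<a href="http://www.goldstore.com.tr/product/1">Product</a>
<footer><a href="https://example.com/">External</a></footer>
'''
counter = LinkCounter("www.goldstore.com.tr")
counter.feed(sample)
print(counter.internal, counter.external)  # 3 1
```

Running this over real category pages gives an exact count to compare against the 100-link guideline, rather than estimating by eye.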

  • Started working on a site and learned that the person before me had done a fairly sketchy maneuver and am wondering if it's a net gain to fix it. The site has pages that it wanted to get third party links linking to. Thing is, the pages are not easy to naturally link to boost them in search. So, the woman before me started a new blog site in the same general topic area as the first/main site. The idea was to build up even the smallest bit of authority for the new blog, without tipping Google off to shared ownership. So, the new blog has a different owner/address/registrar/host and no Google Analytics or Webmaster Tools account to share access to. Then, as one method of adding links to the new blog, she took some links that originally pointed to the main site and re-directed them to the blog site. And voila! ...Totally controllable blog site with a bit of authority linking to select pages on the main site! At this point, I could un-redirect those links that give the blog site some of its authority. I could delete the links to the main site on the blog pages. However, on some level it may have actually helped the pages linked to on the main site. The whole thing is so sketchy I wonder if I should reverse it. I could also just leave it alone and not risk hurting the pages that the blog currently links to. What do you think? Is there a serious risk to the main site in this existing set up? The main site has hundreds of other links pointing to it, a Moz domain authority of 43, thousands of pages of content, 8 years old and Open Site Explorer Spam Score of 1. So, not a trainwreck of sketchiness besides this issue. To me, the weird connection for Google is that third party sites have links that (on-page-code-wise) still point to the main site, but that resolve via the main site's redirects to the blog site. BTW, the blog site points to other established sites besides the main site. So, it's not the exclusive slave to the main site. Please let me know what you think. 
Thanks!

    | 94501
    0

  • Working on a domain change for a client. They're hosted on Wordpress and their developer wants to simply switch out the DNS for the new domain to point to wordpress, and then have the old domain use 301s to redirect to the new domain. The url structure will be the same, but there will be no CMS connected to the old domain after the switch. Is this dangerous for SEO? A significant portion of their customers are from organic traffic and losing SEO value would be very bad.

    | dfolwell
    0

  • We're looking to add our products on Amazon to expand our reach to customers, but one question that I can't answer for sure is: will the Amazon product page outrank our online store's product page? I'm thinking that if we re-write the description and re-arrange the product name we'll be fine, but I'd like your guys' opinion on this. Extra info: the products we're selling aren't listed on Amazon; we're the only ones selling these. So we might be shooting ourselves in the foot by trying to sell on Amazon since no one else is selling this product. Any feedback is greatly appreciated!

    | FrankViolette
    0

  • In Search Console I am getting errors under other. It is showing urls that have this format- https://www.site.com/Item/654321~SURE⠃︲蝞韤諫䴴SPপ�.htm When clicked it shows 蝞韤諫䴴SPপ�  instead of the % stuff. As you can see this is an item page and the normal item page pulls up fine with no issues. This doesn't show it is linked from anywhere. Why would google pull this url? It doesn't exist on the site anywhere. It is a custom asp.net site. This started happening in mid May but we didn't make any changes then.

    | EcommerceSite
    0
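The "% stuff" in the report and the CJK-looking characters are two views of the same bytes: the browser percent-decodes the URL for display. A small sketch of the round trip; the sample string is an assumption, not the actual URL from the question:

```python
# Sketch: the percent-encoded form in the crawl report and the decoded
# characters in the browser are the same URL. The sample string here is
# an assumption, not the actual URL from the report.
from urllib.parse import quote, unquote

decoded = "SURE蝞"            # roughly what the console displays
encoded = quote(decoded)      # the "% stuff" form
print(encoded)
print(unquote(encoded) == decoded)  # True: one URL, two spellings
```

If decoding with a different charset than the one used to encode, the result is mojibake like the string in the question, which often points to a mangled link or a mis-declared encoding somewhere upstream.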

  • Hi I have a few hurdles I need some help with. I work on an ecommerce site, it's huge but I'm the only SEO. I've identified areas where we can make progress in rankings - we already have pages appearing for those terms, but they still need help to be pushed higher. My issue is - if it's a product listing page & we have all the SEO foundations here already, how can we push this even further? No one in the company wants too much content on this page taking away from products & I've tried adding a small amount - with no results. I know UX is something to be looked at, but I need support from others so this will be slow. Does anyone agree that writing user guides, hub pages & articles that this is a good next step? My only worry is - these pages will also be new and take time to rank so will they help the category pages enough? How can I maximise them? I'm also the only SEO, so really need to pick my battles. Thanks Becky

    | BeckyKey
    0

  • Question... I have a few clients that rank well for Google (first page) but are not ranking as well on Bing (page 5). Any suggestions on what to do to increase Bing ranking? I understand that Google and Bing use different algorithms, but just curious how you can increase ranking. Appreciate your input. Thanks

    | Kdruckenbrod
    0

  • Hi I've read various articles on this - some saying it's still important to have the keyword at the beginning of the title and some saying it's not a big factor anymore? Does anyone have an opinion on this?

    | BeckyKey
    0

  • For background, the sub-domain structure here is inherited and committed to due to tech restrictions with some of our platforms. The brand I work with is splitting out their global site into regional sub-sites (not too relevant, but this is in order to display seasonal product in different hemispheres and to link to stores specific to the region). All sub-domains except EU will be geo-targeted to their relevant country. Regions and sub-domains for reference: AU - Australia CA - Canada CH - Switzerland EU - All Euro zone countries NZ - New Zealand US - United States This will be done with Wordpress multisite. The setup allows publishing content on one 'master' sub-site and then deciding which other sub-sites to 'broadcast' to. Some content is specific to a sub-domain/region, so there is no issue with duplicates, and I can set the sub-site version as canonical. However some content will appear on all sub-domains: au.example.com/awesome-content/ nz.example.com/awesome-content/ Now the first question is, since these domains are geo-targeted, should I just have them all canonical to the version on that sub-domain? Or should I still signal the duplicate content with one canonical version? Essentially the top level example.com exists as a site only for publishing purposes - if a user lands on the top level example.com/awesome-content/ they are given a pop-up to select region and redirected to the relevant sub-domain version. So I'm also unsure whether I want that content indexed at all? I could make the top level example.com versions of all content be the canonical that all others point to, and rely on geo-targeting to have the right links show in the right search locations. I hope that's kind of clear?? Obviously I find it confusing and therefore hard to relay! Any feedback at all gratefully received. Cheers, Steve

    | SteveHoney
    0

  • In its structured data (i.e., Schema.org) documentation, Google says that the "image" property is required for the breadcrumbs data type. That seems new to me, and it seems unnecessary for breadcrumbs. Does anyone think this really matters to Google? More info about breadcrumbs data type:
    https://developers.google.com/search/docs/data-types/breadcrumbs I asked Google directly here:
    https://twitter.com/RyanRicketts/status/755478266878853122

    | Ryan-Ricketts
    0
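For reference, a minimal BreadcrumbList JSON-LD generator; it deliberately omits the `image` property the question asks about, since whether Google actually enforces that requirement is exactly the open point. The names and URLs are hypothetical:

```python
# Sketch: minimal BreadcrumbList JSON-LD. The "image" property discussed
# in the question is intentionally omitted; names and URLs are
# hypothetical.
import json

def breadcrumb_jsonld(crumbs):
    """crumbs: (name, url) pairs ordered from the root to the current page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([("Home", "https://example.com/"),
                         ("Widgets", "https://example.com/widgets/")]))
```

Pasting the output into Google's structured data testing tool is the quickest way to see whether the missing `image` is flagged as an error or only a warning.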

  • Alright, I am pretty sure I know the answer is "Nothing more I can do here." but I just wanted to double check.  It relates to the robots.txt file and that pesky "A description for this result is not available because of this site's robots.txt". Typically people want the URL indexed and the normal Meta Description to be displayed but I don't want the link there at all. I purposefully am trying to robots that stuff outta there.
    My question is, has anybody tried to get a page taken out of the Index and had this happen; URL still there but pesky robots.txt message for meta description?  Were you able to get the URL to no longer show up or did you just live with this? Thanks folks, you are always great!

    | DRSearchEngOpt
    0

  • Hi, We have had roughly 1000+ requests per 24 hours from Google-adsbot to our confirmation pages. This generates an error, as the confirmation page cannot be viewed after closing or by anyone who didn't complete the order. How is google-adsbot finding pages to crawl that are not linked to anywhere on the site, in the sitemap or anywhere else? Is there any harm in a Google crawler receiving a higher percentage of errors - even though the pages are not supposed to be requested? Is there anything we can do to prevent the errors for the benefit of our network team, and what are the possible risks of any measures we can take? This bot seems to be for evaluating the quality of landing pages used in AdWords, so why is it trying to access confirmation pages when they have not been set for any of our adverts? We included "Disallow: /confirmation" in the robots.txt but it has continued to request these pages, generating a 403 page and an error in the log files, so it seems Adsbot doesn't follow robots.txt. Thanks in advance for any help, Sam

    | seoeuroflorist
    0
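Google documents that AdsBot ignores the generic `User-agent: *` group unless the bot is named explicitly, which would explain why the existing `Disallow: /confirmation` is not honored. A sketch of a robots.txt with an explicit AdsBot group, sanity-checked with Python's standard-library parser (which does plain group matching only, not AdsBot's special-casing):

```python
# Sketch: name AdsBot-Google explicitly, since Google documents that it
# ignores the "User-agent: *" group. robotparser below only does ordinary
# group matching, so this is a syntax sanity check, not an AdsBot
# simulation.
from urllib import robotparser

robots_txt = """\
User-agent: AdsBot-Google
Disallow: /confirmation

User-agent: *
Disallow: /confirmation
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("AdsBot-Google", "https://example.com/confirmation"))  # False
print(rp.can_fetch("AdsBot-Google", "https://example.com/products"))      # True
```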

  • Hi I wanted to see what opinions were on having a product listings on paginated pages vs. loading as the user scrolls? We use pagination but I have heard scroll may be better for SEO? Thanks!

    | BeckyKey
    0

  • Hi I've been looking at how we paginate our product pages & have a quick question on canonicals. Is this the right way to display.. Or should the canonical point to the main page http://www.key.co.uk/en/key/euro-containers-stacking-containers, so Google doesn't pick up duplicate meta information? Thanks!

    | BeckyKey
    0

  • I have a large website with about 1300 pages. I can't find a good sitemap creator that will crawl the whole site and spit out the XML file. Any ideas or suggestions for good services? Also, for a site this large, should I consider multiple sitemaps?

    | dwebb007
    0
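On the multiple-sitemaps question: the sitemap protocol allows up to 50,000 URLs (and 50 MB uncompressed) per file before a sitemap index is needed, so 1300 pages fit comfortably in one. A minimal generator sketch with the standard library; the URLs are hypothetical:

```python
# Sketch: generate a sitemap with the stdlib. The protocol caps one file
# at 50,000 URLs / 50 MB uncompressed, so ~1300 pages fit in a single
# sitemap with no index file needed. URLs are hypothetical.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["https://example.com/",
                         "https://example.com/page-1"])
print(xml_out)
```

Feed it the URL list from any crawler export; the generation step itself is trivial once the crawl exists.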

  • Hey Mozzers! I have a client that I'm helping with some online ad campaigns for lead generation, but they recently had an SEO issue pop up I'm looking into for them. For whatever reason, they have 2 websites. Those are: http://www.healthsourceofroyalpalmbeach.com/ (newer site) http://www.healthsourcedecompression.com/ (older site) Their local listing is connected to the older site (above) and that's where they have all of their reviews. I know the BEST solution is probably to nix one of the sites and setup proper redirects, but how can they keep BOTH sites without damaging their SEO efforts? Currently, BOTH sites rank on page one for their primary kw target "chiropractors royal palm beach fl" Appreciate the help! Ricky

    | RickyShockley
    0

  • Hello, I want to replace the content of one page of our website (already indexed) and shift its original content to another page. How can I do this without problems like penalties etc.? Current situation: Page A
    URL: example.com/formula-1
    Content: ContentPageA Desired situation: Page A
    URL: example.com/formula-1
    Content: NEW CONTENT! Page B
    URL: example.com/formula-1-news
    Content: ContentPageA (the content that was in Page A!) Content of the two pages will be about the same topic (& same keyword) but non-duplicate. The new content in Page A is more optimized for search engines. How long will it take for the page to rank better?

    | daimpa
    0

  • Hi everyone - hope you are well. I can't get my head around why we are ranking 1st for a specific keyword, but then when 's' is added to the end of the keyword - we are ranking on the second page. What could be the cause of this? I thought that Google would class both of the keywords the same, in this case, let's say the keyword was 'button'. We would be ranking 1st for 'button', but 'buttons' we are ranking on the second page. Any ideas? - I appreciate every comment.

    | Brett-S
    0

  • Hi Moz, Sorry if this comes across as a "Do My Job For Me" type of post but we are an E-Commerce store that have been live since January but have not seen any increase in performance on our site and over the past month, have even seen our rankings decrease. We have 1300 products on site and about 1500 pages in total. 1. As for on-site optimization, we have got 2 reviews and follow up reviews with a highly reputable reviewer from People Per Hour and solved any issues she has found. 2. Updated the Meta Data for products and Alt Descriptions for images focusing on the keywords we wish to rank for. We post weekly blogposts linking back to our products. 3. Social Media Campaigns with regular campaigns on FaceBook, Pinterest, Google+ and Twitter. 4. Attempted to build FOLLOW backlinks to articles relating to products on our site. We have also considered purchasing backlinks to improve our situation as we have yet to see any of these pages be crawled by Google over a month later. I have read a guides on Moz and other sites on how to improve our authority and improve rankings but none have offered much by way of practical solution. My question being, is this just a matter of patience or should I be worried/improving anything given we have 0 Domain Authority and Page Authority on all pages? Thanking you in advance, SEO Novice.

    | csworkwear
    0

  • I made a small change on an ecommerce site that had big impacts I didn't consider... About six weeks ago, in an effort to clean up one of many SEO-related problems on an ecommerce site, I had a developer rewrite the URLs to replace underscores with hyphens and redirect all pages throughout the site to the new URL structure. We didn't immediately update our sitemap to reflect the changes (bad!) and I just discovered all the redirects are 302s... Since these changes, most of the pages have a page authority of 1 and we have dropped several spots in organic search. If we were to set up 301 redirects for the pages whose URL structure we changed, would there be any changes in organic search placement and page authority, or is it too late?

    | Nobody1611699043941
    0

  • Hi Guys, I am currently working with an eCommerce site which has site-wide duplicate content caused by currency URL parameter variations. Example: https://www.marcb.com/ https://www.marcb.com/?setCurrencyId=3 https://www.marcb.com/?setCurrencyId=2 https://www.marcb.com/?setCurrencyId=1 My initial thought is to create a bunch of canonical tags which will pass on link equity to the core URL version. However I was wondering if there was a rule which could be implemented within the .htaccess file that will make the canonical site-wide without being so labour intensive. I also noticed that these URLs are being indexed in Google, so would it be worth setting a site-wide noindex to these variations also? Thanks

    | NickG-123
    0
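Rather than hand-writing canonicals, the canonical URL can be computed by stripping the `setCurrencyId` parameter server-side; a Python sketch of that rule (the equivalent logic could live in the template layer or middleware rather than .htaccess, since .htaccess itself cannot inject a `<link>` tag). The example URLs are from the question:

```python
# Sketch: compute the canonical by dropping the currency parameter while
# keeping any other query parameters. Applying the same rule server-side
# makes the canonical tag site-wide without per-page work.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, drop_params=("setCurrencyId",)):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop_params]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("https://www.marcb.com/?setCurrencyId=3"))
# https://www.marcb.com/
```

With a correct canonical in place site-wide, a blanket noindex is usually unnecessary; the parameter variants consolidate to the core URL on their own.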

  • Hey all, So I found a domain that GWT tells me has 23k links pointing to a landing page. I found that the link is part of their global nav as a text ad and that's why it's probably registering so many links. The site has a DA of 56, is this a bad thing? Could it be hurting the rest of my site's ability to rank? Thanks, Roman

    | Dynata_panel_marketing
    0

  • Hey, so for our site 
    we have the desktop version: www.site.com/product-name/product-code/ and the mobile version: www.site.com/mobile/product-name/product-code On the desktop version we already point to the mobile URL with rel="alternate". Now my question is, what do we do as far as canonicals on the actual mobile URL? Should the mobile page carry a canonical back to the desktop version,
    or are we NOT supposed to have mobile canonical tags whatsoever since we've already added rel="alternate"? Would like some clarification.

    | ggpaul562
    0

  • Okay, working with a large site that, for business reasons beyond organic search, wants to split an existing site in two. So, the old domain name stays and a new one is born with some of the content from the old site, along with some new content of its own. The general idea, for more than just search reasons, is that it makes both the old site and new sites more purely about their respective subject matter.  The existing content on the old site that is becoming part of the new site will be 301'd to the new site's domain. So, the old site will have a lot of 301s and links to the new site. No links coming back from the new site to the old site anticipated at this time. Would like any and all insights into any potential pitfalls and best practices for this to come off as well as it can under the circumstances. For instance, should all those links from the old site to the new site be nofollowed, kind of like a non-editorial link to an affiliate or advertiser? Is there weirdness for Google in 301ing to a new domain from some, but not all, content of the old site. Would you individually submit requests to remove from index for the hundreds and hundreds of old site pages moving to the new site or just figure that the 301 will eventually take care of that? Is there substantial organic search risk of any kind to the old site, beyond the obvious of just not having those pages to produce any more? Anything else? Any ideas about how long the new site can expect to wander the wilderness of no organic search traffic? The old site has a 45 domain authority. Thanks!

    | 94501
    0

  • Hi all, Probably a dumb question, but I wanted to make sure I get this right. How do we set a custom user agent in Screaming Frog? I know its in the configuration settings, but what do I have to do to create a custom user agent specifically for a website? Thanks much! Malika

    | Malika1
    0

  • Hi guys, hope you're all good, quick question in regards to a Disavow file. A page of ours recently crashed from page 2 all the way to page 7ish. It's weird that it happened considering it was ranking on the 2nd page for around a year, then all of a sudden it came crashing down. I identified an affiliate link which was placed in a sidebar, webmaster tools picked up 24,000+ links coming from the site so I have decided to disavow it. I disavowed the site around 3 days ago, and in the mean time we have managed to grab ourselves some very good do-follow links from very authoritative sites. At the moment the page has  gone up 1 page, sitting at 4-5th page, but the rankings have been very inconsistent. Any ideas to when we may see an increase in ranking for this page? I am being very impatient, at the moment my workload has been dedicated to get this one page ranking again. All comments greatly appreciated.

    | Brett-S
    0

  • Hi all, *Admin please feel free to remove or add this to any existing post. I have searched the community for any similar questions. While checking in the Google Search Console, under the "Security Issues" (lone section) I have found Google pointing out specific pages of our website where the message we are seeing is "Content injection - These pages appear to be modified by a hacker with the intent of spamming search results." The Learn More link takes us to  https://developers.google.com/webmasters/hacked/docs/hacked_with_spam?ctx=SI&ctx=BHspam&rd=1 We've never injected spam code or have not been injected with any spammy code so what baffles me is why would Google pick this up when we have mentioned to them very clear that our code is secure and not hacked. Has anyone received a similar message and had any luck removing the message correctly? Thanks in advance!

    | SP1
    0

  • Considering buying the Yoast Premium extensions (or perhaps the whole bundle) but trying to weigh up if it's worth it, or if the free version is comprehensive enough to do the job? Obviously paid has more features, but is it worth the price tag? (I have 4 different websites.) If anyone has the paid Yoast Premium pack and has seen improvements with rankings, markups in Google etc., I would appreciate hearing your story! Thank you in advance

    | IsaCleanse
    0
