
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hello, A colleague has asked if we can buy multiple domain names which contain keywords and point them at our website. Is this good practice, or will it be seen as spam? Will these domains actually get ranked? I'm sure I'm not the first person to raise this, but I can't seem to find any questions and answers about it. Thanks Mark

    | markc-197183
    0

  • I have a website with content about home improvement topics, but the site has had no new content since 2010. All the posts on the WordPress site carry dates, all from 2010 and earlier. Is there a downside in terms of search engine rankings to removing or changing the dates? What are the risks of removing the dates? Could I lose rankings if I do this? Do you have any personal experience with this situation?

    | alpha17
    0

  • Just started with a new SEO client. The site is built on SharePoint Server 2007 running on Windows Server 2003 R2 with IIS 6.5 (I know, fun times for me). Being a standard crappy Windows setup, URLs and canonicalization are a huge issue: first and foremost, we get a 302 redirect from the root www.example.com to www.example.com/Pages/default.aspx. Standard SEO best practice dictates that we rewrite and redirect these pages so they're clean URLs. However, that may or may not be possible in the current environment - so is the next best thing to change those to 301s, so at least link authority is passed better between pages?

    Here's the tricky thing: the 302s seem to be preventing Google from indexing the /Pages/default.aspx part of the URL, but the primary URL is being indexed, with the page content accurately cached, etc. So, www.example.com 302 redirects to www.example.com/Pages/default.aspx, but the indexed page in Google is www.example.com; likewise, www.example.com/sample-page/ 302 redirects to www.example.com/sample-page/Pages/default.aspx, but the indexed page in Google is www.example.com/sample-page/.

    I know Matt Cutts has said that in this case Google will most likely index the shorter version of the URL, so I could leave it, but I just want to make sure that link authority is being appropriately consolidated. Perhaps a rel=canonical on each page pointing to the source URL, i.e. www.example.com/sample-page/? However, is rel=canonical to a 302 really acceptable? Same goes for sitemaps: I know they always say end-state URLs only, but as the source URLs are being indexed, I don't really want Google getting all the /Pages/default.aspx crap. Looking for thoughts/ideas/experiences in similar situations?

    | OddDog
    0

  • Hi guys, We have started to rewrite our website http://www.edamam.com in AJAX, and the idea is to have the whole website on AJAX in the next few months. Although it would probably be difficult to index even with the Google crawling protocol, and some other issues might appear, the engineers insist that from a technology point of view this is the best way to go. We have already rewritten the internal search result pages, e.g. http://www.edamam.com/recipes/pasta, and last week we applied the Google AJAX crawling protocol to some of the individual recipe pages to test it. I'd like to ask for your opinion on whether the rich snippets we have in the search results will be affected by this change. Are there specific actions we need to take to preserve them? What other hot tips do you have for dealing with AJAX on any level of the website? Thanks in advance Lily

    | wspwsp
    0

  • We are using Zenfolio as a hosted photography/image gallery, set up as http://oursite.zenfolio.com. We have about 24,000 backlinks to the website; however, over 22,000 are from Zenfolio.
    Do you see issues with this setup from an organic SEO perspective, with so many links from one domain pointing back into the main site?
    Thanks

    | jazavide
    0

  • Hello here, In the past I was able to find out pretty easily how many images from my website were indexed by Google and inside the Google image search index. But as of today, it looks like Google doesn't give you any numbers; it just lists the indexed images. I use the advanced image search, defining my domain name in the "site or domain" field at http://www.google.com/advanced_image_search, and Google then returns all the images coming from my website. Is there any way to know the actual number of images indexed? Any ideas are very welcome! Thank you in advance.

    | fablau
    1

  • I have 2 questions: 1. To check keyword rankings with Firefox, I choose Tools > Options > Privacy > "Clear all current history", with the time range to clear set to Everything and the following boxes checked: browsing and download history, form and search history, cookies, cache, active logins. Is there anything else I need to be doing? 2. Search results in my niche are heavily localized. Is there any way to check rankings in another area? E.g. by default, our rankings are for northeast NJ - is there any way to check Baltimore, for example?

    | CsmBill
    0

  • I am scared that somehow the search engines are penalizing me for something, but I don't know what. The site can be found at http://www.hypnotherapy-guide.com. It is a business directory/advice/guide site listing a lot of hypnotherapists (9,000). Is it possible that such a large site popping up overnight is flagged by the search engines as spam? I don't know what I am doing wrong.

    | tguide
    0

  • Good Afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided that he would make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I have set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still have over 3k indexed pages (originally I had 3,700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering how quickly, in your opinion, you think all these changes will impact my SERP rankings? Look forward to your responses!

    | apogeecorp
    0

  • The scenario: I have pages that I need to track, located in a domain, within several folders. Adding a common identifier or ID (e.g. www.domain.com/folder/page-name-identifier.html) to those URLs will ease my work, as I would be able to select, in Anlx, all traffic for URLs with that specific identifier. The URLs that need tracking lack this identifier today. My plan: add the identifier (7 letters, fixed and common for all URLs) to those existing pages and 301 redirect from the old to the new URLs. My question: will this change of URLs and the redirections hurt my SEO in any way?

    | Tit
    0
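A minimal .htaccess sketch of the plan above, assuming Apache with mod_rewrite; the folder name and the 7-letter identifier ("trackid") are hypothetical placeholders, and the real rules would need to match the site's actual URL pattern:

```apache
# Sketch: append a fixed 7-letter identifier to existing page URLs and
# 301 redirect old URLs to new ones. "folder" and "trackid" are
# placeholders, not the site's real values.
RewriteEngine On

# Skip URLs that already carry the identifier, to avoid a redirect loop.
RewriteCond %{REQUEST_URI} !^.+-trackid\.html$
RewriteRule ^folder/(.+)\.html$ /folder/$1-trackid.html [R=301,L]
```

Provided the old URLs all 301 to their new counterparts one-to-one like this, the rename itself is generally low-risk, though some temporary ranking fluctuation while Google recrawls is normal.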

  • I had a little issue earlier where I found my client's mobile version of their website showing up in the SERPs on my desktop. I asked my programmer to get rid of it. The programmer put a nofollow tag on the link to the mobile site (from the regular website). He also put a noindex across the whole mobile version of the website. So to double-check: I should probably get rid of that noindex on the mobile website, right? I think the nofollow should be enough... thoughts? Thanks!

    | Rich_Coffman
    0

  • I'm using the Yoast SEO plugin to generate XML sitemaps on my e-commerce site (WooCommerce). I recently changed the category structure, and now only 25 of about 75 product categories are included. Is there a way to manually include URLs, or what is the best way to have them all indexed in the sitemap?

    | kisen
    0

  • We run a printing company, and as an example, the grey box (at the bottom of the page) is what we have on each page: http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html. We used to use this but then tried to get most of the content onto the page itself; we now want to add a bit more in-depth information to each page. The question I have is: would a 1,200-word document be OK in there and not look bad to Google?

    | BobAnderson
    0

  • Apart from keyword stuffing, what is considered by Google as over-optimization? For instance, if I link from a subdomain to a page on my main domain with a keyword-rich anchor text, does it qualify as over-optimization?

    | sbrault74
    0

  • I'm asking for people's opinions on varying internal anchor text. Before you jump in and say, "Oh yes, varying your anchor text is always a good idea", let me explain. I'm not talking about varying anchor text on different links scattered throughout a site. We all know that is a wise thing to do for a variety of reasons that have been covered in many places. What I'm talking about is including semi-useful links below the fold and then varying the anchor text with each page load. Each time Googlebot crawls a page, it sees different anchor text for each link. That way, Googlebot is seeing, for example, 'san diego bars', 'taverns in san diego', 'san diego clubs', and 'pubs in san diego' all pointing to a San Diego bar/tavern/club/pub page. I'm wondering if there is value in this approach. Will it help a site rank well for multiple search queries? Could it potentially be better than static anchor text as it may help Google better understand the targeted page? Is it a good way to protect a large site with a huge number of internal links from Penguin? To summarize, we're talking about the impact of varying the anchor text on a single page with each page load as opposed to varying the anchor text on different pages. Thoughts?

    | RyanOD
    0

  • As an SEO beginner I am still unsure of the best way to build links. I have an article which I'd like to distribute; I've used sites like ezinearticles.com, however I feel this is too general, so I've targeted some sites associated with the industry. Many of the sites I've discovered charge a hefty fee. Before I take the plunge and spend money (I'm a Yorkshireman), I'd like the thoughts and opinions of SEO experts. Is this common practice, or are there better, more cost-effective ways? Thanks in advance,
    Neville

    | desktop_nev
    0

  • I need some help with a regex for .htaccess. I want to 301 redirect this: http://olddomain.com/oldsubdir/fruit.aspx to this: https://www.newdomain.com/newsubdir/FRUIT The changes are: different protocol (http -> https); add 'www.'; different domain (olddomain and newdomain are constants); different subdirectory (oldsubdir and newsubdir are constants); remove '.aspx'; and 'fruit' is a variable (which will contain only letters [a-zA-Z]). Is it possible to make 'fruit' UPPER case on the redirect (so 'fruit' -> 'FRUIT')? I think it's something like this (placed in the .htaccess file in the root directory of olddomain): RedirectMatch 301 /oldsubdir/(.*).aspx https://www.newdomain.com/newsubdir/$1 Thanks.

    | scanlin
    0
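On the uppercasing part of the question above: mod_alias's RedirectMatch cannot change the case of captured text, but mod_rewrite can via an internal RewriteMap. A sketch, assuming Apache and access to the server or virtual-host config (RewriteMap is not permitted inside .htaccess itself):

```apache
# In the server or virtual-host config for olddomain.com
# (RewriteMap cannot be declared in .htaccess):
RewriteMap uppercase int:toupper

# Then the redirect rule, capturing the letters-only page name and
# passing it through the map to uppercase it:
RewriteEngine On
RewriteRule ^/?oldsubdir/([a-zA-Z]+)\.aspx$ https://www.newdomain.com/newsubdir/${uppercase:$1} [R=301,L]
```

The RedirectMatch line in the post handles everything except the uppercasing; note also that the dot in `(.*).aspx` should be escaped (`\.aspx`) to match a literal period.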

  • Dear all, I have read many posts about having one set of content on two different domains and how to combine the two to avoid duplicate content. However, the history of my two domains makes this question really difficult. Domain 1: chillispot.org ( http://www.opensiteexplorer.org/links?site=chillispot.org ) The original site was on this domain, started 9 years ago. At that time the owner of the domain was not me. The site was very popular, with lots of links to it. Then, after 5 years of operation, the site closed. I managed to save the content to: Domain 2: chillispot.info ( http://www.opensiteexplorer.org/links?site=chillispot.info ) The content I put there was basically the same. Many links on external sites were changed to chillispot.info when they noticed the change, but lots of links are still unchanged and pointing to the .org domain. The .info is doing well in search engines (for example for the keyword 'chillispot'). Now I have managed to buy the original chillispot.org domain. As you can see, the domain authority of the .org domain is still higher than the .info one, and it has more valuable links. The question is: what would be the best approach to offer content on both domains without being penalized by Google for duplicate content? Which domain should we keep the content on? The original .org one, which is still a stronger domain but hasn't been active for several years, or the .info one, which has had the content for several years now and is doing well in search engines? And then, after we decide this, what would be the best approach to send users to the real content? Thanks for the answers!

    | Fudge
    0

  • Hi, Our client has had a disaster with their domain name registrar: the DNS settings have been reset, and it looks like the registrar won't be able to reinstate them for four days. This is a nightmare of lost business while the site and emails are offline. As a fallback, we've set up a copy of the client's website at an alternative domain name so that people can be directed there in the meantime via Facebook posts, etc. Is there anything you would recommend we do in the meantime to minimise the loss of traffic from search engines, and the loss of reputation with Google? E.g. using Google Webmaster Tools to tell Google about the change of address? Thank you.

    | smaavie
    0

  • Hi All, Are all types of site-wide links bad in Google's eyes, or does it depend on other factors as well? For example, service providers like GoDaddy put their links in the footers of other websites - in this situation, will Google harm their rankings or not? Please also elaborate on the best practices for site-wide links.

    | RuchiPardal
    0

  • Hi! I'm doing an audit of http://www.stevesims.com/ at the moment; their rankings for 'website designers' have plummeted recently. Looking at the site, there are a few things to address with on-page and on-site optimisation, but nothing major. Instead, I think the link profile is the issue. There are a lot of site-wide links from non-relevant sites, but I'm struggling to see anything else. Any thoughts would be much appreciated!

    | Blink-SEO
    0

  • Hi, We participated in an event which is now over, so its page has to be removed. I was thinking of writing a blog post about the event and placing a 301 redirect on the page that listed the event's details and registration process. Would that be a good idea, or should I do something else? Regards,

    | IM_Learner
    0

  • My website has a main section that we call expert content and write ourselves. We also have a community subdomain which is all user-generated. We are a pretty big brand, and I am wondering: should the rel=publisher tag be just for the www expert content, or should we also use it on the community UGC even though we don't directly write it?

    | MarloSchneider
    1

  • If you have had a successful reconsideration request, would you be comfortable sharing the letter you sent? We are trying to draft ours and could use some guidance.

    | CMC-SD
    0

  • Google: "back to school supplies haul" I thought this was really epic and first time I've ever seen such a results page. No ads, no text results. Anyone else seen this? -First post on new Moz design.

    | William.Lau
    0

  • Looking for some opinions here please. I've been involved in SEO for a couple of years, mainly working on my own websites and picking up the odd client here and there through word of mouth. I must admit that up until a few months back I was guilty of using some grey methods of link building - Linkvana, Unique Article Wizard and the such. While no penalties were handed out to my domains and some decent rankings were gained, I got tired of always being on the lookout for what the next Google update would do to my results and which networks were being hit, and so I moved a lot more into the 'proper' way of doing SEO.

    These days my primary sources for backlinks are much more respectable: myblogguest, bloggerlinkup, postjoint, and Guest Blog Finder ( http://ultramarketer.com/guest-blogger-finder/ - not sure where I came across this resource, but it's very handy). I use these sources alongside industry-only directories and general word of mouth. Ironically, I have found that doing the work by hand not only leads to results I can happily show people (content-wise) but is also much quicker and cheaper. The increased authority of the sites means far fewer links are needed.

    The one area I am still having a little issue with is building keyword-based backlinks. I now find it fairly easy to get my content on a reasonable-quality site - DA 40 and above - however the vast majority of these sites will allow the backlink only as the company name or as a generic 'read more' type thing. This is fine, and it is improving my website's performance and authority. The trouble is that while I am ranking for the title tag and some keywords on the page, I am struggling to get backlinks for other keywords.
    In an ideal world every page on the site would be optimised for a different keyword, and you could then just use the site name as anchor text to build the authority of that page and make it rank for its content. But what about when you (or the client) want to rank the home page for a number of different keywords, some not featured on the page? The keywords are too similar to go to the trouble of making unique pages for, and that would also add no value to the site. My question, then, after a very long-winded way of getting there, is: are others finding it much more difficult to gain keyword-based backlinks these days? The great thing about the grey SEO tools mentioned above was that it was super easy to get backlinks with whatever anchor text you wanted - even if you needed hundreds of them to compensate for the low value of each!! Thanks Carl

    | GrumpyCarl
    0

  • Hi Guys, I would appreciate it if you could answer one small question: will our site benefit from the following link chain?
    Valuable website related to our business ---nofollow link--> PDF doc (on second site) ---link to our site---> Kind Regards,
    webdeal

    | Webdeal
    0

  • Hi! I'm currently working with http://www.muchbetteradventures.com/. They have a previous version of the site, http://v1.muchbetteradventures.com, as a subdomain of their site. I've noticed a whole bunch of indexing issues which I think are caused by this. The v1 site has several thousand pages and ranks organically for a number of terms, but the pages are not relevant for the business at this time. The main site has just over 100 pages, yet more than 28,400 URLs are currently indexed. We are considering turning off the v1 site and noindexing it. There are no real backlinks to it. The only worry is that removing it will be seen as a massive drop in content. Rankings for the main site are currently quite poor, despite good content, a decent link profile and high domain authority. Any thoughts would be much appreciated!

    | Blink-SEO
    0

  • Did you go ahead and remove all the TOXIC and HIGH RISK links? Just the toxic? Were you successful with the tool?

    | netviper
    0

  • Hi guys, A website has many venue pages, for example: www.example.com/venue/paris. For some reason the parent www.example.com/venue/ is 301 redirecting to a minor page elsewhere on the website. Should I remove the 301 redirect and instead create a www.example.com/venue/ page that links to all the venues? My thinking is: Google will expect there to be a /venue/ 'parent' page, so if the parent page is redirecting to a minor page elsewhere within the website, it's telling Google that all the venues, like Paris, must be even less important. Should I do it? Any suggestions from fellow SEOmoz members would be appreciated! All the best Richard

    | Richard555
    0

  • If so, how bad? We use tags on our blog and this causes duplicate content issues. We don't use WordPress, but with such a widely used CMS having the same issue, it seems quite plausible that Google would be smart enough to deal with duplicate content caused by blog article tags and not penalise it at all. It has been discussed here, and I'm ready to remove tags from our blog articles or monitor them closely to see how it affects our rankings. Before I do, can you give me some advice around this? Thanks,
    Daniel.

    | Daniel_B
    0

  • I have noticed on my Google News website that after publishing a post, it takes around 3-4 weeks until that page gains Moz page authority. I am interested in knowing why it takes this set period of time. Is there a way to shorten that period? And am I correct in thinking that a link from a page with a Moz page authority of, say, 33 is more powerful than a link from a page with a Moz page authority of 1? It would be great to understand more about this.

    | JohnPeters
    0

  • http://en.wikipedia.org/wiki/Muslim_Academy I recently created this page, but it is showing two maintenance notices at the moment, and I need your advice on how to fix the two points raised by Wikipedia: "This article has no links to other Wikipedia articles. (July 2013)" and "This article is an orphan, as no other articles link to it. (July 2013)"

    | csfarnsworth
    0

  • A couple of days ago we did a restructure of our e-commerce site (WordPress + WooCommerce) where some product categories needed to change names. I used the Yoast SEO plugin to do 301 redirects in the .htaccess file. Today I noticed that we had two hits in the SERP on the phrase "dildos med vibrator" (see the attached screenshot, first two results). One goes to http://www.oliverocheva.se/kategori/sexleksaker/dildos/dildos-med-vibrator/ which is the right URL. One goes to http://www.oliverocheva.se/kategori/sexleksaker/dildosdildos-med-vibrator-dildos-for-honom/ which is a corrupt URL that has never been in use. The old one we did a redirect from was /kategori/for-honom/dildos-for-honom/dildos-med-vibrator-dildos-for-honom/ The command in the .htaccess file was: Redirect 301 /kategori/for-honom/dildos-for-honom/dildos-med-vibrator-dildos-for-honom/ http://www.oliverocheva.se/kategori/sexleksaker/dildos/dildos-med-vibrator What has happened here? Why does the 301 create entirely new URLs in the SERP?

    | kisen
    0
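One likely explanation for the corrupt URL above: mod_alias's `Redirect` matches by URL prefix, and anything after the matched prefix is appended to the target. If another `Redirect` for a shorter parent path (e.g. one ending at `.../dildos`, with no trailing slash) sits earlier in the .htaccess file, it fires first and glues the remainder of the request onto its target, which would produce exactly a composite like `dildosdildos-med-vibrator-...`. Anchoring the pattern with `RedirectMatch` avoids that; a sketch, not tested against this site:

```apache
# Redirect (mod_alias) is prefix-based: for a request like
# /kategori/for-honom/dildos-for-honom/extra, the trailing "extra"
# is appended to the target URL, producing composite URLs.
# RedirectMatch with ^...$ anchors redirects only the exact path:
RedirectMatch 301 ^/kategori/for-honom/dildos-for-honom/dildos-med-vibrator-dildos-for-honom/?$ http://www.oliverocheva.se/kategori/sexleksaker/dildos/dildos-med-vibrator/
```

Ordering more specific redirects before more general ones, and ending directory targets with a trailing slash, also helps keep prefix-based rules from composing unexpected URLs.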

  • Hi, A two-part question. First, are links that you disavow using Google Webmaster Tools ever removed from the Webmaster Tools account profile? Second, when you upload a file to disavow links, they ask if you'd like to replace the previously uploaded file. Does that mean that if you don't replace the file with one containing the previously uploaded URLs, those URLs are no longer considered disavowed? So, should we download the previous disavow file first and append the new disavow URLs to it before uploading, or should we just upload a new file that contains only the new URLs? Thanks

    | bgs
    0

  • Hi guys, I've got a question regarding e-commerce SEO. Do you think it's a better idea to target more long-tail terms and try to get links directly to product pages, brand pages and categories, rather than focus on short keywords that do bring in good traffic but are very broad? I will probably do both, but I would like a second opinion about other users' strategies. Thanks

    | Will_Craig
    0

  • I have a very small client/personal friend of mine who is in a very niche market. They rank pretty well for all their keywords, mainly because it is so niche and the competitor websites are no good. I was wanting to begin adding a few blog posts and tips here and there about the industry, but first wanted to know why they have a 0.0 mozRank. Their campaign has been set up on Moz for over 6 months and there are 0 errors and warnings for their site... I thought they would eventually warrant something. I have read the posts explaining mozRank and have come to the conclusion that it is still 0.0 because no one is linking to their site... am I right? Other than that, are there other ways to raise this score? The site is http://bit.ly/18nPE3W

    | BWrightTLM
    0

  • On our e-commerce site, we have multiple stores, and products are shown on multiple stores, which has created a duplicate content problem. Basically, if we list a product, say a shoe, that listing will show up on our multiple stores. I assumed the solution would be to redirect the pages, use nofollow tags, or use the rel=canonical tag. Are there any other options for me to use? I think my best bet is to use a mixture of 301 redirects and canonical tags. What do you recommend? I have 5,000+ pages of duplicate content, so the problem is big. Thanks in advance for your help!

    | pinksgreens
    0
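For the duplicate-listing case above, the canonical tag is usually the lighter-weight option when every store's copy of a product must stay reachable for visitors. A sketch, with hypothetical URLs:

```html
<!-- In the <head> of every duplicate shoe listing, across all stores,
     point at one chosen primary version (URLs here are placeholders): -->
<link rel="canonical" href="http://www.example.com/main-store/shoes/blue-runner/" />
```

301 redirects consolidate signals more strongly, but they remove the duplicate page from use entirely, so they fit retired listings rather than store variants that customers still need to browse.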

  • Hi, I am managing two e-commerce sites that sell a lot of identical products: snowsupermarket.co.uk (public webshop) and shop.snowbusiness.com (trade webshop). Should I optimise the two sites to target different keywords for all products, or should I keep the keywords the same but vary the meta data/descriptions etc. to avoid duplication? Is there a clear argument for having two e-commerce websites ranking high for our products and dominating page 1, even though they will technically be competing against each other? Thanks, Ben

    | SnowFX
    0

  • Hi Guys, It's bugging the crap out of me why this site does so well: http://www.stagedinburgh.com/. When I look at its link profile it's so weak and terrible, plus many links come from sites they own. Somehow the site outranks many sites for search terms like edinburgh stag party, edinburgh stag do, edinburgh stag weekends. Am I missing something? They seem to only have links from 13 domains, and those aren't great. What am I missing?

    | PottyScotty
    0

  • Hi Everyone,
    A few weeks ago now I received a "Googlebot can't access your site..... connection failure rate is 7.8%" message from Webmaster Tools. I have since fixed the majority of these issues, but I've noticed that all pages except the main home page now have a PageRank of N/A, while the home page still has a PageRank of 5. Has this connectivity issue reduced the PageRanks to N/A, or is it something else I'm missing? Thanks in advance.

    | AMA-DataSet
    0

  • I have a client that wants to migrate some of his site's content to a new domain, not all of the content, just some of it. This is not an address change. He wants to continue actively using the domain name where all this content currently resides, so it's not a matter of notifying search engines of an address change. The first thing that comes to mind is the use of the canonical tag, but it's not making sense. Any recommendations? Thanks in advance.

    | UplinkSpyder
    0

  • We used to have an article's worth of content in a scroll box, created by our previous SEO; the problem was that it was very keyword-stuffed, link-stuffed and complete crap. We then removed this and added more content above the fold. The problem I have is that we are only able to add 150-250 words above the fold, and some of that is repeated across the pages. Would we benefit from putting an article at the bottom of each of our product pages? And when I say article, I mean high-quality, in-depth content that goes into a lot more detail about the product, its history and more. Would this help our SEO (give the page more uniqueness and authority, rather than 200-250-word pages)? The one problem I can see is: would an article's worth of content be OK at the bottom of the page, in a div tab or scroll box?

    | BobAnderson
    0

  • I just wanted to check what you guys thought of this strategy for duplicate product descriptions. A sample product is a letter bracelet - a, b, c, etc. - so there are 26 products with identical descriptions. It is going to be extremely difficult to come up with 25 new unique descriptions, so on recommendation I'm looking to use the canonical tag. I can't set any to noindex because visitors will look for specific letters. Because the titles only differ by the letter, a search for 'letter bracelet', 'letter a bracelet' or 'letter i bracelet' will just return results for 'letter bracelet' due to stop words, unless the searcher explicitly searches for 'letter "a" bracelet'. So I reckon I can make 4 new unique descriptions. I'll research the most popular letters, picking 5 from the top (excluding 'a' and 'i'), equally share the remaining letters between those 5 groups, and within each group set a canonical tag pointing to the primary letter of that group. Does this seem a sensible thing to do?

    | MickEdwards
    0

  • Hello, My question is: we all know that blogs and great content are the way to good backlinks. But other than this, what other ways are there to build quality links to a website? I, for example, try researching competitors' backlinks (with Open Site Explorer), finding directories, etc. Is this the right way? I also try to produce great content, but I would like more technical SEO tricks for this 🙂 And one more thing: how long does it take before links impact rankings? Thanks if someone can help! Eugenio

    | socialengaged
    0

  • Does anyone know if Google can index PDFs with Flash embedded? I would assume that the regular Flash recommendations are still valid, even when embedded in another document. I would also assume there is a list of the file types and versions which Google can index with the search appliance, but I was not able to find one. Does anyone have a link or a list?

    | andreas.wpv
    0

  • Over the years, I've gathered thousands of user reviews on a website I am shutting down although I would like to keep them for another website. I removed the reviews from the old website, set the reviews pages to "noindex" and removed the pages from Google's index using the Webmaster Tools. At this point the reviews are not showing up in Google's search results anymore. Would there be any concerns about posting these reviews on a new website? Can it get penalized for duplicate content?

    | sbrault74
    0

  • I noticed that our total number of indexed pages dropped recently by a substantial amount (see chart below). Is this normal? http://imgur.com/4GWzkph Also, 3 weeks after this started dropping, we got a message about an increased number of crawl errors and found that a site update was causing 300+ new 404s. Could this be related?

    | znotes
    0

  • Hello here. I own an e-commerce website (virtualsheetmusic.com), and some of our most important category pages have pretty long URLs. Here is an example: http://www.virtualsheetmusic.com/downloads/Indici/Violin.html I am evaluating the possibility of shortening URLs like the above to something like: http://www.virtualsheetmusic.com/violin/ But since it is going to be pretty hard and time-consuming (considering the custom system we have in place on that site), I am trying to find out if it really matters and is worth doing from an SEO standpoint. I am aware that from a user perspective shorter URLs are preferable, and we plan to pursue a better URL architecture on our website in the near future just for that, but this question, at the moment, is strictly related to SEO. Any thoughts on this topic are very welcome!

    | fablau
    0

  • Hi everybody! I've been working on http://thewilddeckcompany.co.uk/ for a little while now. Until recently everything was great - good rankings for the key terms of 'bird hides' and 'pond dipping platforms'. However, rankings have tanked over the past few days. I can't put my finger on it yet, but a site:thewilddeckcompany.co.uk search shows only three pages have been indexed. There are only 10 on the site, and it was fine beforehand. Any advice would be much appreciated,

    | Blink-SEO
    0
