
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi. Can you please advise how I can overcome this issue? The Moz.com crawler is indicating I have hundreds of duplicate title tag errors. However, this is caused by many URLs having been indexed multiple times in Google. For example:
    www.abc.com, www.abc.com/?b=123, www.abc.com/, www.abc.com/?b=654, www.abc.com/?b=875, www.abc.com/index.html. What can I do to stop this being reported as duplicate titles, as well as duplicate content? I was thinking I could use robots.txt to block the various query string parameters. I'm open to ideas and examples.

    | adhunna
    0
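
    A minimal sketch of the robots.txt idea raised in the question above, assuming the only offending parameter is the ?b= one shown in the example URLs (Googlebot supports the * wildcard); the alternative, which consolidates duplicates rather than just blocking them, is a rel=canonical tag on every variant pointing at the clean URL:

        User-agent: *
        Disallow: /*?b=

        <link rel="canonical" href="http://www.abc.com/">

    The canonical route is usually preferred for duplicate-title reports, because URLs blocked in robots.txt can remain in the index even though they are no longer crawled.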

  • A client of mine did article marketing years ago with Ezine, PRlog, ArticlesBase, etc. Those links still point to their domain. Do you think they'll see an increase in rankings from removing those links?

    | alhallinan
    0

  • We're thinking of simplifying our website, which has grown to a very large size, by removing all the content which hardly ever gets visited. The plan is to remove this content / make changes over time in small chunks so that we can monitor the impact on SEO. My gut feeling is that this is okay if we make sure to redirect old pages and make sure that the pages we remove aren't getting any traffic. From my research online it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability. Could I get people's thoughts on this please? Are there any risks that we should look out for, or any alternatives to this approach? At the moment I'm struggling to combine the needs of SEO with making the website more effective.

    | RG_SEO
    0

  • I have a client who republishes news articles about his company from newspapers and other magazines on his blog. It's fair enough, because he wants to show how big and important he is when someone checks his website, but on the other hand he is just copying content from other sites (there is a lot of original content as well). Should I use noindex on these pages or a rel=canonical? We already ask permission and cite the source with a link in these cases.

    | SeoMartin1
    0
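
    For the republished-articles question above, a minimal sketch of the two options mentioned, using a hypothetical URL for the original newspaper article; a cross-domain rel=canonical hands the page's signals back to the original, while noindex simply keeps the copy out of the index:

        <link rel="canonical" href="http://www.newspaper-example.com/original-article">

        <meta name="robots" content="noindex, follow">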

  • Hi, one of the websites I'm promoting and working on is www.pau-brasil.co.il.
    It's a WordPress-based website and, as you can see, the HTML title is "PauBrasil | some Hebrew slogan".
    (Screenshot: http://i.imgur.com/2f80EEY.gif)
    When I search for "PauBrasil" (which is the brand's name), one of the results Google shows is "PauBrasil: Some Hebrew Slogan" (Screenshot: http://i.imgur.com/eJxNHrO.gif). Why is the pipe being replaced with ":"?
    And not just that: as you can see, there's a blank space missing between the ":" and the slogan.
    (Note: the website has been indexed by the Google crawler at least 4 times, so I find it hard to believe that can be the reason.) I kept looking and found that there's another page on the website with the exact same title,
    but when I search for it in Google, it shows the title as it really is, with the pipe ("|").
    (Screenshot: http://i.imgur.com/dtsbZV2.gif) Have you ever encountered something like that?
    Can it be that the duplicated title causes that weird "replacement"? Thanks in advance,
    Kadel

    | Kadel
    0

  • I'm working with some backlink data, and I've run into different domains that host the exact same content on the same IP. They're not redirecting to each other; it just looks like they're hosting the same content on different virtual hostnames. One example is: borealcanada.ca, borealcanada.com, borealcanada.org, www.borealcanada.ca, www.borealcanada.com, www.borealcanada.org, www.borealecanada.ca. I'm trying to consolidate this data and choose which is the primary domain. In this example, it appears www.borealcanada.ca has a high number of indexed pages and also ranks first for "boreal canada". However, I'm trying to think of a metric I can use to definitively/systematically handle this (using SEO Tools or something like it). Does anyone have ideas on which metric might help me determine this for a large number of sites?

    | brettgus
    0

  • Hi guys, I have domains A, B and C. Domain A was an association of two businesses and they are about to split. Parts of domain A are going to be redirected to domain B, but some of the content belongs to domain C. So my question: is it possible to 301 redirect some pages from A to B and other pages from A to C, and if so, what would be the impact on SEO? Thanks a lot!

    | StevePatenaude
    0
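
    A minimal .htaccess sketch of the split described above, assuming Apache with mod_alias and using hypothetical /services and /products paths to stand in for the real sections of domain A; mod_alias matches by path prefix, so deeper URLs under each section carry over to the matching path on the target domain:

        # On domain A
        Redirect 301 /services http://www.domain-b.example/services
        Redirect 301 /products http://www.domain-c.example/products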

  • I've got a relatively new website (launched at the beginning of June 2013). For some keywords I'm targeting, it first ranked around page 15. It made huge jumps to finally rank on page 2 or 3. Since then, it goes back to page 15 and then back to page 3. It does this every now and then. Any ideas?

    | sbrault74
    0

  • Dear all, our eCommerce site has the following structure: Home (PR=5), Category (PR=4, linked from Home), Sub-Category (linked from Category, PR=unranked). The domain is several years old and performs well. The site is here: http://tinyurl.com/5v9hrql. Any idea or suggestion? Thank you, Claudio

    | SharewarePros
    0

  • I'm using RDFa rich snippet markup on my site, and in Google's Structured Data Testing Tool the 5 stars come up in yellow for my listing. Checking indexing progress on the live web today, the stars are red in the listings - any idea why this might be? Is Google switching them over universally, or is it something else?

    | Ubique
    0

  • Recently, on the evening of August 5th, almost all of the keywords our pages ranked highly for dropped by anywhere from 5 to 8 pages. The only activity during this time was an article that had been picked up by a major news outlet and then was apparently copied onto other sources with links to our domain and article. More puzzling, though, is that rather than simply having the same page show up lower for a keyword, in a number of instances a different page is now shown for the result, often with little or no relevance to the keyword. In some cases, for a single keyword phrase we've seen as many as 10 different pages rotated throughout the day when performing a search. Prior to our rankings falling, we had never seen this behavior.

    | BrianQuinn
    0

  • Is there a problem with placing a rel=canonical link on the canonical page itself - in addition to the duplicate pages? For example, would that create an endless loop where the canonical page keeps referring to itself? Two examples that are troubling me: My home site is www.1099pro.com, which is exactly the same as www.1099pro.com/index.asp (all updates to the home page are made by updating the index.asp page). I want www.1099pro.com/index.asp to have a rel=canonical link pointing to my standard homepage www.1099pro.com, but any update that I make on the index page is automatically incorporated into www.1099pro.com as well. I don't have access to my hosting web server, and any updates I make have to be done to the specific landing pages/templates. I am also creating a new website that could possibly have pages with duplicate content in the future. I would like to already include the rel=canonical link on the standard canonical page even though there is no duplicate content yet. Any help really would be appreciated. I've read a ton of articles on the subject but none really define whether or not it is OK to have the rel=canonical link on both the canonical page and the duplicate pages. The closest explanation was in a Moz article that said it was OK, but the answer was fuzzy. -Mike

    | Stew222
    0
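
    For the self-referencing canonical question above, a minimal sketch using the URLs given in the question; serving the same tag on both www.1099pro.com and www.1099pro.com/index.asp does not create a loop, since a page whose canonical points at itself is simply treated as its own canonical rather than chased further:

        <link rel="canonical" href="http://www.1099pro.com/">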

  • So a friend of a friend was referred to me a few weeks ago as his Google traffic fell off a cliff. I told him I'd take a look and see what I could find, and here's the situation I encountered. I'm a bit stumped at this point, so I figured I'd toss this out to the Moz crowd and see if anyone sees something I'm missing. The site in question is www.finishlinewheels.com. In mid-June, looking at the site's Webmaster Tools, impressions went from around 20,000 per day down to 1,000. Interestingly, some of their major historic keywords like "stock rims" had basically disappeared while some secondary keywords hadn't budged. The owner submitted a reconsideration request and was told he hadn't received a manual penalty. I figured it was the result of either an automated filter/penalty from bad links, a horribly slow server, or possibly a duplicate content issue. I ran the backlinks on OSE and Majestic and pulled the links from Webmaster Tools. While there aren't a lot of spectacular links, there also doesn't seem to be anything that stands out as terribly dangerous. Lots of links from automotive forums and the like - low authority and such, but in the grand scheme of things their links seem relevant and reasonable. I checked the site's speed in Analytics and WMT as well as some external tools, and everything checked out as plenty fast enough, so that wasn't the issue either. I tossed the home page into Copyscape and found the site brandwheelsandtires.com - which had completely ripped the site - thousands of the same pages with every element copied, including the phone number and contact info. Furthering my suspicions, the Internet Archive shows its first appearance in mid-May, shortly before his site took the nose dive (still visible at http://web.archive.org/web/20130517041513/http://brandwheelsandtires.com). This, I figured, was the problem, particularly when I started doing exact-match searches for text on the finishlinewheels.com home page like "welcome to finish line wheels" and it was nowhere to be found. I figured the site had to be sandboxed. I contacted the owner and asked if this was his and he said it wasn't. So I gave him the contact info, and he contacted the site owner and told them it had to come down; the owner apparently complied because it was gone the next day. He also filed a DMCA complaint with Google, and they responded after the site was gone to say they didn't see the site in question (seriously, the folks at Google don't know how to look at their own cache?). I then had the site owner send them a list of cached URLs for that site, and since then Google has said nothing. I figure at this point it's just a matter of Google running its course. I suggested he revise the home page content and build some new quality links, but I'm still a little stumped as to how/why this happened. If it was seen as duplicate content, how did a site with no links and zero authority manage to knock out a site that ranked well for hundreds of terms and had been around for 7 years? I get that it doesn't have a ton of authority, but this other site had none. I'm doing this pro bono at this point, but I feel bad for this guy as he's losing a lot of money at the moment, so any other eyeballs that see something I don't would be very welcome. Thanks Mozzers!

    | NetvantageMarketing
    2

  • I understand that Google does not want to index other sites' search results pages, but we have a large amount of discount tee times that you can search for, and they are displayed as helpful listing pages, not search results. Here is an example: http://www.activegolf.com/search-northern-california-tee-times?Date=8%2F21%2F2013&datePicker=8%2F21%2F2013&loc=San+Diego%2C+CA&coupon=&zipCode=&search= These pages are updated daily with the newest tee times. We don't exactly want every URL with every parameter indexed, but at least http://www.activegolf.com/search-northern-california-tee-times. It's strange because all of the tee times are rendered in the HTML, not loaded via JavaScript. An example of similar pages would be Yelp; for example, this page is indexed just fine - http://www.yelp.com/search?cflt=dogwalkers&find_loc=Lancaster%2C+MA I know ActiveGolf.com is not as powerful as Yelp, but it's still strange that none of our tee times search pages are being indexed. Would appreciate any ideas out there!

    | CAndrew14.
    0

  • Please forgive my ignorance on this subject. I have little to no experience with the technical aspects of setting up and running a server. Here is the scenario: We are self-hosted on an Apache server. I have been on the warpath to improve page load speed since the beginning of the year - not so much for SEO as for conversion rate optimization. I recently read the Moz post "How Website Speed Actually Impacts Search Rankings" and was fascinated by the research regarding TTFB. I forwarded the post to my CEO, who promptly sent me back a contradictory post from Cloudflare on the same topic. Ilya Grigorik published a post on Google+ that called Cloudflare's experiment "silly" and said that "TTFB absolutely does matter." I proceeded to gather information on our site's TTFB using data provided by http://webpagetest.org. I documented TTFB for every location and browser in an effort to show that we needed to improve. When I presented this info to my CEO (I am in-house) and IT Director, they both shook their heads, completely dismissed the data, and said it was irrelevant because it was measuring something we couldn't control. Ignorant as I am, it seems that Ilya Grigorik, Google's own web dev advocate, says it absolutely is something that can be controlled, or at least optimized, if you know what you are doing. Can any of you super smart Mozzers help me put the words together to express that TTFB from different locations and for different browsers is something worth paying attention to? Or, perhaps they are right, and it's information I should ignore? Thanks in advance for any and all suggestions! Dana

    | danatanseo
    0
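
    For the TTFB debate above, a minimal way to measure time-to-first-byte yourself from the command line with curl, so the numbers don't depend on any single third-party tool (the URL is a placeholder); run it from several locations to mirror what webpagetest.org reports:

        curl -o /dev/null -s -w "DNS %{time_namelookup}s  connect %{time_connect}s  TTFB %{time_starttransfer}s  total %{time_total}s\n" http://www.example.com/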

  • Hi Moz experts! I have a client with Google Places listings for multiple branch locations, and for some reason the fully SEO-optimized head office listing is being beaten by an un-optimized branch listing. The HQ listing gets a tonne of traffic whereas the unoptimized branch location doesn't, yet the branch is the main listing shown when searching through Google. Any help would be greatly appreciated. Thanks

    | Jon_bangonline
    1

  • One of the clients we are working with has two sites: the main one with a PR5 and a separate one with a PR4. We are planning on doing a 301 from the PR4 site to a page on the PR5 site. Is it best to do: www.PR4.com ----> www.PR5.com/releveantPR4page, or www.PR4.com/page ----> www.PR5.com/releveantPR4page? Most pages on the PR4 site can fit into one PR5 page logically. However, the PR4 site has About Us, Contact Us, blog (with posts), FAQ, Applications and Legal Resources pages which are all pretty outdated. The PR4 site is kind of messy and we are not sure it will be easy to 301 each page individually with the user in mind. Can we do a sitewide 301 redirect from the root PR4.com to a page PR5.com/releveantPR4page and also do deeper 301s, e.g. PR4.com/PR4page ---> PR5.com/releveantPR4page?

    | Bryan_Loconto
    0
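
    A minimal .htaccess sketch for the PR4 site of the sitewide-plus-deeper approach asked about above, assuming Apache with mod_rewrite and reusing the hypothetical page names from the question; rules are evaluated in order, so the specific redirects go first and the catch-all last:

        RewriteEngine On
        # Deep redirect for a page that has a close equivalent on the PR5 site
        RewriteRule ^PR4page$ http://www.PR5.com/releveantPR4page [R=301,L]
        # Everything else falls back to the most relevant PR5 page
        RewriteRule ^ http://www.PR5.com/releveantPR4page [R=301,L]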

  • I run a Magento shop - let's imagine a situation where the category landing page is about "Joe Bloggs Kettles". Then on that page we have the products listed, so we would have links to product pages - these links will be called something like:
    Joe Bloggs Red Kettle
    Joe Bloggs Yellow Kettle
    Joe Bloggs Purple Kettle Can someone please tell me if this is OK, or should we rework our strategy? Thanks

    | bjs2010
    0

  • I'm a little confused about C-blocks. I've been reading about them but I still don't get it. Are these similar to sitewide links? Do they have to come from websites that I own and that are hosted on the same IP? And finally, what's better: more or fewer linking C-blocks? Cheers 🙂

    | mbulox
    0

  • Hi guys, this is really strange: I am using Yoast SEO for WordPress on two sites. On both sites I am seeing meta name='description' instead of meta name="description", and I suspect this is why Google is not reading it correctly; many link submission sites that read your meta data automatically report the site as blocked. How do I fix this? Thanks

    | SamBuck
    0
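
    For the Yoast quoting question above, the two forms side by side; per the HTML spec, single-quoted and double-quoted attribute values are equally valid and parse identically, so a tool that reports the first form as blocked is being overly strict (the description text is a placeholder). The quoting style often comes from an HTML-minification or caching layer rather than the plugin itself:

        <meta name='description' content='Example description text.'>
        <meta name="description" content="Example description text.">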

  • It is quite labour-intensive to come up with product descriptions for our entire product range... 2,500+ products, in English and Spanish... When we started, we copy-pasted manufacturer descriptions, so they are not unique (on the web), plus some of them repeat each other. We are getting unique content written, but it's going to be a long process. So, which is the lesser of two evils: lots of duplicate, non-unique content, or removing it and leaving only a very small unique phrase from the database (i.e. thin content)? Thanks!

    | bjs2010
    1

  • Suppose you have a website with a blog on it, and you show a few recent blog posts on the homepage. Google sees the headline plus "by Author Name" and associates that author's Google+ profile with the page. This is great for the actual blog posts, but how do you prevent it from happening on the homepage or other blog roll pages?

    | wattssw
    0

  • We have a large site which attracts a massive amount of link spam on a continual basis. I'm talking thousands of links every day. The reason we get so much spam linking to us is that we are an authority site in a highly spammed vertical. So, to avoid falling foul of Penguin are we fast coming to a point where sites have to constantly monitor their link profiles and disavow anything that looks remotely dodgy?

    | jandunlop
    0

  • I noticed something in Google search results today that I can't explain. Any help would be appreciated. I performed a real-estate based search and the top result featured a rich snippet showcasing the following (Address / Price / Bd-Ba):
    912 Garden District Dr #17, Charlotte, NC 28202 - $179,990 - 3 / 2
    222 S Caldwell St #1602, Charlotte, NC 28202 - $389,238 - 2 / 2 1/2
    However, when I visit the page associated with this information, there is no Schema to be found. In fact, the page is, for the most part, just a large table listing homes on the market. The table headings are Address, Price, and Bd/Ba. Is it common for Google to use table-based data to generate rich snippets? What is the best way to influence this? In the absence of Schema (as the page we are talking about has no Schema implementation), does Google default to table data? Has anyone seen this behavior before and, if so, can you point me to it? EDIT: I've now come across a few other examples where the information is not in a table, but rather in divs. Why are such sites (you can find some by searching for "[ZIPCODE] real estate") getting this treatment?

    | RyanOD
    0
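
    For the rich-snippet question above, a hedged sketch of what explicit markup for one of those rows could look like using schema.org/Offer microdata, rather than leaving Google to infer structure from the table; the property names are real schema.org ones, but the choice of Offer (instead of a more specific real-estate type) is only illustrative:

        <div itemscope itemtype="http://schema.org/Offer">
          <span itemprop="name">912 Garden District Dr #17, Charlotte, NC 28202</span>
          <meta itemprop="price" content="179990"> $179,990
          <meta itemprop="priceCurrency" content="USD">
        </div>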

  • We have an old domain that we have had registered for many years (pinpointlasersystems.com) which redirects to our regular domain, a short version of our name (pinlaser.com). Management wants to switch and use the longer version as the primary domain for branding purposes. I have cautioned against this for many reasons: we'd need to do hundreds of redirects; there's a potential loss of backlinks; and most links would then resolve through 301 redirects and not look natural to search engines. I would appreciate feedback on any and all risks associated with this potential move. Thanks.

    | Pinlaser
    0

  • We are getting ready to release an integration with another product for our app.  We would like to add a landing page specifically for this integration.  We would also like it to be very similar to our current home page.  However, if we do this and use a lot of the same content, will this hurt our SEO due to duplicate content?

    | NathanGilmore
    0

  • I am curious about this for a couple of reasons. We have all dealt with a site that switched platforms, didn't plan properly, and now has thousands of crawl errors. Many of the developers I have talked to have stated very clearly that the .htaccess file should not be used for thousands of single redirects. I figured that if I only needed them in there temporarily it wouldn't be an issue. I am curious: once Google follows a 301 from an old page to a new page, will it stop crawling the old page?

    | RossFruin
    0
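
    Related to the thousands-of-redirects concern above, a minimal sketch of the usual alternative to piling individual rules into .htaccess: an Apache RewriteMap backed by a plain text file. RewriteMap can only be declared in the server or virtual-host configuration (not in .htaccess), which is largely why developers push back on the .htaccess approach; the file path and domain below are placeholders.

        # In the virtual-host configuration
        RewriteEngine On
        RewriteMap redirectmap txt:/etc/apache2/redirects.txt
        RewriteCond ${redirectmap:$1} !=""
        RewriteRule ^(.*)$ ${redirectmap:$1} [R=301,L]

        # /etc/apache2/redirects.txt - one "old-path new-URL" pair per line
        /old-page-1 http://www.example.com/new-page-1
        /old-page-2 http://www.example.com/new-page-2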

  • I don't have a specific scenario here, but I'm curious, as I fairly often come across sites that have, for example, 20,000 pages but only 1,000 in their sitemap. If they only want 1,000 of their URLs included in their sitemap and indexed, should the others be excluded using robots.txt or a page-level exclusion? Is there a point to having pages that are included in neither, leaving it up to Google to decide?

    | RossFruin
    1
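
    A minimal illustration of the two exclusion mechanisms contrasted in the question above (the /internal/ path is a placeholder). The page-level tag lets the page be crawled but kept out of the index, so its links can still be followed; the robots.txt rule stops crawling of the section altogether:

        <!-- page-level exclusion, in the <head> of each page -->
        <meta name="robots" content="noindex, follow">

        # robots.txt exclusion for a whole section
        User-agent: *
        Disallow: /internal/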

  • Hey guys, a little help required. We are potentially taking on a new client who has over 5,000 image backlinks (4,000 of those from one site) out of around 7,000 total backlinks. Would this be a problem? It's been noticeable recently that both footer and blogroll links seem to be getting targeted by Google. Would this be the case for image links too, especially considering the top-heavy nature of the link profile? Thoughts welcome. Cheers.

    | Webrevolve
    0

  • We built a better version of a search results page and re-directed from the old search results page to the landing page, and are seeing a huge uptick in traffic. Wondering if we remove the re-direct and 404 the original search results page if we'll see a drop in traffic. I ran the search results page through open site explorer and Google Webmaster tools, and there aren't many links, but the search results page used to see quite a bit of of traffic over the past couple of years.

    | nicole.healthline
    0

  • One of our dev sites got indexed 2-3 weeks ago (the live site's robots.txt had been copied over to it; that has since been corrected). I immediately added it to our Webmaster Tools and used the Remove URL tool to get the whole thing out of the SERPs. A site:devurl search in Google now returns no results, but checking Index Status in WMT shows 2,889 pages of it still indexed. How can I get all instances of it completely removed from Google?

    | Kingof5
    0
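
    For the indexed dev site above, a minimal Apache sketch of the usual stop-gap while waiting for the stale pages to drop out: serve a noindex header across the whole dev hostname (requires mod_headers; place it in the dev vhost or its .htaccess). Note that blocking the dev site in robots.txt alone can keep already-indexed URLs hanging around longer, because Googlebot can no longer fetch them to see the noindex:

        Header set X-Robots-Tag "noindex, nofollow"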

  • Making the meta descriptions for all forum posts unique is a hell of a job... One option is to take the first 165 characters of each post and turn these automatically into the meta description of the page.
    If Google thinks the meta description is not suitable for the search query, Google will generate its own description. In that case all the meta descriptions are unique, as the Google guidelines recommend. Alternatively, how would Google treat it if we deleted the meta description tag entirely, so that Google generates all the descriptions by itself?

    | Zanox
    0
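
    A rough sketch of the first-165-characters idea from the question above, as a small PHP helper; make_meta_description() is hypothetical and not part of any particular forum platform:

        <?php
        // Build a meta description from the start of a post body.
        function make_meta_description($body, $limit = 165) {
            // Strip markup and collapse whitespace before measuring length.
            $text = trim(preg_replace('/\s+/', ' ', strip_tags($body)));
            if (mb_strlen($text) <= $limit) {
                return $text;
            }
            // Cut at the last word boundary before the limit to avoid mid-word truncation.
            $cut = mb_substr($text, 0, $limit);
            $pos = mb_strrpos($cut, ' ');
            return ($pos !== false ? mb_substr($cut, 0, $pos) : $cut) . '...';
        }

        $postBody = '<p>Example forum post body goes here...</p>';
        echo '<meta name="description" content="'
            . htmlspecialchars(make_meta_description($postBody), ENT_QUOTES) . '">' . "\n";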

  • http://pisoftware.com was never a huge driver of traffic, but it ranked top 5 for my money keyphrases and was bringing consistent, quality visitors. As traction went up, that traffic just became more valuable. I was happy. Then Penguin came along and made me sad. A 60% loss in traffic, but I stayed calm. I disavowed. I sent emails asking for links to come down. I atoned for my sins (of the distant, distant past - I know better now) - and waited. Never a hard penalty, never an email from Google - just rankings that got hammered. From #3 for my best keyphrase to #25 today. I write content, and I try to write it better all the time. I try to make it valuable. I leverage social media to the extent that I can. I do outreach. I'm trying to be patient, but it's hard when the software is awesome and so few people see it. I'm considering starting over - or maybe even just creating another domain to use if this one never comes back. I wonder what the experts think. At MozCon I talked to a lot of people in the same boat, and it seems we are all taking similar steps. So the questions: 1. Should I start over, or stay the course? 2. What has worked for others - what seems to have been the most valuable in getting back on the rise? 3. Thoughts on the site as it is now? I've worked lately on speed, mobile rendering, etc. - and it seems responsive and solid to me. Thanks in advance, you crazy bunch of Mozzers you. Kelly

    | Kellster
    0

  • I've used Google AdWords, Google Analytics and competitors' keywords to compile a master list. I'm now looking to evaluate metrics on the keywords / phrases / long-tail phrases. My question is this: based on Google's use of geo-targeting, would I be better off evaluating metrics (avg. monthly searches, competition, avg. CPC) for the United Kingdom or for my local city (I only operate in my local city)? I am looking to use the results to redesign my website. I will use the favourable keywords / phrases / long-tail keywords to implement a new menu, new content page creation, articles, etc. Thanks Mark

    | Mark_Ch
    0

  • A quick WordPress permalink checkup... I'm generally a fan of the %postname% permalink structure in WordPress - although this does create a completely flat architecture, so that ALL Pages AND Posts are found at www.domain.com/_________ I'm sure I've heard, read, or ingested somewhere that it makes more sense to use /blog/%postname%, which then makes all blog posts reside at www.domain.com/blog/________ with the static Page content still being at www.domain.com/________ Any thoughts on why this would NOT be a good idea? To me this seems more logical, and like I say, I'm sure I've heard an authority say Google kind of prefers being able to differentiate blog content from everything else. I've used this successfully on a few sites so far, and all seems to be good. (Moz, although not WordPress, uses this structure for its blog.) Thanks!

    | GregDixson
    0

  • Hi guys, we added schema.org markup a few months ago and it all looked fine and showed up; then suddenly last month all the crawled pages disappeared from the Webmaster Tools Structured Data report (see the screenshot attached). This happened to another site of mine as well, and I cannot figure out what causes it. Nothing has been changed on the pages, as you can see for yourself in the HTML code. Any ideas as to why this might have happened?

    | Walltopia
    0

  • I have two e-commerce websites, and I'm going to remove some products from one website, as requested by a supplier, and sell them only on the other site. Is it a good idea to 301 redirect those product pages from site 1 to site 2? Thanks for your help

    | Aikijeff
    0

  • One of my websites has received the following message: "We've reviewed your site and we still see links to your site that violate our quality guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes. We encourage you to make changes to comply with our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results. If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request." I have used the LinkResearchTools DTOX report to locate unnatural links and remove them. So far I've been able to remove or nofollow 50 of 350, and that's as far as I can ever go. The rest of the websites either don't respond or don't have any contact information. I added another 300 suspicious websites to my list and I'll try to get those links manually removed. Hopefully I can get 100 of the 650 websites (and a few more links) removed in total - at most. That is my estimate. I've been thinking of using the Google Disavow Tool for the rest and making sure to submit a nicely written report with spreadsheets to Google when I get to the reconsideration point. What are your thoughts on this?

    | zorsto
    0
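
    For the reconsideration question above, a minimal sketch of the disavow file format Google expects: plain text, one entry per line, # for comments, and domain: entries for sites that never respond to removal requests (the domains below are placeholders):

        # Removal requested 2013-08-01 and 2013-08-15, no response
        domain:spammy-directory.example
        # Single page carrying an unnatural link
        http://blog.example.net/post-with-paid-link.html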

  • Hello people, in May there was a dramatic increase in URLs blocked by robots.txt, even though we don't have that many URLs or crawl errors. You can view the attachment to see how it went up. The thing is, the company hasn't touched the robots.txt file since 2012. What might be causing the problem? Can this result in any penalties? Can indexation be lowered because of this?

    | moneywise_test
    0

  • Hi everyone, we bought a domain which had had German content on it for over 8 years, so the rankings it had were in another country's search engine as well. I've since changed the language of the content and the targeting in Webmaster Tools to Dutch (I've created unique content, in case you're wondering).
    Now we don't rank in the targeted search engine, nor in the search engine the website was previously ranked in. My question is: how can we fix this so that we get indexed and ranked in the targeted search engine? Thanks in advance.

    | Online_Supply
    0

  • Hi, I have this issue where my website does not appear in Google search results when I have SafeSearch turned on. If I turn SafeSearch off, my site appears no problem. I'm guessing Google is categorizing my website as adult, which it definitely is not. Has anyone had this issue before? Or does anyone know how to resolve it? Any help would be much appreciated. Thanks

    | CupidTeam
    0

  • While researching conversion rate optimization services, I checked out Conversion Rate Experts, because Moz used them a while ago and the experience was profiled recently in a Moz Top 10 (if I recall correctly). I double-checked the fees for service in case there was a typo. Their site says they don't just service the big boys but small and mid-sized companies too. We're a mid-sized outfit (at least I thought), and this is way out of our budget. Can small or mid-sized companies swing this? Their site says companies with $250,000-1 million in online sales are eligible. Does anyone know what a reasonable monetary definition of today's small to mid-sized online company is? "Most companies engage us for either: (i) Managed-service conversion rate optimization, (ii) Full-service conversion rate optimization, (iii) Full-service with walk-thru conversion rate optimization or (iv) Full-service plus implementation conversion rate optimization. We work on minimum 6-month engagements, starting from $26,000/month. We give all our clients a 3-month money-back guarantee."

    | AWCthreads
    0

  • Hi everyone, my website experienced a 30% drop in organic traffic following the Penguin 2.0 update in May. This was the first significant drop that the site has experienced since 2007, and I was initially concerned that the new website design I released in March was partly to blame. On further investigation, many spammy sites were found to be linking to my website, and I immediately contacted the sites and asked for the removal of the links before submitting a disavow file to Google. At the same time, I've had some great content written for my website over the last few months, which has attracted over 100 backlinks from some great websites, as well as lots of social media interaction. So, while I realise my site still needs a lot of work, I do believe I'm trying my best to do things in the correct manner. However, on August 11th, I received a message in Google WMT: "Googlebot found an extremely high number of URLs on your site". I studied the table of internal links in WMT and found that Google has been crawling many URLs throughout my site that I didn't necessarily intend it to find, i.e. lots of URLs with filtering and sorting parameters added. As a result, many of my pages are showing in WMT as having over 300,000 internal links! I immediately tried to rectify this issue, updating the parameters section in WMT to tell Google to ignore many of the URLs it comes across that have these filtering parameters attached. In addition, since my access logs were showing that Googlebot was frequently crawling all the URLs with parameters, I also added some Disallow entries to robots.txt to tell Google and the other spiders to ignore many of these URLs. So, I now feel that if Google crawls my site, it will not get bogged down in hundreds of thousands of identical pages and will just see those URLs that are important to my business. However, two days later, on August 13th, my site experienced a further huge drop, so it's now down by about 60-70% of what I would expect at this time of the year (there is no sign of any manual webspam actions). My question is: do you think the solutions I've put in place over the last week could be to blame for the sudden drop, or do you think I'm taking the correct approach and the recent drop is probably due to Google getting bogged down in the crawling process? I'm not aware of any subsequent Penguin updates in recent days, so I'm guessing that this issue is somehow due to the internal structure of my new design. I don't know whether to roll back my recent changes or just sit tight and hope that it sorts itself out over the next few weeks when Google has more time to do a full crawl and observe the changes I've made. Any suggestions would be greatly appreciated. My website is ConcertHotels.com. Many thanks Mike

    | mjk26
    0

  • Hi everybody. We've been working with http://www.lawnmowersdirect.co.uk/ for some time now. Rankings for broad terms such as 'lawn mowers' and 'lawnmowers' are superb, and we're pretty happy. Bizarrely though, rankings for products are very poor. For example this page - http://www.lawnmowersdirect.co.uk/product/honda-hrg465sd - is currently 17th for 'Honda HRG465SD IZY'. We've done lots of work onsite, clearing up duplicate content, improving copy and tidying up URLs. However, none of this seems to have had a major impact on the product pages. Any suggestions would be much appreciated!

    | Blink-SEO
    0

  • So I was just surprised by officially becoming one of the very few to be hit with the manual penalty from Google for "unnatural links from your site." We run a clean ship, or try to. Of all the possible penalties, this is by far the one most unlikely to occur. Well, it explains some issues we've had that have been impossible to overcome. We don't have a link exchange. Our entire directory has been deindexed from Google for almost 2 years because of Panda/Penguin - just to be 100% sure this didn't happen. We removed even the links that went to my own personal websites - which were a literal handful. We have 3 partners, who have nofollow links and are listed on a single page. So I'm wondering... does anyone have any reason to understand why we'd have this penalty and why it would linger for such a long period of time? If you want to see strange things, try to look up our PageRank on virtually any page, especially in the /guide/ directory. Now the bizarre results of many months make sense. Hopefully one of my fellow SEOs with a fresh pair of eyes can take a look at this one. http://legal.nu/kc68

    | seoagnostic
    0

  • We are considering changing the way we treat our brand (TTS) in our page title tags. In Moz I found the following advice on the optimal format: Primary Keyword - Secondary Keyword | Brand Name,
    or
    Brand Name | Primary Keyword and Secondary Keyword. Are these of equal merit, or is the former (Primary Keyword | Brand) the better route? Currently we use the second version - 'Brand | Primary Keyword' - but we are proposing to shift to 'Primary Keyword | Brand'. We currently get an awful lot of brand traffic that converts very well, so as a minimum I need to be sure that no harm is done. All views appreciated. Many thanks. Jon

    | TTS_Group
    0

  • Hi all, I'm using the StudioPress Genesis Enterprise child theme in WordPress + InstantWP + Yoast SEO. I have created a standard home page (see image) along with bespoke pages. My question is this: when I select Pages | All Pages, I cannot see the home page and therefore cannot optimise it with Yoast SEO. What am I doing wrong? Thanks Mark

    | Mark_Ch
    0

  • We have 2 ecommerce sites. Both have been hit by Penguin (no warnings in WMT) and we're in the process of cleaning up backlinks. We have link directories on both sites. They've got links that are relevant to the sites but also links that aren't relevant. And they're big directories - we're talking thousands of links to other sites. What's the best approach here? Do we leave it alone, delete the whole thing, or manually review and keep highly relevant links but get rid of the rest?

    | Kingof5
    0

  • The site I am working on has barely any outbound links; being a fairly niche site, it is hard to imagine many relevant places to link to. Can not linking out lead to problems in Google's eyes? I guess it would be good from the point of view of inbound links and their dilution, but is there anything about it that looks strange on Google's link graph?

    | motiv8
    0


