Hi Duncan,
I personally wouldn't be doing it like that. Have a read of this article on .htaccess rewrites to redirect upper case to lower case. It should give you a copy & paste solution to what you wish to achieve.
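For reference, the usual approach in articles like that one is an Apache RewriteMap — a rough sketch (note the map itself has to be defined in the server or vhost config, as RewriteMap isn't allowed inside .htaccess):

```apache
# In httpd.conf / the vhost config - define a lowercasing map
# (RewriteMap cannot be declared inside .htaccess)
RewriteMap lc int:tolower

# In .htaccess
RewriteEngine On
# If the requested path contains any uppercase letter,
# 301-redirect to the lowercased version
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```

Test it on a staging copy first, as rules like this can loop if another rewrite also touches the same URLs.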
-Andy
Just to check, are the links nofollowed or followed? If they are nofollowed, don't worry about it - you won't need to disavow.
As already pointed out, there is no harm going to come from disavowing them, but if you do, make sure you disavow from a domain level rather than just a page. This will ensure that no more links can sneak in from that domain in the future.
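For what it's worth, a domain-level entry in the disavow file looks like this (the domains below are placeholders) — one entry per line, uploaded as a plain .txt file:

```text
# Lines starting with # are comments
# "domain:" disavows every link from that domain, not just one page
domain:spammy-example-one.com
domain:spammy-example-two.net
```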
-Andy
Link building for any site is a huge subject - certainly not something you can cover here.
However, your best links are going to come from amazing content / linkable assets. If you have something that your niche are going to enjoy reading, then you will find links will happen.
If you are happy performing outreach, you can speed this process up and let key influencers know what you have, but tread carefully... If you just started to Tweet someone you have never spoken to before and asked for links, it can be taken badly and you can ruin your chances for the future.
Here are a few resources to read:
-- MOZ beginners guide to link building -- Backlinko -- Point Blank SEO -- Post Penguin Link Building (Disclaimer... This is my site)
Hopefully you will get some good pointers from there.
-Andy
Hi,
...what are the first directories you add the site to? What are some links you try and acquire first?
I wouldn't be starting here. Directories for the most part are pretty useless, unless you are adding yourself to a niche / local one that might help with specifics.
Follow all best practices with your site. Ensure that you have thoughtful and well phrased page titles that describe the page. Make sure your content is amazing (average isn't going to cut the mustard). Spend additional time to ensure this happens. Don't fall into the trap of overusing keywords and make sure your site is well configured to avoid duplicate pages.
There is so much to cover here, that I would suggest you spend some time reading over the MOZ Beginners Guide. This will give you a good grounding into what you should be doing.
-Andy
I think this might explain your issue...
Gary Illyes @methode
Bad news: we might have a problem with reporting the number of indexed URLs in the SC Sitemaps feature. Good news: we're looking into it
Soooo, it looks like the number of indexed pages in Webmaster Tools (Search Console) is being reported incorrectly.
That would explain what you are seeing
-Andy
Is it a good move to pick 10 free blogging sites to build links
No is the simple answer, really. Google is targeting these types of link building practices, so I would be looking to create amazing content and then performing your outreach from there.
By amazing content, I don't mean that you should write 500 words and that will suffice - because for the most part, it won't. If you want to really succeed with link building, then you need to create content that takes you past what most others are doing. If you can create 10 articles in a couple of days, then you aren't doing enough.
Take your time - spend a number of days creating something that is fully researched, very in depth and answers many questions. You will have a really good chance of this being picked up by a number of reputable sources.
-Andy
Hi Micha,
I can't think of why Google might object to this as it isn't the page that is refreshing constantly. That said, iFrames are frowned upon from an SEO standpoint as there have been many issues with blackhat techniques in the past.
To me, I would be looking at how big the iFrame is, where it is on the page, is it every page, what content is contained in there - the usability might harm your SEO in a roundabout way though.
At best, it will do nothing to your SEO - at worst, it could cause issues.
Would you not be able to achieve what you are wanting to do by using Ajax or similar?
-Andy
You aren't disturbing, Marco
It really isn't uncommon to see discrepancies like this. I see them every day! A 25% drop like this suggests to me that Google is perhaps doing a little bit of a reshuffling.
I would wait just a bit to see if the number of pages starts to increase as there isn't an awful lot else you can do - it sounds like it has all been done.
-Andy
I have seen this before where it was trackbacks causing issues and injecting spammy keywords into them. I have also seen RSS feeds carrying spammy backlinks.
It is also worth remembering that lots of the bad links like these will now be dead / removed / changed.
Have you done a disavow of these spam sites yet?
-Andy
Just thinking about this, it might be worth looking at the page encoding as it could be a lack of a UTF-8 declaration.
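If it does turn out to be the encoding, the declaration is just a meta tag early in the head (ideally backed up by a matching Content-Type header from the server):

```html
<head>
  <!-- Declare the encoding as early as possible in <head> -->
  <meta charset="utf-8">
</head>
```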
Do you have the full URL with that code in it?
...this one is definitely making me grey!
I feel your pain there
-Andy
No problem at all. Just update here if it hasn't rectified itself soon and we can take another look
-Andy
Ah sorry, I missed that bit, Marco.
When Google drop pages from the index, it can be for a whole host of reasons. However, Google never indexed 100% of pages (or very, very rarely). If you were at 95% and now at 75%, then this would suggest to me that Google has either lost some level of trust in the pages or you will just have to wait until the pages are re-indexed and Google has decided what to do with them again.
I would be tempted to wait for a little bit as over time, you should see pages being re-indexed again. If you have already re-submitted the sitemap, just make sure there is no problem with that - rebuild it and then re-submit if you haven't already, just to be sure.
-Andy
Hi Kelly,
**What is happening is the reader is going to page domain.com/ja/blog/article, then coming to this page /404?aspxerrorpath=/Sitefinity/WebsiteTemplates/App_themes/"website"/fonts/open sans regular**
So is the page redirecting to the 404 or is the article link just automatically taking you there? Or is it just that you are seeing these 404's in Analytics?
It sounds like it might be a URL configuration issue - I am not familiar with Sitefinity though. Is there any translation service within the site that could cause it to write a page in Japanese incorrectly?
-Andy
Hi John,
As these are nofollow, I wouldn't worry about this at all. A nofollow flag is essentially what you get when you disavow a site through Webmaster Tools (Search Console).
There is no harm in disavowing these links if you feel there is a potential to switch to dofollow, but by the sounds of things, this probably wouldn't happen - even if it did, it sounds like a spammy technique that wouldn't cause you issues. Google would probably see hundreds (thousands?) of links all of a sudden appear with a dofollow tag and just ignore them.
-Andy
Hi Marco,
It all depends on what you were tracking as phrases before, and now after you have made the changes. If you were tracking for the phrase "Red Sneakers" and changed the title to "Blue Sneakers" then you would expect to see a drop with your original phrase. Have you updated your tracking to compensate for the changes?
-Andy
I've been doing it for years and have never come across this issue.
Look towards an algorithmic penalty too. There is no telling what could have caused this, so you need to cover all bases.
-Andy
Without actually looking at both sites, it is impossible to say why. It could be something or nothing. It could be just one link or over optimisation on your site?
I would do some digging and see what you come up with.
-Andy
Hi Deedra,
It isn't about SEO per se, but if your client is trying hard and along come a number of competitors, one of whom seemingly does no SEO at all, then you would have to look at both their content and backlinks. If someone is writing content (or has content) that Google sees as more reputable, then this could easily be part of their success.
If they are sharing externally, or via social channels, this could also be having a big positive push for them.
Our client is even spending money in Adwords and their competition isn't
This will have no impact on SEO at all.
Make sure that any link building you are doing is from good sources - make sure all content is amazing and better than others - make sure there is enough for others to want to share it through Social Media.
-Andy
Hi,
First of all, have a read of how DA is scored here. There are so many factors that play a part in this that it would be impossible to tell exactly why - most people like to start by looking at their backlinks, so perhaps start by checking those out using Open Site Explorer?
-Andy
Just disallow in Robots. No need to do anything else.
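As a quick sketch (the path here is just a placeholder for whatever you need to block), the robots.txt entry would look like:

```text
# robots.txt at the site root
User-agent: *
Disallow: /folder-to-block/
```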
-Andy
Hi Alfred,
if I mine the logfiles and only deindex stuff that Google sends no further traffic to after a year could this be seen as trying to game the algo or similar?
No, this wouldn't be seen as gaming the algorithm. All you are doing is finding those pages that are serving no purpose and removing them.
if the articles are noindexed but still exist, is that enough to escape a Panda penalty or does the page need to be physically gone?
There are two thoughts on this. The noindex route is seen as the more gentle way to see if this fixes things for Google - in my experience, it is generally not enough.
What you need to remember is that even if you noindex a page, Google can still apply a penalty based on the page existing as the content can still be seen / crawled. There is no hard and fast rule to say they will, but it is a real possibility.
The advice to my customers is generally to remove the pages in question to avoid all possible doubt. Find the pages that are not delivering benefit to the site or visitors and start there.
I would be advising you undertake a content audit to fully assess the site and what really needs to be done. There is little point in killing lots of pages if it isn't lots of pages causing you issues.
-Andy
Hi Mike,
I think I understand what you are saying and there is no problem with doing this, and it is in fact a recommended practice.
If the page has changed, there is no harm in changing the URL, site focus, Title, Description etc, and have it as a new page. A 301 from /business to /enterprise would be the best way to handle this.
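As a rough example of that redirect, assuming Apache and .htaccess:

```apache
# .htaccess - permanent redirect from the old URL to the new one
Redirect 301 /business /enterprise
```

Note that Redirect matches the path as a prefix, so /business/foo would also go to /enterprise/foo - if you only want the single page, use `RedirectMatch 301 ^/business$ /enterprise` instead.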
-Andy
Well, you could decimate most of the site and fix many issues, but would this be enough to pull it back for you?
Of those that would remain, would you consider them to be more authoritative posts? Would they stand up in the face of Panda without issue?
-Andy
Hi,
I do mean use robots.txt to block crawlers.
What you need to do is first noindex the site in question and then after a period of time, you can disallow it via the robots.txt.
The reason you do it this way is because right now you will have pages from this site indexed in Google - these need to be removed first. You can either do this with the noindex META and wait for Google to spider the site and action all of the noindex requests, or to speed things up, noindex the page and then remove it with Webmaster Tools.
If you don't do this, you are then just blocking the site from Google ever seeing it, so you will probably find that pages remain in the index - which you don't want as this is duplicate content.
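The page-level noindex from the first step is just a meta tag - a minimal version, assuming standard HTML:

```html
<head>
  <!-- Step 1: ask search engines to drop this page from the index.
       Do NOT add the robots.txt Disallow yet, or crawlers will
       never see this tag and the page will stay indexed -->
  <meta name="robots" content="noindex">
</head>
```

Only once the pages have actually dropped out of the index should the robots.txt Disallow go in.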
-Andy
Hi Melissa,
Penalties do actually just expire, but it isn't like there is a set time period for this, so if you have no mention in Webmaster Tools of any penalty, then you really don't have one. You won't be given the option to try and do a reconsideration request either. I am pretty sure it was John Mueller that confirmed this.
I would still go through and disavow sites that have not got back to you though, just as best practice and to do a little damage mitigation. You may find that the penalty comes back at some point in the future if you don't.
-Andy
If the pages don't fall within the international element of the site, or aren't covered by HREFLANG, then look at the pages that are being caught as duplication. Are they required pages? Are they there just as filler content / doorway pages? Could the pages be noindexed? Is there an opportunity to set a canonical for them to avoid duplication?
-Andy
Hi Carl,
Am I right in assuming that you have http://store.bmiresearch.com/newzealand/power & http://store.bmiresearch.com/bangladesh/power to try and satisfy results in those countries?
If you want to avoid duplication issues, you need to be implementing an international SEO strategy and make use of hreflang.
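Using your two URLs, the annotations would look roughly like this - note the language-region codes here are assumptions, so adjust them to whatever each version actually targets:

```html
<!-- Placed in the <head> of BOTH pages; each page lists
     every alternate version, including itself -->
<link rel="alternate" hreflang="en-nz" href="http://store.bmiresearch.com/newzealand/power" />
<link rel="alternate" hreflang="en-bd" href="http://store.bmiresearch.com/bangladesh/power" />
```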
Here are a few resources that I would suggest you have a read though.
However, don't worry if after this is done that they are picked up as duplication because that will just be MOZ notifying you of the pages rather than actually saying it's an issue.
I hope these help
-Andy
Honestly, I wouldn't use just one because they are all going to give you slightly different information.
Use Moz, use Google and then try SEMrush, Ubersuggest, Keyword Tool.io and Wordstream.
The idea is to try and get as much information as you can, so use multiple sources for anything like this to increase your chances. Collate all of the data, remove duplicates and see what you have.
-Andy
Hi Ankit,
All is not lost, but it all depends on the time you have to put in to correcting it.
Have you ever tried to fix the issues with Panda? There is a wealth of information available out there - here are a couple of Google ones to read:
Remember that Panda focuses on thin and duplicate content, which translates to low quality, so if you think that you have ways to correct this, there is no reason you can't pull the traffic back.
-Andy
Hi Marina,
How are you trying to verify it? Meta Tag? Upload a file? DNS / Analytics?
Are you seeing an error message when you are trying to do this? You could also check here to see if it helps answer your question.
-Andy
Hi,
It takes time for OSE to keep spidering sites so if you are seeing something from a couple of months ago, but the link has gone, then you don't need to do anything else. It will get back round to spidering those sites again and will update.
However, if you are concerned that links on spammy sites keep reappearing, then you would do well to disavow it, but other than that, no need to worry. Google isn't going to crawl at the same intervals as MOZ do, so as long as you are ensuring these links are gone, then you are doing everything you need.
-Andy
maybe I should try re-establishing robots.txt and a site map and seeing if Google recrawls the old domain and picks up the 301's to the new one.
If you can do this, then it is definitely worth doing.
-Andy
Hi Matt,
Can you not remove the redirect temporarily while you sort the change of address over?
-Andy
Hi Robert,
Do you have any external tracking set up that would confirm this for you? I'm always cautious about trusting results from my own desktop and like to have them verified.
-Andy
Can I just add that Google have confirmed that they don't use bounce rate when figuring your search positions, so as long as you are just looking at this from a usability study, then that is great. I just didn't want you worrying this might impact results.
Will clicking the home button on an iPhone / iPad be classed as a bounce? I highly doubt it.
-Andy
As far as I understand, a bounce is the same as a bounce on a desktop. If you go back using the browser button, this would be a bounce. If you hit the home button, I suspect that this would be as well - but it all depends on how this is reported back. I can't see how changing apps might be considered amongst this though.
Is there a reason you are looking for this information?
-Andy
Hi,
Have you checked to see if these pages are linked internally or externally from anywhere else? This could be causing issues for Google, but shouldn't.
When did you last request the URL's were removed? And did you check back to see that the notice said they had been removed?
-Andy
Just hold fire - it normally takes 2-3 weeks, but I have known it to take a bit longer. If it takes more than another 2-3 weeks, you can always request another to be sent in case it's been lost.
-Andy
Hi,
I just searched for the phrase "I Love Diamonds", and you were the 1st result that I saw. Are you referring to seeing something different?
-Andy
Hi Aniket,
I am planning to use rel=canonical on the multiple pages pointing back to the new page. Will it serve the purpose?
If I understand correctly what you are looking to do, then yes, this will do the job nicely for you. Rel=canonical is often used on multiple pages to point back to one primary to help remove duplication issues in Google's eyes, but still allows you to have the pages in place.
However, if you have lots of these static pages that just exist and are no longer serving any real purpose, then consider 301ing them instead.
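A minimal sketch of the tag on each of the duplicate pages (the URL is a placeholder):

```html
<!-- In the <head> of each duplicate/variant page,
     pointing at the one primary version -->
<link rel="canonical" href="http://www.example.com/primary-page" />
```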
-Andy
Hi,
Do you mean create a pre-launch page within your current site? I would prefer to do this myself, but if they are wanting to make use of the new domain to form an e-mail campaign landing page, then as long as it is done nicely, I can see no problem with this.
Best of luck
-Andy
Hi,
There is no harm in using the new site as a pre-launch kind of thing, but I would be noindexing it until you are ready to make the switch. At that point, remove the noindex and ensure 301's are in place for the move. Without the noindex, it leaves it open for Google to pass judgement on - not something you want on a pretty empty page.
If your IT department just want to use it to form the basis of an e-mail campaign, then just don't try to do any SEO and you will be good. However, at least ensure the page is branded and has some content and a nice page for someone to land on. You don't want an empty Apache holding page with a sign-up form.
-Andy
Hi Marina,
I'm afraid you can't do this. There is no page-level markup for a page like this. You can add schema to the listings on the page though.
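For the individual listings, one option is a JSON-LD LocalBusiness block per business - all of the details below are invented placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Exampletown"
  }
}
</script>
```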
-Andy
Hi Jay,
Do you have any dates that you can refer to in Analytics that show drop that might coincide with a penalty / algorithm update?
-Andy
No need to apologise, Richard - we need Marina to confirm the structure
-Andy
And this is absolutely correct for a business, but from what I understand, it is the page that all of these businesses sit on that Marina was referring to.
If I have this wrong and Marina is only looking to mark up each business on a page, then this is what you would do and there is absolutely no issue with that.
-Andy
From what I understand, Richard, Marina wishes to mark up a page that relates to multiple businesses that are not related to each other. I'm not even sure what markup you would use in this instance.
Local Markup relates to businesses, not pages that are more like a directory.
If I have misunderstood the page concept, then I could have this wrong, but I don't think I have.
-Andy
Hi Marina,
The trouble is, Markup isn't meant to be used in this manner. It is for a single item or single entity. If you had a page dedicated just to one business or chain, then it's a little different. Someone might be able to suggest a way around this but I am not aware of a way to Markup a page in the manner you are looking for.
Have a read of these examples over at Google.
-Andy
Hi,
How many scrolling pages are you suggesting?
I would start by having a read of this resource on infinite scrolling over at Google and then this example from John Mueller.
What you are suggesting with pagination makes sense and here is what Google say about it:
Infinite scroll page is made "search-friendly" when converted to a paginated series -- each component page has a similar <title>, with rel=next/prev values declared in the <head>.
I hope this helps,
-Andy
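To illustrate what that quote describes, the head of a middle page in the paginated series would look roughly like this (URLs and titles are assumptions):

```html
<!-- Page 2 of the paginated series -->
<head>
  <title>Widgets - Page 2</title>
  <link rel="prev" href="http://www.example.com/widgets?page=1">
  <link rel="next" href="http://www.example.com/widgets?page=3">
</head>
```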