Yes, screen readers use them, but why not incorporate them into the site with CSS? It will help their rankings!
Rather than trying to get them to remove them, argue the SEO case for showing them.
The next Penguin update is just 3-4 weeks away now if consistency is anything to go by.
They may get what is coming to them in a short time frame! That would be very sad indeed, and recovery on large domains can be a long process, taking over a year in many cases.
Remember that Google says buying links is a no-no, and that includes all kinds of buying, such as "I will give you a gift in exchange", etc. Those are hard to detect, but the others are so obvious that a computer can detect them with a simple algorithm. Those are the ones you will get hit by, and it won't be long before someone else in your niche reports them.
It's just a matter of time. Every update scrapes deeper into the barrel until everyone is affected by it. One thing is for sure: they will have suppression from the Penguin algorithm. Those bad links act like minus points; eventually they will outweigh the good ones and the site will drop in rankings. Removing bad links can actually increase rankings!
It's not that it is impossible, but the results you see may not reflect what others see.
With cookies set on your computer, and with IP and language playing such a strong role, it will be difficult to see accurate results without forcing the location with something like the gl=US parameter.
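For example, a query forced to US results might look something like this (the keyword is a placeholder; gl sets the results country and hl the interface language):
https://www.google.com/search?q=your+keyword&gl=US&hl=en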
Do you have a local listing for your address? That will help a lot as well as marking your site up with rich snippets like schema.org
They will get slapped at some point. If Googlebot does not get them, a disgruntled website owner with a penalty will look at the top ten results, notice it and report them.
It is a direct violation of Google's terms: https://support.google.com/webmasters/answer/66353?hl=en
There is no problem in my opinion with using CSS to style your H1 tags so that they look nice on your site. But if you hide them you are just waiting for a penalty.
Also, Googlebot is much smarter than it was this time last year: it is able to render pages and see the content on the page rather than just looking at the code. This came out of the work around the Hummingbird era and the page layout algorithm that targets ad-heavy content above the fold, and it also looks for things like large logos and what sits above the fold. All of this means Google knows what is visible and what is not, and will rank you according to those factors. The H1 tags might be giving them no benefit at all right now without them realizing it, when in fact they could be benefiting from them. How much business will they lose if they lose all their rankings for 3 weeks or maybe longer? Is it worth what they think they are gaining from it?
If they are really stubborn you could tell them to change a page or two and test it out. Do a fetch in WMT so the page is indexed quickly and see the results over the course of the week.
Ok so I looked at the site and there are a few things you can do.
Firstly, I looked at your site on archive.org all the way back to 2009. It seems you had both a UK address and a Bristol address on the contact us page for that period. In July of 2012 the site was redesigned and the London address was used on the homepage. Then from July of 2013 until now the registered address in Bristol has been used at the bottom of the page, in a format favored by Googlebot and Google+.
Second I looked at Google.com results (not UK) to see where your default local listing points to. The answer is Bristol.
First of all, I do not see a local listing for London; I would get that sorted out as a priority.
Secondly, I would use rich snippets with schema.org to mark up your code (see the sketch at the end of this answer).
Lastly (an option if the others do not solve the issue), I would create a London page on the site with correct schema featuring the London office and the services provided at that address. Maybe something like Mubaloo.com/london-office/
I suspect however that most of your issues will resolve fairly quickly with a local listing. But giving Google more content to work with directly on the subject matter is always a winner.
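For the schema.org markup mentioned above, a minimal sketch of a LocalBusiness block for an office address might look something like this (the name, street, postcode and phone number are placeholders, not your real details):

<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Your Company Name</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>
    <span itemprop="addressLocality">London</span>
    <span itemprop="postalCode">EC1A 1AA</span>
  </div>
  <span itemprop="telephone">+44 20 0000 0000</span>
</div>

Putting it in the footer where the visible address already appears keeps the address users see and the markup Google reads in one place.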
I have a question: what is the difference between a local version of Google vs. the default Google in regards to search results?
Many factors. The results on google.com can be a real mix of international listings as well as local ones. The location of backlinks to a website can push Google to rank a site in that region.
Let's say you have a .mx site but have lots of links from UK websites because it is about holidays in Mexico; then you could rank well in google.co.uk. It would be more likely, however, if the domain was a .com, as Google gives more universal power to those types of TLDs.
The domain is a .mx site, so wouldn't it make more sense that this page would rank higher on google.com.mx instead of the default Google site?
Not if the above is true. Also, if you have less of a local feel to your content you might rank better among a group of similar sites that are more broad in nature.
Also, what determines whether a user gets automatically directed to a local Google version vs. staying on the default one?
There are a lot of factors involved, IP address and Google's ability to determine where you are.
The language you use and even the spelling of words, for instance in the UK vs US it could be color vs colour
There are lots of answers to your questions but those are just a few to give you an idea of the 200+ algorithm factors for ranking in such circumstances.
Sorry been mega busy
First of all, never hide content from Google if a user is unable to view that information. You will get slapped for it. Even if the algorithm does not pick it up, someone will report it at some point. That is a bad foundation to start from.
What you are trying to do is complicated, and it is hard to get the full picture in my head; hence the lack of response from others in this forum, I think.
You need to describe exactly what will be on the page and why, what will be on the others, and why those pages need to be indexed. This way we can work out if the strategy you are taking is even the right one. There is likely a better way to do it.
"I'm going to disagree with Gary" Disagree with what? At no point did I say it was not possible to recover. I have in many posts before cited that John Mueller has always said EVERY site is recoverable. Here is a link to one of those statements by John (View Question 4):
http://www.reconsiderationrequests.net/google-hangouts/12-April-2013.php?#Q4
I said "You will almost always have some suppression if you are unable to remove bad links." This Is a FACT!
It IS possible to split an arrow in half with another arrow from 100 yards but its near impossible. It is all in the wording with John and Matt, just because something is possible does not make it near impossible. I have done extensive testing on this as have you considering you have a business in this field. What is a FACT is that even if you recover you will almost always have some suppression from bad links even if you RANK #1 you could still be suffering from suppression that would give you a bigger gap between 1st and 2nd.
Recovery WILL take a very long time in most cases. The question is how long and how much time and money are you willing to invest in making this happen. The money invested can also be calculated in the loss of revenue you have every day until you do recover not just the money spent trying to fix the issue.
The cost spiral out of control when you take 6 months for example as a recovery time.
Here are the facts!
I have also given an answer on a similar topic about whether one great site is better than 10 semi-OK sites.
One site is better according to Google, and done correctly it will do great. However! Currently, in many niches, most of the top 10 have suppression or even a penalty and are still ranking in the top 10 because the whole industry is pulling dirty SEO tricks. This has led to many companies owning multiple sites as backups, and because they are clean domains they are ranking up top for them. Google is unable to deal with this problem even when I report them directly to John Mueller. I know companies that have 4 sites in the top 10 listings. They make the whois private, have different contact details for each business, and even if you call them they answer in the name of the domain using a simple phone-number screening system.
For the last 10 years it has been more advantageous to do the things Google says not to do, because you will spend all your time chasing the top 5, which will rotate and change like the wind as the churn-and-burn sites dominate most SME niches. Should you focus on one domain? YES, because trying to do more is a real pain and putting your focus into one will yield better results. Can you protect yourself from future penalties? If you happen to make a mistake then you have no backup and you are back to square one. This is why most SMEs have a backup domain or two because of the risk factor, a totally ridiculous situation that Google has created because of long recovery times.
I also thought about going with an EMD, as so many sites I see have the keywords in the name, so I figured to try that route too.
Another ridiculous situation. There are many issues with EMDs; Google has said on multiple occasions that EMDs are bad, and at some point there will be a big slap on them, I am sure! And all your hard work will be flushed down the toilet at the click of a button. HOWEVER! They are taking all the top rankings right now in most niches (excluding the major ones). Another reason why the churn-and-burn spammer community are winning again. The stumbling block Google hit last year when trying to slap these sites was that a few companies, in Florida I believe, said "HEY, we have been in business since before the internet and called ourselves lawyers of Miami" or something like that. Matt Cutts was unable to argue the case that they did not do it because of SEO, lol, and the algo was changed very, very quickly. So the answer is: if you are going to use an EMD, be aware that you are in the firing line for a penalty in the near future. One of the main reasons is that if you want people to link to you using your company name, you will basically be getting keyword-rich links to your site, and the Penguin algo is looking for exactly that! Your best option is to start a fresh domain and use one of your keywords matched with a unique brand word, so that you get some benefit of a keyword without being an EMD. Write new content and see how you rank. You could even keep the old domain up, remove the contact details from that site so there are no issues with the same address, and keep working on it in the hope that someday you will manage to recover it.
I don't like the answer I have to give you, but I have been online since 1993 and have been in the website building/SEO/SEM game since about 1997. I have seen it all; I have seen every lie, every bend of the truth, and tested almost every example of what we have seen up to this date. Just because something is stated as fact does not make it a feasible choice. One last example of that: Google said negative SEO was not possible. They then changed their statement six months later to say it was possible, but only in extreme circumstances.
The real fact of the matter is that it IS ALMOST IMPOSSIBLE. The ONLY time it is possible is when the domain does not have a single link in its profile, prior to the incident, that could be considered spammy, which in itself is almost impossible. It's all in the wording. Sorry to be a downer, but it's better to give hard truths than false hopes. There are few stories of recovery because recoveries require too much time/money/work when compared with starting a new domain.
If you want a good shot at recovering, Marie Haynes is an expert in the field and has tremendous experience; if you stand any hope of recovering, it is with her. I hope your decision proves to be the right one now that you are armed with lots of knowledge.
It is unlikely because Google normally gives preference to the original for a fairly long period of time. However with Google there are no certainties but they do get this right in almost all cases I have seen.
The only users you should see decline on your site are non-UK visitors, as you are telling Google with x-default that they should be sent to the .com.
There are many huge companies adopting this process and also thousands of other smaller sites, I think Google has ironed out most of the issues over the last 2 years. You are more likely to see a slower uptake on the new domain than the original than the other way around.
Hope that helps
John Mueller from Google Says Yes, ditch the old site and start a fresh one.
You will almost always have some suppression if you are unable to remove bad links.
Better to start a fresh site and have no lingering issues.
301 redirecting a site with a bad backlink profile to yours with almost identical site structure will pass those bad link issues on to your new domain even if you no longer have manual penalty issues. Almost all sites have a Penguin suppression these days. Some are unnoticeable as their good links far outweigh their bad links.
The actual page you want to look at is https://support.google.com/webmasters/answer/189077
hreflang is the tag you should implement.
I have had long chats with John Mueller at Google about this.
Your setup should be something like this on all pages on both sites.
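A minimal sketch of those tags, assuming the .co.uk serves UK English and the .com is the default (example.co.uk, example.com and /your-page/ are placeholders for your own URLs):

<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/your-page/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/your-page/" />

Each page carries the same pair pointing at its own equivalents, and the identical block appears in the head of both the .co.uk and .com versions.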
Within about 7 days depending on the size of your website the .com should appear in favor of the .co.uk for your US based results. For me it happened within an hour!
Setting your .com as the default will be better than setting your .co.uk. The .co.uk is already a region-specific TLD and will generally not rank well in other countries' results, even if the hreflang is set to do differently.
This will let Google decide where to send traffic to based on their algo/data.
If you use a canonical tag you will be suggesting/pushing US users to the original content instead of the US site.
The sitemap is an indication to Google to crawl those pages. There are instances where people have meta tags with noindex, follow and still list those pages in their sitemaps, so that Google will crawl all the links listed on the page but not index the page itself.
The meta tags or headers on your page will be the signal to Googlebot on how to handle that page, regardless of your sitemap and what's on it.
You run into a number of issues by having these pages indexed.
1. Lots of internal duplicate content. Google has said this is not a problem and that they will serve up the best result, but it can trigger Panda issues.
2. The content always changes so you will confuse Googlebot and have issues ranking for specific terms for any period of time. (your SERPS would fluctuate like crazy or trigger a quality algorithm)
Some ideas:
One thing you could try is loading all the matches onto a page, only showing the top 3 matches with an option to reveal more, and marking all the code up with schema. This way the content will always be on the page and able to be crawled by Googlebot.
Another option is to not index these pages at all and create static pages for each item. But this defeats the purpose of what you are trying to rank for.
Serving up random content is always going to be an issue for Googlebot, but more and more webmasters have responsive designs that hide and show content based on clickable actions on pages. Googlebot indexes all the content but is smart at working out what is also visible to the user and giving preference to it.
I have released tens of thousands of pages at once on many occasions on multiple sites and never had an issue. Why should there be an issue? If the content is good then Google wants it.
If it's poor, low-quality content then you are feeding Panda; it will take an overview of the total quality of the site and then slap you. So decide how useful all of that content is and why you want it all indexed.
You could even deploy all the content at once and set some categories to noindex and change them bit by bit to see the effects. I think Panda runs about once a month right now so you will need to be a little patient to see overall results.
Good Find SMPoulton
@KempRugeLawGroup, I just visited your site with an iPhone 5 user agent and it redirected me to mobile.dudamobile.com/site/kempruge?url=http%3A%2F%2Fwww.kempruge.com%2F#2770
Hope that clears things up?
OK I think I understand what you did
"The old site, which has many backlinks to the new site, is still in Google's index"
So you are pointing links from a penalized site to your new clean site, hope you are using nofollows? hmmm. Not a good idea if not!
I would not 301 redirect either. As this will simply assign all the bad stuff over to your new domain as well.
The blog being internal is a good thing now, but the low quality will affect the whole site eventually. Low-quality content must be addressed or Google will hit the site hard with the Panda algorithm. That low-quality content then links to your internal pages, which is not a good idea either.
If the blog offers no real value and you are unable to maintain it then remove it for now completely and focus on creating a well rounded good quality site first. A few good blog posts is better than a hundred terrible ones.
I have confirmation directly from Google's John Mueller that this is not a problem and the penalty will NOT pass between domains with hreflang.
Services like tynt.com and various other JavaScript functions allow you to change behaviour based on many actions you perform in your browser. They can change the function of a right or left click, and can even paste a citation when you copy and paste content from a site.
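As a rough sketch of the kind of thing those services do (this is not Tynt's actual code; example.com is a placeholder), a small script can append a citation to whatever a visitor copies:

<script>
document.addEventListener('copy', function (e) {
  var selection = window.getSelection().toString();
  if (!selection) { return; }
  // append a "read more" citation to the copied text
  var citation = '\n\nRead more at: http://www.example.com' + window.location.pathname;
  e.clipboardData.setData('text/plain', selection + citation);
  e.preventDefault(); // stop the browser writing the original selection on its own
});
</script>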
Hope that gives you a quick overview.
I just looked at the current design and would recommend that you also use schema.org to mark up your breadcrumb and other elements. This way you make 100% sure Google understands your site structure.
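A minimal sketch of breadcrumb markup using schema.org's BreadcrumbList (the URLs and names are placeholders for your own categories):

<ol itemscope itemtype="http://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="http://www.example.com/services/"><span itemprop="name">Services</span></a>
    <meta itemprop="position" content="1" />
  </li>
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="http://www.example.com/services/widgets/"><span itemprop="name">Widgets</span></a>
    <meta itemprop="position" content="2" />
  </li>
</ol>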
Removing your left navigation will remove the link juice being sent to those pages however. You will have to test how these internal links affect each page and how much influence they have on rankings.
Google has always said, and very recently repeated, that internal duplicate content is not an issue; Google will simply decide which version of the content is best to return in the results.
If you are concerned you have a few options instead of what you are doing.
Use the meta noindex so that Google does not index the data. If you can't do that because of WordPress, then this can be set externally using the X-Robots-Tag HTTP header.
https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag
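For reference, the two forms described on that page look something like this; the meta tag goes in the head of the page, while the header is sent in the HTTP response (noindex, follow is just one possible combination):

<meta name="robots" content="noindex, follow">
X-Robots-Tag: noindex, follow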
Hope that helps. Personally I would keep the page up, because Google is used to dealing with WordPress and would be punishing thousands of sites if this was really an issue.
Before you go down the long and painful road to recovery you must be informed of the recovery process.
I speak directly with Google employees on this matter weekly.
1: Most top contributors on the Google forums are misinformed and will spout incorrect information to you as fact.
2: Your first reconsideration request will likely fail even if there is nothing wrong. Google does this to make you look even deeper and clear things up that might be borderline OK.
3: You REALLY MUST show Google a report of the work done: list every dofollow link, its contact information and the action taken (no reply, removed, 5th try, pending, no contact info, etc.). It is also good to list the type of link (directory, blog post, comment, article, etc.); this shows that you have done some real work.
4: IMPORTANT: Once you get a revoke YOU WILL NOT RECOVER FOR A LONG TIME. It can, and likely will, take up to and in many cases OVER 1 year to see some recovery of your site. This is why to this date we have seen little in the way of recoveries. The disavow tool needs to get to work, and once most links have been re-indexed the algorithm must be updated with the new information. THEN you must wait for a Penguin refresh, currently running every 6 months. That's just twice a year!
5. Once you see a recovery you may well still have algorithm issues for the remaining unnatural links. If you imagine that a new site starts at 0 points, your revoked site will be starting at -X points, this is a hard place to start your new climb back to success.
Google has recommended in many cases to simply ditch your old domain and start afresh. A very poor response to the issue, but it's an honest one. Can you really survive a year of not ranking AFTER you have managed to get through the reconsideration request process?
If you decide that all of this is worth it then I would look at a service like link detox, they have some cutting edge tools that will help, but its not cheap and offers no guarantees.
There are many great guides out there on ways to do it. I have done it by hand with 15,000 links; it's not fun and can be a royal pain in the ### but at least you know what work has been done and how accurate it is.
Also, Matt Cutts once said to attack your links with a machete, meaning don't try to save some of your BEST links to see if you can get away with it. That only delays the process and could be the reason you miss out on a 6-month algo refresh. Are those links worth your business not ranking for another 6 months?
If you need any FREE advice let me know.
All the best with your choices and recovery process.
OK, so as I expected there would be nobody able to answer this question
So I asked John Mueller at Google!
I got a very precise answer.
A div with an onclick (for example, a div whose onclick sends the visitor to the widget page, with "widget" as the visible text) is effectively nofollow. So this would be a bad idea if you want internal PageRank to flow.
Currently the only way to make your clickable areas functional and Google-friendly is the approach sketched below:
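Something along these lines, assuming a widget landing page at /widget/ (the class name and copy are placeholders): the onclick makes the whole box clickable for visitors, while the normal a href inside it gives Googlebot a crawlable link that passes PageRank.

<div class="widget-box" style="cursor:pointer;" onclick="window.location.href='/widget/';">
  <h3><a href="/widget/">Widget</a></h3>
  <p>Short description of the widget goes here.</p>
</div>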
I just got off a Google Hangout with John Mueller and was left a little confused about his response to my question.
If I have an internal link inside a div with an onclick that sends the user to the widget page (with "widget" as the visible text), will it have the same SEO impact as a plain a href link to that page with "widget" as the anchor text?
John said that as you are unable to attribute a nofollow to an onclick event, it would be treated as a naked link: it would not pass PageRank but would still be crawled.
Can anyone confirm that I understood it correctly? If so, should all my links that have such an onclick event also have an HTML a href inside them too, such as a div that carries the onclick and also contains a normal anchor link?
Many times it is more useful for the customer to be able to click on any area of a large div, and not just the link, to get to the intended destination.
Clarification on this subject would be very useful; there is nothing easily found online to confirm this.
Thanks
Google states that it will likely take 6 months to a year (but can take longer) to see a recovery once a revoke is issued for a manual penalty if it is related to a Penguin Algo issue.
You must wait for a refresh of Penguin to run, but if it runs too soon after your revoke and Google has not finished reprocessing your disavowed and removed links, then you will have to wait for the next update. Currently there are about two a year, and the next will likely be in April/May.
Did you have a manual penalty? Did you get it revoked? or did you assume you had a Penguin issue and were proactive about it to avoid a manual penalty?
Don't worry about nofollow links.
The only time to be concerned is if there is a possibility they could be changed to dofollow in the future, or if they exist as dofollow links elsewhere on the site.
There is no problem listing them in your disavow file; sometimes it's safer to do this for a poor-quality domain. Removing them is not necessary. Removal might speed the recovery process up if they were dofollow, but as they are nofollow I would not bother.
We were hit by Panda and Penguin and got an unnatural links penalty way back in 2012. Staff at Google confirmed there was no issue remaining, however I have confirmation that we have STILL not recovered from a Penguin refresh yet, even after all this time. However, as of yesterday my PageRank is back on all my subpages!
Maybe this is the first sign of an actual recovery, even though the PageRank back in the day was a steady 6 with the odd occasion of a 7.
We were hit with an unnatural links penalty on 23rd of July 2012. (full story here)
The effects of the Penguin algorithm led to the unnatural links penalty.
Google claims to ignore all bad links but when you reach a certain point they want to make you aware of it and accountable. That's when you get the manual penalty.
Without a warning, there are tons of websites out there that are about to trigger a manual penalty because the website owners have no clue about this stuff. The disavow file can be used to protect you from the Penguin algorithm triggering a manual penalty.
The fact your site can also be affected by the links with no warning is so counterproductive to good search results. If Google says they ignore them already, then your site should simply lose the benefit of those links, not also receive negative effects as a result. I am going to reconfirm this point with John at the next hangout.
I found myself in an impossible situation where I was getting information from various people who seemed to be know-it-alls, but everything in my heart was telling me they were wrong when it came to the issues my site was having.
I have been on a few Google Webmaster Hangouts and found many answers to questions I thought had caused my Penguin Penalty. After taking much of the advice, I submitted my Reconsideration Request for the 9th time (might have been more) and finally got the "revoke" I was waiting for on the 28th of MAY.
What was frustrating was on May 22nd there was a Penguin refresh. This as far as I knew was what was needed to get your site back up in the organic SERPS.
My disavow had been submitted in February, and only a handful of links were missing between that time and the time we received the revoke. We patiently waited for the next Penguin refresh, assured by John Mueller from Google that we were heading in the right direction (btw, John is a great guy and really tries to help where he can). The next update came on October 4th and our rankings actually got worse! I spoke with John and he was a little surprised but did not go into any detail.
At this point you have to start to wonder WHAT exactly is wrong with the website. Is this where I should rank? Is there a much deeper Panda issue? We were on the verge of removing almost all content from the site, or even changing domains despite the fact that it was our brand name.
I then created a tool that checked the last cached date of every link we had in our disavow file. The thought process was that Google had not re-crawled all the links and so they were not factored into the last refresh. This proved to be incorrect: all the links had been re-cached in August and September, nothing earlier than that, which would have indicated they had not been re-cached in time.
I spoke to many so-called experts who all said the issue was that we had very few good links left, content issues, etc. Blah blah blah; I've heard it all before and have been in this game since the late 90's. The site could not rank this badly unless there was an actual penalty, as spam sites ranked above us for most of our keywords.
So just as we were about to demolish the site, I asked John Mueller one more time if he could take a look at it. This time he actually took the time to investigate, which was very kind of him. He came back to me in a Google Hangout in late December, and what he said was both disturbing and a relief at the same time: the site STILL had a Penguin penalty, despite the disavow file being submitted in February, over 10 months ago, and the revoke in May.
I wrote this to give everyone here who has an authoritative site, or just an old one, hope that not all is lost just yet if you are still waiting to recover in Google. My site is 10 years old and is one of the leaders in its industry. Sites that are only a few years old and have had unnatural link building penalties have recovered much faster in this industry, which I find ridiculous, as most of the time the older authoritative sites are the big trustworthy brands. This explains why Google SERPs have been so poor for the last year. The big sites take much longer to recover from penalties, letting the smaller, less trustworthy sites prevail.
I hope to see my site recover in the next Penguin refresh with the comfort of knowing that my site currently is still being held back by the Google Penguin Penalty refresh situation.
Please feel free to comment below on anything you think is relevant.
How often do you guys use the search operator "link:"? Not very often.
How accurate is the data? It's accurate, but it only shows a small percentage of the links.
Why is the number of links when we use the operator way lower than the number in Google Webmaster Tools or Open Site Explorer? It is lower because in the past people used this search operator to find links pointing to competitors to help with their own link building plans.
Does it only show the most powerful links? No, it shows a random selection most of the time.
You could be just one more Penguin revision away from a penalty.
I would get most of those bad links into a disavow file right away. Then start to contact as many as you can and get the worst ones removed. This will safeguard you from a future penalty.
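If you have not built one before, the disavow file is just a plain text file, one entry per line, with # for comments; the domains below are made-up examples:

# contacted twice, no reply, whole domain is a spam directory
domain:spammy-directory.example
domain:low-quality-article-network.example
# single paid link we could not get removed
http://blog.example/some-page-with-a-paid-link.html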
Once this is done you will have to wait for Google to re-crawl all those links to apply the nofollow attribute via the disavow.
While you wait for that to happen you can start by getting some traction on white hat link building / outreach.
This whole process can be done very very quickly. You must also confirm that the issue of rank drop is related to Penguin or you may see very little changes.
I have a site that was hit so hard by Panda and Penguin that it has been very difficult to recover, and a new domain could be on the cards. It's a large site with a big brand that has been around for 10 years, but almost 3 years of bad rankings have been a nightmare and a very expensive path for the company to have taken up to this point. Who knows where the company would have been if it had been moved to a new domain with 3 years of work applied to it.
I can see it indexed. Looks like the problem is resolved.
I noticed a while ago that Google was hitting sites that pipe keywords, I would change that page for instance to
"1099 E-File Software & Electronic Filing"
You are not repeating words that way.
When I type your URL into Google, it changes the title anyway to "1099 Pro Enterprise - 1099 Pro, Inc." This is a sure sign that Google does not like your titles.
Try it on a few pages. Also your meta descriptions are very keyword stuffed and make sure any keywords used are also in text on that page.
Does this site below belong to you? It is a replica of your page
http://systemfarmer-web.gopagoda.com/uzemeltetes/
When I take a snippet of text and put it in quotes in Google, that site shows up as the authority for that content. If you own that site I would look at canonicalizing it (https://support.google.com/webmasters/answer/139394?hl=en) or at least noindexing it.
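A canonical tag on the duplicate page is a single line in its head, pointing at the page you want treated as the original, in this case the page on your own site (the URL below is just a placeholder):

<link rel="canonical" href="http://www.example.com/original-page/" />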
It's a tough one, because you know they should be nofollow, but they are good links on a good site so you don't want to change them.
Only you can make that call, but one thing I would do for sure is go back to any of them with anchor text and change them to your brand or your domain.
Why are there so many links from just 2 domains?
If they are sitewide, Google is not a fan of that unless they are nofollow. However, why are some nofollow and others follow? Is that not a signal to Google in some way? Why would a site naturally have some nofollow and some dofollow links to your site?
I found 7500 results when I search for site:doggydazzles.co.uk
Searching for your site http://www.doggydazzles.co.uk also returns results.
Searching for "Cheap Rotastak Hamster Products" shows your site in 11th place.
However, there is a site with the same name but with a hyphen. It may take Google a little while to understand you are not the same site; this will happen rather quickly as your site builds some brand awareness. As you can see, your internal pages are already starting to rank for the long tail.
Also your contact us button does not work and you have no call to action at the top of the page.
Hope that helps? Keep doing what you're doing and give Google just a little more time. Also, don't forget to build a Google Local page if you have a trading address; this will speed things up a lot.
OK, so I think my last paragraph might be the best option.
Let me explain.
Is the content also hosted on a master site that you own. Or is it only available on the sites you distribute it to? i.e is there a main source?
If yes, I would re-write the content, taking small snippets of text from the original article and writing a few paragraphs; this way you can reduce the amount of work you need to do. Then I would link to that main source, but using a nofollow link (see the example at the end of this answer).
So essentially you have all your sites with some unique content that references the master piece of content in some way and links to it for further reading if need be.
If you noindex the content then those sites will never have a chance of ranking for those key phrases.
If there is no original source of that content hosted on a master site somewhere then re-write the content or noindex it. All those sites hosting the exact same content is not good for any of them.
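The nofollow link back to the master article mentioned above is just a normal anchor with the rel attribute added (example.com is a placeholder for the master site):

<a href="http://www.example.com/master-article/" rel="nofollow">Read the full article</a>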
OK, Forgive me if I am wrong.
But I think the question you are asking is less of a technical one.
From a quick search it would seem you have no Google Local account and that you may be unfamiliar with Google Local? It's like Google's version of the Yellow Pages directory.
All you have to do is create an account at the link below
https://support.google.com/plus/answer/1713911?hl=en
This will then associate your website with your local listing and Google will be able to apply your address and other features to your search results.
Is that what you were asking about?
Just a quick answer to Q3
"Do I need to submit a reconsideration request to Google after I've cleaned up this mess or will I need to wait till the next scrawl?"
You are now only able to submit a reconsideration request if you have a manual penalty. The option is no longer there in Webmaster Tools if you have no penalty.
Thanks Anthony, I have heard this recently. My problem is I want to find a solution and repeat it 20,000 times LOL.
Does anyone know a way to get Google to re-crawl a webpage that does not belong to me?
There are a bunch of pages that I have had links removed from, and I want Google to re-crawl those pages to see that the links have been removed. (The current wait time is way, way too long.)
Can anyone suggest some ways to get the pages re-crawled? (I am unable to get the website owners to use WMT to do anything.)
Suggestions like good ping services and various other techniques would be very much appreciated.
Thanks
Can you check something for me please?
Log in to wp-admin
Go to Setting then Privacy
Check if “Ask search engines not to index this site.” is set to ON
Hi Chris, thanks for the response.
However you say "I would assume your link would have to be re-crawled" and that is the only bit I needed a definitive answer for LOL.
I have looked everywhere for an answer but so far not found one....
Can anyone help me understand a specific process of a 301 redirecting a domain.
Here is what I would like to know....
When you 301 redirect a site, most if not all the links follow to your new site. But how does this process happen?
1. When Google sees the new domain does it simply apply the backlink profile of the old site to the new one?
2. Does it have to re-crawl all the links one by one and apply them to the new domain?
3. Or something else?
Very common.
There are a bunch of simple solutions.
First, you can look at putting some PHP at the top of the page that checks whether "cat" has a value; if it does, output a meta noindex in the head section, something like the sketch below.
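A minimal sketch of that check, assuming the category is passed as a ?cat= query parameter (adjust the parameter name and the robots directive to your setup):

<?php if (!empty($_GET['cat'])) { ?>
  <meta name="robots" content="noindex, follow">
<?php } ?>

This goes inside the head of the template, so URLs reached with a cat value carry the noindex while the normal pages are untouched.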
Another option is for you to go into webmaster tools and change the "URL Parameters" in the "Crawl" menu. Select the option that best fits.
I have had to do this for many sites and seen great results once sorted out.
You can also use the "Remove URLs" tool under the "Google Index" menu to remove large sections quickly if they all fall under a specific pattern or path.
So you have multiple clients all in the same field that each have the same content on their sites?
What links are on those articles that require a nofollow? or do you mean that you should noindex the content?
Re-writing the content is a good idea and will remove any Panda de-valuations you might have as a result of duplicate/shared content. It will be hard to compete with that content if it is distributed to so many places.
It might be a good idea to take references from each article and point nofollow links to those articles, and surround those references with some unique content.