Hi,
You can check the list of recommended companies Moz created here: http://moz.com/article/recommended
Hope that helps!
Sure Christy, I'll do it and include some screenshots if I can. Thanks for your continued support and caring!
206 means Partial Content, which is what your website/server is delivering in response to Facebook's request. Have you tested "Fetch as Googlebot" under Webmaster Tools to see whether Google can get the files? https://www.google.com/webmasters/tools/googlebot-fetch
If you get an error there, then it must be something IP-related on your server. My test returned a 200, and a test using Googlebot as the user agent also returned a 200, which means the IP wasn't blocked (nor the user agent excluded). Basically, if neither Googlebot nor Facebook can access your site, it must be something IP-related.
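If you want to reproduce that kind of test yourself, here's a minimal sketch in Python (standard library only; the URL and user-agent strings are just examples, swap in your own):

```python
import urllib.request
import urllib.error

def status_for(url, user_agent):
    """Return the HTTP status code the server sends for a given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses raise, but the exception still carries the code
        return e.code

# Compare a browser UA against Googlebot's published UA string, e.g.:
# status_for("http://example.com/", "Mozilla/5.0")
# status_for("http://example.com/", "Googlebot/2.1 (+http://www.google.com/bot.html)")
```

If the two calls return different codes (say 200 for a browser and 206 or an error for a bot), the server is treating crawlers differently.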
Hope that helps!
Hi Rob,
I personally wouldn't go the way you are heading... that could be seen by Google as a technique to manipulate search engine results (which, as you stated, it is).
But to answer your question: why don't you use the "definitive" version of the page as the canonical? If the one including "near downtown" is the most accurate and complete one (as I guess the hotel IS near downtown), then you should go with that and noindex the alternatives. Although I know that's not your intention, that is the way it should be done.
Hope that helps!
Hey Christy,
I still experience fairly regular issues with the login too.
It has become something I'm just "used to": re-logging in when going from the Q&A to a profile, for instance.
I suggest you download your "latest links" from Google Webmaster Tools to verify whether the report is an OSE issue. If Google also shows all those new links, you'll want to take a much closer look, and if those are spammy backlinks, start disavowing before you get a penalty.
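If those do turn out to be spammy backlinks, the disavow file Google expects is just plain text: one `domain:` line per domain to disavow entirely, or a bare URL per line for individual links. A rough sketch of generating one from your export (the domain names are made up for illustration):

```python
def build_disavow(domains, urls=()):
    """Build the text of a Google disavow file: a 'domain:' line per
    domain to disavow entirely, plus optional individual URLs."""
    lines = ["# Generated from the GWT 'latest links' export"]
    lines += ["domain:" + d for d in sorted(set(domains))]
    lines += list(urls)
    return "\n".join(lines) + "\n"

# Made-up example domains:
print(build_disavow(["link-farm.example", "spammy-directory.example"]))
```

You then upload the resulting .txt through the disavow links tool in Webmaster Tools.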
They almost never tell the "why" and "how", because spammers would eventually find a way to overcome it. Matt Cutts at Pubcon simply said something like "the roll out of Google authorship opened the door to anyone that can implement a Google author tag. In the coming months Google will finish up a plan to look at social factors around google authorship and award higher priority to author’s that are truly authoritative on a topic and that have published social conversations on that topic."
Google announced about a month ago that they were going to remove some rich snippets from SERPs and they were going to start showing them only for the most trusted authors. This might be the result of that exact update.
To get the picture back you only need to keep building and creating content while you continue establishing your online reputation.
More info: http://www.seroundtable.com/google-rich-snippet-reduction-17579.html
I think Moz already tells you where the link pointing to that 404 page comes from.
Another way is to use GWT: they list the 4XX errors, and if the link comes from your site, click on the error and check "Linked From" and "In Sitemap" to see if your site is linking to that particular 404.
And as a last resort, you can use one of the free online tools for finding broken links, for example: http://www.brokenlinkcheck.com
Hope that helps!
I have personally done this in the past; it didn't affect my rankings at all, and it could even be seen as a benefit for your users.
I guess you are running an Apache web server; make sure to set 301 redirects from the .php versions to the [no extension] URLs (via the .htaccess).
Even if the user types index.php, you should create a rewrite rule that removes the .php and sets a 301 in the process.
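As a sketch, the .htaccess rules could look something like this (assuming mod_rewrite is enabled; test on a staging copy first):

```apache
RewriteEngine On

# 301 any request for something.php to the extensionless URL
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^\ ]+)\.php[\?\ ]
RewriteRule ^ /%1 [R=301,L]

# Internally map the extensionless URL back to the real .php file
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]
```

Note that index.php deserves its own rule redirecting straight to / (this sketch would send it to /index instead).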
Hope that helps!
What do you mean by "search engine simulators"?
I tested crawling your site with googlebot as the user agent and it worked just fine.
Google and other engines are capable of running javascript and ajax just fine, so that shouldn't be an issue.
What I would suggest is looking at your pagespeed. Your homepage loads a TON of external files, about 50 requests for JS and CSS files. You should really consider combining all that code into a single JS file and a single CSS file instead; making over 50 calls (plus the extra ajax calls) is WAY too many! Not to mention the hundreds of lines of inline JS and styles you have...
Yeah, they offer free and paid hosted versions too. But I found the server side version much simpler to setup and control.
I have been using xml-sitemaps (paid version) for all my sites for over 5 years and it works like a charm, scraping and indexing what needs to be indexed and scraped, plus it consumes really low resources. 100% recommended (they have nice plugins too for extra sitemaps: video, news, images, etc.).
Hope that helps!
There's nothing "against" doing it, however, what's the added value? If it's a company profile then the author is the company and not actually a person. Authorship markup was introduced for people, not companies.
No, Google is smart enough to determine which content shouldn't be duplicated and what content is most likely to be the same in all sites (exactly your situation).
As a side note, if you are adding the content in HTML, plus providing an extra value, you will probably rank higher than those that just display the product specs as a downloadable PDF.
The problem is that plugin you are using, "Wordfence". It is probably picking up the crawls from Screaming Frog and Moz as DoS attacks because of the amount of requests from the same IPs.
You could either see if the plugin supports IP whitelisting, or just remove it and use CloudFlare, which is free and offers an even more robust security option plus an included CDN.
The link is a 302 redirect, which means it probably doesn't pass any value to your client's site.
They are probably using that to track clicks on the outbound links.
Implementing the redirect at the server level will be much better as you can redirect all pages to their correspondent page in the .com version.
You will need the URL-rewrite extension: http://www.iis.net/downloads/microsoft/url-rewrite
Then create a "Canonical Hostnames" rule to redirect the other domains.
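As a sketch, a "Canonical Hostnames" rule inside `<system.webServer>` in web.config could look like this (example.com stands in for your .com domain):

```xml
<rewrite>
  <rules>
    <rule name="Canonical Hostnames" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <!-- Match any hostname that is NOT the canonical one -->
        <add input="{HTTP_HOST}" pattern="^www\.example\.com$" negate="true" />
      </conditions>
      <!-- Permanent = 301, which passes the value along -->
      <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

The `{R:1}` back-reference keeps the original path, so every page lands on its corresponding page in the .com version.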
Hope that helps!
Hey Omar,
Direct traffic isn't Google's traffic. Direct means when the user has no referrer, most likely accessing your site by typing the URL.
The "not-provided" accesses can be found under Traffic Sources -> Google Organic
Hey Nick,
You can review the recommended SEO companies list Moz created here: http://moz.com/article/recommended
Hope that helps!
Probably API rate limiting? I don't know that tool, or whether it uses Moz's API or its own, but Moz's API has a rate limit, and most likely the other one does too, which could be causing that "0" response code (as that response code doesn't exist outside of an ajax call returning it when the request fails).
I faced the same issue. If the YouTube channel was created prior to the G+ profile, you can't link them together. We ended up creating a new YouTube channel linked to the G+ profile and uploading the videos there. Yes, you lose your views, comments, etc., but in the long term it should be worth it.
You can consider removing the videos from the old channel, we didn't, but we are thinking about going that way.
Everything that Alex said PLUS file a reconsideration request. Wait a couple of weeks for Google's response. Your request MUST explain all the steps you took to clean up your site and comply with Google's TOS (with proof and all)!
Yup, just wait. However, I would consider switching to a better server; a 5-day downtime is a long downtime! Look for a more reliable solution.
Hi Joey,
I would definitely keep that as the homepage. It looks much more appealing and interesting than the guides page!
You should think of other ways to keep visitors on your site; perhaps some small design changes could help. I would start with the header: it is FAR too large, pushing the main content way down the page. Although it is a responsive design, the header remains too large at every size.
I have a 1920x1080 screen and can barely see the main content, and look what happens on a smaller screen: you can't see anything but the header.
Are you buying links? Or are you using an SEO company for link building? In the Ahrefs report I saw this domain: http://www.goldiefish.ie/ which has a sitewide footer link to your site that says "check out mutantspace arts skills exchange providing you with free creative and production skills for your arts events and projects", and that looked suspicious to me.
If you are not doing it then someone is. You should consider doing a backlink audit and possibly creating a disavow file.
Have you checked WMT for any manual action on your site?
As Michael said, a 301 will carry the penalty to the new domain, if it didn't everyone would do it without actually fixing the issues.
Instead, take another approach and try to fix the issue. If it is an algorithmic penalty, once you fix everything you think could be causing the "penalty" (since you can't be 100% sure), noindex the entire store or the affected pages using a meta tag, or block them via robots.txt.
Are those pages within the same domain (domain.com/penalized-pages) or on a subdomain (penalized-pages.domain.com)?
Hmmm.. seems to be a very common issue.
How about creating a script that, by default, loads a static map image in the div instead of the iframe? Then a simple function can switch that image to the map iframe on interaction. That should take care of the "sorry we have no imagery here" problem.
If it doesn't, you could try some kind of internal caching: fetch the static image, save it on your server, serve that as the "1st" image, and then load the iframe.
Hope that makes sense
Hmmm, I already see the guides page as the default page. I personally don't like the way it works... when I access a website, I like to see the domain's homepage.
To answer your question: if you did a 301 redirect to the guides page, then link juice flows to guides; however, a small percentage is diluted in the process.
I'd advise (if you wish to have the guides as the homepage) removing the "/guides/" from the URL, basically making the guides page your homepage.
Hope that helps.
Google's Matt Cutts posted a video about this exact issue you might want to check: http://www.youtube.com/watch?v=4eYJuT0yGrI
Hope that helps!
Hey Tina,
I'm afraid your colleagues saying that the best way to go is to have it under a subfolder are right. Basically, all search engines treat subdomains as separate domains, meaning that putting the learning center on learning.website.com will have the same effect as putting it on somenewdomain.com: neither will help website.com unless you link from one to the other, and that could be seen as doorway pages, which will ultimately hurt more than they could help.
Using a subfolder, any advantage you get from website.com/learn/ will influence website.com without putting website.com at risk (unless you post completely unrelated stuff in that new folder). But as you are writing about the same niche, it is basically like having a blog. Take this Moz section for example: the community section is a subfolder under their "money site", and I assure you it works far better than it would as a subdomain.
I'm with your colleagues on this one
Hope that helps!
Is the site still getting customers accessing it directly? If you've established a brand name that people are used to, I would consider cleaning up the domain instead.
I've done it myself: 95% of the links were toxic according to Link Detox. We tried disavowing just the worst ones, and the reconsideration was declined. We then disavowed over 1,500 domains and left around 50 healthy links; the reconsideration was approved and the penalty revoked. The very next day we started seeing a 100% increase in search traffic, and although we are not ranking as high as we expected, Google's Matt Cutts said a few days ago that even after a penalty is removed, it may take some time until Google trusts the site again.
All the work was done KNOWING that we couldn't just "dump" the domain as we had over 100,000 customers accessing our site directly. And if you change to a new domain and redirect the penalized one, chances are the new domain gets the hit too.
So as a first step, I would ask myself whether the cleanup is worth it, depending on how much direct traffic you have, your brand, etc.
Hope that helps!
Are you behind some kind of reverse proxy that could be blocking Rogerbot's IP address? I tested using several user agents, even googlebot, and all returned a 200.
Either you are blocking Rogerbot's (Moz's bot) IP addresses, or it's an issue on Moz's end; in that case, you can contact them here: help@moz.com
Hope that helps!
I think you are completely correct. Making a responsive design does not mean "hiding the content that doesn't fit" but rather "displaying it differently", so any user on any device is able to see the entire content without having to zoom in/out.
The example you posted about Wikipedia is the exact live example.
You could, however, remove areas of the page that have no actual value to a user browsing from a mobile device; that is acceptable, as even if you showed them, the user wouldn't be able to use them (ex: flash content). You can see this on sites with floating social media buttons: on the mobile layout, they usually accommodate those buttons elsewhere or hide them completely.
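For instance, hiding a floating share bar only below a given width is a single media query (the class name and breakpoint here are hypothetical):

```css
/* Hide the floating social buttons on small screens only */
@media (max-width: 480px) {
  .floating-share-bar {
    display: none;
  }
}
```

The content stays in the HTML for every device; only its presentation changes, which is the spirit of responsive design.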
There shouldn't be any SEO downside, as all that the "micro" hosting accounts will share is the same IP, and that is no longer considered when ranking a website.
You can also have more than one IP address on the server and assign a different IP to each site; that helps prevent spam generated by one of your customers from affecting all the others, which can happen when they share the same IP.
Well, if you think you can do it without a penalty, then go ahead; I am just giving you my opinion. As Google's Matt Cutts said, Google will take some time to trust your site again once a penalty is revoked, so they are probably watching your site closely. Sooner or later they will see those links and do their math, and you won't like the outcome.
Just my 2 cents...
You shouldn't be doing that in the first place. That's a big NO-NO. You may think Google won't catch you as you are only following some links, but given the way they control information and the fact that they have been fighting spam for years... you are just killing your site softly...
Doesn't the building provide you with a suite number? (ask there)
They usually do, for correspondence delivery. Try to see if you can get one; if you can't, then use what you have, without a suite #, as there's none.
Several businesses run in the same "office" with just separate "areas" and use the same address; that shouldn't be a problem. In fact, the lack of a suite number isn't your problem, but the building's.
Hope that helps!
Hey Chad,
He will definitely lose some value; however, if there's still some juice flowing from the old domain, regaining it will recover the loss he was taking by redirecting to the new domain.
I agree with your idea: it will be much better if he uses the old domain instead and covers all the services he provides.
Remember to follow all the necessary steps when changing the domain: 301 the entire new domain to the old one, change it via GWT, etc. The loss should be minimal; it will be noticeable during the first 1 - 3 months, but in the long run it should be a win.
Hope that helps!
Basically yes. Any form of payment received in return for a backlink is considered a paid link and will be penalized.
If you are "over-optimizing" the page just to "get the lead", then I'd suggest using a noindex tag to keep that page from being indexed, as most likely, if those are landing pages specifically designed for users coming from online advertising, they don't offer any added value to your main site.
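The noindex itself is a single tag in each landing page's `<head>`:

```html
<meta name="robots" content="noindex">
```

That keeps the page out of the index while still letting it serve its advertising traffic normally.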
Well, you are right, manual penalties are easier to fix, although sites with manual penalties usually fall under an algorithmic penalty too.
Steps I'd suggest:
1. Run a full backlink audit (OSE, the "latest links" export from GWT, etc.).
2. Try to get the worst links removed, and disavow whatever you can't remove.
3. File a reconsideration request documenting everything you did, with proof.
Hope that helps!
I'm afraid you are wrong, Yiannis. My own website had a manual penalty for over 1 year, and as it was a "partial match" it only affected "some" of its ability to rank, so instead of being in the 1st spot we were on page 2, positions 11 - 15, for the whole year. 2 weeks ago the penalty was revoked, and the very next day we found ourselves back on the 1st page. It will take some time to reach a good rank following Google's guidelines from now on, but it is the only way.
Hey Scott,
Have you checked GWT for any manual action notice?
About the keywords meta tag: it is completely useless. Google doesn't use it at all, so you can remove it safely; the ranking drop has nothing to do with it.
Did you run some backlink reports on your competitors? (to see if they have higher authority than yours, more backlinks, etc.).
Hey Jeff, if there's a plugin specifically designed to redirect affiliate links, I bet it already does a 302 redirect while suggesting you add the disallow to robots.txt (or even adding it itself).
Both will actually have the same effect, as long as you either nofollow your affiliate links directly, or the PHP script that does the redirection is disallowed via robots.txt and performs a 302 redirect.
The downside of using nofollow is that you can't track the hits on those links on your end, whereas with PHP you can save them to a DB or a file.
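Assuming the redirect script lives under a path like /go/ (a hypothetical path; use whatever directory the plugin actually creates), the robots.txt entry is just:

```
User-agent: *
Disallow: /go/
```

That keeps crawlers away from the redirector entirely, so no juice flows through the affiliate links.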
Hope that helps!