You probably want to contact the Help desk with this issue. The help desk provides prompt, direct replies to issues.
I do see help desk employees popping in to the Q&A forums on occasion, but not with any regularity.
My response is....
I give ZERO weight to the domain being 12 years old. NONE. Whether a domain is 12 weeks old or 12 years old has absolutely no direct consideration in rankings.
Indirectly, the longer a domain has existed the more time it has had to earn links, and that is an advantage, but the domain age in and of itself is not a factor. If you bought a domain 12 years ago and let it sit parked all that time and began to use it now, you would not have any advantage over a new domain in SERPs.
I appreciate your perspective, Rhys, but I disagree with it. For me, the rankings make perfect sense, and apparently they do for Google as well. In my experience you are overlooking the major factors which are readily apparent while looking for explanations in less obvious and less important factors. You are welcome to disagree, in which case I am not able to help. Perhaps someone else will share some of the feedback you prefer.
If you can share the URL to a few example pages, we can probably offer a better idea for you. Generally speaking, if you are trying to hide content from Google you are probably not taking the best approach.
I understand you wish to demonstrate the value you offer, but running the entire site in SSL is not the way to do it. It is a negative experience for both the user and you to run the site in SSL.
1. It will eat your bandwidth. If you are hosted on VPS your hosting expenses will increase. If you are on a shared server with no extra bandwidth charges, then all of your pages will load slower because of the increased data transfer.
2. Images from a secured server cannot be cached. Images are the biggest consumer of bandwidth and the biggest factor in page loading speed, so your increased bandwidth usage will be amplified, and users will take another noticeable performance hit when accessing your site.
3. A third performance hit is that on each end the data has to be encrypted and decrypted. Have you tried running any performance testing?
4. Many users will never notice the green bar, and have absolutely no idea what it represents.
Apache covers issues related to SSL response times in their FAQ: http://httpd.apache.org/docs/2.0/ssl/ssl_faq.html#load
**If that is the case and the same product is listed in more than one category and sub-category, will that product have two unique URLs and, as a result, be treated as two different product pages by Google? And since it is the same product in two places on the site, won't Google treat those two pages as having duplicate content?**
Correct. You should decide which category is the most popular and then use the canonical tag so all other versions of the page point to the main page.
**So is it best to not have the category and sub-category names in the URL of a product page?**
Using the canonical tag is one option. Using the same product page for both categories is another option.
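For reference, the canonical tag is a single line placed in the head of each duplicate version of the page, pointing at the version you chose as the main one. This is a minimal sketch; the URL is a hypothetical example:

```html
<!-- Placed in the <head> of the duplicate category URL; the href below
     is a made-up example, not a real product page -->
<link rel="canonical" href="http://www.example.com/widgets/blue-widget" />
```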
**And lastly, is there a preferred character limit to keep a URL under?**
Technically speaking URLs can be over 2000 characters. Practically speaking, the shorter the better for user readability and other factors. Dr Pete covers this topic well: http://www.seomoz.org/blog/should-i-change-my-urls-for-seo
If you have authentic badges such as Verisign, McAfee, TRUSTe, etc., those badges cost money, must be earned by meeting requirements, and provide real value to both the site and its users. You should ensure your users are clearly aware of those badges.
In most cases, I recommend displaying the trust badges prominently on the home page. They should appear above the fold so they are seen immediately upon page loading. Keep in mind many visitors will see your site and decide within 5 seconds whether to explore it further or bounce. Offering recognizable symbols which display trust goes a long way when visiting an otherwise unknown website.
There is no solid best practice the SEO community has agreed upon. If you are a well-known, trusted business then you do not necessarily need any trust badges, and if you do have them they can be displayed anywhere on your site. That is not to say the badges don't offer value, but if you are Sears, AT&T or other well-established companies, most of your visitors solidly trust your company before they ever visit your site. If you are an otherwise unknown company, then the mere presence of trust badges can be the difference between a visitor who explores your site or a fast bounce. Trust badges can also be the difference between a visitor and a sale.
With respect to nofollowing them, I see no benefit in doing such. All visible links on your page consume PR whether they are followed or not. I am confident Google understands trust badges and handles their PR flow quite well.
If you are in doubt about trust badges or any other aspect of your site design, I highly recommend A/B testing. Present your site to 50% of your visitors with the trust badges in one location, and to the other 50% with the badges in another location. Test for a period and then compare your analytics between the two pages. Depending on your site's traffic, you may have enough data after a day to make a determination, or it may take more than a month to get a good, solid data set.
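To decide whether your A/B results are solid yet, a two-proportion z-test is one simple check. This is a minimal sketch, not tied to any particular analytics tool; the function name and sample numbers are my own invention:

```python
import math

def ab_z_test(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test comparing conversion rates of two page variants.
    Roughly, |z| > 1.96 suggests significance at the 95% level."""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Example: badge placement A converted 120 of 1,000 visitors, placement B 150 of 1,000
z = ab_z_test(120, 1000, 150, 1000)
```

If |z| stays under roughly 1.96, the difference could easily be noise, which is a sign you should keep the test running longer before choosing a winner.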
Also be sure the trust badges are properly installed. If you click on a trust badge, an authentic verification screen should appear confirming your participation in the trust badge program. If the badge is not properly installed, visitors will just see an image, and it becomes highly suspect that the website is misrepresenting its participation in the program.
A last point: there is a big difference between trust badges. Earning a trust badge from a recognized, leading company offers value; earning a trust badge from an unknown company has almost none. A security seal from VeriSign or McAfee is highly valued. A security seal from Comodo or other lesser-known providers is not of much value to users, as most have never heard of the company. It's the same idea as earning recognition from Consumer Reports with a "Best in Class" product award, versus earning the same award from the Consumers Protection Agency (I just made that name up).
I asked the help desk this exact question. In short, the answer was No.
The official explanation was the tool is designed to err on the side of offering too much information, rather than too little.
My reply....that's not helpful. A report is useful when it provides data that is either informative or actionable. The very first time I looked at a site crawl report and noticed the "title too long" errors, I investigated each one. I resolved many issues, but others I simply acknowledged, and no action is going to be taken on them.
The site involved has a forum section. The forum software automatically appends the site's title to all thread titles, which is not a bad thing at all. Also, thread titles are controlled by users, and sometimes they choose lengthy titles. I could adjust the scripts to truncate titles at 70 characters, but why? To satisfy a specific tool which is designed to assist me? That's an approach I decided against long ago.
I wholeheartedly agree. We should be able to say "hey Roger, thanks for letting me know. The first time was informative, but now you are being a pain in the ass." I still wish to be informed about NEW title-too-long errors, but I want to be able to disable the warning for "errors" which have already been acknowledged.
Until the SEOmoz team upgrades the tool, or someone creates a better crawl tool, we are stuck with this issue. I would be interested to hear if anyone has found a better tool elsewhere.
I apologize if I misunderstood, Marisa. Normally when others mention building links to resolve the issue, they are talking about running out and performing various "link building" practices quickly, which means low-quality links. If you are referring to earning links over time, that is great, but it also means the site penalty will exist for a very long time. Most site owners treat resolving a penalty as an emergency. A penalty has put some companies directly out of business, and severely damaged others.
With respect to the private WHOIS information, you can absolutely send an e-mail to that address. It will be forwarded to the domain owner's registered e-mail.
If I were in your situation I would explain to the client that they made an error in judgment by hiring a bad SEO provider to build links on their behalf. Those links damaged the site, and the penalty is the result. Their choices are pretty straightforward:
pay to have the penalty resolved...very expensive
try to resolve the penalty themselves....in my experience most people fail or get frustrated and quit the process.
abandon the domain and start over
abandon the pages involved which usually means losing the links for their most important keywords (i.e. the ones they paid to obtain manipulative links for)
In each case the affected site owner will pay. They either pay directly in terms of SEO penalty removal costs, directly in terms of labor for them to do it themselves, or indirectly in terms of lost ranking.
I agree with Simon. Prior to Panda, pop-ups had no effect on SEO. You can hear Matt Cutts share this directly: http://www.youtube.com/watch?v=h_0WI75X4U4
I would add that many users perceive pop-ups as unfriendly, and in our post-Panda world user experience may be a ranking factor. I would suggest taking a close look at how users perceive the popup. Find a way to sit people down in front of a PC, have them visit your site, and watch their reactions to the popups. If two or three of them react negatively, take that as a strong indicator the popup is a negative user experience and could be a negative Panda factor.
The issue is your closing title tags contain an extra slash character and are therefore not valid HTML. You are confusing poor Roger, SEOmoz's crawler. Remove the extra slash character and you will be fine on your next crawl.
<title>TJ Electrics | Electrician | Isle of Anglesey</title />
Either choice is just fine. If you have "allergies" as a category with many sub-categories, I would recommend the folder approach but either way will work.
Matt Cutts from Google shares insight on this topic: http://www.youtube.com/watch?v=971qGsTPs8M
It seems the Q&A section has had some issues since the 25th. Users could post new Q&As but they were not visible to most users. Roger was caught slacking! The issue appears to be resolved at this time.
I just wanted to let anyone know who asked a question in the past few days and did not receive a response: you may wish to repost your question, as many readers will not go back and check questions from prior days.
I agree with most of your plan.
I am not clear if by "advanced report" you are referring to an Open Site Explorer advanced report. If so, I would not recommend that approach. Instead, use a crawler to update your sitemap, then use the sitemap as the most complete list of URLs.
Also, I differ with Rebekah on the point of only redirecting the URLs with the most traffic. When possible I would recommend redirecting every URL to the appropriate page on your new site. Many people might bookmark a page, send an e-mail with a link, etc. You never know who has saved a URL to a page on your site. Also, you did not mention your market. Sometimes a single client is worth thousands of dollars. I would hate to risk losing a potential sale by saving the relatively small amount of time it takes to perform a redirect.
However you choose to proceed there are two additional suggestions. First, ensure your 404 page is friendly and helpful. It should offer your site's navigation, a search box, etc. Second, review your 404 errors DAILY after the site move until your error count drops down to very low numbers.
Good luck.
I think your first response nailed it, Yannick, but I disagree with the second one. The freshness algorithm adjustment for specific keywords should not have any effect on a page's PR.
OSE is dependent upon the Linkscape index. The Linkscape database is refreshed about once per month and contains the top 25% of web pages. For SEO, that's mostly what matters. The index was last updated July 25th.
It can take 60+ days for you to have visibility to a link in OSE. It depends upon when the link was created and when Linkscape began crawling the internet.
It's a great tool to use if you understand its limitations.
My preference would always be to use the htaccess file for redirects. There are some situations where the site owner cannot modify their htaccess file due to various restrictions, in which case you would need to use a CMS-based solution or extension.
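As a minimal sketch of the .htaccess approach (assuming an Apache server with mod_alias and mod_rewrite enabled; the paths and domain below are hypothetical examples):

```apache
# Permanently redirect a single moved page (mod_alias)
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Or, with mod_rewrite, redirect an entire old directory to a new one
RewriteEngine On
RewriteRule ^old-directory/(.*)$ /new-directory/$1 [R=301,L]
```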
Hi Falcopa,
The concept of "optimization" is fluid. You cannot optimize for 5 terms. Any work you do to optimize for terms #2/3/4/5 detracts from the optimization of term #1. I'll share one example as it applies to the title. The most perfectly optimized title for ranking a page discussing diet cheese would be..."Diet Cheese". You can add additional terms to the title to help optimize your CTR, such as "Diet Cheese Facts" or "Losing Weight with Diet Cheese".
If you change your title from "Diet Cheese" to "Diet Cheese and Swiss Cheese", then you have lost 50% of the ranking weight from "Diet Cheese" and shared it with "Swiss Cheese". Additionally, the terms at the beginning of the title carry more weight, so it might be more like a 60/40 split. You cannot optimize for Swiss Cheese without weakening Diet Cheese. Does this make sense?
With respect to the on-page report, any page can be graded for a given keyword. The report should only be run on the one or two keywords which are the focus of the page. As described above, you cannot be an "A" on 5 terms.
The clear direction for you is to create new pages.
Best of luck.
**How fast can Domain Authority be established?**
The answer to your question has a bit of complexity to it depending on the result you are seeking.
If you are referring to the DA you can actually see in your toolbar, that is based on the Linkscape crawl of the web which is updated approximately once per month. Depending on various timing factors you could obtain a new link today and it can take 2 months before you see the DA increase in the toolbar.
With respect to the actual benefits of DA (i.e. your site improving its ranking results), those are obtained much faster. Google will crawl some sites multiple times each day and others only once per month, depending on the site's DA, links and other factors. If you obtained a link from the New York Times today, then later today your site would receive the benefits of that link.
With respect to growing DA, a logarithmic scale is used. The higher your DA, the harder it is to improve the numbers.
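To illustrate what a logarithmic scale implies, here is a toy sketch in Python. Moz has not published its formula; the base-10 curve and all numbers below are invented purely to show the shape, not to model real DA:

```python
import math

def toy_da(link_value):
    """Toy model ONLY -- not Moz's actual Domain Authority formula.
    Maps a raw 'link value' onto a 0-100 logarithmic scale: every 20-point
    jump in score requires roughly 10x the underlying link value."""
    return min(100, round(20 * math.log10(link_value + 1)))
```

On this made-up curve, moving from a score of 20 to 40 takes about ten times the link value, which is the sense in which higher scores get harder to improve.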
Another point to consider is quality vs quantity. You can have 10k links, but if they are all footer links from a single domain with a relatively low DA, then you will not receive much benefit. A single link from Time magazine, Harvard University or another authoritative site will provide a significant boost.
I know you are reaching for a specific answer, but it doesn't exist. It's kind of a "how many licks does it take to get to the center of a Tootsie Pop" type of question.
No issues as long as it is done properly.
Let's say the product is an "energy booster". You make a web page and market the product as such. Now this same product is also marketed as a "weight loss" pill. Both products have the identical ingredients, manufacturer and health warnings but they offer different labels, descriptions and listed benefits. These would be seen as two unique web pages if they contained unique supporting content.
**If SEOmoz uses a different method to assess an external link, then no problem.**
SEOmoz does use a different method. The links you see are based on the Linkscape crawl of the web which is updated approximately once per month. The Linkscape database is based on the top 25% of internet pages. If your links are from pages which have low PA/DA, then they will not appear in Linkscape and therefore, they will not show in the SEOmoz tools.
The Linkscape index is normally pretty good but you should understand it can take two months for a link to appear depending on when a page is crawled versus when a link is added. Also know there were some issues with the last crawl which should (hopefully) be resolved with the next results.
If your SEO company can provide you a list (Excel?) of the 331 links, you can compare them to the links from Yahoo and determine the discrepancy.
**On a purely apples-to-apples comparison of sites, static HTML vs CMS, does HTML still have an advantage?**
Yes and No. It depends on what kind of advantages you refer to. Do you mean cleaner code? Faster updates? Costs to manage? Time? Ranking?
A CMS outputs HTML code. Whether you use Drupal, Joomla, WordPress, .NET or one of the dozens of other CMSs, HTML code is being output, and that output will always conform to specific standards based on the CMS.
The good - the code can always be adjusted.
The bad - you may have to alter core files which would need to be reviewed after every CMS update.
An example: when Google introduces a new feature (authorship, canonical urls, etc) if you have a static html site you simply add the new code and you are done. With a CMS the changes are more complex. With that said, you can usually wait a short bit and someone will create an extension, or update an existing extension, which will offer that functionality.
If we move from theoretical to practical, most CMS-based websites are not professionally developed or maintained. Accordingly, there are tons of coding issues which cause a variety of problems.
Are you asking about a dynamic (i.e. database driven) site vs a static site? If so, a database driven site would have advantages of being able to offer a local search widget whereas a static site would not be able to offer that feature. That is just one example.
**How do you explain the ranking of images on competitive terms?**
The only ways I know of that images are associated with keywords are through the use of ALT text and links. Search engines certainly have the tools to associate images with the words surrounding the image, or with the title of the page which contains the image. I have not personally performed any testing in that regard and am unaware of such testing.
**You mention declaring the image size. What do you mean exactly by that?**
OSE is based on the Linkscape index of the web. It is supposed to update next Tuesday but it seems it may have updated yesterday.
The Linkscape index had a few growing pains during the last update. SEOmoz is trying to expand the tool and in the process some mistakes were made. Hopefully they were fixed but feedback will be needed from everyone using the tool. If you feel there are mistakes I would definitely recommend contacting the help desk. Offer specific examples along with details as to why you feel the index is in error.
I can only say I noticed yesterday my sites were updated and they seem correct. I also noticed one site that I monitor which I reported issues with last time has been completely updated and appears valid now.
EDIT: The reply offered above was valid an hour ago, but not presently. It seems as if the Linkscape index was updated, and now that update has been pulled and we are once again looking at the old data.
It could also be that Carin is bored on a Saturday and having fun with us.
It would be helpful to understand the exact circumstances involved. For example, if your SEO advised you to nofollow links to your site's login page, I can understand that approach. If you were advised to nofollow a link to a page which is part of your navigation, I would ask for an explanation.
This is a web development question. I would suggest speaking to your developer.
For example:
www.ezstreetsports.com/casino_winners_corner.aspx
www.ezstreetsports.com/casino_winners_corner_game.aspx
Those are two different pages on your website. They may contain mostly the same content, but they are located at two different URLs and show different breadcrumb paths. They also share the same title.
For any given topic on your site, you should ideally only offer one page. Present these questions to whoever developed your site and ask why there are multiple pages for the same topic.
If you were ranked #5 and are not presently ranked in the top 50 there are numerous possible causes. A few of the more popular ones are:
the page previously ranked has been removed from your site
the page previously ranked has been altered in such a manner that it had a negative impact on SEO
the site has been altered in a negative manner. For example, another page might be cannibalizing the keyword
the site has suffered a manual penalty from Google or whichever search engine is involved
the site has suffered an algorithmic penalty such as Panda
the site has lost backlinks
EDIT: I looked at the site. A major issue which should be fixed right away is the non-www address is 302'd to your www address. You should change it to a 301, then check your site to ensure that no pages are 302'd. No link juice is passed from a 302 redirect, whereas over 90% of the linking page's value is passed from a 301.
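For reference, a minimal sketch of the non-www to www 301 in .htaccess (assumes Apache with mod_rewrite enabled; replace example.com with your own domain):

```apache
RewriteEngine On
# Match requests for the bare (non-www) host...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...and 301 them to the www host, preserving the requested path
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```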
Also, remove all the nofollows from your home page links. You are nofollowing links to your own Facebook and Twitter pages. Those are trusted pages under your control, so the links should be followed. You are also nofollowing links to your "Delivery and Payment Information" page. That page has backlinks and unique content which some users may find helpful. Let your internal juice flow naturally.
I like your site's overall design but you do have some opportunities to perform better.
Your site is a textbook example of a site that Penguin is designed to affect. It is highly likely your site has either incurred a manual penalty for manipulative links or has been slapped by Penguin. In either case, you have a hard road ahead of you.
The footer link from your web developer needs to go.
The footer link from your SEO company, almost all of your directory links (keep Dmoz of course), and the other spammy links you have acquired over the years all need to be deleted. This is a long hard process which begins by pulling a comprehensive report of all backlinks to your site, then identifying the manipulative links and removing them.
Example pages linking to you:
http://www.spicecompanyofvermont.com/
Hi Robbie,
I consider myself first and foremost a student of SEO. I am human, I make mistakes, and I enjoy learning. If you have a good example please share it. Try to choose an example which would be relevant to most webmasters. A NY Times article which has a DA of 100 and an incredibly high PA would not be ideal as 99% of websites do not have that kind of leverage.
Let's all learn together on this one.
The popup is related to you having the MOZbar extension added to your browser. Logging in should resolve the problem. There were some problems with the Linkscape API which the MOZbar depends upon, but it appears to have been resolved.
If the issue remains, my suggestion would be to remove the extension, restart your browser, re-add the extension, then restart the browser.
2,569 links is far too many, and is likely a very big problem for both your users and SEO. You have more links than the New York Times.
Ask yourself this question....if you had to reduce your total number of links to fewer than 100, what would you do?
You can likely use category pages for your products and only link to the category pages from the navigation.
**Does the home page of www.GetSatisfaction.com constitute well-done on-page SEO?**
This type of question is difficult to answer without knowing the client's requirements. What I can share is there are elements of good SEO along with numerous opportunities for improvement.
A sample of opportunities to improve:
Use the canonical tag. The following URL returns the same home page with a 200 status code, which creates duplicate content. This duplication should be eliminated and a canonical tag provided. http://getsatisfaction.com/index.html
Reduce external links. The home page offers 66 internal links and 56 external links, and the ratio of external links is too high. One issue is the site uses many subdomains, which I would recommend merging into the main site.
There is no H1 tag
The alt tags don't really help
There is no clear message. At no point on the home page does the company share "Get Satisfaction provides Online Community Software which enables brand owners to engage in targeted conversations with their customers...." Instead there are vague descriptions such as "Give a voice to brand champions", "63k companies trust Get Satisfaction", etc.
A few things they do well:
Clear Call-To-Action with a "Free Trial", "Contact Us" and "Tour" button prominently displayed
clean URLs for main site
social engagement offered
In summary, there are several things I would like to change, but with a PA of 93 and DA of 92, the site is clearly performing well.
Are there any details on the computations involved with mR and mT?
I am specifically wondering about Google.com. Its DA and PA are both 100. If that page does not max out the other factors, it seems no internet page or site would.
The page has the following stats:
mR 8.68
mT 7.95
DmR 9.29
DmT 9.36
Until now I have only focused on DA & PA. I would like to better understand the mechanics of these other measurements.
Bing and Google are two independent companies. They often evaluate site rankings very differently for various reasons. A few items I noticed:
your sister site, american-waterworks.com, already ranks on the first page of Google results for the given phrase.
nfsmn.com links prominently to the american-waterworks site from its home page
Google frowns on a single company / site owner trying to rank multiple sites for the same term
There are many possibilities. Further analysis is needed to determine the root issue in your particular case. There are clearly other sites attempting to rank for the same term. The nfsmn site has hardly any links, it links out to multiple other sites from its home page, and the content quality and site architecture aren't that good. I would not expect the site to rank #1 in Google for anything other than a perfect match for a non-competitive keyword.
I also noticed the american-waterworks site does not rank on the first page of Bing, yet it does in Google. Each search engine may recognize these sites are related and choose the site they feel is best suited for the term to rank.
You can either APPLY to take over the category if it is open, or you can contact the category operator one tier higher and share your concern.
**How long does it take for a link to appear in OSE?**
Best case a week, worst case 10 weeks.
The SEOmoz crawler takes 2-3 weeks to crawl the web, and 1-2 weeks to process the data once the crawl has been completed.
In the worst-case scenario, your link is created just after the page it sits on has already been crawled on day 1, so the link misses the entire crawl cycle and must wait for the next one.
I will also add that web sites do not have "PR", web pages do. You can have a PR10 site such as DMOZ with a link to your site which is never discovered, because the page upon which the link is placed is buried so many levels deep that it has too low a PR to be crawled.
With the above understood, if a link was created on a PR9 web page, Google would surely discover the link within hours and the site would gain the benefit of the link immediately.
**The client cannot provide any access details for their existing site.**
Can you elaborate further?
If the client lost all usernames and passwords, they should still be able to contact their web host, obtain their username and reset the password.
Even if the site is no longer hosted, as long as they own the domain they can add details to the registration record so they can access the Google WMT account.
There are numerous approaches to the best SEO site structure. Some sites use a silo structure while others use a pyramid structure. A few basics:
users should ideally be able to reach any page on your site within a maximum of 4 clicks from the home page. The fewer clicks the better.
your top content should ideally be available within 1 click of your home page. You can define your top content as your money page, the most popular pages, the newest pages, or as you deem fit.
internal linking is highly recommended. It allows users to drill down and locate relevant information when desired. Internal linking also provides more links to your most popular topics.
Global navigation is fine, but you do not want to overdo it. If you have a 100 page site, you don't want to provide navigation with 100 links. Some pages are more important, while other pages are less important. The most important pages should be easily reachable.
Let me first share that everyone, including myself and the SEOmoz team, is unhappy with the tool's current performance. The information I am sharing is not designed to excuse anything, but rather to share my understanding.
I found your link on the following page: http://www.business.com/general/carpet-cleaning-services/. The page shows a DA of 82 but a PA of 1. The PA of 1 indicates OSE is not seeing any links to the page which is not good.
The next area I took a look at is the site's internal linking. If I am on the home page of Business.com and I wish to find your listing in the way the crawler would (i.e. by following links) how would I do it? There is not a "general" category on the home page.
The only way I could navigate to your page is as follows:
From the Business.com home page, select "Browse All Categories".
Scroll to the bottom of the categories page which has 370 links and select "C" from the Browse More Listings alphabetical selector
Select "Carpet Cleaning Services"
It's true your page is within 3 clicks of the home page, but it's a rough way to go. You could submit this example to the help team (help@seomoz.org); they could perhaps offer more details about this particular link. I have seen pages on the Yahoo Directory and DMOZ where even Google does not pick up the link.
**Immediately after the Penguin update in April, our rankings dropped to below #100 for nearly all keywords.**
**Based on what I've described above, is it more likely that the penalty we are experiencing is because of onsite issues or because of our link profile?**
Going by your statements, it is highly likely your rankings are being suppressed by Google due to the Penguin update.
You stated "immediately after the Penguin update" your rankings dropped. Penguin was introduced on April 24th. Did your rankings drop on April 25th or 26th?
Have you or any agent working on your behalf (employee, developer, link builder, "seo") added any links to your site prior to April 2012? More specifically: directory links, forum posts, blog comments, link exchanges, link wheels, etc?
If the answers to the above two questions are yes, the definitive diagnosis would be that your site was hit by Penguin. In that case, the issue is your site's backlinks. There may be other issues impacting your site, but the first issue which needs to be addressed is the backlinks.
**What exactly forces the server to rewrite the URL without a redirect?**
This answer varies based on your server type. This question can best be answered by your web host or developer.
How should be the Directory/URL structure if I am offering services in many cities of US.
This answer can vary based on several factors, such as how much content you have available for each city, whether you have physical locations in each city, etc.
I notice many positive aspects of your site. You use McAfee for SSL, an "excellent" Stella rating, accept payments from both PayPal and Google Checkout, offer social engagement, etc.
A few opportunities to improve:
1. Your home page offers over 400 links. Find a way to reduce that number to around 100.
2. Your home page offers a lot of text which will likely never be read. Perhaps move some of that content to another page.
3. Your text size seems quite small. You may want to make it larger and/or offer a widget that lets users do so.
The link you are sharing does not render well at all. I have looked at the site from two different computers using both Firefox and Chrome. The page is not user friendly, and I suggest you take a closer look at the web design itself.
With that said, generally speaking you want to offer links to your most important pages, and to pages your users will likely want to see. Presenting over 100 links on a page is often not very friendly as there are too many options. It really depends on HOW you present the links. Some sites can offer 200 links on a page and present them in such a manner as to be helpful, but that is not usually the case.
In your case, please take a look at how the site appears in Firefox and Chrome and fix the rendering issues; then better feedback can be offered.
Hello Hermski,
First let me share I have never had a YouTube account suspended nor worked with a client in that situation. Accordingly, my advice is more generic in nature.
Why exactly was your YouTube account suspended? I share your concern that your site may have been penalized. If Google took action against your YouTube account, they may have taken action against your website for the same reason.
I do not believe the "trust" you suggested is being passed from YouTube. According to your inquiry, your site suffered a "massive drop" in traffic. Assuming you did not make a change on your site, such as blocking it with the robots.txt file, the logical conclusion is that your rankings are being suppressed by Google.
Without your site's address or any other information, we can only take wild guesses as to the cause. We require more information to be helpful.
The issue with DMOZ is their structure. The site is run by volunteer category owners. The volunteers are only required to log in once every 6 months to maintain their status. Some categories are run by diligent volunteers who actively log in weekly and diligently manage their category. Others are run by people who simply don't care.
The result is some site owners are able to get into DMOZ very fast, while others cannot get in at all.
Hi Shebin. I am not clear on your exact question.
"I use the ranking tools to check each page of our site."
What is the exact name of the ranking tool you are using?
"It return page rank 1 in US, UAE, and UK region."
In the US, UAE and UK region of what? Of Google? Another search engine or tool?
Toolbar PR is only updated once every 3-4 months and therefore is not helpful for many SEO-related measurements.
"When I try to search it in Google it doesn't appear."
I searched Google.com for "Flora Grand Hotel Dubai" and your site is the first result. I also tried searching "Flora Grand" and your site is still the first result.
Hi Moosa,
The answer will vary based on the specific situation.
If you buy domains which have backlinks and then redirect them to your site in an attempt to pass PR to your site, you are attempting to manipulate search rankings. That is clearly a problem.
If you buy keyword domains and put up 1 page sites with the goal of ranking for those terms, then forwarding them to your site, that is also clearly a problem.
If your site is SacramentoVacuums.com and you purchase all the local area site names (RanchoCordovaVacuums.com, FairOaksVacuums.com, etc) and redirect them to your site for the possibility of type in traffic, that is not a problem. Even if you purchased 100 such domains...these domains do not have backlinks, they don't have a site, you are not attempting to rank any of them, you are simply trying to gain a bit of extra traffic.
If you are concerned, you could always 302 redirect the sites. That would be an unusual implementation, but I do not see any problem with it.
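For reference, here is what the two redirect types look like in an Apache .htaccess file. This is a sketch using the hypothetical domain names from the example above; the only difference between a permanent (301) and temporary (302) redirect is the `R=` flag:

```apache
# Hypothetical .htaccess placed on RanchoCordovaVacuums.com,
# assuming Apache with mod_rewrite enabled.
RewriteEngine On
# 301 = permanent redirect, which search engines treat as passing
# link signals to the destination:
# RewriteRule ^(.*)$ http://www.sacramentovacuums.com/$1 [R=301,L]
# 302 = temporary redirect, which generally does not pass link
# signals, hence the suggestion above if you are concerned:
RewriteRule ^(.*)$ http://www.sacramentovacuums.com/$1 [R=302,L]
```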
You are seeking a fast and easy reply. There is not one. The right answer requires analysis of your niche and site.
The quickest answer I can offer is this: determine how your site's visitors are most likely to locate your site. Are people most likely to look for cleaning supplies first and then look for a local location? Or are people more likely to start with their local location, for example Chicago businesses, and then look up cleaning supplies?
When I search on Google.com for "flora grand" the first result is: http://www.florahospitality.com/dubai-flora-grand-hotel.aspx
There is nothing negative about the message. It is common and expected for sites which use CDNs.